In golf there’s a popular saying: play the course, not your opponent.

Related: How ‘CAASM’ closes gaps

In an enterprise, it’s the same rule. All areas of an organization need to be free to “play their own game.”

And when malware, ransomware, or other cyber threats get in the way, the focus shifts from forward progress to damage control. A security strategy should clear those obstacles and enable every part of a business operation to run smoothly.

Smarter security is the rising tide that lifts all ships. Because every part of an organization overlaps with security, strengthening security pays dividends across the rest of the business.

Departments such as support, manufacturing, design, services, and delivery are enhanced by smart security measures, which head off distracting setbacks and build overall momentum. This leads to revenue gains and positive customer outcomes.

What constitutes "smarter security"? To me, smarter security broadly means relentlessly focusing on fundamentals while maturing the program, making sure your risk posture aligns with your business strategy.

Complexity challenge

The complexity that has abounded in the past few years has left us more connected and data-driven than ever before. Business initiatives demand faster, more efficient outcomes and technology responds. However, security – the often overlooked and undervalued visitor – is struggling to communicate across the table.

When it comes down to it, C-level goals and CISO initiatives are not all that misaligned. We all want fast, powerful, capable tools that can launch our business into the future with its best foot forward. And we all want to avoid breaches and PR failures in the process.

However, enterprises often experience a disconnect between business objectives and security guidelines. It is in this disconnect that cybercriminals find opportunity.

The attack surface is expanding relentlessly and exponentially, while security initiatives aren’t ingrained into every department’s daily operation. The need for reset and oversight is so great that a new class of technology is emerging to give organizations a better grip on the digital sprawl that’s come to define modern-day enterprise architecture.

Gartner refers to it as "CAASM," or cyber asset attack surface management. If you're struggling to figure out where to begin, focusing on your attack surface is a good place to start.

This smarter form of security fills a glaring gap in today's solution-saturated market: strategy, and specifically the kind of strategy that can only come from getting a full view of the course.

Automated offense

Smart security also means doing more with less so the company as a whole can run lean. This means secure file transfer solutions, so you don't waste time with slow encryption protocols. It means anti-phishing tools so your teams can open emails without needless hesitation or risk.

It also means offensive security measures and vulnerability management so your team can fix problems before they can be exploited and derail operations.

Automating the security tasks of an organization – or hiring out when necessary – keeps those basic hygiene concerns out of mind and allows a business to perform at its best. When done right, a smarter security strategy is unseen.

As I’ve mentioned before, the issue of security is essentially a problem-solving one. These are not security problems for security’s sake. They are fundamentally business problems that rely on security to solve them.

How do we innovate and stay ahead of the competition without our speed backfiring and creating more bugs? How do we take time to manage vulnerabilities in our CRM when we’ve promised 24/7 customer care that relies on it? How can we accomplish our CEO’s vision for full process automation when we’re still transitioning to the cloud – and are unfamiliar with the security terrain?

Smarter security measures mean more subtle, intuitive, predictive solutions that can grease the wheels for whatever a fast-thinking enterprise can come up with next.

Sometimes the issue is resources. Part of problem-solving is examining the trouble spot from all angles, and managed solutions can help. Managed Data Loss Prevention, for example, can lift the strain of constant vigilance and build security into the workflow.

The overall trend is this: technology, progress, and change are driving the business objectives of today, and “smarter security” solutions are ones that can keep up, stay out of the way, and enable all aspects of a business to perform at their top level.

About the essayist: Chris Reffkin is chief information security officer at cybersecurity software and services provider Fortra. He has deep experience implementing and overseeing security strategy for a myriad of top-tier organizations.

The 2020s are already tumultuous.

Related: The Holy Grail of ‘digital resiliency’

Individuals are experiencing everything from extraordinary political and social upheaval to war on the European continent to the reemergence of infectious diseases to extreme weather events.

Against this unsettling backdrop, citizens, consumers, employees, and partners will look to organizations that they trust for stability and positive long-term relationships.

Not every organization knows how to cultivate trust, however, or that it's even possible to accomplish. As a result, in 2023, specific industries that normally experience healthy levels of trust will see major declines in trust that will take years to repair. Others will have to buck historical trends simply to maintain their current trust levels.

Organizations should take into account the following predictions as they plot out the next steps of their trust journey in the year ahead:

•Trust in consumer technology will decline by 15 percent.

Over the past three years, technology has proven critical to consumers’ daily lives — from remote working and home-schooling to entertainment and e-commerce. Technology firms experienced unprecedented popularity because of this.

This honeymoon is coming to an end, however; expect to see trust in consumer technology companies declining by 15 percent in 2023. Regulatory crackdowns on poor privacy practices, continued supply chain issues, and ongoing challenges in retaining talent will all impact consumers’ sentiments negatively.

When consumers trust a brand less, they also lose trust in other businesses associated with it. This is the time for firms to map their value chain, assess trust fluctuation across their ecosystem, and be ready to act to safeguard trust.

•Half of firms will use AI for employee monitoring — battering employer trust.

Forrester finds that around the world, employees trust their employer more than their colleagues. For example, 60 percent of US employees trust their colleagues while 64 percent trust their employer. Expect this trend to invert by the end of 2023 as employers overstep their bounds with the use of AI to monitor work-from-home productivity.

For those that choose to collect personal information from employees to measure performance, the data is grim. In 2022, Forrester finds that 56 percent of employees whose employer collects their personal information to measure performance are likely to actively look for a new opportunity at a new organization in the next year — 14 percentage points higher than the average.

Firms seeking to lead in employee experience must eliminate outdated notions of “time spent” and instead focus on outcome-based performance measurement.

•Banks will lose consumer trust in a period of economic turmoil.

In 2022, consumer trust in banks fell for the first time in several years. Additionally, Forrester data reveals that only 54 percent of US consumers believe their bank exhibits the trait of empathy.

As the economy continues to flash warning signals, consumers’ ire and resentment toward their bank will make it even harder to earn trust. Because of this, trust will decline for most banks.

To maintain consumer trust in 2023, banks must lead with empathy and take a data-driven approach to earning trust with concrete, targeted steps that can help them navigate the cost-of-living crisis.

•People’s trust in government will increase in the US.

Trust falls when governments are no longer able to create a better future for their people. In 2023, the US will buck historical trends that saw trust shrinking by building on dependability as a core lever of trust, as well as by investing heavily in such other key trust levers as accountability, competency, and transparency. For example, President Biden’s Management Agenda is doubling down on the combined power of customer and employee experience.

•Three-quarters of Californians will have asked firms to stop selling their data by the end of 2023.

Privacy continues to be a critical consumer value. According to Forrester, 47 percent of Californian online adults have exercised their CCPA right to ask companies to stop selling their data, while 30 percent have asked companies to delete their data.

As the privacy discussion takes center stage in the US over the next 12 months — especially given the potential for new federal legislation and the enforcement of existing state-level legislation — consumers’ privacy activism will continue to grow.

Now is the time for organizations to shore up their privacy and data protection programs and require that all new products, services, and experiences are private by design.

Companies understand that trust will be critical in the next 12 months, more so than ever before. Companies must develop a deliberate strategy to ensure that they gain and safeguard trust with their customers, employees, and partners.

Measuring trust in their brands, engaging line-of-business owners and other leaders to identify key initiatives (with regional variations as necessary), and setting a realistic time frame are all fundamental steps that they must take to get started on this important journey.

About the essayist: Enza Iannopollo is a principal analyst on Forrester’s security and risk team and a Certified Information Privacy Professional (CIPP/E). Her research focuses on compliance with data protection rules, privacy as a competitive differentiator, ethics, and risk management.

There is much that can be gleaned from helping companies identify and manage their critical vulnerabilities 24/7.

Related: The case for proactive pentests

Based on insights from our team of elite security researchers here at Bugcrowd, these are three trends gaining steam as 2022 comes to a close – trends that I expect to command much attention in 2023.

Continuous pentesting

For years, penetration testing has played an important role in regulatory compliance and audit requirements for security organizations. However, a longtime challenge with pentesting has been the “point-in-time” nature of the tests.

At some pre-defined point in time, the test is run against the then-current version of the application and a report is delivered. The challenge is that application development has changed significantly in recent years; often by the time a pentest is completed and the report is delivered, the information is already out of date due to changes in the application.

Over the coming year, we will see an accelerating shift from traditional pentesting to Penetration-Testing-as-a-Service (PTaaS). Rather than treating pentests as point-in-time assessments and a necessary evil for maintaining compliance with internal or external requirements, organizations are leveraging them as an important, ongoing tool in their risk and security programs.

By completing incremental testing on the application, security organizations can gain current and ongoing visibility into the security posture of the application as the smaller scope allows for faster testing turnaround. This enables security organizations to receive real-time information into the current security posture of the application, network, or infrastructure.

It’s important to remember that every change to a network or application, whether a major release or incremental release, represents an opportunity for new vulnerabilities to be introduced. Security organizations must maintain the ability to gain real-time visibility into their current posture – both from a risk governance perspective and from a compliance perspective.

Security vendor consolidation

The rapid expansion of new security products has led to many organizations purchasing the “latest and greatest” without having a strong integration plan in place. Without a clear deployment and integration plan, even the best security product will go underutilized.

For the past few years, the industry has seen an incredible amount of M&A consolidation. As a result, security organizations are looking internally for ways to leverage existing tool sets or upgrade existing tool sets versus adding to their ever-growing technology stack.

This growing need for security vendor consolidation will continue to be driven by both the cost of the security products and the limited internal resources to effectively operate the products.

Narrowing the talent gap

Attracting strong candidates has always been a core part of any business. Finding senior talent, whether in cybersecurity or another function, requires a combination of attractive compensation, career growth, the flexibility to work anywhere, and a mission that employees want to support.

It’s also important to find talent from non-traditional and diverse backgrounds, provide them with the necessary training and enablement, pay them well with additional equity incentives, and empower them to do what needs to be done.

For years, we’ve been led to believe there is a significant gap between the number of open jobs and qualified candidates to fill those jobs. While this is partially true, it doesn’t provide a true view into the current state of the market.

Employers need to take a more active approach to recruiting from non-traditional backgrounds, which, in turn, significantly expands the candidate pool beyond those with formal degrees to individuals who, with the right training, have incredibly high potential.

Additionally, this provides the opportunity for folks from diverse backgrounds, who otherwise wouldn't be able to receive formal training, to break into the cybersecurity industry, gaining income, career and wealth-creation opportunities that they otherwise may not have access to.

Organizations need to continue to expand their recruiting pool, account for the bias that can currently exist in cyber-recruiting, and provide in-depth training via apprenticeships, internships, and on-the-job training, to help create the next generation of cyber-talent.

About the essayist: Dave Gerry is CEO of Bugcrowd, which supplies a security platform that combines contextual intelligence with actionable skills from elite security researchers to help organizations identify and fix critical vulnerabilities before attackers exploit them.

It’s all too easy to take for granted the amazing digital services we have at our fingertips today.

Related: Will Matter 1.0 ignite the ‘Internet of Everything’

Yet, as 2022 ends, trust in digital services is a tenuous thing. A recent survey highlights the fact that company leaders now understand that digital trust isn’t nearly what it needs to be. And the same poll also affirms that consumers will avoid patronizing companies they perceive as lacking digital trust.

DigiCert's 2022 State of Digital Trust Survey polled 1,000 IT professionals and 400 consumers and found that a lack of digital trust can drive away customers and materially impact a company's bottom line.

“It’s clear that digital trust is required for organizations to instill confidence in their customers, employees and partners,” Avesta Hojjati, DigiCert’s vice president of Research and Development, told me. “Digital trust is the foundation for securing our connected world.”

I recently had the chance to visit with Hojjati. We conversed about why digital trust has become an important component of bringing the next iteration of spectacular Internet services to full fruition. And we touched on what needs to happen to raise the bar of digital trust. Here are a few key takeaways from our evocative discussion:

Vigilance required

As 2022 comes to a close, connectivity is exploding. This portends many more digital wonders to come. Yet threat actors continue to breach corporate networks with impunity. And now, finally, digital trust is commanding attention.

One hundred percent of the IT pros who participated in DigiCert's survey acknowledged the importance of gaining and keeping digital trust. The backdrop is an operating environment in which their organizations' network attack surface is scaling up. What's more, 99 percent of the IT pros said they believed their customers would switch to a competitor should they lose trust in the enterprise's digital security.

Meanwhile, more than half, some 57 percent, of consumers polled by DigiCert acknowledged that they’ve experienced cybersecurity issues such as account takeovers, password exposure and payment card fraud. And nearly half, 47 percent, said they’ve stopped doing business with a company after losing trust in that company’s digital security.

Consumers aren’t blind; they’ve become wary of companies that lack online vigilance. Some 84 percent said they would consider not patronizing a company that fails to manage digital trust, with 57 percent saying switching to a more trustworthy provider would be likely.

“Consumers understand what digital trust is and they’re making it a requirement for any entity they’re dealing with to protect their data and their online accounts,” Hojjati says. “If they find that’s not the case, consumers have no problem switching to another vendor.”

Baked-in security

So how did we get here? Over the past decade, digital transformation has advanced rapidly – and even more so post COVID-19. In this environment, companies chased operational efficiencies without duly considering security. And as this shift to reliance on cloud infrastructure and remote workers accelerated, no one accounted for the fresh pathways left wide open to malicious hackers.

“Enterprises were slow to acknowledge that digital trust was missing,” Hojjati observes. “We dove too quickly into making everything digitalized, but we didn’t realize that this superfast inter-operability and hyper interconnectivity absolutely requires a foundation of trust.”

Digital trust has emerged as a must-have; without it, confidence in online business processes is destined to erode. At a macro level, this means security must somehow get deeply baked into leading-edge IT architectures. Systemic changes need to be agreed upon and universally adopted. Smart, adaptable, automated security needs to be infused into the ephemeral, highly distributed and cloud-centric digital infrastructure that will take us forward.

At a micro level, company leaders and captains of industry must arise as champions and stewards of digital trust, Hojjati argues, not only for their own internal employees and operations, but also for their customers, partners and extended communities.

Infusing digital trust

Moving forward, digital trust must become a cornerstone of security. One core technology for providing digital trust is the public key infrastructure (PKI), or more precisely, advanced implementations of PKI. As a prominent supplier of PKI services and digital certificate lifecycle management systems for companies worldwide, DigiCert has skin in this game. PKI is the framework by which digital certificates get issued to authenticate the identity of users and devices; and it is also the plumbing for encrypting data that moves across the public Internet.
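To make that issue-then-verify pattern concrete, here is a minimal sketch, for illustration only, using Python's cryptography library: a root CA key signs a certificate for a device, and the root certificate is later used to check that signature. The names, the choice of elliptic-curve keys and the validity periods are my own assumptions, not a description of any vendor's actual issuance process.

```python
# Minimal PKI sketch: a root CA signs a device certificate, binding the
# device's public key to an identity; the root certificate then verifies
# that signature. All names and lifetimes are illustrative assumptions.
import datetime

from cryptography import x509
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.x509.oid import NameOID

now = datetime.datetime.utcnow()

# 1. The Certificate Authority's key pair and self-signed root certificate.
ca_key = ec.generate_private_key(ec.SECP256R1())
ca_name = x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, "Example Root CA")])
ca_cert = (
    x509.CertificateBuilder()
    .subject_name(ca_name)
    .issuer_name(ca_name)                       # self-signed root
    .public_key(ca_key.public_key())
    .serial_number(x509.random_serial_number())
    .not_valid_before(now)
    .not_valid_after(now + datetime.timedelta(days=3650))
    .add_extension(x509.BasicConstraints(ca=True, path_length=None), critical=True)
    .sign(ca_key, hashes.SHA256())
)

# 2. A device generates its own key pair; the CA issues it a certificate.
device_key = ec.generate_private_key(ec.SECP256R1())
device_cert = (
    x509.CertificateBuilder()
    .subject_name(x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, "device-001")]))
    .issuer_name(ca_name)
    .public_key(device_key.public_key())
    .serial_number(x509.random_serial_number())
    .not_valid_before(now)
    .not_valid_after(now + datetime.timedelta(days=365))
    .sign(ca_key, hashes.SHA256())              # the CA's signature is what others trust
)

# 3. Anyone who trusts the root certificate can verify the device certificate.
ca_cert.public_key().verify(
    device_cert.signature,
    device_cert.tbs_certificate_bytes,
    ec.ECDSA(device_cert.signature_hash_algorithm),
)
print("device certificate verified against the trusted root")
```

The sketch deliberately leaves out the hard parts: revocation, rotation, renewal and managing certificates at scale, which is precisely where certificate lifecycle management systems earn their keep.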

PKI already is deeply engrained in the legacy Internet; companies use it to certify and secure many types of digital connections coming into, as well as inside of, their private networks.

Because PKI is ubiquitous and time-tested, it is well-suited to be a leading technology for infusing digital trust into the next iteration of modern networks designed to handle massive interconnectivity and support vast interoperability. This is the working premise espoused by DigiCert and other security experts.

“Modern digital systems simply could not exist without trusted operations, processes and connections,” Hojjati says. “They require integrity, authentication, trusted identity and encryption.”

Public awareness, not to mention public demand for improved security, is an important catalyst. Consumer preference for digital services they can fully trust should remind  industry and company leaders to stay focused on doing what needs to get done.

Indeed, industry consensus is being shaped around new sets of standards needed to replace the outdated protocols and policies that gave us the legacy Internet. This heavy lifting is being undertaken by a number of industry forums far out of the public eye.

Refreshed standards

One milestone advance achieved by this effort is Matter 1.0 – the new home automation connectivity standard rolling out this holiday season. There are high hopes that Matter will blossom into the lingua franca for the Internet of Things.

For its part, DigiCert continues to be a prominent participant in the public-private consortia developing and refining a fresh portfolio of security standards needed to engrain digital trust. This includes new security protocols not just for digital certificates but for all things to do with smart buildings, smart transportation systems and smart infrastructure, as well.

As the details get hammered out, it would be wise for companies and industry sectors to jump on board the digital trust band wagon, the sooner the better. And if fear of losing customers adds to their motivation, then so be it.

“Digital trust by design is something company decision makers have to consider,” Hojjati says. “They need to make digital trust a strategic imperative.”

DigiCert recommends assigning a senior executive with explicit duties to support digital trust. One way to do this might be to create the role of  “digital trust officer,” Hojjati says. A DTO could focus on mitigating exposures spinning out of an ever-expanding attack surface; in other words, implementing advanced security systems and procedures on premises, for remote workers and up and down the supply chain, he says.

Clearly new rules of the road like this are needed. Encouragingly, they’re coming. I’ll keep watch and keep reporting.

Pulitzer Prize-winning business journalist Byron V. Acohido is dedicated to fostering public awareness about how to make the Internet as private and secure as it ought to be.


(LW provides consulting services to the vendors we cover.)

 

The Internet of Everything (IoE) is on the near horizon.

Related: Raising the bar for smart homes

Our reliance on artificially intelligent software is deepening, signaling an era, just ahead, of great leaps forward for humankind.

We would not be at this juncture without corresponding advances on the hardware side of the house. For instance, very visibly over the past decade, Internet of Things (IoT) computing devices and sensors have become embedded everywhere.

Not as noticeably, but perhaps even more crucially, big advances have been made in semiconductors, the chips that route electrical current in everything from our phones and laptops to automobile components and industrial plant controls.

I recently visited with Thomas Rosteck, Division President of Connected Secure Systems (CSS) at Infineon Technologies, a global semiconductor manufacturer based in Neubiberg, Germany. We discussed how the Internet of Things, to date, has been all about enabling humans to leverage smart devices for personal convenience.

“What has changed in just the past year is that things are now starting to talk to other things,” Rosteck observes. “Smart devices and IoT systems are beginning to interconnect with each other and this is only going to continue.”

This ascension to the next level of connectivity is underscored by Matter 1.0, the new home automation connectivity standard rolling out this holiday season. Matter paves the way for not just more Internet-connected gadgetry; it makes possible a new tier of highly interoperable digital systems providing amazing services that are highly secure and that intrinsically preserve individual privacy.

For a full drill down on our evocative discussion please watch the accompanying videocast. Here are the main takeaways:

Dispersing electricity

Strictly speaking, a semiconductor is a crystalline solid whose ability to conduct electricity falls between that of a conductor and an insulator and can be precisely controlled. We call them microchips or just chips and they are the building blocks of diodes, transistors and integrated circuits – the components that direct electrical current to carry out processing routines.

Semiconductors are the hardware components that make up the nervous system of each and every smart device — from tiny sensors to sprawling cloud servers and everything in between. Rosteck outlined how advanced semiconductors will be indispensable in two broad areas going forward: power modules and microcontrollers. Both come into play across the breadth of IoT and, even more so, with respect to IoE.

For instance, semiconductor power modules enable the generation, transmission and consumption of electricity in everything from the control room of a modern industrial plant to the timer on your smart coffee grinder.

Power modules must continue to advance; energy consumption of big digital systems must continue to become more and more efficient to support the smart commercial buildings and transportation systems of the near future, Rosteck says.

With power modules circulating electricity very efficiently at a macro level, advanced microcontrollers can grab the spotlight. These are the unseen chipsets that carry out discrete tasks, such as activating your smart auto's proximity sensors and rearview camera or controlling your smart home's thermostat and garage door opener.

Microcontrollers are, in essence, mini computing engines; today they serve mainly as the knobs and flip switches of IoT; going forward they’ll evolve into sophisticated controls that make complex decisions, autonomously, as part of new IoE systems.

Energy at the edges

How microcontrollers distribute energy is a very big deal. Innovation in the semiconductor industry is focused on finding smarter ways to disperse tiny bursts of electricity to a sprawling galaxy of IoT devices and new IoE systems. Energy needs to be dispersed very efficiently, in just the right measure, to support the machine intelligence routines increasingly taking place at the cloud edge, Rosteck explained.

“When I transport energy or when I consume energy, I must do this efficiently, meaning not wasting energy by ‘turning it on its head,’ but turning energy into what I really want to use it for,” he says.

Rosteck described for me a smart home of the near future. It would be equipped with an array of Internet-connected devices that work in concert to optimize energy consumption. Unseen and unnoticed by the resident, interconnected systems would be capable of correlating real-time weather data, traffic patterns and the resident's work schedule and then calculating the precise amount of energy needed on a given day, or even hour of the day.

Such smart homes could become the norm in the era of IoE. This would lead to an optimum blending of private and public sources of energy. Individual consumers could tap solar energy from their rooftops, while public utilities would supply power from legacy power plants as well as from new renewable energy operations.

The result: energy conservation would advance significantly. It’s notable that technologists and social scientists are discussing how to leverage interconnected digital infrastructure, i.e. the Internet of Everything, to foster similar “greater good” scenarios in other arenas. This includes mainstreaming autonomous transportation systems, perhaps even redistributing wealth more equitably across the planet.

Baking in security

First, however, two things need to radically change: digital systems must be able to interconnect much more seamlessly than is possible at this moment; and cybersecurity needs to rise to a much higher level. And this is where Matter 1.0 comes into play.

To start, any Matter-compliant smart home device will be able to interoperate with whatever virtual assistant the resident might have. Making it possible for a consumer to use Amazon Alexa, Google Assistant, Apple HomeKit or Samsung SmartThings to operate all types of Matter-compliant devices is a giant step in convenience — and a small step toward a much greater good. Work has commenced on future iterations of Matter that will make IoT systems in commercial buildings and healthcare facilities much more interoperable than is the case today.

Cybersecurity remains a major obstacle that must be dealt with. Interconnected systems that can easily be hacked, of course, would be untenable. Thus, Matter sets forth an extensive process for issuing a “device attestation certificate” for each Matter-compliant device. This process revolves around extending the tried-and-true Public Key Infrastructure framework and associated Digital Certificates that assure website authenticity and carry out encryption across the legacy Internet.

That said, Matter is a new kind of tech standard. The standards that allowed the legacy Internet to blossom commercially – protocols skewed toward open and anonymous access – also doomed networks to be endlessly vulnerable to breaches. By stark contrast, Matter requires our next generation of interconnected devices and systems to be deeply secure from day one.

“If you bake a cake, you can’t change the flavor of the cake once it’s finished baking. It’s the same with standards, you must think about security from the beginning,” Rosteck says. “Matter is the first standard that I know of that accounted for security in the beginning.”

Indeed, when Google, Amazon, Apple and Samsung convened three years ago to draw up Matter, one of the very first moves the tech giants championed was to set up a security work group, Rosteck says. This is how security got baked in from the start. And the result is that the Matter standard is poised to foster a quickening of hardware and software advances that will take us to the next level of connectivity — securely.

There’s still a long way to go. I’ll keep watch and keep reporting.

Pulitzer Prize-winning business journalist Byron V. Acohido is dedicated to fostering public awareness about how to make the Internet as private and secure as it ought to be.


(LW provides consulting services to the vendors we cover.)

 

Much more effective authentication is needed to help protect our digital environment – and make user sessions smoother and much more secure.

Related: Why FIDO champions passwordless systems

Consider that some 80 percent of hacking-related breaches occur because of weak or reused passwords, and that over 90 percent of consumers continue to re-use their intrinsically weak passwords.

Underscoring this trend,  Uber was recently hacked — through its authentication system. Let’s be clear, users want a better authentication experience, one that is more secure, accurate and easier to use.

The best possible answer is coming from biometrics-based passwordless, continuous authentication.

Gaining traction

Passwordless, continuous authentication is on track to become the dominant authentication mechanism in one to two years.

Continuous authentication is a means to verify and validate user identity — not just once, but nonstop throughout an entire online session. This is accomplished by constantly measuring the probability that an individual user is who he or she claims to be, leveraging machine learning and a variety of behavioral patterns sensed in real time.
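In pseudocode terms, the pattern is simple: keep folding fresh biometric or behavioral signals into a running trust score and lock the session when confidence drops. The Python sketch below is purely illustrative; get_face_match_probability is a hypothetical stand-in for a real biometric model, and the threshold and smoothing values are arbitrary.

```python
# Illustrative continuous-authentication loop: every few seconds a probability
# that the current user is the enrolled user is folded into a running trust
# score, and the session locks when confidence drops below a threshold.
import random
import time


def get_face_match_probability() -> float:
    """Hypothetical stand-in for a biometric/behavioral model's output in [0, 1]."""
    return random.uniform(0.0, 1.0)


def lock_session() -> None:
    print("Trust too low: locking session and requiring re-authentication.")


def continuous_auth(threshold: float = 0.6, smoothing: float = 0.7) -> None:
    trust = 1.0  # the user just authenticated, so start fully trusted
    while True:
        probability = get_face_match_probability()
        # Exponential smoothing keeps one noisy reading from locking the session.
        trust = smoothing * trust + (1.0 - smoothing) * probability
        if trust < threshold:
            lock_session()
            break
        time.sleep(2)  # re-evaluate every couple of seconds


if __name__ == "__main__":
    continuous_auth()
```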

Passwordless, continuous authentication addresses the dire need for higher and better security. Cyber attacks continue to grow in sophistication, and ransomware attacks are only the tip of the iceberg. Compromised credentials represent the most usual way attackers penetrate networks. That simply is not tolerable, going forward.

With a market and a society ready to go for it, passwordless authentication expansion is about to accelerate. In fact,  demand for passwordless systems is expected to grow 15 percent per annum – topping $5.5 billion by 2032. It’s no surprise that passwordless authentication is at the core of Gartner’s report on emerging technologies and trends for 2022.

Invisible security

Authentication systems that leverage machine learning and biometric technology are now ready to replace legacy password-centric technologies. Machine learning can be applied to facial recognition data, for example, to provide an invisible security layer, with no actions required from the user.

This invisible authentication is very difficult to hack because it relies on biometric features that can't be shared. Widely adopted in fields from healthcare to law enforcement, it can deliver secure, accurate authentication even when the user is wearing a mask, and it prevents the unauthorized access that is now possible by compromising the devices we use as a second factor of authentication.

In industries such as banking, healthcare and law enforcement, where employees work under pressure to handle sensitive information, cybersecurity and productivity often contradict each other.

Password-based multi-factor authentication (MFA) systems, for instance, require constantly logging in and out of user sessions; employees waste working time, and can even suffer from MFA fatigue. These inefficiencies can open the gate to cyber attacks.

By contrast, passwordless, continuous authentication affords a double gain for companies: cybersecurity is materially improved, while authentication friction gets erased. This improves daily productivity, not to mention employees’ happiness.

Continuous vigilance

Current authentication tools focus on single sign on. This means that the authentication mechanism confirms the user at the beginning of the session but offers no guarantees during a user session.

One opportunity attackers seek out is when an authenticated user leaves the device unattended. Up to 95 percent of cyberthreats are successful because of a human error, including unattended sessions or visual hacking incidents, such as shoulder surfing.

This lack of extended security cannot be addressed through legacy sign-on authentication tools such as Microsoft Hello, which rely on one-time image authentication.

Fortunately, there's a growing trend towards passwordless, continuous authentication.

One touchless delivery model is through face recognition, and a good example is the core  functionality built into GuacamoleID, supplied by Hummingbird.AI.  GuacamoleID uses sophisticated vision AI to recognize and secure user sessions, thus enabling touchless automated access to computers for security, privacy and compliance in law enforcement, healthcare and financial services.

Passwordless, continuous authentication improves the user experience by making it frictionless – and it materially boosts security by ensuring that there’s always the right person behind the device.

About the Author: Nima Schei is the founder and CEO of Hummingbirds AI, a supplier of technology that leverages artificial intelligence to automate access to computers through face matching.

 

Government assistance can be essential to individual wellbeing and economic stability. This was clear during the COVID-19 pandemic, when governments issued trillions of dollars in economic relief.

Related: Fido champions passwordless authentication

Applying for benefits can be arduous, not least because agencies need to validate applicant identity and personal identifiable information (PII). That often involves complex forms that demand applicants gather documentation and require case workers to spend weeks verifying data. The process is slow, costly, and frustrating.

It’s also ripe for fraud. As one example, the Justice Department recently charged 48 suspects in Minnesota with fraudulently receiving $240 million in pandemic aid.

The good news is that an innovative technology that promises to transform identity validation is capturing the attention of government and other sectors. Self-sovereign identity (SSI) leverages distributed ledgers to verify identity and PII – quickly, conveniently, and securely.

Individual validation

Any time a resident applies for a government benefit, license, or permit, they must prove who they are and provide PII such as date of birth, place of residence, income, bank account information, and so on. The agency manually verifies the data and stores it in a government database.

Whenever the resident wants to apply for services from another agency, the process repeats. Every transaction involves redundant steps and is an opportunity for fraud. Meanwhile, PII in government databases is at risk for cybertheft.

SSI – sometimes referred to as decentralized identity – uses a different strategy. Rather than relying on centralized databases, it validates PII via a distributed ledger or blockchain. The data is never stored by the government agency, yet the agency can still be sure it is transacting with the right person. This approach makes the data fundamentally secure and makes identity theft virtually impossible. Once the data is initially validated, it can be trusted by every agency, every time.
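Under the hood, that trust boils down to digital signatures whose verification keys are anchored on the ledger rather than in any one agency's database. Here is a deliberately simplified Python sketch of the pattern; the field names, the "did:example" identifiers and the in-memory stand-in for a ledger are illustrative assumptions, not any particular SSI implementation.

```python
# Minimal SSI sketch: an issuing agency signs a credential once, and any other
# agency can verify it later using only the issuer's public key, which in a
# real deployment would be resolved from a distributed ledger.
import json

from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Issuer side (e.g., the first agency to verify the applicant's data).
issuer_key = Ed25519PrivateKey.generate()
LEDGER = {"did:example:agency-1": issuer_key.public_key()}  # stand-in for a ledger lookup

credential = {
    "issuer": "did:example:agency-1",
    "subject": "did:example:resident-42",
    "claims": {"household_size": 3, "income_verified": True},
}
signature = issuer_key.sign(json.dumps(credential, sort_keys=True).encode())

# Verifier side (any other agency): no central database, just the ledger-anchored key.
issuer_public_key = LEDGER[credential["issuer"]]
issuer_public_key.verify(signature, json.dumps(credential, sort_keys=True).encode())
print("credential verified -- no PII had to be re-collected or re-stored")
```

In practice the signed credential sits in the resident's digital wallet, and the verifier only ever touches the ledger to look up the issuer's key.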

SSI also puts residents in control. They decide which data to release to which agencies and can revoke access at any time. They don’t need to worry about data privacy or whom the data might be shared with. Finally, they don’t have to endure a lengthy process of gathering data and waiting for approvals.

Conceptually, SSI functions the same way in any scenario. But three use cases demonstrate its promise.

Simplifying applications

For programs that benefit families, applications can run 20 pages and take weeks to process. An example is the Supplemental Nutrition Assistance Program (SNAP). Applicants must provide details on the entire household, including dates of birth, incomes, assets such as bank accounts, and expenses such as utilities.

Many people who receive SNAP benefits are also eligible for Medicaid, Temporary Assistance for Needy Families (TANF), and the Children’s Health Insurance Program (CHIP). Without SSI, residents must manually submit the same information to each program, and each program must manually verify the information before storing it in a database.

Furthermore, benefits applications like SNAP aren’t one-off processes. Say a mother with two children suddenly finds herself a single parent with no employment. She might qualify for SNAP until she gets a job. Then she might have another child and qualify again. Without SSI, each time she re-applies, her data needs to be re-verified and re-stored.

With SSI, applicants submit their household data for verification only once. When that information is verified, each datapoint is stored in the resident’s digital wallet as a credential. When they need to share that information with another agency, it’s validated via the public ledger in minutes.

With SSI, once a credential is in the digital wallet, all programs can trust it. The process is faster and easier for both the applicant and the benefits administrator.

Preventing fraud

Government-backed loans for college, certificate, and vocational programs help residents achieve financial wellbeing and contribute to society, but they’re also opportunities for fraud. For instance, California community colleges received 65,000 fraudulent loan applications in 2021.

What’s more, institutions collect, verify, and store vast amounts of student data. When a specific department needs student data for its own needs, it often repeats the process. Meanwhile, all that data makes colleges targets for cybertheft.

SSI solves these issues. Once their identity is verified via the distributed ledger, students can release data to any institution or department. Schools can trust the data, and they no longer need to store it in their own databases. Plus, identity theft and loan fraud become virtually impossible.

The student’s digital wallet can expand over time with relevant data such as course credits, grade point averages, and degrees. Once the data is verified, it remains trustworthy – even if, say, the school that issued a degree no longer exists.

Medical marijuana access

More than 30 U.S. states and territories have legalized cannabis products for medical use. To access medical marijuana, patients typically require a medical marijuana card.

The process normally starts with a doctor’s prescription. The patient then applies to the state for a card. Once the card is issued, the patient presents it at a dispensary to purchase a cannabis product. In cases where the patient isn’t mobile, a caregiver is authorized to make the purchase.

SSI streamlines and provides assurance throughout this process. The state can trust any patient identity or PII already verified via the distributed ledger. The doctor’s credentials can be validated in the same way. Prescriptions and authorized caregivers can be stored as patient credentials.

The dispensary needn’t worry about being held liable for accepting a fake medical marijuana card. In fact, once patient data is validated in the distributed ledger, no party in the supply chain needs to independently verify it.

For residents, SSI provides control over PII and eases worries about confidentiality. For governments, it streamlines data verification and strengthens cybersecurity, saving significant time and cost. For both, it can build trust and enable easier access to services that benefit individuals and communities. Ultimately, SSI promises to transform how people and organizations manage sensitive data across a multitude of use cases.

About the essayist: Piyush Bhatnagar is Vice President of Security Products and Platforms at GCOM Software. Bhatnagar received his MBA in General Management and Strategy from Cornell University's Johnson Graduate School of Management. In addition, he holds a Master's Degree in Science (Computer Science) from Allahabad University as well as a Bachelor's Degree in Science from the University of Delhi.

Ever feel like your smart home has dyslexia?

Siri and Alexa are terrific at gaining intelligence with each additional voice command. And yet what these virtual assistants are starkly missing is interoperability.

Related: Why standards are so vital

Matter 1.0 is about to change that. This new home automation connectivity standard rolls out this holiday season with sky-high expectations. The technology industry hopes that Matter emerges as the lingua franca for the Internet of Things.

Matter-certified smart home devices will respond reliably and securely to commands from Amazon Alexa, Google Assistant, Apple HomeKit or Samsung SmartThings. Think of it: consumers will be able to control any Matter appliance with any iOS or Android device.

That’s just to start. Backed by a who’s who list of tech giants, Matter is designed to take us far beyond the confines of our smart dwellings. It could be the key that securely interconnects IoT systems at a much deeper level, which, in turn, would pave the way to much higher tiers of digital innovation.

I had the chance to sit down, once more, with Mike Nelson, DigiCert's vice president of IoT security, to discuss the wider significance of this milestone standard. This time we drilled down on the security pedigree of Matter 1.0. Here are the main takeaways:

Pursuing interoperability

Connectivity confusion reigns supreme in the consumer electronics market. From wrist watches to refrigerators and TVs to thermostats, dozens of smart devices can be found in a typical home. Each device tends to be controlled by a separate app, though many can now also respond to one proprietary virtual assistant or another.

And then there's Zigbee, Z-Wave and Insteon. These personal networking protocols have caught fire with tech-savvy consumers hot to pursue DIY interoperability.

The tech giants saw this maelstrom coming. Google, Amazon, Apple, Samsung and others have spent nearly three years hammering out Matter 1.0. What they came up with is an open-source standard designed to ensure that smart home devices from different manufacturers can communicate simply and securely via an advanced type of mesh network.

“Matter will create a level of interoperability that makes it so that a consumer can control any Matter-compliant device with whatever virtual assistant they might have,” Nelson says. “It’s going to become a product differentiator because it’s going to create so much value for them.”

This fall, certain brands of smart light bulbs, switches, plugs, locks, blinds, shades, garage door openers, thermostats and HVAC controllers will hit store shelves bearing the Matter logo. If all goes well, soon thereafter Matter-compliant security cameras, doorbells, robot vacuums and other household devices will follow.

Industry work groups already have started brainstorming future iterations of Matter that will make IoT systems in commercial buildings and healthcare facilities much more interoperable – and secure – than is the case today. Beyond that, Matter could bring true interoperability and more robust security to smart cities and autonomous transportation systems. Someday, perhaps, Matter might help to foster major medical breakthroughs and much-needed climate change mitigation.

Preserving digital trust

It’s not too difficult to visualize how imbuing true interoperability into advanced IoT systems, starting small with smart homes, can take us a long way, indeed. It’s also crystal clear that to get there, security needs to become much more robust.

Matter seeks to achieve this right out of the gate by leveraging and extending the public key infrastructure (PKI) — the tried-and-true authentication and encryption framework that underpins the legacy Internet.

PKI preserves digital trust across the Internet by designating a Certificate Authority (CA) to issue digital certificates, which are then relied upon to authenticate user and machine identities during the data transfer process. PKI also keeps data encrypted as it moves between endpoints.

Matter sets forth a similar approach for preserving trust, going forward, of the data transfers that will take place across advanced IoT systems. An extensive process for issuing a "device attestation certificate" for each Matter-approved device has been put into place. DigiCert, a leading global provider of digital trust and a prominent Certificate Authority, recently became the first organization approved to serve much the same role when it comes to issuing Matter attestation certificates.

With respect to Matter, DigiCert has met the requirements to be designated as the first Product Attestation Authority (PAA). This boils down to DigiCert taking extensive measures to create, preserve and distribute, at scale, an instrument referred to as a 'root of trust.'

Nelson described for me how these roots of trust are at the core of each certificate issued for every smart device that meets the Matter criteria.

Observes Nelson: “The root of trust creates an immutable identity . . . So when you have a Yale lock trying to connect to an Amazon virtual assistant, the first thing it does is look to see if there’s a trusted signature from a trusted root. If it’s there, it greenlights the communication and now two secure, compliant devices can interoperate. So these roots of trust become the magic of secure interoperability.”
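For illustration, here is a highly simplified Python sketch of that root-of-trust check: accept a device's certificate only if it carries a valid signature from a root you already trust. It assumes ECDSA-signed certificates and glosses over everything the real Matter attestation flow adds, such as intermediate certificates, revocation and policy checks.

```python
# Simplified "root of trust" check: greenlight a device only if its attestation
# certificate is signed by a root certificate we already trust. Conceptual
# illustration only, not the actual Matter attestation protocol.
from cryptography import x509
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric import ec


def chains_to_trusted_root(device_cert_pem: bytes, trusted_root_pems: list[bytes]) -> bool:
    device_cert = x509.load_pem_x509_certificate(device_cert_pem)
    for root_pem in trusted_root_pems:
        root = x509.load_pem_x509_certificate(root_pem)
        if device_cert.issuer != root.subject:
            continue  # not issued by this root
        try:
            root.public_key().verify(
                device_cert.signature,
                device_cert.tbs_certificate_bytes,
                ec.ECDSA(device_cert.signature_hash_algorithm),
            )
            return True  # trusted signature found: greenlight the connection
        except InvalidSignature:
            continue
    return False  # no trusted root vouches for this device
```

In Nelson's example, that yes-or-no answer is the difference between the Yale lock and the Amazon virtual assistant talking to each other or refusing to.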

It’s encouraging to see security baked in at the ground floor level of a milestone standard; Matter could pave the way for the full fruition of an  Internet of Everything that’s as secure as it ought to be. For that to happen, wide consumer adoption must follow; hardware manufacturers and software developers must jump on the Matter band wagon. I’ll keep watch and keep reporting.

Pulitzer Prize-winning business journalist Byron V. Acohido is dedicated to fostering public awareness about how to make the Internet as private and secure as it ought to be.


(LW provides consulting services to the vendors we cover.)

 

Phishing emails continue to plague organizations and their users.

Related: Botnets accelerate business-logic hacking

No matter how many staff training sessions and security tools IT throws at the phishing problem, a certain percentage of users continues to click on their malicious links and attachments or approve their bogus payment requests.

A case in point: With business losses totaling a staggering $2.4 billion, Business Email Compromise (BEC) was the most financially damaging Internet crime for the seventh year in a row, according to the FBI's 2022 Internet Crime Report.

BEC uses phishing to trick users into approving bogus business payments to attackers’ accounts. BEC succeeds despite years of training users to recognize and address BEC emails properly and next-generation tools that harness AI, machine learning, and natural language processing to block phishing and BEC attempts.

The truth is that neither humans nor machines will ever be 100 percent successful tackling the phishing and BEC challenge. Even harnessing both side by side has not proven 100 percent effective.

What is the answer? Meld humans and AI tools into a single potent weapon that can beat the clock and catch just about every phishing email and BEC that attackers throw at it. Let’s examine how each of these strategies works and why both working together stands the best chance of solving the problem.

Leveraging AI/ML

Most people have a pretty good idea how phishing emails and BEC use social engineering to trick their unwitting victims. After extensive research and target identification, the attacker sends an innocent looking email to the victim, who is often someone in the finance department.

The email appears to come from the CEO, CFO, or a supplier, who requests with great urgency that the recipient update a supplier, partner, employee, or customer bank account number (to the attacker’s) or pay a phony late invoice. Thanks to careful research, the invoice is likely to look very convincing.

Legacy secure email gateways (SEGs) miss these phishing emails because they lack the malicious attachments and links these tools typically look for. SEGs are also only good at identifying widely known threats and require a lot of time and resources to maintain.

A more recent alternative is next-generation email security tools, which use advanced AI/ML with natural language processing, visual scanning, and behavioral analysis to recognize potential phishing emails.

Machine learning identifies and even predicts advanced attacks simply by analyzing large data sets, including emails, for similarities, correlations, trends, and anomalies. It requires few instructions and little maintenance.
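As a toy illustration of what "analyzing large data sets for similarities" looks like in code, the Python sketch below trains a text classifier on a handful of labeled emails and scores a new message. The example emails and labels are made up, and a production system would train on millions of messages and many more signals than the body text alone.

```python
# Toy phishing classifier: learn statistical regularities from labeled email
# text, then score a new message. Training data here is illustrative only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

emails = [
    "Urgent: update the supplier bank account before 5pm today",
    "Please pay the attached overdue invoice immediately",
    "Lunch menu for the team offsite next week",
    "Minutes from yesterday's project status meeting",
]
labels = [1, 1, 0, 0]  # 1 = phishing/BEC, 0 = benign

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(emails, labels)

suspect = "Our CFO needs you to wire payment to a new account today"
print("phishing probability:", model.predict_proba([suspect])[0][1])
```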

As with many security tools, however, machine learning often fails to identify zero-day attacks – in this case, spear phishing emails – if they're different enough from previous ones.

With new types of phishing emails released by millions of attackers daily, it’s no surprise that a few get past the best designed ML models. ML can catch 99 percent of phishing emails, but you need more help to catch the remaining one percent.

Human-machine melding

Fortunately, it turns out that while some people can be fooled by phishing emails, others are adept at spotting suspicious emails and the phishing attempts that ML often misses. Multiply that human capability by thousands across hundreds of organizations of all sizes and you can create a very valuable threat intelligence system.

Such a system could potentially feed new phishing information right back into the machine learning models in real time, so they can start identifying similar phishing exploits immediately. Obviously, a machine learning system trained on phishing information only seconds or minutes old will spot potential zero-day attacks much more competently and rapidly than a machine with information that is days or weeks old.
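One plausible way to wire up such a feedback loop, sketched below in Python, is to use a model that supports incremental ("online") learning, so a user-reported phish can be folded into the model the moment it is flagged. The emails are made up, and HashingVectorizer plus SGDClassifier is just one of several reasonable choices.

```python
# Sketch of the human-to-machine feedback loop: a reported phish is folded
# back into the model immediately so similar messages are caught minutes
# later rather than weeks later.
from sklearn.feature_extraction.text import HashingVectorizer
from sklearn.linear_model import SGDClassifier

vectorizer = HashingVectorizer(n_features=2**18)
model = SGDClassifier()

# Initial training batch (in reality, a large historical corpus).
seed_emails = ["pay this overdue invoice now", "agenda for monday's status meeting"]
model.partial_fit(vectorizer.transform(seed_emails), [1, 0], classes=[0, 1])

# A user flags a zero-day phish the model missed; update the model in place.
reported_phish = "CEO request: change payroll deposit details before noon"
model.partial_fit(vectorizer.transform([reported_phish]), [1])

# Moments later a similar message arrives and is scored with that fresh knowledge.
new_email = "urgent: update payroll deposit details today"
print("flagged as phishing:", bool(model.predict(vectorizer.transform([new_email]))[0]))
```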

The key is to meld the capabilities of human and machine into one, as the two working side by side with no interaction cannot be nearly as effective. This melded process must constitute a constant feedback loop with an army of hundreds of thousands of human eyeballs.

The only way to solve a problem that grows exponentially is with a solution that grows exponentially as well. This is a similar strategy used by Waze, Google Maps, and Uber to keep users out of heavy traffic and allow them to share rides.

No doubt phishing and BEC will continue to grow in both frequency and sophistication. Technology and humans cannot catch all of them alone but working tightly together they can come very close.

About the essayist: Lomy Ovadia is Senior Vice President of Research and Development at  Ironscales, an Atlanta-based email security company.

Here’s a frustrating reality about securing an enterprise network: the more closely you inspect network traffic, the more it deteriorates the user experience.

Related: Taking a risk-assessment approach to vulnerabilities

Slow down application performance a little, and you’ve got frustrated users. Slow it down a lot, and most likely, whichever knob you just turned gets quickly turned back again—potentially leaving your business exposed.

It’s a delicate balance. But there’s something you can do to get better at striking it: build that balance into your network testing and policy management.

Navigating threats

Why do so many businesses struggle to balance network security and user experience? Because recent trends create new challenges on both sides of the equation. Trends like:

•More distributed users and applications. Even before COVID, enterprises saw huge increases in people working outside the traditional corporate firewall. Today, users could be working anywhere, accessing applications and data from any number of potentially vulnerable public and private clouds. It adds up to a much larger potential attack surface.

•More dynamic environments. Security has always been a moving target, with new threat vectors emerging all the time. Today though, the enterprise network itself changes just as frequently. With software-defined networks, shifting cloud infrastructures, and continuous integration/continuous delivery (CI/CD) pipelines, the network you have today might look very different tomorrow.

•Pervasive encryption: Most application and Internet traffic is now encrypted by default, making it much harder to secure the network from malicious traffic. Inspecting encrypted traffic adds significant latency—sometimes cutting application performance literally in half. If you don’t have much higher-performing security controls than you’ve used in the past, your latency-sensitive applications can become effectively unusable.

These are big challenges, and most organizations are still searching for answers. For example, half of enterprise firewalls capable of inspecting encrypted traffic don’t have that feature turned on due to performance concerns. You might preserve user Quality of Experience (QoE) that way, but you’re leaving your business vulnerable.

A smarter approach

The constant push and pull between security and performance isn't an anomaly. It's baked into network threat defense, and no miracle tool is coming that will make the problem go away. But that doesn't mean you can't do something about it. In fact, the smartest thing you can do is acknowledge it will always be a problem—and adapt your change management processes to reflect that. You do that via synthetic testing.

Using modern emulation assessment tools, you can deploy test agents at strategic points in your environment (within the on-premises network, in public and private clouds, at branch offices, and more) to simulate the network topology. You can then inject emulated traffic to test the performance limits of your network devices, web applications, and media services with all security controls engaged.

With this approach, you can establish a baseline for application performance on the network and ensure that user QoE remains good, even with network threat controls fully engaged. You can identify the right mix and size of security solutions to deploy and validate that you’re getting what you paid for. Then—and this is the key—you can proactively verify performance and security against the established baseline every time something changes in the network.
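As a bare-bones illustration of baseline-and-retest (nowhere near what commercial emulation platforms do), the Python sketch below times simple probes against a couple of application endpoints, saves the results as a baseline, and flags regressions on later runs. The URLs and the 25 percent regression tolerance are assumptions for the example.

```python
# Record a QoE baseline with security controls engaged, then re-check it
# after every network, policy or software change.
import json
import time

import requests

# Hypothetical endpoints; substitute the applications that matter to your business.
ENDPOINTS = [
    "https://app.example.internal/login",
    "https://app.example.internal/api/orders",
]


def measure(url: str, samples: int = 5) -> float:
    """Average response time, in seconds, for a simple GET probe."""
    timings = []
    for _ in range(samples):
        start = time.perf_counter()
        requests.get(url, timeout=10)
        timings.append(time.perf_counter() - start)
    return sum(timings) / len(timings)


def record_baseline(path: str = "qoe_baseline.json") -> None:
    """Run once when everything is healthy and security controls are engaged."""
    baseline = {url: measure(url) for url in ENDPOINTS}
    with open(path, "w") as f:
        json.dump(baseline, f, indent=2)


def check_against_baseline(path: str = "qoe_baseline.json", tolerance: float = 1.25) -> None:
    """Run after every change to catch QoE regressions before users do."""
    with open(path) as f:
        baseline = json.load(f)
    for url, base in baseline.items():
        current = measure(url)
        status = "OK" if current <= base * tolerance else "REGRESSION"
        print(f"{status}: {url} baseline={base:.3f}s current={current:.3f}s")
```

Run record_baseline once when the environment is known-good, then run check_against_baseline as part of every change window.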

Balancing security and QoE

This approach is already widely used by organizations that can’t tolerate performance problems, such as service providers and financial enterprises in areas like high-speed trading. Given the steady growth of cyberthreats, encryption, and distributed users and applications, enterprises in every industry should be following their lead.

If you’re ready to implement continuous testing, here are four principles to keep in mind:

•Look beyond vendor data sheets. Enterprises often devote significant effort evaluating network security solutions prior to implementation, but surprisingly little to validating their performance once deployed. That’s a good way to get surprised. In too many cases, network and security organizations don’t even realize they have a performance problem until users start complaining.

•Emulate your unique environment. Even when a security vendor’s reported specs reflect reality, they’re based on ideal conditions—not your network. As you design your test scenarios, make sure you’re emulating the real-world production environment, with all applications and security controls configured as they will be for real users. You can then drill down into exactly what throughput looks like, what latencies different network applications are experiencing, and verify that you’re supporting your business practice.

•Think like an attacker. Along those lines, to validate security efficacy, make sure you’re testing against a realistic set of threat vectors that you’re looking to protect against. Keep in mind, attackers won’t just send basic threats; they’ll use evasions and obfuscations to try to hide what they’re doing. Your network security simulations should do the same.

•Test and test again. The most important step you can take to balance network security and performance: adopt a posture of continuous assessment. Start by identifying your baseline—what the environment looks like when everything is working as it should, when the security controls that matter to your business are active, and when your users have good QoE. Then, test against that baseline every time something changes.

Whether it’s a new network security solution, a software upgrade, a policy or configuration update, or any other change, you should immediately measure the effects of that change on user experience. You can now identify problems right away—before your users. And, since you’re measuring performance from multiple points across your environment, you can quickly zero in on their cause.

By taking these steps, you may not permanently solve the problem of balancing network security and performance. But you’ve solved it for today—and you’ve put the tools and procedures in place to keep solving it in the future.

About the essayist: Sashi Jeyaretnam is Senior Director of Product Management for Security Solutions, at Spirent,  a British multinational telecommunications testing company headquartered in Crawley, West Sussex, in the United Kingdom.