A new tier of overlapping, interoperable, highly automated security platforms must, over the next decade, replace the legacy, on-premises systems that enterprises spent multiple kings’ fortunes building up over the past 25 years.

Related: How ‘XDR’ defeats silos

Now along comes a new book, Evading EDR: The Definitive Guide to Defeating Endpoint Detection Systems, by red team expert Matt Hand, that drills down on a premier legacy security system now in the midst of this transition: endpoint detection and response (EDR).

Emerging from traditional antivirus and endpoint protection platforms, EDR rose to the fore in the mid-2010s to improve upon the continuous monitoring of servers, desktops, laptops and mobile devices and put security teams in a better position to mitigate advanced threats, such as APTs and zero-day exploits.

Today, EDR is relied upon to detect and respond to phishing, account takeovers, BEC attacks, business logic hacks, ransomware campaigns and DDoS bombardments across an organization’s environment. It’s a key tool security teams use to read the tea leaves and carry out triage, that is, to make sense of the oceans of telemetry ingested by SIEMs and get to a position where they can more wisely fine-tune their organization’s automated vs. manual responses.

Last Watchdog visited with Hand to get his perspective on what it’s like in the trenches, deep inside the world of managing EDRs, on the front lines of non-stop cyber attacks and reactive defensive tactics. He says he wrote Evading EDR to help experienced and up-and-coming security analysts grasp every nuance of how EDR systems work, from a vendor-agnostic perspective, and thus get the most from them. His guidance also happens to shed some revealing light on the ground floor of the cyber arms race while illustrating why network security needs to be overhauled.

LW: From a macro level, do security teams truly understand their EDRs? How much are they getting out of them at this moment; how much potential would you say is actually being tapped vs. left on the table?

Hand:   I don’t think that a majority of teams who rely on EDR truly understand its inner workings or are getting the most out of it. EDRs have historically been considered a “black box” – something that activity goes into, and alerts come out of. Most teams that I’ve encountered trust that their EDR works perfectly out of the box, and unfortunately that’s just not the case.

Every EDR needs to be tuned to the specific environment in which it is deployed. Some vendors have a period during customer onboarding wherein the EDR observes what is typical in the environment and creates a baseline, but this shouldn’t be the end of tuning. The next step should be building custom detections tailored to the organization. Unfortunately, most SOCs are still understaffed so detection engineering often goes on the back burner in favor of managing the alert queue.
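A custom detection of this sort needn’t be elaborate. Here is a minimal, vendor-agnostic sketch of the idea, with invented telemetry field names and a made-up host baseline (no real EDR schema is implied): the same suspicious behavior alerts everywhere except on the hosts where the environment’s own baseline says it is expected.

```python
# Illustrative only: the field names ("action", "target_process", "host") and
# the baseline list below are hypothetical, not any vendor's actual schema.

KNOWN_ADMIN_HOSTS = {"it-jumpbox-01"}      # environment-specific tuning knowledge
SUSPICIOUS_TARGETS = {"lsass.exe"}         # processes attackers commonly read

def evaluate(event):
    """Return an alert message for one endpoint telemetry event, or None."""
    if event.get("action") != "process_access":
        return None
    if event.get("target_process") not in SUSPICIOUS_TARGETS:
        return None
    # The tuning step: hosts where this behavior is expected stay quiet.
    if event.get("host") in KNOWN_ADMIN_HOSTS:
        return None
    return f"Suspicious access to {event['target_process']} by {event.get('process')} on {event.get('host')}"

if __name__ == "__main__":
    sample = {"action": "process_access", "target_process": "lsass.exe",
              "process": "rundll32.exe", "host": "finance-wks-17"}
    print(evaluate(sample))
```

The point is the tuning step: the list of known-good hosts is exactly the kind of environment-specific knowledge an out-of-the-box EDR cannot have.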

LW: Your chapter teasers suggest a ton of viable attack paths remain in the nooks and crannies of Windows systems; is this where attackers are making hay with Living off the Land (LotL) tactics? Can you please frame what this looks like?

Hand:   In any significantly complex system, there will inevitably be edge and corner cases that we just can’t account for. Windows is a very complex operating system and there are a ton of native capabilities that attackers can leverage. This can include using traditional living-off-the-land binaries or something as niche as a Win32 API function that allows for arbitrary code to be executed.
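To make that concrete, consider how a vendor-neutral, behavior-based check might key on the parent-child pairing that many distinct living-off-the-land payloads share, rather than on any single tool. This is a rough sketch with hypothetical event field names, not a production detection:

```python
# Hypothetical process-creation events; the field names are illustrative only.
NATIVE_BINARIES = {"mshta.exe", "rundll32.exe", "regsvr32.exe", "certutil.exe"}
OFFICE_PARENTS = {"winword.exe", "excel.exe", "powerpnt.exe", "outlook.exe"}

def is_suspicious_lolbin_launch(event):
    """Flag a native Windows binary spawned by an Office process, whatever the payload."""
    child = event.get("image", "").lower()
    parent = event.get("parent_image", "").lower()
    return child in NATIVE_BINARIES and parent in OFFICE_PARENTS

print(is_suspicious_lolbin_launch(
    {"image": "rundll32.exe", "parent_image": "WINWORD.EXE"}))   # True
```

Because the check targets the relationship (an Office application spawning a native system binary) rather than a specific payload, it survives the swap of one LOLBin for another.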

Finding and closing all of these attack vectors is an immense, if not entirely infeasible, task. This fact highlights the importance of growing beyond solely using brittle, signature-based detections and investing in robust detections that capture the common denominator between the many techniques and operations an attacker can employ. This is only a band-aid, though, and we should be looking to Microsoft and other OS developers to invest more in secure-by-design principles.

LW: Your book is targeted at a precious commodity: experienced cybersecurity professionals. Aren’t reactive systems that require specialized human expertise, like EDR, on their way out?

Hand:   I don’t believe so. I think the biggest problem is in reactivity and how it forces us to use our more experienced engineers. Let’s say that there is some cool new post-exploitation technique circulating. Should I pull my most experienced engineers away from building proactive defenses to test, validate, and remediate any issues or should I rely more on my vendor(s) to ensure we’re covered? If a vendor can identify and shore up a deficiency in their product, it would benefit all customers and not just those with the technical expertise to throw at the problem.

Looking beyond this, if we accept the fact that we have a staffing shortage and truly senior engineers are rare, we have two options – forge more engineers or use ours more effectively. Right now, the impact an engineer has is typically limited to their own organization. For instance, if an engineer writes a detection to catch that cool new post-exploitation technique, the outside world will likely never know.

What if, instead of keeping the output of the hard work that goes into extending the usefulness of an EDR (research, writing detections, tuning, etc.) to ourselves, we shared that information openly with others in the industry so that everyone can benefit from it? If a surgeon finds a cool new method to perform an operation that has better patient outcomes, do they squirrel it away at their hospital or do they publish it in a journal and teach others?

 LW: Where do you see EDR fitting in 10 years from now? Does it have a place in the leading-edge security platforms and frameworks that are shifting more to a focus on proactive resiliency at the cloud edge, instead of reactive systems on endpoints?

Hand:   Yes, 100%. At the end of the day, an endpoint is any system that runs code, whether those be workstations, servers, mobile devices, cloud systems, ICS, or any other type of system. The nature of endpoints has and will continue to change, but there will always be endpoints that need defending. Perimeter defense has also been around for ages, but now the nature of the perimeter is changing.

Trying to decide which is more important isn’t the conversation we should be having. Rather, we should accept that proactive hardening and increasing the resiliency of Internet-facing systems, which would fall into a “prevention” category, is equally as important as ensuring that we can catch an adversary that slips through the cracks. Realistically, if a motivated and well-resourced attacker wants to get into your environment, they will.

It’s just a matter of time. If we accept that fact, we should spend our limited time and resources making it reasonably difficult to breach the perimeter (MFA, asset management, inbound mail filtering, training) while also preparing for the inevitability of a breach by implementing robust detective controls that can catch an adversary as early in their attack chain as possible to reduce the impact of the breach and allow responders to more confidently evict them.

Cisco’s $28 billion acquisition of Splunk comes at an inflection point, as security teams begin adapting to work with modern, cloud-native data lakes.

Related: Dasera launches new Snowflake platform

For years, Splunk has been the workhorse SIEM for many enterprise Security Operation Centers (SOCs). However, security teams have challenges with Splunk’s steeply rising costs. And now, early adopters of security data lakes like Snowflake are saving more than two-thirds of what they were paying for their Splunk license.

Splunk’s inability to migrate to a modern cloud-native architecture makes it difficult to take advantage of these cost-saving benefits or implement advanced data science use cases critical for threat detection. The Cisco acquisition will likely exacerbate these challenges and speed up the adoption of security data lakes.

While it’s great to see data lakes gaining so much momentum, many security teams struggle to take advantage of them. Ripping and replacing Splunk overnight is unrealistic. Enterprise security teams need a path to incrementally migrate to a modern data lake with minimal impact on their SOC workflows.

SOCs require the ability to manage detections and analyze real-time security threats in a unified manner, regardless of where their data is stored, which is best achieved by separating their analytics layer from their data logging layer.

Here’s how to leverage the power of decoupling to create a distributed data lake architecture where security teams can choose to use multiple data platforms like Splunk and Snowflake, while maintaining a consistent security analytics layer.

Data lake connectors 

Detections are written in SQL, KQL, or SIEM-specific languages like Splunk’s SPL, while threat hunting leans on Python notebooks and a variety of data science models. That variety, combined with the volume of data in these platforms, poses processing and detection-development challenges for detection engineers who are not subject matter experts in multiple query languages. Influxes of ingested data and the flat architecture of data lakes have made it difficult to extract value from these repositories.

Relying on data collection and organization tools like the traditional SIEM to analyze the various log data for threat detection requires constant updating of the analysis methods and, more importantly, puts the onus of observability onto the security engineer. Every new data source becomes a headache for the multiple teams that must collaborate to get it into a usable state.

For detection engineers to efficiently identify and thwart potential threat actors, the data logging and analytics layers need to be decoupled. This provides the flexibility to easily grow and change security to support organizational and business changes (for example, moving from Splunk to Snowflake over time), reduce costs, and finally start to keep up with, or even stay ahead of, alerts.

Impactful analysis

A decoupled, purpose-built threat detection platform can work across distributed data lake architectures. SOC teams no longer need to modify detection logic, hunting notebooks, or data science models, nor wait for IT to prepare data sources.

Each data lake can be connected to the threat detection platform which can analyze and detect threats using a unified set of detection logic and advanced AI, with real-time normalization.

This streamlines security operations, and improves response agility, while also reducing vendor lock-in, giving CISOs flexibility for more cost-effective options. It also alleviates the cost and political implications associated with data migration and enables unified querying and analysis across multiple data lake architectures.

To achieve decoupling, organizations need to implement a unified detection layer and adopt the right AI tooling.

Implementing a unified detection layer simplifies the process of building detection content, even with diverse skill sets among security analysts. It also provides a standardized schema, enhancing the adaptability of security operations to different data storage scenarios. The unified detection layer should act as a hub for all detection content that connects to and processes detections within each data lake, regardless of the query language.
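As a rough illustration of what that hub might do, the sketch below defines one detection against a normalized schema and emits it in two target dialects. The rule format, field names, and emitter functions are all invented for this example; no vendor’s actual product or schema is implied.

```python
# A single detection authored against a normalized schema, then emitted in two
# target dialects. Rule format, field names, and emitters are invented here.
detection = {
    "name": "excessive_failed_logins",
    "source": "authentication",
    "where": {"event_type": "login_failure"},
    "group_by": "user_name",
    "threshold": 25,          # failures allowed per window
    "window_minutes": 10,
}

def to_splunk_spl(d):
    """Emit a Splunk SPL version of the normalized rule (illustrative only)."""
    return (
        f"search index={d['source']} event_type=\"{d['where']['event_type']}\" "
        f"| bin _time span={d['window_minutes']}m "
        f"| stats count by _time, {d['group_by']} "
        f"| where count > {d['threshold']}"
    )

def to_snowflake_sql(d):
    """Emit a Snowflake-style SQL version of the same rule (illustrative only)."""
    return (
        f"SELECT {d['group_by']}, COUNT(*) AS failures "
        f"FROM {d['source']} "
        f"WHERE event_type = '{d['where']['event_type']}' "
        f"AND event_time > DATEADD(minute, -{d['window_minutes']}, CURRENT_TIMESTAMP()) "
        f"GROUP BY {d['group_by']} HAVING COUNT(*) > {d['threshold']}"
    )

print(to_splunk_spl(detection))
print(to_snowflake_sql(detection))
```

The detection logic is authored once; only the final translation differs per data lake, which is what lets teams change storage back ends without rewriting their detection content.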

When you decouple the activity of threat detection from tools for which it is not inherently designed, you free up those resources to do what they need to do: address and remediate threats. Detection engineers can now spend more time protecting the business than figuring out how to protect the business.

Agnostic security

Decoupling enables rapid data access and flexibility in a distributed data lake architecture, meeting the demands of modern data management. By minimizing reliance on vendor-specific data logging platforms, data access can be expanded.

SOCs will gain control over their data storage strategy, allowing them to keep the data where it is. At the same time, SOC teams can keep pace with user expectations of more SaaS-ified, agile data management and future-proof security operations.

By leveraging a unified detection layer and AI, organizations can optimize data storage and analysis processes, leading to smarter and faster detection of security threats. Additionally, it promotes interoperability among different data sources and tools, ensuring a more seamless and flexible security infrastructure.

Data duplication and the associated operational costs are reduced, unnecessary logs and their costs are trimmed, and the dependency on having fully normalized data in your data repository is eliminated in favor of data feeds. Additionally, analysts can be more effective by leveraging low/no-code detection builders, so they neither need to worry about parsing and normalizing the data nor need to be experts in a specific query language or technology.

With this shift, you can take advantage of modern innovations in storage architectures while simultaneously gaining access to specialized detection and response innovations.

About the essayist: Kevin Gonzalez is senior director of security and operations at Anvilogic, a Palo Alto-based cybersecurity company founded by veterans from across the security industry who are building the future of AI in cybersecurity.

The ubiquity of smart surveillance systems has contributed greatly to public safety.

Related: Monetizing data lakes

Image capture devices embedded far and wide in public spaces help deter crime as well as aid first responders — but they also stir rising concerns about an individual’s right to privacy.

Enter attribute-based encryption (ABE), an advanced type of cryptography that’s now ready for prime time. I’ve had several discussions with scientists who’ve led the development of ABE over the past two decades.

Most recently, I had the chance to visit with Takashi Goto, Vice President, Strategy, and Fang Wu, Consultant, at NTT Research. We discussed how ABE is ready to help resolve some rather sticky privacy issues stemming from widespread digital surveillance – and also do much more.

For a full drill down on this leading-edge form of agile cryptography, please view the accompanying videocast. Here are my takeaways.

Customized decryption

ABE builds upon digital certificates and the Public Key Infrastructure (PKI) that underpins secure communications across the Internet. Traditionally, PKI issues a single key to decrypt a given digital asset, which is fine if the correct person possesses the decryption key.

However, cybercriminals have perfected numerous ways to steal or subvert decryption keys. ABE makes it much more difficult to fraudulently decrypt an asset in its entirety; it does this by pulling user and data attributes into the encryption picture — in a way that allows decryption to be flexible.

For instance, ABE can correlate specific company attributes to certain user attributes. It can differentiate departments, such as HR, accounting or the executive suite, as well as keep track of user roles, such as manager, clerk or subcontractor. It can then apply policies so that only users with the proper attributes can decrypt certain assets and only in very specific ways.

Alternatively, the digital asset itself — such as an image or even a video stream — can be assigned detailed attributes, with each attribute assigned a separate decryption key. A user can decrypt specific parts of an image or video stream, but only if he or she has the correct key enabling that particular access.

“ABE enables fine-grained access control and policy setting at the data layer, so you can actually blur faces or any text shown in the image,” Goto says. “You can still get useful information from the image, but if you don’t have the correct key, you won’t be able to decrypt certain attributes, such as a face or a license plate number.”
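Conceptually, the access model looks like the sketch below, which only mimics the policy logic in ordinary Python; a real ABE scheme enforces these policies cryptographically inside the ciphertext itself, and the attribute names and asset parts here are invented for illustration.

```python
# Conceptual sketch only: the policy logic is modeled in plain Python here,
# whereas a real ABE scheme enforces it cryptographically inside the ciphertext.
# Attribute names and asset parts are invented for illustration.

ENCRYPTED_STREAM = {
    "scene_background": "<ciphertext>",
    "faces": "<ciphertext>",
    "license_plates": "<ciphertext>",
}

# Which attribute combinations may decrypt which parts of the asset.
POLICIES = {
    "scene_background": lambda a: a.get("role") in {"analyst", "investigator"},
    "faces": lambda a: a.get("department") == "law_enforcement"
                       and a.get("role") == "investigator",
    "license_plates": lambda a: a.get("department") == "law_enforcement",
}

def decrypt_allowed_parts(user_attrs):
    """Return the parts of the stream this key holder's attributes unlock."""
    return [part for part, policy in POLICIES.items() if policy(user_attrs)]

print(decrypt_allowed_parts({"department": "city_planning", "role": "analyst"}))
# ['scene_background']  -- faces and license plates stay encrypted for this user
```

A key holder from city planning can reconstruct the scene but never the faces or plates, which remain encrypted, or blurred, for that user.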

Versatile benefits

It’s taken a while to get here. ABE has undergone significant theoretical advancements since 2005. But it has only been in the past couple of years that proof-of-concept projects have gotten underway. Today, Goto says, ABE is fully ready to be validated in real-world deployments.

NTT is partnering with the University of Technology Sydney to introduce an ABE service that fits with existing IT infrastructure, including cloud computing, healthcare, IoT and secure data sharing. This comes after the partners have spent the past couple of years fine tuning an architectural design that’s compatible with existing IT systems, he says.

Wu observes that ABE’s fine-grained access control capability could enhance any of the major areas of digital services that exist today, while also being future-proofed. We should soon begin to see examples of ABE being implemented in virtual computing and cloud storage scenarios — to help ensure that decryption happens only when the correct combination of attributes presents itself.

And when it comes to cloud collaboration, ABE holds promise to help improve both security and operational efficiencies — in everything from rapid software development to global supply chains to remote work scenarios.

“Attribute-based encryption can be utilized to do a number of things,” Wu noted. “It’s an advanced way to partition sensitive data into different groups and then allow the user to access only what he or she needs to access; this can play a vital role in helping to avoid large-scale data breaches.”

With ABE, encryption happens once, while decryption attributes can be amended, as needed. This adds complexity and computational overhead. But those are solvable challenges. There’s a clear path forward for ABE to improve security and help preserve privacy. I’ll keep watch and keep reporting.

Cisco’s recent move to acquire SIEM stalwart Splunk for a cool $28 billion aligns with the rising urgency among companies in all sectors to better protect data — even as cyber threats intensify and disruptive advancements in AI add a wild card to this challenge.

Related: Will Cisco flub Splunk?

Cisco CEO Chuck Robbins hopes to boost the resiliency of the network switching giant’s growing portfolio of security services. Of course, it certainly doesn’t hurt that Cisco now gets revenue from Splunk customers like Coca-Cola, Intel, and Porsche.

Last Watchdog engaged Gurucul CEO Saryu K. Nayyar in a discussion about the wider implications of this deal. Gurucul is known for its innovations in User and Entity Behavior Analytics (UEBA) as well as its advanced SIEM solutions. Here’s the exchange, edited for clarity and length:

LW: What are tech giants like Microsoft, Google and now Cisco doing in the SIEM space?

Nayyar: Microsoft, Google, and Cisco are not security-first companies, but they recognize that SIEM is at the heart of security operations, so it’s not surprising they want to get in. It seems their strategy is to leverage their existing customer base and products to get traction in this space. 

LW: Why are suppliers of  legacy firewall, vulnerability management and EDR  solutions also now integrating SIEM capabilities?

Nayyar: Many security vendors want a piece of the SIEM market, even if their technology isn’t necessarily purpose-built. These vendors aren’t so much ‘doing SIEM’; rather, they’re positioning a set of point products to solve pieces of the puzzle, not the whole puzzle. The importance of SIEM continues to rise along with the constant velocity and veracity of threats, so this trend of jumping on the SIEM band wagon will likely continue.

LW: For some historical context, could you summarize how we went from SIM to SIEM and how Gurucul came to pioneer UEBA?

Nayyar: The transition from SIM to SIEM was born out of necessity. Security teams needed greater visibility across their operating environment. Combining a security information tool with a security event tool made it easier to correlate the alerts generated by security products, like firewalls and IDS, normalize them, and then analyze them to identify potential risks.

SIEMs of today, like Gurucul’s, have evolved by leaps and bounds over legacy SIEMs with the addition of purpose-built machine learning and analytics models, along with the ability to scale.

Gurucul pioneered UEBA technology a decade ago – in fact our company was built around this capability. UEBA focuses on behavioral patterns for users and entities to identify anomalies and activity outside of the norm. We use machine learning models on open choice big data lakes to detect unknown threats early in the attack chain.

Instead of being stuck in reactive mode, security analysts could proactively determine if an attack was underway. This significantly improved their ability to accurately identify a potential threat early in the kill chain before damage happens.

LW: Then along came SOAR and next-gen SIEM, correct? What was behind the emergence of these advances?

Nayyar: SOAR gave analysts a playbook for responding to an attack campaign so they didn’t have to reinvent the wheel each time. Many attacks, while varied in how they are used, have a known set of characteristics. The MITRE ATT&CK framework is an example of how various attack techniques, even if unique, can still be mapped to known techniques and procedures. SOAR uses the output of detection engines and investigations and recommends workflows or playbooks to build a response plan, saving time and effort.

Next-gen SIEM came about to address the shortcomings of legacy SIEMs when it comes to things like ineffective data ingestion, a flood of unprioritized alerts from security control products, and weak threat detections. Early SIEMs were log management and compliance tools; they were never built to address real-time threat detection and response.

Essentially, next-gen SIEM combines the capabilities of UEBA, SOAR and XDR so security teams can proactively – and accurately – assess threats and respond quickly. Another characteristic of a next-gen SIEM is its ability to ingest and interpret any data from any source and easily scale.

LW: To what extent is Cisco’s acquisition of Splunk just a microcosm of a wider shift in network security that’s taking place? Can you frame how legacy security tools (NGFW, WAF, web gateways, SIEM, SOAR, UEBA, XDR, VM, IAM, etc.) appear to be converging, in some sense, with brand-new cloud-centric solutions (API security, RBVM, EASM, CAASM, CNAPP, CSPM, DevSecOps, ISAT, BAS, etc.)?

Nayyar: While there will always be point products to solve specific problems, the best solution for customers is a platform that combines the best-of-breed technologies into a single framework.

Related: Reviving observability.

As the SIEM has long been central to gathering data and information across the entire infrastructure, it’s naturally evolving into an observability platform where the data can be used for various use cases beyond just security, such as application and cloud performance monitoring and management. There is greater awareness that IT functions can work together to improve the gathering of data, analytics, and prioritization of security-related events to improve the organization’s resiliency.

 LW: How should a company leader at a mid-market enterprise think about all this? What’s the most important thing to keep in mind?

Nayyar: Mid-market enterprises need the ability to reduce manual tasks and detect and respond faster. They are resource-restrained and don’t typically have specialized analyst roles. They need a SIEM that can automate their workflow and provide prioritized, risk-driven context that enables them to respond to threats in real time.

LW: What do you expect network security to look like five years from now?

Nayyar: Traditional network security is becoming less relevant as edge computing and zero trust networks evolve. The incorporation of edge networking, cloud migration, and identity and access data is changing how we look at security and its interaction with IT.

However, companies making investments in their security stack will likely continue to use a layered approach rather than deprecating older controls outright. For example, antivirus will continue to be supported on endpoints even though its efficacy has dramatically declined. This also means that automating and simplifying management of these layers is important.

LW: Anything else?

Nayyar: When we look at the SIEM market, legacy log-based architectures that were built for centralized deployments have failed to provide the needed visibility and detection of threats in the cloud. And, cloud-vendor approaches, like GCP and Azure or cloud-only SIEMs, have failed to recognize that most organizations are hybrid and will continue to be hybrid for many years.

As data becomes more de-centralized and spread across multiple clouds and geographies, it becomes significantly harder to analyze and identify attack campaigns. All the while, attackers are becoming more sophisticated.

The only way to make sense of all the data is through sophisticated analysis leveraging data lakes, machine learning and AI. These capabilities exist today; security operations teams don’t have to be saddled with tools that have failed to keep up with the threat environment.

Supply chain security grows more crucial daily as cybercriminals attempt to disrupt distribution and transportation. In response, industry professionals must automate their cybersecurity tools to stay ahead.

Why so? The 2020 SolarWinds cybersecurity incident — which industry experts call the supply chain attack of the decade — was an incredibly high-profile breach affecting massive corporations. While it may seem like an outlier, it reveals an alarming trend.

Professionals on the incident response team believe cybersecurity hasn’t improved and no one has learned from the situation. They point out how supply chains rely on software yet lack the security tools to protect them.

Simply put, cyberattacks are on the rise. Data breaches exposed over 37 billion records in 2020 alone — a 141% jump from 2019. Businesses must automatically secure their supply chains to protect themselves and comply with consumer-protection laws.

Automation best practices

The best practices for automating supply chain cybersecurity cover each stage of the process, ranging from installation to use.

•Comprehensive Integration. Organizations will only get the full benefits of supply chain cybersecurity automation with thorough integration. What use is automatic threat detection without an immediate response? A single-function tool creates security gaps since it needs to rely on others.

•Scalability. Tools should be scalable to grow with the business and maintain security. For example, automatic threat response software must be able to handle security even during a surge in malicious activity. Supply chain professionals have to ensure their technology can scale to meet demand increases.

•Ongoing monitoring. While automated tools can be beneficial, businesses must track them to ensure success. Ongoing monitoring is one of the best practices for cybersecurity automation because it results in optimal functioning. Supply chain professionals will need to measure performance metrics patiently to see how the technology improves upon previous tools.

•Vendor inclusion. Most supply chains rely on third-party vendors, but those vendors increase the chance of cyber attacks. Businesses still trust them to handle cybersecurity because it’s convenient, yet even vendors that are careful and use quality security measures broaden the attack surface.

For example, experts believe the June 2023 MOVEit supply chain cyber attack originated from a third-party employee working with cybercriminals. A single individual’s actions resulted in a data breach reaching more than 160 organizations.

Automatic third-party risk management identifies potential relationship vulnerabilities, improving cybersecurity. Businesses should include this approach in their automation process to minimize security gaps and better protect themselves.

Tools tips

Although automation itself is convenient, its integration can be time consuming and complex. Supply chain professionals should consider implementing these tips to improve their processes. Here’s what to use for supply chain cybersecurity automation:

•Quality tools: Better tools have higher performance potential. For example, quality artificial intelligence only needs milliseconds to process millions of data points.

•Employee support: Many automated tools need human oversight or maintenance to reach their full potential. Their performance would benefit from employee support.

•Modern tools: Companies should overhaul legacy systems to reduce security gaps between them and the new automation technology.

•Quality data: Data-driven automation technology is only as good as the information it collects. Professionals must ensure they only use relevant, accurate details.

While many tools can complete tasks independently, only some can do so securely. Cybersecurity automation is most effective when organizations leverage quality technology and manual assistance.

Automation benefits

Timeliness, efficiency, reduced downtime and improved protection against cyber attacks are the top benefits of supply chain cybersecurity automation. Processes like threat identification and incident response move much more quickly and are often more accurate.

Efficiency is one of the most significant benefits of supply chain cybersecurity automation. Industry leaders need help finding skilled workers, with around 57% of organizations stating labor shortages are their largest obstacle as of 2023.

Businesses should consider adopting cybersecurity automation technology since it’s a cost-effective approach to labor shortages. Additionally, it may produce higher-quality work since many tools leverage massive data sets.

Automatic supply chain cybersecurity is essential for modern-day organizations, considering how cyber attacks continue to become more frequent. They must implement the best practices and consider optimizing their processes to protect themselves.

About the essayist: Zac Amos writes about cybersecurity and the tech industry, and he is the Features Editor at ReHack. Follow him on Twitter or LinkedIn for more articles on emerging cybersecurity trends.

‘Clean Code’ is a simple concept rooted in common sense. This software writing principle cropped up some 50 years ago and might seem quaint in today’s era of speedy software development.

Related: Setting IoT security standards

At Black Hat 2023, I had the chance to visit with Olivier Gaudin, founder and co-CEO, and Johannes Dahse, head of R&D, at SonarSource, a Geneva, Switzerland-based supplier of systems to achieve Clean Code. Olivier outlined the characteristics all coding should have and Dahse explained how healthy code can be fostered. For a drill down, please give the accompanying podcast a listen.

Responsibility for Clean Code, Olivier told me, needs to be placed with the developer, whether he or she is creating a new app or an update. Caring for source code when developing and deploying applications at breakneck speed mitigates technical debt – the snowballing problems associated with fixing bugs.

Guest experts: Olivier Gaudin, co-CEO, Johannes Dahse, Head of R&D, SonarSource

“If you try to go faster but don’t take good care of the code, you are actually going slower,” Olivier argues. “Any change is going to cost you more than it should because your code is bad, dirty, junky or whatever you want to call it that’s the opposite of clean code.”

What’s more, Clean Code improves security —  by reinforcing “shift left,” the practice of testing as early as feasible in the software development lifecycle.

Olivier and Dahse make a persuasive argument that Clean Code can and should arise as the innermost layer of security. The transformation progresses. I’ll keep watch and keep reporting.

Editor’s note: I recently had the chance to participate in a discussion about the overall state of privacy and cybersecurity with Erin Kapczynski, OneRep’s senior vice president of B2B marketing. OneRep provides a consumer service that scrubs your personal information from Google and dozens of privacy-breaching websites. Here is Erin’s Q&A column, which originally went live on OneRep’s well-done blog.

For the first expert interview on our blog, we welcomed Pulitzer-winning investigative reporter Byron V. Acohido to share his ideas about the current cyber threat landscape, the biggest threats for businesses today, the role of AI and machine learning in cyberattacks and cyberdefence, and the most effective methods for companies to protect themselves.

Byron is the founder and editor-in-chief of The Last Watchdog on Privacy & Security. He previously chronicled the emergence of cybercrime while covering Microsoft for USA TODAY. His news analysis columns, podcasts, and videos are crafted to foster a useful understanding of complex privacy and cybersecurity developments for company decision-makers and individual citizens — for the greater good.

Erin: So, let’s get started. How did you first get interested in cybersecurity as a career? What drew you to this field?

Byron: I was initially drawn to cybersecurity as a USA TODAY technology reporter assigned to cover Microsoft. I held this position from 2000 through 2014, during which time Windows emerged as a prime target for both precocious script kiddies and emerging criminal hacking rings. I began to research and write about the drivers behind what was happening to businesses and to individual consumers using Windows, both the evolving threats and the emerging business/home network defenses.

Erin: How has the cyber threat landscape evolved since you first got into cybersecurity?

Byron: Since I started, the cyber threat landscape has grown exponentially, with more sophisticated attacks and diverse attackers ranging from individual hackers to professional criminal rings to state-sponsored entities. We’ve arrived at a critical juncture: to enable the full potential of the Internet of Everything, attack surface expansion must be slowed and ultimately reversed. A shift from legacy, perimeter-focused network defenses to dynamic, interoperable defenses at the cloud edge, directed at ephemeral software connections, must fully play out.

Erin: What cybersecurity technologies are you most excited about right now?

Byron: On the software side of things, some exciting breakthroughs are about to gain meaningful traction in leveraging machine learning and automation to shape new security platforms and frameworks that are much better suited to helping companies implement cyber hygiene, as well as execute effective, ongoing threat detection and incident response. Adding to this will be very smart uses of generative AI – centered around wisely directing LLM capacities onto specific data lakes containing threat intelligence information. On the hardware side, major advances in semiconductors as well as rising deployment of optical-based networking hubs will make a huge difference in efficient management of vastly interconnected, highly interoperable systems; amazing new digital services will be the result — and also improved cybersecurity and robust digital resiliency. These emergent software and hardware advances will pave the way for factoring in quantum computers.

Erin: What are some of the biggest cyber threats that businesses face today?

Byron: The economic impact of phishing, ransomware, business logic hacking, Business Email Compromise (BEC) and Distributed Denial of Service (DDoS) attacks continues to be devastating. However, I’d argue that the fundamental cyber threat is within: in the lack of awareness and/or lack of due diligence on the part of company decision-makers who leave their organizations vulnerable; such leaders have been slow to embrace cyber hygiene practices and fail to grasp why they need to wisely select the security tools and services that can make their organization more resilient to cyber attacks.

Erin: Could you share your thoughts on the role of artificial intelligence, machine learning and the growth of IoT devices in both cyber defense and cyberattacks?

Byron: Organizations are oriented toward leveraging these technologies to innovate and gain competitive advantage, without paying close enough attention to how they also expand their network attack surface. Their dual-edged nature demands careful implementation and management. The flip side (and the good news) is that we’re entering an era where advanced cloud configuration, threat detection and threat response capabilities that leverage machine learning and automation are more readily available than ever before. More good news: there’s a trend toward increasingly proficient MSSPs stepping forward to help SMBs, mid-market enterprises and large enterprises do this.

Erin: Deep fakes are becoming more sophisticated. How can individuals and organizations detect and protect themselves against the misuse of deep fake technology?

Byron: To detect deep fakes, organizations can use digital watermarking, AI-driven detection tools, and media provenance tracking.

Erin: In your opinion, what are the most common cybersecurity mistakes that companies make?

Byron: Companies often underestimate threats, neglect basic cyber hygiene, and fail to educate employees on cybersecurity.

Erin: What are some of the most common social engineering tactics that cybercriminals use?

Byron: Phishing, pretexting, SMS toll fraud, baiting and tailgating are among the common tactics used by cybercriminals.

Erin: What role does human error play in cybersecurity incidents? How can companies minimize risks?

Byron: It’s a significant factor in many breaches. Regular training and simulations can help reduce risks associated with human errors.

Erin: How has the ransomware threat evolved in recent years?

Byron: It’s gone from simple file encryption to multifaceted, multi-staged attacks that leverage Dark Web services, such as initial access brokers (IABs), as well as make use of Living off the Land (LotL) embedded tools. To subvert improved network defenses, ransomware purveyors continually innovate to penetrate deeply, avoid detection, cause disruption and ultimately put the targeted company in a posture where paying the ransom is the least evil.

Erin: What are the cybersecurity implications of remote workforces?

Byron: Post-COVID-19, the shift to a remote workforce is here to stay. Zero trust — and more specifically, zero-trust network access, or ZTNA — thus has become a must-have capability. A user gets continually vetted, with only the necessary level of access granted, per device and per software application; and behaviors get continually analyzed to sniff out suspicious patterns. Remote access is granted based on granular policies that take the least-privilege approach.

Erin: What are some of the most effective methods for companies to protect themselves from cyberattacks?

Byron: Gaining accurate visibility of all cloud and on-premise digital assets; configuring cloud IT infrastructure wisely; adopting ZTNA principles; implementing robust cyber hygiene, based on NIST standards; conducting regular audits, including advanced penetration testing; conducting ongoing, effective threat detection and response; and implementing leading-edge software applications security practices for all software development and deployment, including software updates — these are the best practices of the moment.

Erin: What advice would you give to leaders to improve cybersecurity culture in their organizations? What is the role of cybersecurity awareness training for a company’s employees?

Byron: Leadership should prioritize cybersecurity at all levels. Regular awareness training for employees is indispensable.

Erin: Do you think cyber insurance should play a bigger role in companies’ cybersecurity strategies? What factors should organizations consider when selecting a cyber insurance policy?

Byron: It’s an important risk management tool. Organizations should consider coverage limits, policy exclusions, and incident response assistance when selecting a policy.

Erin: What role should governments play in combating cybercrime?

Byron: Governments and industry standards bodies are, in fact, moving methodically to drive adoption of stricter privacy and data security standards in areas such as IoT home device safety, data privacy, software bills of materials, and supply chain security. Organizations can and should get ahead of these compliance trends to gain competitive advantage and to assure long-term viability.

Erin: How do you see cyberwarfare between nation-states shaping up in the future?

Byron: It has been steadily intensifying and can be expected to continue to do so, with Russia, China and North Korea continuing to improve their respective positions to carry out attacks on critical infrastructure, while also continuing to manipulate social media and mainstream news outlets — to spread disinformation campaigns in order to gain strategic advantages. Russia, China and North Korea are setting an example; lesser nations with despot leaders are likely to play copycat – and develop and utilize their versions of asymmetrical warfare for self-serving reasons. Where this all leads is unknowable.

Erin: What advice would you give to someone looking to get started in a cybersecurity career?

Byron: Stay curious, keep learning and seek mentors. Experience in the field is as valuable as formal education.

Erin: What skills or certifications do you think are most important for cybersecurity professionals to have?

Byron: While certifications like CISSP and CISM are valuable, hands-on skills, critical thinking, and problem-solving are equally important.

Erin: What are the top three sources of information about cybersecurity you can recommend to people who want to stay up on developments in this area?

Byron: Stay updated with reports from cybersecurity firms, follow cybersecurity news portals, and join professional networks and forums.

Erin: What is your vision for the future of cybersecurity over the next decade? What trends do you expect to see? What gets you most excited?

Byron: Massive interconnectivity at the cloud edge is just getting started and will only intensify, going forward. This portends amazing advancements for humankind – but first a tectonic shift in network-centric security must fully play out. The stakes are sky-high, and the cybersecurity industry is at a critical juncture. A new tier of overlapping, interoperable tools, platforms and frameworks is direly needed. This new architecture must result in security getting baked deep inside the highly interconnected systems that will give us autonomous transportation, climate-rejuvenating buildings and spectacular medical breakthroughs.

Something simply must be done to slow, and ultimately reverse, attack surface expansion.

Related: What Cisco’s buyout of Splunk really signals

We’re in the midst of driving towards a dramatically scaled-up and increasingly connected digital ecosystem. Companies are obsessed with leveraging cloud-hosted IT infrastructure and the speedy software development and deployment that goes along with that.

And yet it remains all too easy for malicious hackers to get deep access, steal data, spread ransomware, disrupt infrastructure and attain long-term unauthorized access.

I heard a cogent assessment of the shift that must take place at the Omdia Analyst Summit at Black Hat USA 2023. In a keynote address, Omdia’s Eric Parizo, managing principal analyst, and Andrew Braunberg, principal analyst, unveiled an approach they coined as “proactive security.”

What I came away with is that many of the new cloud-centric security frameworks and tools fit as components of proactive security, while familiar legacy solutions, like firewalls and SIEMs, can be categorized as either preventative or reactive security. This is a useful way to look at it.

Rising reliance on proactive tools seems inevitable, although legacy tools continue to advance and have their place. The Omdia analysts called out a handful of key proactive methodologies: Risk-Based Vulnerability Management (RBVM), Attack Surface Management (ASM), and Incident Simulation and Testing (IST).

RBVM solutions don’t merely identify vulnerabilities; they quantify and prioritize them, making risk management more strategic. Notably, some 79 percent of enterprises recently polled by Omdia consider this risk-ranking capability indispensable.
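The core idea can be shown with a simplified scoring sketch; the weights and field names below are invented for illustration and are not Omdia’s or any vendor’s actual model.

```python
# Simplified illustration: severity alone doesn't set the queue order; exposure,
# exploit activity, and asset criticality move findings up or down. The weights
# and field names are invented, not any vendor's actual scoring model.

def risk_score(finding):
    base = finding["cvss"] / 10.0                       # 0..1 technical severity
    exploit = 1.5 if finding["exploited_in_wild"] else 1.0
    exposure = 1.4 if finding["internet_facing"] else 0.8
    criticality = {"low": 0.5, "medium": 1.0, "high": 1.6}[finding["asset_criticality"]]
    return round(base * exploit * exposure * criticality * 10, 1)

findings = [
    {"id": "CVE-A (placeholder)", "cvss": 9.8, "exploited_in_wild": False,
     "internet_facing": False, "asset_criticality": "low"},
    {"id": "CVE-B (placeholder)", "cvss": 7.5, "exploited_in_wild": True,
     "internet_facing": True, "asset_criticality": "high"},
]

for f in sorted(findings, key=risk_score, reverse=True):
    print(f["id"], risk_score(f))   # the lower-CVSS but exposed finding ranks first
```

Under a ranking like this, a medium-severity flaw on an internet-facing, business-critical asset with an active exploit jumps ahead of a “critical” CVSS score sitting on an isolated lab machine.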

Last Watchdog followed up with Braunberg to ask him, among other things, what RBVM solutions signal about the ramping up of proactive security. Here’s what he had to say:

LW: What is ‘proactive security’ and why is it gaining traction?

Braunberg: Proactive solutions seek out and mitigate likely threats and threat conditions before they pose a danger to the environment. These tools provide visibility, assessment, and control of an organization’s attack surface and an understanding of viable attack paths based on asset exposures and the effectiveness of deployed security controls. Omdia believes it is gaining traction because, for too long, enterprises have been investing in security solutions that only help after an attack is already on their doorstep – or has broken down the door! Proactive Security finally helps get ahead of adversaries, finding and fixing the opportunities they seek to exploit, before they can exploit them.

LW: Legacy on-prem tools tend to be preventative, advanced on-prem tools are reactive and the shiny new cloud-centric solutions are proactive. Is that fair?

Braunberg: Well, it’s fair to say that modern software defined architectures, such as cloud, can introduce many more potential exposures and that a proactive approach is particularly effective in identifying and controlling configuration drift in these environments. But Omdia believes that a mix of preventative, reactive, and proactive tools are appropriate across all components of the digital landscape.

LW: Your ‘continuous security protection lifecycle’ argument suggests we’re in an early phase of what: co-mingling, consolidation, or integration of these three categories?

Braunberg: Omdia sees several trends at work in the market today. There is a strong trend of consolidation in proactive security segments. We predict that proactive security functionality will roll up into comprehensive proactive security platforms over the next several years. But we also see traditional reactive security suites incorporating proactive features. So, we expect consolidation, co-mingling, and integration for the foreseeable future.

LW: How would you characterize where we are today?

Braunberg:  There is significant innovation and investment in many traditional segments of proactive security. This is driven primarily by a desire to support better risk-based analytics to prioritize risk and better inform remediations. But as noted, we are also in the early stages of market consolidation.

LW: What does Cisco’s $28 billion acquisition of Splunk signal about the trajectory that network security is on?

Braunberg: It’s less about network security as much as it is about filling a need for Cisco. The networking giant sees Splunk as a premium brand in a market segment, SIEM, that it had yet to enter, giving Cisco a strong opportunity to upsell existing Cisco Secure customers.

LW: Won’t companies have to rethink and revamp long-engrained budgeting practices?

Braunberg: Absolutely. Omdia believes that over the coming years, enterprises should and will increase the percentage of their cybersecurity technology budgets allocated for proactive security solutions. Not only will this provide a forward-leaning approach to get ahead of threats and threat conditions before they can hurt the enterprise, but it will also reduce cybersecurity risk, in turn providing improved ROI for the security solution.

LW: How does ‘risk-based vulnerability management’ factor in?

Braunberg: RBVM will play a key role in proactive strategies. These products are already expanding into more comprehensive tools for addressing security hygiene issues across the entire digital domain for both production code and code in development.

LW: Can you characterize what’s happening in the field today with early adopters of this approach?

Braunberg: Omdia’s recent primary research, the 2023 Omdia Cybersecurity Decision Maker Survey, querying global security practitioners, found an overwhelming need to rank vulnerabilities and to prioritize next actions based on risk. Early adopters of proactive tools are primarily focused on this need.

LW: What are you hearing from these early adopters?

Braunberg: In addition to the obvious benefit of more efficient, effective security practices (in the form of specific product categories like risk-based vulnerability management, which provides prioritization and remediation decisions based on contextual risk to the organization), we’re also hearing an increased emphasis on the core tenets of Proactive Security: visibility and risk.

Proactive Security helps underscore the importance of being able to detect, define, categorize, and understand the risk of all assets in the extended enterprise environment. From there, it becomes possible to identify opportunities to address threat conditions, such as the need for software patches, vulnerable configurations, or even poor practices and policies.

Going forward, this will further the importance of maturation on security risk, leading to more dedicated risk teams and discerning ROI from security solutions based on their ability to reduce risk.

LW: Five years from now, will it be equal parts proactive, preventative and reactive — or some other mix?

Braunberg: It’s too early to say what the pie chart might look like, but for most organizations today, the priority is to increase the emphasis on and shift toward Proactive Security, from both a strategic and technical planning perspective. Omdia believes it’s time to shift the conversation to one of ROI based on risk reduction, and vendors offering Proactive Security solutions will be best positioned to make that case.

LW: Anything else?

Braunberg: We just published our new report on the Fundamentals of Proactive Security, which is a 6,000-word deep dive on the topic. It’s available to Omdia Cyber clients. Plus, we’ll have more on Proactive, on our sister site Dark Reading, and elsewhere in the near future.

Surrounded by the invisible hum of electromagnetic energy, we’ve harnessed its power to fuel our technological marvels for decades.

Related: MSFT CEO calls for regulating facial recognition tech

Tesla’s visionary insights from 1900 hinted at the potential, and today, we bask in the glow of interconnected networks supporting our digital lives. Yet, as we embrace this wave of connectivity, we often overlook the pressing need for protection.

Since 1984, when Japan’s pioneering 1G network blanketed the nation, we’ve been swept up in the excitement of progress. But let’s pause and consider—how often do we truly contemplate safeguarding ourselves from the very forces that fuel our interconnected world?

Link to identities

Over the past decade, mobile data traffic has surged an astonishing 4,000-fold, while an additional 400 million users have joined the digital realm over the past 15 years. As we venture into the era of 5G and witness the rise of private networks, the surge of electromagnetic charge is palpable, raising questions about the potential consequences.

Beyond the realms of charge, there lies a pivotal concern—the intricate linkage between our data and identities. This burgeoning fusion necessitates a higher level of vigilance, given the expanding ambit of our digital footprints.

The Mobile Phone Penetration concept, mentioned in all mobile economy forecasts, unveils an intricate dance between usage and population. Often overlooked, the SIM card—short for Subscriber Identification Module—acts as the nexus between our identity and technology, illuminating the thin line between connection and surveillance.

Gazing toward the horizon of 2030, an ambitious vision looms—a vision of achieving a 90% average subscriber penetration and smartphone adoption across Europe, China, CIS, and the USA. Such ambition thrusts mobile devices into the hands of nearly everyone over the age of 12, inviting us to reconsider our interaction with these potent tools.

Yet, as we hold these devices close, we’re forced to ponder—why does our understanding of their inner workings remain so limited? How can we fortify ourselves against potential threats? The dichotomy is striking—our dependency on technology has deepened, but our comprehension of its nuances lags.

Paradox challenge

Beyond the realm of sensitivity, consider our data—the intrepid voyager navigating electromagnetic currents. Recent revelations, such as the TechCrunch exposé on “Spyhide stalkerware,” unmask the vulnerability of our devices. The exposé recounts the stealthy exfiltration of private phone data from a staggering 60,000 compromised Android devices dating back to 2016.

Herein lies the paradox—data centers, government strongholds, and even spacecraft are fortified with Faraday technology against electromagnetic threats, while individuals who champion this cause are often typecast as cinematic caricatures. Think Gene Hackman’s paranoia in “Enemy of the State,” or the intrigue-laden worlds of “Mr. Robot” and “Mission Impossible.” These portrayals obscure the reality that personal data protection is far from a fanciful notion.

This paradox further extends to our interaction with technology. Despite our daily reliance on devices, our grasp of their mechanics remains tenuous, mirroring our limited understanding of complex economic systems.

In this unfolding narrative, education emerges as the harbinger of change. An evolution beckons—the “New Normal.” This new era demands selective signal blocking, conscious data guardianship, and a resolute commitment to digital privacy. In this paradigm, devices transform from mere instruments to instruments of empowerment, propelling human interaction to the forefront.

The clarion call is clear—craft new rituals, where data holds sacred value, shared purposefully. Let devices augment human connection, not replace it. Cultivate an awareness of their ability to listen, and use it as an impetus to seize control. Dance to the tune of empowerment, where trust is fortified.

Even in our material realm, simplicity prevails. The solution lies not in elaborate (and illegal)  jamming tech installations, but in the subtle elegance of Faraday Signal Blocking Products — guardians of privacy.

An imperative emerges—knowledge and data, potent instruments, should not rest in the hands of the few. For, as history has shown, the wielders of knowledge possess power. The moment to reclaim control over devices is now. Let’s create new Habits and embrace the New Normal.

About the essayist: Nikoline Arns creates projects that prioritize privacy and freedom of expression, particularly in the context of social impact. Since 2018, she has been aligned with Web3 values. In her latest venture, she has joined forces with SignalBlockerProducts.com to introduce privacy solutions for both office spaces and households.

Once again, politicians are playing political football, threatening a fourth partial government shutdown in a decade.

Related: Biden’s cybersecurity strategy

As this political theater runs its course, one of the many things at risk is national security, particularly on the cyber warfare front. Given the divergent paths of the U.S. Senate and the U.S. House of Representatives, federal agencies could see funding largely choked off on Sunday, resulting in the furloughing of hundreds of thousands of federal workers.

A wide range of federal government services, once more, would slow to a crawl —  everything from economic data releases to nutrition benefits for poor children. And the Cybersecurity and Infrastructure Security Agency (CISA) may be forced to send home some 80 percent of its workforce, drastically shrinking its capabilities as a catalyst for public-private sharing of fresh threat intelligence.

Out of 3,117 employees, only 571 will remain active during a shutdown, based on the Department of Homeland Security’s updated plan for “lapse in appropriations.” This plan contrasts with most other DHS sectors, where employees like airport screeners and FEMA staff will continue their duties during the shutdown.

Last Watchdog caught up with Rep. Lou Correa, D – Calif., who serves on the House Homeland Security Committee, and is the top Democrat on the Border Security and Enforcement Subcommittee. Here’s Correa’s observation:

“Our national security will be put at risk because of the political stunts being pulled by my Republican colleagues right now. Whether it’s the Cybersecurity and Infrastructure Security Agency or the Department of Homeland Security, thousands of federal workers who serve on the front lines of our nation will be doing so with little-to-no agency support—and will be forced to work without pay.

“That takes a toll on morale, will cause staffing shortages, and will put American lives at risk. My colleagues on the other side of the aisle must put political gamesmanship aside and pass clean government spending bills—to prevent catastrophe, keep our constituents safe, and our government open for its citizens.”

Last Watchdog also sought commentary from cybersecurity thought leaders: here’s what they had to say:

Martin Jartelius, CISO, Outpost24

CISA ceasing to function will lead to organizations being less prepared to respond to the same threats we would see with or without it in operation. CISA is just one of several sources to turn to for information and support; many organizations start by finding a trusted provider and, as they grow and mature, tap into several sources to get good insight. Backing this with a solid inventory of your attack surface so you can prepare to defend . . . should replace those bits many rely on CISA for with something more tangible and hands-on.

Tim Helming, security evangelist, DomainTools

CISA’s ability to carry out the same level of intelligence gathering and analysis that they usually do may be affected. It may mean that the staff remaining available after the shutdown will be stretched thin and overtaxed. CISA has been quite prolific with advisories and it’s likely that the pace could slow during the shutdown. None of this means that we’re going to see an uptick in successful attacks . . . as always, we need to be highly vigilant; there have been several high-profile breaches in the last couple of weeks unrelated to the shutdown, and those certainly warrant tight operations.

Colin Little, security engineer, Centripetal

A federal government shutdown can weaken the nation, leaving it more vulnerable to cyberattacks and potentially harming international cooperation in the realm of cybersecurity. Maintaining robust cybersecurity practices during a shutdown should be a top priority to mitigate these risks and ensure the continued protection of critical systems and sensitive data. Think of it in terms of an active warzone; if 80 percent of front-line units stopped receiving troop pay, reinforcements and supplies, the result would be disastrous especially over a protracted period of time.