A workflow consists of a set of tasks and usually follows an algorithm that determines their order based on external or internal contributing factors.

In the DevSecOps world, getting the right sequence at the right time and place is paramount. Various critical tasks need to be ‘orchestrated’ in the right order for the desired results.

While the tasks themselves are largely automated, the real challenge lies in connecting them seamlessly. Orchestration tools shine in this area, serving as the linchpin that binds individual tasks together.

Job orchestration is the process of automating the sequence and management of multiple tasks that form a workflow or process. It should be capable of handling all scenarios, complexities and system dependencies. The goal of automating workflows is to enhance efficiency while minimizing redundancy and business costs.

Enterprises can opt for niche tools or go for a consolidated platform to orchestrate DevSecOps workflows. Using a common platform provides better integration among the different tasks and fosters democratization among DevOps stakeholders. However, there are a few best practices that one must follow while implementing job orchestration in the DevSecOps environment.

Best Practices for Job Orchestration in DevSecOps

1. Define Your DevSecOps Pipelines Clearly

DevSecOps processes are structured into phases, each with a multitude of tasks to be executed. For instance, in a rollout scenario, nodes need to be scanned. This requires a pre-prepared node list and the execution of some scripts. Lastly, the status needs to be checked again. All these tasks might need to be performed at a specific time, for a specific region and in a specific manner (for instance, ten at a time). Orchestrating these tasks becomes an easier process with well-defined DevSecOps pipelines.
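As a hypothetical sketch of the rollout step described above (the node-list format, the `scan_node` placeholder, and the batch size of ten are illustrative assumptions, not a prescribed implementation), orchestrating a batched, region-specific scan might look like this:

```python
def scan_node(host):
    """Placeholder for the real scan script; assume it returns a pass/fail bool."""
    return True

def scan_in_batches(nodes, region, batch_size=10):
    """Scan the nodes of one region in batches (e.g., ten at a time),
    then return the per-node status for the final check."""
    regional = [n for n in nodes if n["region"] == region]
    results = {}
    for i in range(0, len(regional), batch_size):
        batch = regional[i:i + batch_size]
        results.update({n["host"]: scan_node(n["host"]) for n in batch})
    return results

# Pre-prepared node list, as the scenario requires.
nodes = [{"host": f"node-{i:02d}", "region": "eu"} for i in range(25)]
status = scan_in_batches(nodes, region="eu")
print(f"{sum(status.values())} of {len(status)} nodes scanned cleanly")
```

A well-defined pipeline makes each of these parameters (region, batch size, node list) explicit, which is exactly what an orchestration tool needs to run the step reliably.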

2. Adopt the “As-Code” Methodology

In a modern deployment pipeline, your orchestration tool enables you to automate your workflow with code. When you use code to define your workflow, your processes become more flexible, efficient and consistent. It also creates a common language understood by disparate team members and aids in better cross-collaboration. In short, using an ‘as-code’ method to orchestrate your workflows fosters democratization in DevOps.

3. Choose Scalable Tools

If you’re a growing enterprise, scalability is a factor you cannot ignore. Orchestrating a few tasks for a small number of nodes is easy. But, does your tool scale when your environment changes? Does it scale when the number of nodes increases significantly? Does it scale when your IT architecture changes? Test for scalability before you decide.

4. Incorporate Security Early On

Make sure that security is ingrained in all your workflow processes right from the beginning. Shifting security left is the single most important factor in DevSecOps. Staying true to this philosophy necessitates that all platforms, including the orchestration tool that you use, embed security into their processes.

5. Implement Version Control

Using an as-code method for orchestration means that you will be writing scripts to define the workflows. Therefore, it goes without saying that you must keep your orchestration scripts under version control to maintain consistency and traceability and to promote collaboration amongst team members. Additionally, your versioned scripts minimize redundancy. Even an orchestration script written for an ad-hoc event can be reused in similar settings in the future.

6. Provide Greater Cross-Team Visibility

The importance of collaboration in the DevSecOps process cannot be stressed enough. To nurture this culture, it is important that all stakeholders have complete visibility or, at minimum, visibility that caters to their role. Using an orchestration tool provides a consolidated view, either as a dashboard or report.

7. Include Security and Compliance Scans

Consistent security and compliance checks help identify vulnerabilities and maintain regulatory and organizational policies in your DevSecOps pipelines. 

These are just some of the best practices to adhere to while working with job orchestration tools in DevOps. Job orchestration is critical in this era of complex IT environments and DevSecOps practices. 

When you integrate and manage numerous tasks and processes with the help of an orchestration tool, you’re moving towards greater efficiency, lowered costs, fewer errors and faster time to market. All of these embody the DevOps principles of continuous delivery and continuous improvement.


The post 7 Best Practices for Job Orchestration appeared first on Cybersecurity Insiders.

Building on its collection of out-of-the-box metrics, SeeMetrics is now built with every user in the security organization in mind

SeeMetrics, the leading cybersecurity data fabric for metrics automation and risk management, today announces the expansion of the platform to all members of the security team, addressing the diverse and pressing needs of security leaders and teams for specific metrics use cases.

Providing the fastest transition from siloed operational product data into a range of different dashboards and views, SeeMetrics now meets various security users’ entire range of measurement needs, helping them easily narrate their particular cybersecurity story, including hygiene, risk, board reporting and more.

The demand for metrics varies based on the role: a CISO may require metrics for strategic oversight or board reporting, GRC teams may need to focus on policies and frameworks, the head of vulnerability management may need to measure program performance, while product SMEs may need to measure their specific tools. SeeMetrics has expanded to cater to these distinct needs, offering the flexibility to serve various purposes while leveraging the same underlying data as the foundation for all measurements. Users can dive into metrics to see historical trends and select the relevant filters to view the data that matters most to them. This adaptability ensures that each team can access the insights they require, whether for high-level strategy, day-to-day operations or reporting purposes.

SeeMetrics aggregates, correlates and normalizes data from all the different products and allows the user to easily filter the metric for the relevant need. This is used to quickly identify the gaps and gain a multiplier effect of improvement across the stack. All of SeeMetrics’ boards, designed with out-of-the-box metrics to serve specific measurement needs, are entirely customizable, allowing each user to add or remove metrics as needed, ensuring that the data aligns with the users’ unique security and business needs, along with the narrative they wish to communicate to stakeholders.

Modern cybersecurity organizations rely on a wide array of tools and are composed of diverse teams, each with distinct roles and objectives. Although metrics play a crucial role in cybersecurity, making them truly valuable requires a huge amount of resources: 90% rely on static spreadsheets, manually fed from dozens of siloed security products, to present measurements and enforce policies. This offline process hinders continuous access and visibility into operational data, complicates communication with the board, insurers and other auditors, and makes alignment between the various roles very challenging.

“SeeMetrics is ushering in a new standard for the way metrics are generated and leveraged, improving overall security governance. Now, we allow a range of security players to access their individual metrics, so each can have their own board,” said Shirley Salzman, Co-Founder and CEO of SeeMetrics. “This new standard means that across the entire security organization, teams are getting continuous, trending, flexible, and enriched metrics, with no manual work needed.”

The post SeeMetrics Expands The Use of Cybersecurity Metrics to Empower The Full Security Team appeared first on Cybersecurity Insiders.

In the ever-evolving landscape of enterprise technology, a seismic shift is underway. Network as a Service (NaaS) is not just another IT trend; it’s a fundamental reimagining of enterprise connectivity that’s reshaping the digital landscape. As a veteran with over a decade in networking, I’ve witnessed numerous technological shifts, but none as transformative as the rise of NaaS.

The Financial Revolution: From CapEx to OpEx

NaaS is revolutionizing network expenditure, transforming it from a capital-intensive model to an operational one. This shift is gaining unprecedented momentum across various network domains, from campus LANs to wide-area networks. According to recent industry projections, the global NaaS market is expected to experience remarkable growth in the coming years, with a compound annual growth rate (CAGR) of 35%.

Key Business Advantages of the NaaS Model:

1. Improved Cash Flow Management

NaaS allows businesses to shift from unpredictable capital expenses (CapEx) to a subscription-based operational expense (OpEx) model:

  • Predictable costs make budgeting easier and reduce financial strain.
  • Avoiding large upfront investments frees up cash flow for other areas of the business.

2. Greater Flexibility to Scale Resources

NaaS offers seamless scalability, enabling businesses to adjust network resources based on changing demands:

  • Scalability on demand: Businesses can instantly increase or decrease capacity without physical hardware investments.
  • Faster response to market changes: NaaS supports quick expansion or contraction as business needs evolve.

3. Always Current Technology

NaaS eliminates the risk of outdated hardware and software by continuously providing access to the latest technology:

  • Automatic updates ensure businesses always operate on the most current systems without the need for disruptive upgrades.
  • No hardware refresh cycles: New technology is integrated seamlessly, ensuring continuous modernization.

To illustrate the stark differences between traditional networking approaches and NaaS, consider the following comparison:

Democratizing Cutting-Edge Technology

In today’s fast-paced tech environment, staying current is both challenging and expensive. NaaS is changing this dynamic by democratizing access to advanced networking capabilities.

Key technologies that NaaS is bringing to the forefront:

  • Zero Trust Security architectures
  • AI-driven network optimization
  • Advanced microsegmentation
  • Edge computing integration
  • 5G and IoT convergence

Empowering Network Professionals: From Maintenance to Innovation

Contrary to earlier fears of job displacement, NaaS is elevating the role of network professionals. By offloading routine tasks, it frees engineers to focus on strategic initiatives.

This shift allows network professionals to:

  • Engage in high-level network design and architecture
  • Develop innovative solutions to business challenges
  • Focus on security strategy and implementation
  • Drive digital transformation initiatives

Business Agility: The New Competitive Edge

In today’s digital-first world, agility is paramount. NaaS provides the flexibility and scalability needed to respond quickly to market changes.

This agility manifests in several ways:

  • Rapid deployment of new network services
  • Easy expansion into new geographic markets
  • Quick adaptation to changing workloads and traffic patterns
  • Seamless integration of new technologies and applications

A global manufacturing client I recently advised was able to set up a fully operational branch office network in a new country within three days using NaaS, a process that previously took months.

The Road Ahead: NaaS 2.0

Looking to the future, NaaS is evolving rapidly. Key trends to watch include:

  1. Deeper integration with 5G for enhanced mobile and IoT connectivity
  2. Advanced AI for predictive network management and self-healing networks
  3. Quantum-safe security measures to counter emerging threats
  4. Seamless multi-cloud networking capabilities

Conclusion: Embracing the NaaS Future

The shift to NaaS represents more than just a change in network management; it’s a fundamental rethinking of networking’s role in business strategy. For CTOs and IT leaders, embracing NaaS isn’t about keeping up with trends—it’s about positioning organizations for success in an increasingly digital world.

As someone who’s navigated the complexities of enterprise networking for years, I’m excited about the possibilities NaaS brings. It’s not just a more efficient way to manage networks; it’s a catalyst for innovation, enabling businesses to focus on their core competencies while leveraging cutting-edge network capabilities.

In an era where agility and innovation are paramount, NaaS offers a strategic pathway for businesses to stay ahead of the curve. As enterprises consider this transition, the key is to approach it thoughtfully—start by evaluating current network challenges and identifying where NaaS can provide the most immediate impact. A gradual, phased adoption allows for smoother integration while minimizing disruption. By focusing on scalability, security, and alignment with business objectives, IT leaders can ensure that NaaS not only enhances their network infrastructure but also empowers their organizations to thrive in a rapidly evolving digital landscape. Now is the time to embrace NaaS, not just as a technological shift, but as a cornerstone of future-ready business strategy.

The future of networking is here, and it’s as a service. The question isn’t whether to embrace NaaS, but how quickly you can leverage it to drive your business forward. Those who act decisively will find themselves at a significant advantage in the digital race.


The post How NaaS is Reshaping Enterprise Connectivity appeared first on Cybersecurity Insiders.

According to a Gartner® press release, “spending on security services – consulting, IT outsourcing, implementation and hardware support – is forecast to total $90 billion in 2024, an increase of 11% from 2023.” However, with a cyberattack occurring every 44 seconds and a breach costing about $5 million to fix, things still look grim.

The old barrack-like cybersecurity model cannot defend against this barrage of attacks. While leaders acknowledge the importance of cybersecurity to business continuity, reputation, and trust, they expect it to be absolute. This viewpoint assumes the enterprise is a finite entity where security could be enforced top-down and at the edge. Given the ubiquity of a shared cloud environment, hybrid working culture, mobile workforce, and an expanding network, pumping money and expecting 100% cybersecurity is akin to the Pentagon aphorism of “providing all assistance short of actual help”.

Since the goalposts shift with evolving threats, there is no end state for cyber-readiness, and 100% security is not possible. However, most incidents can be traced back to a smaller set of avoidable vulnerabilities or known unknowns. Also important is cyber-resilience, which allows businesses to operate in a degraded environment where access to networks and data is uncertain. 

To drive cyber-readiness and resilience, enterprises need to strategically reevaluate their approach to cybersecurity with an eye on the evolving threat landscape. Here’s what needs to change:

  • Secure Access, Not Access Points: With trends such as Bring Your Own Device, remote work, and cloud-hosted data centers and SaaS applications, connectivity requirements can no longer be served by an enterprise-controlled network. The edge is unmanageable, which calls for embedding security protocols across each layer of the network, including devices, applications, and users. A Zero Trust Network Architecture assumes a breach has already happened and treats all entities as suspects, requiring identity and intent verification for each access request. It ensures the right users get access to the right applications and data at the right time and enables enterprises to provide secured connectivity to applications and data across devices, locations, users, or networks.
  • Prioritize Risk-Based Vulnerability Management: With increasing attack vectors, it is critical to identify zero-tolerance areas that require urgent attention to contain the attack and those that can be isolated with minimal disruption. Business risk prioritization contributes to appropriate monitoring and incident response mechanisms. This brings visibility and oversight to areas with regulatory implications. Most regulatory fines are not about why a breach happened but if the enterprise did everything under its control to pre-empt it. Ensuring critical areas are always on the radar helps cover the ground, allowing niche resources to focus on what matters the most.
  • Rely on Strategic Partners: While they monitor and track metrics internally to make strategic calls on cybersecurity, enterprises must also be aware of vulnerabilities outside their remit that can have a domino effect. With interconnected and global ecosystems, external vulnerabilities can be equally damning. Enterprises must collaborate with systems of intelligence, strategic partners, and industry consortia to collate threat data and analyze it to inform their cybersecurity practices. To keep the lights on, business continuity plans must consider the connected nature of operations, fail-safe measures, and disaster recovery.
  • Go on the Offensive: The growing Generative AI (GenAI) clout will only further the loss of enterprise agency and potentially open loopholes for bad actors to exploit. A World Economic Forum report states that GenAI will take two years to give defenders an advantage over attackers. AI increasingly leads social engineering attacks with the prevalence of deep fakes, sophisticated phishing attempts, and digital arrests. Creating AI mechanisms to combat them and creating the required intelligence to deter AI-led malicious activity at source can help enterprises go on the offensive. This can be too niche for a single enterprise, making a collaborative approach led by industry consortia ideal. 
  • Improve Visibility by Automating and Orchestrating: Two-thirds of cyber incidents can be traced back to human error – where an employee falls prey to bad actors or does something intentionally that proves damaging. The way to address this is to eliminate human touchpoints and embed automation, AI, and machine learning mechanisms to take over and orchestrate low-level tasks encoded with security policies. A pivot to zero trust architecture, along with a DevSecOps framework for automation, also helps contain human-led errors with better enforcement of role-based access policies.
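The per-request verification described in the first bullet can be sketched as follows; the policy table, role mapping, and request fields here are hypothetical illustrations, not a real product's API:

```python
from dataclasses import dataclass

@dataclass
class AccessRequest:
    user: str
    device_trusted: bool
    mfa_passed: bool
    resource: str

# Hypothetical policy: which roles may reach which resources.
POLICY = {"payroll-db": {"finance"}, "build-server": {"devops"}}
ROLES = {"alice": "finance", "bob": "devops"}

def authorize(req: AccessRequest) -> bool:
    """Zero-trust style check: verify identity, device posture,
    and intent on every single request, never by network location."""
    if not (req.device_trusted and req.mfa_passed):
        return False
    allowed_roles = POLICY.get(req.resource, set())
    return ROLES.get(req.user) in allowed_roles

print(authorize(AccessRequest("alice", True, True, "payroll-db")))  # finance user, allowed
print(authorize(AccessRequest("bob", True, True, "payroll-db")))    # wrong role, denied
```

The key design point is that nothing is granted by default: a failed device or MFA check short-circuits the decision before the role policy is even consulted.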

Defend Forward

Traditional approaches to cybersecurity are reactive. In today’s high-stakes environment, it is crucial for enterprises to proactively detect threats, search for vulnerabilities, establish systems to take corrective actions, and prevent malicious actors from making an impact. Cybersecurity must adapt to a military-style threat intelligence collection and preparedness that keeps enterprises ahead of bad actors. Partnering with service providers specializing in cutting-edge cyber defense strategies can be a start while instilling a security-first mindset through personnel training and automation-first tooling is more essential than ever.


The post Five Strategies for Creating Water-Tight Cybersecurity for Business Outcome & Value appeared first on Cybersecurity Insiders.

Non-Human Identities (NHIs) such as service accounts, tokens, access keys, and API keys, are critical components of modern business operations across all sectors and industries. Take the financial services industry: NHIs play a fundamental role in technologies like blockchain and open banking, managing secure access and data integrity across increasingly decentralized environments. As organizations adopt more cloud services and automation, the number of NHIs grows exponentially. In an average enterprise environment today, NHIs outnumber human identities by a factor of 10x-50x.

However, NHI management is often neglected, leaving misconfigurations, unrotated secrets, and overprivileged access vulnerabilities exposed to unauthorized access, data exfiltration, and ultimately, costly cyberattacks. 

NHIs are the access points to enterprise data and applications, making them attractive targets for cybercriminals. NHIs frequently possess elevated privileges to carry out their tasks, which heightens the risk if their credentials are compromised. In fact, on average, we find that there are five times more highly privileged NHIs than humans. 

Adding to this issue, traditional Privileged Access Management (PAM), and Identity & Access Management (IAM) solutions and best practices cannot address the scale, ephemerality, and distributed nature of NHIs. Unlike human users, NHIs cannot be protected with Multi-Factor Authentication (MFA), which makes it harder to limit the impact of breaches. While password rotation for human accounts is a mature and efficient process, the same cannot be said for secrets and keys due to the lack of visibility of usage and ownership context. While solutions like secret scanners can help spot vulnerabilities such as hard-coded or shared secrets, the operational complexity of performing operations like rotations or decommissioning is often insurmountable.  

With traditional identity best practices rendered obsolete and NHIs proliferating every day, the industry needs solutions to properly secure this massive attack surface. The recent Dropbox, Okta, Slack, and Microsoft cyberattacks, which involved the exploitation of NHIs, spotlight the costly effects of improper NHI management. 

Against this backdrop, organizations must incorporate comprehensive NHI management into their security and identity programs. Key best practices for managing NHIs include: 

  • Maintain a comprehensive and up-to-date inventory of all NHIs within the organization 
  • Understand the business context and owners of each NHI  
  • Apply the principle of least privilege  
  • Monitor the environment continuously to detect and respond to suspicious activities involving NHIs  
  • Define governance policies and implement them via automation  
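A minimal sketch of the first three practices above, i.e. inventorying NHIs, tracking ownership, and flagging least-privilege violations. The record fields and example identities are assumptions for illustration:

```python
from dataclasses import dataclass
from typing import Optional, Set

@dataclass
class NHI:
    name: str
    owner: Optional[str]        # business owner; None means unowned
    permissions_used: Set[str]   # observed usage
    permissions_granted: Set[str]

# Hypothetical inventory entries.
inventory = [
    NHI("ci-deploy-token", "platform-team", {"deploy"}, {"deploy", "admin"}),
    NHI("metrics-api-key", None, {"read"}, {"read"}),
]

# Governance checks: identities with no owner, and identities granted
# permissions they never actually use (least-privilege violations).
unowned = [n.name for n in inventory if n.owner is None]
overprivileged = [
    n.name for n in inventory if n.permissions_granted - n.permissions_used
]
print("unowned:", unowned)
print("overprivileged:", overprivileged)
```

Even this toy check surfaces the two findings the text calls out: an identity nobody owns, and an identity carrying unused elevated rights.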

Secret rotation is a key NHI governance process to prioritize. All too often, NHIs leverage secrets that are infrequently rotated. Rotating secrets reduces the risk of credential compromise by minimizing the window of opportunity for attackers and mitigating exposure to insider threats. Rotating secrets should become an integral part of organizations’ mover/leaver processes to safely offboard employees. 
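The rotation-window idea can be sketched as below; the 90-day threshold, secret names, and timestamps are hypothetical assumptions, not a policy recommendation from the text:

```python
from datetime import datetime, timedelta, timezone

MAX_AGE = timedelta(days=90)  # hypothetical rotation policy

# Hypothetical last-rotation timestamps per secret.
secrets = {
    "payments-api-key": datetime(2024, 1, 10, tzinfo=timezone.utc),
    "ci-signing-key": datetime.now(timezone.utc) - timedelta(days=5),
}

def due_for_rotation(last_rotated, now=None):
    """A secret is due when its age exceeds the policy window,
    shrinking the window of opportunity for a stolen credential."""
    now = now or datetime.now(timezone.utc)
    return now - last_rotated > MAX_AGE

stale = [name for name, ts in secrets.items() if due_for_rotation(ts)]
print("rotate:", stale)
```

Running a check like this continuously, and wiring its output into automated rotation, is what turns the policy into the governance automation the best practices above call for.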

Adopting an enterprise platform purpose-built to secure the complete lifecycle of NHIs is a simple and effective way to avoid cyber incidents stemming from the unique challenges of managing and securing NHIs. Investing in these tools is necessary to protect against evolving threats and uphold security in a dynamic digital landscape. 

Implementing an NHI management platform can empower organizations with:

  • Complete visibility, providing a holistic view of all NHIs, and understanding their usage; dependencies; and relationships within an IT stack.
  • Proactive security posture management, continuously assessing and improving the security posture of NHIs, and taking proactive measures to mitigate risks.
  • Automated governance, automating the entire lifecycle of NHIs from discovery to decommissioning, ensuring robust security and operational efficiency.
  • Seamless integration, integrating with an existing security stack, providing a unified approach to identity management.

Until recently, identity security was synonymous with governance and access management for human identities. This is no longer the case as NHIs have massively expanded the enterprise perimeter. Notable high-profile cyber incidents have underscored how compromised NHIs can lead to significant security breaches, highlighting why a robust NHI management framework is a strategic imperative for sustaining business operations in our interconnected world. Modern NHI management solutions are pivotal in addressing these challenges and helping organizations prevent potentially devastating cyberattacks. 


The post Non-Human Identity Management: Addressing the Gaping Hole in the Identity Perimeter appeared first on Cybersecurity Insiders.

As companies work to reap the benefits of artificial intelligence (AI), they also must beware of its nefarious potential. Amid an AI-driven uptick in social engineering attacks, deepfakes have emerged as a new and convincing threat vector. Earlier this year, an international company lost $25 million after a financial employee fell for a deepfake video call impersonating the company’s CFO. While such a story may sound like an anomaly, the reality is that generative AI is creating more data than ever before—data that bad actors can use to make attacks more convincing. Additionally, the technology has enabled such attacks to multiply, growing from single attempts into tens of thousands of attacks, each tailored to the target in question.

As deepfakes and other AI-generated social engineering attacks continue to become more common and convincing, companies must evolve beyond traditional threat intelligence. To remain secure, they must leverage AI themselves, embrace segmentation, and educate their employees on an ongoing basis.

Fighting fire with fire

Deepfakes are an extremely sophisticated way for bad actors to get through the door. Instead of receiving an oddly worded email from an alleged Nigerian prince, AI can help bad actors send highly personalized and convincing emails that mask the usual red flags. Once they have access to the network, they can start exporting, collecting and sharing data that can be used to build a convincing attack for their target. Thus, companies need tools that can identify a normal baseline for every user’s schedule and behavior. Then, AI can be leveraged to quickly identify and remediate anomalies that arise, like someone logging in at weird hours or stockpiling large amounts of information. By employing AI to detect suspicious activity, companies can sift through tremendous amounts of noise to uncover red flags.
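A toy sketch of the baseline-and-anomaly idea described above; the login-hour history and the three-sigma threshold are illustrative assumptions, not a specific product's detection logic:

```python
import statistics

# Hypothetical history of one user's login hours (24h clock).
login_hours = [9, 9, 10, 8, 9, 10, 9, 8, 9, 10]

# Establish the user's normal baseline.
mean = statistics.mean(login_hours)
stdev = statistics.stdev(login_hours) or 1.0

def is_anomalous(hour, threshold=3.0):
    """Flag logins far outside the user's normal schedule."""
    return abs(hour - mean) / stdev > threshold

print(is_anomalous(9))   # within normal hours
print(is_anomalous(3))   # a 3 a.m. login is far off baseline
```

Real systems baseline many more signals than login time (data volumes, access patterns, geography), but the principle is the same: learn what normal looks like per user, then surface the deviations instead of drowning in raw event noise.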

Embrace segmentation 

The impact of deepfakes and other social engineering attacks can be minimized by dramatically shrinking the attack surface through segmentation. Government agencies with extremely sensitive data have always had several rings of protection: unclassified, classified, and top-secret networks. This is a mindset all companies must embrace. Having everything on a single network is extremely risky, even if that network uses zero-trust principles. 

In fact, the recent CrowdStrike outage completely debilitated airlines because they have everything on a single network, which creates a single point of failure. In addition to separating crown jewel data from less critical data, it can also be useful to rely on different applications, such as using Microsoft Teams for standard messaging and a dedicated chat capability for more sensitive conversations. Segmenting networks, communication styles, and data enclaves ensures that, even if a bad actor gets through the door using a deepfake, they won’t have complete and total access to sensitive information.

Educate employees

In an ideal situation, segmentation and anomaly detection aren’t required because bad actors never get in at all, which is why educating employees on the rise of deepfakes may be the most effective way to ensure company-wide security. Zero-trust is a mindset—not just a technology or a protocol—and teaching employees to be extremely diligent can go a long way. If there’s even a small chance that a request is nefarious, employees should be encouraged to verify it outside of the channel the request came in on. That may mean picking up the phone and simply calling the individual in question. Additionally, teaching about the capabilities that exist and reminding employees to think before they click are simple but effective ways to prevent deepfakes. 

Altogether, the technology available to bad actors is going to continue to evolve, but companies can keep up with the pace of change by deploying AI themselves, embracing segmentation, and educating their employees about the threats that exist. Without these steps, organizations will remain vulnerable to deepfakes and other social engineering attacks, which leaves their data and reputations at risk.


The post How to Stay Ahead of Deepfakes and Other Social Engineering Attacks appeared first on Cybersecurity Insiders.

Introduction

With 90% of the world’s data being created in the last two years, and the total amount of data set to reach 181 Zettabytes in 2025, IT and security teams are under pressure to help their business leverage data to support key business initiatives, without exposing its data to risk. The challenges of identifying, monitoring, and protecting sensitive information have intensified. Many organizations struggle with fragmented tools and limited data discovery, as well as manual and weak classification accuracy that fail to scale, leading to data security blind spots that expose critical data to risks. This new reality has paved the way for a new security category to dramatically rise in popularity. Data Security Posture Management (DSPM) has become vital for providing continuous visibility, automatic classification, and security posture of sensitive data spread across growing SaaS, IaaS, PaaS, and existing on-premises environments.

This 2024 DSPM Adoption Report is based on a comprehensive survey of 637 IT and cybersecurity professionals that reveals how organizations are approaching DSPM, the challenges they face, the effectiveness of their current solutions, and their adoption plan over the next 12 months. Through this survey, we uncover the critical needs and priorities of enterprises when it comes to securing their data across various environments.

Key Survey Findings

  • DSPM Adoption on the Rise: DSPM is becoming the fastest-growing security category with 75% of organizations saying they will adopt DSPM by mid-2025. This is a faster rate of adoption than that of Security Service Edge (SSE) solutions, Extended Detection and Response (XDR), and Cloud Security Posture Management (CSPM). This rapid adoption reflects the recognition that DSPM is crucial for managing data security risks in modern, multi-environment infrastructures, especially given the vital role that data plays within the business.
  • Visibility Gaps Weaken Security Postures: An overwhelming 83% of respondents believe that a lack of visibility into data is weakening the overall security posture of their organizations. This underscores the need for tools that provide comprehensive and real-time visibility into sensitive data across all environments.
  • The Data Discovery and Data Classification Gap: A staggering 87% of enterprises find their current data discovery and classification solutions lacking, with only 13% considering them very effective. This underscores a critical deficiency in data security practices, emphasizing the urgent need for more precise and automated solutions to safeguard sensitive information.
  • Challenges in Detecting and Responding to Exposures: More than 60% of organizations do not feel confident in their ability to detect and respond to data security and privacy exposures. This highlights a critical gap that must be addressed through enhanced monitoring, automated response capabilities, and better alignment between detection tools and security strategies.
  • Core DSPM Features: Real-time data monitoring (43%), data discovery (38%), and data classification (35%) are seen as the core features that enterprises should prioritize in any DSPM proof of value engagement. These features are essential for providing the visibility and control needed to secure sensitive data effectively, as real-time monitoring and integration with discovery and classification have been historically lacking.

We would like to extend our gratitude to Cyera for their insights and valuable contributions to this report. Their expertise in the data security space has been instrumental in creating this important research.

As organizations continue to navigate the complexities of data security, we hope this report, based on the responses of your peers, provides valuable insights and practical guidance for strengthening your data security posture. By addressing the challenges outlined and prioritizing the key features and strategies discussed, we are confident that your organization will be well-equipped to manage the risks associated with sensitive data in the years ahead.

Best,

Holger Schulze

Founder, Cybersecurity Insiders

Today’s Biggest Data Security Challenges

Data security remains a top priority for organizations as they navigate an increasingly complex threat landscape, driven by the rapidly increasing frequency and cost of data breaches. This growing financial and business impact, along with the complex regulatory landscape and the expanding use of cloud and AI technologies, makes it essential for organizations to enhance their data security posture. The primary challenges around data security today, as reflected in the report, highlight the tension between ensuring robust data protection and managing data access and visibility within diverse environments.

The results reveal that 57% of respondents view excessive data access—often stemming from overprivileged accounts—as a pressing concern. Overprivileged access to data and the lack of visibility into sensitive data—cited by 50% of respondents as a significant challenge—are the two greatest data security challenges today. This validates the need for a stronger correlation between identity and data access, ensuring that only the right individuals have access to the right data at the right time. Managing exceedingly large amounts of data was also cited by 46% of respondents, reflecting the growing difficulty of maintaining control over expanding data sets, particularly in hybrid and cloud environments.

Additional responses include: Data accuracy given incomplete data visibility, which can lead to incorrect conclusions 39% | Concerns over restrictive data access – overly constrictive controls 35% | Lack of visibility or control over how SaaS services transfer and use sensitive data 33% | Other 1%

 

Given these findings, organizations should prioritize implementing solutions that enhance visibility and control over sensitive data without stifling business operations. Focusing on technologies that enable granular data discovery, coupled with automated policy enforcement, can help mitigate the risks associated with both overprivileged access and poor data management. Moreover, maintaining a comprehensive and real-time view of sensitive data will allow organizations to proactively address security gaps and avoid the pitfalls of excessive or restrictive access controls.

Critical Data at Risk

Because data breaches have far-reaching consequences, understanding which types of data organizations are most concerned about is crucial for shaping effective security strategies.

The survey reveals that customer data, at 68%, and financial data, at 63%, are by far the top concerns for IT and cybersecurity professionals, reflecting the value and high stakes associated with the compromise of these types of information.

Customer data is the most valued, as it directly impacts customer trust and brand reputation. Financial data follows closely, given the potential for immediate monetary loss and regulatory repercussions. Intellectual property, cited by 37% of respondents, underscores the importance of safeguarding proprietary information which can protect a company’s competitive edge within the market. Interestingly, employee data and health records are still significant concerns at 36% and 28% respectively, highlighting the breadth of data types that organizations must protect.

Additional responses include: Operational data 22% | Partner data 19%

 

Organizations should ensure that their security posture is tailored to protect these critical data types, with an emphasis on technologies that offer robust encryption, access controls, and continuous monitoring. As customer and financial data are the top priorities, implementing data-centric security measures that focus on these areas can help prevent breaches and mitigate the impact should they occur. Additionally, aligning data security measures with the specific risks associated with each type of data—such as intellectual property or employee information—will create a more resilient and adaptive security framework.

Insider and Third-Party Risk

Understanding which entities pose the greatest data security risks is crucial for organizations as they seek to protect sensitive information from both internal and external threats.

The survey results reveal that employees, at 45%, are seen as the most significant concern, highlighting the ongoing challenge of insider threats. This is particularly critical as employees often have extensive access to sensitive data as part of their daily job and often switch roles throughout their tenure at an organization, making them a potential weak point in an organization’s security posture.

Third parties, including partners, contractors, and auditors, follow at 31%, underscoring the risks associated with external relationships. As organizations increasingly rely on third-party services, the potential for data exposure grows, making it essential to manage and monitor these interactions carefully. The risk grows exponentially as “Nth party” users, the third parties of third parties, often require access to company data as well. Rising concerns around AI copilots (14%) and other non-human identities (10%), such as IoT devices, together accounting for 24%, point to the new challenges posed by emerging technologies, which can introduce vulnerabilities if not properly secured.

To address these concerns, organizations should adopt a comprehensive data security strategy that includes visibility into what sensitive data is accessible by insiders, rigorous third-party data access visibility and control, and proactive measures to secure AI and IoT technologies’ access to data. By doing so, they can reduce the likelihood of data incidents from both existing and emerging threats, ensuring that sensitive information remains protected.

Data Discovery Roadblocks

Discovering and managing sensitive data is the foundation of any successful data security program or strategy because it directly impacts an organization’s ability to protect its most critical digital assets. Without effective data discovery, organizations face significant risks, including high exposure to data security risks, noncompliance with regulations centered around data, and the inability to mitigate the impact of data incidents in a timely manner.

The report reveals that organizations face numerous challenges in this area, which can impede their ability to protect sensitive information and respond to emerging threats. The most significant challenge, cited by 56% of respondents, is the difficulty in maintaining up-to-date data. A lack of continuous data discovery exacerbates this issue, making it difficult for teams to stay current with the ever-changing data landscape. Additionally, 52% of participants report challenges in consolidating and analyzing data from multiple sources, a problem that is often compounded by the lack of support across heterogeneous environments—cloud, SaaS, DBaaS, and on-premises. This creates significant barriers to obtaining a unified view of data across these diverse platforms.

Time-consuming processes and the lack of real-time visibility into data, noted by 44% and 38% of respondents respectively, further emphasize the inefficiencies that hinder effective data security. These issues not only slow down response times but also increase the risk of missing critical security events. Moreover, the lack of scalability (32%) and the high potential for human error (31%) indicate that many organizations are struggling to keep pace with the growing volume of data and the intricacies involved in managing it securely.

To overcome these challenges, organizations should invest in advanced data discovery capabilities that offer automation, real-time visibility, and scalability across heterogeneous environments. These solutions must also support a mix of structured, unstructured, and semi-structured data. By doing so, they can reduce the manual effort involved, minimize the risk of errors, and ensure that their data security posture remains strong as data environments grow and evolve.

Data Classification Hurdles: Automation Is Key

Following data discovery, the next critical step in a robust data security strategy is data classification, enabling organizations to identify and prioritize the protection of their most sensitive information. However, the report reveals significant challenges organizations face with their current data classification methods, which can severely undermine the effectiveness of their overall data security strategy.

The most pressing issue, identified by 49% of respondents, is the time-consuming nature of data classification processes. This challenge is largely due to the lack of automation and a continued reliance on manual methods, which slows down the process and increases the risk of leaving sensitive data exposed for longer than necessary. Similarly, 46% of participants report difficulties in consolidating and analyzing data from multiple sources, reflecting the complexities of managing classification across diverse environments, such as cloud, SaaS, DBaaS, and on-premises systems.

A lack of real-time visibility into data classes, cited by 42% of respondents, further exacerbates these challenges, making it difficult for organizations to maintain an up-to-date understanding of their data landscape. The inability to automatically learn new classifications, noted by 41%, highlights a significant gap in adaptability, which is crucial in dynamic data environments. False positives due to regular expressions (32%) and a lack of customizability (30%) also present barriers, potentially leading to misclassified data and inefficient protection measures.

To address these challenges, organizations should consider adopting advanced data classification tools that offer automation, real-time visibility, and adaptive learning capabilities. These capabilities can be found within DSPM solutions that use Large Language Models (LLMs) to classify data that is unique to the organization or a particular industry. By integrating these features, companies can streamline the classification process, reduce the reliance on manual efforts, and ensure that their data is accurately classified and protected according to its sensitivity. Additionally, enhancing customizability and reducing reliance on rigid regular expressions will allow for more precise and context-aware classifications, ultimately strengthening the organization’s data security posture.
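To make the regular-expression problem concrete, here is a minimal, hypothetical Python sketch (not taken from the report or any specific product) showing why pattern-only classification over-matches, and how even a simple validation step—here a Luhn checksum for card-like numbers—eliminates one whole class of false positives:

```python
import re

# A naive "credit card" rule: any 16-digit run. This is the kind of
# regex-only classification the survey associates with false positives.
CARD_RE = re.compile(r"\b\d{16}\b")

def luhn_valid(number: str) -> bool:
    """Checksum used by real card numbers; filters random 16-digit strings."""
    total = 0
    for i, ch in enumerate(reversed(number)):
        d = int(ch)
        if i % 2 == 1:       # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

def classify(text: str) -> list[tuple[str, bool]]:
    """Return each regex hit with whether it survives Luhn validation."""
    return [(m, luhn_valid(m)) for m in CARD_RE.findall(text)]

sample = "order id 1234567890123456; card 4539578763621486"
# The order id matches the regex (a false positive) but fails the checksum;
# the second number passes both checks.
```

The same principle—validating a pattern hit against surrounding context—is what LLM-based classifiers extend much further, at the cost of more compute per document.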

Impact of Data Visibility on Security Posture

Data visibility is a cornerstone of a strong security posture, as organizations must know what data they have, where it resides, and how it is being accessed in order to protect it effectively.

The survey results underscore the critical nature of data visibility, with 83% of respondents acknowledging that a lack of visibility into data is weakening the overall security posture of enterprises. This reflects widespread concern about the risks associated with blind spots in data management.

Specifically, 39% of participants believe that insufficient data visibility significantly raises their security risk. This finding aligns with earlier concerns about over privileged access and the difficulty in managing large amounts of data, emphasizing how these gaps in visibility can lead to vulnerabilities that are easily exploited by malicious actors. Only a small percentage (8%) believe that data visibility does not impact risk, suggesting that most organizations recognize the importance of having a clear and comprehensive view of their data.

For organizations, it’s essential to prioritize tools and processes that enhance their ability to locate, monitor, and manage sensitive data. Implementing continuous data discovery, combined with real-time monitoring, can help close the visibility gaps that currently expose organizations to unnecessary risk. By improving visibility, companies can strengthen their security posture and reduce the likelihood of data breaches or unauthorized access.

Current Methods for Data Inventory and Discovery

Effective data inventory and discovery are essential for maintaining a robust security posture, yet the survey results reveal that organizations are still relying on a diverse array of tools, many of which do not integrate seamlessly with one another. This fragmented approach can hinder the ability to stay up-to-date with changes in data and complicate efforts to analyze data across the environment, ultimately impacting overall data security.

The most commonly used methods include Data Backup and Recovery (46%) and Data Loss Prevention (DLP) solutions (45%). While these tools play crucial roles in protecting and recovering data, they often function independently, leading to silos that limit visibility and coordination across the organization. Data Detection and Response (DDR) is also widely used (41%), further confirming that teams are employing multiple specialized tools to address various aspects of data security.

Governance, Risk, and Compliance (GRC) at 38% and Data Privacy Software at 37% reflect the growing emphasis on regulatory compliance and privacy concerns, yet these tools also may not fully integrate with other discovery methods. Security Information and Event Management (SIEM) systems, used by 36% of respondents, provide valuable insights, but often lack the comprehensive data visibility needed for effective discovery across hybrid environments. The use of newer approaches like SaaS Security Posture Management (SSPM) and Cloud Security Posture Management (CSPM), at 29% and 23% respectively, indicates a move toward cloud-focused solutions. Even so, it’s important to note that unlike DSPM, SSPM and CSPM are not focused on data security, but rather on the security posture of the underlying applications or infrastructure.

Additional responses include: Data Access Governance (DAG) 24% | Cloud Security Posture Management (CSPM) 23% | Data Security Posture Management (DSPM) 19% | Other 3%

 

For organizations to overcome the challenges of tool fragmentation, it is critical to adopt solutions that integrate data inventory and discovery across environments. By focusing on platforms that offer comprehensive and unified data visibility, companies can streamline their discovery processes, reduce data silos, and ensure that they are fully equipped to manage and protect their data effectively in an increasingly complex landscape.

Methods for Gaining Visibility into Sensitive Data

Achieving comprehensive visibility into sensitive data across diverse environments is a growing challenge for organizations, especially as they manage data in cloud, on-premises, and hybrid settings. The survey results highlight the varied approaches that companies are taking to address this challenge, yet they also reveal significant gaps that undermine data security efforts.

Over half of the respondents (53%) indicate that they rely on different security services for each of their environment types, such as SaaS, IaaS, PaaS, and on-premises. This fragmented approach complicates data visibility and increases the likelihood of blind spots across the enterprise. While 49% of organizations have adopted at least some form of integration across security solutions, which offers a more unified view of data, it’s clear that many companies still struggle to consolidate their security efforts across diverse platforms.

Additionally, a concerning 36% of respondents continue to rely on manual data cataloging and classification processes. This reliance on manual methods not only increases the risk of human error but also slows down the ability to respond to security threats quickly. Compounding this issue, 27% of organizations report that they do not currently have a solution that supports visibility across all environments, further exposing them to risk.

For organizations to strengthen their data security posture, it’s critical to move away from siloed and manual approaches. Adopting integrated solutions that provide comprehensive visibility across all environments will help reduce gaps and improve the efficiency of data protection efforts. By streamlining data discovery and classification, companies can ensure they have a clear and real-time view of their sensitive information, regardless of where it resides.

Effectiveness of Data Discovery and Classification

Understanding the effectiveness of data discovery and classification tools is essential for organizations aiming to protect their most critical sensitive data. The survey results reveal a mixed picture, highlighting both progress and ongoing challenges in these key areas.

When it comes to data discovery solutions, only 14% of respondents believe their discovery tools are very effective, meaning that 86% of organizations do not have complete confidence in their discovery capabilities. In comparison, data classification methods receive similarly low favorable ratings. Only 13% of respondents consider their classification tools to be very effective. This means that 87% of enterprises do not believe their classification methods are at the highest level of effectiveness, highlighting a need for better data classification solutions.

The fact that only a fraction of enterprises consider their existing discovery and classification solutions to be very effective underscores a critical gap in data security. Even when organizations can discover their critical data, they may struggle to classify it effectively, which undermines broader data protection efforts. This widespread lack of confidence suggests that many companies need to rethink and upgrade their data security repertoire.

Additional responses include: We do not discover data today 2% | We do not classify data today 3%

 

To address these gaps, organizations should focus on integrating their data discovery and classification services, ensuring that once data is discovered it can be accurately and efficiently classified. The key should be to prioritize DSPM solutions that can provide data discovery at scale and within an appropriate time frame (days, not months or years), and combine this with classification that has high precision and automation for continued posture assessment. Investing in tools that enhance automation and reduce manual efforts will help shift more organizations from neutral or ineffective ratings to positive ones. By improving both discovery and classification, companies can better safeguard their most sensitive data and strengthen their overall security posture.

Location of Sensitive Data

In today’s corporate environments, sensitive data is not confined to a single location but is scattered across a complex mix of on-premises systems, SaaS platforms, and cloud infrastructures. This dispersion makes it increasingly difficult for organizations to maintain comprehensive visibility and control over their most critical data, intensifying the need for robust security solutions that can operate seamlessly across all environments.

Nearly 40% of respondents report that the majority of their sensitive data remains on-premises, highlighting the continued need for strong on-premises data security. Next, 30% of organizations indicate that their sensitive data primarily resides within SaaS environments, reflecting the growing dependence on cloud-based applications. However, this shift also introduces new challenges in managing and securing data across multiple SaaS providers.

Alarmingly, 15% of organizations admit they have no way of knowing where their sensitive data is located, significantly increasing their exposure to potential breaches. An additional 13% report that the majority of their data resides in IaaS environments, further complicating the data management landscape.

This broad distribution of data across corporate systems underscores the necessity for comprehensive data security solutions that offer unified visibility and control across the entire data landscape. Without such tools, organizations risk leaving critical data undiscovered and unprotected, especially when they lack a clear understanding of where that data exists. Implementing integrated and scalable solutions will be key to overcoming these challenges and ensuring data security across all environments.

Critical Activities to Monitor for Data Security

Maintaining a strong data security posture requires vigilant monitoring of key activities that could indicate potential threats or vulnerabilities. The survey results reveal a clear prioritization of what professionals consider the most critical activities to keep under surveillance, with unauthorized and over privileged access events emerging as top concerns.

Unauthorized access attempts are viewed as the most critical activity to monitor, with 58% of respondents highlighting this as a priority. This focus on unauthorized access aligns with the broader concern about protecting sensitive data from breaches, whether due to external attacks or insider threats. Closely related, 55% of participants emphasize the importance of monitoring overprivileged access for humans, reflecting the risks associated with granting excessive permissions that can lead to unintended data exposure or misuse.

Interestingly, overprivileged access for non-human identities—such as automated processes, bots, or IoT devices—is also seen as crucial, with 42% of respondents prioritizing it. This concern surpasses the need to monitor traditional activities like data removal (40%), usage patterns related to sensitive data (36%), and even industry compliance violations (31%). The emphasis on non-human identities underscores the evolving threat landscape where automation and connected devices are introducing new security risks if not properly managed.

Data exfiltration attempts are another high-priority activity, cited by 56% of respondents. Additionally, changes in data access permissions (49%) and data transfer and sharing activities (41%) are recognized as critical areas to monitor, as they can signal potential security breaches or policy violations.

Additional responses include: Industry compliance violations 31% | Configuration changes in data stores 22%

 

Given these findings, organizations should prioritize DSPM solutions that offer data detection and monitoring capabilities that provide comprehensive visibility into both human and non-human access events. Implementing tools that can detect unauthorized access, flag overprivileged accounts, and track changes in data permissions will be essential in maintaining a strong security posture. Moreover, the focus on non-human identities indicates a growing need for security measures that can address the unique risks posed by automation and connected devices in today’s data environments.
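One common way teams operationalize the "overprivileged access" signal discussed above is to compare the permissions an identity has been granted against those it has actually used over a trailing window. The sketch below is purely illustrative—the identities, permission names, and 50% threshold are invented for the example, and real systems would derive these sets from IAM policies and access logs:

```python
# Illustrative overprivilege heuristic: permissions granted but unused
# over a trailing window suggest an account that should be right-sized.
# All names and the threshold below are hypothetical examples.
granted = {
    "etl-bot": {"read:customers", "write:customers", "delete:customers", "read:finance"},
    "alice":   {"read:customers"},
}
used_last_90d = {
    "etl-bot": {"read:customers"},
    "alice":   {"read:customers"},
}

def unused_permissions(identity: str) -> set[str]:
    """Permissions granted to an identity that it never exercised."""
    return granted.get(identity, set()) - used_last_90d.get(identity, set())

def is_overprivileged(identity: str, threshold: float = 0.5) -> bool:
    """Flag identities whose share of unused permissions crosses a threshold."""
    grants = granted.get(identity, set())
    if not grants:
        return False
    return len(unused_permissions(identity)) / len(grants) >= threshold
```

Here the non-human "etl-bot" identity would be flagged (three of its four grants are unused) while "alice" would not—matching the survey's finding that non-human identities deserve the same overprivilege scrutiny as human ones.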

Challenges in Managing Data Security Posture

Managing data security posture across complex environments is increasingly challenging for organizations, especially as they navigate multi-cloud and hybrid architectures.

A majority of respondents (51%) report that managing data security posture across multi-cloud and hybrid environments is a top challenge. This complexity often stems from the need to coordinate security efforts across various platforms, each with its own unique risks and requirements. Closely following this, 48% of participants cite a lack of visibility into data within their SaaS environments, highlighting how difficult it can be to maintain control over data that resides outside of traditional on-premises systems.

Integration issues with existing security infrastructure are another critical concern, affecting 43% of respondents. These integration challenges can create friction between new and legacy systems, further complicating the already intricate task of managing data security. This lack of cohesion adds unnecessary complexity, which can stall or even derail data security projects if not addressed effectively.

Understanding which human and non-human identities have access to sensitive data is also a significant challenge, with 31% of respondents identifying it as an area of concern. As organizations adopt more automated processes and connect to IoT devices, keeping track of who—or what—has access to sensitive data becomes increasingly difficult.

Additional responses include: Limited automation for data incident remediation processes 25% | Lack of visibility into data that exists within my on-premises environment 23% | Lack of monitoring into data events that matter 22%

 

This reality—where understanding data security posture across hybrid cloud and SaaS environments is fraught with challenges—can lead to stalled or failed data security initiatives if not carefully managed. Organizations must prioritize solutions that provide comprehensive visibility and seamless integration across all environments. By doing so, they can reduce complexity, enhance control, and ensure the success of their data security efforts in a rapidly evolving landscape.

Effectiveness of Data Security Posture Management

As organizations increasingly turn to Data Security Posture Management (DSPM) tools to protect their sensitive data, the survey results reveal promising insights into the effectiveness of these solutions.

Among those who have adopted DSPM, the majority—63%—report that these tools have been either effective or very effective in identifying and mitigating security risks associated with data. This positive feedback highlights the value that DSPM brings to an organization’s overall security strategy.

However, 28% of respondents remain neutral, likely reflecting experiences with early DSPM solutions, or “DSPM 1.0” tools, that may lack comprehensive support across multiple environments or struggle with scalability and precision. These limitations can prevent organizations from fully realizing the benefits of DSPM, leading to less confidence in the solution’s effectiveness.

To maximize the effectiveness of DSPM, organizations should focus on solutions that not only scale across diverse environments—such as SaaS, IaaS/PaaS, and on-premises—but also provide precise, actionable insights into data security risks. By advancing beyond early iterations of DSPM and adopting modern, more robust and scalable tools, companies can manage and mitigate data security threats more effectively.

Future Investment in Data Security Posture Management

As the importance of securing sensitive data continues to rise, the survey results indicate a strong trend toward the adoption of Data Security Posture Management (DSPM) solutions.

Currently, 19% of enterprises have already implemented DSPM, and by mid-2025, 75% of organizations are expected to have adopted this technology. This positions DSPM as the fastest-growing security category globally.

When looking at future investment plans, 56% of respondents are either likely or very likely to invest in a DSPM solution within the next 12 months. This enthusiasm underscores the recognition that DSPM is becoming a critical component of modern data security strategies.

Only a small fraction of respondents are unlikely (7%) or very unlikely (5%) to invest in DSPM, which suggests that the majority of organizations understand the value DSPM provides, even if they have not yet taken the steps to adopt it.

As DSPM continues to evolve and address the challenges of data security across various environments, more enterprises are likely to make it a cornerstone of their data security program.

Expected and Observed Benefits of DSPM

As organizations increasingly adopt Data Security Posture Management (DSPM) solutions, the anticipated benefits reflect the growing need for more effective and accurate data security practices. The survey results reveal that security professionals are most excited about DSPM’s ability to enhance data discovery and improve precision in data classification—two areas where previous solutions have often fallen short.

Nearly half of the respondents (48%) expect or have already observed an improved ability to discover sensitive data within their environments. This benefit is particularly valuable given that many earlier discovery solutions lacked comprehensive support across all environments, leading to significant blind spots. By addressing these gaps, DSPM tools enable organizations to gain a more complete understanding of their data landscape, and with higher levels of confidence and automation.

Additionally, 43% of respondents are enthusiastic about DSPM’s potential to boost accuracy in data classification, reducing the occurrence of false positives. In the past, high rates of false positives have been a major pain point for data security leaders, creating unnecessary noise and making it difficult to focus on genuine threats. DSPM’s enhanced precision in classification offers a solution to this frustration, allowing for more efficient and effective data protection.

Beyond discovery and classification, other significant DSPM benefits include better compliance with data protection regulations (36%) and a more comprehensive view of data exposures and vulnerabilities (35%). The ability to reduce the risk of privacy exposures (34%) and enable the confident use of data for AI purposes (33%) also highlight DSPM’s evolving role in addressing modern security challenges, such as ensuring that data fed to AI models is secure and compliant.

Additional responses include: None – I do not plan to adopt a DSPM solution 11% | Other 2%

 

Core DSPM Features: What Matters Most

When it comes to Data Security Posture Management (DSPM), organizations are clear about what they need most: near real-time data monitoring, data discovery, and data classification. These three features emerged as the top priorities in the survey, highlighting their critical role in strengthening data security and forming the foundation of any DSPM solution.

Real-time data monitoring and alerting of data events, prioritized by 43% of respondents, is seen as the most crucial feature. This focus reflects the need for immediate visibility into data activities, allowing organizations to detect and respond to threats as they happen. However, the true value of real-time monitoring is fully realized only when it’s paired with robust data discovery (38%) and data classification (35%) capabilities. The integration of these features is essential, as monitoring alone is insufficient without a clear understanding of what sensitive data exists and how it should be classified.

This gap—where real-time data monitoring often operates in isolation from discovery and classification—highlights why these three features should be the primary focus in any DSPM proof of value engagement. Without the ability to correlate real-time events with accurate discovery and classification, organizations risk missing critical insights that could prevent data breaches.

Other important features include integration with existing security tools (32%) and automated remediation capabilities (31%). These functionalities ensure that DSPM can seamlessly fit into the broader security infrastructure and take proactive steps to address vulnerabilities. Continuous risk assessment (30%) and comprehensive reporting (28%) are also valued, offering ongoing visibility into security posture and detailed insights for decision-making.

Additional responses include: Continuous risk assessment and vulnerability detection 30% | Comprehensive reporting and analytics 28% | Policy management and enforcement 14% | Other 4%

 

Ultimately, real-time monitoring, data discovery, and classification stand out as the core needs that organizations should prioritize when evaluating DSPM solutions. Ensuring that these features work in sync will empower security teams to maintain a more effective and resilient data security posture.

Evaluation Considerations When Choosing a DSPM Solution

Selecting the right Data Security Posture Management (DSPM) solution is crucial for organizations aiming to safeguard their sensitive data effectively. The decision-making process is complex, as it directly impacts the organization’s ability to discover, classify, protect, and manage data across diverse environments. Given the rapidly evolving threat landscape, choosing a DSPM solution that aligns with an organization’s unique security needs is of paramount importance.

Precision stands out as the top priority for security professionals, with 51% of respondents identifying the accuracy of data classification as their primary consideration. This focus on precision is critical, as accurate classification forms the foundation of any effective data security strategy. Without it, organizations cannot properly identify and protect their most sensitive information, leaving critical gaps in their security posture and making it difficult to focus existing security personnel on the data that matters most.

Following closely, 47% of respondents prioritize the ability to support all environments. With data scattered across on-premises systems, cloud platforms, and SaaS applications, comprehensive coverage is essential. Security leaders understand that a DSPM solution must seamlessly handle data across all environments to provide the visibility and control necessary to mitigate risks. Integration capabilities with existing tools are also highly valued, with 45% of respondents citing this as a key evaluation criterion. In an increasingly complex security ecosystem, the ability for DSPM to send signals and work in tandem with other security technologies is of paramount importance. This ensures that data security is not siloed but rather integrated into the broader security framework, enhancing overall effectiveness.

Other important factors include automated and continuous scanning capabilities (36%), which help maintain up-to-date data security in real time, and the speed and ease of deployment (30%), which can significantly impact the success and adoption of a DSPM solution. Cost and return on investment (23%) and compliance mapping (19%) are also important, though they take a backseat to the more pressing concerns of accuracy, coverage, and integration.

Additional responses include: Cost and return on investment 23% | Compliance mapping and support 19% | Other 3%

 

Ultimately, when data security leaders are considering a DSPM vendor, precision in classification, support across all environments, and strong integration capabilities should be at the top of their evaluation criteria to ensure that a DSPM solution can effectively manage sensitive data and align with the broader security strategy of the organization.

Effectiveness in Detecting and Responding to Data Security Exposures

The ability to detect and respond to security and privacy exposures of sensitive data is a critical aspect of maintaining a strong data security posture.

Given the increasing frequency and sophistication of cyber threats, organizations must be confident in their ability to protect their most valuable assets. However, only 13% of respondents believe their organization is very effective at detecting and responding to data security and privacy exposures, with an additional 26% considering themselves effective. This means that 61% of organizations do not feel they have a strong ability to manage these crucial tasks.

This lack of confidence is concerning, as it suggests that a significant number of organizations may be leaving sensitive data vulnerable to breaches and other security or privacy incidents.

For organizations to improve their effectiveness, it’s essential to invest in solutions that provide comprehensive data visibility, automated issue identification, and ongoing risk monitoring.

Data Security Budgets: A Priority for the Year Ahead

As organizations continue to face a dynamic and challenging threat landscape, the allocation of resources toward data security remains a critical priority. The survey results reflect this focus, with a significant portion of respondents expecting their data security budgets to either increase or remain stable over the next 12 months.

Specifically, 22% of respondents anticipate a significant increase in their data security budget, while an additional 31% expect a more moderate increase. This indicates that over half of organizations recognize the need for continued investment in data protection, reinforcing the importance of maintaining and enhancing their security posture.

Meanwhile, 33% of respondents expect their budget to remain the same, further highlighting that data security continues to be a priority, even in organizations where spending levels are not expected to rise. Notably, only 14% believe their data security budget will decrease, underscoring the widespread understanding that cutting back on security investments could leave organizations vulnerable to escalating threats.

Overall, these findings demonstrate that data security remains at the forefront of business priorities. As companies allocate their budgets for the coming year, it is clear that most will continue to invest in safeguarding their sensitive information, ensuring they are well-prepared to defend against evolving risks.

Essential DSPM Best Practices for Elevating Data Security

To get the most out of your Data Security Posture Management (DSPM) efforts, it’s crucial to adopt proven best practices that enhance data protection and streamline operations. By focusing on continuous discovery, automatic classification, and integration across diverse IT environments, these practices ensure a comprehensive data security posture management.

1.Ensure Continuous Data Discovery: With 83% of respondents identifying visibility gaps as a security weakness, continuous data discovery across all environments is crucial. This minimizes blind spots and helps identify and protect sensitive data more effectively.

2.Prioritize Classification: 87% of enterprises do not believe their classification methods are at the highest level of effectiveness. Automating this process improves speed and reduces manual errors. Selecting a solution with unsupervised AI-powered classification can address the need for learned classifications missed by RegEx, enhancing precision and reducing false positives.

3.Implement Real-Time Monitoring: Real-time monitoring is critical for quick threat detection. As 43% of respondents prioritize this feature, ensure your DSPM solution includes robust alerting to mitigate risks as they arise.

4.Integrate with Existing Security Tools: Integration across existing IT security platforms is key for cohesive security strategies. With 45% of organizations prioritizing this, ensure your DSPM solution seamlessly connects with your current tools to enhance overall security.

5.Focus on Scalability Across Environments: Managing data security across multi-cloud and hybrid environments is challenging, with 51% citing it as a concern. Choose a DSPM solution that scales effectively across all environments to maintain consistent protection.

6.Develop a Budget Line Item for DSPM: 53% of IT and security organizations will be increasing their data security budget. Given that DSPM is new, you may not be able to fund the solution by using an existing line item. Prioritize setting aside a DSPM budget when meeting with your business stakeholders (which should include the data team, security team, privacy team, and IT team) to ensure that you can implement DSPM within your security plans.

7.Manage Identity Data Access: Managing who has access to what data is a fundamental aspect of DSPM. Implementing strict data access controls, with a focus on least privilege and zero trust principles, ensures that only authorized users can access sensitive data, reducing the risk of insider threats and unauthorized access.
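
The least-privilege, default-deny idea in the last practice above can be sketched in a few lines. This is an illustrative toy, not a real DSPM policy engine; all identities, datasets, and sensitivity labels are made up.

```python
# Minimal sketch of a least-privilege, default-deny access check.
# An identity may read a dataset only if an explicit grant covers that
# dataset at or above the requested sensitivity; everything else is denied.

SENSITIVITY = {"public": 0, "internal": 1, "confidential": 2, "restricted": 3}

# Example grants: identity -> {dataset: highest sensitivity it may read}
GRANTS = {
    "svc-reporting": {"sales_db": "internal"},
    "alice":         {"sales_db": "confidential", "hr_db": "internal"},
}

def can_read(identity, dataset, sensitivity):
    """Deny unless a grant exists for this dataset at a sufficient level."""
    allowed = GRANTS.get(identity, {}).get(dataset)
    if allowed is None:
        return False  # no grant at all -> default deny (zero trust)
    return SENSITIVITY[sensitivity] <= SENSITIVITY[allowed]

print(can_read("alice", "sales_db", "confidential"))       # True
print(can_read("svc-reporting", "sales_db", "restricted")) # False
print(can_read("bob", "hr_db", "public"))                  # False: unknown identity
```

Note the direction of the default: an unknown identity or ungranted dataset fails closed, which is the zero-trust posture the practice calls for.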

Methodology & Demographics

The 2024 DSPM Adoption Report is based on an extensive survey of 637 cybersecurity professionals conducted in August 2024. The study explored how organizations are approaching DSPM, the challenges they face, the effectiveness of their current solutions, and their adoption plan over the next 12 months. The respondents encompass technical executives and IT security practitioners, providing a balanced representation of organizations of diverse sizes across a wide range of industries.

About Cyera

Data is the fastest-growing attack surface in the world. Founded in 2021, Cyera, which has raised $460M in total funding and is valued at $1.4bn, is a pioneer in the data security space. Cyera empowers security leaders at Paramount Pictures, Mercury Financial, and others to quickly discover their data attack surface, classify data with high precision, comply with data regulations and privacy standards, and monitor, detect, and quickly remediate data risk.

What makes Cyera unique is its agentless design that deploys in just five minutes across any environment – and its unsupervised, AI-powered classification engine that auto-learns unique classifications and delivers 95% classification precision. These insights are then combined with the data security company’s Identity capabilities. Cyera can discover human and non-human identities (e.g., AI copilots), assign trust levels to them, assess their level of access to sensitive data, and determine the context in which the identities can access that data. These platform capabilities are complemented by Cyera’s proactive data risk assessment, 24x7x365 data monitoring, and Data Incident Response services. These services make Cyera’s data security experts readily available to Cyera’s customers.

With Cyera, security leaders can focus on enabling their businesses to safely use data.

About Cybersecurity Insiders

Cybersecurity Insiders brings together 600,000+ IT security professionals and world-class technology vendors to facilitate smart problem-solving and collaboration in tackling today’s most critical cybersecurity challenges.

Our approach focuses on creating and curating unique content that educates and informs cybersecurity professionals about the latest cybersecurity trends, solutions, and best practices. From comprehensive research studies and unbiased product reviews to practical e-guides, engaging webinars, and educational articles – we are committed to providing resources that provide evidence-based answers to today’s complex cybersecurity challenges.

Contact us today to learn how Cybersecurity Insiders can help you stand out in a crowded market and boost demand, brand visibility, and thought leadership presence.

Email us at info@cybersecurity-insiders.com or visit cybersecurity-insiders.com

The post The 2024 DSPM Adoption Report appeared first on Cybersecurity Insiders.

As IT estates become more complex, AIOps and observability tools can equip IT professionals to strengthen the resilience and security of their operations. Guy Warren, CEO at ITRS discusses the challenges firms face with monitoring diverse IT estates and AIOps’ vital role in overcoming them.

The need to deliver resilient business services is increasing demand for reliable, high-performing IT operations. At the same time, organizations are diversifying their IT estates to include public, private, and hybrid cloud environments, alongside microservices and third-party integrations. This complexity is making it more difficult to maintain an accurate picture of IT infrastructure, drive operational efficiency, and catch emerging issues before they compromise business services.

This is where AIOps — Artificial Intelligence for IT Operations — comes into play.

AIOps capabilities can leverage machine learning to rapidly process vast amounts of IT monitoring data, making it possible to detect anomalies and trends that might otherwise go unnoticed.

The AIOps market is expected to grow at a compound annual growth rate of 17.4% between now and 2030, driven by businesses’ need for efficiency and agility. With 93% of organizations already investing or planning to invest in AIOps, what IT monitoring challenges is it helping to address and how does it benefit business resilience?

Tackle ITOps’ infamous alert fatigue

Disparate workflows, a mix of different tools, and siloed data generated from diverse IT estates cause a lot of noise — a.k.a. alert storms — and make it hard to distinguish between benign performance fluctuations and emerging critical issues. The vast quantity of low-priority alerts that monitoring solutions often deliver can distract you from the problems that could have business impact.

For example, unexplained spikes in outbound or inbound network traffic can be a symptom of a Distributed Denial of Service (DDoS) attack or data exfiltration attempts, while frequent reboots could be caused by malware tampering with IT infrastructure.

AIOps analyses and classifies alert streams, helping to filter out irrelevant data and low-priority notifications. As a result, you can better identify and prioritize problems, allowing you to concentrate on critical issues. What’s more, AIOps can group related alerts to deliver more contextualized and detailed insight. With a fuller understanding, you can connect the dots between operational issues and security risks, improving response time and driving more efficient remediation across functions.
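
The grouping step described above can be sketched simply: collapse alerts from the same host and category that arrive within a short window into a single incident. This is an illustrative toy, not how any particular AIOps product works; the alert fields and window size are assumptions.

```python
from collections import defaultdict

# Illustrative sketch: collapse an alert storm by grouping related alerts
# on the same (host, category) within a time window, so one contextualized
# incident surfaces instead of dozens of raw notifications.

def group_alerts(alerts, window_seconds=300):
    """alerts: dicts with 'ts' (epoch seconds), 'host', 'category'.
    Returns {(host, category): [incident, ...]} where each incident
    is a list of alerts close together in time."""
    groups = defaultdict(list)
    for a in sorted(alerts, key=lambda a: a["ts"]):
        bucket = groups[(a["host"], a["category"])]
        # Extend the current incident if this alert follows soon after it
        if bucket and a["ts"] - bucket[-1][-1]["ts"] <= window_seconds:
            bucket[-1].append(a)
        else:
            bucket.append([a])  # otherwise start a new incident
    return groups

alerts = [
    {"ts": 0,    "host": "web1", "category": "net_traffic_spike"},
    {"ts": 60,   "host": "web1", "category": "net_traffic_spike"},
    {"ts": 90,   "host": "db1",  "category": "reboot"},
    {"ts": 4000, "host": "web1", "category": "net_traffic_spike"},
]
groups = group_alerts(alerts)
# web1's traffic spikes collapse into 2 incidents (one per burst); db1's reboot stands alone
print({k: len(v) for k, v in groups.items()})
```

Four raw alerts become three incidents, and the burst structure — two separate traffic-spike episodes on web1 — is preserved rather than flattened into noise.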

Advance anomaly and threat detection

Large-scale IT environments consist of many components, each generating substantial volumes of log data. Catching anomalies within this data can be like trying to find a needle in a haystack.

AIOps uses machine learning to analyze expected behavior patterns and more accurately pinpoint problematic deviations from the norm. Surges in resource consumption, such as CPU and memory, could well be driven by legitimate or scheduled tasks and be addressed with resource optimization. But they might also be the sign of a malware infection or compromised servers running unauthorized processes.
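
The simplest form of this "expected behaviour vs. deviation" idea can be sketched as a z-score test against a learned baseline. Real AIOps platforms use far richer models; this is a minimal sketch with invented CPU numbers.

```python
import statistics

# Illustrative sketch (not a production detector): flag metric samples that
# deviate sharply from the baseline of the series.

def find_anomalies(samples, threshold=3.0):
    """Return indices whose z-score against the sample baseline exceeds threshold."""
    mean = statistics.fmean(samples)
    stdev = statistics.pstdev(samples)
    if stdev == 0:
        return []  # perfectly flat series: nothing deviates
    return [i for i, x in enumerate(samples) if abs(x - mean) / stdev > threshold]

# Steady CPU utilisation with one surge, e.g. an unauthorised process
cpu = [22, 25, 24, 23, 26, 24, 25, 23, 97, 24, 25, 23]
print(find_anomalies(cpu))  # [8] - only the surge stands out
```

Whether the flagged surge is a scheduled job or a compromised server is exactly the triage question the surrounding paragraphs describe; the detector's job is only to surface the deviation quickly.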

Thanks to AIOps, you can rely on rapid analysis and precise anomaly detection, giving you advanced warning of issues before they escalate into serious incidents. The ability to mitigate potential risks long before they cause outages and disruption is imperative to safeguarding your operational resilience, security, and the performance of your business services.

Proactive monitoring is great; pre-emptive action is even better

Precise anomaly detection lets you stay on top of emerging issues, but AIOps takes this even further so you can anticipate future concerns. By storing relevant monitoring data and combining it with robust analytics, you can run historical analysis and forecast trends.

With these predictive capabilities, you can foresee upticks in key metrics, such as with resource consumption. This informs pre-emptive resource adjustments, letting you mitigate the risk of potential outages as well as drive cost efficiencies.
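
The forecasting idea above can be sketched with a least-squares linear trend over historical usage. Production tools use far more sophisticated models; the disk-growth figures here are made up for illustration.

```python
# Illustrative sketch: fit y = a + b*t by ordinary least squares over daily
# usage history, then extrapolate to anticipate future resource consumption.

def linear_forecast(history, steps_ahead):
    """Fit a linear trend to history and project steps_ahead past its end."""
    n = len(history)
    ts = range(n)
    mean_t = sum(ts) / n
    mean_y = sum(history) / n
    b = sum((t - mean_t) * (y - mean_y) for t, y in zip(ts, history)) / \
        sum((t - mean_t) ** 2 for t in ts)
    a = mean_y - b * mean_t
    return a + b * (n - 1 + steps_ahead)

disk_gb = [100, 104, 108, 112, 116, 120]  # steady ~4 GB/day growth
print(round(linear_forecast(disk_gb, 30)))  # 240 - projected usage a month out
```

Even a crude projection like this turns "we ran out of disk" incidents into planned capacity adjustments made weeks in advance.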

Using predictive analytics to continuously analyze your IT environments and resource usage better enables you to right-size your IT estate. This reduces your attack surface and minimizes unnecessary exposure, letting you switch off underutilized resources or services that present a potential vulnerability or entry point for cyberattacks.

Managing extensive and complex IT estates requires the capacity to withstand and respond quickly to issues. When customers expect nothing short of excellence for the business services they depend on, resilience is non-negotiable.

The rise of AIOps provides organizations with valuable capabilities for maintaining infrastructure performance, averting the risk of outages, and recovering quickly from disruptions. By leveraging machine learning, AIOps enables you to progress from reactive problem-solving to pre-emptive action that strengthens resilience, ensuring your business continues to operate smoothly and securely.

The post How AIOps enhances operational resilience in the face of IT complexity appeared first on Cybersecurity Insiders.

From the water we drink, to the roads we drive on, to the information we consume, technology is woven into the fabric of society. Nearly every aspect of our lives depends on technology. However, the convergence of digital threats with physical risks has increasingly become evident. A single cyberattack or technological disruption can bankrupt a business or put human lives at risk.

This raises the question: Do organizations prioritize digital risks? The answer, by and large, is no. Research shows that only three percent of businesses have developed true resilience against cyber threats. Here are the primary reasons for this disparity:

1.Overreliance on Technology, Inadequate Emphasis on Resilience

Many organizations incorporate technology to propel business growth, disregarding the potential consequences of system failures. Consider the case of smart motorways in the UK, originally engineered to be safe and worry-free. We know today they are not. Was the government operating under the assumption (or misconception) that technology would be flawless and solve all problems? Recall the global CrowdStrike incident. Did financial institutions, hotels, airports, and hospitals write contingency plans dealing with a complete shutdown of their operations? 

2.Lack of Commitment from the Top

No doubt, businesses worry about cybersecurity and protecting information. However, they struggle with the equitable allocation of resources — whether to invest in product features, new markets, or improving the customer experience. When organizations look to trim costs, security is too often the target. That’s because security does not easily lend itself to convenient ROI metrics. 

3.Lack of Transparency in Third-party and Supply Chain Relationships

In the past year, more than half of organizations (54%) suffered a software supply chain attack, with the average attack going undetected for about 235 days. An organization’s ecosystem is no longer confined to four walls but extends through multiple layers and hierarchies. The challenge lies in really understanding the most effective strategies for managing risk across multiple levels. 

4.Neglecting the Human Factor 

Many organizations view cybersecurity as a technological issue that can only be addressed by technological means, overlooking the important role of people. This approach has inherent risks because people are often the primary cause of cybersecurity breaches. On the flip side, it is the versatility and creativity of people and their adaptability in detecting anomalies and identifying social engineering schemes that will ultimately help the organization resolve and recover from cybersecurity attacks.

How Can Organizations Foster Resilience and Improve Governance?

Organizations must evolve and advance, but they should also bear in mind that nothing is foolproof. Cyber criminals are well-trained and well-funded enterprises, with access to sophisticated state-of-the-art tools. Below are some best practices to foster resilience and cybersecurity governance: 

1.Retain Basic Skills: While it’s beneficial to train employees to rely on technology, it is also important to equip staff with basic skills for emergency situations where devices and laptops fail or are no longer accessible. 

2.Hold People and Organizations Accountable: Governments, legislators, boards of directors and other stakeholders need to shift from a passive “it happens” attitude towards holding entities accountable for their decisions. Did they assess the risk appropriately? Did they plan for a contingency by preparing an alternative course of action? Do they anticipate unexpected events such as a cyberattack?

3.Rely on People for Finding Solutions: In major data breaches, it was often human intervention, not technology, which helped companies recover. Organizations cannot afford to exclude users from the solution design, or sideline people in favor of technological fixes.

4.Improve Governance within the Supply Chain: Prioritize and triage vendors based on their exposure to digital risks. Implement a process for ongoing assurance and have a reporting and monitoring process in place to track changes in supplier risks. Embed supplier risk assessments in the entire supply chain lifecycle.

5.Customize Risk Mitigation Strategies: There’s no universal one-size-fits-all or a blueprint for risk mitigation. Assess your organization’s unique risk posture, its business direction, its security readiness and its willingness to invest in cybersecurity controls. Approach risk mitigation in a streamlined manner. Utilize standard cybersecurity frameworks such as ISF SOGP, NIST SP 800-53B or ISO/IEC 27002:2022 to guide risk mitigation efforts.  

Technological risks are not insignificant, and no quick fixes are readily available. From a business leadership standpoint, a thorough risk management and governance strategy must be adopted. Organizations need not go it alone; they can partner with experts in cybersecurity and compliance. Whether or not organizations seriously address identified risks is a business investment decision.

 

The post Technology Governance Needs A Rethink on Prioritizing Resilience Against Digital Threats appeared first on Cybersecurity Insiders.

The cybersecurity arms race, with the security ecosystem on one side and threat actors on the other, sees adversaries pitted against each other in a struggle for supremacy. Each side continually evolves its technologies and tactics, with the arrival of advanced AI massively raising the stakes.

In this context, cybercrime is a huge growth industry. Reporting by The Register earlier this year shed some fascinating light on the sheer scale of the problem, quoting the CEO of JPMorgan Chase, who said: “There are people trying to hack into JPMorgan Chase 45 billion times a day.”

With the experience of just that one organisation in mind, it’s alarming but perhaps not altogether surprising that the overall costs of cybercrime continue to skyrocket and are expected to surpass $10 trillion globally. To give that number some kind of context, it’s a figure not far short of the GDP numbers for Japan, Germany and India combined.

Despite the growing level of risk, today’s cybersecurity technologies and professionals keep the vast majority of networks safe, with many organisations implementing a multi-layered approach to guard against different types of threats. At the same time, the wider concept of resilience – the ability to withstand, recover from and continue operating in the face of disruptions such as cyberattacks – has become a board-level priority.

Real-time risks

Take the risks posed by ransomware, for example, where mitigation depends heavily on early detection and swift response. Today’s real-time detection solutions now offer continuous network monitoring by scanning for anomalies or suspicious activities indicative of ransomware. These systems can detect active file encryption or unusual data access patterns and trigger immediate alerts, enabling rapid response. This kind of early warning system not only limits the volume of data encrypted by attackers but also accelerates incident response times, significantly reducing the potential impact of an attack.

In addition to detection, advanced analytics technologies can assess the origin, techniques and behaviour built into ransomware, helping cybersecurity teams quickly isolate affected systems, remove the threat and restore operations. This combination of real-time monitoring and detailed forensic analysis greatly enhances an organisation’s ability to recover with minimal disruption.

This approach can be implemented as part of a Continuous Data Protection (CDP) strategy, whereby every change made to data is saved in real-time. Unlike traditional backup methods that run at specific intervals (such as daily or weekly), CDP captures and records all changes to data as they happen, allowing for near-instantaneous recovery to any point in time before a failure or data corruption occurs.
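
The principle behind CDP can be sketched with a timestamped journal of every change: rebuilding state "as of" any moment is just replaying the journal up to that point. Real CDP operates at the block or I/O level; this key/value toy only illustrates the idea, and the filenames and timestamps are invented.

```python
import time

# Conceptual sketch of Continuous Data Protection: every write is journaled
# with a timestamp, so state can be reconstructed as of any point in time
# before a failure or ransomware encryption event.

class CDPStore:
    def __init__(self):
        self.journal = []  # (timestamp, key, value) for every change

    def write(self, key, value, ts=None):
        self.journal.append((ts if ts is not None else time.time(), key, value))

    def state_at(self, ts):
        """Replay the journal up to ts to reconstruct the data as of that moment."""
        state = {}
        for t, key, value in self.journal:
            if t <= ts:
                state[key] = value
        return state

store = CDPStore()
store.write("invoice.doc", "v1", ts=100)
store.write("invoice.doc", "v2", ts=200)
store.write("invoice.doc", "ENCRYPTED", ts=300)  # ransomware strikes
print(store.state_at(250))  # {'invoice.doc': 'v2'} - clean pre-attack state
```

Because every intermediate version is retained, the recovery point can be chosen after the fact — moments before the encryption event — rather than being fixed by a nightly backup schedule.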

These capabilities are rapidly becoming must-haves, particularly given evolving ransomware capabilities that target backup systems alongside primary data. Organisations finding themselves in this situation face significant challenges in restoring operations after an attack, because ransomware that compromises backup systems renders traditional backups significantly less reliable. In many cases, fully restoring data from traditional backups can take weeks, especially for large enterprises managing vast amounts of critical information.

While traditional backups might suffice for non-critical or less sensitive data, modern enterprises require advanced recovery solutions to ensure the rapid restoration of mission-critical applications and minimal downtime. Disaster recovery platforms that leverage CDP provide near-instant recovery for complex applications with minimal data loss.

The last line of defence

Another key component of an effective layered strategy is the implementation of a security vault. This is a highly secure, isolated environment designed to protect critical data and ensure rapid recovery after a cyberattack, particularly from ransomware. Often acting as a last line of defence, these vaults integrate air-gapped isolation and a zero-trust architecture to provide extremely robust protection.

Even if primary cybersecurity defences fail, the role of the vault is to ensure that the organisation’s most critical data remains untouched and recoverable. Typically, vaults will also employ immutable storage, whereby data cannot be altered once saved, preventing ransomware, for example, from corrupting backup copies. For highly regulated industries, in particular, these capabilities are essential for compliance and recovery orchestration, as they allow organisations to test and validate recovery processes within a secure sandbox before bringing data back into the production environment.
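
The immutability property described above — once written, never altered — can be sketched as an append-only store that rejects overwrites. This is an illustrative toy, not how any particular vault product is implemented; the object names are made up.

```python
# Hypothetical sketch of write-once (immutable) storage: once an object is
# saved, any attempt to alter it is rejected, so ransomware cannot corrupt
# existing backup copies even with write access to the store.

class ImmutableVault:
    def __init__(self):
        self._objects = {}

    def put(self, name, data):
        if name in self._objects:
            raise PermissionError(f"{name} is immutable and cannot be overwritten")
        self._objects[name] = data

    def get(self, name):
        return self._objects[name]

vault = ImmutableVault()
vault.put("backup-2024-08-01", b"clean snapshot")
try:
    vault.put("backup-2024-08-01", b"ENCRYPTED")  # simulated ransomware write
except PermissionError as e:
    print("blocked:", e)
print(vault.get("backup-2024-08-01"))  # still the clean snapshot
```

The design choice is that protection lives in the storage contract itself, not in the defences around it: even a fully compromised client with valid credentials cannot destroy what is already in the vault.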

Modern security vault technologies also address the shortcomings associated with traditional architectures, particularly speed of recovery or recovery time objective (RTO). Restoring data from tape or from lower-tier storage can extend recovery by days or weeks, while scanning for clean copies prolongs the process even further, as does recovery onto anything other than production-grade storage.

Moreover, if law enforcement or security teams are performing forensic analysis on production infrastructure, organisations may find they need to run workloads elsewhere for some time after recovery – a capability many backup and cold cloud storage solutions simply aren’t designed to support.

A key takeaway here is that while security threats in general and ransomware in particular continue to test the boundaries of protection and recovery strategies, using CDP as a foundation for resilience can act as a game-changer for organisations that find themselves under constant attack. Those who prioritise these capabilities will be ideally placed to remain secure as the risks continue to evolve.

 

The post Understanding the critical role of resilience in defending against ransomware appeared first on Cybersecurity Insiders.