ISACA Survey Reveals AI-Driven Social Engineering as the Leading Cyber Threat of 2026

AI-Driven Social Engineering: The Cyber Threat of 2026

According to a recent ISACA report, AI-driven social engineering is poised to become one of the most pressing cyber threats by 2026. A survey of nearly 3,000 IT and cybersecurity professionals found that 63% of respondents consider this emerging threat a significant challenge. This marks a notable shift: social engineering tactics enhanced by artificial intelligence have eclipsed long-standing threats such as ransomware and supply chain attacks, which were cited by 54% and 35% of respondents, respectively.

Understanding AI-Driven Social Engineering

AI-driven social engineering refers to the manipulation of individuals through calculated deception, using AI tools to craft convincing scenarios that can trick people into divulging sensitive information or performing actions detrimental to their organizations. As AI technologies become increasingly sophisticated, the potential for malicious actors to exploit them in conducting social engineering campaigns is rising sharply.

The ISACA report highlights an alarming reality: most IT professionals recognize the dual nature of AI. It can be harnessed for innovation and efficiency, but it also creates new risks they are inadequately prepared to combat. Only 13% of organizations feel "very prepared" to handle generative AI risks, while about half consider themselves "somewhat prepared" and a quarter admit to being "not very prepared."

The report emphasizes that ongoing development of governance frameworks, policies, and training programs is essential to address existing gaps and vulnerabilities in this rapidly evolving landscape.

The Role of Regulation: A Compliance Nightmare

Navigating the regulatory environment surrounding AI technologies is critical for organizations. Karen Heslop, ISACA’s VP of Content Development, pointed out that regulations, particularly those regarding AI safety and security, could be instrumental in bridging the current preparedness gap. The European Union is viewed as a frontrunner in establishing clear technology compliance standards, particularly with initiatives like the EU AI Act that aim to provide clarity for companies operating within its jurisdiction.

In stark contrast, the regulatory landscape in the United States is described as a "compliance nightmare." With individual states developing their own laws on AI safety and security, and no cohesive federal framework, organizations could find themselves juggling multiple compliance obligations across different jurisdictions. Heslop noted the challenge this poses for smaller companies operating nationwide: adhering to numerous sets of laws can be prohibitively complex.

The Cyber Talent Shortage

Yet another pressing concern raised by the ISACA report is the growing talent shortage in cybersecurity. Only 18% of survey respondents believe they have a strong talent pipeline, which underscores an urgent need to cultivate a more robust cadre of professionals ready to tackle emerging threats.

Chris Dimitriadis, ISACA’s chief global strategy officer, highlighted the necessity of creating a “stronger army” of cybersecurity talent to protect digital ecosystems. However, while 39% of respondents expressed plans to hire for digital trust roles, 44% foresee difficulties in sourcing qualified candidates.

Preparing for 2026: Recommendations from ISACA

Given these findings, the ISACA report outlines five actionable takeaways that organizations can adopt to prepare for the challenges of 2026:

  1. Establish Robust AI Governance and Risk Frameworks: Organizations must create comprehensive governance structures that account for the unique risks posed by AI technologies.

  2. Accelerate Workforce Upskilling: Investing in continuous learning, training, and development of talent pipelines is essential for maintaining a resilient workforce capable of addressing cybersecurity challenges.

  3. Modernize Legacy Systems: Updating outdated systems can significantly reduce vulnerabilities and enhance organizational agility in response to evolving threats.

  4. Strengthen Cyber Resilience: Developing and regularly testing incident response plans, ransomware recovery strategies, and cross-functional crisis management protocols is crucial for effective risk management.

  5. Prepare for Regulatory Complexity: Organizations should proactively engage with expert communities, monitor regulatory changes, and invest in compliance tools to navigate the increasingly complex landscape of international regulations.

The insights presented in the 2026 ISACA Tech Trends and Priorities report are drawn from a survey conducted between August 22 and September 4, 2025, of 2,966 ISACA members and non-member certification holders working in digital trust fields, including cybersecurity, IT audit, governance, risk, and compliance.

As AI continues to evolve and penetrate deeper into our daily operations, the need for proactive measures to mitigate risks while embracing its opportunities will become ever more critical. The time to prepare is now.
