<h1>Real-world AI voice cloning attack: A red teaming case study</h1>

<p>As an ethical hacker, I put organizations’ cyberdefenses to the test, and — like malicious threat actors — I know that social engineering remains one of the most effective methods for gaining unauthorized access to private IT environments.</p>
<p>The Scattered Spider hacking group has repeatedly proven this point with social engineering attacks targeting IT help desks at major enterprises, including casino giants <a target="_blank" href="https://www.cybersecuritydive.com/news/caesars-social-engineering-breach/695995/" rel="noopener">Caesars Entertainment</a> and MGM Resorts, as well as British retailer Marks &amp; Spencer. In such attacks, a threat actor impersonates a legitimate employee and convinces the help desk to reset that user&rsquo;s password, often adopting an authoritative tone or a sense of urgency to manipulate the agent into granting account access. Such classic <a href="https://www.techtarget.com/searchsecurity/tip/How-to-avoid-and-prevent-social-engineering-attacks">social engineering</a> tactics often bypass technical defenses entirely by exploiting human behavioral weaknesses.</p>
<p>I&rsquo;ve used phone-based social engineering in my own red teaming strategy for years, and recent improvements in deepfake and voice cloning technology have made such <a href="https://www.techtarget.com/searchsecurity/tip/Generative-AI-is-making-phishing-attacks-more-dangerous">voice phishing (vishing) attacks even more effective</a>. In this article, I walk through a recent, real-world example that demonstrates how easily threat actors can use AI-enabled deepfakes and voice cloning to deceive end users. CISOs must test their organizations&rsquo; ability to withstand such attacks, as well as educate employees on what these techniques look like and how to stop them.</p>
<section class="section main-article-chapter" data-menu-title="How an AI voice cloning attack tricked a seasoned employee">
<h2 class="section-title"><i class="icon" data-icon="1"></i>How an AI voice cloning attack tricked a seasoned employee</h2>
<p>As part of a red teaming exercise, a large business recently asked me to try to hack into the email account of one of its senior leaders. Typically, you need the following three elements to gain access to an email account:</p>
<ol class="default-list">
<li>The email address.</li>
</ol>

<p>[…]</p>
</section>
