
Beware of Deepfakes in the AI Age, Warns Kaspersky

Artificial-Intelligence

The widespread adoption of artificial intelligence (AI) and machine learning technologies in recent years is providing threat actors with sophisticated new tools to perpetrate their attacks.

One of these is deepfakes, which include generated human-like speech as well as photo and video replicas of real people. While the time and effort needed to create these attacks often outweigh their potential ‘rewards’, Kaspersky warns that companies and consumers across Africa must still be aware that deepfakes will likely become more of a concern in the future.

“The potential for malicious use when it comes to deepfakes is clear,” says Bethwel Opil, Enterprise Client Lead at Kaspersky in Africa. “From blackmailing to perpetrating financial fraud and spreading misinformation via social media, the potential knock-on effects may be significant.

“Invariably, cybercriminals are still looking for cheaper and quicker methods to propagate their campaigns.

“However, in the coming years we anticipate an increase in targeted attacks using deepfakes, especially against influential people and high-net-worth individuals or organisations, which will justify the time and effort it takes attackers to create deepfakes.”

Kaspersky research has found deepfake creation tools and services available on darknet marketplaces.

These services offer generative AI video creation for a variety of purposes, including fraud, blackmail, and stealing confidential data.

According to estimates by Kaspersky experts, one minute of deepfake video can be purchased for as little as $300.

There are also concerns across the continent about the significant divide in digital literacy among Internet users.

According to the recent Kaspersky Business Digitisation Survey, 51% of employees surveyed in the Middle East, Türkiye and Africa (META) region said they could tell a deepfake from a real image; however, in a test only 25% could actually distinguish a real image from an AI-generated one.

This puts organisations at risk given how employees are often the primary targets of phishing and other social engineering attacks.

For example, cybercriminals can create a fake video of a CEO requesting a wire transfer or authorising a payment, which can be used to steal corporate funds. Compromising videos or images of individuals can be created, which can be used to extort money or information from them.

“Although the technology for creating high-quality deepfakes is not yet widely available, one of the most likely use cases is generating voices in real time to impersonate someone.

“For example, a finance worker at a multinational firm was recently tricked into transferring $25 million to fraudsters after deepfake technology was used to pose as the company’s chief financial officer in a video conference call.

“Africa is not immune to this threat and it’s important to remember that deepfakes are a threat not only to businesses, but also to individual users – they spread misinformation, are used for scams, or to impersonate someone without consent – and are a growing cyberthreat to be protected from,” says Bethwel Opil.

