What does 'deep fake' mean?

29 May 2019

A deep fake is a sophisticated digital forgery of an image, sound or video enabled by artificial intelligence (AI).

Such forgeries are so convincing that the human eye is unlikely to detect that the content has been manipulated. The goal of a deep fake, generally, is to mislead and deceive, making it appear as though a person has said or done something when, in fact, that is not the case.

Altering videos and creating fake content is nothing new. Many people and entities, from governments to common criminals, have used misinformation campaigns for political, social or personal gain for quite some time. 

However, what once took skilled engineers significant time to create can now be done rather cheaply, quickly and much more convincingly. 

Supported by advances in artificial intelligence, deep fakes have proliferated across the internet, as the technology has become less expensive and more accessible. There are now apps and websites dedicated to creating fake material, bringing layers of artificial neural networks or “deep learning” to amateurs.
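
To make the phrase "deep learning" slightly more concrete, the sketch below shows the generator-and-discriminator pattern (a generative adversarial network, or GAN) that underpins many image deep fakes. It is a deliberately tiny, illustrative example: the class names, layer sizes and image dimensions are arbitrary assumptions, and real deep fake tools rely on far larger models trained on large face and voice datasets.

```python
# Illustrative sketch only: a miniature generative adversarial network (GAN),
# the kind of "deep learning" building block behind many image deep fakes.
# All layer sizes and dimensions are arbitrary assumptions kept small for clarity.
import torch
import torch.nn as nn


class Generator(nn.Module):
    """Maps a random noise vector to a small fake image."""

    def __init__(self, noise_dim=64, img_pixels=28 * 28):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(noise_dim, 256),
            nn.ReLU(),
            nn.Linear(256, img_pixels),
            nn.Tanh(),  # pixel values scaled to [-1, 1]
        )

    def forward(self, z):
        return self.net(z)


class Discriminator(nn.Module):
    """Scores how 'real' an image looks; the generator learns to fool it."""

    def __init__(self, img_pixels=28 * 28):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(img_pixels, 256),
            nn.LeakyReLU(0.2),
            nn.Linear(256, 1),
            nn.Sigmoid(),  # probability that the input is a genuine image
        )

    def forward(self, img):
        return self.net(img)


if __name__ == "__main__":
    g, d = Generator(), Discriminator()
    noise = torch.randn(8, 64)        # a batch of random noise vectors
    fake_images = g(noise)            # generator turns noise into images
    realism_scores = d(fake_images)   # discriminator judges how real they look
    print(realism_scores.shape)       # torch.Size([8, 1])
```

The key idea is the feedback loop: during training the generator keeps improving until the discriminator, and eventually the human eye, can no longer reliably tell its output from genuine material.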

Why Does It Matter?

With the barrier to entry for deep fake technology now so low, there is a real possibility that information will be manipulated for nefarious purposes and the truth obscured. A ubiquity of false information has the potential to profoundly affect democratic institutions by eroding public trust, destabilising free markets and compromising national security.


Businesses have always strived to protect the “CIA” triad of information security – confidentiality, integrity and availability. But while corporate cyber defenders are battle-tested against threats to data confidentiality and availability, they are only now seeing the true scale of data integrity risks. As a result, businesses may not be fully prepared to respond to this sleeping giant, which can have a lasting impact on organisations and executives.
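
As one simple illustration of an integrity control, the hypothetical sketch below fingerprints published content with a cryptographic hash so that later tampering can be detected. The report text and workflow are invented for the example; in practice this would sit alongside digital signing, monitoring and incident response plans.

```python
# Minimal, hypothetical sketch of one data integrity control: fingerprinting
# published content with a cryptographic hash so later tampering can be detected.
import hashlib


def fingerprint(content: bytes) -> str:
    """Return a SHA-256 hex digest that uniquely identifies the content."""
    return hashlib.sha256(content).hexdigest()


if __name__ == "__main__":
    # Hypothetical report text recorded at publication time.
    published = b"Annual results: revenue up 3%, no material incidents."
    recorded_digest = fingerprint(published)

    # Later, the copy being served online is re-checked against the record.
    served = b"Annual results: revenue down 30%, regulator investigating."
    if fingerprint(served) != recorded_digest:
        print("Published content no longer matches the recorded fingerprint.")
```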

Spurred by financial gain, geopolitical influence, or social causes, online actors have already targeted businesses with false information campaigns. The worst, however, may be yet to come. Instead of using social media to make false claims about a company, an already common tactic, what if a malicious actor uses deep fake technology to secretly change content on a company website or release a manipulated public report just before a filing deadline? The mere threat of such an action could throw a company into chaos and send stock prices into a free fall.

Would your company be ready to respond to a fake video of an executive committing an illegal act or altered audio of someone saying something offensive or inaccurate? The possibilities are endless.

As the onslaught of AI-enabled forgeries becomes a reality, casting a shadow over the old adage “seeing is believing,” businesses must continue to build resilience and take a comprehensive approach to addressing new data integrity threats.


  • Sarah Stephens

    As part of Marsh JLT Specialty's London-based Financial Lines Group, Sarah and her team work both directly with our clients and with network colleagues and independent partners to make sense of cyber, technology, and media E&O (PI) risks and create leading-edge, bespoke insurance solutions in the London market.

    Before that, Sarah spent 12 years with Aon in a variety of roles. Her last role at Aon was Head of Cyber & Commercial E&O for the Europe, Middle East, and Africa (EMEA) Region, working with colleagues across business groups and clients in the region to identify, analyse, and drive awareness of cyber risks, exposures, and both insurance and non-insurance solutions.

    Previously, Sarah spent seven years with Aon’s US Cyber and Errors & Omissions practice group thinking nonstop about cyber insurance way before it was cool. Her first four years at Aon were spent in the Account Management group working with large clients and developing a keen eye for excellent client service.

    For further information or to learn more about cyber insurance, contact Sarah Stephens, Head of Cyber, on +44 (0)20 3394 0486.

  • For more articles like this, download our Cyber Decoder
