For the Future of Money Award (2019)
The 2019 edition of the Future of Money Award (FoMA) competition challenged participants to design the financial crime of the future (a speculative one, of course). Some elements from the initial briefing:
“As our money becomes completely electronic, new crimes will undoubtedly be committed, which take advantage of the technology itself or the social conventions on which it rests. Perhaps if we could imagine what crimes might happen, we could start to think about how to prevent them. What financial crimes could be committed within a completely electronic marketplace?”
Entrants were invited “to employ a speculative design approach and design a future financial crime. A crime which utilises a new loophole, a change in social convention or specific technology development”.
Our design fiction, Augmented CEO Fraud, won first prize and was exhibited during the FoMA partner conference Money20/20 in Amsterdam.
Augmented CEO Fraud is an elaborate scam that relies on artificial intelligence to identify extortion opportunities and to impersonate an executive in order to fool an employee into executing unauthorised wire or data transfers.
According to the FBI, “Business Email Compromise”, also known as CEO Fraud, had become a $12 billion scam by 2018. CEO Fraud relies on psychological manipulation to trick employees into divulging confidential data or providing access to funds.
This project speculates on an augmented version of existing CEO Fraud, set in 2022, building on the growing use of artificial intelligence, especially systems capable of producing deep fakes to fool vulnerable employees. A deep fake is an AI-based technique for human image and/or voice synthesis that renders a person's likeness or voice to make it look like this individual did or said something they never did.
In this scenario, cybercriminals use deep fake technologies to renew an existing social-engineering-based extortion tactic. They use these technologies to build a profile and a plausible rendering that perfect the impersonation of an influential executive in a position to make crucial requests of subordinates. Indeed, in the near future we might expect employees to be aware of traditional impersonation-based fraud emails and phishing strategies, reducing the efficiency of such approaches and requiring new technology-based strategies to exploit an organisation's weaknesses.
This white-collar crime uses artificial intelligence systems to automate the fraud in two ways:
- Aggregating information - especially from social networks - to detect vulnerabilities and impersonation opportunities, identifying which organisation, and which employee as its weakest link, to target first.
- Rendering deep fakes to impersonate a CEO or an attorney demanding wire or data transfers. This tactic might also rely on impersonating an empowered third party to target wealthy victims, as in the case of a fake minister asking for a favour: in 2018, impersonators of Jean-Yves Le Drian, the French Foreign Affairs Minister, extorted millions of euros from their victims.
The Augmented CEO Fraud has a series of impacts on a public or private organisation once the impersonator moves into action:
- Large amounts of money being stolen through this tactic (hundreds to millions of dollars per crime).
- Data breaches, as the criminals' goal might not only be money but also highly sensitive, almost invaluable, information about the organisation or its assets.
- The destabilisation of the organisation, through a tarnished reputation and a loss of trust among clients, suppliers, employees and partners.