A deepfake is a video, photo or audio file that portrays a person saying things they never said, or doing things they never did. Deepfakes are created by artificial intelligence that analyses and maps people’s faces, bodies and voices from photographs, videos and audio.
One expert described deepfakes as “Photoshop on steroids”.
In recent years, deepfake videos have become more numerous, higher in quality, and easier for anyone to make.
Experts predict that soon the technology will be so good that people will be unable to spot a deepfake with the naked eye.
The technology has been used in scientific and medical research, and for educational purposes – e.g. letting museum visitors have ‘conversations’ with historical figures. Deepfakes have also been used in movies such as Rogue One: A Star Wars Story and the Fast and Furious films, to shoot scenes featuring deceased actors.
And deepfakes are often used as a joke – e.g. to swap the faces of different celebrities, or to insert an ordinary person into a movie clip or music video.
Unfortunately, deepfakes can also be used to cause harm. For example, they may be used in cyberbullying to portray people in scenarios that are embarrassing, offensive or hurtful.
Deepfakes are also used to degrade people – especially women – by inserting their images into pornography. At present, these videos mostly target celebrities, but as the technology becomes more widespread, we may well see a rise in deepfakes being used in image-based abuse.
Meanwhile, many people see deepfakes as a threat to democracy, because they can show politicians and public figures saying or doing things they never said or did. Deepfakes can be used to spread inaccurate and dangerous messages, and to erode people’s trust in institutions such as parliament, the courts and the news media. At the same time, it’s becoming easier for people to dismiss real, factual footage by claiming ‘it’s fake’.
Check out these great videos for school students explaining deepfakes and their implications, and read the position statement from the Office of the eSafety Commissioner.
Find out what your teens know about deepfakes – it’s OK if they know more than you! You don’t have to be a tech expert. It’s more important to talk about our values and how we treat other people, including things like kindness, honesty, respect and trust.
Conversation-starters might include: Have you ever seen a deepfake? How could you tell it was fake? How would you feel if someone made a deepfake of you?
Eventually, it may become impossible to spot a deepfake by looking at it. But for now, ‘bad’ deepfakes often show telltale signs: blurring or flickering around the face, unnatural blinking or eye movement, lip movements that don’t match the audio, and lighting or skin tone that looks slightly off.
And we can ask smart questions such as: Who posted this, and why? Is it being reported by reliable news sources? Is this something the person would really say or do?
We can also lobby our politicians, tech companies and educators to act on deepfakes – for example, by investing in detection technology, requiring platforms to label or remove harmful deepfakes, and teaching media literacy in schools.
If you know a child under 18 who has been bullied online, including through deepfakes, please contact the Office of the eSafety Commissioner. They also help people who’ve had sexual or nude images shared without consent, including via deepfakes.
And for free, confidential counselling, contact Kids Helpline, 1800 RESPECT, eheadspace, or Lifeline.