Korea urged to proactively cope with ‘deepfakes’

Calls are growing for the South Korean government to introduce measures to deal with the abuse of manipulated videos, also called “deepfakes,” as the spread of such videos breaches individual privacy and, in extreme cases, could even undermine elections, according to experts.

A deepfake is a video produced with machine learning and artificial intelligence technologies, in which images and audio are altered to falsify someone’s identity, words or actions.
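For readers curious about the mechanics, the approach most commonly associated with face-swap deepfakes trains one shared encoder together with a separate decoder per identity; swapping faces then amounts to routing one person’s encoded face through the other person’s decoder. The sketch below is a minimal, purely illustrative PyTorch version of that idea; the layer sizes, the 64x64 face crops and the omission of training are simplifying assumptions, not any particular tool’s implementation.

```python
# Minimal, illustrative sketch of the shared-encoder / per-identity-decoder
# design commonly used for face-swap deepfakes. Sizes and the missing
# training loop are assumptions for illustration only.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Compresses a 64x64 face crop into a shared latent code."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),   # 64 -> 32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),  # 32 -> 16
            nn.Flatten(),
            nn.Linear(64 * 16 * 16, latent_dim),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Reconstructs a face for one specific identity from the latent code."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 64 * 16 * 16)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),     # 16 -> 32
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),   # 32 -> 64
        )

    def forward(self, z):
        return self.net(self.fc(z).view(-1, 64, 16, 16))

encoder = Encoder()
decoder_a = Decoder()  # would be trained to reconstruct person A
decoder_b = Decoder()  # would be trained to reconstruct person B

# After training, swapping faces is just routing person A's encoded face
# through person B's decoder, which renders it with B's appearance.
face_of_a = torch.rand(1, 3, 64, 64)       # dummy face crop of person A
swapped = decoder_b(encoder(face_of_a))
print(swapped.shape)                        # torch.Size([1, 3, 64, 64])
```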

Experts said there should be tougher penalties for the intentional abuse of deepfakes, as the rapid development of the technology is making it harder to distinguish real footage from fake, and manipulated videos could be used to defraud people and organizations.

They said the government also needs to develop a way to pre-emptively detect deepfakes and prevent the spread of fake content, similar to how antivirus software blocks cyberattacks.
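As a rough illustration of what such pre-emptive detection can look like, many published detectors simply fine-tune a standard image classifier to label sampled video frames as real or manipulated and flag a video when enough frames look fake. The sketch below is a minimal, assumption-laden example of that frame-level approach; the ResNet-18 backbone, the 224x224 crops and the 0.5 thresholds are placeholders, not a description of any deployed system.

```python
# Minimal sketch of frame-level deepfake detection: an ordinary image
# classifier with a single real/fake output, applied to sampled frames.
# Model choice, input size and thresholds are illustrative assumptions.
import torch
import torch.nn as nn
from torchvision import models

# Standard ResNet-18 with its final layer replaced by one fake/real logit.
# In practice this would be fine-tuned on labeled real and manipulated faces.
detector = models.resnet18()
detector.fc = nn.Linear(detector.fc.in_features, 1)
detector.eval()

def fake_probability(frames: torch.Tensor) -> torch.Tensor:
    """frames: (N, 3, 224, 224) face crops sampled from a video.
    Returns the per-frame probability that each crop is manipulated."""
    with torch.no_grad():
        return torch.sigmoid(detector(frames)).squeeze(1)

# A video would be flagged if enough of its sampled frames look manipulated.
frames = torch.rand(8, 3, 224, 224)        # dummy batch of sampled frames
probs = fake_probability(frames)
print("video flagged:", bool((probs > 0.5).float().mean() > 0.5))
```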

According to a recent study by Dutch cybersecurity company Deeptrace Labs, 25 percent of pornographic deepfake subjects are female K-pop stars. The company said Korean stars have been increasingly featured in the manipulated videos because of their explosive popularity across the globe.

To protect such victims, Rep. Park Dae-chul of the Liberty Korea Party proposed a “deepfake bill” revising the current Act on Sexual Crimes of Violence to set a penalty of up to seven years in prison for those who distribute fake videos.

“AI technology, which is core to producing deepfakes, is important for the country’s development, but I proposed the bill to actively respond to people who abuse the technology,” the lawmaker said.

However, a legislative researcher who helped the lawmakers propose the deepfake bill said it will take time for the bill to become law because it still needs to go through several more steps.

“To make the bill a law, lawmakers have to go through a few more steps. They have to discuss the key issues surrounding the technology with government departments and hold public hearings to raise social awareness of those issues,” said Kim Yoo-hyang, the director of the science media and telecommunication team at the National Assembly Research Service (NARS).

“Given that the fake news bill, which prohibits the spread of deliberate disinformation, is still pending, it will take even more time for the deepfake bill,” she said. NARS is a state-run agency that offers policy recommendations to lawmakers.

While the bill will need more time to become law, Kim said it carries important meaning.

“Many women are falling victim to manipulated videos using deepfake technology. The technology can also destabilize elections. So proposing the deepfake revision is a meaningful first step for society,” she said.

The director added that there need to be technological measures to prevent the creation and spread of deepfakes.

“The U.S., which places a high value on freedom of speech, has no law prohibiting the dissemination of fake news, while many other countries are increasingly moving to enact such measures into law. But when it comes to the deepfake issue, four bills have been submitted to state legislatures and five bills have been forwarded to the federal government. So Korea also needs to improve its capability to effectively and legally punish people who infringe on others’ privacy using deepfake technology. We also need technological measures to pre-emptively detect manipulated videos,” the director said.

Lee Dong-hwi, a professor in the department of information security at Dongshin University, agreed that the country needs a law to protect victims of deepfakes, but said society should take a careful approach in deciding its scope because it may jeopardize freedom of speech.

“The reason lawmakers are proposing the deepfake bill is that it is getting really hard to distinguish what is real and what is fake. Since a deepfake is a video showing something that never actually happened, there must be a law that can protect the victims. But the government should also be careful in deciding the scope of the law because it may harm freedom of speech,” the professor said.
