South Korea is stepping up its fight against AI-generated election misinformation ahead of local elections on June 3.
At the National Election Commission office in Gwacheon, teams of workers spend their days scanning social media platforms, chatrooms, and online political groups looking for fake AI-created content.
Officials say the rapid growth of artificial intelligence is making the job harder than ever. New AI tools can now create highly realistic videos, audio clips, songs, and fake news reports that are difficult to tell apart from real content.
One election monitor, Choi Ji-hee, said each new AI model makes fake content more convincing, turning the fight against misinformation into a constant challenge.
Recent examples uncovered by investigators include:
- A fake TV report claiming a mayoral candidate was recognized by Time magazine
- An AI-generated K-pop song promoting one politician while attacking rivals
South Korea tightened its election laws in 2023 to address these threats. Under the revised rules, AI-generated content involving candidates that could mislead voters is banned during the three months before an election.
Repeat offenders, or those who create especially harmful fake content, can face up to seven years in prison or heavy fines.
The country has embraced AI faster than most nations. Government data shows that nearly half of South Koreans use generative AI tools, while OpenAI says South Korea has more paid subscribers than any country outside the United States.
At the same time, concerns over AI misinformation have exploded. Officials say reports of fake AI-generated election content jumped sharply between the 2024 general election and the 2025 presidential race.
The issue has also become politically sensitive after former president Yoon Suk Yeol repeated false election conspiracy claims during the political crisis surrounding his failed martial law attempt in late 2024.
Experts say South Korea’s strict approach reflects growing public concern about AI abuse, including election manipulation and deepfake crimes targeting women and girls.
A recent survey found that most South Koreans believe AI-generated content could influence election results and support stronger laws to stop it.