South Korea has been plunged into political chaos after now-impeached President Yoon Suk Yeol declared emergency martial law on December 3. The move stunned the nation, especially after reports emerged that Yoon had ordered troops to occupy the National Election Commission headquarters that night.
The controversy deepened as speculation grew over Yoon’s motives, with many pointing to the influence of far-right YouTube channels that have long promoted baseless election fraud theories. These channels have popularized allegations of irregularities in the 2020 and 2024 general elections, raising questions about the president’s reliance on such fringe narratives.
Further fueling the debate, reports resurfaced that Yoon is an avid consumer of content from these extremist platforms. It was reportedly an “open secret” that Yoon had invited approximately 30 far-right YouTubers to his inauguration, including the operators of channels such as Lee Bong-gyu TV and Sisa Warehouse, which boast 927,000 and 144,000 subscribers, respectively.
Prominent lawmaker Lee Hae-min, a former Google employee, weighed in on December 13, stating, “Yoon truly believes in the election fraud theories circulating on far-right YouTube channels. He sees them as the root cause of all problems.”
Critics, including former People Power Party leader Han Dong-hoon, have warned of the dangers of aligning with such extremist influencers. Han called these YouTubers “commercial fearmongers” and cautioned that their influence risks dismantling conservative politics in South Korea.
This unfolding drama has spotlighted YouTube’s role in shaping political discourse and raised urgent questions about how its algorithms steer users’ beliefs and behaviors.
YouTube & the confirmation bias trap
Experts argue that YouTube’s recommendation algorithms amplify confirmation bias by tailoring content to users’ preferences.
Han Jeong-hun, a professor at Seoul National University and author of a 2022 dissertation on political polarization in Korean YouTube, explained, “When you enter YouTube’s main page, recommended videos appear based on your viewing history. This creates a cycle where users repeatedly engage with similar content, reinforcing a biased consumption of information that can influence behavior over time.”
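The dynamic Han describes is a feedback loop: what you watch shapes what you are shown, and what you are shown shapes what you watch next. The toy simulation below is a minimal sketch of that loop, not YouTube’s actual system; the topic labels, the `bias` parameter, and the assumption that the user watches everything recommended are all hypothetical.

```python
import random
from collections import Counter

# Hypothetical content categories for illustration only.
TOPICS = ["politics_far_right", "politics_mainstream", "sports", "music", "cooking"]

def recommend(history, n=10, bias=0.9):
    """Recommend n videos: with probability `bias`, pick a topic the user
    has already watched (weighted by how often); otherwise pick at random.
    This is a toy stand-in for a history-driven recommender."""
    recs = []
    for _ in range(n):
        if history and random.random() < bias:
            # Weight choices by past viewing frequency -> the feedback loop.
            topics, counts = zip(*Counter(history).items())
            recs.append(random.choices(topics, weights=counts)[0])
        else:
            recs.append(random.choice(TOPICS))
    return recs

random.seed(42)
history = ["politics_far_right"]  # a single initial click
for _ in range(5):
    feed = recommend(history)
    history.extend(feed)  # assume the user watches what is recommended

share = history.count("politics_far_right") / len(history)
print(f"Share of that topic in the user's history after 5 rounds: {share:.0%}")
```

Under these assumptions, one initial click is enough to make the same topic dominate the feed within a few rounds, because each recommendation it wins makes it more likely to be recommended again.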
YouTube’s design further exacerbates the issue. “Provocative content, whether political or otherwise, is more likely to go viral because the algorithm prioritizes engagement to boost monthly active users,” said Yu Hyun-jae, a journalism professor at Sogang University.
Older generations & YouTube’s growing influence
The problem is particularly pronounced among South Korea’s older population, who increasingly rely on YouTube for political news.
“There has been a significant rise in people in their 50s, 60s, and 70s turning to YouTube for political content,” noted Lee Moon-haeng, a media communication professor at Suwon University. “They find peer groups that reinforce their views, often without the differing perspectives their children might provide.”
The convenience of YouTube’s algorithm-driven recommendations keeps users hooked, making it difficult for them to explore diverse content. Efforts to disable or bypass the algorithm often fail because users are unwilling to give up that convenience, Yu said.
Mozilla researcher Jesse McCrosky highlighted this challenge in a 2022 report, pointing out that even when users try to limit recommendations, YouTube merely reduces the frequency of unwanted content. “This means that users cannot meaningfully control YouTube’s algorithm,” McCrosky said.
Need for media literacy
As political tensions escalate, YouTubers on both the right and the left are seizing the moment to shape public opinion. Leftist platforms such as The DDanziGroup and Ruliweb have also been accused of spreading disinformation.
Experts argue that the solution lies in improving media literacy. “Few countries have such a close connection between YouTube and politics as South Korea, thanks to its nearly unrestricted internet and robust infrastructure,” said Yu. “The best solution is media literacy education for the public.”
Lee echoed these concerns, warning about the dangers of blindly accepting and sharing misinformation. “People see something they believe is important or accurate and spread it without verifying its truth. This amplifies misinformation,” he said.
Han Jeong-hun added that regulating content or blocking algorithms is neither feasible nor desirable in a democratic society. Instead, the public must take responsibility for filtering and verifying information. “Citizens in a democracy should cross-check content with other media sources if they suspect it’s false,” Han said.
With inputs from agencies.