A singer has to apologize for lip-syncing. Another artist suddenly appears in an advertising video they never knew they had taken part in. Two seemingly separate stories, yet they share a common root: the erosion of authenticity in artistic life, both on stage and in digital space.
The Ministry of Culture, Sports and Tourism's call to rectify lip-syncing is not new, but it comes at the right time. As performance-support technology develops, the boundary between "technical assistance" and "replacing the live experience" grows ever more fragile.
It is undeniable that on many large stages, especially at outdoor events or on live television, the use of backing tracks and pre-recorded vocals is a technical solution to ensure program quality. But when support turns into complete replacement, it becomes a form of dishonesty.
Audiences can accept an imperfect voice. What they cannot accept is being deceived into believing a fake is real. When audiences pay for tickets to a music program, what they want is a direct experience, the real emotions of artists on stage, not to hear singers mime to a playback track.
Therefore, the Ministry of Culture, Sports and Tourism is right to tighten the rules on lip-syncing. Even more necessary is to establish clear standards: what counts as live singing, what counts as singing over a pre-recorded track, and in which cases this must be disclosed to the public.
Another issue is the impersonation of artists with AI. If lip-syncing on stage is a matter of honesty in performance, then impersonating artists with AI is a far bigger challenge, because it breaks down the boundary between real and fake in the digital space.
Now, with just a few steps, a fake video can make an artist appear to advertise a product, express a point of view, or call for an action they have never taken. More dangerously, such content spreads at lightning speed, before the truth can be verified.
By the time viewers accept such news as true, the damage has already been done, not only to the artist's personal honor, but also to the public's trust in information on the internet.
To solve this problem at the root, beyond regulations requiring AI-generated content to be labeled, coordination is needed from many parties: management agencies, digital platforms, artists, and users themselves.
Management agencies need to complete a legal framework quickly and with enough force to punish violations. Platforms must take responsibility for detecting and removing fake content. Artists need to proactively protect their image. And just as important, audiences and users must stay vigilant, and must not readily trust or share unverified content.
Finally, when technology can reproduce human images and voices almost perfectly, authenticity, the core value of culture, becomes all the more precious. A live voice may not be flawless, but it carries a real voice and real emotions. An AI video can be perfect down to every pixel, yet it remains a cold counterfeit.
If society accepts imitation for the sake of convenience, entertainment, or curiosity, then sooner or later the boundary between real and fake will blur. And when that happens, it is not only art that will suffer, but society's faith in the truth itself.