OpenAI's launch of Sora, a short-form video sharing app similar to TikTok but filled entirely with AI-generated clips, is sparking controversy within the company.
Some current and former employees worry that the move could pull the company away from its nonprofit mission, developing AI that benefits humanity, which many consider the soul of OpenAI.
John Hallman, a researcher at OpenAI, admitted on X that he felt uneasy when the company announced Sora 2, but said he believed the team had worked hard to create a positive and safe experience.
Boaz Barak, a Harvard professor and OpenAI researcher, likewise said that Sora is technically impressive, but that it is too early to declare the app free of the pitfalls that have drawn heavy criticism toward other social media platforms.
Meanwhile, former OpenAI employee Rohan Pandey took the opportunity to urge researchers to join the startup Periodic Labs, an AI project aimed at scientific discovery, rather than building an endless AI-generated TikTok machine.
Many similar opinions surfaced, reflecting a division within the research community.
The launch of Sora highlights OpenAI's internal tension between its two identities: the world's fastest-growing consumer technology company and a nonprofit AI lab.
CEO Sam Altman explained on X that Sora is meant to help OpenAI sustain its AGI research while introducing exciting new products, generating enough profit to cover its growing computing costs.
He noted that ChatGPT was also met with skepticism at first but ultimately became an important source of research funding.
However, observers question how long OpenAI can pursue lucrative business opportunities without betraying its mission. The concern has grown sharper as the company shifts to a for-profit structure to raise capital, even eyeing an eventual IPO.
California Attorney General Rob Bonta also recently warned OpenAI to ensure it remains focused on safety as it restructures.
With Sora, OpenAI asserts that the app is designed for entertainment and creativity, not for maximizing the time users spend glued to the screen.
The company says it will limit addictive mechanics, for example by sending reminders when users scroll for too long and by prioritizing videos from friends over content from strangers.
This is seen as an early effort to avoid repeating the mistakes of previous social networks.
However, on the first day of release, users spotted engagement-boosting details, such as an animated emoji effect when liking a video.
This has led analysts to question whether OpenAI can truly break from the trajectory of addictive social platforms.
Sam Altman himself once admitted on a podcast that misaligned feed algorithms on social media have caused serious harm to society.
For Sora, the real challenge is whether OpenAI can build an AI video platform that is engaging while staying true to its original mission, or whether it will repeat the vicious cycle of the social media industry.