Just yesterday, we reported that OpenAI has officially launched Sora. The text-to-video generator is now a reality, available in many regions with a certain degree of freedom. This marks a significant milestone in the AI world, as Sora is one of the most advanced tools for creating videos with artificial intelligence. However, one of its key features, and the one regular users want most, remains restricted.
Access to this feature has been granted to only a small number of people worldwide, specifically those within OpenAI’s trusted circle. It allows users to generate videos lasting several seconds from photographs of real people. According to OpenAI, the tool still poses challenges for user safety and privacy: its capabilities are powerful enough that misuse is a real possibility, which makes controlled access crucial.
As you can imagine, creating a video from a photograph can be amazing, but it also carries risks if the tool is made available to everyone. It could be used to generate fake nudes, to depict real people doing things they never did, and for similar harmful activities.
The ability to create videos using a photo or video of a real person as a “seed” presents a high risk of misuse. OpenAI is addressing this cautiously, rolling out access gradually while it analyzes early usage patterns, as noted on the OpenAI Blog.
In simpler terms, OpenAI has not yet figured out how to fully prevent misuse of its most advanced video feature. To reduce the risk, it is currently available only to a select group of trusted testers; unless you are among them, you won’t have access to this exclusive capability.
Sora Has Protection Systems, But They Are Not Enough
Sora is equipped with various safety measures to prevent malicious video creation. One example is its age filter, which blocks sexual or violent prompts if the initial image features individuals under 18. According to OpenAI, the AI applies a “stricter threshold” in such cases to ensure greater protection.
Additionally, every video created with Sora includes detailed metadata. This metadata contains information about the creator, the initial image, and other critical details that could help trace the source if the tool is used for criminal activities. However, as TechCrunch points out, this metadata can be removed relatively easily, which limits its effectiveness as a protective measure.
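How easy is “relatively easily”? Container-level metadata in a video file can be dropped with a single remux, without re-encoding a single frame. The sketch below illustrates that general point using the ffmpeg command-line tool called from Python; the filenames are placeholders, and since OpenAI has not published the exact format of Sora’s provenance data, treat this as an illustration of why file metadata is a weak safeguard rather than a recipe specific to Sora’s files.

```python
import subprocess

# Illustrative only: "sora_clip.mp4" is a placeholder filename.
# ffmpeg's "-map_metadata -1" copies no metadata into the output,
# and "-c copy" passes the audio/video streams through untouched,
# so the result looks identical but carries no container metadata.
subprocess.run(
    [
        "ffmpeg",
        "-i", "sora_clip.mp4",   # input file carrying provenance metadata
        "-map_metadata", "-1",   # drop all global metadata
        "-c", "copy",            # no re-encoding; streams copied as-is
        "stripped_clip.mp4",     # output without that metadata
    ],
    check=True,
)
```

A provenance record that lives in the file’s metadata rather than in the pixels themselves survives only as long as nobody bothers to remove it, which is presumably why OpenAI does not rely on it alone.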
So it seems that OpenAI will continue to restrict access to Sora’s feature for creating videos from images of real people, at least until it finds a way to prevent the internet from being flooded with sexual videos generated from a single photograph.