Among the many AI-powered tools transforming the creative industries, Meta has introduced one of its latest innovations: Movie Gen, a tool capable of generating visual stories. In this article, we explore the processes Meta Movie Gen is altering and whether it brings significant changes to the market.
What is Meta Movie Gen?
Several AI video-generation tools are already available, so what makes Meta's product stand out? We can break this down into a few parts.
First and foremost, one of the key features of this technology, built from a set of foundation models, is that it can generate entirely new videos from nothing more than a text prompt.
Secondly, the technology includes additional functions, such as editing pre-recorded videos or manipulating still images. Meta highlights that a video can be created not only from pre-recorded footage but also from a static image, transforming it into a dynamic video guided by text-based prompts.
Thirdly, the AI-generated videos can include sound, which is also created by artificial intelligence. The sound is synchronised so that background noises and musical scores match the visuals. For instance, the sound of an engine would align with the movement of a car, or the roar of a waterfall would coincide with the visual of cascading water. However, at this stage, the technology does not replicate human voices in videos.
Meta Movie Gen’s Position in the Market
While Meta is one of the market leaders, it faces fierce competition for the top position. One of its main competitors is OpenAI's tool, Sora.
Essentially, both AI-powered tools are designed to generate video content, and Meta’s product was launched as a response to OpenAI’s offering. Although both products are still in the experimental phase, requiring further testing and refinement, their distinct features are becoming evident.
For example, Movie Gen generates short videos lasting up to 16 seconds, while Sora can create videos up to 60 seconds, making it better suited for more cinematic content or more complex stories. This is possible because Sora allows for multiple scene sequences in the generated video. However, Meta is improving its product in terms of sound synchronization, which OpenAI has not yet fully explored.
Additionally, Movie Gen supports personalisation and detailed video editing, including changing backgrounds or objects, which Sora does not yet offer. Therefore, there is no single answer as to which product leads the market: the choice depends on the idea and the specific needs that each tool can best address.
Risks Posed by Meta Movie Gen and Other AI Video Generators
It is important to note that Meta Movie Gen is not yet publicly available, with only a description of the product released so far. A significant risk at this stage is that, while innovative, the technology remains costly and has room for improvement, particularly in reducing the time between a query and the generated result.
As with any other AI-powered tool, questions and concerns arise regarding the illegal use of data to generate content and the potential for harmful applications. As such, many issues remain unresolved.
Although Meta claims that this technology is trained using licensed and publicly available datasets, the lack of specific source disclosures raises some scepticism.
Final Word
It is still too early to predict the long-term impact of Meta Movie Gen and how it will affect filmmakers, content creators, or even actors. However, this technology is likely to bring changes and may reduce the need for human labour in certain areas.
Nevertheless, it serves as a prime example of the possibilities offered by today’s technological landscape, which is shaping a completely different future for the market.
If you are interested in this topic, we suggest you check our articles:
- Image Background Removal: A Breakdown of How It Works
- The Meaning of an Image is Changing, Thanks to GenAI
Sources: Data Science Dojo, TechCrunch, The Verge