A new AI tool from Microsoft can generate realistic-looking animation from still images of faces, a step toward more lifelike deepfake technology.
Microsoft Research Asia unveiled its latest findings last Saturday, describing the model as capable of replicating lifelike lip movements and other facial nuances seen when real people speak.
Users need only supply a recorded voice clip along with a still image of a face, and the model automatically generates the lip-syncing and estimated expressions that accompany the animation.
Microsoft's Newest AI Brings 'Mona Lisa' to Life
To test the limits of the new technology, the tech giant applied it to Leonardo da Vinci's famous painting, the "Mona Lisa," making the painting appear to move seamlessly.
Previous iterations of the technology could only replicate mouth movements from real videos of people speaking, and even then not without hitches.
As of writing, Microsoft does not plan to release the new AI model, VASA-1, to the public until it is certain that the technology will be "used responsibly and in accordance with proper regulations."
The company also stated its opposition to the misuse of its technology "to create misleading or harmful contents of real persons."
Microsoft's New AI Tool Raises Deeper Concerns About Deepfake Technology
While the company has explicitly expressed its opposition to the misuse of its AI technologies, experts remain concerned about the implications of such innovation for the growing culture of misinformation powered by deepfakes.
A recent Senate hearing highlighted such concerns, with a deepfake detection team warning that recent improvements to the technology mean that "anybody with a Google search" can now create "anything as entertaining and dangerous as they can imagine."
It does not help that several independent research studies have found that Microsoft's AI tools can still generate images of political figures and provide inaccurate information about vital societal issues.
As of writing, the technology's misuse is spreading beyond politicians and celebrities, with deepfakes now also being used to replicate the likenesses of online influencers to promote scams and unauthorized products.