Artificial intelligence still lives in the gray areas of the law, as its widespread use is relatively new. Against that backdrop, Microsoft, GitHub, and Microsoft partner OpenAI are asking the court to dismiss the lawsuit filed against them over alleged AI-powered piracy.
AI vs Law
The three companies were hit with a lawsuit over GitHub Copilot's use of licensed code. Copilot is a tool that suggests lines of code to programmers directly in the code editor, and the issue is that it learns from publicly available code hosted on GitHub.
This raises the question of whether the practice violates copyright law, which is also the concern in a proposed class action lawsuit brought by the programmer and lawyer Matthew Butterick, together with a legal team from the Joseph Saveri Law Firm.
Although the plaintiffs claim that the software pirates code on an "unprecedented scale," Microsoft and GitHub responded that the complaint fails on two intrinsic defects: a lack of injury and a lack of a viable claim, as reported by The Verge.
OpenAI added that the accusers allege a grab bag of claims that fail to "plead violations of cognizable legal rights." Microsoft, GitHub, and OpenAI argue that the plaintiffs were not personally harmed by the tool, so their claims rely solely on hypothetical events.
Machine Learns from Man
The three companies clarified that Copilot's suggestions are drawn from an entire body of knowledge gleaned from public code, rather than extracted from specific pieces of the open source code available to the public, as the plaintiffs allege.
As a counterargument, Microsoft and GitHub said that the plaintiffs are undermining open source principles by seeking an injunction and a "multi-billion dollar windfall" over software that is willingly shared as open source.
Meanwhile, Microsoft is extending its partnership with OpenAI, pouring billions into the collaboration as the companies plan to continue building Azure's leading AI infrastructure, which will help others build and deploy AI applications globally.
As for the use of copyrighted content, the plaintiffs also believe that the AI art generators Midjourney and Stability AI violate copyright law, as they take artwork from the internet to generate their output.
Then again, an argument can be made that the AI is simply learning from what it can see on the internet rather than copying it. However, Getty Images believes that Stability AI's Stable Diffusion tool is taking images from its site.
According to the stock image company, Stability AI unlawfully copied and processed millions of copyright-protected images and associated metadata owned or represented by Getty Images. The company added that AI has the potential to stimulate creative endeavors.
This is an issue for the company because it has provided licenses to other AI developers for purposes such as training artificial intelligence systems, in ways that respect personal and intellectual property rights. Stability AI, however, did not seek any such license.