OpenAI's GPT Store contains several chatbots that violate copyrights, TechCrunch reports. The chatbots can generate content based on well-known franchises without permission from the rights holders.
TechCrunch's editors found several chatbots based on well-known film, television and game franchises. For example, some can generate images based on franchises such as Star Wars or Disney's Monsters, Inc. The digital store also reportedly features GPTs that impersonate well-known characters from popular franchises, without the GPT developers having obtained a license to do so. The outlet noticed, among others, chatbots that pretend to be Aang from Avatar: The Last Airbender and Wario.
Kit Walsh, an attorney for the Electronic Frontier Foundation, told TechCrunch that such chatbots could be used to create works that infringe copyright. Walsh also notes that problems may arise with some companies' protected trademarks.
OpenAI has policies intended to ensure that users do not break the law. Developers who want to publish a chatbot on the digital store must adhere to the terms of use and get verified. According to a spokesperson, this verification process relies partly on automated systems, partly on human review and partly on reports from users.
“We allow creators to make their GPTs respond 'in the style of' a specific real person, as long as they do not impersonate a real person, for example by giving GPTs the name of a real person, instructing them to fully imitate that person, and including their image as a GPT profile photo,” an OpenAI spokesperson told TechCrunch. The company has not yet responded to the possible copyright violations.
OpenAI opened the GPT Store at the beginning of this year. It is a digital store where chatbots built on the GPT-4 language model can be offered for a fee. The store was originally supposed to launch in November 2023, but OpenAI postponed the release to 'early 2024'. The company said it wanted to make improvements first, without going into detail.