- Suchir Balaji, a former OpenAI researcher, died by suicide late last month.
- After leaving the company, Balaji publicly argued that OpenAI was violating copyright law.
- He was named in The New York Times' lawsuit against OpenAI, a case that could have far-reaching consequences.
Suchir Balaji, a former OpenAI researcher, was found dead in a San Francisco apartment. Eight days earlier, the 26-year-old had been named in a lawsuit against his former employer that could shape the future of AI and the internet.
The New York Times sued OpenAI and Microsoft in December of last year, alleging the companies used “millions” of the newspaper’s articles without permission to train OpenAI’s popular ChatGPT models. Both companies have denied violating copyright law.
Court filings seen by Business Insider show that on November 18, lawyers for the Times asked the judge to add Balaji as a “custodian” in the case. In their letter, the lawyers said Balaji held “unique and relevant documents” that could support their copyright-infringement claims against OpenAI and Microsoft.
The Times also proposed other former OpenAI employees as custodians, including cofounder Ilya Sutskever. The filings do not specify what material Sutskever might contribute to the case.
The Times’ suit is one of many copyright lawsuits filed against the AI company since ChatGPT launched in 2022.
If the courts side with the Times or with the other news outlets and writers who have sued, AI companies could face steep costs and new limits on the data they can use to train their models.
The Times’ complaint does not seek a specific dollar amount, but it holds OpenAI and Microsoft responsible for “billions of dollars” in damages.
Representatives for OpenAI, Microsoft, and The New York Times did not respond to requests for comment.
Balaji Raised Concerns About OpenAI’s Use Of Copyrighted Data
Balaji joined OpenAI in 2020 and worked on training the ChatGPT and GPT-4 models, according to court filings and reporting by The New York Times. He quit OpenAI in August “because he no longer wanted to contribute to technologies that he believed would bring society more harm than benefit,” the Times reported, and said the company’s practices violated copyright law.
On October 23, he published an essay on his personal website questioning whether OpenAI’s use of copyrighted data qualifies as fair use.
“While generative models rarely produce outputs that are substantially similar to any of their training inputs, the process of training a generative model involves making copies of copyrighted data,” Balaji wrote. “If these copies are made without permission, this might be copyright infringement, based on whether the use of the model in question is ‘fair use.’” Because fair use is determined on a case-by-case basis, he argued, no blanket statement can be made about when generative AI qualifies.
The Times published an article about the former OpenAI researcher the same day.
He told the Times, “If you believe what I believe, you have to just leave the company.”
Eight days after Balaji was named in the Times’ lawyers’ letter, at around 1:15 p.m. on November 26, officers from the San Francisco Police Department responded to a wellness check at an apartment in the Lower Haight neighborhood.
“When officers and medics arrived, they found a deceased adult male from what appeared to be a suicide,” an SFPD spokesperson told BI. “No evidence of foul play was found during the initial investigation.”
The office of the city’s chief medical examiner later identified the deceased as Balaji.
David Serrano Sewell, executive director of the city’s office of the chief medical examiner, told BI the manner of death was suicide. He declined to comment further.
A representative for OpenAI told BI on Friday, “We are devastated to hear this very sad news. Our thoughts are with Suchir’s family and friends during this difficult time.”