Stop OpenAI from launching new GPT models: AI policy group to US FTC

San Francisco

31-March-2023

Microsoft-backed OpenAI, the company behind the AI chatbot ChatGPT, is facing a new complaint with the US Federal Trade Commission (FTC), which is being asked to investigate the company and suspend its commercial deployment of large language models, including ChatGPT.

The nonprofit research group Center for AI and Digital Policy (CAIDP) made public its complaint against OpenAI on Thursday, alleging the company violated Section 5 of the FTC Act, which prohibits unfair and deceptive business practices, reports CNBC.

According to CAIDP, GPT-4 is "biased, deceptive, and a risk to privacy and public safety".


Moreover, the AI policy group said the large language model fails to meet the agency's standards for AI to be "transparent, explainable, fair, and empirically sound while fostering accountability", according to the report.

The group is asking the FTC to require OpenAI to establish an independent assessment of GPT products before they are deployed in the future.

Further, according to the report, the group also wants the FTC to establish a public incident reporting system for GPT-4, similar to the systems the agency already has in place for reporting consumer fraud.

The AI policy group also requests that the agency undertake a rulemaking initiative to develop standards for generative AI products.

Meanwhile, several top entrepreneurs and AI researchers, including Tesla and Twitter CEO Elon Musk and Apple co-founder Steve Wozniak, have written an open letter asking all AI labs to immediately pause the training of AI systems more powerful than GPT-4 for at least six months.

Arguing that AI systems with human-competitive intelligence can pose profound risks to society and humanity, more than 1,100 global AI researchers and executives signed the open letter calling for a pause on "all giant AI experiments". - IANS
