Artificial Intelligence must comply with consumer protection laws, Mass. AG says
Artificial intelligence may be new, and the laws protecting consumers comparatively old, but the Bay State's attorney general says the technology will not escape their reach.
Massachusetts Attorney General Andrea Campbell issued a legal advisory on Tuesday, telling developers, suppliers, and users of AI technology that the state’s consumer protection laws apply to the burgeoning tech.
“There is no doubt that AI holds tremendous and exciting potential to benefit society and our commonwealth in many ways, including fostering innovation and boosting efficiencies and cost-savings in the marketplace. Yet, those benefits do not outweigh the real risk of harm that, for example, any bias and lack of transparency within AI systems, can cause our residents,” Campbell said.
According to the AG’s office, the reported use of artificial intelligence to generate “deep-fake” images of political figures, celebrities, and everyday people, and the emergence of “voice-cloning” technologies, all already fall under state laws protecting consumers from “unfair and deceptive” practices.
“Consumers in the commonwealth enjoy the protection of Chapter 93A which creates a ‘flexible set of guidelines’ as to what should be considered ‘unfair and deceptive’ and which is intended ‘to grow and change with the times,’” the advisory reads, in part.
Campbell cautioned that it would be a breach of state law to use deep fakes or voice cloning for fraudulent purposes, to advertise capabilities an AI system does not possess, to sell or offer an AI system to solve problems it cannot handle, or to "misrepresent the reliability" or safety of an AI system.
Campbell’s advisory comes as the nation experiences an AI boom, with many big-name companies employing the technology to make decisions and increase automation.
Google, Amazon, and Apple use it in their voice-activated assistants. Netflix and other streaming services use it to make recommendations to users. Car companies are using the technology to automate driving. Programs like ChatGPT and Deep Dream use AI to generate content.
All of that, according to the attorney general, is already regulated under state law.
“As AI usage becomes more common, this advisory serves as an important notice that our state’s consumer protection, anti-discrimination, and data privacy laws apply to AI, just as they would within any other applicable context,” Campbell said in a statement. “My office intends to enforce these laws accordingly.”