
Massachusetts Issues Advisory on AI and Consumer Protection


The Massachusetts Attorney General’s Office (AGO) recently issued an advisory clarifying that existing Massachusetts law applies to artificial intelligence (AI) to the same extent as any other product in the stream of commerce.

Massachusetts Attorney General Andrea Campbell became the first attorney general in the nation to issue such guidance on AI. The advisory opens with an acknowledgment of AI’s potential societal benefits and notes the Commonwealth’s special role in guiding the technology’s development.

However, the advisory’s central purpose is a warning to AI developers, suppliers, and users that Massachusetts law, including the Massachusetts Consumer Protection Act (Chapter 93A), applies to AI. The Act makes it unlawful to engage in unfair or deceptive business acts or practices in Massachusetts.

The AGO provided the following non-exhaustive list of unfair or deceptive business acts involving AI:

  • Falsely advertising the quality, value, or usability of AI systems.
  • Supplying an AI system that is defective, unusable, or impractical for the purpose advertised.
  • Misrepresenting the reliability, manner of performance, safety, or conditions of an AI system, including statements that the system is free from bias.
  • Offering for sale an AI system in breach of warranty, in that the system is not fit for the ordinary purposes for which such systems are used, or is unfit for the specific purpose for which it is sold where the supplier knows of that purpose.
  • Misrepresenting audio or video content of a person for the purpose of deceiving another into engaging in a business transaction or supplying personal information as if to a trusted business partner, as in the case of deepfakes, voice cloning, or chatbots used to perpetrate fraud.
  • Failing to comply with Massachusetts statutes, rules, regulations, or laws intended to protect the public’s health, safety, or welfare.

The advisory closes with an important reminder to businesses that AI systems must also comply with privacy, anti-discrimination, and federal consumer protection laws.

AI Regulation Will Continue to Increase

Businesses can reasonably expect that AI will increasingly be the subject of new regulation and litigation at the state and federal levels. At the national level, the Biden administration issued an Executive Order in October 2023 directing various federal agencies to adapt to the growing utility and risks of artificial intelligence. In the wake of that Executive Order, the Federal Trade Commission has already taken its first steps toward AI regulation with a proposed rule prohibiting the use of AI to impersonate human beings. The Department of Labor has announced principles that will apply to the development and deployment of AI systems in the workplace, and other federal agencies have also taken action.

In 2024, Colorado and Utah lawmakers passed their own AI laws, which will likely serve as models for other states considering AI legislation. Both the Colorado Artificial Intelligence Act and Utah’s Artificial Intelligence Policy Act bring AI use within the scope of existing state consumer protection laws. Reflecting the AGO’s warning, plaintiffs have already begun asserting privacy and consumer claims based on AI technology used on business websites.

On the international stage, the EU Artificial Intelligence Act of March 13, 2024 is a comprehensive AI regulation that sorts AI applications into different risk levels and regulates them accordingly. Unacceptable-risk applications are banned, while high-risk applications are subject to extensive precautionary measures and oversight. AI developers and suppliers doing business in Europe should consider whether they are subject to the EU AI Act and ensure their products comply.

Preparing for AI Compliance, Enforcement, and Litigation Risks

There is significant uncertainty surrounding how AI will be deployed in the future and how legislators, regulators, and courts will apply new and existing laws to the technology.

Nonetheless, it is likely that compliance obligations and enforcement and litigation risks will continue to increase in the coming years. Businesses should therefore consult with experienced counsel before deploying or contracting to use new AI tools to ensure they are taking effective steps to mitigate these risks. Organizations should consider the following non-exhaustive list of measures:

  • Developing an internal AI policy governing the organization’s and its employees’ use of AI in the workplace.
  • Creating or updating due diligence practices to ensure the organization knows how third-party vendors are using, or plan to use, AI, including due diligence concerning what data is collected, transmitted, stored, and used when training AI tools with machine learning.
  • Actively monitoring state and federal law for new legal developments affecting the organization’s compliance obligations.
  • Ensuring that the organization and its third-party vendors have appropriate and ongoing governance processes in place, including continuous monitoring and testing for AI quality and the absence of impermissible bias.
  • Providing clear disclosure language concerning AI tools, capabilities, and features, including specific notifications when a customer engages with an AI assistant or tool.
  • Modifying privacy policies and terms and conditions to explain the use of AI technology and what opt-out or dispute resolution options are available to customers.
  • Reviewing and updating existing third-party contracts for AI-related terms, disclosure obligations concerning AI and risk, and liability allocation related to AI.
