AI Risks and Benefits in Massachusetts: Attorney General’s Advisory

April 18, 2024

BOSTON — As the executive branch of state government touts the competitive advantage of investing energy and money in artificial intelligence in Massachusetts’ technology, government, health and education sectors, the state’s top attorney is issuing warnings regarding its risks.

Attorney General Andrea Campbell issued an advisory to AI developers, providers and users on Tuesday, reminding them of their obligations under the state’s consumer protection laws.

“AI has tremendous potential benefits for society,” Campbell’s advisory states. “This presents exciting opportunities to drive efficiencies and cost savings in the marketplace, foster innovation and imagination, and spur economic growth.”

However, she warned: “AI systems have already been shown to pose serious risks to consumers, including bias, lack of transparency or explainability, implications for data privacy, and more. Despite these risks, businesses and consumers are rapidly adopting and using AI systems in ways that affect virtually all aspects of life.”

Developers promise that their complex and opaque systems are accurate, fair, efficient and fit for purpose, but Campbell notes that the systems are being deployed “in ways that can mislead consumers and the public,” pointing to chatbots used to perpetrate scams and to fake computer-generated images and videos, known as “deepfakes,” that mislead viewers about a person’s identity. Misleading and potentially discriminatory results from these systems may run afoul of consumer protection laws, according to the advisory.

The advisory echoes a dynamic in the state’s enthusiastic embrace of gambling at the executive level, with Campbell warning against possible harmful impacts while stopping short of a full-fledged objection to expansions such as an online Lottery.

Gov. Maura Healey has touted applied artificial intelligence as a potential boon for the state, creating an artificial intelligence strategic task force by executive order in February. Healey is also seeking $100 million in her economic development bill, dubbed the Mass Leads Act, to create an Applied AI Hub in Massachusetts.

“Massachusetts has the opportunity to be a world leader in Applied AI – but it’s going to take us bringing together the brightest minds in technology, business, education, healthcare and government. That is exactly what this task force will do,” Healey said in a statement that accompanied the task force announcement. “Members of the task force will collaborate on strategies that keep us ahead of the curve using AI and GenAI technology, which will bring significant benefit to our economy and communities across the state.”

In conversation with Healey last month, tech journalist Kara Swisher sharply criticized the enthusiastic embrace of AI hype, describing it as just “marketing in the moment” and comparing it to the crypto bubble, and signs of a similar AI bubble are worrying other technology reporters. Tech companies see value in “printing whatever we’re printing at the moment, and it’s actually exhausting,” Swisher said, adding that certain task-specific algorithms such as search tools are already common, but the trend now is “to [put] an AI on it and say it’s AI. It’s not.”

Ultimately, Swisher acknowledged, technology is becoming cheaper and more capable of certain types of labor than humans, as in the case of mechanized farming, and it’s up to officials like Healey to figure out how to balance embracing new technology with protecting the people it touches.

Mohamad Ali, COO of IBM Consulting, wrote in CommonWealth Beacon that there must be significant investment in an AI-enabled workforce that prioritizes trust and transparency.

Artificial intelligence policy in Massachusetts, as in many states, is a mixed bag that crosses all branches of government. The executive branch is betting big that the technology can boost the state’s innovation economy, while the Legislature weighs the risks of deepfakes in non-consensual pornography and election communications.

Reliance on large language model-style artificial intelligence, which fuses the feel of a search algorithm with the promise of a skilled researcher and writer, has caused headaches for courts. Because several widely used AI tools rely on predictive text algorithms trained on, but not always limited to, existing work, large language model AI can “hallucinate,” fabricating facts and quotes that don’t exist.

In a February order in a wrongful death and sexual abuse case filed against the Stoughton Police Department, Associate Justice Brian Davis chastised attorneys for relying on AI systems to prepare legal research and blindly submitting inaccurate, AI-generated information to the court. AI hallucinations and the unchecked use of AI in legal applications are “disturbing developments that adversely affect the practice of law in the Commonwealth and beyond,” Davis wrote.
