- Wisconsin lawmakers were set to vote Thursday on proposed artificial intelligence regulations.
- The proposals address state agencies’ prospective use of AI as well as its use by political actors. Another would make producing and possessing AI-generated child pornography a felony.
- At least 25 states introduced AI legislation last year alone, as many lawmakers grapple with the fast-emerging technology and its growing prevalence.
Wisconsin lawmakers were set to vote Thursday on proposals to regulate artificial intelligence, joining a growing number of states grappling with how to control the technology as November’s elections loom.
The Assembly was scheduled to vote on a bipartisan measure to require political candidates and groups to include disclaimers in ads that use AI technology. Violators would face a $1,000 fine.
More than half a dozen organizations have registered in support of the proposal, including the League of Women Voters and the state’s newspaper and broadcaster associations. No groups have registered against the measure.
Another Republican-authored proposal up for a floor vote in the Assembly would make it a felony, punishable by up to 25 years in prison, to produce or possess child pornography created with AI technology. Current state law already makes producing and possessing child pornography a felony with a 25-year maximum sentence, but the statutes don’t address digital representations of children. No groups have registered against the bill.
A third bill on the Assembly calendar calls for auditors to review how state agencies use AI. The measure also would give agencies until 2030 to develop a plan to reduce their positions. By 2026, the agencies would have to tell legislators which positions AI could help make more efficient and report on their progress.
The bill doesn’t lay out any specific workforce reduction goals and doesn’t explicitly call for replacing state employees with AI. Republican Rep. Nate Gustafson said Thursday that the goal is to find efficiencies in the face of worker shortages and not replace human beings.
“That’s flat out false,” Gustafson said of claims the bills are designed to replace humans with AI technology.
AI can include a host of different technologies, ranging from algorithms recommending what to watch on Netflix to generative systems such as ChatGPT that can aid in writing or create new images or other media. The surge of commercial investment in generative AI tools has generated public fascination and concerns about their ability to trick people and spread disinformation.
States across the U.S. have taken steps to regulate AI within the last two years. Overall, at least 25 states, Puerto Rico and the District of Columbia introduced artificial intelligence bills last year alone.
Legislatures in Texas, North Dakota, West Virginia and Puerto Rico have created advisory bodies to study and monitor AI systems their state agencies are using. Louisiana formed a new security committee to study AI’s impact on state operations, procurement and policy.
The Federal Communications Commission earlier this month outlawed robocalls using AI-generated voices. The move came in the wake of AI-generated robocalls that mimicked President Joe Biden’s voice to discourage voting in New Hampshire’s first-in-the-nation primary in January.
Sophisticated generative AI tools, from voice-cloning software to image generators, already are in use in elections in the U.S. and around the world. Last year, as the U.S. presidential race got underway, several campaign advertisements used AI-generated audio or imagery, and some candidates experimented with using AI chatbots to communicate with voters.
The Biden administration issued guidelines for using AI technology in 2022, but they mostly set far-reaching goals and aren’t binding. Congress has yet to pass any federal legislation regulating AI in political campaigns.