Are the robots going to take over the world?! There is no question that artificial intelligence is finding its way into our everyday lives. Some people love interacting with Alexa as part of their daily activities. Others worry about the loss of autonomy and privacy that accompanies the burgeoning AI world, and some dread that someday humans may become secondary to the artificial intelligence we have created. The AI train is already leaving the station, and before it gets too far down the tracks, what is the federal government doing in terms of potential regulation?
In a time of deep partisan divide in which Republicans and Democrats in Congress disagree on practically everything, a bipartisan group of legislators has reintroduced a bill to accelerate the adoption of artificial intelligence in the federal government.
Getting Their AI Houses in Order
Indeed, Senators Rob Portman (R-Ohio, and a University of Michigan Law classmate of your blogger), Kamala Harris (D-California), Cory Gardner (R-Colorado), and Brian Schatz (D-Hawaii) recently reintroduced the Artificial Intelligence in Government Act. Rather than seeking to regulate artificial intelligence in the business world, this proposed legislation primarily aims to bring more AI technical experts into the federal government and build up federal artificial intelligence capabilities.
According to Senator Portman: “Artificial intelligence will have significant impacts for our country, economy, and society. Ensuring that our government has the capabilities and expertise to navigate those impacts will be important in the coming years and decades. This bipartisan legislation will help ensure our government understands the benefits and pitfalls of this technology as it engages in a responsible, accountable rollout of AI.”
In the House of Representatives, Representatives Mark Meadows (R-North Carolina) and Jerry McNerney (D-California) have introduced companion legislation in a similar bipartisan effort.
Looking Forward so We Don’t Get Left Behind
In terms of some of the specifics, the bills task the General Services Administration with creating a Center of Excellence to offer expertise and to “conduct forward-looking, original research on federal AI policy and promote U.S. competitiveness.” Also, the Office of Personnel Management would be required to create and revise job listings to include artificial intelligence skills and competencies.
Moreover, federal agencies would need to set up governance plans for implementing the use of AI while safeguarding “civil liberties, privacy and civil rights.” Along those lines, Senator Harris stated: “As we embrace the new jobs and new opportunities brought about by the growth of artificial intelligence, we must also be clear about the potential downsides of this powerful technology, including racial and gender bias.”
In the business world, the AI cat has been out of the bag (yes, another metaphor) for quite a while. What have we learned from the Artificial Intelligence in Government Act? Plainly, the federal government wants to get up to speed and incorporate artificial intelligence into the course and scope of its work. The prime purpose of this legislation is not to regulate corporate AI. Thus, if you are worried that the robots are going to take over the world, the Artificial Intelligence in Government Act will not be your salvation. But, if you are an AI fan, you should be heartened that the federal government wants to become AI-savvy.
Eric Sinrod (@EricSinrod on Twitter) is a partner in the San Francisco office of Duane Morris LLP, where he focuses on litigation matters of various types, including information technology and intellectual property disputes. You can read his professional biography here. To receive a weekly email link to Mr. Sinrod’s columns, please email him at ejsinrod@duanemorris.com with Subscribe in the Subject line. This column is prepared and published for informational purposes only and should not be construed as legal advice. The views expressed in this column are those of the author and do not necessarily reflect the views of the author’s law firm or its individual partners.