From simple algorithms to advanced machine learning, artificial intelligence is a complex and rapidly evolving technology. It can enhance efficiency in everyday tasks, offer personalized services, and assist in healthcare advancements. At the same time, AI is outpacing the regulatory frameworks necessary to safeguard public interests.
To shed light on the nuanced challenges of governing AI, TCNJ welcomed scholar, writer, and policy adviser Alondra Nelson to campus on October 15 as part of the Kathryn A. Foster Distinguished Visitor Series. She participated in a conversation moderated by Judi Cook, executive director of TCNJ’s Center for Excellence in Teaching and Learning.
Nelson, who was named to the inaugural TIME100 list of the most influential people in artificial intelligence, served as acting director of the White House Office of Science and Technology Policy, where she oversaw the release of President Biden’s Blueprint for an AI Bill of Rights in October 2022.
Nelson outlined the need for an iterative, adaptive, and multi-stakeholder approach to governing rapidly evolving AI technologies.
In case you missed it, here’s what Nelson wants us to know about AI:
The same laws apply.
“The first thing we need to realize is that the laws and regulations that we have already also apply to AI,” Nelson says. “All of our laws — laws against fraud, laws against forms of sexual abuse, and other kinds of criminal activity — still apply.”
The challenge, she argues, is for policymakers to interpret existing laws when a sophisticated, powerful algorithmic system creates the potential for fraud or criminal activity.
“There’s more policy innovation, not only technology innovation, that needs to take place,” she says.
AI is dynamic.
Whereas many of our laws for other technologies like appliances and cars may have held up and remained relevant for decades, Nelson argues that regulations around AI might only have a two-year shelf life. “It’s going to require a different kind of agility in how you do policymaking.”
At its core, AI is hardware, software, computational power, and data. But systems that were once static are now dynamic, learning and changing quickly.
It uses a lot of power.
Nelson encourages people to be judicious in their use of AI and mindful of the resources it takes to generate the quick answers they might be looking for. “Understand that if you’re making a decision to use one of the chatbots, you’re also making a decision to use 10 times, at least, more energy to get that output,” she says. “We’re bringing online nuclear reactors to generate energy to run these systems.”
In other words, think twice before asking ChatGPT or Gemini something that can be answered with a simple Google search.
Don’t fear it.
Nelson encourages students and educators alike to experiment with AI as a best practice for learning how it works. “Design curricula to allow use of these tools rather than just warning about plagiarism,” she says. “Use the tools.”
— Emily W. Dodd ’03