Published February 9, 2022
We’re on the brink of a new era, one in which machines aren’t simply tools we use but partners we work with. Developing that partnership in a way that is ethically, morally, and socially just is the subject of the first episode of the Advance Rutgers podcast. Peter March, executive dean of the School of Arts and Sciences and Distinguished Professor of Mathematics at Rutgers University–New Brunswick, explains how an exciting interdisciplinary effort called Minds and Machines will push the frontiers of science and responsible innovation in the age of intelligent tools.
Learn more about this and other signature initiatives taking place at Rutgers.
Podcast Transcript
Christine Fennessy:
Welcome to Advance Rutgers, a podcast about the many ways that Rutgers, The State University of New Jersey, is addressing the critical issues of our day. This podcast will explore those groundbreaking initiatives: what they are, why they matter, and who they benefit. Today, we’re talking about Minds and Machines. Minds and Machines is an interdisciplinary project that will push the frontiers of science and responsible innovation in the age of intelligent tools. Thanks for joining us.
Peter March:
I’m Peter March, the executive dean of the School of Arts and Sciences at Rutgers University in New Brunswick, New Jersey.
Christine Fennessy:
Dean March is the champion for Minds and Machines. And by machines, we’re talking intelligent tools—so a huge, huge range of things. Everything from personal assistants that never get sick of us, to self-driving cars, to technology that’s on the horizon, like drones that deliver.
Peter March:
They’re characterized by a combination of artificial intelligence embedded in a physical object.
Christine Fennessy:
These tools have made our lives so much easier. They clean our houses, they invest our money, and they connect us with our friends.
Peter March:
They also give us an opportunity to perform tasks or functions that were either too difficult or too dangerous to do before. So for example, a robot exploring a disaster area where it’s either too hot or too dangerous in some way for a human being.
Christine Fennessy:
And they can be used to save lives.
Peter March:
Quite amazingly, police in New Delhi in the last year or so have used AI-based facial recognition technology to find almost 3,000 lost children in a period of four days.
Christine Fennessy:
But there’s a flip side to these tools. They can get into the wrong hands. For example, Dean March says that same technology that can help find missing kids could be reversed. It could be used to find children and abduct them. And the technology itself can fail.
Peter March:
Even the top facial recognition software misidentifies or fails to identify dark-skinned people at rates five to 10 times that of white people.
Christine Fennessy:
The consequences can be serious when that software is put inside an autonomous vehicle as part of its navigation system.
Peter March:
It puts people of dark complexion at significantly greater risk for accident than people of light complexion.
Christine Fennessy:
So why do intelligent tools like facial recognition get things so wrong? Dean March says that what’s actually programmed into artificial intelligence isn’t the ability to simply recognize an object like a dog, a cat, or a chair. Instead, AI is trained to learn. And how well it learns, and the decisions that it ultimately makes, is a function of its training data, which is put in by humans.
Peter March:
Even people of goodwill have biases of various kinds. And that bias comes into the functioning of the AI through this learning process.
Christine Fennessy:
He says this kind of bias is inevitable, because we all have them.
Peter March:
You have a certain way of dressing, a certain way of thinking, you associate with certain kinds of people. You’re in groups that have a common understanding.
Christine Fennessy:
And so these biases creep into the programming of artificial intelligence.
Peter March:
If you are training an AI to help you in a financial transaction, mortgages or something, well, you’re going to feed it your institutional data. And if that data for historical reasons has various kinds of social, ethical, or racial biases in it, well, the AI’s learning from that data. And so the bias becomes part of the way the AI works.
Christine Fennessy:
But it doesn’t have to be this way. And that’s where Minds and Machines comes in. Dean March likes to think of the initiative as a three-legged platform. One of those legs will be the development of new technology. The second leg will be developing that technology in a way that deals with negative consequences before they happen. And that means thinking very carefully about the data used to train the AI, so it won’t make unfair or inappropriate decisions.
Peter March:
We are trying very deliberately to bring in these ethical, philosophical, moral, legal, social issues, not as add-ons at the end, but as an integral part of the technology development.
Christine Fennessy:
It also means asking, “How could a bad actor use this tool?” And addressing that in the development stage.
Peter March:
And not to discover this when this piece of technology’s out there in the wild.
Christine Fennessy:
The third leg is education. Dean March says that Minds and Machines will prepare students at Rutgers for a world that doesn’t even exist yet.
Peter March:
Having some understanding of this kind of technology, data science, machine learning, artificial intelligence is necessary in the world that is just around the corner. And we want to be a leader in preparing our students for success in that world.
Christine Fennessy:
In part, Minds and Machines will do that by giving every student a foundation in data science, whether they major in it or not. The goal is to teach them how to parse information from misinformation.
Peter March:
We want to make sure that our students are appropriately skeptical about data.
Christine Fennessy:
He says this skill is going to be essential to being a good citizen. The educational component of Minds and Machines will also include an undergraduate data science major and a certificate program in data science. So those are the three legs of the Minds and Machines platform: technological development, making sure that development is equitable, and education.
Peter March:
And standing on that platform, we want to create a Minds and Machines institute, which allows us at Rutgers to make our ideas, our students, our researchers available to partners in business and industry and government. So this institute is the means by which we can interact with the world outside the four walls of the academy.
Christine Fennessy:
The researchers and students at the Minds and Machines institute will reflect a highly interdisciplinary environment. They’ll come from a huge range of disciplines—everything from philosophy to engineering, to law and statistics. And they’ll work together to help businesses, nonprofits, and governmental organizations solve real-world problems around issues of artificial intelligence. And some of those problems? They’ll be ones that computer scientists and mathematicians have never encountered before. But Dean March says that’s what drives new research.
Peter March:
So that provides stimulation back to the developers of new technology, and that’s the power stroke.
Christine Fennessy:
Dean March says we’re on the brink of a new era, where machines aren’t simply tools we use, but partners we work with.
Peter March:
Here’s what excites me about this. See, it’s not Minds in Machines or Minds or Machines. It’s Minds and Machines. It’s a partnership between humans and increasingly intelligent machines. And it’s so exciting to think about what that world may be like.
Christine Fennessy:
It’s a future, he says, that uniquely suits Rutgers’ talent and expertise.
Peter March:
We’re not going to be passive participants in this fundamental change in our society. We’re going to be active participants. We’re going to try and shape it.
Christine Fennessy:
That’s it for today’s show. I’d like to thank Peter March for his time and his insight. Music in this episode is by Epidemic Sound. And you can subscribe to the show wherever you get your podcasts.
Multidisciplinary projects like Minds and Machines embody the innovative drive of Rutgers, New Jersey’s academic, health, and research powerhouse. I’m your host and producer, Christine Fennessy. Join us next time as we explore more initiatives that will better the world.