
Colleges Grapple With Teaching the Technology and Ethics of A.I.


PITTSBURGH — About 18 months ago, Shawn Blanton, a professor of electrical and computer engineering at Carnegie Mellon University, met with some of his graduate students to redesign his course on artificial intelligence.

“We need to transform this course to make it more relevant outside these walls,” he said.

It had only been three years since Professor Blanton started the class, but as artificial intelligence moves from the stuff of dystopian fantasies — robots run amok — to the reality of everyday use, universities around the country are grappling with the best ways to teach it.

This year, Carnegie Mellon said it became the first university in the country to offer a separate undergraduate A.I. degree, through its School of Computer Science. The Massachusetts Institute of Technology last month announced plans to establish a college for A.I., backed by $1 billion in investments.

And the expansion is not just happening in the country’s top science and technology schools. The University of Rhode Island this fall opened an A.I. lab operated by its college library.

But this growth also brings new challenges: figuring out how to teach the subject in ways that students outside computer science can understand, and addressing the ethical dilemmas the technology raises, such as privacy and job displacement.

“We have to start teaching those who will be practitioners and users in the broad discipline of A.I., not just computer scientists,” said Emily Fox, an associate professor of computer science, engineering and statistics at the University of Washington.

Professor Fox developed an A.I. course for nonmajors, first offered last spring. To qualify, students needed only to have completed courses in basic probability and basic programming, far fewer prerequisites than A.I. courses typically require.

Interest was so high that she had to cap enrollment at 110 students.

Demi Tu, a senior studying information technology at the University of Washington, is an example of the value of reaching out to students who are not classic technology whizzes. She said she was so taken with what she learned in Professor Fox’s class that she may choose to pursue it in graduate school.

“Before taking the class, I did not know what A.I. was specifically,” she said. “I just wanted the initial exposure. But the class really opened up a different path for me.”

Educators are also struggling to balance teaching the deep fundamentals of artificial intelligence, which some consider essential, against an industry desire for less expensive, less complicated training of workers who can complete the tasks at hand without that deep understanding.

Or as Levent Burak Kara, a professor of mechanical engineering at Carnegie Mellon, noted, there is a tension in teaching A.I. between ensuring “students understand what’s under the hood and what industry wants.”

Freshman year may actually be too late to start teaching A.I. Fei-Fei Li, director of Stanford University’s Artificial Intelligence Lab and its Stanford Vision Lab, began a three-week summer program in 2015 for high school students, focused on offering young women early exposure to A.I.

She then co-founded AI4ALL, a nonprofit; this year, six campuses ran programs introducing the technology to high school students, particularly women, people of color and those from rural regions.

“We want young people to think about their future participation in developing or guiding this technology,” Professor Li said. “Today’s developers are not diverse enough or inclusive enough. We want to educate tomorrow’s A.I. technologists, thinkers and leaders and instill in them a human-centered frame of mind.”

The University of Rhode Island has tried to make A.I. more accessible to a broad range of students by opening the lab in its library.

“We’re democratizing A.I.,” said Karim Boughida, the dean of university libraries.

The 600-square-foot lab provides workstations where students can take tutorials in areas such as robotics, natural language processing and smart homes, and design their own projects. There is also a space where students, faculty and community members can learn about, discuss and debate the ethics and future of A.I.

Educators across the country are wrestling with the ethical issues raised by A.I., among them privacy, security and job displacement, and with how to teach them. Many professors and students say more needs to be done within A.I. classes themselves, not just in separate ethics courses, to ensure that students become workers who are thoughtful about the role of the technology.

“For instance, we think of self-driving cars as 20 years down the road, but the way things are progressing, it will be a lot sooner,” said Dillon Pulliam, who is studying for his master’s degree in electrical and computer engineering at Carnegie Mellon. “We need policies — if the car hits a pedestrian, who is responsible?”

At the University of Washington, a new class called “Intelligent Machinery, Identity and Ethics” is being taught this fall by a team leader at Google and the co-director of the university’s Computational Neuroscience program.

Daniel Grossman, a professor and deputy director of undergraduate studies at the university’s Paul G. Allen School of Computer Science and Engineering, explained the purpose this way:

The course “aims to get at the big ethical questions we’ll be facing, not just in the next year or two but in the next decade or two.”

David Danks, a professor of philosophy and psychology at Carnegie Mellon, just started teaching a class, “A.I., Society and Humanity.” The class is an outgrowth of faculty coming together over the past three years to create shared research projects, he said, because students need to learn from both those trained in the technology and those trained in asking ethical questions.

“The key is to make sure they have the opportunities to really explore the ways technology can have an impact — to think how this will affect people in poorer communities or how it can be abused,” he said.


