
Building In Ethics

Two researchers tackling tricky issues of safety and bias.

By Oregon Stater Staff


Headshot of Houssam Abbas

We trust AI to navigate us to new locations, and one day it could drive our cars. But should we let AI make ethical decisions for us? Houssam Abbas, assistant professor of electrical and computer engineering, often poses this thought experiment: A self-driving car faces an unavoidable accident. In the seconds before impact, it can either plow into the car ahead, possibly harming its occupants, or swerve off the road into a ditch. What guidelines does it use to make that choice?

To make ethics accessible to machines, Abbas is working to distill the delicate balance of human decision-making into mathematical equations. He uses deontic logics, a family of mathematical languages that model how we reason about obligations and permissions. Abbas and his students work on several collaborative projects with academic and industry partners to develop formal methods for verifying engineered systems.
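To give a rough flavor of the idea, the sketch below encodes obligations, permissions and prohibitions as rules a machine can check against a situation. This is a hypothetical illustration, not Abbas's actual formalism; all rule and variable names here are invented for the example.

```python
# Deontic statuses an action can have in this toy model.
OBLIGATORY = "obligatory"
FORBIDDEN = "forbidden"
PERMITTED = "permitted"

# Hypothetical rule base: each rule pairs a condition on the situation
# with an action and the deontic status it assigns to that action.
rules = [
    # If a collision is unavoidable, minimizing harm is obligatory.
    (lambda s: s["collision_unavoidable"], "minimize_harm", OBLIGATORY),
    # Leaving the roadway is permitted when the shoulder is clear.
    (lambda s: s["shoulder_clear"], "leave_roadway", PERMITTED),
    # Striking the car ahead is forbidden whenever an alternative exists.
    (lambda s: s["shoulder_clear"], "strike_vehicle_ahead", FORBIDDEN),
]

def status_of(action, situation):
    """Return the set of deontic statuses the rules assign to an action."""
    return {st for cond, act, st in rules if act == action and cond(situation)}

situation = {"collision_unavoidable": True, "shoulder_clear": True}
print(status_of("strike_vehicle_ahead", situation))  # {'forbidden'}
print(status_of("leave_roadway", situation))         # {'permitted'}
```

In a real deontic logic the statuses would be operators with formal semantics rather than labels, which is what allows engineers to mathematically verify that a system never violates its obligations.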

Many artificial intelligence models are trained on information from the internet, which is steeped in stereotypes. For example, an AI image generator asked to produce a picture of a doctor might return an image of a white man by default. The problem can worsen when companies remove seemingly redundant photos — through a process called deduplication — to speed up AI training. Eric Slyman, a Ph.D. candidate in artificial intelligence and now an engineer at Adobe — creator of Adobe Photoshop, Acrobat and other industry-standard apps — worked with researchers there to create a cost-effective tool that builds in awareness of social biases that may be present in training data. Called FairDeDup, it makes it possible to instruct an AI to preserve image variety by not discarding photos of nondominant groups. “We let people define what is fair in their setting instead of the internet or other large-scale datasets deciding that,” Slyman said.
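The sketch below shows the general idea of fairness-aware deduplication, loosely inspired by the description above; it is not the actual FairDeDup algorithm. Images are stubbed as tuples where `cluster` marks near-duplicates and `group` is a user-chosen attribute whose variety should survive pruning — both fields are assumptions for the example.

```python
def fair_dedupe(images):
    """Keep one image per (cluster, group) pair instead of one per cluster,
    so pruning near-duplicates never erases an entire group."""
    seen = set()
    kept = []
    for img_id, cluster, group in images:
        key = (cluster, group)
        if key not in seen:
            seen.add(key)
            kept.append(img_id)
    return kept

images = [
    ("a", "doctor", "man"),
    ("b", "doctor", "man"),    # near-duplicate of "a": pruned
    ("c", "doctor", "woman"),  # same cluster, different group: kept
]
print(fair_dedupe(images))  # ['a', 'c']
```

A naive deduplicator keyed only on `cluster` would also discard image "c"; keying on the pair is what lets users decide which dimensions of variety count as fair to preserve.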

Headshot of Eric Slyman
