
Generation AI

[Image: AI-created image of a robot hand and a human hand reaching to touch]

Entering the era of artificial intelligence

By Molly Englund

What is the future of AI? Is it a technology that will become as integral to our daily lives as smartphones are, or is it inflated hype?

On campus, students are diving into these questions in Professor Jill Zimmerman’s Principles of Artificial Intelligence course. Meanwhile, Goucher alums like Jim Segedy ’07 and Luka Trikha ’23 and professors like Amanda Draheim and Lana Oweidat are among the researchers and engineers who want to figure that out.

In Fall 2023, Amanda Draheim, an assistant professor of psychology at Goucher, taught a seminar in clinical psychology. She wanted to teach her students more about case conceptualization, which, in the clinical world, means “being able to organize your thoughts around a client’s presenting concerns and symptoms, as well as their strengths and their histories,” said Draheim. The goal is to form the narrative for a client’s treatment—the source of their distress, for example, and what direction to take with their therapy. “Skill in case conceptualization is a core component of what gets trained in clinical psychology Ph.D. programs,” she said. “It’s an incredibly challenging skill to learn.”

While teaching the class, Draheim saw a lot of debate about how to navigate students’ desires to use tools like ChatGPT, the generative AI chatbot. One morning, she woke up with an idea: Generative AI, which creates text responses to user inquiries based on large language models, might be used to teach case-conceptualization skills. “For training graduate students, there might be some straightforward ways in which we could embrace this tool,” said Draheim. “But first we have to learn how to use it and how to use it to maximum effect.”

Lana Oweidat, an associate professor of rhetoric and composition and the Writing Center director at Goucher, was also interested in the conversations happening in higher education about AI, which largely focused on what to do about students presenting AI-generated text as their own work. “There was a lot of emphasis on the negatives and not much on the potential,” she said. She decided to involve AI in her research, but with a focus on social justice concerns. “That’s something I feel very strongly about in my teaching and scholarship,” said Oweidat. She wondered if AI could be used as a writing support for marginalized groups. “I was thinking, who are the students that might actually benefit from this tool? How can we use it in an ethical, responsible, critical way?”

Two Goucher alums who majored in computer science, Luka Trikha ’23 and Jim Segedy ’07, are exploring AI outside of academia. Trikha works for the State Department, researching how AI might be incorporated into the daily work of its various offices. He and his colleagues evaluate how different products perform, since each generative AI option has distinct strengths: ChatGPT might be better for some tasks and Google’s Gemini better for others. An international office might need translations for specific dialects, for example, while another office might need images categorized. “That’s currently done by workers manually going through images, making labels, sorting them into specific fields,” he said. “If we had the opportunity to automate that and make everyone’s life easier, that’s progress.” The researchers also work with very large data sets to see what can be automated, in order to help the analysts who use them.

Jim Segedy is a software engineer at Epic, one of the largest vendors of health care software. Medical records have been transformed in the last few decades, from paper filed in a cabinet to electronic documents shared between different doctors’ offices and health care systems. “When you go to the doctor’s office and they’re typing in all of the problems and prescriptions in the computer, that’s probably our software,” said Segedy. He’s worked at Epic for nine years, currently leading the development team that manages the web services platform.

One project Epic is working on integrates ambient listening technology from companies like Microsoft to provide summaries of patient-provider conversations. That way, “doctors can spend more time interacting with patients,” Segedy said. Electronic medical records have allowed health care providers to track much more information about a patient, which helps them make better decisions about care. “But the consequence is that doctors will spend a whole lot more time looking at the computer instead of at the patient,” he said. “A lot of what we focus on is trying to figure out how to use AI technology to get folks focused on the patient again and get them out of the chart.”

At Goucher, professors Draheim and Oweidat were both awarded a Year of Exploration grant from the college’s KRES Fund to conduct research on AI projects. Draheim, whose expertise is in internalizing disorders—like major depressive, anxiety, and post-traumatic stress disorders—is using her grant this summer to gather evidence on the ethical risks and benefits of using generative AI as a training tool, particularly in terms of protecting patient privacy. She plans to start by sending out a survey to the clinicians who train graduate students. Eventually, Draheim hopes to develop scripts and advice that students can use to practice case conceptualization with AI “in a way that prioritizes cultural considerations and recovery,” she said. “Strengths, sources of resilience, and culture and history are really important pieces of the puzzle that get actively taught in case-conceptualization skills.”

Oweidat is using her KRES grant this summer to travel to Jordan, where she grew up, to learn from second-language writers. She will look at how Palestinian and Syrian students who came to the country as refugees use AI tools. “A lot of folks there speak English and write in English as a second language,” said Oweidat, “so I decided to travel there to collect data from second-language writers about their use of AI, or lack thereof, and the potentials and the shortcomings. How does this population use generative AI in brainstorming, researching, and composing texts? How does the technology help them face linguistic challenges? And how do their views of this technology support or challenge intellectual property, ownership, and attribution?” After spending the summer interviewing students, she will use the data to draft an article about her findings.

Nearly everyone interviewed for this article has doubts about AI. Trikha, at the State Department, said that the results of the pilots they’ve started haven’t been overwhelming successes. Oweidat said she wants to figure out whether AI tools can even live up to expectations or whether the ethical concerns are too great. “This kind of technology can generate text and, when prompted, it can create images and pictures, too. Where do you draw the line in terms of intellectual property?” she asked, as these tools are typically trained on huge sets of data that include copyrighted material.

Draheim is pragmatic about students using AI. “I’m starting to develop my own ideas about where to stand with generative AI,” she said. “I’m a proponent of finding ways to use it.” While Goucher’s academic honor code prohibits students from using it without their professors’ permission, she doesn’t think it works to simply ban AI from classrooms—if a student is told not to use it, they’re going to want to use it. As a psychology professor, she thinks motivational interviewing is a better technique to encourage behavioral change—guiding the student to examine their own desire to use the tool, just as a therapy patient is encouraged to interrogate their own motivations for their behavior. Draheim also stresses that there are many ethical considerations that would need to be established before using AI with actual clients.

The technology’s environmental impact is another source of doubt among researchers. An October 2023 article in Scientific American predicted that by 2027, AI servers will consume more than 85 terawatt-hours of electricity every year, more than some small countries use. That comes with a huge carbon footprint. “I don’t believe it’s sustainable,” said Trikha. “It takes a lot of power, a lot of energy, and there will come a time when that conversation is going to spark among environmentalists, and the AI industry will be caught with their tails between their legs.” On the other hand, AI optimists hope that as the technology improves, it will help us find ways to reduce our reliance on fossil fuels.

But even with these doubts, the people who research AI see many possibilities. Epic has a product called Cosmos that aggregates anonymous data from millions of patient encounters to power medical insights. Perhaps a doctor encounters a strange case they haven’t seen before. “We can use our data to say, ‘There aren’t very many people who have presented with this combination of symptoms, but we can find four of them. Here are two doctors still working who treated those patients and might be able to talk you through what they did,’” said Segedy.

For those who might worry about giving AI too much power over our health, Segedy points out that AI is not making decisions. “It is providing context in the moment and citing its sources so that the doctor can make the decision,” he said.

That’s the best-case argument for AI: augmenting human capabilities. “Generative AI is the latest in a series of technological innovations that greatly reduce the effort required to accomplish something. It’s changed the way that I search for information, write emails, or get a quick summary of a long email thread. It can listen to meetings and create notes. And that enables people to spend their time and energy focusing on what they’re uniquely suited to do: creatively apply technology to make their lives better while they’re focusing on the most important decisions,” Segedy said. He pointed out that Microsoft calls their generative AI products “Copilots,” an apt description for the role AI should occupy in our lives.

Trikha also wants AI to do good. He was drawn to government work because he wanted to be part of a mission that wasn’t about making money. “What we try to do is work for the better of the people and the mission.” He wants to use AI “for the people, instead of just to get more advertisements.”

 
