campus news

How is AI used in the office? Forum frames the discussion


Tiffany Fuzak (at podium), business intelligence analyst with the Office of Institutional Analysis, told the audience she routinely uses ChatGPT as a tool while coding. Photo: Nancy J. Parisi

By JAY REY

Published July 22, 2024


Some of your colleagues use generative AI to summarize or distill dense research papers. Some use AI as a tool while writing computer code. Others are looking at ways to use it for recruiting and retaining students.

But there are issues — technological, ethical and legal — that employees should be aware of to use AI responsibly.

Those are just a few of the impressions from a campus forum last Thursday to engage university staff on how this quickly advancing technology can be, and is being, used by colleagues across campus in their daily work.

Here are four takeaways from the latest in a series of “AI at UB” forums hosted by UB Information Technology and the Office of Curriculum, Assessment and Teaching Transformation (CATT):

Interest in integrating AI at work is widespread across campus. Nearly 500 people listened in on the forum held in Knox Hall, with more than 100 in attendance and nearly 400 viewing the livestream.

Staff are starting to dabble with AI for varied uses. Admissions conducted a survey of its employees that showed 80% were somewhat familiar with artificial intelligence, while more than a quarter had used it, said Ben Olsen, associate director of data analytics, Office of Admissions.

Olsen was one of four panelists at the forum who spoke about their own experiences using generative AI.

His division is currently researching AI products that could help staff read through transcripts, consider where best to use scholarship funds and provide some predictive modeling on how many students might enroll in a given year.


Ben Olsen, associate director of data analytics, Office of Admissions, described the research his department has conducted. Photo: Nancy J. Parisi

“No one size is going to fit everything that we needed,” Olsen said. “Nothing has been decided yet, but we have identified the products that we probably will move forward with.”

Tiffany Fuzak, business intelligence analyst with the Office of Institutional Analysis, told the audience she routinely uses ChatGPT as a tool while coding, maybe to help write a query or fix an error.

“One of the really cool things about ChatGPT is that it talks to people at their level,” Fuzak said. “I think that’s part of what makes it really powerful. Once you get the hang of it, once you get used to it, I have found it can really help you level up professionally.”

Be aware of pitfalls. Row Onishi, director of digital products and strategy with University Communications, said the unit created a committee last fall to research AI usage among communicators at peer institutions and came up with some best practices and guidelines for its staffers.


Row Onishi, director of digital products and strategy with University Communications, described the committee convened last fall to develop best practices and guidelines for its staffers. Photo: Nancy J. Parisi

The committee cautioned against inputting any personal or proprietary data, and warned that AI-generated responses could be plagiarized, include copyrighted material or contain biases and stereotypes. It’s critical to fact-check when using AI.

“I think it’s important that we realize we have the ability to determine whether or not that content is good to use or not good to use,” Onishi said. “And that is our responsibility.”

UBIT also has formed various committees to assess generative AI opportunities and research practices at other universities, said Kevin Eye, a principal technology architect for Enterprise Application Services within UBIT.


Kevin Eye, a principal technology architect for Enterprise Application Services, UBIT, gave remarks on the work done to assess generative AI opportunities and research practices at other universities. Photo: Nancy J. Parisi

They recommended more AI training, policies and guidelines for facilitating discussions like the one on Thursday. Eye also pointed to a guidance statement on the UBIT website.

Employees are interested in AI training. Based on discussions with employees across campus, there’s interest in AI training and ongoing opportunities to share best practices with counterparts in other offices, said E. Bruce Pitman, interim vice president and chief information officer.

Employees want to know about policies to govern their use of AI given the speed at which the technology is changing and advancing, said Pitman, who served as moderator of the forum.

“Finally, and perhaps it ought to be the first question, what tools, what techniques should we be using on campus?” Pitman said.

“So, those questions are being asked and I don’t have the answers yet, but they’re out there,” he said.

A recording of the 90-minute forum can be found here.