FEATURE STORY

AI @ FSU

At Frostburg State University, a quiet but transformative shift is underway as the institution begins to explore the integration of AI (artificial intelligence) into both its academic and administrative spheres.

What was once the realm of science fiction is now becoming a tangible part of campus life, from pilot programs in classrooms that use AI to personalize learning, to early efforts in applying intelligent systems to streamline operations, improve student services and enhance data-informed decision-making. These first steps are part of a broader vision to prepare students not just to use technology, but to thrive in a world being rapidly reshaped by it.

This transition aligns closely with a major new initiative by the University System of Maryland, which recently announced a partnership with Google to bring career certificates and AI-focused training to all of its institutions. The collaboration is designed to expand access to high-demand digital skills and create new pathways into technology-driven careers -- especially for students at regional universities like FSU.

At the same time, Frostburg’s administrative departments are beginning to explore how AI can be used to support the university’s internal operations. These early-stage efforts reflect a growing awareness that artificial intelligence is not just a tool for STEM fields -- it’s a cross-cutting innovation with the potential to impact every corner of the university.

While Frostburg is still in the beginning stages of this transformation, the momentum is building. Faculty, staff and students are starting to engage in conversations about both the promise and the ethical considerations of AI. As pilot projects take root and new training opportunities become available, the university is positioning itself not only as a place where AI is studied, but as a campus that actively applies it to enhance learning, increase efficiency and build a more resilient future*.

*By the way, the previous introduction was written using AI.


AI Has Arrived

BY TY DEMARTINO '90

In the Catherine R. Gira Center for Communication and Information Technologies, the great AI debate swirls in the air. According to Dr. Michael Flinn ’94/M'02, chair of FSU’s Computer Science and Information Technologies Department, AI is coming at us -- full steam ahead. 

“AI is a freight train moving at supersonic speed,” Flinn said. “And if you don’t jump on, you’re going to miss it.”  

To prepare for that oncoming tech train, Flinn and his computer science colleagues are quickly incorporating the ever-advancing technology into their coursework. Even though most of his students are comfortable with AI, Flinn believes it is the faculty’s responsibility to teach students how to use AI responsibly, effectively and efficiently. 

“As a society, we cannot ignore AI. We know students are already using it,” Flinn said, referring to popular AI apps like ChatGPT and Microsoft Copilot. “Our job is to help students use all technology responsibly and prepare them for a future career where they will work side-by-side with AI, robots and other yet-to-be-realized innovations.” 

Dr. Michael Flinn ’94/M'02

Flinn is known for preaching the benefits of AI beyond his classroom, spreading the tech word to his FSU colleagues in other colleges and departments, and across the nation. He has led several campus presentations on how AI can be used in their day-to-day jobs -- responsibly and effectively. The discussion ultimately leads to the ethics surrounding its use. When one of his colleagues from the humanities asked Flinn if he felt he was “cheating” when he first used AI, Flinn’s answer was no. 

To justify his quick response, Flinn displayed an abacus, a graphing calculator and a Wikipedia prompt -- three tools that were once frowned upon and viewed as “cheating” and are now widely embraced, even required, by some professors (abacus notwithstanding). The common denominator of these tools is that humans are in control of how they are used. 

“At the end of the day, you are in the driver’s seat with AI,” Flinn pointed out. 

The department chair is not naïve; he knows that some students will use AI to do the work for them instead of with them. But he also knows that students have been finding ways to avoid their class assignments since class assignments were invented. 

“A cheater’s going to cheat,” he emphatically stated. “There are so many misconceptions, and they are amplified because [AI] is such a game changer.” 

Flinn has empathy for other departments that wrestle with the ethics, especially those divisions that rely on “heavy writing,” such as the Social Sciences and Humanities. 

“Computer Science is uniquely positioned to embrace change because we have to,” said Flinn. “But I get where these other departments are coming from, and I’m confident they will adapt to finding new and unique ways to help students leverage the power of AI, while continuing to be creative, critical thinkers.”  


Keeping the Human in Humanities

As the spring semester drew to a close, Assistant Professor of English Naomi Gades sat in her third-floor office in Dunkle Hall discussing the way she teaches English 101 with AI.

“I familiarized myself with AI for the sake of my students,” said Gades, who teaches first-year writing and composition courses.

She has already infused the use of AI in her courses with the mindset: “If you can’t beat them, join them.”

Or can you beat AI? Gades challenged her students to find out.

She created a “Beat the Bot” curriculum in which students analyze an AI-generated narrative whose plot does not make sense. The students then work in small groups to draft a better version of the story.

Like Flinn, Gades is a firm believer that universities must learn how to use AI and not ignore that it is part of their students’ lives and learning. But she also wants educators to maintain the “human” in the Humanities.

“We need people who are aware of AI, know how to work it but are also excellent human beings,” Gades said, noting a computer can’t feel compassion or empathy. “We need people with these skills.”

Naomi Gades

To prepare for the future, Gades is taking a page from the past -- literally. “I’m going to teach like it’s 1999,” said Gades, a self-proclaimed Prince fan.

She plans to re-introduce blue books into her English courses. Yes, those same thin blue examination books -- with their tight margins, crisp lined white pages and signature light blue cover. Gades will perform an in-class diagnostic by having students write about a given prompt. While later in the course she will introduce the use of AI and will even grade their chats, she first wants to evaluate students’ basic writing skills.

Gades likens her approach to those in medical fields. While there are computers and machines that can monitor blood pressure and conduct screenings, there comes a time when health career students might have to rely on their own skills. “I’d want my nurse to know basic things about nursing.”

It’s a matter of teaching her English students how to work with the technology, including different prompt phrasings and strategies, instead of allowing the technology to do the work for them.

“So, they come away knowing what’s behind [the technology]. It’s just not a magic box,” Gades said. “My responsibility is to make sure my students have certain skills or ideas and the ability to communicate those ideas.”

Gades, who has attended several conferences on embracing AI in the classroom, doesn’t believe she’s an anomaly among her Humanities colleagues. She’s quick to point out that her fellow English faculty are perpetually retooling their syllabi to keep up with trends in academia. Gades has already taught a 400-level course entitled Digital Rhetoric for English majors interested in tech fluency and how they might interact with technology in the workplace. In this course, students use an AI app to draft a short essay about a niche subject of which they know a great deal. Then the students must analyze the output to point out where the bot “got it wrong.” It’s a fun competition that engages students and shows that the computer doesn’t always know best.

“I want to show my students there is value in their special knowledge,” Gades stressed about her English students who have expert knowledge of language and audiences. “They have the superpower.”


The Super Powers That Be

Back over in the basement of the Gira CCIT, FSU’s Chief Information Officer Tim Pelesky and his team are using their superpowers to develop and introduce AI resources to the FSU campus.

Pelesky points out that artificial intelligence, in one form or another, has been around since the 1950s, but its most recent advancements have positioned AI among such monumental tech milestones as the desktop computer, the internet and mobile devices.

“AI is another one of these big tech moments,” he said.

On campus, Pelesky said he first saw AI being used in the classroom as students started to employ it and professors expressed concern. That usage, however, has sparked conversation and action.

AI discussion is now at the forefront of Pelesky’s regular department meetings. “We have a lot of brilliant minds in IT that embrace technology and the changes it will bring.”

Tim Pelesky

Those minds built an AI agent for the FSU website that will be launched this summer to help field inquiries from prospective students. “Bobby the Botcat” is an AI chatbot trained to answer any and all questions about FSU. Bobby can also lead inquirers to a “live” person, if they prefer. The main goal of the new Botcat is to reduce workloads in Admissions so staff can focus on other assignments. Pelesky and his team are brainstorming more ways to use AI in other campus departments. Faculty and staff can already interact with Bobby the Botcat by logging in to the Faculty/Staff Portal.

“We want [the campus] to see the value in using this technology in their daily work,” said Pelesky, who has shown departments how they can use AI for mundane tasks, such as creating presentations. “Every department can benefit from these things.”

Pelesky knows it’s a delicate balance for an institution that is known for its personal, one-on-one interaction with students. But he is also quick to note that for many young people, who have been raised interfacing with technology, personal interaction isn’t a preference or a priority.

Pelesky explained there will be three groups of people when it comes to using AI on campus: Those who will refuse. Those who will be open to it. And those who will adapt.

But there’s no denying it. AI is here to stay.

“We can’t help but join in or we will be left behind,” he said. “The genie is already out of the bottle.”


Alumna Weighs in on AI in the Workplace...and the World

Timi Hadra ’99, a partner and delivery center executive for IBM Consulting, knows a thing or two about AI. She had the opportunity to represent IBM at the U.S. House of Representatives Subcommittee on Cybersecurity, IT and Government Innovation. “AI is here, and it is redefining work, who does it, how they do it, and will require more people to work with technology. We – the private sector, government, and educators – must collectively act now to ensure Americans are prepared to work alongside digital tools, take on higher-level and more meaningful work, and thrive in lifelong careers,” she said. 

Timi Hadra ’99

Hadra shared her professional insight on AI with Profile, including how those who are fearful or reluctant can prepare.
  • When you were a student at FSU, what were some technology advances you used that were precursors to the AI and technology explosion we are experiencing now?

    I graduated in 1999. We were already using the internet, although it was seen as a novelty, with its potential impact on life hardly imagined. Most students were obtaining their very first cell phone, and the concept of smartphones and their pervasive influence on our daily lives was yet to come. At that time, computer labs on campus were pivotal for completing assignments due to limited personal computer ownership. I don’t think we grasped that we were tinkering with technology that would fundamentally shift all ways of working, across all industries. In terms of how AI is different, I do see the similarities where people are tinkering with AI. For example, using ChatGPT to create funny pictures and memes, but even more, I see people jumping in to adopt AI into their daily work and personal processes. I do think people are realizing the AI revolution is going to be bigger than the internet and while there seems to be some concern around risk, or reluctance to adopt AI technologies, more often than not, I’m seeing people of all ages leveraging AI.

  • How is AI being infused into your corporation's business model?

    At IBM, we've embraced AI as a core component of our business strategy, integrating it across all operations to enhance productivity, decision-making and customer experiences. Our AI-powered tools and platforms have generated significant productivity improvements, totaling $3.5 billion since the start of 2023. We provide our employees with comprehensive AI training and easy access to generative AI tools on our intranet, encouraging continuous learning and innovation. Regular hack-a-thons also foster a culture of AI experimentation and application development.

  • What do you think is the biggest challenge universities/colleges are facing when it comes to incorporating AI into their daily practices?

    Universities and all entities are concerned about security, and rightfully so. It’s critical that users understand the security of the AI technology they are using and understand the guardrails. Putting guardrails in place that guide what data faculty, staff and students can input into an AI tool is paramount to ensure data isn’t compromised. With regard to infusing AI into the learning process, the primary challenge lies in balancing the use of AI as a tool for efficiency with the need to ensure students are well-versed in foundational skills that aren't easily replaced by AI. It's crucial to determine which tasks students should learn to perform manually and which can be safely delegated to AI. How do we think about AI as the “calculator,” if you will. And what I mean by that is, when the calculator became mainstream, there was reluctance to allow students to use them in the classroom. We wanted students to know how to do math – add, subtract, multiply, etc. So, we taught the fundamentals, and now, even though I can do long division with paper and pencil, it’s much faster for me to use my calculator app on my phone and get the answer in seconds. Similarly, we need to find the balance for what we need students to learn how to do without AI and when we should encourage students to use AI to be more efficient.  It’s not a simple question, but it’s an important challenge to resolve.

  • Why do you think there's fear surrounding AI?

    The fear surrounding AI stems from a combination of factors, including its vast and potentially disruptive impact on various aspects of life, the lack of understanding about how AI works, concerns around data security and worries about job displacement due to automation. This fear is reminiscent of past technological shifts, such as the Industrial Revolution, which also sparked anxieties about machines replacing human labor. Each technological revolution, including the AI revolution, necessitates a societal adjustment period, but history shows that these changes ultimately yield new opportunities and improved ways of living and working. With the AI revolution, it’s moving so fast, it can feel overwhelming to keep up, and many of us probably already feel behind. Fortunately, in this era, we have so much more access to information and tools to help us stay informed, learn and grow with AI.

  • How comfortable are recent graduates and your younger hires with AI?

    I lead a technology team at IBM, so as you can imagine, our new hires can’t get their hands on AI tools and training fast enough. Recent graduates entering our technology team at IBM are incredibly adept with AI and related technologies, having grown up in a world where digital tools are integral to learning and communication. I don’t think they can even imagine what it was like to have your elementary school teacher wheel a cart with a large television strapped to it for movie day or “watching” a filmstrip with a separate cassette tape for audio. All that to say, they are well equipped to adapt, embrace and leverage almost any technology that comes at them. I think the biggest challenge facing this generation of early professionals is not lack of comfort with technology like AI – they excel in tech. The challenge is being too comfortable with living in a virtual world and allowing technology and virtual engagement to replace frequent, in-person, human interactions. I’m pleased to see our most recent college grad hires are eager to be in person more often. The key is to strike a balance between virtual engagement and in-person interaction as they navigate the benefits and drawbacks of an increasingly digital professional landscape.

  • What advice would you give people who don't understand or have fears about AI?

    My advice would be to approach AI as an extension of human capabilities rather than a replacement. Embrace AI as a tool that can free up time for more creative and strategic tasks, not just routine ones. Encourage continuous learning and experimentation with AI to uncover its potential applications in personal and professional life. Finally, engage in discussions about AI's implications and participate in shaping its ethical use, ensuring that its adoption aligns with societal values and promotes overall well-being.

Timi Hadra is also a member of the FSU Foundation Board of Directors.