Panelists chat about integrity's role in artificial intelligence

Microsoft’s Thomas Soemo worked with AI technology even before it was called artificial intelligence.

Photos by Ralph Freso

When ChatGPT hit the public sphere in November, everyone was fascinated by what it could do. The advanced artificial intelligence interface can "write" poems and college-level essays and solve mathematical equations.

But the technology also is turning the academic world upside down as educators worry about cheating and plagiarism.

AI's growing appeal and utility are undeniable. While its game-changing promise to improve efficiency, accelerate research and distill extensive data into concise formats has proven valuable, academia is increasingly concerned about where AI sits within the scope of academic integrity.

It is why Grand Canyon University’s chapter of the National Society of Collegiate Scholars addressed AI technology at the recent campuswide Integrity Week, which stressed the importance of integrity in academic and professional settings. Speakers addressed the concerns at the event's "AI's Role in Academia: Perceptions and Prospects" panel.

AI, or as Associate Professor of Computer Science Dr. Isac Artzi calls it, “simulated intelligence,” mines a vast pool of resources and sorts out which ones are most relevant to you, making the experience accessible and accelerated.

Software programming major Rhese Soemo (right) offered a student perspective on artificial intelligence.

In layman’s terms, Rhese Soemo, a senior software programming major in the Honors College, describes AI as a "fancy machine with glue and scissors." He sat on the panel alongside his father, Thomas Soemo, who has 20 years of experience working at Microsoft.

“It cuts up statements, glues them together and then gives them to you. It’s really just a new version of the search engine,” said Rhese Soemo. “It grabs all the content and formulates it in a way that is easier to view.”

Artzi added, “Education has evolved significantly but slowly over the last 200 years. Technology has mutated frequently, starting with the invention of the wheel, but there hasn't been a mutation in education; we need to mutate.”

Computer science instructor Dr. Isac Artzi says the education system must evolve with the changes in technology.

During the COVID-19 pandemic, Zoom calls, remote jobs and new methods for delivering instruction emerged. All those changes were made to survive the deadly virus. It forced a change in the health care system, education system and workplace — a mutation.

“A lot of businesses and organizations suffered during COVID, but we (GCU) grew,” said Artzi. “... We accepted the fact that we had to mutate. It was the survival of the fittest.”

It is just another way the University adapts to change. The advancement of technology is inevitable, and rather than fearing it, we must embrace it.

“People see AI as if it’s a terminator or has the sentience to destroy the world. Is it impressive technology? Yes, but it is only doing things it has learned,” Thomas Soemo reassured the students. “It is trying to simulate brain function and understand basic senses that we as humans take for granted, such as sight, sound, complex language, communication, hearing and interpretation. That is the kind of stuff it focuses on, and when you think of it like that, it’s no different from us.”

Sophomore William McKinley directs a question to the guest speakers during a panel discussion on AI’s role in academia.

The panel of computer science and technology professionals told the audience that results from AI cannot be assumed to be 100% accurate. Just as with search engines such as Google, Bing and Yahoo, we can’t always trust what we find on the internet. In the world of AI, the phenomenon is called hallucination.

“Instead of seeing webpages, AI creates a magical knowledge source of data. Some words are true but some are false, so it is going to do its best to assume it knows exactly what you are asking for,” Thomas Soemo said. “This is what we mean by hallucinating. Sometimes the resources come back completely wrong, and if someone knows what they are doing, they can differentiate that, but most people just trust AI to be accurate. This isn’t the case.”

Just two weeks ago, Google introduced Bard, its new rival to ChatGPT. The chatbot was shown giving an incorrect response when asked, “What new discoveries from the James Webb Space Telescope can I tell my 9-year-old about?"

College of Science, Engineering and Technology instructor Michael Sarlo (right) speaks on how he integrates AI technology within his daily lesson plans.

In response, Bard replied that the telescope took the very first pictures of a planet outside our solar system. But that information was wrong, as NASA confirmed that the European Southern Observatory’s Very Large Telescope holds that honor, taking the first picture of an exoplanet in 2004.

Bard is only one example of why we must be wary of technology in academic settings. Knowing what is accurate and using AI as a tool is the standard we should uphold. We should not use AI as a shortcut to an easy A.

College of Science, Engineering and Technology instructor Michael Sarlo understands that with the emergence of AI technology, students have more access to resources that might increase the prevalence of cheating. Rather than allowing the mutation to take over, Sarlo asks the "whys" and incorporates AI into his teaching style.

“For the students that do cheat, we need to ask, ‘Why do they cheat?’" asked Sarlo. “What can we do? How can we enhance the classroom with AI? Can we use ChatGPT to generate meaningful assignments? How can I make you invest in yourself so that you don’t feel the need to cheat?”

Robert Loy, head of Technology Programs for CSET, moderated the Q&A session.

As the use of AI technology continues to increase, so do the gray areas of what is considered cheating.

“Where does ethics come in? There is a line called cheating. If you put a question in ChatGPT only to copy and paste it into your document and you’re done, that is a problem,” said Thomas Soemo. “There is a lack of integrity out there, and schools need to teach students how to use these tools to enrich their knowledge and ensure they don’t cross that line.”

Healthcare administration senior Eva Schroeder has a very specific interest in artificial intelligence. She is concerned people will use ChatGPT and similar AI technology to pass the Medical College Admission Test (MCAT) and nursing licensure exams and that nurses will be sent into clinical environments not actually knowing how to help people.

Senior Eva Schroeder's perspective on artificial intelligence shifted after the panel discussion.

“If people just write it off as a magic box, then you’re going to create that black market of people using it the wrong way,” she said. “It is acceptable if people use it as a tool to encourage their knowledge, but it is about finding that line. It’s something that needs to be discussed sooner rather than later, and not just whether it is good or bad.”

Honors College Dean Dr. Breanna Naegeli said of AI technology, “We can’t fight it. The goal is to embrace it but encourage our students to make ethical decisions with integrity in mind.”

Contact staff writer Lydia P. Robles at 602-639-7665 or [email protected].

****

GCU News: Integrity Week is all-inclusive

