Editorial: Students should embrace ChatGPT’s possibilities


It has become increasingly common for college students to turn to ChatGPT, a free AI chatbot released by the artificial intelligence research company OpenAI, for help with their academic work. Some students use ChatGPT so they don’t have to scan through Google looking for the right website or information. One might assume that ChatGPT searches for information through Google or other search engines, but in fact it “does not scan the internet for information but instead draws from text databases consisting of books, Wikipedia entries, articles, Reddit conversations, and other forms of writing” (Drozdowski, Ed.D.). That sounds promising to many, because they assume that since the material comes from books and articles, the answers it provides are accurate.

Not exactly.

ChatGPT has its limitations. Because there’s no guarantee that its source material is accurate or unbiased, the answers the software presents can contain errors, yet those errors are often explained in a way that sounds logical and reliable. The scariest part is that ChatGPT delivers its answers with full confidence, right or wrong.

Understandably, ChatGPT has raised concerns among college professors and students. ChatGPT gives answers – right or wrong – to almost any homework question or assignment. To keep students from relying on the software, Norwich professors have started asking questions that ChatGPT cannot understand; all they have to do is confuse the AI. Since ChatGPT lacks common sense, it’s easy to manipulate the wording or data of a question until the AI can no longer follow it. In some cases, professors have rephrased their questions to ensure the software cannot produce the correct answer. One easy trick is to run a question through ChatGPT and keep revising it until the AI gets it wrong. But this can also make the question harder for students to understand.

Some argue that this approach is unfair to students and restricts their access to a valuable resource. ChatGPT and other chatbots have become an integral part of the student experience, and their use should not be limited. Rather than simply restricting the tool, professors should educate students on its limitations and potential errors. Sometimes, even when you try hard, you just don’t understand the professor or the material. When that happens, we usually Google the topic, read a few articles, or watch YouTube videos. Sometimes the information we’re seeking is simply hard to find, or it’s buried among thousands of lines of text. How great would it be if you could just ask someone who knows the topic and get a specific answer? It’s like asking a classmate to explain what just happened in class, or asking a CASA peer tutor for help. Students use ChatGPT because it answers their specific questions right away, and you can even ask the AI to explain something as if you were a second grader. How cool is that?

In reality, college students’ increasing reliance on ChatGPT highlights the power and potential of artificial intelligence in education. As important as it is to be mindful of the software’s limitations and potential errors, it is equally important to embrace the opportunities it affords students. As long as you do not use ChatGPT to cheat or break any Norwich rules and regulations, you should be just fine.