16 August 2024

Opinion article by Dr Peet van Aardt, Centre for Teaching and Learning and Head of the UFS Writing Centre, University of the Free State. 


The use of artificial intelligence tools such as ChatGPT at the University of the Free State (UFS) should be discouraged rather than permitted, writes Dr Peet van Aardt.

A decade ago, academics were encouraged to find ways to incorporate social media platforms like Facebook and Twitter into their teaching. Since students were spending so much time on these platforms, the idea was that we needed to take the classroom to them. That lasted until it became clear that young people do not use social media to study, but rather to create and share entertainment content.

During the late 2000s, News24.com, the biggest news website in Africa, went on a mission to nurture and expand what was known as “community journalism”. Because everybody was starting to own a smartphone, the news outlet’s leadership saw an opportunity to provide a platform for people to share photos, videos and stories of news events taking place around them. That lasted until they realised that the vast majority of people did not want to contribute to journalism; they merely wanted to consume it.

We should not assume that students will use AI in a responsible and productive manner: at the UFS Writing Centre we find that students over-rely on ChatGPT in their assignments and essays. We should do everything in our power to discourage its use, because it threatens what we do at a university on three levels.

It’s an educational issue

There are five main academic literacies we want to teach our students: reading, writing, speaking, listening and critical thinking. When students prompt ChatGPT to write their essay for them, immediately the reading and writing literacies are discarded because the student does not write the essay, nor do they read any source material that would help them form an argument. Critical thinking goes out the window, because it is merely a copy-and-paste job they are performing. And speaking? We see in the Writing Centre that students who use ChatGPT cannot discuss their “work”. The student voice is being killed.

There are lecturers who take the approach of motivating students to use prompted content from ChatGPT in order to critique and discuss the AI output. This is fine, provided the students are operating at a level where their academic literacies have already been established. In short: for postgraduate use it might be acceptable. Undergraduate students need to go through the process of becoming scholars and mastering their subject matter before they can be expected to critique it. This is basic pedagogy, and it is our duty as staff at the UFS, because it aligns with the Graduate Attribute of Critical Thinking.

It’s a moral issue

In addition to the academic literacies we attempt to instil in our students are the attributes of ethical reasoning and written communication. The fact that AI tools “scrape” the internet for content without any consent from the content creators means that they are committing plagiarism. It is theft – “the greatest heist in history”, as some refer to it. Do we want our students to develop digital skills and competencies on immoral grounds? Because often this is the reason given when students are encouraged or allowed to use AI: “The technology is there, and therefore we must learn to go with the flow and let the students use it.” By this reasoning one could argue that the UFS rugby team (go, Shimlas!) must use performance-enhancing substances because they will make the players faster and stronger, and “the technology is there”.

Academics also face a moral dilemma, as a view seems to be brewing that fire should be fought with fire: that AI can assist and even lead in tasks such as plagiarism detection, assessment and content development. But fighting fire with fire just burns down the house. Let us not look to AI to lessen our workload.

It’s an economic issue

Technology in education should be used to level the playing field. Companies develop online tools with the primary goal of making money, despite what the bandwagon passengers in the East and West promise us. Their operations cost a lot of money, so they release free versions to get people hooked, and then they develop better products and place them behind a paywall. What this means is that students who can afford the subscription costs get access to the premium product, while poor students get left behind. How can we assess two students fairly when they cannot make use of the same version of a tool? This will widen the performance gap between students from different economic backgrounds. And considering the deletion of the authentic student voice (as alluded to earlier), these AI tools merely represent a new platform for colonisation and therefore have no place in our institution.

OK, so what?

Lecturers who want advice on how to detect overreliance on AI tools can have a look at this video, which forms part of the AI Wayfinder Series – a brilliant project by the UFS’s Interdisciplinary Centre for Digital Futures and the Digital Scholarship Centre. These centres also have other helpful resources to check out.

But as an institution we need to produce a policy on how to deal with the threat and possibilities of AI. (Because in society and in certain disciplines it can make a contribution – just not for undergraduate studies in a university context.) Currently, a team that includes staff from the Faculty of Law and the Faculty of Economic and Management Sciences is drafting guidelines which departments can implement. Then, after a while, these guidelines-in-practice can be reviewed to set us on the path towards establishing a concrete policy.

If we as educators consider the facts that the use of AI tools impedes the development of academic literacies (at undergraduate level), silences local, authentic voices, and can create further economic division within our student community, then we should not promote its use at our institution. Technology is not innovative if it does not improve something.

Dr Peet van Aardt is the Head of the UFS Writing Centre and the Coordinator of the Initiative for Creative African Narratives (iCAN). Before joining the UFS in 2014 he was the Community Editor of News24.com. 

Eye tracker device a first in Africa
31 July 2013

Keeping an eye on empowerment

"If we can see what you see, we can think what you think."

Eye-tracking used to be one of those fabulous science-fiction inventions, along with Superman-like bionic ability. Could you really use the movement of your eyes to read people's minds? Or drive your car? Or transfix your enemy with a laser-beam?

Well, actually, yes, you can (apart, perhaps, from the laser beam… ). An eye tracker is not something from science fiction; it actually exists, and is widely used around the world for a number of purposes.

Simply put, an eye tracker is a device for measuring eye positions and eye movement. Its most obvious use is in marketing, to find out what people are looking at (when they see an advertisement, for instance, or when they are wandering along a supermarket aisle). The eye tracker measures where people look first, what attracts their attention, and what they look at the longest. It is used extensively in developed countries to predict consumer behaviour, based on what – literally – catches the eye.

On a more serious level, psychologists, therapists and educators can also use this device for a number of applications, such as analysis and education. And – most excitingly – eye tracking can enable disabled people to operate a computer and thereby control a number of devices and machines. Impaired or disabled people can use eye tracking to get a whole new lease on life.

In South Africa and other developing countries, however, eye tracking is not widely used. Even though off-the-shelf webcams and open-source software can be obtained extremely cheaply, they are complex to use and the quality cannot be guaranteed. Specialist high-quality eye-tracking devices have to be imported, and they are extremely expensive – or rather, they used to be. Not anymore.

The Department of Computer Science and Informatics (CSI) at the University of the Free State has succeeded in developing a high-quality eye tracker at a fraction of the cost of the imported devices. Along with the hardware, the department has also developed specialised software for a number of applications. These would be useful for graphic designers, marketers, analysts, cognitive psychologists, language specialists, ophthalmologists, radiographers, occupational and speech therapists, and people with disabilities. In the not-too-distant future, even fleet owners and drivers would be able to use this technology.

"The research team at CSI has many years of eye-tracking experience," says team leader Prof Pieter Blignaut, "both with the technical aspect as well as the practical aspect. We also provide a multi-dimensional service to clients that includes the equipment, training and support. We even provide feedback to users.

"We have a basic desktop model available that can be used for research, and can be adapted so that people can interact with a computer. It will be possible in future to design a device that would be able to operate a wheelchair. We are working on a model incorporated into a pair of glasses which will provide gaze analysis for people in their natural surroundings, for instance when driving a vehicle.

"Up till now, the imported models have been too expensive," he continues. "But with our system, the technology is now within reach for anyone who needs it. This could lead to economic expansion and job creation."

The University of the Free State is the first manufacturer of eye-tracking devices in Africa, and Blignaut hopes that the project will contribute to nation-building and empowerment.

"The biggest advantage is that we now have a local manufacturer providing a quality product with local training and support."

In an eye-tracking device, a tiny infra-red light shines on the eye and causes a reflection, which is picked up by a high-resolution camera. Every eye movement causes a change in the reflection, which is then mapped. Infra-red light is not harmful to the eye and is not even noticed, so eye movement remains completely natural.
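For readers curious about how such mapping can work in practice, the short sketch below illustrates the general pupil-centre/corneal-reflection idea described above. It is not the UFS implementation (which is not detailed here); the OpenCV-based approach, the file name and the threshold values are illustrative assumptions only.

# Illustrative sketch only (not the UFS system): locate the dark pupil and the
# bright infra-red glint in one eye image and report the pupil-glint vector,
# which changes as the eye rotates and, after calibration, can be mapped to gaze.
import cv2


def find_centre(mask):
    """Return the centroid (x, y) of the largest region in a binary mask, or None."""
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    m = cv2.moments(max(contours, key=cv2.contourArea))
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])


eye = cv2.imread("eye_ir.png", cv2.IMREAD_GRAYSCALE)  # hypothetical infra-red camera frame
if eye is None:
    raise SystemExit("No eye image found; supply an infra-red eye frame.")

# The pupil absorbs infra-red light and appears dark; the corneal reflection
# (glint) is a small, very bright spot. Simple global thresholds separate them.
_, pupil_mask = cv2.threshold(eye, 40, 255, cv2.THRESH_BINARY_INV)
_, glint_mask = cv2.threshold(eye, 220, 255, cv2.THRESH_BINARY)

pupil = find_centre(pupil_mask)
glint = find_centre(glint_mask)

if pupil and glint:
    dx, dy = pupil[0] - glint[0], pupil[1] - glint[1]
    print(f"pupil-glint vector: ({dx:.1f}, {dy:.1f})")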

Based on eye movements, a researcher can study cognitive patterns, driver behaviour, attention spans, even thinking patterns. A disabled person could use their eye-movements to interact with a computer, with future technology (still in development) that would enable that computer to control a wheelchair or operate machinery.

The UFS recently initiated the establishment of an eye-tracking interest group for South Africa (ETSA) and sponsors a biennial eye-tracking conference. The group’s website can be found at www.eyetrackingsa.co.za.

“Eye tracking is an amazing tool for empowerment and development in Africa,” says Blignaut, “but it is not used as much as it should be, because it is seen as too expensive. We are trying to bring this technology within the reach of anyone and everyone who needs it.”

Issued by: Lacea Loader
Director: Strategic Communication

Telephone: +27 (0) 51 401 2584
Cell: +27 (0) 83 645 2454
E-mail: news@ufs.ac.za
Fax: +27 (0) 51 444 6393
