16 August 2024

Opinion article by Dr Peet van Aardt, Centre for Teaching and Learning and Head of the UFS Writing Centre, University of the Free State. 


The use of artificial intelligence tools such as ChatGPT should be discouraged rather than permitted at the University of the Free State (UFS), writes Dr Peet van Aardt.

A decade ago, academics were encouraged to find ways to incorporate social media platforms such as Facebook and Twitter into their teaching. Because students were spending so much time on these platforms, the idea was that we needed to take the classroom to them. That was until we found out that young people do not use social media to study, but rather to create and share entertainment content.

In the late 2000s, News24.com, the biggest news website in Africa, went on a mission to nurture and expand what was known as “community journalism”. Because everybody was starting to own a smartphone, the news outlet’s leadership saw an opportunity to provide a platform for people to share photos, videos and stories of news events taking place around them. That was until they realised that the vast majority of people did not want to contribute to journalism; they merely wanted to consume it.

Lest we assume that students will use AI in a responsible and productive manner: at the UFS Writing Centre we find that students over-rely on ChatGPT in their assignments and essays. We should do everything in our power to discourage its use, because it threatens what we do at a university on three levels.

It’s an educational issue

There are five main academic literacies we want to teach our students: reading, writing, speaking, listening and critical thinking. When students prompt ChatGPT to write their essay for them, immediately the reading and writing literacies are discarded because the student does not write the essay, nor do they read any source material that would help them form an argument. Critical thinking goes out the window, because it is merely a copy-and-paste job they are performing. And speaking? We see in the Writing Centre that students who use ChatGPT cannot discuss their “work”. The student voice is being killed.

There are lecturers who take the approach of motivating students to use prompted content from ChatGPT in order to critique and discuss the AI output. This is fine, provided the students are operating at a level where their academic literacies have already been established. In short: for postgraduate use it might be acceptable. Undergraduate students need to go through the process of becoming scholars and mastering their subject matter before they can be expected to critique it. This is basic pedagogy, and our duty as staff at the UFS, because it aligns with the Graduate Attribute of Critical Thinking.

It’s a moral issue

In addition to the academic literacies we attempt to instil in our students, we also try to cultivate the attributes of ethical reasoning and written communication. The fact that AI tools “scrape” the internet for content without any consent from the content creators means that they are committing plagiarism. It is theft – “the greatest heist in history”, as some refer to it. Do we want our students to develop digital skills and competencies on immoral grounds? Because this is often the reason given when students are encouraged or allowed to use AI: “The technology is there, and therefore we must learn to go with the flow and let the students use it.” By this reasoning one could argue that the UFS rugby team (go, Shimlas!) must use performance-enhancing substances, because they will make the players faster and stronger, and “the technology is there”.

Academics also face a moral dilemma, as a view seems to be brewing that fire should be fought with fire: that AI can assist and even lead in tasks such as plagiarism detection, assessment and content development. But fighting fire with fire just burns down the house. Let us not look to AI to lessen our workload.

It’s an economic issue

Technology in education should be used to level the playing field. Companies develop online tools with the primary goal of making money – despite what the bandwagon passengers in the East and West promise us. Their operations cost a lot of money, so they release free versions to get people hooked, and then they develop better products and place them behind a paywall. What this means is that students who can afford the subscription costs get access to the premium product, while poor students get left behind. How can we assess two students who cannot make use of the same version of a tool? This will widen the performance gap between students from different economic backgrounds. Add to this the deletion of the authentic student voice (as alluded to earlier), and these AI tools represent nothing more than a new platform for colonisation; they therefore have no place in our institution.

OK, so what?

Lecturers who want advice on how to detect overreliance on AI tools can have a look at this video, which forms part of the AI Wayfinder Series – a brilliant project by the UFS’s Interdisciplinary Centre for Digital Futures and the Digital Scholarship Centre. These centres also have other helpful resources to check out.

But as an institution we need to produce a policy on how to deal with the threats and possibilities of AI. (Because in society and in certain disciplines it can make a contribution – just not for undergraduate studies in a university context.) Currently, a team that includes staff from the Faculty of Law and the Faculty of Economic and Management Sciences is drafting guidelines that departments can implement. After these guidelines have been in practice for a while, they can be reviewed to set us on the path towards a concrete policy.

If we as educators consider the facts that the use of AI tools impedes the development of academic literacies at undergraduate level, that it silences local, authentic voices, and that it can create further economic division among our student community, then we should not promote its use at our institution. Technology is not innovative if it does not improve something.

Dr Peet van Aardt is the Head of the UFS Writing Centre and the Coordinator of the Initiative for Creative African Narratives (iCAN). Before joining the UFS in 2014 he was the Community Editor of News24.com. 

News Archive

Heart diseases a time bomb in Africa, says UFS expert
2010-05-17


There are a lot of cardiac problems in Africa. Sub-Saharan Africa is home to the largest population of rheumatic heart disease patients in the world and therefore hosts the largest rheumatic heart valve population in the world. These patients number more than one million, compared with 33 000 in the whole of the industrialised world, says Prof. Francis Smit, Head of the Department of Cardiothoracic Surgery at the Faculty of Health Sciences at the University of the Free State (UFS).

He delivered an inaugural lecture on the topic Cardiothoracic Surgery: Complex simplicity, or simple complexity?

“We are also sitting on a time bomb of ischemic heart disease with the WHO (World Health Organisation) estimating that CAD (coronary artery disease) will become the number-one killer in our region by 2020. HIV/Aids is expected to go down to number 7.”

Very little is being done about it. There is no clear, coordinated programme to address this expected epidemic, and CAD is still regarded as an expensive disease confined to Caucasians in the industrialised world. “We are ignoring alarming statistics about incidences of adult obesity, diabetes and endemic hypertension in our black population, and a rising incidence of coronary artery interventions and incidents in our indigenous population,” Prof. Smit says.

While South Africa has 44 cardiac units, very few units outside the country (about seven) perform even low volumes of basic cardiac surgery. The South African units at all academic institutions are under severe threat, and about 70% of cardiac procedures are performed in the private sector.

He says the main challenge in Africa has become sustainability, which needs to be addressed through education. Cardiothoracic surgery must become part of everyday surgery in Africa through alternative education programmes. That will make this specialty relevant at all levels of healthcare, and the discipline must be involved in decisions on resource allocation to medicine in general and to cardiothoracic surgery specifically.

The African surgeon should make the maximum impact at the lowest possible cost to as many people in a society as possible. “Our training in fields like intensive care and insight into pulmonology, gastroenterology and cardiology give us the possibility of expanding our roles in African medicine. We must also remember that we are trained physicians as well.

“Should people die or suffer tremendously when we could train a group of surgical specialists, or retrain general surgeons, to expand our impact on cardiothoracic disease in Africa, perhaps by using available technology more creatively? We have made great progress in establishing an African School for Cardiothoracic Surgery.”

Prof. Smit also highlighted the role of the annual Hannes Meyer National Registrar Symposium, which in 2010 culminated in an eight-strong international panel, sponsored by the ICC of the EACTS, presenting a scientific course as well as advanced surgical techniques in conjunction with the symposium.

Prof. Smit says South Africa is fast becoming the driving force in cardiothoracic surgery in Africa. It is the only country with the knowledge, technology and skills base to act as the springboard for the development of the specialty on the continent.

South Africa, however, is experiencing its own problems. Mortality doubled between 1997 and 2005, and half of the population of the Free State dies between the ages of 40 and 44.

“If we do not need health professionals to determine the quality and quantity of service delivery to the population and do not want to involve them in this process, we can get rid of them, but then the political leaders making that decision must accept responsibility for the clinical outcomes and life expectancies of their fellow citizens.

“We surely cannot expect to impose the same medico-legal principles on professionals who work in unsafe hospitals, and who have complained and made the authorities aware of these conditions, as on those working in functional institutions. Either fix the institutions or indemnify the medical personnel working in these conditions, and defend that decision publicly.

“Why do I have to choose the three out of four patients that cannot have a lifesaving operation and will have to die on their own while the system pretends to deliver treatment to all?”

Prof. Smit says developing a service package, with guidelines in the public domain, will go a long way towards addressing this issue. It is also about time that we admit that things are simply not the same: standards are deteriorating, and training outcomes are or will be affected.

The people who make the decisions that affect healthcare service delivery and outcomes, the quality of training platforms and research, in short, the future of South African medicine, firstly need rules and boundaries. He also suggested that the government should perhaps develop health policy in the public domain and then outsource healthcare delivery to people who can actually deliver, including the thousands of experts currently employed but ignored by the State.

“It is time that we all have to accept our responsibilities at all levels… and act decisively on matters that will determine the quality and quantity of medical care for this and future generations in South Africa and Africa. Time is running out,” Prof. Smit says.
 
