I’m a bit worried about ChatGPT, and I don’t think that I am the only one.
It is unlike me to be so up to date with cutting-edge technology. ChatGPT was only launched in November 2022 by the American artificial intelligence research laboratory OpenAI. It is a sophisticated, AI-powered chatbot, capable of engaging in realistic conversation and giving articulate replies to user-posed questions, drawing on the knowledge resources of the internet. Normally, I would only hear about something like ChatGPT several years after it was created, usually just as it was about to be superseded by something newer and better.
But ChatGPT is suddenly everywhere. If you believe the hype, it is taking over the world. There are suggestions that it will eventually replace everything from teachers to journalists.
In the short term, it has been suggested that it might become the predominant search engine, replacing Google. Microsoft’s search engine Bing is already launching an updated version with ChatGPT built in. And it clearly has Google worried: in response, it is developing its own equivalent of ChatGPT, called Bard. Although, given Bard’s recent performance regarding the James Webb Space Telescope, it still has a little way to go.
But what will this mean for search engines?
Currently, search engines use an algorithm to rank pages in an attempt to provide the most relevant (or, if you are a cynic, the most commercial) information to match your search query. So, among the thousands of responses your search query may bring up, the top-ranked pages will, in theory, most likely be those from larger, peer-reviewed organisations, gradually tapering down to a response from Johnny Nutter operating out of a small back bedroom in Leytonstone.
A ChatGPT-powered search engine will operate in much the same way, but rather than revealing the full multiplicity of responses, it will simply supply one aggregated, lucidly expressed answer, using data mined primarily from the top-ranked pages.
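To make the contrast concrete, here is a minimal sketch in Python, using entirely made-up pages and scores; the page list, the ranking and the naive "aggregate" step are illustrative assumptions, not how Google, Bing or ChatGPT actually work. One function returns the whole ranked list for the reader to inspect; the other collapses only the top few results into a single fluent answer.

```python
# Toy illustration: ranked results versus one aggregated answer.
# All data and logic here are invented for the sake of the example.

ranked_pages = [
    {"source": "Large peer-reviewed organisation", "score": 0.95,
     "snippet": "The mainstream, well-sourced answer."},
    {"source": "Mid-sized publisher", "score": 0.70,
     "snippet": "A broadly similar answer with extra context."},
    {"source": "Johnny Nutter's back bedroom", "score": 0.10,
     "snippet": "A heterodox take that may or may not be nonsense."},
]

def classic_search(pages):
    """Return every result, highest-ranked first, so the reader can
    inspect all the sources themselves, including the low-ranked ones."""
    return sorted(pages, key=lambda p: p["score"], reverse=True)

def aggregated_answer(pages, top_n=2):
    """Collapse only the top-ranked pages into a single reply,
    discarding the long tail of alternative voices."""
    top = classic_search(pages)[:top_n]
    return " ".join(p["snippet"] for p in top)

if __name__ == "__main__":
    for page in classic_search(ranked_pages):
        print(f'{page["score"]:.2f}  {page["source"]}: {page["snippet"]}')
    print("\nSingle aggregated answer:")
    print(aggregated_answer(ranked_pages))
```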
This may all sound well and good, but what happens if you sometimes want to know what Johnny Nutter thinks? What if you want to check the primary sources from which your information has come? What if you want to listen to a heterodox opinion, one that does not conform to the mainstream?
It sounds a little like the studypack approach to learning, which has proliferated in universities. Students are given distilled chunks of what they are ‘supposed’ to learn for their subject, rather than being let loose in the university library with free rein to discover information for themselves.
With ChatGPT, as with studypacks, I fear there is a danger that the ability to develop critical thinking will be lost. It is too easy to accept the answer given, particularly when it appears to come from an authoritative source, rather than to investigate around it and come to your own conclusion. And to do that, it is necessary to consult multiple sources. Even, sometimes, Johnny Nutter, if only to decide that he is talking absolute nonsense.
When the internet is already awash with so much fake news, the fact that this is the database that ChatGPT relies on for its knowledge seems a little bit like having to draw your drinking water straight from the Ganges.
Maybe it just comes down to what you trust: artificial intelligence or your own intelligence?
© Simon Turner-Tree

Did Simon Turner-Tree write this article, or did ChatGPT write this article?
(You can tell that Simon did; he uses a far greater number of adverbs than ChatGPT. Ed.)