Interesting article I found.
Personally, I've been using AI a lot lately; it's much more precise and to the point than Google searches.
I know this topic has been discussed to death lol, just wanted to share the article...
Overall, I think this particular application of ChatGPT is very useful, but the comparison in the single example this study uses is... unfair.
Search engines are only tasked with presenting you with sources of information relevant to your query. They may provide a helpful summary or quick answers drawn from the most relevant results, depending on the context, but that's about it. This means you still have to do most of the work, which may make it seem like search engines are at a disadvantage, but consider this.
Your initial search lands you on a nice website, where each article links to related items and reviews. There's also a top navigation bar with search functionality and drop-down menus to explore categories of items. You can sort and filter by price, manufacturer, and many other metrics.
What started as a search for "teddy bear" lands you on a forum where you end up asking people about the best way to imprint photos onto a wooden box, a nice DIY project that is a little more personal.
I'm not saying you couldn't possibly end up in the same situation with ChatGPT, but where it shines is in the quick feedback loop, as opposed to more thorough research, which does take some time. At least, as far as the exercise from the study is concerned. How much time the participants had surely influenced the outcome. Honestly, a lot of the time I get good ideas just by browsing stores in person, something that neither Google nor ChatGPT can do.
Where I do agree that search engines are at a great disadvantage is the current state of the web. Today, and for quite some time now, most content is created and indexed with the purpose of generating ad revenue. That means following the guidelines of an algorithm that rewards clickbait tactics, which produces low-effort content overall, since the main purpose is to get people to click on referral links. That, and the endless sea of popups, cookie banners, reCAPTCHAs... makes it exhausting to search for things effectively.
I find the comparison a bit... unbalanced. The thing is: when you type something into a search engine, you get results, and you have to look through them to find what you want, or even something you didn't actually want but that turns out to be useful. In the end, you type something and you get a list that you have to look through and compare.
ChatGPT is not a search engine; it uses search engines. So it does the work described above for you and gives you an output called an "answer". But whether this is the right answer isn't immediately clear, because you have nothing to compare it against; you only have what the machine gives you. And you don't know how that answer was created.
I searched the word "chronic" on Bing the other day to make sure I was communicating its meaning correctly. I knew it meant a long-term or ongoing condition, but being a medical term I wanted to make sure I wasn't missing some important nuance or something. The "AI" bit at the top of the results showed excerpts from about 7 websites, including Wikipedia, all of which were just "click here to learn what chronic means". Directly below, in the actual search results, was the same Wikipedia page again but this time the excerpt actually stated the definition I was looking for.
Another time recently, I searched about carbon monoxide. The "AI" bit at the top gave decent info, and also included a photo of a detector... A smoke detector, which categorically WILL NOT warn you about the presence of CO.
I work in fire and life safety and my company is encouraging staff to try out various "AI" tools to see what could help us, so most of my experience is from that angle. And yes, I have tried the latest (free) version of chatjippity, which I have ranted about elsewhere in this forum.
Every time, they get something subtly wrong or give me answers so generic that they're practically useless. But if I didn't already know the subject well, I'd probably be fooled, because they're well written and presented in an accessible way, and there's so much hype about "AI" all the time these days.
That really, really concerns me. How many people are getting dodgy answers about topics they aren't already familiar with, and trusting them?
People who work in sales tend to be quite charismatic and good speakers, sounding very confident that what they're saying is absolute truth. They've been prepped beforehand about the product they're selling and have a few stats to show they know what they're doing, but they fill in any gaps with whatever randomly pops into their head to reinforce the illusion of knowledge.
That is the best way to lie convincingly: mix a few truths in with the stuff you don't know, and always appear confident.
All these text generation tools remind me of exactly that, with the notable difference that, from time to time, a human might actually decide it's best to admit they don't know, in order to be of actual service.
The bottom line is the same as it has always been with any tool: use it for the job it's intended to do.
Search engines are used for searching for information, leaving you to judge whether it's authentic, reliable, or useful. Text generation tools are used to spit out text, and you still have to judge whether it's authentic, reliable, or useful.
In the particular study linked in the article, I think it's a very valid use case, because the whole exercise was to come up with new ideas. But I really wish people were more aware of the value of using each tool for its intended purpose.
It used to be that if people didn't believe something you told them, they'd Google it and check a few sources to be sure. Now that habit has become a curse, as I regularly see people asking Gemini or ChatGPT questions and trusting whatever answers they get.
We as humanity failed when we just accepted that "AI can make mistakes" and continued asking "it" questions.