Today, I noticed that my behavior has shifted over the past few months. Right now, I exclusively use ChatGPT for any kind of search or question.
Using Google now feels completely lackluster in comparison.
I've noticed the same thing happening in my circle of friends as well—and they don’t even have a technical background.
How about you?
I still prefer traditional search engines over LLMs, but I admit their results feel worse than they used to.
I don't like LLMs for two reasons:
* I can't really get a feel for the veracity of the information without double-checking it. A lot of context I get from just reading results from a traditional search engine is lost when I get an answer from an LLM. I find it somewhat uncomfortable to just accept the answer, and if I have to double-check it anyway, the LLM's answer is kind of meaningless and I might as well use a traditional search engine.
* I'm missing out on learning opportunities that I would usually get by reading or skimming through a larger document while trying to find the answer. I appreciate that I skim through a lot of documentation on a regular basis and can recall things that I just happened to read when looking for a solution to another problem. I would hate it if an LLM dropped random tidbits of information when I was looking for concrete answers, but since it's a side effect of my information-gathering process, I like it.
I would consider using an AI assistant that helped me search and curate the results instead of trying to answer my question directly. Hopefully in a sleeker way than Perplexity does with its sources feature.
Yes, I use search engine(s) constantly - namely Kagi, which really does feel like Google used to. I tried using LLMs for a recent project of mine when I was trying to figure out if something was possible, and they were actively misleading, every time. The issue was that what I was asking for turned out not to be currently possible, but the LLMs wouldn't tell me that and would make up incorrect ways to solve my problem, since they didn't want to admit it couldn't be done.
Really, these days, either I know some resource exists and I want to find it, in which case a search engine makes much more sense than an LLM which might hallucinate, or I want to know if something is possible / how to do it, and the LLM will again hallucinate an incorrect way to do it.
I've only found LLMs useful for translation, transcription, natural language interface, etc.
This is my favorite thing about Kagi; you can do both. If you just append a question mark, it'll run the search through a simple LLM and give you those results (with citations) right before standard search. From there, you can proceed into a more sophisticated frontier model if that's more effective.
"Search" can mean a lot of things. Sometimes I just want a website but can't remember the URL (traditional); other times I want an answer (LLMs); and other times, I want a bunch of resources to learn more (search+LLMs).
I use Kagi exclusively and refuse to offload my brain to a thing that has no accuracy guarantee whatsoever. The answers it emits can be completely bogus, and the developers of these things lowkey expect me to believe whatever their black box says? Nah, never.
Instead I use a search engine and do my own reading and filtering. This way I also learn about what I'm researching, so I don't fall into the vicious cycle of drug abu ^H^H^H^H^H laziness. Otherwise I'll inevitably rely more and more on that thing and become a prisoner of my own making, increasingly offloading my tasks to a black box and growing dependent on it.
Google results have gotten so terrible over the years. I switched to Kagi long ago and haven't looked back. Whenever I use Google on another computer, I'm shocked by how awful the results are compared to Kagi.
As for AI search, I do find it extremely useful when I don't know the right words to search for. The LLM will instantly figure out what I'm trying to say.
I'm the inverse: 90% of the time I still use search engines, mostly Google. LLMs can't help me with researching Hungarian companies offering the screws, furniture, TVs, etc. that I need for my home renovation. They can't find me the best route to a cafe, look up users, or find information on famous people. Google is also faster than me typing a good prompt.
I use LLMs for what they are good at: generative stuff. I know some tasks take me a long time, and I can shortcut them easily with LLMs.
So here's a ChatGPT example query* which is completely off:
https://chatgpt.com/share/67f5a071-53bc-8013-9c32-25cc2857e5...
* It's intentionally bad to be able to compare with Google.
And here's the web result, which is spot on:
LLMs are still notorious for hallucination; last I checked ChatGPT in particular still hallucinates about 1/3rd of the time.
So yeah, I do still use search engines, specifically Kagi and (as a fallback) DuckDuckGo. From either of them I might tack on a !g if I'm dissatisfied with the results, but it's pretty rare for Google's results to be any better.
When I do use an LLM, it's specifically for churning through some unstructured text for specific answers about it, with the understanding that I'll want to verify those answers myself. An LLM's great for taking queries like "What parts of this document talk about $FOO?" and spitting out a list of excerpts that discuss $FOO that I can then go back and spot-check myself for accuracy.
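That spot-checking step can be partly automated. Below is a minimal sketch (the helper and sample strings are hypothetical, not tied to any particular LLM API) that verifies quoted excerpts actually appear verbatim in the source document, which catches the most common failure mode of fabricated quotes:

```python
def verify_excerpts(document: str, excerpts: list[str]) -> dict[str, bool]:
    """Check that each LLM-quoted excerpt actually appears in the document.

    Whitespace is normalized on both sides, since models often reflow
    line breaks when quoting.
    """
    def normalize(text: str) -> str:
        return " ".join(text.split())

    doc = normalize(document)
    return {excerpt: normalize(excerpt) in doc for excerpt in excerpts}


# Example: one real quote, one fabricated quote
doc = "The FOO module handles retries.\nIt backs off exponentially."
quotes = ["The FOO module handles retries.", "It retries forever."]
print(verify_excerpts(doc, quotes))
# the first quote is found; the second is flagged as not present
```

This only proves the excerpt exists, not that it answers the question, so a manual read of the flagged-as-real excerpts is still needed.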
Yes, I'm still using Google as I haven't found LLMs useful as a search engine replacement.
No, I find it unwilling to produce factual information.
For example Jeep consistently lands at the bottom of the reliability ratings. Try asking GPT if Jeeps are reliable. The response reads like Jeep advertising.
I use DuckDuckGo, with the occasional reddit !g appended if I'm looking for something experience-based.
For me, searches fall into one of three categories, none of which are a good fit for LLMs:
1. A single business, location, object, or concept (I really just want the Google Maps or Wikipedia page, and I'm too lazy to go straight to the site). For these queries, LLMs are either overkill or outdated.
2. Product reviews, setup instructions, and other real-world blog posts. LLMs want to summarize these, and I don't want that.
3. Really specific knowledge in a limited domain ("2017 Kia Sedona automatic sliding door motor replacement steps," "Can I exit a Queue-Triggered Azure Function without removing it from the queue?"). In these cases, the LLMs are so prone to hallucination that I can't trust them.
AFAICT ChatGPT is mostly useless and can't be trusted to answer questions accurately. So no, mostly all search engines. To be honest, I'm surprised anybody uses it for anything other than trivial tasks.
I use (and pay for) Kagi.
Even without much customization (lenses, scoring, etc) it's so much better (for my use cases) I happily pay for it.
Recently I have also started to use Perplexity more for "research for a few minutes and get back to me" type of things.
Queries like "what was that Python package for X" I usually ask an AI right from my editor, or ChatGPT if I'm in the browser already.
I use both, but direct search is faster since I have to fact check the LLM's answer.
2 recent success stories:
I was toying around with an ESP32 - I was experimenting with turning it into a Bluetooth remote control device. The online guides help to an extent, setting up and running sample projects, but the segue into deploying my own code was less clear. LLMs are "expert beginners", so this was a perfect request for one. I was able to jump from demos to deploying my own code very quickly.
Another time I was tinkering with OPNsense and setting up VLANs. The router config is easy enough, but what I didn't realize before diving in was that the switch and access point require configuration too. What's difficult about searching this kind of problem is that most of the info is buried in old blog posts and forum threads and requires a lot of digging and piecing together disparate details. I wasn't lucky enough to find someone who did a writeup with my exact setup, but since LLMs are trained on all those old message boards, this was again a perfect prompt playing to their strengths.
I use Kagi and sometimes DDG. When I do a search I'd rather do my own reading than be lied to. It's not even like using an LLM for code, where you can quickly iterate if needed; there is no way to verify that the information you got is correct, and that is a major problem imo.
Using search engines is still _significantly_ faster for me for the vast majority of the queries I want answers for.
LLM responses are still too slow, vary too much in quality, and still frequently contain hallucinations.
My typical use-case is that when I'm looking for an answer I make a search query, sometimes a few. Then scan through the list of results and open tabs for the most promising of them - often recognising trusted, or at least familiar, sites. I then scan through those tabs for the best results. It turns out I can scan rapidly - that whole process only takes a few seconds, maybe a minute for the more complex queries.
I've found LLMs are good when you have open-ended questions, when you're not really sure what you're looking for. They can help narrow the search space.
I use Kagi, but I will say, Quick Answer (place a question mark after your query for an LLM-based answer) has been way more useful than I initially thought.
Do you really, though? I had this conversation recently with someone, and she was still typing stuff into the browser bar in Chrome and clicking on things from search results. I think a lot of folks believe ChatGPT has superseded search, but they don't realise they are still carrying out a load of low-level or transactional search queries via Chrome.
100% still search first. If I am not super knowledgeable about the domain I am searching, I use an AI to get keywords and terminology and then search.
At most I use AI now to speed up my research phase dramatically. AI is also pretty good at showing what is in the ballpark for more popular tools.
However, I am missing forum-style communities more and more. Sometimes I don't want the correct answer; I want to know what someone who has been in the trenches for 10 years has to say. For my day job I can just make a phone call, but for hobbies, side projects, etc. I don't have the contacts built up, and I don't always have local interest groups that I can tap for knowledge.
I only use LLMs when I don't know what I am looking for. Otherwise, search engines all the way.
LLMs can't be trusted; you have no way to tell a correct answer from a hallucination. Which means I often end up searching for what the LLM told me just to check, and it is often wrong.
Search engines can also lead you to false information, but you have a lot more context. For example, a StackOverflow answer has comments, and often, they point out important nuances and inaccuracies. You can also cross-reference different websites, and gauge how reliable the information is (ex: primary source vs Reddit post). A well trained LLM can do that implicitly, but you have no idea how it did for your particular case.
I still use search engines. I do not like being spoon-fed. I want to learn from real people, not AI-generated shit. ChatGPT and other LLMs are trained on old data; they do not contain newer information. Newer information and knowledge are produced by real humans. LLMs are great for quick fact checks, but not for searching. For example: what's the height of Mount Everest? LLMs will most probably give the right answer.
But what are the specs of the new Google Pixel 9a? An LLM can't answer that; maybe after a year it can.
Search engines are still required for me. LLMs still get lots of very important things wrong.
Last night, I asked Claude 3.7 Sonnet to obtain historical gold prices in AUD and the ASX200 TR index values and plot the ratio of them, it got all of the tickers wrong - I had to google (it then got a bunch of other stuff wrong in the code).
Also yesterday, I was preparing a brief summary of forecasting metrics/measures for a stakeholder and it incorrectly described the properties of SMAPE (easily validated by checking Wikipedia).
I constantly have issues with my direct reports writing code using LLMs. They constantly hallucinate things for some of the SDKs we use.
I was among the first to champion AI search, even before Perplexity rose to fame. You.com was the first AI search to quote sources well, and I used it extensively.
But now, the veracity of most LLMs' responses is terrible. They often include “sources” unrelated to what they say and hallucinate when I search topics I'm an expert in. Even Gemini in Google Search told me yesterday that Ada Lovelace invented the first programming language in the 18th century. The trust is completely gone.
So, I'm back to the plain old search. At least it doesn't obscure its sources, and I can get a sense of the veracity of what I find.
Yes, I do. I'd never use an LLM for any meaningful or important information, because by design they simply emit the most likely next token, and you get a ton of responses that are pure nonsense if you start digging into them. Mind you, I've noticed that Google has been terrible for a long while now. Kagi seemed alright at first, but it also gave a lot of nonsense. The final straw for Kagi was the fact that they are backed by yandex and by extension fund the russian terrorist state. Lately I've switched to Qwant, and so far it covers almost all of my needs.
I use Kagi, and I don't think I'd notice if Quick Answer disappeared. Most of the time I have an answer in the time it would take GPT to even start responding.
I use search engines (Google) and when they (it) fails to provide me the responses I need, I turn to ChatGPT. For example:
I recently upgraded my video card, and I run a 4K display. Suddenly the display was randomly disconnecting until I restarted the monitor. I googled my brains out trying to figure out the issue, and got nowhere.
So I gave ChatGPT a shot. I told it exactly what I upgraded from/to, and which monitor I have, and it said "Oh, your HDMI 2.0 cable is specced to work, but AMD cards love HDMI2.1 especially ones that are grounded, so go get one of those even if it's overspecced for your setup."
So I did what it said, and it worked.
I don't even consider using an LLM as a search engine. But I do agree, Google has declined drastically in quality. Personally I'm on DuckDuckGo, though it always depends on the topics you search for.
Yes, DDG for 95% of issues. Using an AI to search seems really, really, really dumb to me.
Yes. I like being able to evaluate my sources. For programming or other technical topics, I'll rather read the original documentation, or third-party information whose credibility I can have some idea about.
For other topics, exact pedantic correctness may not always be as important, but I definitely do want to be able to evaluate my sources nevertheless, for other obvious reasons.
Search is actually pretty much what I want: a condensed list of possible sources of information for whatever I'm looking for. I can then build my own understanding of the topic by checking the sources and judging their credibility. Search seems to have been getting worse lately, sadly, but it's still useful.
I subscribe to https://kagi.com/. I use search to find expert and authoritative sources of information with human authors who can be held responsible for their contents, and that I can cite in my own work. I’m not interested in the output of a copy-paste machine that steals others’ work, makes things up, and spits out prose worse than a politician’s.
Kagi with a lengthy exclusion/block list (fact checkers, Pinterest, etc), Brave Search, DDG as a rare 3rd option. Not using any explicit AI search engines like Perplexity, but I make use of Kagi's summaries a lot.
Search engines have gotten worse, but they are still much more helpful for finding certain resources compared to LLMs. I am fond of the search operators that still work, like `filetype:pdf`, `site:example.com`, `intitle:trailmix`.
If they get rid of those operators, then that would be really bad. But I have a feeling that’s what a lot of search engine people are itching to do.
Nah. I'm perfectly conscious of the fact that ChatGPT can't be trusted with searches. Google is still my daily driver.
There is a class of problems I no longer use search for. I find LLMs give really good results for things like command-line usage, or even things like configuring an application. Basically anything that can be summarized from lots of disparate sources.
Conversely, it’s a huge mistake to rely on LLMs for anything that requires authoritative content. In my experience, they're not good at appropriately discounting low-quality sources. Google can have a similar problem, but for many topics I find it easier to find good sources there first.
Where LLMs really replace modern google is for topics you only kind of think should exist. Google used to show some pretty tenuously related links by the time you got to page 5 results and there you might find terms that bring you closer to what you’re looking for. Google simply doesn’t do that anymore. So for me, one of the joys is being able to explore topics in a way I haven’t been able to for over a decade
Search engines are like older versions of LLMs. They are not designed as human-like assistants, but their goal of providing answers is similar in nature.
Search engines tend to summarize less and provide lots of references - something LLM researchers have worked hard to achieve.
If they feel lackluster for you, maybe you are not interested in those specific use cases in which they shine.
Similarly, the reason could be that you don't want to check references for yourself, and you prefer to trust the selection of cross references provided by your LLM of choice.
It is likely that your close circle of friends share an identity similar to yours. That is, by many, considered a defining characteristic of friendship. Although it can be a sign of the rising popularity of LLMs, one must take it as an anecdote and not a statistically significant fact.
I do prefer a soft selection of queries on different search engines and different LLM models. Since you asked for an opinion and self-declared an ability to do searches and questions yourself, I don't feel obligated to cite sources for this answer.
I use ChatGPT sometimes, but only after I've exhausted Google's results for my search and not found the answer I was looking for, or when the query is so obscure that the enhanced problem-solving ability of ChatGPT warrants going to it first. I like ChatGPT for solving mundane math problems, because I can check its work and getting the answers that way is quicker than doing it myself manually. I still don't trust ChatGPT for anything subjective, because I get spurious results from it anytime the answer to my question is not cut and dried. But what it can do, it does well.
I don't have a circle of friends, so I have no idea what other people are doing, outside of what I read online.
I use search engines all the time (Kagi specifically). AIs don't have up-to-date information. How would you find reviews for products via an AI? It'll just come up with one or two, whereas when you read reviews yourself you can pick out nuance and also tell whether it's a genuine review or made-up garbage. Or find a place to buy something. Or a place to go, and read other people's comments on it. Summaries aren't very useful compared to comments imo.
I use an LLM a lot for coding. However, I was never that much into doing web searches for programming problems anyway; I used docs more and rarely needed sites like SO. So I haven't moved away from search engines for that side of things.
I use a search engine 99% of the time. Occasionally I use an LLM, but even for checking the most simple information I can't have any confidence that the answer it gives is correct. It seems to lie to me every time I use it, and it contradicts itself when I tell it that it made an error. It provides no citation for where it got its information, and that seems completely essential. I very rarely see any use for it. Even if a search engine is much slower, I will not compromise on knowing where the information is sourced from, so I can judge its accuracy, bias, etc. I feel disturbed by all the people that have lower information standards.
Yes, LLMs are no match for my decades of search skills.
I use search engines, but that's because I just yeet in a few words and I get a result, either directly through the preview or after a click through to the results.
With chatbots I first need to formulate a question (or, I feel like I do), then wait for it to slowly churn out an overly wordy response. Or I need to prompt it first to keep it short.
I suppose this difference shrinks if you already used a search engine by asking it a fully formulated question like "What is an html fieldset and how do I use it?" instead of "html fieldset" and clicking through to MDN.
For those same questions that you're probably asking ChatGPT, a Google search would show me Google's LLM answer at the top, maybe Reddit threads that would illuminate the topic a bit more, maybe Stack Overflow threads where 2-3 people show different approaches to the solution, and maybe some random forum somewhere with example code I could repurpose. Sure, ChatGPT will answer the question, but it won't have all the other noise that I can glean from and maybe come up with a better solution.
I would use the analogy of consuming a perfectly tasty and nutritious meal crafted by chef ChatGPT vs. visiting a few restaurants around your neighborhood and tasting different cuisines. Neither approach is wrong, but you get different things and values out of each. Do what you feel like doing!
Last week, there was a specific coding problem I needed help with. I asked ChatGPT, which gave me a great answer - except I spent a few hours trying to figure out why the function ChatGPT was using wasn't being included, despite the #include directives all being correct. Neither ChatGPT nor Google was helpful. The solution was to just take a different approach in my code; if I had only googled, I wouldn't have spent that time chasing the wrong solution.
Also consider this: when you ask a question, there are a bunch of rude (but well-meaning) people who ask things like "what are you really trying to do?" and who criticize a bunch of unrelated things about your code/approach/question. A lot of times that's just annoying, but sometimes it gives you really good insights into the problem domain.
I fundamentally cannot trust a searching system that includes a disclaimer that it can make stuff up (hallucinate) and there's nothing you can do about it.
I 100% use search engines, especially to find doc that I know exists. Google/DDG are so fast.
If it is more of an open ended question that I am not sure there'll be a page with an answer for, I am more likely to use ChatGPT/Claude.
I use ChatGPT only occasionally, mostly for laughs, but primarily use Google. It's not as good as it used to be, but it is still the best available. I think there is an opening for a new search engine company now (unlike 10 years ago, when Google was unbeatable), and I suppose LLMs might be a part of it. ChatGPT is not it, though.
Same with my wife (non-technical) and teenage daughter.
Yes. Why would I use AI to find information?
It depends on the type of query. For anything that has to do with locality or recency, LLMs just don't _really_ work all that well, or even at all.
Someone at work yesterday asked me if I knew which bus lines would be active today due to the ongoing strike. Googled, got a result, shared back in under 10 seconds.
Out of curiosity I just checked with various LLMs through t3.chat, with all kinds of features, none had anything more than a vague "check with local news" to say. Last one I tried Gemini with Deep Research and what do you know, it actually found the information and it was correct!
It also took nearly 5 minutes...
Like I feel if your search is about _reality_ (what X product should I buy, is this restaurant good, when is A event in B city, recipes, etc.) then LLMs are severely lacking.
Too slow, with almost always incomplete answers if not straight-up incorrect ones; deep research tends to work if you have 20 minutes to spare, both to get an initial answer and to manually vet the sources/look for more information in them.
I'm using ChatGPT or Perplexity as my defaults for any research/questions I have (open research). I go to Google when I have a specific company whose details I want to quickly check (closed research).
The trend of using LLMs for everything feels like a "when all you have is a hammer, everything starts to look like a nail" situation.
People should do what makes them feel good, but I think we're all going to get a bit dumber if we rely too much on LLMs for our information.
I personally still use search engines daily when I know what I am searching for. I am actually finding that I am reaching for LLMs less, even though they are getting easier and cheaper (I pay for T3 Chat at $8 USD per month).
Where I find LLMs useful is when I am trying to unpack a concept or I can't remember the name of something. The results of these chats often lead to their own Google searches. Even after all this development, the best LLMs still hallucinate constantly. The best way that I've found to reduce hallucinations is to use better prompts. I have used https://promptcowboy.ai/ with some success for this.
I don’t use LLMs for factual information at all because it is likely biased or wrong.
Yes, I still use search engines. So do all but one of my friends, both technical and not. I have not found LLMs to be anything close to a good replacement for them.
I use Kagi search when I want to find something, and chatgpt free when I want a question answered.
Most of my searches still use traditional search engines for two reasons:
- If I am seeking technical information, I would rather get it from the original source. It is often possible to do that with a search. The output from an LLM is not going to be the original source. Even when dealing with secondary sources, it is typically easier to spot red flags in a secondary source than it is in the output of an LLM.
- I often perform image searches. I have no desire for generated images, though I'm not going to object to one if someone else "curated" the outputs of an AI model.
That said, I will use an LLM for things that aren't strictly factual, i.e. where I can judge whether the output is good enough for my needs by simply reading it over.
Mostly GPT, but for World of Warcraft, GPT is absolutely horrible. It's like it has been corrupted by the 20 years of bad/incorrect user data, or maybe just the sheer amount of it in general.
As an example, someone typo'd an abbreviation, so I asked GPT and it gladly made up something for me. So I gave it a random abbreviation, and it did the same (using its knowledge of the game).
Even when I tell it the specific version I'm playing it gets so much wrong it's basically useless. Item stats, where mobs are located, how to do a certain quest - anything. So I'm back to using websites like wowhead and google.
Yes.
Until LLMs stop responding with over confident “MBA talk” that sounds impressive but doesn’t really say much, I’ll continue to use search engines.
All the time. I don't like LLMs, and don't trust them. I tried to use copilot but ended up shutting it off because I spent more time trying to decipher and ignore its (wrong) suggestions than I did solving the problem.
GPT is completely useless for most of my daily searches. Searching for specific content on a site? I can just put in site:domain.com keywords and get useful results without having to read useless overview paragraphs about the site in question.
Image searches without having to describe every minute detail of what I'm looking for?
Bah, even some searches that are basically Wikipedia or historical lookups... Google Search's UI is so much easier than ChatGPT's endless paragraphs with unclear sources, etc.
For some things Google's AI results are helpful too, if not to just narrow down the results to certain sources.
No chat interface helps with any of this.
People started using search engines to ask stupid questions. An LLM like Gemini etc is hands-down better for that. A search engine is still better for actually searching. I do not need a 5000 word screed about a guacamole recipe.
Why would I ever search on ChatGPT - that's not what it's for. LLMs are for helping summarize things, writing copy, designing Excel spreadsheets, making silly images.
Search is for finding specific websites and products. Totally different things.
For sure. For me, searching is learning. I think there is not only one exact answer to my question, partly because I am not asking my question absolutely correctly, as such precision would take a lot of time. So while reading through the list of possible answers I often develop ideas and/or paths that lead to a possible solution. This is good for my imagination and for future structuring of my search processes or research considerations. I also occasionally have the opportunity to discover information that is important for my other projects. E.g. some days ago I searched for a group of software filters that find borders in groups of pixels where the pixels build more of a soft cloud than a separation between rock and water. I found a company - Tempus AI - that developed a successfully working medical AI... so I bought this share...
I use DDG multiple times a day, every day. I don't find ChatGPT to be a suitable substitute for helping me locate resources on the web; hallucinated links waste my time trying to get to useful information.
Try Kagi, it's pretty good with filters for downranking/upranking sites. I usually don't use AI for search purposes very much; mostly I use it to avoid multiple pages of docs by asking it how to do things.
Yep. I ask LLMs the XY questions since they don’t get annoyed, and when my question is very concrete and reduced to its essence, I ask the search engine and usually get a better answer than the LLM would give me.
Basically, there’s a lot of good and specific information on the web, but not necessarily combined in the way I want. LLMs can help break apart my specific combination at a high level but struggle with the human ability to get to solutions quickly.
Or maybe I just suck at asking questions, haha.
Do you have that friend who knows the answer to anything and who you thought was a genius until smartphones appeared and you started googling his answers? LLMs are that guy.
For programming stuff that can be immediately verified LLMs are good. They also cover many cases where search engines can't go (e.g. "what was that song where X did Y?"). But looking up facts? Not yet. Burned many times and not trying it again until I hear something changed fundamentally.
I still use Google Scholar, Right Dao for deep search (tens of thousands of results), SearX instances, and Kagi for now, but it's not worth the $10/mo for only ~200 results per search.
The serendipity of doing search with your own eyes and brain on page 34 of the results cannot be overstated. Web surfing is good and does things that curated results (i.e., Google's <400, Bing's <900, Kagi's <200, an LLM's very limited single answer) cannot.
Here's what I do:
1. Questions where I expect SEO crap, like cooking recipes, go to LLMs. I use the best available LLM for those to avoid hallucinations as much as possible, 2.5 Pro these days. With so much blogspam, LLMs are actually less likely to hallucinate at this point than the real internet, IMO.
2. Questions whose answer I can immediately verify, like "how do I do x in language y", also go to an LLM. If the suggestion doesn't work, then I google. My stackoverflow usage has fallen to almost 0.
3. General overviews / "how is this algorithm called" / "is there a library that does x" are LLMs, usually followed by Googling about the solutions discussed.
4. When there's no answer to my exact question anywhere, or when I need a more detailed overview of a new library / language, I still read tutorials and reference docs.
5. Local / company stuff, things like "when is this place open and how do I call them" or "what is the refund policy of this store" are exclusively Google. Same for shopping (not an American, so LLM shopping comparisons aren't very useful to me). Sadly, online reviews are still a cesspool.
It’s tough to find anything useful these days because of all the spam, especially AI-generated content. If I do use a search engine, I usually use it to find something on Reddit.
I generally use search.brave.com, which has an integrated AI assistant summary. Sometimes the summary does a nice job and other times I just skip it and go find a link that is from somewhere I recognize. If I want to know how to do something, I skip the summary. If I just want to know whether something exists or is possible, then the summary is sometimes enough. I have no real desire to replace my search engine usage with an LLM.
When Google's results are garbage I will sometimes try ChatGPT or others. This is increasing, but that has more to do with Google producing ever-worsening results than any desire to use LLMs to "search".
Google wants to show me products to buy, which I'm almost never searching for, or they're "being super helpful" by removing/modifying my search terms, or they demonstrate that the decision makers simply don't care (or understand) what search is intended to accomplish for the user (ex: ever-present notices that there "aren't many results" for my search).
Recently tried to find a singer and song title based on lyrics. Google wouldn't present either of those, despite giving it the exact lyrics. ChatGPT gave me nonsense until I complained that it was giving me worse results than Google, at which point it gave me the correct singer but the wrong song, and then the correct song after pointing out that it was wrong about that.
Still can't get Google to do it unless my search is for the singer's name and song title, which is a bit late to the party.
Yes, why would I not? I, unlike you, do not intend to have only shallow knowledge of the things I wanna know about. In a few years it's going to get worse, and no one will have deep expertise in anything (especially junior engineers) if they keep using LLMs. DDG is still far better than Google, although I have started to see more ads in DDG searches, which is quite annoying.
It's a mix of both for me.
I use gemini more on my phone, where I feel like going through search results and reading is more effort, but I'll fall back to searching on duck duck go fairly often.
On a desktop I generally start at duck duck go, and if it's not there, then I don't bother with AI. (I use copilot in my editor, and it's usually helpful, but not really "search").
Yes.
ddg is often faster for when I want to get to an actual web site and find up-to-date info, for "search as navigation".
llms are often faster for finding answers to slightly vague questions (where you know you're going to have to burn at least as much climate wading through blogspam and ads and videos-that-didn't-need-to-be-videos if you do a search).
I used an AI tool for the first time this weekend to get a military CAC to authenticate to websites through Firefox on Arch. It took more than half a dozen uses of the AI tool to get what I was looking for though. Super edge case and even the AI struggled like a human.
Yes, I still use search engines and almost always find what I need in long form if I can’t figure it out on my own.
I never use ChatGPT for anything. I don't trust it for anything (nor should anybody), don't support the company that made it (unethically and on false pretenses as a nonprofit), and have absolutely no desire to contribute to its development.
When I need to search, I use a search engine and try to find a trustworthy source, assuming one is available.
Ironically, it's not that LLMs have become super useful; it's that the dominant search engines have become significantly worse, while at the same time peddling AI results. It almost feels as if it were better for them if you used LLMs.
I won't deny LLMs can be useful, but they're like the news: double-check and form your own conclusions.
I use Kagi to search, but I usually append a "?", which triggers an LLM response in addition to search results. It gives me the answer I want like 95% of the time, and I don't feel the need to dig into the search results. For me this tends to be way better than just searching or just using ChatGPT.
I’m someone that grew up with AltaVista and thus I’m pretty good with my search terms and modifiers. And I often remember specific phrases from the websites I’m looking for. However, Google is more and more optimised for people NOT knowing what they’re looking for and is now even ignoring “quotes” for exact terms unless you switch it to verbatim mode. Which is a shame.
I’m mostly using my personal SearXNG instance and am still finding what I’m looking for.
On systems where I don’t have access to that, I’m currently trying Mojeek and experiment with Marginalia. Both rather traditional search engines.
I’m not a big fan of using LLMs for this. I’d rather punch in 3-5 keywords than explain to some LLM what I’m looking for.
Switched over to DuckDuckGo a month ago. Results aren’t always great but it works 90% of the time.
I use perplexity pro + Claude a lot as well. Maybe too much but mostly for coding and conversations about technical topics.
It really depends on intent.
I have noticed that I’ve started reading a lot more. Lots of technical books in the iPad based on what I’m interested in at the moment.
LLMs have taken up a significant share of my technical/programming questions, because there’s a pretty good chance it’ll give me a correct or mostly correct answer and if it doesn’t, the results aren’t catastrophic. I don’t trust them for much else though, and so I still use a search engine (Kagi) for most other things. For odd exceptions, I ask the LLM to cite its answers and in the event that it can’t do that or provides false citations, I fall back on search engines.
These tools are useful, but in my view the level of trust commonly placed in them far exceeds their capabilities. They're not capable of distinguishing confidently worded but woefully incorrect Reddit posts from well-verified authoritative pages, which, combined with their inclination to hallucinate and their overeagerness to please the user, makes them dangerous in an insidious way.
The idea of taking an answer from any black box is profoundly unacceptable. Even if the black box didn't hallucinate. Why wouldn't I prefer to follow a link to a site so that I can evaluate its trustworthiness as a source?
Why would I want to have a conversation in a medium of ambiguity when I could quickly type in a few keywords instead? If we'd invented the former first, we'd build statues of whoever invented the latter.
Why would I want to use a search service that strips privacy by forcing me to be logged in and is following the Netflix model of giving away a service cheap now to get you to rely on it so much that you'll have no choice but to keep paying for it later when it's expensive and enshittified?
I have Gemini results in my Google searches. They're "good enough" that I rarely venture to LLMs.
When I do, it's because either I can't think of good terms to use, and the LLM helps me figure out what I'm looking for, or I want to keep asking follow-up questions.
Even then, I probably use an LLM every other week at most.
Yes, ChatGPT has flaws (strange "hallucinations"?), but I have found the same thing myself. Questions that I get nowhere with on Google Search and friends (DuckDuckGo/Qwant/Bing/etc.) I give a last try with ChatGPT, and ChatGPT tends to fare considerably better.
Given the time I dedicate to researching things, I feel like I am "more productive" because I waste less time.
But I do my due diligence to double-check what ChatGPT suggests. So if I ask ChatGPT to recommend a list of books, I double-check with Goodreads and Amazon reviews/ratings. Like that. I guess it's like having a pair-research session with an AI librarian friend? I am not sure.
But I know that I am appreciative. Does anyone remember how bad chatbots were before the arrival of low-hanging AI fruit like generative AI? Intel remembers.
Often I remember having read an article or seen a website in ~2014 or something, and now I want to find a link to it so I can cite it. I use a search engine for this, typing in the gist of what I can remember, set a date range (more clicks than it should take), and that's how I get to it.
This can be very difficult, if there's a lot of semantic overlap with a more commonly-searched mainstream topic, or if the date-range-filtering is unreliable.
Sometimes I'll look for a recipe for banana bread or something, and searching "banana bread recipe" will get me to something acceptable. Then I just have to scroll down through 10 paragraphs of SEO exposition about how much everyone loves homemade banana bread.
Searching for suppliers for products that I want to buy is, ironically, extremely difficult.
I don't trust LLMs for any kind of factual information retrieval yet.
I use search engines for two purposes, and I'm not sure if it's common practice:
Specific searches expecting one answer. This type of search is enhanced by ChatGPT. Google is losing here.
Wild goose chase / brainstorming. For this, I need a broad set of answers; I am looking for a radically different solution. Here, today's Google is inferior to the OG Google, for two reasons.
1. SEO has screwed up the results. A famous culprit is Pinterest, along with many other irrelevant sites that fill the first couple of pages.
2. Self-censoring & shadow banning. Banning of torrent sites, politically motivated manipulation. Even though the topics I search are not political, there is some issue with the results. I can see the difference when I try the same search on Bing or DuckDuckGo.
This thread is yet another thing that makes me fear for the future of humanity.
No, I don't use the hallucination machines to search, and I never will.
I use search engines to search. I use the "make shit up" machine when I want shit made up. Modern voice models are great for IVR menus and other similar tasks. Image generation models have entirely taken over from clipart when I want a meaningless image to represent an idea. LLMs are even fun to make up bogus news articles, boilerplate text to fill a template, etc. They're not search engines though and they can't replace search engines.
If I want to find real information I use a search engine to find primary sources containing the keywords I'm looking for, or well referenced secondary sources like Wikipedia which can lead me to primary sources.
I can only imagine how much slower using an LLM would be, especially when it only gives you a single answer which is not what you're looking for, and you have to keep asking for "something else".
I echo what others say, Kagi is a joy to use and feels just like Google used to be - useful
It depends on what I'm after. I still use regular searches quite a bit.
But a lot of my classic ADHD "let's dive into this rabbit hole" google sessions have definitely been replaced by AI deep searches like Perplexity. Instead of me going down a rabbit hole personally for all the random stuff that comes across my mind, I'll just let perplexity handle it and I come back a few minutes later and read whatever it came up with.
And sometimes, I don't even read that, and that's also fine. Just being able to hand that "task" off to an AI to handle it for me is very liberating in a way. I still get derailed a bit of course, but instead of losing half an hour, it's just a few seconds of typing out my question, and then getting back to what I've been doing.
Exclusively. I don't want to think of a question to ask, or think about phrasing some prompt so I get a useful result; I just want to throw a few related words and terms into a search box, see where that gets me, and then use the results to refine my search terms further.
How do you know the information it generates is correct?
Just now for example I wanted to know how Emma Goldman was deported despite being a US citizen. Or whether she was a citizen to begin with. If an LLM gave me an answer I for sure would not trust it to be factual.
My search was simple: Emma Goldman citizenship. I got a Wikipedia article claiming it was argued that her citizenship was considered void after her ex-husband’s citizenship was revoked. Now I needed to confirm that from a different source and also find out why her ex’s citizenship was revoked. So I searched his name + citizenship and got a New Yorker article claiming it was revoked because of some falsified papers. Done.
If an LLM told me that, I simply wouldn’t trust it and would need to search for it anyway.
I try to use Google. If I put my search question into the Android Firefox url bar and hit enter, Google will show up with some useful answers (if it's not in the AI answer, Google is useless, because there are 5 pages of bullshit before it begins to show me actual web page search results).
But if I then click the Google search text box at the top, and start typing, it takes 20 seconds for my text to start appearing (the screen is clearly lagged by whatever Google is doing in the background), and then somehow it starts getting jumbled. Google is the only web page this happens to.
I actually like their results, they just don't want me to see their results. Weird business model.
LLMs are unreliable transformers of information which is already quite unreliable. So yes, I use Kagi. On average, using a search engine takes less time to achieve the same reliability (of course, perfect reliability is impossible). At least for me, for sure.
Absolutely. As the great Fravia+ (RIP :() once said, it is to your advantage to know where and how to find possible solutions to your problems. And I am willing to go even so far as to say:
The more you trust the models, the less cognitive load you spend on checking and verifying, which leads to what people call AI but which is actually nothing more than a for loop over data loaded in memory. Those who still think that "for message in messages..." can represent any sort of intelligence have already been brainwashed by a new iteration of the "one-armed bandit", where you click regenerate indefinitely with a random seed, distracted from what is going on around you.
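That loop, sketched out (everything here is made up for illustration; the function, replies, and seeds are not any real chat API):

```python
import random

# Illustrative only: a chat "session" reduced to iterating over an
# in-memory message list and sampling a canned continuation per turn.
def regenerate(prompt, seed):
    rng = random.Random(seed)  # a fresh random seed per click of "regenerate"
    canned = ["Sure!", "You're right!", "Here is the answer..."]
    return rng.choice(canned)  # stand-in for model sampling

messages = ["how do I do X?", "are you sure?"]
for message in messages:  # the for loop over loaded data
    print(message, "->", regenerate(message, seed=random.randrange(2**32)))
```

Same seed, same output; a new seed just spins the one-armed bandit again.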
I still use search engines. LLMs are great but often 6-12 months out of date. Usually I search for random coding topics, and asking an LLM will often reproduce the outdated top (most-voted) answer verbatim. However, some nice folks at SO/SE come back later and update the answers or submit new ones, but LLMs often don't return these and instead keep producing spin-offs of the top answer by modifying variable names or adding/removing comments.
Hence, search still remains my hope until SO and the likes decay.
Additionally, many search engines now generate quick summaries or result snippets without a lot of prompt-fu, so my day-to-day usage has actually settled at about a 40:60 (LLM:search) ratio.
I still use search engines, and not ChatGPT or any LLM as my primary behaviour.
Of course, I have used Phind and other LLMs, and the results sometimes are useful, but in general the information they give back feels like a summary written for the “Explain Like I'm Five” crowd, it just gives me more questions than answers, and frustrates me more than it helps me.
Where LLMs excel is when I don't know the exact search term to use for some particular concept. I ask the LLM about something, it answers with the right terms I can use in a search engine to find what I want, then I use these terms instead of my own words, and what I want is in the search results, in the first page.
ChatGPT takes 5-10 seconds to respond. Until it's as fast as Google, I'm not switching.
The question is: are you searching for an answer to something, or are you searching for a site/article/journal/whatever in order to consume the actual content? If you are searching for a page/article/journal in order to find an answer, then the article itself was just a detour, provided the LLM could give you the answer and you could trust it. But if you were looking for the page/article itself, not some piece of information IN the article, then ChatGPT can at best give you the same URL Google did, 100x slower.
Yes, if I need (relatively) accurate answers (with the sources / urls / web pages), I'd use keyword search on Google.
Still have a trust issue with LLM/ChatGPT for facts. Maybe in a couple years my mindset will shift and trust LLM/chatgpt more.
I switched from Google to StartPage twelve years ago and have seen no need to change. I have trialed Kagi, and would move there if Startpage became unreliable. I've not used any LLM as a search engine alternative, and I have no plans to do so.
I use DuckDuckGo and Google for search. Sometimes I find the Google AI answers helpful (even though they are still sometimes inaccurate) for more categorical questions where I simply need an answer or somewhere to start (i.e. "can you eat charcoal?"); I then go ahead and click some links...
I use ChatGPT for text summarization and translation, and Midjourney for slide decks and graphic-design ideation.
Constantly use search. Using chatgpt exclusively is like those kids that only use tiktok
No, I judge ChatGPT by the same standards as I judge humans. It's an inveterate liar, and I much prefer to deal with trustworthy sources of information even if they are slightly less convenient than that smarmy bullshitter.
I still primarily use search engines like Brave Search, DuckDuckGo, Bing, and Google (in that order). I've started sometimes bothering to read the search engines' AI overviews instead of skipping them, although I almost always still click through to their sources for any particular statement.
I just tried ChatGPT and saw that you can ask it to search the web and also can see its sources now. I still remembered how it was last time I used it, where it specifically refused to link out to external sources (looks like they changed it around last November). That's a pretty good improvement for using it as search.
Search has gotten so bad I have replaced like 80% of it with LLMs, typically Claude or Gemini. I've also switched my searches over to duckduckgo whenever I do end up searching for something but even that is on the bad side.
Kagi is a very good alternative to google. When you're actually doing some research and have an exhaustive look at search results, kagi provides much more detailed results than google.
I'd rank kagi > chatgpt > google any day.
If I was habitually asking some llm for nuggets of information I'd have to use web search to verify it in any case.
But in fact I overwhelmingly use search over llm because it's an order of magnitude quicker (I also have google search's ai bobbins turned off by auto-using "web" instead of "all".)
I've used llm "for real" about 3 times in the last two months, twice to get a grounding in an area where I lacked any knowledge, so I could make better informed web searches, and once in a (failed) attempt to locate a piece of music where web search was unsuccessful.
I use a mixture of solutions for web browsing:
- I use RSS to see what's new, and to search it. My RSS client supports search.
- I maintain a list of domains, so when I want to find a particular place I check my list (I can search domain titles, descriptions, etc.). I have 1 million domains [0].
- If I want more precise information, I try to google it.
- I may also ask ChatGPT.
So in fact I am not using one tool to find information. I use many tools, often narrowing down to the tools most likely to have the answer.
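The domain-list lookup step can be sketched roughly like this (the CSV layout with domain/title/description columns and the function name are assumptions for illustration, not my actual setup):

```python
import csv

def find_domains(path, query):
    """Linearly scan a CSV of domains, matching the query against
    each entry's title and description (case-insensitive)."""
    q = query.lower()
    hits = []
    with open(path, newline="", encoding="utf-8") as f:
        # assumed header: domain,title,description
        for row in csv.DictReader(f):
            if q in row["title"].lower() or q in row["description"].lower():
                hits.append(row["domain"])
    return hits
```

Even at a million rows, a linear scan like this is tolerable for interactive use; a SQLite FTS index would be the obvious next step if it ever got slow.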
Yup. I want to see a variety of sources, evaluate them, and understand the answer.
I just very recently tried using ChatGPT for a situation where I’d typically search, draw some conclusions, search again, etc. Basically, planning an open-ended vacation.
The biggest issue is when GPT returns something that doesn’t match your knowledge, experience, or intuition and you ask the “are you sure?” question, it seems to inevitably come back with “you’re right!”. But then why/how did it get it wrong the first time? Which one is actually true? So I go back to search (Kagi).
So for me, LLMs are about helping to process and collate large bodies of information, not final answers on their own.
ChatGPT was 12 months ago for me
I use Claude pretty much exclusively, with GPT as a backup, because GPT errors too much, tries too hard to train on you, and has a lackluster search feature. The web UIs are not these companies’ priority; they focus more on other offerings and API behavior. Which means any gripe will not be addressed, and you have to just go for the differentiating UX.
For a second opinion from Claude, I use ChatGPT and Google pretty much the same amount. Raw google searches are just my glorified reddit search engine.
I also use offline LLMs a lot. But my reliance on multimodal behavior brings me back to cloud offerings.
Agreed, I also use ChatGPT now mostly for searches, because it will pick the best sources that are not content farms, so I don't have to look through garbage to get a result. I started doing this about a year ago and was like "Oh, wow. This could disrupt search engines" and refer to it as "Super Google" now. I always have it link me to the source of what I'm looking for so I don't worry about it hallucinating, for common information I'm looking for reputable sources but don't know which `X` source has them for `Y` info.
When searching something non-programming related, I do. For example, I'm building an addition on our home. Searching for building materials, ideas, and any building science questions I have, I often find LLMs lacking. Even then, maybe 40% of the time Gemini gives me a good enough response.
On the flip side, any time I'm searching for something programming-related (FE, JavaScript in my case), search is a last resort, for when an LLM is not giving me the answer I'm looking for.
This is still shocking to me, I really never thought I would replace my reliance on Google with something new.
I will use search as part of any research-based learning as long as it remains a functional option. As long as there's some chance of the AI giving incorrect data where precision is needed to learn a fact, I will remain leery and manually search those specific areas later on. At the very least.
Operator words still work in Google, albeit less reliably than in the past, and they still do the job.
I see the AI as being there to do the major leg work. But the devil's in the details and we can't simply take their word that something is fact without scrutinizing the data.
I stopped actively using search engines 18 months ago. My first stop is an LLM. Once I understand what I actually need, I do a web search to go to the product/tool website. I do this not because LLMs are that good, but because web-search result quality went way down over the same period.
One interesting trend I like is that I have started using local LLMs much more in the last couple of months. They are good enough that I was able to cancel my personal ChatGPT subscription. I still use ChatGPT on work machines, since the company is paying for it.
Search engines are up to date; I can search for something that happened today. LLMs are several years behind. Until that's fixed, I think we'll keep using the once-very-useful search engines.
I've been tracking this (literally on paper). I've moved around 75% of my queries from search engines to LLMs. The main reason I still use search engines is queries on mobile, where devices make it much easier to search with a traditional engine, plus the omnibar.
Keep in mind that my 75% doesn't count queries where I get my answer from Google's Gemini summary. I'm just guessing, but if you added that in, it would rise to 85-90%.
My thought is if browsers and phones started pushing queries over to an LLM, search (and search revenue) would virtually disappear.
I think this is mainly a symptom of poor alignment between search engines and their customers. ChatGPT works well now, but the plan there seems to be to monetize search-like queries. I fear it won't be long until chat AI agents using this model bring us back to the same frustrations we have today with Google et al.
There is some room for optimism, though. There's been a rise in smaller search engines with different funding models that are more aligned with user needs. Kagi is the only one that comes to mind (I use it), but I'm sure there are others.
I find LLMs good for general knowledge, and clever rubber ducking, but if I have a very specific niche issue in some sort of language or framework, generally search is a better bet to find a stack overflow post of people going through the exact same thing I went through. And even if they don't have solutions, they'll usually link you to more valuable and specific references that you can use.
Though lately for more in-depth research I've been enjoying working with the LLM to have it do the searching for me and provide me links back to the sources.
Everyone still uses search for news, fact checking, and as a natural-language DNS (e.g., "website for golang"). The two worlds are en route to a merger, and Google will most likely come out ahead.
That’s if they can swing the immense ads machine (and by that I mean the ads organisation not the tech) and point it at a new world and a different GTM strategy.
They still haven’t figured out how to properly incentivise content producers. A lazy way would be to display ads that the source websites would display alongside the summary or llm generated response and pass on any CPM to the source.
I recently tinkered with creating an automated install script for a home server. It was good practice for me, and I want to set up a small home server with Pi-hole, Sonarr, and so on. I created it with Claude and ChatGPT, and both performed poorly: huge chunks of code that might run, but create much more mess than they should. Only after going back and reading the documentation the old-fashioned way, with the help of search, was I able to reduce the size of the script and solve many of its problems.
Yes, because I still can't trust the output from LLMs, at all, really.
I'll use Claude about 75% of the time, and then a search engine about 25% of the time. That 25% of the time, I'm usually looking for:
- Specific documentation
- Datasets
- Shopping items
- Product reviews
But for the search engines I use, their branded LLM response takes up half of the first page. So that 25% figure may actually be a lot smaller.
It's important to note that these search engine LLM responses are often ludicrously incorrect -- at least, in my experience. So now I'm in this weird phase where I visit Google and debate whether I need to enter search terms or some prompt engineering in the search box.
ChatGPT is frequently wrong with its answers, so yes search and forums and websites are still the best option.
For example I asked it about rear springs for a 3rd gen 4runner and it recommended springs for a 5th gen.
I recently had someone reach out and tell me they liked a Chrome extension that I built. They found it by asking ChatGPT if there was a way to do X on Y, and it recommended my extension to them.
I was very surprised to hear this, and it made me wonder how much of traditional SEO will be bypassed through LLM search results. How do you leverage trying to get ranked by an LLM? Do you just provide real value? Or do you get featured on a platform like Chrome Extensions Store to improve your chances? I don't know, but it is fun to think about.
The category of search I've stopped doing is the one where I'd append "reddit" to the end. The models are going to do a great job of distilling a wide range of opinion into something super digestible better than the old flow of looking for a bunch of threads on the subject from expert-amateurs and having to read them all myself.
For the people who say they've reduced their search engine use by some large percentage, do you never need to find a particular document on the web or look for reference material?
My native language is Chinese. Most of my colleagues use www.baidu.com as their search engine, but I do not like Baidu because the search results are full of ads. I also use ChatGPT or DeepSeek, but in my work (Linux kernel driver porting) the AI is not good enough; I cannot fully trust it. So in some cases I still use Google to search for answers with English keywords. If you want to know why I use English keywords: because of CSDN, a site that has been polluting the internet for far too long.
Yes, of course, because I find it way more informative to search in broad terms, digest varying sources, and arrive at conclusions than to ask a question that invariably lacks enough context (which I’d find by reading docs or SO posts) to actually produce helpful results, let alone a deeper understanding of the topic than before I started.
Learning is fun! Reading is good for you! Being spoon fed likely-inaccurate/incomplete info or unmaintainable code is not why i got into computers.
It's become kind of tempting to use ChatGPT for that, because you don't have to search yourself for that one post somewhere that describes what you're looking for. But I've found little use for it for anything critical, because it's just wrong way too often. Recently it gave me instructions for using an API that turned out to have been deprecated for two years. But for setting up my parents' iPad, where I was looking for a setting I couldn't find, it's fine.
I refuse to use AI. Artificial Intelligence enables and encourages Natural Stupidity.
I switched my default search engine in my browser to perplexity.ai a few months ago and am super happy with it. The only time I use Google anymore is to specifically visit www.google.com and put site:example.com in the search field, when I know the results I am looking for are only found within that site. I've only had to do that five or six times in the last few months.
And yes, just plain old Google search is completely lackluster in comparison to the perplexity.ai search I get to do today.
I almost always use search-augmented LLMs now, with infinitely better results. Whether I'm wondering about a movie, looking for information on a programming-language feature, or even specifics about niche things, an LLM gets me there much quicker.
Earlier today I was trying to remember the name of the lizard someone tweeted about seeing in a variety store. Google search yielded nothing. Gemini immediately gave me precise details of what I was talking about and linked to web resources about it.
Of course I do and always will. Not once have I considered using an LLM for anything even remotely serious. Generative AI, in any form, is a toy, a novelty. It's fun to play with and make fun of sometimes, but that's about it. And honestly, I'm tired as hell of generative AI being the new hotness that everyone must stick somewhere, somehow - in their products, in their workflows, in their lives. I'm so looking forward to this fad passing.
I've become very, very bad at Google searches. Nothing seems accurate anymore (or rather, precise); I'm hitting vendor/official/party-line stuff, or wordy blogs that say nothing.
I use ChatGPT at home constantly: for history questions, symptoms of an illness, identifying a plant while hiking, remembering a complex term or idea I can't articulate, tips for games, and the list goes on.
At work it's Copilot.
I've come to loathe and mock Google search and I can't be the only one.
Depends on the content. Sometimes I use GPT to find stuff I'm too lazy to search for, when I know Google would more likely waste my time, but generally I still use Google. There are a lot of miscellaneous searches where an LLM would do worse than a search engine (currency exchange rates, stock prices, quick facts, etc.). Though I wish Google had an option to block certain sites from showing up; some searches are just filled with garbage, and I would like to block the whole domain from ever appearing.
I was primarily using ChatGPT & Perplexity. Then I started calling B.S. on them, and more often than not their replies were "You're right! Sorry about that!" Simply saying "that's wrong" reveals a terrifying amount of "hallucination" by AI. Far, far more answers stated confidently as fact turned out to be completely made up.
If I want to play with ideas, I chat with AI. If I need facts, I use search.
It depends on what I'm looking for. If I have a specific thing that I'm just looking for an answer on, then I typically will use ChatGPT. Most of my Google searches are either navigational, things I know Google will return more quickly than ChatGPT ("how old is this actor", "when was xx player drafted"), or cases where I'm interested in browsing results (looking for recipes for borscht, I want to see a few different recipes).
I don’t trust LLMs for search and neither should anyone. I speak as a professional tester. They are essentially untested. It should offend us all that OpenAI puts such unreliable software out there.
Unlike Google or DuckDuckGo, which serve up links that we can instantly judge as relevant to us, LLMs spin stories that sound pretty good but may be, and often are, insidiously wrong. It's too much effort to fact-check them, so people don't.
Yes. But I noticed a trend of people asking stuff like "why doesn't this work" in various community forums, which ends up being them sourcing the method from the likes of ChatGPT / Gemini, etc. Lesson - don't do that, especially when you are going to be wasting others' time on explaining why things didn't work. Search things properly. Read documentation. Even if you use AI, never trust its results.
I don't use ChatGPT or any other AI. If I search for something, I search for authoritative documents on the topic. That is, official docs, articles, books. Asking ChatGPT would be like asking a random person who provides an opinion without any guarantees. It's potentially useful information but needs to be verified. So I need to search again to verify the information and get to an authoritative source.
Almost all of my “searches” are now done by either ChatGPT or Claude.
I'm still using Google for searches on Reddit these days because Reddit's own search engine is terrible.
Only a matter of time before OpenAI starts selling advertising services and weighting specific websites and services higher than others in their response generation for money. We should really outlaw this before it becomes a thing, because once it happens there is no turning back. Unlikely that anyone with the power to do this will actually have the foresight to do it, though.
I mostly use a variety of search engines that get me an answer much faster. Google/Bing frequently point to sites/articles written by AI anyway. Using an LLM directly often gives too much garbage and doesn't stick to just answering my question, so it becomes as useless as a modern Google search. I prefer old-style searching, just using keywords and refining my query, as opposed to having it (mis)interpreted.
Yes. I even pay for Kagi. I very rarely feel the need to ask a chatbot for anything and every time I have I've been disappointed in the results. I'm surprised so many people find them useful.
These are the things I usually search for:
* lazy spell check
* links to resources/services
* human-made content (e.g. reviews, suggestions, communities)
Genuinely curious - those who use chatbots regularly in lieu of search, what kinds of things are you prompting it for?
Google has become useless as a search engine. Most of the results are from Reddit, Quora, or YouTube. I'm not going to watch a 5-minute video to get an answer, and the information from Reddit/Quora is hardly written by experts. I now do a "verbatim" search with -reddit -quora -youtube. I do use ChatGPT to search for websites that have information about what I am looking for.
Yes, but increasingly rarely.
I mostly use Perplexity for search, sometimes ChatGPT. Only when I am looking for something _very_ specific do I use a traditional search engine.
Dropping usage of search engines compounded by lack of support led to me cancelling my Kagi subscription and now I just stick with Google in the very rare occasions that I use a search engine at all. For a dozen searches or so a month, it wasn't worth it to keep paying for Kagi.
Only if I know exactly what I'm looking for but don't want to type the URL or don't remember it. I never actually search for information with a search engine anymore.
The only advantage Google and other traditional search engines have over AIs is that they're very fast. If I know for certain I can get what I want in under 1s I might as well use Google. For everything else, Perplexity or ChatGPT is going to be faster.
Yes, all the time. For reference-level information, I don't trust AI summaries. If I need to know facts, I cannot have even the possibility of a lying auto-complete machine between me and the facts.
Exploratory/introductory/surface-level queries are the ones that get handed to auto-complete.
I like how Kagi lets me control whether AI should be involved by adding or omitting a question mark from my search query. Best of both worlds.
I do search from the OpenAI app with the search option turned on. It has better filtering than Google, which shows all the irrelevant and low-quality links, like fake news, as well. Quality of the search matters. Google does not give me a way to mark which websites I don't want in my search results, so we need to rely on a more authentic source to filter all the chaos on the internet.
There was an interview with Eric Schmidt of Google on PBS around 2006 where he described having multiple results for a search query as a "bug", saying there should be only one answer. It was interesting how OpenAI was first to market with ChatGPT, beating out Google in this space. It's also interesting that the current Google CEO was not asked to step down after the flawed initial launch of Gemini.
Yes, I use Google Search.
But I appreciate and read the Google Gemini AI generated response at the top of the page.
Also, I'm an iPhone user. But I have a Google Pixel phone for dev work.
I find myself now using 'Hey Google' a lot more because of the Gemini responses.
It's particularly fun playing with it with the kids on road trips as we ask it weird questions, and get it to reply in olde english, or speak backwards in French and so on!
Yeah, I still use search engines.
LLMs are amazing for technical research or getting a quick overview and a clear explanation without clicking through ten links. But for everyday searches — checking restaurant hours, finding recent news, digging into niche forums, or comparing products — search engines are still way better.
I don’t think it’s a matter of one replacing the other — they serve different purposes.
Yes. It's still pretty common that I want the official website for a thing, or product reviews written by humans who actually know what they're talking about rather than a single answer to a question. DuckDuckGo is better at delivering it than LLMs are. I also don't want a hallucination, so I appreciate Perplexity's easily-checked citations when I do use an LLM.
Lacklustre doesn't mean ineffective. There was/is hardly any reason to switch other than "new fangled thing" and some specific circumstances (with the myriad of Google switches and tricks that power users here know and use to squeeze out more useful results). And something like a billion people are happily using Google and getting decent results regularly.
We're in a bubble here.
I remember reading a thread on Reddit[1] where someone argued that Google stock will go down because no one uses search engines anymore, but the person was overwhelmed by investors who still believed in Google.
I use search exclusively. I've tried ChatGPT for a number of tasks on various subjects and found its responses to be shallow to the point of uselessness and frequently riddled with errors. It's fun to get it to do dumb shit like rewrite my resume in the style of Warhammer 40k, but for anything serious it's proven to be largely useless.
I've been feeling the same - I use Perplexity mostly and then ChatGPT most of the time (sometimes Grok if I know it's more likely to be based on X info). I wrote about some of the side effects of this new behavior https://paulstamatiou.com/browse-no-more
My use of Stackoverflow is the main casualty of LLMs
I used to use DDG for syntax problems (so many programming languages....) and it usually sent me to SO.
Now I use DeepSeek. Much friendlier; I can ask it stupid questions without getting shut down by the wankers on SO. Very good.
I still use DDG to interface with current events and/or history. For history, DDG is primarily (though not only) an interface to Wikipedia.
Just because I think some people are mistaking what is being talked about: OpenAI has ChatGPT search, which is distinct from ChatGPT.
Here's the difference as per chatgpt search https://chatgpt.com/share/67f5ae28-5700-800d-b241-386462a307...
ChatGPT et al. are quite neat, but the interfaces are not great at all. For example, I'm going on a trip to Paris in a couple of months. I can ask an LLM what to do, but it will just give me a list. There are no visuals, click-throughs, maps, tips, experiences. Blending "AI" with search and a "reader" could create a much better experience.
I'm also using ChatGPT with its search enabled, or Perplexity, for searching almost anything. Way more accurate and to the point.
I feel like Google search will become obsolete in a short time, and they will have to make big changes to their UX and search engine.
Although I guess most of their user base still relies on the old ways, so changing it right now would have a huge impact on older users.
Way less than I used to. I have been a pretty advanced user since before Google. The combination of AI and quick auto links to Wikipedia articles on iOS has replaced much of it. The one place I still use it extensively is local searches for businesses, and when trying to find a brand or business that I know of, if they don't have an app.
Whether I reach for AI or Search depends on two questions. Am I looking for a site or information? If I'm looking for information, how easily can I verify it?
Websites have all kinds of extra context and links to other stuff on them. If I want to learn/discover stuff then they are still the best place to go.
For simple informational questions, all of that extra context is noise; asking gpt "what's the name of that cpp function that does xyz" is much faster than having to skim over several search results, click one, wait for 100 JavaScript libraries to load, click no on a cookies popup and then actually read the page to find the information only to realise the post is 15 years old and no longer relevant.
There are times where I know exactly what website to go to and where information is on that site, and so I prefer that over AI. DDG's bangs are excellent for this: "!cpp std::string" and you are there.
Then there's the verifiability thing. Most information I am searching for is code which is trivial to verify: sometimes AI hallucinates a function but the compiler immediately tells me this and the end result is I've wasted 30 seconds which is more than offset by the time saved not scrolling through search.
Examples of things that aren't easy to verify: when's this deprecated function going to be removed, how mature is tool xyz.
Of course, there are also questions about things that happened after the AI's knowledge cutoff date. I know there are some that can access the internet now, but I don't think any are free.
If Google provided an option to disable AI search results, I'd happily turn it off.
I'd also happily turn off several other search features, more directly tied to revenue, which is probably why they don't like adding options. I'm sure their AI will be selling products soon enough. Got to make those billions spent back somehow.
I still use Google. However, the most annoying thing I find is the AI-generated response. I use Google as a translator by adding "dict" to the search. I also use the search function on old-fashioned forums. It works. Yesterday I tried ChatGPT on the math homework of one of my kids. The result was just crazy wrong. Complete garbage.
If you believe you are still Googling too much, I made an extension to ease the transition:
https://chromewebstore.google.com/detail/comparative-chatgpt...
I’m a Kagi subscriber; I like it but I use it less and less.
The more time goes by, the more I use both ChatGPT and Claude to search (at the same time, to cross-check the results), with Kagi used either to check the results when I know strictly nothing of the subject, or for specific searches (restaurants, movie showings…).
I’ve almost completely stopped using Google.
> ... I exclusively use ChatGPT for any kind of search or question.
This constrains the search space to whatever training data set used for the LLM. A commercial search engine includes resources outside this data set.
Using a search engine for responses to natural language questions is of dubious value as that is not their intended purpose.
Yes, I still use a search engine, specifically Google, for two reasons: (1) habit, and (2) its AI search results with links to the actual content. Without the AI results answering my questions, I probably would not have stayed with Google. Google also seems to mitigate hallucinations by only showing content with links.
I use search for things where factual accuracy is critical (i.e. the address of a specific store, a speech by a specific person, the lyrics of a song).
I use LLMs for things where accuracy ranging anywhere from 0% to 100% is not a problem: when I need to get a feel for something, or a pointer to some resource.
I keep getting AI responses that contradict their own primary sources. I ask for a UK-specific site and get given an American entity, which says so in its own body text. It is clearly not British; it doesn't even misrepresent itself as non-American.
Until the false results rate drops, it can't be trusted.
Yes, for finding local information or to search for specific things. For example, to find events or shops near me, or finding reviews.
I use ChatGPT for learning about topics I don't know much about. For example, I could spend 15 minutes reading wikipedia, or I could ask it to use Wikipedia and summarize for me.
Of course. Most of the time I'm searching for a physical place, a company's website, a product, or news. ChatGPT is terrible at giving any of those answers. It's rare that I want to know some sort of random fact. ChatGPT also doesn't cite sources like, say, Wikipedia does.
Yes, all my searching is using Google and I haven't had any issues with the results or finding what I want.
Accuracy issues aside, a draw that I feel towards using e.g. ChatGPT is that the information is displayed in a more consistent way. When using a search engine and opening a bunch of the results in tabs, I have to reorient myself to each site because they all have different visual designs.
I have reduced my traditional search engine use by, I'd guess, 90%.
Having said that, I use ChatGPT exactly like a search engine. If I want to find info I will explicitly enable the web search mode and usually just read the sources, not the actual summary provided by the LLM.
Why do this? I find if I don't quite know the exact term I am looking for I can describe my problem/situation and let ChatGPT make the relevant searches on my behalf (and presumably also do some kind of embedding lookup).
This is particularly useful in new domains, e.g. I've been helping my wife do some legal research and I can explain my layman's understanding of a situation and ask for legal references, and sure enough it will produce cases and/or gov.uk sources that I can check. She has been impressed enough to buy a subscription.
I have also noticed that my years (decades!) of search engine skills have atrophied quicker than expected. I find myself typing into Google as I would to ChatGPT, in a much more human way, then catch myself and realise I can actually write much more tersely (and use, e.g. site:).
I wish search engine algorithms/SEO were versioned, which would allow you to get a more consistent experience. The same applies to system prompts of closed LLMs.
The most important part for me is understanding how to communicate with each system, whether it's google-fu or prompting.
I've done the exact same thing in the last few months. If it's a search that ends in a question mark, I go to ChatGPT or another AI. Sometimes I even find myself going to Google out of pure autopilot habit, and then catch myself and go to ChatGPT. Old habits...
I feel the same as you. But sometimes I've found that ChatGPT is obviously wrong, so since then I've tried to stay aware of this.
A mix of Startpage, Perplexity, Qwant, 4get.ch, Teclis and Crowdview. ChatGPT occasionally for coding, never for the details of specific topics: it confabulates a lot and shapes answers to be politically correct. Google focuses on ads, so I abandoned them a long time ago.
Yes, all the time. I use ChatGPT to help me get a general sense of direction when I'm learning something new. Once I have a few potential paths to explore, I combine regular Google searches with ChatGPT prompts to make sure my understanding aligns with reality.
Every day. There are some questions I have that are too vague and descriptive to ask a traditional SE, so I direct those to an LLM, but on the whole I don't want to have to specify to my search engine to not give me a 500 word essay every time I use it.
It depends. If I am searching for an answer with one variable, I use search engines, but if there are multiple variables, then GPTs are better.
* adult cat sleep time -> search engines
* my cat drops his toy into his water and brings it to me -> GPT
I still use Google a lot. I don't bother chatting when a few keywords will bring me to the resource I want (90%+ of my searches).
Besides, Google has some convenient features that I frequently use, e.g., currency/unit/timezone conversion, stock chart.
Overall, good! But LLMs don't work as expected in some cases. For technical solutions, they usually don't take the software version as a parameter, which can cause issues. So I always have to cross-check the solution in forums and documentation.
90% Kagi. 10% Copilot and ChatGPT.
I'm using Kagi as my default, falling back to Google (typically via '!g' in Kagi) for some technical queries. I use ChatGPT 4o several times a day, but typically not as a 1:1 replacement for web search.
I have had a number of occasions where I had a vague query that I was sure an LLM would handle better than a search engine, and yet I ended up failing to get a result with the LLM and Google came to my rescue (their search engine, not their AI!)
My searches have become site specific.
- What other people think of product XYZ: Reddit
- Subject-specific/historical: Wikipedia
- News-specific: my favored news sources
- Coding-related: I start with ChatGPT. To validate those answers I use Google.
I use search the same as I did before LLMs. I don't find ChatGPT to be useful for finding information. I do use Claude extensively for writing code and explaining errors via Github Copilot in VSCode, but I still primarily use Google.
This is pretty much the reason why I think Google should divest of its search engine to private equity, and go all in on AI query while it is still high in valuation.
It will also help get rid of the antitrust issues that the chrome browser has created
I use DDG for normal stuff, many times a day. I use LLMs for difficult to find stuff or to discover keywords.
They can be very useful, especially when looking for something closely adjacent to a popular topic, but you got to check carefully what they say.
Yes - I haven't used ChatGPT so far. I initially tried to not read what the Google AI summarized, since I did not trust it, but these days, if I know enough about the subject to identify errors, I do read the summaries.
I really think this is a "drink the koolaid" phenomenon.
Personally, I don't want an LLM synthesized result to a query. I want to read original source material on websites, preferably written by experts, in the field in which my search is targeted.
What I find in serious regression in search, is interpretation of the search query. If I search for something like "Systems containing A but not B" I just get results that contain the words A and B. The logical semantics of asking for "not B" is completely ignored. Using "-B" doesn't work, since many discussions of something that doesn't have B, will mention the word B. These errors didn't seem to be so egregious historically. There seemed to be more correct semantic interpretation of the query.
I don't know if this has to do with applying LLMs in the backend of search, but if LLMs could more accurately interpret what I'm asking for, then I would be happy to have them parse my queries and return links that meet my query specifications more accurately.
But again, I don't want a synthesized result, I want to read original source material. I see the push to make everything LLM synthesized prose, to be just another attempt to put "Big Tech" between me and the info I'm trying to access.
Just link me to the original info please...
p.s. Something like the "semantic web" which would eliminate any 3rd party search agent completely would be the ideal solution.
I use search engines every day (primarily kagi). But I will use a fast llm with a search tool for some things, like providing context about a news story etc, (primarily gemini 2.0 flash with "grounding" on).
Yes, because Google also has AI and it's integrated into my browser bar; ChatGPT is just a secondary tool to me.
If I need something more complex like programming, talk therapy, or finding new music then I’ll hop on over to Chat.
Yes, but far less. I find I primarily use them when I want to go straight “to the source”.
Like I could interrogate an LLM about something technical “X” or I could just search “X documentation” and get to the ground truth.
The only time I use search engines now is when I’m screen sharing and feel obligated not to show my five different ChatGPT tabs. I glance over the links and feign interest, "Oh, that’s great..."
Yes, for holiday planning. At work we are massive Confluence/Rovo users, so that is my go-to, with very occasional search engine use.
Our projects heavily use platform tools so I am looking there rather than Googling.
Yep, I continue to use multiple search engines and strategies to find stuff. LLMs have been added as a tool, but in general they mostly allow me to expand my context rather than provide outright answers.
I use Kagi (paid) to search for websites/articles/PDFs and ChatGPT (4o) otherwise.
I started using Kagi in an attempt to de-googlify, but it turns out that it's just downright good and now I prefer it.
ChatGPT search has completely taken over my search; it's just better. It's great that you then get to ask follow-up questions to narrow in or ask for more. A common thing: looking for a product, it will find it, and you can ask for other options, or ask it about the reviews for a specific product (it will give a summary and links, and may even embed a YouTube review), best pricing, nearest retailer... oh, none in your area, best place to order it from... etc. I just have zero need for the classic search engine any more.
I use ChatGPT for queries like, "translate XYZ to English," particularly for short phrases where I don't care if it's exactly right; good enough works.
For everything else, I still use search.
I’ve been told enough falsehoods from AI, that it hasn’t earned my trust yet.
On the other hand, Google search is starting to be useless without curating my queries. And their AI suggestions are full of lies.
I would never use LLMs to find things. It's like asking a human to find things: they mostly suck at it, and unlike a human, an LLM doesn't have the awareness to know it sucks.
For simple searches (e.g. the webpage of company X in Y) I use DuckDuckGo, because the results are good and the CO2 footprint low. For more exploratory or interactive queries I use Copilot.
I don't use any AI at all. I would prefer to keep my critical thinking abilities intact.
I use Kagi as my search engine and GitHub code search for searching for code examples.
I haven't found a reason to use AI yet.
I'm very happy with my Kagi subscription. The results get me exactly what I want, and I can easily check the sources and downrank stuff I don't want.
I average around 1400-1600 searches per month.
I use Copilot, which is built into Windows 11 now, almost exclusively. It's right there and gets the job done 99.9% of the time for me. That other 0.1% is when I use Google.
I'm reaching for ChatGPT almost exclusively now, but if I talk to others I say I "googled it" because if I say I got it from ChatGPT the uninitiated don't trust it
Depends on the topic/goal. If I need concrete info, I'll use Kagi, and for more general questions ("should I be concerned about my cough"), I'll use an LLM.
I still use DDG several times a day (google is awful now). But I have transitioned over to Perplexity for certain searches, since it provides links to the source material it used.
I definitely still use search engines too. Googling topics with site:reddit.com is still a better way to get genuine opinions on e.g. product comparisons and recommendations.
Yes. Kagi. Or sometimes DuckDuckGo if I'm on a computer that isn't my own so I don't want to log into Kagi. Never Google though, that's gone to shit.
I mainly use search engines indirectly via Copilot (the app). It uses Bing in the background to give current results, so I can ask it about what happened yesterday.
I tend to use Google to research and go through multiple websites, but use AI exclusively to solve tech issues. So it really depends on the activity for me.
Yes, a lot. Search engine for me is not just for simple questions. I still want to search for specific articles, websites, etc. Want to filter by date...
Google and Bing are much worse now. I can't find anything. I can't remember the last time a discussion [platform] made it into the results. It's kinda comical how even the ads are terrible. I think they've discovered it is better for them if I keep searching rather than buy anything?
Twitter and reddit are garbage.
I sometimes use youtube search then fast forward with the subs on and the sound off.
The internet has ended. It's been a fun ride, thanks everyone.
6510 slaps hn with a large trout
I got so disappointed with Google search results that I started looking into alternatives. I tried DuckDuckGo, Phind and ChatGPT, and while those have their strengths, they weren't the best fit for me.
I often search for solutions to specific (often exotic) problems, and LLMs are not the best at handling them.
DDG doesn't have the best results either; I'm not sure they're better than Google's. It definitely has a different set of issues.
Finally seeing another positive comment on HN about Kagi, I decided to pull out the wallet and try it. And it's great. It feels like Google from the 2000s.
I decided to replace my subscriptions to Anthropic and ChatGPT with Kagi, where I have access to both providers and also Gemini, Meta and others. So, bottom line, it's actually saving me money.
Their Ki assistant (an LLM that iterates with multiple search queries when looking for answers) is actually neat. In general it's the best of both worlds: depending on what you need, you can use the LLM interface or classic search, and both work great.
Yes. But more for finding something I already know exists but don't remember the exact url (or are too lazy to type it in). Like going to some docs.
Search for what is known, llm for exploration.
Boils down to the fact that the internet is full of shitty blogspam that search happily returns if your question is vague.
I have not used any public AI, and I primarily use search. I use a local LLM running on two old Tesla GPUs for help with coding, but that’s about it.
Almost exclusively with "reddit" + something. Although I am using Deep Research more and more even for some of that.
Search is still dominant, using AI as search (perplexity, etc) is still growing.
SparkToro (no affiliation) had a post or video about this somewhere very recently.
As a side note, we're seeing a small but meaningful amount of traffic referred from ChatGPT. Many people essentially use it as a search tool.
Yes, I still use search engines, although my use of them has decreased.
What I tend to use LLMs for is rubber ducking or the opening of research on a topic.
Daily. On top of general search engines, I have a bunch of custom ones added to Firefox so I can skip the Google/Bing/Yandex step.
I always have. Why would I want to use AI garbage?
Just started trying to ask DeepSeek questions. Search engines return too many marketing materials, bloggers' copy-and-paste nonsense and, recently, completely stupid AI-autogenerated sites.
It's easy to filter them when you're working in a familiar domain, but when trying to learn something completely new, it's better to ask DeepSeek for a summary and then decide what to explore.
I gave LLMs a bit of a go, but on a couple of occasions they wasted a bunch of time by giving incorrect answers, so my trust levels are pretty low.
LLMs and search engines have different use cases, albeit having overlap. Kind of similar to traditional computing vs machine learning.
Sure. It's sometimes faster to do "allowable attributes for CSS visibility" and visually scan the results for the keywords.
When they became infested with Ad engines = use declined. Now Chatxyz seems good, but Ad engines loom = evolutionary shittification
I started with AltaVista, then DuckDuckGo, and now Kagi. Recipes only via ChatGPT, and reverse image search with Yandex, of course.
The answer is yes, I still use, but not Google nor Bing. Relying exclusively on LLMs sounds a bit dangerous and naive.
Until when, I don't know.
I use mostly Perplexity, sometimes a "normal" LLM and, very rarely, Google. It depends on what I'm looking for.
2025 Google is trash. Kagi is in.
LLM is okay for some use cases, but the amount of times it hallucinates bullshit makes it not trustworthy.
Of course we do! Imagine a search engine taking 10¢ for every query you enter; what a ridiculous replacement.
Grok (legal, historical, current events), Perplexity (Claude, coding questions), Gemini (via Fabric / terminal: coding questions, YouTube, URL & book summaries). I only google by accident, out of habit, and immediately regret the outcome (spam ads, spammy content marketing, salacious content).
Yes; I never got into using ChatGPT much, for example.
But I will say I have started to just use the AI summary at the top of Google, even though it can be wrong. Like when I searched "why is the nose of a supra so long" and it started talking about people's faces vs. the car. Granted, yeah, it's not technically a nose, but still.
Somewhat. Much less frequently than before.
With LLMs being good enough, I go to them for what I used to use Wikipedia and Stack Overflow for.
I'm still using it.
If an AI answer leaves me unsure, or I suspect the content was AI-crafted, I'll search to cross-validate.
Have you tried matterrank.ai? It might give you what you want, but as a search engine instead of a chat interface.
I use Kagi search + Instant Answer. Instant Answer most often gives me the result I'm looking for.
Most engines now include LLM output on top of search results. Solely using an LLM is just not practical.
My go-to is always Google. After I have exhausted the resources, and my patience, I turn to ChatGPT.
I rely purely on RSS-based search and email for asking questions to friends.
No, just joking. I use libraries to read books.
Kagi all day every day. Except for maps, I use dedicated maps app for places/businesses/etc.
I tried using ChatGPT to find my porn, and it refused to answer, so yeah, right back to a search engine.
According to my Kagi usage stats I have made 8907 searches and 9 AI interactions since June last year.
Google for quick searches (as a doorway to reddit and SO)
Perplexity for anything complex
Yandex for pics (Google pics got ridiculously bad)
I still use Google for everything; the difference is now I'm just unhappy with the results.
Not really, no. My peers and I were constantly opening ChatGPT when probing new topics. But now, with Gemini integrated into Google and the hallucinations of LLMs, seeing the SEO results along with the AI summary has become my go-to choice. The one thing I find extremely frustrating (which I hope Google fixes) is not being able to continue the conversation with Gemini if I have follow-up questions.
I think this also stems from a new design paradigm emerging in the search domain of tech. The content results and conversational answers are merging – be it Google or your Algolia search within your documentation, a hybrid model is on the rise.
Almost never unless I am using it to navigate to a specific known domain that isn’t in my history
About a year ago it was 100% search engine. Today it's closer to 50/50 search/ChatGPT.
I find it to be wrong so often, I'd be really careful assuming it's actually correct...
I don't use Google anymore, but I do use Google Scholar, sci-hub, Yandex and sometimes DDG.
I primarily use Brave Search. I'll yap with Claude for ideation every so often though.
People keep saying this but I keep using Google and I keep being happy with my results.
ChatGPT manages to be even worse than Google, and let me tell you: that's a low bar
I trust humans more than LLMs. They're like whatever was most popular mixed with hallucinations. I can gauge human credibility better.
I usually search for specific terms, often in quotes. My extra terms are variations on how people might word the question or answer.
Over time, I notice many sites are reliable for specific topics. I'll use site: operator to limit the search to them initially. If it's a research paper, adding "paper" and PDF usually links to it immediately. If government, it's often on a .gov page. And so on.
Search works well for me with these techniques for most of my needs. There has certainly been a drop in quality, with an increase in work, due to them optimizing for whatever generates ad clicks. That gives me a lot of sites that appear to be helpful but actually aren't. I can usually spot and weed them out in one session for a given topic, though, since click-farm sites are recognizable (intuitable) once you're used to them.
Finally, I try to follow the law, since my Savior, Jesus Christ, requires it where possible. AIs are usually trained with massive copyright infringement, and their outputs may be copyright infringement too. Search engines link me to the content creator to use directly. The creator also often says if or how they want it shared, which I try to respect when I see it mentioned.
For years I was screaming at Google because I just wanted answers rather than endless blue links through which to research my answer. LLMs were such an amazing innovation in that they provide answers instead, but now I'm finding my way back to keyword search. I think I'm still searching about 15% of the time.
I never use ai for search. It's too unreliable and biased.
Search indexers are still way faster than LLMs currently, so I use Kagi or SearXNG.
if you're using chatgpt, how are you ever actually sure if what you're being told is something a human actually said? are you really cross-checking everything, even things that aren't immediately falsifiable?
For me Google serves several roles.
1. Bookmark manager. I can type "maven download sources", click on Baeldung, and copy-paste the command from there. I've done that 100 times and I'll do it a 101st time. There are plenty of webpages that I know exist and know how to find; I'm too lazy to actually bookmark and organize them, and Google works just fine for me.
2. Search for new knowledge in general domains. For this category of queries I sometimes use ChatGPT, but not always. It's hard to formulate the rules, but I have a feeling for which tool is more useful for a given question. Sometimes it's just quicker to type a few keywords into Google than to ask a full-fledged question of ChatGPT and wait for it to return a full-fledged answer.
3. Search for new knowledge in arcane domains. For example, I might need fresh info about my city, where to buy a laptop, etc. ChatGPT might know about the US, but its knowledge of Kazakhstan is definitely limited, shallow, and outdated, so the real web comes to the rescue here.
4. Using Google-specific functions. For example, I very often use queries like "site:bla.com something", because website-hosted search is braindead in 100% of cases and I wouldn't even bother with it.
For me, ChatGPT's main function is to create new content. For example, I might want to start a new Go project, so I'll ask ChatGPT to write a hello-world Golang tutorial for me, then ask follow-up questions like what identifier naming I should follow, etc. There are, of course, hundreds of similar articles on the Internet, but ChatGPT delivers tailored data much faster.
Youtube's search engine is still good for finding songs I want to listen to.
I think the kids today use TikTok and such for searching, which is mental.
I use DuckDuckGo mostly, but have a Kagi subscription for other searches.
For local searches and product searches yes. For mostly everything else, no.
I use a search engine if I can more or less precisely describe what I want in a few words.
If I find that I really need to be more extensively descriptive, or to actually ask a question, I'll use an LLM, but that's pretty rare.
Here's an example of the latter that I did yesterday (this is Google Gemini):
"Hello! There is a poetic device where one line of a poem can be read both as the continuation of the previous line and as the first part of the following line. What is that called?"
"Hello! That poetic device is called an anastrophe or sometimes a hyperbaton, but these terms are more generally used for the inversion of the usual order of words or clauses in a sentence.
"While anastrophe or hyperbaton can create the effect you're describing, there isn't one single, widely recognized term specifically for a line that functions as both the end of one thought and the beginning of another in the way you've described.
"However, the effect you're describing is closely related to the idea of enjambment (also known as "run-on lines"). Enjambment is when a line of poetry ends without punctuation or a natural pause, and the thought continues onto the next line.
"..."
I proceeded to provide an example, saying "Points if you know what song this is ;-)", and Gemini confidently misidentified the song. (The correct song is "Black Cow" by Steely Dan; Gemini told me that the lyrics I'd quoted were from Bob Dylan's "Positively 4th Street," which doesn't contain any lyrics that even resemble what I'd quoted.)
Yes, and years later I still get annoyed that I need to force them to search, instead of making self-humiliating attempts to answer some question that isn't even there, while feeding me a bunch of spam blogs and irrelevant but somehow profitable results.
Stopped using chatgpt a while ago, use search engines almost exclusively
Searching google still gives more current info on changing topics, no?
I have not started to use LLMs, so yes I still use search engines.
No, and there are two major benefits of using LLMs instead of search engines:
1. No prompt about decline/accepting cookies every time I want to look something up.
2. No ads.
The results are mediocre in the same way Google's are.
I don’t use LLMs for anything if I can avoid it. So far so good.
I usually do both at the same time. Ironically because Google.com is the shortest path to Gemini.
The other day I was also searching for something dumb: how to hammer a nail into concrete.
Google will find me instructions for a hammer drill... no, I just have a regular hammer. There's a link from wikiHow, which is okay, but I feel like it hallucinates as much as AI. Actually, I just opened the link and the first instruction involves a hammer drill too. The second one is what I wanted, just wordier than ChatGPT.
Google then shows YouTube which has a 6 minute video. Then reddit which has bad advice half the time. I'm an idiot searching for how to hammer nails into a wall. I do not have the skill level to know when it's BS. Reddit makes me think I need a hammer drill and a fastener. Quora is next and it's even worse. It says concrete nails bend when hit, which even I know is false. It also convinces me that I need safety equipment to hit a nail with a hammer.
I just want a checklist to know that I'm not forgetting anything. ChatGPT gives me an accurate 5-step plan and it went perfectly.
I use DDG, but mostly with the bangs, so I search in the place I'm likely to get the answer I want. Which can be limiting, because I'm always using the same sites (e.g. imdb for media information) and won't discover new ones, and sometimes I don't know where I can find what I'm looking for.
For more general searches, depending on the topic, DDG is close to useless because of link farms, AI slop, and results that aren't really what I'm looking for (some of the keywords weigh too much). But I suspect this is a common problem across all search engines, so I'm not looking for a replacement. It is frustrating, though. I can't believe the information doesn't exist; it's just that it's unreachable.
I don't search using AI. Generally I'm not looking for information that can be distilled into an "answer", and there's also the fact that DDG isn't feeding me AI answers (I think? Maybe I'm just not paying attention).
I use Kagi for searching the web.
I use an LLM to generate regular expressions.
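A sketch of what that looks like in practice. The pattern below is my own illustration (matching ISO-8601-style dates), not actual LLM output; I always test whatever the model hands back before using it:

```python
import re

# The sort of one-off pattern I'd ask an LLM to draft, then verify myself
# (hypothetical example: match ISO-8601-style dates like 2024-03-15).
iso_date = re.compile(r"\b(\d{4})-(0[1-9]|1[0-2])-(0[1-9]|[12]\d|3[01])\b")

assert iso_date.search("released on 2024-03-15")      # valid date matches
assert not iso_date.search("bogus date: 2024-13-40")  # month 13 rejected
```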
Search engines? I'm still using web rings. ;-)
Yes, Kagi. Don’t use ChatGPT at all. Sometimes use Claude
Why disengage your brain and trust AI when searching?
Perplexity is by far the best web search replacement
I use perplexity more than anything else these days.
I have replaced all my search need with perplexity
Yes, I use Google. I haven't tried ChatGPT yet.
it really depends on whether i know what i'm trying to search for or am just researching the general notion. for the latter i'll use llms
dunno, google searches don't seem that different to me compared to 10 years ago. i don't use paid services.
I have repeatedly tried to use LLMs as search engines, both general (like ChatGPT) or more focused on specific domains.
I have not been impressed by the results. In my experience, LLMs used this way generally output confident-sounding information but have one of two problems the majority of the time:
- The information is blatantly wrong, from a source that doesn't exist.
- The information is subtly wrong, generated from part of a real source via a predictive chain that doesn't actually exist.
I have found them about on par with the reliability of a straightforward, unconstrained Google search, but that is more a condemnation of how poor Google's modern performance as a search engine is than an accolade for using LLMs for search.
Yes. I find DuckDuckGo more on point and faster for what I need. Also, I don't have a free-usage limit.
If I don't know something, I'm not gonna trust ChatGPT to get me the right answer. It may do so 90% of the time, and make shit up the other 10%. Google sucks compared to what it was (and DDG still sucks worse, which is why I still use Goog as a fallback), but I still know how to sift through the results to find something truly informative (if it's out there; some searches, like whether some product is a scam, have been SEO'd to oblivion by the scammers).
Oh, and a major reason why Google sucks now? AI enshittification. They basically jettisoned their finely tuned algorithm in favor of "just run it through the LLM sausage grinder".
All the time, and no, it's not Kagi (not saying it's bad; I just don't need it). I use Google, and on good days DDG, but DDG is really bad for anything local I want to search, and so is Kagi (in my limited trial). By local I don't mean my city but my entire gigantic country.
As for those AI chatbots: they're anything but useful for general search purposes beyond a bit of surface-level answers, which you can't fully trust because they (still) hallucinate a lot. I tell ChatGPT, "Give me a list of good X. (And don't you make anything up!!!)", yeah, with those bangs, and it still makes shit up.
I moved over to Kagi last month.
Liking it a lot.
Brave Search + Google Scholar for me.
I mostly use Phind and Brave Search.
Open ended stuff -> LLMs
Rest? Still search engines
Yes I use a mix of Kagi and Ecosia
Considering I have been using different search engines since Excite and AltaVista, the state of modern search is worse than when crawlers were in their infancy. It is so front-loaded with SEO that a search for a simple doc reference gives you ten pages of links back to the sales and marketing pages for a dozen applications that do something similar.
AI is a better search for now because SEO and paid prioritization in search hasn't infested that ecosystem yet but it's only a matter of time.
I dropped Google search years ago, but every engine is experiencing enshittification.
Yes, but not for knowledge.
Nope. I've despised Google Search for years and thought it would eventually be replaced by a better search engine. At one point I even switched to a paid Kagi subscription for a few months, and it was sooo much better than Google. I only stopped using Kagi because I've completely switched to ChatGPT now. Kagi is a really great search engine, but for my daily use ChatGPT is more convenient and faster.
I got a few months of free Kagi with a Pragmatic Engineer newsletter subscription and I'm enjoying it. It reminds me of old Google before it got polluted. Kagi can't do anything about the web itself being polluted, but the search experience is good. I use Gemini a lot as well.
I'm very disappointed in Apple that changing the default search engine in Safari requires you to install a Safari extension. Super lame stuff.
I mainly use Google or Duckduckgo (depending on what I'm searching for), but I think search quality has been declining because of AI slop.
Which is kind of a problem, especially for Google, because their incentive to limit AI slop in search results is reduced when AI is one of their products, and they stand to benefit from search quality declining across the board in relation to AI answers.
Yes. I still use them.
I am frequently disappointed with the results I obtain from search engines, but in some of these cases I can find the things I'm looking for by tweaking the advanced search settings.
On the other hand every time I've used language models to find information I've gotten back generic or incorrect text + "sources" that have nothing to do with my query.
I do, but LLMs generate so much slop that ordinary search results are less useful by the week. At this rate, low-background steel will be plentiful in comparison to human-written true information. Stack Overflow may be dead, but its legacy will live forever. We never knew how good we had it until we lost it.
Yes, daily. (Kagi)
Half and half, for non-political stuff, I still lean on search engines and topic-specific sites.
For political stuff, I avoid Wikipedia and search engines in general, and ask Grok/ChatGPT, specifying the specific biases I want it to filter out and known pieces of misinformation for it to ignore.
for me chatgpt has completely replaced google.
yes, most of the time I want a link to a page
I go out of my way to avoid any LLM generated thing, ESPECIALLY in search results, whether for coding, product research or news. The world is drowning in misinformation and misdirection, I don't need any additional hallucinations.
99% Perplexity
Not really. I use Grok; if it fails, Google will fail as well, so I switch to searching the project's documentation. It's not that Grok/LLMs are awesome, it's just that Google is useless.
I find the regurgitated slop of ChatGPT to have an unsatisfactory signal-to-noise ratio, plus too many confident lies, so I prefer direct searches.
Gemini is similar.
I sometimes use phind and find myself jumping directly to the sources.
Consider paying for kagi.
I use searxng
I use Kagi, and it's worth every penny. Google has enshittified itself into irrelevance, and ChatGPT is too ponderous.
Kagi is like Google in its prime: fast, relevant, and giving a range of results.
Now and then, but I also quite often use perplexity.ai for search. Sometimes it's just too convenient to let a robot sift through the search results for the information I want.
I think it depends on your use cases:
1. *Browsing*
This can be completely avoided. Here's what you can do in Firefox, with some tweaks, to achieve no-search browsing:
- Remove search suggestions in (about:preferences#search)
- Use the [History AutoDelete](https://addons.mozilla.org/en-US/firefox/addon/history-autod...) addon to remove searches from your history. This keeps old searches in your history from polluting the results
- Go to (about:config) and set `browser.urlbar.resultMenu.keyboardAccessible` to `false`
Now when you Ctrl + L into the tab, you will get results from your history, bookmarks, and even open tabs. And the results are only a few Tab presses away; no need to move your hands off the keyboard.
If you don't like the results and want to launch a search anyway, just press Enter instead and it will search with the default search engine. A cool trick is to type % + space in the awesome bar to move around open tabs. You can also specifically look into bookmarks with * and history with ^.
P.S.: Ctrl + L, Ctrl + T, Ctrl + W, and Ctrl + Alt + T are your best friends.
P.P.S.: Now you can also learn more about custom search engines: https://askubuntu.com/a/1534489
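If you keep a user.js in your profile folder, the same tweaks can be pinned there so they survive restarts. The pref names below are my best understanding of current Firefox and worth double-checking in about:config:

```javascript
// user.js: persist the no-search tweaks across restarts.
// Pref names are assumptions on my part; verify each one in about:config.
user_pref("browser.search.suggest.enabled", false);               // drop search suggestions
user_pref("browser.urlbar.resultMenu.keyboardAccessible", false); // the about:config step above
```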
2. *Quick answer* on a topic. This is the second most common use case and what Google has been trying to optimize for for a long time. Say you want to know how many people there are in Nepal, or what the actual percentage of blue-eyed people in Germany is. This is where LLMs shine, I think, but to be fair Google is just as good for this job.
3. *Finding resources* to work with. This one is a bit on the way out because it's what people who want to understand want, and we are probably few. It is valuable because those resources don't just give an answer but also provide the rationale/context/sources for the answer. But.
On the one hand, most people just want the answer, and most people can include you if, even though you deem yourself a curious person, you don't have the time right now to actually make the effort to understand. On the other hand, LLMs can craft tutorials and break down subjects for you, which makes those resources much less valuable. I kind of feel like the writing is on the wall, and the future of this use case is "curating" search engines that will give you the best resources and won't be afraid to tell you "nothing of value turned up" instead of giving you trash. Curious to hear your thoughts about that.
Yes: Kagi
I use search engines because I want a source for the info I get that I can assess the trustworthiness of.
Sadly, search is massively enshittified by AI-generated, SEO'd crap...
Yes, for the sole reason that LLMs are very arrogant and I don't want to end up similarly delusional about whatever it is I'm researching.
Yes, as borked as Google search results have become, it doesn't make shit up like LLMs do.
Only for porn
Yes
yes.
No, I go straight to GPT. Because I’m not usually searching for a webpage. What I’m really looking for is to learn through the course of an interactive discussion. Where I can ask any question no matter how stupid it is. Imagine a patient elderly colleague who will never lose their temper or mock you. Sometimes they get things wrong, but that’s where critical thinking comes in.
Search is primarily a portal - you know a particular resource exists, you just don't know its exact URL.
You hear about this new programming language called "Frob", and you assume it must have a website. So you google "Frob language". You hear that there was a plane crash in DC, and assume (CNN/AP/your_favorite_news_site) has almost certainly written an article about it. You google "DC plane crash."
LLMs aren't ever going to replace search for that use case, simply because they're never going to be as convenient.
Where LLMs will take over from search is when it comes to open-ended research - where you don't know in advance where you're going or what you're going to find. I don't really have frequent use cases of this sort, but depending on your occupation it might revolutionize your daily work.