Hacker News

That's not the type of citation they're talking about. Gemini uses a tool call to the Google search engine and thus can cite and read proper links. You're talking about an LLM that just hallucinates citations which don't exist.


Is Gemini the same thing that shows up in the Google Search AI box? Because that thing is wrong all the time.

Just the other day I was searching for some details about the Metal graphics API, and something weird caught my eye as I scrolled past the AI stuff. Curious, I engaged, asking more basic questions, and the answers were just... wrong. Even right now, “what is the default vertex winding order in Metal?” is wrong. Or how about “does Metal use a left- or right-handed coordinate system for the normalized device coordinates?” I mean, this is day-one intro-level stuff, easily found on Apple’s dev site.

And the “citations” are ridiculous. It references some Stack Overflow commentary or a Reddit thread where someone asks a similar question. But the response there is “I don’t know about Metal, but Vulkan/D3D use (something different)”. Seriously, wtf.

GPT-4 gives the same wrong answers with almost the same citations. GPT-5 gets it right, at least for the examples above.

Either way, it’s hard to trust it on things you don’t know when you can’t trust it on things you do.


No, it's gemini.google.com


Then what is the LLM that shows up at the top of Google search results?


Maybe it's Gemini, maybe it's another one of their models, but I'm specifically talking about LLMs like Gemini or, for a better example, Perplexity, which crawls web pages first and then cites them, so that there aren't bogus citations.


Oops, well it felt nice to vent anyway.



