I wouldn't assume that it's the same, no. For all we knock them, unconscious biases seem to get a lot of work done: we all know real things that we somehow learned from other, unreliable humans. It's not a perfect process at all, but it's one we're experienced at and have lifetimes of intuition for.
LLMs seem like people but aren't, and in some ways they carry many of the signals of a reliable source, so I'm not sure how those processes will map onto them. In fact, I'm skeptical of anyone who is confident about it either way.