Yes, I also wonder about this! Progressing from children's books to scientific papers and so on. Could it learn, e.g., language structure faster in a pre-training stage?
Also, somehow one needs to define a proxy for generalization to compute a loss and do backpropagation.
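To make that concrete, here's a minimal sketch of what curriculum pre-training could look like in PyTorch: documents are fed easiest-first, and the usual next-token cross-entropy serves as the proxy loss for generalization. Everything here is hypothetical and just for illustration: `TinyLM` stands in for a real transformer, and sequence length stands in for a real difficulty score (e.g. a readability metric).

```python
# A minimal sketch of curriculum-ordered pre-training; all names here
# (TinyLM, length-as-difficulty) are illustrative assumptions, not a
# real training recipe.

import torch
import torch.nn as nn

class TinyLM(nn.Module):
    """Toy next-token language model; stands in for a real transformer."""
    def __init__(self, vocab_size=1000, dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.rnn = nn.GRU(dim, dim, batch_first=True)
        self.head = nn.Linear(dim, vocab_size)

    def forward(self, tokens):
        h, _ = self.rnn(self.embed(tokens))
        return self.head(h)

def curriculum_stages(corpus, difficulty, stages=3):
    """Yield the corpus in chunks, easiest documents first:
    children's books before scientific papers."""
    ranked = sorted(corpus, key=difficulty)
    stage_size = max(1, len(ranked) // stages)
    for start in range(0, len(ranked), stage_size):
        yield ranked[start:start + stage_size]

model = TinyLM()
opt = torch.optim.AdamW(model.parameters(), lr=3e-4)
# Next-token cross-entropy is the stand-in "proxy for generalization".
loss_fn = nn.CrossEntropyLoss()

# Fake corpus: each "document" is a tensor of token ids; difficulty is
# stubbed as sequence length (longer = harder) purely for illustration.
corpus = [torch.randint(0, 1000, (n,)) for n in (8, 32, 16, 64, 24, 128)]

for stage in curriculum_stages(corpus, difficulty=len):
    for doc in stage:
        tokens = doc.unsqueeze(0)              # (1, seq_len)
        logits = model(tokens[:, :-1])         # predict the next token
        loss = loss_fn(logits.reshape(-1, 1000), tokens[:, 1:].reshape(-1))
        opt.zero_grad()
        loss.backward()
        opt.step()
```

Note the loss itself doesn't change across stages; only the data ordering does, which is what makes curriculum learning cheap to try but hard to evaluate against plain shuffled training.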
Yeah. This comment is profound to me. The internet works differently with these tools.
I haven't used the deep research features much, but their ability to hash out concepts, build knowledge, or even provide an amplified search experience is something...
I get why this comment was downvoted, but I also get where you're coming from - yes, these models are getting increasingly good at understanding nuance and figuring out where to look when you don't even know what to begin searching for.
But the downside is that in some cases, if you leave it to a generalist system instead of a professional community, you end up digging in the wrong direction, which is counterproductive.
Sometimes getting burnt is a good way to learn not to, though...