
> data-to-paper is a framework for systematically navigating the power of AI to perform complete end-to-end scientific research, starting from raw data and concluding with comprehensive, transparent, and human-verifiable scientific papers (example).

Even if this thing works, I wouldn’t call it “end-to-end scientific research”. IMHO the most challenging and interesting part of scientific research is coming up with a hypothesis and designing an experiment to test it. Data analysis and paper writing are just a small part of the end-to-end process.



The very next paragraph:

> Towards this goal, data-to-paper systematically guides interacting LLM and rule-based agents through the conventional scientific path, from annotated data, through creating research hypotheses, conducting literature search, writing and debugging data analysis code, interpreting the results, and ultimately the step-by-step writing of a complete research paper.


> from annotated data, through creating research hypotheses

Then it’s all just wrong: automated p-hacking. You’re supposed to start with the hypothesis, not generate it from the data you’re about to publish.
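The p-hacking concern is easy to demonstrate: test enough hypotheses against the same pure-noise data and some will clear p < 0.05 by chance alone. A minimal sketch (my own illustrative code, not from data-to-paper; it uses the normal approximation to a two-sided t-test rather than a real t distribution):

```python
import random
import statistics

def count_false_discoveries(n_hypotheses=200, n_per_group=50, seed=0):
    """Run many two-group comparisons on pure noise and count how many
    look 'significant'. Every null hypothesis is true by construction,
    so every hit is a false positive. The 1.96 cutoff approximates a
    two-sided p < 0.05 for large samples."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_hypotheses):
        a = [rng.gauss(0, 1) for _ in range(n_per_group)]
        b = [rng.gauss(0, 1) for _ in range(n_per_group)]
        se = ((statistics.variance(a) + statistics.variance(b)) / n_per_group) ** 0.5
        t = (statistics.mean(a) - statistics.mean(b)) / se
        if abs(t) > 1.96:  # 'p < 0.05' under the normal approximation
            hits += 1
    return hits

# With 200 noise-only hypotheses we expect roughly 5% of them
# (around 10) to come out 'significant' purely by chance.
print(count_false_discoveries())
```

So a pipeline that mines a dataset for hypotheses and then tests them on that same dataset will reliably find publishable-looking "results", which is exactly the objection above.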


More to the point, you're supposed to start with an observation that your current theory can't explain. Then you make a hypothesis that tries to explain the observation and collect more observations to try to refute that hypothesis, if you're a good falsificationist, that is. That doesn't seem to be the process described above. Like you say, it's just a pipeline from data to paper: great for writing papers, but not much use for science.

But I guess these days, in many fields of science and in popular parlance, "data" has become synonymous with "observation" and "writing papers" with "research", so.



