Soon, very soon, there will be no way to discern the output of AI from that of a human, just as there is no way to determine, given only the sum, whether I added two numbers by hand on paper or pressed the "+" on my calculator. Would you refuse to read an article because someone used a calculator to obtain its results? What about numerical integrations? Would you refuse to read a book because a printing press made it? Perhaps books should carry labels that say "WARNING: this book was printed by a machine".
What if, over the next few decades, science is creatively destroyed, so that no science remains that isn't in some way produced using AI? I'm quite critical when using these tools, but I'm not a Luddite.
1) There could be something novel about this, even if it's just the way it all hangs together.
2) If not, then it could be mundane but consistent output, which is encouraging relative to previous interactions.
3) It could be wrong, but if so, it is quite convincingly wrong; I have not yet checked the details.
Do you agree? Tell me what you think.