Hacker News

I think it's irresponsible and unethical to "require" students to use generative AI tools. These tools are highly controversial and pose severe cultural, moral, and increasingly legal dilemmas. It's like requiring students to take certain types of drugs to participate in a class, or to learn how to use certain types of firearms. Unless the class is literally "how to take drugs" or "how to use guns," everyone would understandably flip their lid, yet this certainly wasn't a "how to use ChatGPT" class but another topic entirely (entrepreneurship, if I'm not mistaken).

If I discovered a class I was planning to attend in academia was going to require the use of AI, I would launch a protest immediately. This is a violation of my human rights. If you're OK with using generative AI tools, that's your prerogative, but forcing me to use them is like forcing me to eat food grown with pesticides if I'm a staunch organic foods consumer…or possibly even worse, forcing a vegetarian to eat meat. Way-ay-ay out of line.



I understand this clearly is something you feel strongly about, but do you think this is somehow drastically different from using other tools for doing secondary research? As in, using an index tool (like the Dewey decimal system, or Google) to surface previously performed work, so you can then summarize it?

There is always a line to draw, why is this the point where you think it is important to draw a line?


That's actually a great example of why I agree with that person.

It is totally reasonable to say "you may use AI tools in this class", or by comparison, "you must do research in this class, here are some ways you can do research, it is not cheating to use Google".

It is a very different thing to require the use of AI tools by someone who can competently write an essay on their own, or to require the use of Google by someone who happens to have a bunch of relevant sources bookmarked. That's moving from "can you do this" to "can you use this specific tool", and away from the actual point of the class.

Another example of this I can remember is courses that required a specific IDE, which invariably led to a quarter of the class time being wasted explaining buttons in a now long-outdated version of Eclipse in excruciating detail, instead of covering any actual programming concepts or even Java-specific ones.


I think if you take that to another extreme, you could say that forcing someone to write code with a computer versus a typewriter is an arbitrary rule.

But there are actual reasons to enforce the tool - in that case, so you can execute your code in a similar environment to your peers. It is specific, but that's the point of the restriction.


I'd feel the same way about a job that requires the use of generative AI, nothing to do per se with a classroom setting or even education.


Well, jobs vs classes are very different. I'd also probably leave a job that gave me arbitrary assignments and tests and made me pay to be there, too, but that's generally fine in a classroom setting.


Not sure if you are joking.


This feels like something Sydney would say.


how is using generative AI tools immoral?


There is considerable controversy about using artists' works as training data for creating an AI that competes with those artists.

Just to take a top google hit on the subject: https://www.nbcnews.com/tech/internet/lensa-ai-artist-contro...




