
Is there any other type of development where you have to inform the US gov about it when you start developing it?

The EO says you must inform the gov when you start training the AI if it's a risk to public health, etc.

How would one determine if it's a risk? And who determines the risk? (I assume the government.)

So effectively that means the gov determines what is right/wrong, true/false. Not good.



My friend, it is quite literally the government's job to decide what is right or wrong through drafting and enacting laws.

I am not saying that requiring you to inform the government when you're training models is right. It's an asinine rule, made by people who have no idea what they're talking about and are severely out of touch, and the laws they've enacted were based on fear, often propagated by OpenAI itself. But drafting laws is still their job.


> My friend, it is quite literally the government's job to decide what is right or wrong through drafting and enacting laws.

It is not their job to say what is true or false; that's propaganda. They also have no right to dictate what you train your AI to say.

Would love to see this EO challenged before the SC. It violates the 1A.




