
I think last time I tried this, I gave up as soon as it got to the point of hand-waving the hardware and telling you to run notebooks on a third party's web service. How is that democratizing AI? It's the very opposite. Not good.

What is the intended audience here? University-level students will learn most of this within their programs if they're interested in AI, so it's not really them. Is the intended audience "coders"? If so, most of these coders will have to somehow get their employer on board (typically a corporate entity very much NOT interested in building something inside a third party's ecosystem) or do it themselves.

Hence, I want to take an RTX 3080+, 64+ GB of RAM, and a big SSD, and work through the training myself. Not learn some basics on somebody else's platform (come on, even from the POV of open source, running both training AND notebooks on some third party's private platform is completely against the idea) and call it a day. What use is that? That may be enough if you want to be a cog in somebody else's machine, but not if you want to do something useful by yourself (I say "may" because big tech generally isn't interested in your "mad AI skillz" unless you also have a student-loan-backed piece of paper proving you spent the last couple of years learning it).

There will always be smart individuals and talented small teams that can successfully integrate AI into their products, but it's not thanks to the courses like this.

If you're going to aim at coders, there has to be a clear path demonstrated from beginning to end: from starting your first notebook on your local dev machine and running training on your local training machine, all the way to setting up inference in the final app (a .NET app or whatever).
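To make the complaint concrete, the end-to-end loop being asked for is simple in principle: train locally, persist the model, load it for inference in the final app. Here is a minimal, hedged sketch of that loop using only the Python standard library and a toy 1-D linear model trained by gradient descent; the model, file name, and hyperparameters are all illustrative, not taken from any particular course.

```python
# Sketch of the local train -> save -> inference pipeline.
# Toy model: fit y = w*x + b with per-sample gradient descent.
import json


def train(samples, epochs=200, lr=0.05):
    """Fit w, b on (x, y) pairs entirely on the local machine."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in samples:
            err = (w * x + b) - y
            w -= lr * err * x
            b -= lr * err
    return w, b


def save_model(path, w, b):
    """Persist the trained weights; stands in for a real checkpoint."""
    with open(path, "w") as f:
        json.dump({"w": w, "b": b}, f)


def load_and_predict(path, x):
    """What the final app would do: load weights, run inference."""
    with open(path) as f:
        m = json.load(f)
    return m["w"] * x + m["b"]


if __name__ == "__main__":
    # Synthetic data from y = 2x + 1.
    data = [(i / 10, 2 * (i / 10) + 1) for i in range(20)]
    w, b = train(data)
    save_model("model.json", w, b)
    print(load_and_predict("model.json", 3.0))  # should be roughly 7.0
```

The same shape scales up: swap the toy trainer for a real framework, the JSON file for a proper checkpoint or exported model, and the `load_and_predict` step for the inference code embedded in the shipped app; none of it requires a third party's hosted notebook.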


