Well, let me put it this way. I have a MacBook with 64 GB of unified memory, so I can experiment with making an old-fashioned x.ai clone (the meeting-scheduling one, not the "woke ChatGPT" one), among other things. I love how Apple Silicon makes things vroomy on my laptop.
I do know that getting those working in a cloud-provider setup is a "pain in the ass" (according to ex-AWS friends), so I don't personally have much hope of seeing that happen in production.
However, the premise makes me laugh so much, so who knows? :)