Leynos's comments (Hacker News)


I got the same thing trying to sign up for Oracle Cloud.

Thankfully, Oracle aren't the only cloud platform. Which makes telling my friends and colleagues how shit Oracle Cloud is, whenever the topic of cloud providers comes up, even funnier.

On the other hand, Apple's position of dominance here makes their refusal to answer a GDPR subject access request all the more galling.


You do need to demonstrate eligibility to take up work in the UK, and that is usually done by passport or some other form of ID demonstrating your nationality and right to work.

You shouldn't need ID to vote or access healthcare.


There are plenty of code review tools out there (e.g., CodeRabbit, Sourcery, etc.).

I haven't used this one, but you definitely should use the ones that work for you.


This is an advertised feature of ChatGPT, and you can switch it off if you want. https://help.openai.com/en/articles/11146739-how-does-refere...


Does the switch work though?


Same. I actually have in my system prompt, "Don't be afraid of using domain specific language. Google is a thing, and I value precision in writing."

Of course, it also talks like a deranged catgirl.


Hey, you leave my sycophantic robot friend alone.


Kimi K2 seemingly has a much more up-to-date training set.


Having larger models is nice because they have a much wider sphere of knowledge to draw on. Not in the sense of using them as encyclopedias. More in the sense that I want a model that is going to be able to cross reference from multiple domains that I might not have considered when trying to solve a problem.


I think that you have to be OpenAI (or X, Google or Anthropic) to be able to fine tune models of this scale through reinforcement learning at present.

Look at Tinker for an example of where things might be heading, though (https://tinker-docs.thinkingmachines.ai/).

At present, I get the sense that reinforcement learning at scale is the current battleground (and has been for most of 2025). But over time, we also see the general models adopt the skills taught to the specialised models. Look at how the learning that went into codex-1 made its way into GPT-5.

