What a coincidence: the "protect the children" narrative got amplified right around the time profiling became necessary for OpenAI's profits. Pure magic.
I get why you're questioning motives; I'm sure the timing is convenient for them.
But age verification is all over the place. Entire countries (see Australia) have either passed laws or have laws moving through their legislative bodies.
Many platforms have voluntarily complied. I expect that by 2030, there won't be a place on Earth where accessing online platforms doesn't require not just age verification, but identity verification. If it weren't for all the massive attempts to subvert our democracies by state actors, and even by political movements within democratic societies, it wouldn't be pushed so hard.
But with AI-generated videos, chats, audio, and images, I don't think anyone will be able to post anything on major platforms without their ID being verified. Not a chat, not an upload, nothing.
I think consumption will be age-vetted, not ID-vetted.
But any form of publishing will be linked to ID. Posting on X. Anything.
I've fought for freedom on the Internet; I grew up when IRC was a thing and knew more freedom on the net than most people using it today. But when 95% of what is posted on the net is placed there with the aim to harm? To harm our societies, our peoples?
Well, something's got to give.
Then combine that with the great mental harm that smartphones and social media do to youth, and... well, anonymity on the net is over. Like I said at the start, likely by 2030.
(Note: having your ID known doesn't mean it's public. You can be registered, with ID, on X or on YouTube, so the platform knows who you are. You can still go by MrDude as an alias...)
Yeah, but I tried switching to minified JSON on a semantic labelling task and saw a ~5% accuracy drop.
I suspect this happened because most of the JSON in the pre-training corpus was pretty-printed, so the LLM was forced off its most likely token path and also lost all the "visual cues" of nesting depth.
This might happen here too, though maybe to a lesser extent. Anyway, I'll stop building castles in the air and try it sometime.
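For concreteness, the difference I mean, on a toy record:

    const record = { label: "positive", spans: [{ start: 0, end: 12 }] };

    // Minified: one line, no whitespace.
    JSON.stringify(record);
    // => {"label":"positive","spans":[{"start":0,"end":12}]}

    // Pretty-printed: closer to most JSON the model saw in pre-training,
    // and the indentation doubles as a visual cue for nesting depth.
    JSON.stringify(record, null, 2);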
If you really care about structured output, switch to XML. Much better results, which is why all the AI providers tend to use pseudo-XML in their system prompts and tool definitions.
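To illustrate, a prompt asking for pseudo-XML output might look something like this (the tags and wording are my own invention, not any provider's actual format):

    // Sketch: request pseudo-XML instead of JSON for structured output.
    const inputText = "The service was excellent.";
    const prompt = `Label the following text. Respond in exactly this format:
    <label>
      <category>positive | negative | neutral</category>
      <confidence>a number between 0 and 1</confidence>
    </label>
    Text: ${inputText}`;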
The example from this article looks more like "unspecified" behavior than "undefined". The title made me expect nasal demons; now I'm a bit disappointed.
Probably not, because whatever Google is calling its remote attestation scheme this week (SafetyNet? Play Integrity?) has a way to check where the app was sourced and whether it has been altered.
Google is an asshole for making this. When Microsoft first proposed a scheme like that for PCs under the name Palladium, everyone knew it was a corporate power grab. Somehow, it got normalized.
Every time I see people mention things like this in Node vs Bun or Deno conversations, I wonder if they've even tried them.
>The single executable application feature currently only supports running a single embedded script using the CommonJS module system.
>Users can include assets by adding a key-path dictionary to the configuration as the assets field. At build time, Node.js would read the assets from the specified paths and bundle them into the preparation blob. In the generated executable, users can retrieve the assets using the sea.getAsset() and sea.getAssetAsBlob() APIs.
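In practice that means maintaining something like this sea-config.json (the main/output/assets fields are from the docs quoted above; the specific keys and paths are made up), and that's before the separate steps of generating the blob and injecting it into a copied node binary with postject:

    {
      "main": "dist/start.cjs",
      "output": "sea-prep.blob",
      "assets": {
        "en.json": "lang/en.json",
        "default-config.json": "config/default.json"
      }
    }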
Meanwhile, here's all I need to do to get an exe out of my project right now, assets and all:
> bun build ./bin/start.ts --compile --outfile dist/myprogram.exe
> [32ms] bundle 60 modules
> [439ms] compile dist/myprogram.exe
It detects my dynamic imports of JSON assets (language files, default configuration) and bundles them into the executable accordingly. I don't need a separate file to declare assets, declare imports, or do anything other than run this one command line. I don't need to go through the various bundlers to find one that works with my CLI tool and converts its ESM/TypeScript to CJS; Bun just knows what to do.
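The kind of code I mean, sketched with made-up paths:

    // `bun build --compile` statically analyzes these imports and embeds
    // the matching JSON files into the executable.
    const defaults = await import("./config/default.json");

    // Non-static specifiers get handled by bundling every file the
    // pattern could match (here, everything under ./lang/).
    const locale = process.env.MYAPP_LOCALE ?? "en";
    const messages = await import(`./lang/${locale}.json`);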
Node is death by a thousand cuts compared to the experience Bun offers.
Node also adds quite a bit of startup latency over Bun, and it's just not pleasant for writing CLI scripts.
Yeah, just switch to a non-NPM-compatible and nonexistent ecosystem just because you need to specify a few parameters in a config.
Anthropic is trying to IPO at a valuation of $300B, so if neither their engineers nor their AI can be bothered to do this, then maybe they're not as good as they think they are.
>Yeah, just switch to a non-NPM-compatible and nonexistent ecosystem just because you need to specify a few parameters in a config.
Yeah... you do not know what you are talking about.
I run my test suites on both Node and Bun because I prefer to maintain compatibility with Node, and it works just fine. I use some of Bun's APIs, like stringWidth, and have a polyfill module that detects when it's not running on Bun and loads an alternative library through a dynamic import.
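The polyfill is only a few lines; a sketch, with the "string-width" npm package as my assumed fallback:

    // Use Bun's native Bun.stringWidth when present; otherwise lazily
    // load an npm fallback so it's only paid for when running on Node.
    export async function stringWidth(s: string): Promise<number> {
      const bun = (globalThis as any).Bun;
      if (typeof bun?.stringWidth === "function") {
        return bun.stringWidth(s);
      }
      const { default: width } = await import("string-width");
      return width(s);
    }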
Not NPM compatible? Bun is literally a drop-in replacement for NPM; in fact, you could use it solely as a replacement for NPM and keep Node for actual execution if that's your thing. bun install is much faster than npm install, and it uses less disk space if you have many projects.
The developer experience on Bun is so much better it isn't funny. It remains prudent, I agree, not to depend on it too heavily, but Bun is many things: a runtime, a set of additional APIs, a replacement for the dog-slow npm, a bundler, etc. Pick the parts that are easily discarded if you fear Bun's gonna go away, and when using their APIs for their better performance, write polyfills.
>Anthropic is trying to IPO at a valuation of $300B, so if neither their engineers nor their AI can be bothered to do this, then maybe they're not as good as they think they are.
Or maybe you should take a cold, hard look in the mirror first and think twice before opening your mouth, because the more you write, the more ignorance you show.
They evidently evaluated Node.js in comparison to Bun (and Deno) earlier this year and came to a technical decision about which one worked best for their product.
Microsoft owns npm outright (via GitHub) and controls every aspect of the infrastructure that Node.js relies on. It also sits on the board of the Linux Foundation (and is one of its few platinum members), which in turn controls the OpenJS Foundation. It is certainly MS.
Ads, obviously