
Amazing


Thanks! :) I feel like there is a dreadful lack of local-only apps that are runnable on a single simple server, now that everything is overly distributed. Should we bring back more P2P apps?


Go’s restraint in adding new language features is a real gift to its users. In contrast, Swift feels like a moving target: even on a Mac Studio I’ll occasionally fail to compile a simple project. The expanding keyword list and nonstop churn make Swift harder to learn—and even harder to keep up with.


Amazing. What caused you to look for a solution without a particle?


I watched the search for the Higgs Boson and the search for Cold Dark Matter carry on in parallel for decades.

The former was clearly actual science: they had a theoretical particle, they knew what it did, it had a place that made sense in the Standard Model, they had an estimate for the energy range in which they could find it, they built an instrument to look for it, and they found it.

The latter... well, it was clearly epicycles. Endlessly tweakable, with six free parameters, not in the Standard Model, a bunch of different guesses as to what it actually was, a bunch of different energies at which it might be found – oh dear, not there, well it must be at a much higher energy then – always on the brink of discovery but never actually discovered...

And then, as I began researching my book on cosmological natural selection, I could see that an evolved, fine-tuned universe was going to have startling emergent-looking properties built into its developmental process. Baryonic matter was going to pull off some weird shit, as the interaction of extremely fine-tuned parameters led to highly unlikely-looking outcomes. These would look like inexplicable anomalies, if your fundamental assumption was that we lived in a random and arbitrary one-shot universe.

And cold dark matter started to look awfully like the kind of thing you would have to invent to save the old paradigm...

So as I developed my approach, I assumed dark matter was an error, and did my best to explain everything using fine-tuned parameters, and baryonic matter only.


"Though BGP supports the traditional Flow-based Layer 3 Equal Cost Multi-Pathing (ECMP) traffic load balancing method, it is not the best fit for a RoCEv2-based AI backend network. This is because GPU-to-GPU communication creates massive elephant flows, which RDMA-capable NICs transmit at line rate. These flows can easily cause congestion in the backend network."


The whole end-to-end system seems pointless. If you want to learn ethics, you can do so with ChatGPT alone. It can provide you with interesting questions, review your papers, argue against you, etc. The university is providing no value.


Make the smoothest learning gradient possible. With kids it helps a lot to increase the complexity over time. Riding a full bike has a steep learning curve, and going from training wheels to no training wheels is an abrupt transition. Avoid large discontinuities.


Any way to get the 3D assets from this?


Try this: https://github.com/Rilshrink/WebGLRipper

(No clue if it works; I can't actually use it since I'm in Europe.)


What 3D assets? I don't see anything suggestive of a 3D renderer here.


This is great. I am working on a robotics application, and this seems like a better abstraction than alternatives such as local messaging servers. How do you deal with something like backpressure, or with not keeping up with incoming data?


The lib is based on channels and inherits their backpressure behavior. Simply put, if no one reads on one side of the pipeline, nothing can be written on the other side. Still, it's possible to add buffering at arbitrary points in the pipeline using the rill.Buffer function.
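For anyone unfamiliar with how that plays out, here is a minimal plain-Go sketch (illustrative only, not rill's actual API): an unbuffered channel blocks the producer whenever the consumer lags, and giving the channel a capacity adds slack at that stage, which is roughly what inserting rill.Buffer at a point in the pipeline does.

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        // Unbuffered stage: each send blocks until the consumer is ready,
        // so a slow reader automatically throttles the writer.
        // Swapping in make(chan int, 64) would add slack at this stage,
        // roughly what rill.Buffer does at a chosen point in a pipeline.
        stage := make(chan int)

        go func() {
            for i := 0; i < 5; i++ {
                stage <- i // blocks here whenever the consumer lags
                fmt.Println("produced", i)
            }
            close(stage)
        }()

        for v := range stage {
            time.Sleep(100 * time.Millisecond) // deliberately slow consumer
            fmt.Println("consumed", v)
        }
    }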


I wonder if it will hallucinate tax loopholes. Based on its training data of existing laws, it seems likely.


I'm not saying I used ChatGPT/Bard exclusively to do my taxes, but it does a surprisingly good job explaining concepts on the tax forms that TurboTax just doesn't even bother helping you with.


Seems odd they don't reference this on the page. Instead they list:

"If you need a programmatic interface for tokenizing text, check out the transformers package for python or the gpt-3-encoder package for node.js."

with the links:

https://huggingface.co/docs/transformers/model_doc/gpt2#tran...

https://www.npmjs.com/package/gpt-3-encoder

