When writing about complex topics, it's tough to get below grade 11. Sometimes the complex terms simply can't be replaced. Has anyone tried the Hemingway app? How do you solve this challenge?
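I'm not sure exactly which formula Hemingway uses, but the standard Flesch-Kincaid grade level is close enough to see what moves the number. Here's a minimal Python sketch with a deliberately naive syllable counter (everything in it is my own rough illustration, not Hemingway's scoring); it shows how a handful of long technical terms drags the grade up even when the sentences around them are short:

```python
import re

def count_syllables(word: str) -> int:
    # Very rough heuristic: count vowel groups, with a small silent-'e' correction.
    word = word.lower()
    count = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and count > 1:
        count -= 1
    return max(count, 1)

def flesch_kincaid_grade(text: str) -> float:
    # Naive sentence and word splitting; good enough to see what moves the score.
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    if not sentences or not words:
        return 0.0
    syllables = sum(count_syllables(w) for w in words)
    # Standard Flesch-Kincaid grade-level formula.
    return 0.39 * (len(words) / len(sentences)) + 11.8 * (syllables / len(words)) - 15.59

jargon_heavy = "Pharmacokinetic variability means the same dose behaves differently in different patients."
plain = "The drug wears off at different speeds in different people."
print(flesch_kincaid_grade(jargon_heavy))  # much higher grade
print(flesch_kincaid_grade(plain))         # much lower grade
```

The two terms in the formula are the two levers: shorter sentences lower the first, simpler words lower the second. So when a term genuinely can't be replaced, shortening the sentences around it is usually the only lever left.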
By comparing GPT’s responses to global survey data from 65 countries, the authors show that GPT’s psychology most closely mirrors people from WEIRD societies (Western, Educated, Industrialized, Rich, and Democratic). In fact, the further a country is from the U.S. culturally, the less GPT behaves like its population.
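For anyone trying to picture that relationship mechanically, here's a toy sketch (the arrays are made up; this is not the paper's data, code, or exact metric): per country, take a cultural-distance-from-the-US score and a GPT-vs-survey mismatch score, then correlate the two. A positive correlation is the shape of the finding being described.

```python
import numpy as np
from scipy.stats import pearsonr

# Toy illustration only: not the paper's data or pipeline. Assume we had, per country,
# (a) a cultural-distance-from-the-US score and (b) a mismatch score, e.g. the mean
# absolute difference between GPT's answers and that country's survey means.
rng = np.random.default_rng(0)
n_countries = 65
cultural_distance = rng.uniform(0.0, 1.0, n_countries)  # 0 = culturally close to the US
gpt_mismatch = 0.2 + 0.6 * cultural_distance + rng.normal(0, 0.1, n_countries)  # made-up trend

r, p = pearsonr(cultural_distance, gpt_mismatch)
print(f"r = {r:.2f}, p = {p:.3g}")  # positive r: more distant from the US -> less GPT-like
```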
Hundreds of weight loss and diabetes jab users report pancreas problems. Some cases of pancreatitis reported to be linked to GLP-1 medicines have been fatal.
It’s pretty wild to think intelligence might have evolved more than once in vertebrates. The example of birds and mammals both developing complex brains is fascinating. It makes you wonder how much untapped potential animals might have in terms of intelligence that we haven’t fully understood yet.
While mammals and birds are typically far more intelligent than other vertebrates, it shouldn't be forgotten that there is an overlap in intelligence between the smartest reptiles, e.g. varans (a.k.a. monitor lizards), and the least intelligent mammals and birds.
So even outside of mammals and birds there are cases of brain evolution toward greater complexity, even if they do not reach the typical mammal/bird level, and they likely correspond to a somewhat different brain structure as well.
(Off topic: in my opinion, "reptiles" is a term properly applied only to lizards and snakes. Not only are crocodiles and turtles more closely related to birds than to lizards and snakes, but none of them crawl, as the word "reptile" implies. Present-day crocodiles are awkward on land only because they are secondarily adapted to aquatic life; their terrestrial ancestors were much more agile, as some crocodiles still demonstrate by being able to gallop even now.)
> Not only are crocodiles and turtles more closely related to birds than to lizards and snakes
Most fish (bony fish) are more closely related to us than they are to sharks and other cartilaginous fish. Technically we're all air-breathing, walking bony fish.
> It’s pretty wild to think intelligence might have evolved more than once in vertebrates
It's an utter bombshell if true. It means intelligence isn't "difficult" for evolution to arrive at, significantly increasing the odds of other intelligent life in the universe.
Though it's easy to dismiss as science fiction, this timeline paints a chillingly detailed picture of a potential AGI takeoff. The idea that AI could surpass human capabilities in research and development, and that it could set off an arms race between global powers, is unsettling. The risks—AI misuse, security breaches, and societal disruption—are very real, even if the exact timeline might be too optimistic.
But the real concern lies in what happens if we’re wrong and AGI does surpass us. If AI accelerates progress so fast that humans can no longer meaningfully contribute, where does that leave us?
I think MIT's breakthrough is a huge step forward, but it also highlights just how tricky drug delivery still is. Scaling up nanoparticle production is essential if we ever want these treatments to be more than just a cool lab experiment. But even with this progress, we’re still stuck with the same old problem—getting drugs exactly where they need to go without wrecking the rest of the body.
That’s why I find DNA nanobots fascinating, even if they still feel a little like science fiction. Instead of just improving how we make nanoparticles, DNA nanobots could take targeting to another level—programming them to only release drugs when they hit specific cancer markers. The idea of a ‘smart drug courier’ that doesn’t dump toxins everywhere is really appealing. Will it actually work in humans at scale? Who knows. But five years ago, I wouldn’t have expected MIT to make mass nanoparticle production look feasible either.
The thing is, if someone prepared for an interview and cracked it, they have to be good at the job. I have realised that it's often our perception of them that makes them bad at their jobs. It's similar to how we usually blame motivation when the actual problem is clarity about the role or the job. If we believe in the motive behind the work and have clarity about our role in it, motivation stops being an issue.
We make jobs bad by failing to properly communicate the incentive behind them: what good they bring, and to whom. Most of the time people don't want to work because they don't see the ROI in it.