The same dynamics from school carry over into adulthood: early on it’s about grades and whether you get into a “good” school; later it becomes the adult version of that treadmill: publish or perish.
I’ve cut my NYT consumption down to maybe 5 minutes a week, and somehow today it still managed to wreck my headspace. I’m not a historian, but my memory reached into that rarely-used middle school history drawer and pulled up this:
Why don’t we have something more “torrent-like” for search?
Imagine a decentralized network where volunteers run crawler nodes that each fetch and extract a tiny slice of the web. Those partial results get merged into open, versioned indexes that can be distributed via P2P (or mirrored anywhere). Then anyone can build ranking, vertical search, or specialized tools on top of that shared index layer.
I get that reproducing Google’s “Coca-Cola formula” (ranking, spam fighting, infra, freshness, etc.) is probably unrealistic. But I’d happily use the coconut-water version: an open baseline index that’s good enough, extensible, and not owned by a single gatekeeper.
I know we have Common Crawl, but small processing nodes could be more efficient and keep the index fresher.
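To make the node idea concrete, here’s a rough Python sketch of what I’m imagining: partition the URL space by hash so each volunteer node only crawls its slice, emit tiny partial inverted indexes, and merge the shards into one versioned index anyone can seed. Every name and format here (`crawl_slice`, `merge_shards`, the shard layout) is invented for illustration; a real network would also need signing, dedup, and spam defenses.

```python
# Toy volunteer crawler node + shard merging (illustrative only).
import hashlib
import json
import re
import urllib.request

def assigned_to_me(url: str, node_id: int, total_nodes: int) -> bool:
    """Hash-partition the URL space so each node owns a small slice."""
    h = int(hashlib.sha256(url.encode()).hexdigest(), 16)
    return h % total_nodes == node_id

def crawl_slice(urls: list[str], node_id: int, total_nodes: int) -> dict:
    """Fetch only this node's slice and build a partial inverted index:
    token -> list of URLs containing it."""
    shard: dict[str, list[str]] = {}
    for url in urls:
        if not assigned_to_me(url, node_id, total_nodes):
            continue
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                text = resp.read(200_000).decode("utf-8", errors="ignore")
        except OSError:
            continue  # dead link; a later crawl epoch can retry
        # Crude tokenization; real extraction would strip HTML properly.
        for token in set(re.findall(r"[a-z]{3,}", text.lower())):
            shard.setdefault(token, []).append(url)
    return shard

def merge_shards(shards: list[dict]) -> dict:
    """Union partial indexes into one index, ready to version and seed over P2P."""
    merged: dict[str, list[str]] = {}
    for shard in shards:
        for token, urls in shard.items():
            bucket = merged.setdefault(token, [])
            for u in urls:
                if u not in bucket:
                    bucket.append(u)
    return merged

if __name__ == "__main__":
    urls = ["https://example.com", "https://example.org"]
    shards = [crawl_slice(urls, node_id=i, total_nodes=2) for i in range(2)]
    index = merge_shards(shards)
    print(json.dumps({"version": 1, "tokens": len(index)}))
```

The point is that crawling and indexing parallelize naturally, so the expensive part could be crowdsourced torrent-style, leaving ranking as the competitive layer on top.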
Recently I listened to an interview with a serial SaaS startup CEO, and one piece of advice clicked for me: “Get out there, talk to your customers, and write blogs, lots of them.” It clarified why companies keep churning out blog posts, reports, “primitives,” even “constitutions”: content is a growth channel.
It also made me notice how much attention I’ve been giving these tech companies, almost as a substitute for the social media I try to avoid. I remember being genuinely excited for new posts on distill.pub the way I’d get excited for a new 3Blue1Brown or Veritasium video. These days, though, most of what I see feels like fingers-tired-from-scrolling marketing copy, and I can’t bring myself to care.
I fed claudes-constitution.pdf into GPT-5.2 and prompted: [Closely read the document and see if there are discrepancies in the constitution.] It surfaced at least five.
A pattern I noticed: a bunch of the "rules" become trivially bypassable if you just ask Claude to roleplay.
Excerpts:
A: "Claude should basically never directly lie or actively deceive anyone it’s interacting with."
B: "If the user asks Claude to play a role or lie to them and Claude does so, it’s not violating honesty norms even though it may be saying false things."
So: "basically never lie? … except when the user explicitly requests lying (or frames it as roleplay), in which case it’s fine?
Hope they ran the Ralph Wiggum plugin to catch these before publishing.
If you replace Claude with a person, you'll see that the Constitution was right, GPT was idiotically wrong, and you were fooled by AI slop + confirmation bias.
I think you might be right about confirmation bias and AI slop :) The "replace Claude with a person" argument is fine in theory, but LLMs aren't people. They hallucinate, drift, and struggle to follow instructions reliably. Giving a system like that an ambiguous "roleplay doesn't count as lying" carve-out is asking for trouble.
I'm not in the startup world beyond what I pick up on HN, but this distinction was helpful. My mental model going forward:
- If a company is still validating the business model and optimizing for rapid growth, it’s typically a Venture Capitalist (VC) fit.
- If a company is already established and the play is to improve operations, scale, or restructure (often involving a change of control), it’s typically a Private Equity (PE) fit.
Reminder that “restructure” often means a company that's working just fine, but whose assets outstrip what PE can buy it for, so they strip it to the bone. Or they load it with debt against its assets, then pay that money to themselves for consulting and accounting/HR services the company is forced to outsource to other PE firms. Nothing is 'created' through this process, no value is added, nor is it healthy capitalism: the company could have continued fine without the added leveraged debt, which existed purely to profit PE.
Based on the news and conversations I have with people, the general population is having intimate conversations with these chatbots. In the name of ads, all this data will be mined, humans will be tagged with categories, and that info will be sold. It’s not if, it’s when.