So, for example, I can write in my editor an expression to get the list of windows on the current display, then immediately map/filter through them, group, sort, read any parameters, etc. Then, without missing a beat, I can apply transformations - programmatically move any window, resize it, and so on - on the fly, like playing a fucking video game.
Compare that to the "more traditional" approach: issuing commands, piping them to jq or something, figuring out where the stuff is, writing that into a file, then having your WM pick up those changes - and often you'd completely lose the state and have to go back to the script files, etc. I don't have to save anything anywhere until I'm sure that shit works - I simply eval things on the go. I can "touch, grab and interact" with entities immediately after querying their info; there's no physical, mental or contextual separation here. I don't have to write something in the terminal, then something in one of my scripts, then some parts in some other file - everything is controlled directly from my editor.
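Roughly, the workflow looks like this - a simplified sketch, not my exact code; it assumes Hyprland's hyprctl CLI and the cheshire JSON library, and each form gets eval'd straight from the editor:

    (require '[clojure.java.shell :refer [sh]]
             '[cheshire.core :as json])

    (defn clients []
      ;; `hyprctl clients -j` dumps every window as JSON
      (json/parse-string (:out (sh "hyprctl" "clients" "-j")) true))

    ;; Query, then immediately poke at the live data:
    (->> (clients)
         (filter #(= "firefox" (:class %)))
         (map (juxt :address :title :workspace)))

    (defn move-window! [address x y]
      ;; movewindowpixel is a Hyprland dispatcher; address: targets one window
      (sh "hyprctl" "dispatch" "movewindowpixel"
          (str "exact " x " " y ",address:" address)))

Each of those evaluates instantly, so you inspect the result, tweak the form, and eval again.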
Here's a more practical example. Hyprland has something called Hyprsunset to deal with color temperature and gamma. I wrote an extension that changes the display color temperature based on the time of day - it's a simple clojure.core.async go loop that reads from a hashmap mapping specific hours to temperatures and checks every 10 minutes; if it's time to apply a new color, it does. That took me just a few minutes to whip up with a connected Lisp REPL, and I'm pretty sure it would've taken me far longer without one. The way "true" REPLs work (which, e.g., Python's is not) is shockingly awesome for rapid prototyping. Why more programmers don't do this is a complete mystery to me - getting into Lisp is not even that difficult; Clojure, for example, is far simpler and more straightforward than even Javascript or Python. These days you don't even need to know Emacs - install Calva for VSCode and that's all you need; it has a quickstart guide and all.
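A rough reconstruction of that loop - not the actual code, and it assumes hyprsunset can be driven with `hyprctl hyprsunset temperature <kelvin>`; the schedule values are made up:

    (require '[clojure.core.async :refer [go-loop <! timeout]]
             '[clojure.java.shell :refer [sh]])

    (def schedule
      ;; hour of day -> color temperature in Kelvin (illustrative values)
      {6 4500, 9 6500, 18 4500, 21 3200})

    (defn apply-temp! [kelvin]
      (sh "hyprctl" "hyprsunset" "temperature" (str kelvin)))

    (defonce sunset-loop
      (go-loop []
        (when-let [temp (schedule (.getHour (java.time.LocalTime/now)))]
          (apply-temp! temp))
        (<! (timeout (* 10 60 1000))) ; re-check every 10 minutes
        (recur)))

The whole thing stays live in the REPL - redefine schedule and the running loop picks it up on the next tick.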
Automated behaviours like forcing windows onto different virtual desktops, or modifying windowing behaviour dynamically - some windows are automatically tiled, others float, etc.
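Something like this, applied at runtime - a hypothetical sketch, assuming Hyprland's `hyprctl keyword` interface, with made-up window classes:

    (require '[clojure.java.shell :refer [sh]])

    (defn rule! [r]
      ;; windowrulev2 rules can be added on the fly via `hyprctl keyword`
      (sh "hyprctl" "keyword" "windowrulev2" r))

    (rule! "workspace 3, class:^(firefox)$")   ; pin firefox to desktop 3
    (rule! "float, class:^(pavucontrol)$")     ; always float the mixer
    (rule! "tile, class:^(Alacritty)$")        ; force-tile the terminal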
Tell that to Havoc Pennington. D-Bus was the solution he came up with based on the requirements and constraints set by the DEs. A lot of people have claimed we need something better, but nobody has actually created it. Until someone does, D-Bus is the standard for client communication with Wayland compositors outside the core protocol. It sure beats piping stuff over X ClientMessage events.
Everyone agrees with this, obviously, but it's like saying we should be able to levitate or live in utopia. It's almost a law of nature that the types who become powerful are not your most savory individuals, and that they will use that power to reinforce their positions.
It's a law of nature that they will _try_.[0] That's why people should always have ways of defending themselves, whether it's with courts or guns.
[0]: This is not a figure of speech - many anti-social traits which result in NPD, ASPD and their subclinical versions[1] are genetic. There is literal evolutionary pressure to exploit others.
[1]: Meaning the trait is sufficiently pronounced to be harmful to others, but not enough to be harmful to the person having it, so it's not diagnosed as a disorder.
We have tons of different systems for accumulating power all over the world. Corporate structures, democracy vs autocracy, etc. In each of those societies, we see different types of leaders on a sliding scale of savoriness.
My point is that clearly some forms of governance result in more savory people, so you can argue that it's the systems that define the outcomes rather than any "law of nature".
Nonsense. Network service layer separation solves a different problem than OOP; it doesn't replace it. Services and containers bring features and capabilities that OOP doesn't provide. They're orthogonal.
But I think that's really the point: it gets applied to problems it was clearly not meant to solve. The article doesn't really get into that, but that's how I'm reading it because I've seen it happen. To some programmers, every component of a program is its own service, even when there's no need to have multiple processes. The only tool they have is that hammer, so everything has to be a nail.
It's insane to me that AMD is not spending billions and billions trying to fix their software. Nvidia is the most valuable company in the world and AMD is the only one poised to compete.
They are, but the problem is that shifting an organization whose lifeblood is yearly hardware refreshes and chip innovation towards a ship-daily software culture is challenging. And software doesn’t “make money” the way hardware does so it can get deprioritized by executives. And vendors are lining up to write and even open source lots of software for your platform in exchange for pricing, preference, priority (great on paper but bad for long term quality). And your competitors will get ahead of you if you miss even a single hardware trend/innovation.
There was a podcast episode linked here a while ago about how the software industry in Japan never took off as it did in America, and it reached a similar conclusion. According to the host, the product being sold was the hardware; software was a means to fulfill and then conclude the contract. After that, you want the customer to buy the new model - primarily for the hardware, with the software coming along for the ride.
It should be obvious by now, though, that there's symbiosis between software and hardware, and that support timescales are longer. Another angle: it's about more than just AMD's own software developers. It's also about the developers making products for AMD's customers - customers who in turn buy AMD hardware when everyone works together to make those products run well - and it's those second developers AMD needs to engage with, in a way that makes their efforts welcome.
I worked at a number of GPU vendors, and it felt like Nvidia was the only one that treated software as an asset worth investing in, rather than as a cost center. Massively different culture.
Why would you assume cognitive bias? Any evidence? These things are indeed very expensive to run, and are often run at a loss. Wouldn't quantization or other tuning be just as reasonable an answer as cognitive bias? It's not like we're talking about reptilian aliens running the White House.
I'm just pointing out a personal observation. Completely anecdotal. FWIW, I don't strongly believe this. I have at least noticed a selection bias (maybe) in myself too, as recently as yesterday after GPT-5.1 was released. I asked Codex to make a simple change (less than 50 LOC) and it made an unrelated change - an early return statement - breaking a very simple state machine that goes from waiting -> evaluate -> done.
However, I have to remind myself how often LLMs make dumb mistakes despite often seeming impressive.
Wait, so how many degrees of separation do you have to be before you're OK? I mean, fucking come on, this is ridiculous. DHH's blog entries are ugly, but are we really saying that Valve shouldn't do business with a hardware company because it does business with one guy who says shitty things on a blog?
Any business larger than a certain size is gonna have a fan-out of hundreds if not thousands of businesses if you go 2 to 3 degrees of separation out. And they have to avoid any that have written mean blog posts?
I'm sure like 20-30% of open source software has contributions from assholes.
All chip manufacturers sell to military contractors and genocidal regimes. But Valve should know not to do business with any chip manufacturers, lol. Anyway