I had a conversation the other day at a birthday party with my friend's neighbour from the building. The fellow is a semi-retired (FIRE) single guy. We started with a basic conversation, but when he started talking about what he was interested in it became almost unintelligible. I kept having to ask him to explain what he was talking about, but was increasingly unsuccessful as he continued. Sure enough, he described spending significant time talking with "AIs", as he called them. He spends many hours a day chatting with ChatGPT, Grok and Gemini (and I think at least one other LLM). I couldn't help thinking "Dude, you have fucked up your brain." His insular behaviour and the feedback loop he has been getting from excessive interaction with LLMs have isolated him, and I can't help but think that will only get worse for him. I am glad he was at the party and getting some interaction with humans. I expect that this type of "hikikomori" isolation will become even more common as LLMs continue to improve and become more pervasive. We are likely to see this become a significant social problem in the next decade.
What was the nature of his interests, if you don't mind sharing? I'm always curious about how these things develop -- makes it easier to recognize.
Seems like a lot of them fall into either "I'm onto a breakthrough that will change the world" (sometimes shading into delusion/conspiracy territory), or else vague platitudes about oneness and the true nature of reality. The former feels like crankery, but I wonder if the latter wouldn't benefit from some meditation.
It was a mix of mystical philosophy and transhumanism, and he does think that "the world is on the edge of a breakthrough", but he sees it as emergent. It is not something he is personally creating, just something he believes is imminent, and he sees himself as one of the first people to recognise it.
Thanks -- so a little of column A, a little of column B? Kind of feels similar to how early societies built whole religious practices out of interpreting stochastic phenomena like knucklebones or entrails, only supercharged because this particular viscera seems to talk back!
I often have to remind myself of the quote "Talk to a man about himself and he will listen for hours" when socializing to remember to ask questions and let the other party explore whatever topic/situation they are into. It seems like AI conversations are so one-sided a person might forget to cede the floor entirely.
Did he refer to the AI with a name? How much of a relationship did he have with his? I have multiple friends that have named their ChatGPT, and they refer to it in conversation, like "oh yeah, Sarah told me this or that the other day", except Sarah (names changed) is an LLM.
I'm worried about our future.
...except I went over to ChatGPT and asked it to project what the future looks like in seven years rather than think about it myself. Humanity is screwed.
I have had similar issues over the years on various devices, and it has always been frustrating to determine the cause. Power management in general is so inscrutable. I wish there were a tool that let me go back through recent history, even just since the last 100% charge, and tell me why my machine was not sleeping or idling and what was consuming power. Apple has added some energy tools over the years but has never offered tools that explain system behaviour.
Such a shame that PDF doesn’t just, like, include the semantic structure of the document by default. It is brilliant that we standardized on an archival document format that doesn’t include direct access to the document text or structure as a core intrinsic default feature.
I say this with great anger as someone who works in accessibility and has had PDF as a thorn in my side for 30 years.
I agree with this so much. I've sometimes tried to push friends and family to use text formats (at the very least something like Markdown), which are easy to render in the browser anyway. But often you have to fall back to PDF, which I dislike very much. So much content, like books and papers, is in PDF as well. Why did we pick a binary blob as our shareable format again?
> Why did we pick a binary blob as shareable format again?
PDF was created to solve the problem of being able to render a document the same way on different computers, and it mostly achieved that goal. Editable formats like .doc, .html, .rtf were unreliable -- different software would produce different results, and even if two computers have the exact same version of Microsoft Word, they might render differently because they have different fonts available. PDFs embed the fonts needed for the document, and specify exactly where each character goes, so they're fully self-contained.
After Acrobat Reader became free with version 2 in 1994, everybody with a computer ended up downloading it after running across a PDF they needed to view. As it became more common for people to be able to view PDFs, it became more convenient to produce PDFs when you needed everybody to be able to view your document consistently. Eventually, the ability to produce PDFs became free (with e.g. Office 2007 or Mac OS X's ability to print to PDF), which cemented PDF's popularity.
Notably, the original goals of PDF had nothing to do with being able to copy text out of them -- the goal was simply to produce a perfect reproduction of the document on screen/paper. That wasn't enough of an inconvenience to prevent PDF from becoming popular. (Some people saw the inability for people to easily copy text from them as a benefit -- basically a weak form of text DRM.)
Thanks for the explanation! I was vaguely aware of those issues but not in depth. It all makes sense, of course, and now PDF is so deeply entrenched that it's very difficult to push other formats. It's interesting that the tension between content and layout is still such an issue. I don't know what the fix is, maybe just the web?
Even assuming you could get people to do the work (probably the real issue here), could a single schema syntax capture the semantics of the universe of documents that exist as PDFs? PDFs succeeded because they could reproduce anything.
While I worked at Apple in 2021-22, their issues seemed about the same as at nearly every other company producing consumer apps and devices: bloated, slow garbage of very mediocre quality. Their engineering culture is terrible, especially as it relates to transfer of the “Apple ethos” to the next generation of devs. Apple is going to be indistinguishable from the rest of the pack within the next decade.
But most of all it seems like it was designed by people who don’t even know what it is for. That combined with the superficial “implement my Figma masterpiece in code” development approach that includes little to no user testing. Tog weeps. Don Norman weeps. Observe how much breaks when you do something as trivial as bump the default font size by one notch. I am sure it is pixel perfect at default size though.
Enter a birth date in a contact entry without a year. Watch as it jumps to the next day when you save, because you are editing the date after 0000 UTC of the following day. That bug has now been in macOS/iOS for at least 17 years.
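For the curious, this class of bug is easy to reproduce. Here's a minimal sketch (my own toy code, emphatically not Apple's actual Contacts implementation) of how round-tripping a calendar date through mismatched time zones shifts it by a day:

    import Foundation

    // Hedged sketch: a birthday is saved while the user's local clock is
    // already past 0000 UTC of the next day, then re-read with a UTC calendar.
    var local = Calendar(identifier: .gregorian)
    local.timeZone = TimeZone(identifier: "America/Los_Angeles")! // UTC-7 in summer

    // User enters "September 22" and saves at 6 pm local time.
    let entered = DateComponents(year: 2024, month: 9, day: 22, hour: 18)
    let saved = local.date(from: entered)! // 2024-09-23T01:00:00Z

    // Later the value is interpreted with a UTC calendar.
    var utc = Calendar(identifier: .gregorian)
    utc.timeZone = TimeZone(identifier: "UTC")!
    print(utc.component(.day, from: saved)) // 23 -- the birthday has jumped a day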
Sorry, got into rant mode. I really want “less but better” from things in my life. We as consumers aren’t rewarding companies that take this approach, apparently.
You can really see this when trying to build apps with Swift & SwiftUI. The language and the framework seem to be optimized for nice terse WWDC demos but both fall apart pretty quickly when you start to do any heavy UI lifting with them. And I think that's starting to bleed into their own native UI now too. The lousy macOS settings app is a good example.
Unfortunately there don't seem to be any good alternatives to Apple. Windows is even worse.
Yes, I was a fairly early SwiftUI guinea pig, when I'd mistakenly assumed it was solid because of how Apple was pushing it, and your "WWDC demo" is spot-on.
The DSL could've been better (while still syncing between code and direct-manipulation GUI painter). And the interaction model seemed like it wasn't to be trusted, and was probably buggy (and others confirmed bugs). The lack of documentation on some entitlements APIs being demoed as launched left me shouting very bad words on multiple days (which is not something I normally do) before I made everything work.
I could feel this, and ended up wrapping all my UI with a carefully hand-implemented hierarchical statechart, so that the app would work in the field for our needs, the first time, and every time. Normally, for consumer-grade work, I would just use the abstractions of the interface toolkit, and not have to formally model it separately.
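To give a flavor of what I mean (states and names invented for illustration, not my actual app), the statechart boils down to nested state enums plus a single transition function that every UI event must pass through:

    // Toy sketch of a hand-rolled hierarchical statechart in Swift.
    enum ConnectionState {
        case idle
        case active(ActiveSubstate)

        enum ActiveSubstate {
            case connecting
            case syncing
            case ready
        }
    }

    enum UIEvent {
        case startTapped
        case connected
        case syncFinished
        case cancelTapped
    }

    func transition(_ state: ConnectionState, _ event: UIEvent) -> ConnectionState {
        switch (state, event) {
        case (.idle, .startTapped):              return .active(.connecting)
        case (.active(.connecting), .connected): return .active(.syncing)
        case (.active(.syncing), .syncFinished): return .active(.ready)
        case (.active, .cancelTapped):           return .idle  // the parent state handles this event for every substate
        default:                                 return state  // unexpected events are ignored, never crash the UI
        }
    }

The payoff is that illegal state/event combinations are impossible by construction, instead of depending on whatever implicit state the toolkit happens to be in.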
Don't get me started on what a transparently incompetent load of poo some of the Apple developer Web sites were, for complying with the additional burdens that Apple places on developers, just because it can. Obvious rampant data consistency problems, poor HCI design, and just plain unreliable behavior. I think I heard that at least some of that had been outsourced, to one of those consulting firms that everyone knows isn't up to doing anything competently, but that somehow gets contracts anyway.
Not sure about other people, but for me, my UI framework making its own heuristic decisions about how to lay out and style my views is the last thing I want. It robs me of the certainty that my UI will look and work the way I intend. And this is why, as an Android developer, I still build my apps with decade-old tried and true technologies.
Yeah, that view builder syntax is a perfect example of optimizing for the wrong thing. It makes for nice short examples, but in real apps your compile times explode trying to untangle these crazy generics, and the compiler very often just throws up its hands and tells you to figure it out. This means you just start commenting out bits of code until you find, by trial and error, what it doesn't like.
That this is shipping in the native UI framework for a trillion dollar tech company is astonishing.
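To make the failure mode concrete, here's a hedged toy example (an invented view, not from any real app): because `body` is inferred as one giant nested generic expression, a single type error in a leaf view can surface as an unhelpful error against the whole thing:

    import SwiftUI

    struct DashboardView: View {
        let items = ["a", "b", "c"]

        var body: some View {
            // The inferred type here is roughly
            // VStack<ForEach<[String], String, HStack<TupleView<(Image, Text, Spacer, Text)>>>>.
            VStack {
                ForEach(items, id: \.self) { item in
                    HStack {
                        Image(systemName: "circle")
                        Text(item)
                        Spacer()
                        // Introduce a type error on this line and the compiler has
                        // often reported "unable to type-check this expression in
                        // reasonable time" against `body` as a whole -- hence the
                        // comment-out-and-bisect workflow described above.
                        Text("42")
                    }
                }
            }
        }
    }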
Except those technologies are now deprecated and you don't know when they might be removed. Jetpack Compose is now the vendor-favored way to build apps, so best practice is to use that.
I don't care what "best practices" are. Seemingly everyone sticks to them, yet here we are, discussing how software quality throughout the industry has taken a dip.
> Except those technologies are now deprecated and you don't know when they might be removed.
Views and activities and XML layout will never be removed, of that I'm certain. After all, Compose does use views in the end. That's the only way to build UIs that the system itself understands. And, unlike SwiftUI, Compose itself isn't even part of the system, it's a thing you put inside your app.
I don't care about deprecations. Google has discredited itself for me and its abuse of the @Deprecated annotation is one of the reasons. The one thing that's very unfortunate is that all tools unquestionably trust that the person who puts @Deprecated in the code they maintain knows what they're doing, and nothing allows you to selectively un-deprecate specific classes or packages; you can only ignore all deprecations in your class/method/statement.
And, by the way, I also ignore the existence of Kotlin. I still write Java, albeit Java 17. The one time I had to deal with Kotlin code (at a hackathon) it felt like coding through molasses.
Their new wifi network selector is laggy as fuck. The old one was perfectly fine. This is just like windows reimplementing basic UIs in their UI-framework-of-the-year.
Windows is only worse if you don't consider the freedom of choosing the hardware it runs on and its ability to be modified to run as you see fit.
Apple is really losing the plot here, because they need their software to be good to sell their hardware.
Microsoft doesn't even have to care that much, because there is no relevant alternative coming any time soon (as the various Linux failures have shown), but at least you don't have to give them a lot of money (in fact, as close to zero as possible if you really want to).
Having joined a large established FAANG, it's become quite apparent that in any large established entity with so much management and meta work, with strong incentives driving more energy towards the meta work than where the rubber meets the road, it's inevitable for product quality to deteriorate.
Internally the prioritized output becomes the meta work, not what reaches customers. What reaches customers is almost some kind of accidental byproduct of what the vast majority of people in the org spend their time on day-to-day.
My past experience is dominated by startups. The fake work I'm incentivized to spend time on here would have been a fire-able level of misplaced priorities and waste at every other place I've worked as an IC developer.
I've never worked for Apple, I'm assuming this pattern plays out everywhere at this scale.
Matches my brief FAANG experience well: the vast amount of time devoted to performance reviews and the gaming of them versus actual productive work was… something I’d never encountered in my previous 15 years of work.
Interesting observation. I suppose most startups haven't existed for long enough for the meta-optimizing employees to be promoted over work-optimizing employees, yet.
Meta-optimizing employees have trouble hiding at smaller companies, where one person has a vision of what needs to happen and can quickly identify someone not pulling in that direction.
Once you get big enough, upper management has a tough time figuring out who is meta-optimizing rather than work-optimizing. Not to mention there might be multiple meta-optimizing employees.
I've seen high-performance organizations at 600-800 people, thanks to execs that spent quite a bit of time talking across levels: when, at that size, some ICs get CEO 1:1s, you have some chance of quality control. After all, some execs had been coders. The problem is that none of them had ever been middle managers, which meant they had no idea how to tell a good one from a bad one.
As the company kept growing and hired middle managers from bigger tech, they decided that Jira was the way to go, as it allowed for nice reports aggregating "insights" across the organization. In under a year, point-centered management arrived, and with it an exodus of top talent, all of whom had massive amounts of equity anyway. Execs then wondered what happened, and why the ability to ship features kept declining. I think they still don't know.
I'd also be curious as to how it relates to span of control (~4-15 direct reports) and therefore levels of management for a given org size, as information hiding about actual work performed seems tied to managerial masking.
Same. The compensation is substantially better at FAANG, but in terms of actual on-the-ground work being rewarded, it's almost never the case.
Meta-work (lots of "cross functional" documents, alignment meetings, sync ups with senior tech leads to brown nose, deliberately creating low quality output to justify hiring more people/growing one's "scope") is 90% of it.
Any actual output is largely accidental, coming from the 20% who are still naive or idealistic enough to actually care about what they produce.
> It was designed by people who don’t even know what it is for.
This rings especially true with Windows.
There was a not-so-serious rumor that the whole MS design department uses Macs.
This may or may not be true, but recent UX changes make it clear that the designers don’t really use Windows beyond a superficial level. Many common interactions have become increasingly tedious and visually sluggish, both due to excessive animations and performance issues. Explorer in particular has become barely usable for anyone who frequently manages files.
Apple can stay far ahead simply by declining more slowly than Windows. Finder and Spotlight have gotten worse, but they remain light-years ahead of their Windows counterparts.
Ummmm, what? Spotlight I'll grant you, but Finder is hands down the worst file browser on any operating system.
There's no up button, no split screen, you can't copy a path easily, you can't show hidden files easily, you can't customize the columns in list mode, the column mode won't let you go up, there's no cut and paste.
Windows Explorer sucks, but not nearly as bad as finder. Dolphin, thunar, and Nautilus on Linux have all those features and more. I have to drop to terminal or install mucommander just to do basic things in the macOS filesystem.
Display the path bar at the bottom and you can get to any level of parent in 1 click. Without the path bar you can also right-click on the current folder name at the top of the window to also navigate to any level of parent.
> no split screen
This is not something I've ever found a use for in any OS; I always just open 2 windows. It does have tabs, and you can drag stuff between tabs, albeit with some delay. This seems minor, except for very specific workflows.
> you can't copy a path easily
Right-click a file or folder; when you then press Option, Copy changes to copy the path.
> you can't show hidden files easily
Command+Shift+. toggles hidden files on and off. I find this pretty easy to remember, since dots prefix hidden files.
> you can't customize the columns in list mode
Right-click the headings and you can add/remove the ones you want? Is that what you're talking about?
> there's no cut and paste
Instead of an option when copying, it's an option when pasting. Command+C to copy, then add Option while pasting... Command+Option+V. I almost never use Cut, even on Windows or Linux; I don't want to cut something, get interrupted, do something else, and lose my file. Having it move, then delete the source with the paste action, is safer.
It sounds like you haven't used Finder that much, or weren't willing to learn or adapt your behaviors.
There are some things about other file explorers I like, but I don't find myself struggling to use Finder at all. I mostly miss column view when I'm on anything that isn't Finder.
It is hidden behind a keyboard shortcut (there is no menu item): Cmd-Shift-G, I think. But it is literally the only way I know to get to folders besides the designated shortcuts (Documents, Pictures, etc.), and I've been a daily Mac user for many years.
Not sure I even agree about Spotlight. Maybe I'm misunderstanding something, but I can never find what I'm looking for. Even when I'm in the directory, searching for a file in that directory, it'll just show me random download files.
Granted, I haven’t even tried to use it in years. So maybe it’s not so bad these days?
I assume they didn't expect users to use directory hierarchies much and thought everybody would dump their files into flat dirs and search them with spotlight.
Spotlight has been broken on both of my Macs since Sequoia. It doesn't find anything under Downloads dir even though it should be indexed. Or any non-Apple apps under Applications. Re-indexing did nothing.
TBF a ton of Windows users aren't primarily on the platform, and have either a second machine or more experience on other OSes.
The dev community might be an outlier, but people choosing a Windows machine to get WSL on mainstream, well-supported hardware is not uncommon.
Same for those with a macOS work laptop but a Windows gaming machine, or artists using a Mac for personal stuff and Windows for 3D/2D creation.
Having Windows designers make platform transitions easier kinda makes sense, though I agree it shouldn't penalize existing users as much as it does now.
> It was designed by people who don’t even know what it is for.
> This rings especially true with Windows.
Just take a look at the Windows 11 "Control Panel", or whatever it is called now, and how it looks like just another UI bolted on top of the main system. It does not make sense.
I don't disagree, but the average business user is someone who uses the M365 suite and a handful of webapps. We are getting ready to roll it out and our test users haven't had many complaints. IT is a different story, however, for the reasons you stated. It's like they just shuffled all the system and config menus for fun.
The context menu alone takes a few hundred milliseconds to load every time. And then you have the infamous "show more options" to click if you want to do most things (in my use case, anyway).
Opening a folder isn't much faster either; there is a visible delay. With current-day hardware there is no reason why this isn't instant.
Compare it with Windows XP or Windows 7: the difference is night and day.
Interaction with OneDrive is horrible too; this is particularly bad because it was fine on Win10. When a folder is syncing, it constantly "refreshes" itself, which causes you to lose focus if you're renaming files. This is the single most annoying thing, because I close a doc -> immediately rename it all the time.
I still had spinning rust when I upgraded. Win7 was fine. UI wasn't quite as snappy as XP, but it still felt pretty responsive.
After upgrading? EVERYTHING took forever. The friggin' start menu lagged noticeably on almost every interaction.
Upgrading to a solid state disk mostly fixed it, so they had clearly done something foundational that radically increased disk IO system-wide. Solid state is fast, but it wouldn't have been fast enough if they'd kept going down that road; eventually it'd have started to show up there, too.
For Windows 7 on spinning rust, Microsoft did ReadyBoost, where something with incredibly fast seek times but mediocre throughput could act as a cache.
Vista was the worst windows, other than 8 and ME.
Suffice it to say, if Windows was actually slow when I used it, I would not use it. I didn't use ME, XP, Vista, or 8. There's a pattern here. I did use XP x64 Edition, but that came out ~2 years after XP and did not have the pre-service-pack issues XP did.
It’s comforting knowing that I’m not the only one being driven crazy by the renaming file focus thing.
Now when I paste a file and go to rename, I wait and watch the focus selection switch 3 times before I know I’m good to type
I replied to this person, who ignored what I asked for, and I uploaded and linked a video. I right-clicked, clicked "rename", typed a name, pressed Ctrl-Z and then Enter.
If we're talking about a transactional file sync service preventing you from editing a file while it's synchronizing, then I am not sure what to tell you. Both you and the person you replied to seem to merely like complaining.
I just installed Win 11 Pro an hour ago and the context menu looks fine: https://i.imgur.com/61AdEBR.png Maybe they fixed it? I didn't change any settings at all.
I don't get it. Where's the sluggishness? Do you think I also modified my system to somehow run faster, or otherwise tampered with the video evidence I provided?
Are you serious? Who cares about the context menu? It's such a non-issue that it took me like 15 seconds to figure out how to get the "archive" and rename commands back. The rest of the stuff in there is all software that added context menu items.
I get the feeling you haven't really used Windows since Win95 or something. Like, you have a Windows machine at work that you don't like because it's slow for whatever reason. Three people made specific complaints, I responded with a video and a screenshot disproving what they said, and... yeah, I don't get it.
(not OP) I open the “videos” folder and it takes 10 seconds to show the files list (there’s only like 50 files). I tried various forum solutions (whose existence proves it’s a bug) and nothing worked. Only happens on the videos folder.
Have you tried searching with Explorer? Or opening the start menu or a folder? I'm currently 100% a Windows shop, and it's embarrassingly slow on my silly fast computer.
To be fair, searching with Spotlight has been equally slow and useless for me… Whenever I need to find a file and mistakenly use Command-F in Finder, the complete cessation of activity that inevitably results reminds me yet again to just go to my terminal and use trusty GNU find instead.
But more to the point, you have to enable indexing and let the indexing service run. Microsoft caught so much flak for "SearchIndexer.exe" using 25% of a CPU 24/7 that I think it's much less aggressive now. But I don't use that search, because Windows searches CIFS shares slowly too. Everything.exe indexes, and its searches are near enough instant that it's not even worth splitting hairs or stopwatch timers.
> Just get used to working around them and ignore?
Pretty much. It's not like the other operating systems are better in this regard. In general, there's a lot more software that's buggy like this than software that's reliable.
I accept them, as I do on Windows and Linux, because I have built workflows around the things I want on each of them respectively. I’ve long since given up the dream of any one platform or technology choice meeting all my needs; for me, at least, it’s a fool’s errand.
I don't even understand this bug description. It's an edge case I guess I never ran into, so pretty easy to handle.
That said, it's not like everything is perfect, just 100% better than my drive-by experiences trying to have a gaming PC (dead, again), and an Android phone for testing purposes.
To clarify their issue: you can celebrate someone's birthday without knowing the year they were born; you only need to know the recurring date. You can't enter a birthday in Apple Contacts without a year; if you attempt it, it gets set to tomorrow's date.
My experience with Apple is that something is either a 2-minute fix or unfixable, which, to be fair, is a reasonable way to do things, though much less appealing to me (and less relevant for many users as stock Android/Windows continue to give users less and less control).
Are we talking about the Contacts app on macOS? I just added the birthday "9/22" to a contact. On blur, the value changes to "September 22". On save, I see the value "September 22" reflected in the birthday field of the contact.
FWIW, I've had the expiry month of credit cards stored in Safari increment on occasion (leading to failed online payments, trouble getting a flight, etc.) several times, to the extent that I now always include the expiry month in the card nickname. Mind boggling.
If you hit save too fast while a numeric dial control - like the sort used in the iOS Clock app for alarms - is still (barely) spinning, it will happily just silently keep its old value.
This is easy to repro by spinning an alarm's minutes and hitting save before it's completely and utterly stopped.
This UI bug (?) has existed for as long as I can remember.
I will point out I was at Google for a similar length of time and saw nothing but amazing code; the problem was building anything but the ads money maker.
If there were a way to combine Apple’s magic marketing brainwashing with Google’s engineering, it would be an amazing thing to watch.
This saddens me, as I learned the lessons of less but better through Apple over 20 years ago through Steve and Jony, which ultimately led to Rams. It was a pretty transformative lesson in my life, and extended far beyond tech or products.
I hope they are able to course correct with the right leadership. A culture that cares deeply about the little things is hard to build and has to be supported at the highest levels.
> Sorry, got into rant mode. I really want “less but better” from things in my life. We as consumers aren’t rewarding companies that take this approach, apparently.
If *becoming the most valuable company in the world* isn't being "rewarded", then what possibly is?
No, it's the hypercapitalist endless drive for ultra short-term, next quarter profits at the cost of anything else that causes this. Obvious irony being that Apple would've never become this big if Jobs had followed this approach.
This, of course, is the #1 reason for the downfall of the West, above all else: pure short-termism.
It is not a new scam. My grandmother had all of her teeth removed at about age 45 in the early 1960s to get dentures. Not all of her teeth were bad but the dentist encouraged her that removing all of the teeth was best because she didn't want to have to buy a new partial plate every time she lost another tooth.
The same dentist was offering the same to adults of any age. My mom, about 19 at the time, was also offered it as a solution to having misaligned teeth. She asked the dentist a critical question: "Do you have dentures or are those your own teeth?" When he replied that his teeth were not dentures, she "noped" right out of there, and she still has most of her teeth to this day.
Not just planes, and fairly recently too. I was working at a Boeing subsidiary when the 737 Max MCAS scandal happened. They dumped everyone they could on loathsome “$9000 USB cable” type time-and-materials defense work. I was a senior Java architect and they quickly “retrained” me to do HIL component testing in plain old C. It might have been seen as a move to improve cash flow, but realistically it had the effect of causing almost all of the software staff to leave in short order, which I guess also improved cash flow. The subsidiary is still struggling, several years later, to rebuild its software team.
It is past time to impose an excise tax on all packaging based on how long it takes to decay to base elements. Time until naturally “recycled” shouldn’t be a cost you can externalize. The tax owed can be reduced when recycling actually occurs, essentially passing the difference on to the recycler.
I have worked primarily from home for most of the last 25 years at several companies. I joined Apple during the pandemic and worked from home for the first 18 months but left Apple about 15 months ago for Amazon, in part, because of the increasing RTO pressure at Apple. The Apple RTO policy has only gotten stupider since I left.
Unfortunately, about a year after I joined, Amazon chose to adopt an RTO policy as well. It is not going well. Dissatisfaction is high, people are quitting, people are planning to quit, and respect for upper management dwindles. There are many heartbreaking stories of productive long-term employees being forced out by RTO. WFH had also been opening new opportunities to people who were unable to commute, whether because of disability, location or other reasons. It has been sad to see those people shut out again.
The RTO policy has really soured the culture. Adrian highlights the issues very well. Amazon is a global company; we are expected to work with remote teams routinely. Many teams, like mine, are also geographically distributed. Going into the office doesn't make a damn bit of sense for most people.
If you're able to share, I'm curious what you mean by "Apple RTO policy has only gotten stupider"? Isn't it still 3 days in-office (or 4 days for some teams)?
The degree to which it is enforced, the requirement that managers enforce the policy, and the use of compliance as a measure in performance reviews have all increased. And, as you mention, some teams have gone to 4 days, with pressure to exceed the minimum 3. At least one person I've spoken with indicated that their promotion was delayed until they could get their RTO compliance numbers up.
What degree of criticism crosses the threshold of "hostile"? For some on the conservative side, any criticism or comparison of the US by Americans, or worse yet by foreigners, is unacceptable; the "love it or leave it" crowd. There have been repeated incidents of politicians, journalists and authors being denied entry for little more (and sometimes less) than saying that the US "kinda sucks".
Obviously there's little good reason to admit someone who wants to overthrow the government (there are already enough of those locally), but literally anyone who says that the US is less than perfect?
I certainly don't want this decision to be in the hands of unaccountable and capricious ICE officers. I've personally experienced just a taste of how petty they can be, and I have little reason to doubt that they are routinely spiteful assholes.