Reddit would be even worse if the translations were better; at least now you don't waste much time on them, because the bad translation hits you right in the face. Never, ever translate something without asking first.
When I search for something in my native tongue, it is almost always because I want the perspective of people in my country who have experience with X. Now the results are riddled with Reddit posts from all over the world in crappy translation instead.
It is a trade-off between convenience and freedom. Netflix vs. buying your movies. Spotify vs. MP3s. Most tech products have alternatives, but you need to be flexible and adjust your expectations. Most people are not willing to do that.
The issue is that real life is not adaptable. Resources and capital are slow.
That's the whole issue with monopolies for example, innit? We envision "ideal free market dynamics" yet in practice everybody just centralizes for efficiency gains.
Right, and my point is that "ideal free market dynamics" conveniently always ignore this failure state that seems to always emerge as a logical consequence of its tenets.
I don't have a better solution, but it's a clear problem. Also, for some reason, more and more people (not you) will attack anyone who doesn't defend state A (the ideal equilibrium), leaving no room to point out state B as a logical consequence of A, one that requires intervention.
The definition of a monopoly basically resolves to "those companies that don't get pressured to meaningfully compete on price or quality", it's a tautology. If a firm has to compete, it doesn't remain a monopoly. What's the point you're making here?
There absolutely are options but we aren't using them because nobody cares enough about these downsides. bsky is up, with Mastodon you even have choice between tons of servers and setting up your own. Yet, nobody cares enough about the occasional outage to switch. It's such a minor inconvenience that it won't move the needle one bit. If people actually cared, businesses would lose customers and correct the issue.
More like it's time for the pendulum to swing back...
We had very decentralized "internet" with BBSes, AOL, Prodigy, etc.
Then we centralized on AOL (ask anyone over 40 if they remember "AOL Keyword: ACME" plastered all over roadside billboards).
Then we revolted and decentralized across MySpace, Digg, Facebook, Reddit, etc.
Then we centralized on Facebook.
We are in the midst of a second decentralization...
...from an information consumer's perspective. From an internet infrastructure perspective, the trend has been consistently toward more decentralization. Initially, even after everyone moved away from AOL as their sole information source online, they were still accessing all the other sites over their AOL dial-up connection. Eventually, competitors arrived and, since AOL no longer had a monopoly on content, they lost their grip on the infrastructure monopoly.
Later, moving up the stack, the re-centralization around Facebook (and Google) allowed those sources to centralize power in identity management. Today, though, people increasingly only authenticate to Facebook or Google in order to authenticate to some 3rd party site. Eventually, competitors for auth will arrive (or already have ahem passkeys coughcough) and, as no one goes to Facebook anymore anyway, they'll lose grip on identity management.
It's an ebb and flow, but the fundamental capability for decentralization has existed in the technology behind the internet from the beginning. Adoption and acclimatization, however, is a much slower process.
These centralized services do and did solve problems. I'm old enough to remember renting a quarter rack, racking my own server and other infrastructure, and managing all that. That option hasn't gone away, but there are layers of abstraction at work that many people probably haven't and don't want to be exposed to.
Aaand even if we ignore the "benefit" of Cloudflare and AWS outages being blamed on them rather than you, what does uptime look like for artisanally hosted services on a quarter rack vs. your average service on AWS and Cloudflare?
Yes, but the problem is that people often choose a hobby that will benefit their career.
If you are going to spend time on a hobby why not pick a hobby that also benefits your career? Win win?
I struggle with that, partly because computer science was my hobby. Then I went to university studying it, and enjoying it as a hobby. Then I started working, still enjoying it as a hobby.
And if I have 10 interesting topics I want to explore in my free time, why not pick one that will also benefit my work?
After all, I don't have as much time for my hobbies nowadays. So picking one that also benefits and influences my work is more fun and meaningful and also allows me to be paid doing something I would have done for free anyway.
This article highlights the problem with that approach.
Feels like it is very common in our industry. A very high percentage of "Show HN" fits dangerously close to that. Which is not necessarily a bad thing, it is just exposing yourself to the risks mentioned in the post.
I'm with you. But the worst case isn't the hobby itself. The worst case is burning out and losing all appetite for both your work and your hobbies at the same time.
People say things like that, and I wonder if I’ve just been living in a gilded tower of using Apple Mail with decent IMAP server implementations.
I’m also pretty familiar with the wire protocol and its implementation — it’s never struck me as particularly horrible.
A new protocol isn’t likely to solve the problem of poorly implemented clients and servers — e.g. Google doesn’t really care about good IMAP support, so they’re unlikely to care much about JMAP, either. They just want you to use their webapp.
Sounds like it :) I’ve been very happy with Mail.app since Mac OS X 10.0. My use has always been with my employer’s IMAP servers, and my own self-hosted Cyrus (and eventually Dovecot) IMAP servers.
Mail.app is what NeXT used internally, and Apple uses to this day AFAIK. Steve Jobs historically paid a lot of attention to it and wasn’t shy about weighing in on any changes.
Most of the complaints that I’ve heard about it seemed to stem from poor IMAP servers (e.g. Gmail), but it sounds like your knowledge in the space would be a lot more detailed and recent than mine, so I would be very interested in your thoughts.
Gmail does indeed _intentionally_ provide poor IMAP service. But the long and short of it is that Apple Mail simply isn't a first-class product. It's an afterthought.
I know you're in this for the satire, but it's less about the webapps needing the memory and more about the content - that's why I mentioned video editing webapps.
For video editing, 4GiB of completely uncompressed 1080p video in memory is only 86 frames, or about 3-4 seconds of video. You can certainly optimize this, and it's rare to handle fully uncompressed video, but there are situations where you do need to buffer this into memory. It's why most modern video editing machines are sold with 64-128GB of memory.
In the case of Figma, we have files with over a million layers. If each layer takes 4kb of memory, we're suddenly at the limit even if the webapp is infinitely optimal.
Apparently with 24 bytes per pixel instead of bits :)
Although to be fair, there's HDR+ and DV, so probably 4(RGBA/YUVA) floats per pixel, which is pretty close..
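For the curious, the arithmetic behind both figures checks out (a quick sanity check of the numbers discussed above; the bytes-per-pixel values are the ones assumed in this thread, not anything from a real codec spec):

```python
GIB = 1024 ** 3
frame_pixels = 1920 * 1080  # one uncompressed 1080p frame

# "86 frames in 4 GiB" implies ~24 bytes per pixel, as noted above:
frames_at_24_bytes = (4 * GIB) // (frame_pixels * 24)

# Plain 8-bit RGB (24 bits = 3 bytes per pixel) stretches much further:
frames_at_3_bytes = (4 * GIB) // (frame_pixels * 3)

# Four float32 channels (RGBA/YUVA, 16 bytes per pixel) lands in between:
frames_at_16_bytes = (4 * GIB) // (frame_pixels * 16)

print(frames_at_24_bytes, frames_at_3_bytes, frames_at_16_bytes)
# → 86 690 129
```

So the 86-frame figure matches 24 bytes per pixel exactly, while 4 floats per pixel gives about 129 frames, which is indeed "pretty close" in the sense of still being only ~5 seconds of footage.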
Let's see. I can cache the information that example.com is valid up to May 31 2026, but then how do I know that it gets revoked on any day before that date?
And if I cache the information that it is revoked, how do I know that it's allowed again?
I could check, let's say one time per day even if I don't access that site.
In any case I'm still leaking which domains I browse and I keep trusting cached certificates until the next check.
On the other hand, with short-lived certificates I would be trusting a certificate for even longer: until it expires.
Downloading a list of all certificates and their status from every CA is probably infeasible.
It seems that we can't escape a tradeoff between privacy and security.
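The dilemma described above can be sketched in a few lines. This is purely illustrative: `RevocationCache`, `fetch_status`, and the once-a-day interval are all made up for this example; no real OCSP or CRL client works exactly this way.

```python
import time

CHECK_INTERVAL = 24 * 60 * 60  # re-check each cached status once per day


class RevocationCache:
    """Caches per-domain certificate status; stale entries are the tradeoff."""

    def __init__(self, fetch_status):
        # fetch_status(domain) -> "good" or "revoked";
        # every call leaks the domain to whoever answers the query.
        self.fetch_status = fetch_status
        self.cache = {}  # domain -> (status, checked_at)

    def is_trusted(self, domain, now=None):
        now = time.time() if now is None else now
        entry = self.cache.get(domain)
        if entry and now - entry[1] < CHECK_INTERVAL:
            # Inside the window we trust whatever we last saw: a cert
            # revoked an hour ago still passes, and one reinstated an
            # hour ago still fails.
            return entry[0] == "good"
        status = self.fetch_status(domain)  # the privacy leak happens here
        self.cache[domain] = (status, now)
        return status == "good"
```

Shrinking `CHECK_INTERVAL` narrows the stale-trust window at the cost of more frequent leaks of your browsing to the status server; short-lived certificates make the same trade, except the window is the certificate lifetime and the "check" is built into issuance.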
I bet they'll phase it out and try to force their worse service, wherein your data is stored on their servers, like they tried to do with PINs. It took enormous pushback to get them to stop mandatory PINs, and even then they made it nagware for a year or two.
I didn't trust their rationale about PINs and remote attestation somehow meaning your data is secured by a small passphrase, just like I won't trust them to not remove a useful and existing feature I already rely on for backups.
Also not mentioned: they designed their existing backup solution such that you need reverse-engineered community tools to actually access your data. I have to use a GitHub project to decrypt the backup and export my chats, which is something I've never had to do with any other messenger.
From your link, I wish they would answer this question; they've been asked numerous times and, to my knowledge, have avoided it (which is very concerning to me):
>This is excellent news! Will there also be official documentation on the backup format, potentially even official tooling like signalbackup-tools[0] to access/parse backups offline? I'm asking because, having used Signal/TextSecure for 10 years now, my backups are worth a lot to me (obviously) and there have been times when I would have liked to mine & process my backed-up data. (Extract media from conversations in an automated manner, build a more elaborate search, …)
I'm like that poster and backup all my chats obsessively, since way back in the day, and experienced a period with Signal where it was impossible for me to access my own data because of their position.
Took me a really long time to realize that I should scroll. Because why would I? There is absolutely no indication that there is anything to scroll to.
I clicked on the two avatars but that didn't get me very far and the only thing left to click was "by alvin chang" but that was about as fruitful as I imagined it would be.
So I assumed it was a podcast, re-checking that I had audio on etc. But nope, so I checked another browser. Same there... Then I read HN comments, ah ... Great design? ...
Same here — once you get the scrolling part it's pretty great, but like you I was stuck at the top for a while. A downwards-pointing arrow on the hero would help a lot here.
Firefox in Windows has the tiniest little scrollbar indicator in the top right that honestly blends in very well with the background. I didn't realize I needed to scroll until I came to the comments. I clicked around... got some interaction... but basically left the first time being very confused.
I have Firefox on macOS as well, but I don't see a scroll bar until I start scrolling. Could be because I'm using an external trackpad, and not a mouse.
I was going to say that somehow I knew I had to scroll the first time I entered. But I went back after reading your comment and I have no idea how I found out the first time; there is no indication that there is content below.
I was viewing on desktop and the blank space all around made it immediately feel like an article that required a scroll to view the content below the fold.
Seeing the timestamps change as I scrolled and seeing a progress "bar" update within the speech balloons during the dialogs made it more obvious I just had to scroll to see the content change.
I do think the progress bar color is low contrast enough that some might not see it and not realize they have to scroll to cause the dialog to update, though.
> Took me a really long time to realize that I should scroll. Because why would I? There is absolutely no indication that there is anything to scroll to.
> I clicked on the two avatars but that didn't get me very far and the only thing left to click was "by alvin chang" but that was about as fruitful as I imagined it would be.
Thank god, I wasn’t the only one, just posted a similar comment here.
A random macOS binary is more likely to run on another macOS install from anytime in the last half decade than a Linux binary on the same distribution.
Even Apple’s famously fast deprecation is rock-solid by comparison.
I'm not sure why you think this is a good metric; the space of "random Mac binaries" is far smaller. There's probably something to be said for this "curation," but you pay for it, both literally with money and in limited selection.
I don’t know; you don’t think having Win32 be the unofficial API is a problem?
It literally means Windows will always exist - as the preferred IDE and Reference Spec for the Linux desktop. It also means all evolution of Linux will be ironically constrained by Win32 compatibility requirements.
Your vision has motion blur. Staring at your screen at a fixed distance with no movement is highly unrealistic and lets you see crisp 4K images no matter the content. This results in a cartoonish experience, because it mimics nothing in real life.
Now you do have the normal problem that the designers of the game/movie can't know for sure what part of the image you are focusing on (my pet peeve with 3D movies) since that affects where and how you would perceive the blur.
There's also the problem of overuse, of using it to mask other issues, or of it being just an artistic choice.
But it makes total sense to invest in a high refresh display with quick pixel transitions to reduce blur, and then selectively add motion blur back artificially.
Turning it off is akin to cranking the brightness up to 400% because otherwise you can't make out details in the dark parts of the game... that's the point.
But if you prefer it off then go ahead, games are meant to be enjoyed!
Your eyes do not have built-in motion blur. If they are accurately tracking a moving object, it will not be seen as blurry. Artifically adding motion blur breaks this.
Sure they do, the moving object in focus will not have motion blur but the surroundings will. Motion blur is not indiscriminately adding blur everywhere.
> Motion blur is not indiscriminately adding blur everywhere.
Motion blur in games is inaccurate and exaggerated and isn’t close to presenting any kind of “realism.”
My surroundings might have blur, but I don’t move my vision in the same way a 3d camera is controlled in game, so in the “same” circumstances I do not see the blur you do when moving a camera in 3d space in a game. My eyes jump from point to point, meaning the image I see is clear and blur free. When I’m tracking a single point, that point remains perfectly clear whilst sure, outside of that the surroundings blur.
However, motion blur in games literally cannot replicate either of these realities; it just adds a smear on top of a smear on top of a smear.
So given that both are unrealistic, I'd prefer the one that's far closer to how I actually see, which is the one without yet another layer of blur. Modern displays add blur and modern rendering techniques add more; I don't need EVEN more piled on with in-game blur.