
I appreciate the Cursed Knowledge section on their website https://immich.app/cursed-knowledge

> Cursed knowledge we have learned as a result of building Immich that we wish we never knew



Reading it I see this

  50 extra packages are cursed

  There is a user in the JavaScript community who goes around adding "backwards compatibility" to projects. They do this by adding 50 extra package dependencies to your project, which are maintained by them.

which brings us to this user: Jordan Harband https://github.com/sponsors/ljharb Does anyone know what they actually mean by that cursed-knowledge point? And what is the "backwards compatibility" that Jordan also boasts about in his GH profile?


To not just link to another thread: the specialty of ljharb's issues sits somewhere between "JavaScript is a very dynamic programming language that grew a lot, and quite fast" and "we cannot trust developers to do the right thing".

His libraries tend to build on older runtime implementations and freeze every used function at runtime, so they provide "second-run safety" and "backwards compatibility". Developers disagree with some of their effects, such as a grown dependency tree and performance impacts of multiple orders of magnitude (as measured in micro-benchmarks). ljharb seems to follow a rather strong ideology, but he is a member of the TC39 group and a highly trusted person.


ljharb is also, conveniently, paid per download. His actions border on malicious, especially when viewed from a supply-chain-attack angle.

https://github.com/A11yance/axobject-query/pull/354#issuecom...


It definitely feels a bit strange and potentially alarming, but after reading through that whole thread he ultimately seems like a sincere person doing work that he thinks matters, now getting dogpiled for it.


If he had kept his strange and alarming behavior to himself, he wouldn't be 'getting dogpiled' for it now.

The problem is that he's forcing his ways on others. If we're identifying an aggressor here, it's him. The project maintainers are the victims.


At least in the thread linked here, it seems like his maintainership over the project is legitimate, which makes it wrong to characterize him as "forcing" his ways on anyone.


Even ignoring that examples of his behavior are easily found elsewhere, the link itself shows him completely disregarding feedback from other contributors to force his own way.

Honestly, I can't understand the intent behind such a defensive rebuttal to the criticism of his actions.


I don't care one way or another. I'm not a JS developer. I'm just struck by a reaction that seems quite extreme, and very visible dogpiling.


My point wasn't about JavaScript. He got pushback because he ignored everyone and just did his own thing. It has nothing to do with JavaScript, and you can see that in the link. That's a weird excuse.


I haven't found one person who agrees with him on what he thinks matters. His way is wasteful and slow and just indefensible.


Also, I imagine the cost of the globally wasted CPU cycles is much higher than what he profits. It's pure abuse of resources.


I don't have much to add myself, but there was a bit of discussion around this back in August that you might be interested in: https://news.ycombinator.com/item?id=44831811


Wow! Didn't know Immich's Cursed page already had a dedicated post on HN.

I love reading about open-source drama, especially when it's some technology I don't use directly; it's like watching a soap opera.


This user makes money off of how many downloads their packages receive.

https://github.com/A11yance/axobject-query/pull/354#issuecom...


What a dumpster fire.

Is he really being paid per download, or is he just being sponsored? It’s not clear if either would imply some form of malicious intent either.


Seems like this thread answers your question https://news.ycombinator.com/item?id=37604373


Agree, I wish every project would have that!

I still think the conclusion on "setTimeout is cursed"[0] is faulty:

> The setTimeout method in JavaScript is cursed when used with small values because the implementation may or may not actually wait the specified time.

The issue to me seems to be that performance.now()[1] returns the timestamp in milliseconds and will therefore round up/down. So 1ms errors are just within its tolerance.

[0]: https://github.com/immich-app/immich/pull/20655

[1]: https://developer.mozilla.org/en-US/docs/Web/API/Performance...
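If the timestamps really are quantized to whole milliseconds, the ±1ms error follows directly: flooring start and end independently can shift a measured duration by up to a full tick in either direction. A minimal sketch of that effect (the integer clock here is simulated, not performance.now() itself):

```javascript
// Simulate a clock that only reports whole milliseconds: the true
// timestamps (fractional ms) are floored before subtracting, the way a
// millisecond-resolution timer effectively behaves.
function measuredMs(trueStart, trueEnd) {
  return Math.floor(trueEnd) - Math.floor(trueStart);
}

// A true 0.01ms interval can read as a full 1ms...
console.log(measuredMs(0.99, 1.0));
// ...while a true 1.99ms interval can also read as 1ms.
console.log(measuredMs(0.0, 1.99));
```

So any single measured value of "1ms" is consistent with true durations anywhere from just above 0ms to just below 2ms, which is exactly the tolerance the comment above describes.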


JS is not a realtime language.

setTimeout() does not actually guarantee that the callback runs right after the specified time. It merely gets queued for the next async execution window after the timer elapses. Hence it can also be off by infinity and never get called, because JS is single-threaded (unless you use a worker, which comes with its own challenges) and async windows only open when the main thread is "idle".

Usually, this is very close to the time you set via setTimeout, but it's very frequently slightly off, too.
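That "best effort" behavior is easy to demonstrate: block the main thread, and a short timer fires late. A runnable sketch (Node or browser console; the ~50ms figure is just for illustration):

```javascript
// setTimeout's delay is a floor, not a schedule: the callback only runs
// once the main thread is free. Here a 10ms timer fires ~50ms late
// because a busy loop keeps the single main thread occupied.
const start = Date.now();
setTimeout(() => {
  // By the time this runs, the blocking loop below has finished, so the
  // observed delay is at least ~50ms, not the requested 10ms.
  console.log('elapsed:', Date.now() - start);
}, 10);
// Block the main thread for ~50ms; no timer callback can run meanwhile.
while (Date.now() - start < 50) {}
```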


setTimeout guarantees that at least the specified time has elapsed before the callback runs, if it runs at all - I think that is known to every JavaScript engineer out there.

Then there are also gotchas like these[0][1]:

> As specified in the HTML standard, browsers will enforce a minimum timeout of 4 milliseconds once a nested call to setTimeout has been scheduled 5 times.

Still, the issue is rather how to measure the elapsed time reliably, for unit-tests among other things.

[0]: https://developer.mozilla.org/en-US/docs/Web/API/Window/setT...

[1]: https://html.spec.whatwg.org/multipage/timers-and-user-promp... (first Note)
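For the unit-test measurement problem, one option is to assert with a one-sided tolerance instead of expecting an exact delay. A hypothetical helper sketching that idea (the name and the 1ms default are illustrative, not from any library):

```javascript
// Since setTimeout only promises "at least this long", a measured delay
// can legitimately overshoot by any amount, but should only undershoot
// by the clock's quantization error (~1ms for millisecond timestamps).
function timerDelayPlausible(requestedMs, measuredMs, clockResolutionMs = 1) {
  return measuredMs >= requestedMs - clockResolutionMs;
}
```

Usage: `timerDelayPlausible(100, 99.5)` passes (within quantization error), `timerDelayPlausible(100, 150)` passes (overshoot is always legal), and `timerDelayPlausible(100, 50)` fails.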


Indeed - I was a bit surprised by them mentioning this, to be honest, since, as I understand it, this is a widely accepted limitation of setTimeout - it's purely a 'best effort' timer. It's not intended to be something where "yes, after exactly Xms it'll execute".


This isn't quite the whole picture. If called in a deeply nested context, `setTimeout` callbacks get executed in the next execution window, or at least 4ms after the call, whichever is later. Similarly, I believe `setInterval` has a minimum interval that it can't run faster than.

See: https://developer.mozilla.org/en-US/docs/Web/API/Window/setT...


> JS is not a realtime language.

Is there even such a thing? You're at the mercy of the platform you're running on. And Windows, Linux, Mac, Android, and iOS are not realtime to begin with.

I guess if you're running on a realtime platform but in a VM like JS does, you can then take that property away, downgrading the "language" from being realtime. I wouldn't call that a language property still though, maybe my VM implementation doesn't make that downgrade after all.


They think Postgres is cursed with a 2^16 parameter limit; SQL Server has a parameter limit of ~2,100. I guess at least it's low enough that you're going to fail early.
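The standard workaround for either limit is to chunk rows so each statement stays under the driver's bind-parameter budget. A hedged sketch (the 65535 constant matches the Postgres wire-protocol limit mentioned above; `runInsert` is a hypothetical query function, not any specific driver's API):

```javascript
// Split an array of rows into batches of at most `size` rows.
function chunk(rows, size) {
  const out = [];
  for (let i = 0; i < rows.length; i += size) {
    out.push(rows.slice(i, i + size));
  }
  return out;
}

// Issue one multi-row INSERT per batch, sized so that
// rowsPerBatch * columnsPerRow never exceeds the parameter limit.
const PARAM_LIMIT = 65535;
function insertAll(rows, columnsPerRow, runInsert) {
  const rowsPerBatch = Math.floor(PARAM_LIMIT / columnsPerRow);
  for (const batch of chunk(rows, rowsPerBatch)) {
    runInsert(batch);
  }
}
```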


Sure, but the SQL Server protocol (TDS) has a dedicated Bulk Insert operation for exactly that functionality. TDS isn't perfect, but it is much better than the PostgreSQL wire protocol v3.

Someday I want to build a DB front-end where you send up some kind of Iceberg/Parquet file (or similar) and get a similar file format back over a QUIC-based protocol. Like QUIC streams, persistent connections could be virtualized, and bulk insert could be sane and normalized: e.g. insert these rows into a table or temp table, then execute this script referencing it. While I'm at it, I'd normalize PL/SQL so even brain-dead back-ends (SQLite) could use procedural statements and in-database logic.


and then you write `.chunked(list)` so you can write `.map { query(list) }` instead of `.map { query(it) }` :)

I wish there were an unused-lambda-parameter warning..


I love it. Immediately added to my daily note-taking practice


the "personality" of this team is just such a joy...


Are you being sarcastic or just an ellipsis (…) abuser? If the former, what for?



