Hacker News | new | past | comments | ask | show | jobs | submit | jusuhi's comments | login

Because i63 won't help with 32-bit variables.


...which is a bit of a weak reason when there are hardly any 32-bit CPUs around anymore. Picking a narrower integer type makes sense in specific situations like tightly packed structs, but in that case it usually doesn't matter whether the integer range is 2^31 or 2^32; if you really need the extra bit of range, it's usually fine to go to the next wider integer type.


Feel free to generalize my argument to i31 and i15.

Also, look around you and realize that non-64-bit microprocessors outnumber 64-bit machines by orders of magnitude. There is more to the world than your laptop and smartphone. (Just within 1m of myself at this very moment I see 4 devices with a 32-bit microprocessor in them, and I'm sitting in a coffee shop reading a book. Well, and browsing HN now and then.)


There's actually a better solution if you allow the compiler some wiggle room:

Zig and C23 (and probably a couple of other languages) support arbitrary-width integers, so integer types like u15, i3, i1, or even u512 are entirely 'normal', and even useful in day-to-day coding when mixing signed and unsigned integer math (for instance, Zig disallows assigning an unsigned integer to a signed integer of the same width, but it's perfectly fine to assign a 7-bit unsigned integer to an 8-bit signed integer).
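The range argument behind that rule can be checked mechanically. A minimal Python sketch (just illustrating the value ranges, not Zig's actual type checker; the function names are mine):

```python
def int_range(bits, signed):
    """Return the (min, max) value range of an integer type of the given width."""
    if signed:
        return (-(1 << (bits - 1)), (1 << (bits - 1)) - 1)
    return (0, (1 << bits) - 1)

def fits(src_bits, src_signed, dst_bits, dst_signed):
    """True if every value of the source type is representable in the destination."""
    src_lo, src_hi = int_range(src_bits, src_signed)
    dst_lo, dst_hi = int_range(dst_bits, dst_signed)
    return dst_lo <= src_lo and src_hi <= dst_hi

print(fits(8, False, 8, True))   # u8 -> i8: False (255 doesn't fit in -128..127)
print(fits(7, False, 8, True))   # u7 -> i8: True (0..127 fits in -128..127)
```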

Down on the CPU such operations happen at the register width, and if the integer width exceeds the register width, they are unrolled into a chain of op-with-carry instructions (the same way we did 16- or 32-bit math back on 8-bit processors).
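That carry-chain lowering can be sketched in Python by treating a wide integer as a list of 8-bit limbs, the way an 8-bit CPU would (a toy model of the idea, not any compiler's actual output):

```python
def add_wide(a, b, limb_bits=8):
    """Add two equal-length limb lists (least significant limb first),
    propagating the carry the way a chain of add-with-carry instructions does."""
    mask = (1 << limb_bits) - 1
    result, carry = [], 0
    for x, y in zip(a, b):
        s = x + y + carry
        result.append(s & mask)   # keep the low limb_bits bits
        carry = s >> limb_bits    # carry into the next limb
    return result, carry

# 16-bit add on an 8-bit machine: 0x01FF + 0x0001 = 0x0200
limbs, carry = add_wide([0xFF, 0x01], [0x01, 0x00])
print([hex(l) for l in limbs], carry)  # ['0x0', '0x2'] 0
```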


Surely your laptop and smartphone have several 32-bit microprocessors in them.


Pretty much all "64-bit" CPU architectures out there also have specific instructions for 32-bit arithmetic.

They were designed that way partly on purpose, to cater to the enormous amount of existing program code written with the assumption that an `int` is 32 bits. A large portion of that code was also written assuming that ints are 2's complement and wrap on overflow.
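That 32-bit wrapping behavior can be modeled in Python (a sketch of what the hardware does; note that C itself only guarantees wrapping for unsigned types, while signed overflow is formally undefined behavior even though it typically wraps in practice):

```python
def wrap_i32(x):
    """Reduce x to a 32-bit two's-complement value, the way a 32-bit
    hardware arithmetic instruction would."""
    x &= 0xFFFFFFFF                                  # keep the low 32 bits
    return x - 0x100000000 if x >= 0x80000000 else x # reinterpret as signed

INT32_MAX = 2**31 - 1
print(wrap_i32(INT32_MAX + 1))   # -2147483648: wraps around to INT32_MIN
print(wrap_i32(0xFFFFFFFF))      # -1: all-ones is -1 in two's complement
```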


The keywords here are operational transform (OT) and conflict-free replicated data types (CRDT). Fun to implement, there are lots of libs.
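For a feel of why these are fun to implement, here's a minimal grow-only counter (G-Counter), one of the simplest CRDTs; the class and method names are mine, not from any particular library:

```python
class GCounter:
    """Grow-only counter CRDT: each replica increments only its own slot,
    and merge takes the per-replica maximum. Merging is commutative,
    associative, and idempotent, so concurrent updates never conflict."""
    def __init__(self, replica_id):
        self.replica_id = replica_id
        self.counts = {}  # replica_id -> that replica's increment total

    def increment(self, n=1):
        self.counts[self.replica_id] = self.counts.get(self.replica_id, 0) + n

    def merge(self, other):
        for rid, c in other.counts.items():
            self.counts[rid] = max(self.counts.get(rid, 0), c)

    def value(self):
        return sum(self.counts.values())

# Two replicas increment concurrently, then sync in either order:
a, b = GCounter("a"), GCounter("b")
a.increment(3)
b.increment(2)
a.merge(b)
b.merge(a)
print(a.value(), b.value())  # 5 5
```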


The task of proving some statement and the task of finding the shortest, or a "reasonably short", proof are very different endeavours.

The first is about certainty that a statement is valid ("true"). The other is about simplifying the understanding of _why_ it is valid. Most of the time, you don't care much about the latter.


It would be nice to have a reproducible example of how the result was achieved though. Otherwise it's a compendium of results, not proofs.


What is your certainty that that statement is true? What if it were a calculation that takes decades on a supercomputer?


At current rates, whatever is done on a supercomputer today is done by a cheap pocket-size device just decades later. So, I'm not too worried about this case.

One of the first famous examples of this is the four-color theorem. I don't know any serious mathematician who is not certain of that result.


The "best answer" perhaps from the POV of your own intuition. But the question is about a historical fact, and those don't work that way.


> Is this a legacy thing or does a tilted cursor serves a purpose?

It's not just a question of historical fact. There's the question of where it originated. Then there are the documented reasons they first did it that way. Then the undocumented but plausible reasons it started that way. Then there are the reasons it has stayed this way. Those reasons aren't facts; they're counterfactuals.

Implicit in asking why the mouse cursor is like this is asking why it isn't some other way. If a sufficiently better cursor existed, it would have won out. So all of its functional advantages, even the ones the inventor didn't think of or that didn't matter historically, are part of why it is the way it is today.


I guess the first is the reason why it started that way, and the second is why it stayed that way.


If you'd actually read wherever the link goes, you could get an idea instead of just speculating wildly.


Phenomenon. That's the singular form.


> through the DOS computer language

Oh yes, that DOS computer language. Bill Gates was fluent in that language.

Totally makes me trust everything else in that article even more.


Misleading title. It wasn't just "outside". It was a controlled experiment under Mars-like conditions, not raw space directly outside the ISS.

