As for the article, I believe one of the reasons English, and by extension the US, ended up "owning" the computer revolution was that it was a widely spoken language with a simple alphabet. It has fewer letters than many other major languages and was easily encoded into the tiny computers of the 40s and 50s.
I think there were a lot of historical contingencies involved. The nature of the language itself (small vs large alphabet, etc) is probably one of the least important.
There was an interesting BBC article a while back about the decline of German usage in science: https://www.bbc.co.uk/news/magazine-29543708 They put it down to anti-German sentiment during WWI (not WWII as I would have assumed!)
It’s pretty easy to imagine an alternative world where German was the common international language of science and became the basis of most programming languages too.
The importance of German as a scientific language was even greater before WWI than claimed in that BBC article.
While the BBC article estimates that the scientific literature of the 19th century and of the pre-WWI 20th century was published roughly one third in French, one third in English and one third in German, I believe an estimate much closer to reality would be a quarter in French, a quarter in English and half in German.
That doesn’t make them American any more than Amazon is Japanese because they have a Japanese division.
The companies I listed are Asian businesses that make hardware in Asian regions for Asian consumers and also have a massive presence in the west too.
I could also list multiple Asian companies that don’t have a large presence in the US, but then you wouldn’t have heard of them, so what would the point of that be?
Some of the comments in this thread smack of “I’ve never needed to use anything outside of America so I just assume English-speaking businesses are the only thriving industries”.
All three languages have a few more letters than English, and some of those letters carry diacritics.
Plus the first computers used only uppercase letters, which were designed much like the letters the Romans carved in stone. So it was far easier to design printers, storage and punch cards when you only cared about 26 letters, 10 digits and a few punctuation marks.
The “first computers” (which can actually mean a lot of different things depending on what you’re defining as a computer) didn’t output text at all.
Even in the digital era, they output binary rather than text. In the 50s it would have been machine code in and machine code out.
Punch cards would have been binary too: initially machine code, later text encodings stored as binary. And since machines back then weren’t standardised on 8-bit bytes, character sets could be larger or smaller; a 6-bit machine, for instance, only has room for 64 distinct characters.
There were plenty of Japanese, Chinese and Russian computers using non-Latin characters. There were also plenty of European computers that supported native characters outside the standard 26 English letters. That’s why character encodings were such a nightmare to work with prior to Unicode’s adoption (and frankly, Unicode creates a new set of problems, but that’s a different topic entirely).
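To make the pre-Unicode mess concrete, here’s a minimal sketch (in Python, with modern codec names standing in for the era’s incompatible code pages, purely as an illustration): the same raw bytes read completely differently depending on which encoding the receiving system assumed.

    # The same three bytes decoded under four different legacy encodings.
    # Which interpretation is "right" depends entirely on out-of-band knowledge.
    raw = bytes([0xC4, 0xC5, 0xC6])

    for codec in ("latin-1", "koi8-r", "cp437", "iso8859-7"):
        print(f"{codec:10} -> {raw.decode(codec)}")

    # latin-1 yields Western European letters (ÄÅÆ), koi8-r yields Cyrillic,
    # cp437 yields box-drawing characters, iso8859-7 yields Greek.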
Just because you haven’t had to work with non-Latin, or even non-American, encodings doesn’t mean that all machines were English-centric.
Most of the lowercase Cyrillic letters are just smaller versions of the uppercase ones. And neither the German nor the Russian alphabet is larger to any extent that matters.
CJKV languages are a subtler subject. Vietnamese nowadays uses a Latin-based alphabet, where only the tone marks are slightly difficult to typeset. Japanese and Korean could have gotten rid of Kanji/Hanja if they had really wanted to. But in any case, printing technology for Chinese characters existed and was in wide use at the turn of the 20th century.
French and German are both completely intelligible when written in all caps with no diacritics (in French by simply omitting them, in German by replacing e.g. Ü with UE).
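As a rough illustration of that German convention, here’s a hypothetical little sketch (not any particular historical system) of what an all-caps, diacritic-free device might have printed:

    # Fold German umlauts and eszett to plain ASCII, then uppercase --
    # roughly what an all-caps, diacritic-free system would have output.
    SUBS = {"ä": "ae", "ö": "oe", "ü": "ue",
            "Ä": "Ae", "Ö": "Oe", "Ü": "Ue", "ß": "ss"}

    def to_plain_caps(text):
        for src, dst in SUBS.items():
            text = text.replace(src, dst)
        return text.upper()

    print(to_plain_caps("Über Größe"))  # UEBER GROESSE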
https://guidetogrammar.org/grammar/twain.htm