I see this as a last-ditch attempt by Microsoft to stay semi-relevant. Of course this was the great promise of .NET to begin with, but it seems rather late to exercise the option now (some 15 years or so after .NET was created)-- a very desperate move by a dying empire.
> F# has a number of these specific implementations but they're kind of "hard-coded" in the language rather than fully extensible.
That's not true. You can extend the language yourself with "computation expressions" (and that's in fact how the 'async' and 'let!' constructs, among others, are implemented).
True, but what you can't do is abstract over them; you can't write a function that operates on "some generic computation expression" (you can't even write that type). So you can't write "sequence" or "traverseM" or more complicated/specific things you build on top of these.
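For contrast, here's roughly what that kind of abstraction looks like in Haskell, where you write 'sequence' once against the Monad class rather than once per computation expression. (A sketch; the Prelude already provides this as 'sequence'/'sequenceA', the primed name just avoids the clash.)

```haskell
-- 'sequenceM' is defined once, for any Monad 'm'. F# computation
-- expressions have no type you could put in place of 'm' here.
sequenceM :: Monad m => [m a] -> m [a]
sequenceM []     = return []
sequenceM (x:xs) = do
  v  <- x
  vs <- sequenceM xs
  return (v : vs)

-- The same definition then works for Maybe, IO, lists, parsers, ...
main :: IO ()
main = do
  print (sequenceM [Just 1, Just 2])   -- Just [1,2]
  print (sequenceM [Just 1, Nothing])  -- Nothing
```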
There is an encoding supporting that level of abstraction (perhaps though not as straightforward as you'd prefer); namely using interfaces (which are first-class) to encode module signatures. See: https://gist.github.com/t0yv0/192353
The operators on collections always have alternate named forms. I actually instinctively skip over the operators when perusing the documentation and have no trouble at all finding what I need.
Not to be 'that guy'™ but I think your last sentence is missing at least a word. Also, there are comma splices throughout. :) [Maybe that was the real reason you had trouble finding a job writing]. [just kidding] :)
That premise is definitely true. It is a result of Gödel's Incompleteness Theorems. 'Physics' certainly satisfies the constraint: 'of sufficient complexity to encode the natural numbers'.
I wonder how you come to that conclusion, because physics is not proven to be of "infinite complexity" (loose statement).
Some things that we are used to from mathematics might not be true in physics. Take, for example, the fact that in mathematics the real numbers are uncountable. In reality (physics), the whole set of real numbers may not exist; it is only an abstract concept from mathematics. And while it may be possible to reproduce any real number in physics as some quantity, you are reproducing them as you go, making the "real" (physical) real numbers countable.
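One way to make that intuition precise (my gloss, not the parent's wording): any real you can actually "reproduce" by a finite procedure is a computable real, and there are only countably many of those.

```latex
% x is computable if some finite program M_x outputs rationals q_n
% with |x - q_n| < 2^{-n} for all n. Programs are finite strings
% over a finite alphabet, hence countable, while Cantor's diagonal
% argument shows R itself is not:
\[
  \mathbb{R}_{\mathrm{comp}} = \{\, x \in \mathbb{R} : \exists\, M_x \,\}
  \quad\Longrightarrow\quad
  |\mathbb{R}_{\mathrm{comp}}| = \aleph_0 < 2^{\aleph_0} = |\mathbb{R}|.
\]
```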
Pretty much any system of logic worth looking at (including any which the sciences may be based on and the one running inside each human brain) is going to at least be as complex as this set of axioms.
Also, you really should read the J.R. Lucas material-- it explains this. And, I'd suggest Nagel & Newman's _Gödel's Proof_ for a great introductory explanation of the Incompleteness Theorems.
We don't know that the physical world has infinite precision. In particular, time doesn't even seem to be continuous as far as we can tell (cf. the Planck time). And we think there's a finite amount of mass in the universe, so how could we encode arbitrarily large natural numbers (as is required to model Peano arithmetic)?
We don't have to encode arbitrarily large natural numbers. Rather we have to encode the rules that allow us to construct them (which is quite simple actually). And, I think 'digital physics' is more compatible with Incompleteness implications than the alternatives; not less.
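A sketch of just how little machinery that takes, using Haskell as the notation (the point being that you encode the construction rules, not the numbers themselves):

```haskell
-- Peano naturals: a zero and a successor rule. Every natural is
-- reachable from these two constructors; none is stored "in full".
data Nat = Zero | Succ Nat

-- Addition by recursion on the first argument, mirroring the
-- Peano axioms: 0 + n = n, S(m) + n = S(m + n).
add :: Nat -> Nat -> Nat
add Zero     n = n
add (Succ m) n = Succ (add m n)

-- Convert to Int just so we can show a result.
toInt :: Nat -> Int
toInt Zero     = 0
toInt (Succ n) = 1 + toInt n

main :: IO ()
main = print (toInt (add (Succ (Succ Zero)) (Succ Zero)))  -- 3
```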
> I could not agree more with your first paragraph. The only other language I've used that I've had that experience with was Haskell, and while there are good arguments to be made for using Haskell in production, it should be obvious that's not a language that will ever become mainstream.
I don't think it is at all obvious that Haskell won't become mainstream. It's already exerted a tremendous influence over many other mainstream languages and there's only so long that can happen before people just start going directly to the source of the innovations (or one of its direct descendants).
I agree Haskell has had an important influence, but I don't see why people would necessarily "go directly" to it because of that. In fact, I would argue just the reverse: people chose the derivative languages because they provided things the original did not.
To wit, Lisp never became mainstream despite exerting a huge influence. Likewise Smalltalk.
Technically true, but I don't think it makes sense to consider Clojure to be the same thing as Lisp in the context of ternaryoperator's statement (though I obviously can't speak for him). Notably, Clojure's tight integration with Java is the primary reason for its relative popularity, and what most sets it apart from the rest of the Lisp family. It is disingenuous to claim Clojure means Lisp has gone mainstream when the popularity of Clojure is not due to its inclusion in the Lisp family.
Beyond that, I'm not quite sure Clojure counts as "mainstream" yet. According to the TIOBE Index, it doesn't even rank in the top 50 languages. Heck, the top 20 includes R and Dart, neither of which I would call "mainstream" (I'm actually really surprised at how high R is ranking). I don't know how significant that is, though: the TIOBE Index measures "number of skilled engineers world-wide, courses, and third party vendors", and that seems like a reasonable approximation for "mainstream" to me.
There are several languages targeting the JVM these days. And, what obviously sets Clojure apart from other JVM languages is its Lispiness (i.e., the JVM is constant across JVM languages).
Clojure is not married to the JVM either-- in fact, it has been hinted that it would jump ship if something better comes along or the current situation becomes less viable. Furthermore, we already have a dialect of Clojure called ClojureScript which targets JavaScript/node/V8.
And, I look at the JVM as really merely a library/API/runtime. C++ has the STL and stdio and such, and they are not part of the language proper but rather merely libraries for interacting with the underlying operating system (in a platform-independent way). The same is true for the JVM with respect to Clojure, Scala, et al.
Yeah, but nothing in it is Smalltalk-specific. It's not like Smalltalk survives in the mainstream because of Hotspot (in the way that, say, Algol survives).
Well, a counterpoint then might be: why did those vendors not insist on Smalltalk? Why wasn't Smalltalk more heavily pushed by some big vendor itself?
It's not like Sun was the only player in town. IBM pushed Smalltalk, IIRC.
I think this (from StackOverflow) tells a more comprehensive story:
• When Smalltalk was introduced, it was too far ahead of its time in terms of what kind of hardware it really needed.
• In 1995, when Java was released to great fanfare, one of the primary Smalltalk vendors (ParcPlace) was busy merging with another (Digitalk), and that merger ended up being more of a knife fight.
• By 2000, when Cincom acquired VisualWorks (ObjectStudio was already a Cincom product), Smalltalk had faded from the "hip language" scene.
Even worse than the problem of uncommon concepts such as monads is that Haskell's memory footprint is extremely hard to reason about. A few years ago it was impossible with the HTTP libraries to download a file without the program consuming several times as much memory as the downloaded file.
> Even worse than the problem of uncommon concepts such as monads
Just go ahead and learn the typeclass hierarchy and such-- it really is quite a useful higher level of abstraction in whatever language you choose. And it definitely will enter the mainstream (even more than it already has [Swift, Scala & C# all have monadic constructs]).
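For anyone wondering what "the typeclass hierarchy" refers to, here's its core in Haskell. (A simplified sketch, not the actual Prelude definitions, which carry more methods and laws; primed names avoid clashing with the real classes.)

```haskell
-- A simplified Functor -> Applicative -> Monad hierarchy.
class Functor' f where
  fmap' :: (a -> b) -> f a -> f b

class Functor' f => Applicative' f where
  pure' :: a -> f a
  ap'   :: f (a -> b) -> f a -> f b

class Applicative' f => Monad' m where
  bind' :: m a -> (a -> m b) -> m b

-- Maybe is the usual first instance: failure short-circuits.
instance Functor' Maybe where
  fmap' f (Just x) = Just (f x)
  fmap' _ Nothing  = Nothing

instance Applicative' Maybe where
  pure' = Just
  ap' (Just f) (Just x) = Just (f x)
  ap' _        _        = Nothing

instance Monad' Maybe where
  bind' (Just x) f = f x
  bind' Nothing  _ = Nothing

main :: IO ()
main = print (bind' (Just 2) (\x -> Just (x * 21)))  -- Just 42
```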
> Haskell's memory footprint is extremely hard to reason about.
And you'd probably want to throw runtime in there as well.
I think this is relative-- it's not "extremely hard" for everyone. Also, many structured programmers found object orientation "extremely hard" but somehow the industry managed to progress through that era.
A common recipe people quote for good software is "a) first make it work, b) then make it fast".
Haskell is very good at a), and not bad at all at b). With the help of the profiler it shouldn't be that hard to determine a program's bottlenecks/leaks and fix them, as with any other language.
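As a concrete example of the kind of leak the profiler turns up (the classic lazy-accumulator one, my example rather than the HTTP case above): compiled with `ghc -O2 -rtsopts` and run with `+RTS -s`, the two versions below show very different memory behavior, and the fix is usually this mechanical.

```haskell
import Data.List (foldl')

-- Lazy foldl builds a chain of unevaluated thunks ((0+1)+2)+...,
-- so memory grows with the length of the input list.
leaky :: Integer
leaky = foldl (+) 0 [1 .. 10000000]

-- The strict variant forces the accumulator at each step and
-- runs in constant space.
fixed :: Integer
fixed = foldl' (+) 0 [1 .. 10000000]

main :: IO ()
main = print fixed
```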
BTW, since you mention HTTP, I have this reading on my back burner [1], but from skimming it I found that for certain payloads a Haskell HTTP server may perform better than nginx (1.4, circa 2013?), which is an impressive feat.
Rust is not your typical lower-level language, though. It supports a lot of the features that functional programmers expect. It is an eagerly evaluated language that lets you drop to 'unsafe' code where necessary, but in its natural form it is surprisingly high-level.