I could not agree more with your first paragraph. The only other language I've used that gave me that experience was Haskell, and while there are good arguments to be made for using Haskell in production, it should be obvious that it's not a language that will ever become mainstream.
I'm hoping that as Swift evolves over time, it will slowly become that sort of language. Right now it's pretty hard to write any real-world code in Swift that doesn't work with the Cocoa frameworks, and the Cocoa frameworks are typed in an obj-c-compatible way (even if new frameworks are written in Swift, they'll need to maintain obj-c compatibility), which means you don't get the strong typing that's necessary for this property. Pure Swift code has the potential to behave like this, although you probably need to avoid the ImplicitlyUnwrappedOptional feature (the ! suffix on types), which primarily exists for ease of obj-c integration anyway.
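To illustrate the discipline I mean, here's a hedged sketch in Rust, whose Option type closely mirrors Swift's Optional (the `parse_port` helper is made up for illustration): without an implicitly-unwrapped escape hatch, the compiler forces every "no value" case to be handled explicitly.

```rust
// Hypothetical helper: parsing can fail, so the signature says Option.
fn parse_port(s: &str) -> Option<u16> {
    // str::parse returns a Result; ok() drops the error into an Option.
    s.parse::<u16>().ok()
}

fn main() {
    // The type system won't let us touch the value without unwrapping:
    match parse_port("8080") {
        Some(p) => println!("listening on {}", p),
        None => println!("invalid port"),
    }
    // There is no way to accidentally dereference a missing value,
    // which is exactly what an implicitly-unwrapped type gives up.
}
```

The point is that the guarantee only holds if nothing in your dependency graph opts out of it, which is why the obj-c bridging types undermine it.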
I'm bringing up Swift because, with Apple's backing, it's very quickly becoming a "mainstream" language. I put that in quotes because it is only usable with iOS and OS X programming (for now at least), but iOS is large enough that obj-c should be considered a mainstream language despite the fact that almost nobody outside of iOS/OS X uses it, and therefore as Swift supplants obj-c it becomes appropriate to call it mainstream.
Regarding parallelism, I've been in love with Rust for a long time now, and one of the biggest reasons is because Rust makes parallelism safe. As an iOS/OS X programmer by trade, I think thread safety is far and away the biggest elephant in the room. Despite the fact that we've known that multithreading is the future for years, and despite the wonderful Grand Central Dispatch library on iOS/OS X, most programmers still think in a single-threaded mindset and don't even consider how their code should operate if invoked on a separate thread. This was one of my bugaboos with Go back when I was using that language (which was from the day it was announced right up until I discovered Rust, though admittedly my usage was in hobby projects and nothing serious).
I applaud the fact that Go has a data race detector now, which I used to finally uncover a lurking data race that plagued one of my programs for months (and which was ultimately caused by the Go library using two goroutines where I expected one, and therefore data which I expected to be on a single goroutine was actually mutated from two goroutines simultaneously). But I think Rust is absolute proof that a modern language can be designed such that data races are prohibited at compile-time without sacrificing any language flexibility.
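The compile-time guarantee can be sketched minimally (a hypothetical counter, not the program described above): Rust simply will not let two threads share mutable state unless it's wrapped in a thread-safe type.

```rust
use std::sync::{Arc, Mutex};
use std::thread;

fn parallel_count(threads: usize) -> u32 {
    let counter = Arc::new(Mutex::new(0u32));
    let handles: Vec<_> = (0..threads)
        .map(|_| {
            let counter = Arc::clone(&counter);
            thread::spawn(move || {
                // Each thread must lock before mutating. Dropping the
                // Mutex and mutating through the Arc directly would be
                // rejected at compile time, not become a runtime race.
                *counter.lock().unwrap() += 1;
            })
        })
        .collect();
    for h in handles {
        h.join().unwrap();
    }
    let total = *counter.lock().unwrap();
    total
}

fn main() {
    println!("{}", parallel_count(2));
}
```

The kind of bug I described -- a second goroutine mutating data I thought was confined to one -- is a compile error in this model, because the closure can't capture the data at all without a Sync-safe wrapper.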
> I could not agree more with your first paragraph. The only other language I've used that I've had that experience with was Haskell, and while there are good arguments to be made for using Haskell in production, it should be obvious that's not a language that will ever become mainstream.
I don't think it is at all obvious that Haskell won't become mainstream. It's already exerted a tremendous influence over many other mainstream languages and there's only so long that can happen before people just start going directly to the source of the innovations (or one of its direct descendants).
I agree Haskell has had an important influence, but I don't see why people would necessarily "go directly" to it because of that. In fact, I would argue just the reverse: people chose the derivative languages because they provide things the original does not.
To wit, Lisp never became mainstream despite exerting a huge influence. Likewise Smalltalk.
Technically true, but I don't think it makes sense to consider Clojure to be the same thing as Lisp in the context of ternaryoperator's statement (though I obviously can't speak for him). Notably, Clojure's tight integration with Java is the primary reason for its relative popularity, and what most sets it apart from the rest of the Lisp family. It is disingenuous to claim Clojure means Lisp has gone mainstream when the popularity of Clojure is not due to its inclusion in the Lisp family.
Beyond that, I'm not quite sure Clojure counts as "mainstream" yet. According to the TIOBE Index, it doesn't even rank in the top 50 languages. Heck, the top 20 includes R and Dart, neither of which I would call "mainstream" (I'm actually really surprised at how high R is ranking). I don't know how significant that is, though the TIOBE Index measures "number of skilled engineers world-wide, courses, and third party vendors", and that seems like a reasonable approximation for "mainstream" to me.
There are several languages targeting the JVM these days. And, what obviously sets Clojure apart from other JVM languages is its Lispiness (i.e., the JVM is constant across JVM languages).
Clojure is not married to the JVM either -- in fact, it has been hinted that it would jump ship if something better comes along or the current situation becomes less viable. Furthermore, we already have a dialect of Clojure called ClojureScript which targets JavaScript/node/V8.
And, I look at the JVM as really merely a library/API/runtime. C++ has STL and stdio and such and they are not part of the language proper but rather merely libraries for interacting with the underlying operating system (in a platform independent way). The same is true for the JVM with respect to Clojure and Scala et al.
Yeah, but nothing in it is Smalltalk-specific. It's not like Smalltalk survives in the mainstream because of HotSpot (in the way that, say, Algol survives).
Well, a counterpoint then would be: why did those vendors not insist on Smalltalk? Why wasn't Smalltalk pushed more heavily by some big vendor?
It's not like SUN was the only player in town. IBM pushed Smalltalk IIRC.
I think this (from StackOverflow) tells a more comprehensive story:
• When Smalltalk was introduced, it was too far ahead of its time in terms of the kind of hardware it really needed
• In 1995, when Java was released to great fanfare, one of the primary Smalltalk vendors (ParcPlace) was busy merging with another (Digitalk), and that merger ended up being more of a knife fight
• By 2000, when Cincom acquired VisualWorks (ObjectStudio was already a Cincom product), Smalltalk had faded from the "hip language" scene
Even worse than the problem of uncommon concepts such as monads is that Haskell's memory footprint is extremely hard to reason about. A few years ago it was impossible with the HTTP libraries to download a file without the program consuming several times as much memory as the downloaded file.
> Even worse than the problem of uncommon concepts such as monads
Just go ahead and learn the typeclass hierarchy and such -- it really is quite a useful higher level of abstraction in whatever language you choose. And it will definitely enter the mainstream even more than it already has (Swift, Scala & C# all have monadic constructs).
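The "monadic constructs" in question can be sketched with Rust's Option type -- the same bind/flatMap idea behind Swift's Optional.flatMap and C#'s SelectMany (the helper functions here are made up for illustration):

```rust
// Hypothetical helper: the first character of a string, if any.
fn first_char(s: &str) -> Option<char> {
    s.chars().next()
}

// and_then is Option's monadic bind: the second step only runs if
// the first produced a value; otherwise None short-circuits through.
fn leading_digit(s: &str) -> Option<u32> {
    first_char(s).and_then(|c| c.to_digit(10))
}

fn main() {
    assert_eq!(leading_digit("42abc"), Some(4));
    assert_eq!(leading_digit("abc"), None); // 'a' is not a digit
    assert_eq!(leading_digit(""), None);    // no first char at all
}
```

Once you see that the same chaining pattern works for Option, Result, iterators, and futures, the abstraction pays for itself regardless of which language you write it in.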
> Haskell's memory footprint is extremely hard to reason about.
And you'd probably want to throw runtime in there as well.
I think this is relative -- it's not "extremely hard" for everyone. Also, many structured programmers found object orientation "extremely hard", but somehow the industry managed to progress through that era.
A common recipe people quote for good software is "a) first make it work, b) then make it fast".
Haskell is very good at a), and not bad at all at b). With the help of the profiler it shouldn't be that hard to determine a program's bottlenecks/leaks and fix them, as with any other language.
BTW, since you mention HTTP, I have this reading on my back burner [1], but from skimming it I found that for certain payloads a Haskell HTTP server may perform better than nginx (1.4, circa 2013?), which is an impressive feat.
Rust is not your typical lower-level language though. It supports a lot of the features that functional programmers expect. It is an eagerly evaluated language that lets you drop to 'unsafe' code where necessary but in its natural form, it is surprisingly high level.
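A small taste of that "surprisingly high level" style (a toy example of my own, not from any particular codebase): closures and lazy iterator adapters, which the compiler turns into a plain loop with no allocation for the pipeline itself.

```rust
fn sum_of_even_squares(limit: u32) -> u32 {
    (1..=limit)
        .filter(|n| n % 2 == 0) // keep the even numbers
        .map(|n| n * n)         // square each one
        .sum()                  // fold into a single value
}

fn main() {
    // 4 + 16 + 36 + 64 + 100
    println!("{}", sum_of_even_squares(10));
}
```

That reads like code in a functional scripting language, but it compiles down to the same machine code you'd get from a hand-written C loop.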