yohanatan's comments | Hacker News


Nope, nothing at all like that.


I see this as a last-ditch attempt by Microsoft to stay semi-relevant. Of course, this was the great promise of .NET to begin with, but it seems rather late to exercise the option now (some 15 years or so after .NET was created)-- a very desperate move by a dying empire.


> F# has a number of these specific implementations but they're kind of "hard-coded" in the language rather than fully extensible.

That's not true. You can extend the language yourself with "computation expressions" (and that's in fact how the 'async', 'let!' et al keywords are implemented).


True, but what you can't do is abstract over them; you can't write a function that operates on "some generic computation expression" (you can't even write that type). So you can't write "sequence" or "traverseM" or more complicated/specific things you build on top of these.
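
For concreteness, here's roughly what that abstraction looks like in Haskell, where the quantification over an arbitrary monad is exactly the part a computation expression can't state. A minimal sketch (named sequenceM only to avoid clashing with the Prelude's sequence):

    -- Works for *any* Monad m: lists of IO actions, of Maybe values, etc.
    sequenceM :: Monad m => [m a] -> m [a]
    sequenceM []     = return []
    sequenceM (x:xs) = do
      v  <- x
      vs <- sequenceM xs
      return (v : vs)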


There is an encoding supporting that level of abstraction (though perhaps not as straightforward as you'd prefer); namely, using interfaces (which are first-class) to encode module signatures. See: https://gist.github.com/t0yv0/192353


The operators on collections always have alternate named forms. I actually instinctively skip over the operators when perusing the documentation and have no trouble at all finding what I need.


I think a succinct way to put it is: on-the-fly "profile-guided optimization". A good JITter will do this automatically (as you said).


Not to be 'that guy'™ but I think your last sentence is missing at least a word. Also, there are comma splices throughout. :) [Maybe that was the real reason you had trouble finding a job writing.] [Just kidding.] :)


I'm not so sure that the pharaoh in the background was impressed.


That premise is definitely true. It is a result of Gödel's Incompleteness Theorems. 'Physics' certainly satisfies the constraint: 'of sufficient complexity to encode the natural numbers'.

See: http://users.ox.ac.uk/~jrlucas/Godel/implic.html [particularly "reality outruns knowledge"]


I wonder how you come to that conclusion, because physics is not proven to be of "infinite complexity" (loose statement).

Some things that we are used to from mathematics might not be true in physics. Take, for example, the fact that in mathematics the real numbers are uncountable. Now, in reality (physics), the whole set of real numbers may not exist; it is only an abstract concept from mathematics. And while it may be possible to reproduce any real number in physics as some quantity, you are reproducing them as you go, making the "real" (physical) real numbers countable.


It doesn't take 'infinite complexity' [your term] to encode the natural numbers. Rather, it takes only a handful of axioms. See: http://en.wikipedia.org/wiki/Primitive_recursive_arithmetic

Pretty much any system of logic worth looking at (including any which the sciences may be based on and the one running inside each human brain) is going to be at least as complex as this set of axioms.

Also, you really should read the J.R. Lucas material-- it explains this. And I'd suggest Nagel & Newman's _Gödel's Proof_ for a great introductory explanation of the Incompleteness Theorems.


We don't know that the physical world has infinite precision. In particular, time doesn't even seem to be continuous as far as we can tell (cf. the Planck time). And we think there's a finite amount of mass in the universe, so how could we encode arbitrarily large natural numbers (as is required to model Peano arithmetic)?


We don't have to encode arbitrarily large natural numbers. Rather, we have to encode the rules that allow us to construct them (which are quite simple, actually). And I think 'digital physics' is more compatible with the implications of Incompleteness than the alternatives, not less.
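
As a rough illustration of how little machinery is needed (a Haskell sketch, nothing physics-specific): two constructors and one recursive equation already generate every natural number and define addition over them.

    -- The construction rules for the naturals, stated directly.
    data Nat = Zero | Succ Nat

    -- Addition by recursion on the first argument.
    add :: Nat -> Nat -> Nat
    add Zero     n = n
    add (Succ m) n = Succ (add m n)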


How can you construct something which is larger than the amount of mass in the universe?


Downvoters: if you disagree, feel free to post an attempt at a refutation or otherwise explain your vote.


Yeah, but he was talking about the real numbers.


But the reals are irrelevant as far as Incompleteness is concerned. He's obviously confused.


Downvoters: if you disagree, feel free to post an attempt at a refutation or otherwise explain your vote.


> I could not agree more with your first paragraph. The only other language I've used that I've had that experience with was Haskell, and while there are good arguments to be made for using Haskell in production, it should be obvious that's not a language that will ever become mainstream.

I don't think it is at all obvious that Haskell won't become mainstream. It's already exerted a tremendous influence over many other mainstream languages and there's only so long that can happen before people just start going directly to the source of the innovations (or one of its direct descendants).


I agree Haskell has had an important influence, but I don't see why people would necessarily "go directly" to it because of that. In fact, I would argue just the reverse. People choose the derivative languages because they provide things the original does not.

To wit, Lisp never became mainstream despite exerting a huge influence. Likewise Smalltalk.


> I agree Haskell has had an important influence, but I don't see why people would necessarily "go directly" to it because of that.

I also said "or one of its direct descendants" (like Agda or Idris in all likeliness).

> To wit, Lisp never became mainstream

Clojure doesn't count? And the good parts of Perl, Ruby & JavaScript are essentially Lisp without the homoiconicity.


> Clojure doesn't count?

Perfect example. Clojure != Lisp. But it is a pretty obvious descendant.


"Lisp" is a family of languages and Clojure is one member of this family.


Technically true, but I don't think it makes sense to consider Clojure to be the same thing as Lisp in the context of ternaryoperator's statement (though I obviously can't speak for him). Notably, Clojure's tight integration with Java is the primary reason for its relative popularity, and what most sets it apart from the rest of the Lisp family. It is disingenuous to claim Clojure means Lisp has gone mainstream when the popularity of Clojure is not due to its inclusion in the Lisp family.

Beyond that, I'm not quite sure Clojure counts as "mainstream" yet. According to the TIOBE Index, it doesn't even rank in the top 50 languages. Heck, the top 20 includes R and Dart, neither of which I would call "mainstream" (I'm actually really surprised at how high R is ranking). I don't know how significant that is, though the TIOBE Index is measuring "number of skilled engineers world-wide, courses, and third party vendors" and that seems like a reasonable approximation for "mainstream" to me.


There are several languages targeting the JVM these days. And, what obviously sets Clojure apart from other JVM languages is its Lispiness (i.e., the JVM is constant across JVM languages).

Clojure is not married to the JVM either-- in fact, it has been hinted that it would jump ship if something better comes along or the current situation becomes less viable. Furthermore we already have a dialect of Clojure called ClojureScript which targets JavaScript/node/V8.

And I look at the JVM as really just a library/API/runtime. C++ has the STL and stdio and such, and they are not part of the language proper but rather merely libraries for interacting with the underlying operating system (in a platform-independent way). The same is true for the JVM with respect to Clojure and Scala et al.


> Likewise Smalltalk

Java happened. Businesses were already in the process of adopting Smalltalk.

Hotspot is a Smalltalk JIT compiler reborn.

Eclipse is Visual Age for Smalltalk reborn. It still keeps the old Smalltalk code browser.


>Hotspot is a Smalltalk JIT compiler reborn.

Yeah, but nothing in it is Smalltalk-specific. It's not like Smalltalk survives in the mainstream because of Hotspot (in the way that, say, Algol survives).

[edit: survives, not "survices"]


My point was that Smalltalk did not become mainstream because a few heavyweight vendors decided to switch fields to support the new kid on the block.


Well, a counterpoint then could be: why did those vendors not insist on Smalltalk? Why wasn't Smalltalk pushed more heavily by some big vendor itself?

It's not like SUN was the only player in town. IBM pushed Smalltalk IIRC.

I think this (from Stack Overflow) tells a more comprehensive story:

• When Smalltalk was introduced, it was too far ahead of its time in terms of what kind of hardware it really needed

• In 1995, when Java was released to great fanfare, one of the primary Smalltalk vendors (ParcPlace) was busy merging with another (Digitalk), and that merger ended up being more of a knife fight

• By 2000, when Cincom acquired VisualWorks (ObjectStudio was already a Cincom product), Smalltalk had faded from the "hip language" scene

http://stackoverflow.com/questions/711140/why-isnt-smalltalk...


I used VisualWorks at university in 1995, just before Java appeared, when there were presentations with broken Java code [1].

Eclipse 1.0 was Visual Age for Smalltalk redone in Java.

If Java hadn't appeared on the scene, then maybe, even with those cat fights, the language would have become mainstream anyway.

This is just speculation on my part.

[1] The famous "private protected" that was accepted in the very first release.


Technically, Eclipse was Visual Age for Java [only implemented in Smalltalk] redone in Java :-)


So his point stands. People preferred Java to Smalltalk, and thus Smalltalk didn't become mainstream.


s/People/Vendors/g

Which is quite different.


Even worse than the problem of uncommon concepts such as monads is that Haskell's memory footprint is extremely hard to reason about. A few years ago it was impossible with the http libraries to download a file without the program consuming several times as much memory as the downloaded file.


> Even worse than the problem of uncommon concepts such as monads

Just go ahead and learn the typeclass hierarchy and such-- it really is quite a useful higher level of abstraction in whatever language you choose. And it definitely will enter the mainstream (even more than it already has [Swift, Scala & C# all have monadic constructs]).
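
For anyone wondering what "the typeclass hierarchy" refers to, the core of it is tiny. A simplified restatement (the real Prelude classes carry a few more methods, plus laws):

    {-# LANGUAGE NoImplicitPrelude #-}

    -- A Functor lets you map a function over a structure...
    class Functor f where
      fmap :: (a -> b) -> f a -> f b

    -- ...an Applicative adds embedding values and applying wrapped functions...
    class Functor f => Applicative f where
      pure  :: a -> f a
      (<*>) :: f (a -> b) -> f a -> f b

    -- ...and a Monad adds sequencing of dependent computations.
    class Applicative m => Monad m where
      (>>=) :: m a -> (a -> m b) -> m b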

> Haskell's memory footprint is extremely hard to reason about.

And you'd probably want to throw runtime in there as well.

I think this is relative-- it's not "extremely hard" for everyone. Also, many structured programmers found object orientation "extremely hard" but somehow the industry managed to progress through that era.


A common recipe people quote for good software is "a) first make it work, b) then make it fast".

Haskell is very good at a), and not bad at all at b). With the help of the profiler it shouldn't be that hard to determine a program's bottlenecks/leaks and fix them, as with any other language.

BTW, since you mention http, I have this reading on my back burner [1], but from skimming it I found that for certain payloads a Haskell http server may perform better than nginx (1.4, circa 2013?), which is an impressive feat.

1: http://aosabook.org/en/posa/warp.html
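
Coming back to the grandparent's download example: with the streaming API in today's libraries, a constant-memory download is only a few lines. A minimal sketch, assuming the http-client and http-client-tls packages (the URL and output path are placeholders):

    import           Network.HTTP.Client     (brRead, newManager, parseRequest,
                                              responseBody, withResponse)
    import           Network.HTTP.Client.TLS (tlsManagerSettings)
    import qualified Data.ByteString         as BS
    import           System.IO               (IOMode (WriteMode), withFile)

    -- Stream the response body to disk chunk by chunk; only one chunk is
    -- ever held in memory at a time.
    main :: IO ()
    main = do
      manager <- newManager tlsManagerSettings
      request <- parseRequest "https://example.com/big-file"  -- placeholder URL
      withResponse request manager $ \response ->
        withFile "big-file" WriteMode $ \h ->
          let loop = do
                chunk <- brRead (responseBody response)
                if BS.null chunk
                  then return ()               -- empty chunk signals end of body
                  else BS.hPut h chunk >> loop
          in loop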


I have definitely heard from multiple people that for them Rust was a good stepping stone to Haskell. Hopefully it will be the same way with Swift.


That may be true, but I find it curious. I don't know too much about Rust, but it looks like a lower-level language than Haskell.

I'd have thought that someone would pick the higher-level language first, and go to the lower-level one when in need of more control.


Rust is not your typical lower-level language, though. It supports a lot of the features that functional programmers expect. It is an eagerly evaluated language that lets you drop to 'unsafe' code where necessary, but in its natural form it is surprisingly high-level.


I think the same is true for F# and Scala.

