Not sure it reflects on the entire community. Proposition: the majority of users are well intentioned and participate by either upvoting or not voting, while a minority aggressively downvote for whatever reason. In that case, the aggressive downvoters are overrepresented simply because they are more active and their behaviour is more visible.
Additionally, due to a generally positive bias (supposing we all generally like YC and HN and share a positive attitude towards startup culture), the downvotes are more noticeable / memorable than the things that are frequently upvoted.
Same reason all you can focus on is that chip in your windscreen when realistically it's only 2% of the entire window. =)
I have seen comments on HN similar to what you have experienced, comments from frequent posters as well.
The unpleasant thing is that people expect to be hand-fed their daily dose of intellectual stimulant, and get upset if the dosage is off. I've seen comments like, "Your submission/post/blog is lame. It's very annoying; it wastes my time." WTF?! The guy spent lots of time writing a blog post and asked for feedback. If it's not up to your level, just move on; there's no point in bashing the guy for wasting your time just because you don't find it stimulating.
What is more disheartening is the upvotes for those kinds of comments. It's really a turn-off.
I actually see quite a bit of the same mentality here. Anything that deviates from the mainstream of the latest fad gets downvoted. There's little appreciation of divergent opinions or opposing views.
I look back at my comment history and see that, generally speaking, if I say something relatively thoughtful it almost never gets downvoted into oblivion. Were I insulting, rude or uncivil, a different standard would apply.
The same simply can't be said for proggit. Proggit has gone the way most mature forums go: elitist, dismissive, intolerant and reactionary. You saw the same thing on Usenet (eg comp.lang.c) in years gone past (if you're old enough to remember that).
There is a certain personality type that seems to float to the top of such dank, stagnant pools of water. I call such people toxic. You see it on forums, in open source projects and in the workplace. Such people seem to be attracted to the ability to exercise power without actually contributing anything (although they're convinced they are contributing). Once a certain number of them are entrenched, it's very difficult for an organization to turn itself around rather than fade into irrelevance; too much effort has to be spent simply keeping such people away from the controls.
Much of Zed's famous anti-Rails rant revolved around such people (a classic example being someone writing security code assuming there were 30 days in every month).
HN is not that way at all, and any stay on proggit will show you in very short order how far HN is from that.
I'll offer two of my encounters as a limited data point to illustrate that HN is not far from sinking to the level of proggit.
In one post about the hashbang (#!) being used by Facebook and Twitter, the discussion was about crawling Ajax pages. I asked a question about good practices for making an Ajax site crawler friendly. It got downvotes! I was truly puzzled. The question was purely technical, non-controversial, on topic, and extended the discussion. Yet there were people (long-timers with downvote power) trying to discourage it. They were acting exactly like the toxic people you described.
In another post about Joel's claim that SO is more scalable than Digg, people were giving this and that explanation while ignoring the obvious elephant in the room: .NET is faster than PHP. I made that statement and got downvoted to oblivion. Of course people here hate Microsoft, are into dynamically typed languages, and prefer open source, but a technical fact is still a fact. This just shows how narrow-minded people here are who can't tolerate divergent approaches to problems.
Oh well, if they want it to be a toxic playground, that's what they'll get.
I don't think that you're accurately representing the state of affairs.
At the time I'm reading it, your second comment (http://news.ycombinator.com/item?id=1799442) is neutral at 1 point. That is not a symptom of people acting "toxic" and trying to "discourage" you. It was a good question, but nobody seemed to know much about it, so it just sort of sat there.
Your first comment (http://news.ycombinator.com/item?id=1787833) is at -3 points. Unfortunately, as it's five days old, I can no longer put it at -4. You stepped into an admittedly poor discussion about specific reasons why SO might need fewer servers than Digg, and posted a flamebaity generalization with absolutely no support; you didn't suggest a reason that .NET might be faster than LAMP, nor any evidence that it might be faster, or what it might be faster at. You just dropped a one-line load of an opinion and left. I absolutely am on board with being maximally toxic and discouraging toward the comment you made there.
The second comment (http://news.ycombinator.com/item?id=1799442) was downvoted. I saw it right after I posted it. It was brought back up to 1 by others later. Try as I might, I just couldn't understand the rationale for the downvotes. I could only attribute it to the Reddit-like behavior where every submission gets about 33% downvotes, which is why I'm saying HN is heading the Reddit way.
For the first comment, it's pretty well known that C#/VB.NET is much faster than PHP/Python/Ruby in raw performance. Benchmark after benchmark has shown that. Often it's just a matter of bringing it up as a reference in a discussion; I don't want to prepare a benchmark for every statement I make.
I've seen plenty of opinionated one-liners get plenty of upvotes (admittedly ones stating the popular views), so being a one-liner is not a good reason for a downvote. But I'm not in a popularity contest anyway. If HN can't tolerate divergent views and opinions, that's its loss.
Does anyone have suggestions on good practices for making an Ajax page crawler friendly? Since the anchor URLs (#) are generated on the fly by Javascript, how does the crawler know which anchor URLs to follow given the parent page?
Are invisible URLs (for the anchor links) still frowned upon by Google?
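For what it's worth, Google's AJAX crawling scheme (proposed around this time, and since deprecated) addressed exactly this: the site uses "#!" fragments, and the crawler re-requests the page with the fragment moved into an `_escaped_fragment_` query parameter, expecting the server to return a static HTML snapshot of that state. A minimal sketch of the server-side mapping, with hypothetical URLs:

```python
from urllib.parse import urlparse, parse_qs

def escaped_fragment_to_state(url):
    """Recover the hashbang state a crawler request refers to.
    Under the scheme, example.com/app#!/profile/42 is fetched by
    the crawler as example.com/app?_escaped_fragment_=/profile/42;
    the server should respond with a prerendered HTML snapshot."""
    query = parse_qs(urlparse(url).query)
    values = query.get("_escaped_fragment_")
    return values[0] if values else None

# hypothetical crawler request for the state #!/profile/42
state = escaped_fragment_to_state(
    "http://example.com/app?_escaped_fragment_=/profile/42")
print(state)  # -> /profile/42
```

The snapshot itself can be anything that renders the same content as the Javascript state, which also answers the "invisible URLs" worry: the mapping is explicit rather than cloaked.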
I've never had a chance to use dynamic programming in real life beyond schoolwork. For "challenging" algorithm work in real life, I've done topological sort for evaluating dependency graphs, Bloom filters for skipping lookups, an NFA for executing a regex-like rule graph, Rete for faster rule evaluation, extendible hashing for fast index storage, and complex event flow graphs with SQL-like nodes. Those are the ones I can remember now.
BTW, I did get bitten by an O(n lg n) vs O(n^2) issue, despite computers being fast nowadays. Once I used bubble sort in a cache system. It was quick and simple. Things worked fine for small cache sizes but slowed down substantially when size > 10,000: n^2 => 100,000,000 ops. Switching to heap sort did solve the problem. So yes, understanding algorithmic complexity does help in real-life work.
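The gap is easy to reproduce. A quick sketch of the same swap (bubble sort vs a heapq-based heap sort; the timing harness and input sizes are illustrative, not the original cache code):

```python
import heapq
import random
import time

def bubble_sort(a):
    """O(n^2): roughly 100M comparisons at n = 10,000."""
    a = list(a)
    for i in range(len(a)):
        for j in range(len(a) - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
    return a

def heap_sort(a):
    """O(n lg n): build a heap, then pop the minimum n times."""
    h = list(a)
    heapq.heapify(h)
    return [heapq.heappop(h) for _ in range(len(h))]

data = [random.random() for _ in range(2000)]
for sort in (bubble_sort, heap_sort):
    start = time.perf_counter()
    assert sort(data) == sorted(data)
    print(sort.__name__, round(time.perf_counter() - start, 3), "s")
```

Even at n = 2,000 the quadratic version is visibly slower; at 10,000+ the difference becomes the kind of wall described above.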
However, 99% of development is run-of-the-mill coding.
I saw a post on Reddit doing frontpage snapshots yesterday (http://redditsnapshot.sweyla.com/) and thought to myself, this is pretty cool; I should make one for HN. Today someone has already done it. Congrats.
I had the same thought (as you can tell!) and thought that after the whole *Instant fad, I should quickly whip something up since I had a few spare hours ;-)
I'm a little perturbed by the whole "whip something out in a few hours" craze of late but have adopted a "if you can't beat em, join em" approach for the moment.
It depends how you approach the problem. I'm guessing here, from my own experience implementing social graph features, but here goes:
There are a couple of ways you can go about this. The first is the database: join the network of people against the land of content and bring it back. This doesn't work (as an aside, this is what people mean when they say "web scale"; it has nothing to do with web traffic, it's social graphs): your database will cry. Although not at first; in development it works fine, and you feel fine, and for a while you're OK, but then you start growing...
Another way you can go about it is by denormalizing. In this world you store a pointer to each content item for each user. So anytime I do something, all the people [following|watching|connected|friended] to me get a record indicating I did it. This works, but now you have lots of data (lots and lots of data!) spread all over the place, and you need some kind of system to push that data out to everybody. It's those last two that drive up your hardware usage: it's not necessarily web boxes, but boxes in the background broadcasting the events out to the world, and the datastores to hold it all. Depending on how your web code works, you could also have a lot of overhead on the webservers putting all that stuff together.
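The denormalized approach is usually called fan-out on write. A toy in-memory sketch of the idea (the names and data shapes are invented for illustration; in production the feeds live in a datastore and the fan-out runs on those background boxes):

```python
from collections import defaultdict

followers = defaultdict(set)   # author -> set of follower ids
feeds = defaultdict(list)      # user -> denormalized list of item pointers

def follow(user, author):
    followers[author].add(user)

def publish(author, item_id):
    """Fan-out on write: copy a pointer to the new item into every
    follower's feed, so reading a feed is a cheap per-user lookup
    instead of a join across the whole social graph."""
    for user in followers[author]:
        feeds[user].append(item_id)

follow("alice", "ceo")
follow("bob", "ceo")
publish("ceo", "post:1")
print(feeds["alice"])  # -> ['post:1']
```

With 750k followers, that one `publish` call writes 750k records, which is exactly where the broadcast machinery and storage costs come from.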
My experience here comes from building the social features into toolbox.com. A good example is this page http://it.toolbox.com/people/george_krautzel/posts-connectio... That's all the posts from users connected to our CEO (all 750k of them). Getting that to return in near real time is super fun (and you can probably tell that I went down the DB join path before it all fell apart).
Maybe, but women's salaries are about the same as and often lower than men's, while older developers tend to get paid much more than junior developers.
With the caveat that they'd need to be older, but not old-fashioned. Mental inflexibility is too damaging, especially when coupled with an easy defense like "Oh, but you're younger and not as wise".