This is my backup plan for the day when I can no longer get a good used car that's old enough. Until then, though, I'm not willing to reward auto manufacturers who do this by giving them my money.
I have read his diaries, though carelessly writing Mark Antony over Marcus Aurelius does undermine the point. -- Thanks, edited. I guess one shouldn't write HN comments while listening to corporate policy announcements.
I was hit by this. It looks like my laptop had not been syncing correctly for a couple months, then GDrive overwrote my local “My Drive” folder with the old cloud copy.
I was going to say, from what I've seen it looks like this kind of problem: a problem with the desktop sync, rather than Google's cloud storage itself.
A sync problem feels both more likely and worse: if the files never touched Google's servers and the sync client went and deleted them locally, there'd be nothing Google could do.
No, indeed it is not. If the problem is that people are getting phished because they type their info into a spoofed login page, how would making one standard login page be the answer?
I released the first public version of Phactory yesterday, and I'd like to get some feedback from developers here on HN.
Phactory was born from a desire to have something like Factory Girl for my day job writing PHP. A couple of my co-workers and I have been using it for a few weeks now, and we're finding it really useful.
If you have time to give it a try, any feedback/criticism would be much appreciated.
What features would you want to see in a library like this?
If you're offering it for public scrutiny, then the good news is you've just gotten a good bit of it from an experienced security professional. Perhaps instead of calling tptacek arrogant, you could take the criticism that you say you're inviting?
The best practice in password hashing schemes is well understood by the security community, so you have to understand that it can be frustrating to watch the same mistakes made over and over again.
As tptacek pointed out, the correct answer is "use bcrypt". He's not telling you not to offer your library for public use, he's just pointing out that there is absolutely no reason to roll your own password hashing scheme.
"Also, could I not continue using SHA2 256, and add an option to specify the number of hashing repetitions, which would increase the time to compute, as well as frustrate dictionary attacks?"
Are you sure there is no inherent property of SHA256 that would allow an attacker to shortcut the computation of successive hashes? I'm not saying there is, but simply that cryptographers have already solved this problem and considered all the angles. Why are you trying to start from scratch?
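The "number of hashing repetitions" idea the parent quotes is usually called key stretching. A minimal sketch of what that loop looks like in plain Ruby (illustrative only -- in practice use a vetted construction like bcrypt, scrypt, or PBKDF2 rather than an ad hoc loop):

```ruby
# Key stretching sketch: repeat the digest many times so every guess
# costs the attacker proportionally more. Function name and iteration
# count are illustrative, not a recommendation.
require "digest"

def stretched_sha256(password, salt, iterations)
  digest = password + salt
  iterations.times { digest = Digest::SHA256.hexdigest(digest) }
  digest
end

hash = stretched_sha256("s3cret", "a1b2c3", 10_000)
# Doubling the iteration count roughly doubles the cost of each guess.
```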
I really have no idea why you are saying "there is absolutely no reason to roll your own password hashing scheme". It's a plugin that automates salting and hashing, and then adds an accessor to compare against that. Are we actually calling something that reduces a developer's work and many lines of code down to one a "scheme" now?
I also wondered whether there was a characteristic of SHA256 that would make repeating it pointless. I did see it mentioned someplace, however; that's why I asked.
Oh, and he is pointing out that I shouldn't be doing this, and this somehow comes off as arrogant, especially considering a few of the other plugins default to SHA1:
"That you don't know any of this --- and I say this respectfully --- tells me that maybe you should be using someone else's password hashing library instead of reinventing your own"
"Password hashing scheme" refers exclusively to the algorithm used to transform a plaintext password into a ciphertext value (whether it's a hash or derived-key).
The rest of your library is the part you want to spend your effort on. Make it easy to use, make it flexible, put in some great features like a user admin panel. That stuff is the domain of the webapp library builder. Just trust the crypto part to the cryptographers, and use bcrypt/scrypt.
Where have I claimed to have done any cryptography at all?
auto_hash just wraps a call to Ruby's Digest library in a convenient Rails plugin; the entire "crypto" part is these two lines:
    salt = ActiveSupport::SecureRandom.hex(10)
    Digest::SHA2.new.update(value + salt).to_s
I just went with what my research showed to be the most commonly recommended and practiced hashing algorithm. This is a step up from Clearance and Devise, which use SHA1 by default.
Which is why I was absolutely baffled by comments such as "auto_hash is an inferior password hash" and "tells me that maybe you should be using someone else's password hashing library instead of reinventing your own".
It looks suspiciously like most of the criticism came from those who didn't give the plugin more than a glance before criticizing it. Maybe the name auto_hash confused some people into thinking it was a hashing algorithm rather than just a silly little Rails plugin.
I'm sorry you've taken offense. I'm not writing well today.
Again, the core of your misunderstanding here is your belief that SHA256 is a security function. It isn't.
Also, you believe you're simply using SHA256. You're not. You're using SHA256(nonce, password), which is a construction, not an algorithm.
There's nothing wrong with constructions; every security protocol uses them. But you need to recognize the merits and problems with the construction you've ended up using. Your construction is terribly vulnerable to incremental brute force cracking. There are much better constructions that don't have this problem; scrypt and bcrypt are among them. There's also PBKDF2 and "stretched" SHA256.
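Of the better constructions named above, PBKDF2 is the one that ships with Ruby's standard library, via OpenSSL. A minimal sketch (the iteration count and key length here are illustrative, not recommendations):

```ruby
# PBKDF2 from Ruby's stdlib: an iterated, salted construction that
# resists incremental brute force far better than a single SHA256 pass.
require "openssl"
require "securerandom"

salt       = SecureRandom.random_bytes(16)
iterations = 20_000   # tune upward as hardware gets faster
key_length = 32       # bytes of derived key

derived = OpenSSL::PKCS5.pbkdf2_hmac_sha1("s3cret", salt, iterations, key_length)
stored  = derived.unpack1("H*")  # hex-encode for storage alongside the salt
```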
But, and this will annoy you to hear: security-critical code isn't something you should "learn on the job". Take someone else's secure system (in Ruby, use ruby-bcrypt, which is excellent) and build on that instead.
I'm reading this thread from top to bottom and it looks like both sides are talking past each other.
His point: His plugin is a wrapper that provides a basic framework for handling authentication. He's not implementing his own cryptography. He happens to use an SHA256 construction as part of it because that's what he's seen as one of the standard ways of handling passwords.
Your point: Don't use salted SHA256 for passwords. Use bcrypt.
I think this whole conversation could have been snipped off if you had started with "change auto_hash to use bcrypt instead of SHA256 since it's more secure".
It'd be great if he fixed auto_hash to use bcrypt instead of SHA256; this is, after all, the entirety of my original comment about his code.
Just be aware that once he replaces auto_hash with bcrypt, auto_hash has literally no functionality anymore; bcrypt-ruby already does all of what auto_hash does, better.
I'm on it right now; it would have been done last night, but I was having trouble with Rails 3.0 and gem paths.
But it's not true that it won't have any functionality; it does what it claims to do, which isn't much, but it's something.
Putting

    auto_hash :password, :field2, :field3

in a model will automate the process of "cryptofying" (using a fake word to avoid any more terminology disputes) the database fields :password, :field2, :field3 upon save or update.

Then it will give you dynamic accessors like user.password_match? and user.field2_match?

This saves lines of ugly code I don't want to look at, and also frees up the model's before_save hook.
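For the curious, the mechanics behind that kind of plugin can be sketched in plain Ruby. This is a hypothetical reconstruction, not the real auto_hash internals: the module name, the assignment-time hashing (standing in for a before_save hook), and the instance-variable storage (standing in for database columns) are all assumptions for illustration.

```ruby
# Hypothetical sketch of an auto_hash-style plugin: a class macro stores
# the field list, a writer salts-and-hashes on assignment, and a
# field_match? accessor is generated for each field.
require "digest"
require "securerandom"

module AutoHashLike
  def self.included(base)
    base.extend(ClassMethods)
  end

  module ClassMethods
    def auto_hash(*fields)
      fields.each do |field|
        # Writer that salts and hashes the value (in Rails this would
        # happen in a before_save hook and write to a DB column).
        define_method("#{field}=") do |value|
          salt = SecureRandom.hex(10)
          instance_variable_set("@#{field}_salt", salt)
          instance_variable_set("@#{field}",
                                Digest::SHA2.new.update(value + salt).to_s)
        end

        # Dynamic accessor: user.password_match?("candidate")
        define_method("#{field}_match?") do |candidate|
          salt   = instance_variable_get("@#{field}_salt")
          stored = instance_variable_get("@#{field}")
          stored == Digest::SHA2.new.update(candidate + salt).to_s
        end
      end
    end
  end
end

class User
  include AutoHashLike
  auto_hash :password
end

u = User.new
u.password = "s3cret"
u.password_match?("s3cret")  # => true
u.password_match?("wrong")   # => false
```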
Amendment: I think this will make auto_hash the only auth-related plugin that defaults to, and only offers, bcrypt.
    class User < ActiveRecord::Base
      include BCrypt

      def password
        @password ||= Password.new(password_hash)
      end

      def password=(new_password)
        @password = Password.create(new_password)
        self.password_hash = @password
      end
    end
Here is auto_hash:
    class User < ActiveRecord::Base
      auto_hash :password
    end
Believe me, I understand what these one-way hashing algorithms are, but you have to cut me a little slack for not knowing the exact proper terminology, since this is not my field. I know it's not some magical "security function"; actually, I never used that term. I did say "security hash", which I took to mean a hashing function used in security, like website authentication.
But it's baffling that you are saying this is not something I should "learn on the job", since I'm doing exactly what the other alternative libraries are doing, except a few offer more configuration. Authlogic, Clearance and Devise are all doing what I'm doing, some a little better, by offering bcrypt, and some much worse, by defaulting to SHA1. I hope that, although it looks like you are singling out my humble plugin for criticism, you are actually criticizing most existing plugins. If you want to do that, it's your call, but I'm just offering an alternative.
In fact I was going to use bcrypt, but I discovered that it requires some installation to use, and I didn't want to make a simple plugin more complicated. Why did I think this was OK? Because, as I mentioned before, I did a little digging and found almost nothing warning against using SHA2 for hashing passwords, so I assumed it was still considered good enough.
It really sounds like you are criticizing auto_hash as some poorly attempted one-way hashing function, but it's only a plugin with a single line that does anything crypto:

    Digest::SHA2.new.update(value + salt)
If I had found a single reference to SHA2 not being secure enough for website passwords, I would have simply replaced it with this line:

    BCrypt::Password.create(value + salt)
I would hardly call that the failing of somebody who shouldn't be "learning on the job".
AMENDMENT: Not a simple one-line drop-in, but I'm getting there :)
Thanks to everybody voting me down for no reason I can see.
Ok, I didn't vote you down, but let me help you out:
The fact that the mistake you made is only one line of code has nothing to do with anything. A single-line coding error (blindly offsetting malloc's return) broke all of Flash. Every line counts.
If that sucks and feels unfair, I agree with you. You can mitigate this problem by not writing security code.
Actually, slow in this case does not at all mean "this operation isn't implemented in hardware or optimized assembly".
The slowness of an algorithm like bcrypt is a tunable property of the key expansion algorithm. A step in this process will be repeated 2^n times, where n can be configured by the user.
If a password function was only slow because it wasn't implemented in assembly on your server, an attacker would obviously just go implement it in assembly for his brute-force crack.
Actually, the major contribution of Colin's scrypt work is finding primitives that don't lend themselves as well to hardware acceleration; his hypothesis is that one of bcrypt's weaknesses is that even though it's very slow in software, it might be possible to massively accelerate it with FPGAs.
If you can find a copy of 59 Seconds, by Richard Wiseman, it's mentioned in there (with references, if I remember correctly). I don't have my copy handy.
In what way is pushing 20K requests/second "concurrency fail"?
Node.js doesn't make it easy to share state between processes, but once you're scaling to multiple processes, the jump to multiple machines probably isn't far behind. You'll need to design a distributed algorithm, or just centralize your shared state in something like Redis anyway.
That's a good argument for a plain web app fronting a DB server. But what if you want to implement a server that keeps a lot of data in memory? What if you're not the user of Cassandra, Redis, Lucene or a specialized analytics engine but its creator?
For request/response type of scenarios (unlike batch processing) you basically have two options: (a) Use a multi-process shared memory design (or even just the file cache), which is very cumbersome to implement because you cannot use pointers and hence none of the data structures in your favorite language's library. (b) Use shared in-process state with powerful concurrency primitives. That's what clojure helps you do.
You cannot avoid choosing between these two architectures by distributing across machines. Distribution is built on top of whatever you do on a single machine.
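Option (b) can be sketched in a few lines. Clojure's primitives (atoms, refs, STM) are richer than this, so treat it as a shape, not an equivalence; the Ruby below just shows shared in-process state guarded by a concurrency primitive.

```ruby
# Option (b) in miniature: one process, shared mutable state, a lock.
# Eight threads increment a shared counter; the Mutex makes the
# read-modify-write atomic, so no increments are lost.
counter = 0
lock    = Mutex.new

threads = 8.times.map do
  Thread.new do
    1_000.times { lock.synchronize { counter += 1 } }
  end
end
threads.each(&:join)

counter  # => 8000
```

Without the lock (or Clojure's equivalent primitives), the same program could lose updates; with a multi-process design (option a), this counter would instead have to live in shared memory or an external store like Redis.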
If you consider the multi-server case, this is a consensus problem. To solve it, you need to either use an algorithm like Paxos, or use some central synchronization point like a Redis or memcached server. Yes, node.js fails compared to Clojure when it comes to taking advantage of multiple cores on this example. Clojure's concurrency primitives are really slick. But that's not the whole story.