It is really, really hard for me to see this as anything other than utter paranoia. As one of the messages in the thread stated:
> Right. How exactly would you backdoor an RNG so (a) it could be effectively used by the NSA when they needed it (e.g. to recover Tor keys), (b) not affect the security of massive amounts of infrastructure, and (c) be so totally undetectable that there'd be no risk of it causing a shitstorm that makes the $0.5B FDIV bug seem like small change (not to mention the legal issues, since this one would have been inserted deliberately, so we're probably talking bet-the-company amounts of liability there).
And how long ago would the idea that the NSA get call logs for every call in the USA have been utter paranoia? Or that they tap and record all international internet traffic?
Just because you are paranoid doesn't mean that they aren't out to get you!
If your random number generator isn't actually random, then all of your crypto is basically useless. Paranoia is the correct state of mind for these systems.
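To make that concrete, here's a toy sketch (my own illustration, not from the thread) of why a predictable RNG is fatal: if key material comes from a generator whose seed an attacker can guess, the attacker recovers the key by brute-forcing the seed space, not the key space. The seed value and window here are made up for the example.

```python
import random

def make_key(seed):
    # Derive a 16-byte "key" from a seeded PRNG. Python's Mersenne
    # Twister stands in for any predictable generator; a real system
    # doing this (e.g. seeding from a timestamp) is equally broken.
    rng = random.Random(seed)
    return bytes(rng.getrandbits(8) for _ in range(16))

victim_seed = 1378300000           # hypothetical: a Unix timestamp
victim_key = make_key(victim_seed)

# The attacker tries every plausible seed in a small window around
# when the key was generated -- ~2000 guesses instead of 2^128.
recovered = None
for guess in range(1378299000, 1378301000):
    if make_key(guess) == victim_key:
        recovered = guess
        break

assert recovered == victim_seed
```

The point is that the strength of the cipher never comes into play; the search collapses to the size of the seed space.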
> And how long ago would the idea that the NSA get call logs for every call in the USA have been utter paranoia? Or that they tap and record all international internet traffic?
Before 1988, if you were paying attention. So the idea that the NSA was watching everything you did is almost 30 years old now.
Well, it is documented that the NSA made DES weaker by reducing the key size (which makes brute forcing easier). I'd also note that Schneier's AES submission, Twofish, was passed over (I speculate that Rijndael is easier to brute force).
The feds used to fight civilian crypto tooth and nail. Then they allowed it; one of the crypto books relates a story that the feds were bummed about RSA and friends. The listener asked why, since surely civilian efforts were feeble compared to the government's. The response: the pace of development was much faster than expected.
The change the NSA made was to replace the s-boxes used with ones that made differential cryptanalysis slightly less efficient than brute force. As it happens, the s-boxes provided by the NSA were also among the worst 9%-16% possible with respect to linear cryptanalysis. "A software implementation of this attack recovered a DES key in 50 days using 12 HP9000/735 workstations" [1]. I do not know the specs of said workstations, but for reference the book claims that was the fastest attack at the time of writing (1996).
This is not to say that the NSA was aware of linear cryptanalysis when they made their recommendation. Indeed, the fact that their s-boxes also happened to be just good enough to beat differential cryptanalysis, and the fact that an independent government investigation (the details of which are classified) cleared them of wrongdoing, are enough to convince me that they did not intend to introduce a hole. Furthermore, the NSA has since published the requirements they used to generate their s-boxes. Schneier suggests in his book that the s-boxes were weakened unintentionally by the act of introducing structure to them, without knowing to defend against linear analysis.
Correct. The NSA suggested changes in the DES S-boxes, which led to many questions. Ultimately, what was discovered is that their changes strengthened DES, not weakened it, as some had feared.
Well, I guess Rijndael is "easier" to brute force in that it's faster than Twofish, so each trial encryption costs less. But "easier" to brute force doesn't mean a whole lot; AES-192 is easier to brute force than AES-256, but both are so far outside the realm of current-day computation that it doesn't really matter.
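Some back-of-envelope arithmetic (my own numbers, not from the thread) shows why the cipher's per-trial speed is irrelevant next to the key size. Even granting an absurdly generous 10^18 trials per second, the gap between a 56-bit key and a 128-bit key is the whole story:

```python
# Expected brute-force time: on average you search half the keyspace.
RATE = 10**18                  # assumed trials/second, deliberately generous
SECONDS_PER_YEAR = 31_557_600

def years_to_search(bits):
    return (2**bits / 2) / RATE / SECONDS_PER_YEAR

# 56-bit (DES): a fraction of a second at this fantasy rate.
# 128-bit (AES-128): on the order of 10^12 years.
# 256-bit (AES-256): astronomically worse still.
for bits in (56, 128, 256):
    print(bits, years_to_search(bits))
```

Doubling per-trial speed shaves one bit off the effective key size; the jump from 56 to 128 bits adds 72 of them.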
Do these put a different slant on the whole "current-day computation" angle? Not necessarily these machines, but isn't it feasible that custom hardware could be manufactured using current tech that upsets the notion of AES brute-force infeasibility?
Read up on the Clipper chip: a chip which was being promoted as the "official" way to do crypto in the US, specifically designed to be decryptable by the NSA via "key escrow".
"Then-Senators John Ashcroft and John Kerry were opponents of the Clipper chip proposal, arguing in favor of the individual's right to encrypt messages and export encryption software."
Many developers working on crypto would cross the border into Canada to meet up and write code, to get around the export restrictions (crypto software was classified as a munition; exporting it could get you the same punishment as exporting a missile).
Read "Crypto: how the code rebels beat the government, saving privacy in the digital age" by Steven Levy. He outlines the whole story of public crypto until about 2000. Good read, too.
Definitely paranoia. If you want to believe the NSA is spying through your Intel system, they could do it through vPro and not some RNG calculations. One might assume that the NSA can easily tap into the built-in VNC server[1] of the CPU.
[1] Computers with particular Intel® Core™ vPro™ processors enjoy the benefit of a VNC-compatible Server embedded directly onto the chip, enabling permanent remote access and control. A RealVNC collaboration with Intel's ground-breaking hardware has produced VNC Viewer Plus, able to connect even if the computer is powered off, or has no functioning operating system. http://www.vnc.com/products/viewerplus/
Because you don't reflash firmware every time you enable/disable these technologies, there must obviously be some code which checks configuration flags to activate these features.
The twist is that such code is executed on a dedicated, specialized processor in the chipset/CPU with its own firmware, and it does much more.
You know who else cooperates with the NSA? The Linux community. You know, that whole "SELinux" thing? Yeah, that's an NSA project.
Turns out cooperating with the NSA doesn't automatically mean spying on the public, it could instead be hardening crypto security. Which is the NSA's other job, it turns out.
Yes, and there's no better example than DES, where the NSA hardened the cipher against differential cryptanalysis and then reduced the key size from 128 bits to 56 bits so they could break it. Given the NSA's prior actions, it doesn't seem unbelievable that they would both harden and backdoor Linux.
If I hadn't disabled it... which of the dozens of times it's gotten in my way on a new image? Most recently last week, by the way. I disable it because it prevents correct code from running in an already-secure environment. I don't bother beforehand, because I inevitably forget. And then waste ten minutes before I realize I need to turn off the magic "break everything" switch.
In the last seven days, has the fundamental incompatibility between SELinux's design and traditional Unix permissions and tools been suddenly corrected? Has tooling been created to allow us mere mortal sysadmins and engineers to understand and manipulate the byzantine SELinux configuration?
The system Apache was unable to listen on a non-standard port.
> Not possible.
Tell me of a vulnerability on a fully-updated RHEL 6 image running only SSH and a basic Apache configuration serving static files which would be prevented by the stock SELinux configuration.
> You mean labels? No, that's pretty fundamental to SELinux.
Exactly. So my explicit decisions about file permissions must be duplicated. No thanks.
"It would be difficult to implement effectively, therefore it is likely to not exist."
Of course, the judgement also takes into account the extreme consequences for the company implementing it if discovered, and the unlikelihood that that company could be legally compelled to do so, which was the case with all recently revealed examples of companies cooperating with the NSA. (Never mind that we have not even seen hidden /software/ backdoors forced by the NSA - merely systems that were known to be interceptable being intercepted.)
The same argument also applies to trusting the CPU itself: although it would be more difficult to insert a generic backdoor and ensure it could be exploited easily without compromising performance than to backdoor a random number generator, this is a matter of degree, not a fundamental difference in the argument. You may not trust the CPU either, I suppose, but in that case not using rdrand won't save you.
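For what it's worth, the practical hedge against a suspect hardware RNG is not to refuse its output but to mix it with other entropy sources, so the result stays unpredictable as long as any one input is. (This is the general approach the Linux kernel takes with rdrand; the sketch below is my own simplified illustration, and the input values are made up.)

```python
import hashlib

def mix(*sources: bytes) -> bytes:
    # Hash all sources together, length-prefixing each so distinct
    # input splits can't collide. Assuming the hash behaves like a
    # random oracle, a backdoored source cannot bias the output on
    # its own -- it would need to predict the other inputs too.
    h = hashlib.sha256()
    for s in sources:
        h.update(len(s).to_bytes(8, "big"))
        h.update(s)
    return h.digest()

hw_rng = bytes(32)              # stand-in for rdrand output (worst case: all zeros)
os_pool = b"\x13\x37" * 16      # stand-in for OS-gathered entropy
seed = mix(hw_rng, os_pool)
assert len(seed) == 32
```

The design choice is that trust becomes an OR over sources rather than an AND: the attacker has to compromise every input, not just one.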