Hacker News
Keeping Secrets (medium.com/stanford-select)
240 points by ghosh on Nov 17, 2014 | hide | past | favorite | 27 comments


Pardon the longer quote, but I want to comment on this -

  Rather than trying to understand both sides of the 
  issue and make the “right” decision, Hellman said 
  that in the heat of the controversy, he listened to 
  his ego instead.

  It was not until Hellman watched Day After Trinity,
  a documentary about the development of the atomic 
  bomb, that he realized how dangerous his decision-
  making process had been. The moment in the film that 
  troubled him most, he recalled, was when the Manhattan 
  Project scientists tried to explain why they continued 
  to work on the bomb after Hitler had been defeated and 
  the threat of a German atom bomb had disappeared. The 
  scientists “had figured out what they wanted to do and
  had then come up with a rationalization for doing it,
  rather than figuring out the right thing to do and
  doing it whether or not it was what they wanted to do."
This attitude - screw the consequences, let's just scratch my curiosity itch - is extremely common in tech circles. Cryptonomicon did a good job presenting this issue in highly digested form: when the Avi character is setting up a data haven for all the good reasons, the only people who show up for the funding presentation are criminals and rogue government agents. I was messing with anonymous private p2p systems when the book came out, and it was frankly a shock to read, because somehow it was an obvious angle that I had never considered at all. I was just engineering stuff because it was really interesting, but I never considered the consequences of actual application. Realizing that there's an ethical component to every technical project was quite an eye-opener, and it has had a profound effect on how I've viewed projects ever since. Perhaps it's obvious to some or a non-issue to others, but then perhaps there are those here who can relate...


A short story that I read many years ago, possibly in an SF anthology, has stayed with me. I've forgotten who the author is, and don't necessarily agree with his point, but it is a good parable.

Briefly, a man comes to the house of a famous scientist, who happens to have a mentally disabled son. The unknown visitor comes to plead, politely, for the scientist to cease working on a super-weapon project. He is heard, but is rejected. The visitor accepts it well, and asks to use the bathroom before leaving.

After he has left, the scientist finds out that the man has given a gun to his son. He manages to get hold of the weapon without mishap, and wonders what kind of man would give a loaded gun to a disabled child ...



One of the many memorable parts of Cryptonomicon is where Randy asks Avi how long he wants to keep his secrets safe and gets the answer:

"I want them to remain secret for as long as men are capable of evil."


There are generally three possible thought processes to a new technology:

1. This technology will be great! I'll be a hero!

2. Technology is value-neutral. Those who use it have their own agency in how they use it.

3. This technology could have bad side-effects and I should face them.

"1" is very common. "3" is extremely rare. I find "2" intellectually defensible if a bit dodgy.

I wish more technologists wondered about the end result of their pursuits. Instead, they see themselves as the protagonists in their own story, and anyone else who gets caught up should just deal with it.


Another important consideration in #3 is: "this technology could have side-effects like enabling behaviors that society (or the government, presumably a proxy for society's will) currently considers bad and illegal, but that I think should be enabled regardless."

The origin of the definition of "bad" is important. Many things are illegal that should not be. It's important to build tools to help people break certain unjust laws.


People are doing it for the government as well; is that somehow more laudable?


Asimov's short story "The Dead Past" carries a similar theme.


This is a good read, though it curiously omits mentioning that the NSA's twiddling of DES's S-boxes turned out to have made DES stronger rather than weaker. That seems like a pretty important note in the story.

(http://en.wikipedia.org/wiki/Data_Encryption_Standard#NSA.27...)


> According to Inman, the uptake of the research community’s cryptographic ideas came at a much slower pace than he had expected. As a result, less foreign traffic ended up being encrypted than the agency had projected, and the consequences for national security were not as dramatic as he had feared. Essentially, Inman recalled, “there was no demand” for encryption systems outside of governments, even though many high-grade systems eventually became available.

Some things don't change. Despite the fact that the bedrock of basically all noteworthy asymmetric cryptography was laid in just a handful of years four decades ago, and despite the fact that we've had crypto protocols to solve a lot of really compelling problems for decades, the NSA, and government generally, still has little to worry about. The market has a way of selecting really lousy solutions when it comes to privacy. Consider:

- The abysmal state of implementation. Over-engineered, poorly designed, poorly implemented, and poorly deployed. Did I miss the memo for the billions of dollars of investment and meticulous engineering being poured into the cryptography Space Race? I guess a couple of OpenSSL forks is a start, right?

- Zero adoption of personal digital signatures. Zilch. Nada. You can't prove authorship of anything, and can be framed for almost anything. None of the logs being made of your activities are seen by you, let alone signed off by you as authentic.

- The complete lack of good, usable client authentication. We've known how to do secure password authentication, even in the presence of weak passwords, since before the web existed, yet we have nothing. Google Authenticator is the only meaningful contribution to authentication on the web since the 90s. (Pretty much all the third-party systems conflate the issues of identification and authorisation.)

- Complete centralisation of interpersonal messaging (email -> your webmail provider, SMS, WhatsApp, Facebook Messenger, etc.). It's all unencrypted, logged, and subverted for government or commercial interests.

- Blackboxification of consumer electronics. Yet despite the urgency of keeping DRM keys secret, which relies on essentially the same technology, we still don't have usable HSMs in consumer devices like phones.

- Extensive surveillance of all our financial activity. Our supermarkets can track our personal shopping habits down to the items we buy week on week, and our banks know where we like to buy our Sunday lunch. We've known how to achieve cash-equivalent privacy digitally for 20 years. All we have is Bitcoin which, while heading in the right direction on trust, serves some grand libertarian ideal and accomplishes little in terms of privacy or user friendliness. Go read about DigiCash; in another life it could have shipped with Windows 95.

- The complete lack of good trust models and, more importantly, the lack of any education or inclination amongst the general public, particularly among the young and technologically comfortable, to question whether we should really be trusting website X, company Y or app Q with our personal data and habits. Social networks have changed attitudes toward sharing our personal life in one generation. My dad considers Facebook statuses bizarre. My grandma still doesn't trust plastic or direct debits, and prefers cash. We're caught in a generation gap where we have no reason to trust many entities, but have so much incentive to risk it anyway.

... clearly the demand for cryptography is still low.
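On the Google Authenticator point above: it implements TOTP (RFC 6238), which is small enough to sketch in full. A minimal version, assuming the defaults of HMAC-SHA1 and 30-second time windows (a sketch for illustration, not a hardened implementation):

```python
import hashlib
import hmac
import struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226 HOTP: HMAC-SHA1 over the counter, dynamically truncated."""
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                       # low nibble picks a 4-byte window
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(secret: bytes, timestamp: int, step: int = 30, digits: int = 6) -> str:
    """RFC 6238 TOTP: HOTP keyed to the current time window."""
    return hotp(secret, timestamp // step, digits)

# RFC 6238 Appendix B test vector: ASCII secret, T=59, 8 digits, SHA-1
print(totp(b"12345678901234567890", 59, digits=8))  # -> 94287082
```

The whole scheme is a shared secret plus a clock, which is part of why it spread where smartcards and client certs never did.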


> Zero adoption of personal digital signatures. Zilch. Nada.

See Estonia - http://e-estonia.com/component/digital-signature/


There has been piecemeal adoption of various "digital signature" technologies across Europe, but it's mostly limited to specific areas dealing with tax and government communications. Accountants and lawyers might have encountered this sort of tech at some point, but that's about it.

Digital signatures are extremely disruptive of long-standing legislation, so they require excruciatingly slow rollouts anyway: first you have to change all laws dealing with contracts and identification, then you have to define what technology you're going to use (and pray it doesn't change in 2 years), then you have to modernize State infrastructure to deal with the new setup, then you have to persuade regular citizens to change their habits. And of course, as for all crypto schemes, the result will only be as strong as the weakest link in the chain (e.g. the DigiNotar fiasco).


> then you have to persuade regular citizens to change their habits.

And then you still need a backup procedure for the elderly, people in extreme poverty, and others who for some reason cannot use electronic solutions to interact with the government.


What's interesting about Estonia is that they made the digital signature legally equal to a physical one in all ways and left consumer demand to drive adoption through its convenience and safety.


Same in Denmark. The government-issued digital signatures are so widely used that close to 100 percent of all adults between 18 and 70 use them, as do a significant share of the older population and of children (15 years and up). And not just for government websites - for banking, for signing contracts, etc.

Now, the whole system is set up by the Danish government, so if you worry about government spying you probably won't find that signature very useful :-)


I admit, I'm being highly cynical. This system is great. The keys are state-issued, so if you set your paranoia to the level where the state is trying to frame you for tax fraud, it's not beyond repudiation... but yeah, on balance it's likely a better system than the one being rolled out here in the UK[0]. Gov.uk 'Verify' involves four entities (the hub, the government department you're trying to use, you, and your chosen identity provider) and, from what I can tell, it's just glorified OpenID.

[0] https://identityassurance.blog.gov.uk/2014/10/14/gov-uk-veri...


I'd like to see usage statistics. Spain deployed DNIe (a mandatory national ID with an embedded certificate), but less than 5% of Spaniards use it due to lack of marketing and a very poor implementation.

Would love to hear a success story :)


In a centralised system a lot of these techniques don't actually make much sense. We could have client certs, but they would need to be cached in memory on machines that may be compromised. At that point, why not just use cookies sent over SSL? The same is true of personal digital signatures. Why give complete trust to something that could have been obtained using malware or phishing?


> At that point why not just use cookies sent over SSL?

Cookies are completely under the control of the web server/application. They can't provide a secure means of client authentication, except as really bad session tokens. The way they're currently used as such is essentially broken because they're not actually bound to any session state, which opens the door to session hijacking attacks even if you're using TLS.
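As a rough illustration of what "bound to session state" could mean, here is a hypothetical sketch: the server MACs the session ID together with some per-client binding (for example a TLS channel binding or client fingerprint), so a copied cookie value is useless on its own. All names and the choice of binding are illustrative, not any real framework's API:

```python
import hashlib
import hmac
import secrets

SERVER_KEY = secrets.token_bytes(32)  # hypothetical server-side secret, never sent to clients

def issue_token(session_id: str, client_binding: str) -> str:
    """Return a cookie value tied to both the session and the client context."""
    tag = hmac.new(SERVER_KEY, f"{session_id}|{client_binding}".encode(),
                   hashlib.sha256).hexdigest()
    return f"{session_id}.{tag}"

def verify_token(token: str, client_binding: str) -> bool:
    """Reject the token if the session ID or the client binding changed."""
    session_id, _, tag = token.partition(".")
    expected = hmac.new(SERVER_KEY, f"{session_id}|{client_binding}".encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(tag, expected)
```

A stolen token presented from a different client context fails verification, which is the property a bare random cookie lacks.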


Which is exactly the same problem you would have with client certs. If an attacker can get a cookie why can't they also get a private key?

Also, a centralised system needs to have a way to recover accounts when credentials are lost. Without some kind of third-party validation (which would reduce sign-ups), sites would have to rely on exactly the same methods to validate users as password + cookie. For consumer sites with badly maintained machines, client certs just increase the attack surface with little real benefit.


Do you think that the failure of something like PGP (in its various incarnations) to take off was down to usability? Failure to reach critical mass? Complete lack of interest/concern from the general user? Something else?

> My dad considers Facebook statuses bizarre.

If you were to be a rational outside observer, wouldn't you agree with this?


Tbh I think PGP probably failed because it's flawed the moment you move to webmail, and webmail is just so compelling from a user-experience POV. I think the rest of the friction could have been overcome given enough UI candy.


> As of 2012, the federal government provided 60 percent of U.S. academic research and development funding. By choosing which projects to fund, grant-giving government agencies influence what research takes place.

That could explain a few things about how demand has developed, and how technologies for products have been shaped.


On the other hand

> After the run-in with the academic community in the late 1970s, the NSA history asserts that Vice Adm. Inman “secure[d] a commitment” that the Office of Naval Research would coordinate its grants with the NSA.

Tor was (and continues to be) sponsored by the Naval Research Lab.


From Hellman's letter to Stanford's attorney:

>Although it is a remote possibility, the danger of initially inadvertent police state type surveillance through computerization must be considered.

Yup... I'd say we've pretty well backed into it.


I don't have a problem with the way things turned out. An equally plausible scenario is that, had the researchers just surrendered to the Feds, we would have had nothing until it was too late.

The Feds like to squeeze - that's their job. It's why Congress usually takes care in limiting the scope of their activities. But had the researchers acted "reasonably", they'd probably be working in some basement a la Dan Aykroyd's character in "Spies Like Us."


Not directly related, but a fun fact: GCHQ had discovered both RSA and Diffie-Hellman asymmetric crypto years earlier: http://en.wikipedia.org/wiki/Clifford_Cocks
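For anyone who hasn't seen it, the key exchange the GCHQ researchers (and later Diffie, Hellman, and Merkle) arrived at fits in a few lines. A textbook sketch with a deliberately tiny prime - real deployments use 2048-bit groups or elliptic curves:

```python
import secrets

# Public parameters: a small prime modulus and generator (toy values for illustration)
p, g = 2147483647, 5

a = secrets.randbelow(p - 2) + 1   # Alice's private exponent
b = secrets.randbelow(p - 2) + 1   # Bob's private exponent

A = pow(g, a, p)                   # Alice sends A over the open channel
B = pow(g, b, p)                   # Bob sends B over the open channel

# Each side combines its own secret with the other's public value
shared_alice = pow(B, a, p)        # (g^b)^a mod p
shared_bob = pow(A, b, p)          # (g^a)^b mod p
assert shared_alice == shared_bob  # both arrive at g^(ab) mod p
```

An eavesdropper sees p, g, A, and B, but recovering the shared secret from those is the discrete logarithm problem.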



