HOPE 2004, Day 2

For the second day of HOPE, I found myself mostly preferring the technical talks over the political talks, because overall they taught me more. Since I'm a politics junkie, I might not learn much from a HOPE talk about, say, media consolidation, but drop me into a talk about cryptography and I'll barely keep my head above water.

So the day started poorly when I overslept and caught only the tail end of Peter Wayner's "Building the Anti-Big Brother", which was about designing databases that won't give up too much information if they're compromised through cracking or through a subpoena. He showed an interesting example of a library that uses one-way hashes to track checked-out books without giving away which books are being checked out by a particular patron.
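I missed the details of Wayner's actual construction, so here's a minimal sketch of how I understood the general idea, with all the names and structure my own invention: the checkout table stores only a salted one-way hash of the patron's card number, so a cracked or subpoenaed copy of the table can't easily be walked backwards to a patron's reading list, but a patron presenting their card can still be matched to their loans.

```python
# Toy sketch of a privacy-preserving checkout ledger: the table never
# stores a raw patron ID, only a salted one-way hash of it.
import hashlib
import os

def patron_token(card_number: str, salt: bytes) -> str:
    return hashlib.sha256(salt + card_number.encode()).hexdigest()

class CheckoutLedger:
    def __init__(self):
        self.loans = {}  # book_id -> (hashed patron token, per-loan salt)

    def check_out(self, book_id: str, card_number: str) -> None:
        salt = os.urandom(16)  # fresh salt per loan defeats precomputed tables
        self.loans[book_id] = (patron_token(card_number, salt), salt)

    def is_borrower(self, book_id: str, card_number: str) -> bool:
        loan = self.loans.get(book_id)
        if loan is None:
            return False
        token, salt = loan
        return patron_token(card_number, salt) == token

    def check_in(self, book_id: str, card_number: str) -> None:
        if self.is_borrower(book_id, card_number):
            del self.loans[book_id]  # the loan record vanishes on return

ledger = CheckoutLedger()
ledger.check_out("QA76.9", "1234-5678")
assert ledger.is_borrower("QA76.9", "1234-5678")
ledger.check_in("QA76.9", "1234-5678")
```

(A real scheme would have to do more than this, since low-entropy card numbers can be brute-forced against a stolen table, salts and all; presumably Wayner's version addresses that.)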

By comparison, the next panel, "Propaganda in Art and Media", didn't feel substantial to me. A lot of discussion focused on smaller instances of information control: One example was the fact that HOPE had been asked to take some banners down from the windows facing the street. I would've liked more scholarly research (how did our society get this way?) or concrete examples of activism (how can we fix it?). It's too easy—especially for this crowd—to hang around and bitch about the Man.

Next was Steve Wozniak, Apple co-founder, giving the Saturday keynote. This was a much-improved experience over Friday's, since they set up a video feed to the movie room on the mezzanine level: My friend Rob and I stayed up on the main floor and it was pretty comfortable. Woz talked about his early experiences designing computers and founding Apple with Steve Jobs, and how he tries to inspire an excitement about learning in the kids in the computer classes he teaches today. He comes across as a genuine guy who's passionate about learning, though my inner eco-nazi feels compelled to note that he loses points for driving a Hummer.

After that, Steve Rambam talked about privacy, and his talk did a lot to crystallize for me the specific nature of the threat. "Your big fear should not be Big Brother," he said, "it should be private industry." For marketing reasons, private industry compiles and processes copious amounts of personal data—much more than the government does, in fact. And although that by itself might be considered dangerous—I personally just consider it annoying—the threat is amplified when the data gathered by private industry starts to leak into governmental usage. This happens because governmental functions are now outsourced to private industry to an unprecedented degree, and companies that harness data for private-sector clients are eager to make even more money by repurposing that data and selling it to the government. The purpose of the data in both cases is the same, Rambam noted: Both private industry and the public sector want to know who you are, and what you're going to do.

Then, three heavy tech talks: "Making Use of the Subliminal Channel in DSA", "Ten Years of Practical Anonymity", and "How to Break Anonymity Networks". The first discussed how to hide small messages in digital signatures without affecting the validity of the signature: One use of this is to recover a victim's private key if you manage to pass them a maliciously patched version of GnuPG. I understood this in theory, though I have to confess that some of the math went way over my head—but hey, that's how I like it. The second and third talks were tandem talks about anonymizing networks, covering both how they work now and ways to defeat them.
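To give a flavor of the mechanics, here's a toy sketch of my own devising—not the speaker's code, and with deliberately tiny parameters so the numbers stay readable (real DSA uses 2048-bit moduli). DSA signing involves a random per-signature nonce k, and since many values of k yield a valid signature, a subverted signer can simply redraw k until the resulting signature encodes a few chosen bits:

```python
# Toy narrowband subliminal channel in DSA: the signer grinds the nonce k
# until the low bits of r spell out a hidden payload. The signature still
# verifies normally, so the victim sees nothing unusual.
import hashlib
import secrets

p, q, g = 607, 101, 64             # tiny DSA group: q divides p - 1, g has order q
x = secrets.randbelow(q - 1) + 1   # private key
y = pow(g, x, p)                   # public key

def digest(msg: bytes) -> int:
    return int.from_bytes(hashlib.sha256(msg).digest(), "big") % q

def sign_with_channel(msg: bytes, hidden: int, nbits: int = 2):
    """Sign msg, choosing k so the low nbits of r equal `hidden`."""
    while True:
        k = secrets.randbelow(q - 1) + 1
        r = pow(g, k, p) % q
        if r == 0 or r % (1 << nbits) != hidden:
            continue                               # keep grinding k
        s = (pow(k, -1, q) * (digest(msg) + x * r)) % q
        if s != 0:
            return r, s

def verify(msg: bytes, r: int, s: int) -> bool:
    if not (0 < r < q and 0 < s < q):
        return False
    w = pow(s, -1, q)
    v = (pow(g, digest(msg) * w % q, p) * pow(y, r * w % q, p) % p) % q
    return v == r

r, s = sign_with_channel(b"perfectly innocent email", hidden=0b10)
assert verify(b"perfectly innocent email", r, s)  # looks like any other signature
print("leaked bits:", r % 4)                      # ...but r carries two hidden bits
```

Two bits per signature doesn't sound like much, but if a patched binary leaks bits of the private key this way, a few hundred signatures are enough to reconstruct the whole thing—and nothing about the output looks wrong.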

Anonymity is a really difficult problem, it turns out, and one of the interesting things about it is that current anonymity schemes get better the more people use the system. The reason is that a bigger crowd is easier to blend into, which makes statistical analysis harder. As Nick Mathewson, the speaker for "How to Break Anonymity Networks", pointed out, this is different from, say, encrypting an email using PKI, which is just as strong if the sender and the recipient are the only two people using the system.
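To make that concrete, here's a toy illustration of my own—not an example from Mathewson's talk—of one classic way small anonymity sets get broken: the long-term intersection attack. An observer who notes which users were online each time a pseudonymous message appeared can intersect those sets, and the smaller the crowd, the faster the intersection collapses onto the true sender:

```python
# Long-term intersection attack against a small anonymity set: intersect
# the sets of users who were online at each message's send time.
online_during_send = [
    {"alice", "bob", "carol", "dave"},   # users connected at message #1
    {"alice", "carol", "erin"},          # users connected at message #2
    {"alice", "dave", "frank"},          # users connected at message #3
]

suspects = set.intersection(*online_during_send)
print(suspects)  # {'alice'}: with so few users, three observations suffice
```

With thousands of active users, each observation rules out far fewer suspects—which is exactly why these systems want everyone's traffic, not just the traffic of people with something to hide.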

Which brings up an interesting game-theory question. You have a system that inherently attracts illicit traffic. You have a bunch of illicit users who want to draw other users into the system to cover their tracks, but this level of anonymity isn't exactly user-friendly, and it's far beyond what most people feel they need. In fact, you could say it might be to their disadvantage to use the system, since it might subject them to blanket investigation for charges such as, say, being involved in a child porn ring. So how do you get them to use your system?

I posed this to Mathewson, and he said that user-friendly plug-ins for email systems will help, and that he imagines a lot of people using the system just for the cool factor. I suspect that will help quite a bit, though whether it will help enough is hard to say. (What exactly "enough" is, of course, depends on how paranoid you are about having your anonymity broken.) It would be good if, to use some game-theory speak, participating were somehow in the licit users' self-interest. Or maybe, with the RIAA expanding its lawsuits over online MP3 trading, the class of illicit users will keep growing until everybody joins for their own self-protection?

The two anonymity talks kept me away from the start of Robert Steele's talk, but since he'd basically said he could keep talking for as long as there was a crowd, Rob and I went over and took a seat. Steele speaks from a lot of experience in the military and intelligence worlds, and in the 30 or so minutes I heard him take questions from the audience, he covered topics such as what's wrong with intelligence gathering as it's currently done by the CIA, how to recruit and groom informants, and how to gather information using openly available sources. He argued that it's a mistake to recruit only young people and have them build a cover, saying that it's nearly impossible to build a solid cover today with all the information floating around—which gels nicely with Steve Rambam's earlier warnings about corporate data mining. He also said that although he thinks it's plausible that Karl Rove and Dick Cheney allowed September 11 to happen on the theory that it would unite the country the way Pearl Harbor once did, it's more likely that what happened was simply a massive communications failure made inevitable by systemic mismanagement. I'd never heard Steele speak before, and it's astounding to hear somebody talk so knowledgeably and candidly about such a mysterious topic—the word "hypnotic" would not be an exaggeration.

At about 1 a.m. Steele took a break, and that's when I tore myself away to catch the train home, write this, and go to bed. One more day to go!
