
HTTPS More Vulnerable To Traffic Analysis Attacks Than Suspected

Unknown Lamer posted about 9 months ago | from the working-out-the-bugs dept.

Encryption | 17 comments

msm1267 writes "Researchers have built new attack techniques against HTTPS traffic that have been effective in learning details on users' surfing habits, leaking sensitive data that could impact privacy. They tested against 600 leading healthcare, finance, legal services and streaming video sites, including Netflix. Their attack, they said in a research paper, reduced errors from previous methodologies more than 3 ½ times. They also demonstrate a defense against this attack that reduces the accuracy of attacks by 27 percent by increasing the effectiveness of packet level defenses in HTTPS, the paper said. 'We design our attack to distinguish minor variations in HTTPS traffic from significant variations which indicate distinct traffic contents,' the paper said. 'Minor traffic variations may be caused by caching, dynamically generated content, or user-specific content including cookies. Our attack applies clustering techniques to identify patterns in traffic.'"
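To make the quoted clustering idea concrete, here is a minimal Python sketch (my own simplification, not the researchers' code): page loads whose total transfer sizes are close get merged into one cluster, so the "minor variations" from caching or cookies collapse into a single page identity while genuinely different pages stay apart.

```python
# Toy single-pass clustering of total-bytes-per-page-load; the
# tolerance and sizes are made up for illustration.
def cluster_traces(trace_sizes, tolerance=200):
    clusters = []  # list of (centroid, member sizes)
    for size in sorted(trace_sizes):
        if clusters and size - clusters[-1][0] <= tolerance:
            centroid, members = clusters[-1]
            members.append(size)
            clusters[-1] = (sum(members) / len(members), members)
        else:
            clusters.append((size, [size]))
    return clusters

# Five loads of the same page (cookie/caching noise) plus one other page:
loads = [51800, 51890, 51760, 51950, 51820, 64400]
for centroid, members in cluster_traces(loads):
    print(round(centroid), members)   # two clusters -> two distinct pages
```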


The primary point in abstract but not summary (5, Informative)

JoshuaZ (1134087) | about 9 months ago | (#46429111)

The most interesting bit is not in the summary. Given individual websites, they could identify which specific webpage one was visiting, thus leaking with high probability all sorts of medical, financial and legal information. Examples include, from medicine, the websites of the Mayo Clinic and Planned Parenthood; from finance, Wells Fargo and Bank of America; and from entertainment, YouTube and Netflix. This sort of thing could be used for all sorts of surveillance or blackmail. Even just knowing which YouTube videos one is watching could be used for such ends.

Re:The primary point in abstract but not summary (4, Informative)

mveloso (325617) | about 9 months ago | (#46429209)

The "leaks" seem more like they can track the path of a user through a website, given the structure of the links and the relative size of the pages. I don't think they claimed they could tell what the data was on the page, but sometimes the fact that a user is on a given page is enough (depending on the structure of the site).

For YouTube, they'd have to figure out the relative sizes of all the pages, which might be difficult to do (and the size will change depending on the comments and browser used).

Re:The primary point in abstract but not summary (0)

Anonymous Coward | about 9 months ago | (#46429721)

I've seen it work with Google Maps over HTTPS. I can't find the video now, but it shouldn't take long to track down, as I originally saw it on /..

Re:The primary point in abstract but not summary (2)

Cramer (69040) | about 9 months ago | (#46430803)

Right. They first crawl the site to build a map of the encrypted pages. Then, by looking at other encrypted streams, they can guess, with approx. 89% accuracy, which page it was. The overwhelming point here is that it is a complete and utter GUESS. Without decrypting the contents, they don't know for sure what it is. The issue for SSL is that it's not very good encryption if my HTTPS traffic for foo.html is sufficiently the same as another HTTPS session's traffic for foo.html -- i.e., it's failing the test of differential analysis.
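For illustration, a loose Python sketch of that crawl-then-guess flow (the site, the paths, and the use of body length as a stand-in for on-the-wire byte counts are all assumptions of mine):

```python
import urllib.request

def measure(url):
    # Body length stands in for the TLS record sizes a passive
    # eavesdropper would actually observe on the wire.
    with urllib.request.urlopen(url) as resp:
        return len(resp.read())

# Step 1: crawl the target site ourselves to map size -> page.
# "clinic.example" is a placeholder; point it at a real site to try it.
site = "https://clinic.example"
pages = ["/", "/appointments", "/billing"]
size_map = {measure(site + p): p for p in pages}

# Step 2: given a size seen in someone else's encrypted stream,
# guess the page whose crawled size is closest.
def guess(observed_size):
    return size_map[min(size_map, key=lambda s: abs(s - observed_size))]
```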

Re:The primary point in abstract but not summary (1)

Jorl17 (1716772) | about 9 months ago | (#46433199)

And do you think this is specific to HTTPS, or rather a problem with most encryption techniques as we use them (given that we're not zero-padding input data to make it all roughly the same size, that is, pretty indistinguishable)?

Re:The primary point in abstract but not summary (1)

Cramer (69040) | about 9 months ago | (#46433595)

In this case, it's specific to SSL. But in general, this is another form of differential cryptanalysis. Any credible encryption system takes steps to prevent this. (Simply put, a single bit change in either key or plaintext should not have an easily predictable effect on the ciphertext.) As far as I know, no one has tried this method on other crypto methods.

Size alone is a very weak means of mapping content. Almost every modern web application has some variability in the output size at any given URL. Plus it's likely there will be many URLs generating the same size output.
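The avalanche property mentioned above is easy to demonstrate; here is a toy Python check using SHA-256 (a hash rather than a cipher, but the property is the same). It also shows why avalanche doesn't help against this attack: flipping bits scrambles the content, yet the output length still tracks the input length.

```python
import hashlib

def digest_bits(data: bytes) -> int:
    return int.from_bytes(hashlib.sha256(data).digest(), "big")

msg = bytearray(b"GET /foo.html HTTP/1.1")
a = digest_bits(bytes(msg))
msg[0] ^= 0x01                      # flip a single input bit
b = digest_bits(bytes(msg))

changed = bin(a ^ b).count("1")
print(f"{changed} of 256 output bits changed")  # typically ~128
```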

Re:The primary point in abstract but not summary (0)

Anonymous Coward | about 9 months ago | (#46433925)

YouTube is a bad example since Google no doubt still secretly cooperates with the NSA.

Everything is vulnerable to traffic analysis (0)

Anonymous Coward | about 9 months ago | (#46429231)

Either you slow the connection way down with garbage data or you admit that knowing where traffic is going is a huge "flaw" in routing protocols.

we just pad to the next X bytes, where X is small (1)

raymorris (2726007) | about 9 months ago | (#46433335)

What we do, and have done for many years, is just pad to the nearest X bytes, where X is roughly size / 30. That's small enough that it makes little difference in speed, but many resources end up being the same size.

Consider as an example the Mayo Clinic web site. Each page is maybe 5KB for the HTML itself. The graphics for the logo, nav bar, etc. are separate requests, cached after the home page. 80% of the HTML is template stuff - the header, the footer, the nav bar, overall page structure. Maybe 20%, or 1KB, is different on each page. Most pages have 500-1,000 bytes of unique content. So pad up to the nearest 100 bytes. You aren't going to notice any slowdown from an extra 50 bytes, but if most pages are an even multiple of 100 and their sizes generally don't differ by more than 1,000 bytes, about 10% of all the pages on the site will pad out to the same size as the requested page - foiling the attack.
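A minimal Python sketch of that padding scheme (padding with trailing filler bytes here; a real deployment would pad with HTML comments, whitespace, or at the TLS layer so the page still renders):

```python
def pad_to(body: bytes, x: int = 100) -> bytes:
    """Round the response up to the next multiple of x bytes."""
    shortfall = -len(body) % x
    return body + b" " * shortfall   # illustrative filler bytes

page = b"<html>" + b"a" * 843        # an 849-byte page (made-up size)
padded = pad_to(page)
print(len(page), "->", len(padded))  # 849 -> 900
```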

It seems to have worked. The bad guys discuss our security system on the crack forums regularly, but there's been no mention of a successful size-based attack.

Reduced more than 3.5 times? (1)

wonkey_monkey (2592601) | about 9 months ago | (#46429451)

Their attack ... reduced errors from previous methodologies more than 3 ½ times.

There has got to be a clearer way of saying that. Do they mean "to less than 28%?"
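Taking the division reading, the arithmetic works out close to that (a quick check; the 3.5x figure is from the summary):

```python
# "Reduced errors more than 3.5 times" read as division:
old_error = 1.0                  # normalize the previous error rate to 1
new_error = old_error / 3.5
print(f"{new_error:.1%} of the previous errors")   # 28.6%
```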

Re:Reduced more than 3.5 times? (1)

sexconker (1179573) | about 9 months ago | (#46430303)

Their attack ... reduced errors from previous methodologies more than 3 ½ times.

There has got to be a clearer way of saying that. Do they mean "to less than 28%?"

The errors were at X before, now they're under negative 2.5X.

Another reason to use VPNs? (1)

mlts (1038732) | about 9 months ago | (#46430399)

This might be another reason one should consider using a VPN, even on a trusted network. An attacker would still be able to see traffic go by, but not know where it is going, especially if there is a program in the background doing random HTTPS queries to various sites for noise.
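A hedged sketch of that background-noise idea in Python (the decoy list and timing are illustrative, not a vetted design, and this does nothing against an attacker who can separate individual flows):

```python
import random
import time
import urllib.request

DECOYS = [
    "https://en.wikipedia.org/wiki/Special:Random",
    "https://example.com/",
]

def noise_loop():
    while True:
        try:
            urllib.request.urlopen(random.choice(DECOYS), timeout=10).read()
        except OSError:
            pass                           # failed decoys don't matter
        time.sleep(random.uniform(5, 60))  # jitter the request timing
```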

Of course, the downside of VPNs is that a lot of them have their outgoing IP address flagged, so Google either demands a CAPTCHA before use, or just gives the middle finger and denies access entirely.

Re:Another reason to use VPNs? (1)

aaarrrgggh (9205) | about 9 months ago | (#46431359)

It is more like you need a Tor-style VPN, routing your traffic over different paths and aggregating/dis-aggregating so there is never a single point that all your traffic flows through. Not especially efficient.

Re:Another reason to use VPNs? (1)

pigiron (104729) | about 9 months ago | (#46431831)

Yes, but have you taken Dark Matter into account?

Reminds me of those tor research papers (0)

Anonymous Coward | about 9 months ago | (#46432613)

It sounds to me like a lot of the "we can identify Tor traffic almost perfectly" papers, in which they ignore false positives completely.

That's not to say these attacks have no value for an attacker, but it heavily limits where they can be applied.
You could follow a single user relatively well if you had a rough idea of where he would go, but it becomes completely useless for following a huge number of people of whom only a small subset will visit a place you are interested in.

Assume you want to know which person out of a group of 10,000 visits a specific site, and your true positive rate is 100% with a false positive rate of just 1%. You will end up flagging about 100 people as having gone there, even though you know it's really just one of them, as the quick calculation below shows.
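Made explicit, the base-rate arithmetic looks like this:

```python
population    = 10_000
true_visitors = 1
tpr, fpr      = 1.00, 0.01   # true / false positive rates

flagged   = true_visitors * tpr + (population - true_visitors) * fpr
precision = (true_visitors * tpr) / flagged
print(f"~{flagged:.0f} people flagged; {precision:.1%} actually visited")
# ~101 people flagged; 1.0% actually visited
```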

hobby hobby! (0)

Anonymous Coward | about 9 months ago | (#46434023)

beta or not ...
if you want to try it yourself:
http://wiki.untangle.com/index.php/HTTPS_Inspector
