Transforming the Web Into a Transparent 'HTTPA' Database
An anonymous reader writes: MIT researchers believe the solution to misuse and leakage of private data is more transparency and auditability, not adding new layers of security. Traditional approaches make it hard, if not impossible, to share data for useful purposes, such as in healthcare. Enter HTTPA, HTTP with Accountability.
From the article: "With HTTPA, each item of private data would be assigned its own uniform resource identifier (URI), a component of the Semantic Web that, researchers say, would convert the Web from a collection of searchable text files into a giant database. Every time the server transmitted a piece of sensitive data, it would also send a description of the restrictions on the data’s use. And it would also log the transaction, using the URI, in a network of encrypted servers."
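The mechanism quoted above can be sketched roughly as follows. This is a hypothetical illustration, not the published protocol: the `Usage-Restrictions` header name, the per-item URI, and the audit-log format are all assumptions standing in for whatever HTTPA actually specifies.

```python
import hashlib
import json
import time

AUDIT_LOG = []  # stand-in for the "network of encrypted servers"

def serve_sensitive_item(item_uri, payload, restrictions):
    """Return headers plus body, and log the transaction by its URI."""
    AUDIT_LOG.append({
        "uri": item_uri,
        "digest": hashlib.sha256(payload.encode()).hexdigest(),
        "time": time.time(),
    })
    headers = {
        "Content-Type": "application/json",
        # The restrictions travel with every transmission of the data.
        "Usage-Restrictions": json.dumps(restrictions),
    }
    return headers, payload

headers, body = serve_sensitive_item(
    "https://example.org/patient/123/allergies",   # per-item URI
    '{"allergies": ["penicillin"]}',
    {"purpose": "treatment-only", "redistribution": "forbidden"},
)
```

Note that nothing here enforces anything; the restrictions are a description sent alongside the data, which is exactly the point the commenters below pick at.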
Encrypt the encrypted data and then give everyone (Score:3)
the key.
All of these sorts of silly ideas depend on no exploits and everyone being a 'good guy'.
If those two things were the case, there would be little to no reason to implement something in the first place.
Re: (Score:2)
Yep, sounds like just another variation on the evil bit [ietf.org].
Re:Encrypt the encrypted data and then give everyone (Score:5, Informative)
Mark Stefik, and his group at Xerox PARC, were talking about 'Digital Property Rights Language' back in 1994 or so, and by 1998, if not earlier, it had metastasized into a giant chunk of XML [coverpages.org]. That later mutated into "XrML", the 'Extensible Rights Management Language', which eventually burrowed into MPEG-21/ISO/IEC 21000 as the standard's 'rights expression language'.
Some terrible plans just never entirely die.
More like labeling a paper copy "confidential" (Score:2)
From the beginning of the article:
> HTTPA, designed to fight the "inadvertent misuse" of data by people authorized to access it.
It sounds to me like it's more similar to labeling a paper file "confidential, for xyz use only". By attaching the confidentiality information directly to the data, you seek to avoid having someone absent-mindedly email the information to a vendor without thinking about the fact that the information is supposed to be kept confidential.
missed the point. "inadvertent misuse". reminder (Score:4, Informative)
I think you've missed the point. Quoting the beginning of the article:
> HTTPA, designed to fight the "inadvertent misuse" of data by people authorized to access it.
I've had this conversation more than once:
Bob - Why did you tell people about ___. That was supposed to be a secret.
Sally - Oh, I'm sorry, I didn't realize that was supposed to be kept confidential.
There's also the thought: "oops, what I just said was supposed to be kept confidential. I messed up."
Those are the situations the protocol is supposed to address: the INADVERTENT release of confidential data. It's the digital equivalent of stamping a paper "confidential, for abc use only". Any time the system accesses the data, it is also reminded of the confidentiality rules attached to that data. This is so that, through processes and software, mistakes can be avoided. For example, a client could be set up so that an attempt to copy confidential data to the clipboard instead copies the reminder "this is confidential information", so someone pasting it into an email without thinking gets reminded.
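The clipboard example above might be sketched like this. The `TaggedValue` class and its `confidential` flag are hypothetical; the point is that the check is advisory and lives entirely in the client.

```python
class TaggedValue:
    """A piece of data carrying an advisory confidentiality label."""
    def __init__(self, text, confidential=False):
        self.text = text
        self.confidential = confidential

def copy_to_clipboard(value):
    """Copy plain data as-is; substitute a reminder for labelled data."""
    if value.confidential:
        return "[reminder: this is confidential information]"
    return value.text
```

A well-behaved client copies `TaggedValue("meeting at 3pm")` through unchanged, but copying a value tagged confidential yields only the reminder text.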
fold-down shelves, ATMs suggest otherwise. (Score:2)
In shopping malls, guests frequently leave bags in the restroom inadvertently. In some malls, the stalls have a little spring-loaded shelf that folds down to set your bag on. The thing is, the shelf blocks the door. You CAN'T leave the stall without picking up your bag to raise the shelf out of the way. Nobody has ever accidentally left a bag on one of those shelves.
Another common "oops" used to be leaving one's ATM card in the ATM. You'd insert your card, get the money you came for, then leave without the card.
Might have a place (Score:2)
Years ago I was working as a subcontractor to a major defense contractor. I had a conversation with IT that went something like this:
IT to all personnel: Anyone with a computer must review each file on their drive and label any that might contain confidential information. Please insert our company logo and the following text into any confidential files.
Me to IT: To clarify, I have approximately X files on my hard drive. Do I really need to review ALL of my files?
Privacy Ultimately Loses (Score:2)
1) Google and advertisers track you + accumulate data.
2) The government does the same
3) Credit reporting agencies and banks selling your debt/credit card transaction data.
4) Employers
5) Insurance companies + on and on
Facebook and Google and LinkedIn are just 3 companies built on invading your privacy and there are tons more.
Short version: You are losing your privacy. "Not liking it", "Angry posts" and the like won't change this.
On the plus side: They really ar
Re: (Score:2)
Short version: You are losing your privacy. "Not liking it", "Angry posts" and the like won't change this.
So, what do you think WILL change it? Moving along with the other sheep to the other side of the field, hoping the wolf won't get you next?
Angry posts are more likely to change it than what you seem to be advocating. Enjoy your cage.
Re: (Score:2)
Government action and privacy laws are the only solution, which I can't see the government being interested in because they are one of the main perpetrators (NSA spying, etc.).
"Angry posts" didn't stop email spam or telemarketing abuse -- someone complaining on the internet is of no concern to a company that is trying to generate revenue. Both of those were dealt severe blows by laws.
Re: (Score:2)
They want to take the Web and make it into some kind of holy f*king persistent interconnected-data mess, which would be broken all the time, because data that is supposed to be persistent seldom is after a few years.
I do not want the Internet to be an "interconnected database". I think if we tried to do it
That's literally the worst idea I ever heard (Score:5, Insightful)
Re: (Score:2)
Re: (Score:2)
Only my anger over people asking me this.
Re: (Score:1)
I have nothing to hide, but nobody needs to know that.
Re: (Score:2)
Re:That's literally the worst idea I ever heard (Score:5, Informative)
The original paper [mit.edu] has examples where such a DRM-based system has some legitimate uses. One was for patient data. If you want to eliminate special client software there, you can have this system and run everything in the browser. The system abstracts and standardizes the access control, which is hopefully already present, and helps to close holes in the implementation. For intranets the model makes perfect sense; deployment onto the wild wide web, however, is of course extremely harmful.
The media didn't alter the story, since the paper itself already discusses www deployment; they just picked out only the bullshit non-intranet-web part.
Re: (Score:2)
Re: (Score:2)
But no one ever really does that. Although you can state-freeze an OS, none of the OS makers have useful constructions that allow vetted air-gap updates via media transfer.
The entire scheme looks like a paradise for someone that wants to crack it like an egg. This, too, shall pass.
Re: (Score:2)
You're missing a bunch of steps.
You need to diff it all, make sure it MD5s (or better). Other dependencies have to be checked. While many of the Deb repos are fine, there's then the rest of the stuff you're using-- whose dependencies might not be in a cute and highly watched (if we're lucky) spot.
So you can apply this technique with other OS families and come up with similar questions, and no good airgap answers. You update only a core set of stuff, yes, the OS, but only after a lot of steps. And we hope yo
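The checksum-verification step described above might look like this in practice. The manifest format is an assumption for illustration, and SHA-256 stands in for "MD5s (or better)".

```python
import hashlib

def sha256_of(path):
    """Hash a file in chunks so large packages don't exhaust memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_media(manifest):
    """manifest: {file path: expected hex digest from the trusted side}.
    Returns the paths that fail verification and must not cross the gap."""
    return [path for path, want in manifest.items()
            if sha256_of(path) != want]
```

Anything returned by `verify_media` gets rejected before the media is carried across the air gap.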
Re: (Score:1)
What? No one does that? Maybe not in the US, but over here in Europe we certainly do have air-gapped networks where security is necessary (military). We were able to push Windows security updates just fine.
Re: (Score:2)
Being able to reuse more browser code might well improve the existing proprietary client; but (as with any DRM system) you can't eliminate special client software, because you would otherwise be incapable of distinguishing between a web browser that happily accepts your 'HTTPA' and then ignores your restrictions, and one that accepts and obeys.
As ever, you'll either need some suitably d
Re: (Score:2)
Being able to reuse more browser code might well improve the existing proprietary client; but (as with any DRM system) you can't eliminate special client software
I meant that the developers don't have to reinvent the wheel. Of course the software will be closed source, but the developers of the medical application won't need to invent their own access control; they can use this component. It's a good idea if it makes hospitals more secure.
Re: (Score:2)
I find it somewhat hilarious that the Wikipedia article about the Evil Bit has "do not track" in its "see also" section.
'send a description of the restrictions' (Score:2)
Privacy's on the honor system now!
Re: (Score:2)
Re:'send a description of the restrictions' (Score:4, Insightful)
We aren't taking the status quo, building a whole new layer on it, and pretending it'll work. That's how it's different.
Re: (Score:2)
Umm... overhead?
More trusted third party foolishness (Score:5, Informative)
All I see here is a bunch of stuff that all depends on trusted third parties... and in security circles, "trusted" means "can screw you over if they act against your interests". In this case it relies on trusted identity providers, labeled 'Verification Agent' in the paper.
It all breaks down if a verification agent is compromised, and the breach of even a single identity can have severe consequences that the accountability system cannot trace once information is in the hands of bad actors.
The authors effectively admit that this entire mechanism relies on the honor system; it explicitly cannot strictly enforce any access control, because in the context of medical data, access may stand between life and death.
Finally, the deliberate gathering of all this information-flow metadata would add another layer to the panopticon the net is turning into.
Re: (Score:2)
Re: (Score:2)
This is oddly close to what I think DRM ought to be: advisory, not enforcing. Remove the accountability aspect, not least because it's a farce that leaves the most recent honest party holding the bag, and you have my concept of an ideal DRM engine: provenance meta-tags that let you know what color your bits are [sooke.bc.ca], which you can use if it affects you or ignore if it doesn't, leaving no rights-holder the wiser no matter what course you take.
Accountability-oriented DRM, which prevents no action but forces yo
Re: (Score:1)
What? (Score:2)
Web browsers with DRM built in? Terrible.
Re: (Score:2)
Ah, so anyone but the legitimate user can easily access the content? Just like with the other forms of DRM?
Re: (Score:2)
But ... but 4 out of 5 people involved enjoy gang rape!
(now mod me down, I got karma to burn!)
Re: (Score:2)
Right, so when you get a disease that people are irrationally afraid of and no one will hire you, then what?
The whole throw away privacy argument relies on everyone being more or less rational. Even if everyone is, maybe you get diagnosed with a disease that's going to kill you in a few years, but you'll be functional up until the end. Plenty of people won't hire you just because they won't hire someone who is only going to be there for a couple years regardless of the reason.
Re: (Score:2)
Right, so when you get a disease that people are irrationally afraid of and no one will hire you, then what?
You sue them, and they give you money for the rest of your life, just like you worked for them, but you don't have to come in and actually work?
What's the downside here?
Re: (Score:2)
They appeal and delay the lawsuits and keep them dragging along in court until you're dead?
Re: (Score:2)
They appeal and delay the lawsuits and keep them dragging along in court until you're dead?
It'd just be cheaper to buy you off than to fight.
Re: (Score:2)
The downside is having to prove that they didn't hire you for that reason. Who the best candidate is is subjective. Did I not hire tlambert because he has $DISEASE, or because the other guy was a better candidate?
And as the other reply points out, lawsuits take time. I don't know about you, but I actually need money regularly. Sitting on my butt while a lawsuit wends its way through court is not something I can afford to do.
Track everything and ignore the restrictions anyway (Score:2)
Bad summary or stupid idea? (Score:2)
Is it a bad summary or a stupid idea?
As it is explained, it seems that the system does not cover the case where someone gets the data and leaks it.
Re:Bad summary or stupid idea? (Score:5, Informative)
Is it a bad summary or a stupid idea?
Yes.
As it is explained, it seems that the system does not cover the case where someone gets the data and leaks it.
It's advisory access controls with voluntary indications of use, with transaction metadata logging.
(1) The rights you could be granted are based on the object, not the actor and the object
(2) You obtain the exported rights list
(3) You voluntarily provide a purpose in line with the rights which are granted
(4) Your voluntary compliance with the rights list is logged as metadata, because collection of metadata isn't controversial at all
(5) You retrieve the data
(6) You use it however the hell you want, because you're a bad actor
(7) If you are a good actor, you enforce use restrictions in the client
and...
(8) You try to sell the idea as somehow secure, even though it's less secure than NFSv3, since NFSv3 at least requires the client to forge their ID
So "Yes" - a bad summary, and a stupid idea.
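The steps enumerated above can be sketched as follows. All names here are hypothetical illustration; the key property is that the good actor and the bad actor differ only in what the client chooses to do after the server has already logged "compliance" and handed over the data.

```python
METADATA_LOG = []

RIGHTS = {  # (1) rights attach to the object, not to (actor, object)
    "record-42": {"allowed_purposes": {"treatment"}},
}

def fetch(object_id, stated_purpose):
    rights = RIGHTS[object_id]            # (2) obtain the exported rights list
    METADATA_LOG.append(                  # (3)+(4) the purpose is voluntarily
        {"object": object_id,             # stated and merely logged as metadata
         "purpose": stated_purpose})
    return "sensitive-bytes", rights      # (5) the data is handed over

def good_actor(object_id, purpose):
    data, rights = fetch(object_id, purpose)
    if purpose not in rights["allowed_purposes"]:
        raise PermissionError("purpose not permitted")  # (7) client-side only
    return data

def bad_actor(object_id):
    data, _ = fetch(object_id, "treatment")  # lie about the purpose...
    return data                              # (6) ...then do whatever you want
```

Both actors get identical bytes back; the restriction check exists only in the client the bad actor didn't run.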
Dumb idea and here's why (Score:2)
This is a dumb idea that sounds like a good concept. Like any other good thing on the internet, it requires that no one be malicious. SMTP wasn't restricted until spammers abused it. All it takes to defeat HTTPA is a client written to ignore the A part.
Many issues (Score:1)
The problem is distribution of trust. This is solvable.
Re: (Score:2)
Really? No one in the known universe has figured out how to enforce 'trust' on others.
Please, enlighten me: how do you force me to honor your request not to tell your secrets to someone else? Kill me before I have the chance to do anything with the information? That's the only known method, and it still depends on no exploits, such as me finding a way to tell the guy standing next to me before you manage to kill me.
Buzzword bingo (Score:2)
Lost me at Semantic Web (Score:2)
More hoopla, with bandwidth and CPU intensive DRM and user activity tracking on top. What problem is this even trying to solve?
how is this not toad's freenet (Score:1)
under a different protocol?
how diff from google analytics? (Score:2)
...which already logs unique URIs and often classifies them using server-configured tags?
Really want to share a secret Bob? Alice? (Score:2)