Andreas Weigend, Amazon's former chief scientist and now an itinerant speaker in academic environments, sports a maniacal smile, as if he had just discovered the algorithm that created the universe. Discovery is his modus operandi, as if the world were made for his personal exploration.
The other day, at his invitation, I sat in on the final session of his course on the Social Data Revolution, held at UC Berkeley's School of Information, the iSchool. This session was about data ownership and featured Pete Warden, founder of Jetpac, who likes to find patterns by sorting through data about people, and Brad Rubinstein, about whom I know nothing, since he makes sure to clean up any personal "digital exhaust."
Warden argued that the issue of data ownership is moot, since even small companies can analyze masses of data from sensors (the Internet of Things) without the individuals generating that data ever being aware of its existence. For instance, Warden ran a mass survey of Instagram photos of bars and was able to determine which bars were gay, including some in Tehran, where being gay can land a person in jail.
What's important, he said, is to take control of data outcomes, not ownership; as an example, he cited the new health laws that prohibit insurance companies from denying coverage to people with pre-existing medical conditions, such as cancer.
Rubinstein added a caveat to information gatherers: “The act of collecting data confers responsibility for the consequences of that data.”
Weigend, who publishes the details of his day-by-day globe-trotting on his website, asked how many people give out fake data as a way, possibly, to protect their privacy. Most of the class raised their hands.
Countering all the negative implications of data scooping, I suggested that the very technology that makes personal data public could also be used to privatize it, so that individuals could sell access to their own habits, needs, and preferences, even the fake ones.
And at some point in the near future, processes will have to be developed to assess the veracity of data, because social data taken out of context (ignoring other critical variables) or generated in bad faith (a lie) becomes a misleading guide to understanding human behavior.