Online data collection happens on an enormous scale but isn’t always above board.
In September 2015, Comcast reached a $33m settlement with 75,000 customers whose personal information was published, despite those customers paying a fee precisely to prevent this from happening.
Each customer received $100 in compensation, in one of the first cases to put a price on data trust online.
Elsewhere, it’s hard not to see ad blocking software as an early symptom of diseased consumer trust.
And stories of people being hounded by charities after data brokers have shared their details are becoming commonplace.
This may be offline data, but the technology for sharing it is part of the new data brokerage digital infrastructure.
I’ve been reading a report from Nesta, the innovation charity, titled ‘Research on digital identity ecosystems’.
It’s really an essay, more philosophical in parts than similar papers from within our industry, but it gives good context for data usage and regulation, today and in the future.
A cultural shift towards data hoarding
The Nesta report asserts there is a cultural shift towards data hoarding.
Collection and analysis of personal information is seen as paramount for most digital companies, even if the data is not strictly necessary for current processes.
This is demonstrated by digital startups, which harvest data for VC investors, and industry consultants specialising in data collection and analytics.
GAFA: walled gardens and surveillance capital
The shift has been gradual, over a decade, but many internet users now spend most of their time on a very limited number of platforms, with Google and Facebook the most pervasive.
These new technology companies began by trading on West Coast alternative culture, being different from previous incumbents such as IBM.
But, over time, they have diversified, moving into new markets supported by increasing ad revenue.
This ad revenue comes from the ability to track, measure and monetise users on their devices. Some see this as a new economic model, defined as ‘surveillance capital’.
The myth of future data use
The hype around big data comes partly from a belief that further revenue can be extracted from existing data sets.
Although new services have been created with this technology, there is still an expectation from consumers that the data economy will result in more than simply hyper-targeted advertising.
Consumers understand that giving out their details should be part of a value exchange and they have become wary of doing so if they perceive that value to be lacking.
If new and useful services are not created from big data technology, might it be that data will lose some of its currency?
Data as currency
If data as currency is based on a promise of future value, it has been posited that this resembles the emergence of a debt system.
Comparisons can be made with large-scale financial systems and symbolic means of exchange, alongside a government interest in regulation.
If this analogy is taken further, it would seem reasonable to expect more cases of customers taking action against companies that take their data and then either misuse it or fail to deliver future value.
Data can already be seen to be used as currency, for example at the Financial Times, where answering some market research questions grants a user article access, or at Waze, where customers sharing traffic information get improved map functionality.
Is price discrimination the next step?
By tying data to identity, as is becoming increasingly possible with the expansion of the Google and Facebook ecosystems, companies may soon be in a position to assess a user’s willingness to pay for a specific product and tailor the offer to those circumstances.
Price discrimination may benefit both customers and companies, though it could be argued that it would stand to increase the average sale price of any particular product.
This could make the practice a sore point for customers if it were seen to spread dramatically.
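A minimal sketch of the mechanic being described: a seller adjusts a quoted price using an inferred willingness-to-pay signal. The function, the score, and the 25% cap are all hypothetical illustrations, not any company’s actual pricing logic.

```python
def quoted_price(base_price: float, willingness_score: float) -> float:
    """Sketch of personalised pricing: scale the offer by a
    hypothetical willingness-to-pay score in the range [0, 1]."""
    # Cap the uplift at 25% so the quote stays near the list price
    # (an arbitrary choice for illustration).
    return round(base_price * (1 + 0.25 * willingness_score), 2)

print(quoted_price(100.0, 0.0))  # 100.0 (low inferred willingness)
print(quoted_price(100.0, 1.0))  # 125.0 (high inferred willingness)
```

Even this toy version shows why the practice raises trust questions: two users see different prices for the same product based purely on data collected about them.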
Data as property
The key to hashed customer data used to target advertising is that it does not carry a personal identifier, i.e. the data cannot be tied directly back to an individual consumer.
It’s this that allows companies to use consumer data legally for advertising.
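To make the hashing idea concrete, here is a minimal sketch of how two parties can match customer records without exchanging raw email addresses. The function name, the example addresses, and the use of SHA-256 are illustrative assumptions, not a description of any specific platform’s scheme.

```python
import hashlib

def hash_identifier(email: str) -> str:
    """Normalise and hash an email address so two parties can
    compare records without sharing the raw address."""
    normalised = email.strip().lower()
    return hashlib.sha256(normalised.encode("utf-8")).hexdigest()

# An advertiser and a platform each hash their own lists, then
# compare digests: a match reveals a shared customer, but the
# digest alone does not expose the address itself.
advertiser_list = {hash_identifier(e)
                   for e in ["alice@example.com", "bob@example.com"]}
platform_user = hash_identifier(" Alice@Example.com ")
print(platform_user in advertiser_list)  # True: same address, same digest
```

Note the caveat this sketch exposes: hashing is deterministic, so anyone who already holds an address can recompute its digest, which is precisely why “non-identifiable” is a contested claim.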
However, many users view data as their property, and view even its non-identifiable use as unacceptable.
It’s partly this conception of data that has led to ‘do not track’ technology, with consumers wanting to freely navigate the internet without sharing their data.
Social validation shows trust is a two-way street
Identity is an important topic online, with stories of people ruining their lives with misguided comments on Twitter or Facebook.
The ability to identify users, part of companies’ data collection, may well be necessary for some online services to provide value.
The Nesta paper gives the examples of eBay, Uber and Airbnb as platforms that rate or validate their users, in order to provide a good experience for others.
Airbnb requires users to have a plausible online identity (100 Facebook friends, as shorthand) before they can use the service.
In the same way that customer review sites have levelled the playing field for consumers, perhaps identification of consumers is necessary to provide value to the customer base as a whole.
Data brokers and consumer concern
The World Privacy Forum and the Federal Trade Commission have estimated there may be up to 4,000 data brokering companies worldwide.
This isn’t inherently a bad thing but it’s a big number in light of growing customer concern over data usage. Geo-location is the next frontier for these companies, as online data begins to translate to real-world consumer habits.
So what does the future hold?
Many are calling for shared benefits (with the consumer) from data brokerage.
If data breaches continue and ad blocking and ‘do not track’ technology spread, steps must be taken to create a trusted flow of personal data, one in which control, value, trust and transparency play a more significant role in the digital economy and the identity market.
For more on this topic, read Delivering Value in the Data Exchange.