Evan Schuman: Transparency about data retention requires knowing what you have

A new call for transparency about what data mobile apps are retaining sounds fine and noble, but too many companies don't even know what their apps know about consumers

Now, here's a noble goal. U.K. telecom giant Orange on Friday (Feb. 21) launched a campaign to encourage companies to be much more transparent about the data they are collecting with their mobile apps, as well as to help consumers better control how such data is used. Laudable, really -- and terribly unrealistic.

I'm not even talking about the fact that most companies would rather not be transparent about why they retain consumer data. ("We're trying to get you to buy expensive stuff that you don't need and probably don't even really want. Why do you ask?") The real problem is that you can't disclose what you don't know.

And companies seem to know frighteningly little about what their mobile apps are doing, if efforts by Starbucks, Delta, Facebook, Match.com and eHarmony are any indication.

In a phone interview yesterday (Feb. 24), one of the leaders of that Orange report said that the disconnect between what companies know and what they really need to disclose is alarming. "Every industry needs to make a call to action for transparency," but such an effort is severely complicated by instances where "senior management is not even aware of" the data being retained, said Fred Lindgren, who runs much of mobile strategy for Orange and whose actual title is "senior manager of business anticipation." (Don't hold that title against Lindgren; he said he isn't a fan of it either.)

Not being aware that data is being retained is a real problem, as the cases cited above demonstrate. I have argued that companies that develop mobile apps tend to test them to make sure they perform the functions they were designed to perform, but they don't really think about making sure the apps aren't doing things that no one expected, such as exposing passwords in plain text.
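To make that concrete, here is a minimal sketch of the kind of negative test I'm describing -- not anything these companies actually run, just an illustration. It assumes a tester has exercised the app with a known throwaway password and then copied whatever the app wrote to the device (logs, caches, preference files, captured request bodies) into a local folder; the folder name and password below are purely illustrative.

    import sys
    from pathlib import Path

    # Illustrative values only: the password used with the test account during
    # the manual test run, and the folder holding files pulled off the device
    # afterward (logs, caches, preference files, captured request bodies).
    TEST_PASSWORD = "S3cret-Test-Passw0rd"
    CAPTURED_DIR = Path("captured_app_data")

    def find_plaintext_exposures(root, secret):
        """Return every captured file that contains the test password verbatim."""
        hits = []
        for path in root.rglob("*"):
            if path.is_file():
                try:
                    text = path.read_text(errors="ignore")
                except OSError:
                    continue
                if secret in text:
                    hits.append(path)
        return hits

    if __name__ == "__main__":
        exposures = find_plaintext_exposures(CAPTURED_DIR, TEST_PASSWORD)
        for path in exposures:
            print(f"Plaintext password found in: {path}")
        sys.exit(1 if exposures else 0)

A real audit would obviously go much further, but even a check this crude requires something most app teams never do: deliberately looking for behavior nobody asked for.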

I actually see companies following one of three paths to data retention.

On the first path, companies want to retain as little intrusive data as possible but aren't sure how to do it. Those companies can end up shipping mobile apps that retain information they don't even know about. Ask them to be transparent about what they are retaining, and they will tell you what they believe to be the truth, which could in fact be very far from it.

Companies on the second path do want to retain as much data as possible, but they want to hide that from their customers as much as they can. These companies see transparency on this issue as inimical to their interests. That doesn't mean they are staffed by evil people. They probably think their data retention is as much a boon to the consumer as it is to the corporation, since it helps customers get pointed toward the products and services they really want. If they are secretive about all of that, it's because they figure customers would get the wrong idea, and that the sheer helpfulness of the benefits arising from data retention makes it all OK.

Companies on the third path lie somewhere between the other two. They think they know what they're collecting, but they don't.

The reasons vary, but transparency wouldn't work for any of those companies. Some would reject it as a threat to their business model, and some just wouldn't know enough about what their apps are retaining to tell consumers.

But the whole idea that transparency is the right thing to do is built on shaky ground. Lindgren's argument for transparency is a common one in privacy circles: Fundamentally, the data belongs to the consumer, so decisions about whether such data is retained and how it is used should rest with the consumer. It's a high-minded principle, but not really based on fact. It can be legitimately argued that data belongs to whoever pays to have it collected (assuming that, where permission is needed, it has been granted). In other words, if a consumer agrees to data collection (even though that agreement might be buried within a long and poorly worded "click here to use this app" declaration) and a retailer, manufacturer or third-party app developer then pays a lot to collect, analyze, retain and store that data, it's not clear that the consumer does indeed own it. (Whether the user should own such data is a very different question, but until legislation is passed or the courts rule on that explicitly, it's not a fact that consumers own their own data -- a point that is quite clear to anyone who has tried fixing errors in a credit report.)

Another key point in the Orange report is that, compared with just a couple of years ago, consumers are far less confident that large companies will handle their mobile data properly. The problem with that stat is that it's unclear what is behind it. Were consumers being naive and overly optimistic when they answered, a couple of years ago, that they were reasonably confident that major companies would handle their data properly? If so, do the more recent answers merely show that consumers are sadder but wiser, now that they better understand the level of corporate apathy that exists when it comes to mobile data?

But corporations haven't cornered the market on apathy; consumers haven't really shown that they care about this issue -- that is, until they're affected by a breach. That's why I scoff when the Orange report notes, "There is currently no stand-out body who is seen to be educating the consumer about how to control their data." That's true, but frankly, consumers don't care enough about these issues for that concern to inform their decisions. The overwhelming majority of consumers who want an app go ahead and get that app without thinking at all about what personal data that app might retain or even expose.

The alarming truth is that, in many cases, the company that developed the app isn't thinking about that either. Before transparency can mean anything much, companies are going to have to get a whole lot better at this: they must expend the effort to know exactly what their apps are doing in terms of data retention, data usage and data protection (security). Until that happens, none of these privacy efforts stand a chance.

Evan Schuman has covered IT issues for a lot longer than he'll ever admit. The founding editor of retail technology site StorefrontBacktalk, he's been a columnist for CBSNews.com, RetailWeek and eWeek. Evan can be reached at eschuman@thecontentfirm.com and he can be followed at twitter.com/eschuman. Look for his column every Tuesday.
