March 17


Rethinking Privacy Policies

When was the last time you actually sat down and read a privacy policy? What about writing one?

In the last week, I have read several (painful), written and updated one (interesting), and started to consider how they drive (or fail to drive) the actions people take to protect information. I think we need to reconsider our privacy policies…

Sometimes a confluence of events presents itself to shape thinking in new and important ways:

1. Last week I updated the privacy policy for the Security Salon. In the process, I reviewed a lot of policies, checked out the “privacy policy generators” and tried to craft a policy that was fair, made sense and was technically accurate — as well as captured the essence of my intentions. To be honest, I found the “generators” confusing and limiting. In the end, I generated a policy and then modified it by hand. No doubt, it’ll evolve.

2. On Friday, an article on a local company (High Peaks invests $500K in software developer Apprenda) stood out to me for two reasons:

a. This is a Software as a Service (SaaS) company. They represent a growing trend that holds some important lessons and opportunities for changing the way people protect information.

b. They are a startup, and they actually have a dedicated security resource onsite as a founder – and his title is “Vice President of Security and Infrastructure.” This suggests security is top of mind.

3. This weekend, it was reported that 13 people were fired and another dozen or so — including doctors! — have been disciplined for accessing Britney Spears’s medical records. Sadly, this activity is not new in the realm of medical records, and the reaction is not surprising.

 

So I wrote a privacy policy, learned about a company handling information that was founded with security engaged from the beginning and read about the results of people violating the privacy of a medical patient. They all stayed with me — and then last night, I learned why.

Last night, I approved a comment on a post I wrote over two years ago. Normally, this is a sure sign of spam. In this case, it was not spam — better, it was the catalyst that pulled my thinking together (yes, catalysts rely on other catalysts — now you know).

The comment focused on the privacy policy of Plaxo. Keep in mind, the post is old and the privacy policy has probably evolved. Stacy Martin has moved on and the new Plaxo Privacy Officer is Redgee Capili. All of that notwithstanding, here is an excerpt from the recent comment that got me thinking:

…you did NOT say that Plaxo will not read the data of their customers… It would be nice to see a policy shuch [sic] as “Plaxo will not read the data of its customers unless 1) explicit permission is granted from the customer or 2) a law enforcement agency with appropriate juristiction demands to see the data.”

This is a subtle point and an interesting question: if someone provides a service, beyond protecting the information, should they have access to the data they hold? If so, for what purposes? I even question what it means to “read” — machine or human? Is there a difference?

At the same time, a fascinating post popped up yesterday in the Security Catalyst Community, asking about the ‘right’ way to handle ‘discovered’ PII: Handling Discovered PII. Great question!

We face a human problem, and we need a new approach. Where to start? When it comes to privacy policies, I think we need to start with some active and transparent conversations about responsibility. What do you think?

