Deborah J. Salons is a Washington DC-based attorney whose practice focuses on telecommunications, cloud, information security and privacy law. She is also a Certified Information Privacy Professional through the International Association of Privacy Professionals. With almost a decade of experience, Deborah is a skilled government affairs professional with a proven track record of leadership and coalition building in the technology, telecommunications, and media industries. Deborah earned her undergraduate degree in Communications and Speech from the University of Washington, a Master of Arts in Communication Management from the Annenberg School for Communication at the University of Southern California, and her law degree from Indiana University School of Law–Bloomington. While in law school, Deborah served as Editor-in-Chief of the Federal Communications Law Journal and clerked at the FCC and NAB. Deborah is licensed to practice law in California and the District of Columbia.
I got to know Deborah when she was moderating a panel I appeared on at the Open Data Center Alliance's Forecast 2012 event. It's taken me an embarrassingly long time to get this podcast scheduled and recorded, but I hope you'll agree it was worth the wait. This is an important topic but one that is, unfortunately, often poorly served by shallow or sensationalistic stories, blog posts, and papers. Deborah, on the other hand, takes us systematically through the legal issues associated with data at each point in its collection or use and offers lots of great, practical advice.
In this podcast, she talks about legal issues related to data, "big" and otherwise, and how concerns can be managed with the proper education and the proper teamwork between your legal counsel and your technology folks.
Listen to MP3 (0:28:10)
Listen to OGG (0:28:10)
Transcript:
Gordon Haff: You're listening to the Cloudy Chat Podcast with Gordon Haff.
Hi, everyone. This is Gordon Haff, cloud evangelist with Red Hat, and I'm here
on the phone with Deborah Salons, an attorney who is focusing on cloud
information security and privacy. I know that's a topic that's at the top of a
lot of people's minds, so I think this is going to be a really interesting
podcast.
Welcome,
Deborah.
Deborah Salons: Thank you very much, Gordon, for having me. I'm very excited
to be here.
Gordon:
Great. Can you maybe introduce yourself a little bit?
Deborah:
Sure. My name's Deborah Salons. I'm an attorney based in Washington, DC.
I've been practicing for about 10 years. I started practicing in the telecommunications
regulatory space. As we all know, telecommunications is converging with
technology, so I've been trying to follow that current, and my practice has been
melding more into issues around cloud computing and information security, as
well as privacy.
Gordon:
This is obviously a topic that makes for a lot of, perhaps,
sensationalistic headlines and fear stories and what have you. I think we'll
maybe start off this discussion at a high level. Are legal concerns around
cloud computing generally, and specifically around the handling of data,
whether on‑premises or in a public cloud, just a bunch of click‑baiting
headlines, or is there really something there that people need to be concerned
about?
Deborah:
I think that there really is something there for people to be concerned
about, but I think that those concerns can be managed with the proper education
and the proper teamwork between your legal counsel and your technology folks.
Before
I get any further, I do want to make the disclaimer that anything that I
discuss on here is just legal information. It's not legal advice. I have to
give the disclaimer since I'm an attorney. I'm not your attorney …
But
I do believe that it's very important for technologists to collaborate with
their attorneys on these issues, because they're not just headlines. Regardless
of whether your data is stored on‑premises or in the cloud, depending on what
kind of data you have and where it's located, you will be subject to privacy
and data‑security regulations.
With those things can come civil penalties, enforcement actions, private
lawsuits, and in some cases criminal penalties, which is very scary; the FCRA
(the Fair Credit Reporting Act), for example, has some criminal penalties
attached to it. The bottom line of all of this is it could be a PR nightmare
for your company.
Regardless of whether the data is on‑premises or in the cloud, you're facing
some legal issues. Now, once it goes into the cloud, there's another layer on
top of that. Jurisdiction issues come into play, as do data sovereignty issues.
Of course, there are the security concerns that come with cloud, but you also
have security concerns when it's on‑premises.
These
issues are something to be concerned about, but like I said, if you start with
your legal counsel at the very beginning, at the get‑go, I think you can really
manage those concerns.
Gordon:
I think that's really a good segue into the next part of this discussion,
because a lot of the debates or discussions around legal issues in the cloud
can be very high‑level, and I'm not sure they're always that practical. But
you've given a number of presentations that I've seen where you really go
through specific actions that people can take in order to minimize their legal
risk, so I'd like to walk down that path. The first step is, really, collecting
data in the first place. We're really focusing on data here because, although
those aren't the only issues, I think they're probably where the most severe
legal issues can potentially arise. What are the issues around just collecting
the data in the first place?
Deborah:
Sure. The data collection, it's probably one of the most important steps
in the whole Big Data process, so to speak, because that's when you're deciding
what information you're going to take in, and the kind of information you have
determines the kind of regulations you're going to be subject to. The problem
with the laws in this area is that you can't really take one book off of a
shelf on Big Data law. These laws come from state laws, federal laws, and
international laws, so it's really a cornucopia of laws that you're going to be
subject to.
At
the data‑collection stage, some of the concerns that you're going to have are,
at the time that you actually collect the data, are people opting in? Are
people opting out? Are you getting this data from your customers? Are they
visiting a website? You have to be also very concerned about your privacy
policy.
If
you're collecting information on a website, and you say one thing and you're
doing another, you're going to be in violation of some FTC rules. The privacy
policy is something that you really need to look at through all the steps, but
at the initial step, you really need to take into consideration what you're telling
people and how you're getting the information.
The
next thing I would say is you really need to also look at the kind of data that
you're collecting. Are you collecting personally identifiable information? Are
you collecting health information that's going to be subject to HIPAA? Is there
financial information that you're collecting? In the case of telecommunications
companies, is there CPNI that you're collecting?
Human
resources information, student information, citizenship information, classified
information: whatever kind of information you're collecting, at the point of
collection, you should really look at what you're going to be subject to.
Finally,
who are you collecting the data from? If you're collecting the data from
children, you're going to be subject to the COPPA regulations, the Children's
Online Privacy Protection Act. There are also state‑specific resident laws; if
you're collecting information from someone in California versus New York, there
might be different regulations that you're going to be subject to.
Furthermore,
if you're collecting information from international citizens, there might be
some other regulations that are thrown in the mix.
At
the very beginning stage, at data collection, I would say your initial privacy
considerations would be how are you collecting information, what kind of
information are you collecting, and who are you collecting the information
from.
Gordon:
One of the interesting things that points to is that, in "big data"
discussions, the default that a lot of people seem to think is good is, "Hey,
we just collect everything, and then all these wonderful insights are going to
fall out of that." Well, first of all, we've often seen that those great
insights don't necessarily just fall out. But I think what you're telling us
here is that some types of data collection have definite costs or even
potential risks associated with them.
Deborah:
Right. With all the capabilities that we have now, it's easier to collect
more information. It sounds like it's probably better to collect more
information than less, if you have the capabilities. You could learn more about
your customers, and you could actually take some of that information and
provide better services to them. But on the flip side, even if you're collecting
it and it's not identifiable, there are ways, when you collect so much
information, that you could perhaps re‑identify that information. That might
trigger some laws, or it might go against your privacy policy.
Again,
that's very, very important, because if you are doing something contrary to
what you're telling people you're doing, even if it's by accident or
you didn't intend to do that but you're able to do that, you
could run into a problem.
It is a very interesting balance. On one side there's the excitement of having
all that capability and technology to collect this information, run analytics,
and learn all these things about your customers or what have you; but
on the flip side, you might be running into some problems. Again, this is
another situation where you want to have the technologists, the lawyers, and
the business people all come together, really figure out a plan, and see what
the best road map is for avoiding potential risks.
Gordon:
Deborah, as we move to the next stage, you have all this data.
You're now adjusting it and cleaning it. You may be bringing multiple data sets
together. This is certainly one of the areas where this whole issue of "Is
the data personally identifiable or not?" really comes in. In fact, you
see more and more news items, research papers, and the like talking about how
some types of data that people have just been assuming aren't personally
identifiable increasingly turn out to really be, because it wasn't that
they couldn't be personally identifiable. It's just that people figured it was
going to be so hard that they were safe.
But that's like "security through obscurity." It doesn't necessarily
work that way.
Deborah:
Right. I think that this is becoming more and more of an issue, because
even if you collected a whole bunch of data that was not personally
identifiable, the more data you collect, the more dots you can connect. Then I
think that that is the downside to all of this. You might be collecting a whole
bunch of information that you might not think you would be able to match up
with a certain individual, but once you collect a whole bunch of data points,
you might be able to reverse‑engineer, so to speak, the identification of that
information.
That's
something definitely to be on the lookout for, because with all of this
technology, the capabilities of the cloud, and computing power, it's a reality.
Gordon:
We've got this data now. We've brought it into our systems. We've
hopefully cleaned it up somewhat. Now, we're wanting to actually use this
thing, whether it's people looking at it, data‑mining type of techniques, or
typically some combination thereof. We're now wanting to get something useful
out of it. What legal risks can come in at that stage?
Deborah:
Well, I think here you have the human element of just who's handling the
data. It could be a computer, it could be a human, but at the end of the day,
you have humans giving computers commands. One interesting aspect to this
that I found, and I'm not an import/export specialist whatsoever, but in
speaking to some of my import/export attorney friends, one of the things
they were discussing with me is citizenship and work visa issues with some of
your employees working with certain kinds of information.
It
could be technical specs. It could be classified information or business trade
secrets. But with the combination of the type of
information and the citizenship or work visa classification that your employee
has, certain combinations can trigger import/export laws, and you can actually
make an illegal export of information with both your information and your
employee being on American soil.
Again,
I'm not a specialist in this area, so I'm not quite sure what that combination
is, but I thought that was very interesting. That's something that all
companies should be aware of because you don't want to be in a situation where
you had that combination and you're triggering import/export laws when that
might be something that's not even in the purview of what you're looking at.
You're
thinking, "Oh, I've got data. It's all here." You would never think
of import/export, which I think is a very interesting twist to all of this.
On
top of that, beyond people having access to certain data, regardless of
their citizenship or where it's located, you really need to look at your
corporate policy to see who has access to certain data and what their
limitations are. Anyone in the security community will tell you that a large
number of data breaches occur internally, among employees within a company.
It
might not necessarily be a hacker but it's someone internal. So you really need
to see what your constraints are and what your corporate policies are, to ensure
that the people who need to see the data can see it and the people who
don't need to see it don't have access to it. Not only do you need to look at
information security from an outside hacker situation, but you also need to do
your due diligence and make sure that you're handling your information security
internally as well.
Gordon:
We've done something with this data. Storage is cheap these days, so
I think, in general, people are certainly inclined to hold on to this data
they've gone to a whole bunch of trouble to collect and analyze, because who
knows what could be useful a year from now. But that introduces some risk as well.
Deborah:
Sure, and I think storage can be cheap these days, like you said, and it
could be the tendency of some business people to look at cost and go perhaps
with the cheapest storage or whatever is the most convenient. But you really
need to take a step back and look at the big picture in regarding storage
because, first of all, you need to figure out where you're physically storing
the data because even though the cloud seems like it's borderless, it really
isn't. There's jurisdictions and either the location of the data really do make
a difference as far as what regulations you're subject to. That could be a huge
business consideration. And so, there's national security considerations.
There's the EU data privacy laws and data sovereignty issues.
I
know the European Union views anything that has to do with a U.S. cloud
computing provider as subject to the U.S. Patriot Act, and that's something
they're not comfortable with. So, as far as where you're physically storing the
data, that can make a huge difference. Then during the storage, you need to
understand how you're protecting it and who has the duty of care.
If
you have an outside storage provider, what kinds of firewalls and encryption
standards do they have in place? What are they doing to protect the data, and
what are you doing yourself?
That
raises the question of what happens if there is a security breach. That could be
a huge problem as well, because breach notification laws are different in almost
every state in the United States. Depending on where the
information is located and who you're collecting the information from, you
might need to make specific notifications about breaches to different people.
All
of those notifications might have different caveats and different
specifications, as to how you need to notify people. That could be 49 different
notices [corrected from 48 in the audio] that are going to be going out. On top of that, another interesting
thing that's going on with breaches is that the SEC, the Securities and Exchange
Commission, is now interested in breaches.
If
there is a large enough breach in your system and you're a publicly traded
company, you might need to disclose that information to the Securities and
Exchange Commission, because, in the end, this can affect your business. So that's a new
string of laws that's also developing. Breaches are very important, not only to
your bottom line, but to your PR.
Then
also, from a legal standpoint, where you store the information and how you're
storing it is very important. Don't just look at the cheapest avenue. You have
to really look at the big picture.
Finally,
I would say, how long are you keeping this data? You really need to balance
that out. Like you said, you can gather all this information, and you
can keep it forever if you wanted to. But you really need to figure out how long
you really need to keep the data and what the risks attached to keeping
the data for X amount of time are. And when you dispose of that data, obviously,
you have to be responsible about that. Do it in a responsible manner, so
you're not breaking rules at that point either.
Gordon:
What are the implications for things like e‑discovery, with having these
much larger data sets?
Deborah:
From what I understand, and I'm not much of an e‑discovery expert, the
more data you have, the harder it might be to sift through things to be
responsive to discovery requests. I was reading an article earlier today that
said that could just be the tip of the iceberg. When
you actually have these discovery requests and you have all this data to be
going through,
things can pop up about how you're storing your data, what kind
of corporate policies you have, and, in the case where you have customers,
whether you're following your privacy policy and doing what you said you were
going to be doing.
So
on top of just the e‑discovery issue of being responsive to discovery
requests, things could come out in the process that would not be favorable to
your company. Not only is e‑discovery itself an issue, but having your
corporate policies in line before you're in a reactive situation with e‑discovery
is very important, because some of these things can creep out in the process,
and those processes become public when you're involved in litigation.
Gordon:
Finally, in the case of researchers and companies, under some circumstances
there's at least some of their data that they want to share publicly in some
way, shape, or form. Obviously, they know about medical records,
confidentiality, and so forth, but what are maybe some of the less obvious
things that companies need to think about when they make data public in some way?
Deborah:
I think, when you're at the point when you're ready to share and
publicize the data that you've collected, you're at the end of the whole Big
Data process. I really think you need to wrap back around to the very beginning
and look at the privacy policy and the agreements that you made with the customers
and the people that you've received the information from. Because, at the end
of the process, people might not look back at what they said that they were
going to be collecting and how they said that they were going to be using it.
I
would advise people to make sure however you're sharing and publicizing this
data is, in fact, in tune with what you said in the very beginning, because you
can get into a lot of trouble with unfair and deceptive trade practices, with
not matching up with your privacy policy or what you said you were going to be
using the information for. You have to really make sure that the end matches up
with the beginning.
Obviously,
with the types of information, you don't want to be spilling health information
or classified information or things like that. But you really need to see, can
you really legally share the information? Is what you're doing not only
consistent with your privacy policy, but is it also consistent with the rules
in your jurisdiction?
Also,
if you're sharing de‑identified data, you really need to look at: if I were to
share this data set and it's de‑identified, is there a way that it can be re‑identified?
Because you don't want to be liable at the end of the day if there's a
problem with that.
This
whole big data process is a great thing, and you can find great information that
would be very useful to the public. But you just need to make sure that it's in
line with your privacy policy and that you're doing things on the up and up, and
to think about what could happen after you share that data and what the next
steps could be. You need to look into the future, to make sure you're not going
to walk into any kind of problem you just didn't think to look ahead for, and
back at what you did in the past with your privacy policy, as well as at what
you're doing at the present time with the legal regulations and what you can
disclose.
Gordon:
Even with ostensibly public data, we've seen issues recently. For
instance, registered gun owners' locations being published. Even though that
was, ostensibly, public data, a lot of people weren't very happy about it. It
doesn't seem, in many cases, that we've really come to grips with the
implications of certain types of public data. It no longer gets filed away
in a dusty town hall somewhere. It can be accessed with the click of a mouse
button and be mashed up with other types of public data, to really expose a lot
of facts about people that they don't necessarily feel comfortable having
their neighbor just go online and look up.
Deborah:
Sure. It's really about connecting the dots. That's a big issue. I think
that that's something that the courts are actually starting to recognize. I
know that I've been in social engineering presentations where you can connect
dots from people's social media websites and other things on the Internet
that's free, and it's public information. Once you have a whole bunch of
different dots, you can connect them and figure things out very, very
easily.
I
know that the Supreme Court was just looking at this issue in the Jones case,
which was a case that happened here in DC, and it was in regards to legal
searches, searches and seizures and whatnot, and whether a warrant was needed
for attaching a GPS tracker to a car to see where that car went.
The
interesting thing that happened in the Supreme Court, and also in the DC Circuit
Court dicta, was that the justices were discussing having an officer doing
surveillance on someone. You can see where they go from one point to another,
but it might not be 24 hours a day, seven days a week. You might see someone go
into a grocery store. You might see someone go into a doctor's office.
But
if you have a constant GPS tracker and you see someone went to the store at
this time, and then went to a doctor's office at this time, like a
gynecologist's office, for instance (this was actually discussed in one of the
court cases), and then went to a children's clothing store or a baby crib
store, something like that. When you put all these things together,
you can figure out that the person is expecting a child, and it might not have
anything to do with the case.
It's
almost like the more information you have and the more dots that you can put
together, it provides you with a bigger picture than if someone was just
following someone in a car and you see someone go one place, you see someone go
another place.
You
might not have the time. You might not know who's there. You might not know a
whole bunch of other different things. But the more data you have, the more dots you
can connect, and that's when things get tricky.
Gordon:
I guess to maybe wrap up here, we've talked about a lot of different
regulations and you've talked about some of the court cases. How well
established is the law in these areas overall, and how consistent is it between
different jurisdictions, both states and internationally?
Deborah:
Sure. I would say that the law certainly has not caught up with the
technology, and I think that that's very common when you're dealing with new
technologies, whether it be in the wireless arena or computers. The law is
always slower than the rate of innovation. I think the problem is, you have all
of these new technologies and the laws don't really match up with what you're
dealing with. That is a huge problem. That's why I really believe that business
people, the technologists and the lawyers all really need to get together.
Because you don't want to find yourself in a position, down the road, where
you're facing legal problems because you didn't think about it. But at the same
time, the regulators weren't thinking about it either. You might come to a point
down the road where something affects you that you didn't think about.
Also,
if you're a big company with a big idea and some kind of new technology, it's
important for you to be talking to your regulators and rule makers and
lobbying Congress and whatnot, so that you have a say and can provide
information and let people know what you're doing and how the laws can be
changed or developed, in order to help your business thrive.
But
as far as the rules being uniform, that's also a problem. Like I mentioned with
the data breach laws, they're all different. There are 49 different data breach notification rules on the books. [Transcript note: 46 states, DC, Virgin Islands, and Puerto Rico (making a total of 49). Corrected from "48" in the audio.] So that is a problem.
Because if you have a breach and if you have information from people in
different states and you have information located in different states, you
might have to send out 49 different letters.
That
can cost a lot of money, with legal fees, to be frank. I know that that's
something that the government is looking into making consistent, here in the
United States. That is a problem.
Another
problem is, with different countries, obviously, you're going to have different
rules and regulations. Just as a philosophical matter, the European Union and
the Europeans look at privacy a lot differently than we do here in America. The
Europeans look at privacy as a human right, and we look at it as a matter of
consumer protection law.
There's
a different perspective on it, and how people value privacy is different, so
their laws are going to reflect their values a lot differently.
In
the EU, I know right now the data protection authority is making moves to try
to have some kind of consistent law with all the European Union nations in
regards to privacy. Right now, they can all have different laws, just operating
under the directive, but they're really trying to get the regulations to be
consistent so that at least within the European Union there can be some
consistency there.
They're
currently working on this, but it hasn't happened yet. I think between
countries and even within the United States with the states, there is a problem
with consistency. That's all the more reason to have your legal counsel
involved, because you might think you know the law in one place, but it can
certainly be different in another.
You
want to make sure that the way you are crafting your business and your
business model and how you're doing things is as legally sound as possible, so
that you are avoiding any kind of expensive legal fees at the end of the day,
whether it be dealing with all of these different regulations or dealing with
litigation, which is obviously the thing that you don't want to have happen.
Gordon:
That sounds like good advice. Thank you very much, Deborah.
Deborah:
Thank you very much, Gordon.