Privacy

If You’re Going to Store Cookies, EU Court Says You Need the Users’ Active Consent: Data Privacy Trends

After two weeks in Italy, of course my first post back had to be a post about Europe, right?  ;o)  Regardless, while I was out last week, a notable decision was announced by the Court of Justice of the European Union which might impact a few people (i.e., anyone from the EU who accesses websites).

Anyway, as covered in Rob Robinson’s terrific Complex Discovery blog last week (Storing Cookies Requires Internet Users’ Active Consent Says Court of Justice of the European Union), in Case C-673/17, Bundesverband der Verbraucherzentralen und Verbraucherverbände – Verbraucherzentrale Bundesverband eV v Planet49 GmbH, the Court of Justice of the European Union decided that the consent which a website user must give to the storage of and access to cookies on his or her equipment is not validly constituted by way of a pre-checked checkbox which the user must deselect to refuse his or her consent.

Per the press announcement from the Court of Justice of the European Union, the German Federation of Consumer Organisations challenged before the German courts the use by the German company Planet49 of a pre-ticked checkbox in connection with online promotional games, by which internet users wishing to participate consented to the storage of cookies.  As a result, the Bundesgerichtshof (Federal Court of Justice, Germany) asked the Court of Justice to interpret the EU law on the protection of electronic communications privacy.  P.S., don’t ask me to pronounce some of these words… ;o)

That decision is unaffected by whether or not the information stored or accessed on the user’s equipment is personal data. EU law aims to protect the user from any interference with his or her private life, in particular, from the risk that hidden identifiers and other similar devices enter those users’ terminal equipment without their knowledge.

The Court notes that consent must be specific, so the fact that a user selects the button to participate in a promotional lottery is not sufficient to conclude that the user validly gave his or her consent to the storage of cookies.

Furthermore, according to the Court, the information that the service provider must give to a user includes the duration of the operation of cookies and whether or not third parties may have access to those cookies.
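For those of you who build or maintain websites, here’s a rough idea of what that looks like in practice.  Below is a minimal sketch (in Python, using Flask, with hypothetical cookie names, durations and partner names) of a consent flow where the checkbox is NOT pre-ticked, the non-essential cookie is only set after the user affirmatively opts in, and the notice discloses the cookie’s duration and third-party access.  This is purely illustrative, not legal advice and not anything prescribed by the Court:

# Illustrative sketch only -- cookie names, durations and the analytics
# partner below are hypothetical, and this is not legal advice.
from flask import Flask, request, make_response, render_template_string

app = Flask(__name__)

CONSENT_FORM = """
<form method="post" action="/consent">
  <!-- The checkbox is NOT pre-ticked; the user must actively opt in -->
  <label>
    <input type="checkbox" name="analytics_cookies" value="yes">
    Allow analytics cookies (stored for 12 months; accessible to our
    hypothetical analytics partner, ExampleAnalytics Ltd.)
  </label>
  <button type="submit">Save preferences</button>
</form>
"""

@app.route("/")
def index():
    return render_template_string(CONSENT_FORM)

@app.route("/consent", methods=["POST"])
def consent():
    resp = make_response("Preferences saved.")
    # Only set the non-essential cookie if the box was affirmatively checked.
    if request.form.get("analytics_cookies") == "yes":
        resp.set_cookie(
            "analytics_id",
            "example-value",
            max_age=60 * 60 * 24 * 365,  # matches the disclosed 12-month duration
        )
    return resp

if __name__ == "__main__":
    app.run()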

Interesting data privacy developments continue that will certainly shape how we not only share information, but even use websites going forward.  It will be interesting to see what the landscape for internet use looks like a few years from now.

BTW, as noted above, I’ve been gone for two weeks, so I want to give a BIG THANKS to Tom O’Connor for his terrific white paper on 30(b)(6) witness depositions that we were able to cover as a six-part series on our blog for the past two weeks.  So, instead of giving you the “best of eDiscovery Daily” these past two weeks, we were able to give you new content!

Also, BTW, Complex Discovery is currently conducting its Fall 2019 eDiscovery Business Confidence Survey.  This is the sixteenth quarterly eDiscovery Business Confidence Survey conducted by Complex Discovery.  All questions are multiple-choice, and the entire survey can be completed in about two minutes, so please check it out and feel free to participate.

So, what do you think?  Will this ruling affect how websites request consent for cookies?  Please share any comments you might have or if you’d like to know more about a particular topic.

Sponsor: This blog is sponsored by CloudNine, which is a data and legal discovery technology company with proven expertise in simplifying and automating the discovery of data for audits, investigations, and litigation. Used by legal and business customers worldwide including more than 50 of the top 250 Am Law firms and many of the world’s leading corporations, CloudNine’s eDiscovery automation software and services help customers gain insight and intelligence on electronic data.

Disclaimer: The views represented herein are exclusively the views of the author, and do not necessarily represent the views held by CloudNine. eDiscovery Daily is made available by CloudNine solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscovery Daily should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.

The Cat and Mouse Game Between Data Privacy Regulators and Online Advertisers: Data Privacy Trends

You didn’t think companies that make a lot of their revenue in online advertising were going to just roll over when Europe’s General Data Protection Regulation (GDPR) was enacted to protect personal information, did you?  Apparently not, as this article discusses.

According to Legaltech® News (Is the GDPR Creating a Cat-and-Mouse Game Between Advertisers and Regulators?, written by Frank Ready), the browser company Brave alleged last week that Google was using a mechanism called “push pages” to work around restrictions on the sharing of personally identifiable information (PII) laid out by the EU’s General Data Protection Regulation (GDPR). It did so, Brave said, by assigning a distinct, almost 2,000 character-long code to user information shared with advertisers.

Google issued a response to the site Tom’s Hardware saying that it does not “serve personalized ads or send bid requests to bidders without user consent.”  But, Google’s ad practices are already facing an inquiry by the Irish Data Protection Commission (DPC), specifically with regard to how well they comply with “GDPR principles of transparency and data minimization.” However, regulators attempting to enforce the anonymization of user data could find it difficult to keep pace with companies looking for new ways to both comply with privacy requirements and protect the online advertising revenue that is central to their business.

Jarno Vanto, a partner in the privacy and cybersecurity group at Crowell & Moring, thinks part of the problem is that most of the information collected about users online nowadays could potentially qualify as PII.

“Ad tech companies are now trying to come up with ways on the one hand to comply, but then they are still stuck in the old world where they were able to collect all of this data because they could rely on this distinction between non-PII and PII, and that’s no longer really a valued distinction,” Vanto said.
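To see why that distinction has eroded, consider a generic illustration (a hypothetical sketch, and emphatically not a description of Google’s “push pages” or anyone’s actual mechanism): a long, opaque identifier attached to user information still functions as personal data if whoever holds the mapping can link it back to the individual.

# Hypothetical illustration of why a long, opaque identifier can still be PII:
# whoever holds the lookup table can re-identify the person behind the token.
import secrets

user_profile = {
    "email": "jane.doe@example.com",
    "interests": ["travel", "ediscovery"],
}

# Generate a long token (1,000 random bytes -> 2,000 hex characters) and keep a mapping.
token = secrets.token_hex(1000)
id_map = {token: user_profile}  # the "key" that enables re-identification

# The token alone looks anonymous to an outside party...
shared_with_advertiser = {"bid_id": token, "interests": user_profile["interests"]}

# ...but anyone holding id_map can link it straight back to the individual,
# which is why regulators increasingly treat such identifiers as personal data.
print(id_map[shared_with_advertiser["bid_id"]]["email"])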

Debbie Reynolds, founder of the data privacy and cyber response firm Debbie Reynolds Consulting, believes other companies will be looking towards the outcome of the Irish DPC’s inquiry with interest as they try to align their own data practices with compliance and profitability.  Still, she’s not expecting much in the way of new parameters surrounding what constitutes a unique identifier.

“I don’t think the regulators are going to try and go out of their way to create new words or new definitions,” Reynolds said.

Vanto said he thinks it could be a tough road due to the amount of resources that would have to be leveraged in order to keep track of the practices employed by each technology company.  But Vanto also noted that there are tech-savvy privacy activists who have an interest in monitoring such activity, as well as rival companies that may also be inclined to keep their competitors moving towards the same kind of consent-based data sharing models that they are being driven to adopt into their advertising practices.

Just like Tom is always finding it difficult to stop Jerry, it appears that regulatory agencies – even with GDPR – are finding it difficult to stop the companies wanting to do everything they can to keep the advertising dollars flowing.  It will be interesting to see how this struggle plays out over time.

So, what do you think?  Will the regulatory agencies be able to find a way to protect personal information from advertisers?  Please share any comments you might have or if you’d like to know more about a particular topic.

Image Copyright © Turner Entertainment


It’s Friday the 13th! So, What’s Worse – Jason or Last Minute Votes on CCPA Amendments?: Data Privacy Trends

OK, today is the day that spawned a whole series of horror movies that have been a part of the Hollywood lexicon for nearly forty years now.  But the decisions being made in California this week could impact data privacy considerations for…well, who knows how long?  So, which is worse?  Depends on your point of view, I guess.

You all know the legacy of all the Friday the 13th movies, starting with this one way back in 1980 (still a classic!).  Yes, it even had Kevin Bacon in it, which should appease all you “Six Degrees of Kevin Bacon” fans.  Thank you, sir, may I have another?  ;o)

No, you may not, because this post is about data privacy and the California Consumer Privacy Act (CCPA).  According to this article from Alysa Zeltzer Hutnik & Alex Schneider at Kelley Drye, apparently this week marks the final opportunity for California lawmakers to amend the CCPA before the legislative session closes.

And, they have been busy with changes.  According to this article from yesterday by Andrew Kingman and Jim Halpert (no, not THAT Jim Halpert) of the International Association of Privacy Professionals (IAPP), the proposed amendment to support loyalty programs (to permit the sale of personal information collected through loyalty programs in very limited circumstances) has been shelved for the year.

Speaking of IAPP, they have a great infographic here that shows the status of privacy laws in various states.  You know I love an infographic!

Not enough detail for you?  They also have this more detailed table with references to each statute or bill currently making its way through the legislative process.

They say a picture is worth a thousand words and I’ve given you at least two of them, so I’ve earned a few days off, I’d say.  I’ll settle for two.  Back Monday!

So, what do you think?  Are you concerned about last minute changes to CCPA?  Please share any comments you might have or if you’d like to know more about a particular topic.

Image Copyright © New Line Cinema (you know which one)


Forget CCPA. COPPA Just Cost YouTube and Google $170 Million: Cybersecurity Trends

Sure, we’ve been talking a lot the past couple of years about Europe’s General Data Protection Regulation (GDPR), which took effect in May 2018, and we’ve already seen one big fine here and another huge potential fine here.  And, we’ve been talking for over a year now about the California Consumer Privacy Act, which is scheduled to take effect next January 1st.  But, have we talked about “COPPA”?  Not till now.  But, “COPPA” just cost YouTube and Google $170 million.

According to CBS News (Google to pay $170 million for violating kids’ privacy on YouTube, written by Sarah Min), Google will pay a record $170 million fine to settle a lawsuit filed by federal and state authorities that charged the internet giant with violating children’s privacy on YouTube, the Federal Trade Commission (FTC) said Wednesday.

The settlement requires Google and YouTube to pay $136 million to the FTC and $34 million to New York state for violating the Children’s Online Privacy Protection Act (COPPA), by collecting personal information from children without their parents’ consent.

The FTC and the New York attorney general alleged in a complaint that YouTube gathered children’s personal information by using “cookies,” or personal identifiers, that track users online. According to the suit, YouTube earned millions of dollars by using the information to deliver targeted ads to kids.

COPPA requires online websites to obtain parental consent prior to collecting kids’ online usage information. The FTC and New York Attorney General Letitia James said that, while YouTube claimed it caters to a general audience, many of its online channels are aimed at children under the age of 13.  That requires the service to comply with COPPA guidelines.
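To make that concrete, here’s a simplified, hypothetical sketch (in Python) of the kind of gate COPPA effectively requires: behavioral tracking of viewers of child-directed content only happens with verified parental consent.  This illustrates the principle; it is not the FTC’s rule text or any platform’s actual logic:

# Simplified, hypothetical sketch of COPPA-style gating -- not the FTC's rule
# text or any platform's actual implementation.
from dataclasses import dataclass

@dataclass
class Channel:
    name: str
    child_directed: bool          # e.g., content flagged as "made for kids"

@dataclass
class Viewer:
    user_id: str
    verified_parental_consent: bool

def may_set_tracking_cookies(channel: Channel, viewer: Viewer) -> bool:
    """Allow behavioral tracking only if the content is not child-directed,
    or verified parental consent has been obtained."""
    if channel.child_directed and not viewer.verified_parental_consent:
        return False
    return True

# Example: a toy-review channel aimed at kids, viewer without parental consent.
print(may_set_tracking_cookies(
    Channel("ToyReviews4Kids", child_directed=True),
    Viewer("u123", verified_parental_consent=False),
))  # -> False: serve contextual (non-tracking) ads instead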

“YouTube touted its popularity with children to prospective corporate clients,” FTC Chairman Joe Simons said in a statement. “Yet when it came to complying with COPPA, the company refused to acknowledge that portions of its platform were clearly directed to kids.”

For example, a toymaker with a YouTube channel could track people who viewed its videos to send ads for its own products that are targeted to children. The FTC said in its complaint that Google and YouTube told toymaker Mattel that YouTube “is today’s leader in reaching children age 6-11 against top TV channels.” It also said that the companies told Hasbro that YouTube is the “#1 website regularly visited by kids.”

But when it came to advertisers, the FTC alleged that YouTube told at least one marketer that the video-search company need not comply with COPPA, as it did not have users under the age of 13 on the platform.

Prior to Google’s settlement, the largest civil FTC penalty for a children’s data-privacy case was the $5.7 million fine levied in February against social media app TikTok.  This penalty is nearly 30 times that one.  Still, critics say last week’s settlement amounts to a drop in the bucket for Google, whose parent company Alphabet was sitting on $121 billion in cash and securities at the end of June.

Nonetheless, this penalty, along with Google’s GDPR fine from earlier this year, adds up to nearly $227 million.  That’s some serious money, even for a company like Google.  Great Google-y Moogle-y!

So, what do you think?  Will fines like these change how organizations track user data?  Please share any comments you might have or if you’d like to know more about a particular topic.


This Makes the Potential GDPR Fine Against British Airways Look Like Peanuts: Data Privacy Trends

It was just last week that we discussed the (probable) new largest fine since the General Data Protection Regulation (GDPR) took effect last May, with the proposed fine of nearly $230 million against British Airways for a data breach last year.  But, this fine approved by the Federal Trade Commission (FTC) against Facebook for data privacy violations makes that fine look like peanuts.

As discussed by Sharon Nelson in her excellent Ride the Lightning blog (FTC Approves Fine of Roughly $5 Billion Against Facebook for Privacy Violations), the New York Times (subscription required) reported on July 12th that the FTC has approved a fine of roughly $5 billion against Facebook for mishandling users’ personal information, according to three people briefed on the vote, in what would be a landmark settlement that signals a newly aggressive stance by regulators toward the country’s most powerful technology companies.

The much-anticipated settlement still needs final approval from the Justice Department, which rarely rejects settlements reached by the FTC. It would be the biggest fine by far levied by the federal government against a technology company, eclipsing the $22 million imposed on Google in 2012. The size of the penalty underscores the rising frustration among Washington officials with how large technology companies collect, store and use people’s information, and the Facebook settlement sets a new bar for privacy enforcement by United States officials, who have brought few cases against large technology companies.  In addition to the fine, Facebook agreed to more comprehensive oversight of how it handles user data, but none of the conditions in the settlement will impose strict limitations on Facebook’s ability to collect and share data with third parties.

The FTC’s investigation was fueled by The New York Times and The Observer of London, which discovered that the social network allowed Cambridge Analytica, a British consulting firm to the Trump campaign, to harvest personal information of its users. The firm used the data to build political profiles about individuals without the consent of Facebook users.

The agency found that Facebook’s handling of user data violated a 2011 privacy settlement with the FTC. That earlier settlement, which came after the company was accused of deceiving people about how it handled their data, required the company to revamp its privacy practices.

Facebook made more than $55 billion in revenue last year.  So, at 9.1 percent of annual revenue, this fine would be even more than a GDPR fine calculated at 4 percent of global annual turnover.  Nonetheless, when the FTC fine was revealed, Facebook’s stock price rose, suggesting that investors had feared the result could be even worse.  Maybe the “bite” from GDPR fines isn’t painful enough?
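For the math-inclined, here’s the back-of-the-envelope comparison using the figures above (a rough sketch; Facebook’s actual global turnover for GDPR purposes could differ):

# Rough comparison using the figures cited above.
facebook_revenue_2018 = 55e9   # "more than $55 billion" in revenue last year
ftc_fine = 5e9                 # roughly $5 billion

ftc_fine_pct = ftc_fine / facebook_revenue_2018 * 100
gdpr_max_fine = 0.04 * facebook_revenue_2018   # GDPR cap: 4% of global annual turnover

print(f"FTC fine as a share of revenue: {ftc_fine_pct:.1f}%")            # ~9.1%
print(f"Hypothetical GDPR maximum: ${gdpr_max_fine / 1e9:.1f} billion")  # ~$2.2 billion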

So, what do you think?  Do fines like this cause your organization to re-evaluate your own security policies?  As always, please share any comments you might have or if you’d like to know more about a particular topic.


Move Over Google, Here May Be Your New Largest GDPR Fine: Data Privacy Trends

Now, we’re talking some serious money!  In January, we covered the first big fine for failing to comply with Europe’s General Data Protection Regulation (GDPR) when France’s data protection regulator, the Commission nationale de l’informatique et des libertés (CNIL), issued a €50 million fine (about $56.8 million) to Google for failing to comply with GDPR.  Now, we have a new fine being proposed which is more than four times that amount.

As discussed by Sharon Nelson in her excellent Ride the Lightning blog (British Airways Faces Record Fine After Data Breach), the New York Times (subscription required) reported on July 8th that British authorities have said that they intend to order British Airways to pay a fine of nearly $230 million for a data breach last year, the largest penalty against a company for privacy lapses under GDPR.

Poor security at the airline allowed hackers to divert about 500,000 customers visiting the British Airways website last summer to a fraudulent site, where names, addresses, login information, payment card details, travel bookings and other data were taken, according to the Information Commissioner’s Office, the British agency in charge of reviewing data breaches.

In a statement, British Airways said it was “surprised and disappointed” by the agency’s finding and would dispute the judgment, which isn’t final as to the amount.

As we’ve noted many times on this blog, GDPR allows regulators in each European Union country to issue fines of up to 4 percent of a company’s global revenue for a breach. And by acting against an iconic British brand, officials showed that enforcement would not be limited to American-based tech companies, which have been seen as a primary target.

Before GDPR, fines by the Information Commissioner’s Office were capped at 500,000 pounds, or about $625,000. That was the fine it imposed on Facebook last year for allowing Cambridge Analytica to harvest information on millions of users without their consent. However, Facebook and Google are among other companies currently under investigation by the European authorities over breaches of the GDPR (despite the fines each has already received: Facebook’s before GDPR went into effect and Google’s after).

The large proposed fine against British Airways is thought to be based on the fact that this was an avoidable breach caused by alleged sloppy security and organizational practices.

As noted above, the British decision to fine British Airways £183.5 million, worth about 1.5 percent of the airline’s annual revenue, is not final. The agency said it would “carefully consider” responses from the airline and others to its penalty before issuing a final decision.  Even if it’s reduced, it seems inevitable to be a new record for GDPR fines (at least for now).

So, what do you think?  Do fines like this cause your organization to re-evaluate your own security policies?  As always, please share any comments you might have or if you’d like to know more about a particular topic.


Will CCPA Be a “Dumpster Fire” for Those Trying to Comply? Here are 10 Reasons it Might Be: Data Privacy Trends

We’re less than six months away from the scheduled start of the California Consumer Privacy Act (CCPA) on January 1, 2020.  So, what are the law’s prospects when it goes into effect next year?  According to one article, there are ten reasons why CCPA is going to be a “dumpster fire” when it goes into effect next year.

In Truth on the Market (10 Reasons Why the California Consumer Privacy Act (CCPA) Is Going to Be a Dumpster Fire, written by Alec Stapp), real estate developer Alastair Mactaggart spent nearly $3.5 million last year to put a privacy law on the ballot in California’s November election. He then negotiated a deal with state lawmakers to withdraw the ballot initiative if they passed their own privacy bill. That law – CCPA – was enacted after only seven days of drafting and amending.

Mactaggart said it all began when he spoke with a Google engineer and was shocked to learn how much personal data the company collected, which motivated him to find out exactly how much of his data Google had. But, instead of using Google’s freely available transparency tools, Mactaggart decided to spend millions to pressure the state legislature into passing new privacy regulation.

CCPA has six consumer rights, including the right to know; the right of data portability; the right to deletion; the right to opt-out of data sales; the right to not be discriminated against as a user; and a private right of action for data breaches.  But, according to Stapp, there are ten reasons why CCPA is going to be a “dumpster fire”.  Here are a few of them:

  • CCPA compliance costs will be astronomical: According to the article, if CCPA were in effect today, 86 percent of firms would not be ready. With an estimated half a million firms liable under the CCPA, if all eligible firms paid only $100,000, the upfront cost would already be $50 billion.  And, that doesn’t include lost advertising revenue, which could total as much as $60 billion.
  • CCPA is potentially unconstitutional as-written: The law’s purported application to businesses not physically located in California raises potentially significant dormant Commerce Clause and other Constitutional problems.
  • GDPR compliance programs cannot be recycled for CCPA: Companies cannot just expand the coverage of their EU GDPR compliance measures to residents of California. For example, the California Consumer Privacy Act contains a broader definition of “personal data”, establishes broad rights for California residents to direct deletion of data with differing exceptions than those available under GDPR and establishes broad rights to access personal data without certain exceptions available under GDPR, among other differences.
  • CCPA’s definition of “personal information” is extremely over-inclusive: CCPA likely includes gender information in the “personal information” definition because it is “capable of being associated with” a particular consumer when combined with other datasets. Also, the definition of “personal information” includes “household” information, which is particularly problematic. A “household” includes the consumer and other co-habitants, which means that a person’s “personal information” oxymoronically includes information about other people.
  • CCPA will need to be amended, creating uncertainty for businesses: As of now, a dozen bills amending CCPA have passed the California Assembly and continue to wind their way through the legislative process. California lawmakers have just over two more months (until September 13th) to make any final changes to the law before it goes into effect.

The complete list of ten reasons that CCPA is going to be a “dumpster fire” is provided in the article here.

So, what do you think?  Are you concerned about the status of CCPA?  As always, please share any comments you might have or if you’d like to know more about a particular topic.


Got Data Across Borders? The Sedona Conference Has a Commentary for You: eDiscovery Best Practices

Lately, it seems like we have a new publication from The Sedona Conference® (TSC) every couple of months or so.  This latest publication provides guidance to those that are transferring data across borders.

Last week, TSC and its Working Group 6 on International Electronic Information Management, Discovery, and Disclosure (WG6) announced that The Sedona Conference Commentary and Principles on Jurisdictional Conflicts over Transfers of Personal Data Across Borders (“Commentary”) has been published for public comment.

The goal of this Commentary is to provide: (1) a practical guide to corporations and others who must make day-to-day operational decisions regarding the transfer of data across borders; and (2) a framework for the analysis of questions regarding the laws applicable to cross-border transfers of personal data.

This 48-page (PDF) Commentary lists six choice-of-law principles, then provides an Introduction that discusses the underlying tension, comity (not comedy) and legal and practical complexity associated with the transfer of data.  It ends with a 19-page Appendix on Data Privacy Complexity and Background.  In between, the Commentary goes into much more depth on the six choice-of-law principles, which are as follows:

Principle 1: A nation has nonexclusive jurisdiction over, and may apply its privacy and data protection laws to, natural persons and organizations in or doing business in its territory, regardless of whether the processing of the relevant personal data takes place within its territory.

Principle 2: A nation usually has nonexclusive jurisdiction over, and may apply its privacy and data protection laws to, the processing of personal data inextricably linked to its territory.

Principle 3: In commercial transactions in which the contracting parties have comparable bargaining power, the informed choice of the parties to a contract should determine the jurisdiction or applicable law with respect to the processing of personal data in connection with the respective commercial transaction, and such choice should be respected so long as it bears a reasonable nexus to the parties and the transaction.

Principle 4: Outside of commercial transactions, where the natural person freely makes a choice, that person’s choice of jurisdiction or law should not deprive him or her of protections that would otherwise be applicable to his or her data.

Principle 5: Data in transit (“Data in Transit”) from one sovereign nation to another should be subject to the jurisdiction and the laws of the sovereign nation from which the data originated, such that, absent extraordinary circumstances, the data should be treated as if it were still located in its place of origin.

Principle 6: Where personal data located within, or otherwise subject to, the jurisdiction or the laws of a sovereign nation is material to a litigation, investigation, or other legal proceeding within another sovereign nation, such data shall be provided when it is subject to appropriate safeguards that regulate the use, dissemination, and disposition of the data.

You can download a copy of the Commentary here (login required, which is free).  The Commentary is open for public comment through August 10, 2019, and questions and comments may be sent to comments@sedonaconference.org.

As always, the drafting team will carefully consider all comments received, and determine what edits are appropriate for the final version.  Also, a webinar on the Commentary will be scheduled in the coming weeks, and will be announced by email and on The Sedona Conference website to give you the opportunity to ask questions and gain additional insight on this important topic.

So, what do you think?  How does your organization address transfer of personal data across borders?  As always, please share any comments you might have or if you’d like to know more about a particular topic.


FTC Calling for National Data Privacy Law: Data Privacy Trends

Sure, we’ve talked about California’s Consumer Privacy Act (CCPA).  And, we’ve also noted that there are at least 15 state data privacy laws that are working their way through the legislative process.  But, is there anybody pushing for a national data privacy law?  At least one Federal agency is doing so.

According to Naked Security (FTC renews call for single federal privacy law, written by Lisa Vaas – with hat tip to Sharon Nelson and the excellent Ride the Lightning blog), the U.S. Federal Trade Commission (FTC) is again “beating the drum” for the long-discussed and much-debated national data privacy law, the lack of which keeps the country from parity with the EU and its General Data Protection Regulation (GDPR), or with the various states (including California) that are working on their own laws.

Earlier this month, FTC commissioners testified before the House Energy and Commerce subcommittee and as reported by The New York Times, they addressed how a national privacy law could regulate how big tech companies like Facebook and Google collect and handle user data.  Of course, besides consumer protection, the FTC is looking for more power. Commissioners asked Congress to strengthen the agency’s ability to police violations, asking for more resources and greater authority to impose penalties.

In February, both the House and Senate held hearings on privacy legislation, transparency about how data is collected and shared, and the stiffening of penalties for data-handling violations.  A number of lawmakers agree that we need a new, single federal privacy law and they are now considering several laws and bills, including the Data Care Act and the American Data Dissemination Act.  One senator even proposed a bill that would throw execs into jail for up to 20 years if they play “loosey-goosey” with consumer privacy.  Yeah, that’ll happen.

With the FTC in settlement talks with Facebook following its 13-month investigation into privacy violations stemming from the Cambridge Analytica privacy debacle, there are certainly plenty of reasons to pass legislation to standardize the handling of data privacy breaches.  All we have to do is to get Congress to agree on it.  Easy, right?  ;o)

So, what do you think?  Do we need a national data privacy law similar to Europe’s GDPR?  Please share any comments you might have or if you’d like to know more about a particular topic.


Court Grants Motion to Compel Production of Telephone Records from Individual Plaintiff: eDiscovery Case Law

In Siemers v. BNSF Railway Co., No. 8:17-cv-360 (D. Neb. Apr. 8, 2019), Nebraska Magistrate Judge Susan M. Bazis, finding that the plaintiff’s telephone records are discoverable pursuant to Fed. R. Civ. P. 26, that they are not subject to a privilege claim merely because plaintiff’s counsel’s telephone number may appear in the records, and that privacy considerations are minimal to non-existent (since the at-issue records do not contain the substance of communications), ordered the plaintiff to produce his telephone records within one week of the order.

Case Background

In this case regarding the plaintiff’s suit against his former employer for alleged violations of the Federal Employers Liability Act (“FELA”), the defendant requested production of the plaintiff’s cellular telephone records from November 1, 2016 (the day before the claimed injury incident that is the basis of Plaintiff’s lawsuit) to present. After the plaintiff refused to produce any records in response to the defendant’s request, a discovery dispute conference was held in October 2018, with the Court finding that the plaintiff’s communications with coworkers or others from the defendant and telephone records evidencing the same were relevant and discoverable, and ordering the parties to further confer regarding production of these items.

The plaintiff then issued a subpoena to his cellular telephone provider and received a listing of incoming and outgoing telephone calls and text messages, but not the substance of any communications. Nonetheless, the plaintiff refused to produce to the defendant the telephone records produced to him in response to his subpoena.

In the final pretrial conference, the defendant argued that the records were discoverable because whether and how often plaintiff has communicated with BNSF coworkers or management since his alleged injury could bear on credibility, that identifying the fact that a communication occurred between the plaintiff and his attorney was not privileged (or, alternatively, that it was not unduly burdensome to redact those references), and that no privacy interest was implicated in the telephone records because the records do not contain the substance of any communications.  The plaintiff argued that the defendant’s request was “overbroad on its face and therefore not reasonably calculated to lead to the discovery of relevant information” and also contended that the discovery sought by the defendant was “unreasonably cumulative or duplicative and could have been obtained from other sources that is more convenient, less burdensome, or less expensive.”

Judge’s Ruling

Considering the respective arguments, Judge Bazis ruled as follows:

  1. “Plaintiff’s telephone records from November 1, 2016 to present and any other records received by Plaintiff in response to his subpoena to his cellular telephone provider are discoverable pursuant to Fed. R. Civ. P. 26. BNSF is entitled to discover whether and how often Plaintiff has communicated with coworkers or BNSF management since his alleged injury.
  2. The fact that Plaintiff’s counsel’s telephone number may appear in the records does not render them subject to a privilege claim. Plaintiff may redact references to communications between Plaintiff and Plaintiff’s counsel, which the Court finds is not overly burdensome.
  3. Privacy considerations of Plaintiff or third parties not involved in this litigation are minimal to non-existent since the at-issue records do not contain the substance of communications.”

As a result, Judge Bazis ordered (in all caps, no less) the plaintiff “to produce to BNSF all records received in response to Plaintiff’s subpoena to his cellular telephone carrier” within one week of the order, noting that he could “redact references to communications between Plaintiff’s counsel and Plaintiff (but is not required to do so to maintain privilege claims regarding the substance of the communications).”
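As a practical matter, redacting counsel’s number from a call log like this is a simple scripting exercise, which is presumably part of why the Court found it “not overly burdensome.”  Here’s a hypothetical sketch (the column names and the phone number below are made up for illustration):

# Hypothetical sketch of redacting counsel's number from a call-log CSV.
# Column names and the phone number below are made up for illustration.
import csv

COUNSEL_NUMBER = "402-555-0123"   # hypothetical

def redact_call_log(in_path: str, out_path: str) -> None:
    with open(in_path, newline="") as src, open(out_path, "w", newline="") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
        writer.writeheader()
        for row in reader:
            # Redact the other party's number when it matches counsel's number;
            # the log contains no substance of communications, only metadata.
            if row["other_party_number"] == COUNSEL_NUMBER:
                row["other_party_number"] = "REDACTED (counsel)"
            writer.writerow(row)

# redact_call_log("phone_records.csv", "phone_records_redacted.csv")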

So, what do you think?  Was that appropriate or was the defendant’s request overbroad?  Please share any comments you might have or if you’d like to know more about a particular topic.

Case opinion link courtesy of eDiscovery Assistant.
