Analysis

Alon Israely, Esq., CISSP of BIA: eDiscovery Trends

This is the third of the 2015 LegalTech New York (LTNY) Thought Leader Interview series. eDiscovery Daily interviewed several thought leaders at LTNY this year and generally asked each of them the following questions:

  1. What are your general observations about LTNY this year and how it fits into emerging trends? Do you think American Lawyer Media (ALM) should consider moving LTNY to a different time of year to minimize travel disruptions due to weather?
  2. After our discussion last year regarding the new amendments to discovery provisions of the Federal Rules of Civil Procedure, additional changes were made to Rule 37(e). Do you see those changes as being positive and do you see the new amendments passing through Congress this year?
  3. Last year, most thought leaders agreed that, despite numerous resources in the industry, most attorneys still don’t know a lot about eDiscovery. Do you think anything has been done in the past year to improve the situation?
  4. What are you working on that you’d like our readers to know about?

Today’s thought leader is Alon Israely. Alon is the Manager of Strategic Partnerships at Business Intelligence Associates, Inc. (BIA) and currently leads the Strategic Partner Program at BIA. Alon has over eighteen years of experience in a variety of advanced computing-related technologies and has consulted with law firms and corporations on a variety of technology issues, including expert witness services related to computer forensics, digital evidence management and data security. Alon is an attorney and a Certified Information Systems Security Professional (CISSP).

What are your general observations about LTNY this year and how it fits into emerging trends? Do you think American Lawyer Media (ALM) should consider moving LTNY to a different time of year to minimize travel disruptions due to weather?

I didn’t get to spend as much time on the floor and in the sessions as I would like because, for me, LTNY has become mostly meetings. On the one hand, that doesn’t help me answer your question as completely as I could but, on the other hand, it’s good for ALM because it shows that there’s business being conducted. A big difference between this year and last year (which may be reflective of our activity at BIA, but others have said it as well) is that there have been more substantive discussions and deal-making than in the past. And, I think that’s what you ultimately want from an industry conference.

Also, and I’m not sure if this is because of attrition or consolidation within the industry, but there seems to be more differentiation among the exhibitors at this year’s show. It used to be that I would walk around LegalTech with outside investors, often people from outside the industry, and they would comment that “it seems like everybody does the same thing”. Now, I think you’re starting to see real differentiation, not just the perception of differentiation, with exhibitors truly offering solutions in niche and specialized areas.

As for whether ALM should consider moving the show, absolutely! For the last few years, that has been one of the conversation topics among many vendors as they set up before LegalTech, asking “why is this happening again” with the snow and what-not. We’ve certainly had some logistics problems the past couple of years.

I do think there is something nice about having the show early in the year with people having just returned from the holidays, getting back into business near the beginning of Q1. It is a good time as we’re not yet too distracted with other business, but I think that it would probably be smart for ALM to explore moving LTNY to maybe the beginning of spring. Even a one-month move to the beginning of March could help. I would definitely keep the show in New York and not move the location; although, I would think that they could consider different venues besides the Hilton without affecting attendance. While some exhibitors might say keep it at this time of year to coordinate with their release schedules, I would say that’s a legacy software answer. Being in the SaaS world, we have updates every few weeks, or sooner, so I think with the new Silicon Valley approach to building software, it shouldn’t be as big a deal to match a self-created release schedule. Marketing creates that schedule more than anything else.

After our discussion last year regarding the new amendments to discovery provisions of the Federal Rules of Civil Procedure, additional changes were made to Rule 37(e). Do you see those changes as being positive and do you see the new amendments passing through Congress this year?

I think that they’re going to pass Congress. I’ve been focusing on the changes related to preservation as it seems that most noteworthy cases, especially those involving Judge (Shira) Scheindlin, involve a preservation mistake somewhere. For us at BIA, we feel the Rules changes are quite a validation of what we’re doing with respect to requiring counsel to meet early to discuss discovery issues, and to force the issue of preservation to the forefront. Up until these changes, only savvy and progressive counsel were focused on how legal hold and preservation were being handled and making sure, for example, that there wasn’t some question eight months down the road about some particular batch of emails. The fact that it is now codified and part of the pre-trial “checklist” is very important in creating efficiencies in discovery in general and it’s great for BIA, frankly, because we build preservation software. It validates needing an automated system in your organization which will help you comply.

Last year, most thought leaders agreed that, despite numerous resources in the industry, most attorneys still don’t know a lot about eDiscovery. Do you think anything has been done in the past year to improve the situation?

I hate to sound pessimistic, and obviously I’m generalizing from my experience, but it feels like attorneys are less interested in learning about eDiscovery and more interested in being able to rely on some sort of solution, whether that solution is software or a service provider, to solve their problems. It’s a little bit of a new “stick your head in the sand” attitude. Before, they ignored it; now, they just want to “find the right wrench”. It’s not always just one wrench and it’s not that easy. It is important to be able to say “we use this software and that software and this vendor and here’s our process” and rely on that, but the second step is to understand why you are relying on that software and that vendor. I think some lawyers will just say “great, I’ll buy this software or hire this vendor and I’m done” and check that check box that they now have complied with eDiscovery but it’s important to do both – to purchase the right software or hire the right vendor AND to understand why that was done.

Certainly, vendors may be part of the problem – depending upon how they educate. At BIA, we promote TotalDiscovery as a way of not having to worry about your preservation issues, not having data “fall through the cracks” and that you’ll have defensible processes. We do that but, at the same time, we also try to educate our clients too. We don’t just say “use the software and you’re good to go”, we try to make sure that they understand why the software benefits them. That’s a better way to sell and attorneys feel better about their decision to purchase software when they fully understand why it benefits them.

What are you working on that you’d like our readers to know about?

As I already mentioned, BIA has TotalDiscovery, our SaaS-based preservation software, and we are about to release what we call “real-time processing”, which effectively allows you to go from defensible data collections to searching that collected data in minutes. So, you can perform a remote collection and, within a few minutes of performing that collection, already start to perform eDiscovery caliber searches on that data. We call it the “time machine”. In the past, you would send someone out to collect data, they would bring it back and put it into processing software, then they would take the processed data and they’d search it and provide the results to the attorneys and it would be a three or four week process.

Instead, our remote collection tool lets you collect “on the fly” from anywhere in the world without the logistics of IT, third-party experts and specialized equipment, and this will add the next step to that, which is, after collecting the data in a forensically sound manner, almost immediately TotalDiscovery will allow you to start searching it. This is not a local tool – we’re not dropping agents onto someone’s machine to index the entire laptop. We’re collecting the data and using the power of the cloud and new technology to validate and index that data at super high speeds so that users (corporate legal departments and law firms) can quickly perform searches, view the documents and the hit highlights, as well as tag and export documents and data as needed. It changes the way that the corporate user handles ECA (early case assessment). They get defensible collection and true eDiscovery processing in one automated workflow. We announced that new release here at LegalTech, we’ll be releasing it in the next few weeks and we’re very excited about it.

Thanks, Alon, for participating in the interview!

And to the readers, as always, please share any comments you might have or if you’d like to know more about a particular topic!

Disclaimer: The views represented herein are exclusively the views of the author, and do not necessarily represent the views held by CloudNine. eDiscovery Daily is made available by CloudNine solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscoveryDaily should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.

EDRM Publishes Updated Statistical Sampling Guide with Public Comments: eDiscovery Trends

In 2012, we covered EDRM’s initial announcement of a new guide called Statistical Sampling Applied to Electronic Discovery and we covered the release of the updated guide (Release 2) back in December. That version of the guide has now been updated with feedback from the comment period.

The public comment period for EDRM’s Statistical Sampling Applied to Electronic Discovery, Release 2, published on the EDRM website here, concluded on January 9, 2015 and EDRM has announced the release of the updated guide today.

The guide ranges from introductory material explaining basic statistical terms (such as sample size, margin of error and confidence level) to more advanced concepts such as the binomial distribution and hypergeometric distribution. Bring your brain.
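To give a flavor of the kind of arithmetic the guide covers, here is a minimal Python sketch of the standard normal-approximation formula for the sample size needed to estimate a proportion (e.g., a responsiveness rate). This is a generic illustration only, not code taken from the guide or its Excel workbook:

```python
import math

def required_sample_size(confidence=0.95, margin_of_error=0.02, p=0.5):
    """Sample size for estimating a proportion, using the normal
    approximation to the binomial distribution (large populations)."""
    # two-tailed z-scores for common confidence levels
    z = {0.90: 1.645, 0.95: 1.960, 0.99: 2.576}[confidence]
    # p = 0.5 is the worst case (maximizes p * (1 - p))
    return math.ceil((z ** 2) * p * (1 - p) / margin_of_error ** 2)

print(required_sample_size(0.95, 0.02))  # 2401
```

At a 95% confidence level and a ±2% margin of error, the worst-case answer is 2,401 documents, regardless of how large the underlying collection is; the hypergeometric (small-population) refinements discussed in the guide can reduce that number.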

The guide includes an accompanying Excel spreadsheet, EDRM Statistics Examples 20150123.xlsm, which can be downloaded from the page and implements relevant calculations supporting Sections 7, 8 and 9 of the 10-section guide. The spreadsheet was developed using Microsoft Excel 2013 and is an .xlsm file, meaning that it contains VBA code (macros), so you may have to adjust your security settings in order to view and use them. You’ll also want to read the guide first (especially Sections 7 through 10) as the Excel workbook is a bit cryptic.

Even though the public comment period has ended, comments can still be posted at the bottom of the EDRM Statistical Sampling Release 2 page, or emailed to the group at sampling@edrm.net or you can fill out their comment form here.

As I noted back in December, the old guide, from April of 2012, is still on the EDRM site. You’ll want to make sure you go to the new updated guide, located here.

So, what do you think? Do you perform statistical sampling to verify results within your eDiscovery process? Please share any comments you might have or if you’d like to know more about a particular topic.

The First 7 to 10 Days May Make or Break Your Case: eDiscovery Best Practices

Having worked with a client recently that was looking for some guidance at the outset of their case, it seemed appropriate to revisit this topic here.

When a case is filed, several activities must be completed within a short period of time (often as soon as the first seven to ten days after filing) to enable you to assess the scope of the case, where the key electronically stored information (ESI) is located and whether to proceed with the case or attempt to settle with opposing counsel. Here are several of the key early activities that can assist in deciding whether to litigate or settle the case.

Activities:

  • Create List of Key Employees Most Likely to have Documents Relevant to the Litigation: To estimate the scope of the case, it’s important to begin to prepare the list of key employees that may have potentially responsive data. Information such as name, title, eMail address, phone number, office location and where information for each is stored on the network is important to be able to proceed quickly when issuing hold notices and collecting their data. Some of these employees may no longer be with your organization, so you may have to determine whether their data is still available and where.
  • Issue Litigation Hold Notice and Track Results: The duty to preserve begins when you anticipate litigation; however, if litigation could not be anticipated prior to the filing of the case, it is certainly clear once the case is filed that the duty to preserve has begun. Hold notices must be issued ASAP to all parties that may have potentially responsive data. Once the hold is issued, you need to track and follow up to ensure compliance. Here are a couple of posts from 2012 regarding issuing hold notices and tracking responses.
  • Interview Key Employees: As quickly as possible, interview key employees to identify potential locations of responsive data in their possession as well as other individuals they can identify that may also have responsive data so that those individuals can receive the hold notice and be interviewed.
  • Interview Key Department Representatives: Certain departments, such as IT, Records or Human Resources, may have specific data responsive to the case. They may also have certain processes in place for regular destruction of “expired” data, so it’s important to interview them to identify potentially responsive sources of data and stop routine destruction of data subject to litigation hold.
  • Inventory Sources and Volume of Potentially Relevant Documents: Potentially responsive data can be located in a variety of sources, including: shared servers, eMail servers, employee workstations, employee home computers, employee mobile devices, portable storage media (including CDs, DVDs and portable hard drives), active paper files, archived paper files and third-party sources (consultants and contractors, including cloud storage providers). Hopefully, the organization already has created a data map before litigation to identify the location of sources of information to facilitate that process. It’s important to get a high level sense of the total population to begin to estimate the effort required for discovery.
  • Plan Data Collection Methodology: Determining how each source of data is to be collected also affects the cost of the litigation. Are you using internal resources, outside counsel or a litigation support vendor? Will the data be collected via an automated collection system or manually? Will employees “self-collect” any of their own data? If so, important data may be missed. Answers to these questions will impact the scope and cost of not only the collection effort, but the entire discovery effort.

These activities can result in creating a data map of potentially responsive information and a “probable cost of discovery” spreadsheet (based on initial estimated scope compared to past cases at the same stage) that will help in determining whether to proceed to litigate the case or attempt to settle with the other side.

So, what do you think? How quickly do you decide whether to litigate or settle? Please share any comments you might have or if you’d like to know more about a particular topic.

DESI Wants Your Input! – eDiscovery Trends

It’s not Desi Arnaz who wants it, but the Discovery of Electronically Stored Information (DESI) VI workshop, which is being held at the University of San Diego on June 8 as part of the 15th International Conference on Artificial Intelligence & Law (ICAIL 2015).

The DESI VI workshop aims to bring together researchers and practitioners to explore innovation and the development of best practices for application of search, classification, language processing, data management, visualization, and related techniques to institutional and organizational records in eDiscovery, information governance, public records access, and other legal settings. Ideally, the aim of the DESI workshop series has been to foster a continuing dialogue leading to the adoption of further best practice guidelines or standards in using machine learning, most notably in the eDiscovery space. Organizing committee members include Jason R. Baron of Drinker Biddle & Reath LLP and Douglas W. Oard of the University of Maryland.

Previous DESI workshops were held in places like Palo Alto, London, Barcelona, Rome and Pittsburgh (maybe not as exciting as the other locales, but they don’t have six Super Bowl championships 🙂 ).

DESI VI invites “refereed” papers (due by April 10 and limited to 4-10 pages) describing research or practice. After peer review, accepted papers will be posted on the DESI VI website and distributed to workshop participants. Authors of accepted refereed papers will be invited to present their work either as an oral or a poster presentation. They also invite “unrefereed” position papers (due by May 1 and typically 2-3 pages) describing individual interests for inclusion (without review) on the DESI VI Web site and distribution to workshop participants. Submissions should be sent by email to Doug Oard (oard@umd.edu) with the subject line DESI VI POSITION PAPER or DESI VI RESEARCH PAPER. All submissions received will be acknowledged within 3 days.

Participation in the DESI VI workshop is open. Submission of papers is encouraged, but not required.

For more information about the workshop, click the Call for Submissions here (or here for the PDF version). The Call for Submissions also includes a References section which includes papers and cases useful as background reading for the focus of the workshop – even if you don’t plan to go, it’s a good list to check out. I’m happy to say that most of the cases on the list have been covered by this blog (including Da Silva Moore, EORHB v. HOA Holdings, Global Aerospace Inc., et al. v. Landow Aviation, L.P. and others).

So, what do you think? Are you going to attend? Submit a paper? Please share any comments you might have or if you’d like to know more about a particular topic.

Three “C”s, Cowboys, Cannibals and Craig (Ball) – eDiscovery Best Practices

They say that a joke is only old if you haven’t heard it before. In that vein, an article about eDiscovery is only old if you haven’t read it before. Craig Ball is currently revisiting some topics that he covered ten years ago with an updated look, making them appropriate for 1) people who weren’t working in eDiscovery ten years ago (which is probably a lot of you), 2) people who haven’t read the articles previously and 3) people who have read the articles previously, but haven’t seen his updated takes.  In other words, everybody.

So far, Craig has published three revisited articles to his terrific Ball in your court blog. They are:

Starting Over, which sets the stage for the series, and covers The DNA of Data, which was the very first Ball in your court (when it was still in print form). This article discusses how electronic evidence isn’t going away, addresses claims of inaccessible data, and notes how technological advances have rendered claims of inaccessibility mostly moot.

Unclear on the Concept (originally published in Law Technology News in May of 2005), which discusses some of the challenges of early concept searching and related tools (when terms like “predictive coding” and “technology assisted review” hadn’t even entered our lexicon yet). Craig also pokes fun at himself for noting back then how he read Alexander Solzhenitsyn and Joyce Carol Oates in grade school. 🙂

Cowboys and Cannibals (originally published in Law Technology News in June of 2005), which discusses the need for a new email “sheriff” in town (not to be confused with U.S. Magistrate Judge John Facciola in this case) to classify emails for easier retrieval. Back then, we didn’t know just how big the challenge of Information Governance would become. His updated take concludes as follows:

“What optimism exists springs from the hope that we will move from the Wild West to Westworld, that Michael Crichton-conceived utopia where robots are gunslingers. The technology behind predictive coding will one day be baked into our IT apps, and much as it serves to protect us from spam today, it will organize our ESI in the future.”

That day is coming, hopefully sooner rather than later. And, you have to love a blog post that references Westworld, which was a terrific story and movie back in the 70s (wonder why nobody has remade that one yet?).

eDiscovery Daily has revisited topics several times as well, especially some of the topics we covered in the early days of the blog, when we didn’t have nearly as many followers yet. It’s new if you haven’t read it, right? I look forward to future posts in Craig’s series.

So, what do you think? How long have you been reading articles about eDiscovery? Please share any comments you might have or if you’d like to know more about a particular topic.

Image © Metro Goldwyn Mayer

2014 eDiscovery Case Law Year in Review, Part 3

As we noted yesterday and the day before, eDiscoveryDaily published 93 posts related to eDiscovery case decisions and activities over the past year, covering 68 unique cases! Yesterday, we looked back at cases related to eDiscovery cost sharing and reimbursement, fee disputes and production format disputes. Today, let’s take a look back at cases related to privilege and inadvertent disclosures, requests for social media, cases involving technology assisted review and the case of the year – the ubiquitous Apple v. Samsung dispute.

We grouped those cases into common subject themes and will review them over the next few posts. Perhaps you missed some of these? Now is your chance to catch up!

PRIVILEGE / INADVERTENT DISCLOSURES

There were a couple of cases related to privilege issues, including one where privilege was upheld when the plaintiff purchased the defendant’s seized computer at auction! Here are two cases where disclosure of privileged documents was addressed:

Privilege Not Waived on Defendant’s Seized Computer that was Purchased by Plaintiff at Auction: In Kyko Global Inc. v. Prithvi Info. Solutions Ltd., Washington Chief District Judge Marsha J. Pechman ruled that the defendants did not waive their attorney-client privilege on the computer of one of the defendants purchased by plaintiffs at public auction, denied the defendants’ motion to disqualify the plaintiff’s counsel for purchasing the computer and ordered the plaintiffs to provide defendants with a copy of the hard drive within three days for the defendants to review it for privilege and provide a privilege log within seven days of the transfer.

Plaintiff Can’t “Pick” and Choose When it Comes to Privilege of Inadvertent Disclosures: In Pick v. City of Remsen, Iowa District Judge Mark W. Bennett upheld the magistrate judge’s order directing the destruction of an inadvertently-produced privileged document, an email from defense counsel to some of the defendants, after affirming the magistrate judge’s analysis of the five-step analysis to determine whether privilege was waived.

SOCIAL MEDIA

Requests for social media data in litigation continue, though there were not as many disputes over it as in years past (at least, not with cases we covered). Here are three cases related to social media data:

Plaintiff Ordered to Produce Facebook Photos and Messages as Discovery in Personal Injury Lawsuit: In Forman v. Henkin, a Motion to Compel was granted in part for a defendant who requested authorization to obtain records of the plaintiff’s private postings to Facebook.

Plaintiff Ordered to Re-Open Social Media Account for Discovery: In Chapman v. Hiland Operating, LLC, while noting that he was “skeptical” that reactivating the plaintiff’s Facebook account would produce any relevant, noncumulative information, North Dakota Magistrate Judge Charles S. Miller ordered the plaintiff to “make a reasonable, good faith attempt” to reactivate her Facebook account.

Order for Financial Records and Facebook Conversations Modified Due to Privacy Rights: In Stallings v. City of Johnston City, Illinois Chief District Judge David R. Herndon modified an earlier order by a magistrate judge in response to the plaintiff’s appeal, claiming that the order violated the privacy rights of the plaintiff, and of minor children with whom the plaintiff had held conversations on Facebook.

TECHNOLOGY ASSISTED REVIEW

Technology assisted review continued to be discussed and debated between parties in 2014, with some disputes involving how technology assisted review would be conducted as opposed to whether it would be conducted at all. Courts continued to endorse technology assisted review and predictive coding, even going so far as to suggest the use of it in one case. Here are six cases involving the use of technology assisted review in 2014:

Court Rules that Unilateral Predictive Coding is Not Progressive: In Progressive Cas. Ins. Co. v. Delaney, Nevada Magistrate Judge Peggy A. Leen determined that the plaintiff’s unannounced shift from the agreed upon discovery methodology to a predictive coding methodology for privilege review was not cooperative. Therefore, the plaintiff was ordered to produce documents that met agreed-upon search terms without conducting a privilege review first.

Court Rules in Dispute Between Parties Regarding ESI Protocol, Suggests Predictive Coding: In a dispute over ESI protocols in FDIC v. Bowden, Georgia Magistrate Judge G. R. Smith approved the ESI protocol from the FDIC and suggested the parties consider the use of predictive coding.

Court Sides with Defendant in Dispute over Predictive Coding that Plaintiff Requested: In the case In re Bridgepoint Educ., Inc., Securities Litigation, California Magistrate Judge Jill L. Burkhardt ruled that expanding the scope of discovery by nine months was unduly burdensome, despite the plaintiff’s request for the defendant to use predictive coding to fulfill its discovery obligation. She also approved the defendants’ method of using search terms to identify responsive documents for the three individual defendants already reviewed, directing the parties to meet and confer regarding the additional search terms the plaintiffs requested.

Though it was “Switching Horses in Midstream”, Court Approves Plaintiff’s Predictive Coding Plan: In Bridgestone Americas Inc. v. Int’l Bus. Mach. Corp., Tennessee Magistrate Judge Joe B. Brown, acknowledging that he was “allowing Plaintiff to switch horses in midstream”, nonetheless ruled that the plaintiff could use predictive coding to search documents for discovery, even though keyword search had already been performed.

Court Approves Use of Predictive Coding, Disagrees that it is an “Unproven Technology”: In Dynamo Holdings v. Commissioner of Internal Revenue, Texas Tax Court Judge Ronald Buch ruled that the petitioners “may use predictive coding in responding to respondent’s discovery request” and if “after reviewing the results, respondent believes that the response to the discovery request is incomplete, he may file a motion to compel at that time”.

Court Opts for Defendant’s Plan of Review including TAR and Manual Review over Plaintiff’s TAR Only Approach: In Good v. American Water Works, West Virginia District Judge John T. Copenhaver, Jr. granted the defendants’ motion for a Rule 502(d) order that merely encouraged the incorporation and employment of time-saving computer-assisted privilege review over the plaintiffs’ proposal disallowing linear privilege review altogether.

APPLE V. SAMSUNG

Every now and then, there is a case that just has to be covered. Whether it be for the eDiscovery related issues (e.g., adverse inference sanction, inadvertent disclosures, eDiscovery cost reimbursement) or the fact that billions of dollars were at stake or the fact that the case earned its own “gate” moniker, the Apple v. Samsung case demanded attention. Here are the six posts (just from 2014, we have more in previous years) about this case:

Quinn Emanuel Sanctioned for Inadvertent Disclosure, Samsung Escapes Sanction: California Magistrate Judge Paul S. Grewal has now handed down an order on motions for sanctions against Samsung and the Quinn Emanuel law firm in the never-ending Apple v. Samsung litigation for the inadvertent disclosure of confidential agreements that Apple had with Nokia, Ericsson, Sharp and Philips – now widely referred to as “patentgate”.

Apple Can’t Mention Inadvertent Disclosure in Samsung Case: Back in January, Quinn Emanuel Urquhart & Sullivan LLP was sanctioned for their inadvertent disclosure in the Apple vs Samsung litigation (commonly referred to as “patentgate”). California Magistrate Judge Paul S. Grewal handed down an order on motions for sanctions against Quinn Emanuel (in essence) requiring the firm to “reimburse Apple, Nokia, and their counsel for any and all costs and fees incurred in litigating this motion and the discovery associated with it”. Many felt that Samsung and Quinn Emanuel got off lightly. Now, Apple can’t even mention the inadvertent disclosure in the upcoming Samsung trial.

Apple Wins Another $119.6 Million from Samsung, But It’s Only 6% of What They Requested: Those of you who have been waiting for significant news to report from the Apple v. Samsung litigation, your wait is over! As reported last week in The Recorder, a California Federal jury ordered Samsung on Friday to pay Apple $119.6 million for infringing three of Apple’s iPhone patents. However, the award was a fraction of the nearly $2.2 billion Apple was requesting.

Samsung and Quinn Emanuel Ordered to Pay Over $2 Million for “Patentgate” Disclosure: Remember the “patentgate” disclosure last year (by Samsung and their outside counsel firm of Quinn Emanuel Urquhart & Sullivan LLP) of confidential agreements that Apple had with Nokia? Did you think they were going to avoid having to pay for that disclosure? The answer is no.

Court Refuses to Ban Samsung from Selling Products Found to Have Infringed on Apple Products: Apple may have won several battles with Samsung, including ultimately being awarded over $1 billion in verdicts, as well as a $2 million sanction for the inadvertent disclosure by Samsung’s outside counsel firm (Quinn Emanuel Urquhart & Sullivan LLP), commonly known as “patentgate”. But Samsung may have won the war with the court’s refusal to ban Samsung from selling products that were found to have infringed on Apple products.

Apple Recovers Part, But Not All, of its Requested eDiscovery Costs from Samsung: Apple won several battles with Samsung, including ultimately being awarded over $1 billion in verdicts, as well as a $2 million sanction for the inadvertent disclosure by Samsung’s outside counsel firm (Quinn Emanuel Urquhart & Sullivan LLP), commonly known as “patentgate”, but ultimately may have lost the war when the court refused to ban Samsung from selling products that were found to have infringed on Apple products. Now, they’re fighting over relative chicken feed: the few million dollars that Apple sought to recover in eDiscovery costs.

Tomorrow, we will cover cases related to the most common theme of the year (three guesses and the first two don’t count). Stay tuned!

So, what do you think? Did you miss any of these? Please share any comments you might have or if you’d like to know more about a particular topic.

Disclaimer: The views represented herein are exclusively the views of the author, and do not necessarily represent the views held by CloudNine. eDiscoveryDaily is made available by CloudNine solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscoveryDaily should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.

EDRM Updates Statistical Sampling Applied to Electronic Discovery Guide – eDiscovery Trends

Over two years ago, we covered EDRM’s initial announcement of a new guide called Statistical Sampling Applied to Electronic Discovery.  Now, they have announced an updated version of the guide.

EDRM’s Statistical Sampling Applied to Electronic Discovery, Release 2, announced last week and published on the EDRM website, is open for public comment until January 9, 2015, after which any input received will be reviewed and considered for incorporation before the updated materials are finalized.

As EDRM notes in their announcement, “The updated materials provide guidance regarding the use of statistical sampling in e-discovery. Much of the information is definitional and conceptual and intended for a broad audience. Other materials (including an accompanying spreadsheet) provide additional information, particularly technical information, for e-discovery practitioners who are responsible for developing further expertise in this area.”

The expanded Guide is comprised of ten sections (most of which have several sub-sections), as follows:

  1. Introduction
  2. Estimating Proportions within a Binary Population
  3. Acceptance Sampling
  4. Sampling in the Context of the Information Retrieval Grid – Recall, Precision and Elusion
  5. Seed Set Selection in Machine Learning
  6. Guidelines and Considerations
  7. Additional Guidance on Statistical Theory
  8. Calculating Confidence Levels, Confidence Intervals and Sample Sizes
  9. Acceptance Sampling
  10. Examples in the Accompanying Excel Spreadsheet
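
Section 4’s retrieval metrics (recall, precision and elusion) boil down to simple ratios over the familiar true/false positive/negative counts. As a rough illustration, here is a minimal Python sketch (this is generic math, not code from the guide, and the document counts below are made-up example numbers):

```python
# Illustrative recall/precision/elusion calculation for a hypothetical review.
# Counts are invented for the example; they do not come from the EDRM guide.

def review_metrics(true_pos, false_pos, false_neg, true_neg):
    """Return (recall, precision, elusion) for a binary retrieval result."""
    recall = true_pos / (true_pos + false_neg)     # share of responsive docs found
    precision = true_pos / (true_pos + false_pos)  # share of retrieved docs that are responsive
    elusion = false_neg / (false_neg + true_neg)   # responsive rate in the discard pile
    return recall, precision, elusion

recall, precision, elusion = review_metrics(
    true_pos=800, false_pos=200, false_neg=100, true_neg=8900)
print(f"recall={recall:.2f} precision={precision:.2f} elusion={elusion:.3f}")
```

In practice the elusion rate is estimated by sampling the discard pile rather than counting it exhaustively, which is where the guide’s sampling machinery comes in.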

The guide ranges from introductory explanations of basic statistical terms (such as sample size, margin of error and confidence level) to more advanced concepts such as the binomial and hypergeometric distributions.  Bring your brain.
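
To illustrate how those basic terms interact, here is a minimal Python sketch of the standard normal-approximation sample-size formula for estimating a proportion (a generic statistics formula, not code taken from the guide or its spreadsheet):

```python
import math

# z-scores for common two-sided confidence levels (normal approximation)
Z = {0.90: 1.645, 0.95: 1.960, 0.99: 2.576}

def sample_size(confidence, margin_of_error, p=0.5):
    """Sample size needed to estimate a proportion p to within the given
    margin of error; p=0.5 is the conservative (worst-case) choice."""
    z = Z[confidence]
    return math.ceil(z**2 * p * (1 - p) / margin_of_error**2)

# 95% confidence with a +/-2% margin of error
print(sample_size(0.95, 0.02))  # → 2401
```

Note how tightening the margin of error is expensive: the sample size grows with the inverse square of the margin, so halving the margin quadruples the sample.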

As section 10 indicates, there is also an accompanying Excel spreadsheet, EDRM Statistics Examples 20141023.xlsm, which can be downloaded from the page and which implements the relevant calculations supporting Sections 7, 8 and 9. The spreadsheet was developed using Microsoft Excel 2013 and is an .xlsm file, meaning that it contains VBA code (macros), so you may have to adjust your security settings in order to view and use them.  You’ll also want to read the guide first (especially sections 7 through 10), as the Excel workbook is a bit cryptic.

Comments can be posted at the bottom of the EDRM Statistical Sampling page, emailed to the group at mail@edrm.net, or submitted via their comment form here.

One thing that I noticed is that the old guide, from April of 2012, is still on the EDRM site.  It might be a good idea to archive that page to avoid confusion with the new guide.

So, what do you think?  Do you perform statistical sampling to verify results within your eDiscovery process?  Please share any comments you might have or if you’d like to know more about a particular topic.


Court Opts for Defendant’s Plan of Review including TAR and Manual Review over Plaintiff’s TAR Only Approach – eDiscovery Case Law

 

In Good v. American Water Works, 2:14-01374 (S.D. W. Vir. Oct. 29, 2014), West Virginia District Judge John T. Copenhaver, Jr. granted the defendants' motion for a Rule 502(d) order that merely encouraged the incorporation and employment of time-saving computer-assisted privilege review over the plaintiffs’ proposal disallowing linear privilege review altogether.

Case Background

In this class action litigation involving the Freedom Industries chemical spill, the parties met and conferred, agreeing on all but one discovery issue: privilege review and 502(d) clawbacks.  The defendants proposed that the Rule 502(d) order merely encourage the incorporation and employment of computer-assisted privilege review, while the plaintiffs proposed that the order “limit privilege review to what a computer can accomplish, disallowing linear (aka ‘eyes on’) privilege review altogether”.

The plaintiffs would agree only to a pure quick peek/claw-back arrangement, which would place never-reviewed, never privilege-logged documents in their hands as quickly as physically possible at the expense of any opportunity for care on the part of a producing party to protect a client's privileged and work product protected information.  On the other hand, the defendants did not wish to forego completely the option to manually review documents for privilege and work product protection. 

The plaintiffs argued that if they were to proceed with a manual privilege review, then only 502(b) protection – the inadvertent waiver rule – should apply, and not 502(d) protection, which offers more expansive protection against privilege waivers.

Judge’s Ruling

Judge Copenhaver noted that “[t]he defendants have chosen a course that would allow them the opportunity to conduct some level of human due diligence prior to disclosing vast amounts of information, some portion of which might be privileged. They also appear to desire a more predictable clawback approach without facing the uncertainty inherent in the Rule 502(b) factoring analysis. Nothing in Rule 502 prohibits that course. And the parties need not agree in order for that approach to be adopted”.

Therefore, despite the fact that the plaintiffs were “willing to agree to an order that provides that the privilege or protection will not be waived and that no other harm will come to the Defendants if Plaintiffs are permitted to see privileged or work product protected documents”, Judge Copenhaver ruled that “[i]nasmuch as defendants' cautious approach is not prohibited by the text of Rule 502, and they appear ready to move expeditiously in producing documents in the case, their desired approach is a reasonable one.”  As a result, he entered their proposed Rule 502(d) order, “with the expectation that the defendants will marshal the resources necessary to assure that the delay occasioned by manual review of portions of designated categories will uniformly be minimized so that disclosure of the entirety of even the most sensitive categories is accomplished quickly.”

So, what do you think?  Should the defendants have retained the right to manual review or should the plaintiffs’ proposed approach have been adopted?  Please share any comments you might have or if you’d like to know more about a particular topic.


How Mature is Your Organization in Handling eDiscovery? – eDiscovery Best Practices

A new self-assessment resource from EDRM helps you answer that question.

A few days ago, EDRM announced the release of the EDRM eDiscovery Maturity Self-Assessment Test (eMSAT-1), the “first self-assessment resource to help organizations measure their eDiscovery maturity” (according to their press release linked here).

As stated in the press release, eMSAT-1 is a downloadable Excel workbook containing 25 worksheets (actually 27 worksheets when you count the Summary sheet and the List sheet of valid choices at the end) organized into seven sections covering various aspects of the e-discovery process. Complete the worksheets and the assessment results are displayed in summary form at the beginning of the spreadsheet.  eMSAT-1 is the first of several resources and tools being developed by the EDRM Metrics group, led by Clark and Dera Nevin, with assistance from a diverse collection of industry professionals, as part of an ambitious Maturity Model project.

The seven sections covered by the workbook are:

  1. General Information Governance: Contains ten questions to answer regarding your organization’s handling of information governance.
  2. Data Identification, Preservation & Collection: Contains five questions to answer regarding your organization’s handling of these “left side” phases.
  3. Data Processing & Hosting: Contains three questions to answer regarding your organization’s handling of processing, early data assessment and hosting.
  4. Data Review & Analysis: Contains two questions to answer regarding your organization’s handling of search and review.
  5. Data Production: Contains two questions to answer regarding your organization’s handling of production and protecting privileged information.
  6. Personnel & Support: Contains two questions to answer regarding your organization’s hiring, training and procurement processes.
  7. Project Conclusion: Contains one question to answer regarding your organization’s processes for managing data once a matter has concluded.

Each question is a separate sheet, with five answers ranked from 1 to 5 to reflect your organization’s maturity in that area (with descriptions to associate with each level of maturity); each question defaults to a value of 1.  The five answers are:

  • 1: No Process, Reactive
  • 2: Fragmented Process
  • 3: Standardized Process, Not Enforced
  • 4: Standardized Process, Enforced
  • 5: Actively Managed Process, Proactive

Once you answer all the questions, the Summary sheet shows your overall average, as well as your average for each section.  It’s an easy workbook to use, with input areas defined by cells in yellow.  The whole workbook is editable, so perhaps the next edition could lock down the calculated cells.  Nonetheless, the workbook is intuitive and provides a nice exercise for an organization to grade its level of eDiscovery maturity.
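
The arithmetic behind the Summary sheet is simple averaging. Here is a hypothetical Python sketch of that scoring; the section names mirror the workbook, but the answer values, and the choice to average per question rather than per section, are assumptions for illustration:

```python
# Hypothetical eMSAT-style scoring: each answer is a maturity level from 1 to 5.
# Section names mirror the workbook; the answer values below are invented.
answers = {
    "General Information Governance": [3, 2, 4, 3, 3, 2, 3, 4, 2, 3],
    "Data Identification, Preservation & Collection": [4, 3, 3, 2, 4],
    "Data Processing & Hosting": [3, 3, 2],
}

def section_averages(answers):
    """Average maturity level for each section."""
    return {name: sum(vals) / len(vals) for name, vals in answers.items()}

def overall_average(answers):
    """Average maturity level across all questions."""
    all_vals = [v for vals in answers.values() for v in vals]
    return sum(all_vals) / len(all_vals)

for name, avg in section_averages(answers).items():
    print(f"{name}: {avg:.2f}")
print(f"Overall: {overall_average(answers):.2f}")
```

Whether the workbook weights the overall score per question or per section is a detail of the spreadsheet itself; the sketch above averages per question.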

You can download a copy of the eMSAT-1 Excel workbook from here, as well as get more information on how to use it (the page also describes how to provide feedback to make the next iterations even better).

The EDRM Maturity Model Self-Assessment Test is the fourth release in recent months by the EDRM Metrics team. In June 2013, the new Metrics Model was released; in November 2013, a supporting glossary of terms for the Metrics Model was published; and in November 2013, the EDRM Budget Calculators project kicked off (with four calculators covered by us here, here, here and here).  They’ve been busy.

So, what do you think?  How mature is your organization in handling eDiscovery?  Please share any comments you might have or if you’d like to know more about a particular topic.


Court Approves Use of Predictive Coding, Disagrees that it is an “Unproven Technology” – eDiscovery Case Law

 

In Dynamo Holdings v. Commissioner of Internal Revenue, Docket Nos. 2685-11, 8393-12 (U.S. Tax Ct. Sept 17, 2014), Texas Tax Court Judge Ronald Buch ruled that the petitioners “may use predictive coding in responding to respondent's discovery request” and if “after reviewing the results, respondent believes that the response to the discovery request is incomplete, he may file a motion to compel at that time”.

The cases involved various transfers from one entity to a related entity where the respondent determined that the transfers were disguised gifts to the petitioner's owners and the petitioners asserted that the transfers were loans.

The respondent requested that the petitioners produce the electronically stored information (ESI) contained on two specified backup storage tapes or simply produce the tapes themselves. The petitioners asserted that it would "take many months and cost at least $450,000 to do so", requesting that the Court deny the respondent's motion as a "fishing expedition" in search of new issues that could be raised in these or other cases. Alternatively, the petitioners requested that the Court let them use predictive coding to efficiently and economically identify the non-privileged information responsive to respondent's discovery request.  The respondent opposed the petitioners' request to use predictive coding, calling it "unproven technology", and added that petitioners could simply give him access to all data on the two tapes and preserve the right (through a "clawback agreement") to later claim that some or all of the data is privileged.

Judge Buch called the request to use predictive coding “somewhat unusual” and stated that “although it is a proper role of the Court to supervise the discovery process and intervene when it is abused by the parties, the Court is not normally in the business of dictating to parties the process that they should use when responding to discovery… Yet that is, in essence, what the parties are asking the Court to consider – whether document review should be done by humans or with the assistance of computers. Respondent fears an incomplete response to his discovery. If respondent believes that the ultimate discovery response is incomplete and can support that belief, he can file another motion to compel at that time.”

With regard to the respondent’s categorization of predictive coding as “unproven technology”, Judge Buch stated “We disagree. Although predictive coding is a relatively new technique, and a technique that has yet to be sanctioned (let alone mentioned) by this Court in a published Opinion, the understanding of e-discovery and electronic media has advanced significantly in the last few years, thus making predictive coding more acceptable in the technology industry than it may have previously been. In fact, we understand that the technology industry now considers predictive coding to be widely accepted for limiting e-discovery to relevant documents and effecting discovery of ESI without an undue burden.”

As a result, Judge Buch ruled that “[p]etitioners may use predictive coding in responding to respondent's discovery request. If, after reviewing the results, respondent believes that the response to the discovery request is incomplete, he may file a motion to compel at that time.”

So, what do you think?  Should predictive coding have been allowed in this case?  Please share any comments you might have or if you’d like to know more about a particular topic.
