eDiscovery Trends: eDiscovery Work is Growing in Law Firms and Corporations

 

There was an article in Law Technology News last Friday (Survey Shows Surge in E-Discovery Work at Law Firms and Corporations, written by Monica Bay) that discussed the findings of a survey released by The Cowen Group, indicating that eDiscovery work in law firms and corporations is growing considerably.  Eighty-eight law firm and corporate law department professionals responded to the survey.

Some of the key findings:

  • 70 percent of law firm respondents reported an increase in workload for their litigation support and eDiscovery departments (compared to 42 percent in the second quarter of 2009);
  • 77 percent of corporate law department respondents reported an increase in workload for their litigation support and eDiscovery departments;
  • 60 percent of respondents anticipate increasing their internal capabilities for eDiscovery;
  • 55 percent of corporate and 62 percent of firm respondents said they “anticipate outsourcing a significant amount of eDiscovery to third-party providers” (some organizations expect to both increase internal capabilities and outsource);
  • 50 percent of the firms believe they will increase technology spending in the next three months (compared to 31 percent of firms in 2010);
  • 43 percent of firms plan to add people to their litigation support and eDiscovery staff in the next 3 months, compared to 32 percent in 2011;
  • Noting that “corporate legal departments are under increasing pressure to ‘do more with less in-house to keep external costs down’”, only 12 percent of corporate respondents anticipate increasing headcount and 30 percent will increase their technology spend in the next six months;
  • In the past year, 49 percent of law firms and 23 percent of corporations have used Technology Assisted Review/ Predictive Coding technology through a third party service provider – an additional 38 percent have considered using it;
  • As for TAR/Predictive Coding in-house, 30 percent of firms have an in-house tool, and an additional 35 percent are considering making the investment.

As managing partner David Cowen notes, “Cases such as Da Silva Moore, Kleen, and Global Aerospace, which have hit our collective consciousness in the past three months, affect the investments in technology that both law firms and corporations are making.”  He concludes the Executive Summary of the report with this advice: “Educate yourself on the latest evolving industry trends, invest in relationships, and be an active participant in helping your executives, your department, and your clients ‘do more with less’.”

So, what do you think?  Do any of those numbers and trends surprise you?  Please share any comments you might have or if you’d like to know more about a particular topic.

Disclaimer: The views represented herein are exclusively the views of the author, and do not necessarily represent the views held by CloudNine Discovery. eDiscoveryDaily is made available by CloudNine Discovery solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscoveryDaily should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.

eDiscovery Trends: The Da Silva Moore Case Has Class (Certification, That Is)

 

As noted in an article written by Mark Hamblett in Law Technology News, Judge Andrew Carter of the U.S. District Court for the Southern District of New York has granted conditional class certification in the Da Silva Moore v. Publicis Groupe & MSL Group case.

In this case, women employees of the advertising conglomerate Publicis Groupe and its U.S. subsidiary, MSL, have accused their employer of company-wide discrimination, pregnancy discrimination, and a practice of keeping women at entry-level positions with few opportunities for promotion.

Judge Carter concluded that “Plaintiffs have met their burden by making a modest factual showing to demonstrate that they and potential plaintiffs together were victims of a common policy or plan that violated the law. They submit sufficient information that because of a common pay scale, they were paid wages lower than the wages paid to men for the performance of substantially equal work. The information also reveals that Plaintiffs had similar responsibilities as other professionals with the same title. Defendants may disagree with Plaintiffs' contentions, but the Court cannot hold Plaintiffs to a higher standard simply because it is an EPA action rather an action brought under the FLSA.”

“Courts have conditionally certified classes where the plaintiffs have different job functions,” Judge Carter noted, indicating that “[p]laintiffs have to make a mere showing that they are similarly situated to themselves and the potential opt-in members and Plaintiffs here have accomplished their goal.”

This is just the latest development in this test case for the use of computer-assisted coding to search electronic documents for responsive discovery. On February 24, Magistrate Judge Andrew J. Peck of the U.S. District Court for the Southern District of New York issued an opinion making this likely the first case to accept the use of computer-assisted review of electronically stored information (“ESI”).  However, on March 13, District Court Judge Andrew L. Carter, Jr. granted plaintiffs’ request to submit additional briefing on their February 22 objections to the ruling.  In that briefing (filed on March 26), the plaintiffs claimed that the protocol approved for predictive coding “risks failing to capture a staggering 65% of the relevant documents in this case” and questioned Judge Peck’s relationship with defense counsel and with the selected vendor for the case, Recommind.

Then, on April 5, Judge Peck issued an order in response to Plaintiffs’ letter requesting his recusal, directing plaintiffs to indicate whether they would file a formal motion for recusal or ask the Court to consider the letter as the motion.  On April 13, (Friday the 13th, that is), the plaintiffs did just that, by formally requesting the recusal of Judge Peck (the defendants issued a response in opposition on April 30).  But, on April 25, Judge Carter issued an opinion and order in the case, upholding Judge Peck’s opinion approving computer-assisted review.

Not done yet, the plaintiffs filed an objection on May 9 to Judge Peck's rejection of their request to stay discovery pending the resolution of outstanding motions and objections (including the recusal motion, which had yet to be ruled on).  Then, on May 14, Judge Peck issued a stay, stopping defendant MSLGroup's production of electronically stored information.  Finally, on June 15, Judge Peck, in a 56-page opinion and order, denied the plaintiffs’ motion for recusal.

So, what do you think?  What will happen in this case next?  Please share any comments you might have or if you’d like to know more about a particular topic.


eDiscovery Trends: First Pass Review – Domain Categorization of Your Opponent’s Data

 

Even those of us at eDiscoveryDaily have to take an occasional vacation; however, instead of “going dark” for the week, we thought we would republish a post series from the early days of the blog (when we didn’t have many readers yet).  So, chances are, you haven’t seen these posts yet!  Enjoy!

Yesterday, we talked about the use of First Pass Review (FPR) applications (such as FirstPass®, powered by Venio FPR™) to not only conduct first pass review of your own collection, but also to analyze your opponent’s ESI production.  One way to analyze that data is through “fuzzy” searching to find misspellings or OCR errors in an opponent’s produced ESI.

Domain Categorization

Another type of analysis is the use of domain categorization.  Email is generally the biggest component of most ESI collections and each participant in an email communication belongs to a domain associated with the email server that manages their email.

FirstPass supports domain categorization by providing a list of domains associated with the ESI collection being reviewed, with a count for each domain that appears in emails in the collection.  Domain categorization provides several benefits when reviewing your opponent’s ESI:

  • Non-Responsive Produced ESI: Domains in the list that are obviously non-responsive to the case can be quickly identified and all messages associated with those domains can be “group-tagged” as non-responsive.  If a significant percentage of files are identified as non-responsive, that may be a sign that your opponent is trying to “bury you with paper” (albeit electronic).
  • Inadvertent Disclosures: If there are any emails associated with outside counsel’s domain, they could be inadvertent disclosures of attorney work product or attorney-client privileged communications.  If so, you can then address those according to the agreed-upon process for handling inadvertent disclosures and clawback of same.
  • Issue Identification: Messages associated with certain parties might be related to specific issues (e.g., an alleged design flaw of a specific subcontractor’s product), so domain categorization can isolate those messages more quickly.
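
FirstPass’s internals aren’t public, but the core idea behind domain categorization is straightforward and can be sketched in a few lines. The Python below (with hypothetical message data and domain names) extracts the domain from each email address in a production and tallies how often each appears:

```python
import re
from collections import Counter

def extract_domains(addresses):
    """Pull the domain (the part after '@') from each email address."""
    domains = []
    for addr in addresses:
        match = re.search(r"@([\w.-]+)", addr)
        if match:
            domains.append(match.group(1).lower())
    return domains

def domain_counts(messages):
    """Count how often each domain appears across senders and recipients."""
    counter = Counter()
    for msg in messages:
        counter.update(extract_domains([msg["from"]] + msg["to"]))
    return counter

# Hypothetical produced messages
production = [
    {"from": "alice@opponent.com", "to": ["bob@newsletter.example"]},
    {"from": "carol@outsidecounsel.com", "to": ["alice@opponent.com"]},
]
```

With the full domain list in hand, a reviewer can group-tag everything from an obviously non-responsive domain, or flag anything from outside counsel’s domain for inadvertent-disclosure handling.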

In summary, there are several ways to use first pass review tools, like FirstPass, for reviewing your opponent’s ESI production, including: email analytics, synonym searching, fuzzy searching and domain categorization.  First pass review isn’t just for your own production; it’s also an effective process to quickly evaluate your opponent’s production.

So, what do you think?  Have you used first pass review tools to assess an opponent’s produced ESI?  Please share any comments you might have or if you’d like to know more about a particular topic.


eDiscovery Trends: First Pass Review – Fuzzy Searching Your Opponent’s Data

 

Even those of us at eDiscoveryDaily have to take an occasional vacation; however, instead of “going dark” for the week, we thought we would republish a post series from the early days of the blog (when we didn’t have many readers yet).  So, chances are, you haven’t seen these posts yet!  Enjoy!

Tuesday, we talked about the use of First Pass Review (FPR) applications (such as FirstPass®, powered by Venio FPR™) to not only conduct first pass review of your own collection, but also to analyze your opponent’s ESI production.  One way to analyze that data is through synonym searching to find variations of your search terms to increase the possibility of finding the terminology used by your opponents.

Fuzzy Searching

Another type of analysis is the use of fuzzy searching.  Attorneys know what terms they’re looking for, but those terms may not always be spelled correctly in the collection.  Also, opposing counsel may produce a number of image-only files that require Optical Character Recognition (OCR), which is usually not 100% accurate.

FirstPass supports "fuzzy" searching, a mechanism for finding alternate words that are close in spelling to the word you're looking for (usually one or two characters off).  FirstPass displays all of the words in the collection that are close to your search term, so if you’re looking for the term “petroleum”, you can find variations such as “peroleum”, “petoleum” or even “petroleom” – misspellings or OCR errors that could be relevant.  Then, simply select the variations you wish to include in the search.  Fuzzy searching is the best way to broaden your search to include potential misspellings and OCR errors, and FirstPass provides a terrific capability to select those variations to review additional potential “hits” in your collection.
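
The mechanics behind this kind of matching can be illustrated with a short sketch. The Python below is not FirstPass’s actual algorithm, just a minimal illustration: Levenshtein edit distance (the “one or two characters off” measure) applied against a hypothetical word list drawn from a collection:

```python
def edit_distance(a, b):
    """Levenshtein distance via the classic dynamic-programming recurrence."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            cost = 0 if ca == cb else 1
            curr.append(min(curr[j - 1] + 1,      # insertion
                            prev[j] + 1,          # deletion
                            prev[j - 1] + cost))  # substitution
        prev = curr
    return prev[-1]

def fuzzy_matches(term, vocabulary, max_distance=2):
    """Words in the collection within max_distance edits of the search term."""
    return sorted(w for w in vocabulary if edit_distance(term, w) <= max_distance)

# Hypothetical word list extracted from a produced collection
vocab = {"petroleum", "peroleum", "petoleum", "petroleom", "pipeline"}
```

Searching `vocab` for “petroleum” surfaces the three one-character variants (plus the term itself) while leaving “pipeline” out, which is exactly the candidate list a reviewer would then pick from.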

Tomorrow, I’ll talk about the use of domain categorization to quickly identify potential inadvertent disclosures and weed out non-responsive files produced by your opponent, based on the domain of the communicators.  Hasta la vista, baby! 🙂

In the meantime, what do you think?  Have you used fuzzy searching to find misspellings or OCR errors in an opponent’s produced ESI?  Please share any comments you might have or if you’d like to know more about a particular topic.


eDiscovery Trends: First Pass Review – Synonym Searching Your Opponent’s Data

 

Even those of us at eDiscoveryDaily have to take an occasional vacation; however, instead of “going dark” for the week, we thought we would republish a post series from the early days of the blog (when we didn’t have many readers yet).  So, chances are, you haven’t seen these posts yet!  Enjoy!

Yesterday, we talked about the use of First Pass Review (FPR) applications (such as FirstPass®, powered by Venio FPR™) to not only conduct first pass review of your own collection, but also to analyze your opponent’s ESI production.  One way to analyze that data is through email analytics to see the communication patterns graphically to identify key parties for deposition purposes and look for potential production omissions.

Synonym Searching

Another type of analysis is the use of synonym searching.  Attorneys understand the key terminology their client uses, but they often don’t know the terminology their client’s opposition uses because they haven’t interviewed the opposition’s custodians.  In a product defect case, the opposition may refer to admitted design or construction “mistakes” in their product or process as “flaws”, “errors”, “goofs” or even “flubs”.  With FirstPass, you can enter your search term into the synonym searching section of the application and it will provide a list of synonyms (with hit counts of each, if selected).  Then, you can simply select the synonyms you wish to include in the search.  As a result, FirstPass identifies synonyms of your search terms to broaden the scope and catch key “hits” that could be the “smoking gun” in the case.
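
FirstPass’s synonym engine is proprietary, but the underlying idea can be sketched simply: expand a search term through a synonym list, then count hits for each candidate so the reviewer can pick which variants to include. The synonym map and sample documents below are hypothetical:

```python
# Toy synonym map -- a production tool would draw on a full thesaurus
SYNONYMS = {
    "mistake": {"flaw", "error", "goof", "flub", "defect"},
}

def expand_terms(term, documents):
    """Count hits for a term and each of its synonyms across the documents."""
    candidates = {term} | SYNONYMS.get(term, set())
    return {word: sum(doc.lower().count(word) for doc in documents)
            for word in sorted(candidates)}

# Hypothetical produced documents
docs = [
    "The valve design flaw was known internally.",
    "Engineering called it a goof, not an error.",
]
```

Here a search for “mistake” alone would find nothing, but the expanded candidate list surfaces the opponent’s own vocabulary (“flaw”, “goof”, “error”) with a hit count for each.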

Thursday, I’ll talk about the use of fuzzy searching to find misspellings that may be commonly used by your opponent or errors resulting from Optical Character Recognition (OCR) of any image-only files that they produce.  Stay tuned!  🙂

In the meantime, what do you think?  Have you used synonym searching to identify variations on terms in an opponent’s produced ESI?  Please share any comments you might have or if you’d like to know more about a particular topic.


Happy Independence Day from all of us at eDiscovery Daily and CloudNine Discovery!

eDiscovery Trends: First Pass Review – of Your Opponent’s Data

 

Even those of us at eDiscoveryDaily have to take an occasional vacation; however, instead of “going dark” for the week, we thought we would republish a post series from the early days of the blog (when we didn’t have many readers yet).  So, chances are, you haven’t seen these posts yet!  Enjoy!

In the past few years, applications that support Early Case Assessment (ECA) (or Early Data Assessment, as many prefer to call it) and First Pass Review (FPR) of ESI have become widely popular in eDiscovery, as the analytical and culling benefits of using these tools to evaluate your own ESI before conducting attorney review and producing relevant files have become obvious.  But nobody seems to talk about what these tools can do with an opponent’s produced ESI.

Fewer Resources to Understand Data Produced to You

In eDiscovery, attorneys typically develop a reasonably in-depth understanding of their collection.  They know who the custodians are, have a chance to interview those custodians and develop a good knowledge of standard operating procedures and terminology of their client to effectively retrieve responsive ESI.  However, that same knowledge isn’t present when reviewing an opponent’s data.  Unless they are deposed, the opposition’s custodians aren’t interviewed and where the data originated is often unclear.  The only source of information is the data itself, which requires in-depth analysis.  An FPR application like FirstPass®, powered by Venio FPR™, can make a significant difference in conducting that analysis – provided that you request a native production from your opponent, which is vital to being able to perform that in-depth analysis.

Email Analytics

The ability to see the communication patterns graphically – to identify the parties involved, with whom they communicated and how frequently – is a significant benefit to understanding the data received.  FirstPass provides email analytics to understand the parties involved and potentially identify other key opponent individuals to depose in the case.  Dedupe capabilities enable quick comparison against your production to confirm if the opposition has possibly withheld key emails between opposing parties.  FirstPass also provides an email timeline to enable you to determine whether any gaps exist in the opponent’s production.
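
FirstPass’s analytics are proprietary, but the two ideas described here – tallying sender-to-recipient communication patterns, and comparing productions by content hash to spot possibly withheld emails – can be sketched as follows. All names and sample messages below are hypothetical:

```python
import hashlib
from collections import Counter

def communication_pairs(messages):
    """Tally sender -> recipient pairs to surface the heaviest communicators."""
    pairs = Counter()
    for msg in messages:
        for recipient in msg["to"]:
            pairs[(msg["from"], recipient)] += 1
    return pairs

def message_fingerprint(msg):
    """Hash normalized fields so the same email can be matched across productions."""
    key = "|".join([msg["from"].lower(), ",".join(sorted(msg["to"])),
                    msg["subject"], msg["body"]])
    return hashlib.sha256(key.encode("utf-8")).hexdigest()

def possibly_withheld(our_production, their_production):
    """Emails in our set that have no content match in the opponent's production."""
    their_hashes = {message_fingerprint(m) for m in their_production}
    return [m for m in our_production if message_fingerprint(m) not in their_hashes]

# Hypothetical data: one email we produced that the opposition did not
ours = [{"from": "ceo@opponent.com", "to": ["vp@opponent.com"],
         "subject": "Design issue", "body": "We knew about the flaw."}]
theirs = []
```

An email between the parties that appears in your production but not in theirs is exactly the kind of gap worth raising with opposing counsel.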

Message Threading

The ability to view message threads for emails (which Microsoft Outlook® tracks) can also be a useful tool, as it enables you to see the entire thread “tree” of a conversation, including any side discussions that break off from the original discussion.  Because Outlook tracks those message threads, any missing emails are identified with placeholders.  Those could be emails your opponent has withheld, so the ability to identify them quickly and address them with opposing counsel (or with the court, if necessary) is key to evaluating the completeness of the production.
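
A minimal sketch of thread reconstruction (not Outlook’s or FirstPass’s actual implementation) shows how those placeholders surface: group each message under the message ID it replies to, and flag any referenced ID that is absent from the production. The message IDs below are hypothetical:

```python
def build_thread(messages):
    """Group replies under the message ID they answer; flag absent parents.

    Each message dict has an 'id' and an 'in_reply_to' (None for thread roots),
    mirroring the In-Reply-To header email clients use for threading.
    """
    produced_ids = {m["id"] for m in messages}
    children, placeholders = {}, set()
    for m in messages:
        parent = m["in_reply_to"]
        if parent is not None and parent not in produced_ids:
            placeholders.add(parent)  # referenced but missing from the production
        children.setdefault(parent, []).append(m["id"])
    return children, placeholders

# Hypothetical production: email <2> is referenced but was never produced
msgs = [
    {"id": "<1>", "in_reply_to": None},
    {"id": "<3>", "in_reply_to": "<2>"},
]
children, missing = build_thread(msgs)
```

Any ID in `missing` is a concrete, documentable gap to raise with opposing counsel.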

Tomorrow, I’ll talk about the use of synonym searching to find variations of your search terms that may be common terminology of your opponent.  Same bat time, same bat channel! 🙂

In the meantime, what do you think?  Have you used email analytics to analyze an opponent’s produced ESI?  Please share any comments you might have or if you’d like to know more about a particular topic.


eDiscovery Best Practices: When Litigation Hits, The First 7 to 10 Days Are Critical

When a case is filed, several activities must be completed within a short period of time (often as soon as the first seven to ten days after filing) to enable you to assess the scope of the case, where the key electronically stored information (ESI) is located and whether to proceed with the case or attempt to settle with opposing counsel.  Here are several of the key early activities that can assist in deciding whether to litigate or settle the case.

Activities:

  • Create List of Key Employees Most Likely to have Documents Relevant to the Litigation: To estimate the scope of the case, it’s important to begin to prepare the list of key employees that may have potentially responsive data.  Information such as name, title, eMail address, phone number, office location and where information for each is stored on the network is important to be able to proceed quickly when issuing hold notices and collecting their data.
  • Issue Litigation Hold Notice and Track Results: The duty to preserve begins when you anticipate litigation; however, even if litigation could not be anticipated prior to the filing of the case, it is certainly clear once the case is filed that the duty to preserve has begun.  Hold notices must be issued ASAP to all parties that may have potentially responsive data.  Once the hold is issued, you need to track and follow up to ensure compliance.  Here are a couple of recent posts regarding issuing hold notices and tracking responses.
  • Interview Key Employees: As quickly as possible, interview key employees to identify potential locations of responsive data in their possession as well as other individuals they can identify that may also have responsive data so that those individuals can receive the hold notice and be interviewed.
  • Interview Key Department Representatives: Certain departments, such as IT, Records or Human Resources, may have specific data responsive to the case.  They may also have certain processes in place for regular destruction of “expired” data, so it’s important to interview them to identify potentially responsive sources of data and stop routine destruction of data subject to litigation hold.
  • Inventory Sources and Volume of Potentially Relevant Documents: Potentially responsive data can be located in a variety of sources, including: shared servers, eMail servers, employee workstations, employee home computers, employee mobile devices, portable storage media (including CDs, DVDs and portable hard drives), active paper files, archived paper files and third-party sources (consultants and contractors, including cloud storage providers).  Hopefully, the organization has already created a data map before litigation to identify the location of sources of information to facilitate that process.  It’s important to get a high-level sense of the total population to begin to estimate the effort required for discovery.
  • Plan Data Collection Methodology: Determining how each source of data is to be collected also affects the cost of the litigation.  Are you using internal resources, outside counsel or a litigation support vendor?  Will the data be collected via an automated collection system or manually?  Will employees “self-collect” any of their own data?  Answers to these questions will impact the scope and cost of not only the collection effort, but the entire discovery effort.
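
For teams that track holds programmatically rather than in a spreadsheet, the hold-issuance and follow-up step above can be sketched as a minimal tracker. The custodian names and dates below are hypothetical, and this is an illustration of the record-keeping idea, not any particular product:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class HoldNotice:
    custodian: str
    issued: date
    acknowledged: bool = False  # set True when the custodian confirms receipt

def needs_follow_up(notices):
    """Custodians who have not yet confirmed receipt of the hold notice."""
    return [n.custodian for n in notices if not n.acknowledged]

# Hypothetical custodians and issue dates
notices = [
    HoldNotice("Alice Smith", date(2012, 6, 1), acknowledged=True),
    HoldNotice("Bob Jones", date(2012, 6, 1)),
]
```

Running `needs_follow_up(notices)` yields the list of custodians still owing an acknowledgment, which is the follow-up list the hold administrator works from.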

These activities can result in creating a data map of potentially responsive information and a “probable cost of discovery” spreadsheet (based on initial estimated scope compared to past cases at the same stage) that will help in determining whether to proceed to litigate the case or attempt to settle with the other side.

So, what do you think?  How quickly do you decide whether to litigate or settle?  Please share any comments you might have or if you’d like to know more about a particular topic.


eDiscovery Best Practices: Documentation is Key to a Successful Discovery Effort

 

We like to point out good articles about eDiscovery on this blog to keep our readers aware of trends and best practices.  I recently read an article on InsideCounsel titled E-discovery: Memorializing the e-discovery process, written by Alvin Lindsay, which had some good specific examples of where good documentation is important to prevent sanctions and save litigation costs.

Litigation Holds

The author notes that, since the Zubulake opinions issued by Judge Shira Scheindlin in 2003 and 2004, 1) most jurisdictions have come to expect that parties must issue a litigation hold “as soon as litigation becomes reasonably foreseeable”, and 2) “oral” litigation holds are unlikely to be sufficient since the same Judge Scheindlin noted in Pension Committee that failure to issue a “written” litigation hold constitutes “gross negligence”.  His advice: “make sure the litigation hold is in writing, and includes at minimum the date of issue, the recipients and the scope of preservation”.  IT personnel responsible for deleting “expired” data (outside of retention policies) also need to receive litigation hold documentation; in fact, “it can be a good idea to provide a separate written notice order just for them”.  Re-issuing the hold notices periodically is important because, well, people forget if they’re not reminded.  For previous posts on the subject of litigation holds, click here and here.

Retention Policies and Data Maps

Among the considerations for documentation here are the actual retention and destruction policies, system-wide backup procedures and “actual (as opposed to theoretical) implementation of the firm’s recycle policy”, as well as documentation of discussions with any personnel regarding same.  A data map provides a guide for legal and IT to the location of data throughout the company and important information about that data, such as the business units, processes and technology responsible for maintaining the data, as well as retention periods for that data.  The author notes that many organizations “don’t keep data maps in the ordinary course of business, so outside counsel may have to create one to truly understand their client’s data retention architecture.”  Creating a data map is impossible for outside counsel without involvement and assistance at several levels within the organization, so it’s truly a group effort and best done before litigation strikes.  For previous posts on the subject of data maps, click here and here.

Conferences with Opposing Counsel

The author discusses the importance of documenting the nature and scope of preservation and production and sums up the importance quite effectively by stating: “If opposing parties who are made aware of limitations early on do not object in a timely fashion to what a producing party says it will do, courts will be more likely to invoke the doctrines of waiver and estoppel when those same parties come to complain of supposed production infirmities on the eve of trial.”  So, the benefits of documenting those limitations early on are clear.

Collecting, Culling and Sampling

Chain of custody documentation (as well as a thorough written explanation of the collection process) is important to demonstrating the integrity of the data being collected.  If you collect at a broad level (as many do), then you need to cull through effective searching to identify potentially responsive ESI.  Documenting the approach for searching as well as the searches themselves is key to a defensible searching and culling process (it helps when you use an application, like FirstPass®, powered by Venio FPR™, that keeps a history of all searches performed).  As we’ve noted before, sampling enables effective testing and refinement of searches and aids in the defense of the overall search approach.

Quality Control

And, of course, documenting all materials and mechanisms used to provide quality assurance and control (such as “materials provided to and used to train the document reviewers, as well as the results of QC checks for each reviewer”) make it easier to defend your approach and even “clawback” privileged documents if you can show that your approach was sound.  Mistakes happen, even with the best of approaches.

So, what do you think?  These are some examples of important documentation of the eDiscovery process – can you think of others?  Please share any comments you might have or if you’d like to know more about a particular topic.


eDiscovery Case Law: Judge Peck Denies Recusal Motion in Da Silva Moore

 

It’s been a few weeks since we heard anything from the Da Silva Moore case.  If you’ve been living under a rock the past few months, Magistrate Judge Andrew J. Peck of the U.S. District Court for the Southern District of New York issued an opinion in this case in February making it one of the first cases to accept the use of computer-assisted review of electronically stored information (“ESI”).  However, the plaintiffs objected to the ruling, questioned Judge Peck’s relationship with defense counsel and with the selected vendor for the case, Recommind, and ultimately formally requested the recusal of Judge Peck.  For links to all of the recent events in the case that we’ve covered, click here.

Last Friday, in a 56-page opinion and order, Judge Peck denied the plaintiffs’ motion for recusal.  The opinion and order reviewed the past several contentious months and rejected the plaintiffs’ arguments for recusal in the following areas:

Participation in conferences discussing the use of predictive coding:

“I only spoke generally about computer-assisted review in comparison to other search techniques…The fact that my interest in and knowledge about predictive coding in general overlaps with issues in this case is not a basis for recusal.”

“To the extent plaintiffs are complaining about my general discussion at these CLE presentations about the use of predictive coding in general, those comments would not cause a reasonable objective observer to believe I was biased in this case. I did not say anything about predictive coding at these LegalTech and other CLE panels that I had not already said in my Search, Forward article, i.e., that lawyers should consider using predictive coding in appropriate cases. My position was the same as plaintiffs’ consultant . . . . Both plaintiffs and defendants were proposing using predictive coding in this case.  I did not determine which party’s predictive coding protocol was appropriate in this case until the February 8, 2012 conference, after the panels about which plaintiffs complain.”

“There are probably fewer than a dozen federal judges nationally who regularly speak at ediscovery conferences. Plaintiffs' argument that a judge's public support for computer-assisted review is a recusable offense would preclude judges who know the most about ediscovery in general (and computer-assisted review in particular) from presiding over any case where the use of predictive coding was an option, or would preclude those judges from speaking at CLE programs. Plaintiffs' position also would discourage lawyers from participating in CLE programs with judges about ediscovery issues, for fear of subsequent motions to recuse the judge (or disqualify counsel).”

Relationship with defense counsel Ralph Losey:

“While I participated on two panels with defense counsel Losey, we never had any ex parte communication regarding this lawsuit. My preparation for and participation in ediscovery panels involved only ediscovery generally and the general subject of computer-assisted review. Losey's affidavit makes clear that we have never spoken about this case, and I confirm that. During the panel discussions (and preparation sessions), there was absolutely no discussion of the details of the predictive coding protocol involved in this case or with regard to what a predictive coding protocol should look like in any case. Plaintiffs' assertion that speaking on an educational panel with counsel creates an appearance of impropriety is undermined by Canon 4 of the Judicial Code of Conduct, which encourages judges to participate in such activities.”

Relationship with Recommind, the selected vendor in the case:

“The panels in which I participated are distinguishable. First, I was a speaker at educational conferences, not an audience member. Second, the conferences were not one-sided, but concerned ediscovery issues including search methods in general. Third, while Recommind was one of thirty-nine sponsors and one of 186 exhibitors contributing to LegalTech's revenue, I had no part in approving the sponsors or exhibitors (i.e., funding for LegalTech) and received no expense reimbursement or teaching fees from Recommind or LegalTech, as opposed to those companies that sponsored the panels on which I spoke. Fourth, there was no "pre-screening" of MSL's case or ediscovery protocol; the panel discussions only covered the subject of computer-assisted review in general.”

Perhaps it is no surprise that Judge Peck denied the recusal motion.  Now, the question is: will District Court Judge Andrew L. Carter, Jr. weigh in?

So, what do you think?  Should Judge Peck recuse himself in this case or does he provide an effective argument that recusal is unwarranted?  Please share any comments you might have or if you’d like to know more about a particular topic.

Disclaimer: The views represented herein are exclusively the views of the author, and do not necessarily represent the views held by CloudNine Discovery. eDiscoveryDaily is made available by CloudNine Discovery solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscoveryDaily should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.

eDiscovery Trends: Where Does the Money Go? RAND Provides Some Answers

 

The RAND Corporation, a nonprofit research and analysis institution, recently published a new 159-page report on understanding eDiscovery costs, entitled Where the Money Goes: Understanding Litigant Expenditures for Producing Electronic Discovery, by Nicholas M. Pace and Laura Zakaras, that has some interesting findings and recommendations.  To obtain either a paperback copy or download a free eBook of the report, click here.

For the study, the authors requested case-study data from eight Fortune 200 companies and obtained data for 57 large-volume eDiscovery productions (from both traditional lawsuits and regulatory investigations) as well as information from extensive interviews with key legal personnel from the participating companies.  Here are some of the key findings from the research:

  • Review Makes Up the Largest Percentage of eDiscovery Production Costs: By a whopping amount, the major cost component in the studied cases was the review of documents for relevance, responsiveness, and privilege (typically about 73 percent of production costs). Collection, on the other hand, constituted only about 8 percent of expenditures for the cases in the study, while processing costs constituted about 19 percent.  Review costs about $14,000 per gigabyte, out of roughly $20,000 in total production costs per gigabyte (click here for a previous study on per gigabyte costs).  Review costs would have to be reduced by about 75 percent to make them comparable to processing, the next highest component.
  • Outside Counsel Makes Up the Largest Percentage of eDiscovery Expenditures: Again, by a whopping amount, the major cost component was expenditures for outside counsel services, which constituted about 70 percent of total eDiscovery production costs.  Vendor expenditures were around 26 percent.  Internal expenditures, even with adjustments made for underreporting, were generally around 4 percent of the total.  So, almost all eDiscovery expenditures are outsourced in one way or another.
  • If Conducted in the Traditional Manner, Review Costs Are Difficult to Reduce Significantly: Rates currently paid to “project attorneys during large-scale reviews in the US may well have bottomed out” and foreign review teams are often not a viable option due to “issues related to information security, oversight, maintaining attorney-client privilege, and logistics”.  Increasing the rate of review is also limited as, “[g]iven the trade-off between reading speed and comprehension…it is unrealistic to expect much room for improvement in the rates of unassisted human review”.  The study also notes that techniques for grouping documents, such as near-duplicate detection and clustering, while helpful, are “not the answer”.
  • Computer-Categorized Document Review Techniques May Be a Solution: Techniques such as predictive coding have the potential of reducing review hours by about 75 percent with about the same level of consistency, resulting in review costs of less than $2,000 and total production costs of less than $7,000 per gigabyte.  However, a “lack of clear signals from the bench” that the techniques are defensible, and litigants’ lack of confidence that the techniques can reliably identify the majority of responsive documents and privileged documents, are barriers to wide-scale adoption.
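
The report’s headline figures hang together under some quick arithmetic.  The sketch below is only an illustration using the rounded per-gigabyte numbers cited above (actual case-level figures in the report vary); it is not taken from the report itself:

```python
# Rough sanity check of the rounded RAND per-gigabyte cost figures cited above.
# These are illustrative round numbers, not exact case data from the report.

total_per_gb = 20_000   # approximate total production cost per gigabyte
review_per_gb = 14_000  # approximate review cost per gigabyte

# Review's share of total production cost (roughly 70 percent,
# consistent with the "typically about 73 percent" case-level finding)
review_share = review_per_gb / total_per_gb

# Processing is the next-largest component, about 19 percent of the total
processing_per_gb = 0.19 * total_per_gb

# Reduction in review cost needed to bring it in line with processing
reduction_needed = 1 - processing_per_gb / review_per_gb

print(f"Review share of total: {review_share:.0%}")
print(f"Processing cost per GB: ${processing_per_gb:,.0f}")
print(f"Review reduction needed: {reduction_needed:.0%}")
```

Running these numbers shows review at about 70 percent of the per-gigabyte total, processing at about $3,800 per gigabyte, and a required review-cost reduction of roughly 73 percent, which lines up with the report’s “about 75%” figure.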

Not surprisingly, the recommendations included taking “the bold step of using, publicly and transparently, computer-categorized document review techniques” for large-scale eDiscovery efforts.

So, what do you think?  Are you surprised by the cost numbers?  Please share any comments you might have or if you’d like to know more about a particular topic.
