eDiscoveryDaily

Six eDiscovery Predictions for 2014, Part One – eDiscovery Trends

It’s that time of year when people make predictions for the coming year on all sorts of things, including electronic discovery trends.  Though I have to say, I’ve seen fewer predictions this year than in past years.  Nonetheless, I feel compelled to offer some of my own.  If they turn out right, you heard it here first!

Prediction 1: Predictive coding technologies will become more integrated into the discovery process, for more than just review.

Two or three years ago, predictive coding (a.k.a., technology assisted review or computer assisted review) was a promising technology that had yet to be officially accepted in the courts.  Then, in 2012, predictive coding was approved in cases such as Da Silva Moore v. Publicis Groupe & MSL Group; Global Aerospace Inc., et al. v. Landow Aviation, L.P. dba Dulles Jet Center, et al.; and In re Actos (Pioglitazone) Products Liability Litigation (and there were at least two other cases where it was contemplated).  So, it’s beginning to be used, though most attorneys still don’t fully understand how it works or realize that it’s not a “turn-key” software solution but rather a managed process that uses the software.

It’s not going out on a limb to say that this year predictive coding technologies will be more widely used; however, I think those technologies will branch out beyond review to other phases of the eDiscovery life cycle, including Information Governance.  Predictive coding is not new technology – it’s essentially artificial intelligence applied to the review process – so it’s logical that the same technology can be applied to other areas of the discovery life cycle as well.

Prediction 2: The proposed amendments will be adopted, but it will be a struggle.

Changes to Federal Rules for eDiscovery have been drafted and have been approved for public comment.  However, several people have raised concerns about some of the new rules.  Judge Shira Scheindlin has criticized proposed Rule 37(e), intended to create a uniform national standard regarding the level of culpability required to justify severe sanctions for spoliation, for creating “perverse incentives” and encouraging “sloppy behavior.”

U.S. Sen. Christopher Coons (D-Del.), who chairs the Subcommittee on Bankruptcy and the Courts, predicted that some proposed restrictions – such as reducing the number of depositions, interrogatories and requests for admission for each case – “would do nothing about the high-stakes, highly complex or highly contentious cases in which discovery costs are a problem.”  Senator Coons and Sherrilyn Ifill, president of the NAACP Legal Defense and Educational Fund Inc., also expressed concerns that those limits would likely restrict plaintiffs in smaller cases in which discovery costs are not a problem.

Needless to say, not everybody is a fan of all of the new proposed rules, especially Rule 37(e).  But, the proposed rules have gotten this far and there are a number of lobbyists pushing for adoption.  So, I think they’ll be adopted, but not without some controversy and struggle.

Prediction 3: The eDiscovery industry will continue to consolidate and many remaining providers will need to continue to reinvent themselves.

Every year, I see several predictions that more eDiscovery vendors will fail and/or there will be more consolidation in the industry.  And, every year there is consolidation.  Here’s the latest updated list of mergers, acquisitions and investments since 2001, courtesy of Rob Robinson.  But, every year there are also new players in the market, so the number of providers never seems to change dramatically.  Last year, by my count, there were 225 exhibitors at Legal Tech New York (LTNY), with many, if not most, of them in the eDiscovery space.  This year, the partial list stands at 212.  Not a tremendous drop off, if any.

Nonetheless, there will be more pressure on eDiscovery providers than ever before to provide services at reasonable prices, yet turn a profit.  I’ve seen bold predictions, like this one from Albert Barsocchini at NightOwl Discovery in which he predicted the possible end of eDiscovery processing fees.  I’m not sure that I agree that they’re going away entirely, but I do see further commoditization of several eDiscovery services.  The providers that offer truly unique software offerings and/or expert services to complement any commodity-based services that they offer will be the ones best equipped to meet market demands, profitably.

On Monday, I predict I’ll have three more predictions to cover.  Hey, at least that’s one prediction that should come true!

So, what do you think?  Do you have any eDiscovery predictions for 2014?  Please share any comments you might have or if you’d like to know more about a particular topic.

Disclaimer: The views represented herein are exclusively the views of the author, and do not necessarily represent the views held by CloudNine Discovery. eDiscoveryDaily is made available by CloudNine Discovery solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscoveryDaily should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.

EDRM UTBMS eDiscovery Code Set Calculator – eDiscovery Best Practices

Last month, we discussed budget calculators available from the Metrics section of the Electronic Discovery Reference Model (EDRM) web site.  So far, we have reviewed two of the budget calculators, beginning with the E-Discovery Cost Estimator for Processing and Review workbook provided by Julie Brown at Vorys law firm and the Doc Review Cost Calculator provided by an eDiscovery vendor.  Today, we will continue our review of the calculators with a look at the EDRM UTBMS eDiscovery Code Set Calculator provided by Browning Marean, DLA Piper law firm; and George Socha, Socha Consulting (and, of course, co-founder of EDRM).

As described on the site, this budget calculator uses the ABA’s Uniform Task Based Management System (UTBMS) eDiscovery codes as a starting point for calculating estimated eDiscovery expenses. Users enter anticipated average hourly rates for:

  • Partners
  • Associates
  • Paralegals
  • Contract reviewers
  • In-house resources
  • Vendors

For each relevant L600-series UTBMS code, users enter (a) total estimated hours for each relevant group and (b) total estimated associated disbursements.  The spreadsheet then displays:

  • A summary of the estimated costs
  • Details of the estimated costs for each combination, such as estimated costs of time partners spend planning discovery (Partner and L601)
  • Totals by type of person, such as Partner
  • Totals by individual UTBMS code, such as L601
  • Totals by higher level UTBMS codes, such as L600

This spreadsheet is quite clear and easy to use.  It provides a summary section at the top of the sheet for the top level codes from L600 (Identification) to L690 (Project Management), which are fed by the enterable cells to the left and below.  All of the enterable cells are in yellow to make it easy to identify where the data needs to be entered (the hourly rates for each of the positions are top left and the total estimated hours are enterable for each position and subcode).

Based on the entered rates and hours within each subcode, costs are calculated and displayed in green for each position within each subcode, as well as a total for each subcode which rolls up to a total for the top level code displayed in blue at the top of the sheet.  There is also a column to enter associated disbursements for each code and subcode to reflect those disbursements that don’t tie to an hourly rate.  The sheet is protected to avoid inadvertent overwriting of formulas, but there is no password so that the user can tweak formulas if necessary.
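The rollup the spreadsheet performs can be sketched in a few lines of Python (all rates, hours, codes and disbursements below are hypothetical illustrations, not values from the workbook):

```python
from collections import defaultdict

# Hypothetical hourly rates by role (the spreadsheet's top-left entries).
rates = {"Partner": 500.0, "Associate": 300.0, "Paralegal": 150.0}

# Hypothetical estimated hours: (UTBMS subcode, role) -> hours,
# plus per-code disbursements that don't tie to an hourly rate.
hours = {
    ("L601", "Partner"): 10, ("L601", "Associate"): 25,
    ("L602", "Associate"): 40, ("L602", "Paralegal"): 60,
}
disbursements = {"L601": 500.0, "L602": 1200.0}

# Cost for each (code, role) cell, mirroring the green cells in the sheet.
cell_costs = {(code, role): h * rates[role] for (code, role), h in hours.items()}

# Roll up: totals by role, totals by subcode (including disbursements),
# and a grand total like the blue top-level summary cell.
by_role, by_code = defaultdict(float), defaultdict(float)
for (code, role), cost in cell_costs.items():
    by_role[role] += cost
    by_code[code] += cost
for code, d in disbursements.items():
    by_code[code] += d

grand_total = sum(by_code.values())
```

The same structure (rate × hours per cell, then rollups by role, by subcode and overall) is what feeds the summary section at the top of the sheet.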

This workbook would certainly be useful for tracking eDiscovery costs according to the UTBMS codes, especially for hourly billed activities.  It’s not a spreadsheet for estimating costs based on estimated data volumes but rather estimated hours spent by key staff on each phase of discovery.  You can download this calculator individually or a zip file containing all four calculators here.

So, what do you think?  How do you estimate eDiscovery costs?   Please share any comments you might have or if you’d like to know more about a particular topic.


Want to Avoid Unexpected Issues in Your eDiscovery Project? Conduct a "Pre-Mortem" – eDiscovery Best Practices

“Insanity is doing the same thing over and over again and expecting a different result.”  To avoid that issue, experienced project managers know the value of conducting a “post-mortem” to discuss any problems encountered during the project and how to avoid repeating them in the future.  But, what if you could prevent them from happening in the first place?

That’s where a “pre-mortem” comes in.  Just as a “post-mortem” enables you to correct problems after the fact so they don’t recur, a “pre-mortem” enables you to anticipate problems up front and create a plan to prevent them from happening at all.  On many projects that I’ve worked on, we’ve conducted a “pre-mortem” to brainstorm what can go wrong (i.e., risks) and identify a plan for mitigating each of those risks up front, then revisited that plan regularly (typically weekly, if not more frequently) to proactively address each risk.  This exercise can avoid a lot of headaches during the project.

These potential problems can happen throughout the discovery life cycle, so the “pre-mortem” list of potential problems will often evolve over the course of the discovery process.  Here are a couple of examples:

  • Data anomalies and exception files will slow down processing and cause us to fall behind in preparing data for review: As we’ve noted before, exceptions are the rule and you will frequently encounter exception files that cannot be processed (or require considerable effort to process).  Some “pre-mortem” steps to address this issue are to: 1) proactively discuss (and hopefully agree) with opposing counsel on how to handle these files in a manner that minimizes the time to attempt to correct those files and 2) establish a procedure for setting aside these files (when possible) while loading the remaining problem-free data.  Removing these potential roadblocks to getting data ready for review will help keep the discovery process moving and on schedule.
  • Review will take longer than anticipated and we will miss the production deadline: There are several measures that can be utilized to avoid this issue, including: 1) obtaining as much information about your document collection as possible up front, including the number of documents, number of pages per document (when available), types of files being reviewed (some take longer than others), etc.; 2) preparing complete and clear review instructions for your attorney reviewers; 3) estimating the number of reviewers and expected throughput for review; 4) conducting a pilot review with a few reviewers to compare actual results to estimates and adjusting estimates (and review instructions) accordingly; 5) exceeding (at least slightly) the number of estimated reviewers to provide some leeway; and 6) monitoring progress daily and adjusting quickly if productivity starts to fall behind.
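As an illustration of the staffing estimate in step 3 above, the basic arithmetic can be sketched as follows (all figures are hypothetical planning inputs):

```python
import math

def reviewers_needed(num_docs, docs_per_hour, hours_per_week, weeks, buffer=1):
    """Estimate the number of reviewers needed to finish within the deadline.

    `buffer` adds the slight over-staffing leeway suggested in step 5.
    All inputs are hypothetical planning figures.
    """
    throughput_per_reviewer = docs_per_hour * hours_per_week * weeks
    return math.ceil(num_docs / throughput_per_reviewer) + buffer

# e.g., 100,000 documents at 50 docs/hour, 40 review hours/week, 4 weeks:
# 100,000 / (50 * 40 * 4) = 12.5, rounded up to 13, plus 1 reviewer of leeway
estimate = reviewers_needed(100000, 50, 40, 4)
```

Comparing the pilot review’s actual throughput against the assumed docs-per-hour rate is what lets you adjust this estimate before the deadline is at risk.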

By identifying what could go wrong up front, creating a plan to avoid those issues and monitoring the plan regularly to proactively address each risk, you can keep those problems from happening in the first place.

So, what do you think?  Do you perform “pre-mortems” at the beginning of your project?  Please share any comments you might have or if you’d like to know more about a particular topic.


Plaintiff’s Attorney’s Fee Request Slashed because they “Transformed what should Have Been a Simple Case into a Discovery Nightmare” – eDiscovery Case Law


In Fair Housing Center of Southwest Michigan v. Hunt, No. 1:09-cv-593, 2013 U.S. Dist. (W.D. Mich. Oct. 21, 2013), Michigan Magistrate Judge Joseph G. Scoville ruled that the plaintiffs were prevailing parties under their settlement agreement with the defendants and were entitled to an award of attorney’s fees, but slashed the plaintiffs’ fee request, “both because the hours devoted to this case were excessive and because the fee request makes no effort to account for the limited success that plaintiffs achieved in this case”.

In this housing discrimination case, the parties entered a settlement agreement that referred the question of attorney’s fees and costs to Magistrate Judge Scoville. The plaintiffs filed a motion seeking $605,507.92, consisting of $587,905.00 in attorneys’ fees and $17,602.92 in taxable costs. The defendants opposed the motion “on every possible ground”, contending that the plaintiffs did not enjoy “prevailing party” status entitling them to an award of attorney’s fees and, if they were entitled to fees, that the amount sought was “grossly excessive”.

Noting that a prevailing party is one who achieves “a material alteration of the legal relationship of the parties”, Judge Scoville ruled that the plaintiffs were prevailing parties due to the court-approved Settlement Agreement, which provided the plaintiffs a monetary award of $47,500.00.  Because the plaintiffs were prevailing parties, he also ruled that they were entitled to the full amount of the taxable costs, which were for transcript fees for depositions and hearings.

However, when it came to attorney’s fees of $587,905.00, Judge Scoville found that the “expenditure of 2,614 hours by three partners, two associates, and two paralegals” was “a truly extravagant expenditure of time and resources on what should have been a relatively simple case”.  He further noted:

“It is virtually impossible to see how the exercise of billing judgment would lead a law firm to invest 2,600 hours, by seven different billers, in the pursuit of such a simple case. A hardworking attorney lucky enough to bill 40 hours a week, 50 weeks per year, would bill only 2,000 hours per year. Thus, 2,600 hours represents significantly more than one year of attorney time, expended in pursuit of a single case, without distraction.”

Judge Scoville also noted that the plaintiff’s “single-minded focus on discovery of ESI engendered predictable disputes over discovery” and that it “appeared to this court on more than one occasion that plaintiffs were treating the case as a litigation workshop on discovery of ESI rather than a lawsuit.”

Judge Scoville stated that it “would be well within the court's discretion to deny plaintiffs' motion for attorney's fees in its entirety. This approach, however, would be unduly harsh under the specific facts of this case because plaintiffs were clearly prevailing parties on some claims and counsel's reasonable efforts should be compensated, even though their overall approach to the case was clearly excessive.”  Therefore, he applied several reductions of partner, associate and paralegal hours, reducing the total awarded to $223,444.80.

So, what do you think?  Should the full attorney’s fees have been awarded?  Or perhaps denied entirely?  Please share any comments you might have or if you’d like to know more about a particular topic.


Cost Calculator for Document Review – eDiscovery Best Practices

A couple of weeks ago, we discussed budget calculators available from the Metrics section of the Electronic Discovery Reference Model (EDRM) web site and, two days later, began a review of the budget calculators, beginning with the E-Discovery Cost Estimator for Processing and Review workbook provided by Julie Brown at Vorys law firm.  Today, we will continue our review of the calculators with a look at the Doc Review Cost Calculator.

As described on the site, this budget calculator focuses on review, which is universally considered to be the most expensive phase of the eDiscovery process (by far). From assumptions entered by users, it calculates per-document and per-hour (a) low and high price estimates, (b) low and high costs on a per page basis, and (c) low and high costs on a per document basis.

To use it, enter assumptions in the white and yellow cells in columns B, C, and D. Calculations are shown in columns D through T.

Assumptions that you can provide include:

  • Pages per document
  • Low and high page counts in the collection
  • Low and high time to complete the review project (in weeks) and reviewer hours per week
  • Proposed rates for review (hourly and per document)
  • Low and high pages per hour rates for review (from which documents per hour rates are computed)
  • Proposed rates for review management (hourly and per document)
  • Percentage of the collection to QC

From the entered assumptions, the model will provide calculations to illustrate the low and high cost estimates for the low and high page count estimates, for both a per-document and a per-hour review billing structure.  It will also estimate a range of the number of reviewers needed to complete the project within the time frames specified, to help you plan on staffing necessary to meet proposed deadlines.  The detailed calculations are stored in a hidden sheet called “Calculations” – you can unhide it if you want to see how the review calculation “sausage” is made.

This model uses an “old school” assessment of a document collection based on page counts, so to use it with native file collections (where page counts aren’t known), you have to set the pages per document to 1 – your review rate then becomes documents (files) per hour.
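The low/high estimate logic can be sketched in Python as a simplified approximation of the hidden “Calculations” sheet (all rates and volumes below are hypothetical placeholders, not figures from the workbook):

```python
def review_cost_estimates(pages_low, pages_high, pages_per_doc,
                          hourly_rate, per_doc_rate, pages_per_hour):
    """Return (per-hour low/high, per-document low/high) review cost estimates.

    For native file collections, set pages_per_doc = 1 so that "pages" become
    documents and pages_per_hour becomes a documents-per-hour review rate.
    """
    docs_low = pages_low / pages_per_doc
    docs_high = pages_high / pages_per_doc

    # Per-hour billing: hours needed at the assumed review rate, times the rate.
    hourly_low = (pages_low / pages_per_hour) * hourly_rate
    hourly_high = (pages_high / pages_per_hour) * hourly_rate

    # Per-document billing: document count times the per-document rate.
    per_doc_low = docs_low * per_doc_rate
    per_doc_high = docs_high * per_doc_rate
    return (hourly_low, hourly_high), (per_doc_low, per_doc_high)

# e.g., 50,000-100,000 pages, 5 pages/doc, $60/hour, $0.50/doc, 400 pages/hour
hourly, per_doc = review_cost_estimates(50000, 100000, 5, 60.0, 0.50, 400)
```

Comparing the two tuples side by side is essentially what the calculator does when it contrasts a per-hour billing structure against a per-document one for the low and high page count scenarios.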

Suggestions for improvement:

  • Some of the enterable assumption cells are in yellow and some in white (the same color as the computed cells); it would be easier and clearer to identify the assumption fields if they were all yellow, differentiating them from the computed cells;
  • Protect the sheet and lock down the computed cells (at least in the main sheet) to avoid accidental overwriting of calculations (with the ability to unprotect the sheet if a formula requires tweaking);
  • Tie a line or bar graph to the numbers to represent the differences graphically;
  • Provide some notes to explain some of the cells (especially the assumption cells) in more detail.

Nonetheless, this workbook would certainly be useful for estimating review costs and the number of reviewers needed to complete a large-scale review, not only at the start, but also to provide updated estimates as review progresses, so you can adjust cost estimates and staffing needs as you go.  You can download this calculator individually or a zip file containing all four calculators here.  In a few days, we will continue our review of the current EDRM budget calculators in more detail with the ESI Cost Budget calculator from Browning Marean of DLA Piper law firm.

So, what do you think?  How do you estimate eDiscovery costs?   Please share any comments you might have or if you’d like to know more about a particular topic.


Plaintiff Sanctioned After its "Failure to Take the Most Basic Document Preservation Steps" – eDiscovery Case Law

In SJS Distribution Systems, Inc. v. Sam’s East, Inc., No. 11 CV 1229 (WFK)(RML), 2013 U.S. Dist. (E.D.N.Y. Oct. 11, 2013), New York Magistrate Judge Robert M. Levy found the plaintiff’s failure to take “the most basic document preservation steps”, including issuing a litigation hold – “even after it discovered the packaging nonconformities and filed this action” – constituted gross negligence. As a result, an adverse inference instruction sanction was issued against the plaintiff and the defendant was awarded its costs and attorney’s fees associated with its motion to compel.

This breach of contract dispute arose out of the plaintiff’s purchase of almost $3 million worth of diapers from the defendant.  The plaintiff contended that it discovered a discrepancy between the packaging of the diapers that it ordered and some of the diaper shipments the defendant delivered in the fall of 2010, and claimed that it was unable to resell those diapers to the intended buyer and incurred damages as a result.

During discovery, the defendant asked for documents from the plaintiff relating to the purchase, sale, delivery, and attempts to resell the diapers. When the plaintiff certified that it had produced all responsive documents, the defendant advised the plaintiff of deficiencies in its document production, but the plaintiff reaffirmed that it had produced all responsive documents in its possession. After an additional document request, the plaintiff produced three additional pages along with amended responses and objections. The plaintiff’s president also asserted that his company “does not normally save copies of all emails sent or received, that it ‘did not anticipate’ litigation with defendant or the need to save all email communication with defendant, and that plaintiff has no internal emails since it is primarily a one-person entity”.

After the defendant asked for permission to file a motion to compel, Judge Levy asked the parties to meet and confer in an attempt to resolve the dispute. Plaintiff’s counsel then announced it had discovered 181 additional hard-copy documents addressing the sale and storage of the diapers, which it turned over to the defendant; nonetheless, the defendant continued to claim that the plaintiff had still not produced all relevant electronically stored information (ESI).  When the defendant subsequently obtained discovery from a third party, that discovery included 334 pages of emails from the plaintiff to third parties that the plaintiff had not produced, leading the defendant to eventually file a motion for spoliation sanctions.

Judge Levy used the Zubulake test to determine the applicability of sanctions, as follows:

(1) the party having control over the evidence had an obligation to preserve it at the time it was destroyed; (2) the records were destroyed with a culpable state of mind; and (3) the destroyed evidence was relevant to the party’s claim or defense such that a reasonable trier of fact could find that it would support that claim or defense.

Judge Levy found that all three components of the test were applicable, noting “the facts here establish that SJS’s failure to take the most basic document preservation steps, even after it discovered the packaging nonconformities and filed this action, constitutes gross negligence.”  As a result, the defendant’s motion was granted in part and denied in part, with an adverse inference sanction imposed and the defendant awarded its costs and attorney’s fees associated with its motion to compel.  Judge Levy refused to preclude the plaintiff from offering evidence from mid-November 2010 forward, declining to apply such a “drastic remedy” because there was no evidence of bad faith and other evidence was still available to the defendant.

So, what do you think?  Should sanctions have been issued?   Please share any comments you might have or if you’d like to know more about a particular topic.


Company Should Have Preserved Personal eMails, But No Sanctions (Yet) – eDiscovery Case Law


In Puerto Rico Telephone Co. v. San Juan Cable LLC, No. 11-2135 (GAG/BJM), 2013 U.S. Dist. (D.P.R. Oct. 7, 2013), Puerto Rico Magistrate Judge Bruce J. McGiverin found that “plaintiff has proffered sufficient evidence to establish that [the defendant] OneLink failed to preserve relevant emails within its control”, but denied the plaintiff’s request for sanctions at this time because of the “absence of bad faith” on the defendant's part and the plaintiff's failure to demonstrate prejudice.

In this antitrust lawsuit, the plaintiff sued several defendants, including OneLink Communications, for attempting to block its entry into the cable television market.  The plaintiff contended that OneLink engaged in sanctionable spoliation of evidence by failing to preserve relevant emails from the personal email accounts of three former OneLink officers. Because of this failure, the plaintiff sought an adverse inference instruction at the summary judgment stage and at trial.

Judge McGiverin stated that “when seeking an adverse inference instruction, the proponent of the inference must provide sufficient evidence to ‘show that the party who destroyed the document knew of (a) the claim (that is, the litigation or the potential for litigation), and (b) the document’s potential relevance to that claim.’…Such an instruction usually is appropriate ‘only where the evidence permits a finding of bad faith destruction,’…but bad faith is not required where circumstances indicate an adverse inference instruction is otherwise warranted”.

Continuing, Judge McGiverin noted “Here, plaintiff has proffered sufficient evidence to establish that OneLink failed to preserve relevant emails within its control. While the emails at issue come from the personal email accounts of OneLink's former officers, these officers had used their personal email accounts to manage the company for as long as seven years…OneLink presumably knew its managing officers used their personal email accounts to engage in company business, and thus its duty to preserve extended to those personal email accounts.”

However, Judge McGiverin found the “plaintiff's request for sanctions problematic on multiple fronts”.  First, he found that OneLink had not acted in bad faith because it had issued a litigation hold notice to employees within one month of the filing of the lawsuit. He also found that any “prejudice suffered by PRTC is currently speculative” since only three email chains could not be located, these three chains were not potentially damaging to OneLink, and the plaintiff had been able to acquire those chains from other sources.

He noted that the “plaintiff may renew its motion for sanctions if circumstances so warrant” if “more information regarding the extent of spoliation” was discovered, but, at least for now, denied the plaintiff’s motion for adverse inference instruction.

So, what do you think?  Should sanctions be issued when a party fails to preserve personal email?   Please share any comments you might have or if you’d like to know more about a particular topic.

eDiscovery Daily will take a break for the holidays and will return on Thursday, January 2, 2014. Happy Holidays from all of us at CloudNine Discovery and eDiscovery Daily!


Court Orders Plaintiff to Perform Some Requested Searches Despite the Plaintiff’s Claim that they’re “Unprecedented” – eDiscovery Case Law


In Swanson v. ALZA Corp., No.: CV 12-04579-PJH (KAW), 2013 U.S. Dist. (N.D. Cal. Oct. 7, 2013), California Magistrate Judge Kandis A. Westmore granted in part and denied in part the defendant’s request to compel the plaintiff to apply its search terms to his ESI, ordering some of the search terms to be performed, despite the plaintiff’s assertion that “the application of Boolean searches was unprecedented”.

In this patent dispute, the plaintiff produced nearly 750,000 pages of documents to the defendant on a rolling basis between June and August 9, 2013, of which more than 600,000 pages were from ESI sources. During that period, the parties met and conferred regarding possible additional search terms designed to capture further documents responsive to the defendant's requests, but were unable to agree.  On July 17, the defendant filed a motion to compel demanding that its new terms be utilized to search the plaintiff's ESI, to which the plaintiff objected.

Part of the dispute was in agreeing on the number of terms that the defendant was requesting: the plaintiff characterized the defendant's request as proposing 89 additional terms, whereas the defendant characterized its request as only equating to 25 additional search "terms," which were distinct searches containing multiple terms and Boolean operators.  After the court ordered the parties to provide supplemental information to assist in the resolution of the underlying dispute on August 27, the defendant, believing documents were missing from the production based on the plaintiff’s sample searches, asked for access to the plaintiff’s database to run its own searches, or for the plaintiff to run a narrowed list of 11 “terms”.  At the hearing, the plaintiff's counsel explained that his firm ran two sample ESI searches, which took 34 hours and 51 minutes, and 23 hours and 58 minutes, respectively.

Judge Westmore “was not persuaded by Plaintiff's argument that running modified searches would place such an undue burden as to relieve Plaintiff of his obligation to produce responsive documents” and did not agree with the plaintiff’s contention that “the application of Boolean searches was unprecedented”, stating “[t]his is actually not the case, and given the availability of technology, Boolean searches will undoubtedly become the standard, if, for no other reason, to limit ESI documents to those most likely to be relevant to pending litigation.”

Ultimately, Judge Westmore ordered the plaintiff to deduplicate the results of its two sample searches and to perform three of the other requested searches, stating that "[w]hile Defendant has not shown that it should be entitled to have all 11 searches performed, the Court is persuaded that it would not be unduly burdensome for Plaintiff to perform the searches below in light of Plaintiff's unwillingness to produce its ESI database".
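The deduplication step the court ordered boils down to merging the hits from multiple searches while keeping only one copy of each document. Here is a minimal sketch of that idea; it assumes each document carries a unique identifier, and none of the names or sample IDs come from the case record:

```python
# Merge overlapping search-result sets, keeping the first occurrence
# of each document ID. Field names and sample data are hypothetical.

def deduplicate(*result_sets):
    """Combine result sets, dropping documents already seen by doc_id."""
    seen = set()
    merged = []
    for results in result_sets:
        for doc in results:
            if doc["doc_id"] not in seen:
                seen.add(doc["doc_id"])
                merged.append(doc)
    return merged

search_1 = [{"doc_id": "A-001"}, {"doc_id": "A-002"}]
search_2 = [{"doc_id": "A-002"}, {"doc_id": "B-117"}]  # A-002 hit by both searches

unique_docs = deduplicate(search_1, search_2)
print(len(unique_docs))  # 3 unique documents across both searches
```

In practice an eDiscovery platform would key on a content hash (e.g. MD5 of the file) rather than a production number, but the merge logic is the same.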

So, what do you think?  Was this a fair compromise?   Please share any comments you might have or if you’d like to know more about a particular topic.

Disclaimer: The views represented herein are exclusively the views of the author, and do not necessarily represent the views held by CloudNine Discovery. eDiscoveryDaily is made available by CloudNine Discovery solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscoveryDaily should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.

Vorys Project Ballpark Cost Estimator for ESI Processing and Review – eDiscovery Best Practices

On Tuesday, we discussed budget calculators available from the Metrics section of the Electronic Discovery Reference Model (EDRM) web site.  Today, we will begin a more in-depth discussion of the budget calculators, beginning with the E-Discovery Cost Estimator for Processing and Review workbook provided by Julie Brown at Vorys law firm.

As described on the site, this budget calculator contains two worksheets. The Linear-search-analytics worksheet allows users to calculate ballpark cost estimates for processing and review under three “cases” and compare the results. The cases are:

  • Case 1: Full blown processing and linear review
  • Case 2: Search terms used to cull data during processing
  • Case 3: Use analytical culling tool

With each case, users are able to see the cost consequences that result from changing variables such as Data Volume, Volume after culling, and Pre-processing cost/GB.  The cost differences are shown numerically, as well as via two graphs, a 3D horizontal bar graph that shows the cost differences between the three cases (see above graphic for an example) and a 2D horizontal bar graph that shows the cost differences, with a breakdown of processing and review costs for each.

The Linear-size examples worksheet allows users to compare four versions of Case 1. Users are able to see the cost consequences (in both numeric and 2D vertical bar graph form) that result from changing any combination of six variables: Data Volume, Processing Cost/GB, Pages per GB, Docs Reviewed by Hour, Hourly Rate, and FTEs.

Both worksheets provide useful information and are well designed to differentiate the data entry cells (no fill color) from the calculation-only cells (fill color), and the sheets are protected to prevent accidental overwriting of the calculated cells (they aren't locked with a password, so you can unprotect them if you want to make adjustments).  The workbook is designed to generate a ballpark cost for processing and review based on the variables provided, so it doesn't include fixed overhead costs such as software, hardware or facility costs.  It also doesn't include management overhead, so it's essentially a model for variable costs only, but it could be useful in determining at what volume an analytical culling tool might pay for itself.
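The arithmetic behind such a ballpark model is straightforward: variable cost is processing dollars per gigabyte plus review hours priced at an hourly rate. The sketch below mimics the workbook's three cases; all rates, volumes, and culling percentages are illustrative assumptions of mine, not figures from the spreadsheet.

```python
# Ballpark variable-cost model: processing + linear review.
# All figures are illustrative assumptions, not values from the workbook.

def ediscovery_cost(gb, processing_per_gb=150, pages_per_gb=5000,
                    pages_per_doc=3, docs_per_hour=50, hourly_rate=60):
    """Variable cost of processing `gb` gigabytes and linearly reviewing the result."""
    processing = gb * processing_per_gb
    docs = gb * pages_per_gb / pages_per_doc
    review = docs / docs_per_hour * hourly_rate
    return processing + review

volume = 100  # GB collected

case1 = ediscovery_cost(volume)                       # full processing, linear review
case2 = ediscovery_cost(volume * 0.40)                # search terms cull data to 40%
case3 = 25 * volume + ediscovery_cost(volume * 0.20)  # analytics fee ($25/GB) + 20% kept

print(f"Case 1: ${case1:,.0f}  Case 2: ${case2:,.0f}  Case 3: ${case3:,.0f}")
```

Because the hypothetical analytics fee is charged on the full collected volume while review is charged only on what survives culling, running case 2 and case 3 across a range of volumes shows where the culling tool pays for itself.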

Suggestions for improvement:

  • Create a common section for data entry variables so that they don't have to be re-entered for each comparison case, which would save time and avoid data inconsistencies;
  • While you’re at it, add variables for pages per document and hours per week – right now, you have to unprotect the sheet and change the formulas if you want to change those variables (not all document sets or work weeks are the same);
  • Add sheets to compare versions of Case 2 and Case 3, like the sheet for Case 1.

Nonetheless, this workbook is quite useful if you want a ballpark estimate for processing and review and a comparison of costs across alternatives.  You can download this calculator individually or as a zip file containing all four calculators here.  After the first of the year, we will continue our review of the current EDRM budget calculators in more detail.

So, what do you think?  How do you estimate eDiscovery costs?   Please share any comments you might have or if you’d like to know more about a particular topic.


Want to Estimate your eDiscovery Budget? Use One of These Calculators – eDiscovery Best Practices

It has been a busy year for the Electronic Discovery Reference Model (EDRM).  In addition to announcing a transition to nonprofit status by May 2014, since the May annual meeting, several EDRM projects (Metrics, Jobs, Data Set and the new Native Files project) have already announced new deliverables and/or requested feedback.  Now, another resource is available via the EDRM site – Budget Calculators!

It can be difficult to estimate the total costs for eDiscovery at the outset of a case.  There are a number of variables and options that could impact the budget by a wide margin and it may be difficult to compare costs for various options for processing and review.  However, thanks to the EDRM Metrics team and contributing members, budget calculator Excel workbooks are available to enable you to at least “ballpark” the costs.  The budget calculator spreadsheets are designed to help organizations estimate likely eDiscovery costs, based on assumptions that you provide, such as average hourly rates for contract reviewers or average number of pages per document.

There are four budget calculators that are currently available.  They are:

  • UF LAW E-Discovery Project Ballpark Cost Estimator for ESI Processing and Review: This budget calculator contains two worksheets. The first worksheet allows users to calculate ballpark cost estimates for processing and review under three “cases” (Full blown processing and linear review, Search terms used to cull data during processing and Use analytical culling tool) and compare the results.  The second worksheet allows users to compare four versions of Case 1.  This workbook has been provided by University of Florida Levin College of Law and Vorys law firm.
  • Doc Review Cost Calculator: This budget calculator focuses on review. From assumptions entered by users, it calculates per-document and per-hour (a) low and high price estimates, (b) low and high costs on a per page basis, and (c) low and high costs on a per document basis.
  • ESI Cost Budget: This budget calculator estimates costs by project phase. The phases are: ESI Collection, ESI Processing, Paper Collection and Processing, Document Review, Early Data Assessment, Phase 1 Review, Phase 2 Review, Production, Privilege Review, Review of Opposition’s Production and Hosting Costs.  This workbook has been provided by Browning Marean, DLA Piper law firm.
  • EDRM UTBMS eDiscovery Code Set Calculator: This budget calculator uses the UTBMS e-discovery codes as a starting point for calculating estimated e-discovery expenses. Users enter anticipated average hourly rates for: Partners, Associates, Paralegals, Contract reviewers, In-house resources and Vendors, along with total estimated hours for each relevant group and total estimated associated disbursements for each relevant L600-series UTBMS code.  The spreadsheet then displays: a summary of the estimated costs, details of the estimated costs for each combination, totals by type of person and totals by individual and higher-level UTBMS codes.  This workbook has been provided by Browning Marean, DLA Piper law firm; and George Socha, Socha Consulting.
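The UTBMS-style calculation in that last workbook reduces to hours per timekeeper type multiplied by an assumed hourly rate, summed per task code. A minimal sketch of that rollup, with hypothetical rates, hours, and task labels standing in for the real L600-series codes:

```python
# Role-based cost rollup in the style of the UTBMS code set calculator.
# Rates, hours, and task labels are hypothetical stand-ins for a firm's
# real billing rates and the actual L600-series UTBMS codes.

rates = {"partner": 600, "associate": 350, "contract_reviewer": 75}

estimated_hours = {
    "collection": {"associate": 10, "partner": 2},
    "review":     {"contract_reviewer": 400, "associate": 40},
}

totals_by_task = {
    task: sum(hours * rates[role] for role, hours in by_role.items())
    for task, by_role in estimated_hours.items()
}
grand_total = sum(totals_by_task.values())

print(totals_by_task)   # collection: $4,700; review: $44,000
print(grand_total)      # $48,700
```

The real calculator adds per-code disbursements and in-house and vendor resources to this sum, but the core is the same hours-times-rate arithmetic.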

You can download each calculator individually or a zip file containing all four calculators.  If you have your own budget calculator, you can also submit yours to EDRM to share with others.  The calculators are available here.  On Thursday, we will begin reviewing the current budget calculators in more detail.

So, what do you think?  How do you estimate eDiscovery costs?   Please share any comments you might have or if you’d like to know more about a particular topic.
