Electronic Discovery

Is Technology Assisted Review Older than the US Government? – eDiscovery Trends

A lot of people consider Technology Assisted Review (TAR) and Predictive Coding (PC) to be new technology.  We attempted to debunk that myth last year after our third annual thought leader interview series by summarizing comments from thought leaders who noted that TAR and PC really just apply artificial intelligence to the review process.  But the foundation for TAR may go back much farther than you might think.

In the BIA blog post, Technology Assisted Review: It’s not as new as you think it is, Robin Athlyn Thompson and Brian Schrader take a look at the origins of at least one theory behind TAR: the “Naive Bayes classifier”.  It’s based on a theorem that was essentially introduced to the public in 1812.  But the theorem itself existed quite a bit earlier than that.

Bayes’s theorem is named after Rev. Thomas Bayes (who died in 1761), who first showed how to use new evidence to update beliefs.  He lived so long ago that there is no widely accepted portrait of him.  His friend Richard Price edited and presented this work in 1763, after Bayes’s death, as An Essay towards solving a Problem in the Doctrine of Chances.  Bayes’s theorem remained unknown until it was independently rediscovered and further developed by Pierre-Simon Laplace, who first published the modern formulation in his 1812 Théorie analytique des probabilités (Analytic Theory of Probabilities).
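
That modern formulation is compact enough to state here (this is the standard form of the theorem, added for reference rather than quoted from the BIA post): the probability of a hypothesis A given evidence B is P(A | B) = P(B | A) × P(A) / P(B).  In review terms, A might be “this document is responsive” and B the words the document contains; the classifier starts from a prior belief about how common responsive documents are and updates it as evidence accumulates.

Purely as an illustration of how that idea becomes a “Naive Bayes classifier” for review, here is a minimal from-scratch sketch in Python.  The training data and labels are invented for this post, and this is not how any particular TAR product actually works:

import math
from collections import Counter

def train(labeled_docs):
    # labeled_docs: list of (text, label) pairs; label is "responsive" or "not responsive"
    word_counts = {"responsive": Counter(), "not responsive": Counter()}
    doc_counts = Counter()
    for text, label in labeled_docs:
        doc_counts[label] += 1
        word_counts[label].update(text.lower().split())
    return word_counts, doc_counts

def classify(text, word_counts, doc_counts):
    # Pick the label with the highest posterior probability, computed in log space
    total_docs = sum(doc_counts.values())
    vocab = set(word_counts["responsive"]) | set(word_counts["not responsive"])
    best_label, best_score = None, float("-inf")
    for label in doc_counts:
        score = math.log(doc_counts[label] / total_docs)           # the prior, P(label)
        total_words = sum(word_counts[label].values())
        for word in text.lower().split():
            # add-one ("Laplace") smoothing so unseen words don't zero out the score
            count = word_counts[label][word] + 1
            score += math.log(count / (total_words + len(vocab)))
        if score > best_score:
            best_label, best_score = label, score
    return best_label

training = [
    ("widget 556 specifications attached", "responsive"),
    ("please confirm the 556 requirements", "responsive"),
    ("lunch on friday anyone", "not responsive"),
    ("company picnic rescheduled to june", "not responsive"),
]
wc, dc = train(training)
print(classify("specifications for the 556 widget", wc, dc))       # prints: responsive

The “naive” part is the assumption that words occur independently of one another, which is rarely true, but the approach works surprisingly well in practice and remains a common baseline for text classification.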

Thompson and Schrader go on to discuss more recent uses of artificial intelligence algorithms to map trends, including Amazon’s “More Like This” functionality, which recommends other items that you may like based on previous purchases.  That technology has been around for nearly two decades – can you believe it’s been that long? – and has been one of the key factors in Amazon’s success over that time.

So, don’t scoff at the use of TAR because it’s “new technology”; that thinking is “naïve”.  Some of the foundational statistical theories behind TAR go further back than the birth of our country.

So, what do you think?  Has your organization used technology assisted review on a case yet?  Please share any comments you might have or if you’d like to know more about a particular topic.

Disclaimer: The views represented herein are exclusively the views of the author, and do not necessarily represent the views held by CloudNine Discovery. eDiscoveryDaily is made available by CloudNine Discovery solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscoveryDaily should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.

eDiscovery Throwback Thursdays – How Databases Were Used, Circa Early 1980s, Part 5

So far in this blog series, we’ve taken a look at the ‘litigation support culture’ in the late 1970s and early 1980s, we’ve covered how databases were built, and we started discussing how databases were used.  We’re going to continue that in this post.  But first, if you missed the earlier posts in this series, they can be found here, here, here, here, here, here, here and here.

In last week’s post, we covered searching a database.  As I mentioned, searches were typically done by a junior level litigation team member who was trained to use the search engine.  Search results were printed on thermal paper, and that paper was flattened, folded accordion style, and given to a senior attorney to review – with the goal of identifying the documents he or she would like to see.  Those printouts included information that was recorded by a coder for each document.  A typical database record on a printout might look like this:

DocNo: PL00004568 – 4572

DocDate: 08/15/72

DocType: LETTER

Title: 556 Specifications

Characteristics: ANNOTATED; NO SIGNATURE

Author: Jackson-P

Author Org: ABC Inc.

Recipient: Parker-T

Recipient Org: XYZ Corp.

Copied: Franco-W; Hopkins-R

Copied Org: ABC Inc.

Mentioned: Phillips-K; Andrews-C

Subjects: A122 Widget 556; C320 Instructions

Contents: This letter includes specifications for product 556 and requests confirmation that it meets requirements.

Source: ABC-Parker

The attorney reviewing the printout would determine (based on the coded information) which documents to review – checking those off with a pen.

The marked up printout was delivered to the archive librarian for ‘pulling’.  We NEVER turned over the original (from the archive’s ‘original working copy’).  Rather, an archive clerk worked with the printout, locating boxes that included checked documents, and locating the documents within those boxes. The clerk made a photocopy of each document, returned the originals to their boxes, and placed the photocopies in a second box.  When the ‘document pull’ was complete, a QC clerk verified the copies against the printout to ensure nothing was missed, and then the copies were delivered to the attorney.

In last week’s post, I mentioned how long it took for a database to get built.  Once the database was available for use, retrievals were slow, by today’s standards.  Depending on the number of documents to be pulled, it could take days for an attorney to get a stack of documents back to review.  While that would be unacceptable today, it was a huge improvement over the alternative at the time – which was to flip through an entire document collection eyeballing every page looking for documents of interest.  For example, when preparing for a deposition, a team of paralegals would get to work going through boxes of documents and eyeballing every page looking for the deponent’s name.

Working with a database then was – by today’s standards – done at a snail’s pace.  But the time savings at the time were significant.  And the search results were usually more thorough.  On one project I managed, just as the database loading was completed, an attorney called me to say he was preparing for a deposition and had his paralegals manually review the collection looking for the deponent’s name.  They spent a week doing it and found under 200 documents. He was uncomfortable with those results.  I told him the database was almost available – we just had to do some testing – but I could do a search for him.  I did that while he waited on the phone and quickly reported back to him that the database search found almost twice as many documents.  We delivered the documents to him within a couple of days.

Tune in next week and we’ll cover how the litigation world circa 1980 evolved and got to where it is today.

Please let us know if there are eDiscovery topics you’d like to see us cover in eDiscoveryDaily.

Court Grants Motion for Spoliation Sanctions Due to Data that is “Less Accessible” – eDiscovery Case Law

In Mazzei v. Money Store, 01cv5694 (JGK) (RLE) (S.D. N.Y. July 21, 2014), New York Magistrate Judge Ronald L. Ellis granted the plaintiff’s motion for spoliation sanctions against the defendant, ordering the defendant to bear the cost of obtaining all the relevant data in question from a third party as well as paying for plaintiff attorney fees in filing the motion.

In this class action fraud case, the plaintiff requested that the defendants be sanctioned for violating the duty to preserve information within an electronic database created by a third party, claiming that the information had been lost from the electronic database system and was now in the possession of another third party, making it more costly to retrieve.  Accordingly, the plaintiff asked the court to direct the defendants to obtain the information from the third party, as a sanction for their failure to preserve the information in its original format.

On the other hand, the defendants claimed that the information was preserved, but it was “not readable,” and stated that it would cost a minimum of $10,000 to determine if the information was even searchable. The defendants also argued that the “not readable” documents were not their responsibility to preserve and produce because the information was controlled by a third party.

Judge Ellis stated that “[a] party is in control of any documents in the possession of a third party if that third party is contractually obligated to make them available…Defendants had both the legal right and practical ability to obtain the information relating to fees in the New Invoice System after the initiation of this action”. He noted that the defendants had a Master Services Agreement with the original third party provider of the database, which gave the defendants “a contractual right to demand the information specifically identifying the fees being charged”, so the defendants “were in control of that information”.

Regarding the defendant’s contention that there is no spoliation issue because the plaintiffs have not shown that the information in the system is less accessible now than prior to the transfer of the defendants' access to the system, Judge Ellis declared “This argument has no merit. Mazzei asserts, and Defendants do not dispute, that the only remaining trace of the information in the New Invoice System is possessed by Lender Processing Services and is not presently in a readable format. Therefore, the information is less accessible than it was when Defendants had access to it.”

Finding that the information was relevant and that the defendants “acted with a culpable state of mind” when they failed to preserve the data in its original form, Judge Ellis ordered the defendants “to 1) bear the cost of determining whether the New Invoice System data currently in the possession of LPS is searchable; 2) pay Mazzei his attorneys' fees for this application.”  The plaintiff was ordered to “submit an affidavit detailing reasonable hours and rates associated with its motion by August 1”.

So, what do you think?  Was the judge right to sanction the defendant for failing to preserve the accessibility of the information?  Please share any comments you might have or if you’d like to know more about a particular topic.

Litigation Support Tools of the Trade – eDiscovery Best Practices

If you have worked in litigation support for a number of years like I have, you start to assemble a toolkit of applications that help you get your job done more quickly and efficiently.  In her excellent Litigation Support Guru blog, Amy Bowser-Rollins (who was previously profiled on THIS blog) has recently published a series of posts that describe tools of the trade that she recommends to litigation support “newbies”.  Let’s take a look.

The entire series of 18 Tools of the Trade is available here.  Here are the specific tools that she covers:

  1. TextPad
  2. Snagit
  3. Unstoppable Copier
  4. Concordance CPL to convert DAT to DCB
  5. Bulk Rename Utility
  6. TrueCrypt
  7. FileZilla
  8. Beyond Compare
  9. Dan Biemer Concordance CPLs
  10. Tableau
  11. Avery DesignPro
  12. UltraEdit
  13. FTK Imager (also previously discussed on our blog here, here, here, here and here)
  14. Directory Lister Pro
  15. iConvert
  16. Hard Drive SATA/IDE Adapter
  17. 7-Zip
  18. AutoCAD Viewer

Whether you need to edit large text files, perform screen captures, copy or rename files, manipulate data for Concordance, encrypt data for transfer, FTP data using an intuitive interface, capture data from a drive without spoliating evidence, create CD labels, convert load files, compress file collections or view engineering drawings, there is an application for you.  I personally use many of these frequently, including TextPad, Snagit, TrueCrypt, FileZilla, Beyond Compare, FTK Imager and iConvert.

Several of these applications are free.  Most are at least inexpensive.  They are vital “tools of the trade” for litigation support professionals.  Kudos to Amy for a terrific blog series!

So, what do you think?  What “tools of the trade” do you have in your litigation support “tool belt”?  Please share any comments you might have or if you’d like to know more about a particular topic.

Court Rules to Limit Scope of Discovery, Noting that “Searching for ESI is only one discovery tool” – eDiscovery Case Law

In United States v. Univ. of Neb. at Kearney, 4:11CV3209 (D. Neb. Aug. 25, 2014), Nebraska Magistrate Judge Cheryl R. Zwart denied the government’s motion to compel discovery, finding that “ESI is neither the only nor the best and most economical discovery method for obtaining the information the government seeks” and stating that searching for ESI “should not be deemed a replacement for interrogatories, production requests, requests for admissions and depositions”.

This Fair Housing Act case was brought by the government, claiming that students were prohibited or hindered from having “emotional assistance animals” in university housing when such animals were needed to accommodate the requesting students’ mental disabilities.  Discovery had been a lengthy and disputed process since the parties filed a Stipulation and Order Regarding Discovery back in March of 2012.

The scope of ESI was a major part of the dispute. The defendants objected that the government’s search parameters were too expansive, and the cost of compliance would be unduly burdensome. The defendants explained that the cost of retrieval, review, and production would approach a million dollars, and provided an outline identifying the document “hits” and the estimated discovery costs.  The government served revised search terms on April 14, 2014. Although narrowed, the government’s search terms would still yield 51,131 responsive documents, and based on the defendants’ estimate, would require the defendants to expend an additional $155,574 to retrieve, review, and produce the responsive ESI.

To date, the defendants had paid $122,006 to third-party vendors for processing the government’s ESI requests and proposed the requests be narrowed to the “housing” or “residential” context. The defendants’ search terms would yield 10,997 responsive documents.  The government did not want to limit the scope of discovery and recommended producing all the ESI subject to a clawback agreement that would allow the government to search the ESI itself. The defendants argued such an agreement would violate the Family Educational Rights and Privacy Act by disclosing students’ personally identifiable information without their notice and consent.

The court had ordered the parties to provide answers to specific questions regarding their efforts at resolving the ESI dispute as part of any motion to compel filed. The government’s responsive statement did not include information comparing the cost of its proposed document retrieval method to the amount at issue in the case, any cost/benefit analysis of the discovery methods proposed, or a statement of who should bear those costs.

Judge Zwart stated that she “will not order the university to produce ESI without first reviewing the disclosure, even with the protection afforded under a clawback order. And if UNK must review the more than 51,000 documents requested by the government’s proposed ESI requests, the cost in both dollars and time exceeds the value to be gained by the government’s request.”

Illustrating the lack of proportionality in the government’s requests, Judge Zwart stated “Searching for ESI is only one discovery tool. It should not be deemed a replacement for interrogatories, production requests, requests for admissions and depositions, and it should not be ordered solely as a method to confirm the opposing party’s discovery is complete. For example, the government proposes search terms such as ‘document* w/25 policy.’ The broadly used words “document” and “policy” will no doubt retrieve documents the government wants to see, along with thousands of documents that have no bearing on this case. And to what end? Through other discovery means, the government has already received copies of UNK’s policies for the claims at issue.”
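
If the syntax in that quoted search term is unfamiliar: “w/25” is standard proximity-search shorthand meaning the two terms must appear within 25 words of each other, and the asterisk in “document*” matches “document”, “documents”, “documentation” and so on.  Here is a rough, hypothetical sketch of what evaluating such a term involves (illustrative only, not the actual engine of any review platform); it also shows why such broad words sweep in so much material that has no bearing on a case:

def within_n_words(text, prefix, term, n=25):
    # Hypothetical "document* w/25 policy" test: any word starting with the prefix
    # that falls within n words of the exact term counts as a hit.
    words = text.lower().split()
    hits_a = [i for i, w in enumerate(words) if w.startswith(prefix)]   # document*
    hits_b = [i for i, w in enumerate(words) if w == term]              # policy
    return any(abs(i - j) <= n for i in hits_a for j in hits_b)

print(within_n_words("please see the attached documents describing the travel policy",
                     "document", "policy"))   # True, even for a document with no bearing on the claims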

As a result, Judge Zwart stated that “the court is convinced ESI is neither the only nor the best and most economical discovery method for obtaining the information the government seeks. Standard document production requests, interrogatories, and depositions should suffice—and with far less cost and delay.”

So, what do you think?  Were the government’s requests overbroad or should they have been granted their motion to compel?  Please share any comments you might have or if you’d like to know more about a particular topic.

New Survey Shows eDiscovery Workload, Predictive Coding Use Increasing – eDiscovery Trends

eDiscovery workload, the use of predictive coding and the projected rate of adoption of technology assisted review are all up significantly, according to a new report by recruiting and staffing firm The Cowen Group.

In its Executive Summary, the Q2 2014 Quarterly Critical Trends Report highlighted “three compelling and critical trends”, as follows:

  1. Workload and the rate of increase are up significantly.
  2. The demand for talent is still increasing, though at a much slower rate than last year.
  3. The use of predictive coding (PC) is up by 40 percent, and the projected rate of adoption of technology assisted review (TAR) and PC is up dramatically, by 75 percent.

The survey represents responses from one hundred eDiscovery partners and litigation support managers/directors from 85 Am Law 200 law firms.  Not all participants responded to all of the questions.  Some of the key findings from the survey are as follows:

  • 64 percent of respondents reported an increase in workload during the first half of 2014, up 12 percent from last year’s survey;
  • For those reporting an increase in workload, 88 percent replied that this growth was attributed to the higher number of cases they are managing, 77 percent attributed it to the larger size of each case and 65 percent attributed it to both factors;
  • 56 responding firms project that their workload will continue to increase over the next 6 months;
  • Despite the increase in workload and expected growth, over half of responding firms (50 out of 96) said the size of their eDiscovery departments has stayed the same, 26 responding firms reported an increase and 20 responding firms reported a decrease;
  • The majority of respondents (55 out of 97) also expect their eDiscovery department sizes to remain the same through the end of the year, with 41 out of the remaining 42 responding firms expecting an increase;
  • Based on responses, the strongest demand for talent over the past 6 months was in the positions of analysts and project managers;
  • As for TAR and predictive coding, 30 out of 78 responding firms reported that their use of TAR in review workflows increased over the last 6 months (46 out of the remaining 48 firms reported that it stayed the same), with a whopping 52 out of 78 responding firms reporting an expected increase over the rest of the year.

The FREE 8-page report has several additional survey results and is presented in an easy-to-read graphical format.  To review the report, click here.

So, what do you think?  Do any of those numbers and trends surprise you?  Please share any comments you might have or if you’d like to know more about a particular topic.

eDiscovery Throwback Thursdays – How Databases Were Used, Circa Early 1980s, Part 4

So far in this blog series, we’ve taken a look at the ‘litigation support culture’ in the late 1970s and early 1980s, and we’ve covered how a database was built.  We’re going to move on to discuss how those databases were used.  But first, if you missed the earlier posts in this series, they can be found here, here, here, here, here, here and here.

After coding and keying, data from magnetic tapes was loaded into a database, usually housed on a vendor’s timeshare computer.  This was before business people had computers on their desks, so litigation teams leased computer terminals – often called “dumb terminals”.  The picture above is of a Texas Instruments Silent 700 terminal – which was the standard for use by litigators.  This photo was taken at the Texas State Historical Museum.

These terminals were portable and came housed in a hard plastic case with a handle.  By today’s standards, they were not “compact”.  They were in fact quite heavy and not as easy to tote around as the laptops and tablets of today.  As you can see, there’s no screen.  You inserted a roll of thermal paper which ‘spit out’ search results.  And, as you can see, you accessed the mainframe using a standard telephone.  The phone’s handset was inserted into an acoustic coupler on the terminal, and you dialed the computer’s phone number for a connection.  You’re probably thinking that retrievals were pretty slow over phone lines…  yes and no.  Certainly response time wasn’t at the level that it typically is today, but the only thing being transmitted in search sessions was data.  There were no images.  So retrievals weren’t as slow as you might expect.

Searches were done using very ‘precise’ syntax.  You asked the database for information, and it collected precisely what you asked for.  There weren’t fuzzy searches, synonym searches, and so on.  The only real search that provided flexibility was stem searching.  You could, for example, search for “integrat*” and retrieve variations such as “integrate”, “integrates”, “integrated” and “integration”.  The most commonly used search engines required that you start a search with a command (like “find”, “sort”, or “print”).  If you were doing a “find” command, that was followed by the field in which you wanted to search, an equal sign, and the word you were searching for.  To search for all documents authored by John Smith, your command might look like:

Find Auth=Smith-J*

The database responded by telling you how many records it found that matched your criteria.  Usually the next step was to sort the results (often by date or document number), and then print the results – that is, print the information that was coded for each record.  Keep in mind, “prints” were on a continuous roll of thermal paper spit out by the machine.  More often than not, searches were done by junior litigation team members and results were provided to a senior attorney to review.  So the thermal paper roll with the results was usually flattened and folded accordion-style to make reviews easier.
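
To make that concrete for modern readers, here is a small, hypothetical Python sketch of the same “find, sort, print” flow.  The records and field values are invented to resemble a coded record, the only flexibility emulated is the trailing-asterisk stem search, and the vintage systems described above obviously ran nothing like this:

records = [
    {"DocNo": "PL00004568", "DocDate": "1972-08-15", "Author": "Jackson-P", "DocType": "LETTER"},
    {"DocNo": "PL00004701", "DocDate": "1972-09-02", "Author": "Smith-J", "DocType": "MEMO"},
    {"DocNo": "PL00004822", "DocDate": "1971-11-30", "Author": "Smith-John", "DocType": "LETTER"},
]

def find(records, field, value):
    # Emulates "Find Auth=Smith-J*": exact match, or prefix match when the value ends in *
    if value.endswith("*"):
        prefix = value[:-1].lower()
        return [r for r in records if r.get(field, "").lower().startswith(prefix)]
    return [r for r in records if r.get(field, "").lower() == value.lower()]

hits = find(records, "Author", "Smith-J*")         # the "find" step
hits = sorted(hits, key=lambda r: r["DocDate"])    # the "sort" step (here, by date)
for r in hits:                                     # the "print" step (to the screen, not thermal paper)
    print(r["DocNo"], r["DocDate"], r["Author"], r["DocType"])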

In next week’s post, we’ll discuss retrieval of the actual documents.

Please let us know if there are eDiscovery topics you’d like to see us cover in eDiscoveryDaily.

Court Denies Plaintiff’s Fallback Request for Meet and Confer after Quashing its Subpoena – eDiscovery Case Law

In Boston Scientific Corporation v. Lee, 1:13-cv-13156-DJC (N.D. Cal. Aug. 4, 2014), California Magistrate Judge Paul S. Grewal found time to preside over a case other than Apple v. Samsung and granted the motion to quash the plaintiff’s subpoena for the defendant’s laptops, refusing the plaintiff’s fallback request to meet and confer, and referencing Leave it to Beaver in the process.

The defendant left the employment of the plaintiff and began working for a competitor, prompting the plaintiff to file this case against him, claiming theft of trade secrets and violation of a confidentiality agreement (by downloading confidential information onto a USB thumb drive).  The defendant’s new employer assigned him a laptop when he started his employment, which he used both to perform his job duties and to communicate with his attorneys. Several weeks after the lawsuit was filed, the defendant’s employer segregated this laptop with a third-party e-discovery vendor and issued him a second one.

The defendant’s employer also produced forensic information about the contents of the first laptop to the plaintiff in the form of file listing reports (which disclosed extensive metadata of the files contained on the laptop), USB reports, and web browsing history reports.  When the plaintiff pressed for more, the defendant’s employer offered to have an independent vendor review a full forensic image of the first laptop to search for pertinent information, including a review of any deleted files.  The plaintiff refused, requested forensic discovery from both laptops, and issued a subpoena, which the defendant’s employer moved to quash.

Judge Grewal began his order as follows:

“This case illustrates a recurring problem in all civil discovery, especially in intellectual property cases. A party demands the sun, moon and stars in a document request or interrogatory, refusing to give even a little bit. The meet and confer required by a court in advance of a motion is perfunctory at best, with no compromise whatsoever. But when the parties appear before the court, the recalcitrant party possesses newfound flexibility and a willingness to compromise. Think Eddie Haskell singing the Beaver's praises to June Cleaver, only moments after giving him the business in private. Having considered the arguments, the court GRANTS Nevro's motion to quash.”

Explaining his decision, Judge Grewal stated “No doubt there exists discoverable information on the two laptops, but by demanding nothing less than a complete forensic image of not just one but two laptops belonging to a direct competitor, Boston Scientific demands too much. Such imaging will disclose privileged communications related to the litigation as well as irrelevant trade secrets from a nonparty-competitor.  Boston Scientific's subpoena therefore seeks discovery of protected matter, something plainly not permitted under Rule 45, rendering the subpoena overbroad and imposing an undue burden on Nevro.”

Judge Grewal noted that the plaintiff, “[a]s a fall back”, proposed what the defendant’s employer had originally proposed: the retention of an independent vendor to “review a full forensic image of the Initial Laptop to search for pertinent information, including a review of any deleted files.”  Judge Grewal closed the door on that request as well, stating “to allow Boston Scientific now to seek shelter from a fallback position that Nevro previously tendered in good faith would make a mockery of both parties' obligation to meet and confer in good faith from the start.  The time to tap flexibility and creativity is during meet and confer, not after.”

So, what do you think?  Should the plaintiff have been granted its fall back request or was it too late for compromise?  Please share any comments you might have or if you’d like to know more about a particular topic.

How Much Will it Cost? – eDiscovery Best Practices

By far, the most important (and, therefore, the most asked) question posed to eDiscovery providers is “How much will it cost?”.  To get that answer, you should actually be asking a few questions of your own – if they are the right questions, you can get the answer you seek.

With these questions, you can hopefully prevent surprises and predict and control costs:

  • What is the Unit Price for Each Service?: It’s important to make sure that you have a clear understanding of every unit price the eDiscovery provider includes in an estimate.  Some services may be charged per page or per document, while others may be charged per gigabyte, and still others may be charged on an hourly basis.  Understand how each service is being charged and make sure the pricing model makes sense.
  • Are the Gigabytes Counted as Original or Expanded Gigabytes?: For the per-gigabyte services, it’s also important to make sure that you know whether they are billed on the original GBs or the expanded GBs.  Expanded GBs can be two to three times as large (or more) as the original GBs.  Some services are typically billed on the original GBs (or at least the unzipped GBs) while others are typically billed on the expanded GBs.  It’s important to know which metric is used; otherwise, your ESI collection may be larger than you think and you may be in for a surprise when the bill comes (see the sketch after this list).
  • Will I Get an Estimate in Advance for Hourly Billed Services?: When you ask for specific hourly billed services from the provider (such as professional consulting or technician services) to complete a specific task, it’s important to get an estimate to complete that task as well as advanced notification if the task will require more time than estimated.
  • What Incidental Costs are Billed?: It’s not uncommon (or unreasonable) for incidentals like project management, supplies and shipping to be included in invoices.  In particular, project management services can be an important component to the services provided by the eDiscovery provider.  But, the rates charged for these incidentals can vary widely.  Understanding what incidentals are being billed and the rates for those services is important to controlling costs.
  • If Prices are Subject to Change, What is the Policy for Those Changes and Notification of Clients?: Let’s face it, prices do change, even in the eDiscovery industry.  In ongoing contracts, most eDiscovery providers will retain the right to change prices to reflect the cost of doing business (whether they exercise those rights or not).  It’s important to know the terms that your provider has set for the ability to change prices, what the notification policy is for those price changes and what your options are if the provider exercises that right.
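
Because the original-versus-expanded distinction is where many billing surprises come from, here is a hypothetical back-of-the-envelope estimator.  Every number in it (the rates, the 2.5x expansion factor, the project management hours) is invented for illustration and is not any provider’s actual pricing:

def estimate_cost(original_gb, expansion_factor=2.5,
                  processing_per_original_gb=100.0,   # example: processing billed on original GBs
                  hosting_per_expanded_gb=15.0,       # example: hosting billed on expanded GBs
                  pm_hours=10, pm_hourly_rate=150.0): # example: project management billed hourly
    # Expanded GBs: the size of the collection once containers and attachments are unpacked
    expanded_gb = original_gb * expansion_factor
    processing = original_gb * processing_per_original_gb
    hosting = expanded_gb * hosting_per_expanded_gb
    project_mgmt = pm_hours * pm_hourly_rate
    return {"expanded_gb": expanded_gb, "processing": processing,
            "hosting_per_month": hosting, "project_mgmt": project_mgmt,
            "total_first_month": processing + hosting + project_mgmt}

# 50 original GBs can behave like 125 GBs once services are billed on the expanded count
print(estimate_cost(original_gb=50))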

With the right questions and a good understanding of your project parameters, you can get the answer to that elusive question “How much will it cost?”.

So, what do you think?  How do you manage costs with your eDiscovery providers?  Please share any comments you might have or if you’d like to know more about a particular topic.

Though it was “Switching Horses in Midstream”, Court Approves Plaintiff’s Predictive Coding Plan – eDiscovery Case Law

In Bridgestone Americas Inc. v. Int’l Bus. Mach. Corp., No. 3:13-1196 (M.D. Tenn. July 22, 2014), Tennessee Magistrate Judge Joe B. Brown, acknowledging that he was “allowing Plaintiff to switch horses in midstream”, nonetheless ruled that the plaintiff could use predictive coding to search documents for discovery, even though keyword search had already been performed.

In this case, where the plaintiff sued the defendant over a $75 million computer system that it claimed threw its “entire business operation into chaos”, the plaintiff requested that the court allow the use of predictive coding in reviewing over two million documents.  The defendant objected, noting that the request was an unwarranted change to the original case management order, which did not include predictive coding, and that it would be unfair to use predictive coding after an initial screening had been done with keyword search terms.

Judge Brown conducted a lengthy telephone conference with the parties on June 25 and began the analysis in his order by observing that “[p]redictive coding is a rapidly developing field in which the Sedona Conference has devoted a good deal of time and effort to, and has provided various best practices suggestions”, also noting that “Magistrate Judge Peck has written an excellent article on the subject and has issued opinions concerning predictive coding.”  “In the final analysis”, Judge Brown continued, “the uses of predictive coding is a judgment call, hopefully keeping in mind the exhortation of Rule 26 that discovery be tailored by the court to be as efficient and cost-effective as possible.”

As a result, noting that “we are talking about millions of documents to be reviewed with costs likewise in the millions”, Judge Brown permitted the plaintiff “to use predictive coding on the documents that they have presently identified, based on the search terms Defendant provided.”  Judge Brown acknowledged that he was “allowing Plaintiff to switch horses in midstream”, so “openness and transparency in what Plaintiff is doing will be of critical importance.”

This case has circumstances similar to Progressive Cas. Ins. Co. v. Delaney, where that plaintiff also desired to shift from the agreed-upon discovery methodology for privilege review to a predictive coding methodology.  However, in that case, the plaintiff did not consult with either the court or the requesting party regarding its intention to change review methodology, and its lack of transparency and cooperation resulted in an order to produce documents according to the agreed-upon methodology.  It pays to cooperate!

So, what do you think?  Should the plaintiff have been allowed to shift from the agreed upon methodology or did the volume of the collection warrant the switch?  Please share any comments you might have or if you’d like to know more about a particular topic.
