eDiscovery Case Law: Never Mind! Judge Scheindlin Withdraws FOIA Requests Opinion

Back in February, eDiscovery Daily reported on Southern District of New York Judge Shira A. Scheindlin’s latest opinion regarding eDiscovery best practices.  In National Day Laborer Organizing Network v. U.S. Immigration and Customs Enforcement Agency, 10 Civ. 3488, she ruled that the federal government must provide documents “in a usable format” when it responds to Freedom of Information Act (FOIA) requests.

In this case, the National Day Laborer Organizing Network, the Center for Constitutional Rights and the Immigration Justice Clinic at the Benjamin N. Cardozo School of Law sued in August 2010 to require production of a wide range of documents under the Freedom of Information Act.  In response, the government agency defendants produced documents grouped together in large, non-searchable files in which individual documents could not be easily identified and emails were separated from their attachments.

In ruling at that time, Judge Scheindlin noted that “Once again, this Court is required to rule on an eDiscovery issue that could have been avoided had the parties had the good sense to ‘meet and confer,’ ‘cooperate’ and generally make every effort to ‘communicate’ as to the form in which ESI would be produced,” and ruled that federal agencies must turn over documents that include “metadata,” which allows them to be searched and indexed.  Noting that “common sense dictates” that the handling of FOIA requests should be informed by “the spirit if not the letter” of the Federal Rules of Civil Procedure, Judge Scheindlin called the government’s justification for delivering non-searchable documents “a lame excuse.”  A copy of the original opinion and order can be found here.

Now, that opinion has been withdrawn.

In a very short order withdrawing the opinion, Judge Scheindlin stated:

“This court has been informed that the parties have recently resolved their dispute regarding the form and format in which records will be produced by defendants in this Freedom of Information Act lawsuit.  In the interests of justice, this Court now believes that it would be prudent to withdraw the opinion it issued on February 7, 2011 (Docket #41).  I do so because, as subsequent submissions have shown, that decision was not based on a full and developed record.  By withdrawing the decision, it is the intent of this Court that the decision shall have no precedential value in this lawsuit or in any other lawsuit.

The Court also withdraws its Supplemental Order dated February 14, 2011 (Docket # 50).”

So, as Emily Litella would say, “Never Mind!”

So, what do you think?  What impact does the withdrawal of the opinion have on future eDiscovery cases?  Please share any comments you might have or if you’d like to know more about a particular topic.

eDiscovery Case Law: District Court Judge Affirms $1 Million Sanction to Pappas in Victor Stanley

One of the first posts ever published in eDiscovery Daily was this one, where defendant Mark Pappas, President of Creative Pipe, Inc., was ordered by Magistrate Judge Paul W. Grimm to “be imprisoned for a period not to exceed two years, unless and until he pays to Plaintiff the attorney's fees and costs that will be awarded to Plaintiff as the prevailing party pursuant to Fed. R. Civ. P. 37(b)(2)(C).”  Judge Grimm found that “Defendants…deleted, destroyed, and otherwise failed to preserve evidence; and repeatedly misrepresented the completeness of their discovery production to opposing counsel and the Court.”

Upon appeal, District Court Judge Marvin J. Garbis declined to adopt the order regarding incarceration, stating: “[T]he court does not find it appropriate to Order Defendant Pappas incarcerated for future possible failure to comply with his obligation to make payment of an amount to be determined in the course of further proceedings.”

Then, in January of this year, Judge Grimm entered an order awarding a total of $1,049,850.04 in “attorney’s fees and costs associated with all discovery that would not have been undertaken but for Defendants' spoliation, as well as the briefings and hearings regarding Plaintiff’s Motion for Sanctions.”  As a result, the court awarded $901,553.00 in attorney’s fees and $148,297.04 in costs, including the costs for the Plaintiff’s computer forensic consultant, finding that “Defendants’ first spoliation efforts corresponded with the beginning of litigation” and that “Defendants’ misconduct affected the entire discovery process since the commencement of this case.”

Naturally, the award was appealed.

On Tuesday, June 14, Judge Garbis affirmed Judge Grimm’s prior Report and Recommendation ordering the award.  Judge Garbis noted that “The Court’s stated standard for includible fees and costs is consistent with the purpose of designing a sanction that will ‘restore the prejudiced party to the same position he would have been in absent the wrongful destruction of evidence by the opposing party.’”  Judge Garbis discussed and rejected all of Creative Pipe’s objections as to the amount of the award, adopting Judge Grimm’s findings that all of these fees were in fact related to the discovery malfeasance.

With Creative Pipe having already paid a total of $478,409.92, a balance of $571,440.12 remains under the order, which concluded with Judge Garbis stating that “Defendants shall, by July 15, 2011, pay Plaintiff…the balance due.”  There was no mention of Judge Grimm’s original automatic jail sanction for non-payment of the fees, though Judge Garbis had previously indicated that he might impose jail sanctions for non-payment.

So, what do you think?  Will the defendant pay the rest?  Appeal to the Circuit Court?  Could he still go to jail?  Please share any comments you might have or if you’d like to know more about a particular topic.

eDiscovery Trends: More On the Recommind Patent Controversy

Perhaps the most controversial story discussed in the eDiscovery community in quite some time is the patent for predictive coding recently announced by Recommind in a June 8 press release entitled Recommind Patents Predictive Coding.  I haven’t seen this much backlash against a company or individual since last summer, when LeBron James decided to leave the Cleveland Cavaliers for the Miami Heat (and he and his new teammates held a championship-like celebration before the season even began).  How did that turn out?  😉

Since that announcement, there have been several articles and blog posts about it, including:

  • This one, from Monica Bay of Law Technology News, asking the question “Is Recommind Blowing Smoke?” and discussing the buzz over Recommind’s announcement;
  • This one, from Evan Koblentz (also of Law Technology News), entitled “Recommind Intends to Flex Predictive Coding Muscles,” which includes responses from Catalyst and Valora Technologies;
  • This one, also from Evan Koblentz, a blog post on EDD Update in which Recommind General Counsel and Vice President Craig Carpenter acknowledges that Recommind failed to obtain a trademark for the term Predictive Coding (though Recommind is still using the ™ symbol on the term Predictive Coding on this page);
  • Three blog posts in four days from Sharon D. Nelson of Ride the Lightning blog, which debate the enforceability of the patent and include a response from OrcaTec, noting that Recommind’s implied threat of litigation is “nothing more than an attempt to bully the market place”.

There are several other articles and blog posts regarding the topic, but if I listed them all, I’d have no room left for anything new!  Sorry that I couldn’t include them all.

I reached out to Bill Dimm, founder of Hot Neuron LLC, makers of Clustify, which clusters documents into groups for effective, expedited review, and asked him for his thoughts about the Recommind press release and patent.  Here are his comments:

"Recommind's press release would have been accurately titled 'Recommind Patents a Method for Predictive Coding,' but it went with the much more provocative title 'Recommind Patents Predictive Coding,' implying  that its patent covers every conceivable way of doing predictive coding.  The only way I can see that being accurate is if you DEFINE predictive coding to be exactly the procedure outlined in claim 1 of Recommind's patent.  Of course, 'predictive coding' is a relatively new term, so the definition is up for debate.  The patent itself says:

'Predictive coding refers to the capability to use a small set of coded documents (or partially coded documents) to predict document coding of a corpus.'  That sure sounds like it allows for a lot of possibilities beyond the procedure in claim 1 of the patent.  The press release goes on to say: 'ONLY [emphasis is mine] Recommind's patented, iterative, computer-assisted approach can 'bend the cost curve' of document review.'  Really?  So, Recommind has the ONLY product in the industry that works?  A few of us disagree.  Even clustering, which Recommind claims does not qualify as predictive coding, will bend the cost curve because the efficiency boost it provides increases with the size of the document set.

Moving on from the press release to the patent itself, I would recommend reading claim 1 if you are interested in such things.  It is the most general method that the USPTO allowed Recommind to claim – the other claims are all dependent claims that describe more specific embodiments of claim 1, presumably so that Recommind would have a leg left to stand on if prior art were found to invalidate claim 1.  Claim 1 describes a procedure for predictive coding that involves quite a few steps.  It is my understanding (I am NOT a lawyer) that the patent is irrelevant for any predictive coding procedure that does not include every single one of the steps listed in claim 1.  Since claim 1 includes things like identification cycles, rolling loads, and random sampling, it seems unlikely that existing products would accidentally infringe on the patent.

As far as Clustify is concerned, Recommind's patent is irrelevant since our procedure for predictive coding is different.  In fact, I explained in a presentation at a recent conference why random sampling is a very inefficient approach (something that has been known for decades in other fields), so I wouldn't even be tempted to follow Recommind's procedure."

So, what do you think?  Will the Recommind predictive coding patent allow them to rule predictive coding?  Or only their specific approach?  Will LeBron James ever win a championship?  Please share any comments you might have or if you’d like to know more about a particular topic.

Full disclosure: Hot Neuron is a partner of Trial Solutions, which has used their product, Clustify, in various client projects.

eDiscovery Best Practices: Avoiding eDiscovery Nightmares: 10 Ways CEOs Can Sleep Easier

I found this article in the CIO Central blog on Forbes.com from Robert D. Brownstone – it’s a good summary of issues for organizations to consider so that they can avoid major eDiscovery nightmares.  The author counts down his top ten list David Letterman style (clever!) to provide a nice, easy-to-follow summary of the issues.  Here’s a recap, with my ‘two cents’ on each item:

10. Less is more: The U.S. Supreme Court ruled unanimously in 2005 in the Arthur Andersen case that a “retention” policy is actually a destruction policy.  It’s important to routinely dispose of old data that is no longer needed, so that less data is subject to discovery, and it’s just as important to know where the remaining data resides.  My two cents: A data map is a great way to keep track of where the data resides.

9. Sing Kumbaya: They may speak different languages, but you need to find a way to bridge the communication gap between Legal and IT to develop an effective litigation-preparedness program.  My two cents: Require cross-training so that each department can understand the terms and concepts important to the other.  And, don’t forget the records management folks!

8. Preserve or Perish: Assign the litigation hold protocol to one key person, either a lawyer or a C-level executive, who decides when a litigation hold must be issued.  Ensure an adequate process and memorialize steps taken – and not taken.  My two cents: “Memorialize” is the key word, because an organization that has a defined process and the documentation to back it up is much more likely to be given leeway in the courts than a company that doesn’t document its decisions.

7. Build the Three-Legged Stool: A successful eDiscovery approach involves knowledgeable people, great technology, and up-to-date written protocols.  My two cents: Up-to-date written protocols are the first thing to slide when people get busy – don’t let it happen.

6. Preserve, Protect, Defend: Your techs need the knowledge to avoid altering metadata, maintain chain-of-custody information and limit access to a working copy for processing and review.  My two cents: A good review platform will assist greatly in all three areas.

5. Natives Need Not Make You Restless: Consider agreeing to exchange produced files in their original (“native”) formats to avoid the huge out-of-pocket costs of converting thousands of files to image format.  My two cents: Be sure to address how redactions will be handled, as some parties prefer to image redacted documents while others prefer to agree to alter the natives to obscure that information.

4. Get M.A.D.?  Then Get Even: Apply the Mutually Assured Destruction (M.A.D.) principle by agreeing with the other side to take costly volumes of data, such as digital voicemails and backup data created down the road, off the table.  My two cents: That assumes, of course, that both sides have comparable volumes of data.  If one party has a lot more data than the other, the party with less data may have no incentive to agree to such concessions.

3. Cooperate to Cull Aggressively and to Preserve Clawback Rights: Setting expectations regarding culling efforts and reaching a clawback agreement with opposing counsel enables each side to cull more aggressively to reduce eDiscovery costs.  My two cents: Some parties will agree on search terms up front while others will feel that gives away case strategy, so the level of cooperation may vary from case to case.

2. QA/QC: Employ Quality Assurance (QA) tests throughout review to ensure a high accuracy rate, then perform Quality Control (QC) testing before the data goes out the door, building time into the schedule for that QC testing.  Also, consider involving a search-methodology expert.  My two cents: I cannot stress that last point enough – the ability to illustrate how you got from the large collection set to the smaller production set will be imperative when responding to any objections you may encounter to the produced set.

1. Never Drop Your Laptop Bag and Run: Dig in, learn as much as you can and start building repeatable, efficient approaches.  My two cents: It’s the duty of your attorneys and providers to demonstrate competency in eDiscovery best practices.  How will you know whether they have done so unless you develop that competency yourself?

So, what do you think?  Are there other ways for CEOs to avoid eDiscovery nightmares?   Please share any comments you might have or if you’d like to know more about a particular topic.

eDiscovery Best Practices: Message Thread Review Saves Costs and Improves Consistency

Insanity is doing the same thing over and over again and expecting a different result.  But, in ESI review, it can be even worse when you get a different result.

One of the biggest challenges when reviewing ESI is identifying duplicates so that your reviewers aren’t reviewing the same files again and again.  Not only does that drive up costs unnecessarily, but it could lead to problems if the same file is categorized differently by different reviewers (for example, inadvertent production of a duplicate of a privileged file if it is not correctly categorized).

Of course, there are a number of ways to identify duplicates.  Exact duplicates (files with exactly the same content in the same file format) can be identified through hash values, which serve as a digital fingerprint of the content of the file.  MD5 and SHA-1, the most popular hashing algorithms, can identify exact duplicates of a file so that they can be removed from the review population.  Since the same emails are often sent to multiple parties and the same files are stored on different drives, deduplication through hashing can save considerable review costs.
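
To make the mechanics concrete, here is a minimal Python sketch of hash-based deduplication (the collection folder and chunk size are hypothetical, and real processing tools add refinements such as email-level hashing and metadata comparison):

  import hashlib
  from pathlib import Path

  def file_hash(path, algorithm="md5"):
      # Hash the file's contents in chunks so large files don't exhaust memory
      h = hashlib.new(algorithm)
      with open(path, "rb") as f:
          for chunk in iter(lambda: f.read(65536), b""):
              h.update(chunk)
      return h.hexdigest()

  def deduplicate(paths):
      # Keep the first file seen for each unique hash; flag the rest as duplicates
      unique, duplicates = {}, []
      for path in paths:
          digest = file_hash(path)
          if digest in unique:
              duplicates.append((path, unique[digest]))
          else:
              unique[digest] = path
      return unique, duplicates

  files = (p for p in Path("./collection").rglob("*") if p.is_file())
  unique, dupes = deduplicate(files)
  print(f"{len(unique)} unique files; {len(dupes)} duplicates removed from review")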

Sometimes, files are not exact duplicates but contain the same (or almost the same) information.  One example is a Word document published to an Adobe PDF file – the content is the same, but the file format is different, so the hash value will be different.  Near-deduplication can be used to identify files where most or all of the content matches so they can be verified as duplicates and eliminated from review.
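
Vendors implement near-deduplication in different ways, but the core idea can be sketched with a simple text-similarity measure.  Here is a deliberately simplified illustration using Jaccard similarity over word “shingles” (real tools use more sophisticated algorithms):

  import re

  def shingles(text, size=5):
      # Normalize the extracted text and break it into overlapping word sequences
      words = re.findall(r"[a-z0-9]+", text.lower())
      return {tuple(words[i:i + size]) for i in range(max(1, len(words) - size + 1))}

  def similarity(text_a, text_b):
      # Jaccard similarity of the two shingle sets: 1.0 means identical content
      a, b = shingles(text_a), shingles(text_b)
      return len(a & b) / len(a | b) if a and b else 0.0

Pairs scoring above a chosen threshold (say, 0.9) become near-duplicate candidates to verify and suppress from review – a Word document and the PDF published from it would score near 1.0 even though their hash values differ.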

Then, there is message thread analysis.  Most email messages are part of a larger discussion, which could be just between two parties or include a number of parties.  Reviewing each email in the discussion thread individually would result in much of the same information being reviewed over and over again.  Instead, message thread analysis pulls those messages together and enables them to be reviewed as an entire discussion, including any side conversations within the discussion that may or may not be related to the original topic (e.g., a side discussion about lunch plans or last night’s episode of American Idol).

FirstPass®, powered by Venio FPR™, is one example of an application that provides a mechanism for message thread analysis of Outlook emails that pulls the entire thread into one conversation for review as one big “tree”.  The “tree” representation gives you the ability to see all of the conversations within the discussion and focus your review on the last emails in each conversation to see what is said without having to review each email.  Side conversations are “branches” of the tree and FirstPass enables you to tag individual messages, specific branches or the entire tree as responsive, non-responsive, privileged or some other designation.  Also, because of the way that Outlook tracks emails in the thread, FirstPass identifies messages that are missing from the collection with a red X, enabling you to investigate and determine if additional collection is needed and avoiding potential spoliation claims.
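
FirstPass’s internals are proprietary, but the general mechanics of thread reconstruction can be illustrated with standard email headers: each message carries a Message-ID, and each reply names its parent in an In-Reply-To header.  A minimal sketch (assuming messages are available as raw RFC 822 text):

  from email import message_from_string
  from collections import defaultdict

  def build_threads(raw_messages):
      # Index every message by its Message-ID header
      messages = {}
      for raw in raw_messages:
          msg = message_from_string(raw)
          messages[msg["Message-ID"]] = msg
      # Attach each reply to its parent, forming the branches of the "tree"
      roots, children, missing = [], defaultdict(list), set()
      for msg_id, msg in messages.items():
          parent_id = msg["In-Reply-To"]
          if parent_id is None:
              roots.append(msg_id)              # starts a new conversation
          elif parent_id in messages:
              children[parent_id].append(msg_id)
          else:
              missing.add(parent_id)            # referenced but never collected
              roots.append(msg_id)              # treat the orphan as a thread start
      return roots, children, missing

The missing set here plays the same role as FirstPass’s red X: a message that the thread references but that was never collected, which may warrant additional collection.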

With message thread analysis, you can minimize review of duplicative information within emails, saving time and cost and ensuring consistency in the review.

So, what do you think?  Does your review tool support message thread analysis?   Please share any comments you might have or if you’d like to know more about a particular topic.

eDiscovery Case Law: Completing Production AFTER Trial is Too Late

In DL v. District of Columbia, No. 05-1437 (RCL) (D.D.C. May 9, 2011), repeated, flagrant, and unrepentant failures of the District of Columbia to comply with discovery orders, failure to supplement discovery responses, and eventual production of thousands of e-mails—some more than two years old—after the date of trial resulted in a sanction of waiver of privilege over documents sought by plaintiffs.

Plaintiffs filed an action seeking injunctive and declaratory relief for the failure of the District of Columbia Government to provide them with a free appropriate public education as required under the Individuals with Disabilities Education Act.  On the first day of trial six years later, counsel for the District acknowledged that the District had begun, several days earlier, a rolling production of thousands of emails per day that was expected to continue through the end of trial.  Counsel for the District stated that the court had not been informed of the production problems because it had been hoped that review of the documents for relevance and privilege, and thus production of the documents, could be completed earlier.  From the bench, the court ordered the District to produce all of the email without objection and with privilege waived within one week of the end of the trial so that plaintiffs could seek to supplement the trial record if necessary.  The District sought reconsideration of the order.

Likening the District’s posture to an airplane with landing gear that deploys only after touchdown, the court denied the District’s motion. Waiver of privilege was an appropriate sanction because it was just and was proportional between offense and sanction, considering the District’s violation of multiple discovery orders and failure to meet its obligation to supplement its discovery responses. The court concluded that its sanction was justified considering prejudice to plaintiffs, prejudice to the judicial system, and the need to deter similar misconduct in the future. Since the District chose not to bring the situation to the court’s attention until the day of trial, the court “had no practical alternative short of entering a default.”

The court held that whether the District had acted in good faith and whether plaintiffs also had committed discovery violations was irrelevant:

Whether the District made a good-faith effort to produce all responsive e-mails before the trial is irrelevant. As explained above, it was not sanctioned for failing to make a good-faith effort. It was sanctioned for openly, continuously, and repeatedly violating multiple Court orders, failing to adhere to or even acknowledge the existence of the Federal Rules’ discovery framework, and committing a discovery abuse so extreme as to be literally unheard of in this Court. The Rules require more than simply making a good-faith effort to produce documents. They require adherence to a very precise framework for navigating the discovery process. Moreover, the duty to adhere to clear Court orders is among a lawyer’s most basic. Were it not for those two directives—the Federal Rules’ discovery framework and Court orders regarding discovery — discovery would devolve into pure bedlam. Disciplined adherence to those Rules and Orders on the part of courts as well as parties is the only tool our system has to wrangle the whirlwind as it were and tame an otherwise unmanageable part of the litigation process. A good-faith effort to produce documents in the absence of adherence to Court orders and the Federal Rules is useless.

So, what do you think?  Have you ever had opposing counsel try to produce documents at the beginning of trial – or even after?  Please share any comments you might have or if you’d like to know more about a particular topic.

Case Summary Source: Applied Discovery (free subscription required).  For eDiscovery news and best practices, check out the Applied Discovery Blog here.

Disclaimer: The views represented herein are exclusively the views of the author, and do not necessarily represent the views held by CloudNine Discovery. eDiscoveryDaily is made available by CloudNine Discovery solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscoveryDaily should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.

eDiscovery Trends: Email Footers Give Privilege Searches the Boot

This communication (including any attachments) is intended for the use of the intended recipient(s) only and may contain information that is confidential, privileged or legally protected.  Any unauthorized use or dissemination of this communication is strictly prohibited.  If you have received this communication in error, please immediately notify the sender by return e-mail message and delete all copies of the original communication. Thank you for your cooperation.

This is an example of a standard email disclaimer, often automated to appear in the footer of outgoing emails to disclaim liability.  Many organizations add disclaimers to their emails in an attempt to protect themselves from legal threats such as breach of confidentiality or inadvertent disclosure of privileged information.

However, when it comes time to collect and search email collections for confidentiality and privilege, these email footers can wreak havoc with those searches.  Searches for the words “confidential” or “privileged” are essentially rendered useless, as they will retrieve every email that contains the disclaimer footer.

So, what to do?

One way to address the issue is to identify any other variations of words and phrases that might imply privilege.  Searching for phrases like “attorney client” or “attorney work product” – provided those phrases are not in the footer – may identify many of the privileged files.

Another way is to shift your search focus to the names of individuals likely to conduct privileged communications, such as the attorneys communicating with the organization.  You may not know the names of all of the attorneys, but a search for the domains associated with the outside counsel firms should identify messages sent to or from those domains – and, with them, the names of the individuals involved.
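
A simple pass over the collection’s address headers illustrates the idea – note that the domain names below are hypothetical placeholders for your outside counsel firms:

  import re

  COUNSEL_DOMAINS = {"examplefirm.com", "counselllp.com"}  # hypothetical domains

  def involves_counsel(header_text):
      # Pull the domain out of every address in the From/To/Cc header text
      domains = re.findall(r"[\w.+-]+@([\w.-]+)", header_text)
      return any(d.lower() in COUNSEL_DOMAINS for d in domains)

Running a check like this across the collection both flags potentially privileged messages and surfaces the individual attorney names appearing at those domains.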

Even so, searching for the term “privileged” may still be the best way to ensure that you find all of the potentially privileged files, and one of our readers, Mark Lyon, identified a better way to do it that I sheepishly admit I did not think of late last night when I wrote this, so I had to amend this post to include it (a first!).  Identify the various footers in use within at least the main companies included in the collection, then exclude those entire footers from the index so they don’t fill up your search results.  Another reader, Joe Howie, has discussed in more detail an approach for removing those footers from the index.  Thanks to both Mark and Joe for keeping me on my toes!  🙂
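
A minimal sketch of that approach, assuming you’ve already gathered the footer text in use at the main companies in the collection (the footer string below is truncated for illustration):

  # Known disclaimer footers, one entry per company or footer variant
  FOOTERS = [
      "This communication (including any attachments) is intended for the use of",
  ]

  def strip_footers(body):
      # Cut each message body at the start of any known footer, so that terms
      # like "privileged" and "confidential" are indexed only where they appear
      # in substantive text
      for footer in FOOTERS:
          idx = body.find(footer)
          if idx != -1:
              body = body[:idx]
      return body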

So, what do you think?  Are email disclaimer footers making your privileged searches more complicated?  Please share any comments you might have or if you’d like to know more about a particular topic.

eDiscovery Case Law: Facebook Did Not Deduce That They Must Produce

In this case, United States Magistrate Judge Howard Lloyd of the Northern District of California compelled Facebook to re-produce, in searchable native format, ESI that had previously been produced in a converted, non-searchable format, and further ordered Facebook not to use a third-party vendor’s online production software to merely “provide access” to it.  The court’s order granting the plaintiffs’ Motion To Compel Production in In re Facebook PPC Advertising Litigation, 2011 WL 1324516 (N.D.Cal. Apr. 6, 2011) addressed the importance of ESI Protocols, the requirement to produce ESI in native formats, and production of documents versus merely providing access to them.  A copy of the order can be found here.

Several plaintiffs brought a class action against Facebook for breach of contract and violation of California’s Unfair Competition Law, suing Facebook for allegedly misrepresenting the quality of its “click filters,” which are filters used to prevent charging merchants when advertisements are inadvertently clicked.  When discovery disputes occurred, plaintiffs filed their Motion To Compel, alleging:

  1. Facebook refused to agree to an ESI Protocol to establish the manner and form of electronic production, including agreement on search words or phrases, custodians and time frames for production.
  2. Facebook uploaded its responses to discovery requests to a commercial website (Watchdox.com) in a manner that seriously limited the plaintiffs’ ability to review them.  Documents on Watchdox.com could not be printed, and Facebook, citing confidentiality concerns, retained the ability to cause documents to expire and no longer be accessible after a period of time.
  3. The documents loaded to Watchdox.com, as well as others that were actually produced, were not in their native format, and thus were unsearchable and unusable.  One such document was an 18,000-page customer complaint database printed to PDF, which lacked any searchable features.

With regard to the refusal to agree to an ESI protocol, Facebook argued that such a protocol would result in “forcing the parties to anticipate and address all potential issues on the form of electronic production” and “would likely have the result of frustrating and slowing down the discovery process.”  The court rejected this argument, noting that “The argument that an ESI Protocol cannot address every single issue that may arise is not an argument to have no ESI Protocol at all.”

In reviewing Facebook’s production protocol, the Court noted that “each of these steps make the discovery process less efficient without providing any real benefit” and found Facebook’s privacy concerns unreasonable, since a two-tiered protective order already existed in the case and confidential documents could be marked as such to prevent inadvertent disclosures.  The Court held that Facebook’s use of Watchdox.com was unduly burdensome on the plaintiffs and ordered Facebook to produce any documents that had been uploaded to Watchdox.com in their native searchable formats.  The Court also ordered Facebook to reproduce, in native searchable format, previously produced documents that had been provided in an unsearchable format.

So, what do you think?  Is merely providing access to documents sufficient for production?  Please share any comments you might have or if you’d like to know more about a particular topic.

eDiscovery Best Practices: 4 Steps to Effective eDiscovery With Software Analytics

I read an interesting article from Texas Lawyer via Law.com entitled “4 Steps to Effective E-Discovery With Software Analytics” that has some interesting takes on project management principles related to eDiscovery, and I’ve interjected some of my thoughts into the analysis below.  A copy of the full article is located here.  The steps are as follows:

1. With the vendor, negotiate clear terms that serve the project's key objectives.  The article notes the importance of tying each collection and review milestone (e.g., collecting and imaging data; filtering data by file type; removing duplicates; processing data for review in a specific review platform; processing data to allow for optical character recognition (OCR) searching; and converting data into a tag image file format (TIFF) for final production to opposing counsel) to contract terms with the vendor.

The specific milestones will vary – for example, conversion to TIFF may not be necessary if the parties agree to a native production – so it’s important to know the size and complexity of the project, and choose only an experienced eDiscovery vendor who can handle the variations.

2. Collect and process data.  Forensically sound data collection and culling of obviously unresponsive files (such as system files) to drastically decrease the overall review costs are key services that a vendor provides in this area.  As we’ve noted many times on this blog, effective culling can save considerable review costs – each gigabyte (GB) culled can save $16-$18K in attorney review costs.
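
To make that concrete: culling a 100 GB collection down to 40 GB for review would, at those (admittedly rough) rates, avoid somewhere between $960K and $1,080K in attorney review costs for the 60 GB removed.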

The article notes that a hidden cost is the OCR process of converting image-based documents into searchable text, and that it’s an optimal negotiation point with the vendor.  This may have been true when most collections were paper-based, but as most collections today are electronic, the percentage of documents requiring OCR is considerably smaller than it used to be.  However, it is important to be prepared for the fact that some native files will be “image only”, such as TIFFs and scanned PDFs – those will require OCR to be effectively searched.

3. Select a data and document review platform.  Factors such as ease of use, robustness, and reliability of analytic tools, support staff accessibility to fix software bugs quickly, monthly user and hosting fees, and software training and support fees should be considered when selecting a document review platform.

The article notes that a hidden cost is selecting a platform with which the firm’s litigation support staff has no experience as follow-up consultation with the vendor could be costly.  This can be true, though a good vendor training program and an intuitive interface can minimize or even eliminate this component.

The article also notes that to take advantage of the vendor’s more modern technology “[a] viable option is to use a vendor's review platform that fits the needs of the current data set and then transfer the data to the in-house system”.  I’m not sure why the need exists to transfer the data back – there are a number of vendors that provide a cost-effective solution appropriate for the duration of the case.

4. Designate clear areas of responsibility.  By doing so, you minimize or eliminate inefficiencies in the project and the article mentions the RACI matrix to determine who is responsible (individuals responsible for performing each task, such as review or litigation support), accountable (the attorney in charge of discovery), consulted (the lead attorney on the case), and informed (the client).

Managing these areas of responsibility effectively is probably the biggest key to project success and the article does a nice job of providing a handy reference model (the RACI matrix) for defining responsibility within the project.

So, what do you think?  Do you have any specific thoughts about this article?   Please share any comments you might have or if you’d like to know more about a particular topic.

eDiscovery Trends: Sedona Conference Database Principles

A few months ago, eDiscovery Daily posted about discovery of databases and how few legal teams understand database discovery and know how to handle it.  We provided a little pop quiz to test your knowledge of databases, with the answers here.

Last month, The Sedona Conference® Working Group on Electronic Document Retention & Production (WG1) published the Public Comment Version of The Sedona Conference® Database Principles – Addressing the Preservation & Production of Databases & Database Information in Civil Litigation to provide guidance and recommendations to both requesting and producing parties to simplify discovery of databases and information derived from databases.  You can download the publication here.

As noted in the Executive Overview of the publication, some of the issues that make database discovery so challenging include:

  • More enterprise-level information is being stored in searchable data repositories, rather than in discrete electronic files,
  • The diverse and complicated ways in which database information can be stored have made it difficult to develop universal “best-practice” approaches to requesting and producing information stored in databases,
  • Retention guidelines that make sense for archival databases (databases that add new information without deleting past records) rapidly break down when applied to transactional databases where much of the system’s data may be retained for a limited time – as short as thirty days or even thirty seconds.

The commentary is broken into three primary sections:

  • Section I: Introduction to databases and database theory,
  • Section II: Application of The Sedona Principles, designed for all forms of ESI, to discovery of databases,
  • Section III: Proposal of six new Principles that pertain specifically to databases with commentary to support the Working Group’s recommendations.  The principles are stated as follows:
    • Absent a specific showing of need or relevance, a requesting party is entitled only to database fields that contain relevant information, not the entire database in which the information resides or the underlying database application or database engine.
    • Due to differences in the way that information is stored or programmed into a database, not all information in a database may be equally accessible, and a party’s request for such information must be analyzed for relevance and proportionality.
    • Requesting and responding parties should use empirical information, such as that generated from test queries and pilot projects, to ascertain the burden to produce information stored in databases and to reach consensus on the scope of discovery.
    • A responding party must use reasonable measures to validate ESI collected from database systems to ensure completeness and accuracy of the data acquisition.
    • Verifying information that has been correctly exported from a larger database or repository is a separate analysis from establishing the accuracy, authenticity, or admissibility of the substantive information contained within the data.
    • The way in which a requesting party intends to use database information is an important factor in determining an appropriate format of production.

To submit a public comment, you can download a public comment form here, complete it and fax (yes, fax) it to The Sedona Conference® at 928-284-4240.  You can also email a general comment to them at tsc@sedona.net.

eDiscovery Daily will be delving into this document in more detail in future posts.  Stay tuned!

So, what do you think?  Do you have a need for guidelines for database discovery?   Please share any comments you might have or if you’d like to know more about a particular topic.