Production

Defendant Ordered to Produce Archived Emails Even Though Plaintiff Failed to Produce Its Own – eDiscovery Case Law

In Finjan, Inc. v. Blue Coat Systems, Inc., No. 5:13-cv-03999-BLF (N.D. Cal. Oct. 17, 2014), California Magistrate Judge Paul S. Grewal granted the plaintiff’s motion to compel, ordering the defendant to produce relevant emails from its eight custodians even though the plaintiff was unable to produce its own archival emails.

Case Background

As Judge Grewal put it, “[t]o cut to the chase in this dispute over the scope and pace of Defendant Blue Coat Systems, Inc.’s document production in this patent infringement case”, the plaintiff moved to compel the defendant to produce email from eight custodians related to both technical and damages documents, as well as damages testimony.  The defendant did not object to producing any of the technical discovery requested and raised only limited issues concerning the damages documents, mostly objecting to producing custodial email from archival systems when the plaintiff was unable to do the same in return.

Each party agreed to identify eight custodians and ten terms per custodian for the other to search. The defendant did not dispute the relevance of either the custodians or search terms the plaintiff selected. But when the defendant learned that the plaintiff did not have former employees’ emails — except as produced in other litigations — the defendant balked at the idea that its custodians should have to turn over any email other than from active systems.

Judge’s Ruling

“Reduced to its essence, Rule 26(b)(2)(iii) requires this court to decide: have Blue Coat’s discovery responses been fair? Blue Coat’s discovery responses so far have largely been fair, but not entirely”, stated Judge Grewal.

Judge Grewal found that, with the exception of one document repository recently discovered (as acknowledged by defendant’s counsel), the defendant had completed its obligation regarding the technical document production.  Judge Grewal also ruled that the plaintiff “has identified no legitimate reason why it should be provided discovery on Blue Coat’s foreign sales or valuation on the whole”.  He also stated that the defendant “might reasonably be required to at least tell Finjan what the [third party] agreements are and the status of its efforts to secure consent.”

However, with regard to the archival email, Judge Grewal ruled as follows:

“Where Blue Coat has been less than fair is with respect to archival email for its eight custodians. Blue Coat may largely be in the right that it should not have to dig through legacy systems when Finjan is unable to do the same for its custodians. But one party’s discovery shortcomings are rarely enough to justify another’s. And here, at least with respect to documents mentioning Finjan — the one specific category of documents Finjan could identify that it needed from archived email — Finjan’s request is reasonable.”

As a result, the defendant was ordered to “identify all license agreements whose production is awaiting third-party consent and the status of its efforts to secure that consent” within seven days and “produce all archival email from its eight designated custodians that mention Finjan and supplement Interrogatories 5 and 6” within 21 days.

So, what do you think?  Should the defendant have to produce email when the plaintiff can’t do the same?  Please share any comments you might have or if you’d like to know more about a particular topic.

Disclaimer: The views represented herein are exclusively the views of the author, and do not necessarily represent the views held by CloudNine Discovery. eDiscoveryDaily is made available by CloudNine Discovery solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscoveryDaily should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.

The Importance of Metadata – eDiscovery Best Practices

This topic came up today with a client that wanted information to help justify the request for production from opposing counsel in native file format, so it’s worth revisiting here…

If an electronic document is a “house” for information, then metadata could be considered the “deed” to that house. There is far more to explaining a house than simply the number of stories and the color of trim. It is the data that isn’t apparent to the naked eye that tells the rest of the story. For a house, the deed spells out the name of the buyer, the financier, and the closing date, among heaps of other information that form the basis of the property. For an electronic document, it’s not just the content or formatting that holds the key to understanding it. Metadata, which is data about the document, contains information such as the user who created it, the creation date, the edit history, and the file type. Metadata often tells the rest of the story about the document and, therefore, is often a key focus of eDiscovery.

There are many different types of metadata and it is important to understand each with regard to requesting that metadata in opposing counsel productions and being prepared to produce it in your own productions.  Examples include:

  • Application Metadata: This is the data created by an application, such as Microsoft® Word, that pertains to the ESI (“Electronically Stored Information”) being addressed. It is embedded in the file and moves with it when copied, though copying may alter the application metadata.
  • Document Metadata: These are properties about a document that may not be viewable within the application that created it, but can often be seen through a “Properties” view (for example, Word tracks the author name and total editing time).
  • Email Metadata: Data about the email.  Sometimes, this metadata may not be immediately apparent within the email application that created it (e.g., date and time received). The amount of email metadata available varies depending on the email system utilized.  For example, Outlook has a metadata field that links messages in a thread together which can facilitate review – not all email applications have this data.
  • Embedded Metadata: This metadata is usually hidden; however, it can be a vitally important part of the ESI. Examples of embedded metadata are edit history or notes in a presentation file. These may only be viewable in the original, native file since it is not always extracted during processing and conversion to an image format.
  • File System Metadata: Data generated by the file system, such as Windows, to track key statistics about the file (e.g., name, size, location, etc.) which is usually stored externally from the file itself.
  • User-Added Metadata: Data created by a user while working with, reviewing, or copying a file (such as notes or tracked changes).
  • Vendor-Added Metadata: Data created and maintained by an eDiscovery vendor during processing of the native document.  Don’t be alarmed; it’s impossible to work with some file types without generating some metadata. For example, you can’t review and produce individual emails within a custodian’s Outlook PST file without extracting them as separate emails (either in Outlook MSG format or converted to an image format, such as TIFF or PDF).
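
To make the file system metadata category above concrete, here is a minimal Python sketch that reads a few fields stored by the operating system outside the file’s own content. The particular field selection is our illustrative choice, not a processing-tool standard:

```python
import os
import datetime

def file_system_metadata(path):
    """Collect a few file system metadata fields that live outside
    the file's content: name, size, and last-modified timestamp."""
    st = os.stat(path)
    return {
        "name": os.path.basename(path),
        "size_bytes": st.st_size,
        "modified": datetime.datetime.fromtimestamp(st.st_mtime).isoformat(),
    }
```

Note that simply copying or moving a file can change some of these values, which is one reason defensible collection processes preserve file system metadata rather than relying on it after the fact.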

Some metadata, such as user-added tracked changes or notes, could be work product that may affect whether a document is responsive or contains privileged information, so it’s important to consider that metadata during review, especially when producing in native format.

Here’s an example of one case where the production of metadata was ordered and an answer to the question “Is it Possible for a File to be Modified Before it is Created?” (you might be surprised at the answer).

So, what do you think? Have you been involved in cases where metadata was specifically requested as part of discovery? Please share any comments you might have or if you’d like to know more about a particular topic.


Plaintiff Can’t “Pick” and Choose When it Comes to Privilege of Inadvertent Disclosures – eDiscovery Case Law

In Pick v. City of Remsen, No. C 13-4041-MWB (N.D. Iowa Sept. 15, 2014), Iowa District Judge Mark W. Bennett upheld the magistrate judge’s order directing the destruction of an inadvertently produced privileged document, an email from defense counsel to some of the defendants, after affirming the magistrate judge’s application of the five-step analysis used to determine whether privilege was waived.

Case Background

In this wrongful termination case, the plaintiff served a request for production of documents that included “all relevant non-privileged emails initiated by or received by the City of Remsen in regard to the Plaintiff and/or any of the issues set forth in Plaintiff’s complaint”.  Among the documents produced was an email, dated July 14, 2012, from defense counsel to Remsen Utility Board members and others discussing an upcoming Utility Board meeting.  Defense counsel learned of the email’s inadvertent disclosure on March 25, 2014, when the plaintiff served supplemental discovery responses, and contacted plaintiff’s counsel within 34 minutes of that discovery.

Defense counsel asked that the email be destroyed. The plaintiff’s counsel suggested the email could be redacted to protect “advice relating to procedure,” but indicated he intended to rely on the remainder of the email unless ordered otherwise by the court.  The defendants filed a motion requesting that the court order the email’s destruction as an inadvertently produced privileged document, which the magistrate judge granted.

Judge’s Ruling

Magistrate Judge Leonard Strand had applied the five-step analysis used to determine whether the privilege was waived by the inadvertent disclosure.  The five factors were as follows:

  1. The reasonableness of the precautions taken to prevent inadvertent disclosure in view of the extent of document production: Judge Strand found that the privileged email was “inconspicuously located among various non-privileged email messages” – a finding Judge Bennett upheld as “completely fair and accurate”, given that the defendants turned over to their counsel 440 pages of documents (including 183 pages of email messages, some pages of which contained more than one email);
  2. The number of inadvertent disclosures: Since there was only one inadvertent disclosure, Judge Bennett upheld the ruling as “not clearly erroneous”;
  3. The extent of the disclosures: Though the email was sent to six people, all six were privileged recipients of the email, so Judge Bennett upheld the ruling as “not clearly erroneous”;
  4. The promptness of measures taken to rectify the disclosure: Because defense counsel contacted plaintiff’s counsel just 34 minutes after learning of the email’s inadvertent disclosure and requested its destruction, Judge Bennett upheld the ruling as “not clearly erroneous”; and
  5. Whether the overriding interest of justice would be served by relieving the party of its error: Judge Strand, finding that the plaintiff “clearly has other evidence that he intends to rely on in support of his various claims”, ruled in favor of the defendant in this factor as well, which Judge Bennett upheld.

Judge Bennett summarized as follows: “The email is classic legal advice that should be protected by the attorney-client privilege…This interest of justice would be harmed here by permitting Pick to use the email at trial…Given the important nature of the attorney-client privilege and the manner in which the email was inadvertently disclosed, Judge Strand’s conclusion that the overriding interest in justice factor weighed against waiver is not clearly erroneous. Accordingly, Pick’s objection is overruled.”

So, what do you think?  Did defense counsel’s quick reaction to the disclosure save the email’s privileged status?  Please share any comments you might have or if you’d like to know more about a particular topic.


Judgment of $34 Million against Insurer Dodging Malpractice Claim is a “Dish” Served Cold – eDiscovery Case Law

In my hometown of Houston, attempting to deny coverage to a client successfully sued for discovery-related negligence cost OneBeacon Insurance Company a $34 million judgment by a federal jury.

As reported by Robert Hilson in ACEDS (Insurer dodging malpractice claim must pay $34 million for attorney’s discovery blunder, subscription required), OneBeacon Insurance Company attempted to rescind a policy from T. Wade Welch & Associates (TWW), the Houston-based law firm that was sued by former client Dish Network in 2007 after one of its attorneys provided incomplete discovery responses to Dish’s adversary. That failure resulted in so-called “death penalty” sanctions against Dish in a contractual interference case brought by Russian Media Group (RMG), which accepted a $12 million settlement.

After Dish won an arbitration award against TWW for that amount last year, OneBeacon sued TWW in federal court seeking a declaration that the law firm had voided coverage on its policy by failing to disclose the Dish sanctions prior to entering into that policy. OneBeacon alleged that TWW should have known that those penalties in the Dish case would give rise to a malpractice claim, which would trigger a so-called “prior knowledge exclusion” in the insurance policy that rescinds coverage. It also accused the firm of making misrepresentations on its policy application.

However, a jury in the US District Court for the Southern District of Texas begged to differ, finding that:

  • OneBeacon failed to move for a swift settlement with Dish when its liability had become clear; and
  • OneBeacon’s failure to settle the Dish claim amounted to gross negligence.

The jury assessed damages as follows:

  • TWW’s lost profits sustained in the past: $3 million;
  • TWW’s lost profits that, in reasonable probability, it will sustain in the future: $5 million;
  • Sum assessed in exemplary damages for OneBeacon’s gross negligence: $5 million;
  • Sum assessed in exemplary damages because OneBeacon’s conduct was committed knowingly: $7.5 million.

TOTAL: $20.5 million.  Plus, although the OneBeacon policy had a $5 million limit, the “Stowers Doctrine,” which holds that an insurer undertaking the defense of an insured has the obligation to make a good faith attempt to settle the insured’s claim within those policy limits, additionally put the company on the hook for the entire $12.6 million arbitration settlement.  Ouch!

This was after US District Judge Gray Miller, in June, denied summary judgment to OneBeacon, finding that it was not clear from the evidentiary record whether TWW attorneys should have reasonably foreseen the malpractice claim that eventually arose from the Dish sanctions.

For more information on the case, including the jury’s verdict, click here (subscription required).

So, what do you think?  Should OneBeacon have been “on the hook” for the settlement amount or should the “prior knowledge exclusion” have excluded them?  Please share any comments you might have or if you’d like to know more about a particular topic.


Despite 18 Missing Emails in Production, Court Denies Request for “Discovery on Discovery” – eDiscovery Case Law

In Freedman v. Weatherford Int’l, 12 Civ. 2121 (LAK) (JCF) (S.D.N.Y. Sept. 12, 2014), New York Magistrate Judge James C. Francis, IV denied the plaintiff’s request to, among other things, require the defendant to produce “certain reports comparing the electronic search results from discovery in this action to the results from prior searches” – despite the fact that the plaintiff identified 18 emails that the defendant did not produce that were ultimately produced by a third party.

Case Background

In this securities fraud class action, Judge Francis had previously denied three motions to compel by the plaintiffs seeking production of “(1) ‘certain reports comparing the electronic search results from discovery in this action to the results from prior searches’; (2) ‘documents concerning an investigation undertaken by [the] Audit Committee’ of [the] defendant…; and (3) ‘documents concerning an investigation undertaken by the law firm Latham & Watkins LLP’.”  In denying the motions, Judge Francis stated that “Although I recognized that such ‘discovery on discovery’ is sometimes warranted, I nevertheless denied the request because the plaintiffs had not ‘proffered an adequate factual basis for their belief that the current production is deficient.’”

However, Judge Francis granted reconsideration and asked for further briefing on the second item, based on the plaintiffs’ presentation of “new evidence, unavailable at the time [they] filed their [earlier] motion, which allegedly reveals deficiencies in [Weatherford’s] current production.”

Eighteen Missing Emails

The new evidence referenced by the plaintiffs consisted of 18 emails from “critical custodians at Weatherford” that were produced (after briefing on the original motion to compel was complete) not by the defendants but by a third party, causing the plaintiffs to contend that Weatherford’s production was “significantly deficient.”  The plaintiffs contended that providing them with a “report of the documents ‘hit’” by the search terms used in connection with the Latham and Audit Committee investigations would identify additional relevant documents that had not been produced.

Judge’s Ruling

However, Judge Francis disagreed, stating “the suggested remedy is not suited to the task. The plaintiffs admit that of those 18 e-mails only three, at most, would have been identified by a search using the terms from the investigations.”  He also cited Da Silva Moore, noting that “[T]he Federal Rules of Civil Procedure do not require perfection…Weatherford has reviewed “millions of documents [] and [produced] hundreds of thousands,” comprising “nearly 4.4 million pages,” in this case…It is unsurprising that some relevant documents may have fallen through the cracks. But, most importantly, the plaintiffs’ proposed exercise is unlikely to remedy the alleged discovery defects. In light of its dubious value, I will not require Weatherford to provide the requested report.”

So, what do you think?  Was the decision justified or should the defendant have been held to a higher standard?  Please share any comments you might have or if you’d like to know more about a particular topic.


Text Overlays on Image-Only PDF Files Can Be Problematic – eDiscovery Best Practices

Recently, we at CloudNine Discovery received a set of Adobe PDF files from a client that raised an issue regarding the handling of those files for searching and reviewing purposes.   The issue serves as a cautionary tale for those working with image-only PDFs in their document collection.  Here’s a recap of the issue.

The client was using OnDemand Discovery®, which is our new Client Side add-on to OnDemand® that allows clients to upload their own native data for automated processing and loading into new or existing projects.  The collection was purported to consist mostly of image-only PDF files.  PDF files are created in two ways:

  1. By saving or printing from applications to a PDF file: Many applications, such as Microsoft Office applications like Word, Excel and PowerPoint, provide the ability to save the document or spreadsheet that you’ve created to a PDF file, which is common when you want to “publish” the document.  If the application you’re using doesn’t provide that option, you can print the document to PDF using any of several PDF printer drivers available (some of which are free).  These PDFs that are created usually include the text of the file from which the PDF was created.
  2. By scanning or otherwise creating an image to a PDF file: Typically, this occurs either by scanning hard copy documents to PDF or through some sort of receipt in an image-only PDF form (such as through fax software).  These PDFs that are created are images and do not include the text of the document from which they came.

Like many processing tools, such as LAW PreDiscovery®, OnDemand Discovery is programmed to handle PDF files by extracting the embedded text if present or, if not, performing OCR on the files to capture text from the image.  Text from the file is always preferable to OCR text because it’s far more accurate, which is why OCR is typically performed only on PDF files that lack embedded text.
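
The extract-or-OCR routing rule described above can be sketched in a few lines of Python. This is purely illustrative: the character threshold and the `ocr_fn` callable are our assumptions, not how OnDemand Discovery or LAW PreDiscovery actually implement the decision:

```python
def choose_text_source(extracted_text, ocr_fn, min_chars=25):
    """Prefer text embedded in the PDF when it looks substantive;
    fall back to OCR when the file appears to be image-only.
    `ocr_fn` is a hypothetical callable that OCRs the file and
    returns the recognized text."""
    text = (extracted_text or "").strip()
    if len(text) >= min_chars:
        return text, "embedded"   # extractable text found; skip OCR
    return ocr_fn(), "ocr"        # little or no text; treat as image-only
```

Any rule keyed to “is there extractable text?” can be fooled by a text overlay stamped onto an image-only file, which is exactly the problem the client ran into.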

After the client loaded their data, we did a spot Quality Control check (like we always do) and discovered that the text for several of the documents only consisted of Bates numbers.

Why?

Because the Bates numbers were added as text overlays to the pre-existing image-only PDF files.  When the processing software examined each file, it found extractable text, so it extracted that text instead of OCRing the file.  In effect, adding the Bates numbers as text overlays meant the files were no longer image-only PDFs, so the content portion of the text was never captured and wasn’t available for indexing and searching.  These documents were essentially rendered non-searchable even after processing.
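
A spot quality-control check like the one described can be automated with a simple heuristic: flag documents whose extracted text consists of nothing but Bates-number-like tokens. The sketch below is hypothetical; the prefix-plus-digits pattern is our assumption, and real Bates formats vary:

```python
import re

# Matches tokens like "ABC0000123" or "TWW-000045": an uppercase
# prefix, an optional separator, then a run of digits. Illustrative only.
BATES_TOKEN = re.compile(r"^[A-Z]{2,10}[-_]?\d{4,10}$")

def bates_only_text(extracted_text):
    """Return True when every whitespace-separated token in the
    extracted text looks like a Bates number, suggesting the 'text'
    is a stamped overlay on an otherwise image-only PDF."""
    tokens = extracted_text.split()
    return bool(tokens) and all(BATES_TOKEN.match(t) for t in tokens)
```

Documents flagged this way are candidates for reprocessing (or, better, replacement from a backup made before the stamps were applied).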

How did this happen?  Likely through Adobe Acrobat’s Bates Numbering functionality, which is available in later versions of Acrobat (version 8 and higher).  It does exactly that – applies a text overlay Bates number to each page of the document.  Once that happens, eDiscovery processing software applications will not perform OCR on the image-only PDF.

What can you do about it?  If you haven’t applied Bates numbers on the files yet (or have a backup of the files before they were applied – highly recommended) and they haven’t been produced, you should process the files before putting Bates numbers on the images to ensure that you capture the most text available.  And, if opposing counsel will be producing any image-only PDF files, you will want to request the text as well (along with a load file) so that you can maximize your ability to search their production (of course, your first choice should be to receive native format productions whenever possible – here’s a link to an excellent guide on that subject).

If the Bates numbers have already been applied and you don’t have a backup of the files without them (oops!), you’re faced with additional processing charges to convert them to TIFF and perform OCR of both the text AND the Bates number – a totally unnecessary charge if you plan ahead.

So, what do you think?  Have you dealt with image-only PDF files with text overlaid Bates numbers?  Please share any comments you might have or if you’d like to know more about a particular topic.


Battle Continues between Attorneys and Client over Attorneys’ Failure to Review Documents – eDiscovery Case Law

In Price Waicukauski & Riley v. Murray, No. 1:10-cv-1065-WTL-TAB (S.D. Ind. Sept. 18, 2014), Indiana District Judge William T. Lawrence granted the plaintiff’s motion for summary judgment on its claim for over $125,000 in unpaid attorneys’ fees, but refused to grant summary judgment to either party on a legal malpractice claim arising from the plaintiff’s admitted failure to review documents produced in the defendants’ case against another party, because of a factual dispute regarding the plaintiff’s knowledge of the documents produced.

Case Background

This case was filed in August 2010 by the Plaintiff, Price Waicukauski & Riley, LLC, (“PWR”) against the Defendants, Dennis and Margaret Murray and DPM, Ltd. (“Murrays”), to recover $127,592.91 in attorneys’ fees owed to the plaintiff. The attorneys’ fees stem from the plaintiff’s representation of the defendants in a rather contentious lawsuit against Conseco that spanned more than six years, ultimately settling.  In November 2010, unhappy with the plaintiff’s representation, the defendants filed a counterclaim against the plaintiff alleging legal malpractice.

Legal Malpractice Claim of Breach of Duty of Loyalty

The defendants raised several allegations of legal malpractice against the plaintiff, including conflict of interest, failure to properly plead federal subject matter jurisdiction, and failure to take depositions and conduct discovery as they requested, among others.  One allegation related to the defendants’ claim that the plaintiff breached its duty of loyalty by failing (despite the defendants’ request that it do so) to review documents before production in that case – documents that revealed a Trust, set up on behalf of the Murrays, that wasn’t disclosed in interrogatory responses.  The plaintiff informed Mr. Murray that it had reviewed the documents; however, it ultimately admitted that it had not.  As a result of the misleading interrogatory responses, Conseco filed a motion for sanctions, which was granted, resulting in the defendants being ordered to pay over $85,000 in attorneys’ fees to their opponent’s lawyers.

The defendants claimed that the plaintiff knew about the Trust from the outset of its representation; however, the plaintiff (“falsely and underhandedly”, according to the defendants) represented to the magistrate judge assigned to the Underlying Litigation that it had no knowledge of the Trust until the defendants’ accountant produced the Murrays’ tax returns.

Judge’s Analysis and Ruling

The plaintiff referenced Niswander v. Price Waicukauski & Riley, LLC, where the court held that “[w]hether . . . the Plaintiffs’ attorneys had a duty to review the documents personally before producing them in discovery . . . is simply not something within the knowledge of a layperson.”  However, Judge Lawrence noted, “[t]he Murrays have no expert testimony on either of these two related claims; however, they claim they fall into the above-mentioned exception for when no expert testimony is needed: ‘when the question is within the common knowledge of the community as a whole or when an attorney’s negligence is so grossly apparent that a layperson would have no difficulty in appraising it.’”  Judge Lawrence also agreed with the defendants’ differentiation of Niswander from this case in that they specifically directed the plaintiff to review the documents, while the plaintiffs in Niswander did not.

Judge Lawrence also stated that “[t]he same rings true with the Murrays’ allegation that PWR falsely denied knowledge of the Trust… Again, the Court believes that it is well within the knowledge of a layperson that attorneys should not lie and falsely implicate their own clients in order to shield themselves from liability. Thus, the Court agrees that no expert testimony is needed on this claim regarding the standard of care.”

However, Judge Lawrence decided that “neither party is entitled to summary judgment on this issue” because “there is a factual dispute that precludes granting summary judgment on this claim. PWR maintains that it did not lie; it steadfastly maintains that it had no knowledge of the Trust at the outset of the litigation. Thus, when asked by the magistrate judge when it found out about the Trust, it informed her truthfully. Therefore, whether or not PWR breached a duty that caused injury to the Murrays depends on whom the jury believes: Mr. Murray or PWR.”

Ultimately, Judge Lawrence granted summary judgment for the plaintiff on the attorneys’ fees claim of $127,592.91 (which the defendants did not dispute, though they argued that “[t]he successful pursuit of [their] claims would effectively eliminate PWR’s claim”) and denied the defendants’ summary judgment request, but declined to enter a final judgment because outstanding claims remained against the plaintiff.  Judge Lawrence concluded his ruling by noting that “The only claims that remain to be tried are the proximate cause issue with regard to the federal subject matter jurisdiction claim and the allegation of malpractice committed by PWR as a result of its alleged failure to review certain documents that led to sanctions being imposed on Mr. Murray.”

So, what do you think?  Does the failure by the plaintiff to review the defendant’s production constitute legal malpractice?  Please share any comments you might have or if you’d like to know more about a particular topic.


Court Rules that Joint Stipulation Supports Plaintiff’s Production of Images Instead of Native Files – eDiscovery Case Law

In Melian Labs, Inc. v. Triology LLC, No. 13-cv-04791-SBA (N.D. Cal. Sept. 4, 2014), California Magistrate Judge Kandis A. Westmore denied the plaintiff’s motion to compel discovery in native form because the production format had been agreed upon in the parties’ ESI protocol, set forth in their Joint Rule 26(f) Report, which supported production in “paper, PDF, or TIFF format”.

In this trademark dispute, the plaintiff sought a declaratory judgment that its website did not infringe upon the defendant’s trademark, but rather, that the defendant’s use of the trademark infringed on the plaintiff’s senior trademark rights.

On March 26, 2014, the parties filed a case management conference statement (referred to as the “Joint Rule 26(f) Report”), and informed the district court that:

“With respect to the production of electronic data and information, the parties agree that the production of metadata beyond the following fields are not necessary in this lawsuit absent a showing of a compelling need: Date Sent, Time Sent, Date Received, Time Received, To, From, CC, BCC, and Email Subject. The parties agree to produce documents [in] electronic form in paper, PDF, or TIFF format, and spreadsheets and certain other electronic files in native format when it is more practicable to do so.”

The plaintiff began its document production on June 23 and had produced 1218 pages of documents to date.  On August 1, the defendant complained about the format of the plaintiff’s document production of its electronically stored information (“ESI”), claiming that the produced PDFs were stripped of all metadata in violation of the agreement of the parties and that the spreadsheets were not produced in native format.  The defendant contended that the plaintiff’s production of “7 large PDF image documents, which each appear to be a compilation of ESI improperly collected and produced,” were violative of Federal Rule of Civil Procedure 34(b)(2)(E), because they were not produced in their native format and are not reasonably usable.  The defendant also contended that the plaintiff failed to comply with the Joint Rule 26(f) Report by refusing to produce all spreadsheets in native format – the plaintiff acknowledged that some of its spreadsheet printouts were difficult to read, and, in those cases, it produced the spreadsheets in native format (Excel) upon request, but contended that the parties never agreed to produce all spreadsheets in native format.

Judge Westmore stated that “Triology’s complaint is purely one of form and, at this juncture, it is not claiming that Melian’s production is incomplete. Rule 34(b) only requires that the parties produce documents as they are kept in the usual course of business or in the form ordinarily maintained unless otherwise stipulated. Fed. R. Civ. P. 34(b)(2)(E). The parties’ Joint Rule 26(f) Report is a stipulation, and, therefore, Rule 34(b) does not govern. Further, the Joint Rule 26(f) Report does not require that all ESI be produced electronically. Instead, it states that ESI may be produced in paper, PDF or TIFF.”

Judge Westmore also noted that “Triology fails to articulate why metadata is important to emails, when every email should contain the information sought on the face of the document.”  As a result, she ruled that the defendant’s “request to compel the production of all emails in a searchable or native format is denied”.

So, what do you think?  Did the Joint Rule 26(f) Report allow the plaintiff to produce PDFs with no metadata or was the defendant still entitled to native files with at least the email metadata?  Please share any comments you might have or if you’d like to know more about a particular topic.

Disclaimer: The views represented herein are exclusively the views of the author, and do not necessarily represent the views held by CloudNine Discovery. eDiscoveryDaily is made available by CloudNine Discovery solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscoveryDaily should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.

How Mature is Your Organization in Handling eDiscovery? – eDiscovery Best Practices

A new self-assessment resource from EDRM helps you answer that question.

A few days ago, EDRM announced the release of the EDRM eDiscovery Maturity Self-Assessment Test (eMSAT-1), the “first self-assessment resource to help organizations measure their eDiscovery maturity” (according to their press release linked here).

As stated in the press release, eMSAT-1 is a downloadable Excel workbook containing 25 worksheets (actually 27 worksheets when you count the Summary sheet and the List sheet of valid choices at the end) organized into seven sections covering various aspects of the e-discovery process. Complete the worksheets and the assessment results are displayed in summary form at the beginning of the spreadsheet.  eMSAT-1 is the first of several resources and tools being developed by the EDRM Metrics group, led by Clark and Dera Nevin, with assistance from a diverse collection of industry professionals, as part of an ambitious Maturity Model project.

The seven sections covered by the workbook are:

  1. General Information Governance: Contains ten questions to answer regarding your organization’s handling of information governance.
  2. Data Identification, Preservation & Collection: Contains five questions to answer regarding your organization’s handling of these “left side” phases.
  3. Data Processing & Hosting: Contains three questions to answer regarding your organization’s handling of processing, early data assessment and hosting.
  4. Data Review & Analysis: Contains two questions to answer regarding your organization’s handling of search and review.
  5. Data Production: Contains two questions to answer regarding your organization’s handling of production and protecting privileged information.
  6. Personnel & Support: Contains two questions to answer regarding your organization’s hiring, training and procurement processes.
  7. Project Conclusion: Contains one question to answer regarding your organization’s processes for managing data once a matter has concluded.

Each question is a separate sheet, with five answers ranked from 1 to 5 to reflect your organization’s maturity in that area (with descriptions to associate with each level of maturity).  Each question defaults to a value of 1.  The five answers are:

  • 1: No Process, Reactive
  • 2: Fragmented Process
  • 3: Standardized Process, Not Enforced
  • 4: Standardized Process, Enforced
  • 5: Actively Managed Process, Proactive

Once you answer all the questions, the Summary sheet shows your overall average, as well as your average for each section.  It’s an easy workbook to use, with input areas identified by yellow cells.  The whole workbook is editable, so perhaps the next edition could lock down the calculated-only cells.  Nonetheless, the workbook is intuitive and provides a nice exercise for an organization to grade its level of eDiscovery maturity.
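The workbook’s arithmetic is simple enough to sketch in code.  The sketch below is a hypothetical stand-in for the Excel workbook (the section names and question counts come from the list above, but the data structure and function are assumptions for illustration), showing how the per-section and overall averages are derived:

```python
# Hypothetical sketch of eMSAT-1-style scoring -- the real tool is an
# Excel workbook; this data structure is an assumption for illustration.

SECTIONS = {
    "General Information Governance": 10,
    "Data Identification, Preservation & Collection": 5,
    "Data Processing & Hosting": 3,
    "Data Review & Analysis": 2,
    "Data Production": 2,
    "Personnel & Support": 2,
    "Project Conclusion": 1,
}  # 25 questions total, matching the workbook's 25 question sheets

def score(answers):
    """answers maps a section name to a list of 1-5 maturity ratings.
    Unanswered sections default to 1 per question, as in the workbook.
    Returns (overall average, dict of per-section averages)."""
    per_section = {}
    all_ratings = []
    for section, count in SECTIONS.items():
        ratings = answers.get(section, [1] * count)
        if len(ratings) != count or not all(1 <= r <= 5 for r in ratings):
            raise ValueError(f"bad ratings for {section!r}")
        per_section[section] = sum(ratings) / len(ratings)
        all_ratings.extend(ratings)
    return sum(all_ratings) / len(all_ratings), per_section
```

Leaving every answer at its default of 1 yields an overall average of 1.0; a fully mature organization would average 5.0.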

You can download a copy of the eMSAT-1 Excel workbook from here, as well as get more information on how to use it (the page also describes how to provide feedback to make the next iterations even better).

The EDRM Maturity Model Self-Assessment Test is the fourth release in recent months by the EDRM Metrics team.  In June 2013, the new Metrics Model was released; in November 2013, a supporting glossary of terms for the Metrics Model was published; and in November 2013, the EDRM Budget Calculators project kicked off (with four calculators covered by us here, here, here and here).  They’ve been busy.

So, what do you think?  How mature is your organization in handling eDiscovery?  Please share any comments you might have or if you’d like to know more about a particular topic.

Disclaimer: The views represented herein are exclusively the views of the author, and do not necessarily represent the views held by CloudNine Discovery. eDiscoveryDaily is made available by CloudNine Discovery solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscoveryDaily should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.

When Preparing Production Sets, Quality is Job 1 – Best of eDiscovery Daily

OK, I admit I stole that line from an old Ford commercial😉

France Strikes Back!  Today, we’re heading back to Paris for one final evening before heading home (assuming the Air France pilots let us).  For the next two weeks except for Jane Gennarelli’s Throwback Thursday series, we will be re-publishing some of our more popular and frequently referenced posts.  Today’s post is a best practice topic for preparing production sets.  Enjoy!

Yesterday, we talked about addressing parameters of production up front to ensure that those requirements make sense and avoid foreseeable production problems well before the production step.  Today, we will talk about quality control (QC) mechanisms to make sure that the production is complete and accurate.

Quality Control Checks

There are a number of checks that can and should be performed on the production set, prior to producing it to the requesting party.  Here are some examples:

  • File Counts: The most obvious check you can perform is to ensure that the count of files matches the count of documents or pages you have identified to be produced.  However, depending on the production, there may be multiple file counts to check:
    • Image Files: If you have agreed with opposing counsel to produce images for all documents, then there will be a count of images to confirm.  If you’re producing multi-page image files (typically, PDF or TIFF), the count of images should match the count of documents being produced.  If you’re producing single-page image files (usually TIFF), then the count should match the number of pages being produced.
    • Text Files: When producing image files, you may also be producing searchable text files.  Again, the count should match either the documents (multi-page text files) or pages (single-page text files) with one possible exception.  If a document or page has no searchable text, are you still producing an empty file for those?  If not, you will need to be aware of how many of those instances there are and adjust the count accordingly to verify for QC purposes.
    • Native Files: Native files (if produced) are typically at the document level, so you would want to confirm that one exists for each document being produced.
    • Subset Counts: If the documents are being produced in a certain organized manner (e.g., a folder for each custodian), it’s a good idea to identify subset counts at those levels and verify those counts as well.  Not only does this provide an extra level of count verification, but it helps to find the problem more quickly if the overall count is off.
    • Verify Counts on Final Production Media: If you’re verifying counts of the production set before copying it to the media (which is common when burning files to CD or DVD), you will need to verify those counts again after copying to ensure that all files made it to the final media.
  • Sampling of Results: Unless the production is relatively small, it may be impractical to open every last file to be produced to confirm that it is correct.  If so, employ accepted statistical sampling procedures (such as those described here and here for searching) to identify an appropriate sample size, then randomly select that sample to open and confirm that the correct files were selected, that HASH values of produced native files match the original source versions of those files, that images are clear and that text files contain the correct text.
  • Redacted Files: If any redacted files are being produced, each of these (not just a sample subset) should be reviewed to confirm that redactions of privileged or confidential information made it to the produced file.  Many review platforms overlay redactions that have to be burned into the images at production time, so it’s easy for mistakes in the process to cause those redactions to be left out or burned in at the wrong location.  Very Important! – You also need to confirm that the redacted text has been removed from any text files that have been produced.
  • Inclusion of Logs: Depending on agreed upon parameters, the production may include log files such as:
    • Production Log: Listing of all files being produced, with an agreed upon list of metadata fields to identify those files.
    • Privilege Log: Listing of responsive files not being produced because of privilege (and possibly confidentiality as well).  This listing often identifies the privilege being asserted for each file in the privilege log.
    • Exception Log: Listing of files that could not be produced because of a problem with the file.  Examples of types of exception files are included here.
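Several of the count and HASH checks above lend themselves to simple scripting.  The sketch below is a hedged illustration (the folder layout and function names are assumptions, not part of any standard tool): it verifies per-custodian file counts and compares SHA-256 digests of produced native files against their source copies:

```python
# Illustrative QC sketch: per-custodian file counts and source-vs-produced
# HASH comparison.  Folder layout and function names are hypothetical.

import hashlib
from pathlib import Path

def sha256(path):
    """SHA-256 digest of a file, read in chunks to handle large files."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_counts(production_root, expected_counts):
    """expected_counts maps a custodian folder name to its expected file
    count.  Returns a list of (custodian, expected, actual) mismatches."""
    mismatches = []
    for custodian, expected in expected_counts.items():
        actual = sum(1 for p in Path(production_root, custodian).rglob("*") if p.is_file())
        if actual != expected:
            mismatches.append((custodian, expected, actual))
    return mismatches

def verify_hashes(pairs):
    """pairs is an iterable of (source_path, produced_path).  Returns the
    pairs whose digests do not match, i.e., files altered in production."""
    return [(s, p) for s, p in pairs if sha256(s) != sha256(p)]
```

Consistent with the point above about final production media, a count check like this should be rerun against the copied media, not just the staging location.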

Each production has different parameters, so the QC requirements will differ; these are examples, not necessarily a comprehensive list of all potential QC checks to perform.
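For the sampling check mentioned above, a common starting point is Cochran’s sample-size formula with a finite population correction.  The 95% confidence level and ±5% margin of error below are illustrative defaults, not requirements from any rule or case:

```python
# Sketch of Cochran's sample-size formula with finite population
# correction; the parameters shown are illustrative defaults.

import math

def sample_size(population, z=1.96, margin=0.05, p=0.5):
    """Sample size for estimating a proportion in a finite population
    (z=1.96 ~ 95% confidence; p=0.5 is the most conservative choice)."""
    n0 = (z ** 2) * p * (1 - p) / (margin ** 2)  # infinite-population size
    n = n0 / (1 + (n0 - 1) / population)         # finite population correction
    return math.ceil(n)
```

At these settings, for example, a 10,000-document production calls for roughly 370 randomly selected documents.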

So, what do you think?  Can you think of other appropriate QC checks to perform on production sets?  If so, please share them!  As well as any comments you might have or if you’d like to know more about a particular topic.

Disclaimer: The views represented herein are exclusively the views of the author, and do not necessarily represent the views held by CloudNine Discovery. eDiscoveryDaily is made available by CloudNine Discovery solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscoveryDaily should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.