eDiscoveryDaily

Class Action Plaintiffs Required to Provide Social Media Passwords and Cell Phones – eDiscovery Case Law

We’ve seen several cases where social media data was requested – with some requests granted (including this one, this one and this one) and other requests denied (including this one, this one and this one).  Here is a recent case where the request was granted.

Considering proportionality and accessibility concerns in EEOC v. Original Honeybaked Ham Co. of Georgia, 11-cv-02560-MSK-MEH, 2012 U.S. Dist. (D. Colo. Nov. 7, 2012), Colorado Magistrate Judge Michael Hegarty held that where a party had shown that certain of its adversaries’ social media content and text messages were relevant, the adversaries must produce usernames and passwords for their social media accounts, usernames and passwords for e-mail accounts and blogs, and cell phones used to send or receive text messages to be examined in camera by a forensic expert serving as a special master.

Case Background

This case began when the EEOC sued employer The Original Honeybaked Ham Company of Georgia (“HBH”) on behalf of a class alleging sexual harassment and retaliation. During discovery, HBH requested “numerous categories of documents designed to examine the class members’ damages—emotional and financial—as well as documents going to the credibility and bias of the class members,” and the company moved the court to compel their production.

Among the documents HBH requested were “full unredacted” social media content and text messages. HBH requested such electronically stored information (ESI) because “[m]any of the class members ha[d] utilized electronic media to communicate—with one another or with their respective insider groups—information about their employment with/separation from Defendant HBH, this lawsuit, their then-contemporaneous emotional state, and other topics and content that [HBH] contend[ed] may be admissible in this action.” For example, HBH had “obtained one affected former employee’s Facebook pages” and found that they “contain[ed] a significant variety of relevant information, and further, that other employees posted relevant comments on this Facebook account.”

Court Analysis of Document Request

Judge Hegarty noted that the variety of topics that class members discussed via electronic communications could be viewed “logically as though each class member had a file folder titled ‘Everything About Me,’ which they have voluntarily shared with others.” Therefore, because the documents—if they were in hard copy—would be discoverable if relevant, their existence in electronic form made them likewise discoverable: “The fact that [documents] exist[ ] in cyberspace on an electronic device is a logistical and, perhaps, financial problem, but not a circumstance that removes the information from accessibility by a party opponent in litigation.” Moreover, the fact that the “Everything About Me” folder was stored in this instance on Facebook made the documents perhaps more susceptible to discovery: “There is a strong argument that storing such information on Facebook and making it accessible to others presents an even stronger case for production, at least as it concerns any privacy objection. It was the claimants (or at least some of them) who, by their own volition, created relevant communications and shared them with others.”

As for their relevance, Judge Hegarty ticked through the categories of documents on the Facebook page that HBH had already obtained and noted that each was potentially relevant. Accordingly, and because “other employees posted relevant comments on this Facebook account,” Judge Hegarty required the production of each class member’s social media content.

Also driving Judge Hegarty’s decision was a concern for proportionality: “The cumulative exposure to the Defendant is most definitely well into the low-to-mid seven-figure range. This is important to note when addressing whether the potential cost of producing the discovery is commensurate with the dollar amount at issue.”

Judge’s Ruling

Ultimately, Judge Hegarty held that each class member should produce the following ESI and related devices: cell phones used to send or receive text messages during the relevant period, information necessary to access social media websites used during the relevant period, and information necessary to access “any e-mail account or web blog or similar/related electronically accessed internet or remote location used for communicating with others or posting communications or pictures” during the relevant period.

Protocol for Production Using a Special Master

Though the relevant information was discoverable, Judge Hegarty established a specific protocol for its production. First, the court would appoint a forensic expert to serve as a Special Master to review the produced ESI in camera. “[T]he parties [would] collaborate to create (1) a questionnaire to be given to the Claimants with the intent of identifying all such potential sources of discoverable information; and (2) instructions to be given to the Special Master defining the parameters of the information he will collect.” Judge Hegarty gave the parties specific procedures to follow in the instance of a disagreement during this process. The Special Master could then begin review.

Following in camera review by the special master, Judge Hegarty stipulated that the court would also “review the information in camera and require the production to Defendant of only that information which the Court determines is legally relevant under the applicable rules.” The court would then provide the material to the EEOC, which would have an opportunity to conduct a privilege review. The EEOC would then produce nonprivileged information to HBH along with a privilege log. The court would return irrelevant materials to the EEOC and provide a method for the EEOC to contest any relevancy determinations.

Regarding costs of the review, Judge Hegarty ordered the cost of forensic evaluation to be split equally between the parties. Judge Hegarty noted, “The information ordered to be produced is discoverable—information which, if it exists, was created by the Claimants.” However, the court reserved the option to revisit the allocation of costs and to relieve the Plaintiff/Claimants of monetary responsibility if the effort produced little or no relevant information.

So, what do you think?  Was the judge correct in requiring production of user names, passwords and cell phones for each class member?  Please share any comments you might have or if you’d like to know more about a particular topic.

Case Summary Source: Applied Discovery (free subscription required).  For eDiscovery news and best practices, check out the Applied Discovery Blog here.

Disclaimer: The views represented herein are exclusively the views of the author, and do not necessarily represent the views held by CloudNine Discovery. eDiscoveryDaily is made available by CloudNine Discovery solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscoveryDaily should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.

According to IDC, Big Data is Only Getting Bigger – eDiscovery Trends

According to the International Data Corporation (IDC), big data is only getting bigger.  In the IDC iView publication “Big Data, Bigger Digital Shadows, and Biggest Growth in the Far East” (sponsored by EMC), which is excerpted here, the “digital universe” is growing even faster than we thought.

As the report notes: “at the midpoint of a longitudinal study starting with data collected in 2005 and extending to 2020, our analysis shows a continuously expanding, increasingly complex, and ever more interesting digital universe.”  IDC’s sixth annual study of the digital universe contains some interesting findings, including:

  • From 2005 to 2020, the digital universe will grow by a factor of 300, from 130 exabytes to 40,000 exabytes, or 40 trillion gigabytes (more than 5,200 gigabytes for every man, woman, and child in 2020). From now until 2020, the digital universe will about double every two years.
  • The investment in spending on IT hardware, software, services, telecommunications and staff that could be considered the “infrastructure” of the digital universe will grow by 40% between 2012 and 2020. As a result, the investment per gigabyte (GB) during that same period will drop from $2.00 to $0.20 (see the quick arithmetic check after this list). Of course, investment in targeted areas like storage management, security, big data, and cloud computing will grow considerably faster.
  • A majority of the information in the digital universe, 68% in 2012, is created and consumed by consumers — watching digital TV, interacting with social media, sending camera phone images and videos between devices and around the Internet, and so on. Yet enterprises have liability or responsibility for nearly 80% of the information in the digital universe.
  • Only a tiny fraction of the digital universe has been explored for analytic value. IDC estimates that by 2020, as much as 33% of the digital universe will contain information that might be valuable if analyzed.
  • By 2020, nearly 40% of the information in the digital universe will be “touched” by cloud computing providers — meaning that a byte will be stored or processed in a cloud somewhere in its journey from originator to disposal.
  • The first Digital Universe Study was published in 2007.  At that time, IDC’s forecast for the digital universe in 2010 was 988 exabytes (in 2002, there were 5 exabytes in the world, representing an estimated growth of 19,760% in eight years).  Based on actuals, it was later revised to 1,227 exabytes (an actual growth of 24,540% in eight years).  So far, data is growing even faster than anticipated.
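For those who like to check the math, here is a minimal sketch (in Python) that reproduces the growth factor, the implied doubling time, and the projected cost per gigabyte from the figures quoted above.  Note that the 2012 universe size used in the cost calculation is an assumed interpolation between the 2005 and 2020 figures, not a number stated in the excerpt.

```python
# Back-of-the-envelope check of the IDC figures quoted above.
import math

eb_2005, eb_2020 = 130, 40_000
growth_factor = eb_2020 / eb_2005                     # ~308x, i.e., "a factor of 300"
annual_rate = growth_factor ** (1 / (2020 - 2005))    # implied compound annual growth
doubling_years = math.log(2) / math.log(annual_rate)  # ~1.8 years, i.e., "about double every two years"

# Cost per GB: if spending grows 40% from 2012 to 2020 while data grows ~14x,
# cost per GB falls by roughly a factor of 1.4 / 14, i.e., from $2.00 to about $0.20.
eb_2012 = 2_837  # assumed (hypothetical) 2012 size, interpolated from the 2005 and 2020 figures
cost_per_gb_2020 = 2.00 * 1.40 / (eb_2020 / eb_2012)

print(f"Growth factor 2005-2020: {growth_factor:.0f}x")
print(f"Implied doubling time:   {doubling_years:.1f} years")
print(f"Implied 2020 cost/GB:    ${cost_per_gb_2020:.2f}")
```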

The report excerpt breaks out several graphs to illustrate where the digital universe is now and where it’s headed, showing how, as IT costs rise, the costs per GB will fall considerably and also showing the “geography” of the digital universe, with the US currently accounting for 32% of the digital universe.  According to IDC, the share of the digital universe attributable to emerging markets is up to 36% in 2012 and is expected to be 62% by 2020.

Obviously, this has considerable eDiscovery ramifications: data within organizations will continue to grow exponentially, and a combination of good information governance programs and effective retrieval technology will become even more vital to keep eDiscovery manageable and costs in check.

So, what do you think?  Do you have a plan in place to manage exponential data growth?  Please share any comments you might have or if you’d like to know more about a particular topic.

Disclaimer: The views represented herein are exclusively the views of the author, and do not necessarily represent the views held by CloudNine Discovery. eDiscoveryDaily is made available by CloudNine Discovery solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscoveryDaily should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.

Trend Has Shifted Against Reimbursement of eDiscovery Costs – eDiscovery Case Law

Last year, the trend seemed to be to award the prevailing party reimbursement of eDiscovery costs, including in this case and this case.  Now, that trend appears to have been reversed, with those requests being denied (or reversed) by the courts, including this case and this case.  Here is another case where reimbursement of eDiscovery costs was denied.

In Abbott Point of Care, Inc. v. Epocal, Inc., No. CV-08-S-543-NE, 2012 U.S. Dist. (N.D. Ala. Nov. 5, 2012), a district court, adhering to the Third Circuit’s 2012 decision in Race Tires America, Inc. v. Hoosier Racing Tire Corp., declined to permit a prevailing party to bill its opponent under 28 U.S.C. § 1920(4) for costs associated with its eDiscovery database because such a request did not comport with a strict interpretation of the statute.

In a lawsuit originally based on Abbott’s allegations that Epocal infringed four of its patents and tortiously interfered with the employment contracts of some of its former employees, a jury awarded Epocal a fully favorable decision. The judgment provided that all costs associated with the lawsuit were taxed to Abbott; accordingly, Epocal filed a bill of costs with the court. This dispute arose when Abbott objected to Epocal’s bill of costs.

The points of the bill that Abbott disputed included “$175,390 in eDiscovery database charges through discovery (item 5) [and] $165,108 in eDiscovery database charges through trial (item 6).” The court pointed out that although “Abbott object[ed] to Epocal’s recovery of any costs for the creation and maintenance of an electronic discovery database under § 1920(4),” the Eleventh Circuit had not issued any controlling guidance on how courts within its jurisdiction should interpret the language of Section 1920(4). (emphasis added)

The court noted that Abbott relied on, and many district courts—including those within the Eleventh Circuit—also turned to Race Tires, where “[t]he Third Circuit emphasized that the determination of whether a particular cost can be awarded pursuant to § 1920 is purely a matter of statutory construction.” Section 1920(4) permits taxation only for “‘exemplification’ or ‘making copies,’” and the Third Circuit further explained that those actions meant “‘produc[ing] illustrative evidence or the authentication of public records.’” The court noted that because the statute did not provide for such relief, “[t]he Third Circuit refused to give any weight to equitable considerations, including the importance of database services to the ultimate act of production, the technical skill required to create a database, and the ‘efficiencies and cost savings resulting from the efforts of electronic discovery consultants.’” Moreover, the Third Circuit “refused to allow costs for any of the steps that might lead up to the actual copying of documents or other materials, including ‘gathering, preserving, processing, searching, culling, and extracting’ discoverable information.”

Although the court was “sympathetic to the practical arguments advanced by Epocal,” it declined to extend the scope of the Third Circuit’s interpretation of Section 1920(4). Noting that “[u]nfortunately . . . the law does not always favor efficiency or practicality,” the court followed the Third Circuit’s “thorough, reasonable, and persuasive interpretation of that statute.” As such, it found that Epocal would “not be permitted to recover any costs for the maintenance of an electronic discovery database” and therefore it did not need to distinguish between the costs Epocal requested for eDiscovery charges incurred at different times during the litigation.

So, what do you think?  Should the costs have been reimbursed?  Should prevailing parties have some means for recouping their eDiscovery costs?  Please share any comments you might have or if you’d like to know more about a particular topic.

Case Summary Source: Applied Discovery (free subscription required).  For eDiscovery news and best practices, check out the Applied Discovery Blog here.

Disclaimer: The views represented herein are exclusively the views of the author, and do not necessarily represent the views held by CloudNine Discovery. eDiscoveryDaily is made available by CloudNine Discovery solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscoveryDaily should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.

Baby, You Can Drive My CARRM – eDiscovery Trends

Full disclosure: this post is NOT about the Beatles’ song, but I liked the title.

There have been a number of terms applied to using technology to aid in eDiscovery review, including technology assisted review (often referred to by its acronym “TAR”) and predictive coding.  Another term is Computer Assisted Review (which lends itself to the obvious acronym of “CAR”).

Now, the Electronic Discovery Reference Model (EDRM) is looking to provide an “owner’s manual” to that CAR with its new draft Computer Assisted Review Reference Model (CARRM), which depicts the flow for a successful CAR project.  The CAR process depends on, among other things, a sound approach for identifying appropriate example documents for the collection, ensuring educated and knowledgeable reviewers to appropriately code those documents and testing and evaluating the results to confirm success.  That’s why the “A” in CAR stands for “assisted” – regardless of how good the tool is, a flawed approach will yield flawed results.

As noted on the EDRM site, the major steps in the CARRM process are:

Set Goals

The process of deciding the outcome of the Computer Assisted Review process for a specific case. Some of the outcomes may be:

  • reduction and culling of not-relevant documents;
  • prioritization of the most substantive documents; and
  • quality control of the human reviewers.

Set Protocol

The process of building the human coding rules that take into account the use of CAR technology. CAR technology must be taught about the document collection by having the human reviewers submit documents to be used as examples of a particular category, e.g. Relevant documents. Creating a coding protocol that can properly incorporate the fact pattern of the case and the training requirements of the CAR system takes place at this stage. An example of a protocol determination is to decide how to treat the coding of family documents during the CAR training process.

Educate Reviewer

The process of transferring the review protocol information to the human reviewers prior to the start of the CAR Review.

Code Documents

The process of human reviewers applying subjective coding decisions to documents in an effort to adequately train the CAR system to “understand” the boundaries of a category, e.g. Relevancy.

Predict Results

The process of the CAR system applying the information “learned” from the human reviewers and classifying a selected document corpus with pre-determined labels.

Test Results

The process of human reviewers using a validation process, typically statistical sampling, in an effort to create a meaningful metric of CAR performance. The metrics can take many forms; they may include estimates of defect counts in the classified population or information retrieval metrics like Precision, Recall and F1.
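As a simple illustration of the information retrieval metrics mentioned above, here is a minimal sketch that computes Precision, Recall and F1 from a hypothetical validation sample; the counts are made up for illustration, and the formulas are the standard ones, not a prescription from the CARRM draft.

```python
# Hypothetical validation sample: comparing the CAR system's predictions
# against the human reviewers' calls on a random sample of documents.
true_positives = 80    # predicted relevant, reviewer agrees
false_positives = 20   # predicted relevant, reviewer disagrees
false_negatives = 40   # predicted not relevant, but actually relevant

precision = true_positives / (true_positives + false_positives)  # 0.80
recall = true_positives / (true_positives + false_negatives)     # ~0.67
f1 = 2 * precision * recall / (precision + recall)               # harmonic mean, ~0.73

print(f"Precision: {precision:.2f}  Recall: {recall:.2f}  F1: {f1:.2f}")
```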

Evaluate Results

The process of the review team deciding if the CAR system has achieved the goals anticipated by the review team.

Achieve Goals

The process of ending the CAR workflow and moving to the next phase in the review lifecycle, e.g. Privilege Review.

The diagram does a good job of depicting the linear steps (Set Goals, Set Protocol, Educate Reviewer and, at the end, Achieve Goals) and of using a circle to represent the iterative steps (Code Documents, Predict Results, Test Results and Evaluate Results) that may need to be performed more than once to achieve the desired results.  It’s a very straightforward model to represent the process.  Nicely done!
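To make that linear-plus-iterative shape concrete, here is a minimal, purely illustrative simulation of the workflow; the function name, numbers and stopping rule are hypothetical placeholders, not part of the EDRM draft.

```python
# Illustrative simulation of the CARRM flow: linear setup steps, then iterate
# Code Documents -> Predict Results -> Test Results -> Evaluate Results until
# the goal set at the outset is met. All values are hypothetical.
def run_carrm(recall_goal=0.80, recall_gain_per_round=0.15):
    # Set Goals / Set Protocol / Educate Reviewer (the linear steps)
    print(f"Goal: estimated recall >= {recall_goal:.0%}")
    recall, round_no = 0.40, 0

    # Code / Predict / Test / Evaluate (the iterative circle)
    while recall < recall_goal:
        round_no += 1
        recall = min(1.0, recall + recall_gain_per_round)  # each training round improves the model
        print(f"Round {round_no}: estimated recall {recall:.0%}")

    print("Goals achieved -- move to the next phase (e.g., Privilege Review)")

run_carrm()
```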

Nonetheless, it’s a draft version of the model and EDRM wants your feedback.  You can send your comments to mail@edrm.net or post them on the EDRM site here.

So, what do you think?  Does the CARRM model make computer assisted review more straightforward?  Please share any comments you might have or if you’d like to know more about a particular topic.

Disclaimer: The views represented herein are exclusively the views of the author, and do not necessarily represent the views held by CloudNine Discovery. eDiscoveryDaily is made available by CloudNine Discovery solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscoveryDaily should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.

How Are You Handling Defensible Deletion? – eDiscovery Best Practices

According to the Compliance, Governance and Oversight Council (CGOC), information volume doubles every 18-24 months and 90% of the data in the world has been created in the last two years.  So, many organizations are drowning in electronically stored information (ESI) and costs associated with managing that ESI for eDiscovery are continuing to rise.  An effective plan for information governance that includes defensible deletion of ESI is an effective way of keeping that ESI from overwhelming your organization.  But, what percentage of organizations is defensibly deleting data?  A new survey from eDJGroup is attempting to find out.

Defensible deletion of ESI that has little or no business value and is not subject to legal hold is a good business practice that is protected through Rule 37(e) of the Federal Rules of Civil Procedure (commonly known as the “safe harbor” rule) which states:

Failure to Provide Electronically Stored Information. Absent exceptional circumstances, a court may not impose sanctions under these rules on a party for failing to provide electronically stored information lost as a result of the routine, good-faith operation of an electronic information system.

Barry Murphy’s article (Defensible Deletion Gaining Steam) on eDiscovery Journal discusses the eDJGroup survey and provides some interim results for two of the questions asked:

  1. Do you believe defensible deletion of information is necessary in order to manage growing volumes of digital information?: As of the article date on December 4, an overwhelming 93.0% of all respondents have said ‘yes’, 2.3% have said ‘no’ and 4.7% have said ‘don’t know / unsure’.  Frankly, I’m surprised that anybody doesn’t believe that defensible deletion is necessary!
  2. Does your organization currently defensibly delete information?: As of the article date on December 4, 14.0% of all respondents have said ‘yes, for all systems’, 54.7% have said ‘yes, in some systems’, 16.3% have said ‘no’ and 15.1% have said ‘don’t know’.  That means that over 2/3 of respondents so far (68.7%) defensibly delete information in at least some systems.

As of the article date, there had been 86 respondents.  But, the survey is not over!  You can take the survey here and contribute to the results.  Murphy says the ‘survey will close later this month, so be sure to take it now’, but doesn’t provide a specific date that it closes (one would hope it would be at the end of the month to be as inclusive as possible).  Nonetheless, ‘all respondents will get a summary of the results and have a chance to win a $250 gift card’, so it’s worthwhile to participate.

So, what do you think?  Does your organization have an information governance plan that includes defensible deletion of data?  Please share any comments you might have or if you’d like to know more about a particular topic.

Disclaimer: The views represented herein are exclusively the views of the author, and do not necessarily represent the views held by CloudNine Discovery. eDiscoveryDaily is made available by CloudNine Discovery solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscoveryDaily should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.

Percentage of eDiscovery Sanctions Cases Declining – eDiscovery Trends

According to Kroll Ontrack, the percentage of eDiscovery cases addressing sanctions “dropped by approximately ten percent” compared to 2011, while “cases addressing procedural issues more than doubled”.  Let’s take a closer look at the numbers and look at some cases in each category.

As indicated in their December 4 news release, in the past year, Kroll Ontrack experts summarized 70 of the most significant state and federal judicial opinions related to the preservation, collection, review and production of electronically stored information (ESI). The breakdown of the major issues that arose in these eDiscovery cases is as follows:

  • Thirty-two percent (32%) of cases addressed sanctions regarding a variety of issues, such as preservation and spoliation, noncompliance with court orders and production disputes.  Out of 70 cases, that would be about 22 cases addressing sanctions this past year.  Here are a few of the recent sanction cases previously reported on this blog.
  • Twenty-nine percent (29%) of cases addressed procedural issues, such as search protocols, cooperation, production and privilege considerations.  Out of 70 cases, that would be about 20 cases.  Here are a few of the recent procedural issues cases previously reported on this blog.
  • Sixteen percent (16%) of cases addressed discoverability and admissibility issues.  Out of 70 cases, that would be about 11 cases.  Here are a few of the recent discoverability / admissibility cases previously reported on this blog.
  • Fourteen percent (14%) of cases discussed cost considerations, such as shifting or taxation of eDiscovery costs.  Out of 70 cases, that would be about 10 cases.  Here are a few of the recent eDiscovery costs cases previously reported on this blog.
  • Nine percent (9%) of cases discussed technology-assisted review (TAR) or predictive coding.  Out of 70 cases, that would be about 6 cases.  Here are a few of the recent TAR cases previously reported on this blog.

While it’s nice and appreciated that Kroll Ontrack has been summarizing the cases and compiling these statistics, I do have a couple of observations/questions about their numbers (sorry if they appear “nit-picky”):

  • Sometimes Cases Belong in More Than One Category: The case percentage totals add up to 100%, which would make sense except that some cases address issues in more than one category.  For example, In re Actos (Pioglitazone) Products Liability Litigation addressed both cooperation and technology-assisted review, and Freeman v. Dal-Tile Corp. addressed both search protocols and discoverability / admissibility.  It appears that Kroll classified each case in only one group, which makes the numbers add up, but could be somewhat misleading.  In theory, some cases belong in multiple categories, so the total should exceed 100%.
  • Did Cases Addressing Procedural Issues Really Double?: Kroll reported that “cases addressing procedural issues more than doubled”; however, here is how they broke down the category last year: 14% of cases addressed various procedural issues such as searching protocol and cooperation, 13% of cases addressed various production considerations, and 12% of cases addressed privilege considerations and waivers.  That’s a total of 39% for three separate categories that now appear to be described as “procedural issues, such as search protocols, cooperation, production and privilege considerations” (29%).  So, it looks to me like the percentage of cases addressing procedural issues actually dropped 10 percentage points (see the quick arithmetic check after this list).  Actually, the two biggest category jumps appear to be discoverability and admissibility issues (2% last year to 16% this year) and TAR (0% last year to 9% this year).
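For readers who want to follow the arithmetic, here is a minimal sketch that reproduces the approximate case counts and the year-over-year comparison discussed above; all figures are taken from the Kroll Ontrack summaries as reported in this post.

```python
# Quick check of the percentages discussed above.
total_cases = 70
this_year = {"sanctions": 32, "procedural": 29, "discoverability/admissibility": 16,
             "costs": 14, "TAR": 9}

# Approximate case counts implied by each percentage
for category, pct in this_year.items():
    print(f"{category}: ~{round(total_cases * pct / 100)} cases")

# Last year's "procedural" coverage, combining the three separate categories
last_year_procedural = 14 + 13 + 12  # searching/cooperation + production + privilege = 39%
print(f"Procedural issues: {last_year_procedural}% last year vs. {this_year['procedural']}% this year "
      f"-- a drop of {last_year_procedural - this_year['procedural']} percentage points, not a doubling")
```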

So, what do you think?  Has your organization been involved in any eDiscovery opinions this year?  Please share any comments you might have or if you’d like to know more about a particular topic.

Disclaimer: The views represented herein are exclusively the views of the author, and do not necessarily represent the views held by CloudNine Discovery. eDiscoveryDaily is made available by CloudNine Discovery solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscoveryDaily should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.

“Rap Weasel” Forced to Honor $1 Million Reward Offered via YouTube – eDiscovery Case Law

It isn’t every day that eDiscoveryDaily has reason to reference The Hollywood Reporter in a story about eDiscovery case law, but even celebrities have eDiscovery preservation obligations during litigation.  In Augstein v. Leslie, 11 Civ. 7512 (HB) (SDNY Oct. 17, 2012), New York District Judge Harold Baer imposed an adverse inference sanction against hip hop and R&B artist Ryan Leslie for “negligent destruction” of a hard drive returned to him by the plaintiff after a $1 million reward was offered via YouTube.  On November 28, a jury ordered him to pay the $1 million reward to the plaintiff.

Reward Offered, then Refused

While Leslie was on tour in Germany in 2010, a laptop and external hard drive (that contained some of Leslie’s songs not yet released) were stolen.  Capitalizing on his popularity on social media, Leslie initially offered $20,000 for return of the items, then, on November 6, 2010, a video on YouTube was posted increasing the reward to $1 million.  The increase of the reward was also publicized on Leslie’s Facebook and Twitter accounts.  After Augstein, a German auto repair shop owner, returned the laptop and hard drive, Leslie refused to pay the reward alleging “the intellectual property for which he valued the laptop was not present on the hard drive when it was returned”.

Defendant’s Arguments as to Why Reward was not Warranted

Leslie attempted to make the case that when he used the word “offer,” that he really meant something different. He argued that a reasonable person would have understood mention of a reward not as a unilateral contract, but instead as an “advertisement” – an invitation to negotiate.

Leslie’s other argument was that, regardless of whether it was an “offer” or not, Augstein failed to perform because he did not return the intellectual property, only the physical property.  Leslie claimed that he and several staff members tried to access the data on the hard drive but were unable to do so.  Leslie sent the hard drive to the manufacturer, Avastor, which ultimately deleted the information and sent Leslie a replacement drive.  The facts associated with the attempts to recover information from the hard drive and requests by the manufacturer to do the same were in dispute between Leslie, his assistant, and Avastor, who claimed no request for data recovery was made by Leslie or anyone on his team.

Judge’s Responses and Decision

Regarding Leslie’s characterization of the offer as an “advertisement”, Judge Baer disagreed, noting that “Leslie’s videos and other activities together are best characterized as an offer for a reward. Leslie ‘sought to induce performance, unlike an invitation to negotiate [often an advertisement], which seeks a reciprocal promise.’”

Regarding Leslie’s duty to preserve the hard drive, Judge Baer noted: “In this case, Leslie was on notice that the information on the hard drive may be relevant to future litigation and, as a result, had an obligation to preserve that information. Augstein contacted Leslie personally and through his attorney regarding the payment of the reward, and a short time later, the hard drive was sent by Leslie to Avastor….Leslie does not dispute these facts.”  As a result, Judge Baer found that “Leslie and his team were at least negligent in their handling of the hard drive.”

Citing Zubulake, among other cases, for the principle that negligence is sufficient to support spoliation sanctions, Judge Baer ruled “I therefore impose a sanction of an adverse inference; it shall be assumed that the desired intellectual property was present on the hard drive when Augstein returned it to the police.”  This led to the jury’s decision and award last month, causing the New York Post to characterize Leslie as a “Rap Weasel”, which Leslie himself poked fun at on Instagram.  Only in America!

So, what do you think?  Was the adverse inference sanction warranted?  Please share any comments you might have or if you’d like to know more about a particular topic.

Disclaimer: The views represented herein are exclusively the views of the author, and do not necessarily represent the views held by CloudNine Discovery. eDiscoveryDaily is made available by CloudNine Discovery solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscoveryDaily should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.

More Self-Documentation Features for Review Solutions – eDiscovery Best Practices

As we discussed yesterday, one feature of review solutions that often gets overlooked is the ability for the review solution to automatically document searching and review activities.  Not only does that make it easier to identify potential issues in the process; it also facilitates the ability for attorneys to demonstrate a defensible approach to discovery to the court.

Yesterday, we discussed self-documentation with regard to keeping a search history to support easy “tweaking” of searches and to document the natural iterative process of searching, facilitating the ability for attorneys to demonstrate a defensible search approach to discovery to the court. Let’s discuss two other areas where self-documentation can assist in the discovery analysis and review process:

Review Set Assignment and Tracking: When a review effort requires multiple reviewers to meet the review and production deadline, assigning documents to each reviewer and tracking each reviewer’s progress to estimate completion is critical and can be extremely time consuming to perform manually (especially for large scale review projects involving dozens or even hundreds of reviewers).  A review application, such as OnDemand®, that automates the assignment of documents to the reviewers and automatically tracks their review activity and throughput eliminates that manual time, enabling the review supervisor to provide feedback to the reviewers for improved review results as well as reassign documents as needed to maximize reviewer productivity.

Track Tag and Edit Activity: Review projects involving multiple attorneys and reviewers can be difficult to manage.  The risk of mistakes is high.  For example, privileged documents can be inadvertently tagged non-privileged and important notes or comments regarding individual documents can be inadvertently deleted.  One or more of the users in your case could be making these mistakes and not even be aware that it’s occurring.  A review application, such as OnDemand®, that tracks each tagging/un-tagging event and each edit to any field for a document can enable you to generate an audit log report to look for potential mistakes and issues.  For example, generate an audit log report showing any documents where the Privileged tag was applied and then removed.  Audit log reports are a great way to identify mistakes that have occurred, determine which user made those mistakes, and address those mistakes with them to eliminate future occurrences.  Using the self-documentation feature of an audit log report can enable you to avoid inadvertent disclosures of privileged documents and other potential eDiscovery production issues.
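As a generic illustration of the audit log concept described above (and not OnDemand’s actual data model or API), here is a minimal sketch that flags documents where a “Privileged” tag was applied and later removed; the log entries and field layout are hypothetical.

```python
# Hypothetical audit log: (document id, reviewer, action, tag) in chronological order.
from collections import defaultdict

audit_log = [
    ("DOC-001", "reviewer_a", "tag",   "Privileged"),
    ("DOC-001", "reviewer_b", "untag", "Privileged"),
    ("DOC-002", "reviewer_a", "tag",   "Privileged"),
]

# Collect the Privileged tag events per document
history = defaultdict(list)
for doc_id, reviewer, action, tag in audit_log:
    if tag == "Privileged":
        history[doc_id].append((action, reviewer))

# Flag documents where the tag was applied and subsequently removed
for doc_id, events in history.items():
    actions = [action for action, _ in events]
    if "tag" in actions and "untag" in actions[actions.index("tag") + 1:]:
        removed_by = [reviewer for action, reviewer in events if action == "untag"]
        print(f"{doc_id}: Privileged tag applied, then removed by {', '.join(removed_by)}")
```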

So, what do you think?  How important are self-documentation features in a review solution to you?  Can you think of other important self-documentation features in a review solution?  Please share any comments you might have or if you’d like to know more about a particular topic.

Disclaimer: The views represented herein are exclusively the views of the author, and do not necessarily represent the views held by CloudNine Discovery. eDiscoveryDaily is made available by CloudNine Discovery solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscoveryDaily should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.

When Considering Review Solutions, Don’t Forget About Self-Documentation – eDiscovery Best Practices

When evaluating eDiscovery review solutions, there are a number of features that attorneys consider as part of their selection process.  For example: What searching capabilities does the solution have?  How does it handle native files?  How does it support annotations and redactions of images?  Can it support conceptual clustering and predictive coding?  But, one feature that often gets overlooked is the ability for the review solution to automatically document searching and review activities.  Not only does that make it easier to identify potential issues in the process; it also facilitates the ability for attorneys to demonstrate a defensible approach to discovery to the court.

There are at least three areas where self-documentation can assist in the discovery analysis and review process:

Searching: An application, such as FirstPass®, powered by Venio FPR™, that keeps track of every search in a search history can provide assistance to attorneys to demonstrate a defensible search approach.  eDiscovery searching is almost always an iterative process where you perform a search, analyze the results (often through sampling of the search results, which FirstPass also supports), then adjust the search to either add, remove or modify terms to improve recall (when responsive information is being missed) or improve precision (when the terms are overly broad and yielding way too much non-responsive information, such as the “mining” example we’ve discussed previously).

Tracking search history accomplishes two things: 1) it makes it easier to recall previous searches and “tweak” them to run a modified version of the search without starting from scratch (some searches can be really complex, so this can be a tremendous time saver) and, 2) it documents the natural iterative process of searching, facilitating the ability for attorneys to demonstrate a defensible search approach to discovery to the court, if necessary.  And, if you don’t think that ever comes up, check out these case summaries here, here, here and here.  Not only that, the ability to look at previous searches can shorten the learning curve for new users who need to conduct searches by giving them examples after which to pattern their own searches.
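To show the kind of record a search history feature might keep, here is a minimal, generic sketch of the idea (not FirstPass’s actual implementation): each query is stored with its hit count and a note, so earlier searches can be recalled, tweaked and re-run, and the iterative process is documented.

```python
# Hypothetical search history log supporting recall of prior searches and
# documentation of the iterative search process. Queries and counts are made up.
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

@dataclass
class SearchEntry:
    query: str
    hits: int
    note: str = ""                                          # why the query was broadened or narrowed
    run_at: datetime = field(default_factory=datetime.now)

class SearchHistory:
    def __init__(self) -> None:
        self.entries: List[SearchEntry] = []

    def record(self, query: str, hits: int, note: str = "") -> None:
        self.entries.append(SearchEntry(query, hits, note))

    def report(self) -> str:
        """Chronological log of searches, suitable for documenting the process."""
        return "\n".join(f"{e.run_at:%Y-%m-%d %H:%M} | {e.hits:>7} hits | {e.query} | {e.note}"
                         for e in self.entries)

history = SearchHistory()
history.record('mining', 250_000, note="overly broad -- picks up 'data mining' hits")
history.record('mining AND NOT "data mining"', 4_200, note="narrowed to improve precision")
print(history.report())
```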

Tomorrow, we’ll discuss the other two areas where self-documentation can assist in the discovery analysis and review process.  Let the anticipation build!

So, what do you think?  How important are self-documentation features in a review solution to you?  Please share any comments you might have or if you’d like to know more about a particular topic.

Disclaimer: The views represented herein are exclusively the views of the author, and do not necessarily represent the views held by CloudNine Discovery. eDiscoveryDaily is made available by CloudNine Discovery solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscoveryDaily should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.

New eDiscovery Guidelines for Northern District of California – eDiscovery Trends

The U.S. District Court for the Northern District of California has announced new Guidelines for counsel and litigants regarding the discovery of electronically stored information (“ESI”) effective as of last Tuesday (November 27). The Guidelines were developed by a bench-bar committee chaired by Magistrate Judge Elizabeth D. Laporte in partnership with the Court’s Rules Committee and unanimously approved by the entire Court.

As stated in the announcement: “Counsel and litigants should familiarize themselves with the Guidelines and immediately begin using the revised Standing Order for All Judges of the Northern District of California when preparing case management statements and the Checklist as appropriate when meeting and conferring.”

As noted in the announcement, in addition to the Standing Order noted above, the package of new ESI-related documents includes the ESI Guidelines themselves, the Checklist for use when meeting and conferring, and the Model Stipulated Order.

In the announcement, Judge Laporte stated: “These tools are designed to promote cooperative e-discovery planning as soon as practicable that is tailored and proportionate to the needs of the particular case to achieve its just, speedy and inexpensive resolution, consistent with Rule 1 of the Federal Rules of Civil Procedure… The Court requires counsel to be familiar with these tools and confirm in the initial case management statement that they have reviewed the Guidelines regarding preservation and decided whether to enter into a stipulated order governing e-discovery, in light of the Model Stipulated Order.”

To confirm that familiarity and understanding by counsel, paragraph 6 of the Standing Order requires that all Joint Case Management Statements include:

“A brief report certifying that the parties have reviewed the Guidelines Relating to the Discovery of Electronically Stored Information (“ESI Guidelines”), and confirming that the parties have met and conferred pursuant to Fed. R. Civ. P. 26(f) regarding reasonable and proportionate steps taken to preserve evidence relevant to the issues reasonably evident in this action.”

As noted in this blog previously, other courts, such as the Southern District of New York (pilot program) and the Eastern District of Texas (for patent cases), have implemented standards for handling ESI, at least in certain situations.

So, what do you think?  Should all District courts adopt similar standards and provide similar guidelines and checklists?  If not, why not?  Please share any comments you might have or if you’d like to know more about a particular topic.

Disclaimer: The views represented herein are exclusively the views of the author, and do not necessarily represent the views held by CloudNine Discovery. eDiscoveryDaily is made available by CloudNine Discovery solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscoveryDaily should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.