Electronic Discovery

Baby, You Can Drive My CARRM – eDiscovery Trends

Full disclosure: this post is NOT about the Beatles’ song, but I liked the title.

There have been a number of terms applied to using technology to aid in eDiscovery review, including technology assisted review (often referred to by its acronym “TAR”) and predictive coding.  Another term is Computer Assisted Review (which lends itself to the obvious acronym of “CAR”).

Now, the Electronic Discovery Reference Model (EDRM) is looking to provide an “owner’s manual” to that CAR with its new draft Computer Assisted Review Reference Model (CARRM), which depicts the flow for a successful CAR project.  The CAR process depends on, among other things, a sound approach for identifying appropriate example documents for the collection, educated and knowledgeable reviewers to code those documents appropriately, and testing and evaluation of the results to confirm success.  That’s why the “A” in CAR stands for “assisted” – regardless of how good the tool is, a flawed approach will yield flawed results.

As noted on the EDRM site, the major steps in the CARRM process are:

Set Goals

The process of deciding the outcome of the Computer Assisted Review process for a specific case. Some of the outcomes may be:

  • reduction and culling of not-relevant documents;
  • prioritization of the most substantive documents; and
  • quality control of the human reviewers.

Set Protocol

The process of building the human coding rules that take into account the use of CAR technology. CAR technology must be taught about the document collection by having the human reviewers submit documents to be used as examples of a particular category, e.g. Relevant documents. Creating a coding protocol that can properly incorporate the fact pattern of the case and the training requirements of the CAR system takes place at this stage. An example of a protocol determination is to decide how to treat the coding of family documents during the CAR training process.

Educate Reviewer

The process of transferring the review protocol information to the human reviewers prior to the start of the CAR Review.

Code Documents

The process of human reviewers applying subjective coding decisions to documents in an effort to adequately train the CAR system to “understand” the boundaries of a category, e.g. Relevancy.

Predict Results

The process of the CAR system applying the information “learned” from the human reviewers and classifying a selected document corpus with pre-determined labels.

Test Results

The process of human reviewers using a validation process, typically statistical sampling, in an effort to create a meaningful metric of CAR performance. The metrics can take many forms; they may include estimates of defect counts in the classified population, or information retrieval metrics like Precision, Recall and F1.

Evaluate Results

The process of the review team deciding if the CAR system has achieved the goals anticipated by the review team.

Achieve Goals

The process of ending the CAR workflow and moving to the next phase in the review lifecycle, e.g. Privilege Review.

The diagram does a good job of reflecting the linear steps (Set Goals, Set Protocol, Educate Reviewer and, at the end, Achieve Goals) and of using a circle to represent the iterative steps (Code Documents, Predict Results, Test Results and Evaluate Results) that may need to be performed more than once to achieve the desired results.  It’s a very straightforward model to represent the process.  Nicely done!
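
To put the Test Results and Evaluate Results steps in more concrete terms, here is a minimal sketch (in Python, purely illustrative and not part of the EDRM model) of how a validation sample might be scored against the human reviewers’ judgments using Precision, Recall and F1, and how that score might send the team around the iterative circle again.  The sample data and the recall target are hypothetical.

```python
# Illustrative only: scoring a validation sample for a CAR/TAR project.
# Each tuple is (human_judgment, car_prediction) for one sampled document,
# where True means "relevant".  The sample data and the 75% recall target
# below are hypothetical, not values prescribed by the CARRM draft.

sample = [
    (True, True), (True, False), (False, False),
    (True, True), (False, True), (False, False),
]

true_pos  = sum(1 for human, car in sample if human and car)
false_pos = sum(1 for human, car in sample if not human and car)
false_neg = sum(1 for human, car in sample if human and not car)

precision = true_pos / (true_pos + false_pos) if (true_pos + false_pos) else 0.0
recall = true_pos / (true_pos + false_neg) if (true_pos + false_neg) else 0.0
f1 = 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0

print(f"Precision: {precision:.2f}, Recall: {recall:.2f}, F1: {f1:.2f}")

# Evaluate Results: if the metric falls short of the goal set at the start,
# the team loops back to Code Documents and supplies more training examples.
RECALL_TARGET = 0.75  # hypothetical goal
print("Goal achieved" if recall >= RECALL_TARGET else "Another training iteration needed")
```

In a real project the sample would be far larger and drawn randomly, but the loop is the same: code, predict, test, evaluate.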

Nonetheless, it’s a draft version of the model and EDRM wants your feedback.  You can send your comments to mail@edrm.net or post them on the EDRM site here.

So, what do you think?  Does the CARRM model make computer assisted review more straightforward?  Please share any comments you might have or if you’d like to know more about a particular topic.

How Are You Handling Defensible Deletion? – eDiscovery Best Practices

According to the Compliance, Governance and Oversight Council (CGOC), information volume doubles every 18-24 months and 90% of the data in the world has been created in the last two years.  So, many organizations are drowning in electronically stored information (ESI) and costs associated with managing that ESI for eDiscovery are continuing to rise.  An effective plan for information governance that includes defensible deletion of ESI is an effective way of keeping that ESI from overwhelming your organization.  But, what percentage of organizations is defensibly deleting data?  A new survey from eDJGroup is attempting to find out.

Defensible deletion of ESI that has little or no business value and is not subject to legal hold is a good business practice that is protected through Rule 37(e) of the Federal Rules of Civil Procedure (commonly known as the “safe harbor” rule) which states:

Failure to Provide Electronically Stored Information. Absent exceptional circumstances, a court may not impose sanctions under these rules on a party for failing to provide electronically stored information lost as a result of the routine, good-faith operation of an electronic information system.
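
Conceptually, a defensible deletion rule reduces to a simple test: is the item past its retention period, does it lack business value, and is it free of any legal hold?  Below is a minimal, hypothetical sketch of that test in Python; the record types, retention periods and field names are invented for illustration, and nothing here is legal advice or a description of any particular records management product.

```python
# Hypothetical sketch of a "defensible deletion" eligibility test.  The field
# names and retention periods are illustrative assumptions only.
from datetime import date, timedelta

RETENTION = {"invoice": timedelta(days=7 * 365), "email": timedelta(days=3 * 365)}

def eligible_for_deletion(record_type, created, on_legal_hold, today=None):
    """Return True only if the record is past its retention period and not on hold."""
    today = today or date.today()
    period = RETENTION.get(record_type)
    if period is None or on_legal_hold:
        return False  # unknown record types and held records are always preserved
    return created + period < today

# Example: a 2005 email that is not subject to a legal hold would be eligible.
print(eligible_for_deletion("email", date(2005, 6, 1), on_legal_hold=False,
                            today=date(2012, 12, 1)))  # True
```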

Barry Murphy’s article (Defensible Deletion Gaining Steam) on eDiscovery Journal discusses the eDJGroup survey and provides some interim results to two of the questions asked:

  1. Do you believe defensible deletion of information is necessary in order to manage growing volumes of digital information?: As of the article date on December 4, an overwhelming 93.0% of all respondents have said ‘yes’, 2.3% have said ‘no’ and 4.7% have said ‘don’t know / unsure’.  Frankly, I’m surprised that anybody doesn’t believe that defensible deletion is necessary!
  2. Does your organization currently defensibly delete information?: As of the article date on December 4, 14.0% of all respondents have said ‘yes, for all systems’, 54.7% have said ‘yes, in some systems’, 16.3% have said ‘no’ and 15.1% have said ‘don’t know’.  That means that over 2/3 of respondents so far (68.7%) defensibly delete information in at least some systems.

As of the article date, there had been 86 respondents.  But, the survey is not over!  You can take the survey here and contribute to the results.  Murphy says the ‘survey will close later this month, so be sure to take it now’, but doesn’t provide a specific date that it closes (one would hope it would be at the end of the month to be as inclusive as possible).  Nonetheless, ‘all respondents will get a summary of the results and have a chance to win a $250 gift card’, so it’s worthwhile to participate.

So, what do you think?  Does your organization have an information governance plan that includes defensible deletion of data?  Please share any comments you might have or if you’d like to know more about a particular topic.

Percentage of eDiscovery Sanctions Cases Declining – eDiscovery Trends

According to Kroll Ontrack, the percentage of eDiscovery cases addressing sanctions “dropped by approximately ten percent” compared to 2011, while “cases addressing procedural issues more than doubled”.  Let’s take a closer look at the numbers and look at some cases in each category.

As indicated in their December 4 news release, in the past year, Kroll Ontrack experts summarized 70 of the most significant state and federal judicial opinions related to the preservation, collection, review and production of electronically stored information (ESI). The breakdown of the major issues that arose in these eDiscovery cases is as follows:

  • Thirty-two percent (32%) of cases addressed sanctions regarding a variety of issues, such as preservation and spoliation, noncompliance with court orders and production disputes.  Out of 70 cases, that would be about 22 cases addressing sanctions this past year.  Here are a few of the recent sanction cases previously reported on this blog.
  • Twenty-nine percent (29%) of cases addressed procedural issues, such as search protocols, cooperation, production and privilege considerations.  Out of 70 cases, that would be about 20 cases.  Here are a few of the recent procedural issues cases previously reported on this blog.
  • Sixteen percent (16%) of cases addressed discoverability and admissibility issues.  Out of 70 cases, that would be about 11 cases.  Here are a few of the recent discoverability / admissibility cases previously reported on this blog.
  • Fourteen percent (14%) of cases discussed cost considerations, such as shifting or taxation of eDiscovery costs.  Out of 70 cases, that would be about 10 cases.  Here are a few of the recent eDiscovery costs cases previously reported on this blog.
  • Nine percent (9%) of cases discussed technology-assisted review (TAR) or predictive coding.  Out of 70 cases, that would be about 6 cases.  Here are a few of the recent TAR cases previously reported on this blog.

While it’s nice and appreciated that Kroll Ontrack has been summarizing the cases and compiling these statistics, I do have a couple of observations/questions about their numbers (sorry if they appear “nit-picky”):

  • Sometimes Cases Belong in More Than One Category: The case percentage totals add up to 100%, which would make sense except that some cases address issues in more than one category.  For example, In re Actos (Pioglitazone) Products Liability Litigation addressed both cooperation and technology-assisted review, and Freeman v. Dal-Tile Corp. addressed both search protocols and discoverability / admissibility.  It appears that Kroll classified each case in only one group, which makes the numbers add up, but could be somewhat misleading.  In theory, some cases belong in multiple categories, so the total should exceed 100%.
  • Did Cases Addressing Procedural Issues Really Double?: Kroll reported that “cases addressing procedural issues more than doubled”; however, here is how they broke down the category last year: 14% of cases addressed various procedural issues such as searching protocol and cooperation, 13% of cases addressed various production considerations, and 12% of cases addressed privilege considerations and waivers.  That’s a total of 39% for three separate categories that now appear to be described as “procedural issues, such as search protocols, cooperation, production and privilege considerations” (29%).  So, it looks to me like the percentage of cases addressing procedural issues actually dropped by 10 percentage points.  Actually, the two biggest category jumps appear to be discoverability and admissibility issues (2% last year to 16% this year) and TAR (0% last year to 9% this year).

So, what do you think?  Has your organization been involved in any eDiscovery opinions this year?  Please share any comments you might have or if you’d like to know more about a particular topic.

“Rap Weasel” Forced to Honor $1 Million Reward Offered via YouTube – eDiscovery Case Law

It isn’t every day that eDiscoveryDaily has reason to reference The Hollywood Reporter in a story about eDiscovery case law, but even celebrities have eDiscovery preservation obligations during litigation.  In Augstein v. Leslie, 11 Civ. 7512 (HB) (SDNY Oct. 17, 2012), New York District Judge Harold Baer imposed an adverse inference sanction against hip hop and R&B artist Ryan Leslie for “negligent destruction” of a hard drive returned to him by the plaintiff after a $1 million reward was offered via YouTube.  On November 28, a jury ordered him to pay the $1 million reward to the plaintiff.

Reward Offered, then Refused

While Leslie was on tour in Germany in 2010, a laptop and external hard drive (that contained some of Leslie’s songs not yet released) were stolen.  Capitalizing on his popularity on social media, Leslie initially offered $20,000 for return of the items; then, on November 6, 2010, a video was posted on YouTube increasing the reward to $1 million.  The increase of the reward was also publicized on Leslie’s Facebook and Twitter accounts.  After Augstein, a German auto repair shop owner, returned the laptop and hard drive, Leslie refused to pay the reward, alleging “the intellectual property for which he valued the laptop was not present on the hard drive when it was returned”.

Defendant’s Arguments as to Why Reward was not Warranted

Leslie attempted to make the case that when he used the word “offer,” he really meant something different. He argued that a reasonable person would have understood mention of a reward not as a unilateral contract, but instead as an “advertisement” – an invitation to negotiate.

Leslie’s other argument was that, regardless of whether it was an “offer” or not, Augstein failed to perform because he did not return the intellectual property, only the physical property.  Leslie claimed that he and several staff members tried to access the data on the hard drive but were unable to do so.  Leslie sent the hard drive to the manufacturer, Avastor, which ultimately deleted the information and sent Leslie a replacement drive.  The facts associated with the attempts to recover information from the hard drive and requests by the manufacturer to do the same were in dispute between Leslie, his assistant, and Avastor, who claimed no request for data recovery was made by Leslie or anyone on his team.

Judge’s Responses and Decision

Regarding Leslie’s characterization of the offer as an “advertisement”, Judge Baer disagreed, noting that “Leslie’s videos and other activities together are best characterized as an offer for a reward. Leslie ‘sought to induce performance, unlike an invitation to negotiate [often an advertisement], which seeks a reciprocal promise.’”

Regarding Leslie’s duty to preserve the hard drive, Judge Baer noted: “In this case, Leslie was on notice that the information on the hard drive may be relevant to future litigation and, as a result, had an obligation to preserve that information. Augstein contacted Leslie personally and through his attorney regarding the payment of the reward, and a short time later, the hard drive was sent by Leslie to Avastor….Leslie does not dispute these facts.”  As a result, Judge Baer found that “Leslie and his team were at least negligent in their handling of the hard drive.”

Citing Zubulake among other cases with respect to negligence as sufficient for spoliation, Judge Baer ruled “I therefore impose a sanction of an adverse inference; it shall be assumed that the desired intellectual property was present on the hard drive when Augstein returned it to the police.”  This led to the jury’s decision and award last month, causing the New York Post to characterize Leslie as a “Rap Weasel”, which Leslie himself poked fun at on Instagram.  Only in America!

So, what do you think?  Was the adverse inference sanction warranted?  Please share any comments you might have or if you’d like to know more about a particular topic.

More Self-Documentation Features for Review Solutions – eDiscovery Best Practices

As we discussed yesterday, one feature of review solutions that often gets overlooked is the ability for the review solution to automatically document searching and review activities.  Not only does that make it easier to identify potential issues in the process; it also facilitates the ability for attorneys to demonstrate a defensible approach to discovery to the court.

Yesterday, we discussed self-documentation with regard to keeping a search history to support easy “tweaking” of searches and to document the natural iterative process of searching, facilitating the ability for attorneys to demonstrate a defensible search approach to discovery to the court. Let’s discuss two other areas where self-documentation can assist in the discovery analysis and review process:

Review Set Assignment and Tracking: When a review effort requires multiple reviewers to meet the review and production deadline, assigning documents to each reviewer and tracking each reviewer’s progress to estimate completion is critical and can be extremely time consuming to perform manually (especially for large scale review projects involving dozens or even hundreds of reviewers).  A review application, such as OnDemand®, that automates the assignment of documents to the reviewers and automatically tracks their review activity and throughput eliminates that manual time, enabling the review supervisor to provide feedback to the reviewers for improved review results as well as reassign documents as needed to maximize reviewer productivity.
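
As a rough illustration of what that kind of automation does behind the scenes (a hypothetical sketch only, not a description of OnDemand’s actual design), assigning documents round-robin and tracking each reviewer’s throughput might look something like this:

```python
# Hypothetical sketch of automated review-set assignment and progress tracking.
# Illustrative only; it does not reflect any particular review application.
from itertools import cycle

def assign_batches(doc_ids, reviewers):
    """Distribute documents round-robin across reviewers."""
    assignments = {r: [] for r in reviewers}
    for doc_id, reviewer in zip(doc_ids, cycle(reviewers)):
        assignments[reviewer].append(doc_id)
    return assignments

def progress_report(assignments, reviewed_counts):
    """Report each reviewer's percent complete so work can be rebalanced."""
    for reviewer, docs in assignments.items():
        done = reviewed_counts.get(reviewer, 0)
        pct = 100 * done / len(docs) if docs else 100
        print(f"{reviewer}: {done}/{len(docs)} documents ({pct:.0f}%)")

assignments = assign_batches(range(1, 101), ["reviewer_a", "reviewer_b", "reviewer_c"])
progress_report(assignments, {"reviewer_a": 30, "reviewer_b": 12, "reviewer_c": 25})
```

The point is not the code; it is that the review supervisor can see at a glance who is falling behind and reassign documents before the deadline is at risk.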

Track Tag and Edit Activity: Review projects involving multiple attorneys and reviewers can be difficult to manage.  The risk of mistakes is high.  For example, privileged documents can be inadvertently tagged non-privileged and important notes or comments regarding individual documents can be inadvertently deleted.  One or more of the users in your case could be making these mistakes and not even be aware that it’s occurring.  A review application, such as OnDemand®, that tracks each tagging/un-tagging event and each edit to any field for a document can enable you to generate an audit log report to look for potential mistakes and issues.  For example, generate an audit log report showing any documents where the Privileged tag was applied and then removed.  Audit log reports are a great way to identify mistakes that have occurred, determine which user made those mistakes, and address those mistakes with them to eliminate future occurrences.  Using the self-documentation feature of an audit log report can enable you to avoid inadvertent disclosures of privileged documents and other potential eDiscovery production issues.
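
For instance, the Privileged tag scenario above boils down to a simple scan over a chronological audit log.  Here is a hypothetical sketch (the log format is an invented assumption, not OnDemand’s actual report format):

```python
# Hypothetical sketch: find documents where the "Privileged" tag was applied
# and subsequently removed.  The log entries below are invented examples.
audit_log = [
    # (document_id, user, action, tag) in chronological order
    ("DOC-0001", "reviewer_a", "tag",   "Privileged"),
    ("DOC-0002", "reviewer_b", "tag",   "Privileged"),
    ("DOC-0001", "reviewer_c", "untag", "Privileged"),
    ("DOC-0003", "reviewer_a", "tag",   "Responsive"),
]

tagged = set()
suspect = []
for doc_id, user, action, tag in audit_log:
    if tag != "Privileged":
        continue
    if action == "tag":
        tagged.add(doc_id)
    elif action == "untag" and doc_id in tagged:
        suspect.append((doc_id, user))  # flag for supervisor follow-up

for doc_id, user in suspect:
    print(f"{doc_id}: Privileged tag removed by {user} -- review before production")
```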

So, what do you think?  How important are self-documentation features in a review solution to you?  Can you think of other important self-documentation features in a review solution?  Please share any comments you might have or if you’d like to know more about a particular topic.

When Considering Review Solutions, Don’t Forget About Self-Documentation – eDiscovery Best Practices

When evaluating eDiscovery review solutions, there are a number of features that attorneys consider as part of their selection process.  For example: What searching capabilities does the solution have?  How does it handle native files?  How does it support annotations and redactions of images?  Can it support conceptual clustering and predictive coding?  But, one feature that often gets overlooked is the ability for the review solution to automatically document searching and review activities.  Not only does that make it easier to identify potential issues in the process; it also facilitates the ability for attorneys to demonstrate a defensible approach to discovery to the court.

There are at least three areas where self-documentation can assist in the discovery analysis and review process:

Searching: An application, such as FirstPass®, powered by Venio FPR™, that keeps track of every search in a search history can provide assistance to attorneys to demonstrate a defensible search approach.  eDiscovery searching is almost always an iterative process where you perform a search, analyze the results (often through sampling of the search results, which FirstPass also supports), then adjust the search to either add, remove or modify terms to improve recall (when responsive information is being missed) or improve precision (when the terms are overly broad and yielding way too much non-responsive information, such as the “mining” example we’ve discussed previously).

Tracking search history accomplishes two things: 1) it makes it easier to recall previous searches and “tweak” them to run a modified version of the search without starting from scratch (some searches can be really complex, so this can be a tremendous time saver) and, 2) it documents the natural iterative process of searching, facilitating the ability for attorneys to demonstrate a defensible search approach to discovery to the court, if necessary.  And, if you don’t think that ever comes up, check out these case summaries here, here, here and here.  Not only that, the ability to look at previous searches can shorten the learning curve for new users who need to conduct searches by giving them examples after which to pattern their own searches.
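
To picture what a search history buys you, even a simple log of each query, its hit count and a sampled precision estimate documents that iterative narrowing of terms.  The sketch below is purely illustrative (the queries and figures are invented, loosely following the “mining” example mentioned above) and is not a description of FirstPass’s actual feature set:

```python
# Hypothetical sketch of a search-history log that documents iterative searching.
# Queries, hit counts, and sampled precision figures are invented for illustration.
from datetime import datetime

search_history = []

def log_search(query, hit_count, sampled_precision, note=""):
    """Record each iteration so earlier searches can be recalled and tweaked."""
    search_history.append({
        "run_at": datetime.now().isoformat(timespec="seconds"),
        "query": query,
        "hits": hit_count,
        "sampled_precision": sampled_precision,
        "note": note,
    })

log_search('"mining"', 48213, 0.08, "overly broad: data mining vs. mineral mining")
log_search('"mining" AND (coal OR copper OR shaft)', 3907, 0.61, "narrowed to mineral context")
log_search('"mining" w/10 (coal OR copper OR shaft)', 2512, 0.78, "proximity operator improves precision")

for entry in search_history:
    print(f'{entry["run_at"]}  {entry["hits"]:>6} hits  '
          f'precision~{entry["sampled_precision"]:.2f}  {entry["query"]}')
```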

Tomorrow, we’ll discuss the other two areas where self-documentation can assist in the discovery analysis and review process.  Let the anticipation build!

So, what do you think?  How important are self-documentation features in a review solution to you?  Please share any comments you might have or if you’d like to know more about a particular topic.

New eDiscovery Guidelines for Northern District of California – eDiscovery Trends

The U.S. District Court for the Northern District of California has announced new Guidelines for counsel and litigants regarding the discovery of electronically stored information (“ESI”) effective as of last Tuesday (November 27). The Guidelines were developed by a bench-bar committee chaired by Magistrate Judge Elizabeth D. Laporte in partnership with the Court’s Rules Committee and unanimously approved by the entire Court.

As stated in the announcement: “Counsel and litigants should familiarize themselves with the Guidelines and immediately begin using the revised Standing Order for All Judges of the Northern District of California when preparing case management statements and the Checklist as appropriate when meeting and conferring.”

As noted in the announcement, in addition to the Standing Order noted above, the package of new ESI-related documents includes the ESI Guidelines themselves, a Checklist for use when meeting and conferring under Fed. R. Civ. P. 26(f), and a Model Stipulated Order governing e-discovery.

In the announcement, Judge Laporte stated: “These tools are designed to promote cooperative e-discovery planning as soon as practicable that is tailored and proportionate to the needs of the particular case to achieve its just, speedy and inexpensive resolution, consistent with Rule 1 of the Federal Rules of Civil Procedure… The Court requires counsel to be familiar with these tools and confirm in the initial case management statement that they have reviewed the Guidelines regarding preservation and decided whether to enter into a stipulated order governing e-discovery, in light of the Model Stipulated Order.”

To confirm that familiarity and understanding by counsel, paragraph 6 of the Standing Order requires that all Joint Case Management Statements include:

“A brief report certifying that the parties have reviewed the Guidelines Relating to the Discovery of Electronically Stored Information (“ESI Guidelines”), and confirming that the parties have met and conferred pursuant to Fed. R. Civ. P. 26(f) regarding reasonable and proportionate steps taken to preserve evidence relevant to the issues reasonably evident in this action.”

As noted in this blog previously, other courts, such as the Southern District of New York (pilot program) and the Eastern District of Texas (for patent cases) have implemented standards for handling ESI, at least in certain situations.

So, what do you think?  Should all District courts adopt similar standards and provide similar guidelines and checklists?  If not, why not?  Please share any comments you might have or if you’d like to know more about a particular topic.

Another Major eDiscovery Acquisition: DTI Acquires Fios – eDiscovery Trends

As reported by Law Technology News and the Litigation Support News and Information Blog, Document Technologies Inc. (DTI), the nation’s largest independent provider of discovery services, facilities management, and knowledge process outsourcing, has acquired Fios Inc., one of the electronic discovery industry’s most recognized brands.  The new company will be known as “Fios, A DTI Company.”

Atlanta-based DTI said that Fios was purchased to gain clients and staff. Based in Portland, Ore. and founded in 1999, Fios also brings software development skills and workflow expertise to the DTI portfolio.

“We felt the opportunity to bring Fios into the DTI family was quite attractive,” said DTI CEO John Davenport Jr. “They have exactly what we look for — an exceptional group of high-performing employees, a respected name, strong relationships with an impressive list of Am Law 100 and Fortune 500 corporate clients, and most importantly, similar values and corporate cultures.”

Terms of the current deal were not disclosed.  DTI has been busy this year – this is its third acquisition of the year, after acquiring Los Angeles computer forensics specialist Data Forté in July and Houston-based legal staffing company Provius in September.

Industry consolidation continues.  This latest acquisition makes at least 44 eDiscovery industry deals so far this year, with companies such as Applied Discovery, CaseCentral, Sanction Solutions, Lateral Data and Digital Reef having been acquired.  Of course, not all acquisitions work out, as we saw recently with the HP/Autonomy purchase.  It will be interesting to see how this acquisition works out and how well DTI integrates all of its recent acquisitions.

So, what do you think?  Will the accelerated pace of eDiscovery acquisitions continue?  If so, who’s next?  Please share any comments you might have or if you’d like to know more about a particular topic.

Plaintiff Hammered with Case Dismissal for “Egregious” Discovery Violations – eDiscovery Case Law

Apparently, destroying your first computer with a sledgehammer and using Evidence Eliminator and CCleaner on your second computer (when you have a duty to preserve both) are not considered to be best practices for preservation.  Who knew?  😉

In Taylor v. Mitre Corp. (E.D. Va. Nov. 8, 2012), Virginia District Court Judge Liam O’Grady upheld the findings by the Magistrate Judge for dismissal of the plaintiff’s claims and payment of the defendant’s reasonable attorney’s fees and costs due to “egregious” discovery conduct.  Here’s why:

  • The plaintiff hired counsel back in 2009 “in anticipation of bringing this lawsuit against Mitre for violations of the FMLA and failure to accommodate his disabilities.  Mr. Taylor’s lawyer immediately put him on clear notice that he was required to maintain all files and documents (electronic and otherwise) related to his claim, and that deleting or discarding such files could result in sanctions including dismissal of his claim.”;
  • The plaintiff filed his EEOC claim in November 2010;
  • Sometime in 2011, the plaintiff “wiped” his work desktop, then “took a sledgehammer to it” and disposed of it in the local landfill (as noted in the footnote: “Mr. Taylor has given varying accounts of the size and type of the hammer he used to wreck the computer, but does not deny that he smashed it with some nature of mallet.”);
  • Before destroying his work computer, the plaintiff “attempted” to back up files from it “but was only partially successful”;
  • In November 2011, the plaintiff filed his complaint;
  • On July 1, the Magistrate Judge “ordered Mr. Taylor to submit his current computer, a laptop, to inspection within a week.”  The plaintiff “had represented that whatever documents he maintained during the partially successful backup operation described above had been transferred to the laptop, and the Defendant won permission to inspect the laptop”;
  • A few days later, the defendant’s forensic expert examined the laptop and determined that the plaintiff had “run a program called Evidence Eliminator, a program whose express purpose is removing ‘sensitive material’ from one’s hard drive and defeating forensic software.”  The plaintiff admitted that “he downloaded the program the same day he learned of the Court’s inspection order”;
  • The plaintiff also ran another program, CCleaner (which also erases files from the computer so that they cannot be recovered), “at least twice between the time of the Court’s inspection order and the actual inspection.”

The plaintiff claimed “that CCleaner was set to run automatically, and so even if it did delete relevant documents, the deletion was not intentional.” – a claim that the court found to be “highly suspicious”.  However, when it came to the installation of Evidence Eliminator, Judge O’Grady did not mince words:

“For Mr. Taylor to download and run a program whose express purpose is deletion of evidence in direct response to the Magistrate Judge’s order that his computer be produced for inspection was to blatantly disregard his duties in the judicial system under which he sought relief. The Court finds Mr. Taylor’s conduct to be egregious and highly contemptuous of the inspection order. Mr. Taylor has forfeited his right to pursue his claims with this Court any further.”

So, what do you think?  Is this the most egregious example of spoliation you’ve ever seen?  Please share any comments you might have or if you’d like to know more about a particular topic.

The Grossman-Cormack Glossary of Technology Assisted Review – eDiscovery Resources

Do you know what a “Confidence Level” is?  No, I’m not talking about Tom Brady completing football passes in coverage.  How about “Harmonic Mean”?  Maybe if I hum a few bars?  Gaussian Calculator?  Sorry, it has nothing to do with how many Tums you should eat after a big meal.  No, the answer to all of these can be found in the new Grossman-Cormack Glossary of Technology Assisted Review.

Maura Grossman and Gordon Cormack are educating us yet again with regard to Technology Assisted Review (TAR) with a comprehensive glossary that defines key TAR-related terms and also provides some key case references, including EORHB, Global Aerospace, In re: Actos, Kleen Products and, of course, Da Silva Moore.  The authors of the heavily cited article Technology-Assisted Review in E-Discovery Can Be More Effective and More Efficient Than Exhaustive Manual Review have provided a new reference document that may help many in the industry understand key TAR concepts better.  Or, at least recognize key terms associated with TAR.  This is version 1.01, published just this month and clearly intended to evolve over time.  As the authors note in the Preamble:

“The introduction of TAR into the legal community has brought with it much confusion because different terms are being used to refer to the same thing (e.g., ‘technology assisted review,’ ‘computer-assisted review,’ ‘computer-aided review,’ ‘predictive coding,’ and ‘content based advanced analytics,’ to name but a few), and the same terms are also being used to refer to different things (e.g., ‘seed sets’ and ‘control sample’). Moreover, the introduction of complex statistical concepts, and terms-of-art from the science of information retrieval, have resulted in widespread misunderstanding and sometimes perversion of their actual meanings.

This glossary is written in an effort to bring order to chaos by introducing a common framework and set of definitions for use by the bar, the bench, and service providers. The glossary endeavors to be comprehensive, but its definitions are necessarily brief. Interested readers may look elsewhere for detailed information concerning any of these topics. The terms in the glossary are presented in alphabetical order, with all defined terms in capital letters.

In the future, we plan to create an electronic version of this glossary that will contain live links, cross references, and annotations. We also envision this glossary to be a living, breathing work that will evolve over time. Towards that end, we invite our colleagues in the industry to send us their comments on our definitions, as well as any additional terms they would like to see included in the glossary, so that we can reach a consensus on a consistent, common language relating to technology assisted review. Comments can be sent to us at mrgrossman@wlrk.com and gvcormac@uwaterloo.ca.”

Live links, with a Table of Contents, in a (hopefully soon) next iteration will definitely make this guide even more useful.  Nonetheless, it’s a great resource for those of us who have bandied around these terms for some time.
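
As an example of why the statistical entries matter, terms like “Confidence Level” and “Confidence Interval” come up whenever you size a validation sample.  Here is a back-of-the-envelope sketch using the standard normal approximation; it is illustrative only and is not taken from the glossary itself:

```python
# Back-of-the-envelope sample size calculation using the normal approximation.
# Illustrative only; consult the glossary (and a statistician) for real projects.
import math

Z_SCORES = {0.90: 1.645, 0.95: 1.96, 0.99: 2.576}  # common confidence levels

def sample_size(confidence, margin_of_error, proportion=0.5, population=None):
    """Documents to sample for a given confidence level and margin of error."""
    z = Z_SCORES[confidence]
    n = (z ** 2) * proportion * (1 - proportion) / (margin_of_error ** 2)
    if population:  # finite population correction
        n = n / (1 + (n - 1) / population)
    return math.ceil(n)

# e.g. 95% confidence, +/- 2% margin, out of a 1,000,000 document collection
print(sample_size(0.95, 0.02, population=1_000_000))  # roughly 2,400 documents
```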

So, what do you think?  Will this glossary help educate the industry and help standardize use of the terms?  Or will it lead to one big “Confusion Matrix”? (sorry, I couldn’t resist)  Please share any comments you might have or if you’d like to know more about a particular topic.

Disclaimer: The views represented herein are exclusively the views of the author, and do not necessarily represent the views held by CloudNine Discovery. eDiscoveryDaily is made available by CloudNine Discovery solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscoveryDaily should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.