Searching

eDiscovery Case Law: “Tweets” Are Public and Must Be Produced, Judge Rules

 

First, Malcolm Harris tried to quash a subpoena seeking production of his Tweets and his Twitter account user information in his New York criminal case.  That request was rejected, so Twitter then sought to quash the subpoena itself, claiming that the order to produce the information imposed an “undue burden” on Twitter and even forced it to “violate federal law”.  Now, the criminal court judge has ruled on Twitter’s motion.

On June 30, in People v. Harris, 2011NY080152, New York Criminal Court Judge Matthew Sciarrino Jr. ruled that Twitter must produce the tweets and user information of Harris, an Occupy Wall Street protester who clashed with New York police back in October of last year and faces disorderly conduct charges.

Noting that “the court order is not unreasonably burdensome to Twitter, as it does not take much to search and provide the data to the court,” Judge Sciarrino provided an analogy regarding the privacy of the Twitter account information, as follows:

“Consider the following: a man walks to his window, opens the window, and screams down to a young lady, ‘I'm sorry I hit you, please come back upstairs.’ At trial, the People call a person who was walking across the street at the time this occurred. The prosecutor asks, ‘What did the defendant yell?’ Clearly the answer is relevant and the witness could be compelled to testify. Well today, the street is an online, information superhighway, and the witnesses can be the third party providers like Twitter, Facebook, Instagram, Pinterest, or the next hot social media application.”

Continuing, Judge Sciarrino stated: “If you post a tweet, just like if you scream it out the window, there is no reasonable expectation of privacy. There is no proprietary interest in your tweets, which you have now gifted to the world. This is not the same as a private email, a private direct message, a private chat, or any of the other readily available ways to have a private conversation via the internet that now exist…Those private dialogues would require a warrant based on probable cause in order to access the relevant information.”

Judge Sciarrino indicated that his decision was “partially based on Twitter's then terms of service agreement. After the April 20, 2012 decision, Twitter changed its terms and policy effective May 17, 2012. The newly added portion states that: ‘You Retain Your Right To Any Content You Submit, Post Or Display On Or Through The Service.’”  So, it will be interesting to see whether the same ruling would apply to “tweets” and other information posted after that date.

Judge Sciarrino did note that the government must obtain a search warrant to compel a provider of Electronic Communication Service (“ECS”) to disclose contents of communication in its possession that are in temporary "electronic storage" for 180 days or less (18 USC §2703[a]).  So, he ordered “that Twitter disclose all non-content information and content information from September 15, 2011 to December 30, 2011” related to Harris’ account.

So, what do you think?  Did the judge make the right call or should Twitter have been able to quash the subpoena?  Please share any comments you might have or if you’d like to know more about a particular topic.

Disclaimer: The views represented herein are exclusively the views of the author, and do not necessarily represent the views held by CloudNine Discovery. eDiscoveryDaily is made available by CloudNine Discovery solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscoveryDaily should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.

eDiscovery Trends: TREC Study Finds that Technology Assisted Review is More Cost Effective

 

As reported in Law Technology News (Technology-Assisted Review Boosted in TREC 2011 Results by Evan Koblentz), the Text Retrieval Conference (TREC) Legal Track, a government-sponsored project designed to assess the ability of information retrieval techniques to meet the needs of the legal profession, has released its 2011 study results (after several delays).  The overview of the 2011 TREC Legal Track can be found here.

The report concludes the following: “From 2008 through 2011, the results show that the technology-assisted review efforts of several participants achieve recall scores that are about as high as might reasonably be measured using current evaluation methodologies. These efforts require human review of only a fraction of the entire collection, with the consequence that they are far more cost-effective than manual review.” 

However, the report also notes that “There is still plenty of room for improvement in the efficiency and effectiveness of technology-assisted review efforts, and, in particular, the accuracy of intra-review recall estimation tools, so as to support a reasonable decision that 'enough is enough' and to declare the review complete. Commensurate with improvements in review efficiency and effectiveness is the need for improved external evaluation methodologies that address the limitations of those used in the TREC Legal Track and similar efforts.”

Other notable tidbits from the study and article:

  • Ten organizations participated in the 2011 study, including universities from diverse locations such as Beijing and Melbourne and vendors including OpenText and Recommind;
  • Participants were required to rank the entire corpus of 685,592 documents by their estimate of the probability of responsiveness to each of three topics, and also to provide a quantitative estimate of that probability;
  • The document collection used was derived from the EDRM Enron Data Set;
  • The learning task had three distinct topics, each representing a distinct request for production.  A total of 16,999 documents were selected – about 5,600 per topic – to form the “gold standard” against which participants’ results were compared;
  • OpenText had the top number of documents reviewed compared to recall percentage in the first topic, the University of Waterloo led the second, and Recommind placed best in the third;
  • One of the participants has been barred from future participation in TREC – “It is inappropriate – and forbidden by the TREC participation agreement – to claim that the results presented here show that one participant’s system or approach is generally better than another’s. It is also inappropriate to compare the results of TREC 2011 with the results of past TREC Legal Track exercises, as the test conditions as well as the particular techniques and tools employed by the participating teams are not directly comparable. One TREC 2011 Legal Track participant was barred from future participation in TREC for advertising such invalid comparisons”.  According to the LTN article, the barred participant was Recommind.

For more information, check out the links to the article and the study above.  TREC previously announced that there would be no 2012 study and is targeting obtaining a new data set for 2013.

So, what do you think?  Are you surprised by the results or are they expected?  Please share any comments you might have or if you’d like to know more about a particular topic.

Disclaimer: The views represented herein are exclusively the views of the author, and do not necessarily represent the views held by CloudNine Discovery. eDiscoveryDaily is made available by CloudNine Discovery solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscoveryDaily should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.

eDiscovery Trends: eDiscovery Work is Growing in Law Firms and Corporations

 

There was an article in Law Technology News last Friday (Survey Shows Surge in E-Discovery Work at Law Firms and Corporations, written by Monica Bay) that discussed the findings of a survey released by The Cowen Group, indicating that eDiscovery work in law firms and corporations is growing considerably.  Eighty-eight law firm and corporate law department professionals responded to the survey.

Some of the key findings:

  • 70 percent of law firm respondents reported an increase in workload for their litigation support and eDiscovery departments (compared to 42 percent in the second quarter of 2009);
  • 77 percent of corporate law department respondents reported an increase in workload for their litigation support and eDiscovery departments;
  • 60 percent of respondents anticipate increasing their internal capabilities for eDiscovery;
  • 55 percent of corporate and 62 percent of firm respondents said they "anticipate outsourcing a significant amount of eDiscovery to third-party providers” (some organizations expect to both increase internal capabilities and outsource);
  • 50 percent of the firms believe they will increase technology spending in the next three months (compared to 31 percent of firms in 2010);
  • 43 percent of firms plan to add people to their litigation support and eDiscovery staff in the next 3 months, compared to 32 percent in 2011;
  • Noting that “corporate legal departments are under increasing pressure to ‘do more with less in-house to keep external costs down’”, only 12 percent of corporate respondents anticipate increasing headcount and 30 percent will increase their technology spend in the next six months;
  • In the past year, 49 percent of law firms and 23 percent of corporations have used Technology Assisted Review/ Predictive Coding technology through a third party service provider – an additional 38 percent have considered using it;
  • As for TAR/Predictive Coding in-house, 30 percent of firms have an in-house tool, and an additional 35 percent are considering making the investment.

As managing partner David Cowen notes, “Cases such as Da Silva Moore, Kleen, and Global Aerospace, which have hit our collective consciousness in the past three months, affect the investments in technology that both law firms and corporations are making.”  He concludes the Executive Summary of the report with this advice: “Educate yourself on the latest evolving industry trends, invest in relationships, and be an active participant in helping your executives, your department, and your clients ‘do more with less’.”

So, what do you think?  Do any of those numbers and trends surprise you?  Please share any comments you might have or if you’d like to know more about a particular topic.

Disclaimer: The views represented herein are exclusively the views of the author, and do not necessarily represent the views held by CloudNine Discovery. eDiscoveryDaily is made available by CloudNine Discovery solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscoveryDaily should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.

eDiscovery Trends: The Da Silva Moore Case Has Class (Certification, That Is)

 

As noted in an article written by Mark Hamblett in Law Technology News, Judge Andrew Carter of the U.S. District Court for the Southern District of New York has granted conditional class certification in the Da Silva Moore v. Publicis Groupe & MSL Group case.

In this case, women employees of the advertising conglomerate Publicis Groupe and its U.S. subsidiary, MSL, have accused their employer of company-wide discrimination, pregnancy discrimination, and a practice of keeping women at entry-level positions with few opportunities for promotion.

Judge Carter concluded that “Plaintiffs have met their burden by making a modest factual showing to demonstrate that they and potential plaintiffs together were victims of a common policy or plan that violated the law. They submit sufficient information that because of a common pay scale, they were paid wages lower than the wages paid to men for the performance of substantially equal work. The information also reveals that Plaintiffs had similar responsibilities as other professionals with the same title. Defendants may disagree with Plaintiffs' contentions, but the Court cannot hold Plaintiffs to a higher standard simply because it is an EPA action rather an action brought under the FLSA.”

“Courts have conditionally certified classes where the plaintiffs have different job functions,” Judge Carter noted, indicating that “[p]laintiffs have to make a mere showing that they are similarly situated to themselves and the potential opt-in members and Plaintiffs here have accomplished their goal.”

This is just the latest development in this test case for the use of computer-assisted coding to search electronic documents for responsive discovery. On February 24, Magistrate Judge Andrew J. Peck of the U.S. District Court for the Southern District of New York issued an opinion making this likely the first case to accept the use of computer-assisted review of electronically stored information (“ESI”).  However, on March 13, District Court Judge Andrew L. Carter, Jr. granted plaintiffs’ request to submit additional briefing on their February 22 objections to the ruling.  In that briefing (filed on March 26), the plaintiffs claimed that the protocol approved for predictive coding “risks failing to capture a staggering 65% of the relevant documents in this case” and questioned Judge Peck’s relationship with defense counsel and with the selected vendor for the case, Recommind.

Then, on April 5, Judge Peck issued an order in response to Plaintiffs’ letter requesting his recusal, directing plaintiffs to indicate whether they would file a formal motion for recusal or ask the Court to consider the letter as the motion.  On April 13 (Friday the 13th, that is), the plaintiffs did just that, formally requesting the recusal of Judge Peck (the defendants issued a response in opposition on April 30).  But, on April 25, Judge Carter issued an opinion and order in the case, upholding Judge Peck’s opinion approving computer-assisted review.

Not done, the plaintiffs filed an objection on May 9 to Judge Peck's rejection of their request to stay discovery pending the resolution of outstanding motions and objections (including the recusal motion, which had yet to be ruled on).  Then, on May 14, Judge Peck issued a stay, stopping defendant MSLGroup's production of electronically stored information.  Finally, on June 15, Judge Peck, in a 56-page opinion and order, denied the plaintiffs’ motion for recusal.

So, what do you think?  What will happen in this case next?  Please share any comments you might have or if you’d like to know more about a particular topic.

Disclaimer: The views represented herein are exclusively the views of the author, and do not necessarily represent the views held by CloudNine Discovery. eDiscoveryDaily is made available by CloudNine Discovery solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscoveryDaily should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.

eDiscovery Trends: First Pass Review – Domain Categorization of Your Opponent’s Data

 

Even those of us at eDiscoveryDaily have to take an occasional vacation; however, instead of “going dark” for the week, we thought we would republish a post series from the early days of the blog (when we didn’t have many readers yet).  So chances are, you haven’t seen these posts yet!  Enjoy!

Yesterday, we talked about the use of First Pass Review (FPR) applications (such as FirstPass®, powered by Venio FPR™) to not only conduct first pass review of your own collection, but also to analyze your opponent’s ESI production.  One way to analyze that data is through “fuzzy” searching to find misspellings or OCR errors in an opponent’s produced ESI.

Domain Categorization

Another type of analysis is the use of domain categorization.  Email is generally the biggest component of most ESI collections and each participant in an email communication belongs to a domain associated with the email server that manages their email.

FirstPass supports domain categorization by providing a list of domains associated with the ESI collection being reviewed, with a count for each domain that appears in emails in the collection.  Domain categorization provides several benefits when reviewing your opponent’s ESI:

  • Non-Responsive Produced ESI: Domains in the list that are obviously non-responsive to the case can be quickly identified and all messages associated with those domains can be “group-tagged” as non-responsive.  If a significant percentage of files are identified as non-responsive, that may be a sign that your opponent is trying to “bury you with paper” (albeit electronic).
  • Inadvertent Disclosures: If there are any emails associated with outside counsel’s domain, they could be inadvertent disclosures of attorney work product or attorney-client privileged communications.  If so, you can then address those according to the agreed-upon process for handling inadvertent disclosures and clawback of same.
  • Issue Identification: Messages associated with certain parties might be related to specific issues (e.g., an alleged design flaw of a specific subcontractor’s product), so domain categorization can isolate those messages more quickly.
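The mechanics behind domain categorization are straightforward: extract the domain from each email address in the collection and tally the results. Here is a minimal sketch of the idea in Python (the addresses and domains are invented, and this illustrates the general technique, not FirstPass's actual implementation):

```python
from collections import Counter
from email.utils import parseaddr

def domain_counts(addresses):
    """Tally the domain of each email address found in a produced collection."""
    counts = Counter()
    for raw in addresses:
        _, addr = parseaddr(raw)          # handles "Name <user@domain>" forms
        if "@" in addr:
            counts[addr.rsplit("@", 1)[1].lower()] += 1
    return counts

# Hypothetical sender/recipient addresses pulled from a production
sample = [
    "Jane Doe <jane@acme.com>",
    "bob@acme.com",
    "counsel@outsidefirm.com",
    "newsletter@retailer.com",
]
for domain, n in domain_counts(sample).most_common():
    print(domain, n)
```

Sorting by count makes both obvious junk domains (candidates for group-tagging as non-responsive) and outside counsel's domain (potential inadvertent disclosures) stand out at a glance.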

In summary, there are several ways to use first pass review tools, like FirstPass, for reviewing your opponent’s ESI production, including: email analytics, synonym searching, fuzzy searching and domain categorization.  First pass review isn’t just for your own production; it’s also an effective process to quickly evaluate your opponent’s production.

So, what do you think?  Have you used first pass review tools to assess an opponent’s produced ESI?  Please share any comments you might have or if you’d like to know more about a particular topic.

Disclaimer: The views represented herein are exclusively the views of the author, and do not necessarily represent the views held by CloudNine Discovery. eDiscoveryDaily is made available by CloudNine Discovery solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscoveryDaily should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.

eDiscovery Trends: First Pass Review – Fuzzy Searching Your Opponent’s Data

 

Even those of us at eDiscoveryDaily have to take an occasional vacation; however, instead of “going dark” for the week, we thought we would republish a post series from the early days of the blog (when we didn’t have many readers yet).  So chances are, you haven’t seen these posts yet!  Enjoy!

Tuesday, we talked about the use of First Pass Review (FPR) applications (such as FirstPass®, powered by Venio FPR™) to not only conduct first pass review of your own collection, but also to analyze your opponent’s ESI production.  One way to analyze that data is through synonym searching to find variations of your search terms to increase the possibility of finding the terminology used by your opponents.

Fuzzy Searching

Another type of analysis is the use of fuzzy searching.  Attorneys know what terms they’re looking for, but those terms may not always be spelled correctly in the collection.  Also, opposing counsel may produce a number of image-only files that require Optical Character Recognition (OCR), which is usually not 100% accurate.

FirstPass supports "fuzzy" searching, a mechanism for finding alternate words that are close in spelling to the word you're looking for (usually one or two characters off).  FirstPass will display all of the words in the collection that are close to the word you’re looking for, so if you’re looking for the term “petroleum”, you can find variations such as “peroleum”, “petoleum” or even “petroleom” – misspellings or OCR errors that could be relevant.  Then, simply select the variations you wish to include in the search.  Fuzzy searching is the best way to broaden your search to include potential misspellings and OCR errors, and FirstPass provides a terrific capability to select those variations to review additional potential “hits” in your collection.
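Fuzzy searching of this kind is typically built on edit distance: the number of single-character insertions, deletions, and substitutions separating two words. The sketch below shows the general approach (FirstPass's internals are not public, so this is purely illustrative):

```python
def edit_distance(a, b):
    """Classic Levenshtein distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,            # deletion
                            curr[j - 1] + 1,        # insertion
                            prev[j - 1] + (ca != cb)))  # substitution
        prev = curr
    return prev[-1]

def fuzzy_matches(term, vocabulary, max_distance=2):
    """Return words in the collection within max_distance edits of the term."""
    return [w for w in vocabulary
            if edit_distance(term.lower(), w.lower()) <= max_distance]

# Hypothetical word list extracted from a produced collection
vocab = ["petroleum", "peroleum", "petoleum", "petroleom", "perturb"]
print(fuzzy_matches("petroleum", vocab))
# The three misspellings from the example above all fall within two edits
```

A reviewer would then check the candidate list and select only the variations worth including, exactly as the tool's workflow describes.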

Tomorrow, I’ll talk about the use of domain categorization to quickly identify potential inadvertent disclosures and weed out non-responsive files produced by your opponent, based on the domain of the communicators.  Hasta la vista, baby! 🙂

In the meantime, what do you think?  Have you used fuzzy searching to find misspellings or OCR errors in an opponent’s produced ESI?  Please share any comments you might have or if you’d like to know more about a particular topic.

Disclaimer: The views represented herein are exclusively the views of the author, and do not necessarily represent the views held by CloudNine Discovery. eDiscoveryDaily is made available by CloudNine Discovery solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscoveryDaily should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.

eDiscovery Trends: First Pass Review – Synonym Searching Your Opponent’s Data

 

Even those of us at eDiscoveryDaily have to take an occasional vacation; however, instead of “going dark” for the week, we thought we would republish a post series from the early days of the blog (when we didn’t have many readers yet).  So chances are, you haven’t seen these posts yet!  Enjoy!

Yesterday, we talked about the use of First Pass Review (FPR) applications (such as FirstPass®, powered by Venio FPR™) to not only conduct first pass review of your own collection, but also to analyze your opponent’s ESI production.  One way to analyze that data is through email analytics to see the communication patterns graphically to identify key parties for deposition purposes and look for potential production omissions.

Synonym Searching

Another type of analysis is the use of synonym searching.  Attorneys understand the key terminology their client uses, but they often don’t know the terminology their client’s opposition uses because they haven’t interviewed the opposition’s custodians.  In a product defect case, the opposition may refer to admitted design or construction “mistakes” in their product or process as “flaws”, “errors”, “goofs” or even “flubs”.  With FirstPass, you can enter your search term into the synonym searching section of the application and it will provide a list of synonyms (with hit counts of each, if selected).  Then, you can simply select the synonyms you wish to include in the search.  As a result, FirstPass identifies synonyms of your search terms to broaden the scope and catch key “hits” that could be the “smoking gun” in the case.
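Conceptually, synonym searching is query expansion against a thesaurus before the search runs. The sketch below uses a tiny hand-built thesaurus and naive substring matching purely for illustration; a real review tool would ship a full thesaurus and a proper index:

```python
# A tiny hand-built thesaurus stands in for the full one a review tool would ship.
THESAURUS = {
    "mistake": {"flaw", "error", "goof", "flub", "defect"},
    "agreement": {"contract", "deal", "arrangement"},
}

def expand_terms(terms, thesaurus=THESAURUS):
    """Expand each search term with its synonyms (the term itself included)."""
    expanded = set()
    for term in terms:
        t = term.lower()
        expanded.add(t)
        expanded |= thesaurus.get(t, set())
    return expanded

def search(documents, terms):
    """Return documents containing any expanded term (naive substring match)."""
    expanded = expand_terms(terms)
    return [doc for doc in documents if any(t in doc.lower() for t in expanded)]

docs = ["We knew about the design flaw in March.",
        "Lunch order for the team.",
        "Another goof in the process spec."]
print(search(docs, ["mistake"]))
```

As in the tool's workflow, the point is that the reviewer sees the expanded term list (ideally with hit counts) and chooses which synonyms to keep before running the search.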

Thursday, I’ll talk about the use of fuzzy searching to find misspellings that may be commonly used by your opponent or errors resulting from Optical Character Recognition (OCR) of any image-only files that they produce.  Stay tuned!  🙂

In the meantime, what do you think?  Have you used synonym searching to identify variations on terms in an opponent’s produced ESI?  Please share any comments you might have or if you’d like to know more about a particular topic.

Disclaimer: The views represented herein are exclusively the views of the author, and do not necessarily represent the views held by CloudNine Discovery. eDiscoveryDaily is made available by CloudNine Discovery solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscoveryDaily should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.

Happy Independence Day from all of us at eDiscovery Daily and CloudNine Discovery!

eDiscovery Trends: First Pass Review – of Your Opponent’s Data

 

Even those of us at eDiscoveryDaily have to take an occasional vacation; however, instead of “going dark” for the week, we thought we would republish a post series from the early days of the blog (when we didn’t have many readers yet).  So chances are, you haven’t seen these posts yet!  Enjoy!

In the past few years, applications that support Early Case Assessment (ECA) (or Early Data Assessment, as many prefer to call it) and First Pass Review (FPR) of ESI have become widely popular in eDiscovery, as the benefits of using these tools to analyze and cull ESI before conducting attorney review and producing relevant files have become increasingly clear.  But nobody seems to talk about what these tools can do with an opponent’s produced ESI.

Fewer Resources to Understand Data Produced to You

In eDiscovery, attorneys typically develop a reasonably in-depth understanding of their own collection.  They know who the custodians are, have a chance to interview those custodians, and develop a good knowledge of their client’s standard operating procedures and terminology to effectively retrieve responsive ESI.  However, that same knowledge isn’t present when reviewing an opponent’s data.  Unless they are deposed, the opposition’s custodians aren’t interviewed, and where the data originated is often unclear.  The only source of information is the data itself, which requires in-depth analysis.  An FPR application like FirstPass®, powered by Venio FPR™, can make a significant difference in conducting that analysis – provided that you request a native production from your opponent, which is vital to being able to perform that in-depth analysis.

Email Analytics

The ability to see the communication patterns graphically – to identify the parties involved, with whom they communicated and how frequently – is a significant benefit to understanding the data received.  FirstPass provides email analytics to understand the parties involved and potentially identify other key opponent individuals to depose in the case.  Dedupe capabilities enable quick comparison against your production to determine whether the opposition may have withheld key emails between the opposing parties.  FirstPass also provides an email timeline to enable you to determine whether any gaps exist in the opponent’s production.
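The counting that underlies this kind of email analytics can be illustrated simply: parse the sender and recipients of each message and tally the pairs. A hypothetical sketch (the addresses and message structure are invented):

```python
from collections import Counter

def communication_patterns(messages):
    """Count how often each (sender, recipient) pair communicated."""
    pairs = Counter()
    for msg in messages:
        for recipient in msg["to"]:
            pairs[(msg["from"], recipient)] += 1
    return pairs

# Hypothetical parsed headers from an opponent's production
msgs = [
    {"from": "ceo@opponent.com", "to": ["cfo@opponent.com"]},
    {"from": "ceo@opponent.com", "to": ["cfo@opponent.com", "counsel@firm.com"]},
    {"from": "cfo@opponent.com", "to": ["ceo@opponent.com"]},
]
for (sender, recipient), n in communication_patterns(msgs).most_common():
    print(f"{sender} -> {recipient}: {n}")
```

Visualization tools draw these counts as a graph, but even the raw tallies surface the heaviest communicators, which is what flags candidates for deposition.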

Message Threading

The ability to view message threads for emails (which Microsoft Outlook® tracks), can also be a useful tool as it enables you to see the entire thread “tree” of a conversation, including any side discussions that break off from the original discussion.  Because Outlook tracks those message threads, any missing emails are identified with placeholders.  Those could be emails your opponent has withheld, so the ability to identify those quickly and address with opposing counsel (or with the court, if necessary) is key to evaluating the completeness of the production.
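One common way to reconstruct threads (shown here with the standard Message-ID and In-Reply-To headers rather than Outlook's proprietary conversation tracking) is to link each message to its parent and note any parent IDs that were referenced but never produced, which is exactly where the placeholders come from:

```python
from collections import defaultdict

def build_threads(messages):
    """Group messages into thread trees; flag parents referenced but missing."""
    children = defaultdict(list)
    ids = {m["id"] for m in messages}
    roots, missing = [], set()
    for m in messages:
        parent = m.get("in_reply_to")
        if parent is None:
            roots.append(m["id"])
        else:
            children[parent].append(m["id"])
            if parent not in ids:
                missing.add(parent)  # placeholder: referenced but not produced
    return roots, dict(children), missing

# Hypothetical produced messages; <x> is cited as a parent but never produced
msgs = [
    {"id": "<a>", "in_reply_to": None},
    {"id": "<b>", "in_reply_to": "<a>"},
    {"id": "<c>", "in_reply_to": "<x>"},
]
roots, children, missing = build_threads(msgs)
print(missing)  # message IDs referenced in threads but absent from the production
```

Those `missing` IDs are the gaps you would raise with opposing counsel (or the court) when evaluating the completeness of the production.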

Tomorrow, I’ll talk about the use of synonym searching to find variations of your search terms that may be common terminology of your opponent.  Same bat time, same bat channel! 🙂

In the meantime, what do you think?  Have you used email analytics to analyze an opponent’s produced ESI?  Please share any comments you might have or if you’d like to know more about a particular topic.

Disclaimer: The views represented herein are exclusively the views of the author, and do not necessarily represent the views held by CloudNine Discovery. eDiscoveryDaily is made available by CloudNine Discovery solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscoveryDaily should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.

eDiscovery Best Practices: Documentation is Key to a Successful Discovery Effort

 

We like to point out good articles about eDiscovery on this blog to keep our readers aware of trends and best practices.  I recently read an article on InsideCounsel titled E-discovery: Memorializing the e-discovery process, written by Alvin Lindsay, which had some good specific examples of where good documentation is important to prevent sanctions and save litigation costs.

Litigation Holds

The author notes that, since the Zubulake opinions issued by Judge Shira Scheindlin in 2003 and 2004, 1) most jurisdictions have come to expect that parties must issue a litigation hold “as soon as litigation becomes reasonably foreseeable”, and 2) “oral” litigation holds are unlikely to be sufficient since the same Judge Scheindlin noted in Pension Committee that failure to issue a “written” litigation hold constitutes “gross negligence”.  His advice: “make sure the litigation hold is in writing, and includes at minimum the date of issue, the recipients and the scope of preservation”.  IT personnel responsible for deleting “expired” data (outside of retention policies) also need to receive litigation hold documentation; in fact, “it can be a good idea to provide a separate written notice order just for them”.  Re-issuing the hold notices periodically is important because, well, people forget if they’re not reminded.  For previous posts on the subject of litigation holds, click here and here.

Retention Policies and Data Maps

Among the considerations for documentation here are the actual retention and destruction policies, system-wide backup procedures and “actual (as opposed to theoretical) implementation of the firm’s recycle policy”, as well as documentation of discussions with any personnel regarding same.  A data map provides a guide for legal and IT to the location of data throughout the company and important information about that data, such as the business units, processes and technology responsible for maintaining the data, as well as retention periods for that data.  The author notes that many organizations “don’t keep data maps in the ordinary course of business, so outside counsel may have to create one to truly understand their client’s data retention architecture.”  Creating a data map is impossible for outside counsel without involvement and assistance at several levels within the organization, so it’s truly a group effort and best done before litigation strikes.  For previous posts on the subject of data maps, click here and here.

Conferences with Opposing Counsel

The author discusses the importance of documenting the nature and scope of preservation and production and sums up the importance quite effectively by stating: “If opposing parties who are made aware of limitations early on do not object in a timely fashion to what a producing party says it will do, courts will be more likely to invoke the doctrines of waiver and estoppel when those same parties come to complain of supposed production infirmities on the eve of trial.”  So, the benefits of documenting those limitations early on are clear.

Collecting, Culling and Sampling

Chain of custody documentation (as well as a thorough written explanation of the collection process) is important to demonstrating the integrity of the data being collected.  If you collect at a broad level (as many do), then you need to cull through effective searching to identify potentially responsive ESI.  Documenting the approach for searching as well as the searches themselves is key to a defensible searching and culling process (it helps when you use an application, like FirstPass®, powered by Venio FPR™, that keeps a history of all searches performed).  As we’ve noted before, sampling enables effective testing and refinement of searches and aids in the defense of the overall search approach.

Quality Control

And, of course, documenting all materials and mechanisms used to provide quality assurance and control (such as “materials provided to and used to train the document reviewers, as well as the results of QC checks for each reviewer”) makes it easier to defend your approach and even “claw back” privileged documents if you can show that your approach was sound.  Mistakes happen, even with the best of approaches.

So, what do you think?  These are some examples of important documentation of the eDiscovery process – can you think of others?  Please share any comments you might have or if you’d like to know more about a particular topic.

Disclaimer: The views represented herein are exclusively the views of the author, and do not necessarily represent the views held by CloudNine Discovery. eDiscoveryDaily is made available by CloudNine Discovery solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscoveryDaily should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.

eDiscovery Case Law: Judge Peck Denies Recusal Motion in Da Silva Moore

 

It’s been a few weeks since we heard anything from the Da Silva Moore case.  If you’ve been living under a rock the past few months, Magistrate Judge Andrew J. Peck of the U.S. District Court for the Southern District of New York issued an opinion in this case in February making it one of the first cases to accept the use of computer-assisted review of electronically stored information (“ESI”).  However, the plaintiffs objected to the ruling, questioned Judge Peck’s relationship with defense counsel and with the selected vendor for the case, Recommind, and ultimately formally requested the recusal of Judge Peck.  For links to all of the recent events in the case that we’ve covered, click here.

Last Friday, in a 56-page opinion and order, Judge Peck denied the plaintiffs’ motion for recusal.  The opinion and order reviewed the past several contentious months and rejected the plaintiffs’ arguments for recusal in the following areas:

Participation in conferences discussing the use of predictive coding:

“I only spoke generally about computer-assisted review in comparison to other search techniques…The fact that my interest in and knowledge about predictive coding in general overlaps with issues in this case is not a basis for recusal.”

“To the extent plaintiffs are complaining about my general discussion at these CLE presentations about the use of predictive coding in general, those comments would not cause a reasonable objective observer to believe I was biased in this case. I did not say anything about predictive coding at these LegalTech and other CLE panels that I had not already said in my Search, Forward article, i.e., that lawyers should consider using predictive coding in appropriate cases. My position was the same as plaintiffs’ consultant . . . . Both plaintiffs and defendants were proposing using predictive coding in this case.  I did not determine which party’s predictive coding protocol was appropriate in this case until the February 8, 2012 conference, after the panels about which plaintiffs complain.”

“There are probably fewer than a dozen federal judges nationally who regularly speak at ediscovery conferences. Plaintiffs' argument that a judge's public support for computer-assisted review is a recusable offense would preclude judges who know the most about ediscovery in general (and computer-assisted review in particular) from presiding over any case where the use of predictive coding was an option, or would preclude those judges from speaking at CLE programs. Plaintiffs' position also would discourage lawyers from participating in CLE programs with judges about ediscovery issues, for fear of subsequent motions to recuse the judge (or disqualify counsel).”

Relationship with defense counsel Ralph Losey:

“While I participated on two panels with defense counsel Losey, we never had any ex parte communication regarding this lawsuit. My preparation for and participation in ediscovery panels involved only ediscovery generally and the general subject of computer-assisted review. Losey's affidavit makes clear that we have never spoken about this case, and I confirm that. During the panel discussions (and preparation sessions), there was absolutely no discussion of the details of the predictive coding protocol involved in this case or with regard to what a predictive coding protocol should look like in any case. Plaintiffs' assertion that speaking on an educational panel with counsel creates an appearance of impropriety is undermined by Canon 4 of the Judicial Code of Conduct, which encourages judges to participate in such activities.”

Relationship with Recommind, the selected vendor in the case:

“The panels in which I participated are distinguishable. First, I was a speaker at educational conferences, not an audience member. Second, the conferences were not one-sided, but concerned ediscovery issues including search methods in general. Third, while Recommind was one of thirty-nine sponsors and one of 186 exhibitors contributing to LegalTech's revenue, I had no part in approving the sponsors or exhibitors (i.e., funding for LegalTech) and received no expense reimbursement or teaching fees from Recommind or LegalTech, as opposed to those companies that sponsored the panels on which I spoke. Fourth, there was no "pre-screening" of MSL's case or ediscovery protocol; the panel discussions only covered the subject of computer-assisted review in general.”

Perhaps it is no surprise that Judge Peck denied the recusal motion.  Now, the question is: will District Court Judge Andrew L. Carter, Jr. weigh in?

So, what do you think?  Should Judge Peck recuse himself in this case or does he provide an effective argument that recusal is unwarranted?  Please share any comments you might have or if you’d like to know more about a particular topic.