
eDiscovery Trends: First Pass Review – Fuzzy Searching Your Opponent’s Data

 

Even those of us at eDiscoveryDaily have to take an occasional vacation; however, instead of “going dark” for the week, we thought we would republish a post series from the early days of the blog (when we didn’t have many readers yet).  So chances are, you haven’t seen these posts yet!  Enjoy!

Tuesday, we talked about the use of First Pass Review (FPR) applications (such as FirstPass®, powered by Venio FPR™) to not only conduct first pass review of your own collection, but also to analyze your opponent’s ESI production.  One way to analyze that data is through synonym searching to find variations of your search terms to increase the possibility of finding the terminology used by your opponents.

Fuzzy Searching

Another type of analysis is the use of fuzzy searching.  Attorneys know what terms they’re looking for, but those terms aren’t always spelled correctly in the documents.  Also, opposing counsel may produce a number of image-only files that require Optical Character Recognition (OCR), which is usually not 100% accurate.

FirstPass supports “fuzzy” searching, a mechanism for finding alternate words that are close in spelling to the word you’re looking for (usually within one or two characters).  FirstPass will display all of the words in the collection that are close to the word you’re looking for, so if you’re looking for the term “petroleum”, you can find variations such as “peroleum”, “petoleum” or even “petroleom” – misspellings or OCR errors that could be relevant.  Then, simply select the variations you wish to include in the search.  Fuzzy searching is the best way to broaden your search to include potential misspellings and OCR errors, and FirstPass provides a terrific capability for selecting those variations to review additional potential “hits” in your collection.
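
For readers who like to see the concept spelled out, here’s a minimal sketch in Python (purely illustrative – it is not how FirstPass implements its fuzzy engine, and the word index below is made up) of the basic idea: keep every word in the collection’s index that is within one or two character edits of your search term.

```python
# Illustrative only -- NOT FirstPass's implementation. The idea: keep every
# word in the collection's index that is within one or two character edits
# of the search term.

def edit_distance(a: str, b: str) -> int:
    """Classic Levenshtein (edit) distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # delete a character
                            curr[j - 1] + 1,             # insert a character
                            prev[j - 1] + (ca != cb)))   # substitute a character
        prev = curr
    return prev[-1]

def fuzzy_candidates(term, index_words, max_edits=2):
    """Words in the collection's index within max_edits of the search term."""
    return sorted(w for w in index_words
                  if edit_distance(term.lower(), w.lower()) <= max_edits)

# Hypothetical word index extracted from an OCR'd production
index = {"petroleum", "peroleum", "petoleum", "petroleom", "petrol", "personnel"}
print(fuzzy_candidates("petroleum", index))
# ['peroleum', 'petoleum', 'petroleom', 'petroleum']
```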

Tomorrow, I’ll talk about the use of domain categorization to quickly identify potential inadvertent disclosures and weed out non-responsive files produced by your opponent, based on the domain of the communicators.  Hasta la vista, baby!  🙂

In the meantime, what do you think?  Have you used fuzzy searching to find misspellings or OCR errors in an opponent’s produced ESI?  Please share any comments you might have or if you’d like to know more about a particular topic.

Disclaimer: The views represented herein are exclusively the views of the author, and do not necessarily represent the views held by CloudNine Discovery. eDiscoveryDaily is made available by CloudNine Discovery solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscoveryDaily should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.

eDiscovery Trends: First Pass Review – Synonym Searching Your Opponent’s Data

 

Even those of us at eDiscoveryDaily have to take an occasional vacation; however, instead of “going dark” for the week, we thought we would republish a post series from the early days of the blog (when we didn’t have many readers yet).  So chances are, you haven’t seen these posts yet!  Enjoy!

Yesterday, we talked about the use of First Pass Review (FPR) applications (such as FirstPass®, powered by Venio FPR™) to not only conduct first pass review of your own collection, but also to analyze your opponent’s ESI production.  One way to analyze that data is through email analytics to see the communication patterns graphically to identify key parties for deposition purposes and look for potential production omissions.

Synonym Searching

Another type of analysis is the use of synonym searching.  Attorneys understand the key terminology their client uses, but they often don’t know the terminology their client’s opposition uses because they haven’t interviewed the opposition’s custodians.  In a product defect case, the opposition may refer to admitted design or construction “mistakes” in their product or process as “flaws”, “errors”, “goofs” or even “flubs”.  With FirstPass, you can enter your search term into the synonym searching section of the application and it will provide a list of synonyms (with hit counts of each, if selected).  Then, you can simply select the synonyms you wish to include in the search.  As a result, FirstPass identifies synonyms of your search terms to broaden the scope and catch key “hits” that could be the “smoking gun” in the case.
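
Purely to illustrate the concept (FirstPass builds its synonym lists internally – the tiny thesaurus and token list below are hypothetical stand-ins), the basic idea looks something like this: expand the term through a thesaurus, count hits for each synonym in the collection, and let the reviewer pick which ones to include.

```python
# Conceptual sketch only -- FirstPass generates its own synonym lists; the
# thesaurus below is a hypothetical stand-in. Expand the term, count hits
# for each synonym in the collection, and let the reviewer pick which to add.

from collections import Counter

THESAURUS = {  # hypothetical, hand-rolled for illustration
    "mistake": ["flaw", "error", "defect", "goof", "flub"],
}

def synonym_report(term, collection_tokens):
    """Hit counts for the term and each synonym found in the collection."""
    counts = Counter(t.lower() for t in collection_tokens)
    candidates = [term.lower()] + THESAURUS.get(term.lower(), [])
    return {word: counts[word] for word in candidates if counts[word] > 0}

tokens = ["flaw", "Flaw", "error", "goof", "widget", "flub", "error"]
print(synonym_report("mistake", tokens))
# {'flaw': 2, 'error': 2, 'goof': 1, 'flub': 1}
```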

Thursday, I’ll talk about the use of fuzzy searching to find misspellings that may be commonly used by your opponent or errors resulting from Optical Character Recognition (OCR) of any image-only files that they produce.  Stay tuned!  🙂

In the meantime, what do you think?  Have you used synonym searching to identify variations on terms in an opponent’s produced ESI?  Please share any comments you might have or if you’d like to know more about a particular topic.

Disclaimer: The views represented herein are exclusively the views of the author, and do not necessarily represent the views held by CloudNine Discovery. eDiscoveryDaily is made available by CloudNine Discovery solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscoveryDaily should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.

Happy Independence Day from all of us at eDiscovery Daily and CloudNine Discovery!

eDiscovery Trends: First Pass Review of Your Opponent’s Data

 

Even those of us at eDiscoveryDaily have to take an occasional vacation; however, instead of “going dark” for the week, we thought we would republish a post series from the early days of the blog (when we didn’t have many readers yet).  So chances are, you haven’t seen these posts yet!  Enjoy!

In the past few years, applications that support Early Case Assessment (ECA) (or Early Data Assessment, as many prefer to call it) and First Pass Review (FPR) of ESI have become widely popular in eDiscovery, as the benefits of using these tools to analyze and cull your own ESI before conducting attorney review and producing relevant files have become increasingly clear.  But nobody seems to talk about what these tools can do with your opponent’s produced ESI.

Fewer Resources to Understand Data Produced to You

In eDiscovery, attorneys typically develop a reasonably in-depth understanding of their own collection.  They know who the custodians are, have a chance to interview those custodians, and develop a good knowledge of their client’s standard operating procedures and terminology, which helps them retrieve responsive ESI effectively.  However, that same knowledge isn’t available when reviewing an opponent’s data.  Unless they are deposed, the opposition’s custodians aren’t interviewed, and where the data originated is often unclear.  The only source of information is the data itself, which requires in-depth analysis.  An FPR application like FirstPass®, powered by Venio FPR™, can make a significant difference in conducting that analysis – provided you request a native production from your opponent, which is vital to performing it.

Email Analytics

The ability to see the communication patterns graphically – to identify the parties involved, with whom they communicated and how frequently – is a significant benefit to understanding the data received.  FirstPass provides email analytics to understand the parties involved and potentially identify other key opponent individuals to depose in the case.  Dedupe capabilities enable quick comparison against your own production to determine whether the opposition may have withheld key emails between the parties.  FirstPass also provides an email timeline to help you determine whether any gaps exist in the opponent’s production.
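
To make the idea a little more concrete, here’s a rough sketch (not FirstPass functionality – just an illustration, with made-up metadata) of how communication frequency and timeline gaps might be derived from basic email metadata in a native production:

```python
# Illustrative sketch, not FirstPass functionality: tally who communicated
# with whom (and how often), and flag long quiet periods in the email
# timeline that might indicate withheld or missing messages.

from collections import Counter
from datetime import datetime, timedelta

# Hypothetical metadata extracted from a native email production
messages = [
    {"sender": "ceo@opponent.com", "recipient": "cfo@opponent.com", "sent": datetime(2012, 1, 3)},
    {"sender": "cfo@opponent.com", "recipient": "ceo@opponent.com", "sent": datetime(2012, 1, 5)},
    {"sender": "ceo@opponent.com", "recipient": "cfo@opponent.com", "sent": datetime(2012, 3, 20)},
]

# Heaviest communication channels -> candidates for deposition
pair_counts = Counter(tuple(sorted((m["sender"], m["recipient"]))) for m in messages)
print(pair_counts.most_common(5))

def timeline_gaps(msgs, threshold=timedelta(days=30)):
    """Return (start, end) spans with no email traffic longer than the threshold."""
    dates = sorted(m["sent"] for m in msgs)
    return [(a, b) for a, b in zip(dates, dates[1:]) if b - a > threshold]

print(timeline_gaps(messages))  # one 75-day quiet period worth asking about
```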

Message Threading

The ability to view message threads for emails (which Microsoft Outlook® tracks) can also be useful, as it enables you to see the entire thread “tree” of a conversation, including any side discussions that break off from the original discussion.  Because Outlook tracks those message threads, any missing emails are identified with placeholders.  Those could be emails your opponent has withheld, so the ability to identify them quickly and address them with opposing counsel (or with the court, if necessary) is key to evaluating the completeness of the production.
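
As a conceptual illustration (Outlook actually tracks conversations with its own conversation-index property; this sketch assumes standard Message-ID / In-Reply-To headers instead, and the sample emails are made up), thread reconstruction with placeholders for unproduced parents might look something like this:

```python
# Conceptual sketch assuming standard Message-ID / In-Reply-To headers.
# Rebuild the thread tree and insert a placeholder wherever a referenced
# parent email was not produced.

def build_threads(emails):
    """emails: list of dicts with 'id', 'in_reply_to' (or None) and 'subject'."""
    by_id = {e["id"]: dict(e, children=[]) for e in emails}
    roots, placeholders = [], {}
    for node in by_id.values():
        parent_id = node["in_reply_to"]
        if parent_id is None:
            roots.append(node)
        elif parent_id in by_id:
            by_id[parent_id]["children"].append(node)
        else:
            # Parent referenced but never produced: flag it with a placeholder
            stub = placeholders.setdefault(
                parent_id, {"id": parent_id, "subject": "[NOT PRODUCED]", "children": []})
            stub["children"].append(node)
            if stub not in roots:
                roots.append(stub)
    return roots

emails = [
    {"id": "<1>", "in_reply_to": None, "subject": "Design review"},
    {"id": "<2>", "in_reply_to": "<1>", "subject": "RE: Design review"},
    {"id": "<4>", "in_reply_to": "<3>", "subject": "RE: RE: Design review"},  # parent <3> missing
]
for root in build_threads(emails):
    print(root["subject"], "->", [c["subject"] for c in root["children"]])
# Design review -> ['RE: Design review']
# [NOT PRODUCED] -> ['RE: RE: Design review']
```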

Tomorrow, I’ll talk about the use of synonym searching to find variations of your search terms that may be common terminology of your opponent.  Same bat time, same bat channel! 🙂

In the meantime, what do you think?  Have you used email analytics to analyze an opponent’s produced ESI?  Please share any comments you might have or if you’d like to know more about a particular topic.

Disclaimer: The views represented herein are exclusively the views of the author, and do not necessarily represent the views held by CloudNine Discovery. eDiscoveryDaily is made available by CloudNine Discovery solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscoveryDaily should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.

eDiscovery Best Practices: Documentation is Key to a Successful Discovery Effort

 

We like to point out good articles about eDiscovery on this blog to keep our readers aware of trends and best practices.  I recently read an article on InsideCounsel titled E-discovery: Memorializing the e-discovery process, written by Alvin Lindsay, which offers some specific examples of where good documentation is important to prevent sanctions and save litigation costs.

Litigation Holds

The author notes that, since the Zubulake opinions issued by Judge Shira Scheindlin in 2003 and 2004, 1) most jurisdictions have come to expect that parties must issue a litigation hold “as soon as litigation becomes reasonably foreseeable”, and 2) “oral” litigation holds are unlikely to be sufficient since the same Judge Scheindlin noted in Pension Committee that failure to issue a “written” litigation hold constitutes “gross negligence”.  His advice: “make sure the litigation hold is in writing, and includes at minimum the date of issue, the recipients and the scope of preservation”.  IT personnel responsible for deleting “expired” data (outside of retention policies) also need to receive litigation hold documentation; in fact, “it can be a good idea to provide a separate written notice order just for them”.  Re-issuing the hold notices periodically is important because, well, people forget if they’re not reminded.  For previous posts on the subject of litigation holds, click here and here.

Retention Policies and Data Maps

Among the considerations for documentation here are the actual retention and destruction policies, system-wide backup procedures and “actual (as opposed to theoretical) implementation of the firm’s recycle policy”, as well as documentation of discussions with any personnel regarding same.  A data map provides a guide for legal and IT to the location of data throughout the company and important information about that data, such as the business units, processes and technology responsible for maintaining the data, as well as retention periods for that data.  The author notes that many organizations “don’t keep data maps in the ordinary course of business, so outside counsel may have to create one to truly understand their client’s data retention architecture.”  Creating a data map is impossible for outside counsel without involvement and assistance at several levels within the organization, so it’s truly a group effort and best done before litigation strikes.  For previous posts on the subject of data maps, click here and here.

Conferences with Opposing Counsel

The author discusses the importance of documenting the nature and scope of preservation and production and sums up the importance quite effectively by stating: “If opposing parties who are made aware of limitations early on do not object in a timely fashion to what a producing party says it will do, courts will be more likely to invoke the doctrines of waiver and estoppel when those same parties come to complain of supposed production infirmities on the eve of trial.”  So, the benefits of documenting those limitations early on are clear.

Collecting, Culling and Sampling

Chain of custody documentation (as well as a thorough written explanation of the collection process) is important to demonstrating the integrity of the data being collected.  If you collect at a broad level (as many do), then you need to cull through effective searching to identify potentially responsive ESI.  Documenting the approach for searching, as well as the searches themselves, is key to a defensible searching and culling process (it helps when you use an application, like FirstPass®, powered by Venio FPR™, that keeps a history of all searches performed).  As we’ve noted before, sampling enables effective testing and refinement of searches and aids in the defense of the overall search approach.
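
To illustrate what “documenting the searches themselves” can look like in practice, here’s a hypothetical sketch of a simple search audit log (the file name, fields and sample entry are assumptions for illustration – FirstPass keeps its own search history):

```python
# Hypothetical sketch of a simple search audit log: one row per search,
# recording when it ran, who ran it, the scope, the query and the hit count.

import csv
from datetime import datetime

LOG_FILE = "search_history.csv"   # assumed file name, for illustration only

def log_search(query, scope, hit_count, operator):
    """Append one row per search to the audit log."""
    with open(LOG_FILE, "a", newline="") as f:
        csv.writer(f).writerow(
            [datetime.now().isoformat(timespec="seconds"), operator, scope, query, hit_count])

log_search('"mining" OR "mineral rights"', "Custodian: J. Smith (hypothetical)", 4172, "reviewer01")
```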

Quality Control

And, of course, documenting all materials and mechanisms used to provide quality assurance and control (such as “materials provided to and used to train the document reviewers, as well as the results of QC checks for each reviewer”) makes it easier to defend your approach and even “claw back” privileged documents if you can show that your approach was sound.  Mistakes happen, even with the best of approaches.

So, what do you think?  These are some examples of important documentation of the eDiscovery process – can you think of others?  Please share any comments you might have or if you’d like to know more about a particular topic.

Disclaimer: The views represented herein are exclusively the views of the author, and do not necessarily represent the views held by CloudNine Discovery. eDiscoveryDaily is made available by CloudNine Discovery solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscoveryDaily should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.

eDiscovery Case Law: Judge Peck Denies Recusal Motion in Da Silva Moore

 

It’s been a few weeks since we heard anything from the Da Silva Moore case.  If you’ve been living under a rock the past few months, Magistrate Judge Andrew J. Peck of the U.S. District Court for the Southern District of New York issued an opinion in this case in February, making it one of the first cases to accept the use of computer-assisted review of electronically stored information (“ESI”).  However, the plaintiffs objected to the ruling, questioned Judge Peck’s relationship with defense counsel and with the selected vendor for the case, Recommind, and ultimately formally requested the recusal of Judge Peck.  For links to all of the recent events in the case that we’ve covered, click here.

Last Friday, in a 56-page opinion and order, Judge Peck denied the plaintiffs’ motion for recusal.  The opinion and order reviewed the past several contentious months and rejected the plaintiffs’ arguments for recusal in the following areas:

Participation in conferences discussing the use of predictive coding:

“I only spoke generally about computer-assisted review in comparison to other search techniques…The fact that my interest in and knowledge about predictive coding in general overlaps with issues in this case is not a basis for recusal.”

“To the extent plaintiffs are complaining about my general discussion at these CLE presentations about the use of predictive coding in general, those comments would not cause a reasonable objective observer to believe I was biased in this case. I did not say anything about predictive coding at these LegalTech and other CLE panels that I had not already said in my Search, Forward article, i.e., that lawyers should consider using predictive coding in appropriate cases. My position was the same as plaintiffs’ consultant . . . . Both plaintiffs and defendants were proposing using predictive coding in this case.  I did not determine which party’s predictive coding protocol was appropriate in this case until the February 8, 2012 conference, after the panels about which plaintiffs complain.”

“There are probably fewer than a dozen federal judges nationally who regularly speak at ediscovery conferences. Plaintiffs' argument that a judge's public support for computer-assisted review is a recusable offense would preclude judges who know the most about ediscovery in general (and computer-assisted review in particular) from presiding over any case where the use of predictive coding was an option, or would preclude those judges from speaking at CLE programs. Plaintiffs' position also would discourage lawyers from participating in CLE programs with judges about ediscovery issues, for fear of subsequent motions to recuse the judge (or disqualify counsel).”

Relationship with defense counsel Ralph Losey:

“While I participated on two panels with defense counsel Losey, we never had any ex parte communication regarding this lawsuit. My preparation for and participation in ediscovery panels involved only ediscovery generally and the general subject of computer-assisted review. Losey's affidavit makes clear that we have never spoken about this case, and I confirm that. During the panel discussions (and preparation sessions), there was absolutely no discussion of the details of the predictive coding protocol involved in this case or with regard to what a predictive coding protocol should look like in any case. Plaintiffs' assertion that speaking on an educational panel with counsel creates an appearance of impropriety is undermined by Canon 4 of the Judicial Code of Conduct, which encourages judges to participate in such activities.”

Relationship with Recommind, the selected vendor in the case:

“The panels in which I participated are distinguishable. First, I was a speaker at educational conferences, not an audience member. Second, the conferences were not one-sided, but concerned ediscovery issues including search methods in general. Third, while Recommind was one of thirty-nine sponsors and one of 186 exhibitors contributing to LegalTech's revenue, I had no part in approving the sponsors or exhibitors (i.e., funding for LegalTech) and received no expense reimbursement or teaching fees from Recommind or LegalTech, as opposed to those companies that sponsored the panels on which I spoke. Fourth, there was no "pre-screening" of MSL's case or ediscovery protocol; the panel discussions only covered the subject of computer-assisted review in general.”

Perhaps it is no surprise that Judge Peck denied the recusal motion.  Now, the question is: will District Court Judge Andrew L. Carter, Jr. weigh in?

So, what do you think?  Should Judge Peck recuse himself in this case or does he provide an effective argument that recusal is unwarranted?  Please share any comments you might have or if you’d like to know more about a particular topic.

Disclaimer: The views represented herein are exclusively the views of the author, and do not necessarily represent the views held by CloudNine Discovery. eDiscoveryDaily is made available by CloudNine Discovery solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscoveryDaily should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.

eDiscovery Trends: X1 Social Discovery – Social Media Discovery for Professionals

 

According to EDDUpdate.com, social media will be eclipsing email as the primary discovery resource within three years.  Social media has become a normal part of our everyday life as we share our photos on Facebook, tweet news on Twitter, and make professional connections on LinkedIn.  We’ve previously covered social media archiving tools here, highlighting a firm named Smarsh, and the need for effective electronic discovery methods is only growing by the day.  As you can imagine, the sheer amount of content being generated is astounding.  Twitter CEO Dick Costolo announced on June 6th that Twitter had broken the 400 million tweet-per-day barrier, up 18% from 340 million back in March.  These aren’t simply meaningless ones and zeroes, either. X1 Discovery has information for 689 cases related to social media discovery from 2010 and 2011 linked on their website, making it clear just how many cases are being affected by social media these days.

With regard to ESI on social media networks, X1 Discovery features a solution called X1 Social Discovery, which is described as “the industry's first investigative solution specifically designed to enable eDiscovery and computer forensics professionals to effectively address social media content.  X1 Social Discovery provides for a powerful platform to collect, authenticate, search, review and produce electronically stored information (ESI) from popular social media sites, such as Facebook, Twitter and LinkedIn.”

We reached out to X1 Discovery for more information about X1 Social Discovery, especially with regard to what sort of challenges face a new tool developed for a new type of information.  For example, why isn’t support for Google+, Google’s fledgling social network, offered?  X1 Discovery Executive Vice President for Sales and Business Development, Skip Lindsey, addressed that question as follows:

“Our system can be purposed to accommodate a wide variety of use cases and we are constantly working with clients to understand their requirements to further enhance the product.  As you are aware there are a staggering number of potential social media systems to be collected from, but in terms of frequency of use, Facebook, Twitter and Linkedin are far and away the most prominent and there is a lot of constant time and attention we provide to ensure the accuracy and completeness of the data we obtain from those sites. We use a combination of direct API’s to the most popular systems, and have incorporated comprehensive web crawling and single page web capture into X1 Social Discovery to allow capture of virtually any web source that the operator can access. Google + is on the roadmap and we plan support in the near future.”

So, who is going to benefit most from X1 Social Discovery, and how is it different from an archiving tool like Smarsh?  According to Lindsey:

“X1 Social Discovery is installable software, not a service. This means that clients can deploy quickly and do not incur any additional usage charges for case work. Our investigative interface and workflow are unique in our opinion and better suited to professional investigators, law enforcement and eDiscovery professionals than other products that we have seen which work with social media content. Many of these other systems were created for the purpose of compliance archiving of web sites and do not address the investigation and litigation support needs of our client base. We feel that the value proposition of X1 Social Discovery is hard to beat in terms of its functionality, defensibility, and cost of ownership.”

With so many cases requiring collection by experienced professionals these days, it seems appropriate that there’s a tool like X1 Social Discovery designed for them to collect social media ESI.

So, what do you think?  Do you collect your own social media ESI or do you use experienced professionals for this collection?  What tools have you used?  Please share any comments you might have or if you’d like to know more about a particular topic.

Disclaimer: The views represented herein are exclusively the views of the author, and do not necessarily represent the views held by CloudNine Discovery. eDiscoveryDaily is made available by CloudNine Discovery solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscoveryDaily should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.

eDiscovery Trends: Where Does the Money Go? RAND Provides Some Answers

 

The RAND Corporation, a nonprofit research and analysis institution, recently published a new 159-page report on understanding eDiscovery costs, entitled Where the Money Goes: Understanding Litigant Expenditures for Producing Electronic Discovery, by Nicholas M. Pace and Laura Zakaras, that has some interesting findings and recommendations.  To obtain either a paperback copy or download a free eBook of the report, click here.

For the study, the authors requested case-study data from eight Fortune 200 companies and obtained data for 57 large-volume eDiscovery productions (from both traditional lawsuits and regulatory investigations) as well as information from extensive interviews with key legal personnel from the participating companies.  Here are some of the key findings from the research:

  • Review Makes Up the Largest Percentage of eDiscovery Production Costs: By a whopping margin, the major cost component in the studied cases was the review of documents for relevance, responsiveness, and privilege (typically about 73 percent). Collection, on the other hand, constituted only about 8 percent of expenditures for the cases in the study, while processing costs constituted about 19 percent.  Review costs ran about $14,000 per gigabyte, out of about $20,000 per gigabyte in total production costs (click here for a previous study on per-gigabyte costs).  Review costs would have to be reduced by about 75% to make them comparable to processing, the next-highest component.
  • Outside Counsel Makes Up the Largest Percentage of eDiscovery Expenditures: Again, by a whopping amount, the major cost component was expenditures for outside counsel services, which constituted about 70 percent of total eDiscovery production costs.  Vendor expenditures were around 26 percent.  Internal expenditures, even with adjustments made for underreporting, were generally around 4 percent of the total.  So, almost all eDiscovery expenditures are outsourced in one way or another.
  • If Conducted in the Traditional Manner, Review Costs Are Difficult to Reduce Significantly: Rates currently paid to “project attorneys during large-scale reviews in the US may well have bottomed out” and foreign review teams are often not a viable option due to “issues related to information security, oversight, maintaining attorney-client privilege, and logistics”.  Increasing the rate of review is also limited as, “[g]iven the trade-off between reading speed and comprehension…it is unrealistic to expect much room for improvement in the rates of unassisted human review”.  The study also notes that techniques for grouping documents, such as near-duplicate detection and clustering, while helpful, are “not the answer”.
  • Computer-Categorized Document Review Techniques May Be a Solution: Techniques such as predictive coding have the potential to reduce review hours by about 75% with about the same level of consistency, resulting in review costs of less than $2,000 and total production costs of less than $7,000.  However, a “lack of clear signals from the bench” that the techniques are defensible, and a lack of confidence by litigants that the techniques can reliably identify the majority of responsive and privileged documents, are barriers to wide-scale adoption.

Not surprisingly, the recommendations included taking “the bold step of using, publicly and transparently, computer-categorized document review techniques” for large-scale eDiscovery efforts.

So, what do you think?  Are you surprised by the cost numbers?  Please share any comments you might have or if you’d like to know more about a particular topic.

Disclaimer: The views represented herein are exclusively the views of the author, and do not necessarily represent the views held by CloudNine Discovery. eDiscoveryDaily is made available by CloudNine Discovery solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscoveryDaily should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.

eDiscovery Case Law: Privilege Waived Because Defendants Failed to Notice “Something Had Gone Awry” with Their Production

 

In D’Onofrio v. Borough of Seaside Park, No. 09-6220 (AET) (D.N.J. May 30, 2012), New Jersey Magistrate Judge Tonianne Bongiovanni denied the defendants’ motion for discovery to reclaim privileged documents that were inadvertently produced, finding that privilege was waived because the defendants failed to take reasonable measures to rectify the disclosure.

During the course of discovery in a case where the plaintiff alleged the defendants engaged in conduct that violated the plaintiff’s constitutional and statutory rights, the defendants reviewed 14 boxes of documents for possible production to the plaintiff. Six of those boxes, the “Ryan/McKenna” boxes, were reviewed by a partner at the law firm representing the defendants. The partner marked certain documents as privileged and then instructed a clerical employee to separate privileged and non-privileged documents, to Bates stamp the separated documents, and to burn the non-privileged documents onto a disc for production. The clerical employee failed to follow instructions, and privileged documents were inadvertently produced. 

Despite subsequent events where the defendants could have discovered the mistake, the defendants remained unaware of the accidental disclosure for approximately eight months until the plaintiff attached some of the privileged documents to an exhibit of his brief on an unrelated matter. The intervening events where the defendant failed to notice the production of privileged documents included the following: (1) the defendants voluntarily recalled the disc to reorganize the documents and remove electronic comments inadvertently left on some documents, and then resubmitted the disc to the plaintiff; (2) the defendants again recalled the disc after the plaintiff informed them the new disc was unreadable, and, after a clerical employee performed a “quality control audit” on the disc to ensure the defendants were producing the same set of documents, the defendants again produced the disc; (3) the defendants created a privilege log but did not realize the number of documents for the Ryan/McKenna boxes marked privileged was too small; and (4) after the plaintiff informed them that some of the documents on another disc were out of order, the defendants discovered hundreds of privileged documents from the “borough” boxes, another set of boxes, had been accidentally produced, but the defendants did not re-review the Ryan/McKenna documents that were produced.

Judge Bongiovanni articulated the applicable standard of review under Federal Rule of Evidence 502(b), stating that the factors to be considered in determining whether a waiver occurred are: (1) the reasonableness of the precautions taken to prevent inadvertent disclosure in view of the extent of the document production; (2) the number of inadvertent disclosures; (3) the extent of the disclosure; (4) any delay and measures taken to rectify the disclosure; and (5) whether the overriding interests of justice would or would not be served by relieving the party of its error.

Judge Bongiovanni had no trouble finding that the defendants “initially” took reasonable precautions to prevent production of privileged documents by devoting sufficient time to review, having a partner personally review all of the Ryan/McKenna documents, delegating to a clerical employee the task of separating privileged and non-privileged documents, and even by reviewing the disc before producing it to the plaintiff.

She then noted that the number and extent of the defendant’s unintentional disclosures were “neutral.”

Turning to the defendants’ efforts to rectify the disclosure, however, Judge Bongiovanni concluded that the defendants “did not take reasonable steps to remedy their error.” She stated, “Defendants should have been aware that something was amiss with their document production long before Plaintiff relied on three privileged documents” in his brief. Furthermore, although the defendants were not obligated to “engage in a post-production review to determine whether any protected communication or information [was] produced by mistake,” once a party is “‘on notice that something [i]s amiss with its document production and privilege review,’ then that party has an obligation to ‘promptly re-assess its procedures and re-check its production.’” The court pointed out that “the combination of the inadvertently produced attorney electronic comments and 728 pages of privileged Borough documents should have put the [ ] Defendants on notice that something had gone profoundly awry with their document production and privilege review.” A “reasonable person” would have rechecked the disc containing the Ryan/McKenna documents, and yet the defendants failed to do so.

Finally, the court also found that the interests of justice favored finding that a waiver occurred because the defendants’ “negligence” led to the inadvertent disclosure of privileged information.

So, what do you think?  Was the ruling fair?  Please share any comments you might have or if you’d like to know more about a particular topic.

Case Summary Source: Applied Discovery (free subscription required).  For eDiscovery news and best practices, check out the Applied Discovery Blog here.

Disclaimer: The views represented herein are exclusively the views of the author, and do not necessarily represent the views held by CloudNine Discovery. eDiscoveryDaily is made available by CloudNine Discovery solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscoveryDaily should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.

eDiscovery Best Practices: Test Your Searches Before the Meet and Confer

 

One of the very first posts ever on this blog discussed the danger of using wildcards.  For those who haven’t been following the blog from the beginning, here’s a recap.

A couple of years ago, I provided search strategy assistance to a client that had already agreed upon several searches with opposing counsel.  One search related to mining activities, so the attorney decided to use a wildcard of “min*” to retrieve variations like “mine”, “mines” and “mining”.

That one search retrieved over 300,000 files with hits.

Why?  Because there are 269 words in the English language that begin with the letters “min”.  Words like “mink”, “mind”, “mint” and “minion” were all being retrieved in this search for files related to “mining”.  We ultimately had to go back to opposing counsel and attempt to negotiate a revised search that was more appropriate.

What made that process difficult was the negotiation with opposing counsel.  My client had already agreed on over 200 terms with opposing counsel and had proposed many of those terms, including this one.  The attorneys had prepared these terms without assistance from a technology consultant (I was brought into the project after the terms were negotiated and agreed upon) and without testing any of the terms.

Since they had been agreed upon, opposing counsel was understandably resistant to modifying the terms.  The fact that my client faced having to review all of these files was not their problem.  We were ultimately able to provide a clear indication that many of the words retrieved by this search were non-responsive and were able to get opposing counsel to agree to a modified list of variations of “mine” that included “minable”, “mine”, “mineable”, “mined”, “minefield”, “minefields”, “miner”, “miners”, “mines”, “mining” and “minings”.  We were able to sort through the “minutia” and “minimize” the result set to less than 12,000 files with hits, saving our client a “mint”, which they certainly didn’t “mind”.  OK, I’ll stop now.

However, there were several other inefficient terms that opposing counsel refused to renegotiate and my client was forced to review thousands of additional files that they shouldn’t have had to review, which was a real “mindblower” (sorry, I couldn’t resist).  Had the client included a technical member on the team and had they tested each of these searches before negotiating terms with opposing counsel, they would have been able to figure out which terms were overbroad and would have been better prepared to negotiate favorable search terms for retrieving potentially responsive data.
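
To illustrate that kind of pre-negotiation testing, here’s a hypothetical sketch comparing what a wildcard retrieves against a curated list of variations (the word index and hit counts are made up for illustration):

```python
# Hypothetical sketch: test a proposed wildcard against the collection's word
# index before agreeing to it, and compare it to a curated list of variations.

import re
from collections import Counter

def wildcard_hits(pattern, word_counts):
    """Expand a trailing-* wildcard over the indexed words and report hit counts."""
    regex = re.compile("^" + re.escape(pattern.rstrip("*")) + r"\w*$", re.IGNORECASE)
    return {w: c for w, c in word_counts.items() if regex.match(w)}

# Made-up word index built from the collection
index = Counter({"mining": 9500, "mine": 2100, "mint": 40000, "minion": 8200,
                 "minutes": 120000, "mined": 800})

broad = wildcard_hits("min*", index)
curated = {w: index[w] for w in ("mine", "mined", "mining")}

print(sum(broad.values()), "hits for min*")        # 180600
print(sum(curated.values()), "hits for curated")   # 12400
```

Even on toy numbers, the comparison makes the negotiation point obvious: the wildcard retrieves more than an order of magnitude more hits than the curated list.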

When litigation is anticipated, it’s never too early to begin collecting potentially responsive data and assessing it by performing searches and testing the results.  However, if you wait until after the meet and confer with opposing counsel, it can be too late.

So, what do you think?  What steps do you take to assess your data before negotiating search terms?  Please share any comments you might have or if you’d like to know more about a particular topic.

Disclaimer: The views represented herein are exclusively the views of the author, and do not necessarily represent the views held by CloudNine Discovery. eDiscoveryDaily is made available by CloudNine Discovery solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscoveryDaily should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.

eDiscovery Best Practices: Smoking Gun Shoots Blanks, Google Wins Latest Battle in “Smartphone War” with Oracle

 

Despite a significant inadvertent disclosure of information during Google’s litigation with Oracle Corp., U.S. District Judge William Alsup last Thursday (May 31) dismissed claims that Google’s Android mobile phone platform infringes Oracle’s copyrights relating to the Java computer language.

Oracle had accused Google of infringing the “structure, sequence and organization” of 37 of Java’s application programming interface (API) packages.  Referring to this case as “the first of the so-called smartphone war cases”, Alsup ruled in the 41-page decision that the particular Java elements Google replicated were free for all to use under copyright law, noting: “So long as the specific code used to implement a method is different, anyone is free under the Copyright Act to write his or her own code to carry out exactly the same function or specification of any methods used in the Java API.”

Summarizing the validity of Oracle’s claim, Judge Alsup stated:

“Of the 166 Java packages, 129 were not violated in any way.  Of the 37 accused, 97 percent of the Android lines were new from Google and the remaining three percent were freely replicable under the merger and names doctrines.  Oracle must resort, therefore, to claiming that it owns, by copyright, the exclusive right to any and all possible implementations of the taxonomy-like command structure for the 166 packages and/or any subpart thereof – even though it copyrighted only one implementation.  To accept Oracle’s claim would be to allow anyone to copyright one version of code to carry out a system of commands and thereby bar all others from writing their own different versions to carry out all or part of the same commands.  No holding has ever endorsed such a sweeping proposition.”

Judge Alsup indicated that he was not ruling that Java API packages are free for all to use, stating: “This order does not hold that Java API packages are free for all to use without license.  It does not hold that the structure, sequence, and organization of all computer programs may be stolen. Rather, it holds on the specific facts of this case, the particular elements replicated by Google were free for all to use under the Copyright Act.”

Oracle filed suit against Google in San Francisco federal court in August 2010, claiming that the Android mobile operating system infringed Java copyrights and patents (to which Oracle obtained the rights after acquiring Sun Microsystems in 2010), and once valued damages in the case at $6 billion. In the first phase of the trial, the jury returned a verdict finding that Google infringed the structure, sequence, and organization of 37 API packages; however, the jury deadlocked on Google’s affirmative defense that it only made fair use of Java technology, and Alsup had not yet ruled on whether the APIs could be copyrighted.  He has now.

Oracle is expected to appeal.

So, what do you think?  Will Oracle appeal and should they do so?  Please share any comments you might have or if you’d like to know more about a particular topic.

Disclaimer: The views represented herein are exclusively the views of the author, and do not necessarily represent the views held by CloudNine Discovery. eDiscoveryDaily is made available by CloudNine Discovery solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscoveryDaily should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.