
Working Successfully with eDiscovery and Litigation Support Service Providers: Evaluating Quality

Yesterday, we talked about evaluating service-provider pricing.  That, of course, is just part of the picture.  You need a service provider that can and does provide high-quality work that meets your expectations.

This can be hard to assess when you are evaluating a service provider with which you don’t have prior experience. And, unfortunately, it’s just not possible to know up-front if a service provider will do high-quality work on any given project.  You can, however, determine whether a service provider is likely to do high-quality work.  Here are some suggestions for doing so:

  1. Ask for references, and check them.  Ask for both end-user references and for people who were the point of contact with the service provider.  And ask for references for projects that were similar in size and scope to your project.  Later in this blog series, I’m going to give you some suggestions for doing an effective reference check.
  2. Look at their procedures and processes.  This is important both for labor-intensive tasks and for heavily technology-based ones.  Look at intake procedures, workflow procedures, and status-tracking procedures.
  3. Look at the type and level of quality control that is done.  Find out what is checked 100%, what is sampled, what triggers rework, what computer validation is done, and what is checked manually.
  4. Ask about staff qualifications, experience and training.
  5. Ask about project management.  A well-managed project will yield higher-quality results.  For certain types of projects, you might also want to interview the project manager who will be assigned to your project.
  6. Evaluate the quality of your communication with the service provider during the evaluation process.  Did they understand your questions and your needs?  Were documents submitted to you (proposals and correspondence) clear and free of errors?  I might not eliminate a service provider from consideration for problems in this area, but I’d certainly question the care the service provider might take with my work if they didn’t take care in their communications with me.

What has been your experience with service provider work quality?  Do you have good or bad experiences you can tell us about?  Please share any comments you might have and let us know if you’d like to know more about an eDiscovery topic.

Working Successfully with eDiscovery and Litigation Support Service Providers: Evaluating Price

 

When you are looking for help with handling discovery materials, there are hundreds of service providers to choose from.  It’s important that you choose one that can meet your schedule, has fair pricing and does high-quality work.  But there are other things you should look at as well. 

In the next few blogs in this series, we’re going to discuss what you should be looking at when you evaluate a service provider.  Note that these points are not covered in order of importance.  The importance of any single evaluation point will vary from case to case and will depend on things like the type of service you are looking for, the duration of the project, the complexity of the project, and the size of the project.

Let’s start with Price.  Obviously, costs are significant and the first thing most people look at when doing an evaluation.  Unfortunately, many people don’t look at anything else.  Don’t fall into that trap.  If a service provider offers prices much lower than everyone else’s, that should sound some alarms.  There’s a chance the service provider doesn’t understand the task or is cutting corners somewhere.  Do a lot of digging and take a close look at the organization’s procedures and technology before selecting a service provider that is comparatively very low-priced. 

There’s another very important consideration when you are comparing service provider pricing:  not all pricing models are the same.  Make sure you understand every component of a service provider’s price, what’s included, what’s not, what exactly you are paying for, and how it affects the bottom line.  Let me give you an example.  Some service providers charge per GB for “input” gigs for electronic discovery processing, while others charge per GB for “output” gigs – the smaller volume that remains after processing culls files.  Of course, the ones that charge for “input” gigs charge a lower per-gig price, but they are charging for more gigabytes.
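To see how the two models can produce similar bottom lines despite very different per-gig rates, here’s a minimal worked example.  The rates and culling percentage below are illustrative assumptions, not quotes from any provider:

```python
# Hypothetical comparison of "input" vs. "output" per-GB pricing models.
# All rates and the culling percentage are illustrative assumptions.

input_gb = 100                    # raw volume delivered for processing
culling_rate = 0.60               # assume processing culls 60% of the volume
output_gb = input_gb * (1 - culling_rate)    # 40 GB survives processing

input_rate = 150                  # $/GB charged on input volume
output_rate = 400                 # $/GB charged on output volume

cost_input_model = input_gb * input_rate     # 100 GB x $150 = $15,000
cost_output_model = output_gb * output_rate  #  40 GB x $400 = $16,000

print(f"Input-priced model:  ${cost_input_model:,.0f}")
print(f"Output-priced model: ${cost_output_model:,.0f}")
```

The “cheaper” per-gig rate isn’t automatically the better deal – how much of your data the processing actually culls determines which model costs less.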

Understand how a service provider’s pricing is structured and what it means when you are evaluating prices.  It’s always a good idea to ask a service provider to estimate total costs for a project to verify your understanding.

In the next blogs in this series, we’ll cover other factors you should consider when selecting a vendor.

What has been your experience with service provider work?  Do you have good or bad experiences you can tell us about?  Please share any comments you might have and let us know if you’d like to know more about an eDiscovery topic.

eDiscovery Trends: Despite What NY Times Says, Lawyers Not Going Away

 

There was a TV commercial in the mid-’80s where a soap opera actor delivered the line “I’m not a doctor, but I play one on TV”.  Can you remember the product it was advertising (without clicking on the link)?  If so, you win the trivia award of the day!  😉

I’m a technologist who has been working in litigation support and eDiscovery for over twenty years.  If you’ve been reading eDiscovery Daily for a while, you’ve probably noticed that I’ve written several posts regarding significant case law as it pertains to eDiscovery.  I often feel that I should offer a disclaimer before each of these posts saying “I’m not a lawyer, but I play one on the Web”.  As the disclaimer at the bottom of the page stipulates, these posts aren’t meant to provide legal advice; my intention is merely to identify cases that may be of interest to our readers, provide a basic recap of each, and leave it at that.  As Clint Eastwood once said, “A man’s got to know his limitations”.

A few days ago, The New York Times published an article entitled Armies of Expensive Lawyers, Replaced by Cheaper Software which discussed how, using ‘artificial intelligence, “e-discovery” software can analyze documents in a fraction of the time for a fraction of the cost’ (extraneous comma in the title notwithstanding).  The article goes on to describe linguistic and sociological techniques for retrieving relevant information and discusses how the Enron Corpus, available in a number of forms, including through EDRM, has enabled software providers to make great strides in analytical capabilities by using this large data set for testing.  It also discusses whether this will precipitate a march to the unemployment line for scores of attorneys.

A number of articles and posts since then have offered commentary as to whether that will be the case.  Technology tools will certainly reduce document populations significantly, but, as the article noted, “[t]he documents that the process kicks out still have to be read by someone”.  Not only that, the article still makes the assumption that people too often make with search technology – that it’s a “push a button and get your answer” approach to identifying relevant documents.  But, as has been noted in several cases and also here on this blog, searching is an iterative process where sampling the search results is recommended to confirm that the search maximizes recall and precision to the extent possible.  Who do you think is going to perform that sampling?  Lawyers – that’s who (working with technologists like me, of course!).  And, some searches will require multiple iterations of sampling and analysis before the search is optimized.

Therefore, while the “armies” of lawyers may not need nearly as many members of the infantry, they will still need plenty of corporals, sergeants, captains, colonels and generals.  And, for those entry-level reviewing attorneys that no longer have a place on review projects?  Well, we could always use a few more doctors on TV, right?  😉

So, what do you think?  Are you a review attorney that has been impacted by technology – positively or negatively?   Please share any comments you might have or if you’d like to know more about a particular topic.

eDiscovery Trends: George Socha of Socha Consulting

 

This is the seventh of the LegalTech New York (LTNY) Thought Leader Interview series.  eDiscoveryDaily interviewed several thought leaders at LTNY this year and asked each of them the same three questions:

  1. What do you consider to be the current significant trends in eDiscovery on which people in the industry are, or should be, focused?
  2. Which of those trends are evident here at LTNY, which are not being talked about enough, and/or what are your general observations about LTNY this year?
  3. What are you working on that you’d like our readers to know about?

Today’s thought leader is George Socha.  A litigator for 16 years, George is President of Socha Consulting LLC, offering services as an electronic discovery expert witness, special master and advisor to corporations, law firms and their clients, and legal vertical market software and service providers in the areas of electronic discovery and automated litigation support. George has also been co-author of the leading survey on the electronic discovery market, The Socha-Gelbmann Electronic Discovery Survey.  In 2005, he and Tom Gelbmann launched the Electronic Discovery Reference Model project to establish standards within the eDiscovery industry – today, the EDRM model has become a standard in the industry for the eDiscovery life cycle and there are eight active projects with over 300 members from 81 participating organizations. George has a J.D. from Cornell Law School and a B.A. from the University of Wisconsin – Madison.

What do you consider to be the current significant trends in eDiscovery on which people in the industry are, or should be, focused?

On the very “flip” side, the number one trend to date in 2011 is predictions about trends in 2011.  They are part of a consistent and long-term pattern, which is that many of these trend predictions are not trend predictions at all – they are marketing material and the prediction is “you will buy my product or service in the coming year”.

That said, there are a couple of things of note.  Since I understand you talked to Tom about Apersee, it’s worth noting that corporations are struggling with working through a list of providers to find out who provides what services.  You would figure that there is somewhere in the range of 500 or so total providers.  But, my ever-growing list, which includes both external and law firm providers, is at more than 1,200.  Of course, some of those are probably not around anymore, but I am confident that there are at least 200-300 that I do not yet have on the list.  My guess when the list shakes out is that there are roughly 1,100 active providers out there today.  If you look at information from the National Center for State Courts and the Federal Judicial Center, you’ll see that there are about 11 million new lawsuits filed every year.  I saw an article in the Cornell Law Forum a week or two ago which indicated that there are roughly 1.1 million lawyers in the country.  So, there are 11 million lawsuits, 1.1 million lawyers and 1,100 providers.  Most of those lawyers have no experience with eDiscovery and most of those lawsuits have no provider involved, which means eDiscovery is still very much an emerging market, not even close to being a mature market.  As fast as providers disappear, through attrition or acquisition, new providers enter the market to take their place.

Which of those trends are evident here at LTNY, which are not being talked about enough, and/or what are your general observations about LTNY this year?

{Interviewed on the second afternoon of LTNY}  Maybe this is overly optimistic, but part of what I’m seeing in the lead-up to the conference, on various web sites and at the conference itself, is that a series of incremental changes taking place over a long period are finally leading to some radical differences.  One of those differences is that we finally are reaching a point where a number of providers can make the claim to being “end-to-end providers” with some legitimacy.  For as long as we’ve had the EDRM model, we’ve had providers that have professed to cover the full EDRM landscape, by which they generally have meant Identification through Production.  A growing number of providers not only cover that portion of the EDRM spectrum but have some ability to address Information Management, Presentation, or both.  By and large, those providers are getting there by building their software and services based on experience and learning over the past 8 to 10 to 12 years, introducing new offerings at the show that reflect that learned experience.

A couple of days ago, I only half-jokingly issued “the Dyson challenge” (as in the Dyson vacuum cleaner).  Every year, come January, our living room carpet is strewn with pine tree needles, and none of the vacuum cleaners we have ever had has done a good job of picking up those needles.  The Dyson vacuum cleaner claims its cyclones capture more dirt than anything, but I was convinced that could not include those needles.  Nonetheless I tried, and to my surprise it worked like a charm!  I want to see the providers offering products able to perform at that high level, not just meeting but exceeding expectations.

I also see a feeling of excitement and optimism that wasn’t apparent at last year’s show.

What are you working on that you’d like our readers to know about?

As I mentioned, we have launched the Apersee web site, designed to allow consumers to find providers and products that fit their specific needs.  The site is in beta and the link is live.  It’s in beta because we’re still working on features to make it as useful as possible to customers and providers.  We’re hoping it’s a question of weeks, not months, before those features are implemented.  Once we go fully live, we will go two months with the system “wide open” – where every consumer can see all the provider and product information that any provider has put in the system.  After that, consumers will be able to see full provider and product profiles for providers who have purchased blocks of views.  Even if a provider does not purchase views, all selection criteria it enters are searchable, but search results will display only the provider’s name and website name.  Providers will be able to get stats on queries and how many times their information is viewed, but not detailed information as to which customers are connecting and performing the queries.

As for EDRM, we continue to make progress with an array of projects and a growing number of collaborative efforts, such as the work the Data Set group has done with TREC Legal and the work the Metrics group has done with the LEDES Committee. We not only want to see membership continue to grow, but we also want to push for more active participation so that the various working groups keep making progress.  We’ve just met at the show here regarding the EDRM Testing pilot project to address testing standards.  There are very few guidelines for testing of electronic discovery software and services, so the Testing project will become a full EDRM project as of the EDRM annual meeting this May and begin to address the need for those guidelines.

Thanks, George, for participating in the interview!

And to the readers, as always, please share any comments you might have or if you’d like to know more about a particular topic!

eDiscovery Trends: Jim McGann of Index Engines

 

This is the third of the LegalTech New York (LTNY) Thought Leader Interview series.  eDiscoveryDaily interviewed several thought leaders at LTNY this year and asked each of them the same three questions:

  1. What do you consider to be the current significant trends in eDiscovery on which people in the industry are, or should be, focused?
  2. Which of those trends are evident here at LTNY, which are not being talked about enough, and/or what are your general observations about LTNY this year?
  3. What are you working on that you’d like our readers to know about?

Today’s thought leader is Jim McGann.  Jim is Vice President of Information Discovery at Index Engines.  Jim has extensive experience with eDiscovery and Information Management in the Fortune 2000 sector. He has worked for leading software firms, including Information Builders and the France-based engineering software provider Dassault Systemes.  In recent years he has worked for technology-based start-ups that provided financial services and information management solutions.

What do you consider to be the current significant trends in eDiscovery on which people in the industry are, or should be, focused?

What we’re seeing is that companies are becoming a bit more proactive.  Over the past few years we’ve seen companies that have simply been reacting to litigation and it’s been a very painful process because ESI collection has been a “fire drill” – a very last minute operation.  Not because lawyers have waited and waited, but because the data collection process has been slow, complex and overly expensive.  But things are changing. Companies are seeing that eDiscovery is here to stay, ESI collection is not going away and the argument of saying that it’s too complex or expensive for us to collect is not holding water. So, companies are starting to take a proactive stance on ESI collection and understanding their data assets proactively.  We’re talking to companies that are not specifically responding to litigation; instead, they’re building a defensible policy that they can apply to their data sources and make data available on demand as needed.    

Which of those trends are evident here at LTNY, which are not being talked about enough, and/or what are your general observations about LTNY this year?

{Interviewed on the first morning of LTNY}  Well, in walking the floor as people were setting up, you saw a lot of early case assessment last year; this year you’re seeing a lot of information governance.  That’s showing that eDiscovery is really rolling into the records management/information governance area.  On the CIO and General Counsel level, information governance is getting a lot of exposure and there’s a lot of technology that can solve the problems.  Litigation support’s role will be to help the executives understand the available technology and how it applies to information governance and records management initiatives.  You’ll see more information governance messaging, which is really a higher level records management message.

As for other trends, one that I’ll tie Index Engines into is ESI collection and pricing.  Per GB pricing is going down as the volume of data is going up.  Years ago, prices were a thousand dollars per GB, then hundreds of dollars per GB, etc.  Now the cost is close to tens of dollars per GB. To really manage large volumes of data more cost-effectively, the collection price had to become more affordable.  Because Index Engines can make data on backup tapes searchable very cost-effectively, for as little as $50 per tape, data on tape has become as easy to access and search as online data.  Perhaps even easier, because it’s not on a live network.  Backup tapes have a bad reputation because people think of them as complex or expensive, but if you take away the complexity and expense (which is what Index Engines has done), then they really become “full point-in-time” snapshots.  So, if you have litigation from a specific date range, you can request that data snapshot (which is a tape) and perform discovery on it.  Tape is really a natural litigation hold when you think about it, and there is no need to perform the hold retroactively.

So, what does the ease with which information can be indexed from tape do to address the “inaccessible” argument for tape retrieval?  That argument has been eroding over the years, thanks to technology like ours.  And, you see decisions from judges like Judge Scheindlin saying “if you cannot find data in your primary network, go to your backup tapes”, indicating that they consider backup tapes the next source right after online networks.  You also see people like Craig Ball writing that backup tapes may be the most convenient and cost-effective way to get access to data.  If you had a choice between doing a “server crawl” in a corporate environment or just asking for a backup tape of that time frame, tape is the much more convenient and less disruptive option.  So, if your opponent goes to the judge and says it’s going to take millions of dollars to get the information off of twenty tapes, you must know enough to be in front of a judge and say “that’s not accurate”.  Those are old numbers.  There are court cases where parties have been instructed to use tapes as a cost-effective means of getting to the data.  Technology removes the inaccessible argument by making it easier, faster and cheaper to retrieve data from backup tapes.

The erosion of the accessibility burden is sparking the information governance initiatives. We’re seeing companies come to us for legacy data remediation or management projects, basically getting rid of old tapes. They are saying “if I’ve got ten years of backup tapes sitting in offsite storage, I need to manage that proactively and address any liability that’s there” (that they may not even be aware exists).  These projects reflect a proactive focus towards information governance by remediating those tapes and getting rid of data they don’t need.  Ninety-eight percent of the data on old tapes is not going to be relevant to any case.  The remaining two percent can be found and put into the company’s litigation hold system, and then they can get rid of the tapes.

How do incremental backups play into that?  Tapes are very incremental and repetitive.  If you’re backing up the same data over and over again, you may have 50+ copies of the same email.  Index Engines technology automatically gets rid of system files and applies a standard MD5 hash to dedupe.  Also, by using tape cataloguing, you can read the header and say “we have a Saturday full backup and five incrementals during the week, then another Saturday full backup”. You can ignore the incremental tapes and just go after the full backups.  That’s a significant percentage of the tapes you can ignore.
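To illustrate the deduping concept, here’s a minimal, generic sketch of hash-based deduplication – not Index Engines’ actual implementation, and the directory name is hypothetical:

```python
import hashlib
from pathlib import Path

def md5_digest(path: Path, chunk_size: int = 1 << 20) -> str:
    """Compute a file's MD5 digest, reading in chunks to handle large files."""
    h = hashlib.md5()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def dedupe(paths):
    """Keep the first file seen per digest; report the rest as duplicates."""
    seen, duplicates = {}, []
    for p in paths:
        digest = md5_digest(p)
        if digest in seen:
            duplicates.append((p, seen[digest]))  # p duplicates an earlier file
        else:
            seen[digest] = p
    return list(seen.values()), duplicates

# Hypothetical usage: dedupe files restored from a set of backup tapes
files = (p for p in Path("restored_tapes").rglob("*") if p.is_file())
uniques, dupes = dedupe(files)
```

With 50+ copies of the same email across weekly fulls and incrementals, a pass like this collapses them to a single reviewable copy.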

What are you working on that you’d like our readers to know about?

Index Engines just announced today a partnership with LeClairRyan. This partnership combines legal expertise for data retention with the technology that makes applying the policy to legacy data possible.  For companies that want to build policy for the retention of legacy data and implement the tape remediation process, we have advisors like LeClairRyan that can provide legacy data consultation and oversight.  By proactively managing the potential liability of legacy data, you are also saving the IT costs of exploring that data.

Index Engines  also just announced a new cloud-based tape load service that will provide full identification, search and access to tape data for eDiscovery. The Look & Learn service, starting at $50 per tape, will provide clients with full access to the index of their tape data without the need to install any hardware or software. Customers will be able to search the index and gather knowledge about content, custodians, email and metadata all via cloud access to the Index Engines interface, making discovery of data from tapes even more convenient and affordable.

Thanks, Jim, for participating in the interview!

And to the readers, as always, please share any comments you might have or if you’d like to know more about a particular topic!

eDiscovery Trends: Alon Israely, Esq., CISSP of BIA

 

This is the second of the LegalTech New York (LTNY) Thought Leader Interview series.  eDiscoveryDaily interviewed several thought leaders at LTNY this year and asked each of them the same three questions:

  1. What do you consider to be the current significant trends in eDiscovery on which people in the industry are, or should be, focused?
  2. Which of those trends are evident here at LTNY, which are not being talked about enough, and/or what are your general observations about LTNY this year?
  3. What are you working on that you’d like our readers to know about?

Today’s thought leader is Alon Israely.  Alon is a Senior Advisor in BIA’s Advisory Services group, and when he’s not advising clients on e-discovery issues he works closely with BIA’s product development group for its core technology products.  Alon has over fifteen years of experience in a variety of advanced computing-related technologies and has consulted with law firms and their clients on a variety of technology issues, including expert witness services related to computer forensics, digital evidence management and data security.

What do you consider to be the current significant trends in eDiscovery on which people in the industry are, or should be, focused?

I think one of the important trends for corporate clients and law firms is cost control, whether it’s trying to minimize the amount of project management hours that are being billed or the manner in which the engagement is facilitated.  I’m not suggesting going full-bore necessarily, but taking baby steps to help control costs is a good approach.  I don’t think it’s only about bringing prices down, because I think that the industry in general has been able to do that naturally well.  But, I definitely see a new focus on the manner in which costs are managed and outsourced.  So, very specifically, scoping correctly is key: make sure you’re using the right tool for the right job, and keep things efficient (whether on the vendor side or the client side) by, for example, not holding five phone calls just to figure out what the key words are for field searching, and not just going out and imaging every drive before deciding what’s really needed. Bringing simple efficiencies to the mechanics of doing e-discovery saves tons of money in unnecessary legal, vendor and project management fees.  You can do things that are about creating efficiencies, but are not necessarily changing the process or changing the pricing.

I also see trends in technology, using more focused tools and different tools to facilitate a single project.  Historically, parties would hire three or four different vendors for a single project, but today it may be just one or two vendors, or maybe even no vendors (just the law firm).  It’s the use of the right technologies for the right situations – maybe not just one piece of software, but leveraging several for different parts of the process.  Overall, I foresee fewer vendors per project, but more vendors increasing their stable of tools.  So, whereas a vendor may have had a review tool and one way of doing collection, now they may have two or three review tools, including an ECA tool, and one or two ways of doing collections. They have a toolkit from which they can choose the best set of tools to bring to the engagement.  Because they have more tools to market, vendors can have the right tool in their back pocket, whereas before, the tool belonged to just one service provider: you bought from them, or you just didn’t have it.

Which of those trends are evident here at LTNY, which are not being talked about enough, and/or what are your general observations about LTNY this year?

{Interviewed on the first morning of LTNY} I think there is either a little or a lot – depending on how aggressive I want to be with my opinion – of a disconnect between what they’re speaking about in the panels and what we’re seeing on the floor.  But, I think that’s OK in that the conference itself is usually a little bit ahead of the curve with respect to topics, and the technology will catch up.  You have topics such as predictive coding and social networking related issues – those are two big ones that you’ll see.  I think, for example, there are very few companies that have a solution for social networking, though we happen to have one.  And, predictive coding is the same scenario.  You have a lot of providers that talk about it, but you have a handful that actually do it, and you have probably even fewer than that who do it right.  I think that next year you’ll see many predictive coding solutions and technologies and many more tools that have that capability built into them.  So, on the conference side, there is one level of information and on the floor side, a different level.

What are you working on that you’d like our readers to know about?

BIA has a new product called TotalDiscovery.com, the industry’s first SaaS (software-as-a-service), on-demand collection technology that provides defensible collections.  We just rolled it out, we’re introducing it here at LegalTech and we’re starting a technology preview and signing up people who want to use the application or try it.  It’s specifically for attorneys, corporations, service providers – anyone who’s in the business and needs a tool for defensible data collection performed with agility (always hard to balance) – so without having to buy software or have expert training, users simply log in or register and can start immediately.  You don’t have to worry about the traditional business processes to get things set up and started.  Which, if you think about it, means that on the collections side of e-discovery the client’s CEO or VP of Marketing can call you up and say “I’m leaving, I have my PST here, can you just come get it?” and you can facilitate that process through the web: download an application, walk through a wizard, collect it defensibly, encrypt it and then deliver a filtered set, as needed, for review.

The tool is designed to collect defensibly and to move the collected data – or some subset of that data – to delivery; from there you would select your review tool of choice and we hand it off to the selected review tool.  So, we’re not trying to be everything, we’re focused on automating the left side of the EDRM.  We have load formats for certain tools, having been a service provider for ten years, and we’re connecting with partners so that we can do the handoff, so when the client says “I’m ready to deliver my data”, they can choose OnDemand or Concordance or another review tool, and then either directly send it or the client can download and ship it.  We’re not trying to be a review tool and not trying to be an ECA tool that helps you find the needle in the haystack; instead, we’re focused on collecting the data, normalizing it, cataloguing it and handing it off for the attorneys to do their work.
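To make “defensible” a bit more concrete: one common ingredient of a defensible collection is a hash manifest recorded at collection time, so the collected files can later be shown to be unaltered.  Here’s a minimal, generic sketch of that idea – not TotalDiscovery’s actual implementation; the paths and manifest fields are illustrative:

```python
import csv
import hashlib
from datetime import datetime, timezone
from pathlib import Path

def collect_with_manifest(source_dir: str, manifest_path: str) -> None:
    """Walk a source directory and record each file's SHA-256 digest and
    collection timestamp, producing a manifest for later integrity checks."""
    with open(manifest_path, "w", newline="") as out:
        writer = csv.writer(out)
        writer.writerow(["file", "sha256", "collected_utc"])
        for path in Path(source_dir).rglob("*"):
            if path.is_file():
                digest = hashlib.sha256(path.read_bytes()).hexdigest()
                writer.writerow([str(path), digest,
                                 datetime.now(timezone.utc).isoformat()])

# Hypothetical usage: collect a departing executive's exported mailbox
collect_with_manifest("exports/ceo_mailbox", "collection_manifest.csv")
```

Re-hashing the delivered files against the manifest later is what lets you demonstrate that nothing changed between collection and review.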

Thanks, Alon, for participating in the interview!

And to the readers, as always, please share any comments you might have or if you’d like to know more about a particular topic!

eDiscovery Best Practices: Judges’ Guide to Cost-Effective eDiscovery

 

Last week at LegalTech, I met Joe Howie at the bloggers’ breakfast on Tuesday morning.  Joe is the founder of Howie Consulting and is the Director of Metrics Development and Communications for the eDiscovery Institute, which is a 501(c)(3) nonprofit research organization for eDiscovery.

eDiscovery Institute has just released a new publication that is a vendor-neutral guide to approaches that can considerably reduce discovery costs for ESI.  The Judges’ Guide to Cost-Effective E-Discovery, co-written by Anne Kershaw (co-Founder and President of the eDiscovery Institute) and Joe Howie, also contains a foreword by the Hon. James C. Francis IV, Magistrate Judge for the Southern District of New York.  Joe gave me a copy of the guide, which I read during my flight back to Houston and found to be a terrific publication that details various mechanisms that can reduce the volume of ESI to review by 90 percent or more.  You can download the publication here (for personal review, not re-publication), and also read a summary article about it from Joe in InsideCounsel here.

Mechanisms for reducing costs covered in the Guide include:

  • DeNISTing: Excluding files known to be associated with commercial software, such as help files, templates, etc., as compiled by the National Institute of Standards and Technology, can eliminate a high number of files that will clearly not be responsive;
  • Duplicate Consolidation (aka “deduping”): Deduping across custodians rather than just within each custodian saves more: the Guide reports cost reductions of 38% for across-custodian deduping versus 21% for within-custodian deduping;
  • Email Threading: The ability to review the entire email thread at once reduces costs 36% over having to review each email in the thread;
  • Domain Name Analysis (aka Domain Categorization): As noted previously in eDiscoveryDaily, the ability to classify items based on the domain of the sender of the email can significantly reduce the collection to be reviewed by identifying emails from parties that are clearly not responsive to the case.  It can also be a great way to quickly identify some of the privileged emails (see the sketch after this list);
  • Predictive Coding: As noted previously in eDiscoveryDaily, predictive coding is the use of machine learning technologies to categorize an entire collection of documents as responsive or non-responsive, based on human review of only a subset of the document collection. According to this report, “A recent survey showed that, on average, predictive coding reduced review costs by 45 percent, with several respondents reporting much higher savings in individual cases”.
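As referenced in the Domain Name Analysis bullet above, here is a minimal sketch of the idea.  It’s a generic illustration – the From: headers are made up, and this is not any particular product’s implementation:

```python
from collections import Counter
from email.utils import parseaddr

def sender_domain(from_header: str) -> str:
    """Extract the domain portion of an email From: header."""
    _, address = parseaddr(from_header)
    return address.rsplit("@", 1)[-1].lower() if "@" in address else "unknown"

# Hypothetical From: headers pulled from a collection
headers = [
    "Jane Doe <jane.doe@acme.com>",
    "newsletter@retailer.example",
    "John Smith <jsmith@outside-counsel.example>",
    "jane.doe@acme.com",
]

# Tally senders by domain: bulk-mail domains can be culled as clearly
# non-responsive, and outside counsel's domain flags likely privileged items.
counts = Counter(sender_domain(h) for h in headers)
for domain, n in counts.most_common():
    print(f"{domain}: {n} message(s)")
```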

The publication also addresses concepts such as focused sampling, foreign language translation costs and searching audio records and tape backups.  It even addresses some of the most inefficient (and therefore costly) practices of ESI processing and review, such as wholesale printing of ESI to paper for review (either in paper form or ultimately converted to TIFF or PDF), which is still more common than you might think.  Finally, it references some key rules of the ABA Model Rules of Professional Conduct to address the ethical duty of attorneys in effective management of ESI.  It’s a comprehensive publication that does a terrific job of explaining best practices for efficient discovery of ESI.

So, what do you think?  How many of these practices have been implemented by your organization?  Please share any comments you might have or if you’d like to know more about a particular topic.

Deadline Extended to Vote for the Most Significant eDiscovery Case of 2010

 

Our ‘little experiment’ to see what the readers of eDiscoveryDaily think about case law developments in 2010 needs more time, as we have not yet received enough votes to have a statistically significant result.  So, we’ve extended the deadline to select the case with the most significant impact on eDiscovery practices in 2010 to February 28.  Evidently, calling out the vote on the last business day before LegalTech is not the best timing.  Live and learn!

As noted previously, we have “nominated” five cases, which we feel were the most significant in different issues of case law, including duty to preserve and sanctions, clawback agreements under Federal Rule of Evidence 502, not reasonably accessible arguments and discoverability of social media content.  If you feel that some other case was the most significant case of 2010, you can select that case instead.  Again, it’s very important to note that you can vote anonymously, so we’re not using this as a “hook” to get your information.  You can select your case without providing any personal information.  However, we would welcome your comments as to why you selected the case you did and you can – optionally – identify yourself as well.

To get more information about the nominated cases (as well as other significant cases), click here.  To cast your vote, click here.

And, as always, please share any comments you might have or if you’d like to know more about a particular topic.

Vote for the Most Significant eDiscovery Case of 2010!

 

Since it’s awards season, we thought we would get into the act from an eDiscovery standpoint.  Sure, you have Oscars, Emmys and Grammys – but what about “EDDies”?  (I’ll bet you wondered what Eddie Munster could possibly have to do with eDiscovery, didn’t you?)

So, we’re conducting a ‘little experiment’ to see what the readers of eDiscoveryDaily think about case law developments in 2010.  This is our first annual “EDDies” award to select the case with the most significant impact on eDiscovery practices in 2010.  No cash or prizes being awarded, or even a statuette, but a chance to see what the readers think was the most important case of the year from an eDiscovery standpoint.

We have “nominated” five cases below, which we feel were the most significant in different issues of case law, including duty to preserve and sanctions, clawback agreements under Federal Rule of Evidence 502, not reasonably accessible arguments and discoverability of social media content.  We have a link to review more information about each case, and a link at the bottom of this post to cast your vote.

Very Important!  You can vote anonymously, so we’re not using this as a “hook” to get your information.  You can click on the link at the bottom, select your case and be done with it.  However, we would welcome your comments as to why you selected the case you did and you can – optionally – identify yourself as well.  eDiscoveryDaily will publish selected comments to reflect opinion of the voters as well as the vote results on February 7.  Click here to cast your vote now!

So, here are the cases:

Duty to Preserve/Sanctions

  • The Pension Committee of the University of Montreal Pension Plan v. Banc of America Securities, LLC, 2010 U.S. Dist. Lexis 4546 (S.D.N.Y. Jan. 15, 2010) (as amended May 28, 2010) – “Pension Committee”: The case that defined negligence, gross negligence, and willfulness in the electronic discovery context and demonstrated the consequences (via sanctions) resulting from those activities.  Judge Shira Scheindlin titled her 85-page opinion “Zubulake Revisited: Six Years Later”.  For more on this case, click here.
  • Victor Stanley, Inc. v. Creative Pipe, Inc., 2010 WL 3530097 (D. Md. 2010) – “Victor Stanley II”: The case of “the gang that couldn’t spoliate straight” where one of the defendants faced imprisonment for up to 2 years (subsequently set aside on appeal) and the opinion included a 12-page chart delineating the preservation and spoliation standards in each judicial circuit.  For more on this case, click here and here.

Clawback Agreements

  • Rajala v. McGuire Woods LLP, 2010 WL 2949582 (D. Kan. July 22, 2010) – “Rajala”: The case that addressed the applicability of Federal Rule of Evidence 502(d) and (e) for “clawback” provisions for inadvertently produced privileged documents.  For more on this case, click here.

Not Reasonably Accessible

  • Major Tours, Inc. v. Colorel, 2010 WL 2557250 (D.N.J. June 22, 2010) – “Major Tours”: The case that established a precedent that a party may obtain a Protective Order relieving it of the duty to access backup tapes, even when that party’s failure to issue a litigation hold caused the data not to be available via any other means.  For more on this case, click here.

Social Media Discovery

  • Crispin v. Christian Audigier Inc., 2010 U.S. Dist. Lexis 52832 (C.D. Calif. May 26, 2010) – “Crispin”: The case that used a 24-year-old law (The Stored Communications Act of 1986) to address whether ‘private’ data on social networks is discoverable.  For more on this case, click here.

If you feel that some other case was the most significant case of 2010, you can select that case instead.  Other notable cases include:

  • Rimkus v. Cammarata, 2010 WL 645253 (S.D. Tex. Feb. 19, 2010): Where District Court Judge Lee Rosenthal examined spoliation laws of each of the 13 Federal Circuit Courts of Appeal.
  • Orbit One Communications Inc. v. Numerex Corp., 2010 WL 4615547 (S.D.N.Y. Oct. 26, 2010): Magistrate Judge James C. Francis concluded that sanctions for spoliation must be based on the loss of at least some information relevant to the dispute (differing with “Pension Committee” in this manner).
  • DeGeer v. Gillis, 2010 U.S. Dist. Lexis 97457(N.D. Ill. Sept. 17, 2010): Demonstration of inadvertent disclosure made FRE 502(d) effective, negating waiver of privilege.
  • Takeda Pharmaceutical Co., Ltd. v. Teva Pharmaceuticals USA, Inc., 2010 WL 2640492 (D. Del. June 21, 2010): Defendants’ motion to compel the production of ESI for a period of 18 years was granted, with imposed cost-shifting.
  • E.E.O.C. v. Simply Storage Management, LLC, 2010 U.S. Dist. Lexis 52766 (S.D. Ind. May 11, 2010): EEOC is ordered to produce certain social networking communications.
  • McMillen v. Hummingbird Speedway, Inc., No. 113-2010 CD (C.P. Jefferson, Sept. 9, 2010): Motion to Compel discovery of social network account log-in names and passwords was granted.

Click here to cast your vote now!  Results will be published in eDiscoveryDaily on February 7.

The success of this ‘little experiment’ will determine whether next year there is a second annual “EDDies” award.  😉

And, as always, please share any comments you might have or if you’d like to know more about a particular topic.

eDiscovery Searching: For Defensible Searching, Be a "STARR"

 

Defensible searching has become a priority in eDiscovery as parties in several cases have experienced significant consequences (including sanctions) for not implementing a defensible search strategy in responding to discovery requests.

Probably the most famous case where search approach has been an issue was Victor Stanley, Inc. v. Creative Pipe, Inc., 250 F.R.D. 251 (D. Md. 2008), where Judge Paul Grimm noted that the “only prudent way to test the reliability of the keyword search is to perform some appropriate sampling of the documents” and found that privilege on 165 inadvertently produced documents was waived, in part, because of the inadequacy of the search approach.

A defensible search strategy is part using an effective tool (with advanced search capabilities such as “fuzzy”, wildcard, synonym and proximity searching) and part using an effective approach to test and verify search results.
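To illustrate one of those capabilities, here’s a toy proximity search – two terms within n words of each other – written as a generic sketch rather than any particular tool’s syntax:

```python
import re

def within_proximity(text: str, term1: str, term2: str, n: int = 5) -> bool:
    """True if term1 and term2 occur within n words of each other."""
    words = re.findall(r"[A-Za-z0-9']+", text.lower())
    pos1 = [i for i, w in enumerate(words) if w == term1.lower()]
    pos2 = [i for i, w in enumerate(words) if w == term2.lower()]
    return any(abs(i - j) <= n for i in pos1 for j in pos2)

doc = "The board approved the merger agreement after a lengthy discussion."
print(within_proximity(doc, "merger", "approved", n=3))    # True
print(within_proximity(doc, "merger", "discussion", n=3))  # False
```

Requiring “merger” within a few words of “approved” is far more precise than requiring both words anywhere in the same document, usually at little cost to recall.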

I have an acronym that I use to reflect the defensible search process.  I call it “STARR” – as in “STAR” with an extra “R” or Green Bay Packer football legend Bart Starr (sorry, Bears fans!).  For each search that you need to conduct, here’s how it goes:

  • Search: Construct the best search you can to maximize recall and precision for the desired result.  An effective tool gives you more options for constructing a more effective search, which should help in maximizing recall and precision.  For example, as noted on this blog a few days ago, a proximity search can, under the right circumstances, provide a more precise search result without sacrificing recall.
  • Test: Once you’ve conducted the search, it’s important to test two datasets to determine the effectiveness of the search:
    • Result Set: Test the result set by randomly selecting an appropriate sample percentage of the files and reviewing those to determine their responsiveness to the intent of the search.  The appropriate percentage of files to be reviewed depends on the size of the result set – the smaller the set, the higher the percentage that should be reviewed.
    • Files Not Retrieved: While testing the result set is important, it is also important to randomly select an appropriate sample percentage of the files that were not retrieved in the search and review those as well to see whether any responsive files were missed by the search.
  • Analyze: Analyze the results of the random sample testing of both the result set and the files not retrieved to determine how effective the search was in retrieving mostly responsive files and whether any responsive files were missed.
  • Revise: If the search retrieved a low percentage of responsive files and retrieved a high percentage of non-responsive files, then precision of the search may need to be improved.  If the files not retrieved contained any responsive files, then recall of the search may need to be improved.  Evaluate the results and see what, if any, revisions can be made to the search to improve precision and/or recall.
  • Repeat: Once you’ve identified revisions you can make to your search, repeat the process.  Search, Test, Analyze and (if necessary) Revise the search again until the precision and recall of the search are maximized to the extent possible.  (A minimal sketch of this loop follows the list.)
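Here’s that sketch.  Everything in it is illustrative: the search function stands in for your search tool, and is_responsive stands in for a human reviewer’s judgment on the sampled files:

```python
import random

def starr_iteration(collection, search, is_responsive, sample_size=50):
    """One Search-Test-Analyze pass: run the search, then randomly sample both
    the result set and the files not retrieved to estimate effectiveness."""
    hits = [doc for doc in collection if search(doc)]
    misses = [doc for doc in collection if not search(doc)]

    hit_sample = random.sample(hits, min(sample_size, len(hits)))
    miss_sample = random.sample(misses, min(sample_size, len(misses)))

    # Share of sampled hits that are actually responsive (precision proxy),
    # and share of sampled non-retrieved files that were responsive (missed).
    precision = sum(map(is_responsive, hit_sample)) / max(len(hit_sample), 1)
    missed = sum(map(is_responsive, miss_sample)) / max(len(miss_sample), 1)
    return precision, missed

# Revise and Repeat: if precision is low, tighten the search (e.g. proximity
# constraints); if responsive files show up in the miss sample, broaden it
# (e.g. synonyms, wildcards) - then rerun until both numbers look acceptable.
```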

While you can’t guarantee that you will retrieve all of the responsive files or eliminate all of the non-responsive ones, a defensible approach to get as close as you can to that goal will minimize the number of files for review, potentially saving considerable costs and making you a “STARR” in the courtroom when defending your search approach.

So, what do you think?  Are you a “STARR” when it comes to defensible searching?  Please share any comments you might have or if you’d like to know more about a particular topic.