Electronic Discovery

eDiscovery Trends: Jim McGann of Index Engines

 

This is the third of the LegalTech New York (LTNY) Thought Leader Interview series.  eDiscoveryDaily interviewed several thought leaders at LTNY this year and asked each of them the same three questions:

  1. What do you consider to be the current significant trends in eDiscovery on which people in the industry are, or should be, focused?
  2. Which of those trends are evident here at LTNY, which are not being talked about enough, and/or what are your general observations about LTNY this year?
  3. What are you working on that you’d like our readers to know about?

Today’s thought leader is Jim McGann.  Jim is Vice President of Information Discovery at Index Engines.  Jim has extensive experience with eDiscovery and Information Management in the Fortune 2000 sector. He has worked for leading software firms, including Information Builders and the France-based engineering software provider Dassault Systemes.  In recent years he has worked for technology-based start-ups that provided financial services and information management solutions.

What do you consider to be the current significant trends in eDiscovery on which people in the industry are, or should be, focused?

What we’re seeing is that companies are becoming a bit more proactive.  Over the past few years we’ve seen companies that have simply been reacting to litigation and it’s been a very painful process because ESI collection has been a “fire drill” – a very last minute operation.  Not because lawyers have waited and waited, but because the data collection process has been slow, complex and overly expensive.  But things are changing. Companies are seeing that eDiscovery is here to stay, ESI collection is not going away and the argument of saying that it’s too complex or expensive for us to collect is not holding water. So, companies are starting to take a proactive stance on ESI collection and understanding their data assets proactively.  We’re talking to companies that are not specifically responding to litigation; instead, they’re building a defensible policy that they can apply to their data sources and make data available on demand as needed.    

Which of those trends are evident here at LTNY, which are not being talked about enough, and/or what are your general observations about LTNY this year?

{Interviewed on the first morning of LTNY}  Well, in walking the floor as people were setting up, you saw a lot of early case assessment last year; this year you’re seeing a lot of information governance.  That’s showing that eDiscovery is really rolling into the records management/information governance area.  On the CIO and General Counsel level, information governance is getting a lot of exposure and there’s a lot of technology that can solve the problems.  Litigation support’s role will be to help the executives understand the available technology and how it applies to information governance and records management initiatives.  You’ll see more information governance messaging, which is really a higher-level records management message.

As for other trends, one that I’ll tie Index Engines into is ESI collection and pricing.  Per-GB pricing is going down as the volume of data is going up.  Years ago, prices were a thousand dollars per GB, then hundreds of dollars per GB, and so on.  Now the cost is close to tens of dollars per GB. To manage large volumes of data more cost-effectively, the collection price had to become more affordable.  Because Index Engines can make data on backup tapes searchable very cost-effectively, for as little as $50 per tape, data on tape has become as easy to access and search as online data – perhaps even easier, because it’s not on a live network.  Backup tapes have a bad reputation because people think of them as complex or expensive, but if you take away the complexity and expense (which is what Index Engines has done), then they really become “full point-in-time” snapshots.  So, if you have litigation from a specific date range, you can request that data snapshot (which is a tape) and perform discovery on it.  Tape is really a natural litigation hold when you think about it, and there is no need to perform the hold retroactively.

So, what does the ease with which information can be indexed from tape do to address the inaccessibility argument for tape retrieval?  That argument has been eroding over the years, thanks to technology like ours.  And, you see decisions from judges like Judge Scheindlin saying “if you cannot find data in your primary network, go to your backup tapes”, indicating that they consider backup tapes the next source right after online networks.  You also see people like Craig Ball writing that backup tapes may be the most convenient and cost-effective way to get access to data.  If you had a choice between doing a “server crawl” in a corporate environment or just asking for a backup tape of that time frame, tape is the much more convenient and less disruptive option.  So, if your opponent goes to the judge and says it’s going to take millions of dollars to get the information off of twenty tapes, you must know enough to be in front of a judge and say “that’s not accurate” – those are old numbers.  There are court cases where parties have been instructed to use tapes as a cost-effective means of getting to the data.  Technology removes the inaccessibility argument by making it easier, faster and cheaper to retrieve data from backup tapes.

The erosion of the accessibility burden is sparking the information governance initiatives. We’re seeing companies come to us for legacy data remediation or management projects, basically getting rid of old tapes. They are saying “if I’ve got ten years of backup tapes sitting in offsite storage, I need to manage that proactively and address any liability that’s there” (that they may not even be aware exists).  These projects reflect a proactive focus towards information governance by remediating those tapes and getting rid of data they don’t need.  Ninety-eight percent of the data on old tapes is not going to be relevant to any case.  The remaining two percent can be found and put into the company’s litigation hold system, and then they can get rid of the tapes.

How do incremental backups play into that?  Tapes are very incremental and repetitive.  If you’re backing up the same data over and over again, you may have 50+ copies of the same email.  Index Engines technology automatically gets rid of system files and applies a standard MD5 hash to dedupe.  Also, by using tape cataloguing, you can read the header and say “we have a Saturday full backup and five incrementals during the week, then another Saturday full backup”. You can ignore the incremental tapes and just go after the full backups.  That’s a significant percentage of the tapes you can ignore.
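The hash-based deduplication Jim describes can be sketched in a few lines.  This is a simplified illustration only – not Index Engines’ actual implementation – and the function name and use of Python’s standard hashlib are assumptions for the example:

```python
import hashlib
from pathlib import Path

def dedupe_by_md5(paths):
    """Keep one representative path per unique MD5 content digest.

    Identical copies of the same email or file restored from multiple
    backup tapes produce identical digests, so only the first
    occurrence of each digest is retained.
    """
    seen = {}
    for path in paths:
        digest = hashlib.md5(Path(path).read_bytes()).hexdigest()
        seen.setdefault(digest, path)  # keeps the first copy seen
    return list(seen.values())
```

Combined with skipping incremental tapes in favor of full backups, this kind of content hashing is what collapses “50+ copies of the same email” down to one reviewable item.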

What are you working on that you’d like our readers to know about?

Index Engines just announced today a partnership with LeClairRyan. This partnership combines legal expertise for data retention with the technology that makes applying the policy to legacy data possible.  For companies that want to build a policy for the retention of legacy data and implement the tape remediation process, advisors like LeClairRyan can provide legacy data consultation and oversight.  By proactively managing the potential liability of legacy data, you are also saving the IT costs to explore that data.

Index Engines also just announced a new cloud-based tape load service that will provide full identification, search and access to tape data for eDiscovery. The Look & Learn service, starting at $50 per tape, will provide clients with full access to the index of their tape data without the need to install any hardware or software. Customers will be able to search the index and gather knowledge about content, custodians, email and metadata all via cloud access to the Index Engines interface, making discovery of data from tapes even more convenient and affordable.

Thanks, Jim, for participating in the interview!

And to the readers, as always, please share any comments you might have or if you’d like to know more about a particular topic!

eDiscovery Trends: EDD Toolkit Smartphone App

 

“Blue Horseshoe Loves BlueStar”.  Anyone remember that famous quote from Charlie Sheen (when he was known for his acting) in the movie Wall Street?

Well, now eDiscovery buffs with a smartphone love BlueStar too.

BlueStar Case Solutions, Inc. (BlueStar), just launched EDD Toolkit, which is a free eDiscovery app for smartphones. The app features a Cost Estimator, Time Estimator, Conversion Table and Glossary for common eDiscovery questions with regard to ESI processing, document review and production. BlueStar is touting EDD Toolkit as “a useful application for attorneys, paralegals, in-house counsel and litigation support staff who quickly need answers about a particular eDiscovery project”.

Desiree Salomon, BlueStar’s Marketing Manager says, “It’s the ultimate eDiscovery ‘cheat sheet.’”

The app components include:

  • Conversion Table: Calculates the number of documents or pages in a user defined amount of data and breaks it down by common email and document formats.  So, if you ever need to perform a quick estimate of document size based on the data size of your collection, it can provide a quick, “ballpark” estimate.
  • Cost Estimator: Using stated “industry averages”, it estimates how much a user defined amount of data or number of documents for review could cost, based on basic assumptions.
  • Time Estimator: Estimates time required for ESI processing and review, as well as how long it can take to scan paper documents into an electronic format.
  • Glossary: Provides definitions for many common eDiscovery related terms via a quickly accessible interface.  This component is particularly educational for the eDiscovery novice.
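To make the estimator components concrete, here is a rough sketch of how a conversion and cost estimator like this might work.  The per-GB figures and rates below are illustrative rule-of-thumb assumptions, not BlueStar’s actual numbers, and the function names are invented for this example:

```python
# Illustrative per-format averages (documents per GB). These are
# assumed rule-of-thumb figures, not BlueStar's actual tables.
DOCS_PER_GB = {
    "email": 10_000,
    "word": 8_000,
    "pdf": 5_000,
}

def estimate_documents(gb: float, fmt: str) -> int:
    """Ballpark document count for a given data size and format."""
    return int(gb * DOCS_PER_GB[fmt])

def estimate_review_cost(num_docs: int,
                         docs_per_hour: int = 50,
                         rate_per_hour: float = 60.0) -> float:
    """Rough linear review-cost estimate under the stated assumptions."""
    hours = num_docs / docs_per_hour
    return hours * rate_per_hour
```

Under these assumptions, 2 GB of email ballparks to about 20,000 documents; reviewing them at 50 documents per hour and $60 per hour estimates 400 hours and $24,000 – the same kind of quick, non-binding figure the app produces.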

I downloaded the app onto my Android phone and played with it a bit, and it is pretty cool!  Of course, the cost and time estimators are not substitutes for a formal estimate; in fact, the app provides a link to request a formal quote from BlueStar.  How convenient!  Nonetheless, it’s a clever idea and I have to hand it to BlueStar for an ingenious marketing tool.

BlueStar's EDD Toolkit is currently available for iPhone and Android, while BlackBerry and Windows 7 versions are “scheduled for release later this month”, according to their press release. To learn more or to download the EDD Toolkit app for free, go to http://www.bluestarcs.com/resources/app-support.

So, what do you think?  Are you ready to use your smartphone to learn more about eDiscovery?  Please share any comments you might have or if you’d like to know more about a particular topic.

eDiscovery Trends: Alon Israely, Esq., CISSP of BIA

 

This is the second of the LegalTech New York (LTNY) Thought Leader Interview series.  eDiscoveryDaily interviewed several thought leaders at LTNY this year and asked each of them the same three questions:

  1. What do you consider to be the current significant trends in eDiscovery on which people in the industry are, or should be, focused?
  2. Which of those trends are evident here at LTNY, which are not being talked about enough, and/or what are your general observations about LTNY this year?
  3. What are you working on that you’d like our readers to know about?

Today’s thought leader is Alon Israely.  Alon is a Senior Advisor in BIA’s Advisory Services group and when he’s not advising clients on e-discovery issues he works closely with BIA’s product development group for its core technology products.  Alon has over fifteen years of experience in a variety of advanced computing-related technologies and has consulted with law firms and their clients on a variety of technology issues, including expert witness services related to computer forensics, digital evidence management and data security.

What do you consider to be the current significant trends in eDiscovery on which people in the industry are, or should be, focused?

I think one of the important trends for corporate clients and law firms is cost control, whether it’s trying to minimize the amount of project management hours that are being billed or the manner in which the engagement is facilitated.  I’m not suggesting going full-bore necessarily, but taking baby steps to help control costs is a good approach.  I don’t think it’s only about bringing prices down, because I think that the industry in general has been able to do that naturally well.  But, I definitely see a new focus on the manner in which costs are managed and outsourced.  So, very specifically, scoping correctly is key, making sure you’re using the right tool for the right job, keeping efficiencies (whether that’s on the vendor side or the client side) by doing things such as not having five phone calls for a meeting to figure out what the key words are for field searching or just going out and imaging every drive before deciding what’s really needed. Bringing simple efficiencies to the mechanics of doing e-discovery saves tons of money in unnecessary legal, vendor and project management fees.  You can do things that are about creating efficiencies, but are not necessarily changing the process or changing the pricing.

I also see trends in technology, using more focused tools and different tools to facilitate a single project.  Historically, parties would hire three or four different vendors for a single project, but today it may be just one or two vendors, or maybe even no vendors (just the law firm).  It’s the use of the right technologies for the right situations – maybe not just one piece of software, but leveraging several for different parts of the process.  Overall, I foresee fewer vendors per project, but more vendors increasing their stable of tools.  So, whereas a vendor may have had a review tool and one way of doing collection, now they may have two or three review tools, including an ECA tool, and one or two ways of doing collections. They have a toolkit from which they can choose the best set of tools to bring to the engagement.  Because they have more tools to market, vendors can have the right tool in their back pocket, whereas before a tool belonged to just one service provider – so you either bought from them or you did without.

Which of those trends are evident here at LTNY, which are not being talked about enough, and/or what are your general observations about LTNY this year?

{Interviewed on the first morning of LTNY} I think there is – depending on how aggressive I want to be with my opinion – either a little or a lot of a disconnect between what they’re speaking about in the panels and what we’re seeing on the floor.  But, I think that’s OK in that the conference itself is usually a little bit ahead of the curve with respect to topics, and the technology will catch up.  You have topics such as predictive coding and social networking related issues – those are two big ones that you’ll see.  I think, for example, there are very few companies that have a solution for social networking, though we happen to have one.  And, predictive coding is the same scenario.  You have a lot of providers that talk about it, but you have a handful that actually do it, and you have probably even fewer than that who do it right.  I think that next year you’ll see many predictive coding solutions and technologies and many more tools that have that capability built into them.  So, on the conference side, there is one level of information and on the floor side, a different level.

What are you working on that you’d like our readers to know about?

BIA has a new product called TotalDiscovery.com, the industry’s first SaaS (software-as-a-service), on-demand collection technology that provides defensible collections.  We just rolled it out, we’re introducing it here at LegalTech and we’re starting a technology preview and signing up people who want to use the application or try it.  It’s specifically for attorneys, corporations, service providers – anyone who’s in the business and needs a tool for defensible data collection performed with agility (always hard to balance) – so without having to buy software or have expert training, users simply login or register and can start immediately.  You don’t have to worry about the traditional business processes to get things set up and started.  On the collections side of e-discovery, that means the client’s CEO or VP of Marketing can call you up and say “I’m leaving, I have my PST here, can you just come get it?” and you can facilitate that process through the web: download an application, walk through a wizard, collect it defensibly, encrypt it and then deliver a filtered set, as needed, for review.

The tool is designed to collect defensibly and to move the collected data – or some subset of that data – to delivery; from there you would select your review tool of choice and we hand it off to the selected review tool.  So, we’re not trying to be everything, we’re focused on automating the left side of the EDRM.  We have load files for certain tools, having been a service provider for ten years, and we’re connecting with partners so that we can do the handoff, so when the client says “I’m ready to deliver my data”, they can choose OnDemand or Concordance or another review tool, and then either directly send it or the client can download and ship it.  We’re not trying to be a review tool and not trying to be an ECA tool that helps you find the needle in the haystack; instead, we’re focused on collecting the data, normalizing it, cataloguing it and handing it off for the attorneys to do their work.

Thanks, Alon, for participating in the interview!

And to the readers, as always, please share any comments you might have or if you’d like to know more about a particular topic!

Managing an eDiscovery Contract Review Team: Starting the Project

 

Throughout the life of the project, you will implement some standard project management techniques for monitoring quality and ensuring that questions are resolved efficiently.  We’re going to cover those techniques in the next blog posts in this series.  There are, however, a couple of special steps you should take at the beginning of a project to ensure that it gets off to a smooth start.

First, you want to make sure very quickly that everyone on the team understands the criteria and is applying it correctly to the documents.  The best way to do this is to check everybody’s initial work right away and to provide feedback as quickly as you can.  Make arrangements with the supervisory staff to work extra hours the first few days of the project.  During the first couple of days, have supervisors check the work after the review staff leaves, and have them give feedback – one-on-one – to each team member.  In addition, make sure that the supervisors are communicating with one another and with the project manager about what they are finding.  Widespread misunderstandings will uncover holes in the training and can easily be cleared up with the group in short re-training sessions.

When we talked about who should be on the review team, we talked about “decision makers” and subject matter experts.  Make sure these team members are onsite full-time the first few days of the project.  There will be a lot of questions the first few days, and you’ll want to resolve those questions quickly.  Once the project gets underway, the level of questions will subside, and the supervisors and project manager will be better equipped to answer the questions that do arise.  At that point, you probably don’t need the decision makers and subject matter experts on hand full time.  But make sure they are present at the start of the project. 

How do you approach starting a document review project?  Please share any comments you have and let us know if you’d like to know more about an eDiscovery topic.

eDiscovery Trends: Tom Gelbmann of Gelbmann & Associates, LLC

 

This is the first of the LegalTech New York (LTNY) Thought Leader Interview series.  eDiscoveryDaily interviewed several thought leaders at LTNY this year and asked each of them the same three questions:

  1. What do you consider to be the current significant trends in eDiscovery that people in the industry are, or should be, focused on?
  2. Which of those trends are evident here at LTNY, which are not being talked about enough, and/or what are your general observations about LTNY this year?
  3. What are you working on that you’d like our readers to know about?

Today’s thought leader is Tom Gelbmann. Tom is Principal of Gelbmann & Associates, LLC, co-author of the Socha-Gelbmann Electronic Discovery Survey and co-founder of the Electronic Discovery Reference Model (EDRM).  Since 1993, Gelbmann & Associates, LLC has helped law firms and Corporate Law Departments realize the full benefit of their investments in Information Technology.  As today is Valentine’s Day, consider this interview with Tom as eDiscoveryDaily’s Valentine’s Day present to you!

What do you consider to be the current significant trends in eDiscovery that people in the industry are, or should be, focused on?

The first thing that comes to mind is the whole social media thing, which is something you’re probably getting quite a bit of (in your interviews), but with the explosion of the use of social media, personally and within organizations, we’re seeing a huge explosion (in eDiscovery).  One of the issues is that there is very little in terms of policy and management around that, and I look at it in a very similar vein to the late ’80s and early ‘90s when electronic mail came about and there were no real defining guidelines.  It wasn’t until we got to a precipitating event where “all of a sudden, organizations get religion” and say “oh my god, we better have a policy for this”.  So, I think the whole social media thing is one issue.

On top of that, another area that is somewhat of an umbrella to all of this is information management, and the EDRM Information Management Reference Model (IMRM) is certainly part of that. What is important in this context is that corporations are beginning to realize that the more they get their “electronic house in order”, the better off they’re going to be in many ways: less cost, less embarrassment and so forth.

The third thing – and this is something that I’ve been tracking for awhile – is the growth in tools and solutions available for small organizations and small cases.  For a long time, everything was about millions of documents and gigabytes of data – that’s what got the headlines and that’s what the service bureaus and providers were focusing on.  The real “gold” in my mind is the small cases, the hundreds of thousands of small cases that are out there.  The providers that can effectively reach that market in a cost-effective way will be positioned very well, and I think we’re starting to see that happen.  And, I think the whole “cloud” concept of technology is helping that.

Which of those trends are evident here at LTNY, which are not being talked about enough, and/or what are your general observations about LTNY this year?

{Interviewed on the first afternoon of the show} Well, so far it’s been a blur [laughs].  But, I think we’re definitely seeing social media as a big issue at this LegalTech and I also think we’re seeing more solutions toward the smaller cases and smaller organizations here at this year’s show.

What are you working on that you’d like our readers to know about?

From an EDRM standpoint, I just came from a meeting for the EDRM Testing pilot project.  Last fall, at the mid-year meeting, there was a groundswell to address testing.  The basic issue is applying some principles of testing to software products associated with electronic discovery, to answer the question of “how do you know?” when the court asks whether the results are true and what sort of testing process you went through.  There is very little as far as a testing regimen – or even guidelines on a testing regimen – for electronic discovery software, so the EDRM testing group is looking to establish some guidelines, starting very basically by looking at bands of rigor associated with bands of risk.  So, you will see at this year’s EDRM annual meeting in May that EDRM Testing will become a full-fledged project.

And the other thing that I’m happy to announce is that George Socha and I have launched a web site called Apersee, which is the next step in the evolution of the (Socha-Gelbmann) rankings.  We killed the rankings two years ago because they were being misused.  Consumers wanted to know “who do I send the RFP to, who do I engage?”, and they would almost mindlessly send to the Socha-Gelbmann Top Ten.  But, now consumers can specify what they’re looking for, starting with areas of the model – whether it’s Collection, Preservation, Review, etc. – and provide other information such as geography and types of ESI, and what will be returned on those searches is a list of providers with those services or products.  We have right now about 800 providers in the database, and many of those have very basic listings at this point.  As this is currently in beta, we have detailed information that we pre-populated for about 200 providers and are expanding rapidly.  Over the next couple of months, we’re working hard with providers to populate their sites with whatever content is appropriate to describe their products and services in terms of what they do, where they do it, etc., that can feed the search engine.  And, we have been getting very good feedback from both the consumer side and the provider side that this is a very valuable service.

Thanks, Tom, for participating in the interview!

And to the readers, as always, please share any comments you might have or if you’d like to know more about a particular topic!

eDiscovery Case Law: Responses to FOIA Requests Must Be Searchable

Southern District of New York Judge Shira A. Scheindlin is at it again!  Her latest ruling is that the federal government must provide documents “in a usable format” when it responds to Freedom of Information Act (FOIA) requests.

Noting that “Once again, this Court is required to rule on an eDiscovery issue that could have been avoided had the parties had the good sense to ‘meet and confer,’ ‘cooperate’ and generally make every effort to ‘communicate’ as to the form in which ESI would be produced,” Judge Scheindlin ruled that federal agencies must turn over documents that include “metadata,” which allows them to be searched and indexed.  Indicating that “common sense dictates” that the handling of FOIA requests should be informed by “the spirit if not the letter” of the Federal Rules of Civil Procedure, Judge Scheindlin indicated the government offered “a lame excuse” for delivering non-searchable documents.

In National Day Laborer Organizing Network v. U.S. Immigration and Customs Enforcement Agency, 10 Civ. 3488, the National Day Laborer Organizing Network, Center for Constitutional Rights and the Immigration Justice Clinic at the Benjamin N. Cardozo School of Law sued to require production of a wide range of documents under the Freedom of Information Act in August 2010.  In response, the government agency defendants produced documents grouped together in large files that were not searchable, for which individual documents could not be easily identified, with emails separated from their attachments.

Consistent with the decisions of several state courts regarding their own FOIA statutes, Judge Scheindlin ruled that the federal law requires that metadata, which allows electronic files to be organized and searched, must be retained in the records agencies produce.  While the federal act doesn’t specify the form in which documents must be delivered, it does require that documents be provided in any “format” that is “readily reproducible” by the agency in that format.  Metadata, in the FOIA context, is “readily reproducible,” Judge Scheindlin noted.

Judge Scheindlin also observed that “whether or not metadata has been specifically requested,” the production of non-searchable documents is “an inappropriate downgrading” of electronically stored information and provision of files “stripped of all metadata and lumped together without any indication of where a record begins and ends” is not an “acceptable form of production,” she said.

A copy of the opinion and order can be found here.

So, what do you think?  Have you been the recipient of a “lumped together” non-searchable production recently?  Please share any comments you might have or if you’d like to know more about a particular topic.

eDiscovery Trends: EDRM Metrics Privilege Survey

 

As a member of the EDRM Metrics Project for the past four years, I have seen several accomplishments by the group to provide an effective means of measuring the time, money and volumes associated with eDiscovery activities, including:

  • Code Set: An extensive code set of activities to be tracked from Identification through Presentation, as well as Project Management.
  • Case Study: A hypothetical case study that illustrates at each phase why metrics should be collected, what needs to be measured, how metrics are acquired and where they’re recorded, and how the metrics can be used.
  • Cube: A simple graphical model which illustrates the EDRM phases, aspects to be tracked (e.g., custodians, systems, media, QA, activities, etc.) and the metrics to be applied (i.e., items, cost, volume, time).

The EDRM Metrics project has also been heavily involved in proposing a standard set of eDiscovery activity codes for the ABA’s Uniform Task Based Management System (UTBMS) series of codes used to classify the legal services performed by a law firm in an electronic invoice submission.

Now, we need your help for an information gathering exercise.

We are currently conducting a Metrics Privilege survey to get a sense, throughout the industry, of typical volumes and percentages of privileged documents within a collection.  It’s a simple, seven-question survey that strives to gather information regarding your experiences with privileged documents (whether you work for a law firm, corporation, provider or some other organization).

If you have a minute (which is literally all the time it will take), please take the survey and pass along to your colleagues to do so as well.  The more respondents who participate, the more representative the survey will be as to the current eDiscovery community.  To take the survey, go to edrm.net or click here.  EDRM will publish the results in the near future.

So, what do you think?  What are your typical metrics with regard to privileged documents?  Please share any comments you might have or if you’d like to know more about a particular topic.

Managing an eDiscovery Contract Review Team: Training a Review Team

 

Yesterday, we discussed assembling the project team for document review.  It’s also important that the review team gets good training.  As a starting point, prepare a training manual for each team member that includes this information:

  • The document review criteria
  • A list of the custodians.  For each, provide the custodian’s title, a job description, a description of his/her role in the events that are at issue in the case, and a description of the types of documents you expect will be found in his/her files
  • Lists of keywords, key characters, key events, and key dates
  • Samples of responsive documents that you collected when you reviewed the collection
  • The review procedures
  • The review schedule
  • Instructions for use of the review tool

Cover these topics in training:

  • Case background information
    • A description of the parties
    • A description of the events that led to the case
    • A description of the allegations and defenses
    • An overview of the expected case schedule
  • Project overview information
    • A description of the goals of the review project
    • A description of the process
    • An overview of the expected project schedule
  • Responsive criteria
    • Go through the criteria – point-by-point – to ensure that the group understands what is responsive
    • Provide samples of responsive documents
  • Mechanics
    • Describe the roles of individuals on the team
    • Review the procedures for reviewing documents
    • Train the reviewers in use of the online review tool

Give the team training exercises – that is, give them sample documents to review.  Collect the work, review it, and give feedback to the group.

And let me give you two more suggestions that will help make your training effective:

  1. Train the team together, rather than one-on-one or in sub-groups.  This team-training approach ensures that everyone hears the same thing, and that responses to questions asked by individuals benefit the entire group.
  2. Involve a senior attorney in the training.  You might, for example, ask a senior attorney to give the case background information.  Attention from a senior litigation team member is good for morale.  It tells the team that the work they are doing is important to the case.

How do you approach training a document review team?  Please share any comments you have and let us know if you’d like to know more about an eDiscovery topic.

Managing an eDiscovery Contract Review Team: Assembling the Project Team

 

Before assembling the review team, think through how the project will be structured.  This will drive decisions on the type of people that you’ll need.  Your goal is to get the work done as cost effectively as possible – using less expensive personnel where possible — without sacrificing work quality or the utility of the work product.

The “base” of the project will consist of contract reviewers and QC staff.  In the project plan, you determined the number of people that you need.  At this point, don’t worry about who will be a reviewer and who will do QC work.  Everybody can start as a reviewer.  After a few days, you can identify who will do QC work.  You’ve got options for assembling this staff, but you should consider working with a litigation support vendor who offers staffing services.  A good vendor already has access to a pool of people with document review experience.  This can save you lots of time and work.

In addition to the contract review staff, you’ll need project management staff.  We’ve already talked about a project manager.  For a large project, you’ll want project supervisors — each responsible for a team of reviewers/QC personnel.  Each supervisor is responsible for overseeing the flow of work to the team, the quality of the work done by the team, the productivity of team members, and answering questions raised by reviewers (or ensuring that questions are resolved).  I usually create teams of 10 to 12 and assign one supervisor to each team.  The supervisors might be law firm litigation support professionals, or supervisory staff provided by the vendor with whom you are working.

You’ll also need “decision makers” and experts in the subject matter to round out the team.  At a minimum, you’ll want an attorney from the litigation team.  Depending on the complexity of the documents, you might need a client employee who is familiar with the company’s operations and documents.  These people should be on-site, full-time for the first few days of a project.  Eventually there will be fewer questions and it’s probably sufficient to have phone access to these team members.

Later in this blog series we’ll talk about how these staff levels interact so that decisions are made by attorneys but effectively implemented by review staff.

How do you structure a document review team?  Please share any comments you have and let us know if you’d like to know more about an eDiscovery topic.

eDiscovery Best Practices: Judges’ Guide to Cost-Effective eDiscovery

 

Last week at LegalTech, I met Joe Howie at the blogger’s breakfast on Tuesday morning.  Joe is the founder of Howie Consulting and is the Director of Metrics Development and Communications for the eDiscovery Institute, which is a 501(c)(3) nonprofit research organization for eDiscovery.

The eDiscovery Institute has just released a new publication that is a vendor-neutral guide to approaches that can considerably reduce discovery costs for ESI.  The Judges’ Guide to Cost-Effective E-Discovery, co-written by Anne Kershaw (co-Founder and President of the eDiscovery Institute) and Joe Howie, also contains a foreword by the Hon. James C. Francis IV, Magistrate Judge for the Southern District of New York.  Joe gave me a copy of the guide, which I read during my flight back to Houston and found to be a terrific publication that details various mechanisms that can reduce the volume of ESI to review by up to 90 percent or more.  You can download the publication here (for personal review, not re-publication), and also read a summary article about it from Joe in InsideCounsel here.

Mechanisms for reducing costs covered in the Guide include:

  • DeNISTing: Excluding files known to be associated with commercial software, such as help files, templates, etc., as compiled by the National Institute of Standards and Technology, can eliminate a high number of files that will clearly not be responsive;
  • Duplicate Consolidation (aka “deduping”): Deduping across custodians, rather than only within each custodian’s files, delivers greater savings – 38% for across-custodian deduping versus 21% for within-custodian deduping;
  • Email Threading: The ability to review an entire email thread at once reduces costs by 36% over having to review each email in the thread separately;
  • Domain Name Analysis (aka Domain Categorization): As noted previously in eDiscoveryDaily, the ability to classify items based on the domain of the sender of the email can significantly reduce the collection to be reviewed by identifying emails from parties that are clearly not responsive to the case.  It can also be a great way to quickly identify some of the privileged emails;
  • Predictive Coding: As noted previously in eDiscoveryDaily, predictive coding is the use of machine learning technologies to categorize an entire collection of documents as responsive or non-responsive, based on human review of only a subset of the document collection. According to this report, “A recent survey showed that, on average, predictive coding reduced review costs by 45 percent, with several respondents reporting much higher savings in individual cases”.
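
The first two mechanisms above are essentially hash-based culling.  As a minimal illustration (not any vendor’s actual implementation – the function names and data structures here are hypothetical), the sketch below drops files whose hashes appear on a known-file list such as the NIST NSRL (DeNISTing), then keeps only the first copy of each remaining hash across all custodians (cross-custodian deduping):

```python
import hashlib
from pathlib import Path

def file_hash(path: Path) -> str:
    """Return the SHA-1 hash of a file's contents, read in chunks."""
    h = hashlib.sha1()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def cull_collection(files_by_custodian, nist_hashes):
    """DeNIST and dedupe a collection.

    files_by_custodian: dict mapping custodian name -> list of Paths
    nist_hashes: set of hashes for known commercial-software files
    Returns (custodian, path) pairs for unique, non-NIST files to review.
    """
    seen = set()          # hashes already kept (cross-custodian dedupe)
    to_review = []
    for custodian, paths in files_by_custodian.items():
        for path in paths:
            digest = file_hash(path)
            if digest in nist_hashes:   # DeNISTing: known system file
                continue
            if digest in seen:          # duplicate of an already-kept file
                continue
            seen.add(digest)
            to_review.append((custodian, path))
    return to_review
```

Deduping only within each custodian’s files would amount to resetting the `seen` set per custodian – which is why across-custodian deduping catches more duplicates and saves more review cost.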

The publication also addresses concepts such as focused sampling, foreign language translation costs and searching audio records and tape backups.  It even addresses some of the most inefficient (and therefore costly) practices of ESI processing and review, such as wholesale printing of ESI to paper for review (either in paper form or ultimately converted to TIFF or PDF), which is still more common than you might think.  Finally, it references some key rules of the ABA Model Rules of Professional Conduct to address the ethical duty of attorneys in effective management of ESI.  It’s a comprehensive publication that does a terrific job of explaining best practices for efficient discovery of ESI.

So, what do you think?  How many of these practices have been implemented by your organization?  Please share any comments you might have or if you’d like to know more about a particular topic.