
Managing an eDiscovery Contract Review Team: Training a Review Team

 

Yesterday, we discussed assembling the project team for document review.  It’s also important that the review team receives good training.  As a starting point, prepare a training manual for each team member that includes this information:

  • The document review criteria
  • A list of the custodians.  For each, provide the custodian’s title, a job description, a description of his/her role in the events that are at issue in the case, and a description of the types of documents you expect will be found in his/her files
  • Lists of keywords, key characters, key events, and key dates
  • Samples of responsive documents that you collected when you reviewed the collection
  • The review procedures
  • The review schedule
  • Instructions for use of the review tool

Cover these topics in training:

  • Case background information
    • A description of the parties
    • A description of the events that led to the case
    • A description of the allegations and defenses
    • An overview of the expected case schedule
  • Project overview information
    • A description of the goals of the review project
    • A description of the process
    • An overview of the expected project schedule
  • Responsive criteria
    • Go through the criteria – point-by-point – to ensure that the group understands what is responsive
    • Provide samples of responsive documents
  • Mechanics
    • Describe the roles of individuals on the team
    • Review the procedures for reviewing documents
    • Train the reviewers in use of the online review tool

Give the team training exercises – that is, give them sample documents to review.  Collect the work, review it, and give feedback to the group.

And let me give you two more suggestions that will help make your training effective:

  1. Train the team together, rather than one-on-one or in sub-groups.  Under this team-training approach, you ensure that everyone hears the same thing, and that responses to questions asked by individuals will benefit the entire group.
  2. Involve a senior attorney in the training.  You might, for example, ask a senior attorney to give the case background information.  Attention from a senior litigation team member is good for morale.  It tells the team that the work they are doing is important to the case.

How do you approach training a document review team?  Please share any comments you have and let us know if you’d like to know more about an eDiscovery topic.

Managing an eDiscovery Contract Review Team: Assembling the Project Team

 

Before assembling the review team, think through how the project will be structured.  This will drive decisions on the type of people that you’ll need.  Your goal is to get the work done as cost-effectively as possible – using less expensive personnel where you can – without sacrificing work quality or the utility of the work product.

The “base” of the project will be composed of contract reviewers and qc staff.  In the project plan, you determined the number of people that you need.  At this point, don’t worry about who will be a reviewer and who will do qc work.  Everybody can start as a reviewer.  After a few days, you can identify who will do qc work.  You’ve got options for assembling this staff, but you should consider working with a litigation support vendor who offers staffing services.  A good vendor already has access to a pool of people with document review experience.  This can save you lots of time and work.

In addition to the contract review staff, you’ll need project management staff.  We’ve already talked about a project manager.  For a large project, you’ll want project supervisors — each responsible for a team of reviewers/qc personnel.  Each supervisor oversees the flow of work to the team, the quality of the work done by the team, and the productivity of team members, and answers questions raised by reviewers (or ensures that questions get resolved).  I usually create teams of 10 to 12 and assign one supervisor to each team.  The supervisors might be law firm litigation support professionals, or supervisory staff provided by the vendor with whom you are working.

You’ll also need “decision makers” and experts in the subject matter to round out the team.  At a minimum, you’ll want an attorney from the litigation team.  Depending on the complexity of the documents, you might need a client employee who is familiar with the company’s operations and documents.  These people should be on-site, full-time for the first few days of a project.  Eventually there will be fewer questions, and phone access to these team members will probably be sufficient.

Later in this blog series we’ll talk about how these staff levels interact so that decisions are made by attorneys but effectively implemented by review staff.

How do you structure a document review team?  Please share any comments you have and let us know if you’d like to know more about an eDiscovery topic.

eDiscovery Best Practices: Judges’ Guide to Cost-Effective eDiscovery

 

Last week at LegalTech, I met Joe Howie at the blogger’s breakfast on Tuesday morning.  Joe is the founder of Howie Consulting and is the Director of Metrics Development and Communications for the eDiscovery Institute, which is a 501(c)(3) nonprofit research organization for eDiscovery.

eDiscovery Institute has just released a new publication that is a vendor-neutral guide to approaches that can considerably reduce discovery costs for ESI.  The Judges’ Guide to Cost-Effective E-Discovery, co-written by Anne Kershaw (co-Founder and President of the eDiscovery Institute) and Joe Howie, also contains a foreword by the Hon. James C. Francis IV, Magistrate Judge for the Southern District of New York.  Joe gave me a copy of the guide, which I read during my flight back to Houston and found to be a terrific publication that details various mechanisms that can reduce the volume of ESI to review by 90 percent or more.  You can download the publication here (for personal review, not re-publication), and also read a summary article about it from Joe in InsideCounsel here.

Mechanisms for reducing costs covered in the Guide include:

  • DeNISTing: Excluding files known to be associated with commercial software, such as help files, templates, etc., as compiled by the National Institute of Standards and Technology, can eliminate a high number of files that will clearly not be responsive;
  • Duplicate Consolidation (aka “deduping”): Deduping across custodians, rather than only within each custodian’s files, reduces review costs by 38%, versus 21% for within-custodian deduplication (a simple hashing sketch follows this list);
  • Email Threading: The ability to review the entire email thread at once reduces costs 36% over having to review each email in the thread;
  • Domain Name Analysis (aka Domain Categorization): As noted previously in eDiscoveryDaily, the ability to classify items based on the domain of the sender of the email can significantly reduce the collection to be reviewed by identifying emails from parties that are clearly not responsive to the case.  It can also be a great way to quickly identify some of the privileged emails;
  • Predictive Coding: As noted previously in eDiscoveryDaily, predictive coding is the use of machine learning technologies to categorize an entire collection of documents as responsive or non-responsive, based on human review of only a subset of the document collection. According to this report, “A recent survey showed that, on average, predictive coding reduced review costs by 45 percent, with several respondents reporting much higher savings in individual cases”.
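To make the across-custodian versus within-custodian distinction concrete, here is a minimal Python sketch of hash-based duplicate consolidation.  It is not from the Guide – the folder layout and function names are hypothetical – and it simply assumes the collection is organized as one folder per custodian, keeping the first copy of each unique file:

import hashlib
from pathlib import Path

def file_hash(path):
    """Return a SHA-1 hash of the file's contents (a common dedupe key)."""
    return hashlib.sha1(path.read_bytes()).hexdigest()

def dedupe(collection_root, across_custodians=True):
    """Keep one copy of each unique file for review.

    Assumes collection_root contains one sub-folder per custodian.
    With across_custodians=True, a file held by several custodians is
    reviewed only once; with False, duplicates are removed only within
    each custodian's own folder.
    """
    seen_globally = set()     # hashes seen anywhere in the collection
    keep = []
    for custodian_dir in sorted(Path(collection_root).iterdir()):
        if not custodian_dir.is_dir():
            continue
        seen_locally = set()  # hashes seen within this custodian only
        for f in custodian_dir.rglob("*"):
            if not f.is_file():
                continue
            h = file_hash(f)
            seen = seen_globally if across_custodians else seen_locally
            if h not in seen:
                seen_globally.add(h)
                seen_locally.add(h)
                keep.append(f)
    return keep

The broader the comparison, the more duplicates drop out of the review set, which is why the Guide’s across-custodian savings figure is nearly double the within-custodian figure.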

The publication also addresses concepts such as focused sampling, foreign language translation costs and searching audio records and tape backups.  It even addresses some of the most inefficient (and therefore, costly) practices of ESI processing and review, such as wholesale printing of ESI to paper for review (either in paper form or ultimately converted to TIFF or PDF), which is still more common than you might think.  Finally, it references some key rules of the ABA Model Rules of Professional Conduct to address the ethical duty of attorneys in effective management of ESI.  It’s a comprehensive publication that does a terrific job of explaining best practices for efficient discovery of ESI.

So, what do you think?  How many of these practices have been implemented by your organization?  Please share any comments you might have or if you’d like to know more about a particular topic.

Managing an eDiscovery Contract Review Team: Prepare a Review Plan

 

Before starting a review project, prepare a plan that will be your map for moving forward and that you’ll use to monitor progress throughout the project.  The components of the plan should include:

  • A schedule:  The schedule will be dictated by the date on which you’re required to produce documents.  Determine when you can start  (you’ll need time in advance to assemble the team, draft criteria, train the team, and get documents loaded into the review tool).  Plan to complete the project well in advance of the production deadline, so you’ve got some pad for schedule slippage.  Once you have a start date and an end date, look at the size of the collection and calculate how many documents you need to process in a day. 
  • Staff resources:  Determine the number of units a reviewer can do in a day (use statistics from prior projects or do test runs on a sampling of the collection).  From there, do the math to determine how many reviewers you’ll need to complete the project within your schedule (a quick calculation along these lines is sketched after this list).  Add in people to do quality control (a good ratio is 1 qc reviewer for every 4 or 5 reviewers).
  • A budget:  Include costs for review/qc staff, project management, equipment, and any other incidentals.
  • A monitoring plan:  Every day you should look at the progress that the review team is making so you’ll know where you stand with schedule and budget.  You’ll need reports that provide statistics on what has been completed – reports that can probably be generated by the review tool.  Look at those reports to determine if they will meet your needs or if you’ll need to establish additional tracking mechanisms.
  • A quality control plan:  Determine how reviewed documents will be funneled to the quality control staff, the mechanism by which quality control reviewers will report findings to project management, and the mechanism by which feedback will be provided to reviewers.
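To put numbers to the schedule and staffing arithmetic above, here is a minimal sketch of the calculation.  The figures in the example are placeholders, not recommendations – plug in your own collection size, review rate, and working days:

import math

def review_staffing(total_docs, docs_per_reviewer_per_day, working_days,
                    reviewers_per_qc=5):
    """Rough staffing estimate for a document review project.

    total_docs                 size of the collection to be reviewed
    docs_per_reviewer_per_day  rate from prior projects or a test run
    working_days               review days between start date and end date
    reviewers_per_qc           reviewers supported by one qc reviewer
    """
    docs_per_day = math.ceil(total_docs / working_days)
    reviewers = math.ceil(docs_per_day / docs_per_reviewer_per_day)
    qc_staff = math.ceil(reviewers / reviewers_per_qc)
    return {"docs_per_day": docs_per_day,
            "reviewers": reviewers,
            "qc_staff": qc_staff}

# Example: 300,000 documents, 400 documents per reviewer per day, 30 working days
print(review_staffing(300_000, 400, 30))
# {'docs_per_day': 10000, 'reviewers': 25, 'qc_staff': 5}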

A good plan isn’t hard to prepare and it can make a world of difference in how smoothly the project will run.

How do you plan a document review?  Have you run into glitches?  Please share any comments you have and let us know if you’d like to know more about an eDiscovery topic.

Managing an eDiscovery Contract Review Team: Identify a Project Manager

 

Yesterday, we talked about applying topic codes to the documents to identify helpful or harmful documents.  Today, we will talk about identifying a project manager for the review.

A good, experienced project manager is critical to the success of your review project.  In fact, the project manager is the most important part of the equation.  The project manager will be responsible for:

  • Creating a schedule and a budget
  • Determining the right staff size
  • Lining up all the resources that you’ll need like computer equipment, software, and supplies
  • Preparing training materials
  • Coordinating training of the review team
  • Serving as a liaison with the service providers who are processing the data, loading data into the review tool, and making the review tool available to the review team
  • Monitoring status of the project and reporting to the litigation team
  • Identifying potential problems with schedule and budget and developing resolutions
  • Ensuring that questions are resolved quickly and that lines of communication between the review team and decision makers are open
  • Supervising workflow and quality control work

Choose someone with project management experience who is also experienced in litigation, technology, electronic discovery, working with vendors, and working with attorneys.  Identify the project manager early on and get him or her involved in the project planning steps.

What do you look for in a project manager?  Please share any comments you have and let us know if you’d like to know more about an eDiscovery topic.

Managing an eDiscovery Contract Review Team: Applying Topic Codes in the Document Review

 

So far we’ve covered drafting criteria for responsiveness and for privilege.  You may, however, be asking the review team to do more than that in the document review.  You might, for example, ask them to apply topic codes to the documents or to identify helpful or harmful documents.  At this point in the case, you will be better off keeping this very simple.  There are several reasons for this:

  • Chances are that you’re on a tight schedule.  An in-depth analysis of the collection at this point may cause you to miss production deadlines.
  • If you ask people to focus on too many things in the review, you increase the likelihood of errors and inconsistencies, especially if the team is inexperienced with the case, the client and the documents.
  • You’re still in the early stages of the case.  As it evolves you’ll identify new facts, issues and witnesses that will be important.  This will not be your only effort to match documents with issues, facts and witnesses.

It may be reasonable, however, to ask the team to do some very basic categorization of the documents around topics.  Let me give you an example.  Let’s say you are handling a pharmaceutical case involving a drug product that is alleged to have significant adverse reactions.  You know that you’ll be interested in documents that discuss testing of the product, marketing, manufacturing, and so on.  You could ask the team to apply those general types of topics to the documents.  You could also identify a few examples of text that will be helpful and text that will be harmful, and create corresponding topic codes (using our pharmaceutical case illustration, you might have a topic code for “Death of a patient”).   A very simple set of topic codes shouldn’t slow down the review, and this effort will provide some search hooks into the collection once the review is complete.

Once you’ve developed a simple, workable topic list, write clear, objective definitions for each topic, and find documents in the collection that serve as examples of each.  Include those definitions and examples in the criteria.
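As an illustration only – staying with the hypothetical pharmaceutical case above – a simple topic-code list with objective definitions and sample text might be captured in a structure like this:

# Hypothetical topic codes for the pharmaceutical example above.
# Definitions are deliberately objective; the example text would be drawn
# from real documents found while sampling the collection.
TOPIC_CODES = {
    "TESTING": {
        "definition": "Discusses clinical or laboratory testing of the product.",
        "example_text": "Attached are the Phase II trial results for the compound...",
    },
    "MARKETING": {
        "definition": "Discusses advertising, promotion, or sales of the product.",
        "example_text": "The launch campaign for the new formulation begins in Q3...",
    },
    "MANUFACTURING": {
        "definition": "Discusses production, packaging, or quality control of the product.",
        "example_text": "Batch record deviations were noted on line 4 last week...",
    },
    "PATIENT_DEATH": {
        "definition": "Reports or discusses the death of a patient who used the product.",
        "example_text": "We received a report that a patient died after taking...",
    },
}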

Do you apply topic codes to a collection in an initial review?  How do you approach it and how well does it work?  Please share any comments you have and let us know if you’d like to know more about an eDiscovery topic.

eDiscovery Case Law: Privilege Waived for Produced Servers

If you were at the International Legal Technology Association (ILTA) trade show this past August, you may have noticed a huge unfinished building in the middle of the strip – the Fontainebleau Resort.  It sits idle after financing was pulled, forcing Fontainebleau Las Vegas LLC to file for Chapter 11 bankruptcy in June of 2009.  Naturally, lawsuits followed between the Term Lenders and Fontainebleau Resort, LLC (FRLLC), the third-party parent of Fontainebleau Las Vegas – In re Fontainebleau Las Vegas Contract Litig. (S.D. Fla. Jan. 7, 2011).

A company that responded to a third-party subpoena and court orders compelling production by handing over three servers to lenders – without conducting any relevancy review and without reviewing two of the servers for privileged materials – waived privilege for documents on the two servers that were not reviewed.

The parent company of a resort in bankruptcy proceedings was served by lenders to the resort with a subpoena for production of documents. The company did not object to the scope of the subpoena, and the court granted a motion of the lenders to compel production. Counsel for the company then halted work by an e-discovery vendor who had completed screening the company’s email server for responsive documents but had not started a privilege review because of concerns that the company could not pay for the services. Counsel for the company also sought to withdraw from the case, but the company was unable to find new counsel.

Rather than seeking a stay or challenging discovery rulings from the court, the company turned over data from a document server, an accounting server, and an email server. According to the court, the three servers were turned over to the lenders without any meaningful review for relevancy or responsiveness. Despite an agreement with the lenders on search terms for the email server, the company produced a 126 gigabyte disk with 700,000 emails from that server and then, without asking for leave of court, was late in producing a privilege log for data on the email server. The lenders sought direction from the court on waiver of privilege and their obligation if they found privileged materials in the data produced by the company. The company for the first time then raised objections to the burdensomeness of the original subpoena served over six months earlier given the company’s lack of resources or employees to conduct a document review.

The court held that the company “waived the attorney-client privilege and work product protection, and any other applicable privileges, for the materials it produced from two of three computer servers in what can fairly be described as a data dump as part of a significantly tardy response to a subpoena and to court-ordered production deadlines.” The court stated that in effect, the company “took the two servers, which it never reviewed for privilege or responsiveness, and said to the Term Lenders ‘here, you go figure it out.’”

However, because the company prepared a privilege log for the email server, the court added that privileges were not waived for materials from the email server. Also, the lenders were directed to alert the company to any “clearly privileged material they may find during their review of the production on the documents and accounting servers.” Although the court was not ruling on admissibility at trial of that privileged material, the lenders would be allowed to use it during pre-trial preparations, including depositions.

So, what do you think?  Was justice served?  Please share any comments you might have or if you’d like to know more about a particular topic.

Case Summary Source: Applied Discovery (free subscription required).  For eDiscovery news and best practices, check out the Applied Discovery Blog here.

Disclaimer: The views represented herein are exclusively the views of the author, and do not necessarily represent the views held by CloudNine Discovery. eDiscoveryDaily is made available by CloudNine Discovery solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscoveryDaily should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.

Managing an eDiscovery Contract Review Team: Drafting Privileged Criteria

Yesterday, we covered drafting criteria for responsiveness.  You may, however, be asking the review team to do more than identify responsive documents.  You might, for example, also ask them to identify privileged documents, significant documents, documents that need to be redacted, documents that need to be reviewed by an expert, and so on.  In this issue, we’ll talk about reviewing for privilege.

First, let’s clarify what you’ll be asking the review team to do.  If you are using a team of contract reviewers, it is unlikely that you’ll be asking them to make privilege decisions.  You might, however, ask them to identify and flag potentially privileged documents.  Under this approach, attorneys on your team who can make privilege decisions would do a subsequent review of the potentially privileged documents.  That’s when privilege decisions will be made.

Of course, you’ll need to give the contract team criteria for potentially privileged materials.  Consider including these information points and instructions in the criteria:

  • The names and initials of individual attorneys, both outside counsel and corporate in-house attorneys.
  • The names and initials of legal assistants and other litigation team members of outside counsel and the corporate legal department (work done by these individuals under the direction of counsel may be privileged).
  • The names of law firms that have served as outside counsel.
  • Documents on law firm letterhead.
  • Documents stamped “Work Product”, “Attorney Client”, “Privileged”, or bearing other designations indicating confidentiality related to the litigation.
  • Legal documents such as pleadings or briefs in draft form.
  • Handwritten annotations on documents that may have been authored by counsel or by litigation team members working under the direction of counsel.
  • Subject areas of privileged communication.

In addition, provide instructions for documents that will not be privileged.  In every collection, there will be certain types of documents that won’t be privileged unless they bear privileged annotations.  Examples are published literature, press releases, advertisements, corporate annual reports, brochures, user manuals… in short, any documents that are public in nature.  Likewise, most document collections will include internal documents that fall into the same category.  Examples may be insurance policies, invoices, manufacturing reports, and so on.  Create a list of these documents and include it in the criteria instructions.
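The contract team will apply these criteria by reading the documents, but the same information points can also drive an automated first pass that flags potentially privileged items for attorney review.  Here is a minimal sketch under that assumption – the names, firms, and designations are placeholders to be replaced with the actual case lists:

import re

# Hypothetical lists - replace with the names, firms, and designations
# drawn from the criteria for the actual case.
ATTORNEY_NAMES = ["Jane Smith", "J. Smith", "Robert Jones"]    # outside and in-house counsel
LAW_FIRMS = ["Smith & Jones LLP"]                              # outside counsel firms
PRIVILEGE_STAMPS = ["work product", "attorney-client", "attorney client", "privileged"]

def flag_potentially_privileged(text):
    """Return True if the document should be routed to attorneys for a privilege call."""
    lowered = text.lower()
    if any(stamp in lowered for stamp in PRIVILEGE_STAMPS):
        return True
    return any(re.search(re.escape(name), text, re.IGNORECASE)
               for name in ATTORNEY_NAMES + LAW_FIRMS)

Anything flagged this way would still go to attorneys for the actual privilege decision, consistent with the two-step approach described above.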

Have you drafted criteria for a privilege review of a large collection?  How did you approach it and how well did it work?  Please share any comments you might have and let us know if you’d like to know more about an eDiscovery topic.

Managing an eDiscovery Contract Review Team: Drafting Responsive Criteria – a Step-by-Step Guide

 

The criteria that you prepare for the review will be governed by the objectives that you established for the review.  At a minimum, you’ll draft criteria for responsive documents.  In addition, you may draft criteria for privileged documents, hot documents, and so on.  Let’s start with drafting responsive criteria.  For this step, you’ll need the request for production and the notes that you took when you sampled the document collection.

For each separate point on the request for production, do the following:

  • Expand on the definition.  Make it clearer and more detailed.  Make sure that the language you use is understandable to lay people.
  • List topic areas that are likely to appear in responsive documents.  Make sure these topic areas are objective in nature and that they minimize the need for judgment.  For example, don’t include criteria like “documents that demonstrate negligence in operations”.  Rather, break this down into real-life objective examples like “documents that discuss accidents”, “documents that discuss poor employee performance” and so on.  Use real examples from the documents – examples that you came across during your sampling of the collection.
  • List date ranges of responsive materials.
  • Based on your review of the documents, list as many examples as you can of document types that are responsive, and attach examples to the criteria. 
  • Based on your review of the documents, include as many examples as you can of responsive text.

Several members of the litigation team should review the draft criteria.  Once all suggestions for modifications and additions are agreed upon, put the criteria in “final” form – “final” meaning the document that you will use at the start of the review project.  As you move forward, update the criteria with more examples and clearer definitions as you learn more about the collection.

In the next issue, we’ll cover criteria for other review objectives you might have established (for example, you might be screening for privilege or significance).

Have you drafted criteria for a document review of a large collection?  How did you approach it and how well did it work?  Please share any comments you might have and let us know if you’d like to know more about an eDiscovery topic.

eDiscovery Searching: Proximity, Not Absence, Makes the Heart Grow Fonder

Recently, I assisted a large corporate client for which several searches were conducted across the company’s enterprise-wide document management systems (DMS) for ESI potentially responsive to the litigation.  Some of the individual searches on these systems retrieved over 200,000 files by themselves!

DMS systems are great for what they are intended to do – provide a storage archive for documents generated within the organization, version tracking of those documents and enable individuals to locate specific documents for reference or modification (among other things).  However, few of them are developed with litigation retrieval in mind.  Sure, they have search capabilities, but it can sometimes be like using a sledgehammer to hammer a thumbtack into the wall – advanced features to increase the precision of those searches may often be lacking.

Let’s say in an oil company you’re looking for documents related to “oil rights” (such as “oil rights”, “oil drilling rights”, “oil production rights”, etc.).  You could perform phrase searches, but any variations that you didn’t think of would be missed (e.g., “rights to drill for oil”, etc.).  You could perform an AND search (i.e., “oil” AND “rights”), and that could very well retrieve all of the files related to “oil rights”, but it would also retrieve a lot of files where “oil” and “rights” appear, but have nothing to do with each other.  A search for “oil” AND “rights” in an oil company’s DMS systems may retrieve every published and copyrighted document in the systems mentioning the word “oil”.  Why?  Because almost every published and copyrighted document will have the phrase “All Rights Reserved” in the document.

That’s an example of the type of issue we were encountering with some of those searches that yielded 200,000 files with hits.  And, that’s where proximity searching comes in.  Proximity searching is simply looking for two or more words that appear close to each other in the document (e.g., “oil within 5 words of rights”) – the search will only retrieve the file if those words are as close as specified to each other, in either order.  Proximity searching helped us reduce that collection to a more manageable number for review, even though the enterprise-wide document management system didn’t have a proximity search feature.

How?  We wound up taking a two-step approach to get the collection to a more likely responsive set.  First, we did the “AND” search in the DMS system, understanding that we would retrieve a large number of files, and exported those results.  After indexing them with a first pass review tool that has more precise search alternatives (at Trial Solutions, we use FirstPass™, powered by Venio FPR™, for first pass review), we performed a second search on the set using proximity searching to limit the result set to only files where the terms were near each other.  Then, we tested the results and revised where necessary to retrieve a result set that maximized both recall and precision.
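The first pass review tool did the heavy lifting here, but the underlying proximity test is simple to illustrate.  Here is a minimal Python sketch (not the tool we used) that checks whether two terms appear within a given number of word positions of each other, in either order:

import re

def within_n_words(text, term1, term2, n=5):
    """True if term1 and term2 appear within n word positions of each other, in either order."""
    words = re.findall(r"\w+", text.lower())
    positions1 = [i for i, w in enumerate(words) if w == term1.lower()]
    positions2 = [i for i, w in enumerate(words) if w == term2.lower()]
    return any(abs(i - j) <= n for i in positions1 for j in positions2)

# The 'All Rights Reserved' boilerplate no longer causes a false hit...
print(within_n_words(
    "Crude oil market report for the third quarter. Copyright 2011 Acme Corp. All Rights Reserved.",
    "oil", "rights"))                                                              # False
# ...but genuine variations are still caught, in either order.
print(within_n_words("rights to drill for oil in the leased tract", "oil", "rights"))  # True

In practice, you would run a test like this only over the files exported from the broader “AND” search, as described above, rather than against the entire DMS.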

The result?  We were able to reduce an initial result set of 200,000 files to just over 5,000 likely responsive files by applying the proximity search to the first result set.  And, we probably saved $50,000 to $100,000 in review costs on a single search.

I also often use proximity searches as alternatives to phrase searches to broaden the recall of those searches to identify additional potentially responsive hits.  For example, a search for “Doug Austin” doesn’t retrieve “Austin, Doug” and a search for “Dye 127” doesn’t retrieve “Dye #127”.  One character difference is all it takes for a phrase search to miss a potentially responsive file.  With proximity searching, you can look for these terms close to each other and catch those variations.

So, what do you think?  Do you use proximity searching in your culling for review?  Please share any comments you might have or if you’d like to know more about a particular topic.