Identification

Kroll Leverages ESI Analyst for Case Insights: CloudNine Podcasts

Without the right tools, sorting through a large dataset is akin to stumbling in the dark. Before deep-diving into voluminous data, legal teams need to know what to look for, and the sooner those insights surface, the better. For years, attorneys uploaded data to traditional review platforms to give their clients and firm a head start. Because those platforms offered minimal search tools, attorneys meticulously combed through mobile device data text by text. That process is not only time-consuming but also inefficient: valuable case insights are easy to miss when buried amongst other information.

CloudNine Senior Director Rick Clark kicks off the new 360 Innovate Podcast with an interview of Phil Hodgkins, Director of Data Insights and Forensics at Kroll. As a growing global practice, Kroll is well-versed in managing data-heavy projects involving compliance, investigations, and litigation. While conducting an internal investigation, Kroll learned how ESI Analyst’s capabilities surpassed those of two traditional review platforms. Through its identification and visualization features, ESI Analyst surfaced more insights at a much faster rate. To learn how the Kroll team utilized ESI Analyst to strategically navigate a broad dataset, visit this link: https://cloudnine.com/webcasts/kroll-innovate/?pg=ediscoverydaily/searching/kroll-leverages-esi-analyst-for-case-insights-cloudnine-podcasts

Getting the Most out of Your Keyword Searches

Though a relatively basic search technique, keyword searches allow professionals to identify one or two specific words across multiple documents. Nowadays, keyword searches are often considered inferior to their successor, technology-assisted review (TAR), also known as predictive coding. In comparison to TAR, the “outdated” search method is more expensive and time-consuming. Keyword searches are also less predictable: when filtering through the same data set, they tend to yield fewer responsive results. Based on these flaws, some would argue that keyword searches are a dying technique. So, why bother talking about them at all?

Though keyword searches have their flaws, they are far from obsolete. Some legal teams still prefer keyword searching and manual review, recognizing them as tried-and-true methods. For example, the parties in Coventry Capital U.S., LLC v. EEA Life Settlements, Inc. attempted to use TAR in 2020 to resolve the fraud case, but the defendants argued the process had become “protracted and contentious.” Thus, Judge Sarah L. Cave declined to compel the use of TAR. [1] Similar outcomes occurred in cases such as Hyles v. New York City (2016) and In re Viagra (Sildenafil Citrate) Prods. Liab. Litig. (2016). In both cases, the court refused to mandate the use of TAR when the responding party demonstrated a clear preference for keyword searching. [2] With this knowledge in mind, it’s important to recognize that keyword searches are still effective when done right.

Five Tips for Effective Keyword Searches

  1. Good communication is crucial.

Consult your custodians before running your searches. Use those conversations to identify any specific terms or abbreviations that may be relevant to your review. You may also need to speak with an experienced advisor, who can assist you with the sampling and testing process. A good advisor can save time and money for everyone involved.

  2. Create and test your initial set of terms.

Everyone has to start somewhere. Your initial search terms don’t have to be perfect. While constructing your list, estimate how many results you expect each term to yield. Once you’ve run your test, evaluate how the search results compare to your expectations. If you received significantly fewer results than anticipated, adjust the search terms as needed. You may have to refine your search list multiple times. Anticipate this possibility to avoid missing any deadlines.  [3]
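To make this testing loop concrete, here is a minimal Python sketch: each term is paired with a rough estimate of expected hits, run against a document set, and flagged for refinement if it comes in well under expectations. The terms, expected counts, and documents are hypothetical examples, not taken from any real matter or platform.

```python
# Minimal sketch: compare actual keyword hit counts against rough expectations.
# The terms, expected counts, and documents below are hypothetical.

documents = [
    "Please email the updated contract to HR by Friday.",
    "The merger discussion is confidential until the press release.",
    "Forwarding the invoice emails for the Q3 audit.",
]

# Pair each term with a rough estimate of how many documents it should hit.
expected_hits = {"email": 2, "merger": 1, "invoice": 5}

for term, expected in expected_hits.items():
    actual = sum(term.lower() in doc.lower() for doc in documents)
    status = "OK" if actual >= expected else "REFINE"  # far fewer hits than expected?
    print(f"{term!r}: expected ~{expected}, actual {actual} -> {status}")
```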

  3. Limit searches that include wildcards and/or numbers.

When searching for words with slight differences, it’s often better to search for each variation rather than use wildcards. For example, set up individual searches for “email” and “emails” instead of using “email*” as a search term. Numbers can also be a problem if not handled carefully (e.g., searching for the number 10 will also return results for 100, 1,000, etc.). Place the number in quotes to avoid this issue.
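Here is a small, hypothetical Python illustration of the same idea: explicit variant searches instead of a wildcard, and exact (word-boundary) matching so a search for “10” does not also hit 100 or 1,000. The documents and terms are made up for illustration; actual review platforms have their own query syntax.

```python
import re

# Minimal sketch (hypothetical data): search explicit variants instead of a
# wildcard like "email*", and match numbers exactly so "10" doesn't hit "100".

documents = [
    "She sent 10 emails about the audit.",
    "Invoice 100 was attached to the email.",
    "The 1,000-page report arrived by courier.",
]

variants = ["email", "emails"]          # individual searches instead of "email*"
exact_number = re.compile(r"\b10\b")    # rough stand-in for searching "10" in quotes

for doc in documents:
    variant_hits = [v for v in variants if re.search(rf"\b{v}\b", doc, re.IGNORECASE)]
    number_hit = bool(exact_number.search(doc))
    print(f"{doc!r}: variants={variant_hits}, exact '10' match={number_hit}")
```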

  4. Count the characters.

Search terms with four or fewer characters are likely to yield false hits. Short words or abbreviations like HR or IT may be found inside longer, unrelated terms. Filtering out those false hits requires extra review time and money.

  5. Know how to search for names properly.

Avoid searching for custodian names; their names will most likely be attached to far more documents and hits than expected or desired. When searching for non-custodians, place “w/2” between the first and last name. Doing so will capture variations of the full name. Finally, consider searching for nicknames to capture even more results. Ask the client what nicknames each person goes by before finalizing your search term list. [4]
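For illustration, here is a rough Python approximation of a “first w/2 last” proximity search, assuming “w/2” means the two names appear within two words of each other. The names and sample text are hypothetical, and actual platform proximity operators may behave somewhat differently.

```python
import re

# Minimal sketch (hypothetical names/data): approximate a "first w/2 last"
# proximity search, i.e. the first and last name within two words of each other.

def within_two_words(first, last, text):
    # Allow up to two intervening words in either order, ignoring case.
    gap = r"(?:\W+\w+){0,2}\W+"
    pattern = rf"\b(?:{first}{gap}{last}|{last}{gap}{first})\b"
    return re.search(pattern, text, re.IGNORECASE) is not None

samples = [
    "Meeting with Jane Doe tomorrow.",        # exact full name
    "Doe, Jane will attend the deposition.",  # reversed order
    "Jane A. Doe signed the agreement.",      # middle initial between the names
    "Jane asked whether John Doe replied.",   # names more than two words apart
]

for text in samples:
    print(f"{text!r}: {within_two_words('Jane', 'Doe', text)}")
```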

 

[1] Doug Austin, “Court Rules for Defendant on TAR and (Mostly) Custodian Disputes: eDiscovery Case Law,” eDiscovery Today, January 12, 2021.

[2] “How Courts Treat ‘Technology Assisted Review’ in Discovery,” Rivkin Radler, March 13, 2019.

[3] “Improving the effectiveness of keyword search terms,” E-discovery Consulting, November 11, 2021.

[4] Kathryn Cole, “Key Word Searching – What Is It? And How Do I Do It (Well)?,” All About eDiscovery, December 9, 2016.

When Litigation Hits, The First 7 to 10 Days is Critical: eDiscovery Throwback Thursdays

Here’s our latest blog post in our Throwback Thursdays series, where we revisit some of the eDiscovery best practice posts we have covered over the years and discuss whether any of those recommended best practices have changed since we originally covered them.

This post was originally published on June 28, 2012, when eDiscovery Daily was less than two years old.  This post has already been revisited a couple of times since and has been referenced in a handful of webcasts as well.  It’s still good advice today.  Enjoy!

When a case is filed (or even before, if litigation is anticipated then), several activities must be completed within a short period of time (often as soon as the first seven to ten days after filing) to enable you to assess the scope of the case, where the key electronically stored information (ESI) is located and whether to proceed with the case or attempt to settle with opposing counsel.  Here are several of the key early activities that can assist in deciding whether to litigate or settle the case.

Activities:

  • Create List of Key Employees Most Likely to have Documents Relevant to the Litigation: To estimate the scope of the case, it’s important to begin to prepare the list of key employees that may have potentially responsive data. Information such as name, title, e-mail address, phone number, office location and where information for each is stored on the network is important to be able to proceed quickly when issuing hold notices and collecting their data.
  • Issue Litigation Hold Notice and Track Results: The duty to preserve begins when you anticipate litigation; however, if litigation could not be anticipated prior to the filing of the case, it is certainly clear once the case is filed that the duty to preserve has begun. Hold notices must be issued ASAP to all parties that may have potentially responsive data.  Once the hold is issued, you need to track and follow up to ensure compliance.  Here are a couple of recent posts regarding issuing hold notices and tracking responses.
  • Interview Key Employees: As quickly as possible, interview key employees to identify potential locations of responsive data in their possession as well as other individuals they can identify that may also have responsive data so that those individuals can receive the hold notice and be interviewed.
  • Interview Key Department Representatives: Certain departments, such as IT, Records or Human Resources, may have specific data responsive to the case. They may also have certain processes in place for regular destruction of “expired” data, so it’s important to interview them to identify potentially responsive sources of data and stop routine destruction of data subject to litigation hold.
  • Inventory Sources and Volume of Potentially Relevant Documents: Potentially responsive data can be located in a variety of sources, including: shared servers, e-mail servers, employee workstations, employee home computers, employee mobile devices (including bring your own device (BYOD) devices), portable storage media (including CDs, DVDs and portable hard drives), active paper files, archived paper files and third-party sources (consultants and contractors, including cloud storage providers). Hopefully, the organization already has created a data map before litigation to identify the location of sources of information to facilitate that process.  It’s important to get a high-level sense of the total population to begin to estimate the effort required for discovery.
  • Plan Data Collection Methodology: Determining how each source of data is to be collected also affects the cost of the litigation. Are you using internal resources, outside counsel or a litigation support vendor?  Will the data be collected via an automated collection system or manually?  Will employees “self-collect” any of their own data?  Answers to these questions will impact the scope and cost of not only the collection effort, but the entire discovery effort.

These activities can result in creating an inventory of potentially responsive information and help in estimating discovery costs (especially when compared to past cases at the same stage) that will help in determining whether to proceed to litigate the case or attempt to settle with the other side.

So, what do you think?  How quickly do you decide whether to litigate or settle?  Please share any comments you might have or if you’d like to know more about a particular topic.

Sponsor: This blog is sponsored by CloudNine, which is a data and legal discovery technology company with proven expertise in simplifying and automating the discovery of data for audits, investigations, and litigation. Used by legal and business customers worldwide including more than 50 of the top 250 Am Law firms and many of the world’s leading corporations, CloudNine’s eDiscovery automation software and services help customers gain insight and intelligence on electronic data.

Disclaimer: The views represented herein are exclusively the views of the author, and do not necessarily represent the views held by CloudNine. eDiscovery Daily is made available by CloudNine solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscovery Daily should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.

Here’s a Chance to Learn What You Need to Do When a Case is First Filed: eDiscovery Best Practices

The first days after a complaint is filed are critical to managing the eDiscovery requirements of the case efficiently and cost-effectively. With a scheduling order required within 120 days of the complaint and a Rule 26(f) “meet and confer” conference required at least 21 days before that, there’s a lot to do and a short time to do it. Where do you begin?
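Here is a quick, hypothetical date calculation based on the timeframes above (a scheduling order within 120 days of the complaint and the conference at least 21 days before it), which is where the “first 100 days” window mentioned below comes from. The filing date is made up for illustration.

```python
from datetime import date, timedelta

# Rough sketch of the timeline math described above, using a hypothetical filing date.
# Scheduling order due within 120 days of the complaint; the Rule 26(f) conference
# must occur at least 21 days before that, i.e. within roughly the first 99 days.

complaint_filed = date(2017, 9, 1)                       # hypothetical filing date
scheduling_order_due = complaint_filed + timedelta(days=120)
latest_meet_and_confer = scheduling_order_due - timedelta(days=21)

print("Scheduling order due by:", scheduling_order_due)
print("Latest Rule 26(f) conference:", latest_meet_and_confer)
print("Days from filing:", (latest_meet_and_confer - complaint_filed).days)  # 99
```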

On Wednesday, September 27 at noon CST (1:00pm EST, 10:00am PST), CloudNine will conduct the webcast Holy ****, The Case is Filed! What Do I Do Now? (yes, that’s the actual title). In this one-hour webcast, we’ll take a look at the various issues to consider and decisions to be made to help you “get your ducks in a row” and successfully prepare for the Rule 26(f) “meet and confer” conference within the first 100 days after the case is filed. Topics include:

  • What You Should Consider Doing before a Case is Even Filed
  • Scoping the Discovery Effort
  • Identifying Employees Likely to Have Potentially Responsive ESI
  • Mapping Data within the Organization
  • Timing and Execution of the Litigation Hold
  • Handling of Inaccessible Data
  • Guidelines for Interviewing Custodians
  • Managing ESI Collection and Chain of Custody
  • Search Considerations and Preparation
  • Handling and Clawback of Privileged and Confidential Materials
  • Determining Required Format(s) for Production
  • Timing of Discovery Deliverables and Phased Discovery
  • Identifying eDiscovery Liaison and 30(b)(6) Witnesses
  • Available Resources and Checklists

I’ll be presenting the webcast, along with Tom O’Connor, who is now a Special Consultant to CloudNine!  If you follow our blog, you’re undoubtedly familiar with Tom as a leading eDiscovery thought leader (who we’ve interviewed several times over the years) and I’m excited to have Tom as a participant in this webcast!  To register for it, click here.

So, what do you think?  When a case is filed, do you have your eDiscovery “ducks in a row”?  Please share any comments you might have or if you’d like to know more about a particular topic.

Disclaimer: The views represented herein are exclusively the views of the author, and do not necessarily represent the views held by CloudNine. eDiscovery Daily is made available by CloudNine solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscovery Daily should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.

At Litigation Time, the Cost of Data Storage May Not Be As Low As You Think: eDiscovery Best Practices

One of my favorite all-time graphics that we’ve posted on the blog (from one of our very first posts) is this ad from the early 1980s for a 10 MB disk drive – for $3,398!  That’s MB (megabytes), not GB (gigabytes) or TB (terabytes).  These days, the cost per GB of data storage is mere pennies, which is a big reason why the total amount of data being captured and stored by industry doubles every 1.2 years.  But, at litigation time, all that data can cost you – big.

When I checked on prices for external hard drives back in 2010 (not network drives, which are still more expensive), prices for a 2 TB external drive at Best Buy were as low as $140 (roughly 7 cents per GB).  Now, they’re as low as $81.99 (roughly 4.1 cents per GB).  And, these days, you can go bigger – a 5 TB drive for as low as $129.99 (roughly 2.6 cents per GB).  I promise that I don’t have a side job at Best Buy and am not trying to sell you hard drives (even from the back of a van).

No wonder organizations are storing more and more data and managing Big Data in organizations has become such a challenge!

Because organizations are storing so much data (and in more diverse places than ever before), information governance within those organizations has become vitally important in keeping that data as manageable as possible.  And, when litigation or regulatory requests hit, the ability to quickly search and cull potentially responsive data is more important than ever.

Back in 2010, I illustrated how each additional GB that has to be reviewed can cost as much as $16,650 (even with fairly inexpensive contract reviewers).  And, that doesn’t even take into consideration the costs to identify, preserve, collect, and produce each additional GB.  Of course, that was before Da Silva Moore and several other cases that ushered in the era of technology assisted review (even though it’s still used in only a minority of cases).  Regardless, that statistic illustrates how the cost of data storage may not be as low as you think at litigation time – each GB could cost hundreds or even thousands of dollars to manage (even in the era of eDiscovery automation and falling prices for eDiscovery software and services).
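For readers who like to see the math, here is one illustrative set of assumptions (not necessarily the figures from the original 2010 post) that lands in the neighborhood of that per-GB review cost:

```python
# One illustrative set of assumptions (not necessarily the original 2010 figures)
# showing how review costs can reach roughly $16,650 per additional GB.

docs_per_gb = 20_000          # assumed documents per GB
docs_reviewed_per_hour = 60   # assumed contract-reviewer pace
rate_per_hour = 50            # assumed contract-reviewer hourly rate ($)

review_hours_per_gb = docs_per_gb / docs_reviewed_per_hour
cost_per_gb = review_hours_per_gb * rate_per_hour

print(f"Review hours per GB: {review_hours_per_gb:.0f}")   # ~333 hours
print(f"Review cost per GB:  ${cost_per_gb:,.0f}")          # ~$16,667
```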

Converting the early-1980s price above to a per-GB figure, that works out to roughly $340,000 per GB!  But, if you go all the way back to 1950, the cost of a 5 MB drive from IBM was $50,000, which equates to about $10 million per GB!  Check out this interactive chart of hard drive prices from 1950-2010, courtesy of That Data Dude (yes, that really is the name of the site), where you can click on different years and see how the price per GB has dropped over the years.  It’s way cool!
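And here is the simple arithmetic behind the per-GB prices quoted above, using decimal units (1 TB = 1,000 GB; 1 GB = 1,000 MB):

```python
# Worked arithmetic behind the per-GB prices quoted above
# (decimal units: 1 TB = 1,000 GB, 1 GB = 1,000 MB).

def price_per_gb(price_dollars, capacity_gb):
    return price_dollars / capacity_gb

examples = {
    "1950 IBM 5 MB drive ($50,000)":     price_per_gb(50_000, 5 / 1000),
    "Early-1980s 10 MB drive ($3,398)":  price_per_gb(3_398, 10 / 1000),
    "2010: 2 TB external ($140)":        price_per_gb(140, 2_000),
    "2 TB external ($81.99)":            price_per_gb(81.99, 2_000),
    "5 TB external ($129.99)":           price_per_gb(129.99, 5_000),
}

for label, dollars in examples.items():
    print(f"{label}: ${dollars:,.2f} per GB")
```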

So, what do you think?  Do you track GB metrics for your cases?  Please share any comments you might have or if you’d like to know more about a particular topic.

Disclaimer: The views represented herein are exclusively the views of the author, and do not necessarily represent the views held by CloudNine. eDiscovery Daily is made available by CloudNine solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscovery Daily should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.

George Socha of Socha Consulting LLC: eDiscovery Trends

This is the second of the 2016 LegalTech New York (LTNY) Thought Leader Interview series.  eDiscovery Daily interviewed several thought leaders at LTNY this year to get their observations regarding trends at the show and generally within the eDiscovery industry.  Unlike previous years, some of the questions posed to each thought leader were tailored to their position in the industry, so we have dispensed with the standard questions we normally ask all thought leaders.

Today’s thought leader is George Socha.  A litigator for 16 years, George is President of Socha Consulting LLC, offering services as an electronic discovery expert witness, special master and advisor to corporations, law firms and their clients, and legal vertical market software and service providers in the areas of electronic discovery and automated litigation support. George has also been co-author of the leading survey on the electronic discovery market, The Socha-Gelbmann Electronic Discovery Survey; in 2011, he and Tom Gelbmann converted the Survey into Apersee, an online system for selecting eDiscovery providers and their offerings.  In 2005, he and Tom Gelbmann launched the Electronic Discovery Reference Model project to establish standards within the eDiscovery industry – today, the EDRM model has become a standard in the industry for the eDiscovery life cycle and there are nine active projects with over 300 members from 81 participating organizations.  George has a J.D. from Cornell Law School and a B.A. from the University of Wisconsin – Madison.

What are your general observations about LTNY this year and about emerging eDiscovery trends overall?

{Interviewed the first morning of LTNY, so the focus of the question to George was more about his expectations for the show and also about general industry trends}.

This is the largest legal technology trade show of the year so it’s going to be a “who’s who” of people in the hallways.  It will be an opportunity for service and software providers to roll out their new “fill in the blank”.  It will be great to catch up with folks that I only get to see once a year as well as folks that I get to see a lot more than that.  And, yet again, I don’t expect any dramatic revelations on the exhibit floor or in any of the sessions.

We continue to hear two recurring themes:  the market is consolidating and eDiscovery has become a commodity. I still don’t see either of these actually happening.  Consolidation would be if some providers were acquiring others and no new providers were coming along to fill in the gaps, or if a small number of providers was taking over a huge share of the market.  Instead, as quickly as one provider acquires another, two, three or more new providers pop up, often with new ideas they hope will gain traction.  In terms of dominating the market, there has been some consolidation on the software side, but as to services providers, the market continues to look more like law firms than like accounting firms.

In terms of commoditization, I think we still have a market where people want to pay “K-mart, off the rack” prices for “Bespoke” suits.  That reflects continued intense downward pressure on prices.  It does not suggest, however, that the e-discovery market has begun to approximate, for example, the markets for corn, oil or generic goods.  E-discovery services and software are not yet fungible – with little meaningful difference between them other than price.  I have heard no discussion of “e-discovery futures.”  And providers and consumers alike still seem to think that brand, levels of project management, and variations in depth and breadth of offerings matter considerably.

Given that analytics happens at various places throughout the eDiscovery life cycle, is it time to consider tweaking the EDRM model to reflect a broader scope of analysis?

The question always is, “what should the tweak look like?”  The questions I ask in return are “What’s there that should not be there?”, “What should be there that is not?” and “What should be re-arranged?”  One common category of suggested tweaks consists of those meant to change the EDRM model to look more like one particular person’s or organization’s workflow.  This keeps coming up even though the model was never meant to be a workflow – it is a conceptual framework to help break one unitary item into a set of more discrete components that you can examine in comparison to each other and also in isolation.

A second set of tweaks focuses on adding more boxes to the diagram.  Why, we get asked, don’t we have a box called Early Case Assessment, and another called Legal Hold, and another called Predictive Coding, and so on. With activities like analytics, you can take the entire EDRM diagram and drop it inside any one of those boxes or in that circle.  Those concepts already are present in the current diagram.  If, for example, you took the entire EDRM diagram and dropped it inside the Identification box, you could call that Early Case Assessment or Early Data Assessment.  There was discussion early on about whether there should be a box for “Search”, but Search is really an Analysis function – there’s a home for it there.

A third set of suggested tweaks centers on eliminating elements from the diagram.  Some have proposed that we combine the processing and review boxes into a single box – but the rationale they offer is that because they offer both those capabilities there no longer is a need to show separate boxes for the separate functions.

What are you working on that you’d like our readers to know about?

First, we would like to invite current and prospective members to join us on April 18 for our Spring meeting which will be at the ACEDS conference this year.  The conference is from April 18 through April 20, with the educational portion of the conference slated for the 19th and 20th.

For several years at the conference, ACEDS has given out awards honoring eDiscovery professionals.  To congratulate this year’s winners we will be giving them one-year individual EDRM memberships.

On the project side, one of the undertakings we are working on is “SEAT-1,” a follow up to our eMSAT-1 (eDiscovery Maturity Self-Assessment Test).  SEAT-1 will be a self-assessment test specifically for litigation groups and law firms.  The test is intended to enable them to better assess where they are at, how they are doing and where they want to be.  We are also working on different ways to deliver our budget calculators.  It’s too early to provide details on that, but we’re hoping to be able to provide more information soon.

Finally, in the past year we have begun to develop and deliver member-only resources. We published a data set for members only, and we added a new section to the EDRM site with information about the changes to the Federal rules, including a comprehensive collection of materials about those changes.  This year, we will be working on additional resources to be available to just our members.

Thanks, George, for participating in the interview!

And to the readers, as always, please share any comments you might have or if you’d like to know more about a particular topic!

Disclaimer: The views represented herein are exclusively the views of the author, and do not necessarily represent the views held by CloudNine. eDiscovery Daily is made available by CloudNine solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscoveryDaily should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.

For a Successful Outcome to Your Discovery Project, Work Backwards: eDiscovery Best Practices

Based on a recent experience with a client, it seemed appropriate to revisit this topic. Plus, it’s always fun to play with the EDRM model. Notice anything different? 🙂

While the Electronic Discovery Reference Model from EDRM has become the standard model for the workflow of the process for handling electronically stored information (ESI) in discovery, it might be helpful to think about the EDRM model and work backwards, whether you’re the producing party or the receiving party.

Why work backwards?

You can’t have a successful outcome without first envisioning the outcome you want to achieve. The end of the discovery process includes the production and presentation stages, so it’s important to determine what you want to get out of those stages. Let’s look at them.

Presentation

Whether you’re a receiving party or a producing party, it’s important to think about what types of evidence you need to support your case when presenting at depositions and at trial – this is the type of information that needs to be included in your production requests at the beginning of the case as well as the type of information that you’ll need to preserve as a producing party.

Production

The format of the ESI produced is important to both sides in the case. For the receiving party, it’s important to get as much useful information included in the production as possible. This includes metadata and searchable text for the produced documents, typically with an index or load file to facilitate loading into a review application. The most useful form of production is native format files with all metadata preserved as used in the normal course of business.

For the producing party, it’s important to be efficient and minimize costs, so it’s important to agree to a production format that minimizes production costs. Converting files to an image based format (such as TIFF) adds costs, so producing in native format can be cost effective for the producing party as well. It’s also important to determine how to handle issues such as privilege logs and redaction of privileged or confidential information.

Addressing production format issues up front will maximize cost savings and enable each party to get what they want out of the production of ESI. If you don’t, you could be arguing in court like our case participants from yesterday’s post.

Processing-Review-Analysis

It also pays to make decisions early in the process that affect processing, review and analysis. How should exception files be handled? What do you do about files that are infected with malware? These are examples of issues that need to be decided up front to determine how processing will be handled.

As for review, the review tool being used may impact how quick and easy it is to get started, to load data and to use the tool, among other considerations. If it’s Friday at 5 and you have to review data over the weekend, is it easy to get started? As for analysis, surely you test search terms to determine their effectiveness before you agree on those terms with opposing counsel, right?

Preservation-Collection-Identification

Long before you have to conduct preservation and collection for a case, you need to establish procedures for implementing and monitoring litigation holds, as well as prepare a data map to identify where corporate information is stored for identification, preservation and collection purposes.

And, before a case even begins, you need an effective Information Governance program to minimize the amount of data that you might have to consider for responsiveness in the first place.

As you can see, at the beginning of a case (and even before), it’s important to think backwards within the EDRM model to ensure a successful discovery process. Decisions made at the beginning of the case affect the success of those latter stages, so working backwards can help ensure a successful outcome!

So, what do you think? What do you do at the beginning of a case to ensure success at the end?   Please share any comments you might have or if you’d like to know more about a particular topic.

Disclaimer: The views represented herein are exclusively the views of the author, and do not necessarily represent the views held by CloudNine. eDiscovery Daily is made available by CloudNine solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscovery Daily should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.

The First 7 to 10 Days May Make or Break Your Case: eDiscovery Best Practices

Having worked with a client recently that was looking for some guidance at the outset of their case, it seemed appropriate to revisit this topic here.

When a case is filed, several activities must be completed within a short period of time (often as soon as the first seven to ten days after filing) to enable you to assess the scope of the case, where the key electronically stored information (ESI) is located and whether to proceed with the case or attempt to settle with opposing counsel. Here are several of the key early activities that can assist in deciding whether to litigate or settle the case.

Activities:

  • Create List of Key Employees Most Likely to have Documents Relevant to the Litigation: To estimate the scope of the case, it’s important to begin to prepare the list of key employees that may have potentially responsive data. Information such as name, title, eMail address, phone number, office location and where information for each is stored on the network is important to be able to proceed quickly when issuing hold notices and collecting their data. Some of these employees may no longer be with your organization, so you may have to determine whether their data is still available and where.
  • Issue Litigation Hold Notice and Track Results: The duty to preserve begins when you anticipate litigation; however, if litigation could not be anticipated prior to the filing of the case, it is certainly clear once the case is filed that the duty to preserve has begun. Hold notices must be issued ASAP to all parties that may have potentially responsive data. Once the hold is issued, you need to track and follow up to ensure compliance. Here are a couple of posts from 2012 regarding issuing hold notices and tracking responses.
  • Interview Key Employees: As quickly as possible, interview key employees to identify potential locations of responsive data in their possession as well as other individuals they can identify that may also have responsive data so that those individuals can receive the hold notice and be interviewed.
  • Interview Key Department Representatives: Certain departments, such as IT, Records or Human Resources, may have specific data responsive to the case. They may also have certain processes in place for regular destruction of “expired” data, so it’s important to interview them to identify potentially responsive sources of data and stop routine destruction of data subject to litigation hold.
  • Inventory Sources and Volume of Potentially Relevant Documents: Potentially responsive data can be located in a variety of sources, including: shared servers, eMail servers, employee workstations, employee home computers, employee mobile devices, portable storage media (including CDs, DVDs and portable hard drives), active paper files, archived paper files and third-party sources (consultants and contractors, including cloud storage providers). Hopefully, the organization already has created a data map before litigation to identify the location of sources of information to facilitate that process. It’s important to get a high level sense of the total population to begin to estimate the effort required for discovery.
  • Plan Data Collection Methodology: Determining how each source of data is to be collected also affects the cost of the litigation. Are you using internal resources, outside counsel or a litigation support vendor? Will the data be collected via an automated collection system or manually? Will employees “self-collect” any of their own data? If so, important data may be missed. Answers to these questions will impact the scope and cost of not only the collection effort, but the entire discovery effort.

These activities can result in creating a data map of potentially responsive information and a “probable cost of discovery” spreadsheet (based on initial estimated scope compared to past cases at the same stage) that will help in determining whether to proceed to litigate the case or attempt to settle with the other side.

So, what do you think? How quickly do you decide whether to litigate or settle? Please share any comments you might have or if you’d like to know more about a particular topic.

Disclaimer: The views represented herein are exclusively the views of the author, and do not necessarily represent the views held by CloudNine. eDiscoveryDaily is made available by CloudNine solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscoveryDaily should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.

Dealing with the Departed – eDiscovery Best Practices

 

Having addressed this recently with a client, I thought this was a good topic to revisit here on the blog…

When litigation hits, key activities to get a jump on the case include creating a list of key employees most likely to have documents relevant to the litigation and interviewing those key employees, as well as key department representatives, such as IT for information about retention and destruction policies.  These steps are especially important as they may shed light on custodians you might not think about – the departed.

When key employees depart an organization, it’s important for that organization to have a policy in place to preserve their data for a period of time to ensure that any data in their possession that might be critical to company operations is still available if needed.  Preserving that data may occur in a number of ways, including:

  • Saving the employee’s hard drive, either by keeping the drive itself or by backing it up to some other media before wiping it for re-use;
  • Keeping any data in their network store (i.e., folder on the network dedicated to the employee’s files) by backing up that folder or even (in some cases) simply leaving it there for access if needed;
  • Storage and/or archival of eMail from the eMail system;
  • Retention of any portable media in the employee’s possession (including DVDs, portable hard drives, smart phones, etc.).

As part of the early fact finding, it’s essential to determine the organization’s retention policy (and practices, especially if there’s no formal policy) for retaining data (such as the examples listed above) of departed employees.  You need to find out if the organization keeps that data, where they keep it, in what format, and for how long.

When interviewing key employees, one of the typical questions to ask is “Do you know of any other employees that may have responsive data to this litigation?”  The first several interviews with employees often identify other employees that need to be interviewed, so the interview list will often grow to locate potentially responsive electronically stored information (ESI).  It’s important to broaden that question to include employees that are no longer with the organization to identify any that also may have had responsive data and try to gather as much information about each departed employee as possible, including the department in which they worked, who their immediate supervisor was and how long they worked at the company.  Often, this information may need to be gathered from Human Resources.

Once you’ve determined which departed employees might have had responsive data and whether the organization may still be retaining any of that data, you can work with IT or whoever has possession of that data to preserve and collect it for litigation purposes.  Just because they’re departed doesn’t mean they’re not important.

So, what do you think?  Does your approach for identifying and collecting from custodians include departed custodians?  Please share any comments you might have or if you’d like to know more about a particular topic.

Image © Warner Bros.

Disclaimer: The views represented herein are exclusively the views of the author, and do not necessarily represent the views held by CloudNine Discovery. eDiscoveryDaily is made available by CloudNine Discovery solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscoveryDaily should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.

How Mature is Your Organization in Handling eDiscovery? – eDiscovery Best Practices

A new self-assessment resource from EDRM helps you answer that question.

A few days ago, EDRM announced the release of the EDRM eDiscovery Maturity Self-Assessment Test (eMSAT-1), the “first self-assessment resource to help organizations measure their eDiscovery maturity” (according to their press release linked here).

As stated in the press release, eMSAT-1 is a downloadable Excel workbook containing 25 worksheets (actually 27 worksheets when you count the Summary sheet and the List sheet of valid choices at the end) organized into seven sections covering various aspects of the e-discovery process. Complete the worksheets and the assessment results are displayed in summary form at the beginning of the spreadsheet.  eMSAT-1 is the first of several resources and tools being developed by the EDRM Metrics group, led by Clark and Dera Nevin, with assistance from a diverse collection of industry professionals, as part of an ambitious Maturity Model project.

The seven sections covered by the workbook are:

  1. General Information Governance: Contains ten questions to answer regarding your organization’s handling of information governance.
  2. Data Identification, Preservation & Collection: Contains five questions to answer regarding your organization’s handling of these “left side” phases.
  3. Data Processing & Hosting: Contains three questions to answer regarding your organization’s handling of processing, early data assessment and hosting.
  4. Data Review & Analysis: Contains two questions to answer regarding your organization’s handling of search and review.
  5. Data Production: Contains two questions to answer regarding your organization’s handling of production and protecting privileged information.
  6. Personnel & Support: Contains two questions to answer regarding your organization’s hiring, training and procurement processes.
  7. Project Conclusion: Contains one question to answer regarding your organization’s processes for managing data once a matter has concluded.

Each question is a separate sheet, with five answers ranked from 1 to 5 to reflect your organization’s maturity in that area (with descriptions associated with each level of maturity).  Each question defaults to a value of 1.  The five answers are:

  • 1: No Process, Reactive
  • 2: Fragmented Process
  • 3: Standardized Process, Not Enforced
  • 4: Standardized Process, Enforced
  • 5: Actively Managed Process, Proactive

Once you answer all the questions, the Summary sheet shows your overall average, as well as your average for each section.  It’s an easy workbook to use, with input areas defined by cells in yellow.  The whole workbook is editable, so perhaps the next edition could lock down the calculated cells.  Nonetheless, the workbook is intuitive and provides a nice exercise for an organization to grade its level of eDiscovery maturity.
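For those who like to see the math spelled out, here is a minimal Python sketch of that summary calculation, with hypothetical answers filled in for the 25 questions. The section names follow the workbook, but the exact formulas in the actual eMSAT-1 spreadsheet may differ (for example, in how the overall score is weighted).

```python
# Minimal sketch of the eMSAT-1 summary math: average the 1-5 maturity answers
# per section and overall. Section names follow the workbook; the answer values
# below are hypothetical.

responses = {
    "General Information Governance":                 [2, 3, 1, 2, 4, 3, 2, 1, 3, 2],  # 10 questions
    "Data Identification, Preservation & Collection": [3, 2, 4, 3, 2],                 # 5 questions
    "Data Processing & Hosting":                      [2, 3, 3],
    "Data Review & Analysis":                         [4, 3],
    "Data Production":                                [3, 2],
    "Personnel & Support":                            [2, 2],
    "Project Conclusion":                             [1],
}

section_averages = {name: sum(scores) / len(scores) for name, scores in responses.items()}
overall = sum(v for scores in responses.values() for v in scores) / sum(
    len(scores) for scores in responses.values()
)

for name, avg in section_averages.items():
    print(f"{name}: {avg:.2f}")
print(f"Overall maturity: {overall:.2f}")
```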

You can download a copy of the eMSAT-1 Excel workbook from here, as well as get more information on how to use it (the page also describes how to provide feedback to make the next iterations even better).

The EDRM Maturity Model Self-Assessment Test is the fourth release in recent months by the EDRM Metrics team. In June 2013, the new Metrics Model was released; in November 2013, a supporting glossary of terms for the Metrics Model was published; and in November 2013, the EDRM Budget Calculators project kicked off (with four calculators covered by us here, here, here and here).  They’ve been busy.

So, what do you think?  How mature is your organization in handling eDiscovery?  Please share any comments you might have or if you’d like to know more about a particular topic.

Disclaimer: The views represented herein are exclusively the views of the author, and do not necessarily represent the views held by CloudNine Discovery. eDiscoveryDaily is made available by CloudNine Discovery solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscoveryDaily should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.