Information Management

Four Tips for Successful Meet and Confers

When approaching any challenge or goal, it’s often best to start with the big picture before narrowing things down. By working backwards, you can identify the steps needed to achieve the desired result. This type of thinking can be applied to Rule 26(f) conferences (also known as meet and confers). Under Rule 26(f) of the FRCP, the parties must confer at least 21 days before a scheduling conference is held or a scheduling order is due. The purpose of the meet and confer is to discuss litigation details such as data preservation, privilege issues, the form of production, and expenses. To get the ball rolling, counsel can prepare a list of general questions: What data types need to be collected? How should the scope of discovery be defined? What pace is needed to meet court-established deadlines? General questions like these build a solid foundation for deeper inquiries and concerns. [1]
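The 21-day window lends itself to a quick back-of-the-envelope check. As a minimal illustration (not legal advice; the function name and example dates are hypothetical), the latest permissible conference date could be computed as:

```python
from datetime import date, timedelta

def latest_meet_and_confer(scheduling_conference: date, days: int = 21) -> date:
    """Latest date the parties may confer: at least `days` days
    before the scheduling conference (per FRCP 26(f))."""
    return scheduling_conference - timedelta(days=days)

# If the scheduling conference is set for March 1, 2012:
print(latest_meet_and_confer(date(2012, 3, 1)))  # → 2012-02-09
```

In practice, counsel should build in additional slack: the rule also requires conferring “as soon as practicable,” and, as discussed below, multiple sessions are often needed.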

More Tips for Meet and Confers

  1. Initiate the conference early.

The meet and confer process is not something that can or should be rushed. Negotiation takes time, patience, and multiple attempts. Waiting until the last minute benefits no one. Instead of frantically rushing to meet deadlines, schedule the meet and confer as soon as possible. Sometimes, counsel is hesitant to meet early because they feel that they don’t have enough information and prep time. Thus, in addition to meeting early, parties should also meet often. Multiple conferences allow the parties to fully understand and iron out the details.

  2. Identify and evaluate the accessibility of relevant data types.

Companies interact with a variety of data types on a daily basis – email, Facebook, Zoom, the list goes on. Producing each one would be burdensome, expensive, and unnecessary. Only focus on relevant data types that are proportional to the needs of the case. Companies also regularly create and destroy large volumes of information. Therefore, you must assess their data retention policies to determine what information is stored and where. Once that’s settled, consider whether the data types are too expensive or inaccessible for production.

  3. Walk in with the right mindset.

Compromise is impossible to reach without flexibility from both parties. At the same time, neither party should feel obligated to concede to all proposals. Meet and confers should be thought of as open dialogues. Discuss, debate, and engage in respectful arguments when necessary. Above all, cooperate by ensuring your suggestions are reasonable and proportional. [2]  If this aspect is a concern, consider hiring a discovery expert. Through their industry knowledge, experts can assess the opposing party’s discovery systems and requests.

  4. Understand your client’s data policies and systems.

Before heading into the meet and confer, try to gather as much information as possible. Ask your client if they have any formal information governance policies. If not, probe further to identify how and where their data is stored. It’s also important to identify the person or department in charge of storing said data. The client’s IT environment must be understood as well. Inquire about the quantity and locations of company computers. Additionally, request information about the company’s software programs, backup schedules, data custodians, etc. [1]

 

[1] Ronald I. Raether Jr., “Preparing for the Rule 26(f) Scheduling Conference and Other Practical Advice in the Wake of the Recent Amendments to the Rules Governing E-Discovery,” The Federal Lawyer, August 2007.

[2] Scott Devens, “Defensible Strategies for the ‘Meet and Confer,’” Bloomberg Law, Oct. 18, 2011.

eDiscovery Trends: Tom Gelbmann of Gelbmann & Associates, LLC

 

This is the fourth of the 2012 LegalTech New York (LTNY) Thought Leader Interview series.  eDiscoveryDaily interviewed several thought leaders at LTNY this year and generally asked each of them the following questions:

  1. What do you consider to be the emerging trends in eDiscovery that will have the greatest impact in 2012?
  2. Which trend(s), if any, haven’t emerged to this point like you thought they would?
  3. What are your general observations about LTNY this year and how it fits into emerging trends?
  4. What are you working on that you’d like our readers to know about?

Today’s thought leader is Tom Gelbmann. Tom is Principal of Gelbmann & Associates, LLC.  Since 1993, Gelbmann & Associates, LLC has advised law firms and corporate law departments on realizing the full benefit of their investments in information technology.  Tom has also been co-author of the leading survey on the electronic discovery market, The Socha-Gelbmann Electronic Discovery Survey; last year he and George Socha converted the Survey into Apersee, an online system for selecting eDiscovery providers and their offerings.  In 2005, he and George Socha launched the Electronic Discovery Reference Model project to establish standards within the eDiscovery industry – today, the EDRM model has become a standard in the industry for the eDiscovery life cycle and there are nine active projects with over 300 members from 81 participating organizations.

What do you consider to be the emerging trends in eDiscovery that will have the greatest impact in 2012?  And which trend(s), if any, haven’t emerged to this point like you thought they would?

I’m seeing an interesting trend regarding offerings from traditional top tier eDiscovery providers. Organizations that have invested in eDiscovery related technologies are beginning to realize these same technologies can be applied to information governance and compliance, enabling an organization to get a much greater grasp on its total content.  Greater understanding of the location and profile of content helps not only with eDiscovery and compliance, but also with business intelligence and finally – destruction – something few organizations are willing to address.

We have often heard that storage is cheap. The full sentence should be: storage is cheap, but management is expensive.  I think that a lot of the tools that have been applied to collection, culling, search and analysis enable organizations to look at large quantities of information that is needlessly retained. They also allow organizations to examine that information and gain insight into their processes – how that information is either helping or, more importantly, hindering those processes – and I think that’s what will help sell these tools upstream rather than downstream.

As far as items that haven't quite taken off, I think that technology assisted coding – I prefer that term over “predictive coding” – is coming, but it's not there yet.  It’s going to take a little bit more, not necessarily waiting for the judiciary to help, but just for organizations to have good experiences that they could talk about that demonstrate the value.  You're not going to remove the human from the process.  But, it's giving the human a better tool.  It’s like John Henry, with the ax versus the steam engine.  You can cut a lot more wood with the steam engine, but you still need the human.

What are your general observations about LTNY this year and how it fits into emerging trends?

Based on the sessions that I've attended, I think there's much more education.  There's just really more practical information for people to take away on how to manage eDiscovery and deal with eDiscovery related products or problems, whether it's cross-border issues, how to deal with the volumes, how to bring processes in house or work effectively with vendors.  There's a lot more practical “how-tos” than I've seen in the past.

What are you working on that you’d like our readers to know about?

Well, I think one of the things I'm very proud of with EDRM is that just before LegalTech, we put out a press release of what's happening with the projects, and I'm very pleased that five of the nine EDRM projects had significant announcements.  You can go to EDRM.net for that press release that details those accomplishments, but it shows that EDRM is very vibrant, and the teams are actually making good progress. 

Secondly, George Socha and I are very proud about the progress of Apersee, which was announced last year at LegalTech.  We've learned a lot, and we've listened to our clientele in the market – consumers and providers.  We listened, and then our customers changed their mind.  But, as a result, it's on a stronger track and we're very proud to announce that we have two gold sponsors, AccessData and Nuix.  We’re also talking to additional potential sponsors, and I think we'll have those announcements very shortly.

Thanks, Tom, for participating in the interview!

And to the readers, as always, please share any comments you might have or if you’d like to know more about a particular topic!

eDiscovery Trends: Jim McGann of Index Engines

 

This is the third of the 2012 LegalTech New York (LTNY) Thought Leader Interview series.  eDiscoveryDaily interviewed several thought leaders at LTNY this year and generally asked each of them the following questions:

  1. What do you consider to be the emerging trends in eDiscovery that will have the greatest impact in 2012?
  2. Which trend(s), if any, haven’t emerged to this point like you thought they would?
  3. What are your general observations about LTNY this year and how it fits into emerging trends?
  4. What are you working on that you’d like our readers to know about?

Today’s thought leader is Jim McGann.  Jim is Vice President of Information Discovery at Index Engines.  Jim has extensive experience with eDiscovery and Information Management in the Fortune 2000 sector. He has worked for leading software firms, including Information Builders and the French-based engineering software provider Dassault Systemes.  In recent years he has worked for technology-based start-ups that provide financial services and information management solutions.

What do you consider to be the emerging trends in eDiscovery that will have the greatest impact in 2012?  And which trend(s), if any, haven’t emerged to this point like you thought they would?

I think what we're seeing is a lot of people becoming a bit more proactive.  I may combine your questions together because I'm surprised that people haven’t become proactive sooner.  LegalTech has included a focus on litigation readiness for how long? Ten years or so?  And we're still dealing with how to react to litigation, and you're still seeing fire drills occur.  There’s still not enough setting up of environments in the corporate world and in the legal world that would enable customers to respond more quickly.  It surprises me how little has been developed in this regard.

I think the reason for the slow start is that there are a lot of regulations that have been evolving and people haven't really understood what they need to prepare and how to react.  There’s been ten years of LegalTech and we're still struggling with how to respond to basic litigation requests because the volume has grown, accessibility arguments have changed, Federal rules have been solidified, and so forth.

What we're seeing when we go and talk to customers (and we talk to a lot of end-user customers that are facing litigation) is IT on one end of the table saying, ‘we need to solve this for the long term’, and litigation support teams on the other end of the table saying, ‘I need this today, I’ve been requesting data since July, and I still haven't received it and it's now January’.  That's not good.

The evolution is from what we call “litigation support”, which is more on the reactive side, to proactive litigation readiness: expecting to be able to push a button and put a hold on John Doe's mailbox, or to find specific content that’s required at a moment's notice.

So, I think the trend is litigation readiness.  Are people really starting to prepare for it?  In every meeting that we go into, we see IT organizations, including the compliance and security groups, rolling up their sleeves and saying, ‘I need to solve this for my company long term, but we have this litigation’.  It's a mixed environment.  In the past, we would go meet with litigation support teams, and IT wasn't involved.  You're seeing buzz words like Information Governance.  You're seeing big players like IBM, EMC and Symantec jumping deep into it.

What's strange is that IT organizations are getting involved in formalizing a process that hasn't been formalized in the past.  It's been very much, maybe not “ad hoc”, but IT organizations did what they could to meet project needs.  Now IT is looking at solving the problem long term, and there’s a struggle.  Attorneys are not the best long term planners – they're doing what they need to do.  They've got 60 days to do discovery, and IT is thinking five years.  We need to balance this out.

What are your general observations about LTNY this year and how it fits into emerging trends?

We're talking to a lot of people that are looking at next generation solutions.  The problems have changed, so solutions are evolving to address how you solve those problems.

There's also been a lot of consolidation in the eDiscovery space as well, so people are saying that their relationship has changed with their other vendors.  There have been a lot of those conversations.

I'm not sure what the attendance is at this year’s show, but attendees seem to be serious about looking for new solutions.  Maybe because the economy was so bad over the past year or maybe because it's a new budget year and budgets are freeing up, but people are looking at making changes, looking at new solutions.  We see that a lot with service providers, as well as law firms and other end users.

What are you working on that you’d like our readers to know about?

We’ve announced the release of Octane Version 4.3, which preserves files and emails at a bit level from MS Exchange and IBM Lotus Notes, as well as indexing forensic images and evidence files at speeds reaching 1TB per hour using a single node.  Bit-for-bit email processing and forensic image indexing speeds are unprecedented breakthroughs in the industry.  Bit-level indexing is not only faster but also more reliable because email is stored in its original format with no need for conversion.  Index Engines can also now index terabytes of network data, including forensic images, in hours rather than the weeks required by traditional tools.  So, we’re excited about the new version of Octane.
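The quoted 1TB-per-hour figure makes a rough time estimate straightforward. The sketch below is our own illustration of the arithmetic, assuming the vendor’s claimed throughput and linear scaling across nodes (both are assumptions here, not tested benchmarks):

```python
def indexing_hours(total_tb: float, tb_per_hour_per_node: float = 1.0, nodes: int = 1) -> float:
    """Rough wall-clock estimate for indexing `total_tb` terabytes,
    assuming throughput scales linearly with node count."""
    return total_tb / (tb_per_hour_per_node * nodes)

print(indexing_hours(10))           # → 10.0 (hours on a single node)
print(indexing_hours(10, nodes=4))  # → 2.5 (hours if four nodes scale linearly)
```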

We’ve also just announced a partnership with Merrill Corporation, to provide our technology to collect and process ESI from networks, desktops, forensic images and legacy backup tapes, for both reactive litigation and proactive litigation readiness.  Merrill has recognized the shift in reactive to proactive litigation readiness that I mentioned earlier and we are excited to be aligned with Merrill in meeting the demands of their customers in this regard.

Thanks, Jim, for participating in the interview!

And to the readers, as always, please share any comments you might have or if you’d like to know more about a particular topic!

eDiscovery Trends: Christine Musil of Informative Graphics Corporation (IGC)

 

This is the second of the 2012 LegalTech New York (LTNY) Thought Leader Interview series.  eDiscoveryDaily interviewed several thought leaders at LTNY this year and generally asked each of them the following questions:

  1. What do you consider to be the emerging trends in eDiscovery that will have the greatest impact in 2012?
  2. Which trend(s), if any, haven’t emerged to this point like you thought they would?
  3. What are your general observations about LTNY this year and how it fits into emerging trends? (Note: Christine was interviewed the night before the show, so there were obviously no observations at that point)
  4. What are you working on that you’d like our readers to know about?

Today’s thought leader is Christine Musil.  Christine has a diverse career in engineering and marketing spanning 18 years. Christine has been with IGC since March 1996, when she started as a technical writer and a quality assurance engineer. After moving to marketing in 2001, she has applied her in-depth knowledge of IGC's products and benefits to marketing initiatives, including branding, overall messaging, and public relations. She has also been a contributing author to a number of publications on archiving formats, redaction, and viewing technology in the enterprise.

What do you consider to be the emerging trends in eDiscovery that will have the greatest impact in 2012?  And which trend(s), if any, haven’t emerged to this point like you thought they would?

That's a hard question.  Especially for us because we're somewhat tangential to the market, and not as deeply enmeshed in the market as a lot of the other vendors are.  I think the number of acquisitions in the industry was what we expected, though maybe the M&A players themselves were surprising.  For example, I didn't personally see the recent ADI acquisition (Applied Discovery acquired by Siris Capital) coming.  And while we weren’t surprised that Clearwell was acquired, we thought that their being acquired by Symantec was an interesting move.

So, we expect the consolidation to continue.  We watched the major content management players like EMC and OpenText to see if they would acquire additional, targeted eDiscovery providers to round out some of their solutions, but through 2011 they didn’t seem to have decided whether they're “all in” despite some previous acquisitions in the space.  We had wondered whether some of them had decided maybe they're out again, though EMC is here in force for Kazeon this year.  So, I think that’s some of what surprised me about the market.

Other trends that I see are potentially more changes in the FRCP (Federal Rules of Civil Procedure) and probably a continued push towards project-based pricing.  We have certainly felt the pressure to do more project-based pricing, so we're watching that. Escalating data volumes have caused cost increases and, obviously, something's going to have to give there.  That's where I think we’re going to see more regulations come out through new FRCP rules to provide more proportionality to the discovery process, or clients will simply dictate more pricing alternatives.

What are you working on that you’d like our readers to know about?

We just announced a new release of our Brava!® product, version 7.1, at the show.  The biggest additions to Brava are in the Enterprise version, and we’re debuting the new Brava Changemark® Viewer for smartphones as well as an upcoming Brava HTML client for tablets.  iPads have been a bigger game changer than I think a lot of people even anticipated.  So, we’re excited about it. Also new with Brava 7.1 is video collaboration and improved enterprise readiness and performance for very large deployments.

We also just announced the results of our Redaction Survey, which we conducted to gauge user adoption of electronic redaction software. Nearly 65% of the survey respondents were from law firms, so that was a key indicator of the importance of redaction within the legal community.  Of the respondents, 25% indicated that they are still doing redaction manually, with markers or redaction tape, 32% are redacting electronically, and nearly 38% are using a combined approach with paper-based and software-driven redaction.  Of those that redact electronically, the reasons they prefer electronic redaction included the professional look of the redactions, time savings, efficiency and the “environmental friendliness” of doing it electronically.

For us, it's exciting moving into those areas and our partnerships continue to be exciting, as well.  We have partnerships with LexisNexis and Clearwell, both of which are unaffected by the recent acquisitions.  So, that's what's new at IGC.

Thanks, Christine, for participating in the interview!

And to the readers, as always, please share any comments you might have or if you’d like to know more about a particular topic!

eDiscovery Trends: George Socha of Socha Consulting

 

This is the first of the 2012 LegalTech New York (LTNY) Thought Leader Interview series.  eDiscoveryDaily interviewed several thought leaders at LTNY this year and generally asked each of them the following questions:

  1. What do you consider to be the emerging trends in eDiscovery that will have the greatest impact in 2012?
  2. Which trend(s), if any, haven’t emerged to this point like you thought they would?
  3. What are your general observations about LTNY this year and how it fits into emerging trends?
  4. What are you working on that you’d like our readers to know about?

Today’s thought leader is George Socha.  A litigator for 16 years, George is President of Socha Consulting LLC, offering services as an electronic discovery expert witness, special master and advisor to corporations, law firms and their clients, and legal vertical market software and service providers in the areas of electronic discovery and automated litigation support. George has also been co-author of the leading survey on the electronic discovery market, The Socha-Gelbmann Electronic Discovery Survey; last year he and Tom Gelbmann converted the Survey into Apersee, an online system for selecting eDiscovery providers and their offerings.  In 2005, he and Tom Gelbmann launched the Electronic Discovery Reference Model project to establish standards within the eDiscovery industry – today, the EDRM model has become a standard in the industry for the eDiscovery life cycle and there are nine active projects with over 300 members from 81 participating organizations.  George has a J.D. from Cornell Law School and a B.A. from the University of Wisconsin – Madison.

What do you consider to be the emerging trends in eDiscovery that will have the greatest impact in 2012?

I may have said this last year too, but it holds true even more this year – if there's an emerging trend, it's the trend of people talking about the emerging trend.  It started last year and this year every person in the industry seems to be delivering the emerging trend.  Not to be too crass about it, but often the message is, "Buy our stuff", a message that is not especially helpful.

Regarding actual emerging trends, each year we all try to sum up legal tech in two or three words.  The two words for this year can be “predictive coding.”  Use whatever name you want, but that's what everyone seems to be hawking and talking about at LegalTech this year.  This does not necessarily mean they really can deliver.  It doesn't mean they know what “predictive coding” is.  And it doesn't mean they've figured out what to do with “predictive coding.”  Having said that, expanding the use of machine assisted review capabilities as part of the e-discovery process is an important step forward.  It also has been a while coming.  The earliest I can remember working with a client, doing what's now being called predictive coding, was in 2003.  A key difference is that at that time they had to create their own tools.  There wasn't really anything they could buy to help them with the process.

Which trend(s), if any, haven’t emerged to this point like you thought they would?

One thing I don't yet hear is discussion about using predictive coding capabilities as a tool to assist with determining what data to preserve in the first place.  Right now the focus is almost exclusively on what do you do once you’ve “teed up” data for review, and then how to use predictive coding to try to help with the review process.

Think about taking the predictive coding capabilities and using them early on to make defensible decisions about what to and what not to preserve and collect.  Then consider continuing to use those capabilities throughout the e-discovery process.  Finally, look into using those capabilities to more effectively analyze the data you're seeing, not just to determine relevance or privilege, but also to help you figure out how to handle the matter and what to do on a substantive level.

What are your general observations about LTNY this year and how it fits into emerging trends?

Well, LegalTech continues to be taken over by electronic discovery.  As a result, we tend to overlook whole worlds of technologies that can be used to support and enhance the practice of law. It is unfortunate that in our hyper-focus on e-discovery, we risk losing track of those other capabilities.

What are you working on that you’d like our readers to know about?

With regard to EDRM, we recently announced that we have hit key milestones in five projects.  Our EDRM Enron Email Data Set has now officially become an Amazon public dataset, which I think will mean wider use of the materials.

We announced the publication of our Model Code of Conduct, which was five years in the making.  We have four signatories so far, and are looking forward to seeing more organizations sign on.

We announced the publication of version 2.0 of our EDRM XML schema.  It's a tightened-up schema, reorganized so that it should be a bit easier to use and more efficient in operation.

With the Metrics project, we are beginning to add information to a database that we've developed to gather metrics, the objective being to be able to make available metrics with an empirical basis, rather than the types of numbers bandied about today, where no one seems to know how they were arrived at. Also, last year the Uniform Task Billing Management System (UTBMS) code set for litigation was updated.  The codes to use for tracking e-discovery activities were expanded from a single code that covered not just e-discovery but other activities, to a number of codes based on the EDRM Metrics code set.

On the Information Governance Reference Model (IGRM) side, we recently published a joint white paper with ARMA.  The paper cross-maps EDRM's Information Governance Reference Model (IGRM) with ARMA's Generally Accepted Recordkeeping Principles (GARP).  We look forward to more collaborative materials coming out of the two organizations.

As for Apersee, we continue to allow consumers to search the data on the site for free, but we are no longer charging providers a fee for their information to be available.  Instead, we now have two sponsors and some advertising on the site.  This means that any provider can put information in, and everyone can search that information.  The more data that goes in, the more useful the searching process becomes.  All this fits our goal of creating a better way to match consumers with the providers who have the services, software, skills and expertise that the consumers actually need.

And on the consulting and testifying side, I continue to work with a broad array of law firms; corporate and governmental consumers of e-discovery services and software; and providers offering those capabilities.

Thanks, George, for participating in the interview!

And to the readers, as always, please share any comments you might have or if you’d like to know more about a particular topic!

eDiscovery Case Law: Court Dismisses Identity Theft Case Where No Harm Was Proven

 

In the case Reilly v. Ceridian Corp., No. 11-1738 (3d Cir. Dec. 12, 2011), the Third Circuit affirmed the district court’s dismissal of a class action against payroll processing company Ceridian for a data breach, finding that the plaintiffs lacked standing because their alleged injuries were too speculative.

An unknown hacker breached Ceridian’s Powerpay system in December 2009, potentially gaining access to payroll information such as names, birth dates, bank account numbers and Social Security numbers belonging to approximately 27,000 employees at 1,900 companies. Two individual plaintiffs filed suit on behalf of all of the individuals whose information was exposed in the security breach.  However, the lawsuit did not allege that the hacker actually accessed, misused or copied the data. Instead, the plaintiffs’ claim was based on an allegedly increased risk of identity theft, emotional distress and the credit-monitoring costs they incurred.

The U.S. Court of Appeals for the Third Circuit upheld the District Court decision dismissing the case, finding that these asserted injuries were too speculative to give the plaintiffs standing to bring a federal lawsuit, and emphasized the need for an injury-in-fact that is actual or imminent, not hypothetical.

The court distinguished this case from other cases in the Seventh and Ninth Circuits where plaintiffs bringing claims for data breaches were found to have standing. The Third Circuit judges noted that those other cases involved threatened harms that were much more “imminent” and “certainly impending” due to evidence of improper intent (such as the Ninth Circuit case, where an individual had attempted to open a bank account with a plaintiff’s information following the physical theft of a laptop).

Even though the plaintiffs voluntarily expended time and money to monitor their financial situation, the court concluded:

“Here, no evidence suggests that the data has been—or will ever be—misused. … The present test is actuality, not hypothetical speculations concerning the possibility of future injury. Appellants’ allegations of an increased risk of identity theft resulting from a security breach are therefore insufficient to secure standing.”

So, what do you think?  Should the case have been dismissed?  Or should a company be held responsible for security breaches regardless of what is done with the breached data?  Please share any comments you might have or if you’d like to know more about a particular topic.

Disclaimer: The views represented herein are exclusively the views of the author, and do not necessarily represent the views held by CloudNine Discovery. eDiscoveryDaily is made available by CloudNine Discovery solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscoveryDaily should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.

eDiscovery Trends: 2012 Predictions – By The Numbers

With a nod to Nick Bakay, “It’s all so simple when you break things down scientifically.”

The late December/early January time frame is always when various people in eDiscovery make their annual predictions as to what trends to expect in the coming year.  I know what you’re thinking – “oh no, not another set of eDiscovery predictions!”  However, at eDiscovery Daily, we do things a little bit differently.  We like to take a look at other predictions and see if we can spot some common trends among those before offering some of our own (consider it the ultimate “cheat sheet”).  So, as I did last year, I went “googling” for 2012 eDiscovery predictions, and organized the predictions into common themes.  I found eDiscovery predictions here, here, here, here, here, here and Applied Discovery.  Oh, and also here, here and here.  Ten sets of predictions in all!  Whew!

A couple of quick comments: 1) Not all of these are from the original sources, but the links above attribute the original sources when they are re-prints.  If I have failed to accurately attribute the original source for a set of predictions, please feel free to comment.  2) This is probably not an exhaustive list of predictions (I have other duties in my “day job”, so I couldn’t search forever), so I apologize if I’ve left anybody’s published predictions out.  Again, feel free to comment if you’re aware of other predictions.

Here are some of the common themes:

  • Technology Assisted Review: Nine out of ten “prognosticators” (up from 2 out of 7 last year) predicted a greater emphasis/adoption of technological approaches.  While some equate technology assisted review with predictive coding, other technology approaches such as conceptual clustering are also increasing in popularity.  Clearly, as the amount of data associated with the typical litigation rises dramatically, technology is playing a greater role in enabling attorneys to manage the review more effectively and efficiently.
  • eDiscovery Best Practices Combining People and Technology: Seven out of ten “augurs” also had predictions related to various themes associated with eDiscovery best practices, especially processes that combine people and technology.  Some have categorized it as a “maturation” of the eDiscovery process, with corporations becoming smarter about eDiscovery and integrating it into core business practices.  We’ve had numerous posts regarding eDiscovery best practices in the past year; click here for a selection of them.
  • Social Media Discovery: Six “pundits” forecasted a continued growth in sources and issues related to social media discovery.  Bet you didn’t see that one coming!  For a look back at cases from 2011 dealing with social media issues, click here.
  • Information Governance: Five “soothsayers” presaged various themes related to the promotion of information governance practices and programs, ranging from a simple “no more data hoarding” to an “emergence of Information Management platforms”.  For our posts related to Information Governance and management issues, click here.
  • Cloud Computing: Five “mediums” (but are they happy mediums?) predict that ESI and eDiscovery will continue to move to the cloud.  Frankly, given the predictions in cloud growth by Forrester and Gartner, I’m surprised that there were only five predictions.  Perhaps predicting growth of the cloud has become “old hat”.
  • Focus on eDiscovery Rules / Court Guidance: Four “prophets” (yes, I still have my thesaurus!) expect courts to provide greater guidance on eDiscovery best practices in the coming year via a combination of case law and pilot programs/model orders to establish expectations up front.
  • Complex Data Collection: Four “psychics” also predicted that data collection will continue to become more complex as data sources abound, the custodian-based collection model comes under stress and self-collection gives way to more automated techniques.

The “others receiving votes” category (three predicting each of these) included cost shifting and increased awards of eDiscovery costs to the prevailing party in litigation, flexible eDiscovery pricing and predictable or reduced costs, continued focus on international discovery and continued debate on potential new eDiscovery rules.  Two each predicted continued consolidation of eDiscovery providers, de-emphasis on use of backup tapes, de-emphasis on use of eMail, multi-matter eDiscovery management (to leverage knowledge gained in previous cases), risk assessment/statistical analysis and more single platform solutions.  And, one predicted more action on eDiscovery certifications.

Some interesting predictions.  Tune in tomorrow for ours!

So, what do you think?  Care to offer your own “hunches” from your crystal ball?  Please share any comments you might have or if you’d like to know more about a particular topic.

Disclaimer: The views represented herein are exclusively the views of the author, and do not necessarily represent the views held by CloudNine Discovery. eDiscoveryDaily is made available by CloudNine Discovery solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscoveryDaily should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.

eDiscovery Trends: ARMA International and EDRM Jointly Release Information Governance White Paper

 

A few months ago, the Electronic Discovery Reference Model (EDRM) and ARMA International announced that they would be collaborating on information governance guidelines for eDiscovery.  It only took them a little over three months to release their first work product.

On December 20 of last year, ARMA and EDRM announced the publication of a jointly developed white paper entitled, How the Information Governance Reference Model (IGRM) Complements ARMA International’s Generally Accepted Recordkeeping Principles (GARP).  The press release announcing the release of the white paper can be found on the EDRM site here.  The web version of the paper is located here and the PDF version can be downloaded here.

The core of the paper is to relate the EDRM Information Governance Reference Model (IGRM) to ARMA’s GARP® principles.  There are eight GARP principles, as follows:

  1. Accountability
  2. Transparency
  3. Integrity
  4. Protection
  5. Compliance
  6. Availability
  7. Retention
  8. Disposition

The white paper provides a chart for assigning ownership to each business unit for each GARP principle and describes the Maturity Model with five levels of effective Information Governance, ranging from Level 1 (Sub-standard) to Level 5 (Transformational).  Transformational describes “an organization that has integrated information governance into its overall corporate infrastructure and business processes to such an extent that compliance with the program requirements is routine”.  Based on the CGOC Information Governance Benchmark Report from a little over a year ago, most organizations have quite a bit of maturing still to do.

The white paper then proceeds to describe each of the eight principles “According to GARP” at Level 5 Transformational Maturity.  Where’s Robin Williams when you need him?  The white paper finishes with several conclusions noting that “the IGRM complements the metrics defined by ARMA International’s Information Governance Maturity Model”.

This white paper provides a great overview of both the IGRM and ARMA GARP principles and is well worth reading to develop an understanding of both models.  It will be interesting to see how the EDRM and ARMA joint effort proceeds from here to help organizations achieve a higher level of “maturity” when it comes to information governance.

So, what do you think?  Have you read the white paper yet?  Do you think the EDRM/ARMA collaboration will lead to greater information governance within organizations?  As always, please share any comments you might have or if you’d like to know more about a particular topic!


eDiscovery Trends: Jason R. Baron

 

This is the first of the Holiday Thought Leader Interview series.  I interviewed several thought leaders to get their perspectives on various eDiscovery topics.

Today’s thought leader is Jason R. Baron. Jason has served as the National Archives' Director of Litigation since May 2000 and has been involved in high-profile cases for the federal government. His background in eDiscovery dates to the Reagan Administration, when he helped retain backup tapes containing Iran-Contra records from the National Security Council as the Justice Department’s lead counsel. Later, as director of litigation for the U.S. National Archives and Records Administration, Jason was assigned a request to review documents pertaining to tobacco litigation in U.S. v. Philip Morris.

He currently serves as The Sedona Conference Co-Chair of the Working Group on Electronic Document Retention and Production. Baron is also one of the founding coordinators of the TREC Legal Track, a search project organized through the National Institute of Standards and Technology to evaluate search protocols used in eDiscovery. This year, Jason was awarded the Emmett Leahy Award for Outstanding Contributions and Accomplishments in the Records and Information Management Profession.

You were recently awarded the prestigious Emmett Leahy Award for excellence in records management. Is it unusual that a lawyer wins such an award? Or is the job of the litigator and records manager becoming inextricably linked?

Yes, it was unusual: I am the first federal lawyer to win the Emmett Leahy award, and only the second lawyer to have done so in the 40-odd years that the award has been given out. But my career path in the federal government has been a bit unusual as well: I spent seven years working as lead counsel on the original White House PROFS email case (Armstrong v. EOP), followed by more than a decade worrying about records-related matters for the government as Director of Litigation at NARA. So with respect to records and information management, I long ago passed at least the Malcolm Gladwell test in "Outliers" where he says one needs to spend 10,000 hours working on anything to develop a level of "expertise."  As to the second part of your question, I absolutely believe that to be a good litigation attorney these days one needs to know something about information management and eDiscovery — since all evidence is "born digital" and lots of it needs to be searched for electronically. As you know, I also have been a longtime advocate of a greater linking between the fields of information retrieval and eDiscovery.

In your acceptance speech you spoke about the dangers of information overload and the possibility that it will make it difficult for people to find important information. How optimistic are you that we can avoid this dystopian future? How can the legal profession help the world avoid this fate?

What I said was that in a world of greater and greater retention of electronically stored information, we need to leverage artificial intelligence and specifically better search algorithms to keep up in this particular information arms race. Although Ralph Losey teased me in a recent blog post that I was being unduly negative about future information dystopias, I actually am very optimistic about the future of search technology assisting in triaging the important from the ephemeral in vast collections of archives. We can achieve this through greater use of auto-categorization and search filtering methods, as well as having a better ability in the future to conduct meaningful searches across the enterprise (whether in the cloud or not). Lawyers can certainly advise their clients how to practice good information governance to accomplish these aims.

You were one of the founders of the TREC Legal Track research project. What do you consider that project’s achievement at this point?

The initial idea for the TREC Legal Track was to get a better handle on evaluating various types of alternative search methods and technologies, to compare them against a "baseline" of how effective lawyers were in relying on more basic forms of keyword searching. The initial results were a wake-up call, in showing lawyers that sole reliance on simple keywords and Boolean strings sometimes results in a large quantity of relevant evidence going missing. But during the half-decade of research that now has gone into the track, something else of perhaps even greater importance has emerged from the results, namely: we have a much better understanding now of what a good search process looks like, which includes a human in the loop (known in the Legal Track as a topic authority) evaluating on an ongoing, iterative basis what automated search software kicks out by way of initial results. The biggest achievement however may simply be the continued existence of the TREC Legal Track itself, still going in its 6th year in 2011, and still producing important research results, on an open, non-proprietary platform, that are fully reproducible and that benefit both the legal profession as well as the information retrieval academic world. While I stepped away after 4 years from further active involvement in the Legal Track as a coordinator, I continue to be highly impressed with the work of the current track coordinators, led by Professor Doug Oard at the University of Maryland, who has remained at the helm since the very beginning.

To what extent has TREC’s research proven the reliability of computer-assisted review in litigation? Is there a danger that the profession assumes the reliability of computer-assisted review is a settled matter?

The TREC Legal Track results I am most familiar with through calendar year 2010 have shown computer-assisted review methods finding in some cases on the order of 85% of relevant documents (a .85 recall rate) per topic while only producing 10% false positives (a .90 precision rate). Not all search methods have had these results, and there has been in fact a wide variance in success achieved, but these returns are very promising when compared with historically lower rates of recall and precision across many information retrieval studies. So the success demonstrated to date is highly encouraging. Coupled with these results has been additional research reported by Maura Grossman & Gordon Cormack, in their much-cited paper Technology-Assisted Review in EDiscovery Can Be More Effective and More Efficient Than Exhaustive Manual Review, which makes the case for the greater accuracy and efficiency of computer-assisted review methods.
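Recall and precision are simple ratios, and it helps to be concrete about what figures like those above mean.  Here is a minimal sketch of the arithmetic, using an illustrative topic I made up for this post (the counts are hypothetical, not TREC-reported data):

```python
def recall_precision(relevant_retrieved, total_relevant, total_retrieved):
    """Return (recall, precision) for a single review topic.

    recall    = share of all truly relevant documents that were found
    precision = share of retrieved documents that were actually relevant
    """
    recall = relevant_retrieved / total_relevant
    precision = relevant_retrieved / total_retrieved
    return recall, precision

# Hypothetical topic: 1,000 truly relevant documents exist in the
# collection; the review retrieves 944 documents, 850 of them relevant.
recall, precision = recall_precision(850, 1000, 944)
print(f"recall={recall:.2f}, precision={precision:.2f}")
```

Read together, a .85 recall / .90 precision pair means roughly 15% of the relevant documents were missed while only about one in ten produced documents was a false positive.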

Other research conducted outside of TREC, most notably by Herbert Roitblat, Patrick Oot and Anne Kershaw, also points in a similar direction (as reported in their article Mandating Reasonableness in a Reasonable Inquiry). All of these research efforts buttress the defensibility of technology-assisted review methods in actual litigation, in the event of future challenges. Having said this, I do agree that we are still in the early days of using many of the newer predictive types of automated search methods, and I would be concerned about courts simply taking on faith the results of past research as being applicable in all legal settings. There is no question however that the use of predictive analytics, clustering algorithms, and seed sets as part of technology-assisted review methods is saving law firms money and time in performing early case assessment and for multiple other purposes, as reported in a range of eDiscovery conferences and venues — and I of course support all of these good efforts.

You have discussed the need for industry standards in eDiscovery. What benefit would standards provide?

Ever since I served as Co-Editor in Chief on The Sedona Conference Commentary on Achieving Quality in eDiscovery (2009), I have been thinking about what the process for conducting good eDiscovery should look like. That paper focused on project management, sampling, and imposing various forms of quality controls on collection, review, and production. The question is, is a good eDiscovery process capable of being fit into a maturity model of sorts, and might it be useful to consider whether vendors and law firms would benefit from having their in-house eDiscovery processes audited and certified as meeting some common baseline of quality? To this end, the DESI IV workshop ("Discovery of ESI") held in Pittsburgh last June, as part of the Thirteenth International AI and Law Conference (ICAIL 2011), had as its theme exploring what types of model standards could be imposed on the eDiscovery discipline, so that we all would be able to work from some common set of benchmarks.  Some 75 people attended and 20-odd papers were presented. I believe the consensus in the room was that we should be pursuing further discussions as to what an ISO 9001-type quality standard would look like as applied to the specific eDiscovery sector, much as other industry verticals have their own ISO standards for quality. Since June, I have been in touch with some eDiscovery vendors that have actually undergone an audit process to achieve ISO 9001 certification. This is an area where no consensus has yet emerged as to the path forward — but I will be pursuing further discussions with DESI workshop attendees in the coming months and promise to report back in this space as to what comes of these efforts.

What sort of standards would benefit the industry? Do we need standards for pieces of the eDiscovery process, like a defensible search standard, or are you talking about a broad quality assurance process?

DESI IV started by concentrating on what would constitute a defensible search standard; however, it became clear at the workshop and over the course of the past few months that we need to think bigger, in looking across the eDiscovery life cycle as to what constitutes best practices through automation and other means. We need to remember however that eDiscovery is a very young discipline, as we're only five years out from the 2006 Rules Amendments. I don't have all the answers, by any means, on what would constitute an acceptable set of standards, but I like to ask questions and believe in a process of continuous, lifelong learning. As I said, I promise I'll let you know about what success has been achieved in this space.

Thanks, Jason, for participating in the interview!

And to the readers, as always, please share any comments you might have or if you’d like to know more about a particular topic!

eDiscovery Best Practices: Production is the “Ringo” of the eDiscovery Phases

 

Since eDiscovery Daily debuted over 14 months ago, we’ve covered a lot of case law decisions related to eDiscovery.  65 posts related to case law to date, in fact.  We’ve covered cases associated with sanctions related to failure to preserve data, issues associated with incomplete collections, inadequate searching methodologies, and inadvertent disclosures of privileged documents, among other things.  We’ve noted that 80% of the costs associated with eDiscovery are in the Review phase and that volume of data and sources from which to retrieve it (including social media and “cloud” repositories) are growing exponentially.  Most of the “press” associated with eDiscovery ranges from the “left side of the EDRM model” (i.e., Information Management, Identification, Preservation, Collection) through the stages to prepare materials for production (i.e., Processing, Review and Analysis).

All of those phases lead to one inevitable stage in eDiscovery: Production.  Yet, few people talk about the actual production step.  If Preservation, Collection and Review are the “John”, “Paul” and “George” of the eDiscovery process, Production is “Ringo”.

It’s the final crucial step in the process, and if it’s not handled correctly, all of the due diligence spent in the earlier phases could mean nothing.  So, it’s important to plan for production up front and to apply a number of quality control (QC) checks to the actual production set to ensure that the production process goes as smoothly as possible.

Planning for Production Up Front

When discussing the production requirements with opposing counsel, it’s important to ensure that those requirements make sense, not only from a legal standpoint, but from a technical standpoint as well.  Involve support and IT personnel in the process of deciding those parameters as they will be the people who have to meet them.  Issues to be addressed include, but are not limited to:

  • Format of production (e.g., paper, images or native files);
  • Organization of files (e.g., organized by custodian, legal issue, etc.);
  • Numbering scheme (e.g., Bates labels for images, sequential file names for native files);
  • Handling of confidential and privileged documents, including log requirements and stamps to be applied;
  • Handling of redactions;
  • Format and content of production log;
  • Production media (e.g., CD, DVD, portable hard drive, FTP, etc.).

I was involved in a case recently where opposing counsel was requesting an unusual production format where the names of the files would be the subject line of the emails being produced (for example, “Re: Completed Contract, dated 12/01/2011”).  Two issues with that approach: 1) The proposed format only addressed emails, and 2) Windows file names don’t support certain characters, such as colons (:) or slashes (/).  I provided that feedback to the attorneys so that they could address it with opposing counsel and hopefully agree on a revised format that made more sense.  So, let the tech folks confirm the feasibility of the production parameters.
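The Windows file name problem is easy to check mechanically.  Here’s a hypothetical sketch (the function name and the choice of “-” as a replacement character are mine, not anything agreed in that case) of how subject-line naming could be sanitized before anyone commits to it:

```python
import re

# Characters Windows forbids in file names: \ / : * ? " < > |
_INVALID = re.compile(r'[\\/:*?"<>|]')

def subject_to_filename(subject, max_len=120):
    """Turn an email subject into a legal Windows file name.

    Replaces forbidden characters with '-', strips trailing dots and
    spaces (also disallowed at the end of Windows names), and truncates
    to a safe length.
    """
    name = _INVALID.sub('-', subject).rstrip('. ')
    return name[:max_len]

print(subject_to_filename("Re: Completed Contract, dated 12/01/2011"))
# -> Re- Completed Contract, dated 12-01-2011
```

Note that even a sanitized scheme still doesn’t guarantee uniqueness — two emails with the same subject would collide — which is one more reason Bates-style sequential naming is the safer default.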

The workflow throughout the eDiscovery process should also keep in mind the end goal of meeting the agreed upon production requirements.  For example, if you’re producing native files with metadata, you may need to take appropriate steps to keep the metadata intact during the collection and review process so that the metadata is not inadvertently changed. For some file types, metadata is changed merely by opening the file, so it may be necessary to collect the files in a forensically sound manner and conduct review using copies of the files to keep the originals intact.
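One common way to demonstrate that the originals were kept intact — a general-practice sketch, not a specific tool referenced in this post — is to record a cryptographic hash of each file at collection time and verify the review copy against it:

```python
import hashlib
import shutil

def sha256(path):
    """Hash a file in chunks so large collections don't exhaust memory."""
    h = hashlib.sha256()
    with open(path, 'rb') as f:
        for chunk in iter(lambda: f.read(1 << 20), b''):
            h.update(chunk)
    return h.hexdigest()

def collect(original, dest_dir):
    """Copy a file for review and verify the copy is bit-identical.

    copy2 preserves file timestamps; the hash comparison proves the
    content was not altered by the copy itself.  Reviewers then work
    only on the copy, leaving the original untouched.
    """
    copy = shutil.copy2(original, dest_dir)
    if sha256(copy) != sha256(original):
        raise RuntimeError(f"collection altered {original}")
    return copy
```

If a question about metadata integrity comes up months later, the hash log recorded at collection time is the evidence that the produced content matches what was collected.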

Tomorrow, we will talk about preparing the production set and performing QC checks to ensure that the ESI being produced to the requesting party is complete and accurate.

So, what do you think?  Have you had issues with production planning in your cases?  Please share any comments you might have or if you’d like to know more about a particular topic.