Information Governance

eDiscovery Case Law: Discovery Violations Result in Sanctions Against Plaintiff and Counsel

Yesterday, we reported on a case with no sanctions; today, we report on a case with a different outcome.

Both the plaintiff and plaintiff’s counsel have been ordered to pay sanctions for discovery abuses in a lawsuit in federal court in Washington that was dismissed with prejudice on June 8, 2011.

In Play Visions, Inc. v. Dollar Tree Stores, Inc., No. C09-1769 MJP (W.D. Wash. June 8, 2011), the plaintiff moved to voluntarily dismiss its case with prejudice. The defendants did not argue against dismissal but did seek sanctions from the plaintiff based on what they considered to be “a pattern of sanctionable discovery misconduct.” The court ruled that discovery abuses had occurred, and fined the plaintiff and plaintiff’s counsel $137,168.41 “jointly and severally”. The misconduct of the plaintiff, Play Visions, Inc., included:

  • Misrepresentation of Available Documents: Play Visions claimed that all relevant documents were kept in hard copy only; however, the deposition of Play Visions’ CFO revealed that electronic records existed that should have been produced months earlier in discovery.
  • Falsified Expert’s Report: The plaintiff’s expert report was prepared by plaintiff’s counsel Mark Lorbiecki and only signed and “approved” by the expert. In addition, the court discovered that the plaintiff had violated the court’s protective order by revealing confidential information to the same expert witness.

As a result of these and other misrepresentations and discovery abuses, the court granted the defendants’ motion and ordered the plaintiff and its counsel to pay sanctions:

  • The court found that Play Visions, Inc. had falsely certified that all relevant records had been saved in paper format and had delayed the search and production of documents. Play Visions’ counsel was found to have been negligent in failing to familiarize himself with Play Visions’ document practices and to have failed to assist his client in meeting its mandatory discovery obligations.
  • Accordingly, the court considered every instance in which the defendants were forced to do extra work as a result of the plaintiff’s delays and inaccuracies, and fined Play Visions, Inc. and its counsel $137,168.41 jointly and severally, due within 15 days of the order.
  • Not finding “that the discovery violations in this case merit finding the entire case exceptional under 35 U.S.C. § 285”, the court ruled against shifting any attorney’s fees in this case.  Otherwise, the sanctions award could have been even higher!

So, what do you think? Do the discovery violations committed by Play Visions and by its attorney demand monetary sanctions on this scale? Did Play Visions actually believe that they had no relevant electronic files?  Please share any comments you might have, or let us know if you’d like to know more about a particular topic.

eDiscovery Best Practices: Avoiding eDiscovery Nightmares: 10 Ways CEOs Can Sleep Easier


I found this article in the CIO Central blog on Forbes.com from Robert D. Brownstone – it’s a good summary of issues for organizations to consider so that they can avoid major eDiscovery nightmares.  The author counts down his top ten list David Letterman style (clever!) to provide a nice, easy-to-follow summary of the issues.  Here’s a recap, with my ‘two cents’ on each item:

10. Less is more: The U.S. Supreme Court ruled unanimously in 2005 in the Arthur Andersen case that a “retention” policy is actually a destruction policy.  It’s important to routinely dispose of old data that is no longer needed, so that less data is subject to discovery, and just as important to know where the remaining data resides.  My two cents: A data map is a great way to keep track of where the data resides.

9. Sing Kumbaya: They may speak different languages, but you need to find a way to bridge the communication gap between Legal and IT to develop an effective litigation-preparedness program.  My two cents: Require cross-training so that each department can understand the terms and concepts important to the other.  And, don’t forget the records management folks!

8. Preserve or Perish: Assign the litigation hold protocol to one key person, either a lawyer or a C-level executive, to decide when a litigation hold must be issued.  Ensure an adequate process and memorialize steps taken – and not taken.  My two cents: Memorialize is emphasized because an organization that has a defined process and the documentation to back it up is much more likely to be given leeway in the courts than a company that doesn’t document its decisions.

7. Build the Three-Legged Stool: A successful eDiscovery approach involves knowledgeable people, great technology, and up-to-date written protocols.  My two cents: Up-to-date written protocols are the first thing to slide when people get busy – don’t let it happen.

6. Preserve, Protect, Defend: Your techs need the knowledge to avoid altering metadata, maintain chain-of-custody information and limit access to a working copy for processing and review.  My two cents: A good review platform will assist greatly in all three areas.

5. Natives Need Not Make You Restless: Consider exchanging files to be produced in their original/”native” formats to avoid huge out-of-pocket costs of converting thousands of files to image format.  My two cents: Be sure to address how redactions will be handled, as some parties prefer to image redacted documents while others prefer to agree to alter the natives to obscure that information.

4. Get M.A.D.?  Then Get Even: Apply the Mutually Assured Destruction (M.A.D.) principle by agreeing with the other side to take costly volumes of data, such as digital voicemails and back-up data created down the road, off the table.  My two cents: That’s assuming, of course, you have the same levels of data.  If one party has a lot more data than the other party, there may be no incentive for that party to agree to concessions.

3. Cooperate to Cull Aggressively and to Preserve Clawback Rights: Setting expectations regarding culling efforts and reaching a clawback agreement with opposing counsel enables each side to cull more aggressively to reduce eDiscovery costs.  My two cents: Some parties will agree on search terms up front while others will feel that gives away case strategy, so the level of cooperation may vary from case to case.

2. QA/QC: Employ Quality Assurance (QA) tests throughout review to ensure a high accuracy rate, then perform Quality Control (QC) testing before the data goes out the door, building time in the schedule for that QC testing.  Also, consider involving a search-methodology expert.  My two cents: I cannot stress that last point enough – the ability to illustrate how you got from the large collection set to the smaller production set will be imperative to responding to any objections you may encounter to the produced set.
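For readers who want a concrete picture of what that final QC sampling step might look like, here’s a minimal sketch in Python (the load file name, column name and sample size are all assumptions for illustration only) of drawing a simple random sample from a production set for a final spot-check before it goes out the door – an illustration, not a substitute for an expert-designed sampling methodology:

    import csv
    import random

    # Hypothetical inputs: a load file listing the documents to be produced, and a sample size.
    PRODUCTION_LOAD_FILE = "production_loadfile.csv"   # assumed file name and format (one row per document)
    SAMPLE_SIZE = 200                                  # illustrative sample size only

    with open(PRODUCTION_LOAD_FILE, newline="") as f:
        documents = list(csv.DictReader(f))

    # Draw a simple random sample (without replacement) for reviewers to spot-check
    # responsiveness, privilege and redactions before the production goes out the door.
    qc_sample = random.sample(documents, min(SAMPLE_SIZE, len(documents)))

    for doc in qc_sample:
        print(doc.get("BegBates", "<no Bates number>"))   # 'BegBates' is an assumed load file column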

1. Never Drop Your Laptop Bag and Run: Dig in, learn as much as you can and start building repeatable, efficient approaches.  My two cents: It’s the duty of your attorneys and providers to demonstrate competency in eDiscovery best practices.  How will you know whether they have or not unless you develop that competency yourself?

So, what do you think?  Are there other ways for CEOs to avoid eDiscovery nightmares?   Please share any comments you might have or if you’d like to know more about a particular topic.

eDiscovery Trends: The Best SaaS Providers are Certifiable


The increasing popularity of cloud-based Software-as-a-Service (SaaS) solutions is becoming well documented, with this very blog noting Forrester and Gartner predictions of tremendous growth in cloud computing over the next several years.  We’ve also noted the importance of knowing where your data is stored, as many online poker players learned the hard way when the recent US government crackdown on several gambling sites left them without a way to recover their funds.

If only there were some sort of certification, administered by an impartial third party, to ensure that your SaaS provider has implemented policies and processes that keep your information secure, stable and safe.  There is such a certification.

SAS 70 (the Statement on Auditing Standards No. 70) defines the standards an auditor must employ in order to assess the contracted internal controls of a service provider. Service providers, such as insurance claims processors, credit processing companies and, especially pertinent to eDiscovery, hosted data centers, are evaluated by these standards. SAS 70 was developed by the American Institute of Certified Public Accountants (AICPA) as a simplification of a set of criteria for auditing standards originally defined in 1988.  Standards such as SAS 70 became critical in the wake of the Sarbanes-Oxley Act, which created significant legal penalties for publicly traded companies that lacked sufficient control standards for their financial information.

Under SAS 70, auditor reports are classified as either Type I or Type II. In a Type I report, the auditor evaluates the controls the service provider has put in place to prevent accounting inconsistencies, errors and misrepresentation. The auditor also evaluates the likelihood that those efforts will produce the desired future results. A Type II report goes a step further.  It includes the same information as that contained in a Type I report; however, the auditor also attempts to determine the effectiveness of the agreed-on controls since their implementation. Type II reports also incorporate data compiled during a specific time period, usually a minimum of six months.

SAS 70 reports are requested either by the service provider or by a user organization (i.e., a client). The service provider’s ability to deliver consistent service auditor’s reports builds a client’s trust and confidence in the service provider and addresses potential concerns. A SaaS (2 a’s, as opposed to one for SAS) provider that has received SAS 70 Type II certification has demonstrated to an impartial third party a proven track record of policies and processes to protect its clients’ data.  When it comes to your data, you want a provider that has proven to be certifiable.

So, what do you think?  Is your SaaS provider SAS 70 Type II certified?  Please share any comments you might have or if you’d like to know more about a particular topic.

Full disclosure: I work for Trial Solutions, which provides SaaS-based eDiscovery review applications FirstPass® (for first pass review) and OnDemand® (for linear review and production).  Our clients’ data is hosted in a secured, SAS 70 Type II certified Tier 4 Data Center in Houston, Texas.

eDiscovery Trends: Forecast for More Clouds


No, eDiscoveryDaily has not begun providing weather forecasts on our site.  Or stock forecasts.

But imagine if you could invest in an industry that could nearly sextuple (i.e., grow almost six-fold) in nine years.

Well, the cloud computing, or Software-as-a-Service (SaaS), industry may be just the industry for you.  According to a Forrester report from last month, the global cloud computing market will grow from 40.7 billion dollars in 2011 to more than 241 billion dollars by 2020.  That’s a 200 billion dollar increase in nine years.  That’s enough to put anybody “on cloud nine”!

The report titled Sizing The Cloud by Stefan Ried (Principal Analyst, Forrester) and Holger Kisker (Sr. Analyst, Forrester), outlines the different market dynamics for three core layers of cloud computing, as follows:

  • Public Cloud: From 25.5 billion dollars to 159.3 billion dollars by 2020;
  • Virtual Private Cloud: From 7.5 billion dollars to 66.4 billion dollars by 2020;
  • Private Cloud: From 7.8 billion dollars to 15.9 billion dollars by 2020.
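For readers who like to check the math, here’s a quick back-of-the-envelope sketch in Python (using the segment figures cited above) showing that the three layers roughly sum to Forrester’s totals and that the overall market nearly sextuples over the nine years:

    # Forrester "Sizing The Cloud" segment figures cited above, in billions of dollars.
    segments_2011 = {"public": 25.5, "virtual private": 7.5, "private": 7.8}
    segments_2020 = {"public": 159.3, "virtual private": 66.4, "private": 15.9}

    total_2011 = sum(segments_2011.values())   # ~40.8, in line with the 40.7 billion dollar figure
    total_2020 = sum(segments_2020.values())   # ~241.6, in line with "more than 241 billion dollars"

    print(f"2011 total: ${total_2011:.1f}B; 2020 total: ${total_2020:.1f}B")
    print(f"Nine-year growth multiple: {total_2020 / total_2011:.1f}x")   # ~5.9x - nearly sextuple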

Public cloud providers include everything from Facebook and Twitter to Amazon.com and Salesforce.com.  As the name implies, a private cloud is where a company implements its own cloud environment to support its own needs.  A virtual private cloud is simply a private cloud located within a public cloud.

Forrester is not the only analyst firm that expects big things for cloud computing.  The Gartner Group projected that the cloud computing industry will have revenue of 148.8 billion dollars by 2014, even higher than Forrester’s forecast of 118.7 billion dollars for the same year.  Clearly, the benefits of the cloud are causing many organizations to consider it as a viable option for storing and managing critical data.

What does that mean from an eDiscovery perspective?  That means a forecast for more clouds.  If your organization doesn’t have a plan in place for managing, identifying, preserving and collecting data from its cloud solutions, things could get stormy!

So, what do you think?  Is your organization storing more data in the cloud?  Does your organization have an effective plan in place for getting to the data when litigation strikes?  Please share any comments you might have or if you’d like to know more about a particular topic.

eDiscovery Best Practices: Usefulness of Facebook’s Self Collection Mechanism


We’ve written about Facebook a lot on this blog.  Shortly after this blog was launched, we provided information on Facebook’s subpoena policy.  We’ve also talked about the eDiscovery implications associated with the rollout of Facebook’s new email messaging system, dubbed “Facemail”.  And, just last week, we chronicled a case involving Facebook where it was ordered to produce documents rather than merely provide access to them.  And, we haven’t even mentioned the latest revelations that Facebook may have secretly hired a PR firm to plant negative stories about Google (oops, we just did!).

But perhaps our most popular post regarding Facebook was about the self collection mechanism that it rolled out last October, which we found out about via our LegalTech interview with Craig Ball, conducted in February and published back in March (Craig also wrote an article about the feature in Law Technology News in February).

Now, another article has been written about the usefulness of Facebook’s self collection mechanism (called “Download Your Information”): How Useful is Facebook's "Download Your Information" Feature in E-Discovery?, written by Patrick V. DiDomenico in the blog E-Discovery Law Alert.

The author of this article conducted a test by downloading his information via the utility, deleting some information from his Facebook profile – “an email message, some wall posts, comments, photos, and even a friend (not a close friend)” – hopefully, he added the friend back.  Then, he downloaded his information again, every day for four days, with no change for the first three days.  On the fourth day, most of the deleted information disappeared from the download, except the email message (which disappeared when he ran the utility one more time).

The conclusion was that the mechanism “does not appear to ‘look back’ and recover deleted information in the user’s account”.  Thoughts:

  • With no change in the download in the first three days, the author notes that “Facebook did not take a fresh snapshot of my account every day – it just re-downloaded the same file three days in a row”.  He doesn’t mention whether he added any content during this time.  It would be interesting to see if that would force a change.
  • I don’t believe that there is any specific documentation from Facebook as to how it handles additions and deletions and how often the snapshot is updated.  If not, it might behoove them to create some; it might save them some subpoena requests.
  • The author notes that “it is inadvisable for lawyers to rely solely on the Download Your Information feature for discovery of an adversary’s Facebook information” as it “gives no assurance that a litigant’s attempt to delete evidence will be revealed”.  On the other hand, it may still be an appropriate mechanism to use for your own discovery to preserve your own information.  Facebook may also store deleted information on backup tapes, so a subpoena could catch your opponent red-handed if you can justify the discovery of those tapes.  Food for thought.

So, what do you think?  Have you had any Facebook discovery requests in your eDiscovery projects?   Please share any comments you might have or if you’d like to know more about a particular topic.

eDiscovery Best Practices: What Are the Skeletons in Your ESI Closet?


At eDiscoveryDaily, we try not to re-post articles or blog posts from other sources.  We may reference them, but we usually try to add some commentary or analysis to provide a unique spin on the topic.  However, I read a post Thursday on one of the better legal blogs out there – Ride the Lightning from Sharon Nelson – a guest post by Jim McGann, VP of Information Discovery at Index Engines, that I thought was well done and good information for our readers.  Jim has been interviewed by eDiscoveryDaily here and here and always has terrific insight on ESI issues.  You can click here to read the post directly on Ride the Lightning or simply read below.

Law firms and corporations alike tend to keep data storage devices well beyond what their compliance requirements or business needs actually dictate.  These so-called “skeletons in the closet” pose a major problem when the entity gets sued or subpoenaed. All that dusty data is suddenly potentially discoverable. Legal counsel can be proactive and initiate responsible handling of this legacy data by defining a new, defensible information governance process.

  1. Understand all data sources. The first choice when faced with an ESI collection is to look at current online network data. However, many other sources of email and files exist on corporate networks, sources that may be more defensible and even cost effective to collect from, including offsite storage typically residing on backup tapes. Tape as a collection source has been overlooked because it was historically difficult and expensive to collect from legacy backup tapes.
  2. Get proactive with legal requirements. Defining what ESI data should be kept and placed on litigation hold and what can be purged are the first steps in a proactive strategy. These legal requirements will allow clients to put a policy in place to save specific content, certain custodians and intellectual property so that it is identifiable and ready for on demand discovery.
  3. Understand technology limitations. Only use tools that index all the content, and don’t change any of the metadata. Some older search solutions compromise the indexing process, and this may come to haunt you in the end.
  4. Become a policy expert. As new technology comes on the market, it tends to improve and strengthen the discovery process. Taking the time to understand technology trends allows you to stay one step ahead of the game and create a current defensible collection process and apply policy to it.

So, what do you think?  Do you have “skeletons” in your ESI closet?   Please share any comments you might have or if you’d like to know more about a particular topic.

eDiscovery Best Practices: 4 Steps to Effective eDiscovery With Software Analytics


I read an interesting article from Texas Lawyer via Law.com entitled “4 Steps to Effective E-Discovery With Software Analytics” that has some interesting takes on project management principles related to eDiscovery and I’ve interjected some of my thoughts into the analysis below.  A copy of the full article is located here.  The steps are as follows:

1. With the vendor, negotiate clear terms that serve the project's key objectives.  The article notes the importance of tying each collection and review milestone (e.g., collecting and imaging data; filtering data by file type; removing duplicates; processing data for review in a specific review platform; processing data to allow for optical character recognition (OCR) searching; and converting data into a tag image file format (TIFF) for final production to opposing counsel) to contract terms with the vendor.

The specific milestones will vary – for example, conversion to TIFF may not be necessary if the parties agree to a native production – so it’s important to know the size and complexity of the project, and choose only an experienced eDiscovery vendor who can handle the variations.

2. Collect and process data.  Forensically sound data collection and culling of obviously unresponsive files (such as system files) to drastically decrease the overall review costs are key services that a vendor provides in this area.  As we’ve noted many times on this blog, effective culling can save considerable review costs – each gigabyte (GB) culled can save $16-$18K in attorney review costs.
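To put that rule of thumb into perspective, here’s a rough, purely illustrative sketch in Python (the collection size and culling percentage are hypothetical; the per-gigabyte figure is the range cited above) of how culling translates into estimated review cost savings:

    # Rule-of-thumb attorney review cost per gigabyte cited above (an assumed range).
    COST_PER_GB_LOW, COST_PER_GB_HIGH = 16_000, 18_000

    collected_gb = 100        # hypothetical collection size
    culled_fraction = 0.60    # hypothetical: 60% removed by de-duplication and file-type filtering

    culled_gb = collected_gb * culled_fraction
    savings_low = culled_gb * COST_PER_GB_LOW
    savings_high = culled_gb * COST_PER_GB_HIGH

    print(f"Culling {culled_gb:.0f} GB could avoid roughly "
          f"${savings_low:,.0f} to ${savings_high:,.0f} in attorney review costs.")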

The article notes that a hidden cost is the OCR process of translating extracted text into a searchable form and that it’s an optimal negotiation point with the vendor.  This may have been true when most collections were paper based, but as most collections today are electronic, the percentage of documents requiring OCR is considerably less than it used to be.  However, it is important to be aware that some native files will be “image only”, such as TIFFs and scanned PDFs – those will require OCR to be effectively searched.

3. Select a data and document review platform.  Factors such as ease of use, robustness, and reliability of analytic tools, support staff accessibility to fix software bugs quickly, monthly user and hosting fees, and software training and support fees should be considered when selecting a document review platform.

The article notes that a hidden cost is selecting a platform with which the firm’s litigation support staff has no experience as follow-up consultation with the vendor could be costly.  This can be true, though a good vendor training program and an intuitive interface can minimize or even eliminate this component.

The article also notes that to take advantage of the vendor’s more modern technology “[a] viable option is to use a vendor's review platform that fits the needs of the current data set and then transfer the data to the in-house system”.  I’m not sure why the need exists to transfer the data back – there are a number of vendors that provide a cost-effective solution appropriate for the duration of the case.

4. Designate clear areas of responsibility.  By doing so, you minimize or eliminate inefficiencies in the project.  The article mentions the RACI matrix as a way to determine who is responsible (individuals responsible for performing each task, such as review or litigation support), accountable (the attorney in charge of discovery), consulted (the lead attorney on the case), and informed (the client).

Managing these areas of responsibility effectively is probably the biggest key to project success and the article does a nice job of providing a handy reference model (the RACI matrix) for defining responsibility within the project.
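For those who haven’t worked with a RACI matrix before, here’s a simple illustrative sketch in Python (the role assignments follow the article; the task names and data structure are my own example) of what such a matrix might look like for a small eDiscovery project:

    # R = Responsible, A = Accountable, C = Consulted, I = Informed
    # Role names follow the article; the task names and data structure are illustrative.
    raci_matrix = {
        "collect and process data": {"litigation support": "R", "discovery attorney": "A",
                                     "lead case attorney": "C", "client": "I"},
        "document review":          {"review team": "R", "discovery attorney": "A",
                                     "lead case attorney": "C", "client": "I"},
        "final production":         {"litigation support": "R", "discovery attorney": "A",
                                     "lead case attorney": "C", "client": "I"},
    }

    # Quick sanity check: every task should have exactly one Accountable role.
    for task, roles in raci_matrix.items():
        accountable = [role for role, code in roles.items() if code == "A"]
        assert len(accountable) == 1, f"{task} should have exactly one accountable role"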

So, what do you think?  Do you have any specific thoughts about this article?   Please share any comments you might have or if you’d like to know more about a particular topic.

eDiscovery Trends: Thought Leader Interview with Jeffrey Brandt, Editor of Pinhawk Law Technology Daily Digest


As eDiscovery Daily has done in the past, we have periodically interviewed various thought leaders in eDiscovery and legal technology to provide insight as to trends in the industry for our readers to consider.  Recently, I was able to interview Jeffrey Brandt, Editor of the Pinhawk Law Technology Daily Digest and columnist for Legal IT Professionals.

With an educational background in computer science and mathematics from the University of Pittsburgh, Jeff has over twenty-four years of experience in the field of legal automation, working with various organizations in the United States, Canada, and the United Kingdom.  As a technology and management consultant to hundreds of law firms and corporate law departments, he has worked on information management projects including: long-range strategic planning, workflow management and reengineering, knowledge management, IT structure and personnel requirements, and budgeting. Working as a CIO at several large law firms, Jeff has helped bring oversight, coordination and change management to initiatives including: knowledge management, library & research services, eDiscovery, records management, technology and more. Most recently, he served as the Chief Information and Knowledge Officer with an AMLaw 100 law firm based out of Washington, DC.

Jeff has also been asked to serve on numerous advisory councils and CIO advisory boards for key vendors in the legal space, advising them on issues of client service and future product direction.  He is a longtime member (and former board member) of the International Legal Technology Association (ILTA) and has taught CLE classes on topics ranging from litigation support to ethics and technology.

What do you consider to be the current significant trends in eDiscovery in 2011 and beyond on which people in the industry are, or should be, focused?

I would say that the biggest two are the project management component and, for lack of a better term, automated or artificial intelligence.

The whole concept and the complexities of what it takes to manage a case today are more challenging than ever, including issues like the number of sources, the amount of data in the sources, the format in which you’re producing, where the data can go and who can see it.  I remember the days when people used to take a couple of bankers boxes, put them in their car and go home and work on the documents.  You simply cannot do that today – the amount of information today is just insane.

As for artificial intelligence, as was discussed in the (Pinhawk) digest recently, you’re seeing the emergence of predictive coding and using computers to cull through the massive amounts of information so that a human can take the final pass.  I think more and more we’re going to see people relying on those types of technologies – some because they embrace it, others because there is no other way to humanly do it.

I think if there’s any third trend it would probably be where do we go next to get the data?  In terms of social media, mining Facebook and Twitter and all the other various sources for additional data as part of the discovery process has become a challenge.

You recently became editor of the Pinhawk Law Technology Daily Digest.  Tell me about that and about your plans for the digest.

Well, I think there are several things going forward.  My role is to keep up the good work that Curt Meltzer, the founding editor, started and fill the “big shoes” that Curt left behind.  My goal is to expand the sources of information from which Pinhawk draws.  There are about 400 sources today and I think by the time my sources (and possibly a few others) are added in, there will be over 500.  We’ve also talked about going to our readership and asking them “what are your go-to and must read sources?” to include those sources as well.  We’ll also be looking to incorporate social media tools to hopefully make the experience much more comprehensive and easier to participate in for the Pinhawk digest reader.

And, what should we be looking for in your column in Legal IT Professionals?

Well, I like to dabble in multiple areas – in the small consulting practice that I have, I do a little bit of everything.  I’ve recently done some very interesting work in communities of practice, using social media tools, focusing them inward in law firms to provide the forum for lawyers to open up, share and mentor to others.  I like KM (Knowledge Management) and related topics and we had a recent post in Pinhawk talking about the future of the law firm.  To me, those types of discussions are fascinating.

You take the extremes and you’ve got the “law factory”, you take the high-end and you’ve got the “bet the farm” law firm.  How technology plays a role in whatever culture, whatever focus a law firm puts itself on is interesting.  And then you watch and see some of the rumblings and inklings of what can be done in places like Australia, where you have third-party investment of law firms and the United Kingdom, where they are about to get third-party investment.  There was a recent article about third-party ownership of law firms in North Carolina.  You look at examples like that and you say “is the model of partnership alive?”  When you get into “big law”, are they really partnerships?  Where are they in the spectrum of a thousand sole practitioners operating under one letterhead to a firm of a thousand lawyers?  That’s where I think that communities of practice and social media tools are going to help lawyers know more about their own partners and own firms. 

It’s sad that in some firms the lawyers on the north side of the building don’t even know the lawyers on the south side of the building, let alone the people on the eighth floor vs. the tenth floor.  It’s a changing landscape.  When I got into legal and was first a CIO at Porter, Wright, Morris & Arthur, that firm – 250 lawyers in Columbus, Ohio – was the 83rd largest law firm in the US, an AMLAW 100 firm.  Today, does a firm of that size even make it into the AMLAW 250?

In my column at Legal IT Professionals, you’ll see more about KM and change management.  Another part of my practice is mentoring IT executives in how to deal with business problems related to the business of law and I think that might be my next post – free advice to the aspiring CIO.

This might sound odd coming from a technologist, but…it’s not really about the technology.  From a broad standpoint, you can be successful with most software tools.  A law firm isn’t going to be made or broken whether it chose OpenText or iManage as a document management tool or chose a specific litigation support tool.  It is more about the people, the education and the process than it is the actual tool.  Yes, there are some horrible tools that you should avoid, but, all things being equal, it’s really more the other pieces of the equation that determine your success.

Thanks, Jeff, for participating in the interview!

And to the readers, as always, please share any comments you might have or if you’d like to know more about a particular topic!

What eDiscovery Professionals Can Learn from the Internet Gambling Crack Down


Many of you may have heard about the FBI cracking down on the three largest online gambling sites in the past few days: the owners of those sites have been indicted in the United States and charged with bank fraud, money laundering and illegal gambling offenses, and the sites have been essentially shut down in the US.  Restraining orders have been issued against more than 75 bank accounts in 14 countries used by the poker companies PokerStars, Full Tilt Poker and Absolute Poker.  Many US customers of these sites are now scrambling to try to get their funds out of the sites and finding it difficult to do so.

So what?  This is an eDiscovery blog, right?  What does an Internet gambling crackdown have to do with eDiscovery?

PokerStars, Full Tilt Poker, Absolute Poker and other gambling sites are cloud-based, Software-as-a-Service (SaaS) applications.  Just like Amazon, Facebook, Twitter, eBay, YouTube and SalesForce.com, these sites provide an application via the web that enables their clients to receive a service.  In the case of Amazon, it’s the ability to purchase any number of products.  For Facebook, it’s the ability to share information with friends and family.  For these gambling sites, it’s the ability to play poker for money with anyone else in the world who has the same gambling itch that you do and a broadband connection.

The problem is: in the US, it’s illegal.  The Unlawful Internet Gambling Enforcement Act of 2006 prohibits gambling businesses from knowingly accepting payments related to a bet or wager that involves the use of the Internet and that is unlawful under any federal or state law.  So, these sites are hosted in other countries to attempt to skirt the law.

What many US customers of these sites are finding out is the same thing that eDiscovery professionals discover when they need to retrieve cloud-based data in response to a discovery request: it’s imperative to know where your data is stored.  It’s likely that many customers of these gambling sites knew that their funds were kept off-shore, while others may not have known this was the case.  Regardless, they’re now scrambling to get their data (i.e., funds) back — if they can.

Many organizations are “in the same boat” when it comes to their SaaS providers – it may be unclear where that data is being stored and it may be difficult to retrieve if it’s stored in a foreign country with a different set of laws.  It’s important to establish (in writing if possible) with the provider up front where the data will be stored and agree on procedures such as records management/destruction schedules so that you know where your data is stored and can get access to it when you need it.  Don’t gamble with your data.

So, what do you think?  Do you have organizational data in a SaaS-based solution?  Do you have a plan for getting that data when you need it?  Please share any comments you might have or if you’d like to know more about a particular topic.

Full disclosure: I work for Trial Solutions, which provides SaaS-based eDiscovery review applications FirstPass® (for first pass review) and OnDemand® (for linear review and production).  Our clients’ data is hosted in a secured Tier 4 Data Center in Houston, Texas, where Trial Solutions is headquartered.

eDiscovery Trends: 2011 eDiscovery Errors Survey


As noted in Legal IT Professionals, LDM Global on Friday announced the results of its 2011 eDiscovery Errors survey. The company asked a selection of industry professionals their views on which errors they experienced most often during the discovery process. Results were collected from across the USA, Europe and Australia.

According to Scott Merrick, LDM Global Marketing Director and survey author, “Our goal was to find out what the real, day to day issues and problems are around the discovery process.”  He also noted that “Of particular interest was the ongoing challenge of good communication. Technology has not solved that challenge and it remains at the forefront of where mistakes are made.”

The respondents of the survey were broken down into the following groups: Litigation Support Professionals 47%, Lawyers 30%, Paralegals 11%, IT Professionals 9% and Others 3%.  Geographically, the United States and Europe had 46% of the respondents each, with the remaining 8% of respondents coming from Australia.  LDM Global did not identify the total number of respondents to the survey.

For each question about errors, respondents were asked to classify the error as “frequently occurs”, “occasionally occurs”, “not very common” or “never occurs”.  Based on responses, the most common errors are:

  • Failure to Effectively Communicate across Teams: 50% of the respondents identified this error as one that frequently occurs
  • An Inadequate Data Retention Policy: 47% of the respondents identified this error as one that frequently occurs
  • Not Collecting all Pertinent Data: 41% of the respondents identified this error as one that frequently occurs
  • Failure to Perform Critical Quality Control (i.e., sampling): 40% of the respondents identified this error as one that frequently occurs
  • Badly Thought Out, or Badly Implemented, Policy: 40% of the respondents identified this error as one that frequently occurs

Perhaps one of the most surprising results is that only 14% of respondents identified spoliation of evidence, or the inability to preserve relevant emails, as an error that frequently occurs.  So, why are there so many cases in which sanctions have been issued for that very issue?  Interesting…

For complete survey results, go to LDMGlobal.com.

So, what do you think?  What are the most common eDiscovery errors that your organization has encountered?   Please share any comments you might have or if you’d like to know more about a particular topic.