Information Management

eDiscovery Trends: Potential ESI Sources Abound in Penn State Case

 

Whether you’re a college football fan or not, chances are you’ve heard about the scandal associated with the allegations of serial child abuse by former Penn State football coach Jerry Sandusky.  There seem to be new developments almost daily, and the scandal has already cost the jobs of the university president, vice president, athletic director and the head football coach, Joe Paterno, who had been head coach since 1966 and on the coaching staff since 1950 (most of us weren’t even born yet!).  Numerous lawsuits seem highly likely to arise from the alleged abuse, against a variety of defendants, including the university, the individuals alleged to be involved in the abuse and its cover-up, and the Second Mile Foundation founded by Sandusky.

Seth Row, an attorney with Parsons Farnell & Grein LLP in Portland (OR), has written an article published on the Association of Certified eDiscovery Specialists (ACEDS) web site detailing potential sources of ESI that may be relevant in the case.  The article illustrates the wide variety of sources that might be responsive to the litigation.  Here are some of the sources cited by Row:

  • Videotape of entry and exit from the athletic facilities at Penn State, to which Paterno gave Sandusky access after the latter resigned in 1999;
  • Entry/exit logs, which are likely housed in a database if keycards were used, for the Lasch Football Building, where abuse was allegedly witnessed;
  • Phone records of incoming and outgoing calls;
  • Electronic rosters of football players, coaches, staff, student interns, and volunteers affiliated with the Penn State football program over time;
  • The personal records of these individuals, including telephone logs, internet search histories, email accounts, medical and financial records, and related information created over time;
  • University listservs;
  • Internet forums – a New York Times article reported last week that a critical break in the investigation came via a posting on the Internet, mentioning that a Penn State football coach might have seen something ugly, but kept silent;
  • Maintenance logs kept by the two custodial employees who allegedly witnessed abuse;
  • Identities of all media beat reporters who covered the Penn State football team;
  • Passenger and crew manifests for all chartered flights of the Penn State football team in which Sandusky was a passenger;
  • Sandusky's credit card records to document meals and outings where he may have been accompanied by victims, and records of gifts he purchased for them;
  • All records of the Second Mile Foundation identifying boys who participated in its programs, as well as the names of donors and officers, directors and staff;
  • Paper record equivalents of this ESI that were produced in the 1990s before electronic recordkeeping became prevalent;
  • All electronic storage and computing devices owned or maintained by Sandusky, Paterno and other central figures in the scandal, including cell phones, personal computers, tablet computers, flash drives, and related hardware.

With such a wide variety of potential custodians and time frames, it will be difficult to quickly narrow down the potential ESI sources.  As the author points out, it seems likely that Penn State has already locked down its records retention policies throughout the university.  The university certainly would seem to have a reasonable expectation of litigation.  Investigators and attorneys will likely be racing against time to identify as many other parties as possible with potentially responsive ESI.

So, what do you think?  Have you been involved in litigation with such a wide distribution of potentially responsive ESI?  Please share any comments you might have or if you’d like to know more about a particular topic.

eDiscovery Best Practices: Data Mapping Doesn’t Have to be Complicated

 

Some time ago, we talked about the importance of preparing a data map of your organization’s data to be ready when litigation strikes.

Back then, we talked about four steps to create and maintain an effective data map, including:

  • Obtain early “buy-in” from various departments throughout the organization;
  • Document and educate to develop logical and comprehensive practices for managing data;
  • Communicate regularly so that new data stores (or changes to existing ones) can be addressed as they occur;
  • Update periodically to keep up with changes in technology that create new data sources.

The data map itself doesn’t have to be complicated.  It can be as simple as a spreadsheet (or series of spreadsheets, one for each department or custodian, depending on what level of information is likely to be requested).  Here are examples of types of information that you might see in a typical data map spreadsheet:

  • Type of Data: Prepare a list and continue to add to it to ensure all of the types of data are considered.  These can include email, work product documents, voice mail, databases, web site, social media content, hard copy documents, and any other type of data in use within your organization.
  • Department/Custodian: A data map is no good unless you identify the department or custodian responsible for the data.  Some of these may be kept by IT (e.g., Exchange servers for the entire organization) while others could be down to the individual level (e.g., Access databases kept on an individual’s laptop).
  • Storage Classification: The method(s) by which the data is stored by the department or custodian is important to track.  You’ll typically have Online, Nearline, Offline and Inaccessible Data.  A type of data can apply to multiple or even all storage classifications.  For example, email can be stored Online in Exchange servers, Nearline in an email archiving system, Offline in backup tapes and Inaccessible in a legacy format.  Therefore, you’ll need a column in your spreadsheet for each storage classification.
  • Retention Policy: Track the normal retention policy for each type of data stored by each department or custodian (e.g., retain email for 5 years).  While a spreadsheet won’t automatically identify when specific data is “expired”, a regular process of looking for data older than the retention period will enable your organization to purge “expired” data.
  • Litigation Hold Applied: Unless, of course, that data is subject to an active litigation hold.  If so, you’ll want to identify the case(s) for which the hold applies and be prepared to remove those cases from the list once the hold obligation is released.  If all holds are released on normally “expired” data and no additional hold obligations are expected, that may be the opportunity to purge that data.
  • Last Update Date: It’s always a good idea to keep track of when the information in the data map was last updated.  If it’s been a while since that last update, it might be time to coordinate with that department or custodian to bring their portion of the data map current.

As you see, a fairly simple 9 or 10 column spreadsheet might be all you need to start gathering information about the data stores in your organization.
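The spreadsheet described above is easy to sketch in code.  Here’s a minimal illustration in Python of what such a data map might look like, including a simple check for stale entries; the column names, sample rows, and one-year review threshold are my own illustrative assumptions, not part of any standard:

```python
import csv
from datetime import date

# Columns mirroring the categories described above (names are illustrative).
COLUMNS = ["Type of Data", "Department/Custodian", "Online", "Nearline",
           "Offline", "Inaccessible", "Retention (Years)",
           "Litigation Hold", "Last Update"]

# Sample rows -- the email example from the Storage Classification bullet,
# plus an individual-level Access database.
rows = [
    {"Type of Data": "Email", "Department/Custodian": "IT (Exchange)",
     "Online": "Exchange servers", "Nearline": "Email archive",
     "Offline": "Backup tapes", "Inaccessible": "Legacy format",
     "Retention (Years)": 5, "Litigation Hold": "",
     "Last Update": "2011-06-30"},
    {"Type of Data": "Access databases", "Department/Custodian": "J. Smith (laptop)",
     "Online": "Local drive", "Nearline": "", "Offline": "", "Inaccessible": "",
     "Retention (Years)": 3, "Litigation Hold": "Case No. 09-1769",
     "Last Update": "2011-11-01"},
]

def write_data_map(path, rows):
    """Write the data map out as a simple 9-column CSV spreadsheet."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=COLUMNS)
        writer.writeheader()
        writer.writerows(rows)

def stale_entries(rows, today, max_age_days=365):
    """Flag custodians whose 'Last Update' is more than a year old --
    a cue to coordinate with them to bring their portion current."""
    stale = []
    for row in rows:
        updated = date.fromisoformat(row["Last Update"])
        if (today - updated).days > max_age_days:
            stale.append(row["Department/Custodian"])
    return stale
```

Even a toy version like this captures the essentials: one row per data type per custodian, storage classification columns, retention and hold tracking, and a last-update date that can drive a periodic review process.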

So, what do you think?  Has your organization implemented a data mapping program?  If not, why not? Please share any comments you might have or if you’d like to know more about a particular topic.

eDiscovery Trends: Is Email Still the Most Common Form of Requested ESI?

 

Email has historically been the most common form of requested electronically stored information (ESI), but that has changed, according to a survey performed by Symantec and reported in Law Technology News.

According to the article, Symantec’s survey, conducted this past June and July, included lawyers and technologists at 2,000 enterprises worldwide.  However, the article doesn’t indicate the total number of respondents or whether that’s the number of organizations receiving the survey or the number actually responding.

Regarding how frequently (percentage of situations requested) various types of ESI are requested during legal and regulatory processes, the survey yielded some surprising answers:

  • Files and Documents: 67 percent
  • Application and Database Records: 61 percent
  • Email: 58 percent
  • Microsoft SharePoint records: 51 percent
  • Messaging Formats (e.g., instant messaging, texts, and BlackBerry PIN messages): 44 percent
  • Social Media Data: 41 percent

Email requested in legal and regulatory processes just over half the time?  That’s more than surprising, that’s shocking!

Symantec’s survey also asked about implementation of a formal data retention policy, with 30 percent of responding companies indicating that they have discussed but have not implemented a policy and 14 percent indicating that they have no plans to implement a policy (44 percent total that have not implemented a policy).  Reasons for not doing so were as follows (respondents were allowed to pick multiple reasons):

  • No Need Identified: 41 percent
  • Cost: 38 percent
  • No Designated Employee (to implement the policy): 27 percent
  • Too Time Consuming: 26 percent
  • Lack of Expertise: 21 percent

Many of these companies may not feel compelled to implement a policy because they are not frequently in litigation nor are they in regulated industries.

So, what do you think?  Do the percentages above reflect your experience as to how frequently the different types of ESI are requested?  Does the email percentage seem significantly low?  In my experience, it does.  Please share any comments you might have or if you’d like to know more about a particular topic.

A Marriage Made for eDiscovery: EDRM and ARMA

 

EDRM has been busy lately, with a new Model Code of Conduct drafted recently and now this announcement.

As discussed in our recent two-part series on eDiscovery standards, there is a growing movement to develop industry standards, frameworks, or reference models to help manage eDiscovery. This week, there was perhaps a major move in that direction as the Electronic Discovery Reference Model (EDRM) and ARMA International announced that they would be collaborating on information governance guidelines for eDiscovery.

According to EDRM, the partnership began at LegalTech in New York back in February when ARMA reached out to suggest working together. The plan is still vague, but together these two groups hope to provide a framework for records management in the eDiscovery context. “I don’t know where this partnership will take us, but it’s just silly that two groups with similar goals and ideals would work in isolation,” says George Socha, an eDiscovery consultant and one of the co-founders and co-managers of EDRM.

Two years ago, EDRM started its Information Governance Reference Model, providing a conceptual framework for information governance. Today, the Information Governance Reference Model is primarily a rough guide for developing information management programs. But EDRM, which is a relatively small volunteer effort, hopes that the weight of the much larger ARMA will help flesh out the framework.

By contrast, the Association for Information Management Professionals (ARMA) International is an established and relatively large and influential group claiming 11,000 members in 30 countries. ARMA International has developed its Generally Accepted Record-keeping Principles, or GARP, framework to provide best practices for information management. The framework is aimed generally at records management, but has been designed to account for the demands of eDiscovery. Though ARMA’s core constituency is records managers, the demands of litigation have been driving many of the group’s recent initiatives.

Interestingly, as we’ve noted previously, ARMA has previously described the EDRM effort as falling “short of describing standards or best practices that can be applied to the complex issues surrounding the creation, management, and governance of electronic information.” However, the organization clearly believes EDRM’s network of experienced litigators and IT professionals will help it address the demands of eDiscovery.

If broad industry standards efforts are going to be developed, it will take more such efforts like this that cut across industries and bring expertise from different areas into alignment. Socha believes that though the EDRM and ARMA have traditionally served different groups, they have both realized that they are concerned with many of the same problems.  “A lot of the root causes of eDiscovery issues come from a failure to have your electronic house in order,” says Socha. “What the Information Governance Reference Model and GARP are about is addressing that issue.”

So, what do you think? Does the EDRM need ARMA? Or vice versa? Please share any comments you might have or if you'd like to know more about a particular topic.

eDiscovery Standards: Does the Industry Need Them?

 

eDiscovery Daily recently ran a three part series analyzing eDiscovery cost budgeting. Cost has long been a driving force in eDiscovery decision-making, but it is just one dimension in choosing EDD services. Other industries have well-established standards for quality – think of the automotive or software industries, which have standard measures for defects or bugs. This year there has been a rising call for developing industry standards in eDiscovery to provide quality measures.

There is a belief that eDiscovery is becoming more routine and predictable, which means standards of service can be established. But is eDiscovery really like manufacturing? Can you assess the level of service in EDD in terms of number of defects? Quality is certainly a worthy aim – government agencies have shifted away from cost being the single biggest justification for contract award, more heavily weighting quality of service in such decisions.  The question is how to measure quality in EDD.

Quality standards that offer some type of objective measure could theoretically provide another basis for decision-making in addition to cost. Various attempts have been made at creating industry standards over the years, but very little has yet been standardized. The recent DESI (Discovery of Electronically Stored Information) IV workshop at the International Conference on Artificial Intelligence and Law in June investigated possible standards. In the background to the conference, organizers bemoaned that “there is no widely agreed-upon set of standards or best practices for how to conduct a reasonable eDiscovery search for relevant evidence.”

Detractors say standards are just hoops for vendors to jump through, or checkboxes that don’t do much to differentiate one company from another. However, proponents believe industry standards could address issues like document defensibility, output formats, and how to go about finding responsive documents in a reasonable way – issues that can explode if not managed properly.

The Sedona Conference, Electronic Discovery Reference Model (EDRM), and Text Retrieval Conference (TREC) Legal Track all have efforts of one kind or another to establish standards for eDiscovery. EDRM provides a model for eDiscovery and standards of production. It has also led an effort to create a standard, generally accepted XML model to allow vendors and systems to more easily share electronically stored information (ESI). However, that applies to software vendors, and really doesn’t help the actual work of eDiscovery.

The Sedona Commentary on Achieving Quality in eDiscovery calls for development of standards and best practices in processing electronic evidence.  Candidates for broad industry standards include ISO 9000, which provides industry-specific frameworks for certifying organizations, and the Capability Maturity Model Integration (CMMI), which centers on improving processes.

The Association for Information Management Professionals (ARMA) is pushing its Generally Accepted Record-keeping Principles (GARP) framework to provide best practices for information management in the eDiscovery context. This article from ARMA is dismissive of information governance efforts such as the EDRM, which it says provides a framework for eDiscovery projects, but “falls short of describing standards or best practices that can be applied to the complex issues surrounding the creation, management, and governance of electronic information.”

Meanwhile, there are efforts underway to standardize pieces of the eDiscovery process. Law.com says that billing code standards are in the works to help clients understand what they are buying when they sign a contract for eDiscovery services.

Perhaps the most interesting and important effort is the TREC Legal Track, which began as a government research project into improving search results. The project garnered a fair amount of attention when it discovered that keyword searching was as effective as or better than many advanced concept searches and other technology that was becoming popular in the industry. Since that time, researchers have been trying to develop objective criteria for comparing methods for searching large collections of documents in civil litigation.

As of today, these efforts are largely unrelated, disjointed, or even dismissive of competing efforts. In my next post, I’ll dig into specific efforts to see if any make sense for the industry. So, what do you think? Are standards needed, or is it just a lot of wheel spinning? Please share any comments you might have or if you'd like to know more about a particular topic.

Editor's Note: Welcome Jason Krause as a guest author to eDiscovery Daily blog!  Jason is a freelance writer in Madison, Wisconsin. He has written about technology and the law for more than a dozen years, and has been writing about EDD issues since the first Zubulake decisions. Jason began his career in Silicon Valley, writing about technology for The Industry Standard, and later served as the technology reporter for the ABA Journal. He can be reached at jasonkrause@hotmail.com.

eDiscovery Trends: Cloud Covered by Ball

 

What is the cloud, why is it becoming so popular and why is it important to eDiscovery? These are the questions being addressed—and very ably answered—in the recent article Cloud Cover (via Law Technology News) by computer forensics and eDiscovery expert Craig Ball, a previous thought leader interviewee on this blog.

Ball believes that the fears about cloud data security are easily dismissed when considering that “neither local storage nor on-premises data centers have proved immune to failure and breach”. And as far as the cloud's importance to the law and to eDiscovery, he says, "the cloud is re-inventing electronic data discovery in marvelous new ways while most lawyers are still grappling with the old."

What kinds of marvelous new ways, and what do they mean for the future of eDiscovery?

What is the Cloud?

First, we have to understand just what the cloud is.  The cloud is more than just the Internet, although it's that, too. In fact, what we call "the cloud" is made up of three on-demand services:

  • Software as a Service (SaaS) covers web-based software that performs tasks you once carried out on your computer's own hard drive, without requiring you to perform your own backups or updates. If you check your email virtually on Hotmail or Gmail or run a Google calendar, you're using SaaS.
  • Platform as a Service (PaaS) happens when companies or individuals rent virtual machines (VMs) to test software applications or to run processes that take up too much hard drive space to run on real machines.
  • Infrastructure as a Service (IaaS) encompasses the use and configuration of virtual machines or hard drive space in whatever manner you need to store, sort, or operate your electronic information.

These three models combine to make up the cloud, a virtual space where electronic storage and processing is faster, easier and more affordable.

How the Cloud Will Change eDiscovery

One reason that processing is faster is through distributed processing, which Ball calls “going wide”.  Here’s his analogy:

“Remember that scene in The Matrix where Neo and Trinity arm themselves from gun racks that appear out of nowhere? That's what it's like to go wide in the cloud. Cloud computing makes it possible to conjure up hundreds of virtual machines and make short work of complex computing tasks. Need a supercomputer-like array of VMs for a day? No problem. When the grunt work's done, those VMs pop like soap bubbles, and usage fees cease. There's no capital expenditure, no amortization, no idle capacity. Want to try the latest concept search tool? There's nothing to buy! Just throw the tool up on a VM and point it at the data.”

Because the cloud is entirely virtual, operating on servers whose locations are unknown and mostly irrelevant, it throws the rules for eDiscovery right out the metaphorical window.

Ball also believes that everything changes once discoverable information goes into the cloud. "Bringing ESI beneath one big tent narrows the gap between retention policy and practice and fosters compatible forms of ESI across web-enabled applications".

"Moving ESI to the cloud," Ball adds, "also spells an end to computer forensics." Where there are no hard drives, there can be no artifacts of deleted information—so, deleted really means deleted.

What's more, “[c]loud computing makes collection unnecessary”. Where discovery requires that information be collected to guarantee its preservation, putting a hold on ESI located in the cloud will safely keep any users from destroying it. And because cloud computing allows for faster processing than can be accomplished on a regular hard drive, the search for discovery documents will move to where they're located, in the cloud. Not only will this approach be easier, it will also save money.

Ball concludes his analysis with the statement, "That e-discovery will live primarily in the cloud isn't a question of whether but when."

So, what do you think? Is cloud computing the future of eDiscovery? Is that future already here? Please share any comments you might have or if you'd like to know more about a particular topic.

eDiscovery Trends: An Insufficient Password Will Thwart Even The Most Secure Site

 

Several months ago, we talked about how most litigators have come to accept that Software-as-a-Service (SaaS) systems are secure.  For example, at Trial Solutions, the servers hosting data for our OnDemand® and FirstPass® (powered by Venio FPR™) platforms are housed in a Tier 4 data center in Houston (which is where our headquarters is).  The security at this data center is military grade: 24 x 7 x 365 onsite security guards, video surveillance, biometric and card key security required just to get into the building.  Not to mention a building that features concrete bollards, steel lined walls, bulletproof glass, and barbed wire fencing.

Pretty secure, huh?  Hacking into a system like this would be very difficult, wouldn’t you think?  I’ll bet that the CIA, PBS and Sony had secure systems as well; however, they were recently “hacked” by the hacker group LulzSec.  According to a recent study by the Ponemon Institute (linked to here via the Ride the Lightning blog), the chance of any business being hacked in the next 12 months is a “statistical certainty”.

No matter how secure a system is, whether it’s local to your office or stored in the “cloud”, an insufficient password that can be easily guessed can allow hackers to get in and steal your data.  Some dos and don’ts:

Dos:

  • If you need to write passwords down, write them down without the corresponding user IDs and keep them with important documents you’re unlikely to lose, such as your passport and Social Security card.  Or, better yet, use a password management application that encrypts and stores all of your passwords.
  • Mnemonics make great passwords.  For example, “I work for Trial Solutions in Houston, Texas” could become a password like “iw4tsiht”. (by the way, that’s not a password for any of my accounts, so don’t even try)  😉
  • Change passwords every few months.  Some systems require this anyway.
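The mnemonic trick above is easy enough to do in your head, but as a quick illustration, here’s a tiny Python sketch of the idea; the word-to-digit substitution table is my own assumption for the example, not a rule from the article:

```python
def mnemonic_password(phrase: str) -> str:
    """Build a password from the first letter of each word in a phrase,
    swapping a few common words for look-alike digits (e.g. 'for' -> '4')."""
    substitutions = {"for": "4", "to": "2", "one": "1"}
    chars = []
    for word in phrase.replace(",", "").split():
        w = word.lower()
        chars.append(substitutions.get(w, w[0]))
    return "".join(chars)

# The article's example phrase yields its example password:
print(mnemonic_password("I work for Trial Solutions in Houston, Texas"))  # iw4tsiht
```

A phrase only you know, run through a scheme only you know, produces a password that looks random to everyone else but is easy for you to reconstruct.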

Don’ts:

  • Don’t use the same password for multiple accounts, especially if they have sensitive data such as bank account or credit card information.
  • Don’t email passwords to yourself – if someone is able to hack into your email, then they have access to those accounts as well.
  • Personal information may be easy to remember, but it can also be easily guessed, so avoid using things like your kids’ names, birthday or other information that can be guessed by someone who knows you.
  • Avoid logging into sensitive accounts when using public Wi-Fi as it is much easier for hackers to tap into what you’re doing in those environments.  If you’re thinking of checking your bank balance while having a latte at Starbucks, don’t.

So, what do you think?  Are you guilty of any of the “don’ts” listed above?  Please share any comments you might have or if you’d like to know more about a particular topic.

Full disclosure: I work for Trial Solutions, which provides SaaS-based eDiscovery review applications FirstPass® (for first pass review) and OnDemand® (for linear review and production).  Our clients’ data is hosted in a secured, SAS 70 Type II certified Tier 4 Data Center in Houston, Texas.

eDiscovery Case Law: Discovery Violations Result in Sanctions Against Plaintiff and Counsel

Yesterday, we reported on a case with no sanctions; today, we report on a case with a different outcome.

Both the plaintiff and plaintiff’s counsel have been ordered to pay sanctions for discovery abuses in a lawsuit in Washington court that was dismissed with prejudice on June 8, 2011.

In Play Visions, Inc. v. Dollar Tree Stores, Inc., No. C09-1769 MJP (W.D. Wash. June 8, 2011), the plaintiff moved to voluntarily dismiss its case with prejudice. The defendants did not argue against dismissal but did seek sanctions from the plaintiff based on what they considered to be “a pattern of sanctionable discovery misconduct.” The court ruled that discovery abuses had occurred, and fined the plaintiff and plaintiff’s counsel $137,168.41 “jointly and severally”. The misconduct of the plaintiff, Play Visions, Inc., included:

  • Misrepresentation of Available Documents: Play Visions claimed that all relevant documents were kept in hard copy only; however, deposition of Play Visions’ CFO revealed that electronic records existed that should have been presented months earlier under discovery.
  • Falsified Expert’s Report: The plaintiff’s expert report was prepared by plaintiff’s counsel Mark Lorbiecki and only signed and “approved” by the expert. In addition, the court discovered that the plaintiff had violated the court’s protective order by revealing confidential information to the same expert witness.

As a result of these misrepresentations and other discovery abuses, the court granted the defendants’ motion and ordered the plaintiff and its counsel to pay sanctions:

  • The court found that Play Visions, Inc. had falsely certified that all relevant records had been saved in paper format and delayed the search and production of documents. Play Visions’ counsel was found to have been negligent in familiarizing himself with Play Visions’ document practices and to have failed in assisting his client in mandatory discovery.
  • Accordingly, the court considered every instance in which the defendant was forced to do extra work as a result of the plaintiff’s delays and inaccuracies, and fined Play Visions, Inc. and its counsel $137,168.41 jointly and severally, due within 15 days of the order.
  • Not finding “that the discovery violations in this case merit finding the entire case exceptional under 35 U.S.C. § 285”, the court ruled against shifting any attorney’s fees in this case.  Otherwise, the sanctions award could have been even higher!

So, what do you think? Do the discovery violations committed by Play Visions and by its attorney demand monetary sanctions on this scale? Did Play Visions actually believe that they had no relevant electronic files?  Please share any comments you might have, or let us know if you’d like to know more about a particular topic.

eDiscovery Best Practices: Avoiding eDiscovery Nightmares: 10 Ways CEOs Can Sleep Easier

 

I found this article in the CIO Central blog on Forbes.com from Robert D. Brownstone – it’s a good summary of issues for organizations to consider so that they can avoid major eDiscovery nightmares.  The author counts down his top ten list David Letterman style (clever!) to provide a nice, easy-to-follow summary of the issues.  Here’s a summary recap, with my ‘two cents’ on each item:

10. Less is more: The U.S. Supreme Court ruled unanimously in 2005 in the Arthur Andersen case that a “retention” policy is actually a destruction policy.  It’s important to routinely dispose of old data that is no longer needed to have less data subject to discovery and just as important to know where that data resides.  My two cents: A data map is a great way to keep track of where the data resides.

9. Sing Kumbaya: They may speak different languages, but you need to find a way to bridge the communication gap between Legal and IT to develop an effective litigation-preparedness program.  My two cents: Require cross-training so that each department can understand the terms and concepts important to the other.  And, don’t forget the records management folks!

8. Preserve or Perish: Assign the litigation hold protocol to one key person, either a lawyer or a C-level executive to decide when a litigation hold must be issued.  Ensure an adequate process and memorialize steps taken – and not taken.  My two cents: Memorialize is underlined because an organization that has a defined process and the documentation to back it up is much more likely to be given leeway in the courts than a company that doesn’t document its decisions.

7. Build the Three-Legged Stool: A successful eDiscovery approach involves knowledgeable people, great technology, and up-to-date written protocols.  My two cents: Up-to-date written protocols are the first thing to slide when people get busy – don’t let it happen.

6. Preserve, Protect, Defend: Your techs need the knowledge to avoid altering metadata, maintain chain-of-custody information and limit access to a working copy for processing and review.  My two cents: A good review platform will assist greatly in all three areas.

5. Natives Need Not Make You Restless: Consider exchanging files to be produced in their original/”native” formats to avoid huge out-of-pocket costs of converting thousands of files to image format.  My two cents: Be sure to address how redactions will be handled as some parties prefer to image those while others prefer to agree to alter the natives to obscure that information.

4. Get M.A.D.?  Then Get Even: Apply the Mutually Assured Destruction (M.A.D.) principle to agree with the other side to take off the table costly volumes of data, such as digital voicemails and back-up data created down the road.  My two cents: That’s assuming, of course, you have the same levels of data.  If one party has a lot more data than the other party, there may be no incentive for that party to agree to concessions.

3. Cooperate to Cull Aggressively and to Preserve Clawback Rights: Setting expectations regarding culling efforts and reaching a clawback agreement with opposing counsel enables each side to cull more aggressively to reduce eDiscovery costs.  My two cents: Some parties will agree on search terms up front while others will feel that gives away case strategy, so the level of cooperation may vary from case to case.

2. QA/QC: Employ Quality Assurance (QA) tests throughout review to ensure a high accuracy rate, then perform Quality Control (QC) testing before the data goes out the door, building time in the schedule for that QC testing.  Also, consider involving a search-methodology expert.  My two cents: I cannot stress that last point enough – the ability to illustrate how you got from the large collection set to the smaller production set will be imperative to responding to any objections you may encounter to the produced set.

1. Never Drop Your Laptop Bag and Run: Dig in, learn as much as you can and start building repeatable, efficient approaches.  My two cents: It’s the duty of your attorneys and providers to demonstrate competency in eDiscovery best practices.  How will you know whether they have or not unless you develop that competency yourself?

So, what do you think?  Are there other ways for CEOs to avoid eDiscovery nightmares?  Please share any comments you might have, or let us know if you’d like to know more about a particular topic.

eDiscovery Trends: The Best SaaS Providers are Certifiable

 

The increasing popularity of cloud-based Software-as-a-Service (SaaS) solutions is becoming well documented, with this very blog noting Forrester and Gartner predictions of tremendous growth in cloud computing over the next several years.  We’ve also noted the importance of knowing where your data is stored, as many online poker players learned the hard way when the recent US government crackdown on several gambling sites left them without a way to recover their funds.

If only there were some sort of certification, administered by an impartial third party, to ensure that your SaaS provider has implemented policies and processes that keep your information secure, stable and safe.  There is such a certification.

SAS 70 (Statement on Auditing Standards No. 70) defines the standards an auditor must employ to assess the contracted internal controls of a service provider. Service providers such as insurance claims processors, credit processing companies and, especially pertinent to eDiscovery, hosted data centers are evaluated against these standards. SAS 70 was developed by the American Institute of Certified Public Accountants (AICPA) as a simplification of a set of auditing criteria originally defined in 1988.  Standards such as SAS 70 became critical in the wake of Sarbanes-Oxley, which created significant legal penalties for publicly traded companies that lacked sufficient control standards for their financial information.

Under SAS 70, auditor reports are classified as either Type I or Type II. In a Type I report, the auditor evaluates the service provider’s controls designed to prevent accounting inconsistencies, errors and misrepresentation, and also assesses the likelihood that those controls will produce the desired results going forward. A Type II report goes a step further: it includes the same information as a Type I report, but the auditor also tests the effectiveness of the agreed-on controls since their implementation. Type II reports also incorporate data compiled over a specific time period, usually a minimum of six months.

SAS 70 reports are requested either by the service provider or by a user organization (i.e., a client). A service provider’s ability to furnish consistent service auditor’s reports builds a client’s trust and confidence in the provider, satisfying potential concerns. A SaaS (two a’s, as opposed to one for SAS) provider that has received SAS 70 Type II certification has demonstrated to an impartial third party a proven track record of policies and processes that protect its clients’ data.  When it comes to your data, you want a provider that has proven to be certifiable.

So, what do you think?  Is your SaaS provider SAS 70 Type II certified?  Please share any comments you might have, or let us know if you’d like to know more about a particular topic.

Full disclosure: I work for Trial Solutions, which provides SaaS-based eDiscovery review applications FirstPass® (for first pass review) and OnDemand® (for linear review and production).  Our clients’ data is hosted in a secured, SAS 70 Type II certified Tier 4 Data Center in Houston, Texas.