Posts by Doug Austin

Court Orders Plaintiff to Perform Some Requested Searches Despite the Plaintiff’s Claim that they’re “Unprecedented” – eDiscovery Case Law

 

In Swanson v. ALZA Corp., No.: CV 12-04579-PJH (KAW), 2013 U.S. Dist. (N.D. Cal. Oct. 7, 2013), California Magistrate Judge Kandis A. Westmore granted in part and denied in part the defendant's request to compel the plaintiff to apply its search terms to his ESI, ordering some of the searches to be performed despite the plaintiff’s assertion that “the application of Boolean searches was unprecedented”.

In this patent dispute, the plaintiff produced nearly 750,000 pages of documents to the defendant on a rolling basis between June and August 9, 2013, of which more than 600,000 pages were from ESI sources. During that period, the parties met and conferred regarding possible additional search terms designed to capture further documents responsive to the defendant's requests, but were unable to agree.  On July 17, the defendant filed a motion to compel demanding that its new terms be utilized to search the plaintiff's ESI, to which the plaintiff objected.

Part of the dispute involved the number of terms the defendant was requesting: the plaintiff characterized the defendant's request as proposing 89 additional terms, whereas the defendant characterized it as only 25 additional search "terms" (distinct searches containing multiple terms and Boolean operators). After the court ordered the parties to provide supplemental information to assist in resolving the underlying dispute on August 27, the defendant, believing documents were missing from the production based on the plaintiff’s sample searches, asked for access to the plaintiff’s database to run its own searches, or for the plaintiff to run a narrowed list of 11 “terms”. At the hearing, the plaintiff's counsel explained that his firm had run two sample ESI searches, which took 34 hours and 51 minutes, and 23 hours and 58 minutes, respectively.

Judge Westmore “was not persuaded by Plaintiff's argument that running modified searches would place such an undue burden as to relieve Plaintiff of his obligation to produce responsive documents” and did not agree with the plaintiff’s contention that “the application of Boolean searches was unprecedented”, stating “[t]his is actually not the case, and given the availability of technology, Boolean searches will undoubtedly become the standard, if, for no other reason, to limit ESI documents to those most likely to be relevant to pending litigation.”
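The “terms” at issue here were Boolean combinations of keywords rather than single words. As a minimal sketch of how such a search culls ESI (the documents and terms below are hypothetical illustrations; nothing here comes from the opinion):

```python
import re

# Hypothetical documents; the actual ESI and search terms are not in the opinion.
docs = {
    "doc1.txt": "patch formulation for transdermal delivery of fentanyl",
    "doc2.txt": "quarterly sales report",
    "doc3.txt": "transdermal patch adhesive testing notes",
}

def matches(text, all_of=(), any_of=(), none_of=()):
    """Boolean match: every 'all_of' term AND at least one 'any_of' term
    (if any are given) AND no 'none_of' term; case-insensitive whole words."""
    def has(term):
        return re.search(r"\b" + re.escape(term) + r"\b", text, re.IGNORECASE) is not None
    return (all(has(t) for t in all_of)
            and (not any_of or any(has(t) for t in any_of))
            and not any(has(t) for t in none_of))

# One "term" in the defendant's sense: (transdermal AND patch) NOT sales
hits = [name for name, text in docs.items()
        if matches(text, all_of=("transdermal", "patch"), none_of=("sales",))]
print(hits)  # ['doc1.txt', 'doc3.txt']
```

A search like this runs one pass per Boolean "term," which is why counting 25 compound searches as 25 terms (rather than 89 keywords) mattered to the burden argument.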

Ultimately, Judge Westmore ordered the plaintiff to deduplicate the results of its two sample searches and to perform three of the other requested searches, stating that “[w]hile Defendant has not shown that it should be entitled to have all 11 searches performed, the Court is persuaded that it would not be unduly burdensome for Plaintiff to perform the searches below in light of Plaintiff's unwillingness to produce its ESI database”.
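Deduplicating overlapping search results is commonly done by hashing each document’s content and keeping one copy per hash value. A minimal sketch, with hypothetical file names and contents:

```python
import hashlib

# Results of two overlapping sample searches (hypothetical contents).
search1 = {"a.msg": b"status update", "b.msg": b"lot 42 test results"}
search2 = {"b_copy.msg": b"lot 42 test results", "c.msg": b"shipping notice"}

def dedupe(*result_sets):
    """Keep one document per content hash across all result sets."""
    seen, unique = set(), {}
    for results in result_sets:
        for name, content in results.items():
            digest = hashlib.md5(content).hexdigest()
            if digest not in seen:
                seen.add(digest)
                unique[name] = content
    return unique

merged = dedupe(search1, search2)
print(sorted(merged))  # ['a.msg', 'b.msg', 'c.msg']
```

Note that b_copy.msg drops out because its content hash matches b.msg, which is exactly why deduplicating two overlapping searches can substantially shrink the review set.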

So, what do you think?  Was this a fair compromise?   Please share any comments you might have or if you’d like to know more about a particular topic.

Disclaimer: The views represented herein are exclusively the views of the author, and do not necessarily represent the views held by CloudNine Discovery. eDiscoveryDaily is made available by CloudNine Discovery solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscoveryDaily should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.

Vorys Project Ballpark Cost Estimator for ESI Processing and Review – eDiscovery Best Practices

On Tuesday, we discussed budget calculators available from the Metrics section of the Electronic Discovery Reference Model (EDRM) web site.  Today, we will begin a more in-depth discussion of the budget calculators, beginning with the E-Discovery Cost Estimator for Processing and Review workbook provided by Julie Brown at Vorys law firm.

As described on the site, this budget calculator contains two worksheets. The Linear-search-analytics worksheet allows users to calculate ballpark cost estimates for processing and review under three “cases” and compare the results. The cases are:

  • Case 1: Full blown processing and linear review
  • Case 2: Search terms used to cull data during processing
  • Case 3: Use analytical culling tool

With each case, users are able to see the cost consequences that result from changing variables such as Data Volume, Volume after culling, and Pre-processing cost/GB.  The cost differences are shown numerically, as well as via two graphs, a 3D horizontal bar graph that shows the cost differences between the three cases (see above graphic for an example) and a 2D horizontal bar graph that shows the cost differences, with a breakdown of processing and review costs for each.
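The arithmetic behind a worksheet like this is simple enough to sketch. The figures below are hypothetical placeholders, not values from the Vorys workbook, and the formula assumes processing is charged on the full collected volume while review is charged only on the culled volume:

```python
def ballpark_cost(volume_gb, cost_per_gb, review_gb, pages_per_gb,
                  pages_per_doc, docs_per_hour, hourly_rate):
    """Processing charged on the full volume; review charged on the culled volume."""
    processing = volume_gb * cost_per_gb
    docs_to_review = review_gb * pages_per_gb / pages_per_doc
    review = docs_to_review / docs_per_hour * hourly_rate
    return processing + review

# Hypothetical inputs: 100 GB collected at $200/GB processing, 5,000 pages/GB,
# 3 pages/doc, 50 docs reviewed per hour at $60/hour.
common = dict(volume_gb=100, cost_per_gb=200, pages_per_gb=5000,
              pages_per_doc=3, docs_per_hour=50, hourly_rate=60)

# Case 1: no culling; Case 2: search terms cull to 40 GB; Case 3: analytics cull to 15 GB.
for label, review_gb in [("Case 1", 100), ("Case 2", 40), ("Case 3", 15)]:
    print(f"{label}: ${ballpark_cost(review_gb=review_gb, **common):,.0f}")
```

Even with made-up rates, the pattern the worksheet illustrates emerges: review dominates the total, so every GB culled before review saves far more than it costs to process.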

The Linear-size examples worksheet allows users to compare four versions of Case 1. Users are able to see the cost consequences (in both numeric and 2D vertical bar graph form) that result from changing any combination of six variables: Data Volume, Processing Cost/GB, Pages per GB, Docs Reviewed by Hour, Hourly Rate, and FTEs.

Both worksheets provide useful information and are well controlled: data entry cells (no fill color) are differentiated from calculation-only cells (fill color), and the sheets are protected to prevent accidental overwriting of the calculated cells (they aren’t locked with a password, so you can unprotect them if you want to make adjustments).  The workbook is designed to help you generate a ballpark cost for processing and review based on the variables provided, so it doesn’t include fixed overhead costs such as software, hardware or facility costs.  It also doesn’t include management overhead, so it’s essentially a model for variable costs only, but it can be useful to help you determine at what volume an analytical culling tool might pay for itself.
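That break-even question reduces to a one-line formula: the tool pays for itself once the review cost it saves exceeds its fee. All figures below are hypothetical, chosen only to illustrate the calculation:

```python
# All figures hypothetical, for illustration only.
tool_fee = 25000           # fixed cost of the analytical culling tool ($)
review_cost_per_gb = 2000  # loaded review cost per GB without culling ($)
reduction = 0.60           # fraction of the review set the tool eliminates

# The tool pays for itself once saved review cost exceeds its fee:
#   volume_gb * review_cost_per_gb * reduction > tool_fee
break_even_gb = tool_fee / (review_cost_per_gb * reduction)
print(round(break_even_gb, 1))  # 20.8
```

Under these assumed numbers, any matter with more than about 21 GB headed to review would justify the tool; your own rates will move that threshold.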

Suggestions for improvement:

  • Create a common section for data entry variables so you don’t have to re-enter them for each comparison case, saving time and avoiding data inconsistencies;
  • While you’re at it, add variables for pages per document and hours per week – right now, you have to unprotect the sheet and change the formulas if you want to change those variables (not all document sets or work weeks are the same);
  • Add sheets to compare versions of Case 2 and Case 3, like the sheet for Case 1.

Nonetheless, this workbook is quite useful if you want to obtain a ballpark estimate and comparison for processing and review and compare costs for alternatives.  You can download this calculator individually or a zip file containing all four calculators here.  After the first of the year, we will continue our review of the current EDRM budget calculators in more detail.

So, what do you think?  How do you estimate eDiscovery costs?   Please share any comments you might have or if you’d like to know more about a particular topic.


Want to Estimate your eDiscovery Budget? Use One of These Calculators – eDiscovery Best Practices

It has been a busy year for the Electronic Discovery Reference Model (EDRM).  In addition to announcing a transition to nonprofit status by May 2014, since the May annual meeting, several EDRM projects (Metrics, Jobs, Data Set and the new Native Files project) have already announced new deliverables and/or requested feedback.  Now, another resource is available via the EDRM site – Budget Calculators!

It can be difficult to estimate the total costs for eDiscovery at the outset of a case.  There are a number of variables and options that could impact the budget by a wide margin and it may be difficult to compare costs for various options for processing and review.  However, thanks to the EDRM Metrics team and contributing members, budget calculator Excel workbooks are available to enable you to at least “ballpark” the costs.  The budget calculator spreadsheets are designed to help organizations estimate likely eDiscovery costs, based on assumptions that you provide, such as average hourly rates for contract reviewers or average number of pages per document.

There are four budget calculators that are currently available.  They are:

  • UF LAW E-Discovery Project Ballpark Cost Estimator for ESI Processing and Review: This budget calculator contains two worksheets. The first worksheet allows users to calculate ballpark cost estimates for processing and review under three “cases” (Full blown processing and linear review, Search terms used to cull data during processing and Use analytical culling tool) and compare the results.  The second worksheet allows users to compare four versions of Case 1.  This workbook has been provided by University of Florida Levin College of Law and Vorys law firm.
  • Doc Review Cost Calculator: This budget calculator focuses on review. From assumptions entered by users, it calculates per-document and per-hour (a) low and high price estimates, (b) low and high costs on a per page basis, and (c) low and high costs on a per document basis.
  • ESI Cost Budget: This budget calculator estimates costs by project phase. The phases are: ESI Collection, ESI Processing, Paper Collection and Processing, Document Review, Early Data Assessment, Phase 1 Review, Phase 2 Review, Production, Privilege Review, Review of Opposition’s Production and Hosting Costs.  This workbook has been provided by Browning Marean, DLA Piper law firm.
  • EDRM UTBMS eDiscovery Code Set Calculator: This budget calculator uses the UTBMS e-discovery codes as a starting point for calculating estimated e-discovery expenses. Users enter anticipated average hourly rates for: Partners, Associates, Paralegals, Contract reviewers, In-house resources and Vendors, along with total estimated hours for each relevant group and total estimated associated disbursements for each relevant L600-series UTBMS code.  The spreadsheet then displays: a summary of the estimated costs, details of the estimated costs for each combination, totals by type of person and totals by individual and higher-level UTBMS codes.  This workbook has been provided by Browning Marean, DLA Piper law firm; and George Socha, Socha Consulting.
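As an illustration of the review-only math in the Doc Review Cost Calculator, here is a sketch with hypothetical unit rates (the calculator's actual assumptions, inputs and layout may differ):

```python
def review_cost_range(num_docs, pages_per_doc,
                      low_per_doc, high_per_doc,
                      low_per_page, high_per_page):
    """Low/high review cost estimates on both a per-document and a per-page basis."""
    per_doc = (num_docs * low_per_doc, num_docs * high_per_doc)
    per_page = (num_docs * pages_per_doc * low_per_page,
                num_docs * pages_per_doc * high_per_page)
    return {"per_document": per_doc, "per_page": per_page}

# Hypothetical inputs: 10,000 documents averaging 3 pages each.
est = review_cost_range(num_docs=10000, pages_per_doc=3,
                        low_per_doc=0.90, high_per_doc=1.50,
                        low_per_page=0.30, high_per_page=0.55)
print(est["per_document"])  # low/high cost on a per-document basis
print(est["per_page"])      # low/high cost on a per-page basis
```

Comparing the two bases side by side, as the calculator does, shows quickly whether a vendor's per-page or per-document pricing is the better deal for a given document profile.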

You can download each calculator individually or a zip file containing all four calculators.  If you have your own budget calculator, you can also submit yours to EDRM to share with others.  The calculators are available here.  On Thursday, we will begin reviewing the current budget calculators in more detail.

So, what do you think?  How do you estimate eDiscovery costs?   Please share any comments you might have or if you’d like to know more about a particular topic.


Without Meet and Confer Approval of its “Triangulating” Approach to Discovery, Defendant Ordered to Supplement Production – eDiscovery Case Law

 

In Banas v. Volcano Corp., No. 12-cv-01535-WHO (N.D. Cal. Oct. 4, 2013), California District Judge William H. Orrick considered a defendant’s approach to discovery in which it identified the relevant documents by "triangulating" the defendant's employees, an approach that wasn’t discussed with the plaintiff beforehand in a meet and confer.  Although the court did “not find that defendant's production technique was unreasonable”, the defendant was ordered to supplement its responses because the approach wasn’t discussed and it left out multiple deponents.

In what was described as a “tentative” ruling, the facts were laid out as follows:

  • In order to address the massive discovery required in this case, the defendant decided to identify the relevant documents by "triangulating" the defendant's employees. Rather than search every employee's emails, the defendant selected a subset of employees who would likely have received documents from, or sent them to, other employees who might have had involvement in this matter, so that the result would "most likely" capture all the relevant documents. In discovery, the defendant produced more than 225,000 documents. There was no agreement or even discussion between the parties about defendant's triangulation approach before the documents were produced.
  • Because of the volume of discovery in this case, documents were produced on a rolling basis. The last group of documents was produced shortly before the close of fact discovery.
  • Plaintiffs took the depositions of some 18 current or former employees of defendant. At least some of those witnesses were not within the subset of employees whose emails were searched directly by defendant.
  • The plaintiff had a hard drive that contained various documents he received while employed by the defendant. The plaintiffs compared the documents on his hard drive with the documents produced by the defendant regarding one employee they deposed and found that the defendant had produced a small fraction of the documents held by the plaintiff involving that deponent (the parties disputed his importance to the litigation).
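The discrepancy the plaintiffs found is the kind of gap a simple content-hash comparison surfaces: hash the documents on the hard drive, hash the production, and take the set difference. A minimal sketch (names and contents are hypothetical, not from the case record):

```python
import hashlib

def hashes(docs):
    """Map each document's SHA-1 content hash to its name."""
    return {hashlib.sha1(content).hexdigest(): name
            for name, content in docs.items()}

# Hypothetical contents: the deponent's documents held by the plaintiff
# versus what appeared for that deponent in the defendant's production.
hard_drive = {"memo1": b"device trial notes", "memo2": b"sales forecast",
              "memo3": b"complaint draft"}
production = {"PROD0001": b"device trial notes"}

missing = set(hashes(hard_drive)) - set(hashes(production))
names = sorted(hashes(hard_drive)[h] for h in missing)
print(names)  # ['memo2', 'memo3']
```

A comparison like this is also a cheap sanity check on any custodian-subset collection strategy before the close of fact discovery, rather than after.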

Because of the discrepancy between the documents produced by the defendant and those contained in the plaintiff's hard drive, the plaintiff requested that the defendant search the electronic files of the witnesses whom plaintiffs deposed to ensure that the production is complete.  Though Judge Orrick did “not find that defendant's production technique was unreasonable”, he found that the plaintiff’s request was reasonable and “[u]nless there was an agreement concerning the ‘triangulation’ approach”, ordered the defendant to “perform this supplementary search and to produce any non-duplicative items”.

So, what do you think?  Could this dispute have been avoided by a meet and confer?   Please share any comments you might have or if you’d like to know more about a particular topic.


The Categories Are… – eDiscovery Trends

Alex Trebek has been uttering the phrase “the categories are” for years on the popular game show Jeopardy®. But, do you know how to take advantage of the “Categories” feature on this blog?

As a daily blog that has been around for over 3 years, eDiscoveryDaily has published 820 posts to date (we should hit our 1,000 post milestone sometime next summer!). We’ve covered eDiscovery trends, key case law decisions and best practices, among other things. We have yet to remove any posts that we’ve published from our site – as a result, we have developed quite a knowledge base resource about eDiscovery topics. Even though I’ve written most of the posts on this blog, I find myself using it from time to time, because, honestly, my brain can only retain so much… 🙂

One of the useful features that our site provides is the Library section. You should see it on the left sidebar underneath the Subscription section. There are two sub-sections that can be useful, Categories and Monthly Archives. As the name implies, Monthly Archives provides an entry for each month’s set of posts – all the way back to the launch of the blog in September 2010. It’s a great way to catch up on topics if you’ve missed them.

As for the Categories sub-section, you may have noticed at the bottom of each post under the Disclaimer, there is a “Filed under” section to show the categories that the post is filed under. Most posts relate to at least a couple of categories (for example, this one relates to Electronic Discovery and Industry Trends; not surprisingly, almost every post relates to electronic discovery because, after all, this is an eDiscovery blog).

The Categories drop down in the Library section enables you to see the classification categories that we use and select a category of interest. So, if you’re interested in viewing the posts related to Case Law (260 of them to date, including yesterday’s post), simply select ‘Case Law’ from the drop down and the site will display a listing of post summaries, starting with the most recent post.

Do you have an interest in activities within the Electronic Discovery Reference Model (EDRM) or want to know more about Information Governance? Want to know more about Federal eDiscovery Rules or State eDiscovery Rules? Would you like to review cases where Sanctions have been applied, or at least considered? One simple click is all it takes to get there.

Feel free to not only read each daily post, but also to use this blog as a knowledge base resource. If it’s a significant eDiscovery trend, key case law decision or best practice over the past 3+ years, we probably have it here.

So, what do you think? Are there additional categories that you’d like to see us track? Please share any comments you might have or if you’d like to know more about a particular topic.

Image © 2013 – Jeopardy Productions, Inc.


Court Declines to Impose Default Judgment, But Orders Searchable Production and Extends Deadlines – eDiscovery Case Law

 

In Kwan Software Engineering, Inc. v. Foray Technologies, LLC, No. C 12-03762 SI (N.D. Cal. Oct. 1, 2013), California District Judge Susan Illston denied the plaintiff’s motion for terminating sanctions against the defendant for late, non-searchable productions, but did order the defendant to produce documents in a searchable format with metadata and extended the pretrial schedule so that the plaintiff would not be prejudiced by the late productions.

After being court-ordered to produce documents by August 20, 2013, the defendant produced the majority of the documents (over 200,000 pages) after the deadline on September 13 and September 25.  As a result, the plaintiff filed a motion for terminating sanctions against the defendant pursuant to Federal Rule of Civil Procedure 37(b), based on the defendant's untimely production of documents. In the alternative, the plaintiff sought the following sanctions:

(1)   that within three business days, the defendant must produce all documents responsive to the plaintiff's requests in searchable electronic format with metadata included;

(2)   that the plaintiff's non-expert and expert discovery cut-offs be unilaterally extended to December 15, 2013;

(3)   that the defendant be precluded from using any documents produced after August 20, 2013, for any purpose; and

(4)   that the defendant be required to pay monetary sanctions in the amount of $2,880 to reimburse the plaintiff for the costs of its motion.

The plaintiff also requested that the Court leave the trial date unaltered.

Noting that a terminating sanction should only be imposed in “extreme circumstances”, Judge Illston described the factors to be considered, as follows:

(1)   the public's interest in expeditious resolution of litigation;

(2)   the court's need to manage its docket;

(3)   the risk of prejudice to the other party;

(4)   the public policy favoring the disposition of cases on their merits; and

(5)   the availability of less drastic sanctions.

After consideration of the above factors, Judge Illston declined to impose a terminating sanction, noting that a “sanction less drastic than default is available to remedy any potential prejudice” to the plaintiff.  She did order the defendant to produce the documents in a searchable format with metadata and amended the pretrial schedule for both parties, extending the non-expert and expert discovery cut-offs and the deadlines to designate expert witnesses.  In its response, the defendant had stated that “it can produce the documents in a searchable format with metadata in a week”.

So, what do you think?  Should the request for terminating sanctions have been granted?   Please share any comments you might have or if you’d like to know more about a particular topic.


Moneycase: Should Your Law Practice Be Run Like a Baseball Team? — eDiscovery Trends

Remember the movie Moneyball (adapted from the book of the same name), about Oakland A’s general manager Billy Beane’s use of computer-generated analytics to pick his players and assemble a team that advanced to the playoffs while spending a fraction of the budget of other teams?  Can law firms learn from that example?

According to Angela Hunt in a recent article in Law Technology News (Why Attorneys Love-Hate Data Analytics), maybe they can.  As she notes in her article, James Michalowicz, managing director of Huron Legal, advises firms to use big data and performance metrics to minimize legal spending.

Like the old-time baseball experts in Moneyball who scoffed at the use of computer analytics to pick baseball players, some attorneys question the benefits in the legal arena.  “As much as I think the use of analytics is now penetrating the sports world, I think it’s slower in the legal world,” Michalowicz told Law Technology News. Since a law firm’s value depends heavily on its legal knowledge base, installing a program that does all the heavy thinking can make attorneys feel like their hard-earned legal education is being undermined, explains Michalowicz. “There’s this emotional piece to it. Lawyers don’t want to rely on data. It’s a challenge to their pride.”

However, for large firms and corporations that deal with litigation regularly, Michalowicz recommends using strategic case analytics, a predictive technology that helps attorneys pick their battles.  As the article notes, “[b]y evaluating venue data and case histories within a jurisdiction, law firms and corporate legal departments can give unbiased advice on whether to litigate or settle.”

For the past three years at LegalTech New York (LTNY), we have conducted and published a Thought Leader Series of interviews with various thought leaders in the litigation and eDiscovery industry (here’s the link to this year’s set of interviews).  One of the interviews was with Don Philbin, President and Founder of Picture It Settled®, a predictive analytics tool for the settlement negotiation process.  To support this process, they collected data for about ten thousand cases – not just the outcomes, but also the incremental moves that people make in negotiation.  If Billy Beane were an attorney, he’d love it!

Over the next few weeks, we’ll look at other analytics mechanisms to improve efficiency in the litigation and discovery process.

So, what do you think?  Do you employ any data analytics in your discovery practice?   Please share any comments you might have or if you’d like to know more about a particular topic.

Image © 2011 – Sony Pictures


He Sees You When You’re Sleeping — eDiscovery Trends

 

A recent post in the Law Librarians Blog illustrates not only the different ways in which personal data can be captured, but also the continued growth of devices that might contain that data.

In He Sees You When You’re Sleeping, He Knows When You’re Awake…, the authors discuss potential tracking of mouse movements, current data tracking on smart TVs and even the possibility for data to be kept and tracked on…your toothbrush:

  • An October story from Ars Technica discusses how Facebook is working on a way to log cursor movements, beyond tracking where someone clicks on a page to determine an ad’s effectiveness.  According to the Wall Street Journal, Facebook wants to pay attention to the areas a cursor lingers over, even without a click or other interaction.  And, if you’re using a mobile device, Facebook will still be noting when, for instance, “a user’s newsfeed is visible at a given moment on the screen of his or her mobile phone.”
  • Imagine if your toothbrush could keep track of your brushing habits.  According to ZDNet, Salesforce CEO Marc Benioff sees that happening.  “Everything is on the Net. And we will be connected in phenomenal new ways,” said Benioff, who highlighted how his toothbrush of the future will be connected. The new Philips toothbrush is Wi-Fi based and has GPS. “When I go into the dentist he won't ask if I brushed. He will say what's your login to your Philips account. There will be a whole new level of transparency with my dentist”.
  • One device that is already capturing your personal data is the smart TV, in some cases whether you want it to or not.  A blogger in the U.K. discovered that his LG smart TV sends details about his viewing habits back to LG servers.  Those habits include the file names of items viewed from a connected USB stick.  There is a setting in the TV that purports to turn this behavior off (it’s on by default), but it doesn’t work: data is forwarded to LG regardless of the setting.  LG’s response to the disclosure was less than reassuring – “The advice we have been given is that unfortunately as you accepted the Terms and Conditions on your TV, your concerns would be best directed to the retailer,” the representatives wrote in a response to the blogger. “We understand you feel you should have been made aware of these T’s and C’s at the point of sale, and for obvious reasons LG are unable to pass comment on their actions.”

Nice.  Imagine a case where, in addition to hard drives and smart phones, data collectors need to perform collection from flatscreen TVs and toothbrushes.  If that sounds farfetched, remember that, several years ago, cell phones didn’t store data and text messages didn’t even exist.

So, what do you think?  What is the most unusual device from which you’ve ever collected data?   Please share any comments you might have or if you’d like to know more about a particular topic.


November Pop Quiz Answers! – eDiscovery Trends

Yesterday, we gave you a pop quiz for the topics we’ve covered in November. If you’re reading the blog each day, these questions should be easy! Let’s see how you did. Here are the answers.

 

1. Which of the following is NOT an approach for collection as described by the published EDRM Collection Standards document?
   A. Forensic Image (Physical or Logical Target)
   B. Custom Content/Targeted Image
   C. Custom Content/Non-Targeted Image
   D. Non-Forensic Copy

2. Which judge just published a Discovery Order for use in his District Court?
   A. Shira Scheindlin
   B. Lee Rosenthal
   C. Andrew Peck
   D. Paul Grimm

3. Which US Senator recently voiced concerns about the proposed changes to the Federal Rules regarding discovery?
   A. Barbara Boxer
   B. Christopher Coons
   C. Dick Durbin
   D. Marco Rubio

4. In what recent case was the plaintiff’s motion to compel denied because the defendant didn’t have “possession, custody, or control” of the evidence?
   A. Kickapoo Tribe in Kansas v. Nemaha Brown Watershed Joint District No. 7
   B. Apple v. Samsung
   C. Crispin v. Christian Audigier Inc.
   D. Novick v. AXA Network

5. How big does the Radicati Group project the market for eDiscovery solutions will grow by 2017?
   A. $3.5 billion
   B. $3.6 billion
   C. $3.7 billion
   D. $3.8 billion

6. What are “container files”?
   A. A redweld containing paper documents
   B. A file that stores one or more images
   C. A file that stores one or more files in a compressed form
   D. None of the above

7. In which case is a party (and their counsel) facing sanctions for disclosure of confidential agreements?
   A. Kickapoo Tribe in Kansas v. Nemaha Brown Watershed Joint District No. 7
   B. Apple v. Samsung
   C. Crispin v. Christian Audigier Inc.
   D. Novick v. AXA Network

8. Which file format yields, on average, the most pages per GB?
   A. Text files
   B. Email files
   C. Microsoft Word files
   D. Image files

9. In which case was cost-shifting ruled inappropriate where data was kept in an accessible format?
   A. Kickapoo Tribe in Kansas v. Nemaha Brown Watershed Joint District No. 7
   B. Apple v. Samsung
   C. Crispin v. Christian Audigier Inc.
   D. Novick v. AXA Network

10. Which of the following is NOT a useful LinkedIn group for eDiscovery information?
   A. Electronic Discovery Professionals
   B. Association of Litigation Support Professionals
   C. The Discover Network
   D. All of the above are useful groups for eDiscovery information

 

 

How did you do?  Next month, you’ll get another chance with December topics.  As always, please let us know if you have questions or comments, or if there are specific topics you’d like to see covered.

Disclaimer: The views represented herein are exclusively the views of the author, and do not necessarily represent the views held by CloudNine Discovery. eDiscoveryDaily is made available by CloudNine Discovery solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscoveryDaily should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.

November Pop Quiz! – eDiscovery Trends

Did you think we forgot to quiz you about last month’s topics?  Thankfully, no!  Like we did for July, August and September/October (answers here, here and here, respectively), here is a pop quiz for the topics we covered in November.  If you’re reading the blog each day, these questions should be easy!  If not, we’ve provided a link to the post with the answer.  We’re that nice.  Test your knowledge!  Tomorrow, we’ll post the answers for those who don’t know and didn’t look them up.

1.  Which of the following is NOT an approach for collection as described by the published EDRM Collection Standards document?

A.    Forensic Image (Physical or Logical Target)

B.    Custom Content/Targeted Image

C.    Custom Content/Non-Targeted Image

D.    Non-Forensic Copy

2.  Which judge just published a Discovery Order for use in his District Court?

A.    Shira Scheindlin

B.    Lee Rosenthal

C.    Andrew Peck

D.    Paul Grimm

3.  Which US Senator recently voiced concerns about the proposed changes to the Federal Rules regarding discovery?

A.    Barbara Boxer

B.    Christopher Coons

C.    Dick Durbin

D.    Marco Rubio

4.  In what recent case was the plaintiff’s motion to compel denied because the defendant didn’t have “possession, custody, or control” of the evidence?

A.    Kickapoo Tribe in Kansas v. Nemaha Brown Watershed Joint District No. 7

B.    Apple v. Samsung

C.    Crispin v. Christian Audigier Inc.

D.    Novick v. AXA Network

5.  To what size does the Radicati Group project the market for eDiscovery solutions will grow by 2017?

A.    $3.5 billion

B.    $3.6 billion

C.    $3.7 billion

D.    $3.8 billion

6.  What are “container files”?

A.    A redweld containing paper documents

B.    A file that stores one or more images

C.    A file that stores one or more files in a compressed form

D.    None of the above

7.  In which case is a party (and its counsel) facing sanctions for disclosure of confidential agreements?

A.    Kickapoo Tribe in Kansas v. Nemaha Brown Watershed Joint District No. 7

B.    Apple v. Samsung

C.    Crispin v. Christian Audigier Inc.

D.    Novick v. AXA Network

8.  Which file format yields, on average, the most pages per GB?

A.    Text files

B.    Email files

C.    Microsoft Word files

D.    Image files

9.  In which case was cost-shifting ruled inappropriate where data was kept in an accessible format?

A.    Kickapoo Tribe in Kansas v. Nemaha Brown Watershed Joint District No. 7

B.    Apple v. Samsung

C.    Crispin v. Christian Audigier Inc.

D.    Novick v. AXA Network

10. Which of the following is NOT a useful LinkedIn group for eDiscovery information?

A.    Electronic Discovery Professionals

B.    Association of Litigation Support Professionals

C.    The Discover Network

D.    All of the above are useful groups for eDiscovery information

As always, please let us know if you have questions or comments, or if there are specific topics you’d like to see covered.
