Posted on: June 02, 2015 in Blog
5 Discovery Analytics Workflows for Small Cases
While it is true that predictive coding and technology-assisted review are better suited to cases with more than 100,000 document records, there is a common misperception that only large and very large cases can benefit from the application of analytics. A variety of tools and workflows fall under the umbrella term "analytics." These include, but are not limited to, email threading, inclusive email review, near-duplicate detection, clustering, categorization, and conceptual search.
When leveraging analytics technology in managed review, the power of defensible, highly prioritized reviews can be adapted to the needs of each case.
Depending on the data set and goals of the review, small cases may see greater benefits from the application of analytics than large multi-custodian, big data cases.
Here are five analytics workflows that will reduce, or possibly even eliminate, document review in the typical case under 100,000 records.
1. Email Threading and Inclusive Review
Email threading is one of the most commonly used tools in the analytics toolkit, but too frequently it is overlooked for small-volume, or single-mailbox, email review. Regardless of the amount of email, organizing any review by email thread improves both reviewer speed and consistency. When the review is limited to just the inclusive emails (emails that represent the end of a conversation, or those forwarding an attachment that was left off of a later reply), the review can be reduced by as much as 30 percent. For a review of 3,000 emails, that is a savings of up to 900 documents, or more than a day of review at typical rates. Email threading can and should be applied to every case in which email will be reviewed.
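To make the idea concrete, here is a minimal sketch of subject-based threading in Python. It assumes each email is a dict with hypothetical `subject` and `ts` (timestamp) fields; production threading engines rely on Message-ID / In-Reply-To headers and also flag branch messages and attachment-bearing messages as inclusive, not just the latest reply.

```python
from collections import defaultdict

def normalize_subject(subject):
    """Strip reply/forward prefixes so replies share one thread key."""
    s = subject.strip()
    changed = True
    while changed:
        changed = False
        for prefix in ("re:", "fw:", "fwd:"):
            if s.lower().startswith(prefix):
                s = s[len(prefix):].strip()
                changed = True
    return s.lower()

def thread_emails(emails):
    """Group emails by normalized subject; within each thread, sort by
    timestamp and flag the latest message as the 'inclusive' one."""
    threads = defaultdict(list)
    for email in emails:
        threads[normalize_subject(email["subject"])].append(email)
    inclusive = []
    for msgs in threads.values():
        msgs.sort(key=lambda m: m["ts"])
        inclusive.append(msgs[-1])  # end-of-conversation message
    return dict(threads), inclusive

emails = [
    {"id": 1, "subject": "Q2 forecast", "ts": 1},
    {"id": 2, "subject": "RE: Q2 forecast", "ts": 2},
    {"id": 3, "subject": "Fwd: RE: Q2 forecast", "ts": 3},
    {"id": 4, "subject": "Vendor contract", "ts": 1},
]
threads, inclusive = thread_emails(emails)
```

In this toy set, four emails collapse into two threads, and only the two inclusive messages need a first-pass review.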
2. Near-Duplication Comparison of Separate Collections
Data typically comes in on a rolling basis, and with smaller cases it is likely that one set of data will be completely reviewed before the second wave hits. While it may be difficult to find time to review the second set, the good news is that, by using near-duplicate detection, you can leverage the document coding from your first review to supplement the coding of the second. Near-duplication compares document text and groups textually similar documents together. By identifying the near-duplicate groups that contain documents from both collections, you can simply, and quickly, pass the coding from the first set to the second with the click of a button.
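The mechanics can be sketched with character shingles and Jaccard similarity, a common way to approximate textual near-duplication. The document structure, field names, and the 0.8 threshold below are illustrative assumptions; commercial tools use tuned similarity engines rather than this simple comparison.

```python
def shingles(text, k=5):
    """Character k-gram shingles of whitespace-normalized text."""
    t = " ".join(text.lower().split())
    return {t[i:i + k] for i in range(max(1, len(t) - k + 1))}

def jaccard(a, b):
    """Jaccard similarity of two shingle sets."""
    return len(a & b) / len(a | b) if a | b else 0.0

def propagate_coding(reviewed, unreviewed, threshold=0.8):
    """Suggest the coding call from a reviewed document for any
    unreviewed document whose text similarity clears the threshold."""
    suggestions = {}
    for doc in unreviewed:
        s = shingles(doc["text"])
        best, best_sim = None, 0.0
        for r in reviewed:
            sim = jaccard(s, shingles(r["text"]))
            if sim > best_sim:
                best, best_sim = r, sim
        if best is not None and best_sim >= threshold:
            suggestions[doc["id"]] = best["coding"]
    return suggestions

reviewed = [{"id": "A", "coding": "responsive",
             "text": "The merger agreement was signed on Friday by both parties."}]
unreviewed = [
    {"id": "B", "text": "The merger agreement was signed on Friday by both parties"},
    {"id": "C", "text": "Lunch menu for the office party next week."},
]
suggestions = propagate_coding(reviewed, unreviewed)
```

Document B inherits the "responsive" call from its near-duplicate A, while the unrelated document C is left for human review.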
3. Clustering Each Custodian Separately
With smaller datasets, clustering can be a powerful workflow. When clustering is applied to a smaller dataset, there are fewer files to skew the automated process of grouping together conceptually similar documents. Small document clusters make it simple to prioritize (or de-prioritize) specific sets of documents for review. With clustering, large chunks of data can be quickly evaluated—and possibly eliminated—before any review is performed, all with little effort from the review team.
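As a rough illustration of the grouping step, here is a single-pass "leader" clustering over bag-of-words vectors: each document joins the first cluster whose leader it resembles, or founds a new one. This is a deliberately simple stand-in for the conceptual clustering in commercial analytics engines, and the 0.3 similarity threshold is an arbitrary assumption.

```python
import math
from collections import Counter

def vectorize(text):
    """Bag-of-words term-frequency vector."""
    return Counter(w for w in text.lower().split() if w.isalpha())

def cosine(a, b):
    """Cosine similarity between two term-frequency vectors."""
    dot = sum(a[t] * b[t] for t in a if t in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def leader_cluster(docs, threshold=0.3):
    """Single-pass leader clustering: each doc joins the first cluster
    whose leader it resembles, otherwise it starts a new cluster."""
    clusters = []  # list of (leader_vector, member ids)
    for doc in docs:
        vec = vectorize(doc["text"])
        for leader_vec, members in clusters:
            if cosine(vec, leader_vec) >= threshold:
                members.append(doc["id"])
                break
        else:
            clusters.append((vec, [doc["id"]]))
    return [members for _, members in clusters]

docs = [
    {"id": "A", "text": "quarterly budget review meeting budget"},
    {"id": "B", "text": "budget review for the quarterly meeting"},
    {"id": "C", "text": "company picnic volleyball signup sheet"},
]
result = leader_cluster(docs)
```

The two budget documents land in one cluster and the picnic document in another, so a reviewer could de-prioritize the picnic cluster without opening it.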
4. Sample Clusters to Find Categorization Examples
As stated above, small cases do not make good predictive coding candidates. That said, you can mimic a technology-assisted review workflow by running clustering, generating a random sample of each cluster, identifying exemplar documents in each sample set, and using those exemplars to categorize the documents into 10 categories of your choosing. This workflow combines clustering and categorization with a limited review of your documents, allowing you to issue-code an entire population of similar documents. Best of all, if the results are limited because the samples did not surface enough quality exemplars, you can simply generate another round.
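The sample-then-categorize loop can be sketched as follows: draw a small random sample per cluster for human review, then assign every remaining document the issue code of its most similar reviewed exemplar. Token-overlap similarity, the field names, and the 0.2 floor are illustrative assumptions, not any vendor's actual method.

```python
import random

def tokens(text):
    return {w for w in text.lower().split() if w.isalpha()}

def overlap(a, b):
    """Jaccard overlap of two token sets."""
    return len(a & b) / len(a | b) if a | b else 0.0

def sample_cluster(cluster_docs, n=2, seed=42):
    """Draw a small random sample from a cluster for exemplar review."""
    rng = random.Random(seed)
    return rng.sample(cluster_docs, min(n, len(cluster_docs)))

def categorize(docs, exemplars, min_sim=0.2):
    """Assign each document the issue code of its most similar exemplar,
    or 'uncategorized' if nothing clears the similarity floor."""
    result = {}
    for doc in docs:
        t = tokens(doc["text"])
        best_cat, best_sim = "uncategorized", min_sim
        for ex in exemplars:
            sim = overlap(t, tokens(ex["text"]))
            if sim >= best_sim:
                best_cat, best_sim = ex["category"], sim
        result[doc["id"]] = best_cat
    return result

exemplars = [
    {"text": "pricing proposal discount terms", "category": "pricing"},
    {"text": "shipment delay customs inspection", "category": "logistics"},
]
docs = [
    {"id": 1, "text": "revised discount terms in the pricing proposal"},
    {"id": 2, "text": "customs inspection caused the shipment delay"},
    {"id": 3, "text": "birthday cake in the kitchen"},
]
result = categorize(docs, exemplars)
```

Documents falling into the "uncategorized" bucket are exactly the ones that justify drawing another round of samples.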
5. Smoking Gun Concept Search
Conceptual searches can be incredibly powerful tools, and in this workflow the approach is simple. You can use any body of text to generate a conceptual search; it is not dependent on example documents in your database. Simply draft an ideal document that gets to the heart of the issue in the case. The text should be one to two fully developed paragraphs focused on a single concept. Submit this text as a conceptual search. The results of the search will, at the least, point you to documents that get to the core of your matter. At best, the results will contain the hot, relevant documents, and potentially the "smoking gun" documents, around which you can build your argument.
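The ranking idea can be illustrated with a plain term-vector cosine search. To be clear, real conceptual search engines use latent semantic techniques rather than literal keyword overlap; this sketch, with made-up documents, only demonstrates the workflow of ranking a population against a drafted "ideal document."

```python
import math
from collections import Counter

def vec(text):
    """Term-frequency vector over lowercased, punctuation-stripped words."""
    return Counter(w.strip(".,;:!?").lower() for w in text.split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    norm = (math.sqrt(sum(v * v for v in a.values())) *
            math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def concept_search(query_text, docs, top_n=3):
    """Rank documents by similarity to the drafted 'ideal document'."""
    q = vec(query_text)
    ranked = sorted(docs, key=lambda d: cosine(q, vec(d["text"])), reverse=True)
    return [d["id"] for d in ranked[:top_n]]

query = ("The supplier agreed to delay the shipment until the internal "
         "audit of the defective components was complete.")
docs = [
    {"id": "doc1",
     "text": "Please delay the shipment until the audit of the components is complete."},
    {"id": "doc2",
     "text": "The cafeteria will serve tacos on Thursday."},
]
top = concept_search(query, docs, top_n=2)
```

The document closest in substance to the drafted paragraph surfaces first, which is the behavior you want when hunting for smoking-gun material.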
The above strategies are not exclusive to smaller-volume cases and can be used on cases of all sizes. But when used on smaller cases, these workflows will streamline, expedite, and possibly even eliminate the need for a full document review phase. With the introduction of analytics into smaller cases, we are seeing a shift in the balance of power and a leveling of the playing field. Analytics is no longer just for the terabyte-size cases; it can be just as effective on one gigabyte of email.