eDiscovery Daily Blog
Oxygen Forensics Hosts First User Summit: Key Takeaways on Forensic Data Collections and eDiscovery Challenges
Oxygen Forensics recently held their inaugural user summit in Alexandria, VA, featuring a packed agenda of timely, educational sessions addressing today’s pressing challenges in forensic data collections. The event was a blend of technical insights, networking opportunities, and interactive discussions, attracting experts from various sectors, including digital forensics, law enforcement, and legal technology.
Leenote: CEO Lee Reiber’s Journey and Insights
The summit kicked off with a keynote—or “Leenote”—delivered by Oxygen Forensics CEO Lee Reiber. Lee shared a compelling story of his personal and professional journey, revealing how overcoming adversities fueled his relentless curiosity, innovative problem-solving, and drive for excellence in data forensics. He emphasized the importance of continuous learning, warning that complacency is a fast track to being surpassed by peers in this rapidly evolving field. My key takeaway from his talk was simple yet profound: the temptation to believe we’ve “learned it all” will lead to stagnation, and staying curious is essential for ongoing success.
Gus Dimitrelos: “1’s and 0’s Do Not Care”
Following Lee’s presentation, forensic expert Gus Dimitrelos delivered a thought-provoking session titled “1’s and 0’s Do Not Care.” His focus on the objectivity of data was underscored by his high-profile example of authenticating the infamous Hunter Biden laptop. He walked the audience through the intricacies of digital forensics, including how transactional data, geolocation, and communications provide irrefutable evidence—data does not lie. Dimitrelos’ session was a masterclass in how to authenticate and leverage digital footprints, and his work is definitely worth deeper exploration for anyone interested in the evolving world of data authentication.
Private Sector Track: Managing Text Messages in eDiscovery
As the summit split into private and public sector tracks, I opted for the private sector sessions. One standout presentation was delivered by Trent Walton, CEO of Forensic Discovery, on handling text messages in eDiscovery. Walton highlighted that text messages are fast becoming the new frontier in eDiscovery, much like email was years ago. Unlike those early days, however, law firms now demand proper forensic collections rather than screenshots, which lack a defensible chain of custody and cannot be reliably authenticated.
Key Steps in Text Message eDiscovery:
- Sound Collection – Proper data collection is critical, with multiple options depending on the complexity of the case:
  - Physical Collection – Conducted in-lab or onsite, providing maximum control over the process.
  - Remote Collection Kits – Devices are shipped to custodians, who physically connect them so the data can be extracted and mailed back for analysis. Because the kit carries multiple software platforms, it can handle nuances in the data during collection. The trade-off is that remote collections omit some chain of custody steps and can be challenged by opposing counsel, so it is important to know what data you are after before choosing this approach.
  - Agent-Based Remote Collection – A more lightweight option, where users download an app that triggers a remote collection of specific data. It provides fast access and is best suited to narrower collections, for example, just text messages, rather than a full forensic collection.
  - Cloud Collection – Gathering data directly from cloud-based platforms (Slack, Google, Facebook, etc.). Knowing the nuances of each application is essential to ensure the needed data is fully collected.
- Processing and Exporting Data – Once collected, the data is processed and relevant messages are identified. The next decision is how to export this data: as native files or as images, and as individual messages, 24-hour threads, or weekly batches, with flexibility based on case needs.
- Review of Messages – Walton stressed the importance of a nuanced review approach, noting that reviewing messages as standalone documents often strips conversations of their context. Understanding the flow of communication is key to more accurate and efficient reviews. He noted that CloudNine Review's native review preserves that context and assembles cross-channel communications, such as a conversation that starts in email, switches to Slack, and continues in text messages.
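The export step above, producing 24-hour threads rather than isolated messages, can be sketched in a few lines. This is an illustrative sketch only; the message fields (`timestamp`, `sender`, `body`) are hypothetical stand-ins for a real extraction record, not any tool's actual schema:

```python
from collections import defaultdict
from datetime import datetime

def batch_messages_by_day(messages):
    """Group collected messages into 24-hour batches keyed by calendar date.

    Each message is a dict with an ISO-8601 'timestamp', a 'sender',
    and a 'body' (simplified stand-ins for a real extraction record).
    """
    batches = defaultdict(list)
    # Sorting by ISO-8601 timestamp keeps each day's thread in order.
    for msg in sorted(messages, key=lambda m: m["timestamp"]):
        day = datetime.fromisoformat(msg["timestamp"]).date().isoformat()
        batches[day].append(msg)
    return dict(batches)

# Example: three messages spanning two days become two daily threads.
messages = [
    {"timestamp": "2024-09-10T09:15:00", "sender": "A", "body": "Did you see the draft?"},
    {"timestamp": "2024-09-10T09:17:00", "sender": "B", "body": "Yes, sending edits."},
    {"timestamp": "2024-09-11T08:02:00", "sender": "A", "body": "Edits received."},
]
batches = batch_messages_by_day(messages)
for day, msgs in batches.items():
    print(day, len(msgs))
```

Switching the grouping key from the calendar date to an ISO week would give the weekly batches Walton also mentioned.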
Richard Rodney: Cloud and Mobile Data Challenges
In another fascinating session, Richard Rodney, President of IFI LLC, tackled the critical challenges around cloud and mobile data. The session explored how modern devices have evolved into powerful computers, requiring more sophisticated collection strategies.
Key Challenges:
- Mobile Device Evolution – With the increasing complexity of mobile operating systems, collecting all data from a device is no longer feasible. Instead, Rodney emphasized focusing on what’s truly important, like messaging apps and deleted group texts.
- Cloud Data – Many applications now store critical data in the cloud, requiring additional steps to access and authenticate it. Rodney underscored the challenges posed by cloud-based storage and the evolving nature of data access control, including passwords and multi-factor authentication.
- Ephemeral Messaging – Apps like Snapchat and other ephemeral messaging platforms present significant hurdles for forensic investigations, as messages are designed to self-destruct. However, Rodney encouraged checking backup servers and reviewing cross-device syncing, which could offer retrieval options.
- Hyperlink Attachments – The rise of hyperlinks embedded in texts or emails adds another layer of complexity. Collecting data from apps like Google Drive or SharePoint requires upfront knowledge of the organization’s data storage practices to ensure successful preservation and collection.
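One practical way to scope the hyperlink-attachment problem is to sweep collected message text for cloud-storage URLs before collection planning. The sketch below is a simplified illustration under stated assumptions: the two host patterns are examples only, and a real matter would enumerate the organization's actual storage domains up front, as Rodney advised:

```python
import re

# Hypothetical patterns for two common cloud storage hosts; a real
# engagement would build this list from the organization's own domains.
CLOUD_LINK = re.compile(
    r"https?://(?:[\w.-]*\.)?(?:sharepoint\.com|drive\.google\.com)\S*",
    re.IGNORECASE,
)

def extract_cloud_links(texts):
    """Return the unique cloud-storage URLs referenced in a set of
    messages, so the linked documents can be added to the preservation
    and collection scope."""
    links = set()
    for text in texts:
        links.update(CLOUD_LINK.findall(text))
    return sorted(links)

texts = [
    "Latest deck: https://drive.google.com/file/d/abc123",
    "Contract is on https://contoso.sharepoint.com/sites/legal/doc.docx",
    "Lunch at noon?",
]
print(extract_cloud_links(texts))
```

The output is a de-duplicated target list; actually preserving the linked documents still requires access to the underlying Google Drive or SharePoint tenant.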
My Session: Data Collections for eDiscovery and Investigations
On day two, I had the opportunity to present on data collections for eDiscovery and investigations. Since the audience was composed of certified forensic examiners, rather than giving my thoughts on data acquisition, I engaged in discussions with them on best practices for collecting and culling modern data sources such as Slack, MS Teams, geolocation data, social media, and more. We examined strategies for balancing targeted collections (focusing on specific data at the time of collection) versus full collections (capturing everything, but refining later), and the nuances involved in cloud-based data collection.
Key Takeaways from My Session:
- Targeted Collections – While collecting only what’s necessary saves time, it can create challenges if additional data is needed later. A full collection with post-collection filtering offers more flexibility.
- Cloud Collections – It’s crucial to understand the scope of cloud data. For example, a simple Meta download may exclude interactions like comments and likes. Tools such as Oxygen’s Remote Explorer can bridge these gaps by pulling all relevant data.
- Data Translation and Transcription – Adding processes like data translation, transcription, and Personally Identifiable Information (PII) identification before review can streamline workflows. By leveraging Oxygen’s platforms in combination with CloudNine Review, we can optimize the searchability and usability of native data alongside more traditional formats.
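The PII-identification step mentioned above can be sketched as a pre-review pass that flags message bodies for follow-up. This is a minimal sketch: the regex patterns are deliberately simplified illustrations, and production workflows rely on validated detectors rather than ad-hoc expressions like these:

```python
import re

# Simplified PII patterns for illustration only; real workflows use
# validated detection tooling, not ad-hoc regexes.
PII_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "phone": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
}

def flag_pii(text):
    """Return the PII categories detected in a message body, sorted,
    so flagged items can be routed for redaction or protective review."""
    return sorted(kind for kind, pat in PII_PATTERNS.items() if pat.search(text))

print(flag_pii("Reach me at jane.doe@example.com or 555-867-5309."))
```

Running a pass like this before review lets flagged messages be batched for redaction, which is the workflow-streamlining point made in the session.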
A Highly Educational Experience
The two-day Oxygen Forensics summit was packed with cutting-edge insights and practical advice. In addition to the sessions, I enjoyed getting to know the talented Oxygen Forensics team, along with the incredibly knowledgeable speakers and attendees. The discussions on evolving trends, tools, and methodologies in data forensics will certainly influence my future work in eDiscovery and forensic investigations. This summit was undeniably a top-tier event, offering a wealth of knowledge in a short amount of time.