Says the command line, “I’m not dead yet!”

May 22, 2011

Dan Mares has been writing command line utilities for computer forensics, ediscovery, and other purposes for years. The quality and capability of each utility demonstrate how long he’s been doing this, and how well he knows these fields. Unfortunately, his site now carries a warning that reads “All Maresware is command line driven, and as such has gone out of style so it is being discontinued.” I’m here to say that the command line is a long way from going out of style for a significant number of us.

First off, I went through college earning most of my CS degree on Linux. The command line is an old friend, and stringing processes together with utilities is second nature. But even if you’re fresh out of college and have never seen Linux, you will quickly find that GUI-driven tools just don’t cover all of your needs, and probably never will. This is particularly true if you’re working with a client on a small budget, or a client who lacks in-house litigation support. Why? You can’t deliver your work via load files or an expensive review platform. Instead, you need to send over zip files and massage the contents so they can be reviewed with commonly available applications. But even in large ediscovery and forensics projects, GUI-driven tools don’t give you 100% coverage.

Case in point: using dtSearch, I had identified 700 files spanning four volumes mounted with FTK Imager. The list of files was in a single text file. I needed to pack all of these files into multiple zip containers (bandwidth issues ruled out one large delivery) without modifying their MAC times. And, by the way, the filenames weren’t unique, so I couldn’t just zip them up, and I couldn’t copy them to one location and then zip up that location. I also couldn’t put them in a traditional evidence container using FTK Imager because the client didn’t have FTK Imager or MIP.

I eventually wrote my own utility that drove xxcopy, because robocopy is designed for directories rather than individual files, xcopy doesn’t preserve MAC times, and neither of them will take a list of files to work on as a command line option. It got the job done, but I spent a lot of time thrashing around before I stumbled on what follows.
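For anyone facing the same gap, the heart of such a wrapper is small. Here’s a minimal Python sketch, not the utility I wrote, that copies every file named in a list while keeping modified and accessed times intact (shutil.copy2 carries those over; setting NTFS creation time would need a Win32 call, which I’ve left out). The one-path-per-line list format is an assumption for illustration, and name collisions are deliberately ignored here; the next sketch deals with them.

```python
import shutil
import sys
from pathlib import Path

def copy_from_list(list_file: str, dest_dir: str) -> None:
    """Copy every file named in list_file (one path per line) into
    dest_dir, preserving modified and accessed timestamps."""
    dest = Path(dest_dir)
    dest.mkdir(parents=True, exist_ok=True)
    for line in Path(list_file).read_text().splitlines():
        src = Path(line.strip())
        if not src.is_file():
            continue  # skip blank lines and paths that no longer resolve
        # shutil.copy2 carries over mtime/atime; NTFS creation time
        # would need something like SetFileTime via pywin32
        # NOTE: a later file with the same name overwrites an earlier one
        shutil.copy2(src, dest / src.name)

if __name__ == "__main__":
    copy_from_list(sys.argv[1], sys.argv[2])
```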

Enter Dan Mares and the upcopy utility. It has an incredible number of useful options, but for my purposes, three really stood out:

  1. It preserves MAC times.
  2. The --flatten option will take a tree structure and copy all the files into a single directory.
  3. The --nodupe option will detect duplicate files that would result in name collisions and add a unique suffix to each duplicate file (sketched below).

Using a combination of these features, I was able to copy 700 files from four different volumes into a single destination directory while preserving duplicates.
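To make the --flatten/--nodupe idea concrete, here’s a rough Python sketch of the same behavior. This is an illustration of the technique, not Dan’s code, and it assumes nothing about upcopy’s internals: walk a tree, copy every file into one flat directory, and suffix any name that collides.

```python
import shutil
from pathlib import Path

def flatten_copy(src_root: str, dest_dir: str) -> None:
    """Copy every file under src_root into the single directory dest_dir,
    renaming collisions name.ext -> name_001.ext, name_002.ext, ..."""
    dest = Path(dest_dir)
    dest.mkdir(parents=True, exist_ok=True)
    for src in sorted(Path(src_root).rglob("*")):
        if not src.is_file():
            continue
        target = dest / src.name
        n = 0
        while target.exists():  # name collision: add a unique suffix
            n += 1
            target = dest / f"{src.stem}_{n:03d}{src.suffix}"
        shutil.copy2(src, target)  # preserves modified/accessed times
```

Run it once per mounted volume with the same destination, and every copy survives with its timestamps and a unique name.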
I spent some time looking through Dan’s other tools and was truly impressed. He’s created utilities that solve a lot of the problems that confront us on a daily basis, and he’s offering them for free. If you’re at all willing to step away from your GUI, you should check out Dan’s site:

http://www.dmares.com/index.htm (follow the various links in the direct links section.)

Please note that, despite the disclaimer, Dan is still actively supporting his tools and remains very active in the community.

Categories: Computer forensics

Thoughts on managing increasingly complex digital forensics cases

We’ve all seen articles about the looming death of forensics due to increasing data volumes and the proliferation of data containers. The calmer folk generally just chuckle and get back to work, knowing that they’re gainfully employed for as long as they wish to work. For the less calm, and just to give everyone a few more things to think about, let me offer the following three thoughts:

1) As data volumes and the number of devices increase, clients may need to be willing to pay more for the analysis. The cost of the work is no longer even roughly proportional to the number of custodians. Just because data volumes are increasing doesn’t mean the work doesn’t need to be done. The successful practitioners will be the ones who figure out how to process all that data while keeping their clients happy.

2) Then again, does all the data need to be processed immediately? The successful practitioners may also be the ones who triage the problem well and can defend those triage decisions to their client and in court. Just because you don’t process all the data immediately doesn’t mean you can’t go back for a deeper look later when it’s justified.

3) Approaching the problem as a team rather than as an individual will yield better results. In addition to splitting the problem over multiple cores (a technical solution), split it over multiple people (an organizational solution), each with deep domain knowledge and appropriate skills. The amount of work done by each individual may go down a bit, and coordination adds some overhead, but the total work done by the team can scale with the volume of data and the number of devices. Given a good team, overall efficiency should increase quite a bit. I know I’m much more efficient with additional eyes on the problem working in concert. The solo practitioner may need to limit the jobs they take on, or form partnerships that allow them to share the work efficiently.

The problem is hardly insurmountable, and in any such challenge there are opportunities. We can wail and gnash our teeth or we can quietly (or, if you’re in marketing, noisily) step up and meet the challenge, ensuring quality services for our clients and a secure job for ourselves.

Categories: Computer forensics