I think the UAV industry in general, and the search and rescue community in particular, should do a lot more work designing and performing experiments with UAVs. Vendors and sales outlets keep touting their UAVs as being “good for search & rescue” without providing any data to support this claim, and often without really understanding SAR, SAR missions, and the challenges we face. (More on this in my upcoming presentation for NAASIC in Reno in September.) On the privacy side, people claim “he couldn’t see anything at 200 feet with that drone,” or take the opposite position, without sharing any data to support these claims.
Since I am an engineer, I like to gather data to support conclusions. And, for similar reasons, I usually form a hypothesis prior to conducting an experiment. Full disclosure – the data did not support my hypothesis. I’ll explain at the end of this post.
For the tl;dr folks – you cannot see much detail in a stock Phantom 2 Vision+ image taken more than 50 feet above the subject.
This experiment was conducted with a stock DJI Phantom 2 Vision+. The lens specifications, according to DJI, are:
- Sensor Size – 1/2.3″
- Effective Pixels – 14 Megapixels
- Resolution – 4384×3288
- Recording FOV – 110° / 85°
I had the camera set to use the “large” photo size and thus the full resolution.
The items in the frame are:
- A black Pelican case
- A human male wearing blue jeans and a reddish t-shirt
- A high visibility orange long sleeve thermal shirt
- A light blue t-shirt
- A white board with black writing on it
The sky was overcast and the winds were between 5 and 15 mph out of the southeast. I took the Phantom up to 25, 50, 100, 150, 200, 250, and 300 feet, +/- 3 feet as reported by DJI’s Vision app. At each altitude I took a single photograph. After landing, I used Photoshop to zoom in to approximately the same area in each image.
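To put these altitudes in perspective, a simple pinhole-camera estimate predicts how much ground each pixel covers. This is a sketch under my own assumptions – a straight-down shot, the 110° recording FOV and 4384-pixel width from the specs above, and no lens distortion – so treat the numbers as rough:

```python
import math

def ground_width_ft(altitude_ft, fov_deg=110.0):
    """Width of the ground footprint for a straight-down (nadir) shot."""
    return 2.0 * altitude_ft * math.tan(math.radians(fov_deg / 2.0))

def gsd_in_per_px(altitude_ft, fov_deg=110.0, px_across=4384):
    """Approximate ground sample distance, in inches per pixel."""
    return ground_width_ft(altitude_ft, fov_deg) * 12.0 / px_across

for alt in (25, 50, 100, 150, 200, 250, 300):
    gsd = gsd_in_per_px(alt)
    torso_px = 18.0 / gsd  # pixels across an assumed ~18 inch wide torso
    print(f"{alt:3d} ft: footprint ~{ground_width_ft(alt):4.0f} ft wide, "
          f"~{gsd:4.2f} in/px, torso ~{torso_px:3.0f} px")
```

At 100 feet this works out to under one inch per pixel with a human torso spanning only a couple dozen pixels, which is at least consistent with the detail loss observed in the photographs.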
In the raw images viewed natively without any zoom:
- It is hard to find any identifying details of a human in the image above 50 feet.
- At 200 feet it would be hard to identify the human if you did not know what you were looking at.
Using the zoom tool in Photoshop:
- Detail is hard to discern at 100 feet and very hard to make out beyond that.
- Given the subject’s pose, you can determine that there is a human in the frame up to 300 feet.
- If you thought a drone would be invading your privacy when flown at 200 feet do you still feel this way after looking at these images?
- If you want to use a drone to search for missing people, do these images help you determine your mission parameters and effectiveness?
And my hypothesis? I thought more detail would be available further up. Glad I’m conducting experiments.
Image analysis is not my forte. If you have additional observations, please comment or share them with me directly and I’ll get them included.
[This is the first in a series of posts about the forensic analysis of drones leading up to presentations at BSides NOLA and the SANS DFIR Summit in Austin.]
Drones (properly known as small unmanned aerial systems – sUAS) are all the rage. The market is roughly $600 million this year and is expected to be $5 billion by 2021. Drones will touch many aspects of your life, overtly and behind the scenes. They are already used commercially for mapping, precision agriculture, film making, and damage assessment. Illegal uses range from commercial services in violation of FAA regulations to surveillance and drug smuggling. And the hobby community is booming with drones as one of the hottest Christmas presents of the season.
With all of these drones in the air, the forensic analysis of drones is already important and making headlines. Who didn’t hear about the one that crashed on the White House lawn? The demand for analysis that will stand up in court is present and increasing. Tools alone will not solve the problem; we need forward-thinking analysts who can work in a variety of disciplines, write their own tools, and go beyond existing techniques. Why? The key is the final ‘S’ in sUAS: these are small unmanned aerial systems, entire networks with multiple operating systems, in flight and spread across miles of terrain.
Let’s take a look at all the components of a popular consumer drone.
There are seven components in this unmanned aerial system:
- Aircraft
- Radio controller
- Wifi range extender
- Mobile device
- Laptop
- Battery
- Micro SD card
Each of these components potentially contains evidence relating to the incident you are investigating. The aircraft contains multiple sensors, a flight controller, radio links, a camera, motors, and more. The radio controller is fairly dumb, but it stores configuration settings that contribute to understanding the full environment. The laptop was probably used to maintain and configure many of the other components and will likely have artifacts relating to that work, along with the traditional Internet history, email, and messaging that might provide significant context. Even the battery stores digital artifacts about its history and health.
The analyst needs to physically collect and document all of these components, a potentially daunting process given that the components might be separated by time and distance. The type of motor, the custom labels on the radio controller, and the wear and tear on the propellers all tell their own piece of the story and must be correctly documented and analyzed.
Once the analyst obtains access to the physical components, they need to gain access to a variety of digital containers, and then analyze digital artifacts that range from firmware to EXIF data in photos to plists, registry settings, and /etc/mount files.
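Many of these artifacts yield to scripting rather than point-and-click tools. As one hedged sketch, a plist recovered from the mobile device can be parsed with Python’s standard library; the keys below are invented for illustration and are not the real DJI Vision schema:

```python
import io
import plistlib

def read_prefs(data: bytes) -> dict:
    """Parse a binary or XML plist blob recovered from a mobile device."""
    return plistlib.load(io.BytesIO(data))

# Round-trip demo with an illustrative, made-up preferences dictionary;
# these key names are NOT taken from any real app.
prefs = {"LastAircraft": "Phantom 2 Vision+", "units": "metric"}
blob = plistlib.dumps(prefs, fmt=plistlib.FMT_BINARY)
print(read_prefs(blob))
```

The same few lines handle both binary and XML plists, which matters because iOS backups mix the two formats freely.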
Here is a breakdown of some of the containers and artifacts associated with each physical component:
- Aircraft
  - Two Linux systems
    - One OpenWRT running Busybox
      - Flight controller, media server
      - Filesystems – squashfs, overlayfs, jffs2
    - One Ambarella A5s IP Camera Reference Platform running Linux
      - Camera controller
      - Filesystem – ubifs
  - One micro SD card
- Wifi range extender
  - One OpenWRT Linux system running Busybox
    - Filesystems – squashfs, overlayfs, jffs2
- Radio controller
  - One USB port to configure the controller
- Battery
  - Queried via the USB port on the aircraft when attached to the maintenance application on OS X or Windows
- Mobile device and laptop
  - iOS or OS X
  - Many possible apps, including home-grown
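Since commercial tools often won’t recognize these filesystems, it helps to identify them by their published magic numbers before choosing an extraction approach. A minimal sketch (overlayfs is a union mount with no on-disk image of its own, so it is omitted):

```python
# Published magic numbers for filesystems commonly found in drone flash
# images: squashfs stores "hsqs" (little-endian) or "sqsh" (big-endian)
# at offset 0, JFFS2 nodes begin with the u16 0x1985, UBIFS superblocks
# with the u32 0x06101831, and UBI volumes with the string "UBI#".
MAGICS = [
    (b"hsqs", "squashfs (little-endian)"),
    (b"sqsh", "squashfs (big-endian)"),
    (b"\x85\x19", "jffs2 (little-endian node)"),
    (b"\x31\x18\x10\x06", "ubifs"),
    (b"UBI#", "ubi volume"),
]

def identify(header: bytes) -> str:
    """Guess a filesystem from the first bytes of an image or partition."""
    for magic, name in MAGICS:
        if header.startswith(magic):
            return name
    return "unknown"
```

Usage is a one-liner against a dumped partition, e.g. `identify(open("mtdblock3.img", "rb").read(4))` (the filename here is illustrative).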
A complete analysis of this system will be non-trivial, and no single tool will give the analyst access to all the relevant information. There are several different flavors of Linux, at least one mobile operating system, and at least one desktop operating system. There are at least five different file systems, many of which are not recognized by commercial tools. Some artifacts are only accessible via USB and vendor-defined protocols. Others require accessing the sUAS’s network and using ssh to connect to the systems. Some systems run from flash media and maintain no state information after loss of power.
To further complicate the situation, each vendor will use a different collection of components, those components will vary within a vendor’s product line, and new vendors will enter the market monthly. The open nature of the mission planning software and the flight controllers encourages customization. New sensors and new uses for drones will push both the application of drones and the legal and social borders around them.
The forensic analysis of drones, and the larger cybersecurity landscape around them, will be very complex, very fluid, and very exciting. Stay with us as we explore it in depth.