A Falling DJI Phantom Might, or Might Not, Crush Your Skull
The title is intentionally provocative. Too many public comments run along the lines of “A falling drone will give you a bump on the head.” In fact, there is a reasonable chance it will kill you. Which of these is actually true? We simply do not know, and formal experiments are required before claims are made either way. So my point is not really that a drone may kill you; rather, it is that we need good data.
I’d like to thank the members of a particular Facebook group for engaging in a spirited discussion that helped me refine this post. It was far too provocative in the early draft and I am certain that it still is for some.
The theoretical analysis follows, and it ignores a lot of variables. These calculations are a starting point and represent the “worst case scenario”. With a lot of additional work, we could add other constraints and end up with a probability estimate of damage from a direct impact.
Weight of a DJI Phantom – 1242g (2.73 lbs)
Altitude at time of failure – 61m (200 feet)
Force required to crush a human skull – 2,300N (Journal of Neurosurgery: Pediatrics)
Let’s plug those numbers into an impact force calculator:
7173N of force – more than three times the force required to crush a human skull.
Even from half that height, 100 feet, a falling Phantom would generate 3527N – still enough to crush your skull. At 65 feet you might survive the impact, as the force is down to 2351N.
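The kind of estimate these calculators produce can be sketched in a few lines. Note the hidden assumption: force only falls out of the math once you pick a stopping distance over which the drone’s kinetic energy is absorbed. The 10 cm used here is my assumption, not a DJI figure, and drag is ignored, so the numbers are the same order of magnitude as those above rather than exact matches.

```python
# Rough impact-force estimate: free fall (no drag), with the drone's
# kinetic energy absorbed over a short stopping distance.
# The 0.10 m stopping distance is an assumption -- real impacts vary widely.
G = 9.81  # gravitational acceleration, m/s^2

def impact_force(mass_kg, drop_m, stop_m=0.10):
    energy_j = mass_kg * G * drop_m    # kinetic energy at impact (joules)
    return energy_j / stop_m           # average force over the stop (newtons)

for feet in (200, 100, 65):
    meters = feet * 0.3048
    print(f"{feet:>3} ft: {impact_force(1.242, meters):5.0f} N")
```

A shorter assumed stopping distance (a rigid skull, a rigid drone) drives the force up sharply; a crumpling airframe drives it down. That single free parameter is exactly why real drop tests matter more than calculator output.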
There are a lot of variables that I did not account for – drag, impact angle, elasticity in the body and the drone, and so on. Real experiments need to be performed.
UAVs in SAR – Deployment and Effectiveness
In an earlier post I wrote: “I think the search & rescue community should do a lot more work on designing and performing experiments with UAVs. Vendors and sales outlets keep touting their UAVs as being “good for search & rescue” without providing any data to support this claim, and often without really understanding SAR, SAR missions, and the challenges we face. (More on this in my upcoming presentation for NAASIC in Reno in September.)”
This is even more important when we consider what are appropriate missions for UAVs and how to deploy them.
I conducted two very quick experiments to illustrate two of the challenges we face. I intend to develop more formal experiments and welcome others who are interested in assisting with this effort.
Questions:
I wanted to answer two questions:
- How effective is a UAV when searching an area with trees?
- How effective is a UAV when searching for clues in a soybean field?
Both of these are simple examples of SAR problems you can adapt to your own operational area.
tl;dr – You need to be down very low when searching near trees and finding an unresponsive subject in a soybean field with an optical sensor is very tough.
Searching Near Trees:
If this were your search area, and you were searching for an uncooperative or unresponsive subject (someone who isn’t going to come investigate the noise of the UAV), how would you plan your mission? How would you execute it? How long would it take? How effective would you be? (This was taken at 200 feet by a Phantom 2 Vision+. The subject is in the frame.)
Ok, if the subject were standing under a tree in this small area, what would you be able to see? (There are a lot of variables here – height of branches, foliage on or off, distance from the subject, the subject’s distance from the trunk, and so on. This is just an example.)
Distance from the UAV to the subject was less than 50 feet in all images.
At the subject’s altitude:
At about a 30 degree angle:
50 degrees. The subject’s legs are just barely visible, and only because of the contrast between his blue jeans and the green background. (And if you were looking at this on a mobile device, what would you really be able to see?)
70 degrees or so. The subject is not visible.
Conclusion – you need to get below the level of the tree branches to search around trees for an unresponsive subject. This will increase the time required to search while diminishing your ability to control the UAV at long range.
Soybeans
I live, and search, in Illinois. Lots of corn, lots of soybeans. Searching for anyone in a corn field when the corn is above your head is tough. We’ll come back to that one later. Soybeans get to a few feet tall. Walking through soybean fields is … annoying … but you can certainly see a lot more. If the subject is standing up you can just walk to the edge of the field and say “Hey, there they are!” But, what if they are unresponsive and down?
Again, 50 feet up with a DJI Phantom 2 Vision+. The subject dropped their high visibility orange shirt – a clue! We can see it easily on the edge of the field.
But, what if they dropped it in the field? Since you know it is in the frame, and since it is right next to the pilot, you can probably see it. If you were looking at images from 100 acres of soybeans how confident are you that you’d see this clue, particularly on a small screen?
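The scale of that image-review problem is easy to underestimate. Here is a sketch of how many photographs it takes to cover a field, using assumed numbers: a ground footprint of roughly 40 m × 30 m per frame at low altitude and a 20% overlap between adjacent frames. Both figures are my assumptions for illustration, not measured values.

```python
import math

ACRE_M2 = 4046.86  # square meters per acre

def photos_needed(acres, foot_w_m=40.0, foot_h_m=30.0, overlap=0.20):
    """Frames required to cover a field, given an assumed per-frame
    ground footprint and overlap fraction between adjacent frames."""
    effective_m2 = foot_w_m * (1 - overlap) * foot_h_m * (1 - overlap)
    return math.ceil(acres * ACRE_M2 / effective_m2)

# A 100-acre soybean field works out to hundreds of frames to review.
print(photos_needed(100))
```

Even under these generous assumptions, someone has to scan hundreds of images for a patch of orange a few pixels wide – which is why probability of detection, not just coverage, has to be part of the mission plan.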
Conclusion:
If you are using a normal consumer UAV to search for an unresponsive subject in an area with significant vegetation your probability of detection may be rather low.
What Can a Drone Actually “See”?
I think the UAV industry in general, and the search & rescue community in particular, should do a lot more work on designing and performing experiments with UAVs. Vendors and sales outlets keep touting their UAVs as being “good for search & rescue” without providing any data to support this claim, and often without really understanding SAR, SAR missions, and the challenges we face. (More on this in my upcoming presentation for NAASIC in Reno in September.) On the privacy side, people claim “he couldn’t see anything at 200 feet with that drone” – or the opposite – without sharing any data to support these claims.
Since I am an engineer, I like to gather data to support conclusions. And, for similar reasons, I usually form a hypothesis prior to conducting an experiment. Full disclosure – the data did not support my hypothesis. I’ll explain at the end of this post.
For the tl;dr folk – you cannot see much detail in a stock Phantom 2 Vision+ image when taken more than 50 feet above the subject.
Experiment:
This experiment was conducted with a stock DJI Phantom 2 Vision+. The lens specifications, according to DJI, are:
- Sensor Size – 1/2.3″
- Effective Pixels – 14 Megapixels
- Resolution – 4384×3288
- Recording FOV – 110° / 85°
I had the camera set to use the “large” photo size and thus the full resolution.
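Those specifications let us estimate how much ground each pixel covers, which lines up with the observations below. This is a simplified pinhole-camera calculation for a straight-down shot, and it assumes the 110° FOV spans the 4384-pixel axis – a rough treatment of what is really a fisheye lens, so treat the numbers as order-of-magnitude only.

```python
import math

# Assumed from the published specs: 110-degree FOV across the
# 4384-pixel axis. Real fisheye distortion is ignored.
FOV_DEG = 110.0
PIXELS = 4384

def footprint_and_gsd(altitude_ft):
    """Ground footprint (m) and ground sample distance (cm/pixel)
    for a nadir shot at the given altitude."""
    alt_m = altitude_ft * 0.3048
    width_m = 2 * alt_m * math.tan(math.radians(FOV_DEG / 2))
    return width_m, width_m / PIXELS * 100

for ft in (25, 50, 100, 200, 300):
    w, gsd = footprint_and_gsd(ft)
    print(f"{ft:>3} ft: footprint {w:6.1f} m, ~{gsd:4.1f} cm/pixel")
```

At 50 feet each pixel covers roughly a centimeter of ground; by 300 feet it is closer to six centimeters, so a human occupies only a handful of pixels across – consistent with the loss of detail reported in the conclusions.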
The items in the frame are:
- A black Pelican case
- A human male wearing blue jeans and a reddish t-shirt
- A high visibility orange long sleeve thermal shirt
- A light blue t-shirt
- A white board with black writing on it
The sky was overcast and the winds were between 5 and 15 mph out of the southeast. I took the Phantom up to 25, 50, 100, 150, 200, 250, and 300 feet (+/- 3 feet, as reported by DJI’s Vision app). At each altitude I took a single photograph. After landing, I used Photoshop to zoom in to approximately the same area in each image.
Conclusions:
In the raw images viewed natively without any zoom:
- It is hard to find any identifying details of a human in images taken above 50 feet.
- At 200 feet it would be hard to identify the human if you did not know what you were looking at.
Using the zoom tool in Photoshop:
- Detail is hard to discern at 100 feet and very difficult to discern past 100 feet.
- Given the subject’s pose you can determine that there is a human in the frame up to 300 feet.
Questions:
- If you thought a drone would be invading your privacy when flown at 200 feet do you still feel this way after looking at these images?
- If you want to use a drone to search for missing people, do these images help you determine your mission parameters and effectiveness?
Closing:
And my hypothesis? I thought more detail would be visible from higher altitudes. Glad I’m conducting experiments.
Image analysis is not my forte. If you have additional observations, please comment or share them with me directly and I’ll get them included.
Images, in order: 25, 50, 100, 150, 200, 250, and 300 feet.