When disaster strikes, people in the flood zone or the earthquake radius have the best and most immediate access to the pictures, videos and other data scientists need to understand the phenomenon. So Jules White, an assistant engineering professor at Virginia Tech, is planning to arm these “citizen scientists” with a cell phone app that could transmit their information to researchers and first responders.
I spoke last week with White, the recipient of a $65,000 grant from the National Science Foundation, about how his system would work — and why the project is so important to him.
Before we talk about your new model, describe the current way data is collected and disseminated during disasters.
I was [in Nashville] during [this year's] flood. All these parts of Nashville were essentially cut off from news and media. I found myself going to Facebook and Twitter to get some of the best information about what was going on. I’d see pictures that friends had posted of the flooding in their areas. You have all this interesting information people scattered around a disaster area are collecting. There are crude ways people post that through social media. From a science perspective, there’s no easy way to take that data and aggregate it. There’s no central way of getting access to it all. There’s no way for the scientists or first responders to ask questions or ask for observations from people in those areas. It’s just one-way. My thought was: How do you connect the scientists or the first responders to these people to get the information that’s needed to make a difference in the disaster?
How would you make that connection?
The idea behind this is an app. When you [launch] the app, there would be different observations that scientists or first responders care about that you could make [based on your location and the disaster]. From the context of the oil spill, a scientist could [request] observations like, “Do you see oil? Do you see dead fish? If so, take a picture of the fish,” [that would be geotagged using the phone's GPS coordinates]. The app would allow scientists who know what should be recorded to push those experiments to people based on where they are. It’s trying to connect people in [disaster] areas with people who need information about them.
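The flow White describes, scientists pushing observation prompts and citizens sending back geotagged answers, can be sketched in a few lines. This is purely illustrative; the `ObservationRequest` and `Observation` types, field names, and `respond` helper are hypothetical, not part of any actual system.

```python
# Hypothetical sketch of the push-observation flow described above.
# All type and field names are illustrative assumptions.
from dataclasses import dataclass
from typing import Optional

@dataclass
class ObservationRequest:
    prompt: str                 # e.g. "Do you see oil?"
    wants_photo: bool = False   # ask the user to attach a picture

@dataclass
class Observation:
    prompt: str
    answer: str
    lat: float                  # geotag from the phone's GPS
    lon: float
    photo: Optional[bytes] = None

def respond(req: ObservationRequest, answer: str,
            lat: float, lon: float,
            photo: Optional[bytes] = None) -> Observation:
    """Package a citizen scientist's reply with the phone's coordinates."""
    return Observation(req.prompt, answer, lat, lon, photo)

# A scientist pushes a prompt; a user in the disaster area replies.
req = ObservationRequest("Do you see dead fish?", wants_photo=True)
obs = respond(req, "yes", 30.25, -88.08)
print(obs.prompt, obs.answer, (obs.lat, obs.lon))
```

The key design point is that the request travels one way (scientist to phone) and the geotagged observation travels back the other, closing the loop that White says social media leaves open.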
You mentioned floods and oil spills. Would the app include other disasters?
Initially, we wanted to study this in the context of the oil spill. But we think of this as a broader thing. It’s a collaborative effort between Virginia Tech, Jeff Gray at the University of Alabama and Andy Gokhale at Vanderbilt. Our vision is to develop a system that’s flexible enough that you could do other types of disasters. The key thing is connecting citizen scientists — people who have these phones and are in the disaster area — with the scientists and first responders. It could be related to global health. You [could] have people in remote villages that have cell phones collect information about health-related issues. It could be based on what experts on global health are pushing out to those areas.
Won’t there be an increased likelihood of errors in data collected by untrained citizens?
There certainly will be data that has errors in it. [But] you’re going from a situation where you can’t get that information at all to suddenly getting data you weren’t able to get before. You’re creating a new problem by solving an old problem. But hopefully you can make up for it by getting things that you couldn’t have gotten at all and getting a volume of information about the area that you couldn’t have gotten before.
Why is this personally important for you?
I grew up in Alabama in a little town that’s right on the bay. When the oil spill happened, I was really upset. My life growing up in that area centered around the bay. The thought of all this oil ruining everything was personally upsetting. At the same time, I knew the culture of that area was to go down to the bay all the time, so the best people to document the effects and show that impact would be the people living there.
Do you have anything else to add?
This [work] is only possible because of these phones, like the iPhone and Android. These phones are so powerful, and the sensors on them have so many unique possibilities that people haven’t begun to think about yet. For example, we did a project on detecting traffic accidents using the accelerometer and acoustic data in your phone. If you’re traveling 30 miles per hour and your phone experiences 10 Gs of deceleration and there’s a loud sound that’s indicative of airbag deployment, you can detect [a potential accident] because of these phones. If you have 10,000 people in an area and suddenly their phones all register vibrations in the accelerometer at one time, maybe you can collect data about an earthquake from them. It’s a really interesting time to be doing this type of research because of the sophistication of smartphones.
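The accident heuristic White outlines combines three signals: driving speed, a sudden large deceleration, and a loud sound. A minimal sketch of that rule, using the 30 mph and 10 G figures from the interview; the sound threshold, function name, and parameters are assumptions for illustration:

```python
# Hypothetical sketch of the threshold-based accident heuristic described
# above. The 30 mph and 10 G values come from the interview; the 120 dB
# sound threshold and all names are illustrative assumptions.

def possible_accident(speed_mph: float, peak_decel_g: float,
                      peak_sound_db: float,
                      min_speed_mph: float = 30.0,
                      decel_threshold_g: float = 10.0,
                      sound_threshold_db: float = 120.0) -> bool:
    """Flag a potential crash when the phone is moving at driving speed,
    registers a sudden large deceleration, and picks up a loud noise
    consistent with airbag deployment."""
    return (speed_mph >= min_speed_mph
            and peak_decel_g >= decel_threshold_g
            and peak_sound_db >= sound_threshold_db)

# A reading matching the scenario in the interview:
print(possible_accident(35.0, 11.2, 140.0))  # True
```

Requiring all three signals at once is what keeps the false-positive rate down: a dropped phone produces the deceleration but not the speed, and a loud noise alone trips neither motion condition.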
Image: Jules White