The Future of Data Clouds

The U.S. Department of Homeland Security (DHS) and its Federal Emergency Management Agency (FEMA) continue to face significant challenges in the five major phases of managing emergencies and disasters: preventing, protecting against, responding to, recovering from, and mitigating events. All of these phases continue to evolve at a rapid pace, along with the tools of the trade. During and after almost any such event, the need for rapid and reliable information is perhaps the most critical factor in making effective decisions. Whether the decision window requires looking years ahead or simply analyzing an ongoing 12-hour incident command operational period, reliable data remains the key component of operational success.

How to effectively use that data, though, raises a number of relevant questions, including the following: How many people might have to be evacuated? Are there enough shelters available? Is the power out – and, if so, where? Do the capabilities available match the current and possibly future needs of the city, state, or nation?

The answers to all of these questions, and many others that might be asked, require the use of accurate and timely data – as was amply demonstrated by the widespread damage and loss of life caused by Superstorm Sandy and the “nor’easter” that immediately followed. Responding to and coping with those twin disasters required the quick and effective use of a veritable flood of information, much of it changing literally minute by minute. Twitter feeds and information received from other social media sites provided a huge quantity of helpful information, as did geospatial information and power outage tracking systems. All of these combined are just a small sample of the innovative ways in which essential decision-making data is being captured, analyzed, stored, and communicated.

Intelligent Decisions & Clear Priorities – But Scarce Resources 

Already resident within the federal agency community are stores of information about previous disaster events, current and past weather patterns, and flood models – as well as disaster relief spending and practical information about the location of the material resources needed to support response and recovery operations. The challenge facing emergency managers – at all levels of government – is to harness all of the data available from their respective “siloed” systems and build the analytical tools and capabilities needed to make quick, intelligent, and economically viable decisions.

A clear understanding of both the preparedness capabilities needed and the protection capabilities that make critical infrastructure more resilient helps ensure the use of accurate information – information that not only enhances real-time situational awareness but also helps determine the resource priorities for full and effective response and recovery operations. Combining the data available from an ongoing event with historical data already in the information system will help develop a better overall understanding of the current environment. That understanding should enhance the ability of decision makers to adapt to and mitigate the losses caused by ongoing and/or future threats of a similar nature.

Building and improving this type of analysis, which is ongoing across the nation’s emergency-management and homeland security communities, requires more effective use of the limited financial resources that are likely to be available to federal, state, and local governments.

Leveraging Visual Interfaces and Analytics: A Prime Example 

Numerous federal, state, and local emergency management agencies and organizations are responsible for various disaster planning and response activities and operations. Many of them already have found that using social media provides, in most if not all emergencies, helpful and timely situational awareness for dealing with biological events and other potential disasters. In 2010-2011, the Centers for Disease Control and Prevention (CDC) determined that social media provided a better and faster way to accumulate and analyze disaster data in real time. With such a solution in place, the agency found that it could expand and improve overall preparedness by leveraging the information flow to more accurately, and more quickly, predict the probable impact and determine the response capabilities required.

In order to reach that goal, though, the agency needed a higher level of confidence in the approaches already available to gather, analyze, and use the social media data on which it would base any operational decisions. The specific challenges faced in implementing the new solution focused on related issues such as data ingestion and normalization, the building and use of a social media vocabulary, and information extraction capabilities.
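As a rough illustration of what ingestion, normalization, and vocabulary matching can involve, the short Python sketch below cleans an invented post and matches it against a toy symptom vocabulary. The vocabulary terms, cleanup rules, and function names are hypothetical stand-ins; the agency's actual pipeline is not described in detail here.

```python
import re

# Toy symptom vocabulary; a stand-in for the agency's (unpublished) social media vocabulary.
SYMPTOM_VOCABULARY = {"fever", "cough", "nausea", "rash", "vomiting"}

def normalize(post: str) -> str:
    """Lowercase a post and strip URLs, @mentions, and hashtags."""
    post = post.lower()
    post = re.sub(r"https?://\S+", " ", post)  # remove URLs
    post = re.sub(r"[@#]\w+", " ", post)       # remove mentions and hashtags
    return re.sub(r"\s+", " ", post).strip()   # collapse whitespace

def extract_symptoms(post: str) -> set[str]:
    """Return the vocabulary terms that appear in the normalized post."""
    return set(normalize(post).split()) & SYMPTOM_VOCABULARY

print(sorted(extract_symptoms(
    "High fever and a bad cough after the flood #disaster http://t.co/x")))
# ['cough', 'fever']
```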

Working with industry leaders, the agency then developed the framework needed to capture, normalize, and transform the open-source media used to characterize and forecast disaster events in real time. The framework incorporated computational and analytical approaches to help transform the “noise” accumulated from social media into usable, and useful, information. By leveraging such techniques as term frequency-inverse document frequency (TF-IDF) weighting, natural language processing (NLP), and predictive modeling, the agency also was able to: (a) characterize and forecast the probable numbers of injured, dead, and/or hospitalized victims resulting from a specific incident; and (b) extract other helpful information – e.g., symptoms, geographic particulars, and the demographics involved – related to specific illness incidents or events.
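To make the TF-IDF step concrete, the following Python sketch weights the terms in a few invented posts and prints the highest-weighted terms for one of them. The use of scikit-learn and the sample posts are assumptions for illustration only, not a reconstruction of the agency's code.

```python
from sklearn.feature_extraction.text import TfidfVectorizer

# Invented posts; real inputs would be normalized Twitter and RSS text.
posts = [
    "dozens hospitalized after flooding downtown",
    "power out downtown shelters are open",
    "two dead and several injured in building collapse",
]

vectorizer = TfidfVectorizer(stop_words="english")
tfidf = vectorizer.fit_transform(posts)  # rows = posts, columns = terms

# Show the three highest-weighted terms for the first post; terms unique to
# one post (e.g., "hospitalized") score higher than terms shared across posts.
terms = vectorizer.get_feature_names_out()
weights = tfidf[0].toarray().ravel()
print(sorted(zip(terms, weights), key=lambda t: t[1], reverse=True)[:3])
```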

The solution framework built by the agency was implemented in the cloud – on virtual servers – taking advantage of the cloud's flexible computational power and storage. The new cloud infrastructure also allowed for data capture and the use of a data analysis and visualization tool, Splunk, to mine and analyze vast amounts of data in real time, while at the same time outputting the characterization of, and forecasting the metrics related to, various captured events.
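The article does not detail how events reached Splunk, but a minimal sketch of one common pattern – forwarding JSON events to Splunk's HTTP Event Collector (HEC) – is shown below. The endpoint, token, and sourcetype are placeholders, not the agency's configuration.

```python
import json
import requests

# Placeholder endpoint and token; HEC accepts JSON events over HTTPS
# once a collector token has been provisioned on the Splunk side.
HEC_URL = "https://splunk.example.com:8088/services/collector/event"
HEC_TOKEN = "00000000-0000-0000-0000-000000000000"

def send_event(event: dict) -> None:
    """Forward one normalized social media record to Splunk for indexing."""
    response = requests.post(
        HEC_URL,
        headers={"Authorization": f"Splunk {HEC_TOKEN}"},
        data=json.dumps({"event": event, "sourcetype": "social_media"}),
        timeout=10,
    )
    response.raise_for_status()

# Example (requires a reachable Splunk instance):
# send_event({"text": "shelter full on 5th st", "source": "twitter"})
```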

Using Data Management to Improve Understanding 

The agency’s solution included the use of dashboards that characterized the emergency events captured by and reported in the social media. The visual analyses that were generated included such helpful operational tools as event extraction counts, time series counts, forecasting counts, a symptom tag cloud, and geographical isolation. The algorithms were written in Python and incorporated into Splunk, hosted on Amazon Web Services (AWS).
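As a simplified illustration of the time-series and forecasting counts, the Python sketch below buckets invented event timestamps by hour and projects the next hour with a naive moving average. The agency's actual forecasting models are not public, so the forecast rule here is purely a placeholder.

```python
from collections import Counter
from datetime import datetime

# Invented event timestamps; real inputs would come from the indexed feeds.
events = [
    "2012-10-29T14:05", "2012-10-29T14:40", "2012-10-29T15:10",
    "2012-10-29T15:25", "2012-10-29T15:50", "2012-10-29T16:30",
]

# Time series counts: bucket events by hour.
buckets = Counter(datetime.fromisoformat(ts).strftime("%Y-%m-%d %H:00")
                  for ts in events)
series = sorted(buckets.items())
print(series)  # [('2012-10-29 14:00', 2), ('2012-10-29 15:00', 3), ('2012-10-29 16:00', 1)]

# Forecasting count: a naive two-hour moving average stands in for the model.
counts = [n for _, n in series]
print(f"forecast for the next hour: {sum(counts[-2:]) / 2:.1f}")
```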

The solution framework captured live, streaming open-source media such as Twitter and RSS (Rich Site Summary) feeds. Building upon the current best practices used in the cyber-terrorism community, the new solution enables near real-time situational awareness through a stand-alone surveillance system capable of capturing, transforming, and analyzing massive amounts of social media data. By leveraging that data and its related analytics to develop more timely and more accurate disaster characterization, the agency is able to plan and respond more effectively as well.
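Capturing an RSS feed, for example, takes only a few lines of Python. The feedparser library and the feed URL below are illustrative assumptions, not the agency's actual tooling; Twitter capture would go through that platform's streaming APIs instead.

```python
import feedparser  # third-party library: pip install feedparser

# Placeholder feed URL; any RSS or Atom feed is parsed the same way.
FEED_URL = "https://www.example.com/alerts.rss"

def capture_feed(url: str) -> list[dict]:
    """Pull the latest entries from a feed into normalized records."""
    feed = feedparser.parse(url)
    return [
        {"title": entry.get("title", ""),
         "published": entry.get("published", ""),
         "link": entry.get("link", "")}
        for entry in feed.entries
    ]

for record in capture_feed(FEED_URL):
    print(record["published"], record["title"])
```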

The future of this understanding and analysis of data is not limited, though, to the realm of social media. The federal government: (a) is in a unique position to harness the capabilities built by the intelligence community in order to cope with weather emergencies and other disasters; and (b) also can provide – to state and local governments – the tools they need to use the data at all levels of government to make more judicious resource decisions, understand the risks and threats involved, and both respond and recover more quickly when major weather and/or other emergency situations do develop.

Collectively, big data, the cloud, and analytics seem to be on course to be the next “Big Thing” in emergency operations and, not incidentally, to serve as one of the most cost-effective ways of building and securing a truly resilient nation.

Marko Bourne

Marko Bourne is a principal at Booz Allen Hamilton and a DomPrep40 advisor. He leads both the company’s FEMA market team and its Emergency Management and Response practice, and has more than 27 years of experience in emergency services; emergency management; policy, governmental, and legislative affairs; and public affairs. Before joining Booz Allen Hamilton, he was FEMA’s director of policy and program analysis (2006-2009) – and, earlier, director of business development for homeland security (2004-2006) at Earth Tech Inc./Tyco International. He also served as acting director of the DHS National Incident Management System Integration Center and as deputy director of FEMA’s Preparedness Division (2003-2004).
