Forum: From Data to Decisions to Action

Edited by Michael J. Keegan


Introduction: From Data to Decisions to Action—The Evolving Use of Data and Analytics in Government

In a world inundated with all kinds of information, timely, relevant, and more predictive data can drive better decision making. Private sector companies have been analyzing data successfully for some time to gain a competitive edge, improve decision making, achieve better financial outcomes, and improve customer service. Now, government agencies, operating in a similarly data-intensive environment, are pressed to do the same. Although the competitive edge for government agencies isn't the same as in the private sector, today's government executives seek a performance edge: increasing the efficiency of operations, effectively meeting their missions, mitigating risks, and increasing citizen engagement and public value.

Data are deeply woven into the very fabric of our lives with the proliferation of mobile technology and the acceleration of computational power. Within this context, it is no surprise that government agencies grapple with how to integrate their disparate data sources, build analytical capacities, and move toward a data-driven decision-making environment. Data analytics can be a powerful tool to help government executives leverage all sorts of data, big or small. Analytics uses data—structured and unstructured—to uncover patterns, identify opportunities, seek parallels, formulate predictions, and inform decisions. It has the potential to transform information into insights—taking diverse volumes of data and predicting the most likely outcomes of key decisions or events. These insights can enhance an organization's performance. In the end, it is about using data to strengthen agency decision making and inform government action.

From 2011 to 2014, the Partnership for Public Service and the IBM Center for The Business of Government collaborated on three reports and a series of podcasts on using data and analytics:

  • The Power of Analytics – This report set out to study federal agencies’ use of analytics and how it helped them achieve better program results. It focuses on identifying leading practices that illustrate how data informs decisions and drives meaningful and positive program changes.
  • Building an Analytics Culture – The second report examines what it takes to build analytics into an agency’s decision-making process and culture. The report includes concrete steps for building a disciplined approach to analytics and profiles seven agencies that are using analytics to achieve better results.
  • Lessons From Early Analytics Programs – The third report examines long-standing programs and how they have advanced and evolved over time to be a sustainable component of an operation. It highlights five analytics-based efforts, one of which began more than 25 years ago. Based on these cases, the authors identify a series of lessons that they saw as important if analytics are to be successfully embedded in an agency's culture.
  • Conversations on Big Data – The fourth and final contribution is a series of podcast interviews with federal leaders in which they describe how they are using data analytics to prevent and counter tax fraud, improve training, respond to emergencies, protect investors, keep our food supply safe, and more. These podcast conversations are designed to provide insights into the essential ingredients for a successful analytics program and offer advice from leaders whose agencies are benefiting from analyzing data.

This forum is dedicated to highlighting the insights, findings, best practices, challenges, and successes outlined in each of these efforts. This forum’s contributions are largely edited excerpts from each of the reports referenced above and from conversations with government executives.


The Power of Analytics

The initial contribution to this forum sets out to study federal agencies’ use of analytics and how analytics helped them achieve better program results. It focuses on identifying leading practices that illustrate how data informs decisions and drives meaningful and positive program changes. It is excerpted from the first report of the Partnership for Public Service and IBM Center collaboration, From Data to Decisions: The Power of Analytics.

Batting average isn't the best way to determine the effectiveness of a hitter. The Oakland Athletics learned this while doing statistical analyses of players and trying to build a winning team during its 2002 season. “They took everything that happened on the baseball field and sliced it and diced it to its most elemental parts,” Michael Lewis, author of the book Moneyball, said in a radio interview. The A's surprised just about everyone with their new-found success on the field, besting teams that had millions more to spend on recruiting top players.

Federal agencies don't field baseball teams, obviously. But they too collect valuable data that tell important stories about how they're doing in carrying out their missions. Virtually every agency collects data, but many struggle to turn that data into useful information that can inform and drive decisions. Yet trends in the data can pinpoint problems, underscore successes, and steer officials toward alternatives and perhaps better ways of carrying out their programs.

Agencies that have extracted the important lessons from their data and relied on the information to manage performance have reduced marine accidents, improved the quality of the care patients received in nursing homes and improved how Social Security services are delivered. The data became useful information that staff relied on to analyze programs and improve results and, yes, sometimes hit the ball out of the park.

Whether agencies have fully immersed themselves in analyzing data or have just begun the process, some basics have become apparent. If agencies want to improve program effectiveness and efficiency, they need to manage performance; and to do so, they have to measure it. The measures they choose need to be meaningful and linked to a desired goal or result. If ending veterans' homelessness is the goal, for example, a better indicator of success is likely to be how many veterans actually get into housing, not how many housing vouchers are issued.

Heed the Clarion Call

The clarion call to fix government has put great pressure on federal agencies to manage better and to be accountable and transparent in the process. In the midst of the tremendous fiscal uncertainty the nation now faces, and with public attitudes toward government at an all-time low, it is more critical than ever that federal leaders base their decisions on accurate data rather than on anecdotes, incomplete information, or the belief that things will work out for the best—particularly when those decisions have huge consequences for how tax dollars are spent and how society is affected.

What stories do the data tell?

All federal program managers could run their programs better by analyzing their data, but it takes effort to begin. Even if managers believe it is important and necessary, it isn’t necessarily easy. At the heart of knowing how well an organization or program is performing and where leaders need to focus greater attention is analytics. Broadly defined, it is the extensive and systematic use of data, statistical and quantitative analysis, and explanatory and predictive models to drive fact-based actions for effective management. It sounds intimidating, but simply stated, analytics is the process of turning data into meaningful information that program staff and agency leaders can use to make decisions.
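
To make that idea concrete, here is a minimal sketch of "turning data into meaningful information": counting an easy output (vouchers issued), computing the outcome measure actually linked to the goal (veterans housed), and comparing it with a target. The records, field names, and the 80 percent target are illustrative assumptions, not drawn from any agency system.

```python
# Minimal sketch (hypothetical data): turning raw program records into a
# goal-linked measure a manager can act on. Field names and the target
# are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class VoucherRecord:
    veteran_id: str
    voucher_issued: bool
    housed: bool  # did the veteran actually move into housing?

records = [
    VoucherRecord("V001", True, True),
    VoucherRecord("V002", True, False),
    VoucherRecord("V003", True, True),
    VoucherRecord("V004", True, True),
    VoucherRecord("V005", True, False),
]

vouchers_issued = sum(r.voucher_issued for r in records)
veterans_housed = sum(r.housed for r in records)

# The output (vouchers issued) is easy to count; the outcome (veterans housed)
# is the measure tied to the goal of ending homelessness.
housed_rate = veterans_housed / vouchers_issued
target_rate = 0.80  # illustrative target

print(f"Vouchers issued: {vouchers_issued}")
print(f"Veterans housed: {veterans_housed} ({housed_rate:.0%})")
print("Below target: investigate barriers to lease-up"
      if housed_rate < target_rate else "On track")
```

The numbers matter less than the habit they illustrate: the measure a manager watches should be the one tied to the result, not the activity count.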

Case Studies of What Works

From Data to Decisions: The Power of Analytics, from which this contribution is excerpted, sets out to study federal agencies' use of analytics and how it helped them achieve better program results. It focuses on identifying leading practices that illustrate how data informs decisions and drives meaningful and positive program changes. In particular, its authors sought to determine how employing good data led to changes in how agencies think about their programs and how this led to programmatic insights that influenced their decisions. Seven programs in eight agencies (one program is a collaboration between two agencies) that had experience using analytic strategies and techniques were reviewed. Three programs were reviewed in depth, with some lessons taken from four others. We focused on mission programs in agencies to illustrate how using analytics can lead to beneficial changes that help agencies meet program goals. We believe the techniques these agencies have used are transferable to other agencies, regardless of previous experience using data.

The agency programs examined in greater depth are:

  • The HUD-Veterans Affairs Supportive Housing (HUD-VASH) program, jointly administered by the Department of Housing and Urban Development (HUD) and the Department of Veterans Affairs (VA);
  • The Safety Management System (SMS) in the Federal Aviation Administration (FAA); and
  • The Department of Health and Human Services' (HHS) Centers for Medicare & Medicaid Services (CMS) Medicare program, specifically the nursing home and transplant programs.

The full report also highlights compelling programs from four other agencies, including:

  • Coast Guard’s Business Intelligence system (CGBI);
  • “Click It or Ticket” campaign by the National Highway Traffic Safety Administration (NHTSA);
  • Department of the Navy’s Naval Aviation Enterprise (NAE); and
  • Social Security Administration’s (SSA) use of mission analytics in customer service.

For a more detailed treatment of these programs, please download the full report at businessofgovernment.org/report/data-decisions-power-analytics.

Different Stages of Maturity

The review of these programs shows agencies at different stages of maturity in using analytics and illustrates the agencies' continuum of progress as they journey from collecting data to analyzing it to using it to manage their programs. Some agencies, such as NHTSA and SSA, have decades of experience using data to set goals. Others, such as HUD, are newer in the data arena but now are implementing agency-wide analytics programs. But at all of the agencies we reviewed, regardless of the level of sophistication in their analytics programs, data analysis helped provide insights into how to improve programs. And all of the agencies found they needed to change agency culture to take full advantage of an analytics mindset.

Data Are Only the Starting Point

Collecting the data is only the first step. The data then needed to be analyzed, turned into information, and made accessible to staff and executives in forms that met varying needs and were understandable to different audiences. The value of the data came from the stories this information told. Agencies also had to develop meaningful performance measures to assess progress toward their program goals. We found that those measures changed over time, and it was important that they remain meaningful, reliable, and tied to results.

Common Practices from Data to Insight to Decision Making

The agencies highlighted in the full report and referenced in this forum were similar in the ways they gathered data and turned the information into knowledge that improved their program results:

  • Leaders focused on transparency, accountability, and results.
  • Staff had a clear line of sight from where they stood to the desired goals and outcomes.
  • Agencies invested in technology, tools, and talent.
  • Agencies cultivated and leveraged partnerships across the agency and with partners who deliver services.

In reviewing agencies on their road from data to insight to decision making, it became clear that developing an analytics mindset is not a short-term effort, but an evolutionary process that takes time and a commitment to performance management. Managers must weave into their organizations' fabric a dedication to continuous improvement.

Finally, research indicated that some agencies may be derailed by myths that surround the process of analytics and measuring results. It is important to debunk these myths.

Key Findings

  • The analytics process turns data into meaningful information that program staff and agency leaders can use to make good decisions.
  • Leadership support and analytics are cornerstones of performance management, which requires supervisors and managers to identify problems, assess progress, and share results.
  • For analytics to become accepted widely, leaders should set expectations and call for accountability.
  • Non-experts, whether leaders or line employees, need data that they can access easily, understand, and tailor to their needs.
  • Collaborating with partners and stakeholders enables agencies to share data for analytics use, improving results.
  • Sharing data requires transparency.
  • The goal is to foster analytical insights, whether agencies have state-of-the-art data tools or less advanced software.
  • For analytics to succeed, employees need a supportive environment, training, and encouragement to use and experiment with data.

 

Building an Analytics Culture

The next contribution to this forum examines what it takes to build analytics into an agency’s decision-making process and culture and is excerpted from the second installment of the Partnership for Public Service and IBM Center collaboration, From Data to Decisions II: Building an Analytics Culture. It includes concrete steps for building a disciplined approach to analytics and profiles seven agencies using analytics to achieve better results.

The initial effort highlighting the power of analytics in government sparked an overwhelmingly positive response from agency leaders and federal performance management practitioners who asked, “Where do we go from here? How do we get an analytics program started?” Their reactions demonstrated a hunger to understand how to develop and grow an analytics culture within their agencies and incorporate it into how they perform their missions. Building an Analytics Culture looks at day-to-day practices that can help build and sustain an analytics culture, drive meaningful changes, and achieve mission results. The goal of this research is to provide practical approaches, practices, or strategies that agency program managers can apply. The authors hope that sharing compelling stories of how agencies are developing, growing, and sustaining their analytics and performance-management approaches will shed light on key steps and processes that are transferable to other agencies.

Through this effort, we set out to learn what is working for managers and staff and what is not; specifically: how they are using analytics, how they got started, what conditions helped to grow their approaches, what challenges arose and why, and what success looks like. There were many parallels in approach across agencies and programs. Driven by budget realities and the push for more data-driven actions, agency managers were examining their programs in a disciplined, comprehensive way to determine how they conduct their business.

As part of this effort, four focus groups representing a cross-section of agencies and a mix of roles—managers, program staff, and analytics staff—were convened. We reviewed analytics efforts at the program level in seven agencies that vary in their missions, size, and reach:

  • Federal Emergency Management Agency (FEMA) Recovery Directorate, the FEMA Logistics Management Directorate, and the Transportation Security Administration, all within the Department of Homeland Security;
  • Bureau of Indian Affairs;
  • Air Force;
  • Internal Revenue Service; and
  • Food and Drug Administration (Center for Drug Evaluation and Research and Center for Devices and Radiological Health) and National Institutes of Health (National Institute of Biomedical Imaging and Bioengineering and National Institute of Allergy and Infectious Diseases), all within the Department of Health and Human Services.

We targeted a range of agencies whose diverse missions would enhance the transferability of our findings. A full description of the agencies we studied and more details on why they were selected can be found in the complete report, which can be downloaded at businessofgovernment.org/report/data-decisions-ii.

Building an analytics culture involves:

  • Starting with a systematic and disciplined approach.
  • Making analytics the way you do business.
  • Getting the people piece right.

Start with a Systematic and Disciplined Approach

Where does an agency start in its effort to build or expand the use of analytics for making program decisions? What are key steps or actions that will create a culture that values and uses data for program management day to day?

The first step is to get a solid understanding of the agency’s program goals and objectives. Revisit the basic activities an agency, unit, or program performs and what resources, conditions, and other factors go into those activities; tie those activities directly to what they are intended to achieve; and then link those results to the agency’s goals. Focusing on these details will help agencies employ a data-driven approach to managing programs, help them identify the critical information needed to gauge progress and measure results, and ensure that only those activities that are key or essential to meeting desired results are performed.

By instituting systematic processes, agencies start building a foundation so they can look critically at what they do and thoroughly understand how their activities can lead to better results. As part of this endeavor, agency program managers are examining what they do now, what they need to do to improve, and what resources are available. They are systematically mapping out the roles and activities of staff and stakeholders and refining the performance goals, data, and metrics that are in place, or should be. And they are comparing the data they need with what they collect and analyzing performance measures to examine how they affect results. These analyses highlight the causes and effects of individual and agency actions, including unintended consequences. Using this knowledge to make decisions holds great promise for improving agency performance.

To travel down the analytics road, managers must challenge time-worn assumptions and embrace qualitative measures that are linked to impact. As with any new activity, managers need to be comfortable experimenting and learning and then making changes that improve performance. The agencies we reviewed also deemed it important to find a common language to make sure terms were defined the same way for all—whether they were working with program staff, analytics staff, subject-matter experts, or stakeholders.

Together, they challenged assumptions by re-examining and asking basic questions about performance measures. Are they meaningful? What do they measure? What should be measured? Are the right data being collected? Are they reliable? They revisited data that had been collected to see if it was useful for achieving results. They focused on the questions and the clarity of their goals rather than on the systems or technologies for processing the data. And they tried to be rigorous and disciplined about each stage so that the questions asked and actions taken were consistent each time. In some cases, they set up pilot programs and learned from interim results what needed to be adjusted.

Steps to Get Started

Put together a team that includes people familiar with the work being performed, staff with analytical skills, and subject-matter experts. Bring in key partners and stakeholders and include people who aren’t part of the process but who have a vested interest in the outcome and are willing to challenge the status quo.

  1. Ask questions even if they can’t be answered with current data. The exercise will help highlight what data or other types of information are needed. In fact, craft questions with the understanding that asking the questions will lead to data gathering.
  2. Using the questions as a starting point, brainstorm to define a current process or activity and what a future, improved version or result might be, focusing on top issues and agreeing on a desired outcome or outcomes.
  3. Take large issues and break them into smaller, workable components. This will provide a quick demonstration of the value of data, which can stoke interest in analytics and convince staff to use analytics in their work. An incremental approach can not only rapidly show benefits but also allow the process to be tested to learn what can be improved and to refine data analysis requirements—all of which can help determine what automated tools or systems are needed. Pilot projects also afford the opportunity to demonstrate proof of concept and the value of the planned actions for all involved, and to show that the actions are doable. The “proof” can go a long way toward laying the foundation for buy-in.

Make Analytics the Way You Do Business

For analytics to become an integral part of agency activities, leaders must live by example, using data for decisions in an open and transparent manner. Leaders can be at any level within an organization, but making analytics a way of doing business requires them to be relentless in their efforts to make decisions based on facts, rather than relying on gut instinct or conventional wisdom. Incorporating analytics into day-to-day management activities can change attitudes, transform how work is done and affect results. However, to allay fear in the workplace, the analytics emphasis needs to be on learning how to improve performance, not on placing blame.

Steps to Get Started

  1. “Prepare the troops” by explaining the importance of data and communicating a vision of how that data will be used in decision making; and share, share, share. Provide clear and meaningful information to employees and important stakeholders that communicates what the team is doing and learning, as well as the next steps. Everyone directly impacted by the work of the team needs to be kept in the loop.
  2. Get to know the data and understand what they mean. Then lead by example, using information from the data to make decisions.
  3. Encourage collaborative partnerships across the agency, with other agencies, and with key partners and stakeholders outside the federal government.
  4. Take the initiative and show passion for working on problems that stymie organizational performance. Fight complacency and seek opportunities for changing business as usual.
  5. Raise issues, demonstrate knowledge about them, and suggest ways to do a job better or achieve better results. Question data and help identify ways to improve data quality and usefulness.

Get the People Piece Right

Leaders need to communicate the importance of analytics and build the staff capacity to take advantage of them. They also need to understand how employee morale and performance are connected to how well an organization functions. An effective way to institute or expand an analytics program is by working with program staff individually or on a project to demonstrate the usefulness of data analytics, according to agency analytics teams with whom we spoke.

Part of the effort includes using change-management strategies as an agency builds and grows an analytics culture. The shift in how agency work is done challenges business as usual and compels employees to do things differently; this can cause great unease. Steps for introducing change and gaining acceptance for it include communicating purpose and vision, engaging staff and stakeholders, eliciting feedback, and sharing information.

Agencies we spoke with are bringing in frontline program people, analytics staff, subject-matter experts, and stakeholders as they apply systematic approaches for gaining a deeper understanding of their activities. They hope to improve performance and make it routine to achieve results based on analytics. They are building relationships within and between agencies and identifying shared goals among stakeholders. The approach also helps to break down silos that have walled these partners off in the past.

Enhancing staff capacity to analyze data and getting staff to share knowledge within and across agencies are important ingredients for sustaining an analytics program. The more that staff members understand how to analyze and use data, the more they appreciate the power of analytics to carry out an agency’s or program’s work effectively.

Steps to Get Started

  1. Tap the expertise of others—inside and outside the program, workgroup, and agency—by building networks and communities of practice and sharing knowledge and expertise with colleagues.
  2. Recruit multidisciplinary staff and people with experiences outside the agency or immediate workgroup who can challenge conventional wisdom, think beyond the status quo, and bring valuable insights, knowledge, and lessons learned from other experiences. Get a fresh perspective by tapping people from different disciplines to look at data and approaches.
  3. Provide opportunities for employees to move from program or line offices to analytics staff offices within one agency, and among line and staff offices in other agency organizations, to broaden knowledge and perspective and to share their expertise. This can be done through work details, cross-functional teams, rotational assignments, or reassignments, and can benefit both staff and the agency.

Conclusion

Analytics is an essential component of good management and a foundation for effective performance management and sound decision making. However, there is no single path to success for building, growing, and sustaining the use of analytics for better performance. As our stories show, there are many roads to get there. It is a learning process. Curiosity and the desire to perform well drive the use of analytics. For the agencies we studied, success often bred success. These organizations approached analytics efforts in a systematic, disciplined way that everyone in the organization could observe and understand. Revisiting the basics—mission, goals, objectives, inputs, processes, activities, outputs, and outcomes—and studying how all of the aspects connect, are fundamental to identifying the data needed for an agency to effectively manage and attain the desired results.

Along with a disciplined approach for data collection and analysis, which lends support for good management practices, the way to sustain improvement is to gain acceptance from staff and stakeholders and support from leadership. Top leaders can be in the analytics driver’s seat or support initiatives started elsewhere in the agency, but along the way, it is critical for people to feel ownership of the processes. Most important is that leaders incorporate analytics as a way of doing business, making data-driven decisions transparent and a fundamental approach to day-to-day management. When an analytics culture is built openly, and the lessons are applied routinely and shared widely, an agency can embed valuable management practices in its DNA, to the mutual benefit of the agency and the public it serves.

Key Findings

  • To get started with an analytics program, create a team with agency experience, analytical skills, and subject-matter expertise.
  • Craft questions about work processes and other agency activities that will lead to data gathering and improvements by: defining a current process, describing an improved state, focusing on top issues that need to be addressed, and agreeing on a desired outcome.
  • Determine tools or systems needed and show benefits rapidly; then test and refine data requirements.
  • Communicate accomplishments and next steps clearly and meaningfully to get people on board.
  • Know and understand the data collected and use it to make decisions.
  • Encourage collaborative partnerships internally and with other agencies and partners outside the federal government.
  • Bring in people from various disciplines who will examine data and approaches from different perspectives.

 

Lessons from Early Analytics Programs

The third contribution to this forum examines long-standing programs and how they have advanced and evolved over time to become a sustainable component of an operation. This contribution is excerpted from the third installment of the Partnership for Public Service and IBM Center collaboration, From Data to Decisions III: Lessons From Early Analytics Programs.

Today’s senior managers are tempted to begin analytics programs before determining the mission-essential questions they are seeking data to answer. Older data-based analytics efforts often grew out of the discoveries of line employees who made connections and saw patterns in data after receiving new software or hardware that helped them make sense of what they were studying.

The report highlights five analytic efforts that started before the terms “big data” and “analytics” were in use, let alone in vogue. Examining programs that have been in operation for a longer period of time provides a better understanding of how they have advanced and evolved over time to become a sustainable component of a program’s operation.

A more detailed assessment of these analytics efforts can be found in the full report on which this contribution is based, available for download at businessofgovernment.org/report/data-decisions-iii. These include cases from the U.S. Agency for International Development (USAID), the Centers for Disease Control and Prevention (CDC), the Defense Department (DoD), the Animal and Plant Health Inspection Service (APHIS), and the Veterans Health Administration (VHA). Based on these cases, lessons were identified as important for analytics to be successfully embedded in an agency's culture.

Lessons Learned from the Case Studies

Collaborate with other agencies to collect data and share analytics expertise

Save money and effort, and increase the speed of analytics adoption, by acquiring data and services, such as collection, analysis and modeling tools, from other agencies. Analytics pioneers shared and added to one another’s data and expertise in a variety of ways.

  • Most often, they used legal authorities to buy data and the experts and software to analyze it.
    • Some used the government-wide provisions for interagency acquisitions under the 1932 Economy Act.
    • Others relied on agency-specific authority, such as the participating agency program and service agreements provided for under the 1961 Foreign Assistance Act, which created USAID and permits it to use other agencies’ resources when they are uniquely suitable for technical assistance in education, health, housing or agriculture.
  • Another form of interagency agreement, a memorandum of understanding, enabled U.S. Customs and Border Protection (CBP) to collect data and USDA's APHIS to analyze it, helping both agencies meet their mission goals.
  • The National Aeronautics and Space Administration (NASA) has created research and development programs whose funding is contingent on recipients’ promoting their products to other agencies that can apply them and might invest in developing them further.
  • CDC used grant money to help public health labs acquire the equipment they use to process DNA samples for matching against the PulseNet database.
  • Annual science days give FEWS NET collaborators insights into each other’s work on famine, preventing duplication and augmenting other projects across participating agencies.

Develop data to determine return on investment for analytics programs

Mature analytics programs have struggled to define and measure the outcomes of their efforts. New projects, too, are challenged to demonstrate return on their data investments. Reporting improved outcomes, such as increased numbers of foodborne illness outbreaks detected or enemy combatants identified, is a bottom-line requirement for mission analytics programs. But just reporting better outcomes is not sufficient, especially now that sequestration is compelling programs to compete fiercely for scarce dollars. Demonstrating return on investment (ROI) is no longer optional. The most powerful ROI estimates mix real-world results and cost-benefit analysis; a simple worked example of that arithmetic follows the list below.

Long-term data users often had no ROI measures when they began, but developed and adapted them as their projects evolved:

  • Most initially reported improved mission outputs and outcomes; for example, increases in CDC’s identification of foodborne illness outbreaks and of DoD’s numbers of biometrics matched with the subjects on the high-value-target watch list.
  • Increasingly, however, agencies are called upon to deliver cost-benefit and ROI metrics in monetary terms so that agency leaders can compare program costs to determine whether data-based efforts are more or less cost effective than alternate strategies.
  • To demonstrate ROI, mission analytics programs learned to devote resources to develop data to track financial and other results related in whole or in part to analytics.
  • Predictive analytics programs are still refining their cost-benefit metrics and findings and must take care in estimating costs avoided; for example, ensuring they report all actual and projected costs.
  • To improve their ROI estimates, analytics programs can employ surveys and audits, use experimental methods such as secondary screening, and increase and enhance the data they collect.
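
As an illustration only, the arithmetic behind such an estimate is straightforward once cost and benefit data exist. The categories and dollar figures below are hypothetical, not drawn from any of the programs studied; the harder work in practice is collecting defensible numbers for each line item.

```python
# Illustrative ROI arithmetic for an analytics program (all figures hypothetical).
# Program costs should include staff, tools, and data acquisition, not just
# software licenses; costs avoided should be estimated conservatively.
program_costs = {
    "analysts_and_training": 1_200_000,
    "software_and_infrastructure": 800_000,
    "data_acquisition": 300_000,
}
benefits = {
    "improper_payments_prevented": 4_500_000,  # e.g., estimated from audited cases
    "inspection_hours_saved": 600_000,         # hours saved x loaded labor rate
}

total_cost = sum(program_costs.values())
total_benefit = sum(benefits.values())

roi = (total_benefit - total_cost) / total_cost
benefit_cost_ratio = total_benefit / total_cost

print(f"Total cost:    ${total_cost:,.0f}")
print(f"Total benefit: ${total_benefit:,.0f}")
print(f"ROI: {roi:.0%}  (benefit-cost ratio: {benefit_cost_ratio:.1f}:1)")
```

With these assumed figures the program returns roughly 122 percent on its cost; the point of the sketch is simply that ROI claims require tracked costs and measured or audited benefits, not just reported outcomes.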

Give agency leaders clear, concise analysis and proof of adoption and results they can use to support data-driven programs

Although most analytics users wish everyone would immediately understand and appreciate their findings, it doesn't always happen. Among the toughest ROI to demonstrate is for analytics programs that marshal data about unpopular truths to persuade reluctant leaders in government and other organizations to act. Presentation is especially important for top officials whose time and attention are limited but whose support is vital. Data visualization—charts, graphs, maps, and models—makes analytical findings easy to comprehend quickly.

Data program developers with long track records found they had to deliver analysis leaders could use and support:

  • The absence of a powerful sponsor can hobble an analytical effort, even when it shows mission achievements, as DoD biometrics backers have discovered—especially now, when programs vie for funding as budgets are cut.
  • Mature programs struggled when delivering analytics-based messages leaders didn’t want to hear; but these programs made headway when those behind them persisted in presenting the supporting data.
  • Program managers learned to use leaner, punchier, and more visual methods to present their findings so that senior officials could absorb them and get the main points quickly.
  • Programs that grew from the grassroots survived resistance by demonstrating their effectiveness in terms of broad user adoption.

To encourage data use and spark insights, enable employees to easily see, combine and analyze it

Standardize data so users can look across it by time, entity, geography, source, and other attributes to find linkages and patterns and share information. Letting intended users test-drive analytics tools and muck around in the data spurs discoveries that can save time, ease adoption, and ensure success.
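
A minimal sketch of what that standardization can look like follows. The two data feeds, field names, and lookup table are hypothetical; the point is that once records share a common schema, staff can slice the same data by time, geography, or source without rework.

```python
# Minimal sketch (hypothetical feeds and field names): standardizing records
# from two sources into one schema so they can be sliced by time, geography,
# and source.
from collections import Counter
from datetime import datetime

feed_a = [  # e.g., a field office spreadsheet export
    {"case": "A-17", "date": "03/02/2014", "state": "Texas"},
    {"case": "A-18", "date": "03/09/2014", "state": "Texas"},
]
feed_b = [  # e.g., a regional system with different conventions
    {"id": "B-204", "reported_on": "2014-03-05", "st": "TX"},
]

STATE_CODES = {"Texas": "TX"}  # illustrative lookup table

def standardize(record, source):
    """Map a source-specific record onto a common schema."""
    if source == "feed_a":
        when = datetime.strptime(record["date"], "%m/%d/%Y").date()
        return {"entity": record["case"], "date": when,
                "state": STATE_CODES[record["state"]], "source": source}
    when = datetime.strptime(record["reported_on"], "%Y-%m-%d").date()
    return {"entity": record["id"], "date": when,
            "state": record["st"], "source": source}

combined = ([standardize(r, "feed_a") for r in feed_a]
            + [standardize(r, "feed_b") for r in feed_b])

# Once standardized, the same records can be counted by state and month,
# by source, or by any other shared attribute.
by_state_month = Counter((r["state"], r["date"].strftime("%Y-%m")) for r in combined)
print(by_state_month)  # Counter({('TX', '2014-03'): 3})
```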

Projects built on user insights:

  • Moved beyond using data exclusively to measure or compare employee and organizational performance by providing tools that enabled staff to combine, analyze and use data when, where and how they needed it to speed and ease the work process.
  • Incorporated guidance provided through users’ insights, implementing good ideas from the grassroots and recognizing those who suggested them.
  • Refined analytics tools by watching how employees used them to greatest effect, but without disrupting work flow.
  • Made sure that those who collected data also benefited directly from it or clearly understood how it improved mission delivery.
  • Capitalized on employees’ zeal for the agency’s mission to help them overcome reluctance to adopt analytics.
  • Reflected honesty about the potential for analytics to change agency operations and the jobs of those performing them.

Leaders and managers should demand and use data and provide employees with targeted on-the-job training

Once early analytics adopters demonstrated the value of data-driven approaches by showing they saved money, improved outcomes or avoided costs, they sought to institutionalize the use of analytics. They found that one sure way to do this was to teach leaders to demand data. Making analytics standard operating procedure means building it into the agency’s culture and climate. It pervades the culture when managers at all levels use data in planning, measuring results, budgeting, hiring, and running programs, and when they demand that employees’ work activities and requests are data-based as well. That requires on-the-job training in data analysis.

Instilling analytics in all agency activities became a goal once early programs demonstrated gains. It’s an ongoing process involving:

  • Standardizing data to enable users to look across collections by time, entity, geography, source, and other attributes to find linkages and patterns and to share information.
  • Providing formal and on-the-job education.
  • Training that’s appropriate to the organization and the employee’s position.
  • Teaching leaders to base their decisions on data, so that they, in turn, require employees to muster analytics to support their cases for funds, staff, space, and other resources.
  • Launching centers of excellence with expertise in data analytics and in the organization's operations and policies. CMS, for example, houses policy experts along with statisticians in its analytics laboratory. Policy people provide expertise on what is appropriate to bill to Medicare so that the fraud prevention system can be trained to identify what isn't.
  • Spawning data evangelists who encourage use of data-driven techniques and tools beyond their own units across organizations.

Conclusion

The experiences of agencies with mature, data-driven programs reinforce many of the findings in our previous reports:

  • Leaders’ attention and support are critical, so make sure the analysis speaks to them.
  • Users will make or break the move to data-driven operations, so listen to them, make their work easier, and make mission analytics a carrot, not just a stick.
  • Find ways to collaborate within and outside your organization to get data, analysis, expertise, and even funding.

What early data users didn’t do was consciously set out to use “big data.” Instead, they asked hard questions and sought data to answer them: How can we detect foodborne illness outbreaks sooner? How can we estimate the quality of a crop months before it is harvested? How can we identify veterans most at risk of hospitalization or death and then target the right care to keep them healthier and at home? How can we focus inspections on the containers that are most likely to hold insects? What patterns of billing and behavior reveal fraud? Those questions and others propelled these users to collect and analyze data, which then became standard operating procedure and helped their programs evolve.


Conversations on Big Data

The fourth contribution to this forum is excerpted from the Conversations on Big Data podcast series, which is the final installment in the Partnership for Public Service and IBM Center collaboration on Data to Decisions. These podcast conversations are designed to broaden the perspective to additional agencies as well as revisit some of those covered in the previous reports profiled in this forum; provide insights into the essential ingredients for a successful analytics program; and offer advice from leaders whose agencies are benefiting from analyzing data. The federal leaders who participated in this series have successfully implemented key data-driven programs and discuss how they are using data analytics to prevent and counter tax fraud, improve training, respond to emergencies, protect investors, keep our food supply safe, and more.

Conversations on Using Analytics to Improve Mission Outcomes

We provide highlights from these federal leaders on the most important ingredients for a successful analytics program. (You can watch the video of the panel discussion and listen to each of the seven podcast interviews too.) The executives profiled lead complex programs in several agencies that have a wide impact on citizens and that benefit greatly from leveraging data as a strategic asset in program operations. What follows are some highlights from those executives and salient take-aways for government and stakeholder groups who are implementing key data-driven programs.

Steve Beltz

Assistant Director, Recovery Operation Center, Recovery Accountability and Transparency Board

“You have to have good data, good analysts and good tools; what I refer to as a three-legged stool approach. If you’re missing just one of those components, you’re going to sell yourself very short on the program and not be able to do a full analysis. Where I see most agencies fail is with the analysts. They can’t shortchange themselves on that. You can have the best data and tools in the world but if you don’t have the right person who knows how to ask the right questions, you’ll get nothing. Somebody has to know how to understand the answer and then dig deeper.”

Malcolm Bertoni

Assistant Commissioner, Planning, Food and Drug Administration

“You need to have champions both on the analytical side and on the program side—some data junkies who really love measuring and understanding and analyzing how an organization ticks, and some program managers on the front lines who get it, who are willing to embrace it and work with the analysts and improve their organization to make that part of their organizational culture.”

Lisa Danzig

Associate Director for Personnel and Performance, Office of Management and Budget

“Engage a set of people who think this could be worthwhile and/or already have a problem or goal they’re trying to achieve and that you could apply this to. This helps you avoid that cycle of collecting hundreds of metrics that aren’t relevant to the problem. It helps tie together the people who ultimately are going to be the advocates—who are the people with the problems and the goals.”

Carter Hewgley

Director of Enterprise Analytics, Federal Emergency Management Agency

“Find a champion at the leadership level in your organization. Assess the culture as it is. If people are not into it, you're going to have to have a different strategy than if they are already on board. Then, you've got to demonstrate a quick win early. Pick a problem they care about and show them really quickly that, ‘Hey, if you did this differently you could save money or you could improve the quality of outcome for the people you're trying to serve, or you could just make people's lives easier.'”

Gerald Ray

Deputy Executive Director, Office of Appellate Operations, Social Security Administration

“The key thing is to put the data scientists with subject matter experts. You have to have someone who is very knowledgeable about your program so they can help the data scientists map through the issues you need to analyze. The data scientists are generally very good at doing the analysis themselves. But you also need the subject matter experts to tell them what part of what they’re finding is relevant and what’s not, and to guide them and change the direction to get it more on task and more appropriate for what you need.”

Dean Silverman

Senior Advisor to the Commissioner, Office of Compliance Analytics, Internal Revenue Service

“Agency leaders and analytics leaders need to learn how to experiment, or use what I would call test-and-learn techniques. I would aim at the hard problems. It sounds counterintuitive, but don’t be afraid to point combined operating IT and data analytics teams at big issues and keep them on a short development cycle. I’d make everyone focus on and measure outcomes, not outputs. Lastly, I’d own analytics at the highest level of the organization, especially if you want to create change.”

Lori Walsh

Chief, Center for Risk and Quantitative Analytics, Securities and Exchange Commission

“There are three fundamental pieces. First is having the right data available. Analytics can help fix holes in data but fundamentally, analytics requires good data. The second piece is the right computing infrastructure and tools, and more sophisticated processing of data. If you’re a nationwide program, you need a good network of computing capabilities so people can work together seamlessly as if they were next door to each other. The third piece of a good analytics program is subject matter expertise. You can do all the analytics in the world on all the data you want, but if you don’t have a focus on what you’re trying to find, you won’t be successful.”


Conclusion

This forum, From Data to Decisions to Action: The Evolving Use of Data and Analytics in Government, concludes by sharing perspectives of select government executives who have taken action and realized the value of using data and analytics to improve mission outcomes. These executives lead complex programs in key agencies that have the greatest impact on citizens. They all underscore the benefits derived from leveraging data as a strategic asset in program operations.

Since 2011, the Partnership for Public Service and the IBM Center for The Business of Government have collaborated in studying the evolving use of data and analytics in government. This forum provides an overview of this effort, from illustrating the power of analytics, to identifying the key ingredients for building an analytics culture. It offers a snapshot of lessons learned from early analytics programs and shares insights from government executives who are implementing key data-driven programs. Its goal is to summarize this valuable collaboration between the Partnership and the IBM Center and help agencies continue to enhance their ability to leverage analytics in a way that improves mission results.

In the end, we hope that these insights are instructive and ultimately helpful to today's government leaders and managers. A more in-depth exploration of the reports and podcasts introduced in this forum can be found at www.businessofgovernment.org/brief/using-data-analytics-improve-mission-outcomes.
