Tuesday, June 30, 2009

Dispatches from the Personal Democracy Forum

As part of a Google fellowship that I received on behalf of Public Performance Systems, I spent the last two days at the Personal Democracy Forum up in New York. I'll finish up our discussion of a Performance Management Framework in the next few days, but I wanted to first report on a number of fascinating highlights and initiatives from the conference, which is meant to be a confluence of government, politics, and technology. Perhaps the most notable observation is that many of the people who used technology to bring Obama to the masses and to victory have, now that the election is over, transitioned into developing tools for governing. It was a well-attended conference, and there's clearly a lot of interest in this space.

Several initiatives were launched or revised at the conference, both at the federal level by the likes of Vivek Kundra and at the city level by Mayor Bloomberg in NYC. New York will launch its Big Apps competition in the fall to encourage developers to build useful applications that help the city deliver its vast data sets to the public. Kundra, for his part, announced revisions to the data.gov website and highlighted updates including usaspending.gov and the new dashboard at it.usaspending.gov. The latter is a slick application, although I've always contended that there is no shortage of dashboarding software out there and that the truly difficult part of presenting information is improving data quality and feedback. Nevertheless, these were all great initiatives.

Another interesting set of initiatives is under way at blog.ostp.gov and mixedink.com/opengov, both of which are meant to foster discussion of policy ideas among citizens. While these sites are limited to IT policy discussions, the question was raised whether we're moving in this direction for more general policy formulation (think policy drafted by citizens on a wiki). This may be a little far-fetched, but it was well received by the technology community, which is clearly excited about playing a role in government initiatives that affect how data is used for government transparency.

Although it wasn't discussed as much, government accountability was a major theme, and I talked with several people about what exactly it means. Surprisingly few of them had much experience with performance management, but after I made my case, several agreed that gathering some level of metrics to assess performance would be useful. Many were more focused on taking disparate public data feeds and turning them into something useful. My own interest still lies in helping governments build those data sets in the first place.

It was a great experience and I hope to attend again next year. For a replay of some of the activities, check out personaldemocracy.com.

Thursday, June 25, 2009

Bright Side of Government

Microsoft is sponsoring a great blog featuring stories of successes in government. My company was featured yesterday, and if you have a story to share, feel free to add it.

Sunday, June 14, 2009

Performance Management Analysis

Continuing our exercise in developing a performance management framework, here is a summary of the steps necessary for a successful program (a minimal sketch of the full cycle follows the list):

1. Report performance metric data on a pre-defined schedule.
2. Analyze the data for troubling trends or missed targets, and operationally research the root cause of any problems.
3. Provide corrective action for metrics where the target was missed or the data is trending in the wrong direction.
4. Repeat the process for the next reporting period.
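To make the cycle concrete, here's a minimal sketch in Python. Everything in it is hypothetical: the metric name, the readings, and the rule that a reading above its target needs corrective action (which assumes lower is better, as with response times).

```python
from dataclasses import dataclass

@dataclass
class MetricReading:
    """One reported value for a performance metric in a reporting period."""
    metric: str    # e.g. "avg_response_minutes" (hypothetical name)
    period: str    # e.g. "2009-05"
    value: float
    target: float  # agreed-upon target; here we assume lower is better

def review_period(readings):
    """Step 2: flag readings that missed their target for root-cause research."""
    return [r for r in readings if r.value > r.target]

# Hypothetical history keyed by reporting period (step 1: scheduled reporting).
history = {
    "2009-04": [MetricReading("avg_response_minutes", "2009-04", 7.8, 8.0)],
    "2009-05": [MetricReading("avg_response_minutes", "2009-05", 9.1, 8.0)],
}

for period, readings in sorted(history.items()):
    for r in review_period(readings):  # step 2: analyze the data
        print(f"{period}: {r.metric} at {r.value} missed target {r.target} "
              f"-> research root cause and assign corrective action")  # step 3
    # step 4: repeat for the next reporting period
```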

In previous posts I highlighted the importance of reporting performance metrics in a consistent and well-defined manner. In this post I'll cover the second part of a successful performance management program: analyzing the data from performance measures. A truly operational program must go beyond simple data reporting. Let's take response time from an Emergency Medical Services agency as an example. Is it enough to simply record and report response times? How do you know whether what you're reporting is a "good" average response time? What percent of responses should be below a specified time? It's hard to tell without actually looking at the numbers. The data must be analyzed with an eye toward identifying troubling trends and researching the root cause of potential problems. All too often, data is reported to meet some external obligation and nobody even bothers to look at it!

While complex data analysis is something of a science, there are many cases where any public sector employee can make good use of performance data with no training whatsoever. To be most effective, though, it's necessary to record the findings of any current analysis for use in future situations. Raw data alone is useless to an organization, which makes a narrative of performance measure analysis (and eventually corrective action) a requirement for any successful performance management program.
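As an illustration, that first look at the EMS response-time numbers might be as simple as the sketch below. The response times and the 9-minute benchmark are made up for the example; an agency would substitute its own data and whatever standard it answers to.

```python
import statistics

# Hypothetical response times (in minutes) for one month of EMS incidents.
response_times = [6.2, 7.5, 5.9, 8.8, 12.4, 6.7, 7.1, 9.6, 5.4, 7.9]

THRESHOLD_MINUTES = 9.0  # assumed benchmark; an agency would set its own

average = statistics.mean(response_times)
pct_within = 100 * sum(t <= THRESHOLD_MINUTES for t in response_times) / len(response_times)

print(f"Average response time: {average:.1f} min")
print(f"Responses within {THRESHOLD_MINUTES:.0f} minutes: {pct_within:.0f}%")
```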

First, don't focus on targets or absolute numbers when analyzing performance data. In most cases, analyzing trends over time is far more useful, and simple graphing exercises in a spreadsheet can highlight even modest improvements (or declines) in performance. This not only gives the layperson in government a powerful data tool, but also provides a baseline from which to operate. By focusing on trends, every agency can work from its current baseline, no matter how poor that baseline may be. What's more important: that incident response times in Emergency Services are improving over time, or that they're hitting some arbitrary target? By graphing trends over time, most agencies gain both a powerful tool for evaluating their activities and a starting point for a performance management program, without enduring criticism over missed targets. This isn't to say that comparing performance data against specific targets isn't useful, particularly when proposing new initiatives within an agency (especially ones with a budgetary impact). Hard targets may be necessary to justify an initiative's or project's cost and can be used in declaring it a success.
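Here's a rough sketch of that kind of trend analysis, the same thing a simple spreadsheet chart gives you. The monthly averages are invented; the point is comparing each month to the prior month and to the agency's own baseline rather than to an external target.

```python
# Monthly average response times (minutes) -- illustrative numbers only.
monthly_avg = {
    "2009-01": 9.4, "2009-02": 9.1, "2009-03": 8.9,
    "2009-04": 8.8, "2009-05": 8.5, "2009-06": 8.6,
}

months = sorted(monthly_avg)
for prev, curr in zip(months, months[1:]):
    change = monthly_avg[curr] - monthly_avg[prev]
    # Lower response time is better, so a negative change is an improvement.
    direction = "improving" if change < 0 else "worsening" if change > 0 else "flat"
    print(f"{curr}: {monthly_avg[curr]:.1f} min ({change:+.1f} vs {prev}, {direction})")

# Judge progress against the agency's own baseline, not an external target.
baseline, latest = monthly_avg[months[0]], monthly_avg[months[-1]]
print(f"Change since {months[0]} baseline: {latest - baseline:+.1f} min")
```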

Second, all performance data should be analyzed with an eye to whether a significant change reflects real improvement in programs and outcomes, or whether some other lurking variable is at work. In my own experience, apparent improvements and setbacks are often the result of poor data collection processes or outright errors, not material changes in performance. When a particular data point is out of the ordinary, always rule out data anomalies first (often these are easy to spot simply by graphing the data). In our Emergency Services example, if we're looking at average response times, a few wild outliers could drive up the overall average significantly. What if the outliers were due to a faulty response report, or to a staffing change that caused data not to be reported properly? Government services are fairly stable over time, and one would expect the data to reflect this.
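A quick sketch of that sanity check: compare the mean to the median and screen for implausible readings before treating a jump in the average as real. The numbers and the "three times the median" cutoff are illustrative assumptions, not an official cleaning rule.

```python
import statistics

# One month of response times (minutes) with two suspect values, perhaps
# from a faulty response report rather than a real change in performance.
response_times = [6.1, 7.3, 5.8, 6.9, 7.5, 6.4, 58.0, 7.0, 6.6, 62.0]

mean = statistics.mean(response_times)
median = statistics.median(response_times)

# Crude screen (an assumption for this example): treat anything more than
# three times the median as a data point to verify before trusting the average.
suspects = [t for t in response_times if t > 3 * median]

print(f"Mean: {mean:.1f} min vs median: {median:.1f} min")  # mean pulled up by outliers
print(f"Verify these readings before drawing conclusions: {suspects}")
```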

Finally, assuming there is a real difference in trends and data issues are not the underlying reason, find out operationally exactly what the reason is. This is the most difficult part and often entails getting into the "weeds" of the organization. Operations analysts make a living out of improving organizational processes, but managers with relevant knowledge of their organization should be able to get behind the numbers just as easily to understand what's happening. In these cases, several different metrics may be needed to understand what's affecting a trend. Going back to our response time example, let's assume we measure both the total number of EMS responses in a month and the average response time. If a disaster in a given month caused the number of responses to spike, and we measure that number consistently, it may go a long way toward explaining why average response times increased: the added volume put extra stress on staff. Using multiple measures can help substantiate explanations for trends during the analysis phase.
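A small sketch of pairing the two measures; the figures, the volume-spike rule, and the 8-minute flag are all assumptions made for illustration.

```python
# Two related monthly measures (illustrative figures): call volume and
# average response time. A volume spike may help explain a slower month.
call_volume = {"2009-03": 410, "2009-04": 395, "2009-05": 780, "2009-06": 405}
avg_response = {"2009-03": 7.1, "2009-04": 7.0, "2009-05": 9.4, "2009-06": 7.2}

typical_volume = sorted(call_volume.values())[len(call_volume) // 2]  # rough median

for month in sorted(call_volume):
    note = ""
    # Assumed flags: a month 50% above typical volume AND slower than 8 minutes.
    if call_volume[month] > 1.5 * typical_volume and avg_response[month] > 8.0:
        note = "  <- slow month coincides with a volume spike: a likely explanation"
    print(f"{month}: {call_volume[month]:>4} calls, {avg_response[month]:.1f} min avg{note}")
```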

These are just a few examples of good analysis techniques, but there are many ways to slay this dragon. I reiterate, however, the importance of recording any and all analysis that is done, in order to build a running narrative that informs future analysis of why the data is trending the way it is. We'll get into the corrective action phase in the next post, but reporting a relevant set of performance metrics and then analyzing the data are two huge steps toward an effective performance management system.