Challenges Associated With TIM Performance Measurement Data

The most prevalent issue facing TIM programs is the availability of data and the sharing of data between the agencies responsible for incident response. Discussions of the challenges with performance measurement data identify several common themes, including (10,18,19):

  • Whether the performance measure represents a key concern.
  • Inconsistent definitions.
  • Data availability.
  • Cost of data collection.
  • Data quality/completeness.
  • Data sharing.
  • Data exchange.
  • Data integration.
  • Assuring appropriate comparisons to other operations.
  • Extrapolating from partial coverage.
  • Understanding extraneous influences in the data.
  • Conflicts with other measuring programs – which is “right”?
  • Timeliness of data.
  • Use of performance measures in the allocation of funding.
  • Liability for action (or lack thereof) based on performance measurement results.
  • Responsibility for measures for which there may be limited control.

While this list is generally self-explanatory, selected challenges are addressed in more detail below.

Key Concerns

The analysis of performance measures requires the expenditure of resources for data collection, computation, tracking, presentation, and more.  This is a direct cost to the program or agency.  Therefore, it is unrealistic to think that an agency or program can, or should, collect data for all available performance measures.  In fact, this is a key purpose behind the three national TIM performance measures – to realistically measure and track TIM performance while minimizing the costs to agencies involved in the process.  Any additional performance measures should therefore play a key role in TIM decision-making and planning, avoid unnecessary duplication with other measures, and be considered for their ability to support policy decisions (18). It should be recognized that while resources must be expended to fully participate in TIM performance measurement activities, significant value can result from those efforts by positioning the agency to better understand, request, and receive additional funding and TIM resources.

Inconsistent Definitions

The typical example used to illustrate the importance of consistency in defining performance measures is incident response time. Transportation agencies typically define incident response time as the time from when the transportation agency was notified of the incident to the time when the first responder from any agency arrives at the incident scene. Emergency responders, on the other hand, typically define response time as the time from when the call was received by their dispatch, typically a 911 center, to the time when their first response vehicle arrives on the scene. Unless the transportation agency, 911 dispatch, and law enforcement are fully integrated, the use of these two definitions is likely to result in inconsistent performance measures.
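
As a minimal sketch, assuming hypothetical timestamps and field names, the difference between the two definitions can be seen by computing both values for the same incident:

    from datetime import datetime

    def minutes_between(start: str, end: str) -> float:
        """Elapsed minutes between two timestamps of the form YYYY-MM-DD HH:MM."""
        fmt = "%Y-%m-%d %H:%M"
        return (datetime.strptime(end, fmt) - datetime.strptime(start, fmt)).total_seconds() / 60

    # Hypothetical timeline for a single incident
    incident = {
        "call_received_911": "2023-04-12 07:58",  # 911 dispatch receives the call
        "dot_notified":      "2023-04-12 08:05",  # transportation agency learns of the incident
        "first_arrival":     "2023-04-12 08:14",  # first responder (any agency) arrives on scene
    }

    # Transportation-agency definition: agency notification -> first arrival
    dot_response = minutes_between(incident["dot_notified"], incident["first_arrival"])

    # Emergency-responder definition: 911 call received -> first response vehicle arrives
    responder_response = minutes_between(incident["call_received_911"], incident["first_arrival"])

    print(f"Transportation agency response time: {dot_response:.0f} min")     # 9 min
    print(f"Emergency responder response time:   {responder_response:.0f} min")  # 16 min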

Incident clearance time is another example. Transportation agencies typically define incident clearance time as the time from when the first responder arrives on the scene to the time when the incident was removed from the travel lanes. Law enforcement often uses a corresponding measure that ends when their investigative and/or paperwork duties are complete, which may involve interviews of participants in medical facilities, etc. As such, the law enforcement incident clearance time could be substantially different from the transportation agency incident clearance time (20). Additionally, most CAD systems use status codes to distinguish activities such as leaving the scene, going to the hospital, etc. Therefore, the correct use of these status codes is critical for data accuracy and consistency.
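
A similar sketch, again with hypothetical (not agency-specific) status events and timestamps, shows how far apart the two clearance measures can fall for one incident:

    from datetime import datetime

    FMT = "%Y-%m-%d %H:%M"

    def minutes(start, end):
        return (datetime.strptime(end, FMT) - datetime.strptime(start, FMT)).total_seconds() / 60

    # Hypothetical status-code log for one incident; the event names are illustrative
    status_log = {
        "arrived_on_scene":       "2023-04-12 08:14",
        "all_lanes_open":         "2023-04-12 09:02",
        "investigation_complete": "2023-04-12 11:45",
    }

    # Transportation-agency clearance time: first arrival -> travel lanes cleared
    print(f"{minutes(status_log['arrived_on_scene'], status_log['all_lanes_open']):.0f} min (roadway clearance)")

    # Law-enforcement measure: first arrival -> investigation/paperwork complete
    print(f"{minutes(status_log['arrived_on_scene'], status_log['investigation_complete']):.0f} min (case closure)")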

This is another key purpose behind the development and use of three national TIM performance measures – to provide consistency in definitions across agencies and locations.

Data Availability

TIM program assessment is still a developing activity, and not all agencies collect the appropriate data. In the case of transportation agencies, some regions of a state may have implemented the necessary data collection and recording guidelines to provide data on TIM program performance, whereas other areas in the same state may not have.

The comprehensive reporting of TIM performance may require data from multiple sources to be collected/shared/integrated. Transportation agencies typically are not notified of every incident on every roadway. Depending on the presence of a TMC, they may not even be notified of all incidents on freeways. Numerous issues exist with respect to accessing data from multiple sources, such as availability, security, privacy, compatible formats, and the ability to extract the correct and required information. The search for solutions to these issues generally starts with a multi-agency task force encompassing the primary responders and the transportation agency. This task force should analyze available data, means of data exchange, and data definitions to determine whether data are compatible and shareable (8).

Data Quality/Completeness

A discussion of data quality encompasses many areas; accuracy is one of the primary aspects—ensuring that the information is recorded properly. Whether at the TMC or in the field, accuracy refers to items such as correctly noting the date and time, as well as properly identifying the affected lanes and the number of vehicles.

Data quality can also refer to completeness of the data—ensuring that all of the information is recorded. Paying attention to the completeness of data is especially important during updates. For example, the performance measures associated with the time to clear incidents depend on accurate updates on what is happening at the scene (e.g., when all travel lanes are available for traffic flow, when the last responder leaves the scene). If the timelines of incidents are incomplete, the data cannot be used to calculate the associated performance measures. Alternatively, if times are entered long after the incidents are cleared, these long durations will skew the reporting of the clearance times.
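
A simple completeness screen, sketched below with hypothetical field names, can identify records that cannot support clearance-time calculations before they distort the results:

    # Hypothetical incident records; None marks a field that was never updated from the scene
    incidents = [
        {"id": 101, "arrival": "08:14", "lanes_clear": "09:02", "last_responder_departs": "09:30"},
        {"id": 102, "arrival": "10:05", "lanes_clear": None,    "last_responder_departs": "10:55"},
        {"id": 103, "arrival": "14:40", "lanes_clear": "15:20", "last_responder_departs": None},
    ]

    REQUIRED = ("arrival", "lanes_clear", "last_responder_departs")

    complete = [r for r in incidents if all(r[f] is not None for f in REQUIRED)]
    flagged  = [r["id"] for r in incidents if any(r[f] is None for f in REQUIRED)]

    print(f"{len(complete)} of {len(incidents)} records usable for clearance-time measures")  # 1 of 3
    print("Flagged for follow-up:", flagged)  # [102, 103]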

Another aspect of data quality comes into play if records have to be transcribed. Mistyping a time, such as entering 21 instead of 12, can influence the overall calculation. Even a few transcription errors could have a significant impact if they occurred on major incidents or were concentrated in a particular type of data query, such as the ICT for non-injury incidents on interstate highways. The only solution for ensuring data quality in a manual transcription scenario is proofreading and secondary oversight.
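
Automated checks cannot replace proofreading, but a basic range check on computed durations, shown below with hypothetical records and an arbitrary threshold, can surface likely transcription errors for review:

    from datetime import datetime

    FMT = "%H:%M"

    def duration_min(start, end):
        return (datetime.strptime(end, FMT) - datetime.strptime(start, FMT)).total_seconds() / 60

    # Hypothetical transcribed records; incident 202 has "21:05" mistyped for "12:05"
    records = [
        {"id": 201, "arrival": "11:40", "lanes_clear": "12:10"},
        {"id": 202, "arrival": "11:55", "lanes_clear": "21:05"},
    ]

    for r in records:
        d = duration_min(r["arrival"], r["lanes_clear"])
        # Flag negative durations or implausibly long clearances (threshold chosen for illustration)
        if d < 0 or d > 240:
            print(f"Incident {r['id']}: clearance of {d:.0f} min flagged for review")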

In the scenario of extracting information electronically, several test data runs should be performed beyond the initial programming to ensure that the resulting data are valid and make sense. If any agency involved in an exchange of data changes formats, types, definitions, etc., the process of ensuring data quality will have to start again.
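
One such test run might compare a sample of electronically extracted values against manually verified source records; the sketch below uses hypothetical values only:

    # Hypothetical spot check: compare a verified sample against the extracted output
    verified_sample = {101: {"lanes_clear": "09:02"}, 102: {"lanes_clear": "10:48"}}
    extracted       = {101: {"lanes_clear": "09:02"}, 102: {"lanes_clear": "10:55"}}

    mismatches = [
        (inc_id, field, expected, extracted.get(inc_id, {}).get(field))
        for inc_id, fields in verified_sample.items()
        for field, expected in fields.items()
        if extracted.get(inc_id, {}).get(field) != expected
    ]

    print("Mismatched fields:", mismatches)  # [(102, 'lanes_clear', '10:48', '10:55')]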

Data Sharing

Data sharing is the practice of making data from one agency or system available to another agency or system. Largely, this is an institutional issue and may require negotiation as to the uses of the data. In certain situations, agencies may limit data sharing or restrict access to certain aspects of the data, such as vehicle registration information, to protect the confidentiality of any involved parties. Data sharing may also be restricted to protect agencies from the use of data for political purposes.

Data Exchange

The formal definition of data exchange is taking data structured under one representation or hierarchy and transforming it into another. While the execution of the data exchange is typically automated, the initial mapping between systems is a crucial task that requires active involvement by people intimately familiar with the data. This exchange primarily occurs as part of an effort to unify or utilize data from other sources. Typical examples include a TMC utilizing data from a law enforcement CAD system to supplement the existing information. In some cases, multiple agencies may be involved in the response to incidents, and multiple CAD sources may need to exchange data. The TIM Data Sharing, Exchange, and Integration – VDOT Example illustrates how the Virginia DOT and the Virginia State Police have exchanged performance measure data.
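
The initial mapping can be thought of as a field-by-field translation table. The sketch below is purely illustrative; the field names and codes are invented and do not reflect any actual CAD or TMC system:

    # Hypothetical mapping from a law-enforcement CAD record layout to a TMC layout
    CAD_TO_TMC_FIELDS = {
        "inc_num":   "incident_id",
        "rcvd_time": "notification_time",
        "arrv_time": "arrival_time",
        "clr_time":  "lanes_clear_time",
    }

    CAD_TO_TMC_TYPES = {
        "10-50": "crash",
        "10-46": "disabled vehicle",
    }

    def translate(cad_record: dict) -> dict:
        """Transform one CAD record into the TMC representation."""
        tmc_record = {CAD_TO_TMC_FIELDS[k]: v for k, v in cad_record.items() if k in CAD_TO_TMC_FIELDS}
        tmc_record["incident_type"] = CAD_TO_TMC_TYPES.get(cad_record.get("inc_type"), "other")
        return tmc_record

    print(translate({"inc_num": "24-00317", "inc_type": "10-50",
                     "rcvd_time": "07:58", "arrv_time": "08:14", "clr_time": "09:02"}))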

National standards exist for defining data elements associated with incident management. The National Intelligent Transportation System Architecture utilizes the Traffic Management Data Dictionary and standards from the Institute of Electrical and Electronics Engineers (IEEE) to define message sets for data exchange pertaining to incident management (8). Subsets of this information and data flows are provided for activities such as broadcast traveler information, interactive traveler information, transportation operations data sharing, freeway service patrols, and more. While TMC software and CAD systems may utilize national standards for the definition of data elements, care should be taken when integrating systems, as agencies may have added customized data elements beyond the standards or changed the definitions altogether. Additional organizations may have other standards that are applicable to incident management. Overall, attention should be focused on ensuring consistent definitions of data elements when undertaking data exchange tasks.

Data Integration

Data integration is the process of taking data from disparate sources and combining it into a single unified data set for agency purposes, such as the calculation of clearance times. Data integration encompasses many key elements already discussed, such as sharing agreements and standards for mapping data elements between systems. Overall, the process of integration requires significant cooperation on the part of all agencies providing and/or aggregating data. Clear communication to resolve differences in data availability, data quality and completeness, data exchange standards, and more is key to successful integration projects. These projects also typically require top-down support on the part of all involved parties to ensure proper resources and personnel are devoted to the effort.
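
As a simplified illustration, with hypothetical incident identifiers and field names, integration might merge records from two sources by incident ID and compute measures only where both sources contributed the required fields:

    from datetime import datetime

    FMT = "%H:%M"

    # Hypothetical records from two sources, already translated into a common representation
    tmc_records = {"24-00317": {"lanes_clear_time": "09:02"}}
    cad_records = {"24-00317": {"arrival_time": "08:14"},
                   "24-00401": {"arrival_time": "10:30"}}  # no matching TMC record

    # Combine the two sources into a single data set keyed by incident ID
    integrated = {}
    for inc_id in set(tmc_records) | set(cad_records):
        integrated[inc_id] = {**cad_records.get(inc_id, {}), **tmc_records.get(inc_id, {})}

    # Clearance time can only be computed where both sources contributed their fields
    for inc_id, rec in sorted(integrated.items()):
        if "arrival_time" in rec and "lanes_clear_time" in rec:
            mins = (datetime.strptime(rec["lanes_clear_time"], FMT)
                    - datetime.strptime(rec["arrival_time"], FMT)).total_seconds() / 60
            print(f"{inc_id}: roadway clearance of {mins:.0f} min")
        else:
            print(f"{inc_id}: incomplete after integration; excluded from measures")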

Appropriate Comparisons

Even when the definitions for the performance measures are standardized, the way in which the performance measures are reported needs to be carefully communicated. For example, an agency may only report performance for “DOT-notified incidents,” “lane-blocking incidents,” or incidents on interstate highways. In any case, this qualifying information should be communicated along with the performance measures to ensure that reporting is consistent and understood across all agencies and that appropriate comparisons are made.
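
The sketch below, built on hypothetical records, shows how the same average clearance time can differ substantially depending on the qualifying subset that is reported:

    # Hypothetical incident records with the qualifiers an agency might report against
    incidents = [
        {"id": 1, "dot_notified": True,  "lane_blocking": True,  "clearance_min": 34},
        {"id": 2, "dot_notified": False, "lane_blocking": True,  "clearance_min": 58},
        {"id": 3, "dot_notified": True,  "lane_blocking": False, "clearance_min": 12},
    ]

    def mean_clearance(records):
        return sum(r["clearance_min"] for r in records) / len(records)

    # The same "average clearance time" differs depending on the qualifying subset reported
    print(f"All incidents:      {mean_clearance(incidents):.1f} min")                                    # 34.7
    print(f"DOT-notified only:  {mean_clearance([r for r in incidents if r['dot_notified']]):.1f} min")  # 23.0
    print(f"Lane-blocking only: {mean_clearance([r for r in incidents if r['lane_blocking']]):.1f} min") # 46.0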

Timeliness of Data

Although it could certainly be helpful for program management, TIM program performance measures do not have to be calculated in real time; however, the timeliness of data, especially from outside systems, may affect the frequency with which reports or ongoing checks of program operation can take place. Established TIM programs generally produce monthly or quarterly reports. If outside data sources, such as law enforcement CAD systems, can only provide data on a manual basis, and significant work is required to process them into the appropriate form, it is unlikely that TIM program reports can be generated on a consistent and timely basis. In general, performance-related information is of little value when it is provided too late to the intended user (21).
