Recall that performance measurement is the use of statistical evidence to determine progress towards specific, measurable organizational objectives. Further, the systematic, ongoing performance measurement process involves collecting and analyzing data to determine if these objectives have been met and then reporting the findings to stakeholders and customers.
At the highest level, TIM performance objectives might include the following:
- Reducing roadway clearance times
- Reducing incident clearance times
- Reducing secondary crashes
An agency can use the model TIM performance measurement database to measure and track these performance measures over time. For example, the mean RCT for all incidents over the past month might be 45 minutes; the median ICT for all incidents over the past quarter might be 59 minutes; and the percentage of secondary crashes over the past year might be 18 percent. These numbers are a good starting point for understanding regional TIM performance at a high level.
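As an illustrative sketch of how such aggregate measures might be computed, the snippet below runs a few queries against a small in-memory SQLite table. The table layout, column names (roadway_clearance_time, incident_clearance_time, secondary_crash), and sample values are assumptions for demonstration only, not the official model database schema.

```python
import sqlite3
from statistics import median

# Build a toy incidents table; column names are illustrative assumptions.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE incidents (
    roadway_clearance_time INTEGER,   -- RCT, minutes
    incident_clearance_time INTEGER,  -- ICT, minutes
    secondary_crash INTEGER           -- 1 = yes, 0 = no
)""")
conn.executemany("INSERT INTO incidents VALUES (?, ?, ?)", [
    (30, 40, 0), (50, 59, 1), (55, 70, 0), (45, 55, 0), (45, 62, 0),
])

# Aggregate measures over all incidents, as in the example above.
mean_rct = conn.execute(
    "SELECT AVG(roadway_clearance_time) FROM incidents").fetchone()[0]
pct_secondary = conn.execute(
    "SELECT 100.0 * SUM(secondary_crash) / COUNT(*) FROM incidents"
).fetchone()[0]
# SQLite has no built-in MEDIAN(), so compute it in Python.
median_ict = median(r[0] for r in conn.execute(
    "SELECT incident_clearance_time FROM incidents"))
```

With the sample rows above, this yields a mean RCT of 45 minutes, a median ICT of 59 minutes, and a 20 percent secondary crash rate.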
However, considering the range of incident types and the varying conditions under which incidents occur, such aggregate measures of performance may not be very informative. More refined information on TIM performance may be necessary to reap the full benefits of performance measurement, such as demonstrating accountability, process efficiency, or program effectiveness to decision makers, stakeholders, or customers/taxpayers.
Take for example minor incidents that are confined to the shoulder and major incidents that block multiple roadway lanes and/or involve serious injuries. Whereas the minor incidents can generally be cleared relatively quickly, major incidents often take much longer to clear. If a gross average is used to represent clearance time for both types of incidents, the nuances of the incident characteristics and their impacts on clearance times are diluted or even undetectable in the resulting performance measure. In this case, a more astute analysis would be to calculate the mean ICT for minor incidents and the mean ICT for major incidents separately. Having a populated TIM performance measurement database would allow an agency to easily conduct this analysis and make this comparison.
In fact, with the model TIM performance measurement database, any or all of the incident characteristics identified in the database schema can be used to conduct a more refined analysis of TIM performance. Using the incident characteristics in the database to conduct a finer analysis of TIM performance can help tease out certain factors that may be impacting TIM performance that are not easily recognized from a more aggregate analysis.
The model TIM performance measurement database schema was designed to allow the analyst to do just this: fine-tune the analysis for a better understanding of TIM performance. As such, agencies with a populated TIM performance measurement database can set more specific TIM performance objectives because they have the data to support the analyses. In addition, once the data are in the database, analyses are relatively easy to run, and many analyses can be completed in a short amount of time.
A portion of this guidance discussed how to put TIM performance measures into context to add value and to better understand TIM performance. That section also discussed the most common ways of categorizing incidents for conducting these more refined analyses (e.g., by incident severity, injury severity, type of roadway). This section of the guidance takes the information presented in those previous sections one step further by linking these categories of incidents to specific organizational objectives or strategic questions regarding TIM program performance that might be of interest to agencies operating TIM programs. In addition, this section provides suggested performance measures for each of the specific organizational objectives, the data elements required for the analysis, and a link to the specific data element in the model TIM performance measurement database schema. The hope is that this section will get agencies thinking about different ways of viewing and analyzing TIM performance in their areas.
The subsections below expand on specific organizational objectives or strategic questions regarding TIM program performance that might be of interest to agencies operating TIM programs. Each provides suggested performance measures for the objective, the data elements required for the analysis, and a link to the model TIM performance measurement database schema.

Incident severity
One of the primary ways of analyzing incident data to get a good idea of TIM performance is by incident severity. Within the model TIM performance measurement database, incident severity is categorized as: minor, intermediate, and major. The definitions of these categories may vary from agency to agency; however, it is generally understood that minor incidents (e.g., contained to the shoulder, involving one vehicle, property damage only) are likely to take much less time to clear than major incidents, which may involve multiple lanes, multiple vehicles, one or more injuries, and/or heavy vehicles. If all incidents are taken together in the calculation of the clearance time performance measures, the result is an average of a wide range and highly variable clearance times. When the calculation of the performance measures is broken down by incident severity, the resulting performance measures can give a TIM program a much better idea of how it is performing for different types of incidents based on severity. A TIM program could even establish organizational objectives around performance by incident severity. The table below shows example organization objectives for TIM performance by incident severity, along with potential performance measures and a link to the model TIM database schema.
Example Objectives for TIM Performance by Incident Severity
| Incident Severity | Organizational Objective | Performance Measure(s) | Link to Schema |
| --- | --- | --- | --- |
| Minor | Clear all minor incidents in 30 minutes or less. | Number/percent of ICTs > 30 min | severity_type = minor |
| Intermediate | Reduce roadway clearance times for intermediate incidents. | average RCT | severity_type = intermediate |
| Major | Improve response strategies for major incidents. | characteristics of incidents with 5 longest ICTs | severity_type = major |
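As a sketch, the severity breakdown above amounts to a GROUP BY on the severity field. The severity_type values follow the schema links in the table; the clearance-time column name (incident_clearance_time) and the sample data are illustrative assumptions.

```python
import sqlite3

# Toy table: severity_type follows the schema link above; the ICT
# column name and values are illustrative assumptions.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE incidents (
    severity_type TEXT,
    incident_clearance_time INTEGER  -- ICT, minutes
)""")
conn.executemany("INSERT INTO incidents VALUES (?, ?)", [
    ("minor", 20), ("minor", 28), ("minor", 36),
    ("intermediate", 55), ("intermediate", 65),
    ("major", 120), ("major", 150),
])

# Mean ICT per severity category, rather than one diluted overall mean.
by_severity = dict(conn.execute(
    "SELECT severity_type, AVG(incident_clearance_time) "
    "FROM incidents GROUP BY severity_type"))

# Count minor incidents that missed a 30-minute clearance objective.
missed = conn.execute(
    "SELECT COUNT(*) FROM incidents "
    "WHERE severity_type = 'minor' AND incident_clearance_time > 30"
).fetchone()[0]
```

Here the mean ICT for minor incidents (28 minutes) and major incidents (135 minutes) are reported separately, which is exactly the comparison an aggregate mean would obscure.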
Injury severity

Injury severity also can impact incident response and clearance times. While incidents involving no injuries may require just a tow truck, incidents involving even a minor injury usually require response by emergency medical services (EMS) personnel, and incidents involving serious injuries are more critical due to the nature of those injuries. Incidents involving fatalities can be even more sensitive, require response from the medical examiner, and could take hours to clear. When multiple injuries or fatalities are involved, response and clearance are further complicated. Therefore, a TIM program may wish to look at performance for incidents with different types/levels of injury severity to see if there are certain areas where it could work to improve clearance times and/or reduce the occurrence of secondary crashes. The table below shows example organization objectives for TIM performance by injury severity, along with potential performance measures and a link to the model TIM database schema.
Example Objectives for TIM Performance by Injury Severity
| Injury Severity | Organizational Objective | Performance Measure(s) | Link to Schema |
| --- | --- | --- | --- |
| Non-injury | Reduce incident clearance times for non-injury incidents. | median ICT | injury = non |
| All injuries | Reduce roadway clearance times for injury incidents. | average RCT | minor_injuries or serious_injuries >= 1 |
| Fatalities | Reduce secondary crashes during fatal incidents. | percentage of secondary crashes | fatalities >= 1 |
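A minimal sketch of the fatality-focused measure follows, filtering on a fatalities count before computing the secondary-crash percentage. The column names (fatalities, minor_injuries, serious_injuries, secondary_crash) and sample data are illustrative assumptions.

```python
import sqlite3

# Toy table; all column names are illustrative assumptions.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE incidents (
    fatalities INTEGER, minor_injuries INTEGER,
    serious_injuries INTEGER, secondary_crash INTEGER)""")
conn.executemany("INSERT INTO incidents VALUES (?, ?, ?, ?)", [
    (0, 0, 0, 0), (0, 1, 0, 0), (0, 0, 1, 1),
    (1, 0, 1, 1), (1, 0, 0, 0), (2, 1, 0, 1), (1, 1, 0, 0),
])

# Secondary-crash percentage restricted to fatal incidents.
pct_secondary_fatal = conn.execute(
    "SELECT 100.0 * SUM(secondary_crash) / COUNT(*) "
    "FROM incidents WHERE fatalities >= 1").fetchone()[0]
# The all-injuries RCT measure would filter the same way on
# minor_injuries >= 1 OR serious_injuries >= 1.
```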
Roadway characteristics

Another way an agency might want to examine its TIM performance is by roadway. Within the model TIM performance measurement database, there are a number of roadway characteristics, including roadway type (freeway, arterial, and collector/distributor), roadway name and direction, and surface conditions. A TIM program might be interested in performance on freeways generally, on a particular freeway, or even in a particular direction on a freeway. The model TIM performance measurement database allows agencies to assess performance at this level by calculating performance measures by roadway characteristics. The table below shows example organization objectives for TIM performance by roadway characteristics, along with potential performance measures and a link to the model TIM database schema.
Example Objectives for TIM Performance by Roadway Characteristics
| Roadway Characteristics | Organizational Objective | Performance Measure(s) | Link to Schema |
| --- | --- | --- | --- |
| Type | Reduce roadway clearance times for all freeway incidents. | average RCT | roadway_type = freeway |
| Name | Clear 95 percent of all incidents on Interstate X in 90 minutes or less. | 95th-percentile ICT | name = Interstate X |
| Direction | Reduce secondary crashes on northbound Interstate X. | percentage of secondary crashes | name = Interstate X and direction = NB |
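The 95th-percentile objective can be sketched with a nearest-rank percentile computed in Python, since SQLite has no built-in percentile function. The name column follows the schema link above; the ICT column name (incident_clearance_time) and sample data are illustrative assumptions.

```python
import math
import sqlite3

# Toy table: name follows the schema link; the ICT column name and
# values are illustrative assumptions.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE incidents (name TEXT, incident_clearance_time INTEGER)")
conn.executemany(
    "INSERT INTO incidents VALUES (?, ?)",
    [("Interstate X", t) for t in (20, 35, 40, 55, 60, 62, 70, 75, 80, 150)]
    + [("Route Y", 400)])  # other routes are excluded by the filter

icts = sorted(r[0] for r in conn.execute(
    "SELECT incident_clearance_time FROM incidents "
    "WHERE name = 'Interstate X'"))
p95 = icts[math.ceil(0.95 * len(icts)) - 1]  # nearest-rank 95th percentile
objective_met = p95 <= 90  # "clear 95 percent within 90 minutes"
```

With these sample values, one long 150-minute clearance puts the 95th percentile well over the 90-minute target, so the objective would not be met.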
Lanes involved

A factor that can be critical to a TIM program's response strategy, and thus a factor of interest in performance, is the lanes involved in an incident, both the total number and which particular lanes are impacted. Generally, the more lanes that are impacted by an incident, the longer it can take to clear the roadway. In some cases, inside lanes may be more difficult to access, affecting response and staging times, while an incident occurring in the middle lane of a multi-lane freeway can hinder traffic diversion around the incident and lengthen roadway clearance time. A better understanding of how the lanes involved in an incident affect TIM performance could help a TIM program develop improved strategies for responding to incidents. The table below shows example organization objectives for TIM performance by lanes involved, along with potential performance measures and a link to the model TIM database schema.
Example Objectives for TIM Performance by Lanes Involved
| Lanes Involved | Organizational Objective | Performance Measure(s) | Link to Schema |
| --- | --- | --- | --- |
| Shoulder only | Clear all incidents confined to the shoulder in 30 minutes or less. | Number/percent of ICTs > 30 min | lanesinvolved = shoulder only |
| 1 lane | Reduce roadway clearance times for incidents blocking one lane. | average RCT | lanesinvolved = any lane except shoulder only |
| All lanes | Reduce secondary crashes during incidents that block all lanes. | percentage of secondary crashes | lanesinvolved = all lanes |
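The shoulder-only objective above can be sketched as a count and percentage of clearance times exceeding the 30-minute target. The lanesinvolved values follow the schema link; the ICT column name and sample data are illustrative assumptions.

```python
import sqlite3

# Toy table: lanesinvolved follows the schema link; the ICT column
# name and values are illustrative assumptions.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE incidents (lanesinvolved TEXT, "
    "incident_clearance_time INTEGER)")
conn.executemany("INSERT INTO incidents VALUES (?, ?)", [
    ("shoulder only", 15), ("shoulder only", 25), ("shoulder only", 45),
    ("shoulder only", 28), ("1 lane", 50), ("all lanes", 120),
])

# Number and percent of shoulder-only incidents with ICT > 30 minutes.
n_over, n_total = conn.execute(
    "SELECT SUM(incident_clearance_time > 30), COUNT(*) "
    "FROM incidents WHERE lanesinvolved = 'shoulder only'").fetchone()
pct_over = 100.0 * n_over / n_total
```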
Time of incident
The time of an incident can be a critical factor impacting incident response and clearance, as well as the occurrence of secondary crashes. Incidents that occur during peak commute times are more challenging to respond to and clear than incidents that occur when there are far fewer vehicles on the roadway. An agency may also be interested in examining performance by day of the week, or by month or season of the year, to see how performance varies across weeks, months, and seasons. The table below shows example organization objectives for TIM performance by time of incident, along with potential performance measures and a link to the model TIM database schema.
Example Objectives for TIM Performance by Time of Incident
| Time of Incident | Organizational Objective | Performance Measure(s) | Link to Schema |
| --- | --- | --- | --- |
| Peak periods | Reduce incident clearance times for incidents occurring during the peak periods. | average ICT | incident_first_recordable_awareness = 7:00-9:00 am or 4:30-6:30 pm |
| Weekday vs. weekend | Determine if roadway clearance times vary systematically by day of week. | average RCT | incident_date = Monday, Tuesday, Wednesday, Thursday, or Friday vs. incident_date = Saturday or Sunday |
| Summer vs. other months | Determine if secondary crash occurrence varies systematically by season/month of year. | percentage of secondary crashes | incident_date = June, July, or August vs. incident_date = January-May or September-December |
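A sketch of the weekday/weekend comparison follows, classifying each incident_date and averaging RCT within each group. The incident_date field follows the schema link; the RCT column name (roadway_clearance_time) and sample data are illustrative assumptions.

```python
import sqlite3
from datetime import date

# Toy table: incident_date follows the schema link; the RCT column
# name and values are illustrative assumptions.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE incidents (incident_date TEXT, "
    "roadway_clearance_time INTEGER)")
conn.executemany("INSERT INTO incidents VALUES (?, ?)", [
    ("2024-03-04", 40), ("2024-03-05", 50),  # a Monday and a Tuesday
    ("2024-03-09", 60), ("2024-03-10", 70),  # a Saturday and a Sunday
])

# Classify each incident by day of week, then average RCT per group.
groups = {"weekday": [], "weekend": []}
for d, rct in conn.execute(
        "SELECT incident_date, roadway_clearance_time FROM incidents"):
    key = "weekend" if date.fromisoformat(d).weekday() >= 5 else "weekday"
    groups[key].append(rct)
avg_rct = {k: sum(v) / len(v) for k, v in groups.items()}
```

A month/season comparison would classify on the month of incident_date in the same way.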
Other single data elements
As the previous examples have shown, the model TIM performance measurement database affords the ability to drill down into the data to assess performance for different types of incidents. While the previous examples may be of greatest interest, an analyst could conduct analyses of TIM performance using any of the data elements in the database, including vehicles involved and conditions such as weather, lighting, and surface conditions. The table below shows example organization objectives for TIM performance by various other single data elements, along with potential performance measures and a link to the model TIM database schema.
Example Objectives for TIM Performance by Other Single Data Elements
| Other Data Elements | Organizational Objective | Performance Measure(s) | Link to Schema |
| --- | --- | --- | --- |
| Heavy vehicle involved vs. not involved | Develop improved response strategies for incidents involving heavy vehicles. | average ICT | heavy_vehicle_involved = yes vs. heavy_vehicle_involved = no |
| Dry vs. inclement weather | Develop improved response strategies for incidents occurring in inclement weather. | median ICT | weather_type = dry vs. weather_type = rain, sleet or hail, snow, or fog |
| Light vs. dark | Develop improved response strategies for incidents occurring after dark. | average ICT | lighting_type = daylight vs. lighting_type = dark |
| Dry surface vs. wet surface | Develop improved response strategies for incidents occurring when the surface is wet. | median ICT | pavement_type = dry vs. pavement_type = wet |
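Any of these two-group comparisons reduces to the same GROUP BY pattern; the heavy-vehicle comparison is sketched below. The column names (heavy_vehicle_involved, incident_clearance_time) and sample data are illustrative assumptions.

```python
import sqlite3

# Toy table; column names and values are illustrative assumptions.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE incidents (heavy_vehicle_involved TEXT, "
    "incident_clearance_time INTEGER)")
conn.executemany("INSERT INTO incidents VALUES (?, ?)", [
    ("yes", 90), ("yes", 110), ("no", 35), ("no", 45), ("no", 55),
])

# Average ICT for incidents with vs. without a heavy vehicle.
avg_ict = dict(conn.execute(
    "SELECT heavy_vehicle_involved, AVG(incident_clearance_time) "
    "FROM incidents GROUP BY heavy_vehicle_involved"))
```

In this toy data, heavy-vehicle incidents average 100 minutes versus 45 minutes otherwise, the kind of gap that would motivate an improved response strategy.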
Multiple data elements
The above examples have illustrated how a TIM program can use the data in the model TIM performance measurement database to assess performance at a more disaggregate level (as opposed to including all incidents in the analysis). Each of these examples used just one data element; however, the database can also be used to answer more complex questions regarding TIM performance. Using two or more data elements in combination allows a TIM program to examine performance at a finer level of detail than a single data element permits. For example, a TIM program might suspect or know that it is struggling with response to certain types of incidents, in certain locations, or under certain conditions, and want to analyze this more specifically. The table below shows a few examples of more complex TIM performance organization objectives using multiple data elements from the TIM database. The more questions asked and explored, the more the program will learn about its TIM performance.
Example Objectives for TIM Performance by Multiple Data Elements
| Data Elements | Organizational Objective | Performance Measure(s) | Link to Schema |
| --- | --- | --- | --- |
| Injury severity AND Lanes involved | Reduce RCT for incidents involving serious injuries and blocking 2 or more lanes. | average RCT | serious_injuries >= 1 and lanesinvolved = 2 or more |
| Roadway type AND Number of injuries | Reduce secondary crashes during freeway incidents involving a fatality. | percentage of secondary crashes | roadway_type = freeway and fatalities >= 1 |
| Roadway name (location) AND Lanes involved | Reduce ICT on Interstate X during incidents that block at least one lane. | average ICT | name = Interstate X and lanesinvolved = 1 or more |
| Lanes involved AND Vehicle type | Reduce roadway clearance times for incidents blocking at least one lane in which a heavy vehicle is involved. | average RCT | lanesinvolved = 1 or more and heavy_vehicle_involved = yes |
| Incident severity AND Weather conditions | Reduce secondary crashes associated with intermediate and major incidents that occur during inclement weather conditions. | percentage of secondary crashes | severity_type = intermediate or major and weather_type = rain, sleet or hail, snow, or fog |
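A multi-element query is simply a compound WHERE clause; the severity-plus-weather row above is sketched below. The severity_type and weather_type values follow the schema links, while the secondary_crash flag and sample data are illustrative assumptions.

```python
import sqlite3

# Toy table: severity_type and weather_type follow the schema links;
# the secondary_crash flag and values are illustrative assumptions.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE incidents (severity_type TEXT, weather_type TEXT, "
    "secondary_crash INTEGER)")
conn.executemany("INSERT INTO incidents VALUES (?, ?, ?)", [
    ("minor", "dry", 0), ("intermediate", "rain", 1),
    ("intermediate", "snow", 0), ("major", "fog", 1),
    ("major", "rain", 0), ("major", "dry", 0), ("intermediate", "dry", 0),
])

# Combine two data elements in one filter: intermediate/major severity
# AND inclement weather, then compute the secondary-crash percentage.
pct_secondary = conn.execute(
    "SELECT 100.0 * SUM(secondary_crash) / COUNT(*) FROM incidents "
    "WHERE severity_type IN ('intermediate', 'major') "
    "AND weather_type IN ('rain', 'sleet or hail', 'snow', 'fog')"
).fetchone()[0]
```

Adding further conditions (roadway name, lanes involved, vehicle type) just extends the WHERE clause, which is what makes these more complex questions easy to pose once the database is populated.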