Beware that software development for ITS projects can be highly complex; avoid its pitfalls by following a rigorous systems engineering process.

Experience from iFlorida Model Deployment

Date Posted
09/17/2009
Identifier
2009-L00490

iFlorida Model Deployment Final Evaluation Report

Summary Information

The iFlorida Model Deployment, which started in May 2003, called for the Florida Department of Transportation (FDOT) District 5 (D5) to complete, within two years, the design, construction, and integration of the infrastructure required to support its operations. The required infrastructure was extensive, spanned numerous stakeholders, and included many technologies that were new to FDOT D5, such as sophisticated traffic management center (TMC) operations software, a wireless network deployed along I-4, an interface to Florida Highway Patrol Computer Aided Dispatch (FHP CAD) data, statewide traffic monitoring, and many others. The iFlorida plans also called for deploying these technologies in ways that required coordination among more than 20 stakeholders. It was an ambitious plan that would result in dramatically different traffic management operations for FDOT D5 and other transportation stakeholders in the Orlando area.

In implementing the iFlorida plan, FDOT faced many challenges, ranging from higher-than-expected failure rates for some field hardware to difficulties with the Condition Reporting System (CRS) and Central Florida Data Warehouse (CFDW) software. "Despite these challenges, it can be readily claimed that the overall iFlorida Model Deployment was successful," noted the final evaluation report for the iFlorida Model Deployment, published in January 2009.

The difficulties associated with the iFlorida Model Deployment provided many opportunities to identify lessons learned from FDOT's experiences. The most important of these are presented below in a series of lessons learned articles.

Lessons Learned

FDOT, aware of the problems with the Condition Reporting System (CRS) software and anticipating that it might fail, had begun testing SunGuide, a replacement for the CRS, in November 2006. FDOT began by installing an existing version of the SunGuide software and configuring it to manage recently deployed ITS equipment on and near I-95, including a set of loop and radar detectors that measured volume, speed, and occupancy on I-95; dynamic message signs (DMS) on I-95; and trailblazer signs at key intersections near I-95. The agency negotiated a new schedule with the SunGuide contractor under which a new version of the software, containing most of the required transportation management features, would be available by August 2007. By September 2007, FDOT was operating a beta release of the SunGuide 3.0 software, and the final release of that version followed in December 2007.

One remarkable observation about the SunGuide development process was how much more smoothly it went than the CRS development process. Insights from this experience include the following:

  • Devote time at the start of the project to ensure that the contractor shares an understanding of what is desired from the software. One item FDOT noted was that the SunGuide contractor spent much more time working with FDOT at the start of the project to understand FDOT's functional requirements than the CRS contractor had. With SunGuide, FDOT initially provided high-level functional requirements to the contractor, and the contractor mocked up screens to refine those requirements. Sometimes the contractor would point FDOT to other Web sites as examples of what the contractor believed FDOT had requested. At other times, the SunGuide contractor would make "quick-and-dirty" changes to the current version of SunGuide to demonstrate its understanding of FDOT's requirements. In contrast, the CRS contractor took FDOT's high-level requirements and attempted to build tools that complied with those requirements without much additional exploration. This was particularly problematic because FDOT had not developed a detailed Concept of Operations for iFlorida; having one might have helped the CRS contractor better understand FDOT's expectations.
  • Understand that sustaining trained, certified, and knowledgeable staff is a key to success. Software projects are complex, requiring skills that are not necessarily common in transportation agencies. If FDOT District 5 (D5) had followed a systems engineering process more stringently, some of these problems might have been avoided. In fact, the FDOT ITS Office had developed a systems engineering approach for ITS projects in Florida, but FDOT D5 chose not to apply it. FHWA had also offered to provide the FDOT personnel responsible for iFlorida software acquisition with a course entitled ITS Software Acquisition, but FDOT D5 declined the offer.
  • Beware that the cost structure for a software project is significantly different from that of traditional DOT projects. FDOT also noted that the process of refining the functional requirements required a great deal more upfront expense than the agency was accustomed to spending on non-software projects. In traditional FDOT projects, such as widening a road, the specifications are set by existing FDOT standards, and there are few questions about the functional requirements of the resulting product. With software projects, there are no existing standards specifying requirements; many functional requirements will be unique to a particular organization, so significantly more time is required to develop a mutual understanding of those requirements.
  • Develop detailed requirements to help ensure that appropriate verification tests are performed. In the case of the CRS, FDOT relied on the CRS contractor to convert the functional requirements into detailed technical requirements. In many cases, the CRS contractor accepted the functional requirements provided by FDOT as the detailed requirements. This was particularly problematic because some FDOT functional requirements were ambiguous or otherwise flawed. It affected the development of the CRS software in that the CRS requirements continued to evolve throughout the development process, and it affected the testing of the software in that testing was performed against high-level, ambiguous functional requirements. For example, one Central Florida Data Warehouse (CFDW) requirement was that the software include "a data mining application to facilitate the retrieval of archived data." The test for this requirement was "Verify that the CFDW includes Crystal Reports or similar software." No more detailed requirements for, or testing of, the CFDW data mining capabilities were produced. Another ambiguous, high-level functional requirement was the FDOT specification that "Users shall be able to view/extract archived data via fifteen (15) standard reports." This requirement made its way into the CRS requirements specification unchanged, and the associated test was to "Confirm the existence of at least fifteen standard reports," with the acceptance criteria being "Pass if at least fifteen standard reports are available." The CRS contractor should have developed detailed requirements based on the high-level requirements provided by FDOT and developed tests based on those detailed requirements (the first sketch after this list illustrates the difference).
  • Provide tools for testing individual software components to help pinpoint the root cause of problems that occur. FDOT noted that it would have been useful to have tools available to help monitor and test individual components of the software. When problems were detected with the CRS, FDOT and the CRS contractor often spent more time trying to locate where a problem was occurring than fixing it. FDOT could have obtained such tools either by including them in the CRS requirements or by contracting for an independent verification and validation effort on the CRS software. The problem of testing individual software components was particularly severe when multiple organizations were involved. For example, toll tag reader data was collected at field devices maintained by one FDOT contractor, transmitted through the FDOT network to an FDOT server, and then transmitted across the FDOT and Orlando-Orange County Expressway Authority (OOCEA) networks to the OOCEA travel time server, which ran software developed by a different contractor. The resulting travel times were pushed back across the OOCEA and FDOT networks to the CRS, which used the data to produce travel time estimates and published those travel times to message signs and the 511 system. When a problem occurred, it was difficult to identify at what point in this chain the problem first appeared, which sometimes resulted in more finger-pointing than problem solving. It would have been useful to have a tool that could sample each of the data streams involved in the travel time calculations so that the location at which an error occurred could be identified more easily (the second sketch after this list outlines one such probe).
  • Stay involved in the initial configuration to help determine the manageability of the software configuration process and the ability of end users to operate the software. With the SunGuide software, the SunGuide contractor provided FDOT with the software, and FDOT was responsible for configuring it as desired. (The SunGuide contractor did provide support during this process.) This approach simultaneously configured the system, trained FDOT in the administration process, and allowed FDOT to verify that the configuration process was easy to use. Part of the reason for the difference in the administration processes might be a difference in how the two contractors (CRS and SunGuide) intended the software to be used. In both cases, the software was based on tools being developed for use at multiple locations. With the CRS, the CRS contractor managed most deployments: it operated the servers and handled the configuration. With SunGuide, the SunGuide contractor provided software that was operated on DOT-owned servers and configured by the DOTs. The CRS contractor could afford a more arcane configuration process because configuration was performed by CRS staff members expert in that process; the SunGuide contractor needed simpler, more robust configuration processes because end users would perform those operations.
  • Include contractual requirements ensuring that software fixes can be implemented without disrupting operations. FDOT also believed that the SunGuide architecture as a whole, not just the configuration process, was more robust than the CRS architecture. One example FDOT noted was that the SunGuide contractor was able to modify the software without bringing down the entire system, whereas the CRS contractor usually had to shut down the entire system to install a patch. This meant that the CRS contractor tended to provide large patches infrequently, while the SunGuide contractor could provide smaller, more frequent patches. This allowed the SunGuide contractor to implement fixes for small bugs in the SunGuide software quickly. It also made it simpler for FDOT to test the patches, since each patch typically covered only a small number of problems.
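To make the testing lesson concrete, the first sketch below contrasts an existence-only acceptance test, like the "fifteen standard reports" check quoted above, with tests derived from more detailed requirements. It is a hypothetical Python illustration, not code from the actual CRS or CFDW test program; the class, method, and report names are all invented.

```python
# Hypothetical sketch: vague vs. detailed acceptance tests.
# StandardReportCatalog and its methods are invented stand-ins; they are
# not part of the real CRS/CFDW software.
import datetime
import unittest


class StandardReportCatalog:
    """Stand-in for the archived-data reporting component."""

    def list_reports(self):
        # Fifteen placeholder report names, one per "standard report."
        return [f"standard_report_{i:02d}" for i in range(1, 16)]

    def generate(self, name, start, end):
        # A real implementation would query the data archive; this stub
        # returns an empty, well-formed result set.
        assert start <= end, "start date must not follow end date"
        return {"name": name, "columns": ["timestamp", "value"], "rows": []}


class VagueAcceptanceTest(unittest.TestCase):
    """Mirrors the test cited in the report: existence only."""

    def test_at_least_fifteen_reports_exist(self):
        self.assertGreaterEqual(len(StandardReportCatalog().list_reports()), 15)


class DetailedAcceptanceTest(unittest.TestCase):
    """Derived from a more detailed requirement: every standard report
    must run over a user-specified date range and return a well-formed
    result, not merely exist."""

    def test_every_report_generates_for_a_date_range(self):
        catalog = StandardReportCatalog()
        start = datetime.date(2007, 1, 1)
        end = datetime.date(2007, 1, 31)
        for name in catalog.list_reports():
            result = catalog.generate(name, start, end)
            self.assertEqual(result["name"], name)
            self.assertIn("timestamp", result["columns"])


if __name__ == "__main__":
    unittest.main()
```

The point of the second test class is that a detailed requirement (each standard report shall run over a user-specified date range and return well-formed results) forces a test that exercises behavior rather than merely confirming existence.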
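The monitoring gap described above can be illustrated in the same hedged way. The second sketch below walks the travel-time data chain in order and reports the first stage whose data is missing or stale, which is where diagnosis should start. The stage names and fetch functions are invented for illustration and do not describe the actual FDOT or OOCEA systems.

```python
# Hypothetical sketch of an end-to-end data-stream probe for the travel-time
# chain. Stage names and fetch functions are invented for illustration; they
# do not describe the actual FDOT or OOCEA systems.
import datetime


def check_stage(fetch_latest, max_age_seconds):
    """Check one stage given a function returning the timestamp of the most
    recent record that stage produced. Returns (ok, detail)."""
    try:
        latest = fetch_latest()
    except Exception as exc:  # stage unreachable or query failed
        return False, f"unreachable: {exc}"
    age = (datetime.datetime.now(datetime.timezone.utc) - latest).total_seconds()
    if age > max_age_seconds:
        return False, f"stale: last record is {age:.0f}s old"
    return True, "ok"


def localize_fault(stages):
    """Walk the chain upstream-to-downstream; the first failing stage is the
    most likely origin of the problem, since later stages depend on it."""
    for name, fetch_latest, max_age in stages:
        ok, detail = check_stage(fetch_latest, max_age)
        print(f"{name:30s} {detail}")
        if not ok:
            return name
    return None


if __name__ == "__main__":
    def fresh():  # placeholder fetch: pretends data arrived just now
        return datetime.datetime.now(datetime.timezone.utc)

    # (stage name, fetch function, allowed data age in seconds)
    chain = [
        ("toll tag readers (field)", fresh, 120),
        ("FDOT collection server", fresh, 300),
        ("OOCEA travel time server", fresh, 300),
        ("CRS travel time estimates", fresh, 600),
        ("DMS / 511 publication", fresh, 600),
    ]
    failed = localize_fault(chain)
    print("first failing stage:", failed or "none (chain healthy)")
```

Checking stages in upstream-to-downstream order matters: because each stage consumes the previous stage's output, the first failure in the chain is the most likely root cause, and identifying it tells the stakeholders which organization's component to examine first.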

Software development for the iFlorida project was a hugely complex endeavor. Failure to follow a well-defined systems engineering process gave rise to many problems, including inadequate requirements definition, unreliable system testing and verification, and, eventually, the abandonment of the CRS software, which FDOT subsequently replaced with SunGuide.

iFlorida Model Deployment Final Evaluation Report
Source Publication Date
01/31/2009
Author
Robert Haas (SAIC); Mark Carter (SAIC); Eric Perry (SAIC); Jeff Trombly (SAIC); Elisabeth Bedsole (SAIC); Rich Margiotta (Cambridge Systematics)
Publisher
United States Department of Transportation, Federal Highway Administration, 1200 New Jersey Avenue, SE, Washington, DC 20590