Lesson

Specify interoperability testing requirements and steps as part of the connected vehicle device requirements prior to starting multiple rounds of testing, feedback, reset, and retesting.

Experience with deploying connected vehicle devices during the Safety Pilot Model Deployment in Ann Arbor, Michigan.


September 2015
Ann Arbor; Michigan; United States



Lesson Learned

The Test Conductor performed device interoperability testing to verify the ability of the vehicle-based and infrastructure-based devices produced by various suppliers to exchange, decode, log, and/or forward DSRC messages. All possible combinations of device types and suppliers were tested to verify this capability. The initial testing schedule included one round of interoperability testing, scheduled for April through May 2012, prior to the Pre-Model Deployment dry run testing. Two reasons were cited for including only one round of testing. First, it was assumed that all devices would be ready according to the device development schedule and therefore available for testing in April 2012. Second, and more importantly, because all of the devices had gone through the qualification testing process, it was assumed that interoperability testing would not uncover major issues requiring significant updates and additional rounds of retesting.
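
To illustrate the scale of testing every combination of device types and suppliers, the following Python sketch enumerates all ordered transmitter/receiver pairings from a hypothetical device inventory; the device types and supplier names are placeholders rather than the actual SPMD participants.

```python
from itertools import product

# Hypothetical device inventory as (device type, supplier) pairs.
# The supplier names are placeholders, not the actual SPMD suppliers.
devices = [
    ("VAD", "Supplier A"), ("VAD", "Supplier B"),
    ("ASD", "Supplier C"), ("ASD", "Supplier D"),
    ("RSU", "Supplier E"),
]

# Every ordered transmitter/receiver pairing must be able to exchange
# and decode DSRC messages; a device is not paired with itself.
test_matrix = [(tx, rx) for tx, rx in product(devices, repeat=2) if tx != rx]

for tx, rx in test_matrix:
    print(f"TX {tx[0]}/{tx[1]}  ->  RX {rx[0]}/{rx[1]}")
print(f"{len(test_matrix)} pairings to verify")
```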

Unfortunately, the RSUs were not ready for testing as originally scheduled because of delays in developing the specification and the research Qualified Products List (rQPL). As a result, the interoperability testing was divided into two stages. Stage 1 tested only vehicle-based devices, evaluating vehicle-to-vehicle (V2V) Basic Safety Message (BSM) compatibility between a variety of vehicle-based devices and platforms from each of the selected suppliers. It included static bench testing at the Test Conductor facilities and dynamic field testing of devices on a predefined route within the MDGA, and it occurred from mid-April through mid-May as originally planned. The infrastructure-based devices (RSUs) were incorporated into the Stage 2 testing, which evaluated both vehicle-based and infrastructure-based devices against a series of seven test cases and was scheduled for two weeks at the end of June. During the Stage 2 bench testing in June, the vehicle-based devices could not be tested against the three test cases involving security functions because the devices did not yet support that functionality. Therefore, these functions were not included in the Pre-Model Deployment Dry Run Testing in July.

The devices were clearly not as mature as initially assumed based on the qualification testing, as evidenced by the number of changes required following the initial stages of interoperability testing. The Test Conductor tested as much functionality as possible prior to the Pre-Model Deployment Dry Run Testing; however, much work remained after the dry run testing in July. As a result, the Test Conductor and USDOT jointly decided that two additional stages of interoperability testing, Stages 3 and 4, were required after the SPMD launch in August 2012. The Stage 3 field testing re-assessed devices that had failed in previous stages and exercised the security functionality that was not supported during Stage 2 testing. A fourth and final stage of bench and field testing re-assessed the devices after multiple firmware updates were implemented to resolve issues discovered in the field.

Several issues were identified in the interoperability testing process. First, adding stages of testing without being able to adjust the project schedule or deployment launch date added risk to the project, because it was unknown how the VADs and ASDs would work with the RSUs until after the devices were purchased and deployed in the field. Second, conducting the interoperability testing required far more resources and time than were originally planned for this activity. The device interoperability testing verified, via a manual process, that the data elements in the BSM sent by the transmitting device were identical to the data elements in the BSM received by the receiving device. This manual verification involved a field-by-field comparison of each data element in the transmitted and received BSM; because the BSM contains a large number of fields, it was a labor-intensive process. The testing was also somewhat limited in that it did not verify the validity and accuracy of the data generated and transmitted by each supplier’s device, since that type of testing was assumed to be covered by other activities such as the device qualification testing and SPMD dry run testing.
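
A minimal Python sketch of the kind of automated field-by-field comparison recommended below, assuming transmitted and received BSMs have already been decoded into dictionaries; the field names follow SAE J2735 BSM Part I, but the field set and example values are illustrative, not the Test Conductor's actual tooling.

```python
# Illustrative subset of SAE J2735 BSM Part I fields to compare.
BSM_FIELDS = [
    "msgCnt", "id", "secMark", "lat", "long", "elev",
    "speed", "heading", "accuracy", "brakes",
]

def compare_bsm(tx_msg: dict, rx_msg: dict) -> list[str]:
    """Return a list of mismatched fields between a transmitted and
    received BSM; an empty list means the messages are identical."""
    mismatches = []
    for field in BSM_FIELDS:
        if tx_msg.get(field) != rx_msg.get(field):
            mismatches.append(
                f"{field}: sent {tx_msg.get(field)!r}, "
                f"received {rx_msg.get(field)!r}"
            )
    return mismatches

# Example usage with two hand-built, hypothetical messages.
sent = {"msgCnt": 42, "id": "0A1B2C3D", "lat": 42.2808, "long": -83.7430,
        "speed": 12.5, "heading": 270.0}
received = dict(sent, speed=12.4)  # one field corrupted in transit

for problem in compare_bsm(sent, received) or ["all fields match"]:
    print(problem)
```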

Related recommendations made in the source report include:
  • Develop a plan and process for efficiently recalling devices in the event that numerous updates are required. Clearly communicate to volunteer drivers that they may be contacted to bring in their vehicles at any point during the pilot if problems are detected.
  • Analyze the need to purchase spare devices. If the budget allows, procure a sufficient inventory of spares to replace non-functional units as part of the recall process plan. Work with suppliers to determine replacement times to better estimate how many spares would be required.
  • Include state-of-health monitoring requirements and supporting processes for each type of device. Implement remote monitoring and device reset capabilities to reduce the number of devices that need to be physically recalled from the field (a minimal sketch of such triage follows this list).
  • Utilize automated tools to perform basic data comparisons between devices in order to more efficiently conduct the testing and test a wider variety of cases.
  • Define and document requirements and steps for all interoperability testing participants and data users.
  • Implement a full dry run that includes all installation, operation and interoperability requirements for all devices, infrastructure, and systems. Incorporate sufficient time into the schedule to ensure that all devices and systems are in a stable state prior to implementing a full dry run.
  • Ensure that in-depth system testing requirements, updates, and retest cycles are well understood and appropriately resourced in time and budget. Plan for several iterations of component, subsystem, and total pilot system testing within the dry run. Depending on the number of system components (devices, infrastructure, data collection and backhaul connections, security implementation, etc.), this could take several weeks to several months.
  • Be prepared to encounter field issues that were not discovered during the qualification testing, interoperability testing, and dry run testing.
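
Building on the state-of-health monitoring recommendation above, the following Python sketch shows one way to triage devices from heartbeat timestamps; the device IDs, thresholds, and remote-reset/recall actions are illustrative assumptions, not requirements from the source report.

```python
from datetime import datetime, timedelta

# Assumed thresholds: silent devices are first reset remotely, and only
# recalled from the field if they remain silent.
STALE_AFTER = timedelta(hours=24)    # no heartbeat -> attempt remote reset
RECALL_AFTER = timedelta(hours=72)   # still silent -> schedule physical recall

def triage(last_heartbeat: dict[str, datetime], now: datetime) -> dict[str, str]:
    """Classify each device as healthy, remote-reset, or recall."""
    actions = {}
    for device_id, seen in last_heartbeat.items():
        silent = now - seen
        if silent >= RECALL_AFTER:
            actions[device_id] = "recall"
        elif silent >= STALE_AFTER:
            actions[device_id] = "remote-reset"
        else:
            actions[device_id] = "healthy"
    return actions

# Example usage with three hypothetical devices.
now = datetime(2012, 9, 1, 12, 0)
heartbeats = {
    "ASD-0001": now - timedelta(hours=1),
    "VAD-0042": now - timedelta(hours=30),
    "RSU-0007": now - timedelta(days=4),
}
for device, action in triage(heartbeats, now).items():
    print(device, action)
```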



Source

Safety Pilot Model Deployment: Lessons Learned and Recommendations for Future Connected Vehicle Activities

Author: Kevin Gay, Valarie Kniss (Volpe)

Published By: U.S. Department of Transportation

Source Date: September 2015

EDL Number: FHWA-JPO-16-363

URL: http://ntl.bts.gov/lib/59000/59300/59361/FHWA-JPO-16-363.pdf


Lesson Contacts

Lesson Contact(s):

Kevin Gay
US DOT
kevin.gay@dot.gov

Lesson Analyst:

Gregory Hatcher
Noblis
202-488-5704
ghatcher@noblis.org




Lesson ID: 2016-00757