When we had accumulated enough changes, we issued a new system requirements document and a new subsystem requirements document. Those poor contractors had to go through the massive subsystem requirements and manually determine what had changed. I can’t imagine the time the contractors spent just trying to figure out which changes they needed to be concerned about.
It was in the middle of this upgrade project that the customer said enough and tasked my team with evaluating and selecting a requirements management tool. The tool we selected is not important to this particular discussion, but what we learned from this tool selection and implementation is important. Here are some lessons learned.
(1) - There is no single tool that is going to please everyone. We had users who loved our selection and users who fought us every step of the way. Without the customer supporting and enforcing the change, it would not have been possible on a large program like this one. One user complained about the column widths of the tool-generated traceability matrix, totally ignoring the fact that it saved him days of manual effort.
(2) - Our manual traceability was not very clean. Once we imported all of our information into the tool and linked it up, we found many gaps in the traceability. What was more disturbing was that we had links that didn’t make any sense. We had to do a lot of work to clean up our traceability matrices. (A sketch of the kind of gap check the tool automated for us follows below.)
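For illustration only, here is a minimal sketch in Python of the kind of gap check a tool automates. The requirement IDs, links, and the dictionary-based trace model are all invented; a real tool works against its own database, not a script like this.

# Minimal sketch of an automated traceability gap check.
# All requirement IDs and links below are hypothetical.

system_reqs = {"SYS-1", "SYS-2", "SYS-3"}
subsystem_reqs = {"SUB-10", "SUB-11", "SUB-12", "SUB-13"}

# Trace links: subsystem requirement -> parent system requirement.
trace = {
    "SUB-10": "SYS-1",
    "SUB-11": "SYS-1",
    "SUB-12": "SYS-9",  # a link that makes no sense: SYS-9 does not exist
}

# Gap 1: system requirements with no subsystem requirement beneath them.
uncovered = system_reqs - set(trace.values())
print("System requirements with no children:", sorted(uncovered))

# Gap 2: subsystem requirements with no parent at all.
orphans = subsystem_reqs - set(trace)
print("Orphan subsystem requirements:", sorted(orphans))

# Gap 3: links that point at requirements that do not exist.
dangling = {sub: parent for sub, parent in trace.items()
            if parent not in system_reqs}
print("Dangling links:", dangling)

Running this flags SYS-2 and SYS-3 as uncovered, SUB-13 as an orphan, and the SUB-12 link as dangling: exactly the three kinds of problems we kept finding by hand.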
(3) - Just tracing requirements was great, but now we could put the same effort into linking requirements to test plans, and we went so far as to link subsystem requirements to the design documents we reviewed. This didn’t happen overnight, but it did happen. Eventually we could trace a system requirement to a subsystem requirement, to a design document, and on to a code module (see the sketch below). We even used a tool to measure the complexity of code modules and used that to help judge how difficult a change would be to implement and test.
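Here is a small, purely hypothetical sketch of such a trace chain in Python. The IDs, file names, and the complexity score are invented, and the dictionary stands in for the link database a real requirements tool maintains.

# Hypothetical end-to-end trace chain: system requirement ->
# subsystem requirement -> design document -> code module.

links = {
    "SYS-1": ["SUB-10", "SUB-11"],          # system req -> subsystem reqs
    "SUB-10": ["DES-chassis.doc"],          # subsystem req -> design doc
    "DES-chassis.doc": ["chassis_ctrl.c"],  # design doc -> code module
}

# Complexity per module, e.g. a cyclomatic complexity score
# reported by a static analysis tool (value invented).
complexity = {"chassis_ctrl.c": 27}

def trace_down(item, depth=0):
    """Print one item and everything traced below it."""
    note = f"  (complexity {complexity[item]})" if item in complexity else ""
    print("    " * depth + item + note)
    for child in links.get(item, []):
        trace_down(child, depth + 1)

trace_down("SYS-1")

With a chain like this in place, a change to SYS-1 immediately shows which subsystem requirements, design documents, and code modules are affected, and the complexity score gives a first hint at the implementation and test effort.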
(4) - Metrics from a requirements tool are key to understanding how complete your testing activities really are. We often thought we were 50% complete with testing; after all, 50% of the tests had been run. However, what we found was that we were prone to testing the simplest and best-understood parts of the system first. So even though we were 50% complete by test count, everything left was very high risk. We learned to prioritize our testing by looking at requirement priorities and software complexity, information we could not determine through manual traceability. The sketch below shows how different the two views of progress can be.
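As a hedged illustration (all numbers invented), here is a Python sketch contrasting count-based completeness with a risk-weighted view, where each test is weighted by the priority of its requirement and the complexity of the module it exercises:

# Hypothetical tests: (name, done, requirement priority 1..3,
# complexity of the module the test exercises). Values invented.
tests = [
    ("T1", True,  1,  3),
    ("T2", True,  1,  4),
    ("T3", True,  2,  5),
    ("T4", False, 3, 25),
    ("T5", False, 3, 30),
    ("T6", False, 2, 18),
]

done = [t for t in tests if t[1]]
count_pct = 100 * len(done) / len(tests)

def weight(test):
    # One simple risk weight: priority times complexity.
    _, _, priority, complexity = test
    return priority * complexity

risk_pct = 100 * sum(weight(t) for t in done) / sum(weight(t) for t in tests)

print(f"By test count: {count_pct:.0f}% complete")  # prints 50%
print(f"By risk:       {risk_pct:.0f}% complete")   # prints about 8%

Three of six tests are done, so the count says 50%, but the easy, low-priority tests went first, so by risk weight we are only about 8% of the way there. That is exactly the trap we fell into.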
(5) - It was very easy to get overwhelmed, so start simple. We had to back off our ambitious ideas and begin with a simple traceability model. As we learned and gained more experience with the tool, we added more information to our model. We were constantly assessing our process to figure out what else we could do to make it better.
What a great learning experience this was for me. If you’re interested in embarking on a change like this to improve your requirements process, contact Visure Solutions. We will be happy to discuss your process with you.
By: Marcia Stinson