By paul ~ March 14th, 2009. Filed under: Systems Engr.
Requirement … Satisfied. That’s “Mission Accomplished” to the systems engineer who has carefully shepherded the requirement from its discovery during Requirements Analysis through design, implementation, test and finally deployment. It’s a dangerous journey! Success doesn’t happen by accident. The path is fraught with dangers that can result in the requirement being poorly articulated, lost, misunderstood, or even eliminated. Good systems engineering teams have to be tough.
In order to be successful, the systems engineering team must take ownership of the customer’s requirements from the outset. Not only must they receive and document the requirements, but they must also invest themselves in the discovery and elaboration of those requirements. A team that simply accepts a requirements spec from a customer at face value, with a few clarifying questions, is off on the wrong foot already. Perform a formal requirements analysis with the customer and other stakeholders even on projects that seem to have clearly defined requirements. It may not be necessary to do this in the pre-award stage, but it should be included in the proposal and undertaken after award. Although expensive, this is a win-win for the customer and contractor. It helps ensure that the system requirements are clearly understood, elaborated to a greater level of detail, and properly documented. During this activity, initial verification planning can take place as well. To this end, ask questions like “What does the customer expect or demand as verification for each requirement (in terms of Analysis, Inspection, Demonstration, or Test)?” and “Where are periodic Technical Performance Measurements (TPMs) required?”
Once the requirements analysis is complete, the systems engineering team “turns around” and faces the engineering teams. They will continue to interface with the customer and negotiate requirements, but now they represent the customer and the system requirements to the engineering teams. The emphasis becomes ensuring that the requirements are adequately understood by the design teams and traceable through the design processes. A Model-Based Systems Engineering (MBSE) or Model Driven Architecture (MDA) approach can significantly aid this phase by facilitating a tight linkage between system components and the requirements they are intended to satisfy. This was previously discussed in the blog article Why is Model Driven Systems Engineering So Important? For instance, the Foresight modeling environment facilitates requirements traceability by directly linking entities in the model with requirements in a DOORS database. (See the RQIF-DOORS interface.)
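To make the idea of traceability concrete, here is a minimal sketch of the kind of check such a linkage enables. The requirement IDs, component names, and links below are purely illustrative, not drawn from any real DOORS database or Foresight model:

```python
# Hypothetical traceability sketch: which requirements have no
# design coverage? All identifiers are illustrative.

requirements = {"SYS-001", "SYS-002", "SYS-003"}

# Each model entity records the requirements it is intended to satisfy.
model_links = {
    "NavigationModule": {"SYS-001"},
    "TelemetryModule": {"SYS-002"},
}

# Collect every requirement touched by some model entity,
# then subtract to find orphans.
traced = set().union(*model_links.values())
untraced = requirements - traced

print(sorted(untraced))  # prints ['SYS-003']
```

The value of a tool-supported linkage is exactly this: orphaned requirements (and, symmetrically, design elements that trace to nothing) surface automatically instead of being discovered at integration.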
The MBSE approach further helps the systems engineer by enabling design verification by Analysis. For complex systems and systems that are very expensive to demonstrate or test, Analysis is the preferred verification method until final test is possible. Further, some performance metrics are so critical that they should be tracked throughout the design process as TPMs. Models facilitate these activities by providing analyzable design artifacts.
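As a sketch of what tracking a TPM against an analyzable model might look like, consider a critical latency metric re-estimated from the model at each milestone. The threshold, milestone names, and measured values are illustrative assumptions, not real program data:

```python
# Hypothetical TPM tracking sketch: compare a modeled latency
# estimate against its requirement threshold at each milestone.
# Threshold and values are illustrative.

tpm_threshold_ms = 50.0  # assumed end-to-end latency requirement

# Analysis results from successive model runs (illustrative values).
milestones = [("SRR", 42.0), ("PDR", 47.5), ("CDR", 51.2)]

for name, measured in milestones:
    status = "OK" if measured <= tpm_threshold_ms else "BREACH"
    print(f"{name}: {measured} ms [{status}]")
```

Run at every milestone, this kind of check turns a critical performance metric into a trend the team can act on before final test, rather than a surprise at delivery.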
It is never too soon to start verification planning. As mentioned previously, initial verification planning should start during requirements analysis. The systems engineering team should complete a preliminary verification plan as early as possible and get it to the quality assurance teams. It is critical that test planning and specification occur as early as possible so that required testability features can be designed into the system rather than added ad hoc later. Again, an MBSE approach can help this process. Wherever possible, verification activities (AIDT) should be specified against the model and prototypes as well as the delivered system.
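A verification plan of this kind is essentially a matrix: each requirement gets a method (Analysis, Inspection, Demonstration, or Test) and a target artifact. The sketch below uses invented requirement IDs and assignments to show how such a plan can immediately answer “what can we verify early, before the delivered system exists?”:

```python
# Hypothetical verification-planning sketch: requirement, AIDT method,
# and the artifact it is verified against. All entries are illustrative.

plan = [
    ("SYS-001", "Analysis",      "model"),
    ("SYS-002", "Test",          "delivered system"),
    ("SYS-003", "Demonstration", "prototype"),
]

# Requirements verifiable before the delivered system exists:
early = [req for req, method, artifact in plan
         if artifact in ("model", "prototype")]
print(early)  # prints ['SYS-001', 'SYS-003']
```

The earlier this table exists, the earlier the design teams know which testability hooks the system itself must carry.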
Where subcontractors are involved, it is critical that the entire process survive across the integrator/subcontractor boundary. Without this, the systems engineering team’s job becomes more like herding cats than sheep! Requirements traceability, verification planning, modeling, and test planning should be as transparent and coherent as possible across the design teams. The only place where this can safely be relaxed is at COTS (or nearly COTS) components. Otherwise, verification becomes a nightmare.
While the process described here is heavily front-loaded (an ounce of prevention…), the systems engineer’s job isn’t done until that last requirement is checked off with the customer. It really is like shepherding, and the customer requires frequent (and honest) assurance that things are going as planned. An MBSE strategy provides the systems engineer with a tool that can be used in each milestone review to communicate measurable progress against requirements in a form compatible with the customer’s viewpoint.
(As a personal aside, please do not fall for an MBSE or MDA strategy that results in models that are not simulatable and/or otherwise analyzable! If you can’t spec a test against the model, then that model has limited value. It’s really just another kind of documentation. For complex, multi-processor embedded systems, use Foresight as part of your MBSE solution!)
Systems engineering will probably never be for the faint-hearted. However, there are some “best practices” that can help. I’ve described a few from my experience here. I’d love to hear your thoughts on the matter.