Based on our recent experience, `distinguishing fact from fiction' in relation to System of Systems (SoS) safety has emerged as a pertinent topic in a number of senses. From an analytical perspective, we recognise that it would be a mistake to treat a SoS as `just another complex system'. The defining properties of a SoS mean that traditional analysis methods may fall short if applied without additional support. On the other hand, we also argue that the structured and comprehensive analysis of a SoS need not be so complex as to be impractical.
We draw on an internal BAE Systems development project, Integrated Aircrew Training (IAT), as an exemplar. IAT interconnects multiple systems and participants - air and ground assets - into a training SoS. As would be expected, we have identified a number of sources of complexity in the analysis of this SoS, chiefly the combinatorial growth in interactions as the number of system elements increases. However, the training domain provides constraints which may be captured as feature models to structure the analysis.
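To illustrate the point in general terms (the feature names and constraints below are invented for this sketch and are not drawn from the IAT project), a feature model can be expressed as a set of optional system elements plus domain constraints, so that analysis need only cover the constrained subset of configurations rather than every combination:

```python
from itertools import product

# Hypothetical toy feature model for a training SoS.
features = ["LiveAircraft", "SimulatorCab", "GroundStation", "SyntheticThreat"]

# Domain constraints, each a predicate over a configuration
# (a dict mapping feature name -> included?).
constraints = [
    # A synthetic threat is only meaningful with a trainee platform present.
    lambda c: not c["SyntheticThreat"] or c["LiveAircraft"] or c["SimulatorCab"],
    # Every training configuration needs a ground station for exercise control.
    lambda c: c["GroundStation"],
]

def valid_configurations():
    """Enumerate only the configurations permitted by the domain constraints."""
    for bits in product([False, True], repeat=len(features)):
        config = dict(zip(features, bits))
        if all(check(config) for check in constraints):
            yield config

all_configs = 2 ** len(features)        # 16 unstructured combinations
valid = list(valid_configurations())    # constrained subset to analyse
print(all_configs, len(valid))          # prints: 16 7
```

Even in this small example the constraints remove over half of the combinatorial space; in a realistic SoS with many more elements, such pruning is what keeps structured analysis tractable.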
We outline a SoS hazard assessment process and associated safety case approach that are the subject of ongoing research and development and, as such, are not yet formally recognised. They acknowledge that the presence of human decision-makers in a SoS means that human factors analysis contributes significantly to SoS safety assessment. We discuss the human element in SoS safety analysis and show how its treatment in the case of IAT has caused us to recognise that augmented-reality training brings with it both novel sources and novel consequences of human `error'. In this particular SoS, the `fact versus fiction' differential also applies to SoS users, and the notion of participant `immersion' is a key area of interest.