Our unique Digital Mission Engineering Platform provides through-life traceability and connectivity, helping organisations accelerate and de-risk technical projects and sustain the systems they deliver.
Reduce risk to your system development, acquisition and sustainment through an integrated Digital Twin
A proprietary AI engine connects data to accelerate the engineering, design and testing process
Analyse the impact of changes and integration of systems and interfaces before implementation
DESIGN AT SCALE
As we build larger and larger systems and systems of systems, we need tools and techniques to support us in understanding them and making predictions about their operation and performance. We need to be able to develop a deep, accurate and shared understanding of the relationships between context/environment and constraint, function, performance, and form (design) throughout the entire definition, acquisition/development, integration, test and evaluation, and release lifecycle of a system. We must be able to draw upon this knowledge to understand the impact of change, failure and obsolescence during operational sustainment.
The current best-practice engineering techniques and disparate, distributed tools and artefacts do not enable such an understanding – they impose too high a requirement on human short-term memory.
The goal of any program or project should be to understand what we are building while we are building it, so that we can predict how the system will perform in its environment before it is deployed. We should also deliver a model of the system with the system itself, against which we can monitor and measure actual performance and assess the impact of change in the system and its environment.
DIGITAL MISSION ENGINEERING WITH KOMPOZITION
Kompozition enables the development, maintenance and querying of a single, continuous, and integrated knowledge graph (digital thread) of digital artefacts that represent the system and its verified design and architecture, including representations of its expected behaviour in the operational environment (mission threads) and the contracts/requirements that predicate its design and constrain its operation.
This digital thread in the Kompozition platform extends from operational concept through to the detailed design representing the as-delivered solution and the records of verification and validation that qualify the solution. When delivered with the system (as an abstract digital twin), this model enables the continuous monitoring and analysis of the impacts of change to and on the system, the operational environment, and any contract requirements or constraints that predicate its design and use. Using an extensible, integrated Kompozition knowledge graph, it becomes possible to assess the operational, technical, and contractual impacts of any such change.
The Kompozition platform can be deployed stand-alone (and isolated from the internet), on-premises or via a SaaS subscription.
The Kompozition platform includes a full-featured requirements management capability, with tools to support requirements elicitation and analysis, requirements definition/authoring, trace management, allocation, verification and validation, requirements review and approval, baseline and delta management, and integrated uncertainty management.
Kompozition provides an integrated mechanism for managing the information within a requirement in a consistent digital thread with architecture (operational through technical), design, evaluation and release planning information. The difference between Kompozition and other requirements management and architecture/systems engineering tools (CSM, Sparx EA, ...) is that Kompozition uses AI, heuristic algorithms and an expressive modelling language to help synthesise and maintain a consistent integrated knowledge graph.
Like a traditional requirements database, Kompozition maintains an object for each requirement, along with that object's trace and history. However, Kompozition does not just capture artefacts like requirements and traces at the object level; it also uses a Natural Language Processing (NLP) framework to extract and relate the elemental information inherent within each requirement.
Consider the following example of a function and performance requirement for a fictitious Air Deployable Amphibious Vehicle (ADAV):
The ADAV onboard sensors shall be capable of detecting troops at a range of greater than 1000m in open country with a probability of detection greater than 90%.
Based on an NLP parse of this statement, Kompozition automatically:
Identifies a function, Detect Troops, captures it in the functional decomposition, and associates performance properties and conditions (range, open country conditions) with that function.
Allocates that function to the ADAV as the performer and the RWS Sensor as the enabler, creating either entity in the ontology and system breakdown if required (noting that Kompozition automatically consults the knowledge graph, which already records Onboard Sensor as an alias of RWS Sensor).
Extracts a probabilistic behaviour representing the Detect Troops event, with its pre-conditions and the resulting state (where the resources/information Troop and Open Country and the relationships between them are all captured in the ontology as part of the Domain Model)
Identifies and classifies uncertainties, for example that the knowledge graph has no information establishing what it means to be in open country, or whether the ADAV and the Troops both need to be in Open Country. Kompozition links each uncertainty with the requirement, the function, the behavioural representation, and all relevant entities in the ontology, and links the requirement, through the functional decomposition, to an objective already in the platform.
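To make the steps above concrete, the following is a minimal, hand-written sketch of the kind of elemental record such an NLP parse could produce. All class and field names here are illustrative assumptions, not Kompozition's actual data model:

```python
from dataclasses import dataclass, field

# Hypothetical record of the elemental information an NLP parse of the
# ADAV requirement might yield. Names and structure are illustrative only.
@dataclass
class ExtractedFunction:
    name: str                    # function captured in the decomposition
    performer: str               # entity allocated as performer
    enabler: str                 # entity allocated as enabler
    performance: dict            # performance properties and thresholds
    conditions: list             # environmental conditions
    uncertainties: list = field(default_factory=list)

# A hand-written stand-in for the automated parse described above.
parsed = ExtractedFunction(
    name="Detect Troops",
    performer="ADAV",
    enabler="RWS Sensor",        # Onboard Sensor is an alias of RWS Sensor
    performance={"range_m": (">", 1000), "p_detect": (">", 0.90)},
    conditions=["open country"],
    uncertainties=[
        "Open Country is not defined in the knowledge graph",
        "Must both the ADAV and the Troops be in Open Country?",
    ],
)

print(parsed.name, parsed.performer, parsed.performance["range_m"])
```

Capturing the requirement as structured elements like these, rather than as an opaque text object, is what lets the platform relate the function, its performers, its performance thresholds and its open uncertainties to the rest of the graph.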
The knowledge graph links all these pieces of elemental information (compositional, structural, behavioural and uncertainty) with the requirement and with each other. It does this with all requirements and other sources of information provided to it, resulting in a complete, integrated model of that information. As a reference, the synthesis and integration of information from a set of requirements managed within the Kompozition platform were most recently applied within Defence, on JP9102, to create a high-fidelity model of the program's OCD and FPS.
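The linking described above can be sketched as a small labelled graph. The node names, relation labels, and query below are illustrative assumptions, not Kompozition's schema; they simply show how an integrated graph supports impact-of-change queries:

```python
# Illustrative edges (subject, relation, object) linking the elemental
# information extracted from the ADAV requirement. Labels are assumptions.
edges = [
    ("REQ-001", "identifies", "Detect Troops"),
    ("Detect Troops", "performed_by", "ADAV"),
    ("Detect Troops", "enabled_by", "RWS Sensor"),
    ("Onboard Sensor", "alias_of", "RWS Sensor"),
    ("Detect Troops", "constrained_by", "range > 1000 m"),
    ("Detect Troops", "constrained_by", "Pd > 90%"),
    ("UNC-01: Open Country undefined", "raised_against", "REQ-001"),
    ("REQ-001", "traces_to", "OBJ-07"),
]

def neighbours(node):
    """Everything directly linked to a node, in either direction."""
    return sorted({(rel, dst) for src, rel, dst in edges if src == node} |
                  {(rel, src) for src, rel, dst in edges if dst == node})

# Impact-of-change style query: what does changing REQ-001 touch?
for rel, other in neighbours("REQ-001"):
    print(f"REQ-001 --{rel}-- {other}")
```

Because every elemental fact is an edge rather than a sentence buried in a document, a change to any node (a requirement, a function, a constraint, an uncertainty) can be traced mechanically to everything it touches.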
AN OPEN API
Adopting the Kompozition platform and approach requires organisational change, process change, and corresponding training for engineers. Kompozition has been built with open APIs to integrate with, extract information from, and generate information for the tools already used by engineers, architects, designers, developers, testers and others, enabling organisations to migrate gradually to a complete Digital Engineering approach. Kompozition can also augment engineering teams with customer-success residents who are experts in the Digital Engineering approach that Kompozition supports.
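As a minimal sketch of what tool integration over an open REST-style API might look like, the snippet below builds (but does not send) a request pushing a requirement into the platform. The endpoint path, payload shape, and auth header are assumptions for illustration only; the platform's actual API documentation is authoritative:

```python
import json
from urllib.request import Request

# Hypothetical client helper for pushing a requirement over a REST API.
# The "/api/requirements" path and payload fields are assumed, not real.
def build_push_request(base_url, token, requirement_id, text):
    payload = {"id": requirement_id, "text": text}
    return Request(
        url=f"{base_url}/api/requirements",   # assumed endpoint
        data=json.dumps(payload).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )

req = build_push_request(
    "https://kompozition.example", "TOKEN", "ADAV-SYS-042",
    "The ADAV onboard sensors shall be capable of detecting troops ...",
)
print(req.full_url, req.method)
```

A thin wrapper like this, pointed at whatever endpoints the open API actually exposes, is typically enough to sync requirements from an existing tool as a first, incremental migration step.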