The Explainability Factor: Lockheed Martin and DarwinAI’s Game-Changing AI Platform
Background: The Challenge of the AI Black Box

DarwinAI’s GenSynth Explainability Platform

In addition, the GenSynth platform enables developers to:
- Automatically generate new models that meet performance targets (such as accuracy) within operating constraints (for example, parameter count, model size, FLOPs) – reducing development timelines from months to days.
- Provide transparency at every stage of the development process: revealing bottlenecks in models, visualizing performance comparisons across experiments, and identifying errors, biases, and other issues in both models and data.
- Reveal the critical factors that drive a model's decisions – showing why decisions are made, surfacing hidden bias, and enabling more effective and efficient audits. (Such insights are key to building robust, trustworthy AI systems.)
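GenSynth's internals are proprietary, but the third capability above – revealing which inputs drive a model's decision – is the general idea behind gradient-based attribution (saliency). The sketch below is a minimal, hypothetical illustration, not GenSynth code: for a toy logistic model, the gradient of the output with respect to each input feature indicates how much that feature influences the decision.

```python
import numpy as np

def saliency(weights, bias, x):
    """Gradient of a logistic model's output w.r.t. each input feature.

    A larger |gradient| means the feature has more influence on the
    decision at this input point.
    """
    z = weights @ x + bias
    p = 1.0 / (1.0 + np.exp(-z))      # model's confidence (sigmoid)
    grad = p * (1.0 - p) * weights    # d p / d x_i for a logistic model
    return p, np.abs(grad)

# Toy "model" (hypothetical weights): feature 2 dominates the decision.
w = np.array([0.1, 0.2, 3.0, 0.05])
b = -0.5
x = np.array([1.0, 1.0, 1.0, 1.0])

p, attribution = saliency(w, b, x)
top_feature = int(np.argmax(attribution))  # most influential input feature
```

In a real deep network the same question is answered by backpropagating the output to the input (or with more robust variants such as integrated gradients); the attribution map is what lets an auditor check whether the model is relying on legitimate signal or on a spurious, biased feature.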

AI Opportunities for the Future Battlespace
Embracing AI promises considerable benefits for the warfighter through productivity growth and innovation. According to Dave Casey, senior manager of Lockheed Martin's AI Center, AI can sort through data faster than any human mind, reducing the cognitive load on soldiers in the field so they can focus on other components of the mission. Given AI's impact on how militaries operate and protect their forces, it is essential to understand how these systems reach their decisions.
With so much at stake, Fernandez added, it is critical that AI work robustly in all field scenarios, even unexpected ones: “In layman’s terms, this means that the AI will behave in a way that you intend, that it will be trustworthy under a variety of permutations.”
