TL;DR SE Newsletter – June 2025

The BLUF without the fluff. TL;DR:

  • INCOSE SE Handbook Study Group: March 2025 cohort ending; July 2025 cohort starting soon.
  • Project Gordon: Created 500+ flash cards for INCOSE SEP exam prep; now available.
  • MOSA with Digital Engineering: Panel at 2025 Army Summit discussed practical implementations.
  • Russian Cognitive Warfare: Dynamic influence operations targeting decision-making processes.
  • Model-Based Systems Engineering: Review on developing safe, sustainable circular systems.
  • Systems Thinking Definition: Proposed comprehensive definition for educational efforts.
  • ispace Lunar Crash: Japan’s ispace lost its Resilience lander on June 6, 2025, to a sensor failure similar to its 2023 crash.

INCOSE SE Handbook Study Group

The March 2025 cohort is quickly coming to a close, and the July 2025 cohort (starting July 7th) is just around the corner. Fitzgerald Systems is committed to the Study Group format because it delivers results: a systematic way to crush the handbook while giving members structured, regular feedback on how they are doing.

INCOSE SE Flash Cards

Project Gordon1 by Fitzgerald Systems aims to revolutionize systems engineering professional development by creating flash cards based on the INCOSE SE Handbook v5. These cards help prospective INCOSE SEP examinees prepare for the exam through active recall. The project, completed ahead of schedule, produced 500+ unique flash cards covering all handbook sections. The flash cards are now available, supporting professional growth and equal opportunity. The initiative reflects Fitzgerald Systems’ vision of promoting resolution, humility, industry, and frugality while bridging the gap between practitioners and academia.

Operationalizing MOSA with Digital Engineering

At the 2025 Army Summit, Rosemary Kramer moderated a panel on operationalizing the Modular Open Systems Approach (MOSA) using digital engineering (DE)2. The discussion focused on moving beyond policy into practical, field-tested implementations—particularly in Army Aviation. Real-world applications like Formula 1 racing and modern production lines were used to illustrate how modularity and digital iteration can drive speed, adaptability, and resilience. Kramer emphasized the importance of relatable examples, recalling a missed opportunity during the panel Q&A to connect MOSA to successful commercial practices.

Key insights from panelists included:

  1. Start Early, Iterate Often: Early collaboration and virtual testing via DE tools enable faster, safer integration of modular systems.
  2. Open Interfaces ≠ Open IP: Transparent system interfaces are essential, but do not require vendors to share proprietary designs.
  3. AI + DE = Agility: Artificial intelligence is accelerating digital engineering, which in turn supports modular, upgradable systems crucial for autonomy.
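The second insight, that open interfaces need not expose proprietary internals, maps naturally onto the interface/implementation split familiar from software. A minimal Python sketch (the interface, class names, thresholds, and calibration factor are all invented for illustration, not drawn from any Army program):

```python
from abc import ABC, abstractmethod

# Hypothetical published interface: the contract is open to every vendor,
# but the implementations behind it can stay proprietary.
class AltitudeSensor(ABC):
    """MOSA-style open interface for an altitude-reporting module."""

    @abstractmethod
    def altitude_m(self) -> float:
        """Return current altitude above terrain, in meters."""

class VendorAltimeter(AltitudeSensor):
    """One vendor's module: conforms to the interface, hides its IP."""

    def __init__(self, raw_reading: float):
        self._raw = raw_reading              # proprietary internals stay private

    def altitude_m(self) -> float:
        return self._raw * 0.99              # stand-in for proprietary calibration

def flight_mode(sensor: AltitudeSensor) -> str:
    # The platform integrates against the interface, not the vendor class,
    # so modules can be swapped without redesigning the system.
    return "terrain-follow" if sensor.altitude_m() < 100.0 else "cruise"

print(flight_mode(VendorAltimeter(50.0)))
```

Because `flight_mode` depends only on the published contract, a competing vendor's module drops in without touching the platform code, which is the point of the panel's distinction between open interfaces and open IP.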

Ultimately, Kramer stressed the cultural shift needed to fully embrace MOSA, including better communication across sectors and shared understanding through accessible analogies. DE isn’t just a toolkit—it’s a mindset driving transformation in defense systems architecture.

A Primer on Russian Cognitive Warfare3

Russia’s cognitive warfare represents a dynamic and tightly integrated system of influence operations designed to achieve strategic outcomes disproportionate to its conventional military power. This system targets adversaries’ decision-making processes through sustained manipulation of perception—operating across informational, cultural, military, and diplomatic domains. Key elements function in feedback loops: internal information control reinforces regime stability, which in turn fuels external operations to project power and mask deficiencies.

The Kremlin treats information as both an enabler and a weapon. Reflexive control—the strategic transmission of select narratives to induce specific responses—forms the system’s logic engine. Tactical-level disinformation feeds operational campaigns, which over time establish durable strategic premises that shape how the world interprets and responds to Russia’s actions.

A hallmark of this cognitive architecture is its adaptability and low-cost scalability. Russia exploits system vulnerabilities—ambiguity, distraction, psychological fatigue—in adversarial societies, while reinforcing its own with ideological insulation and censorship. Yet, the system reveals fragilities: its success depends on predictable patterns and overreliance on false premises, creating leverage points for disruption.

In systems terms, Russia’s cognitive warfare is a self-reinforcing structure engineered to reduce resistance without kinetic engagement. Understanding and intervening at the level of strategic assumptions—not just countering individual messages—is key to breaking the system’s influence loop.

A Review on the Application of Model-Based Systems Engineering in the Development of Safe Circular Systems

This paper4 presents a systematic literature review on how Model-Based Systems Engineering (MBSE) is applied to develop safe and sustainable (circular) systems. Using the Characteristic-Property Modelling (CPM) approach, it highlights how early design decisions—especially those relating to material, geometry, and system architecture—significantly impact sustainability outcomes (such as carbon footprint) and safety properties across the system’s lifecycle.

The study identifies MBSE’s strengths in managing complexity, improving traceability between requirements and design elements, and enabling early-stage simulations. It also explores the integration of Digital Twins, AI, and Knowledge Management (KM) to improve predictive decision-making, lifecycle optimization, and reverse logistics in circular systems. Methods like FMEA and Systems-Theoretic Process Analysis (STPA) are emphasized for safety assessment, though gaps remain in supporting trade-offs between safety and sustainability.
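Of the safety methods mentioned, FMEA is the most mechanical to sketch: each failure mode is ranked by a Risk Priority Number, the product of its severity, occurrence, and detection scores. A minimal Python illustration, with failure modes and scores invented for a circular-systems context:

```python
# Minimal FMEA sketch: rank failure modes by Risk Priority Number,
# RPN = severity x occurrence x detection (each scored 1-10).
# The failure modes and scores below are hypothetical examples.
failure_modes = [
    # (description,                             severity, occurrence, detection)
    ("Reused bearing fatigue underestimated",   8, 4, 6),
    ("Recycled alloy composition out of spec",  7, 3, 4),
    ("Traceability link lost at remanufacture", 5, 6, 7),
]

# Sort by descending RPN so the riskiest modes surface first.
ranked = sorted(
    ((desc, s * o * d) for desc, s, o, d in failure_modes),
    key=lambda item: item[1],
    reverse=True,
)
for desc, rpn in ranked:
    print(f"RPN {rpn:3d}  {desc}")
```

Note how a moderate-severity mode can still top the ranking when it occurs often and is hard to detect, which is exactly the kind of trade-off the paper says current tools handle poorly for reused components.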

Key findings show a lack of specialized tools for modeling the evolving properties of reused components and for managing risk under uncertainty. The paper proposes a classification of existing support methods and tools and calls for future research on harmonizing MBSE, KM, and safety assessment tools to close the gap in circular systems development.

A Definition of Systems Thinking: A Systems Approach

This 2015 paper5 proposes a definition of systems thinking for use across a wide variety of disciplines, with particular emphasis on the development and assessment of systems thinking educational efforts. The definition was derived from a review of the systems thinking literature combined with the application of systems thinking to itself. Many different definitions of systems thinking circulate in the systems community, but key components of a singular definition can be distilled from the literature. The researchers considered these components both individually and holistically, then proposed a new definition that integrates them as a system. The definition was tested for fidelity against a System Test and against three widely accepted system archetypes. Systems thinking is widely believed to be critical for handling the complexity facing the world in the coming decades, yet it still resides in the educational margins; the authors argue that for this skill to receive mainstream educational attention, a complete definition is required, and this research aims to provide one.

A Japanese lander crashed on the Moon after losing track of its location6

[Image: The Resilience lander and Tenacious rover, mounted near the top of the spacecraft, inside a test facility at the Tsukuba Space Center in Tsukuba, Ibaraki Prefecture, on Thursday, Sept. 12, 2024. Credit: Toru Hanai/Bloomberg via Getty Images]

Japan’s ispace suffered its second lunar crash when the Resilience lander failed to decelerate and impacted the Moon on June 6, 2025. Through a systems engineering lens, the failure highlights critical shortcomings in sensor integration and error-propagation control. A delayed response from the lander’s laser rangefinder—key to altitude determination—prevented proper descent control, much like the company’s earlier 2023 crash.
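A toy one-dimensional descent model makes the failure mechanism concrete: when braking logic keys off a lagging altitude reading, deceleration starts too late. All numbers below are illustrative, not ispace mission data:

```python
# Toy 1-D descent model: braking keys off a possibly stale altitude
# reading. All numbers are illustrative, not ispace mission data.
def simulate_descent(sensor_delay_steps: int, dt: float = 0.1) -> float:
    """Return touchdown speed (m/s) for a given sensor lag in timesteps."""
    altitude, velocity = 2000.0, -80.0       # start 2 km up, descending fast
    history = [altitude]                     # buffer of past sensor samples
    while altitude > 0.0:
        measured = history[0]                # oldest sample = lagged reading
        if measured < 500.0:                 # brake when sensor "sees" 500 m
            velocity = min(velocity + 15.0 * dt, -2.0)   # decelerate, cap at -2 m/s
        altitude += velocity * dt
        history.append(altitude)
        if len(history) > sensor_delay_steps + 1:
            history.pop(0)                   # drop samples older than the lag
    return -velocity

print(f"no sensor lag: touchdown at {simulate_descent(0):.1f} m/s")
print(f"4 s sensor lag: touchdown at {simulate_descent(40):.1f} m/s")
```

With no lag the lander settles to its capped terminal speed before touchdown; with a four-second lag, braking begins hundreds of meters too late and the vehicle reaches the surface at tens of meters per second — the same qualitative failure mode described in the mission account.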

The recurrence of similar failure modes suggests incomplete system closure from Mission 1, potentially tied to validation gaps or interface mismatches from switching rangefinder suppliers. ispace attempted a reboot, but no telemetry followed, revealing missing redundancy and autonomous fault recovery capabilities. The mission carried multiple international payloads (NASA, ESA, Luxembourg), underscoring the need for rigorous requirements traceability and risk management across stakeholders.

Despite the outcome, the mission reflects iterative design improvement and offers valuable lessons in guidance, navigation, and control (GNC) system robustness and integration testing. A structured root cause analysis, supported by fault modeling tools like STPA or FMEA, will be essential.

Resilience may have failed, but it advances the maturity of commercial lunar systems—where every anomaly is an opportunity to evolve.

  1. Project Gordon – Fitzgerald Systems ↩︎
  2. From Concept to Capability: Operationalizing MOSA with Digital Engineering | LinkedIn ↩︎
  3. A Primer on Russian Cognitive Warfare | Institute for the Study of War ↩︎
  4. Z. Lipšinić, N. Pavković and S. Husung, “A Review on the Application of Model-Based Systems Engineering in the Development of Safe Circular Systems,” in IEEE Access, vol. 13, pp. 100042-100063, 2025, doi: 10.1109/ACCESS.2025.3575578. ↩︎
  5. Ross D. Arnold, Jon P. Wade, A Definition of Systems Thinking: A Systems Approach, Procedia Computer Science, Volume 44, 2015, Pages 669-678, ISSN 1877-0509, https://doi.org/10.1016/j.procs.2015.03.050. ↩︎
  6. A Japanese lander crashed on the Moon after losing track of its location – Ars Technica ↩︎
