Engine Control Validation & HIL-Style Test Rig Simulator
A deterministic C11 engine-control validation simulator engineered with industrial software discipline and CI-enforced quality gates.
This project is built to generate reviewable evidence, not just a “demo that runs”. It implements a software-only simulation that mirrors key concepts of hardware-in-the-loop (HIL) validation workflows, without interfacing with physical rigs.
It models a SIL-style workflow where unsafe conditions must be detected reliably, noise must not cause spurious trips, and every change is gated by automation.
The deliverable is a command-line simulator that produces deterministic, structured JSON output (validated against a schema) and a separate read-only visualizer that consumes that JSON. The JSON is shaped so it can be automatically validated and used as a gate in CI workflows.
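For illustration only, one appended record might look like the following; the field names and shape here are invented for this example and are not the project's actual schema:

```json
{
  "tick": 14,
  "mode": "SHUTDOWN",
  "rpm": 0,
  "coolant_temp_c": 118.5,
  "oil_pressure_kpa": 95.0,
  "requirement_ids": ["REQ-TEMP-001"]
}
```

Because every field is typed and every record is tied to a tick, a schema validator can gate the output mechanically and a reviewer can diff two runs record by record.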
I treat it as a validation harness. Requirements are explicit, fault injection is explicit, and quality gates are enforced in automation. The goal is to make review practical: you should be able to read the docs, run the same checks locally as in CI, and trust that identical inputs yield identical outputs.
The simulator advances in discrete ticks. On each tick it ingests exactly one frame (or observes a timeout), validates and decodes it at the HAL boundary, updates the engine domain state machine, evaluates persistence-window safety logic, and appends one deterministic JSON record. Fault injection happens before HAL validation, for example by corrupting a checksum from a scripted directive, so error handling is exercised under the same boundary constraints as normal bus traffic.
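As a rough sketch, the per-tick pipeline above could look like the following; every name here (`frame_t`, `sim_tick`, the JSON field names) is illustrative, not the project's actual API:

```c
#include <assert.h>
#include <stdbool.h>
#include <stdint.h>
#include <stdio.h>

/* Possible outcomes of one tick: normal record, frame rejected at the
 * HAL boundary, or a bus timeout observed instead of a frame. */
typedef enum { TICK_OK, TICK_FRAME_REJECTED, TICK_TIMEOUT } tick_result_t;

/* Hypothetical decoded frame; `present` is false when the bus timed out. */
typedef struct { bool present; bool checksum_ok; uint16_t rpm; } frame_t;

static tick_result_t sim_tick(uint32_t t, const frame_t *f,
                              char *out, size_t n) {
    if (!f->present) {                        /* timeout observed this tick */
        snprintf(out, n, "{\"tick\":%u,\"event\":\"timeout\"}", t);
        return TICK_TIMEOUT;
    }
    if (!f->checksum_ok) {                    /* rejected at the HAL boundary */
        snprintf(out, n, "{\"tick\":%u,\"event\":\"frame_rejected\"}", t);
        return TICK_FRAME_REJECTED;
    }
    /* ...update the engine state machine and persistence-window logic... */
    snprintf(out, n, "{\"tick\":%u,\"rpm\":%u}", t, f->rpm);
    return TICK_OK;
}
```

Because injected faults (a flipped checksum, a suppressed frame) enter before validation, the error paths above run under exactly the same boundary constraints as normal traffic.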
Industrial engine controllers must detect unsafe conditions and transition to a safe state without false triggers from transient noise.
The simulator models faults such as temperature high, oil pressure low, combined high-load conditions, bus corruption and timeouts, and sensor disagreement. The requirement shape is persistence based: avoid reacting to single-sample noise while still reacting within a bounded window. Determinism matters because the output is a validation artifact. If identical inputs do not yield identical outputs, it becomes difficult to debug field-like faults, to perform credible review, and to gate changes in CI.
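A minimal sketch of the persistence idea, assuming a per-condition counter that must saturate before tripping and that resets on recovery; the window length, types, and names are illustrative, not the project's calibration:

```c
#include <assert.h>
#include <stdbool.h>
#include <stdint.h>

/* Hypothetical persistence window: trip only after `window` consecutive
 * unsafe samples, never on a single noisy one. */
typedef struct { uint8_t count; uint8_t window; } persist_t;

static bool persist_step(persist_t *p, bool unsafe) {
    if (!unsafe) {
        p->count = 0;                 /* recovery resets the window */
        return false;
    }
    if (p->count < p->window)
        p->count++;                   /* saturating counter, no overflow */
    return p->count >= p->window;     /* trip exactly at the window bound */
}
```

The boundary cases called out later (recovery resets, off-by-one ticks) are exactly the ones this counter must get right.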
In industrial workflows, control logic is typically validated in staged environments. A SIL-style simulator stabilizes logic and corner cases before hardware is available. A HIL rig validates the same decisions against real I/O and timing on a bench. Simulation fills the gap between requirements and hardware availability by enabling repeatable fault injection (stuck-at, drift, corrupt frames, missing frames) while producing audit-friendly artifacts such as expected vs actual outcomes, requirement IDs, and machine-readable reports that can be gated in CI.
The implementation makes determinism a design constraint rather than a property you hope for. Wall-clock time is replaced by a discrete tick counter, execution is single-threaded, and the simulator logic uses no dynamic heap allocation: data structures are fixed-size for safety and analyzability. Fixed-capacity buffers and bounded loops keep behavior predictable. Output is serialized as JSON and treated as a contract through schema validation and compatibility checks.
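One way to honor the no-heap constraint is a fixed-capacity buffer that rejects writes instead of growing; this sketch is illustrative, not the project's actual data structure:

```c
#include <assert.h>
#include <stdbool.h>
#include <stdint.h>

#define EVT_CAP 16u   /* capacity fixed at compile time: no malloc, ever */

/* Hypothetical event buffer: a fixed array plus a bounded write index. */
typedef struct {
    uint32_t ticks[EVT_CAP];
    uint32_t head;    /* number of stored events, always <= EVT_CAP */
} evt_buf_t;

static bool evt_push(evt_buf_t *b, uint32_t tick) {
    if (b->head >= EVT_CAP)
        return false;                 /* reject instead of reallocating */
    b->ticks[b->head++] = tick;
    return true;
}
```

The caller must handle the `false` return, which makes overflow an explicit, testable status rather than a silent reallocation.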
The codebase is organized as a strict dependency stack, and the boundary is enforced by an include-scan in CI (make analyze-layering). At a high level: the domain layer contains the engine state machine and control logic, the platform layer implements the HAL boundary and frame validation, the scenario layer provides scenario profiles and the script parser, the reporting layer handles deterministic formatting, and the app layer is the CLI orchestration.
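An include-scan of this kind can be as simple as a grep over each layer's sources. The following is a hypothetical stand-in for what `make analyze-layering` might do, with an invented directory layout:

```shell
# Hypothetical layering check: the domain layer must not reach "up" into
# platform, scenario, reporting, or app. Directory names are invented.
set -e
mkdir -p demo/src/domain
cat > demo/src/domain/engine_fsm.c <<'EOF'
#include "domain/engine_fsm.h"
EOF
violations=$(grep -rnE '#include *"(platform|scenario|reporting|app)/' demo/src/domain || true)
if [ -n "$violations" ]; then
    echo "layering violation:"
    echo "$violations"
    exit 1
fi
echo "layering OK"
```

Failing the build on a match is what turns the layering diagram from documentation into an enforced invariant.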
Error paths are treated as first-class behavior. Functions return explicit status codes that carry severity and recoverability, rather than relying on “best effort” fallthrough. Strict script parsing (--strict) is used so scenario inputs either validate and run deterministically, or fail fast with structured diagnostics.
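One plausible shape for such status codes, carrying severity and recoverability alongside the code itself; the encoding is an assumption, not the project's actual type:

```c
#include <assert.h>
#include <stdbool.h>
#include <stdint.h>

/* Hypothetical severity scale, ordered so comparisons are meaningful. */
typedef enum { SEV_INFO, SEV_WARNING, SEV_ERROR, SEV_FATAL } severity_t;

typedef struct {
    int16_t    code;         /* module-specific status code      */
    severity_t severity;     /* how serious the condition is     */
    bool       recoverable;  /* whether the caller may retry     */
} status_t;

/* Policy is a pure function of the status, so it can be unit tested. */
static bool status_must_halt(status_t s) {
    return s.severity == SEV_FATAL ||
           (s.severity == SEV_ERROR && !s.recoverable);
}
```

Because severity and recoverability travel with the code, callers decide halt-versus-retry from the status alone instead of guessing from context.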
Correctness is treated as a set of independently checkable signals. Unit tests (make test-unit) cover the unit-testable modules. Integration scenarios run the built-in scenario suite (./build/testrig --run-all) and compare expected vs actual outcomes with requirement IDs attached. Fault injection is exercised through the same HAL boundary as normal traffic using scripted directives such as TICK <n> FRAME CORRUPT. JSON output is validated against the schema, and schema backward compatibility is checked. Repeatability is enforced by deterministic replay (make determinism-check) that compares SHA-256 hashes across two identical runs. Runtime defects are probed with ASan and UBSan (make analyze-sanitizers) and deep memory validation with Valgrind Memcheck (make analyze-valgrind).
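The determinism gate reduces to hashing two runs and comparing. In this sketch a stand-in function replaces the real `./build/testrig` invocation, but the shape matches what `make determinism-check` is described as doing:

```shell
# Sketch of a determinism replay gate: run identical inputs twice and
# require byte-identical output. run_sim is a stand-in for the simulator.
run_sim() { printf '{"tick":1,"mode":"RUNNING"}\n'; }
run_sim > out1.json
run_sim > out2.json
h1=$(sha256sum out1.json | cut -d' ' -f1)
h2=$(sha256sum out2.json | cut -d' ' -f1)
if [ "$h1" = "$h2" ]; then
    echo "deterministic"
else
    echo "nondeterministic"
    exit 1
fi
```

Comparing hashes rather than diffing keeps the gate cheap and gives a single pass/fail bit that CI can act on.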
The intent of make ci-check is that every merge produces artifacts you can review and trust. It bundles release and debug builds with -Werror, unit and integration execution, static analysis (cppcheck, clang-tidy, and the MISRA addon), layering enforcement, runtime checks (ASan and UBSan), optional deep checks (Valgrind, enabled in CI with CI_VALGRIND=1), JSON contract and schema validation, schema compatibility checks, determinism replay, and a visualizer boundary audit that enforces a JSON-only interface. This is deliberately redundant because different checks catch different classes of regression.
On the current build, 255 out of 255 unit tests pass and 10 out of 10 validation scenarios pass with requirement-tagged reporting. Source-only line coverage on the unit-tested modules is 100% (1355 out of 1355 measured lines). Valgrind Memcheck reports zero leaks and zero invalid accesses across unit tests and scenario execution. These checks are aggregated behind make ci-check, which is the single entry point used as the quality gate.
The simulator is intentionally documented like a small safety-oriented subsystem so a reviewer can trace intent to implementation to verification. The docs/ set includes a structured safety argument (hazards, safety functions, residual risks), a compact FMEA with RPN ratings, and a requirement traceability matrix that maps requirement IDs to scenarios and unit tests. It also includes a MISRA deviations log with rationale and mitigation, ADRs that record the architectural decisions behind determinism, layering, and the JSON contract, plus notes that document the determinism guarantee and schema evolution policy.
This project models pre-hardware validation workflows without claiming production equivalence.
It mirrors SIL-style validation used to stabilize control logic before bench time, and the deterministic replay check supports review practices where identical inputs must yield identical artifacts. Requirement-tagged JSON output is intentionally structured to resemble a minimal software validation component aligned with automation lab pipelines and CI-driven test environments, without coupling the simulator to any vendor system.
An extension path to HIL would be to replace the simulated bus and frame ingress with real I/O while keeping the domain and control behavior unchanged.
What this project deliberately does not model: real-time scheduling, interrupt-driven concurrency, hardware register-level drivers, and certified toolchains. Those concerns would enter when moving from deterministic SIL simulation to real-time embedded deployment.
Design constraints are explicit: deterministic execution (no threads, no wall-clock dependencies, no randomness), no global mutable state in the simulator logic (data crosses boundaries through explicit interfaces), fixed-capacity buffers with bounded loops, and a CI-before-merge policy where correctness signals are gated by automation rather than manual inspection. Static analysis and MISRA checks are treated as build-time constraints.
Deterministic replay turns debugging into diffing by bytes and ticks, rather than hoping to reproduce timing. Layer boundaries are easiest to preserve when CI treats violations as failures, not warnings. Persistence windows behave like debouncing and must be validated at boundary conditions such as recovery resets and off-by-one ticks. Treating JSON as a contract creates a stable interface between deterministic validation and downstream tooling, including CI gates and visualization.
The FRAME CORRUPT directive injects a bad checksum, which the HAL rejects just like a real bus error. New scenarios can be added without changing any source code.
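A toy model of that path: the injector flips the checksum before the frame reaches the HAL, and validation rejects it exactly as it would a real bus error. The frame layout and XOR checksum here are illustrative, not the project's wire format:

```c
#include <assert.h>
#include <stdbool.h>
#include <stdint.h>

/* Hypothetical bus frame: a small payload plus a one-byte checksum. */
typedef struct { uint8_t payload[4]; uint8_t checksum; } frame_t;

static uint8_t frame_checksum(const frame_t *f) {
    uint8_t sum = 0;
    for (int i = 0; i < 4; i++)
        sum ^= f->payload[i];         /* simple XOR checksum for the demo */
    return sum;
}

static bool hal_accept(const frame_t *f) {
    return frame_checksum(f) == f->checksum;
}

/* What a scripted FRAME CORRUPT directive might do: damage the checksum
 * so the frame fails validation at the HAL boundary. */
static void inject_corrupt(frame_t *f) {
    f->checksum ^= 0xFFu;
}
```

Because corruption happens before validation, the rejection path being tested is the same one real bus errors would take.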
The simulator itself is a command-line tool that outputs JSON. That is great for CI pipelines and automated gating, but not very helpful when you want to understand why a scenario triggered a shutdown at tick 14 or see how oil pressure drifted over time. The visualizer exists to make that kind of analysis quick and intuitive.
It is a separate Raylib program that reads the JSON output file and plays it back as an animated dashboard. It shows RPM, temperature, and oil pressure as real-time line graphs with threshold overlays drawn at the calibrated limits. The current engine mode (INIT, STARTING, RUNNING, WARNING, SHUTDOWN) is displayed alongside the graphs so you can see exactly which tick caused a state transition. A scrubbable tick slider lets you step forward and backward through the scenario. You can switch between multiple scenarios and see cumulative pass/fail statistics.
The visualizer is strictly read-only. It cannot call simulator functions or modify simulation state. This separation is intentional: the deterministic core must never be influenced by rendering code, so the two programs communicate only through the JSON file on disk.
Screenshots: the Raylib visualizer showing line graphs of RPM, temperature, and oil pressure with threshold overlays, the engine mode display, and the tick slider, rendered in the DOS, Onyx, Gruvbox, and Light themes.
The outcome is a deterministic, tick-stepped simulator that produces repeatable validation artifacts suitable for CI gating, with an explicit separation between deterministic simulation and read-only visualization. Validation evidence is measurable and automated through tests, scenarios, coverage, schema and contract checks, determinism replay, sanitizers, and Valgrind.