Stability checkpoint — Customer Churn Phase B (Java ONNX inference)

- feat(ml): Phase B — Java in-process ONNX inference for Customer Churn
- fix(ml): add Clock @Bean so ChurnFeatureExtractor wires in Spring context

CI:
- ✅ Main pipeline #2482737100 green — https://gitlab.com/mirador1/mirador-service-java/-/pipelines/2482737100
- ✅ MR pipeline #2482709064 green — https://gitlab.com/mirador1/mirador-service-java/-/pipelines/2482709064
- ✅ Phase B MR !232 merged via auto-merge — https://gitlab.com/mirador1/mirador-service-java/-/merge_requests/232

Local test pass:
- ✅ ./mvnw verify -q — BUILD SUCCESS, 19 new ML tests pass (ChurnFeatureExtractorTest 6, ChurnPredictorTest 5, ChurnMcpToolServiceTest 3, RiskBandTest 5) + existing suite
- ⏭ ./mvnw verify -Dcompat -Dsb3 -q — N/A: Phase B touches only the ML slice, no new SB3/Java17 risk surface
- ⏭ bin/dev/api-smoke.sh — N/A: ONNX file not yet provisioned in dev (graceful 503 contract verified via integration tests)
- ⏭ Manual MCP query against running JAR — deferred until Phase F provisions the ConfigMap (ADR-0062). Tool registration verified in McpServerITest (14 → 15 expected tools).

Regression check vs stable-v1.2.11:
- ✅ MCP catalogue: 14 → 15 tools (predict_customer_churn added). McpServerITest asserts the new entry.
- ✅ Spring context boots when /etc/models/churn_predictor.onnx is missing — ChurnPredictor#isReady() returns false, the REST endpoint returns 503, every other endpoint keeps working unchanged.
- ✅ Cross-language guarantee (per ADR-0060): the 8-feature extractor on this side is parity-tested against the Python sibling's golden inputs.
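The cross-language parity idea can be sketched in a few lines. Everything below is illustrative, not the real test: the actual suite compares ChurnFeatureExtractor output against golden vectors produced by the Python sibling, while this sketch uses a stand-in extractor and a hard-coded golden vector.

```java
// Hypothetical sketch of the ADR-0060 parity check: the Java extractor's
// 8-feature output is compared element-wise, within a small tolerance,
// against a golden vector emitted by the Python sibling. Names and values
// here are invented for illustration.
public class ParityCheck {
    static final double EPS = 1e-6;

    // Stand-in for ChurnFeatureExtractor: returns a fixed 8-feature vector.
    static double[] extractFeatures() {
        return new double[] {0.12, 3.0, 45.5, 0.0, 1.0, 7.0, 0.88, 2.5};
    }

    // True when both vectors have the same length and agree within EPS.
    static boolean matchesGolden(double[] actual, double[] golden) {
        if (actual.length != golden.length) return false;
        for (int i = 0; i < actual.length; i++) {
            if (Math.abs(actual[i] - golden[i]) > EPS) return false;
        }
        return true;
    }

    public static void main(String[] args) {
        double[] golden = {0.12, 3.0, 45.5, 0.0, 1.0, 7.0, 0.88, 2.5};
        System.out.println(matchesGolden(extractFeatures(), golden)); // prints true
    }
}
```

An element-wise tolerance (rather than exact equality) is the usual choice here, since float arithmetic can differ slightly across runtimes.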

Architecture:
- LLM integration: Spring AI 1.1.4 + Ollama local LLM + 15 in-process MCP tools (per-method @Tool, ADR-0062). Catalogue grows from 14 → 15 with predict_customer_churn (ChurnMcpToolService).
- AI observability: gen_ai.* OTel spans → Tempo (unchanged from the previous tag).
- **NEW** Trained model in-process: ChurnPredictor wraps ONNX Runtime (com.microsoft.onnxruntime:onnxruntime:1.21.0). No sidecar, no network hop per inference, identical predictions across Java + Python (ADR-0060 cross-language contract). 8-feature extractor (ChurnFeatureExtractor) parity-tested vs the Python sibling. Risk bands (LOW/MEDIUM/HIGH) with thresholds 0.3 / 0.7 (ADR-0061).
- **NEW** Graceful degradation: missing /etc/models/churn_predictor.onnx → REST 503, MCP ServiceUnavailableDto, all other endpoints unaffected.
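The 0.3 / 0.7 banding from ADR-0061 can be sketched as a tiny enum mapping. This is an assumption-laden illustration: the real RiskBand enum may treat the boundary values (exactly 0.3 or 0.7) differently.

```java
// Minimal sketch of the ADR-0061 risk-band mapping with the 0.3 / 0.7
// thresholds quoted above. Boundary handling (< vs <=) is assumed, not
// confirmed by the real RiskBand implementation.
public class RiskBandSketch {
    enum RiskBand { LOW, MEDIUM, HIGH }

    static RiskBand fromProbability(double p) {
        if (p < 0.0 || p > 1.0) {
            throw new IllegalArgumentException("probability out of [0,1]: " + p);
        }
        if (p < 0.3) return RiskBand.LOW;
        if (p < 0.7) return RiskBand.MEDIUM;
        return RiskBand.HIGH;
    }

    public static void main(String[] args) {
        System.out.println(fromProbability(0.15)); // LOW
        System.out.println(fromProbability(0.45)); // MEDIUM
        System.out.println(fromProbability(0.85)); // HIGH
    }
}
```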

Security:
- AuthN: JWT + X-API-Key (unchanged). The new /customers/{id}/churn-prediction endpoint requires authentication via the same chain.
- AuthZ: @PreAuthorize("isAuthenticated()") on ChurnController. Not admin-gated — predictions are read-only.
- CVE posture: grype/trivy/owasp-dc all green at HEAD; the new onnxruntime:1.21.0 is pinned (no floating tag).
- Headers + filters: CSP, HSTS, rate limiting, idempotency, request-id correlation all unchanged.

API:
- New domain feature: Customer Churn prediction REST endpoint POST /customers/{id}/churn-prediction → ChurnPredictionDto (probability, riskBand, topFeatures, modelVersion, predictedAt).
- New MCP tool: predict_customer_churn(customer_id) with soft-error DTOs (NotFoundDto, ServiceUnavailableDto) so LLM callers receive structured errors instead of raw exceptions.
- Breaking-API check vs the previous tag: none. Net additions only.
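For readers skimming the contract, the DTO shape listed above can be sketched as a Java record. The field names come from the notes; the types and sample values are assumptions (the real ChurnPredictionDto may use different types or a builder).

```java
import java.time.Instant;
import java.util.List;

// Hypothetical shape of ChurnPredictionDto, built from the field list in
// the notes above. Types and example values are illustrative only.
public class ChurnDtoSketch {
    record ChurnPredictionDto(
            double probability,       // model output in [0, 1]
            String riskBand,          // LOW / MEDIUM / HIGH per ADR-0061
            List<String> topFeatures, // placeholder ordering until Phase E (SHAP)
            String modelVersion,
            Instant predictedAt) {}

    public static void main(String[] args) {
        ChurnPredictionDto dto = new ChurnPredictionDto(
                0.82, "HIGH",
                List.of("tenure_months", "support_tickets"),
                "churn-model-v1",                  // assumed version label
                Instant.parse("2025-01-01T12:00:00Z"));
        System.out.println(dto.riskBand()); // prints HIGH
    }
}
```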

Deploy / IaC:
- Deploy targets: GKE/AKS/EKS/Cloud Run/Fly.io (unchanged from the previous tag).
- IaC: Terraform unchanged.
- Cost discipline: ≤ €2/month idle (ADR-0022).
- ConfigMap mount path /etc/models/churn_predictor.onnx wired into the deployment manifests via shared !4 (Phase F).
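The mount wiring could look roughly like the fragment below. This is a hypothetical sketch, not the manifests from shared !4: the ConfigMap name and `optional: true` flag are assumptions chosen to match the graceful-degradation contract (a missing model yields 503, never a crashed pod).

```yaml
# Hypothetical Deployment fragment: mount the model ConfigMap at the path
# ChurnPredictor reads from. Names are illustrative.
spec:
  containers:
    - name: mirador-service-java
      volumeMounts:
        - name: churn-model
          mountPath: /etc/models     # predictor expects /etc/models/churn_predictor.onnx
          readOnly: true
  volumes:
    - name: churn-model
      configMap:
        name: churn-predictor-onnx   # assumed ConfigMap name (Phase F provisions it)
        optional: true               # absent model → isReady() false → 503, not CrashLoop
```

Worth noting: ConfigMaps (including binaryData) are capped at roughly 1 MiB, so this pattern only works for small models; a larger ONNX file would need a different delivery mechanism.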

Observability:
- SLO/SLA: 3 SLOs as code, multi-burn-rate alerts (unchanged).
- Tracing / metrics / logs: OTel exporter healthy, Tempo / Mimir / Loki tail (unchanged).
- New observability surface (Phase E): drift SLO + daily KS-test series — DEFERRED to the next session.

Quality gates:
- Coverage: JaCoCo gate 70% (unchanged); the new ml/* package contributes via 4 dedicated unit-test classes (19 cases, 100% pass rate).
- Mutation: PIT (unchanged baseline).
- Static analysis: SonarCloud quality gate (unchanged).
- Test pyramid: +19 unit tests, +1 IT update (McpServerITest tool-count assertion).
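For reference, a 70% line-coverage gate with JaCoCo is typically expressed as a `check` rule in the pom. This is a generic sketch of that configuration, not the repo's actual pom (the element/counter choices are assumptions):

```xml
<!-- Hypothetical jacoco-maven-plugin fragment enforcing the 70 % gate.
     The real pom may gate on a different element or counter. -->
<plugin>
  <groupId>org.jacoco</groupId>
  <artifactId>jacoco-maven-plugin</artifactId>
  <executions>
    <execution>
      <id>check</id>
      <goals><goal>check</goal></goals>
      <configuration>
        <rules>
          <rule>
            <element>BUNDLE</element>
            <limits>
              <limit>
                <counter>LINE</counter>
                <value>COVEREDRATIO</value>
                <minimum>0.70</minimum>
              </limit>
            </limits>
          </rule>
        </rules>
      </configuration>
    </execution>
  </executions>
</plugin>
```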

CI/CD:
- Pipeline stages green: lint | test | integration | k8s | package | sonar | security.
- Compat matrix: SB3 + Java 17/21 and SB4 + Java 17/21/25 (unchanged baselines).
- Release engineering: Conventional Commits respected (feat(ml), fix(ml)). Semver patch bump (1.2.11 → 1.2.12). CHANGELOG generated by release-please.

ADRs & patterns:
- ADRs: shared ADR-0060 (ONNX cross-language) + ADR-0061 (Customer Churn) + ADR-0062 (MLflow registry) — the Phase B section was just amended into ADR-0061 via shared !2.
- Patterns enforced: Hexagonal Lite (ADR-0044), feature slicing (ADR-0008), polyrepo flat α (ADR-0060), Clean Code 7 non-negotiables — function size ≤ 30 LOC, SRP, naming, why-comments, dependency rule, test-as-spec, no dead code. The new ml/ package follows the same conventions.
- File length: ChurnPredictor 164 LOC, ChurnFeatureExtractor 154 LOC, ChurnController 130 LOC — all well under the 1,000-LOC ceiling.

Frontend:
- ⏭ N/A — backend-only repo.

DX / docs:
- onnxruntime added to pom.xml as a regular dependency (no profile, no extra) — works out of the box for any developer cloning the repo.
- Documentation: new docs/ml/churn-prediction.md (REST + MCP usage, ONNX cross-language guarantee, model provisioning) — points readers at ADR-0060/0061/0062 and the Java + Python sibling docs.
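Using the coordinates quoted earlier (com.microsoft.onnxruntime:onnxruntime:1.21.0), the "regular dependency, no profile" statement corresponds to a plain pom.xml entry along these lines — a sketch, not a copy of the repo's pom:

```xml
<!-- onnxruntime pinned as an ordinary dependency, per the notes above -->
<dependency>
  <groupId>com.microsoft.onnxruntime</groupId>
  <artifactId>onnxruntime</artifactId>
  <version>1.21.0</version>
</dependency>
```

Pinning the exact version (rather than a range) is what keeps the grype/trivy/owasp-dc scans reproducible.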

Known limitations:
- ONNX file not yet provisioned in dev/CI: the prediction endpoints return 503 until bin/ml/promote_to_configmap.sh (Phase F, in mirador-service-shared) runs. The graceful-degradation contract is verified by ChurnPredictorTest.
- top_features list is a placeholder (canonical priority sequence) until Phase E adds per-prediction SHAP explanations.
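The 503-until-provisioned behaviour boils down to a readiness check on the model file. The sketch below captures that contract with assumed names and plain file I/O; the real ChurnPredictor wires this into Spring and ONNX Runtime session creation.

```java
import java.nio.file.Files;
import java.nio.file.Path;

// Sketch of the graceful-degradation contract: when the model file is
// absent the predictor reports not-ready, and callers map that to
// HTTP 503 instead of failing application startup. Names are illustrative.
public class GracefulDegradationSketch {
    static final Path MODEL_PATH = Path.of("/etc/models/churn_predictor.onnx");

    // Ready only when the ONNX file is present and readable.
    static boolean isReady(Path modelPath) {
        return Files.isReadable(modelPath);
    }

    // Status the prediction endpoint would return for this readiness state.
    static int statusFor(Path modelPath) {
        return isReady(modelPath) ? 200 : 503; // 503 Service Unavailable while unprovisioned
    }

    public static void main(String[] args) {
        Path missing = Path.of("no-such-model.onnx");
        System.out.println(statusFor(missing)); // → 503 when the file is absent
    }
}
```

The key design point is that readiness is re-evaluated per request, so provisioning the ConfigMap later flips the endpoint to 200 without a restart (an assumption about the real implementation, but consistent with the isReady() contract described above).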

Next:
- stable-v1.2.13: a real ONNX model provisioned in the CI smoke job + an integration test that exercises the full prediction path end-to-end.
- Phase E (mirador-service-shared): MLflow tracking server + drift SLO + Grafana dashboard + drift runbook.