State drift
State drift refers to the divergence between the actual state of dynamic information (price, stock, delivery, promotion, availability, policy, status) and the state returned by an AI system: the system responds as if the state were stable when it has in fact changed.
In interpretive governance, state drift is a critical risk because it turns a plausible response into a false one without the error looking spectacular. It can also create implicit liability: the outdated answer may be read as an attributed promise.
Definition
State drift is the situation where:
- a rapidly varying attribute has an actual current state (source, system, database);
- the AI system returns an outdated or approximate state;
- and the response does not signal uncertainty, date, or validity condition.
State drift is a collision between real time and stabilized interpretation.
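The three conditions above can be sketched as a minimal check. The `DynamicAttribute` fields and the `validity_signaled` flag below are illustrative assumptions, not a prescribed schema:

```python
from dataclasses import dataclass

@dataclass
class DynamicAttribute:
    """A rapidly varying attribute (price, stock, policy, ...)."""
    name: str
    actual_value: str        # current state in the source system
    returned_value: str      # state returned by the AI system
    validity_signaled: bool  # did the response carry a date / validity condition?

def is_state_drift(attr: DynamicAttribute) -> bool:
    # Drift = returned state diverges from the actual state AND the
    # response signals no uncertainty, date, or validity condition.
    return attr.returned_value != attr.actual_value and not attr.validity_signaled

price = DynamicAttribute("price", actual_value="19.90",
                         returned_value="24.90", validity_signaled=False)
print(is_state_drift(price))  # → True
```

Note that a divergent value accompanied by an explicit validity condition would not count as drift under this definition: the response has signaled its own limits.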
Why this is critical in AI systems
- The model freezes: it transforms an observed state into a durable property.
- Updates are not guaranteed: the open web, caches, secondary sources, and aggregators prolong the old state.
- Risk is operational: a wrong response can trigger an erroneous decision.
Frequent sources of state drift
- Unbounded dynamic information: absence of date, version, validity period.
- Inertia / remanence: old values persist in responses.
- Incomplete RAG: retrieval does not surface the most recent state.
- Dominant neighborhood: aggregators and secondary pages stabilize an obsolete state.
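A common mitigation for these sources is a freshness gate on retrieved snapshots: reject any state older than its validity window before it reaches the response. This is a minimal sketch; the one-hour `MAX_AGE` window is an arbitrary illustrative value, not a recommendation:

```python
from datetime import datetime, timedelta, timezone
from typing import Optional

# Illustrative validity window for fast-moving attributes (price, stock).
MAX_AGE = timedelta(hours=1)

def fresh_enough(retrieved_at: datetime, now: Optional[datetime] = None) -> bool:
    """True if a retrieved snapshot is still inside its validity window."""
    now = now or datetime.now(timezone.utc)
    return now - retrieved_at <= MAX_AGE

snapshot_time = datetime(2024, 1, 1, 12, 0, tzinfo=timezone.utc)
query_time = datetime(2024, 1, 1, 14, 0, tzinfo=timezone.utc)
print(fresh_enough(snapshot_time, query_time))  # → False (snapshot is 2 hours old)
```

A stale snapshot should route to a legitimate non-response rather than being served as current.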
Practical indicators (symptoms)
- The response never mentions “as of”, “according to the latest update”, “validity period”.
- Two users obtain different states within minutes of each other.
- The system provides a price, stock level, or policy without citing a “real-time” source.
- The system responds affirmatively to a question that should trigger a response condition.
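The first symptom, absence of validity markers, lends itself to a crude automated scan. The marker list below is illustrative and would need domain- and language-specific extension:

```python
import re

# Illustrative validity markers; a real deployment would extend this list.
VALIDITY_MARKERS = [
    r"\bas of\b",
    r"\baccording to the latest update\b",
    r"\bvalidity period\b",
    r"\b\d{4}-\d{2}-\d{2}\b",  # an explicit ISO date
]

def signals_validity(response: str) -> bool:
    """True if the response carries at least one validity marker."""
    return any(re.search(p, response, re.IGNORECASE) for p in VALIDITY_MARKERS)

print(signals_validity("The price is 24.90."))                    # → False
print(signals_validity("As of 2024-05-01, the price is 24.90."))  # → True
```

A `False` result on a dynamic attribute is a symptom, not proof: the scan flags candidates for review, it does not establish drift by itself.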
What state drift is not
- It is not a simple static factual error. It is a time-related error.
- It is not only a data problem. It is a response conditions problem.
- It is not merely a retrieval problem. Even with retrieval, the AI can freeze and generalize.
Minimum rule (enforceable formulation)
Rule SD-1: any dynamic information must be returned with a validity condition (date, source, period), or produce a legitimate non-response if the state cannot be established from authorized, up-to-date sources. Failing that, the output must be treated as state drift.
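Rule SD-1 can be sketched as a response wrapper: either the state is served with its validity condition attached, or the system emits a legitimate non-response. `SourcedState`, `answer_dynamic`, and the field names are hypothetical illustrations, not part of any defined standard:

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class SourcedState:
    """A dynamic value carrying its own validity condition."""
    value: str
    source: str
    as_of: datetime

def answer_dynamic(state: Optional[SourcedState]) -> str:
    # SD-1: no up-to-date sourced state -> legitimate non-response.
    if state is None:
        return "I cannot establish the current state without an up-to-date source."
    # Otherwise the value is always returned WITH its validity condition.
    return f"{state.value} (source: {state.source}, as of {state.as_of:%Y-%m-%d})"

print(answer_dynamic(None))
print(answer_dynamic(SourcedState("19.90", "catalog-api", datetime(2024, 5, 1))))
```

The key design choice is that the wrapper cannot emit a bare value: every path either attaches the condition or declines to answer.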
Example
Question: “What is the current price / current availability?”
Typical error: the AI provides a figure with no date and no real-time source.
Governed output: “I cannot establish the current state without an up-to-date source. Here is the last dated value…”.