The streaming code assumed every stream produced a `response.completed`
event and dereferenced its data unconditionally, causing
`undefined method 'data' for nil` whenever OpenAI emitted
`response.failed`, `response.incomplete`, or a top-level `error` event
(e.g. expired `previous_response_id`, context-window overflow,
transient upstream failures). Surface a descriptive `Provider::Error`
instead.
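The parser-side handling described below can be sketched roughly as follows. This is a minimal illustration, not the app's real code: `StreamErrorData`, the `Chunk` struct, and `parse_event` are hypothetical stand-ins for whatever `ChatStreamParser` actually emits; the payload shapes are based on OpenAI's documented `response.failed` / `response.incomplete` / `error` stream events.

```ruby
# Hypothetical sketch of recognising error events in a stream parser.
# StreamErrorData, Chunk, and parse_event are illustrative names only.
StreamErrorData = Struct.new(:event, :message, :code, :details, keyword_init: true)
Chunk           = Struct.new(:type, :data, keyword_init: true)

ERROR_EVENTS = %w[response.failed response.incomplete error].freeze

def parse_event(type, payload)
  return nil unless ERROR_EVENTS.include?(type)

  # Top-level `error` events carry the error directly; response.failed
  # nests it under response.error; response.incomplete only has a reason.
  err = payload["error"] || payload.dig("response", "error") || {}

  Chunk.new(
    type: "error",
    data: StreamErrorData.new(
      event:   type,
      message: err["message"] ||
               payload.dig("response", "incomplete_details", "reason") ||
               "stream ended without a completed response",
      code:    err["code"],
      details: payload
    )
  )
end
```

A `response.completed` event falls through the guard clause and returns `nil`, so the happy path is untouched; only the three error shapes produce an `error` chunk.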
- Extend `ChatStreamParser` to recognise `response.failed`,
`response.incomplete`, and `error` events and emit an `error` chunk
with a `StreamErrorData` payload (event, message, code, details).
- In `Provider::Openai#native_chat_response`, detect the missing
`response` chunk, build a user-facing error message from the
collected error chunk, and raise `Provider::Error`.
- Add unit tests for the parser (8 cases) and integration tests for
the error path in the chat response flow.
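The consumer-side check in the second bullet might look like the sketch below. Again, the names are assumptions (`raise_stream_error!`, the chunk structs, the message format); the real `Provider::Openai#native_chat_response` is not shown here, only the shape of "no `response` chunk, so raise from the collected error chunk".

```ruby
# Illustrative structs so the sketch is self-contained; the real app
# defines its own chunk types and Provider::Error.
StreamErrorData = Struct.new(:event, :message, :code, :details, keyword_init: true)
Chunk           = Struct.new(:type, :data, keyword_init: true)

module Provider
  Error = Class.new(StandardError)
end

# Hypothetical helper: given all collected chunks, return the response
# chunk or raise a descriptive Provider::Error built from the error chunk.
def response_chunk_or_raise!(chunks)
  response = chunks.find { |c| c.type == "response" }
  return response if response

  error = chunks.find { |c| c.type == "error" }
  message =
    if error
      parts = ["OpenAI stream error (#{error.data.event}): #{error.data.message}"]
      parts << "code=#{error.data.code}" if error.data.code
      parts.join(", ")
    else
      "OpenAI stream ended without a response.completed event"
    end

  raise Provider::Error, message
end
```

Raising here (rather than returning `nil`) means callers see one consistent failure mode for expired `previous_response_id`, context-window overflow, and transient upstream errors alike.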

Co-authored-by: Claude Opus 4.7 (1M context) <noreply@anthropic.com>