Previous incidents
Predictions not running on A40s
Resolved Aug 28 at 06:11am UTC
A40 workloads are running again. We're continuing to monitor and investigate the underlying cause.
Streaming service degraded for A100s
Resolved Aug 21 at 11:04am UTC
We believe these problems have now been resolved. Please contact us if you are still seeing issues with streaming from Europe.
A40s degraded
Resolved Aug 09 at 03:58pm UTC
A40 behavior has been stable for some time now. All systems are green.
Llama3-70b-chat Delays
Resolved Jul 25 at 11:44pm UTC
This has been resolved and predictions should be handled normally.
Predictions on trained versions not starting
Resolved Jul 17 at 04:36pm UTC
We've fixed the issue and predictions on trained versions are running again.
Intermittent issues affecting some hardware types
Resolved Jul 16 at 08:16pm UTC
Things are running normally as of about 15 minutes ago.
API degradation
Resolved Jul 09 at 12:15pm UTC
Service has been restored. Thanks for your patience!
Llama 3 70b instruct model not processing predictions
Resolved Jul 03 at 11:11am UTC
The model is processing predictions properly again, and the queue is empty.
Some models unavailable
Resolved Jun 21 at 03:40pm UTC
Service has been restored as of a few minutes ago.
Errors publishing model versions
Resolved Jun 20 at 10:41pm UTC
Model version publishing is now working as expected.
Errors with inference
Resolved Jun 04 at 12:36am UTC
The issues with inference were limited to select LLM models. The problematic code has been rolled back, and all inference should now be operating normally.