Summary
Issue resolved. Cause: A subset of requests to the Llama 3.3-70B model triggered errors. Impact: Intermittent errors when interacting with the model through serverless inference and/or with agents created using this model. Contact support if issues persist.
Impact
minor
Timeline
[investigating] We are currently investigating an issue affecting the Llama 3.3-70B model. Symptoms: Users may encounter intermittent errors when making serverless inference requests via APIs and Agents. Current Status: Our engineering team is actively investigating the issue to determine the root cause.
[monitoring] Fix deployed. Monitoring resources related to the Llama 3.3-70B model. Users should no longer experience intermittent errors when making serverless inference requests via APIs and Agents. Awaiting confirmation before closure.
[resolved] Issue resolved. Cause: A subset of requests to the Llama 3.3-70B model triggered errors. Impact: Intermittent errors when interacting with the model through serverless inference and/or with agents created using this model. Contact support if issues persist.
Lessons Learned
⚠DigitalOcean has experienced 46 incidents in the past year. This frequency suggests systemic reliability challenges that may warrant additional monitoring.
📊Incidents related to APIs have occurred 182 times across all providers in the past year. This is one of the most common failure categories in cloud infrastructure.
💡This incident is categorized as: API Issue. Consider implementing preventive measures specific to this failure category.
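One common preventive measure for intermittent API errors like the ones in this incident is client-side retries with exponential backoff and jitter, so transient 5xx responses are absorbed without hammering a degraded endpoint. The sketch below is illustrative Python, not DigitalOcean's SDK; `TransientError` and `flaky_inference` are hypothetical stand-ins for a real intermittent inference failure.

```python
import random
import time


class TransientError(Exception):
    """Stand-in for an intermittent failure (e.g. an HTTP 5xx from an inference API)."""


def call_with_retries(fn, max_attempts=4, base_delay=0.5, sleep=time.sleep):
    """Call fn(), retrying transient failures with exponential backoff and jitter.

    Re-raises after max_attempts so persistent errors still surface to the caller.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return fn()
        except TransientError:
            if attempt == max_attempts:
                raise
            # Double the delay each attempt; jitter avoids synchronized retry storms.
            delay = base_delay * (2 ** (attempt - 1)) * (0.5 + random.random())
            sleep(delay)


# Demo: a fake inference call that fails twice, then succeeds on the third try.
calls = {"n": 0}

def flaky_inference():
    calls["n"] += 1
    if calls["n"] < 3:
        raise TransientError("intermittent 5xx")
    return "completion"

# sleep is stubbed out so the demo runs instantly.
result = call_with_retries(flaky_inference, sleep=lambda _: None)
print(result)  # → completion
```

Persistent errors (beyond `max_attempts`) are deliberately re-raised rather than swallowed, matching the status update's advice to contact support if issues persist.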
Similar Incidents
Elevated 500 errors from Browser Rendering REST API
Cloudflare · Mar 11, 2026
Elevated errors on Claude.ai (including login issues for Claude Code)
Anthropic · Mar 11, 2026
Degraded experience with Copilot Code Review
GitHub · Mar 11, 2026
Increased errors with ChatGPT file downloads
OpenAI · Mar 10, 2026
Increased errors on ChatGPT File Uploads
OpenAI · Mar 10, 2026