If you face any issues with Claude models, you can check https://status.claude.com directly, as our Claude models come from Anthropic itself.
Current issue on Claude: Degraded Performance [Anthropic-side issue]
Do you guys play Hytale? Reply in: https://t.me/+1o9YByDY3cU0NWRl
Admin @zfuels
ONLY ENGLISH IS ALLOWED
Max tier: https://ggsel.net/catalog/product/102086796
Dev tier: https://ggsel.net/catalog/product/102083317
By special request, we have added 20 Dev tier slots.
We have considered this request only once; there will be no Dev tier again until March.
We are migrating our providers, so some models will be temporarily unavailable or offline.
We have managed to bring back Sonnet. Let us know if you are still facing issues with it.
To those complaining about Opus 4.1: it won’t be back. It’s a deprecated model.
🚀 MegaLLM Update: What’s Changing & What’s Next
Over the last few days, MegaLLM has seen rapid growth — and with that growth comes the need to evolve responsibly.
We briefly announced changes to pricing and model availability.
After listening closely to feedback, we’ve decided to take that announcement down and move forward with a clearer, more flexible approach.
Here’s the updated plan 👇
✅ What Stays the Same
• All existing premium plans remain available
• No models are being removed
• Your current workflows will continue to work as expected
🔄 What’s Changing
Instead of removing or separating models, we’re introducing a multiplier-based usage system.
• Open-source models → lower multipliers (more cost-efficient)
• Premium / proprietary models → slightly higher multipliers
• New models (open-source + premium) will continue to be added
This allows us to:
• Keep premium models accessible
• Maintain platform stability
• Fairly balance infra costs without hard removals or confusing limits
🧠 Why Multipliers?
Different models have very different cost and performance profiles.
Multipliers help us:
• Offer choice, not restrictions
• Keep pricing predictable
• Scale sustainably as usage grows
You choose the models.
The multiplier simply reflects the real cost behind them.
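As an illustration, a multiplier-based usage system can be sketched as a simple scaling of raw usage by a per-model rate. The model names and multiplier values below are hypothetical assumptions for the sake of example, not MegaLLM's actual rates:

```python
# Hypothetical sketch of multiplier-based usage accounting.
# Model names and multiplier values are illustrative assumptions only.

MULTIPLIERS = {
    "open-source-model": 0.5,  # lower multiplier: more cost-efficient
    "premium-model": 1.5,      # slightly higher multiplier
}

def billed_units(model: str, raw_tokens: int) -> float:
    """Return quota consumed: raw usage scaled by the model's multiplier."""
    return raw_tokens * MULTIPLIERS[model]

# The same 1,000 raw tokens consume different amounts of quota:
print(billed_units("open-source-model", 1000))  # 500.0
print(billed_units("premium-model", 1000))      # 1500.0
```

Under a scheme like this, pricing stays predictable because the only per-model variable is a single published number, while the underlying plan quota stays the same for everyone.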
✨ What’s Coming Next
MegaLLM is expanding beyond text inference:
• MegaLLM Chat Platform
• Data Scraping APIs
• Web Search APIs
• More image & multimodal models
📅 Timeline
These changes will roll out by 26 Jan.
Exact details and documentation will be shared before enforcement.
Thank you for sticking with us and holding us to a high standard 💙
We’re building MegaLLM for the long term — with clarity, flexibility, and reliability at the core.
— Team MegaLLM
Community Update & Feedback Request
Hey everyone,
We want to be transparent with you all. Recently, we’ve been experiencing intermittent downtime and instability from some model providers. Since we rely on multiple external AI providers, issues on their side can sometimes impact performance here — and we know that can be frustrating.
Your experience matters a lot to us, and we don’t want to build this platform without your voice.
We’d love your feedback on:
- Which models you use the most
- Any image models or new AI features you’d like us to add
- Pain points you’re facing (speed, reliability, pricing, UX, etc.)
- Ideas or suggestions that could make this platform more useful for you
How you can share feedback:
- Drop your thoughts here: https://tally.so/r/KYV0dD
What we’re actively working on:
- Better fallbacks between providers
- Improved uptime monitoring
- Adding more stable and in-demand models
Thanks for sticking with us.
Your feedback directly shapes what we build next.
~ Team
We had an in-depth discussion with our providers, and we have good news: the wait for stable models is finally over.
26 Jan.
Models will be stable again, and all users will be compensated.
MegaLLM – Follow-Up Announcement and Roadmap Update
Hello everyone,
We want to share a transparent update on the current state of MegaLLM and what lies ahead.
Over the past few weeks, we have been closely reviewing platform stability, operational costs, and overall service quality. Based on these evaluations, we have made several important decisions.
Current Changes
- Rate limits have been reduced on certain models, particularly proprietary models.
This decision was made based on community feedback, as many users preferred continued limited access rather than complete removal or a full switch to open-source alternatives.
- To compensate for recent downtime, all active subscriptions will be extended by one additional week, ensuring no loss of access.
Subscription and Renewal Update
- All subscription renewals will be paused starting today at 6:00 PM IST.
- Existing subscriptions will remain active until their current billing cycle ends, after which they will be automatically paused.
- For the time being, only Enterprise plans will continue, as they allow us to operate sustainably while we work on improvements.
We strongly believe that quality comes first, and we do not want to offer a service unless we are confident it meets our standards.
What’s Next for MegaLLM
We are actively working on:
- Improving infrastructure stability
- Securing higher model quotas
- Optimizing overall model performance
- Reopening subscriptions once we are confident in the experience we can deliver
We will share updates as soon as we are ready to relaunch at full capacity.
P.S.
We want to be open with our community. MegaLLM is currently facing several challenges, including personal and financial constraints. As a small team, this has been a difficult phase, and we need some time to recover and rebuild properly.
Your patience and understanding mean a great deal to us.
Thank you for your continued trust and support. We appreciate every user and remain committed to coming back stronger and better.
Team MegaLLM
Kira
Please communicate in the community channel in English only.
Not following this rule will result in a ban.