Think Models overview
IMPORTANT NOTE: Think Models is currently in public beta and may contain bugs, incomplete features, or undergo significant changes based on user feedback before the final release.
Think Models is a Model-as-a-Service inference platform on which evroc hosts leading open-source AI models. You can either deploy a dedicated model instance yourself or use one of our Shared Model endpoints. Under the hood we use state-of-the-art NVIDIA H100 and B200 GPUs for unmatched performance and reliability.
What you get with Think Models
- Access to a range of curated, best-in-class models
- OpenAI-compatible evroc API interfaces (see the sketch after this list)
- Fast and reliable endpoints
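Because the API is OpenAI-compatible, existing OpenAI client libraries should work by pointing them at an evroc endpoint. The sketch below uses the official OpenAI Python SDK; the base URL, model name, and API key are placeholders for illustration only, not values confirmed by this page -- substitute the ones from your evroc account.

```python
# Minimal sketch: calling a Think Models endpoint with the official OpenAI Python SDK.
# The base URL, model identifier, and API key are placeholders (assumptions),
# not documented values.
from openai import OpenAI

client = OpenAI(
    base_url="https://think.example.evroc.cloud/v1",  # placeholder endpoint URL
    api_key="YOUR_EVROC_API_KEY",                     # placeholder credential
)

response = client.chat.completions.create(
    model="example-open-source-model",  # placeholder model identifier
    messages=[{"role": "user", "content": "Hello from Think Models!"}],
)

print(response.choices[0].message.content)
```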