DeepSeek Providers and Clients
Introduction
DeepSeek's own service seems to be unstable at the moment. However, because the models are open source, we can still access them through third-party providers and clients. This article walks you through those options.
Host DeepSeek Small-Scale Models on Your Local Machine
DeepSeek offers a range of small-scale models that can be hosted on your local machine. These models are designed for users who want to experiment with DeepSeek's technology without extensive computational resources. You can check out this post to learn more about running the DeepSeek-R1 model on your local machine.
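Once such a local setup is running, you can talk to it programmatically. The sketch below uses Ollama's Python client; the `deepseek-r1:7b` tag and the assumption that an Ollama server is already running locally are both illustrative, not prescriptive.

```python
# Minimal sketch: chatting with a locally hosted DeepSeek-R1 distill through
# Ollama's Python client. Assumes Ollama is installed and running, and that the
# "deepseek-r1:7b" tag has already been pulled (both assumptions; see the
# linked post for setup details).
import ollama

response = ollama.chat(
    model="deepseek-r1:7b",
    messages=[{"role": "user", "content": "What is 17 * 24?"}],
)
print(response["message"]["content"])
```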
API Providers
Cloud platforms and AI proxies have quickly adopted DeepSeek's open-source models.
OpenRouter
OpenRouter was one of the first providers to offer DeepSeek's models through an API. You can access the models on their website or through their API, which is fully compatible with OpenAI's API design, so you won't have to change your code if you're already using OpenAI's client.
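As a minimal sketch of that compatibility, the snippet below points the official OpenAI Python SDK at OpenRouter; the `deepseek/deepseek-r1` slug is taken from OpenRouter's catalog, and the environment variable name is just an assumption.

```python
# Minimal sketch: calling DeepSeek R1 through OpenRouter with the OpenAI SDK.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",   # OpenRouter's OpenAI-compatible endpoint
    api_key=os.environ["OPENROUTER_API_KEY"],  # assumed to be set in your environment
)

response = client.chat.completions.create(
    model="deepseek/deepseek-r1",  # "deepseek/deepseek-r1:free" targets the free variant
    messages=[{"role": "user", "content": "Explain what a distilled model is in one paragraph."}],
)
print(response.choices[0].message.content)
```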
Check out the following links to access DeepSeek's models on OpenRouter:
- DeepSeek Models on OpenRouter
- DeepSeek: DeepSeek V3
- DeepSeek: DeepSeek R1 (free)
- DeepSeek: DeepSeek R1
- DeepSeek: DeepSeek R1 (nitro)
- DeepSeek: DeepSeek R1 Distill Qwen 1.5B
- DeepSeek: DeepSeek R1 Distill Qwen 32B
- DeepSeek: DeepSeek R1 Distill Qwen 14B
- DeepSeek: DeepSeek R1 Distill Llama 70B
- DeepSeek V2.5
- DeepSeek-Coder-V2
GitHub Marketplace
GitHub Marketplace lets you access various AI models hosted on GitHub and maintained by Microsoft Azure, without needing an Azure subscription. It now offers the DeepSeek R1 model alongside OpenAI o1, Llama 3.3, Mistral, and other top AI models. The platform is also compatible with OpenAI's API client.
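The sketch below shows one way to reach it, again with the OpenAI Python SDK; the inference endpoint and the `DeepSeek-R1` model id reflect GitHub's model catalog at the time of writing and should be treated as assumptions that may change.

```python
# Minimal sketch: calling DeepSeek R1 on GitHub Models with the OpenAI SDK.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://models.inference.ai.azure.com",  # GitHub Models inference endpoint (assumed)
    api_key=os.environ["GITHUB_TOKEN"],                # a GitHub personal access token acts as the key
)

response = client.chat.completions.create(
    model="DeepSeek-R1",
    messages=[{"role": "user", "content": "Summarize the difference between DeepSeek R1 and V3."}],
)
print(response.choices[0].message.content)
```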
Check out the following links to access DeepSeek's models on GitHub Marketplace:
- DeepSeek R1 on GitHub Marketplace
- o3-mini on GitHub Marketplace
- o1-mini on GitHub Marketplace
- o1 on GitHub Marketplace
- Llama 3.3 on GitHub Marketplace
Together AI
Together AI also provides access to DeepSeek models through their API. They offer a playground for testing the models and comprehensive API documentation. Their API is also compatible with OpenAI's API design.
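Since their API follows the same design, a streaming request is only a one-line change; the `deepseek-ai/DeepSeek-R1` id below matches Together's model naming and is meant as an illustrative sketch rather than a definitive reference.

```python
# Minimal sketch: streaming tokens from DeepSeek R1 on Together AI via the OpenAI SDK.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://api.together.xyz/v1",
    api_key=os.environ["TOGETHER_API_KEY"],  # assumed to be set in your environment
)

stream = client.chat.completions.create(
    model="deepseek-ai/DeepSeek-R1",
    messages=[{"role": "user", "content": "Write a haiku about open-source models."}],
    stream=True,  # receive the answer token by token
)
for chunk in stream:
    if chunk.choices:  # some chunks (e.g. usage summaries) carry no choices
        print(chunk.choices[0].delta.content or "", end="", flush=True)
print()
```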
You can explore their models via these links:
- DeepSeek V3
- DeepSeek R1
- DeepSeek R1 Distill Qwen 1.5B
- DeepSeek R1 Distill Qwen 14B
- DeepSeek R1 Distill Llama 70B
- DeepSeek R1 Distill Llama 70B (free)
- Meta Llama 3 8B Instruct Turbo
Fireworks.ai
Fireworks.ai is another platform that offers access to various DeepSeek models, along with notable features such as a playground for testing models directly in the browser (e.g., DeepSeek R1), the ability to fine-tune specific models for specialized tasks, and the option to upload custom models.
They provide a variety of DeepSeek models, including both base and instruct versions, catering to different needs. Below is a list of the models available on their platform, with links and a note on which are tunable:
- DeepSeek V2.5
- DeepSeek R1
- DeepSeek V3
- DeepSeek Coder 1.3B Base
- DeepSeek Coder 33B Instruct
- DeepSeek Coder 6.7B Base
- DeepSeek Coder V2 Instruct
- DeepSeek Coder 7B Instruct v1.5
- DeepSeek Coder 7B Base v1.5
- DeepSeek R1 Distill Qwen 32B
- DeepSeek R1 Distill Llama 8B (Tunable)
- DeepSeek R1 Distill Qwen 14B
- DeepSeek R1 Distill Qwen 1.5B
- DeepSeek R1 Distill Qwen 7B
- DeepSeek R1 Distill Llama 70B (Tunable)
They provide comprehensive documentation to guide users through the platform and its features, available here. Their API seems to be compatible with OpenAI's API design.
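If it is, the same OpenAI-client pattern applies. In the sketch below, the `accounts/fireworks/models/deepseek-r1` id follows Fireworks' naming convention, and the handling of `<think>` tags reflects how R1 usually formats its reasoning; both are assumptions rather than guarantees.

```python
# Minimal sketch: querying DeepSeek R1 on Fireworks.ai with the OpenAI SDK.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://api.fireworks.ai/inference/v1",
    api_key=os.environ["FIREWORKS_API_KEY"],  # assumed to be set in your environment
)

response = client.chat.completions.create(
    model="accounts/fireworks/models/deepseek-r1",  # Fireworks-style model id (assumed)
    messages=[{"role": "user", "content": "Is 9.11 larger than 9.9? Explain briefly."}],
)
content = response.choices[0].message.content
# R1 typically emits its chain of thought between <think> tags before the final
# answer; strip it here to keep only the reply (an assumption about the output format).
answer = content.split("</think>")[-1].strip() if "</think>" in content else content
print(answer)
```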
Fireworks.ai's platform provides users with both ready-to-use models and the flexibility to customize models for specific applications. The fine-tuning feature and custom model uploads make it a compelling choice for developers who want to optimize model performance.
Clients
Besides those chat platforms, there are also open-source clients that you can use to chat with or call DeepSeek's models.
LobeHub
LobeHub is my favorite open-source web client for chatting with various AI models. They offer a paid version that lets you access various AI models, including DeepSeek's, without hosting the client yourself, and you can also deploy the community version on your own server or local machine for free.
LobeHub supports a long list of AI providers, including DeepSeek, OpenAI, OpenRouter, Google Gemini, Azure, HuggingFace, Bedrock, Anthropic, Fireworks AI, Together AI, Groq, Perplexity, xAI, Qwen, and many more. You name it, they have it. It's an ideal client for chatting with multiple AI models in one place.
Below are some essential links to LobeHub:
Programming Tools
Several programming tools are emerging to streamline the use of DeepSeek models, especially DeepSeek-R1, within your development workflow. These tools often integrate directly into your IDE or offer specialized functionalities to enhance your coding experience.
Cline
Cline is an open-source Visual Studio Code extension that provides enhanced code-editing features, including intelligent code completion, error detection, and quick fixes. It now also supports DeepSeek's models, accessible through the official DeepSeek API, OpenRouter, or other OpenAI-compatible APIs, letting developers leverage cutting-edge AI for code generation, understanding, and manipulation directly within their familiar coding environment.
Cursor
Cursor is an AI-first code editor that utilizes large language models to assist with coding tasks. According to an official post, Cursor has just integrated DeepSeek models.
Continue
Continue is an open-source AI code assistant, available as an IDE extension, that provides intelligent code completion, code generation, and code-understanding features. It supports various AI models, including DeepSeek, to enhance the coding experience.