With insights from Tim Herfurth, Lead Data Scientist (tim.herfurth@zuehlke.com)

What is DeepSeek?

DeepSeek is an AI company making waves in the GenAI space. Since December 2024, it has introduced two large language models (LLMs) that have drawn significant attention. DeepSeek-V3 is a large language model for advanced text generation, similar to ChatGPT. DeepSeek-R1 followed a month later, with stronger reasoning abilities, especially in math and coding. These models have gained interest for their competitive performance, efficiency, and open-source availability.

How does DeepSeek compare to other GenAI models?

DeepSeek's latest models match leading alternatives like GPT-4o, Claude-3.5, and Llama-3.1 in benchmark tests. They handle text generation, reasoning, and logic tasks as well as – or better than – other top models.

What sets DeepSeek apart is efficiency. It delivers high performance while using fewer computational resources. Additionally, its open-source nature allows businesses to download and run the models themselves, even for commercial use. This makes DeepSeek a compelling alternative to proprietary AI solutions like GPT, Claude, and Gemini.

Is my data safe with DeepSeek?

DeepSeek provides both an app with a user interface and a developer API for using its models. Data entered through these services is processed externally, which may raise data privacy concerns. However, as we explain below, DeepSeek's biggest advantage comes from self-hosting. Businesses can deploy the models in-house, ensuring full control over their data. No information needs to leave company infrastructure.

DeepSeek: What does the ChatGPT alternative mean for businesses?

With the rapid advancement of generative AI, companies now have access to powerful open-source alternatives that rival commercial GenAI models. This opens up new strategic options for businesses looking to balance AI performance, cost efficiency, and operational control. Open-source models with permissive licenses, like DeepSeek, allow companies to run LLMs on their own hardware, independent of external cloud providers. And because these models are of high quality, doing so involves little to no compromise in performance. This is a major advantage for organisations with strict security, compliance, or operational requirements.

Opportunities and advantages

More control, less risk

Self-hosting open-source models gives businesses full control over their AI systems. Unlike proprietary APIs, where updates are rolled out silently and may affect model behaviour, locally deployed models ensure stability and predictability. Companies can decide when and how to update their models, avoiding unexpected shifts in performance.

A major advantage of on-premise deployment is data privacy, which is critical for regulated industries like banking and healthcare. With a self-hosted model, sensitive information never leaves company infrastructure, ensuring compliance with strict data protection regulations.

Operational costs

Deploying and running models on a company's own infrastructure can be a cost-effective alternative to commercial AI services. Resource-efficient models like DeepSeek have relatively low infrastructure demands and operational expenses, potentially making them a viable option for many businesses. This is particularly true for organisations with an existing GenAI platform that streamlines model deployment and maintenance.
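To illustrate how lightweight the application side of self-hosting can be, here is a minimal sketch of querying a locally deployed DeepSeek model through an OpenAI-compatible endpoint, as exposed by common open-source serving stacks such as vLLM or Ollama. The endpoint URL, model name, and prompt are illustrative assumptions and depend entirely on your own deployment.

```python
# Minimal sketch: querying a self-hosted DeepSeek model through an
# OpenAI-compatible endpoint (as exposed by serving stacks such as vLLM or Ollama).
# The base_url, api_key, and model name are illustrative placeholders.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",   # local inference server, not a cloud API
    api_key="not-needed-for-local-serving",
)

response = client.chat.completions.create(
    model="deepseek-ai/DeepSeek-R1",        # whichever model variant you have deployed
    messages=[
        {"role": "user", "content": "Summarise our data-retention policy in three bullet points."},
    ],
    temperature=0.2,
)

print(response.choices[0].message.content)
```

Because the interface mirrors the one used by hosted APIs, prompts and documents stay on company infrastructure while existing application code changes very little.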
For companies already using GenAI at scale, switching to an open-source model can significantly reduce ongoing costs while maintaining full control over infrastructure and data security. In early-stage development, however, cloud-based services and established APIs may still be the more practical choice. The ease of integration and lower upfront costs often make them preferable for proof-of-concept (PoC) projects before scaling AI in-house.

Re-evaluating business cases

GenAI and LLMs offer promising business cases across industries such as banking. With open-source models, the cost of running LLMs is dropping. This means business cases that once seemed too expensive may soon become viable. Companies should factor this into their AI investment strategies to stay competitive.

Avoiding vendor lock-in

Open-source LLMs allow businesses to stay independent of specific providers. Instead of being tied to a single LLM vendor, companies can build AI solutions that are adaptable to future developments. The right solution design allows businesses to switch models with minimal effort as technology advances. We implement this kind of flexibility in our Zühlke Augmented Generation (ZAG) accelerator.

Beyond model flexibility, open-source AI also supports a broader AI infrastructure strategy. Companies can develop internal AI platforms that provide centralised monitoring, governance, and compliance management. This enables standardised deployment, ensuring AI models are optimised and aligned with business needs while maintaining full control over operational and security aspects.

Risks and considerations

Technical expertise and maintenance

Deploying open-source LLMs requires a higher level of in-house expertise. Unlike managed AI solutions and APIs, DeepSeek does not offer built-in enterprise support or compliance tools. Companies must invest in AI and infrastructure expertise to ensure a seamless and compliant deployment.

Beyond serving the LLM itself, building a robust AI solution often requires additional AI-powered tools for data processing. For retrieval-augmented generation (RAG) applications, this means integrating AI search and document intelligence capabilities. Replacing cloud-based services for these tasks adds another layer of complexity, but emerging on-premise solutions from providers like NVIDIA can help streamline deployment and management.

Reliability and performance testing

While DeepSeek performs well in benchmarks, real-world accuracy depends on proper testing. Businesses must monitor performance, fairness, and robustness when integrating the models. This requires test data, a well-defined ground truth, and domain experts who can assess correctness. Establishing a structured evaluation process not only ensures reliable AI performance but also makes it easier to compare different models and determine which best fits the use case; a minimal sketch of such a harness follows the next section.

Security and compliance

Self-hosting improves data security, but it also requires companies to manage updates, access control, and regulatory compliance themselves. Unlike API-based solutions, there is no automatic security patching. For businesses looking to mitigate AI-related risks while maintaining compliance, an AI governance framework can help establish security, monitoring, and governance structures for self-hosted AI.
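To make the reliability point above more concrete before we conclude, here is a minimal sketch of a structured evaluation harness: a small set of test cases with a known ground truth, scored against the answers of a self-hosted model behind an OpenAI-compatible endpoint. The endpoint, model name, test questions, and the simple keyword-based scoring are illustrative assumptions; a real setup would rely on curated domain test data, richer metrics, and review by domain experts.

```python
# Minimal sketch of a structured evaluation harness for a self-hosted model.
# Endpoint, model name, test cases, and the keyword-based scoring are
# illustrative assumptions, not a production-grade evaluation.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")
MODEL = "deepseek-ai/DeepSeek-V3"  # placeholder: whichever model is being evaluated

# Each test case pairs a prompt with ground-truth keywords a correct answer must contain.
TEST_CASES = [
    {"prompt": "Which EU regulation governs the protection of personal data?",
     "expected_keywords": ["GDPR"]},
    {"prompt": "Name the two DeepSeek language models released around the turn of 2024/2025.",
     "expected_keywords": ["V3", "R1"]},
]

def ask(prompt: str) -> str:
    """Send one prompt to the model under test and return its answer."""
    response = client.chat.completions.create(
        model=MODEL,
        messages=[{"role": "user", "content": prompt}],
        temperature=0.0,  # near-deterministic output keeps the evaluation repeatable
    )
    return response.choices[0].message.content

def passes(answer: str, expected_keywords: list[str]) -> bool:
    """Very simple correctness check: every ground-truth keyword must appear in the answer."""
    return all(keyword.lower() in answer.lower() for keyword in expected_keywords)

results = [passes(ask(case["prompt"]), case["expected_keywords"]) for case in TEST_CASES]
print(f"Passed {sum(results)}/{len(results)} test cases for model {MODEL}")
```

Because the harness only depends on the OpenAI-compatible interface, rerunning it with a different model name is usually enough to compare candidates and judge which one best fits the use case.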
Conclusion

DeepSeek and other open-source AI models offer businesses a high-quality, cost-effective alternative to proprietary solutions. With the right strategy, companies can reduce costs, maintain flexibility, and avoid vendor lock-in.

What should I do next?

(Re-)Evaluate AI use cases: Identify where open-source models like DeepSeek can replace or complement existing solutions. Self-hosting can make solutions more viable and enable applications with high data security requirements.
Build flexibility: A GenAI platform lets you manage and monitor your AI solutions, stay adaptable, and avoid vendor lock-in.
Plan for costs: Open-source models require initial setup effort but lower long-term expenses.
Ensure compliance: Establish security, monitoring, and governance for self-hosted AI.
Stay informed: Keep track of AI advancements to remain competitive.

“This trend in open-source AI is set to continue. For businesses, that’s good news. More control, lower costs, and adaptable solutions are now within reach. By acting strategically, companies can move beyond reacting to AI trends and start shaping their AI-driven future.”

Here at Zühlke, we’re helping businesses across complex and regulated ecosystems turn AI opportunities into value-driving use cases and scalable solutions. From strategy to implementation, we support businesses in developing responsible, scalable AI solutions. Discover our expertise in AI and data-driven solutions.