DeepSeek vs. OpenAI's OSS: A Tale of Two Open-Source Models

Two major players recently dropped new open-source models, but they represent two fundamentally different philosophies. OpenAI, the established leader, returned to the open-source scene with fanfare and its gpt-oss-20b model. Shortly after, the Chinese startup DeepSeek quietly released v3.1. While one was a media event, the other was a single tweet. The initial results from hands-on testing are starkly one-sided.

Out-of-the-Box Performance: A Clear Winner

When you evaluate a model as a tool to be used right now, the comparison is not even close. Across multiple practical tests, DeepSeek v3.1 consistently delivered superior results: ...

27 August, 2025 · 4 min · 654 words · Yury Akinin

OpenAI's GPT-OSS: A Major Step Back Towards 'Open'

OpenAI just made a significant move by releasing GPT-OSS, its first truly open-source large language model family since GPT-2. With a permissive Apache 2.0 license, this isn't just a minor release; it's a fundamental shift that puts real power back into the hands of developers. The family includes two Mixture-of-Experts (MoE) models, gpt-oss-20b and gpt-oss-120b, designed for high-performance inference with strong reasoning capabilities.

Why This Is a Game-Changer

For years, the most powerful models from OpenAI have been locked behind APIs. This meant dealing with rate limits, opaque pricing, and sending potentially sensitive data to a third party. GPT-OSS changes that equation entirely. ...

13 August, 2025 · 2 min · 418 words · Yury Akinin