DeepSeek
DeepSeek is a Chinese AI company whose open-weight R1 release is used in [src-061] as the reference point for a major shift in frontier-model competition.
Key facts
- The “DeepSeek moment” refers to DeepSeek R1’s January 2025 release, which Fridman describes as an open-weight Chinese model that surprised the field with near state-of-the-art performance at a reportedly much lower compute cost [src-061].
- Raschka says DeepSeek won goodwill among open-weight-model practitioners because it shared strong models openly [src-061].
- Lambert argues that DeepSeek kicked off a broader Chinese open-weight movement, with other labs such as Z.ai, MiniMax, and Kimi/Moonshot becoming more visible and at times overtaking DeepSeek for the open-weight crown [src-061].
- The episode treats DeepSeek less as a one-off company story and more as evidence that open weights can reshape influence, research diffusion, and business-model pressure across the global AI market [src-061].
- DeepSeek-V3.2 is also discussed as an example of sparse attention with an indexing step that makes long-context processing more efficient [src-061].
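The sparse-attention idea mentioned above can be sketched conceptually: a cheap indexer scores every key position, and each query then attends only to its top-k keys instead of the full context. This is a minimal illustrative sketch, not DeepSeek-V3.2's actual architecture; the dot-product "indexer" here is a stand-in (the real system uses a small learned scoring network), and all names and shapes are assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def topk_sparse_attention(q, k, v, index_scores, top_k):
    """Attend each query only to its top_k highest-scoring key positions.

    index_scores: (num_q, num_k) cheap relevance scores from a separate
    indexer; only the selected keys enter the expensive attention step,
    so cost scales with top_k rather than full context length.
    """
    num_q, d = q.shape
    out = np.zeros((num_q, v.shape[1]))
    for i in range(num_q):
        # Pick the top_k key indices for this query (order irrelevant).
        sel = np.argpartition(index_scores[i], -top_k)[-top_k:]
        logits = q[i] @ k[sel].T / np.sqrt(d)
        out[i] = softmax(logits) @ v[sel]
    return out

rng = np.random.default_rng(0)
num_q, num_k, d = 4, 64, 8
q = rng.normal(size=(num_q, d))
k = rng.normal(size=(num_k, d))
v = rng.normal(size=(num_k, d))
# Toy indexer: raw dot-product scores over all keys.
scores = q @ k.T
out = topk_sparse_attention(q, k, v, scores, top_k=16)
print(out.shape)  # (4, 8)
```

The design point is that the indexer pass is much cheaper than full attention, so restricting attention to the selected subset trades a small amount of quality for large long-context savings.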
Related concepts
- Open-Weight Model Strategy
- Model Lab Differentiation
- Agentic Context Management
- Training-Inference Compute Balance
Source references
- [src-061] Lex Fridman – “State of AI in 2026: LLMs, Coding, Scaling Laws, China, Agents, GPUs, AGI | Lex Fridman Podcast #490” (2026-01-31)