DeepSeek. Don’t panic!
So says Bernstein. Did DeepSeek really “build OpenAI for $5M”? There are actually two model families under discussion. The first is DeepSeek-V3, a Mixture-of-Experts (MoE) large language model which, through a number of optimizations and clever techniques, can provide similar or better performance than other large foundational models but requires a small fraction of the