Amazon CEO Andy Jassy has praised the Chinese AI startup DeepSeek for its innovative approach to training artificial intelligence models. Speaking on Amazon's Q4 earnings call, he said he was particularly impressed by the startup's reinforcement learning methods and inference optimizations. One of the most intriguing aspects of DeepSeek's approach, he noted, was how it structured reinforcement learning: it flipped the usual sequencing, placing reinforcement learning earlier in the process, before supervised fine-tuning. Jassy described it as a "very interesting" move that could influence how AI models are trained in the future.
DeepSeek has also made strides in inference optimization, a critical factor in AI performance. Inference refers to how efficiently an AI model processes inputs and generates responses. Jassy acknowledged that DeepSeek's improvements in this area were noteworthy: reducing inference costs is a priority for companies that are looking to scale AI-powered solutions.
Amazon has integrated DeepSeek's AI model into its cloud-based AI services, making it available on both Amazon Bedrock and SageMaker. Jassy emphasized that AWS believes in giving businesses multiple AI model options. DeepSeek's inclusion in Amazon's ecosystem highlights the startup's growing influence in the AI landscape.
Jassy also expressed optimism about the future of AI and predicted that businesses will find it easier to integrate AI into their applications as inference costs decrease. Lower costs will likely encourage more widespread adoption and enable companies to build AI-driven solutions without significant financial barriers.