This paper argues that the way to make math-solving AIs smarter is to focus training on the hardest questions they can almost solve.
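A minimal sketch of what "hardest questions they can almost solve" could mean in practice; the pass-rate estimates, thresholds, and function names here are illustrative assumptions, not the paper's actual recipe:

    # Keep problems the model solves sometimes but not reliably, i.e. the ones
    # sitting at the edge of its current ability.
    from typing import Dict, List

    def select_frontier_problems(
        pass_rates: Dict[str, float],  # problem id -> fraction of sampled solutions that were correct
        low: float = 0.05,             # assumed cutoff: drop problems the model essentially never solves
        high: float = 0.5,             # assumed cutoff: drop problems the model already finds easy
    ) -> List[str]:
        """Return ids of problems near the model's current ability frontier."""
        return [pid for pid, rate in pass_rates.items() if low <= rate <= high]

    # Example: pass rates estimated by sampling several solutions per problem.
    rates = {"p1": 0.0, "p2": 0.25, "p3": 0.9}
    print(select_frontier_problems(rates))  # -> ['p2']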
LongCat-Flash-Thinking-2601 is a huge 560-billion-parameter Mixture-of-Experts model built to work as a careful assistant that can use tools, browse the web, write code, and solve multi-step tasks.
Big models like Whisper are very accurate but too slow for live captions; this paper builds a smaller, faster Thai speech recognizer that can run in real time.
SmartSearch teaches search agents to catch and fix their own bad search queries while they are still reasoning, instead of only correcting their final answers.
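A minimal sketch of the general idea of correcting queries mid-reasoning; the critique and retrieval callables are assumptions standing in for whatever components SmartSearch actually uses:

    # An agent loop that critiques and rewrites its own search query while it is
    # still reasoning, instead of only revising the final answer.
    from typing import Callable, List, Optional

    def search_with_self_correction(
        question: str,
        propose_query: Callable[[str], str],             # assumed: model drafts an initial query
        search: Callable[[str], List[str]],              # assumed: retrieval backend
        critique_query: Callable[[str, List[str]], Optional[str]],  # assumed: rewritten query, or None if results look fine
        max_rounds: int = 3,
    ) -> List[str]:
        query = propose_query(question)
        results = search(query)
        for _ in range(max_rounds):
            rewrite = critique_query(query, results)  # inspect results before committing to an answer
            if rewrite is None:                       # results judged adequate; stop correcting
                break
            query, results = rewrite, search(rewrite)
        return results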