News

Researchers at the MIT Media Lab looked at the cognitive costs of using LLMs like ChatGPT in everyday writing tasks. Their 200-page report, "Your Brain on ChatGPT," focused on how AI tools affect ...
Bloomberg reports that Elon Musk's AI startup xAI is currently burning through about $1 billion every month. The main drivers are steep infrastructure and chip costs for building its Grok chatbot.
OpenAI has launched "ChatGPT Record," a feature in the macOS desktop app that lets Pro, Team, Enterprise, and Edu users record, transcribe, and summarize audio. Each session allows up to 120 minutes, ...
LAION and Intel have released Empathic-Insight, a suite of models and datasets that can analyze facial images and audio files across 40 emotion categories, covering not only emotional but also ...
OpenAI has significantly updated ChatGPT's search feature: it now handles longer contexts, follows instructions more closely, answers complex questions with several parallel searches, and allows users to ...
TikTok has introduced three new AI tools—"Image to Video," "Text to Video," and "Showcase Products"—that enable advertisers to create video content in a more cost-effective way. These tools ...
"The OpenAI Files," a website launched on June 18, 2025, compiles internal documents and criticism of OpenAI's leadership, strategy, and corporate culture. It raises concerns about whether ...
Large Reasoning Models (LRMs) such as Claude 3.7 Sonnet Thinking, DeepSeek-R1, and OpenAI's o3 are often described as a step toward more general artificial intelligence. Techniques like ...
Fundamental disagreements over AI's future: LeCun's remarks highlight a much deeper debate about the direction of AI research. Companies like Anthropic and OpenAI are racing to commercialize ever more ...
OpenAI has lowered the price of its o3 language model by 80 percent, CEO Sam Altman said. The new cost is $2 per million input tokens and $8 per million output tokens. The move follows Google’s Gemini ...
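To put the new o3 pricing in perspective, here is a minimal back-of-the-envelope sketch. The $2 and $8 per-million-token rates come from the announcement above; the request sizes in the example are illustrative assumptions, not real usage figures.

```python
# Rough cost estimate under the reduced o3 pricing:
# $2 per million input tokens, $8 per million output tokens.

INPUT_PRICE_PER_M = 2.00   # USD per 1M input tokens (from the announcement)
OUTPUT_PRICE_PER_M = 8.00  # USD per 1M output tokens (from the announcement)

def o3_request_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimated cost in USD for a single request (token counts are assumptions)."""
    return (input_tokens / 1_000_000) * INPUT_PRICE_PER_M + \
           (output_tokens / 1_000_000) * OUTPUT_PRICE_PER_M

# Hypothetical example: a 5,000-token prompt with a 1,000-token response
print(f"${o3_request_cost(5_000, 1_000):.4f}")  # -> $0.0180
```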
MiniMax-M1 is a reasoning-focused model with a massive context window of up to one million tokens and a "thinking" budget of up to 80,000 tokens. The model uses an especially efficient reinforcement ...
The 8 TB Common Pile v0.1 was assembled by researchers from the University of Toronto, Hugging Face, EleutherAI, and the Allen Institute for AI (Ai2), among others. It brings together content from 30 ...