Anthropic's Claude Code Draws Fire with Aggressive Cuts and User Blame, as Transformers.js 4.0 Revolutionizes Web ML

Anthropic’s Claude Code has come under intense scrutiny following a series of aggressive usage cutbacks and controversial policy changes. Users, including the host, report rapid token consumption, with one instance showing 10% of a usage limit consumed by a single simple question. Amid community reports of a potential bug causing faster token drain, Anthropic acknowledged “efficiency corrections” but notably denied any overcharging, instead implying user misuse. New guidelines advise opting for Sonnet 4.6 over Opus, reducing “thought” effort, and starting fresh sessions, while “peak hour” usage now consumes limits faster, making remaining quota harder for developers to predict. Further escalating frustration, Claude Code subscriptions no longer cover third-party tools like OpenClaude, which now require separate usage packs or API keys. The platform has also begun blocking first-party tool usage that “violates policy,” even preventing analysis of Claude Code’s own code or running OpenClaude from within its terminal, sparking outrage over arbitrary censorship and a lack of clarity in the terms of service.

This coincides with broader financial tensions at OpenAI, where CFO Sarah Friar is reportedly clashing with Sam Altman over IPO timelines, citing insufficient revenue growth against projected expenditures. OpenAI’s acquisition of the tech talk show TVPN has also raised eyebrows as an unusual strategic move, while Wall Street Journal data reveals staggering projected model training costs: $125 billion for OpenAI and $35 billion for Anthropic by 2029.

In other critical news, the Node.js project has paused its Bug Bounty program because its funding has dried up, eliminating monetary rewards for vulnerability disclosures.
This decision, influenced by the Internet Bug Bounty program’s discontinuation and a surge in AI-generated reports (mirroring an earlier move by the Curl project), raises significant security concerns for the widely-used JavaScript runtime, especially given its ecosystem’s ongoing exposure to supply chain attacks.

In a significant advancement for web-based machine learning, Transformers.js has released version 4.0.0, introducing a robust WebGPU backend. This pivotal update allows developers to execute a wide array of machine learning models, including Gemma 4, directly in the browser and in Node.js, Bun, Deno, Supabase Edge Functions, React Native, and Electron applications, leveraging the device’s GPU for dramatically improved performance. This capability simplifies the consumption of complex models, making powerful AI accessible in client-side and server-side JavaScript environments without requiring dedicated servers.

Elsewhere in the tech industry, the debate surrounding the software job market continues, with the host reiterating positive growth trends reported by outlets like Business Insider. That positive outlook, however, faces skepticism from some developers who cite personal struggles with job rejections and concerns about “fake” job postings, a divergence that underscores the gap between broad market trends and individual experiences.

Adding to the dynamic landscape, gaming giant Take-Two Interactive reportedly laid off its entire AI team, including its lead, due to “disappointing results” from their AI initiatives. This serves as a reminder that despite the hype, practical AI implementation still faces significant hurdles and can lead to unexpected outcomes.
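Returning to the Transformers.js release: in recent versions the backend is selected with a `device` option passed to `pipeline()`. A minimal sketch of that pattern is below; the `pickDevice` helper and the model id are illustrative assumptions, not part of the 4.0.0 announcement.

```javascript
// Sketch: choosing a Transformers.js backend at runtime.
// pickDevice is a hypothetical helper; "webgpu" and "wasm" are
// device names accepted in Transformers.js pipeline() options.
function pickDevice(hasWebGPU) {
  return hasWebGPU ? "webgpu" : "wasm";
}

// The real API shape (model id is an assumption), kept as a comment
// so the sketch stays self-contained and doesn't download a model:
//
//   import { pipeline } from "@huggingface/transformers";
//   const generator = await pipeline(
//     "text-generation",
//     "onnx-community/Qwen2.5-0.5B-Instruct", // assumed model id
//     { device: pickDevice(typeof navigator !== "undefined" && !!navigator.gpu) },
//   );
//   const out = await generator("Hello", { max_new_tokens: 32 });
```

The same code runs in the browser (where `navigator.gpu` signals WebGPU support) and in server-side runtimes, which is the portability the release emphasizes; the WASM device serves as a fallback where no GPU path is available.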