China's latest AI model claims to be even cheaper to use than DeepSeek
I've been watching the AI race in China heat up lately, and wow, things are getting interesting! Z.ai (formerly Zhipu) just dropped their new GLM-4.5 AI model on Monday, and they're claiming it's even cheaper than DeepSeek. That's a pretty big deal.
So what makes this new AI model special? For starters, it's built on what they're calling "agentic" AI. Basically, the model automatically breaks a complex task down into smaller sub-tasks and works through them, which is supposed to produce better results. Pretty clever, right? And just like DeepSeek, they've made it open source - so developers can download and use it for free.
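To make "agentic" a little more concrete, here's a rough, hypothetical sketch of the pattern in Python: a planner splits a request into sub-tasks, each sub-task is handled on its own, and the partial results are stitched back together. The function names and the naive split-on-"and" planner are entirely my own illustration, not anything from Z.ai's actual implementation.

```python
# Hypothetical illustration of an "agentic" loop: plan -> execute sub-tasks -> combine.
# None of these function names come from Z.ai; they just sketch the general pattern.

def plan_subtasks(request: str) -> list[str]:
    """Naive stand-in for a planning step: split a compound request into sub-tasks."""
    # A real agentic model would ask the LLM itself to produce this plan.
    return [part.strip() for part in request.split(" and ") if part.strip()]

def run_subtask(subtask: str) -> str:
    """Stand-in for handing one sub-task to the model (or a tool) and getting a result."""
    return f"[result for: {subtask}]"

def run_agent(request: str) -> str:
    """Break the request down, solve each piece, then merge the partial answers."""
    results = [run_subtask(s) for s in plan_subtasks(request)]
    return "\n".join(results)

if __name__ == "__main__":
    print(run_agent("summarize the report and draft a reply email"))
```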
The cost efficiency is what really caught my eye though. Z.ai says they'll charge just 11 cents per million input tokens compared to DeepSeek's 14 cents. And for output tokens? Only 28 cents per million versus a whopping $2.19 for DeepSeek. That's a massive difference!
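To put those numbers in perspective, here's a quick back-of-the-envelope comparison using the per-million-token prices quoted above. The workload (10 million input tokens, 2 million output tokens) is just an example I picked to show how the gap plays out.

```python
# Back-of-the-envelope cost comparison using the per-million-token prices quoted above.
# The example workload (10M input tokens, 2M output tokens) is arbitrary.

PRICES = {  # (USD per 1M input tokens, USD per 1M output tokens)
    "GLM-4.5":  (0.11, 0.28),
    "DeepSeek": (0.14, 2.19),
}

def cost(model: str, input_tokens: int, output_tokens: int) -> float:
    in_price, out_price = PRICES[model]
    return (input_tokens / 1e6) * in_price + (output_tokens / 1e6) * out_price

for model in PRICES:
    print(f"{model}: ${cost(model, 10_000_000, 2_000_000):.2f}")
# GLM-4.5:  $1.66  (10M x $0.11/M + 2M x $0.28/M)
# DeepSeek: $5.78  (10M x $0.14/M + 2M x $2.19/M)
```

On this made-up workload, almost all of the difference comes from output tokens, where DeepSeek's quoted price is nearly eight times higher.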
What's really surprising is how little hardware the GLM-4.5 AI model needs. According to Z.ai's CEO Zhang Peng, it runs on just eight Nvidia H20 chips - about half of what DeepSeek's model requires. These are the special chips Nvidia made for China to comply with US export rules. Zhang says they've got enough computing power for now, though he wouldn't spill the beans on their training costs.
Remember back in January when DeepSeek shocked everyone? They somehow managed to build an AI model rivaling ChatGPT despite US chip restrictions, and claimed training costs under $6 million. Some analysts weren't buying it though, pointing out this figure probably didn't account for their $500+ million hardware investment over time.
The competition among Chinese AI startups is getting fierce. Just this month, Alibaba-backed Moonshot launched Kimi K2, which they say beats both ChatGPT and Claude at certain coding tasks. Their token pricing is competitive too - 15 cents per million input tokens and $2.50 per million output tokens.
Z.ai isn't exactly a newcomer. Founded in 2019, they've raised over $1.5 billion from big names like Alibaba and Tencent. They're apparently planning an IPO in Greater China soon.
What does all this mean for the global AI landscape? Well, Chinese companies are clearly pushing hard on cost efficiency and open source models. The Nvidia H20 chip usage shows they're finding ways to work within export restrictions while still innovating.
Have you tried any of these Chinese AI models yet? I'm curious how they actually perform compared to the big Western names. The pricing is certainly attractive, but the proof is always in the performance!