Why AI Training Costs Millions: A Look at the 'Gigafactory of Compute'

I’m often asked about the AI training project that cost millions of dollars and two years of my life. People wonder: why is it so expensive? My usual answer is that it isn’t particularly expensive, especially considering we don’t yet own our own hardware. Training AI has always been about massive data centers; that’s just the reality of the field. When you’re not immersed in it, the sheer scale can be hard to visualize. ...

9 May, 2025 · 2 min · 268 words · Yury Akinin

Diary of an AI Startup

This series of posts will be my way of documenting the journey of creating one of our team’s most ambitious products: the intelligent assistant, A.V.E.L.I.N. To give you some context, my development team and I are currently beta-testing the project within our Mozgii Ecosystem AI platform. Our primary focus is on A.V.E.L.I.N.—an intelligent personal assistant in Telegram built to handle both basic and complex tasks involving AI-powered search, processing, and analysis of information. ...

1 May, 2025 · 1 min · 211 words · Yury Akinin

The Emerging Skill of the AI Era: Beyond Just Searching

From my experience working with neural networks, it’s become obvious how two people can interact with the same model and get radically different outcomes. This isn’t a minor variation—it signals a fundamental departure from the search engine paradigm. We have entered a new era of interacting with artificial intelligence. AI doesn’t just aggregate data; it selects and synthesizes relevant information in response to specific requests. This changes the very nature of how we engage with information. Where the key skill was once finding data in search engines, it is now the ability to correctly formulate requests to an AI. ...

25 April, 2025 · 1 min · 205 words · Yury Akinin

Models Are Tools, Not Events: The Real Meaning Behind GPT-4.1 and the End of GPT-4.5

Yesterday, OpenAI opened access to the GPT-4.1 API. It’s a refined version of their flagship model: faster and architecturally closer to the concept of ‘agents.’ In parallel, the company officially announced it is winding down GPT-4.5, its most resource-intensive model, due to its excessive complexity and the challenges of supporting it. With GPT-4.5, it seems they hit an architectural dead end. We are at a point where models appear and disappear rapidly. They are becoming what they should be: tools, not landmark events. We have a growing catalog of specialized AIs: some calculate, others write code, plan tasks, or generate video. But the average user should not be expected to know about and choose among every AI in existence. That paradigm defies the logic of good user experience. ...

15 April, 2025 · 2 min · 269 words · Yury Akinin

A Mouse Brain, 1.6 Petabytes of Data, and the Path to AGI

Scientists recently digitized a single cubic millimeter of a mouse’s visual cortex, a project that generated 1.6 petabytes of data to map 84,000 neurons and half a billion synapses. To put that into perspective, extrapolate that density across an entire mouse brain and the synapse count rivals the parameter counts of large-scale AI models like DeepSeek or GPT, and far exceeds the 29 billion parameters of a model like GigaChat. This comparison is a useful analogy for scale and complexity: just as synapses determine a brain’s processing capacity, parameters define the “power” of an AI. ...
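
For readers who want to sanity-check that comparison, here is a rough back-of-envelope sketch in Python. The whole-brain volume (~500 mm³) and the DeepSeek parameter count are my own assumptions, not numbers from the post; only the half-billion-synapses-per-cubic-millimeter figure and the 29-billion-parameter GigaChat figure come from above.

```python
# Rough back-of-envelope arithmetic for the synapse vs. parameter comparison.
# ASSUMPTIONS (not from the post): whole mouse brain ~500 mm^3,
# DeepSeek-V3 ~671e9 parameters.
# FROM THE POST: ~0.5e9 synapses in the scanned 1 mm^3, GigaChat ~29e9 parameters.

synapses_per_mm3 = 0.5e9          # half a billion synapses in the scanned cubic millimeter
mouse_brain_volume_mm3 = 500      # assumed whole-brain volume, order-of-magnitude figure
whole_brain_synapses = synapses_per_mm3 * mouse_brain_volume_mm3

model_parameters = {
    "GigaChat (figure cited in the post)": 29e9,
    "DeepSeek-V3 (assumed public figure)": 671e9,
}

print(f"Extrapolated synapses in a whole mouse brain: {whole_brain_synapses:.2e}")
for name, params in model_parameters.items():
    ratio = whole_brain_synapses / params
    print(f"{name}: {params:.2e} parameters -> brain has {ratio:.1f}x as many synapses")
```

Under these assumptions the whole brain lands within an order of magnitude of DeepSeek-scale models and roughly an order of magnitude above GigaChat, which is the scale intuition the comparison is driving at.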

11 April, 2025 · 2 min · 224 words · Yury Akinin