<?xml version="1.0" encoding="utf-8" standalone="yes"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:content="http://purl.org/rss/1.0/modules/content/">
  <channel>
    <title>#ElonMusk on Home</title>
    <link>https://yakinin.com/en/tags/%23elonmusk/</link>
    <description>Recent content in #ElonMusk on Home</description>
    <generator>Hugo -- 0.148.2</generator>
    <language>en</language>
    <lastBuildDate>Fri, 22 Aug 2025 08:23:06 +0000</lastBuildDate>
    <atom:link href="https://yakinin.com/en/tags/%23elonmusk/index.xml" rel="self" type="application/rss+xml" />
    <item>
      <title>Grok&#39;s Public Chats: A Predictable AI Privacy Failure</title>
      <link>https://yakinin.com/en/posts/20250822-grok-ai-privacy-failure/</link>
      <pubDate>Fri, 22 Aug 2025 08:23:06 +0000</pubDate>
      <guid>https://yakinin.com/en/posts/20250822-grok-ai-privacy-failure/</guid>
      <description>&lt;p&gt;It’s a classic story at this point. We saw it recently with OpenAI’s ChatGPT, and now it’s Grok’s turn. Elon Musk’s xAI has inadvertently published hundreds of thousands of its users&amp;rsquo; private conversations, making them fully searchable on Google. This wasn&amp;rsquo;t a sophisticated hack; it was a fundamental product design flaw.&lt;/p&gt;
&lt;h2 id=&#34;the-feature-that-became-a-bug&#34;&gt;The Feature That Became a Bug&lt;/h2&gt;
&lt;p&gt;The mechanism was simple and naive. When a Grok user hit the &amp;ldquo;share&amp;rdquo; button to send a conversation to a colleague or friend, the system generated a unique URL. However, instead of being a private link, this URL was made public and available for search engines to index. In effect, &amp;ldquo;sharing&amp;rdquo; meant &amp;ldquo;publishing to the open web&amp;rdquo; without any warning or disclaimer.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Grok-4 vs. ChatGPT-5: Musk Claims Victory with New Benchmarks</title>
      <link>https://yakinin.com/en/posts/20250813-elon-musk-grok-4-vs-chatgpt-5-grok-5-announcement/</link>
      <pubDate>Wed, 13 Aug 2025 15:50:14 +0000</pubDate>
      <guid>https://yakinin.com/en/posts/20250813-elon-musk-grok-4-vs-chatgpt-5-grok-5-announcement/</guid>
      <description>&lt;p&gt;Elon Musk has once again stirred the AI world, making a bold claim against OpenAI and Microsoft shortly after the ChatGPT-5 release. He asserts that his Grok-4 Heavy model from xAI already outperforms its new competitor.&lt;/p&gt;
&lt;h2 id=&#34;the-benchmark-battle&#34;&gt;The Benchmark Battle&lt;/h2&gt;
&lt;p&gt;According to Musk, the numbers speak for themselves: Grok-4 reportedly scored 15.9% on the ARC-AGI-2 benchmark, while ChatGPT-5 achieved 9.9%. He also claimed his model was already &amp;ldquo;smarter&amp;rdquo; two weeks before the GPT-5 launch, a sentiment he says is echoed in positive user feedback.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Grok&#39;s Ad Integration: Musk&#39;s Necessary Experiment and the High Cost of AI Trust</title>
      <link>https://yakinin.com/en/posts/20250813-elon-musk-x-grok-ads/</link>
      <pubDate>Wed, 13 Aug 2025 00:00:00 +0000</pubDate>
      <guid>https://yakinin.com/en/posts/20250813-elon-musk-x-grok-ads/</guid>
      <description>&lt;div style=&#34;display: flex; justify-content: center; gap: 1em; flex-wrap: wrap;&#34;&gt;
  &lt;img src=&#34;https://yakinin.com/img/20250813-elon-musk-x-grok-ads-0.webp&#34; style=&#34;max-width: 350px; width: 100%;&#34; /&gt;
&lt;/div&gt;
&lt;p&gt;Elon Musk’s announcement to integrate ads directly into Grok’s AI responses isn&amp;rsquo;t just another headline—it’s a direct confrontation with the core economic challenge of building large-scale AI. His reasoning, as stated to advertisers, is brutally simple: &amp;ldquo;So we’ll turn our attention to how do we pay for those expensive GPUs.&amp;rdquo;&lt;/p&gt;
&lt;p&gt;This move marks a critical experiment in the monetization of consumer-facing AI, moving beyond the now-common subscription models.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Why AI Training Costs Millions: A Look at the &#39;Gigafactory of Compute&#39;</title>
      <link>https://yakinin.com/en/posts/20250509-elon-musk-xai-gigafactory-compute/</link>
      <pubDate>Fri, 09 May 2025 00:00:00 +0000</pubDate>
      <guid>https://yakinin.com/en/posts/20250509-elon-musk-xai-gigafactory-compute/</guid>
      <description>&lt;div style=&#34;display: flex; justify-content: center; gap: 1em; flex-wrap: wrap;&#34;&gt;
  &lt;img src=&#34;https://yakinin.com/img/20250509-elon-musk-xai-gigafactory-compute-0.jpeg&#34; style=&#34;max-width: 350px; width: 100%;&#34; /&gt;
&lt;/div&gt;
&lt;p&gt;I&amp;rsquo;m often asked about the AI training project that cost millions of dollars and two years of my life. People wonder: why is it so expensive?&lt;/p&gt;
&lt;p&gt;My usual answer is that it&amp;rsquo;s not particularly expensive—especially considering we don&amp;rsquo;t own our own hardware yet. Training AI has always been about massive data centers; that&amp;rsquo;s just the reality of the field. When you&amp;rsquo;re not immersed in it, the sheer scale can be hard to visualize.&lt;/p&gt;</description>
    </item>
  </channel>
</rss>
