So why do chips matter to America’s next chapter? From AI to autonomous vehicles, chips dictate the pace of innovation; from medical breakthroughs to robots managing your dirty laundry, every advance hinges on them. And as Chris Miller writes in his book Chip War, “The rivalry between the United States and China may well be determined by computing power.”
We are living in AI times, and within just a few years, AI will be as invisibly embedded in our lives as the internet, electricity, or cars are. The question “Are you using AI?” will vanish, much like we stopped asking if someone is using the internet.
AI Agents: Your Newest Teammate?
AI is being applied in the workplace at breakneck speed, and as Dominic Grillo of OpenAI declared at the BoxWorks Conference in San Francisco last month, “2025 will be the year of AI agents.” AI agents are programs that perform tasks, autonomously or semi-autonomously, and can analyze data, make decisions, and take actions based on specific goals, all while continuously learning and adapting. While some view this as a hype cycle, AI agents are already executing tasks such as sales development, contract management, customer interactions and workflow automation. 11x.ai recently raised a $50 million Series B, led by Andreessen Horowitz, to scale its autonomous digital workers. Unlike traditional automation, agents can think, adapt, and even make decisions. Aaron Levie, CEO and co-founder of Box, adds: “AI agents could handle tasks like contract edits or customer interactions, saving months of work and completing them in minutes.”
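The observe-decide-act loop that separates an agent from scripted automation can be sketched in a few lines. Everything below (the goal, the toy “tools,” the `choose_action` policy, the contract-review workflow) is a hypothetical illustration, not any vendor’s actual agent framework:

```python
# Minimal sketch of an AI agent's observe-decide-act loop.
# All names (goal, tools, policy) are illustrative, not a real framework.

def choose_action(state, tools):
    """Toy policy: pick the first tool whose goal condition is unmet."""
    for name, tool in tools.items():
        if not tool["done"](state):
            return name
    return None  # every sub-goal satisfied

def run_agent(state, tools, max_steps=10):
    for _ in range(max_steps):
        action = choose_action(state, tools)
        if action is None:
            return state  # goal reached
        state = tools[action]["run"](state)  # act, then observe new state
    return state

# Hypothetical contract-review workflow: extract clauses, then flag risks.
tools = {
    "extract": {
        "done": lambda s: "clauses" in s,
        "run": lambda s: {**s, "clauses": ["term", "liability"]},
    },
    "flag_risks": {
        "done": lambda s: "risks" in s,
        "run": lambda s: {**s, "risks": [c for c in s["clauses"] if c == "liability"]},
    },
}

result = run_agent({"doc": "..."}, tools)
print(result["risks"])  # the flagged clauses
```

Real agent systems replace the toy policy with an LLM that chooses tools and interprets their results, but the loop structure is the same.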
Industry.AI
AI is revolutionizing industries. In health care, drug discovery cycles that once took years are now condensed into weeks. In finance, AI-driven fraud detection systems reveal patterns invisible to humans. In manufacturing, predictive maintenance minimizes downtime, saving millions. “We’re now in an era of the ‘self-healing enterprise’, where intelligent automation tackles issues before they emerge, empowering companies to grow and innovate fearlessly,” highlights Christina Crawford Kosmowski, CEO of LogicMonitor.
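At its simplest, predictive maintenance of the kind described above starts with flagging sensor readings that drift outside a rolling baseline. The readings, window and threshold below are made-up illustrations, not real telemetry:

```python
# Toy predictive-maintenance check: flag a machine for service when a
# vibration reading deviates sharply from its recent baseline.
# All readings and thresholds here are illustrative, not real telemetry.
from statistics import mean, stdev

def needs_service(readings, window=5, threshold=3.0):
    """Flag if the latest reading sits more than `threshold` standard
    deviations above the mean of the preceding `window` readings."""
    baseline, latest = readings[-window - 1:-1], readings[-1]
    mu, sigma = mean(baseline), stdev(baseline)
    return sigma > 0 and (latest - mu) / sigma > threshold

healthy = [1.0, 1.1, 0.9, 1.0, 1.1, 1.05]
failing = [1.0, 1.1, 0.9, 1.0, 1.1, 4.8]  # sudden vibration spike

print(needs_service(healthy))  # False: latest reading is within baseline
print(needs_service(failing))  # True: spike flagged before failure
```

Production systems use learned models over many sensors rather than a single z-score, but the principle of catching anomalies before they become downtime is the same.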
This transformation extends to frontline workers, as Steve Murphy, CEO of Epicor, explains: “Imagine a factory worker resolving defects before they escalate or a warehouse team optimizing workflows in real-time—all without technical expertise. It’s about turning good ideas into action at every level of the organization.”
So, what can we expect in the near term for AI integration? Charles Sansbury, CEO of Cloudera, expects to see: “Significant progress in industry-specific applications, such as predictive maintenance and analytics-driven optimization. These early successes will lay the groundwork for advanced use cases as organizations refine their data infrastructure and expertise.”
Cloud Reinvented
The cloud is evolving from simple storage into a platform that lets businesses scale AI solutions once reserved for tech giants. “One of the most significant shifts in the infrastructure market in the last 20 years has been the rise of AI infrastructure,” says J.J. Kardwell, CEO of Vultr. “However, the performance and price of GPUs have improved drastically, enabling infrastructure providers like us to deliver them as a service at competitive rates.”
This accessibility levels the playing field. Startups and mid-sized companies can now leverage AI for data analysis, customer personalization, and workflow optimization without costly systems. Jim Kavanaugh, CEO of World Wide Technology (WWT), explains: “We are focused on creating unified data fabrics that integrate structured and unstructured data—everything from videos and engineering drawings to voice and text.”
The Question of Safety: Can We Trust AI?
For all its promise, AI comes with risks, and ignoring them could have dire consequences. From algorithmic bias to the potential for misuse, ensuring AI systems remain ethical, fair, and secure must be a priority. Yet, achieving this is no small task, particularly given the complex socio-technical nature of AI.
The National Institute of Standards and Technology (NIST) is taking proactive steps by convening experts from diverse disciplines to create frameworks that address biases while grappling with the inherent complexity of AI systems.
As Laurie Locascio, the outgoing under secretary of commerce for standards and technology and director of NIST, notes, “Our goal is to develop methods to measure these characteristics with as much reliability as possible, even though we acknowledge that we’ll never achieve the same precision as we do in purely technical measurements.”
Building trust in AI will require not just technical innovation but collaboration, well-thought-out regulation by those who truly understand AI, and the awareness to address human fallibility in decision-making processes.
The Path to AGI
Over the past four months, I have asked dozens of experts when artificial general intelligence (AGI) might be achieved. AGI, as OpenAI defines it, refers to “AI systems that are generally smarter than humans.” However, Dario Amodei, co-founder and CEO of Anthropic, prefers the term “powerful AI,” given that the definition of AGI is a moving target, in the same way that the definition of a supercomputer has evolved over time. While timelines for achieving AGI differ widely, one consensus is clear: We remain far from this milestone due to the limitations of current digital processors. As Dr. Glenn Ge, CEO of TetraMem, explains: “You can't send people to the moon with a steam engine.” He believes advancements in analog and neuromorphic computing, inspired by the human brain, are essential steps. Analog in-memory computing could significantly accelerate the path to AGI by mimicking natural neural processes more efficiently than traditional systems, addressing energy inefficiencies and data movement bottlenecks.
So when is AGI likely? Ray Kurzweil, futurist and inventor, has long predicted that AI will reach human-level intelligence by 2029, with the singularity, the moment when technology surpasses human intelligence and ushers in an era of exponential progress, following by 2045. His vision suggests that advancements in areas like neuromorphic computing could serve as critical stepping stones toward AGI.
And why aim for AGI? Imagine all of the problems we face on our planet, from climate change to neurodegenerative diseases. It is not an exaggeration to believe that AGI could offer solutions to them all.

In the relatively short term, by 2030, Ben Fielding, co-founder at Gensyn, believes there will be a shared infrastructure for AI models. “One company might have specialized data for a share of a model that can interact seamlessly with other specialized models. This creates a global knowledge base similar to what Wikipedia did for text. However, by 2030, humans will still be the primary curators of these pathways,” says Fielding.

By 2035, his co-founder, Harry Grieve, believes this process will be heavily automated, and the creation and curation of models will give rise to a parallel machine society. “Machines will autonomously manage complex tasks, optimizing themselves with minimal human intervention. This will lead to a decentralized network of machine intelligence, where individual models or ‘agents’ collaborate to provide solutions,” says Grieve.

As you can imagine, achieving AGI will not simply be one of humankind’s greatest technical feats but one of its most profound societal and ethical challenges. Guaranteeing safety, managing ethical concerns and preparing for AGI’s potential impact on humanity will require collaboration across the leading AI organizations, whole industries and every major nation. Whether it takes decades or arrives sooner than expected, we are already grappling with what AGI will bring. If it does arrive at a discernible moment, it will be the most profound milestone in human history.
Rethinking Chips
The immense potential of AI ultimately rests on hardware, most evident in NVIDIA’s ascent to become the most valuable publicly traded company. However, the industry faces a mounting challenge as Moore’s Law, the prediction that the number of transistors on a chip doubles approximately every two years, nears its physical limits. This has pushed the industry to rethink chip design, with a sharp focus on cutting the power required by AI workloads. But no industry better embodies the proverb at the heart of Ester Boserup’s famous thesis, “Necessity is the mother of invention”, than semiconductors.
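The doubling in Moore’s Law compounds quickly, which is why its slowdown matters so much. A back-of-the-envelope illustration, assuming a clean two-year doubling from an arbitrary starting count:

```python
# Back-of-the-envelope Moore's Law: transistor counts double every
# two years. The starting count and horizon are arbitrary illustrations.

def transistors(start_count, years, doubling_period=2):
    """Projected transistor count after `years` of steady doubling."""
    return start_count * 2 ** (years // doubling_period)

# From a hypothetical 1 billion transistors, 20 years means 10 doublings:
print(transistors(1_000_000_000, 20))  # a 1,024x increase
```

Two decades of that compounding is a thousandfold gain, which is exactly the free ride the industry loses as the doubling stalls.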
Necessity is Always the Mother of Invention
By 2030, the AI revolution is projected to consume 8 percent of the world’s electricity supply. The stakes are clear: How do we sustain this revolution without breaking the planet? One answer lies in novel chip architecture, where a fundamental shift is under way from general-purpose processors toward tailored solutions that maximize efficiency and reduce waste. A key path lies in developing chips designed specifically for AI workloads. Amazon recently announced a supercomputer built from its own AI chips, Trainium2, as an alternative to NVIDIA’s GPUs. Given that NVIDIA held roughly 90 percent of the AI chip supply as of December 2024, it is no surprise that all the tech hyperscalers are doubling down on the market. However, innovation does not lie only in the hands of the tech giants. High-growth startups like TetraMem are pioneering analog in-memory computing to overcome efficiency bottlenecks. “Our approach minimizes data movement by integrating memory and computation in the same physical location, achieving brain-level efficiency,” highlights Dr. Glenn Ge, TetraMem’s CEO.
Ambiq has pushed the boundaries of low-power design, building circuits that operate at just 0.4 volts instead of the standard 1.8 volts. “By drastically reducing the power needed for each operation, we have helped transform what is possible in battery-powered devices. Now, leading smart devices can offer four weeks of battery life, shattering the old limitations of daily charging.” In fact, the wearable this reporter wears lasts a week on a single charge thanks to the Ambiq chip inside it.
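The payoff from Ambiq’s voltage reduction follows from a standard rule of thumb: the dynamic (switching) power of a CMOS circuit scales roughly with the square of supply voltage. Holding everything else fixed, which is a simplification that ignores frequency and leakage effects, the numbers show why dropping from 1.8 volts to 0.4 volts is so dramatic:

```python
# Rule of thumb: dynamic CMOS power scales with the square of supply
# voltage (P ~ C * V^2 * f). This ignores frequency and leakage, so it
# is a rough first-order estimate, not a full power model.

def relative_dynamic_power(v_new, v_old):
    """Fraction of original dynamic power after a supply-voltage change."""
    return (v_new / v_old) ** 2

ratio = relative_dynamic_power(0.4, 1.8)
print(f"~{ratio:.1%} of the original power, i.e. ~{1 / ratio:.0f}x reduction")
```

A roughly twentyfold cut in per-operation power is what turns daily charging into weeks of battery life.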
Meanwhile, Semtech’s ultra-low-power chips are enabling edge AI systems to operate on minimal energy. “Our LoRa technology can enable devices to operate for up to 10 years on a single battery,” explains Dr. Hong Hou, Semtech’s CEO. “It’s a testament to how minimal power consumption can transform not just smart cities but also AI infrastructure.”
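Claims like “10 years on a single battery” come down to simple arithmetic on average current draw. The capacity and current figures below are hypothetical illustrations, not Semtech specifications:

```python
# Battery life = capacity / average current draw, ignoring self-discharge
# and temperature effects. The figures are illustrative, not Semtech specs.

HOURS_PER_YEAR = 24 * 365

def battery_life_years(capacity_mah, avg_current_ma):
    """Idealized battery lifetime in years for a constant average draw."""
    return capacity_mah / avg_current_ma / HOURS_PER_YEAR

# A hypothetical sensor averaging 25 microamps from a 2400 mAh cell:
print(round(battery_life_years(2400, 0.025), 1))  # roughly a decade
```

The engineering work is on the right-hand side of that division: radios like LoRa earn decade-long lifetimes by keeping average draw in the microamp range, sleeping almost all the time.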
Advanced Packaging
Advanced packaging is also increasingly becoming a way to enhance performance without the large investments needed for cutting-edge transistor development. This is not your typical packaging but technology as advanced as the chips themselves. As traditional scaling through Moore’s Law plateaus, “advanced packaging enhances system-level performance through customization, offering a cost-effective solution as transistor shrinkage becomes less viable,” explains Bob Patti, president of NHanced Semiconductors.
Moreover, with transistor density on a single semiconductor die—a small block of semiconducting material on which a given functional circuit is fabricated—becoming harder to achieve, the industry has shifted towards chiplets. “This allows for better functionality at a lower cost,” highlights Fusen Chen, CEO of Kulicke and Soffa Industries. “It is exciting because this shift plays to our strengths in advanced packaging, where we are focused on enabling these next-generation solutions.”