Why Networks Are the Real Future of AI—Insights from Barun Kar


In a recent episode of Lexicon, we had an insightful conversation with Barun Kar, co-founder and CEO of Upscale AI. We delved into the unseen ‘pipelines’ that make AI technologies, like ChatGPT, work effectively.

According to Barun, the networking layer that underpins these AI systems is quickly evolving into a significant hurdle for their growth.


AI’s Hidden Challenges

Barun Kar pointed out that “AI workloads are fundamentally transforming the backbone of our networks.” He explained that these workloads demand enormous bandwidth, minimal latency, and intelligent data management, particularly when thousands of GPUs are communicating with one another.

The previously overlooked web of connections is now a critical determinant of how effectively AI can evolve. “Traditional networking fails pretty quickly under substantial AI workloads, primarily because it wasn’t designed for the synchronized exchanges of data required between large clusters of GPUs,” he commented. Congestion, packet loss, and microsecond-scale lags leave GPUs idle and squander expensive resources.
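A rough back-of-envelope calculation shows why the network, not the GPU, often sets the ceiling. The sketch below is purely illustrative: the model size, cluster size, link speeds, and per-step compute time are all assumed figures, and it uses the standard ring all-reduce cost model rather than any vendor-specific implementation.

```python
# Illustrative only: how link bandwidth caps multi-GPU training throughput.
# All numbers below are assumptions, not measurements.

def allreduce_seconds(grad_bytes: float, n_gpus: int, link_gbps: float) -> float:
    """Ring all-reduce sends roughly 2*(n-1)/n of the gradient over each link."""
    bytes_on_wire = 2 * (n_gpus - 1) / n_gpus * grad_bytes
    return bytes_on_wire / (link_gbps * 1e9 / 8)  # Gbit/s -> bytes/s

# Hypothetical 70B-parameter model with fp16 gradients (~2 bytes per parameter).
grad_bytes = 70e9 * 2
compute_s = 1.0  # assumed pure-compute time per training step

for link_gbps in (100, 400, 800):
    comm_s = allreduce_seconds(grad_bytes, n_gpus=1024, link_gbps=link_gbps)
    busy = compute_s / (compute_s + comm_s)  # assumes no compute/comm overlap
    print(f"{link_gbps:>4} Gb/s links: sync takes {comm_s:.1f}s/step, "
          f"GPUs busy only {busy:.0%} of the time")
```

Even with generous assumptions, faster accelerators change nothing in this picture until the communication term shrinks, which is exactly the bottleneck Kar describes.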

This, he insists, makes AI networks the weakest link in a domain focused obsessively on computing speed. “You can keep making accelerators faster, but that won’t lead to groundbreaking advancements without addressing the bottlenecks caused by networking,” he added.

This challenge extends even to tech giants like Google, Microsoft, and Amazon, who are grappling with complex AI networking scenarios. Barun emphasized that “AI networking comes with high costs and significant complexity.” Developing high-bandwidth, low-latency connections demands unique hardware and enormous energy and cooling resources.

The Monopoly Effect

The complexity of AI networking has resulted in a de facto monopoly. As Barun notes, “Currently, Nvidia remains the primary solution catering to hyperscaler demands — and its system is quite closed off.”

According to him, these closed systems present notable drawbacks. He argued that relying on a lone supplier reduces flexibility and stifles innovation.

Economically, this inflates costs and leaves buyers with little pricing power. Over time, he adds, heavy reliance concentrates AI infrastructure in the hands of a few major tech corporations.

Barun believes this growing concentration of control not only undermines competition but also dampens creativity. “If you’re tied to one company’s hardware and upgrade schedule, integrating new technologies or collaborating across various sectors becomes significantly harder. Closed ecosystems prevent outsiders from effectively competing with Big Tech,” he stated.

The Importance of Open Standards

Upscale AI proposes a straightforward yet powerful solution: develop systems based on open standards.

Barun explains, “Open standards consist of publicly available technologies and protocols not monopolized by any single entity. They act much like a universal language for data, allowing significantly easier interoperability between different systems.”

That interoperability is essential for the future of AI, he asserts. “Using open standards enables the creation of harmonious networks, which can be tailored for specific applications and upgraded rapidly as technology evolves. This openness will democratize access to AI – providing choices for users.”

Kar also envisions a landscape where innovation flourishes. “We plan to focus on open-standard chips, systems, and software, so AI becomes accessible for everyone. Think of enthusiasts building mini-AIs at home — that’s feasible when barriers are removed!” he said.

Making AI Infrastructure Accessible

Barun dreams of a scenario where AI infrastructure is not monopolized by trillion-dollar companies. “Presently, there’s a monopoly that leaves AI heavyweights holding the keys to cutting-edge technology,” he said. “Open-standard networking will help eliminate this dominance and lower entry barriers for new players,” he added.

He believes this change could catalyze widespread innovation. “With open technology, smaller firms and research hubs will be able to construct formidable AI clusters from parts acquired from various suppliers. It’s all about making advanced systems more deployable for a broader range of organizations.”

Barun also recognizes that established players might attempt to snatch up promising open-standard initiatives, yet remains hopeful: “If one gets absorbed, another will undoubtedly appear to continue pushing the trend forward. It’s a cycle that persists.”

Creating Open-Standard Solutions

So, why hasn’t this been done already? Barun notes, “AI networking has historically lagged behind GPUs because it’s a complex layer to innovate in.” It demands deep expertise in hardware and distributed systems, along with substantial investment in testing and scaling.

However, he recognizes that the right conditions for change are finally emerging. “Since the launch of tools like ChatGPT a couple of years back, we’ve seen networking technologies like NVLink and InfiniBand evolving as well. Open standards for networking are just starting to take shape with initiatives like UALink 1.0 from last year, creating room for startups to disrupt this sector now,” he added.

Upscale AI is dedicated to what Barun refers to as a “full-stack” solution. “With a full-stack approach, we handle every component of the networking process, from chips to hardware systems, racks, and the software layer. Currently, networks tend to be patched together – resulting in inefficiencies and compatibility pitfalls,” he noted.

Upscale aims for a cohesive modular approach solely centered around open standards, which will optimize operations and create a unified platform capable of scaling AI workloads without dependence on proprietary technology. “Everything should be possible as long as we follow the open standards. Plug in your compute systems or swap out components wherever necessary!” he concluded.

A Significant $100 Million Backing

Upscale AI’s recent $100 million seed funding indicates investor confidence in their vision. As Barun suggested, “This funding round signals that investors recognize the growing urgency of addressing the AI networking bottleneck.”

The investment will fund the growth of their 100-plus-person engineering team and the construction of large-scale testbeds to evaluate performance at scale. It isn’t solely about product development, though; the goal is to foster a larger movement.

He said, “Constructing high-performance networking equipment necessitates a thriving ecosystem and communities focused on open standards. This funding positions us to advance efficient AI networking solutions for production environments effectively.”

The Future: Open and Expansive

Looking forward, Barun is confident that openness will dominate the scene in the coming years. “Give it five years; you’ll see open standards entrenched in the sector, that’s a guarantee,” he asserted. He predicts that AI networking will be significantly faster, more flexible, and—most crucially—much more accessible.

He believes that improved accessibility will trigger a substantive impact across multiple industries. “When hardware is inexpensive and interoperable, advanced AI features will no longer seem exclusive to corporate giants — this smoothed pathway will significantly increase creativity and innovation across the landscape. For everyday consumers, this means lower costs and widely available AI solutions,” Kar mentioned, introducing anticipated advances in personalized assistants and breakthroughs in fields such as healthcare and education.

Barun has a transformative vision: to free the AI landscape’s most closed and intricate areas for accessibility and inventiveness. If Upscale AI manages to pull this off, the next milestone in AI development may arise not from faster chips but from well-composed networks knitting everything together.
