Published 10 Dec 2024

Connectivity and Cloud: Navigating AI's Blind Spots in 2025

By Mike Hoy, Infrastructure & Cloud Operations Director


Mike Hoy, CTO at Pulsant, looks ahead to the next 12 months, identifying some of the challenges organisations are likely to face in delivering AI projects and highlighting the importance of resilient, connected infrastructure in meeting them.


As AI proofs of concept evolve over the next 12–18 months, they will lay the groundwork for advances in technology. Yet alongside this progress comes a heightened demand for AI applications to access data from diverse locations, including vast reserves of private data. With private data outstripping internet-hosted data by a factor of nine, overcoming this barrier is essential if AI is to reach its potential. 


Data: The Heart of AI's Promise 


The ability to access data quickly and in a usable format is pivotal for AI success. Without seamless, reliable access, the very foundation of AI crumbles. 

The challenge lies in the reality that organisational data resides across multiple platforms and locations, not just the heavily marketed ecosystems of AWS or Microsoft. AI applications demand robust network reliability to ensure consistent latency, performance, and real-time data exchange. Connectivity, therefore, becomes the linchpin for unifying disparate data sources. 

Yet many boards underestimate the criticality of connectivity, assuming it “just works.” This oversight could derail AI initiatives. Even the most advanced AI applications, backed by immense computational power, can falter when subjected to as little as a 10-millisecond delay in data retrieval. In 2025, deploying AI without factoring in connectivity is not just a misstep—it is a significant strategic failure.


The Cloud Debate Resurfaces 


The connectivity challenge is closely tied to a broader, resurgent debate: the evolution of cloud models to support AI. 

AI models differ fundamentally from traditional software. Early cloud infrastructure was not designed to manage the billions of parameters and real-time data streams integral to AI. As such, cloud design and supporting infrastructure must evolve to meet these demands. 

While security, connectivity, and resilience—enabled by geographically distributed networks—remain foundational, escalating operational expenditure is driving organisations to reassess their reliance on providers like AWS and Microsoft. The growing trend toward repatriating workloads to private clouds highlights a pressing need for standardisation in data migration.


Standardisation: A Catalyst for AI Optimisation 


Just as banking regulation has simplified switching current accounts, legislative guidance on cloud migration could streamline AI infrastructure planning. By introducing standardisation in data movement, organisations could more easily adopt hybrid cloud models tailored to their AI needs and broader business goals.

With AI workloads increasingly distributed across diverse environments, a standardised approach will accelerate adoption, promote best practices, and distinguish AI leaders as the market matures. 


Driving Awareness and Collaboration 


As AI drives new demands on infrastructure, the tech industry has a responsibility to raise awareness of the critical interplay between connectivity, cloud models, and the wider ecosystem. To meet real-world AI expectations, organisations must prioritise collaboration with suppliers and partners. 


Connectivity and cloud considerations are no longer a secondary focus—they are fundamental to success in this new era of AI. By embedding these priorities into planning and execution, businesses can better navigate the complexities of 2025 and beyond.