The Network is Finally the Application
Feb 18, 2025
The network is, finally, the application in the public cloud. This episode of the Tech Field Day Podcast, recorded before Cloud Field Day, features Jon Myer, Michael Levan, Larry Smith, and Alastair Cooke. Deploying applications across multiple clouds requires the network to be the common connector that integrates applications across those clouds. Everything is an API: deploying networks in the cloud is software defined, but every cloud has its own API, and many on-premises networks have their own APIs as well. Observability across a multi-cloud application is far more complex than when everything was on-premises in our own data center, because there are so many new places where issues might arise. The pace of change in modern applications makes analyzing and troubleshooting them challenging too; when networks are built using a CI/CD pipeline, the network configuration can change every day with each new software deployment. Observability is vital: metrics, logs, and traces need to be fed into a single location where insights can be gained, because without the insight there is no reason to collect the data.
The reality for most enterprise organizations is that one cloud is never enough. Whether it is an on-premises cloud alongside a public cloud, or multiple public clouds, there is always more than one. Whether through mergers and acquisitions or simply applications with different requirements, multi-cloud is a reality for most large organizations. Business applications are spread across multiple locations and multiple clouds, so integration between applications requires network integration, meaning that the network is the application.
One of the challenges of hybrid multi-cloud networking is that each cloud platform and each on-premises network has its own API and capabilities. While everything is an API, there is no universal API, so it is up to each business to integrate a collection of APIs to build a cohesive network. Cloud-neutral Infrastructure as Code (IaC) tools, such as Terraform, allow a single tool to work across multiple clouds. Often networks are built by combining IaC with CI/CD pipeline automation; new networks might be built for every new software feature that is released.
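To make the integration problem concrete, here is a minimal Python sketch of the kind of abstraction a team ends up building when every cloud has its own API. The class and method names are hypothetical, standing in for calls that would really go through Terraform providers or each cloud's SDK:

```python
from abc import ABC, abstractmethod

class CloudNetwork(ABC):
    """A common interface over per-cloud network APIs (hypothetical)."""

    @abstractmethod
    def create_network(self, name: str, cidr: str) -> str:
        """Create an isolated network and return its provider-side ID."""

class AwsNetwork(CloudNetwork):
    def create_network(self, name: str, cidr: str) -> str:
        # Stand-in for an AWS call, e.g. creating a VPC.
        print(f"[aws] creating VPC {name} with CIDR {cidr}")
        return f"vpc-{name}"

class AzureNetwork(CloudNetwork):
    def create_network(self, name: str, cidr: str) -> str:
        # Stand-in for an Azure call, e.g. creating a VNet.
        print(f"[azure] creating VNet {name} with CIDR {cidr}")
        return f"vnet-{name}"

def pipeline_step(clouds: list[CloudNetwork], name: str, cidr: str) -> list[str]:
    # One CI/CD pipeline step, many clouds: the integration burden lives here.
    return [cloud.create_network(name, cidr) for cloud in clouds]

print(pipeline_step([AwsNetwork(), AzureNetwork()], "app-tier", "10.10.0.0/16"))
```

The value of a cloud-neutral tool is exactly this kind of uniform surface; the cost is that someone has to maintain the mapping to every provider underneath it.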
A further challenge is troubleshooting issues with these complex multi-cloud networks. Troubleshooting a single on-premises data center was relatively simple, and applications tended to be monolithic, so there was only a single place to look. Modern cloud-native applications might be spread across cloud providers and cloud locations and composed of many small, ephemeral objects such as containers or serverless functions. Observability is vital: collecting metrics, logs, and traces into a coherent location creates insight into application behavior. Collecting data for the sake of having everything is no longer practical; there are simply too many moving parts. Observability requires that we collect the data we can analyze and act on, closing the loop to improve the applications on the network.
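As a small illustration of the "single location" idea, the sketch below normalizes metrics, logs, and traces from different sources into one store and then asks it an actionable question. It is a toy, assuming nothing about any particular observability product:

```python
import time
from dataclasses import dataclass, field

@dataclass
class Signal:
    """One normalized observability record: a metric, log line, or trace span."""
    kind: str    # "metric" | "log" | "trace"
    source: str  # which cloud or on-premises site emitted it
    body: dict
    ts: float = field(default_factory=time.time)

class Pipeline:
    """Feed every signal into a single store so cross-cloud insight is possible."""
    def __init__(self) -> None:
        self.store: list[Signal] = []

    def ingest(self, signal: Signal) -> None:
        self.store.append(signal)

    def slow_spans(self, threshold_ms: float) -> list[Signal]:
        # Data is only worth collecting if a question like this gets asked of it.
        return [s for s in self.store
                if s.kind == "trace" and s.body.get("duration_ms", 0) > threshold_ms]

pipe = Pipeline()
pipe.ingest(Signal("metric", "aws:us-east-1", {"cpu": 0.72}))
pipe.ingest(Signal("trace", "azure:westeurope", {"span": "checkout", "duration_ms": 840}))
print(pipe.slow_spans(500.0))
```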
Training and education are another area of focus. The current generation of senior technologists grew up with physical servers and on-premises virtualization: hardware they could touch, hear, and feel. The coming generation will seldom have that experience as part of their daily work; software-defined, on-demand resources do not offer the same tactile experience. Yet most of these newer services are grounded in the way the physical devices operated. Learning the principles of how computers operate is very different in a cloud-native world, and many newer technologists will need to get their fix of physical systems by building their own home labs and learning outside work.
Data is Making the Enterprise Network Better
Feb 11, 2025
The amount of data that has been unearthed in the network over the past few years is astounding. We now have access to more information that we could ever hope to want about the status of packets transiting through the enterprise. But is this data causing networks to be more complex? In this episode of the Tech Field Day podcast, Tom Hollingsworth is joined by Pieter-Jan Nefkens, Matyas Prokop, and Dominik Pickhardt as they explore the rise in data-centric network design. They discuss the drivers behind the need for more focused deployments and how access to this amount of data is creating challenges for operations teams. They also look at how security plays a role in the amount of information gathered and what happens to it after it is collected.
The network has always contained the data that we have only recently surfaced. Being able to find that data has been useful for network engineers who are looking to build new networks. Understanding the minutiae of things like performance and tail latency means you can support new applications without needing to purchase expensive new hardware. However, not all of that data is useful. It takes a keen eye, or a very comprehensive software algorithm, to understand what’s happening under the surface.
That’s why the data is valuable but the processed information is even more critical. We need to make sense of the millions of lines of logs and the overwhelming number of data points that only gain context with analysis. For designers and operations teams, it takes more than just a table of numbers to make things better. You need to understand why things are behaving the way they are and where to apply changes to make them work better. That is the essence of turning data into information: it’s about adding context.
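A small worked example shows the difference. Raw latency samples are data; percentiles compared against an expectation are information, because they tell a designer where to act. The samples and thresholds below are invented for illustration:

```python
import statistics

def percentile(samples: list[float], p: float) -> float:
    """Nearest-rank percentile over raw samples (the 'data')."""
    ranked = sorted(samples)
    idx = max(0, min(len(ranked) - 1, round(p / 100 * len(ranked)) - 1))
    return ranked[idx]

latencies_ms = [12, 14, 11, 13, 15, 12, 240, 13, 14, 12]  # one slow outlier

p50 = statistics.median(latencies_ms)
p99 = percentile(latencies_ms, 99)

# The 'information': context that points at a cause, not just a number.
if p99 > 10 * p50:
    print(f"p50={p50}ms, p99={p99}ms: healthy median but a severe tail; "
          "suspect a retry loop or one overloaded path, not overall capacity.")
```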
This is even more apparent in security settings. You need the raw data and the analysis to be able to add the next important step: taking action. Network designs and performance may improve when you add more information to your platforms, but those actions can be scheduled and planned. When it comes to a security breach, you need to take that critical information and make it actionable, whether that means stopping an ongoing attack or plugging a hole that could be exploited in the future. The refinement of data into information leads to better decisions from that point forward.
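Closing that loop can be as simple as the rule sketched below: refined information (a count of failures within a window) triggers an action instead of just populating a dashboard. The blocking call is a placeholder for whatever enforcement point a real network offers, such as a firewall rule or ACL push:

```python
from collections import Counter

def act_on_auth_failures(events: list[tuple[str, str]], limit: int = 5) -> None:
    """events: (source_ip, outcome) pairs from one collection window."""
    failures = Counter(ip for ip, outcome in events if outcome == "fail")
    for ip, count in failures.items():
        if count >= limit:
            # Placeholder for a real enforcement call (firewall rule, ACL push).
            print(f"blocking {ip}: {count} failed attempts in window")

act_on_auth_failures([("10.0.0.9", "fail")] * 6 + [("10.0.0.7", "ok")])
```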
The expansion of wireless spectrum technologies is affording new opportunities for growth. Existing areas are becoming more crowded as devices become more prevalent, including not only user devices but the Internet of Things (IoT) as well. In this episode of the Tech Field Day Podcast, Tom Hollingsworth is joined by Cheryl Connell, Ron Westfall, and Jason Beshara as they discuss how the growth of IoT has caused contention in the current Wi-Fi spaces. They also discuss how the opening of the 6 GHz spectrum band has the potential to create more room for growth, provided device manufacturers adopt these new areas along with the guidelines they bring.
One thing that has caused significant issues with adoption today is the fact that Wi-Fi radios are often one of the most economical parts of the device. Some newer tablets and laptops have cutting-edge cards that operate at peak performance. However, the resource budgets for IoT devices often dictate less capable radio cards. Sometimes that happens because the CPU on the device isn’t capable of doing enough calculations to provide higher throughput. In other cases it is because a more capable card would increase the manufacturing cost of the device and reduce the operating profit of making thousands of them. Opportunity costs spare no one.
Another consideration is that crowded spectrums also create issues for operations that could have liability impacts for enterprises. IoT is more than thermostats and intelligent appliances. The term also applies to medical devices and safety sensors. These devices need to have clear ways to communicate and cannot afford to face contention in the air. That means creating new designs that use 6 GHz radios is a must, as is the need for medical and safety organizations to deploy proactively to support these devices right away.
Lastly, the guests discuss how best to prepare for these new deployments and understand the technologies behind them. There will always be challenges, and external contractors and tenants will create situations that you must address. Investigation and planning now will prevent the kinds of challenges that could derail upgrade efforts in the near future.
Artificial Intelligence has risen from a purely technical challenge to become the leading geopolitical issue in 2025. This episode of the Tech Field Day podcast, recorded in advance of AI Field Day, features Dr. Bob Sutor, Mitch Ashley, and Jim Czuprynski discussing the balkanization of AI globally. We have previously talked about data sovereignty, but AI is a technology seen to have international importance. From raw materials to chipmaking to systems design to model training, every nation wants to control AI. The United States recently introduced the AI Diffusion Framework to control not just access to chips but also sovereignty over the AI models themselves. We are really just starting to develop AI systems with large language models, and yet companies and researchers are already recognizing that we need to move beyond words to spatial, temporal, and true understanding of the world. We are on the cusp of the next space race as regions and nations compete to be the center of the AI universe.
The public cloud is real; are private clouds real? Will we see more private clouds in 2025? Private cloud technology is far from the early days when on-premises virtualization was cloud-washed. In this episode of the Tech Field Day Podcast, Jon Myer, Allyson Klein, and Justin Warren join Alastair Cooke to examine why businesses deploy private clouds. The cloud operating model makes the private cloud more relevant in 2025 than ever.
IoT Is Going to Make Things More Difficult
Jan 14, 2025
IoT devices are growing by the thousands every day. Networks are gaining millions of devices every year, and the total number is expected to top 31 billion by the end of 2025. Each of these devices is going to create complexity issues for networking administrators and engineers the world over. In this episode, Tom Hollingsworth is joined by networking legends Peter Welcher and Denise Donohue as they discuss the challenges that IoT is imposing on us all. They discuss the skills gap and security concerns, as well as how to stay ahead of the deployment process.
IoT devices don’t start out overly complicated. Most of the enterprise-focused solutions are aimed at providing functionality at the lowest possible cost at scale. However, the compromises that must be made to save costs, or the need for these devices to behave in a certain way, often lead to operational issues. It could be a quirky configuration requirement or a need for always-on connectivity that leads to confusion.
Networking teams have to continually learn how these devices work and gain new skills. Most of the basics of networking are still present, but new solutions have new components, and those new features demand the most intense study. This is especially true for security teams, as the attack surface increases exponentially with every new device added, especially if the security of those devices must be altered to make them easier to deploy and manage.
The other major change that networking teams need to be prepared for is the massive amount of data generated by IoT devices. This causes issues for both data collection and transmission. When you include things such as location tracking, you realize that each of those data points requires packets to be transmitted and stored. That increases the burden on the network and requires integration between teams to ensure that everything is available to stakeholders when requested.
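A back-of-envelope estimate makes that burden concrete. The figures below are placeholder assumptions, not measurements; the point is that modest per-device numbers multiply quickly:

```python
# Illustrative assumptions: substitute your own device counts and payloads.
devices = 50_000         # IoT endpoints on one enterprise network
reports_per_min = 2      # e.g. location or sensor updates per device
payload_bytes = 600      # payload plus protocol overhead per report

bytes_per_day = devices * reports_per_min * 60 * 24 * payload_bytes
print(f"~{bytes_per_day / 1e9:.1f} GB ingested per day")        # ~86.4 GB/day
print(f"~{bytes_per_day * 365 / 1e12:.1f} TB stored per year")  # ~31.5 TB/year
```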
Finally, ever-increasing IoT deployments are not going away any time soon. More and more devices are gaining features that require constant connectivity, and many manufacturers simply assume that their widgets will always be able to reach the cloud. Networking teams need to adjust their decision making accordingly and put policies in place now to ensure that new IoT deployments follow established guidelines for operations and security.
All marketing is aspirational. Quoted throughput and user counts are hopeful at best. All IT professionals know this. In this episode, Tom Hollingsworth is joined by Kerry Kulp, Scott McDermott, and Mark Houtz as they discuss the inflated claims of marketing teams and how they factor into buying decisions. They also discuss how Wi-Fi compares to other technologies and why the enterprise experience is vastly different from the consumer perspective. Lastly, they provide some ideas for keeping a grasp on reality when it comes to working with Wi-Fi numbers.
When you see the numbers on the data sheet for an access point, you can bet they were measured under ideal conditions, with a single user as close to the device as possible. Unlike traditional wired networking, wireless has many characteristics that can degrade communications. Despite being marketed as gigabit-capable, very few users actually see those kinds of numbers, even under the best of circumstances.
This is especially true with the continual release of new specifications, which have received marketing terminology that replaces standards designations with simple numbering schemes. Wi-Fi 7 has to be better than Wi-Fi 6 because the number is higher, right? However, the speed numbers are only part of the story. New standards often reduce latency or provide better efficiency, and some focus more on the crowded enterprise space than on making consumer devices faster. Even if your AP runs at full speed in your house, you may not see the same performance in an office building where the deployment uses narrower, more efficient channel widths to handle the extra clients.
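A rough calculation shows why the data-sheet number rarely survives contact with an office floor. Every factor below is an illustrative assumption rather than a measurement, but the shape of the arithmetic holds:

```python
advertised_mbps = 5_000  # aggregate "up to" PHY rate on the data sheet (assumed)
mac_efficiency = 0.6     # airtime lost to contention, management, retries (assumed)
channel_factor = 0.5     # narrower channels chosen for enterprise density (assumed)
clients = 40             # one busy conference room (assumed)

per_client = advertised_mbps * mac_efficiency * channel_factor / clients
print(f"~{per_client:.0f} Mbps per client")  # ~38 Mbps, a long way from 5,000
```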
Rather than obsessing over the fastest possible speeds, it is better to steer users toward more useful metrics, such as efficiency. These utility aspects make it easier for people to see that, while the speeds may not be the fastest they’ve ever seen, they are getting more coverage and reliability instead of peak throughput followed by poor connectivity.
Technology Silos Are a Thing of the Past
Dec 17, 2024
Enterprise IT has long been divided into silos. This is because of scarce resources and specialized knowledge required to perform some IT operations tasks. The world of today is much more focused on outcomes and the need for silos is waning. In this episode of the Tech Field Day Podcast, Stephen Foskett, Alastair Cooke, and Tom Hollingsworth discuss how enterprise IT has moved away from silos due to increased resource availability and cross training. They also look ahead to new challenges from advances like AI and quantum computing.
Silos were very popular in the days when your storage admins and your network engineers had their own spaces to operate in and no one really did any cross training. You had your area and you stuck to it. As IT evolved, those silo walls started coming down. Storage and compute merged because of virtualization. Wireless and traditional networking have started to become one edge-focused solution. All of that came even before the cloud started battering down the barriers around how we work with our infrastructure.
Part of the reason for the changes is abundance. We no longer have to conserve resources as we once did. Bandwidth is plentiful. Cloud computing makes CPU and storage effectively unlimited if budgets allow. Engineers no longer need to worry about the minutiae of esoteric configurations that optimize dwindling resources. Instead, engineering and development talent have started to focus on outcomes. Applications have become the atomic unit of deployment, while networking and storage have been relegated to components of the overall solution.
That’s not to say that new challenges aren’t on the horizon for IT silos. AI is creating new boundaries based on resources that, while deep, are also very expensive to create and maintain. These new constraints are creating divisions just like the old silos. Another challenge is the need to simplify and abstract enterprise IT technology. Hiding the complexity from DevOps and AIOps teams doesn’t mean that it goes away. Instead it leads to bigger issues when the abstractions fail and the understanding of the siloed nature of IT isn’t there.
The future continues to be uncertain as the power needs of AI and the hardware requirements of quantum computing seem to be limitless. The unpredictability of the deployment of these technologies and the lack of efficiency they demonstrate today mean that we may have eliminated our existing silos only to have set up the creation of many more.
Company Acquisitions are a Necessary Evil in Enterprise Tech
Dec 10, 2024
The IT industry’s reliance on acquisitions is a necessary driver of innovation, though acquisitions often seem to get in the way of competition and progress. This episode of the Tech Field Day podcast, recorded during Cloud Field Day 21, features Ray Lucchesi, Jon Hildebrand, Ken Nalbone, and Stephen Foskett considering whether acquisitions in the IT industry are a necessary evil or a detriment to innovation. Acquisitions are often seen as a double-edged sword, with both positive and negative implications. On one hand, acquisitions can fuel innovation by providing smaller companies with the resources and market access they need to scale their ideas. On the other hand, they can stifle competition, lead to cultural clashes, and sometimes result in the disappearance of promising technologies or products.
We May Not Like Acquisitions in Tech, But We Need Them!
The IT industry has long been shaped by the cycle of acquisitions, with large companies absorbing smaller, innovative startups to bolster their portfolios. This practice is often seen as a double-edged sword. On one hand, acquisitions can inject fresh ideas and technologies into established organizations, enabling them to stay competitive in a rapidly evolving market. On the other hand, the process can stifle innovation, as smaller companies with promising technologies are often absorbed and their products either languish or are subsumed into larger, less agile corporate structures. The debate over whether acquisitions are a necessary evil or simply detrimental to the industry remains a contentious topic.
One of the key arguments in favor of acquisitions is their role in fostering innovation. Startups often emerge with groundbreaking ideas but lack the resources or market reach to scale effectively. Being acquired by a larger company can provide the necessary capital, infrastructure, and customer base to bring these innovations to a broader audience. However, this process is not without its pitfalls. Many acquisitions result in a clash of corporate cultures, leading to inefficiencies and, in some cases, the eventual dissolution of the acquired entity’s unique value proposition. This raises questions about whether the industry might benefit more from encouraging organic growth rather than relying on acquisitions as a growth strategy.
Critics argue that acquisitions often prioritize short-term financial gains over long-term innovation. Large corporations may acquire smaller companies not to integrate their technologies but to eliminate potential competition. This practice can lead to market consolidation, reducing diversity and stifling the competitive landscape. Furthermore, the focus on financial returns, driven by venture capital and private equity investments, often pressures startups to position themselves as acquisition targets rather than sustainable, standalone businesses. This dynamic can skew the priorities of emerging companies, emphasizing exit strategies over product development and customer satisfaction.
The role of private equity in driving acquisitions adds another layer of complexity. Private equity firms often seek to maximize returns by cutting costs and streamlining operations, which can lead to a loss of innovation and employee morale within the acquired company. While some private equity firms take a more hands-on approach to foster growth and innovation, others focus solely on financial metrics, potentially undermining the long-term viability of the companies they acquire. This dichotomy highlights the need for a more balanced approach to investment, one that prioritizes sustainable growth and innovation over short-term financial gains.
In an ideal world, the IT industry would thrive on organic growth, with companies building sustainable business models and scaling through customer acquisition and market expansion. However, the reality is that acquisitions are deeply ingrained in the industry’s fabric, driven by the need for rapid growth and the financial incentives of venture capital and private equity. While acquisitions may be a necessary evil in the current landscape, the industry must strive to ensure that they are conducted in a way that fosters innovation, benefits customers, and supports the long-term health of the market. The challenge lies in finding a balance that allows both startups and established companies to thrive without compromising the industry’s overall dynamism.
There’s a Gulf Between Storage and AI
Dec 03, 2024
There is a significant gap between storage companies’ current capabilities and the demands of AI infrastructure. In this episode of the Tech Field Day podcast, recorded during AI Data Infrastructure Field Day 1 in Santa Clara, host Stephen Foskett and guests Kurtis Kemple, Brian Booden, and Rohan Puri explore the evolving relationship between storage and AI. While storage vendors are pivoting to support AI, many lack deep AI expertise, often focusing on cost and efficiency rather than offering integrated, AI-specific solutions. The panel emphasizes the need for storage companies to move beyond being mere data repositories and instead develop end-to-end solutions that address AI workflows, data preparation, and metadata management. They also stress the importance of education, partnerships, and hiring AI specialists to bridge the knowledge gap and drive innovation. The conversation underscores the early stage of this convergence, with a call for clearer strategies, open standards, and more cohesive integration between storage and AI to meet the growing demands of data-driven applications.
The intersection of storage and AI infrastructure presents a complex and evolving challenge. While storage companies are increasingly pivoting toward AI solutions, there remains a significant gap in understanding and integration. Storage has traditionally been viewed as a low-level, technical domain focused on hardware like disks and file systems. On the other hand, AI, particularly in the context of large language models (LLMs) and data analytics, operates at a higher level, requiring nuanced data management and application-specific insights. This disconnect highlights the need for storage companies to move beyond simply offering cost-effective and high-performance infrastructure. Instead, they must develop a deeper understanding of AI workflows and provide solutions that address the specific needs of AI applications, such as data preparation, metadata management, and seamless integration with AI training pipelines.
One of the key challenges is the lack of “solutioning” in the storage industry. Many storage vendors focus on infrastructure performance and efficiency but fail to address how their products fit into the broader AI ecosystem. For instance, while some companies are integrating with GPU technologies to support AI workloads, this approach often stops at the infrastructure level. True integration requires a more comprehensive understanding of AI applications, extending beyond hardware to include data management, insights, and application-level affordances. Without this, storage solutions risk being perceived as generic and interchangeable, reducing their value proposition in the AI space.
Another critical issue is the fragmentation of data sources and the absence of standardized frameworks for integration. Data in AI workflows often comes from diverse sources, including databases, data warehouses, file systems, and cloud storage. These sources are frequently siloed, making it challenging to consolidate and analyze data effectively. While some progress has been made in the database world with open formats and decoupled layers, similar advancements are lacking in the storage domain. The industry needs open standards and protocols that enable seamless data integration across vendors and platforms, facilitating the development of unified AI solutions.
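One way to picture the connector role is a uniform read interface over siloed sources, so an AI pipeline codes against one contract instead of many. This is a hypothetical sketch; the protocol and source names stand in for whatever open standard eventually emerges:

```python
from typing import Iterator, Protocol

class DataSource(Protocol):
    """The uniform contract an AI pipeline could code against."""
    def records(self) -> Iterator[dict]: ...

class WarehouseSource:
    def __init__(self, table: str) -> None:
        self.table = table
    def records(self) -> Iterator[dict]:
        # Stand-in for a SQL fetch from a data warehouse.
        yield {"source": "warehouse", "table": self.table, "row": 1}

class ObjectStoreSource:
    def __init__(self, bucket: str) -> None:
        self.bucket = bucket
    def records(self) -> Iterator[dict]:
        # Stand-in for reading objects from cloud or on-premises storage.
        yield {"source": "object-store", "bucket": self.bucket, "key": "part-0"}

def prepare(sources: list[DataSource]) -> list[dict]:
    """Consolidate siloed sources into one stream for training or retrieval."""
    return [rec for src in sources for rec in src.records()]

print(prepare([WarehouseSource("sales"), ObjectStoreSource("telemetry")]))
```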
The role of storage companies in AI could evolve in two distinct directions: becoming specialized storage solutions for AI or serving as connectors that enable AI applications to access existing data seamlessly. Both approaches have merit, but they require a clear strategy and a deep understanding of AI workflows. Companies that choose to specialize in AI storage must offer features like automated data preparation, efficient data movement, and real-time insights. Conversely, those opting to act as connectors must focus on breaking down data silos and providing tools that simplify data access and integration for AI applications.
Education and leadership are crucial for bridging the gap between storage and AI. Storage companies need to hire AI specialists and empower them to influence product development and strategy. This requires a top-down approach, with leadership roles dedicated to understanding and addressing the unique challenges of AI. Without this internal expertise, companies risk creating a disconnect between their AI-focused messaging and the actual capabilities of their products. Moreover, fostering collaboration between storage and AI teams within organizations can lead to more innovative and effective solutions.
Finally, the industry is still in the early stages of addressing the intersection of storage and AI. While the rapid growth of data and the increasing complexity of AI workloads present significant challenges, they also offer opportunities for innovation. Storage companies that can adapt to these demands by developing specialized products, embracing open standards, and fostering cross-disciplinary expertise will be better positioned to succeed. As the market matures, we can expect to see a blending of technologies and a shift toward more integrated and user-friendly solutions that cater to the unique needs of AI applications.