
@CloudExpo: Blog Feed Post

SDN: Concrete or Concept?

Is SDN a concept, or a concrete architectural construct? Does it really matter?

Now, if we look at the benefits, we can attempt to infer the problems SDN is trying to solve:

BENEFIT: Programmability
PROBLEM: Network components today, particularly hardware-based components, run specific feature sets that can only be modified by the vendor, which happens on a 12-18 month schedule, security and hot-fixes notwithstanding. New features and functions can only be added by the vendor based on their prioritization, not the customer's.

BENEFIT: Automation
PROBLEM: Manual configuration of network components is time-consuming, costly, and introduces a higher risk of human error that can result in outages, poor performance, or security risks.

BENEFIT: Network control
PROBLEM: The network today doesn't adapt rapidly to changing conditions or events. While some protocols simulate such adaptability, they can't autonomously route around outages or failures, or easily modify existing policies.

These are certainly problems for IT organizations of a variety of sizes and composition. The question then is, how does SDN uniquely solve those problems?

The answer is that as a concrete solution (i.e., components, software, and architectures) it does not uniquely solve the problem. As a concept, however, it does.

Someone's no doubt quite upset at the moment at that statement. Let's explain it before someone's head explodes.

CONCEPT versus CONCRETE

The concept of separating the data and control planes enables programmability. Without that separation, we have what we have today: static, inflexible networking components. But the concept of separating data and control planes isn't unique to solutions labeled specifically SDN. ADN is a good example of this (you saw that coming, didn't you?).

A network component can (and this may surprise some people) internally decouple its control and data planes. Yeah, I know, right? And doing so enables a platform that looks a whole lot like SDN diagrams, doesn't it, complete with plug-ins and programmability. This occurs in full-proxy architectures where there exist dual stacks: one on the client side, one on the server side. Where traffic transitions from one stack to the other, there exists an opportunity to inspect, manipulate, and modify the traffic. Because the architecture requires acting as an endpoint to clients (and conversely as the point of origin for the server side), protocols can even be implemented in this "no man's land" between client and server. That enables protocol transitioning, such as speaking SPDY on the outside while still speaking HTTP on the inside, or IPv4 to servers while supporting IPv6 on the client (and vice versa).
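To make the dual-stack idea concrete, here is a minimal sketch of that stack boundary. It is illustrative only: a real full-proxy terminates complete TCP/HTTP stacks on each side, and the function names and the X-Forwarded-Proto manipulation are assumptions chosen for the example. The point is that the message is fully parsed on the client side, touched in the "no man's land," and re-originated on the server side, which is where protocol transitioning (here, an HTTP version bump) can happen.

```python
# Sketch of a full-proxy's dual-stack boundary (illustrative, not a real ADC).

def client_side_stack(raw_request: bytes) -> dict:
    """Terminate the client connection: parse the request into a neutral form."""
    head, _, body = raw_request.partition(b"\r\n\r\n")
    lines = head.decode().split("\r\n")
    method, path, version = lines[0].split(" ")
    headers = dict(line.split(": ", 1) for line in lines[1:] if line)
    return {"method": method, "path": path, "version": version,
            "headers": headers, "body": body}

def inspect_and_modify(msg: dict) -> dict:
    """The 'no man's land' between stacks: inspect, manipulate, modify."""
    msg["headers"]["X-Forwarded-Proto"] = "https"  # example manipulation
    return msg

def server_side_stack(msg: dict) -> bytes:
    """Originate a fresh request to the server, possibly in another protocol."""
    msg["version"] = "HTTP/1.1"  # protocol transitioning happens here
    head = f"{msg['method']} {msg['path']} {msg['version']}\r\n"
    head += "".join(f"{k}: {v}\r\n" for k, v in msg["headers"].items())
    return head.encode() + b"\r\n" + msg["body"]

raw = b"GET /app HTTP/1.0\r\nHost: example.com\r\n\r\n"
out = server_side_stack(inspect_and_modify(client_side_stack(raw)))
```

The client speaks HTTP/1.0 and never knows the server-side conversation was re-originated as HTTP/1.1 with an extra header; swap the two stacks for SPDY/HTTP or IPv6/IPv4 and you have the transitions described above.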

Where the separation occurs is not necessarily as important as the fact that it exists – unless you're focused on concrete, SDN-labeled solutions as being the only solutions that can provide the flexibility that programmability offers.

Automation occurs by exposing the management plane through an API (or implementing a specific API, such as OpenFlow) such that operational tasks and configuration can be achieved through tools instead of time.
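As a sketch of "tools instead of time," consider pushing configuration through a management-plane API rather than typing it into each box. The endpoint path and payload fields below are invented for illustration; a real device exposes its own schema (or a standard API such as OpenFlow). The `send` callable stands in for whatever HTTP client the tooling uses.

```python
# Hypothetical sketch: automating configuration via a management-plane API.
import json

def build_vlan_config(vlan_id: int, ports: list) -> dict:
    """Compose the declarative payload a management API might accept."""
    return {"vlan": {"id": vlan_id, "ports": ports, "admin_state": "up"}}

def apply_config(send, configs):
    """Push each config through the API; 'send' abstracts the HTTP client."""
    results = []
    for cfg in configs:
        results.append(send("POST", "/mgmt/config/vlans", json.dumps(cfg)))
    return results

# Provisioning many VLANs becomes one loop instead of many manual sessions.
configs = [build_vlan_config(i, [f"eth{i % 4}"]) for i in range(100, 104)]
applied = apply_config(
    lambda method, path, body: (method, path, json.loads(body)["vlan"]["id"]),
    configs,
)
```

The same loop scales to hundreds of components, which is exactly where manual configuration becomes costly and error-prone.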

Between automation and programmability, you realize network control.

Now, this is not SDN, at least not in terms of protocol support and concrete architecture. But it is software-defined, and it is networking, so does it count?

I guess it depends. ADN has always approached layers 4-7 with an eye toward extensibility, programmability and control that enables agility in the network. We didn't call it SDN and I don't see the industry deciding to "SDN-wash" existing ADN solutions as SDN just because a new term came along and became the TLA du jour.

What I do see is that ADN, and specifically full-proxy-based ADCs (application delivery controllers), already offer the same benefits using the same concepts as SDN. Consider again the core characteristics of SDN:

1. Control and data planes are decoupled

2. Intelligence and state are logically centralized

3. Underlying network infrastructure is abstracted from applications

All of these characteristics are present in an ADN. The ability to leverage network-side scripting on the control plane side of the equation enables extensibility, rapid innovation, and the ability to adapt to support new protocols, new applications, and new business requirements, all without involving the vendor. That is exactly one of the benefits cited for SDN solutions, and specifically for OpenFlow-enabled architectures.
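The shape of network-side scripting can be sketched as an event-hook engine: the operator attaches small rules to traffic events, and behavior changes without waiting on a vendor release. The class, event name, and steering policy below are all assumptions for illustration; real ADN scripting (F5's iRules, for example) uses Tcl with events such as HTTP_REQUEST.

```python
# Sketch of network-side scripting: operator-supplied rules hooked into
# traffic events (hypothetical API, in the spirit of iRules-style scripting).

class ScriptableProxy:
    def __init__(self):
        self._hooks = {}

    def on(self, event):
        """Register a rule for a traffic event (used as a decorator)."""
        def register(fn):
            self._hooks.setdefault(event, []).append(fn)
            return fn
        return register

    def fire(self, event, ctx):
        """Run every registered rule against the traffic context."""
        for fn in self._hooks.get(event, []):
            fn(ctx)
        return ctx

proxy = ScriptableProxy()

@proxy.on("http_request")
def steer_mobile_clients(ctx):
    # Operator-written policy: route mobile traffic to a dedicated pool.
    if "Mobile" in ctx.get("user_agent", ""):
        ctx["pool"] = "mobile_pool"

ctx = proxy.fire("http_request", {"user_agent": "Mobile Safari",
                                  "pool": "default"})
```

Adding support for a new protocol or business rule is just another registered hook, which is the extensibility-without-the-vendor benefit described above.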

So the question really is, does it matter if a solution to the problem of "agility in the network" is a concrete or conceptual SDN solution if it ultimately solves the same set of problems?


More Stories By Lori MacVittie

Lori MacVittie is responsible for education and evangelism of application services available across F5’s entire product suite. Her role includes authorship of technical materials and participation in a number of community-based forums and industry standards organizations, among other efforts. MacVittie has extensive programming experience as an application architect, as well as network and systems development and administration expertise. Prior to joining F5, MacVittie was an award-winning Senior Technology Editor at Network Computing Magazine, where she conducted product research and evaluation focused on integration with application and network architectures, and authored articles on a variety of topics aimed at IT professionals. Her most recent area of focus included SOA-related products and architectures. She holds a B.S. in Information and Computing Science from the University of Wisconsin at Green Bay, and an M.S. in Computer Science from Nova Southeastern University.
