Avishai Sharlin

What Service Providers Should Consider When Running in a Hybrid Cloud Environment

Telco considerations for running a hybrid cloud environment

This is an excerpt from Avishai’s byline originally published in TechTarget. The full article can be found here.

When you get a new tech gadget, what typically happens to your old one? Often, that once-loved device is simply forgotten, as we turn all attention to our latest and greatest acquisition.

When you upgrade to a new infrastructure or virtualized network function, don't overlook the fact that your existing applications and installed base still exist. Simply treating them as outdated and obsolete would be a grave mistake.

Operate and Innovate at Speed with Hybrid Cloud

The first thing to look at when running a hybrid environment is your operations. Consider looking into lift-and-shift opportunities and areas where you can containerize apps. Organizations should try to simplify where possible, converting existing applications to run within containers. In some cases, this is easier said than done: the process takes time, can mean retraining personnel, and may add operating costs.
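
As a rough illustration, one common first step when adapting an existing application for containers is moving configuration into environment variables and sending logs to standard output so the container platform can collect them. The service name and variables below are invented for the sketch:

```python
# Sketch of a common containerization refactoring: configuration comes from
# environment variables injected by the platform, and logs go to stdout.
# The names (billing-adapter, DB_HOST, DB_PORT) are illustrative only.
import logging
import os
import sys

logging.basicConfig(stream=sys.stdout, level=logging.INFO)
log = logging.getLogger("billing-adapter")

# Read settings injected by the container platform instead of a config file.
db_host = os.environ.get("DB_HOST", "localhost")
db_port = int(os.environ.get("DB_PORT", "5432"))

log.info("Connecting to %s:%d", db_host, db_port)
```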

Next, consider how you plan to scale your apps. Are you going to use Kubernetes in a standard way, or do you need other methods and techniques? The answer will shape the best way forward, while also setting expectations for changes in development direction and tooling costs.
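
As a minimal sketch of the "standard Kubernetes way," the official Kubernetes Python client can create a HorizontalPodAutoscaler for an existing workload. The deployment name, namespace and thresholds below are assumptions for illustration only:

```python
# Sketch: creating a HorizontalPodAutoscaler with the Kubernetes Python client.
# Assumes a Deployment named "mediation-app" already exists in "default" and
# that kubeconfig points at the target cluster.
from kubernetes import client, config

config.load_kube_config()  # use config.load_incluster_config() inside a pod

hpa = client.V1HorizontalPodAutoscaler(
    api_version="autoscaling/v1",
    kind="HorizontalPodAutoscaler",
    metadata=client.V1ObjectMeta(name="mediation-app-hpa"),
    spec=client.V1HorizontalPodAutoscalerSpec(
        scale_target_ref=client.V1CrossVersionObjectReference(
            api_version="apps/v1", kind="Deployment", name="mediation-app"
        ),
        min_replicas=2,
        max_replicas=10,
        target_cpu_utilization_percentage=70,  # scale out above 70% CPU
    ),
)

client.AutoscalingV1Api().create_namespaced_horizontal_pod_autoscaler(
    namespace="default", body=hpa
)
```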

Look into APIs. APIs are essential for tapping the knowledge, libraries and R&D investments that go into an open platform solution. Also consider best practices from related API technologies, such as Swagger (OpenAPI) and REST, to open your ecosystem and ease integration.
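
For illustration, a framework such as FastAPI generates the OpenAPI (Swagger) description of your endpoints automatically, which makes it easier for partners to integrate. The service name, endpoint and fields below are invented for the sketch:

```python
# Sketch: a small REST API whose OpenAPI (Swagger) document is generated
# automatically. The "Subscriber" model and endpoint are hypothetical.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="Subscriber API", version="1.0.0")

class Subscriber(BaseModel):
    id: str
    plan: str

@app.get("/subscribers/{subscriber_id}", response_model=Subscriber)
def get_subscriber(subscriber_id: str) -> Subscriber:
    # Placeholder lookup; a real service would query its data store.
    return Subscriber(id=subscriber_id, plan="prepaid")
```

Serving this app (for example with `uvicorn`) exposes interactive Swagger docs at /docs and the raw OpenAPI document at /openapi.json.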

Lastly, plan how you'll monitor and log your hybrid environment, and consider using standard Kubernetes patterns such as native Prometheus metrics and standard-output logs.
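
A minimal sketch of that pattern, assuming the prometheus_client library and an arbitrary metric name and port:

```python
# Sketch: expose native Prometheus metrics and write logs to standard output
# so standard Kubernetes tooling can scrape and collect them.
import logging
import sys
import time

from prometheus_client import Counter, start_http_server

logging.basicConfig(stream=sys.stdout, level=logging.INFO)
log = logging.getLogger("hybrid-app")

requests_total = Counter("app_requests_total", "Requests handled by the app")

if __name__ == "__main__":
    start_http_server(9090)  # Prometheus scrapes /metrics on this port
    while True:
        requests_total.inc()
        log.info("handled one request")
        time.sleep(5)
```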

As part of a holistic end-to-end hybrid design and architecture, you'll need to look into security and interoperability as well. These topics will determine how your organization moves forward and at what speed and agility.

Need more information on running a hybrid cloud environment? Check out our additional resources or contact us.

How Edge Computing Challenges Our Industry, Part One - The Talent Game

How Edge Computing Challenges the Communications Industry

This is an excerpt from Avishai’s byline originally published in Forbes. The full article can be found here.

Edge and Fog Computing Networking Models

The growth of internet of things (IoT) devices and the changes in gaming consumption (network effect replacing pure in-house consumption) are two of the main driving forces for immense data growth in both consumption and traffic.

Pushing this data back and forth to the public cloud today presents challenges of latency, bandwidth and security. Think of sensitive data that we may not want to send to the public cloud in raw form.

If we can process the data locally and push it to the public cloud only when required, we may have a plausible solution. Edge and fog computing implement the fundamentals of cloud computing, but at the edge of the network, as close as possible to the source of the data produced by IoT devices and endpoints. In effect, we are looking at implementing a microcloud data center close to the source of the data. The term fog computing indicates a layer below the clouds (as in a weather forecast, fog sits below the clouds); fog computing is the interim stage before cloud computing.
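
A minimal sketch of that "interim processing" idea, with an invented summarization step and a placeholder upload function standing in for whichever cloud API the operator actually uses:

```python
# Sketch: raw sensor readings are reduced at the edge and only a small
# summary leaves the site. upload_to_cloud is a hypothetical placeholder.
from statistics import mean

def summarize(readings: list[float]) -> dict:
    # Keep only what the central cloud actually needs.
    return {"count": len(readings), "avg": mean(readings), "max": max(readings)}

def upload_to_cloud(summary: dict) -> None:
    print("pushing to public cloud:", summary)  # placeholder for a real API call

raw_readings = [21.4, 21.6, 35.9, 21.5]  # e.g. local temperature samples
upload_to_cloud(summarize(raw_readings))
```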

With edge computing, we may find ourselves with new networking models we are less familiar with. IoT devices can be placed in remote areas where traditional or simple networking may not be possible. Technologies like mesh networks over Bluetooth allow IoT devices to communicate with each other over their own private network. When required, they interact with a gateway that pushes this data out to the public cloud.
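
A rough sketch of that gateway role, assuming a hypothetical cloud ingestion endpoint and a placeholder for the local mesh collection step:

```python
# Sketch: readings gathered from devices on a local, private mesh are batched
# by the gateway and forwarded to the public cloud over HTTPS.
# The endpoint URL and the collection step are assumptions.
import json
import urllib.request

def collect_from_local_devices() -> list[dict]:
    # Placeholder for polling the private mesh (e.g. over Bluetooth).
    return [{"device": "sensor-1", "temp": 21.5}, {"device": "sensor-2", "temp": 22.1}]

payload = json.dumps(collect_from_local_devices()).encode()
req = urllib.request.Request(
    "https://cloud.example.com/ingest",  # hypothetical cloud endpoint
    data=payload,
    headers={"Content-Type": "application/json"},
)
urllib.request.urlopen(req)  # the gateway pushes the batch out to the cloud
```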

Edge Computing Challenges - Microclouds, Metrics & Integrations

Edge computing challenges are similar to the problems we experienced in adapting to the cloud as a whole. This means we have to look into issues related to infrastructure, operation and integration at a micro-public-cloud scale, decentralized from the public cloud footprint and network. As a result, the public cloud providers may not have any direct access to edge computing processing, which is something we didn't experience when dealing with the cloud. When we started deploying in clouds, the cloud vendors gave us everything from infrastructure as a service (IaaS) to platform as a service (PaaS); they monitored and secured it all. With edge computing, the fog is not necessarily part of the cloud. This raises several challenges, including:

  • Achieving complete integration, which requires professional services to build the required moving parts for edge computing, including a software architecture that supports the microcloud at the edge.
  • Interacting with the public cloud in a way that delivers metrics and allows for monitoring and security (see the sketch after this list).
  • Integrating with various clouds and devices.
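
One way to picture the metrics challenge: an edge workload that sits outside the cloud's scrape reach can push its metrics to a Prometheus Pushgateway that the central monitoring stack does see. The gateway address, job name and metric below are illustrative only:

```python
# Sketch: push metrics from an edge site to a central Prometheus Pushgateway.
from prometheus_client import CollectorRegistry, Gauge, push_to_gateway

registry = CollectorRegistry()
queue_depth = Gauge(
    "edge_queue_depth", "Items waiting for local processing", registry=registry
)
queue_depth.set(17)

# Hypothetical gateway address reachable from the edge site.
push_to_gateway("pushgateway.example.com:9091", job="edge-site-42", registry=registry)
```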

The Talent Game for Service Providers

To support this emerging environment, companies will need to adapt themselves in ways that allow them to support new scenarios and technologies. They, like the service providers, will need to look closely at their talent pool. Future adoption of serverless computing will only increase the demand for new skills in areas like function as a service (FaaS).

Add to that the need to understand new networking fundamentals such as mesh networks and software-defined networking (SDN)-enabled technologies. This pool of talent also needs to know how to operate and orchestrate these fundamentals, which is why we are facing a big challenge in our industry.

Potential Plays in Edge Computing

Companies will differ in how they address the talent problem. The big cloud vendors are already offering courses and architectural certifications; there are numerous tracks to take and master. In the talent game, we are witnessing new alliances and partnerships forming to address the knowledge gap. If one company has limited resources and another is suffering from a shortage of talent, combining forces may allow both of them to survive.

Some companies are targeting the acquisition of boutique shops to bridge knowledge gaps and educate their teams.

All of these are important steps that will allow companies to push forward, but they are not enough. Organizations must transform themselves into learning organisms that constantly adapt and sharpen their skills. The need is to create a learning culture, if one doesn't exist already, and to give people the tools to compete. Learning in groups, allocating time and means to train, and commending those who do are a handful of things a modern chief information officer (CIO) must embrace to succeed.

Summary

Edge and fog computing may be the next piece between on-premises computing and public cloud computing. They may bridge the data latency issues we are experiencing with the myriad of IoT devices. These computing models provide an "interim processing" stage done locally to avoid processing everything in the public cloud. While they bring a solution the market demands, these technologies will only increase the need for talent and new skills. The industry should prepare itself to address that need. CEOs and CIOs must create learning organizations that will allow them to operate and fully utilize the promise these technologies hold.

Related Blogs

  • How Edge Computing Challenges Communications Service Providers, Part Two - Hyperscale Clouds and SI Industry
  • AI presents a new opportunity to invest in CSP data
  • Capturing moments as they happen: a new age in customer profiles
