Cloud’s Achilles’ Heel Of Growth Is People Skills

  • by Ian Moyse
  • 31st Jul, 2018
  • Last updated on 11th Sep, 2019
  • 4 mins read

Cloud is undoubtedly the driver of the new tech economy. Be it SaaS, PaaS, IaaS, public, private or hybrid cloud, e-commerce, IoT (Internet of Things) or Big Data, each is, at the back of it, supported by cloud technologies. Technology is improving and falling in cost at such a speed that it is no longer the preserve of only the large firms; it can empower any organization, from small to large, from startup to established, to revolutionize its customer offering and to elect to disrupt or be disrupted.
With this speed of technological change comes a need for those supporting the business to adapt quickly and adopt new methodologies, knowledge, and skills, so that the company can take advantage of these new possibilities: switching from Waterfall to Agile, from networking to virtualization to Docker, from hosting to IaaS and PaaS, and from C through Java to Swift, Hack, and Dart.

Usage of AWS, Azure and Traditional Architectures
Take Amazon Web Services (AWS) and Microsoft Azure as examples. It is extremely easy now to spin up these cloud offerings to run applications or to act as a temporary testing environment. Yet while it is easy, there is also complexity to address: which modules, security model, and storage approach to use, through to managing the use of these services across an organization so that spend is optimized and security is not compromised.
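As a purely illustrative sketch of how quickly such a temporary environment can be spun up, the snippet below builds an AWS CLI command for a throwaway test instance. The AMI ID and instance ID are placeholders (not real resources), and DRY_RUN=echo prints the commands instead of executing them:

```shell
#!/bin/sh
# Sketch only: spin up a temporary test instance with the AWS CLI.
# The AMI ID is a placeholder; set DRY_RUN="" to actually execute
# (requires a configured AWS CLI and will incur charges).
DRY_RUN=echo
launch_cmd="aws ec2 run-instances --image-id ami-xxxxxxxx --instance-type t2.micro --count 1"
$DRY_RUN $launch_cmd

# Tear the environment down again when testing is done
# (the instance ID is a placeholder too):
teardown_cmd="aws ec2 terminate-instances --instance-ids i-0123456789abcdef0"
$DRY_RUN $teardown_cmd
```

The point is less the specific flags than the lifecycle: a test environment that is created and destroyed on demand, rather than procured and kept.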

Why Cloud Technology is slowly taking over
A wide range of firms still relies on traditional IT infrastructure (locally deployed server applications and databases), despite the increasingly rapid rate at which companies are migrating to cloud-based systems. Digital transformation is on the agenda of most enterprise organizations, bandied about as if it were a switch to flick and a fast thing to undertake. The reality is very different: accepting the change required, and building the cloud computing skills needed to achieve it, are barriers impeding a growing number of companies.

Change and Migration to Cloud

Change is hard to accept at the best of times, particularly if you have long been the subject-matter expert on a vendor or technology, only to find it being disrupted at pace and your worth diminishing, either in your own firm or in the general market. Letting go of many years of acquired skills and accepting the need to restart and learn a whole range of new ones is hard, and many will resist, defending the status quo and hindering both business change and their own personal progress.

For companies moving applications and services to cloud platforms, migration challenges are one of the top constraints affecting IT: there are no automated switchovers on offer, and customized internal or external migrations demand anything from mild to heavy development changes.

Organizations therefore need a strategic approach if they are to scale their migration to a cloud platform quickly enough for business needs.

A quick example

Migrating a homegrown or proprietary application requires new application development and testing. However, if taken on with commitment, the move can deliver faster, more agile application development through DevOps and the use of enhanced cloud features and APIs, leading to improved application lifecycle management.

Cloud computing skills are in demand

As the demand for cloud computing increases, there is a tremendous need for professionals to keep their knowledge and skills on the chosen cloud platform up to date, both to propel their careers and to deliver migration projects in a structured, effective manner. Cloud continues to evolve quickly, and even those who moved to the cloud a decade ago are finding they must keep learning new cloud computing skills; witness the surge in container usage, with a recent Robin Systems survey citing that 81% of organizations plan to increase their use.
Big Data has introduced new approaches, tools, and skills, and with expected annual growth of 60% (IDC) it cannot be ignored. With ever-increasing data volumes and continual crunching demands, databases are going to live in the cloud and will demand new platforms and approaches.

Security, Security and Security!
With the plethora of changes, from newly coded applications to architectures holding vast data stores in the cloud, greater cybersecurity expertise is an essential requirement. With the human element recognized as the most vulnerable area of security, the introduction of so many new skill areas brings an increased risk of new security exposures. Software developers in the cloud must understand this and proceed with extreme caution, as they carry increased responsibility for security assurance and compliance. With heightened awareness of security threats and breaches, and the introduction of the new GDPR (General Data Protection Regulation) in Europe with its far heftier and more damaging fines, getting this wrong is now catastrophic. It is estimated that fewer than 5% of cloud applications are ready for GDPR, implying a vast breadth of remediation in a very short period.

Reasons for new security threats
The perfect storm circling this comes from the expectation that 30-40% of the corporate workforce will retire in the next decade, combined with a decline in those studying relevant ICT subjects and in education's capacity to teach the required areas effectively. We have rapidly increasing demand for cloud computing skills (both to support new technologies and to support the digital transformation from old to new), a large percentage of those with technology backgrounds retiring rather than reskilling, and an education system struggling to attract and train people to the level required.

Businesses now face pressures like never before: markets that shift quicker; more fickle and demanding customers; users influenced by, or themselves, millennials (who expect faster, easier, cheaper from the world they have grown up in); and disruption all around from newborn firms who can attack with the gusto of all the new-world tech and methods, with no legacies to unchain themselves from.

Summary

Companies MUST have access to the skills required to employ the full scope of new tech on offer to their business advantage: to move old, creaking applications to newer form factors and to deliver a better quality of service and user experience, on any device, any place, any time, for both their employees and their increasingly new breed of customer.

Unless the issue is addressed quickly, expect supply and demand for these new skills to simultaneously implode and explode, creating a chasm between need and affordability, as those who can do the work become scarce and valuable commodities, available only to the few who can afford them.

We hope this article has given you a clear understanding of how a lack of people skills can be the weak point of cloud computing. To learn more about cloud computing, you can check out the other blogs and the AWS certification course by KnowledgeHut.

Ian Moyse

Blog Author

Ian, Sales Director at Salesforce cloud telephony provider Natterbox, has been a leader in the cloud industry for over 12 years and has sat on the boards of EuroCloud, FAST and the Cloud Industry Forum; from 2016 to 2018 he was a judge on the UK Cloud Awards. Ian has been rated the #1 global SaaS influencer (Klout) and the #1 cloud social influencer from 2015-2017 (Onalytica). He is recognised as a leading cloud blogger and was listed in the EMEA top 50 influencers in Data Centres, Cloud & Data 2017.


Suggested Blogs

What is AWS CLI and How to Install it?

Whether you are a small business or a big organisation, you must be familiar with Amazon Web Services. AWS is the world's most widely used cloud computing platform – something that lets you move faster, operate more securely and save substantial costs. AWS offers 165 services spanning a wide range including computing, tools for the Internet of Things, developer tools, deployment, analytics, database, networking, etc. The AWS Command Line Interface (CLI) is a unified tool to manage your AWS services.

Do you want to get AWS certified? Learn about the various AWS certifications in detail.

What is AWS CLI?
The AWS Command Line Interface (CLI) is a unified tool that manages the AWS services for you. You only have to download and configure one simple tool to control a plethora of AWS services. Services can be automated through scripts, which helps you implement a certain level of automation. It is indeed the AWS CLI that makes AWS so dynamic and easy to use.

How to Install AWS CLI?
Putting the AWS CLI to use involves a few steps:
  • There are a few different ways to install it – you need to choose what works for your system.
  • Every operating system has a different method of installation.
  • Some generic steps follow the installation.
  • Then comes the configuration.
  • You will also need to upgrade it from time to time.
Follow these steps to successfully install and configure the AWS CLI for use.

Ways to Install:
You can install the AWS Command Line Interface (AWS CLI) using:
  • pip
  • a virtual environment
  • a bundled installer

What do you need?
  • Unix, macOS, Linux, or Windows
  • Python 3 version 3.3+ or Python 2 version 2.6.5+
It is important to know that you may not be able to use an older version of Python with all AWS services. Update to a newer version if you see Insecure Platform Warnings or deprecation notices. To find out what has changed in each version, visit: https://github.com/aws/aws-cli/blob/master/CHANGELOG.rst

1. Installing the AWS CLI Using pip
Pip is the main distribution method for the AWS CLI on macOS, Windows and Linux. It is a package manager for Python.

Installing the current AWS CLI version
If you have pip and a supported version of Python, use the following command to install the AWS CLI. Use the pip3 command if you have Python version 3+ installed:
$ pip3 install awscli --upgrade --user
The --upgrade option tells pip3 to upgrade any requirements that are already installed. The --user option tells pip3 to install the program to a subdirectory of the user directory, which avoids modifying libraries used by your operating system.

Upgrading to the latest version
Use the pip list -o command to identify packages that are "outdated":
$ aws --version
aws-cli/1.16.170 Python/3.7.3 Linux/4.14.123-111.109.amzn2.x86_64 botocore/1.12.160
$ pip3 list -o
Package    Version   Latest    Type
---------- --------- --------- -----
awscli     1.16.170  1.16.198  wheel
botocore   1.12.160  1.12.188  wheel
Now, run pip install --upgrade to get the latest version:
$ pip3 install --upgrade --user awscli
Collecting awscli
  Downloading https://files.pythonhosted.org/packages/dc/70/b32e9534c32fe9331801449e1f7eacba6a1992c2e4af9c82ac9116661d3b/awscli-1.16.198-py2.py3-none-any.whl (1.7MB)
    |████████████████████████████████| 1.7MB 1.6MB/s
Collecting botocore==1.12.188 (from awscli)
  Using cached https://files.pythonhosted.org/packages/10/cb/8dcfb3e035a419f228df7d3a0eea5d52b528bde7ca162f62f3096a930472/botocore-1.12.188-py2.py3-none-any.whl
Requirement already satisfied, skipping upgrade: docutils>=0.10 in ./venv/lib/python3.7/site-packages (from awscli) (0.14)
Requirement already satisfied, skipping upgrade: rsa=3.1.2 in ./venv/lib/python3.7/site-packages (from awscli) (3.4.2)
Requirement already satisfied, skipping upgrade: colorama=0.2.5 in ./venv/lib/python3.7/site-packages (from awscli) (0.3.9)
Requirement already satisfied, skipping upgrade: PyYAML=3.10; python_version != "2.6" in ./venv/lib/python3.7/site-packages (from awscli) (3.13)
Requirement already satisfied, skipping upgrade: s3transfer=0.2.0 in ./venv/lib/python3.7/site-packages (from awscli) (0.2.0)
Requirement already satisfied, skipping upgrade: jmespath=0.7.1 in ./venv/lib/python3.7/site-packages (from botocore==1.12.188->awscli) (0.9.4)
Requirement already satisfied, skipping upgrade: urllib3=1.20; python_version >= "3.4" in ./venv/lib/python3.7/site-packages (from botocore==1.12.188->awscli) (1.24.3)
Requirement already satisfied, skipping upgrade: python-dateutil=2.1; python_version >= "2.7" in ./venv/lib/python3.7/site-packages (from botocore==1.12.188->awscli) (2.8.0)
Requirement already satisfied, skipping upgrade: pyasn1>=0.1.3 in ./venv/lib/python3.7/site-packages (from rsa=3.1.2->awscli) (0.4.5)
Requirement already satisfied, skipping upgrade: six>=1.5 in ./venv/lib/python3.7/site-packages (from python-dateutil=2.1; python_version >= "2.7"->botocore==1.12.188->awscli) (1.12.0)
Installing collected packages: botocore, awscli
  Found existing installation: botocore 1.12.160
    Uninstalling botocore-1.12.160:
      Successfully uninstalled botocore-1.12.160
  Found existing installation: awscli 1.16.170
    Uninstalling awscli-1.16.170:
      Successfully uninstalled awscli-1.16.170
Successfully installed awscli-1.16.198 botocore-1.12.188

2. Installing the AWS CLI in a Virtual Environment
Another option is to install the AWS CLI in a virtual environment to isolate the tool and its dependencies. You can also use a different Python version for this purpose.

3. Installing the AWS CLI Using an Installer
Use the bundled installer for automated and offline installation on Unix, macOS, and Linux. It includes the AWS CLI, its dependencies, and a shell script that performs the installation. For Windows, try the MSI installer. Both methods simplify the initial installation.

Installation on Linux or Unix
Both platforms have an identical installation process.
You need to have the latest version of Python. We recommend using the bundled installer for this. The steps are as follows:
1. Begin the installation:
$ curl "https://s3.amazonaws.com/aws-cli/awscli-bundle.zip" -o "awscli-bundle.zip"
2. Unzip the downloaded package:
$ unzip awscli-bundle.zip
3. Run the installation:
$ sudo ./awscli-bundle/install -i /usr/local/aws -b /usr/local/bin/aws
Using the -b option allows you to use the AWS CLI from any directory.

Installation on Amazon Linux
The AWS Command Line Interface comes preinstalled on both Amazon Linux and Amazon Linux 2. Below are the steps to bring it up to date:
1. Identify the currently installed version:
$ aws --version
aws-cli/1.16.116 Python/3.6.8 Linux/4.14.77-81.59.amzn2.x86_64 botocore/1.12.106
2. Use pip3 to install the latest version of the AWS CLI. If you run the command from within a Python virtual environment (venv), you don't need the --user option:
$ pip3 install --upgrade --user awscli
3. Add the install location to the beginning of the PATH variable:
$ export PATH=/home/ec2-user/.local/bin:$PATH
4. Verify that you're running the new version with aws --version:
$ aws --version
aws-cli/1.16.116 Python/3.6.8 Linux/4.14.77-81.59.amzn2.x86_64 botocore/1.12.106

Installation on Windows
The AWS Command Line Interface can be installed on Windows by using a standalone installer or through pip, a package manager for Python.

> Through Installer
1. Download the appropriate MSI installer.
2. Run the downloaded MSI installer or the setup file.
3. Follow the on-screen instructions.
By default, the CLI installs to C:\Program Files\Amazon\AWSCLI (64-bit version) or C:\Program Files (x86)\Amazon\AWSCLI (32-bit version).
To confirm the installation, use the aws --version command at a command prompt (open the Start menu and search for cmd to start a command prompt):
C:\> aws --version
aws-cli/1.16.116 Python/3.6.8 Windows/10 botocore/1.12.106
If Windows is unable to find the program, you might need to close and reopen the command prompt to refresh the path, or add the installation directory to your PATH environment variable manually.

> Through pip
1. Open Start menu → Command Prompt.
2. Verify that both Python and pip are installed correctly:
C:\> python --version
Python 3.7.1
C:\> pip3 --version
pip 18.1 from c:\program files\python37\lib\site-packages\pip (python 3.7)
3. Install the AWS CLI via pip:
C:\> pip3 install awscli
4. Check that the installation went right:
C:\> aws --version
aws-cli/1.16.116 Python/3.6.8 Windows/10 botocore/1.12.106
To upgrade to the latest version:
C:\> pip3 install --user --upgrade awscli

Installation on macOS
> Through Installer
1. Download the AWS CLI bundled installer:
$ curl "https://s3.amazonaws.com/aws-cli/awscli-bundle.zip" -o "awscli-bundle.zip"
2. Unzip the package:
$ unzip awscli-bundle.zip
3. Run the installation:
$ sudo ./awscli-bundle/install -i /usr/local/aws -b /usr/local/bin/aws
This command installs the AWS CLI to /usr/local/aws and creates the symlink aws in the /usr/local/bin directory. Using the -b option to create a symlink eliminates the need to specify the install directory in the user's $PATH variable, enabling users to run the AWS CLI by typing "aws" from any directory.
If you want to see an explanation of the -i and -b options, use the -h option:
$ ./awscli-bundle/install -h
Here are the commands summarized for easy cut and paste at the command line:
curl "https://s3.amazonaws.com/aws-cli/awscli-bundle.zip" -o "awscli-bundle.zip"
unzip awscli-bundle.zip
sudo ./awscli-bundle/install -i /usr/local/aws -b /usr/local/bin/aws

> Through pip
1. Download and install the latest version of Python from Python.org.
2. Download and run the pip3 installation script provided by the Python Packaging Authority:
$ curl -O https://bootstrap.pypa.io/get-pip.py
$ python3 get-pip.py --user
3. Use pip3 to install the AWS CLI. We recommend the pip3 command if you use Python version 3+:
$ pip3 install awscli --upgrade --user
4. See if the AWS CLI is installed correctly:
$ aws --version
AWS CLI 1.16.116 (Python 3.6.8)
To upgrade to the latest version, run:
$ pip3 install awscli --upgrade --user

Installation on Ubuntu
> Through APT Package Manager
1. Update the package repository cache:
$ sudo apt-get update
2. Install the AWS CLI:
$ sudo apt-get install awscli
Press y and then Enter to continue.
3. Now that it's installed, check that it's working properly:
$ aws --version

> Through pip
The AWS CLI being a Python module itself makes it easy for users who install it through pip to update it regularly. Assuming you have Python 3, follow these steps:
1. Install Python pip:
$ sudo apt-get install python3-pip
Press y and then Enter to continue.
2. Install the AWS CLI using pip:
$ pip3 install awscli --upgrade --user
3. Run the AWS CLI with the following command:
$ python3 -m awscli --version

After Installation
After you have successfully installed the AWS CLI, you need to set the PATH to include the AWS CLI in your system.

> Linux
Find out the folder in which pip installed the AWS CLI:
$ which aws
/home/username/.local/bin/aws
You can reference this as ~/.local/bin/, because /home/username corresponds to ~ on Linux.
If you don't know where Python is installed, run this command:
$ which python
/usr/local/bin/python
If this is the same folder you added to the path while installing pip, there's nothing else to be done.
Otherwise, perform those same steps again, adding this additional folder to the path.

> Windows
The Windows System PATH tells your PC where it can find specific directories. Find out where the aws program is installed:
C:\> where aws
C:\Program Files\Amazon\AWSCLI\bin\aws.exe
If you installed through pip, it may instead be:
C:\Program Files\Python37\Scripts\aws
If the command returns the following error, then it is not in the system PATH and you can't run it by typing its name:
C:\> where aws
INFO: Could not find files for the given pattern.
In that case, you need to add the path manually. First, search for where it is installed on your computer:
C:\> where /R c:\ aws
c:\Program Files\Amazon\AWSCLI\bin\aws.exe
c:\Program Files\Amazon\AWSCLI\bincompat\aws.cmd
c:\Program Files\Amazon\AWSCLI\runtime\Scripts\aws
c:\Program Files\Amazon\AWSCLI\runtime\Scripts\aws.cmd
...
To modify your PATH variable (Windows):
1. Press the Windows key and enter "environment variables".
2. Choose Edit environment variables for your account.
3. Choose PATH → Edit.
4. Add the path to the Variable value field, for example: C:\new\path
5. Click OK twice to apply the new settings.
6. Close any running command prompts and reopen the command prompt window.

> macOS
Locate Python:
$ which python
/usr/local/bin/python
The output might be the path to a symlink rather than the actual program. Run ls -al to see where it points:
$ ls -al /usr/local/bin/python
/usr/local/bin/python -> ~/Library/Python/3.7/bin/python3.7
Pip installs programs in the same folder as the Python application, so add this folder to your PATH variable.
To modify the PATH variable on macOS (and Linux or Unix):
1. Find the shell profile script in your user folder. If you don't know which shell you have, run echo $SHELL.
2. Add an export command to the profile script:
export PATH=~/.local/bin:$PATH
This adds the path (~/.local/bin in this example) to the current PATH variable.
3. Load the updated profile into your current session:
$ source ~/.bash_profile
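The PATH bookkeeping above can be wrapped in a small helper. This is a generic sketch (the directory follows the Linux example above; adjust for your platform) that appends a directory only if it is not already present, so repeated profile loads don't bloat PATH:

```shell
#!/bin/sh
# Append a directory to PATH only if it is not already there.
add_to_path() {
  dir="$1"
  case ":$PATH:" in
    *":$dir:"*) echo "already in PATH: $dir" ;;
    *) PATH="$dir:$PATH"; export PATH; echo "added to PATH: $dir" ;;
  esac
}

add_to_path "$HOME/.local/bin"   # where pip --user installs on Linux
add_to_path "$HOME/.local/bin"   # second call is a harmless no-op
```

Put the add_to_path call (or a plain export line, as shown earlier) in your shell profile so the change survives new sessions.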

Test Drive Your First Istio Deployment Using the Play with Kubernetes Platform

As a full-stack developer, if you have been spending a lot of time developing apps recently, you already understand a whole new set of challenges related to the microservice architecture. Although there has been a shift from bloated monolithic apps to compact, focused microservices for faster implementation and improved resiliency, developers now have to worry about the challenges of integrating these services in distributed systems, which includes accountability for service discovery, load balancing, registration, fault tolerance, monitoring, routing, compliance, and security.

Let us understand the challenges faced by developers and operators with the microservice architecture in detail. Consider a 1st-generation service mesh scenario, as shown in the figures below. Service (A) communicates with Service (B). Instead of communicating directly, the request gets routed via Nginx. Nginx finds a route in Consul (a service discovery tool) and automatically retries the connection when HTTP 502s happen.

Figure 1.0 – 1st Gen Service Mesh
Figure 1.1 – Cascading failure demonstrated with the increase in the number of services

With the advent of the microservices architecture, the number of services has been growing ever since.
Below are the challenges encountered by both developers and the operations team:
  • How to make these growing microservices communicate with each other?
  • Enabling load-balancing architectures over these microservices.
  • Providing role-based routing for the microservices.
  • How to manage outgoing traffic on these microservices and test canary deployments?
  • Managing the complexity around these growing pieces of microservices.
  • Implementing fine-grained control of traffic behaviour with rich routing rules.
  • Implementing traffic encryption, service-to-service authentication, and strong identity assertions.
In a nutshell, although you could build service discovery and retry logic into the application or networking middleware, service discovery remains tricky to get right.

Enter Istio's Service Mesh
"Service mesh" is one of the hottest buzzwords of 2018. As the name suggests, it's a configurable infrastructure layer for a microservices app. It lays out the network of microservices that make up such applications and enables interactions between them, making communication between service instances flexible, reliable, and fast. The mesh provides service discovery, load balancing, encryption, authentication and authorization, support for the circuit-breaker pattern, and other capabilities.

Istio is a completely open-source service mesh that layers transparently onto existing distributed applications. Istio v1.0 was announced last month and is ready for production. It is written entirely in Go, and it is a fully grown platform that provides APIs to integrate with any logging, telemetry, or policy system. The project adds only a very tiny overhead to your system and is hosted on GitHub.
Istio's diverse feature set lets you successfully and efficiently run a distributed microservice architecture, and provides a uniform way to secure, connect, and monitor microservices.

Figure 1.2 – Istio's capability

What benefits does Istio bring?
  • Istio lets you connect, secure, control, and observe services.
  • It helps reduce the complexity of service deployments and eases the strain on your development teams.
  • It gives developers and DevOps fine-grained visibility and control over traffic without requiring any changes to application code.
  • It provides CIOs with the tools needed to help enforce security and compliance requirements across the enterprise.
  • It provides behavioural insights and operational control over the service mesh as a whole.
  • Istio makes it easy to create a network of deployed services with automatic load balancing for HTTP, gRPC, WebSocket and TCP traffic.
  • It provides fine-grained control of traffic behaviour with rich routing rules, retries, failovers, and fault injection.
  • It enables a pluggable policy layer and configuration API supporting access controls, rate limits and quotas.
  • Istio provides automatic metrics, logs, and traces for all traffic within a cluster, including cluster ingress and egress.
  • It provides secure service-to-service communication in a cluster with strong identity-based authentication and authorization.
If you want to deep-dive into the Istio architecture, I highly recommend the official Istio website.

It's Demo Time!
In this blog post, I will showcase how Istio can be set up on the Play with Kubernetes (PWK) platform free of cost. In case you're new to it, Play with Kubernetes, aka PWK, is a labs site provided by Docker. It is a playground which allows users to run K8s clusters in a matter of seconds.
It gives you the experience of a free CentOS Linux virtual machine in the browser. Under the hood, Docker-in-Docker (DinD) is used to give the effect of multiple VMs/PCs.

Open the Kubernetes Playground and click on the Login button to authenticate with a Docker Hub or GitHub ID. Once you start the session, you will have your own lab environment.

Adding the First Kubernetes Node

Click on "Add New Instance" on the left to build your first Kubernetes cluster node. It is automatically named "node1". Each instance comes with Docker Community Edition (CE) and kubeadm pre-installed. This node will be treated as the master node of our cluster.

Bootstrapping the Master Node

You can bootstrap the Kubernetes cluster by initializing the master node (node1) with the script below. Copy the script content into a bootstrap.sh file and make it executable with "chmod +x bootstrap.sh".

When you execute this script, kubeadm writes the several configuration files needed, sets up RBAC, and deploys the Kubernetes control-plane components (kube-apiserver, kube-dns, kube-proxy, etcd, etc.). The control-plane components are deployed as Docker containers.

Copy the kubeadm join token command from the output and save it for a later step; it will be used to join the other nodes to your cluster.

Adding Worker Nodes

Click on "Add New Instance" to add a new worker node.

Checking the Cluster Status

Verifying the Running Pods

Installing Istio 1.0.0

Istio is deployed in a separate Kubernetes namespace, istio-system; we will verify this later. For now, copy the content below into a file called install_istio.sh and save it.
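The bootstrap script itself did not survive in this copy of the post, so here is a sketch of a typical PWK bootstrap.sh based on the playground's standard template; treat the pod CIDR and the kube-router manifest URL as assumptions rather than the author's exact script:

```shell
#!/bin/sh
# Sketch of a typical Play-with-Kubernetes bootstrap script (approximation;
# the original bootstrap.sh was not preserved in this post).

# Initialize the control plane on this node.
kubeadm init --apiserver-advertise-address "$(hostname -i)" \
             --pod-network-cidr 10.5.0.0/16

# Make kubectl usable for the current user.
mkdir -p "$HOME/.kube"
cp -i /etc/kubernetes/admin.conf "$HOME/.kube/config"
chown "$(id -u):$(id -g)" "$HOME/.kube/config"

# Install a pod network add-on (PWK's template uses a kube-router manifest).
kubectl apply -f https://raw.githubusercontent.com/cloudnativelabs/kube-router/master/daemonset/kubeadm-kuberouter.yaml
```

The `kubeadm init` output ends with the `kubeadm join --token ...` command mentioned above; that is the line to copy for the worker nodes.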
Make it executable and run it to install Istio and the related tools. You should see the screen flood with output. As shown in that output, Prometheus, ServiceGraph, Jaeger, Grafana, and Zipkin are enabled by default.

Please note: while executing this script, it may end with the error message below. This error message is expected.

unable to recognize "install/kubernetes/istio-demo.yaml": no matches for admissionregistration.k8s.io/, Kind=MutatingWebhookConfiguration

As soon as the command completes, you should see a long list of ports displayed at the top center of the page.

Verifying the Services

Exposing the Services

To expose the Prometheus, Grafana, and ServiceGraph services, delete the existing services and recreate them with type NodePort instead of ClusterIP, so that each service can be accessed through the port displayed at the top of the instance page. You should then be able to access the Grafana page by clicking on port "30004" and the Prometheus page by clicking on port "30003".

You can check Prometheus metrics by selecting the metric you need from the drop-down. On the Grafana page, add a "Data Source" for Prometheus and verify that the dashboard is up and running.

Congratulations! You have installed Istio on a Kubernetes cluster. The following components are now running on the K8s playground:

- Istio controllers and related RBAC rules
- Istio Custom Resource Definitions
- Prometheus and Grafana for monitoring
- Jaeger for distributed tracing
- Istio sidecar injector (we'll take a look at this in the next section)

Installing Istioctl

Istioctl is the command-line configuration utility of Istio. It helps you create, list, modify, and delete configuration resources in the Istio system.

Deploying the Sample BookInfo Application

Now that Istio is installed and verified, you can deploy one of the sample applications provided with the installation: BookInfo.
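The install script was also lost in this copy of the post. The sketch below follows the Istio 1.0 installation docs of the time; the `git.io` download shortcut comes from those docs, and the NodePort patches are one way (an alternative to delete-and-recreate) to pin Grafana and Prometheus to the 30004/30003 ports mentioned above:

```shell
#!/bin/sh
# Sketch of an install_istio.sh for Istio 1.0.0 (approximation; the original
# script was not preserved in this post).

# Download and unpack Istio 1.0.0, and put istioctl on the PATH.
curl -L https://git.io/getLatestIstio | ISTIO_VERSION=1.0.0 sh -
cd istio-1.0.0
export PATH="$PWD/bin:$PATH"

# Install the demo profile: Istio control plane plus Prometheus, Grafana,
# Jaeger, ServiceGraph, and Zipkin.
kubectl apply -f install/kubernetes/istio-demo.yaml

# Expose Grafana and Prometheus as NodePort services so PWK surfaces their
# ports at the top of the page (30004 and 30003 match the walkthrough).
kubectl -n istio-system patch svc grafana \
  -p '{"spec": {"type": "NodePort", "ports": [{"port": 3000, "nodePort": 30004}]}}'
kubectl -n istio-system patch svc prometheus \
  -p '{"spec": {"type": "NodePort", "ports": [{"port": 9090, "nodePort": 30003}]}}'
```

Running `kubectl get pods -n istio-system` afterwards is a quick way to confirm that all Istio components reach the Running state.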
This is a simple mock bookstore application made up of four services that provide a web product page, book details, reviews (with several versions of the review service), and ratings, all managed using Istio.

Deploying the BookInfo Services

Defining the Ingress Gateway

Verifying the BookInfo Application

Accessing It via a Web URL

You should now be able to see the BookInfo sample application in your browser.

I hope this Istio deployment tutorial helped you install Istio on Kubernetes successfully. In a future blog post, I will deep-dive into Istio's internal architecture, traffic management, policies, and telemetry in detail. If you want to know more and get certified, you can try the AWS certification course offered by KnowledgeHut.
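The BookInfo manifests referenced by the headings above were not preserved here; the steps can be sketched with the standard Istio 1.0 sample paths, run from the istio-1.0.0 directory (treat the paths as assumptions if your layout differs):

```shell
#!/bin/sh
# Sketch of the BookInfo deployment steps from the Istio 1.0 samples,
# run from inside the istio-1.0.0 directory.

# Enable automatic sidecar injection in the default namespace.
kubectl label namespace default istio-injection=enabled

# Deploy the four BookInfo services: productpage, details, reviews, ratings.
kubectl apply -f samples/bookinfo/platform/kube/bookinfo.yaml

# Define the ingress gateway and routing for the application.
kubectl apply -f samples/bookinfo/networking/bookinfo-gateway.yaml

# Verify that the pods and services are up before opening the web URL.
kubectl get pods
kubectl get svc
```

Once all pods are Running, the product page is reachable through the ingress gateway port that PWK displays at the top of the page.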
SSHing into Ubuntu EC2 instance post blocking port 22 with UFW

Introduction

This blog covers a troubleshooting situation in Amazon Web Services: you have configured firewall settings on your Ubuntu EC2 (or other remote) instance and can no longer log in over SSH via PuTTY. Here, we will see how to get SSH access back when you have been locked out of the instance.

While hardening an instance, we may accidentally or deliberately block SSH. But what if we then need to SSH into the same instance again to make changes? In the configuration shown below, the security group has all ports open to everything.

Here are the configuration changes made after logging into the instance:

$ sudo apt-get update
$ sudo apt-get install nginx
$ sudo apt-get install ufw

Check UFW Status and Rules

At any time, you can check the status of UFW with this command:

$ sudo ufw status verbose

By default, UFW is disabled, so you should see something like this:

Status: inactive

If UFW is active, the output will say so and will list the rules that are set.
For example, if the firewall is set to allow SSH (port 22) connections from anywhere, the output might look something like this:

Status: active
Logging: on (low)
Default: deny (incoming), allow (outgoing), disabled (routed)
New profiles: skip

To                         Action      From
--                         ------      ----
22/tcp                     ALLOW IN    Anywhere

Now deny SSH and check the status again:

$ sudo ufw deny ssh
$ sudo ufw status verbose

Status: active
Logging: on (low)
Default: deny (incoming), allow (outgoing), disabled (routed)
New profiles: skip

To                         Action      From
--                         ------      ----
22/tcp                     DENY        Anywhere

If you are kicked out of, or log out of, the instance after these changes, SSHing into the instance via its public DNS through PuTTY produces the error "Network error: Connection timed out". Even though all ports are open in the security group, the instance cannot be reached over SSH because of Ubuntu's firewall software at the operating-system level. Let's see how to resolve this kind of system-level issue.

Solution to the issue:

Step 1: Take an image of the EC2 instance by selecting the instance, then Image > Create Image.
Step 2: Provide the image name, tick "No reboot", and push the Create Image button.
Step 3: Select the image and click on Launch.
Step 4: Choose an instance type and click Next.
Step 5: In the Configure Instance step, enter the commands below under Advanced Details (user data) and click Next:

#!/bin/bash
sudo ufw allow ssh
sudo ufw allow 22
sudo ufw allow 443
sudo ufw allow 8080
sudo ufw allow 80
sudo ufw status
sudo ufw enable

Step 6: Click through the next tabs and add a security group similar to the one shown earlier.
Step 7: Review and launch the instance, then try to SSH into it through PuTTY.
You will now be able to SSH into the new instance, and you can terminate the old one: the new instance has the same setup as the old instance, except for its public and private IP addresses.

Best Practices for Firewall Configuration and Port Blocking

- Ensure that security groups allow only specific IP addresses that fall within the VPN range of the environment.
- Use network ACLs (NACLs) to allow or block IP addresses or subnets on a specific port using allow and deny rules. A network ACL contains a numbered list of rules that are evaluated in order, starting with the lowest-numbered rule, to determine whether traffic is allowed into or out of any subnet associated with the network ACL. The highest number you can use for a rule is 32766. It is recommended to create rules in increments (for example, increments of 10 or 100) so that you can insert new rules later.
- Using a bastion host to access critical servers and environments is always a better option for increasing the security of the system.

We hope this article gave you a clear understanding of how these AWS services work. To learn more about AWS, you can check out the other blogs and the AWS certification course by KnowledgeHut.
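The NACL rule-numbering advice can be made concrete with the AWS CLI. The sketch below adds an SSH allow rule at number 100, leaving room on either side for later inserts; the ACL ID and CIDR range are placeholder values, not from the original post:

```shell
#!/bin/sh
# Sketch: allow inbound SSH from a VPN range at rule number 100, leaving
# rule numbers 1-99 and 101-199 free for rules inserted later.
# acl-0123456789abcdef0 and 10.0.0.0/16 are placeholders.
aws ec2 create-network-acl-entry \
    --network-acl-id acl-0123456789abcdef0 \
    --rule-number 100 \
    --protocol tcp \
    --port-range From=22,To=22 \
    --cidr-block 10.0.0.0/16 \
    --rule-action allow \
    --ingress
```

Because NACL rules are evaluated lowest number first, a later `--rule-number 90 --rule-action deny` entry for a narrower range would take precedence without renumbering anything.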