What is AWS CLI and How to Install it?

  • by Joydip Kumar
  • 10th Sep, 2019
  • Last updated on 11th Mar, 2021
  • 14 mins read

Whether you run a small business or a large organisation, you have probably come across Amazon Web Services. AWS is the world’s most widely used cloud computing platform, one that lets you move faster, operate more securely, and cut costs substantially. AWS offers 165 services spanning a wide range including computing, tools for the Internet of Things, developer tools, deployment, analytics, databases, networking, and more. The AWS Command Line Interface (CLI) is a unified tool to manage your AWS services.


What is AWS CLI?

The AWS Command Line Interface (CLI) is a unified tool for managing your AWS services. You download and configure a single tool, and from it you can control a plethora of AWS services. Because every operation is available as a command, routine tasks can be automated through scripts. It is this scriptability that makes AWS so dynamic and easy to use.
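As a sketch of that scriptability (the bucket name my-example-bucket is hypothetical), a routine task such as backing up a folder to S3 becomes a single scripted command:

```
$ aws s3 sync ~/reports s3://my-example-bucket/reports
```

Dropped into a cron job or CI pipeline, that one line replaces a manual upload through the web console.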

How to Install AWS CLI?

Putting the AWS CLI to use in your AWS environment involves a few steps.

  • There are a few different ways to install it – you need to choose what works for your system.
  • Every operating system has a different method of installation.
  • Some generic steps are followed after the installation.
  • Then comes the configuration.
  • You will also need to upgrade it periodically.

Follow these steps to successfully install and configure the AWS CLI for use.

Ways to Install:

You can effectively install the AWS Command Line Interface (AWS CLI) using:


  1. pip
  2. a virtual environment
  3. a bundled installer

What do you need?

  • Unix, macOS, Linux, Windows
  • Python 3 version 3.3+ or Python 2 version 2.6.5+

It is important to know that you may not be able to use an older version of Python with all AWS services. Update to a newer version if you see an InsecurePlatformWarning or deprecation notices. To see what the latest released version of the AWS CLI is, check the changelog:

  https://github.com/aws/aws-cli/blob/master/CHANGELOG.rst
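Before installing, you can confirm your interpreter meets that minimum with a quick check (a sketch; substitute python for python3 if that is your interpreter's name):

```shell
# Print the interpreter version for inspection
python3 --version
# Exit 0 if the interpreter is at least Python 3.3, 1 otherwise
python3 -c 'import sys; sys.exit(0 if sys.version_info >= (3, 3) else 1)' \
  && echo "Python is new enough for the AWS CLI" \
  || echo "Upgrade Python before installing the AWS CLI"
```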

1. Installing the AWS CLI Using pip

Pip is the main distribution method for the AWS CLI on macOS, Windows and Linux. It is a package manager for Python.

Installing the current AWS CLI Version

If you have pip and a supported version of Python, use the following command to install the AWS CLI. Use the pip3 command if you have Python version 3+ installed:

$ pip3 install awscli --upgrade --user

The --upgrade option tells pip3 to upgrade any requirements that are already installed. The --user option tells pip3 to install the program into a subdirectory of your user directory, which avoids modifying libraries used by your operating system.

Upgrading to the latest version

Use the pip list -o command to identify packages that are "outdated":

$ aws --version
aws-cli/1.16.170 Python/3.7.3 Linux/4.14.123-111.109.amzn2.x86_64 botocore/1.12.160

$ pip3 list -o

Package    Version   Latest    Type
---------- --------- --------- -----
awscli     1.16.170  1.16.198  wheel
botocore   1.12.160  1.12.188  wheel

Now, run pip install --upgrade to get the latest version:

$ pip3 install --upgrade --user awscli
Collecting awscli
Downloading https://files.pythonhosted.org/packages/dc/70/b32e9534c32fe9331801449e1f7eacba6a1992c2e4af9c82ac9116661d3b/awscli-1.16.198-py2.py3-none-any.whl (1.7MB)
     |████████████████████████████████| 1.7MB 1.6MB/s 
Collecting botocore==1.12.188 (from awscli)
Using cached https://files.pythonhosted.org/packages/10/cb/8dcfb3e035a419f228df7d3a0eea5d52b528bde7ca162f62f3096a930472/botocore-1.12.188-py2.py3-none-any.whl
Requirement already satisfied, skipping upgrade: docutils>=0.10 in ./venv/lib/python3.7/site-packages (from awscli) (0.14)
Requirement already satisfied, skipping upgrade: rsa<=3.5.0,>=3.1.2 in ./venv/lib/python3.7/site-packages (from awscli) (3.4.2)
Requirement already satisfied, skipping upgrade: colorama<=0.3.9,>=0.2.5 in ./venv/lib/python3.7/site-packages (from awscli) (0.3.9)
Requirement already satisfied, skipping upgrade: PyYAML<=5.1,>=3.10; python_version != "2.6" in ./venv/lib/python3.7/site-packages (from awscli) (3.13)
Requirement already satisfied, skipping upgrade: s3transfer<0.3.0,>=0.2.0 in ./venv/lib/python3.7/site-packages (from awscli) (0.2.0)
Requirement already satisfied, skipping upgrade: jmespath<1.0.0,>=0.7.1 in ./venv/lib/python3.7/site-packages (from botocore==1.12.188->awscli) (0.9.4)
Requirement already satisfied, skipping upgrade: urllib3<1.26,>=1.20; python_version >= "3.4" in ./venv/lib/python3.7/site-packages (from botocore==1.12.188->awscli) (1.24.3)
Requirement already satisfied, skipping upgrade: python-dateutil<3.0.0,>=2.1; python_version >= "2.7" in ./venv/lib/python3.7/site-packages (from botocore==1.12.188->awscli) (2.8.0)
Requirement already satisfied, skipping upgrade: pyasn1>=0.1.3 in ./venv/lib/python3.7/site-packages (from rsa<=3.5.0,>=3.1.2->awscli) (0.4.5)
Requirement already satisfied, skipping upgrade: six>=1.5 in ./venv/lib/python3.7/site-packages (from python-dateutil<3.0.0,>=2.1; python_version >= "2.7"->botocore==1.12.188->awscli) (1.12.0)
Installing collected packages: botocore, awscli
Found existing installation: botocore 1.12.160
Uninstalling botocore-1.12.160:
Successfully uninstalled botocore-1.12.160
Found existing installation: awscli 1.16.170
Uninstalling awscli-1.16.170:
Successfully uninstalled awscli-1.16.170
Successfully installed awscli-1.16.198 botocore-1.12.188

2. Installing the AWS CLI in a Virtual Environment

Another option is to install the AWS CLI in a virtual environment, which isolates the tool and its dependencies from the rest of the system. You can also use a different Python version than the system one for this purpose.
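A minimal sketch of that route (the directory name awscli-venv is just an example; on Windows the activate script lives under Scripts\ instead of bin/):

```
$ python3 -m venv awscli-venv
$ source awscli-venv/bin/activate
(awscli-venv) $ pip install awscli
(awscli-venv) $ aws --version
(awscli-venv) $ deactivate
```

Because the tool and its dependencies live entirely inside awscli-venv, deleting that directory removes the installation cleanly.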

3. Installing the AWS CLI Using an Installer

Use the bundled installer for automated and offline installation on Unix, macOS, and Linux. It includes the AWS CLI, its dependencies, and a shell script that performs the installation. On Windows, use the MSI installer instead. Both methods simplify the initial installation. 

Installation on Linux or Unix

Both platforms have an identical installation process. You need a recent version of Python. We recommend using the bundled installer here. The steps are as follows:

1. To begin the installation:

curl "https://s3.amazonaws.com/aws-cli/awscli-bundle.zip" -o "awscli-bundle.zip"

2. Unzip the downloaded package:

unzip awscli-bundle.zip

3. Run the installation:

sudo ./awscli-bundle/install -i /usr/local/aws -b /usr/local/bin/aws

Using the -b option allows you to use the AWS CLI from any directory.

Installation on Amazon Linux

The AWS Command Line Interface comes preinstalled on both Amazon Linux and Amazon Linux 2. Below are the steps to upgrade it to the latest version:

1. Identify currently installed version:

$ aws --version
aws-cli/1.16.116 Python/3.6.8 Linux/4.14.77-81.59.amzn2.x86_64 botocore/1.12.106

2. Use pip3 to install the latest version of the AWS CLI. If you run the command from within a Python virtual environment (venv), then you don't need to use the --user option.

$ pip3 install --upgrade --user awscli

3. Add the install location to the beginning of the PATH variable.

$ export PATH=/home/ec2-user/.local/bin:$PATH

4. Verify that you're running the new version with aws --version.

$ aws --version
aws-cli/1.16.116 Python/3.6.8 Linux/4.14.77-81.59.amzn2.x86_64 botocore/1.12.106

Installation on Windows

The AWS Command Line Interface can be installed on Windows using a standalone MSI installer or through pip, a package manager for Python.

> Through Installer

  1. Download the appropriate MSI installer.
  2. Run the downloaded MSI installer or the setup file.
  3. Follow the on-screen instructions.

By default, the CLI installs to C:\Program Files\Amazon\AWSCLI (64-bit version) or C:\Program Files (x86)\Amazon\AWSCLI (32-bit version). To confirm the installation, use the aws --version command at a command prompt (open the Start menu and search for cmd to start a command prompt).

C:\> aws --version
aws-cli/1.16.116 Python/3.6.8 Windows/10 botocore/1.12.106

If Windows is unable to find the program, you might need to close and reopen the command prompt to refresh the path, or add the installation directory to your PATH environment variable manually.

> Through Pip

1. Open the Start menu and launch a Command Prompt
2. Verify that both Python and pip are installed correctly:

C:\> python --version
Python 3.7.1
C:\> pip3 --version
pip 18.1 from c:\program files\python37\lib\site-packages\pip (python 3.7)

3. Install AWS CLI via pip

C:\> pip3 install awscli

4. Check if the installation went right

C:\> aws --version
aws-cli/1.16.116 Python/3.6.8 Windows/10 botocore/1.12.106

To upgrade to the latest version:

C:\> pip3 install --user --upgrade awscli

Installation on macOS

> Through Installer

1. Download the  AWS CLI Bundled Installer

$ curl "https://s3.amazonaws.com/aws-cli/awscli-bundle.zip" -o "awscli-bundle.zip"

2. Unzip the package

$ unzip awscli-bundle.zip

3. Run the installation

$ sudo ./awscli-bundle/install -i /usr/local/aws -b /usr/local/bin/aws

This command installs the AWS CLI to /usr/local/aws and creates the symlink aws in the /usr/local/bin directory. Using the -b option to create a symlink eliminates the need to specify the install directory in the user's $PATH variable. It enables users to run the AWS CLI by typing “aws” from any directory.
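You can confirm the symlink was created as expected (output abbreviated; the target shown assumes the default -i location from the command above):

```
$ ls -l /usr/local/bin/aws
/usr/local/bin/aws -> /usr/local/aws/bin/aws
```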

To see an explanation of the -i and -b options, use the -h option:

$ ./awscli-bundle/install -h

The commands are summarized here for easy copy and paste at the command line:

curl "https://s3.amazonaws.com/aws-cli/awscli-bundle.zip" -o "awscli-bundle.zip"
unzip awscli-bundle.zip
sudo ./awscli-bundle/install -i /usr/local/aws -b /usr/local/bin/aws

> Through PIP

1. Download and install the latest version of Python from Python.org.
2. Download and run the pip3 installation script provided by the Python Packaging Authority

$ curl -O https://bootstrap.pypa.io/get-pip.py
$ python3 get-pip.py --user

3. Use pip3 to install the AWS CLI. We recommend using the pip3 command if you use Python version 3+

$ pip3 install awscli --upgrade --user

4. See if AWS CLI is installed correctly

$ aws --version
AWS CLI 1.16.116 (Python 3.6.8)

To upgrade to the latest version, run the command:

$ pip3 install awscli --upgrade --user

Installation on Ubuntu

> Through APT Package Manager

1. Update the package repository cache

$ sudo apt-get update

2. Install AWS CLI with the following command

$ sudo apt-get install awscli

Press y and then press <Enter> to continue.

3. Now that it’s installed, check if it’s working properly or not

$ aws --version

> Through PIP

Since the AWS CLI is itself a Python package, installing it through pip makes it easy to update on a regular basis. Assuming you have Python 3, follow the steps below to install it:

1. Install Python PIP with the following command

$ sudo apt-get install python3-pip

Press y and then press <Enter> to continue.

2. Install AWS CLI using PIP with the following command

$ pip3 install awscli --upgrade --user

3. Verify the installation. With a --user install the executable lands in ~/.local/bin, so use the full path if that directory is not yet on your PATH

$ ~/.local/bin/aws --version

After Installation

After you have successfully installed the AWS CLI, you need to add it to the PATH on your system.

> LINUX

Find out the folder in which pip installed the AWS CLI:

$ which aws
/home/username/.local/bin/aws

You can reference this as ~/.local/bin/ because /home/username corresponds to ~ on Linux.
If you don't know where Python is installed, run this command:

$ which python
/usr/local/bin/python

If this is the same folder you added to the path while installing pip, there’s nothing else to be done. Otherwise, perform those same steps again, adding this additional folder to the path.
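The addition itself is one line in your shell profile; this sketch assumes bash and ~/.bashrc (use your own profile file if it differs):

```shell
# Persist pip's per-user bin directory on PATH for future shells
echo 'export PATH="$HOME/.local/bin:$PATH"' >> ~/.bashrc
# Apply the same change to the current shell immediately
export PATH="$HOME/.local/bin:$PATH"
# Confirm the directory is now searched
echo "$PATH" | grep -q "$HOME/.local/bin" && echo "PATH updated"
```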

> WINDOWS

The Windows system PATH tells your PC where it can find specific programs. Find out where the aws program is installed:

C:\> where aws
C:\Program Files\Amazon\AWSCLI\bin\aws.exe

A pip installation places it under your Python Scripts folder instead, for example C:\Program Files\Python37\Scripts\aws.

If the command returns the following error, the program is not in the system PATH and you can't run it by typing its name:

C:\> where aws
INFO: Could not find files for the given pattern.

In that case, you need to add the path manually. First, you need to search where it is installed on your computer:

C:\> where /R c:\ aws
c:\Program Files\Amazon\AWSCLI\bin\aws.exe
c:\Program Files\Amazon\AWSCLI\bincompat\aws.cmd
c:\Program Files\Amazon\AWSCLI\runtime\Scripts\aws
c:\Program Files\Amazon\AWSCLI\runtime\Scripts\aws.cmd
...
To modify your PATH variable (Windows)

  1. Press the Windows key and enter environment variables.
  2. Choose the Edit environment variables for your account.
  3. Select PATH and choose Edit.
  4. Add the path to the Variable value field. For example: C:\new\path
  5. Click OK twice to apply the new settings.
  6. Close any running command prompts and reopen the command prompt window.
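The same change can be scripted from a command prompt with setx, which writes the user-level PATH permanently. Note that setx truncates values longer than 1024 characters, so the dialog route above is safer for long paths; the directory below is an example:

```
C:\> setx PATH "%PATH%;C:\Program Files\Amazon\AWSCLI\bin"
```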

> MAC OS

Locate Python

$ which python
/usr/local/bin/python

The output might be the path to a symlink rather than the actual program. Run ls -al to see where it points:

$ ls -al /usr/local/bin/python
/usr/local/bin/python -> ~/Library/Python/3.7/bin/python3.7

Pip installs programs in the same folder as the Python application, so add this folder to your PATH variable.

To modify the PATH variable for macOS (and Linux or Unix):

1. Find the shell profile script in your user folder. If you don't know which shell you have, run echo $SHELL

2. Add an export command to the profile script

export PATH=~/.local/bin:$PATH

This adds a path, ~/.local/bin in this example, to the current PATH variable.

3. Load the updated profile into your current session

$ source ~/.bash_profile
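With the CLI installed and on your PATH, the remaining step from the overview is configuration. Running aws configure prompts for your credentials and defaults; the access keys below are AWS's documentation placeholders, not real credentials:

```
$ aws configure
AWS Access Key ID [None]: AKIAIOSFODNN7EXAMPLE
AWS Secret Access Key [None]: wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
Default region name [None]: us-east-1
Default output format [None]: json
```

The values are stored under ~/.aws/, after which commands such as aws s3 ls run against your account.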


Joydip Kumar

Solution Architect

Joydip is passionate about building cloud-based applications and has been providing solutions to various multinational clients. A Java programmer and an AWS certified cloud architect, he loves to design, develop, and integrate solutions. Amidst his busy work schedule, Joydip loves to spend time writing blogs and contributing to the open-source community.


Website : https://geeks18.com/

Join the Discussion

Your email address will not be published. Required fields are marked *

Suggested Blogs

What are the Best Free Cloud Storages in 2022?

Cloud storage helps slash data on hardware in a remote physical location, which can be accessed from any device with the help of the internet. Below are some of the best free cloud storage in 2019: 1. Microsoft OneDrive: Microsoft provides a cloud storage service and offers 15 GB of free storage to its users. OneDrive, earlier called Skydrive, can be used to store important files securely at one location and these can be accessed from anywhere and from any compatible system. Cloud storage works like a traditional hard drive, the difference being that it is available online and can be accessed from anywhere around the world. Also, companies offer to extend the storage capacity if required. Files of any format can be stored in Microsoft OneDrive, though it is better for Office documents. It offers cross-device and cross-platform design, which implies that you can resume your work from where you left without the need for saving it or making a copy of it. Due to its Office365 integration, it is easy to share your content and files with other users and hence facilitates collaboration among a team. One prominent feature of Microsoft OneDrive is Files on demand. This means you can access your files stored in the cloud from anywhere, without downloading them to your local device which saves a lot of local storage. Since the files are stored and changes are saved in the cloud, the recovery is very convenient in case the system you are working on gets crashed, damaged or stolen. You can easily upgrade your account by buying extra storage in case you are running low on memory.  The important features of Microsoft OneDrive: Create and share folders Save notebooks to OneDrive View Office documents online Upload multimedia files from your personal mobile phones. 
Real time co-authoring File type support Desktop synchronization Search and discovery tools Device reports Documents storage and tracking Permission management Photo management Microsoft Office integration Device specific selective sync 2. Dropbox: Dropbox is operated by the American company Dropbox, Inc., and provides cloud storage and is used for file sharing and collaboration. Dropbox is available as an application for desktop for Windows, Linux and Macintosh operating systems. It is also available as an app for Android, iPad, iPhone and Blackberry. It provides a limited storage capacity of 2GB, which can be upgraded up to 100 GB by purchasing from several plans offered. The upgraded version of Dropbox is called Dropbox Plus. Upgrading to this version offers many benefits like automatic camera roll back up, which implies that all the photos that you click from your personal mobile phone get backed in Dropbox.  Dropbox provides a robust API that allows other apps to collaborate with it and use it for file storage and sync. Dropbox does not require a complicated upgrade or timely installations. It is mainly focused on file exchange. It does not limit the number of files you can share or a number of users you can share with. You can even share your files with users who do not have a Dropbox account. It supports the exchange of files among any operating system and any device.  Dropbox facilitates easy collaboration among team members. If you edit or make changes to any file that is shared with your team, they will receive a notification about the changes made. Teams can communicate and edit files at a minimal cost. You can easily get started with Dropbox by just downloading the app on your phone. After you create an account, you can start sharing your content. You can even use it offline if you sync your local storage or additional account data with it. To ensure the security of your stored files, Dropbox is designed with multiple layers of protection.  
Layers of protection include the following: Dropbox files are encrypted using 256 bit Advanced Encryption Standard (AES). It uses Secure Socket layer or transport layer security to secure data being exchanged with Dropbox apps and servers. Dropbox applications are regularly tested for security threats and if such vulnerabilities exist, it is fixed immediately. During login, two step verification is available to provide additional privacy. The important features of Dropbox are as follows: Automatic updates File storage and file sync Offline access Manually set bandwidth Automatic organization and backup Preview and download Any device accessibility Large files sharing Easy link sharing 3. Google Drive Google Drive is a file storage and sharing service offered by Google. The first 15 GB is free. You can store any kind of files like photos, videos, design, drawings, recordings, and stories. Just like most other cloud services, you can upload your files online, and can access it from anywhere and any device. You can even invite people to view, download or contribute to your project.  Google drive provides storage options to any kind of files such as images, sheets, pdf, videos, word documents etc, one can even save their email attachment in google drive directly through Gmail without going through the hassle of moving files. One of the greatest features of Google Drive is that it allows the user to view the file in the browser without having to download it. The files that can be previewed are Adobe, Microsoft Office, images, audios, texts and video files, such an option is not provided by most of the zero knowledge services as those services are unable to decrypt the files. Google Drive's versatility makes it more user-friendly than any other storage platforms. Significant features of Google Drive are:  Google drive provides 15 GB free storage on signing up. 
- With the assistance of Google's homegrown office suite, Google Docs, and other third-party applications, it is more productive and efficient than most other storage options.
- It is very versatile: you can store almost any kind of file and preview it without downloading it.
- Google Drive can be accessed by logging in with your Google account.
- Many different types of programs can be integrated with Google Drive.
- Google Drive's browser interface is fairly intuitive and user-friendly.
- With the help of the Google client, you can make optimum use of its sync capabilities.
- File replication.
- It provides a third-party application library.

The only drawback of Google Drive is that it does not provide any specific way to share files; even so, it is one of the best storage options, as its strengths outweigh its weaknesses.

4. pCloud
Though pCloud is newer to the market than its counterparts, it has emerged as a versatile cloud solution for both business and personal use. The features that make pCloud stand out among other cloud services are as follows:
- pCloud Crypto: An option to encrypt your files and data on your device before uploading them to the cloud, protecting them from unwanted and insecure access.
- Lifetime subscription: Unlike other cloud services on the market, pCloud offers a lifetime subscription, which means you can use its services permanently without going through limited and expensive subscriptions.

The advantages of using pCloud are as follows:
- pCloud Rewind: The main reason people go for cloud storage is data recovery in case anything goes wrong on the local device. It should be easy to recover files lost to corruption, viruses, or hardware problems. To ensure this, pCloud stores five separate copies of every file you upload at three different locations in a centralized data center facility in Dallas, Texas, so you always have a copy of your file available for download.
pCloud's Rewind feature tracks and saves your file history for the last 30 days, in case you make a mistake or accidentally delete your original file. You can keep longer histories by adding the Extended File History (EFH) add-on, which keeps previous file versions for up to 360 days.
- High-level security: Sensitive files and data are highly vulnerable to hacking and data theft with today's technology. Organizations dealing with sensitive data can opt for pCloud, since it provides a client-side encryption service called pCloud Crypto. Other cloud providers use AES (Advanced Encryption Standard) and 256-bit TLS/SSL encryption, which secures your files in transit from device to server; but once those files reach the cloud storage server, they are stored in their original form, which means anyone with access to the server can access the data as well. pCloud offers an additional level of data security where the responsibility for encryption is in the client's hands. You can encrypt your files or data on your own system before uploading them to the storage server, and unlock them using a generated key called the Crypto Pass.
- Convenient file management: pCloud is available on all major platforms, including Mac, Windows and Linux, as well as mobile systems such as iOS, Android and Windows. This lets you access your files on the go, from any system, even while travelling. All changes are saved to the files regardless of the device they are made on, and you can access your files from any device without extra charges.
- Store your files from other online platforms: If your files are saved on Google Drive, Dropbox or other third-party cloud storage platforms, pCloud can conveniently exchange information with these services and upload from them. You can even sync your information with social media platforms such as Facebook, Picasa, and Instagram.

5.
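The 30-day version-history idea behind pCloud Rewind can be sketched as a tiny class that keeps timestamped versions and prunes anything older than a retention window. This is a toy illustration of the concept, not pCloud's actual implementation; the class and method names are made up.

```python
import datetime

# Toy sketch of a Rewind-style version history: every save keeps a
# timestamped copy, versions older than the retention window (30 days by
# default, 360 with an extended-history add-on) are pruned, and "rewind"
# returns the newest version at or before a given point in time.

class VersionHistory:
    def __init__(self, retention_days: int = 30):
        self.retention = datetime.timedelta(days=retention_days)
        self.versions = []  # list of (timestamp, content) pairs

    def save(self, content: bytes, now: datetime.datetime) -> None:
        self.versions.append((now, content))
        cutoff = now - self.retention  # drop versions outside the window
        self.versions = [(t, c) for t, c in self.versions if t >= cutoff]

    def rewind(self, to: datetime.datetime):
        """Return the newest surviving version saved at or before `to`."""
        candidates = [(t, c) for t, c in self.versions if t <= to]
        return max(candidates)[1] if candidates else None

day = datetime.timedelta(days=1)
t0 = datetime.datetime(2022, 1, 1)
h = VersionHistory(retention_days=30)
h.save(b"v1", t0)
h.save(b"v2", t0 + 10 * day)
h.save(b"v3", t0 + 40 * day)  # at this point v1 is over 30 days old and pruned
```

After the three saves above, rewinding to day 15 recovers `v2`, while `v1` is gone because it fell outside the 30-day window.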
Mega
Mega is a secure cloud storage service offered by Mega Limited, an Auckland-based company. Mega offers 50 GB of free storage space and is also available for Windows Phone, iOS and Android in the form of mobile apps. It provides its services primarily through a web-based app. For Windows, Linux and Mac desktops, you can download MEGAsync, which creates a folder into which you can drag and drop the files you want to upload to your MEGA account. Mozilla Thunderbird users have the option of the MEGAbird add-on to share large files over email.

Mega was founded by Kim Dotcom and launched on January 19, 2013. Kim Dotcom left the company in 2015, after it was taken over by a Chinese investor who was wanted for fraud in China; the investor's shares were subsequently seized by the New Zealand government, which now controls the site. Mega was created with privacy and security in mind, and all files are end-to-end encrypted locally before they are uploaded. It has bandwidth limits, which are set in account settings when uploading through a browser and through the desktop client when working in the application. Free account users receive 15 GB of base storage plus a 35 GB trial for one month on signup. Additional storage can be unlocked through various achievements, up to a maximum of 50 GB. Paid accounts, on the other hand, come in four tiers:
- 200 GB storage (1 TB bandwidth per month)
- 1 TB storage (2 TB bandwidth per month)
- 4 TB storage (8 TB bandwidth per month)
- 8 TB storage (16 TB bandwidth per month)

Significant features of Mega:
- MEGAchat: Launched in 2015 as an encrypted alternative to applications such as Skype, it is a browser-based chat service that covers email, video, chat, voice, and mobile.
- Browser extension: The MEGA Chrome extension, launched in 2015, was marketed for improved downloading, loading and security.
In 2018, however, the Chrome Web Store version of the extension was found to have been compromised by added code designed to steal cryptocurrency credentials; the original code on GitHub remained secure.
- Desktop sync client: It helps filter out uploads that share common keywords, reducing the upload of unnecessary content.
- API: Mega has released documentation for its API, enabling developers to write their own applications. Whole folders can be uploaded through the browser.

Limitations of Mega
- There is no option to share files among a group of users.
- You may experience inconvenience when working with browsers other than Google Chrome or Mozilla Firefox.
- There are no advanced sharing features; you can share a Mega file only by creating a public link.
- There is a 10 GB bandwidth limit, which is refilled every 30 minutes.

Hope this article helped you get an understanding of the best free cloud storage services available. If you want to learn about AWS, you should try the AWS Certification course.
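The interval-refilled bandwidth limit described above (a fixed allowance that replenishes every 30 minutes) behaves like a simple refillable quota. Below is a toy, stdlib-only model of that behaviour; the class and its numbers mirror the text's description but are purely illustrative, not Mega's actual quota mechanism.

```python
# Toy model of an interval-refilled transfer quota: a fixed byte allowance
# that is topped back up once a refill interval has elapsed. Loosely
# inspired by the free-tier limit described above; entirely hypothetical.

class TransferQuota:
    def __init__(self, limit_bytes: int, refill_every_s: int):
        self.limit = limit_bytes
        self.refill_every = refill_every_s
        self.remaining = limit_bytes
        self.last_refill = 0

    def _refill(self, now_s: int) -> None:
        # Restore the full allowance once the refill interval has passed.
        if now_s - self.last_refill >= self.refill_every:
            self.remaining = self.limit
            self.last_refill = now_s

    def try_download(self, size: int, now_s: int) -> bool:
        """Allow the transfer only if enough quota remains right now."""
        self._refill(now_s)
        if size <= self.remaining:
            self.remaining -= size
            return True
        return False

GB = 1024 ** 3
# A 10 GB allowance refilled every 30 minutes, as described in the text.
quota = TransferQuota(limit_bytes=10 * GB, refill_every_s=30 * 60)
```

With this model, an 8 GB download succeeds immediately, a second one a minute later is refused (only 2 GB remain), and it succeeds again once the 30-minute refill has elapsed.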
Top Cloud Certifications

What is Cloud Computing?
Cloud is the new buzzword these days, and the term Cloud Computing is everywhere. Everyone, everywhere, is moving their storage to the cloud and reaping its immense benefits. With the advent of cloud storage, there has also been a rise in job opportunities in the field. Cloud Computing jobs are for professionals in cloud data management who have the expertise to deal with cloud servers and the problems that may arise at both the user and server levels.

Why Cloud Computing?
Due to the rise of cloud services such as iCloud and Dropbox, to name a few, there has also been a rise in the number of professionals needed for the job. Cloud professionals and engineers are paid handsomely for their work - as much as $117,892 (Source) per year or even more, depending on their level of experience or expertise. It is a growing field, so jobs are unlikely to diminish over the next few years; in fact, quite the opposite. So it is not too late to gain experience and get started as a Cloud Computing professional.

The Need for Certification and Prospective Opportunities
As mentioned before, the value and salary of a cloud professional depend on their experience. One way to demonstrate expertise is through certifications. They provide you with the knowledge needed for the job and, when obtained from reputed sources, serve as valuable proof of expertise in the market. A certification is sure to kickstart a fruitful career in Cloud Computing. With that in mind, we have compiled a list of the most reputed certifications in the field of cloud.

Top Cloud Certifications
Industry-recognised certifications give you an edge over your non-certified peers, increasing your employability and helping you get ahead in your cloud career. Fresh, certifiable skills are guaranteed to open new career opportunities and increase your salary as well!
Listed below are the top cloud certifications that you can consider:
- Google Certified Professional Cloud Architect
- Amazon Web Services (AWS) Certified Solutions Architect - Associate
- MCSE: Cloud Platform and Infrastructure (Microsoft)
- Certified Cloud Security Professional (CCSP)
- CompTIA Cloud+
- VMware VCP7-CMA
- CCNP Cloud (Cisco)

1. Google Certified Professional Cloud Architect
The Google Certified Professional Cloud Architect has the honour of topping the list of the highest-paying IT certifications in the United States of America. Google is a borderline ubiquitous brand; most people used a few Google products to reach this article in the first place. Here the same reputed company offers a certification that validates proficiency on the Google Cloud Platform. It covers cloud architecture design, development, and management at different scales, with an incredibly high degree of security and standards.

Prerequisites: There are no official prerequisites, but Google recommends more than three years of industry experience, including more than a year of designing and management experience using the Google Cloud Platform.

Exam Cost & Duration: The exam costs $200 (Source), and a test center can be found on the Google Cloud website. The exam is 2 hours long and can be taken in either English or Japanese.

Exam Guide: Google Cloud offers an exam guide with a dedicated list of topics and many case studies that can help with studying for the exam. They also offer a training path comprising texts and videos, which are easy to engage with as well.

Salary: The average pay for a Google Cloud Architect is around $103K (Source).

2. Amazon Web Services (AWS) Certified Solutions Architect - Associate
Amazon Web Services is one of the top cloud computing companies in recent times, having achieved an impressive 43% growth over the last year. They are followed closely by Microsoft Azure and Google Cloud.
Amazon Web Services offers certifications at the foundational, associate, and professional levels. This certification prepares a candidate for developing and architecture roles and offers operational knowledge. The associate certification can be a stepping stone to the professional-level certification, which comes with veteran-level jobs and validates years of experience in cloud computing architecture and design.

Prerequisites: Amazon prefers hands-on experience with networking, database, compute, and storage AWS services, along with the ability to precisely define requirements for an AWS application. They require critical thinking skills within the AWS service format, along with knowledge of building security services.

Exam Cost: It costs $150, and a practice exam can be purchased for $20 USD (Source). The exam can be taken in English, Japanese, Korean, and Simplified Chinese, and consists of 65 multiple-choice questions.

Exam Guide: Amazon offers a collection of hands-on training courses, videos, and much more to prepare for the exam. Self-evaluation resources include an exam guide and sample questions.

Salary: The average pay an AWS Solutions Architect can expect is around $121K (Source).

3. MCSE: Cloud Platform and Infrastructure (Microsoft)
Microsoft is, again, one of the brands that have made a mark on the technology industry. The MCSE: Cloud Platform and Infrastructure certification validates a person's ability to effectively manage cloud data and demonstrates skill in managing virtual networks, storage management systems, and many more cloud technologies.

Prerequisites: Candidates must first earn the MCSA (Microsoft Certified Solutions Associate): Azure certification. They must also score a passing grade on the MCSE exam, which covers development, Azure-based and related architecture solutions, hybrid cloud operations, and elements of big data analytics. An MCSE along with two or three prerequisite exams need to be taken.
Exam Cost: The MCSE exam costs $165 (Source), while the prerequisite exams cost $165 and $300 (MCSA and LFCS, respectively).

Exam Guide/Courses: Microsoft Virtual Academy (MVA) offers free courses and reference material relevant to cloud professionals and cloud development. A program called Exam Replay allows candidates to buy a slightly discounted exam, a practice attempt (for a slight upcharge), and a retake attempt as well.

Salary: The job title that can be earned after this certification is Microsoft Cloud Solution Architect, a role that can earn around $154,133 per year. (Source)

4. Certified Cloud Security Professional (CCSP)
Offered by (ISC)² (the International Information System Security Certification Consortium), the CCSP is a globally recognized certification. It validates a candidate's ability to work within a cloud architecture, along with strong abilities in design, secure applications, and data and infrastructure. These are carried out under the protocols offered by (ISC)², which are a hallmark of security. It is ideal for those who want an enterprise architect role; other roles include systems engineer, security administrator, or consultant in the field of security.

Prerequisites: (ISC)² recommends around five years of experience in IT, including three in information security and one in any of the domains prescribed by the CCSP Common Body of Knowledge.

Exam Cost: The exam is provided by Pearson VUE. Standard registration for the exam costs $600 (Source).

Exam Guide/Courses: The CCSP examination involves preparation across 6 domains, as highlighted in the CCSP exam outline.

Salary: The job title earned is Cloud Security Professional, a job that can pay up to $138K per annum. (Source)

5. CompTIA Cloud+
An acronym for the Computing Technology Industry Association, CompTIA is a non-profit.
It serves the IT industry and is one of the global leaders in certifications like the ones on this list. These certifications are vendor-neutral, meaning you can apply to a broad range of jobs and are not restrained to any particular company. They cover certifications from novice to professional levels.

CompTIA Cloud+ is a foundation-level certification. As its name suggests, Cloud+ offers foundational knowledge across a broad domain of the cloud market. It validates skills in the maintenance and optimization of cloud software, and shows that a candidate can migrate data to cloud platforms, manage cloud resources and make appropriate modifications, and perform automation tasks to improve performance, all while focusing on security.

Prerequisites: It requires 2-3 years of experience in system administration.

Exam Cost & Format: The exam includes 90 questions and is available in English and Japanese. It costs $338 (Source), and the certification expires 3 years after it is earned.

Salary: As a cloud specialist, the average pay that can be expected in the US market is around $80,317 (Source).

6. VMware VCP7-CMA
VMware is a company well known within the IT sphere for its strong grasp of virtualization technologies. The VCP7 - Cloud Management and Automation certification is the latest in a series the company has rolled out. The vRealize- and vSphere-based program is instrumental in certifying new as well as veteran IT professionals in the field of cloud virtualization.

Prerequisites:
- A minimum of six months' experience with vSphere 6 and vRealize software.
- Completion of one of the training courses offered by VMware, which keeps updating on the current course list portion of the website.
- Candidates can choose one of 3 exams: vSphere 6 Foundations, vSphere 6.5 Foundations, or the VMware Certified Professional Management and Automation exam.
Exam Cost: vSphere 6 and 6.5 cost $125 each, whereas the third exam costs $250 (Source). A VMware candidate ID is needed to register.

Exam Guide: Exam self-study material is available on the certification page.

Salary: As a VMware Staff Engineer, the expected salary could be up to $188,446 per year. (Source)

7. CCNP Cloud (Cisco)
CCNP stands for Cisco Certified Network Professional. This is one of the more reputed certifications, allowing professionals to validate their skills in data management, cloud architecture, and design, and to establish their path as cloud professionals. Along with Cloud, the CCNP is also available in Collaboration, Service Provider, Data Centre, and many other fields in the collection of solutions. Be warned, though: Cisco focuses on practical requirements as well, so the certification process is equally rigorous, with design, practical, and architecture-based assessments to keep one on their toes. In the end, this multidisciplinary approach proves itself. An understanding of Application Centric Infrastructure (ACI) is also vital. They provide a lot of resources to prepare as well, with assignments, discussion forums, self-assessments, and much more!

Training in CLDING, CLDDES, CLDAUT, CLACI, and CLDINF is highly recommended. These cover Cisco cloud design, automation, infrastructure, and troubleshooting.

Prerequisites: There are four exams that need to be taken in the above fields. They are administered by Pearson VUE.

Exam Cost: Each exam costs $300, or $1,200 in total. (Source)

Exam Guide: For study material, Cisco has curated many resources such as Learning Network games, self-assessment modules, seminars, and videos. Textbooks and other materials are also available in the Cisco Marketplace Bookstore.
Salary: The typical job that can be obtained is Cisco Systems Cloud Engineer, which pays around $158,010 per annum. (Source)

Certification Levels
These cloud certifications can be segregated into Professional and Associate levels, each with criteria that must be fulfilled to be eligible to apply. As per market trends and demand, here is a detailed description of some of the most coveted certifications:

Amazon Web Services - AWS
1. AWS Certified Solutions Architect - Professional
This certification is for professionals with hands-on experience in solutions architect roles. A candidate must have 2 or more years of experience in operating and managing AWS operations. The exam costs 300 USD and is 180 minutes long. This course validates the following abilities:
- Implementation of cost control strategies
- Designing fault-tolerant applications on AWS
- Choosing appropriate AWS services for design and application
- Migrating complex applications on AWS

Exam criteria:
- 2 or more years of experience in handling cloud architecture on AWS
- Diverse knowledge of the AWS CLI, AWS APIs, AWS CloudFormation templates, the AWS Billing Console, and the AWS Management Console
- Detailed knowledge of a scripting language
- Must have worked on Windows and Linux
- Must be able to explain the five pillars of the AWS Well-Architected Framework
- Practical knowledge of architectural design across multiple projects of the company

2. AWS Certified Solutions Architect - Associate
This course is for professionals who have one year of experience in designing and handling fault-tolerant, scalable distributed systems on AWS.
This certificate validates the following abilities:
- In-depth knowledge of deploying secure and robust applications on AWS
- Knowledge and application of customized architectural principles

Exam criteria:
- A complete understanding of the AWS global infrastructure, network technologies, and the security features and tools related to AWS
- Knowledge of how to build secure and reliable AWS applications
- Experience in the deployment and management of managed services

The exam duration is 130 minutes and the fee is $150. The above were some of the main certified courses of AWS. The other two associate-level courses are AWS SysOps Administrator Associate and AWS Developer Associate.

3. AWS Certified DevOps Engineer - Professional
This exam is for professionals who have worked as DevOps engineers and have experience in provisioning, operating, and managing AWS environments. This course validates the following abilities:
- Management and implementation of delivery systems and methodology on AWS
- Deploying and managing logging, metrics, and monitoring systems on AWS
- Implementation and management of highly scalable, self-healing systems on AWS
- Automation of security controls, governance processes, and compliance validation

Exam criteria:
- Knowledge and experience in administering operating systems and building highly automated infrastructure
- Knowledge of developing code in at least one high-level programming language

The cost of the exam is 300 USD and the duration is 180 minutes. There will be 75 questions.

Microsoft Web Service - Azure
1. Azure Developer Associate AZ-204
This course will provide you with the skill set to design, build, test and maintain cloud solutions from start to end. You will master the basics of developing an app and all the other services Azure provides. This certification course will help you learn the actual syntax and programming languages used to integrate applications on Azure.
Exam criteria: You are required to take Exam AZ-204: Developing Solutions for Microsoft Azure ($165 USD) and must have at least 1-2 years of experience with development and Azure development. A good command of any of these languages, such as C#, PHP, Java, Python, or JavaScript, would be a plus. Getting certified with this course will set you ahead of your peers in the development sector.

2. Azure Data Scientist Associate DP-100
Turning data and facts related to a business into useful and actionable insights is an art, and getting the Azure Data Scientist certification will prove that you have the required expertise in data and machine learning. This course is for professionals who are currently working as data scientists or are planning to become one soon.

Exam: DP-100: Designing and Implementing a Data Science Solution on Azure ($165 USD)

Exam criteria: You should have knowledge and experience in data science and in using Azure Machine Learning and Azure Databricks. This certification course can future-proof your career, as internet use is growing spectacularly and demand for job roles in this sector will continue to increase year on year.

Wondering where to start? Here are some pointers:

Are you a Newbie?
If you are a lost soul in the world of technology but want to learn, then the perfect way to start is the Azure Fundamentals course. Any beginner can grasp the fundamentals and get started.

Are you in the middle of the road?
If you have average experience and have worked hands-on with AWS, GCP, or Azure, we would still recommend starting with the Fundamentals course. Refresh your knowledge and strengthen your basics before moving on to the Administrator Associate certification, which can be very intimidating otherwise.

Are you an Expert?
If you have plenty of experience with cloud computing, or have serious geek vibes in you, then you can take up any specialty or professional certification to add the missing edge to your expertise. If you still need more clarity, you can explore our cloud certification category page for more details. Need more handholding? Contact our experts by using the Contact Learning Advisor button and filling in a short form. Let's connect!

Why be a Cloud Computing Professional?
1. A Growing Field
As more and more of our lives move to the cloud, the demand for professionals with the capabilities to handle cloud architecture is increasing by the day. Professionals with the right expertise are paid handsome salaries, and the investment made in certification repays itself many times over. Demand for cloud professionals outstrips supply by a huge margin, making it easier even for entry-level applicants to find a job.

2. Good Pay
The salary for a Cloud Engineer ranges from $117,892 to $229,000 (Source). This is a rewarding field indeed! You can get onto the entry point of the ladder and work your way up, an easy journey if you earn a certification. It is one of the highest-paying jobs in the IT sector.

Companies Hiring Certified Cloud Computing Professionals
Some of the companies whose certifications we covered above are also among the key employers in the cloud computing market. The key employers for these jobs are listed below.

Amazon
They are the undoubted leaders in cloud computing and management. They are also branching out into AI, the Internet of Things, machine learning, and database management, and you can explore exciting new opportunities in any of these fields. As documented above, AWS has seen over 43% growth year after year for a sustained period. They are undoubtedly one of the largest hirers in the field, as they need a competent workforce for their expanding ventures.
Microsoft
After the enormous success of the Office 365 platform, cloud computing was the next step forward for Microsoft, with the Azure platform. They are neck and neck with Amazon for the number 1 spot in cloud architecture and database management.

IBM
IBM has made a resurgence to capitalize on demand in AI, the information age, and the new cloud phenomenon. They have recently acquired Red Hat and entered the field of hybrid cloud development. They will surely be looking for professionals in the field to boost their chances.

Dell Technologies (VMware)
VMware, mentioned in the lists above, has partnered with Dell Technologies to form a robust cloud platform. Already a veteran player in the industry, VMware has constantly evolved to adapt to advancements in the industry. They have partnerships with all the huge players, including AWS, Microsoft Azure and Google Cloud.

Conclusion
It is quite evident that cloud computing is one of the most exciting and lucrative fields one can be in, considering the investment-to-return ratio. These certifications offer excellent value for money and can lead to placements in leading companies, which is not easy via other paths. There is a lot to learn in the field of cloud computing, and it is a highly adaptive job as well; that is why one needs to keep an eye on the newest software and architecture in the market. These certifications make sure that you can validate your experience and increase your employability. While there are many certifications available, only the ones from reputed institutions help you get a job. They show that you have the knowledge and expertise to make your mark in the industry. It is never too late to start your learning journey, so grab that certification exam guide and start learning. Happy computing!
A Glimpse Of The Major Leading SAFe® Versions

A Quick View of SAFe®
Agile has gained popularity in recent years, and with good reason. Teams love this approach, which allows them to deliver value to the customer faster while learning and adjusting to change as needed. But teams often don't work in isolation; many work in the context of larger organizations, and team-level Agile often doesn't fit their needs. Some teams need an Agile approach that scales to larger projects involving multiple teams. It's possible to do this, and that's where the Scaled Agile Framework, or SAFe®, can help.

Why is SAFe® the best scalable framework?
The Scaled Agile Framework is a structured Agile approach for large enterprises. It is prescriptive and provides a path for interdependent teams to gain the benefits of using an Agile approach. Scaled Agile provides guidance not only at the team level but also at the Program and Portfolio levels. It also has built-in coordinated planning across related teams working in Release Trains. These planning increments allow teams to plan together, work with customers, and release value frequently in a way that is sustainable for teams. And it supports continuous improvement. It's a great way for large companies to maintain structure and roll out Agile at scale.

What is SAFe® 4.5?
Scaled Agile, otherwise known as SAFe®, was initially released in 2011 by Dean Leffingwell as a knowledge base for enterprises adopting Agile. Over the years it has grown and evolved. SAFe® 4.5 was released on June 22, 2017, to accommodate improvements to the framework. Following are some of the key improvements in SAFe® 4.5:
- Essential SAFe® and configurability
- Innovation with Lean Startup and Lean UX
- Scalable DevOps and Continuous Delivery
- Implementation roadmap

Benefits of SAFe® 4.5 to companies:
Organizations that adopt SAFe® 4.5 will gain the following benefits:
1) Test ideas more quickly. SAFe® 4.5 has built-in iterative development and testing.
This lets teams get faster feedback to learn and adjust more quickly.
2) Deliver much faster. The changes in SAFe® 4.5 allow teams to move complex work through the pipeline and deliver value to the customer faster.
3) Simplify governance and improve portfolio performance. Guidance and support have been added at the Portfolio level to guide organizations in addressing Portfolio-level concerns in a scaled Agile context.

SAFe® 4.5 - Key areas of improvement:
A. Essential SAFe® and Configurability
Four configurations of SAFe® provide a more configurable and scalable approach:
- Essential SAFe®: The most basic level that teams can use. It contains just the essentials a team needs to get the benefits of SAFe®.
- Portfolio SAFe®: For enterprises that implement multiple solutions with portfolio responsibilities such as governance, strategy, and portfolio funding.
- Large Solution SAFe®: For complex solutions that involve multiple Agile Release Trains. These initiatives don't require Portfolio concerns, and include only the Large Solution and Essential SAFe® elements.
- Full SAFe®: The most comprehensive level, applicable to huge enterprise initiatives requiring hundreds of people to complete.
Because SAFe® is a framework, it provides the flexibility to choose the level that best fits your organization's needs.

B. Innovation with Lean Startup and Lean UX
Rather than creating an entire project plan up front, SAFe® teams focus on features. They create a hypothesis about what a new feature will deliver and then use an iterative approach to develop and test that hypothesis along the way. As teams move forward through development, they repeat this develop-and-test approach and adjust as needed, based on feedback. Teams also work closely with end users to identify the Minimum Viable Product (MVP) to focus on first. They identify what will be most valuable to the customer most immediately.
Then they rely on feedback and learning as they develop the solution incrementally, adjusting as needed to incorporate what they've learned into the features. This collaboration and fast feedback-and-adjustment cycle results in a more successful product.

C. Scalable DevOps & Continuous Delivery

The greater focus on DevOps allows teams to innovate faster. Like Agile, DevOps is a mindset, and like Agile, it allows teams to learn, adjust, and deliver value to users incrementally. The continuous delivery pipeline lets teams move value through the pipeline faster via continuous exploration, continuous integration, continuous deployment, and release on demand. DevOps breaks down silos and helps Agile teams work together more seamlessly, resulting in more efficient and faster delivery of value to end users. It is a perfect complement to Scaled Agile.

D. Implementation Roadmap

SAFe® now offers a suggested roadmap to SAFe® adoption. While change can be challenging, the implementation roadmap provides guidance that can help with that organizational change.

Critical Role of the SAFe® Program Consultant

SAFe® Program Consultants, or SPCs, are critical change agents in the transition to Scaled Agile. Because of the depth of knowledge required to gain SPC certification, they are perfectly positioned to help the organization move through the challenges of change. They can train and coach all levels of SAFe® participants, from team members to executive leaders.
They can also train Scrum Masters, Product Owners, and Release Train Engineers, which are critical roles in SAFe®. The SPC can train teams and help them launch their Agile Release Trains (ARTs), and can support teams on the path to continued improvement as they continue to learn and grow. The SPC can also help identify value streams in the organization that may be ready to launch Agile Release Trains, and can help develop rollout plans for SAFe® in the enterprise. Along with this, they can provide important communications that help the enterprise understand the drivers and value behind the SAFe® transition.

How is SAFe® 4.5 backward compatible with SAFe® 4.0?

Even if your organization has already adopted SAFe® 4.0, SAFe® 4.5 has been developed so that it can be adopted without disruption. Your organization can adopt the changes at the pace that works best.

Updates in the new courseware

The courseware has been updated to reflect the changes in SAFe® 4.5. The affected courses include Implementing SAFe®, Leading SAFe®, and SAFe® for Teams. Some of the changes you'll see are:

- Two new lessons for Leading SAFe®
- Student workbook
- Trainer guide
- New look and feel
- Updated LPM content
- Smoother lesson flow
- New Course Delivery Enablement (CDE)

Changes were also made to improve alignment between SAFe® and Scrum:

- Iteration Review: Iterations (previously known as Sprints) now have reviews added. This gives teams more opportunities to incorporate improvements. Additionally, a Team Demo has been added to each iteration review, providing more opportunity for transparency, sharing, and feedback.
- Development Team: The Development Team is specifically identified at the team level in SAFe® 4.5. It is made up of three to nine people who can move an element of work from development through test.
The development team contains software developers, testers, and engineers, and does not include the Product Owner and Scrum Master; each of those roles is shown separately at the team level in SAFe® 4.5.

- Scrum events: The list of Scrum events is shown next to the ScrumXP icon and includes Plan, Execute, Review, and Retro (for retrospective).

Combined SAFe® Foundation Elements

SAFe® 4.0 had the foundational elements of Core Values, Lean-Agile Mindset, SAFe® Principles, and Implementing SAFe® at a basic level. SAFe® 4.5 adds to these foundation elements by also including Lean-Agile Leaders, the Implementation Roadmap, and the support of the SPC in a successful implementation of SAFe®.

Additional changes include:

- Communities of Practice: Moved to the spanning palette to show support at all levels: team, program, large solution, and portfolio.
- Lean-Agile Leaders: This role is now included at the foundational level. Supportive leadership is critical to a successful SAFe® adoption.
- SAFe® Program Consultant: This role was added to the foundational layer. The SPC can play a key leadership role in a successful transition to Scaled Agile.
- Implementation Roadmap: Replaces the basic implementation information in SAFe® 4.0 and provides more in-depth information on the elements of a successful enterprise transition to SAFe®.

Benefits of upgrading to SAFe® 4.5

With the addition of Lean Startup approaches, along with a deeper focus on DevOps and Continuous Delivery, teams will be positioned to deliver quality and value to users more quickly. With improvements at the Portfolio level, teams get more guidance on Portfolio governance and other Portfolio-level concerns, such as budgeting and compliance.

Reasons to upgrade to SAFe® 4.5

Enterprises that have been using SAFe® 4.0 will find greater flexibility with the added levels in SAFe® 4.5.
Smaller groups in the enterprise can use the team level, while groups working on more complex initiatives can create Agile Release Trains with many teams. Your teams can innovate faster by using the Lean Startup approach: work with end users to identify the Minimum Viable Product (MVP), then iterate as you get fast feedback and adjust. This also makes your customer more of a partner in development, resulting in better collaboration and a better end product. Get features and value to your user community faster with DevOps and the continuous delivery pipeline. Your teams can continuously hypothesize, build, measure, and learn in order to release value continuously. This also allows large organizations to innovate more quickly.

Most Recent Changes in the SAFe® Series - SAFe® 4.6

Because Scaled Agile continues to improve, new changes have been incorporated in SAFe® 4.6, with the addition of five core competencies that enable enterprises to respond to technology and market changes:

- Lean Portfolio Management: How to apply a Lean-Agile approach to portfolio strategy, funding, and governance.
- Business Solutions and Lean Systems: Optimizing activities to implement large, complex initiatives using a Scaled Agile approach while still addressing necessary activities such as designing, testing, deployment, and even retiring old solutions.
- DevOps and Release on Demand: The skills needed to release value as needed through a continuous delivery pipeline.
- Team and Technical Agility: The skills needed to establish successful teams who consistently deliver value and quality to meet customer needs.
- Lean-Agile Leadership: How leadership enables a successful Agile transformation by supporting empowered teams in implementing Agile practices.
Leaders carry out the Agile principles and practices and ensure teams have the support they need to succeed.

SAFe® Agilist (SA) Certification Exam

The SAFe® Agilist certification is for change leaders in an organization who want to learn SAFe® practices and support change at all levels: team, program, and portfolio. These change agents can play a positive role in an enterprise transition to SAFe®. To become certified as a SAFe® Agilist (SA), you must first take the Leading SAFe® class and then pass the SAFe® certification exam. To learn more, see this article on How To Pass the Leading SAFe® 4.5 Exam.

- SAFe® Certification Exam: KnowledgeHut provides Leading SAFe® training in multiple locations. Check the site for locations and dates.
- SAFe® Agile Certification Cost: Check KnowledgeHut's scheduled training offerings to see the course cost. Each course includes the opportunity to sit for the exam in the cost.
- Scaled Agile Framework Certification Cost: There are multiple levels of SAFe® certification, including Scrum Master, Release Train Engineer, and Product Owner. Courses range in cost, but each includes the chance to sit for the corresponding SAFe® certification exam.
- SAFe® Classes: SAFe® classes are offered by various organizations. To see if KnowledgeHut is offering SAFe® training near you, check the SAFe® training schedule on our website.

Training

KnowledgeHut provides multiple Scaled Agile courses to give both leaders and team members in your organization the information they need for a successful transition to Scaled Agile. Check the site for the list of classes to find those that are right for your organization as you make the journey. All course fees cover examination costs for certification.

SAFe® 4.5 Scrum Master with SSM Certification Training

Learn the core competencies of implementing Agile across the enterprise, along with how to lead high-performing teams to deliver successful solutions.
You'll also learn how to implement DevOps practices. Completing this course prepares you to obtain your SAFe® 4 Scrum Master certification.

SAFe® 4 Advanced Scrum Master (SASM)

This two-day course teaches you how to apply Scrum at the enterprise level and prepares you to lead high-performing teams in a Scaled Agile environment. At course completion, you'll be prepared to manage interactions not only on your team but also across teams and with stakeholders. You'll also be prepared to take the SAFe® Advanced Scrum Master exam.

Leading SAFe® 4.5 Training Course (SA)

This two-day Leading SAFe® class prepares you to become a Certified SAFe® 4 Agilist, ready to lead the Agile transformation in your enterprise. By the end of this course, you'll be able to take the SAFe® Agilist (SA) certification exam.

SAFe® 4.5 for Teams (SP)

This two-day course teaches Scrum fundamentals, principles, tools, and processes. You'll learn about the software engineering practices needed to scale Agile and deliver quality solutions in a Scaled Agile environment. Teams new to Scaled Agile will find value in going through this course. Attending the class prepares you for the certification exam to become a certified SAFe® 4 Practitioner (SP).

DevOps Foundation Certification Training

This course teaches you the DevOps framework, along with practices that prepare you to apply its principles in your work environment. Completing this course also prepares you to take the DevOps Foundation certification exam.