What is AWS CLI and How to Install it?

  • by Joydip Kumar
  • 10th Sep, 2019
  • Last updated on 05th Nov, 2019
  • 14 mins read

Whether you are a small business or a large organisation, you are probably familiar with Amazon Web Services. AWS is the world's most widely used cloud computing platform, one that lets you move faster, operate more securely, and save substantial costs. AWS offers 165 services spanning computing, the Internet of Things, developer tools, deployment, analytics, databases, networking, and more. The AWS Command Line Interface (CLI) is a unified tool to manage your AWS services.

Do you want to get AWS certified? Learn about the various AWS Certifications in detail.

What is AWS CLI?

The AWS Command Line Interface (CLI) is a unified tool that manages AWS services for you. You only have to download and configure one tool to control a wide range of AWS services. Because every command can be scripted, the CLI also helps you automate routine tasks. It is this scriptability that makes AWS so dynamic and easy to use from the terminal.
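
For example, once the CLI is installed and configured, a single command can list your S3 buckets, and the same command can be dropped straight into a shell script. A minimal illustration (the bucket names and dates below are placeholders, not real resources):

$ aws s3 ls
2019-09-10 10:15:32 my-example-bucket
2019-09-10 10:16:01 my-logs-bucket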

How to Install AWS CLI?

Putting the AWS CLI to use in your AWS environment involves a few steps.

  • There are a few different ways to install it – you need to choose what works for your system.
  • Every operating system has a different method of installation.
  • Some generic steps are followed after the installation.
  • Then comes the configuration (previewed below).
  • You will also need to upgrade it from time to time.

Follow these steps to successfully install and configure the AWS CLI for use.
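
As a preview of the configuration step mentioned above, the CLI is typically configured with the aws configure command, which prompts for your credentials and defaults. A minimal sketch (the key values below are placeholders, never real credentials):

$ aws configure
AWS Access Key ID [None]: AKIAIOSFODNN7EXAMPLE
AWS Secret Access Key [None]: wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
Default region name [None]: us-east-1
Default output format [None]: json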

Ways to Install:

You can install the AWS Command Line Interface (AWS CLI) in any of the following ways:

  1. pip
  2. a virtual environment
  3. a bundled installer

What do you need?

  • Unix, macOS, Linux, Windows
  • Python 3 version 3.3+ or Python 2 version 2.6.5+

It is important to know that you may not be able to use an older version of Python with all AWS services. Update to a newer version if you see an InsecurePlatformWarning or deprecation notices. To review what each AWS CLI release changes and supports, see the changelog:

  https://github.com/aws/aws-cli/blob/master/CHANGELOG.rst.
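
To check which Python version is currently installed on your machine, run one of the following (the version numbers shown are only examples):

$ python --version
Python 2.7.16
$ python3 --version
Python 3.7.3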

1. Installing the AWS CLI Using pip

Pip is the main distribution method for the AWS CLI on macOS, Windows and Linux. It is a package manager for Python.

Installing the current AWS CLI Version

If you have pip and a supported version of Python, use the following command to install the AWS CLI. Use the pip3 command if you have Python version 3+ installed:

$ pip3 install awscli --upgrade --user

The --upgrade option tells pip3 to upgrade any requirements that are already installed. The --user option tells pip3 to install the program into a subdirectory of your user directory. Doing this avoids the complication of modifying libraries used by your operating system.

Upgrading to the latest version

Use the pip list -o command to identify packages that are "outdated":

$ aws --version
aws-cli/1.16.170 Python/3.7.3 Linux/4.14.123-111.109.amzn2.x86_64 botocore/1.12.160

$ pip3 list -o

Package    Version   Latest    Type
---------- --------  --------  -----
awscli     1.16.170  1.16.198  wheel
botocore   1.12.160  1.12.188  wheel

Now, run pip install --upgrade to get the latest version:

$ pip3 install --upgrade --user awscli
Collecting awscli
Downloading https://files.pythonhosted.org/packages/dc/70/b32e9534c32fe9331801449e1f7eacba6a1992c2e4af9c82ac9116661d3b/awscli-1.16.198-py2.py3-none-any.whl (1.7MB)
     |████████████████████████████████| 1.7MB 1.6MB/s 
Collecting botocore==1.12.188 (from awscli)
Using cached https://files.pythonhosted.org/packages/10/cb/8dcfb3e035a419f228df7d3a0eea5d52b528bde7ca162f62f3096a930472/botocore-1.12.188-py2.py3-none-any.whl
Requirement already satisfied, skipping upgrade: docutils>=0.10 in ./venv/lib/python3.7/site-packages (from awscli) (0.14)
Requirement already satisfied, skipping upgrade: rsa<=3.5.0,>=3.1.2 in ./venv/lib/python3.7/site-packages (from awscli) (3.4.2)
Requirement already satisfied, skipping upgrade: colorama<=0.3.9,>=0.2.5 in ./venv/lib/python3.7/site-packages (from awscli) (0.3.9)
Requirement already satisfied, skipping upgrade: PyYAML<=5.1,>=3.10; python_version != "2.6" in ./venv/lib/python3.7/site-packages (from awscli) (3.13)
Requirement already satisfied, skipping upgrade: s3transfer<0.3.0,>=0.2.0 in ./venv/lib/python3.7/site-packages (from awscli) (0.2.0)
Requirement already satisfied, skipping upgrade: jmespath<1.0.0,>=0.7.1 in ./venv/lib/python3.7/site-packages (from botocore==1.12.188->awscli) (0.9.4)
Requirement already satisfied, skipping upgrade: urllib3<1.26,>=1.20; python_version >= "3.4" in ./venv/lib/python3.7/site-packages (from botocore==1.12.188->awscli) (1.24.3)
Requirement already satisfied, skipping upgrade: python-dateutil<3.0.0,>=2.1; python_version >= "2.7" in ./venv/lib/python3.7/site-packages (from botocore==1.12.188->awscli) (2.8.0)
Requirement already satisfied, skipping upgrade: pyasn1>=0.1.3 in ./venv/lib/python3.7/site-packages (from rsa<=3.5.0,>=3.1.2->awscli) (0.4.5)
Requirement already satisfied, skipping upgrade: six>=1.5 in ./venv/lib/python3.7/site-packages (from python-dateutil<3.0.0,>=2.1; python_version >= "2.7"->botocore==1.12.188->awscli) (1.12.0)
Installing collected packages: botocore, awscli
Found existing installation: botocore 1.12.160
Uninstalling botocore-1.12.160:
Successfully uninstalled botocore-1.12.160
Found existing installation: awscli 1.16.170
Uninstalling awscli-1.16.170:
Successfully uninstalled awscli-1.16.170
Successfully installed awscli-1.16.198 botocore-1.12.188

2. Installing the AWS CLI in a Virtual Environment

Another option is to install the AWS CLI in a virtual environment to separate the tool and its dependencies. You can also use a different Python version for this purpose.
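
A minimal sketch of this approach, assuming Python 3 with its built-in venv module (the environment name awscli-venv is arbitrary):

$ python3 -m venv awscli-venv
$ source awscli-venv/bin/activate
(awscli-venv) $ pip install awscli --upgrade
(awscli-venv) $ aws --version
(awscli-venv) $ deactivate

Everything installed this way stays inside the awscli-venv folder, so it never touches your system-wide Python packages.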

3. Installing the AWS CLI Using an Installer

Use the bundled installer for automated and offline installation on Unix, macOS, and Linux. It includes the AWS CLI, its dependencies, and a shell script that performs the installation. For Windows, use the MSI installer. Both of these methods simplify the initial installation.

Installation on Linux or Unix

Both platforms have an identical installation process. You need a recent version of Python. We recommend using the bundled installer for this. The steps are as follows:

1. To begin the installation:

curl "https://s3.amazonaws.com/aws-cli/awscli-bundle.zip" -o "awscli-bundle.zip"

2. Unzip the downloaded package:

unzip awscli-bundle.zip

3. Run the installation:

sudo ./awscli-bundle/install -i /usr/local/aws -b /usr/local/bin/aws

Using the -b option allows you to use the AWS CLI from any directory.

Installation on Amazon Linux

The AWS Command Line Interface comes preinstalled on both Amazon Linux and Amazon Linux 2. Below are the steps to upgrade it to the latest version:

1. Identify currently installed version:

$ aws --version
aws-cli/1.16.116 Python/3.6.8 Linux/4.14.77-81.59.amzn2.x86_64 botocore/1.12.106

2. Use pip3 to install the latest version of the AWS CLI. If you run the command from within a Python virtual environment (venv), then you don't need to use the --user option.

$ pip3 install --upgrade --user awscli

3. Add the install location to the beginning of the PATH variable.

$ export PATH=/home/ec2-user/.local/bin:$PATH

4. Verify that you're running the new version with aws --version.

$ aws --version
aws-cli/1.16.116 Python/3.6.8 Linux/4.14.77-81.59.amzn2.x86_64 botocore/1.12.106

Installation on Windows

The AWS Command Line Interface can be installed on Windows using a standalone installer or through pip, a package manager for Python.

> Through Installer

  1. Download the appropriate MSI installer.
  2. Run the downloaded MSI installer or the setup file.
  3. Follow the on-screen instructions.

By default, the CLI installs to C:\Program Files\Amazon\AWSCLI (64-bit version) or C:\Program Files (x86)\Amazon\AWSCLI (32-bit version). To confirm the installation, use the aws --version command at a command prompt (open the Start menu and search for cmd to start a command prompt).

C:\> aws --version
aws-cli/1.16.116 Python/3.6.8 Windows/10 botocore/1.12.106

If Windows is unable to find the program, you might need to close and reopen the command prompt to refresh the path, or add the installation directory to your PATH environment variable manually.

> Through Pip

1. Open the Start menu → Command Prompt
2. Verify that both Python and pip are installed correctly:

C:\> python --version
Python 3.7.1
C:\> pip3 --version
pip 18.1 from c:\program files\python37\lib\site-packages\pip (python 3.7)

3. Install the AWS CLI via pip:

C:\> pip3 install awscli

4. Check that the installation went correctly:

C:\> aws --version
aws-cli/1.16.116 Python/3.6.8 Windows/10 botocore/1.12.106

To upgrade to the latest version:

C:\> pip3 install --user --upgrade awscli

Installation on macOS

> Through Installer

1. Download the AWS CLI bundled installer:

$ curl "https://s3.amazonaws.com/aws-cli/awscli-bundle.zip" -o "awscli-bundle.zip"

2. Unzip the package

$ unzip awscli-bundle.zip

3. Run the installation

$ sudo ./awscli-bundle/install -i /usr/local/aws -b /usr/local/bin/aws

This command installs the AWS CLI to /usr/local/aws and creates the symlink aws in the /usr/local/bin directory. Using the -b option to create a symlink eliminates the need to specify the install directory in the user's $PATH variable. It enables users to run the AWS CLI by typing “aws” from any directory.
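
You can confirm where the symlink points with ls -l; the exact permissions, owner, and date in the listing will differ on your machine, but the arrow should point to the bundled install location:

$ ls -l /usr/local/bin/aws
lrwxr-xr-x  1 root  wheel  22 Sep 10 10:15 /usr/local/bin/aws -> /usr/local/aws/bin/aws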

If you want to see an explanation of the -i and -b options, use the -h option

$ ./awscli-bundle/install -h

Here are the commands summarized for easy cut-and-paste at the command line:

curl "https://s3.amazonaws.com/aws-cli/awscli-bundle.zip" -o "awscli-bundle.zip"
unzip awscli-bundle.zip
sudo ./awscli-bundle/install -i /usr/local/aws -b /usr/local/bin/aws

> Through PIP

1. Download and install the latest version of Python from Python.org.
2. Download and run the pip3 installation script provided by the Python Packaging Authority

$ curl -O https://bootstrap.pypa.io/get-pip.py
$ python3 get-pip.py --user

3. Use pip3 to install the AWS CLI. We recommend using the pip3 command if you use Python version 3+

$ pip3 install awscli --upgrade --user

4. Check that the AWS CLI is installed correctly:

$ aws --version
AWS CLI 1.16.116 (Python 3.6.8)

To upgrade to the latest version, run the command:

$ pip3 install awscli --upgrade --user

Installation on Ubuntu

> Through APT Package Manager

1. Update the package repository cache

$ sudo apt-get update

2. Install AWS CLI with the following command

$ sudo apt-get install awscli

Press y and then press <Enter> to continue.

3. Now that it's installed, check that it's working properly:

$ aws --version

> Through PIP

Because the AWS CLI is itself a Python package, installing it through pip makes it easy to keep up to date. Assuming you have Python 3, follow the steps below to install it:

1. Install Python PIP with the following command

$ sudo apt-get install python3-pip

Press y and then press <Enter> to continue.
2. Install the AWS CLI using pip with the following command:

$ pip3 install awscli --upgrade --user

3. Verify the installation by running the AWS CLI with the following command:

$ python3 -m awscli --version
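
If you would rather invoke the tool as plain aws, note that a pip --user install places the executable in ~/.local/bin; assuming that directory is on your PATH (covered in the next section), the shorter form works too:

$ export PATH=~/.local/bin:$PATH
$ aws --version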

After Installation

After you have successfully installed the AWS CLI, you need to add it to your system's PATH.

> LINUX

Find the folder in which pip installed the AWS CLI:

$ which aws
/home/username/.local/bin/aws

You can reference this as ~/.local/bin/ because /home/username corresponds to ~ on Linux.
If you don't know where Python is installed, run this command:

$ which python
/usr/local/bin/python

If this is the same folder you added to the path while installing pip, there’s nothing else to be done. Otherwise, perform those same steps again, adding this additional folder to the path.
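
As a concrete sketch, assuming you use Bash and that pip installed the CLI to ~/.local/bin, you can make the PATH change permanent by appending it to ~/.bashrc and reloading the file:

$ echo 'export PATH=~/.local/bin:$PATH' >> ~/.bashrc
$ source ~/.bashrc
$ which aws
/home/username/.local/bin/aws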

> WINDOWS

The Windows system PATH tells your PC which directories to search for executable programs. Find out where the aws program is installed:

C:\> where aws
C:\Program Files\Amazon\AWSCLI\bin\aws.exe

If you installed the CLI with pip instead of the MSI, the output points to the Python Scripts folder:

C:\> where aws
C:\Program Files\Python37\Scripts\aws

If the command returns the following error, the program is not in the system PATH and you can't run it by typing its name:

C:\> where aws
INFO: Could not find files for the given pattern.

In that case, you need to add the path manually. First, you need to search where it is installed on your computer:

C:\> where /R c:\ aws
c:\Program Files\Amazon\AWSCLI\bin\aws.exe
c:\Program Files\Amazon\AWSCLI\bincompat\aws.cmd
c:\Program Files\Amazon\AWSCLI\runtime\Scripts\aws
c:\Program Files\Amazon\AWSCLI\runtime\Scripts\aws.cmd
...
To modify your PATH variable (Windows)

  1. Press the Windows key and enter environment variables.
  2. Choose the Edit environment variables for your account.
  3. Choose PATH, then choose Edit.
  4. Add the path to the Variable value field. For example: C:\new\path
  5. Click OK twice to apply the new settings.
  6. Close any running command prompts and reopen the command prompt window.
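
Alternatively, assuming the CLI was installed to the default MSI location shown earlier, you can append that folder to your user PATH from the command prompt with setx (a quick sketch; setx writes your current combined PATH into the user PATH, and the change only applies to newly opened command prompt windows):

C:\> setx PATH "%PATH%;C:\Program Files\Amazon\AWSCLI\bin"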

> MAC OS

Locate Python

$ which python
/usr/local/bin/python

The output might be the path to a symlink, not the actual program. Run ls -al to see where it points.

$ ls -al /usr/local/bin/python
~/Library/Python/3.7/bin/python3.6

pip installs programs in the same folder as the Python application. Add this folder to your PATH variable.

To modify the PATH variable on macOS (and Linux or Unix):

1. Find the shell profile script in your user folder. If you don't know which shell you have, run echo $SHELL.
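
For example (the output below assumes Bash; Zsh users would see /bin/zsh and edit ~/.zshrc rather than ~/.bash_profile):

$ echo $SHELL
/bin/bash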

2. Add an export command to the profile script, such as the following:

export PATH=~/.local/bin:$PATH

This adds a path, ~/.local/bin in this example, to the current PATH variable.

3. Load the updated profile into your current session:

$ source ~/.bash_profile


Joydip Kumar

Solution Architect

Joydip is passionate about building cloud-based applications and has been providing solutions to various multinational clients. A Java programmer and an AWS certified cloud architect, he loves to design, develop, and integrate solutions. Amidst his busy work schedule, Joydip loves to spend time writing blogs and contributing to the open-source community.


Website : https://geeks18.com/

