
Cloud Computing Interview Questions and Answers

Cloud computing refers to the delivery of computing services, such as servers, storage, databases, software, and analytics, over the internet to enable faster innovation, flexible resources, and economies of scale. Cloud computing is becoming increasingly popular due to its scalability, accessibility, and cost-efficiency. Whether you are a beginner or preparing for an advanced-level interview, our set of expert-curated interview questions will help you understand the concepts in detail. The topics covered include the different types of cloud, the different service models, cloud computing tools, Azure, databases, and more. This set of cloud computing interview questions and answers will help you prepare for the interview with confidence.


Intermediate

The National Institute of Standards and Technology (NIST) defines cloud computing as a model that enables on-demand, globally accessible network access to a shared pool of computing resources (e.g., networks, servers, storage, applications, and services) that can be provisioned through a self-service portal provided by the cloud service provider.

Cloud computing is a class of network-based computing services available over the Internet. The model is similar to utility computing: a collection of integrated and networked hardware, software and Internet infrastructure (called a platform).

  1. Internet-based web services provide hardware, software and networking services to end users.
  2. Cloud platforms provide a simple GUI and API (Application Programming Interface) to access web-based computing resources, hiding the underlying infrastructure details and complexity from end users.
  3. In addition, the platform provides on-demand services that are always on, anywhere, anytime and in any place.
  4. Pay-per-use, on-demand and elastic.
  5. Capacity can be scaled up and down.
  6. The hardware and software services are available to everyone, e.g. the public, enterprises, corporations and business markets.

Cloud computing is a collection of layers that together deliver IP-based computing; virtualization is a layer/module inside the cloud computing architecture that enables providers to deliver IaaS (Infrastructure as a Service) on the fly.

Virtualization is software that creates multiple isolated ("separated") images of the hardware and software on the same machine. This makes it possible to install multiple operating systems and multiple applications on the same physical machine.

  • Virtualization: More Servers on the Same Hardware;
  • Cloud Computing: Measured Resources, Service based delivery, Pay for What You Use.

The globalization of business, a difficult economic environment and the on-demand consumption model for consumers have increased the pressure on organizations to be agile and cost effective. Cloud computing helps organizations stay competitive and expand. The key drivers of cloud computing are cost, risk and agility, as depicted in the diagram below:

Drivers of Cloud Computing

Cloud types depend on how the services are delivered as well as on the underlying ownership. Cloud deployment types describe the nature of the specialised services that are offered.

Public Clouds – The public cloud is the most common and popular cloud option adopted by users. IT infrastructure resources such as compute, network and storage are available in a secure manner at low cost in a public cloud environment. These IT infrastructures are shared amongst multiple clients, which makes them cheaper to use. All the resources are accessed and managed through a web browser over the internet. Public cloud offerings include Infrastructure as a Service (IaaS), Platform as a Service (PaaS) and Software as a Service (SaaS). Examples of public cloud offerings are Office 365, Salesforce, etc.

Advantage of Public Cloud:

  • Economies of scale, as the per-unit cost reduces with an increase in consumption volume
  • It reduces time to market and brings agility to the business
  • Users can increase and reduce resource consumption dynamically depending on business requirements

Disadvantages Of Public Cloud:

  • Fewer options for customization
  • Less secure compared to a private cloud or an on-premises data center
  • A fixed architecture cannot (at times) grow with the needs of the business

Private Clouds – A private cloud consists of computing resources used exclusively by one business or organisation. The private cloud can be physically located at the organisation’s on-site data center, or it can be hosted by a third-party service provider. In a private cloud, the services and infrastructure are always maintained on a private network, and the hardware and software are dedicated solely to your organisation.

Advantages Of Private Cloud:

  • A private cloud has more customizable security options and capabilities to meet an organization's requirements for internal users, and it can be expanded or changed as needed with business growth, compared to a public cloud

Disadvantages Of Private Cloud:

  • Management and IT department stakeholders should be on the same page to build a private cloud
  • A private cloud needs a huge capital investment in the initial phase
  • It takes a long time to deliver services compared to a public cloud

A common question in cloud computing interview questions for freshers, don't miss this one.

Hybrid Clouds – A hybrid cloud combines the benefits of public and private clouds to reduce cost and distribute workload as per business demand. A hybrid cloud allows the flow of data between private and public clouds in a secure manner. It gives enterprise organizations more flexibility and deployment options.

Advantages Of Hybrid Cloud:

  • A hybrid cloud combines the benefits of both private and public clouds
  • It provides resource access to both internal and external users
  • A hybrid cloud helps modernize applications and processes incrementally as resources permit

Disadvantages Of Hybrid Cloud:

  • Hybrid cloud security management is a big challenge
  • Standardization of rules and policies to govern infrastructure and data governance in hybrid cloud is difficult

The cloud service models currently in use are listed below:

  • Infrastructure as a Service (IaaS) – Virtual or physical hardware resources (e.g. compute, storage, network) offered as a service. IaaS is provided using a shared, multi-tenant IT infrastructure through on-demand services. IaaS enables the end user to provision servers, storage, networks, and other fundamental computing resources. End users are able to deploy and run software, which can include operating systems and applications, on cloud-based servers. The end user does not manage or control the underlying cloud infrastructure but has control over operating systems, data and deployed applications.
  • Platform as a Service (PaaS) – Delivers a computing platform or solution stack as a service, most often providing a complete development platform for organisations requiring a development instance of an application.
  • Software as a Service (SaaS) – A hosted application accessed through a web browser. SaaS alleviates the maintenance, technical operation and support of business and consumer software. Management and control of the underlying cloud infrastructure, including the network, servers, operating systems, storage and application management, is the responsibility of the SaaS provider. SaaS is offered with a subscription instead of a traditional software license. The pictorial representation is shown below:

Types of cloud computing services

The basic characteristics of cloud computing are mentioned below:

  • Scalability: Infrastructure capacity scales up for traffic spikes in real time.
  • Resiliency: Cloud providers maintain mirrored images in more than one location to minimize downtime in the event of a disaster. This type of resiliency can give businesses the sustainability they need during unanticipated events.
  • On-demand self-service: A self-service portal allows cloud end users to provision compute, network, storage and database services anytime, anywhere, automatically and without human intervention.
  • Broad network access: Cloud services are accessible over the internet from anywhere, at any time. End users can connect to cloud services using a laptop, tablet or mobile phone over the internet.
  • Resource pooling: The foundation of the cloud service model is multi-tenancy, which enables the sharing of resources amongst end users of different organizations scattered over multiple regions. Clients are free to choose the compute, storage, network and database resources available at many locations depending on their business function. The cloud service provider's data centres are located all over the globe.
  • Rapid elasticity: Resources provisioned to meet a business function can be scaled up and down rapidly and elastically on demand; the resources available for usage appear to be unlimited and can be purchased in any quantity at any time.
  • Measured service: The services and resources used are measured accurately and displayed in the dashboard of the cloud portal, enabling users to track their consumption pattern against every resource and keep a tab on the cost of resource consumption.

Azure computing is a virtualized environment backed by the service provider's hardware (data centres) to meet on-demand resource needs such as compute, storage and web apps over the internet, using a pay-as-you-go model. Cloud computing is the delivery of services like servers, storage, networking, web apps, databases, analytics and intelligence, and it provides innovation and resource flexibility.

Basically, we do not need to set up a data centre for each and every service, as cloud computing offers all of these services in virtualized environments which we can utilize and enable to meet the business requirements.

The best examples of Azure cloud computing are the Azure IaaS, PaaS and SaaS services, and the Azure cloud platform provides services such as big data, compute, analytics, reporting, databases, open source and more, enabling faster solutions with wider geographical availability than traditional services in this competitive world.

Microsoft Azure is a flexible, open and enterprise-grade cloud computing platform which is fast, secure, trusted, intelligent and enabled for hybrid environments.

MS Azure is a virtualized environment where we can access and deploy all the services without any hardware purchases or software licenses. It is charged on a pay-as-you-go model: if I consume a resource for one hour, it will be charged for one hour only.

  • Deployment is faster.
  • Pay-as-you-go model
  • Cost saving
  • Reliable and scalable environments
  • It is secure and can be managed securely.

What is Microsoft Azure?

Unsurprisingly, this one pops up often in cloud computing basic interview questions.

  • Azure VMs: Azure Backup will help you take backups of your Azure VMs once Azure Backup is enabled, and retain the backups for up to 30 days by default; you can increase the retention up to 999 days/weeks/months/years as per customer requirements.
  • Azure SQL DB on VMs: This feature will help you take backups of your SQL Database instance, which resides inside the VMs. This feature is in preview.
  • Azure File storage: It will help you take backups of Azure File storage. If we have files stored in Azure File storage, you can enable backup from the Azure Recovery Services vault to back up Azure File storage.
  • On-premises VMs: Azure Backup will help you take backups of on-premises VMs by setting up the Azure Backup services on-premises.

The Azure load balancer works at layer 4 and distributes traffic across the VMs. There are two types of load balancer: the internal load balancer, which is used for internal applications, and the external load balancer, which is used for external applications. Say you have a web application running on a set of VMs and you want to load-balance them internally or externally; then you can use the Azure load balancer. You can configure health probes and other rules for your web application, and if you want to apply NAT rules you can set those up as well.

It helps protect your infrastructure and applications from DDoS attacks. It works with HTTP(S) load balancers to provide a defence for your infrastructure, and we can configure allow/deny rules for it. Cloud Armor has a flexible rules language, which enables customization of the defence and mitigation of attacks. It also has predefined rules to defend against cross-site scripting (XSS) and SQL injection (SQLi) application-aware attacks. If you are running a web application, it will help protect you from SQL injection, DDoS attacks and more, based on the allow and deny rules you have configured.

Expect to come across this popular question in basic cloud computing interview questions.

A VPC provides connectivity from your on-premises network and across all regions without exposure to the internet. It provides connectivity to Compute Engine virtual machine instances, Kubernetes Engine clusters, App Engine Flex instances, and other resources based on the projects. We can use multiple VPCs in various projects.

Firewall rules and routes are associated with the VPC as a global resource, not with any individual region. A VPC can also be shared across multiple projects.

It is commonly used on the Google Cloud Platform and in hybrid scenarios.

Cloud Storage is used to store and retrieve data worldwide. We can integrate it into apps with a single API; it is RESTful online storage for web apps to store and access data on the Google Cloud Platform. It provides geo-redundancy with the highest levels of availability and performance, and offers low-latency, high-QPS content serving to users distributed across geographic regions. A minimal usage sketch follows the list of use cases below.

Common Use Case:

  • Streaming videos and music
  • Serving images and website content
  • Mobile app development
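
As a minimal illustration of the single-API access described above, the sketch below uploads and reads back an object with the google-cloud-storage Python client; the bucket and object names are placeholders and credentials are assumed to come from the environment.

```python
# Minimal sketch: upload and read an object with the Cloud Storage Python client.
# Assumes `pip install google-cloud-storage` and application default credentials;
# the bucket and object names below are placeholders.
from google.cloud import storage

client = storage.Client()                          # uses the default project/credentials
bucket = client.bucket("example-media-bucket")     # placeholder bucket name
blob = bucket.blob("videos/intro.mp4")             # placeholder object key

blob.upload_from_filename("intro.mp4")             # store the file
print(blob.public_url)                             # URL (public only if ACLs allow)

data = blob.download_as_bytes()                    # retrieve the same object
print(len(data), "bytes downloaded")
```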

App Engine is a fully managed platform on which web applications can be deployed in the Google Cloud Platform; it manages the application automatically and provides better security and reliability. It supports Java, PHP, Node.js, Python, C#, .NET and Ruby, and it scales automatically when traffic increases. It is a highly available service which scales instances up and down as per usage. We can manage the resources using command-line tools, debug the source code and run the API easily using DevOps tools like Visual Studio, PowerShell, the SDK and Cloud Source Repositories.

We can secure the application by using the App Engine firewall and managing SSL/TLS certificates.

A common question in cloud computing questions and answers, don't miss this one.  

Yes, you can replicate S3 bucket data across regions. The bucket replication feature allows you to copy objects across buckets in different AWS regions, as shown in the sketch below.

Can AWS S3 be replicated across regions?
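
A minimal boto3 sketch of enabling cross-region replication is shown below; the bucket names and IAM role ARN are placeholders, and versioning must already be enabled on both the source and destination buckets.

```python
# Sketch: enable cross-region replication on an existing S3 bucket with boto3.
# Bucket names and the IAM role ARN are placeholders; versioning must already
# be enabled on both the source and destination buckets.
import boto3

s3 = boto3.client("s3")

s3.put_bucket_replication(
    Bucket="my-source-bucket",                       # source bucket (e.g. us-east-1)
    ReplicationConfiguration={
        "Role": "arn:aws:iam::123456789012:role/s3-replication-role",
        "Rules": [
            {
                "ID": "replicate-everything",
                "Status": "Enabled",
                "Priority": 1,
                "Filter": {},                        # empty filter = whole bucket
                "Destination": {"Bucket": "arn:aws:s3:::my-destination-bucket"},
                "DeleteMarkerReplication": {"Status": "Disabled"},
            }
        ],
    },
)
```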

EC2 provides compute capacity in AWS: if you want to deploy VMs, you use EC2 instances, which can be deployed in any region. It is a highly available and scalable service for deploying heavy workloads. It also provides key pairs to secure remote connections. EC2 instances are used to deploy applications, SQL databases and other IaaS-based workloads. The cost of an EC2 instance is based on usage, billed per second. A minimal launch sketch is shown below.

What is an EC2 Instance?
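
As an illustration of provisioning an instance programmatically, here is a minimal boto3 sketch; the AMI ID, key pair name and region are placeholders, not values from the answer above.

```python
# Sketch: launch a single EC2 instance with boto3. The AMI ID, key pair and
# region are placeholders and vary per account/region.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",   # placeholder AMI ID
    InstanceType="t3.micro",
    KeyName="my-keypair",              # existing key pair for SSH/RDP access
    MinCount=1,
    MaxCount=1,
)
print(response["Instances"][0]["InstanceId"])
```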

Below is the storage I have used in my various projects.

  • Amazon EBS: It provides persistent block storage volumes for EC2 instances, protecting against component failure and providing high availability (a short provisioning sketch follows this list).
  • Amazon EC2 instance: It provides different types of instances, so we can choose the CPU, memory and storage for the VM instance. Instances are available as on-demand, spot and reserved instances in Amazon Web Services.
  • Amazon S3: It is secure, durable and highly scalable storage in AWS. It can be integrated with web applications and can store large amounts of data.
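
As a short illustration of provisioning the EBS volumes described above, here is a boto3 sketch; the availability zone, instance ID and device name are placeholders.

```python
# Sketch: create an EBS volume and attach it to a running instance with boto3.
# Availability zone, instance ID and device name are placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

volume = ec2.create_volume(
    AvailabilityZone="us-east-1a",    # must match the instance's AZ
    Size=20,                          # size in GiB
    VolumeType="gp3",
)

ec2.get_waiter("volume_available").wait(VolumeIds=[volume["VolumeId"]])

ec2.attach_volume(
    VolumeId=volume["VolumeId"],
    InstanceId="i-0123456789abcdef0",  # placeholder instance ID
    Device="/dev/xvdf",
)
```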

With regard to availability, it is the time duration for which the provider ensures your services are available, regardless of your cloud type (public, private, community, or hybrid) and service type (SaaS, PaaS, or IaaS). Commonly this metric is expressed as a percentage of uptime. Uptime is the amount of time the respective service is available and operationally online in a specific time interval. So, if the uptime is 99.99% in a year, the total duration for which you will be unable to access the service, widely known as downtime, is no more than 52 minutes and 36 seconds in 12 months.
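
The downtime figure quoted above can be checked with a few lines of arithmetic; the sketch below computes the permitted downtime for several availability levels.

```python
# Arithmetic behind the "99.99% uptime ≈ 52-53 minutes of downtime per year"
# claim above: permitted downtime = total period x (1 - availability).
minutes_per_year = 365.25 * 24 * 60          # ≈ 525,960 minutes

for availability in (0.999, 0.9999, 0.99999):
    downtime = minutes_per_year * (1 - availability)
    print(f"{availability:.5%} uptime -> {downtime:.1f} minutes of downtime/year")

# 99.99000% uptime -> 52.6 minutes of downtime/year
```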

Availability of the services varies from one cloud provider to another. Therefore, you need to know your availability requirements in the first place, which include, but are not limited to, the business-, mission- and time-critical systems you have and your expected uptime/acceptable downtime.

Be mindful that the definition and measurement of availability are also different from one provider to another.

That said, upon identifying your requirements, search for the vendor with service availability that meets or, even better, exceeds them. Ideally, you should choose a vendor that guarantees, and not merely publishes, their availability, meaning they will compensate your organization for missing the promised metrics and thresholds. Always read their Service Level Agreement (SLA) carefully and thoroughly, and comprehend their policies, terms, conditions and provisions for compensation in case of an outage.

If you are keen to understand things from a technical perspective, you might want to know how the provider manages outages (unplanned downtime), their Business Continuity Plan (BCP) and Disaster Recovery Plan (DRP), and how they handle maintenance (planned downtime) too.

By default, when you store, use, share or communicate your data in the cloud, your data is usually in a raw, unencrypted format known as 'plaintext', unless you have encrypted it before it is saved or transmitted.

If you leave your data unencrypted, you will face the risk that anyone who gains access to your account can read, copy or delete your data. This leaves your data leaked or exposed to unauthorized individuals and entities. Thus, end-to-end data encryption including your emails if stored in Cloud servers, at rest, in-use and in motion, is a must.

On the other hand, from the provider’s point-of-view, they will provide secure storage space and impose confidentiality obligations by limiting user access to those who are authorized to view, edit, add, delete the data based on your requests. What’s more, they will also protect the data from accidental or purposeful unauthorized access by internal or external actors.

Over and above that, you should gather the following information on data confidentiality policies, controls, practices, and technologies the provider has put in place:

  1. Access

Whether the vendor provides various ways to securely access our data and services based on a certain Access Control Matrix (ACM) consisting of the users, groups, permissions, privileges and credentials they offer.

  2. Log Management

Whether the vendor provides log files to capture key activities occurring in our cloud environment so we will be able to monitor, analyze them and do follow up, for the purpose of an audit trail in particular.

  3. Data ownership

Whether you as the customer maintain full control of your data and have the responsibility for managing your data, not only the provider's services and resources. Ask for a guarantee that they do not access or use your data for any purpose without your consent and, moreover, that they do not utilize your content or derive information from it for marketing or advertising.

  4. Storage

Whether you can choose the region, country or city in which your data is stored and what type of storage is deployed. Ensure the provider does not move, modify, add, delete or replicate your data without your prior consent.

  5. Encryption

Encryption provided by the vendor: the type (at rest, in transit, in-use), the algorithm (Symmetric such as Advanced Encryption Standard (AES) or Asymmetric with the likes of Rivest–Shamir–Adleman (RSA) and Elliptic Curve Cryptography (ECC)), the encryption keys and the Key Management.
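
As a small, generic illustration of symmetric encryption applied before data leaves your environment, here is a sketch using the Python cryptography package's Fernet recipe; key handling is deliberately simplified and the payload is a placeholder.

```python
# Sketch: encrypt data client-side before it is stored in the cloud, using the
# `cryptography` package's Fernet recipe (AES-128-CBC plus HMAC under the hood).
# Key storage/management is deliberately simplified here.
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # in practice, keep this in a KMS or vault
fernet = Fernet(key)

ciphertext = fernet.encrypt(b"customer record: account 42")
plaintext = fernet.decrypt(ciphertext)

assert plaintext == b"customer record: account 42"
print(ciphertext[:16], "...")        # opaque token safe to store remotely
```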

  6. Exception

Whether the provider has any exception policy for intentionally disclosing our data to other parties, usually due to a legal obligation, illegal conduct or a binding order. If it happens, you need to know to whom your data is being disclosed and for what purpose, and the provider needs to notify you prior to the disclosure.

As you might already be aware, data integrity, one of the key aspects of information security, is the degree to which data is consistent, accurate and complete over its entire lifecycle.

To maintain it, you and the cloud provider must, hand-in-hand, provide such assurance.

First, ensure the data cannot be modified by an unauthorized individual, entity or program. This can be done by deploying access controls through an Access Control Matrix (ACM) or Access Control List (ACL) specifying username, role, privilege, menu, function and object. A forensic tool may also be needed to recover from accidental deletion by authorized users. In addition, implement another control, a checksum, to verify integrity.
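
As a small, provider-agnostic illustration of the checksum control mentioned above, the sketch below compares SHA-256 digests computed before and after a transfer; the file names are placeholders.

```python
# Sketch: verify file integrity with a SHA-256 checksum. Compare the digest
# recorded before upload with the digest computed after download; any mismatch
# indicates the data was altered or corrupted in between.
import hashlib

def sha256_of(path: str) -> str:
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

expected = sha256_of("report_before_upload.pdf")   # recorded at upload time
actual = sha256_of("report_after_download.pdf")    # recomputed after download
print("intact" if expected == actual else "integrity check failed")
```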

Second, have data backups for occurrences like a power outage, database crash or storage failure. Given that the data is corrupted, try to identify the root cause and then recover it immediately. If that does not succeed, restore the correct data from the backup. Regardless of the storage media utilized for the backup, always keep it in a separate logical, or better still physical, premises and location; a secure, confidential, safe one, obviously. The security policy, both logical and physical, applied to the primary and backup data must be the same.

Third, implement algorithms and protocols, namely Message-Digest algorithm 5 (MD5), the Advanced Encryption Standard (AES), the Secure Hash Algorithm (SHA) and Rivest–Shamir–Adleman (RSA), to provide maximum levels of integrity protection against any tampering or unauthorized access, specifically for data stored in the public cloud.

Fourth, get data integrity verified through IT audit activities. These can be conducted by internal (your side/the provider's end) or external entities (third-party/independent). As the third layer of defence inside an organization, IT auditors will assess, validate and test IT General Controls and IT Application Controls as necessary to verify the consistency, accuracy and completeness of certain static as well as dynamic data.

In more general terms, data and information privacy is the set of rights you have to control how your data and information are managed across their entire lifecycle: when acquired, maintained, used, published, transferred, stored, archived and disposed of.

Even though it looks quite similar to data and information security, and in most cases the two definitions overlap, they have a major difference to keep in mind.

Privacy centres on how data and information are used and governed through certain policies, laws and regulations. The primary focus of security, on the contrary, is how data and information are protected from countless threats and vulnerabilities. Consequently, the latter alone is not adequate to deal with privacy.

What you can do is collect as much information as possible from the cloud provider on:

  1. Processes

How your data and information are processed in the cloud together with but not limited to where the provider is from, their head and in-country office, storage media, storage/server location, backup media and its location.

  2. Control

How they enable users to have proper controls over their data and information across its lifecycle.

What are the controls – administrative, technical and physical – the provider deploys such as policy, procedure, mechanism, standard related to data and information privacy?

  3. Guarantee

How they assure that our data and information are appropriately managed, and the compensation they bring to the table if privacy is breached.

  4. Responsible Party

The entity that is responsible for ensuring compliance to a certain standard, applicable law and regulation, along with regulatory requirements.

  5. Third-Party

Whether there is any subcontractor involved in providing products and services to the cloud provider, and to what extent this vendor is involved in processing your data and information.

  6. Standard and Framework

The standards and frameworks on data and information privacy the provider follows and complies with. You also need to know and understand their implications for your data and information privacy.

  7. Law and Regulation

The laws and regulations on data and information privacy the provider complies with. Be aware of similar laws and regulations your country may have, and become acquainted with their implications and consequences.

  8. Cross-Border

Identify the processes for how the provider deals with cross-border data transfer if we store and process our data in multiple sites across several geographical premises in a number of countries.

Expect to come across this popular question in cloud computing scenario based questions.

Make sure you develop a business case in the first place that consists of a minimum of three options put into the spotlight and one recommendation. Those alternatives unveiled could possibly be, for instance, #1 public cloud, #2 private cloud, #3 hybrid cloud, #4 business as usual or do nothing – assuming what your organization needs is, for instance, Infrastructure as a Service (IaaS).

In detail, a business case is a written document typically containing material related to a new business or business improvement idea intended to convince the respected decision-makers to take any action. Into the bargain, it is aimed to justify the investment of resources and finances then obtain the stakeholder’s approval based on research, analysis and facts.

According to the commonly accepted industry practice, the business case shall constitute:

  1. Executive Summary

A high-level view explaining the problem the proposed alternative is intended to solve, major considerations, desired deliverable, as well as the predicted business and financial aspects the recommendation shall achieve.

  2. Problem/Opportunity Description

What problems will be solved by implementing IaaS, or what opportunity will your organization benefit from with an IaaS deployment?

  3. Solution Options

The four alternatives mentioned earlier are slated here.

  4. Business Value Analysis

A structured approach in which we can compare the solutions for effective decision-making, e.g. a SWOT analysis or a real options decision tree.

  5. Cost-Benefit Analysis

An advanced level of analysis is needed to evaluate whether the solution being pursued is also financially viable e.g. Cost-Benefit Analysis/Ratio (CBA or CBR), Net Present Value (NPV), Internal Rate of Return (IRR), Return on Investment (ROI).

  6. Risk and Mitigation

Identify risks of each option (public, private, hybrid, stay as is) and how to deal with them through mitigation activities.

  7. Implementation Plan

Provides a realistic picture of how each proposed alternative will be rolled out in high-level point-of-view including its approach, timeframe, benefits, costs, and quality.

  8. Assumptions and Dependencies

Assumptions should be validated and confirmed since the viability of an alternative depends on them. Similar treatment goes to dependencies, because they become the prerequisites for a specific alternative/solution.

  9. Recommendations

Disclose the one of the four alternatives, assessed with the Business Value Analysis and Cost-Benefit Analysis, that is the recommended option.

Numerous success factors that should be thought about are:

  1. IT Infrastructure

Start with development and testing environment first. Leave the production system and its configuration out to minimize the probability and more importantly impact of the concerns you may encounter.

  2. Criticality

Choose to move the least time- and mission-critical systems first, because if there is any incident, your organization's business will suffer less.

  3. Complexity

Identify and migrate the systems with the simplest architecture as the action will be considered as low risk.

  4. Usage

Assess your systems entirely then decide which one you are keen to migrate. Analyze minimum specifications, configurations and actual usage in various situations (high, medium, low workload). Upon completion, identify and choose the appropriate cloud environment to match, whenever and wherever possible, all your organization’s requirements.

  5. Licensing

Be mindful of the licensing, including the model and cost as well as the Terms and Conditions of the cloud service (IaaS, SaaS, PaaS) and model (public, private, hybrid) you will procure, since they differ from vendor to vendor and from one cloud service provider to another.

  6. Service Level Agreement (SLA)

You should assess and evaluate SLA thoroughly including the compensation if the provider is unable to achieve the agreed metrics.

  7. Security and Privacy
  8. Integration

Ensure that you identify and discover application/system dependencies to avoid the unplanned outages and limited functionality that usually occur after the migration is completed.

  9. Architecture

Review each architecture (Application, Data, Infrastructure) comprehensively to achieve optimization of the cloud platform.

  10. Migration

Inquire about the processes, activities, methods, tools needed and or offered by the provider to migrate from their cloud into another cloud.

  11. Network connectivity

Ask the provider about their network availability and bandwidth requirements, because you and other end users will be accessing cloud services and products anywhere, at any time.

In general, the cloud adoption activities should have a pre-adoption review, planning, execution, testing and post-adoption review to make sure things go well for you.

  1. Pre-Adoption Review
  • In the beginning, you could assess and analyze your IT organization's overall readiness to complete this adoption job. Assure that you and your team have the expertise, resources, budget and time to do the items below:
    • Roll out the cloud adoption project, while on the other hand, also doing IT operational activities
    • Fulfill all requirements
    • Deploy new processes as necessary
    • Architect the cloud to achieve metrics, objectives and goals
    • Operate and maintain your cloud services, products or resources
    • Review the migration project based on key success criteria

On condition that most of, or better, all the answers are yes, you could do this assignment alone. Otherwise, you could identify the adoption partner based on scores of measurements like the vendor’s capability, experience, resources, support, tools, portfolio and their client’s testimonials.

  • Identify cloud adoption baseline metrics and Key Performance Indicators (KPI) namely page response time, database load time, availability, confidentiality, integrity (data completeness and accuracy), CPU utilization and memory usage.

Both metrics and KPI will help you to understand the current state of your IT environment and determine whether your adoption is successfully completed.

  • Assuming your answer is the first, you could identify and select the right cloud services (type and model) and also the cloud provider that suits your organization and their requirements
  2. Planning

Develop your cloud-adoption plan that accommodates the below factors:

  • Determine the approach: short sprint vs big bang.
  • Evaluate what to migrate (software, application, platform, infrastructure, data).
  • Consider the architecture and dependency.
  • Identify what to move first.
  • Assess if there is any change in architecture needed.
  • Analyze whether the migration will impact the performance.
  • Discover how the new service will operate.
  • Give thought to operational continuity.
  • Prepare on how to deal with downtime.
  • Recognize governance and compliance concerns.
  • Make ourselves ready with audit and security issues.
  • Work up with other risks.
  • Techniques and tools utilized in the migration.
  • Support (internal and external) during and after the migration.
  3. Execution

In big-bang (do it all at once) scenario, it drives a huge change over a longer period of time as you move your entire computing components over and run a test to see if it works as expected. Presuming you take short-sprint (do it a little bit at a time) option, you migrate your computing component over, validating it then continuing these activities until all components are moved to the cloud.

  4. Testing

You are urged to make sure everything is working by conducting the test. It could be manual or automated, based on plenty of scenarios, by capitalizing the previously agreed baseline metrics and KPIs as key success criteria.

  5. Post-Adoption Review

It constitutes three main points: what went well (good things), what’s the room for improvement (bad things), and what’s the action plan (to improve the bad).

  • CompTIA Cloud Essentials

It covers a basic understanding of cloud computing from both the business and technical points of view, migration from on-premises to the cloud, and the governance of cloud computing environments. It is issued by a non-profit information technology trade association, the Computing Technology Industry Association (CompTIA). No prerequisite is required; nevertheless, the examinee is recommended to have at least six months of working experience in an IT services environment.

  • CompTIA Cloud+

Slightly different from Cloud Essentials, it validates your skills in maintaining and optimizing cloud infrastructure services. Consequently, it assesses your competence to perform data centre jobs effectively and efficiently, such as configuration, deployment, security, troubleshooting, maintenance and management.

  • EXIN Cloud Computing Foundation

Offered by an independent exam and certification company EXIN, it covers cloud computing basic concepts and principles, tests the technical knowledge namely Security and Compliance as well as looks at general aspects inclusive of implementation, management, and evaluation.

  • CCC Cloud Technology Associate

For those who are relatively new to cloud computing, this credential is assessing your basic knowledge of Cloud Computing concepts. Developed by joint forces between EXIN and an international member-based organization Cloud Credential Council (CCC), it tests your understanding of the main concepts of Cloud Services Model, Virtualization, Cloud Technologies and Applications, Security, Risk, Compliance, Governance, Adoption, and Service Management.

  • AWS Certified Solutions Architect – Associate

This Amazon Web Services (AWS) Certified Solution Architect’s accreditation is divided into two paths: Associate and Professional. The first is aimed to assess the individual knowledge in architecting and deploying secure, robust systems on AWS while, on the one hand, it’s also a prerequisite to achieving the professional certification. In the second place, it also validates your ability to define solutions based on customer/end-user requirements using architectural design principles and provides implementation guidance to your organization based on best practices throughout the Project Life Cycle.

  • AWS Certified Solutions Architect – Professional

What’s more, the professional path targets individuals with two or more years of hands-on experience in designing and deploying cloud architecture and architecting and implementing dynamically scalable, highly available, fault-tolerant, and reliable applications on AWS. It also validates the exam taker’s competence in migrating complex, multi-tier applications on the platform, designing and deploying enterprise-wide scalable operations and implementing cost-control strategies.

  • Google Cloud Certified – Professional Cloud Architect

As a professional Cloud Architect, you are expected to have the necessary skills and knowledge to enable your organization to leverage Google Cloud technologies. By securing this testament, your ability to design, plan, develop, implement, manage and provision robust, secure, scalable, highly available and reliable cloud architecture using Google Cloud Platform (GCP) along with dynamic solutions to drive business objectives is recognized.

It's no surprise that this one pops up often in interview questions about cloud computing.

To date, many ISO standards have been applied to the cloud. Taking out the expired and withdrawn versions, here is the list:

  • ISO/IEC 17788:2014

Information Technology -- Cloud computing – Overview and vocabulary

  • ISO/IEC 17789:2014

Information Technology -- Cloud computing -- Reference architecture

  • ISO/IEC 17826:2016

Information Technology -- Cloud Data Management Interface (CDMI)

  • ISO/IEC 19086-1:2016

Information Technology -- Cloud computing -- Service level agreement (SLA) framework -- Part 1: Overview and concepts

  • ISO/IEC 19086-2:2018

Cloud computing -- Service level agreement (SLA) framework -- Part 2: Metric model

  • ISO/IEC 19086-3:2017

Information Technology -- Cloud computing -- Service level agreement (SLA) framework -- Part 3: Core conformance requirements

  • ISO/IEC 19086-4:2019

Cloud computing -- Service level agreement (SLA) framework -- Part 4: Components of security and of protection of PII (Personally Identifiable Information)

  • ISO/IEC 19099:2014     

Information Technology -- Virtualization Management Specification

  • ISO/IEC 19831:2015     

Cloud Infrastructure Management Interface (CIMI) Model and RESTful HTTP-based Protocol -- An Interface for Managing Cloud Infrastructure

  • ISO/IEC 19941:2017

Information Technology -- Cloud computing -- Interoperability and portability

  • ISO/IEC 19944:2017

Information Technology -- Cloud computing -- Cloud services and devices: Data flow, data categories and data use

  • ISO/IEC TR 22678:2019

Information Technology -- Cloud computing -- Guidance for policy development

  • ISO/IEC TR 23186:2018

Information Technology -- Cloud computing -- Framework of trust for processing of multi-sourced data

  • ISO/IEC 27017:2015

Information Technology -- Security techniques -- Code of practice for information security controls based on ISO/IEC 27002 for cloud services

  • ISO/IEC 27018:2019

Information Technology -- Security techniques -- Code of practice for protection of PII in public clouds acting as PII processors

Like any other ISO standards, conforming to them has many benefits for the provider’s businesses: building credibility at the international level, saving time and money by identifying and solving recurring problems, and improving and enhancing the system and process efficiency and effectiveness. On top of that, it is also living proof, publicly accessible, that the provider has properly managed their information security, including its risk, fulfilled their audit requirements and established trust both internally and externally that controls are properly placed and implemented in order to serve their customers better and hence increase their satisfaction level.

You, as the user, are urged to assess their ISO certification. Critical points to reflect on are: which product, service, or location does it actually cover? Is the certification for the entire organization or only for their head office exclusive of their branches? Who issues the certification and whether the issuer is one of the ISO-accredited bodies? For certain, you must see the original certificate and witness what information revealed there.

Advanced

A hypervisor is software used to virtualise a physical server into logical servers to optimise resource utilization. Hypervisors are divided into two types.

Bare-metal hypervisors, which are deployed directly on the physical server, are classified as Type 1 hypervisors. Some examples of Type 1 hypervisors are Microsoft Hyper-V, VMware ESXi and Citrix XenServer.

When a hypervisor runs on top of an operating system, it is a Type 2 hypervisor; examples are KVM and Oracle VirtualBox.

Multi-cloud is a cloud deployment model where IT infrastructure resources like compute, storage and network bandwidth are consumed from multiple cloud services or an in-house data centre to complete business transactions. It pools resources from different cloud service providers, or combines IT resources from an in-house data centre with cloud services. This model is a good fit where a business function's resource needs cannot be met from one location.

A common question in basic interview questions on cloud computing, don't miss this one.

The cloud hosting drivers to consider when identifying workloads are the following:

  1. High growth workloads: When workloads are growing faster than anticipated or at an unexpected rate, they are a fit for cloud hosting. The cost of hosting reduces as resource capacity increases, because capacity in the cloud is cheaper, due to economies of scale, compared to private infrastructure capacity.
  2. Throughput-intensive applications: Reporting and analytics workloads are compute intensive, processing large sets of data and requiring high throughput. Look for the top applications in terms of throughput; these kinds of applications are a right fit for cloud hosting.
  3. Low I/O density workloads: Workloads with low I/O density that are not sensitive to end-user response time and have no dependency on on-premises workloads are a right fit for cloud hosting.

Business applications which handle mission-critical, ERP and sensitive data are not fit for cloud hosting. Applications which run on platforms other than Intel are also not fit for immediate migration to a cloud platform.

  • Performance issues: Applications which demand low latency, high network throughput and high performance are not fit for cloud hosting.
  • Sensitive data: Data which is sensitive in nature, like credit card information, health records and customers' bank details, needs to be protected; this information must be maintained within the organization's boundary and cannot be shared with a third party due to regulatory restrictions, so it cannot be hosted in the cloud.
  • Application architecture: Business applications which have been running consistently over the years and meeting performance and operational demands are not fit for cloud hosting, as the change of environment from physical to virtual may affect the performance and availability of services.

In that case, we need to create a storage account (V1 or V2) based on the requirements, create the file storage and create the directory. We then click on the Connect button and map the drive to the customer's servers. The portal steps are listed below, followed by a minimal SDK sketch.

  • Click on the storage account.
  • Click on the file share.
  • Provide the name and a quota of up to 5 TB for each file share.
  • Create the file storage account.

 Azure File storage
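
The same outcome can also be scripted; below is a minimal sketch using the azure-storage-file-share Python SDK, where the connection string, share name and directory name are placeholders.

```python
# Sketch: create an Azure file share and a directory with the
# azure-storage-file-share SDK (an alternative to the portal steps above).
# The connection string and names are placeholders.
from azure.storage.fileshare import ShareClient

conn_str = "<storage-account-connection-string>"   # placeholder

share = ShareClient.from_connection_string(conn_str, share_name="projectfiles")
share.create_share()                               # the quota can also be set here
share.create_directory("reports")                  # directory inside the share

print("Share and directory created")
```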

  • Azure SQL Managed Instance: Azure SQL Managed Instance has dedicated RAM, CPU and storage, and we can enable a VNet to secure the database. It provides nearly 100% compatibility with on-premises SQL Server, and the database can be integrated with a VNet for more security. In SQL Managed Instance we can use a private IP address.
  • Azure SQL General Purpose: It is a PaaS service on a shared model; we cannot integrate a VNet, and it has a storage limitation of up to 4 TB. It is a fully managed SQL database engine based on the Enterprise edition of SQL Server. It is a database-as-a-service hosted in the cloud, built on standard hardware and software owned and managed by Microsoft.

Azure SQL managed instance V/S General-purpose Instance

It collects logs via Azure Monitor and stores them in a Log Analytics workspace for analysis and alerting. We can also query the workspace to find specific alerts or logs if required. It is basically a monitoring tool which monitors most of the Azure services and collects logs in various ways.
What is Azure Log Analytics?

I will click on New, go to the Marketplace, search for Web Apps, provide the details and create it.

  • In the Azure Portal, click Create a resource.
  • Select Web + Mobile > Web App.
  • Select your subscription.
  • Create a resource group.
  • Create an App Service plan.
  • Click Create.

Azure WebApps,

  • Cosmos DB: It is globally distributed and horizontally scalable, and we can integrate the MongoDB API, DocumentDB API, Graph API, etc. Azure Cosmos DB is a NoSQL database-as-a-service designed for elasticity and flexibility. We can use it for IoT and global-facing, cloud-based applications.
  • SQL DB: It is a database-as-a-service with a relational DBMS. It is a traditional SQL database server which we can use as a database service for cloud-based applications, and we can integrate it with APIs like OLE DB, tabular systems, ADO.NET, etc. It provides compatibility with on-premises SQL databases.

Blob storage is used to store massive amounts of unstructured data such as JPEG files or archived files. It is a cloud-based solution which provides durability and high availability, and it is a secure, manageable solution for larger data. We can access the storage account easily over HTTP/HTTPS, via the API, etc.

A few of the scenarios in which you will use blob storage accounts (a minimal upload sketch follows this list):

  • We can access images or documents directly to a browser.
  • Storing files for distributed access.
  • Streaming video and audio.
  • Storing data for backup and restore disaster recovery, and archiving.
  • Storing data for analysis by an on-premises or Azure-hosted service.
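
As a minimal illustration of the programmatic access mentioned above, the sketch below uploads an image with the azure-storage-blob Python SDK; the connection string, container and blob names are placeholders.

```python
# Sketch: upload an image to Azure Blob storage with the azure-storage-blob SDK,
# so it can then be served to a browser over HTTPS. The connection string,
# container and blob names are placeholders.
from azure.storage.blob import BlobServiceClient

conn_str = "<storage-account-connection-string>"   # placeholder
service = BlobServiceClient.from_connection_string(conn_str)

container = service.get_container_client("website-assets")
blob = container.get_blob_client("images/logo.png")

with open("logo.png", "rb") as data:
    blob.upload_blob(data, overwrite=True)

print(blob.url)   # HTTPS endpoint of the uploaded blob
```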

Below is the list

  • BigQuery: It is a serverless, highly scalable service designed to analyze data by creating a logical data warehouse. We can build and operationalize machine learning solutions with simple SQL (a minimal query sketch follows this list).
  • Cloud Composer: It is a fully managed workflow orchestration service that is used to create, manage, schedule and monitor workflows.
  • Google Marketing Platform: It is a bundle of services such as Display & Video 360, Analytics 360, Data Studio, Optimize 360, etc.
  • Cloud Dataflow: It is a collection of SDKs for building batch or streaming parallelized data-processing pipelines.
  • Cloud IAM: It is the identity and access management service which helps control access and gives administrators the rights to manage permissions over Google Cloud Platform services.
  • Firebase Authentication: Provides backend services, easy-to-use SDKs and ready-made UI libraries to authenticate users in apps. It uses OAuth 2.0 and the OpenID Connect protocol for authentication.
  • Cloud Security Scanner: It is a web security scanner for vulnerabilities in App Engine, Compute Engine and other applications. It automatically scans for and detects common vulnerabilities, including cross-site scripting, Flash injection, mixed content, etc.
  • Cloud HSM: It is a hardware security module (HSM) service; we can create keys, encrypt data and meet compliance mandates.
  • Cloud Key Management Service: It is a cryptographic key management service. Cloud KMS is a cloud-hosted key management service that helps you manage cryptographic keys for cloud services. We can generate, use, rotate and destroy AES-256, RSA 2048 and RSA 3072 cryptographic keys.
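
As a minimal illustration of the "simple SQL" point in the BigQuery item above, the sketch below runs a query against a well-known public dataset with the google-cloud-bigquery client; the project and credentials are assumed to come from the environment.

```python
# Sketch: run a SQL query against a BigQuery public dataset with the
# google-cloud-bigquery client. Project and credentials come from the environment.
from google.cloud import bigquery

client = bigquery.Client()

query = """
    SELECT name, SUM(number) AS total
    FROM `bigquery-public-data.usa_names.usa_1910_2013`
    GROUP BY name
    ORDER BY total DESC
    LIMIT 5
"""

for row in client.query(query).result():   # waits for the query job to finish
    print(row.name, row.total)
```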

Google Stackdriver provides in-depth diagnostics and monitors the health of App Engine and other Google services, sending out alerts for the same.

It collects metrics, logs and events from Google Cloud infrastructure, applications and other operations running on the Google platform. Based on the collected logs, it speeds up root-cause analysis (RCA) and reduces the time to resolution. It does not require any extra integration to provide support to developers.

Amazon RDS is easy to set up and operate. It is a highly scalable relational database service in the AWS cloud and a cost-effective solution. We can resize the capacity of an RDS instance when it is not in use, and it helps reduce administration, patching and backup tasks by automating these processes.

Amazon RDS is available on several database instance types, optimized for memory, performance or I/O, and provides you with six familiar database engines, including Amazon Aurora, PostgreSQL, MySQL, MariaDB, Oracle Database, and SQL Server. We can use the AWS Database Migration Service to easily migrate or replicate your existing databases to Amazon RDS.
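
As a minimal illustration of provisioning an RDS instance programmatically, here is a boto3 sketch; the identifier, credentials and instance class are placeholders.

```python
# Sketch: provision a small MySQL instance on Amazon RDS with boto3.
# Identifier, credentials and instance class are placeholders.
import boto3

rds = boto3.client("rds", region_name="us-east-1")

rds.create_db_instance(
    DBInstanceIdentifier="demo-mysql",
    Engine="mysql",
    DBInstanceClass="db.t3.micro",
    AllocatedStorage=20,                  # GiB
    MasterUsername="admin",
    MasterUserPassword="change-me-now",   # placeholder; use Secrets Manager in practice
)

waiter = rds.get_waiter("db_instance_available")
waiter.wait(DBInstanceIdentifier="demo-mysql")
print("RDS instance is available")
```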

  • In Azure: Cost Management/Cloudyn, which helps you manage and optimize cost. It uses algorithms to analyse billing, the usage of each service, under-utilized resources and reserved instances, and to provide suggestions. We can enable weekly, monthly and yearly billing reports as per customer requirements.
  • In AWS: Cost Explorer is a tool that helps you view and analyze your costs and usage. We can explore usage and costs using the main graph, the cost and usage reports and the Cost Explorer RI reports. We can view data for up to the last 13 months and forecast how much we will spend over the next three months as per customer requirements. It also provides recommendations for Reserved Instance purchases, which can save cost.

Azure Recovery Services vaults are used to take backups of VMs and other services, and they provide migration features which can be utilized if we need to migrate on-premises VMs to Azure.

It is also used for on-premises migration (Hyper-V, VMware and physical server migration to Azure). It supports backup for Azure VMs (Linux/Windows), Azure File storage, PaaS SQL, WebApps, SQL DB on Azure VMs, etc. We can configure a daily backup policy and schedule the backups, and backups can be retained for up to 999 years. It provides fine-grained access management through RBAC. We can configure Site Recovery using the Azure portal for backup and for migrating the on-premises environment to Azure.

What is Azure Recovery Vault?

  • Certificate of Cloud Security Knowledge (CCSK)

This vendor-neutral accreditation, issued by the Cloud Security Alliance, certifies your understanding of security issues and best practices across a broad range of cloud computing domains, including architecture, governance, compliance, operations and encryption.

  • Certified Cloud Security Professional (CCSP)

Getting the certification provided by Cloud Security Alliance and International Information System Security Certification Consortium known as (ISC)² indicates you have advanced technical skills and knowledge to design, manage and secure data, applications and infrastructure in the cloud using best practices, policies and procedures.

Nevertheless, it is considered one of the more advanced credentials; consequently, those who are interested in pursuing it are required to have five years of full-time working experience in IT fields. Three of those years must relate to information security, whereas one year must pertain to architectural concepts, design requirements, security of cloud data, cloud platform and infrastructure, cloud applications and operations, or legal and compliance.

If and only if you have earned CCSK, the previously mentioned one-year requirement will be waived. Supposing that you already hold Certified Information Systems Security Professional (CISSP) certification, it can replace the entire five-year requirement.

  • Certified Integrator Secure Cloud Services

This EXIN certification focuses on the interconnection of three areas: Service Management, Cloud Computing, and IT Security. That being the case, you will be automatically granted it without any cost when you possess their three foundational certificates: EXIN Information Security, EXIN Cloud Computing Foundation, and (IT) Service Management.

  • Professional Cloud Security Manager

Issued by Cloud Credential Council (CCC) and managed by EXIN, it recognizes skills and knowledge an individual possesses on security, risk and compliance cloud computing issues. In detail, the certificate primary focus is on the intersection between business and technical security challenges in an enterprise’s cloud computing environment. Five years of working experience in enterprise security with a deep understanding of cloud computing services and deployment models are the recommended prerequisites.

Speaking about applicable laws and regulations to certain data and information, there is a term called ‘data sovereignty’ or ‘information sovereignty’. In this case, since you’re asking about data, the first expression will be capitalized.

Basically, it is subject to numerous laws and regulations of the country in which the data is located or stored, used and transmitted – both sent and received.

Since the early days of the cloud, this has been one of the key challenges when an individual or organization wants to move into the cloud: the government or authority insists that the data should never leave their jurisdiction, which means we cannot place it in our desired services.

Thus far, there is no international policy, standard, or agreement which provides one set of data sovereignty’s requirements that all countries should be following.

Day in, day out, it gains more weight, and in response many countries have established and regulated compliance requirements by amending current laws or enacting new legislation that requires customer data to be kept within the country in which it resides. Over the past few years, this kind of obligation has been enforced in Vietnam, Brunei, Iran, China, Brazil, India, Australia, South Korea, Nigeria, Russia and Indonesia.

Additionally, the laws and regulations vary by country whilst some are, in fact, stricter than the others. Some of them mandate their citizens’ data is stored on physical servers within the country’s physical borders. Australia, for example, commands the provider to reveal what information is being sent outside the country.

In addition to that, the European Union (EU) restrict the transfer of Personally Identifiable Information (PII) to countries outside their member countries. PII itself is the type of data that could potentially identify a specific individual and refers to a relatively narrow range of data such as name, address, birth date, credit card number or bank account.

What you could do: you'd better know the storage, server and any other device where your data will reside, what is in the fine print, and whether the provider has already complied with the data sovereignty laws of the country where your data is located. If your government requires you to store your data in the country where you are based, make sure of two things. First, the provider has its storage deployed there. Second, its obligations under the applicable laws and regulations on data sovereignty are already fulfilled.

Cloud providers offer what they call 'cloud management' tools, aimed at providing administrative control over public, private and hybrid clouds. The software is intended to let users manage capabilities, availability, security, utilization, resource allocation, workflow, automation, workload balancing, capacity planning, monitoring, controlling, orchestration, provisioning, budgeting, cost and expense, performance, reporting and even the migration of the cloud products and services we have subscribed to.

On top of that, there are two types of cloud management software: in-house software developed and offered by the public, private or hybrid cloud provider, and mass-market products from third-party vendors that complement the aforementioned tools.

Over and above that, if you choose a public cloud then you will often be given the option to manage your services with third-party tools, simply because the servers, storage, networking and other infrastructure operations are taken care of by the provider.

Besides, if you are a private cloud user, the tool is required to create the virtualization and virtualized computing resources, deal with resource allocation, security, monitoring, tracking as well as billing through a self-service portal.

Cloud management is more complex to handle when it comes to hybrid cloud due to obligation on having to deal with the network, computing, and storage devices across multiple domains including but not limited to installation, configuration, administration of instances, images, user accounts, and their access rights as part of Identity and Access Management.

Regardless of a native and third-party tool designed to provide rich functionality across one or multiple cloud providers, the platform must be able to provide the following features at the very least:

  • Provisioning
    • Acquire cloud types (public, private, hybrid) as well as cloud services, products and resources from the provider through a self-service, advanced or dynamic subscription model. 
    • Create, edit and delete resources.
    • Workloads Management.
    • Automation and Orchestration
    • Configuration Management
    • Cloud Deployment and Consumption
    • Virtual Machine Management
    • Application Migration
    • Workflow Orchestration
  • Service Management
    • Receive and fulfil a request to access.
    • Deploy and manage cloud resources.
  • Information and Event Management
    • Monitor performance and other metrics.
    • Manage incident, problem and log files.
  • Change Management: manage the configuration and changes.
  • Security Management
    • Identity and Access Management
    • Encryption Implementation
    • Key Management
    • Endpoint Security
    • Mobile Device Security
  • Compliance and Governance
    • Risk Assessment
    • Vulnerability Assessment
    • Threat Analysis
    • Compliance to Standards, Laws, and Regulations
    • Audit, Review and Investigation
    • Service Governance
    • Resource Governance
  • Cost Management: budgeting, expense, rightsizing, chargeback and billing.
  • Performance Monitoring: monitor the network, application, storage, computing and other resources.
  • Continuity and Recovery: enable business continuity and disaster recovery processes and activities across our cloud environment.
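As a concrete illustration of the Provisioning item above, the following is a minimal sketch of programmatic VM provisioning using Python and boto3 against AWS EC2; the AMI ID, region and tag values are placeholder assumptions, and most cloud management platforms wrap calls like these behind their own consoles and APIs.

```python
import boto3

# Hypothetical example: provision, tag and later terminate a single VM.
# AMI ID, region and tag values are placeholders, not real resources.
ec2 = boto3.client("ec2", region_name="us-east-1")

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",   # placeholder AMI
    InstanceType="t3.micro",
    MinCount=1,
    MaxCount=1,
    TagSpecifications=[{
        "ResourceType": "instance",
        "Tags": [{"Key": "CostCenter", "Value": "cloud-mgmt-demo"}],
    }],
)
instance_id = response["Instances"][0]["InstanceId"]

# Wait until the instance is running, then release it (create/edit/delete).
ec2.get_waiter("instance_running").wait(InstanceIds=[instance_id])
print(f"Provisioned {instance_id}")
ec2.terminate_instances(InstanceIds=[instance_id])
```

Tagging at creation time is what later makes cost management, chargeback and governance reporting possible.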

Since your company has yet to decide which vendor to pick, a few frameworks you could check out are:

  • NIST Special Publication 500-316 on Framework for Cloud Usability

If you find it challenging to deal with User Experience (UX) as cloud systems grow in complexity and diversity, this framework from the National Institute of Standards and Technology (NIST), under the U.S. Department of Commerce, comes into the picture.

Framework for Cloud Usability

You can evaluate your cloud UX and user expectations in a more structured way through the five attributes and 21 elements it provides.


  • NIST Special Publication 500-292 on Cloud Computing Reference Architecture

Developed by the NIST Cloud Computing Reference Architecture and Taxonomy Working Group, this paper presents a high-level conceptual model for defining the requirements, structure and operations of cloud computing.

It is divided into two parts. One is an overview of the actors, including their roles and the architectural components for managing and providing cloud services, such as service deployment, service orchestration, cloud service management, security and privacy. The other is a taxonomy, presented in its own section and appendices, consisting of terms, definitions and examples of cloud services.


There is no one-size-fits-all framework; each has its pros and cons. For this reason, it is important to analyze the available frameworks and weigh them using a cost-benefit approach or other key metrics.

Alternatively, you could develop your own framework, design a hybrid one by combining yours with an existing framework, or even combine several frameworks to help your organization meet its unique requirements and business objectives.

  • Cloud Security Alliance’s Cloud Controls Matrix version 3.0.1

If you need to assess the security risks of a cloud provider, this framework will bear fruit; it provides fundamental security concepts and principles, organized into 16 domains and 133 controls, for the vendor to follow.

Known for short as the CCM, from the vendor’s perspective it helps strengthen security control environments by emphasizing business information security control requirements and by identifying and mitigating security threats and vulnerabilities in the cloud. The matrix also offers a cloud taxonomy and terminology, security measurements, and a standardized way of managing security, IT and operational risk.

| Sr. No. | Cloud Control Matrix - Domains | No. of Controls for Each Domain |
| --- | --- | --- |
| 1 | AIS: Application & Interface Security | 4 |
| 2 | AAC: Audit Assurance & Compliance | 3 |
| 3 | BCR: Business Continuity Management & Operational Resilience | 11 |
| 4 | CCC: Change Control & Configuration Management | 5 |
| 5 | DSI: Data Security & Information Lifecycle Management | 7 |
| 6 | DCS: Datacenter Security | 9 |
| 7 | EKM: Encryption & Key Management | 4 |
| 8 | GRM: Governance and Risk Management | 11 |
| 9 | HRS: Human Resources | 11 |
| 10 | IAM: Identity & Access Management | 13 |
| 11 | IVS: Infrastructure & Virtualization Security | 13 |
| 12 | IPY: Interoperability & Portability | 5 |
| 13 | MOS: Mobile Security | 20 |
| 14 | SEF: Security Incident Management, E-Discovery & Cloud Forensics | 5 |
| 15 | STA: Supply Chain Management, Transparency and Accountability | 9 |
| 16 | TVM: Threat and Vulnerability Management | 3 |

(Source: Cloud Security Alliance)
  • NIST Special Publication 800-144 on Guidelines on Security and Privacy in Public Cloud Computing

If you are considering adopting public cloud computing, this 80-page document is worth a read. It gives you the big picture of the security and privacy challenges, and the crucial points to consider, when you outsource your data, applications and infrastructure to a public cloud provider that owns and operates the infrastructure and computational resources and delivers services to the public via a multi-tenant platform.


The paper does not recommend any specific cloud computing service, service arrangement, service agreement, service provider, or deployment model. Instead, each organization is encouraged to apply its own guidelines when analyzing its requirements, including security and privacy, and to assess, select, engage and oversee the public cloud services that can best fulfil those requirements.

Besides the frameworks explained above, you could also bring the document titled ‘Security Guidance for Critical Areas of Focus in Cloud Computing v4.0’ from the Cloud Security Alliance (CSA) into play. Developed from previous iterations of the security guidance, dedicated research, and public participation from CSA members, working groups, and industry experts, it explains how to manage and mitigate security risks when adopting cloud computing technology while also offering guidance and insights to support business goals.

 Critical Areas of Focus in Cloud Computing

  • Information System Audit/Assurance Program: Cloud Computing Management 

This document, issued by ISACA, is intended for both parties, cloud users and cloud providers, so they can assess the design and operating effectiveness of cloud computing internal controls (administrative, physical, technical) and security, and identify internal control discrepancies and deficiencies within the end-user organization and its interface with the service provider. In essence, we can use this guide to report the results of an audit assessment and to gauge our ability to rely on our own IT department and/or the cloud provider’s attestations on internal controls.

  • Cloud Security Framework Audit Methods

As the title suggests, this white paper from the SANS Institute guides us on how to conduct a security audit of our cloud environment, and is also aimed at cloud providers auditing their own environments.

It comprises an audit methodology, an audit checklist, and the standards, laws and regulations we can use to identify security risks and, ultimately, test the respective controls.

The areas to be audited are as follows:

  1. Governance
  2. Data Management
  3. Data Environment
  4. Cyber Threat
  5. Infrastructure
  6. Logs and Audit Trails
  7. Availability
  8. Identity and Access Management
  9. Encryption
  10. Privacy
  11. Regulatory Compliance
  12. Legal
  • Information Technology Assurance Framework

As we might already know, ISACA develops the IT Assurance Framework (ITAF) as a guideline that provides information and direction for the practice of IT audit and assurance. It also offers tools, techniques, methodologies, and templates to direct the application of IT audit and assurance processes. Read up on ITAF sections 3400 – IT Management Processes, sections 3600 – IT Audit and Assurance Processes, and keep an eye on sections 3800 – IT Audit and Assurance Management.

It is safe to say that components such as the audit objective, scope, risk, plan and methodology/approach, along with the procedures (processes and techniques), are much the same as in other types of IT or IS audit engagement.

The main thing is that in the cloud, with shared resourcing, multi-tenancy and geolocation, the boundaries are difficult to define and isolate, while end-user-specific transactional information is difficult to obtain. As such, IT assurance needs to become more real-time, continuous and process-oriented rather than transactional in focus, and cloud providers need to provide greater transparency to their clients.
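As a small, hedged illustration of what “more real-time, continuous” assurance can look like in practice, the sketch below queries recent management events from AWS CloudTrail with Python and boto3; the event name, region and lookback window are assumptions, and a real continuous-audit pipeline would feed such events into a SIEM rather than print them.

```python
from datetime import datetime, timedelta, timezone
import boto3

# Hypothetical example: pull the last 24 hours of console sign-in events
# as a tiny building block for continuous, evidence-based assurance.
cloudtrail = boto3.client("cloudtrail", region_name="us-east-1")

end = datetime.now(timezone.utc)
start = end - timedelta(hours=24)

events = cloudtrail.lookup_events(
    LookupAttributes=[{"AttributeKey": "EventName", "AttributeValue": "ConsoleLogin"}],
    StartTime=start,
    EndTime=end,
    MaxResults=50,
)["Events"]

for event in events:
    print(event["EventTime"], event.get("Username", "unknown"), event["EventName"])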

Objective

Organizations should strive to align their business objectives with the objectives of the audit. During the planning stage, the auditor identifies the objectives and agrees on them with the auditee. The auditor then uses these objectives as the basis for concluding on the evidence obtained. Some notable objectives are:

  • Provide stakeholders with the results of an assessment of the effectiveness of the cloud computing service provider’s internal controls.
  • Identify internal control deficiencies within the end user’s organization and its interface with the service provider.
  • Provide stakeholders with an assessment of the quality of, and their ability to rely upon, the service provider’s attestations related to internal controls.

The above controls also include IT application controls, not merely IT general controls, which are aimed at providing assurance over a specific application, its functionality and its suitability.

To get an idea of the controls, including their objectives in a cloud environment, have a look at ISACA’s Control Objectives for Information and Related Technologies (COBIT). Even though it was developed as a general control framework, some of its control objectives are applicable to the cloud.

Scope

  • Governance that affects cloud computing
  • Contractual compliance between the user and the service provider
  • Control issues and concerns specific to cloud computing

When it comes to IT general controls, the auditor on the customer’s side shall review:

  • Identity and Access Management (IAM)

If your IAM system is integrated with the cloud computing system (a minimal evidence-gathering sketch appears below, after this list)

  • Security Incident Management

To interface with and manage cloud computing incidents

  • Network Perimeter Security 

As an access point to the internet

  • Systems Development and Maintenance

If the cloud is part of your application infrastructure

  • Project Management
  • IT Risk Management
  • Data Management 
  • Vendor Management 
  • Vulnerability Management

It is also important to note that the controls that are maintained by a vendor are not included in the scope of a cloud computing audit.
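As a hedged illustration of the Identity and Access Management review item above, the sketch below gathers one common piece of audit evidence, IAM users without MFA, using Python and boto3; the choice of AWS IAM and of MFA as the checked control are assumptions, and your own IAM review would follow the objectives agreed with the auditee.

```python
import boto3

# Hypothetical example: list IAM users that have no MFA device enabled,
# a typical item of evidence in an Identity and Access Management review.
iam = boto3.client("iam")

for page in iam.get_paginator("list_users").paginate():
    for user in page["Users"]:
        mfa = iam.list_mfa_devices(UserName=user["UserName"])["MFADevices"]
        if not mfa:
            print(f"User without MFA: {user['UserName']}")
```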

Methodology/Approach

It is common practice for an organization to use these two approaches to evaluate a cloud provider:

  • Vendor Management

This includes vendor risk assessment, vendor due diligence, vendor rating/tiering, vendor Scope of Work, vendor agreement, and vendor Service Level Agreement (SLA).

  • Independent Assurance

Assurance from a third-party auditor, whether engaged by the cloud provider or the end user.

Procedure

Whether the audit is carried out by your internal function, the vendor’s organizational unit, or a third party, the auditor will draw on a range of processes and techniques to obtain evidence: inquiry into data and documents, assessment, confirmation, recalculation, reperformance, observation, meetings, discussion, inspection, and analytics.

Cloud Governance is a set of standardized policies and practices, involving people, process and technology, applied to the cloud computing environment and designed to ensure that organizational and, more importantly, business objectives are met without exceeding risk tolerance or breaching compliance requirements.

Business goals and objectives vary from one entity to another; the most common are performance, budget/cost optimization, customer satisfaction, employee attraction and retention, and resource productivity.

According to The Open Group, governance answers three huge questions. First, are we doing the right things? Second, are we doing things in the right way? Last, how do we know that we have done both?

The global consortium, which enables the achievement of business objectives through IT standards, views Cloud Computing Governance as a view of IT Governance focused on accountability, defining decision rights and balancing benefit/value, risk, and resources in a cloud environment. At large, it is a subset of overall business governance, which also includes IT Governance and Enterprise Architecture (EA) Governance.

You could put their Cloud Computing Governance Framework to use. As a pool of business-driven policies and principles that establish the appropriate degree of investment and control around the cloud computing lifecycle and its processes, it helps your organization ensure that all associated expenses and costs are aligned with business objectives, foster data integrity organization-wide, stimulate innovation, and manage the risk of data loss and/or non-compliance with regulations.
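As a small, hedged illustration of governance put into practice as code, the sketch below flags EC2 instances that are missing a required cost-allocation tag using Python and boto3; the tag key, region and the choice of EC2 are assumptions, and a mature governance programme would enforce such policies through the provider’s native policy tooling rather than an ad-hoc script.

```python
import boto3

# Hypothetical example: report running instances without a "CostCenter" tag,
# a simple cost-governance and accountability check.
REQUIRED_TAG = "CostCenter"
ec2 = boto3.client("ec2", region_name="us-east-1")

paginator = ec2.get_paginator("describe_instances")
for page in paginator.paginate(
    Filters=[{"Name": "instance-state-name", "Values": ["running"]}]
):
    for reservation in page["Reservations"]:
        for instance in reservation["Instances"]:
            tags = {t["Key"]: t["Value"] for t in instance.get("Tags", [])}
            if REQUIRED_TAG not in tags:
                print(f"Non-compliant instance: {instance['InstanceId']}")
```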

Scope and Relationship between Cloud Computing Governance, IT Governance and EA Governance

Penetration testing helps the organization identify vulnerabilities before a compromise can take place. The process starts by identifying security defects and assigning them severity levels, using manual and automated techniques over a defined period of time. Be mindful that in a cloud computing context there are two types of penetration test: the tests the provider performs on its own platform, and the tests you perform against the provider’s resources, specifically your own systems. Importantly, not all cloud vendors allow penetration testing.

Ideally, the assessment should target the different technology layers, from host, network, storage, server, virtualization, operating system, middleware, runtime and database through to application, taking your cloud service models (SaaS, PaaS, IaaS, etc.) and cloud deployment models into account.
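As a deliberately minimal, hedged sketch of the very first network-layer step of such an assessment, the snippet below checks which of a handful of common ports respond on a host; the hostname and port list are placeholders, and even this simple check must only be run against systems you own and with any approval your cloud provider requires for penetration testing.

```python
import socket

# Minimal, illustrative reachability check against a host YOU own and are
# authorized to test. Hostname and ports below are placeholders.
TARGET = "my-app.example.com"
PORTS = [22, 80, 443, 3306]

for port in PORTS:
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(2)
        result = s.connect_ex((TARGET, port))
        state = "open" if result == 0 else "closed/filtered"
        print(f"{TARGET}:{port} -> {state}")
```

A real engagement would go far beyond port checks, covering the higher layers listed above with dedicated tooling and a documented scope.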

Separation of Responsibilities in Cloud Models


Cloud computing delivers computing services such as servers, storage, databases, networking, software, analytics and intelligence over the Internet (“the cloud”) to offer faster innovation, flexible resources and economies of scale. The organization that offers these services is called a cloud provider.

Today, the scope of cloud computing is huge, as it is a fast-emerging business standard. Many organizations are reaping the benefits of cloud applications in different ways. Features like lower cost, faster speed, global scalability, higher productivity and, most importantly, data protection from potential threats are responsible for the big shift from traditional ways of doing business to cloud computing services.

The rapid shift to the cloud in an era of innovation has led many organizations to employ a cloud-first approach to product design, with many technology and business innovations now available as cloud services. Microsoft is a leading global provider of cloud computing services for businesses of all sizes. Many companies offer cloud computing services and are referred to as cloud computing providers; the top cloud computing companies include Microsoft, SAP, Oracle, Google, IBM, AT&T, and Salesforce.

According to Forrester’s 2019 cloud computing report, Dave Bartoletti, Vice President and Principal Analyst at Forrester, pegged 2019 as the year of widespread enterprise adoption of cloud to power digital transformation efforts. He also stated that "In 2019, cloud computing will be shorthand for the best way to turn disruptive ideas into amazing software."

“Individuals skilled in areas like AI, cloud computing, digital marketing and cyber security are predicted to be in high demand in 2019,” Katie Bardaro, lead economist and vice president of data analytics at PayScale, told FOX Business.

We have hand-picked these top cloud computing interview questions after detailed research to help you in your interview. These cloud computing interview questions and answers for both experienced candidates and freshers will help you excel in your cloud job interview and give you an edge over your competitors. To succeed, go through these questions and practice them as much as possible. You can also look at other cloud computing courses to upskill your career further.

If you want to build your career in the cloud, you need not worry; this set of expert-designed cloud computing interview questions will guide you through cloud interviews. Stay tuned to these questions and prepare beforehand to become familiar with what you may encounter while searching for your dream job. You can also enroll in our Architecting on AWS Certification Training to be better prepared for other cloud career roles.

We hope these Cloud Computing Interview Questions help you crack the interview. All the best!

Happy job hunting!
