Impact On IT Service Management From Cloud Computing

In IT Service Management, the ITIL V3 philosophy stresses that “IT is the business.” This is truly realized for providers offering cloud computing services. There are a number of considerations that affect Service Management processes when moving to a cloud services model.

Service Desk

The Service Desk and the traditional ITIL Service Support processes it performs, Incident and Problem Management, are tightly linked. In the cloud computing model, high expectations of availability are part of the model’s selling point, so rapid restoration of service through these processes, and through the Service Desk that performs them, becomes critical.

Change Management

Change Management workflow activities can sometimes be performed best by the Service Delivery Architects, because they are the ones who define the rules used by the automation tools. Tasks traditionally performed by the Service Management team are now carried out by those automation tools.
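As a rough illustration of what such architect-defined rules might look like in an automation tool, the sketch below auto-approves standard, low-risk changes and routes everything else to the Change Advisory Board. The rule fields and thresholds are hypothetical, not taken from any specific product or from the article.

```python
from dataclasses import dataclass

@dataclass
class ChangeRequest:
    change_type: str          # "standard", "normal", or "emergency"
    affects_production: bool
    tested_in_staging: bool
    blast_radius: int         # number of service instances touched

# Rules of this kind would be defined by the Service Delivery Architects,
# then evaluated automatically rather than manually by the Service Management team.
def auto_approve(change: ChangeRequest) -> bool:
    if change.change_type != "standard":
        return False                      # non-standard changes go to the CAB
    if change.affects_production and not change.tested_in_staging:
        return False                      # untested production changes always need review
    return change.blast_radius <= 5       # small, well-understood changes are pre-approved

request = ChangeRequest("standard", affects_production=True,
                        tested_in_staging=True, blast_radius=2)
print("auto-approved" if auto_approve(request) else "route to Change Advisory Board")
```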

Configuration and Asset Management

With the Cloud service model’s standardized infrastructure and specialized tool sets, configuration is typically much simpler than in an enterprise environment, with its extensive variety of hardware and software that must be orchestrated together. Many service-specific tools provide configuration capability for that service, reducing the amount of manual coordination required when compared to the Enterprise IT model.

Asset management is related to configuration management and, in a cloud service, has both a virtual component (e.g., tracking virtual resources) and a dynamic component (i.e., assets can change every hour). Configuration Management needs to address a consumer view (what assets belong to the service being consumed), a service view (since assets equal revenue), and an enterprise view (showing the business status of all cloud services being offered).
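As a minimal sketch of those requirements, the example below models a configuration item for a dynamically provisioned virtual asset and exposes the three views described above. The class and field names are hypothetical and simplified, not the schema of any real CMDB.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Dict, List

@dataclass
class VirtualAsset:
    """A configuration item for a dynamically provisioned cloud resource."""
    asset_id: str
    asset_type: str        # e.g. "vm", "volume", "load-balancer"
    service: str           # cloud service this asset belongs to
    consumer: str          # customer consuming the service
    hourly_cost: float     # assets equal revenue, so track the billing rate
    last_seen: datetime    # assets can change every hour, so timestamp each observation

class CloudCMDB:
    """Toy configuration store supporting consumer, service, and enterprise views."""
    def __init__(self) -> None:
        self._assets: List[VirtualAsset] = []

    def record(self, asset: VirtualAsset) -> None:
        self._assets.append(asset)

    def consumer_view(self, consumer: str) -> List[VirtualAsset]:
        # What assets belong to the service being consumed by this customer?
        return [a for a in self._assets if a.consumer == consumer]

    def service_view(self, service: str) -> float:
        # Hourly revenue currently attributable to a single cloud service.
        return sum(a.hourly_cost for a in self._assets if a.service == service)

    def enterprise_view(self) -> Dict[str, dict]:
        # Business status of all cloud services being offered.
        summary: Dict[str, dict] = {}
        for a in self._assets:
            entry = summary.setdefault(a.service, {"assets": 0, "hourly_revenue": 0.0})
            entry["assets"] += 1
            entry["hourly_revenue"] += a.hourly_cost
        return summary
```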

Service Level Management

With a cloud environment, a single SLM process can exist, but separate SLAs and Service Level Packages should be defined, monitored, and managed for each service. The monitoring components for SLA-related performance data will require tools that do not rely solely on knowledge of the infrastructure, given the unpredictability of which cloud infrastructure components will actually be used. Instead, monitoring service performance and availability for SLA compliance must be done from the perspective of the user, not from the perspective of the infrastructure.
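A minimal sketch of this user-perspective monitoring is shown below, assuming a hypothetical service endpoint and a 99.9% availability target; a real implementation would probe from multiple client locations and feed the results into the provider’s SLM reporting tools.

```python
import time
import urllib.request

SERVICE_URL = "https://example-cloud-service.invalid/health"  # hypothetical endpoint
SLA_TARGET = 0.999            # availability promised in the Service Level Package
PROBE_INTERVAL_SECONDS = 60

def probe_once(url: str, timeout: float = 5.0) -> bool:
    """Measure availability as the user experiences it: did the request succeed in time?"""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as response:
            return 200 <= response.status < 300
    except Exception:
        return False

def monitor(samples: int) -> None:
    successes = 0
    for i in range(1, samples + 1):
        if probe_once(SERVICE_URL):
            successes += 1
        availability = successes / i
        status = "OK" if availability >= SLA_TARGET else "SLA BREACH"
        print(f"sample {i}: measured availability {availability:.4%} [{status}]")
        time.sleep(PROBE_INTERVAL_SECONDS)

if __name__ == "__main__":
    monitor(samples=10)
```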

Availability, Capacity, Continuity, and Security

The cloud environment’s “Provider-Consumer” model breaks the link between IT continuity and the customer’s business continuity. The Cloud service provider must therefore offer its customers a warranty of service continuity and make it part of the SLAs that make up the Service Level Packages it offers.

With a cloud service provider, the hardware and software environments are much more uniform and more easily managed. Because the provider’s more homogeneous infrastructure offers a much more stable environment, the risks from a change to production are significantly reduced. With reduced risk, Cloud service providers can deliver modifications to services much faster, so their agility becomes part of the business model and a distinguishing capability in the market. It also implies that the Release and Deployment Management (RDM) process is often replaced by a different paradigm.

In the cloud model, scalability of capacity and performance is a core offering of cloud service providers, and their SLAs should reflect this. Accomplishing this real-time scalability requires two things on the part of the service provider.
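The text does not enumerate those two requirements, but one common ingredient of real-time scalability is demand-driven provisioning. The sketch below is a hypothetical scaling rule offered only as illustration, not the mechanism the article has in mind: it adjusts the instance count so that measured utilization moves toward a target.

```python
import math

def desired_instances(current_instances: int, avg_utilization: float,
                      target_utilization: float = 0.6,
                      min_instances: int = 2, max_instances: int = 100) -> int:
    """Scale the instance count so average utilization moves toward the target."""
    if avg_utilization <= 0:
        return min_instances
    proposed = math.ceil(current_instances * (avg_utilization / target_utilization))
    return max(min_instances, min(max_instances, proposed))

# Example: 4 instances running at 90% utilization -> scale out to 6.
print(desired_instances(current_instances=4, avg_utilization=0.9))
```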

For cloud services, availability is vital, and much of it must be architected into the service. Where once the technical developers of a service could ignore the availability of their applications, leaving that job to the IT organization, a cloud service’s availability is a key factor in its commercial success and hence must be built in by the service developers and service architects working with the IT organization. With a combination of tools and resiliency built into the design and implementation of the service itself, the responsibility for availability must shift into the development lifecycle long before a service goes into production.
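One way availability gets built in by developers is resiliency in the service code itself, for example retrying transient dependency failures with backoff rather than relying on operations to restore the dependency. The snippet below is a generic sketch under that assumption, not a pattern prescribed by the article.

```python
import random
import time

def call_with_retries(operation, max_attempts: int = 4, base_delay: float = 0.5):
    """Retry a flaky dependency call with exponential backoff and a little jitter."""
    for attempt in range(1, max_attempts + 1):
        try:
            return operation()
        except Exception:
            if attempt == max_attempts:
                raise                      # give up and let the caller degrade gracefully
            sleep_for = base_delay * (2 ** (attempt - 1)) + random.uniform(0, 0.1)
            time.sleep(sleep_for)

# Example usage with a dependency that fails transiently about half the time.
def flaky_dependency():
    if random.random() < 0.5:
        raise ConnectionError("transient failure")
    return "response"

print(call_with_retries(flaky_dependency))
```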

