Monitoring AWS Machine using Azure Log Analytics - Cloud Computing

  • by Raju Kumar
  • 17th Sep, 2018
  • Last updated on 06th Mar, 2019
  • 4 mins read
What is Azure Log Analytics?

Logs make your life easier and help everyone in the organization understand what is actually going on with the software. Follow Azure Log Analytics best practices to monitor AWS machines effectively.

Logs are created by network devices, applications, operating systems, and programmable or smart devices. They comprise several messages that are chronologically arranged and stored on a disk, in files, or in an application like a log collector.

First, let's understand why we need logs to monitor AWS services.

We need logs because, with Azure Log Analytics, you can gather and search log data from all your resources, whether they reside on Azure, on another cloud host, or on-premises. Then you can transform the gathered data into rich analytics with AI-enhanced insights into your environment.  

How Log Analytics works
Log data from virtual machines and other cloud resources is captured by an agent that we install on each VM or resource; these agents and resources are called Connected Sources. The collected records are gathered and sent to the OMS repository, which is part of Azure and stores the data in the Azure cloud.

Once Log Analytics receives the logs, they become available for log search and other activities such as alerts, dashboard views, Power BI views, and export to Excel and data files.

Data collection in Azure Log Analytics

Data can be collected in Azure Log Analytics in several ways:

  • Agents on Windows and Linux virtual machines send telemetry from the guest operating system and applications to Log Analytics according to Data Sources that you configure.
  • Connect a System Center Operations Manager management group to Log Analytics to collect data from its agents.
  • Azure services such as Application Insights and Azure Security Center store their data directly in Log Analytics without any configuration.
  • Write data from PowerShell command line or Azure Automation runbook using Log Analytics cmdlets.
  • If you have custom requirements, you can use the HTTP Data Collector API to write data to Log Analytics from any REST API client (a hedged sketch follows this list).
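
As an illustration, here is a minimal PowerShell sketch of posting one custom record through the HTTP Data Collector API. The workspace ID, shared key, log type, and record fields are all placeholder values, and error handling is omitted; treat this as a starting point rather than production code.

# Placeholder workspace credentials; replace with your own.
$workspaceId = "00000000-0000-0000-0000-000000000000"
$sharedKey   = "<workspace-primary-key>"    # base64 key from the portal
$logType     = "MyAwsApp"                   # records land in the MyAwsApp_CL table

# The record(s) to send, as a JSON array.
$body      = '[{"Computer":"aws-vm-01","Status":"OK","ResponseTimeMs":42}]'
$bodyBytes = [Text.Encoding]::UTF8.GetBytes($body)

# Build the SharedKey authorization: HMAC-SHA256 over the canonical string.
$rfc1123date  = [DateTime]::UtcNow.ToString("r")
$stringToHash = "POST`n" + $bodyBytes.Length + "`napplication/json`nx-ms-date:" + $rfc1123date + "`n/api/logs"
$hmac         = New-Object System.Security.Cryptography.HMACSHA256
$hmac.Key     = [Convert]::FromBase64String($sharedKey)
$hash         = $hmac.ComputeHash([Text.Encoding]::UTF8.GetBytes($stringToHash))
$signature    = "SharedKey {0}:{1}" -f $workspaceId, [Convert]::ToBase64String($hash)

# Post the record to the workspace's data collector endpoint.
$uri     = "https://$workspaceId.ods.opinsights.azure.com/api/logs?api-version=2016-04-01"
$headers = @{ "Authorization" = $signature; "Log-Type" = $logType; "x-ms-date" = $rfc1123date }
Invoke-RestMethod -Uri $uri -Method Post -ContentType "application/json" -Headers $headers -Body $body
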

  • Azure Monitor: data sources for events and performance
  • Virtual Machines: data sources for events and performance
  • Operations Manager: data from management group agents
  • Application Insights: application requests and exceptions
  • Azure Security Center: security events
  • PowerShell: PowerShell command line or runbook
  • Data Collector API: REST API for custom data

Incoming data is automatically indexed, and data types and tables are created automatically.

Log Analytics

The data is then available through log search and smart analytics to multiple channels:

  • Analytics: design and test queries and analyze data
  • Dashboards: visualize data in the Azure portal
  • Logic Apps: workflows that consume Log Analytics data
  • Alerts: automatically respond to critical conditions
  • Power BI: export for visualization with other sources
  • PowerShell: PowerShell command line or runbook
  • Log Search API: REST API for custom applications

Workflow of Log Analytics
Now let's walk through the Log Analytics workflow: how it collects data, and how we analyze, visualize, and alert on that data.

Let's look at the Collect part first.

Logs can be collected from:

  • Event Logs
  • Custom App Logs
  • IIS Logs
  • Crash Dumps
  • Performance Data

We can also filter the types of logs collected simply by checking and unchecking the log types; the same data-source configuration can be scripted, as in the sketch below.
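
As a hedged illustration (the resource group and workspace names are placeholders), collection of a Windows event log can also be enabled from PowerShell with the Az.OperationalInsights module:

# Assumes the Az.OperationalInsights module is installed and you are signed in.
# Resource group and workspace names are placeholders.
New-AzOperationalInsightsWindowsEventDataSource `
    -ResourceGroupName "my-rg" `
    -WorkspaceName "my-law-workspace" `
    -Name "Application-Event-Log" `
    -EventLogName "Application" `
    -CollectErrors -CollectWarnings -CollectInformation
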
Once collection is done, we need to analyze the data, which can be done in the following ways:

  • Filter based on attributes
  • Analyze data with Kusto Query language
  • Sort data
  • Export log data to Excel and Power BI
  • Conditional Filtering

After that, we can visualize all the logs in a dashboard.

Alerts can be configured based on event conditions: once a value crosses the configured threshold (greater than or less than the limit), an alert is generated automatically and the configured actions run, such as sending mail, sending a message, or starting a runbook (part of Azure Automation). A sketch of scripting such an alert follows.
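
The following is a minimal PowerShell sketch of a scheduled query alert rule that fires when an AWS machine stops sending heartbeats. The cmdlets are from the Az.Monitor module; the resource IDs, names, and threshold are placeholders, and the exact signatures should be verified against your module version.

# Placeholders; substitute your workspace resource ID and action group ID.
$workspaceResourceId = "/subscriptions/<sub-id>/resourceGroups/my-rg/providers/Microsoft.OperationalInsights/workspaces/my-law-workspace"
$actionGroupId       = "/subscriptions/<sub-id>/resourceGroups/my-rg/providers/microsoft.insights/actionGroups/my-action-group"

# Alert when any computer's last heartbeat is older than 15 minutes.
$query    = "Heartbeat | summarize LastSeen = max(TimeGenerated) by Computer | where LastSeen < ago(15m)"
$source   = New-AzScheduledQueryRuleSource -Query $query -DataSourceId $workspaceResourceId
$schedule = New-AzScheduledQueryRuleSchedule -FrequencyInMinutes 5 -TimeWindowInMinutes 30
$trigger  = New-AzScheduledQueryRuleTriggerCondition -ThresholdOperator "GreaterThan" -Threshold 0
$action   = New-AzScheduledQueryRuleAznsActionGroup -ActionGroup @($actionGroupId) -EmailSubject "AWS VM heartbeat alert"
$alerting = New-AzScheduledQueryRuleAlertingAction -AznsAction $action -Severity "2" -Trigger $trigger

New-AzScheduledQueryRule -ResourceGroupName "my-rg" -Location "eastus" -Name "aws-heartbeat-alert" `
    -Description "Alert when an AWS machine stops reporting heartbeats" `
    -Enabled $true -Source $source -Schedule $schedule -Action $alerting
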

Steps to collect Data and log from AWS Machines

AWS VMs can run Windows or Linux, so we install the matching agent, which can be downloaded from the Azure Log Analytics page simply by selecting the OS type and bitness (32-bit or 64-bit) of the VM.

After installing the agent, we configure it by entering the workspace ID and key, which Azure provides when the workspace is created. All the logs then appear in the portal within a few hours. The install can also be scripted, as sketched below.
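
For example, on a Windows AWS VM the Microsoft Monitoring Agent installer supports a silent, scripted setup. The extraction path and the workspace ID and key below are placeholders; this is a sketch of the documented command-line install, not a copy-paste recipe. On Linux, the equivalent is the OMS agent's onboard_agent.sh script with the -w (workspace ID) and -s (shared key) switches.

# Placeholders; substitute your own workspace ID and key.
$workspaceId  = "00000000-0000-0000-0000-000000000000"
$workspaceKey = "<workspace-primary-key>"

# Extract the downloaded agent package, then run a quiet install that
# attaches the agent to the Log Analytics workspace.
.\MMASetup-AMD64.exe /c /t:C:\MMA
C:\MMA\setup.exe /qn NOAPM=1 `
    ADD_OPINSIGHTS_WORKSPACE=1 `
    OPINSIGHTS_WORKSPACE_ID="$workspaceId" `
    OPINSIGHTS_WORKSPACE_KEY="$workspaceKey" `
    AcceptEndUserLicenseAgreement=1
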
Supported AWS OS and Versions

Let's look at the operating systems and versions that are supported for VMs on the AWS Cloud.

If you are using Windows, the configuration should be:

  • Windows Server 2008 Service Pack 1 (SP1) or later
  • Windows 7 SP1 and later

If you are using Linux, the configuration should be:

  • Amazon Linux 2012.09 to 2015.09 (x86/x64)
  • CentOS Linux 5, 6, and 7 (x86/x64)
  • Oracle Linux 5, 6, and 7 (x86/x64)
  • Red Hat Enterprise Linux Server 5, 6, and 7 (x86/x64)
  • Debian GNU/Linux 6, 7, and 8 (x86/x64)
  • Ubuntu 12.04 LTS, 14.04 LTS, 16.04 LTS (x86/x64)
  • SUSE Linux Enterprise Server 11 and 12 (x86/x64)

After verifying the supported configuration, we can successfully install the agent and receive the logs.

Kusto to query AWS Machine Logs

Kusto is the cloud analytics platform behind Azure Log Analytics, optimized for ad-hoc big data queries.

Kusto Query Reference Portal: https://docs.loganalytics.io

The Kusto Query Language (KQL) is used to query data in Log Analytics and other Azure services. Once the AWS machines are reporting, their logs can be queried like those from any other connected source; a sketch follows.
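
As a minimal sketch (the workspace ID is a placeholder and the Az.OperationalInsights module is assumed), a Kusto query can be run against the workspace from PowerShell. This one lists when each machine, including the AWS ones, last sent a heartbeat:

# Placeholder workspace ID; assumes Az.OperationalInsights is installed.
$workspaceId = "00000000-0000-0000-0000-000000000000"

# Last heartbeat per computer over the past day; a quick way to confirm
# that the AWS machines are reporting in.
$query  = "Heartbeat | where TimeGenerated > ago(1d) | summarize LastSeen = max(TimeGenerated) by Computer"
$result = Invoke-AzOperationalInsightsQuery -WorkspaceId $workspaceId -Query $query
$result.Results | Format-Table
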
OMS (Operations Management Suite)

The Microsoft Operations Management Suite (OMS), previously known as Azure Operational Insights, is a software as a service platform that allows an administrator to manage on-premises and cloud IT assets from one console.

Azure OMS provides four types of services:

  • Log Analytics: Monitor and analyze the availability and performance of different resources including physical and virtual machines.
  • Automation: Automate manual processes and enforce configurations for physical and virtual machines.
  • Backup: Backup and restore critical data.
  • Site Recovery: Provide high availability for critical applications.

Management Solutions

Management solutions leverage Azure services to provide additional insight into the operation of a particular application or service.
Here, we can manually select a required solution and add it to the home page, where we then see only the logs for that solution type.

Reports

The final log reports can be exported to Excel and Power BI and displayed in table and chart format; a small export sketch follows.
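
For instance, reusing the placeholder workspace and heartbeat query from the Kusto section above, query results can be piped straight to a CSV file that Excel opens directly:

# Export the query results to CSV for Excel (sketch; names are placeholders).
$result = Invoke-AzOperationalInsightsQuery -WorkspaceId $workspaceId -Query $query
$result.Results | Export-Csv -Path ".\aws-heartbeats.csv" -NoTypeInformation
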
Conclusion:

Azure Log Analytics is a very powerful tool for capturing different types of system logs. The Kusto Query Language plays a very important role in extracting insights from the log data, and custom reports built with Kusto queries can save the organization many man-hours.

Detailed reports that are easy to export to Excel and Power BI keep troubleshooting and diagnosis close at hand.

Reports can be embedded in any website with live-refreshing data, and the embed code snippet can be generated within Power BI.

Choose the right Azure logging service for AWS monitoring and use it to save man-hours and reduce the time spent on troubleshooting and diagnosis.


Raju Kumar

Blog Author

Certified Azure Solution Architect and MCT, also specialized in AWS & Google Cloud Platform.

With 6+ years of enterprise product development, I have worked in 4 MNCs across global teams.
I am a regular contributor at various technical conferences, meetups, and community events, helping to spread cloud awareness.
Things I do: training, consulting, and product development. Let's connect for a discussion!
