
Understanding Big Data- Best Big Data Frameworks

The massive world of ‘BIG DATA’

If one strolls around any IT office, every decade (nowadays the span is even shorter, almost every 3-4 years) one overhears professionals discussing new jargon from the hottest trends in technology. Around 5-6 years ago, one such term that started ruling IT services was ‘Big Data’, and it is still interpreted in various ways, by laymen and tech geeks alike.

Although the services industry started talking widely about big data solutions only 5-6 years ago, the term is believed to have been in use since the 1990s by John Mashey of Silicon Graphics, while credit for coining ‘big data’ in its modern sense goes to Roger Mougalas of O’Reilly Media in 2005.

Let’s first understand why everyone is going gaga over ‘Big Data’ and what real-world problems it is supposed to solve, and then answer the what and how of it.

Why is Big Data essential for today’s digital world?

The internet and the web were around for many years before smartphones, but smartphones made them mobile, with on-the-go usage. Social media and mobile apps started generating tons of data. At the same time, smart bands and wearable devices (IoT, M2M) added newer dimensions of data generation. This newly generated data became the new oil. If stored and analyzed, it has the potential to give tremendous insights that can be put to use in numerous ways.

You will be amazed to see the real-world use cases of big data. Every industry has unique use cases, often unique even to each client implementing a solution: from data-driven personalized campaigning (ever wondered how an item you browsed on some ‘xyz’ site shows up while you scroll Facebook?) to predictive maintenance of huge oil pipelines running across countries, where manual monitoring is practically impossible.
To relate this to day-to-day life: every click, every swipe, every share and every like we casually perform on social media helps today’s industries take calculated business decisions. How do you think Netflix predicted the success of ‘House of Cards’ and spent $100 million on it? Big data analytics is the simple answer.

The biggest challenge in the past was that the traditional methods used to store, curate and analyze data could not process data that was huge in volume, generated from heterogeneous sources, and generated really fast (to give you an idea, roughly 2.5 quintillion bytes of data are generated per day today; see the infographic released by Domo called “Data Never Sleeps 5.0”). This gave rise to the term ‘big data’ and the solutions around it.

Understanding Big Data: the experts’ viewpoint

Big data literally means massive data (loosely > 1 TB), but that is not its only aspect. Distributed data, or even complex datasets that cannot be analyzed through traditional methods, can also be categorized as big data, and with this background the theoretical definition makes a lot of sense:

“Big data is high-volume, high-velocity and/or high-variety information assets that demand cost-effective, innovative forms of information processing that enable enhanced insight, decision making, and process automation.” (Gartner, 2012)

The classic characteristics of big data are the 3 Vs: Variety, Velocity, and Volume. Due to the changing nature of data in today’s world, and to gain the most insight from it, 3 more Vs have been added to the definition: Variability, Veracity and Value.

[Diagram: the 6 Vs of Big Data]

These 6 Vs help in understanding the characteristics of big data, but let’s also understand the types of data involved in big data processing.
The ‘Variety’ characteristic above means different types of data can be processed through big data tools and technologies. Let’s drill down into what those types are:

- Structured: e.g. mainframes, traditional databases like Teradata, Netezza, Oracle, etc.
- Unstructured: e.g. tweets, Facebook posts, emails, etc.
- Semi-structured / multi-structured or hybrid: e.g. e-commerce, demographic, weather data, etc.

As technology advances, all these varieties of data can be stored, processed and analyzed by big data platforms, whereas traditional data processing techniques could handle only structured data.

Now that we understand what big data is, and the limitations of traditional techniques in handling such data, we can safely say we need new technology to handle this data and gain insights from it. But first, what were the traditional data management techniques?

- RDBMS (Relational Database Management Systems)
- Data warehousing and data marts

At a high level, RDBMS catered to OLTP needs while data warehouses and data marts facilitated OLAP needs; both systems work with structured data only.

I hope one can now answer ‘what is big data?’ both conceptually and theoretically. So it’s time to understand how it is done in actual implementations. Merely storing big data will not help organizations; what matters is turning data into insights and business value, and to do so, the key infrastructure elements are:

- Data collection
- Data storage
- Data analysis
- Data visualization/output

All major big data processing framework offerings are based on these building blocks, and in alignment with them, the following are the top five big data processing frameworks currently being used in the market:

1. Apache Hadoop: a software library framework that allows for the distributed processing of large data sets across clusters of computers using simple programming models. First up is the all-time classic, and one of the top frameworks in use today; so prevalent is it that it has almost become synonymous with big data.

2. Apache Spark: a unified analytics engine for large-scale data processing. Apache Spark and Hadoop are often contrasted as an “either/or” choice, but that isn’t really the case.

The above two frameworks are the most popular, but the following three are comparable alternatives:

3. Apache Storm: a free and open-source distributed real-time computation system. You can also take up Apache Storm training to learn more about it.

4. Apache Flink: a streaming dataflow engine, aiming to provide facilities for distributed computation over streams of data. Treating batch processing as a special case of streaming data, Flink is effectively both a batch and a real-time processing framework, but one which clearly puts streaming first.

5. Apache Samza: a distributed stream processing framework.

Frameworks process data through these building blocks and generate the required insights, supported by a whopping number of tools providing the required functionality.

Big Data processing framework and technology landscape

The big data tools and technology landscape is better understood through a layered big data architecture; give a good read to the article by Navdeep Singh Gill on XenonStack for the layered architecture of big data. Taking inspiration from that layered architecture, the tools available in the market can be mapped to layers to understand the landscape in depth. Note that the layered architecture fits very well with the infrastructure elements/building blocks discussed above. A few of the tools are briefed below, layer by layer.
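Before walking through the layers, it is worth seeing what the “simple programming models” behind Hadoop actually look like. Below is a toy, single-machine sketch of the map/shuffle/reduce idea in plain Python; the function names are our own, and this is an illustration of the paradigm, not actual Hadoop:

```python
from collections import defaultdict

def map_phase(lines):
    # Map: emit a (word, 1) pair for every word in every input line.
    for line in lines:
        for word in line.lower().split():
            yield (word, 1)

def reduce_phase(pairs):
    # Shuffle + reduce: group the pairs by key and sum the counts.
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

lines = ["big data is big", "data never sleeps"]
print(reduce_phase(map_phase(lines)))
# {'big': 2, 'data': 2, 'is': 1, 'never': 1, 'sleeps': 1}
```

In a real Hadoop or Spark job, these same two phases run in parallel across a cluster, with the framework handling the shuffle of intermediate pairs between machines.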
1. Data Collection / Ingestion Layer

- Cassandra: a free and open-source, distributed, wide-column-store NoSQL database management system designed to handle large amounts of data across many commodity servers, providing high availability with no single point of failure.
- Kafka: an event streaming platform used for building real-time data pipelines and streaming apps.
- Flume: the log collector in Hadoop.
- HBase: the columnar database in Hadoop.

2. Processing Layer

- Pig: a scripting language in the Hadoop framework.
- MapReduce: the processing language in Hadoop.

3. Data Query Layer

- Impala (Cloudera Impala): a modern, open-source, distributed SQL query engine for Apache Hadoop (often compared with Hive).
- Hive: data warehouse software for data query and analysis.
- Presto: a high-performance, distributed SQL query engine for big data. Its architecture allows users to query a variety of data sources such as Hadoop, AWS S3, Alluxio, MySQL, Cassandra, Apache Kafka, and MongoDB.

4. Analytical Engine

- TensorFlow: an open-source machine learning library for research and production.

5. Data Storage Layer

- Ignite: an open-source distributed database, caching and processing platform designed to store and compute on large volumes of data across a cluster of nodes.
- Phoenix (Hortonworks): Apache Phoenix is an open-source, massively parallel, relational database engine supporting OLTP for Hadoop, using Apache HBase as its backing store.
- PolyBase: a feature introduced in SQL Server 2016 for querying relational and non-relational (NoSQL) databases. You can use PolyBase to query tables and files in Hadoop or in Azure Blob Storage, and to import or export data to/from Hadoop.
- Sqoop: an ETL tool.
- Big Data in Excel: a few people like to process big datasets with current Excel capabilities, known as ‘Big Data in Excel’.

6. Data Visualization Layer

- Microsoft HDInsight: Azure HDInsight is a Hadoop service offering hosted in Azure that enables clusters of managed Hadoop instances. It deploys and provisions Apache Hadoop clusters in the cloud, providing a software framework designed to manage, analyze, and report on big data with high reliability and availability. Hadoop administration training will give you the technical understanding required to manage a Hadoop cluster, in either a development or a production environment.

Big Data best practices

Every organization or business, small or big, wants to benefit from big data, but it is essential to understand that big data can reach its full potential only if the organization adheres to best practices before adopting it. Answering five basic questions helps clients establish the need for big data:

- Why is big data required for the organization? What problem would it help solve?
- Ask the right questions.
- Foster collaboration between business and technology teams.
- Analyze only what is required.
- Start small and grow incrementally.

Big Data industry use cases

We have talked about everything in the big data world except real use cases. We discussed a few at the start, but let me give you insights into real-world, interesting big data use cases, some of which are no longer a secret ☺. In fact, big data is penetrating to the extent that, whichever industry you name, plenty of use cases can be told. Let’s begin.

Streaming platforms

As the ‘House of Cards’ example at the start of the article suggests, it is no secret that Netflix uses big data analytics. Netflix spent $100 million on 26 episodes of ‘House of Cards’ because it knew the show would appeal to viewers of the original British House of Cards, and brought in director David Fincher and actor Kevin Spacey.
Netflix collects behavioral data and uses it to create a better experience for users. But Netflix uses big data for more than that: it monitors and analyzes traffic details across devices, spots problem areas, and adjusts network infrastructure to prepare for future demand (the latter being action taken on top of the analytics, i.e. how big data analysis is put to use). It also seeks insights into the types of content viewers prefer, which helps it make informed decisions. Apart from Netflix, Spotify is another well-known use case.

Advertising, media, campaigning and entertainment

For decades, marketers were forced to launch campaigns while blindly relying on gut instinct and hoping for the best. That all changed with digitization and the big data world. Nowadays, data-driven campaigns and marketing are on the rise, and to be successful in this landscape, a modern marketing campaign must integrate a range of intelligent approaches to identify customers, segment them, measure results, analyze data and build upon feedback in real time. All of this needs to happen in real time, informed by the customer’s profile, history, purchasing patterns and other relevant information, and big data solutions are the perfect fit.

Event-driven marketing can also be achieved through big data, and is another successful marketing approach in today’s world. It basically means keeping track of events customers are directly or indirectly involved in, and campaigning exactly when a customer needs it rather than at random. For example, if you have searched for a product on Amazon or Flipkart, you will see related advertisements on other social media apps you casually browse, and you may well end up purchasing it, as you needed options to choose from anyway.

Healthcare industry

Healthcare is one of the classic use-case industries for big data applications.
The industry generates a huge amount of data: patients’ medical histories, past records, treatments given, available and latest medicines, the latest medical research; the list of raw data is endless. All of this data can yield insights, and big data can contribute to the industry in the following ways:

- Diagnosis time can be reduced, and exactly the required treatment started immediately. Most illnesses can be treated if the diagnosis is accurate and treatment starts in time. This can be achieved through evidence-based past medical data for similar treatments made available to the doctor treating the illness, the patient’s available history, and symptoms fed into the system in real time.
- A government health department can monitor whether a group of people from one geography is reporting similar symptoms; predictive measures can then be taken in nearby locations to prevent an outbreak, since the cause of such illness could be the same.

The list is long; the above are a few representative examples.

Security

With the outbreak of social media, personal information is at stake today. Almost everything is digital, and the majority of personal information is available in the public domain, hence privacy and security are major concerns. Big data has applications here too:

- Cybercrimes are common nowadays, and big data can help detect and predict crimes.
- Threat analysis and detection can be done with big data.

Travel and tourism

Flight booking sites and IRCTC track clicks and hits along with IP addresses, login information and other details, and can price flights and trains dynamically as demand changes. Big data enables this dynamic pricing, and mind you, it happens in real time. I am sure each one of us has experienced it; now you know who is doing it :D

Telecommunications, the public sector, education, social media and gaming, energy and utilities: every industry has implemented, or is implementing, several such big data use cases day in and day out.
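As a toy illustration of the demand-based pricing idea described above, here is a simplified rule in plain Python. This is our own illustrative sketch, not any airline’s or IRCTC’s actual pricing model:

```python
def dynamic_price(base_price, seats_left, recent_searches):
    """Raise the fare as seats run out or search interest spikes.
    A deliberately simplified illustrative rule, not a real pricing model."""
    scarcity = 1.0 + (0.5 if seats_left < 10 else 0.0)      # surcharge when scarce
    demand = 1.0 + min(recent_searches / 100.0, 0.3)        # capped demand uplift
    return round(base_price * scarcity * demand, 2)

print(dynamic_price(100.0, 50, 10))   # plenty of seats, quiet demand -> 110.0
print(dynamic_price(100.0, 5, 80))    # scarce seats, heavy interest  -> 195.0
```

Real systems replace these hand-written multipliers with models learned from exactly the click, search and booking data discussed above, recomputed continuously.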
If you look around, I am sure you will find them on the rise. Big data is helping industries, consumers and clients alike make informed decisions, whatever they may be, and wherever there is such a need, big data can come in handy.

Challenges faced by big data adoption in the real world

Although the world is going gaga about big data, there are still challenges in implementing and adopting it, and service industries are still striving to resolve them so as to implement big data solutions without flaws. An October 2016 report from Gartner found that organizations were getting stuck at the pilot stage of their big data initiatives: “Only 15 percent of businesses reported deploying their big data project to production, effectively unchanged from last year (14 percent),” the firm said. Let’s discuss a few of these challenges.

1. Understanding big data, and answering ‘why’ for the organization

As I said at the start of the article, there are many versions of big data, and understanding the real use cases for the organization one is working with is still a challenge. Everyone wants to ride the wave, but not knowing the right path is a struggle. Since every organization is unique, it is of the utmost importance to answer ‘why big data?’ for each one, and this remains a major challenge for decision makers adopting big data.

2. Understanding the organization’s data sources

In today’s world there are hundreds of thousands of ways information is generated, and being aware of all these sources, and ingesting all of them into a big data platform to get accurate insight, is essential. Identifying sources is a challenge to address. It is no surprise, then, that the IDG report found that “managing unstructured data is growing as a challenge – rising from 31 percent in 2015 to 45 percent in 2016.” Different tools and technologies are on the rise to address this challenge.

3. Shortage of big data talent, and retaining it

Big data is a fast-changing technology area with a whopping number of tools in its landscape. Big data professionals are expected to excel in the current tools and keep themselves up to date with ever-changing needs. This makes it difficult for employers to create and retain talent within the organization. The solution is constant upskilling, re-skilling and cross-skilling, along with an increased organizational budget for training and retaining talent.

4. The Veracity V

This V is a challenge because it means inconsistent, incomplete data. To gain insights through a big data model, a key step is to predict and fill in missing information. This is the tricky part, as filling in missing information can decrease the accuracy of the resulting insights and analytics. There is a bunch of tools to address this concern, and data curation is an important step in big data that needs a proper model. Keep in mind, though, that big data is never 100% accurate, and one must deal with that.

5. Security

This aspect is often given low priority during the design and build phases of big data implementations, yet security loopholes can cost an organization dearly. It is essential to put security first while designing and developing big data solutions, and equally important to implement them responsibly with respect to regulatory requirements like GDPR.

6. Gaining valuable insights

Machine learning models go through multiple iterations to arrive at insights, and they too face issues like missing data, which hurts accuracy. Increasing accuracy requires a lot of re-processing, which has its own lifecycle; this challenge largely reduces to the missing-data challenge above, and can also be caused by the unavailability of information from some data sources. Incomplete information leads to incomplete insights, which may not deliver the required benefit.

Addressing these challenges helps in gaining valuable insights through the available solutions. With big data, the opportunities are endless; once understood, the world is yours!

Now that you understand big data, it is worth understanding the next steps. Gary King, a professor at Harvard, says: “Big data is not about the data. It is about the analytics.” You can also take up Big Data and Hadoop training to enhance your skills further.

Did this article help you understand today’s massive world of big data and give you a sneak peek into it? Do let us know through the comments section below.
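The Veracity challenge discussed above, that filling in missing values can quietly distort the statistics you later trust, can be seen in a toy example. This is plain Python; mean-filling is just one common, illustrative imputation choice among many:

```python
from statistics import mean, pstdev

readings = [10.0, 20.0, None, 30.0, None, 40.0]  # a sensor feed with two gaps

observed = [r for r in readings if r is not None]
fill = mean(observed)                              # impute gaps with the observed mean (25.0)
imputed = [r if r is not None else fill for r in readings]

print(mean(imputed))                  # 25.0  -> the mean is preserved...
print(round(pstdev(observed), 2))     # 11.18
print(round(pstdev(imputed), 2))      # 9.13  -> ...but the spread shrinks, overstating certainty
```

This is why data curation needs a deliberate model: every imputation strategy trades completeness against some distortion of the downstream analytics.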

Guide to Installation of Spark on Ubuntu

Apache Spark is a fast and general-purpose cluster computing system. It provides high-level APIs in Java, Scala, Python and R, and an optimized engine that supports general execution graphs. It also supports a rich set of higher-level tools, including Spark SQL for SQL and structured data processing, MLlib for machine learning, GraphX for graph processing, and Spark Streaming.

In this article, we will cover the installation procedure of Apache Spark on the Ubuntu operating system.

Audience

This document can be referred to by anyone who wants to install the latest version of Apache Spark on Ubuntu.

System requirements

- Ubuntu OS installed
- Minimum of 8 GB RAM
- 20 GB free space

Prerequisites

- Java 8 should be installed on your machine.
- Hadoop should be installed on your machine (this guide assumes Hadoop 2.7).

Installation procedure

1. Before installing Spark, ensure that Java 8 is installed on your Ubuntu machine. If not, follow the process below to install it.

a. Install Java 8 using the command below.

sudo apt-get install oracle-java8-installer

The above command creates a java-8-oracle directory under /usr/lib/jvm/ on your machine. Now we need to configure the JAVA_HOME path in the .bashrc file, which executes whenever we open a terminal.

b. Configure JAVA_HOME and PATH in the .bashrc file and save it. To edit the .bashrc file, use the command below.

vi .bashrc

Press i (for insert), then enter the lines below at the bottom of the file.

export JAVA_HOME=/usr/lib/jvm/java-8-oracle/
export PATH=$PATH:$JAVA_HOME/bin

Then press Esc, type wq! (to save the changes) and press Enter.

c. Now test whether Java is installed properly by checking its version. The command below should show the Java version.

java -version
2. Now we will install Spark on the system. Go to the official download page of Apache Spark below and choose the latest release. For the package type, choose ‘Pre-built for Apache Hadoop’.

https://spark.apache.org/downloads.html

Or you can use a direct link to download:

https://www.apache.org/dyn/closer.lua/spark/spark-2.4.0/spark-2.4.0-bin-hadoop2.7.tgz

3. Create a directory called spark under the /usr/ directory, using the command below.

sudo mkdir /usr/spark

The above command asks for a password to create the spark directory under /usr; enter your password. Then check that the spark directory was created, using the command below.

ll /usr/

It should list a ‘spark’ directory. Go to the /usr/spark directory using the command below.

cd /usr/spark

4. Download Spark 2.4.0 into the spark directory using the command below.

wget https://www.apache.org/dyn/closer.lua/spark/spark-2.4.0/spark-2.4.0-bin-hadoop2.7.tgz

If you use the ll or ls command, you will see spark-2.4.0-bin-hadoop2.7.tgz in the spark directory.

5. Then extract spark-2.4.0-bin-hadoop2.7.tgz using the command below.

sudo tar xvzf spark-2.4.0-bin-hadoop2.7.tgz

The spark-2.4.0-bin-hadoop2.7.tgz file is now extracted as spark-2.4.0-bin-hadoop2.7. Check whether it was extracted using the ll command.

6. Configure the SPARK_HOME path in the .bashrc file by following the steps below.

Go to the home directory using the command below.

cd ~

Open the .bashrc file using the command below.

vi .bashrc

Press i for insert, then enter SPARK_HOME and PATH as below.

SPARK_HOME=/usr/spark/spark-2.4.0-bin-hadoop2.7
PATH=$PATH:$SPARK_HOME/bin

Then save and exit: press Esc, type wq! and press Enter.

Test installation

Now we can verify whether Spark is successfully installed on our Ubuntu machine.
To verify, run the command below.

spark-shell

The above command should open the Spark shell. Spark is now successfully installed on the Ubuntu system. Let’s create an RDD and a DataFrame, and then we will wrap up.

a. We can create an RDD in three ways; we will use one of them here: define a list, then parallelize it. Copy and paste the code below, line by line, into the shell.

val nums = Array(1, 2, 3, 5, 6)
val rdd = sc.parallelize(nums)

The above creates an RDD.

b. Now we will create a DataFrame from the RDD. Follow the steps below.

import spark.implicits._
val df = rdd.toDF("num")

The above code creates a DataFrame with num as its column. To display the data in the DataFrame, use the command below.

df.show()

How to uninstall Spark from an Ubuntu system

Please follow the steps below to uninstall Spark from Ubuntu.

Remove SPARK_HOME from the .bashrc file. To do so, go to the home directory using the command below.

cd ~

Open the .bashrc file using the command below.

vi .bashrc

Press i to edit, find the SPARK_HOME entries, delete the SPARK_HOME=/usr/spark/spark-2.4.0-bin-hadoop2.7 line (and the matching PATH entry) from the .bashrc file, and save. Then press Esc, type wq! and press Enter.

We will also delete the downloaded and extracted Spark installation from the system, using the command below.

sudo rm -r /usr/spark

The above command deletes the spark directory from the system. Now open a terminal, type spark-shell and press Enter; you will get an error, which confirms that Spark has been successfully uninstalled from the Ubuntu system. You can also learn more about Apache Spark and Scala here.
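As a quick sanity check after configuring .bashrc (steps b and 6), or after uninstalling, the two environment variables this guide sets can be verified with a short Python helper. This script is our own convenience sketch, not part of Spark:

```python
import os

def check_spark_env(env=None):
    """Return a list of problems with the JAVA_HOME / SPARK_HOME setup
    configured in this guide; an empty list means both look fine."""
    env = os.environ if env is None else env
    problems = []
    for var in ("JAVA_HOME", "SPARK_HOME"):
        path = env.get(var)
        if not path:
            problems.append(f"{var} is not set")
        elif not os.path.isdir(path):
            problems.append(f"{var} points to a missing directory: {path}")
    return problems

if __name__ == "__main__":
    # Prints nothing when both variables are set and point at real directories.
    for problem in check_spark_env():
        print(problem)
```

Run it in a fresh terminal (so .bashrc has been sourced); any output indicates a variable you still need to fix before spark-shell will work.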

Differences Between Power BI and Tableau: Power BI Vs Tableau Comparison

Power BI

Power BI is a data visualization and business intelligence tool provided by Microsoft. It can collect data from different data sources, such as Excel spreadsheets, on-premise databases and cloud databases, and convert them into meaningful reports and dashboards. Features such as quick insights, Q&A, embedded reports, and self-service BI have made it a top choice among BI tools. It is also robust and always ready for extensive modeling and real-time analytics, as well as custom visual development.

Tableau

Tableau lets business analysts take business decisions through its core feature, data visualization, which is accessible to business users of any background. It can establish a connection with any data source (Excel, local/on-premise databases, cloud databases). Tableau is the fastest-growing data visualization tool among all visualization tools. Its visualizations are created as worksheets and dashboards. The beauty of Tableau is that it does not require any technical or programming knowledge to create or develop reports and dashboards.

What is Power BI?

Power BI is a data visualization and business intelligence tool that lets us connect to single or multiple data sources, convert the connected raw data into impressive visuals, and share insights across an organization.
It also lets us embed reports into our own application or website.

Product suite of Power BI:

Power BI Desktop

- Free to download and install.
- Connects to and accesses a variety of on-premise and cloud sources like Excel, CSV/text files, Azure, SharePoint, Dynamics CRM, etc.
- Prepares data by mashing it up and creating a data model using Power Query, which uses the M query language.
- After loading data into Power BI Desktop, you can establish relationships between tables.
- Create calculated measures, columns, and tables using Data Analysis eXpressions (DAX).
- Drag and drop interactive visuals onto pages using calculated measures and columns.
- Publish to the Power BI web service.

Power BI Service

- One of the ways to embed reports within a website under an organization.
- The Power BI Service forum contains sections such as Workspaces, Dashboards, Reports, and Datasets.
- You can create your own workspace, ‘My Workspace’, which helps maintain your personal work in the Power BI Service.
- You can pin a number of reports to a dashboard to bring together meaningful datasets for clear insight.
- You can interact with your data with the help of Q&A (natural language query).

Power BI Report Server

- A product that allows businesses to host Power BI reports on an on-premise report server.
- The server can host paginated reports, KPIs, mobile reports and Excel workbooks.
- Shared datasets and shared data sources live in their own folders, for use as building blocks for reports.

Power BI Mobile

- Power BI provides mobile app services for iOS, Android and Windows 10 mobile devices.
- In the mobile app, you can connect to and interact with your cloud and on-premise data.
- It is very convenient to manage dashboards and reports on the go, staying connected and on the same page with the organization.

On-Premises Gateway

- A bridge that connects your on-premise data to online services like Power BI, Microsoft Flow, Logic Apps and Power Apps; a single gateway can be used with different services at the same time. For example, if you use both Power BI and Power Apps, a single gateway serves both, tied to the account you signed in with.
- The on-premises data gateway implements data compression and transport encryption in all modes.
- The on-premises data gateway is supported only on 64-bit Windows operating systems.
- Multiple users can share and reuse a gateway in this mode.
- For Power BI, this includes support for scheduled refresh and DirectQuery.

What is Tableau?

Tableau is a business intelligence and data visualization tool used to analyze data visually. Users can create and share interactive reports and dashboards with it. It offers data blending, letting users connect to multiple data sources.

Product suite of Tableau:

Tableau Server

- Tableau Server is an enterprise-wide visual analytics platform for creating interactive dashboards.
- It is essentially an online hosting platform that holds all your Tableau workbooks, data sources and more.
- Being a Tableau product, it gives you Tableau functionality without always needing to download and open workbooks in Tableau Desktop.
- You can grant security-level permissions to different work in an organization, determining who can access and interact with what.
- As a Tableau Server user, you can access up-to-date content and gain quick insight without relying on static distributed content.
Tableau Desktop
- A downloadable on-premises application used to develop visualizations in the form of sheets, dashboards, and stories.
- Useful capabilities include data transformation, creating data sources, creating extracts, and publishing visualizations to Tableau Server.
- Tableau Desktop produces files with the extensions .twb and .twbx.
- It is a licensed product but comes with a two-week trial period.
- From creating reports and charts to combining them into a dashboard, all of this work is done in Tableau Desktop.

Tableau Prep
- A personal data preparation tool that empowers users to cleanse, aggregate, merge, or otherwise prepare their data for analysis in Tableau.
- It has a simple, clean user interface that looks and feels like the data source screen of Tableau Desktop.
- In Tableau Prep, data is organized in a flow pane, and each flow carries a universally unique identifier (UUID), allowing big data sets to be handled securely.

Tableau Reader
- A free desktop application you can use to open and interact with data visualizations built in Tableau Desktop.
- It is required to read and interact with Tableau packaged workbooks.
- Tableau Reader retains the interactivity of visualizations created in Tableau Desktop, but does not allow connections to data that can be refreshed.
- It only reads Tableau data files; without the Reader, you may need to share the workbook publicly or convert it to PDF format.

Tableau Online
- An analytics platform fully hosted in the cloud.
- You can publish dashboards and share your discoveries with anyone.
- It can empower your organization to ask questions of any published data source using natural language.
- It can connect to cloud databases anytime, anywhere, and can automatically refresh data from web apps like Google Analytics and Salesforce.
- It empowers site admins to easily manage authentication and permissions for users, content, and data.

Tableau Public
- A free service that lets anyone publish interactive data visualizations to the web.
- Visualizations are created in the accompanying app, Tableau Desktop Public Edition, which requires no programming skills.
- It is for anyone interested in understanding data and sharing those discoveries with the world as data visualizations.
- Feature highlights include heat maps, transparent sheets, automatic mobile layouts, and Google Sheets support.
- Because visualizations are public, anyone can access the data and modify it by downloading the workbook, so it is not secure.
- It is limited to 15,000,000 rows of data per workbook.
- It offers 10 GB of storage space for your workbooks, which is a storage limitation.
- It supports Python through TabPy, an API that enables evaluation of Python code within a Tableau workbook.

Strengths & Weaknesses of Power BI:

Strengths:
- Free Power BI Desktop application for authors to develop reports.
- Uses DAX expressions for data calculations.
- Free training modules available for users.
- Composite models (DirectQuery, Dual, and Import) to connect multiple dispersed data sources and build one model.
- Multiple visuals on a single page.
- Drill down/drill up in visuals, drill-through pages, and toggling pages or visuals using bookmarks, the selection pane, and buttons.
- Ability to connect multiple data sources.
- Affordable: Power BI Desktop is free, and Pro (the Power BI Service for sharing and collaborating with other users in the organization) is $9.99 per month.
- Can integrate with Cortana, the Windows personal voice assistant.
- Integrates with all Microsoft products (Azure, SharePoint, Office 365, Microsoft Dynamics, Power Apps, Microsoft Flow).
- Dataflows in the Power BI Service to connect to Azure Data Lake Storage Gen2 and other online services.

Weaknesses:
- It is difficult for users who do not have knowledge of Excel.
- Clients who use large data sets must opt for
Premium Capacity services to avoid performance and timeout issues for their datasets and users.
- The Power BI Service is compatible with only a few database drivers.
- Power BI has a large set of product options, which makes it complex to understand which option best suits a business.

Strengths & Weaknesses of Tableau:

Strengths:
- Tableau provides beautiful visualizations, which is why it stands at the top of the BI tool market.
- Quickly combine, shape, and clean data for analysis.
- It provides data blending.
- Capable of drill down/drill up in visuals, drill-through pages, and filters.
- It can handle large amounts of data.
- Can use scripting languages such as R and Python for complex table calculations and to avoid performance issues.
- Reports, dashboards, and stories can be built with Tableau Desktop.

Weaknesses:
- Tableau is expensive compared to other tools.
- Scheduling or notification of reports and dashboards is limited.
- Importing custom visualizations is difficult.
- Embedding reports into other applications is complex.
- Tableau suits large organizations that can afford the licensing cost.

Benefits of Power BI

Microsoft is a brand. Most of us remember our school or college days, when we first learned and used Microsoft products; they are very simple to understand and user-friendly.
It is therefore no surprise that our eyes and brains are already trained on Microsoft products.
- Anyone with working experience of Excel can pick up Power BI Desktop and Mobile in no time.
- Visuals available in Excel can be pinned to the Power BI Service using the Excel add-in.
- You can build swift, reliable reports by simply dragging and dropping built-in or custom visuals; best practices for optimum report performance are available in this URL.
- Colossal learning assets are accessible, including Guided Learning in this URL.
- As part of the Microsoft family, Power BI is privileged with Single Sign-On (SSO) and tight integration with Microsoft products such as Dynamics 365, Office 365, SharePoint Online, Power Apps, Microsoft Flow, Azure SQL Database, Azure SQL Data Warehouse, Azure Analysis Services, etc.
- Power Query provides many options for wrangling and cleaning data to shape it into a proper data model.
- After publishing data to the Power BI web service, you can schedule refreshes without manual intervention.
- Power BI is backed by the superpowers of artificial intelligence and machine learning.
- Microsoft introduced the Power Platform (Power BI to measure, Power Apps to act, and Microsoft Flow to automate); more details are available in this URL.
- A forthcoming road map for Power BI from Microsoft is available in this URL.
- Power BI integrates with both Python and R for creating visualizations.
- Power BI Desktop is free ($0.00); the Power BI web service (Azure) Pro plan is $9.99 per month.

Disadvantages of Power BI

Power BI Desktop is at its best when you analyze data over a DirectQuery or live connection; it may struggle with huge volumes if you import the data into the application, and at times it can hang or simply crash. However, Microsoft's product team will likely address this in future monthly updates.

Benefits of Tableau
- Tableau can connect to various sources, effortlessly handles huge data, and is a very good tool for visualizing data and creating dashboards by simple drag and drop.
- Tableau supports the Python and R languages for creating visuals.
- Tableau featured as a Leader in Gartner's report (URL) from 2012 to 2018 and has now moved to second place.

Disadvantages of Tableau
- Tableau Creator is $70.00 per month and Tableau Online is $35 per month.
- Tableau's product team has not concentrated on advanced technologies and has missed integrating artificial intelligence and machine learning.
- Once reports are pushed to Tableau Online, scheduled refresh is not supported and the data must be refreshed manually.
- Analysts must use only the built-in visuals available in Tableau; there is no portal for importing custom visuals. Instead, developers need to create custom visuals themselves according to the requirement.
- Data preparation options in Tableau are limited for building a data model.
For advanced data wrangling and cleaning, one must take the help of other tools such as Excel, Python, R, or Tableau Prep.
- There is no integration with Microsoft products like Dynamics 365, Office 365, Power Apps, or Microsoft Flow that use Single Sign-On (SSO).

Difference between Power BI and Tableau

Let's put the differences between Power BI and Tableau in tabular form:

Power BI | Tableau
It is provided by Microsoft. | It is provided by Tableau.
It is available at an affordable price. | It is more expensive than Power BI.
Uses DAX for measures and calculated columns. | Uses MDX for measures and dimensions.
Connects to a limited set of data sources, but adds connectors in monthly updates. | Can connect to numerous data sources.
Can handle large data sets using Premium Capacity. | Can handle large data sets.
Provides account-based subscription. | Provides key-based subscription.
Embedding reports is easy. | Embedding reports is a real challenge in Tableau.

Why are Power BI and Tableau the top tools in Business Intelligence and Data Visualization?

Power BI and Tableau are the most happening BI tools because of features and capabilities such as embedded BI, data blending, and connections to multiple data sources, both cloud and on-premises databases. They make sharing reports and dashboards with users easy: business analysts can access reports and dashboards and take critical business decisions without even having access to these tools. The two tools also stand at the top of the BI market because of their attractive visualizations; Power BI's support for importing and creating custom visuals is its particular beauty.
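Data blending, one of the capabilities called out above, is conceptually a join of rows from independent sources on a shared dimension. The stdlib-only Python sketch below illustrates the idea; the sales and target rows and field names are made up for the example.

```python
# Illustrative sketch of what "data blending" does conceptually: left-join
# rows from two independent sources (say, a CSV of sales and a spreadsheet
# of targets) on a shared dimension. All data and field names are made up.
sales = [  # e.g. loaded from a CSV file
    {"region": "East", "sales": 120},
    {"region": "West", "sales": 90},
]
targets = [  # e.g. loaded from an Excel sheet
    {"region": "East", "target": 100},
    {"region": "West", "target": 110},
]


def blend(primary, secondary, key):
    """Left-join secondary rows onto primary rows by a shared key."""
    lookup = {row[key]: row for row in secondary}
    blended = []
    for row in primary:
        merged = dict(row)
        merged.update(lookup.get(row[key], {}))
        blended.append(merged)
    return blended


rows = blend(sales, targets, "region")
for r in rows:
    r["attainment"] = r["sales"] / r["target"]  # sales vs. target ratio
```

In the BI tools this join happens behind the drag-and-drop UI; the sketch just makes the underlying operation explicit.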
These facts have made them the most happening BI tools in the market to date. According to Gartner's Magic Quadrant for Analytics and Business Intelligence Platforms report, Power BI is the first choice and Tableau the second among BI tools in the present market.

Which one to choose, Power BI or Tableau?

The data analytics field has changed over time, from traditional BI practice to embedded BI and collaborative BI. Initially, data analytics was led by companies like IBM, Oracle, and SAP, but that is no longer the situation. It is now led by companies like Microsoft and Tableau because of features such as embedded BI, collaborative BI, data blending, and multi-data-source connections.

Both Power BI and Tableau have their own pros and cons. The right product can be chosen based on these touchstones and your priorities:

Touchstone | Power BI | Tableau
Description | A cloud-based business intelligence platform which offers an overview of critical data | A collection of intuitive business intelligence tools used for data discovery
Visualization | Provides various visualizations | Provides a larger set of visualizations than Power BI
OS support | Only Windows | Windows and macOS
Graphical features | Regular charts, graphs, and maps | Any category of charts, bars, and graphs
Cost | Cheaper | Costlier
Organization | Suitable for small, medium, and large organizations | Suitable for medium and large organizations
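As noted earlier, both tools can call out to Python: Tableau through TabPy and Power BI through its Python integration. Below is a sketch of the kind of pure-Python helper one might expose to a workbook, flagging outliers by z-score; the function name and the sigma thresholds are arbitrary illustrative choices, not part of either product.

```python
# Sketch of a calculation one might hand to TabPy (Tableau) or use from
# Power BI's Python integration: flag outliers by z-score. Pure stdlib;
# the default 2.0-sigma threshold is an arbitrary illustrative choice.
from statistics import mean, stdev


def flag_outliers(values, threshold=2.0):
    """Return a list of booleans marking values that lie more than
    `threshold` standard deviations from the mean."""
    if len(values) < 2:
        return [False] * len(values)  # not enough data to judge
    mu, sigma = mean(values), stdev(values)
    if sigma == 0:
        return [False] * len(values)  # constant series: no outliers
    return [abs(v - mu) / sigma > threshold for v in values]
```

Keeping such logic in a small, pure function makes it easy to unit-test outside the BI tool before wiring it into a calculated field or visual.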