Course Description

Welcome to the Building Big Data Pipelines with R & Sparklyr & Tableau course. In this course we will create a big data analytics solution using big data technologies available for R.

In our use case we will work with raw earthquake data, applying big data processing techniques to extract, transform and load it into usable datasets. Once the data has been processed and cleaned, we will use it as a data source for building predictive analytics and visualizations.

The tools we will be using are the following:

Tableau Desktop, a powerful data visualization tool used for big data analysis and visualization. It allows for data blending, real-time analysis and collaboration. No programming is needed for Tableau Desktop, which makes it an easy yet powerful tool for creating dashboards, applications and reports.

Sparklyr, an open-source library used for processing big data in R by providing an interface between R and Apache Spark. It allows you to take advantage of Spark's ability to process and analyze large datasets in a distributed and interactive manner. It also provides interfaces to Spark's distributed machine learning algorithms and much more.
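
As a quick illustration of that interface, here is a minimal sketch (not taken from the course material) of connecting R to a local Spark instance with sparklyr and querying a Spark dataframe with dplyr verbs; the local master and the built-in mtcars data frame are placeholders for illustration only:

    # Minimal sparklyr sketch: connect to a local Spark instance,
    # copy a small R data frame into Spark and query it with dplyr verbs.
    library(sparklyr)
    library(dplyr)

    sc <- spark_connect(master = "local")   # local Spark, for illustration only

    cars_tbl <- copy_to(sc, mtcars, "cars", overwrite = TRUE)

    cars_tbl %>%
      group_by(cyl) %>%
      summarise(avg_mpg = mean(mpg, na.rm = TRUE)) %>%  # executed by Spark, not R
      collect()                                         # pull the small result back into R

    spark_disconnect(sc)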


Course curriculum

  Section 01
    • Introduction to the course

  Section 02
    • R Installation

    • Apache Spark Installation

    • Java Installation

    • Testing Spark

    • Sparklyr Installation

  Section 03
    • Dataset Extraction

    • Dataset Transformation and Cleaning (see the sparklyr sketch after this curriculum)

    • Writing the Data to CSV

  Section 04
    • Data Pre-Processing and Preparation

    • Building the Machine Learning Model

    • Prediction Results

  Section 05
    • Tableau Desktop Trial Installation

    • Data Source Import

    • Earthquake Prediction Map Visualization

    • Bar Chart Visualization of Earthquake Occurrence

    • Doughnut Chart of Different Earthquake Types

    • Plot of Maximum and Average Magnitude Values

    • How to Create Dashboard Analytics

  Section 06
    • Source Code
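
For orientation, the setup and ETL steps covered in Sections 02 and 03 might look roughly like the sketch below in sparklyr. The Spark version, file paths and column names (for example mag, latitude, longitude, type) are assumptions for illustration only; the course walks through the actual dataset and code step by step:

    # Sketch of the setup and ETL flow (Sections 02-03); paths, Spark version
    # and column names are assumptions, not the course's exact code.
    install.packages("sparklyr")            # one-time package installation
    library(sparklyr)
    library(dplyr)

    spark_install(version = "3.3")          # install a local Spark (version assumed)
    sc <- spark_connect(master = "local")

    # Dataset extraction: read the raw earthquake CSV into a Spark dataframe
    quakes_raw <- spark_read_csv(
      sc, name = "quakes_raw",
      path = "data/earthquakes.csv",        # hypothetical path
      header = TRUE, infer_schema = TRUE
    )

    # Transformation and cleaning: drop incomplete rows, keep the columns we need
    quakes_clean <- quakes_raw %>%
      filter(!is.na(mag), !is.na(latitude), !is.na(longitude)) %>%
      select(time, latitude, longitude, depth, mag, type)

    # Write the cleaned data back out as CSV for the later stages
    spark_write_csv(quakes_clean, path = "output/earthquakes_clean", mode = "overwrite")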

Pricing - Lifetime Access

What will you learn?

  • How to create big data processing pipelines using R.

  • Machine learning with geospatial data using the Sparklyr library (see the sketch after this list).

  • Data analysis using Sparklyr, R and Tableau.

  • How to manipulate, clean and transform data using Spark dataframes.

  • How to create Geo Maps in Tableau Desktop.

  • How to create dashboards in Tableau Desktop.
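
As a taste of the modelling covered in Section 04, here is a minimal sketch of fitting one of Spark's distributed machine learning algorithms on geospatial features with sparklyr, continuing from the ETL sketch above. The choice of model (a linear regression on latitude, longitude and depth to predict magnitude) and the column names are assumptions for illustration; the course builds and evaluates its own model:

    # Sketch of Section 04 (pre-processing, model, predictions), continuing from
    # the quakes_clean table in the ETL sketch above; model and columns are assumed.
    library(sparklyr)
    library(dplyr)

    splits <- sdf_random_split(quakes_clean, training = 0.8, test = 0.2, seed = 42)

    model <- ml_linear_regression(
      splits$training,
      mag ~ latitude + longitude + depth    # geospatial features predicting magnitude
    )

    predictions <- ml_predict(model, splits$test)   # adds a prediction column

    predictions %>%
      select(latitude, longitude, mag, prediction) %>%
      head(10) %>%
      collect()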

GEO Premium

Access our ENTIRE content library instantly with a subscription

Student profile

  • Undergraduate students

  • Master's students and PhD candidates

  • Researchers and Academics

  • Professionals and Companies

Some more information

  • Certificates of Completion

    After you successfully finish the course, you can claim your Certificate of Completion at NO extra cost! You can add it to your CV, LinkedIn profile, etc.

  • Available at any time! Study on your own schedule

    We know how hard it is to acquire new skills. All our courses are self-paced.

  • Online and always accessible

    Even after you finish the course and receive your certificate, you will still have access to the course contents! Every time an instructor makes an update, you will be notified and able to watch it for FREE

About your Instructor

Data Engineer and business intelligence consultant with a BSc in Computer Science and around 5 years of experience in IT. I have been involved in multiple projects spanning Business Intelligence, Software Engineering, IoT and Big Data analytics. My expertise is in building data processing pipelines in the Hadoop and Cloud ecosystems and in software development. My career started as an embedded software engineer writing firmware for integrated microchips; I then moved on to become an ERDAS APOLLO developer at Geo Data Design, a Hexagon Geospatial partner. I am now a consultant at one of the top business intelligence consultancies, helping clients build data warehouses, data lakes, cloud data processing pipelines and machine learning pipelines. The technologies I use to meet client requirements include Hadoop, Amazon S3, Python, Django, Apache Spark, MSBI, Microsoft Azure, SQL Server Data Tools, Talend and Elastic MapReduce.

Edwin Bomela

Data Engineer and business intelligence consultant