Course Description

Welcome to the Web Scraping and Mapping Dam Levels in Python and Leaflet (Linux Version) course. We'll be building a Python GIS application from scratch using a variety of open-source technologies. The purpose of this course, and of many more to follow, is to learn to create geospatial analytics and turn them into a functional application. In our use case, we will work with dam storage level data, using web scraping and data processing techniques to extract, transform, and load the data into our spatial database. Once we have processed and cleaned the data, we will use it as the data source for our GeoDjango web map application. The application is powered by a PostgreSQL and PostGIS database. On the front end we'll use Bootstrap, JavaScript, Leaflet, and Ajax. On the server side, we'll use Python and Django, combined with scientific libraries such as pandas for our data transformation and conversion operations. The operating system we will be working on is Ubuntu Linux LTS. (Course update: January 2024)
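
To give a concrete flavour of the extract, transform, and load step described above, here is a minimal sketch of scraping a table of dam storage levels and cleaning it with pandas. The URL, column names, and table layout are hypothetical placeholders for illustration, not the actual data source or schema used in the course.

```python
# Minimal ETL sketch: scrape an HTML table of dam levels and clean it with pandas.
# The URL and column names are hypothetical placeholders, not the course's source.
import io

import pandas as pd
import requests

DATA_URL = "https://example.com/dam-levels"  # placeholder URL


def fetch_dam_levels(url: str = DATA_URL) -> pd.DataFrame:
    """Download the page and parse its first HTML table into a DataFrame."""
    response = requests.get(url, timeout=30)
    response.raise_for_status()
    # read_html returns one DataFrame per <table> element on the page
    # (requires lxml or html5lib to be installed for HTML parsing).
    return pd.read_html(io.StringIO(response.text))[0]


def clean_dam_levels(raw: pd.DataFrame) -> pd.DataFrame:
    """Normalise column names and coerce the storage level to a numeric type."""
    df = raw.rename(columns={"Dam": "name", "% Storage": "storage_pct"})
    df["storage_pct"] = pd.to_numeric(df["storage_pct"], errors="coerce")
    return df.dropna(subset=["storage_pct"])


if __name__ == "__main__":
    print(clean_dam_levels(fetch_dam_levels()).head())
```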

What will you learn?

  • How to apply web scraping to collect dam level data from a website

  • How to use the Django Template Engine, as an alternative to Ajax, to pass data from the back-end to the front-end.

  • How to build a Spatial Database using PostgreSQL and PostGIS (see the configuration sketch after this list).

  • How to create charts with Chart.js.

  • How to build Web Maps with Leaflet.js.

  • How to build REST API Endpoints.

  • Some JavaScript programming.

  • How to build Web Applications using the Django MVC framework.

  • How to build a small dashboard that floats over your map and contains graphs that visualize your model's data.
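
As a taste of the spatial database and GeoDjango items above, the sketch below shows a Django settings entry pointing at a PostGIS-enabled PostgreSQL database, together with a simple GeoDjango model for a dam. The database credentials, model name, and field names are illustrative assumptions rather than the exact ones used in the course.

```python
# settings.py (excerpt) -- connect Django to a PostGIS-enabled PostgreSQL database.
# Database name, user, and password are illustrative placeholders.
DATABASES = {
    "default": {
        "ENGINE": "django.contrib.gis.db.backends.postgis",
        "NAME": "dam_levels_db",
        "USER": "postgres",
        "PASSWORD": "change-me",
        "HOST": "localhost",
        "PORT": "5432",
    }
}

# models.py (excerpt) -- a GeoDjango model storing each dam as a point geometry.
from django.contrib.gis.db import models


class Dam(models.Model):
    name = models.CharField(max_length=100)
    storage_pct = models.FloatField()        # current storage level in percent
    location = models.PointField(srid=4326)  # WGS 84 longitude/latitude

    def __str__(self):
        return self.name
```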

GEO Premium

Access our ENTIRE content library, not just courses. We provide you with courses, tools, and data to start learning and advance your skills.

Course curriculum

  1. Chapter 01
  2. Chapter 02
    • Lesson 2: Installing PostgreSQL and PostGIS Part 1

    • Lesson 3: Installing PostgreSQL and PostGIS Part 2

    • Lesson 4: Creating the Spatial Database

  3. Chapter 03
    • Lesson 5: Creating a Python Virtual Environment

    • Lesson 6: Installing and Configuring GeoDjango

    • Lesson 7: Installing Visual Studio Code IDE

    • Lesson 8: Creating the Django Base Application

    • Lesson 9: Testing the Django Installation

  4. Chapter 04
    • Lesson 10: Adding the Spatial Database to our Django Backend

    • Lesson 11: Creating a Django Admin User

    • Lesson 12: Creating the Model

  5. Chapter 05
    • Lesson 13: Scraping Data From the Web

    • Lesson 14: Cleaning and Transforming the Data Part 1

    • Lesson 15: Cleaning and Transforming the Data Part 2

    • Lesson 16: Loading the Data

  6. Chapter 06
    • Lesson 17: Adding the Leaflet Config Code

    • Lesson 18: Adding the Static Files

    • Lesson 19: Creating the Layout Page Part 1

    • Lesson 20: Creating the Layout Page Part 2

    • Lesson 21: Creating the Index Page

    • Lesson 22: Creating the Index View

  7. Chapter 07
    • Lesson 23: Creating the Dataset API Endpoints (a minimal endpoint sketch follows this curriculum)

    • Lesson 24: Displaying Data on the Map

    • Lesson 25: Creating the Sliding Sidebar

    • Lesson 26: Creating the Pie Chart

    • Lesson 27: Creating the Multi-Bar Chart

    • Lesson 28: Creating the KPI
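
As an illustration of the endpoint work in Lesson 23 (noted in the curriculum above), the sketch below exposes the dam data as GeoJSON so the Leaflet map can request it via Ajax. It assumes the hypothetical Dam model from the earlier sketch; the view name and URL path are also assumptions, not the course's exact code.

```python
# views.py (excerpt) -- a minimal GeoJSON endpoint for the dam dataset.
# Assumes the hypothetical Dam model sketched earlier; names are illustrative.
from django.core.serializers import serialize
from django.http import HttpResponse

from .models import Dam  # hypothetical model from the earlier sketch


def dam_dataset(request):
    """Return all dams as a GeoJSON FeatureCollection for the Leaflet map."""
    geojson = serialize(
        "geojson",
        Dam.objects.all(),
        geometry_field="location",
        fields=("name", "storage_pct"),
    )
    return HttpResponse(geojson, content_type="application/json")


# urls.py (excerpt) -- wiring the endpoint into the project's URL configuration:
# from django.urls import path
# urlpatterns = [path("api/dams/", dam_dataset, name="dam_dataset")]
```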

About your Instructor

Data Engineer and Business Intelligence consultant with a BSc in Computer Science and around 5 years of experience in IT. I have been involved in multiple projects ranging from Business Intelligence and Software Engineering to IoT and Big Data analytics. My expertise is in building data processing pipelines in the Hadoop and cloud ecosystems, and in software development. My career started as an embedded software engineer writing firmware for integrated microchips; I then moved on to work as an ERDAS APOLLO developer at Geo Data Design, a Hexagon Geospatial partner. I am now a consultant at one of the top Business Intelligence consultancies, helping clients build data warehouses, data lakes, cloud data processing pipelines, and machine learning pipelines. The technologies I use to meet client requirements include Hadoop, Amazon S3, Python, Django, Apache Spark, MSBI, Microsoft Azure, SQL Server Data Tools, Talend, and Elastic MapReduce.

Edwin Bomela

Data Engineer and business intelligence consultant

Some more information

  • Certificates of Completion

    After you successfully finish the course, you can claim your Certificate of Completion at NO extra cost! You can add it to your CV, LinkedIn profile, etc.

  • Available at any time! Study at the time that suits you best

    We know how hard it is to acquire new skills. All our courses are self-paced.

  • Online and always accessible

    Even after you finish the course and receive your certificate, you will still have access to the course contents! Every time an instructor makes an update, you will be notified and will be able to watch it for FREE.