ETL Python

Posted 3 years ago
Job Location
Experience: 4+ years

Brief Job Description

  • In-depth knowledge of distributed computing with Spark.
  • Deep understanding of Spark architecture and internals.
  • Proven experience in data ingestion, data integration, and data analytics with Spark, preferably PySpark.
  • Expertise in ETL processes, data warehousing, and data lakes.
  • Hands-on experience with Python for big data and analytics.
  • Hands-on experience in the Agile Scrum model is an added advantage.
  • Knowledge of CI/CD and orchestration tools is desirable.
  • Knowledge of AWS S3, Redshift, and Lambda is preferred.
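The ingestion-and-transform work described above follows the classic extract → transform → load shape. A minimal sketch in plain Python (in a real PySpark job the same steps would be DataFrame operations such as `spark.read.csv`, `filter`, and `withColumn`; the field names and sample data here are illustrative, not from the posting):

```python
import csv
import io

def extract(raw_csv: str) -> list[dict]:
    """Extract: parse raw CSV text into records (stand-in for spark.read.csv)."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(records: list[dict]) -> list[dict]:
    """Transform: drop bad rows and cast/derive columns
    (stand-in for DataFrame.filter / withColumn)."""
    out = []
    for r in records:
        if not r["amount"]:
            continue  # drop rows with a missing amount
        out.append({**r, "amount": float(r["amount"]), "currency": "USD"})
    return out

def load(records: list[dict]) -> dict:
    """Load: aggregate before landing in the warehouse
    (stand-in for a Redshift COPY or DataFrame.write)."""
    return {"rows": len(records), "total": sum(r["amount"] for r in records)}

raw = "id,amount\n1,10.5\n2,\n3,4.5\n"
summary = load(transform(extract(raw)))
print(summary)  # {'rows': 2, 'total': 15.0}
```

The point of keeping the three stages as separate functions is that each can be swapped for its distributed equivalent without touching the others.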

Job Requirements

  • 4+ years of experience
  • Data warehousing concepts
  • Experience with any ETL tool (mandatory); SnapLogic preferred
  • AWS ecosystem plus Redshift or Snowflake
  • Strong Python scripting and PySpark (mandatory)
  • Experience sourcing data from APIs
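Sourcing data from an API usually means fetching paginated JSON, flattening it into records, and landing those for downstream ETL. A minimal sketch with the HTTP call stubbed out (in practice `fetch_page` would be a `requests` or `urllib` GET; the endpoint shape and field names are hypothetical):

```python
import json

def fetch_page(page: int) -> str:
    # Stub for an HTTP GET, e.g.
    # requests.get(f"https://api.example.com/orders?page={page}").text
    pages = {
        1: '{"results": [{"id": 1, "total": "9.99"}], "next": 2}',
        2: '{"results": [{"id": 2, "total": "5.01"}], "next": null}',
    }
    return pages[page]

def source_all() -> list[dict]:
    """Follow pagination until the API reports no next page."""
    records, page = [], 1
    while page is not None:
        body = json.loads(fetch_page(page))
        for item in body["results"]:
            # Cast string amounts to floats at the boundary
            records.append({"id": item["id"], "total": float(item["total"])})
        page = body["next"]  # null in the JSON becomes None, ending the loop
    return records

rows = source_all()
print(rows)  # [{'id': 1, 'total': 9.99}, {'id': 2, 'total': 5.01}]
```

Casting and validating at the API boundary keeps malformed upstream data out of the warehouse load step.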

Apply Online
