Design, architect, develop, and support GCP data pipelines to extract, load, and transform data between the EDW and vendor platform.
Maintain a holistic view of information assets by creating and maintaining artifacts (i.e., documentation) that illustrate how information is stored, processed, and supported.
Working with the project team, estimate and plan the work effort.
Attend daily team standups.
Requirements
7-10+ years of data engineering experience
3+ years of experience with Google Cloud Platform; must have experience with BigQuery, GCS, and Cloud Composer (Apache Airflow)
5-10 years of Python experience
Proficient with GCP tools: Google Cloud Storage (GCS), BigQuery, and Cloud Composer / Airflow.
Proficient in developing Python ELT data pipelines.
Proficient in writing optimized BigQuery SQL data transformation queries and scripts.
Advanced SQL skills, including the ability to write, tune, and interpret SQL queries.
Experience writing and maintaining Unix/Linux scripts.
Experience with GitHub source control and CI/CD workflows.
We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.