GCP Data Engineer - Senior Level

CereCore | Remote, US

Posted 4 weeks ago

Classification: Contract
Contract Length: 12 Months

Location: 100% Remote
Job ID: 16464710

CereCore® provides EHR implementations, IT and application support, IT managed services, technical staffing, strategic IT consulting, and advisory services to hospitals and health systems nationwide. Our heritage is in the hallways of some of America’s top-performing hospitals. We have served as leaders in finance, operations, technology, and as clinicians turned power users and innovators. At CereCore, we know firsthand the power that aligned technology can provide in delivering care. As a wholly-owned subsidiary of HCA Healthcare, we are committed to bringing the expertise we have gained as operators to deliver IT services that emphatically address the needs of health systems across the United States. Our team of over 600 clinical and technical professionals has implemented EHR systems in more than 400 facilities and provides managed services support to tens of thousands of health system employees. We work tirelessly to provide healthcare organizations specialized IT services that support the delivery of patient care. The Link to Life-Saving Care.
CereCore is seeking a GCP Data Engineer – Senior Level to join our team remotely. The Sr. Cloud Data Engineer will integrate a new vendor platform into our Google Cloud Platform (GCP) Enterprise Data Warehouse (EDW). This role is responsible for Extract, Load, Transform (ELT) data pipeline research, architecture, development, and support. The successful candidate will have excellent verbal and written communication skills and the ability to establish effective working relationships and manage multiple priorities. This position also works with business analysts and project management to review business requirements and produce technical design specs that meet those requirements.

Responsibilities
  • Design, architect, develop, and support GCP data pipelines to extract, load, and transform data between the EDW and vendor platform.
  • Maintain a holistic view of information assets by creating and maintaining artifacts (i.e., documentation) that illustrate how information is stored, processed, and supported.
  • Estimate and plan the work effort in collaboration with the project team.
  • Attend daily team standups.
Requirements
  • 7-10+ years of Data Engineering experience
  • 3+ years of experience with Google Cloud Platform; must have experience with BigQuery, GCS, and Cloud Composer (Apache Airflow)
  • 5-10 years of Python experience
  • Proficient with GCP tools: Google Cloud Storage (GCS), BigQuery, and Cloud Composer / Airflow.
  • Proficient developing Python ELT data pipelines.
  • Proficient in writing optimized BigQuery SQL data transformation queries and scripts.
  • Advanced SQL skills, including the ability to write, tune, and interpret SQL queries.
  • Experience writing and maintaining Unix/Linux scripts.
  • Experience with GitHub source control and CI/CD workflows.
We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.