Cyber Security Specialist - Data Engineer L4 (Remote)

Community Health System, Franklin, TN 37064

Posted 4 weeks ago

Job Description

Community Health Systems is one of the nation's leading healthcare providers. Developing and operating healthcare delivery systems in 40 distinct markets across 15 states, CHS is committed to helping people get well and live healthier. CHS operates 71 acute-care hospitals and more than 1,000 other sites of care, including physician practices, urgent care centers, freestanding emergency departments, occupational medicine clinics, imaging centers, cancer centers and ambulatory surgery centers.

Summary:

Are you looking to solve the most interesting problems at the crossroads of Data Engineering and Cyber Security?

As a Cyber Security Engineering Specialist, Data Engineer L4 for the Cyber Security Risk Management organization, you'll be responsible for acquiring, curating, and publishing data for analytical and operational uses. You will prepare data for use by data scientists, business users, and technology platforms by creating a single version of the truth for all data consumers. You will work with streaming and batch-loaded data sources from cybersecurity solutions. Successful data engineers have the skills to design, build, and maintain reliable data pipelines and ETL processes that feed databases and data warehouses using a variety of tools and techniques. You will have the opportunity to work with various programming languages and technologies, and with both structured and unstructured data.

A qualified candidate:

  • Is a lifelong learner who is passionate about technology

  • Derives joy from tackling complex problems and working through solution tradeoffs

  • Is able to learn on the fly and fill knowledge gaps on demand

  • Is able to work with a variety of people at various levels

  • Has excellent data management and QA skills and is process oriented

  • Is able to debug problems to their root cause, especially when the path leads through multiple systems or environments

  • Has an interest in working with data at the protocol level

  • Has an aptitude for data presentation and can transform raw data into meaningful, actionable reports

  • Has significant experience creating data pipelines and ETL processes

  • Is experienced with Google Cloud Composer / Apache Airflow or similar data orchestration services (see the sketch after this list)

  • Is experienced with BigQuery or other data warehouse products

  • Communicates clearly and effectively
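To make the orchestration expectation concrete, below is a minimal, hypothetical Airflow DAG of the kind a Google Cloud Composer environment would run: a daily extract-transform-load of security events. The DAG id, task callables, and the cyber_dw.events target are invented for illustration and are not part of this posting.

    # Minimal, hypothetical Airflow DAG (Cloud Composer runs standard
    # Airflow): a daily extract -> transform -> load of security events.
    # All project, dataset, and field names here are invented.
    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract(**context):
        # Placeholder extract step; in practice this would pull a day's
        # events from a source API or storage bucket.
        return [{"event": "login", "user": "alice"}]

    def transform(ti, **context):
        rows = ti.xcom_pull(task_ids="extract")
        # Standardize field names so every downstream consumer sees one schema.
        return [{"event_type": r["event"], "user_id": r["user"]} for r in rows]

    def load(ti, **context):
        rows = ti.xcom_pull(task_ids="transform")
        # A real task would write to BigQuery; this sketch just reports.
        print(f"would load {len(rows)} rows into cyber_dw.events")

    with DAG(
        dag_id="cyber_events_daily",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
        default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        transform_task = PythonOperator(task_id="transform", python_callable=transform)
        load_task = PythonOperator(task_id="load", python_callable=load)
        extract_task >> transform_task >> load_task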

Essential Duties and Responsibilities:

  • Consults on complex data product projects by analyzing moderate-to-complex end-to-end data product requirements and existing business processes to lead the design, development, and implementation of data products.

  • Builds data cleansing, imputation, and common data meaning and standardization routines for source systems by understanding business and source-system data practices and by using data profiling, source-data change monitoring, and the extraction, ingestion, and curation of data flows.

  • Produces data views and data flows for varying client demands such as dimensional data, standard and ad hoc reporting, data feeds, dashboard reporting, and data science research and exploration.

  • Translates business data stories into a technical story breakdown structure and work estimate so that value and fit for a schedule or sprint can be assessed.

  • Creates business-user access methods for structured and unstructured data through techniques such as mapping data to a common data model, transforming data as necessary to satisfy business rules, and validating data content.

  • Collaborates with enterprise teams and other internal organizations on CI/CD best practices, drawing on experience with tools such as Google Tables, JIRA, Jenkins, and Confluence.

  • Implements production processes and systems to monitor data quality, ensuring production data is always accurate and available for key stakeholders and business processes that depend on it.

  • Develops and maintains scalable data pipelines for both streaming and batch requirements, and builds out new API integrations to support continuing increases in data volume and complexity.

  • Writes and performs data unit/integration tests for data quality: with input from a business requirement or story, creates and executes test data and scripts to validate that quality and completeness criteria are satisfied, and can create automated testing programs and data that are reusable for future code changes (see the sketch after this list).

  • Practices code management and integration in line with engineering Git principles and repository practices.

  • Participates as an expert and learner in team tasks for data analysis, architecture, application design, coding, and testing practices.
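As a hedged illustration of the reusable data quality testing described in the duties above, the sketch below validates completeness and allowed-value rules on a batch of curated rows before publication; the column names and rules are invented for this example, not taken from the posting.

    # Hypothetical, reusable data quality checks for a curated batch.
    # Each check returns the indices of failing rows so failures can be
    # reported or quarantined before the data is published.

    def check_not_null(rows, column):
        """Indices of rows where a required column is missing or empty."""
        return [i for i, r in enumerate(rows) if r.get(column) in (None, "")]

    def check_allowed_values(rows, column, allowed):
        """Indices of rows whose value falls outside the allowed set."""
        return [i for i, r in enumerate(rows) if r.get(column) not in allowed]

    def run_quality_checks(rows):
        results = {
            "event_time_not_null": check_not_null(rows, "event_time"),
            "severity_in_allowed_set": check_allowed_values(
                rows, "severity", {"low", "medium", "high", "critical"}
            ),
        }
        # Keep only the checks that actually failed.
        return {name: idx for name, idx in results.items() if idx}

    if __name__ == "__main__":
        batch = [
            {"event_time": "2024-01-01T00:00:00Z", "severity": "high"},
            {"event_time": None, "severity": "urgent"},  # fails both checks
        ]
        print(run_quality_checks(batch))
        # {'event_time_not_null': [1], 'severity_in_allowed_set': [1]}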

Qualifications:

  • Required Education: Undergraduate studies in computer science, management information systems, business, statistics, math, or a related field, or comparable experience and education, are strongly preferred.

  • Preferred Education: Graduate studies in business, statistics, math, computer science or a related field are a plus.

  • Required Experience:

  • Five to eight years of relevant experience with data quality rules, data management organization/standards and practices.

  • Three to five years' experience in data warehousing and queries.

  • Experience with Cloud technology and infrastructure.

  • Data application and practice knowledge.

  • Strong problem solving, oral and written communication skills.

  • Ability to influence, build relationships, negotiate, and present to senior leaders.

  • Experience manipulating, processing, and extracting value from large disconnected datasets

  • Advanced query authoring (SQL)

  • Advanced Python scripting

  • Advanced working knowledge of a variety of databases

  • Working experience with Git and with GitHub or GitLab.

  • Experience with data modeling and design in a data warehouse setting.

  • Preferred Experience:

  • Healthcare/Insurance/financial services industry knowledge

  • Cyber Security experience

  • Experience with AI / Machine Learning

  • Experience with Google Dataflow or Dataproc

  • Experience with GitHub integration with Google Composer for automated code deployment

  • Experience handling and working with sensitive data

  • Experience with Collibra

  • Required License/Registration/Certification: None

  • Computer Skills Required:

  • Advanced skills with Python and SQL required

  • Experience with other modern programming and scripting languages is a plus (e.g., R, Spark, JavaScript, Java, UNIX shell scripting, Perl, Ruby)

  • Desired experience with Looker / Google Data Studio, BigQuery, and Apache Airflow / Google Cloud Composer (see the query sketch after this list)
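To make the advanced SQL and BigQuery expectations above concrete, here is a hedged sketch that runs a parameterized window-function query through the google-cloud-bigquery Python client; the project, dataset, table, and column names are assumptions for illustration only.

    # Hypothetical example: count each user's security events in a
    # trailing one-hour window with an analytic function, filtering on
    # a query parameter. Table and column names are invented.
    from google.cloud import bigquery

    client = bigquery.Client()  # uses Application Default Credentials

    sql = """
        SELECT
          user_id,
          event_time,
          COUNT(*) OVER (
            PARTITION BY user_id
            ORDER BY UNIX_SECONDS(event_time)
            RANGE BETWEEN 3600 PRECEDING AND CURRENT ROW
          ) AS events_last_hour
        FROM `my-project.cyber_dw.events`
        WHERE severity = @sev
    """

    job_config = bigquery.QueryJobConfig(
        query_parameters=[bigquery.ScalarQueryParameter("sev", "STRING", "high")]
    )

    for row in client.query(sql, job_config=job_config).result():
        print(row.user_id, row.events_last_hour)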

Physical Demands:

To perform this job successfully, with or without a reasonable accommodation, an employee must be able to meet the physical demands outlined below:

  • The Employee is required to read, review, prepare, and analyze written data and figures using a PC or similar device, and should possess visual acuity.

  • The Employee may be required to occasionally climb, push, stand, walk, reach, grasp, kneel, stoop, and/or perform repetitive motions.

  • The Employee is not substantially exposed to adverse environmental conditions; therefore, job functions are typically performed under conditions such as those found within general office or administrative work.
