Red Hat Inc., Raleigh, NC 27611
Posted 2 weeks ago
About the job
Develop, test, and deploy microservices on Kubernetes and OpenShift using Kafka streaming technologies, including Confluent Kafka APIs. Develop, test, and deploy Kafka streaming applications used to verify data accuracy and throughput.
Telecommuting permitted: work may be performed within normal commuting distance from the Red Hat, Inc. office in Raleigh, NC.
What you will do
Contribute to and maintain open-source Python libraries useful for working with Kafka streaming technologies.
Work independently to design, implement, test, deploy, and maintain stable, scalable full-stack applications using Kafka streaming technologies.
Work with scrum teams across departments to build, test, deploy, and maintain scalable container-based solutions.
Identify and adopt best practices for data integrity, test design, analysis, validation, and coding; recommend ways to improve data reliability, efficiency, and quality.
Create ETL processes that drive business value for marketing and sales teams. Conduct tier-2 investigations, working with a data anomaly team and applying critical thinking to analyze and answer platform questions.
Develop automated unit tests, end-to-end tests, and integration tests to assist in quality assurance procedures.
Implement system health monitoring, reporting, and meaningful and actionable alerts.
Design and build data architecture for ingesting, processing, and surfacing data in large-scale, data-intensive applications that save time for marketers.
What you will bring
Bachelor's degree (U.S. or foreign equivalent) in Computer Science, Computer Engineering or related field and five (5) years of experience in the job offered or related role.
Must have three (3) years of experience with: Python application development; Apache Kafka support and development; OpenShift Containerization Platform and Kubernetes; job scheduling technologies including Airflow and Luigi; API development and testing; and Unit Testing frameworks.
Must have two (2) years of experience with: Kafka streaming application development; ETL development; Git and CI/CD methodologies and technologies; database technologies including MongoDB, MariaDB, NoSQL, MySQL, and SQLite; and shell scripting.
Must have one (1) year of experience with data modeling.