Data Operations Engineer
HCL Global Systems
Salt Lake City, UT 84103
Posted 2 months ago
Role: Data Operations Engineer
Duration: 12+ Months
Salt Lake City, UT
Interview: Webex / Telephonic
Required Skills: Experience with continuous integration tools (Git, Jenkins, TFS, Maven, Nexus). Strong UNIX/Linux systems administration skills, including configuration, troubleshooting, and automation
Job Description:
Lead Data Operations Engineer for one of our financial clients in Salt Lake City. This position is a long-term contract.
We are looking for a team player to join our data analytics team, playing an essential role in developing and deploying enterprise-grade platforms that enable data-driven solutions.
You are both a generalist, capable of picking up and working with multiple, disparate systems, and an expert, able to dive deep into specific topics and quickly master them.
Things you do:
Evangelize and lead the team in Data Operations best practices, ensuring delivery of highly available and scalable systems
Communicate decisions, ideas, designs, and the operation of systems and services to others in a clear and concise manner
Foster collaboration with software product development, architecture, and IT teams to ensure releases are delivered with repeatable and auditable processes
Build and deploy reproducible infrastructure via common Infrastructure as Code tooling (Ansible, SaltStack, Terraform, etc.)
Background and Skills:
Engineering background in Computer Science, Computer Engineering, Mathematics, or Software Engineering
Strong UNIX/Linux systems administration skills, including configuration, troubleshooting, and automation
Strong experience configuring and/or integrating with monitoring and logging solutions such as syslog, the ELK stack (Elasticsearch, Logstash, Kibana), and Kafka
1+ years building/maintaining CI/CD pipelines in an enterprise setting
1+ years working with Big Data technologies (e.g., Hadoop, Kafka, Spark, Cassandra)
Experience with continuous integration tools (Git, Jenkins, TFS, Maven, Nexus)
Experience building data pipelines and automating Big Data platform applications/services
Experience with a variety of data stores/platforms (data warehouses, data marts, NoSQL)
Keywords: CI/CD, Git, Jenkins, TFS, Maven, Nexus, UNIX/Linux, Ansible, Big Data (Hadoop, Kafka, Spark, Cassandra), ELK (Elasticsearch, Logstash, Kibana)