Solid reputation, passionate people and endless opportunities. That's Travelers. Our superior financial strength and consistent record of strong operating returns mean security for our customers - and opportunities for our employees. You will find Travelers to be full of energy and a workplace in which you truly can make a difference.
Job Description Summary
Travelers is seeking a Software Engineer I to join our organization as we grow and transform our Technology landscape.
The Software Engineer I will participate in the design, build, and management of large-scale data structures and pipelines and efficient Extract/Transform/Load (ETL) workflows. The engineer will complete intermediate end-to-end engineering tasks for specific system assignments, including developing, analyzing, configuring, testing, debugging, troubleshooting, documenting, health monitoring/alerting, and implementing based on user or system design specifications, as well as conducting impact analysis and escalating appropriately.
Primary Job Duties & Responsibilities
Perform analysis, design, development, and configuration functions as well as define technical requirements for assignments of intermediate complexity.
Participate with team to perform analysis, assessment and resolution for defects and incidents of intermediate complexity and escalate appropriately.
Work within guidelines set by team to independently tackle well-scoped problems.
Seek opportunities to expand technical knowledge and capabilities.
Participate in the development of large-scale data structures and pipelines to organize, collect, and standardize data that helps generate insights and addresses reporting needs.
Write ETL (Extract/Transform/Load) processes, design database systems, and develop tools for offline analytic processing.
Use advanced programming skills in Python, Scala, or any of the major languages to build robust data pipelines and dynamic systems.
Use in-depth knowledge of Hadoop architecture and HDFS commands, and experience designing and optimizing queries, to build scalable, modular, and efficient data pipelines.
Build data marts and data models to support clients and other internal customers.
Integrate data from a variety of sources, ensuring that it adheres to data quality and accessibility standards.
Education, Work Experience, & Knowledge
Bachelor's degree or its equivalent in work experience.
One year of programming/development experience.
Job Specific Technical Skills & Competencies
Experience with one or more modern data ingestion/curation/wrangling tools (e.g., Ab Initio, Informatica, Talend, AWS Glue)
Ability to leverage multiple tools and programming languages to analyze and manipulate data sets from disparate data sources
Experience with one or more data platforms (e.g., Teradata, Hadoop, Oracle, SQL Server, DB2)
Proficiency in writing complex SQL scripts and procedures is a must
Experience building data transformation and processing solutions
Experience building high-volume data pipelines on distributed compute platforms such as EMR and Databricks using Python, PySpark, Hive, etc.
Experience with Bash shell scripts, UNIX utilities, and UNIX commands
Experience with Hadoop architecture and HDFS commands, and experience designing and optimizing queries against data in the HDFS environment
Knowledge of NoSQL databases (e.g., MongoDB, Cassandra) is preferred
Experience writing Shell/Perl scripts on Linux and PowerShell on Windows is nice to have
Exposure to Big Data/Hadoop applications using Sqoop, Pig, Hive, Python, Spark, Kafka, and Storm
Experience in building software solutions on Public Cloud such as AWS or GCP is a plus
Equal Employment Opportunity Statement
Travelers is an equal opportunity employer.
The Travelers Companies