SR. Scala Engineer, Database Engineering

Experfy Inc, Chicago, IL 60632

Posted 6 days ago

As a Sr. Software Engineer on our Data Platform Engineering team, you will join skilled Scala engineers and core database developers responsible for developing hosted cloud analytics infrastructure (Apache Spark-based), distributed SQL processing frameworks, proprietary data science platforms, and core database optimization. This team builds the automated, intelligent, and highly performant query planner and execution engines, RPC calls between data warehouse clusters, shared secondary cold storage, and more. The work includes building new SQL features and customer-facing functionality, developing novel query optimization techniques for industry-leading performance, and building a database system that is highly parallel, efficient, and fault-tolerant. This is a vital role reporting to executive leadership and senior engineering leadership.


  • Writing Scala code with tools like Apache Spark + Apache Arrow to build a hosted, multi-cluster data warehouse for Web3
  • Developing database optimizers, query planners, query and data routing mechanisms, cluster-to-cluster communication, and workload management techniques
  • Scaling up from proof of concept to cluster scale (and eventually hundreds of clusters with hundreds of terabytes each), in terms of both infrastructure/architecture and problem structure
  • Codifying best practices for future reuse in the form of accessible, reusable patterns, templates, and code bases to facilitate metadata capture and management
  • Managing a team of software engineers writing new code to build a bigger, better, faster, more optimized HTAP database (using Apache Spark, Apache Arrow and a wealth of other open source data tools)
  • Interacting with exec team and senior engineering leadership to define, prioritize, and ensure smooth deployments with other operational components
  • Staying highly engaged with industry trends in the analytics domain, from data acquisition and processing to engineering and management
  • Understanding data and analytics use cases across Web3 / blockchains
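The query-planning and optimization work described above can be sketched in plain Scala as a tiny rule-based optimizer applying one classic rewrite, predicate pushdown. This is a hypothetical illustration only: the `Plan` types and `Optimizer` object are invented for the example and are not part of Apache Spark or any real engine.

```scala
// Hypothetical sketch of rule-based query optimization (predicate pushdown),
// the kind of logical-plan rewriting a query planner performs.
sealed trait Plan
case class Scan(table: String) extends Plan
case class Filter(pred: String, child: Plan) extends Plan
case class Join(left: Plan, right: Plan) extends Plan

object Optimizer {
  // Push a filter below a join when the predicate references only one side,
  // so fewer rows reach the (expensive) join.
  def pushDown(plan: Plan): Plan = plan match {
    case Filter(p, Join(l, r)) if refersTo(p, l) =>
      Join(pushDown(Filter(p, l)), pushDown(r))
    case Filter(p, Join(l, r)) if refersTo(p, r) =>
      Join(pushDown(l), pushDown(Filter(p, r)))
    case Filter(p, c) => Filter(p, pushDown(c))
    case Join(l, r)   => Join(pushDown(l), pushDown(r))
    case s: Scan      => s
  }

  // Toy check: a predicate like "orders.amount > 100" refers to a side
  // if that side scans the "orders" table.
  private def refersTo(pred: String, side: Plan): Boolean = side match {
    case Scan(t)      => pred.startsWith(t + ".")
    case Filter(_, c) => refersTo(pred, c)
    case Join(l, r)   => refersTo(pred, l) || refersTo(pred, r)
  }
}
```

A production planner (such as Spark's Catalyst) expresses rewrites like this as rules over a tree of logical operators and applies them to a fixed point; the sketch shows the shape of that idea, not its implementation.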
Skills & Qualifications
  • Bachelor's degree in computer science or a related technical field; Master's or PhD a plus
  • 6+ years of experience engineering software and data platforms / enterprise-scale data warehouses, preferably with knowledge of the open-source Apache stack (especially Apache Spark and Apache Arrow)
  • 3+ years of experience with Scala and Apache Spark
  • A track record of recruiting and leading technical teams in a demanding talent market
  • Rock-solid engineering fundamentals; experience with query planning, optimization, and distributed data warehouse systems is preferred but not required
  • Nice to have: knowledge of blockchain indexing, Web3 compute paradigms, proofs, and consensus mechanisms
  • Experience with rapid development cycles in a web-based environment
  • Strong scripting and test automation knowledge
  • Nice to have: passion for Web3, blockchain, and decentralization, and a baseline understanding of how data and analytics play into this space
