Big Data Engineer

Quicken Loans Inc – Detroit, MI 48222

Posted 3 months ago

The Rock Family of Companies is made up of nearly 100 separate businesses spanning fintech, sports, entertainment, real estate, startups and more. We're united by our culture – a drive to find a better way that fuels our commitment to our clients, our community and our team members. We believe in and build inclusive workplaces, where every voice is heard and diverse perspectives are welcomed. Working for a company in the Family is about more than just a job – it's about having the opportunity to become the best version of yourself.

The Big Data Engineer is responsible for the design, development and maintenance of the big data platform and solutions at Quicken Loans. This includes the platform that hosts data sets supporting various business operations and enabling data-driven decisions, as well as the analytical solutions that provide visibility and decision support using big data technologies. The Big Data Engineer is responsible for administering a Hadoop cluster, developing data integration solutions, resolving technical issues, and working with Data Scientists, Business Analysts, System Administrators and Data Architects to ensure the platform meets business demands. This team member also ensures that solutions are scalable, include necessary monitoring, and adhere to best practices and guidelines. The Big Data Engineer helps mentor new team members and continues to grow their knowledge of new technologies.

Responsibilities

  • Develop ELT processes from various data repositories and APIs across the enterprise, ensuring data quality and process efficiency

  • Develop data processing scripts using Spark

  • Develop relational and NoSQL data models using Hive and HBase to conform data to users' needs

  • Integrate platform into the existing enterprise data warehouse and various operational systems

  • Develop administration processes to monitor cluster performance, resource usage, backup and mirroring to ensure a highly available platform

  • Address performance and scalability issues in a large-scale data lake environment

  • Provide big data platform support and issue resolutions to Data Scientists and fellow engineers
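To illustrate the kind of batch ETL work the responsibilities above describe, here is a minimal toy sketch in plain Python. In the role itself this would be done with Spark against enterprise sources; the column names, the sample data and the data-quality rule are hypothetical examples, not anything specified in the posting.

```python
import csv
import io

# Toy batch ETL: extract rows from a CSV source, transform them with a
# basic data-quality rule, and load the result into a target store
# (here, an in-memory list standing in for a warehouse table).
# All field names and values below are illustrative.

RAW_CSV = """loan_id,amount,state
1001,250000,MI
1002,,OH
1003,410000,MI
"""

def extract(source: str) -> list[dict]:
    """Read raw records from a CSV text source."""
    return list(csv.DictReader(io.StringIO(source)))

def transform(rows: list[dict]) -> list[dict]:
    """Drop incomplete records and cast fields to typed values."""
    out = []
    for row in rows:
        if not row["amount"]:
            continue  # data-quality rule: skip records with no amount
        out.append({"loan_id": int(row["loan_id"]),
                    "amount": float(row["amount"]),
                    "state": row["state"]})
    return out

def load(rows: list[dict], target: list) -> list:
    """Append cleaned records to the target store."""
    target.extend(rows)
    return target

warehouse = load(transform(extract(RAW_CSV)), [])
print(len(warehouse))  # 2 clean records loaded; the empty-amount row is dropped
```

The same extract/transform/load shape carries over to Spark, where `extract` would read from HDFS or an API, `transform` would be DataFrame operations, and `load` would write to Hive or HBase.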

Requirements

  • Master's degree in computer science, software engineering or a closely related field

  • 2 years of experience with Hadoop distribution and ecosystem tools such as Hive, Spark, NiFi and Oozie

  • 2 years of experience developing batch and streaming ETL processes

  • 2 years of experience with relational and NoSQL databases, including modeling and writing complex queries

  • Proficiency in at least one programming language, such as Python or Java

  • Experience with Linux system administration, scripting and basic network skills

  • Excellent communication, analytical and problem-solving skills

Who We Are

We're America's largest mortgage lender, closing loans in all 50 states. J.D. Power ranked Quicken Loans "Highest in Customer Satisfaction in Primary Mortgage Origination" for the past nine consecutive years, 2010 through 2018. The company was also ranked highest in the nation for client satisfaction among mortgage servicers by J.D. Power for five consecutive years, 2014 through 2018, each year the company was eligible. There's a simple reason we've been so successful: We care about the people we work with.

If you're tired of stuffy, bureaucratic workplaces, then you'll be delighted to find something different here. We strive to create a fun, creative and collaborative environment you simply won't find anywhere else. Quicken Loans was named #1 in ESSENCE Magazine's first-ever list of "Best Places to Work for African Americans" in 2015. We've been on Computerworld's "Best Places to Work in IT" list for 13 years running, hitting #1 the last five years. We were also ranked #14 in FORTUNE Magazine's list of "100 Best Companies to Work For" in 2018, remaining in the top 30 for the past 15 years.

The Company is an Equal Employment Opportunity employer, and does not discriminate in any hiring or employment practices. The Company provides reasonable accommodations to qualified individuals with disabilities in accordance with state and federal law. Applicants requiring reasonable accommodation in completing the application and/or participating in the employment application process should notify a representative of the Human Resources Team, The Pulse, at 1-800-411-JOBS.


