PK currently has an exciting opportunity for a Data Engineer to join our Data team and help us improve and maintain our Data Warehouse in Bellevue, WA.
As part of our Data Engineering team, you will:
Provide and maintain infrastructure and tools that can be used to deliver end-to-end solutions for big data and analytical problem sets
Work on many projects at a time, staying focused on the details while finding creative ways to pursue big-picture challenges
Actively mentor team members to effectively collaborate and communicate complex technical concepts to a broad variety of audiences
As lead, you will need to immerse yourself in all aspects of the product, understand the problems, and tie them back to data engineering solutions
Apply an analytical mindset to understand business needs and develop engineering solutions that scale to answer broad problems, balancing complexity and simplicity
Perform administrative/troubleshooting tasks in maintaining Big Data infrastructure
Bring a software engineering mindset: write elegant, maintainable code and follow engineering best practices
Be comfortable outside your comfort zone - exploring new technologies and tackling hard problems that others struggle to solve
Contribute to the core design of data architecture, data models and schemas, and implementation plans
Design data warehouses on platforms such as AWS and Azure
Work collaboratively with the data design and analyst teams to develop and implement requirements for data reporting and exports
4+ years of experience in the data warehouse space.
Broad knowledge of current tools and technologies utilized in data engineering
4+ years of experience with object-oriented programming languages.
Experience with schema design and dimensional data modeling.
Excellent communication skills, particularly translating between technical and non-technical stakeholders
Experience deploying clean solutions in Linux environments
2+ years of container management and development experience using Docker/Kubernetes
BS/BA in a technical field such as Computer Science or Mathematics.
Experience in writing SQL and ETL processes.
Expertise in one or more programming languages (Scala, Python, or Java).
Experience building analytical Data Products.
Working experience designing solutions in the cloud: AWS, Azure, or GCP
Experience working with big data technologies such as Spark, Hadoop, Kafka, Impala, and Kudu
Experience with BI and data visualization tools
In order to provide equal employment and advancement opportunities to all individuals, employment decisions at PK are based exclusively on merit. PK does not discriminate in employment opportunities or practices on the basis of race, color, religion, sex (including gender identity and gender expression), national origin, age, or any other characteristic protected by law.