Age of Learning is a leading education technology innovator based in Glendale, California, with a talented team of 600+ individuals composed of nationally renowned educators, curriculum experts, designers, animators, engineers, and more. We develop engaging, effective digital learning content to help children build a strong academic foundation for lifelong success.
Our flagship product, ABCmouse.com Early Learning Academy, is a comprehensive curriculum and the #1 learning app and website for children ages 2 through 8. More than 18 million children worldwide have completed over 4 billion learning activities on ABCmouse. Other Age of Learning programs include an immersive English language learning product for children in China; ReadingIQ, a world-class digital library with thousands of curated books; and a groundbreaking adaptive learning system that personalizes math instruction for every learner.
We are committed to helping all children succeed. Through our Education Access Initiatives, we make ABCmouse available at no cost to millions of children through schools, libraries, Head Start programs, and community centers, including public housing authorities and after-school programs.
As we expand our global reach and broaden the educational impact of our programs, we're looking for passionate, ambitious, and collaborative leaders to become a part of our growing team.
We are seeking a full-time, in-house Senior Data Engineer to join our development team. This person will help us develop high-performance, high-throughput services using modern technologies and techniques.
Design, develop, test, implement, and support Marketing Technology applications using custom ETL (Extract, Transform, Load) processes or open-source tools such as Talend.
Prepare high-level component architecture, design documents, data flow diagrams, detailed design documents, and data schemas and models, along with test plan documents.
Design, develop, and test highly available and scalable data pipelines and the relevant data storage systems to enable business success across a multi-product portfolio.
Proactively identify operational and systemic issues within the data supply chain (from collection to processing to reporting) and work with the production operations (DevOps) team to implement monitoring solutions.
Ensure testing and validation best practices are followed across the team so that data transformations are accurate and data verification is complete and documented.
Execute in a fast-paced matrix organization across product and engineering teams to identify the best data-driven solutions for the underlying data infrastructure and platform.
Ensure high operational efficiency and quality of your solutions to meet SLA (Service Level Agreement) and support commitment to stakeholders (both internal and external).
Be an active participant in and advocate of agile/scrum practices to ensure team health and drive process improvements for your team.
8+ years of experience designing and building large, scalable data systems, preferably across a multi-product portfolio.
Strong SQL skills with proven ability to write complex data queries across large data sets.
5+ years of software development experience, preferably in agile/scrum/kanban environments across multiple products.
5+ years of experience analyzing and manipulating data across diverse data sources, using languages such as Python or Scala.
5+ years of hands-on experience working in the AWS (Amazon Web Services) cloud environment (EC2, S3, RDS, SageMaker).
Strong exposure to big data technologies, preferably in a containerized environment (Hadoop, Spark, Hive, Presto).
Experience sourcing and modeling data from RESTful APIs (Application Programming Interfaces).
Strong attention to detail with excellent analytical, problem-solving, and communication skills.
Bachelor's degree in Computer Science, Computer Engineering, Information Technology or related fields (or equivalent combination of training and experience).
Exemplary communication skills (both written and oral), with experience producing technical and design documentation of complex processes.
Good time management and the ability to work on concurrent assignments with different priorities; able to work in a fast-paced, iterative development environment with short turnaround times.
Experience with A/B testing and related optimization across desktop and mobile in a digital environment is a plus (examples include Optimizely, Leanplum, deltaDNA).
Experience analyzing and manipulating data across several data formats (JSON, Avro, Parquet, ORC).
Experience building and architecting data warehouse workflows in large cloud-based production environments (Snowflake as an example).
Understanding of columnar data warehouse solutions (Redshift, Vertica as an example).
Experience migrating on-premises data solutions to the cloud with strong data operational hygiene.
Experience developing and maintaining metadata catalog APIs across a variety of data sources (AWS Glue, Metacat as an example).
Prior experience working at educational technology companies and familiarity with the related competitive landscape is an added plus.