Multimodal ML Engineer - Health Sensing Evaluation

Apple Inc., Cupertino, CA 95014

Posted 2 weeks ago

The Health Sensing team builds outstanding technologies to support our users in living their healthiest, happiest lives by providing them with objective, accurate, and timely information about their health and well-being. As part of the larger Sensor SW & Prototyping team, we take a multimodal approach, using a variety of sensors across HW platforms, such as camera, PPG, and natural language.

Key Qualifications

  • Motivation to ensure that powerful AI systems stay under human control in high-stakes applications such as health

  • Expertise in ML/DL fundamentals; experience in multimodal modeling is a plus

  • Knowledge of generative machine learning techniques such as diffusion models, GANs, VAEs, and transformers

  • Proficiency in Python and ML frameworks such as PyTorch and TensorFlow

  • Ability to write performant, clean code; familiarity with standard software development and collaboration practices

  • Ability to independently run and analyze ML experiments to diagnose problems and test changes aimed at real improvements

  • Excellent interpersonal skills; comfortable in collaborative, groundbreaking research environments

Description

In this role, you will be at the forefront of evaluating multimodal and generative models for real-world health and well-being applications, assessing their objective quality and their alignment with human intent and perception, including truthfulness, adaptability, and generalizability. You will build data and evaluation pipelines over both human and synthetic data for model evaluation, leveraging ML techniques such as reinforcement learning from human feedback and adversarial models.

Responsibilities:

Build the back-end systems that generate and load data from a variety of endpoints (e.g., health databases, human annotations, synthetic generations)

Build quality and evaluation pipelines and run model experimentation such as adversarial testing

Build insight and interpretability tools; explore methods to understand and predict failure modes

As a critical part of the core multimodal ML development team, innovate solutions to improve model performance on quality metrics such as robustness and generalizability

Team up with algorithm engineers to build end-to-end pipelines that prioritize rapid iteration in support of the reliability of a complex, multi-year project

Education & Experience

BS and a minimum of 3 years of relevant industry experience

Pay & Benefits

  • At Apple, base pay is one part of our total compensation package and is determined within a range. This provides the opportunity to progress as you grow and develop within a role. The base pay range for this role is between $138,900.00 and $256,500.00, and your base pay will depend on your skills, qualifications, experience, and location.

Apple employees also have the opportunity to become an Apple shareholder through participation in Apple's discretionary employee stock programs. Apple employees are eligible for discretionary restricted stock unit awards, and can purchase Apple stock at a discount by voluntarily participating in Apple's Employee Stock Purchase Plan. You'll also receive benefits including comprehensive medical and dental coverage, retirement benefits, a range of discounted products and free services, and, for formal education related to advancing your career at Apple, reimbursement for certain educational expenses, including tuition. Additionally, this role might be eligible for discretionary bonuses or commission payments as well as relocation.

