
Data Service Developer

Expired Job

First Republic, San Francisco, CA 94118

Posted 4 months ago

Overview

First Republic is an ultra-high-touch bank that provides extraordinary client service. We believe that one-on-one interactions build lasting relationships. We move quickly to serve our clients' needs so that their financial transactions are handled with ease and efficiency. Client trust and security are paramount in our line of business. Ultimately, our goal is unsurpassed client satisfaction, which will lead to personal referrals, our number one source of new business. We recognize that our competitive advantage starts with our people and our culture. At First Republic, we work hard and move quickly as a very coordinated team. If you are looking for an opportunity to grow and contribute in a fun, fast-paced environment, First Republic is the place for you. We have exceptional people focused on providing extraordinary service.

The ETL/Python Developer is responsible for designing and developing ETL code in Informatica or as Python packages/modules, and for creating the related ETL design specifications and framework. The developer coordinates with architects, data analysts, and cross-functional teams on a project basis to effectively communicate design and development activities to a wide range of technical, non-technical, and third-party team members.
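To make the ETL work described above more concrete, the sketch below is a minimal, illustrative Python ETL module built with Pandas and SQLAlchemy; the file, column, table, and connection names are hypothetical and are not taken from this posting.

```python
# Illustrative ETL sketch only: extract a CSV export, apply a simple cleanup,
# and append the result to a staging table. All names are placeholders.
import pandas as pd
from sqlalchemy import create_engine

def extract(path: str) -> pd.DataFrame:
    # Read a raw extract; "as_of_date" is a hypothetical date column.
    return pd.read_csv(path, parse_dates=["as_of_date"])

def transform(df: pd.DataFrame) -> pd.DataFrame:
    # Standardize column names and drop rows missing the account key.
    df = df.rename(columns=str.lower)
    return df.dropna(subset=["account_id"])

def load(df: pd.DataFrame, conn_str: str, table: str) -> None:
    # Append the cleaned rows to a staging table.
    engine = create_engine(conn_str)
    df.to_sql(table, engine, if_exists="append", index=False)

if __name__ == "__main__":
    frame = transform(extract("positions.csv"))
    load(frame, "mssql+pyodbc://dsn_name", "stg_positions")
```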

This position reports to a Wealth Management Technology Manager/Lead and supports the development and implementation of Wealth Management's technology roadmap. This role plays a key part in ensuring that the client experience within Wealth Management maintains the highest standards of the Bank.

The developer works with architects, data analysts, and business analysts to gather requirements, translate them into a solution design, and participate in the full development life cycle, from concept through testing, implementation, and support, using the Agile development methodology. This includes leveraging and improving the established data model and ETL framework, and identifying creative approaches to solving data and application problems.

Responsibilities

  • Design and develop the Data Warehouse/Data Mart; be able to carry a project through its entire lifecycle.

  • Build distributed backend applications in the cloud.

  • Understand repeatable, automated processes for building, testing, documenting, and deploying the application at scale.

  • A desire to work as part of a growing, fast-paced, and highly flexible team; ability to quickly learn new technologies and adapt to a fast-paced development environment.

  • Work closely with Database Administrators and Data Integration (ETL) developers to deliver effective data-driven solutions

  • Work closely with, and incorporate feedback from, product designers and other stakeholders in the company.

  • Establish quality processes to deliver a stable and reliable solution

  • Develop complex SQL and stored procedures (see the sketch following this list)

  • Understand the project proposal and assist the team in analyzing how the new system or functionality can be integrated into the current environment.

  • Ability to identify and resolve any performance and/or data-related issues

  • Provide documentation (Data Mapping, Technical Specifications, Production Support, data dictionaries, test cases, etc.) for all projects
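
As a loose illustration of the SQL and stored-procedure work mentioned in the list above, the snippet below calls a hypothetical SQL Server stored procedure from Python using pyodbc; the DSN, procedure name, and parameter are assumptions, not details from this role.

```python
# Illustrative only: invoke a hypothetical stored procedure that refreshes a
# reporting table for a given business date. DSN and procedure name are placeholders.
import pyodbc

def refresh_positions(business_date: str) -> None:
    conn = pyodbc.connect("DSN=wm_dw;Trusted_Connection=yes")
    try:
        cursor = conn.cursor()
        # ODBC call syntax with a bound parameter keeps the call injection-safe.
        cursor.execute("{CALL dbo.usp_refresh_positions (?)}", business_date)
        conn.commit()
    finally:
        conn.close()

if __name__ == "__main__":
    refresh_positions("2018-11-30")
```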

Qualifications

Experience & Education:

  • 7+ years of experience in an ETL developer role

  • 3+ years of experience with Python, Pandas, and Django

  • 2+ years of experience with cloud implementations

  • Experience with Celery, RESTful APIs, and server-side API integration

  • Strong experience in building data warehouse solutions and Data Modeling.

  • Strong ETL performance-tuning skills and the ability to analyze and optimize production volumes and batch schedules.

  • Strong understanding of both relational and NoSQL databases

  • Experience with batch data processing with SFTP/SSH, Unix/Linux

  • Expertise in operational data stores and real-time data integration

  • Experience with stream-processing services such as Kafka, AWS Kinesis, Apache Storm, and Spark Streaming (see the consumer sketch following this list)

  • Experience with a DevOps stack (CI/CD) and with dependency-management and build tools such as Jenkins, Gradle, Maven, Ant, and Ivy

  • Experience with Development Methodologies, Database Platforms, and Data Modeling tools (ERwin/Model Manager)

  • Experience with big data technologies such as Hadoop, EMR, Amazon Redshift, and Amazon S3

  • Expert-level skill in modeling, managing, scaling, and performance-tuning high-volume transactional databases

  • Bachelor's Degree in Computer Science or Engineering.
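
For the stream-processing experience mentioned above, the sketch below shows a minimal consumer written with the kafka-python client; the broker address, topic name, and message format are assumptions made for illustration only.

```python
# Illustrative only: consume JSON events from a hypothetical Kafka topic and
# append them to a staging file. Broker, topic, and schema are placeholders.
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "trade-events",                          # hypothetical topic name
    bootstrap_servers=["localhost:9092"],
    value_deserializer=lambda m: json.loads(m.decode("utf-8")),
    auto_offset_reset="earliest",
    enable_auto_commit=True,
)

with open("trade_events.jsonl", "a") as out:
    for message in consumer:
        # Each value is assumed to be a JSON-encoded trade record.
        out.write(json.dumps(message.value) + "\n")
```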

Technical Skills:

  • Experience with the Python language and with cloud implementations such as AWS/GCP

  • Knowledge of cloud technologies

  • Proficiency with Data Modeling tools such as Erwin/ER Studio.

  • Experience with ETL tools such as Informatica and SSIS

  • Proficiency in master data management (MDM) projects and solutions.

  • Proficiency with high volume OLTP Databases and large data warehouse environments.

  • Experience and knowledge of optimizing database performance and capacity utilization to provide high availability and redundancy

  • Understanding of Agile and its implementation for Data Warehouse Development

  • Strong experience with SQL Server Management Studio, Advanced T-SQL and SQL Server Databases

Professional Skills/Competency:

  • Capital Markets knowledge and experience is highly desired

  • Understanding of traditional and alternative asset class investment data, including but not limited to equity, fixed income, private equity, real estate, derivatives, mutual funds, ETFs, global assets, foreign exchange, etc.

  • Focus on development/improvement of the framework to support repeatable and scalable solutions

Personal Skills/Competency:

  • Consistently demonstrates and follows high standards of integrity in business decision-making

  • Looks toward the broadest possible view of an issue/challenge; can easily pose future scenarios; can think globally about all aspects of the Bank; can discuss multiple considerations of an issue and forecast them into the future; understands how the Bank works, competes, serves clients, and generates shareholder value

  • Demonstrates excellent communication and interpersonal skills; able to communicate clearly and concisely in a variety of settings and styles; is effective in a variety of formal presentation and meeting settings

  • Gains support for change by providing context and responding with sensitivity to concerns; takes initiative to recommend/develop innovative approaches to getting things done

  • Can quickly find common ground and solve problems for the good of all; is a team player and encourages collaboration

Mental/Physical Requirements:

  • The ability to learn and comprehend basic instructions; understand the meanings of words and respond effectively; and perform basic arithmetic accurately and quickly.

  • Vision must be sufficient to read data reports, manuals and computer screens.

  • Hearing must be sufficient to understand a conversation at a normal volume, including telephone calls and in person.

  • Speech must be coherent to clearly convey or exchange information, including the giving and receiving of assignments and/or directions.

  • Position involves sitting most of the time, but may involve walking or standing for brief periods of time.

  • Must be able to travel in a limited capacity.
