Job Title: Systems Engineer IT Level 3
Duration: 9 to 12 Month Contract Position
Fort Worth, TX
Work Status: Must Be A United States Citizen
Work Hours: First Shift / Daytime Hours
High School Diploma or GED is a must.
Pay: Depends on experience; range is $35.00 to $47.34 per hour.
This position will be responsible for the administration of Cloudera Hadoop clusters, including performance, installation, design, configuration management, integrity, and security. Typical responsibilities consist of, but are not limited to:
- Designing and architecting big data solutions
- Commissioning and installing new applications and COTS products
- Monitoring performance and managing parameters
- Configuration management
- Controlling access permissions and privileges
- Ensuring that storage, archiving, back-up, and recovery procedures are functioning correctly
- Developing, managing, and testing back-up and recovery plans
- Collaborating with IT project managers, database engineers, and application programmers
- Communicating regularly with technical, applications, and operational staff
Skills / Experience & Education:
Mandatory:
- Knowledge of applying big data technologies at scale in bare metal and cloud infrastructure
- Knowledge of designing big data architectures
- Knowledge of Linux or Unix system administration
- Knowledge of DevOps processes and technologies
- Knowledge of Hadoop setup, configuration, benchmarking, and management of a multi-node cluster
- Knowledge of complex server-based architectures
Desired:
- Knowledge of Cloudera Manager
- Knowledge of installing CDH on servers
- Knowledge of Hadoop technologies such as Pig, Hive, and HBase
- Knowledge of Kerberos and securing Hadoop clusters
- Knowledge of systems monitoring tools to tune, configure, and administer Hadoop clusters
- Knowledge of automation and configuration management platforms such as Ansible, Salt, Puppet, or Chef
- Knowledge of cloud platforms such as AWS, OpenStack, Azure, and/or GCE, along with cloud storage technologies such as S3, Swift, and/or Ceph
- Knowledge of Ruby, Python, Java, shell scripting, Spark, and/or Kafka
- Knowledge of the software development life cycle
- Demonstrated customer service and interpersonal skills
- Technical and analytical problem-solving skills
- Effective written and verbal communication skills
- An interest in, and capacity to learn, new skill sets
A minimum of 3 years of Hadoop experience is required; candidates with less than 3 years of Hadoop experience may not be considered for the position.
Keywords: Engineering, Unix or Linux, Hadoop
9 to 12 Months
Butler America Aerospace, LLC