Sr Project Administrator

GM Financial, Arlington, TX 76004

Posted 2 months ago


The Senior Project Administrator will maintain the Enterprise Project Portfolio Management (PPM) System, providing system administration, training, support, analytics, and reporting for users. This role also maintains documentation, performs process audits to ensure compliance with project controls, and configures workflows. The Senior Project Administrator will develop and monitor routine, ad hoc, and executive-level reports on project activities. This person works closely with Project Managers, Business Executives, and other company employees.



Job Duties

  • Maintain, configure, and support Enterprise Project Management (EPM) software technologies, including the Enterprise PPM Tool and SharePoint project sites

  • Perform process audits of project deliverables such as documentation and resource hours

  • Support project and executive reporting needs of the PMO, developing multi-level presentations as needed

  • Participate in PPM upgrades and enhancements, and work with the vendor to resolve defects

  • Perform other duties as assigned

  • Comply with all corporate policies and procedures



Knowledge

  • Advanced knowledge of the software development process and the industry standards that support it

  • Demonstrated success in project management

  • Knowledge of infrastructure systems

  • Support-level knowledge of a PPM Tool and SharePoint Services

  • Understanding of technology infrastructure, security concepts and platforms


Skills

  • Ability to analyze workflow business processes, see the big picture, and understand how they interface with the company's infrastructure

  • Ability to make decisions

  • Computer proficiency in MS Office

  • Detail-oriented

  • Excellent written and verbal communication skills

  • Interpersonal skills necessary to work well with others in teams and collaborative work situations

  • Organization and prioritization abilities

  • Strong problem solving and multi-tasking skills

Additional Knowledge, Skills, and Abilities

  • Knowledge of an enterprise-level PPM tool

  • Knowledge of reporting and analytics tools (Microsoft Power BI, Microsoft SQL Server Report Builder, etc.)


Education

  • Bachelor's degree in any discipline required


Experience

  • 2-4 years of experience supporting enterprise applications required

  • 3-5 years of experience supporting an enterprise PPM system preferred

Working Conditions

  • Normal office environment

  • Flexible schedule with the possibility of working long hours

  • Limited travel may be required to support business needs
