Hadoop Developer (Closed)

SkillStorm is seeking a Hadoop Developer for our client in Charlotte, NC. Candidates must be able to work on SkillStorm's W2; not a C2C position. EOE, including disability/vets.

Job Description:

  • Responsible for developing complex requirements and for enhancing, modifying, and/or maintaining applications in the Financial Crimes technology space to accomplish business goals. Software developers design, code, test, debug, and document programs, and support ongoing application maintenance. Works closely with business partners to define requirements for system applications. Typically requires 10+ years of applicable experience. Uses in-depth knowledge of business requirements, business environments, and technological alternatives to recommend innovations that enhance the organization and/or provide a competitive advantage. Responsible for providing insight and direction from a data perspective, assessing the impact on technology systems, and participating in the full development lifecycle of the capability being delivered.
  • Contribute to story refinement and defining requirements. Estimate the work needed to realize a story/requirement through the delivery lifecycle. Perform proofs of concept as necessary to mitigate risk or implement new ideas. Set up and automate the continuous integration/continuous delivery (CI/CD) pipeline.
  • Working closely with Production Support teams, Platform teams, and business partners, this person handles technical aspects of the application, including change management, maintenance, platform upgrades, and changes to requirements from both upstream and downstream interfacing applications. Demonstrated data sourcing, data analysis, and modeling skills, with the ability to build innovative data provisioning models to support large-scale financial crimes data sourcing initiatives. Works with a team of data analysts and developers to ensure best practices and governance are followed. Promotes and applies best practices and standards at the project and program level.
  • Partners with the business to develop plans, including ongoing success measures, to sustain change. Accountable for analyzing the present state, developing alternative future-state approaches, and facilitating implementation. Communicates effectively with managers, peers, and business partners on deliverables and timelines. Responsible for following Agile practices while ensuring all Enterprise Change Standards are met.

Required Skills:

  • Expertise and demonstrated experience with Hadoop ecosystem components: Hive, HDFS, Spark
  • Strong grasp of and experience in PySpark (2.4 or higher) development
  • Skilled in Bigtable-style table design, with HBase experience preferred
  • Strong knowledge of Impala, including demonstrated experience in optimized view design
  • Knowledge of and implementation experience with REST APIs
  • Strong SQL knowledge and data analysis skills
  • Familiarity and experience with Autosys or other job scheduling tools
  • Experience in Unix shell scripting
  • Strong working experience in Agile methodology
  • Good experience with the SDLC, Agile, continuous integration/continuous delivery (CI/CD), and change management tools - Jira, Bitbucket, Jenkins, Artifactory, Ansible
  • BS/MS in Computer Science, Engineering, or another quantitative discipline from a reputable university

Desired Skills:

  • Scala experience is preferred, but not required
  • Familiarity with the Python or Java programming language strongly preferred
  • Familiarity and working experience with Hive 3.0 (CDP platform) preferred
  • Banking domain experience (especially in compliance and risk)

