We are storming the marketplace with the highly skilled, experienced, and certified professionals that businesses need.

Find your perfect job.

Sr. Hadoop Developer (Closed)

A Hadoop developer is responsible for the design, development, and operation of systems that store and manage large amounts of data.

The Senior ETL Hadoop developer will be responsible for the development, programming, and coding of ETL jobs for the Enterprise Data Services - Data Integration team. IT Developers are responsible for documenting detailed system specifications, participating in unit testing and maintenance of planned and unplanned internally developed applications, and evaluating and performance-testing purchased products. IT Developers are responsible for including IT controls to protect the confidentiality, integrity, and availability of the application and the data it processes or outputs. IT Developers are assigned to moderately complex development projects.

Essential Functions:
• Write code for moderately complex system designs. Write programs that span platforms. Code and/or create ETL jobs that process terabytes of data.
• Write code for enhancing existing programs or developing new programs.
• Provide production support for existing ETL jobs.
• Provide input to and drive programming standards.
• Write detailed technical specifications for subsystems. Identify integration points.
• Report missing elements found in system and functional requirements and explain impacts on subsystem to team members.
• Consult with other IT Developers, Business Analysts, Systems Analysts, Project Managers and vendors.
• “Scope” time, resources, etc., required to complete programming projects. Seek review from other IT Developers, Business Analysts, Systems Analysts or Project Managers on estimates.
• Perform unit testing and debugging. Set test conditions based upon code specifications. May need assistance from other IT Developers and team members to debug more complex errors.
• Support the transition of the application throughout the product development life cycle. Document what has to be migrated. May require more coordination points for subsystems.
• Research vendor products and alternatives. Conduct vendor product gap analyses and comparisons.
• Be accountable for including IT controls and following standard corporate practices to protect the confidentiality, integrity, and availability of the application and the data it processes or outputs.
• The essential functions listed represent the major duties of this role; additional duties may be assigned.

Job Requirements:
• Experience with and understanding of unit testing, release procedures, coding design and documentation protocols, and change management procedures
• Proficiency with the Hadoop ecosystem, Kafka, Spark Streaming, Scala, HBase, and Hive
• Knowledge of Postgres, MongoDB, and other SQL and NoSQL databases
• Demonstrated organizational, analytical and interpersonal skills. Thorough knowledge of Information Technology fields and computer systems
• Flexible team player
• Ability to manage tasks independently and take ownership of responsibilities
• Must demonstrate initiative and effective independent decision-making skills
• Ability to communicate technical information clearly and articulately
• Ability to adapt to a rapidly changing environment
• In-depth understanding of the systems development life cycle
• Proficiency programming in more than one object-oriented programming language
• Proficiency using standard desktop applications such as the Microsoft Office suite and flowcharting tools such as Visio
• Proficiency using debugging tools
• Domain knowledge of the health insurance industry preferred

Specific Tools/Languages Required:
Hadoop ecosystem
Spark, Hive

Required Skills:
• Hadoop ecosystem with Spark/Scala
• Kafka, Spark Streaming (batch and real-time processing)
• Hive, HBase
• Sqoop, NiFi
• MongoDB, Postgres
• Shell scripting

Preferred Skills:
• Java, JavaScript
• SQL

Experience:
5-8 years of related work experience, or an equivalent combination of transferable experience and education
IT development/programming/coding experience within a Hadoop environment

Experience with Agile Methodology

Required Education:
Bachelor's degree in an IT-related field, or equivalent relevant work experience
