Hadoop Developer (Closed)
SkillStorm is seeking a Hadoop Developer for our client in Newark, DE. Candidates must be able to work on SkillStorm's W2; this is not a C2C position. EOE, including disability/vets.
Job Description:
- Sound understanding of and experience with the Hadoop ecosystem (Cloudera). Able to explore the constantly evolving tools within the Hadoop ecosystem and apply them appropriately to the problems at hand.
- Experience working with a Big Data implementation in a production environment
- Experience in Spark, MapReduce, Hive, Impala, Scala, Kafka, Flume, and Linux/Unix technologies is mandatory
- Experience in Unix shell scripting is mandatory
- Experience in Python/Perl is mandatory, with the ability to analyze and debug existing code
- Experience in the banking domain is mandatory
- Sound knowledge of relational databases (SQL) and experience with large SQL-based systems.
- Strong IT consulting experience in various data warehousing engagements, handling large data volumes and architecting big data environments.
- Deep understanding of algorithms, data structures, performance optimization techniques and software development in a team environment.
- Benchmark and debug critical issues with algorithms and software as they arise.
- Lead and assist with the technical design/architecture and implementation of the big data cluster in various environments.
- Able to guide/mentor the development team, for example to create custom common utilities/libraries that can be reused across multiple big data development efforts.
- Exposure to ETL tools (e.g., DataStage) and NoSQL databases (HBase, Cassandra, MongoDB)
- Work with line of business (LOB) personnel, external vendors, and internal Data Services team to develop system specifications in compliance with corporate standards for architecture adherence and performance guidelines.
- Provide technical resources to assist in the design, testing, and implementation of software and infrastructure supporting data infrastructure and governance activities.
- Support multiple projects with competing deadlines
Qualifications & Experience:
- Bachelor’s degree in Science or Engineering
- 10+ years of industry experience.
- Minimum of 3 years of Big Data experience
- Experience developing real-time streaming applications using Flume and Kafka
- Experience working with different file formats, including Parquet and Avro
- Benchmark systems, analyze system bottlenecks, and propose solutions to eliminate them
- Develop a highly scalable and extensible Big Data platform that enables the collection, storage, modeling, and analysis of massive data sets from numerous channels
- Continuously evaluate new technologies, innovate, and deliver solutions for business-critical applications