Big Data / Hadoop Developer (Closed)
SkillStorm is seeking a Big Data / Hadoop Developer for our client in Charlotte, NC. Candidates must be able to work on SkillStorm's W2; this is not a C2C position. EOE, including disability/vets.
Job Description:
- The Enterprise Data Platforms and Application Services OMNI initiative is looking for a Hadoop/CDP Engineer who will be responsible for development of the OMNI Big Data Platform. The ideal candidate has hands-on experience developing in Hadoop and Cloudera data environments. The candidate will work with agile teams and be responsible for all aspects of the SDLC.
- Strong knowledge of data structures and big data modeling.
- Experience with Hadoop/HDFS/Spark concepts and the ability to write Spark Dataset, DataFrame, and HiveQL jobs.
- Proven understanding of Hadoop, Spark, and Hive, and the ability to write shell scripts.
- Familiarity with data-loading tools such as Sqoop, Flume, and Kafka; knowledge of workflow schedulers such as Oozie.
- Good aptitude for multi-threading and concurrency concepts; experience loading data from disparate data sources.
- Certifications such as Cloudera Certified Developer (CCA175) or Spark and Hive/Administrator certifications are an added advantage.
- Hands-on experience with NoSQL databases. Ability to analyze and identify issues with an existing cluster and suggest architectural design changes.
- Ability to implement data management and governance on the Hadoop platform.
Responsibilities:
- Experience in Apache Hadoop/Spark development
- Well versed in Linux environments
- Extensive experience in application development
- Excellent analytical skills, including business process flows, design, and diagramming
- Strong collaboration and teamwork skills
- Proven history of delivering against agreed objectives
- Demonstrated problem solving skills
- Ability to pick up new concepts and apply them
- Ability to coordinate competing priorities
- Ability to work in diverse team environments, both local and remote
- Strong communication skills (verbal and written)
- Ability to work with minimal supervision
Primary Skill:
- Hadoop
Required Skills:
- Bachelor's degree from an accredited college or university, with 5+ years of hands-on coding experience in the following Hadoop ecosystem components:
- Hadoop
- Spark
- Spark-SQL
- Hive on Tez, Hive 3.x
- Impala
- Oozie
- Job scheduler (e.g., Autosys)
- Shell scripting
Desired Skills:
- Java, Python, Scala, PySpark
- Kafka
- HBase
- RESTful services
- Machine learning/predictive analytics
- Spark Streaming