We are storming the marketplace with the highly skilled, experienced, and certified professionals that businesses need.

Find your perfect job.

Data Engineer (Closed)

SkillStorm is seeking a Big Data Developer for our client in Charlotte, NC. Candidates must be able to work on SkillStorm's W2; not a C2C position. EOE, including disability/vets.

Job Description:

  • Our organization is modernizing our HR data & analytics technology services, inclusive of self-serve & API-driven access to Big Data. The Sr. Data Engineer - Data Lake will fill a lead role on the Bank's Global Human Resources Technology, Data Lake Development Team. We are looking for a well-rounded data engineer who "gets it" and sets the example for effective collaboration in a team environment. The candidate must demonstrate excellent communication skills and critical thinking to ensure all assumptions, constraints, and behaviors are well thought through. The individual ensures that the system design and requirements are aligned to achieve the desired business outcomes, and that team practices and coding/quality principles are aligned to achieve the desired technology outcomes. This individual will develop complex solutions in a Data Lake environment, involving technical design, development, and testing.
  • The desired individual will have demonstrated experience in standing up Big Data operational & analytics platforms and translating business requirements into scalable & sustainable technical solutions. Qualified candidates must be well-versed in data warehousing and 'Big Data' distributed processing and storage technologies, as well as Data Lake design patterns. In addition, the selected candidate should have knowledge of and experience with data management techniques such as metadata management, Data Quality (DQ) management, Data Governance, Data Integration/Ingestion, Data Architecture, and Data Profiling.

Responsibilities Include:

  • Develop and script in Python as well as SQL in Linux environments.
  • Integrate data with Sqoop and ingest files of multiple record types in various data formats (Parquet, Avro, and JSON).
  • Create and maintain optimal data pipeline architecture in Cloudera CDH or a similar platform, with application development skills in Hive, Sqoop, and PySpark.
  • Participate in status meetings to track progress, resolve issues, articulate & mitigate risks and escalate concerns in a timely manner.
  • Understand and evangelize great design, engineering, & organizational practices (unit testing, release procedures, coding design, and documentation protocols) that meet or exceed the Bank's change management procedures.
  • Set the bar for team communications and interactions; be an excellent teammate to peers, influencing them in a positive direction.
  • Use versioning tools such as Git/Bitbucket.
  • Set up jobs using Autosys for automation.
  • The resource is required to be on the assignment for the hours specified in the job description; forty hours per week is the expectation in the absence of specified hours. Attendance and punctuality are essential components of the position; therefore, unexcused absences and tardiness could result in dismissal.
  • 10+ years of experience.
  • Translate the requirements created by functional analysts into the architecture for the solution and describe it through a set of architecture and design documents, which the rest of the development team then uses to implement the solution. Defining the architecture often involves selecting the most appropriate technology for the problem being solved, assessing impact, and evaluating technical and operational feasibility.
  • Develop the high-level strategy to solve business problems and ensure solutions adhere to enterprise standards. May be aligned to an application or a technology stack.

#LI-DNI