
Scala Developer (Closed)

SkillStorm is seeking a Scala / Big Data Developer for our client in New York, NY. Candidates must be able to work on SkillStorm's W2; this is not a C2C position. EOE, including disability/vets.

Job Description:

  • Financial regulation for banks has increased dramatically over the past few years. This role is to work on the Global Banking and Markets (GBAM) Non-Financial Regulatory Reporting (NFRR) Data Delivery program, a new initiative to define and implement consistent and efficient regulatory reporting processes that adhere to enterprise standards, simplify controls and enable re-use.
  • As part of this initiative, we are developing a Data Processing Framework using Scala and Big Data technologies for authoring domain-specific language (DSL) components to seamlessly combine data from multiple data sources. The DSL components interpret transformation rules written in a configuration-style syntax that can be applied to one or more standard data sets to produce a transformed data set (see the illustrative sketch after this list). This framework provides a unified language for describing the data needs of a report.
  • Enables users to seamlessly retrieve and combine data from multiple sources
  • Enables users to author reports without having to worry about the mechanics of actually retrieving, filtering, projecting, or aggregating the data
  • Ensures proper versioning of the report definitions and the running of the reports as they existed at specific points in time
  • Ensures that the system is scalable enough to run hundreds of reports in parallel, if required
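
The posting does not show the framework's actual DSL, so the sketch below is purely illustrative: it shows, in plain Scala, one way configuration-style transformation rules (filter, project, aggregate) could be interpreted against a standard data set to produce a transformed data set. All names (Rule, Filter, Project, Aggregate, applyRules) and the sample data are hypothetical and not taken from the client's codebase.

```scala
// Hypothetical sketch only: illustrates the idea of interpreting
// configuration-style transformation rules against a data set.
object ReportDslSketch {

  type Row = Map[String, Any]

  // Transformation rules expressed as data, mimicking a configuration-style syntax.
  sealed trait Rule
  final case class Filter(column: String, expected: Any)         extends Rule
  final case class Project(columns: Seq[String])                 extends Rule
  final case class Aggregate(groupBy: String, sumColumn: String) extends Rule

  // Interpret each rule in order against a standard data set.
  def applyRules(rows: Seq[Row], rules: Seq[Rule]): Seq[Row] =
    rules.foldLeft(rows) {
      case (acc, Filter(col, expected)) =>
        acc.filter(_.get(col).contains(expected))
      case (acc, Project(cols)) =>
        acc.map(row => row.filter { case (k, _) => cols.contains(k) })
      case (acc, Aggregate(groupCol, sumCol)) =>
        acc.groupBy(_(groupCol)).map { case (key, group) =>
          val total = group.map(_(sumCol).toString.toDouble).sum
          Map(groupCol -> key, sumCol -> total): Row
        }.toSeq
    }

  def main(args: Array[String]): Unit = {
    val trades: Seq[Row] = Seq(
      Map("desk" -> "FX",    "region" -> "EMEA", "notional" -> 100.0),
      Map("desk" -> "FX",    "region" -> "AMRS", "notional" -> 250.0),
      Map("desk" -> "Rates", "region" -> "EMEA", "notional" -> 75.0)
    )

    // A "report definition": filter, then project, then aggregate.
    val reportRules = Seq(
      Filter("region", "EMEA"),
      Project(Seq("desk", "notional")),
      Aggregate("desk", "notional")
    )

    applyRules(trades, reportRules).foreach(println)
  }
}
```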

Role:

  • We are looking for a Scala developer with Apache Hadoop/Spark/Flink experience to assist with the development of the regulatory reporting processing engine using Scala and Big Data technologies to provision regulatory reporting data via domain-specific language (DSL) scripts. The role requires close partnership with the NFRR Program analysis and development teams.

Responsibilities:

  • The candidate will work directly with the Director of the Data Processing Engine and will participate in all phases of development of the platform and subsequent reports.
  • Develop technical specifications and component-level designs for DSL Extensions for input and output
  • Design parameterized and configurable modules for DSL Extensions
  • Develop DSL for the data outputs needed for NFRR Reports
  • Integrate Code Repository and Management

Qualifications:

  • The candidate must be a self-starter, able to work in a fast-paced, results-driven environment with minimal oversight. The candidate is required to have excellent communication skills and possess a strong sense of accountability and responsibility.
  • 5+ years of development experience
  • Good general Scala programming skills
  • Experience with Hadoop Distributed File System (HDFS), HBase and Hive
  • Experience with custom aggregations within Spark/Flink, preferred (see the sketch after this list)
  • Experience with databases, a plus
  • Experience on regulatory or reporting projects, preferred
  • Ability to perform detailed and complex data analysis
  • Attention to detail and ability to work independently
  • Ability to handle tight deadlines and competing demands in a fast-paced environment
  • Knowledge of Global Banking and Markets’ products/asset classes and associated data including fixed income, equities, derivatives, and foreign exchange securities, preferred
  • 5-7 years of experience as a Scala developer with Apache Hadoop/Spark/Flink
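
As an illustration of the "custom aggregations within Spark" mentioned above, the sketch below shows a typed custom Aggregator in Spark SQL. The Trade case class, the AvgNotional aggregator, and the column names are hypothetical examples and not part of the client's actual codebase.

```scala
// Hypothetical example of a custom aggregation using Spark's typed Aggregator API.
import org.apache.spark.sql.{Encoder, Encoders, SparkSession}
import org.apache.spark.sql.expressions.Aggregator

final case class Trade(desk: String, notional: Double)

// Tracks (running sum, row count) and finishes with the average notional per group.
object AvgNotional extends Aggregator[Trade, (Double, Long), Double] {
  def zero: (Double, Long) = (0.0, 0L)
  def reduce(buf: (Double, Long), t: Trade): (Double, Long) =
    (buf._1 + t.notional, buf._2 + 1)
  def merge(a: (Double, Long), b: (Double, Long)): (Double, Long) =
    (a._1 + b._1, a._2 + b._2)
  def finish(buf: (Double, Long)): Double =
    if (buf._2 == 0) 0.0 else buf._1 / buf._2
  def bufferEncoder: Encoder[(Double, Long)] =
    Encoders.tuple(Encoders.scalaDouble, Encoders.scalaLong)
  def outputEncoder: Encoder[Double] = Encoders.scalaDouble
}

object CustomAggregationExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().master("local[*]").appName("agg-sketch").getOrCreate()
    import spark.implicits._

    val trades = Seq(Trade("FX", 100.0), Trade("FX", 250.0), Trade("Rates", 75.0)).toDS()

    // Apply the custom aggregator per desk.
    trades.groupByKey(_.desk)
      .agg(AvgNotional.toColumn.name("avg_notional"))
      .show()

    spark.stop()
  }
}
```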

