
Data / ETL Developer (Closed)

SkillStorm is seeking a Data/ETL Developer for our client in Charlotte, NC. Candidates must be able to work on SkillStorm's W2; this is not a C2C position. EOE, including disability/vets.

Job Description:

  • Corporate Investments Data Warehouse (CIDW) and Transaction Hub (THUB) are data warehouses supported by the Data Horizontal team within Risk & Finance Technology. CIDW is both a general-purpose LOB data store and calculation engine supporting the needs of Corporate Treasury (Corporate Investments, Global Funding, Finance, etc.) and the Enterprise Authorized Data Source (ADS) for Cash & Cash Equivalents, Intercompany Loans, and Long-Term Debt. THUB is a data store of Intrader (a third-party-hosted SOR) position and transaction data.
  • The teams consist of a scrum master, developers, data quality analysts, and data/business analysts who support front-office, middle-office, market-risk, and finance users in collecting, transforming, loading, and reporting end-of-day and intraday fixed income and derivative trading positions and other financial data.
  • The role is for a Data Developer on one of the CIDW agile teams. As a Data/ETL Developer, you will help the team craft data solutions that meet business and enterprise requirements using Informatica, Oracle Exadata, and emerging data technologies.
  • While our core stack is Informatica / Oracle, we are exploring new methods of moving data. Candidates experienced with big data technologies such as Hadoop, Kafka, Spark, Hive, or NiFi, and with Python and/or Java, are strongly encouraged to apply.

Primary Skill:

  • Oracle ADF

Required Skills:

  • 5+ years of development experience with Oracle, SQL Server, Netezza, or another industry-accepted database platform.
  • 3+ years designing, developing, testing, and debugging with Informatica PowerCenter 9.x or above.
  • 2+ years in Data Warehouse / Data Mart / Business Intelligence delivery.
  • 2+ years of Linux / shell scripting complementary to Informatica.
  • Experience in job scheduling using AutoSys or another industry-accepted scheduling tool.
  • 3+ years in data analysis.
  • Experience creating low-level and high-level design artifacts.
  • Proven experience designing and building integrations that support standard data warehousing models: star schema, snowflake, and various normalization forms.
  • Ability to present technical concepts to senior level business stakeholders.
  • Ability to provide end-to-end technical guidance on the software development life cycle (requirements through implementation).
  • Analytical and problem-solving skills.
  • Excellent verbal and written communication skills.
  • Self-motivated.
  • Excellent interpersonal skills, positive attitude, team player.
  • Willingness to learn and adapt to change.
  • Experience working in a global technology development model.
  • Ability to manage multiple deadline-driven, customer-sensitive projects and tasks effectively.
  • Strong time management and task prioritization.
  • Bachelor’s degree.
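To illustrate the star-schema modeling called out above, here is a minimal sketch using SQLite. All table, column, and instrument names (dim_security, fact_position, UST10Y, etc.) are hypothetical examples, not taken from CIDW:

```python
import sqlite3

# Hypothetical star schema: one fact table keyed to two dimension tables.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_security (security_id INTEGER PRIMARY KEY, ticker TEXT, asset_class TEXT);
CREATE TABLE dim_date     (date_id INTEGER PRIMARY KEY, calendar_date TEXT);
CREATE TABLE fact_position (
    security_id  INTEGER REFERENCES dim_security(security_id),
    date_id      INTEGER REFERENCES dim_date(date_id),
    market_value REAL
);
""")
cur.executemany("INSERT INTO dim_security VALUES (?, ?, ?)",
                [(1, "UST10Y", "Fixed Income"), (2, "IRS5Y", "Derivative")])
cur.execute("INSERT INTO dim_date VALUES (1, '2024-01-02')")
cur.executemany("INSERT INTO fact_position VALUES (?, ?, ?)",
                [(1, 1, 1_000_000.0), (2, 1, 250_000.0), (1, 1, 500_000.0)])

# A typical star-schema query: join the fact table to its dimensions,
# then aggregate end-of-day positions by asset class.
cur.execute("""
SELECT s.asset_class, SUM(f.market_value)
FROM fact_position f
JOIN dim_security s ON s.security_id = f.security_id
JOIN dim_date d     ON d.date_id = f.date_id
WHERE d.calendar_date = '2024-01-02'
GROUP BY s.asset_class
ORDER BY s.asset_class
""")
print(cur.fetchall())  # [('Derivative', 250000.0), ('Fixed Income', 1500000.0)]
```

The same shape applies regardless of platform: dimensions carry descriptive attributes, the fact table carries measures, and reporting queries join and aggregate across them.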

Desired Skills:

  • Experience with Big Data and/or Emerging data technology tools and methodologies.
  • Experience with Python (e.g., pandas DataFrames) and its use in data processing solutions.
  • Experience with Java and its use in implementing web services.
  • Experience with Kafka, Sqoop, Spark, and NiFi.
  • Experience with data wrangling tools such as Alteryx and/or Trifacta.
  • Experience with data visualization tools such as Tableau and/or MicroStrategy.
  • Banking / Capital Markets / Accounting domain knowledge.
  • Knowledge of agile methodology and frameworks like Scrum, Kanban, etc.
  • Experience working in a SAFe Agile delivery model.
  • Advanced degree.
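As a sketch of the kind of pandas-based data processing mentioned above (assuming pandas is installed; the feed, desk, and column names are invented for illustration):

```python
import pandas as pd

# Hypothetical end-of-day positions feed with duplicate rows per position.
raw = pd.DataFrame({
    "desk":     ["Treasury", "Treasury", "Funding"],
    "ticker":   ["UST10Y", "UST10Y", "CP30D"],
    "notional": [1_000_000, 500_000, 750_000],
})

# A typical transform step: collapse duplicate position rows into one
# aggregated row per (desk, ticker) before loading a warehouse table.
eod = (raw.groupby(["desk", "ticker"], as_index=False)["notional"]
          .sum()
          .sort_values("desk", ignore_index=True))
print(eod)
```

This is the DataFrame-centric style of ETL that complements (rather than replaces) Informatica mappings in a mixed stack.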

