Want to join an elite group that accelerates innovation? Start here.

Tell SkillStorm about yourself and what you want to achieve in your career. Then we’ll let you know if your next step is becoming a tried-and-true Stormer.

Accelerate Your Career

Whether you’re a college graduate, military veteran, or an experienced pro, we can train you, upskill you, and put you to work.

We are storming the marketplace with the highly skilled, experienced, and certified professionals that businesses need.

Find your perfect job.

Application Architect V

SkillStorm is seeking a Data Architect for our client in Charlotte, North Carolina. Candidates must be able to work on SkillStorm's W2; this is not a C2C position. EOE, including disability/vets.

Job Description:

  • We are looking for a Big Data Platform Architect to be part of the development of a next-generation enterprise big data platform that supports every line of business for our client. Candidates must be well-versed in building Big Data analytical platforms using containerization, cloud technologies, and Big Data parallel processing technologies, and must understand the advantages, disadvantages, and trade-offs of the various architecture approaches. The candidate will be responsible for developing architecture and deployment plans for Big Data implementations. As a Big Data Platform Architect, the candidate should have sound knowledge of and architecture experience with Big Data tools, Data Science tools and libraries, Machine Learning, Streaming Data, and Enterprise Data Warehousing. The candidate should be able to help accelerate our customer's journey to Private Cloud by moving and improving existing Hadoop installations and by modernizing their data lakes in line with emerging and proven industry trends. Must be familiar with agile methods and related SDLC tools.

Responsibilities:

  • Design and develop Big Data architecture patterns on on-premises and Private Cloud platforms
  • Work on new product evaluation and certification, defining standards for tool fit within the platform
  • Develop technical architecture for enabling Big Data services using industry best practices for large scale processing
  • Design, build, and automate Big Data solutions centered around the Kubernetes container orchestration platform
  • Stand up architecture review, operating model, routines, and evaluation criteria for Big Data container platform adoption by applications
  • Maintain in-depth knowledge of the organization's technologies and architectures.
  • Ensure the reference architecture is optimized for larger workloads and recommend tuning techniques
  • Develop standards and methodologies for benchmarking, performance, evaluation, testing, data security and data privacy
  • Communicate architectural decisions, plans, goals and strategies
  • Participate in regular scrum calls to track progress, resolve issues, mitigate risks and escalate concerns in a timely manner
  • Contribute to the development, review, and maintenance of requirements documents, technical design documents and functional specifications

Required Skill Set:

  • Experience in Big Data/Analytics/Data science tools and a good understanding of the leading products in the industry are required along with passion, curiosity, and technical depth
  • Thorough understanding of and working experience with Cloudera/Hortonworks Hadoop distributions
  • Solid functional understanding of the Big Data Technologies, Streaming and NoSQL databases
  • Experience working with the Big Data ecosystem, including tools such as YARN, Impala, Hive, Flume, HBase, Sqoop, Apache Spark, Apache Storm, Crunch, Oozie, and Pig; languages such as Java, Scala, and Python; and Kerberos/Active Directory/LDAP
  • Experience in solving Streaming use cases using Spark, Kafka, NiFi
  • Thorough understanding, strong technical/architecture insight, and working experience in Docker and Kubernetes
  • Containerization experience with the Big Data stack using OpenShift/Azure
  • Exposure to Cloud computing and Object Storage services/platforms
  • Experience with Big Data deployment architecture, configuration management, monitoring, debugging and security
  • Experience in performing Cluster Sizing exercise based on capacity requirements
  • Ability to build strong partnerships with internal teams and vendors to resolve product gaps/issues, and to escalate to management in a timely manner
  • Good exposure to CI/CD tools, application hosting, and containerization concepts
  • Excellent verbal and written communication skills; strong team and interpersonal skills
  • Strong analytical and problem-solving skills; must be a self-starter
  • Proficient with MS Visio

Years of Experience Required:

  • 5-7, 7-10, or 10+ years

Top Required IT / Technical Skillsets:

  • Hands on architect with coding experience, Kubernetes, SME in one of the following: Kafka/Spark/NiFi/Ranger

Similar Jobs

Application Architect V

Contract job in Charlotte

Application Architect V

Contract job in Charlotte

Application Architect V - Hadoop

Contract job in Charlotte

Application Architect V

Contract job in Charlotte

Application Architect V

Contract job in Charlotte

Application Architect IV

Contract job in Charlotte

Application Programmer V

Contract job in Charlotte

Application Programmer V - SAP Ariba

Contract job in Charlotte

Database Administrator V

Contract job in Charlotte


All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.

Join the Stormer Community and prepare to advance