Scala Spark Developer in New York, NY

  • Title: Scala Spark Developer
  • Code: RCI-15392-1
  • Requirement ID: 120120
  • Location: New York, NY 10036
  • Posted Date: 08/07/2024
  • Duration: 6 Months
  • Salary ($): 41.55 - 41.94 per hour

  Job Description

We are looking for a Scala Spark Developer.

Mandatory Skills:

  • Big Data, Scala, Spark, Core Java

Prior client experience – No

Onsite requirement – Yes

Number of days onsite – 3

If onsite – Office address: Broadway, New York, NY 10036

Experience:

  • 5+ years

The client is a leading provider of platforms, digital innovation, artificial intelligence, and end-to-end IT services and solutions for Global 1000 companies.

  • We are transforming corporations through deep domain expertise, knowledge-based ML platforms, and profound anthropological efforts to understand the end customer and design products and interactions that create delight.
  • We are deeply committed to developing a comprehensive understanding of our clients' problems and to developing platforms that address them.
  • We are seeking a highly skilled and motivated Spark Scala Developer to join our dynamic team. As a Spark Scala Developer, you will play a critical role in the design, development, deployment, and optimization of data processing applications.

Key Responsibilities:

  • Develop and maintain data processing applications using Spark and Scala.
  • Collaborate with cross-functional teams to understand data requirements and design efficient solutions.
  • Implement test-driven deployment practices to enhance the reliability of applications.
  • Deploy artifacts from lower to higher environments, ensuring smooth transitions.
  • Troubleshoot and debug Spark performance issues to ensure optimal data processing.
  • Work in an agile environment, contributing to sprint planning and development, and delivering high-quality solutions on time.
  • Provide essential support for production batches, addressing issues and providing fixes to meet critical business needs.

Skills/Competencies:

  • Strong knowledge of the Scala programming language.
  • Excellent problem-solving and analytical skills.
  • Proficiency in Spark, including the development and optimization of Spark applications.
  • Ability to troubleshoot and debug performance issues in Spark.
  • Understanding of design patterns and data structures for efficient data processing.
  • Familiarity with database concepts and SQL.
  • Java and Snowflake (Good to have).
  • Experience with test-driven deployment practices (Good to have).
  • Familiarity with Python (Good to have).
  • Knowledge of Databricks (Good to have).
  • Understanding of DevOps practices (Good to have).


About Rangam:

Rangam Consultants is a minority- and women-owned disability workforce solutions organization with a global presence. We specialize in attracting and retaining talent globally for rewarding careers in IT, Engineering, Scientific, Clinical, Healthcare, Administrative, Finance, Business Management, and many more fields, while integrating veterans and individuals with disabilities into the workforce. We connect career aspirants to relevant job opportunities, whether jobs in the USA, UK, India, or Ireland, as well as remote jobs, work-from-home jobs, and contract jobs across different verticals and industries.
Rangam strives to put job seekers first, giving them free access to search for jobs, post resumes, and research companies. Every day, we connect millions of people to new opportunities.