Hadoop Developer – Gurgaon (2–4 Years of Experience)

Last Updated : 02 Feb, 2017

Experience: 2–4 years

Position Summary:

We are looking for candidates with hands-on experience in Big Data technologies, to be based out of our Gurgaon office.

Key Responsibilities:

  • Build the Big Data infrastructure to store and process terabytes of data
  • Understand the business need (what kind of data, how much data, which algorithms will run, expected load on the system, budget, etc.) and recommend optimal solutions
  • Build and implement the solution. This requires being hands-on: building quick prototypes, proofs of concept, and data-processing benchmarks
  • Work with the operations team to build the systems, processes, and team required to run and maintain the infrastructure securely, reliably, and at scale
  • Work with the analytics team to understand what data landscaping would be required

Qualifications and Skills:

  • Must have 2–4 years of experience with Big Data technologies such as Hadoop and the related ecosystem
  • Practical experience and in-depth understanding of MapReduce
  • Hands-on experience with Spark/Hive/Pig/Flume/Sqoop
  • Should have a good programming background with expertise in Java
  • Familiarity with the data-infrastructure tools landscape, e.g. cloud service providers, virtualization software, system monitoring tools, and development environments
  • Ability to program and to mentor junior team members on technical aspects
  • Ability to write documents that explain complex ideas in simple terms, in order to build consensus or educate
  • Knowledge of R or another statistical programming language is a plus
  • Degree: graduate or postgraduate in CSE or a related field

College Preference: No bar
Min Qualification: UG
Skills: Flume, Hadoop, Hive, MapReduce, Pig, R, Spark, Sqoop
Location: Gurugram
APPLY HERE
