Hadoop Engineer
$30,000 USD/year. Pay is set based on global value, not the local market. Most roles = hourly rate x 40 hrs x 50 weeks ($15 USD/hour).

Not accepting applications on crossover.com at this time.

Description

Are you a Hadoop expert with a deep understanding of its multifaceted environment and the ability to leverage Hadoop platforms to their maximum potential? As the world of data continues to expand, so does the complexity and potential of Hadoop.

Skyvera stands at the intersection of innovation and precision in big data solutions. Our vision is grounded in a customer-obsessed support structure. We approach the challenges of the telco sector by leveraging Hadoop platforms to deliver consistent, high-quality support to our customers.

Your role will be central to this vision: you'll manage the delicate balance between capacity and performance in the Hadoop environment, ensuring our systems are tuned, secure, and process-efficient. You'll administer operating systems, networks, and hardware, crafting a robust and reliable ecosystem. The data landscape will also be under your purview, and, at the heart of it all, you'll be a key player in providing unparalleled support and upholding our service commitments.

Joining Skyvera is more than a career move; it's a commitment to excellence and innovation in the big data realm. If you're ready to be a cornerstone in this journey, we encourage you to apply!

What you will be doing

  • Administering the Hadoop environment, including the associated operating system, tools, network, and hardware
  • Maintaining and optimizing Hadoop capacity, including performance tuning, security, configuration, and process scheduling (a minimal capacity-check sketch follows this list)
  • Planning and executing hardware and software installations, as well as managing data, users, and job executions
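
The capacity bullet above can feel abstract, so here is a minimal sketch of the kind of check it implies, assuming a node with the stock `hdfs` CLI on the PATH and HDFS admin rights. The `hdfs dfsadmin -report` command is standard Apache Hadoop; the 90% threshold and the exact report-line parsing are illustrative assumptions, not anything Skyvera prescribes.

```python
# Minimal sketch: flag HDFS capacity pressure from `hdfs dfsadmin -report`.
# Assumes the `hdfs` CLI is on PATH and the caller has HDFS superuser rights;
# the 90% threshold and the line-based parsing are illustrative, not prescriptive.
import subprocess
import sys

USED_PCT_THRESHOLD = 90.0  # alert once DFS usage crosses this percentage


def dfs_used_percentages(report: str) -> list[float]:
    """Pull every 'DFS Used%' figure out of the dfsadmin report text."""
    values = []
    for line in report.splitlines():
        line = line.strip()
        if line.startswith("DFS Used%"):
            # Report lines typically look like: "DFS Used%: 42.17%"
            values.append(float(line.split(":")[1].strip().rstrip("%")))
    return values


def main() -> int:
    report = subprocess.run(
        ["hdfs", "dfsadmin", "-report"],
        capture_output=True, text=True, check=True,
    ).stdout
    hot = [pct for pct in dfs_used_percentages(report) if pct >= USED_PCT_THRESHOLD]
    if hot:
        print(f"WARNING: {len(hot)} capacity figure(s) at or above {USED_PCT_THRESHOLD}%")
        return 1
    print("HDFS capacity within threshold")
    return 0


if __name__ == "__main__":
    sys.exit(main())
```

In practice a check like this would typically be wired into Cloudera Manager, Ambari, or Nagios rather than run by hand, but it shows the day-to-day capacity awareness the role calls for.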

What you will NOT be doing

  • Getting stuck in repetitive tasks that don't tap into the full potential of the Hadoop ecosystem
  • Wasting time on obsolete technologies instead of leveraging the latest tools in the Hadoop space

Key responsibilities

  • Ensuring the reliability, stability, and efficiency of the Hadoop environment

Candidate requirements

  • Currently based in or willing to relocate to Mumbai or Pune
  • At least 2 years of experience supporting large-scale production environments with Hadoop distributions (Apache, Hortonworks, Cloudera, etc.)
  • At least 2 years of experience using Hadoop monitoring tools (Cloudera Manager, Ambari, Nagios, Ganglia, etc.); see the sketch after this list for the kind of health check these tools automate
  • Excellent Linux skills
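
To make the monitoring-tools requirement concrete, the sketch below shows the sort of Nagios-style check those tools run under the hood: polling the NameNode's JMX servlet for block health. The `/jmx` endpoint and the `FSNamesystem` bean exist in stock Apache Hadoop, but the hostname is a placeholder, and the port (9870 is the Hadoop 3.x web UI default) and attribute names are assumptions to verify against the distribution in use.

```python
# Nagios-style health check sketch against the NameNode JMX servlet.
# The /jmx endpoint and the FSNamesystem bean exist in stock Apache Hadoop,
# but the host, port (9870 is the Hadoop 3.x web UI default), and attribute
# names below should be verified against the distribution in use.
import json
import sys
from urllib.request import urlopen

NAMENODE_JMX = "http://namenode.example.com:9870/jmx?qry=Hadoop:service=NameNode,name=FSNamesystem"

# Standard Nagios plugin exit codes.
OK, WARNING, CRITICAL, UNKNOWN = 0, 1, 2, 3


def main() -> int:
    try:
        with urlopen(NAMENODE_JMX, timeout=10) as resp:
            bean = json.load(resp)["beans"][0]
    except Exception as exc:  # network error, bad JSON, empty bean list, ...
        print(f"UNKNOWN: could not query NameNode JMX: {exc}")
        return UNKNOWN

    missing = bean.get("MissingBlocks", 0)
    under_replicated = bean.get("UnderReplicatedBlocks", 0)

    if missing > 0:
        print(f"CRITICAL: {missing} missing blocks")
        return CRITICAL
    if under_replicated > 0:
        print(f"WARNING: {under_replicated} under-replicated blocks")
        return WARNING
    print("OK: no missing or under-replicated blocks")
    return OK


if __name__ == "__main__":
    sys.exit(main())
```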

Meet a successful candidate

Isuru Samarasinghe | Software Engineer
Sri Lanka  

Isuru saves two hours a day by cutting his daily commute around Colombo, Sri Lanka's capital. Working remotely as a Software Engineer for Tr...

How it works

Applying for a role? Here’s what to expect.

We’ve curated a series of steps that take the guesswork (and cognitive bias) out of recruiting the best person.

STEP 1: Pass Cognitive Aptitude Test.
STEP 2: Pass English Proficiency Test.
STEP 3: Prove Real-World Job Skills.
STEP 4: Ace An Interview Or Two.
STEP 5: Accept Job Offer.
STEP 6: Celebrate!

Why Crossover

Recruitment sucks. So we’re fixing it.

The Olympics of work

It’s super hard to qualify—extreme quality standards ensure every single team member is at the top of their game.

Premium pay for premium talent

Over 50% of new hires double or triple their previous pay. Why? Because that’s what the best person in the world is worth.

Shortlist by skills, not bias

We don’t care where you went to school, what color your hair is, or whether we can pronounce your name. Just prove you’ve got the skills.