Job Opportunities!

Cloudera Admin

Requirement:

Cloudera Administrators with 4-5 years of experience are required.

Responsibilities:

  • Cloudera software installation and configuration
  • Database backup and recovery
  • Database connectivity and security
  • Performance monitoring and tuning    
  • Disk space management
  • Workload management
  • Communicating downtime, issues, etc.
  • Handling user requests for access, etc.
  • Keeping the clusters running smoothly per SLA
  • Candidate must have hands-on experience with HBase, Storm, and Kafka. If not Storm, then very strong skills in HBase and Kafka.

Senior Data Engineer

Requirement:
  • Bachelor’s Degree or more in Computer Science or a related field.
  • A good understanding of and experience with SQL is a must.
  • Programming experience in at least one of the following programming languages: Python, Spark, Java, or Scala, and a willingness to learn new languages to meet goals and objectives.
  • Designing, building, and operating distributed applications for scale
  • Good understanding of distributed processing technologies (Spark, MapReduce, etc.) with a focus on internals
  • Good understanding of performance tuning of Spark and Hive applications as they scale to millions of requests per day and hundreds of terabytes of data
  • Excellent troubleshooting, problem solving, critical thinking, and communication skills
  • Good understanding of Unix/Linux-based operating systems.
  • Proficient in Unix, command-line tools, and general system debugging

Responsibilities:

  • Work as a Senior Data Engineer on client and internal projects
  • Read, extract, transform, stage, and load data into selected tools and frameworks as required.
  • Perform tasks such as writing scripts, scraping the web, calling APIs, writing SQL queries, etc.
  • Work closely with the engineering team to integrate your work into our production systems.
  • Process unstructured data into a form suitable for analysis.
  • Support business decisions with ad hoc analysis as needed.
  • Monitor data performance and modify infrastructure as needed.
  • Provide architecture guidance to solve data engineering problems.