Teradata is hiring a Senior Consultant with expertise in Big Data and Apache Hadoop to be part of the dynamic Big Data CoE team. The ideal candidate must be a highly energetic self-starter who can perform complex hands-on architecture and development work on the Hadoop ecosystem. The consultant will also be called on to perform proofs of concept (PoCs) for our customers during pre-sales activities.
Successful candidates will:
· Take a leadership role in consulting engagements and managing day-to-day client relationships and results.
· Ensure that engagements adhere to client strategies.
· Define the consulting engagement strategy.
· Ensure engagement quality.
· Ensure engagement adherence to budget objectives and scope.
· Engage with Teradata Account teams and prospective customers to analyse and understand customer requirements;
· Shape and influence customer requirements so that they are deployed in an optimum Hadoop architecture;
· Assist in qualifying requirements and provide guidance within the Big Data CoE to determine whether Hadoop is a good fit for the problem the customer is trying to solve;
· Lead and/or participate in the design, planning, and execution of on-site/off-site customer proofs of concept;
· Lead the team and/or configure and use the Hortonworks Hadoop distribution and associated products, typically Hive, Pig, HCatalog, and MapReduce;
· Lead the team and/or partner with the Hadoop administrator to secure and configure the Hadoop cluster, optimise performance, and administer the Hadoop environment;
· After PoC execution, document and disseminate the results and lessons learned to all stakeholders;
· Challenge standard approaches.
Successful candidates must:
· Have hands-on experience in the design, development or support of Hadoop in an implementation environment at a leading technology vendor or end-user computing organization;
· Have 5+ years' experience leading and managing development teams;
· Have 1+ years' experience implementing ETL/ELT processes with MapReduce, Pig, and Hive;
· Have 1+ years' hands-on experience with HDFS and NoSQL databases such as HBase or Cassandra on large data sets;
· Have hands-on experience with NoSQL stores (e.g. key-value store, graph database, document database);
· Have 5+ years' experience in performance tuning and in programming languages such as Shell, C, C++, C#, Java, Python, Perl, and R;
· Demonstrate a keen interest in, and solid understanding of, “big data” technology and the business trends that are driving the adoption of this technology;
· Demonstrate analytical and problem-solving skills, particularly those that apply to a Big Data environment;
· Maintain a good level of understanding about the Hadoop technology marketplace;
· Have a strong understanding of data structures, data modeling, and data warehousing;
· Team-oriented individual with excellent interpersonal, planning, coordination, and problem-solving skills.
· High degree of initiative and the ability to work independently and follow-through on assignments.
· Excellent oral and written communication skills.
· BS or MS degree in Computer Science or a relevant field.
APAC Sales & Services