Hadoop with Spark Development (4-6 Years), Hyderabad

Capgemini

• Should be proficient in developing Unix shell scripts, PL/SQL, and Scala frameworks.
• Expertise in Java/J2EE and big data technologies such as Hadoop, Apache Spark, and Hive is required; must have applied these skills continuously over the last 2-3 years.
• Good Python skills, with machine learning applied to data science problems in at least 2-3 projects.
• Industry experience: insurance/auto is preferred but not required.