EY - GDS Consulting - D&A - Azure Architect

Ernst & Young

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. 

EY GDS – Data and Analytics (D&A) – Azure/Databricks Architect

We’re looking for candidates with strong technology and data expertise in the big data engineering space and a proven delivery capability. This is a fantastic opportunity to be part of a leading firm and a growing Data and Analytics team.

Your key responsibilities

  • Architecting big data solutions in a cloud environment using Azure Cloud services
  • ETL design, development, and deployment to cloud services
  • Interact with onshore teams, understand their business goals, and contribute to the delivery of workstreams
  • Develop standardized practices for delivering new products and capabilities using Big Data technologies, including data acquisition, transformation, and analysis.
  • Define and develop client-specific best practices around data management within a Hadoop environment on Azure cloud
  • Recommend design alternatives for data ingestion, processing, and provisioning layers

Skills and attributes for success

  • 8+ years of experience in architecting big data solutions, with a proven track record of driving business success
  • Hands-on expertise in cloud services like Microsoft Azure
  • Experience with Databricks, Python, and Azure Data Factory (ADF)
  • Solid understanding of ETL methodologies in a multi-tiered stack, integrating with big data systems such as Hadoop and Cassandra
  • Experience with BI and data analytics databases
  • Strong understanding of and familiarity with Hadoop ecosystem components and Hadoop administration fundamentals
  • Strong understanding of underlying Hadoop Architectural concepts and distributed computing paradigms
  • Experience in the development of Hadoop APIs and MapReduce jobs for large-scale data processing
  • Hands-on programming experience in Apache Spark using Spark SQL and Spark Streaming, or Apache Storm
  • Hands-on experience with major components such as Hive, Pig, Spark, and MapReduce
  • Experience working with NoSQL in at least one of these data stores: HBase, Cassandra, MongoDB
  • Experience with Hadoop clustering and autoscaling
  • Good knowledge of Apache Kafka and Apache Flume
  • Knowledge of Spark and Kafka integration, with multiple Spark jobs consuming messages from multiple Kafka partitions
  • Knowledge of Apache Oozie-based workflows
  • Experience in converting business problems/challenges into technical solutions, considering security, performance, scalability, etc.
  • Experience in enterprise-grade solution implementations
  • Knowledge of big data architecture patterns (Lambda, Kappa)
  • Experience in performance benchmarking of enterprise applications
  • Experience in data security (in transit and at rest) and knowledge of regulatory standards such as APRA and Basel
  • Design and develop data ingestion programs to process large data sets in batch mode using Hive, Pig, and Sqoop
  • Develop data ingestion programs to ingest real-time data from live sources using Apache Kafka, Spark Streaming, and related technologies
  • Strong knowledge of UNIX operating system concepts and shell scripting
  • Knowledge of microservices and API development

To qualify for the role, you must have

  • Flexible and proactive/self-motivated working style with strong personal ownership of problem resolution.
  • Excellent communicator (written and verbal, formal and informal).
  • Ability to multi-task under pressure and work independently with minimal supervision.
  • Must be a team player and enjoy working in a cooperative and collaborative team environment.
  • Adaptable to new technologies and standards.
  • Participate in all aspects of Big Data solution delivery life cycle including analysis, design, development, testing, production deployment, and support.
  • Minimum 6 years of hands-on experience in one or more of the above areas.
  • Minimum 8 years of industry experience.

EY | Building a better working world 

EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. 

Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. 

Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today. 
