Senior Big Data Architect

Nissan

Nissan is a pioneer in Innovation and Technology. With a focus on Mobility, Operational Excellence, Value to our Customers and Electrification of vehicles, you can expect to be part of a very exciting journey here at Nissan. 

Nissan is pursuing a massive digital transformation backed by leading technologies across the organization globally. We are committed to building a diverse, entrepreneurial organization, and our current team is strong evidence of that. Our people are what drive the business forward. At Nissan Digital, you will be part of a dynamic team with ample opportunities to grow and make a difference.

The Role: 

We are looking for a passionate and seasoned Senior Big Data Architect to build next-generation cloud data platforms (data lake, DWH, etc.) and business intelligence solutions in the cloud (AWS preferred) using big data technologies. The candidate will be required to understand and gather data warehouse / data lake requirements, then architect, design and implement timely solutions for the business needs. The candidate should be able to own, and be responsible for, the complete architecture of multiple data projects / products at the same time.

Responsibilities: 

The Senior Big Data Architect will architect, design and implement data platforms (data lake, DWH, ODS, etc.) and frameworks (streaming, ML, IoT analytics, etc.) in the cloud using various big data technologies.

The primary responsibilities include: 

  • Own and drive the architecture for multiple projects / products at the same time, taking global roll-outs into account 
  • Manage global roll-outs of data projects / products, working with various stakeholders 
  • Choose the right architecture to solve the business problem 
  • Lead a team of data engineers and senior / lead engineers by mentoring and guiding them 
  • Communicate and translate the technical solution to the team and other stakeholders 
  • Architect, design and implement the various components of the data pipeline, including data integration, storage, processing and analysis of business data 
  • Drive re-use initiatives and promote re-use culture within the team 
  • Collaborate with application development and support teams on various IT projects 
  • Develop complex modules and proof of concepts 
  • Lead and drive performance optimization efforts 
  • Define standards and guidelines for the project and ensure the team follows them 
  • Assist in setting strategic direction for database, infrastructure and technology through necessary research and development activities 
  • Champion automation as a key driver; development and testing should be automated through frameworks / tools 
  • Monitor performance and advise on necessary infrastructure changes through capacity planning and sizing exercises 
  • Work with various vendors like cloud providers and consultants to deliver the project 
  • Define non-functional requirements and make sure the solution adheres to them 
  • Define and follow agile and DevOps best practices while implementing solutions 

Skills and Qualifications: 

The ideal candidate should have architected multiple end-to-end data warehousing / data lake and / or business intelligence solutions using big data technologies. The candidate should have the following skill sets: 

  • Excellent experience in cloud data platforms and data services is a must. AWS certification is preferred. 
  • Excellent understanding of the distributed computing paradigm 
  • Excellent programming background using Java, Scala or Python 
  • Good knowledge of data warehouse / data lake technology and business intelligence concepts 
  • Good understanding of Lambda architecture 
  • Strong implementation experience in three or more tools from any of the below technology areas: 
      • Data integration – ingestion mechanisms such as Kinesis, Kafka, NiFi, AWS Glue 
      • Data storage – relational, NoSQL and big data storage; distributed data warehouse solutions such as Snowflake 
      • Big data – Hadoop ecosystem, HDFS, distributions such as EMR, Cloudera or Hortonworks, Pig and Hive 
      • Data processing frameworks – Spark and Spark Streaming, or Databricks 
  • Experience in the cloud data ecosystem, specifically AWS; experience with AWS data components such as S3, Glue, EMR, Redshift and RDS, and with managed services such as Aurora and Athena, is desirable 
  • Hands-on experience with multiple databases such as PostgreSQL, Snowflake, Oracle, MS SQL Server and NoSQL stores (HBase / Cassandra, MongoDB) is required 
  • Experience building stream-processing systems using solutions such as Storm or Spark Streaming 
  • Good conceptual knowledge of data modelling using dimensional and transactional models 
  • Demonstrated strong analytical and problem-solving capability 
  • Good understanding of the data ecosystem, including both current and future data trends 
  • Familiarity with other cloud providers such as Microsoft Azure and Google Cloud (GCP) would be an added advantage 
  • Experience architecting solutions in a cloud-agnostic / vendor-agnostic way would be desirable 
  • Exposure to advanced data analytics using machine learning techniques for various business use cases is required 
  • Exposure to data visualization techniques and analytics using tools such as Tableau is required 
  • A minimum of 12-16 years of experience in the data engineering space using the above technologies, including ownership, as an architect, of a number of complex, high-volume data projects / products with global roll-outs, is mandatory. 

Drive your career forward and join the company leading the technology and business evolution in the automotive industry. 

Trivandrum, Kerala, India
