
The Role:

We are looking for passionate and seasoned data architects to build next-generation data warehouses / data lakes and business intelligence solutions for our business stakeholders. The candidate will be required to understand and gather data warehouse / data lake requirements, then architect, design / model and implement timely solutions for the business needs. The candidate should be able to own, and be responsible for, the complete architecture of the solution.

The data architect will own, optimize and maintain conceptual, logical and physical data models. Exposure to the Sales, Finance, Manufacturing, Quality, Engineering, Supply Chain, After-sales / Credit and Digital Workplace domains would be preferred.

Responsibilities:

The primary responsibilities of the solution architect include:

  • Own and drive the architecture for a project / product.
  • Design, modify, and test technical architecture
  • Continually research current and emerging technologies and propose changes where needed
  • Choose the right architecture to solve the business problems
  • Assess the business impact that certain technical choices have.
  • Provide updates to stakeholders on product development processes, costs and budgets.
  • Lead a team of technical staff, mentoring and guiding them through the solution
  • Communicate and translate the technical solution to the team and other stakeholders
  • Drive re-use initiatives and promote re-use culture within the team
  • Design and implement the various components of the data pipeline, including the integration, storage, processing and analysis of business data
  • Analyze business requirements and derive a conceptual model by identifying entities and the relationships between them; identify attributes and create logical and physical models
  • Business process modelling, process flow modelling and data flow modelling skills are preferable
  • Proficiency in creating flowcharts, process flows and data flow diagrams
  • Collaborate with application development and support teams on various IT projects
  • Develop complex modules and proof of concepts
  • Lead and drive performance optimization efforts
  • Define standards and guidelines for the project and ensure the team follows them
  • Assist in setting strategic direction for database, infrastructure and technology through necessary research and development activities
  • Drive automation as a key principle: development and testing should be automated through frameworks / tools
  • Monitor performance and advise on necessary infrastructure changes through capacity planning and sizing exercises
  • Work with vendors such as cloud providers and consultants to deliver the project
  • Define non-functional requirements and ensure the solution adheres to them
  • Define Agile and DevOps best practices while implementing solutions

Skills and Qualifications:

The ideal candidate should have worked on end-to-end data warehousing / data lake and / or business intelligence solutions in the past. The candidate should have the following skill sets:

  • Excellent understanding of the distributed computing paradigm
  • Excellent knowledge of data warehouse / data lake technologies and business intelligence concepts
  • In-depth understanding of coding languages such as Java, Python, Scala and SQL
  • Good knowledge of relational, NoSQL and big data databases, with the ability to write queries
  • Should have strong implementation experience in all the below technology areas (breadth) and deep technical expertise in some of the below technologies:
  • Data integration – ETL tools like Glue, Talend and Informatica; ingestion mechanisms like Flume and Kafka
  • Front-end / back-end technologies: Spring Boot, React, Angular, etc.
  • Data modelling – Dimensional and transactional modelling using RDBMS, NoSQL and big data technologies. Experience in snowflake schema modelling would be an advantage
  • Data visualization – Tools like Tableau
  • Master data management (MDM) – Concepts and expertise in tools like Informatica & Talend MDM
  • Big data – Hadoop ecosystem, distributions like Cloudera / Hortonworks, Pig and Hive
  • Data processing frameworks – Spark and Spark Streaming
  • Hands-on experience with multiple databases like PostgreSQL, Snowflake, Oracle, MS SQL Server and NoSQL (HBase, Cassandra, MongoDB) is required
  • Exposure to multiple AWS components like Lambda, EC2, SNS, SES, CloudWatch, RDS, etc.
  • Knowledge of various data modelling techniques; hands-on experience with data modelling tools like ERwin, TOAD and PowerDesigner is preferable
  • Experience in the cloud data ecosystem, specifically AWS. Experience in AWS data components like S3, Redshift and RDS, and managed services like Aurora and Athena, is desirable
  • Demonstrated strong analytical and problem-solving capability
  • Good understanding of the data ecosystem, both current and future data trends
  • 10+ years of experience in the solution architect space, with at least a few complex, high-volume data projects delivered as an architect, is mandatory

Drive your career forward and join the company leading the technology and business evolution in the automotive industry.

Trivandrum, Kerala, India
