NorthBay Solutions is looking for a highly skilled and motivated Tech Lead/Architect - Big Data with solid experience in data engineering and a passion for Big Data. The ideal candidate will have over 12 years of experience, including 7 years specifically in Data Engineering.

You will have the chance to join a global, multicultural team of Data Engineers, contribute to its growth and development, participate in a wide variety of exciting and challenging projects, and work with our exceptional team of technology professionals.

Technology Stack Used & Required Knowledge:

  • Must have experience developing and implementing pipelines on the AWS cloud that extract, transform, and load data into information products that help the organization reach its strategic goals
  • Must have hands-on ETL Data Engineering experience with Python, Java, and Big Data technologies (Hadoop, PySpark, Spark SQL, Hive)
  • Effective time management skills, including a demonstrated ability to manage and prioritize multiple project tasks.
  • Experience creating and driving large-scale ETL pipelines in an AWS-based environment.
  • Experience with integration of data from multiple data sources.
  • Overall 8+ years of relevant work experience in Big Data engineering, ETL, Data Modeling, and Data Architecture.
  • Experience with DevOps and Continuous Integration/Delivery (CI/CD) concepts and tools such as Bitbucket, Bamboo, Ansible, Sonar, and the Atlassian tool suite
  • Strong software development and programming skills with a focus on data, using Python/PySpark and/or Scala for data engineering.
  • Experience with and understanding of core AWS services such as IAM, CloudFormation, EC2, S3, EMR/Spark, Glue, Lambda, Athena, and Redshift is a plus
  • Understanding of data types and handling of different data models.
  • Experience with data management tools such as AWS data lakes, Databricks, or Snowflake is a plus
  • Understanding of descriptive and exploratory statistics, predictive modeling, evaluation metrics, decision trees, and machine learning algorithms is a plus
  • Good scripting and programming skills.

To keep it short, below are some key responsibilities:

  • Design and develop cutting-edge Big Data and Cloud solutions, building critical, highly complex distributed systems from scratch
  • Design, develop and manage complex ETL jobs and pipelines. 
  • Act as a critical thought leader, consulting and mentoring other teams as your new systems transform the way this entire firm leverages data
  • Deploy data models into production environments. This entails feeding the model data stored in a warehouse or coming directly from sources, configuring data attributes, managing computing resources, setting up monitoring tools, etc.
  • Mentor and train colleagues where necessary by helping them learn and improve their skills, as well as innovate and iterate on best practices.
  • Solve complex issues and provide guidance to the team members when needed.
  • Make improvement and process recommendations that have an impact on the business

The rest of the qualities, you already know:

  • Leading and organizing teams of various sizes
  • Client Facing Skills
  • Flexibility in using different technologies/ platforms
  • Ability to shift gears quickly and cope with change
  • Analytical & Detail Oriented
  • Confident, proactive, and self-motivated
  • Disciplined, with a “team first” approach
Confirmed 18 hours ago. Posted 25 days ago.
