
Description

Position at Grid Dynamics

Our client is one of the largest wealth management institutions in the US. The company provides financial services to individuals, corporations, and municipalities through its subsidiary companies, which engage primarily in investment and financial planning. This is a company with an impressive history and outstanding financial results: it now has approximately $1.26 trillion in client assets.

We invite a Senior Big Data Engineer with strong Python and NoSQL knowledge to join us on an exclusive project. Our team will define the architecture and technical design, and then implement a universal, scalable solution for recording user interactions with web applications into data storage for subsequent reporting and analytics.

General responsibilities:

  • Designing, developing, and maintaining big data solutions using technologies such as Oracle, Kafka, Python, Data Streaming, and MongoDB.
  • Developing data pipelines to ingest and transform data from various sources into a format suitable for analysis.
  • Building and maintaining scalable and efficient data storage solutions using technologies such as Hadoop and NoSQL databases like MongoDB.
  • Developing data streaming solutions using Kafka or other streaming technologies to enable real-time data processing and analytics.
  • Creating and maintaining automated data workflows using tools like Airflow or similar.
  • Collaborating with other teams to ensure data security, data quality, and data governance.
  • Developing and implementing best practices for data management, data processing, and data integration.
  • Monitoring and optimizing the performance of big data solutions and providing solutions for performance issues.
  • Participating in code reviews and providing feedback to other engineers.
  • Mentoring junior team members and providing technical guidance as needed.
  • Keeping up-to-date with the latest developments in big data technologies and incorporating them into the existing solutions.

Requirements:

  • At least 5 years of experience in designing, developing, and maintaining big data solutions using technologies such as Oracle, Kafka, Python, Data Streaming, and MongoDB.
  • Strong programming skills in Python.
  • Experience in building and maintaining data pipelines using technologies like Apache NiFi, Kafka Connect, or other similar tools.
  • Solid understanding of distributed systems, data modeling, and database design concepts.
  • Experience with big data processing frameworks such as Apache Spark, Flink, or similar.
  • Experience in building scalable and efficient data storage solutions using Hadoop, NoSQL databases like MongoDB, or similar technologies.
  • Experience in developing data streaming solutions using Kafka or other streaming technologies.

We offer:

  • Work with a highly motivated and dedicated team
  • Competitive salary
  • Flexible schedule
  • Medical insurance
  • Benefits program
  • Corporate social events

About us:

Grid Dynamics is a leading provider of technology consulting, agile co-creation and scalable engineering and data science services for Fortune 1000 corporations undergoing digital transformation. We help organizations become more agile and create innovative digital products and experiences using deep expertise in emerging technology, top global engineering talent, lean software development practices, and a high-performance product culture. Founded in 2006, Grid Dynamics is headquartered in Silicon Valley with offices across the US, Mexico, UK, Netherlands, Switzerland, India, and Central and Eastern Europe.

Join us and prepare to grow! 

Confirmed 6 hours ago. Posted 21 days ago.
