Senior Data Engineer

Sony

We look for the risk-takers, the collaborators, the inspired and the inspirational. We want the people who are brave enough to work at the cutting edge and create solutions that will enrich and improve the lives of people across the globe. So, if you want to make the world say wow, let's talk.

The conversation starts here. If this role matches your ambitions and skillset, let's get started with your application. Take a look at our other open positions too. Our many opportunities can lead to infinite possibilities.

Job Description – Senior Data Engineer (FTE)

[About SISC]

Sony India Software Centre Pvt Ltd is seeking talented and motivated individuals to join our team! As a subsidiary of Sony Corporation, a multinational conglomerate based in Tokyo, Japan, we are at the forefront of software development and support services in the technology industry.

Located in Bangalore, India, we were established in 1998 to provide software development and support services to various Sony group companies. We have since grown into a key player in the software development industry, providing innovative solutions in areas such as consumer electronics, professional broadcasting, and game development.

With a talented and experienced team of engineers, we are dedicated to delivering high-quality software products and services while also contributing to the growth of the technology industry in India.

[Job Title]

Senior Data Engineer

[Project Details]

GISC/IC/Compliance & Security

[Technology and Sub-technology]

S3, Redshift, DynamoDB, MapReduce, Kafka, and streaming technologies

[Base Location]

  • Bangalore

[Type]

  • Hybrid

[Qualifications]

  • A bachelor's degree in Computer Science, Business Intelligence, or Data Analytics is a must-have
  • Solid knowledge of AWS database and data technologies
  • Solid knowledge of data modeling and database design (a minimal sketch follows this list)
  • Solid knowledge of the workings of distributed database models, including SQL, NoSQL, and performance optimization
  • Solid knowledge of data structures and algorithms
  • Certifications such as the AWS Database Specialty or AWS Data Analytics Specialty are preferred
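
To make the data modeling and performance-optimization expectations concrete, here is a minimal sketch of a Redshift-style star schema created from Python, assuming a hypothetical sales domain. The table names, columns, and connection details are illustrative, not part of this posting; psycopg2 is used because Redshift is PostgreSQL-compatible.

    # Minimal sketch: star-schema DDL for Amazon Redshift, executed via psycopg2.
    # Table names, columns, and connection details are hypothetical placeholders.
    import psycopg2

    DDL = """
    CREATE TABLE IF NOT EXISTS dim_customer (
        customer_id BIGINT NOT NULL,
        region      VARCHAR(32),
        PRIMARY KEY (customer_id)
    ) DISTSTYLE ALL;                 -- small dimension: replicate to every node

    CREATE TABLE IF NOT EXISTS fact_sales (
        sale_id     BIGINT NOT NULL,
        customer_id BIGINT NOT NULL REFERENCES dim_customer (customer_id),
        sale_date   DATE   NOT NULL,
        amount      DECIMAL(12, 2)
    )
    DISTKEY (customer_id)            -- co-locate rows joined on customer_id
    SORTKEY (sale_date);             -- speeds up date-range scans
    """

    def create_schema() -> None:
        # Redshift listens on port 5439 by default; credentials are placeholders.
        conn = psycopg2.connect(
            host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",
            port=5439, dbname="analytics", user="etl_user", password="...",
        )
        try:
            with conn, conn.cursor() as cur:  # commits on success, rolls back on error
                cur.execute(DDL)
        finally:
            conn.close()

    if __name__ == "__main__":
        create_schema()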

[Job Overview]

The primary technologies leveraged will be AWS data services, Power BI, and Python; open-source data technologies may also be leveraged from time to time. The full set of duties is listed under [Responsibilities and Duties] below.

[Primary Skills]

  • Overall technology experience of 8+ years
  • A minimum of 7 years' experience designing, implementing, and supporting medium- to large-scale database systems
  • A minimum of 5 years' experience in requirements analysis, data modeling, and database design
  • A minimum of 4 years' experience designing, developing, and tuning solutions using AWS database and storage technologies
  • Experience designing, developing, and supporting solutions using S3, Redshift, DynamoDB, and any of the managed RDS engines is a must-have
  • Experience designing, developing, and supporting solutions using MapReduce, Kafka, and streaming technologies is a must-have (a minimal streaming sketch follows this list)
  • Advanced Python programming skills are a must-have
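
As a flavor of the streaming work called out above, here is a minimal Python sketch of a Kafka consumer that keeps a running count per event type. The topic, broker address, and group id are illustrative assumptions, and kafka-python is just one of several common clients.

    # Minimal sketch: consume JSON events from Kafka with the kafka-python client.
    # Topic, broker address, and group id are hypothetical placeholders.
    import json
    from kafka import KafkaConsumer

    consumer = KafkaConsumer(
        "page-views",                          # hypothetical topic
        bootstrap_servers=["localhost:9092"],  # hypothetical broker
        group_id="demo-aggregator",
        auto_offset_reset="earliest",
        value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    )

    # Toy aggregation; a production pipeline would checkpoint state and
    # write aggregates to a downstream store such as Redshift or DynamoDB.
    counts: dict[str, int] = {}
    for message in consumer:
        event_type = message.value.get("type", "unknown")
        counts[event_type] = counts.get(event_type, 0) + 1
        print(event_type, counts[event_type])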

[Good-to-have Skills]

  • Prior experience designing, developing, and supporting solutions using database technologies such as MySQL, PostgreSQL, or Cassandra is a plus

[Responsibilities and Duties]

  • Project delivery:
  • Understand the business domain, core data objects, and data relationships
  • Model the structure and content of the feeds from the various source systems
  • Model the structure and content of the processed and transformed data in the various target systems
  • Design the ETL layer, data warehouse, data marts, and transactional databases, including load parameters
  • Build high performance, security, usability, operability, maintainability, traceability, observability, and evolvability into the system design
  • Assess design-influencing parameters such as normalization and de-normalization, the most frequently executed transactions, record counts, data sizes, and I/O characteristics at the database and OS levels when designing databases and tables
  • Maintain a catalog of metadata, master, transactional, and reference data
  • Tune transactions and queries, and determine the appropriate client libraries and fetch mechanisms (e.g., direct queries vs. stored procedures)
  • Design the system for resilience, failover, and self-healing, and institute rollback plans
  • Develop and test database code and other core and helper utilities in Python
  • Develop and profile queries, triggers, indexes, and stored procedures
  • Monitor query health and identify patterns that lead to bottlenecks before customers notice them
  • Own the DevOps and release management practices
  • Estimate the cost of AWS services usage
  • Design and develop a data REST API layer in Python (a minimal sketch follows this list)
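
To give concrete shape to the data REST API duty above, here is a minimal read-only sketch that serves items from a DynamoDB table over HTTP. The table name, key attribute, and route are illustrative assumptions; Flask and boto3 stand in for whatever framework and client the team actually uses.

    # Minimal sketch: a read-only data REST API over DynamoDB with Flask + boto3.
    # Table name, key attribute, and route are hypothetical placeholders.
    import boto3
    from flask import Flask, abort, jsonify

    app = Flask(__name__)
    table = boto3.resource("dynamodb").Table("events")  # hypothetical table

    @app.route("/events/<event_id>")
    def get_event(event_id: str):
        # Single-key lookup; "event_id" is an assumed partition key.
        response = table.get_item(Key={"event_id": event_id})
        item = response.get("Item")
        if item is None:
            abort(404)
        return jsonify(item)

    if __name__ == "__main__":
        app.run(port=8080)

A production version of such a layer would add pagination, authentication, and input validation on top of this skeleton.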

[Keywords]

  • S3, Redshift, DynamoDB
  • MapReduce, Kafka, and streaming technologies
  • Advanced Python