Lead Data Engineer

S&P Global

The Role: Lead Data Engineer, Agile & PPM Tools

Grade: 11

The Location: Gurgaon/Hyderabad

The Team:

The Agile & PPM Tools team is responsible for developing and managing the enterprise tools that enable Agile execution (Azure DevOps), project portfolio management (Solutions Business Manager), and reporting (Tableau) for all S&P Global divisions. These tools are used across S&P Global for portfolio management capabilities such as prioritizing, executing, and tracking portfolios; Agile workflow execution to enable Agile delivery; and portfolio and Agile analytics.

The Impact:

Are you looking for an opportunity to advance your career as part of an innovative enterprise team? The Agile & PPM Tools team is looking for an innovative professional who can bring teamwork, creativity, and business-analysis experience to a global team.

What’s in it for you:

As a Data Engineer, you will have the opportunity to work with the latest technologies and engage closely with senior stakeholders from all business groups and divisions within S&P Global. The team's overall goal is to develop and support a common enterprise toolset for all divisions that enables portfolio management, resourcing, execution, and reporting, using cutting-edge technologies such as Azure DevOps, AWS cloud services (S3, EC2, RDS, Lambda, Kinesis, Glue, Redshift, etc.), and Databricks, as well as market leaders in PPM and Agile tools.

Responsibilities:

In day-to-day operations, you will work with the Agile & PPM Analytics Scrum team to develop cutting-edge data products that deliver Agile & PPM data to our internal customers, who range from senior leadership to individual users across the S&P Global enterprise. You will also drive strategic initiatives that enable us to deliver better products, faster, and with world-class support. With your acute investigation, troubleshooting, and communication skills, you will champion support best practices and improve our users' experience. Some of your areas of ownership will include:

What You'll Do 

  • Work in a Scrum framework to develop solutions that extract data from various sources into our data lake, powered by Databricks and AWS Redshift
  • Translate business requirements into technical requirements
  • Develop ETL processes using native and third-party tools and utilities
  • Lead semantic modeling and development activities
  • Write and optimize complex SQL and shell scripts
  • Design and develop code, scripts, and data pipelines that leverage structured and unstructured data
  • Build data ingestion pipelines and ETL processing, including low-latency data acquisition and stream processing
  • Design and develop processes and procedures for integrating data warehouse solutions into an operational IT environment
  • Monitor performance and advise on any necessary configuration and infrastructure changes
  • Create and maintain the technical documentation required to support solutions
  • Build a deep understanding of the solutions delivered on both platforms to understand their utility from our users' and customers' perspective
  • Keep up to date on support and customer-experience best practices, and suggest or implement improvements in processes or tools
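As a rough illustration of the extract-transform-load pattern the responsibilities above describe, here is a minimal sketch in Python using only the standard library. The table names, columns, and data are hypothetical examples, not details from this role; a production pipeline on this team would target Databricks or Redshift rather than SQLite.

```python
import sqlite3

# Hypothetical source system with raw work-item records.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE work_items (id INTEGER, state TEXT, effort REAL)")
src.executemany("INSERT INTO work_items VALUES (?, ?, ?)",
                [(1, "Done", 3.0), (2, "Active", 5.0), (3, "done", 2.0)])

# Extract: pull rows from the source table.
rows = src.execute("SELECT id, state, effort FROM work_items").fetchall()

# Transform: normalize the state field and keep only completed items.
completed = [(i, s.lower(), e) for (i, s, e) in rows if s.lower() == "done"]

# Load: write the transformed rows into a warehouse-style fact table.
dwh = sqlite3.connect(":memory:")
dwh.execute("CREATE TABLE fact_completed_work (id INTEGER, state TEXT, effort REAL)")
dwh.executemany("INSERT INTO fact_completed_work VALUES (?, ?, ?)", completed)

total_effort = dwh.execute(
    "SELECT SUM(effort) FROM fact_completed_work").fetchone()[0]
print(total_effort)  # 5.0
```

The same extract/transform/load structure scales up when the source is an API or event stream and the target is a cloud warehouse; only the connectors change.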

What We’re Looking For:

Basic Qualifications:

  • 8+ years of professional experience as a data engineer or in a similar role
  • B.S. or M.S. in Computer Science or a related field
  • Hands-on experience with one or more ETL tools
  • Hands-on experience with industry-standard data models and with mapping between source systems, core, and semantic data structures
  • Working knowledge of the financial services domain
  • Good understanding of dimensional modeling techniques, such as star vs. snowflake schemas
  • Working experience creating semantic and reporting mapping documents
  • Strong experience designing and developing ETL architectures
  • Strong RDBMS concepts and SQL development skills
  • Strong knowledge of data modeling and mapping
  • Experience with data integration from multiple data sources
  • Working experience in one or more business areas and industries, such as financial services
  • Working experience with one or more cloud environments such as AWS, Azure, or GCP, along with hands-on experience with their services
  • Hands-on development experience in Python or R, along with good knowledge of languages and frameworks such as .NET, Java, and AngularJS
  • Knowledge of Agile execution tools such as Azure DevOps, JIRA, etc.
  • Experience using a workflow management tool such as ServiceNow, Solutions Business Manager, or VersionOne
  • Demonstrable experience facilitating, leading, influencing, and managing within large-scale, globally distributed matrix organizations
  • Openness to working flexible hours as business needs require
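To make the star vs. snowflake distinction above concrete, here is a small illustrative star schema: a fact table joined to denormalized dimension tables. All table and column names are invented for illustration (in a snowflake schema, dim_project would itself be normalized further, e.g., splitting division into its own table).

```python
import sqlite3

# Star schema sketch: one fact table with foreign keys into flat,
# denormalized dimension tables. Names are hypothetical examples.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_project (project_key INTEGER PRIMARY KEY,
                          project_name TEXT, division TEXT);
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, year INTEGER);
CREATE TABLE fact_story  (project_key INTEGER, date_key INTEGER,
                          points REAL);
INSERT INTO dim_project VALUES (1, 'PPM Tools', 'Enterprise');
INSERT INTO dim_date VALUES (20240101, 2024);
INSERT INTO fact_story VALUES (1, 20240101, 8.0), (1, 20240101, 5.0);
""")

# Typical star-schema query: aggregate the facts, sliced by dimensions.
row = con.execute("""
    SELECT p.division, d.year, SUM(f.points)
    FROM fact_story f
    JOIN dim_project p ON p.project_key = f.project_key
    JOIN dim_date    d ON d.date_key    = f.date_key
    GROUP BY p.division, d.year
""").fetchone()
print(row)  # ('Enterprise', 2024, 13.0)
```

The trade-off: star schemas keep dimensions wide and flat for simpler, faster joins; snowflake schemas normalize them to reduce redundancy at the cost of extra joins.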

Preferred Qualifications:

  • Good data analysis skills using Excel and other BI tools
  • Prior experience with Power BI, Tableau, Databricks, or AWS services
  • Experience with PPM tools such as Solutions Business Manager (a Micro Focus tool) and/or Agile tools
  • 10+ years of professional experience as a data engineer or in a similar role
  • Experience with SQL and NoSQL databases, including MongoDB
  • Experience with cloud computing platforms, including Amazon Web Services and Microsoft Azure
  • Proficiency in Python, R, or MATLAB

What You'll Bring 

  • Excellent communication and presentation skills, both verbal and written
  • Proven experience in customer-facing roles for large engagements and in managing solution delivery teams
  • Ability to solve problems with a creative and logical mindset
  • Demonstrated skills in team leadership, coaching, and competency building
  • Self-motivation, analytical skills, attention to detail, organization, and the pursuit of excellence in all tasks
  • Good understanding of reporting tools such as Power BI and Tableau
  • Good knowledge of big data technologies such as data lakes, Databricks, Redshift, Spark, and Kafka
  • Experience with NoSQL databases such as HBase and MongoDB
  • Experience with a Hadoop distribution such as Cloudera or Hortonworks
  • Experience with AWS tools and technologies (a plus)
  • Training or certification in AWS or Agile (a plus)

Equal Opportunity Employer

S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. 

If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.  

US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. 


10 - Officials or Managers (EEO-2 Job Categories-United States of America), IFTECH103.2 - Middle Management Tier II (EEO Job Group), SWP Priority – Ratings - (Strategic Workforce Planning)
