Principal Data Engineer

Fidelity

Position Description:

Creates data visualizations using object-oriented and functional languages such as Python, Java, C++, and Scala. Performs data analysis using Big Data tools such as Hadoop, Spark, Kafka, and Kubernetes. Tracks the data lifecycle with Data Management tools such as Collibra, Alation, and BigID. Works with various data councils and with data owners, data stewards, and data custodians. Leads projects with new and emerging technology and/or products in the early strategy and design stage. Leverages AWS cloud services (EC2, EMR) along with Snowflake and Elasticsearch to perform big data analytics.
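
As a hedged illustration of the Spark-based analysis this role calls for, the sketch below performs a simple aggregation with PySpark; the dataset path and column names (trades.parquet, symbol, notional) are assumptions for the example, not details from the posting.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    # Start a Spark session (in production this would typically run on EMR or a cluster).
    spark = SparkSession.builder.appName("trade-analytics").getOrCreate()

    # Read a columnar extract; the path and schema are assumptions for this sketch.
    trades = spark.read.parquet("trades.parquet")

    # Aggregate trade counts and total notional per symbol -- a typical first-pass analysis.
    summary = (
        trades.groupBy("symbol")
              .agg(F.count("*").alias("trade_count"),
                   F.sum("notional").alias("total_notional"))
              .orderBy(F.desc("total_notional"))
    )
    summary.show(10)
    spark.stop()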

Primary Responsibilities:

  • Applies Data Management practices including data governance, data catalog, data privacy, data quality, and data lineage to ensure data is secure, private, accurate, available, and usable (a minimal data quality sketch follows this list).
  • Works with Data Governance groups across the Enterprise to align and scale effective practices.
  • Partners with key stakeholders to understand their key business questions and delivers analytic self-service solutions.
  • Simplifies and effectively communicates data governance challenges, solution options, and recommendations to business partners and technology leadership.
  • Collaborates with business stakeholders, chapter leads, squad leads, tech leads, and architects to drive Fidelity’s Data Strategy forward.
  • Applies process and technology to deliver innovative solutions to meet business challenges.
  • Understands detailed requirements and delivers solutions that meet or exceed customer expectations.
  • Confers with data processing or project managers to obtain information on limitations or capabilities for data processing projects.
  • Develops or directs software system testing or validation procedures, programming, or documentation.
  • Maintains databases within an application area, working individually or coordinating database development as part of a team.
  • Analyzes information to determine, recommend, and plan computer software specifications on major projects and proposes modifications and improvements based on user need.
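
The data quality practices referenced above can be illustrated with a minimal rule-based check, here sketched in Python with pandas; the file and column names (accounts.csv, account_id, open_date) are hypothetical.

    import pandas as pd

    df = pd.read_csv("accounts.csv")

    checks = {
        # Completeness: the key field must not be null.
        "no_null_ids": df["account_id"].isna().sum() == 0,
        # Uniqueness: account_id should behave like a primary key.
        "unique_ids": not df["account_id"].duplicated().any(),
        # Validity: every open_date must parse as a date.
        "valid_dates": pd.to_datetime(df["open_date"], errors="coerce").notna().all(),
    }

    for name, passed in checks.items():
        print(f"{name}: {'PASS' if passed else 'FAIL'}")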

Education and Experience:

Bachelor’s degree (or foreign education equivalent) in Computer Science, Engineering, Information Technology, Information Systems, Mathematics, Physics, or a closely related field and five (5) years of experience as a Principal Data Engineer (or closely related occupation) performing data analysis, data mapping, and data transformation in the financial services industry.

Alternatively, Master’s degree (or foreign education equivalent) in Computer Science, Engineering, Information Technology, Information Systems, Mathematics, Physics, or a closely related field and three (3) years of experience as a Principal Data Engineer (or closely related occupation) performing data analysis, data mapping, and data transformation in the financial services industry.

Skills and Knowledge:

Candidate must also possess:

  • Demonstrated Expertise (“DE”) managing metadata assets using Collibra Intelligence Platform and Alation Data Intelligence Platform and performing data analytics on operational data sources using Snowflake, SQL Server, DB2, DynamoDB, or PostgreSQL: creating data source connections, performing data quality checks and validation, integrating platforms, building business glossaries, stitching glossary terms to table columns, and documenting data sharing practices and policies (see the first sketch after this list).
  • DE providing extraction, transformation, and loading (ETL) solutions by developing Informatica mappings and Snowflake databases using Python; establishing an inbound data cleansing capability using Informatica Metadata Manager (MMR) and Informatica Data Quality (IDQ); creating metadata lineage in Collibra Intelligence Platform to support impact analysis, troubleshooting, and collaboration on development projects; and providing end-to-end visibility of data flows throughout the enterprise.
  • DE designing and developing ETL data management processes, including source code management and version control, using Informatica PowerCenter and DevOps tools (Git/Stash, Jenkins, and Jira); optimizing data quality solutions using Informatica Data Quality (Informatica Developer Client and Informatica Analyst); scheduling batch jobs using Control-M; and validating data sets using Informatica’s Data Validation Option (DVO).
  • DE designing and developing database objects and applications in cloud computing environments (AWS EC2, EMR, S3, and CloudFormation); migrating data from on-premises Informatica sources to Snowflake using Python scripts; and building and supporting database objects and applications using Snowflake, AWS EC2, EMR, and Datadog (see the second sketch after this list).
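
The first sketch below illustrates the kind of Snowflake data quality validation described in the first DE bullet, using the snowflake-connector-python package; the warehouse, database, table, and column names (ANALYTICS_WH, OPERATIONS, CUSTOMER_ACCOUNTS, CUSTOMER_ID) are hypothetical, not taken from the posting.

    import os
    import snowflake.connector

    # Connect using credentials from the environment; all identifiers here are assumptions.
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="ANALYTICS_WH",
        database="OPERATIONS",
        schema="PUBLIC",
    )
    try:
        cur = conn.cursor()
        # Row-count and null-rate checks are common first-pass validations
        # before stitching a table's columns to glossary terms in a catalog.
        cur.execute("""
            SELECT COUNT(*)                      AS row_count,
                   COUNT_IF(CUSTOMER_ID IS NULL) AS null_ids
            FROM CUSTOMER_ACCOUNTS
        """)
        row_count, null_ids = cur.fetchone()
        print(f"rows={row_count}, null customer ids={null_ids}")
    finally:
        conn.close()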
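The second sketch shows one step of an on-premises-to-Snowflake migration such as the last DE bullet describes: staging a local extract and bulk-loading it with COPY INTO. The stage, file, and table names are placeholders; a real Informatica migration would involve far more orchestration.

    import os
    import snowflake.connector

    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="LOAD_WH",
        database="MIGRATION",
        schema="PUBLIC",
    )
    cur = conn.cursor()

    # PUT uploads the local extract into a named internal stage (hypothetical names).
    cur.execute("PUT file:///tmp/positions_extract.csv @MIGRATION_STAGE AUTO_COMPRESS=TRUE")

    # COPY INTO bulk-loads the staged, compressed file into the target table.
    cur.execute("""
        COPY INTO POSITIONS
        FROM @MIGRATION_STAGE/positions_extract.csv.gz
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
    """)
    conn.close()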

Category:

Information Technology

Fidelity’s hybrid working model blends the best of both onsite and offsite work experiences. Working onsite is important for our business strategy and our culture. We also value the benefits that working offsite offers associates. Most hybrid roles require associates to work onsite every other week (all business days, M-F) in a Fidelity office.
