[Hiring] Senior Analyst, Data Integration & Workflows @S&P Global
Category: Data and Analytics
Salary: unspecified
Location: Remote (Netherlands)
Employment Type: full-time
Posted: 2d ago

2d ago - S&P Global is hiring a remote Senior Analyst, Data Integration & Workflows. 💸 Salary: unspecified 📍Location: Netherlands

Role Description

The Senior Analyst, Data Integration & Workflows plays a critical role within the Data AI & Enablement organization, serving as a senior technical leader responsible for designing, implementing, and operationalizing production-grade data pipelines and workflow automation that power SPDJI's index and analytical solutions. This role combines hands-on technical expertise with leadership capabilities to drive delivery excellence, mentor technical talent, and ensure all data integration solutions meet enterprise standards for quality, reliability, and maintainability.

Responsibilities and Impact

  • Technical Leadership & Solution Delivery
    • Lead complex data integration initiatives from design through production deployment, ensuring solutions are scalable, observable, and aligned with enterprise architecture standards.
    • Design and implement production-grade data pipelines (batch and streaming) that transform raw inputs into trusted curated outputs, incorporating robust error handling, validation, and reconciliation controls.
    • Establish and evangelize engineering best practices for ETL/ELT patterns, workflow orchestration, data quality controls, and operational observability across the team and value streams.
    • Drive technical decision-making for pipeline architecture, technology selection, and design patterns, balancing business requirements with technical feasibility and long-term maintainability.
    • Partner with PPD on technical planning and feasibility, providing realistic estimates, identifying technical dependencies, and shaping scope to ensure achievable delivery commitments.
  • Enablement & Co-Development
Lead hands-on enablement with value stream SMEs through pair programming, structured guidance, and co-development sessions, adapting the approach to each SME's technical capability.
    • Assess SME technical readiness and recommend appropriate engagement models (SME-led with review, co-development, or led build with validation).
    • Build reusable automation components and templates (frameworks for ingestion, validation, transformation, publishing, backfills) that accelerate consistent delivery across domains.
    • Develop SME technical capabilities through targeted coaching, code reviews, and knowledge transfer, fostering a culture of engineering excellence and continuous learning.
    • Create and maintain technical documentation, including reference architectures, design patterns, coding standards, and implementation guides.
  • Quality Assurance & Production Readiness
    • Conduct comprehensive code reviews for SME-built and team-developed pipelines, ensuring adherence to standards for maintainability, testing, logging, data validation, and documentation.
    • Implement data reliability controls including validation rules, reconciliation checks, anomaly detection, and completeness/timeliness monitoring that protect downstream index processes.
    • Engineer observability and monitoring solutions by implementing logging standards, metrics, alerts, and runbooks that enable effective production support.
    • Prepare IT-ready handover artifacts including technical documentation, test evidence, operational procedures, and clear support boundaries.
    • Partner with IT during QA and deployment, resolving issues quickly and ensuring solutions meet enterprise standards for security, supportability, and operational excellence.
  • Operational Excellence & Continuous Improvement
    • Provide L3 support for production business-logic issues, collaborating with value stream SMEs to drive root-cause analysis and implement permanent fixes for recurring failures.
    • Optimize pipeline performance and cost through appropriate partitioning strategies, caching, incremental processing patterns, and compute resource tuning.
    • Implement workflow orchestration patterns (scheduling, dependency management, retries, idempotency, parameterization) ensuring pipelines are resilient to upstream variability.
    • Capture and share lessons learned, updating engineering playbooks, patterns, and standards based on production outcomes and emerging best practices.
    • Monitor operational metrics related to pipeline reliability, data quality, performance, and cost efficiency; drive continuous improvement initiatives.
  • Collaboration & Stakeholder Management
    • Collaborate with Data Integration Lead to shape team strategy, prioritize initiatives, and align technical approaches with organizational goals.
    • Partner effectively with AI Solutions and Data Governance teams on cross-cutting concerns including data quality standards, AI pipeline requirements, and compliance.
    • Engage with Data Value Streams to understand business requirements, validate technical solutions, and ensure alignment with domain expertise.
    • Work with Data Services & Strategy teams (Vendor Governance, Catalog) to establish scalable integration patterns and ensure proper metadata and lineage tracking.
    • Build strong relationships with IT and PPD teams to ensure infrastructure readiness, smooth deployments, and operational excellence.
  • Shared Accountabilities
    • With Data Integration Lead: Execute on team strategy; provide technical leadership on complex initiatives; mentor junior team members; contribute to standards and capability development.
    • With PPD: Collaborate on technical feasibility assessments and planning; provide realistic estimates; align integration efforts with platform capabilities and roadmap.
    • With IT: Ensure infrastructure readiness; coordinate handover processes; support production gateway requirements; partner on operational excellence.
    • With Data Value Streams: Co-develop solutions with SMEs; validate business logic alignment; assess and develop SME technical capabilities.
    • With Data Services & Strategy: Establish scalable integration patterns; ensure proper metadata and lineage tracking; align with vendor governance requirements.
    • With AI Solutions & Data Governance: Coordinate on data quality standards, AI data pipeline requirements, and governance compliance.
  • Ownership
    • Complex Technical Initiatives: Own the end-to-end delivery of high-complexity data integration and workflow automation projects.
    • Engineering Standards Implementation: Responsible for implementing and enforcing technical standards, patterns, and best practices within assigned domain or value streams.
    • SME Technical Development: Own the hands-on enablement and capability development of assigned value stream SMEs in data engineering practices.
    • Production Solution Quality: Accountable for ensuring all solutions meet production readiness criteria before IT handover.
  • Parameters for Success
    • Deliver Production-Ready Solutions: Consistently deliver high-quality, production-ready data pipelines that meet business requirements and enterprise standards.
    • Build SME Capability: Demonstrably improve technical capabilities of value stream SMEs through effective enablement and mentorship.
    • Drive Reusability: Create and promote adoption of reusable components and standardized patterns that accelerate delivery.
    • Ensure Operational Excellence: Implement robust observability, monitoring, and support frameworks that minimize production incidents and enable rapid issue resolution.
    • Foster Technical Excellence: Contribute to a culture of craftsmanship, continuous improvement, and engineering best practices.
  • Key Performance Indicators (KPIs)
    • Solution Delivery Quality: Percentage of delivered pipelines that pass IT QA on first submission; production incident rate for delivered solutions.
    • SME Capability Development: Measurable improvement in technical skills of mentored SMEs through assessments, code review quality progression, and feedback.
    • Operational Reliability: Pipeline uptime and reliability metrics; mean time to resolution for production issues; data quality incident rates.
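The reliability practices listed above (bounded retries, idempotent loads that tolerate replays) can be loosely sketched as follows. This is an illustrative assumption, not S&P Global code: the function names, the dict-as-target store, and the backoff parameters are all invented for the example.

```python
import time

def run_with_retries(step, payload, max_attempts=3, base_delay=0.1):
    """Run a pipeline step with bounded retries and exponential backoff,
    so transient upstream failures do not fail the whole run."""
    for attempt in range(1, max_attempts + 1):
        try:
            return step(payload)
        except Exception:
            if attempt == max_attempts:
                raise  # permanent failure: surface it for L3 root-cause analysis
            time.sleep(base_delay * 2 ** (attempt - 1))

def idempotent_load(rows, target, key="id"):
    """Upsert rows by business key so replaying a failed batch
    never produces duplicates (last write wins)."""
    for row in rows:
        target[row[key]] = row
    return target
```

In practice the same idea carries over to a real orchestrator: schedule the step, cap its retries, and make the load an upsert keyed on a stable identifier so reruns are safe.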

Qualifications

  • Bachelor's degree in Computer Science, Engineering, Information Systems, or related field; Master's degree preferred.
  • 8+ years of experience in data engineering, data integration, or related technical roles with progressive responsibility.
  • 3+ years of experience in technical leadership roles, including mentoring engineers and leading complex technical initiatives.
  • Proven track record of designing and implementing production data pipelines in complex enterprise environments.
  • Experience in financial services, index management, or similar data-intensive industries preferred.

Requirements

  • Proficiency in ETL/ELT design patterns, data pipeline architecture, and workflow orchestration frameworks.
  • Advanced programming skills in languages commonly used in data engineering (Python, SQL, Scala, or similar).
  • Solid understanding of data quality frameworks, data validation techniques, reconciliation patterns, and anomaly detection.
  • Experience implementing observability, monitoring, and alerting systems for production data pipelines.
  • Familiarity with data governance principles, metadata management, and compliance frameworks.
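As a minimal illustration of the reconciliation and completeness checks these requirements refer to, here is a sketch. The record shape, field names, and tolerance are assumptions made up for the example, not a prescribed framework.

```python
def reconcile(source_total, loaded_rows, amount_field="amount", tolerance=0.01):
    """Compare a control total reported by the source against the sum
    of what was actually loaded; flag any difference beyond tolerance."""
    loaded_total = sum(r[amount_field] for r in loaded_rows)
    diff = abs(source_total - loaded_total)
    return {"loaded_total": loaded_total, "diff": diff, "ok": diff <= tolerance}

def completeness(expected_keys, loaded_rows, key="id"):
    """Report keys the source promised but the load did not deliver."""
    seen = {r[key] for r in loaded_rows}
    missing = sorted(set(expected_keys) - seen)
    return {"missing": missing, "ok": not missing}
```

Checks like these typically run as a post-load step, with failures raising alerts before downstream index processes consume the data.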

Leadership & Soft Skills

  • Strong technical leadership with demonstrated ability to lead complex initiatives and influence technical direction.
  • Excellent mentorship and coaching abilities, with track record of developing technical talent and improving team capabilities.
  • Outstanding collaboration skills with ability to partner effectively across technical and business teams in a matrixed organization.
  • Clear communication abilities, able to articulate complex technical concepts to varied audiences and translate business requirements into technical solutions.
  • Problem-solving mindset with focus on root-cause analysis, sustainable solutions, and continuous improvement.
  • Adaptability and pragmatism in selecting appropriate engagement models based on SME capability and project requirements.
  • Strong stakeholder management skills with ability to manage expectations and deliver on commitments.

Preferred Qualifications

  • Experience with index calculation processes, financial data workflows, or SPDJI products and methodologies.
  • Certification in Python or SQL Development (e.g., Python Institute, Microsoft SQL certifications).
  • Experience with Agile/Scrum methodologies and product-oriented delivery models.
  • Knowledge of data lineage, metadata management, and data cataloging tools (e.g., Collibra, Alation, DataHub).
  • Experience with AI/ML data pipeline requirements and integration patterns.

Benefits

  • Health & Wellness: Health care coverage designed for the mind and body.
  • Flexible Downtime: Generous time off helps keep you energized for your time on.
  • Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
  • Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
  • Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
  • Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference.
Before You Apply
Be aware of the location restriction for this remote position: Netherlands.
Beware of scams! When applying for jobs, you should NEVER have to pay anything.