[Hiring] Data Engineer, Mortgage Servicing @Lakeview Loan Servicing
Data and Analytics
Salary: $220,000 to $300,000
Remote Location
πŸ‡ΊπŸ‡Έ USA Only
Employment Type: full-time
Posted 2d ago


2d ago - Lakeview Loan Servicing is hiring a remote Data Engineer, Mortgage Servicing. πŸ’Έ Salary: $220,000 to $300,000, plus annual bonus πŸ“ Location: USA

Role Description

The Data Engineer, Mortgage Servicing on the Nebula team acts as the mortgage servicing data subject matter expert and plays a critical role in building and evolving the data foundation that powers analytics, reporting, AI development, and operational decision-making across the organization. This role is responsible for designing, building, and maintaining reliable, scalable, and flexible data systems that support a wide range of internal and external use cases.

  • Requires domain awareness in mortgage and servicing-related data environments.
  • Requires an understanding of the complexities associated with loan-level lifecycle data, transaction processing, cash movement, and reconciliation across systems.
  • Must be able to translate business workflows and system behavior into accurate, auditable data structures that support downstream reporting, operational processes, and regulatory requirements.
  • Contributes to the development and evolution of core data capabilities, including batch and real-time pipelines, operational and analytical data stores, semantic models, and BI-ready datasets.
  • Expected to operate effectively in a modern engineering environment, using automation, observability, and infrastructure-as-code practices.
  • Helps enable downstream analytics, reporting, product capabilities, and AI systems by ensuring that data is trustworthy, accessible, and fit for purpose.

Responsibilities

  • Data Pipeline Development:
    • Design, build, and maintain robust data pipelines for a wide variety of input and output sources, including internal systems, third-party platforms, files, APIs, event streams, and databases.
    • Develop scalable ETL and ELT workflows for both batch and real-time processing.
    • Ensure pipelines are reliable, testable, observable, and easy to extend as business needs evolve.
    • Build reusable data integration patterns that support growing volumes, new source systems, and downstream consumers across analytics, applications, and AI initiatives.
  • Data Platform & Storage:
    • Design and manage data architectures that support OLTP, OLAP, and reporting workloads across operational and analytical environments.
    • Build and optimize data models, warehouse schemas, and curated datasets for analytics and BI use cases.
    • Contribute to the design and operation of modern data platforms, including warehouses, lakehouses, streaming systems, and supporting orchestration frameworks.
    • Help define patterns for data storage, partitioning, performance optimization, retention, and lifecycle management.
  • Servicing-Oriented Data Modeling & Integrity:
    • Design and maintain data models that accurately reflect loan-level lifecycle events, including payment activity, balances, adjustments, and status changes.
    • Ensure consistency and reconciliation across systems where transactional, financial, and reporting data must align.
    • Identify and resolve discrepancies across source systems, and build data structures that support accurate, auditable outputs for downstream operational processes, reporting, and decisioning.
  • Cloud Deployment & Operations:
    • Deploy, operate, and improve data pipelines and data stores on major cloud platforms such as AWS, GCP, or Azure.
    • Use infrastructure-as-code, CI/CD, and automation practices to improve deployment speed, consistency, and reliability.
    • Monitor production data systems using logging, alerting, and observability tooling to proactively identify and resolve issues.
    • Support secure, resilient, and cost-conscious operation of cloud-based data infrastructure.
  • Data Quality, Reliability & Governance:
    • Implement data quality checks, validation rules, reconciliation processes, and monitoring to ensure trustworthy data across systems.
    • Establish and maintain standards for lineage, documentation, metadata, schema evolution, and operational runbooks.
    • Partner with stakeholders to improve data accessibility, consistency, and usability while maintaining appropriate controls and governance.
    • Contribute to practices that support security, privacy, auditability, and compliance in a regulated environment.
  • Cross-Functional Collaboration:
    • Partner closely with Product, Engineering, and business stakeholders to understand data needs, workflows, and constraints.
    • Translate business and operational requirements into clean, scalable, and maintainable data solutions.
    • Support downstream consumers of data, including analysts, researchers, product teams, and operational users.
    • Communicate clearly with both technical and non-technical stakeholders about data availability, quality, tradeoffs, and delivery timelines.
  • Iteration & Continuous Improvement:
    • Continuously improve pipeline performance, reliability, scalability, and developer productivity.
    • Identify opportunities to simplify architecture, reduce operational toil, and improve data platform leverage across teams.
    • Operate with a strong bias toward action and iterative delivery, moving quickly from problem definition to implementation and improvement.
    • Help raise the bar on engineering quality through thoughtful design, testing, documentation, and operational discipline.

Qualifications

  • 5-8+ years of experience building and operating production-grade data pipelines and data systems.
  • Prior experience in mortgage, servicing, or similarly regulated financial domains.
  • Strong experience with industry-standard tools and platforms for ETL/ELT, orchestration, data warehousing, streaming, and BI.
  • Experience working with both OLTP and OLAP systems, with a strong understanding of the tradeoffs between transactional and analytical workloads.
  • Experience building flexible data pipelines that integrate with many different source and destination types, including databases, APIs, files, message queues, SaaS platforms, and event streams.
  • Experience supporting both batch and real-time data processing patterns.
  • Experience deploying and operating data infrastructure on major cloud platforms such as AWS, GCP, or Azure.
  • Strong SQL skills and experience with data modeling, transformation frameworks, and performance optimization.
  • Experience building AI-powered capabilities on top of LLMs, including orchestration, evaluation, and data integration patterns.
  • Experience with modern programming languages commonly used in data engineering, such as Python, Java, Scala, or Go.
  • Comfort working with CI/CD, infrastructure-as-code, observability, and production operations for data systems.
  • Strong judgment in ambiguous environments where requirements evolve and systems must balance speed, reliability, and flexibility.
  • Clear communication skills with both technical and non-technical teammates.

Preferred Experience

  • Experience with modern orchestration and transformation tools such as Airflow, Dagster, dbt, or similar platforms.
  • Experience with cloud-native data warehouses or lakehouse platforms such as Snowflake, BigQuery, Redshift, Databricks, or equivalent technologies.
  • Experience with streaming and real-time data platforms such as Kafka, Kinesis, SQS, or similar systems.
  • Experience enabling BI and self-service analytics through curated datasets, semantic layers, and reporting platforms such as Looker, Power BI, Tableau, or similar tools.
  • Experience working with loan-level or transaction-heavy financial data within residential mortgage servicing domains.
  • Experience dealing with data reconciliation challenges across multiple systems, particularly where cash balances or investor/reporting outputs must align.
  • Experience building data platforms that support AI, machine learning, or decisioning workflows.
  • Experience improving data quality, reliability, cost efficiency, and platform scalability as a system grows.

A Note to Candidates

If you are a strong data engineer with solid technical judgment, a systems mindset, and excitement for solving complex data problems, we would love to hear from you. If your background does not line up perfectly with every bullet, but this role feels like the kind of work you want to do, please apply.

Bayview is an Equal Employment Opportunity employer. All aspects of consideration for employment and employment with the Company are governed on the basis of merit, competence and qualifications without regard to race, color, religion, sex, national origin, age, disability, veteran status, sexual orientation, or any other category protected by federal, state, or local law.

#LI-Remote

Before You Apply

πŸ‡ΊπŸ‡Έ Be aware of the location restriction for this remote position: USA Only
β€Ό Beware of scams! When applying for jobs, you should NEVER have to pay anything. Learn more.