[Hiring] Lead Data Engineer (GenAI / LLM Applications) @Clario
Data and Analytics
Salary: unspecified
Location: Remote (India)
Employment type: Full-time
Posted: 2d ago

Clario is hiring a remote Lead Data Engineer (GenAI / LLM Applications). 💸 Salary: unspecified 📍 Location: India

Role Description

We are looking for a skilled and motivated Lead Engineer to join our Data Science and Delivery group at Clario, a part of Thermo Fisher Scientific. This role combines software development, data engineering, and analytical problem-solving to design, build, and maintain scalable data platforms that support clinical trial operations and business intelligence. You will work across the full software development lifecycle (SDLC)—from requirements gathering through production support—collaborating closely with data scientists, analysts, product managers, and engineering teams to deliver high-quality, data-driven solutions.

Qualifications

  • Bachelor’s or higher degree in Computer Science, Information Technology, or a related technical field.
  • 5+ years of professional experience in software engineering, data engineering, or data-focused development roles.
  • Strong proficiency in Python, including frameworks and libraries such as Django or Flask, pandas, NumPy, Plotly, and ag-Grid.
  • Strong SQL expertise with Oracle, MS SQL Server, PostgreSQL, and/or Snowflake.
  • Proven experience writing complex SQL, including analytical and window functions, subqueries, all join types, DML/DDL/TCL statements, CASE expressions, and performance tuning.
  • Working knowledge of cloud platforms, with a preference for AWS (S3, EC2, Secrets Manager, Bedrock, Lambda).
  • Experience using AI-assisted development tools and frameworks such as GitHub Copilot and LangChain for building LLM-powered applications and workflows.
  • Experience with Git-based version control systems and CI/CD pipelines.
  • Familiarity with data modeling concepts for both structured and unstructured data.
  • Strong analytical thinking, problem-solving abilities, and communication skills.
  • Willingness to work across all phases of the SDLC, including requirements gathering, design, development, deployment, and production support.
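To illustrate the SQL depth the qualifications list calls for, here is a minimal sketch of an analytical window-function query, run against an in-memory SQLite database for portability (the role's actual engines are Oracle, MS SQL Server, PostgreSQL, and Snowflake; the `site_visits` table and its columns are hypothetical):

```python
import sqlite3

# In-memory database standing in for a clinical-operations warehouse;
# the table and sample rows are invented for illustration.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE site_visits (site_id TEXT, visit_date TEXT, enrolled INTEGER);
    INSERT INTO site_visits VALUES
        ('S01', '2024-01-05', 3),
        ('S01', '2024-01-12', 5),
        ('S02', '2024-01-06', 2),
        ('S02', '2024-01-13', 4);
""")

# Running total of enrollment per site: a SUM() window function
# partitioned by site and ordered by visit date.
rows = conn.execute("""
    SELECT site_id,
           visit_date,
           SUM(enrolled) OVER (
               PARTITION BY site_id
               ORDER BY visit_date
           ) AS cumulative_enrolled
    FROM site_visits
    ORDER BY site_id, visit_date
""").fetchall()

for row in rows:
    print(row)
# -> ('S01', '2024-01-05', 3)
# -> ('S01', '2024-01-12', 8)
# -> ('S02', '2024-01-06', 2)
# -> ('S02', '2024-01-13', 6)
```

The same `PARTITION BY ... ORDER BY` pattern carries over directly to the listed engines, though each has its own extensions (e.g. `QUALIFY` in Snowflake).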

Responsibilities

  • Design, develop, and maintain scalable software architectures and data pipelines that integrate with analytical and operational systems.
  • Write clean, reusable, and well-tested Python code using frameworks such as Flask and related libraries.
  • Leverage AI-assisted development tools, including GitHub Copilot and LangChain, to design, build, and integrate LLM-powered solutions such as retrieval-augmented generation (RAG) pipelines, intelligent agents, and automated workflows using AWS Bedrock or similar services.
  • Develop and optimize complex SQL across Oracle, MS SQL Server, PostgreSQL, and Snowflake, including procedures, functions, views, analytical functions, and dynamic SQL.
  • Design and implement ETL pipelines using Snowflake and related data processing technologies.
  • Implement scheduling and orchestration using Apache Airflow or similar workflow orchestration frameworks.
  • Establish and maintain data quality frameworks, versioning, and governance practices to ensure data reliability, integrity, and compliance.
  • Develop and maintain data architectures and models for both structured and unstructured data sources.
  • Troubleshoot production issues and drive continuous improvement in software quality, performance, and reliability.
  • Deploy, manage, and support solutions on AWS, including storage, compute, and pipeline services.
  • Create source-to-target mappings and support data and code migration initiatives.
  • Partner with stakeholders to gather requirements, translate business needs into technical solutions, and produce clear, well-structured documentation.
  • Collaborate with product managers, analysts, and cross-functional teams to deliver data-driven insights and reporting using tools such as Plotly and Power BI.
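The retrieval-augmented generation (RAG) work described above reduces to two steps: retrieve the documents most relevant to a question, then assemble them into a prompt for an LLM. The sketch below is a deliberately tiny, stdlib-only stand-in that uses keyword-overlap scoring in place of a vector store and stops before the model call; a production pipeline would use LangChain retrievers and invoke a model via AWS Bedrock as the posting describes, and the document text here is hypothetical:

```python
def score(question: str, doc: str) -> int:
    """Count shared lowercase word tokens -- a crude stand-in for the
    embedding similarity a real vector store would compute."""
    return len(set(question.lower().split()) & set(doc.lower().split()))

def retrieve(question: str, docs: list[str], k: int = 2) -> list[str]:
    """Return the k documents with the highest keyword overlap."""
    return sorted(docs, key=lambda doc: score(question, doc), reverse=True)[:k]

def build_prompt(question: str, context_docs: list[str]) -> str:
    """Assemble retrieved context and the question into one prompt --
    the final step before invoking the model (e.g. via AWS Bedrock)."""
    context = "\n".join(f"- {d}" for d in context_docs)
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

# Hypothetical corpus of clinical-operations notes.
docs = [
    "Site S01 completed enrollment in January.",
    "Protocol amendment 2 changed the visit schedule.",
    "The Snowflake warehouse refreshes nightly.",
]
question = "Which site completed enrollment?"
prompt = build_prompt(question, retrieve(question, docs))
print(prompt)
```

Swapping the toy `retrieve` for an embedding-backed retriever and piping `prompt` into a Bedrock `invoke_model` call is what turns this outline into the RAG pipeline the responsibilities describe.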

Benefits

  • Competitive compensation aligned with local market practices.
  • Comprehensive health and wellness benefits.
  • Paid time off and company holidays.
  • Opportunities for professional development, learning, and career growth.
  • The flexibility of working from Bangalore or remotely within India, while collaborating with global teams.

Before You Apply

Be aware of the location restriction for this remote position: India.
Beware of scams! When applying for jobs, you should NEVER have to pay anything.