Role Description
We are looking for a highly skilled Machine Learning Engineer with strong expertise in both traditional machine learning and modern Generative AI (LLMs) to design, build, and deploy scalable AI solutions. The ideal candidate will work on large-scale structured and unstructured data, enabling intelligent automation and insights, particularly in domains such as insurance (claims, underwriting, fraud detection).
Key Responsibilities
- Design, develop, and deploy end-to-end machine learning and deep learning models for real-world business problems
- Build scalable solutions for large-volume data processing (structured & unstructured)
- Develop and optimize Generative AI applications using LLMs (e.g., RAG pipelines, copilots, summarization, Q&A systems)
- Implement predictive analytics models such as classification, regression, clustering, and anomaly detection
- Work on insurance-focused use cases, including:
  - Claims anomaly/fraud detection
  - Risk scoring and underwriting support
  - Document processing (OCR + NLP pipelines)
- Build and maintain data pipelines and feature engineering workflows
- Fine-tune and evaluate LLMs and embedding models for domain-specific use cases
- Ensure model performance, scalability, and monitoring in production environments
- Collaborate with cross-functional teams (product, data engineering, business stakeholders)
- Maintain best practices in MLOps, model versioning, and CI/CD pipelines
Qualifications
- Strong foundation in machine learning algorithms (supervised & unsupervised)
- Experience with anomaly detection, time-series analysis, and predictive modeling
- Proficiency in Python and ML libraries (scikit-learn, XGBoost, PyTorch/TensorFlow)
- Experience with data preprocessing, feature engineering, and model evaluation
- Hands-on experience with LLMs (OpenAI, Claude, Hugging Face, etc.)
- Strong understanding of:
  - RAG (Retrieval-Augmented Generation)
  - Prompt engineering & evaluation
  - Embeddings & vector databases (e.g., FAISS, Milvus)
- Experience building GenAI applications (chatbots, document Q&A, summarization systems)
- Experience working with large datasets (batch + streaming)
- Knowledge of data pipeline tools (Spark, Airflow, or similar)
- Familiarity with cloud platforms (Azure, AWS, or GCP)
- Experience deploying models as APIs (FastAPI/Flask)
- Understanding of Docker, CI/CD pipelines, and model monitoring
- Knowledge of version control and experiment tracking tools
Requirements
- Experience in the insurance domain (claims processing, fraud detection, underwriting analytics)
- Familiarity with document AI / OCR / NLP pipelines for insurance workflows
- Experience with graph-based or network-based anomaly detection
- Exposure to multi-agent systems or AI orchestration frameworks
- Understanding of regulatory and compliance considerations in insurance AI
Benefits
- Opportunity to work on large-scale cloud data transformation initiatives
- Flexible remote work model from India
- Exposure to enterprise clients across the East Coast region
- Growth-oriented leadership culture