Role Description
We are seeking a highly experienced and versatile Senior AI Programmer / Data Analyst with strong expertise in machine learning, large language models (LLMs), and intelligent agent systems. The ideal candidate is comfortable both building production-ready AI applications and delivering deep, actionable data analyses. This is a cross-functional role, combining advanced AI programming with analytical insight to drive product innovation and decision-making.
Responsibilities
- Design, develop, and deploy AI/ML solutions with a focus on LLM-based systems and autonomous agents
- Implement and fine-tune transformer models (e.g., GPT, BERT, LLaMA) for a variety of use cases, such as chatbots, summarization, Q&A, and retrieval-augmented generation (RAG)
- Build and maintain AI agents that interact with tools, APIs, or external environments
- Analyze complex datasets to identify patterns, trends, and actionable insights
- Collaborate with data engineers, product teams, and stakeholders to align AI capabilities with business goals
- Develop and monitor data pipelines and model performance in production
- Translate analytical findings into clear, compelling dashboards, reports, and narratives
- Stay ahead of emerging trends in generative AI, LLMs, and agentic workflows
- Mentor junior developers and contribute to code quality and architectural decisions
Qualifications
- Bachelor’s or Master’s degree in Computer Science, Data Science, AI/ML, or a related field
- 5+ years of experience in AI/ML development, data analytics, and intelligent system design
- Hands-on experience with LLMs (e.g., OpenAI, Anthropic, Cohere, or Hugging Face models)
- Proven ability to build and deploy multi-step autonomous agents using frameworks such as LangChain, AutoGPT, or custom architectures
- Proficiency in Python and Java, and in ML libraries such as PyTorch, TensorFlow, and scikit-learn
- Strong SQL and data-wrangling skills
- Experience with data visualization tools (e.g., Tableau, Power BI, Plotly)
- Working knowledge of MLOps, CI/CD pipelines for ML, and cloud platforms (AWS, GCP, or Azure)
- Ability to turn business requirements into technical solutions with measurable impact
Requirements
- Experience with Retrieval-Augmented Generation (RAG) pipelines
- Familiarity with vector databases (e.g., FAISS, Pinecone, Weaviate, Chroma)
- Knowledge of prompt engineering and evaluation techniques for LLMs
- Exposure to big data tools like Apache Spark, Kafka, or Snowflake
- Previous work in domains like finance, healthcare, or SaaS platforms
Benefits
- Work on cutting-edge AI projects using the latest in LLM and agent tech
- Collaborate with a team of forward-thinking engineers and analysts
- Competitive compensation and benefits
- Flexible working environment (remote-friendly)
- Opportunities for learning, growth, and industry exposure