Role Description
The Senior Product Manager, Scoring & Assessment Systems will own the product strategy, vision, and evolution of scoring across NCBE's exam products. This is a senior product leadership role with clear decision authority, responsible for defining how scoring works at NCBE: from human grading models to the responsible adoption of automated scoring. The role requires sound product judgment, industry perspective, and strong communication skills to evaluate tradeoffs and make recommendations that shape the future of high-stakes legal licensure assessment.
This Senior Product Manager will serve as the single accountable product leader on scoring decisions, working across test development, psychometrics, operations, and technology. When priorities conflict or perspectives differ, this role will guide teams toward decisions grounded in best practices, data, and product outcomes.
This position is ideal for someone who has owned or significantly shaped end-to-end scoring in a large-scale assessment organization and can bring that expertise to a complex, evolving product ecosystem.
Essential Duties and Responsibilities
Own Scoring Product Strategy & Decisions
- Define and drive NCBE's scoring strategy across multiple-choice and constructed-response item types.
- Own the product vision and roadmap for scoring, aligning it to organizational priorities.
- Make informed product decisions on scoring approaches, balancing validity, reliability, operational feasibility, and grader and examinee experience.
- Serve as the decision-making authority when tradeoffs arise across stakeholder groups.
Lead Evolution of Scoring Models (Human & Automated)
- Evaluate and guide the adoption of automated scoring, including where and how it should (or should not) be used.
- Articulate the benefits, risks, and limitations of different scoring approaches, including hybrid human/AI models.
- Partner with psychometric and research teams to ensure appropriate validation, calibration, and evidence frameworks.
Bring Industry Best Practices to NCBE
- Apply industry experience and best practices from leading assessment organizations to inform scoring design, workflows, and policies.
- Ensure NCBE's approach reflects current best practices in large-scale, high-stakes testing.
- Introduce improvements to rubrics, training materials, and scoring processes (e.g., training papers, calibration methods).
- Manage staff in the day-to-day performance of their roles, including mentoring, setting performance goals, reviewing performance, training, professional development, recruitment, and onboarding. Foster a high-performing, accountable team environment aligned to NCBE's product strategy and organizational goals.
Translate Strategy into Product Requirements
- Define clear product requirements for scoring systems, workflows, and tools.
- Partner with technology teams to deliver scalable, reliable scoring platforms and integrations.
- Maintain a forward-looking roadmap for scoring capabilities.
Influence & Align Cross-Functional Stakeholders
- Work closely with test development, psychometrics, operations, and external partners to align on scoring approaches.
- Partner with senior stakeholders to make decisions and align teams when priorities conflict.
- Communicate and defend product decisions with clarity and credibility.
- Build alignment across teams, especially in areas of ambiguity or disagreement.
Represent Scoring Internally and Externally
- Serve as a key point of contact for scoring-related questions, including with external partners (e.g., jurisdictions such as JX).
- Present strategy, decisions, and outcomes to executive leadership, governance boards and committees, and external stakeholders.
- Clearly explain complex scoring concepts to both technical and non-technical audiences.
Qualifications
- 5–10+ years of experience in large-scale assessment, licensure, or certification testing.
- Direct experience owning or leading an assessment scoring product, capability, or major component end-to-end (not solely supporting, coordinating, or analyzing).
- Experience working at a recognized, high-quality assessment organization.
- Demonstrated ability to make, own, and defend product decisions in complex, cross-functional environments.
- Experience writing product requirements and working closely with technology teams.
Requirements
- Strong understanding of:
  - Constructed-response scoring and rubric design.
  - Rater training, calibration, and monitoring (including training papers).
  - Inter-rater reliability and why it matters.
  - Core concepts of validity and scoring quality (no advanced psychometric analysis required).
- Practical knowledge of scoring operations at scale, including tradeoffs between quality, speed, and cost.
- Working knowledge of automated or AI-assisted scoring approaches, including:
  - Strengths and limitations of different models.
  - Validation and monitoring considerations.
  - Appropriate use cases in high-stakes environments.
- Strong product management mindset with the ability to translate strategy into execution.
- Proven ability to influence without authority and navigate competing perspectives.
- Excellent communication skills, including the ability to explain and justify decisions to senior and external audiences.
- Able to communicate confidently in high-stakes discussions.
Location
Madison (Or Other), Wisconsin (Remote)
Department
Product Development
Employment Type
Full-Time
Minimum Experience
Senior Manager/Supervisor