Role Description
Pulse Labs is hiring its first dedicated quality role. This is a foundational position: you'll define how quality works here, not just execute an existing playbook. You'll sit at the intersection of Product, Engineering, and Design, ensuring that what we build matches what we intended, and that what we intended was well-defined in the first place.
You'll spend the majority of your time testing what we've built and serving as the primary release gate before features ship, but you'll also work upstream with Product to translate acceptance criteria into thorough QA test plans.
Key Responsibilities
- Test Planning (~30%)
  - Take acceptance criteria authored by Product Managers and build robust QA test cases that go well beyond the happy path, covering edge cases, error states, boundary conditions, and regression scenarios.
  - Flag gaps or ambiguities in acceptance criteria back to PMs when test cases can't be written cleanly against them.
  - Maintain a living test case library organized by feature area and risk level.
- Testing & Release Gating (~60%)
  - Own the QA gate in the release process. You have authority to flag releases as not ready and escalate blocking issues.
  - Execute manual testing against test cases for all features prior to release, with a strong emphasis on regression testing across the platform.
  - Coordinate with Engineering on test coverage: they own unit tests and CI automation; you own functional, integration, and regression testing.
  - Coordinate with Product and Design on UAT: they validate intent and experience; you validate correctness and completeness.
  - Document and track defects with clear reproduction steps, severity, and expected vs. actual behavior.
- Process Development (~10%)
  - Build Pulse Labs' QA processes from scratch: define the release checklist, bug triage criteria, and quality reporting cadence.
  - Identify high-risk areas of the platform where automated testing would have the highest ROI, and advocate for engineering investment there.
  - Establish quality metrics and communicate trends to the broader team.
Qualifications
- Rigorous attention to detail. You catch the thing everyone else missed. You think in edge cases.
- Strong analytical and communication skills. You can read a requirements doc, identify what's undertested, and articulate gaps clearly to a PM or engineer.
- Experience writing test cases and test plans from acceptance criteria, not just executing someone else's.
- Comfort working directly with engineering teams. You don't need to write code, but you need to be credible in technical conversations: comfortable reading logs, navigating browser dev tools, testing API responses, and understanding system architecture at a conceptual level.
- Experience with manual testing methodologies, particularly regression testing.
- Comfortable coordinating across time zones. The engineering team is primarily India-based (your day-to-day collaborators), while Product and Design are US-based. You'll need solid written communication for async handoffs with the US team.
- 3+ years in a QA, Quality Analyst, or similar role.
Nice-to-Have
- Experience being the first QA hire or building QA processes from the ground up.
- Familiarity with AI/ML products or research platforms.
- Exposure to test automation frameworks (we're not asking you to build them, but understanding what's possible helps you advocate for the right investments).
- Experience with Jira or similar issue trackers.
- Background in UX research tooling or data platforms.