Zero-Risk Enrollment: Receive a full refund through the second week of the course, no questions asked.
Course features
Level: Foundations
Time Commitment: 2 hours/week
Duration: 6 weeks
Class Size: 15 Students
Schedule: Thursdays, 3-5pm ET (US), October 23-December 4, 2025 (excluding Thanksgiving Day)
Format: Live Online via Google Meet
Why take a quantitative UX research course on surveys?
Surveys are an essential tool in any UX researcher's toolbox. Yet they are one of the most deceptively difficult research methods. There are innumerable "invisible" mistakes you might be making, and no survey tool will tell you what's wrong.
Subtle wording choices, question ordering, or response format decisions could be systematically skewing your results. The insights drawn from flawed data could be driving major product decisions. If there's no one at your org conversant in survey methodology, no one will be able to say what's going wrong.
Survey methodology has decades of established best practices that most UX researchers never learn. We pick up survey writing through trial and error on the job. If we're lucky, we absorb some theory from self-study resources or from peers.
Course Description
This course teaches you everything you need to design surveys that yield reliable, accurate data. It will give you the confidence to do survey work that informs strategic decisions and help you build essential quantitative UX research skills.
You'll learn the systematic approach that professional survey scientists use to generate trustworthy findings. No more wondering whether your data will hold up under scrutiny. No more presenting results with that creeping doubt in the back of your mind.
Course Format
Most survey courses focus on abstract theory or academic examples, which rarely translate easily to UX research. This course is built from the ground up for UX research practitioners. You will learn the theory and skills you need to design reliable, impactful surveys, exactly the way you will do it on the job.
Students will form small groups and work on realistic product scenarios throughout the course. Each week introduces new survey design techniques and immediately connects them to practical challenges, so you’re always practicing in context.
Week 1: Spot common survey failure modes and practice structured goal-setting.
Weeks 2–5: Build out survey instruments step by step, applying principles of question design, evaluation frameworks, validity testing, and design psychology. Work in teams to critique and improve each other’s surveys, just like you’ll do in the field.
Week 6: Interpret survey results and practice communicating findings clearly to stakeholders.
You’ll need: Google Meet for live sessions and basic spreadsheet tools (Google Sheets or Excel). No prior stats software experience required.
What new skills will I gain from this quantitative UX research course?
Core Competencies
Research Strategy and Planning
Define measurable objectives that align surveys with business decisions
Choose survey modes that minimize bias for your specific context
Select an appropriate sample size for your analysis needs
Question Engineering
Write questions grounded in best practices that capture true user sentiment
Design response scales that generate actionable data
Sequence questions to maintain engagement and reduce dropout
Quality Assurance
Apply professional evaluation frameworks to catch flaws before launch
Test question interpretation through systematic cognitive interviews
Optimize survey design to reduce measurement error
Data Interpretation and Communication
Interpret quantitative survey results and explain statistical significance
Identify when sample limitations affect generalizability
Present findings with appropriate caveats to non-technical stakeholders
How will this course help my career?
Quantitative UX research skills are increasingly valuable
Advancing in the field of UX research increasingly requires quantitative skills. As an individual contributor, you may be called upon to run surveys, even if you identify as a qualitative researcher. As a manager, you will likely need to oversee quantitative UX researchers and critique their work. Knowing survey methodology will be essential.
Gain increased stakeholder trust
Product leaders increasingly demand quantified evidence when making decisions. When your survey methodology is bulletproof, leaders will feel more comfortable relying on your insights for big investments and strategic decisions.
Achieve methodological authority
Become the go-to person for evaluating survey quality, training junior researchers, and establishing research standards across your organization.
Lead mixed-methods studies with confidence
Combine qualitative insights with quantitative validation, positioning yourself for more complex, high-visibility research projects.
Who is this course for?
UX researchers who regularly use surveys but want to be able to speak confidently about their process and defend the validity and reliability of their data.
Qualitative UX researchers looking to transition to mixed-methods or quantitative UX research roles.
Research leaders and managers who need to quality-check team members' survey work but lack expertise to spot subtle methodological problems.
Product managers who commission user research and want to distinguish good surveys from poor ones, enabling better collaboration with research teams and more informed interpretation of findings.
This course is especially critical if you...
Support product decisions where small measurement errors translate into major strategic mistakes
Work at organizations where user data directly influences significant business decisions
Want to advance into senior research roles where quantitative competency is increasingly non-negotiable
Prerequisites
None. No prior experience with survey methodology, statistics, or advanced mathematics is required. We start from first principles and build systematically.
HarmoniJoie Noel, PhD
Bio
Dr. HarmoniJoie Noel is a Senior Mixed Methods Researcher with a PhD in Sociology and Survey Research Methodology, bringing 15 years of expertise in healthcare research and patient experience studies. She has held distinguished roles at major organizations including RTI International, CDC's National Center for Health Statistics, and Booz Allen Hamilton, conducting groundbreaking research on health insurance literacy and patient experiences that has directly informed healthcare policy.
Free Advisory Session
Have questions about the course? Want to chat about your learning goals to see if they align with the course approach? Book a free call with the instructor.
Learning Outcomes
By course completion, you will confidently:
Identify the goals, requirements, and constraints that shape effective survey design.
Craft questions, grounded in best practices, that minimize error and maximize impact.
Evaluate survey quality using structured frameworks like the Questionnaire Appraisal System (QAS).
Test questions with cognitive interviewing to uncover hidden flaws.
Interpret basic survey data to spot patterns and communicate significance to stakeholders.
Overview
Course Syllabus
Week 1: Survey Goals and Common Failure Modes
Learning Objective: Identify poor survey practices and apply a structured framework for defining survey goals.
We’ll start with a real example of a “bad” survey—40 open-ended questions with an expected 100% response rate—and unpack all the ways it went wrong. From there, we’ll introduce key questions to ask before writing a single survey item: What do you want to know? How precise do you need to be? Who are the stakeholders? What resources are available? These considerations set the foundation for effective sampling and analysis planning.
Workshop: Break into pairs (one acting as stakeholder, one as researcher) to define goals for different UX scenarios using the full checklist of guiding questions.
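As a purely illustrative aside (the course itself requires no code or stats software), the "How precise do you need to be?" question from Week 1 comes down to a trade-off between sample size and margin of error. Here is a minimal Python sketch, assuming a simple random sample, a 95% confidence level, and a worst-case proportion of 50%:

```python
# Illustrative only: how the margin of error on a survey percentage shrinks
# as the sample size grows (95% confidence, worst-case proportion p = 0.5,
# simple random sample).
import math

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    """Approximate margin of error for a proportion p estimated from n respondents."""
    return z * math.sqrt(p * (1 - p) / n)

for n in (50, 100, 400, 1000):
    print(f"n = {n:4d} -> margin of error is about +/- {margin_of_error(n):.1%}")
```

Because the margin of error shrinks with the square root of the sample size, roughly quadrupling respondents only halves your uncertainty, which is why the precision target belongs in the planning conversation, not after fieldwork.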
Week 2: Question Types and When to Use Them
Learning Objective: Differentiate among question formats and match them to research needs.
This week covers the major categories of survey questions and their trade-offs: single vs. multiple select, check-all-that-apply vs. yes/no, unidirectional vs. bidirectional scales, number of points on a scale, midpoints vs. no midpoint, horizontal vs. vertical layouts, and frequency vs. attitudinal items.
Workshop: Each student is assigned a question type and writes a sample item on a given topic. The class shares and critiques the resulting questions.
Week 3: What Makes a Well-Written Question?
Learning Objective: Apply the Questionnaire Appraisal System (QAS) to evaluate survey questions.
We’ll explore the components of strong questions and the common pitfalls to avoid. The QAS framework provides criteria across domains like clarity, assumptions, memory demands, sensitivity/bias, and response categories.
Workshop: Review good and bad question examples, then evaluate sample items using the QAS framework in small groups.
Week 4: Domains, Constructs, and Cognitive Interviewing
Learning Objective: Translate abstract concepts into measurable constructs and test them for validity.
We’ll use health plan/CAHPS examples to illustrate how domains lead to constructs and constructs to questions, while also introducing measurement error. You’ll then practice cognitive interviewing—having participants “think aloud” as they answer questions—to understand how people interpret and respond.
Workshop: Create constructs and questions for a UX domain, review them with the QAS, and run cognitive interviews in pairs.
Week 5: Survey Modes and Web Design Best Practices
Learning Objective: Understand how mode and design choices affect data quality.
We’ll compare survey modes (in-person, phone, mail, web, IVR) with a focus on web surveys, where most UX work occurs. Topics include layout (progress bars, questions per page, horizontal vs. vertical scales), formatting, instructions, spacing, grouping, and ordering.
Workshop: Review real examples of short but complete surveys (web and mail). Identify design and layout flaws, then propose and share improvements.
Week 6: From Data to Insights
Learning Objective: Interpret quantitative survey results to identify patterns and test significance.
The final session introduces analysis strategies, including how to interpret crosstabs, spot patterns, and apply the basics of statistical significance. The goal is not advanced stats, but understanding how to generate meaningful insights from your survey data. We’ll also briefly cover how to use AI tools to run analysis, which can be a great shortcut if used judiciously.
Workshop: Work with sample survey results to practice identifying patterns, looking for significance and meaningful findings, and explaining results in plain language for stakeholders.
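As another illustrative aside (again, not required coursework), the kind of crosstab-plus-significance reading covered in Week 6 can be sketched in a few lines of Python using the SciPy library; the counts below are invented for the example:

```python
# Illustrative only: checking a simple 2x2 crosstab for statistical significance.
# Hypothetical counts, made up for the example:
# do new vs. returning users differ in rating a task as "easy"?
from scipy.stats import chi2_contingency

#              easy   not easy
crosstab = [[  72,       28],   # new users
            [  88,       12]]   # returning users

chi2, p_value, dof, expected = chi2_contingency(crosstab)
print(f"chi-square = {chi2:.2f}, p = {p_value:.3f}")
# By convention, p < 0.05 is read as a statistically significant difference
# between the groups; Week 6 covers how (and how not) to interpret this.
```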