This guide will walk you through how to set up an A/B study.
A/B testing is a valuable tool for comparing two versions of a product, such as different website layouts or app designs. This guide will walk you through the minimum steps to create an A/B study in MUIQ.
Step 1: Choose the Study Type
Navigate to the Usability section in the study creation dashboard.
Select the A/B Test option.
Step 2: Define Your Conditions
Go to the Conditions tab.
The study is pre-set with two conditions (e.g., “Version A” and “Version B”).
Optional: Click “+” to add more conditions.
Define which conditions participants will see and in what order (a sketch illustrating these settings follows this list).
Choose either:
“All Conditions” (default) if participants will see both versions.
“One Condition” or “One Fixed Condition and Another at Random” for specific testing scenarios.
Specify the condition order. Choose either:
Fully Randomized (default)
Listed Order
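To make these settings concrete, here is a minimal Python sketch (purely illustrative, not MUIQ’s implementation) of how the presentation setting and the condition order combine to determine what one participant sees. The function name, parameters, and values are hypothetical.

```python
import random

CONDITIONS = ["Version A", "Version B"]  # the two pre-set conditions

def conditions_for_participant(presentation="all", order="fully_randomized"):
    """Return the conditions one participant will see, in order.

    presentation: "all" (every participant sees both versions, the default)
                  or "one" (each participant sees a single version at random).
    order: "fully_randomized" (default) shuffles per participant;
           "listed" keeps the order defined on the Conditions tab.
    """
    if presentation == "one":
        return [random.choice(CONDITIONS)]
    shown = list(CONDITIONS)
    if order == "fully_randomized":
        random.shuffle(shown)
    return shown

# Example: an "All Conditions" study with fully randomized order
print(conditions_for_participant())        # e.g., ['Version B', 'Version A']
# Example: a "One Condition" study
print(conditions_for_participant("one"))   # e.g., ['Version A']
```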
Step 3: Set Up Tasks
Open the Tasks tab to create the activities participants will complete.
Expand the Prototype Task to display configuration options for both Version A and Version B.
Click the downward arrow icon to the left of the task title (e.g., “Prototype Task”). This will expand the task details.
Version A:
Enter Pre-Task Text instructions in the Text editor to provide the scenario/context:
Provide the participant with a scenario and any necessary information to complete the study tasks. This information should be presented in the goal-oriented language of a typical user of your product.
Provide concise instructions for each task, including the task itself and the desired endpoint.
If using a Task Validation Question, remind participants to make note of the answer for a follow-up question.
If it’s a think-aloud task, specify what aspects of the interface or prototype you’d like the participant to comment on.
Enter In-Task Text instructions in the Text editor:
This should be an abbreviated version of your Pre-Task instructions.
Typically, it does not need to repeat the Context/Scenario.
It should include only the essential information needed to complete the task.
Specify the Task Type. Select either:
Send to Target URL
Input the URL (e.g., Figma) for each condition under “Task Target URL.”
Text/Image Task (e.g., screenshot)
If using the Send to Target URL task type, enter the URL (e.g., a Figma prototype link) for each condition (see the sketch below).
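When each condition has its own prototype link, it can help to keep a small record of which URL belongs to which version before pasting them into the Task Target URL fields. A hypothetical Python sketch follows; the URLs and names are placeholders, not MUIQ settings.

```python
# Hypothetical record of each condition's prototype link; replace the URLs
# with your own before entering them under "Task Target URL".
task_target_urls = {
    "Version A": "https://www.figma.com/proto/EXAMPLE-A",
    "Version B": "https://www.figma.com/proto/EXAMPLE-B",
}

# Quick sanity check: every condition has a distinct, well-formed https link.
assert all(url.startswith("https://") for url in task_target_urls.values())
assert len(set(task_target_urls.values())) == len(task_target_urls), \
    "Each condition should point to its own prototype URL"
print("Target URLs look OK:", task_target_urls)
```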
Participant Recording Options:
No Recording: No screen or interaction recordings are collected.
Screen Recording (Default): Records URLs, clicks, and the participant’s screen.
Privacy Blur: Blurs sensitive information and all screen text while still capturing interaction data.
Think Aloud Recording: Records URLs, clicks, screen, and audio (microphone input). Note: May require extra task instructions and could increase dropout rates.
Custom: Allows customized recording settings.
❗Ensure you pilot test the study thoroughly and review the results; heat maps and click maps may not render correctly in some cases.
Manage Participant Logic:
Choose how tasks will be assigned:
Use Visibility Logic to determine when tasks appear.
Always Shown (default)
Only shown if…
Set the Task Randomization order to minimize order bias (see the sketch after this list):
Randomized (default)
Fixed in position
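As a rough illustration of how these two settings interact, the sketch below (hypothetical Python, not MUIQ’s logic engine) orders tasks either randomly or in a fixed position and hides any task whose “Only shown if…” rule is not met. The task titles and the only_shown_if rule are made-up examples.

```python
import random

def order_tasks(tasks, randomize=True):
    """Return the task order one participant receives.

    randomize=True mirrors "Randomized" (the default); False mirrors
    "Fixed in position", which keeps the listed order.
    """
    ordered = list(tasks)
    if randomize:
        random.shuffle(ordered)
    return ordered

def is_visible(task, prior_answers):
    """Tasks marked "Always Shown" have no rule; "Only shown if..." tasks
    appear only when their rule evaluates to True for this participant."""
    rule = task.get("only_shown_if")
    return rule is None or rule(prior_answers)

tasks = [
    {"title": "Find the checkout page"},          # Always Shown
    {"title": "Describe where you got stuck",     # Only shown if...
     "only_shown_if": lambda answers: answers.get("completed_checkout") is False},
]

prior_answers = {"completed_checkout": False}
shown = [t["title"] for t in order_tasks(tasks) if is_visible(t, prior_answers)]
print(shown)
```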
Version B: Repeat the Version A configuration above (Step 3) for Version B.
Step 4: Review and Finalize
Confirm that all tasks and conditions are configured correctly.
Preview the study using the Generate Preview option in the top-right menu.
Once satisfied, click Activate Study to launch.
Key Tips for Success
Use task randomization to reduce participant bias.
Double-check condition URLs for accuracy (see the sketch below).
Test the study workflow thoroughly before activation.
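Before activating the study, you can also run a quick automated check that each condition’s URL still resolves. The sketch below uses only the Python standard library; the URLs are placeholders, and some hosts reject HEAD requests, so treat a failure here as a prompt to open the link manually rather than proof that it is broken.

```python
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

# Placeholder links; substitute the URLs you entered under "Task Target URL".
condition_urls = {
    "Version A": "https://www.figma.com/proto/EXAMPLE-A",
    "Version B": "https://www.figma.com/proto/EXAMPLE-B",
}

for condition, url in condition_urls.items():
    try:
        with urlopen(Request(url, method="HEAD"), timeout=10) as response:
            print(f"{condition}: HTTP {response.status} for {url}")
    except (HTTPError, URLError) as err:
        print(f"{condition}: could not verify {url} ({err})")
```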