How to Create a Custom Code Experiment
This guide walks you through setting up a custom code experiment in Elevate — from writing your code to launching the test.
Before You Start
Custom code experiments run your JavaScript and CSS on your live store. Unlike other experiment types where Elevate handles the implementation, you're responsible for the code quality.
Before creating the experiment:
Prototype in browser dev tools first — Write and test your JS/CSS in the browser console on your live store to make sure it works
Check for errors — Open the browser console and verify there are no JavaScript errors
Test on mobile and desktop — Make sure your code is responsive
Step 1: Create an Experiment
From your Elevate dashboard, go to Experiments and click New Experiment. Select Custom Code Experiment.
If the Elevate theme extension isn't enabled on your store, you'll be prompted to enable it before proceeding.
Step 2: Name Your Experiment and Write a Hypothesis
Give your experiment a descriptive name — something like "Sticky Add to Cart Bar" or "Exit Intent Popup Test."
Fill in the description and hypothesis fields. For example:
"Adding a sticky add-to-cart bar on mobile product pages will increase add-to-cart rate by keeping the purchase action visible as visitors scroll through the page."
Step 3: Set Up Variations
Select how many variations you want (including the control). The minimum is 2.
Control — No code is injected. Visitors see your store as-is. This is set automatically.
Variation(s) — For each variation, you'll configure three things:
JavaScript
Enter your custom JavaScript. This code executes after the page loads for visitors assigned to this variation.
Common patterns:
DOM manipulation (adding elements, modifying content, hiding sections)
Event listeners (click tracking, scroll triggers)
Conditional logic (show different content based on cart value, time of day, etc.)
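As a sketch of the first two patterns, the snippet below adds a sticky add-to-cart bar and wires a click handler, matching the example hypothesis above. Every selector and class name here is an assumption; adapt them to your theme's actual markup. The HTML is built by a plain helper function so you can inspect it outside the browser.

```javascript
// Hypothetical sketch only — class names and theme selectors are assumptions.
function buildStickyBarHtml(buttonLabel) {
  // Pure helper: returns the bar's markup as a string.
  return '<div class="elevate-sticky-bar">' +
         '<button type="button" class="elevate-sticky-bar__btn">' + buttonLabel + '</button>' +
         '</div>';
}

if (typeof document !== 'undefined') {
  // DOM manipulation: inject the bar at the end of <body>.
  var wrapper = document.createElement('div');
  wrapper.innerHTML = buildStickyBarHtml('Add to cart');
  document.body.appendChild(wrapper.firstChild);

  // Event listener: delegate clicks to the theme's own add-to-cart button
  // (the form selector is a common Shopify pattern, but verify it on your store).
  document.querySelector('.elevate-sticky-bar__btn').addEventListener('click', function () {
    var themeButton = document.querySelector('form[action*="/cart/add"] [type="submit"]');
    if (themeButton) themeButton.click();
  });
}
```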
CSS
Enter custom CSS styles. These are injected into the page and apply to visitors in this variation.
Common patterns:
Styling new elements added by your JS
Overriding existing theme styles
Hiding or repositioning elements
Responsive adjustments with media queries
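For example, styles for a hypothetical sticky bar injected by a variation's JavaScript might look like the following. The class names are assumptions; match them to whatever your code injects. The media query illustrates the responsive-adjustment pattern by limiting the bar to mobile widths.

```css
/* Hypothetical class names — match them to your variation's JS. */
.elevate-sticky-bar {
  position: fixed;
  bottom: 0;
  left: 0;
  right: 0;
  padding: 12px 16px;
  background: #fff;
  box-shadow: 0 -2px 8px rgba(0, 0, 0, 0.15);
  z-index: 9999;
}

/* Responsive adjustment: hide the bar on wider screens so it stays mobile-only. */
@media (min-width: 768px) {
  .elevate-sticky-bar {
    display: none;
  }
}
```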
Page Targeting (Pathnames)
Specify which pages your code should run on:
* — All pages (default)
Specific paths — e.g., /products/* for all product pages, /collections/sale for a specific collection
Exclude paths — Specify pages where the code should NOT run, even if they match the include pattern
Important: Be precise with your pathnames. Running code on unintended pages can cause unexpected behavior. If your code is meant for product pages only, use /products/* instead of *.
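Elevate evaluates these patterns for you, but if you want to sanity-check which pages a pattern would cover, a rough glob-style matcher looks like the sketch below. This is an assumption about how the * wildcard behaves, not Elevate's actual implementation.

```javascript
// Illustrative matcher: treats "*" as "match anything", everything else literally.
function matchesPath(pattern, pathname) {
  var re = new RegExp('^' + pattern.split('*').map(function (part) {
    // Escape regex metacharacters in the literal segments.
    return part.replace(/[.+?^${}()|[\]\\]/g, '\\$&');
  }).join('.*') + '$');
  return re.test(pathname);
}

matchesPath('/products/*', '/products/blue-shirt'); // true
matchesPath('/products/*', '/collections/sale');    // false
```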
Each variation has its own independent code and pathname configuration — you can test completely different approaches in each variation.
Step 4: Set Traffic Allocation
Set how traffic is split between your variations. The default is an even split. Adjust as needed — percentages must add up to 100%.
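As a quick illustration of the constraint (the Elevate UI enforces this for you), a valid split is simply one whose percentages total 100:

```javascript
// Illustrative check: allocation percentages across all variations must total 100.
function isValidSplit(percentages) {
  return percentages.reduce(function (sum, p) { return sum + p; }, 0) === 100;
}

isValidSplit([50, 50]);     // true  — default even split for control + 1 variation
isValidSplit([40, 40, 30]); // false — totals 110
```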
Step 5: Choose Your Experiment Goal
Select the primary metric for determining a winner:
Revenue Per Visitor — best for understanding overall revenue impact
Conversion Rate — best for measuring purchase likelihood
Average Order Value — best if your code affects cart behavior
Add-to-Cart Rate — best for changes that target the purchase decision
Checkout Start Rate — best for mid-funnel changes
Step 6: Add Audience Targeting (Optional)
Narrow your audience if needed — by device, location, traffic source, visitor type, or UTM parameters.
For the full list of targeting options, see Audience Targeting.
Step 7: Launch
Click Create Experiment to submit. The experiment moves into review status. Once you've completed QA, click Launch Experiment to go live.
Quality Assurance
Custom code experiments require extra QA attention since you're injecting code into your live store:
Visit the targeted pages — Confirm your code is executing and the changes are visible
Open the browser console — Check for any JavaScript errors caused by your code
Test on mobile and desktop — Responsive issues are the most common problem with custom code
Test page interactions — Click buttons, scroll, navigate between pages — make sure your code doesn't break existing functionality
Check page load speed — Compare load times between the control and variation to ensure your code isn't causing a noticeable slowdown
Test with different browsers — Chrome, Safari, Firefox at minimum
Verify tracking — Check the Raw Data tab to confirm events are recording correctly
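For the page-load-speed check, one option is to read the browser's standard Navigation Timing API from the console on both the control and the variation and compare the numbers. The API calls are standard browser APIs; the small formatting helper is our own.

```javascript
// Format the two headline numbers from a PerformanceNavigationTiming entry.
function formatNavTiming(nav) {
  return 'DOMContentLoaded ' + Math.round(nav.domContentLoadedEventEnd) +
         ' ms, load ' + Math.round(nav.loadEventEnd) + ' ms';
}

// Browser-only: paste into the console on a targeted page. The guard lets this
// file load safely in environments without navigation timing entries.
if (typeof performance !== 'undefined' && performance.getEntriesByType) {
  var nav = performance.getEntriesByType('navigation')[0];
  if (nav) console.log(formatNavTiming(nav));
}
```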
After Launching
Monitor the first few hours closely — Since you're running custom code, watch for any issues that might not have surfaced during QA
Check results — Use the Results tab to track performance
Wait for significance — Let the experiment run until it reaches a definitive status. See Statistical Significance.
If the variation wins — The code will need to be permanently implemented in your theme. You can either add it to your theme's code directly or work with a developer to integrate it properly.
End the experiment — When ready, see Ending an Experiment