Persosa supports A/B testing a personalized experience against another experience, or your site's default experience, to see how it performs against a control group. Before enabling any tests, be sure to read the Best Practices section below.

To create a test, click on the Experiments tab of the Personalize section in the Admin interface. You will see a list of all your currently running and completed experiments. You can click Create Experiment to start a new one.

You'll be prompted to give the new experiment a name. Each experiment must have a unique name, so we recommend including a date that denotes when the experiment was created, e.g. "My Experiment - 2018-06-01".
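If you generate experiment names programmatically (for example, via an automation script), a date suffix keeps them unique. This is only an illustrative sketch of the naming convention; the name itself is whatever you type into the Admin interface:

```python
from datetime import date

def experiment_name(base: str) -> str:
    """Build a date-stamped experiment name, e.g. 'My Experiment - 2018-06-01'."""
    return f"{base} - {date.today().isoformat()}"
```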

You will also need to select the Champion Experience and the Challenger Experience (choose (none) to test against your default website experience), as well as the percentage of traffic that should be shown the Challenger Experience.

This experiment won't become active until you Publish all changes to your live site.

How the experiments work

When a visitor comes to your site and qualifies for a Champion experience that's being split-tested, the percentage of visitors you selected will see the Challenger experience, while the rest will see the Champion experience.

Once a visitor is assigned to one of those two groups, they will continue to see that experience on every subsequent visit until the experiment is turned off.

NOTE: Only visitors that qualify for a Champion experience will fall into the experiment. So, even if your site has 20k sessions in a month, if none of them meet the segment conditions to fall into the split-tested experience, none of them will be in the experiment. If a visitor qualifies for one of the Challenger experiences but not the Champion, they will not be placed into the experiment.
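The behavior described above (qualification gating, a fixed traffic split, and sticky assignment across visits) can be sketched with deterministic bucketing. This is not Persosa's actual implementation, just a minimal illustration of the assignment logic; the function name and parameters are hypothetical:

```python
import hashlib
from typing import Optional

def assign_group(visitor_id: str, experiment_name: str,
                 challenger_pct: float,
                 qualifies_for_champion: bool) -> Optional[str]:
    """Illustrative sketch, not Persosa's real logic.

    Returns "champion", "challenger", or None if the visitor never
    enters the experiment. Hashing the visitor ID makes the assignment
    deterministic, so the same visitor lands in the same group on
    every visit.
    """
    # Only visitors who qualify for the Champion experience enter the experiment.
    if not qualifies_for_champion:
        return None
    # Hash visitor + experiment to a stable value in [0, 1).
    digest = hashlib.sha256(f"{experiment_name}:{visitor_id}".encode()).hexdigest()
    bucket = int(digest[:8], 16) / 0x100000000
    return "challenger" if bucket < challenger_pct else "champion"
```

Because the bucket is derived from a hash rather than a random draw, no per-visitor state is needed to keep the assignment consistent between visits.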

How can I monitor the test?

To see the results of your test, be sure to activate our Google Analytics Integration. Persosa will set the "Persosa Experience" custom dimension to the following: [Experiment Name] | [Experience Served].

So, using our example above, a portion of your traffic will have a value of "My Experiment - 2018-06-01 | Experience A" and the other portion will have "My Experiment - 2018-06-01 | (none)".
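If you export your analytics data and want to split that dimension value back into its parts, the documented format ("[Experiment Name] | [Experience Served]") is easy to build and parse. The helper names below are hypothetical, for illustration only:

```python
def experiment_dimension(experiment_name: str, experience_served: str) -> str:
    """Build the documented value: [Experiment Name] | [Experience Served]."""
    return f"{experiment_name} | {experience_served}"

def parse_dimension(value: str) -> tuple:
    """Split a dimension value back into (experiment name, experience served).

    rpartition splits on the LAST ' | ', so an experiment name that
    itself contains ' | ' is still handled correctly.
    """
    name, _, experience = value.rpartition(" | ")
    return name, experience
```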

Utilizing custom reports and/or dimension drill-downs, you can evaluate performance for each experiment and see how experiences perform compared to your default experience.

Best Practices

Due to potential conflicts between tests, Persosa experiments should not be run at the same time as paid advertising tests (e.g., split testing on Facebook). We recommend running tests on your paid platforms first to determine which marketing campaigns attract the most traffic.

Once you've optimized your paid marketing campaigns, run Persosa experiments to refine the optimal experiences to show to your visitors.
