At TravelTriangle, we are obsessed with the idea of empowering travellers with the best experience throughout their vacation. Our product development team works continuously to improve the user experience, which in turn improves key business metrics.
Their requirements take different forms:
- Redesigning a page to increase retention by improving the user experience
- Adding a brand new feature
- Enhancing an existing feature to increase product usage
All these changes are made with one thing in mind: to make our product more valuable for the customer.
However, before changing anything, we have to answer the following questions:
- Will customers like this new change?
- What if, instead of enhancing the user experience, the change degrades it?
Working in the travel industry, there are many external factors, such as climate, market conditions and so on, that influence traveller behaviour, for better or worse. It therefore became important to isolate these external factors to make product decisions effective.
To solve this problem, we experiment with different variations of a feature using A/B testing.
A/B testing, also known as split testing, is an experiment in which two (or more) variants of a webpage are shown to users at random to determine which variant is better accepted. Once a winning variant is found, it is released to all users.
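The core idea can be sketched in a few lines of plain JavaScript: assign each visitor at random to one of two variants, record conversions per variant, and compare conversion rates at the end. The numbers and names below are illustrative only, not from any specific tool.

```javascript
// Randomly bucket a visitor into 'control' or 'variation'.
// `rng` is injectable so the assignment can be made deterministic in tests.
function assignVariant(rng = Math.random) {
  return rng() < 0.5 ? 'control' : 'variation';
}

// Conversion rate for one variant's bookkeeping record.
function conversionRate({ visitors, conversions }) {
  return visitors === 0 ? 0 : conversions / visitors;
}

// Example bookkeeping for a finished experiment (made-up numbers):
const results = {
  control:   { visitors: 1000, conversions: 50 },
  variation: { visitors: 1000, conversions: 65 },
};

const winner =
  conversionRate(results.variation) > conversionRate(results.control)
    ? 'variation'
    : 'control';
console.log(winner); // → 'variation'
```

Real platforms such as VWO add statistical significance checks on top of this raw comparison before declaring a winner.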
A/B testing is extremely important in the world of user acquisition. A good article by Jason Fried can be found here.
There are many A/B experimentation platforms available that provide a complete feature set for A/B testing. At TravelTriangle, we use VWO, which gives us the flexibility to set up, implement and analyze experiments quickly.
In our use case, the requirements were the following:
- It should be easy to create multiple variations of a feature
- Once a user is shown a variation (excluding the control), the user should always see the same variation
- Traffic distribution should be handled by the experimentation framework and should support various distribution mechanisms
- Complex experiments should also be possible with the same setup, without codebase changes or a release
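The second and third requirements are commonly met with sticky bucketing: hash a stable user id into a bucket in [0, 100) and map the bucket onto configured traffic weights, so the same user always lands in the same variation. The hash and weight shape below are an illustrative sketch, not what VWO actually does internally.

```javascript
// Map a stable user id to a bucket in [0, 100) with a simple rolling hash.
// Same id in → same bucket out, so assignment is sticky across visits.
function bucketOf(userId) {
  let h = 0;
  for (const ch of String(userId)) {
    h = (h * 31 + ch.charCodeAt(0)) >>> 0; // keep it a 32-bit unsigned int
  }
  return h % 100;
}

// Walk the cumulative weights until the bucket falls inside a range.
// `weights` is e.g. { control: 50, variation: 50 } (percentages).
function variantFor(userId, weights) {
  const bucket = bucketOf(userId);
  let edge = 0;
  for (const [name, weight] of Object.entries(weights)) {
    edge += weight;
    if (bucket < edge) return name;
  }
  return 'control'; // weights summed below 100: fall back to control
}
```

Changing the traffic split only changes the `weights` config, which is exactly the kind of knob an external framework can own instead of the codebase.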
VWO was extremely helpful. However, as we moved our frontend from Rails and jQuery to React (creating an SPA with server-side rendering), we were no longer able to run any experiments with it.
One common issue is that UI components are re-rendered every time the state changes, so any DOM changes made by the A/B testing tool are wiped out by React.
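A tiny simulation shows why, with no real React involved: if a render function rebuilds its output from state every time, any edit made directly on the previous output (which is what visual A/B editors do) disappears on the next render.

```javascript
// Stand-in for a React component: output is rebuilt from state on each call.
function render(state) {
  return { text: `Contact: ${state.phone}` };
}

let node = render({ phone: '111' });

// A visual A/B editor mutates the rendered output directly...
node.text = 'Contact: 111 (edited by the A/B tool)';

// ...but the next state change triggers a re-render, rebuilding from scratch.
node = render({ phone: '222' });
console.log(node.text); // → 'Contact: 222' — the tool's edit is gone
```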
There are many React A/B testing modules available on npm.
However, every single one requires us to keep all variants in our codebase. Every time we need to change the experiment, we need to update the code, and then do a release.
A development effort followed by a release should not be required for most experiments, since most of them are fail-fast experiments.
To resolve this, we created an npm module, dynamic-react-ab. It allows the user to define and code a feature as a plain JS file and inject it using any third-party tool such as Google Tag Manager, VWO, Optimizely or AB Tasty. After the experiment is complete, we can implement the winning variant with a better understanding of the problem and almost complete requirements. It also requires no clean-up of losing variants, since they never enter the codebase.
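As a hedged sketch, a "variation as a plain JS file" could look like the snippet below: the injected script registers itself on a global registry that the page component looks up after mounting. The registry name `__dynamicReactAb` and the `apply()` shape are assumptions for illustration; check the dynamic-react-ab README for the module's actual contract.

```javascript
// Injected by the third-party tool (VWO, GTM, ...) only for users bucketed
// into this variation; the application bundle itself never changes.
globalThis.__dynamicReactAb = globalThis.__dynamicReactAb || { variations: {} };

globalThis.__dynamicReactAb.variations['edit-contact-v1'] = {
  // Hypothetical hook, called by the host page after React has mounted,
  // so the change is not wiped out by a later re-render.
  apply(pageState) {
    return { ...pageState, editContactEnabled: true };
  },
};
```

Because the file lives in the tag manager or A/B tool, editing or deleting the experiment never touches the repository.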
Using dynamic-react-ab:
Let’s take a scenario:
Product manager: We need to add edit-contact functionality on the xyz page.
Product manager: However… (here comes the real problem…) We have two variants to test to see which suits users best.
Me: How do you want traffic to be distributed between the variants?
Product manager: We have not closed on this part yet. When can we have this form? We can decide about traffic later…
Me (in my mind): As always, the requirements are incomplete or too volatile… 🙁
- The requirement is not clear and is likely to change.
- Traffic allocation is unknown, and you cannot hard-code it (changing it would be cumbersome).
- There may be more variants… who knows.
This is exactly where dynamic-react-ab helps:
- We can delegate traffic allocation and the variant decision to a third-party A/B framework (such as VWO, Optimizely or AB Tasty)
- Since we do not change our source code, no release is required
- Changes are reflected almost immediately
- We have the flexibility to update the experiment at any time with minimal dev effort
- The source code stays clean, as no experiment code lives there
With all the above points in mind, let's walk through an example that adds a single variation to our React page via VWO.
- The skeleton app is created using create-react-app
- VWO is the A/B testing framework; basic knowledge of this tool is assumed
- VWO works only with publicly accessible websites. We are using ngrok to expose a locally deployed React application. If your application is already deployed with a public URL, you can skip this. To learn about ngrok, go here.
Step 1: Create a VWO account
Step 2: Add the website URL we want to track
Step 3: Set up a goal to track
Step 4: Add the install code inside the head tag of your website
Step 5: VWO works only on publicly accessible sites; it will not work on localhost. I am using ngrok. Copy the URL for which the experiment needs to run.
Step 6: VWO requires a hypothesis for each experiment. Create one for your experiment.
Step 7: Clicking Create takes you to the live editor. We won't use this feature, as any changes made in the editor would be overwritten by a React re-render.
Step 8: Add the JS code for the variation.
Step 9: We already have sample code ready for this here. Copy it.
Step 10: We can define the traffic distribution using the advanced options.
Step 11: Finally, we can start our experiment by clicking Start Now.
Step 12: Add dynamic-react-ab to your project using:
npm install --save dynamic-react-ab
Step 13: To use dynamic-react-ab, you just need to import it and add it to your page component (preferably at the bottom).
That's all you need to change in your codebase. All experiment-related code can live inside your A/B testing framework: VWO, Google Tag Manager, etc.
Step 14: Start your application; you should see an Edit Contact Detail button.
Step 15: Clicking it opens an edit-contact modal.
Step 16: Make some changes. A setTimeout is used to fake the POST call, after which a success message is shown.
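The fake save in the last step can be sketched with a `setTimeout` wrapped in a Promise, which stands in for a real POST request; the function and message names below are illustrative, not taken from the sample code.

```javascript
// Fake network call: resolves after `delayMs` as if the POST succeeded.
function fakeSaveContact(contact, delayMs = 300) {
  return new Promise((resolve) => {
    setTimeout(() => resolve({ ok: true, saved: contact }), delayMs);
  });
}

// Submit handler: await the fake call, then flip UI state to show success.
async function onSubmit(contact, showMessage) {
  const res = await fakeSaveContact(contact, 10);
  if (res.ok) showMessage('Contact details updated successfully');
}
```

Swapping `fakeSaveContact` for a real `fetch` call later requires no change to the submit flow.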
It becomes a matter of hours to analyze, divide into subtasks, build the edit flow (open, edit, success message), code and release your experiment, all without touching the main codebase. If the experiment does not go well, you can simply delete it from VWO, and that's all.
For a successful experiment, we come away with a better understanding of the problem and almost complete requirements (ahh… they are never complete… :P). This helps when implementing the feature in our main application.
An experiment-first approach to product design and development allows product and development teams to focus on the right things and iterate quickly using a fail-fast approach.
The main purpose of creating this module for our team is simple: no changes to the codebase are needed to run an experiment. This lets product and marketing teams launch tests in production faster and without involving engineers.