Kintura ML

Why does Kintura's AI outperform all of its competitors?


Machine Learning and Traffic Distribution


It's more than Split Testing. Kintura is the only tracker using anything close to AI.

Other trackers use "AI" to mean simply applying Bayesian statistics to decide when a lander or offer's performance has crossed a threshold of statistical significance. Much like binomial confidence intervals, Bayesian statistics can give you some indication that Landing Page "A" has been given a fair shot to outperform Landing Page "B" but has not. On that basis, tracking platforms touting AI (it's insulting, really) will move the weight slider a little to the left to draw traffic away from the lander that isn't performing. These "Blanket Adjustments" treat all traffic the same. All information about why any given customer converted on Landing Page "A" is discarded; or more importantly, it was never taken into consideration in the first place.
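To make the "blanket adjustment" approach concrete, here is a minimal, purely illustrative sketch of Bayesian A/B comparison with Beta posteriors. The numbers, threshold, and weights are assumptions for demonstration, not any tracker's actual implementation; note that the final weight is the same for every visitor, regardless of who they are.

```python
import random

random.seed(0)

def prob_a_beats_b(conv_a, clicks_a, conv_b, clicks_b, samples=100_000):
    """Monte Carlo estimate of P(rate_A > rate_B) under Beta(1,1) priors."""
    wins = 0
    for _ in range(samples):
        a = random.betavariate(1 + conv_a, 1 + clicks_a - conv_a)
        b = random.betavariate(1 + conv_b, 1 + clicks_b - conv_b)
        if a > b:
            wins += 1
    return wins / samples

# Illustrative data: Lander A converts 40/1000 clicks, Lander B 25/1000.
p = prob_a_beats_b(40, 1000, 25, 1000)
print(f"P(A beats B) ~= {p:.2f}")

# The "blanket adjustment": once p crosses a significance threshold,
# shift the weight slider -- the same split for every single visitor.
weight_a = 0.5 if 0.05 < p < 0.95 else (0.9 if p >= 0.95 else 0.1)
print(f"Traffic weight for A: {weight_a}")
```

Every click gets the same 90/10 split; nothing about the individual visitor enters the decision.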

This is wrong. This is the old way. This is not "AI" at all. Traffic is too expensive to do it this way. But alas...


Conversely, if we're going to throw around terms like "AI," we should be utilizing Deep Learning to determine what your traffic wants on a click-by-click basis.

By default, when you switch on "Kintura ML" in a campaign, you're telling Kintura to train a real-time, online model of what your users are showing a preference for, on a per-user basis. This is closer to what Facebook does with the Facebook Pixel. Every time there's a conversion, Kintura adds it to that campaign's model. It looks at various traits (everything from the user's browser version, to the creative you showed them at the source, to the time of day in their time zone) and looks for statistically significant features utilizing our own blend of Random Forest and Decision Tree algorithms. Now, when the next user comes along, we can figure out whether they're more likely a "Landing Page A" or a "Landing Page B" user by comparing their features against the model (in this case, the Landing Page Model). We also keep an Offer Model, which *includes* whichever lander they just viewed.
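A hypothetical, heavily simplified sketch of the idea of per-click routing (not Kintura's actual algorithm): record conversions per feature value per lander, then score each lander for an incoming click against that visitor's own features. The feature names and training data below are invented for illustration.

```python
from collections import defaultdict

class LanderModel:
    """Toy online model: per-(feature, value) conversion counts per lander."""

    def __init__(self, landers):
        self.landers = landers
        self.clicks = defaultdict(lambda: defaultdict(int))
        self.convs = defaultdict(lambda: defaultdict(int))

    def record(self, features, lander, converted):
        """Update counts for every (feature, value) pair of this visitor."""
        for key in features.items():
            self.clicks[key][lander] += 1
            if converted:
                self.convs[key][lander] += 1

    def choose(self, features):
        """Route the click to the lander with the best smoothed score."""
        def score(lander):
            total = 0.0
            for key in features.items():
                c = self.convs[key][lander]
                n = self.clicks[key][lander]
                total += (c + 1) / (n + 2)  # Laplace smoothing for cold start
            return total
        return max(self.landers, key=score)

model = LanderModel(["A", "B"])
# Illustrative history: mobile/evening traffic converts on A, desktop on B.
for _ in range(50):
    model.record({"device": "mobile", "hour": "evening"}, "A", converted=True)
    model.record({"device": "mobile", "hour": "evening"}, "B", converted=False)
    model.record({"device": "desktop", "hour": "morning"}, "B", converted=True)
    model.record({"device": "desktop", "hour": "morning"}, "A", converted=False)

print(model.choose({"device": "mobile", "hour": "evening"}))
print(model.choose({"device": "desktop", "hour": "morning"}))
```

Unlike a blanket weight adjustment, two different visitors arriving at the same moment can be routed to different landers based on their own traits.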

Humans are very, very complex creatures.

Let's start by clarifying the minimum requirements for getting the most from Kintura ML.

  1. You have more than one Landing Page or Offer in a Default Route attached to your campaign
  2. You have at least two Landing Pages or Offers and they are of similar vertical and price resistance (important, because science)
  3. You understand what price resistance is and that you cannot conduct an experiment between a $450,000 Ferrari and a $5 Dating Membership
  4. You have a basic understanding of the Scientific Method meaning that you are somewhat aware when you're introducing significant bias into an experiment

TL;DR: create a single default route, attach it to your campaign, and add some variants of Landing Pages and/or Offers. Change headings, opening paragraphs, background colors, images, etc. What we want you to do is find landers or offers that really push the limits of variation in terms of flavor and message.

Turn on the Kintura ML switch and click Save.

Keep an eye on your Distribution Report!

This report gives you a complete, beautiful breakdown of why we sent each click where we did. Were we training the model? Did the user revisit the campaign and get sent to the next offer in the list? Did you use something like /out/1 or /out/2 on your lander? That's a Hard-1 or a Hard-2. While you're there, also check the Entries report, which will show you how much money you're throwing away on repeat impressions and clicks; or perhaps that you need to get this ad in front of users as much as possible (see the Rule of 7 in marketing).

If you have multiple Landing Pages

We understand that some of you are used to using ambiguous yet detrimental landing offer links like /out/1 and /out/2. We advise against this because it violates the Scientific Method: because you will inevitably present a user with Offer #1 before or after Offer #2, you've introduced catastrophic bias. Remember this bias when analyzing Offer performance.

There is no recovery from this bias. A much better method is to simply use /out on your landing page and have the offers routed at random, eliminating the bias.
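The random "/out" routing described above can be sketched in a few lines. This is a minimal illustration with invented offer names, showing that uniform random selection gives each offer an equal, order-independent share of clicks.

```python
import random

# Hypothetical offer list behind a single /out endpoint.
OFFERS = ["offer-1", "offer-2", "offer-3"]

def route_out():
    """Pick an offer uniformly at random -- offer order cannot bias results."""
    return random.choice(OFFERS)

random.seed(42)
counts = {o: 0 for o in OFFERS}
for _ in range(9000):
    counts[route_out()] += 1
print(counts)  # each offer lands near 3000 clicks
```

Because every click has the same chance of seeing every offer, differences in conversion rate reflect the offers themselves rather than their position in the list.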

When you utilize Kintura ML, this is done for you. Moreover, Kintura ML will train its distribution model in real time, based on how your traffic responds across a dozen key attributes.

So yes, at this time, Kintura is the only tracker that offers true AI. This is the same method Facebook uses with its Conversion Pixel. We're happy to share more about it if other trackers want to join us. For now, though, they seem pretty impressed by moving sliders around.