Why Uber & Lyft leaving Austin is just a real-world A/B test

Uber and Lyft’s temporary departure from Austin earlier this year was quite simply a real-world A/B test. Never mind Prop 1, driver fingerprinting and passenger safety. If Uber and Lyft were serious about these issues, they would have complied, rather than left the city.
By temporarily leaving a large and growing tech metropolis without the dominant ride-sharing duopoly, they've been able to test how quickly competing ride-share services, including traditional taxis, could mobilize and thrive.
And mobilize they did. Fasten, Getme, zTrip, Fare and Wingz, to name a few, were quick off the mark to launch or beef up their apps and offer discount codes to drive customer acquisition. The quality and reliability of these services and apps initially varied greatly (spurious arrival forecasts, crashing apps, stationary car avatars). After a few months of updates and steady customer and driver acquisition, the platforms rapidly evolved closer to the familiar Uber experience (reliable app, fast customer service).

Driver and rider feedback

Monday morning May 8th, 10,000 drivers awoke to learn Uber and Lyft had delivered on their threat to leave Austin after losing the Prop 1 vote. Slowly, riders and drivers adopted the new platforms. My anecdotal feedback from talking to about 30 ex-Uber/Lyft drivers in Austin has been very mixed. Some said they would instantly go back to Uber (great incentives and bonuses). Others claimed they would never deal with the companies again after the recent debacle. That said, picking which service to drive for à la carte is now common (much like what riders already do). Drivers can keep all the ride-share apps open and take whichever service delivers the next best fare.

Real-world A/B testing

Real-world A/B testing is more difficult to implement and measure accurately than its online counterpart. Online, A/B testing is cheap and easy to set up with accessible tools such as Google Analytics, Optimizely and Unbounce.
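Part of why online testing is so cheap is that the underlying statistics are simple. As a minimal sketch (with made-up numbers; the tools above handle sample sizing and significance for you), here is a standard two-proportion z-test in plain Python:

```python
import math

def ab_test_z(conversions_a, visitors_a, conversions_b, visitors_b):
    """Two-proportion z-test: is variant B's conversion rate
    significantly different from variant A's?"""
    p_a = conversions_a / visitors_a
    p_b = conversions_b / visitors_b
    # Pooled conversion rate under the null hypothesis (no difference)
    p_pool = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the normal distribution
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return z, p_value

# Hypothetical example: 2.0% vs 2.6% conversion on 10,000 visitors each
z, p = ab_test_z(200, 10_000, 260, 10_000)
print(f"z = {z:.2f}, p = {p:.4f}")  # roughly z ≈ 2.8, p < 0.01
```

With a p-value below the usual 0.05 threshold, you would call variant B the winner; the hard part offline is getting clean, comparable "visitors" and "conversions" at all.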

Direct mail has utilized A/B testing for many years; David Ogilvy's Confessions of an Advertising Man (1963) describes split-testing his direct-mail campaigns. FMCG companies commonly test products at the city and country level, and supermarkets (who enjoy a wealth of data) do it at the store level. It can be harder to do with physical products, depending on your business's resources and reach, but a simpler broad-preference test can work, such as "Daily Special" and "Limited Edition" product runs like chip flavors and special-edition burgers.

Re-enter Uber and Lyft

So what have Uber and Lyft learned from this A/B test? They've seen that similar apps and companies can quickly sprout up and deliver a comparable service. They've seen DUI arrests increase. However, they have not seen the taxi industry spring back to life despite the golden opportunity created by the vacuum.

Uber and Lyft will inevitably re-enter the Austin market soon. The subsequent test will be whether these smaller ride-share platforms survive. Although clunky at first, I've had great service riding with both Fasten and Getme. Even so, I will certainly jump back on the Uber and Lyft bandwagon when they return. It seems we are all suckers for big brands and their supposed reliability.

Takeaways (tl;dr)

  • Real-world A/B testing is more difficult to implement and measure accurately than online.
  • If you’re not A/B testing your campaigns and product mixes, you need to be.
  • There are many cost-effective tools available, without having to invest in a full marketing automation platform.