It was a chilly March morning. Inside a neon-hued production studio, a group of elite cyclists clipped into their stationary racing bikes. They’d met in competition before. They knew when to sprint, when to dive through a gap in the pack, when to stay on the wheels of another athlete, and how to manage headwinds in the final kilometers. Ahead of them was the digital starting line of a punishing race, up mountains and around narrow, unforgiving bends in the road. Just staying in the peloton and drafting would be a test of their sustained power and endurance.
It was the start of the UK’s first national cycling esports championship, held on the Zwift indoor cycling platform in March 2019. A Peloton or SoulCycle rider would have as much success competing in a Zwift championship race as they would in the Tour de France. Even elite riders would find themselves challenged: These athletes had to train their bodies and condition their muscles, but they also had to learn the nuances of the video game, like when to use power-ups, temporary advantages that boost strength and stamina.
As the starting line’s red pixels faded away, the athletes began pedaling hard—pushing 400 or 500 watts—while their digital avatars followed suit. A small studio audience cheered the athletes on, clapping and calling out their team names.
One athlete in particular, elite cyclist and YouTube star Cameron Jeffers, was a clear leader throughout the race. Though all riders competed on nondescript stationary bikes, in the game Jeffers’ avatar rode a Concept Z1, which other Zwifters liked to call a “Tron bike” because it glowed like the futuristic light cycles from the 1982 movie. Technically, anyone had access to the Z1, but it had to be earned by completing a series of difficult side quests, like climbing a grueling virtual mountain range in the Swiss Alps within a set time limit. Bikes like the Z1 weren’t just colorful: they were lighter, more aerodynamic, gripped the road better, and offered more power.
In the final stages, which took riders through the bubbling lava fields of a volcano, Jeffers pulled ahead of the rest of the riders. The screen showed he was pushing 961 watts, a staggeringly high amount of power for the end of a long, hard race. Even the announcers were shocked. “How much power does this man have?” one shouted. Jeffers easily crossed the finish line, earning double points for his team and winning the tournament, which included a special British Cycling jersey and a cash prize.
Except that Jeffers didn’t win on his own. His upgraded Tron bike, which aided his performance that day, wasn’t acquired through brute force riding and completing side quests, as required by the game, but with a simulation program. In the weeks leading up to the competition, Jeffers used a bot to ride in the game for him, often hitting an inconceivable 2,000 watts for distances of more than 125 miles at a time. (At his peak, Lance Armstrong rode only 110 miles a day during training.)
Simulation programs like the one Jeffers used are designed and programmed to complete tasks within defined parameters. In this case, the bot was built to simulate riding the stationary bike and trick the platform into believing that Jeffers was actually doing the work. The program logged in from multiple locations, simulating Jeffers’ avatar riding side quests in order to earn credits in the game. Data from the rides were immediately deleted in order to avoid detection, but the credits earned stayed in Jeffers’ account. Not long after the tournament ended, Zwift detected the bot in its system.
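To make the mechanics concrete, here is a minimal sketch of what such a bot does at its core: fabricate a stream of plausible-looking telemetry at a chosen wattage. Everything here is hypothetical for illustration (the function names and data fields are invented, not Zwift’s actual interface), but it shows how little code it takes to impersonate a ride.

```python
import random

def fabricate_ride(distance_miles: int, target_watts: int) -> list:
    """Generate fake per-mile telemetry at a constant, superhuman wattage.

    Purely illustrative: a real bot would speak whatever protocol the
    platform's connected trainers use; this just shows the shape of the data.
    """
    ride = []
    for mile in range(distance_miles):
        ride.append({
            "mile": mile + 1,
            # Jitter the wattage slightly so it looks less machine-perfect
            "watts": target_watts + random.randint(-25, 25),
            "cadence_rpm": random.randint(85, 95),
        })
    return ride

# A 125-mile "ride" at roughly 2,000 watts -- the kind of output no
# human could sustain, which is ultimately what gave the scheme away.
telemetry = fabricate_ride(distance_miles=125, target_watts=2000)
print(len(telemetry), telemetry[0]["watts"])
```

The giveaway, as Zwift’s detection suggests, is exactly this implausibility: sustained power far beyond human physiology, produced with suspicious regularity.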
Jeffers’ ploy marked the first reported instance of robo-doping: using performance-enhancing algorithms to gain a competitive advantage in esports competition. Esports is a fast-growing field, and that growth has only accelerated in the midst of Covid-19. Zwift is one of many hybrid digital-physical platforms; similar ones exist for sports like rowing and running. Even longtime race promoter [Ironman is going digital](https://www.ironmanvirtualclub.com/): It launched a virtual race platform, where athletes earn points for their achievements and can compete in live races. But soon, sophisticated algorithmic enhancements in these competitions will make Lance Armstrong's dalliances with blood transfusions and hormones look blunt, rudimentary, naive.
Doping in sports is a well-known issue—every professional league has clear regulations and punishments when the rules are broken. But in digital competitions, there’s no guideline or rule against the use of simulation programs to enhance performance. While esports associations ban certain substances, there is no international standard for robo-doping. It’s a murky area today, and that portends thorny problems on the horizon.
The factors that conspired to enable robo-doping that March day—fast internet speeds, cloud-based platforms, connected devices, algorithms, automation systems, bots—also power our day-to-day lives. Outside of esports, we’ve already seen how easily algorithms can be manipulated. Residents have reported fake traffic accidents on Waze to steer drivers away from their neighborhoods. At Reagan National Airport near Washington, DC, Uber and Lyft drivers have simultaneously gone offline for a few minutes to trick the platforms into thinking no drivers were available, resulting in surge pricing. Last year, a deepfake video of Facebook CEO Mark Zuckerberg saying “Whoever controls the data controls the future” went viral on the night of Congressional hearings. Robo-doping is just the latest iteration of these virtual manipulations—and it’s not difficult to imagine how similar tactics might impact workplaces and schools next.
For instance, board certification in anesthesiology, one of medicine’s most demanding specialties, used to require in-person simulations, but there’s now an option to complete this portion virtually. In the digital version, residents treat computer-generated patients that have all manner of ailments, then monitor and make adjustments, just as they would for a human patient. While offering this virtual option may be more efficient, it also opens up the possibility of a new kind of Jeffers-inspired cheating. If a resident had run a bot to simulate the required number of practice sessions, it’s unlikely you’d want to end up as their first real-world patient.
Or maybe the company you work for decides to make its Covid-era work-from-home policy permanent. In exchange for being able to work anywhere, employees have agreed to be continually monitored: A platform tracks how often you use various apps and cloud-based processes, when you talk to other team members, and how quickly you’re able to complete tasks. Your performance plan is tied to these metrics.
You feel like you’re making positive contributions to your team, but one coworker seems to somehow outperform and outpace everyone. The joke on Slack is that this person must have a clone, because they’re constantly online, always working, and their metrics are amazing. But that’s not out of the question—your colleague may actually have a clone in the form of bots that log in and out of the apps and tools being monitored.
Remember when a handful of wealthy parents were discovered gaming the admissions system to get their kids into college? Some bought fake IDs and hired smart twentysomethings to sit for entrance exams. But in the not-so-distant future, a similar scheme might involve a simulation program that takes millions of PSAT exams online until it learns the patterns and delivers a set of questions highly likely to appear on the real exam—answers a student could then memorize in advance.
Simulation programs can be wonderful tools for training employees, enhancing our productivity, and freeing us from monotonous, repetitive tasks. But they’re also vulnerable to manipulation. And while threats to the integrity of esports are concerning, the possibility of robo-doping spilling outside of sports, where the people we may someday depend on decide to game the system rather than put in the work, is terrifying. Many professions increasingly rely on simulation programs because they’re efficient, cheap, and more objective. We ought to think about downstream risk now, while we still have a chance to intervene. It’s one thing to robo-dope in a nascent esports competition—it’s quite another to simulate practice hours ahead of surgery, or a murder trial, or a long flight in a cockpit full of buttons.