Robots are getting better at tasks that require a blend of speed, precision, and unpredictability. A new wave of physical AI systems is stepping out of simulated environments and into the messy, demanding real world. Two recent milestones, one from Sony AI and another from a Beijing half marathon, illustrate just how far machines have come in mastering complex physical challenges.
Sony AI’s autonomous table tennis robot, named Ace, has officially beaten high-level human players in regulated matches. According to Reuters, the system competed under International Table Tennis Federation rules, officiated by licensed umpires. In trials beginning in April 2025, Ace won three of five matches against elite players and lost two against professionals. Later bouts, in December 2025 and early 2026, saw it topple professional opponents too.
The Physics Problem That Chess Never Had
Why is table tennis so hard for AI? Unlike chess or video games, where every variable is controlled, real table tennis involves a ball moving at high speed with complex spin and changing trajectories. Peter Dürr, director at Sony AI Zurich and lead of the project, put it bluntly: “Unlike computer games, where prior AI systems surpass human experts, physical and real-time sports like table tennis remain a major open challenge.” The sport demands rapid sensing and coordinated movement under tight time constraints, a far cry from turn-based logic.
Ace’s architecture reads like a sci-fi spec sheet. The robot uses nine synchronised cameras and three vision systems to track the ball’s movement and spin. Dürr claims the system processes visual data fast enough to “capture motion that would be a blur to the human eye.” That speed is critical, because a spinning ball can change direction in milliseconds, and a human player’s split-second decision is now matched by silicon.
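Sony has not published Ace's tracking code, so the following is only an illustrative sketch of the kind of computation a multi-camera pipeline like this must perform: turn timestamped 3D ball observations into a velocity estimate, then predict where the ball will cross the robot's strike plane. The gravity-only flight model (no spin, no drag) and all numbers are assumptions for illustration.

```python
# Illustrative sketch, not Sony's method: predict where a tracked ball
# will cross the robot's strike plane, assuming the cameras yield
# timestamped 3D positions and ignoring spin and air drag.

GRAVITY = -9.81  # m/s^2, acting along the z axis


def estimate_velocity(p0, p1, dt):
    """Finite-difference velocity from two consecutive observations."""
    return tuple((b - a) / dt for a, b in zip(p0, p1))


def predict_crossing(pos, vel, strike_x):
    """Time and point at which the ball reaches the plane x = strike_x
    under a simple ballistic model (a deliberate simplification)."""
    t = (strike_x - pos[0]) / vel[0]
    y = pos[1] + vel[1] * t
    z = pos[2] + vel[2] * t + 0.5 * GRAVITY * t * t
    return t, (strike_x, y, z)


# Two observations 5 ms apart: ball flying toward the robot at x = 0.
p0 = (2.0, 0.10, 0.30)
p1 = (1.95, 0.101, 0.301)
vel = estimate_velocity(p0, p1, 0.005)  # roughly (-10.0, 0.2, 0.2) m/s
t, point = predict_crossing(p1, vel, strike_x=0.0)
```

At 5 ms between frames the ball here covers 5 cm, which is why the article's point about capturing "motion that would be a blur to the human eye" matters: the shorter the baseline between observations, the earlier the robot can commit to a stroke.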
Training in Simulated Worlds, Not Watching Humans
Most AI systems learn from human demonstrations. Ace did the opposite. It trained entirely in simulation, developing its own strategies and play patterns. Dürr says the robot “learns to play not from watching humans” but through self-training in virtual environments. The result is a playing style that feels alien. Professional player Mayuka Taira, who lost a match to Ace, noted the robot’s lack of visible cues. “Because you can’t read its reactions, it’s impossible to sense what kind of shots it dislikes or struggles with,” she said. Rui Takenaka, an elite player who both won and lost against Ace, added that the robot handled complex spins well but was more predictable on simpler serves.
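The shape of that idea can be sketched in a toy loop: an agent improves a policy parameter purely by trial and error against a simulator, never seeing human play. Everything here is hypothetical, including the hidden "ideal" paddle angle and the simple hill-climbing update standing in for whatever learning algorithm Sony actually used.

```python
# Toy sketch of simulation-only self-training (not Sony's algorithm):
# the agent never observes human demonstrations, only rewards from a
# simulated rally, and keeps any change that improves its returns.
import random


def simulate_rally(angle, rng):
    """Toy simulator: reward peaks when the paddle angle matches a
    hidden 'ideal' counter-spin angle, plus a little noise."""
    ideal = 0.42
    return -(angle - ideal) ** 2 + rng.gauss(0, 0.001)


def train(episodes=2000, step=0.05, seed=0):
    rng = random.Random(seed)
    angle = 0.0  # start with no prior knowledge of the task
    best = simulate_rally(angle, rng)
    for _ in range(episodes):
        candidate = angle + rng.uniform(-step, step)
        reward = simulate_rally(candidate, rng)
        if reward > best:  # keep only changes that improve returns
            angle, best = candidate, reward
    return angle


learned = train()  # ends up near the hidden ideal angle of 0.42
```

A policy trained this way optimises only for reward, not for human-legible style, which is one intuition for why opponents like Taira found Ace impossible to "read".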
The robotic platform uses eight joints to control its racket. Three joints handle positioning, two manage orientation, and three control shot force and speed. This configuration was designed to meet the bare minimum mechanical requirements for competitive play. It is not over-engineered. It is precisely engineered.
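The 3-2-3 split described above can be written down as a small data structure. The joint names and grouping below are hypothetical, invented for illustration; only the counts and roles come from the article.

```python
# Hypothetical sketch of the 8-joint split reported for Ace: three
# joints for racket position, two for orientation, three for shot
# force and speed. Joint names are invented, not Sony's spec.
from dataclasses import dataclass


@dataclass
class JointGroup:
    role: str
    joints: list  # joint names belonging to this group


ARM = [
    JointGroup("position", ["base_x", "base_y", "base_z"]),
    JointGroup("orientation", ["wrist_pitch", "wrist_yaw"]),
    JointGroup("stroke", ["shoulder_drive", "elbow_drive", "wrist_snap"]),
]


def total_joints(groups):
    return sum(len(g.joints) for g in groups)
```

Grouping joints by role like this mirrors the article's point about minimal engineering: each group exists because competitive play demands it, and nothing more is added.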
Humanoid Robots Sprinting Through Beijing
Meanwhile, on the streets of Beijing, humanoid robots were doing something no AI could have attempted a decade ago: running a half marathon. At the 2026 Beijing E-Town Humanoid Robot Half Marathon, more than 100 robots joined approximately 12,000 human participants on separate tracks over a 21-kilometre course. A robot named Lightning, developed by Honor, finished first with a time of 50 minutes and 26 seconds. That is faster than Olympic runner Jacob Kiplimo’s 57 minutes and 20 seconds at the Lisbon Half Marathon in March. Yes, a robot outran an Olympic athlete.
Lightning’s run was not flawless. It collided with a barricade mid-race but kept going and finished first. Honor robots also claimed second and third place. The improvement from the previous year is staggering. In 2025, the fastest robot completed the course in two hours, 40 minutes and 42 seconds. That is an improvement of nearly two hours in a single year.
Autonomy vs. Remote Control
The race rules prioritised autonomous navigation, which made Lightning’s win significant. According to the Associated Press, another Honor robot completed the course in 48 minutes under remote control, but that entry did not qualify as the official winner. Engineers from Honor said technologies developed for Lightning, including structural reliability and liquid-cooling systems, could be applied in industrial scenarios. Think factories, warehouses, or disaster zones where humanoid robots might one day replace humans in dangerous tasks.
What This Means for Physical AI
These two events, though different in scale, point to a shared trajectory. Physical AI is moving from research labs into real-world applications. Dürr’s team believes the perception and control techniques used in Ace could transfer to manufacturing and service robotics. The same cameras and joint control that help a robot return a spinning ball could help it pick fragile items or operate machinery. The half marathon shows that humanoid robots can handle long-duration, dynamic tasks with a degree of reliability that was unthinkable a few years ago.
Still, challenges remain. Both systems struggle with adaptability. Ace is excellent at reading spin but less flexible at adjusting mid-match. Lightning fell but got up, which is more than many human runners can claim, yet its collision shows how hard real-time obstacle avoidance remains in crowded environments. These are not dead ends, but the next frontiers for engineers.
So where does this leave us? Robots are no longer confined to tables and screens. They are playing sports and racing through cities. And they are getting faster, smarter, and harder to predict. For developers and technologists, the message is clear: the boundary between simulation and reality is thinning, and the real world is finally ready for AI to navigate it.