
On February 27, 2026, during the CCTV Spring Festival Gala, humanoid robots captured significant attention. Among them, a group of humanoid robots from Yushu Technology performed a martial arts show titled “Wu BOT,” showcasing rapid movements, dynamic formations, and martial arts moves. This high-dynamic, highly collaborative autonomous swarm control technology made its global debut, sparking lively online discussion even before the show concluded. The performance featured dozens of Yushu G1 robots alongside an H2 robot in an impressive display of “cyber skill.” LiDAR technology played a pivotal role in making the performance possible.
According to reports, the real-time coordination and ultra-low synchronization latency of the humanoid robots were achieved by combining Yushu Technology’s self-developed AI-integrated positioning algorithm with Hesai Technology’s JT LiDAR. This combination addressed the problem of cumulative motion error in long performance sequences, keeping every action perfectly synchronized. Multiple Hesai LiDAR units gave the robots precise, 360-degree environmental awareness, enabling a stable and impressive high-dynamic collective performance.
Analysts believe that the humanoid robot performance at the 2026 Spring Festival Gala not only showcased the rapid iteration of domestic technology but also marked a significant shift in the industry from the validation phase of “0 to 1” to the new chapter of mass production from “1 to 10.” This shift is expected to catalyze the core components and OEM segments of humanoid robotics. The explosion of robotics is just beginning; Yushu’s appearance at the gala is merely a signal. In the future, whether in industrial or household robots, LiDAR will be a standard requirement for navigating complex environments.
Currently, LiDAR has become a core component of physical AI infrastructure. Hesai’s positioning has evolved from “core automotive perception” to “physical AI infrastructure.” This shift indicates that LiDAR products are transitioning from a “safety standard” in automobiles to a “perception standard” across the entire robotics industry. Analysts suggest that this strategic leap signifies Hesai Technology’s move from “being number one in a market” to “defining a new market and becoming its infrastructure.” This does not mean abandoning the automotive market, but rather using it as a leverage point to engage a larger physical AI ecosystem.
Hesai Technology’s ultimate goal is to make “Hesai Inside” synonymous with safety in the intelligent era—whether it’s in cars speeding on highways, robots dancing on the Spring Festival stage, or intelligent devices serving in homes in the future.
As the automotive industry moves from the “first half” of electrification to the “second half” of intelligence, a wave of smart driving characterized by “universal intelligent driving” and “hybrid intelligence” is sweeping through. LiDAR, as a core 3D perception sensor and safety component in passenger vehicles, has become an “invisible safety airbag” as features like NOA (Navigation on Autopilot) and AEB (Automatic Emergency Braking) become prevalent. Data shows that LiDAR can effectively prevent collisions caused by failures of visual perception under bright light, low light, blind spots, and poor weather. Vehicles equipped with LiDAR can reduce the risk of fatal high-speed accidents by 90% and cut conventional traffic accidents by 30%. In extreme scenarios, it is a genuinely life-saving safety component.
Hesai has spent seven to eight years driving the cost of LiDAR down from several hundred thousand yuan to roughly $200 per unit, a price drop of about 99.5%, while continuously improving performance. This transformation is not merely about lowering prices: Hesai’s self-developed chip technology has turned an industrial-grade product into a consumer-grade component, making safety once reserved for luxury cars a standard feature in everyday vehicles.
As of today, this change is accelerating. In 2025, the penetration rate of LiDAR in China’s new energy vehicle market reached 21%, meaning that roughly one in every five new energy vehicles sold was equipped with LiDAR. To date, Hesai has secured production contracts with over 38 automotive brands, covering all of the top ten car brands in China, and has established partnerships with the two leading ADAS clients across their entire 2026 model range. These include in-depth collaborations with Ideal Auto, Xiaomi Auto, Leap Motor, and BYD.
On January 5, 2026, Hesai announced its selection as a LiDAR partner for the NVIDIA DRIVE AGX Hyperion 10 platform. This platform is a reference architecture for computing and sensors aimed at helping various vehicle models achieve Level 4 autonomous driving, assisting automotive manufacturers and developers in creating safe, scalable, AI-defined, high-performance fleets. Hesai is among the latest partners to complete sensor-suite certification for the DRIVE Hyperion open mass-production architecture. The sensor ecosystem built on this architecture continues to expand, encompassing cameras, millimeter-wave radar, LiDAR, and ultrasonic sensors, enabling manufacturers and developers to build and validate perception systems optimized for Level 4 autonomous driving.
NVIDIA’s chip business, meanwhile, is growing rapidly: its financial report released on February 26 showed revenues of $68.127 billion for the quarter ending January 25, 2026, up 73% from $39.331 billion in the same period last year. LiDAR not only drives the large-scale commercialization of intelligent driving but also opens up a broader realm of possibilities as the intelligent eye of robots. While cars operate in relatively structured and regulated environments, robots must navigate far more complex and variable settings.
The stable and reliable 3D environmental data provided by LiDAR is fundamental for robots to build spatial awareness and achieve autonomous navigation and interaction. Robots may need to operate in diverse scenarios including homes, factories, and hospitals, each with unique layouts and conditions. For instance, furniture arrangements in a home can change at any time, and materials in a factory may be repositioned frequently, necessitating robots that can adapt to various environments. This means robots must deeply interact with the physical world, using sensors to perceive their surroundings and employing hardware devices like robotic arms to physically manipulate objects.
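As a toy illustration of how raw 3D LiDAR returns can become the spatial awareness described above, the sketch below (entirely hypothetical, not Hesai’s or Yushu’s actual pipeline; all names are invented) bins point-cloud returns into a 2D occupancy grid, a common intermediate map for robot navigation:

```python
# Minimal, hypothetical sketch: project 3D LiDAR points into a 2D occupancy grid.
# Real robot stacks use far more sophisticated mapping (SLAM, probabilistic grids).

def occupancy_grid(points, cell_size=0.5, min_z=0.1, max_z=2.0):
    """Bin (x, y, z) points into (col, row) cells, keeping only heights
    that could obstruct a robot (ignoring floor and overhead returns)."""
    occupied = set()
    for x, y, z in points:
        if min_z <= z <= max_z:
            occupied.add((int(x // cell_size), int(y // cell_size)))
    return occupied

# Example scan: two returns off a chair leg, one off a ceiling lamp
# (filtered as overhead), one off the floor (filtered as ground).
scan = [(2.1, 0.3, 0.5), (2.1, 0.3, 0.8), (1.0, 1.0, 2.8), (0.5, -0.2, 0.02)]
grid = occupancy_grid(scan)
print(grid)  # → {(4, 0)}: only the chair-leg cell is marked occupied
```

Because furniture or factory materials can move at any time, such a grid would be rebuilt from each fresh scan rather than assumed static.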
Previously, LiDAR primarily served the autonomous driving sector; however, it is now rapidly entering homes and stages. Transitioning from “seeing the road” to “seeing people” and from rooftops to robot shoulders, LiDAR provides stable, reliable input about the physical world, serving as the core bridge connecting digital intelligence with physical action. Some experts argue that when Hesai redefines LiDAR from “automotive components” to “sensors for the physical world,” its applications extend beyond just automobiles to encompass all embodied intelligent agents.
In 2025, the LiDAR industry reached the critical juncture of “crossing the chasm.” According to data from Gaishi, new energy passenger vehicle sales reached 12.48 million, of which 2.58 million were equipped with LiDAR, a 21% penetration rate that peaked at 28% in some months, meaning that at its peak more than one in every four new energy vehicles sold carried LiDAR. From November to December 2025, LiDAR penetration across the entire passenger vehicle market reached 17%-19%, crossing the critical “chasm” threshold of 16% for the first time. This milestone signifies that LiDAR has moved from the early market into the mainstream, entering a new phase of large-scale adoption.
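The headline figures can be sanity-checked with simple arithmetic, using only the Gaishi numbers quoted above:

```python
# Back-of-the-envelope check of the penetration figures cited above.
nev_sales = 12_480_000       # new energy passenger vehicles sold in 2025 (Gaishi)
lidar_equipped = 2_580_000   # of which were equipped with LiDAR (Gaishi)

penetration = lidar_equipped / nev_sales
print(f"annual penetration: {penetration:.1%}")              # → 20.7%, i.e. ~1 in 5
print(f"one LiDAR car per {nev_sales / lidar_equipped:.1f} NEVs sold")  # → 4.8
```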
For example, Xiaomi’s new cars in 2026 will come standard with LiDAR, which will likely boost both Xiaomi’s vehicle sales and the sales of Hesai’s LiDAR systems. Industry forecasts indicate that by 2030, the penetration rate of LiDAR in new energy vehicles could rise to 56%, with the market size expected to continue to multiply.
As the market grows rapidly, Hesai firmly retains its position as the global leader in LiDAR. Recent data from the High-Technology Automotive Research Institute shows that Hesai holds over 40% of China’s front-mounted primary LiDAR market for passenger cars, maintaining its industry-leading position. Gaishi data puts Hesai’s annual market share as high as 41.35%, with a single month’s share exceeding 47%. In 2025, Hesai’s total global LiDAR deliveries surpassed 1.6 million units: approximately 1.4 million for ADAS products and over 200,000 for robotics applications. Hesai’s popular ATX product, designed for L2-level driving-assistance safety requirements, surpassed 1 million units delivered in its first year. Its exceptional distance-measurement capability and unique photon-isolation safety technology allow it to detect potential dangers early, enhancing vehicle safety. Furthermore, the new ATX version released at the end of November 2025 has already secured orders for over 4 million units from major automotive manufacturers and is set to begin mass production and delivery in April 2026.
Hesai’s JT series has surpassed 200,000 units in cumulative deliveries and is being used in multiple robotic products, indicating that this crucial sensor has achieved scalable and cost-effective delivery. Hesai’s CEO, Li Yifan, has stated that while one person cannot drive two cars simultaneously, many robots can work for you at the same time. This implies that the potential for robotic LiDAR is far greater than for automotive applications. The robotics market is not just “another automotive market”; it represents an exponentially larger demand pool for sensors.
Hesai Technology positions itself as a key builder of “physical AI infrastructure,” with its LiDAR products transitioning from being the “safety standard” in automobiles to becoming the “perception standard” in the entire robotics industry. Analysts believe that by providing high-performance, mass-producible core sensors, Hesai Technology has addressed the technical bottlenecks of accurate perception and collaboration for humanoid robots. By enabling scalable delivery, they are driving down costs and maturing the ecosystem, serving as a critical enabler for humanoid robots to move from cutting-edge research to large-scale industrial application, profoundly shaping the future landscape of the entire industry.
In an environment where a fixed supply chain for the robotics market has yet to form, Hesai leverages its scale advantage from the automotive market (with annual deliveries of 1.6 million units) to disrupt the robotics sensor market, aiming to establish a first-mover advantage.
High-level autonomous driving depends on the redundant perception provided by LiDAR. Eight months ago, Tesla launched its “Robotaxi” service in Austin, Texas. CEO Elon Musk made ambitious promises: deploying 500 unmanned vehicles, covering half the U.S. population, and providing a fully autonomous driving experience, with expansion to 8-10 cities by the end of 2025. However, as U.S. media have reported, there is a significant gap between those promises and the reality of Tesla’s Robotaxi service. Currently, only about 42 Tesla unmanned vehicles operate in Austin, with a utilization rate below 20% and an accident rate nine times that of human drivers. Moreover, the “unmonitored” ride service Musk heavily promoted before the latest earnings report has also fallen short.
During the fourth-quarter 2025 earnings call, Musk claimed that the combined fleet in Austin and the San Francisco Bay Area “far exceeded 500” vehicles and predicted reaching 2 million within a year, with the fleet doubling each month. However, independent data suggests that Tesla has about 200 unmanned vehicles in total, and that fewer than 24 may be operating in Austin in any given week. Beyond fleet size, availability is an even bigger problem. According to RobotaxiTracker data covering the past 48 hours, the Austin Robotaxi service was available only 19% of the time and offline the remaining 81%. In other words, for most of the day only a handful of vehicles are online, leaving passengers in many areas unable to request a ride.
In contrast, Waymo operates over 100 unmanned vehicles in Austin through the Uber platform, covering 90 square miles and providing 24/7 service, accounting for about 20% of local Uber orders. Tesla has not disclosed its order volume, but it is believed to be significantly lower. Data Tesla submitted to the NHTSA shows a vehicle collision rate approximately nine times that of human drivers, even with trained safety supervisors present in the cars. By Tesla’s own figures, its unmanned vehicles have traveled approximately 500,000 miles and experienced nine accidents, an average of one accident every 55,000 miles or so; human drivers average one accident every 500,000 miles. While not all of these accidents were severe, their frequency raises concerns.
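The nine-times figure follows directly from the mileage numbers cited above:

```python
# Accident-rate comparison from the figures reported to NHTSA, as cited above.
tesla_miles, tesla_accidents = 500_000, 9
human_miles_per_accident = 500_000   # benchmark quoted for human drivers

tesla_miles_per_accident = tesla_miles / tesla_accidents     # ≈ 55,556 miles
ratio = human_miles_per_accident / tesla_miles_per_accident
print(f"Tesla: one accident every {tesla_miles_per_accident:,.0f} miles")
print(f"accident rate ≈ {ratio:.0f}× that of human drivers")  # → 9×
```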
Additionally, Tesla’s “autonomous driving taxi” service ceases operation during rainfall, while Austin averages more than 80 rainy days a year. Tesla’s pure-vision system relies on eight cameras, which can be degraded by rain, fog, and sun glare, compromising the system’s safety. The National Highway Traffic Safety Administration (NHTSA) has asked about Tesla’s operational plans under low-visibility conditions, and Tesla requested that its response be kept confidential. Waymo, meanwhile, uses a fusion of LiDAR, radar, and cameras, whose redundancy allows operation in harsh weather and ensures 24/7 service.
Tesla has long championed the pure-vision approach, with Musk previously dismissing LiDAR as “a fool’s errand.” In practice, pure-vision solutions have proven inadequate for the safety demands of Level 4 autonomous driving; Tesla’s struggles show that high-level autonomy depends on the redundant perception LiDAR provides. Analysts assert that Tesla’s setbacks in the Robotaxi domain illustrate that LiDAR is not a “cost burden” but a safety necessity.
Original article by NenPower, If reposted, please credit the source: https://nenpower.com/blog/humanoid-robots-shine-at-2026-spring-festival-as-hesai-and-nvidia-team-up-elevating-lidar-to-key-player-in-physical-ai/
