
Why are driverless cars being called "silly radishes"?

2024-07-15


If we are talking about the hottest star in the car world lately, it has to be Luobo Kuaipao (萝卜快跑).

This cute-looking little car is actually Baidu's driverless taxi. It was recently rolled out at scale in Wuhan, where its ultra-low fares and sheer novelty have drawn plenty of people to try it.

But people soon noticed that, high-tech as it is, Luobo Kuaipao can be a bit clumsy. Many Wuhan locals jokingly call it "苕萝卜", which means "silly radish" in Wuhan dialect.

So just how silly is Luobo Kuaipao?

This radish is a bit silly

In fact, the nickname "苕萝卜" is quite apt. In some situations, Luobo Kuaipao really does act a bit silly.

Being "scared" by various obstacles and not daring to move, it was just basic operation. I saw a car with double flashes on and motionless in the middle of the road. When I looked closely, I found that it was actually a green woven bag lying on the road that was "controlling" it. Facing this obstacle, the car looked helpless and would not go around it, so it just stupidly stopped in front of the woven bag.

In the end, a passer-by who couldn't bear to watch any longer removed the bag, and only then did Luobo Kuaipao dare to move.


Image source: Screenshot of Douyin @(Xiaoou) video

Sometimes there is no obstacle at all, just slightly complicated traffic, and Luobo Kuaipao's "IQ" still isn't enough.

For example, in heavy morning rush-hour traffic, one Luobo Kuaipao wanted to change lanes to pick up a passenger but didn't dare to, so it just waited where it was, looking weak, pitiful and helpless amid the stream of cars.


Image source: Screenshot of Douyin @少喝奶茶 video

Faced with an endless stream of pedestrians and other vehicles, the Luobo Kuaipao simply "judged" that there was no room to move forward and then, regardless of where it was or whether stopping would cause congestion, parked itself on the road and waited obediently. Another video shot by a netizen showed a Luobo Kuaipao "courteously" giving way to 10 oncoming cars at an intersection before it finally seized the chance to keep driving.


Then came the traffic jam, as expected | Image source: Screenshot of Douyin @承蒙厚爱 video

Even with no other cars around, when two of them happened to meet, neither knew what to do. They each inched forward a little and then stopped together, with "after you" practically written on their foreheads.


Image source: Screenshot of Douyin @糯米糕饼 video

But sometimes this car that "stops at the first sign of trouble" can also act recklessly: even when the water ahead nearly submerges its wheels, it keeps right on driving.


Goodbye, Mom, I'm setting sail tonight~ | Image source: Screenshot of Douyin @Wuhan full-case designer Chen Fang video

Ultimately, all of the situations above come down to the car misjudging road conditions and not being "smart" enough.

At other times, the "radish" looks silly simply because "follow the rules" is engraved in its DNA.

For example, it must wait until its passenger arrives; even with a long queue of cars jammed up behind it, Luobo Kuaipao will not budge.


Image source: Screenshot of Douyin @蔡蔡CC video

Or: once it has agreed to take you to your destination, stopping even one meter short is out of the question.

Blogger "Hot dry noodles need vinegar" said that he originally planned to get off the bus across the street from his destination and then walk a few steps to his destination, but Luobo Kuaipao still insisted on walking to the next intersection and then turning around, vowing to take the passengers to their destination. Getting off halfway? Impossible.

Image source: Screenshot of the video @热干面要加醋 on Douyin

And it never overtakes, even when the electric scooter ahead of it is crawling along...

Image source: Screenshot of Douyin @胖小洋吃得瘦 video

It seems Wuhan locals have their reasons for calling it "苕萝卜".

Why is the "silly radish" so silly?

To understand why Luobo Kuaipao acts so silly, you first need to know how intelligent driving works.

Gao Haofei (a pseudonym), an autonomous-driving algorithm engineer at a well-known Chinese automaker, told YiDu that there are currently two main technical routes for intelligent driving: the traditional modular route and the end-to-end model.

The modular route splits intelligent driving into separate steps: perception (gathering information), planning (algorithmic analysis) and control (executing operations). First, on-board radar, cameras and other sensors collect information about the environment; then the on-board processor combines that information with high-precision maps and built-in algorithms to plan a path and speed and issue control commands; finally, the low-level controllers execute those commands to move the vehicle.
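As a rough illustration of that modular split, here is a minimal sketch in Python. The class and function names are invented for illustration only and are not any vendor's actual stack; it simply shows how perception, planning and control hand information to one another in a fixed pipeline.

```python
# Hypothetical sketch of the modular pipeline: perception -> planning -> control.
# Names and thresholds are illustrative, not any real autonomous-driving API.
from dataclasses import dataclass

@dataclass
class Scene:
    obstacle_ahead: bool          # fused result from radar/camera perception
    distance_to_obstacle: float   # meters

@dataclass
class Plan:
    target_speed: float           # m/s
    lane_offset: float            # lateral offset from lane center, meters

def perceive(sensor_frame: dict) -> Scene:
    """Perception module: turn raw sensor data into a structured scene description."""
    return Scene(obstacle_ahead=sensor_frame["obstacle"],
                 distance_to_obstacle=sensor_frame["dist"])

def plan(scene: Scene) -> Plan:
    """Planning module: decide speed and path from the scene plus built-in rules."""
    if scene.obstacle_ahead and scene.distance_to_obstacle < 10.0:
        return Plan(target_speed=0.0, lane_offset=0.0)   # rule: stop before the obstacle
    return Plan(target_speed=8.0, lane_offset=0.0)       # rule: follow the lane center

def control(p: Plan) -> dict:
    """Control module: translate the plan into actuator commands."""
    return {"throttle": 0.2 if p.target_speed > 0 else 0.0,
            "brake": 0.0 if p.target_speed > 0 else 1.0,
            "steer": p.lane_offset}

# Each tick of the driving loop runs the three modules in order.
commands = control(plan(perceive({"obstacle": True, "dist": 5.0})))
```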

The planning algorithm is the driverless car's "way of thinking". The reason Luobo Kuaipao and other smart-driving cars sometimes seem "silly" is that this planning algorithm is not yet mature. At present, planning relies mainly on human-defined rules. For example, the program may stipulate: drive along the center of the lane and obey traffic rules; do not cross solid lines to change lanes; stop if there is an obstacle ahead; keep driving if the road is clear.

The problem is that human-defined rules cannot cover every scenario, and real-world road conditions are complex, with contradictory situations arising all the time. For example, when an obstacle blocks a lane bounded by a solid line, traffic rules say you cannot cross the line, yet any path around the obstacle inevitably does cross it.

In that moment, a human driver can flexibly make the best choice, but if the contradictory scenario is not covered by the algorithm, the driverless car cannot decide on its next move. "The end result may be that it can only handle the situation 'rigidly' according to the algorithm code, and it simply won't move when it hits a contradictory scenario, which looks silly."
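To make that "freeze on a contradiction" behavior concrete, here is a toy rule set (invented for illustration, not the actual Luobo Kuaipao code): when the only way around an obstacle would cross a solid line, no rule permits progress, so the only action left is to stop and wait, exactly like the woven-bag standoff.

```python
def rule_based_decision(obstacle_ahead: bool,
                        can_pass_in_lane: bool,
                        lane_line_is_solid: bool) -> str:
    """Toy hand-written rules showing how a rule-based planner can dead-lock.

    These rules are simplified for illustration; real planners are far richer,
    but the failure mode is the same: no rule covers the contradictory case.
    """
    if not obstacle_ahead:
        return "keep_driving"              # rule 1: clear road, carry on
    if can_pass_in_lane:
        return "nudge_around_obstacle"     # rule 2: room to pass without crossing lines
    if not lane_line_is_solid:
        return "change_lane"               # rule 3: go around via the next lane
    # Obstacle ahead, no room in the lane, solid line: no rule allows progress,
    # so the only "safe" action left is to stop and wait.
    return "stop_and_wait"

print(rule_based_decision(obstacle_ahead=True,
                          can_pass_in_lane=False,
                          lane_line_is_solid=True))   # -> "stop_and_wait"
```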


The code may not include the situation of "wanting to turn but encountering a roadblock" | Image source: Screenshot of Douyin @李壬申 video

Moreover, the modular route also suffers from cascading errors. Simply put, as information is processed and passed from module to module, small deviations can creep in at each step, the way "Ma Dongmei" becomes "Ma Dong-something" in a game of telephone, and these accumulated errors ultimately affect how accurately the vehicle is controlled.

So what is end-to-end? Can it solve the dilemmas Luobo Kuaipao faces?

Gao Haofei explained that end-to-end breaks down the boundaries between the modular steps, letting the car learn and generalize from a large amount of real driving experience, much like a human, and directly produce the best overall plan for the current driving environment. "The future direction is to reduce human-defined rules and give (cars) the ability to learn autonomously. It's like ChatGPT doing human-like learning, letting the car handle all kinds of contradictory and tricky scenarios as flexibly as an experienced driver," Gao Haofei said.
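In code terms, the end-to-end idea collapses the hand-written modules into a single learned mapping from sensor input to a planned trajectory. Below is a minimal sketch using PyTorch; the network shape and names are invented for illustration and bear no relation to any company's actual model.

```python
import torch
import torch.nn as nn

class EndToEndDriver(nn.Module):
    """Toy end-to-end model: fused sensor features in, planned trajectory out.

    Purely illustrative; real systems use camera/lidar encoders and far larger
    networks trained on massive amounts of human driving data.
    """
    def __init__(self, sensor_dim: int = 256, horizon: int = 10):
        super().__init__()
        self.horizon = horizon
        self.net = nn.Sequential(
            nn.Linear(sensor_dim, 512),
            nn.ReLU(),
            nn.Linear(512, horizon * 2),   # one (x, y) waypoint per future timestep
        )

    def forward(self, sensor_features: torch.Tensor) -> torch.Tensor:
        # Output shape: (batch, horizon, 2) -- a future trajectory predicted
        # directly from sensor input, with no hand-written planning rules in between.
        return self.net(sensor_features).view(-1, self.horizon, 2)

model = EndToEndDriver()
trajectory = model(torch.randn(1, 256))   # one frame of fused sensor features
```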

At present, many companies, including Huawei and Xiaopeng, have announced end-to-end technical routes for autonomous driving and are gradually putting them into practice.

It is worth noting, however, that a fully end-to-end route is not perfect either. With the modular route, if something goes wrong we can pinpoint which step failed, since information is passed along step by step and each step can be inspected; when an end-to-end system makes a mistake, the cause is hard to trace.

Gao Haofei used ChatGPT as an analogy: "It's like when ChatGPT talks nonsense, say teaching you to cook 'shoe soles stir-fried with chili peppers', and you don't know why it said that. After going fully end-to-end, a smart-driving car may likewise make inexplicable moves, and you won't know why it drove that way."

ChatGPT's nonsense may not cause any serious harm, but if a car makes a "strange" move, the consequences could be unbearable. That is why the end-to-end autonomous-driving systems companies have released so far still retain a large number of rule-based algorithms to guarantee driving safety.

As for some netizens' idea of "buying a driverless car to drop me off and pick me up from work, and sending it out to earn money for me the rest of the time", Gao Haofei and his colleagues said it is not very realistic for now: if a traffic accident happens, it is unclear who would be held responsible.

"Besides, who knows whether your smart self-driving car is out soliciting customers or going out to 'fight' on the streets?"

Authors: Minmin, Du Yuxin
