“If a machine enters human society and coexists with humans, it is bound to cause accidents at some rate. To put it unpleasantly, that will be called killing.” At the recent World Artificial Intelligence Conference, Su Qing, President of Huawei Intelligent Driving, bluntly argued that autonomous driving “kills people.”
As these remarks continued to spread, on July 10 local time Tesla officially began pushing the “Full Self-Driving” FSD Beta 9.0 to users in the United States; the first batch of beta testers numbered about 2,000.
Users tested FSD Beta 9.0
After the update, some enthusiastic owners drove their Teslas into all kinds of difficult scenarios overnight to help Tesla run extreme tests: roundabouts, narrow streets lined with parked cars, foggy nights, and so on.
The biggest difference between Tesla’s FSD Beta 9 and other manufacturers’ autonomous-driving solutions is that Tesla relies on nothing but “cameras + AI algorithms” to achieve “autonomous driving.”
Never mind lidar; this time even the millimeter-wave radar is gone.
“Set aside whether Tesla can actually achieve autonomous driving. At least in Musk’s plan, anything too expensive to promote widely and deploy globally is an unacceptable development model. Pure vision, by contrast, is universally applicable.” A person close to Tesla told Huxiu.
1. Autopilot with 8 cameras?
Musk once made a striking claim: Tesla cars “can achieve L5 autonomous driving through software improvements alone.”
It does sound ridiculous.
Looking back, Tesla has kept subtracting from its cars: drastically shortening the vehicle wiring harness, cutting the number of body parts, and simplifying the manufacturing process. Now even the millimeter-wave radar is being cast aside.
All Tesla models, including Model 3, Model Y, Model S and Model X, carry 8 cameras in the same positions: on the left and right front fenders, on the left and right B-pillars, at the rear of the car, and above the front windshield.
Tesla vehicles do also carry a forward radar and ultrasonic sensors as aids, but their contribution is limited and their number is far smaller than on domestic car companies’ products. No matter: Musk has an even crazier idea, achieving autonomous driving with 8 cameras alone.
As everyone knows, the key to making a system reliable is redundancy. That is why car companies around the world keep piling hardware onto autonomous driving, a system that desperately needs reliability.
Large domestic autonomous-driving players such as Baidu and Huawei have taken the “lidar + high-precision map” route toward L4 autonomy. Even today’s more mature L2 driver assistance adopts a “camera + radar” hardware configuration.
Autonomous-driving “hardware-stacking monsters” of every kind abound.
At present, most of the L4-capable models about to reach mass production are high-priced, high-end cars. The Weilai ET7, equipped with lidar and more than 1,000 TOPS of computing power, is priced from RMB 448,000 to RMB 526,000; likewise, the Polar Fox Alpha S Huawei HI edition, equipped with 3 lidars, is priced from RMB 388,900 to RMB 429,900.
Last month, Baidu Apollo and Polar Fox jointly launched Apollo Moon, a new-generation autonomous-driving model equipped with lidar customized by Hesai Technology. The car’s official price is 480,000 yuan; for that money you could hire a skilled veteran driver in Beijing for four straight years.
Cost has always been the industry’s sticking point with lidar. Li Bin, founder of Weilai, for one, believes that spending more on safety is worthwhile and that the cost is broadly controllable. “Lidar is like a car’s airbag. From the perspective of safety, seat belts do more than airbags, yet a car cannot go without airbags, because they add to its reliability,” he said.
Cost or safety? A question with a seemingly obvious answer has, when it comes to autonomous-driving technology, produced a genuine split.
Although Tesla’s FSD Beta 9, relying on formidable algorithms, lets the vehicle drive itself on urban roads with cameras as its only perception, it still needs a human driver to take over when a bug appears.
FSD Beta 9.0 enters the bus lane
Test videos circulating online show it rushing into bus lanes, driving the wrong way down one-way streets, and crossing solid lines while merging; in one serious case, the car headed straight for the greenery in the middle of the road.
In mid-June this year, the U.S. auto safety regulator released data showing that since 2016 there have been 30 accidents involving Tesla vehicles using advanced driver-assistance systems, with 10 deaths. In accidents where assisted driving is engaged, human and machine are driving together, and assigning responsibility becomes harder.
FSD Beta 9.0 solid line change lane
Professor Zheng Ge of the KoGuan Law School at Shanghai Jiao Tong University believes: “While industry standards remain immature, responsibility can be uniformly assumed by the vehicle manufacturer or the developer of the autonomous-driving system; if other software or hardware suppliers are at fault, the manufacturer can then pursue their tort liability or liability for breach of contract on that basis, which protects accident victims’ rights more effectively.” Similarly, Weimar CEO Shen Hui has said that if an accident occurs while a Weimar car is driving itself, the responsible party should be the OEM.
For now, what Tesla can do is warn users. In the FSD Beta 9 release notes, Tesla cautions drivers not to over-trust the assisted-driving system. Musk himself wrote: “Driving pre-release software is both work and fun. The beta list has been stagnant because we have many known issues to fix. Beta 9 addresses most known problems, but there will still be unknown issues, so please stay vigilant. At Tesla, safety is always the top priority.”
Yet keeping a human driver ready to take over from the machine at any moment is, in itself, difficult.
As Su Qing, President of Huawei Intelligent Driving, put it: “Ordinary users tend to distrust a new technology product at first, but once they try it and find it good, they become very, very trusting. That is where the accidents begin.”
2. With a “plug-in” available, why not use it?
“Generalized self-driving is a hard problem, because it requires solving a large part of real-world artificial intelligence. I didn’t expect it to be so hard, but the difficulty is obvious in retrospect.” Musk wrote this on social media shortly before the V9 push.
Behind the complaint, one sentence of Musk’s was overlooked: “Nothing has more degrees of freedom than reality.” The “reality” here is the real world as seen through Tesla’s cameras, the pure visual perception Tesla insists on.
At present, the mainstream sensors in the industry include cameras, lidars, millimeter wave radars, and ultrasonic radars.
Since each kind of sensor has its own limitations, car companies and autonomous-driving integrators must build functional-safety redundancy into their application scenarios and system designs, pairing a sufficiently reliable sensor suite with reliable algorithms, computing and actuation units to ensure the functional safety of the system as a whole.
Take the camera used for pure visual perception: it is cheap and can recognize different kinds of objects. It excels at measuring object height and width, recognizing lane lines, and detecting pedestrians, making it an indispensable sensor for functions such as lane-departure warning and traffic-sign recognition.
The camera, however, has two fatal shortcomings. First, its operating range and ranging accuracy fall short of millimeter-wave radar. Second, it is easily affected by light, weather and similar factors.
Millimeter-wave radar can make up for both, but it cannot image, so it struggles to identify elements such as lane lines and traffic signs. The two therefore need to work together.
A millimeter-wave radar dataset (Source: Oxford Radar RobotCar Dataset)
The vast majority of companies use a camera + millimeter-wave radar fusion scheme to deliver L2 capability, or more accurately ADAS driver assistance, with functions such as adaptive cruise and lane keeping.
Although both components are relatively cheap, Musk still rejects the radar option, because what worries him is precisely the perceptual fusion of camera and radar.
In theory, sensor fusion is equivalent to superimposing the advantages of each sensor
The biggest obstacle to fusing camera data with millimeter-wave radar data is the radar’s very low signal-to-noise ratio; in other words, it produces many false detections. When the visual and radar results are fused and the two disagree, the usual practice is to trust vision and ignore the radar detection.
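The “trust vision, ignore radar” practice described above can be sketched in a few lines. This is an illustrative toy, not Tesla’s actual pipeline; the distances, threshold, and function names are all invented:

```python
def fuse(vision_objects, radar_tracks, match_dist=2.0):
    """Trust vision; drop any radar track that vision does not confirm."""
    confirmed, dropped = [], []
    for r in radar_tracks:
        if any(abs(v - r) <= match_dist for v in vision_objects):
            confirmed.append(r)      # vision agrees: the track survives
        else:
            dropped.append(r)        # no visual match: treated as a false detection
    # The fused output is simply the vision list; radar never adds an object.
    return list(vision_objects), dropped

vision = [12.0, 35.0]                # object distances seen by the camera (m)
radar = [12.3, 60.0, 4.1]            # 60.0 and 4.1 have no visual match
kept, ignored = fuse(vision, radar)
print(kept)                          # the fused result is just the vision list
print(ignored)                       # radar-only returns are thrown away
```

The sketch makes the article’s point concrete: radar contributes nothing in exactly the cases where it disagrees with the camera, which is why Musk sees it as dead weight.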
Tesla, for example, has a history of “phantom braking”: when the vehicle enters a tunnel or a shadow, the system recognizes the shadow as an obstacle and brakes hard, sometimes even causing a rear-end collision. The same problem has appeared on the new Ideal ONE, whose front camera, millimeter-wave radar and algorithms all changed markedly from the old model.
Musk has previously stated on social media that the pure-vision solution will eliminate phantom braking entirely. His fix, in other words, is to ignore the millimeter-wave radar signal.
Then there is Tesla’s notorious “white truck” accident.
The culprit there was also the millimeter-wave radar. Because millimeter-wave radar cannot measure height, in its “eyes” a stationary truck, a bridge and an overhead road sign all look like the same kind of object. To avoid braking for every bridge, the radar’s returns from such stationary objects have to be filtered out, and when the accident happened the stationary truck was evidently filtered out along with them.
So Musk’s approach is to get rid of the troublesome millimeter-wave radar and concentrate on overcoming the camera’s shortcomings.
In March this year, Tesla was granted a patent on “estimating object properties using visual image data.” The patent uses two neural networks, and image data alone, to measure the distance to an object: one network predicts the distance from images captured by the on-board cameras, while the other generates training material for the first in the form of annotated images.
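The idea can be illustrated with a deliberately tiny sketch. The “networks” here are linear models on fake feature vectors, and every name and number is invented; the point is only the division of labor the patent describes, where one model auto-labels frames with distances and a second model learns ranging from those labels alone:

```python
import numpy as np

rng = np.random.default_rng(0)

def make_frames(n, dim=8):
    """Fake camera feature vectors and the true distances behind them."""
    x = rng.normal(size=(n, dim))
    w_true = rng.normal(size=dim)
    dist = x @ w_true + 10.0          # "true" distance to the object (m)
    return x, dist

# 1) The labeler model annotates raw frames with (noisy) distance labels.
#    In the patent, this network builds the training set for the other one.
frames, true_dist = make_frames(500)
labels = true_dist + rng.normal(scale=0.1, size=500)

# 2) The student model learns image -> distance from those labels only.
w, *_ = np.linalg.lstsq(frames, labels - 10.0, rcond=None)

def estimate_distance(frame):
    return float(frame @ w + 10.0)

err = np.mean(np.abs(frames @ w + 10.0 - true_dist))
print(f"mean ranging error: {err:.3f} m")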
That addresses the ranging problem. But as for extreme weather, whether camera-only visual perception can guarantee safety is a question no Tesla test user has yet had the “luck” to answer.
“Users are free safety operators,” a person at an autonomous-driving company remarked to Huxiu.
3. Why don’t domestic car companies copy Tesla?
So, if the sensor-hardware problem can be solved, why don’t domestic car companies learn from Tesla?
First, the amounts of data are not on the same level.
Tesla’s “shadow mode” constantly observes the external environment and the driver’s actions. If the driver’s operation in a particular scene does not match the “shadow” prediction, the data is uploaded to Tesla’s servers for corrective training of the algorithm, which will then behave correctly the next time it meets the same scene.
Behind this sit real-world scene data from the millions of vehicles driven by Tesla’s users, rather than from a handful of self-driving test cars circling the wilderness lap after lap. In essence, artificial intelligence is “fed” with data, and autonomous driving cannot skip that step.
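The shadow-mode loop described above can be sketched as follows. The `Frame` record and `shadow_predict` stand-in are illustrative inventions, not Tesla’s API; what matters is that only the disagreements between planner and human are worth sending back:

```python
from dataclasses import dataclass

@dataclass
class Frame:
    scene: dict          # sensor snapshot at this moment
    human_action: str    # what the driver actually did

def shadow_predict(scene):
    # Stand-in for the on-board planner's silently proposed action.
    return "brake" if scene.get("obstacle_ahead") else "keep_lane"

def collect_disagreements(frames):
    """Keep only the frames where the shadow and the human disagreed."""
    return [f for f in frames
            if shadow_predict(f.scene) != f.human_action]

log = [
    Frame({"obstacle_ahead": True}, "brake"),       # agreement: discarded
    Frame({"obstacle_ahead": False}, "swerve"),     # disagreement: uploaded
    Frame({"obstacle_ahead": True}, "keep_lane"),   # disagreement: uploaded
]
upload_queue = collect_disagreements(log)
print(len(upload_queue))  # only 2 of the 3 frames go back for retraining
```

This filtering is what makes a million-car fleet tractable: the vast majority of miles agree with the planner and never leave the car.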
Autonomous driving test vehicle of a certain brand
Second, the business model is also different.
Even Weilai and Xiaopeng, which flaunt their internet genes, still draw more than 90% of revenue from car sales. The so-called smart cars launched by domestic companies are still in the scale-chasing stage, and every new product iteration naturally brings new hardware. Never mind whether the functions are actually delivered; first raise the hardware ceiling as high as it will go.
Tesla, by contrast, hopes to break the “sell cars without making money” pattern as soon as possible and earn more from paid self-driving software.
The prerequisite is that autopilot hardware stay consistent between existing and new models, so that a unified software upgrade can be delivered through an over-the-air (OTA) update. That is why, from the million-yuan Model S to the Model 3 at just over two hundred thousand, Tesla uses the same combination of 8 cameras.
Tesla CFO Zach Kirkhorn said on the Q1 2021 earnings call: “Recurring revenue based on Full Self-Driving (FSD) has huge potential. We have a range of different models with a large number of users.”
Tesla’s global fleet already exceeds 1 million vehicles. In 2020 Tesla delivered 490,000 new cars worldwide, and in the first half of this year 160,000 were sold in China alone. With the FSD software package priced at 64,000 yuan, the scale of the potential revenue is plain to see.
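A back-of-envelope calculation shows that scale. Only the fleet size and package price come from the figures above; the uptake rates are hypothetical:

```python
# Back-of-envelope FSD revenue sketch using the article's figures.
fleet = 1_000_000          # global Tesla vehicles (from the article)
fsd_price_yuan = 64_000    # FSD package price in China (from the article)

for attach_rate in (0.05, 0.10, 0.30):   # hypothetical uptake rates
    revenue = fleet * attach_rate * fsd_price_yuan
    print(f"{attach_rate:.0%} uptake -> {revenue / 1e9:.1f} billion yuan")
```

Even at a modest 10% uptake, a one-time purchase across the existing fleet would already reach several billion yuan, before counting new deliveries or subscription revenue.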
Finally, when it comes to regulation, domestic car companies have more freedom on home ground.
Data security and road-information collection, for instance, have drawn much attention lately. Because they use lidar, most domestic manufacturers cannot bypass high-precision maps when building autonomous driving. Compared with ordinary maps, high-precision maps carry more accurate positioning and richer information, down to the centimeter level.
“Autonomous driving has two basic functions, perception and positioning. For perception, lidar can work without a high-precision map, though the results are worse. For positioning, lidar has to rely on a high-precision map,” an autonomous-driving engineer told Huxiu.
The NGP navigation-assisted driving of the Xiaopeng P7 already uses AutoNavi’s high-precision map; Ideal ONE’s upcoming NOA is likewise backed by AutoNavi; Weilai’s NOP uses a high-precision map from Baidu. With high-precision map support, a vehicle positions itself more precisely, which yields better decision-making and planning and a more comfortable driving experience.
If Tesla wanted to follow the lidar + high-precision-map route to high-level autonomous driving, its problem might no longer be technical but regulatory. In China especially, although Tesla has partnered with Tencent Maps and Baidu Maps, there is still no sign it will adopt either company’s high-precision maps.
It is even less likely that Tesla could obtain a high-precision map surveying license in China. Globally, all Tesla can do is collect massive amounts of real road-condition data through its on-board cameras, what the industry calls the “crowdsourcing model.” As Musk put it: “Eight cameras capture 360-degree high-frame-rate video simultaneously.”
“The data Tesla currently collects is vision + millimeter wave, and its quality is indeed inferior to lidar data. But drawing on advanced auto-labeling, neural-network R&D, supercomputing and other capabilities, Tesla has trained very powerful functions out of that inferior data,” the autonomous-driving engineer quoted above added.
Written at the end
What kind of company is Tesla? Over the years there have been different answers: the earliest was an electric-car company, later an energy company, later still a software company.
Starting with FSD V9, Tesla may have changed hats once again. As Musk said on the Q1 2021 earnings call: “Over the long term, people will realize that we are an artificial-intelligence robotics company.”
Of course, I would still advise everyone to listen less to what Musk says and watch more closely what Musk does.
Posted by: CoinYuppie. Reprinted with attribution to: https://coinyuppie.com/is-tesla-experimenting-with-human-lives/