How many cutting-edge technologies are hidden inside a driverless car?

The driverless car is a revolutionary product of the automotive industry's development. It fundamentally changes the traditional "human-car-road" closed-loop control model: the driver, the one element in the loop that cannot be strictly bound by rules, is taken out of the closed-loop system, which greatly improves the efficiency and safety of the transportation system.

Research on autonomous driving began in the 1980s. The United States was among the first countries to study autonomous vehicles and remains among the most advanced. Of the companies involved, Google's driverless car program has the broadest influence and is among the most technically mature: Google claims its driverless cars have safely traveled more than 1.6 million kilometers on public roads without any serious collision. Very few companies, however, can match Google's technical level, because the key technologies present a high barrier to entry.


Let's talk about a few key technologies in autonomous vehicles.

Environmental perception

A sensor detects information about the environment, but on its own it merely arranges and stores the measured physical quantities. At that point, the computer has no idea how those numbers map onto the real environment. Suitable algorithms are therefore needed to mine the data we care about from the raw measurements and give it physical meaning; that is what perceiving the environment means.

For example, when we drive, our eyes look ahead and we can pick out the lane markings of the lane we are in. For a machine to obtain the same lane-line information, a camera first captures an image of the environment. The image by itself carries no physical meaning tied to the real world; an algorithm must find the part of the image that corresponds to the real lane line and assign it that meaning.
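To make this concrete, here is a minimal sketch of that idea in Python with OpenCV: the classic Canny-edge plus Hough-transform recipe that pulls candidate lane-line segments out of a camera frame. The region-of-interest polygon and all thresholds are illustrative assumptions, and production systems use far more robust methods; the point is only to show an algorithm assigning "lane line" meaning to raw pixels.

```python
# A minimal sketch of camera-based lane-line extraction (illustrative only).
import cv2
import numpy as np

def detect_lane_lines(bgr_image):
    """Return candidate lane-line segments as (x1, y1, x2, y2) tuples."""
    gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)
    edges = cv2.Canny(blurred, 50, 150)

    # Keep only the lower trapezoid of the image, where lane markings appear.
    h, w = edges.shape
    mask = np.zeros_like(edges)
    roi = np.array([[(0, h), (w // 2 - 50, h // 2 + 50),
                     (w // 2 + 50, h // 2 + 50), (w, h)]], dtype=np.int32)
    cv2.fillPoly(mask, roi, 255)
    masked = cv2.bitwise_and(edges, mask)

    # The probabilistic Hough transform turns edge pixels into line segments,
    # i.e. it assigns the "lane line" meaning to groups of pixels.
    lines = cv2.HoughLinesP(masked, rho=1, theta=np.pi / 180, threshold=40,
                            minLineLength=40, maxLineGap=20)
    return [] if lines is None else [tuple(l[0]) for l in lines]
```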

Self-driving vehicles use many kinds of sensors to perceive the environment, such as cameras, laser scanners (lidar), millimeter-wave radar, and ultrasonic radar.

Different sensors call for different perception algorithms, because each senses the environment through a different mechanism, and each differs in what it can perceive and in how the environment affects it. Cameras, for example, have an advantage in object recognition but provide poor distance information, and camera-based recognition algorithms are affected by weather and lighting. Laser scanners and millimeter-wave radar measure distance accurately but are far weaker than cameras at recognizing what an object is. Even the same type of sensor behaves differently depending on its specification: a long-range millimeter-wave radar may detect targets up to 200 meters away within a narrow angular range (±10 degrees), while a medium-range radar may reach only 60 meters but cover a wider angular range (±45 degrees).

To exploit each sensor's strengths and compensate for its weaknesses, sensor fusion is the trend for the future. Some component suppliers already do this: the combined camera and millimeter-wave radar sensing module developed by Delphi, for example, has been used in production cars.
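As a toy illustration of fusion, the sketch below combines a radar range with a camera-derived range by inverse-variance weighting, so the more precise radar measurement dominates the distance estimate while the camera would still supply the object's identity. The noise variances are assumed values, not the specifications of any real sensor or of Delphi's module.

```python
# A minimal sketch of measurement-level sensor fusion: combine a radar range
# (accurate distance) with a camera-based range estimate (good at recognition,
# poor at range) using inverse-variance weighting. Variances are assumptions.

def fuse_range(radar_range_m, camera_range_m,
               radar_var=0.25, camera_var=4.0):
    """Return the fused range estimate and its variance."""
    w_radar = 1.0 / radar_var
    w_camera = 1.0 / camera_var
    fused = (w_radar * radar_range_m + w_camera * camera_range_m) / (w_radar + w_camera)
    fused_var = 1.0 / (w_radar + w_camera)
    return fused, fused_var

# Example: radar says 48.2 m, camera says 50.0 m; the fused value leans
# toward the radar because its range measurement is far less noisy.
print(fuse_range(48.2, 50.0))   # -> (~48.3, ~0.24)
```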

Behavior planning

Behavior planning may be a less familiar term, so let's start with path planning. The concept of path planning is common in robotics and is generally defined as:

finding a collision-free path from an initial state to a target state, according to certain evaluation criteria, in an environment containing obstacles. For an unmanned vehicle, once the target pose is determined, the specific motion path by which the vehicle travels to that target is the result of path planning.

Path planning actually spans two levels: global path planning, which covers a wide area without considering the details of motion, and local path planning, which produces the specific motion trajectory.
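As a minimal illustration of global path planning, the sketch below runs A* search over a small occupancy grid to find a collision-free path from a start cell to a goal cell under a shortest-path criterion. The grid, start, and goal are made-up assumptions; real global planners search road networks or lattice graphs, but the idea of searching for a collision-free path under an evaluation criterion is the same.

```python
# A minimal sketch of global path planning: A* on a 2D occupancy grid.
import heapq

def astar(grid, start, goal):
    """grid: list of rows, 0 = free, 1 = obstacle. Returns a list of cells."""
    rows, cols = len(grid), len(grid[0])
    heuristic = lambda c: abs(c[0] - goal[0]) + abs(c[1] - goal[1])
    open_set = [(heuristic(start), 0, start, None)]
    came_from, g_cost = {}, {start: 0}
    while open_set:
        _, g, cell, parent = heapq.heappop(open_set)
        if cell in came_from:                  # already expanded
            continue
        came_from[cell] = parent
        if cell == goal:                       # reconstruct path goal -> start
            path = [cell]
            while came_from[path[-1]] is not None:
                path.append(came_from[path[-1]])
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < g_cost.get((nr, nc), float("inf")):
                    g_cost[(nr, nc)] = ng
                    heapq.heappush(open_set, (ng + heuristic((nr, nc)), ng, (nr, nc), cell))
    return None                                # no collision-free path exists

grid = [[0, 0, 0, 0],
        [1, 1, 0, 1],
        [0, 0, 0, 0]]
print(astar(grid, (0, 0), (2, 0)))
# -> [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0)]
```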

To classify and analyze an unmanned vehicle's local paths more intuitively, the concept of "behavior" was introduced. When driving autonomously on urban roads, a vehicle should be capable of behaviors such as lane keeping, lane changing, driving straight through intersections, turning at intersections, U-turns, obstacle avoidance, intelligent stop-and-go, and automatic parking. Arranging these behaviors in order and connecting them smoothly completes the entire autonomous driving task.

"Driving behavior" is the driving unit subdivided in the local path. Of course, its division should be diverse, depending on the algorithm implementation.

Behaviors are relatively independent of one another, but switching between them should be a smooth transition. Deciding which behavior to use, and when, while the vehicle is driving is behavior planning (also called behavior decision-making).
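One common way to express behavior planning is a finite-state machine in which each driving behavior is a state and the planner decides when to switch. The sketch below is a deliberately tiny, assumed example with only three behaviors and hand-picked thresholds; it is not how any particular vehicle implements behavior decision-making.

```python
# A minimal sketch of behavior planning as a finite-state machine.
# States, inputs, and thresholds are illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class Situation:
    lead_gap_m: float            # distance to the vehicle ahead in our lane
    left_lane_clear: bool        # is the adjacent lane free for a lane change?
    approaching_intersection: bool

def next_behavior(current, s: Situation):
    """Pick the next driving behavior from the current one and the situation."""
    if s.approaching_intersection:
        return "INTERSECTION_HANDLING"
    if current == "LANE_KEEPING":
        # Switch to a lane change only when the lead gap is small
        # and the target lane is clear; otherwise keep the lane.
        if s.lead_gap_m < 30.0 and s.left_lane_clear:
            return "LANE_CHANGE_LEFT"
        return "LANE_KEEPING"
    if current == "LANE_CHANGE_LEFT":
        # A real planner would wait until the maneuver completes;
        # here we simply return to lane keeping.
        return "LANE_KEEPING"
    return current

state = "LANE_KEEPING"
state = next_behavior(state, Situation(lead_gap_m=20.0,
                                       left_lane_clear=True,
                                       approaching_intersection=False))
print(state)  # -> LANE_CHANGE_LEFT
```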

For individual driving behaviors, many automakers and research institutes have already done a great deal of work, and some features are on the market. Tesla's lane keeping, automatic lane change, and car-following functions are concrete examples of driving behaviors. But deciding how these behaviors switch and how they transition is still left to the human: adaptive cruise control, lane keeping, and automatic lane change all require the driver to supervise the machine and be ready to take over.

When different people drive under the same conditions, the sequences of driving behaviors they produce differ, and even the same behavior is executed differently, depending on personality, safety awareness, and mood at the time. For example, when we are in a hurry we change lanes more often and overtake with smaller safety margins. A novice driver may mistime lane changes and brake abruptly, and when a collision is unavoidable, different people may choose differently about what to hit. This kind of high-level human reasoning also involves law and ethics, and it is still difficult for machines to reach this level. Artificial intelligence, however, may be the breakthrough that solves this problem.

Vehicle positioning

When driving autonomously, a vehicle must answer three basic questions: 1. Where is the vehicle? 2. Where is it going? 3. How does it get there?

Where the vehicle is is essentially the vehicle localization problem. There are many positioning methods, such as satellite positioning, ground base-station positioning, visual or laser positioning, and inertial navigation. At present, unmanned-vehicle projects at domestic universities mostly use satellite and base-station positioning; the latter two methods are rarely involved.


Each positioning method has its limitations, so fusing multiple positioning methods is the trend.

For example, satellite positioning works over a wide area with high absolute position accuracy, but it is unusable indoors or in obstructed areas, and its reported position can wander over time. Visual or laser positioning has very high relative accuracy and does not drift, but it is strongly affected by the environment.

When positioning technology is applied to unmanned vehicles, satellite positioning handles wide-area absolute positioning on highways and in other open spaces, but when the car enters a tunnel, drives among tall buildings, or goes indoors, the positioning signal becomes unstable or is lost entirely. At that point, methods such as visual positioning or inertial navigation are needed to fill the gap.
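A minimal sketch of this complementary idea, under assumed numbers: dead reckoning from speed and heading carries the position estimate through a GNSS outage (for example a tunnel), and the next satellite fix removes the accumulated drift. Real systems fuse these sources with a filter rather than simply overwriting the estimate.

```python
# A minimal sketch of complementary positioning: dead reckoning during a
# GNSS outage, then snapping back to the satellite fix when it returns.
# All numbers are illustrative assumptions.
import math

def update_position(x, y, heading_rad, speed_mps, dt, gnss_fix=None):
    """Advance the pose by dead reckoning, then use GNSS if available."""
    # Dead reckoning: integrate speed along the current heading.
    x += speed_mps * dt * math.cos(heading_rad)
    y += speed_mps * dt * math.sin(heading_rad)
    if gnss_fix is not None:
        # GNSS provides a drift-free absolute position, so reset to it.
        x, y = gnss_fix
    return x, y

# Driving at 15 m/s heading due east; GNSS is lost for two steps (a tunnel),
# then a fix returns and corrects the accumulated drift.
x, y = 0.0, 0.0
for fix in [None, None, (31.2, 0.4)]:
    x, y = update_position(x, y, 0.0, 15.0, 1.0, gnss_fix=fix)
print(x, y)  # -> 31.2 0.4
```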

Vehicle localization directly or indirectly affects motion control and behavioral decisions, and even supplies information needed for perception. When executing a planned motion trajectory, the motion control algorithm needs localization to continuously feed back the vehicle's actual motion state for real-time adjustment. When switching behaviors, choosing the right moment requires knowing exactly where the vehicle sits within the traffic environment. On the perception side, techniques such as SLAM, which is used to build maps, need the vehicle's relative position.
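To illustrate why the controller depends on that feedback, here is a tiny assumed steering law: each control cycle it turns the localization-derived cross-track and heading errors into a bounded steering command. The gains and error definitions are illustrative assumptions, not a production control law.

```python
# A minimal sketch of feedback steering from localization-derived errors.
def steer_from_error(cross_track_err_m, heading_err_rad,
                     k_e=0.8, k_h=1.5, max_steer=0.5):
    """Steering angle (rad) from tracking errors; gains are assumptions."""
    steer = k_h * heading_err_rad + k_e * cross_track_err_m
    return max(-max_steer, min(max_steer, steer))

# Each control cycle, fresh localization supplies the errors; without that
# feedback the loop would be driving blind.
for err in (1.0, 0.6, 0.3, 0.1):
    print(round(steer_from_error(err, 0.0), 2))
```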

Conclusion

Autonomous vehicles are the product of the collision and fusion of the automotive industry and the robotics community. They bring together a series of advanced technologies: mechatronics, environmental perception, electronics and computing, automatic control, and artificial intelligence. As an essential means of transportation, the car will become more and more intelligent as these sub-technologies are integrated, developed, and pushed to breakthroughs, eventually achieving all-weather unmanned driving.
