CN107650908B - Unmanned vehicle environment sensing system - Google Patents
- Publication number
- CN107650908B (application CN201710969796.5A)
- Authority
- CN
- China
- Prior art keywords
- distance
- unmanned vehicle
- detection system
- detection
- processor
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
  - B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
  - B60W30/08—Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
  - B60W30/095—Predicting travel path or likelihood of collision
  - B60W30/0956—Predicting travel path or likelihood of collision the prediction being responsive to traffic or environmental parameters
  - B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
  - B60W40/02—Estimation or calculation of non-directly measurable driving parameters related to ambient conditions
  - B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
  - B60W50/0097—Predicting future conditions
  - B60W2554/00—Input parameters relating to objects
Abstract
The invention discloses an unmanned vehicle environment sensing system comprising: a first detection system for detecting whether an obstacle exists within a first distance range around the unmanned vehicle; a second detection system for collecting azimuth information of moving targets within a second distance range around the unmanned vehicle; and a third detection system for collecting, in a turnover scanning manner, image information over a set angle range within a third distance range around the unmanned vehicle, wherein the third distance is greater than the second distance and the second distance is greater than the first distance. The system realizes multi-distance, multi-level environment sensing: the first detection system detects obstacles at short range for anti-collision early warning; the second detection system detects the azimuth of moving targets at medium range, which facilitates moving-obstacle detection and anti-collision early warning during high-speed driving; and the third detection system performs long-distance, narrow-view image detection, which facilitates remote control and lock-on monitoring of specific targets.
Description
Technical Field
The invention relates to the field of unmanned vehicle control, in particular to an unmanned vehicle environment sensing system.
Background
Environment perception is critical for an unmanned vehicle to achieve its purpose of unmanned operation. An environment sensing system collects, processes, and semantically expresses surrounding environment parameters through various sensors. By collecting and analyzing these parameters, a semantic-level description of the surrounding environment can be produced, and this description is an important input to the automatic controller.
Sensors for environment perception can be classified by operating principle into visual sensors, laser sensors, microwave sensors, and ultrasonic sensors. A visual sensor acquires two-dimensional or three-dimensional image information of the vehicle's surroundings through machine vision and perceives the driving environment through image analysis and recognition. A laser sensor acquires two-dimensional or three-dimensional distance information of the surroundings through a lidar and perceives the driving environment through distance analysis and recognition. Laser sensors can be further divided by the number of scanned laser lines into single-line, 4-line, 8-line, 16-line, 32-line, 64-line, and other types. Microwave and ultrasonic sensors are similar to laser sensors in that they acquire distance information.
Current mainstream environment-perception schemes fall into two categories according to the primary sensor: lidar and machine vision. A lidar detects characteristic quantities of a target, such as position and speed, by emitting laser beams. Vehicle-mounted lidars generally use multiple laser transmitters and receivers to build a three-dimensional point cloud, thereby achieving real-time environment perception; among current vehicle-mounted lidars, the mechanical multi-beam lidar is the mainstream scheme. Lidar offers a wide detection range and high detection precision. Its disadvantages are equally evident, however: performance degrades in extreme weather such as rain, snow, and fog; the data volume is very large; and it is very expensive.
A machine-vision perception system is closer to how people understand the world, and vision products are inexpensive and can be deployed in quantity. However, no mature machine-vision environment-sensing algorithm currently exists, and its adaptability to the environment is weak: performance suffers in rain and snow and at night.
As a technology expected to replace manual driving in the future, unmanned driving requires an environment sensing system that works in all weather and at multiple distances. Neither a pure lidar scheme nor a pure machine-vision scheme satisfies this requirement.
Disclosure of Invention
The invention provides an unmanned vehicle environment sensing system to solve the technical problem that existing unmanned vehicle environment sensing is limited by the working range of a single sensor and cannot meet multi-distance, all-weather, 360-degree monitoring requirements.
The technical scheme adopted by the invention is as follows:
the unmanned vehicle environment sensing system comprises a first detection system for detecting whether an obstacle exists in a first distance range around an unmanned vehicle, a second detection system for acquiring azimuth information of a moving object in a second distance range around the unmanned vehicle, and a third detection system for acquiring image information in a set angle range in a third distance range around the unmanned vehicle in a turnover scanning mode, wherein the third distance is larger than the second distance and the second distance is larger than the first distance.
Further, the system also comprises a processor communicatively connected to each of the first, second, and third detection systems, the processor being configured to receive detection information from at least one of the three detection systems and to generate instructions for controlling the unmanned vehicle.
Further, the first detection system comprises a plurality of ultrasonic radars arranged around the unmanned vehicle body, and the plurality of ultrasonic radars are connected with the processor through the ultrasonic controller.
Further, a first processing module is configured on the processor and is used for generating an anti-collision early warning instruction according to obstacle detection information fed back by the plurality of ultrasonic radars.
Further, the second detection system comprises a multi-line laser radar for acquiring three-dimensional point clouds of a plurality of moving targets around the unmanned vehicle and/or a plurality of millimeter wave radars for acquiring azimuth information of the moving targets around the unmanned vehicle.
Further, the plurality of millimeter wave radars include a long-distance forward millimeter wave radar mounted at a front end of the unmanned vehicle body and a lateral short-distance millimeter wave radar mounted at a rear end of the unmanned vehicle body.
Further, a second processing module for generating an instant map according to detection information fed back by the multi-line laser radar and the millimeter wave radars is configured on the processor.
Further, the third detection system comprises an image stabilizing cradle head arranged on the unmanned vehicle body, and a visible light camera, an infrared light camera and a laser range finder which are driven by the image stabilizing cradle head to synchronously perform turnover scanning.
Further, the third detection system further comprises a controller, the controller is connected with the image stabilizing holder, the visible light camera, the infrared light camera and the laser range finder, the controller is in communication connection with the processor and is used for controlling turnover actions of the image stabilizing holder and feeding back detection information of the visible light camera, the infrared light camera and the laser range finder to the processor after fusion.
Further, the processor is provided with a remote communication port which is used for being in communication connection with the remote control terminal, and is used for uploading the detection information generated by the third detection system to the remote control terminal and receiving a remote control instruction issued by the remote control terminal.
The invention has the following beneficial effects:
according to the unmanned vehicle environment sensing system, the first detection system, the second detection system and the third detection system are arranged on the unmanned vehicle, so that multi-distance and multi-layer environment sensing is realized, the first detection system is used for detecting whether an obstacle exists in a short distance range or not so as to perform anti-collision early warning, the second detection system is used for detecting azimuth information of a moving target in a middle distance range, detection of the moving obstacle and anti-collision early warning in a high-speed driving process are facilitated, and the third detection system is used for long-distance and narrow-view image detection and is beneficial to remote control and locking monitoring of a specific target. The three distance level detection systems are combined, so that the multifunctional environment sensing is realized, and the method has a wide application value.
In addition to the objects, features and advantages described above, the present invention has other objects, features and advantages. The invention will be described in further detail with reference to the accompanying drawings.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the invention. In the drawings:
FIG. 1 is a schematic block diagram of an unmanned vehicle environment awareness system in accordance with a preferred embodiment of the present invention;
FIG. 2 is a schematic block diagram of an unmanned vehicle environment awareness system according to another embodiment of the present invention;
fig. 3 is a schematic block diagram of an unmanned vehicle environment awareness system according to yet another embodiment of the present invention.
Reference numerals illustrate:
10. a first detection system; 11. an ultrasonic radar; 12. an ultrasonic controller;
20. a second detection system; 21. a multi-line laser radar; 22. millimeter wave radar;
30. a third detection system; 31. a visible light camera; 32. an infrared light camera; 33. a laser range finder; 34. a controller;
40. a processor; 41. a remote communication port.
Detailed Description
It should be noted that, in the case of no conflict, the embodiments and features in the embodiments may be combined with each other. The invention will be described in detail below with reference to the drawings in connection with embodiments.
Referring to fig. 1, a preferred embodiment of the present invention provides an unmanned vehicle environment sensing system comprising: a first detection system 10 for detecting the presence of an obstacle within a first distance range around the unmanned vehicle; a second detection system 20 for acquiring azimuth information of moving targets within a second distance range around the unmanned vehicle; and a third detection system 30 for acquiring, in a turnover scanning manner, image information over a set angle range within a third distance range around the unmanned vehicle, wherein the third distance is greater than the second distance and the second distance is greater than the first distance.
In this unmanned vehicle environment sensing system, the first detection system 10, the second detection system 20, and the third detection system 30 are arranged on the unmanned vehicle to realize multi-distance, multi-level environment sensing. The first detection system 10 detects whether an obstacle exists at short range, for anti-collision early warning. The second detection system 20 detects the azimuth of moving targets at medium range, which facilitates moving-obstacle detection and anti-collision early warning during high-speed driving. The third detection system 30 performs long-distance, narrow-view image detection, which facilitates remote control and lock-on monitoring of specific targets. Combining the three distance-level detection systems yields multifunctional environment sensing with wide application value.
In this embodiment, referring to fig. 2, the unmanned vehicle environment sensing system further includes a processor 40 communicatively connected to each of the first detection system 10, the second detection system 20 and the third detection system 30, and the processor 40 is configured to receive at least one detection information of the first detection system 10, the second detection system 20 and the third detection system 30 and generate an instruction for controlling the unmanned vehicle.
Referring to fig. 3, in the present embodiment the first detection system 10 includes a plurality of ultrasonic radars 11 arranged around the body of the unmanned vehicle, connected to the processor 40 via an ultrasonic controller 12. Preferably there are 12 ultrasonic radars 11 distributed around the vehicle body, including units near the front and rear wheels; acquisition is controlled by the ultrasonic controller 12, which is communicatively connected to the processor 40 over RS-232 and sends the detection data to it. In this embodiment, the ultrasonic radars 11 detect obstacles within 3 meters of the vehicle. The processor 40 is provided with a first processing module that generates anti-collision early-warning commands from the obstacle detection information fed back by the ultrasonic radars 11; this module performs intelligent detection of short-range obstacles and anti-collision early warning while the unmanned vehicle is running. Preferably, the ultrasonic controller 12 derives the distance and azimuth of a close-range obstacle from the feedback signals of the ultrasonic radars 11 and passes this data to the first processing module, which then generates a command for route avoidance or deceleration control of the unmanned vehicle.
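The first processing module's behavior can be sketched as follows. The 3-meter envelope comes from this embodiment; the 1-meter deceleration threshold, the command names, and the data shapes are illustrative assumptions, not taken from the patent.

```python
# Hedged sketch of the first processing module: turn (distance, azimuth) reports
# from the ultrasonic controller into avoidance/deceleration commands.
WARN_RANGE_M = 3.0  # detection envelope cited for the ultrasonic radars

def collision_warning(obstacles):
    """obstacles: list of (distance_m, azimuth_deg) tuples from the controller."""
    commands = []
    for distance, azimuth in obstacles:
        if distance > WARN_RANGE_M:
            continue  # outside the first detection system's envelope
        # Assumed policy: brake hard for very close obstacles, steer around others.
        action = "decelerate" if distance < 1.0 else "avoid_route"
        commands.append({"action": action,
                         "azimuth_deg": azimuth,
                         "distance_m": distance})
    return commands
```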
In this embodiment, the second detection system 20 includes a multi-line laser radar 21 for acquiring three-dimensional point clouds of moving targets around the unmanned vehicle and/or a plurality of millimeter wave radars 22 for acquiring azimuth information of those targets. Specifically, the multi-line laser radar 21 is communicatively connected to the processor 40 over Ethernet and can directly acquire a three-dimensional point cloud within 360° and 100 meters of the vehicle body, for subsequent obstacle recognition and high-precision matching positioning. The multi-line lidar 21 here includes, but is not limited to, 16-line, 32-line, and 64-line lidar. In this embodiment, the millimeter wave radars 22 comprise a long-range forward millimeter wave radar (ESR) mounted at the front end of the vehicle body and two lateral short-range millimeter wave radars (RSDS) mounted at the rear end; combining the ESR with the two RSDS units yields the distance, azimuth, speed, and other information of moving targets within 360° and 100 meters of the unmanned vehicle. Preferably, the processor 40 is configured with a second processing module that builds a real-time map via SLAM (simultaneous localization and mapping) from the detection information fed back by the multi-line laser radar 21 and the millimeter wave radars 22, which facilitates high-precision three-dimensional map matching and positioning, obstacle detection, moving-obstacle detection, and anti-collision early warning during high-speed driving. Under the intelligent control of the second processing module, automatically controlled driving of the unmanned vehicle at speeds above 60 km/h can be achieved.
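The patent does not specify how the second processing module combines the two sensor feeds, so the following is only a minimal association sketch under stated assumptions: millimeter-wave tracks (distance, azimuth, speed) are paired with lidar cluster centroids by position, with an assumed 2-meter association gate.

```python
import math

# Illustrative radar-lidar association for the second detection system.
# The gate size and data structures are assumptions for illustration only.

def polar_to_xy(distance_m, azimuth_deg):
    """Convert a polar radar measurement to vehicle-frame Cartesian x, y."""
    a = math.radians(azimuth_deg)
    return (distance_m * math.cos(a), distance_m * math.sin(a))

def associate(radar_tracks, lidar_centroids, gate_m=2.0):
    """Pair each radar (distance, azimuth, speed) track with the nearest
    lidar centroid (x, y) lying inside the association gate."""
    fused = []
    for dist, az, speed in radar_tracks:
        rx, ry = polar_to_xy(dist, az)
        best = min(lidar_centroids,
                   key=lambda c: math.hypot(c[0] - rx, c[1] - ry),
                   default=None)
        if best and math.hypot(best[0] - rx, best[1] - ry) <= gate_m:
            fused.append({"xy": best, "speed_mps": speed})
    return fused
```

A full implementation would feed such fused targets into the SLAM map the second processing module maintains; that step is beyond this sketch.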
In this embodiment, the third detection system 30 includes an image stabilizing cradle head arranged on the body of the unmanned vehicle, together with a visible light camera 31, an infrared light camera 32, and a laser range finder 33 that are driven by the cradle head to perform synchronous turnover scanning. The third detection system 30 further includes a controller 34 connected to the image stabilizing cradle head, the visible light camera 31, the infrared light camera 32, and the laser range finder 33; the controller 34 is communicatively connected to the processor 40, controls the turnover action of the cradle head, and fuses the detection information of the three devices before feeding it back to the processor 40. In this embodiment, the visible light camera 31 collects visible-light image sequences of a target or scene (daytime operation), the infrared light camera 32 collects infrared image sequences (day-and-night operation), and the laser range finder 33 measures target distance (day-and-night operation), enabling tasks such as 360° day-and-night patrol reconnaissance, key-target tracking, and high-magnification staring at key targets. The image stabilizing cradle head rotates the visible light camera, infrared light camera, and laser range finder through 360°; the visible light camera works in the daytime, while the infrared light camera and laser range finder work day and night (as determined by their operating principles). Preferably, the visible light camera is equipped with a high-magnification continuous zoom lens, so high-magnification gazing at a key target can be achieved. Preferably, the processor has built-in image tracking cards for tracking targets in the visible-light and infrared imagery.
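The day/night division of labor described above can be sketched as a simple sensor-selection rule. The hour-based daytime window is an illustrative assumption; the patent only states that the visible light camera operates in daytime while the infrared camera and laser range finder operate day and night.

```python
# Sketch of day/night sensor selection for the third detection system.
# The 06:00-18:00 daytime window is an assumed example, not from the patent.
def active_sensors(hour: int) -> list:
    sensors = ["infrared_camera", "laser_range_finder"]  # day-and-night devices
    if 6 <= hour < 18:  # assumed daytime window
        sensors.insert(0, "visible_camera")  # daytime-only device
    return sensors
```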
Preferably, in this embodiment the processor 40 is provided with a remote communication port 41 for communication with a remote control terminal: detection information generated by the third detection system 30 is uploaded to the remote control terminal, and remote control instructions issued by the terminal are received. The remote communication port 41 may be a Wi-Fi, 4G, or similar wireless mobile communication module.
Preferably, an unmanned vehicle is further provided that comprises the unmanned vehicle environment sensing system of the embodiments above, so that the unmanned vehicle can meet multi-distance, multi-level environment sensing requirements.
The above description covers only preferred embodiments of the present invention and is not intended to limit it; various modifications and variations can be made by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principles of the present invention shall be included in its scope of protection.
Claims (1)
1. An unmanned vehicle environment sensing system is characterized by comprising a first detection system (10) for detecting whether an obstacle exists in a first distance range around an unmanned vehicle, a second detection system (20) for acquiring azimuth information of a moving object in a second distance range around the unmanned vehicle, and a third detection system (30) for acquiring image information in a set angle range in a third distance range around the unmanned vehicle in a turnover scanning mode, wherein the third distance is larger than the second distance and the second distance is larger than the first distance; the system further comprises a processor (40) which is in communication connection with the first detection system (10), the second detection system (20) and the third detection system (30), wherein the processor (40) is used for receiving at least one detection information of the first detection system (10), the second detection system (20) and the third detection system (30) and generating an instruction for controlling the unmanned vehicle; the first detection system (10) comprises a plurality of ultrasonic radars (11) arranged around the unmanned vehicle body and used for detecting obstacles within 3 meters around the vehicle, the plurality of ultrasonic radars (11) are connected with the processor (40) through an ultrasonic controller (12), a first processing module used for generating an anti-collision early warning instruction according to obstacle detection information fed back by the plurality of ultrasonic radars (11) is arranged on the processor (40), the ultrasonic controller (12) can acquire distance and azimuth data of a close-range obstacle according to feedback signals of the ultrasonic radars (11) and feed the distance and azimuth data back to the first processing module, and the first processing module generates an instruction for carrying out route avoidance or deceleration control on the unmanned vehicle according to the received distance and azimuth data of the obstacle;
the second detection system (20) comprises a multi-line laser radar (21) for acquiring three-dimensional point clouds of a plurality of moving targets around the unmanned vehicle and/or a plurality of millimeter wave radars (22) for acquiring azimuth information of the moving targets around the unmanned vehicle; the multiple millimeter wave radars (22) comprise long-distance forward millimeter wave radars arranged at the front end of the unmanned vehicle body and lateral short-distance millimeter wave radars arranged at the rear end of the unmanned vehicle body so as to acquire the distance, the azimuth and the speed of a moving target within 360 DEG and 100 meters of the unmanned vehicle, and the processor (40) is provided with a second processing module for generating an instant map according to detection information fed back by the multi-line laser radars (21) and the multiple millimeter wave radars (22), so that the matching and the positioning of the high-precision three-dimensional map are facilitated, the detection and the anti-collision early warning of the moving obstacle in the high-speed driving process can be realized, and the automatic control driving of the unmanned vehicle at the speed of more than 60KM/h can be realized;
the third detection system (30) comprises an image stabilizing cradle head arranged on the body of the unmanned vehicle, and a visible light camera (31), an infrared light camera (32) and a laser range finder (33) which are driven by the image stabilizing cradle head to synchronously perform turnover scanning; the third detection system (30) further comprises a controller (34), the controller (34) is connected with the image stabilizing holder, the visible light camera (31), the infrared light camera (32) and the laser range finder (33), the controller (34) is in communication connection with the processor (40) and is used for controlling turnover of the image stabilizing holder and feeding back detection information of the visible light camera (31), the infrared light camera (32) and the laser range finder (33) to the processor (40) after fusion; the processor (40) is provided with a remote communication port (41) which is used for being in communication connection with a remote control terminal, and is used for uploading detection information generated by the third detection system (30) to the remote control terminal and receiving a remote control instruction issued by the remote control terminal.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710969796.5A CN107650908B (en) | 2017-10-18 | 2017-10-18 | Unmanned vehicle environment sensing system |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710969796.5A CN107650908B (en) | 2017-10-18 | 2017-10-18 | Unmanned vehicle environment sensing system |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107650908A CN107650908A (en) | 2018-02-02 |
CN107650908B true CN107650908B (en) | 2023-07-14 |
Family
ID=61118296
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710969796.5A Active CN107650908B (en) | 2017-10-18 | 2017-10-18 | Unmanned vehicle environment sensing system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107650908B (en) |
Families Citing this family (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108089203A (en) * | 2018-02-05 | 2018-05-29 | 弗徕威智能机器人科技(上海)有限公司 | A kind of special obstacle object detecting method |
CN109283538B (en) * | 2018-07-13 | 2023-06-13 | 上海大学 | Marine target size detection method based on vision and laser sensor data fusion |
CN108944947A (en) * | 2018-07-15 | 2018-12-07 | 北京三快在线科技有限公司 | The prediction technique and device of steer decision |
CN109177876A (en) * | 2018-08-13 | 2019-01-11 | 吉利汽车研究院(宁波)有限公司 | A kind of moving Object Detection alarm system and method |
CN110799853B (en) * | 2018-10-26 | 2024-04-30 | 深圳市大疆创新科技有限公司 | Environment sensing system and mobile platform |
CN109375629A (en) * | 2018-12-05 | 2019-02-22 | 苏州博众机器人有限公司 | A kind of cruiser and its barrier-avoiding method that navigates |
CN109782764A (en) * | 2019-01-21 | 2019-05-21 | 湖北汽车工业学院 | A kind of unmanned logistics distribution system of intelligent solar, control method and dispensing vehicle |
CN110239597A (en) * | 2019-07-03 | 2019-09-17 | 中铁轨道交通装备有限公司 | A kind of active Unmanned Systems of Straddle type monorail train |
CN110412986A (en) * | 2019-08-19 | 2019-11-05 | 中车株洲电力机车有限公司 | A kind of vehicle barrier detection method and system |
CN110531791A (en) * | 2019-08-25 | 2019-12-03 | 西北工业大学 | The machine integrated target detection unmanned vehicle of multiple instruction set hypencephalon |
CN110525431B (en) * | 2019-09-24 | 2021-07-02 | 江苏经纬智联航空科技有限公司 | Special vehicle anti-collision method applied to airport based on intelligent control |
CN110751336B (en) * | 2019-10-22 | 2023-04-14 | 深圳市道通智能航空技术股份有限公司 | Obstacle avoidance method and obstacle avoidance device of unmanned carrier and unmanned carrier |
CN111308461B (en) * | 2020-04-15 | 2024-05-07 | 长春大学 | Obstacle detection system, detection method and detection device for low-speed vehicle |
CN111781606A (en) * | 2020-07-02 | 2020-10-16 | 大唐信通(浙江)科技有限公司 | Novel miniaturization implementation method for fusion of laser radar and ultrasonic radar |
CN111845347B (en) * | 2020-07-30 | 2022-03-15 | 广州小鹏汽车科技有限公司 | Vehicle driving safety prompting method, vehicle and storage medium |
CN112596050B (en) * | 2020-12-09 | 2024-04-12 | 上海商汤临港智能科技有限公司 | Vehicle, vehicle-mounted sensor system and driving data acquisition method |
CN113405546A (en) * | 2021-05-18 | 2021-09-17 | 智能移动机器人(中山)研究院 | Global sensing early warning system of intelligent sensing sensor |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2016088183A (en) * | 2014-10-31 | 2016-05-23 | 株式会社Ihi | Obstacle detection system and railway vehicle |
CN105711597A (en) * | 2016-02-25 | 2016-06-29 | 江苏大学 | System and method for sensing local driving environment in front |
CN207274661U (en) * | 2017-10-18 | 2018-04-27 | 长沙冰眼电子科技有限公司 | Unmanned vehicle context aware systems |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
TWI339627B (en) * | 2008-12-30 | 2011-04-01 | Ind Tech Res Inst | System and method for detecting surrounding environment |
- 2017-10-18: CN application CN201710969796.5A filed; granted as CN107650908B (status: Active)
Also Published As
Publication number | Publication date |
---|---|
CN107650908A (en) | 2018-02-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107650908B (en) | Unmanned vehicle environment sensing system | |
US10739438B2 (en) | Super-resolution radar for autonomous vehicles | |
US20190204834A1 (en) | Method and apparatus for object detection using convolutional neural network systems | |
EP3742200B1 (en) | Detection apparatus and parameter adjustment method thereof | |
US11852746B2 (en) | Multi-sensor fusion platform for bootstrapping the training of a beam steering radar | |
US20230052240A1 (en) | Geographically disparate sensor fusion for enhanced target detection and identification in autonomous vehicles | |
US11011063B2 (en) | Distributed data collection and processing among vehicle convoy members | |
CN111010532B (en) | Vehicle-mounted machine vision system based on multi-focal-distance camera group | |
US11585896B2 (en) | Motion-based object detection in a vehicle radar using convolutional neural network systems | |
CN105137421A (en) | Photoelectric composite low-altitude early warning detection system | |
CN113085896B (en) | Auxiliary automatic driving system and method for modern rail cleaning vehicle | |
CN114442101B (en) | Vehicle navigation method, device, equipment and medium based on imaging millimeter wave radar | |
CN106313046A (en) | Multi-level obstacle avoidance system of mobile robot | |
US20230196510A1 (en) | Super-resolution radar for autonomous vehicles | |
CN209852236U (en) | Environment sensing device for unmanned truck | |
CN108958267A (en) | A kind of unmanned vehicle barrier-avoiding method based on laser radar | |
US11852749B2 (en) | Method and apparatus for object detection using a beam steering radar and a decision network | |
EP3749977A1 (en) | Method and apparatus for object detection using a beam steering radar and convolutional neural network system | |
CN207274661U (en) | Unmanned vehicle context aware systems | |
CN204124125U (en) | A kind of front vehicles state of kinematic motion follows the trail of prediction unit | |
CN113759947A (en) | Airborne flight obstacle avoidance auxiliary method, device and system based on laser radar | |
CN113734197A (en) | Unmanned intelligent control scheme based on data fusion | |
CN207396738U (en) | A kind of intelligent Vehicular radar system | |
CN112735121A (en) | Holographic sensing system based on image-level laser radar | |
CN206757038U (en) | Pilotless automobile anticollision millimetre-wave radar system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||