WO2023027538A1 - Autonomous vehicle - Google Patents
Autonomous vehicle
- Publication number
- WO2023027538A1 PCT/KR2022/012777
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- camera
- vehicle
- distance
- modules
- main
- Prior art date
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/02—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W60/00—Drive control systems specially adapted for autonomous road vehicles
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
Definitions
- The present invention relates to a vehicle for autonomous driving that can photograph the surroundings of the vehicle at a wide angle of view, provide images around the vehicle with almost no blind spots, and obtain distance information from the vehicle to objects existing in various distance ranges.
- In autonomous driving, understanding the driving environment surrounding the vehicle plays a particularly important role, and the processing of visual information carries the greatest weight in understanding that environment.
- An autonomous vehicle detects objects from visual information, plans the vehicle's movement path based on them, and drives the vehicle along that path.
- Object recognition in the autonomous vehicle field uses multiple heterogeneous sensors at the same time, because satisfactory object recognition is not achieved with information from a single sensor.
- Sensors that collect visual information in autonomous vehicles include lidar, radar, and cameras.
- Lidar and radar transmit electromagnetic waves outward and collect the signals scattered by objects.
- Lidar provides processed 3D information, so it is well suited to autonomous vehicles, but it cannot directly measure speed, and its long-distance recognition may deteriorate in harsh environments such as bad weather or strong sunlight.
- Radar can directly measure the distance and speed of external objects and can detect at longer range than lidar, but its poor resolution makes it unsuitable when the target is a person or a vehicle.
- There is also the ultrasonic sensor, widely used to emit a warning sound when the vehicle approaches an adjacent vehicle during parking.
- Ultrasonic sensors calculate distance by generating mechanical sound waves and measuring their reflections from nearby objects. They work reliably in bad weather and even in adverse conditions where the sensor is covered in dust, but they are not yet universally adopted for self-driving vehicles.
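- As a minimal sketch of the time-of-flight principle behind ultrasonic ranging (the speed-of-sound constant and the function below are illustrative assumptions, not part of this disclosure):

```python
SPEED_OF_SOUND_M_S = 343.0  # in air at about 20 °C (assumed)

def ultrasonic_distance_m(echo_round_trip_s: float) -> float:
    """Distance to the reflecting object: the pulse travels out and back,
    so the one-way distance is half of speed times round-trip time."""
    return SPEED_OF_SOUND_M_S * echo_round_trip_s / 2.0

# A 5.8 ms echo corresponds to roughly 1 m
print(ultrasonic_distance_m(0.0058))  # ~0.99 m
```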
- A typical sensor that collects visual information in an autonomous vehicle is the camera. Cameras can collect a wealth of information, but recognizing objects from camera images requires substantial computation.
- Autonomous vehicles must not only recognize objects but also determine whether an object is far or near, focusing more on nearby objects to ensure the accuracy and stability of autonomous driving.
- Accordingly, there is a need for autonomous vehicles that can recognize external objects using cameras, provide wide-angle images around the vehicle with almost no blind spots, and obtain distance information for objects in various distance ranges from the vehicle.
- The present invention has been devised to solve this problem. An object of the present invention is to provide an autonomous vehicle that can photograph its surroundings at a wide angle of view, providing images around the vehicle with almost no blind spots, and that can acquire distance information for objects present in various distance ranges from the vehicle.
- To this end, the plurality of camera groups mounted along the circumference of the vehicle includes a front left camera group, a front center camera group, a front right camera group, a right side camera group, a left side camera group, and a rear center camera group; each camera group is composed of at least one camera module consisting of a main camera and auxiliary camera pair, and a processor for obtaining distance information between an object and the vehicle may be included.
- the self-driving vehicle can capture the surroundings of the vehicle with a wide angle of view, providing images around the vehicle with almost no blind spots.
- In addition, the self-driving vehicle selectively combines a plurality of cameras mounted along the circumference of the vehicle according to the distance range to be measured and drives them as a pair, so that distance information between the vehicle and an object existing in the measurable range determined by the separation distance between the two selected cameras may be obtained.
- FIG. 1 is a block diagram showing the configuration of an apparatus for measuring proximity distance of an autonomous vehicle according to an embodiment of the present invention.
- FIG. 2 is a diagram schematically illustrating the configuration of an apparatus for measuring proximity distance of an autonomous vehicle according to the present invention.
- FIG. 3 is a diagram illustrating an example in which a proximity distance measuring device for an autonomous vehicle according to an embodiment of the present invention calculates a distance to a nearby object.
- FIG. 4 is a diagram illustrating an example of another configuration of a main camera and an auxiliary camera according to an embodiment of the present invention.
- FIG. 5 is a diagram illustrating an example of another configuration of a main camera and an auxiliary camera according to an embodiment of the present invention.
- FIG. 6 is a flowchart illustrating a method for measuring a proximity distance of an autonomous vehicle according to an embodiment of the present invention.
- FIG. 7 illustrates an example of a camera group mounted on an autonomous vehicle according to another embodiment of the present invention.
- FIG. 8 is an example illustrating a proximity range and a short-distance range in which a distance can be measured in an autonomous vehicle according to an embodiment of the present invention.
- FIG. 9 is an example illustrating a mid-distance range in which a distance can be measured in an autonomous vehicle according to an embodiment of the present invention.
- FIG. 10 is an example illustrating a long-distance range in which a distance can be measured in an autonomous vehicle according to an embodiment of the present invention.
- FIG. 11 is an example illustrating various ranges in which a distance can be measured in an autonomous vehicle according to an embodiment of the present invention.
- FIG. 12 illustrates an example of a plurality of camera groups mounted on an autonomous vehicle according to an embodiment of the present invention.
- FIG. 13 illustrates an example of a plurality of camera groups mounted on an autonomous vehicle according to an embodiment of the present invention.
- FIG. 14 is a diagram illustrating an example of an integrated frame mounting a camera group of an autonomous driving vehicle according to an embodiment of the present invention.
- FIG. 15 is a diagram illustrating an example of a camera group and peripheral devices mounted in an integrated frame according to an embodiment of the present invention.
- FIG. 16 is a diagram illustrating an example of wiring of a camera group and a processor mounted in an integrated frame according to an embodiment of the present invention.
- FIG. 1 is a block diagram showing the configuration of an apparatus for measuring proximity distance of an autonomous vehicle according to an embodiment of the present invention.
- an apparatus for measuring a proximity distance of an autonomous vehicle includes a main camera 100 , an auxiliary camera 200 and a processor 300 .
- the main camera 100 captures an external image for autonomous driving of the vehicle and transmits the captured image data to the processor 300 .
- the auxiliary camera 200 is provided as a pair with the main camera 100 and transmits the acquired auxiliary image to the processor 300 . Driving of the main camera 100 and the auxiliary camera 200 will be described in detail below.
- The main camera 100 and the auxiliary camera 200 are installed outside the vehicle rather than inside, to reduce the image-processing load by narrowing the angle of view; in some cases, however, they may be installed inside the vehicle, and the installation location is not limited.
- the processor 300 controls the overall operation of the proximity distance measuring device of the autonomous vehicle.
- The processor 300 may detect the driving speed of the autonomous vehicle and execute the proximity distance measuring device in a driving mode or a proximity mode according to the detected speed.
- The processor 300 executes the driving mode when it detects that the self-driving vehicle is traveling above a predetermined speed range, and activates the proximity mode when it detects that the vehicle is traveling below a predetermined speed range.
- The driving mode refers to an operation in which a surrounding image is captured through a general lens camera mounted on the self-driving vehicle and the captured image is transmitted to the processor 300 to be used to control the vehicle traveling at or above the predetermined speed.
- the camera used at this time may be set as the main camera 100 in the present invention.
- In the proximity mode, the main camera 100 and the auxiliary camera 200 provided as a pair are operated together; the proximity distance between the vehicle and an object close to it is measured using the images acquired by the main camera 100 and the auxiliary camera 200, and the measured distance information is transmitted to the processor 300.
- When the autonomous vehicle travels below the preset speed range, it may, for example, be parking, decelerating, or stopping suddenly. Therefore, when parking, the proximity distance between the vehicle and an adjacent wall is precisely measured; during a sudden stop or deceleration, the proximity distance between the vehicle and a nearby object is precisely measured; and the vehicle is controlled based on the measured distance information to prevent collisions and improve the safety and accuracy of autonomous driving.
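- A minimal sketch of this speed-based mode switching (the threshold values and names below are illustrative assumptions; the patent leaves the first and second speed ranges to the designer):

```python
DRIVING_MIN_KMH = 30.0    # assumed "first speed range" boundary
PROXIMITY_MAX_KMH = 10.0  # assumed "second speed range" boundary

def select_mode(speed_kmh: float) -> str:
    """Pick the operating mode from the current vehicle speed."""
    if speed_kmh > DRIVING_MIN_KMH:
        return "driving"    # main cameras act as viewers only
    if speed_kmh <= PROXIMITY_MAX_KMH:
        return "proximity"  # main + auxiliary pairs measure close distances
    return "driving"        # between the two ranges the patent leaves it open

print(select_mode(5.0))   # proximity
print(select_mode(50.0))  # driving
```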
- FIG. 2 is a diagram schematically illustrating the configuration of an apparatus for measuring proximity distance of an autonomous vehicle according to the present invention.
- The main camera 100 includes a lens 110 of the type mainly used in cameras for autonomous vehicles, and has a sealed main housing in which the lens 110 is mounted.
- the main camera 100 may obtain a main image of a nearby object.
- the auxiliary camera 200 may be provided as a pinhole camera, and is provided as a pair with the main camera 100 to acquire auxiliary images.
- The auxiliary camera 200 consists of a pinhole 201 at its front, a light-transmitting part 202 that can be opened and closed in front of the pinhole 201, and a pinhole opening/closing valve 204 that opens and closes the pinhole from the outside by adjusting the light-transmitting part 202 under the control of the processor 300, all mounted in a sealed auxiliary housing 205.
- a pinhole camera is a camera that transmits light through a pinhole, which is a small hole, detects subject information by an image sensor, and converts it into an image signal to take pictures.
- A pinhole camera has the advantage of producing a clear image through a hole as small as a needle's eye, and being ultra-small, it can be installed regardless of location.
- Unlike a lens camera, a pinhole camera stays in focus from close range to far distances, but because little light enters through the hole it requires a long exposure and is poorly suited to shooting moving objects; for this reason it is employed here as the auxiliary camera.
- The auxiliary camera 200 may be arranged to face the same direction as the front of the main camera 100. Also, pairs of main cameras 100 and auxiliary cameras 200 may be provided at various parts of the vehicle, and the processor 300 may be connected to and shared by each pair.
- the processor 300 executes the driving mode when the driving speed of the autonomous vehicle is detected to be higher than the first speed range.
- The driving mode generally refers to an operation of acquiring an image through a normal camera installed in an autonomous vehicle and transmitting the image to the processor 300.
- The processor 300 executes the proximity mode to operate the main camera 100 and the auxiliary camera 200 together when the driving speed of the self-driving vehicle is detected to be at or below the preset second speed range, so that a main image is acquired from the main camera 100 and an auxiliary image from the auxiliary camera 200.
- the first speed range and the second speed range may be set to the same value or different values according to a designer's setting.
- In the proximity mode, the processor 300 transmits a pinhole valve open signal to the pinhole opening/closing valve 204 to open the light transmission unit 202 and acquires an auxiliary image of a nearby object through the auxiliary camera 200.
- the processor 300 may detect a nearby object in each of the main image and the auxiliary image, and calculate detection area information including coordinates of the detected nearby object.
- the processor 300 detects a nearby object using pattern recognition based on an object's appearance.
- Pattern recognition can learn object patterns from a set of training images of various object shapes, and nearby objects can be detected using the learned model.
- Methods such as linear discriminant analysis, artificial neural networks, AdaBoost, Haar feature filters, and support vector machines may be used.
- the processor 300 is not limited to this object detection method and may recognize and detect a nearby object in the main image and the auxiliary image using various known methods and algorithms.
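- One concrete appearance-based detector of the kind listed above is OpenCV's HOG descriptor with its built-in SVM people detector; the sketch below is only an illustrative stand-in, since the patent does not specify a particular model or training set, and the input file name is assumed:

```python
import cv2

# HOG features classified by a pretrained linear SVM (pedestrians here);
# any of the appearance-based methods listed above could fill the same role.
hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

frame = cv2.imread("main_image.jpg")  # assumed frame from the main camera
boxes, weights = hog.detectMultiScale(frame, winStride=(8, 8))
for (x, y, w, h) in boxes:
    print(f"detected object at x={x}, y={y}, w={w}, h={h}")
```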
- the processor 300 detects matching points in each of the main image and the auxiliary image.
- The matching point is a feature point or feature region that serves as the reference when calculating the distance to the photographed nearby object based on the main image and the auxiliary image; the same feature point is detected in each of the two images.
- When the nearby object is analyzed as a vehicle, the processor 300 selects a specific part of the vehicle detected in each of the main image and the auxiliary image as a feature point and detects it as the matching point.
- For example, the center point between the two headlights of an adjacent vehicle may be set as the matching point; in this case, the matching point can be detected by calculating the center of gravity of the two headlight regions in each image.
- The center of gravity can be computed using any known algorithm for calculating the centroid of a general 2D shape or pixel region.
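- A minimal sketch of such a centroid-based matching point (the binary headlight masks are assumed inputs produced by the detection step):

```python
import numpy as np

def centroid(mask: np.ndarray) -> tuple[float, float]:
    """Center of gravity of a binary pixel region (nonzero pixels)."""
    ys, xs = np.nonzero(mask)
    return float(xs.mean()), float(ys.mean())

def matching_point(left_lamp: np.ndarray, right_lamp: np.ndarray) -> tuple[float, float]:
    """Midpoint between the centroids of the two headlight regions."""
    (x1, y1), (x2, y2) = centroid(left_lamp), centroid(right_lamp)
    return ((x1 + x2) / 2.0, (y1 + y2) / 2.0)
```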
- the processor 300 calculates a distance from the cameras 100 and 200 to a nearby object based on matching points detected in the main image and the auxiliary image, respectively. In one embodiment, the processor 300 may measure the distance to a nearby object by matching the main image and the auxiliary image.
- FIG. 3 is a diagram illustrating an example in which a proximity distance measuring device for an autonomous vehicle according to an embodiment of the present invention calculates a distance to a nearby object.
- The main camera 100 and the auxiliary camera 200 are installed facing forward with the same viewing angle and photograph a nearby object S in front. The two cameras 100 and 200 are separated by a predetermined distance D.
- In the main image captured by the main camera 100, the angle between the matching point of the nearby object S and the image center is measured as the first angle θ1, and in the auxiliary image the angle between the matching point and the image center is measured as the second angle θ2.
- The processor 300 calculates the first angle θ1 and the second angle θ2 in each image, and from these angles and the separation distance D between the cameras it can calculate the distance from the vehicle on which the cameras 100 and 200 are mounted to the nearby object S.
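- A worked sketch of this two-camera triangulation, assuming parallel optical axes and a matching point lying between them (the numeric example is illustrative):

```python
import math

def triangulated_distance(baseline_d: float, theta1_deg: float, theta2_deg: float) -> float:
    """Perpendicular distance Z to the matching point. With parallel axes,
    the lateral offsets from each axis satisfy Z*tan(θ1) + Z*tan(θ2) = D,
    hence Z = D / (tan θ1 + tan θ2)."""
    t1 = math.tan(math.radians(theta1_deg))
    t2 = math.tan(math.radians(theta2_deg))
    return baseline_d / (t1 + t2)

# Example: a 0.30 m baseline and angles of 12° and 9° give roughly 0.81 m
print(triangulated_distance(0.30, 12.0, 9.0))
```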
- FIGS. 4 and 5 are diagrams illustrating an example of another configuration of a main camera and an auxiliary camera according to an embodiment of the present invention.
- The auxiliary cameras 200 may be arranged to face the same direction as the front of the main camera 100, and a plurality of them may be provided in the area adjacent to the main camera 100.
- FIG. 4 shows an example in which the plurality of auxiliary cameras 210 to 240 are arranged in a straight line centered on the main camera 100, and FIG. 5 shows a configuration in which the auxiliary cameras surround the main camera 100 in all directions (360 degrees).
- The processor 300 calculates the distance to the nearby object with each pair formed by the one main camera 100 and each of the auxiliary cameras 210 to 270, averages these distances, and may finally determine the obtained average value as the distance from the vehicle to the nearby object S.
- In this way, the error in the distance measured from the vehicle-mounted cameras to a nearby object may be reduced.
- The processor 300 may also calculate the length, width, and height of a nearby object from the distances obtained with each of the pairs formed by the main camera 100 and the plurality of auxiliary cameras 210 to 270, and may calculate its volume from the calculated length, width, and height.
- the method of detecting the nearby object using pattern recognition based on the appearance of the nearby object has been described above.
- Thus, by calculating the volume of a nearby object using each pair formed by the main camera 100 and the plurality of auxiliary cameras 210 to 270, the present invention increases the accuracy of driving-environment recognition and further improves the stability of autonomous driving.
- FIG. 6 is a flowchart illustrating a method for measuring a proximity distance of an autonomous vehicle according to an embodiment of the present invention.
- First, the speed of the autonomous vehicle is detected (S100) and compared with a preset speed range (the first speed range) (S110). If the detected speed exceeds the first speed range, the driving mode is executed until driving ends (S120). If the vehicle speed falls below the first speed range while the driving mode is running, the process switches to step S140.
- In step S110, if the sensed vehicle speed is below the first speed range, it is compared with the preset second speed range (S140); if the sensed speed is at or below the second speed range, the proximity mode is executed (S150). Accordingly, the main camera and the auxiliary camera are operated together (S160), the main image is obtained from the main camera and the auxiliary image from the auxiliary camera (S170), and the distance from the vehicle to the nearby object is calculated (S180).
- matching points are detected in each of the main image and the auxiliary image.
- The matching point is a feature point or feature region that serves as the reference when calculating the distance to the photographed nearby object based on the main image and the auxiliary image, and the same feature point is detected in each of the two images.
- When a nearby object is analyzed as a vehicle, a specific part of the vehicle detected in each of the main image and the auxiliary image may be selected as a feature point and detected as the matching point. The distances from the cameras to the nearby object are then calculated based on the matching points detected in the main image and the auxiliary image. In one embodiment, the distance to the nearby object may be measured by matching the main image and the auxiliary image.
- Then, driving of the autonomous vehicle is controlled until driving ends (S190). The driving speed is detected continuously until driving ends, and according to the real-time speed the mode may switch back to the driving mode or the proximity mode may be maintained.
- As described above, the apparatus for measuring the proximity distance of an autonomous vehicle executes the driving mode or the proximity mode according to the driving speed; in the proximity mode, the distance between a nearby object and the vehicle during parking or a sudden stop can be measured precisely, and based on this the autonomous driving of the vehicle can be controlled safely.
- In addition, since the present invention acquires two-dimensional images and 3D relative coordinates can be matched from the 2D images, precision can be increased.
- As shown in FIG. 7, an autonomous vehicle according to another embodiment of the present invention includes a front left camera group 10, a front center camera group 20, a front right camera group 30, a right side camera group 40, a left side camera group 50, and a rear center camera group 60 provided along the outer circumference of the vehicle, together with a processor 300.
- Each of the camera groups 10 to 60 may include at least one camera module including a pair of the main camera 100 and the auxiliary camera 200 described above. Operations of each camera group 10 to 60 may be controlled by the processor 300 .
- The main camera 100 is a lens camera of the type generally used in autonomous vehicles; in normal operation it functions as a viewer, capturing the image in front of it and transmitting the captured image to the processor 300.
- When the main camera 100 is selected by the processor 300 to calculate the distance to an object near the vehicle, it is driven as a pair with the auxiliary camera 200 selected together with it, and the pair is used to calculate the distance to the object.
- The angle of view of the main camera 100 is set to about 45 degrees, and any method commonly known in the art may be used to adjust the camera's angle of view.
- The auxiliary camera 200 is provided as a pinhole camera in the same manner as in the above-described embodiment, and the processor 300 transmits a pinhole valve open signal to the pinhole opening/closing valve 204 to open the light transmission unit 202, thereby enabling photographing.
- the angle of view of the auxiliary camera 200 is set to about 45 degrees.
- The front left camera group 10 includes a plurality of camera modules each composed of one of the above-described main cameras 101 to 103 paired with one of the auxiliary cameras 201 to 203; in one embodiment, the first camera module 101 and 201, the second camera module 102 and 202, and the third camera module 103 and 203 may be arranged in order and provided at the front left corner of the vehicle.
- the front center camera group 20 includes one fourth camera module 104, 204 composed of a pair of a main camera 104 and an auxiliary camera 204, and may be provided at the front center of the vehicle.
- the front right camera group 30 includes a plurality of camera modules composed of a pair of main cameras 105 to 107 and auxiliary cameras 205 to 207, and in one embodiment, the fifth camera modules 105 and 205, The sixth camera modules 106 and 206 and the seventh camera modules 107 and 207 may be arranged in order and provided at the front right corner of the vehicle.
- The right side camera group 40 includes a plurality of camera modules composed of main cameras 108 and 109 paired with auxiliary cameras 208 and 209; in one embodiment, the eighth camera module 108 and 208 and the ninth camera module 109 and 209 are arranged side by side and may be provided at the center of the right side of the vehicle.
- The left side camera group 50 includes a plurality of camera modules composed of main cameras 111 and 112 paired with auxiliary cameras 211 and 212; in one embodiment, the tenth camera module 111 and 211 and the eleventh camera module 112 and 212 are arranged side by side and may be provided at the center of the left side of the vehicle.
- the rear center camera group 60 includes one twelfth camera module 113 and 213 composed of a pair of a main camera 113 and an auxiliary camera 213 and may be provided at the rear center of the vehicle.
- With these camera groups, the self-driving vehicle of the present invention may obtain information for calculating the distance between the vehicle and an object existing in a proximity range (A), a short range (B), a medium range (C), or a long range (D), as shown in FIG. 11.
- That is, the processor 300 selectively combines a main camera and an auxiliary camera from among the plurality of camera modules according to the distance range to be measured and drives them as a pair, so that distance information between the vehicle and an object existing in the measurable range determined by the separation distance between the two selected cameras may be obtained.
- Based on the main camera 100 and auxiliary camera 200 pair of each of the first to twelfth camera modules, the processor 300 can calculate the distance between the vehicle and an object existing in the proximity range (A).
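- A minimal sketch of this range-based pair selection, using the module pairings given in the embodiments below (the dictionary layout and names are illustrative assumptions):

```python
# Baselines grow from L1 (within one module) to L4 (across the vehicle front),
# and a longer baseline yields a farther measurable range.
PAIR_BY_RANGE = {
    "proximity": ("main_1", "aux_1"),   # L1: main/aux of the same module
    "short":     ("main_3", "aux_4"),   # L2: adjacent 3rd and 4th modules
    "medium":    ("main_1", "aux_10"),  # L3: 1st and 10th modules
    "long":      ("main_3", "aux_5"),   # L4: 3rd and 5th modules
}

def select_pair(target_range: str) -> tuple[str, str]:
    """Return the (main, auxiliary) cameras to drive together for a range."""
    return PAIR_BY_RANGE[target_range]

print(select_pair("medium"))  # ('main_1', 'aux_10')
```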
- FIG. 8 is an example illustrating a proximity range and a short-distance range in which a distance can be measured in an autonomous vehicle according to an embodiment of the present invention.
- For the proximity range, the processor 300 operates the main camera 100 and the auxiliary camera 200 constituting one camera module together as a pair. The part where the angles of view of this pair overlap corresponds to the proximity range.
- The overlapping angle of view of the main camera 100 and auxiliary camera 200 pair constituting one camera module will be referred to as the first angle of view, which is about 45 degrees.
- The separation distance D between the main camera 100 and the auxiliary camera 200 constituting one camera module will be referred to as the first separation distance L1.
- When the first camera module 101 and 201, the second camera module 102 and 202, and the third camera module 103 and 203 are all operated, the front left can be covered with a 135-degree field of view corresponding to the proximity range (A).
- the processor 300 may take a panoramic photograph through the first camera modules 101 and 201 , the second camera modules 102 and 202 , and the third camera modules 103 and 203 .
- To this end, the processor 300 joins the images taken by the first main camera 101, the second main camera 102, and the third main camera 103; since the cameras' angles of view overlap, a panoramic image may be obtained by removing the overlapping portions with a software program.
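- A minimal sketch of such overlap-removing image joining, using OpenCV's high-level stitcher (one common software approach; the patent does not name a specific library, and the file names are assumptions):

```python
import cv2

# Stitch the three overlapping front-left views into one panorama.
images = [cv2.imread(p) for p in ("cam101.jpg", "cam102.jpg", "cam103.jpg")]
stitcher = cv2.Stitcher.create(cv2.Stitcher_PANORAMA)
status, panorama = stitcher.stitch(images)
if status == cv2.Stitcher_OK:
    cv2.imwrite("front_left_panorama.jpg", panorama)
else:
    print("stitching failed with status", status)
```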
- Likewise, when the fifth camera module 105 and 205, the sixth camera module 106 and 206, and the seventh camera module 107 and 207 are all operated, the front right can be covered with a 135-degree field of view corresponding to the proximity range (A).
- the processor 300 may take a panoramic photograph through the fifth camera modules 105 and 205 , the sixth camera modules 106 and 206 , and the seventh camera modules 107 and 207 .
- For the short range, the processor 300 drives the main camera in one of two camera modules spaced apart by the second separation distance L2 and the auxiliary camera in the other module, and can thereby obtain distance information for an object existing in the short range (B).
- the two adjacent camera modules refer to two camera modules spaced apart by a second separation distance L2, such as the third camera modules 103 and 203 and the fourth camera modules 104 and 204.
- the fourth camera modules 104 and 204 and the fifth camera modules 105 and 205 are also spaced apart by a second separation distance L2.
- In this way, the main camera of one camera module and the auxiliary camera of the other are driven, and distance information for an object in the short range (B) can be obtained based on the region where their angles of view overlap.
- The second separation distance L2 is longer than the above-described first separation distance L1; therefore, the short range (B) determined by the second separation distance L2 extends farther from the vehicle than the proximity range (A) determined by the first separation distance L1.
- FIG. 9 is an example illustrating a mid-distance range in which a distance can be measured in an autonomous vehicle according to an embodiment of the present invention.
- For the medium range, the processor 300 drives the main camera in one of two camera modules spaced apart by the third separation distance L3 and the auxiliary camera in the other module, and can thereby obtain distance information for an object existing in the medium range (C).
- The two camera modules spaced apart by the third separation distance L3 are, for example, the first camera module 101 and 201 and the tenth camera module 111 and 211.
- the seventh camera modules 107 and 207 and the eighth camera modules 108 and 208 are spaced apart by a third separation distance L3.
- The third separation distance L3 is longer than the aforementioned second separation distance L2; therefore, the medium range (C) extends farther from the vehicle than the short range (B).
- For the long range, the processor 300 drives the main camera in one of two camera modules spaced apart by the fourth separation distance L4 and the auxiliary camera in the other module, and can thereby obtain distance information for an object existing in the long range (D).
- The two camera modules spaced apart by the fourth separation distance L4 are, for example, the third camera module 103 and 203 and the fifth camera module 105 and 205.
- The fourth separation distance L4 is longer than the aforementioned third separation distance L3; therefore, the long range (D) extends farther from the vehicle than the medium range (C).
- In one embodiment, the camera groups of the present invention can cover about 0.02–3 m from the vehicle as the short range (B), about 2–40 m as the medium range (C), and about 30–100 m as the long range (D). This is merely an example, and the distance each camera group can cover may vary with the design.
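- The overlapping range bands quoted above can be expressed as a simple lookup, as in the sketch below (the values come from the embodiment above; treating them as closed intervals is an assumption for illustration):

```python
RANGE_BANDS_M = {
    "short":  (0.02, 3.0),
    "medium": (2.0, 40.0),
    "long":   (30.0, 100.0),
}

def ranges_covering(distance_m: float) -> list[str]:
    """All range classes whose band contains the given distance."""
    return [name for name, (lo, hi) in RANGE_BANDS_M.items() if lo <= distance_m <= hi]

print(ranges_covering(2.5))  # ['short', 'medium'] – the bands overlap
```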
- The processor 300 may control the self-driving vehicle based on the distance information to objects around the vehicle obtained by the above-described method.
- In this way, the vehicle camera groups for autonomous driving selectively combine a plurality of cameras provided along the circumference of the vehicle according to the distance range to be measured and drive them as a pair, thereby obtaining distance information between the vehicle and an object existing in the measurable distance range determined by the separation distance between the two selected cameras.
- FIGS. 12 and 13 illustrate examples of a plurality of camera groups mounted on an autonomous vehicle according to an embodiment of the present invention.
- The self-driving vehicle described above with reference to FIGS. 7 to 11 is equipped with a total of 12 camera modules used both as viewers around the vehicle and to measure the distance from the vehicle to an object.
- In the embodiment shown in FIG. 12, the front left camera group includes 4 camera modules
- the front center camera group includes 3 camera modules
- the front right camera group includes 4 camera modules.
- the right side camera group includes 4 camera modules
- the left side camera group includes 4 camera modules
- the rear center camera group includes 1 camera module, for a total of 20 camera modules.
- the angle of view of each camera may be set to about 30 degrees.
- the front left camera group can implement a viewer with a 120-degree view angle
- the front center camera group can implement a viewer with a 90-degree view angle
- the front right camera group can implement a viewer with a 120-degree view angle.
- A viewer with a wider angle than in the embodiment shown in FIG. 11 can thus be implemented: a viewer with a total view angle of 120 degrees can be implemented by the right side camera group.
- Accordingly, the self-driving vehicle of the embodiment shown in FIG. 12 can implement a wider range of view angles around the vehicle than the embodiment shown in FIG. 7, so that objects around the vehicle can be detected, distance information to them obtained, and blind spots minimized.
- In the embodiment shown in FIG. 13, the front left camera group includes 4 camera modules
- the front center camera group includes 3 camera modules
- the front right camera group includes 4 camera modules
- the right side camera group includes 5 camera modules
- the left side camera group includes 5 camera modules
- the rear center camera group includes 1 camera module, for a total of 22 camera modules.
- the angle of view of each camera may be set to about 30 degrees. Accordingly, a viewer with a 120 degree angle of view can be implemented by the front left camera group, a viewer with a 90 degree angle of view can be implemented by the front center camera group, and a viewer with a 120 degree angle of view can be implemented by the front right camera group. A viewer with a total angle of view of 150 degrees can be realized by the left and right side camera groups.
- Accordingly, the self-driving vehicle of the embodiment shown in FIG. 13 can implement a wider range of view angles around the vehicle than the embodiment shown in FIG. 12, detect objects farther away, obtain distance information for them, and realize a viewer with almost no blind spots.
- FIGS. 14 to 16 are diagrams illustrating an example of an integrated frame mounting the camera groups of an autonomous vehicle according to an embodiment of the present invention.
- the integrated frame 70 for mounting the camera group of the autonomous driving vehicle is configured in the form of a frame so that it can be detachably mounted on at least a part of the vehicle.
- The integrated frame 70 may be provided in the same shape as the circumference of at least a part of the vehicle; FIG. 14 shows an example provided in a 'U' shape along the circumference of the front, left, and right of the vehicle.
- The integrated frame 70 may further include mounting means for detachably mounting the frame 70 to the vehicle, and the mounting means may be configured in any of various embodiments, such as screws, clips, or mounting magnets.
- Components of the frame may be made of the same material as that of the vehicle body.
- The front left camera group 10, the front center camera group 20, the front right camera group 30, the right side camera group 40, and the left side camera group 50 may be provided inside the integrated frame 70, with the part corresponding to the lens of each camera module constituting each camera group left exposed.
- Not only the front left camera group 10, the front center camera group 20, the front right camera group 30, the right side camera group 40, and the left side camera group 50, but also the rear center camera group 60 can be mounted on the integrated frame 70.
- Each side of the integrated frame 70 may be provided as a slide structure whose size can be adjusted to the size of the vehicle for mounting, or as a stretchable link-connection structure.
- an example of implementing the integrated frame 70 is not limited to the above-described slide type or link structure, and may be implemented in various other embodiments.
- An air pump 81, an air tank 82, and a solvent unit 83 used to clean each camera can be mounted inside the integrated frame 70, and a power unit 84 for supplying power to each component mounted inside the integrated frame 70 can be further mounted.
- FIG. 16 shows an example of the wiring connecting the front left camera group 10, the front center camera group 20, the front right camera group 30, the right side camera group 40, the left side camera group 50, and the rear center camera group 60 to a single processor 300.
- Methods and apparatus according to the present invention may be driven by instructions that cause one or more processors to perform the functions and processes described above.
- Such instructions may include interpreted instructions, such as script instructions (for example, JavaScript or ECMAScript instructions), or executable code or other instructions stored on a computer-readable medium.
- the device according to the present invention may be implemented in a distributed manner over a network, such as a server farm, or may be implemented in a single computer device.
- Implementations of the subject matter and functional operations described herein may be implemented in digital electronic circuitry, including the structures disclosed herein and their structural equivalents, or in computer software, firmware, or hardware, or in a combination of one or more of them. Implementations of the subject matter described herein may be implemented as one or more computer program products, that is, one or more modules of computer program instructions encoded on a tangible program storage medium for execution by, or for controlling the operation of, an apparatus according to the present invention.
- A computer-readable medium may be a machine-readable storage device, a machine-readable storage substrate, a memory device, a composition of matter effecting a machine-readable propagated signal, or a combination of one or more of these.
- Implementations of the subject matter described herein may include back-end components such as data servers, middleware components such as application servers, or front-end components such as a client computer having a web browser or graphical user interface through which a user can interact with an implementation, or any combination of one or more of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication, such as a communication network.
- As described above, the present invention can be applied to an autonomous vehicle capable of photographing its surroundings at a wide angle of view, providing images around the vehicle with almost no blind spots, and obtaining distance information for objects existing in various distance ranges from the vehicle.
Landscapes
- Engineering & Computer Science (AREA)
- Automation & Control Theory (AREA)
- Transportation (AREA)
- Mechanical Engineering (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- Mathematical Physics (AREA)
- Traffic Control Systems (AREA)
Abstract
The present invention relates to a camera group for an autonomous vehicle, with which the surroundings of the autonomous vehicle can be photographed at a wide angle of view, making it possible to obtain images of the vehicle's surroundings with almost no blind spots and to obtain information on distances to objects present in various distance ranges from the vehicle.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020210114016A KR102506812B1 (ko) | 2021-08-27 | 2021-08-27 | 자율 주행용 차량 |
KR10-2021-0114016 | 2021-08-27 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023027538A1 true WO2023027538A1 (fr) | 2023-03-02 |
Family
ID=85321933
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2022/012777 WO2023027538A1 (fr) | 2021-08-27 | 2022-08-26 | Véhicule autonome |
Country Status (2)
Country | Link |
---|---|
KR (1) | KR102506812B1 (fr) |
WO (1) | WO2023027538A1 (fr) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090028547A1 (en) * | 2007-07-27 | 2009-01-29 | Chih-Yi Yang | Cartridge type attachment lens for a digital camera |
KR20130081914A (ko) * | 2012-01-10 | 2013-07-18 | 에스엘 주식회사 | 차량용 카메라 노출 제어 장치 및 그 방법 |
KR20180006733A (ko) * | 2016-07-11 | 2018-01-19 | 엘지전자 주식회사 | 차량 운전 보조장치 및 이를 포함하는 차량 |
US20200112657A1 (en) * | 2017-05-10 | 2020-04-09 | Mobileye Vision Technologies Ltd. | Cross field of view for autonomous vehicle systems |
KR20200049643A (ko) * | 2018-10-29 | 2020-05-08 | (주)서울소프트 | 스마트단말기로 구성된 멀티비젼을 구비한 다채널 avm 시스템 |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102022773B1 (ko) | 2017-10-11 | 2019-09-18 | 한국철도기술연구원 | 자율주행차량의 정밀위치감지 장치, 감지방법, 그 정밀위치감지장치를 통한 정차지원 시스템 및 정차지원방법 |
KR102249769B1 (ko) | 2019-12-06 | 2021-05-12 | 주식회사 모빌테크 | 2차원 영상의 픽셀별 3차원 좌표값 추정 방법 및 이를 이용한 자율주행정보 추정 방법 |
2021
- 2021-08-27 KR KR1020210114016A patent/KR102506812B1/ko active IP Right Grant
2022
- 2022-08-26 WO PCT/KR2022/012777 patent/WO2023027538A1/fr unknown
Also Published As
Publication number | Publication date |
---|---|
KR102506812B1 (ko) | 2023-03-07 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 22861746; Country of ref document: EP; Kind code of ref document: A1 |
NENP | Non-entry into the national phase | Ref country code: DE |