WO2016186319A1 - Vehicle driving assistance apparatus and vehicle - Google Patents
Vehicle driving assistance apparatus and vehicle
- Publication number
- WO2016186319A1 (PCT/KR2016/003586)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- vehicle
- processor
- information
- camera
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/239—Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/02—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R21/00—Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
- B60R21/01—Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents
- B60R21/013—Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents including means for detecting collisions, impending collisions or roll-over
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R21/00—Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
- B60R21/01—Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents
- B60R21/013—Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents including means for detecting collisions, impending collisions or roll-over
- B60R21/0134—Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents including means for detecting collisions, impending collisions or roll-over responsive to imminent contact with an obstacle, e.g. using radar systems
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W30/08—Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B7/00—Mountings, adjusting means, or light-tight connections, for optical elements
- G02B7/02—Mountings, adjusting means, or light-tight connections, for optical elements for lenses
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/285—Analysis of motion using a sequence of stereo image pairs
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/133—Equalising the characteristics of different image components, e.g. their average brightness or colour balance
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/25—Image signal generators using stereoscopic image cameras using two or more image sensors with different characteristics other than in their location or field of view, e.g. having different resolutions or colour pickup characteristics; using image signals from one sensor to control the characteristics of another sensor
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2420/00—Indexing codes relating to the type of sensors based on the principle of their operation
- B60W2420/40—Photo, light or radio wave sensitive means, e.g. infrared sensors
- B60W2420/403—Image sensing, e.g. optical camera
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2554/00—Input parameters relating to objects
- B60W2554/40—Dynamic objects, e.g. animals, windblown objects
Definitions
- the present invention relates to a vehicle driving assistance device and a vehicle including the vehicle driving assistance device.
- a vehicle is a device that moves in a direction desired by a user on board.
- An example is a car.
- sensors mounted on autonomous vehicles include cameras, infrared sensors, radar, GPS, lidar, and gyroscopes. Among them, the camera occupies an important position as a sensor that replaces human eyes.
- a stereo camera may be used for the vehicle.
- stereo cameras according to the related art use two lenses having the same angle of view. A camera including a narrow-angle lens is advantageous in detecting a far object but disadvantageous in detecting a near object.
- a camera including a wide-angle lens is advantageous in detecting near objects but disadvantageous in detecting far objects. Since the stereo camera according to the related art uses two cameras having the same angle of view, there is a problem in that it cannot be properly utilized for both near-object detection and far-object detection.
- an object of the present invention is to provide a vehicle driving assistance apparatus including two cameras having different angles of view.
- according to an embodiment of the present invention, there is provided a vehicle driving assistance apparatus including: a first camera including a first lens having a first angle of view, the first camera obtaining a first image in front of the vehicle; a second camera including a second lens having a second angle of view different from the first angle of view, the second camera obtaining a second image in front of the vehicle; and a processor that detects an object based on each of the first image and the second image, processes each of the first image and the second image to obtain a stereo image, and performs a disparity operation based on the stereo image.
- an embodiment of the present invention provides a vehicle comprising the vehicle driving assistance device.
- since the stereo camera includes both a narrow-angle camera and a wide-angle camera, both near-object detection and far-object detection are possible.
- FIG. 1 is a view showing the appearance of a vehicle according to an embodiment of the present invention.
- FIG. 2 is a diagram referred to describe a vehicle driving assistance apparatus included in the vehicle of FIG. 1 according to an embodiment of the present invention.
- FIGS. 3A and 3B illustrate various examples of internal block diagrams of a vehicle driving assistance apparatus according to various embodiments of the present invention.
- FIG. 4 illustrates an internal block diagram of the processor of FIGS. 3A and 3B.
- FIGS. 5A and 5B are views referred to for describing an operating method of the processor 170 of FIG. 4, based on stereo images obtained in first and second frame sections, respectively.
- FIGS. 6A and 6B are views referred to for describing the operation of the vehicle driving assistance apparatus of FIGS. 3A and 3B.
- FIG. 7 is an example of an internal block diagram of a vehicle according to an embodiment of the present invention.
- FIG. 8 is a view referred to for describing the binning processing and cropping processing operations according to an embodiment of the present invention.
- FIG. 9 is a diagram referred to for describing an operation of generating a stereo image according to an embodiment of the present invention.
- FIG. 10 is a diagram referred to for describing a first image according to an embodiment of the present invention.
- FIG. 11 is a diagram referred to for describing a second image, according to an embodiment of the present invention.
- FIG. 12 is a diagram for describing a stereo image generated based on a first image and a second image, according to an embodiment of the present invention.
- FIGS. 13 and 14 are exemplary views illustrating a first image, a second image, and a stereo image according to an embodiment of the present invention.
- the vehicle described herein may be a concept including an automobile and a motorcycle.
- in the following description, however, an automobile is mainly described.
- the vehicle described herein may be a concept including both an internal combustion engine vehicle having an engine as a power source, a hybrid vehicle having an engine and an electric motor as a power source, an electric vehicle having an electric motor as a power source, and the like.
- the left side of the vehicle means the left side with respect to the driving direction of the vehicle, and the right side of the vehicle means the right side with respect to the driving direction of the vehicle.
- in the following description, a left hand drive (LHD) vehicle is mainly described unless otherwise mentioned; however, right hand drive (RHD) vehicles are also included in the scope of the present invention.
- FIG. 1 is a view showing the appearance of a vehicle according to an embodiment of the present invention.
- the vehicle 700 may include wheels 103FR, 103FL, 103RL, ... rotated by a power source, a steering wheel 721a for adjusting the traveling direction of the vehicle 700, and a vehicle driving assistance apparatus 100 provided inside the vehicle 700.
- the vehicle driving assistance apparatus 100 may include a stereo camera, and an image obtained by the stereo camera may be signal processed in the processor.
- the overall length is the length from the front to the rear of the vehicle 700, the overall width is the width of the vehicle 700, and the overall height is the length from the bottom of the wheels to the roof.
- the overall-length direction L, the overall-width direction W, and the overall-height direction H mean the directions serving as references for measuring the overall length, overall width, and overall height of the vehicle 700, respectively.
- the vehicle 700 may be a concept including an autonomous vehicle.
- the vehicle 700 may be described as a host vehicle 700 in order to be distinguished from other vehicles.
- FIG. 2 is a diagram referred to describe a vehicle driving assistance apparatus included in the vehicle of FIG. 1 according to an embodiment of the present invention.
- the vehicle driving assistance apparatus 100 may include a first camera 195a including a first lens 193a and a second camera 195b including a second lens 193b.
- the first camera 195a and the second camera 195b may be referred to as a stereo camera.
- the first lens 193a may be a lens having a first angle of view.
- the second lens 193b may be a lens having a second angle of view different from the first angle of view.
- the first lens 193a may be a narrow angle lens
- the second lens 193b may be a wide angle lens.
- the narrow angle lens and the wide angle lens may be defined based on an angle of view of 60 degrees.
- the narrow angle lens may be a lens having an angle of view of 60 degrees or less.
- the narrow angle lens may have an angle of view of 0 degrees or more.
- the narrow angle lens may have an angle of view of 12 degrees or more and 60 degrees or less.
- the narrow angle lens may be referred to as a telephoto lens.
- the wide angle lens may be a lens having an angle of view greater than 60 degrees.
- the wide angle lens may have an angle of view of 180 degrees or less.
- the wide-angle lens may have an angle of view of 63 degrees to 104 degrees.
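The 60-degree boundary described above can be sketched as a tiny helper. This is purely illustrative; the function name and input validation are assumptions, not part of the patent:

```python
# Hypothetical helper illustrating the 60-degree criterion described above.
def classify_lens(angle_of_view_deg: float) -> str:
    """Classify a lens by angle of view, with 60 degrees as the boundary."""
    if not 0 <= angle_of_view_deg <= 180:
        raise ValueError("angle of view must be within 0-180 degrees")
    # 60 degrees or less: narrow-angle (telephoto); greater than 60: wide-angle.
    return "narrow" if angle_of_view_deg <= 60 else "wide"

print(classify_lens(52))   # a typical narrow-angle (telephoto) lens
print(classify_lens(100))  # a typical wide-angle lens
```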
- the first lens 193a and the image sensor included in the first camera 195a may be disposed perpendicular to the ground so that the first camera 195a may acquire an image in front of the vehicle.
- the second lens 193b and the image sensor included in the second camera 195b may be disposed perpendicular to the ground so that the second camera 195b may acquire an image in front of the vehicle.
- the second camera 195b may be spaced apart from the first camera 195a by a predetermined distance in the horizontal direction. Since the first camera 195a and the second camera 195b are spaced apart from each other, it is possible to perform a disparity operation based on the first image received from the first camera 195a and the second image received from the second camera 195b.
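The horizontal spacing (baseline) between the two cameras is what makes the disparity operation meaningful: under the standard pinhole-stereo relation, depth Z = f · B / d. A minimal sketch, with all numeric values being illustrative assumptions rather than parameters disclosed in the patent:

```python
def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Standard pinhole-stereo relation: depth Z = f * B / d.

    focal_px     - focal length in pixels
    baseline_m   - horizontal spacing between the two cameras, in meters
    disparity_px - horizontal pixel shift of the object between the two images
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Illustrative values only: a 1000 px focal length and a 0.2 m baseline give
# a depth of 50 m for a 4 px disparity.
print(depth_from_disparity(1000.0, 0.2, 4.0))  # 50.0
```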
- the vehicle driving assistance apparatus 100 may include a first light shield 192a and a second light shield 192b for shielding light incident on the first lens 193a and the second lens 193b, respectively.
- the first light shielding part 192a may shield a part of the light incident on the first lens 193a or may guide light incident on the first lens 193a.
- the second light shield 192b may shield a part of the light incident on the second lens 193b or may guide light incident on the second lens 193b.
- the first and second light shields 192a and 192b may be referred to as first and second light guide parts.
- the vehicle driving assistance apparatus 100 of the drawing may have a structure detachable from the ceiling or windshield of the vehicle 700.
- 3A to 3B illustrate various examples of internal block diagrams of a vehicle driving assistance apparatus according to various embodiments of the present disclosure.
- the vehicle driving assistance apparatus 100 of FIGS. 3A and 3B may process, based on computer vision, signals related to the images received from the first camera 195a and the second camera 195b, and may generate vehicle-related information.
- the vehicle related information may include vehicle control information for direct control of the vehicle, or vehicle driving assistance information for driving guide to the vehicle driver.
- FIG. 3A is an internal block diagram of a vehicle driving assistance apparatus 100 according to an embodiment of the present invention.
- the vehicle driving assistance apparatus 100 of FIG. 3A may include a first camera 195a, a second camera 195b, an interface unit 130, a memory 140, a processor 170, and a power supply unit 190.
- the first camera 195a may acquire a first image of the surroundings of the vehicle.
- the first camera 195a may acquire a first image in front of the vehicle.
- the first camera 195a may include an image sensor (eg, CMOS or CCD).
- the first image may include a plurality of frames.
- the first image may include information about a more distant area than the second image. This is because the first image is captured by the first camera 195a including the narrow-angle lens.
- the second camera 195b may acquire a second image of the surroundings of the vehicle.
- the second camera 195b may acquire a second image in front of the vehicle.
- the second camera 195b may include an image sensor (eg, CMOS or CCD).
- the second image may include a plurality of frames.
- the second image may include information of a wider field of view than the first image. This is because the second image is captured by the second camera 195b including the wide-angle lens.
- each of the first camera 195a and the second camera 195b may include an image processor.
- the image processor may process a still image or a moving image acquired through the image sensor.
- the image processor may be configured separately from or integrated with the processor 170.
- the interface unit 130 may exchange signals, information, and data with other devices in the vehicle 700.
- the interface unit 130 may receive vehicle-related data or transmit a signal processed or generated by the processor 170 to the outside. To this end, the interface unit 130 may perform data communication with the controller 770, the vehicle display apparatus 400, the sensing unit 760, the vehicle driver 750, and the like, by a wired or wireless communication scheme.
- the interface unit 130 may receive navigation information by data communication with the controller 770, the vehicle display apparatus 400, or an additional navigation device.
- the navigation information may include set destination information, route information according to the destination, map information related to driving of the vehicle, and current location information of the vehicle. Meanwhile, the navigation information may include location information of the vehicle on the road.
- the interface unit 130 may receive sensor information from the controller 770 or the sensing unit 760.
- the sensor information may include at least one of vehicle direction information, vehicle position information (GPS information), vehicle angle information, vehicle speed information, vehicle acceleration information, vehicle tilt information, vehicle forward/reverse information, battery information, fuel information, tire information, vehicle lamp information, vehicle interior temperature information, vehicle interior humidity information, and rain information.
- such sensor information may be obtained from a heading sensor, a yaw sensor, a gyro sensor, a position module, a vehicle forward/reverse sensor, a wheel sensor, a vehicle speed sensor, a vehicle body tilt sensor, a battery sensor, a fuel sensor, a tire sensor, a steering sensor based on steering wheel rotation, a vehicle interior temperature sensor, a vehicle interior humidity sensor, a rain sensor, and the like.
- the position module may include a GPS module for receiving GPS information.
- meanwhile, among the sensor information, the information related to vehicle driving may be referred to as vehicle driving information.
- the interface unit 130 may provide a signal to the controller 770 or the vehicle driver 750.
- the signal may be a control signal.
- the processor 170 may provide a control signal to the power source driver 751, the steering driver 752, or the brake driver 753 through the interface unit 130.
- the memory 140 may store various data for operations of the overall vehicle driving assistance apparatus 100, such as a program for processing or controlling the processor 170.
- the memory 140 may store data for identifying an object. For example, when a predetermined object is detected in the images acquired through the first camera 195a and the second camera 195b, the memory 140 may store data for checking, by a predetermined algorithm, what the object corresponds to.
- the memory 140 may store data about traffic information. For example, when predetermined traffic information is detected in the images acquired through the first camera 195a and the second camera 195b, the memory 140 may store data for checking, by a predetermined algorithm, what the traffic information corresponds to.
- the memory 140 may, in hardware, be any of various storage devices such as a ROM, a RAM, an EPROM, a flash drive, and a hard drive.
- the processor 170 controls the overall operation of each unit in the vehicle driving assistance apparatus 100.
- the processor 170 may process a vehicle front image or a vehicle surrounding image obtained by the first camera 195a and the second camera 195b. In particular, the processor 170 performs signal processing based on computer vision. Accordingly, the processor 170 may acquire an image of the front of or around the vehicle from the first camera 195a and the second camera 195b, and perform object detection and object tracking based on the image. In particular, the processor 170 may perform lane detection (LD), vehicle detection (VD), pedestrian detection (PD), bright spot detection (BD), traffic sign recognition (TSR), road surface detection, and the like.
- the processor 170 may detect information from the vehicle surrounding image acquired by the first camera 195a and the second camera 195b.
- the information may be information about a vehicle driving situation.
- the information may be a concept including road information, traffic law information, surrounding vehicle information, vehicle or pedestrian traffic light information, construction information, traffic condition information, parking lot information, lane information, and the like.
- the processor 170 may check the information by comparing the detected information with the information stored in the memory 140.
- the processor 170 may receive weather information and road traffic state information, for example, Transport Protocol Expert Group (TPEG) information, through the communication unit 120.
- the processor 170 may grasp, in real time, the traffic situation information around the vehicle determined by the vehicle driving assistance apparatus 100 based on the image.
- the processor 170 may receive navigation information and the like from the vehicle display apparatus 400 or a separate navigation device (not shown) through the interface unit 130.
- the processor 170 may receive sensor information from the controller 770 or the sensing unit 760 through the interface unit 130.
- the sensor information may include at least one of vehicle direction information, vehicle position information (GPS information), vehicle angle information, vehicle speed information, vehicle acceleration information, vehicle tilt information, vehicle forward/reverse information, battery information, fuel information, tire information, vehicle lamp information, vehicle interior temperature information, vehicle interior humidity information, and steering wheel rotation information.
- the processor 170 may receive images acquired by the first camera 195a and the second camera 195b.
- the acquired image may be a vehicle front image or a vehicle surrounding image.
- the processor 170 may detect an object based on the first image received from the first camera 195a.
- the processor 170 may track the detected object.
- the first image may be a far image in front of the vehicle obtained by the narrow angle camera 195a.
- the processor 170 may detect an object based on the second image received from the second camera 195b.
- the processor 170 may track the detected object.
- the second image may be a near field image of the front of the vehicle obtained by the wide angle camera 195b.
- a criterion for distinguishing a long distance from a short distance may be 50 m.
- the long distance may be a distance greater than 50 m and less than 300 m based on the vehicle 700.
- the short distance may be a distance of 0 m or more and 50 m or less based on the vehicle 700. It is noted that the present invention is not limited thereto.
- the processor 170 may obtain a stereo image by processing each of the first image and the second image.
- the processor 170 may perform a disparity operation on the front of the vehicle based on the stereo image.
- the processor 170 may detect an object in a stereo image.
- the processor 170 may track the detected object.
- the processor 170 may calculate the distance to the object, the relative speed to the object, or the speed of the object based on the disparity calculation.
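Given a per-frame distance from the disparity operation, the relative speed follows by differencing distances across consecutive frames. A hedged sketch (function name and frame interval are illustrative):

```python
def relative_speed_mps(prev_distance_m: float, curr_distance_m: float,
                       frame_interval_s: float) -> float:
    """Relative speed of an object from distances measured in consecutive
    frames (each distance coming from a disparity operation).
    A negative value means the object is approaching the host vehicle."""
    return (curr_distance_m - prev_distance_m) / frame_interval_s

# Object measured at 50.0 m, then 48.5 m one tenth of a second later:
print(relative_speed_mps(50.0, 48.5, 0.1))  # approximately -15.0 m/s (closing)
```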
- the processor 170 may detect and track the first object based on the first image. In a state in which the first object detected based on the first image is tracked, when the distance to the first object calculated based on the disparity operation is equal to or less than a reference value, the processor 170 may track the first object based on the second image.
- the object can be detected more accurately by adaptively utilizing the first image and the second image. This is because, when the first image is a far image and the second image is a near image, it is more accurate to detect an object based on the first image when the object is far and based on the second image when the object is near.
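The adaptive switching between the far (first) image and the near (second) image can be sketched as follows; the 50 m reference value is borrowed from the long/short distance criterion above, and all names are illustrative assumptions:

```python
REFERENCE_DISTANCE_M = 50.0  # assumed reference value, from the 50 m criterion above

def tracking_image_for(distance_m: float) -> str:
    """Choose which image to track an object in: the narrow-angle first image
    while the object is far away, the wide-angle second image once the
    distance falls to the reference value or below."""
    return "second" if distance_m <= REFERENCE_DISTANCE_M else "first"

print(tracking_image_for(120.0))  # first  (far object)
print(tracking_image_for(30.0))   # second (near object)
```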
- the processor 170 may adjust the resolution, the exposure degree, and the content of the first image and the second image.
- the resolution, the exposure degree, and the content of the processed first image and second image may coincide with each other.
- this is because the resolution, exposure degree, and content of the two images serving as the basis of a stereo image must match.
- here, the content may mean at least one object included in the image. Two images of the same content must be secured to generate a stereo image.
- the processor 170 may bin the first image.
- the binning operation on the first image will be described in more detail with reference to FIGS. 8 to 9.
- the processor 170 may crop the second image.
- the cropping processing operation on the second image will be described in more detail with reference to FIGS. 8 to 9.
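A minimal NumPy sketch of the binning and cropping operations described above, assuming a 2x binning factor and a centered crop; the actual factors and crop region used in FIGS. 8 to 9 may differ:

```python
import numpy as np

def bin_image(img: np.ndarray, factor: int = 2) -> np.ndarray:
    """Binning: average each factor-by-factor block of pixels,
    reducing the image resolution by `factor` in each direction."""
    h, w = img.shape[:2]
    h, w = h - h % factor, w - w % factor          # trim to a multiple of factor
    blocks = img[:h, :w].reshape(h // factor, factor, w // factor, factor)
    return blocks.mean(axis=(1, 3))

def crop_center(img: np.ndarray, out_h: int, out_w: int) -> np.ndarray:
    """Cropping: keep only the central out_h-by-out_w region."""
    h, w = img.shape[:2]
    top, left = (h - out_h) // 2, (w - out_w) // 2
    return img[top:top + out_h, left:left + out_w]

# Bin the narrow-angle frame down and crop the wide-angle frame's centre so
# both end up at the same resolution, ready to serve as a stereo pair.
narrow = np.random.rand(960, 1280)   # first (narrow-angle) image
wide = np.random.rand(960, 1280)     # second (wide-angle) image
binned = bin_image(narrow, 2)        # shape (480, 640)
cropped = crop_center(wide, 480, 640)  # shape (480, 640)
assert binned.shape == cropped.shape
```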
- the processor 170 may adjust the zoom magnification of the first camera 195a or the second camera 195b.
- the first camera 195a may include a plurality of zoom lenses and a zoom barrel.
- the second camera 195b may include a plurality of zoom lenses and a zoom barrel.
- the processor 170 may adjust the zoom magnification of the first camera 195a by adjusting the position of the zoom barrel of the first camera 195a.
- the processor 170 may adjust the zoom magnification of the second camera 195b by adjusting the position of the zoom barrel of the second camera 195b.
- the zoom magnifications of the first camera 195a and the second camera 195b may be adjusted to be different from or equal to each other.
- the processor 170 may provide the vehicle driver 750 with a control signal based on the object information and the disparity information through the interface unit 130. For example, the processor 170 may provide a control signal to the power source driver 751, the steering driver 752, or the brake driver 753 of the vehicle driver 750. The control signal may be provided to the vehicle driver 750 via the controller 770 of the vehicle 700.
- when an object is detected beside the traveling lane of the host vehicle 700 and the distance to the object is less than or equal to a preset distance, the processor 170 may provide a control signal to the power source driver 751 to perform acceleration, thereby avoiding a collision with the object.
- the processor 170 when the object is detected on the traveling line of the host vehicle 700 and the distance to the object is less than or equal to the preset distance, the processor 170 provides a control signal to the steering driver 752 to perform steering. This can avoid collisions with objects.
- the processor 170 when the object is detected on the traveling line of the host vehicle 700 and the distance to the object is less than or equal to the preset distance, the processor 170 provides a control signal to the brake drive unit 753 to decelerate or stop. By doing so, collision with the object can be avoided.
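- the avoidance decisions above can be sketched as a simple rule table. This is a minimal illustrative sketch, not the disclosed implementation: the names (`ObjectInfo`, `decide_control`), the preset distance value, and the choice of braking over steering as the default on-lane response are all assumptions.

```python
from dataclasses import dataclass

PRESET_DISTANCE_M = 30.0  # hypothetical preset distance; not given in the text

@dataclass
class ObjectInfo:
    on_travel_lane: bool   # True: object on the host vehicle's traveling lane
    beside_lane: bool      # True: object beside the traveling lane
    distance_m: float      # distance from host vehicle 700 to the object

def decide_control(obj: ObjectInfo, preset: float = PRESET_DISTANCE_M) -> str:
    """Return which vehicle driver (751/752/753) receives a control signal."""
    if obj.distance_m > preset:
        return "none"                     # object still farther than the preset distance
    if obj.beside_lane:
        return "power_source_driver_751"  # accelerate past the object beside the lane
    if obj.on_travel_lane:
        # steering (752) or braking (753) can both avoid the object;
        # braking is chosen here as the conservative default
        return "brake_driver_753"
    return "none"
```

- in this sketch the control signal would then be passed through the interface unit 130 (and possibly the controller 770) to the vehicle driver 750, as described above.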
- the processor 170 may classify an ADAS application based on the first image and an ADAS application based on the second image among a plurality of ADAS applications according to characteristics of the first image and the second image.
- the processor 170 may implement an ADAS application requiring a far image based on the first image.
- the processor 170 may execute a forward collision warning (FCW), autonomous emergency braking (AEB), adaptive cruise control (ACC), or Distronic (DTR) application among a plurality of ADAS applications based on the first image.
- the processor 170 may provide information or data obtained from the first image to a forward collision warning (FCW), autonomous emergency braking (AEB), adaptive cruise control (ACC), or Distronic (DTR) application.
- the DTR may also be called Distronic Plus, or Distronic Plus with steering assist.
- the DTR may be a technology in which a Lane Keeping Assistant System (LKAS) is combined with an ACC.
- the processor 170 may implement an ADAS application that requires a wide range of images based on the second image.
- the processor 170 may implement an application of cross traffic alert (CTA), traffic sign recognition (TSR), or traffic jam assist (TJA) among a plurality of ADAS applications based on the second image.
- cross traffic alert (CTA), traffic sign recognition (TSR), and traffic jam assist (TJA) are applications in which object detection over a wide range, including the traveling lane of the host vehicle 700 at close range, is important.
- the processor 170 may provide information or data obtained from the second image to a cross traffic alert (CTA), traffic sign recognition (TSR), and traffic jam assist (TJA) application.
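- the classification above amounts to routing each ADAS application to the image whose field of view suits it. The following is an illustrative sketch of that routing; the set and function names are assumptions, not part of the disclosure.

```python
# far-image applications: fed by the first camera 195a (narrow-angle lens)
FAR_IMAGE_APPS = {"FCW", "AEB", "ACC", "DTR"}
# wide-image applications: fed by the second camera 195b (wide-angle lens)
WIDE_IMAGE_APPS = {"CTA", "TSR", "TJA"}

def source_image_for(app: str) -> str:
    """Return which camera image feeds a given ADAS application."""
    if app in FAR_IMAGE_APPS:
        return "first_image"
    if app in WIDE_IMAGE_APPS:
        return "second_image"
    raise ValueError(f"unknown ADAS application: {app}")
```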
- the processor 170 may perform detection or tracking of an object located at a far distance based on the first image.
- since the first image is obtained through a narrow-angle lens, it is advantageous for detecting and tracking an object located at a far distance; a remotely located object can be detected and tracked more accurately.
- the processor 170 may perform detection of an object located at a short distance, or emergency object detection, based on the second image. Since the second image is obtained through a wide-angle lens and thus covers a wide field of view, it is advantageous for detecting an object located at a short distance and for emergency object detection; such detection can be performed more precisely.
- a criterion for distinguishing a long distance from a short distance may be 50m.
- the long distance may be a distance greater than 50 m and less than 300 m from the vehicle 700.
- the short distance may be a distance of 0 m or more and 50 m or less from the vehicle 700. However, the present invention is not limited thereto.
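- the 50 m criterion and the ranges above can be expressed as a small classification function. The threshold and range values are taken from the text; the function name is an assumption for this sketch.

```python
def classify_range(distance_m: float) -> str:
    """Classify a detection distance relative to the 50 m criterion."""
    if 0.0 <= distance_m <= 50.0:
        return "short"          # detect with the wide-angle second image
    if 50.0 < distance_m < 300.0:
        return "long"           # detect with the narrow-angle first image
    return "out_of_range"       # beyond the ranges given in the text
```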
- the processor 170 may select one of the first image and the second image as an image for detecting an object based on the traveling speed.
- for example, when the vehicle travels at high speed, the processor 170 may select the first image as the image for detecting an object, because the far image may be more useful than the near image during high-speed driving.
- for example, when the vehicle travels at low speed, the processor 170 may select the second image as the image for detecting an object, because the near image may be more useful than the far image during low-speed driving.
- the processor 170 may select one of the first image and the second image as an image for detecting an object based on the driving situation.
- for example, when the vehicle travels on a highway, the processor 170 may select the first image as the image for detecting an object, because a far image may be more useful than a near image on a highway where high-speed driving is performed.
- for example, when the vehicle travels on an urban road, the processor 170 may select the second image as the image for detecting an object, because a near image can be used more effectively than a far image on urban roads where various obstacles exist during low-speed driving.
- for example, when driving in rain, snow, or fog, the processor 170 may select the second image as the image for detecting an object, because a wider field of view is required in such conditions.
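- the speed-based and situation-based selection rules above can be combined into one sketch. The speed threshold (80 km/h) is an illustrative assumption; the text only distinguishes high-speed from low-speed driving.

```python
def select_image(speed_kmh: float, road: str = "highway",
                 weather: str = "clear") -> str:
    """Pick the first (narrow/far) or second (wide/near) image for detection."""
    if weather in ("rain", "snow", "fog"):
        return "second_image"   # wider field of view needed in bad weather
    if road == "urban":
        return "second_image"   # many close obstacles during low-speed driving
    if road == "highway" or speed_kmh >= 80.0:
        return "first_image"    # far image more useful at high speed
    return "second_image"
```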
- the processor 170 may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and electrical units for performing other functions.
- the processor 170 may be controlled by the controller 770.
- the power supply unit 190 may supply power required for the operation of each component under the control of the processor 170.
- the power supply unit 190 may receive power from a battery inside the vehicle.
- FIG. 3B is an internal block diagram of the vehicle driving assistance apparatus 100 according to an embodiment of the present invention.
- the vehicle driving assistance apparatus 100 of FIG. 3B further includes an input unit 110, a communication unit 120, and an output unit 150 as compared to the vehicle driving assistance apparatus 100 of FIG. 3A.
- the input unit 110 may include a plurality of buttons or a touch screen attached to the vehicle driving assistance apparatus 100, particularly to the camera 195. The vehicle driving assistance apparatus 100 may be powered on and operated through the plurality of buttons or the touch screen, and various other input operations may be performed through them as well.
- the input unit 110 may include a microphone for converting a user's voice input into an electrical signal and transmitting the same to the processor 170.
- the communication unit 120 may exchange data with the mobile terminal 600, the server 601, or another vehicle 602 in a wireless manner.
- the communication unit 120 may exchange data wirelessly with a mobile terminal of a vehicle driver.
- various data communication methods such as Bluetooth, WiFi Direct, WiFi, APiX, and NFC are possible.
- the communication unit 120 may receive weather information and road traffic information, for example, Transport Protocol Expert Group (TPEG) information, from the mobile terminal 600 or the server 601. Meanwhile, the vehicle driving assistance apparatus 100 may transmit real-time information it has obtained to the mobile terminal 600, the server 601, or another vehicle 602.
- when a user boards the vehicle, the user's mobile terminal 600 and the vehicle driving assistance apparatus 100 may pair with each other automatically or upon execution of the user's application.
- the communication unit 120 may receive the traffic light change information from the external server 601.
- the external server 601 may be a server located at a traffic control station for controlling traffic.
- the output unit 150 may include a display unit 151 and a sound output unit 152.
- the display unit 151 may display various information processed by the processor 170.
- the display unit 151 may display an image related to the operation of the vehicle driving assistance apparatus 100.
- the display unit 151 may include a transparent display disposed in close proximity to a cluster, a head up display (HUD), or a windshield on the front of the vehicle.
- the display unit 151 may include a projection module that projects an image onto the windshield of the vehicle 700.
- the sound output unit 152 may output sound to the outside based on an audio signal processed by the processor 170. To this end, the sound output unit 152 may include at least one speaker.
- FIG. 4 is an example of an internal block diagram of the processor of FIGS. 3A and 3B.
- FIGS. 5A and 5B are diagrams referred to in describing the operation of the processor of FIG. 4.
- the processor 170 may include an image preprocessor 410, a binning processor 412, a first object detector 413, a cropping processor 414, a second object detector 415, a stereo image generator 417, a disparity calculator 420, a third object detector 434, an object verification unit 436, an object tracking unit 440, and an application unit 450.
- the image preprocessor 410 may receive an image from the camera 195 and perform preprocessing.
- the image preprocessor 410 may perform noise reduction, rectification, calibration, color enhancement, color space conversion (CSC), interpolation, camera gain control, and the like on an image. Accordingly, an image sharper than the stereo image captured by the camera 195 may be obtained.
- the binning processor 412 may bin the first image received from the first camera 195a.
- the image input to the binning processor 412 may be an image preprocessed by the image preprocessor 410.
- the binning processor 412 may convert information of at least two pixel units in the first image into information of one pixel unit. The resolution of the first image is thereby reduced by the binning process.
- the binning processor 412 may bin the non-contiguous frames of the first image including a plurality of frames.
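- a minimal sketch of the binning operation described above: information of several pixels is merged into one pixel, reducing the resolution of the first image. The exact binning factor (2×2 here) and the use of averaging are assumptions; the text only requires that at least two pixel units become one.

```python
import numpy as np

def bin_image(img: np.ndarray, factor: int = 2) -> np.ndarray:
    """Average factor x factor pixel blocks into single pixels."""
    h, w = img.shape[:2]
    h, w = h - h % factor, w - w % factor          # drop ragged edge rows/cols
    img = img[:h, :w]
    return img.reshape(h // factor, factor,
                       w // factor, factor).mean(axis=(1, 3))

first_image = np.arange(16, dtype=float).reshape(4, 4)
binned = bin_image(first_image)
print(binned.shape)   # (2, 2): resolution halved in each dimension
```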
- the first object detector 413 may detect an object based on the first image received from the first camera 195a.
- the image input to the first object detector 413 may be an image preprocessed by the image preprocessor 410.
- the first object detector 413 may calculate the distance to the detected object and the relative speed with respect to the object.
- the first object detector 413 may track the detected object and calculate the distance to the object based on the object's size changing over time.
- the first object detector 413 may calculate the relative speed with respect to the object based on the change in distance to the object.
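- the size-based ranging described above can be sketched with a pinhole camera model: the object's pixel width is tracked over time, distance is inferred from it, and relative speed follows from the change in distance. The focal length and real object width below are hypothetical calibration constants, not values from the disclosure.

```python
FOCAL_PX = 1000.0      # assumed focal length in pixels
REAL_WIDTH_M = 1.8     # assumed real width of a typical vehicle

def distance_from_width(width_px: float) -> float:
    """Pinhole model: distance = f * W_real / w_pixels."""
    return FOCAL_PX * REAL_WIDTH_M / width_px

def relative_speed(width_t0: float, width_t1: float, dt_s: float) -> float:
    """Positive value: the object is approaching the host vehicle."""
    d0 = distance_from_width(width_t0)
    d1 = distance_from_width(width_t1)
    return (d0 - d1) / dt_s

# object grows from 36 px to 45 px over 1 s: 50 m -> 40 m, closing at 10 m/s
print(relative_speed(36.0, 45.0, 1.0))
```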
- the cropping processor 414 may crop the second image received from the second camera 195b.
- the image input to the cropping processor 414 may be an image preprocessed by the image preprocessor 410.
- the cropping processor 414 may cut out an unnecessary area of the second image.
- the cropping processor 414 may crop the non-contiguous frames of the second image including the plurality of frames.
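- a minimal sketch of the cropping operation above: the unnecessary outer area of the wide-angle second image is cut away, keeping a region of interest. Cropping a centered region, and the kept size, are illustrative assumptions.

```python
import numpy as np

def crop_center(img: np.ndarray, keep_h: int, keep_w: int) -> np.ndarray:
    """Keep a centered keep_h x keep_w region; discard the border."""
    h, w = img.shape[:2]
    top = (h - keep_h) // 2
    left = (w - keep_w) // 2
    return img[top:top + keep_h, left:left + keep_w]

second_image = np.zeros((480, 640))
cropped = crop_center(second_image, 360, 480)
print(cropped.shape)   # (360, 480): unnecessary border area removed
```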
- the second object detector 415 may detect the object based on the second image received from the second camera 195b.
- the image input to the second object detector 415 may be an image preprocessed by the image preprocessor 410.
- the second object detector 415 may calculate the distance to the detected object and the relative speed with respect to the object.
- the second object detector 415 may track the detected object and calculate the distance to the object based on the object's size changing over time.
- the second object detector 415 may calculate the relative speed with respect to the object based on the change in distance to the object.
- the stereo image generator 417 may generate a stereo image based on the binned first image and the cropped second image.
- the stereo image generator 417 may generate a stereo image by rectifying the binned first image or the cropped second image.
- the processor 170 may adjust the size of either the binned first image or the cropped second image so that the sizes of the two images match, and then generate a stereo image.
- alternatively, the processor 170 may adjust the sizes of both the binned first image and the cropped second image so that they match, and then generate a stereo image.
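- matching the binned first image and the cropped second image to a common size before stereo generation can be sketched as follows. Nearest-neighbour resizing and taking the smaller of the two sizes are assumptions for the sketch; the text only requires that the image sizes agree.

```python
import numpy as np

def resize_nearest(img: np.ndarray, out_h: int, out_w: int) -> np.ndarray:
    """Nearest-neighbour resize via index selection."""
    rows = np.arange(out_h) * img.shape[0] // out_h
    cols = np.arange(out_w) * img.shape[1] // out_w
    return img[rows][:, cols]

def make_stereo_pair(binned_first: np.ndarray, cropped_second: np.ndarray):
    """Resize both images to the smaller common size so they can be matched."""
    h = min(binned_first.shape[0], cropped_second.shape[0])
    w = min(binned_first.shape[1], cropped_second.shape[1])
    return (resize_nearest(binned_first, h, w),
            resize_nearest(cropped_second, h, w))

left, right = make_stereo_pair(np.zeros((240, 320)), np.zeros((360, 480)))
print(left.shape == right.shape)   # True: the pair can now be stereo-matched
```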
- the disparity calculator 420 may perform stereo matching on the received images and obtain a disparity map according to the stereo matching. That is, disparity information about the stereo image of the front of the vehicle may be obtained.
- the stereo matching may be performed in units of pixels of stereo images or in units of predetermined blocks.
- the disparity map may refer to a map in which binocular parallax information between the stereo images, that is, between the left and right images, is represented numerically.
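- the block-wise stereo matching above can be illustrated with a toy sum-of-absolute-differences (SAD) matcher: for each pixel, the left-image block is compared against shifted blocks in the right image, and the shift with the lowest cost becomes the disparity. Window size and search range are assumptions; real matchers are far more elaborate.

```python
import numpy as np

def disparity_map(left: np.ndarray, right: np.ndarray,
                  max_disp: int = 8, win: int = 3) -> np.ndarray:
    """Per-pixel SAD block matching; returns integer disparities."""
    h, w = left.shape
    half = win // 2
    disp = np.zeros((h, w), dtype=int)
    for y in range(half, h - half):
        for x in range(half + max_disp, w - half):
            patch = left[y - half:y + half + 1, x - half:x + half + 1]
            costs = [np.abs(patch - right[y - half:y + half + 1,
                                          x - d - half:x - d + half + 1]).sum()
                     for d in range(max_disp)]
            disp[y, x] = int(np.argmin(costs))   # best-matching shift
    return disp

# a vertical edge shifted 2 px between the views yields disparity 2 at the edge
left = np.zeros((9, 20)); left[:, 10:] = 1.0
right = np.zeros((9, 20)); right[:, 8:] = 1.0
print(disparity_map(left, right)[4, 10])   # 2
```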
- the third object detector 434 may detect an object.
- the third object detector 434 may detect an object in at least one of the images based on the disparity information.
- the object verification unit 436 may classify and verify the detected objects.
- the object verification unit 436 may classify and verify the objects detected by the first object detector 413, the second object detector 415, and the third object detector 434.
- for verification, the object verification unit 436 may use an identification method using a neural network, a support vector machine (SVM) scheme, an identification scheme based on AdaBoost using Haar-like features, a histograms of oriented gradients (HOG) technique, or the like.
- the object verification unit 436 may verify objects by comparing the detected objects with objects stored in the memory 140.
- for example, the object verification unit 436 may verify surrounding vehicles, lanes, road surfaces, signs, danger zones, tunnels, and the like located around the vehicle.
- the object tracking unit 440 may perform tracking of the verified object. For example, the object tracking unit 440 may sequentially verify the object in the obtained stereo images, calculate the motion or motion vector of the verified object, and track the movement of the object based on the calculated motion or motion vector. Accordingly, surrounding vehicles, lanes, road surfaces, signs, dangerous areas, tunnels, and the like located around the vehicle can be tracked.
- the application unit 450 may calculate the degree of risk to the vehicle 700 based on various objects located around the vehicle, for example, other vehicles, lanes, road surfaces, and signs. In addition, it may calculate the possibility of collision with a preceding vehicle and whether the vehicle is slipping.
- based on the calculated risk, collision possibility, or slip, the application unit 450 may output a message for informing the user of such information as vehicle driving assistance information, or may generate a control signal for attitude control or driving control of the vehicle 700 as vehicle control information.
- depending on the embodiment, the processor 170 may include only some of the image preprocessor 410, the disparity calculator 420, the segmentation unit 432, the object detector 434, the object verification unit 436, the object tracking unit 440, and the application unit 450.
- FIGS. 5A and 5B are views referred to in describing the operating method of the processor 170 of FIG. 4, based on stereo images obtained in first and second frame sections, respectively.
- during the first frame section, the stereo camera 195 acquires a stereo image.
- the disparity calculator 420 in the processor 170 receives the stereo images FR1a and FR1b signal-processed by the image preprocessor 410, performs stereo matching on the received stereo images FR1a and FR1b, and obtains a disparity map 520.
- the disparity map 520 represents the disparity between the stereo images FR1a and FR1b as levels; the greater the disparity level, the closer the distance to the vehicle may be calculated to be, and the smaller the disparity level, the farther the distance.
- when such a disparity map is displayed, it may be displayed with higher luminance for a greater disparity level and lower luminance for a smaller disparity level.
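- the two relations above can be sketched directly: distance is inversely proportional to disparity (z = f · B / d), and display luminance grows with the disparity level. Focal length, baseline, and the disparity-to-grey mapping below are illustrative assumptions.

```python
FOCAL_PX = 1000.0   # assumed focal length in pixels
BASELINE_M = 0.3    # assumed distance between the two camera lenses

def distance_m(disparity_px: float) -> float:
    """Stereo triangulation: larger disparity means a closer object."""
    return FOCAL_PX * BASELINE_M / disparity_px

def luminance(disparity_px: float, max_disp: float = 64.0) -> int:
    """Map disparity to an 8-bit grey level: larger disparity -> brighter."""
    return int(255 * min(disparity_px, max_disp) / max_disp)

# a nearer object (larger disparity) is both closer and displayed brighter
print(distance_m(30.0) < distance_m(10.0), luminance(30.0) > luminance(10.0))
```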
- in the disparity map 520, the first to fourth lanes 528a, 528b, 528c, and 528d, the construction area 522, the first front vehicle 524, and the second front vehicle 526 each have a corresponding disparity level.
- the segmentation unit 432, the object detector 434, and the object verification unit 436 may perform segmentation, object detection, and object verification on at least one of the stereo images FR1a and FR1b based on the disparity map 520.
- object detection and verification may be performed for the first to fourth lanes 538a, 538b, 538c, and 538d, the construction area 532, the first front vehicle 534, and the second front vehicle 536.
- during the second frame section, the stereo camera 195 acquires a stereo image.
- the disparity calculator 420 in the processor 170 receives the stereo images FR2a and FR2b signal-processed by the image preprocessor 410, performs stereo matching on the received stereo images FR2a and FR2b, and obtains a disparity map 540.
- in the disparity map 540, the first to fourth lanes 548a, 548b, 548c, and 548d, the construction area 542, the first front vehicle 544, and the second front vehicle 546 each have a corresponding disparity level.
- the segmentation unit 432, the object detector 434, and the object verification unit 436 may perform segmentation, object detection, and object verification on at least one of the stereo images FR2a and FR2b based on the disparity map 540.
- object detection and verification may be performed for the first to fourth lanes 558a, 558b, 558c, and 558d, the construction area 552, the first front vehicle 554, and the second front vehicle 556.
- the object tracking unit 440 may perform tracking of the verified objects by comparing FIG. 5A with FIG. 5B.
- specifically, the object tracking unit 440 may track the movement of each object based on the motion or motion vector of each object verified in FIGS. 5A and 5B. Accordingly, the lanes, the construction area, the first front vehicle, the second front vehicle, and the like located around the vehicle can be tracked.
- FIGS. 6A and 6B are views referred to in describing the operation of the vehicle driving assistance apparatus of FIGS. 3A to 3C.
- FIG. 6A is a diagram illustrating a situation in front of a vehicle captured by the stereo camera 195 provided in a vehicle.
- the vehicle front situation is displayed as a bird's eye view.
- a first lane 642a, a second lane 644a, a third lane 646a, and a fourth lane 648a are located; a construction area 610a is located between the first lane 642a and the second lane 644a; a first front vehicle 620a is located between the second lane 644a and the third lane 646a; and it can be seen that a second front vehicle 630a is disposed between the third lane 646a and the fourth lane 648a.
- FIG. 6B illustrates displaying the vehicle front situation detected by the vehicle driving assistance apparatus together with various types of information.
- the image as shown in FIG. 6B may be displayed on the display unit 180, the vehicle display apparatus 400, or the display unit 741 provided in the vehicle driving assistance apparatus.
- unlike FIG. 6A, FIG. 6B illustrates that information is displayed on the basis of an image captured by the stereo camera 195.
- it can be seen that a construction area 610b is located between the first lane 642b and the second lane 644b, a first front vehicle 620b is located between the second lane 644b and the third lane 646b, and a second front vehicle 630b is disposed between the third lane 646b and the fourth lane 648b.
- the vehicle driving assistance apparatus 100 may perform signal processing based on the stereo images captured by the stereo cameras 195a and 195b and verify objects for the construction area 610b, the first front vehicle 620b, and the second front vehicle 630b.
- the first lane 642b, the second lane 644b, the third lane 646b, and the fourth lane 648b may be identified.
- the vehicle driving assistance apparatus 100 may calculate distance information to the construction area 610b, the first front vehicle 620b, and the second front vehicle 630b based on the stereo image captured by the stereo camera 195.
- the vehicle driving assistance apparatus 100 may receive sensor information about the vehicle from the controller 770 or the sensing unit 760.
- FIG. 6B illustrates that vehicle speed information 672, gear information 671, and yaw rate information 673 are displayed in the upper portion 670 of the vehicle front image, and vehicle angle information 682 is displayed in the lower portion 680 of the vehicle front image; however, various other examples are possible. In addition, vehicle width information 683 and road curvature information 681 may be displayed together with the vehicle angle information 682.
- the vehicle driving assistance apparatus 100 may receive speed limit information, etc. for the road on which the vehicle is traveling, through the communication unit 120 or the interface unit 130.
- speed limit information 640b is displayed.
- the vehicle driving assistance apparatus 100 may display the various pieces of information illustrated in FIG. 6B through the display unit 180. Alternatively, the vehicle driving assistance apparatus 100 may store the various pieces of information without displaying them, and the information may be used for various applications.
- FIG. 7 is an example of an internal block diagram of a vehicle according to an embodiment of the present invention.
- the vehicle 700 may include a communication unit 710, an input unit 720, a sensing unit 760, an output unit 740, a vehicle driver 750, a memory 730, an interface unit 780, a controller 770, a power supply unit 790, the vehicle driving assistance apparatus 100, and the vehicle display apparatus 400.
- the communication unit 710 may include one or more modules enabling wireless communication between the vehicle 700 and the mobile terminal 600, between the vehicle 700 and the external server 601, or between the vehicle 700 and another vehicle 602. In addition, the communication unit 710 may include one or more modules for connecting the vehicle 700 to one or more networks.
- the communication unit 710 may include a broadcast receiving module 711, a wireless internet module 712, a short range communication module 713, a location information module 714, an optical communication module 715, and a V2X communication module 716.
- the broadcast receiving module 711 receives a broadcast signal or broadcast related information from an external broadcast management server through a broadcast channel.
- the broadcast includes a radio broadcast or a TV broadcast.
- the wireless internet module 712 refers to a module for wireless internet access and may be embedded or external to the vehicle 700.
- the wireless internet module 712 is configured to transmit and receive wireless signals in a communication network in accordance with wireless internet technologies.
- wireless Internet technologies include Wireless LAN (WLAN), Wireless-Fidelity (Wi-Fi), Wi-Fi Direct, Digital Living Network Alliance (DLNA), Wireless Broadband (WiBro), World Interoperability for Microwave Access (WiMAX), High Speed Downlink Packet Access (HSDPA), High Speed Uplink Packet Access (HSUPA), Long Term Evolution (LTE), Long Term Evolution-Advanced (LTE-A), and the like.
- the wireless internet module 712 transmits and receives data according to at least one wireless Internet technology, including Internet technologies not listed above.
- the wireless internet module 712 may exchange data wirelessly with the external server 601.
- for example, the wireless internet module 712 may receive weather information and road traffic information (for example, Transport Protocol Expert Group (TPEG) information) from the external server 601.
- the short range communication module 713 is for short range communication and may support short range communication using at least one of Bluetooth™, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, Near Field Communication (NFC), Wireless-Fidelity (Wi-Fi), Wi-Fi Direct, and Wireless Universal Serial Bus (Wireless USB) technologies.
- the short range communication module 713 may form short range wireless communication networks to perform short range communication between the vehicle 700 and at least one external device. For example, the short range communication module 713 may exchange data with the mobile terminal 600 wirelessly.
- the short range communication module 713 may receive weather information and road traffic condition information (for example, Transport Protocol Expert Group (TPEG) information) from the mobile terminal 600. For example, when a user boards the vehicle 700, the user's mobile terminal 600 and the vehicle 700 may pair with each other automatically or upon execution of the user's application.
- the location information module 714 is a module for obtaining the location of the vehicle 700; a representative example thereof is a Global Positioning System (GPS) module.
- the vehicle may acquire the position of the vehicle using a signal transmitted from a GPS satellite.
- the optical communication module 715 may include an optical transmitter and an optical receiver.
- the light receiver may convert the light signal into an electrical signal to receive information.
- the light receiver may include a photo diode (PD) for receiving light.
- Photodiodes can convert light into electrical signals.
- the light receiver may receive information of the front vehicle through the light emitted from the light source included in the front vehicle.
- the light emitter may include at least one light emitting device for converting an electrical signal into an optical signal.
- for example, the light emitting element may be a light emitting diode (LED).
- the light emitter converts an electrical signal into an optical signal and transmits it to the outside.
- for example, the light emitter may emit an optical signal to the outside by blinking the light emitting device at a predetermined frequency.
- the light emitter may include a plurality of light emitting element arrays.
- the light emitter may be integrated with a lamp provided in the vehicle 700.
- for example, the light emitter may be at least one of a headlight, a taillight, a brake light, a turn signal, and a side marker lamp.
- the optical communication module 715 may exchange data with another vehicle 602 through optical communication.
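- blinking a light emitting device at a predetermined frequency to carry information, as described above, can be sketched as simple on-off keying: the emitter holds the LED on or off for one blink period per bit, and the photodiode side averages and thresholds each period back into bits. The encoding scheme and names are assumptions for illustration, not the disclosed protocol.

```python
SAMPLES_PER_BIT = 4   # assumed blink period, in receiver samples

def emit(bits):
    """LED intensity samples: each bit held for one blink period."""
    out = []
    for b in bits:
        out += [1.0 if b else 0.0] * SAMPLES_PER_BIT
    return out

def receive(samples):
    """Photodiode side: average each period, then threshold back to bits."""
    bits = []
    for i in range(0, len(samples), SAMPLES_PER_BIT):
        period = samples[i:i + SAMPLES_PER_BIT]
        bits.append(1 if sum(period) / len(period) > 0.5 else 0)
    return bits

message = [1, 0, 1, 1, 0, 0, 1]
print(receive(emit(message)) == message)   # True: round trip through the light channel
```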
- the V2X communication module 716 is a module for performing wireless communication with the server 601 or another vehicle 602.
- the V2X communication module 716 includes a module capable of implementing a vehicle-to-vehicle (V2V) or vehicle-to-infrastructure (V2I) communication protocol.
- the vehicle 700 may perform wireless communication with the external server 601 and another vehicle 602 through the V2X communication module 716.
- the input unit 720 may include a driving manipulation unit 721, a camera 195, a microphone 723, and a user input unit 724.
- the driving operation means 721 receives a user input for driving the vehicle 700.
- the driving manipulation means 721 may include a steering input means 721a, a shift input means 721b, an acceleration input means 721c, and a brake input means 721d.
- the steering input means 721a receives an input of a traveling direction of the vehicle 700 from the user.
- the steering input means 721a is preferably formed in a wheel shape to enable steering input by rotation.
- the steering input means 721a may be formed of a touch screen, a touch pad, or a button.
- the shift input means 721b receives an input of parking (P), drive (D), neutral (N), or reverse (R) of the vehicle 700 from the user.
- the shift input means 721b is preferably formed in the form of a lever.
- the shift input unit 721b may be formed as a touch screen, a touch pad, or a button.
- the acceleration input means 721c receives an input for accelerating the vehicle 700 from the user.
- the brake input means 721d receives an input for deceleration of the vehicle 700 from the user.
- the acceleration input means 721c and the brake input means 721d are preferably formed in the form of a pedal. According to an embodiment, the acceleration input means 721c or the brake input means 721d may be formed as a touch screen, a touch pad, or a button.
- the camera 195 may include an image sensor and an image processing module.
- the camera 195 may process a still image or a moving image obtained by an image sensor (eg, CMOS or CCD).
- the image processing module may process the still image or the moving image acquired through the image sensor, extract necessary information, and transfer the extracted information to the controller 770.
- the vehicle 700 may include a camera 195 for capturing a vehicle front image or a vehicle surrounding image and an internal camera 195c for capturing an interior image of the vehicle.
- the internal camera 195c may acquire an image of the passenger.
- the internal camera 195c may acquire an image for biometric recognition of the passenger.
- the internal camera 195c may acquire an image of a passenger in the vehicle 700 and detect how many people are in the vehicle.
- although FIG. 7 illustrates that the camera 195 is included in the input unit 720, the camera 195 may also be described as a component included in the vehicle driving assistance apparatus 100, as described with reference to FIGS. 2 to 6.
- the microphone 723 may process an external sound signal into electrical data.
- the processed data may be utilized in various ways depending on the function being performed in the vehicle 700.
- the microphone 723 may convert the user's voice command into electrical data.
- the converted electrical data may be transferred to the controller 770.
- the camera 722 or the microphone 723 may be a component included in the sensing unit 760, not a component included in the input unit 720.
- the user input unit 724 is for receiving information from a user. When information is input through the user input unit 724, the controller 770 may control the operation of the vehicle 700 to correspond to the input information.
- the user input unit 724 may include a touch input means or a mechanical input means. According to an embodiment, the user input unit 724 may be disposed in one region of the steering wheel. In this case, the driver may manipulate the user input unit 724 with a finger while holding the steering wheel.
- the sensing unit 760 senses a signal related to driving of the vehicle 700.
- the sensing unit 760 may include a collision sensor, a wheel sensor, a speed sensor, a tilt sensor, a weight sensor, a heading sensor, a yaw sensor, a gyro sensor, a position module, a vehicle forward/reverse sensor, a battery sensor, a fuel sensor, a tire sensor, a steering sensor based on steering wheel rotation, a vehicle interior temperature sensor, a vehicle interior humidity sensor, a rain sensor, an ultrasonic sensor, a radar, a LiDAR (Light Detection And Ranging), and the like.
- the sensing unit 760 may thereby acquire sensing signals for vehicle collision information, vehicle direction information, vehicle position information (GPS information), vehicle angle information, vehicle speed information, vehicle acceleration information, vehicle tilt information, vehicle forward/reverse information, battery information, fuel information, tire information, vehicle lamp information, vehicle interior temperature information, vehicle interior humidity information, rain information, steering wheel rotation angle, and the like.
- the sensing unit 760 may further include an accelerator pedal sensor, a pressure sensor, an engine speed sensor, an air flow sensor (AFS), an intake air temperature sensor (ATS), a water temperature sensor (WTS), a throttle position sensor (TPS), a TDC sensor, a crank angle sensor (CAS), and the like.
- the sensing unit 760 may include a biometric information detecting unit.
- the biometric information detector detects and acquires biometric information of the passenger.
- the biometric information may include fingerprint information, iris-scan information, retina-scan information, hand geometry information, facial recognition information, and voice recognition information.
- the biometric information sensing unit may include a sensor for sensing biometric information of the passenger.
- the internal camera 195c and the microphone 723 may operate as sensors.
- the biometric information detecting unit may acquire hand shape information and face recognition information through the internal camera 195c.
- the output unit 740 outputs the information processed by the controller 770 and may include a display unit 741, a sound output unit 742, and a haptic output unit 743.
- the display 741 may display information processed by the controller 770.
- the display unit 741 may display vehicle related information.
- the vehicle related information may include vehicle control information for direct control of the vehicle, or vehicle driving assistance information for driving guide to the vehicle driver.
- the vehicle related information may include vehicle state information indicating a current state of a vehicle or vehicle driving information related to driving of the vehicle.
- the display unit 741 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT LCD), an organic light-emitting diode (OLED) display, a flexible display, a 3D display, and an e-ink display.
- the display unit 741 forms a layer structure with or is integrally formed with the touch sensor, thereby implementing a touch screen.
- a touch screen may provide an output interface between the vehicle 700 and the user while functioning as a user input unit 724 that provides an input interface between the vehicle 700 and the user.
- the display unit 741 may include a touch sensor that senses a touch on the display unit 741 so as to receive a control command by a touch method. When the display unit 741 is touched, the touch sensor senses the touch, and the controller 770 may generate a control command corresponding to the touch.
- the content input by the touch method may be letters or numbers or menu items that can be indicated or designated in various modes.
- the display unit 741 may include a cluster so that the driver can check vehicle state information or vehicle driving information while driving.
- the cluster can be located on the dashboard. In this case, the driver can check the information displayed on the cluster while keeping the gaze in front of the vehicle.
- the display unit 741 may be implemented as a head up display (HUD).
- information may be output through a transparent display provided in the wind shield.
- the display unit 741 may include a projection module to output information through an image projected on the wind shield.
- the sound output unit 742 converts the electrical signal from the control unit 770 into an audio signal and outputs the audio signal.
- the sound output unit 742 may include a speaker.
- the sound output unit 742 may output a sound corresponding to the operation of the user input unit 724.
- the haptic output unit 743 generates a tactile output.
- the haptic output unit 743 may vibrate the steering wheel, the seat belt, and the seat so that the user can recognize the output.
- the vehicle driver 750 may control the operation of various devices of the vehicle 700.
- the vehicle driver 750 may receive a control signal from the vehicle driving assistance apparatus 100.
- the vehicle driver 750 may control each device based on the control signal.
- the vehicle driver 750 may include a power source driver 751, a steering driver 752, a brake driver 753, a lamp driver 754, an air conditioning driver 755, a window driver 756, an airbag driver 757, a sunroof driver 758, and a suspension driver 759.
- the power source driver 751 may perform electronic control of the power source in the vehicle 700.
- the power source driver 751 may perform electronic control of the engine, thereby controlling the output torque of the engine and the like.
- when the power source is an engine, the power source driver 751 may limit the vehicle speed by limiting the engine output torque under the control of the controller 770.
- when the power source is an electric motor, the power source driver 751 may perform control of the motor, thereby controlling the rotation speed, torque, and the like of the motor.
- the power source driver 751 may receive an acceleration control signal from the vehicle driving assistance apparatus 100.
- the power source driver 751 may control the power source according to the received acceleration control signal.
- the steering driver 752 may perform electronic control of a steering apparatus in the vehicle 700. As a result, the traveling direction of the vehicle can be changed.
- the steering driver 752 may receive a steering control signal from the vehicle driving assistance apparatus 100.
- the steering driver 752 may control the steering apparatus to steer according to the received steering control signal.
- the brake driver 753 may perform electronic control of a brake apparatus (not shown) in the vehicle 700. For example, the speed of the vehicle 700 may be reduced by controlling the operation of the brake disposed on the wheel. As another example, by varying the operation of the brakes disposed on the left wheels and the right wheels, the traveling direction of the vehicle 700 may be adjusted to the left or the right.
- the brake driver 753 may receive the deceleration control signal from the vehicle driving assistance apparatus 100.
- the brake driver 753 may control the brake apparatus according to the received deceleration control signal.
- the lamp driver 754 may control the turn-on/turn-off of lamps disposed inside or outside the vehicle, and may also control the intensity and direction of the light of each lamp. For example, the lamp driver 754 may control a turn signal lamp, a brake lamp, and the like.
- the air conditioning driver 755 may perform electronic control of an air conditioner (not shown) in the vehicle 700. For example, when the interior temperature of the vehicle is high, the air conditioner may be operated so that cool air is supplied into the vehicle.
- the window driver 756 may perform electronic control of a window apparatus in the vehicle 700. For example, the opening or closing of the left and right windows of the side of the vehicle can be controlled.
- the airbag driver 757 may perform electronic control of an airbag apparatus in the vehicle 700.
- for example, in case of danger, the airbag may be controlled to deploy.
- the sunroof driver 758 may perform electronic control of a sunroof apparatus (not shown) in the vehicle 700. For example, the opening or closing of the sunroof can be controlled.
- the suspension driver 759 may perform electronic control of a suspension apparatus (not shown) in the vehicle 700. For example, when the road surface is curved, the suspension device may be controlled to control the vibration of the vehicle 700 to be reduced.
- the suspension driver 759 may receive a suspension control signal from the vehicle driving assistance apparatus 100. The suspension driver 759 may control the suspension device according to the received suspension control signal.
- the memory 730 is electrically connected to the controller 770.
- the memory 730 may store basic data for the unit, control data for controlling the operation of the unit, and input / output data.
- the memory 730 may be various storage devices such as a ROM, a RAM, an EPROM, a flash drive, a hard drive, and the like, in hardware.
- the memory 730 may store various data for the overall operation of the vehicle 700, such as a program for processing or controlling the controller 770.
- the interface unit 780 may serve as a path to various types of external devices connected to the vehicle 700.
- the interface unit 780 may include a port connectable to the mobile terminal 600, and may be connected to the mobile terminal 600 through the port. In this case, the interface unit 780 may exchange data with the mobile terminal 600.
- the interface unit 780 may serve as a path for supplying electrical energy to the connected mobile terminal 600.
- the interface unit 780 provides the mobile terminal 600 with electrical energy supplied from the power supply unit 790.
- the controller 770 may control the overall operation of each unit in the vehicle 700.
- the controller 770 may be referred to as an electronic control unit (ECU).
- the controller 770 may be implemented in hardware using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and electrical units for performing other functions.
- the power supply unit 790 may supply power required for the operation of each component under the control of the controller 770.
- the power supply unit 790 may receive power from a battery (not shown) in the vehicle.
- the vehicle driving assistance apparatus 100 may exchange data with the controller 770.
- the control signal generated by the vehicle driving assistance apparatus 100 may be output to the controller 770.
- the vehicle driving assistance apparatus 100 may be the vehicle driving assistance apparatus described above with reference to FIGS. 1 to 6B.
- the vehicle display apparatus 400 may exchange data with the controller 770.
- the controller 770 may receive navigation information from the vehicle display apparatus 400 or a separate navigation device (not shown).
- the navigation information may include set destination information, route information according to the destination, map information or vehicle location information related to driving of the vehicle.
- FIG. 8 is a diagram referred to in describing the binning and cropping operations according to an embodiment of the present invention.
- the processor 170 may receive a first image from the first camera 195a.
- the first image may include a plurality of frames 811, 812, 813, 814, 815, 816,...
- the processor 170 may bin the first image. For example, the processor 170 may bin some non-consecutive frames 811, 813, 815,... among the plurality of frames 811, 812, 813, 814, 815, 816,....
- the processor 170 may bin the first image based on the second image.
- the processor 170 may bin the first image so as to be synchronized with the second image. For example, the processor 170 may bin the frames 811, 813, 815,... of the first image corresponding to the frames 821, 823, 825,... cropped in the second image.
- the processor 170 may bin the first image to correspond to the resolution of the second image. For example, the processor 170 may bin the first image to have the same resolution as that of the second image.
- the processor 170 may detect an object based on the first image. For example, the processor 170 may detect an object based on the unbinned frames 812, 814, 816,... among the plurality of frames 811, 812, 813, 814, 815, 816,....
- Images of unbinned frames have higher resolution and contain more information. By detecting the object based on the image of the frame that is not binned, information about the object can be detected more accurately.
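The binning described above can be sketched as simple block averaging over alternating frames. This is a minimal illustration, not the patent's implementation; the frame size, binning factor, and alternating schedule below are assumed values:

```python
import numpy as np

def bin_frame(frame: np.ndarray, factor: int = 2) -> np.ndarray:
    """Reduce resolution by averaging each factor-by-factor pixel block."""
    h, w = frame.shape[:2]
    h, w = h - h % factor, w - w % factor          # drop ragged edge rows/cols
    blocks = frame[:h, :w].reshape(h // factor, factor, w // factor, factor)
    return blocks.mean(axis=(1, 3))

# Bin only alternating (non-consecutive) frames, as in FIG. 8; the remaining
# frames keep full resolution for object detection.
frames = [np.full((480, 640), i, dtype=float) for i in range(6)]
binned = {i: bin_frame(f) for i, f in enumerate(frames) if i % 2 == 0}
print(binned[0].shape)  # (240, 320)
```

Averaging (rather than subsampling) preserves photon counts per output pixel, which is why binning is also a common way to raise sensitivity while lowering resolution.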
- the processor 170 may receive a second image from the second camera 195b.
- the second image may include a plurality of frames 821, 822, 823, 824, 825, 826,...
- the processor 170 may crop the second image. For example, the processor 170 may crop some non-consecutive frames 821, 823, 825,... among the plurality of frames 821, 822, 823, 824, 825, 826,....
- the processor 170 may crop the second image based on the first image.
- the processor 170 may crop the second image so as to be synchronized with the first image. For example, the processor 170 may crop the frames 821, 823, 825,... of the second image corresponding to the frames 811, 813, 815,... binned in the first image.
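The synchronization above can be read as an alternating schedule over the two frame streams. In the sketch below the FIG. 8 frame numbers are used only as labels, and `bin`/`crop` are stand-in strings rather than real image operations:

```python
def schedule(narrow_frames, wide_frames):
    """Pair alternating frames for stereo; keep the rest for detection."""
    stereo_pairs, detect_narrow, detect_wide = [], [], []
    for i, (n, w) in enumerate(zip(narrow_frames, wide_frames)):
        if i % 2 == 0:
            # e.g. frames 811/821, 813/823: binned + cropped stereo pair
            stereo_pairs.append((f"bin({n})", f"crop({w})"))
        else:
            # e.g. frames 812/822, 814/824: untouched, used for detection
            detect_narrow.append(n)
            detect_wide.append(w)
    return stereo_pairs, detect_narrow, detect_wide

pairs, dn, dw = schedule([811, 812, 813, 814], [821, 822, 823, 824])
print(pairs)  # [('bin(811)', 'crop(821)'), ('bin(813)', 'crop(823)')]
```

The point of the interleaving is that a single camera pair yields both a synchronized stereo stream (for disparity) and full-quality mono streams (for detection) without extra hardware.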
- the processor 170 may crop the second image to correspond to the content of the first image.
- the processor 170 may crop the second image to have the same content as the content of the first image.
- the processor 170 may detect an object based on the second image. For example, the processor 170 may detect an object based on the uncropped frames 822, 824, and 826 of the plurality of frames 821, 822, 823, 824, 825, and 826.
- Images of uncropped frames have a wider field of view and therefore contain more information. By detecting the object based on the image of the frame which has not been cropped, information about the object can be detected more accurately.
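The cropping discussed above, which keeps only the part of the wide-angle frame that overlaps the narrow camera's view, can be sketched with the pinhole relation that image half-width scales with tan(fov/2). The 120°/60° angles and the frame size are hypothetical stand-ins, not values from the patent:

```python
import math
import numpy as np

def crop_to_fov(wide: np.ndarray, wide_fov_deg: float, narrow_fov_deg: float) -> np.ndarray:
    """Center-crop a wide-angle frame to match a narrower field of view."""
    h, w = wide.shape[:2]
    # Pinhole model: the image half-width scales with tan(fov / 2).
    keep = math.tan(math.radians(narrow_fov_deg) / 2) / math.tan(math.radians(wide_fov_deg) / 2)
    cw, ch = round(w * keep), round(h * keep)
    x0, y0 = (w - cw) // 2, (h - ch) // 2
    return wide[y0:y0 + ch, x0:x0 + cw]

wide = np.zeros((720, 1280))
cropped = crop_to_fov(wide, wide_fov_deg=120.0, narrow_fov_deg=60.0)
print(cropped.shape)  # (240, 427): tan(30°)/tan(60°) = 1/3 of each dimension
```

A real system would crop around the calibrated principal point rather than the geometric center, but the scaling by tan(fov/2) is the essential step.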
- FIG. 9 is a diagram referred to for describing an operation of generating a stereo image according to an embodiment of the present invention.
- the processor 170 may process a first image and a second image to generate a stereo image.
- the processor 170 may bin the first image (811) and crop the second image (821) to obtain stereo images 811 and 821.
- the processor 170 may generate a stereo image by rectifying the binned first image or the cropped second image.
- the processor 170 may adjust the size of either the binned first image or the cropped second image, and then generate a stereo image.
- the processor 170 may adjust the sizes of both the binned first image and the cropped second image, and then generate a stereo image.
- the processor 170 may perform a disparity operation based on the stereo images 811 and 821.
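A disparity operation of the kind mentioned here can be illustrated with naive SAD (sum of absolute differences) block matching on a rectified pair. Production ADAS pipelines use far more elaborate matchers; the image sizes and window parameters below are arbitrary assumptions:

```python
import numpy as np

def sad_disparity(left: np.ndarray, right: np.ndarray, block: int = 3, max_d: int = 8) -> np.ndarray:
    """Per-pixel disparity by SAD block matching on rectified grayscale images."""
    h, w = left.shape
    half = block // 2
    disp = np.zeros((h, w), dtype=int)
    for y in range(half, h - half):
        for x in range(half + max_d, w - half):
            patch = left[y - half:y + half + 1, x - half:x + half + 1]
            # A point at column x in the left image appears at x - d in the right.
            costs = [np.abs(patch - right[y - half:y + half + 1,
                                          x - d - half:x - d + half + 1]).sum()
                     for d in range(max_d + 1)]
            disp[y, x] = int(np.argmin(costs))
    return disp

# Synthetic rectified pair: every point sits 4 px further left in the right image.
rng = np.random.default_rng(0)
left = rng.random((20, 40))
right = np.roll(left, -4, axis=1)
disp = sad_disparity(left, right)
print(disp[10, 20])  # 4
```

Since disparity is inversely proportional to depth (Z = f·B/d for focal length f and baseline B), a constant-disparity region like this corresponds to a fronto-parallel surface.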
- FIG. 10 is a diagram referred to for describing a first image according to an embodiment of the present invention.
- the processor 170 may receive a first image from the first camera 195a.
- the first image may be a far image in front of the vehicle acquired by the narrow angle camera 195a.
- the processor 170 may detect an object located at a far distance through the first image.
- through the first image, the processor 170 may not detect objects over a wide lateral field of view, but may detect objects located at a far distance.
- reference numeral 910 conceptually indicates a detectable area according to the characteristics of the first camera 195a.
- the processor 170 may detect the object 1010.
- the detected object 1010 may be detected because it is included in the first image according to an angle of view of the first lens 193a included in the first camera 195a.
- the object 1020 that is not detected is not detected because it is not included in the first image according to the angle of view of the first lens 193a included in the first camera 195a.
- FIG. 11 is a diagram referred to for describing a second image, according to an embodiment of the present invention.
- the processor 170 may receive a second image from the second camera 195b.
- the second image may be a near field image of the front of the vehicle acquired by the wide angle camera 195b.
- through the second image, the processor 170 may detect, among objects located at a short distance, an object located at the front left or right side of the vehicle.
- through the second image, the processor 170 may not detect objects located at a far distance, but may detect objects over a wide lateral field of view.
- reference numeral 920 conceptually indicates a detectable area according to the characteristics of the second camera 195b.
- the processor 170 may detect the object 1110.
- the detected object 1110 may be detected because it is included in the second image according to the angle of view of the second lens 193b included in the second camera 195b.
- the object 1120 that is not detected is not detected because it is not included in the second image according to the angle of view of the second lens 193b included in the second camera 195b.
- FIG. 12 is a diagram for describing a stereo image generated based on a first image and a second image, according to an embodiment of the present invention.
- the processor 170 may bin the first image 811, crop the second image 821, and generate a stereo image by rectifying.
- the processor 170 may perform a disparity operation based on the generated stereo image.
- the processor 170 may perform a disparity operation on the object 1210 detected in an area overlapping the first image and the second image.
- Reference numeral 1220 denotes an object that can be detected in the first image but not detected in the second image.
- Reference numeral 1225 denotes an object that can be detected in the second image but not detected in the first image.
- the processor 170 may not perform disparity operations on these objects 1220 and 1225.
- Reference numeral 1230 denotes an object that is not detected in any of the first image and the second image.
- FIGS. 13 and 14 are exemplary views illustrating a first image, a second image, and a stereo image according to an embodiment of the present invention.
- the first camera 195a may have an angle of view of a degrees.
- the first camera 195a may have a focal length corresponding to the angle of view of a degrees. Here, a degrees may be less than b degrees.
- the first camera 195a may acquire a first image 1310.
- the second camera 195b may have an angle of view of b degrees.
- the second camera 195b may have a focal length corresponding to the angle of view of b degrees.
- b degrees may be greater than a degrees.
- the second camera 195b may acquire a second image 1320.
- the first camera 195a and the second camera 195b may be disposed to be spaced apart by a distance in the horizontal direction.
- the first objects 1350a and 1350b are included in the first image 1310 and the second image 1320.
- the first object 1350a of the first image 1310 may be larger than the first object 1350b of the second image 1320. That is, the first object 1350b of the second image 1320 may be smaller than the first object 1350a of the first image 1310.
- the second image 1320 is an image captured at a wider angle than the first image 1310.
- the second camera 195b may take a picture of a subject located in a wider space from side to side than the first camera 195a.
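The size difference between the first objects 1350a and 1350b follows from the pinhole relation between angle of view and focal length, f = (W/2)/tan(fov/2), since an object's apparent width in pixels is proportional to f. The 60°/120° angles and 1280 px image width below are hypothetical stand-ins for a degrees and b degrees:

```python
import math

def focal_px(image_width_px: float, fov_deg: float) -> float:
    """Pinhole focal length in pixels from the horizontal field of view."""
    return (image_width_px / 2) / math.tan(math.radians(fov_deg) / 2)

f_narrow = focal_px(1280, 60)    # narrow angle of view -> longer focal length
f_wide = focal_px(1280, 120)     # wide angle of view  -> shorter focal length

# An object of width X at distance Z spans roughly X * f / Z pixels, so the
# narrow camera renders the same object f_narrow / f_wide times larger.
print(round(f_narrow / f_wide, 6))  # 3.0
```

This ratio is exactly tan(b/2)/tan(a/2), which is why the processor must rescale one image before the two can form a stereo pair.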
- the processor 170 may bin the first image and crop the second image.
- the processor 170 may rectify the binned first image and the cropped second image.
- Reference numeral 1410 denotes a rectified first image.
- Reference numeral 1420 denotes a second image that has been rectified.
- the processor 170 may rectify the binned first image based on the cropped second image. That is, the processor 170 may rectify only the binned first image.
- the processor 170 may rectify the cropped second image based on the binned first image. That is, the processor 170 may rectify only the cropped second image.
- the processor 170 may generate a stereo image by performing rectification, and then perform a disparity operation.
- the processor 170 may generate a disparity map 1430 through a disparity operation.
- the disparity map 1430 is as described with reference to FIGS. 5A to 6B.
- the present invention described above can be embodied as computer readable codes on a medium in which a program is recorded.
- the computer-readable medium includes all kinds of recording devices in which data readable by a computer system is stored. Examples of computer-readable media include hard disk drives (HDDs), solid state disks (SSDs), silicon disk drives (SDDs), ROMs, RAMs, CD-ROMs, magnetic tapes, floppy disks, optical data storage devices, and the like, and also include implementation in the form of a carrier wave (e.g., transmission over the Internet).
- the computer may include the processor 170 or the controller 770. Accordingly, the above detailed description should not be construed as limiting in all aspects and should be considered illustrative. The scope of the invention should be determined by reasonable interpretation of the appended claims, and all changes within the equivalent scope of the invention are included in the scope of the invention.
Claims (19)
- A vehicle driving assistance apparatus comprising: a first camera including a first lens having a first angle of view and acquiring a first image of the front of a vehicle; a second camera including a second lens having a second angle of view different from the first angle of view and acquiring a second image of the front of the vehicle; and a processor configured to perform object detection based on each of the first image and the second image, process each of the first image and the second image to obtain a stereo image, and perform a disparity calculation for the front of the vehicle based on the stereo image.
- The vehicle driving assistance apparatus of claim 1, wherein, when processing each of the first image and the second image, the processor adjusts the resolution, exposure level, and content of the first image and the second image so that the resolution, exposure level, and content of the processed first image and second image match each other.
- The vehicle driving assistance apparatus of claim 1, wherein the first lens is a narrow-angle lens and the second lens is a wide-angle lens.
- The vehicle driving assistance apparatus of claim 1, wherein the processor bins the first image.
- The vehicle driving assistance apparatus of claim 4, wherein the first image includes a plurality of frames, and the processor bins some non-consecutive frames among the plurality of frames.
- The vehicle driving assistance apparatus of claim 5, wherein the processor detects the object based on frames that are not binned among the plurality of frames.
- The vehicle driving assistance apparatus of claim 4, wherein the processor bins the first image based on the second image.
- The vehicle driving assistance apparatus of claim 1, wherein the processor crops the second image.
- The vehicle driving assistance apparatus of claim 8, wherein the second image includes a plurality of frames, and the processor crops some non-consecutive frames among the plurality of frames.
- The vehicle driving assistance apparatus of claim 9, wherein the processor detects the object based on frames that are not cropped among the plurality of frames.
- The vehicle driving assistance apparatus of claim 8, wherein the processor crops the second image based on the first image.
- The vehicle driving assistance apparatus of claim 1, wherein the processor detects the object in the stereo image and calculates a distance or a relative speed to the object based on the disparity calculation.
- The vehicle driving assistance apparatus of claim 12, wherein, while tracking a first object detected based on the first image, when the calculated distance to the first object becomes equal to or less than a reference value, the processor tracks the first object based on the second image.
- The vehicle driving assistance apparatus of claim 1, wherein the processor adjusts a zoom magnification of the first camera or the second camera.
- The vehicle driving assistance apparatus of claim 1, further comprising an interface unit for exchanging data with another device, wherein the processor provides, through the interface unit, a control signal based on the object information and the disparity information to a vehicle driver.
- The vehicle driving assistance apparatus of claim 15, wherein the processor provides the control signal to a power source driver, a steering driver, or a brake driver of the vehicle.
- The vehicle driving assistance apparatus of claim 1, wherein the processor performs detection or tracking of an object located at a far distance based on the first image.
- The vehicle driving assistance apparatus of claim 1, wherein the processor performs detection of an object located at a short distance or detection of an emergency object based on the second image.
- The vehicle driving assistance apparatus of claim 1, wherein the processor selects one of the first image and the second image as the image for detecting the object, based on a traveling speed of the vehicle.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020177033218A KR102077575B1 (ko) | 2015-05-19 | 2016-04-06 | 차량 운전 보조 장치 및 차량 |
EP16796646.4A EP3299240A4 (en) | 2015-05-19 | 2016-04-06 | DEVICE FOR ASSISTING THE DRIVING OF A VEHICLE AND VEHICLE |
US15/574,985 US10703374B2 (en) | 2015-05-19 | 2016-04-06 | Vehicle driving assisting apparatus and vehicle comprising same |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201562163575P | 2015-05-19 | 2015-05-19 | |
US62/163,575 | 2015-05-19 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016186319A1 true WO2016186319A1 (ko) | 2016-11-24 |
Family
ID=57320416
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2016/003586 WO2016186319A1 (ko) | 2015-05-19 | 2016-04-06 | 차량 운전 보조 장치 및 차량 |
Country Status (4)
Country | Link |
---|---|
US (1) | US10703374B2 (ko) |
EP (1) | EP3299240A4 (ko) |
KR (1) | KR102077575B1 (ko) |
WO (1) | WO2016186319A1 (ko) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101677856B1 (ko) * | 2015-08-06 | 2016-11-18 | 연세대학교 산학협력단 | 차량용 후방 카메라 시스템 |
KR102661614B1 (ko) | 2019-02-11 | 2024-04-29 | 삼성전자주식회사 | 화면 제공 방법 및 이를 지원하는 전자 장치 |
WO2021033812A1 (ko) * | 2019-08-22 | 2021-02-25 | (주)캠시스 | 이종 스테레오 카메라 시스템 및 카메라 보정 방법 |
KR102219339B1 (ko) * | 2019-10-01 | 2021-02-24 | 주식회사대성엘텍 | 운전 보조 장치 및 이를 이용한 운전 보조 방법 |
KR102497488B1 (ko) * | 2022-11-29 | 2023-02-08 | 주식회사 셀플러스코리아 | 주행속도에 따른 인식범위조절이 가능한 자율주행 차량의 영상인식 장치 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110158528A1 (en) * | 2009-12-31 | 2011-06-30 | Sehoon Yea | Determining Disparity Search Range in Stereo Videos |
US8106936B2 (en) * | 2007-03-16 | 2012-01-31 | Kollmorgen Corporation | Panoramic video imaging and display system |
KR101172704B1 (ko) * | 2009-11-27 | 2012-08-09 | (주)이프러스 | 멀티화각 카메라를 탑재한 차량용 블랙박스 |
US20120327189A1 (en) * | 2010-03-12 | 2012-12-27 | Hitachi Automotive Systems, Ltd. | Stereo Camera Apparatus |
US20130100256A1 (en) * | 2011-10-21 | 2013-04-25 | Microsoft Corporation | Generating a depth map |
Family Cites Families (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7786898B2 (en) | 2006-05-31 | 2010-08-31 | Mobileye Technologies Ltd. | Fusion of far infrared and visible images in enhanced obstacle detection in automotive applications |
JP5278819B2 (ja) | 2009-05-11 | 2013-09-04 | 株式会社リコー | ステレオカメラ装置及びそれを用いた車外監視装置 |
JP5663352B2 (ja) | 2011-03-03 | 2015-02-04 | 日本電産エレシス株式会社 | 画像処理装置、画像処理方法、及び画像処理プログラム |
EP2946336B1 (en) * | 2013-01-15 | 2023-06-21 | Mobileye Vision Technologies Ltd. | Stereo assist with rolling shutters |
KR102027771B1 (ko) | 2013-01-31 | 2019-10-04 | 한국전자통신연구원 | 차량 속도 적응형 장애물 검출 장치 및 방법 |
WO2014148031A1 (ja) | 2013-03-19 | 2014-09-25 | パナソニック株式会社 | 画像生成装置、撮像装置および画像生成方法 |
US10318823B2 (en) * | 2013-10-14 | 2019-06-11 | Mobileye Vision Technologies Ltd. | Forward-facing multi-imaging system for navigating a vehicle |
US9150220B2 (en) * | 2013-12-04 | 2015-10-06 | Mobileye Vision Technologies Ltd. | Systems and methods for mimicking a leading vehicle |
CN111199218A (zh) * | 2014-01-30 | 2020-05-26 | 移动眼视力科技有限公司 | 用于车辆的控制系统、和图像分析系统 |
EP3108264A2 (en) * | 2014-02-20 | 2016-12-28 | Mobileye Vision Technologies Ltd. | Advanced driver assistance system based on radar-cued visual imaging |
WO2015157410A1 (en) * | 2014-04-08 | 2015-10-15 | Tk Holdings Inc. | System and method for night vision object detection and driver assistance |
US9443163B2 (en) * | 2014-05-14 | 2016-09-13 | Mobileye Vision Technologies Ltd. | Systems and methods for curb detection and pedestrian hazard assessment |
US9571819B1 (en) * | 2014-09-16 | 2017-02-14 | Google Inc. | Efficient dense stereo computation |
US20170113611A1 (en) | 2015-10-27 | 2017-04-27 | Dura Operating, Llc | Method for stereo map generation with novel optical resolutions |
-
2016
- 2016-04-06 EP EP16796646.4A patent/EP3299240A4/en not_active Withdrawn
- 2016-04-06 KR KR1020177033218A patent/KR102077575B1/ko active IP Right Grant
- 2016-04-06 WO PCT/KR2016/003586 patent/WO2016186319A1/ko active Application Filing
- 2016-04-06 US US15/574,985 patent/US10703374B2/en active Active
Non-Patent Citations (1)
Title |
---|
See also references of EP3299240A4 * |
Also Published As
Publication number | Publication date |
---|---|
EP3299240A1 (en) | 2018-03-28 |
US10703374B2 (en) | 2020-07-07 |
EP3299240A4 (en) | 2019-01-30 |
KR102077575B1 (ko) | 2020-02-17 |
KR20170140284A (ko) | 2017-12-20 |
US20180154900A1 (en) | 2018-06-07 |
Legal Events
Date | Code | Title | Description
---|---|---|---
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 16796646; Country of ref document: EP; Kind code of ref document: A1
| ENP | Entry into the national phase | Ref document number: 20177033218; Country of ref document: KR; Kind code of ref document: A
| WWE | Wipo information: entry into national phase | Ref document number: 15574985; Country of ref document: US
| NENP | Non-entry into the national phase | Ref country code: DE
| WWE | Wipo information: entry into national phase | Ref document number: 2016796646; Country of ref document: EP