WO2017065352A1 - Around view providing apparatus for vehicle, and vehicle - Google Patents

Info

Publication number
WO2017065352A1
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
image
processor
view
information
Prior art date
Application number
PCT/KR2015/013834
Other languages
English (en)
Korean (ko)
Inventor
김성민
서진
홍기현
Original Assignee
엘지전자 주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 엘지전자 주식회사 filed Critical 엘지전자 주식회사
Publication of WO2017065352A1

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60R: VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R21/00: Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
    • B60R21/01: Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents
    • B60R21/013: Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents, including means for detecting collisions, impending collisions or roll-over
    • B60R21/0134: Electrical circuits for triggering passive safety arrangements, including means for detecting collisions, impending collisions or roll-over, responsive to imminent contact with an obstacle, e.g. using radar systems
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00: Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08: Interaction between the driver and the control system
    • B60W50/14: Means for informing the driver, warning the driver or prompting a driver intervention

Definitions

  • The present invention relates to an around view providing apparatus provided in a vehicle, and to a vehicle.
  • A vehicle is a device that moves a user on board in a desired direction.
  • An example is a car.
  • Sensors mounted on autonomous vehicles include cameras, infrared sensors, radars, GPS, lidars, and gyroscopes, among which cameras occupy an important position as sensors for providing various information to users.
  • A representative example of such a camera-based system is an around view monitoring (AVM) apparatus, which provides the driver with an image of the vehicle's surroundings.
  • An object of an embodiment of the present invention is to provide an around view providing apparatus for a vehicle in which, when an object is detected, the viewpoint is switched based on the object and the resulting image is displayed.
  • Another object of an embodiment of the present invention is to provide a vehicle including such an around view providing apparatus.
  • To achieve the above objects, an around view providing apparatus for a vehicle according to an embodiment of the present invention may include: a plurality of cameras for obtaining a vehicle surrounding image;
  • a display unit configured to display a first viewpoint image generated by synthesizing the images acquired by the plurality of cameras; and
  • a processor configured to display, on the display unit, a second viewpoint image having a viewpoint different from that of the first viewpoint image when an object is detected.
  • FIG. 1 is a view showing the appearance of a vehicle according to an embodiment of the present invention.
  • FIG. 2 is a view schematically showing the positions of a plurality of cameras according to an embodiment of the present invention.
  • FIG. 3 is a diagram referred to for describing an around view image according to an exemplary embodiment of the present invention.
  • FIGS. 4A and 4B are block diagrams for describing an apparatus for providing an around view for a vehicle according to an exemplary embodiment of the present invention.
  • FIGS. 5A and 5B illustrate internal block diagrams of the processor of FIGS. 4A and 4B.
  • FIG. 5C is a diagram illustrating object detection in the processor of FIGS. 5A and 5B.
  • FIG. 6 is a block diagram referred to for describing a camera according to an embodiment of the present invention.
  • FIGS. 7A to 7C are flowcharts for describing an operation of the apparatus for providing an around view for a vehicle according to an exemplary embodiment of the present invention.
  • FIG. 8 is a block diagram referred to for explaining a vehicle according to an embodiment of the present invention.
  • FIGS. 9 to 15 are diagrams for describing an operation of the around view providing apparatus when an object is located at the rear of the vehicle, according to an exemplary embodiment of the present invention.
  • FIGS. 16 to 21 are diagrams for describing an operation of the around view providing apparatus when an object is located in front of the vehicle, according to an embodiment of the present invention.
  • FIGS. 22 to 25 are diagrams for describing an operation of the around view providing apparatus when an object is located at the side of the vehicle, according to an exemplary embodiment of the present invention.
  • FIGS. 26 and 27 are diagrams for describing an operation of the around view providing apparatus when there are a plurality of objects, according to an embodiment of the present invention.
  • FIGS. 28 and 29 are diagrams for describing an operation of the around view providing apparatus when an object is detected while driving, according to an embodiment of the present invention.
  • the vehicle described herein may be a concept including an automobile and a motorcycle.
  • In the following description, a car is mainly described as the vehicle.
  • the vehicle described herein may be a concept including both an internal combustion engine vehicle having an engine as a power source, a hybrid vehicle having an engine and an electric motor as a power source, an electric vehicle having an electric motor as a power source, and the like.
  • In the following description, the left side of the vehicle means the left side with respect to the driving direction of the vehicle, and the right side of the vehicle means the right side with respect to the driving direction of the vehicle.
  • FIG. 1 is a view showing the appearance of a vehicle according to an embodiment of the present invention.
  • a vehicle 700 may include wheels 103FR, 103FL, 103RL, ... rotated by a power source, steering input means 721a for adjusting the traveling direction of the vehicle 700, and a plurality of cameras 195 attached to the vehicle 700. In the figure, only the right camera 195c and the front camera 195d are shown for convenience.
  • the plurality of cameras 195 may acquire a vehicle surrounding image photographed at each of the arranged positions. Images acquired by the plurality of cameras 195 may be signal processed within the vehicle around view providing apparatus 100.
  • the plurality of cameras 195 may be composed of two or more. In the following description, the plurality of cameras 195 is described as four, but the scope of the present invention is not limited to the number of cameras.
  • the plurality of cameras 195 may be two, three, four, or more.
  • FIG. 2 is a view schematically showing the positions of a plurality of cameras according to an embodiment of the present invention.
  • the plurality of cameras 195a, 195b, 195c, and 195d may be disposed at the left side, the rear side, the right side, and the front side of the vehicle, respectively.
  • the left camera 195a and the right camera 195c may be disposed in a case surrounding the left side mirror and a case surrounding the right side mirror, respectively.
  • the rear camera 195b and the front camera 195d may be disposed near the trunk switch and near the emblem, respectively.
  • Each of the plurality of images captured by the plurality of cameras 195a, 195b, 195c, and 195d is transmitted to the processor (170 of FIG. 4A or 4B) in the vehicle 700, and the processor 170 may combine the plurality of images to generate an around view image.
  • FIG. 3 is a diagram referred to for describing an around view image according to an exemplary embodiment of the present invention.
  • the around view image 201 includes a first image area 195ai from the left camera 195a, a second image area 195bi from the rear camera 195b, a third image area 195ci from the right camera 195c, and a fourth image area 195di from the front camera 195d.
  • boundary lines 202a, 202b, 202c, and 202d may be displayed on boundaries of each of the plurality of images.
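  • As an illustration of how such an around view image may be composed, the sketch below (in Python with OpenCV, both assumptions; the patent does not name an implementation) warps each camera image onto the ground plane with a pre-calibrated homography and assigns each camera a fixed region of the top-view canvas:

```python
import cv2
import numpy as np

# A minimal sketch of around view synthesis, assuming each camera's
# ground-plane homography was obtained offline by calibration. The
# canvas size, homographies, and region masks are illustrative.
CANVAS_W, CANVAS_H = 400, 600  # top-view canvas in pixels

def make_around_view(images, homographies, region_masks):
    """Warp each of the four camera images (left, rear, right, front)
    onto the ground plane and compose them into one top-view image."""
    canvas = np.zeros((CANVAS_H, CANVAS_W, 3), dtype=np.uint8)
    for img, H, mask in zip(images, homographies, region_masks):
        warped = cv2.warpPerspective(img, H, (CANVAS_W, CANVAS_H))
        canvas[mask] = warped[mask]  # each camera owns a fixed region
    # boundary lines (202a..202d in FIG. 3) may be drawn between regions
    return canvas
```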
  • FIGS. 4A and 4B are block diagrams for describing an apparatus for providing an around view for a vehicle according to an exemplary embodiment of the present invention.
  • the apparatus for providing a vehicle around view 100 may generate an around view image by combining a plurality of images received from the plurality of cameras 195.
  • the vehicle around view providing apparatus 100 may include an input unit 110, a communication unit 120, an interface unit 130, a memory 140, a processor 170, a display unit 180, an audio output unit 185, a power supply unit 190, and a camera 195.
  • the input unit 110 may include a plurality of buttons or a touch screen. Through the plurality of buttons or the touch screen, the around view providing apparatus 100 may be turned on and operated. In addition, various input operations may be performed.
  • the communication unit 120 may exchange data with the mobile terminal 600, the server 601, or another vehicle 602 in a wireless manner.
  • the communication unit 120 may exchange data wirelessly with a mobile terminal of a vehicle driver.
  • various data communication methods such as Bluetooth, WiFi Direct, WiFi, APiX, and NFC are possible.
  • the communication unit 120 may receive weather information and road traffic information, for example, Transport Protocol Expert Group (TPEG) information, from the mobile terminal 600 or the server 601. Meanwhile, the around view providing apparatus 100 may transmit acquired real-time information to the mobile terminal 600 or the server 601.
  • the user's mobile terminal 600 and the around view providing apparatus 100 may perform pairing with each other automatically or when the user executes an application.
  • the communication unit 120 may receive the traffic light change information from the external server 601.
  • the external server 601 may be a server located at a traffic control station for controlling traffic.
  • the interface unit 130 may receive vehicle-related data or transmit a signal processed or generated by the processor 170 to the outside. To this end, the interface unit 130 may perform data communication with the control unit 770, the vehicle display apparatus 400, the sensing unit 760, the vehicle driver 750, and the like, by wired or wireless communication.
  • the interface unit 130 may receive navigation information by data communication with the controller 770, the vehicle display apparatus 400, or an additional navigation device.
  • the navigation information may include set destination information, route information according to the destination, map information related to driving of the vehicle, and current location information of the vehicle. Meanwhile, the navigation information may include location information of the vehicle on the road.
  • the interface unit 130 may receive sensor information from the controller 770 or the sensing unit 760.
  • the sensor information may include at least one of vehicle direction information, vehicle position information (GPS information), vehicle angle information, vehicle speed information, vehicle acceleration information, vehicle tilt information, vehicle forward/reverse information, battery information, fuel information, tire information, vehicle lamp information, vehicle interior temperature information, vehicle interior humidity information, and rain information.
  • Such sensor information may be obtained from a heading sensor, a yaw sensor, a gyro sensor, a position module, a vehicle forward/reverse sensor, a wheel sensor, a vehicle speed sensor, a vehicle body tilt sensor, a battery sensor, a fuel sensor, a tire sensor, a steering sensor based on steering wheel rotation, a vehicle interior temperature sensor, a vehicle interior humidity sensor, a rain sensor, and the like.
  • the position module may include a GPS module for receiving GPS information.
  • Meanwhile, among the sensor information, the vehicle direction information, vehicle position information, vehicle angle information, vehicle speed information, vehicle tilt information, and the like, which are related to driving of the vehicle, may be referred to as vehicle driving information.
  • the interface unit 130 may provide a signal to the controller 770 or the vehicle driver 750.
  • the signal may be a control signal.
  • the processor 170 may provide a control signal for acceleration to the power source driver 751.
  • the processor 170 may provide a steering control signal to the steering driver 752 through the interface unit 130.
  • the processor 170 may provide a control signal for deceleration to the brake driver 753 through the interface unit 130.
  • the memory 140 may store various data for the overall operation of the around view providing apparatus 100, such as a program for processing or controlling the processor 170.
  • the memory 140 may store data for identifying an object. For example, when a predetermined object is detected in the image acquired through the camera 195, the memory 140 may store data for identifying what the object corresponds to by a predetermined algorithm.
  • the memory 140 may store data about traffic information. For example, when predetermined traffic information is detected in the image acquired through the camera 195, the memory 140 may store data for identifying, by a predetermined algorithm, what the traffic information corresponds to.
  • the memory 140 may be various storage devices such as a ROM, a RAM, an EPROM, a flash drive, a hard drive, and the like, in hardware.
  • the processor 170 controls the overall operation of each unit in the around view providing apparatus 100.
  • the processor 170 may process the vehicle surrounding image acquired by the camera 195. In particular, the processor 170 performs signal processing based on computer vision.
  • the processor 170 may perform object detection and object tracking.
  • the processor 170 may perform lane detection (LD), vehicle detection (VD), pedestrian detection (PD), light spot detection (BD), traffic sign recognition (TSR), road surface detection, and the like.
  • the processor 170 may detect information in the vehicle surrounding image acquired by the camera 195.
  • the information may be information about a vehicle driving situation.
  • the information may be a concept including road information, traffic law information, surrounding vehicle information, vehicle or pedestrian traffic light information, construction information, traffic condition information, parking lot information, lane information, and the like.
  • the processor 170 may check the information by comparing the detected information with the information stored in the memory 140.
  • the processor 170 may receive weather information, road traffic state information, for example, TPEG (Transport Protocol Expert Group) information through the communication unit 120.
  • Accordingly, the processor 170 in the around view providing apparatus 100 may grasp, in real time, the traffic situation information around the vehicle determined based on the image.
  • the processor 170 may receive navigation information and the like from the vehicle display apparatus 400 or a separate navigation device (not shown) through the interface unit 130.
  • the processor 170 may receive sensor information from the controller 770 or the sensing unit 760 through the interface unit 130.
  • the sensor information may include at least one of vehicle direction information, vehicle position information (GPS information), vehicle angle information, vehicle speed information, vehicle acceleration information, vehicle tilt information, vehicle forward/reverse information, battery information, fuel information, tire information, vehicle lamp information, vehicle interior temperature information, vehicle interior humidity information, and steering wheel rotation information.
  • the processor 170 may obtain a plurality of images from the plurality of cameras 195 and combine the plurality of images to generate an around view image.
  • the processor 170 may display an around view image on the display 180.
  • the around view image may be a first view point image.
  • the around view image may be a top view image or a bird eye view image.
  • the processor 170 may switch the view point based on the object in the surrounding image of the vehicle or the first view point image.
  • the first view point image may be a top view image.
  • the processor 170 may detect an object from a vehicle surrounding image based on computer vision-based signal processing.
  • the processor 170 may track the detected object. An object detecting operation based on the vehicle surrounding image will be described later with reference to FIGS. 5A to 5C.
  • the processor 170 may calculate the distance to the object.
  • For example, the processor 170 may calculate the distance to the object by comparing the images of successive frames with one another in a vehicle surrounding image comprising a plurality of frames.
  • the processor 170 may receive a vehicle left peripheral image including a plurality of frames through the left camera 195a.
  • the processor 170 may detect and track an object located in the left side of the vehicle 700 for each frame.
  • In this case, as the vehicle 700 moves, a difference (disparity) occurs between the images formed in successive frames.
  • the processor 170 may generate a depth map by comparing the plurality of images formed for each frame. In this case, the depth map may be generated by matching each pixel or a predetermined block unit.
  • the processor 170 may acquire disparity information about the left side of the vehicle through the depth map.
  • the processor 170 may calculate a distance from the object based on the disparity information.
  • In the same manner as calculating the distance to an object from the left peripheral image described above, the processor 170 may calculate the distance to an object from the right peripheral image received from the right camera 195c, the rear peripheral image received from the rear camera 195b, and the front peripheral image received from the front camera 195d.
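  • The sketch below illustrates one way the frame-to-frame comparison above could be realized, treating two consecutive frames of a single moving camera as a stereo pair; the block matcher, its parameters, and the baseline derivation are assumptions, not details given in the patent:

```python
import cv2

def distances_from_motion_stereo(prev_gray, curr_gray, baseline_m, focal_px):
    """Per-pixel distance from two consecutive grayscale frames of one
    moving camera, treated as a stereo pair (motion stereo). The
    baseline is assumed to come from vehicle speed x frame interval."""
    matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
    # StereoBM returns fixed-point disparities scaled by 16
    disparity = matcher.compute(prev_gray, curr_gray).astype("float32") / 16.0
    disparity[disparity <= 0] = 0.1  # guard against invalid matches
    # standard stereo relation: depth = focal_length * baseline / disparity
    return focal_px * baseline_m / disparity
```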
  • the processor 170 may generate the depth map by receiving information from the distance detector 150 of FIG. 4B.
  • object information may be calculated based on a time of flight (TOF) of infrared light, and a depth map may be generated based on the calculated object information.
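  • A minimal sketch of the time-of-flight relation relied on above (the factor of 2 accounts for the round trip of the infrared light):

```python
C_M_PER_S = 299_792_458.0  # speed of light, m/s

def tof_distance_m(round_trip_s: float) -> float:
    """Distance from the time of flight of an infrared pulse: the light
    travels to the object and back, so the one-way distance is c*t/2."""
    return C_M_PER_S * round_trip_s / 2.0
```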
  • the processor 170 may switch the view point based on the object in the vehicle surrounding image or the first view point image.
  • the processor 170 may switch the view point according to a preset view parameter in the vehicle surrounding image or the around view image.
  • the values of the view parameter may be defined in a look up table.
  • the lookup table may be stored in the memory 140.
  • the processor 170 may extract predetermined feature points from the vehicle surrounding image or the first view point image.
  • the processor 170 may convert the view point through a world transformation, a viewing transformation, a projection transformation algorithm, or the like based on the feature points.
  • the processor 170 may switch the view point based on the depth map including the 3D information.
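  • As a simplified illustration of such a viewpoint switch, the sketch below re-renders the area around a detected object with a single perspective warp; a full implementation would chain the world, viewing, and projection transforms over the depth map, and the view parameters here stand in for the lookup-table values mentioned above:

```python
import cv2
import numpy as np

def switch_viewpoint(top_view, object_bbox, out_size=(320, 240)):
    """Re-render the region around a detected object from an oblique
    virtual viewpoint using one homography. Collapsing the transform
    chain into a single warp, and the numbers below, are assumptions."""
    x, y, w, h = object_bbox
    src = np.float32([[x, y], [x + w, y], [x + w, y + h], [x, y + h]])
    out_w, out_h = out_size
    # narrow the far edge to mimic a tilted, object-centered view
    dst = np.float32([[out_w * 0.2, 0], [out_w * 0.8, 0],
                      [out_w, out_h], [0, out_h]])
    H = cv2.getPerspectiveTransform(src, dst)
    return cv2.warpPerspective(top_view, H, out_size)
```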
  • the processor 170 may display the view point switching image on the display unit 180.
  • the image may be a second view point image.
  • the second viewpoint image may have a different viewpoint than the first viewpoint image.
  • the viewpoint change image may be an image viewed from an angle at which the detected object is easy to identify.
  • the viewpoint change image may be an image centered on the detected object.
  • the processor 170 may display distance information with respect to the object or height information of the object on the viewpoint change image.
  • the processor 170 may display, on the viewpoint change image, an alarm message warning of a possible collision with the object.
  • the processor 170 may display a driving guide message on the viewpoint change image.
  • the driving guide message may be an acceleration, deceleration, or steering guide message for avoiding the object or for safely passing over the object.
  • the view point switching image may be an image for checking the height of the object and the distance to the object.
  • the processor 170 may display the distance to the object or the height of the object on the viewpoint change image.
  • the distance from the object may be a distance from the vehicle body of the vehicle 700 to the object.
  • the distance from the object may be a distance from the wheel of the vehicle 700 to the object.
  • the processor 170 may switch the viewpoint based on the object.
  • When the vehicle 700 or the object is moving, the object may approach the vehicle 700. In this case, the distance between the vehicle 700 and the object decreases.
  • the processor 170 may switch the viewpoint based on the object and display the viewpoint change image on the display unit 180.
  • Since the viewpoint change image is displayed, the user can recognize the existence of the object, which helps prevent a collision with the object.
  • the processor 170 may switch the viewpoint based on the object.
  • the object may include any one of a curb, a stopper, a speed bump, a traffic cone, and a safety fence.
  • the processor 170 may switch the viewpoint based on the object and display the viewpoint change image through the display unit 180.
  • Since the viewpoint change image is displayed, the user can recognize the existence of the object, which helps prevent a collision with the object.
  • the processor 170 may generate a plurality of viewpoint change images. For example, the processor 170 may generate a plurality of viewpoint change images of the object viewed from a plurality of angles around the object.
  • the processor 170 may display, on the display unit 180, the plurality of viewpoint change images at various angles around the object.
  • the processor 170 may sequentially display, on the display unit 180, a plurality of images whose viewpoints are switched around the object at predetermined time intervals.
  • the processor 170 may simultaneously display the first viewpoint image and the second viewpoint image in which the viewpoint is switched on the display unit 180. In this case, the processor 170 may match and display an object included in the view point switching image and an object included in the first view point image. By matching and displaying the objects as described above, the user can more clearly identify the objects and recognize the positional relationship between the objects and the vehicle.
  • the processor 170 may provide a control signal to the vehicle driver 750.
  • the control signal may be directly transmitted to the vehicle driver 750 or may be transmitted through the controller 770.
  • the processor 170 may provide a control signal to at least one of the power source driver 751, the steering driver 752, the brake driver 753, and the suspension driver 759.
  • the processor 170 may provide the steering driver 752 with a steering control signal for avoiding collision with the object.
  • the processor 170 may provide a brake control signal to the brake driver 753 to prevent a collision with an object.
  • the processor 170 may provide the suspension driver 759 with a suspension control signal for raising the height of the vehicle body above the height of the object.
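  • The sketch below illustrates, under assumed thresholds, how the processor might choose among the steering, brake, and suspension control signals described above; this decision logic is an illustration, not logic specified by the patent:

```python
def control_signals(distance_m, object_height_m, max_clearance_m):
    """Pick one of the control signals described above. All thresholds
    and the driver names in the returned dict are assumptions."""
    signals = {}
    if object_height_m < max_clearance_m:
        # low object (e.g. a stopper): raise the body above it
        signals["suspension_driver_759"] = {"raise_body": True}
    elif distance_m < 2.0:
        signals["brake_driver_753"] = {"decelerate": True}
    else:
        signals["steering_driver_752"] = {"avoid_object": True}
    return signals
```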
  • the processor 170 may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and electrical units for performing other functions.
  • the processor 170 may be controlled by the controller 770.
  • the display unit 180 may display various types of information processed by the processor 170.
  • the display unit 180 may display an image related to the operation of the around view providing apparatus 100.
  • the display unit 180 may display a first viewpoint image or a second viewpoint image under the control of the processor 170.
  • the display unit 180 may display an around view image generated by the processor 170.
  • Together with the around view image, the display unit 180 may provide various user interfaces, and may include a touch sensor enabling touch input to the provided user interfaces.
  • the display unit 180 may display an around view image generated by synthesizing the images acquired by the plurality of cameras 195.
  • the around view image may be a first view point image.
  • the around view image may be a top view image or a bird eye view image.
  • the display unit 180 may display a second view point image.
  • the display unit 180 may be implemented such that an image is displayed on the rearview mirror, the side mirror, or the side window glass.
  • the display unit 180 may be disposed in the rearview mirror or the side mirror.
  • the display unit 180 may normally act as a mirror and display an image when a predetermined event occurs.
  • the display unit 180 may be formed as a transparent display and disposed close to the side window glass.
  • the display unit 180 may include a projection module, and the projection module may project an image on the side window glass.
  • the display unit 180 may be implemented to display an image on the front windshield.
  • the display unit 180 may be formed of a transparent display and disposed close to the front windshield.
  • the display unit 180 may include a projection module, and the projection module may project an image on the front windshield.
  • the audio output unit 185 may output sound to the outside based on the audio signal processed by the processor 170.
  • the audio output unit 185 may include at least one speaker.
  • the power supply unit 190 may supply power required for the operation of each component under the control of the processor 170.
  • the power supply unit 190 may receive power from a battery inside the vehicle.
  • the camera 195 may acquire a vehicle surrounding image.
  • the camera 195 may be plural.
  • the plurality of cameras 195 may acquire a vehicle surrounding image photographed at each of the arranged positions.
  • the camera 195 may include a left camera 195a disposed on the left side of the vehicle 700 to obtain a left peripheral image, a right camera 195c disposed on the right side of the vehicle 700 to obtain a right peripheral image, a rear camera 195b disposed at the rear of the vehicle 700 to obtain a rear peripheral image, and a front camera 195d disposed at the front of the vehicle 700 to obtain a front peripheral image.
  • each of the left camera 195a, the right camera 195c, the rear camera 195b and the front camera 195d preferably faces the ground to some extent.
  • the camera 195 is a camera for providing an around view image, and is preferably a wide angle camera. According to an embodiment, the camera 195 may include a fisheye lens.
  • the camera 195 may process infrared light and visible light together.
  • the camera 195 may include an optical output unit for outputting infrared light, a beam splitter for splitting received light into infrared light and visible light, a first optical sensor for processing the infrared light, and a second optical sensor for processing the visible light.
  • the processor 170 may calculate a distance to the object based on the infrared light.
  • the processor 170 may process an image based on visible light.
  • the around view providing apparatus of FIG. 4B is similar to the around view providing apparatus of FIG. 4A, except that the apparatus 100 further includes a distance detector 150.
  • the distance detector 150 will be described.
  • the distance detector 150 may detect an object.
  • the distance detector 150 may detect a distance from the detected object.
  • the distance detector 150 may include at least one of an ultrasonic sensor, a lidar, a radar, and a TOF camera.
  • the object information detected by the distance detector 150 may be provided to the processor 170.
  • the object information may include distance information with respect to the object.
  • the processor 170 may receive object information from the distance detector 150. When an object is detected through the distance detector 150, the processor 170 may switch the viewpoint based on the object in the vehicle surrounding image or the top view image, and display the viewpoint change image on the display unit 180.
  • FIGS. 5A and 5B illustrate internal block diagrams of the processor of FIGS. 4A and 4B.
  • FIG. 5C is a diagram illustrating object detection in the processor of FIGS. 5A and 5B.
  • FIG. 5A is an example of an internal block diagram of the processor 170.
  • the processor 170 of the around view providing apparatus 100 for a vehicle may include an image preprocessor 410, a disparity calculator 420, an object detector 434, an object tracking unit 440, and an application unit 450.
  • the image preprocessor 410 may perform preprocessing by receiving a plurality of images from the plurality of cameras 195a,..., 195d or generated around view images.
  • the image preprocessor 410 may perform noise reduction, rectification, calibration, color enhancement, color space conversion (CSC), interpolation, camera gain control, and the like, on the plurality of images or the generated around view image. Accordingly, an image sharper than the images captured by the plurality of cameras 195a, ..., 195d or the generated around view image may be obtained.
  • the disparity calculator 420 receives the plurality of images or the generated around view image signal-processed by the image preprocessor 410, performs stereo matching on the images received sequentially over a predetermined time, and obtains a disparity map according to the stereo matching. That is, disparity information about the surroundings of the vehicle can be obtained.
  • the stereo matching may be performed in units of pixels or in units of predetermined blocks of the images.
  • the disparity map may refer to a map that numerically represents the disparity information (binocular parallax information) of the image, that is, left and right images.
  • the segmentation unit 432 may perform segmentation and clustering in the image based on the disparity information from the disparity calculator 420.
  • the segmentation unit 432 may separate a background and a foreground from at least one of the images based on the disparity information.
  • For example, an area in which the disparity information is less than or equal to a predetermined value in the disparity map may be computed as the background and excluded. Thereby, the foreground can be relatively separated.
  • As another example, an area in which the disparity information is greater than or equal to a predetermined value in the disparity map may be computed as the foreground and extracted. Thereby, the foreground can be separated.
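  • A minimal sketch of this disparity thresholding, assuming a dense disparity map as input:

```python
import numpy as np

def split_by_disparity(disparity_map, threshold):
    """Foreground/background separation on a disparity map: pixels at
    or above the threshold are closer to the vehicle (foreground),
    the rest are background, as described above."""
    foreground = disparity_map >= threshold
    return foreground, ~foreground
```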
  • the object detector 434 may detect the object based on the image segment from the segmentation unit 432.
  • the object detector 434 may detect an object with respect to at least one of the images based on the disparity information.
  • the object detector 434 may detect an object with respect to at least one of the images.
  • an object can be detected from the foreground separated by image segments.
  • an object verification unit 436 classifies and verifies the separated object.
  • the object verification unit 436 may use an identification method using a neural network, a support vector machine (SVM) method, an identification method by AdaBoost using Haar-like features, a histograms of oriented gradients (HOG) method, or the like.
  • the object verification unit 436 may verify the objects by comparing the detected objects with objects stored in the memory 140.
  • the object verification unit 436 may verify surrounding vehicles, lanes, road surfaces, signs, dangerous areas, tunnels, and the like, which are located around the vehicle.
  • the object tracking unit 440 performs tracking on the identified object. For example, it may sequentially identify the object in the acquired images, calculate the motion or motion vector of the identified object, and track the movement of the object based on the calculated motion or motion vector. Accordingly, it is possible to track surrounding vehicles, lanes, road surfaces, signs, dangerous areas, and the like, which are located around the vehicle.
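  • The sketch below shows one concrete instantiation of the detection, verification, and tracking stages, using OpenCV's HOG descriptor with its default pedestrian SVM as a stand-in verifier and a bounding-box motion vector for tracking; the patent names these only as candidate techniques:

```python
import cv2

# one illustrative detector/verifier pairing: HOG features with the
# default pedestrian SVM shipped with OpenCV
hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

def detect_pedestrians(frame):
    """Return candidate pedestrian bounding boxes with confidences."""
    boxes, weights = hog.detectMultiScale(frame, winStride=(8, 8))
    return list(zip(boxes, weights))

def motion_vector(prev_box, curr_box):
    """Track a verified object across frames by the displacement of its
    bounding-box center, as the object tracking unit 440 does."""
    px, py, pw, ph = prev_box
    cx, cy, cw, ch = curr_box
    return (cx + cw / 2) - (px + pw / 2), (cy + ch / 2) - (py + ph / 2)
```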
  • 5B is another example of an internal block diagram of a processor.
  • the processor 170 of FIG. 5B has the same internal configuration as the processor 170 of FIG. 5A, but differs in signal processing order. Only the differences are described below.
  • the object detector 434 may receive the plurality of images or the generated around view image, and detect objects in the plurality of images or the generated around view image. Unlike FIG. 5A, the object may be detected directly from the plurality of images or the generated around view image, instead of being detected from a segmented image based on the disparity information.
  • the object verification unit 436 classifies and verifies the detected and separated objects, based on the image segments from the segmentation unit 432 and the objects detected by the object detector 434.
  • In this case as well, the object verification unit 436 may use an identification method using a neural network, a support vector machine (SVM) method, an identification method by AdaBoost using Haar-like features, a histograms of oriented gradients (HOG) method, or the like.
  • FIG. 5C is a diagram referred to for describing an operating method of the processor 170, based on images respectively acquired in first and second frame sections.
  • the plurality of cameras 195a,..., 195d respectively acquire images FR1a and FR1b sequentially.
  • the disparity calculator 420 in the processor 170 receives the images FR1a and FR1b signal-processed by the image preprocessor 410, and performs stereo matching on the received images FR1a and FR1b to obtain a disparity map 520.
  • the disparity map 520 levels the disparity between the images FR1a and FR1b; the greater the disparity level, the closer the distance to the vehicle, and the smaller the disparity level, the farther the distance can be calculated to be.
  • Meanwhile, when such a disparity map is displayed, it may be displayed such that the larger the disparity level, the higher the luminance, and the smaller the disparity level, the lower the luminance.
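  • A minimal sketch of this display convention, normalizing disparity to luminance:

```python
import cv2
import numpy as np

def disparity_to_luminance(disparity_map):
    """Render a disparity map with the convention described above:
    larger disparity (closer object) maps to higher luminance."""
    vis = cv2.normalize(disparity_map, None, 0, 255, cv2.NORM_MINMAX)
    return vis.astype(np.uint8)
```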
  • In the disparity map 520, the first to fourth lanes 528a, 528b, 528c, and 528d, the construction area 522, the first front vehicle 524, and the second front vehicle 526 each have a corresponding disparity level.
  • the segmentation unit 432, the object detector 434, and the object verification unit 436 perform segmentation, object detection, and object verification on at least one of the images FR1a and FR1b based on the disparity map 520.
  • Thus, object detection and verification may be performed for the first to fourth lanes 538a, 538b, 538c, and 538d, the construction area 532, the first front vehicle 534, and the second front vehicle 536.
  • the object tracking unit 440 may perform tracking on the identified object.
  • FIG. 6 is a block diagram referred to for describing a camera according to an embodiment of the present invention.
  • the camera 195 may include an optical output unit 610, a first optical sensor 620, a second optical sensor 630, and a beam splitter 640.
  • the light output unit 610 may output infrared light.
  • the light output unit 610 may include a light source and a lens for generating infrared light.
  • the first optical sensor 620 may process infrared light.
  • the first optical sensor 620 may convert infrared light into an electrical signal.
  • the first photosensor 620 may include at least one photodiode.
  • the first optical sensor 620 may include a complementary metal-oxide-semiconductor (CMOS) or a charge coupled device (CCD).
  • the second optical sensor 630 may process visible light.
  • the second optical sensor 630 may convert visible light into an electrical signal.
  • the second photosensor 630 may include at least one photodiode.
  • the second optical sensor 630 may include a complementary metal-oxide-semiconductor (CMOS) or a charge coupled device (CCD).
  • the beam splitter 640 may separate the received light into infrared light and visible light.
  • the beam splitter 640 may guide the infrared light separated from the received light to the first optical sensor 620.
  • the beam splitter 640 may guide visible light separated from the received light to the second optical sensor 630.
  • each camera may include the above-described light output unit, a first optical sensor, a second optical sensor, and a beam splitter.
  • the processor 170 may calculate a distance to an object based on a time of flight (TOF) of infrared light sensed by the first optical sensor 620.
  • the processor 170 may perform computer vision-based image processing based on the visible light sensed by the second optical sensor 630.
  • FIGS. 7A to 7C are flowcharts for describing an operation of the apparatus for providing an around view for a vehicle according to an exemplary embodiment of the present invention.
  • the processor 170 may receive a vehicle surrounding image acquired by the camera 195 (S610).
  • the camera 195 may be a plurality.
  • the camera 195 may include a left camera 195a disposed on the left side of the vehicle 700 to obtain a left peripheral image, a right camera 195c disposed on the right side of the vehicle 700 to obtain a right peripheral image, a rear camera 195b disposed at the rear of the vehicle 700 to obtain a rear peripheral image, and a front camera 195d disposed at the front of the vehicle 700 to obtain a front peripheral image.
  • the processor 170 may display an around view image generated based on the surrounding image of the vehicle through the display unit 180 (S620). For example, the processor 170 may combine a plurality of surrounding images, convert the combined image into a top view or a bird eye view, and display the combined image on the display unit 180. In this case, the processor 170 may generate and display a virtual vehicle image (700i of FIG. 3) corresponding to the vehicle 700.
  • the processor 170 may detect an object (S630).
  • the processor 170 may detect and track an object based on the surrounding image of the vehicle or the top view image.
  • the processor 170 may detect and track an object through the distance detector 150 of FIG. 4B.
  • the distance detector 150 of FIG. 4B may include at least one of an ultrasonic sensor, a lidar, a radar, and a TOF camera.
  • the processor 170 may detect and track an object based on the TOF of the received infrared light.
  • the camera 195 may include an optical output unit for outputting infrared light, a beam splitter for splitting received light into infrared light and visible light, a first optical sensor for processing the infrared light, and a second optical sensor for processing the visible light.
  • the processor 170 may calculate a distance to the object based on the infrared light.
  • the processor 170 may process an image based on visible light.
  • the processor 170 may switch the viewpoint of the vehicle surrounding image or the top view image to an object-centered viewpoint, and display the viewpoint change image on the display unit 180 (S660).
  • The operation of the around view providing apparatus of FIG. 7B is similar to the operation of the around view providing apparatus of FIG. 7A, except that step S640 is further included.
  • The following description focuses on step S640.
  • the processor 170 may determine whether the height of the detected object is equal to or less than a reference value in operation S640.
  • the object may include any one of a curb, a stopper, a speed bump, a traffic cone, and a safety fence.
  • When the height of the object is equal to or less than the reference value, the processor 170 may switch the viewpoint of the vehicle surrounding image or the top view image to an object-centered viewpoint and display the viewpoint change image on the display unit 180 (S660).
  • the processor 170 may switch the viewpoint based on the object and display the viewpoint change image through the display unit 180.
  • Since the viewpoint change image is displayed, the user can recognize the existence of the object, which helps prevent a collision with the object.
  • The operation of the around view providing apparatus of FIG. 7C is similar to the operation of the around view providing apparatus of FIG. 7A, except that step S650 is further included.
  • the processor 170 may determine whether a distance between the detected object and the vehicle 700 is equal to or less than a reference value (S650).
  • When the distance is equal to or less than the reference value, the processor 170 may switch the viewpoint of the vehicle surrounding image or the top view image to an object-centered viewpoint and display the viewpoint change image on the display unit 180 (S660).
  • When the vehicle 700 or the object is moving, the object may approach the vehicle 700. In this case, the distance between the vehicle 700 and the object decreases.
  • the processor 170 may switch the viewpoint based on the object and display the viewpoint change image on the display unit 180.
  • Since the viewpoint change image is displayed, the user can recognize the existence of the object, which helps prevent a collision with the object.
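  • Combining the checks of FIGS. 7B (S640) and 7C (S650), the decision to switch the viewpoint might be sketched as follows; the reference values are illustrative assumptions, not values from the patent:

```python
def should_switch_viewpoint(object_height_m, distance_m,
                            height_ref_m=0.3, distance_ref_m=1.5):
    """Switch to an object-centered viewpoint when the detected object
    is low (height at or below a reference, e.g. a curb or stopper)
    and close to the vehicle (distance at or below a reference)."""
    return object_height_m <= height_ref_m and distance_m <= distance_ref_m
```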
  • FIG. 8 is a block diagram referred to for explaining a vehicle according to an embodiment of the present invention.
  • the vehicle 700 may include a communication unit 710, an input unit 720, a sensing unit 760, an output unit 740, a vehicle driver 750, a memory 730, an interface unit 780, a controller 770, a power supply unit 790, an around view providing apparatus 100, and a vehicle display apparatus 400.
  • the communication unit 710 may include one or more modules enabling wireless communication between the vehicle 700 and the mobile terminal 600, between the vehicle 700 and the external server 601, or between the vehicle 700 and another vehicle 602. In addition, the communication unit 710 may include one or more modules for connecting the vehicle 700 to one or more networks.
  • the communication unit 710 may include a broadcast receiving module 711, a wireless internet module 712, a short range communication module 713, a location information module 714, an optical communication module 715, and a V2X communication module 716.
  • the broadcast receiving module 711 receives a broadcast signal or broadcast related information from an external broadcast management server through a broadcast channel.
  • the broadcast includes a radio broadcast or a TV broadcast.
  • the wireless internet module 712 refers to a module for wireless internet access and may be embedded or external to the vehicle 700.
  • the wireless internet module 712 is configured to transmit and receive wireless signals in a communication network in accordance with wireless internet technologies.
  • Wireless internet technologies include wireless LAN (WLAN), wireless fidelity (Wi-Fi), Wi-Fi Direct, digital living network alliance (DLNA), wireless broadband (WiBro), world interoperability for microwave access (WiMAX), high speed downlink packet access (HSDPA), high speed uplink packet access (HSUPA), long term evolution (LTE), long term evolution-advanced (LTE-A), and the like. The wireless internet module 712 transmits and receives data according to at least one wireless internet technology, including internet technologies not listed above.
  • the wireless internet module 712 may exchange data wirelessly with the external server 601.
  • the wireless internet module 712 may receive weather information and road traffic information (eg, TPEG (Transport Protocol Expert Group)) information from the external server 601.
  • the short range communication module 713 is for short range communication and may support short range communication using at least one of Bluetooth(TM), radio frequency identification (RFID), infrared data association (IrDA), ultra wideband (UWB), ZigBee, near field communication (NFC), wireless fidelity (Wi-Fi), Wi-Fi Direct, and wireless universal serial bus (Wireless USB) technologies.
  • the short range communication module 713 may form short range wireless communication networks to perform short range communication between the vehicle 700 and at least one external device. For example, the short range communication module 713 may exchange data with the mobile terminal 600 wirelessly.
  • the short range communication module 713 may receive weather information and traffic condition information of a road (for example, a transport protocol expert group (TPEG)) from the mobile terminal 600. For example, when the user boards the vehicle 700, the mobile terminal 600 and the vehicle 700 of the user may perform pairing with each other automatically or by executing an application of the user.
  • the location information module 714 is a module for obtaining the location of the vehicle 700, and a representative example thereof is a GPS (Global Positioning System) module.
  • the vehicle may acquire the position of the vehicle using a signal transmitted from a GPS satellite.
  • the optical communication module 715 may include an optical transmitter and an optical receiver.
  • the light receiver may convert the light signal into an electrical signal to receive information.
  • the light receiver may include a photo diode (PD) for receiving light.
  • Photodiodes can convert light into electrical signals.
  • the light receiver may receive information of the front vehicle through the light emitted from the light source included in the front vehicle.
  • the light emitter may include at least one light emitting device for converting an electrical signal into an optical signal.
  • For example, the light emitting element may be a light emitting diode (LED).
  • the light emitting unit converts the electric signal into an optical signal and transmits it to the outside.
  • the light transmitting unit may emit an optical signal to the outside by blinking the light emitting element at a predetermined frequency.
  • the light emitting unit may include a plurality of light emitting element arrays.
  • the light emitting unit may be integrated with a lamp provided in the vehicle 700.
  • the light emitting unit may be at least one of a headlight, a taillight, a brake light, a turn signal, and a vehicle width lamp.
  • the optical communication module 715 may exchange data with another vehicle 602 through optical communication.
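  • As an illustration of the optical signal emission described above, the sketch below encodes a bit sequence as an on-off blink schedule; the actual modulation scheme is not specified in the patent, so on-off keying here is an assumed illustration:

```python
def blink_schedule(bits, period_s=0.001):
    """On-off keying sketch for the optical transmitter: each bit maps
    to one blink period of the light emitting element (1 = on, 0 = off).
    The bit period is an assumed parameter."""
    return [(bit == 1, period_s) for bit in bits]
```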
  • the V2X communication module 716 is a module for performing wireless communication with the server 601 or another vehicle 602.
  • the V2X module 716 includes a module capable of implementing a vehicle-to-vehicle (V2V) communication protocol or a vehicle-to-infrastructure (V2I) communication protocol.
  • the vehicle 700 may perform wireless communication with the external server 601 and another vehicle 602 through the V2X communication module 716.
  • the input unit 720 may include a driving manipulation unit 721, a camera 195, a microphone 723, and a user input unit 724.
  • the driving operation means 721 receives a user input for driving the vehicle 700.
  • the driving manipulation means 721 may include a steering input means 721a, a shift input means 721b, an acceleration input means 721c, and a brake input means 721d.
  • the steering input means 721a receives an input of a traveling direction of the vehicle 700 from the user.
  • the steering input means 721a is preferably formed in a wheel shape to enable steering input by rotation.
  • the steering input means 721a may be formed of a touch screen, a touch pad, or a button.
  • the shift input means 721b receives an input for park (P), drive (D), neutral (N), and reverse (R) of the vehicle 700 from the user.
  • the shift input means 721b is preferably formed in the form of a lever.
  • the shift input unit 721b may be formed as a touch screen, a touch pad, or a button.
  • the acceleration input means 721c receives an input for accelerating the vehicle 700 from the user.
  • the brake input means 721d receives an input for deceleration of the vehicle 700 from the user.
  • the acceleration input means 721c and the brake input means 721d are preferably formed in the form of a pedal. According to an embodiment, the acceleration input means 721c or the brake input means 721d may be formed as a touch screen, a touch pad, or a button.
  • the camera 195 may include an image sensor and an image processing module.
  • the camera 195 may process a still image or a moving image obtained by an image sensor (eg, CMOS or CCD).
  • the image processing module may process the still image or the moving image acquired through the image sensor, extract necessary information, and transfer the extracted information to the controller 770.
  • the vehicle 700 may include a camera 195 for capturing a vehicle front image or a vehicle surrounding image and an internal camera for capturing an interior image of the vehicle.
  • the internal camera can acquire an image of the passenger.
  • the internal camera may acquire an image for biometric recognition of the passenger.
  • Meanwhile, although the camera 195 is illustrated here as being included in the input unit 720, the camera 195 may also be described as a component included in the around view providing apparatus 100, as discussed with reference to FIGS. 1 to 7.
  • the microphone 723 may process an external sound signal into electrical data.
  • the processed data may be utilized in various ways depending on the function being performed in the vehicle 700.
  • the microphone 723 may convert the user's voice command into electrical data.
  • the converted electrical data may be transferred to the controller 770.
  • the camera 722 or the microphone 723 may be a component included in the sensing unit 760, not a component included in the input unit 720.
  • the user input unit 724 is for receiving information from a user. When information is input through the user input unit 724, the controller 770 may control the operation of the vehicle 700 to correspond to the input information.
  • the user input unit 724 may include a touch input means or a mechanical input means. According to an embodiment, the user input unit 724 may be disposed in one region of the steering wheel. In this case, the driver may manipulate the user input unit 724 with a finger while holding the steering wheel.
  • the sensing unit 760 senses a signal related to driving of the vehicle 700.
  • the sensing unit 760 may include a collision sensor, a wheel sensor, a speed sensor, a tilt sensor, a weight sensor, a heading sensor, a yaw sensor, a gyro sensor, a position module, a vehicle forward/reverse sensor, a battery sensor, a fuel sensor, a tire sensor, a steering sensor based on steering wheel rotation, a vehicle interior temperature sensor, a vehicle interior humidity sensor, a rain sensor, an ultrasonic sensor, a radar, a light detection and ranging (LiDAR) sensor, and the like.
  • Accordingly, the sensing unit 760 may acquire sensing signals for vehicle collision information, vehicle direction information, vehicle position information (GPS information), vehicle angle information, vehicle speed information, vehicle acceleration information, vehicle tilt information, vehicle forward/reverse information, battery information, fuel information, tire information, vehicle lamp information, vehicle interior temperature information, vehicle interior humidity information, rain information, steering wheel rotation angle, and the like.
  • In addition, the sensing unit 760 may further include an accelerator pedal sensor, a pressure sensor, an engine speed sensor, an air flow sensor (AFS), an intake air temperature sensor (ATS), a water temperature sensor (WTS), a throttle position sensor (TPS), a TDC sensor, a crank angle sensor (CAS), and the like.
  • the sensing unit 760 may include a biometric information detecting unit.
  • the biometric information detector detects and acquires biometric information of the passenger.
  • The biometric information may include fingerprint information, iris-scan information, retina-scan information, hand geometry information, facial recognition information, and voice recognition information.
  • the biometric information sensing unit may include a sensor for sensing biometric information of the passenger.
  • the internal camera and the microphone 723 may operate as a sensor.
  • the biometric information detecting unit may acquire hand shape information and face recognition information through an internal camera.
  • the output unit 740 outputs the information processed by the controller 770 and may include a display unit 741, a sound output unit 742, and a haptic output unit 743.
  • the display 741 may display information processed by the controller 770.
  • the display unit 741 may display vehicle related information.
  • the vehicle related information may include vehicle control information for direct control of the vehicle, or vehicle driving assistance information for driving guide to the vehicle driver.
  • the vehicle related information may include vehicle state information indicating a current state of a vehicle or vehicle driving information related to driving of the vehicle.
  • the display unit 741 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT LCD), an organic light-emitting diode (OLED) display, a flexible display, a 3D display, or an e-ink display.
  • the display unit 741 forms a layer structure with or is integrally formed with the touch sensor, thereby implementing a touch screen.
  • a touch screen may provide an output interface between the vehicle 700 and the user while functioning as a user input unit 724 that provides an input interface between the vehicle 700 and the user.
  • the display unit 741 may include a touch sensor that senses a touch on the display unit 741 to receive a control command by a touch method. Using this, when a touch is made to the display unit 741, the touch sensor may sense the touch and the controller 770 may generate a control command corresponding to the touch based on the touch sensor.
  • the content input by the touch method may be letters or numbers or menu items that can be indicated or designated in various modes.
  • the display unit 741 may be implemented as a cluster so that the driver can check vehicle status information or vehicle driving information while driving.
  • the cluster can be located on the dashboard. In this case, the driver can check the information displayed on the cluster while keeping the gaze in front of the vehicle.
  • the display unit 741 may be implemented as a head up display (HUD).
  • in this case, information may be output through a transparent display provided on the windshield.
  • alternatively, the display unit 741 may include a projection module to output information through an image projected on the windshield.
  • the sound output unit 742 converts the electrical signal from the control unit 770 into an audio signal and outputs the audio signal.
  • the sound output unit 742 may include a speaker.
  • the sound output unit 742 may output a sound corresponding to the operation of the user input unit 724.
  • the haptic output unit 743 generates a tactile output.
  • the haptic output unit 743 may vibrate the steering wheel, the seat belt, and the seat so that the user can recognize the output.
  • the vehicle driver 750 may control operations of various apparatuses in the vehicle 700.
  • the vehicle driver 750 may receive a control signal from the around view providing apparatus 100.
  • the vehicle driver 750 may control each device based on the control signal.
  • the vehicle driver 750 may include a power source driver 751, a steering driver 752, a brake driver 753, a lamp driver 754, an air conditioning driver 755, a window driver 756, an airbag driver 757, a sunroof driver 758, and a suspension driver 759.
  • the power source driver 751 may perform electronic control of the power source in the vehicle 700.
  • when an engine is the power source, the power source driver 751 may perform electronic control of the engine. Thereby, the output torque of the engine can be controlled.
  • when the power source is an engine, the power source driver 751 may limit the speed of the vehicle by limiting the engine output torque under the control of the controller 770.
  • when an electric motor is the power source, the power source driver 751 may perform control of the motor. Thereby, the rotation speed, torque, and the like of the motor can be controlled.
  • the power source driver 751 may receive an acceleration control signal from the around view providing device 100.
  • the power source driver 751 may control the power source according to the received acceleration control signal.
  • the steering driver 752 may perform electronic control of a steering apparatus in the vehicle 700. As a result, the traveling direction of the vehicle can be changed.
  • the steering driver 752 may receive a steering control signal from the around view providing apparatus 100.
  • the steering driver 752 may control the steering apparatus to steer according to the received steering control signal.
  • the brake driver 753 may perform electronic control of a brake apparatus (not shown) in the vehicle 700. For example, the speed of the vehicle 700 may be reduced by controlling the operation of the brake disposed on the wheel. As another example, by varying the operation of the brakes disposed on the left wheels and the right wheels, the traveling direction of the vehicle 700 may be adjusted to the left or the right.
  • the brake driver 753 may receive the deceleration control signal from the around view providing apparatus 100.
  • the brake driver 753 may control the brake apparatus according to the received deceleration control signal.
  • the lamp driver 754 may control turn on / turn off of a lamp disposed in or outside the vehicle. In addition, it is possible to control the intensity, direction, etc. of the light of the lamp. For example, control of a direction indicator lamp, a brake lamp, and the like can be performed.
  • the air conditioning driver 755 may perform electronic control of an air conditioner (not shown) in the vehicle 700. For example, when the temperature inside the vehicle is high, the air conditioner may be operated to supply cool air into the vehicle.
  • the window driver 756 may perform electronic control of a window apparatus in the vehicle 700. For example, the opening or closing of the left and right windows of the side of the vehicle can be controlled.
  • the airbag driver 757 may perform electronic control of an airbag apparatus in the vehicle 700.
  • for example, when a danger is detected, the airbag may be controlled to deploy.
  • the sunroof driver 758 may perform electronic control of a sunroof apparatus (not shown) in the vehicle 700. For example, the opening or closing of the sunroof can be controlled.
  • the suspension driver 759 may perform electronic control of a suspension apparatus (not shown) in the vehicle 700. For example, when the road surface is uneven, the suspension apparatus may be controlled to reduce vibration of the vehicle 700.
  • the suspension driver 759 may receive a suspension control signal from the around view providing apparatus 100. The suspension driver 759 may control the suspension device according to the received suspension control signal.
  • the memory 730 is electrically connected to the controller 770.
  • the memory 730 may store basic data for the unit, control data for controlling the operation of the unit, and input / output data.
  • in hardware, the memory 730 may be any of various storage devices such as a ROM, a RAM, an EPROM, a flash drive, or a hard drive.
  • the memory 730 may store various data for the overall operation of the vehicle 700, such as a program for processing or controlling the controller 770.
  • the interface unit 780 may serve as a path to various types of external devices connected to the vehicle 700.
  • the interface unit 780 may include a port connectable to the mobile terminal 600, and may be connected to the mobile terminal 600 through the port. In this case, the interface unit 780 may exchange data with the mobile terminal 600.
  • the interface unit 780 may serve as a path for supplying electrical energy to the connected mobile terminal 600.
  • for example, the interface unit 780 may provide the mobile terminal 600 with electrical energy supplied from the power supply unit 790.
  • the controller 770 may control the overall operation of each unit in the vehicle 700.
  • the controller 770 may be referred to as an electronic control unit (ECU).
  • in hardware, the controller 770 may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and electrical units for performing other functions.
  • the power supply unit 790 may supply power required for the operation of each component under the control of the controller 770.
  • in particular, the power supply unit 790 may receive power from a battery (not shown) in the vehicle.
  • the around view providing apparatus 100 may exchange data with the controller 770. Various information, data, or control signals generated by the around view providing apparatus 100 may be output to the controller 770.
  • the around view providing apparatus 100 may be the around view providing apparatus described above with reference to FIGS. 1 to 7C.
  • the vehicle display apparatus 400 may exchange data with the controller 770.
  • the controller 770 may receive navigation information from the vehicle display apparatus 400 or a separate navigation device (not shown).
  • the navigation information may include set destination information, route information according to the destination, map information or vehicle location information related to driving of the vehicle.
  • FIGS. 9 to 15 are diagrams for describing an operation of the vehicle around view providing apparatus when an object is located at the rear of the vehicle, according to an exemplary embodiment of the present invention.
  • the processor 170 of the apparatus 100 for providing an around view of a vehicle may receive a vehicle surrounding image acquired by the camera 195.
  • the processor 170 may receive a left peripheral image from the left camera 195a.
  • the processor 170 may receive a rear surrounding image from the rear camera 195b.
  • the processor 170 may receive a right peripheral image from the right camera 195c.
  • the processor 170 may receive a front surrounding image from the front camera 195d.
  • the processor 170 may generate an around view image 910 by combining the plurality of received surrounding images of the vehicle.
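  • The synthesis step above is not spelled out in the text; below is a minimal sketch of one common approach, in which each camera image is warped onto the ground plane with a precomputed calibration homography and composited into a single top view canvas. The camera names, homographies, and canvas size are illustrative assumptions, not details from this disclosure.

```python
import cv2
import numpy as np

def make_around_view(images, homographies, canvas_size=(600, 600)):
    """Composite four surround cameras into one top view image.

    images: dict such as {'front': img, 'rear': img, 'left': img,
        'right': img} of distortion-corrected frames.
    homographies: dict mapping the same keys to 3x3 ground-plane
        homographies from an offline calibration (assumed given).
    """
    w, h = canvas_size
    canvas = np.zeros((h, w, 3), dtype=np.uint8)
    for name, img in images.items():
        # Warp this camera's view onto the common ground-plane canvas.
        warped = cv2.warpPerspective(img, homographies[name], (w, h))
        mask = warped.sum(axis=2) > 0
        # Overwrite compositing; production systems blend seam regions.
        canvas[mask] = warped[mask]
    return canvas
```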
  • the around view image may be a top view image.
  • the processor 170 may display the top view image on the display unit 180.
  • the processor 170 may detect the objects 920 and 930.
  • the objects 920 and 930 may be low-height objects that cannot be seen by the driver sitting in the driver's seat.
  • the objects 920 and 930 may be curbs, stoppers, speed bumps, traffic cones and safety fences.
  • the objects 920 and 930 are not clearly identified in the top view image, so the vehicle around view providing apparatus 100 needs to switch to another view point and provide it to the driver.
  • as illustrated in FIG. 10, the processor 170 may switch the view point to a side view in which both the side surface of the vehicle 700 and the objects 920 and 930 are displayed, and display it on the display unit 180.
  • the side surface may include a left side or a right side.
  • the processor 170 may switch the view point to the rear view in which both the rear surface of the vehicle 700 and the objects 920 and 930 are displayed, and display the same on the display unit 180.
  • the vehicle image in the viewpoint switching image may be a virtual image generated by the processor 170 based on the data stored in the memory 140.
  • the processor 170 may display distance information 925 and 935 from the objects 920 and 930.
  • the processor 170 may display height information 923 and 933 of the objects 920 and 930. As such, by displaying the distances 925 and 935 from the objects 920 and 930 and the heights 923 and 933 of the objects 920 and 930, the user can recognize an object even when it is not visible, which has the effect of preventing collisions.
  • the processor 170 may calculate the distance to the object and the height of the object based on the above-described depth map.
  • the processor 170 may calculate a distance to the object based on the information received by the distance detector 150 of FIG. 4B.
  • the processor 170 may calculate the height of the object based on the information detected by the camera 195 and the information received by the distance detector 150 of FIG. 4B.
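  • The geometry behind these two calculations is left implicit; under a pinhole camera model, one hedged way to recover both the distance and the physical height from a depth map and a detected bounding box is sketched below. The bounding box format and the calibration parameter fy are assumptions for illustration.

```python
import numpy as np

def object_distance_and_height(depth_map, box, fy):
    """Estimate distance to a detected object and its physical height.

    depth_map: HxW array of metric depths (meters), e.g. derived from
        the distance detector 150 fused with the camera 195.
    box: (x0, y0, x1, y1) pixel bounding box of the object.
    fy: vertical focal length of the camera in pixels.
    """
    x0, y0, x1, y1 = box
    patch = depth_map[y0:y1, x0:x1]
    valid = patch[patch > 0]
    if valid.size == 0:
        return None  # no depth samples inside the box
    distance = float(valid.min())  # nearest surface approximates range
    # A pixel row v back-projects to camera-frame height Y = v * Z / fy
    # at depth Z, so the box height in pixels converts to meters.
    height = (y1 - y0) * distance / fy
    return distance, height
```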
  • the processor 170 may switch the view point to the upper side view and display it on the display unit 180.
  • the view point may be switched to the upper side rear view or the upper side front view and displayed on the display unit 180.
  • the processor 170 may display the view point switching image (e.g., the image of FIG. 10 or FIG. 11) and the top view image (e.g., the image of FIG. 9) together on the display unit 180.
  • the processor 170 may match and display an object displayed in the top view image and an object displayed in the view point switching image.
  • the processor 170 may display an auxiliary line connecting the object displayed in the top view image and the object displayed in the view point switching image. As such, the top view image and the view point switching image are displayed to match each other, thereby accurately transmitting the position of the object to the user.
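  • One hedged rendering of this matching step, assuming the two images are composed side by side in a single output frame and the object's pixel position in each view is known from detection:

```python
import cv2
import numpy as np

def draw_matching_line(top_view, switched_view, pt_top, pt_switched):
    """Show the top view and the view point switching image side by
    side and connect the same object in each with an auxiliary line.

    pt_top, pt_switched: integer (x, y) positions of the object in
        each image (assumed available from detection).
    """
    h = max(top_view.shape[0], switched_view.shape[0])
    w = top_view.shape[1] + switched_view.shape[1]
    frame = np.zeros((h, w, 3), dtype=np.uint8)
    frame[:top_view.shape[0], :top_view.shape[1]] = top_view
    frame[:switched_view.shape[0], top_view.shape[1]:] = switched_view
    # Shift the second point right by the width of the first image.
    p2 = (pt_switched[0] + top_view.shape[1], pt_switched[1])
    cv2.line(frame, pt_top, p2, (0, 255, 255), 2)  # auxiliary line
    return frame
```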
  • the processor 170 may display the plurality of viewpoint switching images (eg, the image of FIG. 10 and the image of FIG. 11) on the display unit 180.
  • the processor 170 may display the plurality of view point switching images on the display unit 180 at the same time.
  • the processor 170 may sequentially display the plurality of view point-switched images on the display unit 180 at predetermined time intervals. As such, by providing the user with various view point switching images, there is an effect of providing the user with more accurate information about the object.
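  • The sequential mode can be as simple as cycling the prepared images at a fixed period; a minimal sketch, where `display` stands in for whatever callable pushes a frame to the display unit 180 (an assumed interface):

```python
import time

def cycle_views(display, views, interval_s=2.0):
    """Show each view point switching image in turn for a fixed
    interval, as the sequential display mode describes."""
    for img in views:
        display(img)
        time.sleep(interval_s)
```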
  • the display unit 180 may be implemented such that the image 1420 is displayed on the side mirrors 1410L and 1410R.
  • the image may be a view point switching image, a top view image, or a combination thereof.
  • the display unit 180 may normally act as a mirror and display an image 1420 when a predetermined event occurs.
  • the event may include the case where the object is detected, the distance from the detected object is equal to or less than the reference value, or the height of the detected object is equal to or less than the reference value.
  • the processor 170 may display the image 1420 overlaid on the image reflected by the side mirrors 1410L and 1410R. In this case, the user can check the reflected image and the image 1420 at the same time.
  • the processor 170 may control the image 1420 to be displayed on the side mirrors 1410L and 1410R according to the positional relationship between the vehicle 700 and the detected object. For example, when an object is located on the left side of the vehicle 700 and the distance between the detected object and the vehicle 700 is less than or equal to a preset value, the processor 170 may display the image 1420 on the left side mirror 1410L. Likewise, when an object is located on the right side of the vehicle 700 and the distance between the detected object and the vehicle 700 is less than or equal to a preset value, the processor 170 may display the image 1420 on the right side mirror 1410R.
  • according to an embodiment, the processor 170 may control the image 1420 to be displayed on the side mirrors 1410L and 1410R when the gaze of the user sensed by the internal camera is directed toward the side mirrors 1410L and 1410R.
  • the interface unit 130 may receive the gaze information of the user detected by the internal camera and transmit it to the processor 170.
  • the processor 170 may also control the image 1420 to be displayed on the side mirrors 1410L and 1410R when the shift lever is positioned in reverse (R).
  • the interface unit 130 may receive the position information of the shift lever and transmit the position information to the processor 170.
  • when the image 1420 is displayed on the side mirrors 1410L and 1410R, a user who reverses into a parking space while looking at the side mirrors 1410L and 1410R can confirm, through the image 1420, an object that cannot be identified through the reflected mirror image; the display conditions described above are summarized in the sketch below.
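  • A minimal sketch of that summary, with all thresholds and field names as illustrative assumptions rather than values from this disclosure:

```python
from dataclasses import dataclass

@dataclass
class MirrorContext:
    object_detected: bool
    object_distance_m: float  # from the distance detector 150
    object_height_m: float    # from camera/distance fusion
    gaze_on_mirror: bool      # from the internal camera
    shift_in_reverse: bool    # from the shift lever position

def should_show_mirror_image(c, dist_ref=2.0, height_ref=0.3):
    """Decide when the side mirror switches from plain mirror to
    displaying the image 1420 (hypothetical reference values)."""
    if not c.object_detected:
        return False
    event = (c.object_distance_m <= dist_ref
             or c.object_height_m <= height_ref)
    return event and (c.gaze_on_mirror or c.shift_in_reverse)
```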
  • the display unit 180 may be implemented to display an image 1520 on the room mirror 1510.
  • the image may be a view point switching image, a top view image, or a combination thereof.
  • the display unit 180 may normally act as a mirror and display an image 1520 when a predetermined event occurs.
  • the event may include the case where the object is detected, the distance from the detected object is equal to or less than the reference value, or the height of the detected object is equal to or less than the reference value.
  • the processor 170 may display the image 1520 overlaid on the image reflected by the room mirror 1510. In this case, the user may check the reflected image and the image 1520 at the same time.
  • an object that cannot be identified through the image reflected in the room mirror 1510 may be identified through the image 1520.
  • according to an embodiment, the processor 170 may control the image 1520 to be displayed on the room mirror 1510 when the gaze of the user detected by the internal camera is directed toward the room mirror 1510.
  • the interface unit 130 may receive the gaze information of the user detected by the internal camera and transmit it to the processor 170.
  • the processor 170 may control the image 1520 to be displayed on the room mirror 1510 when the shift lever is located in the reverse direction R.
  • the interface unit 130 may receive the position information of the shift lever and transmit the position information to the processor 170.
  • the display unit 180 may be implemented such that an image is displayed on the side window glass adjacent to the side mirror.
  • the display unit 180 may be formed as a transparent display and disposed close to the side window glass.
  • the display unit 180 may include a projection module, and the projection module may project an image on the side window glass.
  • the processor 170 may control to display an image on the side window glass.
  • the processor 170 may control an image to be displayed on the side window glass when the gaze of the user sensed by the internal camera is directed to the side window glass.
  • the interface unit 130 may receive the gaze information of the user detected by the internal camera and transmit it to the processor 170.
  • the processor 170 may also control an image to be displayed on the side window glass when the shift lever is positioned in reverse (R).
  • the interface unit 130 may receive the position information of the shift lever and transmit the position information to the processor 170.
  • FIGS. 16 to 21 are diagrams for describing an operation of the vehicle around view providing apparatus when an object is located in front of the vehicle, according to an embodiment of the present invention.
  • the processor 170 of the apparatus 100 for providing an around view of a vehicle may receive a vehicle surrounding image acquired by the camera 195.
  • the processor 170 may receive a left peripheral image from the left camera 195a.
  • the processor 170 may receive a rear surrounding image from the rear camera 195b.
  • the processor 170 may receive a right peripheral image from the right camera 195c.
  • the processor 170 may receive a front surrounding image from the front camera 195d.
  • the processor 170 may generate an around view image 1610 by combining the plurality of received surrounding images of the vehicle.
  • the around view image may be a top view image.
  • the processor 170 may display the top view image on the display unit 180.
  • the processor 170 may detect the objects 1620 and 1630.
  • the objects 1620 and 1630 may be low-height objects that cannot be seen by the driver sitting in the driver's seat.
  • the objects 1620 and 1630 may be curbs, stoppers, speed bumps, traffic cones and safety fences.
  • the objects 1620 and 1630 are not clearly identified in the top view image, and thus the vehicle around view providing apparatus 100 needs to switch to another view point and provide it to the driver.
  • as illustrated in FIG. 17, the processor 170 may switch the view point to a side view in which both the side surface of the vehicle 700 and the objects 1620 and 1630 are displayed, and display it on the display unit 180.
  • the side surface may include a left side or a right side.
  • meanwhile, the processor 170 may switch the view point to a front view in which both the front surface of the vehicle 700 and the objects 1620 and 1630 are displayed, and display it on the display unit 180.
  • the processor 170 may display, on the display unit 180, a top view in which the top surface of the vehicle 700 and the object are displayed.
  • the processor 170 may display an image enlarged around the object, compared to the previously displayed top view image.
  • the processor 170 may display distance information 1910 with respect to the object on the top view image.
  • the processor 170 may display an alarm message 1920 on the top view image.
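  • A hedged sketch of this enlarge-and-annotate step, assuming the object's position in the top view is known and using illustrative colors and text:

```python
import cv2

def zoom_on_object(top_view, center, zoom=2.0):
    """Crop the top view around the object and scale the crop back to
    full size, mirroring the enlarged display centered on the object.
    center: integer (x, y) object position in the top view."""
    h, w = top_view.shape[:2]
    cw, ch = int(w / zoom), int(h / zoom)
    x0 = min(max(center[0] - cw // 2, 0), w - cw)
    y0 = min(max(center[1] - ch // 2, 0), h - ch)
    crop = top_view[y0:y0 + ch, x0:x0 + cw]
    return cv2.resize(crop, (w, h), interpolation=cv2.INTER_LINEAR)

def overlay_alert(img, distance_m):
    """Draw the distance readout and an alarm message on the image."""
    cv2.putText(img, f"{distance_m:.1f} m", (10, 30),
                cv2.FONT_HERSHEY_SIMPLEX, 1.0, (0, 255, 255), 2)
    cv2.putText(img, "CAUTION: OBSTACLE AHEAD", (10, 65),
                cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 0, 255), 2)
    return img
```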
  • the vehicle image in the viewpoint switching image may be a virtual image generated by the processor 170 based on the data stored in the memory 140.
  • the processor 170 may display distance information 1625 and 1635 from the objects 1620 and 1630.
  • the processor 170 may display height information 1623 and 1633 of the objects 1620 and 1630. As such, by displaying the distances 1625 and 1635 to the objects 1620 and 1630 and the heights 1623 and 1633 of the objects 1620 and 1630, the user can recognize an object even when it is not visible, which has the effect of preventing collisions.
  • the processor 170 may calculate the distance to the object and the height of the object based on the above-described depth map.
  • the processor 170 may calculate a distance to the object based on the information received by the distance detector 150 of FIG. 4B.
  • the processor 170 may calculate the height of the object based on the information detected by the camera 195 and the information received by the distance detector 150 of FIG. 4B.
  • the processor 170 may display the view point switching image (e.g., the image of FIG. 17 or FIG. 18) and the top view image (e.g., the image of FIG. 16) together on the display unit 180.
  • the processor 170 may match and display an object displayed in the top view image and an object displayed in the view point switching image.
  • the processor 170 may display an auxiliary line connecting the object displayed in the top view image and the object displayed in the view point switching image. As such, the top view image and the view point switching image are displayed to match each other, thereby accurately transmitting the position of the object to the user.
  • the processor 170 may display the plurality of viewpoint switching images (for example, the image of FIG. 17 and the image of FIG. 18) on the display unit 180.
  • the processor 170 may display the plurality of view point switching images on the display unit 180 at the same time.
  • the processor 170 may sequentially display the plurality of view point-switched images on the display unit 180 at predetermined time intervals. As such, by providing the user with various view point switching images, there is an effect of providing the user with more accurate information about the object.
  • the display unit 180 may be implemented such that an image 2020 is displayed on the front windshield 2010.
  • the image may be a view point switching image, a top view image, or a combination thereof.
  • the display unit 180 may be formed as a transparent display and disposed close to the front windshield 2010.
  • the display unit 180 may include a projection module, and the projection module may project the image 2020 on the front windshield 2010.
  • the image 2020 is displayed on the front windshield, so that the user can check the object while keeping the gaze in front. Therefore, a collision with an object can be prevented even when the object is not visible to the user.
  • the processor 170 may display, on the front windshield, the distance information 2110 between the vehicle 700 and the object as augmented reality (AR) content matched to the object.
  • the processor 170 may display an alarm message 2120 on the front windshield.
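  • Anchoring AR content to the object requires projecting the object's 3D position into display coordinates; below is a minimal sketch under a simple linear HUD model, where the 3x4 projection matrix P is an assumed output of an offline windshield/HUD calibration:

```python
import numpy as np

def project_to_hud(obj_xyz, P):
    """Map an object position (vehicle frame, meters) to HUD pixels.

    P: assumed 3x4 calibration matrix from vehicle-frame points to
       windshield display pixels.
    """
    pt = np.append(np.asarray(obj_xyz, dtype=float), 1.0)  # homogeneous
    u, v, s = P @ pt
    if s <= 0:
        return None  # behind the projection plane; nothing to draw
    return int(u / s), int(v / s)

# Usage (hypothetical): anchor the distance label 2110 at the object.
# label_xy = project_to_hud((0.4, 0.0, 2.3), P)
```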
  • FIGS. 22 to 25 are diagrams for describing an operation of the vehicle around view providing apparatus when an object is located at a side of the vehicle, according to an exemplary embodiment of the present invention.
  • the processor 170 of the apparatus 100 for providing an around view of a vehicle may receive a vehicle surrounding image acquired by the camera 195.
  • the processor 170 may receive a left peripheral image from the left camera 195a.
  • the processor 170 may receive a rear surrounding image from the rear camera 195b.
  • the processor 170 may receive a right peripheral image from the right camera 195c.
  • the processor 170 may receive a front surrounding image from the front camera 195d.
  • the processor 170 may generate an around view image 2210 by combining the plurality of received surrounding images of the vehicle.
  • the around view image may be a top view image.
  • the processor 170 may display the top view image on the display unit 180.
  • the processor 170 may detect the objects 2220 and 2230.
  • the objects 2220 and 2230 may be low-height objects that cannot be seen by the driver sitting in the driver's seat.
  • the objects 2220 and 2230 may be curbs, stoppers, speed bumps, traffic cones, safety fences, and the like.
  • the objects 2220 and 2230 are not clearly identified in the top view image, and thus the vehicle around view providing apparatus 100 needs to switch to another view point and provide it to the driver.
  • as illustrated in FIG. 23, the processor 170 may switch the view point to a front view in which both the front surface of the vehicle 700 and the objects 2220 and 2230 are displayed, and display it on the display unit 180.
  • the side may include a left side or a right side.
  • the processor 170 may switch the view point to a rear view in which both the rear surface of the vehicle 700 and the objects 2220 and 2230 are displayed, and display the same on the display unit 180.
  • the vehicle image in the viewpoint switching image may be a virtual image generated by the processor 170 based on the data stored in the memory 140.
  • the processor 170 may display distance information 2225 and 2235 from the objects 2220 and 2230.
  • the processor 170 may display height information 2223 and 2233 of the objects 2220 and 2230. As such, by displaying the distances 2225 and 2235 to the objects 2220 and 2230 and the heights 2223 and 2233 of the objects 2220 and 2230, the user can recognize an object even when it is not visible, which has the effect of preventing collisions.
  • the processor 170 may calculate the distance to the object and the height of the object based on the above-described depth map.
  • the processor 170 may calculate a distance to the object based on the information received by the distance detector 150 of FIG. 4B.
  • the processor 170 may calculate the height of the object based on the information detected by the camera 195 and the information received by the distance detector 150 of FIG. 4B.
  • the display unit 180 may be implemented such that an image is displayed on any one of a room mirror, a side mirror, a side window glass, and a front windshield.
  • the processor 170 may enlarge and display the view point switching image around the object 2220 while the view point is switched. For example, when the distance from the object 2220 is less than or equal to the reference value, the processor 170 may enlarge and display the view point switching image around the object 2220. In this case, the processor 170 may display distance information 2510 from the object 2220 on the view point switching image. In addition, the processor 170 may display an alarm message 2520 indicating a predicted collision with the object 2220 on the view point switching image.
  • FIGS. 26 and 27 are diagrams for describing an operation of the vehicle around view providing apparatus when a plurality of objects are detected, according to an embodiment of the present invention.
  • the processor 170 may detect a plurality of objects 2620 and 2630.
  • the processor 170 may switch the view point around the object closer to the vehicle 700 among the plurality of objects 2620 and 2630, and display the result on the display unit 180.
  • for example, as illustrated in FIG. 26, when the second object 2630 is the closer of the two, the processor 170 may switch the view point around the second object 2630.
  • the processor 170 may display the viewpoint change image on the display unit 180.
  • the view point switching image may be an image whose point of view is switched to one of a front view, a side view, and a rear view.
  • the view point switching image may be an image 2640 enlarged around the second object 2630 in the top view image.
  • the processor 170 may display distance information 2641 from the second object 2630 on the view point switching image. In addition, the processor 170 may display an alarm message 2642 calling attention to a possible collision with the second object 2630 on the view point switching image.
  • as illustrated in FIG. 27, when the first object 2720 is the closer one, the processor 170 may switch the view point around the first object 2720.
  • the processor 170 may display the viewpoint switching image on the display unit 180.
  • the view point switching image may be an image whose point of view is switched to one of a front view, a side view, and a rear view.
  • the view point switching image may be an image 2740 enlarged around the first object 2720 in the top view image.
  • the processor 170 may display distance information 2741 from the first object 2720 on the view point switching image. In addition, the processor 170 may display an alarm message 2742 calling attention to a possible collision with the first object 2720 on the view point switching image.
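  • The selection rule running through FIGS. 26 and 27 (switch the view around whichever object is currently nearer) reduces to a small routine; the object record layout, bearing convention, and view names below are assumptions for illustration:

```python
def pick_view_target(objects):
    """objects: list of dicts such as
    {'id': 2630, 'distance_m': 1.8, 'bearing_deg': 175.0},
    with bearing measured clockwise from the vehicle heading.
    Returns the nearest object and the view to switch to."""
    target = min(objects, key=lambda o: o['distance_m'])
    b = target['bearing_deg'] % 360
    if b <= 45 or b >= 315:
        view = 'front view'
    elif 135 <= b <= 225:
        view = 'rear view'
    else:
        view = 'side view'
    return target, view
```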
  • FIGS. 28 and 29 are diagrams for describing an operation of the vehicle around view providing apparatus when an object is detected while driving, according to an embodiment of the present invention.
  • the processor 170 may detect an object 2810 positioned around the vehicle, in particular, in front of the vehicle while driving.
  • the processor 170 may calculate a distance from the object 2810 and a height of the object 2810 based on the depth map described above.
  • the processor 170 may calculate a distance from the object 2810 based on the information received by the distance detector 150 of FIG. 4B.
  • the processor 170 may calculate the height of the object 2810 based on the information detected by the camera 195 and the information received by the distance detector 150 of FIG. 4B.
  • the object 2810 may be any one of a curb, a stopper, a speed bump, a traffic cone, and a safety fence.
  • the processor 170 may switch view points around the object 2810.
  • the processor 170 may display the viewpoint switching image 2910 on the display unit 180. If the display unit 180 is implemented to display an image on the front windshield 2950, the user may check the image while keeping a gaze in front of the vehicle while driving.
  • the processor 170 may display the height information 2920 of the object on the view point switching image 2910. In addition, the processor 170 may display a driving guide message 2930 for avoiding or safely passing an object on the viewpoint switching image 2910.
  • in addition, the processor 170 may provide a control signal to the vehicle driver 750; one possible decision rule is sketched below.
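  • A hedged sketch of one such rule, consistent with the driving guide message above: compare the object height against the vehicle's ground clearance and request deceleration when the object cannot be straddled (all numeric values hypothetical):

```python
def plan_response(obj_height_m, obj_distance_m, speed_mps,
                  ground_clearance_m=0.15):
    """Choose a control signal for the vehicle driver 750.

    Returns ('brake', decel) when the object cannot be passed over,
    with the deceleration computed from v^2 = 2 a d, or
    ('limit_speed', v) for a low object such as a speed bump that
    can be crossed slowly.
    """
    if obj_height_m > ground_clearance_m:
        decel = (speed_mps ** 2) / (2 * max(obj_distance_m, 0.1))
        return ('brake', decel)
    return ('limit_speed', 5.0)  # m/s, illustrative value
```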
  • the present invention described above can be embodied as computer readable codes on a medium in which a program is recorded.
  • the computer-readable medium includes all kinds of recording devices in which data readable by a computer system is stored. Examples of computer-readable media include hard disk drives (HDDs), solid state disks (SSDs), silicon disk drives (SDDs), ROMs, RAMs, CD-ROMs, magnetic tapes, floppy disks, and optical data storage devices. Implementations in the form of carrier waves (e.g., transmission over the Internet) are also included.
  • the computer may include a processor 170 or a controller 770. Accordingly, the above detailed description should not be construed as limiting in all aspects and should be considered as illustrative. The scope of the invention should be determined by reasonable interpretation of the appended claims, and all changes within the equivalent scope of the invention are included in the scope of the invention.

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Mechanical Engineering (AREA)
  • Transportation (AREA)
  • Human Computer Interaction (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Traffic Control Systems (AREA)
  • Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

The present invention relates to a vision providing apparatus for a vehicle, comprising: a plurality of cameras for acquiring an image of the surroundings of a vehicle; a display unit for displaying a first viewpoint image generated by synthesizing images acquired by the plurality of cameras; and a processor for displaying, when an object is detected, a second viewpoint image having a viewpoint different from the first viewpoint, through the display unit.
PCT/KR2015/013834 2015-10-13 2015-12-16 Appareil de fourniture de vision pour véhicule et véhicule WO2017065352A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020150142652A KR20170043212A (ko) 2015-10-13 2015-10-13 차량용 어라운드 뷰 제공 장치 및 차량
KR10-2015-0142652 2015-10-13

Publications (1)

Publication Number Publication Date
WO2017065352A1 true WO2017065352A1 (fr) 2017-04-20

Family

ID=58518304

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2015/013834 WO2017065352A1 (fr) 2015-10-13 2015-12-16 Appareil de fourniture de vision pour véhicule et véhicule

Country Status (2)

Country Link
KR (1) KR20170043212A (fr)
WO (1) WO2017065352A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101947685B1 (ko) * 2017-05-18 2019-04-29 충북대학교 산학협력단 차량의 위치를 측정하는 장치 및 시스템

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2005107261A1 (fr) * 2004-04-27 2005-11-10 Matsushita Electric Industrial Co., Ltd. Visualisation de circonférence d’un véhicule
US20050285966A1 (en) * 2004-01-28 2005-12-29 Canesta, Inc. Single chip red, green, blue, distance (RGB-Z) sensor
KR101400419B1 (ko) * 2012-07-04 2014-05-30 주식회사 금산코리아 차량의 룸미러를 이용한 영상표시장치
KR101413231B1 (ko) * 2013-02-18 2014-06-30 인하대학교 산학협력단 증강현실 기반 차량 주변 모니터링 장치 및 방법, 그리고 차량
US20150178576A1 (en) * 2013-12-20 2015-06-25 Magna Electronics Inc. Vehicle vision system with enhanced pedestrian detection

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108153308A (zh) * 2017-12-21 2018-06-12 李华 用于机器人车辆自动驾驶的复合视觉激光导航系统及其控制方法
CN109109800A (zh) * 2018-07-23 2019-01-01 威马智慧出行科技(上海)有限公司 汽车雷达报警提示方法及装置
CN111327877A (zh) * 2018-12-17 2020-06-23 现代自动车株式会社 车辆及控制车辆图像的方法
CN109827610A (zh) * 2019-03-12 2019-05-31 百度在线网络技术(北京)有限公司 用于校验传感器融合结果的方法和装置
CN109827610B (zh) * 2019-03-12 2021-05-14 百度在线网络技术(北京)有限公司 用于校验传感器融合结果的方法和装置

Also Published As

Publication number Publication date
KR20170043212A (ko) 2017-04-21

Similar Documents

Publication Publication Date Title
WO2018066741A1 (fr) Dispositif d'assistance au stationnement automatique et véhicule comprenant celui-ci
WO2018012674A1 (fr) Appareil d'aide à la conduite et véhicule équipé de celui-ci
WO2018070583A1 (fr) Appareil d'aide au stationnement automatique et véhicule comprenant ce dernier
WO2017209313A1 (fr) Dispositif d'affichage de véhicule et véhicule
WO2015099465A1 (fr) Dispositif d'assistance à la conduite pour véhicule et véhicule doté de celui-ci
WO2017039047A1 (fr) Véhicule et procédé de commande associé
WO2017022881A1 (fr) Véhicule et procédé de commande associé
WO2017150768A1 (fr) Dispositif d'affichage et véhicule équipé de celui-ci
WO2017018729A1 (fr) Radar pour véhicule et véhicule équipé de celui-ci
WO2018131949A1 (fr) Appareil servant à fournir une vue environnante
WO2017094952A1 (fr) Procédé d'alarme externe de véhicule, dispositif auxiliaire de conduite de véhicule pour réaliser celui-ci, et véhicule comprenant un dispositif auxiliaire de conduite de véhicule
WO2017034282A1 (fr) Appareil d'aide à la conduite et procédé de commande de ce dernier
WO2016182275A1 (fr) Appareil de conduite autonome et véhicule le comprenant
WO2017061653A1 (fr) Procédé pour empêcher la conduite en état d'ivresse et dispositif auxiliaire de véhicule pour le fournir
WO2017200162A1 (fr) Dispositif d'aide à la conduite de véhicule et véhicule
WO2017119541A1 (fr) Appareil d'assistance à la conduite de véhicule et véhicule le comprenant
WO2017115916A1 (fr) Appareil d'assistance de véhicule et véhicule équipé de celui-ci
WO2016186294A1 (fr) Dispositif de projection d'image et véhicule le comprenant
WO2015093828A1 (fr) Caméra stéréo et véhicule comportant celle-ci
WO2017104888A1 (fr) Dispositif d'aide à la conduite de véhicule et son procédé d'aide à la conduite de véhicule
WO2017065352A1 (fr) Appareil de fourniture de vision pour véhicule et véhicule
WO2016186319A1 (fr) Dispositif d'assistance à la conduite d'un véhicule et véhicule
WO2016021961A1 (fr) Appareil de pilotage de lampe de tête de véhicule et véhicule comportant celui-ci
WO2015088289A1 (fr) Caméra stéréoscopique, dispositif auxiliaire de conduite de véhicule, et véhicule
WO2015099463A1 (fr) Dispositif d'assistance à la conduite de véhicule et véhicule le comportant

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15906314

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15906314

Country of ref document: EP

Kind code of ref document: A1