WO2014155954A1 - Vehicle display control device - Google Patents

Vehicle display control device (車両用表示制御装置)

Info

Publication number
WO2014155954A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
display
vehicle
visible light
unit
Prior art date
Application number
PCT/JP2014/000906
Other languages
English (en)
French (fr)
Japanese (ja)
Inventor
宗作 重村
Original Assignee
株式会社デンソー
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社デンソー (DENSO Corporation)
Publication of WO2014155954A1 publication Critical patent/WO2014155954A1/ja

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/20Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/30Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles providing vision in the non-visible spectrum, e.g. night or infrared vision
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/20Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/22Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
    • B60R1/23Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
    • B60R1/24Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view in front of the vehicle
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/002D [Two Dimensional] image generation
    • G06T11/60Editing figures and text; Combining figures or text
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/63Control of cameras or camera modules by using electronic viewfinders
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/64Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/10Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used
    • B60R2300/105Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used using multiple cameras
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/10Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used
    • B60R2300/106Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used using night vision cameras
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/30Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
    • B60R2300/303Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing using joined images, e.g. multiple camera images
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/30Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
    • B60R2300/307Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing virtually distinguishing relevant parts of a scene from the background of the scene

Definitions

  • the present disclosure relates to a vehicle display control device.
  • The vehicle display control device is applicable to a night vision system.
  • There is a known vehicle periphery monitoring system that extracts objects such as pedestrians that may collide with the vehicle from an image captured by an infrared camera and presents that information to the driver (also called a vehicle monitoring system).
  • In such a system, the possibility of a collision with an object such as a pedestrian is determined from the relative distance and relative speed between the vehicle and the object.
  • Obstacle detection devices also include distance sensors such as millimeter-wave radars and laser radars; however, while these sensors can detect the presence or absence of an obstacle, they cannot identify what the detected obstacle is, and in particular cannot detect that it is a person.
  • Far-infrared cameras are well suited to pedestrian detection because they visualize the heat of objects.
  • Far-infrared light is scattered little by moisture, so pedestrians remain relatively easy to see even in bad weather (rain, fog, etc.), and such cameras have the further advantage of being little affected by ambient light such as the headlights of oncoming vehicles.
  • On the other hand, sunlight raises the temperature of background objects (buildings, roads, etc.), which makes it difficult to obtain a temperature difference between a pedestrian and the background.
  • Moreover, because an image captured by a far-infrared camera visualizes the heat of objects, it differs greatly in appearance from what is actually seen, and it can be difficult to recognize what is shown.
  • In the configuration of Patent Document 1, a visible light camera and an infrared camera are both essential components. Since the visible light camera must be shaped to suit its attachment position, the cost becomes high.
  • Meanwhile, a visible light camera function is provided in portable information terminals, typified by smartphones (high-function mobile phones), whose penetration rate is high. In recent years, systems that link a vehicle device such as a navigation device with a portable information terminal have also been put into practical use. However, no existing vehicle display control device makes use of cooperation with a portable information terminal.
  • This disclosure aims to provide a vehicle display control device that improves user convenience at low cost.
  • The display control device of the present disclosure is applied to a vehicle that includes a visible light camera connection portion to which a visible light camera can be attached and detached, and an infrared camera that captures the outside of the vehicle; a display portion is disposed in the vehicle interior.
  • The display control device includes an object detection unit that detects an object outside the vehicle based on an infrared image captured by the infrared camera.
  • When the visible light camera is connected to the visible light camera connection unit, an overlapping area detection unit detects the overlapping area between the infrared image and the visible light image based on the range of the visible light image of the outside of the vehicle captured by the visible light camera.
  • A display image determination unit transmits at least a part of the visible light image to the display unit as a display image.
  • When the object exists within the overlapping region, the display image determination unit superimposes the information of the object on the visible light image and transmits this as the display image to the display unit.
  • With this configuration, the cost of the device can be reduced; visibility is improved compared with an infrared-camera-only configuration; the operation load is reduced because the user operates a visible light camera familiar to them; and the system can track performance improvements in visible light cameras.
  • FIG. 1 is a diagram illustrating a configuration example of a vehicle display control device of the present disclosure
  • FIG. 2 is a diagram illustrating an example of attaching an infrared camera, a terminal device, and a display to a vehicle
  • FIG. 3 is a flowchart for explaining the display control process.
  • FIG. 4 is a flowchart for explaining the first display image generation process.
  • FIG. 5 is a flowchart for explaining the second display image generation process.
  • FIG. 6 is a diagram illustrating a display example in the first display image generation process.
  • FIG. 7 is a diagram showing a display example in the first display image generation process, following FIG. 6.
  • FIG. 8 is a diagram illustrating a determination example of a duplicate image and a display image
  • FIG. 9A is a diagram showing an example of enlarged display of a duplicate image
  • FIG. 9B is a diagram showing another example of enlarged display of overlapping images
  • FIG. 9C is a diagram illustrating another example of enlarged display of overlapping images.
  • the vehicle display control device 1 includes a control unit 10, an infrared camera 22 connected to the control unit 10, an operation unit 24, and a display 30 (display unit of the present disclosure).
  • the control unit 10 includes a well-known CPU 12, a ROM 13 that stores a control program, a RAM 14 that temporarily stores data, a memory 15 that is configured by a nonvolatile storage medium and stores information necessary for the operation of the vehicle display control device 1, Including an arithmetic processing unit 11 (display image determining unit and recommended information output unit of the present disclosure), a signal input / output circuit (abbreviated as I / O in FIG. 1) 16 connected to the arithmetic processing unit 11, a display control unit 17 (display unit of the present disclosure), an image processing unit 18 (overlapping area detection unit, object recognition unit of the present disclosure), and a connection unit 20 (visible light camera connection unit of the present disclosure).
  • The control unit 10 may be configured in hardware as one or more ICs or ASICs, or part or all of it may be implemented in software stored in memory.
  • the signal input / output circuit 16 converts the output from the operation unit 24 into data that can be processed by the arithmetic processing unit 11, for example.
  • the display control unit 17 corresponds to a so-called display driver, acquires a control command and display data from the arithmetic processing unit 11, and performs display control of the display unit 30.
  • Based on control commands from the arithmetic processing unit 11, the image processing unit 18 performs predetermined image processing, such as filtering and binarization, on the captured images from the terminal device 21 and the infrared camera 22, and generates image data composed of a two-dimensional array of pixels. It also detects the overlapping area and objects from the generated images (details are described later).
  • the connection unit 20 is a connection interface between the terminal device 21 (visible light camera of the present disclosure) and the control unit 10 and includes at least one of the following.
  • Wired connection using the USB standard: the connection unit 20 includes a connector and a connecting cable, and outputs on the signal line 20a a signal whose level indicates whether or not the terminal device 21 is connected.
  • With the USB standard, power can be supplied from the control unit 10 to the terminal device 21; if the terminal device 21 is provided with a charging circuit, its battery consumption can be suppressed.
  • Wireless connection based on well-known wireless LAN standards: the connection unit 20 includes a wireless transmission/reception circuit, and outputs on the signal line 20a a signal indicating the reception strength of the radio wave from the terminal device 21.
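The connection determination used later in the display control process (wired: level on signal line 20a; wireless: received signal strength) can be sketched as follows. This is an illustrative assumption: the patent specifies neither concrete values nor a threshold, so the names and the RSSI threshold here are invented for the sketch.

```python
def terminal_connected(wired_level=None, rssi_dbm=None, rssi_threshold_dbm=-70):
    """Decide whether the terminal device 21 is connected.

    Wired (USB): a non-zero level on signal line 20a stands for
    "connected". Wireless (wireless LAN): the received signal
    strength from the terminal must reach a threshold. The threshold
    value is illustrative, not taken from the patent."""
    if wired_level is not None:
        return wired_level > 0          # level signal from line 20a
    if rssi_dbm is not None:
        return rssi_dbm >= rssi_threshold_dbm  # reception strength check
    return False                        # neither input available
```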
  • The terminal device 21 is brought into the vehicle by an occupant and includes a visible light camera with a solid-state image sensor such as a CCD or CMOS. The imaging direction and imaging range of the visible light camera are set by the occupant's operation.
  • Examples of the terminal device 21 include a tablet-type terminal typified by a known smartphone (a general term for a portable information terminal that has a touch panel mounted on a display portion such as a liquid crystal display and is operated with a finger), and a digital camera. As shown in FIG. 2, the terminal device 21 is attached to a position where the front of the vehicle 40 can be photographed, such as an upper portion of the dash panel of the vehicle 40, using an attachment (not shown) such as a bracket.
  • The infrared camera 22 may be either a near-infrared camera, which captures reflected near-infrared light, or a far-infrared camera, which captures far-infrared radiation emitted by objects. As shown in FIG. 2, the infrared camera 22 is attached at a position from which the front of the vehicle 40 can be photographed, such as near the rearview mirror or at the front part of the ceiling.
  • the operation unit 24 is configured as a known mechanical switch or a touch panel formed on the screen of the display 30.
  • the display device 30 is configured as a well-known LCD, and is attached, for example, in a meter panel of the vehicle 40 as shown in FIG. Alternatively, it is included as part of the meter display.
  • the display of the tablet terminal may be a display unit of the present disclosure.
  • the vehicle display control device 1 is activated by turning on the ignition switch or operating an activation switch included in the operation unit 24 (S11).
  • Next, whether the terminal device 21 is connected is determined using one of the following:
  • for a wired connection, the output level of the signal line 20a;
  • for a wireless connection, the reception strength of the radio wave reported on the signal line 20a.
  • the first display image generation process corresponding to step S13 in FIG. 3 will be described with reference to FIG. First, based on the control command from the arithmetic processing unit 11, the terminal device 21 and the infrared camera 22 capture images, and the image processing unit 18 acquires those images (visible light image and infrared image) (S31). In the terminal device 21, an occupant performs setting of an imaging range and an imaging operation (for example, transition to a still image shooting mode).
  • the overlapping area between the visible light image and the infrared image is detected by the image processing unit 18 (S32).
  • An example of the detection method is disclosed in, for example, Japanese Patent Application Laid-Open No. 2007-131178.
  • The detection proceeds as follows. First, the resolutions of the two images are matched: the lower-resolution image is enlarged to match the higher-resolution one, or the higher-resolution image is reduced to match the lower-resolution one.
  • Next, the infrared image is fixed and the visible light image is shifted over it, from the upper left to the lower right of the infrared image, in units of pixels along the row and column directions, and the similarity is calculated at each position.
  • The position where the similarity is highest is determined to be the overlapping area of the two images (so-called template matching).
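The patent leaves the matching procedure at this level of description (citing JP 2007-131178 A for details). As an illustrative sketch only, assuming the resolutions have already been matched and using sum of absolute differences (SAD) as the similarity measure (the patent does not name one), the template matching could look like:

```python
def find_overlap(infrared, visible):
    """Slide the visible image over the fixed infrared image, from the
    upper left to the lower right, and return the (row, col) offset of
    the best match together with its score. Images are lists of lists
    of gray levels; a lower SAD means a higher similarity."""
    ih, iw = len(infrared), len(infrared[0])
    vh, vw = len(visible), len(visible[0])
    best, best_sad = (0, 0), float("inf")
    for r in range(ih - vh + 1):          # shift in the row direction
        for c in range(iw - vw + 1):      # shift in the column direction
            sad = sum(
                abs(infrared[r + y][c + x] - visible[y][x])
                for y in range(vh) for x in range(vw)
            )
            if sad < best_sad:
                best_sad, best = sad, (r, c)
    return best, best_sad
```

The returned offset locates the overlapping area of the visible light image within the infrared image.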
  • When no overlapping area is detected, a message to that effect is output to the display 30, together with recommended information prompting the occupant to change the imaging range of the terminal device 21 (S39). For example, a message such as "Please change the imaging range" is displayed on the display 30. The process then returns to step S31.
  • the above-mentioned configuration is “provided with a recommended information output unit (11) that outputs recommended information that recommends changing the imaging range of the visible light camera when an overlapping region is not detected”. With this configuration, it is possible to alert the occupant that the direction of the visible light camera is not appropriate.
  • Even when an overlapping region exists, it may be treated as not detected in the following cases:
  • when the size (area) of the overlapping area is below a predetermined value (example: area 45 in FIG. 8);
  • when the overlapping area does not fully include a predetermined reference area 47, such as the area ahead of the vehicle 40 in its traveling direction (example: areas 45 and 46 in FIG. 8). This prevents an image that is not useful to the occupant from being displayed.
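The two rejection criteria above can be sketched as a single check. The (x, y, w, h) box representation and all names are assumptions made for the sketch, not taken from the patent:

```python
def overlap_is_valid(overlap_box, reference_box, min_area):
    """An overlap counts as detected only if it is large enough and
    fully contains the reference area (e.g. the region ahead of the
    vehicle, area 47 in Fig. 8). Boxes are (x, y, w, h)."""
    ox, oy, ow, oh = overlap_box
    rx, ry, rw, rh = reference_box
    big_enough = ow * oh >= min_area          # area criterion
    contains_ref = (ox <= rx and oy <= ry and # reference-area criterion
                    ox + ow >= rx + rw and oy + oh >= ry + rh)
    return big_enough and contains_ref
```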
  • When there is an overlapping area (S33: Yes), the display 30 highlights it, for example by enclosing the outer edge of the overlapping portion with a frame, and displays a guidance message prompting the occupant to select one of the following. Based on this display, the occupant determines the display form, which is set as the display image for display on the display 30 (S34): display as is; enlarge the overlapping area; or change the imaging range of the terminal device 21 and capture again.
  • This configuration is "the vehicle includes an operation unit (24) for an operation input by an occupant, and the display image determination unit determines the range of the display image based on the operation input of the occupant".
  • the occupant may select and display a part of the overlapping area.
  • This configuration is “the desired display area is selected from the overlap area by the operation of the occupant's operation section, and the display image determining section determines the desired display area as the display image”. With this configuration, an area desired by the occupant can be displayed as a display image.
  • FIG. 6 shows a display example on the display 30 when an overlapping area is detected.
  • An overlapping region 43 between the visible light image 41 and the infrared image 42 is displayed using a visible light image.
  • The regions where the visible light image 41 and the infrared image 42 do not overlap are omitted from the display.
  • An occupant selection button is displayed at the bottom of the screen. When the “Display as is” button is pressed, the current display mode is continued. When the “enlarged display” button is pressed, the overlapping area 43 of the visible light image 41 is enlarged and displayed.
  • As the enlargement method, the overlapping region 43 may be enlarged to the full screen of the display 30 while maintaining its aspect ratio (FIG. 9A),
  • or the image may be stretched to fit the size of the screen of the display 30 (FIG. 9B).
  • Alternatively, as in FIG. 9C, a part of the overlapping area 43 may be selected with a selection frame 49 and enlarged to fit the screen of the display 30.
  • The selection frame 49 may be a frame prepared in advance to match the aspect ratio of the screen of the display 30, or may be set with an arbitrary aspect ratio.
  • The selected part may then be enlarged while maintaining its aspect ratio as in FIG. 9A, or enlarged so that its vertical or horizontal dimension matches the screen size as in FIG. 9B.
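The two enlargement modes (aspect-preserving as in FIG. 9A versus stretch-to-fit as in FIG. 9B) amount to choosing per-axis scale factors. A minimal sketch, with all names assumed:

```python
def fit_scale(region_w, region_h, screen_w, screen_h, keep_aspect=True):
    """Return (scale_x, scale_y) for enlarging the selected region to
    the display. With keep_aspect, the smaller ratio is applied to
    both axes so the whole region fits (Fig. 9A); otherwise each axis
    is scaled independently to fill the screen (Fig. 9B)."""
    sx = screen_w / region_w
    sy = screen_h / region_h
    if keep_aspect:
        s = min(sx, sy)   # limiting axis governs both directions
        return s, s
    return sx, sy
```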
  • When the occupant changes the imaging range of the terminal device 21 and presses the "shoot again" button shown in FIG. 6, the process returns to step S31 and imaging is performed again.
  • This configuration is "the display unit displays the display image enlarged according to the size of the display unit".
  • Next, the image processing unit 18 detects the presence or absence of an object (pedestrian, animal, etc.) and its position over the entire region of the infrared image, not only within the display image (that is, the overlapping area).
  • When an object is detected (S35: Yes), it is determined whether the object falls within the display image.
  • When no object is detected (S35: No), the display form determined in step S34 is set as the display image as it is (S38).
  • When the target object is not included in the display image (for example, as shown in FIG. 8, when an overlapping area that matches the reference area 47 is detected but the pedestrian 48 exists outside it), the process may move to the above-described step S39 to output the recommended information.
  • This configuration is “provided with a recommended information output unit (11) that outputs recommended information that recommends changing the imaging range of the visible light camera when the object is outside the range of the display image”. According to this configuration, it is possible to display information useful for the occupant and to alert the occupant that the direction of the visible light camera is not appropriate.
  • When the object falls within the display image, object information reflecting the presence of the object is superimposed on the display image, and this is used as the display image (S37).
  • Examples of the object information include an image of a frame surrounding the object and a symbol image representing the type of the object.
  • The above-described configuration includes "an object recognition unit (18) for recognizing a predetermined object from an infrared image, wherein, when the object exists within the range of the display image, the display unit superimposes and displays object information reflecting the existence of the object on the display image".
  • FIG. 7 shows a display example on the display 30 when an object is detected.
  • In the overlapping area 43, displayed using the visible light image, each detected object is highlighted by being surrounded by a substantially rectangular frame 51, and a balloon 52 containing an enlarged image of the object is further displayed.
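The highlighting by the rectangular frame 51 can be sketched as a simple raster operation (the balloon 52 is omitted; the pixel value and frame thickness are illustrative assumptions):

```python
def draw_frame(img, box, value=255, thickness=2):
    """Return a copy of a grayscale image (list of lists) with a
    rectangular frame drawn around box = (x, y, w, h), highlighting a
    detected object as in Fig. 7. The interior and the original
    image are left untouched."""
    x, y, w, h = box
    out = [row[:] for row in img]   # copy so the source image is preserved
    for row in range(y, y + h):
        for col in range(x, x + w):
            on_edge = (row < y + thickness or row >= y + h - thickness or
                       col < x + thickness or col >= x + w - thickness)
            if on_edge:
                out[row][col] = value
    return out
```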
  • the second display image generation process corresponding to step S15 in FIG. 3 will be described with reference to FIG.
  • This processing is “when the visible light camera is not connected to the visible light camera connection unit, the display unit displays an infrared image taken by the infrared camera”. With this configuration, even when the visible light camera is not connected to the visible light camera connection unit, the object can be detected and displayed.
  • an image is picked up by the infrared camera 22, and the image processing unit 18 acquires the image (S51).
  • the image processing unit 18 uses the infrared image to detect the presence / absence of the object and its position.
  • When an object is detected, object information reflecting the presence of the object is superimposed on the infrared image, and this is used as the display image (S53).
  • Otherwise, the infrared image is used as the display image as it is (S54).
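The second display image generation process can be summarized in a few lines. `detect_objects` is a hypothetical callable standing in for the image processing unit 18; the dictionary return shape is an assumption made for the sketch:

```python
def second_display_image(infrared_img, detect_objects):
    """Second display image generation (Fig. 5): with no visible light
    camera connected, the infrared image itself becomes the display
    image. Objects detected in it are overlaid (S53); otherwise the
    image is used as-is (S54)."""
    objects = detect_objects(infrared_img)   # object detection on the IR frame
    if objects:
        return {"image": infrared_img, "overlays": objects}   # S53
    return {"image": infrared_img, "overlays": []}            # S54
```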
  • the above disclosure includes the following aspects.
  • According to one aspect, a display control device is applied to a vehicle that includes a visible light camera connection unit to which a visible light camera can be attached and detached, an infrared camera that captures an image of the outside of the vehicle, and a display unit disposed in the passenger compartment.
  • The display control device includes an object detection unit that detects an object outside the vehicle based on an infrared image captured by the infrared camera.
  • When the visible light camera is connected to the visible light camera connection unit, an overlapping area detection unit detects the overlapping area of the infrared image and the visible light image based on the range of the visible light image of the outside of the vehicle.
  • A display image determination unit transmits at least a part of the visible light image as a display image to the display unit.
  • When the object exists within the overlapping region, the display image determination unit superimposes the information of the object on the visible light image and transmits this as the display image to the display unit.
  • With this configuration, the cost of the device can be reduced; visibility is improved compared with an infrared-camera-only configuration; the operation load is reduced because the user operates a visible light camera familiar to them; and the system can track performance improvements in visible light cameras.
  • the vehicular display control apparatus may further include an operation unit for an operation input by an occupant of the vehicle.
  • the display image determination unit determines a range of the display image based on an operation input of the occupant.
  • the overlapping area detection unit may determine that the overlapping area has been detected when the overlapping area exceeds a predetermined size. Furthermore, the overlapping area detection unit may determine that the overlapping area has been detected when the overlapping area includes a predetermined reference area of the infrared image.
  • When the visible light camera is not connected to the visible light camera connection unit, the display unit may display an infrared image captured by the infrared camera.
  • The vehicle display control apparatus may further include a recommended information output unit that outputs recommended information recommending a change to the imaging range of the visible light camera when the overlapping region detection unit does not detect the overlapping region.
  • the visible light camera may be mounted on a mobile terminal.
  • In that case, the portable terminal is disposed in the vehicle cabin.
  • The display unit may be mounted on the portable terminal or on an in-vehicle device.
  • When the display unit is mounted on the in-vehicle device, the display image displayed on that display unit is adjusted accordingly.
  • the display image determination unit may highlight the object image by surrounding it with a rectangular frame, and may superimpose and display a balloon obtained by enlarging the image of the object on the visible light image.
  • Each processing section is denoted, for example, as S11.
  • Each section can be divided into a plurality of subsections, and a plurality of sections can be combined into one section.
  • Each section configured in this manner can also be referred to as a device, module, or means.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Mechanical Engineering (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Traffic Control Systems (AREA)
  • Image Processing (AREA)
  • Studio Devices (AREA)
PCT/JP2014/000906 2013-03-28 2014-02-21 Vehicle display control device WO2014155954A1 (ja)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2013067707A JP5991648B2 (ja) 2013-03-28 2013-03-28 Vehicle display control device
JP2013-067707 2013-03-28

Publications (1)

Publication Number Publication Date
WO2014155954A1 (ja) 2014-10-02

Family

ID=51622977

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2014/000906 WO2014155954A1 (ja) 2014-02-21 Vehicle display control device

Country Status (2)

Country Link
JP (1) JP5991648B2
WO (1) WO2014155954A1


Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9778657B2 (en) 2010-11-19 2017-10-03 Bradley Tyers Automatic location placement system
US11480965B2 (en) 2010-11-19 2022-10-25 Maid Ip Holdings Pty/Ltd Automatic location placement system
KR101601475B1 (ko) * 2014-08-25 2016-03-21 현대자동차주식회사 Apparatus and method for detecting pedestrians of a vehicle during night driving
KR102548491B1 (ko) * 2016-03-29 2023-06-28 메이드 아이피 홀딩스 피티와이/엘티디 Automatic location placement system
JP6390035B2 (ja) * 2016-05-23 2018-09-19 本田技研工業株式会社 Vehicle control system, vehicle control method, and vehicle control program
KR102449834B1 2017-02-17 2022-09-29 스미도모쥬기가이고교 가부시키가이샤 Surroundings monitoring system for a work machine
KR102533860B1 (ko) * 2017-12-04 2023-05-19 소니그룹주식회사 Image processing device and image processing method
JP7508958B2 (ja) * 2020-09-11 2024-07-02 株式会社リコー Information processing device, information processing system, and program

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005184523A (ja) * 2003-12-19 2005-07-07 Matsushita Electric Ind Co Ltd In-vehicle surveillance camera device
JP2008056052A (ja) * 2006-08-30 2008-03-13 Xanavi Informatics Corp In-vehicle device
JP2008244906A (ja) * 2007-03-28 2008-10-09 Honda Motor Co Ltd Vehicle display device
JP2012214083A (ja) * 2011-03-31 2012-11-08 Fujitsu Ten Ltd Image generation device, image display system, and image generation method
JP2013055416A (ja) * 2011-09-01 2013-03-21 Alpine Electronics Inc Vehicle exterior video providing system


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113557172A (zh) * 2019-03-11 2021-10-26 株式会社小糸制作所 Gating camera, automobile, vehicle lamp, object recognition system, arithmetic processing device, object recognition method, image display system, inspection method, imaging device, and image processing device
JP2022109764A (ja) * 2021-01-15 2022-07-28 パナソニックIpマネジメント株式会社 Parking assistance method and parking assistance device
JP7610904B2 (ja) 2021-01-15 2025-01-09 パナソニックオートモーティブシステムズ株式会社 Parking assistance method and parking assistance device

Also Published As

Publication number Publication date
JP2014191668A (ja) 2014-10-06
JP5991648B2 (ja) 2016-09-14

Similar Documents

Publication Publication Date Title
JP5991648B2 (ja) Vehicle display control device
JP6633216B2 (ja) Imaging device and electronic apparatus
TWI814804B (zh) Distance measurement processing device, distance measurement module, distance measurement processing method, and program
US10802210B2 (en) Apparatus and method for a safety system of cameras for advantageously viewing vehicular traffic by the driver
US10528825B2 (en) Information processing device, approaching object notification method, and program
US20110228980A1 (en) Control apparatus and vehicle surrounding monitoring apparatus
JP6793193B2 (ja) Object detection and display device, moving body, and object detection and display method
CN103987582B (zh) Driving assistance device
US20180015879A1 (en) Side-view mirror camera system for vehicle
US20180217255A1 (en) Radar for vehicle and vehicle provided therewith
US10999559B1 (en) Electronic side-mirror with multiple fields of view
KR102441079B1 (ko) Display control apparatus and method for a vehicle
CN114270798B (zh) Imaging device
CN108569298A (zh) Method and device for enhancing a top-view image
KR102704495B1 (ko) Vehicle and control method thereof
US20220155459A1 (en) Distance measuring sensor, signal processing method, and distance measuring module
EP4408006A1 (en) Image processing system, movable apparatus, image processing method, and storage medium
WO2018042976A1 (ja) Image generation device, image generation method, recording medium, and image display system
US20200304752A1 (en) Method and apparatus for enhanced video display
CN117341583A (zh) Camera system, control method therefor, storage medium, and information processing device
JP2019188855A (ja) Vehicle visual recognition device
WO2017158829A1 (ja) Display control device and display control method
JP6349890B2 (ja) Driving assistance device
US20210170946A1 (en) Vehicle surrounding image display device
JP2007025739A (ja) Vehicle image display device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14774911

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14774911

Country of ref document: EP

Kind code of ref document: A1