WO2012073722A1 - Dispositif de synthèse d'image - Google Patents

Dispositif de synthèse d'image

Info

Publication number
WO2012073722A1
WO2012073722A1 (PCT/JP2011/076669)
Authority
WO
WIPO (PCT)
Prior art keywords
image
information
subject
infrared light
distance
Prior art date
Application number
PCT/JP2011/076669
Other languages
English (en)
Japanese (ja)
Inventor
高山 淳
Original Assignee
コニカミノルタホールディングス株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by コニカミノルタホールディングス株式会社 filed Critical コニカミノルタホールディングス株式会社
Priority to US13/990,808 priority Critical patent/US20130250070A1/en
Priority to JP2012546774A priority patent/JP5783471B2/ja
Publication of WO2012073722A1 publication Critical patent/WO2012073722A1/fr

Links

Images

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/239Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/20Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/22Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
    • B60R1/23Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
    • B60R1/24Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view in front of the vehicle
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/20Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/30Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles providing vision in the non-visible spectrum, e.g. night or infrared vision
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S11/00Systems for determining distance or velocity not using reflection or reradiation
    • G01S11/12Systems for determining distance or velocity not using reflection or reradiation using electromagnetic waves other than radio waves
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/10Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
    • H04N23/11Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths for generating image signals from visible and infrared light wavelengths
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/90Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
    • H04N5/272Means for inserting a foreground image in a background image, i.e. inlay, outlay
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/10Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used
    • B60R2300/105Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used using multiple cameras
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/10Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used
    • B60R2300/106Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used using night vision cameras
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/30Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
    • B60R2300/303Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing using joined images, e.g. multiple camera images

Definitions

  • the present invention relates to an image composition device that acquires subject information and forms a subject image, and more particularly to an image composition device that can form an appropriate subject image even at night.
  • a visible light camera and a far-infrared light camera detect electromagnetic waves in different wavelength ranges, and since there is substantially no suitable optical material that transmits both visible light and far-infrared light, it is inherently impossible to match the optical axis of the visible light camera with that of the far-infrared light camera. That is, the visible light image captured by the visible light camera and the far-infrared light image captured by the far-infrared light camera have different viewpoint positions.
  • Patent Document 1 discloses a technique in which, when a visible light image and a far-infrared light image are synthesized, the subject is extracted by image recognition and processed so that the same subject overlaps.
  • In Patent Document 1, however, it is necessary to capture the same subject with both the visible light camera and the far-infrared light camera in order to match the positions of the subjects.
  • With a visible light camera and a far-infrared light camera, many subjects can be seen by only one of them, so there is a problem that the subjects cannot always be aligned.
  • the present invention has been made in view of the above-described problems, and its object is to provide an image composition device that can photograph a subject regardless of the luminance of the subject and can suppress deviation of the subject in the synthesized image.
  • the image composition apparatus of the present invention comprises: first image acquisition means for acquiring subject information using electromagnetic waves in a first wavelength region and generating first image information relating to the subject; second image acquisition means for acquiring subject information from a viewpoint different from that of the first image acquisition means, using electromagnetic waves in a second wavelength region different from the first wavelength region, and generating second image information relating to the subject; ranging means for obtaining distance information to the subject; three-dimensional information generating means for generating three-dimensional information of the subject based on the distance information to the subject; and viewpoint conversion means for processing, based on the generated three-dimensional information of the subject, the image information relating to at least one of the images so that the photographing viewpoints of the image based on the first image information and the image based on the second image information match each other.
  • since the viewpoint conversion unit processes the image information relating to at least one of the images, based on the generated three-dimensional information of the subject, so that the viewpoints of the image based on the first image information and the image based on the second image information match, the superimposing unit can superimpose the viewpoint-converted first image information and the second image information so that the two images overlap each other. A composite image in which subject deviation is suppressed regardless of subject distance can thereby be obtained.
  • the electromagnetic wave in the first wavelength range refers to visible light having a wavelength of 400 nm to 700 nm, for example.
  • the electromagnetic wave in the second wavelength range refers to, for example, far-infrared light having a wavelength of about 4 ⁇ m or more, terahertz wave, millimeter wave, microwave, and the like.
  • image information refers to, for example, an image signal.
  • here, "superimposing" includes combining a part of the images with their relative positions on the screen fixed.
  • preferably, the apparatus has superimposing means for superimposing the first image information and the second image information so that the first image subjected to viewpoint conversion processing by the viewpoint conversion unit and the second image overlap each other. This makes it possible to perform image composition without deviation.
  • preferably, the superimposing unit extracts specific information from the first image and the second image subjected to viewpoint conversion processing by the viewpoint conversion unit, and superimposes the extracted information.
  • preferably, the superimposing means extracts subject information having a luminance value equal to or higher than a predetermined value from the second image information and inserts it into the first image information. When the electromagnetic wave in the second wavelength range is far-infrared light, a heat-emitting subject such as a person appears with a high luminance value and can therefore be extracted in this way.
  • preferably, the superimposing means extracts subject information having a specific color or shape from the first image information and inserts it into the second image information. When the electromagnetic wave in the first wavelength range is visible light, the color and shape of a traffic light, for example, can be stored in advance, so that the traffic light can be extracted from the visible light image by image recognition and inserted into the image based on the second image information.
  • preferably, the superimposing unit extracts subject information having a specific color or shape from the first image information, extracts subject information having a luminance value equal to or higher than a predetermined value from the second image information, and inserts the extracted subject information into separate background image information. When the electromagnetic wave in the first wavelength range is visible light, the traffic light can be extracted from the visible light image by image recognition by storing its color and shape in advance; and when the electromagnetic wave in the second wavelength range is far-infrared light, a region radiating far-infrared light at a predetermined level or more can be determined to be a human body, so that a warning can be given.
  • predetermined information is given to the extracted subject information.
  • the "predetermined information" may be a frame surrounding the subject, or, for example, the distance to the subject displayed numerically.
  • preferably, the distance measuring unit acquires the distance information to the subject based on parallax information obtained from a plurality of the first image acquisition units or of the second image acquisition units.
  • the three-dimensional information generating means acquires the three-dimensional information of the subject by applying the distance measurement information to the entire screen.
  • preferably, the distance measuring unit measures the distance to the subject by projecting electromagnetic waves onto the subject and measuring the time or direction of the reflection, and the three-dimensional information generating unit acquires the three-dimensional information of the subject based on the distance to the subject.
  • preferably, the electromagnetic wave in the first wavelength region is visible light, near-infrared light, or visible light and near-infrared light, and the electromagnetic wave in the second wavelength region is far-infrared light.
  • with this minimum configuration, a visible light image and a far-infrared light image can be combined into an image viewed from one viewpoint, and the position of a subject that can be seen by only one camera can be adjusted.
  • further figures are schematic diagrams illustrating the processing of the image synthesis unit 10 in FIG. 2, including images of the two-dimensional image data subjected to viewpoint conversion.
  • FIG. 1 is a schematic diagram of a vehicle equipped with an image composition device according to a first embodiment.
  • visible light cameras 1 and 2 are attached to the inside of the windshield of a vehicle VH, and a far-infrared light camera 3 is attached near the front grille of the vehicle VH.
  • the visible light cameras 1 and 2 serving as the first image acquisition means and the distance measurement means receive visible light from the subject OB at the vertical viewpoint position A, and output it as an image signal.
  • the far-infrared light camera 3 receives far-infrared light at the vertical viewpoint position B and outputs it as an image signal.
  • the image signals of the cameras 1 to 3 are input to the image composition unit 10, and the image signal processed there is output to the display device 4, a monitor, which displays a composite image viewable by the driver of the vehicle VH.
  • the image composition apparatus includes cameras 1 to 3 and an image composition unit 10.
  • FIG. 2 is a block diagram of the image composition apparatus according to the present embodiment.
  • the image composition unit 10 includes a three-dimensional information generation unit 11 serving as the three-dimensional information generation means, a viewpoint conversion unit 12 serving as the viewpoint conversion means, a subject recognition unit 13, a data processing unit 14, a superimposition unit 15 serving as the superimposition means, and a viewpoint data unit 19.
  • a first motion detector 16, a second motion detector 17, and a motion comparator 18 may be included.
  • the 3D information generation unit 11 extracts 3D information based on the principle of a stereo camera based on the image signals of the visible light cameras 1 and 2.
  • FIG. 3 is a diagram illustrating a state in which a distance to an object is measured with a stereo camera.
  • visible light cameras 1 and 2 having a pair of imaging elements are arranged so as to be separated from each other by a predetermined base line interval L and their optical axes are parallel to each other.
  • the captured images of the objects captured by the visible light cameras 1 and 2 are searched for corresponding points in units of pixels using, for example, a SAD (Sum of Absolute Difference) method that is a corresponding point search method.
  • in FIG. 3, the two cameras are assumed to have at least the same focal length (f), the same number of image sensor (CCD) pixels, and the same size (δ) of one pixel.
  • the subject OB is photographed with the optical axes 1X and 2X arranged in parallel and separated from each other to the left and right.
  • the pixel number of the end portion of the subject OB on the imaging surface 1b of the camera 1 (counting from the left end or the right end) is compared with the corresponding pixel number on the imaging surface 2b of the camera 2; the difference Δ between them is the parallax, from which the distance D to the subject follows by triangulation as D = f·L/(Δ·δ).
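The SAD corresponding-point search and the stereo distance calculation described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: `sad_disparity` and `depth_from_disparity` are hypothetical names, the search runs on a single scanline, and the camera parameters in the usage note are invented examples.

```python
import numpy as np

def sad_disparity(left_row, right_row, x, window=3, max_disp=16):
    """Find the pixel shift (disparity) of the patch around x in the left
    scanline by minimising the Sum of Absolute Differences (SAD) against
    candidate patches in the right scanline."""
    half = window // 2
    patch = left_row[x - half:x + half + 1].astype(int)
    best_d, best_cost = 0, float("inf")
    for d in range(max_disp):
        xr = x - d  # a feature at x in the left image appears at x - d in the right
        if xr - half < 0:
            break
        cand = right_row[xr - half:xr + half + 1].astype(int)
        cost = np.abs(patch - cand).sum()
        if cost < best_cost:
            best_cost, best_d = cost, d
    return best_d

def depth_from_disparity(disparity_px, f_mm, baseline_mm, pixel_pitch_mm):
    """Triangulate distance D = f * L / (delta * pixel_pitch) from the
    pixel disparity, matching the variables f, L, delta in the text."""
    return f_mm * baseline_mm / (disparity_px * pixel_pitch_mm)
```

For example, with an assumed 8 mm lens, 120 mm baseline, and 6 µm pixels, a 4-pixel disparity corresponds to a subject roughly 40 m away.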
  • the viewpoint conversion unit 12 calculates the viewpoint coordinates and angle of view based on the three-dimensional information obtained by the three-dimensional information generation unit 11, and performs image processing that changes the viewpoint position of the image signals of the visible light cameras 1 and 2. When the angles of view differ, they can also be adjusted.
  • the viewpoint conversion is described in, for example, Japanese Patent Laid-Open No. 2008-099136.
  • the viewpoint position may be converted with reference to preset relative position data of the visible light camera 1 and far-infrared light camera 3 stored in the viewpoint data unit 19.
  • a visible light far-infrared light synthesized image viewed from the position of the far-infrared light camera is generated by synthesizing the visible light image converted from the viewpoint and the far-infrared light image. Note that the viewpoint position of the visible light image may be matched with the viewpoint position of the far-infrared light image, or vice versa.
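The viewpoint conversion step, re-rendering one camera's image as if taken from the other camera's position using per-pixel distance, can be sketched as a depth-based forward warp. This is a simplified illustration under assumed conditions (pinhole model, pure vertical translation between the cameras, hypothetical function and parameter names); a real implementation must also handle occlusions and fill holes.

```python
import numpy as np

def reproject(image, depth, f_px, cx, cy, t_y_m):
    """Warp a grayscale `image` taken at viewpoint A to the viewpoint of a
    camera displaced vertically by t_y_m metres, using per-pixel depth.
    Forward-warping with nearest-pixel rounding."""
    h, w = image.shape
    out = np.zeros_like(image)
    for v in range(h):
        for u in range(w):
            z = depth[v, u]
            if z <= 0:
                continue
            # back-project the pixel to a 3-D point in camera-A coordinates
            x = (u - cx) * z / f_px
            y = (v - cy) * z / f_px
            # shift into camera-B coordinates (pure vertical translation)
            y_b = y - t_y_m
            # project the point into camera B
            u_b = int(round(f_px * x / z + cx))
            v_b = int(round(f_px * y_b / z + cy))
            if 0 <= u_b < w and 0 <= v_b < h:
                out[v_b, u_b] = image[v, u]
    return out
```

Note how the induced pixel shift, f·t/z, shrinks with distance z: this is exactly why a fixed offset cannot align subjects at different distances, and per-pixel three-dimensional information is needed.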
  • the first motion detection unit 16 detects the motion of the subject photographed by the visible light cameras 1 and 2, the second motion detection unit 17 detects the motion of the subject photographed by the far-infrared light camera 3, and the motion comparison unit 18 can compare the two.
  • the motion comparison unit 18 recognizes the areas in which the same subject is imaged in the images of the visible light cameras 1 and 2 and in the image of the far-infrared light camera 3, and measures the displacement of their positions.
  • the position data for changing the viewpoint stored in the viewpoint data unit 19 is corrected.
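The correction described here, comparing where the same subject appears in the two cameras' images and updating the stored viewpoint data, relies on estimating the residual displacement between the images. A minimal sketch of such an estimate, assuming aligned grayscale arrays and a vertical-only search (`estimate_vertical_offset` is a hypothetical name; the patent does not specify the matching method):

```python
import numpy as np

def estimate_vertical_offset(img_a, img_b, max_shift=5):
    """Estimate the vertical displacement between two images of the same
    subject by minimising the mean absolute difference over integer shifts.
    The returned offset could be used to correct the viewpoint data."""
    best, best_err = 0, float("inf")
    for s in range(-max_shift, max_shift + 1):
        shifted = np.roll(img_b, s, axis=0)
        err = np.abs(img_a.astype(int) - shifted.astype(int)).mean()
        if err < best_err:
            best_err, best = err, s
    return best
```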
  • FIG. 4A is an image (far-infrared light image) taken at dusk with the far-infrared light camera 3; the subjects HM1 and HM2, who are people, appear white because they generate heat, while LED traffic lights, which generate little heat, are not clearly visible.
  • FIG. 4B is an image (visible light image) obtained by photographing the same subject with the visible light camera 1 at the same timing. Since it is dusk, the overall brightness of the subject is low, and the lamp of the self-light-emitting traffic light SG is clearly visible, but the people HM1, HM2, etc. are not clearly visible.
  • since the viewpoint position of the far-infrared light camera 3 and the viewpoint position of the visible light camera 1 are separated in the vertical direction, the viewpoints of the two images differ, as is apparent from FIG. 4. Therefore, if the two images are superimposed as they are, the subjects will be displaced.
  • using the three-dimensional information to shift the subject positions in the two-dimensional image data, viewpoint-converted two-dimensional image data is obtained; an image of such two-dimensional image data is shown in the figure.
  • here, the viewpoint conversion is performed on the visible light image, so that the viewpoint of the far-infrared light image (a) matches the viewpoint of the visible light image (b), as shown in the figure. Therefore, when the image signals are superimposed on each other by the superimposing unit 15, the deviation of subjects at different subject distances is suppressed in the composite image, and a person viewing the composite image does not feel a sense of incongruity.
  • FIG. 7 is a diagram illustrating an example of a composite image in which a visible light image is superimposed on a far-infrared light image.
  • FIG. 8 is a diagram illustrating an example of a composite image in which a far-infrared light image is superimposed on a visible light image.
  • since the whitish people HM1 and HM2 are displayed as if floating in the dark dusk landscape, the driver can quickly and clearly identify the subjects requiring attention.
  • FIG. 9 is a diagram showing an example of a composite image obtained by superimposing only extracted portions of the visible light image and of the far-infrared light image.
  • the subject recognition unit 13 extracts only the traffic light SG from the visible light image based on, for example, the color and shape of the subject, and extracts from the far-infrared light image only the persons HM1 and HM2, who generate heat and therefore have high far-infrared luminance values; the superimposition unit 15 then composites and displays them embedded on a black background, so the driver can quickly and clearly identify the subjects requiring attention.
  • FIG. 10 is a diagram showing an example of a composite image in which a far-infrared light image is superimposed on a visible light image and further contoured.
  • the subject recognition unit 13 extracts only the traffic light SG from the visible light image based on, for example, the color and shape of the subject, and extracts from the far-infrared light image only the persons HM1 and HM2, who generate heat and radiate far-infrared light; the data processing unit 14 frames the traffic light SG and the persons HM1 and HM2 (F1, F2, and F3), and the superimposing unit 15 composites and displays them, so the driver can quickly and clearly identify the subjects requiring attention.
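The extraction-and-embedding composition described for FIGS. 9 and 10 can be sketched with simple per-pixel rules. This is a toy illustration, assuming already aligned (viewpoint-converted) images; the color test standing in for the traffic-light recognition and the luminance threshold standing in for heat detection are invented placeholders, not the patent's recognition logic.

```python
import numpy as np

def extract_and_compose(visible_rgb, far_ir, red_thresh=200, ir_thresh=180):
    """Keep signal-lamp-like pixels of the visible image and hot
    (high-luminance) pixels of the far-IR image, embedded on a black
    background. Thresholds are illustrative only."""
    out = np.zeros_like(visible_rgb)
    # lamp-like pixels: strong red channel clearly dominating green
    lamp = (visible_rgb[..., 0] >= red_thresh) & \
           (visible_rgb[..., 0].astype(int) > visible_rgb[..., 1].astype(int) + 50)
    out[lamp] = visible_rgb[lamp]
    # heat-emitting subjects: far-IR luminance above threshold, shown grey
    hot = far_ir >= ir_thresh
    out[hot] = np.stack([far_ir] * 3, axis=-1)[hot]
    return out
```

A framing step like FIG. 10's F1-F3 would then draw rectangles around the connected regions of `lamp` and `hot`.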
  • the subject to be extracted is not limited to the above, and may be a vehicle, a sign, an obstacle, or the like.
  • FIG. 11 is a block diagram of an image composition apparatus according to the second embodiment.
  • in the second embodiment, the distance to the subject is detected by a separate distance detection device (ranging means) 5 instead of by the stereo camera system.
  • the image composition unit 20 includes a three-dimensional information generation unit 21, a viewpoint data unit 22, a viewpoint conversion unit 23, and a superimposition unit 24.
  • the three-dimensional information generation unit 21 detects the subject distance based on the signal from the distance detection device 5.
  • the viewpoint conversion unit 23 receives the image signal from the first information acquisition device (first image acquisition means) 6 and converts the viewpoint position with reference to the preset relative position data of the first information acquisition device 6 and the second information acquisition device 7 stored in the viewpoint data unit 22.
  • the superimposing unit 24 receives the image signal from the second information acquisition device 7 (second image acquisition means), and synthesizes it so as to be superimposed on the image signal of the first information acquisition device 6 whose viewpoint position has been converted.
  • the synthesized image signal is output from the image synthesis unit 20 and displayed by the display device 4 (FIG. 2) or the like.
  • the distance detection device 5 may detect the subject distance by projecting infrared light or the like, using a light-section method or TOF (Time of Flight).
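For the TOF (Time of Flight) approach mentioned here, the distance follows directly from the measured round-trip time of the projected pulse; since the pulse travels out and back, the one-way distance is half of c·t. A one-line illustration (the function name is hypothetical):

```python
def tof_distance_m(round_trip_s, c=299_792_458.0):
    """Time-of-flight ranging: the pulse travels to the subject and back,
    so the one-way distance is c * t / 2 (metres)."""
    return c * round_trip_s / 2.0
```

A 200 ns round trip, for instance, corresponds to a subject about 30 m away.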
  • the first information acquisition device 6 may be a visible light camera, an infrared light camera, or the like.
  • the second information acquisition device 7 may be a far-infrared light camera, a millimeter wave radar, a laser radar, or the like.
  • a near-infrared light camera may be used instead of the visible light camera, or a camera having sensitivity to visible light and near-infrared light may be used.
  • the present invention is particularly effective for in-vehicle cameras and robot-mounted cameras, for example, but the application is not limited thereto.

Abstract

The present invention relates to an image synthesis device capable of imaging an object regardless of the brightness of the object and of suppressing the misalignment of objects in the synthesized image. A viewpoint conversion unit (12) performs a viewpoint conversion on a visible light image, so that the viewpoint of a far-infrared light image matches the viewpoint of the visible light image. Consequently, when the image signals of these images are superimposed in a superimposition unit (15), the misalignment of objects at different object distances is suppressed in the synthesized image, and the synthesized image does not appear strange to a person viewing it.
PCT/JP2011/076669 2010-12-01 2011-11-18 Dispositif de synthèse d'image WO2012073722A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US13/990,808 US20130250070A1 (en) 2010-12-01 2011-11-18 Image synthesis device
JP2012546774A JP5783471B2 (ja) 2010-12-01 2011-11-18 画像合成装置

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2010-268170 2010-12-01
JP2010268170 2010-12-01

Publications (1)

Publication Number Publication Date
WO2012073722A1 true WO2012073722A1 (fr) 2012-06-07

Family

ID=46171666

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2011/076669 WO2012073722A1 (fr) 2010-12-01 2011-11-18 Dispositif de synthèse d'image

Country Status (3)

Country Link
US (1) US20130250070A1 (fr)
JP (1) JP5783471B2 (fr)
WO (1) WO2012073722A1 (fr)

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2779624A1 (fr) * 2013-03-15 2014-09-17 Infrared Integrated Systems Ltd. Appareil et procédé d'imagerie multispectrale avec superposition tridimensionnelle
WO2014174765A1 (fr) * 2013-04-26 2014-10-30 コニカミノルタ株式会社 Dispositif de capture d'image et procédé de capture d'image
WO2015182771A1 (fr) * 2014-05-30 2015-12-03 日本電産エレシス株式会社 Dispositif de capture d'image, dispositif de traitement d'image, procédé de traitement d'image et programme informatique
JP2016005213A (ja) * 2014-06-19 2016-01-12 株式会社Jvcケンウッド 撮像装置及び赤外線画像生成方法
JP2016111475A (ja) * 2014-12-04 2016-06-20 ソニー株式会社 画像処理装置、画像処理方法、及び撮像システム
JP2016189184A (ja) * 2015-03-11 2016-11-04 ザ・ボーイング・カンパニーThe Boeing Company リアルタイム多次元画像融合
JP2017220923A (ja) * 2016-06-07 2017-12-14 パナソニックIpマネジメント株式会社 画像生成装置、画像生成方法、およびプログラム
WO2019069581A1 (fr) * 2017-10-02 2019-04-11 ソニー株式会社 Dispositif et procédé de traitement d'image
WO2019111464A1 (fr) * 2017-12-04 2019-06-13 ソニー株式会社 Dispositif et procédé de traitement d'image
WO2020039605A1 (fr) * 2018-08-20 2020-02-27 コニカミノルタ株式会社 Dispositif de détection de gaz, dispositif de traitement d'informations et programme
WO2021053969A1 (fr) * 2019-09-20 2021-03-25 キヤノン株式会社 Dispositif d'imagerie, procédé de commande de dispositif d'imagerie et programme associé
WO2021241533A1 (fr) * 2020-05-29 2021-12-02 富士フイルム株式会社 Système, procédé et programme d'imagerie, et procédé d'acquisition d'informations
WO2022163337A1 (fr) * 2021-01-29 2022-08-04 株式会社小松製作所 Système d'affichage et procédé d'affichage
WO2022195970A1 (fr) * 2021-03-19 2022-09-22 株式会社Jvcケンウッド Dispositif d'avertissement et procédé d'avertissement
WO2022239573A1 (fr) * 2021-05-13 2022-11-17 富士フイルム株式会社 Dispositif de traitement d'image, procédé de traitement d'image et programme de traitement d'image

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR3023111B1 (fr) * 2014-06-30 2017-10-20 Safel Systeme de vision
JP5959073B2 (ja) 2014-09-30 2016-08-02 インターナショナル・ビジネス・マシーンズ・コーポレーションInternational Business Machines Corporation 検出装置、検出方法、及び、プログラム
WO2018047687A1 (fr) * 2016-09-12 2018-03-15 パナソニックIpマネジメント株式会社 Dispositif et procédé de génération de modèle tridimensionnel
DE102018203910B3 (de) 2018-03-14 2019-06-13 Audi Ag Fahrerassistenzsystem sowie Verfahren für ein Kraftfahrzeug zum Anzeigen einer erweiterten Realität

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005037366A (ja) * 2003-06-24 2005-02-10 Constec Engi Co 赤外線構造物診断システム及び赤外線構造物診断方法
JP2006060425A (ja) * 2004-08-18 2006-03-02 Olympus Corp 画像生成方法および装置
JP2006148327A (ja) * 2004-11-17 2006-06-08 Olympus Corp 画像生成装置
JP2007232652A (ja) * 2006-03-02 2007-09-13 Fujitsu Ltd 路面状態判定装置および路面状態判定方法
JP2011039727A (ja) * 2009-08-10 2011-02-24 Ihi Corp 車両操縦用画像表示装置および方法

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2309453A3 (fr) * 1998-07-31 2012-09-26 Panasonic Corporation Appareil d'affichage d'images et procédé d'affichage d'images
JP3910893B2 (ja) * 2002-08-30 2007-04-25 富士通株式会社 画像抽出方法及び認証装置
DE10305861A1 (de) * 2003-02-13 2004-08-26 Adam Opel Ag Vorrichtung eines Kraftfahrzeuges zur räumlichen Erfassung einer Szene innerhalb und/oder außerhalb des Kraftfahrzeuges
JP4376653B2 (ja) * 2004-02-17 2009-12-02 富士重工業株式会社 車外監視装置
JP2006333132A (ja) * 2005-05-26 2006-12-07 Sony Corp 撮像装置及び撮像方法、プログラム、プログラム記録媒体並びに撮像システム
US7786898B2 (en) * 2006-05-31 2010-08-31 Mobileye Technologies Ltd. Fusion of far infrared and visible images in enhanced obstacle detection in automotive applications


Cited By (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9654704B2 (en) 2013-03-15 2017-05-16 Infrared Integrated Systems, Ltd. Apparatus and method for multispectral imaging with three dimensional overlaying
EP2779623A1 (fr) * 2013-03-15 2014-09-17 Infrared Integrated Systems Limited Apparatus and method for multispectral imaging with parallax correction
CN104079839A (zh) * 2013-03-15 2014-10-01 Infrared Integrated Systems Ltd Apparatus and method for multispectral imaging with parallax correction
US20140362227A1 (en) * 2013-03-15 2014-12-11 Infrared Integrated Systems, Ltd. Apparatus and method for multispectral imaging with parallax correction
EP2779624A1 (fr) * 2013-03-15 2014-09-17 Infrared Integrated Systems Ltd. Apparatus and method for multispectral imaging with three-dimensional overlaying
US9729803B2 (en) * 2013-03-15 2017-08-08 Infrared Integrated Systems, Ltd. Apparatus and method for multispectral imaging with parallax correction
WO2014174765A1 (fr) * 2013-04-26 2014-10-30 Konica Minolta, Inc. Image capturing device and image capturing method
WO2015182771A1 (fr) * 2014-05-30 2015-12-03 Nidec Elesys Corporation Image capturing device, image processing device, image processing method, and computer program
JP2016005213A (ja) * 2014-06-19 2016-01-12 JVC Kenwood Corporation Imaging device and infrared image generation method
JP2016111475A (ja) * 2014-12-04 2016-06-20 Sony Corporation Image processing device, image processing method, and imaging system
JP2016189184A (ja) * 2015-03-11 2016-11-04 The Boeing Company Real-time multi-dimensional image fusion
JP2017220923A (ja) * 2016-06-07 2017-12-14 Panasonic Intellectual Property Management Co., Ltd. Image generation device, image generation method, and program
WO2019069581A1 (fr) * 2017-10-02 2019-04-11 Sony Corporation Image processing device and method
US11468574B2 (en) 2017-10-02 2022-10-11 Sony Corporation Image processing apparatus and image processing method
JPWO2019069581A1 (ja) * 2017-10-02 2020-11-19 Sony Corporation Image processing apparatus and image processing method
JP7188394B2 (ja) 2017-10-02 2022-12-13 Sony Group Corporation Image processing apparatus and image processing method
WO2019111464A1 (fr) * 2017-12-04 2019-06-13 Sony Corporation Image processing device and method
JPWO2019111464A1 (ja) * 2017-12-04 2021-01-14 Sony Corporation Image processing apparatus and image processing method
US11641492B2 (en) 2017-12-04 2023-05-02 Sony Corporation Image processing apparatus and image processing method
JP7188397B2 (ja) 2017-12-04 2022-12-13 Sony Group Corporation Image processing apparatus and image processing method
WO2020039605A1 (fr) * 2018-08-20 2020-02-27 Konica Minolta, Inc. Gas detection device, information processing device, and program
US11012656B2 (en) 2018-08-20 2021-05-18 Konica Minolta, Inc. Gas detection device, information processing device, and program
JP2021047377A (ja) * 2019-09-20 2021-03-25 Canon Inc. Imaging device, imaging device control method, and program
WO2021053969A1 (fr) * 2019-09-20 2021-03-25 Canon Inc. Imaging device, imaging device control method, and associated program
WO2021241533A1 (fr) * 2020-05-29 2021-12-02 Fujifilm Corporation Imaging system, imaging method, imaging program, and information acquisition method
JP7436656B2 (ja) 2020-05-29 2024-02-21 Fujifilm Corporation Imaging system, imaging method, imaging program, and information acquisition method
WO2022163337A1 (fr) * 2021-01-29 2022-08-04 Komatsu Ltd. Display system and display method
WO2022195970A1 (fr) * 2021-03-19 2022-09-22 JVC Kenwood Corporation Warning device and warning method
WO2022239573A1 (fr) * 2021-05-13 2022-11-17 Fujifilm Corporation Image processing device, image processing method, and image processing program

Also Published As

Publication number Publication date
US20130250070A1 (en) 2013-09-26
JPWO2012073722A1 (ja) 2014-05-19
JP5783471B2 (ja) 2015-09-24

Similar Documents

Publication Publication Date Title
JP5783471B2 (ja) Image synthesis device
KR102499586B1 (ko) Imaging device
KR102344171B1 (ko) Image generation device, image generation method, and program
US10183621B2 (en) Vehicular image processing apparatus and vehicular image processing system
EP2544449B1 (fr) Vehicle periphery monitoring device
KR100921095B1 (ko) Information display system for vehicle
KR101349025B1 (ко) Apparatus and method for synthesizing lane information for far-infrared smart night view
WO2012169355A1 (fr) Image generation device
US20080198226A1 (en) Image Processing Device
KR101611194B1 (ко) Apparatus and method for generating vehicle surrounding images
JP2008511080A (ja) Method and apparatus for forming a fused image
JP2008230296A (ja) Driving support system for vehicle
JP5951785B2 (ja) Image processing device and vehicle forward monitoring device
JP2008183933A (ja) Infrared night-vision device
KR20150055181A (ко) Night vision information display apparatus and method using a head-up display
CN107399274B (zh) Image superposition method
KR20160136757A (ко) Obstacle detection device using a monocular camera
US11589028B2 (en) Non-same camera based image processing apparatus
US20220279134A1 (en) Imaging device and imaging method
JP2008230358A (ja) Display device
JP4795813B2 (ja) Vehicle surroundings monitoring device
Rickesh et al. Augmented reality solution to the blind spot issue while driving vehicles
US11838656B2 (en) Imaging device and correction method for reducing image height dependency of a depth
CN107003389A (zh) Driver assistance system for vehicle and method for assisting driver of vehicle

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 11845083

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2012546774

Country of ref document: JP

Kind code of ref document: A

WWE Wipo information: entry into national phase

Ref document number: 13990808

Country of ref document: US

NENP Non-entry into the national phase

Ref country code: DE

122 EP: PCT application non-entry in European phase

Ref document number: 11845083

Country of ref document: EP

Kind code of ref document: A1