WO2022163337A1 - Display system and display method - Google Patents
Display system and display method
- Publication number
- WO2022163337A1 (PCT/JP2022/000500)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- visible light
- image
- infrared
- distance
- imaging device
- Prior art date
Images
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/10—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
- H04N23/11—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths for generating image signals from visible and infrared light wavelengths
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/20—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/50—Image enhancement or restoration using two or more images, e.g. averaging or subtraction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/74—Image or video pattern matching; Proximity measures in feature spaces
- G06V10/761—Proximity, similarity or dissimilarity measures
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10024—Color image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10048—Infrared image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20212—Image combination
- G06T2207/20221—Image fusion; Image merging
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2320/00—Control of display operating conditions
- G09G2320/06—Adjustment of display parameters
- G09G2320/0686—Adjustment of display parameters with two or more screen areas displaying information with different brightness or colours
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2340/00—Aspects of display data processing
- G09G2340/12—Overlay of images, i.e. displayed pixel being the result of switching between the corresponding input pixels
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2370/00—Aspects of data communication
- G09G2370/20—Details of the management of multiple sources of image data
Definitions
- the display device 50 is arranged in the remote control room 200 outside the work machine 1.
- the display device 50 displays an image of the work site.
- the image of the work site includes an image of a predetermined range around work machine 1 .
- the image of the predetermined range around work machine 1 includes at least the image of the work target of work machine 1 .
- the work target of the work machine 1 includes the construction target of the work machine 1 .
- the infrared imaging device 30 images the work site.
- the infrared imaging device 30 is provided on the working machine 1 .
- the infrared imaging device 30 is provided on the revolving body 3 .
- the infrared imaging device 30 images a predetermined range around the working machine 1 .
- the infrared imager 30 acquires images in the infrared spectral range.
- the spectrum range of infrared rays is 780 [nm] or more and 100 [ ⁇ m] or less.
- the infrared imager 30 acquires images in the far-infrared spectral range.
- the spectrum range of the infrared imaging device 30 is, for example, 7.5 [ ⁇ m] or more and 14 [ ⁇ m] or less.
- Each of the visible light imaging device 20 and the infrared imaging device 30 has an optical system and an image sensor that receives light that has passed through the optical system.
- the image sensor includes a CCD (Charge Coupled Device) image sensor or a CMOS (Complementary Metal Oxide Semiconductor) image sensor.
- first infrared camera 31 and the second infrared camera 32 may be arranged between the first visible light camera 21 and the second visible light camera 22 .
- the imaging range of the first visible light camera 21, the imaging range of the second visible light camera 22, the imaging range of the first infrared camera 31, and the imaging range of the second infrared camera 32 match.
- the first visible light camera 21 , the second visible light camera 22 , the first infrared camera 31 , and the second infrared camera 32 simultaneously capture the front range of the revolving body 3 .
- It is sufficient that at least part of the imaging range of the first visible light camera 21 matches the imaging range of the second visible light camera 22. It is sufficient that at least part of the imaging range of the first infrared camera 31 matches the imaging range of the second infrared camera 32. It is sufficient that at least part of the imaging range of the visible light imaging device 20 matches the imaging range of the infrared imaging device 30.
- the running body control unit 301 receives the operation signal of the remote control device 40 transmitted from the control device 60 .
- the traveling body control unit 301 outputs a control signal for controlling the operation of the traveling body 2 based on the operation signal of the remote control device 40 .
- the work machine control unit 303 receives the operation signal of the remote control device 40 transmitted from the control device 60 .
- Work machine control unit 303 outputs a control signal for controlling the operation of work machine 4 based on an operation signal from remote control device 40 .
- Control signals for controlling work implement 4 include control signals for controlling hydraulic cylinders 5 .
- the communication device 7 communicates with the communication device 6 via the communication system 400 .
- the communication device 7 receives the operation signal of the remote control device 40 transmitted from the control device 60 via the communication device 6 and outputs it to the control device 300 .
- the communication device 7 transmits the visible light image data and the infrared image data output from the image output unit 304 to the communication device 6 .
- the communication device 7 includes an encoder that compresses each of the visible light image data and the infrared image data. Each of the visible light image data and the infrared image data is transmitted from the communication device 7 to the communication device 6 in a compressed state.
- the control device 60 includes an operation signal output unit 61, a visible light image acquisition unit 62, an infrared image acquisition unit 63, a visible light distance calculation unit 64, an infrared distance calculation unit 65, a determination unit 66, a synthesis unit 67, a reference line generation unit 68, and a display output unit 69.
- the operation signal output unit 61 outputs an operation signal for remotely operating the working machine 1 .
- An operation signal for remotely operating the work machine 1 is generated by the operator operating the remote control device 40 .
- the operation signal output section 61 outputs an operation signal for the remote control device 40 .
- the communication device 6 transmits the operation signal output from the operation signal output section 61 to the communication device 7 .
- the visible light image acquisition unit 62 acquires a visible light image Ga representing the image of the first target captured by the visible light imaging device 20 .
- the visible light image acquisition unit 62 acquires the visible light image Ga by acquiring the visible light image data restored by the communication device 6 .
- the first target includes an imaging target that exists within the imaging range of the visible light imaging device 20 .
- the infrared image acquisition unit 63 acquires an infrared image Gb representing an image of the second target captured by the infrared imaging device 30 .
- the infrared image acquisition unit 63 obtains the infrared image Gb by acquiring the infrared image data restored by the communication device 6 .
- the second target includes an imaging target that exists within the imaging range of the infrared imaging device 30 .
- the visible light distance calculation unit 64 calculates a visible light distance Da that indicates the distance from the visible light imaging device 20 to the first object existing in the imaging range of the visible light imaging device 20 .
- the first visible light camera 21 and the second visible light camera 22 of the visible light imaging device 20 constitute a stereo camera.
- the visible light distance calculation unit 64 stereo-processes the visible light image Ga captured by the first visible light camera 21 and the visible light image Ga captured by the second visible light camera 22, thereby calculating the visible light distance Da from the visible light imaging device 20 to the first target.
- the infrared distance calculation unit 65 calculates an infrared distance Db that indicates the distance from the infrared imaging device 30 to the second object existing in the imaging range of the infrared imaging device 30 .
- the first infrared camera 31 and the second infrared camera 32 of the infrared imaging device 30 constitute a stereo camera.
- the infrared distance calculation unit 65 stereo-processes the infrared image Gb captured by the first infrared camera 31 and the infrared image Gb captured by the second infrared camera 32, thereby calculating the infrared distance Db from the infrared imaging device 30 to the second target.
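- The stereo processing used by the visible light distance calculation unit 64 and the infrared distance calculation unit 65 is not detailed in the text, but a standard disparity-based triangulation would look like the following sketch (the function name and parameter values are illustrative assumptions, not taken from the patent):

```python
def disparity_to_distance(disparity_px: float, focal_px: float, baseline_m: float) -> float:
    """Triangulate distance from stereo disparity: Z = f * B / d.

    disparity_px: horizontal pixel offset of the same target between the two
    cameras of a stereo pair (e.g. camera 21 vs 22, or 31 vs 32).
    focal_px: focal length expressed in pixels; baseline_m: camera separation.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite distance")
    return focal_px * baseline_m / disparity_px
```

For example, with a 0.5 m baseline, a focal length of 1000 px, and a measured disparity of 100 px, the target would be 5 m away.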
- FIG. 5 is a schematic diagram for explaining the visible light imaging device 20 and the infrared imaging device 30 according to the embodiment.
- the visible light imaging device 20 images a predetermined range around the work machine 1 .
- the infrared imaging device 30 images a predetermined range around the work machine 1 .
- the imaging range of the visible light imaging device 20 and at least part of the imaging range of the infrared imaging device 30 match.
- FIG. 5 shows a situation in which the visible light image Ga captured by the visible light imaging device 20 is not blurred.
- the first target imaged by the visible light imaging device 20 and the second target imaged by the infrared imaging device 30 are the same imaging target.
- the visible light distance calculation unit 64 stereo-processes the visible light image Ga captured by the first visible light camera 21 and the visible light image Ga captured by the second visible light camera 22, thereby calculating the visible light distance Da from the visible light imaging device 20 to the first target.
- the infrared distance calculation unit 65 stereo-processes the infrared image Gb captured by the first infrared camera 31 and the infrared image Gb captured by the second infrared camera 32, thereby calculating the infrared distance Db from the infrared imaging device 30 to the second target.
- the visible light distance Da and the infrared distance Db are substantially equal.
- FIG. 6 is a schematic diagram for explaining the visible light image Ga and the infrared image Gb according to the embodiment.
- FIG. 6 shows the visible light image Ga and the infrared image Gb captured when the visible light image Ga captured by the visible light imaging device 20 is not blurred.
- the first target imaged by the visible light imaging device 20 and the second target imaged by the infrared imaging device 30 are the same imaging target.
- the first object appearing in the visible light image Ga and the second object appearing in the infrared image Gb are the same imaging object.
- a plurality of first partitioned regions Pa are defined in the visible light image Ga.
- the plurality of first partitioned regions Pa are defined in a matrix in the visible light image Ga.
- the first partitioned region Pa includes pixels (pixel regions) of the visible light image Ga. The pixels of the visible light image Ga correspond to the pixels of the image sensor of the visible light imaging device 20 .
- a plurality of second partitioned regions Pb are defined in the infrared image Gb.
- the plurality of second partitioned regions Pb are defined in a matrix in the infrared image Gb.
- the second partitioned region Pb includes pixels (pixel regions) of the infrared image Gb.
- the pixels of the infrared image Gb correspond to the pixels of the image sensor of the infrared imaging device 30 .
- the reference point Po of the image area is set in each of the visible light image Ga and the infrared image Gb.
- the positions indicated by the reference points Po of both images are the same, and the first partitioned regions Pa dividing the visible light image Ga and the second partitioned regions Pb dividing the infrared image Gb are calibrated with respect to the reference point Po so that their sizes and positions (the ranges shown in the image) match.
- the structure of the optical system and the structure of the image sensor of the visible light imaging device 20 may differ from those of the infrared imaging device 30 .
- the focal length of the optical system of the visible light imaging device 20 may differ from the focal length of the optical system of the infrared imaging device 30, and the visual field range of the optical system of the visible light imaging device 20 may differ from the visual field range of the optical system of the infrared imaging device 30 .
- even so, the plurality of first partitioned regions Pa (pixels) defined in the visible light image Ga and the plurality of second partitioned regions Pb (pixels) defined in the infrared image Gb correspond one to one, and the regions indicated by corresponding pixels coincide in position.
- the size of the first partitioned region Pa and the size of the second partitioned region Pb are equal.
- the number of first partitioned regions Pa and the number of second partitioned regions Pb are equal.
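- Because the two sensors can have different focal lengths and fields of view, establishing this one-to-one correspondence implies resampling one image onto the other's pixel grid about the reference point Po. A minimal nearest-neighbour sketch under assumed calibration values (the patent does not give the calibration math; names are illustrative):

```python
import numpy as np

def resample_about_reference(src, scale, ref_src, ref_dst, out_shape):
    """Map each output pixel back into the source image, scaling about a
    reference point: src_pos = (dst_pos - ref_dst) / scale + ref_src."""
    ys, xs = np.indices(out_shape)
    src_y = np.clip(((ys - ref_dst[0]) / scale + ref_src[0]).round().astype(int),
                    0, src.shape[0] - 1)
    src_x = np.clip(((xs - ref_dst[1]) / scale + ref_src[1]).round().astype(int),
                    0, src.shape[1] - 1)
    return src[src_y, src_x]  # nearest-neighbour lookup
```

After such a resampling step, a region Pb of the infrared image covers the same physical range as the corresponding region Pa of the visible light image.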
- the visible light distance calculator 64 calculates the visible light distance Da for each of the plurality of first partitioned areas Pa defined in the visible light image Ga.
- the visible light distance calculation unit 64 calculates the visible light distance Da from the visible light imaging device 20 to the first target existing in the imaging range of the visible light imaging device 20 for each of the plurality of first divided regions Pa of the visible light image Ga. calculate.
- the infrared distance calculator 65 calculates the infrared distance Db for each of the plurality of second partitioned areas Pb defined in the infrared image Gb.
- the infrared distance calculation unit 65 calculates an infrared distance Db from the infrared imaging device 30 to a second target existing in the imaging range of the infrared imaging device 30 for each of the plurality of second partitioned regions Pb of the infrared image Gb.
- the synthesizing unit 67 synthesizes the visible light image Ga with the second partitioned region Pb in which the difference between the visible light distance Da and the infrared distance Db is equal to or greater than the distance threshold to generate the synthesized image Gd.
- the synthesizing unit 67 combines the cutout region Gc cut out from the infrared image Gb with the visible light image Ga from which the plurality of first partitioned regions Pa corresponding to the cutout region Gc have been removed, thereby generating the composite image Gd.
- the synthesizing unit 67 combines the cutout region Gc and the visible light image Ga so that the cutout region Gc, which includes the plurality of second partitioned regions Pb for which the difference between the visible light distance Da and the infrared distance Db is equal to or greater than the distance threshold, is superimposed on the first partitioned regions Pa of the visible light image Ga corresponding to the cutout region Gc.
- the synthesizing unit 67 fits the cutout region Gc, which includes the plurality of second partitioned regions Pb whose difference between the visible light distance Da and the infrared distance Db is equal to or greater than the distance threshold, into the portion of the visible light image Ga from which the first partitioned regions Pa have been removed.
- the synthesizing unit 67 may perform smoothing processing on the boundary between the visible light image Ga and the cutout region Gc after fitting (overlapping) the cutout region Gc to the visible light image Ga.
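- Taken together, the judgment by the determination unit 66 and the synthesis by the synthesizing unit 67 can be sketched as a per-region mask-and-replace. The array names and threshold value below are assumptions for illustration, and each partitioned region is treated as one pixel:

```python
import numpy as np

def synthesize(visible_rgb, infrared_gray, da, db, distance_threshold):
    """Where |Da - Db| >= threshold (dust likely blocks the visible camera),
    paste the infrared image over the visible-light image."""
    mask = np.abs(da - db) >= distance_threshold   # judgment (determination unit 66)
    composite = visible_rgb.copy()                 # start from visible image Ga
    # replicate the single infrared channel across RGB inside the cutout Gc
    composite[mask] = infrared_gray[mask][:, None]
    return composite, mask
```

In practice, a smoothing filter would then be applied along the mask boundary, as the text describes for the boundary between Ga and Gc.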
- the reference line generation unit 68 generates a reference line image Ge indicating the distance from the work machine 1 based on the infrared distance Db calculated by the infrared distance calculation unit 65 .
- the operator of the work machine 1 can check the display device 50 and recognize the situation around the work machine 1 .
- the operator of the work machine 1 can check the display device 50 and recognize the distance from the work machine 1 to the imaging target (construction target).
- the synthesized image Gd is not displayed on the display device 50 and the visible light image Ga is displayed on the display device 50 .
- the visible light image Ga is displayed on the display device 50 instead of the synthesized image Gd when the area of the object to be imaged that is blocked by the dust is small.
- FIG. 13 is a block diagram showing a computer system 1000 according to an embodiment.
- the controller 60 described above includes a computer system 1000 .
- the computer system 1000 has a processor 1001 such as a CPU (Central Processing Unit), a main memory 1002 including a non-volatile memory such as a ROM (Read Only Memory) and a volatile memory such as a RAM (Random Access Memory), a storage 1003, and an interface 1004 including an input/output circuit.
- the functions of the control device 60 described above are stored in the storage 1003 as a computer program.
- the processor 1001 reads the computer program from the storage 1003, loads it into the main memory 1002, and executes the above-described processing according to the program. Note that the computer program may be distributed to the computer system 1000 via a network.
Abstract
Description
FIG. 1 is a schematic diagram showing a remote operation system 100 for a work machine 1 according to the embodiment. The remote operation system 100 remotely operates the work machine 1 present at a work site. At least part of the remote operation system 100 is arranged in a remote control room 200 at a remote operation site. The remote operation system 100 includes a remote control device 40, a display device 50, and a control device 60.
FIG. 2 is a perspective view showing the work machine 1 according to the embodiment. In the embodiment, the work machine 1 is a hydraulic excavator. The work machine 1 operates at the work site.
FIG. 3 is a perspective view showing the visible light imaging device 20 and the infrared imaging device 30 according to the embodiment. As shown in FIG. 3, each of the visible light imaging device 20 and the infrared imaging device 30 is arranged at the upper front portion of the revolving body 3. Each of the visible light imaging device 20 and the infrared imaging device 30 images the area in front of the revolving body 3. In the embodiment, the predetermined range around the work machine 1 imaged by the visible light imaging device 20 and the infrared imaging device 30 is the front range of the revolving body 3.
FIG. 4 is a functional block diagram showing the remote operation system 100 for the work machine 1 according to the embodiment. As shown in FIG. 4, the remote operation system 100 has a display system 10 that displays images of the work site. The remote operation system 100 also includes a communication device 6 arranged at the remote operation site, the control device 60 connected to the communication device 6, the remote control device 40 connected to the control device 60, and the display device 50 connected to the control device 60. The remote operation system 100 further includes a communication device 7 arranged on the work machine 1, a control device 300 connected to the communication device 7, the visible light imaging device 20 connected to the control device 300, the infrared imaging device 30 connected to the control device 300, a traveling body 2 controlled by the control device 300, the revolving body 3 controlled by the control device 300, and hydraulic cylinders 5 controlled by the control device 300. The display system 10 includes the visible light imaging device 20, the infrared imaging device 30, the control device 60, and the display device 50.
FIG. 12 is a flowchart showing the display method according to the embodiment.
FIG. 13 is a block diagram showing a computer system 1000 according to the embodiment. The control device 60 described above includes the computer system 1000. The computer system 1000 has a processor 1001 such as a CPU (Central Processing Unit), a main memory 1002 including a non-volatile memory such as a ROM (Read Only Memory) and a volatile memory such as a RAM (Random Access Memory), a storage 1003, and an interface 1004 including an input/output circuit. The functions of the control device 60 described above are stored in the storage 1003 as a computer program. The processor 1001 reads the computer program from the storage 1003, loads it into the main memory 1002, and executes the above-described processing according to the program. Note that the computer program may be distributed to the computer system 1000 via a network.
As described above, in a situation where no event has occurred that makes the visible light image Ga unclear, the visible light image Ga is displayed on the display device 50, as described in step S9. The visible light image Ga is a color image, and the resolution of the visible light image Ga is higher than that of the infrared image Gb. That is, the visibility of the visible light image Ga is superior to that of the infrared image Gb. Therefore, in a situation where no event has occurred that makes the visible light image Ga unclear, the display system 10 displays the visible light image Ga on the display device 50. The display system 10 can thereby provide the operator of the work machine 1 with the situation around the work machine 1.
In the embodiment described above, the event that makes the image captured by the visible light imaging device 20 unclear is the generation of dust. Other examples of events that make the image captured by the visible light imaging device 20 unclear include the generation of fog, insufficient visible light due to the work machine operating at night, and the imaging target being backlit.
Claims (7)
- 1. A display system comprising: a visible light image acquisition unit that acquires a visible light image showing an image of a first target captured by a visible light imaging device provided on a work machine; an infrared image acquisition unit that acquires an infrared image showing an image of a second target captured by an infrared imaging device provided on the work machine; a visible light distance calculation unit that calculates a visible light distance, indicating the distance from the visible light imaging device to the first target, for each of a plurality of first partitioned regions defined in the visible light image; an infrared distance calculation unit that calculates an infrared distance, indicating the distance from the infrared imaging device to the second target, for each of a plurality of second partitioned regions defined in the infrared image so as to correspond to the first partitioned regions; a determination unit that determines, for each pair of corresponding first and second partitioned regions, whether the difference between the visible light distance and the infrared distance is equal to or greater than a distance threshold; a synthesizing unit that generates a composite image by combining, with the visible light image, the second partitioned regions for which the difference is equal to or greater than the distance threshold; and a display output unit that outputs the composite image so that the composite image is displayed on a display device.
- 2. The display system according to claim 1, wherein the synthesizing unit combines the second partitioned regions with the visible light image so that each second partitioned region for which the difference is equal to or greater than the distance threshold is superimposed on the first partitioned region of the visible light image corresponding to that second partitioned region.
- 3. The display system according to claim 1 or 2, wherein the composite image includes the second partitioned regions for which the difference is equal to or greater than the distance threshold and the first partitioned regions for which the difference is less than the distance threshold.
- 4. The display system according to any one of claims 1 to 3, wherein the visible light imaging device includes a first visible light camera and a second visible light camera, and the visible light distance calculation unit calculates the visible light distance by stereo-processing a visible light image captured by the first visible light camera and a visible light image captured by the second visible light camera.
- 5. The display system according to any one of claims 1 to 4, wherein the infrared imaging device includes a first infrared camera and a second infrared camera, and the infrared distance calculation unit calculates the infrared distance by stereo-processing an infrared image captured by the first infrared camera and an infrared image captured by the second infrared camera.
- 6. The display system according to any one of claims 1 to 5, further comprising a reference line generation unit that generates a reference line image indicating the distance from the work machine based on the infrared distance, wherein the display output unit outputs the reference line image so that the reference line image is displayed on the display device.
- 7. A display method comprising: acquiring a visible light image showing an image of a first target captured by a visible light imaging device provided on a work machine; acquiring an infrared image showing an image of a second target captured by an infrared imaging device provided on the work machine; calculating a visible light distance, indicating the distance from the visible light imaging device to the first target, for each of a plurality of first partitioned regions defined in the visible light image; calculating an infrared distance, indicating the distance from the infrared imaging device to the second target, for each of a plurality of second partitioned regions defined in the infrared image so as to correspond to the first partitioned regions; determining, for each pair of corresponding first and second partitioned regions, whether the difference between the visible light distance and the infrared distance is equal to or greater than a distance threshold; generating a composite image by combining, with the visible light image, the second partitioned regions for which the difference is equal to or greater than the distance threshold; and outputting the composite image so that the composite image is displayed on a display device.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
AU2022214854A AU2022214854A1 (en) | 2021-01-29 | 2022-01-11 | Display system and display method |
CA3203152A CA3203152A1 (en) | 2021-01-29 | 2022-01-11 | Display system and display method |
US18/039,591 US20240015374A1 (en) | 2021-01-29 | 2022-01-11 | Display system and display method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021013735A JP2022117174A (ja) | 2021-01-29 | 2021-01-29 | 表示システム及び表示方法 |
JP2021-013735 | 2021-01-29 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022163337A1 true WO2022163337A1 (ja) | 2022-08-04 |
Family
ID=82653264
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/000500 WO2022163337A1 (ja) | 2021-01-29 | 2022-01-11 | 表示システム及び表示方法 |
Country Status (5)
Country | Link |
---|---|
US (1) | US20240015374A1 (ja) |
JP (1) | JP2022117174A (ja) |
AU (1) | AU2022214854A1 (ja) |
CA (1) | CA3203152A1 (ja) |
WO (1) | WO2022163337A1 (ja) |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0937147A (ja) * | 1995-07-19 | 1997-02-07 | Honda Motor Co Ltd | 視覚装置 |
JP2008230358A (ja) * | 2007-03-19 | 2008-10-02 | Honda Motor Co Ltd | 表示装置 |
JP2009174854A (ja) * | 2006-05-09 | 2009-08-06 | Panasonic Corp | 測距用画像選択機能を有する測距装置 |
JP2012027773A (ja) * | 2010-07-26 | 2012-02-09 | Toyota Central R&D Labs Inc | 擬似濃淡画像生成装置及びプログラム |
WO2012073722A1 (ja) * | 2010-12-01 | 2012-06-07 | コニカミノルタホールディングス株式会社 | 画像合成装置 |
US20140368640A1 (en) * | 2012-02-29 | 2014-12-18 | Flir Systems Ab | Method and system for performing alignment of a projection image to detected infrared (ir) radiation information |
CN104537786A (zh) * | 2014-11-10 | 2015-04-22 | 国家电网公司 | 用于变电站的红外远程影像识别报警装置 |
CN106683039A (zh) * | 2016-11-21 | 2017-05-17 | 云南电网有限责任公司电力科学研究院 | 一种生成火情态势图的系统 |
US20200300087A1 (en) * | 2019-03-20 | 2020-09-24 | Joy Global Underground Mining Llc | Systems and methods for controlling a longwall mining system based on a forward-looking mine profile |
-
2021
- 2021-01-29 JP JP2021013735A patent/JP2022117174A/ja active Pending
-
2022
- 2022-01-11 US US18/039,591 patent/US20240015374A1/en active Pending
- 2022-01-11 AU AU2022214854A patent/AU2022214854A1/en active Pending
- 2022-01-11 CA CA3203152A patent/CA3203152A1/en active Pending
- 2022-01-11 WO PCT/JP2022/000500 patent/WO2022163337A1/ja active Application Filing
Also Published As
Publication number | Publication date |
---|---|
CA3203152A1 (en) | 2022-08-04 |
US20240015374A1 (en) | 2024-01-11 |
AU2022214854A1 (en) | 2023-06-29 |
JP2022117174A (ja) | 2022-08-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP7108750B2 (ja) | システムおよび方法 | |
US10650251B2 (en) | Monitoring image display device of industrial machine | |
AU2013293921B2 (en) | Environment monitoring device for operating machinery | |
JP5888956B2 (ja) | ショベル及び該ショベルの周囲画像表示方法 | |
US9836938B2 (en) | Shovel having audio output device installed in cab | |
US11447928B2 (en) | Display system, display method, and remote control system | |
EP3666977B1 (en) | Road machine | |
CA2815822C (en) | Dump truck | |
US20150009329A1 (en) | Device for monitoring surroundings of machinery | |
KR101752613B1 (ko) | 작업 기계의 주변 감시 장치 | |
US10621743B2 (en) | Processing-target image creating device, processing-target image creating method, and operation assisting system | |
US20190199940A1 (en) | Image display apparatus | |
JP2013253402A (ja) | 作業機械の周囲監視装置 | |
WO2019189633A1 (ja) | 道路機械 | |
WO2023085311A1 (ja) | 表示システム及び表示方法 | |
WO2022163337A1 (ja) | 表示システム及び表示方法 | |
KR20210034449A (ko) | 건설장비용 전방 영상 생성 장치 | |
JP2016065449A (ja) | ショベル | |
JP7351478B2 (ja) | 表示システム、遠隔操作システム、及び表示方法 | |
US11949845B2 (en) | Dynamic visual overlay for enhanced terrain perception on remote control construction equipment | |
KR102671221B1 (ko) | 화상 처리 장치, 카메라 시스템 및 화상 처리 방법 | |
JP7178334B2 (ja) | ショベル及びショベルの表示装置 | |
JP7009327B2 (ja) | ショベル | |
JP5964353B2 (ja) | ダンプトラック | |
JP2023169512A (ja) | 監視システム、監視システムの制御方法およびプログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22745562 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 3203152 Country of ref document: CA |
|
WWE | Wipo information: entry into national phase |
Ref document number: 18039591 Country of ref document: US |
|
ENP | Entry into the national phase |
Ref document number: 2022214854 Country of ref document: AU Date of ref document: 20220111 Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 22745562 Country of ref document: EP Kind code of ref document: A1 |