WO2015145599A1 - Video projection device - Google Patents

Video projection device

Info

Publication number
WO2015145599A1
WO2015145599A1 (PCT/JP2014/058453)
Authority
WO
WIPO (PCT)
Prior art keywords
projection
laser light
light
video
distance
Prior art date
Application number
PCT/JP2014/058453
Other languages
French (fr)
Japanese (ja)
Inventor
益岡 信夫
将史 山本
崎田 康一
鈴木 基之
Original Assignee
日立マクセル株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 日立マクセル株式会社 (Hitachi Maxell, Ltd.)
Priority to PCT/JP2014/058453
Publication of WO2015145599A1

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B26/00Optical devices or arrangements for the control of light using movable or deformable optical elements
    • G02B26/08Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light
    • G02B26/10Scanning systems
    • G02B26/101Scanning systems with both horizontal and vertical deflecting means, e.g. raster or XY scanners
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B26/00Optical devices or arrangements for the control of light using movable or deformable optical elements
    • G02B26/08Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light
    • G02B26/10Scanning systems
    • G02B26/105Scanning systems with one or more pivoting mirrors or galvano-mirrors
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/02Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes by tracing or scanning a light beam on a screen
    • G09G3/025Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes by tracing or scanning a light beam on a screen with scanning or deflecting the beams in two directions or dimensions
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3129Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM] scanning a light beam on the display screen
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3179Video signal processing therefor
    • H04N9/3185Geometric adjustment, e.g. keystone or convergence

Definitions

  • The present invention relates to a video projection apparatus that projects an image onto the surface of a projection object such as an object or a building.
  • A technique for projecting an image created on a personal computer or the like onto the surface of a projection object such as an object or a building, using an image projection apparatus such as a projector, is known as projection mapping (hereinafter abbreviated as mapping).
  • When performing mapping, the image to be projected must be set according to the shape of the projection object; if the position or shape of the projection object changes, the projection direction and the size of the projected image must be set to match that change.
  • Patent Document 1 discloses a configuration comprising an imaging unit, an image acquisition unit that acquires an image including the projection object captured by the imaging unit, an area extraction unit that extracts the projection area of the projection object from the acquired image, and a mapping unit that maps an object corresponding to the projection area onto that area, so that an object such as a pattern or color can be mapped onto the surface of the projection object even if the projection object changes shape or moves.
  • In other words, in Patent Document 1 the projection area is extracted by acquiring an image containing the projection object with an imaging unit (camera), and an object corresponding to the extracted area is then mapped onto it.
  • In such a camera-based configuration, the size of the projection screen changes according to the distance between the screen and the projection apparatus main body.
  • The size also varies with the zoom position of the projection lens.
  • Furthermore, the projection screen is distorted into a trapezoid depending on the perpendicularity between the screen and the projector main body, so the size and shape of the projection screen (e.g., trapezoidal distortion) vary greatly with the positional relationship between the screen and the projection apparatus.
  • The imaging range likewise changes with the zoom position of the camera.
  • Consequently, before projection mapping display can be performed, a calibration process is required to match the coordinate system of the image displayed by the projection apparatus with the coordinate system of the image acquired by the camera, which is inconvenient.
  • Moreover, since the projection object is detected by photographing it with a camera, object drawing must be stopped during photographing. When the projection object moves, periods in which the object is not displayed therefore occur intermittently; that is, it is difficult to project an object in real time while extracting the projection area.
  • An object of the present invention is to provide a video projection apparatus that reduces the gap between extraction of the projection area and the timing of projecting an object, enabling mapping in almost real time without interruption of the projected video.
  • To achieve this, the present invention provides, in a video projection apparatus that projects an image onto a projection object: a laser light source that emits laser light modulated by an input video signal; a reflection angle variable mirror that scans the laser light to project the image onto the projection object; a light receiving unit that detects laser light reflected by the projection object; a distance calculation unit that calculates the distance to the projection object based on the light reception signal from the light receiving unit; and an image processing unit that performs mapping processing on the projected image based on the calculated distance.
  • FIG. 1 is a configuration diagram showing an embodiment of a video projection apparatus according to the present invention (Embodiment 1).
  • FIG. 2 is a diagram showing an example of image projection by mapping processing.
  • FIG. 6 is a diagram showing the configuration of the laser light source 21′ in Embodiment 2.
  • An image projection apparatus according to the present invention is configured to project an image onto a projection object using a laser light source, measure the position and shape of the projection object using the projected laser light itself, and map the projected image according to the measured position and shape.
  • FIG. 1 is a block diagram showing an embodiment of a video projection apparatus according to the present invention.
  • The image projection apparatus 1 comprises a laser module 2 that emits laser light and, as its drive circuitry, a mirror drive unit 4, a laser light source drive unit 5, a light amount receiving unit 6, an amplification factor control unit 7, a control signal generation unit 9, a pulse generation unit 10, a distance calculation unit 11, a memory 12, and an image processing unit 13.
  • the laser module 2 includes a laser light source 21, a reflection angle variable mirror 22, and a light receiving unit 23.
  • the light receiving unit 23 is arranged outside the housing of the laser module 2 or inside the housing via a lens, a mirror, or the like.
  • For the light receiving unit 23, a high-sensitivity photodiode such as an APD (Avalanche Photo Diode) is used.
  • the image processing unit 13 processes the input video signal, the laser light source driving unit 5 drives the laser module 2, and emits laser light modulated by the video signal toward the projection object 3.
  • the laser light emitted from the laser module 2 is irradiated onto the projection object 3 to display an image, and the reflected light is detected by the light receiving unit 23.
  • the detection signal from the light receiving unit 23 is processed by the light quantity light receiving unit 6 and the pulse generating unit 10, and the distance calculating unit 11 calculates the distance to the projection object 3.
  • the calculated distance data is stored in the memory 12.
  • The image processing unit 13 determines the position and shape of the projection object 3 from the distance data stored in the memory 12, and determines the projection area onto which a video object such as a pattern or color is to be projected.
  • a mapping process of a video to be projected is performed in accordance with the determined projection area.
  • In this way, an image is projected onto the surface of the projection object 3 with the laser light emitted from the laser module 2, and the position and shape of the projection object 3 are determined from the reflected laser light so that the projected image can be mapped accordingly.
  • the control signal generation unit 9 generates a mirror drive unit control signal 100, a laser light source drive unit control signal 101, a bias voltage control signal 102, and a synchronization signal 200 based on the input video signal.
  • the laser light source drive unit control signal 101 and the synchronization signal 200 are input to the laser light source drive unit 5.
  • the laser light source driving unit 5 generates a laser driving signal 203 in accordance with the laser light source driving unit control signal 101 and the synchronization signal 200, and adjusts the light amount of the laser light source 21 according to the signal level of the laser driving signal 203 and its application time. Further, the mirror drive unit control signal 100 and the synchronization signal 200 are input to the mirror drive unit 4.
  • the mirror driving unit 4 generates a horizontal driving signal 201 and a vertical driving signal 202 in accordance with the mirror driving unit control signal 100 and the synchronization signal 200.
  • the horizontal direction drive signal 201 and the vertical direction drive signal 202 control the angle of the reflection angle variable mirror 22 in the horizontal direction and the vertical direction, respectively.
  • the bias voltage control signal 102 and the synchronization signal 200 are input to the amplification factor control unit 7.
  • the amplification factor control unit 7 applies a bias voltage to the light receiving unit 23 according to the bias voltage control signal 102 and the synchronization signal 200.
  • the laser module 2 adjusts the light amount of the laser light source 21 and the angle of the reflection angle variable mirror 22 and scans the projection object 3 with the laser light to project an image.
  • a color image or the like can be projected using a plurality of laser light sources as will be described later.
  • The principle of distance measurement is a TOF (Time of Flight) method: the distance is obtained from the time difference between the emission time of the laser light source 21 and the time at which the light receiving unit 23 receives the light reflected from the projection object 3.
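  As an illustrative aside (not part of the patent text), the TOF relation described above can be sketched as follows; the variable names are our own:

```python
# Time-of-flight (TOF) sketch: the laser pulse travels to the projection
# object and back, so the one-way distance is c * dt / 2.
C = 299_792_458.0  # speed of light in m/s

def tof_distance(t_emit_s: float, t_receive_s: float) -> float:
    """Return the distance in meters from emission/reception times (seconds)."""
    dt = t_receive_s - t_emit_s  # round-trip time difference
    return C * dt / 2.0

# A round trip of about 6.67 ns corresponds to roughly 1 m.
print(tof_distance(0.0, 6.67e-9))
```

  A practical implementation would obtain the two timestamps from the laser pulse emission timing signal 103 and the pulse generated from the received reflection.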
  • the laser light applied to the projection object 3 is scattered, and a part of the reflected light is detected by the light receiving unit 23.
  • The signal detected by the light receiving unit 23 is input to the light amount receiving unit 6 and the amplification factor control unit 7.
  • the light quantity light receiving unit 6 amplifies the detected minute signal.
  • The amplification factor control unit 7 sets the light receiving sensitivity of the light receiving unit 23 according to the intensity of the laser light emitted from the laser light source 21, that is, according to the drive signal 203 from the laser light source driving unit 5.
  • the amplification factor control unit 7 sets the light receiving sensitivity of the light receiving unit 23 according to the signal level of the reflected light detected by the light receiving unit 23.
  • The amplified signal from the light amount receiving unit 6 is input to the pulse generation unit 10.
  • the pulse generator 10 compares the input signal with a reference voltage and converts an analog signal into a pulse signal.
  • The distance calculation unit 11 receives the pulse signal generated by the pulse generation unit 10 and calculates the distance d from the time difference Δt relative to the laser pulse emission timing signal 103 of the laser light source driving unit 5.
  • the distance calculation unit 11 obtains the distance to the projection object 3 for each position scanned by the reflection angle variable mirror 22 of the laser module 2. That is, the scan range is divided into predetermined intervals, and the distances at the horizontal and vertical positions are obtained and stored in the memory 12. For example, assume that the projection object 3 in FIG. 1 is composed of a planar object 31 on the back surface and a box-shaped object 32 on the front side. The distance d2 to the object 32 is smaller than the distance d1 to the object 31.
  • The image processing unit 13 refers to the distance data and compares the distance d at each position with a threshold dth (where d2 < dth < d1): a region where d > dth is determined to be the region of the rear object 31, and a region where d < dth the region of the front object 32. The projection image is then mapped according to the position and shape of the projection object 3.
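  A minimal sketch of this threshold segmentation (the values of d1, d2, and dth, and the mesh contents, are invented for illustration):

```python
# Segment a distance mesh into front/rear regions using a threshold dth
# chosen between the two object distances (d2 < dth < d1).
d1, d2, dth = 2.0, 1.0, 1.5  # meters, illustrative

mesh = [
    [d1, d1, d1, d1],
    [d1, d2, d2, d1],
    [d1, d2, d2, d1],
]

# True marks the front object 32 (d < dth); False the rear object 31.
front_mask = [[d < dth for d in row] for row in mesh]
print(front_mask[1])  # the middle row straddles both objects
```

  The image processing unit would use such a mask to decide which video object to draw at each mesh position.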
  • FIG. 2 is a diagram illustrating an example of image projection by mapping processing.
  • Two objects 31 and 32 exist as the projection object 3; video objects 41a and 41b are projected onto the planar object 31, and a video object 42 onto the box-shaped object 32.
  • the image processing unit 13 extracts projection areas 31 a and 32 a (indicated by broken lines) of the objects 31 and 32 from the distance data stored in the memory 12.
  • the image processing unit 13 performs a mapping process of the input video signal so that predetermined objects 41a, 41b, and 42 are projected in accordance with the positions and shapes of the extracted projection areas 31a and 32a. Thereby, even if a projection object (for example, the object 32) moves, the desired object 42 can be projected on the projection area 32a following this.
  • FIG. 3 is a diagram illustrating an example of the configuration of the laser light source 21.
  • The laser light source 21 includes three light sources, a light source 21R that generates red (R) light, a light source 21G that generates green (G) light, and a light source 21B that generates blue (B) light, and can therefore display a color image.
  • the R light beam generated from the light source 21R is reflected by the total reflection mirror 24R and travels toward the reflection angle variable mirror 22.
  • The G light beam generated by the light source 21G is reflected by the wavelength selective mirror 24G (which reflects G light and transmits R light) and travels toward the reflection angle variable mirror 22.
  • The B light beam generated by the light source 21B is reflected by the wavelength selective mirror 24B (which reflects B light and transmits R and G light) and travels toward the reflection angle variable mirror 22. These color beams are combined into one image light beam, scanned by the reflection angle variable mirror 22, and projected onto the projection object 3.
  • the arrangement of the light sources of R light, G light, and B light shown here is an example, and the arrangement may be changed as appropriate in consideration of the transmission efficiency of the light beam.
  • the number of light sources and the light emission color may be appropriately determined according to the projected image.
  • the intensity of the laser beam emitted from the laser light source 21 varies depending on the video signal. Therefore, the intensity of the reflected light reflected from the projection object 3 also changes with the video signal, and the level of the light receiving signal in the light receiving unit 23 varies.
  • Therefore, the amplification factor control unit 7 adjusts the light receiving sensitivity of the light receiving unit 23 based on the laser drive signal 203, so that the influence of this intensity variation on the distance measurement can be eliminated.
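  One way to picture this sensitivity adjustment (a sketch under our own assumptions, not the patent's actual control law): the receiver gain is varied inversely with the emitted intensity so the detected reflection level stays in range regardless of the video signal.

```python
# Hypothetical receiver-gain control: higher emitted intensity -> lower gain,
# clamped to the receiver's supported range.
def receiver_gain(drive_level: float, target: float = 1.0,
                  g_min: float = 1.0, g_max: float = 100.0) -> float:
    """drive_level in (0, 1]; returns a clamped, inversely scaled gain."""
    g = target / max(drive_level, 1e-6)  # avoid division by zero
    return min(max(g, g_min), g_max)

print(receiver_gain(1.0), receiver_gain(0.1))
```

  In the apparatus this role is played by the bias voltage applied to the APD by the amplification factor control unit 7.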
  • FIG. 4 is a diagram illustrating an example of a distance data table stored in the memory 12.
  • the scan range (rectangular region) of the laser light is divided into m ⁇ n meshes, and distance data d at each mesh position is stored. That is, the data amount is for one screen (one frame).
  • the mesh division may be performed in units of pixels, but it is preferable to use a plurality of pixels in units in consideration of storage capacity, distance calculation load, and the like.
  • In the example of FIG. 1, the distance data d1 is stored for the region 31a corresponding to the object 31, and the distance data d2 for the region 32a corresponding to the object 32.
  • The stored data need not be the distance value itself; it may instead be the binarized result of comparing the distance d with the threshold dth, i.e., "1" (d > dth) or "0" (d < dth).
  • the image processing unit 13 can determine the position and shape of the projection object 3 with reference to this table. Each time new distance data is input to the memory 12, the corresponding mesh position data is overwritten and saved, whereby the latest position and shape of the projection object 3 can be known.
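  The table update described above might be sketched like this (the class and method names are ours; the patent only specifies an m × n table with per-mesh overwrite and an optional binarized form):

```python
# m x n distance-data table: each new measurement overwrites its mesh cell,
# so the table always holds the latest position/shape of the target.
class DistanceTable:
    def __init__(self, m: int, n: int):
        self.d = [[None] * n for _ in range(m)]  # one cell per mesh

    def update(self, i: int, j: int, distance: float) -> None:
        self.d[i][j] = distance  # overwrite with the newest measurement

    def binarized(self, dth: float):
        """Compact form: 1 for rear (d > dth), 0 for front, None if unmeasured."""
        return [[None if v is None else int(v > dth) for v in row]
                for row in self.d]

table = DistanceTable(2, 3)
table.update(0, 0, 2.0)  # rear object
table.update(0, 1, 1.0)  # front object
print(table.binarized(1.5)[0])
```

  Storing the binarized form trades distance precision for memory, which matters when the mesh is fine.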
  • FIG. 5 is a diagram showing a flowchart of video projection and mapping processing. In the following flow, loop processing is performed for each frame.
  • In S301, the laser module 2 irradiates the projection object 3 with laser light modulated by the video signal.
  • In S302, it is determined whether the irradiation position (pixel) coincides with a measurement position.
  • The measurement position here refers to the approximate center of a mesh-divided pixel area. If they coincide, the process proceeds to S303; if not, to S306.
  • In S303, the light receiving unit 23 receives the reflected light from the projection object 3.
  • In S304, the distance calculation unit 11 measures the distance to the projection object 3 from the light reception signal.
  • In S305, the measured distance data is stored in the memory 12. As described above, when the mesh division is in units of a plurality of pixels, distance measurement is performed when the laser irradiation position coincides with a mesh position.
  • In S306, it is determined whether irradiation of the entire screen has been completed. If not, the process proceeds to S307, moves to the next irradiation position, and returns to S301; whenever the irradiation position is a measurement position, the distance measurement is repeated. If irradiation of the entire screen has been completed, the process proceeds to S308.
  • In S308, the distance data for one screen is read from the memory 12.
  • In S309, the image processing unit 13 determines the position and shape of the projection object 3 from the distance data.
  • In S310, the image processing unit 13 corrects the video signal according to the position and shape of the projection object 3 (mapping processing). In this embodiment, since the image is projected by laser light, there is no need to adjust focus according to the distance to the projection object.
  • In S311, it is determined whether the video projection operation has ended. If not, the process moves to the first irradiation position in S312, returns to S301, and repeats image projection, distance measurement, and mapping processing.
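  The S301–S312 loop above can be summarized in sketch form (the callbacks stand in for hardware-dependent steps and are purely illustrative):

```python
# One frame of the FIG. 5 flow: irradiate pixel by pixel (S301/S307),
# measure distance at mesh-center positions (S302-S305), then re-derive the
# target shape and remap the video after the full frame (S306, S308-S310).
def project_frame(pixels, is_measurement_point, measure, table, remap):
    for pos in pixels:                 # S301/S307: irradiate each position
        if is_measurement_point(pos):  # S302: at a mesh center?
            table[pos] = measure(pos)  # S303-S305: receive, compute, store
    remap(table)                       # S308-S310: judge shape, remap video

# Illustrative stand-ins for the hardware-dependent pieces:
table = {}
project_frame(
    pixels=[(0, 0), (0, 1), (1, 0), (1, 1)],
    is_measurement_point=lambda p: p[0] % 2 == 0 and p[1] % 2 == 0,
    measure=lambda p: 1.0,
    table=table,
    remap=lambda t: None,
)
print(sorted(table))
```

  Because measurement happens inline with projection, no separate capture pass is needed, which is the point the flowchart makes.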
  • In this way, mapping processing of the video signal to be projected can be performed at any time while the video is being projected.
  • The delay until a distance measurement result for the projection object is reflected in the mapping process is at most the time required to project one screen (i.e., until the determination in S306 becomes Yes). Since the time generally required to project one screen is as short as 1/60 second, this delay poses no particular problem.
  • Alternatively, the vertical resolution may be lowered to reduce the time required to project one screen.
  • This further shortens the delay, reducing the shift in the projection (mapping) position even for a fast-moving projection object.
  • The vertical resolution may be switched as appropriate according to the movement speed of the projection object.
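  The worst-case mapping delay and the effect of lowering vertical resolution can be put in numbers (the 60 Hz frame rate comes from the text; the line counts are hypothetical):

```python
# Worst-case mapping delay is one frame of projection time; halving the
# vertical resolution roughly halves it.
frame_rate_hz = 60
lines_full, lines_reduced = 1080, 540  # hypothetical vertical resolutions

delay_full_ms = 1000.0 / frame_rate_hz                         # ~16.7 ms
delay_reduced_ms = delay_full_ms * lines_reduced / lines_full  # ~8.3 ms
print(round(delay_full_ms, 1), round(delay_reduced_ms, 1))
```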
  • As described above, in this embodiment the time delay of the mapping process is small, and image projection can be executed almost in real time without interruption. Therefore, even when the projection object moves, a video projection apparatus with good appearance and little deviation of the projection position relative to the projection object can be realized.
  • Furthermore, since the laser irradiation position and the measurement position are the same, the projection coordinate system and the measurement coordinate system coincide, so calibration is unnecessary and usability is good.
  • FIG. 6 is a diagram illustrating a configuration of a laser light source 21 ′ according to the second embodiment.
  • the laser light source 21 ′ includes a light source 21IR that generates infrared light (IR light) in addition to a red (R light) light source 21R, a green (G light) light source 21G, and a blue (B light) light source 21B.
  • the IR light beam generated from the light source 21IR is reflected by the total reflection mirror 24IR and travels toward the reflection angle variable mirror 22.
  • The beams from the other light sources 21R, 21G, and 21B are likewise reflected by the wavelength selective mirrors 24R′, 24G′, and 24B′, respectively, travel toward the reflection angle variable mirror 22, and are combined into one beam.
  • the IR light beam emitted from the light source 21IR does not affect the displayed image even if it is irradiated onto the projection object.
  • the light receiving unit 23 and the distance calculating unit 11 measure the distance to the projection object using the IR laser light emitted from the light source 21IR.
  • The light receiving unit 23 is provided with a filter that transmits infrared light and reflects visible light so as to receive only IR light. Since the IR light beam emitted from the light source 21IR has a constant intensity, the intensity of the IR laser light reflected from the projection object is also substantially constant. Therefore, even when the intensity of the visible light (R, G, B) changes with the video signal, or when a dark image with a low video signal level is projected, the distance measurement accuracy does not decrease, and a video projection apparatus with little mapping misalignment and good appearance can be realized.
  • 1: Video projection device, 2: Laser module, 3, 31, 32: Projection object
  • 4: Mirror drive unit, 5: Laser light source drive unit
  • 7: Amplification factor control unit, 9: Control signal generation unit
  • 11: Distance calculation unit, 12: Memory
  • 22: Reflection angle variable mirror, 23: Light receiving unit

Abstract

This video projection device (1) is provided with a laser light source (21) that emits laser light modulated by an inputted video signal, a variable-reflection-angle mirror (22) that projects video onto a projection target (3) by scanning said projection target with the laser light, a light-receiving unit (23) that detects laser light reflected off of the projection target (3), a distance computation unit (11) that computes the distance to the projection target on the basis of a light signal received from the light-receiving unit (23), and an image processing unit (13) that determines the position and shape of the projection target (3) from the computed distance and performs a mapping process to map the projected video to a region of the determined projection target. This results in a video projection device (1) that produces nice-looking video with minimal projection-position misalignment with respect to the projection target (3) even if said projection target moves.

Description

Video projection device
The present invention relates to a video projection apparatus that projects an image onto the surface of a projection object such as an object or a building.

A technique for projecting an image created on a personal computer or the like onto the surface of a projection object such as an object or a building, using an image projection apparatus such as a projector, is known as projection mapping (hereinafter abbreviated as mapping). When performing mapping, the image to be projected must be set according to the shape of the projection object; if the position or shape of the projection object changes, the projection direction and the size of the projected image must be set to match that change.

Patent Document 1 discloses a configuration comprising an imaging unit, an image acquisition unit that acquires an image including the projection object captured by the imaging unit, an area extraction unit that extracts the projection area of the projection object from the acquired image, and a mapping unit that maps an object corresponding to the projection area onto that area, so that an object such as a pattern or color can be mapped onto the surface of the projection object even if the projection object changes shape or moves.
JP 2013-192189 A
According to Patent Document 1, in order to extract the projection area of the projection object, an image including the projection object is acquired by an imaging unit (camera), and an object corresponding to the extracted projection area is mapped onto that area.

In such a configuration, the size of the projection screen changes according to the distance between the screen and the projection apparatus main body. The size also varies with the zoom position of the projection lens. Furthermore, the projection screen is distorted into a trapezoid depending on the perpendicularity between the screen and the projector main body. Therefore, the size and shape of the projection screen (e.g., trapezoidal distortion) vary greatly with the positional relationship between the screen and the projection apparatus. The imaging range also changes with the zoom position of the camera.

As described above, since the size and shape of the projection screen and the imaging range vary with the installation state, the projection apparatus main body, and the camera zoom position, a calibration process is required before projection mapping display to match the coordinate system of the image displayed by the projection apparatus with the coordinate system of the image acquired by the camera, which is inconvenient.

Also, since the projection object is detected by photographing it with a camera, object drawing must be stopped during photographing. When the projection object moves, periods in which the object is not displayed therefore occur intermittently; that is, it is difficult to project an object in real time while extracting the projection area.

An object of the present invention is to provide a video projection apparatus that reduces the gap between extraction of the projection area and the timing of projecting an object, enabling mapping in almost real time without interruption of the projected video.

To achieve this, the present invention provides a video projection apparatus that projects an image onto a projection object, comprising: a laser light source that emits laser light modulated by an input video signal; a reflection angle variable mirror that scans the laser light to project the image onto the projection object; a light receiving unit that detects laser light reflected by the projection object; a distance calculation unit that calculates the distance to the projection object based on the light reception signal from the light receiving unit; and an image processing unit that performs mapping processing on the projected image based on the calculated distance.

According to the present invention, even when the projection object moves, a video projection apparatus with good appearance and little deviation of the projection position relative to the projection object can be realized.
FIG. 1 is a configuration diagram showing an embodiment of a video projection apparatus according to the present invention (Embodiment 1). FIG. 2 is a diagram showing an example of image projection by mapping processing. FIG. 3 is a diagram showing an example of the configuration of the laser light source 21. FIG. 4 is a diagram showing an example of the distance data table stored in the memory 12. FIG. 5 is a flowchart of video projection and mapping processing. FIG. 6 is a diagram showing the configuration of the laser light source 21′ in Embodiment 2.
 The video projection device of the present invention projects video onto a projection object using a laser light source, measures the position and shape of the projection object using the projected laser light, and maps the projected video to the measured position and shape of the object.
 FIG. 1 is a configuration diagram showing an embodiment of a video projection device according to the present invention. The video projection device 1 comprises a laser module 2 that emits laser light, and, as its drive circuitry, a mirror drive unit 4, a laser light source drive unit 5, a light quantity receiving unit 6, an amplification factor control unit 7, a control signal generation unit 9, a pulse generation unit 10, a distance calculation unit 11, a memory 12, and an image processing unit 13. The laser module 2 consists of a laser light source 21, a variable-reflection-angle mirror 22, and a light receiving unit 23. The light receiving unit 23 is arranged outside the housing of the laser module 2, or inside the housing via a lens, mirror, or the like. A high-sensitivity photodiode such as an APD (Avalanche Photo Diode) is used for the light receiving unit 23.
 The image processing unit 13 processes the input video signal, and the laser light source drive unit 5 drives the laser module 2 so that it emits laser light modulated by the video signal toward the projection object 3. The laser light emitted from the laser module 2 irradiates the projection object 3 to display the video, and the reflected light is detected by the light receiving unit 23. The detection signal from the light receiving unit 23 is processed by the light quantity receiving unit 6 and the pulse generation unit 10, and the distance calculation unit 11 calculates the distance to the projection object 3. The calculated distance data is stored in the memory 12. The image processing unit 13 determines the position and shape of the projection object 3 from the distance data stored in the memory 12, and determines the projection area onto which video objects such as patterns and colors are to be projected. It then performs mapping processing of the projected video to fit the determined projection area. In this way, the device projects video onto the surface of the projection object 3 with the laser light emitted from the laser module 2, while also determining the position and shape of the projection object 3 from the reflected laser light and mapping the projected video accordingly.
 First, the operation of the drive circuit will be described. The control signal generation unit 9 generates a mirror drive unit control signal 100, a laser light source drive unit control signal 101, a bias voltage control signal 102, and a synchronization signal 200 based on the input video signal. The laser light source drive unit control signal 101 and the synchronization signal 200 are input to the laser light source drive unit 5. The laser light source drive unit 5 generates a laser drive signal 203 in accordance with these signals, and adjusts the light quantity of the laser light source 21 via the signal level of the laser drive signal 203 and its application time. The mirror drive unit control signal 100 and the synchronization signal 200 are input to the mirror drive unit 4, which generates a horizontal drive signal 201 and a vertical drive signal 202 in accordance with them. The horizontal drive signal 201 and the vertical drive signal 202 control the horizontal and vertical angles of the variable-reflection-angle mirror 22, respectively. The bias voltage control signal 102 and the synchronization signal 200 are input to the amplification factor control unit 7, which applies a bias voltage to the light receiving unit 23 in accordance with them.
 Through the above drive-circuit operation, the laser module 2 adjusts the light quantity of the laser light source 21 and the angle of the variable-reflection-angle mirror 22, and scans the laser light over the projection object 3 to project the video. Although only one laser light source 21 is shown in FIG. 1, a color video or the like can be projected using a plurality of laser light sources, as described later.
 Next, the distance measurement to the projection object 3 will be described. The measurement uses the TOF (Time of Flight) method, which measures the time difference between the moment the laser light source 21 emits light and the moment the light receiving unit 23 receives the light reflected from the projection object 3. The laser light irradiating the projection object 3 is scattered, and part of the reflected light is detected by the light receiving unit 23. The distance d is calculated by multiplying the time difference Δt by the speed of light c (round-trip distance 2d = c·Δt).
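The TOF relation above can be sketched in a few lines; this is an illustrative calculation with idealized timestamps, not part of the patent's disclosure:

```python
# Illustrative TOF (Time of Flight) calculation: the round-trip time
# difference Δt between laser emission and reception of the reflected
# light satisfies 2d = c * Δt, so d = c * Δt / 2.
SPEED_OF_LIGHT = 299_792_458.0  # c in meters per second

def tof_distance(t_emit: float, t_receive: float) -> float:
    """One-way distance d [m] from emission/reception times [s]."""
    dt = t_receive - t_emit  # Δt, the round-trip time
    return SPEED_OF_LIGHT * dt / 2.0

# A 10 ns round trip corresponds to roughly 1.5 m.
```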
 The signal detected by the light receiving unit 23 is input to the light quantity receiving unit 6 and the amplification factor control unit 7. The light quantity receiving unit 6 amplifies the detected minute signal. The amplification factor control unit 7 sets the light receiving sensitivity of the light receiving unit 23 according to the intensity of the laser light emitted from the laser light source 21, that is, according to the drive signal 203 from the laser light source drive unit 5. Alternatively, the amplification factor control unit 7 sets the light receiving sensitivity of the light receiving unit 23 according to the signal level of the reflected light detected by the light receiving unit 23. As a result, a constant-level light reception signal can be obtained even when the intensity of the laser light emitted from the laser light source 21 changes or the reflectance of the projection object 3 changes. The signal from the light quantity receiving unit 6 is input to the pulse generation unit 10, which compares the input signal with a reference voltage and converts the analog signal into a pulse signal. The distance calculation unit 11 receives the pulse signal generated by the pulse generation unit 10 and calculates the distance d from the time difference Δt relative to the laser pulse emission timing signal 103 of the laser light source drive unit 5.
 The distance calculation unit 11 obtains the distance to the projection object 3 for each position scanned by the variable-reflection-angle mirror 22 of the laser module 2. That is, the scan range is divided at predetermined intervals, and the distance at each horizontal and vertical position is obtained and stored in the memory 12. For example, assume that the projection object 3 in FIG. 1 consists of a planar object 31 at the back and a box-shaped object 32 in front. The distance d2 to the object 32 is smaller than the distance d1 to the object 31.
 The image processing unit 13 refers to this distance data and compares the distance d at each position against a threshold dth (where d2 < dth < d1): regions where d > dth belong to the rear object 31, and regions where d < dth belong to the front object 32. The projected video is then mapped to fit the position and shape of the projection object 3.
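The thresholding step described above can be sketched as follows; the function name and labels are illustrative, not taken from the patent:

```python
# Hypothetical sketch of the threshold comparison: with d2 < dth < d1,
# cells whose measured distance exceeds dth are assigned to the rear
# object 31, and the remaining cells to the front object 32.
def classify(distance_map, dth):
    """Label each mesh cell 'rear' (d > dth) or 'front' (d <= dth)."""
    return [["rear" if d > dth else "front" for d in row]
            for row in distance_map]
```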
 FIG. 2 is a diagram showing an example of video projection by mapping processing. Here, two objects 31 and 32 exist as the projection object 3; video objects 41a and 41b are projected onto the planar object 31, and video object 42 onto the box-shaped object 32. The image processing unit 13 extracts the projection areas 31a and 32a of the objects 31 and 32 (shown by broken lines) from the distance data stored in the memory 12. The image processing unit 13 then maps the input video signal so that the predetermined objects 41a, 41b, and 42 are projected in accordance with the positions and shapes of the extracted projection areas 31a and 32a. As a result, even if a projection object (for example, the object 32) moves, the desired object 42 can be projected onto its projection area 32a, following the movement.
 FIG. 3 is a diagram showing an example of the configuration of the laser light source 21. The laser light source 21 comprises three light sources, a light source 21R that generates red (R) light, a light source 21G that generates green (G) light, and a light source 21B that generates blue (B) light, enabling color video display. The R light beam generated by the light source 21R is reflected by the total reflection mirror 24R and travels toward the variable-reflection-angle mirror 22. The G light beam generated by the light source 21G is reflected by the wavelength-selective mirror 24G (which reflects G light and transmits R light) and travels toward the variable-reflection-angle mirror 22. The B light beam generated by the light source 21B is reflected by the wavelength-selective mirror 24B (which reflects B light and transmits R and G light) and travels toward the variable-reflection-angle mirror 22. These reflected color beams are combined into a single video light beam, scanned by the variable-reflection-angle mirror 22, and projected onto the projection object 3. The arrangement of the R, G, and B light sources shown here is only an example, and it may be changed as appropriate in consideration of light beam transmission efficiency and the like. Although three light sources of R, G, and B are used here, the number of light sources and their emission colors may be determined as appropriate according to the video to be projected. Monochromatic light from a single light source may also be used.
 The intensity of the laser light emitted from the laser light source 21 varies with the video signal. Accordingly, the intensity of the light reflected from the projection object 3 also varies with the video signal, and the level of the light reception signal at the light receiving unit 23 fluctuates. However, since the intensity of the laser light emitted from the laser light source 21 is known from the video signal, the amplification factor control unit 7 can adjust the light receiving sensitivity of the light receiving unit 23 based on the laser drive signal 203, eliminating the effect on the distance measurement.
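One way to realize this compensation can be sketched under the assumption of a simple inverse-proportional control law; the patent does not specify the actual law, so everything below is illustrative:

```python
# Hedged sketch: since the emitted intensity is known from the video
# signal, the receiver sensitivity can be scaled inversely with it so
# that the received level stays roughly constant. The inverse-
# proportional model and the clamp value are assumptions.
def receiver_gain(emitted_intensity: float,
                  target_level: float = 1.0,
                  floor: float = 1e-6) -> float:
    """Gain that keeps (gain * emitted_intensity) near target_level."""
    return target_level / max(emitted_intensity, floor)
```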
 FIG. 4 is a diagram showing an example of the distance data table stored in the memory 12. The scan range (rectangular region) of the laser light is divided into an m × n mesh, and the distance data d at each mesh position is stored. That is, the data amount corresponds to one screen (one frame). The mesh may be divided per pixel, but in consideration of storage capacity and the distance-calculation load, it is preferable to use units of multiple pixels.
 When the objects 31 and 32 are arranged as the projection object 3 as in FIG. 2, distance data d1 is stored for the region 31a corresponding to the object 31, and distance data d2 for the region 32a corresponding to the object 32. The stored data need not be the distance value itself; the result of comparing the distance data d with the threshold dth, i.e., binarized data such as "1" (d > dth) and "0" (d < dth), may be stored instead. The image processing unit 13 can determine the position and shape of the projection object 3 by referring to this table. Each time new distance data arrives, it overwrites the data at the corresponding mesh position in the memory 12, so the latest position and shape of the projection object 3 are always available.
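The distance data table of FIG. 4, including the optional binarization and the overwrite-on-update behavior, can be sketched as below; class and method names are illustrative:

```python
# Sketch of the m x n distance data table of Fig. 4. Depending on
# configuration it stores either the raw distance d or its binarized
# form (1 if d > dth else 0), and each new measurement overwrites the
# cell for its mesh position so the table holds the latest data.
class DistanceTable:
    def __init__(self, m, n, dth=None):
        self.dth = dth                        # None -> store raw distances
        self.grid = [[None] * n for _ in range(m)]

    def update(self, i, j, d):
        # Overwrite mesh cell (i, j) with the newest measurement.
        self.grid[i][j] = d if self.dth is None else int(d > self.dth)
```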
 FIG. 5 shows a flowchart of video projection and mapping processing. The following flow is loop-processed for each frame.
 In S301, the laser module 2 irradiates the projection object 3 with laser light modulated by the video signal.
 In S302, it is determined whether the irradiation position (pixel) coincides with a measurement position. Here, a measurement position is the approximate center of a mesh-divided pixel area. If they coincide, the process proceeds to S303; otherwise, to S306.
 In S303, the light receiving unit 23 receives the reflected light from the projection object 3. In S304, the distance calculation unit 11 measures the distance to the projection object 3 from the light reception signal. In S305, the measured distance data is stored in the memory 12. As described above, when the distance-measurement mesh is divided in units of multiple pixels, distance measurement is performed whenever the laser irradiation position coincides with a mesh position.
 In S306, it is determined whether irradiation of the entire screen has finished. If not, the process proceeds to S307, moves to the next irradiation position, and returns to S301; distance measurement is repeated whenever the irradiation position is a measurement position. If irradiation of the entire screen has finished in S306, the process proceeds to S308.
 In the above flow, the case where the mesh is divided in units of multiple pixels has been described; if the mesh covers every pixel, distance measurement is performed at every irradiation position.
 In S308, one screen's worth of distance data is read from the memory 12. In S309, the image processing unit 13 determines the position and shape of the projection object 3 from the distance data. In S310, the image processing unit 13 corrects the video signal to fit the position and shape of the projection object 3 (mapping processing). Since this embodiment projects video with laser light, there is no need to adjust focus according to the distance to the projection object.
 In S311, it is determined whether the video projection operation has finished. If not, the process moves to the first irradiation position in S312 and returns to S301, repeating video projection, distance measurement, and mapping processing.
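The per-frame portion of the flowchart (S301 through S307) can be restated as a loop; hardware actions are abstracted as callbacks, and every name here is hypothetical:

```python
# Illustrative restatement of steps S301-S307 as a per-frame scan loop.
def project_frame(scan_positions, mesh_centers, irradiate, measure, table):
    """Irradiate every scan position; measure only at mesh centers."""
    for pos in scan_positions:         # S306/S307: advance until screen done
        irradiate(pos)                 # S301: modulated laser irradiation
        if pos in mesh_centers:        # S302: is this a measurement position?
            table[pos] = measure(pos)  # S303-S305: receive, compute d, store
    return table                       # one frame of data, ready for S308-S310
```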
 According to the above flowchart, mapping processing of the projected video signal can be performed continuously while the video is being projected. The delay between a distance measurement and its reflection in the mapping processing is at most the time until the determination in S306 switches to Yes, i.e., the time required to project one screen. Since the time required to project one screen is typically as small as 1/60 second, the delay poses no particular problem.
 When the projection object moves quickly, the vertical resolution can be lowered, for example, to reduce the time required to project one screen. This further shortens the delay time and reduces the deviation of the projection (mapping) position even for fast-moving objects. In other words, the vertical resolution may be switched as appropriate according to the movement speed of the projection object.
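The trade-off discussed above can be made concrete with a rough frame-time model; the line-rate figure below is assumed for the example only and does not come from the patent:

```python
# Rough illustration: the worst-case lag before a measurement reaches the
# mapping step is one frame, and halving the number of scanned vertical
# lines roughly halves that frame time.
def frame_time(vertical_lines: int, line_rate_hz: float) -> float:
    """Seconds needed to scan one frame at the given horizontal line rate."""
    return vertical_lines / line_rate_hz

# e.g. 480 lines at a 28.8 kHz line rate -> 1/60 s; 240 lines -> 1/120 s.
```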
 In this embodiment, video projection and shape measurement of the projection object are performed using laser light (video light) from the same laser light source 21, so the time delay of the mapping processing is small, and the processing can be executed in near real time without interrupting the projected video. Therefore, even when the projection object moves, the deviation of the projection position relative to the object is small and a good-looking video projection device can be realized. Furthermore, since the laser irradiation position and the measurement position are the same, the projection and measurement coordinate systems coincide, so no calibration is required and usability is good.
 In Embodiment 2, infrared (IR) light is added to the laser light source.
 FIG. 6 is a diagram showing the configuration of the laser light source 21' in Embodiment 2. In addition to the red (R) light source 21R, green (G) light source 21G, and blue (B) light source 21B, the laser light source 21' includes a light source 21IR that generates infrared (IR) light. The light sources 21R, 21G, and 21B emit color beams according to the video signal, while the added light source 21IR emits an IR light beam of constant intensity regardless of the video signal. The IR light beam generated by the light source 21IR is reflected by the total reflection mirror 24IR and travels toward the variable-reflection-angle mirror 22. The beams from the other light sources 21R, 21G, and 21B are likewise reflected by the wavelength-selective mirrors 24R', 24G', and 24B', respectively, toward the variable-reflection-angle mirror 22, where all beams are combined into a single beam. The IR light beam emitted from the light source 21IR has no effect on the displayed video even when it irradiates the projection object.
 In this embodiment, the light receiving unit 23 and the distance calculation unit 11 measure the distance to the projection object using the IR laser light emitted from the light source 21IR. Although not shown, the light receiving unit 23 is provided with a filter that transmits infrared light and reflects visible light so that only IR light is received. Since the IR light beam emitted from the light source 21IR has constant intensity, the intensity of the IR laser light reflected from the projection object is also nearly constant. Therefore, even when the intensity of the visible (R, G, B) light changes with the video signal, and in particular even when a dark video with a low signal level is projected, the distance measurement accuracy does not degrade, and a video projection device with little mapping misalignment and good appearance can be realized.
1: Video projection device,
2: Laser module,
3, 31, 32: Projection object,
4: Mirror drive unit,
5: Laser light source drive unit,
6: Light quantity receiving unit,
7: Amplification factor control unit,
9: Control signal generation unit,
10: Pulse generation unit,
11: Distance calculation unit,
12: Memory,
13: Image processing unit,
21, 21': Laser light source,
22: Variable-reflection-angle mirror,
23: Light receiving unit,
24: Mirror,
41, 42: Object.

Claims (5)

  1.  A video projection device that projects video onto a projection object, comprising:
     a laser light source that emits laser light modulated by an input video signal;
     a variable-reflection-angle mirror that reflects the laser light and projects it onto the projection object;
     a mirror drive unit that drives the variable-reflection-angle mirror;
     a light receiving unit that detects the laser light reflected by the projection object;
     a distance calculation unit that calculates the distance to the projection object based on a light reception signal from the light receiving unit; and
     an image processing unit that performs mapping processing on the video projected onto the projection object based on the calculated distance.
  2.  The video projection device according to claim 1, further comprising an amplification factor control unit that sets the light receiving sensitivity of the light receiving unit according to the intensity of the laser light emitted from the laser light source.
  3.  The video projection device according to claim 1, wherein the laser light source includes a light source that emits infrared laser light of constant intensity, and the light receiving unit and the distance calculation unit detect reflected light of the infrared laser light to calculate the distance to the projection object.
  4.  The video projection device according to any one of claims 1 to 3, further comprising a memory that divides the scan range of the laser light scanned by the variable-reflection-angle mirror into a mesh and stores, for each mesh cell, the distance data calculated by the distance calculation unit.
  5.  The video projection device according to claim 4, wherein, when the projection object is moving, the distance calculation unit switches the time for scanning the entire screen according to the movement speed of the projection object.
PCT/JP2014/058453 2014-03-26 2014-03-26 Video projection device WO2015145599A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2014/058453 WO2015145599A1 (en) 2014-03-26 2014-03-26 Video projection device

Publications (1)

Publication Number Publication Date
WO2015145599A1 true WO2015145599A1 (en) 2015-10-01

Family

ID=54194197

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2014/058453 WO2015145599A1 (en) 2014-03-26 2014-03-26 Video projection device

Country Status (1)

Country Link
WO (1) WO2015145599A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017073111A1 (en) * 2015-10-30 2017-05-04 株式会社Jvcケンウッド Light radiation device and light radiation method
JP2020114778A (en) * 2020-02-25 2020-07-30 パナソニックIpマネジメント株式会社 Projection instruction device, goods assort system and projection instruction method
US11500066B2 (en) * 2019-04-23 2022-11-15 Hyundai Motor Company LiDAR-integrated lamp device for vehicle

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002328428A (en) * 2001-05-01 2002-11-15 Sony Corp Projector and image projection system
JP2003130953A (en) * 2001-10-24 2003-05-08 Nikon Corp Range finder
JP2009180966A (en) * 2008-01-31 2009-08-13 Seiko Epson Corp Image forming apparatus
JP2013118596A (en) * 2011-12-05 2013-06-13 Tetsuya Akiba Projection system and method



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14887639

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

NENP Non-entry into the national phase

Ref country code: JP

122 Ep: pct application non-entry in european phase

Ref document number: 14887639

Country of ref document: EP

Kind code of ref document: A1