JP6467516B2 - Projector device with distance image acquisition device and projection method - Google Patents

Info

Publication number
JP6467516B2
JP6467516B2 (application JP2017543009A)
Authority
JP
Japan
Prior art keywords
projection
image
distance image
distance
difference value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
JP2017543009A
Other languages
Japanese (ja)
Other versions
JPWO2017056776A1 (en)
Inventor
潤也 北川
智行 河合
安浩 新貝
智紀 増田
善工 古田
Original Assignee
富士フイルム株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to JP2015191761 priority Critical
Application filed by 富士フイルム株式会社 filed Critical 富士フイルム株式会社
Priority to PCT/JP2016/074251 priority patent/WO2017056776A1/en
Publication of JPWO2017056776A1 publication Critical patent/JPWO2017056776A1/en
Application granted granted Critical
Publication of JP6467516B2 publication Critical patent/JP6467516B2/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G01S17/894: 3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
    • G01S17/89: Lidar systems specially adapted for mapping or imaging
    • G01S17/50: Systems of measurement based on relative movement of target
    • G01S17/87: Combinations of systems using electromagnetic waves other than radio waves
    • G01S7/4863: Detector arrays, e.g. charge-transfer gates
    • G03B21/142: Adjusting of projection optics
    • G03B21/53: Means for automatic focusing, e.g. to compensate thermal effects
    • H04N9/317: Convergence or focusing systems
    • H04N9/3185: Geometric adjustment, e.g. keystone or convergence
    • H04N9/3194: Testing thereof including sensor feedback
    • G03B21/006: Projectors using an electronic spatial light modulator but not peculiar thereto, using LCDs
    • G03B21/14: Details
    • G03B21/2053: Intensity control of illuminating light

Description

  The present invention relates to a projector apparatus with a distance image acquisition apparatus and to a projection method, and more particularly to a technique for projecting an image in accordance with whether a projection object is moving or stationary.

  A TOF (Time Of Flight) camera is known as a camera that acquires an image while acquiring distance information. The TOF camera obtains a distance image (depth data) indicating the distance to the subject by irradiating the subject with light and measuring the time (flight time) until the reflected light is received by the image sensor.
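
  The TOF relationship between flight time and distance can be sketched as follows (a minimal illustration of the principle just described; the constant and helper name are ours, not part of the patent):

```python
# Speed of light in meters per second.
C = 299_792_458.0

def distance_from_flight_time(t_seconds: float) -> float:
    """Distance to the subject from the round-trip flight time of the
    light pulse: the pulse travels out and back, so halve the path."""
    return C * t_seconds / 2.0
```

  For example, a measured round trip of 10 ns corresponds to a subject roughly 1.5 m away.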

  In addition, a technique for projecting an image from a projector device according to the three-dimensional shape of a projection target is also known, and is called projection mapping, video mapping, or the like.

  In Patent Document 1, a captured image of the projection object is coordinate-converted from the coordinate system of an electronic camera to the coordinate system of a display screen, edge processing is applied to the converted image to extract the contour of a closed region, and projection mapping is performed on the extracted contour shape.

  Further, Patent Document 2 proposes a technique for miniaturizing a projection mapping device by using a TOF camera as a device that acquires a distance image of the projection object in real time and integrating the TOF camera with the projector device.

  In particular, the projector device described in Patent Document 2 irradiates the projection image as pulsed light, so that the light source and projection lens of the projector device also serve as the light source and projection lens of the TOF camera.

Patent Document 1: JP 2013-192189 A
Patent Document 2: JP 2013-546222 A (published Japanese translation of a PCT application)

  Here, in projection mapping, an image generated in accordance with the shape of the projection object is projected, so it is important to detect whether the projection object is moving or stationary. That is, when projection mapping is performed on a moving projection object, the projection object and the projection image may become misaligned, and the projection mapping cannot be performed well.

  On the other hand, in order to detect the movement of the projection object, a detection device for this purpose would have to be added to the projector apparatus, which makes the projector apparatus larger or more complicated.

  Patent Documents 1 and 2 described above do not address detecting the movement of the projection object while suppressing enlargement or complication of the projector apparatus when performing projection mapping.

  The present invention has been made in view of such circumstances, and an object of the present invention is to provide a projector device with a distance image acquisition device, and a projection method, that detect the movement of the projection object while suppressing enlargement or complication of the projector device and accurately project an image onto the projection object.

  In order to achieve the above object, a projector device with a distance image acquisition device according to one aspect of the present invention includes: a projector device including a display optical element for displaying a projection image, and a projection light source and a projection lens for projecting the projection image displayed on the display optical element onto a projection object; and a distance image acquisition device including a distance image sensor in which a plurality of light receiving elements are arranged two-dimensionally, a measurement light source, an imaging lens that forms an image on the distance image sensor of measurement light emitted from the measurement light source and reflected by the projection object, and a distance image generation unit that acquires from the distance image sensor distance information corresponding to the flight time of the measurement light emitted from the measurement light source, reflected by the projection object, and incident on the distance image sensor, and generates a distance image based on the acquired distance information. The projector device further includes: a projection image generation unit that detects the shape of the projection object based on the distance image acquired by the distance image acquisition device and generates a projection image corresponding to the detected shape; a difference value acquisition unit that acquires a difference value between the distance information included in a first distance image acquired at a first timing by the distance image acquisition device and the distance information included in a second distance image acquired at a second timing; a determination unit that determines whether or not the projection object is stationary based on the difference value acquired by the difference value acquisition unit; a projection instruction unit that outputs an instruction to project the image generated by the projection image generation unit onto the projection object based on the determination result of the determination unit; and a projection control unit that controls the projection of the projector device based on the projection instruction output from the projection instruction unit.

  According to this aspect, the shape of the projection object is detected based on the distance image generated by the distance image generation unit of the distance image acquisition device, and the projection image is generated accordingly. Furthermore, according to this aspect, whether or not the projection object is stationary is determined from the difference value between the distance image acquired at the first timing and the distance image acquired at the second timing, and projection is performed based on this determination. As a result, in this aspect, the distance image is used both for generating the projection image and for determining whether the projection object is stationary, so the movement of the projection object can be detected without enlarging or complicating the apparatus, and the projection image can be accurately projected onto the projection object.

  Preferably, the difference value acquisition unit acquires a difference value between the average value of the distance information included in the first distance image and the average value of the distance information included in the second distance image, and the determination unit determines that the projection object is stationary when the difference value is equal to or less than a first threshold.

  According to this aspect, the difference value is calculated from the average value of the distance information included in the first distance image and the average value of the distance information included in the second distance image, and the determination unit determines, based on the first threshold, whether the projection object is stationary. Thereby, this aspect can more accurately determine whether the projection object is stationary based on the average values of the distance information in the distance images.
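
  As a sketch, the average-based check described in this aspect might look like the following (NumPy arrays stand in for distance images; the threshold value is a hypothetical tuning parameter):

```python
import numpy as np

def is_stationary_mean(first: np.ndarray, second: np.ndarray,
                       first_threshold: float) -> bool:
    """Compare the mean distances of two distance images; the projection
    object is judged stationary when the difference between the means is
    at or below the first threshold."""
    diff = abs(float(first.mean()) - float(second.mean()))
    return diff <= first_threshold
```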

  Preferably, the difference value acquisition unit acquires the maximum difference value between the distance information included in the first distance image and the distance information included in the second distance image, and the determination unit determines that the projection object is stationary when the maximum difference value is equal to or less than a second threshold.

  According to this aspect, the difference value is the maximum of the differences between the distance information included in the first distance image and the distance information included in the second distance image, and the determination unit determines, based on the second threshold, whether the projection object is stationary. Thereby, this aspect can more accurately determine whether the projection object is stationary based on the maximum difference value.
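
  The per-pixel maximum check of this aspect can be sketched similarly (again with hypothetical threshold values):

```python
import numpy as np

def is_stationary_max(first: np.ndarray, second: np.ndarray,
                      second_threshold: float) -> bool:
    """The projection object is judged stationary when the largest
    per-pixel change in distance is at or below the second threshold;
    this is more sensitive to local motion than the mean."""
    return float(np.max(np.abs(first - second))) <= second_threshold
```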

  Preferably, the projector device with a distance image acquisition device includes a difference image generation unit that generates a difference image based on the difference values acquired by the difference value acquisition unit.

  According to this aspect, since the difference image is generated based on the difference values, the difference image can be used for subsequent determinations.

  Preferably, the difference image generation unit acquires an average difference value over a plurality of frames of the difference image, and the determination unit determines that the projection object is stationary when the average difference value over the plurality of frames is equal to or less than a third threshold.

  According to this aspect, since the projection object is determined to be stationary when the average difference value over the plurality of frames of the difference image is equal to or less than the third threshold, whether the projection object is stationary can be determined more accurately.

  Preferably, the difference image generation unit obtains the maximum difference value over a plurality of frames of the difference image, and the determination unit determines that the projection object is stationary when the maximum difference value over the plurality of frames is equal to or less than a fourth threshold.

  According to this aspect, since the projection object is determined to be stationary when the maximum difference value over the plurality of frames of the difference image is equal to or less than the fourth threshold, whether the projection object is stationary can be determined based on the maximum difference value across the frames.
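
  Aggregating over several difference-image frames, as in the two preceding aspects, might be sketched as follows (the exact aggregation is our assumption):

```python
import numpy as np

def is_stationary_over_frames(diff_frames, threshold, use_max=False):
    """Judge stationarity from several difference-image frames: compare
    either the average or the maximum absolute difference value across
    all frames (the third and fourth thresholds in the text) against
    the threshold."""
    stack = np.abs(np.stack(diff_frames))
    statistic = stack.max() if use_max else stack.mean()
    return float(statistic) <= threshold
```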

  Preferably, the difference image generation unit obtains a difference value for each region of the difference image, and the determination unit determines whether the projection object is stationary based on the size of the region in the difference image where the difference value is equal to or less than a fifth threshold.

  According to this aspect, since whether the projection object is stationary is determined based on the size of the region in the difference image where the difference value is equal to or less than the fifth threshold, it can be determined more accurately whether the projection object is stationary.
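
  A region-based check of this kind can be sketched as follows; treating each pixel as a region and using an area fraction is our simplification:

```python
import numpy as np

def is_stationary_by_region(diff_image: np.ndarray,
                            fifth_threshold: float,
                            min_still_fraction: float) -> bool:
    """Judge stationarity from the size of the region whose difference
    value is at or below the fifth threshold: stationary when that
    region covers at least min_still_fraction of the difference image."""
    still_region = np.abs(diff_image) <= fifth_threshold
    return float(still_region.mean()) >= min_still_fraction
```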

  Preferably, the difference image generation unit acquires a difference value for each region of the difference image over a plurality of frames, and the determination unit determines whether the projection object is stationary based on the size of the region in the difference images of the plurality of frames where the difference value is equal to or less than a sixth threshold.

  According to this aspect, since whether the projection object is stationary is determined based on the size of the region in the difference images of the plurality of frames where the difference value is equal to or less than the sixth threshold, the stationary state of the projection object can be determined accurately.

  Preferably, the difference image generation unit obtains a value by multiplying the difference value by a weighting coefficient set in accordance with the region of the difference image, and the determination unit determines that the projection object is stationary when the weighted difference value is equal to or less than a seventh threshold.

  According to this aspect, the difference value is weighted according to the region of the difference image, and whether the projection object is stationary is determined based on the weighted value, so the movement of the projection object can be determined accurately.
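
  The weighting described here might be sketched with a per-pixel weight map that emphasizes, for example, the projection region (the weight layout and scalar score are our assumptions):

```python
import numpy as np

def is_stationary_weighted(diff_image: np.ndarray, weights: np.ndarray,
                           seventh_threshold: float) -> bool:
    """Multiply each difference value by a weighting coefficient set per
    region (e.g. larger inside the projection region), then judge the
    object stationary when the weighted score is at or below the
    seventh threshold."""
    score = float(np.mean(weights * np.abs(diff_image)))
    return score <= seventh_threshold
```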

  Preferably, the projector device with a distance image acquisition device includes a projection stop instruction unit that outputs an instruction to stop projection of the image generated by the projection image generation unit onto the projection object based on the determination result of the determination unit, and the projection control unit controls the projection of the projector device based on the projection stop instruction from the projection stop instruction unit.

  According to this aspect, an instruction to stop image projection is output based on the determination result, and the projection of the projector device is controlled based on the stop instruction, so projection stops when the projection object starts moving again after being stationary. As a result, projection onto a moving projection object is prevented, and projection can be performed with higher accuracy.
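
  The resulting start/stop behavior can be summarized as a small control step (the function name and return values are illustrative, not from the patent):

```python
def projection_step(stationary: bool, projecting: bool) -> str:
    """One control step: start projecting when the object is judged
    stationary, stop when it starts moving again, otherwise hold the
    current state."""
    if stationary and not projecting:
        return "project"
    if not stationary and projecting:
        return "stop"
    return "hold"
```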

  A projection method according to another aspect of the present invention is a projection method using a projector device with a distance image acquisition device, the projector device including a display optical element that displays a projection image and a projection light source and projection lens that project the projection image displayed on the display optical element onto a projection object, and the distance image acquisition device including a distance image sensor in which a plurality of light receiving elements are arranged two-dimensionally, a measurement light source, an imaging lens that forms an image on the distance image sensor of measurement light emitted from the measurement light source and reflected by the projection object, and a distance image generation unit that acquires from the distance image sensor distance information corresponding to the flight time of the measurement light emitted from the measurement light source, reflected by the projection object, and incident on the distance image sensor, and generates a distance image based on the acquired distance information. The projection method includes: a projection image generation step of detecting the shape of the projection object based on the distance image acquired by the distance image acquisition device and generating a projection image corresponding to the detected shape; a difference value acquisition step of acquiring a difference value between the distance information included in a first distance image acquired at a first timing by the distance image acquisition device and the distance information included in a second distance image acquired at a second timing; a determination step of determining whether or not the projection object is stationary based on the difference value acquired in the difference value acquisition step; a projection instruction step of outputting an instruction to project the image generated in the projection image generation step onto the projection object based on the determination result of the determination step; and a projection control step of controlling the projection of the projector device based on the projection instruction output in the projection instruction step.

  According to the present invention, the shape of the projection object is detected based on the distance image generated by the distance image generation unit of the distance image acquisition device, the projection image is generated accordingly, whether or not the projection object is stationary is determined from the difference value between the distance image acquired at the first timing and the distance image acquired at the second timing, and projection is performed based on this determination. As a result, the present invention uses the distance image both to generate the projection image and to determine whether the projection object is stationary, thereby suppressing enlargement or complication of the apparatus while detecting the movement of the projection object, so that the projection image can be accurately projected onto the projection object.

FIG. 1 is a conceptual diagram showing a case where a projector device with a distance image acquisition device is used.
FIG. 2 is a front view of the projection object shown in FIG. 1.
FIG. 3 is a diagram conceptually showing projection mapping performed on the projection object shown in FIG. 1.
FIG. 4 is a block diagram showing the configuration of the projector apparatus.
FIG. 5 is a diagram for explaining the calculation processing of the distance to the projection object.
FIG. 6 is a block diagram showing the configuration of the control unit.
FIG. 7 is a diagram for explaining the difference value acquisition performed by the difference value acquisition unit.
FIG. 8 is a diagram for explaining the difference value acquisition performed by the difference value acquisition unit.
FIG. 9 is a flowchart showing the operation of the projector apparatus.
FIG. 10 is a block diagram showing the configuration of the control unit.
FIG. 11 is a conceptual diagram showing a difference image.
FIG. 12 is a conceptual diagram showing a difference image.
FIG. 13 is a diagram conceptually showing a distance image and a difference image.
FIG. 14 is a diagram conceptually showing a distance image and a difference image.

  Preferred embodiments of a projector apparatus with a distance image acquisition apparatus and a projection method according to the present invention will be described below with reference to the accompanying drawings.

  FIG. 1 is a conceptual diagram showing a case where the projector device with a distance image acquisition device according to the present invention is used. It shows a case where a person serving as the projection object 2 walks straight toward the projector device 20 with a distance image acquisition device (hereinafter referred to as the projector device 20). The specific example shown in FIG. 1 assumes a fashion show or the like. The projector device 20 includes a distance image acquisition unit (distance image acquisition device) 20A and a projector unit (projector device) 20B. The projector device 20 sets the clothes worn by the person serving as the projection object 2 as the projection region 3, and performs projection mapping on the projection region 3 of the projection object 2 when the projection object 2 is stationary.

  FIG. 2 is a front view of the projection object 2 described with reference to FIG. 1. FIG. 2(A) is a front view of the projection object 2 at time T1, FIG. 2(B) is a front view at time T2, and FIG. 2(C) is a front view at a later time. As shown in FIGS. 2(A) to 2(C), when the projection object 2 is moving, the distance between the projector device 20 and the projection object 2 changes, so the range of the projection region 3 changes. When the range of the projection region 3 changes, the projection image generated to match the projection region 3 no longer matches it. Therefore, in the present invention, in order to prevent this mismatch between the projection image and the projection region 3, the movement of the projection object 2 is detected, and projection mapping is performed when the projection object 2 is stationary.

  FIG. 3 is a diagram conceptually showing projection mapping performed on the projection object 2 described with reference to FIG. 1. The projector device 20 projects the projection image 5 onto the clothes worn by the person serving as the projection object 2. The projection image 5 is generated to match the shape of the closed region formed by the clothes, that is, the projection region 3. Projecting the projection image 5 in accordance with the shape of the projection region 3 in this way is generally called projection mapping.

  FIG. 4 is a block diagram showing the configuration of the projector device 20. The projector device 20 includes a distance image acquisition unit 20A that acquires a distance image, a projector unit 20B that projects the projection image 5, a control unit 26, a memory 27, a projection image generation unit 28, and an input I/F (interface) 29. The control unit 26, the memory 27, and the input I/F 29 are shared by the distance image acquisition unit 20A and the projector unit 20B.

  The distance image acquisition unit 20A acquires a distance image by a pulsed-light detection method, and includes a timing generator 31, an LED (Light Emitting Diode) light source (measurement light source) 32, a light source driver 33, a projection lens 35, an imaging lens 36, a lens driver 37, a distance image sensor 38, an AD (Analog-to-Digital) converter 39 labeled "A/D" in the figure, and an interface circuit 40 labeled "I/F" in the figure. The distance image acquisition unit 20A functions as a so-called TOF (Time Of Flight) camera; the principle of the TOF camera will be described later. The distance image is a two-dimensional distribution image of distance values (distance information) obtained by a distance measurement method such as TOF, and each pixel of the distance image has a distance value (distance information).

  The timing generator 31 outputs timing signals to the LED light source 32 and the distance image sensor 38 under the control of the control unit 26.

  The LED light source 32 emits pulsed light having a constant pulse width in synchronization with the timing signal input from the timing generator 31. The light source driver 33 controls the driving of the LED light source 32 under the control of the control unit 26.

  The projection lens 35 irradiates the projection object 2 with the pulsed light emitted from the LED light source 32. The imaging lens 36 forms an image on the distance image sensor 38 of the pulsed light reflected by the projection target 2 when the projection target 2 is irradiated with pulsed light from the LED light source 32 through the projection lens 35. The lens driver 37 performs focus control and the like of the imaging lens 36 via a lens driving unit (not shown).

  The distance image sensor 38 is a CMOS (Complementary Metal-Oxide Semiconductor) image sensor in which a plurality of light receiving elements are arranged two-dimensionally, driven by the timing generator 31 via a CMOS driver having a vertical driver and a horizontal driver. The distance image sensor 38 is not limited to the CMOS type, and may be an XY address type or CCD (Charge Coupled Device) type image sensor.

  The distance image sensor 38 has a plurality of light receiving elements (photodiodes) arranged two-dimensionally, and a band-pass filter that passes only the wavelength band of the pulsed light emitted from the LED light source 32, or a visible light cut filter that removes visible light, may be provided on the incident surface side of the plurality of light receiving elements. Thereby, the plurality of light receiving elements of the distance image sensor 38 function as pixels having sensitivity to the pulsed light.

  The distance image sensor 38 controls the exposure period (exposure time and exposure timing) in synchronization with the emission of pulsed light from the LED light source 32 by the timing signal input from the timing generator 31. Electric charges corresponding to the amount of pulsed light incident during the exposure period are accumulated in each light receiving element of the distance image sensor 38. Thus, in the pulsed light detection method, the exposure amount increases as the distance to the projection object 2 decreases, and conversely, the exposure amount decreases as the distance to the projection object 2 increases. Accordingly, the distance to the projection object 2 can be measured. Then, from the distance image sensor 38, a pixel signal (analog signal corresponding to the charge accumulated for each pixel) corresponding to the incident light amount of the pulsed light reflected by the projection object 2 is read out.
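
  One common way to turn gated exposure amounts into a distance uses two exposure windows and the ratio of collected charge; the two-gate scheme below is an assumption for illustration, since the text only states that the exposure amount varies with distance:

```python
def distance_from_gated_charges(q_gate1: float, q_gate2: float,
                                pulse_width_s: float,
                                c: float = 299_792_458.0) -> float:
    """Two-gate pulsed TOF sketch: gate 1 opens with the emitted pulse,
    gate 2 immediately after it ends. The fraction of reflected charge
    falling into gate 2 encodes the round-trip delay."""
    round_trip = pulse_width_s * q_gate2 / (q_gate1 + q_gate2)
    return c * round_trip / 2.0
```

  With equal charge in both gates and a 20 ns pulse, the round trip is 10 ns, i.e. a subject at about 1.5 m.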

  The AD converter 39 converts the pixel signal read from the distance image sensor 38 into a digital signal and outputs the digital signal to the interface circuit 40. Some CMOS image sensors incorporate an AD converter; in that case, the separate AD converter 39 can be omitted. The interface circuit 40 functions as an image input controller and outputs the digital signal input from the AD converter 39 to the control unit 26. Thereby, as will be described in detail later, a distance image is generated by the distance image generation unit 113 (FIG. 6) of the control unit 26, and a projection image 5 is further generated by the projection image generation unit 28.

  The projector unit 20B is a so-called single-plate liquid crystal projector, and includes a display optical element (also referred to as a light modulation element) 42, an element driver 43, an LED light source (projection light source) 44, a light source driver 45, a projection lens 46, and a lens driver 47.

  As the display optical element 42, a transmissive liquid crystal panel including a plurality of color filters, or a color-filterless element combining dichroic mirrors, a microlens array, and a monochrome transmissive liquid crystal panel, is used. In the color-filterless element, for example, white light is separated into three colors by three types of dichroic mirrors that reflect R (Red) light, G (Green) light, and B (Blue) light, respectively, and the three colors of light are incident on the microlens array of the liquid crystal panel at different angles. A color image can be displayed by causing the microlens array to direct the three colors of light into the R pixels, G pixels, and B pixels of the liquid crystal panel.

  The projector unit 20B is not limited to a single-plate liquid crystal projector, and may be a known three-plate liquid crystal projector including color separation optics and a plurality of liquid crystal panels. Further, the projector unit 20B is not limited to the transmissive liquid crystal method, and may adopt other methods such as a reflective liquid crystal display method and a micromirror device method (light switch display method).

  The element driver 43 controls the display optical element 42 under the control of the control unit 26 to display the projection image 5 generated by the projection image generation unit 28 described later.

  The LED light source 44 causes white light to enter the display optical element 42 from the back side of the display optical element 42 (the side opposite to the surface facing the projection lens 46). Thereby, the image light of the projected image 5 is emitted from the display optical element 42. The light source driver 45 controls the driving of the LED light source 44 under the control of the control unit 26.

  The projection lens 46 projects the image light of the projection image 5 emitted from the display optical element 42 onto the projection target 2. The lens driver 47 performs focus control and the like of the projection lens 46 via a lens driving unit (not shown).

  The control unit 26 is connected to the timing generator 31, the light source driver 33, the lens driver 37, the distance image sensor 38, the interface circuit 40, the element driver 43, the light source driver 45, the lens driver 47, and the like via the data bus 49. The control unit 26 includes various arithmetic, processing, and storage units configured around, for example, a CPU (Central Processing Unit), and controls the overall operation and processing of the projector device 20 by executing a control program and data read from the memory 27.

  The memory 27 stores a control program for the control unit 26 to execute processing.

  The projection image generation unit 28 generates the projection image 5 based on data and information input from the control unit 26 under the control of the control unit 26. That is, the projection image generation unit 28 detects the shape of the projection object 2 based on the distance image acquired by the distance image acquisition device, and generates the projection image 5 corresponding to the detected shape. The generation of the projection image 5 performed by the projection image generation unit 28 will be described later.

<Basic principle of TOF method>
Next, the basic principle of acquiring a distance image with a TOF camera will be described. Hereinafter, a case where a distance image of the projection target 2 is acquired by the TOF camera (distance image acquisition unit 20A) will be described.

  The LED light source 32 of the distance image acquisition unit 20A emits near-infrared light and is pulse-driven by the timing generator 31, so that the LED light source 32 emits pulsed light with a predetermined pulse width. This pulsed light is reflected by the surface of the projection object 2, and the reflected pulsed light is imaged (received) by the distance image sensor 38 via the imaging lens 36 of the distance image acquisition unit 20A. Since the distance image sensor 38 is provided with a visible light cut filter, it receives only the near-infrared pulsed light.

  FIG. 5 is a diagram for explaining the calculation processing of the distance of the projection target 2.

  The LED light source 32 of the distance image acquisition unit 20A is pulse-driven by the timing generator 31, and as shown in FIGS. 5A and 5B, the timing generator 31 sequentially performs two exposure controls, a first exposure control and a second exposure control, on the distance image sensor 38 in synchronization with the pulse driving of the LED light source 32.

  The first exposure control shown in FIG. 5A is exposure control in which pulsed light is emitted from the LED light source 32 and the exposure period is controlled so that a difference in exposure amount occurs at the corresponding light receiving elements of the distance image sensor 38 at least according to the distance of the projection object 2. Specifically, after the pulsed light is emitted from the LED light source 32, exposure is started after a certain time has elapsed (the time until the pulsed light returns from the farthest object that can be measured), and the exposure is terminated after a further lapse of time (a predetermined exposure time) in which all of the pulsed light reflected from at least the farthest object returns.

  The second exposure control shown in FIG. 5B is exposure control in which pulsed light is emitted from the LED light source 32 but the phase of the exposure start with respect to the pulsed light differs from that of the first exposure control; it is performed to remove the influence, on the exposure amount at the distance image sensor 38, of differences in the reflectance of the subject and of the pulsed light not having a uniform light quantity over the entire screen. In this example, exposure control is performed so that all light receiving elements of the distance image sensor 38 are exposed to all of the pulsed light reflected by the subject. Specifically, exposure is started in synchronization with the emission timing of the pulsed light emitted from the LED light source 32, and is terminated after a predetermined time (a predetermined exposure time until all of the pulsed light returns from at least the farthest subject that can be measured) has elapsed. The “predetermined exposure time” of the first exposure control and the “predetermined exposure time” of the second exposure control are the same, but, as described above, the phase of the exposure start with respect to the pulsed light differs.

Next, the distance image generation unit 113 of the control unit 26 takes the sensor outputs (output data of each pixel) corresponding to the exposure amounts acquired from the distance image sensor 38 by the first exposure control and the second exposure control shown in FIG. 5 as first data L1 and second data L2, respectively, and calculates the distance information D corresponding to the distance to the subject by the following equation.
[Equation 1]
D = L1 ÷ L2
That is, according to [Equation 1], division data is calculated by dividing the first data L1 by the second data L2. This division data is data (distance information D) corresponding to the relative distance, from which the influence of the reflectance of the subject and the influence of the pulsed light not having a uniform light quantity over the entire screen have been removed. It is also possible to obtain the absolute distance of the subject based on the first data L1 and the second data L2.

  A distance image can be generated by acquiring the distance information D for every pixel of the distance image sensor 38.
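The per-pixel calculation of [Equation 1] can be sketched as follows. This is a minimal illustration only, not the device's actual implementation; the function name and the 2 × 2 sensor readouts are hypothetical.

```python
def generate_distance_image(l1, l2):
    """Compute D = L1 / L2 for every pixel of the sensor ([Equation 1])."""
    return [[a / b for a, b in zip(row1, row2)]
            for row1, row2 in zip(l1, l2)]

# First exposure: charge varies with the distance of the subject (FIG. 5A).
L1 = [[50.0, 100.0],
      [150.0, 200.0]]
# Second exposure: the full pulse is captured, serving as the
# reflectance/illumination reference (FIG. 5B).
L2 = [[200.0, 200.0],
      [200.0, 200.0]]

D = generate_distance_image(L1, L2)
print(D)  # [[0.25, 0.5], [0.75, 1.0]]
```

Because each pixel of D is a ratio of the two exposures of the same pixel, subject reflectance and non-uniform illumination cancel out, leaving relative distance information.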

<Generation of projection image>
Next, generation of the projection image 5 performed by the projection image generation unit 28 will be described.

  The projection image generation unit 28 uses the information of the first distance image and/or the second distance image generated by the distance image generation unit 113 (FIG. 6), that is, information such as the shape, size, and unevenness of the projection area 3 of the projection object 2, to trim, coordinate-convert, and scale the projection image (a moving image or a still image) recorded in the memory 27, and thereby generates the projection image 5 suitable for the surface of the projection object 2. It is preferable that the distance image used by the difference value acquisition unit 101 is also used by the projection image generation unit 28 for generating the projection image 5, because the data is then used effectively.

  In the display optical element 42 of the projector unit 20B shown in FIG. 4, the transmittance of each optical element is controlled based on the projection image 5 generated by the projection image generation unit 28.

[First Embodiment]
Next, the control unit 26 in the first embodiment of the present invention will be described.

  FIG. 6 is a block diagram illustrating a configuration of the control unit 26. The control unit 26 mainly includes a distance image generation unit 113, a difference value acquisition unit 101, a determination unit 103, a projection instruction unit 105, a projection control unit 107, and a projection stop instruction unit 109.

  The distance image generation unit 113 acquires distance information from the distance image sensor 38 and generates a distance image based on the acquired distance information. Here, the distance information is information corresponding to the time of flight of the measurement light, that is, the time until the pulsed light (measurement light) emitted from the LED light source 32 is reflected by the projection object 2 and enters the distance image sensor 38, and therefore corresponds to the distance of the projection object 2. The specific distance image generation method of the distance image generation unit 113 is as described above. The distance image generated by the distance image generation unit 113 is composed of two-dimensionally distributed distance information.

  The difference value acquisition unit 101 acquires a difference value between the distance information included in the first distance image acquired at the first timing by the distance image acquisition unit 20A and the distance information included in the second distance image acquired at the second timing.

  FIG. 7 and FIG. 8 are diagrams for explaining the difference value acquisition performed by the difference value acquisition unit 101. FIG. 7A shows the distance image S acquired by the distance image acquisition unit 20A in the case of FIG. 2A (time T1). FIG. 7B shows the distance image T acquired by the distance image acquisition unit 20A in the case of FIG. 2B (time T2). For the sake of explanation, the distance information is shown in the drawings as shading: a short distance 11, a middle distance 13, and a long distance 15. In FIG. 7, the projection object 2 is simplified and drawn as an ellipse, and the background other than the stage 7 and the projection object 2 is omitted.

  The distance image S shown in FIG. 7A has the distance information of the short distance 11, the middle distance 13, and the long distance 15 in the stage 7, and the distance information of the long distance 15 in the projection object 2. In the distance image S, the area of the stage 7 having the distance information of the long distance 15 and the projection object 2 are represented with the same shading. The distance image T shown in FIG. 7B has the distance information of the short distance 11, the middle distance 13, and the long distance 15 in the stage 7, and the distance information of the middle distance 13 in the projection object 2. In the distance image T, the area of the stage 7 having the distance information of the middle distance 13 and the projection object 2 are represented with the same shading.

  FIG. 7C illustrates the difference values between the distance information included in the distance image S and the distance information included in the distance image T. Since the distance information of the stage 7 does not change between the distance image S and the distance image T, its difference value is 0. On the other hand, since the distance information of the projection object 2 changes between the distance image S and the distance image T, its difference value is not 0. That is, the distance information of the projection object 2 is the long distance 15 in the distance image S but the middle distance 13 in the distance image T, so a difference value arises for the projection object 2.

  On the other hand, FIG. 8 shows distance images U and V acquired while the projection object 2 is stationary. That is, FIG. 8A shows the distance image U acquired by the distance image acquisition unit 20A at time T3, and FIG. 8B shows the distance image V acquired by the distance image acquisition unit 20A at time T4. In the distance images U and V, the distance information of the projection object 2 is the same, the middle distance 13. Therefore, as shown in FIG. 8C, the difference value between the distance information of the distance image U and that of the distance image V is 0. The difference value of the stage 7 is also 0, as in FIG. 7. Since the time difference between times T3 and T4 is very short, it can be inferred that the projection object 2 is stationary when the difference value between the distance images U and V is close to 0. The distance image acquisition unit 20A can acquire distance images continuously, for example at 30 fps or 60 fps (frames per second).

  As described above, the difference value acquisition unit 101 acquires the difference value by subtracting the distance information included in the distance image acquired at the first timing from the distance information included in the distance image acquired at the second timing.

  Alternatively, the difference value acquisition unit 101 may acquire the average value of the distance information included in the distance image S and the average value of the distance information included in the distance image T, and use the difference between the two average values as the difference value. That is, the difference value acquisition unit 101 may acquire, as the difference value, the value obtained by subtracting the average value of the distance information included in the distance image acquired at the first timing from the average value of the distance information included in the second distance image.

  Also, for example, the difference value acquisition unit 101 may acquire, as the difference value, the maximum difference between the distance information included in the distance image acquired at the first timing and the distance information included in the distance image acquired at the second timing. That is, the distance information included in the distance image acquired at the first timing may be subtracted from the distance information included in the distance image acquired at the second timing, and the maximum of the resulting values may be employed as the difference value.

  Further, for example, the difference value acquisition unit 101 may acquire, as the difference value, the value obtained by subtracting the sum of the distance information included in the distance image acquired at the first timing from the sum of the distance information included in the distance image acquired at the second timing.
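The difference-value strategies described above (difference of averages, maximum per-pixel difference, and difference of sums) can be sketched as follows. This is an illustrative sketch only; the function names and the 2 × 2 distance values are hypothetical.

```python
def mean_difference(s, t):
    """Difference between the average distance information of two images."""
    flat_s = [v for row in s for v in row]
    flat_t = [v for row in t for v in row]
    return abs(sum(flat_s) / len(flat_s) - sum(flat_t) / len(flat_t))

def max_difference(s, t):
    """Maximum per-pixel difference between two distance images."""
    return max(abs(a - b)
               for row_s, row_t in zip(s, t)
               for a, b in zip(row_s, row_t))

def sum_difference(s, t):
    """Difference between the summed distance information of two images."""
    return abs(sum(v for row in s for v in row) -
               sum(v for row in t for v in row))

S = [[15.0, 15.0], [13.0, 11.0]]   # distance image at the first timing
T = [[13.0, 15.0], [13.0, 11.0]]   # only one pixel changed by the second timing

print(mean_difference(S, T))  # 0.5
print(max_difference(S, T))   # 2.0
print(sum_difference(S, T))   # 2.0
```

Each variant reduces the two distance images to a single scalar, which is what the determination unit 103 then compares against a threshold.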

  Returning to FIG. 6, the determination unit 103 determines whether or not the projection object 2 is stationary based on the difference value acquired by the difference value acquisition unit 101. That is, the determination unit 103 acquires the difference value and determines the state (moving or stationary) of the projection object 2 based on it. The determination unit 103 can determine the state of the projection object 2 by various methods using the difference value. For example, the determination unit 103 can determine the state of the projection object 2 using a threshold value: it may determine that the projection object 2 is stationary when the acquired difference value is equal to or smaller than the threshold value, and that the projection object 2 is moving when the acquired difference value is larger than the threshold value. As the threshold value, a first threshold value is used when the determination unit 103 makes the determination based on the difference between the average value of the distance information included in the first distance image and that included in the second distance image, and a second threshold value is used when the determination is based on the maximum difference between the distance information included in the first distance image and that included in the second distance image; the two threshold values are used selectively.

  The projection instruction unit 105 outputs an instruction to project the image generated by the projection image generation unit 28 on the projection target 2 based on the determination result of the determination unit 103. That is, when the determination unit 103 determines that the projection target 2 is stationary, the projection instruction unit 105 outputs a projection instruction by the projector unit 20B.

  The projection control unit 107 controls the projection of the projector unit 20B based on the projection instruction output from the projection instruction unit 105. The projection control unit 107 causes the display optical element 42 to display the projection image 5 generated by the projection image generation unit 28, causes the light source driver 45 to drive the LED light source 44 to emit light, and thereby causes the projection image 5 to be projected onto the projection object 2.

  When the determination unit 103 determines that the projection object 2, which had once come to rest, has started moving again during projection by the projector unit 20B, the projection stop instruction unit 109 outputs an instruction to stop projecting the image generated by the projection image generation unit 28 onto the projection object 2. That is, the projection stop instruction unit 109 outputs an instruction to stop the projection by the projector unit 20B when the projection object 2 starts moving again. Even during projection by the projector unit 20B, distance images are acquired by the distance image acquisition unit 20A as needed, and the state of the projection object 2 is determined by the determination unit 103 as needed.

  Next, a projection method using the projector device 20 will be described. FIG. 9 is a flowchart showing the operation of the projector device 20.

  First, a distance image is acquired at the first timing by the distance image acquisition unit 20A of the projector device 20 (step S10). Thereafter, the distance image acquisition unit 20A acquires a distance image at the second timing (step S11). Then, the difference value acquisition unit 101 acquires a difference value between the distance information included in the distance image at the first timing and the distance information included in the distance image at the second timing (step S12). Although not shown, the acquisition of distance images by the distance image acquisition unit 20A and the acquisition of difference values by the difference value acquisition unit 101 are performed as needed.

  Then, the determination unit 103 determines whether or not the projection object 2 is stationary based on the difference value (step S13). When the determination unit 103 determines that the projection object 2 is not stationary (is moving) (No in step S13), the distance image acquisition unit 20A acquires a distance image again. On the other hand, when the determination unit 103 determines that the projection object 2 is stationary, the projection image generation unit 28 generates the projection image 5 (step S14), based on the distance image acquired at the first timing and/or the distance image acquired at the second timing. Thereafter, the projection instruction unit 105 outputs an instruction to project the projection image 5 (step S15), and the projection control unit 107 controls the projector unit 20B to perform projection onto the projection object 2 (step S16).
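The flow of FIG. 9 (steps S10 to S16) can be sketched as a simple control loop. This is a minimal illustration only: the function names, the threshold value, and the one-pixel "frames" are hypothetical, and the maximum-difference variant stands in for whichever difference value the device actually uses.

```python
THRESHOLD = 1.0  # assumed stationarity threshold (a "first threshold value")

def difference_value(s, t):
    """Scalar difference value between two distance images (max variant)."""
    return max(abs(a - b) for row_s, row_t in zip(s, t)
               for a, b in zip(row_s, row_t))

def run_projection(frames, threshold=THRESHOLD):
    """Wait until two consecutive distance images match, then project."""
    previous = frames[0]                         # S10: first timing
    for current in frames[1:]:                   # S11: second timing
        d = difference_value(previous, current)  # S12: difference value
        if d <= threshold:                       # S13: stationary?
            return "projected"                   # S14-S16: generate, instruct, project
        previous = current                       # not stationary: acquire again
    return "never stationary"

moving = [[[15.0]], [[13.0]], [[13.0]]]          # object approaches, then rests
print(run_projection(moving))  # projected
```

The loop mirrors the flowchart: as long as the difference value exceeds the threshold, acquisition repeats; once the object rests, projection proceeds.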

  Each of the configurations and functions described above can be realized as appropriate by arbitrary hardware, software, or a combination of both. For example, the present invention can also be applied to a program that causes a computer to execute the processing steps (processing procedure) described above, to a computer-readable recording medium (non-transitory recording medium) on which such a program is recorded, or to a computer on which such a program can be installed.

  As described above, according to the present invention, the projection image 5 is generated by detecting the shape of the projection object 2 based on the distance image generated by the distance image generation unit 113 of the distance image acquisition unit 20A. Further, according to the present invention, whether or not the projection object 2 is stationary is determined from the difference value between the distance image acquired at the first timing and the distance image acquired at the second timing, and projection is performed based on that determination. Since the present invention thus uses the distance image both to generate the projection image 5 and to determine whether the subject is stationary, the movement of the projection object 2 can be detected, and the projection image 5 can be accurately projected onto the projection object 2, while suppressing enlargement and complication of the apparatus.

[Second Embodiment]
Next, a second embodiment of the present invention will be described. In the present embodiment, a difference image is generated based on the difference value, and the movement of the projection object 2 is detected with high accuracy.

  FIG. 10 is a block diagram illustrating the configuration of the control unit 26 of the present embodiment. Parts already described with reference to FIG. 6 are given the same reference numerals, and their description is omitted. The control unit 26 mainly includes a distance image generation unit 113, a difference value acquisition unit 101, a determination unit 103, a difference image generation unit 111, a projection instruction unit 105, a projection control unit 107, and a projection stop instruction unit 109.

  Compared with the control unit 26 of the first embodiment illustrated in FIG. 6, the control unit 26 of the present embodiment illustrated in FIG. 10 differs in that it includes the difference image generation unit 111.

  The difference image generation unit 111 generates a difference image based on the difference values acquired by the difference value acquisition unit 101. That is, the difference image generation unit 111 creates a two-dimensional distribution map of difference values based on the difference values acquired from the difference value acquisition unit 101; here, the difference image is this two-dimensional distribution of difference values.

  FIG. 11 is a conceptual diagram showing a difference image. The difference image W shown in FIG. 11 is created based on the distance images and the difference values described with reference to FIGS. 7 and 8. The difference image may be displayed in different shades or colors depending on the difference values constituting it; in this way, when the difference image is displayed on a display screen (not shown), it becomes easy for a user to grasp the differences in the difference values. In the difference image shown in FIG. 11, the difference values are expressed as shading, and the background other than the projection object 2 and the stage 7 is omitted.

  FIG. 11A is a difference image W created from the difference values acquired from the distance images S and T shown in FIGS. 7A and 7B, showing the two-dimensional distribution of the acquired difference values. Since the difference value of the stage 7 between the distance images S and T is 0, the stage 7 is expressed in black. On the other hand, the projection object 2 is composed of regions (regions O, P, and Q) having different difference values between the distance images S and T.

  Since the region O has a difference value of 0 between the distance images S and T, it is shown in black like the stage 7. The difference value of the region Q is the difference between the middle distance 13 (FIG. 7) and the long distance 15 (FIG. 7), and the difference value of the region P is the difference between the middle distance 13 (FIG. 7) and the background. The background is assumed to be at infinity.

  FIG. 11B is a difference image W created from the difference values acquired from the distance images U and V shown in FIGS. 8A and 8B, showing the two-dimensional distribution of the acquired difference values. In the case shown in FIG. 11B, since the difference values of the projection object 2 and the stage 7 are 0, both the projection object 2 and the stage 7 are expressed in black. Thus, the difference image is configured as a two-dimensional distribution of the difference values of the distance information included in the distance images.

  Returning to FIG. 10, the determination unit 103 of the present embodiment can determine the movement of the projection object 2 in various ways based on the difference image W. For example, the determination unit 103 can determine whether or not the projection object 2 is stationary by comparing, against a threshold value (seventh threshold value), a value obtained by multiplying the difference value by a weighting coefficient set according to the region of the difference image. In this case, the difference image generation unit 111 acquires the value obtained by multiplying the difference value by the weighting coefficient set according to the region of the difference image.

  FIG. 12 is a conceptual diagram showing the difference image W. In the difference image W shown in FIG. 12, different weighting coefficients are set for the region F and the region E, and the difference image generation unit 111 acquires the value obtained by multiplying the difference value by the weighting coefficient. For example, the region F is weighted with a coefficient of 1.5, and the region E is weighted with a coefficient of 0.5. The difference image generation unit 111 then multiplies the difference value of the region F by the weighting coefficient of the region F, and the difference value of the region E by the weighting coefficient of the region E. Since the main subject often exists at the center of the image, the movement of the main subject (projection object 2) can thereby be detected more accurately. In FIG. 12, the difference image is divided into a region near the center and a region around it, but various modes can be adopted for setting the regions. For example, the difference image W can be divided into N × M (N and M are integers) grid-like regions, or the regions can be divided along the shape of the projection object 2.

  FIGS. 13 and 14 are diagrams conceptually showing the distance images S and T and the difference image W. FIG. 13A shows a distance image S and a distance image T in which the projection object 2 has moved to the vicinity of the center. The distance image S is captured at time T1, the distance image T is captured at time T2, and the projection object 2 moves to the right in FIG. 13 from time T1 to time T2. In FIG. 13A, the distance information is binarized between the projection object 2 and the background, and the projection object 2, which is closer to the projector device 20, is expressed in white against the background.

  FIG. 13B shows difference images (W1 and W2) between the distance image S and the distance image T shown in FIG. 13A. In FIG. 13B, a region having a difference value of 0 is expressed in black, and a region 117 whose difference value is not 0 is expressed in white. For example, the difference value of the region 117 of the difference images (W1 and W2) is set to 100. In the difference image W2, weights are set for the regions E and F as described with reference to FIG. 12, while no weight is set for the difference image W1. In the difference image W2, the difference value in the region F is weighted by the coefficient 1.5, so the difference image generation unit 111 obtains the value 100 × 1.5 = 150 by multiplying the difference value (100) of the region 117 by the weighting coefficient (1.5). On the other hand, since no weighting is set for the difference image W1, the difference value of W1 remains 100.

  FIG. 14A shows a distance image S in which a person 114 who is not the projection object 2 appears and a distance image T in which the person 114 does not appear. The distance image S is captured at time T1, and the distance image T is captured at time T2. In FIG. 14A, the distance information is binarized between the person 114 and the background, and the person 114, who is close to the projector device 20, is expressed in white against the background.

  FIG. 14B shows difference images (W1 and W2) between the distance image S and the distance image T shown in FIG. 14A, in which a region 119 whose difference value, corresponding to the person 114, is not 0 is expressed in white. For example, the difference value in the difference images (W1 and W2) is set to 50. In FIG. 14B, weighting is set as in FIG. 13B. In the difference image W2, the difference value in the region E is weighted by the coefficient 0.5, so the difference image generation unit 111 obtains the value 50 × 0.5 = 25 by multiplying the difference value (50) of the region 119 by the weighting coefficient (0.5). On the other hand, since no weighting is set for the difference image W1, the difference value of W1 remains 50.

  In this way, in the cases of FIG. 13B and FIG. 14B (when weighting is set according to the region), the difference value of the region F is evaluated as large, and the difference value of the region E is evaluated as small.

  Since weighting is thus performed according to the region of the difference image and the movement of the projection object 2 is determined based on the value obtained by multiplying the difference value by the weighting coefficient, the movement of the projection object 2 can be determined more accurately.
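The weighting of FIGS. 13B and 14B can be sketched as follows, reusing the coefficients and difference values from the text (1.5 and 0.5; 100 and 50). The function name and the region labels as dictionary keys are hypothetical.

```python
# Weighting coefficients per region of the difference image (FIG. 12):
# central region F is emphasized, peripheral region E is de-emphasized.
WEIGHTS = {"F": 1.5, "E": 0.5}

def weighted_difference(diff_value, region):
    """Multiply a region's difference value by its weighting coefficient."""
    return diff_value * WEIGHTS[region]

# Movement in the central region F (FIG. 13B): evaluated as larger.
print(weighted_difference(100, "F"))  # 150.0
# Movement in the peripheral region E (FIG. 14B): evaluated as smaller.
print(weighted_difference(50, "E"))   # 25.0
```

Comparing the weighted values against the seventh threshold value makes movement of the main subject near the image center count more than incidental movement, such as the person 114, at the periphery.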

  Further, the determination unit 103 may determine that the projection object 2 is stationary when the average difference value over a plurality of frames of the difference images generated by the difference image generation unit 111 is equal to or less than a threshold value (third threshold value). That is, the difference image generation unit 111 acquires the average difference value within each difference image over a plurality of frames, and the determination unit 103 determines whether the projection object 2 is stationary based on the average of those values. Since the determination is based on the average over a plurality of frames of difference images, the movement of the projection object 2 can be determined more reliably.
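The multi-frame averaging can be sketched as follows (frame shapes, values, and the function name are hypothetical):

```python
import numpy as np

def stationary_by_average(diff_frames, third_threshold):
    # Average the per-frame mean difference values over several frames;
    # judge the object stationary when that average is at or below the threshold.
    frame_means = [float(np.mean(f)) for f in diff_frames]
    return sum(frame_means) / len(frame_means) <= third_threshold

# Three difference-image frames with mean values 2, 4, and 3 -> average 3.
frames = [np.full((2, 2), v) for v in (2.0, 4.0, 3.0)]
steady = stationary_by_average(frames, third_threshold=3.0)
```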

  Further, the determination unit 103 can determine that the projection object 2 is stationary when the maximum difference value over a plurality of frames of the difference images is equal to or less than a threshold value (fourth threshold value). That is, the difference image generation unit 111 acquires the maximum difference value in the difference images of the plurality of frames, and the determination unit 103 determines whether or not the projection object 2 is stationary based on that maximum difference value. Since the determination is based on the maximum difference value, the movement of the projection object 2 can be determined more reliably.
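The maximum-value criterion differs from the average in that a single large change in any frame is enough to indicate movement. A minimal sketch (values and names are illustrative):

```python
import numpy as np

def stationary_by_max(diff_frames, fourth_threshold):
    # Judge stationary when the largest difference value seen in any
    # of the frames is at or below the threshold.
    return bool(max(float(np.max(f)) for f in diff_frames) <= fourth_threshold)

# Three quiet frames with a single brief spike of 7 in one of them.
frames = [np.zeros((2, 2)) for _ in range(3)]
frames[1][0, 1] = 7.0

still = stationary_by_max(frames, fourth_threshold=10.0)     # 7 <= 10
moved = not stationary_by_max(frames, fourth_threshold=5.0)  # 7 > 5
```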

  In addition, the determination unit 103 may determine whether or not the projection object 2 is stationary based on the size of the region in which the difference value is equal to or less than a threshold value (fifth threshold value) in the difference image. That is, the difference image generation unit 111 calculates the area at or below the threshold in the difference image, and the determination unit 103 determines that the projection object 2 is stationary when that area is larger than a certain size. Further, the difference image generation unit 111 may calculate the area at or below a threshold value (sixth threshold value) in the difference images of a plurality of frames, and the determination unit 103 may determine that the projection object 2 is stationary when that area is larger than a certain size.
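The area-based criterion can be sketched by counting unchanged pixels; the `min_area_ratio` parameter is a hypothetical stand-in for the "certain size" mentioned in the text:

```python
import numpy as np

def stationary_by_area(diff_image, fifth_threshold, min_area_ratio=0.9):
    # Count the pixels whose difference value is at or below the threshold;
    # judge stationary when that unchanged area is large enough.
    # min_area_ratio stands in for the text's "certain size".
    unchanged = np.count_nonzero(diff_image <= fifth_threshold)
    return unchanged / diff_image.size >= min_area_ratio

# 5 of 100 pixels changed -> 95% of the image is unchanged.
diff = np.zeros((10, 10))
diff[0, :5] = 20.0
calm = stationary_by_area(diff, fifth_threshold=1.0)
```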

  Further, the difference value acquisition unit 101 can acquire or calculate various values related to the difference image as values (hereinafter referred to as evaluation values) by which the determination unit 103 determines the movement of the projection object 2. For example, the difference image generation unit 111 may use the number of pixels whose difference value between the distance image S and the distance image T is not 0 as the evaluation value. The difference image generation unit 111 may also use the variance or standard deviation of the difference values of the difference image as the evaluation value.
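The candidate evaluation values named above (changed-pixel count, variance, standard deviation) can be computed directly from the difference image. A minimal sketch, with a hypothetical function name:

```python
import numpy as np

def evaluation_values(distance_s, distance_t):
    # Candidate evaluation values derived from the difference image:
    # the count of changed (nonzero-difference) pixels, and the
    # variance and standard deviation of the difference values.
    diff = np.abs(distance_s.astype(float) - distance_t.astype(float))
    return {
        "nonzero_pixels": int(np.count_nonzero(diff)),
        "variance": float(np.var(diff)),
        "std_dev": float(np.std(diff)),
    }

s = np.zeros((2, 2))
t = np.zeros((2, 2))
t[0, 0] = 4.0
ev = evaluation_values(s, t)  # exactly one changed pixel
```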

  Examples of the present invention have been described above, but the present invention is not limited to the above-described embodiments, and it goes without saying that various modifications can be made without departing from the spirit of the present invention.

  DESCRIPTION OF SYMBOLS 2 ... Projection object, 7 ... Stage, 20 ... Projector apparatus with distance image acquisition apparatus, 20A ... Distance image acquisition part, 20B ... Projector part, 26 ... Control part, 27 ... Memory, 28 ... Projection image generation part, 29 ... Input I/F, 31 ... Timing generator, 32 ... Light source for measurement, 33 ... Light source driver, 35 ... Projection lens, 36 ... Imaging lens, 37 ... Lens driver, 38 ... Distance image sensor, 39 ... AD converter, 40 ... Interface circuit, 42 ... Display optical element, 43 ... Element driver, 44 ... Projection light source, 45 ... Light source driver, 46 ... Projection lens, 47 ... Lens driver, 49 ... Data bus, 101 ... Difference value acquisition part, 103 ... Determination part, 105 ... Projection instruction part, 107 ... Projection control part, 109 ... Projection stop instruction part, 111 ... Difference image generation part, 113 ... Distance image generation part

Claims (12)

  1. A projector with a distance image acquisition device, comprising: a projector device including a display optical element that displays a projection image, and a projection light source and a projection lens that project the projection image displayed on the display optical element onto a projection object; and
    a distance image acquisition device including a distance image sensor in which a plurality of light receiving elements are arranged two-dimensionally, a measurement light source, an imaging lens that forms on the distance image sensor an image of the measurement light emitted from the measurement light source and reflected by the projection object, and a distance image generation unit that acquires from the distance image sensor distance information corresponding to the time of flight of the measurement light emitted from the measurement light source, reflected by the projection object, and incident on the distance image sensor, and generates a distance image based on the acquired distance information,
    The projector device includes:
    A projection image generation unit that detects a shape of the projection object based on the distance image acquired by the distance image acquisition device, and generates a projection image corresponding to the detected shape;
    A difference value acquisition unit that acquires a difference value between the distance information included in the first distance image acquired at the first timing by the distance image acquisition device and the distance information included in the second distance image acquired at the second timing;
    A determination unit that determines whether or not the projection object is stationary based on the difference value acquired by the difference value acquisition unit;
    A projection instruction unit that outputs an instruction to project the image generated by the projection image generation unit on the projection object based on the determination result of the determination unit;
    A projection control unit that controls projection of the projector device based on an instruction to perform the projection output from the projection instruction unit;
    wherein
    the difference value acquisition unit acquires a difference value between an average value of the distance information included in the first distance image and an average value of the distance information included in the second distance image, and
    the determination unit determines that the projection object is stationary when the difference value is equal to or less than a first threshold value.
  2. A projector with a distance image acquisition device, comprising: a projector device including a display optical element that displays a projection image, and a projection light source and a projection lens that project the projection image displayed on the display optical element onto a projection object; and
    a distance image acquisition device including a distance image sensor in which a plurality of light receiving elements are arranged two-dimensionally, a measurement light source, an imaging lens that forms on the distance image sensor an image of the measurement light emitted from the measurement light source and reflected by the projection object, and a distance image generation unit that acquires from the distance image sensor distance information corresponding to the time of flight of the measurement light emitted from the measurement light source, reflected by the projection object, and incident on the distance image sensor, and generates a distance image based on the acquired distance information,
    The projector device includes:
    A projection image generation unit that detects a shape of the projection object based on the distance image acquired by the distance image acquisition device, and generates a projection image corresponding to the detected shape;
    A difference value acquisition unit that acquires a difference value between the distance information included in the first distance image acquired at the first timing by the distance image acquisition device and the distance information included in the second distance image acquired at the second timing;
    A determination unit that determines whether or not the projection object is stationary based on the difference value acquired by the difference value acquisition unit;
    A projection instruction unit that outputs an instruction to project the image generated by the projection image generation unit on the projection object based on the determination result of the determination unit;
    A projection control unit that controls projection of the projector device based on an instruction to perform the projection output from the projection instruction unit;
    wherein
    the difference value acquisition unit acquires a maximum difference value between the distance information included in the first distance image and the distance information included in the second distance image, and
    the determination unit determines that the projection object is stationary when the maximum difference value is equal to or less than a second threshold value.
  3. A projector with a distance image acquisition device, comprising: a projector device including a display optical element that displays a projection image, and a projection light source and a projection lens that project the projection image displayed on the display optical element onto a projection object; and
    a distance image acquisition device including a distance image sensor in which a plurality of light receiving elements are arranged two-dimensionally, a measurement light source, an imaging lens that forms on the distance image sensor an image of the measurement light emitted from the measurement light source and reflected by the projection object, and a distance image generation unit that acquires from the distance image sensor distance information corresponding to the time of flight of the measurement light emitted from the measurement light source, reflected by the projection object, and incident on the distance image sensor, and generates a distance image based on the acquired distance information,
    The projector device includes:
    A projection image generation unit that detects a shape of the projection object based on the distance image acquired by the distance image acquisition device, and generates a projection image corresponding to the detected shape;
    A difference value acquisition unit that acquires a difference value between the distance information included in the first distance image acquired at the first timing by the distance image acquisition device and the distance information included in the second distance image acquired at the second timing;
    A determination unit that determines whether or not the projection object is stationary based on the difference value acquired by the difference value acquisition unit;
    A projection instruction unit that outputs an instruction to project the image generated by the projection image generation unit on the projection object based on the determination result of the determination unit;
    A projection control unit that controls projection of the projector device based on an instruction to perform the projection output from the projection instruction unit;
    and a difference image generation unit that generates a difference image based on the difference value acquired by the difference value acquisition unit.
  4. The projector with a distance image acquisition device according to claim 3, wherein the difference image generation unit acquires an average difference value over a plurality of frames of the difference image, and
    the determination unit determines that the projection object is stationary when the average difference value over the plurality of frames of the difference image is equal to or less than a third threshold value.
  5. The projector with a distance image acquisition device according to claim 3, wherein the difference image generation unit acquires a maximum difference value over a plurality of frames of the difference image, and
    the determination unit determines that the projection object is stationary when the maximum difference value over the plurality of frames of the difference image is equal to or less than a fourth threshold value.
  6. The projector with a distance image acquisition device according to claim 3, wherein the difference image generation unit acquires a difference value for each region of the difference image, and
    the determination unit determines whether or not the projection object is stationary based on the size of the region in which the difference value is equal to or less than a fifth threshold value in the difference image.
  7. The projector with a distance image acquisition device according to claim 3, wherein the difference image generation unit acquires a difference value for each region of the difference image in a plurality of frames, and
    the determination unit determines whether or not the projection object is stationary based on the size of the region in which the difference value is equal to or less than a sixth threshold value in the difference images of the plurality of frames.
  8. The projector with a distance image acquisition device according to claim 3, wherein the difference image generation unit obtains a value by multiplying the difference value by a weighting coefficient set according to a region of the difference image, and
    the determination unit determines that the projection object is stationary when the value obtained by multiplying the difference value by the weighting coefficient is equal to or less than a seventh threshold value.
  9. The projector with a distance image acquisition device according to any one of claims 1 to 8, further comprising a projection stop instruction unit that outputs, based on the determination result of the determination unit, an instruction to stop projection of the image generated by the projection image generation unit onto the projection object,
    wherein the projection control unit controls the projection of the projector device based on the projection stop instruction output from the projection stop instruction unit.
  10. A projection method using a projector with a distance image acquisition device, the projector comprising: a projector device including a display optical element that displays a projection image, and a projection light source and a projection lens that project the projection image displayed on the display optical element onto a projection object; and
      a distance image acquisition device including a distance image sensor in which a plurality of light receiving elements are arranged two-dimensionally, a measurement light source, an imaging lens that forms on the distance image sensor an image of the measurement light emitted from the measurement light source and reflected by the projection object, and a distance image generation unit that acquires from the distance image sensor distance information corresponding to the time of flight of the measurement light emitted from the measurement light source, reflected by the projection object, and incident on the distance image sensor, and generates a distance image based on the acquired distance information, the projection method comprising:
    A projection image generation step of detecting a shape of the projection object based on the distance image acquired by the distance image acquisition device and generating a projection image corresponding to the detected shape;
      A difference value acquisition step of acquiring a difference value between the distance information included in the first distance image acquired at the first timing by the distance image acquisition device and the distance information included in the second distance image acquired at the second timing;
    A determination step of determining whether or not the projection object is stationary based on the difference value acquired in the difference value acquisition step;
    A projection instruction step for outputting an instruction to project the image generated in the projection image generation step to the projection object based on the determination result of the determination step;
    A projection control step of controlling projection of the projector device based on an instruction to perform the projection output from the projection instruction step;
    Obtaining a difference value between an average value of distance information included in the first distance image and an average value of distance information included in the second distance image;
    Determining that the projection object is stationary when the difference value is equal to or less than a first threshold;
    A projection method including:
  11. A projection method using a projector with a distance image acquisition device, the projector comprising: a projector device including a display optical element that displays a projection image, and a projection light source and a projection lens that project the projection image displayed on the display optical element onto a projection object; and
      a distance image acquisition device including a distance image sensor in which a plurality of light receiving elements are arranged two-dimensionally, a measurement light source, an imaging lens that forms on the distance image sensor an image of the measurement light emitted from the measurement light source and reflected by the projection object, and a distance image generation unit that acquires from the distance image sensor distance information corresponding to the time of flight of the measurement light emitted from the measurement light source, reflected by the projection object, and incident on the distance image sensor, and generates a distance image based on the acquired distance information, the projection method comprising:
      A projection image generation step of detecting a shape of the projection object based on the distance image acquired by the distance image acquisition device and generating a projection image corresponding to the detected shape;
      A difference value acquisition step of acquiring a difference value between the distance information included in the first distance image acquired at the first timing by the distance image acquisition device and the distance information included in the second distance image acquired at the second timing;
      A determination step of determining whether or not the projection object is stationary based on the difference value acquired in the difference value acquisition step;
      A projection instruction step for outputting an instruction to project the image generated in the projection image generation step to the projection object based on the determination result of the determination step;
      A projection control step of controlling projection of the projector device based on an instruction to perform the projection output from the projection instruction step;
    Obtaining a maximum difference value between the distance information of the first distance image and the distance information of the second distance image;
      Determining that the projection object is stationary when the maximum difference value is equal to or less than a second threshold;
      A projection method including:
  12. A projection method using a projector with a distance image acquisition device, the projector comprising: a projector device including a display optical element that displays a projection image, and a projection light source and a projection lens that project the projection image displayed on the display optical element onto a projection object; and
      a distance image acquisition device including a distance image sensor in which a plurality of light receiving elements are arranged two-dimensionally, a measurement light source, an imaging lens that forms on the distance image sensor an image of the measurement light emitted from the measurement light source and reflected by the projection object, and a distance image generation unit that acquires from the distance image sensor distance information corresponding to the time of flight of the measurement light emitted from the measurement light source, reflected by the projection object, and incident on the distance image sensor, and generates a distance image based on the acquired distance information, the projection method comprising:
      A projection image generation step of detecting a shape of the projection object based on the distance image acquired by the distance image acquisition device and generating a projection image corresponding to the detected shape;
      A difference value acquisition step of acquiring a difference value between the distance information included in the first distance image acquired at the first timing by the distance image acquisition device and the distance information included in the second distance image acquired at the second timing;
      A determination step of determining whether or not the projection object is stationary based on the difference value acquired in the difference value acquisition step;
      A projection instruction step for outputting an instruction to project the image generated in the projection image generation step to the projection object based on the determination result of the determination step;
      A projection control step of controlling projection of the projector device based on an instruction to perform the projection output from the projection instruction step;
      A difference image generation step of generating a difference image based on the difference value;
      A projection method including:
JP2017543009A 2015-09-29 2016-08-19 Projector device with distance image acquisition device and projection method Active JP6467516B2 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
JP2015191761 2015-09-29
PCT/JP2016/074251 WO2017056776A1 (en) 2015-09-29 2016-08-19 Projector device equipped with distance image acquisition device and projection method

Publications (2)

Publication Number Publication Date
JPWO2017056776A1 JPWO2017056776A1 (en) 2018-08-09
JP6467516B2 true JP6467516B2 (en) 2019-02-13

Family

ID=58423434

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2017543009A Active JP6467516B2 (en) 2015-09-29 2016-08-19 Projector device with distance image acquisition device and projection method

Country Status (4)

Country Link
US (1) US10663593B2 (en)
JP (1) JP6467516B2 (en)
CN (1) CN108140358A (en)
WO (1) WO2017056776A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018152022A (en) * 2017-03-15 2018-09-27 セイコーエプソン株式会社 Projector system
DE102018105621A1 (en) * 2018-03-12 2019-09-12 Salzbrenner STW-Inside GmbH Video projection device
WO2020145097A1 (en) * 2019-01-10 2020-07-16 株式会社小糸製作所 LiDAR SENSOR UNIT AND VEHICLE SECURITY SYSTEM

Family Cites Families (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3407952B2 (en) * 1993-10-19 2003-05-19 日本信号株式会社 Train door detection device
JP3795647B2 (en) * 1997-10-29 2006-07-12 株式会社竹中工務店 Hand pointing device
US20020063807A1 (en) * 1999-04-19 2002-05-30 Neal Margulis Method for Performing Image Transforms in a Digital Display System
JP3495355B2 (en) * 2001-11-05 2004-02-09 三菱電機株式会社 Pulse radar equipment
JP2003255219A (en) * 2002-03-04 2003-09-10 Nisca Corp Automatic framing camera
JP3772870B2 (en) * 2003-08-25 2006-05-10 カシオ計算機株式会社 Projection apparatus, projection method, and program
JP2005223393A (en) * 2004-02-03 2005-08-18 Casio Comput Co Ltd Projector, projecting method, and projection program
JP4670303B2 (en) * 2004-10-06 2011-04-13 ソニー株式会社 Image processing method and image processing apparatus
JP2008242658A (en) * 2007-03-26 2008-10-09 Funai Electric Co Ltd Three-dimensional object imaging apparatus
CN101726978B (en) * 2008-10-29 2012-09-12 精工爱普生株式会社 Projector and projector control method
US10410500B2 (en) * 2010-09-23 2019-09-10 Stryker Corporation Person support apparatuses with virtual control panels
US8681255B2 (en) * 2010-09-28 2014-03-25 Microsoft Corporation Integrated low power depth camera and projection device
KR20120055991A (en) * 2010-11-24 2012-06-01 삼성전자주식회사 Image processing apparatus and control method thereof
JP2012208439A (en) * 2011-03-30 2012-10-25 Sony Corp Projection device, projection method and projection program
JP2013033206A (en) * 2011-07-06 2013-02-14 Ricoh Co Ltd Projection display device, information processing device, projection display system, and program
JP2013061552A (en) * 2011-09-14 2013-04-04 Ricoh Co Ltd Projector device and operation detection method
EP2635022A1 (en) * 2012-02-29 2013-09-04 Flir Systems AB A method and system for performing alignment of a projection image to detected infrared (IR) radiation information
EP2634747A1 (en) * 2012-02-29 2013-09-04 Flir Systems AB A method and system for projecting a visible representation of infrared radiation
JP2013192189A (en) 2012-03-15 2013-09-26 Casio Comput Co Ltd Image processing device, projection system, program and image processing method
JP5842694B2 (en) * 2012-03-21 2016-01-13 セイコーエプソン株式会社 Image processing apparatus, projector, and projector control method
KR101793628B1 (en) * 2012-04-08 2017-11-06 삼성전자주식회사 Transparent display apparatus and method thereof
JP2014041225A (en) * 2012-08-22 2014-03-06 Canon Inc Liquid crystal projector and projection method of liquid crystal projector
US9996909B2 (en) * 2012-08-30 2018-06-12 Rakuten, Inc. Clothing image processing device, clothing image display method and program
CN104755981B (en) * 2012-11-14 2017-04-12 富士胶片株式会社 Image processor, image-capturing device and image processing method
JP2014137762A (en) * 2013-01-18 2014-07-28 Sanyo Electric Co Ltd Object detector
US9405124B2 (en) * 2013-04-09 2016-08-02 Massachusetts Institute Of Technology Methods and apparatus for light field projection
JP2015038595A (en) * 2013-07-19 2015-02-26 キヤノン株式会社 Video generation device, and video generation method
JP5842110B2 (en) * 2013-10-10 2016-01-13 パナソニックIpマネジメント株式会社 Display control device, display control program, and recording medium
US10122976B2 (en) * 2014-12-25 2018-11-06 Panasonic Intellectual Property Management Co., Ltd. Projection device for controlling a position of an image projected on a projection surface
JP6101944B2 (en) * 2014-12-25 2017-03-29 パナソニックIpマネジメント株式会社 Projection device

Also Published As

Publication number Publication date
JPWO2017056776A1 (en) 2018-08-09
WO2017056776A1 (en) 2017-04-06
US20180224553A1 (en) 2018-08-09
US10663593B2 (en) 2020-05-26
CN108140358A (en) 2018-06-08


Legal Events

Date Code Title Description
2018-03-20 A521 Written amendment (JAPANESE INTERMEDIATE CODE: A523)
2018-03-20 A621 Written request for application examination (JAPANESE INTERMEDIATE CODE: A621)
TRDD Decision of grant or rejection written
2019-01-04 A01 Written decision to grant a patent or to grant a registration (utility model) (JAPANESE INTERMEDIATE CODE: A01)
2019-01-11 A61 First payment of annual fees (during grant procedure) (JAPANESE INTERMEDIATE CODE: A61)
R150 Certificate of patent or registration of utility model (Ref document number: 6467516; Country of ref document: JP; JAPANESE INTERMEDIATE CODE: R150)