WO2015162722A1 - Image projection device - Google Patents

Image projection device

Info

Publication number
WO2015162722A1
Authority
WO
WIPO (PCT)
Prior art keywords
sensor
detection
image
image projection
gesture
Prior art date
Application number
PCT/JP2014/061434
Other languages
English (en)
Japanese (ja)
Inventor
将史 山本
瀬尾 欣穂
浦田 浩之
Original Assignee
日立マクセル株式会社
Priority date
Filing date
Publication date
2014-04-23 Application filed by 日立マクセル株式会社
2014-04-23 Priority to PCT/JP2014/061434
2015-10-29 Publication of WO2015162722A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033: Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/0346: Pointing devices with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G06F 3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/042: Digitisers by opto-electronic means

Definitions

  • the present invention relates to an image projection apparatus having a gesture detection function.
  • Patent Document 1 is cited as background art in this technical field.
  • The problem addressed there is to provide "an image projection apparatus that can detect an operator's gesture made in a predetermined direction different from the projection direction of the image light, that can detect the gesture even when the distance between the projector and the projection surface is short, and that can further improve the detection sensitivity of the operator's gesture without reducing the visibility of the image projected on the projection surface."
  • As a solution, the document describes "an image projection apparatus 1 that modulates light from a light source into image light with a light modulation element, projects the image light through an imaging optical system, and displays an image on the projection surface 20, the apparatus being equipped with gesture detection means 13 capable of detecting a gesture JS of an operator OP performed in a predetermined direction different from the projection direction D1, detection range setting means for determining the coverage area of gesture detection by the gesture detection means 13, and control signal generating means for generating a control signal based on gesture information that is the detection result of the gesture detection means 13."
  • In the gesture detection means of Patent Document 1, however, an operator's gesture is recognized whenever an object enters the detection range, without considering whether the object is a hand or a human body. When a person merely passes through the detection range, for example, the passage is recognized as a gesture, and control contrary to the operator's intention is performed.
  • In a video projector, particularly a desktop projector using a short-focus lens, the operator operates at a position close to the desk surface onto which the video is projected, so the detection range often includes not only the operator's hand but also the operator's body behind it.
  • An object of the present invention is to provide an image projection device that improves gesture detection accuracy by distinguishing between the operator's hand and body, and that performs control according to the operator's intention.
  • To this end, the present invention is an image projection apparatus capable of controlling the projected image based on an operator's gesture, comprising an image projection unit that generates image light based on an image signal and projects the image on a projection surface, a sensor that detects the operator's gesture, and an operation detection unit that generates an operation signal for controlling the projected image based on the gesture detected by the sensor, the sensor including a plurality of detection elements.
  • When the projection surface is the installation surface on which the video projection device is installed, at least one predetermined detection element among the plurality of detection elements is set so that the ratio of the projected video screen included in its detection range is equal to or higher than a predetermined ratio.
  • This makes it possible to provide an image projection apparatus that performs control according to the operator's intention, without erroneous detection of gestures.
  • FIG. 1 is an overview of the video projection apparatus according to Embodiment 1. FIG. 2 is a block diagram showing its internal structure. FIGS. 3 to 5 show cases where the sensor is arranged at different positions on the apparatus.
  • FIG. 13 shows the frequency components of the signals obtained from the motion of a hand and the motion of a human body (Example 2).
  • FIG. 16 shows an example of the detection circuit configuration of a sensor using a thermopile (Example 3).
  • FIG. 19 illustrates a gesture in the z direction (Example 4). FIG. 20 shows the relationship between the hand and the detection range during a z-direction gesture. FIG. 21 shows the time change of the sensor signal of each CH during a z-direction operation.
  • FIG. 23 illustrates the time variation of the output signals of a normal detection element and a detection element in a failure state (Example 5). FIG. 24 shows a flowchart of failure determination of a detection element. FIG. 25 shows patterns of hand gestures. FIG. 26 shows the patterns excluded from error determination. FIG. 27 shows the types of gestures (Example 6).
  • FIG. 1 is an overview of the image projection apparatus according to the first embodiment.
  • the image projection apparatus 1 is installed on an installation surface (desk) 11, magnifies the image light generated inside the apparatus with the projection lens 4d, reflects it with the reflection mirror 4e, and projects the image screen 10 on the installation surface 11.
  • the focus of the projected image is adjusted by the focus ring 15.
  • the reflection mirror 4e is configured to be foldable, and is stored so that the reflection surface faces the image projection device 1 when not in use.
  • the video projection device 1 includes a sensor 2 that detects an operator's gesture. The sensor 2 detects the movement and direction of the operator's hand within the detection range 2a.
  • the sensor 2 may have a light source to detect an operator's hand, or may be a passive sensor that does not have a light source.
  • for example, a pyroelectric sensor, which is a temperature sensor that detects the heat of the operator's hand, can be used.
  • FIG. 2 is a block diagram showing the internal configuration of the image projection apparatus 1.
  • the video projection device 1 includes an operation detection unit 3 and a video projection unit 4.
  • the operation detection unit 3 includes a signal detection unit 3a, a gesture determination unit 3b, and an operation signal generation unit 3c.
  • the signal detection unit 3a detects the signal from the sensor 2 and supplies it to the gesture determination unit 3b.
  • the gesture determination unit 3b performs processing for determining various gesture movements based on the supplied signal.
  • the operation signal generation unit 3c generates an operation signal corresponding to the determination result of the gesture determination unit 3b and outputs the operation signal to an external device 6 such as a PC (Personal Computer) or a smartphone.
  • the external device 6 controls the video signal supplied to the video projection device 1 according to the operation signal from the operation signal generation unit 3c.
  • the operation signal is generated based on the gesture of the operator, and the image projected from the image projection apparatus 1 is controlled by the operation signal. For example, according to the direction in which the operator moves his / her hand, an operation of scrolling, frame-by-frame advancement, or switching between slides is performed.
  • the external device 6 may be any device that supplies a video signal to the video projection device 1.
  • a card-like storage medium inserted into a card interface provided in the video projection device 1 may be used.
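  • As an illustration of this chain from gesture determination to device control, a minimal Python sketch follows; the patent defines no software interface, so the Gesture names and command strings below are hypothetical assumptions.

```python
from enum import Enum, auto

class Gesture(Enum):
    SWIPE_X = auto()        # hand swept in the x direction
    SWIPE_X_PRIME = auto()  # hand swept in the x' direction
    PUSH_Z = auto()         # pushing (covering) motion in the z direction

# Assumed mapping from a determined gesture to an operation signal for the
# external device 6 (scrolling, frame-by-frame advance, slide switching).
COMMAND_FOR_GESTURE = {
    Gesture.SWIPE_X: "SLIDE_NEXT",
    Gesture.SWIPE_X_PRIME: "SLIDE_PREV",
    Gesture.PUSH_Z: "SCROLL",
}

def operation_signal(gesture):
    """Operation signal generation unit 3c: translate a gesture into a command."""
    return COMMAND_FOR_GESTURE.get(gesture)
```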
  • the video projection unit 4 includes a video control unit 4a, a light source unit 4b, a light control unit 4c, a projection lens 4d, and a projection mirror 4e.
  • the video control unit 4a sends a control signal to the light source unit 4b and the light control unit 4c in accordance with the video signal supplied from the external device 6.
  • the light source unit 4b is a light source such as a halogen lamp, an LED, or a laser, and adjusts the amount of light according to the control signal of the video control unit 4a. When the light source unit 4b includes three colors of R, G, and B, the light amount may be controlled independently according to the video signal.
  • the light control unit 4c has optical system components such as a mirror, a lens, a prism, and an imager (for example, a display device such as a liquid crystal panel), and generates an optical image based on the video signal supplied from the external device 6, using the light emitted from the light source unit 4b.
  • the projection lens 4d enlarges the image output from the light control unit 4c.
  • the projection mirror 4e reflects the light emitted from the projection lens 4d and projects the video screen 10 on the installation surface 11.
  • the projection mirror 4e is an aspherical mirror, so that when projecting an image screen of the same size, the projection distance can be made shorter than with a general projection apparatus.
  • the projection lens 4d and the projection mirror 4e are collectively referred to as a projection optical unit.
  • the installation surface (desk surface) 11 is an xz plane, and the image projection apparatus 1 is installed upright in the y direction.
  • the operator is positioned in the z direction with respect to the image projection apparatus 1.
  • FIG. 3 is a diagram showing a case where the sensor 2 is arranged on the upper side (y direction) of the image projection device 1.
  • the detection range of the sensor 2 at this time is indicated by 2a.
  • FIG. 3 shows the positional relationship between the human body 20 and the hand 21 of the operator at this time.
  • FIG. 4 is a diagram showing a case where the sensor 2 is arranged on the upper side of the image projection device 1 and the detection range 2a is enlarged.
  • In FIG. 4 the detection range 2a is expanded so as to include the video screen 10.
  • In this case a gesture can be detected satisfactorily at positions close to the video projection apparatus 1, but the detection sensitivity drops at the end of the video screen 10 far from the apparatus, making judgment difficult. It is therefore necessary to enable detection at a height close to the installation surface without reducing the sensitivity at the end of the video screen.
  • FIG. 5 is a diagram showing a case where the sensor 2 is arranged on the lower side of the image projection device 1.
  • the sensitivity does not decrease regardless of the distance from the sensor 2, so that gesture detection is possible on a video screen close to the installation surface.
  • the average length of the hand is 19 cm.
  • FIG. 6 is a diagram showing an example of erroneous detection due to the movement of the human body.
  • the sensor 2 recognizes the movement and direction of the human body 20 as a gesture, and erroneous detection occurs. Therefore, it is necessary to separate the gesture of the hand 21 from the movement of the human body 20.
  • the following describes a configuration that enables intuitive operation on a video screen close to the installation surface, without reducing the sensitivity at the edge of the video screen, while suppressing false detection. That is, the configuration of the sensor 2 for suppressing erroneous detection and the setting of the detection range 2a for separating the movements of the hand and the human body are described.
  • FIGS. 7A and 7B are diagrams showing a configuration example and the detection range of the sensor 2; FIG. 7A is viewed from the z direction, and FIG. 7B from the y and x directions.
  • the sensor 2 has a plurality (four in this case) of detection elements 16, and each of the detection elements 16 has a different detection range.
  • hereinafter, the detection elements located in the y′, x′, y, and x directions from the center of the sensor 2 are referred to as CH1, CH2, CH3, and CH4, respectively.
  • the sensor 2 detects a gesture using the signal amplitude of each detection element 16 and the timing of the signal outputs. For example, when a hand is swept from the x side to the x′ side, the CH1 and CH3 signals are detected after the CH4 signal is output, and finally the CH2 signal is detected; a time difference thus appears between the signal outputs of CH4 and CH2.
  • From this order and time difference, the gesture is determined to be a sweep from x toward the x′ direction.
  • here, four detection elements 16 are used; however, the number of detection elements is not limited to four and may be increased or decreased according to the movement directions of the hand gestures to be detected. A minimal sketch of this timing-based determination follows.
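  • The sketch below classifies an x-axis sweep from the CH4/CH2 peak times (Python; the peak-based comparison and the min_gap threshold are illustrative assumptions, not the patent's specified algorithm):

```python
import numpy as np

def x_swipe_direction(t, ch2, ch4, min_gap=0.05):
    """Classify an x-axis swipe from the peak times of CH4 (x side) and
    CH2 (x' side); t is the array of sample times in seconds."""
    t4 = t[np.argmax(ch4)]  # time at which the CH4 signal peaks
    t2 = t[np.argmax(ch2)]  # time at which the CH2 signal peaks
    if t2 - t4 > min_gap:
        return "x to x'"    # hand entered on the x side first
    if t4 - t2 > min_gap:
        return "x' to x"
    return "undetermined"
```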
  • FIG. 7B shows an example of the detection range 2a of each detection element 16 in the xz-section and yz-section directions of the sensor 2.
  • the detection ranges 2a of the detection elements CH1 to CH4 are arranged so as to spread radially (in the x and y directions) from the central axis (z direction) of the sensor 2.
  • the central axes of the CH detection ranges 2a need not extend radially; it suffices that the detection ranges 2a of the CHs differ from one another, even if their axes are parallel to the central axis of the sensor 2.
  • a part of the detection range 2a of each CH may overlap.
  • FIG. 8 is a diagram illustrating an example of setting the detection range 2a.
  • At least one predetermined detection element among the detection elements 16 is set so that the proportion of its detection range 2a occupied by the video screen 10 is equal to or higher than a predetermined ratio.
  • the ratio of the video screen 10 to the detection range 2a is referred to as “video screen occupation ratio”.
  • here the predetermined detection element is CH1, which is set so that its video screen occupation ratio is larger than those of the other elements.
  • this makes it possible to separate the movement of the operator's hand on the video screen from the movement of the operator's body.
  • hereinafter, a predetermined detection element whose video screen occupation ratio is equal to or higher than the predetermined value is referred to as a "gesture monitoring element" (reference numeral 16g). In FIGS. 7 and 8, CH1 is the gesture monitoring element 16g.
  • FIG. 9 is a diagram for explaining the separation of the movement of the hand and the human body by the gesture monitoring element.
  • FIG. 9 shows changes in sensor signals generated in the gesture monitoring element CH1 (16g) and the other detection elements CH3 with respect to the movement of the hand 21 and the movement of the human body 20 shown in FIG.
  • in the gesture monitoring element CH1, the movement of the hand 21 crosses the detection range 2a (CH1), so a large signal change occurs, whereas the movement of the human body 20 crosses only a small part of the detection range 2a (CH1), so the signal change is small. The other detection element CH3, in contrast, detects both the movement of the hand 21 and that of the human body 20, so its signal change is large in either case. In this manner the gesture monitoring element detects the movement of the operator's hand preferentially over other body movements, and the movements of the hand and the body can be separated by comparing the signal level of the gesture monitoring element with those of the other detection elements.
  • the video screen occupation ratio of the gesture monitoring element may be determined so as to achieve this separation; in practice, an occupation ratio of about 50% or more is satisfactory.
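  • The comparison of FIG. 9 can be sketched as follows (a minimal sketch; the 0.5 decision ratio is an assumption suggested by the ~50% occupation figure above, not a value from the patent):

```python
def classify_motion(monitor_peak, other_peak, ratio=0.5):
    """Separate hand and body from the peak signal levels of the gesture
    monitoring element CH1 and another detection element such as CH3.

    A hand sweeping over the video screen crosses the monitoring element's
    screen-covering range and drives it strongly; a body moving behind the
    screen drives CH3 but barely registers on CH1.
    """
    if monitor_peak >= ratio * other_peak:
        return "hand"   # treat as an operator gesture
    return "body"       # strong CH3 without CH1 support: ignore
```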
  • in the above, the gesture monitoring element 16g is included in the sensor 2, but it may also be provided separately from the sensor 2. Suppose the gesture monitoring element is included in the sensor 2 and the sensor 2 is tilted downward (toward the installation surface). The gesture monitoring element then has its detection range on the video screen (installation surface), but the other detection elements are also directed downward, so the detection sensitivity at the edge of the video screen may drop. Providing the gesture monitoring element as a separate body, by contrast, increases the degree of freedom in arranging the sensor 2 and secures the sensitivity at the end of the video screen.
  • FIG. 10 is a diagram illustrating an example in which the gesture monitoring element is configured separately from the sensor 2.
  • the video projection device 1 is provided with a gesture monitoring element 17g separate from the sensor 2, and the gesture monitoring element 17g is arranged at a different position so as to detect only the video screen 10.
  • a region 2b indicates a range detected by the gesture monitoring element 17g, and only the hand movement in the region 2b can be detected.
  • FIG. 11 is a diagram showing an example in which the operation range by the gesture monitoring element is limited.
  • the separately configured gesture monitoring element 17g can not only separate the hand and the human body but also limit the operator's operation range by limiting the detection range 2b. For example, it can be used so that the operation range is limited to the center of the video screen 10.
  • the limit of the operation range can be changed according to the situation. When operation from any position on the video screen is allowed, the detection range 2b is expanded to the entire video screen 10; when operation should be possible only from the front of the video projector 1, the detection range 2b is limited to the center of the video screen. If the detection area is set to a predetermined fraction of the screen (for example, 1/3), the operable range is likewise limited to that fraction. Needless to say, the operation range can also be shifted to the left, the right, or the four corners of the video screen depending on the use environment.
  • the operation range may be switched from the menu screen of the image projection apparatus 1 or may be switched by a mechanical switch. Alternatively, the operation range may be switched by a gesture after enabling the operation range switching mode.
  • the operation range switching mode may be enabled by a menu or may be enabled by a specific gesture.
  • FIG. 12 is a diagram showing the setting of the detection range of the sensor 2 according to the application.
  • FIG. 12(a) shows a case where priority is given to suppressing erroneous detection, and FIG. 12(b) a case where priority is given to operability.
  • the sensor 2 is attached to the image projection apparatus with its central axis inclined with respect to the installation surface.
  • Let h be the height from the installation surface 11 to the center of the sensor 2, 2φ the opening angle of the detection range 2a of the sensor 2, θ the inclination of the central axis of the sensor 2 with respect to the installation surface, L the distance from the sensor 2 to the edge of the video screen 10, and H the total length of the hand 21.
  • the detection range of the gesture monitoring element 16g is determined by setting the detection range 2a according to the hand size H: the central axis of the sensor 2 is aimed at the height center H/2 of a hand operating at the video screen edge. The relations at this time are given by equations (2) and (3):
  • θ = tan⁻¹((H/2 − h) / L)  (2), subject to φ > tan⁻¹(h / L)  (3)
  • since the average length of a hand is H = 19 cm, the central axis of the sensor 2 may be set so as to pass through a height of 9.5 cm above the installation surface at the image screen edge.
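  • A worked numeric example of equations (2) and (3) follows (H = 19 cm is from the text; the sensor height h and screen-edge distance L are illustrative assumptions):

```python
import math

H, h, L = 0.19, 0.05, 0.40   # hand length, sensor height, edge distance (m)

theta = math.degrees(math.atan((H / 2 - h) / L))  # eq. (2): axis inclination
phi_min = math.degrees(math.atan(h / L))          # eq. (3): lower bound on phi

print(f"central-axis inclination theta = {theta:.1f} deg")        # about 6.4 deg
print(f"detection half-angle phi must exceed {phi_min:.1f} deg")  # about 7.1 deg
```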
  • the mounting position of the sensor 2 with respect to the image projection apparatus 1 will be described.
  • the mounting position of the sensor 2 should make it easy to secure space and should be little affected by the image light (projection light) projected from the image projection unit 4.
  • since the detection range of the sensor 2 is set on the video screen 10 on the installation surface 11, it is desirable to arrange the sensor 2 near the video screen.
  • in the image projection apparatus 1, the image projection unit 4 is disposed above the center of the apparatus to provide a distance from the installation surface, and the emission direction of the projection lens 4d is upward. It is therefore easy to secure space below the projection lens 4d, and the influence of direct light from the image projection unit 4 on the sensor 2 is small.
  • next, the detection range of the sensor 2 in the left-right direction (x direction) is considered.
  • a comfortable horizontal (x direction) hand gesture for a person spans 20 to 30 cm, so the detection range may be adjusted so that a hand sweep of 20 to 30 cm is detected at the edge of the video screen.
  • the detection range of each detection element may be set independently, and the field of view may be anisotropic rather than isotropically expanding: the horizontal direction (x direction) is set to expand to 20 to 30 cm at the end of the video screen, while the vertical direction (y direction) is preferably set to an expansion of about 10 cm, since a small vertical hand sweep is preferable for the operator.
  • in Example 2, in addition to the gesture monitoring element, the difference in movement speed between the hand and the human body is used to separate their movements and detect gestures with high accuracy.
  • the difference in operation speed between the hand and the human body is described first, followed by the speed criterion for separating their movements.
  • FIG. 13 is a diagram illustrating frequency components of signals obtained by hand movements and human body movements.
  • a solid line indicates a signal obtained by hand movement
  • a broken line indicates a signal obtained by the movement of the human body.
  • the peak of the hand movement is at 2.7 Hz, while the peak of the human body movement is at 1.2 Hz; the body movement was measured at a typical office walking speed.
  • this speed (frequency) difference is about a factor of 2.3, and by extracting it the movements of the hand and the human body can be separated with higher accuracy, as sketched below.
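  • One way to exploit this frequency difference is to classify a detection-element signal by its dominant spectral peak (a sketch; the 2 Hz decision boundary between the 1.2 Hz and 2.7 Hz peaks is an assumption):

```python
import numpy as np

def dominant_frequency(signal, fs):
    """Dominant frequency (Hz) of a sensor signal sampled at fs Hz."""
    spectrum = np.abs(np.fft.rfft(signal - np.mean(signal)))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    return freqs[np.argmax(spectrum)]

def classify_by_frequency(signal, fs, split_hz=2.0):
    """Hand motion peaks near 2.7 Hz, a walking body near 1.2 Hz (FIG. 13)."""
    return "hand" if dominant_frequency(signal, fs) > split_hz else "body"
```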
  • FIG. 14 is a diagram showing an example of detecting the movement of a human body, and shows an example of walking at a speed v in the x direction so as to cross the detection range 2a.
  • the movement in the x direction is detected using the detection elements CH4 and CH2 of the sensor 2 shown in FIG. 7.
  • FIG. 15 is a diagram showing a time change of the output signal of the sensor 2 by the human body.
  • tA indicates the time when the human body crosses position A within the detection range 2a, and tB the time when it crosses position B.
  • the speed difference described above is used to determine, from the speed of a moving body crossing the detection range 2a, whether the moving body represents a gesture.
  • a threshold T0 for the travel time from A to B is set.
  • the time tA at which the detection signal of the sensor 2 (CH4) peaks as the moving body crosses position A of the detection range, and the time tB at which the detection signal of the sensor 2 (CH2) peaks as it crosses position B, give a time difference (tB − tA) that is compared with the threshold T0. If (tB − tA) ≤ T0, the moving body is determined (detected) to be a hand gesture; if (tB − tA) > T0, it is determined not to be a gesture (it is not detected as one). In the inventors' experiments, the speed at which a person walks is 1 to 1.5 m/s and the speed of a hand sweep is 2 to 4 m/s, so the threshold T0 may be set based on these values. By measuring the passing time of the moving body and comparing it with the threshold T0 in this way, this example eliminates false detections caused by movements of bodies other than the operator's hand and further improves gesture detection accuracy.
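  • The transit-time test can be sketched as follows (the A-to-B spacing is an assumed parameter; the speeds follow the text, body 1 to 1.5 m/s and hand 2 to 4 m/s):

```python
def is_hand_gesture(tA, tB, dist_AB=0.3, v_body_max=1.5):
    """Accept a crossing of the detection range as a hand gesture only if
    it is faster than the fastest expected walking body.

    tA, tB  : peak times (s) of CH4 and CH2 as the body crosses A and B
    dist_AB : assumed distance (m) between crossing points A and B
    """
    T0 = dist_AB / v_body_max    # transit time of the fastest walking body
    return (tB - tA) <= T0       # hands (2-4 m/s) cross faster than this
```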
  • Example 3 describes a detection element 16 used for gesture detection.
  • as the detection element 16 of the sensor 2, an element that detects infrared radiation from a hand (human body), such as a thermopile, can be used.
  • when using a thermopile, however, signal detection may be impaired by the ambient temperature. To eliminate this influence, it is desirable to use a circuit configuration that detects the time change of the signal (hereinafter referred to as a change detection circuit).
  • FIG. 16 is a diagram showing an example of a detection circuit configuration of the sensor 2 using a thermopile.
  • An output signal from the sensor 2 (detection element 16) is detected by the signal detection unit 3a and sent to the gesture determination unit 3b.
  • the signal detection unit 3a includes an amplifier 18 that amplifies the signal, an A/D conversion unit 19 that performs A/D conversion, and electronic components such as resistors and capacitors.
  • the A/D conversion unit 19 may be an IC having an A/D conversion function, or a microcomputer with an A/D function may be used so that the processing is performed together with the gesture determination unit 3b in the subsequent stage.
  • thermopile used for the sensor 2 outputs a signal substantially proportional to the temperature difference between the temperature of the object (for example, a human hand) and the thermopile itself.
  • the output signal of the thermopile is amplified by the amplifier 18. Further, the positive terminal of the thermopile is input to the positive terminal of the amplifier 18, and the reference voltage Vref is input to the negative terminal. Then, the amplifier 18 amplifies the detection signal from the thermopile input to the positive terminal with reference to the reference voltage Vref input to the negative terminal, and outputs the voltage Vobj.
  • FIG. 17 is a diagram showing the output voltage of the amplifier 18 with respect to the object temperature.
  • the object temperature is Tobj
  • the thermopile temperature is Ts
  • the reference voltage is Vref
  • the output voltage of the amplifier 18 is Vobj
  • when Tobj > Ts, Vobj > Vref; when Tobj < Ts, Vobj < Vref.
  • the output voltage Vobj of the amplifier 18 is thus determined by the relationship between the object temperature Tobj and the thermopile temperature Ts, and is expressed by equation (4):
  • Vobj = α(Tobj − Ts) + Vref  (4)
  • where α is a coefficient determined by the sensitivity of the thermopile and the gain of the amplifier 18.
  • according to equation (4), when the difference between the thermopile temperature Ts and the object temperature Tobj becomes large, the output voltage Vobj saturates and sticks at Vcc or GND. In the image projection apparatus 1, the heat generated by the light source unit 4b raises the thermopile temperature Ts considerably, making it difficult to detect the object. To avoid this, a change detection circuit is used to eliminate the influence of the temperature rise of the thermopile.
  • FIG. 18 is a diagram illustrating a configuration of a change detection circuit of the sensor 2.
  • the change detection circuit has a configuration in which a capacitor C is inserted in series between the positive terminal of the sensor 2 and the positive terminal of the amplifier 18, and a resistor R is inserted between the positive terminal and the negative terminal of the amplifier 18.
  • with this CR time-constant circuit, the output reacts to short-term temperature changes but is insensitive to the slow rise of the thermopile temperature Ts.
  • This time constant is desirably 3 sec or less, for example, in accordance with the operation time of the gesture.
  • when the object temperature changes from Tobj1 to Tobj2, the output voltage Vobj at this time is expressed by equation (5):
  • Vobj = α(Tobj2 − Tobj1)  (5)
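  • The change detection circuit behaves as a first-order high-pass filter, which can be modeled as follows (a sketch under assumed values; only the time constant tau, taken here at the 3 s bound mentioned above, governs the behavior):

```python
import numpy as np

def change_detection_output(t_obj, fs, tau=3.0, alpha=1.0, vref=1.65):
    """Model of the CR change detection circuit of FIG. 18.

    t_obj : object-temperature samples (deg C), e.g. a hand entering the view
    fs    : sampling rate (Hz)
    The output responds to temperature *changes* as in eq. (5) and is
    insensitive to the slow rise of the thermopile temperature Ts.
    """
    dt = 1.0 / fs
    a = tau / (tau + dt)              # discrete high-pass coefficient
    v = np.empty(len(t_obj))
    v[0] = vref
    for i in range(1, len(t_obj)):
        v[i] = vref + a * ((v[i - 1] - vref) + alpha * (t_obj[i] - t_obj[i - 1]))
    return v
```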
  • so far, the gesture directions described have mainly been the x and y directions.
  • Example 4 describes the detection of a gesture in the z direction.
  • FIG. 19 is a diagram showing a gesture in the z direction, and the operation in the z direction is performed by covering the image projection apparatus 1 with the hand 21.
  • FIG. 20 is a diagram showing the relationship between the hand and the detection range during the z-direction gesture.
  • the detection range 2a of each detection element CH1 to CH4 of the sensor 2 is assumed to be included in the range of the hand 21.
  • a z-direction gesture is determined from the detection signals of the detection elements CH1 to CH4, based on whether these signals are synchronized.
  • FIG. 21 is a diagram showing a time change of the sensor signal of each CH during the z-direction operation.
  • the change detection circuit (FIG. 18) of the third embodiment is used for this detection circuit, and the sensor signal on the vertical axis is the output level of the amplifier 18.
  • as the hand approaches, the sensor signal of each CH rises and peaks when the detection range is completely covered. If the covering state continues, the sensor signal decays after a certain holding time, according to the time constant of the capacitor C and resistor R of the change detection circuit.
  • if the gesture is a true z-direction gesture, the rise times and holding times of the signals detected by the CHs should be synchronized. Whether the gesture is in the z direction is therefore determined by evaluating the spread of the rise times (tmin to tmax) of the CHs and the holding time thold during which the signal level is at or above the threshold Vth.
  • FIG. 22 is a diagram showing a flowchart for detecting a z-direction gesture.
  • the determination in the z direction has three stages: threshold determination, rise time difference determination, and signal level maintenance period determination. This will be described below with reference to a flowchart.
  • first, threshold determination is performed (S101).
  • a signal of each CH, obtained by A/D-converting the output signal of the sensor 2 in the signal detection unit 3a, is used.
  • a signal change from the offset level is detected, and when the signal level exceeds the threshold value Vth, the next rise time is detected. If the signal level does not exceed the threshold, the threshold determination is repeated.
  • next, the rise time is detected (S103).
  • the time exceeding the threshold value Vth is detected for each CH.
  • for a z-direction gesture, the signals of all CHs rise almost simultaneously, whereas for movement in another direction a difference appears between the rise times of the CHs; the rise times are therefore detected to distinguish a z-direction gesture from movement in other directions.
  • then the rise time difference is determined. Among the rise times obtained in S103, the minimum value tmin and the maximum value tmax are found. If the rise time difference (tmax − tmin) is less than a predetermined time, the signal level maintenance period determination (S105) follows; if (tmax − tmin) is equal to or greater than the predetermined time, the process returns to the threshold determination (S101).
  • the value of the predetermined time depends on the movement speed of the hand, but is preferably about 50 to 100 msec in consideration of operability.
  • finally, the signal level maintenance period is determined (S105).
  • a period thold during which the signal levels of all the detection elements CH1 to CH4 exceed the threshold Vth is detected. If this period is equal to or longer than a predetermined time, the movement is regarded as a z-direction gesture.
  • the predetermined time is desirably about 600 msec as a time that does not cause discomfort to humans. If the time is longer than that, the time for covering the sensor 2 with the hand becomes longer, so that the operator feels stress.
  • a control signal corresponding to the operation is output.
  • according to the present embodiment, erroneous detections not intended by the operator can be suppressed in z-direction gesture detection. For example, in an operation of pointing a finger at the image, the detection range 2a is unlikely to be completely covered unless the finger is very close to the sensor 2, so the z-direction operation and the finger-pointing operation can be separated. A z-direction gesture may also be recognized by repeating the covering operation. A sketch of the three-stage determination follows.
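  • The three-stage flow of FIG. 22 can be sketched as follows (the threshold vth and the array layout are assumptions; the 100 ms rise-time spread and 600 ms hold time follow the values given above):

```python
import numpy as np

def detect_z_gesture(t, channels, vth, max_rise_spread=0.1, min_hold=0.6):
    """t: uniformly spaced sample times (s); channels: CH1..CH4 signal arrays."""
    rise_times = []
    for s in channels:
        above = np.nonzero(s > vth)[0]
        if above.size == 0:
            return False                  # S101: threshold never exceeded
        rise_times.append(t[above[0]])    # S103: rise time of this CH
    if max(rise_times) - min(rise_times) >= max_rise_spread:
        return False                      # rise time difference too large
    covered = np.all([s > vth for s in channels], axis=0)
    hold = np.count_nonzero(covered) * (t[1] - t[0])
    return hold >= min_hold               # S105: level maintained long enough
```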
  • in Example 5, a method for detecting a failure state of a detection element 16 in the sensor 2 is described.
  • the change detection circuit (FIG. 18) of Example 3 is used, and the failure state is determined by comparing the signal levels of the plurality of detection elements 16.
  • FIG. 23 is a diagram showing temporal changes in the output signals of a normal detection element and a detection element in a failure state.
  • CH1 is shown as a failure detection element (failure element)
  • CH2 is shown as a normal detection element (normal element).
  • when a hand gesture is made, the normal element CH2 outputs a signal (maximum value CH2MAX), whereas the faulty element CH1 outputs only a small or zero signal (maximum value CH1MAX). Failure determination is therefore possible by comparing the signal levels of the CHs.
  • FIG. 24 is a diagram showing a flowchart for determining a failure of the detection element.
  • failure determination using four detection elements will be described.
  • the failure determination is made using gestures performed in front of the sensor 2 and consists of five stages: threshold determination for each detection element, count clear determination, error exclusion determination, error determination, and failure determination. These are described along the flowchart.
  • first, the maximum value of each CH over the detection period is detected; the maximum values of the detection elements are hereinafter denoted CH1MAX, CH2MAX, CH3MAX, and CH4MAX.
  • a count clear determination is performed.
  • the maximum value of each CH detected in S203 is compared with the threshold Eth. If it is greater than or equal to the threshold, the error count for that CH is cleared. If it is less than the threshold, the error count is not cleared and the process proceeds to maximum value detection (S206).
  • the threshold Eth may be set to be equal to or higher than the noise level, or may be set according to the magnitude of the signal level. Any level can be used as long as it is detected that a hand gesture has been input.
  • the error detection period may be determined according to the urgency of detection together with the error count value at the time of failure determination. If error detection is required immediately in time, the detection period and the error count number may be set small.
  • next, maximum value detection is performed: the maximum value CHMAX and the minimum value CHMIN among the per-element maxima (CH1MAX, CH2MAX, CH3MAX, CH4MAX) are obtained.
  • error exclusion determination is performed.
  • the error exclusion determination is introduced to distinguish whether the detection element is in a failure state or whether the detection element is normal but no signal is detected.
  • the error exclusion determination pattern will be described with reference to FIGS.
  • FIGS. 25A and 25B show hand movement patterns: FIG. 25A shows combinations of hand movements in the y direction, and FIG. 25B combinations of hand movements in the x direction.
  • FIG. 26 is a diagram showing patterns to be excluded from error determination.
  • the cases (1), (3), (4), and (6) of FIG. 25 are excluded, and FIG. 26 shows the combinations of the minimum value CHMIN and the maximum value CHMAX in those cases. That is, the combinations of CHMAX and CHMIN shown in FIG. 26 are excluded from the error determination.
  • error determination is performed.
  • the maximum value of each detection element (CH1MAX, CH2MAX, CH3MAX, CH4MAX) is compared with CHMAX, and an error is determined when equation (6) is satisfied:
  • (maximum value of each detection element) < CHMAX / (constant coefficient)  (6)
  • the error count value is incremented.
  • failure determination is performed.
  • when the count value of the error determination in S208 exceeds a certain value, the detection element is determined to be in a failure state.
  • the count value may be set arbitrarily together with the detection period when detecting the maximum value of each detection element.
  • in this way, the failure state of a detection element 16 in the sensor 2 can be detected with high accuracy, as the sketch below illustrates.
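  • One pass of the error-counting logic can be sketched as follows (eth, the constant coefficient, and the failure count are assumed values; the error-exclusion patterns of FIG. 26 are omitted for brevity):

```python
def update_failure_state(ch_max, error_counts, eth=0.1, coeff=4.0, fail_count=10):
    """ch_max: dict CH name -> maximum signal over the detection period;
    error_counts: dict CH name -> running error count (updated in place).
    Returns the list of CHs currently judged to be in a failure state."""
    chmax = max(ch_max.values())
    failed = []
    for ch, m in ch_max.items():
        if m >= eth:
            error_counts[ch] = 0          # count clear: the CH saw a gesture
        elif m < chmax / coeff:           # eq. (6): far below the best CH
            error_counts[ch] += 1         # error determination
        if error_counts[ch] >= fail_count:
            failed.append(ch)             # failure determination
    return failed
```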
  • Example 6 describes an example of assigning video screen operation functions to gesture types.
  • the main video screen operation functions are PageUp/PageDown (move to the next or previous page when displaying PowerPoint, PDF, and similar documents), PowerOn/Off (power ON/OFF), FrameLock (freeze the display screen), ENTER (selection/determination), Blank (display ON/OFF), and Zoom (enlarged display). Gesture ON/OFF (enabling/disabling gesture operation) is added to these functions, and each screen operation function is realized by a gesture.
  • FIG. 27 is a diagram showing gesture types, in which (a) shows a gesture in the z direction, (b) shows a gesture in the y direction, and (c) shows a gesture in the x direction.
  • the z-direction gesture of (a) is a pushing operation, which is large compared with operations in the other directions; it is therefore intuitive to assign to it functions that greatly change the display state of the projector, such as PowerOn/Off, Blank, and FrameLock.
  • the y-direction gesture works like pressing a switch on the screen; it is therefore suited to input or function switching, and functions such as Zoom, ENTER, and Gesture ON/OFF are desirably assigned to it.
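  • The assignments discussed in this example can be summarized as follows (the x-direction row is an assumption, since the description of the x-direction gesture is not included above; the z and y rows follow the text):

```python
# Illustrative gesture-to-function assignment (Example 6).
FUNCTIONS_FOR_GESTURE = {
    "z": ["PowerOn/Off", "Blank", "FrameLock"],  # large, display-changing ops
    "y": ["Zoom", "ENTER", "Gesture ON/OFF"],    # switch-like input ops
    "x": ["PageUp/PageDown"],                    # assumed: page navigation
}
```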

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An image projection device (1) comprising: an operation detection unit (3) that detects an operator's gesture by means of a sensor (2) and generates an operation signal; and an image projection unit (4) that generates image light based on an image signal and projects an image onto a projection surface. When the projection surface is the installation surface (11) on which the image projection device (1) is installed, the sensor (2) of the operation detection unit (3) comprises a plurality of detection elements (16) that detect the operator's gesture. The detection elements have mutually different detection ranges, and at least one of them is a gesture monitoring element (16g) whose detection range covers the projected video screen at or above a predetermined proportion.
PCT/JP2014/061434 2014-04-23 2014-04-23 Image projection device WO2015162722A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2014/061434 WO2015162722A1 (fr) 2014-04-23 2014-04-23 Image projection device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2014/061434 WO2015162722A1 (fr) 2014-04-23 2014-04-23 Image projection device

Publications (1)

Publication Number Publication Date
WO2015162722A1 (fr) 2015-10-29

Family

ID=54331911

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2014/061434 WO2015162722A1 (fr) 2014-04-23 2014-04-23 Image projection device

Country Status (1)

Country Link
WO (1) WO2015162722A1 (fr)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6281878B1 (en) * 1994-11-01 2001-08-28 Stephen V. R. Montellese Apparatus and method for inputing data
JP2002526867A * 1998-10-07 2002-08-20 Intel Corporation Data input method
JP2005047412A * 2003-07-30 2005-02-24 Nissan Motor Co Ltd Non-contact information input device
JP2005141542A * 2003-11-07 2005-06-02 Hitachi Ltd Non-contact input interface device
US8382295B1 * 2010-06-30 2013-02-26 Amazon Technologies, Inc. Optical assembly for electronic devices
JP2013148802A * 2012-01-23 2013-08-01 Nikon Corp Projector
US20130289820A1 * 2012-04-27 2013-10-31 Innova Electronics, Inc. Automotive Diagnostic Tool with Virtual Display and Input


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14890289

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

NENP Non-entry into the national phase

Ref country code: JP

122 Ep: pct application non-entry in european phase

Ref document number: 14890289

Country of ref document: EP

Kind code of ref document: A1