WO2015008915A1 - Rear projection type display apparatus capable of sensing touch input and gesture input - Google Patents

Rear projection type display apparatus capable of sensing touch input and gesture input

Info

Publication number
WO2015008915A1
WO2015008915A1 (PCT/KR2014/001668)
Authority
WO
WIPO (PCT)
Prior art keywords
light
screen
output
frame section
during
Application number
PCT/KR2014/001668
Other languages
French (fr)
Inventor
Jae Kwang Lee
Yong Ho Cho
Pil Won Jeong
Original Assignee
Lg Electronics Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by Lg Electronics Inc. filed Critical Lg Electronics Inc.
Publication of WO2015008915A1 publication Critical patent/WO2015008915A1/en


Classifications

    • G03B21/10 Projectors with built-in or built-on screen
    • G03B21/62 Translucent screens
    • G03B21/2033 LED or laser light sources
    • G03B33/04 Colour photography by four or more separation records
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F3/0416 Control or interface arrangements specially adapted for digitisers
    • G06F3/04166 Details of scanning methods, e.g. sampling time, grouping of sub areas or time sharing with display driving
    • G06F3/0425 Digitisers characterised by opto-electronic transducing means, using a single imaging device (e.g. a video camera) for tracking object positions with respect to an imaged reference surface such as a projection screen
    • G09G3/001 Control arrangements or circuits using specific devices, e.g. projection systems
    • G09G2320/0686 Adjustment of display parameters with two or more screen areas displaying information with different brightness or colours
    • G09G2360/14 Detecting light within display terminals, e.g. using a single or a plurality of photosensors
    • G09G2380/10 Automotive applications
    • H04N9/3129 Projection devices for colour picture display, scanning a light beam on the display screen
    • H04N9/3194 Testing thereof including sensor feedback

Definitions

  • the present invention relates to a display apparatus, and more particularly to a rear projection type display apparatus capable of sensing a touch input and a gesture input.
  • a display apparatus is an apparatus that displays an image.
  • a projector is an apparatus that projects an image onto a screen.
  • sensing of a gesture input is suggested in the case of a front projection type and sensing of a touch input is suggested in the case of a rear projection type.
  • a display apparatus including a screen having a first transparency during a first frame section and a second transparency during a second frame section, a light source unit configured to output an output light for detection of a distance to an external object, a light detector configured to detect a received light from the external object, and a processor configured to process a touch input based on the output light and the received light during the first frame section and to process a gesture input during the second frame section.
  • a display apparatus including a screen having a first transparency in a first region thereof and a second transparency in a second region thereof, a light source unit configured to output an output light for detection of a distance to an external object, a light detector configured to detect a received light from the external object, and a processor configured to process a touch input on the first region of the screen and a gesture input at the front of the second region of the screen based on the output light and the received light.
  • a display apparatus in accordance with an embodiment of the present invention senses a touch input in a state in which a screen has a first transparency during a first frame section, and senses a gesture input in a state in which the screen has a second transparency during a second frame section.
  • sensing of a touch input and a gesture input may be accomplished based on the output light and a received light corresponding to the output light.
  • the display apparatus when outputting a projection image, may simultaneously output a projection image and an output light. Thereby, sensing of a touch input or gesture input may be accomplished simultaneously with projection of an image.
  • a display apparatus in accordance with another embodiment of the present invention may further include a camera to capture an image of an external object in a state in which a screen has a second transparency during a second frame section.
  • the captured image may be displayed, as a projection image, on the screen.
  • FIG. 1 illustrates a conceptual view of a display apparatus in accordance with one embodiment of the present invention
  • FIG. 2 is one example of a block diagram showing a simplified internal configuration of the display apparatus of FIG. 1;
  • FIG. 3 illustrates simultaneous output of a projection image and a light for distance detection from a light output module of FIG. 1;
  • FIGS. 4A and 4B are reference views for explanation of sensing of a touch input and a gesture input by the light output module of FIG. 1;
  • FIGS. 5A and 5B are reference views for explanation of a distance detection method of the light output module of FIGS. 4A and 4B;
  • FIGS. 6A to 6E are views for explanation of a screen of FIGS. 4A and 4B;
  • FIG. 7 illustrates an implementation example of the display apparatus of FIG. 1
  • FIG. 8 is another example of a block diagram showing a simplified internal configuration of the display apparatus of FIG. 1;
  • FIG. 9 is a reference view for explanation of sensing of a gesture input by the light output module of FIG. 8;
  • FIG. 10 illustrates projection of an image from the light output module of FIG. 8
  • FIG. 11 is a flowchart showing an operating method of the display apparatus in accordance with one embodiment of the present invention.
  • FIGS. 12 and 13 illustrate various examples of arrangement of a first frame section and a second frame section
  • FIGS. 14A and 14B are views for explanation of operation based on a touch input and a gesture input on a screen
  • FIG. 15 is a view for explanation of operation of a light source unit of FIGS. 4A and 4B;
  • FIGS. 16 and 17 are views illustrating various examples of a touch input and a gesture input.
  • the terms “module” and “unit” are given only in consideration of ease in the preparation of the specification, and do not carry specially important meanings or roles. Thus, “module” and “unit” may be used interchangeably.
  • a display apparatus described herein is designed to display projection images on two screens via scanning. More particularly, the display apparatus is capable of recognizing a distance to an external object or motion of the external object while displaying projection images on two screens via scanning.
  • the display apparatus may include a light output module to output an output light for detection of a distance to an external object or motion of the external object.
  • the light output module may output a projection image to display an image on a screen, in addition to the output light.
  • the display apparatus is capable of receiving a light scattered or reflected by an external object and of detecting, e.g., a distance to the external object based on a difference between the output light and the received light.
  • the display apparatus may sense a touch input in a state in which a screen has a first transparency during a first frame section, and may sense a gesture input in a state in which the screen has a second transparency during a second frame section.
  • the above described display apparatus may be included in home appliances, such as a TV, media player, game console, air conditioner, refrigerator, washing machine, cooking appliance, robot cleaner, etc., and may also be included in a vehicle, such as a car, etc.
  • FIG. 1 illustrates a conceptual view of a display apparatus in accordance with one embodiment of the present invention.
  • the display apparatus 10 may include a light output module 100 and screens 200a and 200b.
  • the light output module 100 may output an output light for detection of a distance to an external object via first direction scanning and second direction scanning, may receive a light corresponding to the output light, and may detect a distance to an external object or motion of the external object based on a difference between the output light and the received light.
  • the light output module 100 may output a projection image based on a visible light via first direction scanning and second direction scanning.
  • the light output module 100 may include a 2-dimensional (2D) scanner to simultaneously output the projection image and the output light via scanning.
  • this image projection may be referred to as rear projection.
  • the light output module 100 may receive a light scattered or reflected by the user finger 20, and sense the touch input based on the output light and the received light.
  • the light output module 100 may receive a light scattered or reflected by the user hand 60, and sense the gesture input based on the received light as well as the output light.
  • the display apparatus 10 in accordance with the embodiment of the present invention may sense a touch input during a first frame section by providing the screen 200 with a first transparency, and sense a gesture input during a second frame section by providing the screen 200 with a second transparency less than the first transparency.
  • the display apparatus 10 thus enables detection of both a touch input and a gesture input.
  • FIG. 2 is a block diagram showing a simplified internal configuration of the light output module of FIG. 1.
  • the light output module 100 serves to output a projection image and an output light in a Time of Flight (TOF) manner.
  • the light output module 100 may include a memory 120, a scanner 140, a processor 170, a communication module 180, a drive unit 185, a power supply unit 190, a light source unit 210, and a light detector 280, for example.
  • the memory 120 may store programs for processing and control of the processor 170, and may function to temporarily store input or output data (e.g., still images and videos).
  • the communication module 180 serves as an interface between the light output module 100 and all external devices connected to the light output module 100 in a wired or wireless manner.
  • the communication module 180 may receive data or power from the external devices to transmit the same to internal components of the light output module 100, and may transmit internal data of the light output module 100 to the external devices.
  • the communication module 180 may receive a radio signal from a proximate mobile terminal (not shown).
  • the radio signal may include a voice call signal, a video telephony call signal, or various types of data, such as text data, image data, etc.
  • the communication module 180 may include a local area communication module (not shown).
  • Local area communication technologies may include Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, Near Field Communication (NFC), and the like.
  • the scanner 140 may output an input light to an external area by sequentially and repeatedly implementing first direction scanning and second direction scanning.
  • the light, input to the scanner 140 may include a visible light corresponding to a projection image and an output light for detection of a distance to an external object.
  • the output light may be an infrared light.
  • the scanner 140 may sequentially and repeatedly implement scanning from the left to the right and scanning from the right to the left with respect to an external scan area, and more particularly may implement this scanning with respect to the entire external scan area on a per frame basis. Through the scanning as described above, the scanner 140 may output the visible light and the output light to the external scan area.
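The bidirectional raster described above (left-to-right, then right-to-left, repeated over the whole scan area each frame) can be sketched as follows; the grid size and discrete pixel ordering are illustrative assumptions, not values from the specification:

```python
def raster_scan_points(width, height):
    """Yield (x, y) scan coordinates for one frame, alternating
    left-to-right and right-to-left on successive lines, as in the
    bidirectional first/second direction scanning of scanner 140."""
    for y in range(height):
        # even lines scan left to right, odd lines right to left
        xs = range(width) if y % 2 == 0 else range(width - 1, -1, -1)
        for x in xs:
            yield (x, y)

# a tiny 3x2 scan area: first line forward, second line reversed
points = list(raster_scan_points(3, 2))
```

Over a full frame this visits every point of the scan area exactly once, which is what lets one frame produce both a complete projected image and a complete distance map.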
  • the processor 170 may implement general control operation of the light output module 100. More specifically, the processor 170 may control operation of the respective internal units of the light output module 100.
  • the processor 170 may control output of a projection image, such as a video image stored in the memory 120 or a video image transmitted from an external device through the communication module 180, to an external scan area.
  • the processor 170 may control the drive unit 185 that controls the light source unit 210 to output red (R), green (G), and blue (B) visible lights. More specifically, the processor 170 may output R, G and B signals, corresponding to a video image to be displayed, to the drive unit 185.
  • the processor 170 may transmit an electrical signal, corresponding to an output light, to the drive unit 185, for detection of a distance to an external object.
  • the processor 170 may control operation of the scanner 140. More specifically, the processor 170 may control sequential and repetitive implementation of first direction scanning and second direction scanning to output a projection image and an output light to an external area.
  • the processor 170 may vary a frame rate to vary a scanning speed of the scanner 140.
  • the processor 170 may implement detection of a distance to an external object based on an electrical signal, corresponding to an output light to be transmitted to the drive unit 185, and an electrical signal corresponding to a received light received by the light detector 280.
  • the processor 170 may detect a distance to an external scan area using a phase difference between an electrical signal corresponding to an output light and an electric signal corresponding to a received light.
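The phase-difference measurement can be converted into a distance with the standard continuous-wave time-of-flight relation; the sinusoidal modulation model and the 20 MHz example frequency are assumptions for illustration, since the specification does not state a modulation scheme:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def tof_distance(phase_diff_rad, mod_freq_hz):
    """Distance from the phase shift between the emitted (Tx) and
    received (Rx) modulation signals: d = c * dphi / (4 * pi * f).
    The 4*pi (rather than 2*pi) accounts for the round trip of the
    output light to the object and back to the light detector 280."""
    return C * phase_diff_rad / (4.0 * math.pi * mod_freq_hz)

# e.g. a pi/2 phase shift at an assumed 20 MHz modulation frequency
d = tof_distance(math.pi / 2, 20e6)  # roughly 1.87 m
```

Doing this per scan point, frame by frame, yields the per-frame distance map from which gesture motion is then derived.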
  • the processor 170 may detect gesture motion of a user based on distance information regarding an external scan area detected on a per frame basis.
  • the processor 170 may process a touch input during a first frame section by providing the screen 200 with a first transparency, and process a gesture input during a second frame section by providing the screen 200 with a second transparency less than the first transparency.
  • the processor 170 may control a screen controller 205 to control the screen 200 located outside the light output module 100.
  • the screen 200 may have a first transparency during a first frame section, and a second transparency less than the first transparency during a second frame section. More specifically, the screen 200 is changed to be opaque during the first frame section such that a projection image is displayed on the screen 200, and is changed to be transparent during the second frame section such that an output light is transmitted through the screen 200 to enable detection of a user who is located at the front of the screen 200.
  • the above described transparency adjustment of the screen on a per frame section basis may be implemented by the screen controller 205.
  • the screen controller 205 may vary an arrangement direction of the anisotropic body based on power applied to the anisotropic body between the first transparent electrode and the second transparent electrode, thereby controlling the screen 200 to achieve a first transparency during a first frame section and a second transparency during a second frame section.
  • a frame synchronization signal may be transmitted from the processor 170 to the screen controller 205.
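As a rough sketch of this frame-section control, the loop below maps each frame to its section and routes it to touch or gesture processing; `set_transparency`, `process_touch`, and `process_gesture` are hypothetical stand-ins for the roles of the screen controller 205 and processor 170, not names from the specification:

```python
FIRST, SECOND = "first", "second"  # frame-section labels

def frame_section(frame_index, pattern=(FIRST, SECOND)):
    """Map a frame index to its section. The default pattern simply
    alternates the two sections; other arrangements (as the text
    notes) are possible, e.g. (FIRST, FIRST, SECOND)."""
    return pattern[frame_index % len(pattern)]

def handle_frame(frame_index, screen, sensor):
    """Synchronize screen transparency with input processing."""
    if frame_section(frame_index) == FIRST:
        screen.set_transparency("opaque")       # first transparency
        return sensor.process_touch()           # touch on the screen
    else:
        screen.set_transparency("transparent")  # second transparency
        return sensor.process_gesture()         # gesture beyond it
```

The frame synchronization signal mentioned above is what would keep `frame_index` consistent between the processor and the screen controller.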
  • the light source unit 210 may include a blue light source to output a blue light, a green light source to output a green light, and a red light source to output a red light.
  • each light source may take the form of a laser diode or a Light Emitting Diode (LED).
  • the light source unit 210 may include an infrared light source to output an infrared light.
  • the light detector 280 may detect a received light from an external object, the received light corresponding to an output light, and convert the detected received light into an electrical signal.
  • the light detector 280 may include a photodiode to convert an optical signal into an electrical signal.
  • the light detector 280 may include a photodiode having high photoelectric conversion efficiency, for example, an avalanche photodiode to convert a light, scattered by an external object, into an electrical signal.
  • the light detector 280 may include a Charge Coupled Device (CCD) or a Complementary Metal Oxide Semiconductor (CMOS) sensor, in order to receive and detect an infrared light when an output light is the infrared light.
  • a sampler (not shown) to convert an analog signal into a digital signal may be additionally provided between the light detector 280 and the processor 170.
  • the drive unit 185 may control output of red, green, and blue lights from the red light source, the green light source, and the blue light source of the light source unit 210 in response to R, G, and B signals transmitted from the processor 170.
  • the drive unit 185 may control output of an infrared light from the infrared light source of the light source unit 210 in response to an electrical signal, corresponding to an output light, transmitted from the processor 170.
  • the power supply unit 190 may supply power required for operation of the respective components upon receiving external power or internal power under control of the processor 170.
  • FIG. 3 illustrates simultaneous output of a projection image and an output light for distance detection from the light output module of FIG. 1.
  • the scanner 140 included in the light output module 100 in accordance with the embodiment of the present invention may output an input light to an external scan area by sequentially and repeatedly implementing first direction scanning and second direction scanning.
  • the drawing illustrates output of a projection image 202a (RGB) and an output light IR to the screen 200a.
  • the scanner 140 included in the light output module 100 in accordance with the embodiment of the present invention may simultaneously output an input light, i.e. a visible light RGB and an infrared output light IR.
  • the scanner 140 may sequentially and repeatedly implement scanning from the left to the right and scanning from the right to the left with respect to an external scan area, and more particularly may implement this scanning with respect to the entire scan area on a per frame basis.
  • the light output module 100 in accordance with the embodiment of the present invention may detect a distance to an external object while projecting an image to an external area. Therefore, the light output module 100 enables display of an image related to the distance to the object or motion of the object, or display of an image corresponding to a user gesture.
  • FIGS. 4A and 4B are reference views for explanation of sensing of a touch input and a gesture input by the light output module of FIG. 1.
  • a light output module 100a of FIGS. 4A and 4B may include the light source unit 210, collimator units 212 and 218, a photosynthesis unit 220, a light reflector 256, the scanner 140, the processor 170, the drive unit 185, an infrared light transmission filter 282, and the light detector 280.
  • the light source unit 210 may include a plurality of light sources. More specifically, the light source unit 210 may include a red light source 210R, a green light source 210G, a blue light source 210B, and an output light source 210IR to output an infrared output light. Among these light sources, the light sources 210R, 210G and 210B may include laser diodes.
  • the respective light sources 210R, 210G, 210B and 210IR may be driven by respective electrical signals from the drive unit 185, and the electrical signals of the drive unit 185 may be generated under control of the processor 170. Meanwhile, the output light source 210IR may output an output light in response to an electrical signal corresponding to the output light.
  • Lights, output from the respective light sources 210R, 210G, 210B and 210IR, are collimated via respective collimator lenses included in the collimator unit 212.
  • the photosynthesis unit 220 synthesizes lights output from the respective light sources 210R, 210G, 210B and 210IR, and outputs the synthesized light in a given direction.
  • the photosynthesis unit 220 may include four 2D Micro Electro Mechanical System (MEMS) mirrors 220a, 220b, 220c and 220d.
  • a first photo synthesizer 220a, a second photo synthesizer 220b, a third photo synthesizer 220c, and a fourth photo synthesizer 220d respectively output a red light from the red light source 210R, a green light from the green light source 210G, a blue light from the blue light source 210B, and an output light from the output light source 210IR toward the scanner 140.
  • the light reflector 256 reflects the red light, the green light, the blue light, and the output light, having passed through the photosynthesis unit 220, toward the scanner 140.
  • the light reflector 256 must reflect lights of various wavelengths, and to this end, may take the form of a Total Mirror (TM).
  • the scanner 140 may output a visible light RGB and an output light IR, received from the light source unit 210, by sequentially and repeatedly implementing first direction scanning and second direction scanning with respect to an external area. This scanning is repeatedly implemented with respect to the entire external scan area.
  • the visible light RGB and the output light IR, output from the scanner 140 may be output to the screen 200.
  • a projection image corresponding to the visible light RGB may be displayed on the screen 200.
  • this image projection may be referred to as rear projection.
  • the light output module 100 may sense a touch input, acquired when the user finger 20 touches the screen 200, during a first frame section, and sense a gesture input, acquired by motion of the user hand 60 located opposite to the screen 200, during a second frame section.
  • the screen 200 may be opaque during the first frame section, and may be transparent during the second frame section.
  • FIG. 4A illustrates that the screen 200 is opaque during the first frame section.
  • the light output module 100 may simultaneously output a projection image based on a visible light RGB and an output light IR.
  • the output light IR may be scattered or reflected by the finger 20 to thereby be introduced into the light output module 100. More specifically, a received light may be input to the light detector 280 by way of the collimator unit 218 and the infrared light transmission filter 282.
  • the light detector 280 may convert the received light into an electrical signal, and the processor 170 may implement distance detection and processing of the touch input based on the output light and the received light acquired by the finger 20.
  • FIG. 4B illustrates that the screen 200 is transparent during the second frame section.
  • the light output module 100 may output an output light IR.
  • the light output module 100 may simultaneously output a projection image based on a visible light RGB and an output light IR.
  • the output light IR may be scattered or reflected by the user hand 60 to thereby be introduced into the light output module 100.
  • a received light may be input to the light detector 280 by way of the collimator unit 218 and the infrared light transmission filter 282.
  • the light detector 280 may convert the received light into an electrical signal, and the processor 170 may implement distance detection and processing of the gesture input based on the output light and the received light acquired by the user hand 60.
  • the first frame section and the second frame section may be alternately arranged, but are not limited thereto, and various other arrangements are possible.
  • since the scanner 140 outputs the visible light RGB via scanning, the screen 200, which displays a projection image, may have a freeform curved surface.
  • the projection image may be displayed on the curved surface of the screen.
  • the curvature of the screen may be recognized via distance detection using the output light, scaling of a display image may be implemented based on the corresponding curved surface, and the scaled projection image may be displayed. In this way, display on a freeform curved surface is possible.
  • FIGS. 5A and 5B are reference views for explanation of a distance detection method of the light output module of FIGS. 4A and 4B.
  • FIG. 5A is a reference view for explanation of sensing of a touch input on the screen 200 of FIG. 4A during the first frame section.
  • the drawing illustrates an increase in the amplitude of the received light at the x-coordinate and y-coordinate corresponding to the first point Pt.
  • an electrical signal Tx corresponding to the output light and an electrical signal Rx corresponding to the received light are illustrated.
  • the amplitude of the received light Ae at the x-coordinate and the y-coordinate corresponding to the first point Pt is illustrated. That is, the user touch causes the amplitude of the electrical signal corresponding to the received light at the touch point to be greater than that at other points on the screen.
  • the processor 170 detects and processes the touch point on the screen 200. More particularly, the processor 170 may calculate coordinates of the touch input when the screen 200 has a first transparency.
  • FIG. 5B is a reference view for explanation of sensing of a gesture input implemented at the front of the screen 200 of FIG. 4B during the second frame section.
  • Tx represents an electrical signal corresponding to an output light
  • Rx represents an electrical signal corresponding to a received light acquired by a user gesture input at the front of the screen 200.
  • the light output module 100 may receive the output light scattered or reflected by the user hand 60 located at the front of the screen 200
  • the processor 170 may calculate distance information based on a phase difference between an electrical signal corresponding to the output light and an electrical signal corresponding to the received light, and may process a gesture input based on the calculated distance information.
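The phase-difference distance detection described above follows the standard time-of-flight relation between the Tx and Rx signals. The sketch below is illustrative only; the modulation frequency and function name are assumptions, not details from this document.

```python
import math

SPEED_OF_LIGHT = 3.0e8  # metres per second


def distance_from_phase(phase_diff_rad, mod_freq_hz):
    """Estimate the distance to an external object from the phase
    difference between the emitted (Tx) and received (Rx) modulated
    infrared signals. The light travels out and back, hence the
    final division by two."""
    # One full 2*pi phase cycle corresponds to one modulation wavelength.
    wavelength = SPEED_OF_LIGHT / mod_freq_hz
    round_trip = (phase_diff_rad / (2 * math.pi)) * wavelength
    return round_trip / 2.0
```

For example, a phase difference of pi at an assumed 10 MHz modulation frequency corresponds to a distance of 7.5 m.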
  • FIGS. 6A to 6E are views for explanation of the screen of FIGS. 4A and 4B.
  • FIG. 6A illustrates a cross-sectional view of the screen 200 in accordance with one embodiment of the present invention.
  • the screen 200 may include a protecting film 905, an upper glass panel 910 and a lower glass panel 920, first electrodes 915 and second electrodes 925 arranged between the upper glass panel 910 and the lower glass panel 920, and an anisotropic body 930 interposed between the first electrodes 915 and the second electrodes 925.
  • the protecting film 905 may serve to protect the screen 200.
  • the first electrodes 915 and the second electrodes 925 arranged between the upper glass panel 910 and the lower glass panel 920 may be transparent electrodes, such as Indium Tin Oxide (ITO) electrodes.
  • the anisotropic body 930 interposed between the first electrodes 915 and the second electrodes 925, may have a variable arrangement direction depending on power applied between the first electrodes 915 and the second electrodes 925, and may have different indices of refraction depending on the variable arrangement direction.
  • the anisotropic body 930 may be formed of liquid crystals.
  • FIG. 6B is a view illustrating arrangement of the upper glass panel and the first electrodes of the screen of FIG. 6A.
  • the first electrodes 915 are vertically disposed on the upper glass panel 910. More particularly, the drawing illustrates vertical arrangement of m first electrodes V1, ..., Vm. Meanwhile, although the drawing illustrates the first electrodes V1, ..., Vm as being electrically connected to one another, alternatively, the first electrodes may be spaced apart from one another.
  • FIG. 6C illustrates the waveform of power applied to the first electrodes of FIG. 6B.
  • low level power L may be applied to the first electrodes V1, ..., Vm during a first frame section
  • high level power H may be applied to the first electrodes V1, ..., Vm during a second frame section.
  • FIG. 6D is a view illustrating arrangement of the lower glass panel and the second electrodes of the screen of FIG. 6A.
  • the second electrodes 925 are horizontally arranged on the lower glass panel 920 in parallel. More particularly, the drawing illustrates horizontal arrangement of n second electrodes H1, ..., HN.
  • FIG. 6E illustrates the waveform of power applied to the second electrodes of FIG. 6D.
  • low level power L may be applied to the second electrodes H1, ..., HN during a first frame section
  • high level power H may be applied to the second electrodes H1, ..., HN during a second frame section.
  • when low level power is applied, an arrangement direction of the anisotropic body 930 does not vary, which may provide the screen 200 with a first transparency, i.e. cause the screen 200 to be opaque.
  • when high level power is applied, an arrangement direction of the anisotropic body 930 varies, which may provide the screen 200 with a second transparency, i.e. cause the screen 200 to be transparent.
  • the screen 200 is changed to be opaque during the first frame section, and is changed to be transparent during the second frame section.
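The frame-synchronized electrode drive of FIGS. 6C and 6E can be summarized in a small sketch. The helper below is hypothetical; "L" and "H" stand for the low-level and high-level power of the figures, and the sketch assumes the first and second frame sections simply alternate, with odd-numbered frames as first frame sections.

```python
def electrode_power(frame_index):
    """Return the power level applied to the first electrodes (V1..Vm)
    and second electrodes (H1..HN) for a given frame, and the resulting
    screen state. Odd frames are first frame sections (low power, opaque);
    even frames are second frame sections (high power, transparent)."""
    if frame_index % 2 == 1:  # first frame section: touch sensing
        return {"first_electrodes": "L", "second_electrodes": "L",
                "screen": "opaque"}
    else:                     # second frame section: gesture sensing
        return {"first_electrodes": "H", "second_electrodes": "H",
                "screen": "transparent"}
```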
  • FIG. 7 illustrates an implementation example of the display apparatus of FIG. 1.
  • the display apparatus 10 of the present invention is a rear projection type display apparatus capable of sensing a touch input and a gesture input.
  • the display apparatus 10 may be embodied as a display apparatus for use in public places, such as theaters, terminals, airports, etc.
  • the display apparatus 10 may be embodied as a display apparatus for use in a specific private space, such as a living room or a bedroom.
  • the display apparatus 10 may be embodied as a display apparatus for use in a specific private space, such as the interior of a vehicle.
  • FIG. 7 illustrates an implementation example in which the display apparatus in accordance with the embodiment of the present invention is installed on a center fascia 610 of a vehicle having a freeform curved surface.
  • the center fascia 610 may be provided, at the front of a steering wheel 403, with the screen 200 corresponding to a dashboard where various gauges are arranged.
  • the drawing illustrates the case in which a gauge image is displayed on the screen 200.
  • a driver may view the gauge image displayed on the screen 200. Meanwhile, the driver may implement a touch input on the screen 200, or may implement a gesture input at the front of the screen 200.
  • the display apparatus 10 of the present invention may sense a touch input or gesture input based on an output light from the light output module 100 and a received light corresponding to the output light.
  • the display apparatus 10 of the present invention may provide the screen 200 with an image corresponding to a user touch input or gesture input.
  • FIG. 8 is another example of a block diagram showing a simplified internal configuration of the display apparatus of FIG. 1.
  • FIG. 8 shows the display apparatus 10 in accordance with another embodiment of the present invention, which is similar to the display apparatus 10 of FIG. 2, but has a difference in terms of additional provision of a camera 195.
  • the camera 195 is included in the light output module 100, but is not limited thereto, and various other examples are possible.
  • the camera 195 may capture an image of an external object located behind the transparent screen 200.
  • the camera 195 may be an RGB camera to acquire a captured image of an external object.
  • the camera 195 may include a Charge Coupled Device (CCD) or a Complementary Metal Oxide Semiconductor (CMOS) sensor.
  • the image captured by the camera 195 may be transmitted to the processor 170 and may be output as a projection image to the screen 200 during a first frame section.
  • FIG. 9 is a reference view for explanation of sensing of a gesture input by the light output module of FIG. 8.
  • FIG. 9 illustrates detection of a distance to an external object based on, e.g., a gesture input during a second frame section.
  • the light output module 100a may include the camera 195 to capture an image of an external object. More specifically, a captured image of the user including the user hand 60 may be acquired.
  • FIG. 10 illustrates projection of an image from the light output module of FIG. 8.
  • an image captured by the camera 195 included in the light output module 100 during a second frame section may be displayed as a projection image on the screen 200.
  • the user 700 may view an image 650 showing the user’s figure.
  • the light output module 100 may output the projection image 650 during a first frame section after the camera 195 captures the image.
  • the display apparatus 10 including the light output module 100 may be connected to a network or an external terminal device via the communication module 180 to implement data exchange.
  • the display apparatus 10 may transmit the captured image 650 as exemplarily shown in FIG. 10 to a counterpart terminal device, and may receive an image captured by the counterpart terminal device. This enables realization of a video conferencing system.
  • the display apparatus 10 of the present invention may further include an audio output unit, an audio input unit, etc., thus being capable of realizing a video telephone system.
  • FIG. 11 is a flowchart showing an operating method of the display apparatus in accordance with one embodiment of the present invention.
  • the display apparatus in accordance with the embodiment of the present invention sets the screen to a first transparency during a first frame section (S810).
  • the processor 170 may control the screen controller 210 to apply low level power to the first electrodes 915 and the second electrodes 925 of the screen 200. Thereby, the screen 200 may be set to the first transparency, i.e. to be opaque.
  • the display apparatus 10 senses a touch input on the screen 200 during the first frame section (S820).
  • the light source unit 210 may output a visible light and an output light during the first frame section.
  • the scanner 140 may output a projection image based on the visible light and the output light to the screen 200 via scanning.
  • since the screen 200 is opaque, the projection image is diffused and displayed on the screen 200, and the output light is reflected by the screen 200. Meanwhile, when a touch input occurs on the screen 200, the amplitude of a received light at the touch input point varies, differently from a received light reflected from the screen 200.
  • an electrical signal corresponding to a received light at the touch input point may correspond to the waveform Rx of FIG. 5A
  • an electrical signal corresponding to a received light reflected by the screen at positions except for the touch input point may correspond to the waveform Rx1 of FIG. 5B. That is, the amplitude of the electrical signal corresponding to the received light at the touch input point is greater than that at positions except for the touch input point.
  • the processor 170 may process the touch input based on a phase difference between the electrical signal corresponding to the output light and the electrical signal corresponding to the received light, as well as the amplitude of the electrical signal corresponding to the received light.
  • the phase difference is used only to detect a distance to the screen 200, and may not be considered upon sensing of a touch input. That is, the processor 170 may process a touch input based on the amplitude of the electrical signal corresponding to the received light.
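The amplitude-based touch sensing described above amounts to a threshold test at each scan coordinate: a finger returns a stronger signal than the bare screen. The sketch below is a hypothetical illustration; the 2D amplitude array layout, the baseline value, and the threshold factor are all assumptions.

```python
def detect_touch_points(rx_amplitude, baseline, threshold=1.5):
    """Return (x, y) scan coordinates where the received-light amplitude
    exceeds the screen-reflection baseline by a given factor, indicating
    a finger touching the screen at that point."""
    touches = []
    for y, row in enumerate(rx_amplitude):
        for x, amp in enumerate(row):
            if amp > baseline * threshold:
                touches.append((x, y))
    return touches
```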
  • the display apparatus 10 sets the screen 200 to a second transparency during a second frame section (S830).
  • the processor 170 may control the screen controller 210 to apply high level power to the first electrodes 915 and the second electrodes 925 of the screen 200. Thereby, the screen 200 may be set to the second transparency greater than the first transparency, i.e. to be transparent.
  • the display apparatus 10 senses a gesture input of the user at the front of the screen 200 during the second frame section (S840).
  • the light source unit 210 may output only the output light, without the visible light, during the second frame section, but is not limited thereto, and various other examples are possible.
  • the scanner 140 may output an output light to the screen 200 via scanning.
  • the output light may be transmitted through the screen 200 and reach the user located at the front of the screen 200. Then, the output light may be scattered or reflected by the user hand 60 and thereafter be introduced into the scanner 140.
  • an electrical signal corresponding to the received light acquired via the gesture input may correspond to the waveform Rx of FIG. 5B.
  • the processor 170 may process the gesture input based on a phase difference between the electrical signal corresponding to the output light and the electrical signal corresponding to the received light.
  • the light source unit 210 may increase the intensity of the output light during the second frame section as compared to the first frame section.
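Steps S810 to S840 of FIG. 11 can be gathered into one per-frame loop. The sketch below assumes alternating frame sections and uses hypothetical stand-in objects for the screen, the light source unit, and the processor operations described above; none of these method names come from this document.

```python
def run_frame(frame_index, screen, light_source, processor):
    """One iteration of the operating method of FIG. 11, with the first
    and second frame sections alternating by frame parity."""
    if frame_index % 2 == 1:
        # S810: set the screen to the first transparency (opaque).
        screen.set_transparency("opaque")
        # S820: output visible light and IR, then sense a touch input.
        light_source.output(visible=True, infrared=True)
        processor.process_touch(rx=light_source.receive())
    else:
        # S830: set the screen to the second transparency (transparent).
        screen.set_transparency("transparent")
        # S840: output IR only and sense a gesture input through the screen.
        light_source.output(visible=False, infrared=True)
        processor.process_gesture(rx=light_source.receive())
```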
  • FIGS. 12 and 13 illustrate various examples of arrangement of the first frame section and the second frame section.
  • FIG. 12 illustrates the case in which a first frame section for sensing of a touch input and a second frame section for sensing of a gesture input are alternately arranged.
  • the screen 200 may be changed to be opaque during the odd frame sections (frames 1, 3, ..., n-1) and may be changed to be transparent during the even frame sections (frames 2, 4, ..., n).
  • the odd frame sections (frames 1, 3, ..., n-1) may correspond to the above-described first frame section
  • the even frame sections (frames 2, 4, ..., n) may correspond to the above-described second frame section.
  • during the odd frame sections, respective corresponding projection images 910a, 910c, ..., 910y may be displayed on the screen 200 via scanning
  • during the even frame sections, respective corresponding projection images 910b, 910d, ..., 910z may be displayed on the screen 200 via scanning.
  • alternatively, no projection image may be displayed on the screen 200 during the even frame sections (frames 2, 4, ..., n).
  • in this case, the light source included in the light source unit 210 to output a visible light may be turned off.
  • the frequency of the first frame sections may differ from the frequency of the second frame sections.
  • FIG. 13 illustrates the case in which the frequency of the first frame sections is greater than the frequency of the second frame sections, in order to enhance sensing accuracy of a touch input.
  • during the first frame sections, the screen 200 is changed to be opaque and displays a corresponding projection image.
  • during the second frame sections, the screen 200 may be changed to be transparent and display a corresponding projection image.
  • the frequency of the second frame sections may be greater than the frequency of the first frame sections.
  • the frequency of the first frame sections and the frequency of the second frame sections may be adjusted under control of the processor 170.
  • the first frame sections and the second frame sections may be alternately arranged.
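An unequal allocation of frame sections, as in FIG. 13, can be expressed as a simple mapping from frame index to section. The 3:1 ratio below is only an example for illustration; as noted above, the actual ratio is left to the control of the processor 170.

```python
def frame_section(frame_index, touch_ratio=3):
    """Map a frame index to its section when first (touch-sensing) frame
    sections are 'touch_ratio' times as frequent as second (gesture-sensing)
    frame sections. With touch_ratio=3 the repeating pattern is
    touch, touch, touch, gesture, ..."""
    period = touch_ratio + 1
    if frame_index % period < touch_ratio:
        return "first (touch)"
    return "second (gesture)"
```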
  • FIGS. 14A and 14B are views for explanation of operation based on a touch input and a gesture input on a screen.
  • FIG. 14A(a) illustrates the case in which a projection image 1100 showing gauge information is displayed.
  • the display apparatus 10 in accordance with the embodiment of the present invention may sense the touch input based on a received light, and implement a corresponding operation.
  • FIG. 14A(b) illustrates the case in which a projection image 1120 showing time information depending on the touch input is displayed. As such, user convenience may be increased.
  • FIG. 14B(a) illustrates the case in which a projection image 1210 as a broadcast image of a first channel (CH 9) is displayed on the screen 200.
  • the display apparatus 10 in accordance with the embodiment of the present invention may sense the gesture input based on a received light, and implement a corresponding operation.
  • FIG. 14B(b) illustrates the case in which channel-up depending on the gesture input is implemented and thus a broadcast image of a second channel (CH 11) is displayed as a projection image 1220. As such, user convenience may be increased.
  • FIG. 15 is a view for explanation of operation of the light source unit of FIGS. 4A and 4B.
  • a projection image and an output light are output during a first frame section for sensing of a touch input, whereas a projection image is not output and only an output light is output during a second frame section for sensing of a gesture input.
  • during the first frame section, the red light source 210R, the green light source 210G, the blue light source 210B, and the output light source 210IR included in the light source unit 210 may be turned on.
  • during the second frame section, the red light source 210R, the green light source 210G, and the blue light source 210B included in the light source unit 210 may be turned off, and only the output light source 210IR may be turned on. As such, unnecessary power consumption may be reduced.
  • while time-based separation of a touch input section and a gesture input section on a per frame section basis has been described above as a method of sensing a touch input and a gesture input, space-based separation is also possible.
  • the display apparatus in accordance with another embodiment of the present invention includes a screen consisting of a first region having a first transparency and a second region having a second transparency greater than the first transparency, a light source unit to output an output light for detection of a distance to an external object, and a processor configured to process a touch input based on a received light corresponding to the output light when the touch input occurs on the first region of the screen and to process a gesture input based on a received light corresponding to the output light when the gesture input occurs at the front of the second region of the screen.
  • the first region of the screen 200 may be set to a first transparency
  • the second region may be set to a second transparency greater than the first transparency
  • when a touch input occurs on the first region of the screen 200, the processor 170 may process the touch input based on a received light corresponding to the output light.
  • when a gesture input occurs at the front of the second region, the processor 170 may process the gesture input based on a received light corresponding to the output light.
  • the light source unit 210 may simultaneously output a projection image based on a visible light and an output light
  • the scanner 140 may simultaneously output the projection image and the output light to the screen 200 by implementing the first direction scanning and the second direction scanning.
  • the processor 170 may implement scaling of an image to be displayed, thereby controlling projection of the image onto the first region of the screen 200.
  • FIGS. 16 and 17 are views illustrating various examples of a touch input and a gesture input.
  • FIG. 16(a) illustrates the case in which a first region 1310 of the screen 200 is opaque and a second region 1320 is transparent.
  • the display apparatus 10 may sense the touch input based on a received light. Then, to ensure efficient sensing of the touch input, the size of the first region 1310 of the screen 200 may vary.
  • FIG. 16(b) illustrates the case in which an entire region 1315 of the screen 200 is opaque. Thus, sensing of a touch input on the entire region 1315 of the screen 200 is possible.
  • FIG. 17(a) illustrates the case in which the first region 1310 of the screen 200 is opaque and the second region 1320 is transparent.
  • the display apparatus 10 may sense the gesture input based on a received light. Then, the display apparatus 10 may vary the size of the second region 1320 of the screen 200, to ensure efficient sensing of the gesture input.
  • FIG. 17(b) illustrates the case in which an entire region 1325 of the screen 200 is transparent. Thus, sensing of a gesture input at the front of the entire region 1325 of the screen 200 is possible.
  • variation in the sizes of the transparent region and the opaque region may be accomplished by applying high level power H to some of the first electrodes V1, ..., Vm and the second electrodes H1, ..., HN of the screen 200 and applying low level power L to the other electrodes as described above with reference to FIGS. 6A to 6E.
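Applying high-level power to only some of the electrodes, as described above, can be sketched as selecting a span of electrode indices. The indexing scheme below is an assumption for illustration; the actual drive depends on the panel layout of FIGS. 6B and 6D.

```python
def electrode_levels(num_electrodes, transparent_span):
    """Return a per-electrode power level: 'H' (high) for electrodes
    inside the half-open span covering the transparent region, 'L' (low)
    elsewhere, so that only the spanned region of the screen becomes
    transparent while the rest stays opaque."""
    lo, hi = transparent_span
    return ["H" if lo <= i < hi else "L" for i in range(num_electrodes)]
```

Widening or narrowing the span correspondingly varies the sizes of the transparent and opaque regions, as in FIGS. 16 and 17.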
  • the present invention may be applied to a display apparatus, and more particularly to a rear projection type display apparatus capable of sensing a touch input and a gesture input.

Abstract

Disclosed is a display apparatus. The display apparatus includes a screen having a first transparency during a first frame section and a second transparency during a second frame section, a light source unit configured to output an output light for detection of a distance to an external object, a light detector configured to detect a received light from the external object, and a processor configured to process a touch input based on the output light and the received light during the first frame section and to process a gesture input during the second frame section. The display apparatus is capable of sensing a touch input and a gesture input.

Description

REAR PROJECTION TYPE DISPLAY APPARATUS CAPABLE OF SENSING TOUCH INPUT AND GESTURE INPUT
The present invention relates to a display apparatus, and more particularly to a rear projection type display apparatus capable of sensing a touch input and a gesture input.
A display apparatus is an apparatus that displays an image. Among various display apparatuses, a projector is an apparatus that projects an image onto a screen.
There is an increasing demand for user interaction via, e.g., a touch input or gesture input of a user during display of a projection image on a screen.
In this connection, sensing of a gesture input is suggested in the case of a front projection type and sensing of a touch input is suggested in the case of a rear projection type.
However, simultaneous sensing of a touch input and a gesture input is limited to some extent.
It is an object of the present invention to provide a rear projection type display apparatus capable of sensing a touch input and a gesture input.
In accordance with one aspect of the present invention, the above and other objects can be accomplished by the provision of a display apparatus including a screen having a first transparency during a first frame section and a second transparency during a second frame section, a light source unit configured to output an output light for detection of a distance to an external object, a light detector configured to detect a received light from the external object, and a processor configured to process a touch input based on the output light and the received light during the first frame section and to process a gesture input during the second frame section.
In accordance with another aspect of the present invention, there is provided a display apparatus including a screen having a first transparency in a first region thereof and a second transparency in a second region thereof, a light source unit configured to output an output light for detection of a distance to an external object, a light detector configured to detect a received light from the external object, and a processor configured to process a touch input on the first region of the screen and a gesture input at the front of the second region of the screen based on the output light and the received light.
As is apparent from the above description, a display apparatus in accordance with an embodiment of the present invention senses a touch input in a state in which a screen has a first transparency during a first frame section, and senses a gesture input in a state in which the screen has a second transparency during a second frame section. In particular, through output of an output light for detection of a distance to an external object, sensing of a touch input and a gesture input may be accomplished based on the output light and a received light corresponding to the output light.
In the case of a rear projection type, when outputting a projection image, the display apparatus may simultaneously output a projection image and an output light. Thereby, sensing of a touch input or gesture input may be accomplished simultaneously with projection of an image.
Meanwhile, a display apparatus in accordance with another embodiment of the present invention may further include a camera to capture an image of an external object in a state in which a screen has a second transparency during a second frame section. The captured image may be displayed, as a projection image, on the screen.
The above and other objects, features and other advantages of the present invention will be more clearly understood from the following detailed description taken in conjunction with the accompanying drawings, in which:
FIG. 1 illustrates a conceptual view of a display apparatus in accordance with one embodiment of the present invention;
FIG. 2 is one example of a block diagram showing a simplified internal configuration of the display apparatus of FIG. 1;
FIG. 3 illustrates simultaneous output of a projection image and a light for distance detection from a light output module of FIG. 1;
FIGS. 4A and 4B are reference views for explanation of sensing of a touch input and a gesture input by the light output module of FIG. 1;
FIGS. 5A and 5B are reference views for explanation of a distance detection method of the light output module of FIGS. 4A and 4B;
FIGS. 6A to 6E are views for explanation of a screen of FIGS. 4A and 4B;
FIG. 7 illustrates an implementation example of the display apparatus of FIG. 1;
FIG. 8 is another example of a block diagram showing a simplified internal configuration of the display apparatus of FIG. 1;
FIG. 9 is a reference view for explanation of sensing of a gesture input by the light output module of FIG. 8;
FIG. 10 illustrates projection of an image from the light output module of FIG. 8;
FIG. 11 is a flowchart showing an operating method of the display apparatus in accordance with one embodiment of the present invention;
FIGS. 12 and 13 illustrate various examples of arrangement of a first frame section and a second frame section;
FIGS. 14A and 14B are views for explanation of operation based on a touch input and a gesture input on a screen;
FIG. 15 is a view for explanation of operation of a light source unit of FIGS. 4A and 4B; and
FIGS. 16 and 17 are views illustrating various examples of a touch input and a gesture input.
Reference will now be made in detail to the embodiments of the present invention.
With respect to constituent elements used in the following description, the suffixes “module” and “unit” are given only in consideration of ease in the preparation of the specification, and do not have or serve as specially important meanings or roles. Thus, “module” and “unit” may be used interchangeably.
A display apparatus described herein is designed to display projection images on two screens via scanning. More particularly, the display apparatus is capable of recognizing a distance to an external object or motion of the external object while displaying projection images on two screens via scanning.
To this end, the display apparatus may include a light output module to output an output light for detection of a distance to an external object or motion of the external object. In addition, the light output module may output a projection image to display an image on a screen, in addition to the output light.
The display apparatus is capable of receiving a light scattered or reflected by an external object and of detecting, e.g., a distance to the external object based on a difference between the output light and the received light.
In particular, the display apparatus may sense a touch input in a state in which a screen has a first transparency during a first frame section, and may sense a gesture input in a state in which the screen has a second transparency during a second frame section.
Meanwhile, the above described display apparatus may be included in home appliances, such as a TV, media player, game console, air conditioner, refrigerator, washing machine, cooking appliance, robot cleaner, etc., and may also be included in a vehicle, such as a car, etc.
Hereinafter, the display apparatus as described above will be described in detail.
FIG. 1 illustrates a conceptual view of a display apparatus in accordance with one embodiment of the present invention.
Referring to the drawing, the display apparatus 10 may include a light output module 100 and screens 200a and 200b.
The light output module 100 may output an output light for detection of a distance to an external object via first direction scanning and second direction scanning, may receive a light corresponding to the output light, and may detect a distance to an external object or motion of the external object based on a difference between the output light and the received light.
In addition to the output light, the light output module 100 may output a projection image based on a visible light via first direction scanning and second direction scanning.
To this end, the light output module 100 may include a 2-dimensional (2D) scanner to simultaneously output the projection image and the output light via scanning.
Meanwhile, since a user 700, who is located opposite to the light output module 100, recognizes an image projected onto a screen 200, this image projection may be referred to as rear projection.
In addition, with regard to the user 700 who is located at the front of the screen 200, detection of a distance to each user is possible based on an output light to be output to the screen 200.
For example, as exemplarily shown in the drawing, when the user 700 who is located at the front of the screen 200 implements a touch input of touching the screen 200 using the finger 20, the light output module 100 may receive a light scattered or reflected by the user finger 20, and sense the touch input based on the output light and the received light.
In another example, as exemplarily shown in the drawing, when the user 700 who is located at the front of the screen 200 implements a gesture input at the front of the screen 200 using the hand 60, the light output module 100 may receive a light scattered or reflected by the user hand 60, and sense the gesture input based on the received light as well as the output light.
Meanwhile, in order to efficiently sense a touch input and a gesture input, the display apparatus 10 in accordance with the embodiment of the present invention may sense a touch input during a first frame section by providing the screen 200 with a first transparency, and sense a gesture input during a second frame section by providing the screen 200 with a second transparency greater than the first transparency.
In conclusion, the display apparatus 10 enables detection of a touch input and a gesture input.
FIG. 2 is a block diagram showing a simplified internal configuration of the light output module of FIG. 1.
Referring to the drawing, the light output module 100 serves to output a projection image and an output light in a Time of Flight (TOF) manner.
To this end, the light output module 100 may include a memory 120, a scanner 140, a processor 170, a communication module 180, a drive unit 185, a power supply unit 190, a light source unit 210, and a light detector 280, for example.
The memory 120 may store programs for processing and control of the processor 170, and may function to temporarily store input or output data (e.g., still images and videos).
The communication module 180 serves as an interface between the light output module 100 and all external devices connected to the light output module 100 in a wired or wireless manner. The communication module 180 may receive data or power from the external devices to transmit the same to internal components of the light output module 100, and may transmit internal data of the light output module 100 to the external devices.
In particular, the communication module 180 may receive a radio signal from a proximate mobile terminal (not shown). Here, the radio signal may include a voice call signal, a video telephony call signal, or various types of data, such as text data, image data, etc.
To this end, the communication module 180 may include a local area communication module (not shown). Local area communication technologies may include Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, Near Field Communication (NFC), and the like.
The scanner 140 may output an input light to an external area by sequentially and repeatedly implementing first direction scanning and second direction scanning.
The light, input to the scanner 140, may include a visible light corresponding to a projection image and an output light for detection of a distance to an external object. Here, the output light may be an infrared light.
The scanner 140 may sequentially and repeatedly implement scanning from the left to the right and scanning from the right to the left with respect to an external scan area, and more particularly may implement this scanning with respect to the entire external scan area on a per frame basis. Through the scanning as described above, the scanner 140 may output the visible light and the output light to the external scan area.
The processor 170 may implement general control operation of the light output module 100. More specifically, the processor 170 may control operation of the respective internal units of the light output module 100.
The processor 170 may control output of a projection image, such as a video image stored in the memory 120 or a video image transmitted from an external device through the communication module 180, to an external scan area.
To this end, the processor 170 may control the drive unit 185 that controls the light source unit 210 to output red (R), green (G), and blue (B) visible lights. More specifically, the processor 170 may output R, G and B signals, corresponding to a video image to be displayed, to the drive unit 185.
Meanwhile, the processor 170 may transmit an electrical signal, corresponding to an output light, to the drive unit 185, for detection of a distance to an external object.
The processor 170 may control operation of the scanner 140. More specifically, the processor 170 may control sequential and repetitive implementation of first direction scanning and second direction scanning to output a projection image and an output light to an external area.
Meanwhile, during control of operation of the scanner 140, the processor 170 may vary a frame rate to vary a scanning speed of the scanner 140.
Meanwhile, the processor 170 may implement detection of a distance to an external object based on an electrical signal, corresponding to an output light to be transmitted to the drive unit 185, and an electrical signal corresponding to a received light received by the light detector 280.
For example, the processor 170 may detect a distance to an external scan area using a phase difference between an electrical signal corresponding to an output light and an electric signal corresponding to a received light. In addition, the processor 170 may detect gesture motion of a user based on distance information regarding an external scan area detected on a per frame basis.
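The phase-difference distance detection described above can be sketched as follows. This is a hypothetical illustration, not taken from the patent: it assumes the infrared output light is amplitude-modulated at a known frequency `f_mod`, a common Time-of-Flight arrangement, so that the measured phase shift between the output-light signal and the received-light signal maps directly to a round-trip distance.

```python
import math

# Hypothetical sketch of Time-of-Flight distance recovery. The patent states
# only that distance is detected from the phase difference between the
# electrical signal corresponding to the output light and that corresponding
# to the received light; the modulation frequency f_mod is an assumption.
C = 299_792_458.0  # speed of light in m/s

def distance_from_phase(phase_rad, f_mod):
    """Distance (m) to the object for a measured round-trip phase shift."""
    # One full 2*pi of phase corresponds to one modulation wavelength
    # (c / f_mod) of round-trip travel; the one-way distance is half of that.
    return (phase_rad / (2.0 * math.pi)) * (C / f_mod) / 2.0

# Example: a quarter-cycle (pi/2) phase shift at 10 MHz modulation
d = distance_from_phase(math.pi / 2.0, 10e6)  # about 3.75 m
```

Applying this per scan position on a per frame basis yields the distance map from which gesture motion can be tracked.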
More specifically, the processor 170 may process a touch input during a first frame section by providing the screen 200 with a first transparency, and process a gesture input during a second frame section by providing the screen 200 with a second transparency less than the first transparency.
Meanwhile, the processor 170 may control a screen controller 205 to control the screen 200 at the outside of the light output module 100.
The screen 200 may have a first transparency during a first frame section, and a second transparency less than the first transparency during a second frame section. More specifically, the screen 200 is changed to be opaque during the first frame section such that a projection image is displayed on the screen 200, and is changed to be transparent during the second frame section such that an output light is transmitted through the screen 200 to enable detection of a user who is located at the front of the screen 200.
The above described transparency adjustment of the screen on a per frame section basis may be implemented by the screen controller 205.
When the screen 200 includes a first transparent electrode and a second transparent electrode intersecting each other, and an anisotropic body interposed between the first transparent electrode and the second transparent electrode, the screen controller 205 may vary an arrangement direction of the anisotropic body based on power applied to the anisotropic body between the first transparent electrode and the second transparent electrode, thereby controlling the screen 200 to achieve a first transparency during a first frame section and a second transparency during a second frame section.
Meanwhile, a frame synchronization signal, etc., may be transmitted from the processor 170 to the screen controller 205.
The light source unit 210 may include a blue light source to output a blue light, a green light source to output a green light, and a red light source to output a red light. In this case, each light source may take the form of a laser diode or a Light Emitting Diode (LED).
In addition, the light source unit 210 may include an infrared light source to output an infrared light.
The light detector 280 may detect a received light from an external object, the received light corresponding to an output light, and convert the detected received light into an electrical signal. To this end, the light detector 280 may include a photodiode to convert an optical signal into an electrical signal. In particular, the light detector 280 may include a photodiode having high photoelectric conversion efficiency, for example, an avalanche photodiode to convert a light, scattered by an external object, into an electrical signal.
Meanwhile, the light detector 280 may include a Charge Coupled Device (CCD) or a Complementary Metal Oxide Semiconductor (CMOS) sensor, in order to receive and detect an infrared light when an output light is the infrared light.
Although not shown in the drawing, a sampler (not shown) to convert an analog signal into a digital signal may be additionally provided between the light detector 280 and the processor 170.
The drive unit 185 may control output of red, green, and blue lights from the red light source, the green light source, and the blue light source of the light source unit 210 in response to R, G, and B signals transmitted from the processor 170.
Meanwhile, the drive unit 185 may control output of an infrared light from the infrared light source of the light source unit 210 in response to an electrical signal, corresponding to an output light, transmitted from the processor 170.
The power supply unit 190 may supply power required for operation of the respective components upon receiving external power or internal power under control of the processor 170.
FIG. 3 illustrates simultaneous output of a projection image and an output light for distance detection from the light output module of FIG. 1.
Referring to the drawing, the scanner 140 included in the light output module 100 in accordance with the embodiment of the present invention may output an input light to an external scan area by sequentially and repeatedly implementing first direction scanning and second direction scanning. The drawing illustrates output of a projection image 202a based on a visible light RGB and an output light IR to the screen 200a.
The scanner 140 included in the light output module 100 in accordance with the embodiment of the present invention may simultaneously output an input light, i.e. a visible light RGB and an infrared output light IR. In particular, the scanner 140 may sequentially and repeatedly implement scanning from the left to the right and scanning from the right to the left with respect to an external scan area, and more particularly may implement this scanning with respect to the entire scan area on a per frame basis.
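The bidirectional scanning described above can be sketched as a zig-zag raster: within one frame, the scanner sweeps in the first direction on one line and in the second direction on the next, emitting the visible (RGB) pixel and the IR output-light sample at each position. This is an illustrative model only; the actual mirror trajectory and timing are not specified at this level of detail in the patent.

```python
# Hypothetical sketch of the zig-zag scan order for one frame: left-to-right
# on even lines, right-to-left on odd lines, covering the entire scan area.
def scan_order(width, height):
    """Yield (x, y) positions in first/second direction scanning order."""
    for y in range(height):
        xs = range(width) if y % 2 == 0 else range(width - 1, -1, -1)
        for x in xs:
            yield x, y

order = list(scan_order(3, 2))
# order -> [(0, 0), (1, 0), (2, 0), (2, 1), (1, 1), (0, 1)]
```

Because the RGB light and the IR light traverse the same scan positions, each received IR sample is already registered to a pixel of the projection image.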
More specifically, the light output module 100 in accordance with the embodiment of the present invention may detect a distance to an external object while projecting an image to an external area. Therefore, the light output module 100 enables display of an image related to the distance to the object or motion of the object, or display of an image corresponding to a user gesture.
Hereinafter, the internal configuration of the light output module 100 will be described in more detail.
FIGS. 4A and 4B are reference views for explanation of sensing of a touch input and a gesture input by the light output module of FIG. 1.
A light output module 100a of FIGS. 4A and 4B may include the light source unit 210, collimator units 212 and 218, a photosynthesis unit 220, a light reflector 256, the scanner 140, the processor 170, the drive unit 185, an infrared light transmission filter 282, and the light detector 280.
The light source unit 210 may include a plurality of light sources. More specifically, the light source unit 210 may include a red light source 210R, a green light source 210G, a blue light source 210B, and an output light source 210IR to output an infrared output light. Among these light sources, the light sources 210R, 210G and 210B may include laser diodes.
The respective light sources 210R, 210G, 210B and 210IR may be driven by respective electrical signals from the drive unit 185, and the electrical signals of the drive unit 185 may be generated under control of the processor 170. Meanwhile, the output light source 210IR may output an output light in response to an electrical signal corresponding to the output light.
Lights, output from the respective light sources 210R, 210G, 210B and 210IR, are collimated via respective collimator lenses included in the collimator unit 212.
The photosynthesis unit 220 synthesizes lights output from the respective light sources 210R, 210G, 210B and 210IR, and outputs the synthesized light in a given direction. To this end, the photosynthesis unit 220 may include four 2D Micro Electro Mechanical System (MEMS) mirrors 220a, 220b, 220c and 220d.
More specifically, a first photo synthesizer 220a, a second photo synthesizer 220b, a third photo synthesizer 220c, and a fourth photo synthesizer 220d respectively output a red light from the red light source 210R, a green light from the green light source 210G, a blue light from the blue light source 210B, and an output light from the output light source 210IR toward the scanner 140.
The light reflector 256 reflects the red light, the green light, the blue light, and the output light, having passed through the photosynthesis unit 220, toward the scanner 140. The light reflector 256 must reflect lights of various wavelengths, and to this end, may take the form of a Total Mirror (TM).
Meanwhile, the scanner 140 may output a visible light RGB and an output light IR, received from the light source unit 210, by sequentially and repeatedly implementing first direction scanning and second direction scanning with respect to an external area. This scanning is repeatedly implemented with respect to the entire external scan area. In particular, the visible light RGB and the output light IR, output from the scanner 140, may be output to the screen 200.
In this way, a projection image corresponding to the visible light RGB may be displayed on the screen 200.
Meanwhile, since the user 700, who is located opposite to the light output module 100, recognizes an image projected onto the screen 200, this image projection may be referred to as rear projection.
In addition, with regard to the user 700 who is located at the front of the screen 200, detection of a distance to each user is possible based on an output light to be output to the screen 200.
More specifically, the light output module 100 may sense a touch input, acquired when the user finger 20 touches the screen 200, during a first frame section, and sense a gesture input, acquired by motion of the user hand 60 located opposite to the screen 200, during a second frame section. To this end, the screen 200 may be opaque during the first frame section, and may be transparent during the second frame section.
FIG. 4A illustrates that the screen 200 is opaque during the first frame section. In this case, the light output module 100 may simultaneously output a projection image based on a visible light RGB and an output light IR. When the user implements a touch input on the screen 200 by moving the finger 20 during the first frame section, the output light IR may be scattered or reflected by the finger 20 to thereby be introduced into the light output module 100. More specifically, a received light may be input to the light detector 280 by way of the collimator unit 218 and the infrared light transmission filter 282.
The light detector 280 may convert the received light into an electrical signal, and the processor 170 may implement distance detection and processing of the touch input based on the output light and the received light acquired by the finger 20.
FIG. 4B illustrates that the screen 200 is transparent during the second frame section. In this case, the light output module 100 may output an output light IR. Alternatively, the light output module 100 may simultaneously output a projection image based on a visible light RGB and an output light IR. When the user implements a gesture input at the front of the screen 200 by moving the hand 60 during the second frame section, the output light IR may be scattered or reflected by the user hand 60 to thereby be introduced into the light output module 100. More specifically, a received light may be input to the light detector 280 by way of the collimator unit 218 and the infrared light transmission filter 282.
The light detector 280 may convert the received light into an electrical signal, and the processor 170 may implement distance detection and processing of the gesture input based on the output light and the received light acquired by the user hand 60.
Meanwhile, the first frame section and the second frame section may be alternately arranged, but are not limited thereto, and various other examples are possible.
According to the embodiment of the present invention, the scanner 140 outputs the visible light RGB, and therefore the screen 200, which displays a projection image, may have a freeform curved surface. Even in this case, the projection image may be displayed on the curved surface of the screen. For example, the curvature of the screen may be recognized via distance detection using the output light, a display image may be scaled based on the corresponding curved surface, and the scaled projection image may be displayed. In this way, display on a freeform curved surface is possible.
FIGS. 5A and 5B are reference views for explanation of a distance detection method of the light output module of FIGS. 4A and 4B.
First, FIG. 5A is a reference view for explanation of sensing of a touch input on the screen 200 of FIG. 4A during the first frame section.
When the user touches a first point Pt on the screen 200 as exemplarily shown in FIG. 5A(a) in a state in which an output light is output to the screen 200, for example, when the user implements a touch input using the hand, the output light is scattered or reflected by the user hand to thereby be detected as a received light by the light detector 280.
The drawing illustrates increase in the amplitude of the received light at the x-coordinate and y-coordinate corresponding to the first point Pt.
In FIG. 5A(b), an electrical signal Tx corresponding to the output light and an electrical signal Rx corresponding to the received light are illustrated. In particular, the amplitude of the received light Ae at the x-coordinate and the y-coordinate corresponding to the first point Pt is illustrated. That is, the user touch causes the amplitude of the electrical signal corresponding to the received light at the touch point to be greater than that at other points on the screen.
Based on the above described property, the processor 170 detects and processes the touch point on the screen 200. More particularly, the processor 170 may calculate coordinates of the touch input when the screen 200 has a first transparency.
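The amplitude-based touch detection described above can be sketched as follows. The coordinate grid, baseline amplitude, and detection margin are illustrative assumptions; the patent states only that the received-light amplitude at the touch point exceeds that at other points on the screen.

```python
# Hypothetical sketch: locating a touch point by finding the scan position
# whose received-light amplitude clearly exceeds the baseline reflection
# from the opaque screen.
def find_touch(amplitudes, baseline, margin=0.5):
    """amplitudes: dict mapping (x, y) scan coordinates to received amplitude."""
    candidates = [(xy, a) for xy, a in amplitudes.items()
                  if a > baseline * (1.0 + margin)]
    if not candidates:
        return None
    # The touch point is where the scattered light is strongest
    return max(candidates, key=lambda item: item[1])[0]

amps = {(0, 0): 1.0, (1, 0): 1.1, (2, 3): 2.6, (3, 3): 1.0}
touch = find_touch(amps, baseline=1.0)  # -> (2, 3)
```

Because the scanner addresses each (x, y) position in a known order, the peak sample directly yields the touch coordinates without a separate camera.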
Next, FIG. 5B is a reference view for explanation of sensing of a gesture input implemented at the front of the screen 200 of FIG. 4B during the second frame section.
Here, Tx represents an electrical signal corresponding to an output light, and Rx represents an electrical signal corresponding to a received light acquired by a user gesture input at the front of the screen 200.
When the screen 200 transmits an output light, the light output module 100 may receive the output light scattered or reflected by the user hand 60 located at the front of the screen 200.
Then, the processor 170 may calculate distance information based on a phase difference between an electrical signal corresponding to the output light and an electrical signal corresponding to the received light, and process a gesture input based on the calculated distance information.
FIGS. 6A to 6E are views for explanation of the screen of FIGS. 4A and 4B.
First, FIG. 6A illustrates a cross sectional view of the screen 200 in accordance with one embodiment of the present invention. The screen 200 may include a protecting film 905, an upper glass panel 910 and a lower glass panel 920, first electrodes 915 and second electrodes 925 arranged between the upper glass panel 910 and the lower glass panel 920, and an anisotropic body 930 interposed between the first electrodes 915 and the second electrodes 925.
The protecting film 905 may serve to protect the screen 200.
The first electrodes 915 and the second electrodes 925 arranged between the upper glass panel 910 and the lower glass panel 920 may be transparent electrodes, such as Indium Tin Oxide (ITO) electrodes.
The anisotropic body 930, interposed between the first electrodes 915 and the second electrodes 925, may have a variable arrangement direction depending on power applied between the first electrodes 915 and the second electrodes 925, and may have different indices of refraction depending on the variable arrangement direction. For example, the anisotropic body 930 may be formed of liquid crystals.
Next, FIG. 6B is a view illustrating arrangement of the upper glass panel and the first electrodes of the screen of FIG. 6A.
As illustrated in the drawing, the first electrodes 915 are vertically disposed on the upper glass panel 910. More particularly, the drawing illustrates vertical arrangement of m first electrodes V1, …, Vm. Meanwhile, although the drawing illustrates the first electrodes V1, …, Vm as being electrically connected to one another, alternatively, the first electrodes may be spaced apart from one another.
FIG. 6C illustrates the waveform of power applied to the first electrodes of FIG. 6B. As exemplarily shown in the drawing, low level power L may be applied to the first electrodes V1, …, Vm during a first frame section and high level power H may be applied to the first electrodes V1, …, Vm during a second frame section.
Next, FIG. 6D is a view illustrating arrangement of the lower glass panel and the second electrodes of the screen of FIG. 6A.
As illustrated in the drawing, the second electrodes 925 are horizontally arranged on the lower glass panel 920 in parallel. More particularly, the drawing illustrates horizontal arrangement of n second electrodes H1, …, Hn.
FIG. 6E illustrates the waveform of power applied to the second electrodes of FIG. 6D. As exemplarily shown in the drawing, low level power L may be applied to the second electrodes H1, …, Hn during a first frame section and high level power H may be applied to the second electrodes H1, …, Hn during a second frame section.
As such, when low level power is applied to both the first and second electrodes, an arrangement direction of the anisotropic body 930 does not vary, which may provide the screen 200 with a first transparency, i.e. cause the screen 200 to be opaque.
On the other hand, when high level power is applied to both the first and second electrodes, an arrangement direction of the anisotropic body 930 varies, which may provide the screen 200 with a second transparency, i.e. cause the screen 200 to be transparent.
In this way, the screen 200 is changed to be opaque during the first frame section, and is changed to be transparent during the second frame section.
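The per-frame-section power control described above can be sketched as follows. The names and values are illustrative assumptions: low level power on both electrode groups leaves the screen opaque (first transparency), while high level power makes it transparent (second transparency), switched in step with the frame synchronization signal from the processor 170.

```python
# Hypothetical sketch of the screen controller's per-frame decision.
LOW, HIGH = "L", "H"

def electrode_level(frame_index, first_sections):
    """Power level applied to the first and second electrodes for a frame."""
    return LOW if frame_index in first_sections else HIGH

# Alternating arrangement: odd-numbered frames are first (touch) sections
first = {f for f in range(1, 9) if f % 2 == 1}
levels = [electrode_level(f, first) for f in range(1, 9)]
# levels -> ['L', 'H', 'L', 'H', 'L', 'H', 'L', 'H']
```

Other arrangements of first and second frame sections only change the contents of the `first` set; the electrode drive logic stays the same.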
FIG. 7 illustrates an implementation example of the display apparatus of FIG. 1.
Referring to the drawing, the display apparatus 10 of the present invention is a rear projection type display apparatus capable of sensing a touch input and a gesture input. Thus, the display apparatus 10 may be embodied as a display apparatus for use in public places, such as theaters, terminals, airports, etc. Alternatively, the display apparatus 10 may be embodied as a display apparatus for use in a specific private space, such as a living room or a bedroom. Alternatively, the display apparatus 10 may be embodied as a display apparatus for use in a specific private space, such as the interior of a vehicle.
FIG. 7 illustrates an implementation example in which the display apparatus in accordance with the embodiment of the present invention is installed at a center fascia 610 of a vehicle having a freeform curved surface.
The center fascia 610 may be provided, at the front of a steering wheel 403, with the screen 200 corresponding to a dashboard where various gauges are arranged.
The drawing illustrates the case in which a gauge image is displayed on the screen 200.
Thereby, a driver may view the gauge image displayed on the screen 200. Meanwhile, the driver may implement a touch input on the screen 200, or may implement a gesture input at the front of the screen 200.
In this case, the display apparatus 10 of the present invention may sense a touch input or gesture input based on an output light from the light output module 100 and a received light corresponding to the output light.
In addition, the display apparatus 10 of the present invention may provide the screen 200 with an image corresponding to a user touch input or gesture input.
FIG. 8 is another example of a block diagram showing a simplified internal configuration of the display apparatus of FIG. 1.
FIG. 8 shows the display apparatus 10 in accordance with another embodiment of the present invention, which is similar to the display apparatus 10 of FIG. 2 but differs in that it additionally includes a camera 195.
The camera 195 is included in the light output module 100, but is not limited thereto, and various other examples are possible.
During a second frame section, i.e. while the screen 200 has a second transparency, the camera 195 may capture an image of an external object located behind the transparent screen 200.
To this end, the camera 195 may be an RGB camera to acquire a captured image of an external object. For example, the camera 195 may include a Charge Coupled Device (CCD) or a Complementary Metal Oxide Semiconductor (CMOS) sensor.
The image captured by the camera 195 may be transmitted to the processor 170 and may be output as a projection image to the screen 200 during a first frame section.
FIG. 9 is a reference view for explanation of sensing of a gesture input by the light output module of FIG. 8.
Similar to FIG. 4B, FIG. 9 illustrates detection of a distance to an external object based on, e.g., a gesture input during a second frame section.
The light output module 100a may include the camera 195 to capture an image of an external object. More specifically, a captured image of the user including the user hand 60 may be acquired.
FIG. 10 illustrates projection of an image from the light output module of FIG. 8.
Referring to the drawing, an image captured by the camera 195 included in the light output module 100 during a second frame section may be displayed as a projection image on the screen 200. Thus, the user 700 may view an image 650 showing the user’s figure.
The light output module 100 may output the projection image 650 during a first frame section after the camera 195 captures the image.
Meanwhile, the display apparatus 10 including the light output module 100 may be connected to a network or an external terminal device via the communication module 180 to implement data exchange.
Thereby, the display apparatus 10 may transmit the captured image 650 as exemplarily shown in FIG. 10 to a counterpart terminal device, and may receive an image captured by the counterpart terminal device. This enables realization of a video conferencing system.
Meanwhile, although not shown in FIG. 2 and other related drawings, the display apparatus 10 of the present invention may further include an audio output unit, an audio input unit, etc., thus being capable of realizing a video telephony system.
FIG. 11 is a flowchart showing an operating method of the display apparatus in accordance with one embodiment of the present invention.
Referring to the drawing, the display apparatus in accordance with the embodiment of the present invention sets the screen to a first transparency during a first frame section (S810).
The processor 170 may control the screen controller 205 to apply low level power to the first electrodes 915 and the second electrodes 925 of the screen 200. Thereby, the screen 200 may be set to the first transparency, i.e. to be opaque.
Next, the display apparatus 10 senses a touch input on the screen 200 during the first frame section (S820). The light source unit 210 may output a visible light and an output light during the first frame section. The scanner 140 may output a projection image based on the visible light and the output light to the screen 200 via scanning.
In this case, since the screen 200 is opaque, the projection image is diffused and displayed on the screen 200, and the output light is reflected by the screen 200. Meanwhile, when a touch input occurs on the screen 200, the amplitude of a received light at a touch input point varies, differently from a received light reflected from the screen 200.
More specifically, an electrical signal corresponding to a received light at the touch input point may correspond to the waveform Rx of FIG. 5A, whereas an electrical signal corresponding to a received light reflected by the screen at positions except for the touch input point may correspond to the waveform Rx1 of FIG. 5B. That is, the amplitude of the electrical signal corresponding to the received light at the touch input point is greater than that at positions except for the touch input point.
In conclusion, the processor 170 may process the touch input based on a phase difference between the electrical signal corresponding to the output light and the electrical signal corresponding to the received light and the amplitude of the electrical signal corresponding to the received light. The phase difference is used only to detect a distance to the screen 200, and may not be considered upon sensing of a touch input. That is, the processor 170 may process a touch input based on the amplitude of the electrical signal corresponding to the received light.
Next, the display apparatus 10 sets the screen 200 to a second transparency during a second frame section (S830).
The processor 170 may control the screen controller 205 to apply high level power to the first electrodes 915 and the second electrodes 925 of the screen 200. Thereby, the screen 200 may be set to the second transparency less than the first transparency, i.e. to be transparent.
Next, the display apparatus 10 senses a gesture input of the user at the front of the screen 200 during the second frame section (S840).
The light source unit 210 may output only an output light except for a visible light during the second frame section, but is not limited thereto, and various other examples are possible.
The scanner 140 may output an output light to the screen 200 via scanning.
In this case, since the screen 200 is transparent, the output light may be transmitted through the screen 200 and reach the user located at the front of the screen 200. Then, the output light may be scattered or reflected by the user hand 60 and thereafter be introduced into the scanner 140.
In this case, an electrical signal corresponding to the received light acquired via the gesture input may correspond to the waveform Rx of FIG. 5B.
In conclusion, the processor 170 may process the gesture input based on a phase difference between the electrical signal corresponding to the output light and the electrical signal corresponding to the received light.
Meanwhile, since the output light must be transmitted through the screen 200 during the second frame section, to enhance sensing accuracy of the gesture input, the light source unit 210 may increase the intensity of the output light during the second frame section as compared to the first frame section.
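The intensity boost described above can be sketched as follows. The drive current values and the gain factor are illustrative assumptions; the patent states only that the output-light intensity may be increased during the second frame section relative to the first.

```python
# Hypothetical sketch: boosting the IR output-light drive current while the
# screen is transparent, since the light must pass through the screen to
# reach the user and back.
def ir_drive_current(base_ma, is_second_section, gain=1.5):
    """IR source drive current (mA) for the given frame section type."""
    return base_ma * gain if is_second_section else base_ma

touch_section_ma = ir_drive_current(100.0, False)   # first frame section
gesture_section_ma = ir_drive_current(100.0, True)  # second frame section
```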
Various examples of arrangement of the first frame section and the second frame section are possible. This will be described below with reference to FIGS. 12 and 13.
FIGS. 12 and 13 illustrate various examples of arrangement of the first frame section and the second frame section.
First, FIG. 12 illustrates the case in which a first frame section for sensing of a touch input and a second frame section for sensing of a gesture input are alternately arranged. Thereby, the screen 200 may be changed to be opaque during odd frame sections 1, 3,…, n-1 frames and may be changed to be transparent during even frame sections 2, 4,…, n frames.
That is, the odd frame sections 1, 3,…, n-1 frames may correspond to the above described first frame section, and the even frame sections 2, 4,…, n frames may correspond to the above described second frame section.
During the odd frame sections 1, 3,…, n-1 frames, respective corresponding projection images 910a, 910c, …, 910y may be displayed on the screen 200 via scanning, and during the even frame sections 2, 4,…, n frames, respective corresponding projection images 910b, 910d, …, 910z may be displayed on the screen 200 via scanning.
On the other hand, differently from FIG. 12, no projection image may be displayed on the screen 200 during the even frame sections 2, 4,…, n frames. In this case, a light source to output a visible light included in the light source unit 210 may be turned off.
Alternatively, differently from FIG. 12, the frequency of the first frame sections may differ from the frequency of the second frame sections.
FIG. 13 illustrates the case in which the frequency of the first frame sections is greater than the frequency of the second frame sections, in order to enhance sensing accuracy of a touch input.
As exemplarily illustrated in the drawing, during all frame sections (frames 1, 2, 3, …, n) except for the fourth frame section (frame 4), the screen 200 is changed to be opaque and displays a corresponding projection image. On the other hand, during the fourth frame section (frame 4), the screen 200 may be changed to be transparent and display a corresponding projection image.
Differently from FIG. 13, to enhance sensing accuracy of a gesture input, the frequency of the second frame sections may be greater than the frequency of the first frame sections.
The frequency of the first frame sections and the frequency of the second frame sections may be adjusted under control of the processor 170.
As exemplarily shown in FIG. 13, when a gesture input begins to be sensed during the fourth frame section (frame 4) under the condition that the frequency of the first frame sections is greater than the frequency of the second frame sections, it is possible to increase the frequency of the second frame sections as compared to that in FIG. 13. For example, as exemplarily shown in FIG. 12, the first frame sections and the second frame sections may be alternately arranged.
Conversely, when a touch input begins to be sensed under the condition of the frequency of the second frame sections being greater than the frequency of the first frame sections, it is possible to further increase the frequency of the first frame sections.
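One hypothetical scheduling policy combining FIGS. 12 and 13 can be sketched as follows; the 4:1 ratio, the switch condition, and all names are assumptions, since the disclosure states only that the frequencies may be adjusted under control of the processor 170:

```python
def schedule(n_frames: int, gesture_active: bool) -> list[str]:
    """Build a per-frame schedule. While no gesture is being sensed, only
    every fourth frame is a second (gesture, transparent) section, as in
    FIG. 13; once a gesture input begins to be sensed, the first and second
    sections alternate 1:1, as in FIG. 12."""
    if gesture_active:
        return ["touch" if i % 2 == 1 else "gesture" for i in range(1, n_frames + 1)]
    return ["gesture" if i % 4 == 0 else "touch" for i in range(1, n_frames + 1)]

before = schedule(4, gesture_active=False)  # ['touch', 'touch', 'touch', 'gesture']
after = schedule(4, gesture_active=True)    # ['touch', 'gesture', 'touch', 'gesture']
```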
FIGS. 14A and 14B are views for explanation of operation based on a touch input and a gesture input on a screen.
First, FIG. 14A(a) illustrates the case in which a projection image 1100 showing gauge information is displayed. In this case, when a touch input is made by the user's finger 20, the display apparatus 10 in accordance with the embodiment of the present invention may sense the touch input based on a received light and implement a corresponding operation.
FIG. 14A(b) illustrates the case in which a projection image 1120 showing time information depending on the touch input is displayed. As such, user convenience may be increased.
Next, FIG. 14B(a) illustrates the case in which a projection image 1210, a broadcast image of a first channel (CH 9), is displayed on the screen 200. In this case, when a gesture input by the user's hand 60 occurs, the display apparatus 10 in accordance with the embodiment of the present invention may sense the gesture input based on a received light and implement a corresponding operation.
FIG. 14B(b) illustrates the case in which channel-up depending on the gesture input is implemented and thus a broadcast image of a second channel (CH 11) is displayed as a projection image 1220. As such, user convenience may be increased.
FIG. 15 is a view for explanation of operation of the light source unit of FIGS. 4A and 4B.
As described above, a projection image and an output light are output during a first frame section for sensing of a touch input, whereas a projection image is not output and only an output light is output during a second frame section for sensing of a gesture input.
Thereby, as exemplarily shown in the drawing, during the first frame section, the red light source 210R, the green light source 210G, the blue light source 210B, and the output light source 210IR included in the light source unit 210 may be turned on. During the second frame section, the red light source 210R, the green light source 210G, and the blue light source 210B included in the light source unit 210 may be turned off, and only the output light source 210IR may be turned on. As such, unnecessary power consumption may be reduced.
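The per-section source states of FIG. 15 can be sketched as follows; the dictionary keys reuse the reference numerals, while the function itself is an illustrative assumption:

```python
def light_source_states(section: str) -> dict[str, bool]:
    """Return on/off states for the sources of the light source unit 210.
    During the first (touch) section all sources are on; during the second
    (gesture) section the visible red, green, and blue sources are off and
    only the infrared output light source 210IR stays on, reducing
    unnecessary power consumption."""
    rgb_on = section == "first"
    return {"210R": rgb_on, "210G": rgb_on, "210B": rgb_on, "210IR": True}

states_first = light_source_states("first")    # all sources on
states_second = light_source_states("second")  # only '210IR' on
```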
Although time-based separation of a touch input section and a gesture input section on a per frame section basis has been described as a method of sensing a touch input and a gesture input, space-based separation is also possible.
That is, the display apparatus in accordance with another embodiment of the present invention includes a screen consisting of a first region having a first transparency and a second region having a second transparency less than the first transparency, a light source unit to output an output light for detection of a distance to an external object, and a processor configured to process a touch input based on a received light corresponding to the output light when the touch input occurs on the first region of the screen and to process a gesture input based on a received light corresponding to the output light when the gesture input occurs at the front of the second region of the screen.
In this case, the first region of the screen 200 may be set to a first transparency, and the second region may be set to a second transparency less than the first transparency.
Then, when the touch input occurs on the first region of the screen, the processor 170 may process the touch input based on a received light corresponding to the output light. When the gesture input occurs at the front of the second region of the screen, the processor 170 may process a gesture input based on a received light corresponding to the output light.
In this case, the light source unit 210 may simultaneously output a projection image based on a visible light and an output light, and the scanner 140 may simultaneously output the projection image and the output light to the screen 200 by implementing first direction scanning and second direction scanning.
Meanwhile, the processor 170 may implement scaling of an image to be displayed, thereby controlling projection of the image onto the first region of the screen 200.
A description of the other configurations of the display apparatus in accordance with another embodiment of the present invention will be replaced by the above description with reference to FIGS. 4A to 15.
FIGS. 16 and 17 are views illustrating various examples of a touch input and a gesture input.
First, FIG. 16(a) illustrates the case in which a first region 1310 of the screen 200 is opaque and a second region 1320 is transparent. In this case, when the user's finger 20 touches a point in the first region 1310, the display apparatus 10 may sense the touch input based on a received light. Then, to ensure efficient sensing of the touch input, the size of the first region 1310 of the screen 200 may vary.
FIG. 16(b) illustrates the case in which an entire region 1315 of the screen 200 is opaque. Thus, sensing of a touch input on the entire region 1315 of the screen 200 is possible.
Next, FIG. 17(a) illustrates the case in which the first region 1310 of the screen 200 is opaque and the second region 1320 is transparent. In this case, when the user's hand 60 makes a gesture motion at the front of the second region 1320, the display apparatus 10 may sense the gesture input based on a received light. Then, the display apparatus 10 may vary the size of the second region 1320 of the screen 200 to ensure efficient sensing of the gesture input.
FIG. 17(b) illustrates the case in which an entire region 1325 of the screen 200 is transparent. Thus, sensing of a gesture input at the front of the entire region 1325 of the screen 200 is possible.
Meanwhile, variation in the sizes of the transparent region and the opaque region may be accomplished by applying high level power H to some of the first electrodes V1, …, Vm and the second electrodes H1, …, HN of the screen 200 and applying low level power L to the other electrodes as described above with reference to FIGS. 6A to 6E.
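A hypothetical drive pattern for varying the region sizes is sketched below; whether level H or level L yields the opaque state depends on the anisotropic body, and the row-wise simplification over the electrodes V1, …, Vm is likewise an assumption:

```python
def electrode_levels(m: int, opaque_rows: int) -> list[str]:
    """Return the power level applied to each of m first electrodes:
    high level 'H' for the rows forming the opaque (touch-sensing) region
    and low level 'L' for the remaining, transparent rows. Growing or
    shrinking `opaque_rows` resizes the two regions."""
    return ["H" if row < opaque_rows else "L" for row in range(m)]

levels = electrode_levels(6, opaque_rows=2)  # ['H', 'H', 'L', 'L', 'L', 'L']
```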
The above described configurations and methods of the display apparatus are included in at least one of the embodiments of the present invention, and should not be limited to only one embodiment. In addition, the configurations and methods as illustrated in each embodiment may be implemented with regard to other embodiments as they are combined with one another or modified by those skilled in the art.
Although the preferred embodiments of the present invention have been disclosed for illustrative purposes, those skilled in the art will appreciate that various modifications, additions and substitutions are possible, without departing from the scope and spirit of the invention as disclosed in the accompanying claims.
The present invention may be applied to a display apparatus, and more particularly to a rear projection type display apparatus capable of sensing a touch input and a gesture input.

Claims (20)

  1. A display apparatus comprising:
    a screen having a first transparency during a first frame section and a second transparency during a second frame section;
    a light source unit configured to output an output light for detection of a distance to an external object;
    a light detector configured to detect a received light from the external object; and
    a processor configured to process a touch input based on the output light and the received light during the first frame section and to process a gesture input during the second frame section.
  2. The apparatus according to claim 1, wherein the processor processes a touch input based on a phase difference between an electrical signal corresponding to the output light and an electrical signal corresponding to the received light and the amplitude of the electrical signal corresponding to the received light during the first frame section, and
    wherein the processor processes a gesture input based on a phase difference between an electrical signal corresponding to the output light and an electrical signal corresponding to the received light during the second frame section.
  3. The apparatus according to claim 1, wherein the light source unit simultaneously outputs a projection image based on a visible light and the output light, and
    wherein the screen displays the projection image during the first frame section, and does not display the projection image during the second frame section.
  4. The apparatus according to claim 1, further comprising a scanner configured to output the output light toward the screen by implementing first direction scanning and second direction scanning.
  5. The apparatus according to claim 4, wherein the light source unit simultaneously outputs a projection image based on a visible light and the output light, and
    wherein the scanner outputs the projection image and the output light toward the screen during the first frame section and the second frame section.
  6. The apparatus according to claim 4, wherein the light source unit simultaneously outputs a projection image based on a visible light and the output light, and
    wherein the scanner outputs the projection image and the output light toward the screen during the first frame section, and outputs the output light toward the screen during the second frame section.
  7. The apparatus according to claim 1, wherein the screen alternately varies a transparency on a per frame basis.
  8. The apparatus according to claim 1, wherein the intensity of the output light, output during the second frame section, is greater than the intensity of the output light output during the first frame section.
  9. The apparatus according to claim 1, further comprising a screen controller configured to control the screen such that the screen has the first transparency during the first frame section and the second transparency during the second frame section.
  10. The apparatus according to claim 1, wherein the processor varies the frequency of first frame sections and the frequency of second frame sections based on the touch input or the gesture input.
  11. The apparatus according to claim 1, wherein the screen includes a first electrode and a second electrode intersecting each other, and an anisotropic body interposed between the first electrode and the second electrode, and
    wherein an arrangement direction of the anisotropic body varies based on power applied between the first electrode and the second electrode, such that the screen has the first transparency during the first frame section and the second transparency during the second frame section.
  12. The apparatus according to claim 1, further comprising a camera located opposite to the external object on the basis of the screen, the camera being configured to capture an image of the external object during the second frame section,
    wherein the processor receives the captured image of the external object.
  13. The apparatus according to claim 12, wherein the light source unit simultaneously outputs a projection image based on a visible light and the output light, and
    wherein the screen displays the captured image as the projection image during the first frame section.
  14. A display apparatus comprising:
    a screen having a first transparency in a first region thereof and a second transparency in a second region thereof;
    a light source unit configured to output an output light for detection of a distance to an external object;
    a light detector configured to detect a received light from the external object; and
    a processor configured to process a touch input on the first region of the screen and a gesture input at the front of the second region of the screen based on the output light and the received light.
  15. The apparatus according to claim 14, wherein the processor processes a touch input based on a phase difference between an electrical signal corresponding to the output light and an electrical signal corresponding to the received light in the first region and the amplitude of the electrical signal corresponding to the received light, and
    wherein the processor processes a gesture input based on a phase difference between an electrical signal corresponding to the output light and an electrical signal corresponding to the received light in the second region.
  16. The apparatus according to claim 14, wherein the light source unit simultaneously outputs a projection image based on a visible light and the output light.
  17. The apparatus according to claim 14, further comprising a scanner configured to output the output light toward the screen by implementing first direction scanning and second direction scanning.
  18. The apparatus according to claim 14, wherein the screen varies the sizes of the first region and the second region based on the touch input or the gesture input.
  19. The apparatus according to claim 14, further comprising a screen controller configured to control the screen such that the first region of the screen has the first transparency and the second region of the screen has the second transparency.
  20. The apparatus according to claim 14, wherein the screen includes a first electrode and a second electrode intersecting each other, and an anisotropic body interposed between the first electrode and the second electrode, and
    wherein an arrangement direction of the anisotropic body varies based on power applied between the first electrode and the second electrode, such that the first region of the screen has the first transparency and the second region of the screen has the second transparency.
PCT/KR2014/001668 2013-07-16 2014-02-28 Rear projection type display apparatus capable of sensing touch input and gesture input WO2015008915A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020130083758A KR102102760B1 (en) 2013-07-16 2013-07-16 Display apparatus for rear projection-type capable of detecting touch input and gesture input
KR10-2013-0083758 2013-07-16

Publications (1)

Publication Number Publication Date
WO2015008915A1 true WO2015008915A1 (en) 2015-01-22

Family

ID=52346338

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2014/001668 WO2015008915A1 (en) 2013-07-16 2014-02-28 Rear projection type display apparatus capable of sensing touch input and gesture input

Country Status (2)

Country Link
KR (1) KR102102760B1 (en)
WO (1) WO2015008915A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113503440A (en) * 2021-07-21 2021-10-15 许昌学院 Intelligent home projector based on hand motion recognition

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090102800A1 (en) * 2007-10-17 2009-04-23 Smart Technologies Inc. Interactive input system, controller therefor and method of controlling an appliance
KR20110101995A (en) * 2010-03-10 2011-09-16 전남대학교산학협력단 Touch and touch gesture recognition system
US20120274550A1 (en) * 2010-03-24 2012-11-01 Robert Campbell Gesture mapping for display device
US20130106792A1 (en) * 2010-08-04 2013-05-02 Robert Campbell System and method for enabling multi-display input
WO2013077883A1 (en) * 2011-11-23 2013-05-30 Intel Corporation Gesture input with multiple views, displays and physics

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101441584B1 (en) * 2008-01-02 2014-09-23 삼성전자 주식회사 See-through display apparatus and method
KR101688942B1 (en) * 2010-09-03 2016-12-22 엘지전자 주식회사 Method for providing user interface based on multiple display and mobile terminal using this method
JP5609566B2 (en) * 2010-11-10 2014-10-22 船井電機株式会社 projector



Also Published As

Publication number Publication date
KR102102760B1 (en) 2020-05-29
KR20150009360A (en) 2015-01-26


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14826572

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14826572

Country of ref document: EP

Kind code of ref document: A1