WO2012128399A1 - Display device and method of controlling the same - Google Patents

Display device and method of controlling the same

Info

Publication number
WO2012128399A1
Authority
WO
WIPO (PCT)
Prior art keywords
stereoscopic image
gesture
display device
user
controller
Prior art date
Application number
PCT/KR2011/001919
Other languages
English (en)
Inventor
Soungmin Im
Sangki Kim
Original Assignee
Lg Electronics Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lg Electronics Inc. filed Critical Lg Electronics Inc.
Priority to CN201180069433.7A priority Critical patent/CN103430215B/zh
Priority to DE112011104939.0T priority patent/DE112011104939T5/de
Priority to PCT/KR2011/001919 priority patent/WO2012128399A1/fr
Publication of WO2012128399A1 publication Critical patent/WO2012128399A1/fr

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304 Detection arrangements using opto-electronic means
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042 Digitisers characterised by the transducing means by opto-electronic means
    • G06F3/0421 Digitisers characterised by opto-electronic transducing means by interrupting or reflecting a light beam, e.g. optical touch-screen
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/041 Indexing scheme relating to G06F3/041 - G06F3/045
    • G06F2203/04101 2.5D-digitiser, i.e. digitiser detecting the X/Y position of the input means, finger or stylus, also when it does not touch, but is proximate to the digitiser's interaction surface and also measures the distance of the input means within a short range in the Z direction, possibly with a separate measurement setup
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10 Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106 Processing image signals
    • H04N13/128 Adjusting depth or disparity
    • H04N13/30 Image reproducers
    • H04N13/302 Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
    • H04N13/398 Synchronisation thereof; Control thereof

Definitions

  • the present invention relates to a display device and a method of controlling the same, and more particularly, to a display device and a method of controlling the same, capable of controlling the presentation (i.e., display) of an image in response to a distance and an approach direction with respect to a stereoscopic image.
  • As terminals such as personal computers, laptop computers, cellular phones or the like become diversified in their functions, they are being implemented as multimedia player type terminals equipped with complex functions of, for example, capturing pictures or videos, reproducing music or video files, providing game services, receiving broadcasting signals or the like.
  • Terminals, as multimedia devices, may also be called display devices since they are generally configured to display a variety of image information.
  • Such display devices may be classified into portable and stationary types according to their mobility.
  • Portable display devices may include laptop computers, cellular phones and the like, while stationary display devices may include televisions, monitors for desktop computers and the like.
  • An object of the present invention is to provide a display device and a method of controlling the same, capable of efficiently controlling the presentation of an image in response to a distance and an approach direction with respect to a stereoscopic image.
  • a display device including: a camera capturing a gesture of a user; a display displaying a stereoscopic image; and a controller controlling presentation of the stereoscopic image in response to a distance between the gesture and the stereoscopic image in a virtual space and an approach direction of the gesture with respect to the stereoscopic image.
  • a display device including: a camera capturing a gesture of a user; a display displaying a stereoscopic image having a plurality of sides; and a controller executing a function assigned to at least one of the plurality of sides in response to an approach direction of the gesture with respect to the at least one of the plurality of sides in a virtual space.
  • a method of controlling the display device including: displaying a stereoscopic image; acquiring a gesture with respect to the displayed stereoscopic image; and controlling presentation of the stereoscopic image in response to a distance between the gesture and the stereoscopic image in a virtual space, and an approach direction of the gesture with respect to the stereoscopic image.
  • a method of controlling a display device including: displaying a stereoscopic image having a plurality of sides; acquiring a gesture with respect to the displayed stereoscopic image; and executing a function assigned to at least one of the plurality of sides in response to an approach direction of the gesture with respect to the at least one of the plurality of sides in a virtual space.
  • the presentation of an image can be controlled in response to a distance and an approach direction with respect to a stereoscopic image.
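  • As a rough illustration of the summary above (not part of the patent text), the controller's decision can be pictured as a small loop: the camera supplies the gesture position, the controller compares it with the rendered position of the stereoscopic image in the virtual space, and the presentation is updated from the resulting distance and approach direction. The Python sketch below is a hedged example; the function name, the interaction radius, and the normal-based side selection are assumptions made only for illustration.

```python
import numpy as np

def control_presentation(gesture_pos, image_pos, image_normals, radius=0.30):
    """Hypothetical controller step: decide how to react to a gesture made
    toward a stereoscopic image in the virtual space.

    gesture_pos   -- (x, y, z) position of the tracked hand, in metres
    image_pos     -- (x, y, z) position of the stereoscopic object
    image_normals -- dict mapping side name -> outward unit normal vector
    radius        -- assumed interaction radius around the object
    """
    gesture = np.asarray(gesture_pos, dtype=float)
    image = np.asarray(image_pos, dtype=float)

    # Distance between the gesture and the stereoscopic image in the virtual space.
    offset = gesture - image
    distance = float(np.linalg.norm(offset))

    # Approach direction: the side whose outward normal points most directly
    # back at the gesture is taken as the side being approached.
    direction = offset / distance if distance > 0 else np.zeros(3)
    approached_side = max(image_normals,
                          key=lambda s: float(np.dot(image_normals[s], direction)))

    action = "interact" if distance <= radius else "ignore"
    return action, approached_side

# Example: a hand 10 cm in front of an object centred at the origin.
normals = {"front": np.array([0.0, 0.0, 1.0]), "rear": np.array([0.0, 0.0, -1.0])}
print(control_presentation((0, 0, 0.10), (0, 0, 0), normals))  # ('interact', 'front')
```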
  • FIG. 1 is a block diagram of a display device relating to an embodiment of this document.
  • FIG. 2 is a conceptional view for explaining a proximity depth of a proximity sensor.
  • FIGS. 3 and 4 are views for explaining a method for displaying a stereoscopic image by using a binocular parallax according to an exemplary embodiment of the present invention.
  • FIG. 5 is a flowchart according to an exemplary embodiment of the present invention.
  • FIGS. 6 through 9 are views for explaining a method for displaying a stereoscopic image associated with FIG. 5.
  • FIG. 10 is a flowchart of the process of acquiring a user's gesture associated with FIG. 5, in more detail.
  • FIG. 11 is a view depicting a gesture for control acquisition associated with FIG. 10.
  • FIG. 12 is a flowchart of the process of controlling the presentation of the stereoscopic image associated with FIG. 5, in more detail.
  • FIGS. 13 and 14 are views depicting examples of a displayed stereoscopic image.
  • FIGS. 15 and 16 are views depicting gestures with respect to a stereoscopic image.
  • FIGS. 17 through 20 are views depicting display changes according to a gesture with respect to a stereoscopic image.
  • FIGS. 21 through 26 are views depicting gestures with respect to a stereoscopic image in the form of a polyhedron.
  • FIGS. 27 through 31 are views depicting pointers for selecting a stereoscopic image.
  • FIGS. 32 through 34 are views depicting the process of selecting any one of a plurality of stereoscopic images.
  • FIGS. 35 and 36 are views depicting an operation of a feedback unit.
  • FIGS. 37 through 39 are views depicting an operation of a display device relating to another exemplary embodiment of this document.
  • the mobile terminal described in the specification can include a cellular phone, a smart phone, a laptop computer, a digital broadcasting terminal, personal digital assistants (PDA), a portable multimedia player (PMP), a navigation system and so on.
  • FIG. 1 is a block diagram of a display device relating to an embodiment of this document.
  • the display device 100 may include a communication unit 110, a user input unit 120, an output unit 150, a memory 160, an interface 170, a controller 180, and a power supply 190. Not all of the components shown in FIG. 1 may be essential parts and the number of components included in the display device 100 may be varied.
  • the communication unit 110 may include at least one module that enables communication between the display device 100 and a communication system or between the display device 100 and another device.
  • the communication unit 110 may include a broadcasting receiving module 111, an Internet module 113, and a near field communication module 114.
  • the broadcasting receiving module 111 may receive broadcasting signals and/or broadcasting related information from an external broadcasting management server through a broadcasting channel.
  • the broadcasting channel may include a satellite channel and a terrestrial channel
  • the broadcasting management server may be a server that generates and transmits broadcasting signals and/or broadcasting related information or a server that receives previously created broadcasting signals and/or broadcasting related information and transmits the broadcasting signals and/or broadcasting related information to a terminal.
  • the broadcasting signals may include not only TV broadcasting signals, radio broadcasting signals, and data broadcasting signals but also signals in the form of a combination of a TV broadcasting signal or a radio broadcasting signal and a data broadcasting signal.
  • the broadcasting related information may be information on a broadcasting channel, a broadcasting program or a broadcasting service provider, and may be provided even through a communication network.
  • the broadcasting related information may exist in various forms.
  • the broadcasting related information may exist in the form of an electronic program guide (EPG) of a digital multimedia broadcasting (DMB) system or in the form of an electronic service guide (ESG) of a digital video broadcast-handheld (DVB-H) system.
  • the broadcasting receiving module 111 may receive broadcasting signals using various broadcasting systems.
  • the broadcasting signals and/or broadcasting related information received through the broadcasting receiving module 111 may be stored in the memory 160.
  • the Internet module 113 may correspond to a module for Internet access and may be included in the display device 100 or may be externally attached to the display device 100.
  • the near field communication module 114 may correspond to a module for near field communication. Further, Bluetooth®, radio frequency identification (RFID), infrared data association (IrDA), ultra wideband (UWB) and/or ZigBee® may be used as a near field communication technique.
  • the user input unit 120 is used to input an audio signal or a video signal and may include a camera 121 and a microphone 122.
  • the camera 121 may process image frames of still images or moving images obtained by an image sensor in a video telephony mode or a photographing mode. The processed image frames may be displayed on a display 151.
  • the camera 121 may be a 2D or 3D camera.
  • the camera 121 may be configured in the form of a single 2D or 3D camera or in the form of a combination of the 2D and 3D cameras.
  • the image frames processed by the camera 121 may be stored in the memory 160 or may be transmitted to an external device through the communication unit 110.
  • the display device 100 may include at least two cameras 121.
  • the microphone 122 may receive an external audio signal in a call mode, a recording mode or a speech recognition mode and process the received audio signal into electric audio data.
  • the microphone 122 may employ various noise removal algorithms for removing or reducing noise generated when the external audio signal is received.
  • the output unit 150 may include the display 151 and an audio output module 152.
  • the display 151 may display information processed by the display device 100.
  • the display 151 may display a user interface (UI) or a graphic user interface (GUI) relating to the display device 100.
  • the display 151 may include at least one of a liquid crystal display, a thin film transistor liquid crystal display, an organic light-emitting diode display, a flexible display and a three-dimensional display. Some of these displays may be of a transparent type or a light transmissive type. That is, the display 151 may include a transparent display.
  • the transparent display may include a transparent liquid crystal display.
  • the rear structure of the display 151 may also be of a light transmissive type. Accordingly, a user may see an object located behind the terminal body through the transparent area of the terminal body occupied by the display 151.
  • the display device 100 may include at least two displays 151.
  • the display device 100 may include a plurality of displays 151 that are arranged on a single plane, spaced apart from or integrated with each other.
  • the plurality of displays 151 may also be seated on different planes.
  • when the display 151 and a sensor sensing touch form a layered structure referred to as a touch screen, the display 151 may be used as an input device in addition to an output device.
  • the touch sensor may be in the form of a touch film, a touch sheet, and a touch pad, for example.
  • the touch sensor may convert a variation in pressure applied to a specific portion of the display 151 or a variation in capacitance generated at a specific portion of the display 151 into an electric input signal.
  • the touch sensor may sense pressure of touch as well as position and area of the touch.
  • a signal corresponding to the touch input may be transmitted to a touch controller.
  • the touch controller may then process the signal and transmit data corresponding to the processed signal to the controller 180. Accordingly, the controller 180 can detect a touched portion of the display 151.
  • the audio output module 152 may output audio data received from the communication unit 110 or stored in the memory 160.
  • the audio output module 152 may output audio signals related to functions, such as a call signal incoming tone and a message incoming tone, performed in the display device 100.
  • the memory 160 may store a program for operation of the controller 180 and temporarily store input/output data such as a phone book, messages, still images, and/or moving images.
  • the memory 160 may also store data about vibrations and sounds in various patterns that are output when a touch input is applied to the touch screen.
  • the memory 160 may include at least one of a flash memory, a hard disk type memory, a multimedia card micro type memory, a card type memory (such as SD or XD memory), a random access memory (RAM), a static RAM (SRAM), a read-only memory (ROM), an electrically erasable programmable ROM (EEPROM), a programmable ROM (PROM), a magnetic memory, a magnetic disk, or an optical disk.
  • the display device 100 may also operate in relation to a web storage performing the storing function of the memory 160 on the Internet.
  • the interface 170 may serve as a path to all external devices connected to the display device 100.
  • the interface 170 may receive data or power from the external devices and transmit the data or power to internal components of the display device 100, or transmit data of the display device 100 to the external devices.
  • the interface 170 may include a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device having a user identification module, an audio I/O port, a video I/O port, and/or an earphone port.
  • the controller 180 may control overall operations of the mobile terminal 100.
  • the controller 180 may perform control and processing for voice communication.
  • the controller 180 may also include an image processor 182 for processing images, which will be explained later.
  • the power supply 190 receives external power and internal power and provides power required for each of the components of the display device 100 to operate under the control of the controller 180.
  • embodiments described in this document can be implemented in software, hardware or a computer readable recording medium. According to hardware implementation, embodiments of this document may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and/or electrical units for executing functions. The embodiments may be implemented by the controller 180 in some cases.
  • embodiments such as procedures or functions may be implemented with a separate software module executing at least one function or operation.
  • Software codes may be implemented according to a software application written in an appropriate software language. The software codes may be stored in the memory 160 and executed by the controller 180.
  • FIG. 2 is a conceptional view for explaining a proximity depth of the proximity sensor.
  • the proximity sensor located inside or near the touch screen senses the approach and outputs a proximity signal.
  • the proximity sensor can be constructed such that it outputs a proximity signal according to the distance between the pointer approaching the touch screen and the touch screen (referred to as “proximity depth”).
  • the distance in which the proximity signal is output when the pointer approaches the touch screen is referred to as a detection distance.
  • the proximity depth can be known by using a plurality of proximity sensors having different detection distances and comparing proximity signals respectively output from the proximity sensors.
  • FIG. 2 shows the section of the touch screen in which proximity sensors capable of sensing three proximity depths are arranged. Proximity sensors capable of sensing less than three or more than four proximity depths can be arranged in the touch screen.
  • When the pointer completely comes into contact with the touch screen (D0), this is recognized as a contact touch.
  • When the pointer is located within a distance D1 from the touch screen, this is recognized as a proximity touch of a first proximity depth.
  • When the pointer is located in a range between the distance D1 and a distance D2 from the touch screen, this is recognized as a proximity touch of a second proximity depth.
  • When the pointer is located in a range between the distance D2 and a distance D3 from the touch screen, this is recognized as a proximity touch of a third proximity depth.
  • When the pointer is located farther than the distance D3 from the touch screen, this is recognized as cancellation of the proximity touch.
  • the controller 180 can recognize the proximity touch as various input signals according to the proximity distance and proximity position of the pointer with respect to the touch screen and perform various operation controls according to the input signals.
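  • As a concrete (and purely illustrative) reading of the depth classification described above, the sketch below buckets a measured pointer distance into the proximity states of FIG. 2; the numeric thresholds stand in for D1, D2 and D3 and are assumptions, not values from the patent.

```python
def proximity_depth(distance, d1=0.01, d2=0.03, d3=0.05):
    """Classify a pointer distance (in metres) into the proximity states of
    FIG. 2. The threshold values d1, d2, d3 are illustrative assumptions."""
    if distance <= 0.0:          # D0: the pointer touches the screen
        return "contact touch"
    if distance <= d1:           # within D1
        return "proximity touch of a first proximity depth"
    if distance <= d2:           # between D1 and D2
        return "proximity touch of a second proximity depth"
    if distance <= d3:           # between D2 and D3
        return "proximity touch of a third proximity depth"
    return "cancellation of proximity touch"   # farther than D3

print(proximity_depth(0.02))   # -> 'proximity touch of a second proximity depth'
```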
  • FIGS. 3 and 4 are views illustrating a method for displaying a stereoscopic image using binocular parallax according to an exemplary embodiment of the present invention. Specifically, FIG. 3 shows a scheme using a lenticular lens array, and FIG. 4 shows a scheme using a parallax barrier.
  • Binocular parallax refers to the difference in vision of viewing an object between a human being’s (user’s or observer’s) left and right eyes.
  • When the user's brain combines an image viewed by the left eye and that viewed by the right eye, the combined image makes the user feel stereoscopic.
  • the phenomenon in which the user feels stereoscopic according to binocular parallax will be referred to as a ‘stereoscopic vision’, and an image causing a stereoscopic vision will be referred to as a ‘stereoscopic image’.
  • When a particular object included in an image causes the stereoscopic vision, the corresponding object will be referred to as a 'stereoscopic object'.
  • a method for displaying a stereoscopic image according to binocular parallax is classified into a glass type method and a glassless type method.
  • the glass type method may include a scheme using tinted glasses having a wavelength selectivity, a polarization glass scheme using a light blocking effect according to a polarization difference, and a time-division glass scheme alternately providing left and right images within a residual image time of eyes.
  • the glass type method may further include a scheme in which filters each having a different transmittance are mounted on left and right eyes and a cubic effect with respect to a horizontal movement is obtained according to a time difference of a visual system made from the difference in transmittance.
  • the glassless type method in which a cubic effect is generated from an image display surface, rather than from an observer, includes a parallax barrier scheme, a lenticular lens scheme, a microlens array scheme, and the like.
  • a display module 151 includes a lenticular lens array 81a.
  • the lenticular lens array 81a is positioned between a display surface 81 on which pixels (L) to be input to a left eye 82a and pixels (R) to be input to a right eye 82b are alternately arranged along a horizontal direction, and the left and right eyes 82a and 82b, and provides an optical discrimination directionality with respect to the pixels (L) to be input to the left eye 82a and the pixels (R) to be input to the right eye 82b.
  • an image which passes through the lenticular lens array 81a is separated by the left eye 82a and the right eye 82b and thusly observed, and the user’s brain combines (or synthesizes) the image viewed by the left eye 82a and the image viewed by the right eye 82b, thus allowing the user to observe a stereoscopic image.
  • the display module 151 includes a parallax barrier 81b in the shape of a vertical lattice.
  • the parallax barrier 81b is positioned between a display surface 81, on which pixels (L) to be input to a left eye 82a and pixels (R) to be input to a right eye 82b are alternately arranged along a horizontal direction, and the left and right eyes 82a and 82b, and allows images to be separately observed by the left eye 82a and the right eye 82b.
  • the user’s brain combines (or synthesizes) the image viewed by the left eye 82a and the image viewed by the right eye 82b, thus allowing the user to observe a stereoscopic image.
  • the parallax barrier 81b is turned on to separate incident vision only in the case of displaying a stereoscopic image, and when a planar image is intended to be displayed, the parallax barrier 81b may be turned off to allow the incident vision to pass therethrough without being separated.
  • a stereoscopic image using binocular parallax may be displayed by using various other methods.
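  • The text does not give the geometry linking screen disparity to perceived depth, but the conventional similar-triangles approximation may help when the stereo disparities d1 and d2 are compared later (FIGS. 7 and 8). With eye separation e, viewing distance D to the display surface, and on-screen disparity d, the perceived offset from the screen is (a standard derivation, not a formula from the patent):

$$
z_{\text{behind}} = \frac{d\,D}{e - d} \quad \text{(uncrossed disparity: the object appears behind the screen)}, \qquad
z_{\text{front}} = \frac{d\,D}{e + d} \quad \text{(crossed disparity: the object appears in front of the screen)}.
$$

  • Under this approximation, a larger disparity of the same sign corresponds to a larger perceived offset from the display surface, which matches the later observation that the second image object with disparity d2 > d1 looks farther from the user than the first.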
  • FIG. 5 is a flowchart according to an exemplary embodiment of the present invention.
  • the controller 180 of the display device 100 may display a stereoscopic image in operation S10.
  • the stereoscopic image may be an image displayed by using a binocular disparity, that is, a stereo disparity.
  • a stereoscopic image with depth or perspective may be displayed.
  • an image may be displayed as if protruding or receding from a display surface of the display 151.
  • the stereoscopic image using the stereo disparity is different from a related-art two-dimensional (2D) display that gives just a 3D-like impression.
  • a user's gesture may be acquired in operation S30.
  • the user's gesture may be captured by the camera 121 provided in the display device 100.
  • the camera 121 may capture a motion made by a user in front of the TV.
  • the camera 121 may capture a hand motion of the user in front of or behind the mobile terminal.
  • the presentation of the stereoscopic image may be controlled according to a distance and a location relationship between the stereoscopic image and the gesture in operation S50.
  • the controller 180 may learn (i.e., determine) the location of the gesture made by the user. That is, an image captured by the camera 121 may be analyzed to thereby provide an analysis of the location of the gesture in the virtual space.
  • the location of the gesture may be a relative distance with respect to the body of a user or the display surface of the display 151.
  • the distance may refer to a location within a 3D space.
  • the distance may indicate a specific spot having x-y-z components from an origin such as a specific point on the body of the user.
  • the controller 180 may determine the location of the displayed stereoscopic image in the virtual space. That is, the controller 180 may determine the location of the stereoscopic image in the virtual space giving the user an impression that an image is displayed therein due to the effect of the stereo disparity. For example, this means that in the case where an image has positive (+) depth to look as if protruding toward the user from the display surface of the display 151, the controller 180 may determine the extent to which the image protrudes, and the location thereof.
  • the controller 180 may determine a direction in which the gesture approaches the stereoscopic image, that is, an approach direction of the gesture with respect to the stereoscopic image. That is, since the controller 180 learns the location of the gesture, and the location of the stereoscopic image in the virtual space, it can be determined which side (or face) of the stereoscopic image the gesture is made for. For example, in the case in which the stereoscopic image in the form of a polyhedron is displayed in the virtual space, the controller 180 may determine whether the user's gesture is directed toward the front side of the stereoscopic image or toward the lateral or rear side of the stereoscopic image.
  • a function corresponding to the approach direction may be executed. For example, in the case in which the stereoscopic image is approached from the front side thereof and touched, a function of activating the stereoscopic image may be executed. Also, in the case in which the stereoscopic image is approached from the rear side thereof and touched, a specific function corresponding to the stereoscopic image may be executed.
  • FIG. 6 illustrates an example of a stereoscopic image including a plurality of image objects 10 and 11.
  • the stereoscopic image depicted in FIG. 6 may be an image obtained by the camera 121.
  • the stereoscopic image includes a first image object 10 and a second image object 11.
  • the controller 180 may display an image acquired in real time by the camera 121 on the display 151 in the form of a preview.
  • the controller 180 may acquire one or more stereo disparities respectively corresponding to one or more of the image objects in operation S110.
  • the controller 180 may use the acquired left-eye and right-eye images to acquire the stereo disparity of each of the first image object 10 and the second image object 11.
  • FIG. 7 is a view for explaining a stereo disparity of an image object included in a stereoscopic image.
  • the first image object 10 may have a left-eye image 10a presented to the user's left eye 20a, and a right-eye image 10b presented to the right eye 20b.
  • the controller 180 may acquire a stereo disparity d1 corresponding to the first image object 10 on the basis of the left-eye image 10a and the right-eye image 10b.
  • the controller 180 may convert a 2D image, acquired by the camera 121, into a stereoscopic image by using a predetermined algorithm for converting a 2D image into a 3D image, and display the converted image on the display 151.
  • the controller 180 may acquire the respective stereo disparities of the first image object 10 and the second image object 11.
  • FIG. 8 is a view for comparing the stereo disparities of the image objects 10 and 11 depicted in FIG. 6.
  • the stereo disparity d1 of the first image object 10 is different from a stereo disparity d2 of the second image object 11. Furthermore, as shown in FIG. 8, since the stereo disparity d2 of the second image object 11 is greater than the stereo disparity d1 of the first image object 10, the second image object 11 is viewed as if being located farther away from the user than the first image object 10.
  • the controller 180 may acquire one or more graphic objects respectively corresponding to one or more of the image objects in operation S120.
  • the controller 180 may display the acquired one or more graphic objects on the display 151 so as to have a stereo disparity.
  • FIG. 9 illustrates the first image object 10 that may look as if protruding toward the user.
  • the locations of the left-eye image 10a and the right-eye image 10b on the display 151 may be opposite to those depicted in FIG. 7.
  • the images are also presented to the left eye 20a and the right eye 20b in the opposite manner.
  • the user can view the displayed image as if it is located in front of the display 151, that is, at the intersection of sights. That is, the user may perceive positive (+) depth in relation to the display 151. This is different from the case of FIG. 7 in which the user perceives negative (-) depth that gives the user an impression that the first image object 10 is displayed at the rear of the display 151.
  • the controller 180 may give the user the perception of various types of depth by displaying a stereoscopic image having positive (+) or negative (-) depth according to needs.
  • FIG. 10 is a flowchart illustrating the process of acquiring the user's gesture depicted in FIG. 5 in more detail.
  • FIG. 11 is a view depicting a gesture for control acquisition related to FIG. 10.
  • the acquiring of the user's gesture by the controller 180 of the display device in operation S30 of FIG. 5, may include initiating capturing using the camera 121 in operation S31.
  • the controller 180 may activate the camera 121.
  • the controller 180 may capture an image of the surroundings of the display device 100.
  • the controller 180 may control the display device 100 on the basis of a gesture made by a specific user having control. For example, this means that in the case where a plurality of people are located in front of the display device 100, the controller 180 may allow a specific function of the display device 100 to be performed on the basis of only a gesture made by a specific person having acquired control among those in front of the display device 100.
  • the control upon the display device 100 may be granted to a user U who has made a specific gesture.
  • the control may be granted to a user having made such a gesture.
  • the user with control may be tracked.
  • the granting and tracking of the control may be performed on the basis of an image captured by the camera 121 provided in the display device 100. That is, this means that the controller 180 analyzes the captured image to thereby continuously determine whether or not a specific user U exists, the specific user U performs a gesture required for control acquisition, the specific user U is moving, and the like.
  • the specific gesture of the user may be a gesture for executing a specific function of the display device 100 and terminating the specific function being performed.
  • the specific gesture may be a gesture to select various menus displayed as stereoscopic images by the display device 100.
  • Hereinafter, the operation in which the presentation of the stereoscopic image is controlled according to the user's gesture (S50 of FIG. 5) will be described in detail.
  • FIG. 12 is a flowchart of the process of controlling the presentation of the stereoscopic image associated with FIG. 5, in more detail.
  • FIGS. 13 and 14 are views depicting examples of a displayed stereoscopic image.
  • FIGS. 15 and 16 are views depicting gestures with respect to a stereoscopic image.
  • FIGS. 17 through 20 are views depicting changes in display (i.e., presentation) according to a gesture with respect to a stereoscopic image.
  • the display device 100 may appropriately control the presentation of the stereoscopic image in response to the specific gesture made by the user U with respect to the stereoscopic image.
  • the controller 180 may acquire the location of the stereoscopic image in the virtual space VS in operation S51.
  • the virtual space VS may refer to a space that gives the user U an impression that individual objects O1 to O3 of the stereoscopic image displayed by the display device 100 are located in a 3D space. That is, the virtual space VS may be a space where an image, being displayed substantially on the display 151, looks as if protruding toward the user U with positive (+) depth or receding away from the user U with negative (-) depth. Each of the objects O1 to O3 may look as if floating in the virtual space VS or being extended in a vertical or horizontal direction of the virtual space VS.
  • the user U may have an impression that he can take hold (grip) of the display objects O1 to O3 with his hand.
  • This effect is more clearly demonstrated in an object looking as if being located near the user U.
  • the user U may have a visual illusion that the first object O1 is located right in front of him. In this case, the user U may have an impression that he may hold the first object O1 with his hand H.
  • the controller 180 may learn the location of the stereoscopic image displayed in the virtual space VS. That is, based on the locations of the left-eye and right-eye images 10a and 10b of FIG. 7 on the display 151, the controller 180 may determine the location of the stereoscopic image in the virtual space VS, presented to the user U.
  • the location of the gesture in the virtual space VS may be acquired in operation S52.
  • the location of the gesture in the virtual space VS may be acquired by using the camera 121 provided in the display device 100. That is, the controller 180 may analyze an image acquired as the camera 121 continuously tracks an image of the user U.
  • the controller 180 may determine a first distance D1 from the display device 100 to the first object O1, a second distance D2 from the display device 100 to the hand H of the user U, and a third distance D3 from the display device 100 to the user U.
  • the first to third distances D1 to D3 may be determined by analyzing the image captured by the camera 121 and using the location of the displayed first object O1 that has been known to the controller 180.
  • the user U may make a specific gesture within the predetermined distance. For example, the user U may put out his hand H toward the first object O1 to make a motion associated with the first object O1.
  • the user U may stretch out his hand H to approach the first object O1 within a predetermined radius.
  • the controller 180 may determine a direction V in which the user U's hand H approaches, through an image analysis. That is, the controller 180 may determine whether the hand H approaches the first object O1 or another object adjacent to the first object O1, by using the trace of the gesture made by the hand H.
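  • One way to read operations S52 through S54 together (a sketch under assumptions, not the patent's own method) is to estimate the approach direction V from the hand's displacement between two analysed camera frames and test whether that motion points at the displayed object. The helper name and the cosine threshold below are hypothetical.

```python
import numpy as np

def approach_direction(hand_prev, hand_now, object_pos, cos_threshold=0.9):
    """Estimate whether the hand is approaching a stereoscopic object.

    hand_prev, hand_now -- hand positions (x, y, z) in two successive frames,
                           as estimated from the analysed camera images
    object_pos          -- rendered position of the object in the virtual space
    Returns (is_approaching, unit_motion_vector_or_None).
    """
    v = np.asarray(hand_now, float) - np.asarray(hand_prev, float)     # motion trace
    to_object = np.asarray(object_pos, float) - np.asarray(hand_now, float)
    if np.linalg.norm(v) == 0.0 or np.linalg.norm(to_object) == 0.0:
        return False, None
    v_hat = v / np.linalg.norm(v)
    t_hat = to_object / np.linalg.norm(to_object)
    # The hand counts as approaching when its motion points at the object.
    return float(np.dot(v_hat, t_hat)) >= cos_threshold, v_hat

# Hand moving straight toward an object along the z axis.
print(approach_direction((0, 0, 0.5), (0, 0, 0.4), (0, 0, 0.2)))
# -> (True, array([ 0.,  0., -1.]))
```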
  • an approach direction of the gesture with respect to the stereoscopic image may be acquired in operation S54.
  • the controller 180 may determine which side of the hand H faces the first object O1. For example, the controller 180 may determine whether the palm side P or the back side B of the hand faces the first object O1.
  • the controller 180 may determine in which direction the hand H1 or H2 approaches the first object O1. That is, the controller 180 may determine that the palm side P approaches the first object O1 in the case of a first hand H1 and determine that the back side B of the hand approaches the first object O1 in the case of a second hand H2.
  • the controller 180 may determine that the user U moves to take a grip on (i.e., hold) the first object O1.
  • the controller 180 may determine that the user U is not moving to take a grip on the first object O1. That is, this means that it may be determined which motion the user is to make with respect to a specific stereoscopic image, on the basis of the gesture of the user U, in particular, a hand motion. Accordingly, the controller 180 may enable the execution of a specific function on the basis of a specific hand motion. That is, the case of the first hand H1 may be linked to a motion of grabbing the first object O1, and the case of the second hand H2 may be linked to a motion of pushing the first object O1.
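  • The palm/back distinction described above can be reduced to comparing the palm's outward normal with the direction from the hand to the object; the dispatch below is a minimal sketch assuming such a normal is available from whatever hand-pose analysis the camera image provides, and the 0.5 threshold is arbitrary.

```python
import numpy as np

def classify_hand_motion(palm_normal, hand_pos, object_pos):
    """Map a hand pose to the intended action on a stereoscopic object.

    palm_normal -- assumed outward unit normal of the palm (from pose analysis)
    hand_pos    -- current hand position in the virtual space
    object_pos  -- position of the approached object
    """
    to_object = np.asarray(object_pos, float) - np.asarray(hand_pos, float)
    to_object /= np.linalg.norm(to_object)
    facing = float(np.dot(np.asarray(palm_normal, float), to_object))

    if facing > 0.5:       # palm side P faces the object -> grabbing motion
        return "grab"
    if facing < -0.5:      # back side B faces the object -> pushing motion
        return "push"
    return "undetermined"

print(classify_hand_motion((0, 0, -1), (0, 0, 0.3), (0, 0, 0)))   # -> 'grab'
```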
  • the controller 180 may allow stereoscopic images to have different characteristics according to shapes of the stereoscopic images and/or properties of entities respectively corresponding to the stereoscopic images. That is, a stereoscopic image representing a rigid object such as an iron bar, and a stereoscopic image representing an elastic object such as a rubber bar may have different responses to a user's gesture. In the case in which a stereoscopic image represents an entity such as an iron bar, the shape of the stereoscopic image may be maintained as it is even when a user makes a motion of holding the corresponding image. In contrast, in the case in which a stereoscopic image represents an entity such as a rubber bar, the shape thereof may be changed when a user makes a motion of holding the same.
  • the user U may make a gesture of taking hold of the first object O1 and moving it in a third direction A3.
  • the controller 180 may cause the stereoscopic image to look as if the first object O1 is held by the hand. Accordingly, the user can perform a function of moving the first object O1 in the third direction A3.
  • the controller 180 may allow the presentation of the response of the first object O1 to the user's gesture to be varied, according to the properties of an entity corresponding to the first object O1.
  • the user may move the first hand H1 toward the first object O1, that is, in the first direction A1.
  • the controller 180 may detect the motion of the first hand H1 through the camera 121.
  • the user's first hand H1 may virtually come into contact with the first object O1.
  • in the case in which the first object O1 represents a soft material such as a rubber bar, the controller 180 may create a visual effect of bending the first object O1 at the portion where the virtual contact with the first hand H1 has occurred.
  • the user may make a gesture toward liquid W contained in a bowl D in a fourth direction A4.
  • the controller 180 may create an effect of causing waves in the liquid W in response to the gesture of the hand H.
  • FIGS. 21 through 26 are views illustrating gestures with respect to a stereoscopic image in the form of a polyhedron.
  • the controller 180 of the display device 100 may perform a specific function in response to a user's gesture with respect to a specific side of a stereoscopic image in the form of a polyhedron with a plurality of sides (faces).
  • the controller 180 may display an object O that can give a user a stereoscopic impression caused by a stereo disparity.
  • the object O may have a cubic shape, and a specific function may be assigned to each side of the cubic shape. For example, a gesture of a touch on a first side S1 may execute a function of activating the object O, a touch on a second side S2 may execute a calling function, and a touch on a third side S3 may execute a message sending function. In such a manner, each side of the object O may have each function assigned thereto.
  • the user may make a gesture of touching the first side S1, the front side, in a fifth direction A5 by using his hand H. That is, the user makes a gesture of pushing his hand forward, away from his body.
  • the controller 180 may execute a function allocated to the first side S1.
  • the user may make a gesture of touching the second side S2 in a sixth direction A6.
  • the touching in the sixth direction A6 may be a gesture of touching the lateral side of the object O.
  • the controller 180 may execute a function corresponding to the second side S2. That is, different functions may be executed according to directions in which the user touches the object O.
  • the user may make a gesture of touching a fifth side S5 of the object O in a seventh direction A7.
  • the controller 180 may perform a function corresponding to the fifth side S5.
  • the user may make a gesture of touching the rear side of the object O, the third side S3 thereof, in an eighth direction A8 as shown in FIG. 23A, or a gesture of touching a sixth side S6, the bottom of the object O, in a ninth direction A9.
  • the controller 180 may perform an individual function in response to the gesture with respect to each side.
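  • The per-side behaviour of FIGS. 21 through 26 can be pictured as a lookup table from side identifiers to assigned functions, with multi-side gestures (such as holding the object) given their own entries. The sketch below is illustrative only; the function names are placeholders and the specific assignments merely echo the examples in the text.

```python
# Hypothetical assignment of functions to the sides of the cubic object O.
def activate_object():  print("object O activated")
def start_call():       print("calling function executed")
def send_message():     print("message sending function executed")
def record_channel():   print("recording the currently viewed channel")

SIDE_FUNCTIONS = {
    "S1": activate_object,   # front side
    "S2": start_call,        # lateral side
    "S3": send_message,      # rear side
}

# A gesture touching several sides at once (e.g. holding the object) may map
# to a third function rather than to the single-side functions it spans.
MULTI_SIDE_FUNCTIONS = {
    frozenset({"S1", "S3"}): record_channel,
}

def dispatch(touched_sides):
    """touched_sides -- set of side identifiers reached by the current gesture."""
    handler = MULTI_SIDE_FUNCTIONS.get(frozenset(touched_sides))
    if handler is None and len(touched_sides) == 1:
        handler = SIDE_FUNCTIONS.get(next(iter(touched_sides)))
    if handler is not None:
        handler()

dispatch({"S2"})           # -> calling function executed
dispatch({"S1", "S3"})     # -> recording the currently viewed channel
```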
  • the user may make a gesture of touching the front side of the object O. That is, the user may make a motion of approaching the object O from its front side and touching the first side S1.
  • Before the user approaches the object O in the fifth direction A5 and touches the first side S1, the object O may be in an inactivated state. For example, a selection on the object O may be restricted in order to prevent an unintended gesture from executing a function corresponding to the object O.
  • the user's touch on the first side S1 may enable the activation of the object O. That is, the execution of a function corresponding to the object O may be enabled by the gesture.
  • the object O, when activated, may be displayed brightly to indicate the activation.
  • the user may make a gesture of touching the lateral side of the object O. That is, the user may make a gesture of touching the second side S2, one of lateral sides of the object O, in the sixth direction A6.
  • the controller 180 of the display device 100 may make different responses according to which spot on the displayed object the gesture is intended for.
  • pop-up objects P related to channel changes may be displayed in response to the user's gesture with respect to the second side S2. This is different from the case in which the touch on the first side S1 executes the function of activating the object O. Even when the object O is in an inactivated state, the function may be executed by the gesture with respect to the second side S2.
  • the user may make a gesture of holding the object O.
  • the gesture of holding the object O may bring about a similar result to that of the gestures of touching the plurality of sides.
  • the user may make a gesture of simultaneously touching the first side S1 and the third side S3 of the object O from the lateral side of the object O.
  • a different function from that in the case of the gesture of separately touching each side may be executed. For example, assuming that a first function is executed by a gesture with respect to the first side S1 and a second function is executed by a gesture with respect to the third side S3, a gesture of simultaneously touching the first and third sides S1 and S3 may execute a third function.
  • the third function may be totally different from the first and second functions or may be the simultaneous execution of the first and second functions.
  • a function of recording a currently viewed broadcasting channel may be executed.
  • FIGS. 27 through 31 are views illustrating a pointer for selection in a stereoscopic image.
  • the display device may display a pointer P corresponding to a gesture of a user U.
  • the pointer P is displayed so as to give the user an impression of 3D distance.
  • first to third objects O1 to O3 may be displayed in the virtual space.
  • the user U may select the first to third objects O1 to O3 directly by using his hand H, or by using the pointer P.
  • the pointer P may be displayed in the space at a predetermined distance from the user U.
  • the pointer P may move toward the third object O3 in response to the user's hand motion.
  • the movement of the pointer P may be determined according to a distance between a preset reference location and the gesture. For example, in the case in which the body of the user U is set as a reference location, if the hand H moves closer to or farther away from the display device 100, the pointer P may move accordingly.
  • the reference location may be set by the user.
  • the pointer P may move toward the third object O3 in a direction corresponding to an eleventh direction A11.
  • the pointer P may undergo size changes according to the distance from the reference location.
  • the pointer P at the reference location may have a size of a first pointer P1.
  • the pointer P having the size of the first pointer P1 at the reference location may become bigger to have a size of a second pointer P2 as the hand H moves in a twelfth direction A12. That is, the pointer P increases in size as it approaches the user. Furthermore, the pointer P having the size of the first pointer P1 at the reference location may become smaller to have a size of a third pointer P3 as the hand H moves in a thirteenth direction A13. That is, the pointer P may decrease in size as it moves farther away from the user.
  • the perspective caused by a stereo disparity may be more clearly presented. Also, this may provide a guide to the depth of an object selectable by the user's gesture.
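  • The pointer behaviour of FIGS. 29 and 30 amounts to mapping the hand's offset from the preset reference location to a rendered depth and size for the pointer; the gains below are assumptions chosen only to make the mapping concrete, and the direction convention (hand toward the display pushes the pointer deeper and shrinks it) is one possible reading of the figures.

```python
def pointer_from_hand(hand_z, reference_z, base_size=1.0, depth_gain=2.0):
    """Derive a pointer depth offset and size from the hand-to-reference distance.

    hand_z, reference_z -- distances from the display surface, in metres;
                           the user's body is one possible reference location
    Returns (pointer_depth_offset, pointer_size).
    """
    offset = reference_z - hand_z          # > 0 when the hand moves toward the display
    depth = depth_gain * offset            # pointer is pushed farther into the scene
    # The pointer grows as it comes toward the user and shrinks as it recedes.
    size = base_size * (1.0 - 0.5 * offset / max(reference_z, 1e-6))
    return depth, max(size, 0.1)

print(pointer_from_hand(hand_z=0.4, reference_z=0.6))
# hand moved 20 cm toward the display -> pointer drawn deeper and smaller
```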
  • the pointer P may change according to the direction of a gesture made by the user.
  • the hand H of the user may move in fourteenth to seventeenth directions A14 to A17.
  • the controller 180 may change the shape of the pointer accordingly and display the same. That is, when the hand H moves in the fourteenth direction A14, a direction in which the hand H moves farther away from the user, the first pointer P1 having an arrow pointing in the fourteenth direction A14 may be displayed. That is, the first to fourth pointers P1 to P4 may have shapes respectively corresponding to the fourteenth to seventeenth directions A14 to A17.
  • the pointer may indicate whether the hand H is moving or stopped. That is, while the user stops making a motion at a specific location, a circular fifth pointer P5 that indicates no direction may be displayed. When the hand H moves, the first to fourth pointers P1 to P4 may be displayed accordingly.
  • FIGS. 32 through 34 are views illustrating the process of selecting any one of a plurality of stereoscopic images.
  • when a gesture for selecting a specific image from among a plurality of stereoscopic images is input, the controller 180 of the display device 100 changes the presentation of the other stereoscopic images. Accordingly, the selection of the specific stereoscopic image can be facilitated.
  • a plurality of objects O may be displayed adjacent to each other in a virtual space. That is, objects A to I may be displayed.
  • the user U may make a gesture of selecting object E by moving his hand H.
  • the controller 180 may cause the objects other than the object E to look as if moving away from the object E. That is, when it is determined that the user U intends to select a specific object, objects other than the specific object are caused to move away from the specific object, thereby reducing the possibility of selecting an unintended object.
  • the controller 180 may release the display of objects other than the specific object. That is, objects other than the specific object may be made to disappear. Furthermore, when the user stops making the gesture, the disappeared objects may be displayed again.
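  • The "move the other objects away" behaviour of FIG. 33 is essentially a small repulsion applied to every object except the selection target; a minimal sketch, assuming object positions are plain 3D vectors and an arbitrary displacement distance:

```python
import numpy as np

def spread_other_objects(objects, target_id, push=0.15):
    """Move every object except the selected one away from it.

    objects   -- dict of id -> (x, y, z) position in the virtual space
    target_id -- id of the object the user's gesture is selecting
    push      -- assumed displacement distance, in metres
    """
    target = np.asarray(objects[target_id], float)
    moved = {}
    for oid, pos in objects.items():
        pos = np.asarray(pos, float)
        if oid == target_id:
            moved[oid] = pos
            continue
        away = pos - target
        norm = np.linalg.norm(away)
        step = away / norm * push if norm > 0 else np.array([push, 0.0, 0.0])
        moved[oid] = pos + step        # neighbour slides away from the target
    return moved

objs = {"D": (-0.1, 0, 0), "E": (0.0, 0, 0), "F": (0.1, 0, 0)}
print(spread_other_objects(objs, "E")["F"])   # -> [0.25 0.   0.  ]
```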
  • FIGS. 35 and 36 are views illustrating the operation of a feedback unit.
  • the display device 100 may give the user U feedback on a gesture.
  • the feedback may be recognized through an auditory sense and/or a tactile sense.
  • the controller 180 may provide the user with corresponding sounds or vibrations.
  • the feedback for the user may be provided by using a feedback unit of the display device 100, such as a directional speaker SP or an ultrasonic generator US.
  • the directional speaker SP may selectively provide a sound to a specific user of first and second users U1 and U2. That is, only a selected user may be provided with sound through the directional speaker SP capable of selectively determining the propagation direction or transmission range of the sound.
  • the display device 100 may allow an object O to be displayed in a virtual space.
  • the controller 180 may give the user U feedback corresponding to the gesture.
  • the controller may provide the user U with sounds through the directional speaker SP or vibrations through the ultrasonic generator US.
  • the ultrasonic generator US may generate ultrasonic waves directed toward a specific point.
  • when the ultrasonic waves reach the user's body, the user U may feel pressure. By controlling the pressure applied to the user, the user may perceive it as vibration.
  • FIGS. 37 through 39 are views illustrating the operation of a display device according to another exemplary embodiment of the present invention.
  • the display device 100 may be a mobile terminal which can be carried by a user.
  • the display device 100 is a portable device
  • a user's gesture with respect to not only the front side of the display device 100 but also the rear side thereof may be acquired and corresponding functions may be performed accordingly.
  • the display device 100 may display a stereoscopic image through the display 151.
  • the first to third objects O1 to O3 of the stereoscopic image may be displayed as if protruding or receding from a display surface of the display 151.
  • the first object O1 giving the perception of the same depth as that of the display surface of the display 151, the second object O2 looking as if protruding toward the user, and the third object O3 looking as if receding away from the user may be displayed in the virtual space.
  • the body of the display device 100 may be provided with a first camera 121a facing the front side and a second camera 121b facing the back side.
  • the user may make a gesture with respect to the second object O2 displayed in front of the display device 100. That is, the user may make a gesture of touching or holding the second object O2 with his hand H.
  • the user's gesture with respect to the second object O2 may be captured by the first camera 121a facing the front side.
  • the user may make a gesture with respect to the third object O3 displayed at the rear of the display device 100. That is, the user may make a gesture of touching or grabbing the third object O3 with the hand H.
  • the user's gesture with respect to the third object O3 may be captured by the second camera 121b facing the back side.
  • the display device 100 can be controlled upon acquiring not only a gesture made in front of the display device 100 but also a gesture made at the rear of the display device 100, various operations can be made according to the depth of a stereoscopic image.
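  • For the portable embodiment of FIGS. 37 through 39, the choice of which camera's gesture stream to consult can follow the sign of the target object's rendered depth, positive values protruding toward the user and negative values receding behind the display, as in the description above. The sketch below is a hedged illustration of that routing, not an implementation from the patent.

```python
def camera_for_object(object_depth):
    """Pick the camera whose field of view covers gestures aimed at an object.

    object_depth -- rendered depth of the stereoscopic object: positive when it
                    appears to protrude toward the user (in front of the display),
                    negative when it appears to recede behind the display.
    Returns the camera expected to see the corresponding gesture.
    """
    # Gestures toward protruding objects are made in front of the device and are
    # captured by the front-facing camera 121a; gestures toward receding objects
    # are made behind the device and are captured by the rear-facing camera 121b.
    return "first camera 121a (front)" if object_depth >= 0 else "second camera 121b (rear)"

print(camera_for_object(+0.05))   # e.g. second object O2 -> first camera 121a (front)
print(camera_for_object(-0.05))   # e.g. third object O3  -> second camera 121b (rear)
```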

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention relates to a display device and a method of controlling the same. According to the display device and the control method, a camera captures a gesture made by a user, a display displays a stereoscopic image, and a controller controls the presentation of the stereoscopic image in response to a distance between the gesture and the stereoscopic image in a virtual space and an approach direction of the gesture with respect to the stereoscopic image. Accordingly, the presentation of the stereoscopic image can be controlled in response to a distance and an approach direction with respect to the stereoscopic image.
PCT/KR2011/001919 2011-03-21 2011-03-21 Dispositif d'affichage et procédé de commande associé WO2012128399A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN201180069433.7A CN103430215B (zh) 2011-03-21 2011-03-21 显示装置及其控制方法
DE112011104939.0T DE112011104939T5 (de) 2011-03-21 2011-03-21 Anzeigevorrichtung und Verfahren zum Steuern derselben
PCT/KR2011/001919 WO2012128399A1 (fr) 2011-03-21 2011-03-21 Dispositif d'affichage et procédé de commande associé

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/KR2011/001919 WO2012128399A1 (fr) 2011-03-21 2011-03-21 Dispositif d'affichage et procédé de commande associé

Publications (1)

Publication Number Publication Date
WO2012128399A1 true WO2012128399A1 (fr) 2012-09-27

Family

ID=46879527

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2011/001919 WO2012128399A1 (fr) 2011-03-21 2011-03-21 Dispositif d'affichage et procédé de commande associé

Country Status (3)

Country Link
CN (1) CN103430215B (fr)
DE (1) DE112011104939T5 (fr)
WO (1) WO2012128399A1 (fr)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017113674A1 (fr) * 2015-12-31 2017-07-06 乐视控股(北京)有限公司 Procédé et système pour réaliser une commande de détection de mouvement sur la base d'un dispositif intelligent, et dispositif intelligent
WO2018222115A1 (fr) * 2017-05-30 2018-12-06 Crunchfish Ab Activation améliorée d'un objet virtuel
WO2024026638A1 (fr) * 2022-08-01 2024-02-08 Huawei Technologies Co., Ltd. Interaction haptique avec un objet 3d sur un dispositif d'affichage 3d à l'œil nu
WO2024127004A1 (fr) * 2022-12-13 2024-06-20 Temporal Research Ltd Procédé d'imagerie et dispositif d'imagerie

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102015208136A1 (de) * 2015-04-30 2016-11-03 Krones Aktiengesellschaft Verfahren und Vorrichtung zur Zuförderung, Bereitstellung und zum Austausch von Rollen mit Verpackungsmaterial in einer Verpackungsmaschine
CN105022452A (zh) * 2015-08-05 2015-11-04 合肥联宝信息技术有限公司 一种具有3d 显示效果的笔记本电脑
CN105824429A (zh) * 2016-05-12 2016-08-03 深圳市联谛信息无障碍有限责任公司 基于红外传感器的读屏应用指令输入方法及装置
CN107707900A (zh) * 2017-10-17 2018-02-16 西安万像电子科技有限公司 多媒体内容的处理方法、装置和系统

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0905644A2 (fr) * 1997-09-26 1999-03-31 Matsushita Electric Industrial Co., Ltd. Dispositif de reconnaissance de gestes de la main
JP2002108196A (ja) * 2000-10-03 2002-04-10 Matsushita Electric Works Ltd 避難仮想体験システム
US20080170776A1 (en) * 2007-01-12 2008-07-17 Albertson Jacob C Controlling resource access based on user gesturing in a 3d captured image stream of the user

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4267648B2 (ja) * 2006-08-25 2009-05-27 株式会社東芝 インターフェース装置及びその方法
US8555207B2 (en) * 2008-02-27 2013-10-08 Qualcomm Incorporated Enhanced input using recognized gestures
WO2010147600A2 (fr) * 2009-06-19 2010-12-23 Hewlett-Packard Development Company, L, P. Instruction qualifiée
CN101807114B (zh) * 2010-04-02 2011-12-07 浙江大学 一种基于三维手势的自然交互方法

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0905644A2 (fr) * 1997-09-26 1999-03-31 Matsushita Electric Industrial Co., Ltd. Dispositif de reconnaissance de gestes de la main
JP2002108196A (ja) * 2000-10-03 2002-04-10 Matsushita Electric Works Ltd 避難仮想体験システム
US20080170776A1 (en) * 2007-01-12 2008-07-17 Albertson Jacob C Controlling resource access based on user gesturing in a 3d captured image stream of the user

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017113674A1 (fr) * 2015-12-31 2017-07-06 乐视控股(北京)有限公司 Procédé et système pour réaliser une commande de détection de mouvement sur la base d'un dispositif intelligent, et dispositif intelligent
WO2018222115A1 (fr) * 2017-05-30 2018-12-06 Crunchfish Ab Activation améliorée d'un objet virtuel
US11467708B2 (en) 2017-05-30 2022-10-11 Crunchfish Gesture Interaction Ab Activation of a virtual object
WO2024026638A1 (fr) * 2022-08-01 2024-02-08 Huawei Technologies Co., Ltd. Interaction haptique avec un objet 3d sur un dispositif d'affichage 3d à l'œil nu
WO2024127004A1 (fr) * 2022-12-13 2024-06-20 Temporal Research Ltd Procédé d'imagerie et dispositif d'imagerie

Also Published As

Publication number Publication date
CN103430215A (zh) 2013-12-04
CN103430215B (zh) 2017-03-22
DE112011104939T5 (de) 2014-01-23

Similar Documents

Publication Publication Date Title
WO2012128399A1 (fr) Dispositif d'affichage et procédé de commande associé
WO2015053449A1 (fr) Dispositif d'affichage d'image de type lunettes et son procédé de commande
WO2015190666A1 (fr) Terminal mobile et son procédé de commande
WO2012144666A1 (fr) Dispositif d'affichage et procédé de commande associé
WO2017209533A1 (fr) Dispositif mobile et procédé de commande correspondant
US20120242793A1 (en) Display device and method of controlling the same
WO2017086508A1 (fr) Terminal mobile et procédé de commande associé
WO2015046636A1 (fr) Terminal mobile et son procédé de commande
WO2019147021A1 (fr) Dispositif de fourniture de service de réalité augmentée et son procédé de fonctionnement
WO2016195147A1 (fr) Visiocasque
WO2018070624A2 (fr) Terminal mobile et son procédé de commande
WO2017026555A1 (fr) Terminal mobile
WO2018048092A1 (fr) Visiocasque et son procédé de commande
WO2015133701A1 (fr) Terminal mobile et son procédé de commande
WO2018093005A1 (fr) Terminal mobile et procédé de commande associé
WO2015174611A1 (fr) Terminal mobile et son procédé de commande
WO2017204498A1 (fr) Terminal mobile
EP2982042A1 (fr) Terminal et son procédé de commande
WO2017026554A1 (fr) Terminal mobile
WO2014065595A1 (fr) Dispositif d'affichage d'image et procédé de commande associé
WO2018101508A1 (fr) Terminal mobile
WO2016013692A1 (fr) Visiocasque et procédé de commande associé
WO2016027932A1 (fr) Terminal mobile du type lunettes et son procédé de commande
WO2015083873A1 (fr) Terminal mobile, et couvercle correspondant
WO2017126709A1 (fr) Terminal mobile et procédé de commande associé

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11861893

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 112011104939

Country of ref document: DE

Ref document number: 1120111049390

Country of ref document: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11861893

Country of ref document: EP

Kind code of ref document: A1