WO2012105703A1 - Display device, display image generation method, program and recording medium - Google Patents

Display device, display image generation method, program and recording medium

Info

Publication number
WO2012105703A1
Authority
WO
WIPO (PCT)
Prior art keywords
display screen
display
proximity
user
display device
Prior art date
Application number
PCT/JP2012/052551
Other languages
English (en)
Japanese (ja)
Inventor
Yasufumi Hagiwara (萩原 泰文)
Original Assignee
Sharp Kabushiki Kaisha (シャープ株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sharp Kabushiki Kaisha (シャープ株式会社)
Publication of WO2012105703A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012Head tracking input arrangements
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0093Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B30/00Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
    • G02B30/20Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes
    • G02B30/26Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/366Image reproducers using viewer tracking

Definitions

  • the present invention relates to a display device that displays an object on a display screen in a stereoscopic manner.
  • the present invention also relates to a display image generation method executed by such a display device, a program for operating such a display device, and a recording medium on which such a program is recorded.
  • an operation object such as a button constituting the user interface is displayed in a stereoscopic manner so as to improve the reality of the user interface.
  • a technique for improving operability when selecting an object displayed in a stereoscopic manner has been proposed.
  • Patent Document 1 discloses an image display system in which an object selected with a fingertip or a pen tip of an input pen is moved and displayed in the depth direction in accordance with the pressing amount of the touch panel.
  • Patent Document 2 discloses an operation panel device in which a stereoscopic parallax image is displayed on a display unit and in which, when a sensor detects the proximity of the pressing means to the operation surface before the pressing means touches the touch panel, the stereoscopic parallax image is switched to a planar image and displayed on the display unit.
  • Patent Document 1: Japanese Patent Laid-Open No. 2006-293878 (published October 26, 2006)
  • Patent Document 2: Japanese Patent Laid-Open No. 2004-280496 (published October 7, 2004)
  • the display target position of an object refers to the position at which the display device tries to display the object (three-dimensionally) in front of or behind the display screen, in other words, the position at which the display device tries to make the user visually recognize the object.
  • as shown in the figure, the conventional display device draws the object in the right-eye image to the left of the point M and the object in the left-eye image to the right of the point M (with parallax).
  • the conventional display device then moves the object drawing position dr in the right-eye image and the object drawing position dl in the left-eye image toward the point M, as shown in the figure (reducing the parallax).
  • finally, the conventional display device draws both the object in the right-eye image and the object in the left-eye image at the point M, as shown in the figure (removing the parallax).
  • the distance from the point M to the drawing position dr is kept the same as the distance from the point M to the drawing position dl (the parallax is given symmetrically).
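  • as a minimal illustration of this conventional symmetric parallax reduction (in Python; the positions, step count, and variable names are illustrative assumptions, not values from the patent):

      # Both drawing positions converge on the touch point M by equal amounts,
      # i.e. the parallax is reduced symmetrically and finally removed.
      m = 0.30                  # point M on the screen touched by the finger (meters)
      dr, dl = 0.27, 0.33       # drawing positions in the right-/left-eye images
      step = (m - dr) / 3       # equal symmetric step toward M
      for _ in range(3):
          dr += step            # right-eye drawing position moves toward M
          dl -= step            # left-eye drawing position moves toward M
      print(dr, dl)             # both end at M = 0.30: parallax removed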
  • the present invention has been made in view of the above problems, and an object of the present invention is to provide a display device capable of giving the user a natural impression when touching an object displayed in a stereoscopic manner.
  • in order to solve the above problems, a display device according to the present invention is a display device that displays an object on a display screen so that the object can be viewed stereoscopically, and comprises: proximity position detection means for detecting a proximity position, which is the position of an object close to the display screen; proximity position determination means for determining whether the proximity position detected by the proximity position detection means coincides with a display target position, which is the position where the object is to be displayed, or coincides with a symmetrical position, which is symmetrical to the display target position with respect to the display screen; and image generation means for moving the drawing position of the object in the right-eye image and the left-eye image displayed on the display screen to the left and right each time, in the determination result by the proximity position determination means, the proximity position coincides with the display target position or the symmetrical position, in order to bring the display target position closer to the display screen by a predetermined length; wherein the amount of movement by which the image generation means moves the drawing position of the object to the left and right is set so as to decrease as the display target position approaches the display screen.
  • in order to solve the above problems, a display image generation method according to the present invention is a display image generation method executed by a display device that displays an object on a display screen so that the object can be viewed stereoscopically, and comprises: a proximity position detection step of detecting a proximity position, which is the position of an object close to the display screen; a proximity position determination step of determining whether the proximity position of the object detected in the proximity position detection step coincides with the display target position where the object is to be displayed, or coincides with a symmetrical position that is symmetrical to the display target position with respect to the display screen; and an image generation step of moving the drawing position of the object in the right-eye image and the left-eye image displayed on the display screen to the left and right each time, in the determination result of the proximity position determination step, the proximity position coincides with the display target position or the symmetrical position; wherein the amount of movement by which the drawing position of the object is moved to the left and right in the image generation step, in order to bring the display target position closer to the display screen by a predetermined length, is set so as to decrease as the display target position approaches the display screen.
  • the object here refers to, for example, a display object constituting a GUI (graphical user interface).
  • an operation object for accepting selection by the user, such as a button or a switch, is a representative example of the object here.
  • the visual recognition position of the object may move obliquely with respect to the display screen (see FIG. 19 (d)). For this reason, a deviation may occur between the visual recognition position of the object and the display target position of the object.
  • FIG. 1 is a block diagram illustrating a main configuration of a display device according to Embodiment 1.
  • FIG. 2 is a cross-sectional view of the display device of FIG. 1.
  • FIG. 3A shows a front view and a side view of the display when the finger is close to the display screen.
  • FIG. 3B is a front view of the display when the finger is in contact with the display screen
  • FIG. 3C is a front view and a side view of the display when the object is displayed on the display screen in a stereoscopic manner.
  • FIG. 4A is a diagram illustrating the drawing position of the object drawn on the right-eye image with respect to the object position,
  • FIG. 4B is a diagram illustrating the drawing position of the object drawn on the left-eye image with respect to the object position.
  • FIGS. 7A and 7B show an example of the position of the user's finger, the object position, and the drawing position when the user's finger is close to the display screen.
  • FIG. 8A is a diagram showing the drawing position of the object drawn on the right-eye image with respect to the object when the object position is located in the space behind the display screen,
  • FIG. 8B is a diagram showing the drawing position of the object drawn on the left-eye image with respect to the object when the object position is located in the space behind the display screen.
  • FIGS. 9A and 9B show other examples of the position of the user's finger, the object position, and the drawing position when the user's finger is close to the display screen.
  • FIG. 9C is a diagram showing another example of the position of the user's finger, the object position, and the drawing position when the user's finger contacts the display screen.
  • FIG. 10 is a block diagram illustrating a main configuration of a display device according to a third embodiment.
  • FIG. 15 is a block diagram illustrating a main configuration of a display device according to a fourth embodiment.
  • FIG. 16 is a cross-sectional view of the display device of FIG. 15.
  • FIG. 18A shows an example of the position of the user's finger and the object position when the user's finger is close to the display screen and the distance from the proximity position to the display screen is equal to or greater than the threshold.
  • FIG. 18B is a diagram illustrating an example of the position of the user's finger and the object position when the distance from the proximity position to the display screen is smaller than the threshold value.
  • FIGS. 19A to 19D are diagrams for explaining problems of the conventional technology that may occur when the display target position of the object is not set in front of the viewing position (the position of the user's head).
  • FIG. 1 is a block diagram showing a main configuration of a display device 1 according to the present embodiment.
  • the display device 1 includes a display screen 10, a camera (first input device) 20, a touch panel (third input device) 30, and a control unit 100.
  • Display screen 10 displays information and provides it to the user.
  • the display screen 10 is a display screen of a display panel such as an LCD, an EL (Electroluminescence) display, or a PDP (Plasma Display Panel).
  • the display screen 10 can display an object in a stereoscopic manner. Specifically, based on the principle of binocular parallax, it is possible to perform display so that stereoscopic viewing can be obtained by simply looking at the screen without wearing so-called 3D glasses.
  • in the present embodiment, a display screen 10 that does not require the wearing of so-called 3D glasses will be described as an example, but the present invention is not limited to this; it may be a display screen that requires the wearing of 3D glasses.
  • the object refers to a display object constituting a GUI (Graphical User Interface).
  • operation objects for accepting selection by the user, such as buttons and switches, are representative examples of objects here.
  • in the following, the position where the display device 1 tries to display an object, that is, the position where the display device 1 tries to make the user visually recognize the object, is referred to as the "object position" (the object position corresponds to the "display target position" in the claims).
  • This object position is set to a place where the user can touch. More specifically, it is set at a location approximately 50 cm away from the user.
  • the camera 20 is an input device that detects an object close to the display screen 10 and transmits a signal indicating the position of the detected object (hereinafter referred to as “proximity position”) to the control unit 100.
  • the touch panel 30 is an input device that detects an object in contact with the display screen 10 and transmits a signal indicating the position of the detected object (hereinafter referred to as “contact position”) to the control unit 100.
  • the touch panel 30 is a contact type so-called capacitive touch panel.
  • the touch panel 30 is not limited to this, and may be a so-called resistive film type.
  • the touch panel 30 may be a so-called non-contact type touch panel such as a liquid crystal panel with a built-in optical sensor.
  • FIG. 2 is a cross-sectional view of the display device 1.
  • the bezel 2 is disposed around the display panel having the display screen 10. If the plane on which the display screen 10 is arranged is the xy plane, the touch panel 30 is arranged on the z-axis positive direction side of the display screen 10 so as to cover the display screen 10.
  • the camera 20 is disposed inside the bezel 2 that protrudes in the positive z-axis direction from the touch panel 30.
  • the position where the camera 20 is arranged is not limited to this, and may be a position where an object close to the display screen 10 can be detected. Further, the camera 20 may be configured integrally with the display device 1 or may be configured separately from the display device 1.
  • the bezel 2 has a shape protruding from the touch panel 30 by about 6 cm in the z-axis direction. That is, the camera 20 detects an object within approximately 6 cm of the touch panel 30 as an object close to the display screen 10.
  • the distance between the object detected by the camera 20 and the touch panel 30 is an example, and the distance can be set according to the size of the display screen 10 and the installation location.
  • the control unit 100 is responsible for processing that the display device 1 executes internally.
  • the control unit 100 includes a proximity position detection unit (proximity position detection means) 110, a contact position detection unit (contact position detection means) 120, an object position storage unit 130, a determination unit 140, an image generation unit (image generation means) 150, an image output unit 160, and an execution unit (execution means) 170.
  • the determination unit 140 includes a proximity position determination unit (proximity position determination means) 141 and a contact position determination unit (contact position determination means) 142.
  • the proximity position detection unit 110 detects a proximity position that is the position of an object close to the display screen 10 using an input signal from the camera 20. Specifically, the position of the user's finger in proximity to the display screen 10 is detected as a proximity position using a signal (input signal) indicating the proximity position transmitted from the camera 20. The proximity position detection unit 110 supplies the detected proximity position to the proximity position determination unit 141.
  • the contact position detection unit 120 detects a contact position that is the position of an object that has touched the display screen 10 using an input signal from the touch panel 30. Specifically, the position of the user's finger that touches the display screen 10 is detected as a contact position using a signal (input signal) indicating the contact position transmitted from the touch panel 30. The contact position detection unit 120 supplies the detected contact position to the contact position determination unit 142.
  • the object position is stored in the object position storage unit 130.
  • the object position of this object is a predetermined position.
  • the object position may be a position obtained with respect to the display screen 10 from a predetermined optimum viewing position and the position (drawing position) of the object drawn on the right-eye image and the left-eye image.
  • the object position storage unit 130 may be configured integrally with the display device 1 or may be configured separately from the display device 1.
  • the proximity position determination unit 141 acquires the object position from the object position storage unit 130.
  • the proximity position determination unit 141 determines whether the proximity position supplied from the proximity position detection unit 110 matches the acquired object position.
  • the proximity position determination unit 141 notifies the image generation unit 150 of the determination result.
  • the contact position determination unit 142 acquires the object position from the object position storage unit 130.
  • the contact position determination unit 142 determines whether or not the contact position supplied from the contact position detection unit 120 matches the acquired object position.
  • the contact position determination unit 142 supplies the determination result to the execution unit 170.
  • the image generation unit 150 generates a right-eye image and a left-eye image to be displayed on the display screen 10 in order to realize stereoscopic viewing of the object.
  • the image generation unit 150 generates the right-eye image and the left-eye image by moving the drawing position of the object in the right-eye image and the left-eye image to the left and right so that the object position moves perpendicularly to the display screen 10.
  • the image generation unit 150 supplies the generated right eye image and left eye image to the image output unit 160.
  • the image generation unit 150 stores the object position moved perpendicular to the display screen 10 in the object position storage unit 130 as a new object position.
  • the image output unit 160 displays the right-eye image and the left-eye image supplied from the image generation unit 150. As a result, the display screen 10 displays the object in a stereoscopic manner.
  • the execution unit 170 executes the function indicated by the object when the determination result by the contact position determination unit 142 is true, that is, when the contact position matches the object position.
  • FIG. 3A is a diagram illustrating a front view and a side view of the display screen 10 when the finger 6 approaches the display screen 10.
  • the proximity position detection unit 110 detects the distance between the finger 6 and the display screen 10.
  • the horizontal direction of the display screen 10 is the x axis
  • the vertical direction of the display screen 10 is the y axis.
  • FIG. 3B is a diagram illustrating a front view and a side view of the display screen 10 when the finger 6 contacts the display screen 10.
  • the contact position detection unit 120 detects (x2, y2, 0) as the contact position of the finger 6 from the position (x2, y2) of the finger 6 on the xy plane.
  • FIG. 3C is a diagram illustrating a front view and a side view of the display screen 10 when the object o is displayed on the display screen 10 so as to be stereoscopically viewable.
  • the object position is predetermined as (xo, yo, zo)
  • the object o is displayed at the position shown in FIG. 3(c).
  • zo is the distance of the object position from the display screen 10 (zo> 0).
  • the object position on a plane (xy plane) parallel to the display screen 10 is represented as (xo, yo), but the object is not displayed at only this one point; it is displayed in an area that includes (xo, yo).
  • the object position on the xy plane is represented by using a representative point such as (xo, yo).
  • f (x, y, z) represents a predetermined optimum viewing position with respect to the display screen 10
  • D represents a distance between the optimum viewing position and the display screen 10.
  • W indicates the width of the display screen 10
  • s indicates the baseline length, which is the distance between the user's right eye fr and left eye fl.
  • fr ′ represents the intersection of the perpendicular line from the right eye fr to the display screen 10 and the display screen 10
  • o ′ represents the intersection of the perpendicular line from the object o to the display screen 10 and the display screen 10.
  • fl ′ represents the intersection of the vertical line from the left eye fl to the display screen 10 and the display screen 10
  • o ′ represents the intersection of the vertical line from the object o to the display screen 10 and the display screen 10.
  • the object position is moved perpendicularly to the display screen 10 in a direction approaching the display screen 10.
  • the movement amount at this time is an amount by which the object o can follow the finger 6; in the present embodiment, this movement amount is Δ. That is, the image generation unit 150 changes the object position to (xo, yo, zo − Δ).
  • the image generation unit 150 substitutes the changed object position (xo, yo, zo − Δ) into the expressions (A2) and (A4), thereby calculating the drawing positions of the object dr and the object dl from the object position.
  • the image generation unit 150 can thus generate a right-eye image and a left-eye image in which the object is drawn at the positions represented by the above formulas (A2) and (A4).
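  • the expressions (A2) and (A4) themselves are not reproduced in this extract; the following Python sketch reconstructs the drawing positions from the similar-triangles geometry defined above (the eye placement at W/2 ± s/2 and all numeric values are assumptions, not the patent's verbatim formulas):

      def drawing_positions(obj, eye_right, eye_left):
          """Project the object position onto the screen plane (z = 0) along the
          ray from each eye; eyes sit at distance D in front of the screen and
          z > 0 points toward the viewer. Works for objects in front of the
          screen (0 < zo < D) and, as in the later embodiment, behind it (zo < 0)."""
          xo, yo, zo = obj
          def project(eye):
              ex, ey, ez = eye              # ez is the viewing distance D
              t = ez / (ez - zo)            # ray parameter where the ray reaches z = 0
              return (ex + t * (xo - ex), ey + t * (yo - ey))
          return project(eye_right), project(eye_left)

      W, H, D, s = 0.60, 0.40, 0.50, 0.065  # meters; illustrative values only
      fr = (W / 2 + s / 2, H / 2, D)        # right eye (placement is an assumption)
      fl = (W / 2 - s / 2, H / 2, D)        # left eye
      dr, dl = drawing_positions((0.30, 0.20, 0.10), fr, fl)
      # dr ≈ (0.292, 0.2), dl ≈ (0.308, 0.2): the right-eye drawing lies to the
      # left of the left-eye drawing for an object in front of the screen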
  • FIG. 5 is a diagram illustrating a drawing position of an object in the right-eye image and the left-eye image displayed on the display screen 10 with respect to the object position.
  • initially, the drawing position of the object in the right-eye image and the left-eye image displayed on the display screen 10 is the position o1′.
  • when the object position moves from o1 to o2, the drawing position of the object is moved from o1′ to o2′.
  • o2 is a position that is closer to the display screen 10 by Δ than the position of o1.
  • the movement amount of the drawing position of the object at this time is α, as shown in FIG. 5.
  • next, the drawing position of the object is moved from o2′ to o3′.
  • o3 is a position that is closer to the display screen 10 by Δ than the position of o2.
  • the movement amount of the drawing position of the object at this time is β, as shown in FIG. 5.
  • the movement amount α from the drawing position o1′ to o2′ and the movement amount β from o2′ to o3′ have a relationship of α > β.
  • o4 is a position that is closer to the display screen 10 by Δ than the position of o3.
  • the movement amount of the drawing position of the object at this time (from o3′ to o4′) is γ, as shown in FIG. 5.
  • α, β, and γ have a relationship of α > β > γ.
  • in other words, the amount of movement for moving the drawing position of the object to the left and right is set so as to decrease each time the object position of the object approaches the display screen 10 by the predetermined length.
  • the amount of movement for moving the object position of the object perpendicularly to the display screen 10 is not limited to Δ.
  • for example, when the object position of the object is moved vertically from o2 toward the display screen 10 by 2 × Δ, the drawing position of the object moves from o2′ to o4′; that is, the movement amount of the drawing position of the object is β + γ.
  • in this way, the amount of movement for moving the object position of the object is not limited to Δ; the movement amount of the drawing position is simply set so as to decrease as the object position of the object approaches the display screen 10 by a predetermined length.
  • when the finger 6 then contacts the display screen 10 and the contact position matches the object position, the execution unit 170 executes the function indicated by the object.
  • in the above, the case where the drawing position is the position o4′, the object position of the object is the position o4, and the position o4 coincides with the position o4′ has been described as an example.
  • however, the present invention is not limited to this. That is, the position where the object position of the object matches the drawing position need not be a position moved by a distance of three times Δ from the position of o1, which is the initial object position of the object.
  • the amount to be moved each time is arbitrary.
  • in summary, when the user's finger 6 comes close to the display screen 10 and touches an object displayed stereoscopically on the display screen 10, that is, when the proximity position matches the object position, the image generation unit 150 generates a right-eye image and a left-eye image in which the drawing position of the object is moved by α. Thereafter, each time the finger 6 continues to touch the object, that is, each time the proximity position matches the object position, the image generation unit 150 generates the right-eye image and the left-eye image by moving the drawing position of the object to the left and right such that the movement amount becomes smaller as the distance between the object and the display screen 10 becomes shorter. As described above, the movement amount by which the image generation unit 150 moves the drawing position of the object to the left and right is set so as to decrease as the display target position of the object approaches the display screen 10.
  • the display device 1 can display the object at a position moved vertically with respect to the display screen 10 by displaying the right-eye image and the left-eye image thus generated on the display screen 10.
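  • reusing the sketch given earlier, a short loop illustrates the α > β > γ behavior: equal vertical steps Δ toward the screen produce left-right drawing-position movements that shrink as the object nears the screen (all values are illustrative assumptions):

      obj = [0.30, 0.20, 0.15]      # object position, 15 cm in front of the screen
      delta = 0.05                  # vertical step Δ per touch event (assumed value)
      prev, _ = drawing_positions(tuple(obj), fr, fl)
      for step in range(3):
          obj[2] -= delta           # pull the object Δ closer to the screen
          cur, _ = drawing_positions(tuple(obj), fr, fl)
          print(f"step {step + 1}: drawing position moved {abs(cur[0] - prev[0]):.4f} m")
          prev = cur
      # the printed amounts decrease step by step, i.e. alpha > beta > gamma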
  • FIG. 6 is a flowchart showing a processing flow when the user's finger 6 approaches and approaches the display screen 10.
  • the proximity position detection unit 110 determines whether or not the camera 20 has detected the finger 6 that has approached the display screen 10 (S102). When the camera 20 has detected the finger 6 that has approached the display screen 10 (Yes in S102), the proximity position detection unit 110 detects the proximity position (x1, y1, z1) from the input signal transmitted from the camera 20. (S103, proximity position detection step). If not detected (No in S102), the process returns to S101, and the camera 20 detects the object again.
  • the proximity position determination unit 141 and the contact position determination unit 142 acquire the object position (xo, yo, zo) from the object position storage unit 130 (S104).
  • the contact position detection unit 120 determines whether or not the touch panel 30 has detected an object that has touched the display screen 10 (S106).
  • the proximity position determination unit 141 determines whether the finger 6 is touching the object o, that is, whether the proximity position (x1, y1, z1) matches the object position (xo, yo, zo) (S107).
  • if the proximity position matches the object position (Yes in S107), the image generation unit 150 moves the object position vertically with respect to the display screen 10 so that it approaches the display screen 10; that is, the object position is changed to (xo, yo, zo − Δ). Further, the image generation unit 150 moves the drawing position of the object in the right-eye image and the left-eye image to the left and right so that the object is displayed at the object position (xo, yo, zo − Δ), and generates the right-eye image and the left-eye image.
  • the image generation unit 150 stores the changed object position (xo, yo, zo − Δ) as a new object position in the object position storage unit 130 (S108, image generation step).
  • the contact position detection unit 120 detects the contact position (x2, y2, 0) from the input signal transmitted from the touch panel 30 (S109).
  • the contact position determination unit 142 determines whether the finger 6 is touching the object o, that is, whether the contact position (x2, y2, 0) matches the object position (xo, yo, zo) (S110).
  • the execution unit 170 executes the function indicated by the object o (S111).
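  • the S101-S111 flow can be summarized by the following Python sketch (one pass per sensor update; the type and function names are assumptions made for illustration, not names from the patent):

      from dataclasses import dataclass
      from typing import Optional, Tuple

      Vec3 = Tuple[float, float, float]
      DELTA = 0.05  # vertical pull-in step Δ (assumed value)

      @dataclass
      class ObjectStore:                      # stands in for the storage unit 130
          object_position: Vec3

      def update(proximity: Optional[Vec3], contact: Optional[Vec3],
                 store: ObjectStore) -> Optional[str]:
          """One pass through the flow; returns the action taken, if any."""
          if proximity is None:               # S102: nothing near the screen
              return None
          xo, yo, zo = store.object_position  # S104
          if contact is None:                 # S106: no contact yet
              if proximity == (xo, yo, zo):   # S107: finger meets the 3D object
                  store.object_position = (xo, yo, zo - DELTA)  # S108
                  return "redraw right-/left-eye images with reduced parallax"
              return None
          if contact == (xo, yo, zo):         # S110
              return "execute the function indicated by the object"  # S111
          return None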
  • FIGS. 7A and 7B show an example of the position of the user's finger 6, the object position, and the drawing position when the user's finger 6 comes close to the display screen 10, and FIG. 7(c) is a diagram illustrating an example of the position of the user's finger 6, the object position, and the drawing position when the user's finger 6 contacts the display screen 10.
  • as shown in FIG. 7A, when the finger 6 is close to the display screen 10 and the proximity position (x1, y1, z1) matches the object position (xo, yo, zo), the image generation unit 150 moves the object position of the object o perpendicularly to the display screen 10 in a direction approaching the display screen 10, as shown in FIG. 7B. The movement amount at this time is an amount by which the object o can follow the finger 6, and is represented by Δ.
  • the image generation unit 150 calculates, from the moved object position of the object o, the drawing positions of the object dr drawn on the right-eye image and the object dl drawn on the left-eye image, and moves them to the left and right as shown in FIG. 7B.
  • the image generation unit 150 generates a right-eye image and a left-eye image in which the drawing position of the object o is moved left and right. Thereby, the object o is displayed at the moved object position so as to be stereoscopically viewed.
  • when the finger 6 is brought still closer, the image generation unit 150 further moves the object position of the object o perpendicularly to the display screen 10.
  • in this way, until the finger 6 contacts the display screen 10, the image generation unit 150 moves the object position of the object o perpendicularly to the display screen 10 in a direction approaching the display screen 10.
  • when the finger 6 contacts the display screen 10 and the contact position matches the object position, as shown in FIG. 7(c), the execution unit 170 performs the function indicated by the object o.
  • the display device 1 can display the object at a position moved vertically with respect to the display screen 10 by displaying the generated right-eye image and left-eye image on the display screen 10.
  • the display device 1 has an effect that it can give the user the impression that the user has surely touched the object.
  • when the position of the object o is in the space in front of the display screen 10 as in the present embodiment, the display device 1 can obtain the drawing positions of the object in the right-eye image and the left-eye image by a simple calculation, as shown in the formulas (A2) and (A4). Further, the display device 1 can move the object o perpendicularly to the display screen 10 in a direction approaching the display screen 10.
  • the display device 1 can give a natural impression that the user is pushing the object toward the display screen 10.
  • the camera 20 detects the proximity of the finger 6 and transmits an input signal indicating the proximity position to the proximity position detection unit 110.
  • the proximity position detection unit 110 detects the proximity position using the transmitted input signal. Accordingly, the display device 1 can move the object perpendicular to the display screen 10 in a direction approaching the display screen 10 when the proximity position matches the object position of the object o.
  • the user can feel as if the user actually touched the object.
  • the display device 1 executes a function indicated by the object.
  • the display device 1 can give the user an impression that the object o is surely pressed.
  • the camera 20 has been described as an input device for detecting the proximity position.
  • the input device for detecting the proximity position is not limited to this.
  • for example, an infrared sensor or a touch panel may be used.
  • in this case, the touch panel is a so-called non-contact type touch panel.
  • zo is the distance of the object position from the display screen 10 (zo> 0).
  • f (x, y, z) represents a predetermined optimum viewing position with respect to the display screen 10
  • D represents a distance between the optimum viewing position and the display screen 10.
  • W indicates the width of the display screen 10
  • s indicates the baseline length, which is the distance between the user's right eye fr and left eye fl.
  • fr ′ indicates the intersection of the perpendicular to the display screen 10 from the right eye fr and the display screen 10
  • the right triangle formed by the object dr, the right eye fr, and the intersection fr ′ is similar to the right triangle formed by the object o, the object dr, and the intersection dr ′. Therefore, the following formula (A5) is established.
  • fl ′ indicates the intersection of the perpendicular line from the left eye fl to the display screen 10 and the display screen 10
  • in the present embodiment, the proximity position determination unit 141 determines whether or not the proximity position matches the position (symmetric position) that is symmetric to the object position with respect to the display screen 10.
  • the position symmetric to the object position with respect to the display screen 10 is the position on the perpendicular from the object position to the display screen 10 that lies on the opposite side of the screen at the same perpendicular distance.
  • the proximity position determination unit 141 determines whether or not the proximity position matches the position of (xo, yo, zo).
  • the object position is moved perpendicularly to the display screen 10 in a direction approaching the display screen 10.
  • the movement amount at this time is an amount by which the object o can follow the finger 6; in the present embodiment, this movement amount is Δ. That is, the image generation unit 150 changes the object position to (xo, yo, −zo + Δ).
  • the image generation unit 150 substitutes the changed object position (xo, yo, −zo + Δ) into the expressions (A6) and (A8), thereby calculating the drawing positions of the object dr and the object dl drawn in the right-eye image and the left-eye image from the object position.
  • the image generation unit 150 can generate a right-eye image and a left-eye image in which an object is drawn at the position represented by the above formulas (A6) and (A8).
  • the image generation unit 150 generates a right-eye image and a left-eye image in which the x component of the drawing position is changed; that is, it generates a right-eye image and a left-eye image in which the drawing position of the object is moved to the left and right.
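  • the symmetric-position test used in this embodiment can be sketched as follows (the tolerance and helper names are assumptions; the patent only requires that the positions coincide). Note that the drawing_positions sketch given earlier already covers this behind-screen case via zo < 0:

      def mirror_position(obj):
          """Position symmetric to the object position with respect to the
          screen: same point on the screen plane, opposite sign of z."""
          xo, yo, zo = obj
          return (xo, yo, -zo)

      def matches(p, q, tol=1e-3):
          # exact equality is brittle for sensor data, so a small tolerance
          # is assumed here
          return all(abs(a - b) < tol for a, b in zip(p, q))

      obj = (0.30, 0.20, -0.10)      # object displayed 10 cm behind the screen
      finger = (0.30, 0.20, 0.10)    # finger at the mirror point in front of it
      assert matches(finger, mirror_position(obj))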
  • FIGS. 9A and 9B show an example of the position of the user's finger 6, the object position, and the drawing position when the user's finger 6 comes close to the display screen 10, and FIG. 9(c) is a diagram illustrating an example of the position of the user's finger 6, the object position, and the drawing position when the user's finger 6 contacts the display screen 10.
  • as shown in FIG. 9A, when the finger 6 is close to the display screen 10 and the proximity position (x1, y1, z1) is symmetric to the object position (xo, yo, −zo) with respect to the display screen 10, the image generation unit 150 moves the object position of the object o perpendicularly to the display screen 10 in a direction approaching the display screen 10, as illustrated in FIG. 9B.
  • the movement amount at this time is an amount by which the object o can follow the finger 6, and is represented by Δ.
  • the image generation unit 150 calculates, from the moved object position of the object o, the drawing positions of the object dr drawn on the right-eye image and the object dl drawn on the left-eye image, and moves them to the left and right as shown in FIG. 9B.
  • the image generation unit 150 generates a right-eye image and a left-eye image in which the drawing position of the object o is moved left and right. Thereby, the object o is displayed at the moved object position so as to be stereoscopically viewed.
  • when the finger 6 is brought closer and the proximity position (x1, y1, z1′) of the finger 6 becomes symmetric, with respect to the display screen 10, to the moved object position (xo, yo, −zo + Δ), that is, when it matches (xo, yo, zo − Δ), the image generation unit 150 further moves the object position of the object o perpendicularly to the display screen 10 in a direction approaching the display screen 10.
  • as described above, until the finger 6 contacts the display screen 10, that is, until the object position becomes (xo, yo, 0), the image generation unit 150 moves the object position of the object o perpendicularly to the display screen 10 in a direction approaching the display screen 10.
  • when the finger 6 contacts the display screen 10, as shown in FIG. 9(c), the execution unit 170 performs the function indicated by the object o.
  • in this case as well, the display device 1 can obtain the drawing positions of the object in the right-eye image and the left-eye image by a simple calculation using the expressions (A6) and (A8). Further, the display device 1 can move the object o perpendicularly to the display screen 10 in a direction approaching the display screen 10.
  • thereby, the display device 1 can give a natural impression that the user is pushing the object o toward the display screen 10.
  • FIG. 10 is a block diagram showing a main configuration of the display device 3 according to the present embodiment.
  • the display device 3 includes a display screen 10, a camera (first input device) 20, a touch panel (third input device) 30, a camera (second input device) 40, and a control unit 300.
  • the camera 40 is an input device that detects the face of the user who uses the display device 3 and transmits a signal indicating the detected face information and the position of the face to the control unit 300.
  • FIG. 11 is a cross-sectional view of the display device 3.
  • a camera 40 is disposed below the display device 1 shown in FIG. 2 (in the negative y-axis direction).
  • the position where the camera 40 is disposed is not limited to this, and may be a position where the face of the user who uses the display device 3 can be detected.
  • the display device 3 may have a configuration surrounded by the bezel 2 as shown in FIG.
  • the camera 40 may be configured integrally with the display device 3 or may be configured separately from the display device 3. Further, the camera 40 may be integrated with the camera 20.
  • the control unit 300 is responsible for processing that the display device 3 executes internally.
  • the control unit 300 includes a proximity position detection unit (proximity position detection means) 110, a contact position detection unit (contact position detection means) 120, an object position storage unit 130, a determination unit 140, an image generation unit (image generation means) 150, an image output unit 160, an execution unit (execution means) 170, a face detection unit (face detection means) 310, and an object position calculation unit (object position calculation means) 320.
  • the determination unit 140 includes a proximity position determination unit (proximity position determination means) 141 and a contact position determination unit (contact position determination means) 142.
  • the face detection unit 310 recognizes the face of the user who is using the display device 3 using an input signal from the camera 40. Specifically, the user's face is recognized using the user's face information transmitted from the camera 40 and a signal indicating its position.
  • the user's face information is information indicating components of the user's face such as the user's right eye, left eye, nose and mouth.
  • when the face detection unit 310 recognizes the user's face, it detects the position of the user's face with respect to the display screen 10.
  • the position of the user's face detected by the face detection unit 310 is the position of the user's nose, but is not limited thereto, and may be, for example, the position of the user's right eye.
  • the face detection unit 310 supplies the detected position of the user's face to the object position calculation unit 320.
  • when the face detection unit 310 cannot recognize the user's face, it notifies the object position calculation unit 320 to that effect.
  • the object position calculation unit 320 calculates the object position of the object displayed on the display screen 10 so as to be stereoscopically viewable. Specifically, the object position is calculated from the position of the user's face supplied from the face detection unit 310 and the drawing position of the object in the right-eye image and the left-eye image displayed on the display screen 10. A method for calculating the object position will be described later.
  • in that case, the object position calculation unit 320 calculates the object position using the optimum viewing position determined in advance for the display screen 10 and the drawing position of the object. A method for calculating the object position will be described later.
  • the object position calculation unit 320 stores the calculated object position in the object position storage unit 130.
  • the proximity position determination unit 141 determines whether the proximity position detected by the proximity position detection unit 110 matches the object position calculated by the object position calculation unit 320.
  • FIG. 12 is a front view and a side view of the display screen 10 when the user uses the display device 3.
  • the face detection unit 310 detects (x3, y3, z3) as the face position, from the distance z3 between the user's face and the display screen 10 and from the face position (x3, y3) on the plane (xy plane) parallel to the display screen 10, with the upper left as the origin.
  • the position of the face on the plane (xy plane) parallel to the display screen 10 is represented as (x3, y3), but the user's face actually occupies an area including (x3, y3); the position of the face on the xy plane is represented using a representative point such as (x3, y3).
  • FIG. 13 is a diagram showing the object position o (xo, yo, zo) with respect to the positions of the object dr drawn on the right-eye image and the object dl drawn on the left-eye image, for a user's face at the position f (fx, fy, fz).
  • s indicates a baseline length that is the distance between the user's right eye fr and left eye fl.
  • zo is the distance of the object position from the display screen 10 (zo> 0).
  • the object position calculation unit 320 calculates the position of the user's right eye fr and the position of the user's left eye fl from the position f (fx, fy, fz) of the user's face.
  • when the face detection unit 310 directly detects the position of the user's right eye fr and the position of the left eye fl, those positions are used.
  • the object o is located on the straight line connecting the user's right eye fr and the object dr drawn on the right-eye image (the straight line shown by a dotted line in FIG. 13), and on the straight line connecting the user's left eye fl and the object dl drawn on the left-eye image; therefore, the object position can be obtained by calculating the intersection of these two straight lines.
  • when the user's face cannot be detected, the object position calculation unit 320 can obtain the object position in the same manner by setting the optimum viewing position determined in advance for the display screen 10 as f (fx, fy, fz).
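  • the intersection calculation can be sketched as follows (a reconstruction of the geometry of FIG. 13; the variable names follow the text, while the eye placement and the implementation itself are assumptions):

      def object_position(f, dr, dl, s):
          """Recover the stereoscopic object position as the intersection of the
          two eye-to-drawing-position lines. f = face position (fx, fy, fz) with
          fz the distance from the screen; dr/dl = drawing positions (x, y) on
          the screen plane z = 0; s = baseline length between the eyes."""
          fx, fy, fz = f
          fr = (fx + s / 2, fy, fz)   # right eye (which side is which is an assumption)
          fl = (fx - s / 2, fy, fz)
          # each line runs from a drawing position (z = 0) to an eye (z = fz);
          # both lines reach the intersection at the same parameter t, so equate x:
          denom = (fr[0] - dr[0]) - (fl[0] - dl[0])
          t = (dl[0] - dr[0]) / denom
          xo = dr[0] + t * (fr[0] - dr[0])
          yo = dr[1] + t * (fr[1] - dr[1])
          zo = t * fz                 # distance of the object from the screen
          return (xo, yo, zo)

      # with the illustrative values used earlier, the projection round-trips:
      # object_position((0.30, 0.20, 0.50), (0.2919, 0.20), (0.3081, 0.20), 0.065)
      # returns approximately (0.30, 0.20, 0.10)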
  • the face detection unit 310 determines whether or not the camera 40 has detected a user using the display device 3 (S302). When the camera 40 detects a user using the display device 3 (Yes in S302), the face detection unit 310 recognizes the user's face (S303).
  • when the face detection unit 310 recognizes the user's face (Yes in S303), it detects the position (x3, y3, z3) of the user's face (S304).
  • the face detection unit 310 substitutes the detected position (x3, y3, z3) of the user's face for the position f (fx, fy, fz) of the user shown in FIG. 12 (S305).
  • otherwise (No in S303), the face detection unit 310 substitutes the optimum viewing position for the position f (fx, fy, fz) of the user's face shown in FIG. 12 (S306).
  • here, the optimum viewing position is expressed as (W/2, H/2, D), but the optimum viewing position is not limited to this; it is a position determined according to the installation location of the display device 3 or the like.
  • W indicates the width of the display device 3
  • H indicates the height of the display device 3
  • D indicates the distance between the display device 3 and the user.
  • the object position calculation unit 320 calculates the object position from the position f (fx, fy, fz) of the user's face and the drawing position of the object (S307).
  • the object position calculation unit 320 stores the calculated object position in the object position storage unit 130.
  • the display device 3 can detect the position of the user's face and calculate the object position of the object o displayed on the display screen 10 so as to be stereoscopically viewed.
  • the object o can be displayed in a stereoscopic manner and the impression that the object o has been touched can be given.
  • the camera 40 has been described as an input device for detecting the face of the user who uses the display device 3, but the input device for detecting the face of the user is not limited to this.
  • an infrared sensor may be used.
  • the display device 1 and the display device 3 are configured to include the contact position detection unit 120 that detects the position of the object that has touched the display screen 10, but the present invention is not limited to this.
  • in that case, the proximity position is regarded as the contact position.
  • a fourth embodiment according to the present invention will be described with reference to FIGS. 15 to 18.
  • FIG. 15 is a block diagram showing a main configuration of the display device 4 according to the present embodiment.
  • the display device 4 includes a display screen 10, a camera (first input device) 20, and a control unit 400.
  • the control unit 400 is responsible for the processing that the display device 4 executes internally. As shown in FIG. 15, the control unit 400 includes a proximity position detection unit (proximity position detection means) 110, an object position storage unit 130, an image generation unit (image generation means) 150, an image output unit 160, an execution unit (execution means) 170, and a determination unit 410.
  • the determination unit 410 includes a proximity position determination unit (proximity position determination means) 411 and a contact position determination unit (comparison means) 412.
  • the proximity position determination unit 411 acquires the object position from the object position storage unit 130.
  • the object position stored in the object position storage unit 130 may be a predetermined position, or is drawn on the display screen 10 in a predetermined optimal viewing position, a right-eye image, and a left-eye image. It may be a position obtained from the drawing position of the selected object. Also, the position obtained in S305 of the third embodiment may be used.
  • the proximity position determination unit 411 determines whether or not the proximity position supplied from the proximity position detection unit 110 matches the acquired object position. Thereafter, the proximity position determination unit 411 notifies the determination result to the contact position determination unit 412. When the proximity position determination unit 411 receives a notification from the contact position determination unit 412 that the proximity position is not the contact position, the proximity position determination unit 411 notifies the image generation unit 150 of the determination result.
  • when the determination result by the proximity position determination unit 411 is true, that is, when the proximity position matches the object position, the contact position determination unit 412 compares the distance from the proximity position to the display screen 10 with a predetermined threshold value. When the distance from the proximity position to the display screen 10 is smaller than the threshold value, the contact position determination unit 412 regards the proximity position as a contact position on the display screen 10. Thereafter, the contact position determination unit 412 notifies the proximity position determination unit 411 and the execution unit 170 of the determination result.
  • when the execution unit 170 receives a notification from the contact position determination unit 412 that the proximity position is regarded as the contact position, the execution unit 170 executes the function indicated by the object.
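  • a compact sketch of this proximity-as-contact rule (the threshold value and the names below are assumptions made for illustration):

      DTH = 0.005  # threshold dth in meters (assumed value)

      def is_virtual_contact(proximity, object_position, dth=DTH):
          """Regard the proximity position as a contact position once the finger
          is on the object and closer to the screen than the threshold."""
          x1, y1, z1 = proximity
          if (x1, y1, z1) != object_position:  # must match the object position first
              return False
          return z1 < dth                      # closer than dth: treat as contact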
  • FIG. 16 is a cross-sectional view of the display device 4.
  • as shown in FIG. 16, the display device 4 has a configuration in which the touch panel 30 is excluded from the display device 1 of FIG. 2.
  • the display device 4 may have a configuration surrounded by the bezel 2 as shown in FIG.
  • FIG. 17 is a flowchart showing a processing flow when the user's finger 6 comes close to the display screen 10.
  • the proximity position detection unit 110 determines whether or not the camera 20 has detected the finger 6 approaching the display screen 10 (S402). If the camera 20 has not detected the finger 6 in proximity to the display screen 10 (No in S402), the process returns to S401, and the camera 20 detects an object again.
  • when the camera 20 has detected the finger 6 (Yes in S402), the proximity position detection unit 110 detects the proximity position (x1, y1, z1) from the input signal transmitted from the camera 20 (S403).
  • the proximity position determination unit 411 acquires the object position (xo, yo, zo) from the object position storage unit 130 (S404).
  • the proximity position determination unit 411 determines whether or not the finger 6 is touching the object o, that is, whether or not the proximity position (x1, y1, z1) matches the object position (xo, yo, zo) (S405).
  • when the proximity position matches the object position (Yes in S405), the contact position determination unit 412 compares the distance z1 from the proximity position to the display screen 10 with the predetermined threshold value dth (S406).
  • when the distance z1 is equal to or greater than the threshold value dth, the image generation unit 150 changes the object position to (xo, yo, zo − Δ) so that the object position moves perpendicularly to the display screen 10 in a direction approaching it. Further, the image generation unit 150 moves the drawing position of the object in the right-eye image and the left-eye image to the left and right so that the object is displayed at the object position (xo, yo, zo − Δ), and generates the right-eye image and the left-eye image.
  • the image generation unit 150 stores the changed object position (xo, yo, zo − Δ) as a new object position in the object position storage unit 130 (S407).
  • when the distance z1 is smaller than the threshold value dth, the proximity position is regarded as the contact position, and the execution unit 170 executes the function indicated by the object o (S408).
  • FIG. 18A shows an example of the position of the user's finger 6 and the object position when the user's finger 6 is close to the display screen 10 and the distance from the close position to the display screen 10 is equal to or greater than the threshold value.
  • FIG. 18B is a diagram illustrating an example of the position of the user's finger 6 and the object position when the distance from the proximity position to the display screen 10 is smaller than the threshold value.
  • as shown in FIG. 18A, the contact position determination unit 412 compares the distance z1 between the proximity position (x1, y1, z1) and the display screen 10 with the predetermined threshold value dth.
  • when the distance z1 is equal to or greater than the threshold value dth, the image generation unit 150 moves the object position of the object o perpendicularly to the display screen 10 in a direction approaching the display screen 10.
  • the movement amount at this time is an amount by which the object o can follow the finger 6, and is represented by Δ.
  • when the finger 6 is brought still closer, the contact position determination unit 412 compares the distance z1′ between the new proximity position (x1, y1, z1′) and the display screen 10 with the predetermined threshold value dth.
  • as shown in FIG. 18B, when the distance z1′ is smaller than the threshold value dth, the contact position determination unit 412 determines that the proximity position (x1, y1, z1′) is a contact position on the display screen 10. After the determination, the execution unit 170 executes the function indicated by the object o.
  • in this way, the function indicated by the object o can be executed without the finger actually touching the display screen 10.
  • Each block of the display device 1, the display device 3, and the display device 4 described above may be configured by hardware logic, or may be realized by software using a CPU as follows.
  • the display device 1, the display device 3, and the display device 4 each include a CPU (central processing unit) that executes the instructions of a control program realizing each function, a ROM (read only memory) that stores the program, a RAM (random access memory) into which the program is loaded, and a storage device (recording medium) such as a memory that stores the program and various data.
  • the object of the present invention can also be achieved by supplying, to the display device 1, the display device 3, and the display device 4, a recording medium on which the program code (an executable program, an intermediate code program, or a source program) of the control program of the display device 1, the display device 3, and the display device 4, which is software for realizing the functions described above, is recorded in a computer-readable manner, and by having the computer (or a CPU or MPU) read and execute the program code recorded on the recording medium.
  • examples of the recording medium include tape systems such as magnetic tapes and cassette tapes; disk systems including magnetic disks such as floppy (registered trademark) disks / hard disks and optical disks such as CD-ROM / MO / MD / DVD / CD-R; card systems such as IC cards (including memory cards) / optical cards; and semiconductor memory systems such as mask ROM / EPROM / EEPROM / flash ROM.
  • the display device 1, the display device 3, and the display device 4 may be configured to be connectable to a communication network, and the program code may be supplied via the communication network.
  • the communication network is not particularly limited.
  • for example, the Internet, an intranet, an extranet, a LAN, an ISDN, a VAN, a CATV communication network, a virtual private network, a telephone line network, a mobile communication network, a satellite communication network, or the like is available.
  • the transmission medium constituting the communication network is not particularly limited.
  • for example, infrared rays such as IrDA and remote control, Bluetooth (registered trademark), 802.11 wireless, HDR (high data rate), a mobile phone network, a satellite line, a terrestrial digital network, and the like can also be used.
  • the present invention can also be realized in the form of a computer data signal embedded in a carrier wave in which the program code is embodied by electronic transmission.
  • as described above, the display device according to the present invention comprises: proximity position detection means for detecting a proximity position of an object close to the display screen; proximity position determination means for determining whether the proximity position of the object detected by the proximity position detection means coincides with a display target position where the object is to be displayed, or with a position symmetrical to the display target position with respect to the display screen; and image generation means for moving the drawing position of the object in the right-eye image and the left-eye image displayed on the display screen to the left and right when, in the determination result by the proximity position determination means, the proximity position matches the display target position or the symmetrical position; wherein, each time the proximity position matches the display target position or the symmetrical position in the determination result by the proximity position determination means, the amount of movement by which the image generation means moves the drawing position of the object to the left and right in order to bring the display target position closer to the display screen by a predetermined length is set so as to decrease as the display target position approaches the display screen.
Likewise, a display image generation method according to the present invention is a display image generation method executed by a display device that displays an object on a display screen in a stereoscopic manner, and includes: a proximity position detection step of detecting a proximity position of an object close to the display screen; a proximity position determination step of determining whether the proximity position of the object detected in the proximity position detection step coincides with a display target position where the object is to be displayed or with a position symmetrical to the display target position with respect to the display screen; and an image generation step of moving, to the left and right, the drawing position of the object in the right-eye image and the left-eye image displayed on the display screen when the proximity position coincides with the display target position or the symmetrical position in the determination result of the proximity position determination step. The movement amount by which the drawing position of the object is moved to the left and right in the image generation step, in order to bring the display target position closer to the display screen by a predetermined length each time the proximity position coincides with the display target position or the symmetrical position in the determination result of the proximity position determination step, is set so as to decrease as the display target position of the object approaches the display screen.
Here, an object refers to a display object constituting a GUI (Graphical User Interface). Operation objects that accept selection by the user, such as buttons and switches, are representative examples of the objects referred to here.
According to the above configuration, each time an object such as the user's finger approaches the display screen and touches an object displayed in a stereoscopic manner on the display screen, the display device moves the drawing position of the object in the right-eye image and the left-eye image to the left and right. The movement amount by which the drawing position is moved is set so as to decrease as the distance between the object and the display screen decreases. By displaying the right-eye image and the left-eye image generated in this way on the display screen, the display device can display the object at a position moved perpendicularly with respect to the display screen, and therefore has the effect of giving the user the impression of having actually touched the object.
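As a concrete illustration, the following is a minimal Python sketch of this stepwise behaviour. It is not the patent's reference implementation: the names (StereoObject, draw_positions, STEP), the millimetre units, and the assumption that the viewer's eye midpoint sits at the horizontal center W / 2 of the screen are all illustrative.

    from dataclasses import dataclass

    @dataclass
    class StereoObject:
        x: float  # horizontal coordinate xo of the display target position
        z: float  # distance zo from the display screen (z > 0: in front, z < 0: behind)

    def draw_positions(obj, D, W, s):
        """Project the display target position into the right-eye and left-eye
        images, for a viewer whose eye midpoint is assumed centered at W / 2."""
        xdr = (D * obj.x - obj.z * (W / 2 + s / 2)) / (D - obj.z)
        xdl = (D * obj.x - obj.z * (W / 2 - s / 2)) / (D - obj.z)
        return xdr, xdl

    STEP = 10.0  # predetermined length (mm, illustrative) per detected "touch"

    def on_proximity_match(obj, D, W, s):
        """Called each time the proximity position coincides with the display
        target position or its symmetrical position: the object is brought a
        predetermined length closer to the screen and redrawn. The parallax
        s * z / (D - z) shrinks as z approaches 0, so the left/right movement
        of the drawing positions decreases near the screen."""
        old_r, old_l = draw_positions(obj, D, W, s)
        sign = 1.0 if obj.z >= 0 else -1.0
        obj.z = sign * max(abs(obj.z) - STEP, 0.0)
        new_r, new_l = draw_positions(obj, D, W, s)
        return new_r - old_r, new_l - old_l  # per-eye movement amounts

With, say, D = 600, s = 65, and an object starting at z = 40, successive calls yield steadily smaller movement amounts, matching the behaviour described above.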
Specifically, let xo be the horizontal coordinate of the display target position (with the upper left corner of the display screen as the origin), zo the distance of the display target position from the display screen, D the distance between the viewing position and the display screen, W the horizontal width of the display screen, and s the baseline length, that is, the distance between the user's right eye and left eye. From these quantities the image generation means obtains the horizontal coordinate xdr of the drawing position of the object in the right-eye image and the horizontal coordinate xdl of the drawing position of the object in the left-eye image (both with the upper left corner of the display screen as the origin), and draws the moved object at xdr in the right-eye image and at xdl in the left-eye image.
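The equations themselves did not survive in this text. The following is a reconstruction from the stated geometry by similar triangles, assuming the viewer's eyes are level, their midpoint sits at the horizontal center W/2 of the screen, and zo is taken positive in front of the screen (negative behind it); it is a sketch consistent with the definitions above, not a verbatim reproduction of the patent's formulas:

    \[
      x_{dr} = \frac{D\,x_o - z_o\left(\frac{W}{2} + \frac{s}{2}\right)}{D - z_o},
      \qquad
      x_{dl} = \frac{D\,x_o - z_o\left(\frac{W}{2} - \frac{s}{2}\right)}{D - z_o}.
    \]

The parallax between the two drawing positions is then \(x_{dl} - x_{dr} = s\,z_o/(D - z_o)\), which vanishes as \(z_o \to 0\); this agrees with the movement amount decreasing as the display target position approaches the display screen.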
In the display device according to the present invention, when the display target position of the object is located in the space in front of the display screen, the proximity position determination means preferably determines whether the proximity position detected by the proximity position detection means coincides with the display target position. Conversely, when the display target position of the object is located in the space behind the display screen, the proximity position determination means preferably determines whether the proximity position detected by the proximity position detection means coincides with the symmetrical position.
According to the above configuration, regardless of whether the object is in the space in front of or behind the display screen, the display device can obtain the drawing position of the object in the right-eye image and the left-eye image displayed on the display screen by a simple calculation, and can move the object perpendicularly to the display screen in the direction approaching it. The display device can therefore give the user the natural impression of pushing the object toward the display screen.
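In code form, this determination could look like the following minimal Python sketch; the coordinate convention (z > 0 denotes the space in front of the screen) and the tolerance value are assumptions for illustration:

    TOLERANCE = 5.0  # mm; how close two positions must be to "coincide" (illustrative)

    def mirror_across_screen(pos):
        """Reflect a 3-D position across the display screen plane z = 0."""
        x, y, z = pos
        return (x, y, -z)

    def proximity_matches(proximity_pos, display_target_pos):
        """Compare the detected proximity position against the display target
        position when the object floats in front of the screen, or against
        its symmetrical position when the object lies behind the screen."""
        reference = (display_target_pos
                     if display_target_pos[2] > 0
                     else mirror_across_screen(display_target_pos))
        return all(abs(a - b) <= TOLERANCE
                   for a, b in zip(proximity_pos, reference))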
The display device according to the present invention preferably further includes a first input device, and the proximity position detection means preferably detects the proximity position of the object using an input signal from the first input device. According to the above configuration, the first input device detects the proximity of the object and transmits an input signal indicating the proximity position to the proximity position detection means, which detects the proximity position using the transmitted input signal. When the proximity position coincides with the display target position of the object, the display device can thus move the object perpendicularly to the display screen in the direction approaching it, and the user can feel as if he or she had actually touched the object.
The display device according to the present invention preferably further includes face detection means for recognizing the face of the user who uses the display device and detecting the position of the user's face with respect to the display screen, and object position calculation means for calculating the display target position of the object displayed in a stereoscopic manner on the display screen. When the face detection means recognizes the user's face, the object position calculation means calculates the display target position of the object from the detected position of the user's face and the drawing position of the object; when the face detection means does not recognize the user's face, it calculates the display target position of the object from an optimal viewing position predetermined for the display screen and the drawing position of the object. The proximity position determination means then determines whether the proximity position detected by the proximity position detection means coincides with the display target position of the object calculated by the object position calculation means or with the position symmetrical to that display target position with respect to the display screen.
Preferably, the display device further includes a second input device, and the face detection means detects the position of the user's face using an input signal from the second input device.
According to the above configuration, the display device can detect the position of the user's face and calculate the display target position of the object displayed in a stereoscopic manner on the display screen, or its symmetrical position. The display device can therefore both display the object in a stereoscopic manner and give the user the impression of having touched it.
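A minimal sketch of such a calculation follows, inverting the screen-plane projection given earlier; the coordinate conventions, the (eye-midpoint x, distance D) encoding of the viewing position, and OPTIMAL_VIEWING_POS are illustrative assumptions, not values from the patent:

    OPTIMAL_VIEWING_POS = (400.0, 600.0)  # (eye-midpoint x, viewing distance D); illustrative

    def display_target_position(x_dl, x_dr, s, face=None):
        """Recover (xo, zo), the perceived horizontal coordinate and distance
        from the screen, from the drawing positions in the left-eye and
        right-eye images. `face` is the (eye-midpoint x, distance D) pair from
        the face detection means; when no face is recognized, the predetermined
        optimal viewing position is used instead."""
        center_x, D = face if face is not None else OPTIMAL_VIEWING_POS
        p = x_dl - x_dr               # parallax; positive in front of the screen
        z_o = D * p / (s + p)         # similar triangles, inverse of the projection
        x_mid = (x_dl + x_dr) / 2.0
        x_o = (x_mid * (D - z_o) + z_o * center_x) / D
        return x_o, z_o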
The display device according to the present invention preferably further includes contact position detection means for detecting a contact position of an object that contacts the display screen, contact position determination means for determining whether the contact position of the object detected by the contact position detection means coincides with the display target position or the symmetrical position, and execution means for executing a function indicated by the object when the contact position coincides with the display target position or the symmetrical position in the determination result by the contact position determination means.
The display device may further include a third input device, and the contact position detection means may detect the contact position of the object using an input signal from the third input device. According to the above configuration, the display device executes the function indicated by the object.
Alternatively, the display device according to the present invention preferably includes contact position determination means for determining that the proximity position detected by the proximity position detection means is a contact position on the display screen when the distance from the proximity position to the display screen is smaller than a predetermined threshold, and execution means for executing the function indicated by the object when the proximity position is determined to be a contact position on the display screen.
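The threshold-based variant might be sketched as follows in Python; the threshold value, the tolerance, and the callback wiring are assumptions for illustration, not the patent's reference implementation:

    CONTACT_THRESHOLD = 3.0   # mm; predetermined distance threshold (illustrative)
    COINCIDE_TOLERANCE = 5.0  # mm; x/y tolerance for "coinciding" (illustrative)

    def handle_proximity(proximity_pos, target_pos, on_execute):
        """Treat a proximity position closer to the screen than the threshold
        as a contact position, and execute the object's function when that
        contact coincides with the display target position (or, for an object
        behind the screen, with its symmetrical position; both share x and y)."""
        px, py, pz = proximity_pos
        if abs(pz) < CONTACT_THRESHOLD:  # judged to be a contact on the screen
            tx, ty, _ = target_pos
            if abs(px - tx) <= COINCIDE_TOLERANCE and abs(py - ty) <= COINCIDE_TOLERANCE:
                on_execute()  # execution means runs the function indicated by the object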
A display program for operating a computer included in the display device, which causes the computer to function as each of the above means, and a computer-readable recording medium on which the display program is recorded are also included in the technical scope of the present invention.
The present invention can be suitably used for display devices that display an object in a stereoscopic manner, in particular large display devices such as television receivers of 40 inches or more and digital signage of 100 inches or more.

Abstract

The present invention relates to a display device (1) for displaying an object on a display screen (10) in a stereoscopically viewable manner, provided with a proximity position detector (110) for detecting a proximity position of a nearby body, a proximity position determination unit (141) for determining whether the proximity position coincides with a display target position, and an image generator (150) for moving the position at which the object is drawn to the left and right so that, each time the positions coincide, the display target position comes closer to the display screen (10).
PCT/JP2012/052551 2011-02-04 2012-02-03 Display device, display image generation method, program, and recording medium WO2012105703A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2011022900 2011-02-04
JP2011-022900 2011-02-04

Publications (1)

Publication Number Publication Date
WO2012105703A1 (fr) 2012-08-09

Family

ID=46602903

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2012/052551 WO2012105703A1 (fr) 2011-02-04 2012-02-03 Display device, display image generation method, program, and recording medium

Country Status (1)

Country Link
WO (1) WO2012105703A1 (fr)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005228130A (ja) * 2004-02-13 2005-08-25 Sharp Corp Drawing device, drawing method, and drawing program
WO2008062586A1 (fr) * 2006-11-22 2008-05-29 Sharp Kabushiki Kaisha Display device, display method, display program, and recording medium
JP2009070416A (ja) * 2009-01-05 2009-04-02 Sony Computer Entertainment Inc Control system and control method

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014034653A1 (fr) * 2012-08-28 2014-03-06 NEC Casio Mobile Communications, Ltd. Electronic apparatus, control method therefor, and program
JP2014044662A (ja) * 2012-08-28 2014-03-13 Nec Casio Mobile Communications Ltd Electronic apparatus, control method therefor, and program

Similar Documents

Publication Publication Date Title
US20200409529A1 (en) Touch-free gesture recognition system and method
TWI486629B (zh) See-through head-mounted display system and interactive operation method
US11164546B2 (en) HMD device and method for controlling same
US9454837B2 (en) Image processing apparatus, method, and computer-readable storage medium calculating size and position of one of an entire person and a part of a person in an image
TWI471820B (zh) Mobile terminal and operation control method thereof
US20100053151A1 (en) In-line mediation for manipulating three-dimensional content on a display device
KR20120050900A (ko) Information processing device, stereoscopic display method, and program
JP5263355B2 (ja) Image display device and imaging device
JP2016523420A (ja) System and method for direct pointing detection for interaction with a digital device
CN110968187B (zh) Remote touch detection enabled by a peripheral device
JP2012108723A (ja) Instruction receiving device
US9432652B2 (en) Information processing apparatus, stereoscopic display method, and program
EP2453660B1 (fr) Information processing apparatus, stereoscopic display method, and program
US20150033157A1 (en) 3d displaying apparatus and the method thereof
CN107077199B (zh) Apparatus for presenting a virtual object on a three-dimensional display and method for controlling the apparatus
JP5341126B2 (ja) Detection area enlargement device, display device, detection area enlargement method, program, and computer-readable recording medium
CN105808015A (zh) Anti-peeping user interaction device and method
US9123146B2 (en) Stereoscopic image display control apparatus, and stereoscopic image display control method
WO2012105703A1 (fr) Display device, display image generation method, program, and recording medium
JP2006302029A (ja) Display device control program, display device control method, and display device
JP2012103980A5 (fr)
KR101560474B1 (ko) Stereoscopic display device and method providing a three-dimensional user interface
JP5950806B2 (ja) Input device, information processing method, and information processing program
Zhao et al. Evaluation of visuo-haptic feedback in a 3D touch panel interface
Niikura et al. 3D touch panel interface using an autostereoscopic display

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12742144

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12742144

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP