WO2008062586A1 - Display device, display method, display program, and recording medium - Google Patents

Display device, display method, display program, and recording medium

Info

Publication number
WO2008062586A1
WO2008062586A1 (PCT/JP2007/064565)
Authority
WO
WIPO (PCT)
Prior art keywords
display
display device
touch panel
determination
input
Prior art date
Application number
PCT/JP2007/064565
Other languages
English (en)
Japanese (ja)
Inventor
Akizumi Fujioka
Toshihiro Yanagi
Masami Ozaki
Keiichi Yamamoto
Asahi Yamato
Original Assignee
Sharp Kabushiki Kaisha
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sharp Kabushiki Kaisha
Publication of WO2008062586A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • Display device, display method, display program, and recording medium
  • the present invention relates to a display device, a display method, a display program, and a recording medium that display an image (object) three-dimensionally.
  • 3D display is used to display a selection object such as a button in a three-dimensional manner to improve the reality of the user interface.
  • several techniques have been proposed to improve operability when selecting objects displayed in 3D.
  • Patent Document 1 discloses an operation panel device comprising: a display unit that displays an image for operation; a touch panel formed so as to be superimposed on the display unit and pressed by a pressing means to perform input for operation; a sensor disposed at a predetermined distance from the operation surface of the touch panel, which detects the approach of the pressing means to the operation surface before it touches the touch panel; and a control unit configured to switch the image displayed on the display unit between a stereoscopic parallax image and a planar image in response to detection by the sensor.
  • Patent Document 2 discloses an image display system in which a 3D stereoscopic image displayed on a 3D display device can appear in front of the touch panel as viewed from the viewer, and in which, when the touch panel is touched with a fingertip or the like, objects other than the object pointed to are moved forward.
  • Patent Document 1: Japanese Unexamined Patent Publication No. 2004-280496 (published October 7, 2004)
  • Patent Document 2: Japanese Patent Publication JP 2006-293878 A (published January 26, 2006)
  • In the technique of Patent Document 1, however, the stereoscopic image is switched to a planar image before the user's finger touches the touch panel.
  • This gives the user an uncomfortable feeling, because an object that has not been touched appears to be retracted.
  • The present invention has been made to solve the above-described problems, and an object of the present invention is to provide a display device, a display method, a display program, and a recording medium capable of display that gives a natural impression to the user when the user selects a three-dimensionally displayed object.
  • In order to achieve the above object, a display device according to the present invention comprises:
  • display means for three-dimensionally displaying an object on a three-dimensional display arranged at a certain distance from an input device;
  • detection means for detecting an input position on the input device;
  • determination means for determining whether or not the object is displayed at a position on the three-dimensional display corresponding to the detected input position;
  • and the display means is characterized in that, when the determination result by the determination means is true, it changes the display state of the object to a lower one.
  • the display means displays the object three-dimensionally on the three-dimensional display arranged at a certain distance from the input device.
  • The object here is an operation object, such as a button or a switch, for accepting a selection by the user.
  • The input device is, for example, a contact or non-contact type touch panel arranged with its input detection surface parallel to the 3D display.
  • the display means for example, based on the principle of binocular parallax, does not require so-called 3D glasses, and three-dimensionally displays an object on a three-dimensional display so that a stereoscopic view can be obtained simply by viewing it as it is.
  • an input device is arranged at a fixed distance from the 3D display.
  • the input device is disposed on the near side as viewed from the user.
  • the detection means detects an input position in the input device. Specifically, if the input device is a touch-type touch panel, for example, the position where the user touched with a finger is specified by coordinates. In accordance with this detection process, the determination unit determines whether or not the object is displayed at the position on the three-dimensional display corresponding to the detection position on the input device detected by the detection unit. In other words, it is determined whether or not the user selects an object displayed in 3D through the input device.
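The detection-and-determination flow described above can be sketched as follows. This is a minimal, illustrative example, not the patent's implementation: the `Button` class and the `hit_test` function are hypothetical stand-ins for the display coordinate data and the determination means, which collate the detected touch coordinates with each object's display position.

```python
# Sketch of the determination means: given the touch coordinates detected on
# the input device, find which (if any) displayed object occupies the
# corresponding position on the 3D display. All names are illustrative.
from dataclasses import dataclass

@dataclass
class Button:
    x: int       # top-left x of the button's display region
    y: int       # top-left y of the button's display region
    width: int
    height: int
    label: str

def hit_test(buttons, touch_x, touch_y):
    """Return the button whose display region contains the touched point, or None."""
    for b in buttons:
        if b.x <= touch_x < b.x + b.width and b.y <= touch_y < b.y + b.height:
            return b
    return None
```

When the returned button is not `None`, the determination result is "true" and the display means can lower that button's displayed height.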
  • the display unit changes the display state of the selected object to a lower one. For example, if the object is a button, the display state is changed so that it is retracted to the display device (3D display) side as seen from the user.
  • a plane object may be displayed instead of a three-dimensional object.
  • the display device has an effect of enabling display that gives a natural impression to the user when the user selects an object displayed in three dimensions.
  • The present invention likewise provides a corresponding display method.
  • the input device is a contact type touch panel.
  • the input device is a non-contact touch panel
  • In this case, a panel made of, for example, an acrylic plate is arranged at the input position detected by the non-contact touch panel. When the user touches this panel, the non-contact touch panel detects the input, and the height of the corresponding object is lowered. As a result, the user can feel as if he or she actually touched the object.
  • It is preferable that the display means approximately matches the height of the object with the distance between the input device and the three-dimensional display before the determination result by the determination means becomes true.
  • That is, the object is displayed in 3D so that its foremost surface substantially coincides with the input surface of the input device.
  • the user can feel as if he / she actually touched the object when inputting to the input device.
  • the display device has an effect of further improving the operability when selecting an object.
  • It is preferable that the display means, before the determination result by the determination means becomes true, three-dimensionally displays only the active objects among the plurality of objects.
  • the display unit three-dimensionally displays only active ones of the plurality of objects before the determination result by the determination unit becomes true.
  • The active object is an object that is set so that the display device performs a specific process on it after it is selected.
  • For example, if the object is a key for inputting a predetermined number, and selecting the key causes the number assigned to it to be input, that object corresponds to an active object.
  • The display device displays only the active objects in three dimensions; objects that are not active, that is, objects for which no processing is performed even if they are selected, are not displayed in 3D. This allows the user to easily distinguish selectable objects from others, and prevents objects that cannot be selected from being selected by mistake. Accordingly, the display device has an effect that the operability at the time of object selection by the user can be further improved.
  • It is preferable that the display means displays the object with a shadow before the determination result by the determination means becomes true.
  • the object can be displayed more three-dimensionally. Therefore, the object can be identified more easily.
  • It is preferable that the display means changes the display state of the object to a height corresponding to the input area on the input device.
  • In the above configuration, the display unit changes the display state of the object to a height corresponding to the input area on the input device.
  • the input area here is the contact area in the touch panel when the input device is a contact-type touch panel.
  • the input device when the input device is a non-contact type touch panel, it is the area of the input position detected by the touch panel.
  • the object when the object is a button, the height of the object is changed to be lower as the input area is larger.
  • the input area is regarded as pressing pressure, and the dent of the button is proportional to the pressure.
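The area-as-pressure idea can be sketched with a short, hypothetical mapping. The constants `full_height` and `max_area` are illustrative assumptions, not values from the document; any monotonically decreasing mapping from area to height fits the description.

```python
# Illustrative sketch: treat the input (contact) area as pressing pressure,
# so a larger contact area lowers the displayed button height proportionally.
def button_height(contact_area, full_height=10.0, max_area=400.0):
    """Linearly lower the displayed button height as the contact area grows."""
    pressed = min(contact_area / max_area, 1.0)  # normalize area to [0, 1]
    return full_height * (1.0 - pressed)         # larger area -> lower button
```

For example, a contact area of half `max_area` yields half the full height, and any area at or above `max_area` fully depresses the button.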
  • the display device has an effect of giving the user a stronger impression that the object has been selected.
  • It is preferable that the display means changes the display state of the object to a shape corresponding to the input area on the input device.
  • the display unit changes the display state of the object to a shape corresponding to the input area of the input device.
  • the input area here is a contact area in the touch panel when the input device is a contact type touch panel.
  • the input device is a non-contact type touch panel, it is the area of the input position detected by the touch panel. For example, if the object is a sphere, the input area is regarded as a pressing pressure, and the object is displayed in a deformed shape corresponding to the pressure, that is, an elliptical sphere.
  • the display device has an effect of giving the user a stronger impression that the object has been selected.
  • the display device may be realized by a computer.
  • a display program for realizing the display device in a computer by operating the computer as each of the above means, and a computer-readable recording medium recording the display program also fall within the scope of the present invention.
  • FIG. 1 is a block diagram showing a main configuration of a display device according to an embodiment of the present invention.
  • FIG. 2 is a diagram showing a positional relationship between a 3D display and a touch panel.
  • FIG. 3 is a diagram showing a display state of a plurality of 3D buttons. (a) shows how multiple 3D buttons are displayed, and (b) shows how one of these 3D buttons is selected with a finger.
  • FIG. 4 is a diagram showing a display state of a plurality of 3D buttons. (a) shows a 3D display of only the active 3D buttons, and (b) shows the 3D objects displayed with shadows.
  • FIG. 5 is a diagram showing a display state of a 3D object. (a) shows a display example of the 3D object before the finger touches the touch panel, and (b) shows the state of the 3D object whose shape has changed after the finger touches the touch panel.
  • FIG. 1 is a block diagram showing a main configuration of a display device 1 according to an embodiment of the present invention.
  • the display device 1 includes a 3D display 2 (3D display), a touch panel 3 (input device, contact-type touch panel), a memory 4, and a control unit 10.
  • the 3D display 2 is hardware that displays information and presents it to the user.
  • it is realized as a liquid crystal display or plasma display.
  • The 3D display 2 provided in the display device 1 can display an image (object) three-dimensionally. Specifically, based on the principle of binocular parallax, so-called 3D glasses are not required, and a stereoscopic view can be obtained just by looking at the screen.
  • the touch panel 3 is hardware used by the user to select a 3D object displayed on the 3D display 2.
  • The touch panel 3 is arranged at a fixed distance from the 3D display 2. This point will be described later.
  • the memory 4 is hardware that stores various data used when the control unit 10 executes processing. This corresponds to a so-called nonvolatile memory or volatile memory.
  • the display image data is stored in the memory 4.
  • the control unit 10 is responsible for processing that the display device 1 executes internally. More specifically, the control unit 10 includes a detection unit 12 (detection unit), a display unit 11 (display unit), a touch determination unit 13 (determination unit), and a touch area calculation unit 14.
  • FIG. 2 is a diagram showing a positional relationship between the 3D display 2 and the touch panel 3.
  • a backlight 21 is provided on the back side of the 3D display 2.
  • the touch panel 3 includes a protection panel 23 and a touch sensor 22.
  • the touch sensor 22 outputs an electrical signal indicating the contact position on the touch panel 3 to the control unit 10.
  • the touch panel 3 shown in FIG. 2 is a so-called capacitive touch panel.
  • the touch panel 3 is not limited to the capacitance type, and may be a so-called resistance film type. In this case, the positions of the touch sensor and the protection panel are opposite to those of the capacitive type.
  • the 3D display 2 is arranged at a certain distance a from the touch panel 3. More specifically, the display surface of the 3D display 2 and the contact surface of the touch panel 3 are separated by a distance a.
  • the touch panel 3 is arranged on the front side as viewed from the user. That is, the 3D display 2 is arranged on the depth side as viewed from the user.
  • The 3D display 2 displays a 3D button 24 as shown in FIG. 2. At this time, the height of the 3D button 24 is made substantially equal to the distance a. As a result, the foremost surface of the 3D button 24 substantially coincides with the contact surface of the touch panel 3.
  • The user touches the touch panel 3 with the finger 25. At this time, since the foremost surface of the 3D button 24 and the contact surface of the touch panel 3 substantially coincide, when the user touches the touch panel 3, the user can feel as if he or she touched the 3D button 24 itself.
  • FIG. 3 is a diagram showing a display state of a plurality of 3D buttons.
  • (a) is a diagram showing a state in which 3D buttons 31 to 33 are displayed, and
  • (b) is a diagram showing a state in which one of these 3D buttons is selected with the finger 25.
  • The display unit 11 reads display data from the memory 4, and displays the objects included in the read data on the 3D display 2 as three-dimensional objects as shown in FIG. 3. In the example of FIG. 3(a), 3D buttons 31 to 33 are displayed. At this time, the height of each 3D button is made to substantially coincide with the distance a between the touch panel 3 and the 3D display 2. In other words, before an object is selected by the user (that is, before the determination result by the touch determination unit 13 becomes true), the object is displayed in 3D so that its foremost surface substantially coincides with the contact surface of the touch panel 3. As a result, when the user touches the touch panel 3, the user can feel as if he or she actually touched the object. Accordingly, the display device 1 can improve operability when selecting an object.
  • the display unit 11 can freely change the height of the 3D object as appropriate. In other words, the height of the displayed object does not necessarily match the distance a.
  • The user can select each of the three-dimensionally displayed 3D buttons 31 to 33 by touching the touch panel 3. Specifically, the user touches the touch panel 3 with the finger 25 at the place corresponding to the button.
  • the detection unit 12 outputs a scan signal to the touch panel 3 at a predetermined cycle. In response to this scan signal, the touch sensor 22 of the touch panel 3 outputs an electrical signal indicating the position of contact to the control unit 10.
  • The contact position on the touch panel 3 is detected by the detection unit 12 analyzing this electrical signal; for example, the x and y coordinates on the touch panel 3 are calculated.
  • The detection unit 12 outputs the calculated x and y coordinates to the touch determination unit 13.
  • The touch determination unit 13 determines whether or not the user has selected a 3D button based on the input x and y coordinates.
  • The display data stored in the memory 4 includes coordinate data specifying the display position on the 3D display 2 of each of the 3D buttons 31 to 33. Therefore, the touch determination unit 13 collates the coordinates indicating the detection position on the touch panel 3 calculated by the detection unit 12 with the display coordinates of each 3D button, thereby determining whether or not the detected contact position matches the display position of a 3D button.
  • the contact position of the touch panel 3 touched by the finger 25 corresponds to the display position of the 3D button 33.
  • the touch determination unit 13 determines that the user has selected the 3D button 33.
  • the touch determination unit 13 outputs the determination result to the display unit 11.
  • the touch determination unit 13 performs this determination, for example, at the predetermined cycle described above. Therefore, if the predetermined period is 100 milliseconds, every time 100 milliseconds elapse, it is determined whether or not the user has pressed the 3D button, and the result is output to the display unit 11.
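The periodic scan-determine-update cycle described above might be sketched as follows. The names `run_cycles`, `hit_test`, and `lower` are hypothetical stand-ins for the detection unit, the touch determination unit, and the display unit's state change; the function simply processes one detected touch sample per cycle (e.g., every 100 milliseconds).

```python
# Illustrative sketch of the periodic determination cycle: each cycle, the
# detected touch position (None = no touch) is tested against the buttons'
# display coordinates, and a matching button has its display state lowered.
def run_cycles(samples, hit_test, lower):
    """Process a sequence of per-cycle touch samples.

    samples: iterable of (x, y) positions or None when nothing touched.
    hit_test: maps (x, y) to a button or None (the determination).
    lower: called with the button whose display state should be lowered.
    Returns the per-cycle determination results (True/False).
    """
    results = []
    for pos in samples:
        if pos is None:
            results.append(False)
            continue
        button = hit_test(*pos)
        if button is not None:
            lower(button)            # change the display state to a lower one
            results.append(True)
        else:
            results.append(False)
    return results
```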
  • the display unit 11 changes the display state of each 3D button based on the determination result of the touch determination unit 13. Specifically, the display state of the 3D button whose determination result by the touch determination unit 13 is “true” is changed to a lower one. In the example of FIG. 3B, the display state of the 3D button 33 is changed to a 2D object 34 having a height of zero. That is, the display state is changed so that it is retracted to the display device 1 (3D display 2) side as seen from the user.
  • When the user selects the 3D button 33, the 3D button 33 appears to be retracted in accordance with the selection operation. In other words, the user witnesses the same phenomenon as when pressing a real button or switch. As a result, the change in the display from the 3D button 33 to the 2D object 34 is received as natural.
  • the display device 1 can perform display that gives a natural impression to the user when the user selects an object displayed in three dimensions.
  • the display unit 11 may three-dimensionally display only active objects among a plurality of objects (buttons) before the determination result by the touch determination unit 13 becomes true. This situation will be described below with reference to FIG.
  • FIG. 4 is a diagram showing a display state of a plurality of 3D buttons, and (a) shows a state in which only active ones of the plurality of buttons are three-dimensionally displayed.
  • The display unit 11 three-dimensionally displays, on the 3D display, only the active objects among the plurality of objects before the determination result by the touch determination unit 13 becomes true.
  • The active object here is an object that is set so that the display device 1 performs a specific process on it after it is selected. For example, if an object is a button for inputting a predetermined number, and selecting the button causes the number assigned to it to be input, that object corresponds to an active object.
  • In FIG. 4(a), the 3D button 31 and the 3D button 32 are active buttons. Therefore, the display unit 11 displays these buttons three-dimensionally. On the other hand, the 2D button 34 is not active and is therefore displayed two-dimensionally.
  • the display unit 11 displays only the active object in a three-dimensional manner.
  • An object that is not active, that is, an object for which the display device 1 is set to perform no processing even if it is selected, is not displayed in 3D regardless of the determination result. This allows the user to easily distinguish selectable objects from others. Furthermore, it is possible to prevent an object that cannot be selected from being selected by mistake.
  • Accordingly, the display device 1 can further improve the operability at the time of object selection by the user.
  • (Displaying the 3D object with a shadow)
  • the display unit 11 may display the 3D object with a shadow before the determination result by the touch determination unit 13 becomes true. This situation is explained below with reference to Fig. 4 (b).
  • FIG. 4(b) is a diagram showing a state in which the 3D buttons are displayed with shadows.
  • the display unit 11 displays a shadow 41 at a position where the active 3D button 31 projects onto the display surface of the 3D display 2.
  • a shadow 42 is displayed at a position where the active 3D button 32 projects onto the display surface of the 3D display 2.
  • the display unit 11 can change the display state of the object to a shape corresponding to the contact area on the touch panel 3 when the determination result by the touch determination unit 13 is true. An example of this is shown in FIG.
  • FIG. 5 is a diagram showing a display state of the 3D object.
  • (A) shows a display example of a 3D object before the finger 25 touches the touch panel 3
  • the display unit 11 displays a 3D object 51 that is a sphere having a radius a on the 3D display 2.
  • This 3D object 51 is preliminarily set in the display data in the memory 4 and is subject to shape change.
  • Touch panel 3 is a panel that can detect multiple points. That is, an electrical signal indicating a plurality of coordinates corresponding to each position where contact has occurred is output to the control unit 10.
  • the detection unit 12 detects all coordinates corresponding to the position where the touch panel 3 is in contact. Thereby, all the coordinate information indicating the detected coordinates is output to both the touch determination unit 13 and the touch area calculation unit 14.
  • The touch determination unit 13 compares the input coordinate information with the coordinate information of the display position of the 3D object 51. In the example of FIG. 5(b), the determination result by the touch determination unit 13 becomes true by this comparison. Therefore, the touch determination unit 13 notifies the touch area calculation unit 14 that the 3D object subject to shape change has been touched.
  • the touch area calculation unit 14 calculates the area where the finger 25 touches the touch panel 3. At this time, the coordinate information input from the detection unit 12 is used. As a method for calculating the area, a known technique can be appropriately used.
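One simple such technique (an assumption here; the document leaves the method open) is to count the distinct sensor cells reporting contact, since the multi-point touch panel outputs a coordinate for each position where contact occurred, and multiply by the area of one cell.

```python
# Illustrative area estimate for a multi-point touch panel: each reported
# (x, y) coordinate is treated as one contacted sensor cell, and the contact
# area is the number of distinct cells times the area of a single cell.
def contact_area(points, cell_area=1.0):
    """points: iterable of (x, y) sensor-cell coordinates in contact."""
    return len(set(points)) * cell_area
```

More elaborate estimates (e.g., the area of the convex hull of the points) are equally compatible with the description.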
  • the touch area calculation unit 14 outputs the calculated area to the display unit 11 and notifies the display unit 11 to change the display state of the 3D object 51 to a display corresponding to the area.
  • the display unit 11 changes the display state of the 3D object 51 to a shape corresponding to the contact area of the finger 25 on the touch panel 3.
  • the contact area is regarded as a pressing pressure
  • the 3D object 51 is displayed in a deformed shape corresponding to the pressure, that is, an elliptical sphere.
  • Here, the 3D object 51 is a sphere having a radius a and a volume S.
  • The display unit 11 changes the shape of the 3D object 51 to a lower one while keeping its volume S constant. In accordance with the lowered height, the shape of the 3D object 51 is changed to an elliptically spherical shape, that is, a shape spreading in the plane direction. As a result, as shown in FIG. 5(b), the display unit 11 replaces the spherical 3D object 51 with an elliptical sphere having a vertical semi-axis (height) of a/2, a planar radius of √2·a, and a volume of S.
  • the 3D object 52 is displayed on the 3D display 2.
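The volume-preserving deformation above can be checked with a short sketch (the function name `squash_sphere` and the `height_scale` parameter are illustrative): for a sphere of radius a whose vertical semi-axis is squashed to a/2, the planar radius must grow to √2·a so that the ellipsoid volume (4/3)·π·r²·h equals the sphere volume (4/3)·π·a³.

```python
# Illustrative sketch: squash a sphere of radius a by height_scale while
# keeping the volume constant, returning the resulting ellipsoid's vertical
# semi-axis and planar radius.
import math

def squash_sphere(a, height_scale):
    """Return (vertical_semi_axis, planar_radius) of a volume-preserving
    ellipsoid obtained by squashing a sphere of radius a."""
    h = a * height_scale              # e.g. a/2 when height_scale = 0.5
    r = a / math.sqrt(height_scale)   # so that (4/3)*pi*r^2*h == (4/3)*pi*a^3
    return h, r
```

With `height_scale = 0.5` this reproduces the figures in the text: height a/2 and planar radius √2·a.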
  • the display device 1 can give the user a stronger impression that the object has been selected.
  • the display unit 11 changes the display state of the 3D object to a height corresponding to the contact area on the touch panel 3 when the determination result by the touch determination unit 13 is true.
  • In the above configuration, the display unit 11 changes the display state of the 3D object to a height corresponding to the contact area on the touch panel 3. For example, if the 3D object is a 3D button, the larger the contact area, the lower the button height is made. In other words, the contact area is regarded as pressing pressure, and the dent of the button is made proportional to the pressure.
  • the display device 1 gives the impression that the object is surely selected.
  • The above-mentioned "before the determination result by the touch determination unit 13 becomes true" more strictly means the period from when the determination result by the touch determination unit 13 becomes false until it becomes true. This is the period during which the user does not select the 3D object; during this period, the display device 1 makes the height of the 3D object substantially coincide with the distance a between the touch panel 3 and the 3D display 2, displays only the active 3D objects in 3D, or displays the shadow of the 3D object.
  • the touch panel 3 is a so-called non-contact type touch panel, and may detect an input position by a user using an optical sensor or electromagnetic induction. Further, when a non-contact type touch panel is used as the touch panel 3, a panel arranged in parallel with the 3D display 2 may be further provided at a position spaced apart from the 3D display 2.
  • the panel is arranged at the input position detected by the non-contact touch panel. Therefore, when the user touches the panel, the non-contact touch panel detects the input by the user, and the height of the corresponding object becomes lower. As a result, the user can feel as if he / she actually touched the object. In the non-contact type touch panel, the input area when the user's input process is accepted can be obtained.
  • The display unit 11 may change the display state of the 3D object and, at the same time, a sound processing unit (not shown) may output a specific sound through a speaker (not shown). This allows the user to confirm through hearing that the 3D object has been selected.
  • the vibration processing unit may vibrate the display device 1 by vibrating a vibrator (not shown). This allows the user to confirm through tactile sense that the 3D object has been selected.
  • Alternatively, a specific scent may be emitted from a scent processing unit (not shown) or a scent storage (not shown), for example by crushing a scent ball. This allows the user to confirm through the sense of smell that the 3D object has been selected.
  • each block included in the display device 1 may be configured by hardware logic. Alternatively, it may be realized by software using a CPU (Central Processing Unit) as follows.
  • In the latter case, the display device 1 includes a CPU that executes the instructions of a control program realizing each function, a ROM (Read Only Memory) that stores the program, a RAM (Random Access Memory) into which the program is expanded, and a storage device (recording medium) such as a memory that stores the program and various data.
  • the object of the present invention can also be achieved by a predetermined recording medium.
  • This recording medium records, in a computer-readable manner, the program code (executable program, intermediate code program, or source program) of the control program of the display device 1, which is software that realizes the above-mentioned functions.
  • This recording medium is supplied to the display device 1, which allows the display device 1 (or its CPU or MPU) as a computer to read and execute the program code recorded on the supplied recording medium.
  • The recording medium that supplies the program code to the display device 1 is not limited to a specific structure or type. For example, it may be a tape system such as a magnetic tape or cassette tape; a disk system including magnetic disks such as a floppy disk or hard disk and optical disks such as CD-ROM, MO, MD, DVD, or CD-R; a card system such as an IC card (including a memory card) or an optical card; or a semiconductor memory system such as a mask ROM, EPROM, EEPROM, or flash ROM.
  • Alternatively, if the display device 1 is configured to be connectable to a communication network, the object of the present invention can also be achieved.
  • the program code is supplied to the display device 1 via the communication network.
  • the communication network is not limited to a specific type or form as long as it can supply the program code to the display device 1.
  • For example, the Internet, an intranet, an extranet, a LAN, an ISDN, a VAN, a CATV communication network, a virtual private network, a telephone line network, a mobile communication network, a satellite communication network, or the like may be used.
  • the transmission medium constituting the communication network is not limited to a specific configuration or type as long as it is an arbitrary medium capable of transmitting the program code.
  • For example, wired media such as IEEE 1394, USB (Universal Serial Bus), power line carrier, cable TV lines, telephone lines, and ADSL (Asymmetric Digital Subscriber Line) lines, or wireless media such as infrared (e.g., IrDA or remote control), Bluetooth (registered trademark), IEEE 802.11 radio, HDR, mobile telephone networks, satellite lines, and terrestrial digital networks can be used.
  • The present invention can also be realized in the form of a computer data signal embedded in a carrier wave, in which the program code is embodied by electronic transmission.
  • As described above, the display device according to the present invention includes determination means for determining whether an object is displayed at the position on the three-dimensional display corresponding to the input position on the input device, and display means for lowering the displayed height of the object when the result of the determination is true.
  • Therefore, when the user selects an object displayed in three dimensions, a display that gives a natural impression to the user can be performed.
  • The present invention can be used as a display device capable of selecting an object displayed three-dimensionally through a touch panel.
  • For example, it can be applied to various uses such as bank ATMs and home game machines.
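The mechanism these points summarize — determine whether an object is displayed at the position corresponding to the touch input, and if so lower its displayed height — can be sketched in Python. This is a minimal illustration of the described behavior, not the patent's implementation: the class names, the rectangular hit test, and the halving factor are all assumptions made for the example.

```python
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Object3D:
    """A three-dimensionally displayed object occupying a rectangular
    panel region, shown protruding by `display_height`."""
    x: int
    y: int
    width: int
    depth: int
    display_height: float = 1.0  # apparent height above the touch panel

    def contains(self, px: int, py: int) -> bool:
        """Hit test: is the contact position inside this object's region?"""
        return (self.x <= px < self.x + self.width
                and self.y <= py < self.y + self.depth)


class DisplayDevice:
    """Hypothetical model of the display device: a detection section
    reports a contact position, a touch determination section checks
    whether an object is displayed there, and a display section lowers
    the object's displayed height when the determination is true."""

    def __init__(self, objects: List[Object3D]) -> None:
        self.objects = objects

    def touch_determination(self, px: int, py: int) -> Optional[Object3D]:
        """Return the object displayed at the position corresponding to
        the contact position, or None if no object is displayed there."""
        for obj in self.objects:
            if obj.contains(px, py):
                return obj
        return None

    def on_touch(self, px: int, py: int) -> bool:
        """Handle a contact position reported by the detection section.
        Returns True when an object was touched and visually pressed down."""
        obj = self.touch_determination(px, py)
        if obj is None:
            return False
        obj.display_height *= 0.5  # lower the object toward the panel
        return True
```

With a single object occupying the region from (10, 10) with width 50 and depth 20, a touch at (20, 15) returns True and halves the object's displayed height, while a touch at (200, 200) leaves every object unchanged.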

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A display device (1) three-dimensionally displays an object on a three-dimensional display (2) arranged at a certain distance from a touch panel (3). In the display device (1), a detection section (12) detects a contact position on the touch panel (3). A touch determination section (13) determines whether or not the three-dimensional object is displayed at the position on the three-dimensional display (2) corresponding to the contact position detected by the detection section (12). When the result of the determination by the touch determination section (13) is true, a display section (11) lowers the displayed height of the three-dimensional object. As a result, the display gives the user a natural impression when the user selects the three-dimensionally displayed object.
PCT/JP2007/064565 2006-11-22 2007-07-25 Display device, display method, display program, and recording medium WO2008062586A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2006316323 2006-11-22
JP2006-316323 2006-11-22

Publications (1)

Publication Number Publication Date
WO2008062586A1 true WO2008062586A1 (fr) 2008-05-29

Family

ID=39429527

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2007/064565 WO2008062586A1 (fr) 2006-11-22 2007-07-25 Display device, display method, display program, and recording medium

Country Status (1)

Country Link
WO (1) WO2008062586A1 (fr)

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011013778A (ja) * 2009-06-30 2011-01-20 Sony Corp Stereoscopic image display device, object proximity detection device, and electronic apparatus
JP2011118907A (ja) * 2010-12-20 2011-06-16 Fujifilm Corp Device, method, and program for setting an instructed position during three-dimensional display
EP2339435A1 (fr) * 2008-08-27 2011-06-29 FUJIFILM Corporation Device and method for setting instructed position during three-dimensional display, as well as program
JP2012098929A (ja) * 2010-11-02 2012-05-24 Canon Inc Display control device and display control method
WO2012105703A1 (fr) * 2011-02-04 2012-08-09 Sharp Kabushiki Kaisha Display device, display image generation method, program, and recording medium
WO2012132749A1 (fr) * 2011-03-31 2012-10-04 Fujifilm Corporation Stereoscopic display device, instruction acceptance method, program, and recording medium therefor
JP2012194760A (ja) * 2011-03-16 2012-10-11 Canon Inc Image processing apparatus, control method therefor, and program
JP2012252386A (ja) * 2011-05-31 2012-12-20 Ntt Docomo Inc Display device
EP2175351A3 (fr) * 2008-10-10 2013-07-10 Sony Corporation Information processing apparatus, system, method, and program
CN103294387A (zh) * 2012-02-23 2013-09-11 HTC Corporation Stereoscopic imaging system and method thereof
JP5781080B2 (ja) * 2010-10-20 2015-09-16 Mitsubishi Electric Corporation Three-dimensional stereoscopic display device and three-dimensional stereoscopic display processing device
JP2016042391A (ja) * 2015-12-24 2016-03-31 Kyocera Corporation Display apparatus
EP2585900A4 (fr) * 2010-06-24 2016-07-06 Nokia Technologies Oy Apparatus and method for proximity-based input

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08511365A (ja) * 1993-04-28 1996-11-26 McPheters, R. Douglas Holographic operator interface
JPH10171600A (ja) * 1996-12-06 1998-06-26 Brother Ind Ltd Input device
JP2002351352A (ja) * 2001-05-28 2002-12-06 Hitachi Electronics Service Co Ltd Thin display device
JP2005316790A (ja) * 2004-04-30 2005-11-10 Nippon Telegr & Teleph Corp <Ntt> Information input method, information input/output device, and program

Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8482557B2 (en) 2008-08-27 2013-07-09 Fujifilm Corporation Device and method for setting instructed position during three-dimensional display, as well as program
EP2339435A1 (fr) * 2008-08-27 2011-06-29 FUJIFILM Corporation Device and method for setting instructed position during three-dimensional display, as well as program
EP2339435A4 (fr) * 2008-08-27 2012-06-06 Fujifilm Corp Device and method for setting instructed position during three-dimensional display, as well as program
EP2175351A3 (fr) * 2008-10-10 2013-07-10 Sony Corporation Information processing apparatus, system, method, and program
JP2011013778A (ja) * 2009-06-30 2011-01-20 Sony Corp Stereoscopic image display device, object proximity detection device, and electronic apparatus
EP2585900A4 (fr) * 2010-06-24 2016-07-06 Nokia Technologies Oy Apparatus and method for proximity-based input
JP5781080B2 (ja) * 2010-10-20 2015-09-16 Mitsubishi Electric Corporation Three-dimensional stereoscopic display device and three-dimensional stereoscopic display processing device
JP2012098929A (ja) * 2010-11-02 2012-05-24 Canon Inc Display control device and display control method
JP2011118907A (ja) * 2010-12-20 2011-06-16 Fujifilm Corp Device, method, and program for setting an instructed position during three-dimensional display
WO2012105703A1 (fr) * 2011-02-04 2012-08-09 Sharp Kabushiki Kaisha Display device, display image generation method, program, and recording medium
JP2012194760A (ja) * 2011-03-16 2012-10-11 Canon Inc Image processing apparatus, control method therefor, and program
CN103460257A (zh) * 2011-03-31 2013-12-18 Fujifilm Corporation Stereoscopic display device, method for accepting instructions, program, and medium recording the same
EP2693405A1 (fr) * 2011-03-31 2014-02-05 FUJIFILM Corporation Stereoscopic display device, instruction acceptance method, program, and recording medium therefor
JPWO2012132749A1 (ja) * 2011-03-31 2014-07-24 Fujifilm Corporation Stereoscopic display device, instruction acceptance method, program, and recording medium therefor
EP2693405A4 (fr) * 2011-03-31 2014-09-03 Fujifilm Corp Stereoscopic display device, instruction acceptance method, program, and recording medium therefor
JP5693708B2 (ja) * 2011-03-31 2015-04-01 Fujifilm Corporation Stereoscopic display device, instruction acceptance method, program, and recording medium therefor
WO2012132749A1 (fr) * 2011-03-31 2012-10-04 Fujifilm Corporation Stereoscopic display device, instruction acceptance method, program, and recording medium therefor
CN106896952A (zh) * 2011-03-31 2017-06-27 Fujifilm Corporation Stereoscopic display device and method for accepting instructions
US9727229B2 (en) 2011-03-31 2017-08-08 Fujifilm Corporation Stereoscopic display device, method for accepting instruction, and non-transitory computer-readable medium for recording program
JP2012252386A (ja) * 2011-05-31 2012-12-20 Ntt Docomo Inc Display device
CN103294387A (zh) * 2012-02-23 2013-09-11 HTC Corporation Stereoscopic imaging system and method thereof
JP2016042391A (ja) * 2015-12-24 2016-03-31 Kyocera Corporation Display apparatus

Similar Documents

Publication Publication Date Title
WO2008062586A1 (fr) Display device, display method, display program, and recording medium
US20200409529A1 (en) Touch-free gesture recognition system and method
US10203764B2 (en) Systems and methods for triggering actions based on touch-free gesture detection
JP6463795B2 (ja) System and method for using textures in a graphical user interface device
JP5446769B2 (ja) Three-dimensional input display device
US11656711B2 (en) Method and apparatus for configuring a plurality of virtual buttons on a device
TWI360071B (en) Hand-held device with touchscreen and digital tact
US9015584B2 (en) Mobile device and method for controlling the same
CN105589594B (zh) Electronic device and operation control method of the electronic device
JP5694204B2 (ja) System and method for using textures in a graphical user interface device
CN103502923B (zh) Touch-based and non-touch-based interaction between a user and a device
JP2018063700A (ja) Contextual pressure-sensing haptic responses
KR20140126129A (ko) Apparatus and method for controlling locking and unlocking
JP6012068B2 (ja) Electronic apparatus, control method therefor, and program
KR101019254B1 (ko) Terminal device having spatial projection and spatial touch functions, and control method therefor
EP2075671A1 (fr) Portable device user interface and method of operating the same
CN103097989A (zh) Information processing device, processing control method, program, and recording medium
CN102981743A (zh) Method for controlling an operation object, and electronic device
JP2017111462A (ja) Tactile sensation presentation device and tactile sensation presentation method
US20140320440A1 (en) Tactile sensation providing device
JP2018113025A (ja) Systems and methods for compliance illusions with haptics
EP2423786B1 (fr) Information processing apparatus, stereoscopic display method, and program
CN102760004B (zh) Method and device for controlling a data display state
EP3652613B1 (fr) Improved transmission of haptic input
JP2014074988A (ja) Display device and display method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 07791281

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 07791281

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP