WO2012144559A1 - Display control device and display control method - Google Patents


Info

Publication number
WO2012144559A1
Authority
WO
WIPO (PCT)
Prior art keywords
display
display surface
mode
unit
indicator
Prior art date
Application number
PCT/JP2012/060609
Other languages
English (en)
Japanese (ja)
Inventor
真治 木村
通子 板橋
山崎 仁史
哲平 小西
仁嗣 川崎
剛 神山
悠 菊地
信三 大久保
稲村 浩
Original Assignee
株式会社エヌ・ティ・ティ・ドコモ
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社エヌ・ティ・ティ・ドコモ
Publication of WO2012144559A1


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus

Definitions

  • the present invention relates to display control when an operation on a display surface is performed using an indicator such as a finger.
  • Patent Document 1 discloses an input device that enlarges and displays the currently displayed content when a finger approaches the touch panel.
  • Patent Documents 2 and 3 also disclose similar display control.
  • In view of this background, an object of the present invention is to control the display mode in accordance with the operation performed as the indicator approaches the display surface, in a case where the user's operation on the display surface is detected three-dimensionally.
  • To achieve this object, the present invention provides a display control device comprising: operation means for repeatedly detecting the position of an indicator relative to a display surface and receiving operations, the operation means detecting both a contact operation, in which the indicator touches the display surface, and a proximity operation, in which the indicator is brought close to the display surface without touching it; discrimination means for discriminating an operation mode relating to the display mode of the display surface based on the transition of the proximity operation detected by the operation means; specifying means for specifying the display mode of an image displayed on the display surface according to the operation mode discriminated by the discrimination means and the position of the indicator detected by the operation means; and display control means for displaying an image on the display surface in the display mode specified by the specifying means.
  • In a preferred aspect, the determination means determines the operation mode based on a residence time during which the three-dimensional position of the indicator, indicated by the proximity operation detected by the operation means, is included in a predetermined region. In another preferred aspect, the determination means determines that the operation mode is a predetermined mode when the residence time exceeds a predetermined threshold, and when the operation mode determined by the determination means is that predetermined mode, the specifying means specifies a display mode that highlights part of the image displayed on the display surface. In still another preferred aspect, the specifying means specifies the range of highlighting on the display surface from the orthogonally projected coordinates of the indicator's position on the display surface and from the distance of that position from the display surface.
  • In still another preferred aspect, the specifying means specifies a magnification at which the image displayed on the display surface is enlarged. In still another preferred aspect, the specifying means specifies a pop-out amount when the image displayed on the display surface is displayed stereoscopically. In still another preferred aspect, the display control means cancels the highlight display when a contact operation is detected by the operation means after the image has been highlighted, and does not highlight the image again until the position detected by the operation means leaves the predetermined region. In still another preferred aspect, the determination means calculates, as the residence time, the time from when the position detected by the operation means enters the determination region until either that position is detected as a contact operation or it leaves the determination region.
  • the determination unit determines the operation mode based on a moving speed of the indicator indicated by the proximity operation detected by the operation unit.
  • In still another preferred aspect, the specifying means specifies the display mode using control data stored in storage means, according to the operation mode determined by the determination means and the position of the indicator detected by the operation means.
  • In still another preferred aspect, the storage means stores the control data per application, and the specifying means specifies the display mode using the control data for the application corresponding to the image displayed on the display surface.
  • In still another preferred aspect, the display control device includes filter means that applies noise-reducing filter processing to the coordinate information representing the position detected by the operation means, and the determination means discriminates the operation mode based on the filtered coordinate information.
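As a concrete illustration of such filter processing, a simple moving average over recent coordinate samples could be used. This is only a sketch under stated assumptions: the patent does not prescribe a particular filter algorithm, and the class and parameter names below are invented for illustration.

```python
from collections import deque

class CoordinateFilter:
    """Reduce sensor noise by averaging the last `window` position samples.

    A moving average is just one possible realization of the filter means;
    the patent text does not specify the algorithm.
    """
    def __init__(self, window=5):
        self.samples = deque(maxlen=window)

    def update(self, x, y, z):
        """Add a raw (x, y, z) sample and return the filtered position."""
        self.samples.append((x, y, z))
        n = len(self.samples)
        return tuple(sum(s[i] for s in self.samples) / n for i in range(3))
```

The discrimination logic would then operate on the filtered coordinates instead of the raw sensor output.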
  • the operation means includes a first sensor that detects the contact operation and a second sensor that detects the proximity operation.
  • the operation means includes a single sensor that detects the contact operation and the proximity operation.
  • The present invention also provides a display control method comprising: a first step of repeatedly detecting the position of an indicator relative to a display surface and receiving operations, detecting both a contact operation in which the indicator touches the display surface and a proximity operation in which the indicator is brought close to the display surface without touching it; a second step of discriminating an operation mode relating to the display mode of the display surface based on the transition of the proximity operation detected in the first step; a third step of specifying the display mode of an image displayed on the display surface according to the operation mode discriminated in the second step and the position of the indicator detected in the first step; and a fourth step of displaying an image on the display surface in the display mode specified in the third step.
  • According to the present invention, when the user's operation on the display surface is detected three-dimensionally, the display mode can be controlled according to the operation performed as the indicator approaches the display surface.
  • FIG. 1 shows the appearance of the display device; FIG. 2 is a block diagram showing the hardware configuration of the display device; FIG. 3 is a functional block diagram showing the functional configuration of the control unit; FIG. 4 shows an example of the display control table; FIG. 5 is a flowchart showing the display control realized by the control unit; FIG. 6 shows the relationship between the detectable area and the discrimination area; FIG. 7 shows display examples for each operation mode. Further figures show a flowchart of the display control in the second embodiment and a block diagram of the display device in the third embodiment.
  • DESCRIPTION OF SYMBOLS: 100, 100a: display device; 101: display surface; 110: control unit; 111: data acquisition unit; 112: discrimination unit; 113: specifying unit; 114: display control unit; 120: storage unit; 130: display unit; 140: operation unit; 141: contact sensor; 142: proximity sensor; 150: communication unit; 160: filter unit.
  • the present invention is characterized by display control based on the characteristics of an operation when an indicator approaches the display surface without touching the display surface in a display device that three-dimensionally detects a user operation on the display surface.
  • the indicator refers to a body part or an instrument for the user to perform an operation pointing to the display surface, and is, for example, a user's finger or a stylus (stylus pen).
  • The present invention focuses on this characteristic of the user's operation and switches the display mode depending on whether the operation exhibits it. Specifically, the present invention judges the movement of the indicator from the time or the speed at which the indicator is detected, and makes the display mode of the display surface differ from normal when the user's operation satisfies a predetermined condition. This improves operability on the display surface.
  • A display mode different from normal highlights an image, for example by enlarging the image the user intends to indicate or by displaying it stereoscopically. Such highlighting makes it easier for the user to designate an object such as a button.
  • FIG. 1 is a diagram illustrating an appearance of a display device 100 according to an embodiment of the present invention.
  • the display device 100 is an information processing device that allows a user to operate the display surface 101.
  • the display surface 101 is a surface for displaying an image and a surface for receiving a user operation.
  • the display device 100 is configured to detect the position of the fingertip of the user's hand. That is, the indicator that is the detection target of the present embodiment is the user's finger.
  • the surface shown in FIG. 1, that is, the surface having the display surface 101 will be referred to as “front” hereinafter.
  • FIG. 2 is a block diagram showing a hardware configuration of the display device 100.
  • the display device 100 includes a control unit 110, a storage unit 120, a display unit 130, an operation unit 140, and a communication unit 150.
  • the display unit 130 and the operation unit 140 may be configured integrally or may be separated.
  • the communication unit 150 is not an essential component for the present invention.
  • the control unit 110 is a means for controlling the operation of each unit of the display device 100.
  • the control unit 110 includes an arithmetic processing device such as a CPU (Central Processing Unit) and a storage device such as a ROM (Read Only Memory) and a RAM (Random Access Memory), and executes a program stored in the ROM or the storage unit 120.
  • the storage unit 120 is means for storing data used by the control unit 110 for control.
  • the storage unit 120 is configured by a hard disk or a flash memory.
  • the storage unit 120 may include so-called removable media, that is, removable storage means.
  • the storage unit 120 stores a display control table to be described later.
  • the display unit 130 is a means for displaying an image on the display surface 101.
  • the display unit 130 includes a display panel that displays an image using a liquid crystal element or an organic EL (electroluminescence) element, a drive circuit that drives the display panel, and the like.
  • the display surface 101 is a rectangle, and pixels are arranged in a matrix along the short side and the long side of the rectangle.
  • a three-dimensional orthogonal coordinate system having an appropriate position (here, the upper left corner) as the origin O is defined for the display surface 101.
  • the X axis is defined in the long side direction of the display surface 101
  • the Y axis is defined in the short side direction of the display surface 101
  • the Z axis is defined in the direction orthogonal to the X axis and the Y axis.
  • the Z-axis is a coordinate axis with the surface of the display surface 101 as the origin and the direction toward the user (upward in FIG. 1) as the positive direction.
  • the operation unit 140 is a means for receiving user operations.
  • the operation unit 140 includes a sensor for detecting the position of the user's fingertip with respect to the display surface 101 and supplying coordinate information representing the position to the control unit 110. More specifically, the operation unit 140 includes a contact sensor 141 and a proximity sensor 142.
  • the contact sensor 141 is a sensor for detecting a state in which the user's fingertip is in contact with the display surface 101.
  • the contact sensor 141 can be realized by, for example, a known touch screen (also referred to as a touch panel).
  • the proximity sensor 142 is a sensor for detecting a state in which the user's fingertip is close to the display surface 101.
  • the proximity sensor 142 may also be a sensor using a well-known technique.
  • the proximity sensor 142 is realized by detecting the capacitance of the fingertip or optically detecting the position of the fingertip.
  • the contact sensor 141 two-dimensionally detects the position of the surface of the display surface 101 where the user's fingertip has contacted. Therefore, the coordinate information supplied by the contact sensor 141 is coordinate information representing coordinates in the X-axis direction (X coordinate) and coordinates in the Y-axis direction (Y coordinate).
  • the proximity sensor 142 three-dimensionally detects the position of the user's fingertip that is in close proximity without touching the display surface 101. Therefore, the coordinate information supplied by the proximity sensor 142 is coordinate information representing the coordinates in the respective directions of the X axis, the Y axis, and the Z axis.
  • the operation specified by the coordinate information supplied by the contact sensor 141 is referred to as “contact operation”
  • the operation specified by the coordinate information supplied by the proximity sensor 142 is referred to as “proximity operation”.
  • the proximity sensor 142 has a limited range in which the user's fingertip can be detected (hereinafter referred to as “detectable region”).
  • the detectable area is determined by the hardware performance of the proximity sensor 142.
  • “Proximity” as used in the present embodiment refers to a state in which the fingertip position (particularly its position in the Z-axis direction) is within this detectable region, and more specifically to a state in which it is included in the determination region described later.
  • the contact sensor 141 and the proximity sensor 142 detect the position of the indicator at a predetermined sampling rate. That is, the contact sensor 141 and the proximity sensor 142 repeatedly and continuously detect the position of the indicator.
  • the proximity sensor 142 may temporarily lower the sampling rate when the user's finger is not detected for a predetermined time or longer, and return to the original sampling rate when the finger is detected again.
  • The contact sensor 141 may become active only after the proximity sensor 142 has detected a finger, because a finger necessarily comes close to the display surface 101 before it touches it.
  • the operation unit 140 may be a unit that accepts not only operations on the display surface 101 but also other operations.
  • the display device 100 may include a physical key such as a button or a switch on a portion other than the display surface 101 (including a side surface and a back surface), and the operation unit 140 may be configured to accept an operation using the physical key.
  • the operation unit 140 supplies operation information representing an operation with a physical key to the control unit 110.
  • the physical key here may be a so-called QWERTY keyboard or numeric keypad.
  • the communication unit 150 is a means for communicating with an external device.
  • the communication unit 150 includes, for example, an antenna and a network adapter.
  • the communication unit 150 may communicate with an external device via a network such as the Internet or a mobile communication network, but may communicate directly with an external device without using a network, such as near field communication.
  • the communication by the communication unit 150 is wireless communication here, but may be wired communication.
  • the hardware configuration of the display device 100 is as described above. Based on this configuration, the display device 100 communicates with an external device and displays an image corresponding to the process being executed on the display surface 101. At this time, the display device 100 detects the proximity operation and the contact operation, and detects the position of the user's fingertip. When detecting the proximity operation, the display device 100 changes the display mode of the image as necessary according to the transition of the detected position of the fingertip.
  • the display device 100 enlarges and displays an image when the fingertip stays in the detectable region for a predetermined time or more and the contact operation is still not detected.
  • The display device 100 may enlarge only part of the image displayed on the display surface 101 (for example, the image near the fingertip position), or it may enlarge the entire displayed image. In the latter case, however, it is desirable to control the display position of the image so that the position the user intends to indicate is not hidden by the enlarged display.
  • Some images may be excluded from enlarged display even when the user's operation satisfies the predetermined condition.
  • These display modes are defined in advance and can be set by the user according to preference.
  • The display mode may also differ for each running application (the application corresponding to the image currently displayed on the display surface 101).
  • FIG. 3 is a functional block diagram showing a part related to image display control in the functional configuration of the control unit 110.
  • the control unit 110 executes functions corresponding to the data acquisition unit 111, the determination unit 112, the specifying unit 113, and the display control unit 114 illustrated in FIG. These functions may be realized by either an OS (Operating System) or specific software that controls image display, or may be realized by cooperation of a plurality of software.
  • The data acquisition unit 111 is a means for acquiring the data necessary for image display: specifically, image data representing an image to be displayed on the display surface 101 and the coordinate information supplied by the operation unit 140. The image data may be stored in advance in the storage unit 120 (for example, image data used when executing a specific application) or received by the communication unit 150 from an external device (for example, web page data to be displayed by a browser).
  • the discriminating unit 112 is a unit that discriminates the operation mode related to the display mode of the display surface 101 based on the coordinate information supplied from the proximity sensor 142, that is, the proximity operation detected by the proximity sensor 142.
  • The operation mode determines how images are displayed, that is, whether or not an image is highlighted.
  • the operation mode is determined in more detail according to the coordinate (Z coordinate) in the Z-axis direction of the coordinate information supplied from the proximity sensor 142, that is, the distance from the display surface 101 of the position of the fingertip.
  • these operation modes are described in the display control table. Therefore, the determination unit 112 performs the determination by referring to the display control table stored in the storage unit 120.
  • the determination unit 112 determines the operation mode based on the time during which the position of the fingertip detected by the proximity sensor 142 is included in the predetermined area, that is, the time during which the fingertip remains in the predetermined area.
  • The predetermined area is contained in the detectable area described above and may coincide with it; in the present embodiment, however, it is assumed to be smaller than the detectable area. This area is hereinafter referred to as the “discrimination area”.
  • the detectable area is an area defined in hardware, while the discrimination area is an area defined in software.
  • the time measured by the determination unit 112 to determine whether or not to perform enlarged display is hereinafter referred to as “residence time”.
  • The residence time is preferably the time from when the fingertip position detected by the proximity sensor 142 enters the determination area until either its Z coordinate becomes 0 (that is, the fingertip touches the display surface 101) or the position leaves the determination area; in other words, the time during which the fingertip position remains continuously within the determination area.
  • As long as any interval during which the fingertip leaves the determination area is very short, that interval may be ignored and the residence time may continue to be accumulated.
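The residence-time rule above (accumulate while inside the determination area, reset on exit, but forgive very short exits) can be sketched as follows. The class name, millisecond time base, and grace period are illustrative assumptions, not values from the patent.

```python
class ResidenceTimer:
    """Accumulates how long the fingertip stays in the determination area.

    Exits shorter than `grace_ms` are ignored, as the text suggests; all
    names and the grace value are illustrative assumptions.
    """
    def __init__(self, grace_ms=100):
        self.grace_ms = grace_ms
        self.residence_ms = 0   # accumulated dwell time
        self.outside_ms = 0     # time spent outside since the last inside sample

    def tick(self, dt_ms, in_area):
        """Advance the timer by dt_ms; in_area says whether the fingertip is inside."""
        if in_area:
            self.residence_ms += dt_ms
            self.outside_ms = 0
        else:
            self.outside_ms += dt_ms
            if self.outside_ms > self.grace_ms:  # left for good: reset the dwell time
                self.residence_ms = 0
        return self.residence_ms
```

The discrimination unit would compare the returned value against the threshold to decide whether to switch to the highlighted display mode.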
  • FIG. 4 is a diagram showing an example of the display control table.
  • the determination unit 112 refers to a table corresponding to the time during which the position of the fingertip detected by the proximity sensor 142 continues to be included in the determination region (that is, the residence time).
  • The determination unit 112 refers to table 4 and selects an operation mode according to it when the residence time is less than a predetermined threshold; when the residence time is equal to or greater than the threshold, it refers to one of tables 1 to 3 and selects an operation mode according to the referenced table. Which of tables 1 to 3 is referred to may be determined in advance by the user, or may be determined by the application corresponding to the image to be enlarged.
  • tables 1 to 3 are examples of tables used in the proximity operation effective mode.
  • the table 4 is an example of a table used in the proximity operation invalid mode.
  • Z represents the Z coordinate
  • Th1 to Th4 represent threshold values set for the Z coordinate.
  • Table 1 is a table showing a display mode in which enlarged display is not performed when Z> Th4, and enlarged display is performed at a constant magnification (m times) when 0 ⁇ Z ⁇ Th4.
  • When the display device 100 performs enlarged display using table 1, the image displayed on the display surface 101 switches to a constant magnification from a certain point in time onward, and even if the fingertip position changes slightly, the magnification does not change as long as the position remains within the discrimination area. The magnification m is an appropriate value satisfying m > 1.
  • Table 2 is the same as Table 1 in that the enlarged display is not performed when Z> Th4, but the magnification of the enlarged display changes according to the magnitude of the value of Z when 0 ⁇ Z ⁇ Th4. It is a table which shows an aspect.
  • When the display device 100 performs enlarged display using table 2, the image displayed on the display surface 101 is enlarged from a certain point in time onward, and the magnification increases step by step as the finger is brought closer to the display surface 101.
  • the table 3 is the same as the table 2 in that the enlarged display is not displayed when Z> Th4 and the magnification of the enlarged display changes depending on the value of Z when 0 ⁇ Z ⁇ Th4.
  • the aspect of change in magnification is different from that in Table 2.
  • When the display device 100 performs enlarged display using table 3, the image displayed on the display surface 101 is displayed larger the closer Z is to Th4, and as Z approaches 0 (that is, as the fingertip approaches the display surface 101) the magnification approaches 1 (equal magnification), so that the image approaches its original size. Such a magnification is represented by the product of Z and n, where n is an appropriate value satisfying n > 0. The change in magnification in this case is not stepwise, as with table 2, but continuous.
  • Table 4 corresponds to the proximity operation invalid mode, and the magnification remains at 1 (equal magnification) regardless of the value of Z. Therefore, when referring to the table 4, the display device 100 may omit detecting the proximity operation by the proximity sensor 142, and may detect only the contact operation by the contact sensor 141.
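The four tables can be viewed as functions mapping the Z coordinate to a magnification. The sketch below encodes tables 1, 3, and 4 under stated assumptions: the function names and default constants are invented, and clamping table 3 to a minimum of 1 is one plausible reading of “the product of Z and n” combined with the statement that the magnification approaches equal magnification near the surface.

```python
def table1(z, th4, m=2.0):
    """Table 1: fixed magnification m while 0 <= z <= Th4, no zoom otherwise."""
    return m if 0 <= z <= th4 else 1.0

def table3(z, th4, n=0.5):
    """Table 3: magnification proportional to z (continuous change).

    The text gives the magnification as the product of Z and n, but also says
    it approaches 1 as z approaches 0, so we clamp at equal magnification;
    this exact form is an assumption.
    """
    return max(1.0, n * z) if 0 <= z <= th4 else 1.0

def table4(z):
    """Table 4 (proximity operation invalid mode): always equal magnification."""
    return 1.0
```

Table 2 would be a stepwise variant keyed on the thresholds Th1 to Th4, whose exact step values the text does not give.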
  • the specifying unit 113 is a unit that specifies the display mode of the image displayed on the display surface 101 according to the operation mode determined by the determining unit 112 and the position of the fingertip detected by the operation unit 140.
  • The specifying unit 113 specifies the image magnification based on the Z coordinate of the fingertip position when performing display control with a variable magnification. Further, when enlarging only part of the image displayed on the display surface 101, the specifying unit 113 uses the X and Y coordinates of the fingertip position (that is, the coordinates of the orthogonal projection of the fingertip position onto the display surface 101) to specify the position at which to enlarge the image.
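In code, the role of the specifying unit 113 could look like the following: the zoom center comes from the orthogonal projection (X, Y) of the fingertip onto the display surface, and the magnification comes from the Z coordinate via the selected table. The function signature and the table-as-callable interface are hypothetical.

```python
def specify_display_mode(finger_pos, magnification_of_z):
    """Map a fingertip position (x, y, z) to (zoom_center, magnification).

    The zoom center is the orthogonal projection of the fingertip onto the
    display surface; `magnification_of_z` stands in for the display control
    table selected by the discrimination unit (hypothetical interface).
    """
    x, y, z = finger_pos
    return (x, y), magnification_of_z(z)
```

The display control unit would then render the image scaled by the returned magnification around the returned center.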
  • the display control unit 114 is a means for displaying an image on the display surface 101 in the display mode specified by the specifying unit 113.
  • the display control unit 114 enlarges and displays the image in the display mode specified by the specifying unit 113 when the image is to be highlighted. Therefore, the specific display mode of the image differs depending on the position of the user's fingertip and the operation mode.
  • FIG. 5 is a flowchart showing display control realized by the control unit 110 with the above functional configuration.
  • The control unit 110 executes the display control process shown in FIG. 5 as needed; it need not always be running. For example, while a specific application that does not require enlarged display is being executed, the control unit 110 may skip the process.
  • the display control by the control unit 110 is started when the user's fingertip enters the detectable region and a predetermined condition is satisfied. Therefore, first, the control unit 110 specifies the position of the proximity operation based on the coordinate information supplied from the proximity sensor 142, and determines whether or not the position is included in the detectable region (step S1). Control unit 110 repeats the same processing until this determination becomes affirmative (YES).
  • When the determination in step S1 becomes affirmative, the control unit 110 determines whether the position is further included in the determination area and whether the dwell time in the determination area is equal to or greater than a predetermined threshold (step S2).
  • The determination in step S2 becomes negative (NO) when the fingertip approaches the display surface 101 and the user's operation shifts from the proximity operation to the contact operation, or when the fingertip moves away from the display surface 101 and exits the determination area. It is also negative when the residence time is less than the predetermined threshold.
  • FIG. 6 is a diagram showing the relationship between the detectable area and the discrimination area.
  • the vertical coordinate axis represents the Z axis, but the horizontal coordinate axis may be either the X axis or the Y axis.
  • The discrimination area is shown as smaller than the detectable area not only in the Z-axis direction but also in the X-axis direction (or Y-axis direction); however, its extent in those directions (its area as seen by the user from the front) may be the same as that of the detectable region.
  • If the determination in step S2 is affirmative, the control unit 110 controls the display unit 130 to display the enlarged image in a display mode corresponding to the fingertip position according to the display control table (step S3). While controlling the display mode of the image in this way, the control unit 110 monitors changes in the fingertip position. Specifically, the control unit 110 determines whether the fingertip remains within the detectable region or whether the user's operation has shifted from the proximity operation to the contact operation (step S4). Once the user's fingertip leaves the detectable region, the determination in step S4 becomes negative, and the control unit 110 repeats the processing from step S1.
  • the control unit 110 determines which of these corresponds to the user's operation. Specifically, the control unit 110 determines whether or not the user operation is a contact operation (step S5).
  • the contact operation and the proximity operation are distinguished by the Z coordinate of the coordinate information. Alternatively, the control unit 110 may distinguish between the contact operation and the proximity operation depending on whether the coordinate information is supplied from the contact sensor 141 or the proximity sensor 142.
  • control unit 110 changes the display of the image according to the position where the fingertip has contacted (step S7). At this time, the control unit 110 may execute a process corresponding to the position touched by the fingertip, that is, the object selected by the user. For example, if the user selects a hyperlink (a character string or an icon) of the web page, the control unit 110 receives the data described in the hyperlink and renders the data to switch the display of the page. , And so on. On the other hand, if the user's operation is not a contact operation (that is, if it is a proximity operation), control unit 110 repeats the processes after step S3.
  • the control unit 110 also determines whether or not the user's operation is a contact operation when the staying time in the determination area is less than the predetermined threshold in step S2 (step S6). If the user's operation is a contact operation, the control unit 110 executes the process of step S7; otherwise, it repeats the processing from step S1.
  • after executing the process of step S7, the control unit 110 resets the accumulated staying time to 0 (step S8). The control unit 110 may reverse the execution order of the process of step S7 and the process of step S8. In addition to resetting it after step S7, the control unit 110 may also reset the staying time when the position indicated by the coordinate information is outside the detectable region (or the determination area).
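The flow of steps S1 to S8 can be sketched as follows. This is an illustrative reading of the disclosure, not part of it: the names (`Sample`, `DWELL_THRESHOLD`, `classify`) and the concrete threshold value are assumptions introduced here for clarity.

```python
from dataclasses import dataclass

DWELL_THRESHOLD = 0.5  # seconds the fingertip must stay in the determination area (assumed value)

@dataclass
class Sample:
    x: float
    y: float
    z: float  # z == 0.0 models a contact operation; z > 0 models a proximity operation
    t: float  # detection time in seconds

def in_determination_area(s: Sample, area) -> bool:
    (x0, x1), (y0, y1), z_max = area
    return x0 <= s.x <= x1 and y0 <= s.y <= y1 and 0.0 < s.z <= z_max

def classify(samples, area):
    """Return 'enlarge' once the accumulated staying time in the
    determination area reaches the threshold, else 'normal'."""
    dwell = 0.0
    prev = None
    for s in samples:
        if s.z == 0.0:           # contact operation: reset the staying time (step S8)
            dwell = 0.0
            prev = None
            continue
        if in_determination_area(s, area):
            if prev is not None:
                dwell += s.t - prev.t
            prev = s
            if dwell >= DWELL_THRESHOLD:
                return "enlarge"  # proximity operation effective mode (step S3)
        else:
            dwell = 0.0           # fingertip left the area: restart from step S1
            prev = None
    return "normal"
```

A slow approach accumulates enough staying time to trigger the enlarged display, while a quick tap does not.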
  • FIG. 7 is a diagram showing a display example for each operation mode by the display device 100.
  • each of these display examples shows a case where the user gradually brings his or her finger closer in order to select the icon Ic2 while the icons Ic1 to Ic6 are displayed.
  • the display mode in the proximity operation effective mode follows the table 3 shown in FIG. In the proximity operation effective mode, the display device 100 changes the image display mode in the sequence Im1 → Im2 → Im3 → Im4, while in the proximity operation invalid mode it changes in the sequence Im1 → Im4.
  • the image Im1 is an image in an initial state, and is an image when the user's finger is outside the detectable region.
  • the image Im2 is the image displayed when the user performs a proximity operation and the staying time becomes equal to or greater than the predetermined threshold, and the image Im3 is the image displayed when the user brings the finger closer to the display surface 101 than in the case of the image Im2. In these cases, the icon Ic2 is enlarged so that it is more conspicuous than the other icons Ic1 and Ic3 to Ic6 and can be easily confirmed and selected.
  • the image Im4 is an image when the icon Ic2 is selected.
  • the display mode of the icon Ic2 does not change.
  • such a display mode is realized when the finger moves relatively quickly and reaches the display surface 101 in a relatively short time. That is, in this case, the display device 100 does not change the image display mode in response to the proximity operation.
  • such an operation may be realized by the display device 100 not accepting the proximity operation itself, or by not reflecting the accepted proximity operation in the display mode.
  • the display mode of the enlarged display is not limited to the example shown in FIG.
  • the display device 100 may enlarge not only a specific object (an icon, in the case of FIG. 7) but a predetermined range centered on the position indicated by the user.
  • the display device 100 may display the enlarged image as a semi-transparent image (an image in a display mode in which the background shows through) so that enlarging a part of the image does not hide other parts.
  • the display device 100 may change the color of the object or blink the display in accordance with the enlarged display.
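The enlargement variants above can be summarized in a small sketch. This is an illustration under assumptions not stated in the patent: the function name, the linear scaling with distance, and the specific alpha value are all hypothetical.

```python
def enlargement_params(z, z_max, max_scale=2.0, translucent=True):
    """Return (scale, alpha) for the highlighted region, assuming the
    magnification grows linearly as the fingertip nears the surface."""
    z = min(max(z, 0.0), z_max)
    closeness = 1.0 - z / z_max          # 0.0 far from the surface, 1.0 touching it
    scale = 1.0 + (max_scale - 1.0) * closeness
    alpha = 0.6 if translucent else 1.0  # semi-transparent so the background shows through
    return scale, alpha
```

At the edge of the detectable region the scale is 1.0 (no enlargement); at the surface it reaches `max_scale`.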
  • the display device 100 of the present embodiment determines whether or not to perform the enlarged display based on the staying time of the fingertip in the determination area. Such a display device 100 can determine the operation mode from the user's operation itself, without the user having to designate the operation mode in advance (by an operation other than the proximity operation). Therefore, even when the display device 100 displays the same image, whether or not to enlarge the display can be changed according to the situation, such as the user's proficiency with the operation or the user's current operation speed.
  • Such display control is also suitable when, for example, displaying an image in which a plurality of objects (buttons, icons, hyperlinks, etc.) are mixed as a user selection target.
  • in the display device 100, when such an image is displayed and the user tries to select a relatively small object with a slow operation, the object is enlarged and displayed.
  • when a relatively large object is selected with a quick operation, the object can be prevented from being enlarged and displayed.
  • the display control of this embodiment is also suitable when the user is an elderly person or a physically handicapped person.
  • in some cases, the user's hand shakes and the position the user wants to touch is not stable, or the user brings the finger close to the display surface 101 because the image on the display surface 101 cannot be seen well.
  • in such cases, the image near the finger can be enlarged and displayed to assist the operation.
  • the display unit 130 enables three-dimensional display of an image.
  • the three-dimensional display here is a display mode in which an image is perceived as if it is three-dimensional.
  • the display unit 130 may realize what is called autostereoscopic (glasses-free) viewing, or may realize stereoscopic viewing using auxiliary equipment such as dedicated glasses.
  • as a specific method for realizing stereoscopic viewing by the display unit 130, any known appropriate technique may be used.
  • FIG. 8 is a flowchart showing display control of this embodiment. This flowchart is the same as the display control of the first embodiment (see FIG. 5) except that the process of step S3 is replaced with the process of step S3a.
  • in step S3a, the control unit 110 controls the pop-out amount when the image is displayed in 3D.
  • the pop-out amount represents the sense of visual distance between the user and the image; the larger the pop-out amount, the closer the image appears to the user (that is, the more it appears to pop out from the display surface 101).
  • the control unit 110 may locally change the pop-out amount in accordance with the position of the fingertip, or may make the entire display surface 101 appear to pop out.
  • the "magnification" in the first embodiment may be read as the "pop-out amount" in this embodiment.
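Reading "magnification" as "pop-out amount" can be sketched as below. This is a hypothetical illustration: the depth units, radius, and function name are assumptions, and the local/whole-surface switch mirrors the two options the text describes.

```python
def pop_out_depth(z, z_max, max_depth=10.0, whole_surface=False,
                  pos=None, target=None, radius=50.0):
    """Pop-out amount (apparent depth toward the user) for a point on the
    display surface; it grows as the fingertip approaches, mirroring the
    magnification control of the first embodiment."""
    closeness = 1.0 - min(max(z, 0.0), z_max) / z_max
    depth = max_depth * closeness
    if whole_surface or pos is None or target is None:
        return depth                             # entire surface appears to pop out
    dx, dy = pos[0] - target[0], pos[1] - target[1]
    dist = (dx * dx + dy * dy) ** 0.5
    return depth if dist <= radius else 0.0      # local pop-out near the fingertip
```

With `whole_surface=False` and a `target`, only points near the indicated position pop out; otherwise the whole surface does.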
  • the control unit 110 may execute a combination of the display control of the present embodiment and that of the first embodiment. That is, the control unit 110 may control the display so that the image displayed on the display surface 101 is both enlarged and displayed stereoscopically. In this case, the control unit 110 may make the enlarged part and the stereoscopically displayed part of the image different from each other.
  • FIG. 9 is a block diagram showing the configuration of the display device 100a of the present embodiment.
  • the display device 100a is different from the display device 100 (see FIG. 2) in that a filter unit 160 is provided.
  • the filter unit 160 is means for executing a filter process for reducing noise on the coordinate information.
  • the filter unit 160 may be hardware independent of the other components, or may be realized as a function of the control unit 110 or the operation unit 140.
  • the filter used by the filter unit 160 is, for example, a smoothing filter such as a median filter, a Gaussian filter, or a moving average filter. That is, when the coordinates represented by the coordinate information fluctuate finely over a short time, the filter unit 160 converts the coordinate information so as to suppress the fluctuation. The noise referred to here is therefore the high-frequency component obtained when the repeatedly and continuously detected coordinate information is expressed as a time series.
  • this allows the control unit 110 to prevent the magnification (or the pop-out amount when the image is displayed in 3D) from changing abruptly (or frequently) when the image on the display surface 101 is enlarged, which in turn prevents the displayed image from fluctuating and becoming difficult for the user to see.
  • Such display control is particularly effective when the magnification (or pop-out amount) changes continuously as in the case of the table 3 in FIG.
  • the filter unit 160 may perform filter processing on each of the X, Y, and Z coordinates, or only on the Z coordinate.
  • the filter unit 160 may vary the mode of filter processing (more specifically, the degree of noise reduction) for each coordinate.
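A moving-average filter applied only to the Z coordinate, one of the options just listed, can be sketched as follows. The class name and window size are assumptions for illustration; the patent does not prescribe a specific implementation.

```python
from collections import deque

class ZFilter:
    """Moving-average filter applied only to the Z coordinate, as one of
    the variants the filter unit 160 may use."""
    def __init__(self, window=5):
        self.buf = deque(maxlen=window)  # oldest sample drops out automatically

    def apply(self, x, y, z):
        self.buf.append(z)
        z_smooth = sum(self.buf) / len(self.buf)
        return x, y, z_smooth            # X and Y pass through unfiltered
```

Feeding successive coordinate samples through `apply` suppresses fine short-term fluctuation in Z while leaving X and Y untouched.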
  • this embodiment is characterized by the display control performed after the enlarged display (or stereoscopic display). Specifically, in the present embodiment, when a contact operation is detected after an image on the display surface 101 has been highlighted, the display device 100 cancels the highlight and returns to the normal display, and then refrains from highlighting again until the user's fingertip has once left the determination area.
  • Such display control is particularly effective when screen transition such as page switching occurs due to a touch operation.
  • FIG. 10 is a flowchart showing the display control of this embodiment. This flowchart is different from the display control of the first embodiment (see FIG. 5) in that the processes of steps S9 and S10 are added. In the present embodiment, the control unit 110 temporarily cancels the enlarged display in accordance with the user's contact operation in step S7.
  • in step S9, the control unit 110 determines whether the position of the fingertip indicated by the coordinate information of the proximity operation is included in the determination area. When the determination is negative, that is, when the fingertip has left the determination area, the control unit 110 repeats the processing from step S1. The control unit 110 can then shift to the proximity operation effective mode again if the staying time becomes equal to or greater than the threshold.
  • when the position of the fingertip is within the determination area, the control unit 110 operates in the proximity operation invalid mode as shown in step S10, and repeats the determination in step S9 until the fingertip leaves the determination area. Note that in step S10 the control unit 110 may invalidate only the enlarged display based on the proximity operation instead of invalidating the proximity operation itself.
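The lockout behavior of steps S7, S9, and S10 amounts to a small state machine, sketched below. The class and method names are hypothetical; only the logic follows the description.

```python
class HighlightLockout:
    """After a contact operation cancels the enlarged display (step S7),
    highlighting stays disabled (step S10) until the fingertip has once
    left the determination area (step S9)."""
    def __init__(self):
        self.locked = False

    def on_contact(self):
        self.locked = True           # step S7: cancel highlight, enter lockout

    def on_proximity(self, in_determination_area: bool) -> bool:
        """Return True if highlighting may be performed for this sample."""
        if self.locked:
            if in_determination_area:
                return False         # step S10: proximity operation invalid mode
            self.locked = False      # fingertip left the area: back to step S1
        return True
```

A fingertip that lingers after the touch never re-triggers the highlight; only after moving away once can highlighting resume.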
  • according to the display device 100 of the present embodiment, it is possible to prevent highlighting contrary to the user's intention after a contact operation.
  • when the display device 100 displays a web page with a browser and the user selects a hyperlink in the web page by a contact operation, the page displayed by the browser commonly changes from the web page shown before the selection to the other web page indicated by the hyperlink. Since such a page transition involves communication by the communication unit 150 and rendering, it may take time until the pages are completely switched. During this time, the user may wait without moving the finger much until the next page is displayed. In such a case, if the user's fingertip continued to stay in the determination area and the staying time exceeded the predetermined threshold, the next page would suddenly be displayed in a highlighted state.
  • Such a display mode is highly likely to be a display mode not intended by the user.
  • according to the display control of the present embodiment, highlighting is not performed again unless the user once moves the finger away, so unintended highlighting can be prevented even when, as described above, the user's finger hardly moves.
  • the determination in step S9 may be performed based on a part of the determination area instead of the determination area itself.
  • for example, the determination in step S9 may be based on a region whose range in the Z-axis direction is narrower than that of the determination area (that is, a region smaller than the determination area), determining whether or not the fingertip position is included in that region.
  • for the determination, a contact operation may be used instead of the proximity operation.
  • in that case, the control unit 110 may calculate the distance between the coordinates of these two contact operations and determine whether to enable or disable highlighting by comparing the distance with a predetermined threshold.
  • in this embodiment, the characteristic of the user's operation is determined based on the moving speed of the finger instead of the staying time described above.
  • the moving speed here is a moving speed when the fingertip of the user approaches the display surface 101.
  • the moving speed may be the speed of displacement of the fingertip in the Z-axis direction (that is, a speed that does not take into account displacement in the X-axis and Y-axis directions), or a speed calculated taking into account the displacement along each of these three axes.
  • FIG. 11 is a flowchart showing the display control of this embodiment. This flowchart is the same as the display control of the first embodiment (see FIG. 5) except that the process of step S2 is replaced with the process of step S2a.
  • the control unit 110 calculates the moving speed based on a plurality of pieces of coordinate information indicating the proximity operation, and determines whether the moving speed is equal to or less than a predetermined threshold.
  • the moving speed can be calculated by dividing the displacement between the coordinates indicated by a plurality of (at least two) pieces of coordinate information by the difference between their detection times.
  • the control unit 110 determines that the operation mode is the proximity operation effective mode when the moving speed is equal to or lower than the threshold, that is, when the finger moves relatively slowly, and executes the process of step S3. On the other hand, when the moving speed exceeds the threshold, that is, when the finger moves relatively quickly, the control unit 110 determines that the operation mode is the proximity operation invalid mode and executes the process of step S6.
  • the control unit 110 may calculate the acceleration or jerk of the fingertip position instead of its moving speed and use it for the determination. That is, instead of the displacement of the fingertip per unit time (the speed), the control unit 110 may determine the operation mode based on the change of the moving speed per unit time (the acceleration).
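The speed-based determination of step S2a can be sketched as below. The function names and the assumed sign convention (positive speed means the fingertip is nearing the surface) are illustrative, not from the disclosure.

```python
def approach_speed(samples):
    """Z-axis approach speed from the two most recent (z, t) samples;
    positive means the fingertip is nearing the display surface."""
    (z0, t0), (z1, t1) = samples[-2], samples[-1]
    return (z0 - z1) / (t1 - t0)

def operation_mode(samples, speed_threshold):
    """Proximity operation effective mode for slow approaches,
    invalid mode for fast ones (threshold value is an assumption)."""
    return "effective" if approach_speed(samples) <= speed_threshold else "invalid"
```

A slow approach (small Z displacement over a long interval) yields the effective mode; a rapid plunge toward the surface yields the invalid mode.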
  • the present invention is not limited to the aspect of each embodiment described above, and can be implemented in other aspects.
  • the present invention can also be implemented, for example, by the modes shown in the following modifications.
  • the present invention may be implemented in an aspect combining several of these modifications, or in an aspect combining multiple features of the embodiments described above.
  • the indicator of the present invention may be an instruction device such as a stylus that the user holds and moves.
  • the operating means in the case of using such an indicator may detect the position of the indicator with infrared rays or ultrasonic waves.
  • the position of the indicator can be detected magnetically.
  • the operation means of the present invention need not be configured to separately include a sensor for detecting the contact operation and a sensor for detecting the proximity operation, and may detect both the contact operation and the proximity operation with a single sensor. In short, in the present invention, how the functions of the display control apparatus described above are implemented in hardware is arbitrary.
  • the display control device of the present invention may be a part of the configuration of the display device 100 as in the embodiments described above, but may also be realized by cooperation between a display device and another device provided independently of the display device. In that case, for example, the operation means may be provided on the display device side, and the other means (the discriminating means, specifying means, display control means, and so on) may be provided on the main body side.
  • the display control apparatus of the present invention may include, instead of the operation means, a unit (the data acquisition unit 111) that acquires the coordinate information supplied from the operation means. That is, the display control apparatus of the present invention can be configured from the control unit 110 alone. Such a display control apparatus can also be implemented in the form of a program for causing a computer to realize it, and of a recording medium on which such a program is recorded.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

Disclosed is control of a display mode according to an operation in which an indicator closely approaches a display surface, when a user operation on the display surface is detected three-dimensionally. To this end, a display device detects a contact operation performed while touching the display surface and a proximity operation performed in close proximity to the display surface without touching it. The display device determines whether the indicator detected in the proximity operation is included in a position determination region and whether its accumulated time is equal to or greater than a prescribed threshold (S2); if the threshold is met or exceeded, the device performs an enlarged display according to the position of the indicator (S3). If the accumulated time is less than the prescribed threshold, the display device does not perform the enlarged display of the image.
PCT/JP2012/060609 2011-04-22 2012-04-19 Dispositif et procédé de commande d'affichage WO2012144559A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2011096135A JP2012226691A (ja) 2011-04-22 2011-04-22 表示制御装置及び表示制御方法
JP2011-096135 2011-04-22

Publications (1)

Publication Number Publication Date
WO2012144559A1 true WO2012144559A1 (fr) 2012-10-26

Family

ID=47041664

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2012/060609 WO2012144559A1 (fr) 2011-04-22 2012-04-19 Dispositif et procédé de commande d'affichage

Country Status (2)

Country Link
JP (1) JP2012226691A (fr)
WO (1) WO2012144559A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014057929A1 (fr) * 2012-10-10 2014-04-17 株式会社Nttドコモ Dispositif d'interface utilisateur, procédé d'interface utilisateur et programme
WO2015045090A1 (fr) * 2013-09-27 2015-04-02 株式会社 東芝 Procédé et dispositif électronique

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102086799B1 (ko) * 2013-02-21 2020-03-09 삼성전자주식회사 가상 키 패드를 디스플레이하기 위한 방법 및 그 전자 장치
US10394434B2 (en) 2013-02-22 2019-08-27 Samsung Electronics Co., Ltd. Apparatus and method for recognizing proximity motion using sensors
US10261612B2 (en) * 2013-02-22 2019-04-16 Samsung Electronics Co., Ltd. Apparatus and method for recognizing proximity motion using sensors
KR20140105689A (ko) * 2013-02-23 2014-09-02 삼성전자주식회사 사용자의 입력에 응답하여 피드백을 제공하는 방법 및 이를 구현하는 단말
JP6294597B2 (ja) * 2013-05-14 2018-03-14 シャープ株式会社 情報表示装置、プログラム、記録媒体、および情報表示方法
JP2014222489A (ja) * 2013-05-14 2014-11-27 シャープ株式会社 情報表示装置、位置判定方法、プログラムおよび記録媒体
JP6244712B2 (ja) * 2013-07-23 2017-12-13 富士通株式会社 画像処理装置、画像処理方法および画像処理プログラム
JP6265839B2 (ja) * 2014-06-09 2018-01-24 アルパイン株式会社 入力表示装置、電子機器、アイコンの表示方法および表示プログラム
JP6360367B2 (ja) 2014-06-26 2018-07-18 京セラ株式会社 携帯電子機器、携帯電子機器の制御方法およびプログラム
JP6304071B2 (ja) * 2014-08-21 2018-04-04 京セラドキュメントソリューションズ株式会社 画像処理装置
JP6274134B2 (ja) * 2015-03-10 2018-02-07 京セラドキュメントソリューションズ株式会社 表示入力装置及びこれを備えた画像形成装置
JP2022152962A (ja) 2021-03-29 2022-10-12 京セラドキュメントソリューションズ株式会社 表示装置及び画像形成装置

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010205050A (ja) * 2009-03-04 2010-09-16 Mitsubishi Electric Corp タッチパネル付きユーザインタフェース装置、ユーザインタフェース制御方法、およびユーザインタフェース制御プログラム

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002358162A (ja) * 2001-06-01 2002-12-13 Sony Corp 画像表示装置
JP2005049668A (ja) * 2003-07-30 2005-02-24 Sharp Corp データ変換装置、表示装置、データ変換方法、プログラム及び記録媒体
JP3939709B2 (ja) * 2004-04-30 2007-07-04 日本電信電話株式会社 情報入力方法および情報入出力装置
JP4479962B2 (ja) * 2005-02-25 2010-06-09 ソニー エリクソン モバイル コミュニケーションズ, エービー 入力処理プログラム、携帯端末装置、及び入力処理方法
KR100649523B1 (ko) * 2005-06-30 2006-11-27 삼성에스디아이 주식회사 입체 영상 표시 장치
JP2008281599A (ja) * 2007-05-08 2008-11-20 Nippon Telegr & Teleph Corp <Ntt> 情報強調表示方法および情報入出力装置
JP5101995B2 (ja) * 2007-09-10 2012-12-19 株式会社リコー 入力制御装置および画像形成装置
JP2009246625A (ja) * 2008-03-31 2009-10-22 Fujifilm Corp 立体表示装置及び立体表示方法並びにプログラム
JP5313713B2 (ja) * 2009-02-03 2013-10-09 Necカシオモバイルコミュニケーションズ株式会社 端末装置及びプログラム
JP5338549B2 (ja) * 2009-08-05 2013-11-13 ソニー株式会社 表示装置及び表示方法

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010205050A (ja) * 2009-03-04 2010-09-16 Mitsubishi Electric Corp タッチパネル付きユーザインタフェース装置、ユーザインタフェース制御方法、およびユーザインタフェース制御プログラム

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014057929A1 (fr) * 2012-10-10 2014-04-17 株式会社Nttドコモ Dispositif d'interface utilisateur, procédé d'interface utilisateur et programme
WO2015045090A1 (fr) * 2013-09-27 2015-04-02 株式会社 東芝 Procédé et dispositif électronique

Also Published As

Publication number Publication date
JP2012226691A (ja) 2012-11-15

Similar Documents

Publication Publication Date Title
WO2012144559A1 (fr) Dispositif et procédé de commande d'affichage
US9946338B2 (en) Information processing to vary screen display based on a gaze point of the user
JP6381032B2 (ja) 電子機器、その制御方法及びプログラム
EP2801009B1 (fr) Système pour interaction du regard
JP4093823B2 (ja) 視野移動操作方法
US8775966B2 (en) Electronic device and method with dual mode rear TouchPad
JP5708083B2 (ja) 電子機器、情報処理方法、プログラム、及び電子機器システム
EP2657811B1 (fr) Dispositif de traitement d'entrée tactile, dispositif de traitement d'informations, et procédé de commande d'entrée tactile
US20110060986A1 (en) Method for Controlling the Display of a Touch Screen, User Interface of the Touch Screen, and an Electronic Device using The Same
US20100177121A1 (en) Information processing apparatus, information processing method, and program
US20160041619A1 (en) Information processing apparatus, information processing method, and program
EP2840478B1 (fr) Procédé et appareil pour fournir une interface utilisateur pour appareil de diagnostic médical
CN107024965A (zh) 信息处理系统和信息处理方法
EP2577425A2 (fr) Gestes d'interaction d'utilisateur avec un clavier virtuel
CN109918013A (zh) 用于触摸屏悬停输入处理的方法和设备
WO2011141622A1 (fr) Interface utilisateur
WO2010127714A2 (fr) Appareil électronique comprenant une ou plusieurs surfaces d'entrée de coordonnées et procédé permettant de contrôler un tel appareil électronique
JP2010205050A (ja) タッチパネル付きユーザインタフェース装置、ユーザインタフェース制御方法、およびユーザインタフェース制御プログラム
TW201333822A (zh) 用以在螢幕間提供過渡之裝置及方法
US8558806B2 (en) Information processing apparatus, information processing method, and program
US20120179963A1 (en) Multi-touch electronic device, graphic display interface thereof and object selection method of multi-touch display
WO2013072073A1 (fr) Procédé et dispositif d'exécution d'une variation de focale
JP5628991B2 (ja) 表示装置、表示方法、及び表示プログラム
KR100795590B1 (ko) 네비게이팅하는 방법, 전자 디바이스, 사용자 인터페이스,그리고 컴퓨터 프로그램 산물
JP2011081447A (ja) 情報処理方法及び情報処理装置

Legal Events

Date Code Title Description
DPE2 Request for preliminary examination filed before expiration of 19th month from priority date (pct application filed from 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12774522

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12774522

Country of ref document: EP

Kind code of ref document: A1