WO2014061098A1 - Information display device and display information operation method - Google Patents
Information display device and display information operation method
- Publication number
- WO2014061098A1 (PCT/JP2012/076680)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- icon
- display
- gesture
- information
- composite icon
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04842—Selection of displayed objects or displayed text elements
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3664—Details of the user input interface, e.g. buttons, knobs or sliders, including those provided on a touch screen; remote controllers; input using gestures
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3667—Display of a road map
- G01C21/367—Details, e.g. road map scale, orientation, zooming, illumination, level of detail, scrolling of road map or positioning of current position marker
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04817—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04808—Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen
Definitions
- the present invention relates to an information display device and a display information operation method.
- Patent Documents 1 and 2 disclose devices using a touch panel.
- a smooth scroll operation icon for performing continuous smooth scroll processing of a map image is displayed. Specifically, this icon is displayed in the lower right or lower left part of the map image, in accordance with the position of the driver's seat.
- the navigation map image moves at a high speed in the direction of the arrow for the duration of the touch.
- an area for executing smooth scroll processing (that is, the smooth scroll operation icon), and
- an area for executing touch scroll processing and drag scroll processing (that is, the area other than the smooth scroll operation icon).
- the navigation device of Patent Document 2 has poor operability when the smooth scroll operation and the touch scroll operation are very similar (for example, when the only difference between the two operations is how long the screen is touched), and it has been proposed to solve this problem. Since the timing of a scroll operation depends on the user's intention, the smooth scroll operation icon must always be displayed so that it can be used at any time.
- each arrow portion indicating the map moving direction needs to be large enough to be touched with a finger. If arrow portions for eight directions are provided in the icon as described in Patent Document 2, the smooth scroll operation icon becomes large.
- An object of the present invention is to provide an information display device and a display information operation method having high convenience.
- An information display device includes a display unit having a display surface, an input unit that receives a user operation, and a control unit.
- the user operation is a gesture operation associated with a screen movement deformation type function that controls display information on the display surface in a control direction set according to the gesture direction
- the control unit displays the composite icon on the display surface.
- the composite icon is a composite of a plurality of icons that are all associated with the screen movement deformation type function but have different control direction assignments.
- the control unit executes the screen movement deformation type function in the control direction assigned to the icon for which the execution instruction operation has been performed.
- the composite icon is called on the display surface by the gesture operation, and the screen movement deformation type function associated with the gesture operation can be executed in various control directions using the composite icon.
- the control direction can be selected as appropriate. Thereby, high convenience can be provided.
- a composite icon is displayed simply by performing a gesture operation associated with the function to be executed. That is, a composite icon corresponding to the function intended by the user is automatically displayed. For this reason, high convenience can be provided.
- when the gesture operation is not performed, the composite icon is not called, so the display information is not hidden by the composite icon.
- The drawings include: a figure illustrating a scroll composite icon; a conceptual diagram of a scroll composite icon; a figure illustrating a display size change composite icon; a figure illustrating a rotation composite icon; a block diagram illustrating a control unit; a flowchart illustrating the process until a composite icon is displayed; a conceptual diagram of end point conditions; a conceptual diagram of a composite icon call operation; and figures explaining the first, second, and third examples of the display position of a composite icon.
- FIG. 1 illustrates a block diagram of an information display device 10 according to an embodiment.
- the information display device 10 includes a display unit 12, an input unit 14, a control unit 16, and a storage unit 18.
- Display unit 12 displays various information.
- the display unit 12 includes, for example, a display surface configured by arranging a plurality of pixels in a matrix, and a driving device that drives each pixel (in other words, controls the display state of each pixel) based on image data acquired from the control unit 16.
- the image displayed on the display unit 12 may be a still image, a moving image, or a combination of a still image and a moving image.
- the display unit 12 can be configured by a liquid crystal display device, for example.
- a display area of a display panel corresponds to the display surface
- a drive circuit externally attached to the display panel corresponds to the drive device.
- a part of the driver circuit may be incorporated in the display panel.
- the display unit 12 can be configured by an electroluminescence (EL) display device, a plasma display device, or the like.
- the input unit 14 receives various information from the user.
- the input unit 14 includes, for example, a detection unit that detects an indicator used by the user for input, and a detection signal output unit that outputs the result detected by the detection unit to the control unit 16 as a detection signal.
- the input unit 14 is configured by a so-called contact-type touch panel
- the input unit 14 may be referred to as a “touch panel 14” below.
- the touch panel may be referred to as a “touch pad” or the like.
- the indicator used for input is assumed to be the user's finger.
- the detection unit of the touch panel 14 provides an input surface on which a user places a fingertip, and detects the presence of a finger on the input surface by a sensor group provided for the input surface.
- an area where a finger can be detected by the sensor group corresponds to an input area where a user input can be received.
- the input area corresponds to an input surface of a two-dimensional area.
- the sensor group may be any of electrical, optical, mechanical, etc., or a combination thereof.
- Various position detection methods have been developed, and any of them may be adopted for the touch panel 14.
- a configuration capable of detecting the pressing force of the finger on the input surface may be employed.
- the position of the fingertip on the input surface can be specified from the combination of the output signals of each sensor.
- the identified position is expressed by coordinate data on coordinates set on the input surface, for example.
- the coordinate data indicating the finger position changes, so that the movement of the finger can be detected by a series of coordinate data acquired continuously.
- the finger position may be expressed by a method other than coordinates. That is, the coordinate data is an example of finger position data for expressing the position of the finger.
- the detection signal output unit of the touch panel 14 generates coordinate data indicating the finger position from the output signals of the sensors, and transmits the coordinate data to the control unit 16 as a detection signal.
- the conversion to coordinate data may be performed by the control unit 16.
- the detection signal output unit converts the output signal of each sensor into a signal in a format that can be acquired by the control unit 16, and transmits the obtained signal to the control unit 16 as a detection signal.
- a structure is illustrated in which the input surface 34 of the touch panel 14 (see FIG. 1) and the display surface 32 of the display unit 12 (see FIG. 1) are overlapped, in other words, in which the input surface 34 and the display surface 32 are integrated. With such an integrated structure, the input/display unit 20 (see FIG. 1), more specifically the touch screen 20, is provided.
- the input surface 34 and the display surface 32 appear to the user as one and the same, giving the user the feeling of performing an input operation directly on the display surface 32. This provides an intuitive operation environment.
- the expression “the user operates the display surface 32” may be used.
- the control unit 16 performs various processes and controls in the information display device 10. For example, the control unit 16 analyzes information input from the touch panel 14, generates image data according to the analysis result, and outputs the image data to the display unit 12.
- the control unit 16 includes a central processing unit (for example, configured with one or a plurality of microprocessors) and a main storage unit (for example, one or a plurality of storage devices such as ROM, RAM, and flash memory).
- Various programs may be stored in the main storage unit of the control unit 16 in advance, or may be read from the storage unit 18 during execution and stored in the main storage unit.
- the main storage unit is used not only for storing programs but also for storing various data.
- the main storage unit provides a work area when the central processing unit executes the program.
- the main storage unit provides an image holding unit for writing an image to be displayed on the display unit 12.
- the image holding unit may be referred to as “video memory”, “graphic memory”, or the like.
- the control unit 16 may be configured as dedicated hardware (for example, an arithmetic circuit configured to perform a specific calculation).
- the storage unit 18 stores various information.
- the storage unit 18 is provided as an auxiliary storage unit used by the control unit 16.
- the storage unit 18 can be configured using one or more storage devices such as a hard disk device, an optical disk, a rewritable and nonvolatile semiconductor memory, and the like.
- touch operation is an operation in which at least one fingertip is brought into contact with the input surface of the touch panel and the contacted finger is moved away from the input surface without being moved on the input surface.
- a gesture operation is an operation in which at least one fingertip is brought into contact with the input surface, and the contacted finger is moved (in other words, slid) on the input surface and then released from the input surface.
- the coordinate data detected by the touch operation (in other words, finger position data) is basically static.
- the coordinate data detected by the gesture operation changes with time and is dynamic. From such a series of changing coordinate data, the point where the finger starts moving and the point where it stops on the input surface, the locus from the movement start point to the movement end point, and the movement direction, movement amount, movement speed, movement acceleration, and so on can be acquired.
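The quantities listed above can be derived from a series of time-stamped coordinate data. The following is a minimal, illustrative sketch (the function and field names are ours, not from the publication) computing the movement start point, end point, amount, direction, and speed:

```python
import math

def analyze_trajectory(samples):
    """Derive gesture properties from a time-stamped series of finger
    coordinates.  Each sample is a (t, x, y) tuple; the movement
    acceleration could be derived similarly from successive speeds."""
    (t0, x0, y0) = samples[0]    # movement start point
    (t1, x1, y1) = samples[-1]   # movement end point
    dx, dy = x1 - x0, y1 - y0
    amount = math.hypot(dx, dy)                   # movement amount
    duration = t1 - t0
    speed = amount / duration if duration > 0 else 0.0
    direction = math.degrees(math.atan2(dy, dx))  # movement direction
    return {"start": (x0, y0), "end": (x1, y1),
            "amount": amount, "direction": direction, "speed": speed}
```

Only the first and last samples are used here; a fuller implementation would keep the whole locus for trajectory-dependent functions such as drawing.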
- FIG. 3 is a conceptual diagram illustrating a one-point touch operation (also simply referred to as “one-point touch”) as a first example of the touch operation.
- a top view of the input surface 34 is shown in the upper stage, and a side view or a sectional view of the input surface 34 is shown in the lower stage.
- touch points (in other words, finger detection points) are schematically shown by black circles. Such an illustration technique is also used in the drawings described later. Note that a black circle may actually be displayed on the display surface.
- One-point touch can be classified into single tap, multi-tap and long press operations, for example.
- Single tap is an operation of tapping the input surface 34 once with a fingertip.
- a single tap is sometimes simply referred to as a “tap”.
- Multi-tap is an operation of repeating a tap a plurality of times.
- a double tap is a typical multi-tap.
- the long press is an operation for maintaining the point contact of the fingertip.
- FIG. 4 is a conceptual diagram illustrating a two-point touch operation (also simply referred to as “two-point touch”) as a second example of the touch operation.
- the two-point touch is basically the same as the one-point touch except that two fingers are used. For this reason, it is possible to perform each operation of tap, multi-tap, and long press, for example, by two-point touch.
- two fingers of one hand may be used, or one finger of the right hand and one finger of the left hand may be used.
- the positional relationship between the two fingers is not limited to the example in FIG.
- FIG. 5 is a conceptual diagram illustrating a drag operation (also simply referred to as “drag”) as a first example of the gesture operation.
- Dragging is an operation of shifting the fingertip while it is placed on the input surface 34.
- the moving direction and moving distance of the finger are not limited to the example in FIG.
- the movement start point of the finger is schematically shown by a black circle, the movement end point of the finger is schematically shown by a black triangle, the direction of finger movement is expressed by the orientation of the triangle, and the trajectory is represented by a line connecting the black circle and the black triangle. Such an illustration technique is also used in the drawings described later. Note that the black circle, the black triangle, and the locus may actually be displayed on the display surface.
- FIG. 6 is a conceptual diagram illustrating a flick operation (also simply referred to as “flick”) as a second example of the gesture operation.
- a flick is an operation of quickly sweeping the fingertip across the input surface 34.
- the moving direction and moving distance of the finger are not limited to the example in FIG.
- in a flick, unlike a drag, the finger leaves the input surface 34 while still moving.
- since the touch panel 14 is a contact type, finger movement after leaving the input surface 34 is not detected in principle.
- a flick can be identified when the moving speed is equal to or higher than a predetermined threshold (referred to as “drag / flick identification threshold”).
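As a sketch of the identification just described, a gesture might be classified as follows; the threshold value is an illustrative assumption, since the text only requires "a predetermined threshold":

```python
DRAG_FLICK_THRESHOLD = 300.0  # px/s; illustrative value only

def classify_gesture(distance_px, duration_s):
    """Identify a flick when the finger's moving speed is at or above the
    drag/flick identification threshold; otherwise treat the gesture as
    a drag."""
    if duration_s <= 0:
        return "flick"  # released with no measurable duration
    speed = distance_px / duration_s
    return "flick" if speed >= DRAG_FLICK_THRESHOLD else "drag"
```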
- the point where the finger would finally arrive after leaving the input surface 34 (more specifically, that point projected onto the input surface 34) is estimated.
- this estimation process can be interpreted as a process of converting a flick into a virtual drag.
- the information display apparatus 10 treats the estimated point as the end point of finger movement.
- the estimation process may be executed by the touch panel 14 or may be executed by the control unit 16.
- the information display device 10 may be modified so that the point at which the finger leaves the input surface 34 is handled as the end point of finger movement.
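The publication does not specify how the virtual arrival point is computed; one common approach, shown here purely as an assumed model, projects the release velocity forward under constant deceleration:

```python
import math

def estimate_flick_endpoint(release_pos, velocity, deceleration=2000.0):
    """Estimate where the finger 'would have' arrived had it kept sliding
    on the input surface, assuming constant deceleration (px/s^2; the
    value is our assumption).  This converts a flick into a virtual drag
    ending at the returned point."""
    vx, vy = velocity
    speed = math.hypot(vx, vy)
    if speed == 0:
        return release_pos
    travel = speed * speed / (2.0 * deceleration)  # stopping distance v^2/(2a)
    x, y = release_pos
    return (x + travel * vx / speed, y + travel * vy / speed)
```

Either the touch panel 14 or the control unit 16 could run such an estimation, consistent with the note above.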
- FIG. 7 is a conceptual diagram illustrating a pinch out operation (also simply referred to as “pinch out”) as a third example of the gesture operation.
- Pinch out is an operation of moving two fingertips away on the input surface 34.
- Pinch out is also called “pinch open”.
- FIG. 7 illustrates the case where both of the two fingers are dragged. It is also possible to pinch out by fixing one fingertip on the input surface 34 (in other words, one fingertip maintains a touch state) and dragging only the other fingertip, as illustrated in FIG. 8. The method in FIG. 7 is referred to as the "two-point movement type", and the method in FIG. 8 as the "one-point movement type".
- FIG. 9 is a conceptual diagram illustrating a pinch-in operation (also simply referred to as “pinch-in”) as a fifth example of the gesture operation.
- Pinch-in is an operation of bringing two fingertips closer on the input surface 34.
- Pinch-in is also referred to as “pinch close”.
- FIG. 9 illustrates a two-point movement type pinch-in
- FIG. 10 illustrates a one-point movement type pinch-in as a sixth example of the gesture operation.
- pinch-out and pinch-in are collectively referred to as “pinch operation” or “pinch”, and the direction of finger movement is referred to as “pinch direction”.
- when the two fingertips move away from each other, the pinch operation is particularly referred to as pinch-out, and when they are brought closer, it is particularly called pinch-in.
- in pinch-out and pinch-in, two fingers of one hand may be used, or one finger of the right hand and one finger of the left hand may be used. Further, the positional relationship, moving directions, and moving distances of the two fingers are not limited to the examples in FIGS. 7 to 10. Further, in the one-point movement type pinch-out and pinch-in, the finger to be dragged is not limited to the examples of FIGS. 8 and 10. It is also possible to pinch out and pinch in using flicks instead of drags.
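Since only the change in the distance between the two touch points matters, the two-point and one-point movement types can be detected the same way. A hedged sketch (function name and tolerance are ours):

```python
import math

def classify_pinch(p1_start, p2_start, p1_end, p2_end, tol=1.0):
    """Distinguish pinch-out from pinch-in by comparing the inter-finger
    distance before and after the gesture; works for both the two-point
    and one-point movement types."""
    d_before = math.dist(p1_start, p2_start)
    d_after = math.dist(p1_end, p2_end)
    if d_after > d_before + tol:
        return "pinch-out"  # fingers moved apart
    if d_after < d_before - tol:
        return "pinch-in"   # fingers moved closer
    return "none"
```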
- Each user operation is associated with a specific function. Specifically, when a user operation is detected, a process associated with the user operation is executed by the control unit 16, thereby realizing a corresponding function. In view of this point, user operations can be classified based on functions to be realized.
- a double tap performed on an icon on the display surface 32 is associated with a function for executing a program or command associated with the icon.
- the double tap functions as an execution instruction operation.
- dragging performed on display information (a map image is illustrated in FIG. 11) is associated with a scroll function for scrolling the display information.
- the drag operation functions as a scroll operation. Note that it is also possible to perform scrolling by flicking instead of dragging.
- the pinch-out and pinch-in performed on the display information change the size (in other words, the scale) of the displayed information.
- pinch-out and pinch-in function as a display size change operation (may be referred to as a “display scale change operation”). More specifically, in the example of FIG. 12, pinch out corresponds to an enlargement operation, and pinch in corresponds to a reduction operation.
- the rotation drag is associated with a function that rotates the displayed information.
- the two-point movement type rotary drag functions as a rotation operation.
- the associated function may be changed according to the number of fingers performing the rotation drag.
- a double tap may be assigned to a folder opening operation for opening a folder associated with an icon in addition to the above execution instruction operation.
- the drag may be assigned to a scroll function and a drawing function.
- the execution instruction function for the icon may be associated with double tap, long press, and flick.
- a program or the like associated with the icon can be executed by any of double tap, long press and flick.
- the scroll function may be associated with both dragging and flicking.
- the rotation function may be associated with both the two-point movement type rotation drag and the one-point movement type rotation drag.
- a gesture operation associated with a screen movement deformation type function may be expressed as “a screen movement deformation type function gesture operation”.
- the screen movement deformation type function associated with the gesture operation is a function of controlling (in other words, manipulating) display information on the display surface in a control direction set according to the gesture direction.
- the screen movement deformation type function includes, for example, a slide function, a display size change function, a rotation function, and a bird's eye view display function (more specifically, an elevation angle and depression angle change function).
- the slide function can be classified as a screen movement function. Further, if the rotation function is viewed from the viewpoint of angle movement, the rotation function can be classified as a screen movement function. Further, the display size changing function and the bird's eye view display function can be classified into screen deformation functions.
- the scroll direction (that is, the control direction) is set according to the gesture direction (for example, the drag direction or the flick direction), and the display information is scrolled in the scroll direction.
- when the gesture direction (for example, the pinch direction) is the enlargement direction, the control direction is set to the enlargement direction, and when the gesture direction is the reduction direction, the control direction is set to the reduction direction. The display information size is then changed in the set control direction.
- the control direction is set to the right rotation direction when the gesture direction (for example, the rotation direction in the rotation drag) is the right rotation direction, and the control direction is set to the left rotation direction when the gesture direction is the left rotation direction. The display information is then rotated in the set control direction.
- the screen movement / deformation function may control the display information by using not only the gesture direction but also the gesture amount (for example, the length of the gesture trajectory).
- the control amount (for example, the scroll amount, the display size change amount, or the rotation amount) may be set larger as the gesture amount becomes larger.
- the screen movement deformation type function may control the display information using the gesture speed in addition to or instead of the gesture amount.
- the display information control speed (for example, the scroll speed, the display size change speed, or the rotation speed) may be set higher as the gesture speed becomes higher.
- a non-moving deformation type function does not use the gesture direction to realize the function, even if it is associated with a gesture operation. For example, even if a flick on an icon is associated with an execution instruction function of a specific program, that function belongs to the non-moving deformation type. Likewise, when a drag is used for a drawing function or a handwritten character input function, only the trajectory corresponding to the drag is displayed, and the display information is not controlled according to the drag direction.
- a composite icon is a composite of a plurality of icons.
- the composite icon is displayed on the display surface when a gesture operation of the screen movement deformation type function is performed.
- the screen movement deformation type function associated with the gesture operation related to the appearance of the composite icon (in other words, the gesture operation that provided the opportunity to display the composite icon) is executed.
- any icon in the composite icon is associated with the screen movement deformation type function associated with the gesture operation related to the appearance of the composite icon.
- different control directions are assigned to the icons in the composite icon. For this reason, when an execution instruction operation is performed on any one of the icons in the composite icon, the screen movement deformation type function is executed in the control direction assigned to that icon.
- A one-point touch operation is exemplified as the execution instruction operation for the composite icon.
- the screen movement deformation type function associated with the composite icon may be continuously executed while the state of touching any one of the composite icons continues.
- the control amount (for example, the scroll amount) in the screen movement deformation type function may be made larger for a long press operation than for a tap operation.
- the screen movement deformation type function may be continuously executed while tapping is continuously performed.
- FIG. 14 shows a scroll composite icon 72 as an example of the composite icon.
- the scroll composite icon 72 has eight icons 72a to 72h. All of the icons 72a to 72h are associated with the scroll function, but different scroll directions are assigned to the icons 72a to 72h. Specifically, the icon 72a is assigned to scroll in the upward direction, the icon 72b is assigned to scroll in the upper right 45 ° direction, and the icon 72c is assigned to scroll in the right direction.
- the scroll directions of the icons 72d, 72e, 72f, 72g, and 72h are respectively assigned to the lower right 45° direction, the lower direction, the lower left 45° direction, the left direction, and the upper left 45° direction. In view of this point, in the example of FIG. 14, the icons 72a to 72h are drawn with a design in which the vertices of vertically long triangles point in the assigned scroll directions.
- the design of the scroll composite icon 72 is not limited to the illustrated example.
- the icons 72a to 72h may also be referred to as scroll icons 72a to 72h.
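The eight scroll icons and their assigned control directions can be represented as unit vectors. The table and step size below are an illustrative sketch (y increases downward in screen coordinates; the step size is our assumption):

```python
import math

S = math.sqrt(0.5)  # component of a 45-degree unit vector

# Unit scroll vectors assigned to icons 72a-72h (y-down screen coordinates).
SCROLL_ICON_DIRECTIONS = {
    "72a": (0, -1),   # up
    "72b": (S, -S),   # upper right 45 degrees
    "72c": (1, 0),    # right
    "72d": (S, S),    # lower right 45 degrees
    "72e": (0, 1),    # down
    "72f": (-S, S),   # lower left 45 degrees
    "72g": (-1, 0),   # left
    "72h": (-S, -S),  # upper left 45 degrees
}

def scroll_on_icon_touch(icon_id, position, step=20):
    """Execute one scroll step in the control direction assigned to the
    touched icon, returning the new scroll position."""
    dx, dy = SCROLL_ICON_DIRECTIONS[icon_id]
    x, y = position
    return (x + step * dx, y + step * dy)
```

Repeating the step while a touch on the icon persists would give the continuous execution described above for the long press case.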
- FIG. 15 shows a conceptual diagram of the scroll composite icon.
- a drag 70 that is an example of a gesture operation is associated with a scroll function that is an example of a screen movement deformation type function.
- a scroll composite icon 72 capable of executing a scroll function that is a function associated with the drag 70 is displayed.
- each of the icons 72a to 72h receives an instruction to execute the scroll function and executes the scroll function in the scroll direction set as described above.
- the slide direction of the map image is the same right direction as the drag direction, but the scroll direction of the map image is generally expressed as the left direction. That is, between the scroll function and the slide function, the scroll direction and the slide direction, which are the control directions, differ by 180°.
- both the scroll function and the slide function are common in that the control direction is set according to the gesture direction (drag direction in the example of FIG. 15) or the direction designated by the scroll icons 72a to 72h.
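- the 180° relationship between the two control directions described above can be expressed in one line. This Python sketch assumes directions measured in degrees; the helper name is hypothetical:

```python
def slide_to_scroll_direction(drag_angle_deg: float) -> float:
    """The map slides in the drag direction; the scroll direction is
    conventionally expressed as the opposite (180-degree rotated) direction."""
    return (drag_angle_deg + 180) % 360
```

For example, a rightward drag (90° in a clockwise-from-up convention) corresponds to a leftward scroll (270°).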
- FIG. 15 illustrates an example in which the left scroll icon 72g (see FIG. 14) is touched.
- by touching the icons 72a to 72f and 72h, it is also possible to scroll in the other scroll directions assigned to those icons.
- the display size change composite icon 80 includes two display size change icons 80a and 80b.
- the two display size change icons 80a and 80b may be referred to as an enlarged icon 80a and a reduced icon 80b, respectively, depending on the display size change direction (that is, the control direction).
- the rotation composite icon 84 includes two rotation icons 84a and 84b.
- the two rotation icons 84a and 84b may be referred to as a right rotation icon 84a and a left rotation icon 84b, respectively, depending on the rotation direction (that is, the control direction).
- the design of the composite icons 80 and 84 is not limited to the illustrated example.
- a composite icon is called onto the display surface by a gesture operation, and the screen movement deformation type function associated with that gesture operation can be executed in various directions using the composite icon. For this reason, using a composite icon reduces the number of repeated gesture operations and thus the operation burden. It also allows the control direction of the display image to be selected as appropriate.
- a composite icon is displayed simply by performing a gesture operation associated with the function to be executed. That is, a composite icon corresponding to the function intended by the user is automatically displayed.
- when the gesture operation is not performed, the composite icon is not called, so the display information is not hidden by the composite icon.
- FIG. 18 illustrates a block diagram of the control unit 16.
- the display unit 12, the input unit 14, and the storage unit 18 are also illustrated for explanation.
- the control unit 16 includes an input analysis unit 40, an overall control unit 42, a first image forming unit 44, a first image holding unit 46, a second image forming unit 48, a second image holding unit 50, an image composition unit 52, a composite image holding unit 54, and a composite icon management unit 56.
- the input analysis unit 40 analyzes the user operation detected by the input unit 14 and identifies the user operation. Specifically, the input analysis unit 40 acquires coordinate data detected along with a user operation from the input unit 14, and acquires user operation information from the coordinate data.
- the user operation information is, for example, information such as the type of user operation, the start and end points of finger movement, the trajectory from the start point to the end point, the moving direction, the moving amount, the moving speed, and the moving acceleration.
- for example, a touch operation and a gesture operation can be identified by comparing the difference between the start point and the end point with a predetermined threshold value (referred to as a “touch/gesture identification threshold value”). Further, as described above, a drag and a flick can be identified from the finger moving speed at the end of the trajectory.
- a pinch out and a pinch in can be identified from the moving directions of the two fingers. When two drags draw a circle while maintaining their distance, it can be identified that a rotational drag has been performed. When a drag and a one-point touch are identified at the same time, it can be identified that the pinch out, pinch in, or rotational drag is of the one-point moving type.
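- the identification rules above can be sketched for the single-finger case. The following is a minimal illustration only; the function name and the threshold values (the touch/gesture identification threshold and the drag/flick speed boundary) are hypothetical:

```python
import math

TOUCH_GESTURE_THRESHOLD = 10.0   # px; hypothetical touch/gesture identification threshold
FLICK_END_SPEED = 500.0          # px/s; hypothetical drag/flick boundary

def identify(start, end, end_speed):
    """Classify a single-finger input per the rules above (sketch).

    A small start-to-end distance is a touch; otherwise it is a gesture,
    split into drag vs. flick by the finger speed at the end of the trajectory.
    """
    distance = math.dist(start, end)
    if distance < TOUCH_GESTURE_THRESHOLD:
        return "touch"
    return "flick" if end_speed >= FLICK_END_SPEED else "drag"
```

Two-finger inputs would be classified analogously from the two moving directions (diverging for a pinch out, converging for a pinch in, circular for a rotational drag).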
- the overall control unit 42 performs various processes in the control unit 16. For example, the overall control unit 42 associates a position on the input surface of the input unit 14 with a position on the display surface of the display unit 12. According to this, the touch position in the touch operation, the gesture locus in the gesture operation, and the like are associated on the display surface. With such association, it is possible to identify the position on the display surface where the user operation is intended. Such association can be realized by a so-called graphical user interface (GUI) technique.
- the overall control unit 42 identifies a function desired by the user, that is, a user instruction based on, for example, user operation information and function identification information.
- the function identification information is, for example, information in which an association between a user operation and a function to be executed is defined through the operation status information.
- the operation status information is, for example, information including the usage status of the information display device 10 (in other words, the usage mode), the operation target on which the user operation has been performed, the types of user operation that can be accepted according to the usage status and the operation target, and the like.
- for example, when a map image is displayed, a drag is identified as instructing execution of the scroll function. Further, for example, a tap on the enlargement icon is identified as instructing execution of the display size enlargement function, whereas if no function is associated with a flick on the enlargement icon, the flick is determined to be an invalid operation.
- the overall control unit 42 controls the display information on the display surface by controlling the first image forming unit 44, the second image forming unit 48, and the image composition unit 52.
- the display information may be changed based on the identification result of the user instruction, or may be based on an instruction on program execution irrespective of the identification result of the user instruction.
- the overall control unit 42 performs general control on the other functional units 40, 44, 46, 48, 50, 52, 54, and 56, for example, adjustment of execution timing.
- the first image forming unit 44 reads the first information 60 from the storage unit 18 according to the instruction from the overall control unit 42, forms the first image from the first information 60, and stores the first image in the first image holding unit 46.
- similarly, the second image forming unit 48 reads the second information 62 from the storage unit 18 according to the instruction from the overall control unit 42, forms the second image from the second information 62, and stores the second image in the second image holding unit 50.
- the image composition unit 52 reads the first image from the first image holding unit 46, reads the second image from the second image holding unit 50, and combines the first image and the second image.
- the synthesized image is stored in the synthesized image holding unit 54.
- the image composition is performed so that the first image and the second image are displayed in an overlapping manner.
- here, the case where the first image is the lower image (in other words, the lower layer) and the second image is the upper image (in other words, the upper layer) is illustrated.
- here, up and down mean up and down in the normal direction of the display surface, and the side closer to the user viewing the display surface is expressed as “up”.
- image data is superimposed based on such a concept.
- the lower image is displayed in the transparent portion of the upper image.
- the drawing portion of the upper image hides the lower image.
- further, when the drawing portion of the upper image is translucent, a composite image in which the lower image shows through can be formed.
- the setting of which of the first image and the second image is the upper image may be unchangeable or may be changeable.
- the composite image stored in the composite image holding unit 54 is transferred to the display unit 12 and displayed on the display unit 12.
- the display screen changes when the composite image is updated, that is, when at least one of the first image and the second image is updated.
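- the layering rule described above (the lower image shows in the transparent portion of the upper image, while the drawn portion of the upper image hides it) can be sketched per pixel. This is a schematic illustration only; the pixel representation and function name are assumptions:

```python
TRANSPARENT = None  # marker for a transparent pixel in the upper layer

def composite(lower, upper):
    """Overlay the upper layer on the lower one, row by row.

    Where the upper image is transparent, the lower image shows through;
    where the upper image is drawn, it hides the lower image.
    """
    return [
        [low if up is TRANSPARENT else up for low, up in zip(lrow, urow)]
        for lrow, urow in zip(lower, upper)
    ]
```

A composite icon drawn on an otherwise transparent upper plane therefore hides only the display information directly beneath it.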
- the composite icon management unit 56 manages display of composite icons under the control of the overall control unit 42. Specifically, the composite icon management unit 56 manages information such as the display position, size, orientation, and display attribute, and controls the second image forming unit 48 and the image composition unit 52 based on the management information. To manage the display of composite icons.
- regarding display of a composite icon, the composite icon management unit 56 instructs the second image forming unit 48 to read the composite icon image data from the storage unit 18, form a composite icon image with a size corresponding to the size of the display surface, draw the formed composite icon image on the transparent plane according to the display position and orientation, and store it in the second image holding unit 50. Regarding deletion of the composite icon, the composite icon management unit 56 causes the second image forming unit 48 to store, for example, an image without the composite icon image in the second image holding unit 50. Further, the composite icon management unit 56 instructs the image composition unit 52 to compose the images in the image holding units 46 and 50.
- FIG. 19 illustrates a processing flow S10 until a composite icon is displayed.
- the input unit 14 receives a user operation in step S11, and the control unit 16 identifies the input user operation in step S12.
- in step S13, the control unit 16 executes the function associated with the user operation based on the identification result of step S12.
- in step S14, the control unit 16 determines whether the user operation received in step S11 satisfies a condition set in advance for displaying the composite icon (referred to as a “composite icon display start condition” or a “display start condition”). If it is determined that the display start condition is not satisfied, the processing of the information display device 10 returns to step S11. On the other hand, when determining that the display start condition is satisfied, the control unit 16 performs a process of displaying the composite icon in step S15. After the composite icon is displayed, the processing flow S10 in FIG. 19 ends.
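- the loop of steps S11 to S15 can be sketched as follows. This is a structural illustration only; the function and parameter names are hypothetical, and the actual processing is distributed over the units described above:

```python
def run_display_flow(events, identify, execute, display_start_condition, show_composite_icon):
    """Sketch of processing flow S10: receive (S11), identify (S12),
    execute (S13), test the display start condition (S14), display (S15)."""
    for event in events:                    # S11: input unit receives a user operation
        op = identify(event)                # S12: control unit identifies it
        execute(op)                         # S13: execute the associated function
        if display_start_condition(op):     # S14: display start condition satisfied?
            show_composite_icon()           # S15: display the composite icon
            return True                     # flow S10 ends once the icon is shown
    return False
```

Note that the associated function is executed every pass, while the composite icon appears only once the display start condition holds.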
- as a composite icon display start condition, it is possible to adopt a condition that a composite icon is displayed when a gesture operation of the screen movement deformation type function (in other words, a gesture operation that triggers display of the composite icon) is performed once (referred to as a “one-time operation condition”). According to the one-time operation condition, the composite icon can be used immediately, so the operation burden of repeating the same gesture operation many times can be reduced.
- further, a condition that a composite icon is displayed when the single operation time of the gesture operation of the screen movement deformation type function is equal to or longer than a predetermined threshold (referred to as an “operation time threshold”) (referred to as an “operation time condition”) may be added to the one-time operation condition.
- further, a condition that a composite icon is displayed when the single operation speed of the gesture operation of the screen movement deformation type function is equal to or higher than a predetermined threshold (referred to as an “operation speed threshold”) (referred to as an “operation speed condition”) may be added to the one-time operation condition. A quickly performed gesture operation suggests, for example, a situation where the user wants to see the resulting display information quickly, and is therefore considered likely to be repeated further. According to the operation speed condition, the composite icon can thus be displayed while the user's intention is identified more accurately.
- the display timing may also be specified in the operation speed condition. That is, the operation speed condition may be modified so that, when the single operation speed of the gesture operation of the screen movement deformation type function is equal to or higher than the operation speed threshold, the composite icon is displayed at a timing earlier than a predetermined icon display timing.
- further, a condition that a composite icon is displayed when the single gesture amount (for example, the drag distance) of the gesture operation of the screen movement deformation type function is equal to or larger than a predetermined threshold (referred to as a “gesture amount threshold”) (referred to as a “gesture amount condition”) may be added to the one-time operation condition.
- a gesture operation performed with a large gesture amount suggests, for example, a situation where the user desires a large control amount for the display information, and is therefore considered likely to be repeated further. According to the gesture amount condition, the composite icon can thus be displayed while the user's intention is identified more accurately.
- further, a condition that a composite icon is displayed when the end point of the gesture operation of the screen movement deformation type function exists in a predetermined region on the display surface (referred to as an “end point condition”) may be added to the one-time operation condition.
- the predetermined region on the display surface is, for example, the peripheral region 32b of the display surface 32 as shown in FIG. 20. In FIG. 20, the peripheral region 32b of the display surface 32 corresponds to the peripheral region 34b of the input surface 34, and the end point 70b of the drag 70 exists in the peripheral regions 32b and 34b. However, the predetermined region is not limited to the peripheral region.
- the drag illustrated in FIG. 20 may be, for example, one drag during a two-point movement type pinch out.
- further, a condition that a composite icon is displayed when a composite icon call operation is performed following a gesture operation of the screen movement deformation type function (referred to as a “call operation condition”) may be added to the one-time operation condition. Here, “following” includes the condition that the gesture operation and the composite icon call operation are performed within a predetermined operation time interval, and the condition that no other operation is performed between them.
- the composite icon call operation is, for example, a touch operation.
- for example, an operation of touching an arbitrary point on the input surface with another finger, without releasing from the end point the finger used for the drag performed as the gesture operation, can be used as the composite icon call operation.
- a tap may be adopted as such a touch operation, or a double tap or a long press may be adopted.
- Such a touch operation can also be performed when the gesture operation is a pinch-out using a plurality of fingers.
- an operation of touching the end point of the drag performed as the gesture operation or the vicinity thereof can be used as the composite icon calling operation.
- a tap may be employed as such a touch operation, or a double tap may be employed.
- such a touch operation can also be used when the gesture operation is a flick, or when the gesture operation is a pinch out using a plurality of fingers or the like.
- a long press may be employed as the touch operation after dragging. In this case, the finger used for dragging is not released from the input surface, and a long press is performed as it is.
- Such a composite icon calling operation by long pressing can be performed even when the gesture operation is a pinch out using a plurality of fingers or the like.
- a flick operation may be adopted instead of the touch operation. Specifically, as illustrated in FIG. 21, flicking is performed so as to trace the drag locus.
- requiring the composite icon call operation can suppress accidental display of the composite icon.
- further, a condition that a composite icon is displayed when a no-operation state continues for a predetermined time (time length) or more after the gesture operation of the screen movement deformation type function (referred to as a “no-operation progress condition”) may be added to the one-time operation condition. According to the no-operation progress condition, the composite icon is not displayed immediately, which contributes to preventing operation mistakes.
- as a composite icon display start condition, a condition that a composite icon is displayed when the gesture operation of the screen movement deformation type function is continuously repeated a predetermined number of times (referred to as a “repeated operation condition”) may be adopted.
- the condition “continuously” includes a condition that the gesture operation is repeated within a predetermined operation time interval and a condition that no other operation is performed during the repetition.
- the gesture directions may be the same or different.
- for example, the drag may be repeated in the same direction, or the drags may be performed in various directions. In either case, high convenience is obtained when the scroll composite icon appears.
- here, the “same gesture direction” covers not only the case where the gesture direction of each repetition is exactly the same, but also the case where it is substantially the same (for example, where the variation in the gesture direction of each repetition falls within a predetermined tolerance).
- the repetition of the same gesture operation under the repeated operation condition can be detected in step S14 (see FIG. 19) by monitoring, for example, the type of gesture operation, the gesture direction, and the number of repetitions of the loop processing of steps S11 to S14.
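- the repeated operation condition can be sketched as a check over a short operation history. This is an illustrative sketch only; the history representation and parameter names are assumptions:

```python
def repeated_operation_met(history, required_repeats, max_interval):
    """Check the repeated operation condition (sketch): the same gesture type
    repeated `required_repeats` times, each within `max_interval` seconds of
    the previous one, with no other operation in between.

    `history` is a list of (timestamp, gesture_type) tuples, oldest first.
    """
    if len(history) < required_repeats:
        return False
    recent = history[-required_repeats:]
    gesture = recent[0][1]
    for (t_prev, g_prev), (t_next, g_next) in zip(recent, recent[1:]):
        if g_next != gesture or (t_next - t_prev) > max_interval:
            return False
    return True
```

An intervening different operation or an overly long pause resets the condition, matching the “continuously” requirement above.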
- further, a condition that a composite icon is displayed when the total repetition time of the gesture operation of the screen movement deformation type function is equal to or longer than a predetermined threshold (referred to as a “repetition total time threshold”) (referred to as a “repetition total time condition”) may be added to the repeated operation condition.
- that the repetition of the gesture operation takes a certain amount of time suggests, for example, a situation where the user wants to keep viewing the subsequent display information, and the gesture operation is therefore considered likely to be repeated further. According to the repetition total time condition, the composite icon can thus be displayed while the user's intention is identified more accurately.
- further, a condition that the composite icon is displayed when the repetition speed of the gesture operation of the screen movement deformation type function is equal to or higher than a predetermined threshold (referred to as a “repetition speed threshold”) (referred to as a “repetition speed condition”) may be added to the repeated operation condition. Here, the repetition speed is defined as, for example, the number of gesture operations per unit time. A quickly repeated gesture operation suggests, for example, a situation where the user wants to see the subsequent display information quickly, and is therefore considered likely to be repeated further. According to the repetition speed condition, the composite icon can thus be displayed while the user's intention is identified more accurately.
- the display timing may also be specified in the repetition speed condition. That is, the repetition speed condition may be modified so that the composite icon is displayed at a timing earlier than a predetermined icon display timing when the repetition speed of the gesture operation of the screen movement deformation type function is equal to or higher than the repetition speed threshold. According to this, a composite icon can be provided quickly.
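- the repetition speed condition with the earlier display timing can be sketched as follows. The definition of repetition speed (operations per unit time) comes from the description above; the function and parameter names are hypothetical:

```python
def icon_display_delay(timestamps, speed_threshold, normal_delay, fast_delay):
    """Sketch of the repetition speed condition: the repetition speed is the
    number of gesture operations per unit time; when it reaches the threshold,
    the composite icon is shown at an earlier timing (a shorter delay)."""
    if len(timestamps) < 2:
        return normal_delay
    span = timestamps[-1] - timestamps[0]
    speed = len(timestamps) / span if span > 0 else float("inf")
    return fast_delay if speed >= speed_threshold else normal_delay
```

Three drags within one second, for instance, would select the earlier timing, while slowly spaced drags keep the default.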
- further, a condition that a composite icon is displayed when the gesture amount (for example, the drag distance) is integrated over the repetitions of the gesture operation of the screen movement deformation type function and the integrated value reaches a predetermined threshold (referred to as a “gesture total amount threshold”) (referred to as a “gesture total amount condition”) may be added to the repeated operation condition.
- the fact that the integrated value of the gesture amount becomes large means that, for example, the user desires a large control amount for the display information. For this reason, it is considered that the gesture operation is likely to be repeated further. Therefore, according to the gesture total amount condition, it is possible to display the composite icon while more accurately identifying the user's intention.
- one or more of the above conditions such as the operation time condition exemplified in connection with the one time operation condition may be added to the repeated operation condition.
- in that case, one or more of the above conditions, such as the operation time condition, may be applied to each gesture operation. Alternatively, one or more of the above conditions, such as the operation time condition, may be applied to a predetermined number of the gesture operations (for example, the last gesture operation). Such condition additions can improve the accuracy of identifying the user's intention.
- the display position of the composite icon is basically arbitrary.
- when the scroll composite icon 72 exists in the vicinity of the end point 70b of the drag 70, the finger used for the drag 70 can be moved onto the scroll composite icon 72 with a small amount of movement.
- the composite icon 72 is arranged on the right side of the drag end point 70b.
- the composite icon 72 may be arranged on another side of the end point 70b or directly above the end point 70b.
- here, a region 70c including the end point 70b is referred to as an “end point region”.
- the size and shape of the end point region 70c may be variable values according to the operation situation (the detected finger size, finger moving speed, etc.), or may be fixed values that do not depend on the operation situation. Further, the center of the end point region 70c does not necessarily coincide with the end point 70b.
- the end point area 70c can be obtained in the coordinate system of the display surface after associating the end point 70b of the drag 70 on the display surface.
- the end point region 70c may be obtained in the coordinate system of the input surface, and the obtained end point region 70c may be associated with the coordinate system of the display surface.
- when the repeated operation condition is adopted, the average end point position may be obtained for all, or a part, of the gesture operations that are the determination target of the repeated operation condition, and the end point region 70c may be set from the obtained average end point position. Alternatively, the end point region 70c may be set for a predetermined number of the gesture operations (for example, the last gesture operation).
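- the averaged end point region can be sketched as follows. This is a schematic illustration; the square shape, the fixed half-width, and the function name are assumptions (the embodiment allows variable sizes and shapes):

```python
def end_point_region(end_points, half_size=40.0):
    """Sketch: average the end points of the gesture operations under
    consideration and build a square end point region around the average
    (half_size is a hypothetical fixed half-width in pixels).
    Returns (left, top, right, bottom)."""
    n = len(end_points)
    cx = sum(x for x, _ in end_points) / n
    cy = sum(y for _, y in end_points) / n
    return (cx - half_size, cy - half_size, cx + half_size, cy + half_size)
```

Passing only the last end point reproduces the single-gesture variant described above.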
- the composite icon 72 may be arranged on the extension line 70d of the locus of the drag 70. According to this, since the composite icon 72 can be reached by moving the finger used for the drag 70 in the same direction as it is, the movement of the finger is smooth.
- a composite icon 72 may be displayed at a position where the extension line 70d passes in the end point region 70c.
- a composite icon 72 may be displayed at a position where the extension line 70d passes in the peripheral area 32b of the display surface 32. According to this, it is possible to avoid the display information at the center of the display surface that is considered to have a high degree of user attention from being hidden by the composite icon 72.
- although the setting range of the peripheral region 32b illustrated here is the same as in the above-described FIG. 20 (related to the end point condition among the composite icon display start conditions), it is not limited to this example.
- although FIG. 26 to FIG. 28 illustrate curved drag trajectories, the following description also applies to linear drag trajectories.
- the extension line 70d is determined as a straight line connecting two points in the drag trajectory.
- FIG. 26 illustrates the case where the two points in the locus are the start point 70a and the end point 70b of the drag 70, but the present invention is not limited to this example.
- the end point 70b of the drag 70 and points other than the end point 70b may be used.
- the extension line 70d may also be determined as a straight line tangent to one point of the drag trajectory.
- FIG. 28 illustrates a case where one point in the locus is the end point 70b of the drag 70, but is not limited to this example.
- according to these determination methods, the extension line 70d can be easily obtained.
- suppose the drag trajectory is divided into two parts: a start point side portion 70e including the trajectory start point 70a and an end point side portion 70f including the trajectory end point 70b.
- the user's intention is considered to be clearer in the end point side portion 70f than in the start point side portion 70e.
- for example, the trajectories in FIGS. 27 and 28 change direction during the drag. Even in such a case, the composite icon 72 can be displayed at a position reflecting the user's intention by using the end point side portion 70f.
- the extension line 70d can be obtained in the coordinate system of the display surface after associating the locus of the drag 70 on the display surface.
- the extension line 70d may be obtained in the coordinate system of the input surface, and the obtained extension line 70d may be associated with the coordinate system of the display surface.
- when the repeated operation condition is adopted, an average extension line may be obtained for all, or a part, of the gesture operations that are the determination target of the repeated operation condition, and the obtained average extension line may be used as the extension line 70d.
- alternatively, an extension line 70d obtained for a predetermined number of the gesture operations (for example, the last gesture operation) may be used.
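- determining the extension line direction from the end point side portion of the trajectory can be sketched as follows. This is illustrative only; the trajectory representation, the fixed tail fraction, and the two-point method (a straight line through the first and last points of the end point side portion) are assumptions within the options described above:

```python
def extension_direction(trajectory, tail_fraction=0.5):
    """Sketch: direction of the extension line 70d, taken from the end point
    side portion of the trajectory (a straight line through two points of
    that portion), since the user's intention is clearer near the end point.

    `trajectory` is a list of (x, y) sample points, start to end."""
    tail = trajectory[int(len(trajectory) * (1 - tail_fraction)):]
    (x0, y0), (x1, y1) = tail[0], tail[-1]
    return (x1 - x0, y1 - y0)
```

For a trajectory that turns mid-drag, only the final portion contributes, so the composite icon lands in the direction the user last moved.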
- a composite icon (the display size change composite icon 80 in FIG. 29) may be provided for each drag. In this example, it is assumed that the user only needs to selectively operate one of the two composite icons 80.
- the composite icon may be displayed with a display attribute (in other words, a display method) different from other icons.
- for example, the composite icon is displayed with display attributes such as blinking, stereoscopic display, animation display, and translucency, or with a combination of a plurality of display attributes. This improves the visibility of the composite icon and contributes to preventing operation errors.
- FIG. 30 illustrates a processing flow S30 during display of the composite icon.
- steps S31 and S32 are the same as steps S11 and S12 of FIG. That is, in step S31, the input unit 14 receives a user operation, and in step S32, the control unit 16 identifies the input user operation.
- in step S33, the control unit 16 determines whether or not the user operation received in step S31 is an execution instruction for any icon in the composite icon. Specifically, it determines whether the input position of the user operation corresponds to the display position of any icon in the composite icon, and whether the user operation is the predetermined execution instruction operation for the composite icon (here, the one-point touch exemplified above).
- if it is determined in step S33 that the user operation is an execution instruction for an icon in the composite icon, the control unit 16, in step S34, executes the screen movement deformation type function associated with the composite icon, that is, the screen movement deformation type function associated with the gesture operation involved in the appearance of the composite icon. At this time, the screen movement deformation type function is executed in the control direction assigned to the icon given the execution instruction. Thereafter, the processing of the information display device 10 returns to step S31.
- on the other hand, if it is determined in step S33 that the user operation is not an execution instruction for any icon in the composite icon, the control unit 16 executes, in step S35, the function associated with the user operation received in step S31. Thereafter, the processing of the information display device 10 returns to step S31.
- for example, a drag associated with the scroll function is accepted in step S31, and the scroll is executed in step S35. According to this, even while the scroll composite icon is displayed, the display information can be finely adjusted by dragging or the like. The same applies to composite icons other than the scroll composite icon.
- in step S34, for example, when the composite icon is tapped (more specifically, when any icon in the composite icon is tapped), the screen movement deformation type function associated with the composite icon is executed with a predetermined control amount and a predetermined control speed.
- when the composite icon is pressed for a long time, the control amount of the display information is determined according to the long press time.
- the control speed of the display information may be a predetermined constant value or may be gradually increased.
- the gesture amount or gesture speed of the gesture operation related to the appearance of the composite icon may be reflected in the control amount of the display information when the execution instruction operation is performed on the composite icon.
- the gesture amount or the gesture speed may be reflected on the control speed of the display information when the execution instruction operation is performed on the composite icon.
- for example, the control amount or the control speed of the display information is set larger as the gesture amount or the gesture speed is larger. More specifically, the scroll amount is set larger as the drag distance is longer. Alternatively, the scroll speed is set larger as the drag distance is longer. Alternatively, the scroll amount is set larger as the drag speed is higher. Alternatively, the scroll speed is set higher as the drag speed is higher. As the drag speed, for example, an average speed or a maximum speed can be used. However, the relationship is not limited to the illustrated linear one.
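- one of the variations above (scroll amount growing with drag distance) can be sketched as follows. All numeric values, the cap, and the function name are hypothetical; as noted, the relation need not be linear:

```python
def scroll_amount(drag_distance, base_amount=50.0, gain=0.5, max_amount=400.0):
    """Sketch: the scroll amount per execution instruction grows with the
    drag distance of the gesture that called the composite icon.
    A linear relation with a cap is shown; other monotone relations work too."""
    return min(base_amount + gain * drag_distance, max_amount)
```

A longer calling drag therefore yields a larger scroll per tap on the scroll icon, up to the cap.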
- a predetermined control amount may be set as one unit, and the display information may be controlled intermittently for each unit. For example, as shown in FIG. 32, when the scroll composite icon is tapped once, the display information is scrolled by one unit, and when the scroll composite icon is pressed for a long time, scrolling by one unit is performed repeatedly with interruptions. According to this, it becomes easy to confirm the change of the display information.
- a change in gesture speed of the gesture operation may be reflected in the control speed of the display information when the execution instruction operation is performed on the composite icon.
- for example, when the scroll composite icon is tapped, the speed history of the gesture operation is reproduced once, and when the scroll composite icon is pressed for a long time, the speed history of the gesture operation is repeated during the long press. Since the gesture speed generally decreases at the beginning and end of a gesture operation, a situation similar to the above-described intermittent scrolling is provided, which makes it easy to confirm the change of the display information.
- any of the above examples can be applied to gesture operations other than dragging and screen movement deformation type functions other than scrolling.
- the control amount and the control speed of the display information may be set larger as the pressing force on the composite icon increases.
- when it is determined in step S33 that the user operation is a display size change operation performed on the composite icon, the control unit 16 changes the display size of the composite icon itself in step S35.
- the display size changing operation is, for example, pinch out and pinch in as shown in FIG.
- the pinch operation may be a two-point movement type (see FIGS. 7 and 9) or a one-point movement type (see FIGS. 8 and 10).
- FIG. 35 illustrates a processing flow S50 related to erasure of composite icons (that is, display end).
- in step S51, the control unit 16 judges whether or not a predetermined condition for deleting the composite icon (hereinafter referred to as a “composite icon deletion condition” or a “deletion condition”) is satisfied.
- if it is determined that the deletion condition is satisfied, the control unit 16 performs a process of erasing the composite icon from the display surface in step S52. Thereafter, the processing of the information display device 10 returns to the processing flow S10 (see FIG. 19) until a composite icon is displayed. On the other hand, if it is determined that the deletion condition is not satisfied, the processing of the information display device 10 returns to step S51.
- the process flow S50 is executed in parallel with the process flow S30 displaying the composite icon. Specifically, step S51 is repeated until the composite icon deletion condition is satisfied, and step S52 is executed as an interrupt process when the composite icon deletion condition is satisfied.
- as a composite icon erasure condition, it is possible to adopt a condition (referred to as the "operation waiting condition") that the composite icon is erased from the display surface when a state in which no execution instruction operation for the composite icon is input continues. If the composite icon is not used for a certain period of time, the user is likely not to use it for a while. Therefore, the operation waiting condition provides high convenience in that the composite icon is erased after the user's intention has been identified more accurately.
- as the length of the waiting time until the composite icon is erased, for example, a predetermined constant value can be adopted.
- the length of the waiting time may instead be set based on the gesture speed of the gesture operation involved in the appearance of the composite icon. For example, if the gesture operation was performed quickly, it is highly likely, as described above, that the gesture operation will be repeated, that is, that the composite icon will be used. For this reason, when the gesture speed is high, it is preferable to set a longer erasure waiting time.
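The speed-dependent waiting time can be sketched as a simple mapping. All constants and the function name here are illustrative assumptions, not values from the patent:

```python
# Sketch: choose the erase waiting time from the speed of the gesture that
# made the composite icon appear. Constants are illustrative assumptions.

BASE_WAIT_S = 5.0      # waiting time granted even to a slow gesture
EXTRA_WAIT_S = 5.0     # additional time granted to fast gestures
FAST_SPEED = 800.0     # speed (px/s) treated as "clearly fast"

def erase_wait_time(gesture_speed):
    """Longer waiting time for faster gestures, which are more likely to be
    repeated and hence to use the composite icon again."""
    factor = min(gesture_speed / FAST_SPEED, 1.0)
    return BASE_WAIT_S + EXTRA_WAIT_S * factor
```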
- as a composite icon erasure condition, a condition that the composite icon is erased from the display surface when the user operation is a predetermined composite icon erase operation (referred to as the "erase instruction condition") may also be adopted. An operation different from the execution instruction operation on the composite icon (for example, a flick on the composite icon) is assigned as the composite icon erase operation. According to the erase instruction condition, the user can erase the composite icon at any time.
- both the operation waiting condition and the erase instruction condition may be adopted, which further improves convenience.
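Adopting both conditions amounts to a simple disjunction, sketched below with assumed names (`should_erase`, the `"flick_on_icon"` tag) purely for illustration:

```python
# Sketch: the composite icon is erased either when no execution instruction
# operation has been input for the waiting time (operation waiting condition)
# or immediately upon a dedicated erase gesture such as a flick on the icon
# (erase instruction condition). Names are illustrative assumptions.

def should_erase(now, last_use_time, wait_time, user_op=None):
    if user_op == "flick_on_icon":            # erase instruction condition
        return True
    return now - last_use_time >= wait_time   # operation waiting condition
```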
- a composite icon may be configured by combining icons associated with different types of screen movement deformation type functions.
- the composite icon 88 shown in FIG. 36 includes scroll icons 72a to 72h and display size change icons 80a and 80b.
- the scroll, enlargement, and reduction functions can be operated in one place, so a favorable operation environment can be provided.
- scrolling, enlargement, and reduction can each be executed independently, or in combination. For example, if the control unit 16 identifies a two-point touch targeting the upward scroll icon 72a and the enlargement icon 84a, it executes upward scrolling and enlargement simultaneously. Note that the combination of icons is not limited to the example of FIG. 36.
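The simultaneous execution of functions from a two-point touch can be sketched as a dispatch table. The icon names and parameters below are illustrative assumptions, not reference numerals from the figures:

```python
# Sketch: each touched sub-icon of a combined composite icon contributes its
# own screen movement deformation type function, and the control unit runs
# them together. The mapping below is an illustrative assumption.

ICON_FUNCTIONS = {
    "scroll_up": ("scroll", (0, -1)),
    "scroll_down": ("scroll", (0, 1)),
    "enlarge": ("zoom", 1.25),
    "reduce": ("zoom", 0.8),
}

def dispatch(touched_icons):
    """Return the list of (function, parameter) pairs to execute together."""
    return [ICON_FUNCTIONS[name] for name in touched_icons]

# A two-point touch on the upward scroll icon and the enlargement icon
# yields simultaneous upward scrolling and enlargement.
```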
- the various effects described above can be obtained, and as a result, high convenience can be provided.
- in the above, the case where the gesture operation is a drag and the screen movement deformation type function associated with the drag is a scroll has mainly been exemplified, but similar effects can be obtained with other gesture operations and other screen movement deformation type functions.
- the display information displayed on the display unit 12 is a map image
- the use of composite icons is not limited to map images.
- the composite icon can be used for a list of titles such as books and music and a slide for a list of web search results. Further, for example, the composite icon can be used for turning pages of an electronic book or the like and selecting content such as an electronic album.
- the display information to be controlled by the gesture operation and the composite icon may be displayed on the entire display surface or may be displayed on a part of the display surface.
- the display information displayed on a part of the display surface is, for example, display information in a window provided on the part.
- the part of the display surface may be one-dimensional, as illustrated in FIG. 37. That is, in the example of FIG. 37, the elements A, B, C, D, E, F, G, H, and I forming the display information move in a row along a zigzag path (in other words, while connected to each other), and the movement is controlled by a drag or a flick.
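The drag-controlled movement of connected elements along such a one-dimensional path can be sketched as a rotation of a sequence. The slot width and function name are illustrative assumptions:

```python
# Sketch: elements on the zigzag path of FIG. 37 keep their order and shift
# together by a number of slots derived from the drag amount. The slot
# width (pixels per slot) is an assumed parameter.

def shift_row(elements, drag_px, slot_px=50):
    """Shift connected elements along the path; positive drag moves forward."""
    n = len(elements)
    offset = (drag_px // slot_px) % n
    return elements[-offset:] + elements[:-offset] if offset else list(elements)

row = ["A", "B", "C", "D", "E", "F", "G", "H", "I"]
```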
- a contact type touch panel is exemplified as the input unit 14.
- however, a non-contact type (also referred to as a three-dimensional (3D) type) touch panel may also be adopted as the input unit 14.
- in the non-contact type, a detectable region of the sensor group (in other words, an input region that can accept user input) is provided as a three-dimensional space above the input surface, and the position at which a finger in the three-dimensional space is projected onto the input surface is detected.
- Some non-contact types can detect the distance from the input surface to the finger. According to this method, the finger position can be detected as a three-dimensional position, and further, the approach and retreat of the finger can also be detected.
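The two detection modes just described (projection onto the input surface, and distance from it) can be sketched as follows, taking the input surface as the z = 0 plane, which is a simplifying assumption:

```python
# Sketch of non-contact (3D) detection: a finger at (x, y, z) above the
# input surface is reported as its projection onto the surface, and the
# z coordinate gives its distance from the surface (z = 0 plane assumed).

def project_to_surface(finger_pos):
    """2D point where a finger at (x, y, z) projects onto the input surface."""
    x, y, z = finger_pos
    return (x, y)

def finger_distance(finger_pos):
    """Distance from the input surface; a decreasing value means approach."""
    return finger_pos[2]
```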
- Various systems have been developed as non-contact type touch panels. For example, a projection capacity system which is one of electrostatic capacity systems is known.
- the finger is exemplified as the indicator used by the user for input.
- a body part other than the finger can also be used as the indicator.
- a tool such as a touch pen (also referred to as a stylus pen) may be used as an indicator.
- a so-called motion sensing technology may be used for the input unit 14.
- Various methods have been developed as motion sensing technology. For example, a method is known in which a user's movement is detected by a user holding or wearing a controller equipped with an acceleration sensor or the like.
- a method of extracting a feature point such as a finger from a captured image of a camera and detecting a user's movement from the extraction result is known.
- An intuitive operation environment is also provided by the input unit 14 using the motion sensing technology.
- although the input/display unit 20 is exemplified above, the display unit 12 and the input unit 14 may be arranged separately. Even in this case, an intuitive operation environment is provided by configuring the input unit 14 with a touch panel or the like.
- the information display device 10 may further include elements other than the above elements 12, 14, 16, and 18.
- an audio output unit that outputs auditory information
- a communication unit that performs wired or wireless communication with various devices
- one or more current position detection units that detect the current position of the information display device 10 in conformity with, for example, the GPS (Global Positioning System) may be added.
- the audio output unit can output, for example, operation sounds, sound effects, guidance voice, and the like.
- for example, a notification sound can be output at each of the timings of appearance, use, and erasure of a composite icon.
- the communication unit can be used for, for example, new acquisition and update of information stored in the storage unit 18. Further, the current position detection unit can be used for a navigation function, for example.
- the information display device 10 may be a portable or desktop information device.
- the information display device 10 may be applied to a navigation device or an audio / visual device mounted on a moving body such as an automobile.
- 10 information display device, 12 display unit, 14 input unit, 16 control unit, 18 storage unit, 20 input/display unit, 32 display surface, 32b peripheral region, 34 input surface (input region), 34b peripheral region, 70 drag, 70a start point, 70b end point, 70c end point area, 70d extension line, 70e start point side part, 70f end point side part, 72 scroll composite icon, 72a to 72h scroll icon, 80 display size change composite icon, 80a enlargement icon (display size change icon), 80b reduction icon (display size change icon), 84 rotation composite icon, 84a right rotation icon (rotation icon), 84b left rotation icon (rotation icon), 88 composite icon, S10, S30, S50 processing flow.
Abstract
Description
<Overview of overall configuration>
FIG. 1 illustrates a block diagram of an information display device 10 according to an embodiment. According to the example of FIG. 1, the information display device 10 includes a display unit 12, an input unit 14, a control unit 16, and a storage unit 18.
<User operations and their functions>
Before describing a more specific configuration and processing of the information display device 10, user operations on the touch panel 14 will be described.
<Composite icon>
The information display device 10 employs a characteristic operation technique called a composite icon. A composite icon is a composite of a plurality of icons. The composite icon is displayed on the display surface when a gesture operation of a screen movement deformation type function is performed. When an execution instruction operation is then performed on any icon in the composite icon, the screen movement deformation type function associated with the gesture operation that was involved in the appearance of that composite icon (in other words, the gesture operation that triggered its display) is executed. In other words, every icon in the composite icon is associated with the screen movement deformation type function associated with the gesture operation involved in the appearance of the composite icon. However, mutually different control directions are assigned to the icons in the composite icon. Therefore, when an execution instruction operation is performed on any one of the icons in the composite icon, the screen movement deformation type function is executed in the control direction assigned to that icon.
<Configuration example of control unit 16>
FIG. 18 illustrates a block diagram of the control unit 16. For explanation, FIG. 18 also shows the display unit 12, the input unit 14, and the storage unit 18. According to the example of FIG. 18, the control unit 16 includes an input analysis unit 40, an overall control unit 42, a first image forming unit 44, a first image holding unit 46, a second image forming unit 48, a second image holding unit 50, an image composition unit 52, a composite image holding unit 54, and a composite icon management unit 56.
<Processing example of information display device 10>
Below, processing by the information display device 10 related to the composite icon (in other words, a display information operation method) is exemplified.
<Display of composite icon>
FIG. 19 illustrates a processing flow S10 performed until a composite icon is displayed. According to the example of FIG. 19, the input unit 14 accepts a user operation in step S11, and the control unit 16 identifies the input user operation in step S12. Then, in step S13, the control unit 16 executes the function associated with the user operation based on the identification result of step S12.
<Composite icon display start condition>
Regarding step S14, as a composite icon display start condition, it is possible to adopt a condition (referred to as the "one-time operation condition") that the composite icon is displayed when a gesture operation of a screen movement deformation type function (in other words, a gesture operation that triggers display of the composite icon) is performed once. According to the one-time operation condition, the composite icon can be used immediately. Therefore, the operation burden of repeating the same gesture operation many times can be reduced.
<Composite icon display position>
Regarding step S15 (see FIG. 19), the display position of the composite icon is basically arbitrary. That said, as illustrated in FIG. 22, if the scroll composite icon 72 is located in the vicinity of the end point 70b of the drag 70, the finger used for the drag 70 can be moved onto the scroll composite icon 72 with a small amount of movement.
<Display attribute of composite icon>
The composite icon may be displayed with a display attribute (in other words, a display method) different from that of other icons. For example, the composite icon is displayed with a display attribute such as blinking, stereoscopic display, animated display, or translucency, or with a combination of a plurality of display attributes. This improves the visibility of the composite icon and contributes to the prevention of operation errors.
<Use of composite icon>
FIG. 30 illustrates a processing flow S30 performed while the composite icon is displayed. In the example of FIG. 30, steps S31 and S32 are the same as steps S11 and S12 of FIG. 19. That is, the input unit 14 accepts a user operation in step S31, and the control unit 16 identifies the input user operation in step S32.
<Control amount and control speed>
Regarding step S34 (see FIG. 30), for example, when the composite icon is tapped (more specifically, when any icon in the composite icon is tapped), the screen movement deformation type function associated with the composite icon is executed with a predetermined control amount and at a predetermined control speed. Also, for example, while the composite icon is pressed and held, the screen movement deformation type function associated with the composite icon is executed continuously. In this case, the control amount of the display information is determined according to the duration of the long press. The control speed of the display information may be a predetermined constant value, or may be increased gradually.
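The two execution styles can be sketched as below. All constants and names are illustrative assumptions, not values from the patent:

```python
# Sketch: a tap scrolls by a fixed amount at a fixed speed; a long press
# keeps scrolling at a speed that is either constant or gradually increasing.
# Constants are illustrative assumptions.

TAP_AMOUNT_PX = 100    # predetermined control amount for a tap
BASE_SPEED = 10.0      # predetermined control speed (px per tick)
ACCEL = 2.0            # optional speed increase per held second

def scroll_amount_for_tap():
    return TAP_AMOUNT_PX

def scroll_speed_during_long_press(held_seconds, accelerate=False):
    if not accelerate:
        return BASE_SPEED                    # constant speed
    return BASE_SPEED + ACCEL * held_seconds  # gradually increasing speed
```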
<Resizing the composite icon>
Regarding step S35 (see FIG. 30), when it is determined in step S33 that the user operation is a display size change operation performed on the composite icon, the control unit 16 changes the display size of the composite icon itself in step S35. The display size change operation is, for example, a pinch-out or pinch-in, as shown in FIG. 34. The pinch operation may be of the two-point movement type (see FIGS. 7 and 9) or of the one-point movement type (see FIGS. 8 and 10). Operability is improved by letting the user change the composite icon to a preferred size.
<Erasing the composite icon>
FIG. 35 illustrates a processing flow S50 related to erasure of the composite icon (that is, the end of its display). According to the example of FIG. 35, in step S51, the control unit 16 judges whether or not a condition predetermined for erasing the composite icon (referred to as the "composite icon erasure condition" or the "erasure condition") is satisfied.
<Composite icon erasure condition>
As a composite icon erasure condition, it is possible to adopt a condition (referred to as the "operation waiting condition") that the composite icon is erased from the display surface when a state in which no execution instruction operation for the composite icon is input continues. If the composite icon is not used for a certain period of time, the user is likely not to use it for a while. Therefore, the operation waiting condition provides high convenience in that the composite icon is erased after the user's intention has been identified more accurately.
<Number of composite icons>
A plurality of composite icons can also be displayed at the same time. For example, a scroll composite icon, a display size change composite icon, and a rotation composite icon may be displayed together. In this case, the above processing flows S10, S30, and S50 are managed in parallel for each composite icon. A limit may also be placed on the number of composite icons displayed simultaneously.
<Combination of icons>
A composite icon may be configured by combining icons associated with different types of screen movement deformation type functions. For example, the composite icon 88 shown in FIG. 36 is composed of the scroll icons 72a to 72h and the display size change icons 80a and 80b. With the composite icon 88, the scroll, enlargement, and reduction functions can be operated in one place, so a favorable operation environment can be provided. Scrolling, enlargement, and reduction can each be executed independently, or in combination. For example, if the control unit 16 identifies a two-point touch targeting the upward scroll icon 72a and the enlargement icon 84a, it executes upward scrolling and enlargement simultaneously. The combination of icons is not limited to the example of FIG. 36.
<Effects>
According to the information display device 10, the various effects described above are obtained and, as a result, high convenience can be provided. Although the above description has mainly exemplified the case where the gesture operation is a drag and the screen movement deformation type function associated with the drag is a scroll, similar effects are obtained with other gesture operations and other screen movement deformation type functions.
<Modifications>
The above description has mainly exemplified the case where the display information displayed on the display unit 12 is a map image. However, use of the composite icon is not limited to map images. For example, the composite icon can also be used to slide a list of titles of books, music, and the like, or a list of web search results. The composite icon can also be used, for example, for turning the pages of an electronic book or for selecting content in an electronic album.
Claims (30)
- An information display device comprising:
a display unit having a display surface;
an input unit that accepts a user operation; and
a control unit configured to, when the user operation is a gesture operation associated with a screen movement deformation type function that controls display information on the display surface in a control direction set in accordance with a gesture direction, display on the display surface a composite icon that is a composite of a plurality of icons all of which are associated with the screen movement deformation type function but to which different control directions are assigned, and, when the user operation is an execution instruction operation on any icon in the composite icon, execute the screen movement deformation type function in the control direction assigned to the icon on which the execution instruction operation has been performed.
- The information display device according to claim 1, wherein the control unit displays the composite icon when the gesture operation is continuously repeated a predetermined number of times.
- The information display device according to claim 2, wherein the control unit displays the composite icon when the length of the repetition time of the gesture operation reaches a predetermined threshold.
- The information display device according to claim 1, wherein the control unit displays the composite icon when the length of a single operation time of the gesture operation reaches a predetermined threshold.
- The information display device according to claim 2, wherein the control unit displays the composite icon when the repetition speed of the gesture operation is equal to or higher than a predetermined threshold.
- The information display device according to claim 5, wherein, when the repetition speed of the gesture operation is equal to or higher than the predetermined threshold, the control unit displays the composite icon at a timing earlier than a predetermined icon display timing.
- The information display device according to claim 1, wherein the control unit displays the composite icon when the speed of a single gesture operation is equal to or higher than a predetermined threshold.
- The information display device according to claim 7, wherein, when the speed of the single gesture operation is equal to or higher than the predetermined threshold, the control unit displays the composite icon at a timing earlier than a predetermined icon display timing.
- The information display device according to claim 2, wherein the control unit accumulates the gesture amount as the gesture operation is repeated, and displays the composite icon when the accumulated value reaches a predetermined threshold.
- The information display device according to claim 1, wherein the control unit displays the composite icon when the gesture amount of a single gesture operation reaches a predetermined threshold.
- The information display device according to claim 1, wherein the control unit displays the composite icon when the end point of the gesture operation corresponds to a point within a predetermined area on the display surface.
- The information display device according to claim 1, wherein the control unit displays the composite icon when a composite icon call operation is performed subsequent to the gesture operation.
- The information display device according to claim 1, wherein the control unit displays the composite icon when a no-operation state continues for a predetermined time after the gesture operation.
- The information display device according to claim 1, wherein the control unit reflects the gesture amount or the gesture speed of the gesture operation in the control amount or the control speed of the display information applied when the execution instruction operation is performed on the composite icon.
- The information display device according to claim 14, wherein the control unit sets the gesture amount of the gesture operation as one unit of the control amount of the display information and, when the execution instruction operation is performed on the composite icon, controls the display information intermittently, one unit at a time.
- The information display device according to claim 14, wherein the control unit reflects a change in the gesture speed of the gesture operation in a change in the control speed of the display information applied when the execution instruction operation is performed on the composite icon.
- The information display device according to claim 1, wherein the control unit maps the end point of the gesture operation, or an end point region defined to include the end point, onto the display surface, and displays the composite icon within the end point region on the display surface.
- The information display device according to claim 1, wherein the control unit maps the gesture trajectory of the gesture operation, or an extension line of the gesture trajectory, onto the display surface, and displays the composite icon on the extension line on the display surface.
- The information display device according to claim 18, wherein the control unit displays the composite icon at a position where the extension line passes through a peripheral region of the display surface.
- The information display device according to claim 18, wherein the control unit sets the extension line using an end-point-side portion that is a part of the gesture trajectory and includes the end point of the gesture trajectory.
- The information display device according to claim 20, wherein the extension line is a tangent to the gesture trajectory at the end point, or a straight line passing through the end point of the gesture trajectory and another point in the end-point-side portion.
- The information display device according to claim 1, wherein the composite icon is displayed with a display attribute different from that of other icons.
- The information display device according to claim 1, wherein the control unit erases the composite icon from the display surface when a state in which the execution instruction operation for the composite icon is not input continues.
- The information display device according to claim 1, wherein the control unit erases the composite icon from the display surface when the user operation is a composite icon erase operation.
- The information display device according to claim 1, wherein:
the screen movement deformation type function is a scroll function that scrolls the display information in the control direction;
the control direction is a scroll direction of the display information; and
the composite icon is a scroll composite icon in which all of the plurality of icons are associated with the scroll function but the scroll direction differs for each icon.
- The information display device according to claim 25, wherein the scroll composite icon further has:
an enlargement icon associated with a function of enlarging the display size of the display information; and
a reduction icon associated with a function of reducing the display size of the display information.
- The information display device according to claim 1, wherein:
the screen movement deformation type function is a display size change function that changes the display size of the display information;
the control direction is a display size change direction of the display information; and
the composite icon is a display size change composite icon in which all of the plurality of icons are associated with the display size change function but the display size change direction differs for each icon.
- The information display device according to claim 1, wherein:
the screen movement deformation type function is a rotation function that rotates the display information;
the control direction is a rotation direction of the display information; and
the composite icon is a rotation composite icon in which all of the plurality of icons are associated with the rotation function but the rotation direction differs for each icon.
- The information display device according to claim 1, wherein the control unit changes the display size of the composite icon when the user operation is a display size change operation performed on the composite icon.
- A display information operation method comprising:
(a) a step of accepting a user operation;
(b) a step of identifying the user operation;
(c) a step of, when the user operation is a gesture operation associated with a screen movement deformation type function that controls display information on a display surface in a control direction set in accordance with a gesture direction, displaying on the display surface a composite icon that is a composite of a plurality of icons all of which are associated with the screen movement deformation type function but to which different control directions are assigned; and
(d) a step of, when the user operation is an execution instruction operation on any icon in the composite icon, executing the screen movement deformation type function in the control direction assigned to the icon on which the execution instruction operation has been performed.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE112012007167.0T DE112012007167T5 (en) | 2012-10-16 | 2012-10-16 | Information display device, display information operation method |
CN201280076431.5A CN104737221B (en) | 2012-10-16 | 2012-10-16 | Information display device and display information operation method |
JP2014541847A JP5738495B2 (en) | 2012-10-16 | 2012-10-16 | Information display device and display information operation method |
PCT/JP2012/076680 WO2014061098A1 (en) | 2012-10-16 | 2012-10-16 | Information display device and display information operation method |
US14/426,092 US20150234572A1 (en) | 2012-10-16 | 2012-10-16 | Information display device and display information operation method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2012/076680 WO2014061098A1 (en) | 2012-10-16 | 2012-10-16 | Information display device and display information operation method |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014061098A1 true WO2014061098A1 (en) | 2014-04-24 |
Family
ID=50487687
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2012/076680 WO2014061098A1 (en) | 2012-10-16 | 2012-10-16 | Information display device and display information operation method |
Country Status (5)
Country | Link |
---|---|
US (1) | US20150234572A1 (en) |
JP (1) | JP5738495B2 (en) |
CN (1) | CN104737221B (en) |
DE (1) | DE112012007167T5 (en) |
WO (1) | WO2014061098A1 (en) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017021869A (en) * | 2016-11-04 | 2017-01-26 | ヤフー株式会社 | Display program, terminal device, display method, and distribution apparatus |
JPWO2018167813A1 (en) * | 2017-03-13 | 2019-06-27 | 三菱電機株式会社 | Touch pad operation detection device and touch pad operation detection method |
JP2019185149A (en) * | 2018-04-03 | 2019-10-24 | 株式会社ミクシィ | Information processing device, function display method and function display program |
JP2020204961A (en) * | 2019-06-18 | 2020-12-24 | 京セラドキュメントソリューションズ株式会社 | Information processing apparatus |
Families Citing this family (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9924907B2 (en) * | 2011-09-30 | 2018-03-27 | Google Technology Holdings LLC | Method and system for identifying location of a touched body part |
JP2014211701A (en) * | 2013-04-17 | 2014-11-13 | ソニー株式会社 | Information processing apparatus, information processing method, and program |
USD773479S1 (en) * | 2013-09-06 | 2016-12-06 | Microsoft Corporation | Display screen with icon group |
US10528250B2 (en) | 2014-02-21 | 2020-01-07 | Groupon, Inc. | Method and system for facilitating consumer interactions with promotions |
CN105204629B (en) * | 2015-09-02 | 2018-11-13 | 成都上生活网络科技有限公司 | A kind of 3D gesture identification methods |
CN105892844A (en) * | 2015-12-15 | 2016-08-24 | 乐视移动智能信息技术(北京)有限公司 | Screen-sliding processing method and system |
US20190095068A1 (en) * | 2016-04-19 | 2019-03-28 | Maxell, Ltd. | Portable terminal device |
JP6727921B2 (en) | 2016-05-18 | 2020-07-22 | ソニーモバイルコミュニケーションズ株式会社 | Information processing device, information processing system, and information processing method |
USD800144S1 (en) * | 2016-06-29 | 2017-10-17 | Naturalmotion Ltd. | Display screen or portion thereof with graphical user interface |
USD874516S1 (en) * | 2017-12-05 | 2020-02-04 | Koninklijke Philips N.V. | Display device with icon |
CN108460725B (en) * | 2018-03-22 | 2019-06-18 | 腾讯科技(深圳)有限公司 | Map-indication method, device, equipment and storage medium |
DE102020104789A1 (en) * | 2019-02-26 | 2020-08-27 | Makita Corporation | SEARCH DEVICE FOR AN EMBEDDED OBJECT |
US11409411B1 (en) * | 2021-03-12 | 2022-08-09 | Topgolf International, Inc. | Single finger user interface camera control |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009518758A (en) * | 2005-12-08 | 2009-05-07 | Apple Inc. | List scrolling in response to a moving touch on an index symbol list |
JP2010086230A (en) * | 2008-09-30 | 2010-04-15 | Sony Corp | Information processing apparatus, information processing method and program |
JP2011242820A (en) * | 2010-05-13 | 2011-12-01 | Panasonic Corp | Electronic apparatus, display method, and program |
Family Cites Families (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2005015380A2 (en) * | 2003-08-08 | 2005-02-17 | Koninklijke Philips Electronics N.V. | Method of scrolling through a document |
JP4645179B2 (en) * | 2004-12-02 | 2011-03-09 | Denso Corporation | Vehicle navigation device |
JP4678534B2 (en) * | 2007-06-07 | 2011-04-27 | Sony Corporation | Navigation device and map scroll processing method |
US9933937B2 (en) * | 2007-06-20 | 2018-04-03 | Apple Inc. | Portable multifunction device, method, and graphical user interface for playing online videos |
KR101467881B1 (en) * | 2008-08-18 | 2014-12-02 | LG Electronics Inc. | Controlling a mobile terminal with at least two display areas |
JP5228755B2 (en) * | 2008-09-29 | 2013-07-03 | Fujitsu Limited | Portable terminal device, display control method, and display control program |
US8881013B2 (en) * | 2009-04-30 | 2014-11-04 | Apple Inc. | Tool for tracking versions of media sections in a composite presentation |
JP2011028635A (en) * | 2009-07-28 | 2011-02-10 | Sony Corp | Display control apparatus, display control method and computer program |
KR101113906B1 (en) * | 2009-09-04 | 2012-02-29 | Noh Sang-gi | Method of forming a user interface screen for an electric device terminal and electric device terminal for performing the method |
CN102023788A (en) * | 2009-09-15 | 2011-04-20 | Acer Inc. | Control method for touch screen display frames |
CN102073439A (en) * | 2009-11-20 | 2011-05-25 | Inventec Corporation | Electronic device and prompting method of touch control screen thereof |
CN102156605B (en) * | 2010-02-12 | 2013-01-09 | Acer Inc. | Object moving method, object moving system and electronic device |
JP5494337B2 (en) * | 2010-07-30 | 2014-05-14 | Sony Corporation | Information processing apparatus, information processing method, and information processing program |
US9569066B2 (en) * | 2011-10-03 | 2017-02-14 | Google Inc. | Interface for navigating imagery |
- 2012-10-16 DE DE112012007167.0T patent/DE112012007167T5/en not_active Withdrawn
- 2012-10-16 JP JP2014541847A patent/JP5738495B2/en active Active
- 2012-10-16 WO PCT/JP2012/076680 patent/WO2014061098A1/en active Application Filing
- 2012-10-16 US US14/426,092 patent/US20150234572A1/en not_active Abandoned
- 2012-10-16 CN CN201280076431.5A patent/CN104737221B/en active Active
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017021869A (en) * | 2016-11-04 | 2017-01-26 | Yahoo Japan Corporation | Display program, terminal device, display method, and distribution apparatus |
JPWO2018167813A1 (en) * | 2017-03-13 | 2019-06-27 | Mitsubishi Electric Corporation | Touch pad operation detection device and touch pad operation detection method |
JP2019185149A (en) * | 2018-04-03 | 2019-10-24 | Mixi, Inc. | Information processing device, function display method and function display program |
JP7078845B2 | 2018-04-03 | 2022-06-01 | Mixi, Inc. | Information processing device, function display method and function display program |
JP2020204961A (en) * | 2019-06-18 | 2020-12-24 | Kyocera Document Solutions Inc. | Information processing apparatus |
JP7259581B2 | 2019-06-18 | 2023-04-18 | Kyocera Document Solutions Inc. | Information processing apparatus |
Also Published As
Publication number | Publication date |
---|---|
JPWO2014061098A1 (en) | 2016-09-05 |
CN104737221B (en) | 2016-10-12 |
US20150234572A1 (en) | 2015-08-20 |
JP5738495B2 (en) | 2015-06-24 |
DE112012007167T5 (en) | 2015-07-30 |
CN104737221A (en) | 2015-06-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5738495B2 (en) | Information display device and display information operation method | |
JP5738494B2 (en) | Information display device and display information operation method | |
US11003304B2 (en) | Information display terminal, information display method and program | |
US10318146B2 (en) | Control area for a touch screen | |
US8499255B2 (en) | Organizational tools on a multi-touch display device | |
JP4557058B2 (en) | Information display terminal, information display method, and program | |
US20100295806A1 (en) | Display control apparatus, display control method, and computer program | |
KR20130099186A (en) | Display device, user interface method, and program | |
KR101586559B1 (en) | Information processing apparatus and information processing method | |
US9557907B2 (en) | Display device capturing digital content and method of controlling therefor | |
KR102161061B1 (en) | Method and terminal for displaying a plurality of pages | |
JP5921703B2 (en) | Information display device and operation control method in information display device | |
JP5875262B2 (en) | Display control device | |
KR20160010993A (en) | Object editing method and image display device using thereof | |
US10915240B2 (en) | Method of selection and manipulation of graphical objects | |
KR102118046B1 (en) | Portable device and controlling method thereof |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 12886797 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2014541847 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14426092 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 112012007167 Country of ref document: DE Ref document number: 1120120071670 Country of ref document: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 12886797 Country of ref document: EP Kind code of ref document: A1 |