WO2017110606A1 - Information processing apparatus, control method thereof, and program - Google Patents
- Publication number
- WO2017110606A1 (PCT application PCT/JP2016/087156)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- touch
- display device
- information processing
- processing apparatus
- distance
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04847—Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
Definitions
- the present invention relates to an information processing apparatus having a user interface function capable of specifying a position in response to a user operation, a control method thereof, and a program.
- a seek bar is known as one of user interfaces (UI) for specifying a position.
- the seek bar can be used, for example, to select a display image from a group of continuously shot images.
- Other examples of such a GUI include a slider bar and a scroll bar.
- Patent Document 1 discloses an imaging apparatus that can perform continuous shooting, and allows a user to select one image from a group of continuously shot images using a GUI that can be slid.
- a user uses a finger to drag a knob horizontally along a slide bar, and a captured image taken at the time corresponding to the slide amount is displayed in a display area.
- the above-described problem occurs when moving any user interface element, particularly a user interface element having a small display area, in addition to the seek bar.
- the above problem occurs not only when an input is performed by a touch operation, but also when an input is performed using an arbitrary pointing device (pointing input device).
- an object of the present invention is to improve the usability of a user interface that can specify a position in response to a user operation.
- An information processing apparatus includes: display control means for displaying a movable user interface element (UI element) on a display device; detection means for detecting a user operation on the display device; acquisition means for acquiring a first position detected on the display device; and determination means for determining, based on the acquired first position, a second position at which the UI element is displayed on the display device.
- The apparatus further includes storage means for storing the second position when the detection of the user operation by the detection means is completed, and calculation means for calculating a distance between a third position at which detection of a user operation is newly started and the second position stored in the storage means; the UI element is selectively displayed at a position based on either the third position or the stored second position according to the calculated distance.
- An information processing apparatus includes: detection means for detecting a touch operation on a display device; processing means for executing a process based on a position corresponding to the touch operation; storage means for storing the position used for the execution of the process; and calculation means for calculating a distance based on positions corresponding to touch operations on the display device. The detection means detects a first touch operation and a second touch operation on the display device, and the storage means stores a first position corresponding to the first touch operation.
- When the second touch operation on the display device is detected, the calculation means calculates the distance between the stored first position and a second position corresponding to the second touch operation. The processing means executes a process based on the first position in response to the first touch operation; if the distance is smaller than a predetermined value, it executes a process based on the first position in response to the second touch operation, and if the distance is larger than the predetermined value, it executes a process based on the second position in response to the second touch operation.
- An information processing apparatus control method includes: a step of displaying a movable user interface element (UI element) on a display device; a step of detecting a user operation on the display device; a step of acquiring a first position detected on the display device; a step of determining, based on the acquired first position, a second position at which the UI element is displayed on the display device; a step of storing the second position when the detection of the user operation is completed; and a step of calculating a distance between a third position at which detection of a user operation is newly started and the stored second position.
- The method further has a step of controlling the UI element so as to be selectively displayed at either a fourth position determined based on the third position or the stored second position, according to the calculated distance.
- A method for controlling an information processing apparatus includes: a step of detecting a first touch operation on a display device; and a step of executing, in response to the first touch operation, a process based on a first position corresponding to the first touch operation.
- The method further includes: a step of storing the first position used to execute the process; a step of detecting a second touch operation on the display device and calculating the distance between the stored first position and a second position corresponding to the second touch operation; a step of executing the process based on the first position in response to the second touch operation if the distance is smaller than a predetermined value; and a step of executing the process based on the second position in response to the second touch operation if the distance is greater than the predetermined value.
- the usability of a user interface that can specify a position in response to a user operation is improved, and the user can easily specify a desired position.
- FIG. 1 is a block diagram illustrating the configuration of the information processing apparatus according to the embodiment.
- FIG. 2 is a block diagram illustrating functions of the information processing apparatus according to the embodiment.
- 3A to 3C are diagrams illustrating examples of the seek bar UI according to the embodiment.
- FIG. 4 is a flowchart illustrating the seek bar UI control method according to the first embodiment.
- 5A to 5F are diagrams for explaining an example of the operation of the seek bar UI according to the first embodiment.
- FIG. 6 is a flowchart illustrating a method for controlling the seek bar UI according to the second embodiment.
- FIG. 7 is a flowchart illustrating a method for controlling the seek bar UI according to the third embodiment.
- FIGS. 8A and 8B are diagrams illustrating an example of a seek bar UI according to the fourth embodiment.
- FIG. 9 is a flowchart illustrating a method for controlling the seek bar UI according to the fourth embodiment.
- FIG. 10 is a diagram illustrating an example of a threshold setting UI according to the sixth embodiment.
- FIG. 11 is a flowchart illustrating a control method of operation control according to the seventh embodiment.
- FIG. 12 is a diagram illustrating an example of a display screen of the mobile device 100 according to the seventh embodiment.
- 13A to 13F are diagrams illustrating an example of a control method of operation control according to the seventh embodiment.
- FIG. 14 is a flowchart illustrating a control method of operation control according to the eighth embodiment.
- FIG. 15 is a flowchart illustrating a control method of operation control according to the ninth embodiment.
- 16A and 16B are diagrams illustrating an example of a control method of operation control according to the ninth embodiment.
- FIG. 17 is a flowchart illustrating a control method of operation control according to the tenth embodiment.
- FIG. 18 is a diagram illustrating an example of a setting UI for each setting according to the eleventh embodiment.
- the present invention can also be realized by supplying a storage medium storing a program code to a system or apparatus, and a computer (or CPU or MPU) of the system or apparatus reads and executes the program code from the storage medium.
- the program code itself read from the storage medium realizes the functions of the above-described embodiments, and the program code itself and the storage medium storing the program code constitute the present invention.
- a storage medium for supplying the program code for example, a flexible disk, a hard disk, an optical disk, a magneto-optical disk, a CD-ROM, a CD-R, a magnetic tape, a nonvolatile memory card, a ROM, or the like can be used.
- the program code read from the storage medium may be written in a memory provided in a function expansion board inserted into the computer or a function expansion unit connected to the computer. Then, based on the instruction of the program code, the CPU or the like provided in the function expansion board or function expansion unit performs part or all of the actual processing, and the function of the above-described embodiment is realized by the processing. Needless to say, these are also included in the present invention.
- a touch panel is mainly used as an example of a pointing device.
- the pointing device is an input device that can input a point start, a drag, a point end, and the like.
- Examples of the pointing device include a touch panel, a touch pad, a mouse, a pointing stick, a trackball, a joystick, and a pen tablet.
- when the touch panel is used, the point operation is called a touch operation.
- the point operation start and the point operation end are hereinafter referred to as touch-on and touch-off, respectively.
- FIG. 1 is an internal configuration diagram of a mobile device 100 such as a smartphone or a tablet according to an embodiment of the present invention.
- the mobile device 100 includes a CPU 101, a DRAM 102, a communication unit 106, a display unit 107, an input unit 108, and an SSD 109.
- a CPU (Central Processing Unit) 101 performs various calculations and controls each part of the mobile device 100 in accordance with input signals and programs.
- the CPU 101 provides a seek bar UI control function 200 as shown in FIG. 2 by executing the control program 103 read into the DRAM 102. That is, the CPU 101 functions as an input acquisition unit 201, a position determination unit 202, a position storage unit 203, a UI display control unit 204, and a process execution unit 205.
- the input acquisition unit 201 acquires an input from the input unit 108.
- the position determination unit 202 determines the knob position of the seek bar UI.
- the position storage unit 203 stores the reference position of the knob.
- the UI display control unit 204 outputs data for displaying the seek bar UI on the display unit 107 to the display unit 107.
- the process execution unit 205 executes a predetermined process such as a display image change in response to an input to the seek bar UI by the user. Details of these functional units will be described later.
- a DRAM (Dynamic Random Access Memory) 102 is a primary storage device.
- the DRAM 102 stores a control program 103 and an operating system 105 read from the program storage unit 110.
- the control program 103 includes a program for the mobile device 100 to perform image management.
- the operating system 105 includes a program for performing basic operations of the mobile device.
- a part of the DRAM 102 is used as a working memory 104 when the CPU 101 executes each program.
- the SSD (Solid State Drive) 109 is a secondary (auxiliary) storage device using a non-volatile flash memory.
- the program storage unit 110 stores programs for the mobile device 100 to execute various functions and the basic operating system program. These programs are read into the DRAM 102, the primary memory, which can be read and written at higher speed, and are sequentially read and executed by the CPU 101. The manner in which programs on the SSD 109 are loaded into the DRAM 102 and executed by the CPU 101 to provide the device's functions is the same as in a general mobile device.
- the SSD 109 stores a plurality of image data 111, 112, 113, 114. These image data are JPEG files taken by the imaging apparatus. The figure shows that four image files IMG_0001.JPG to IMG_0004.JPG have been transferred among 100 image files IMG_0001.JPG to IMG_0100.JPG in the image pickup apparatus. The same applies to moving images, continuous shot images, and audio.
- the display unit 107 is an image display device such as a liquid crystal display. In mobile devices, the display unit 107 is generally provided integrally with the main body, but a display device different from the mobile device main body may be connected to the mobile device.
- the display unit 107 displays various information including image data and controls (also referred to as UI elements or UI objects) for user operations.
- the input unit 108 is a configuration for the user to input to the mobile device 100.
- the input unit 108 is configured from a touch panel that is generally used in mobile devices.
- the touch panel detects a user's touch operation on the image display device.
- the touch panel method is not particularly limited, and any existing method such as a capacitance method, a resistive film method, or a surface acoustic wave method can be employed.
- the communication unit 106 transmits / receives data to / from other devices by wireless communication or wired communication.
- the communication unit 106 provides, for example, communication using a wireless connection such as a wireless LAN or communication using a wired connection such as a USB (Universal Serial Bus) cable.
- the communication unit 106 may be directly connected to an external device, or may be connected to the external device via a server or a network such as the Internet.
- FIG. 3A shows an example of a display screen.
- the display screen includes an image display area 301 and a seek bar UI 300.
- FIG. 3B is a diagram showing a detailed configuration of the seek bar UI 300.
- the seek bar UI 300 includes a track 310 that is a horizontally long rectangular operation area, and a knob 320 that can move in the left-right direction along the track 310.
- the “knob” is also referred to as “thumb”, “indicator”, “tab”, or the like.
- the knob indicates the position of the currently displayed image in the whole of a plurality of images taken continuously, and the user can switch the image displayed in the image display area 301 by moving the knob 320. In the image display area 301, an image corresponding to the position of the knob 320 is displayed.
- if the knob 320 is at the left end of the track 310, the first image is displayed; if the knob 320 is at the right end of the track 310, the 140th image is displayed; and if the knob is at the position corresponding to 24/140, the 24th image is displayed.
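This mapping from knob position to image number can be sketched as follows. This is an illustrative sketch, not code from the patent; the function name, parameters, and linear-interpolation details are assumptions:

```python
def image_index(knob_x, track_left, track_width, total_images):
    """Map a knob x-coordinate on the track to a 1-based image number.

    The left end of the track selects image 1, the right end selects the
    last image, and intermediate positions interpolate linearly.
    """
    # Clamp to the track so positions slightly outside still map to an image.
    frac = min(max((knob_x - track_left) / track_width, 0.0), 1.0)
    return 1 + round(frac * (total_images - 1))
```

With 140 images, the left end of the track gives image 1 and the right end image 140, as in the example above.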
- a thumbnail 330 of a part of the continuously shot images is displayed so that the contents of the continuously shot images can be easily understood.
- seven thumbnails are displayed in a row, and each is a thumbnail of the image at position 0/6, 1/6, ..., 6/6 of the continuous shot images.
- the knob 320 of the seek bar UI 300 moves along the row of thumbnails 330.
- the basic movement method of the knob 320 is as follows. Here, the basic operation of the seek bar UI will be described, and details of the operation of the seek bar UI in this embodiment will be described later.
- when the user touches a position on the track 310, the knob 320 moves to that position.
- when the user performs a drag operation, the knob 320 moves accordingly. More specifically, the knob 320 first moves to the drag start position (touch-on position), and then the knob 320 moves according to the drag movement amount.
- the position of the knob 320 can be moved by switching the image displayed in the image display area 301. For example, when the user performs a swipe operation in the image display area 301, the image displayed in the image display area 301 is switched, and accordingly, the knob 320 moves to a position corresponding to the image after switching.
- the specific display mode of the seek bar UI 300 is not limited to FIG. 3B.
- the seek bar UI only needs to be able to move the knob 320 along the track 310.
- the seek bar UI may not display a thumbnail on the track 310.
- FIG. 3C shows another example of the seek bar UI 300.
- the seek bar UI 300 is managed as a seek bar UI object in the computer.
- the seek bar UI object includes an internal state (variable) such as a position of the knob 320 and processing (function) when various events occur.
- An internal operation of the computer when the user performs an operation on the seek bar UI on the display unit 107 will be briefly described.
- An input from the input unit 108 is passed as an event to the OS, and this event is passed from the OS to the seek bar UI object.
- the internal state of the seek bar UI is changed, and a predefined operation is performed in response to the change.
- for example, an internal variable representing the position of the knob 320 is updated in response to a touch-on or drag input event on the track 310, and along with the update, the display position of the knob 320 is updated and the image displayed in the image display area 301 is switched.
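The flow just described (an input event passed from the OS to the seek bar UI object, which updates its internal state and then runs the display-update and image-switch processing) can be sketched minimally. The class and callback names are illustrative assumptions, not from the patent:

```python
class SeekBarUIObject:
    """Minimal seek bar UI object: internal state plus event processing."""

    def __init__(self, on_knob_moved):
        self.knob_pos = 0.0                 # internal variable: knob position
        self.on_knob_moved = on_knob_moved  # e.g. redraw the knob, switch image

    def handle_event(self, event_type, x):
        # A touch-on or drag event on the track updates the internal
        # position, and the update triggers the predefined processing.
        if event_type in ("touch_on", "drag"):
            self.knob_pos = x
            self.on_knob_moved(self.knob_pos)
```

Here `on_knob_moved` stands in for the display-position update and image-switch processing mentioned above.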
- Example 1: A seek bar UI control method according to an embodiment of the present invention will be described with reference to FIGS. 4 and 5A to 5F.
- in this embodiment, the position of the knob is determined based on the distance between the knob position at the time of the most recent touch-off on the seek bar (at the end of pointing) and the position at which a touch-on event is acquired (the touch-on position, i.e., the point start position).
- the position of the knob in the seek bar UI is also referred to as a seek bar position.
- in step S401, the input acquisition unit 201 determines whether a touch-on event on the seek bar UI 300 has been acquired. If a touch-on event has not been acquired (S401-NO), the process waits until one is acquired. If a touch-on event has been acquired (S401-YES), the process proceeds to step S402.
- in step S402, the position determination unit 202 determines whether the knob position (xOFF, yOFF) at the time of touch-off on the seek bar is held in the position storage unit 203. If the knob position at the time of touch-off is not held (S402-NO), the process proceeds to step S405, where the position determination unit 202 sets the knob position to the touch-on position (xON, yON).
- in step S403, the position determination unit 202 calculates the distance r between the knob position (xOFF, yOFF) at the time of touch-off and the touch-on position (xON, yON) on the seek bar UI 300.
- in step S404, the position determination unit 202 determines whether the distance r is greater than a predetermined threshold value θ. If the distance r is larger than the threshold value θ (r > θ), the process proceeds to step S405, and the position determination unit 202 sets the knob position to the touch-on position (xON, yON). On the other hand, if the distance r is equal to or less than the threshold value θ (r ≤ θ), the process proceeds to step S406, and the position determination unit 202 sets the knob position to the held touch-off knob position (xOFF, yOFF).
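Steps S402 to S406 amount to a simple distance test, which might be sketched like this. The function name and (x, y)-tuple representation are illustrative assumptions:

```python
import math


def knob_position_on_touch_on(touch_on, stored_knob, threshold):
    """Decide the knob position when a touch-on event arrives.

    touch_on and stored_knob are (x, y) tuples; stored_knob is None when
    no knob position from a previous touch-off is held (S402-NO).
    """
    if stored_knob is None:
        return touch_on                   # S405: use the touch-on position
    r = math.dist(touch_on, stored_knob)  # S403: distance r
    if r > threshold:                     # S404
        return touch_on                   # S405: jump to the touched position
    return stored_knob                    # S406: keep the held knob position
```

A touch landing within the threshold of the previously released knob thus leaves the knob where it was, while a touch farther away moves the knob there.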
- the distance r may be obtained as a distance along the moving direction of the knob.
- a knob drawing area may be used instead of a point as the knob position stored in the position storage unit 203. In this case, the distance between the touch-on position and the knob drawing area may be zero when the touch-on position is within the knob drawing area, and the shortest distance from the touch-on position to the knob drawing area when the touch-on position is outside it.
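When the knob drawing area is used instead of a point, the distance described above is the standard point-to-rectangle distance. A sketch, with the function name and rectangle representation as assumptions:

```python
import math


def distance_to_knob_area(px, py, left, top, right, bottom):
    """Shortest distance from a touch point to the knob drawing rectangle.

    Zero when the point lies inside the rectangle, otherwise the distance
    to the nearest edge or corner.
    """
    dx = max(left - px, 0.0, px - right)   # horizontal gap to the rectangle
    dy = max(top - py, 0.0, py - bottom)   # vertical gap to the rectangle
    return math.hypot(dx, dy)
```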
- the magnitude of the threshold value ⁇ may be a predetermined value, or may be a value dynamically determined according to the width of the knob 320 (when the width of the knob 320 changes).
- the threshold value θ can be, for example, about the same size as the contact area between the finger and the touch panel in a touch operation (for example, about 1 to 1.5 times that size). The threshold value θ can also be about 1 to 10 times half the width of the knob 320.
- a value selected based on a predetermined criterion from a plurality of values obtained as described above, for example, a minimum value or a maximum value among the plurality of values may be employed.
- in step S407, it is determined whether the input acquisition unit 201 has acquired a touch-off event.
- if the touch-off event is not acquired, it means that an operation such as dragging continues and the finger has not been separated from the touch panel.
- in this case, the position determination unit 202 determines the knob position according to the touch-on position (xON, yON) and the current touch position (x, y) (S409). For example, the position determination unit 202 sets, as the new knob position, the position obtained by adding the movement amount (x - xON, y - yON), or the x-direction movement amount (x - xON, 0), to the knob position determined in step S405 or S406.
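Step S409 can be sketched as adding the drag movement amount to the base knob position chosen in S405 or S406. The function and parameter names are illustrative assumptions:

```python
def knob_position_during_drag(base_knob, touch_on, current, horizontal_only=True):
    """Knob position while a drag continues (step S409).

    base_knob is the position set in S405/S406; the knob is offset by the
    movement (current - touch_on), restricted to x for a horizontal bar.
    """
    dx = current[0] - touch_on[0]
    dy = 0 if horizontal_only else current[1] - touch_on[1]
    return (base_knob[0] + dx, base_knob[1] + dy)
```

Because the offset is taken from the touch-on position rather than the knob itself, a knob that stayed at the old touch-off position (S406) still follows the finger's relative movement.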
- if the touch-off event is acquired in step S407, the process proceeds to step S408, and the knob position at the time the touch-off event was acquired is stored in the position storage unit 203.
- if a knob position (xOFF, yOFF) is already held in the position storage unit 203, it is overwritten with the new value.
- in step S408, the time at which the touch-off event was acquired is stored in the position storage unit 203 in association with the knob position.
- in step S402, if a predetermined time or more has elapsed from the time stored in association with the knob position, it may be determined that the knob position at the time of touch-off is not held (NO in S402).
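This time-based variant of the S402 check could look like the following. The function name and the choice of seconds as the unit are assumptions:

```python
def stored_knob_is_held(stored_time, now, max_age_seconds):
    """S402 with a time condition: the stored touch-off knob position is
    treated as held only within max_age_seconds of the touch-off that
    stored it; otherwise S402 answers NO and S405 is taken.
    """
    return stored_time is not None and (now - stored_time) < max_age_seconds
```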
- in this way, the knob position at the previous touch-off is used selectively according not only to the distance from that position but also to the elapsed time since the previous touch-off.
- An accurate knob position suitable for the user's intention can be designated.
- the UI display control unit 204 updates the display of the seek bar UI 300 along with the update of the knob position, and the process execution unit 205 performs a predetermined process.
- An example of the process performed by the process execution unit 205 is a process of changing an image displayed in the image display area 301.
- when the knob position is updated in steps S405, S406, and S409, these processes are executed.
- 5A to 5F are examples of screen display including the seek bar UI in the above-described example of the seek bar UI control method.
- an operation example of the above-described seek bar UI control method will be described with reference to FIGS. 5A to 5F.
- FIG. 5A shows a user interface 501 displayed on the display unit 107 when the mobile device 100 executes a program for selecting a best shot image from continuous shot images.
- 140 images are taken as continuous shot images and stored in the SSD 109.
- a continuous shot image captured by a camera (not shown) or a continuous shot image held by another external device may be displayed on the mobile device 100 while being transferred to the mobile device 100.
- the specific format of the plurality of continuous shot images is arbitrary, and the plurality of continuous shot images may be stored in one file, or may be stored in different files.
- the numerical value on the left side of the display 502 indicates the image number currently being displayed.
- the image number indicates the number taken from the beginning of the continuous shot image.
- the numerical value on the right side of the display 502 indicates the total number of continuous shot images.
- the check box 503 is used by the user to select (specify) the best shot among the continuous shot images. The user turns on the check box 503 to select the currently displayed image as the best shot.
- the check box 503 may be checked when the user touches on the preview image 504. By performing a predetermined operation after checking the check box 503 (or doing nothing), the image corresponding to the checked preview image is saved or marked as the best shot image.
- the preview image 504 is an enlarged display of an image corresponding to the seek bar position (knob position on the seek bar UI) in the continuous shot image.
- the seek bar UI includes a track 507 and a knob 505 that can move along the track 507.
- the knob 505 moves left and right along the track 507, on which thumbnails are arranged, in response to an operation such as a touch-on or a drag.
- thumbnail images of the continuous shot images are displayed on the track 507 of the seek bar UI. If possible (if the total number of continuous shot images is small), thumbnails of all the continuous shot images are displayed superimposed on the track 507. However, when not all the thumbnails can be displayed (including the case where the thumbnails would become smaller than a predetermined reference), thumbnails of only some of the continuous shot images are displayed superimposed on the track 507.
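Choosing which thumbnails to superimpose on the track can be sketched as evenly spaced sampling of the burst. The function name and rounding choice are illustrative assumptions:

```python
def thumbnail_indices(total_images, slots):
    """Pick 0-based image indices for the thumbnails shown on the track.

    When everything fits, every image gets a thumbnail; otherwise the
    thumbnails sample the burst evenly from the first to the last image
    (e.g. positions 0/6, 1/6, ..., 6/6 when slots == 7).
    """
    if total_images <= slots:
        return list(range(total_images))
    last = total_images - 1
    return [round(i * last / (slots - 1)) for i in range(slots)]
```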
- a position 506 (xOFF, yOFF) is a knob position when the user touches off from the seek bar UI. This position 506 is held in the position storage unit 203.
- FIG. 5B shows a position 508 (xON, yON) where the user touches on the seek bar UI again after the touch-off in FIG. 5A.
- a distance r between the knob position 506 at the previous touch-off and the touch-on position 508 is calculated.
- a region 514 indicated by diagonal lines in FIG. 5B indicates the region located within the threshold distance ε from the knob position 506 at the time of touch-off.
- the region 514 is drawn in the figure for purposes of description; in the present embodiment, the region itself need not actually be displayed on the screen.
- the distance r between the touch-on position 508 and the knob position 506 at the time of touch-off is equal to or less than the threshold ε.
- in this case, the position determination unit 202 makes the knob position 509 at the time the touch-on event is acquired the same as the knob position 506 at the previous touch-off, and enables movement of the knob with the position 506 as the start position. That is, when the touch-on position (xON, yON) is close to the knob position (xOFF, yOFF) at the time of the touch-off preceding the touch-on, the knob position is not changed. When the user performs a drag operation after the touch-on, the knob moves from the position 506 by an amount corresponding to the drag amount.
- position 510 is the knob position when the user touches off the seek bar UI again.
- the touch-off position 511 and the knob position 510 are different.
- the knob position held in the position storage unit 203 is updated from the position 506 to the position 510.
- FIG. 5E shows a location 512 where the user has touched on the seek bar UI again after the touch-off in FIG. 5D.
- the distance r between the knob position 510 at the previous touch-off and the touch-on position 512 is calculated. This time, the distance r is larger than the threshold ε. Then, as illustrated in FIG.
- the position determination unit 202 sets the position 512 at which the touch-on event is acquired as the knob position 513 and enables movement of the knob with the position 513 as the start position. That is, when the touch-on position is far from the knob position at the time of the touch-off preceding the touch-on, the knob position is changed to the touch-on position. When the user performs a drag operation after the touch-on, the knob moves from the touch-on position 512 (513) by an amount corresponding to the drag amount.
- the present embodiment as described above may be selectively implemented depending on conditions. For example, the above-described embodiment is performed when a condition such as a setting to be performed in advance by the user, operation in a specific mode, or operation of a specific UI element is satisfied. If the condition is not satisfied, the knob position is determined according to the new touch-on position even if the distance between the knob position at the previous touch-off and the new touch-on position is shorter than the threshold.
- the seek bar (knob) position is determined according to the distance between the previous touch-off position and the current touch-on position. Specifically, if the distance is small, the knob position at the previous touch-off is determined again as the knob position, and if the distance is large, the current touch-on position is determined as the knob position. Therefore, when the user wants to specify the same position again after touch-off, and touches on the vicinity of the knob position at the previous touch-off, the knob position at the previous touch-off is specified. Since the user does not need to accurately touch on the knob position at the previous touch-off, the user can easily specify the same knob position as before.
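The decision described above can be sketched in Python; the function and parameter names are illustrative assumptions, not taken from the embodiment, and ε stands for the distance threshold:

```python
import math

# Sketch of the Example 1 decision: on touch-on, reuse the knob position
# held at the previous touch-off when the touch lands within the threshold
# distance epsilon of it; otherwise jump the knob to the touch-on point.
def knob_start_position(touch_on, last_off_knob, epsilon):
    if last_off_knob is None:               # no stored touch-off position
        return touch_on
    r = math.dist(touch_on, last_off_knob)  # distance r
    if r <= epsilon:                        # close enough: resume from the old knob
        return last_off_knob
    return touch_on                         # far away: move the knob to the touch-on (cf. S405)
```

For example, with ε = 20, a touch at (105, 50) near a stored knob position (100, 50) resumes from (100, 50), while a touch at (200, 50) moves the knob there.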
- Example 2: A seek bar UI control method according to an embodiment of the present invention will be described with reference to FIG.
- the control of this embodiment is basically the same as that of the above embodiment, but a function for preventing the movement of the seek bar due to a user's erroneous operation is added. In the following, differences from the first embodiment will be mainly described.
- steps S601 to S602 are added to the processing of the first embodiment (FIG. 4).
- step S601 after detecting the touch-on event, the position determination unit 202 calculates a touch-on duration t.
- in step S602, the position determination unit 202 determines whether the touch-on duration t is greater than the threshold time τ. If the touch-on duration t is greater than the threshold τ, the processing from step S402 onward is executed as in the first embodiment. If the touch-on duration t is less than or equal to the threshold τ, the process ends.
- the touch-on duration t is the time that the user keeps touching without moving the touch position by a predetermined amount or more after the touch-on event is detected.
- in the above, the touch-on duration t is obtained and then compared with the threshold time τ. However, the processing may instead be configured so that the processing from step S402 onward is executed when the touch has continued at substantially the same position for longer than the threshold time τ, and the processing ends if a touch-off or a drag of a predetermined amount or more occurs before the threshold time τ elapses.
- in this way, touch input shorter than the threshold time τ is invalidated, and the processing is validated only when touch input of the threshold time τ or longer (a so-called long tap) is performed. That is, this has the effect that the seek bar does not move when the user accidentally touches the screen.
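As a minimal sketch of this gate (the names and the drag tolerance are assumptions for illustration):

```python
# A touch is processed only when it has been held longer than the
# threshold time tau without moving beyond a small drag tolerance;
# shorter or moving touches are treated as accidental and ignored.
def touch_is_valid(duration_t, drag_amount, tau, drag_tolerance=5.0):
    if drag_amount >= drag_tolerance:   # the touch moved: not a stationary hold
        return False
    return duration_t > tau             # held long enough (a so-called long tap)
```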
- the start of processing is determined based only on the duration of touch-on, but other factors may be taken into consideration.
- for example, the input may be invalidated when the touch position moves more than a threshold distance between the touch-on and the elapse of the threshold time τ.
- the “touch-on position” in step S402 and subsequent steps in the present embodiment may be the touch position when the touch-on event occurs, or may be the touch position when the threshold time τ has elapsed since the occurrence of the touch-on.
- Example 3: A seek bar UI control method according to an embodiment of the present invention will be described with reference to FIG.
- in this embodiment, when a long tap is performed, the seek bar knob is moved to the touch position without comparing the knob position at the time of touch-off with the touch-on position.
- the calculation process of the touch-on duration t in step S701 is the same as that in step S601 in the second embodiment.
- in step S702, the touch-on duration t is compared with the threshold σ. If the touch-on duration t is equal to or shorter than the threshold time σ (S702-NO), that is, if a long tap has not been performed, the process proceeds to step S402 and the same processing as in the first embodiment is performed. On the other hand, if the touch-on duration t is longer than the threshold time σ (S702-YES), that is, if a long tap has been performed, the process proceeds to step S405, where the knob position is set to the touch-on position.
- the threshold time σ may be the same as or different from the threshold time τ in the second embodiment.
- for example, σ is made larger than τ (σ > τ).
- in this way, when the user intentionally performs a long tap operation, the knob position can be moved to the touch-on position regardless of the distance between the knob position at the time of touch-off and the touch-on position.
- alternatively, when the touch-on event is acquired, the processing from step S402 onward may be performed immediately to change the knob position, and when a long tap event is detected thereafter, the knob may be moved to the long tap position (touch-on position). The same effect is obtained in this way as well.
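Combining this long-tap rule with the Example 1 distance test gives roughly the following sketch (hypothetical names; σ is the long-tap time threshold and ε the distance threshold):

```python
import math

# Long tap (duration above sigma): always move the knob to the touch-on
# position, regardless of the distance to the previous touch-off knob.
# Otherwise fall back to the Example 1 distance comparison.
def knob_start_position_ex3(touch_on, last_off_knob, epsilon, duration_t, sigma):
    if duration_t > sigma:              # long tap: force the jump (S702-YES)
        return touch_on
    if last_off_knob is not None and math.dist(touch_on, last_off_knob) <= epsilon:
        return last_off_knob            # near the previous knob: resume from it
    return touch_on
```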
- the processing is switched according to the distance between the knob position at the time of touch-off and the touch-on position.
- however, the position of the knob of the seek bar UI may also change for reasons other than a touch operation on the seek bar. For example, while a moving image or audio file is being played back, the knob position moves according to the playback location.
- a seek bar UI is used for reproduction / editing of a moving image file.
- FIG. 8A shows a user interface 801 displayed on the display unit 107 when the mobile device 100 executes a program for selecting a best shot image from images constituting a moving image.
- a display 802 represents the total number of frames constituting the moving image and the current frame position.
- a check button 803 is used to select the best shot image.
- the preview image 804 displays the currently designated frame image.
- the seek bar UI at the bottom of the screen includes a knob 805 and a track 807 on which a predetermined number (seven in this case) of thumbnail images are displayed in a row.
- the audio control 808 is used to perform operations such as moving image playback, pause, frame advance, and frame return. Playback and pause are displayed alternately.
- FIG. 8B shows a user interface 811 displayed on the display unit 107 when the mobile device 100 executes a program for editing an audio file.
- the basic configuration of the user interface 811 is the same as that of the user interface 801 in FIG. 8A. The difference is that an audio waveform 814 at the current position is displayed at the center of the screen and a thumbnail or the like is not displayed on the track 817.
- FIG. 9 is a flowchart showing a method for controlling the seek bar UI in the present embodiment.
- step S901 it is determined whether the input acquisition unit 201 has acquired a touch-on event in the seek bar UI.
- if the touch-on event is not acquired, the process waits until the touch-on event is acquired. If a touch-on event has been acquired (S901-YES), the process proceeds to step S902.
- step S902 it is determined whether or not the moving image file is being reproduced.
- if the moving image is not being reproduced (S902-NO), the same processing as that in step S402 and subsequent steps in the first embodiment (FIG. 4) is executed.
- if the moving image is being reproduced (S902-YES), the process proceeds to step S903.
- Steps S903 to S909 are basically the same as steps S403 to S409 in the first embodiment (FIG. 4). The difference is that the current knob position (xCUR, yCUR) is used instead of the knob position (xOFF, yOFF) at the time of touch-off. Accordingly, the processing contents of steps S903 and S906 are different.
- the processing from step S903 onward may be performed regardless of whether the moving image is being played back or paused. In this way, when the vicinity of the current knob position is touched, the movement can be started from the current knob position.
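The variation can be sketched as follows (assumed names; the only change from the Example 1 sketch is the reference position used while playing):

```python
import math

# During playback the knob tracks the playback location, so the
# comparison uses the current knob position (xCUR, yCUR) instead of the
# knob position stored at the last touch-off.
def knob_start_position_ex4(touch_on, playing, current_knob, last_off_knob, epsilon):
    reference = current_knob if playing else last_off_knob
    if reference is not None and math.dist(touch_on, reference) <= epsilon:
        return reference                # touch near the reference: resume from it
    return touch_on
```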
- the present embodiment can be applied not only when reproducing / editing moving images, audio, etc., but also when displaying continuously shot images as in the first to third embodiments.
- the processing is switched according to the distance between the touch-on position and a certain reference position (the knob position at the time of the latest touch-off or the current knob position).
- the knob position is determined by comparison with a plurality of reference positions.
- the position storage unit 203 stores a plurality of reference positions.
- in this embodiment, the plurality of reference positions include the knob positions at the time of a plurality of past touch-off (pointing-end) operations in the seek bar UI and the current knob position. More specifically, the knob positions at past touch-off operations can be the knob positions at the last predetermined number of touch-offs, or the knob positions at touch-offs within the latest predetermined time.
- the position determination unit 202 calculates the distance between the touch-on position and each of these reference positions, and if the distance to any reference position is equal to or less than a threshold, that reference position is regarded as having been touched. When the distance is equal to or less than the threshold for two or more reference positions, the reference position closest to the touch-on position may be regarded as having been touched.
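A sketch of this matching rule (illustrative names):

```python
import math

# Compare the touch-on position against every stored reference position
# (past touch-off knob positions plus the current knob position); if one
# or more lie within the threshold, snap to the nearest one.
def match_reference(touch_on, references, epsilon):
    candidates = [p for p in references if math.dist(touch_on, p) <= epsilon]
    if not candidates:
        return touch_on                 # no reference close enough: use the touch itself
    return min(candidates, key=lambda p: math.dist(touch_on, p))
```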
- a condition may be imposed for a knob position to become a reference position. For example, a knob position is set as a reference position only when the condition is satisfied that, after touch-off, the knob has stopped at that position without the display of content such as an image changing for a predetermined time or longer.
- this is because content such as an image displayed corresponding to a position where the knob stays is considered to be important, whereas the content corresponding to a knob position that is changed immediately is considered not to be important.
- the knob position at the time of touch-off may be held as a reference position.
- the predetermined operation is, for example, an operation by which the user explicitly or implicitly instructs the apparatus to store the knob position at the time of the touch-off as a reference position.
- the threshold distance ε in the first to fifth embodiments and the threshold times τ and σ of the touch-on duration in the second and third embodiments can be set by the user.
- FIG. 10 is an example of a user interface for setting the threshold values ε and τ.
- here, the distance threshold ε is expressed by the term “touch effective range”, and the time threshold τ is expressed by the term “touch effective time”.
- the environment setting UI 1001 includes a slider control 1002 for changing the effective range of touch.
- an effective touch range 1005 is displayed on both sides of the seek bar knob 1004.
- the environment setting UI 1001 includes a slider control 1003 for changing the effective time of touch.
- the effective time of the touch is used as the threshold value τ for preventing the seek bar from moving when the user touches the screen by mistake.
- the effective range and effective time are specified using the slider control.
- these values may be input using numerical input, spin control, or the like.
- the effective range of touch may be specified in the display area of the seek bar UI at the bottom of the screen.
- Example 7: A control method for an operation control according to an embodiment of the present invention will be described with reference to FIGS. 2, 11, and 12.
- the operation control is, for example, an arrow cursor displayed on the image display area 1201 of the display device as indicated by reference numeral 1202 in FIG.
- in this embodiment, the position of the operation control is determined according to the distance between the operation control position at the time of the most recent touch-off on the operation screen and the position at which the touch-on event is acquired at touch-on (the touch-on position, or point start position).
- each step of FIG. 11 will be described in detail.
- step S1101 it is determined whether the input acquisition unit 201 has acquired a touch-on event on the image display area 1201.
- the process waits until the touch-on event is acquired. If a touch-on event has been acquired (S1101-YES), the process proceeds to step S1102.
- step S1102 the position determining unit 202 determines whether or not the position storage unit 203 holds the operation control position (xOFF, yOFF) at the time of touch-off on the operation screen. If the operation control position (xOFF, yOFF) at the time of touch-off is not held (S1102-NO), the process proceeds to step S1105. In step S1105, the position determination unit 202 sets the operation control position to the touch-on position (xON, yON).
- step S1103 the position determination unit 202 calculates the distance r between the operation control position (xOFF, yOFF) and the touch-on position (xON, yON) at the time of touch-off.
- in step S1104, the position determination unit 202 determines whether the distance r is greater than a predetermined threshold ε (distance r > threshold ε). If the distance r is greater than the threshold ε, the process proceeds to step S1105, and the position determination unit 202 sets the operation control position to the touch-on position (xON, yON). On the other hand, if the distance r is less than or equal to the threshold ε (distance r ≤ threshold ε), the process proceeds to step S1106, and the position determination unit 202 sets the operation control position to the operation control position (xOFF, yOFF) at the time of touch-off.
- the magnitude of the threshold ε may be a predetermined value set in advance, or may be a value determined dynamically according to the width of the operation control 1202 (for the case where the size of the operation control 1202 changes).
- the threshold ε can be, for example, about the same as the size of the contact area between the finger and the touch panel when the user performs a touch operation (for example, about 1 to 1.5 times that size). Alternatively, the threshold ε can be about 1 to 10 times half the width of the knob 320. Further, as the threshold ε, a value selected based on a predetermined criterion from a plurality of values obtained as described above, for example the minimum or the maximum of the plurality of values, may be employed.
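One way to turn these guidelines into a concrete value is sketched below; the specific multipliers and the choice of the minimum are assumptions for illustration, since the text only gives ranges:

```python
# Derive the distance threshold epsilon from the finger contact size
# (about 1 to 1.5 times it) and from half the knob width (about 1 to 10
# times it), then pick one value by a predetermined criterion.
def derive_epsilon(contact_diameter_px, knob_width_px,
                   contact_scale=1.25, knob_scale=2.0):
    from_contact = contact_scale * contact_diameter_px
    from_knob = knob_scale * (knob_width_px / 2.0)
    return min(from_contact, from_knob)   # e.g. the minimum of the candidates
```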
- step S1107 it is determined whether the input acquisition unit 201 has acquired a touch-off event.
- if the touch-off event is not acquired, it means that an operation such as dragging is continuing and the finger has not been separated from the touch panel.
- the position determination unit 202 determines the operation control position according to the touch-on position (xON, yON) and the current touch position (x, y) (S1109). For example, the position determination unit 202 determines a position obtained by adding the movement amount (x-xON, y-yON) to the operation control position determined in step S1105 or S1106 as a new operation control position.
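The drag update of step S1109 amounts to adding the touch movement since touch-on to the start position determined in step S1105 or S1106, roughly (names assumed):

```python
# Move the control from its start position by the same amount the touch
# has moved since touch-on (x - xON, y - yON).
def drag_update(start_control, touch_on, current_touch):
    dx = current_touch[0] - touch_on[0]
    dy = current_touch[1] - touch_on[1]
    return (start_control[0] + dx, start_control[1] + dy)
```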
- if the touch-off event is acquired in step S1107, the process proceeds to step S1108, and the operation control position at the time the touch-off event is acquired is stored in the position storage unit 203.
- the operation control position (xOFF, yOFF) is already held in the position storage unit 203, it is overwritten and updated with a new value.
- in step S1108, the time at which the touch-off event is acquired is also stored in the position storage unit 203 in association with the operation control position.
- in step S1102, if a predetermined time or more has elapsed between the time stored in association with the operation control position and the current time, it may be determined that the operation control position at the time of touch-off is not held (NO in S1102). Alternatively, when a predetermined time has elapsed since the stored time, the time may be deleted from the position storage unit 203 together with the operation control position. With such a configuration, the operation control position at the previous touch-off can be used selectively according to not only the distance from that position but also the elapsed time since the previous touch-off, so that an accurate operation control position matching the user's intention can be specified.
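The time-based expiry can be sketched as follows (hypothetical names):

```python
# The stored touch-off position is used only if the touch-on arrives
# within max_age of the stored timestamp; otherwise it is treated as
# not held (NO in S1102).
def stored_position_if_fresh(stored, now, max_age):
    if stored is None:                  # nothing was saved at touch-off
        return None
    position, saved_at = stored
    if now - saved_at > max_age:        # too old: discard the stored position
        return None
    return position
```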
- the UI display control unit 204 updates the display of the operation control 1202 in FIG. 12 as the operation control position is updated, and the process execution unit 205 performs predetermined processing. For example, the process execution unit 205 executes processing based on the position corresponding to a touch-off operation in response to that touch-off operation. In addition, if the distance between the position corresponding to a touch-on operation and the position corresponding to the previous touch-off operation is smaller than a predetermined value, the process execution unit 205 executes processing based on the position corresponding to the previous touch-off operation in response to the touch-on operation.
- the process execution unit 205 executes a process based on the position corresponding to the touch-on operation in response to the touch-on operation.
- An example of the process performed by the process execution unit 205 is a process of changing an image displayed in the image display area 1201.
- the operation control specifies the position of the slider on the slider bar
- the position of the slider is moved and displayed in response to the position of the operation control.
- further, any of a plurality of frames included in a moving image may be selected depending on the position of the operation control, and the image displayed in the image display area 1201 may be updated.
- a parameter used to adjust the content of the mobile device 100 may be designated based on the position of the operation control.
- pixels included in the image displayed in the image display area 1201 may be specified depending on the position of the operation control, and display characteristics such as the color temperature of the displayed image may be changed according to the specified pixels. When the operation control position is updated in steps S1105, S1106, and S1109, these processes are executed.
- FIGS. 13A to 13F are examples of UI control operations according to an embodiment of the present invention.
- here, for a continuous shot image, the position of the operation control at touch-on is controlled according to the distance between the operation control position at the time of touch-off and the touch-on position.
- a display screen denoted by reference numeral 1301 is an image of an application installed in the mobile device 100 such as a smartphone or a tablet.
- This application has a function of calculating the white balance based on the selected pixel in the displayed image and applying it to the display image.
- the file format of the displayed image is not particularly limited. For example, a JPEG file, a RAW file, a moving image file, or the like may be displayed. Further, the displayed image may be stored by the mobile device 100, or may be displayed while transferring a continuous image stored in the imaging device to the mobile device 100.
- the operation control 1310 in FIG. 13A is an indicator for specifying a pixel at a predetermined position by moving up and down and left and right on the image by an operation such as dragging.
- a cursor is used for operation control.
- Reference numeral 1311 denotes a user's hand operating the mobile device 100.
- Reference numeral 1302 indicates an operation control position (xOFF, yOFF) when the user touches off the operation control.
- Reference numeral 1303 in FIG. 13B indicates a position (xON1, yON1) where the user touches on the terminal after the touch-off.
- the position determining unit 202 calculates the distance between the operation control position (xOFF, yOFF) when touched off and the touched position (xON1, yON1).
- since this distance is equal to or less than the threshold, the operation control position is set to the operation control position (xOFF, yOFF) at the time of touch-off, as indicated by reference numeral 1304 in FIG. 13C.
- Reference numeral 1305 in FIG. 13D denotes an operation control position (xOFF, yOFF) when the user touches off the operation control.
- Reference numeral 1306 denotes a position where the user touches the terminal after touch-off (xON2, yON2). Then, the position determination unit 202 calculates the distance between the operation control position (xOFF, yOFF) when touched off and the touched position (xON2, yON2).
- since this distance is greater than the threshold, the position of the operation control is set based on the touch-on position (xON2, yON2), as indicated by reference numeral 1307 in FIG. 13F.
- white balance is calculated based on the pixel at the new touch position 1307 and applied to the display image of FIG. 13F.
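The text only states that white balance is calculated from the selected pixel; one common approach, shown here purely as an illustrative assumption, scales the channels so that the selected pixel becomes neutral gray:

```python
# Compute per-channel gains that map the selected pixel to neutral gray.
def white_balance_gains(pixel_rgb):
    r, g, b = (max(c, 1) for c in pixel_rgb)  # guard against zero channels
    avg = (r + g + b) / 3.0
    return (avg / r, avg / g, avg / b)

# Apply the gains to a pixel, clamping to the 8-bit range.
def apply_gains(pixel_rgb, gains):
    return tuple(min(255, int(c * g)) for c, g in zip(pixel_rgb, gains))
```

For example, selecting a reddish pixel (200, 100, 100) yields gains that map that pixel to a neutral gray.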
- FIG. 14 is a flowchart showing a control procedure of operation control according to an embodiment of the present invention.
- in this embodiment, in addition to switching the operation control position according to the distance between the operation control position at the time of touch-off and the touch-on position, the movement of the operation control is invalidated when the touch-on duration does not exceed a threshold.
- step S1401 it is determined whether the input acquisition unit 201 has acquired a touch-on event on the image display area 1201. If the touch-on event is not acquired on the image display area 1201 (S1401-NO), the process waits until the touch-on event is acquired. If the touch-on event has been acquired (S1401-YES), the process proceeds to step S1402.
- step S1402 the input acquisition unit 201 calculates a touch-on duration t from the touch-on event, and the process proceeds to step S1403.
- in step S1403, the input acquisition unit 201 compares the touch-on duration t with a predetermined threshold τ, and determines whether the touch-on duration t is greater than the threshold τ.
- if the touch-on duration t is equal to or less than the threshold τ (touch-on duration t ≤ threshold τ), the operation at the position of the operation control is invalidated, and the process ends.
- the touch-on duration is short, there is a high possibility that the user has accidentally touched the operation screen, and this control has an effect of preventing unintended movement of the operation control.
- if the touch-on duration t is larger than the threshold τ (touch-on duration t > threshold τ), it is determined that the user has intentionally touched on, and the process proceeds to step S1404.
- step S1404 the position determination unit 202 determines whether or not the operation control position (xOFF, yOFF) at the time of touch-off is held in the position storage unit 203.
- if the position storage unit 203 does not hold the operation control position at the time of touch-off, the operation control position is set to the touch-on position (xON, yON) in step S1407. If the position storage unit 203 holds the operation control position at the time of touch-off, in step S1405 the position determination unit 202 calculates the distance r between the operation control position (xOFF, yOFF) at the time of touch-off and the touch-on position (xON, yON).
- in step S1406, the position determination unit 202 determines whether the distance r is greater than the threshold ε (distance r > threshold ε). If the distance r is greater than the threshold ε, the operation control position is set to the touch-on position (xON, yON) in step S1407. If the distance r is less than or equal to the threshold ε (distance r ≤ threshold ε), the operation control position is set to the operation control position (xOFF, yOFF) at the time of touch-off in step S1408.
- step S1409 it is determined whether the input acquisition unit 201 has acquired a touch-off event.
- if the touch-off event is not acquired, it means that an operation such as dragging is continuing and the finger has not been separated from the touch panel.
- the position determination unit 202 determines the operation control position according to the touch-on position (xON, yON) and the current touch position (x, y) (S1410). For example, the position determination unit 202 determines a position obtained by adding the movement amount (x-xON, y-yON) to the operation control position determined in step S1407 or S1408 as a new operation control position.
- if the touch-off event has been acquired in step S1409, the process proceeds to step S1411, and the operation control position at the time the touch-off event is acquired is stored in the position storage unit 203.
- the operation control position (xOFF, yOFF) is already held in the position storage unit 203, it is overwritten and updated with a new value.
- FIG. 15 is a flowchart showing a control procedure of operation control according to an embodiment of the present invention.
- in this embodiment, when the operation control position at the time of touch-off is at the edge of the display screen, control is performed so that, at the next touch-on, the movement of the operation control easily resumes from the operation control position at the time of touch-off.
- each step of the flowchart shown in FIG. 15 will be described.
- in step S1501, the input acquisition unit 201 determines whether a touch-on event has been acquired. When the touch-on event is not acquired on the image display area 1201 (S1501-NO), the process waits until the touch-on event is acquired. If a touch-on event is acquired, the process proceeds to step S1502.
- step S1502 the position determination unit 202 determines whether or not the operation control position (xOFF, yOFF) at the time of touch-off is held in the position storage unit 203. If the operation control position at the time of touch-off is not held (S1502-NO), the process proceeds to step S1505. In step S1505, the position determination unit 202 sets the operation control position to the touch-on position (xON, yON).
- in step S1503, the input acquisition unit 201 determines whether the touch-off position (xOFF, yOFF) is at an end of the terminal. As will be described later, the operation control is displayed so as not to overlap the touching finger. This is to prevent the operation control from being hidden by the user's finger, so that the user can see where he or she is pointing. For example, when the touch operation is performed with a finger of the right hand, the operation control is located at the upper left of the position pointed to by the finger and is displayed so as not to overlap the finger.
- the terminal end described in step S1503 refers to the right end and the lower end of the screen.
- if the finger is moved toward the right end or the lower end of the screen while the operation control is displayed at the upper left of the position pointed to by the finger, the finger goes outside the screen (outside the touch sensor area) before the operation control does, and in this state the operation control can no longer be operated. Therefore, the size of the threshold is changed so that, when touch-on is performed again, the operation control can easily be moved to the right end and the lower end of the screen. If the touch-off position (xOFF, yOFF) is at the terminal end (S1503-YES), the threshold rin is changed to rout (a value larger than rin) in step S1504, and the process proceeds to step S1506.
- step S1506 the position determination unit 202 calculates the distance r between the operation control position (xOFF, yOFF) and the touch-on position (xON, yON) at the time of touch-off.
- in step S1507, the position determination unit 202 determines whether the distance r is greater than the threshold rin (or rout) (distance r > threshold). If the distance r is greater than the threshold rin (or rout), the process advances to step S1505, and the position determination unit 202 sets the operation control position to the touch-on position (xON, yON).
- otherwise, in step S1508, the operation control position is set to the operation control position (xOFF, yOFF) at the time of touch-off.
- step S1509 it is determined whether the input acquisition unit 201 has acquired a touch-off event.
- if the touch-off event is not acquired, it means that an operation such as dragging is continuing and the finger has not been separated from the touch panel.
- the position determination unit 202 determines the operation control position according to the touch-on position (xON, yON) and the current touch position (x, y) (S1510).
- the touch-off event is acquired, the process proceeds to step S1511, and the operation control position when the touch-off event is acquired is stored in the position storage unit 203.
- the operation control position (xOFF, yOFF) is already held in the position storage unit 203, it is overwritten and updated with a new value.
- FIGS. 16A and 16B are examples of operation screens according to an embodiment of the present invention.
- the operation control 1601 is an indicator for specifying a pixel at a predetermined position by moving up and down and left and right on the operation screen by an operation such as dragging, and an arrow cursor is used in this embodiment.
- the operation control 1601 is displayed so as not to overlap the finger 1606 to be touched. This is so that the user can see where he is pointing.
- the operation screen is touched with the finger of the right hand, and the operation control is displayed at the upper left of the position pointed by the finger so as not to overlap the finger.
- Reference numeral 1602 denotes a threshold for determining from which position the operation control starts to move when touch-on is performed again, and is set with respect to the distance from the operation control position at the time of touch-off.
- The threshold 1602 is indicated by a circle whose radius is the threshold rin, centered on the operation control position 1601 at the time of touch-off.
- The value of the threshold 1602 may be changed depending on the position on the operation screen at which the touch-on event is acquired. For example, in this embodiment, when a touch-on event is acquired within the area indicated by reference numeral 1603, the threshold is set to rin. When the operation control is displayed at the upper left of a right-hand finger, the operation control can be moved in a single operation by keeping the touch within the area 1603.
- Near the right and lower edges of the screen, however, the finger goes off the screen before the operation control does. For this reason, after touching on again, the touch would have to be moved toward the lower right and the lower edge of the screen to enable movement of the operation control from the previous touch-off position.
- Therefore, the threshold rout indicated by reference numeral 1605 is set to a value larger than the threshold rin indicated by reference numeral 1602. Accordingly, when touch-on is performed again, the operation control position is more easily set to the previous touch-off position.
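A minimal sketch of this region-dependent threshold follows. The edge test, the margin, and the concrete values of rin and rout are assumptions for illustration; the text only requires that the threshold grow near screen regions the control cannot reach.

```python
def pick_threshold(touch_on, screen_w, screen_h,
                   r_in=30, r_out=80, edge_margin=60):
    """Use the larger threshold r_out near the right/bottom edges, where the
    finger leaves the screen before the control does (areas 1604/1605);
    use r_in elsewhere (area 1603). Margin and values are illustrative."""
    x, y = touch_on
    near_edge = x > screen_w - edge_margin or y > screen_h - edge_margin
    return r_out if near_edge else r_in

print(pick_threshold((100, 100), 320, 480))  # 30 (area 1603 -> rin)
print(pick_threshold((300, 450), 320, 480))  # 80 (near edge -> rout)
```

The value returned here would feed directly into the distance comparison of step S1507.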
- FIG. 17 is a flowchart showing a procedure of image processing according to an embodiment of the present invention.
- In this embodiment, image processing is executed based on the pixel selected at the time of touch-off.
- In step S1701, the input acquisition unit 201 determines whether a touch-on event has been acquired. If no touch-on event has been acquired (S1701-NO), the process waits until one is acquired.
- In step S1702, the process execution unit 205 determines whether image processing using the pixel at the previous touch-off position is being executed. If processing based on the previous touch-off position is being executed (S1702-YES), the process proceeds to step S1703; if not (S1702-NO), the process proceeds to step S1704. In step S1703, the image processing is interrupted, and the process proceeds to step S1704.
- In step S1704, the position determination unit 202 determines the start position of the operation control based on the previous touch-off position. Since this process has been described above, its description is omitted here.
- In step S1705, image processing is executed using the pixel at the current touch-off position. In this embodiment, the image processing using the pixel at the previous touch-off position is interrupted before the process of step S1704; however, the image processing may instead be interrupted only when the position of the operation control is changed as a result of the process of step S1704.
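The interrupt-and-restart flow of FIG. 17 might be sketched as below. The processor class, its method names, and the log format are invented for illustration; the patent only specifies the ordering interrupt-then-restart.

```python
class WhiteBalanceProcessor:
    """Toy stand-in for the process execution unit 205 (names assumed)."""
    def __init__(self):
        self.running_for = None   # touch-off pixel the current job uses
        self.log = []

    def start(self, pixel):
        # S1705: run image processing for the pixel at a touch-off position
        self.running_for = pixel
        self.log.append(("start", pixel))

    def interrupt(self):
        # S1702/S1703: stop a job based on the previous touch-off position
        if self.running_for is not None:
            self.log.append(("interrupt", self.running_for))
            self.running_for = None

def on_touch_on(processor, new_touch_off_pixel):
    processor.interrupt()              # S1702/S1703
    # S1704 (operation control position decision) happens here; omitted
    processor.start(new_touch_off_pixel)  # S1705

p = WhiteBalanceProcessor()
p.start((40, 40))
on_touch_on(p, (90, 120))
print(p.log)  # [('start', (40, 40)), ('interrupt', (40, 40)), ('start', (90, 120))]
```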
- FIG. 18 is an example of a setting screen for setting the effective range of touch, the effective time, and the display position of the operation control according to an embodiment of the present invention.
- Reference numeral 1801 denotes a slider control for changing the effective range of touch.
- When this value is changed, the size of the circle 1804 indicating the effective range, displayed at the bottom of the screen, changes accordingly.
- Reference numeral 1802 denotes a slider control for changing the effective time of touch. This valid time is used as a threshold for preventing the seek bar and the operation control from moving when touched by mistake.
- Reference numeral 1803 denotes a tab for switching whether the operation control is displayed at the upper left or the upper right of the touch position.
- In the foregoing, the operation of designating a single display/playback position of continuously shot images, a moving image, or audio using the seek bar UI has been described as an example.
- However, the seek bar UI can also be used to designate a range by designating two positions: a start point and an end point.
- The present invention is also applicable to the case where such a plurality of positions are designated using the seek bar UI.
- The seek bar UI has been described as an example, but user interfaces that can be slid in the same way include scroll bars, slider controls, and the like.
- The present invention can similarly be applied to any user interface capable of such a slide operation. Further, the present invention is applicable not only to slide operations (movement in one direction) but also to designating the position of a user interface element (UI element) that can be moved in any direction.
- In the above embodiments, control is performed based on the distance between the UI element position at the time of touch-off and the touch-on position, but control need not necessarily be based on the distance between these two points.
- For example, when the touch-on position is within a predetermined region, the position of the UI element at the time of touch-off may be regarded as having been touched.
- the shape of the predetermined region may be an arbitrary shape.
- the predetermined area can be, for example, an area whose shortest distance from the UI element is within a predetermined distance.
- In the above description, the position of the UI element is represented by a point, and the predetermined area is defined as an area within a predetermined distance from that position.
- These predetermined distances may be determined in advance, or may be set by the user as in the sixth embodiment.
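One way to express the region test described above is a point-in-region check. A circular region is assumed here for simplicity (the function name and radius are illustrative; the text allows the region to have an arbitrary shape):

```python
import math

def touched_ui_element(touch_pos, element_pos, hit_radius=24):
    """Treat a touch within hit_radius of the element's point position as a
    touch on the element itself, so the element need not be hit exactly."""
    return math.dist(touch_pos, element_pos) <= hit_radius

print(touched_ui_element((105, 98), (100, 100)))  # True
print(touched_ui_element((200, 20), (100, 100)))  # False
```

For a non-circular region, the distance test would be replaced by a shortest-distance or polygon-containment test against the region's boundary.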
- In the above embodiments, the present invention is applied to a mobile device, but the application of the present invention is not limited thereto.
- the present invention can be applied to any information processing apparatus such as a personal computer (PC), a digital still camera, and a digital video camera.
- An embodiment in which the position of the operation control is determined using the touch-on duration may also be applied to the seventh to eleventh embodiments.
- The present invention can also be realized by a process in which a program that implements one or more functions of the above-described embodiments is supplied to a system or apparatus via a network or a storage medium, and one or more processors in a computer of the system or apparatus read and execute the program. It can likewise be realized by a circuit (for example, an ASIC) that implements one or more functions.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Mechanical Engineering (AREA)
- User Interface Of Digital Computer (AREA)
- Position Input By Displaying (AREA)
Abstract
Description
FIG. 1 is an internal configuration diagram of a mobile device 100, such as a smartphone or tablet, according to an embodiment of the present invention. The mobile device 100 comprises a CPU 101, a DRAM 102, a communication unit 106, a display unit 107, an input unit 108, and an SSD 109.
With reference to FIGS. 3A to 3C, the seek bar user interface used in this embodiment will be described. Here, a use case is described in which the content is a plurality of continuously shot still images, and the seek bar UI is used to switch which of these still images is displayed. FIG. 3A shows an example of a display screen, which includes an image display area 301 and a seek bar UI 300.
With reference to FIGS. 2 to 4, a method of controlling the seek bar UI according to an embodiment of the present invention will be described. In the control shown in FIG. 4, the knob position is determined based on the distance between the knob position at the most recent touch-off (end of pointing) within the seek bar and the position at which the touch-on event was acquired at touch-on (the touch-on position, or pointing start position). In the following, for simplicity, the position of the knob in the seek bar UI is also referred to as the seek bar position.
Note that the present embodiment as described above may be carried out selectively depending on conditions. For example, the present embodiment is carried out when conditions are satisfied, such as the user having enabled it in advance, the apparatus operating in a specific mode, or a specific UI element being operated. When the conditions are not satisfied, the knob position is determined according to the new touch-on position even if the distance between the knob position at the previous touch-off and the new touch-on position is shorter than the threshold.
With reference to FIG. 6, a method of controlling the seek bar UI according to an embodiment of the present invention will be described. The control of this embodiment is basically the same as that of the above embodiment, but a function for preventing the seek bar from moving due to an erroneous user operation is added. The following description focuses on the differences from the first embodiment.
With reference to FIG. 7, a method of controlling the seek bar UI according to an embodiment of the present invention will be described. In this embodiment, when the user performs a long tap, it is determined that the user intended to move the knob of the seek bar to the touch position, and the knob position at touch-off is not compared with the touch-on position.
In the first to third embodiments, processing is switched according to the distance between the knob position at touch-off and the touch-on position. However, the position of the knob of the seek bar UI also changes by means other than touch operations on the seek bar. For example, when a moving image or audio file is being played back, the knob position is switched according to the playback position. In this embodiment, the seek bar UI is used for playing back and editing a moving image file.
In the first to fourth embodiments, processing is switched according to the distance between the touch-on position and a single reference position (the knob position at the most recent touch-off, or the current knob position). In this embodiment, the knob position is determined by comparison with a plurality of reference positions. Accordingly, the position storage unit 203 stores a plurality of reference positions. The plurality of reference positions include the knob positions at a plurality of past touch-off operations (pointing end operations) within the seek bar UI, and the current knob position. More specifically, the knob positions at past touch-off operations can be the knob positions at the most recent predetermined number of touch-offs, or the knob positions at touch-offs within the most recent predetermined time. The position determination unit 202 then calculates the distances between the touch-on position and all of these reference positions, and if the distance to any reference position is less than or equal to the threshold, that reference position is regarded as having been touched. When the distance is less than or equal to the threshold for two or more reference positions, the reference position closest to the touch-on position may be regarded as having been touched.
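The multi-reference selection described in this paragraph can be sketched as follows. This is an illustrative reconstruction; the function name and threshold value are assumptions, while the rule itself (any reference within the threshold is a candidate, the nearest candidate wins) follows the text.

```python
import math

def pick_reference(touch_on, reference_positions, threshold):
    """Return the stored reference position regarded as touched, or None.

    reference_positions holds past touch-off knob positions and the current
    knob position; a candidate must lie within `threshold` of the touch-on
    position, and the nearest candidate is chosen.
    """
    candidates = [(math.dist(touch_on, ref), ref)
                  for ref in reference_positions
                  if math.dist(touch_on, ref) <= threshold]
    return min(candidates)[1] if candidates else None

refs = [(50, 0), (120, 0), (200, 0)]
print(pick_reference((118, 0), refs, threshold=10))  # (120, 0)
print(pick_reference((170, 0), refs, threshold=10))  # None -> use touch-on position
```

When the function returns None, no reference is regarded as touched, and the knob would be placed according to the touch-on position itself.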
The threshold distance α in the first to fifth embodiments and the touch-on duration thresholds β and γ in the second and third embodiments can be made settable by the user. FIG. 10 is an example of a user interface for setting the thresholds α and β. In the user interface, the distance threshold α is expressed as the "effective range of touch" and the time threshold β as the "effective time of touch".
With reference to FIGS. 2, 11, and 12, a method of controlling an operation control according to an embodiment of the present invention will be described. The operation control is, for example, an arrow cursor displayed on the image display area 1201 of the display device, as indicated by reference numeral 1202 in FIG. 12. In the control shown in FIG. 11, the position of the operation control is determined according to the distance between the operation control position at the most recent touch-off on the operation screen and the position at which the touch-on event was acquired (the touch-on position, or pointing start position). Each step of FIG. 11 is described in detail below.
yON2). The position determination unit 202 then calculates the distance between the operation control position (xOFF, yOFF) at touch-off and the touched-on position (xON2, yON2). When the calculated distance exceeds the threshold, the position of the operation control is set based on the touched-on position (xON2, yON2), as indicated by reference numeral 1307 in FIG. 13F. In this example, the white balance is calculated based on the pixel at the new touch position 1307 and applied to the display image of FIG. 13F.
FIG. 14 is a flowchart showing a procedure for controlling the operation control according to an embodiment of the present invention. In the example of FIG. 14, movement of the operation control is invalidated even when the distance between the operation control position at touch-off and the touch-on position exceeds the threshold. Each step of the flowchart of FIG. 14 is described below.
threshold β), it is determined that the user has touched on intentionally, and the process advances to step S1404.
FIG. 15 is a flowchart showing a procedure for controlling the operation control according to an embodiment of the present invention. In the example of FIG. 15, when the operation control position at touch-off was at an edge of the display screen, the operation control position at the next touch-on is determined so that the operation can easily be resumed from the operation control position at touch-off. Each step of the flowchart shown in FIG. 15 is described below.
FIG. 17 is a flowchart showing a procedure of image processing according to an embodiment of the present invention. In this embodiment, it is assumed that image processing is executed based on the pixel selected at touch-off. In the example shown in FIG. 17, when the operation control is touched off and then touched on again while image processing is being executed, the image processing is interrupted and image processing is executed based on the new touch-off position. In step S1701, the input acquisition unit 201 determines whether a touch-on event has been acquired. If no touch-on event has been acquired (S1701-NO), the process waits until one is acquired.
FIG. 18 is an example of a setting screen for setting the effective range of touch, the effective time of touch, and the display position of the operation control according to an embodiment of the present invention. Reference numeral 1801 denotes a slider control for changing the effective range of touch. Here, when this value is changed, the size of the circle 1804 indicating the effective range, displayed at the bottom of the screen, changes accordingly. Reference numeral 1802 denotes a slider control for changing the effective time of touch. This effective time is used as a threshold for preventing the seek bar or the operation control from moving when touched by mistake. Reference numeral 1803 denotes a tab for switching whether the operation control is displayed at the upper left or the upper right of the touch position. The user can select on which side the operation control is displayed according to his or her dominant hand (right-handed or left-handed). This setting is also used to switch the setting of the regions corresponding to the thresholds rin and rout described with reference to FIGS. 16A and 16B. In this way, the behavior of the operation control can be customized according to the user's selection.
In the above description, the operation of designating a single display/playback position of continuously shot images, a moving image, or audio using the seek bar UI has been described as an example. However, the seek bar UI can also be used to designate a range by designating two positions: a start point and an end point. The present invention is also applicable to the case where such a plurality of positions are designated using the seek bar UI.
Claims (23)
- 1. An information processing apparatus comprising: display control means for displaying a movable user interface element (UI element) on a display device; detection means for detecting a user operation on the display device; acquisition means for acquiring a first position at which the user operation was detected on the display device; determination means for determining, based on the acquired first position, a second position at which the UI element is displayed on the display device; storage means for storing the second position at the time when detection of the user operation by the detection means ends; and calculation means for calculating a distance between a third position at the time when detection of a user operation is newly started and the second position stored in the storage means, wherein, when detection of the user operation is newly started, the display control means performs control such that the UI element is selectively displayed, according to the calculated distance, at either a fourth position determined based on the third position or the second position stored in the storage means.
- 2. The information processing apparatus according to claim 1, wherein, when a drag operation is continuously detected after detection of the user operation is started by the detection means, the display control means moves and displays the UI element in accordance with the drag operation from the position selected according to the calculated distance.
- 3. The information processing apparatus according to claim 1 or 2, wherein, when the duration of the user operation is shorter than a threshold, the display control means displays the UI element without changing the position at which it is displayed on the display device.
- 4. The information processing apparatus according to any one of claims 1 to 3, wherein, when the duration of the user operation is longer than a threshold, the display control means displays the UI element at the fourth position regardless of the calculated distance.
- 5. The information processing apparatus according to any one of claims 1 to 4, wherein the storage means stores a plurality of past second positions, and the calculated distance is based on the third position and the plurality of second positions.
- 6. The information processing apparatus according to claim 5, wherein the calculated distance is based on a second position selected from among the plurality of second positions based on the time during which the UI element was displayed.
- 7. The information processing apparatus according to claim 5, wherein the calculated distance is based on a position selected by the user from among the plurality of second positions.
- 8. The information processing apparatus according to any one of claims 1 to 7, wherein the UI element can be slid in a predetermined direction within an operation area displayed on the display device, and indicates a playback position of content displayed on the display device.
- 9. The information processing apparatus according to claim 8, wherein the content includes at least one of continuously shot images, a moving image, a plurality of still images, and audio.
- 10. The information processing apparatus according to claim 8 or 9, wherein the display control means causes the display device to display content corresponding to the second position or the fourth position of the UI element.
- 11. The information processing apparatus according to any one of claims 1 to 7, wherein the UI element is movable on an image displayed on the display device and designates a pixel of the image.
- 12. The information processing apparatus according to any one of claims 1 to 7, wherein the UI element is movable within an operation area displayed on the display device and indicates an adjustment value of content.
- 13. The information processing apparatus according to any one of claims 1 to 12, wherein the storage means further stores a time at which detection of the user operation ended in association with the second position, and the display control means performs control such that the UI element is displayed at a position selected according to an elapsed time from the time at which detection of the user operation ended.
- 14. An information processing apparatus comprising: detection means for detecting a touch operation on a display device; first processing means for executing, in response to a first touch operation on the display device, processing based on a first position corresponding to the first touch operation; storage means for storing the first position used in the execution of the processing; calculation means for calculating, when a second touch operation on the display device is detected, a distance between the stored first position and a second position corresponding to the second touch operation; and second processing means for executing, in response to the second touch operation, processing based on the first position if the distance is smaller than a predetermined value, and executing, in response to the second touch operation, processing based on the second position if the distance is larger than the predetermined value.
- 15. The information processing apparatus according to claim 14, wherein the processing is processing for displaying a slider on a slider bar based on the position.
- 16. The information processing apparatus according to claim 14, wherein the processing is processing for selecting one of a plurality of frames included in a moving image based on the position.
- 17. The information processing apparatus according to claim 14, wherein an image is displayed on the display device, and the processing is processing for designating a pixel included in the image based on the position.
- 18. The information processing apparatus according to claim 14, wherein an operation area is displayed on the display device, and the processing designates, based on the position, a parameter used to adjust content.
- 19. The information processing apparatus according to any one of claims 14 to 18, wherein, when the duration of the second touch operation is longer than a predetermined value, the processing means executes processing based on the second position in response to the second touch operation regardless of the calculated distance.
- 20. The information processing apparatus according to any one of claims 14 to 18, wherein the storage means further stores a time at which the first position was acquired in association with the first position used in the execution of the processing, and, when an elapsed time from the stored time until the second position is acquired is longer than a predetermined value, the processing means executes processing based on the second position in response to the second touch operation regardless of the calculated distance.
- 21. A method of controlling an information processing apparatus, comprising the steps of: displaying a movable user interface element (UI element) on a display device; detecting a user operation on the display device; acquiring a first position at which the user operation was detected on the display device; determining, based on the acquired first position, a second position at which the UI element is displayed on the display device; storing the second position at the time when detection of the user operation ends; calculating a distance between a third position at the time when detection of a user operation is newly started and the stored second position; and, when detection of the user operation is newly started, performing control such that the UI element is selectively displayed, according to the calculated distance, at either a fourth position determined based on the third position or the second position.
- 22. A method of controlling an information processing apparatus, comprising the steps of: detecting a touch operation on a display device; executing, in response to a first touch operation on the display device, processing based on a first position corresponding to the first touch operation; storing the first position used in the execution of the processing; calculating, when a second touch operation on the display device is detected, a distance between the stored first position and a second position corresponding to the second touch operation; and executing, in response to the second touch operation, processing based on the first position if the distance is smaller than a predetermined value, and executing, in response to the second touch operation, processing based on the second position if the distance is larger than the predetermined value.
- 23. A program causing a computer to function as each means of the information processing apparatus according to any one of claims 1 to 20.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE112016005891.8T DE112016005891T5 (de) | 2015-12-22 | 2016-12-14 | Informationsverarbeitungsvorrichtung, Steuerverfahren dafür und Programm |
CN201680075865.1A CN108475166B (zh) | 2015-12-22 | 2016-12-14 | 信息处理装置及其控制方法和程序 |
GB1811917.2A GB2562931B (en) | 2015-12-22 | 2016-12-14 | Information-processing device, control method therefor, and program |
US16/001,132 US20180284980A1 (en) | 2015-12-22 | 2018-06-06 | Information-processing device and control method therefor |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015-249392 | 2015-12-22 | ||
JP2015249392 | 2015-12-22 | ||
JP2016197269A JP6859061B2 (ja) | 2015-12-22 | 2016-10-05 | 情報処理装置およびその制御方法およびプログラム |
JP2016-197269 | 2016-10-05 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/001,132 Continuation US20180284980A1 (en) | 2015-12-22 | 2018-06-06 | Information-processing device and control method therefor |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017110606A1 true WO2017110606A1 (ja) | 2017-06-29 |
Family
ID=59090280
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/087156 WO2017110606A1 (ja) | 2015-12-22 | 2016-12-14 | 情報処理装置およびその制御方法およびプログラム |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN108475166B (ja) |
WO (1) | WO2017110606A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2021100173A (ja) * | 2019-12-20 | 2021-07-01 | シャープ株式会社 | 監視システム及び監視方法 |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109901778A (zh) * | 2019-01-25 | 2019-06-18 | 湖南新云网科技有限公司 | 一种页面对象旋转缩放方法、存储器及智能设备 |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2013088891A (ja) * | 2011-10-14 | 2013-05-13 | Konica Minolta Business Technologies Inc | 情報端末及び描画制御プログラム並びに描画制御方法 |
JP2013218495A (ja) * | 2012-04-06 | 2013-10-24 | Canon Inc | 表示制御装置、表示制御方法、およびプログラム |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8020100B2 (en) * | 2006-12-22 | 2011-09-13 | Apple Inc. | Fast creation of video segments |
JP5371798B2 (ja) * | 2010-01-12 | 2013-12-18 | キヤノン株式会社 | 情報処理装置、その情報処理方法及びプログラム |
US9465457B2 (en) * | 2010-08-30 | 2016-10-11 | Vmware, Inc. | Multi-touch interface gestures for keyboard and/or mouse inputs |
US20130014057A1 (en) * | 2011-07-07 | 2013-01-10 | Thermal Matrix USA, Inc. | Composite control for a graphical user interface |
US9131192B2 (en) * | 2012-03-06 | 2015-09-08 | Apple Inc. | Unified slider control for modifying multiple image properties |
CN103631419B (zh) * | 2012-08-27 | 2017-08-25 | 腾讯科技(深圳)有限公司 | 基于遥控触摸板的光标定位方法及系统 |
JP2014115734A (ja) * | 2012-12-06 | 2014-06-26 | Sharp Corp | 情報処理装置、情報処理装置の制御方法、および制御プログラム |
KR102087005B1 (ko) * | 2013-01-31 | 2020-03-11 | 삼성전자 주식회사 | 페이지 검색 방법 및 이를 지원하는 단말기 |
EP2770413A3 (en) * | 2013-02-22 | 2017-01-04 | Samsung Electronics Co., Ltd. | An apparatus for providing a cursor in electronic devices and a method thereof |
JP5924555B2 (ja) * | 2014-01-06 | 2016-05-25 | コニカミノルタ株式会社 | オブジェクトの停止位置制御方法、操作表示装置およびプログラム |
TWI610211B (zh) * | 2014-02-07 | 2018-01-01 | 財團法人工業技術研究院 | 觸控裝置、處理器及其觸控訊號讀取方法 |
- 2016-12-14 CN CN201680075865.1A patent/CN108475166B/zh active Active
- 2016-12-14 WO PCT/JP2016/087156 patent/WO2017110606A1/ja active Application Filing
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2013088891A (ja) * | 2011-10-14 | 2013-05-13 | Konica Minolta Business Technologies Inc | 情報端末及び描画制御プログラム並びに描画制御方法 |
JP2013218495A (ja) * | 2012-04-06 | 2013-10-24 | Canon Inc | 表示制御装置、表示制御方法、およびプログラム |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2021100173A (ja) * | 2019-12-20 | 2021-07-01 | シャープ株式会社 | 監視システム及び監視方法 |
JP7406976B2 (ja) | 2019-12-20 | 2023-12-28 | シャープ株式会社 | 監視システム及び監視方法 |
Also Published As
Publication number | Publication date |
---|---|
CN108475166A (zh) | 2018-08-31 |
CN108475166B (zh) | 2022-03-25 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16878497 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 112016005891 Country of ref document: DE |
|
ENP | Entry into the national phase |
Ref document number: 201811917 Country of ref document: GB Kind code of ref document: A Free format text: PCT FILING DATE = 20161214 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 1811917.2 Country of ref document: GB |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 16878497 Country of ref document: EP Kind code of ref document: A1 |