WO2015059992A1 - Information processing device, information processing method, and program - Google Patents

Information processing device, information processing method, and program

Info

Publication number
WO2015059992A1
Authority
WO
WIPO (PCT)
Prior art keywords
information processing
state
operating body
processing apparatus
operating
Prior art date
Application number
PCT/JP2014/071639
Other languages
English (en)
Japanese (ja)
Inventor
元輝 東出
航 樋下
啓一郎 宮川
Original Assignee
Sony Corporation (ソニー株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corporation
Publication of WO2015059992A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04847 Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/0416 Control or interface arrangements specially adapted for digitisers
    • G06F 3/0418 Control or interface arrangements specially adapted for digitisers for error correction or compensation, e.g. based on parallax, calibration or alignment
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • the present disclosure relates to an information processing apparatus, an information processing method, and a program.
  • In recent years, information processing apparatuses such as notebook PCs and tablet terminals equipped with touch panels and touch pads (hereinafter collectively referred to as touch panels) have become widespread.
  • Such an information processing apparatus detects, as a pointing position, a position where the operation body touches the operation surface of the touch panel.
  • Patent Document 1 discloses a technique for designating (pointing) a position by touching a slider displayed on an operation surface on a touch panel with a finger as an operation body.
  • However, when the operating body is moved away from the operation surface, the pointing position may shift.
  • In particular, when the operating body is separated diagonally from the operation surface, the pointing position moves in conjunction with the oblique movement of the operating body, unlike when the operating body is lifted vertically from the operation surface.
  • As a result, the pointing position after the operating body is released is shifted from the desired pointing position before the release.
  • In view of this, the present disclosure proposes a mechanism for suppressing the displacement of the pointing position that may occur when the operating body is separated from the operation surface.
  • According to the present disclosure, there is provided an information processing apparatus that, after detecting a second state in which the first operating body is in contact with or close to the first operation surface and an operation on the second operation surface is performed, makes the amount of change in a control parameter corresponding to a predetermined movement amount of the first operating body smaller than the amount of change corresponding to the same movement amount after detecting a first state in which the first operating body is merely in contact with or close to the first operation surface.
  • According to the present disclosure, there is also provided an information processing method including: acquiring a signal from a detection unit that detects contact or proximity of an operating body to an operation surface; changing, by a processor, a predetermined control parameter according to the movement of the operating body detected on the basis of the acquired signal; and making the amount of change in the control parameter corresponding to a predetermined movement amount of the first operating body after detecting the second state, in which the first operating body is in contact with or close to the first operation surface and an operation on the second operation surface is performed, smaller than the amount of change corresponding to the same movement amount after detecting the first state, in which the first operating body is in contact with or close to the first operation surface.
  • Further, according to the present disclosure, there is provided a program for causing a computer to: acquire a signal from a detection unit that detects contact or proximity of an operating body to an operation surface; change a predetermined control parameter according to the movement of the operating body detected on the basis of the acquired signal; and make the amount of change in the control parameter corresponding to a predetermined movement amount of the first operating body after detecting the second state, in which the first operating body is in contact with or close to the first operation surface and an operation on the second operation surface is performed, smaller than the amount of change corresponding to the same movement amount after detecting the first state.
  • FIG. 1 is a diagram illustrating an example of a schematic configuration of an imaging display system 10 according to an embodiment of the present disclosure. FIG. 2 is a block diagram illustrating an example of a functional configuration of the information processing apparatus 100 according to the embodiment. FIG. 3 is a diagram illustrating an example of the operation screen 700 displayed on the display unit 110 according to the embodiment. FIGS. 4 to 7 are diagrams for explaining examples of a user's operations on the operation screen 700 according to the embodiment.
  • FIG. 1 is a diagram illustrating an example of a schematic configuration of an imaging display system 10 according to an embodiment.
  • the imaging display system 10 displays an image captured by the imaging device on the display device.
  • the imaging display system 10 includes an imaging device 20, a display device 30, and an information processing device 100.
  • the imaging device 20, the display device 30, and the information processing device 100 can communicate with each other wirelessly or by wire.
  • the imaging device 20 is a video camera or a digital camera that captures an image of a subject or landscape.
  • the imaging device 20 transmits the captured image to the display device 30.
  • the user can change various settings of the imaging device 20. For example, the color setting of an image captured by the imaging device 20 can be changed.
  • the user can change the setting by remote operation using the information processing apparatus 100. Note that the user may change the setting with an operation unit provided in the imaging device 20.
  • the display device 30 is a television, for example, and displays the captured image received from the imaging device 20.
  • the display device 30 displays an image captured by the imaging device 20 in real time. Accordingly, the user can view the imaging result of the display device 30 while remotely operating the imaging device 20 with the information processing apparatus 100.
  • the information processing apparatus 100 is, for example, a tablet terminal, a smartphone, or a notebook PC, and is an apparatus that can remotely operate the imaging apparatus 20.
  • the user can change various settings of the imaging apparatus 20 by performing a touch operation on the operation screen of the information processing apparatus 100. For example, the user can adjust the color of the captured image on the operation screen.
  • the detailed configuration of the information processing apparatus 100 will be described later.
  • FIG. 2 is a block diagram illustrating an example of a functional configuration of the information processing apparatus 100 according to an embodiment.
  • the information processing apparatus 100 has a function of remotely operating the imaging apparatus 20.
  • the information processing apparatus 100 includes a display unit 110, an operation unit 120, a control unit 130, a storage unit 140, and a communication unit 150.
  • the display unit 110 displays characters, images, and other information on the screen. Display of characters, images, and other information on the display unit 110 is controlled by the control unit 130. In the present embodiment, the display unit 110 displays an operation screen for changing the settings of the imaging device 20. As the display unit 110, for example, a liquid crystal display or an organic EL display is used.
  • the operation unit 120 receives an operation on the information processing apparatus 100 by the user.
  • the operation unit 120 is, for example, a touch panel provided on the surface of the display unit 110 or integrated with the display unit 110.
  • the user can change the setting of the imaging device 20 by performing a touch operation on the operation surface of the touch panel with a finger that is an example of an operation body. Hard keys and buttons may be used for the operation unit 120 in addition to the touch panel.
  • The touch operation here means an operation in which an input is confirmed when the finger contacts the operation surface, or when the finger is released from the operation surface (a so-called tap).
  • the user performs a touch operation on the operation unit 120 with a finger.
  • the present invention is not limited to this.
  • the user may perform an operation with a pen or the like.
  • the operation unit 120 accepts an input operation when the operation body touches the touch panel.
  • the operation unit 120 is not limited to this.
  • The operation unit 120 may accept an input operation when the operating body approaches the touch panel.
  • the operation unit 120 is not limited to a touch panel, and may be a touch pad, for example.
  • the control unit 130 is configured with an electronic circuit, for example, and controls the entire operation of the information processing apparatus 100.
  • the control unit 130 further includes an operation detection unit 132, a post-detection processing unit 134 that is an example of a processing unit, and a display control unit 136.
  • The operation detection unit 132 detects contact or proximity of an operating body (for example, a user's finger) to the operation screen displayed on the display unit 110 (in other words, to the operation surface of the operation unit 120 integrated with the display unit 110).
  • the operation detection unit 132 can identify and detect contact or proximity of a plurality of operation objects to the operation screen. For example, the operation detection unit 132 can detect that the second operating body is in contact with or close to another area of the operation screen while the first operating body is in contact with or close to an area of the operation screen.
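The multi-touch detection described above, in which a first and a second operating body touching different areas are distinguished, can be illustrated with a minimal sketch. This is not code from the disclosure; the class and method names are hypothetical.

```python
class TouchTracker:
    """Tracks multiple simultaneous touch points, keyed by an identifier
    assigned by the touch hardware, so that a first and a second
    operating body can be told apart."""

    def __init__(self):
        self.active = {}  # touch_id -> (x, y) of the current position

    def touch_down(self, touch_id, x, y):
        self.active[touch_id] = (x, y)

    def touch_move(self, touch_id, x, y):
        if touch_id in self.active:
            self.active[touch_id] = (x, y)

    def touch_up(self, touch_id):
        self.active.pop(touch_id, None)

    def count(self):
        return len(self.active)
```

With such a tracker, the operation detection unit could treat the first identifier seen inside the first operation region as the first operating body and any later identifier as a candidate second operating body.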
  • the post-detection processing unit 134 has a function of performing a predetermined process on the detection result of the contact or proximity of the operating body by the operation detecting unit 132.
  • The post-detection processing unit 134 changes a predetermined control parameter according to the movement of the operating body detected on the basis of the detection result of the operation detection unit 132.
  • Specifically, after the second state, in which the first operating body is in contact with or close to the first operation surface and an operation on the second operation surface is performed, is detected, the post-detection processing unit 134 makes the amount of change in the control parameter corresponding to a predetermined movement amount of the first operating body smaller than the amount of change corresponding to the same movement amount after only the first state, in which the first operating body is in contact with or close to the first operation surface, is detected.
  • In such a case, even if the first operating body erroneously moves from the desired pointing position, the movement of the operation target is treated as small, so that deviation from the desired pointing position can be suppressed.
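As a rough illustration of this behavior, the sketch below scales the change in the control parameter by a reduced gain while the second state is active. The function name and the gain value 0.2 are assumptions chosen for illustration; a gain of 0 corresponds to fixing the operation target entirely, as in the pointer-fixing embodiment described later.

```python
def apply_movement(param, delta, second_state_active, reduced_gain=0.2):
    """Update a control parameter by a movement delta of the first
    operating body. While the second state is detected (a second finger
    rests on the second operation region), the same physical movement
    yields a smaller parameter change, suppressing accidental drift."""
    gain = reduced_gain if second_state_active else 1.0
    return param + delta * gain
```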
  • For example, the first state is a state in which the first operating body is in contact with or close to a predetermined first operation region of the operation screen, and the second state is a state in which the second operating body is in contact with or close to a second operation region outside the first operation region of the operation screen.
  • For example, the second operation region is a region around the first operation region, separated by a predetermined distance from the position of the first operating body at the time the first state is detected. Accordingly, the user can easily bring the second operating body into contact with or close to the second operation region while operating the first operating body in the first operation region.
  • The second operation region may vary depending on the position at which the first operating body contacts or approaches the operation screen. Thereby, an appropriate second operation region is easily set.
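The second operation region described above, a region around the first touch separated by a predetermined distance, could be modeled as a ring-shaped hit test that moves with the first touch position. The distance thresholds below are illustrative assumptions, not values from the disclosure.

```python
import math

def in_second_region(first_pos, second_pos, min_dist=80.0, max_dist=200.0):
    """Return True if a second touch falls in a ring around the first
    touch: at least min_dist away, so it lies outside the first
    operation region, but no farther than max_dist, so it remains within
    easy reach of another finger."""
    dx = second_pos[0] - first_pos[0]
    dy = second_pos[1] - first_pos[1]
    return min_dist <= math.hypot(dx, dy) <= max_dist
```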
  • the post-detection processing unit 134 fixes the position of the operation target with respect to the operation of the first operating body when the second state of the second operating body is detected. In such a case, even if the first operating body is erroneously moved from the desired pointing position, the operation target does not move, so that deviation from the desired pointing position can be suppressed.
  • The post-detection processing unit 134 may decrease the movement amount of the operation target corresponding to a predetermined operation amount of the first operating body while the detection of the second state of the second operating body continues. In such a case, the movement amount of the first operating body is treated as small only while the user intends it, so the user's intention is easily reflected.
  • the post-detection processing unit 134 may decrease the movement amount of the operation target corresponding to the predetermined operation amount of the first operating body after the second state of the second operating body is detected once. In such a case, since the operation of the second operating body is simplified, the operability is improved.
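The two variants just described, reducing movement only while the second state continues versus latching the reduction after a single detection, can be summarized as a small state machine. The class below is an illustrative sketch; the mode names are not taken from the disclosure.

```python
class PointerLock:
    """Pointer-fixing logic in two variants: 'hold' fixes the pointer
    only while the second operating body stays in contact, 'toggle'
    latches the fix on one contact and releases it on the next."""

    def __init__(self, mode="hold"):
        self.mode = mode
        self.locked = False

    def second_touch_down(self):
        if self.mode == "hold":
            self.locked = True
        else:  # "toggle"
            self.locked = not self.locked

    def second_touch_up(self):
        if self.mode == "hold":
            self.locked = False
```

In the 'toggle' variant, the second finger can be lifted immediately after one contact and the fix persists, matching the improved operability noted above.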
  • the display control unit 136 controls display on the display unit 110.
  • the display control unit 136 displays the processing of the post-detection processing unit 134 in association with the operation screen. Thereby, the user can recognize that the processing of the post-detection processing unit 134 is reflected by looking at the display content of the operation screen.
  • the display control unit 136 may display the processing of the post-detection processing unit 134 in association with the operation screen while the first operating body moves from the first state. Thereby, the user can recognize that the process of the post-detection processing unit 134 is reflected even if the first operating body is erroneously moved from the first state.
  • the display control unit 136 may display character information indicating the position of the second operation area on the operation screen. Accordingly, the user can easily recognize the position of the second operation area by looking at the displayed character information.
  • the display control unit 136 may display the second operation area in a different display form from the first operation area on the operation screen. For example, the display control unit 136 displays the second operation area so that it is emphasized more than the first operation area. This makes it easier for the user to identify the second operation area.
  • the display control unit 136 may display a predetermined button on the operation screen when the operation detection unit 132 detects the first state of the first operation body.
  • the state in which the second operating body is in contact with or close to the predetermined button may be the second state described above. In such a case, the user can easily recognize a method that can easily shift to the second state.
  • the storage unit 140 stores a program executed by the control unit 130 and information necessary for processing by the control unit 130.
  • the storage unit 140 stores adjustment information of settings of the imaging device 20 adjusted by the user on the operation screen.
  • the communication unit 150 is an interface that performs communication with other devices (the imaging device 20 and the display device 30). Communication with other devices by the communication unit 150 is controlled by the control unit 130. For example, the communication unit 150 transmits the adjustment information of the setting of the imaging device 20 adjusted by the user on the operation screen to the imaging device 20.
  • In the above description, the operation detection unit 132, the post-detection processing unit 134, and the display control unit 136 are provided in the information processing apparatus 100, such as a tablet terminal, a smartphone, or a notebook PC.
  • the present invention is not limited to this.
  • the operation detection unit 132, the post-detection processing unit 134, and the display control unit 136 may be provided in a server that can communicate with the information processing apparatus 100 via a network.
  • the server receives information operated by the user using a tablet terminal or the like, and executes functions as the operation detection unit 132, the post-detection processing unit 134, and the display control unit 136.
  • the touch operation in which the operating body contacts the touch panel has been described as an example, but the present invention is not limited to this.
  • the present disclosure can be applied to an apparatus that detects a pointing operation on various operation surfaces.
  • the present invention can also be applied to a device that detects that the operating tool is close to the operating surface, or a method that detects the gesture of the operating tool or the operator using an imaging device such as a camera.
  • <Example of operation on the operation screen> An operation screen displayed on the display unit 110 for adjusting the settings of the imaging device 20, and an example of operations on that screen, will now be described.
  • FIG. 3 is a diagram illustrating an example of the operation screen 700 displayed on the display unit 110 according to the embodiment.
  • An operation screen 700 illustrated in FIG. 3 is a screen for adjusting the color of an image captured by the imaging device 20.
  • the operation screen 700 includes three operation areas 710, 720, and 730 for adjusting color.
  • the three operation areas 710, 720, and 730 have different degrees of brightness that can be adjusted.
  • the user can adjust the color of the imaging device 20 by performing a touch operation in the operation areas 710, 720, and 730.
  • three operation areas 710 to 730 are shown, but the present invention is not limited to this.
  • the number of operation areas may be two or less, or four or more.
  • the operation area 730 includes an adjustment area 731, an adjustment bar 732, and a reset button 733.
  • the adjustment area 731 is, for example, a circular area.
  • a movable pointer 731a (an example of an operation target) is displayed in the adjustment area 731 and the color is adjusted according to the position of the pointer 731a.
  • the user performs a touch operation so as to move the position of the pointer 731 a in the adjustment area 731.
  • the position of the bar of the adjustment bar 732 varies depending on the degree of user adjustment in the adjustment area 731. Note that the user can change the position of the bar by performing a touch operation on the adjustment bar 732.
  • the reset button 733 has a function of returning the color adjustment to a preset initial value when the user performs a touch operation.
  • Examples of user operations on the operation screen 700 according to an embodiment will be described with reference to FIGS. 4 to 7.
  • an operation in the operation area 730 of the operation screen 700 will be described as an example.
  • the operation screen 700 corresponds to the first operation surface and the second operation surface.
  • FIG. 4 to 7 are diagrams for explaining an example of a user operation on the operation screen 700 according to an embodiment.
  • First, the user brings one finger serving as the first operating body (here, the first operating body is described as the index finger F1) into contact with the pointer 731a in the adjustment area 731 of the operation area 730.
  • the user can move the position of the pointer 731a by moving the index finger F1 in contact with the pointer 731a.
  • an image of a circular area may be displayed around the pointer 731a.
  • the circular area may be subjected to gradation so that the color becomes darker toward the pointer 731a located at the center. This makes it easier for the user to guess the position of the pointer 731a hidden behind the finger.
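The gradation described above, darkest at the center where the pointer sits under the finger, can be sketched as a distance-based intensity function. This is purely illustrative; the disclosure does not specify a falloff curve, and a linear one is assumed here.

```python
def halo_intensity(dist, radius):
    """Intensity of the circular halo around the pointer: 1.0 at the
    center (directly over the pointer, hidden by the finger), falling
    linearly to 0.0 at the halo's edge and beyond."""
    if dist >= radius:
        return 0.0
    return 1.0 - dist / radius
```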
  • an image of a circular area is displayed.
  • the present invention is not limited to this. For example, a rectangular area may be displayed.
  • the user brings another finger, which is the second operation body, into contact with an area outside the adjustment area 731 in the operation area 730 before releasing the index finger F1 from the operation screen 700.
  • For example, the user brings the thumb F2, which is the second operating body, into contact with the area outside the adjustment area 731 in the operation area 730, as shown in FIG.
  • the pointer 731a is fixed at the first position.
  • the adjustment area 731 corresponds to the first operation area
  • the area other than the adjustment area 731 in the operation area 730 corresponds to the second operation area.
  • the operation screen 700 shows information indicating that the position of the pointer 731a is fixed.
  • a balloon 740 is displayed in association with the pointer 731a. Thereby, the user can easily recognize that the position of the pointer 731a is fixed.
  • the pointer 731a is fixed.
  • the display of the image of the circular area around the pointer 731a described in FIG. 4 may be changed.
  • the degree of gradation may be changed, the area may be reduced, or the circle may be changed to another shape (for example, a rectangle), or these may be combined. This makes it easier for the user to recognize that the pointer 731a is fixed.
  • In order to confirm the color adjustment on the operation screen 700, it is necessary to release the index finger F1 that has been in contact with the pointer 731a.
  • the index finger F1 is released in a state where the thumb F2 is in contact with the area outside the adjustment area 731 as shown in FIG. As a result, the color is adjusted to correspond to the pointer 731a fixed at the first position.
  • the position of the pointer 731a is fixed while the thumb F2 as the second operating body is in contact with the area outside the adjustment area 731.
  • the present invention is not limited to this.
  • the position of the pointer 731a may be fixed if the thumb F2, which is the second operating body, is brought into contact with an area outside the adjustment area 731 once. In such a case, when the thumb F2 comes into contact with the area outside the adjustment area 731 once again, the position of the pointer 731a is released. In such a case, operability is improved because only one touch is required.
  • the position of the pointer 731a may be fixed by bringing the thumb F2 as the second operating body into contact with the area outside the adjustment area 731 a plurality of times (for example, twice). This makes it easy to distinguish from other touch operations. Further, the position of the pointer 731a may be fixed when a gesture operation such as a flick operation or a drag operation is performed with the thumb F2.
  • the second operating body is one finger (thumb F2).
  • the present invention is not limited to this.
  • the second operating body may be a plurality of fingers. Then, the position of the pointer 731a may be fixed by bringing a plurality of fingers into contact with the area outside the adjustment area 731 once.
  • the present invention is not limited to this.
  • For example, the moving speed of the pointer 731a may be decreased when the first operating body subsequently moves, so that even if the first operating body moves, the pointer 731a moves only slowly.
  • the position of the pointer 731a may be fixed by making another operating body contact the area outside the adjustment area 731 again.
  • In the above description, the position of the pointer 731a is fixed when the second operating body is brought into contact with an area outside the adjustment area 731, but the present invention is not limited to this.
  • the position of the pointer 731a may be fixed when the second operating body is brought into contact with a position separated from the first operating body by a predetermined distance in the adjustment region 731. In such a case, the contact position of the second operating body that fixes the position of the pointer 731a varies depending on the position of the first operating body.
  • the present invention is not limited to this.
  • the two adjustment areas 721 and 731 on the operation screen 700 may be operated simultaneously with two fingers (first operating body). Specifically, the user may bring the index finger of the left hand into contact with the pointer 721a in the adjustment area 721, and the index finger of the right hand may be brought into contact with the pointer 731a in the adjustment area 731.
  • In the above, the operation screen 700 for adjusting the settings of the imaging device 20 has been described as an example of the operation screen that the operating body touches or approaches.
  • the present invention is not limited to this and can be applied to various operation screens.
  • the present invention can be applied to the operation of the operating body on the map screen.
  • the display of the operation screen 700 is controlled so that the user can easily fix the pointer 731a.
  • a display example of the operation screen 700 according to an embodiment will be described with reference to FIGS.
  • an operation in the operation area 720 of the operation screen 700 will be described as an example.
  • FIG. 8 is a diagram for explaining a display example of the operation screen 700 according to an embodiment.
  • A fixing area 750 (the second operation area), which the second operating body (thumb F2) contacts in order to fix the position of the pointer 721a touched by the first operating body (index finger F1), is highlighted.
  • the area other than the adjustment area 721 and the fixed area 750 on the operation screen 700 is darkly displayed so as not to be noticeable.
  • the fixed area 750 is not displayed before the index finger F1 comes into contact with the adjustment area 721.
  • the fixed area 750 is formed around the circular adjustment area 721 as shown in FIG. 8 and is a circular area larger than the adjustment area 721.
  • FIG. 9 is a diagram for explaining a display example of the operation screen 700 according to an embodiment.
  • A fixing button 752, which the second operating body (thumb F2) contacts in order to fix the position of the pointer 721a touched by the first operating body (index finger F1), is displayed.
  • areas other than the adjustment area 721 and the fixed button 752 are darkly displayed so as not to stand out.
  • the fixed button 752 is not displayed before the index finger F1 contacts the adjustment area 721.
  • the present invention is not limited to this, and the fixed button 752 may always be displayed on the operation screen 700 before the index finger F1 contacts the adjustment area 721.
  • the fixed button 752 is arranged around the adjustment area 721 (specifically, below the adjustment area 721) as shown in FIG.
  • The fixed button 752 is disposed at a position where the user can easily touch it with the second operating body.
  • When the second operating body touches the fixed button 752, the position of the pointer 721a is fixed.
  • FIG. 10 is a diagram for explaining a display example of the operation screen 700 according to an embodiment.
  • Character information 754 is displayed, indicating the area that the second operating body (thumb F2) should contact in order to fix the position of the pointer 721a touched by the first operating body (index finger F1).
  • the areas other than the adjustment area 721 and the character information 754 are darkly displayed so as not to stand out. Note that the character information 754 is not displayed before the index finger F1 contacts the adjustment area 721.
  • the character information 754 is displayed above the operation screen 700 as shown in FIG.
  • the character information 754 includes information indicating that the pointer 721a is fixed.
  • the position of the pointer 721a is fixed.
  • FIG. 11 is a diagram for explaining a display example of the operation screen 700 according to an embodiment.
  • the fixing area where the second operating body (thumb F2) is to be brought into contact is displayed in a different color.
  • the fixed areas are all areas other than the adjustment area 721 on the operation screen 700 as shown in FIG.
  • character information indicating that the pointer 721a is fixed when touched is also displayed.
  • the position of the pointer 721a is fixed.
  • information indicating that the pointer 721a is fixed is displayed on the operation screen 700 of the information processing apparatus 100.
  • the present invention is not limited to this.
  • the information processing apparatus 100 may transmit information related to the fixing of the pointer 721a to the display device 30 so that the display device 30 displays information indicating that the pointer 721a is fixed. Since the user makes the adjustment while viewing the display device 30, displaying on the display device 30 information that the pointer 721a is fixed makes it easy for the user to recognize that the pointer 721a is fixed.
  • the present invention is not limited to this.
  • the present invention can be applied to an apparatus having two display units each provided with an operation screen.
  • the two display units may be provided in separate housings so as to be foldable via, for example, a hinge unit. Further, the two display units may be separately provided in the housing so as to be capable of wireless communication.
  • adjustment areas 711, 721, and 731 shown in FIG. 4 are displayed on one operation screen (corresponding to the first operation surface) of the two display units, and adjustment bars 712, 722, and 732 shown in FIG. 4 are displayed on the other operation screen (corresponding to the second operation surface).
  • the pointer 731a operated by the first operating body is fixed.
  • the present invention is not limited to this.
  • the pointer 731a may be fixed.
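The two-finger interaction described in the bullets above (a first operating body positions the pointer in an adjustment area; a second touch on a fixing area, on the fixing button 752, or on a second operation surface locks that position) can be summarized as a small state machine. The following is an illustrative sketch under stated assumptions, not the patent's implementation; the class and method names are invented for this example.

```python
# Illustrative sketch of the pointer-fixing interaction described above.
# Names (PointerFixer, on_first_touch_move, ...) are invented for this
# example; the patent does not specify an implementation.

class PointerFixer:
    def __init__(self):
        self.pointer_pos = 0.0   # position of the pointer (e.g. 721a) on the bar
        self.fixed = False       # True once the second touch is detected

    def on_first_touch_move(self, new_pos):
        # First operating body (index finger F1) drags the pointer.
        # Once fixed, further movement of F1 no longer moves the pointer.
        if not self.fixed:
            self.pointer_pos = new_pos

    def on_second_touch(self):
        # Second operating body (thumb F2) contacts the fixing area,
        # the fixing button, or the second operation surface.
        self.fixed = True

    def on_second_release(self):
        # When the second touch ends, the pointer can move again.
        self.fixed = False

f = PointerFixer()
f.on_first_touch_move(0.4)   # F1 positions the pointer
f.on_second_touch()          # F2 fixes it
f.on_first_touch_move(0.9)   # further F1 movement no longer shifts it
print(f.pointer_pos)  # 0.4
```

This reflects the behavior described above: after the second state is detected, movement of the first operating body (including the small shift that occurs when lifting the finger) does not displace the pointer.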
  • Example of operation of the information processing apparatus: an operation example of the information processing apparatus 100 according to an embodiment of the present disclosure will be described with reference to FIG. 12.
  • FIG. 12 is a flowchart illustrating an operation example of the information processing apparatus 100 according to an embodiment.
  • the processing shown in FIG. 12 is realized by the CPU of the information processing apparatus 100 executing a program stored in the ROM.
  • the program to be executed may be stored in a recording medium such as a CD (Compact Disk), a DVD (Digital Versatile Disk), or a memory card, or may be downloaded from a server or the like via the Internet.
  • the display control unit 136 displays the operation screen 700 as shown in FIG. 3 on the display unit 110 (step S102).
  • the displayed operation screen 700 is a screen for adjusting the color of an image captured by the imaging device 20.
  • the operation detection unit 132 detects contact or proximity of the index finger F1 to the operation screen 700 (step S104). For example, the operation detection unit 132 detects that the index finger F1 has touched the pointer 721a of the adjustment area 721.
  • the operation detection unit 132 determines whether or not the user is in contact with or close to the operation screen 700 with the second operation body (here, the thumb F2) (step S106).
  • the thumb F2 is brought into contact with the outside of the adjustment area 721 on the operation screen 700.
  • If it is determined in step S106 that the thumb F2 is in contact with or close to the operation screen 700 (Yes), the post-detection processing unit 134 fixes the position of the pointer 721a (step S108). For this reason, even if the index finger F1 subsequently moves, the pointer 721a does not move. At this time, the display control unit 136 displays information indicating that the position of the pointer 721a is fixed on the operation screen 700 (step S110). Thereby, the user can easily recognize that the pointer 721a is fixed.
  • the operation detection unit 132 determines whether or not the index finger F1 has left the operation screen 700 (step S112). When adjusting to the color corresponding to the fixed position of the pointer 721a, the user releases the index finger F1.
  • If it is determined in step S112 that the index finger F1 has left the operation screen 700 (Yes), the post-detection processing unit 134 determines the value corresponding to the fixed position of the pointer 721a as the color adjustment value (step S114). Thereafter, the communication unit 150 transmits the determined color adjustment value to the imaging device 20. As a result, the imaging device 20 adjusts the color tone based on the received adjustment value, and transmits the adjusted image to the display device 30.
  • If it is determined in step S106 that the thumb F2 is not in contact with or close to the operation screen 700 (No), the operation detection unit 132 determines whether or not the index finger F1 has left the operation screen 700 (step S118).
  • If it is determined in step S118 that the index finger F1 has left the operation screen 700 (Yes), the post-detection processing unit 134 determines the value corresponding to the position of the pointer 721a at which the index finger F1 left as the color adjustment value (step S120). Thereafter, the communication unit 150 transmits the determined color adjustment value to the imaging device 20. As a result, the imaging device 20 adjusts the color tone based on the received adjustment value, and transmits the adjusted image to the display device 30.
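The flow of FIG. 12 (steps S104 through S120) can be condensed into code: while the index finger drags the pointer, a concurrent thumb contact fixes the pointer's position, and lifting the index finger commits the adjustment value (the fixed position if the thumb touched, otherwise the lift-off position). This is a hedged sketch; the event tuples and the function name are invented here, and only the decision structure follows the flowchart described above.

```python
# Hedged sketch of the FIG. 12 flow (steps S104-S120). The event encoding
# and the function name are invented for illustration.

def run_adjustment(events):
    """events: sequence of ('move', pos), ('thumb_down',), ('lift',) tuples.
    Returns the color adjustment value that would be sent to the imaging
    device 20."""
    pointer = None
    fixed = False
    for ev in events:
        if ev[0] == 'move' and not fixed:   # S104: F1 moves pointer 721a
            pointer = ev[1]
        elif ev[0] == 'thumb_down':         # S106 Yes -> S108: fix position
            fixed = True
        elif ev[0] == 'lift':               # S112 / S118: F1 leaves the screen
            return pointer                  # S114 / S120: determine the value
    return pointer
```

With a thumb touch, the committed value is the position at fixing time; without it, the value corresponds to the position where the index finger left the screen.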
  • FIG. 13 is an explanatory diagram showing a hardware configuration example of the information processing apparatus 100 according to an embodiment.
  • the information processing apparatus 100 includes a CPU (Central Processing Unit) 201, a ROM (Read Only Memory) 202, a RAM (Random Access Memory) 203, an input device 208, an output device 210, a storage device 211, a drive 212, and a communication device 215.
  • the CPU 201 functions as an arithmetic processing unit and a control unit, and controls the overall operation in the information processing apparatus 100 according to various programs. Further, the CPU 201 may be a microprocessor.
  • the ROM 202 stores programs used by the CPU 201, calculation parameters, and the like.
  • the RAM 203 temporarily stores programs used in the execution of the CPU 201, parameters that change as appropriate during the execution, and the like. These are connected to each other by a host bus including a CPU bus.
  • the input device 208 includes input means for the user to input information, such as a mouse, a keyboard, a touch panel, a touch pad, buttons, a microphone, switches, and levers, and an input control circuit that generates an input signal based on the user's input and outputs it to the CPU 201.
  • a user of the information processing apparatus 100 can input various data and instruct a processing operation to the information processing apparatus 100 by operating the input device 208.
  • the output device 210 includes a display device such as a liquid crystal display (LCD) device, an OLED (Organic Light Emitting Diode) device, and a lamp. Furthermore, the output device 210 includes an audio output device such as a speaker and headphones. For example, the display device displays a captured image or a generated image. On the other hand, the audio output device converts audio data or the like into audio and outputs it.
  • the storage apparatus 211 is a data storage apparatus configured as an example of a storage unit of the information processing apparatus 100 according to the present embodiment.
  • the storage device 211 may include a storage medium, a recording device that records data on the storage medium, a reading device that reads data from the storage medium, a deletion device that deletes data recorded on the storage medium, and the like.
  • the storage device 211 stores programs executed by the CPU 201 and various data.
  • the drive 212 is a storage medium reader / writer, and is built in or externally attached to the information processing apparatus 100.
  • the drive 212 reads information recorded on a removable storage medium 220 such as a mounted magnetic disk, optical disk, magneto-optical disk, or semiconductor memory, and outputs the information to the RAM 203.
  • the drive 212 can also write information into the removable storage medium 220.
  • the communication device 215 is a communication interface configured with, for example, a communication device for connecting to the network 230.
  • the communication device 215 may be a wireless LAN (Local Area Network) compatible communication device, an LTE (Long Term Evolution) compatible communication device, or a wire communication device that performs wired communication.
  • the network 230 is a wired or wireless transmission path for information transmitted from a device connected to the network 230.
  • the network 230 may include a public line network such as the Internet, a telephone line network, and a satellite communication network, various LANs (Local Area Network) including Ethernet (registered trademark), WAN (Wide Area Network), and the like.
  • the network 230 may also include a dedicated line network such as an IP-VPN (Internet Protocol-Virtual Private Network).
  • after the first state, in which the first operating body is in contact with or close to the first operation surface, is detected, the information processing apparatus 100 changes a control parameter (specifically, a parameter related to the movement of the operation target displayed on the display unit) according to the movement of the first operating body.
  • after the second state, in which the first operating body remains in contact with or close to the first operation surface and an operation on the second operation surface is performed, is detected, the amount of change in the control parameter corresponding to a predetermined movement amount of the first operating body is made smaller than the amount of change corresponding to the same movement amount in the first state.
  • as a result, the movement amount of the operation target (pointer) with respect to the operation of the first operating body becomes small after the second state is detected. This suppresses the pointed position, after the first operating body is released from the operation screen, from shifting away from the desired position pointed to while the first operating body was in the first state.
  • alternatively, the information processing apparatus 100 fixes the position of the operation target (the position of the pointer) with respect to the operation of the first operating body. This prevents the pointing position after the first operating body is separated from the operation screen from deviating from the desired pointing position at which the first operating body was located in the first state.
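More generally, instead of fully fixing the pointer, the apparatus makes the change in the control parameter per unit of finger movement smaller in the second state than in the first. A minimal sketch, assuming a simple gain factor; the gain values are illustrative assumptions, not from the source.

```python
# Minimal sketch of the reduced change amount: the same finger movement
# produces a smaller parameter change once the second state is detected.
# The gain values below are assumptions for illustration.

GAIN_FIRST_STATE = 1.0    # parameter change per unit movement, first state
GAIN_SECOND_STATE = 0.1   # smaller gain after the second state is detected
                          # (a gain of 0.0 would fully fix the operation target)

def parameter_change(movement, second_state_detected):
    # Returns the change applied to the control parameter for a given
    # movement amount of the first operating body.
    gain = GAIN_SECOND_STATE if second_state_detected else GAIN_FIRST_STATE
    return movement * gain
```

Choosing a nonzero second-state gain keeps fine adjustment possible while still suppressing the shift that occurs when the finger is lifted; a gain of zero corresponds to the pointer-fixing behavior described above.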
  • (1) An information processing apparatus including a processing unit that acquires a signal from a detection unit that detects contact or proximity of an operating body to an operation surface, and that changes a predetermined control parameter according to the movement of the operating body detected based on the signal, wherein the processing unit makes the amount of change of the control parameter corresponding to a predetermined movement amount of the first operating body after detection of a second state, in which the first operating body is in contact with or close to the first operation surface and an operation on the second operation surface is detected, smaller than the amount of change of the control parameter corresponding to the predetermined movement amount of the first operating body after detection of a first state, in which the first operating body is in contact with or close to the first operation surface.
  • the second state is a state in which at least one second operation body is in contact with or close to the second operation surface.
  • the control parameter is a parameter related to the movement of the operation target displayed on the display unit.
  • the processing unit fixes the position of the operation target with respect to the operation of the first operating body.
  • the information processing apparatus further includes a display control unit that displays the processing content of the processing unit in association with the first operation surface.
  • the information processing apparatus according to any one of (1) to (4).
  • the display control unit displays the processing content of the processing unit in association with the first operation surface while the first operating body moves from the first state.
  • the information processing apparatus according to (5). (7)
  • the second operation surface is the same operation surface as the first operation surface.
  • the first state is a state in which the first operating body is in contact with or in proximity to a predetermined first operating area of the first operating surface.
  • the second state is a state in which the second operating body is in contact with or close to a second operation region outside the first operation region.
  • the information processing apparatus according to (7).
  • the second operation area is an area around the first operation area.
  • the information processing apparatus according to (8).
  • a display control unit for displaying character information indicating a position of the second operation area on the first operation surface;
  • the information processing apparatus according to (8) or (9).
  • the second operation area is an area separated from the first operation body by a predetermined distance when the first state is detected.
  • the second operation area varies according to a position where the first operation body contacts or approaches the first operation surface.
  • the second state is a state in which the second operating body is in contact with or close to a predetermined button on the second operation surface.
  • the processing unit reduces a change amount of the control parameter corresponding to the predetermined movement amount of the first operating body while the detection of the second state is continued.
  • the processing unit reduces a change amount of the control parameter corresponding to the predetermined movement amount of the first operating body.
  • the second operation surface is an operation surface different from the first operation surface. The information processing apparatus according to (1).
  • 10 Imaging display system
  • 20 Imaging apparatus
  • 30 Display device
  • 100 Information processing apparatus
  • 110 Display unit
  • 130 Control unit
  • 132 Operation detection unit
  • 134 Post-detection processing unit
  • 136 Display control unit
  • 700 Operation screen

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The object of the present invention is to minimize, at the moment an operating body is removed from an operation surface, a possible shift of the position on the operation surface pointed to by the operating body. To this end, an information processing device according to the present invention is provided with a processing unit for obtaining a signal from a detection unit, which detects contact of an operating body with an operation surface or close proximity thereto, and for varying a predetermined control parameter according to the movement of the operating body as detected on the basis of the signal. After detection of a first state, in which a first operating body is in contact with or in close proximity to a first operation surface, the processing unit varies the control parameter for the first operating body in such a way that the amount of change of the control parameter for a predetermined distance traveled by the first operating body after detection of a second state, in which a second operation surface is operated while the first operating body is in contact with or in close proximity to the first operation surface, is smaller than the amount of change of the control parameter for the predetermined distance traveled by the first operating body after detection of the first state but before detection of the second state.
PCT/JP2014/071639 2013-10-25 2014-08-19 Information processing device, information processing method, and program WO2015059992A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2013222752 2013-10-25
JP2013-222752 2013-10-25

Publications (1)

Publication Number Publication Date
WO2015059992A1 true WO2015059992A1 (fr) 2015-04-30

Family

ID=52992601

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2014/071639 WO2015059992A1 (fr) 2013-10-25 2014-08-19 Information processing device, information processing method, and program

Country Status (1)

Country Link
WO (1) WO2015059992A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017056651A1 (fr) * 2015-09-30 2017-04-06 Sony Corporation Information processing device, information processing method, and program

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012055613A1 (fr) * 2010-10-29 2012-05-03 Robert Bosch Gmbh Control device for controlling at least one electrical appliance
JP2012098844A (ja) * 2010-10-29 2012-05-24 Canon Marketing Japan Inc Information processing apparatus, information processing method, and program therefor
JP2012114819A (ja) * 2010-11-26 2012-06-14 Toshiba Corp Electronic device and seek bar display method
JP2013097411A (ja) * 2011-10-28 2013-05-20 Sanyo Electric Co Ltd Vehicle navigation device
GB2502671A (en) * 2012-03-26 2013-12-04 Boeing Co Virtual knobs on a touchscreen


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017056651A1 (fr) * 2015-09-30 2017-04-06 Sony Corporation Information processing device, information processing method, and program
US10620719B2 (en) 2015-09-30 2020-04-14 Sony Corporation Information processing device and information processing method

Similar Documents

Publication Publication Date Title
US11816330B2 (en) Display device, display controlling method, and computer program
US8610678B2 (en) Information processing apparatus and method for moving a displayed object between multiple displays
US10942620B2 (en) Information processing apparatus, information processing method, program, and information processing system
US10705702B2 (en) Information processing device, information processing method, and computer program
US11188192B2 (en) Information processing device, information processing method, and computer program for side menus
CN104423697B (zh) Display control device, display control method, and recording medium
US10180783B2 (en) Information processing device, information processing method and program that controls movement of a displayed icon based on sensor information and user input
KR20120079812A (ko) Information processing device, information processing method, and computer program
WO2013121807A1 (fr) Information processing device, information processing method, and computer program
TW201403391A (zh) Remote interaction system and control thereof
KR101432483B1 (ko) Touchscreen control method using a control area and terminal using the same
WO2015059992A1 (fr) Information processing device, information processing method, and program
WO2014034549A1 (fr) Information processing device, information processing method, program, and information storage medium
JP5841023B2 (ja) Information processing apparatus, information processing method, program, and information storage medium
US20200319793A1 (en) Information processing device, information processing method, and program
US20240103630A1 (en) A computer a software module arrangement, a circuitry arrangement, a user equipment and a method for an improved and extended user interface
JP6156709B2 (ja) Information processing apparatus, information processing method, program, and information processing system
US20190212891A1 (en) Electronic apparatus, information processing method, program, and storage medium
JP2017219914A (ja) Input device, input method, and input program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14856381

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

NENP Non-entry into the national phase

Ref country code: JP

122 Ep: pct application non-entry in european phase

Ref document number: 14856381

Country of ref document: EP

Kind code of ref document: A1