US20150033162A1 - Information processing apparatus, method, and non-transitory computer-readable medium - Google Patents
- Publication number
- US20150033162A1 (application US14/379,931 / US201314379931A)
- Authority
- US
- United States
- Prior art keywords
- operation target
- display
- size
- input
- processing apparatus
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72403—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04817—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04842—Selection of displayed objects or displayed text elements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04847—Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72469—User interfaces specially adapted for cordless or mobile telephones for operating the device by selecting functions from two or more displayed items, e.g. menus or icons
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04806—Zoom, i.e. interaction techniques or interactors for controlling the zooming operation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M2201/00—Electronic components, circuits, software, systems or apparatus used in telephone systems
- H04M2201/34—Microprocessors
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M2201/00—Electronic components, circuits, software, systems or apparatus used in telephone systems
- H04M2201/42—Graphical user interfaces
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M2250/00—Details of telephonic subscriber devices
- H04M2250/22—Details of telephonic subscriber devices including a touch pad, a touch sensor or a touch detector
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M2250/00—Details of telephonic subscriber devices
- H04M2250/52—Details of telephonic subscriber devices including functional features of a camera
Definitions
- electronic devices having a plurality of functions, such as a mobile phone, a digital still camera, or the like, have become widespread.
- an electronic device exists in which a menu screen, which enables a user to perform each operation for a desired function, is displayed on a touch panel, and the function according to an operation input on the touch panel is executed.
- the processing circuitry may be configured to compare the contact size to a threshold value.
- the processing circuitry may be configured to enable or disable the operation input for the operation target based on the comparison.
- the circuitry may be configured to enable an operation input for the operation target when the contact size is less than the threshold value.
- the circuitry may be configured to disable an operation input for the operation target when the contact size is greater than the threshold value.
- a non-transitory computer-readable medium including computer program instructions, which when executed by an information processing apparatus, causes the information processing apparatus to perform a process comprising: controlling a display to display an operation target; determining a contact size of an object on the display; and enabling or disabling an operation input for the operation target based on the contact size.
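The enable/disable decision claimed above can be sketched as a simple threshold comparison. This is a minimal, hypothetical illustration of the claim language, not the patented implementation; the threshold value and the unit (contact area in mm²) are assumptions.

```python
# Hypothetical sketch of the claimed behavior: the operation input for an
# operation target is enabled or disabled by comparing the detected contact
# size to a threshold value. All names and values here are illustrative.

THRESHOLD_MM2 = 80.0  # assumed threshold on contact area, in mm^2


def operation_input_enabled(contact_size_mm2: float,
                            threshold_mm2: float = THRESHOLD_MM2) -> bool:
    """Enable the input when the contact is smaller than the threshold;
    disable it when the contact is larger (a large contact suggests an
    unintended touch, e.g. the back of a hand)."""
    return contact_size_mm2 < threshold_mm2
```

Per the claims, a fingertip-sized contact below the threshold keeps the input enabled, while a larger contact disables it.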
- FIG. 10 is a flowchart which illustrates an example of a processing order of display control processing by the display control device 100 according to the first embodiment of the present technology.
- Second Embodiment display control: example of changing display magnification of operation target based on detection size (contact size)
- FIG. 1 is a perspective view which illustrates a configuration example of an appearance of a display control device 100 according to the first embodiment of the present technology.
- “a” of FIG. 1 illustrates an appearance of one surface side of the display control device 100 (that is, the surface on which the input-output unit 150 is provided).
- “b” of FIG. 1 illustrates an appearance of the other surface side of the display control device 100 (that is, surface on which lens 121 is provided).
- the display control device 100 includes first to fifth buttons 111 to 115 , speakers 101 and 102 , the lens 121 , and the input-output unit 150 .
- the display control device 100 is realized by a radio communication device which is able to display various images (for example, a mobile phone, or a smart phone including call and data communication functions).
- it is also possible to provide other operation members in the display control device 100; however, figures and descriptions thereof will be omitted.
- the speakers 101 and 102 are speakers which output various pieces of sound information.
- the speaker 101 is a speaker which is used when making a call
- the speaker 102 is a speaker which is used when reproducing contents, or the like.
- the input-output unit 150 is a unit which displays various images, and receives an operation input from a user based on a detection state of an object which approaches, or comes into contact with the display surface.
- the input-output unit 150 is also referred to as a touch screen, or a touch panel.
- the display control device 100 includes an operation reception unit 110 , an imaging unit 120 , a recording medium control unit 130 , a recording medium 140 , the input-output unit 150 , an input control unit 160 , a control unit 170 , a size information maintaining unit 171 , and a display control unit 180 .
- the recording medium 140 is a medium which stores various pieces of information (still image content, or animation content) based on the control of the recording medium control unit 130 . In addition, the recording medium 140 supplies the stored various pieces of information to the recording medium control unit 130 .
- the reception unit 151 is a unit which receives an operation input of an object (for example, user's finger) which approaches, or comes into contact with the display surface of the input-output unit 150 , based on the detection state of the object.
- the reception unit 151 includes a plurality of electrostatic sensors which are arranged in a lattice shape.
- the electrostatic sensors are sensors which increase an electrostatic capacitance when an object (conductive object (for example, human hand, or finger)) approaches, or comes into contact with the display surface of the input-output unit 150 .
- the reception unit 151 is an example of the detection unit.
- the operation target is, for example, displayed on the input-output unit 150 like setting operation images 331 to 334 illustrated in FIG. 3 .
- the operation target is data for performing the operation input, or an object (for example, GUI (Graphical User Interface) parts).
- the input control unit 160 is a unit which performs a control relating to an operation input by a user (for example, touch operation (tap operation)) which is received by the reception unit 151 .
- the input control unit 160 detects a range (contact range) of the display surface of the input-output unit 150 which a user's finger touched, based on the information of the electrostatic sensors which is output from the reception unit 151, and converts the contact range to coordinates based on a coordinate axis corresponding to the display surface.
- the input control unit 160 calculates a shape of the contact range based on the converted coordinate, and calculates a coordinate of the center of gravity in the shape.
- the input control unit 160 calculates the calculated coordinate of the center of gravity as a coordinate of a position (contact position) which a user's finger touched.
- the input control unit 160 outputs operation information relating to the shape of the calculated contact range, and the coordinate of the contact position, to the control unit 170 .
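The input-control steps above (detect the touched cells of the lattice of electrostatic sensors, convert them to coordinates, and take the center of gravity as the contact position) can be sketched as follows. The lattice-cell representation and function names are assumptions for illustration only.

```python
# Sketch of the contact-position calculation: the electrostatic sensors
# report which lattice cells an object touches; the centroid (center of
# gravity) of those cells is taken as the contact position.

from typing import List, Tuple


def contact_position(active_cells: List[Tuple[int, int]]) -> Tuple[float, float]:
    """Return the center of gravity of the touched sensor cells as (x, y)."""
    if not active_cells:
        raise ValueError("no contact detected")
    n = len(active_cells)
    cx = sum(x for x, _ in active_cells) / n
    cy = sum(y for _, y in active_cells) / n
    return cx, cy
```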
- the control unit 170 recognizes the operation input of the user on the display surface of the input-output unit 150 based on the operation information (shape of contact range, coordinate of contact position, or the like) which is output from the input control unit 160 .
- the control unit 170 is a unit which controls each unit of the display control device 100 based on the operation signal from the operation reception unit 110 , and the operation information (shape of contact range, coordinate of contact position, or the like) from the input control unit 160 . In addition, the control unit 170 maintains a contact size (detected size) of the contact operation by the user which is detected on the display surface of the input-output unit 150 in the size information maintaining unit 171 .
- the control unit 170 performs a control which changes operation contents of an operation target based on a contact size (detected size) of an object (for example, a user's finger) on the display surface of the input-output unit 150 . That is, the control unit 170 changes the operation contents of at least a part of the plurality of operation targets based on the contact size (detected size) of the object on the display surface of the input-output unit 150 . For example, the control unit 170 changes the operation contents of the operation targets by disabling the operation input of the operation targets based on the contact size (detected size) of an object on the display surface of the input-output unit 150 .
- the control unit 170 changes the operation content based on a comparison result between the contact size (detected size) of the object on the display surface of the input-output unit 150 and the size of the operation target on the display surface of the input-output unit 150 .
- the control unit 170 enables the operation input of the operation target, based on the contact size (detected size) of the object on the display surface of the input-output unit 150 , when the size of the operation target on the display surface of the input-output unit 150 is large.
- the control unit 170 disables the operation input of the operation target when the size of the operation target on the display surface of the input-output unit 150 is small.
- the operation target of which the operation input is disabled is referred to as a simple operation target
- the operation target of which the operation input is enabled is referred to as a detailed operation target.
- the reference size is a value (specified value) which denotes the size of the stylus which is used when performing an operation, or a finger of a standard user, and is preset.
- the contact size is a value based on the value which is detected in the reception unit 151 (value denoting size of stylus used when performing operation by user, or size of finger of user).
- the control unit 170 causes the size information maintaining unit 171 to sequentially maintain the value which is detected in the reception unit 151 (for example, the area which the user's finger touched (contact area)).
- the control unit 170 calculates a mean value of the contact areas which are maintained in the size information maintaining unit 171 , and causes the size information maintaining unit 171 to maintain the mean value as a contact size (detected size). That is, the contact size (detected size) is a mean value of the input values which are actually input. However, since it is also considered that a child may use the device after an adult, it is preferable to use the mean value per unit time as the contact size (detected size).
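The size-information maintenance described above can be sketched as a windowed running mean: each detected contact area is recorded, and the detected size is the mean of recent samples, so the value adapts when a different user (for example, a child after an adult) starts operating the device. The window length and class name are assumptions.

```python
# Sketch of the size information maintaining unit: maintain recent contact
# areas and report their mean as the detected size.

from collections import deque


class SizeInfo:
    def __init__(self, window: int = 16):
        # Keep only the most recent samples so the mean tracks the
        # current user rather than the all-time history.
        self._samples = deque(maxlen=window)

    def record(self, contact_area: float) -> None:
        """Sequentially maintain a detected contact area."""
        self._samples.append(contact_area)

    def detected_size(self) -> float:
        """Mean of the maintained contact areas."""
        return sum(self._samples) / len(self._samples)
```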
- the display control unit 180 is a unit which causes each image to be output to the display unit 152 based on the control of the control unit 170 .
- the display control unit 180 causes a setting screen for performing various settings when performing an imaging operation (for example, the menu screen 300 in FIG. 3 ), or an imaged image which is output from the imaging unit 120 (a so-called through image), to be displayed on the display unit 152 .
- the display control unit 180 causes content which is stored in the recording medium 140 (for example, still image content, or animation content) to be displayed on the display unit 152 .
- the menu screen 300 is a screen in which the operation targets are grouped according to the types thereof in order to look over the whole menu. That is, the grouped operation targets are divided into nine regions in a unit of group in a state of being compactly displayed (overlook state), and are displayed on one screen (menu screen 300 ). That is, operation targets which look similar (for example, items relating to the same function) belong to each group.
- the menu screen 300 which is divided into nine regions is an example, and it is also preferable to appropriately change the number of regions according to each operation target as a display target.
- an imaging mode setting region 310 , a flash system setting region 320 , a white balance system setting region 330 , a reproducing setting region 340 , and a diaphragm adjusting region 350 are displayed on the menu screen 300 .
- a face detection system setting region 360 , a guide display system setting region 370 , an imaged image size system setting region 380 , and an animation system setting region 390 are displayed on the menu screen 300 .
- the flash system setting region 320 is a region in which operation targets which are used when performing various settings relating to a flash are displayed.
- the white balance system setting region 330 is a region in which operation targets which are used when performing various settings relating to a white balance are displayed.
- the face detection system setting region 360 is a region in which operation targets which are used when performing various settings relating to face detection are displayed.
- the guide display system setting region 370 is a region in which operation targets which are used when performing various settings relating to a guide function (help function) are displayed.
- the operation targets, regions, or the like which are displayed on the menu screen 300 are examples, and it is also preferable to change them appropriately according to the set mode, the imaging operation state, or the like.
- the control unit 170 specifies at which position of the menu screen 300 the touch operation has been performed. That is, the control unit 170 specifies the position on the display surface of the input-output unit 150 which a user's finger touched (contact position) based on the operation input which is output from the input control unit 160 . In addition, the control unit 170 performs a process corresponding to the detailed operation target when the contact position is included in the detailed operation target.
- when the actual size of the operation object differs from the specified value (reference value), problems can occur: with a relatively large specified value (reference value), the operation targets which are viewed as operable in practice are limited, and become difficult to operate; with a relatively small specified value (reference value), operation targets may conflict with each other, or be operated erroneously.
- the detailed operation target, and the simple operation target are determined using the reference size and the detected size together.
- FIG. 4 is a diagram which illustrates a relationship between the operation target which is displayed on the input-output unit 150 and the reference size which is maintained in the size information maintaining unit 171 according to the first embodiment of the present technology.
- “a” of FIG. 4 illustrates an example of a relationship between a reference size 200 and an operation target which is displayed by being enlarged.
- “b” of FIG. 4 illustrates an example of relationship between the reference size 200 and an operation target which is displayed by being compacted.
- setting operation images (operation targets) 331 to 334 in the white balance system setting region 330 in FIG. 3 are exemplified.
- the setting operation images 331 to 334 are displayed in the white balance system setting region 330 .
- the setting operation images 331 to 334 are operation images which are used when adjusting the white balance at the time of setting the imaging mode. For example, the white balance can be adjusted by the contact operation of the setting operation images 331 to 334 .
- a size which is used as a comparison target means an area on the display surface of the input-output unit 150 , that is, a value which is specified by the shape or the like of the operation target (for example, an area which is specified by a rectangle surrounding the operation target).
- the control unit 170 determines the operation target which is larger than the reference size 200 as the detailed operation target. In this manner, the operation target which is determined to be the detailed operation target can be operated using the user's contact operation on the display surface of the input-output unit 150 .
- the control unit 170 determines an operation target of which the size is smaller than the reference size 200 as the simple operation target. In this manner, the operation using a user's contact operation on the display surface of the input-output unit 150 is disabled with respect to the operation target which is determined to be the simple operation target.
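The determination described in the two points above can be sketched as a single comparison between the operation target's on-screen area (for example, the area of its bounding rectangle) and the reference size. The function name and the tie-breaking at equality are assumptions.

```python
# Hedged sketch of the FIG. 4 determination: a target at least as large as
# the reference size becomes a "detailed" (operable) target; a smaller one
# becomes a "simple" (operation-disabled) target.


def classify_target(target_area: float, reference_size: float) -> str:
    """Classify an operation target by comparing its area to the reference size."""
    return "detailed" if target_area >= reference_size else "simple"
```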
- FIG. 5 illustrates the menu screen 300 in a case where a contact operation is performed using a finger 10 as the operation object.
- the menu screen 300 illustrated in FIG. 5 is the same as that in FIG. 3 .
- among the operation targets which are displayed on the menu screen 300 (for example, the setting operation images 331 to 334 ), only the operation targets which are surrounded by the dotted-lined rectangles 401 and 402 are set to the detailed operation target.
- on the menu screen 300 , it is possible to perform only the contact operation of selecting the nine regions ( 310 , . . . , 390 ), and the operation of the detailed operation targets which are surrounded by the dotted-lined rectangles 401 and 402 .
- it is also preferable to display the detailed operation target and the simple operation target using different display forms so as to be easily distinguished from each other by a user. For example, it is possible to highlight the detailed operation target (displaying the detailed operation target brighter than the simple operation target). In addition, it is also preferable to display the detailed operation target in blue when the menu screen 300 is a display screen which is based on white. In addition, it is also preferable to display the detailed operation target such that it is minutely and repeatedly expanded and compacted (for example, the detailed operation target is displayed with a feeling of moving lightly).
- the contact operation is performed by the operation object of which the detected size is larger than the size of the setting operation images 331 to 334 in the menu screen 410 .
- the contact operation is performed using a hand 20 in a state of a rock (a fist) as the operation object. Therefore, FIGS. 7 to 9 illustrate examples of a relationship between the detailed operation target and the simple operation target in a case where the contact operation is performed by an operation object of which the detected size is relatively large.
- FIG. 7 illustrates the menu screen 300 which is displayed when the contact operation is performed using the back of the hand 20 in the state of a rock as the operation object. For example, it is assumed that the contact operation is performed in a state where the hand 20 grasps something.
- the menu screen 300 illustrated in FIG. 7 is the same as that in FIG. 3 .
- it is possible to display each region by further enlarging the region in order to perform the contact operation with respect to each operation target (setting operation images 331 to 334 ).
- an example in which the setting operation image (simple operation target) 331 is displayed by being enlarged is illustrated in FIG. 9 .
- the setting operation image 331 which is displayed on the menu screen 420 is the detailed operation target (illustrated by being surrounded by dotted lined-rectangle 421 ). For this reason, it is possible to perform the contact operation of the setting operation image 331 on the menu screen 420 .
- when there is one operation target which is present on the display screen (or in a region of the predetermined range), the operation target is considered to be the one to be operated, even when the size of the operation target on the display screen is remarkably smaller than the detected size. For this reason, when there is one operation target which is present on the display screen (or in a region of the predetermined range), the operation target is determined to be the detailed operation target. That is, even when the operation target is remarkably smaller than the detected size, the operation target is determined to be the detailed operation target when there is nothing in the periphery thereof.
- the control unit 170 enables the operation input of an operation target when a predetermined condition is satisfied, even when the operation input of the operation target is determined to be disabled based on the contact size of an object on the display surface.
- a case where the predetermined condition is satisfied is, for example, a case in which the number of operation targets (the operation target and other operation targets) which are present in a predetermined region on the display surface is less than a predetermined number (for example, 1).
- a determination example in which a distance between the operation targets on the display surface is set to a determination element will be described.
- a case is assumed in which a plurality of operation targets are present in the display screen (or, in a region of a predetermined range).
- these two operation targets are considered to be difficult to operate (cannot be operated).
- when the size of each operation target on the display surface is larger than the distance between the two operation targets by the predetermined value or more, the operation targets are determined to be the simple operation target.
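The exception conditions above can be sketched as two small predicates: a target that the size comparison would disable is re-enabled when it is isolated (fewer than a predetermined number of targets in the surrounding region), and closely spaced targets are forced to the simple (disabled) state when their size exceeds their mutual distance by a margin. All thresholds, names, and the one-dimensional geometry are assumptions.

```python
# Sketch of the exception rules to the detailed/simple determination.


def enable_isolated(neighbor_count: int, max_neighbors: int = 1) -> bool:
    """Re-enable a small target when fewer than `max_neighbors` targets
    are present in the predetermined region around it."""
    return neighbor_count < max_neighbors


def force_simple(target_extent: float, gap: float, margin: float) -> bool:
    """Determine targets to be simple when each target's extent exceeds
    the distance (gap) between the two targets by `margin` or more."""
    return target_extent - gap >= margin
```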
- the detailed operation target and the simple operation target are determined based on the reference size and the detected size.
- in the second embodiment, a display magnification of the operation target is changed, and an operation of the operation target is thereby enabled.
- a configuration of a display control device according to the second embodiment of the present technology is approximately the same as the examples illustrated in FIGS. 1 and 2 , or the like. For this reason, regarding portions which are common to the first embodiment of the present technology, a part of descriptions thereof will be omitted.
- FIG. 12 is a flowchart which illustrates an example of a processing order of a display control processing by the display control device 100 according to the second embodiment of the present technology.
- a detected size as history information is sequentially maintained in the size information maintaining unit 171 .
- descriptions of steps S 921 and S 922 will be omitted, since they have the same processing order as those of steps S 906 and S 907 illustrated in FIG. 10 .
- the control unit 170 compares a detected size which is obtained this time to a detected size in the past which is maintained in the size information maintaining unit 171 , and determines whether or not these differ from each other by a predetermined value or more (step S 923 ). In addition, when the two detected sizes are within the range of the predetermined value (step S 923 ), the operation of the display control processing is completed.
- control unit 170 performs a control for changing a display content of an operation target based on the contact size of an object on the display surface. Specifically, when there is an operation target of which the operation input is determined to be disabled based on the contact size of an object on the display surface, the control unit 170 performs a control for displaying the operation target by enlarging to a size by which the operation input of the operation target is enabled.
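The history comparison of step S923 can be sketched as follows; this is an illustrative example only, and the class name, method names, and the threshold value are assumptions rather than elements of the disclosure:

```python
# Sketch of step S923: re-run the display control only when the newly
# detected contact size differs from the previously maintained one by a
# predetermined value or more. Names and the threshold are illustrative.

class SizeInformationMaintainingUnit:
    """Keeps detected sizes as history information."""

    def __init__(self):
        self.history = []

    def latest(self):
        return self.history[-1] if self.history else None

    def store(self, detected_size: float):
        self.history.append(detected_size)


def needs_redisplay(unit: SizeInformationMaintainingUnit,
                    detected_size: float, threshold: float = 1.5) -> bool:
    """True when the display content should be changed for the new size."""
    previous = unit.latest()
    unit.store(detected_size)
    # A first measurement, or a change of `threshold` or more, triggers
    # a change of the display content; otherwise processing completes.
    return previous is None or abs(detected_size - previous) >= threshold
```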
- FIG. 13 is a diagram which illustrates an example of a display screen (detected size measurement screen 500 ) which is displayed on an input-output unit 150 according to the embodiments of the present technology, and the measurement result.
- the measuring method illustrated in FIG. 13 can be realized, for example, by integrating it into a calibration procedure performed on a precision adjustment screen of a general touch panel.
- “a” illustrates the detected size measurement screen 500 for measuring a detected size.
- a contact position image 501 and a tracing direction image 502 are displayed on the detected size measurement screen 500 .
- a user touches the contact position image 501 with an object (for example, user's finger 50 ) which is used when performing the operation input on the display surface of the input-output unit 150 .
- the user moves the finger 50 along an arrow of the tracing direction image 502 in a state where the finger 50 comes into contact with the display surface of the input-output unit 150 .
- the contact size (detected size) of the finger 50 is measured when the user performs the contact operation and the tracing operation of the finger 50 on the detected size measurement screen 500.
- an example in which the contact size (detected size) of the finger 50 is measured by performing both the contact operation and the tracing operation is illustrated; however, the contact size (detected size) of the finger 50 may also be measured by performing only one of these operations.
- the detected size becomes larger or smaller according to the size of the user's finger used when performing an operation, or the intensity of contact with the display surface.
- an object other than the finger (for example, a device such as a stylus) may also be used for the operation.
- the detected size becomes larger or smaller according to the object which is used in the operation.
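As a minimal sketch of the measurement step (names and the sampling model are assumptions; the disclosure does not specify how per-frame samples are combined), the detected size could be estimated by averaging contact-size samples taken during the contact operation and, optionally, the tracing operation:

```python
# Illustrative estimate of the detected size from the measurement screen:
# average per-frame contact sizes sampled while the object touches the
# contact position image and while it traces along the arrow. Either
# sample set alone is sufficient, matching the description above.

def estimate_detected_size(contact_samples, tracing_samples=()):
    """Combine contact-operation and optional tracing-operation samples (mm)."""
    samples = list(contact_samples) + list(tracing_samples)
    if not samples:
        raise ValueError("at least one contact size sample is required")
    return sum(samples) / len(samples)
```

A larger finger or firmer press yields larger samples, and hence a larger estimated detected size, consistent with the behavior described above.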
- a display control device such as a radio communication device has been described as an example.
- however, the present technology can also be applied to other display control devices (electronic devices).
- for example, devices in which the viewpoint position of a virtual space can be switched, or in which enlarged and reduced display is possible.
- appliances such as a digital still camera, a digital video camera (for example, a camera-integrated recorder), a digital photo frame, a smartphone, a tablet, a digital signage terminal, an automatic vending machine, and a car navigation system.
- the processing procedure described in the above embodiments may be understood as a method including this series of procedures, as a program for causing a computer to execute this series of procedures, or as a recording medium which stores the program.
- as the recording medium, for example, it is possible to use a CD (Compact Disc), an MD (MiniDisc), a DVD (Digital Versatile Disc), a memory card, a flexible disc, or a Blu-ray Disc (registered trademark).
- the present technology is also able to have a configuration as follows.
- An information processing apparatus comprising: circuitry configured to control a display to display an operation target; determine a contact size of an object on the display; and enable or disable an operation input for the operation target based on the contact size.
- circuitry is configured to enable an operation input for the operation target when the contact size is less than the threshold value.
- circuitry is configured to disable an operation input for the operation target when the contact size is greater than the threshold value.
- circuitry is configured to display at least a first operation target and a second operation target, wherein a display size of the first operation target is greater than a display size of the second operation target.
- circuitry is configured to enable an operation input for the first operation target when the contact size is greater than the threshold value.
- circuitry is configured to control the display to display an enlarged version of the second operation target when the contact size is greater than the threshold value.
- circuitry configured to disable an operation input for the first operation target and the second operation target when the contact size is greater than the threshold value.
- circuitry configured to control the display to display at least a first operation target and a second operation target and identify a distance on the display between the first operation target and the second operation target.
- circuitry is configured to enable an operation input for the first operation target and the second operation target when the distance is greater than a predetermined threshold value and the contact size is greater than the threshold value.
- a non-transitory computer-readable medium including computer program instructions, which when executed by an information processing apparatus, causes the information processing apparatus to perform a process comprising: controlling a display to display an operation target; determining a contact size of an object on the display; and enabling or disabling an operation input for the operation target based on the contact size.
- a detection unit which detects an object which comes into contact with a display surface on which an operation target for which an operation input is performed is displayed; and a control unit which performs a control for changing an operation content of the operation target based on a contact size of the object on the display surface.
- control unit changes at least a part of the operation contents among the plurality of operation targets based on the contact size of the object on the display surface.
- control unit enables the operation input of the operation target when the size of the operation target on the display surface is large, and disables the operation input of the operation target when the size of the operation target on the display surface is small, based on the contact size of the object on the display surface.
- control unit changes the operation content of the operation target by disabling the operation input of the operation target based on the contact size of the object on the display surface.
- control unit performs a control for displaying the operation target by enlarging the operation target up to a size which enables the operation input of the operation target when there is an operation target of which the operation input is determined to be disabled based on the contact size of the object on the display surface.
- a display control device which includes:
- a method of controlling a display control device which includes:
- a detection procedure in which an object which comes into contact with a display surface on which an operation target for which an operation input is performed is displayed is detected; and a control procedure in which an operation content of the operation target is changed based on a contact size of an object on the display surface.
- a detection procedure in which an object which comes into contact with a display surface on which an operation target for which an operation input is performed is displayed is detected; and a control procedure in which an operation content of the operation target is changed based on a contact size of an object on the display surface.
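The enable/disable rules recited in the configurations above can be combined into a single illustrative sketch. This is not the claimed implementation; the dataclass, function names, and threshold values are assumptions introduced for the example:

```python
# Combined sketch of the recited rules: with two operation targets, a
# large contact size disables only targets that are too small or too
# close to a neighbor, while a small contact size (e.g. a stylus tip)
# leaves every target operable. Thresholds are illustrative.

from dataclasses import dataclass


@dataclass
class OperationTarget:
    name: str
    display_size: float  # size of the target on the display (mm)


def operation_enabled(target: OperationTarget, other: OperationTarget,
                      distance: float, contact_size: float,
                      size_threshold: float = 7.0,
                      distance_threshold: float = 3.0) -> bool:
    """True when the operation input for `target` should remain enabled."""
    # Contact sizes at or below the threshold can operate any target.
    if contact_size <= size_threshold:
        return True
    # With a large contact size, a target stays operable only if it is at
    # least as large as the contact area and sufficiently far from the
    # other target; otherwise its operation input is disabled (and the
    # target would be displayed enlarged to restore operability).
    return (target.display_size >= contact_size
            and distance > distance_threshold)
```

In terms of the configurations above: the first branch corresponds to enabling the input when the contact size is below the threshold, and the second to enabling a large, well-separated first operation target while disabling a small second one.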
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Networks & Wireless Communication (AREA)
- Signal Processing (AREA)
- User Interface Of Digital Computer (AREA)
- Digital Computer Display Output (AREA)
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012-058063 | 2012-03-15 | ||
JP2012058063A JP5962085B2 (ja) | 2012-03-15 | 2012-03-15 | 表示制御装置、その制御方法およびプログラム |
PCT/JP2013/001276 WO2013136707A1 (en) | 2012-03-15 | 2013-03-01 | Information processing apparatus, method, and non-transitory computer-readable medium |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2013/001276 A-371-Of-International WO2013136707A1 (en) | 2012-03-15 | 2013-03-01 | Information processing apparatus, method, and non-transitory computer-readable medium |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/074,527 Continuation US10007401B2 (en) | 2012-03-15 | 2016-03-18 | Information processing apparatus, method, and non-transitory computer-readable medium |
US17/030,936 Continuation US11747958B2 (en) | 2012-03-15 | 2020-09-24 | Information processing apparatus for responding to finger and hand operation inputs |
Publications (1)
Publication Number | Publication Date |
---|---|
US20150033162A1 true US20150033162A1 (en) | 2015-01-29 |
Family
ID=47901254
Family Applications (4)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/379,931 Abandoned US20150033162A1 (en) | 2012-03-15 | 2013-03-01 | Information processing apparatus, method, and non-transitory computer-readable medium |
US15/074,527 Active US10007401B2 (en) | 2012-03-15 | 2016-03-18 | Information processing apparatus, method, and non-transitory computer-readable medium |
US17/030,936 Active 2034-05-01 US11747958B2 (en) | 2012-03-15 | 2020-09-24 | Information processing apparatus for responding to finger and hand operation inputs |
US18/352,372 Pending US20230367455A1 (en) | 2012-03-15 | 2023-07-14 | Information processing apparatus for responding to finger and hand operation inputs |
Family Applications After (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/074,527 Active US10007401B2 (en) | 2012-03-15 | 2016-03-18 | Information processing apparatus, method, and non-transitory computer-readable medium |
US17/030,936 Active 2034-05-01 US11747958B2 (en) | 2012-03-15 | 2020-09-24 | Information processing apparatus for responding to finger and hand operation inputs |
US18/352,372 Pending US20230367455A1 (en) | 2012-03-15 | 2023-07-14 | Information processing apparatus for responding to finger and hand operation inputs |
Country Status (9)
Country | Link |
---|---|
US (4) | US20150033162A1 (ja) |
EP (1) | EP2826230A1 (ja) |
JP (1) | JP5962085B2 (ja) |
KR (6) | KR102216482B1 (ja) |
CN (1) | CN104185979B (ja) |
IN (1) | IN2014MN01752A (ja) |
RU (1) | RU2014136528A (ja) |
TW (1) | TW201351262A (ja) |
WO (1) | WO2013136707A1 (ja) |
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
USD742413S1 (en) * | 2013-11-21 | 2015-11-03 | Microsoft Corporation | Display screen with icon |
USD747339S1 (en) * | 2013-03-13 | 2016-01-12 | Samsung Electronics Co., Ltd. | Display screen or portion thereof with graphical user interface |
USD749132S1 (en) * | 2013-11-22 | 2016-02-09 | Samsung Electronics Co., Ltd. | Display screen or portion thereof with icon |
USD749628S1 (en) * | 2013-06-13 | 2016-02-16 | Zya, Inc. | Display screen or portion thereof with icon |
USD751112S1 (en) * | 2013-06-13 | 2016-03-08 | Zya, Inc. | Display screen or portion thereof with icon |
US20160117540A1 (en) * | 2014-10-28 | 2016-04-28 | Fu Tai Hua Industry (Shenzhen) Co., Ltd. | Electronic device and control method thereof |
EP3068073A1 (en) * | 2015-03-12 | 2016-09-14 | Konica Minolta, Inc. | Conference support apparatus, conference support system, conference support program, and conference support method |
USD775663S1 (en) * | 2014-06-01 | 2017-01-03 | Apple Inc. | Display screen or portion thereof with a set of graphical user interfaces |
US20170115741A1 (en) * | 2015-10-26 | 2017-04-27 | Funai Electric Co., Ltd. | Input device |
USD827661S1 (en) * | 2015-12-04 | 2018-09-04 | Airbus Operations Gmbh | Display screen or portion thereof with graphical user interface |
USD872123S1 (en) * | 2016-06-10 | 2020-01-07 | Apple Inc. | Display screen or portion thereof with a group of icons |
USD877170S1 (en) | 2018-01-22 | 2020-03-03 | Apple Inc. | Electronic device with graphical user interface |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5962085B2 (ja) | 2012-03-15 | 2016-08-03 | ソニー株式会社 | 表示制御装置、その制御方法およびプログラム |
JP2015007949A (ja) | 2013-06-26 | 2015-01-15 | ソニー株式会社 | 表示装置、表示制御方法及びコンピュータプログラム |
US9619081B2 (en) * | 2014-02-04 | 2017-04-11 | Cirque Corporation | Using dynamically scaled linear correction to improve finger tracking linearity on touch sensors |
JP6525753B2 (ja) * | 2015-06-12 | 2019-06-05 | キヤノン株式会社 | 表示制御装置、その制御方法、およびプログラム |
CN108253634A (zh) * | 2018-01-12 | 2018-07-06 | 广东顺德圣堡莱热能科技有限公司 | 一种用于壁挂炉的触控装置及具有其的壁挂炉 |
KR102656447B1 (ko) * | 2018-02-27 | 2024-04-12 | 삼성전자주식회사 | 컨트롤러와 접촉된 신체 부위에 따라 그래픽 객체를 다르게 표시하는 방법 및 전자 장치 |
USD995557S1 (en) * | 2021-07-22 | 2023-08-15 | Intuit Inc. | Display screen with an animated graphical user interface showing a payment slider |
Citations (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6169538B1 (en) * | 1998-08-13 | 2001-01-02 | Motorola, Inc. | Method and apparatus for implementing a graphical user interface keyboard and a text buffer on electronic devices |
US6211856B1 (en) * | 1998-04-17 | 2001-04-03 | Sung M. Choi | Graphical user interface touch screen with an auto zoom feature |
US6243091B1 (en) * | 1997-11-21 | 2001-06-05 | International Business Machines Corporation | Global history view |
US20090139778A1 (en) * | 2007-11-30 | 2009-06-04 | Microsoft Corporation | User Input Using Proximity Sensing |
US20090204925A1 (en) * | 2008-02-08 | 2009-08-13 | Sony Ericsson Mobile Communications Ab | Active Desktop with Changeable Desktop Panels |
US20100103127A1 (en) * | 2007-02-23 | 2010-04-29 | Taeun Park | Virtual Keyboard Input System Using Pointing Apparatus In Digital Device |
US20100162171A1 (en) * | 2008-12-19 | 2010-06-24 | Verizon Data Services Llc | Visual address book and dialer |
US20100289752A1 (en) * | 2009-05-12 | 2010-11-18 | Jorgen Birkler | Displays for electronic devices that detect and respond to the size and/or angular orientation of user input objects |
US20100302212A1 (en) * | 2009-06-02 | 2010-12-02 | Microsoft Corporation | Touch personalization for a display device |
US20110316888A1 (en) * | 2010-06-28 | 2011-12-29 | Invensense, Inc. | Mobile device user interface combining input from motion sensors and other controls |
US20120235967A1 (en) * | 2009-11-30 | 2012-09-20 | Sharp Kabushiki Kaisha | Display device |
US8279241B2 (en) * | 2008-09-09 | 2012-10-02 | Microsoft Corporation | Zooming graphical user interface |
US8352884B2 (en) * | 2009-05-21 | 2013-01-08 | Sony Computer Entertainment Inc. | Dynamic reconfiguration of GUI display decomposition based on predictive model |
US20130019201A1 (en) * | 2011-07-11 | 2013-01-17 | Microsoft Corporation | Menu Configuration |
US8436828B1 (en) * | 2012-01-27 | 2013-05-07 | Google Inc. | Smart touchscreen key activation detection |
US20130120278A1 (en) * | 2008-11-11 | 2013-05-16 | Christian T. Cantrell | Biometric Adjustments for Touchscreens |
US20130152002A1 (en) * | 2011-12-11 | 2013-06-13 | Memphis Technologies Inc. | Data collection and analysis for adaptive user interfaces |
US20180129375A1 (en) * | 2011-02-18 | 2018-05-10 | Sony Corporation | Method and apparatus for navigating a hierarchical menu based user interface |
Family Cites Families (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5757358A (en) * | 1992-03-31 | 1998-05-26 | The United States Of America As Represented By The Secretary Of The Navy | Method and apparatus for enhancing computer-user selection of computer-displayed objects through dynamic selection area and constant visual feedback |
JPH08234909A (ja) * | 1995-02-23 | 1996-09-13 | Casio Comput Co Ltd | 入力装置 |
US6259436B1 (en) | 1998-12-22 | 2001-07-10 | Ericsson Inc. | Apparatus and method for determining selection of touchable items on a computer touchscreen by an imprecise touch |
WO2004070604A2 (en) * | 2003-02-05 | 2004-08-19 | Philips Intellectual Property & Standards Gmbh | Method of selecting objects of a user interface on a display screen |
KR101113234B1 (ko) * | 2004-12-27 | 2012-02-20 | 삼성전자주식회사 | 이미지 처리장치의 이미지 표시방법 |
KR100672539B1 (ko) * | 2005-08-12 | 2007-01-24 | 엘지전자 주식회사 | 터치스크린을 구비하는 이동통신단말기에서의 터치 입력인식 방법 및 이를 구현할 수 있는 이동통신단말기 |
US7843427B2 (en) | 2006-09-06 | 2010-11-30 | Apple Inc. | Methods for determining a cursor position from a finger contact with a touch screen display |
TWI460647B (zh) * | 2007-05-15 | 2014-11-11 | Htc Corp | 電子裝置與其軟體之使用者介面多重選擇方法 |
JP2009162691A (ja) * | 2008-01-09 | 2009-07-23 | Toyota Motor Corp | 車載情報端末 |
JP2009193423A (ja) * | 2008-02-15 | 2009-08-27 | Panasonic Corp | 電子機器の入力装置 |
JP2009265793A (ja) | 2008-04-23 | 2009-11-12 | Sony Ericsson Mobilecommunications Japan Inc | 表示操作装置、操作装置およびプログラム |
BRPI0918486A2 (pt) * | 2008-09-10 | 2017-03-21 | Opera Software Asa | método para selecionar um item em uma tela de exibição com uma interface de toque, aparelho, e, meio de armazenamento legível por computador |
JP5058187B2 (ja) * | 2009-02-05 | 2012-10-24 | シャープ株式会社 | 携帯情報端末 |
JP2010186442A (ja) * | 2009-02-13 | 2010-08-26 | Sharp Corp | 入力装置、及び入力制御方法 |
US20100265185A1 (en) * | 2009-04-17 | 2010-10-21 | Nokia Corporation | Method and Apparatus for Performing Operations Based on Touch Inputs |
KR20100134948A (ko) * | 2009-06-16 | 2010-12-24 | 삼성전자주식회사 | 터치스크린을 구비하는 장치의 메뉴 표시 방법 |
US20110057886A1 (en) * | 2009-09-10 | 2011-03-10 | Oliver Ng | Dynamic sizing of identifier on a touch-sensitive display |
KR101701492B1 (ko) * | 2009-10-16 | 2017-02-14 | 삼성전자주식회사 | 데이터 표시 방법 및 그를 수행하는 단말기 |
RU2556079C2 (ru) * | 2010-02-04 | 2015-07-10 | Нокиа Корпорейшн | Ввод данных пользователем |
US8982160B2 (en) * | 2010-04-16 | 2015-03-17 | Qualcomm, Incorporated | Apparatus and methods for dynamically correlating virtual keyboard dimensions to user finger size |
JP5615642B2 (ja) * | 2010-09-22 | 2014-10-29 | 京セラ株式会社 | 携帯端末、入力制御プログラム及び入力制御方法 |
JP2012113666A (ja) * | 2010-11-29 | 2012-06-14 | Canon Inc | 表示サイズ切換タッチパネル |
US8405627B2 (en) * | 2010-12-07 | 2013-03-26 | Sony Mobile Communications Ab | Touch input disambiguation |
CN103492981A (zh) * | 2011-04-19 | 2014-01-01 | 惠普发展公司,有限责任合伙企业 | 触摸屏选择 |
JP5962085B2 (ja) * | 2012-03-15 | 2016-08-03 | ソニー株式会社 | 表示制御装置、その制御方法およびプログラム |
-
2012
- 2012-03-15 JP JP2012058063A patent/JP5962085B2/ja active Active
-
2013
- 2013-03-01 EP EP13710579.7A patent/EP2826230A1/en not_active Withdrawn
- 2013-03-01 KR KR1020207026299A patent/KR102216482B1/ko active IP Right Grant
- 2013-03-01 IN IN1752MUN2014 patent/IN2014MN01752A/en unknown
- 2013-03-01 US US14/379,931 patent/US20150033162A1/en not_active Abandoned
- 2013-03-01 CN CN201380012653.5A patent/CN104185979B/zh not_active Expired - Fee Related
- 2013-03-01 WO PCT/JP2013/001276 patent/WO2013136707A1/en active Application Filing
- 2013-03-01 KR KR1020147024837A patent/KR101790838B1/ko active IP Right Grant
- 2013-03-01 KR KR1020177030316A patent/KR101960906B1/ko active IP Right Grant
- 2013-03-01 RU RU2014136528A patent/RU2014136528A/ru not_active Application Discontinuation
- 2013-03-01 KR KR1020217004012A patent/KR102278817B1/ko active IP Right Grant
- 2013-03-01 KR KR1020197030078A patent/KR102158439B1/ko active IP Right Grant
- 2013-03-01 KR KR1020197007493A patent/KR20190030778A/ko active Application Filing
- 2013-03-05 TW TW102107665A patent/TW201351262A/zh unknown
-
2016
- 2016-03-18 US US15/074,527 patent/US10007401B2/en active Active
-
2020
- 2020-09-24 US US17/030,936 patent/US11747958B2/en active Active
-
2023
- 2023-07-14 US US18/352,372 patent/US20230367455A1/en active Pending
Patent Citations (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6243091B1 (en) * | 1997-11-21 | 2001-06-05 | International Business Machines Corporation | Global history view |
US6211856B1 (en) * | 1998-04-17 | 2001-04-03 | Sung M. Choi | Graphical user interface touch screen with an auto zoom feature |
US6169538B1 (en) * | 1998-08-13 | 2001-01-02 | Motorola, Inc. | Method and apparatus for implementing a graphical user interface keyboard and a text buffer on electronic devices |
US20100103127A1 (en) * | 2007-02-23 | 2010-04-29 | Taeun Park | Virtual Keyboard Input System Using Pointing Apparatus In Digital Device |
US20090139778A1 (en) * | 2007-11-30 | 2009-06-04 | Microsoft Corporation | User Input Using Proximity Sensing |
US20090204925A1 (en) * | 2008-02-08 | 2009-08-13 | Sony Ericsson Mobile Communications Ab | Active Desktop with Changeable Desktop Panels |
US8279241B2 (en) * | 2008-09-09 | 2012-10-02 | Microsoft Corporation | Zooming graphical user interface |
US20130120278A1 (en) * | 2008-11-11 | 2013-05-16 | Christian T. Cantrell | Biometric Adjustments for Touchscreens |
US20100162171A1 (en) * | 2008-12-19 | 2010-06-24 | Verizon Data Services Llc | Visual address book and dialer |
US20100289752A1 (en) * | 2009-05-12 | 2010-11-18 | Jorgen Birkler | Displays for electronic devices that detect and respond to the size and/or angular orientation of user input objects |
US8352884B2 (en) * | 2009-05-21 | 2013-01-08 | Sony Computer Entertainment Inc. | Dynamic reconfiguration of GUI display decomposition based on predictive model |
US20100302212A1 (en) * | 2009-06-02 | 2010-12-02 | Microsoft Corporation | Touch personalization for a display device |
US20120235967A1 (en) * | 2009-11-30 | 2012-09-20 | Sharp Kabushiki Kaisha | Display device |
US20110316888A1 (en) * | 2010-06-28 | 2011-12-29 | Invensense, Inc. | Mobile device user interface combining input from motion sensors and other controls |
US20180129375A1 (en) * | 2011-02-18 | 2018-05-10 | Sony Corporation | Method and apparatus for navigating a hierarchical menu based user interface |
US20130019201A1 (en) * | 2011-07-11 | 2013-01-17 | Microsoft Corporation | Menu Configuration |
US20130152002A1 (en) * | 2011-12-11 | 2013-06-13 | Memphis Technologies Inc. | Data collection and analysis for adaptive user interfaces |
US8436828B1 (en) * | 2012-01-27 | 2013-05-07 | Google Inc. | Smart touchscreen key activation detection |
Cited By (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
USD747339S1 (en) * | 2013-03-13 | 2016-01-12 | Samsung Electronics Co., Ltd. | Display screen or portion thereof with graphical user interface |
USD749628S1 (en) * | 2013-06-13 | 2016-02-16 | Zya, Inc. | Display screen or portion thereof with icon |
USD751112S1 (en) * | 2013-06-13 | 2016-03-08 | Zya, Inc. | Display screen or portion thereof with icon |
USD742413S1 (en) * | 2013-11-21 | 2015-11-03 | Microsoft Corporation | Display screen with icon |
USD749132S1 (en) * | 2013-11-22 | 2016-02-09 | Samsung Electronics Co., Ltd. | Display screen or portion thereof with icon |
USD775663S1 (en) * | 2014-06-01 | 2017-01-03 | Apple Inc. | Display screen or portion thereof with a set of graphical user interfaces |
US9626546B2 (en) * | 2014-10-28 | 2017-04-18 | Fu Tai Hua Industry (Shenzhen) Co., Ltd. | Electronic device and control method thereof |
US20160117540A1 (en) * | 2014-10-28 | 2016-04-28 | Fu Tai Hua Industry (Shenzhen) Co., Ltd. | Electronic device and control method thereof |
EP3068073A1 (en) * | 2015-03-12 | 2016-09-14 | Konica Minolta, Inc. | Conference support apparatus, conference support system, conference support program, and conference support method |
US20170115741A1 (en) * | 2015-10-26 | 2017-04-27 | Funai Electric Co., Ltd. | Input device |
US10782789B2 (en) * | 2015-10-26 | 2020-09-22 | Funai Electric Co., Ltd. | Input device |
USD827661S1 (en) * | 2015-12-04 | 2018-09-04 | Airbus Operations Gmbh | Display screen or portion thereof with graphical user interface |
USD872123S1 (en) * | 2016-06-10 | 2020-01-07 | Apple Inc. | Display screen or portion thereof with a group of icons |
USD877170S1 (en) | 2018-01-22 | 2020-03-03 | Apple Inc. | Electronic device with graphical user interface |
USD928822S1 (en) | 2018-01-22 | 2021-08-24 | Apple Inc. | Electronic device with graphical user interface |
USD1009934S1 (en) | 2018-01-22 | 2024-01-02 | Apple Inc. | Display screen or portion thereof with group of graphical user interfaces |
Also Published As
Publication number | Publication date |
---|---|
TW201351262A (zh) | 2013-12-16 |
WO2013136707A1 (en) | 2013-09-19 |
KR102278817B1 (ko) | 2021-07-20 |
CN104185979B (zh) | 2017-09-26 |
US20230367455A1 (en) | 2023-11-16 |
KR102158439B1 (ko) | 2020-09-22 |
CN104185979A (zh) | 2014-12-03 |
KR101960906B1 (ko) | 2019-03-25 |
IN2014MN01752A (ja) | 2015-07-03 |
US11747958B2 (en) | 2023-09-05 |
RU2014136528A (ru) | 2016-03-27 |
JP2013191112A (ja) | 2013-09-26 |
US20210004130A1 (en) | 2021-01-07 |
KR20210018556A (ko) | 2021-02-17 |
US20160202856A1 (en) | 2016-07-14 |
KR20170122838A (ko) | 2017-11-06 |
EP2826230A1 (en) | 2015-01-21 |
KR20200108507A (ko) | 2020-09-18 |
KR20140146061A (ko) | 2014-12-24 |
KR20190030778A (ko) | 2019-03-22 |
US10007401B2 (en) | 2018-06-26 |
JP5962085B2 (ja) | 2016-08-03 |
KR20190119186A (ko) | 2019-10-21 |
KR102216482B1 (ko) | 2021-02-17 |
KR101790838B1 (ko) | 2017-10-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11747958B2 (en) | Information processing apparatus for responding to finger and hand operation inputs | |
US20160381282A1 (en) | Image processing apparatus and image processing method | |
US20150301595A1 (en) | Electronic apparatus and eye-gaze input method | |
US20150002436A1 (en) | Information processing apparatus, method, and non-transitory computer-readable medium | |
US20140036131A1 (en) | Method of capturing an image in a device and the device thereof | |
JP2017533602A (ja) | 電子デバイスのカメラ間の切り替え | |
JP2014010494A (ja) | 電子機器およびその制御方法、プログラム並びに記憶媒体 | |
US10313580B2 (en) | Electronic apparatus, control method therefor, and storage medium | |
CN112749590B (zh) | 目标检测方法、装置、计算机设备和计算机可读存储介质 | |
JP2021033539A (ja) | 電子機器、電子機器の制御方法、プログラム及び記憶媒体 | |
JP2014006781A (ja) | 情報処理装置、情報処理方法および記録媒体 | |
US11009991B2 (en) | Display control apparatus and control method for the display control apparatus | |
US11010045B2 (en) | Control apparatus, control method, and non-transitory computer readable medium | |
CN113242466B (zh) | 视频剪辑方法、装置、终端及存储介质 | |
JP2014056470A (ja) | 撮像装置、その制御方法、プログラムおよび記録媒体 | |
KR20190055871A (ko) | 전자장치 및 전자장치 제어방법 | |
JP2016181293A (ja) | 情報処理装置、情報処理方法および記録媒体 | |
JP2014135098A (ja) | 電子機器およびその制御方法、プログラム並びに記憶媒体 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: SONY CORPORATION, JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HIRO, DAISUKE;TAKAOKA, LYO;YANO, AKANE;AND OTHERS;SIGNING DATES FROM 20140627 TO 20140701;REEL/FRAME:033577/0427 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: ADVISORY ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |