US20130169565A1 - Computer-readable non-transitory storage medium, information processing apparatus, information processing system, and information processing method - Google Patents


Info

Publication number: US20130169565A1
Authority: US (United States)
Prior art keywords: information processing, unit, force, processing unit, computer
Prior art date: 2011-12-28
Legal status: Abandoned
Application number: US 13/680,948
Other languages: English (en)
Inventors: Kiyofumi Funahashi, Yasumasa Miyoshi, Hiroki Takuma
Current assignee: Nintendo Co., Ltd.
Original assignee: Nintendo Co., Ltd.
Priority date: 2011-12-28
Filing date: 2012-11-19
Application filed by Nintendo Co., Ltd.; assigned to Nintendo Co., Ltd. (assignors: Kiyofumi Funahashi, Yasumasa Miyoshi, Hiroki Takuma)
Publication of US20130169565A1 (en)
Priority to US 15/825,873 (published as US 10,732,742 B2)

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883: Interaction techniques using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text

Definitions

  • the technology shown here relates to a computer-readable non-transitory storage medium, an information processing apparatus, an information processing system, and an information processing method which perform information processing in accordance with a user's operation.
  • conventionally, there is an information processing apparatus with a touch panel which performs an operation by selecting a button representing a character, a symbol, and the like displayed on a screen of a display in accordance with an input performed on the touch panel by the user.
  • the conventional information processing apparatus simply uses various buttons or gesture inputs to realize various operations and cannot provide diversity to operations based on respective touch inputs performed on the touch panel. That is, the conventional information processing apparatus cannot provide a user interface which allows various operations.
  • a first objective of the present technology is to provide a new user interface which allows various operations.
  • a second objective of the present technology is to provide a user interface which allows more intuitive operations.
  • the present technology has the following features.
  • An example of the present technology is a computer-readable non-transitory storage medium having stored therein a program which causes a computer of an information processing apparatus including a force detection unit which detects a force applied to the information processing apparatus to function as an evaluation unit and an information processing unit.
  • the evaluation unit obtains a touch position from a touch input unit and evaluates an input area of a touch input.
  • the information processing unit performs predetermined information processing in accordance with the input area and the force detected by the force detection unit.
  • the information processing unit may perform the predetermined information processing in accordance with the input area obtained when the force is detected by the force detection unit and in accordance with the detected force.
  • the information processing unit may perform the predetermined information processing based on both of the input area and the detected force.
  • the predetermined information processing is performed based on both the contact area and the force applied to the apparatus, and thereby a wider variety of operations can be performed.
  • the information processing unit may perform the predetermined information processing based on the input area, the detected force, and the touch position.
  • the operation can be performed based on the three parameters representing the input area, the detected force, and the touch position, and thereby a wider variety of operations can be performed.
  • the force detection unit may be a movement sensor which detects a movement of the information processing apparatus and the information processing unit may perform the predetermined information processing in accordance with the input area and the detected movement.
  • a movement such as a vibration and the like of the information processing apparatus generated at a time of a touch input can be detected, and thus a force of the touch input can be indirectly detected without detecting a pressure of the touch input.
  • the information processing unit may perform the predetermined information processing in accordance with the input area and a magnitude and a direction of the movement detected by the movement sensor.
  • the force detection unit may be a pressure sensor which detects a pressure applied to the touch input unit and the detected force may be a magnitude of the pressure detected by the pressure sensor.
  • the pressure at the time of the touch input can be detected and thus the force of the touch input can be directly detected.
  • the information processing unit may perform the predetermined information processing when the input area and the detected force exceed a predetermined size and a predetermined magnitude, respectively.
  • the information processing unit may perform the predetermined information processing when the touch position is in a certain region.
  • the evaluation unit may evaluate the input area based on a number of touches performed on the touch input unit.
  • the information processing apparatus may include a storage unit and the storage unit may store a table of setting values defining information processes which correspond to the respective setting values.
  • the information processing unit may search the table of setting values for at least one of a setting value which corresponds to the input area and a setting value which corresponds to a force detection value representing the detected force and perform information processing which corresponds to the corresponding setting value.
  • the program may cause the computer of the information processing apparatus to further function as a display control unit which reads information from a storage unit and displays the information on a display.
  • the information processing performed by the information processing unit may be a process with regard to transformation of an object displayed on the display, and at least one of a transformation range and a degree of transformation of the object may be changed based on at least one of the input area and the detected force.
  • the information processing unit may further exert a predetermined shape change on the transformation of the object based on at least one of the input area and the detected force.
  • the information processing performed by the information processing unit may be a process with regard to an in-game effect and at least one of a target range and a magnitude of the in-game effect may be changed based on at least one of the input area and the detected force.
  • the program may cause the computer of the information processing apparatus to further function as a display control unit which reads information from a storage unit and displays the information on a display.
  • the information processing performed by the information processing unit may be a process with regard to image display of the display and at least one of a number of display images to transit and a transition speed may be changed based on at least one of the input area and the detected force.
  • the information processing performed by the information processing unit may be a process with regard to data reproduction, and at least one of a skip time and a speed of the data reproduction may be changed based on at least one of the input area and the detected force.
  • a user interface which allows a variety of operations can be provided.
  • FIG. 1 is a non-limiting example external view of an information processing apparatus
  • FIG. 2 is a non-limiting example block diagram illustrating an internal configuration of the information processing apparatus
  • FIG. 3 is a non-limiting example diagram illustrating an operation of the information processing apparatus with a single finger
  • FIG. 4 is a non-limiting example diagram illustrating an operation of the information processing apparatus with two fingers
  • FIG. 5 is a non-limiting example diagram illustrating a transformation pattern of a cubical object
  • FIG. 6 is a non-limiting example flow chart illustrating details of a display process
  • FIG. 7 is a non-limiting example diagram illustrating a display content of different sized objects
  • FIG. 8 is a non-limiting example flow chart illustrating details of a selection process of the different sized objects
  • FIG. 9 is a non-limiting example diagram illustrating a display content of objects having different depth values.
  • FIG. 10 is a non-limiting example flow chart illustrating details of a selection process of the objects having different depth values.
  • FIG. 1 is an external view of an information processing apparatus according to an exemplary embodiment of the present technology.
  • an information processing apparatus 1 is a hand-held information processing apparatus and includes a housing 1 a , a display 30 , a touch panel 40 , and an operation unit 50 .
  • the housing 1 a is of a size that can be held by one hand of a user.
  • the housing 1 a is provided with the display 30 and a front surface of the display 30 is covered with the touch panel 40 .
  • the housing 1 a is further provided with the operation unit 50 . Details of the display 30 , the touch panel 40 , and the operation unit 50 will be described below.
  • the touch panel 40 may be provided at any suitable location other than the front surface of the display 30 .
  • FIG. 2 is a block diagram illustrating an internal configuration of the information processing apparatus according to the exemplary embodiment of the present technology.
  • the information processing apparatus 1 includes an information processing unit 10 , a storage unit 20 , the display 30 , the touch panel 40 , the operation unit 50 , a wireless communication unit 60 , a sound input/output unit 70 , and an acceleration sensor 80 .
  • the information processing unit 10 reads an information processing program stored in the storage unit 20 and executes the information processing program, thereby performing information processing described below.
  • the information processing unit 10 includes a CPU (Central Processing Unit), for example.
  • the storage unit 20 stores the information processing program executed by the information processing unit 10 , image data to be displayed on the display 30 , sound data to be outputted from the sound input/output unit 70 , information from the touch panel 40 , the operation unit 50 , the wireless communication unit 60 , the sound input/output unit 70 , and the acceleration sensor 80 , and the like.
  • the storage unit 20 includes a RAM (Random Access Memory) or a ROM (Read Only Memory), for example.
  • the display 30 displays an image generated by the information processing unit 10 executing the information processing program and an image downloaded from a web site on the internet via the wireless communication unit 60 .
  • the display 30 includes an LCD (Liquid Crystal Display), for example.
  • the user of the information processing apparatus 1 brings his/her finger, a pen, or the like into contact with the touch panel 40 , whereby the touch panel 40 obtains information of a contact position and a contact area at regular time intervals and outputs the information to the information processing unit 10 .
  • the touch panel 40 includes an electrostatic capacitance type touch panel, for example.
  • the touch panel 40 can simultaneously detect positions of a plurality of points positioned on the touch panel 40 at regular intervals.
  • since a contact region of a finger is sufficiently wider than the detection interval, the touch panel 40 simultaneously detects a plurality of positions within the contact region and outputs the plurality of positions to the information processing unit 10 . Because the touch panel 40 covers the display 30 , positions on the touch panel 40 are hereinafter referred to as positions on a screen of the display 30 for ease of explanation.
  • the information processing unit 10 stores the detected positions temporarily in the storage unit 20 .
  • the touch panel 40 has a rectangle shape, and a direction parallel to one side of the rectangle is defined as an X-axial direction while a direction parallel to a side adjoining the one side is defined as a Y-axial direction.
  • a contact position is defined by an X coordinate and a Y coordinate, for example.
  • the X coordinate is obtained by specifying a position having a maximum X value and a position having a minimum X value among positions forming the contact region and obtaining a center position between these positions.
  • the Y coordinate is obtained by specifying a position having a maximum Y value and a position having a minimum Y value among the positions forming the contact region and obtaining a center position of these positions.
  • a contact area is obtained by, for example: specifying the position having the maximum X value and the position having the minimum X value among positions forming the contact region and defining a difference between the X values as a length of a short axis (or long axis); specifying the position having the maximum Y value and the position having the minimum Y value among the positions forming the contact region and defining a difference between the Y values as a length of a long axis (or short axis); and obtaining an area of an ellipse using these lengths.
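  • For illustration, the following is a minimal sketch of the center-position and ellipse-area computation described above; the function name and the use of Python are assumptions for this sketch, not part of the original disclosure.

```python
import math
from typing import List, Tuple

def contact_center_and_area(points: List[Tuple[float, float]]) -> Tuple[Tuple[float, float], float]:
    """Estimate the contact center and contact area from the positions
    detected within one contact region, as described above."""
    xs = [x for x, _ in points]
    ys = [y for _, y in points]
    # Center: midpoint of the extreme X values and of the extreme Y values.
    center = ((max(xs) + min(xs)) / 2.0, (max(ys) + min(ys)) / 2.0)
    # Area: treat the X extent and the Y extent as the two axes of an ellipse.
    x_axis = max(xs) - min(xs)
    y_axis = max(ys) - min(ys)
    area = math.pi * (x_axis / 2.0) * (y_axis / 2.0)
    return center, area
```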
  • the contact area may be evaluated based on a number of touches (e.g., a number of fingers, touch pens, and the like which touch the touch panel 40 ) performed on the touch panel 40 .
  • the contact area is determined to be small in later-described FIG. 3 because a single finger is in contact with the touch panel 40 , while the contact area is determined to be large in later-described FIG. 4 because two fingers are in contact with the touch panel 40 .
  • the operation unit 50 obtains operation information in accordance with an operation performed by the user and outputs the operation information to the information processing unit 10 .
  • the operation unit 50 includes operation buttons which can be pressed down by the user, for example.
  • the wireless communication unit 60 transmits information from the information processing unit 10 to a server on the internet and another information processing apparatus and transmits information from the server on the internet and the other information processing apparatus to the information processing unit 10 .
  • the wireless communication unit 60 includes a module having a function of connecting to a wireless LAN by using a method based on, for example, the IEEE 802.11b/g standard.
  • the sound input/output unit 70 outputs a sound represented by the sound data read by the information processing unit 10 from the storage unit 20 and outputs sound data representing a sound inputted from the outside of the information processing apparatus 1 to the information processing unit 10 .
  • the sound input/output unit 70 includes a microphone and a speaker, for example.
  • the acceleration sensor 80 detects accelerations in three-axial directions (X-axis, Y-axis, and Z-axial directions) shown in FIG. 1 and outputs acceleration values representing the detected accelerations to the information processing unit 10 .
  • in the display process according to the exemplary embodiment, a cubical object is displayed on the display 30 and transformed based on a contact area obtained when the user's finger contacts the touch panel 40 and a force applied to the information processing apparatus 1 by the contact.
  • the cubical object will be described below.
  • the display process according to the exemplary embodiment is applicable also to an object such as a plane object other than the cubical object.
  • FIG. 3 is a diagram illustrating an operation of the information processing apparatus with a single finger.
  • FIG. 4 is a diagram illustrating an operation of the information processing apparatus with two fingers.
  • as shown in FIG. 3 and FIG. 4 , there are a case in which a touch input is performed by bringing a single finger 5 into contact with the touch panel 40 and a case in which a touch input is performed by bringing two fingers 5 , 6 into contact with the touch panel 40 .
  • the touch input with the single finger and the touch input with the two fingers are different from each other in that contact areas of respective contact regions 40 a , 40 b on the touch panel 40 are different. That is, the contact region 40 a in FIG. 3 is a contact area corresponding to a tip of a single finger and the contact region 40 b in FIG. 4 is a contact area corresponding to tips of two fingers which are twice the contact area corresponding to the tip of the single finger.
  • the number of fingers used for the touch input can be detected based on the contact area. It should be noted that, other than the number of fingers, which of a fingertip, a finger pad, a touch pen (stylus), and the like is used to perform a touch input on the touch panel 40 can be identified based on the contact area.
  • the information processing apparatus 1 includes the acceleration sensor 80 in the housing 1 a and detects a force applied to the information processing apparatus 1 at a time of the touch input based on a magnitude of an acceleration detected by the acceleration sensor 80 .
  • when the user operates the information processing apparatus 1 by a touch input, the user performs the touch input by pressing his/her finger against the touch panel 40 .
  • the housing 1 a slightly shakes when the finger is pressed against the touch panel 40 .
  • when the finger is pressed against the touch panel 40 more strongly, the housing 1 a shakes more greatly.
  • the acceleration sensor 80 is a movement detection sensor which detects a movement of the information processing apparatus 1 (housing 1 a ), and the detected movement can be regarded as the force applied to the information processing apparatus 1 at the time of the touch input.
  • the detected movement can also be regarded as a vibration applied to the information processing apparatus 1 at the time of the touch input.
  • the acceleration sensor may be various types of acceleration sensors, such as, for example, an acceleration sensor which outputs an electric signal proportional to a detected acceleration, an acceleration sensor which outputs a detected acceleration as a mathematical value, and the like.
  • the acceleration sensor may be a contact type mechanical acceleration sensor which outputs a detection signal as a switch is turned on when an applied force (acceleration) exceeds a predetermined magnitude.
  • a sensor such as a gyro sensor which detects an angular velocity or a magnetic sensor which detects a direction may be used as the movement detection sensor instead of the acceleration sensor.
  • a function equivalent to that of the movement detection sensor can be realized by analyzing images which are sequentially captured by a camera and detecting differences among the images.
  • a pressure sensor may be provided in the touch panel 40 so that a pressure applied to the touch panel 40 by the touch input can be directly detected and the detected pressure may be regarded as the force applied to the information processing apparatus 1 at the time of the touch input. Accordingly, with a single operation of performing the touch input to the touch panel 40 , the user can obtain two types of parameters representing the contact area and the force applied to the information processing apparatus 1 .
  • FIG. 5 is a diagram illustrating a transformation pattern of a cubical object displayed on the display.
  • two types of ranges (hereinafter referred to as transformation ranges) for performing a transformation effect on the cubical object are set based on whether the contact area is greater than or equal to a predetermined first threshold value.
  • two types of depth/height (hereinafter referred to as degrees of transformation) of the transformation effect performed on the cubical object are set based on whether the magnitude of the acceleration is greater than or equal to a predetermined second threshold value. That is, there are two transformation ranges and two degrees of transformation, and thus there are four typical transformation modes of the cubical object.
  • a coordinate system in a real space is associated with a coordinate system in a virtual space, thereby a more intuitive operation is realized.
  • a parameter representing an expanse on an X-Y plane in FIG. 3 and FIG. 4 which is a coordinate plane of the contact area is used as a transformation range. That is, the cubical object is transformed within a range corresponding to an area on the touch panel 40 which is actually contacted by the user's finger.
  • a parameter representing a depth/height with respect to a Z-axial direction which is a pressing direction of a touch operation in FIG. 3 and FIG. 4 is used as a degree of transformation. That is, the user can perform a transformation on the cubical object with an operational feeling as if the user actually presses the cubical object with his/her finger.
  • the parameter may represent an expanse on another plane and a depth in another axial direction.
  • a transformation mode a illustrates a transformation when a weak touch input is performed with the single finger 5 as shown in FIG. 3 , that is, when the contact area is less than the first threshold value and the magnitude of the acceleration is less than the second threshold value.
  • the cubical object has a shallow concave portion thereon in a narrow range. This transformation is accompanied by a transformed shape change associated with an image of the cubical object being hit by an object having a small apical surface with a weak force.
  • a transformation mode b illustrates a case of a touch input with the single finger 5 .
  • the transformation mode b illustrates a transformation when a strong touch input is performed, that is, when the contact area is less than the first threshold value and the magnitude of the acceleration is greater than or equal to the second threshold value.
  • the cubical object has a deep and sharp concave portion thereon in a narrow range. This transformation is accompanied by a transformed shape change associated with an image of the cubical object being hit by an object having a small apical surface with a strong force.
  • a transformation mode c illustrates a transformation when a weak touch input is performed with the two fingers 5 , 6 as shown in FIG. 4 , that is, when the contact area is greater than or equal to the first threshold value and the magnitude of the acceleration is less than the second threshold value.
  • the cubical object has a shallow concave portion thereon in a wide range. This transformation is accompanied by a transformed shape change associated with an image of the cubical object being hit by an object having a large apical surface with a weak force.
  • a transformation mode d illustrates a transformation in a case of a touch input with the two fingers 5 , 6 .
  • the transformation mode d illustrates a transformation when a strong touch input is performed, that is, when the contact area is greater than or equal to the first threshold value and the magnitude of the acceleration is greater than or equal to the second threshold value.
  • the cubical object has a deep and sharp concave portion thereon in a wide range. This transformation accompanies a transformed shape change associated with an image of the cubical object being hit by an object having a large apical surface with a strong force. Needless to say, these presentation effects are not indispensable. Only the transformation range and the degree of transformation may be changed in accordance with the contact area and the magnitude of the acceleration without changing the transformed shape in the transformation range.
  • the user can obtain two types of parameters representing the contact area and the force applied to the information processing apparatus 1 by the single operation of performing the touch input to the touch panel 40 and transform the target object in various manners based on the parameters.
  • FIG. 6 is a flow chart illustrating details of the display process according to the first exemplary embodiment.
  • the information processing unit 10 executes the information processing program stored in the storage unit 20 , thereby the process shown in the flow chart of FIG. 6 is performed.
  • the user holds the housing 1 a of the information processing apparatus 1 with one hand and performs a touch input for transforming a cubical object on the display 30 with the finger 5 or the fingers 5 , 6 of the other hand.
  • the information processing unit 10 obtains a plurality of positions on the screen of the display 30 from the touch panel 40 (step S 1 ), and determines whether any of the obtained plurality of positions is in a certain region set in the cubical object (step S 2 ). When the determination result is that none of the obtained positions is in the certain region, the information processing unit 10 ends the processing.
  • the information processing unit 10 calculates a center contact position representing the center position of a region on the touch panel 40 contacted by the finger based on the plurality of positions on the screen of the display 30 obtained from the touch panel 40 (step S 3 ), and calculates a contact area representing an area of the region on the touch panel 40 contacted by the finger (step S 4 ).
  • the information processing unit 10 sets a transformation range of the cubical object based on the calculated contact area (step S 5 ). Specifically, a first threshold value for setting the transformation range is prestored in the storage unit 20 and the information processing unit 10 reads the first threshold value from the storage unit 20 . Then, the information processing unit 10 compares the calculated contact area with the first threshold value and sets the transformation range to either a large range or a small range based on a comparison result. For example, when the calculated contact area is greater than or equal to the first threshold value, the information processing unit 10 sets the transformation range to the large range of the group of the mode c or d shown in FIG. 5 . Meanwhile, when the calculated contact area is less than the first threshold value, the information processing unit 10 sets the transformation range to the small range of the group of the mode a or b shown in FIG. 5 .
  • the information processing unit 10 obtains an acceleration value from the acceleration sensor 80 (step S 6 ), and sets a degree of transformation for the cubical object based on the obtained acceleration value (step S 7 ). Specifically, a second threshold value for setting the degree of transformation is prestored in the storage unit 20 and the information processing unit 10 reads the second threshold value from the storage unit 20 . Then, the information processing unit 10 calculates a magnitude of the acceleration based on the obtained acceleration value, compares the calculated magnitude of the acceleration with the second threshold value, and sets the degree of transformation to either a great degree or a low degree based on a comparison result.
  • when the calculated magnitude of the acceleration is less than the second threshold value, the information processing unit 10 sets the degree of transformation to the low degree of the group of the mode a or c shown in FIG. 5 . Meanwhile, when the calculated magnitude of the acceleration is greater than or equal to the second threshold value, the information processing unit 10 sets the degree of transformation to the great degree of the group of the mode b or d shown in FIG. 5 .
  • based on the set transformation range and the set degree of transformation, the information processing unit 10 performs a well-known polygon transformation process for exerting a transformation effect on the cubical object with the calculated center contact position as the center (step S 8 ), and displays the cubical object after the transformation process on the display 30 (step S 9 ).
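  • A minimal sketch of the decision logic in steps S 1 to S 8 follows; it is an illustrative assumption, not the original implementation. It treats the certain region as a set of screen positions, takes the acceleration as an (ax, ay, az) tuple, and uses the mode labels of FIG. 5.

```python
import math

def display_process(touch_positions, accel, object_region,
                    first_threshold, second_threshold):
    """Sketch of steps S1-S8: pick one of the four transformation modes
    (a-d in FIG. 5) from the contact area and the acceleration magnitude."""
    # S1/S2: do nothing unless at least one detected position is on the object.
    if not any(p in object_region for p in touch_positions):
        return None
    # S3/S4: center contact position and contact area (ellipse estimate,
    # as in the earlier sketch).
    xs = [x for x, _ in touch_positions]
    ys = [y for _, y in touch_positions]
    center = ((max(xs) + min(xs)) / 2.0, (max(ys) + min(ys)) / 2.0)
    area = math.pi * (max(xs) - min(xs)) / 2.0 * (max(ys) - min(ys)) / 2.0
    # S5: transformation range (wide or narrow) from the contact area.
    wide_range = area >= first_threshold
    # S6/S7: degree of transformation (deep or shallow) from the acceleration.
    magnitude = math.sqrt(sum(c ** 2 for c in accel))
    deep = magnitude >= second_threshold
    # S8: select the transformation mode; S9 would redraw the transformed object.
    mode = {(False, False): "a", (False, True): "b",
            (True, False): "c", (True, True): "d"}[(wide_range, deep)]
    return center, mode
```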
  • the cubical object can be transformed in various manners based on the contact area obtained by the touch input and the force applied to the information processing apparatus 1 at the time of the touch input. Variations of transformation of the object by the touch input can be increased.
  • in the exemplary embodiment described above, the contact area and the magnitude of the acceleration on the touch panel 40 are evaluated by comparing them with the respective predetermined threshold values, and each is determined to be either large or small, that is, reduced to one of two values.
  • alternatively, a transformation may be performed continuously based on the contact area and the magnitude of the acceleration as they are.
  • a table defining setting values corresponding to ranges of the contact area and the magnitude of the acceleration is stored in the storage unit 20 .
  • a setting value which corresponds to a range in which the calculated contact area or the magnitude of the acceleration falls may be set as the transformation range or the degree of transformation.
  • Each setting value allows a predetermined range of values for the corresponding contact area or the corresponding magnitude of the acceleration.
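  • A small sketch of such a table of setting values follows; the ranges and setting names are purely illustrative assumptions.

```python
# Hypothetical table: each row maps a range of contact-area values to a
# setting value; an analogous table could map acceleration magnitudes to
# degrees of transformation.
AREA_SETTING_TABLE = [
    ((0.0, 400.0), "narrow"),
    ((400.0, 1600.0), "medium"),
    ((1600.0, float("inf")), "wide"),
]

def lookup_setting(value, table):
    """Return the setting value whose range contains the measured value."""
    for (low, high), setting in table:
        if low <= value < high:
            return setting
    return None
```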
  • in step S 7 described above, setting is performed based only on the magnitude of the acceleration generated at the time of the touch input without considering an axial direction of the acceleration.
  • however, a great acceleration is considered to be generated in the Z-axial direction in FIG. 3 and FIG. 4 at the time of the touch input. Therefore, among the components of the acceleration detected by the acceleration sensor 80 , only the component in the Z-axial direction, which is a direction orthogonal to a touch surface of the touch panel 40 , may be used. Needless to say, a magnitude of the force applied to the housing 1 a may be evaluated by taking into account all of the components including the X-axis and Y-axis components. Further, because the acceleration sensor 80 constantly outputs detection values which include a gravity acceleration component, the degree of transformation may be set based on a degree of change in the acceleration obtained by differentiating the detection values (a first derivative) to remove the constant gravity acceleration component.
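  • The following sketch illustrates the idea of using only the Z-axis component and removing the constant gravity contribution by differentiation; the function name and the use of the peak rate of change are assumptions for illustration.

```python
def touch_force_from_accel(samples, dt):
    """Estimate the strength of a tap from acceleration samples (ax, ay, az).

    Only the Z component (orthogonal to the touch surface) is used, and the
    constant gravity contribution is removed by taking the first difference
    (a discrete first derivative) between consecutive samples."""
    z = [sample[2] for sample in samples]
    rates = [(z[i + 1] - z[i]) / dt for i in range(len(z) - 1)]
    # The peak rate of change is taken as the strength of the touch input.
    return max(abs(r) for r in rates) if rates else 0.0
```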
  • in the exemplary embodiment described above, the cubical object is transformed. Alternatively, an in-game effect may be performed on an object in a game.
  • for example, depending on the contact area and the detected force, an in-game effect as if hitting with a fist, poking with a fingertip, slapping with a palm, or stroking with a palm may be performed.
  • these effects can also be applied to an attack effect in a role-playing game, an action game, and the like.
  • a selection range of a target on which to perform an effect, that is, an attack target, is set in accordance with a contact area
  • a magnitude of the attack effect is set in accordance with a force applied to the information processing apparatus 1 .
  • for example, when the contact area is small and the applied force is large, a strong attack is performed on a single enemy; when the contact area is small and the applied force is small, a weak attack is performed on the single enemy.
  • an in-game effect and a target of the in-game effect may be selected in accordance with a touch position. For example, when a part of the touch panel 40 on which an enemy is displayed is touched, an attack magic may be performed on the enemy. Meanwhile, when a part of the touch panel 40 on which a friend is displayed is touched, a recovering magic or a defense magic may be performed on the friend.
  • the range and the magnitude of the in-game effect are set separately in accordance with the contact area and the acceleration value, respectively.
  • information processing with regard to an output may be performed based on both of an input area of a touch input and a force detected by a force detection unit. For example, in a game aimed at hitting a ball further in a virtual space, an input area to the touch panel 40 is converted into a first parameter value, an acceleration detected by the force detection unit is converted into a second parameter value, and the first parameter value and the second parameter value are added. The greater a value obtained by adding the parameter values is, the further the ball may be hit.
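  • As a sketch of the ball-hitting example above, the two detected quantities can each be converted into a parameter and summed; the function name and scale factors below are illustrative assumptions only.

```python
def hit_distance(contact_area, accel_magnitude,
                 area_scale=0.05, accel_scale=2.0):
    """Convert the input area and the detected acceleration into two
    parameters and add them; the larger the sum, the further the ball flies."""
    first_parameter = contact_area * area_scale
    second_parameter = accel_magnitude * accel_scale
    return first_parameter + second_parameter
```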
  • in the exemplary embodiment described above, a transformation effect is exerted on an object in an application such as a game.
  • however, an operation based on the contact area and the force is applicable also to an operation input in an application for image viewing, moving image playback, and the like.
  • the operation input in the application for image viewing, moving image playback and the like will be described in detail.
  • an image, a moving image, and the like stored in the storage unit 20 are read and displayed on the display 30 .
  • the image, the moving image, and the like are operated based on a contact area and a force applied to the information processing apparatus 1 obtained by a touch input. Details of a display process according to this modification are generally the same as the display process of the flow chart shown in FIG. 6 , and thus description thereof will be made with reference to FIGS. 3 , 4 , and 6 in the same manner.
  • the user holds the housing 1 a of the information processing apparatus 1 with one hand and performs a touch input for performing an operation on an image and the like on the display 30 with the finger 5 or the fingers 5 , 6 of the other hand.
  • the information processing unit 10 obtains a plurality of positions on the screen of the display 30 from the touch panel 40 (step S 1 ), and determines whether any of the obtained plurality of positions is in a certain region set in an operation button of the application (step S 2 ). When the determination result is that none of the obtained positions is in the certain region, the information processing unit 10 ends the processing.
  • the information processing unit 10 calculates a center contact position representing the center position of a region on the touch panel 40 contacted by the finger based on the plurality of positions on the screen of the display 30 obtained from the touch panel 40 (step S 3 ), and calculates a contact area representing an area of the region on the touch panel 40 contacted by the finger (step S 4 ).
  • the information processing unit 10 sets a range of an amount of operation to be performed on a target such as the image and the like based on the calculated contact area (step S 5 ).
  • the amount of operation corresponds to an operation content assigned to a corresponding operation button.
  • the information processing unit 10 obtains an acceleration value from the acceleration sensor 80 (step S 6 ), and sets a magnitude of an effect to be exerted by an operation performed on the target such as the image and the like based on the obtained acceleration value (step S 7 ).
  • the information processing unit 10 performs an operation process on the image and the like based on the thus set range of the amount of operation and the magnitude of the operation effect (step S 8 ), and causes the display 30 to display thereon a series of operations performed on the image and the like and results of the operations (step S 9 ).
  • in this modification, step S 3 of obtaining the center contact position is not particularly necessary.
  • alternatively, the information processing unit 10 may determine which operation button is pressed based on the center contact position.
  • a number of images to transit in a slide show which sequentially displays display images is set in accordance with a contact area and a transition speed of the display images is set in accordance with a magnitude of a force applied to the information processing apparatus 1 at a time of a touch input.
  • when the contact area is small and the force applied to the information processing apparatus 1 is large, the number of images to transit is low and the transition speed is high.
  • when the contact area is small and the force applied to the information processing apparatus 1 is small, the number of images to transit is low and the transition speed is low.
  • when the contact area is large and the force applied to the information processing apparatus 1 is large, the number of images to transit is high and the transition speed is high. When the contact area is large and the force applied to the information processing apparatus 1 is small, the number of images to transit is high and the transition speed is low.
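  • A minimal sketch of this slide-show behaviour follows; the concrete number of images and the speed labels are assumptions for illustration.

```python
def slide_show_step(contact_area, force, area_threshold, force_threshold):
    """The contact area selects how many images to advance and the force
    applied to the apparatus selects the transition speed."""
    images_to_transit = 10 if contact_area >= area_threshold else 1
    transition_speed = "fast" if force >= force_threshold else "slow"
    return images_to_transit, transition_speed
```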
  • in a case of an electronic book, a number of pages (text images) to be turned is set in accordance with a contact area and a page turning speed is set in accordance with a magnitude of a force applied to the information processing apparatus 1 at a time of a touch input.
  • when the contact area is small and the force applied to the information processing apparatus 1 is large, the number of pages to be turned is low and the turning speed is high.
  • when the contact area is large and the force applied to the information processing apparatus 1 is large, the number of pages to be turned is high and the turning speed is high.
  • when the contact area is large and the force applied to the information processing apparatus 1 is small, the number of pages to be turned is high and the turning speed is low.
  • a skip time (number of frames) of moving image and music data to be reproduced is set in accordance with a contact area, and a reproduction speed of the moving image and music data is set in accordance with a magnitude of a force applied to the information processing apparatus 1 at a time of a touch input.
  • for example, in accordance with the combination of the contact area and the force, the skip time is reduced and playback is skipped for a period of time equivalent to the reduced time, the skip time is reduced and fast-forward is performed, the skip time is increased and playback is skipped for a period of time equivalent to the increased time, or the skip time is increased and fast-forward is performed.
  • a specific operation may be performed when both of the contact area and the force applied to the information processing apparatus 1 satisfy respective predetermined conditions.
  • for example, a normal tap operation performs a zoom-in (enlarged display). When both of a detection value of the contact area and a detection value of the force at a time of the tap operation are greater than or equal to respective predetermined threshold values, a zoom-out (reduced display), which is an operation opposite to the normal tap operation, is performed. This is applicable also to a slide operation and a flick operation other than the tap operation.
  • Both of the contact area and the force applied to the information processing apparatus 1 are thus used as input conditions of the touch operation, thereby variations of the touch operation can be increased.
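  • A sketch of the tap variation described above follows; the operation names and threshold handling are illustrative assumptions.

```python
def tap_operation(contact_area, force, area_threshold, force_threshold):
    """A normal tap zooms in; when both the contact area and the force are at
    or above their thresholds, the opposite operation (zoom out) is performed."""
    if contact_area >= area_threshold and force >= force_threshold:
        return "zoom_out"
    return "zoom_in"
```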
  • an operation content can be changed in accordance with whether a touch position is in a certain region.
  • the fast-forward may be performed when the user touches the right side of the touch panel 40 and a fast-rewind may be performed when the user touches the left side of the touch panel 40 .
  • the operation content may be changed based on a direction of an acceleration detected by the acceleration sensor 80 .
  • for example, the operation content may be changed to a display in reverse order, a fast-rewind, and the like, as appropriate for each application, by distinguishing between an acceleration direction in the depth direction (−Z direction) generated at a time of a normal touch input and an acceleration direction in the forward direction (+Z direction) generated at a time of moving the housing 1 a to bring the touch panel 40 into contact with a finger. Resultant values of the accelerations may also be used.
  • operation parameters for changing an operation content may be associated with a contact area and a magnitude of an acceleration in advance and set as unique values for the application.
  • the parameters may be set not only for an application but also for software such as an OS (operating system) of the information processing apparatus 1 which provides basic functions of the information processing apparatus 1 .
  • a table of operations or a database for associating the contact area and the magnitude of the acceleration with the operation parameters may be stored in the storage unit 20 for each software to be executed or for common use.
  • a predetermined operation command may be executed by using a variable of the operation command corresponding to the contact area and the magnitude of the acceleration detected at a time of a touch input.
  • a structure and an internal configuration of an information processing apparatus according to the second exemplary embodiment are the same as those of the first exemplary embodiment, and thus description thereof will be omitted.
  • a button selection operation and the like is identified by detecting a center position of a range on an operation screen contacted by a user's finger, a pen, or the like and determining on which button or selection region the center position is.
  • information terminals have become smaller and many buttons and selection regions are provided on a small display screen in many cases.
  • when an operation content is identified based only on the center position of the contact range, an erroneous pressing may occur because of a slight displacement, which impedes a secure operation.
  • in the second exemplary embodiment, a user operation is recognized based on a size of a contact area and the like in addition to the center position of the contact range, thereby eliminating the above problem.
  • FIG. 7 shows, in a case where the user selects any of a plurality of objects displayed on the display screen 30 by a touch operation, an enlarged display of two of the displayed plurality of objects (an outer frame in FIG. 7 represents a range of a part of the display screen 30 ).
  • an object 101 is displayed so as to be larger than an object 102 .
  • when the center of the touch position is in a region where only the object 101 is displayed, the information processing unit 10 determines that the object 101 is selected.
  • when the center of the touch position is in a region where only the object 102 is displayed, the information processing unit 10 determines that the object 102 is selected.
  • when the center of the touch position is in the region between or overlapping the two objects and the contact area is large, the information processing unit 10 determines that the object 101 which is displayed so as to be larger is selected; meanwhile, when the contact area is small, the information processing unit 10 determines that the object 102 which is displayed so as to be smaller is selected.
  • the size of the contact area is determined by reading the largest one of contact areas detected during a period of time from a contact start (touch on) to a contact end (touch off), for example.
  • the larger an object is, the larger a determination region of the object becomes.
  • a determination region of each object is defined as being the same in position and size as a display region of the object. Accordingly, the object 101 has a determination region larger than a determination region of the object 102 . Consequently, due to the user's psychology, the contact area is likely to be large when the object 101 is selected and the contact area is likely to be small when the object 102 is selected. By thus applying the user's psychology to determination of selection, an operation which the user desires can easily be realized.
  • the user performs a touch input on the touch panel 40 using a finger or a touch pen while holding the information processing apparatus 1 .
  • the information processing unit 10 positions a plurality of objects which are selection targets in respective predetermined regions (step S 11 ), and determines a certain region based on a positional relation of the plurality of objects (step S 12 ).
  • a size and a shape of the certain region may not be precise and the certain region may be determined based on any method other than the above method.
  • the information processing unit 10 detects a coordinate group of a position on the touch panel 40 on which a touch is performed (step S 13 ), calculates to obtain position coordinates which are the center of the coordinate group (step S 14 ), and determines whether the center of the touch position is in the certain region (step S 15 ). Then, when a determination result is that the center of the touch position is not in the certain region (NO in step S 15 ), the information processing unit 10 determines whether the center of the touch position is in a region on an object (step S 20 ).
  • when the determination result is that the center of the touch position is in the region on an object (YES in step S 20 ), the information processing unit 10 determines that the object is selected (step S 21 ) and performs a display based on the selection on the display 30 (step S 22 ). Meanwhile, when the determination result is that the center of the touch position is not in the region on the object (NO in step S 20 ), the information processing unit 10 determines that a selection operation is not performed and performs a display for when no selection operation is performed on the display 30 (step S 22 ).
  • the information processing unit 10 calculates an area on the touch panel 40 contacted by the user based on the coordinate group of the contact position (step S 16 ). The information processing unit 10 determines whether the contact area is larger than a predetermined threshold value (S 17 ). When a determination result is that the contact area is larger than the predetermined threshold value (YES in step S 17 ), the information processing unit 10 determines that the object having a large area on the display screen is selected (step S 18 ) and performs a display based on the selection result (step S 22 ).
  • meanwhile, when the determination result is that the contact area is not larger than the predetermined threshold value (NO in step S 17 ), the information processing unit 10 determines that the object having a small area on the display screen is selected (step S 19 ) and performs a display based on the selection result (step S 22 ).
  • the threshold value may be uniformly defined or may be defined differently for each object.
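  • The selection logic of FIG. 8 can be sketched as follows; the region objects with a contains() method and the attribute names are assumptions made for this sketch.

```python
def select_by_size(center, contact_area, large_object, small_object,
                   certain_region, area_threshold):
    """Steps S15-S21 in outline: inside the certain region the contact area
    decides between the larger and the smaller object; outside it, the object
    under the touch center (if any) is selected."""
    if certain_region.contains(center):                          # step S15
        # Steps S17-S19: large area -> larger object, small area -> smaller.
        return large_object if contact_area > area_threshold else small_object
    for candidate in (large_object, small_object):               # step S20
        if candidate.region.contains(center):
            return candidate                                      # step S21
    return None                                                   # no selection
```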
  • FIG. 9 shows, in a case where the user selects any of a plurality of objects displayed on the display 30 by a touch operation, an enlarged display of two objects on the display screen.
  • An object 104 and an object 105 are displayed so as to overlap each other in a region 106 .
  • when a touch is performed on a part where only the object 104 is displayed, the information processing unit 10 determines that the object 104 is selected.
  • when a touch is performed on a part where only the object 105 is displayed, the information processing unit 10 determines that the object 105 is selected.
  • when a touch is performed on the region 106 in which the two objects overlap and the contact area is large, the information processing unit 10 determines that the object 105 which is displayed so as to be farther from the user in the depth direction is selected. Meanwhile, when the contact area is small, the information processing unit 10 determines that the object 104 which is displayed so as to be closer to the user in the depth direction is selected.
  • the size of the contact area is determined by reading the largest one of contact areas detected during a period of time from a contact start (touch on) to a contact end (touch off), for example.
  • the user performs a touch input on the touch panel 40 using a finger or a touch pen while holding the information processing apparatus 1 .
  • the information processing unit 10 positions two objects which are selection targets and a virtual camera which captures the two objects in a virtual space (step S 31 ).
  • the information processing unit 10 detects a coordinate group of a position on the touch panel 40 on which a touch is performed (step S 32 ) and calculates to obtain position coordinates which are the center of the coordinate group (step S 33 ). Further, the information processing unit 10 emits a ray in the virtual space based on the center of touch position and determines whether the ray contacts a plurality of objects (step S 34 ). When a determination result is that the ray contacts two objects (YES in step S 34 ), the information processing unit 10 proceeds the processing to step S 35 . At this time, the information processing unit 10 stores an order in which the ray contacts the two objects and understands an anteroposterior relationship of the two objects. Meanwhile, when the determination result is that the ray contacts only one object or does not contact any object (NO in step S 34 ), the information processing unit 10 proceeds the processing to step S 39 .
  • the information processing unit 10 determines whether the ray contacts only one object (step S 39 ). When a determination result is that the ray contacts only one object (YES in step S 39 ), the information processing unit 10 determines that the object contacted by the ray is selected (step S 40 ) and causes the display 30 to display thereon an image obtained by the virtual camera capturing the virtual space (step S 41 ). Meanwhile, when the determination result is that the ray does not contact any object (NO in step S 39 ), the information processing unit 10 proceeds the processing to step S 41 .
  • the information processing unit 10 calculates an area on the touch panel 40 contacted by the user based on the coordinate group of the contact position (step S 35 ). Next, the information processing unit 10 determines whether the contact area is larger than a predetermined threshold value (step S 36 ). When a determination result is that the contact area is larger than the predetermined threshold value (YES in step S 36 ), the information processing unit 10 determines that an object which is displayed so as to be farther from the user in the depth direction is selected (step S 37 ), and proceeds the processing to step S 41 .
  • meanwhile, when the determination result is that the contact area is not larger than the predetermined threshold value (NO in step S 36 ), the information processing unit 10 determines that an object which is displayed so as to be closer to the user in the depth direction is selected (step S 38 ) and proceeds the processing to step S 41 .
  • the threshold value may be uniformly defined or may be defined differently for each object.
  • the anteroposterior relationship (closer to the user or further from the user) between the two objects is determined by storing the order in which the ray contacts the two objects. Further, also in a case of three or more objects, determination of selection based on a contact area can be performed.
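  • The depth-based selection of FIG. 10 can be sketched as below; ray_hits is assumed to be the list of objects hit by the ray, ordered from the nearest to the farthest as stored in step S 34 , and the function name is illustrative.

```python
def select_by_depth(ray_hits, contact_area, area_threshold):
    """Steps S34-S40 in outline: with two (or more) overlapping objects, a
    large contact area selects the farther object and a small one the nearer;
    with a single hit, that object is selected."""
    if len(ray_hits) >= 2:
        return ray_hits[-1] if contact_area > area_threshold else ray_hits[0]
    if len(ray_hits) == 1:
        return ray_hits[0]
    return None
```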
  • in the exemplary embodiment described above, the contact area is used for determination of selection of the objects.
  • alternatively, contact areas may be associated with display priorities (a visual anteroposterior relationship of the objects).
  • the display 30 in the exemplary embodiment may be a parallax barrier type or a lenticular lens type stereoscopic display screen. This enables the user to perform a more intuitive operation and the like on an object with a sense of depth which is displayed on a stereoscopic display screen.
  • in the exemplary embodiment described above, the contact area is associated with the size and the depth value of the object. However, the contact area may be associated with anything (e.g., color, shape, up-down, left-right, and the like) with regard to the object. Accordingly, a wider variety of operations can be provided.
  • alternatively, the contact area may be used as depth designation information in the virtual space. For example, the contact area can be used as information, with respect to an object, for moving the object in the depth direction, transforming the object, and designating a degree of depth. Accordingly, a wider variety of operations can be provided.
  • alternatively, which finger, such as a thumb or an index finger, contacts the touch panel 40 may be evaluated and the evaluation result may be reflected in determination of selection. In this case, an instruction as to which finger is used for selecting an object may be given to the user (or by the user) each time, or which finger is used may be determined based on the contact area and the contact range. Accordingly, a wider variety of operations can be provided.
  • in the exemplary embodiment described above, the contact area is associated with the size and the depth value of the object and used for determination of selection of the object. However, the association may be used for another predetermined operation or a display change instead of determination of object selection. Accordingly, a wider variety of operations can be provided.
  • alternatively, determination of object selection may be based on a force detected at a time of a touch. Various sensors such as an acceleration sensor, a gyro sensor, a pressure sensor, and the like can be adopted for detecting the force at the time of the touch. Accordingly, more intuitive and a wider variety of operations can be provided.
  • alternatively, the determination process may be performed based on a contact time. Accordingly, a wider variety of operations can be provided.
  • the greater a contact area is, the smaller a depth value of an object which is determined to be selected may be.
  • in the exemplary embodiment described above, the largest one of the contact areas detected during the time period from the contact start (touch on) to the contact end (touch off) is read.
  • alternatively, an average value or an integrated value of the contact areas during the time period may be read, or a maximum value or an average value of contact areas during a time period from a predetermined time prior to the touch off to the touch off may be read.
  • the contact area may be detected at predetermined regular intervals, and possible selection targets may be sequentially highlighted and displayed in accordance with the contact area (see the highlighting sketch following this list).
  • a mode of a touch reflects a user's mannerisms and personality, and therefore the information processing unit 10 may learn each user's mannerisms in pressing the touch panel.
  • the information processing unit 10 may learn each user's mannerisms naturally, by accumulating the relationship between selection targets and touch regions, or by requesting the user to touch a predetermined region in advance. Accordingly, a wider variety of operations can be provided.
  • the selection target may be evaluated based on the contact time in addition to the size of the contact area. Specifically, when the contact time is long, an object which is displayed in a large area may be determined to be selected (see the area-and-time sketch following this list). Accordingly, more intuitive operations and a wider variety of operations can be provided.
  • determination as to whether the object is selected may be made in accordance with a contact position and a size of a contact area.
  • the information processing program is executed by the information processing unit 10, which is a computer of the information processing apparatus 1 having the display 30 and the touch panel 40 serving as a position input unit, and the program causes the computer to function as:
  • a display control unit which displays a plurality of selection targets in a predetermined display mode on the display 30;
  • an evaluation unit which evaluates a contact area formed by a plurality of input positions detected by the position input unit; and
  • a determination unit which determines which selection target is selected, based at least on the contact area and on a relation between one selection target and another selection target in the display mode.
  • the determination unit selects a selection target having a size corresponding to a size of the contact area from among the plurality of selection targets.
  • the determination unit selects a selection target at a position corresponding to a size of the contact area from among the plurality of selection targets.
  • the plurality of selection targets and the virtual camera may be positioned in the virtual space, and a selection target positioned at a depth corresponding to the size of the contact area may be selected from among the plurality of selection targets.
  • the information processing program further causes the computer of the information processing apparatus to function as a contact position detection unit which detects a position on the contact detection unit contacted by the user, and
  • the determination unit determines which selection target is selected when the contact position detected by the contact position detection unit is in a region between the plurality of objects or in a certain region in which the plurality of objects overlap one another (this case is illustrated in the selection-determination sketch following this list).
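
The contact-area tracking sketch referred to above is a minimal, hypothetical illustration, not taken from the patent text. It assumes the touch panel reports the set of contacted cells (input positions) at predetermined regular intervals and that the contact area is simply the number of contacted cells; the class and method names are illustrative.

```python
# Illustrative sketch only; the cell-counting model and all names are assumptions.
class ContactAreaTracker:
    """Tracks contact areas sampled while a single touch lasts."""

    def __init__(self):
        self.samples = []  # contact area (number of contacted cells) per sample

    def on_sample(self, contacted_cells):
        """Called at predetermined regular intervals between touch on and touch off."""
        self.samples.append(len(contacted_cells))

    # Possible read-out rules mentioned in the list above:
    def largest_area(self):
        """Largest contact area detected from touch on to touch off."""
        return max(self.samples) if self.samples else 0

    def average_area(self):
        """Average contact area over the whole touch."""
        return sum(self.samples) / len(self.samples) if self.samples else 0

    def integrated_area(self):
        """Integrated (summed) contact area over the whole touch."""
        return sum(self.samples)

    def largest_area_before_touch_off(self, window):
        """Largest contact area over only the last `window` samples before touch off."""
        return max(self.samples[-window:]) if self.samples else 0
```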
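
The selection-determination sketch referred to above is likewise hypothetical. It shows one way a determination unit might pick among displayed objects: when the contact position falls where several objects overlap, the object whose depth value corresponds to the contact area is treated as selected, following the variant in which a greater contact area corresponds to a smaller depth value. The data model, thresholds, and linear mapping are illustrative assumptions, not values from the patent.

```python
# Illustrative sketch only; the data model and area-to-depth mapping are assumptions.
from dataclasses import dataclass


@dataclass
class DisplayedObject:
    name: str
    depth: float    # depth value in the virtual space
    region: tuple   # (x0, y0, x1, y1) on-screen rectangle

    def contains(self, x, y):
        x0, y0, x1, y1 = self.region
        return x0 <= x <= x1 and y0 <= y <= y1


def determine_selection(objects, contact_pos, contact_area,
                        area_min=20.0, area_max=200.0):
    """Return the object determined to be selected, or None."""
    x, y = contact_pos
    candidates = [o for o in objects if o.contains(x, y)]
    if not candidates:
        return None
    if len(candidates) == 1:
        return candidates[0]  # no overlap at this position: position alone decides

    # The contact position lies where candidates overlap, so the contact area
    # is used.  Normalise the area into [0, 1] ...
    t = min(max((contact_area - area_min) / (area_max - area_min), 0.0), 1.0)
    depths = [o.depth for o in candidates]
    # ... and map it onto the candidates' depth range: a greater contact area
    # corresponds to a smaller depth value (one of the variants listed above).
    target_depth = max(depths) - t * (max(depths) - min(depths))
    return min(candidates, key=lambda o: abs(o.depth - target_depth))
```

Under these assumptions, with two overlapping objects at depths 1.0 and 5.0, a light touch (small contact area) resolves to the object at depth 5.0 and a broad touch resolves to the object at depth 1.0.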
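
The highlighting sketch referred to above reuses determine_selection from the previous sketch: while the touch continues, the contact area is sampled at regular intervals and the selection target that would currently be chosen is highlighted. The display_control object and its highlight method are assumed interfaces, not part of the patent.

```python
# Illustrative sketch only; display_control.highlight(...) is an assumed interface.
def update_highlight(display_control, objects, contact_pos, contact_area):
    """Called at predetermined regular intervals while the touch continues."""
    candidate = determine_selection(objects, contact_pos, contact_area)
    for obj in objects:
        # Highlight only the object that would be selected if the touch ended now.
        display_control.highlight(obj, highlighted=(obj is candidate))
```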
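
The area-and-time sketch referred to above illustrates the variant in which the contact time is considered in addition to the contact area: a larger contact area and/or a longer contact time favours the candidate displayed in a larger area. The weights and normalisation constants are illustrative assumptions.

```python
# Illustrative sketch only; weights and reference values are assumptions.
def determine_by_area_and_time(candidates, contact_area, contact_time,
                               area_ref=200.0, time_ref=1.0,
                               w_area=0.5, w_time=0.5):
    """`candidates` is a list of (object, displayed_area) pairs.

    A larger contact area and/or a longer contact time favour the candidate
    that is displayed in a larger area, as described in the list above.
    """
    if not candidates:
        return None
    strength = (w_area * min(contact_area / area_ref, 1.0)
                + w_time * min(contact_time / time_ref, 1.0))  # 0.0 .. 1.0
    ordered = sorted(candidates, key=lambda c: c[1])            # small -> large displayed area
    index = min(int(strength * len(ordered)), len(ordered) - 1)
    return ordered[index][0]
```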

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)
US13/680,948 2011-12-28 2012-11-19 Computer-readable non-transitory storage medium, information processing apparatus, information processing system, and information processing method Abandoned US20130169565A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/825,873 US10732742B2 (en) 2011-12-28 2017-11-29 Information processing program and method for causing a computer to transform a displayed object based on input area and force of a touch input

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2011-287725 2011-12-28
JP2011287725A JP6021335B2 (ja) 2011-12-28 2011-12-28 情報処理プログラム、情報処理装置、情報処理システム、および、情報処理方法

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US15/825,873 Continuation US10732742B2 (en) 2011-12-28 2017-11-29 Information processing program and method for causing a computer to transform a displayed object based on input area and force of a touch input

Publications (1)

Publication Number Publication Date
US20130169565A1 (en) 2013-07-04

Family

ID=47504600

Family Applications (2)

Application Number Title Priority Date Filing Date
US13/680,948 Abandoned US20130169565A1 (en) 2011-12-28 2012-11-19 Computer-readable non-transitory storage medium, information processing apparatus, information processing system, and information processing method
US15/825,873 Active 2033-01-11 US10732742B2 (en) 2011-12-28 2017-11-29 Information processing program and method for causing a computer to transform a displayed object based on input area and force of a touch input

Family Applications After (1)

Application Number Title Priority Date Filing Date
US15/825,873 Active 2033-01-11 US10732742B2 (en) 2011-12-28 2017-11-29 Information processing program and method for causing a computer to transform a displayed object based on input area and force of a touch input

Country Status (3)

Country Link
US (2) US20130169565A1 (de)
EP (1) EP2610728A3 (de)
JP (1) JP6021335B2 (de)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6095527B2 (ja) * 2013-09-04 2017-03-15 レノボ・シンガポール・プライベート・リミテッド 携帯型情報処理装置、そのデータ処理方法、およびコンピュータが実行可能なプログラム
JP6257255B2 (ja) * 2013-10-08 2018-01-10 キヤノン株式会社 表示制御装置及び表示制御装置の制御方法
CN107357510B (zh) * 2013-11-21 2021-06-04 华为终端有限公司 触摸选择的视觉反馈方法和装置
KR102205283B1 (ko) * 2014-02-12 2021-01-20 삼성전자주식회사 적어도 하나의 어플리케이션을 실행하는 전자 장치 및 그 제어 방법
JP6442755B2 (ja) * 2014-02-28 2018-12-26 富士通コネクテッドテクノロジーズ株式会社 電子機器、制御プログラム、及び、制御方法
JP6224543B2 (ja) * 2014-07-29 2017-11-01 アルプス電気株式会社 入力装置及び指判定方法並びにプログラム
JP6675769B2 (ja) * 2015-11-25 2020-04-01 華為技術有限公司Huawei Technologies Co.,Ltd. 迅速な画面分割方法、装置、および電子デバイス、表示ui、および記憶媒体
JP6969516B2 (ja) * 2017-09-15 2021-11-24 株式会社セガ プログラム及び情報処理装置
JP6668440B2 (ja) * 2018-10-24 2020-03-18 シャープ株式会社 情報処理装置、情報処理方法及びプログラム
US10656763B1 (en) * 2019-01-04 2020-05-19 Sensel, Inc. Dynamic adjustment of a click threshold corresponding to a force-based tactile sensor

Family Cites Families (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000163193A (ja) * 1998-11-25 2000-06-16 Seiko Epson Corp 携帯情報機器及び情報記憶媒体
JP2005143714A (ja) * 2003-11-13 2005-06-09 Omron Corp ゲーム装置制御方法とゲーム装置
WO2006013520A2 (en) * 2004-08-02 2006-02-09 Koninklijke Philips Electronics N.V. System and method for enabling the modeling virtual objects
JP2008508629A (ja) * 2004-08-02 2008-03-21 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ 圧力依存型視覚フィードバックを備えるタッチスクリーン
JP4258850B2 (ja) * 2004-12-28 2009-04-30 株式会社セガ 画像処理装置およびその方法
JP4715257B2 (ja) * 2005-03-22 2011-07-06 パナソニック株式会社 データ処理装置
CN102077161B (zh) * 2008-06-30 2017-08-01 日本电气株式会社 信息处理设备、显示控制方法和记录介质
JP2010020608A (ja) * 2008-07-11 2010-01-28 Olympus Imaging Corp 電子装置、カメラ、オブジェクト選択方法、および、オブジェクト選択プログラム
JP5100556B2 (ja) * 2008-07-30 2012-12-19 キヤノン株式会社 情報処理方法及び装置
JP2010093707A (ja) 2008-10-10 2010-04-22 Nec Saitama Ltd 携帯型電子装置、文字入力画面表示方法およびプログラム
JP4743268B2 (ja) * 2008-12-15 2011-08-10 ソニー株式会社 情報処理装置、情報処理方法およびプログラム
JP5238640B2 (ja) * 2009-08-18 2013-07-17 株式会社コナミデジタルエンタテインメント ゲーム装置、ゲーム制御プログラム、及びゲーム制御方法
JP2011053971A (ja) 2009-09-02 2011-03-17 Sony Corp 情報処理装置、情報処理方法およびプログラム
JP2011138402A (ja) * 2009-12-28 2011-07-14 Canon Inc 操作入力装置、表示装置、撮影装置及び動画再生装置
JP2011221640A (ja) * 2010-04-06 2011-11-04 Sony Corp 情報処理装置、情報処理方法およびプログラム
US8643612B2 (en) * 2010-05-25 2014-02-04 MCube Inc. Touchscreen operation threshold methods and apparatus
EP2390772A1 (de) * 2010-05-31 2011-11-30 Sony Ericsson Mobile Communications AB Benutzerschnittstelle mit dreidimensionaler Benutzereingabe

Patent Citations (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020163509A1 (en) * 2001-04-13 2002-11-07 Roberts Jerry B. Touch screen with rotationally isolated force sensor
US8477115B2 (en) * 2005-06-08 2013-07-02 Sony Corporation Input device, information processing apparatus, information processing method, and program
US20070070046A1 (en) * 2005-09-21 2007-03-29 Leonid Sheynblat Sensor-based touchscreen assembly, handheld portable electronic device having assembly, and method of determining touch location on a display panel
US20080084400A1 (en) * 2006-10-10 2008-04-10 Outland Research, Llc Touch-gesture control of video media play on handheld media players
US8665225B2 (en) * 2007-01-07 2014-03-04 Apple Inc. Portable multifunction device, method, and graphical user interface for interpreting a finger gesture
US20090153495A1 (en) * 2007-12-18 2009-06-18 Wistron Corp. Input method for use in an electronic device having a touch-sensitive screen
US20100044121A1 (en) * 2008-08-15 2010-02-25 Simon Steven H Sensors, algorithms and applications for a high dimensional touchpad
US8407606B1 (en) * 2009-01-02 2013-03-26 Perceptive Pixel Inc. Allocating control among inputs concurrently engaging an object displayed on a multi-touch device
US20110050588A1 (en) * 2009-08-27 2011-03-03 Symbol Technologies, Inc. Methods and apparatus for pressure-based manipulation of content on a touch screen
US20110050619A1 (en) * 2009-08-27 2011-03-03 Research In Motion Limited Touch-sensitive display with capacitive and resistive touch sensors and method of control
US20110050594A1 (en) * 2009-09-02 2011-03-03 Kim John T Touch-Screen User Interface
US20110057886A1 (en) * 2009-09-10 2011-03-10 Oliver Ng Dynamic sizing of identifier on a touch-sensitive display
US20110134061A1 (en) * 2009-12-08 2011-06-09 Samsung Electronics Co. Ltd. Method and system for operating a mobile device according to the rate of change of the touch area
US20110221684A1 (en) * 2010-03-11 2011-09-15 Sony Ericsson Mobile Communications Ab Touch-sensitive input device, mobile device and method for operating a touch-sensitive input device
US20120038582A1 (en) * 2010-08-13 2012-02-16 Immersion Corporation Systems and Methods for Providing Haptic Feedback to Touch-Sensitive Input Devices
US20120054665A1 (en) * 2010-08-30 2012-03-01 Sony Corporation Information processing apparatus, parameter setting method, and program
US9030419B1 (en) * 2010-09-28 2015-05-12 Amazon Technologies, Inc. Touch and force user interface navigation
US20120105358A1 (en) * 2010-11-03 2012-05-03 Qualcomm Incorporated Force sensing touch screen
US20120139864A1 (en) * 2010-12-02 2012-06-07 Atmel Corporation Position-sensing and force detection panel
US20130063389A1 (en) * 2011-09-12 2013-03-14 Motorola Mobility, Inc. Using pressure differences with a touch-sensitive display screen
US20130088434A1 (en) * 2011-10-06 2013-04-11 Sony Ericsson Mobile Communications Ab Accessory to improve user experience with an electronic display

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170075565A1 (en) * 2012-07-18 2017-03-16 Sony Corporation Mobile client device, operation method, recording medium, and operation system
US10007424B2 (en) * 2012-07-18 2018-06-26 Sony Mobile Communications Inc. Mobile client device, operation method, recording medium, and operation system
US20170068389A1 (en) * 2014-05-14 2017-03-09 Sony Corporation Information processing apparatus, information processing method, and program
US10061438B2 (en) * 2014-05-14 2018-08-28 Sony Semiconductor Solutions Corporation Information processing apparatus, information processing method, and program
US20160034069A1 (en) * 2014-08-04 2016-02-04 Fujitsu Limited Information processing apparatus, input control method, and computer-readable recording medium
CN106415472A (zh) * 2015-04-14 2017-02-15 华为技术有限公司 一种手势控制方法、装置、终端设备和存储介质
JP2018511892A (ja) * 2015-04-14 2018-04-26 華為技術有限公司Huawei Technologies Co.,Ltd. ジェスチャコントロール方法、装置、端末デバイス、およびストレージ媒体
US10802704B2 (en) * 2015-04-14 2020-10-13 Huawei Technologies Co., Ltd. Gesture control method, apparatus, terminal device, and storage medium
US20180161674A1 (en) * 2015-06-11 2018-06-14 Bandai Namco Entertainment Inc. Terminal device
US10850196B2 (en) * 2015-06-11 2020-12-01 Bandai Namco Entertainment Inc. Terminal device
WO2018078488A1 (en) * 2016-10-25 2018-05-03 Semiconductor Energy Laboratory Co., Ltd. Display device, display module, electronic device, and touch panel input system
US20180336768A1 (en) * 2017-05-16 2018-11-22 Honeywell International Inc. Systems and methods for outdoor evacuation guidance using an uav

Also Published As

Publication number Publication date
EP2610728A2 (de) 2013-07-03
US20180081461A1 (en) 2018-03-22
EP2610728A3 (de) 2017-10-11
JP6021335B2 (ja) 2016-11-09
US10732742B2 (en) 2020-08-04
JP2013137613A (ja) 2013-07-11

Similar Documents

Publication Publication Date Title
US10732742B2 (en) Information processing program and method for causing a computer to transform a displayed object based on input area and force of a touch input
JP5802667B2 (ja) ジェスチャ入力装置およびジェスチャ入力方法
US9250799B2 (en) Control method for information input device, information input device, program therefor, and information storage medium therefor
JP5133515B2 (ja) ゲーム装置およびゲームプログラム
TWI543018B (zh) An input device, an input method, and storage medium
WO2011142317A1 (ja) ジェスチャー認識装置、方法、プログラム、および該プログラムを格納したコンピュータ可読媒体
US20150199030A1 (en) Hover-Sensitive Control Of Secondary Display
US8669947B2 (en) Information processing apparatus, information processing method and computer program
EP2558924B1 (de) Vorrichtung, verfahren und computerprogramm für benutzereingabe mithilfe einer kamera
US20110157017A1 (en) Portable data processing appartatus
KR20150010702A (ko) 제스처 인식 디바이스들 및 방법들
JP6078719B2 (ja) 移動制御装置及びプログラム
JP5783828B2 (ja) 情報処理装置およびその制御方法
US8922351B2 (en) Display apparatus, information processing system, recording medium and television receiver
KR20120126508A (ko) 포인터를 사용하지 않는 가상 터치 장치에서의 터치 인식 방법
JPWO2018003862A1 (ja) 制御装置、表示装置、プログラムおよび検出方法
JP6534011B2 (ja) 情報処理装置、情報処理プログラム、情報処理システム、および、情報処理方法
JP6519075B2 (ja) 情報処理装置、情報処理プログラム、情報処理システム、および、情報処理方法
WO2022247506A1 (en) Systems and methods for controlling virtual widgets in gesture-controlled device
JP5379275B2 (ja) ゲーム装置およびゲームプログラム
US9389780B2 (en) Touch-control system
JP6033061B2 (ja) 入力装置およびプログラム
JP2011243157A (ja) 電子機器、ボタンサイズ制御方法、及びプログラム
Tsuchida et al. TetraForce: a magnetic-based interface enabling pressure force and shear force input applied to front and back of a smartphone
CN116136736A (zh) 信息处理方法、装置、电子设备、存储介质及程序产品

Legal Events

Date Code Title Description
AS Assignment

Owner name: NINTENDO CO., LTD., JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:FUNAHASHI, KIYOFUMI;MIYOSHI, YASUMASA;TAKUMA, HIROKI;REEL/FRAME:029323/0172

Effective date: 20121106

STCB Information on status: application discontinuation

Free format text: ABANDONED -- AFTER EXAMINER'S ANSWER OR BOARD OF APPEALS DECISION