US20140071090A1 - Information processing apparatus, information processing method, and program - Google Patents

Information processing apparatus, information processing method, and program

Info

Publication number
US20140071090A1
Authority
US
United States
Prior art keywords
detection
information processing
proximity
processing apparatus
region
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/116,137
Other languages
English (en)
Inventor
Yusuke Onishi
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Corp
Original Assignee
Sony Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corp filed Critical Sony Corp
Assigned to SONY CORPORATION reassignment SONY CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: ONISHI, YUSUKE
Publication of US20140071090A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/044: Digitisers characterised by capacitive transducing means
    • G06F3/0446: Capacitive digitisers using a grid-like structure of electrodes in at least two directions, e.g. using row and column electrodes
    • G06F3/0416: Control or interface arrangements specially adapted for digitisers
    • G06F3/0418: Control or interface arrangements specially adapted for digitisers, for error correction or compensation, e.g. based on parallax, calibration or alignment
    • G06F3/04186: Touch location disambiguation
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487: Interaction techniques based on GUIs using specific features provided by the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488: Interaction techniques based on GUIs using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886: Interaction techniques based on GUIs using a touch-screen or digitiser, by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G06F2203/00: Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/041: Indexing scheme relating to G06F3/041 - G06F3/045
    • G06F2203/04101: 2.5D-digitiser, i.e. a digitiser detecting the X/Y position of the input means, finger or stylus, also when it does not touch but is proximate to the digitiser's interaction surface, and also measuring the distance of the input means within a short range in the Z direction, possibly with a separate measurement setup

Definitions

  • the technology relates to an information processing apparatus, an information processing method, and a program. More particularly, the technology provides an information processing apparatus, an information processing method, and a program that enable input manipulation to be correctly performed.
  • input manipulation can be performed by providing a capacitive or resistive touch panel.
  • in a capacitive touch panel, a change in capacitance caused by a manipulation object such as a finger or a contact pen touching the manipulation surface of the touch panel is detected, so that the contact position can be determined, as described in Patent Documents 1 and 2.
  • FIG. 1 illustrates a schematic cross-sectional view of an information processing apparatus using a touch panel.
  • a display unit 21 is provided in a casing 50 of the information processing apparatus and a touch panel to be a sensor unit 11 is provided on a side of a display surface of the display unit 21 .
  • when a finger FG comes close to the panel, the capacitance changes and it may be erroneously detected that an input manipulation has been performed in the portion of the active region ARb positioned proximally to the finger FG.
  • an information processing apparatus including a sensor unit that generates a sensor signal according to proximity and contact of a manipulation object, a proximity detecting unit that performs proximity detection of the manipulation object, on the basis of the sensor signal, and a determining unit that determines validity of the proximity detection result, according to whether a detection position detected by the proximity detection is in a manipulatable region provided in a region where detection of the contact of the manipulation object is enabled.
  • the sensor unit is composed of, for example, a capacitive touch panel.
  • the proximity detection of the manipulation object is performed by the proximity detecting unit, on the basis of the sensor signal generated according to the proximity and the contact of the manipulation object.
  • the proximity detection result is validated when the detection position detected by the proximity detection is in the manipulatable region provided in the region where the detection of the contact of the manipulation object is enabled; when the detection position is outside the manipulatable region, a detection invalidity flag is set and the proximity detection result is invalidated.
  • when the detection position subsequently enters a detection invalidity flag release region provided in the manipulatable region, the detection invalidity flag is released and the proximity detection result is validated.
  • the manipulatable region is displayed identifiably by a display unit.
  • when a plurality of manipulation objects are detected, a priority order is set to the detected manipulation objects. For example, the priority order of a manipulation object positioned in a predetermined region provided in the region where the detection of the contact of the manipulation object is enabled becomes higher. In addition, when a plurality of manipulation objects are positioned in the predetermined region, the priority order of a manipulation object in which the signal strength of the sensor signal is higher becomes higher.
  • the proximity detection result is adopted on the basis of the priority order set as described above. In the determining unit, when a detection size of the manipulation object detected by the proximity detection is more than a threshold value, the manipulation object is determined to be invalid.
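The priority ordering above can be expressed as a short sketch. This is an illustrative reading in Python, not code from the patent: the record layout, the function name, and the threshold value are assumptions. Detections inside the predetermined region rank first; among those, higher signal strength wins; objects whose detection size exceeds the threshold are dropped as invalid.

```python
# Hypothetical sketch of the priority ordering described above; the record
# layout, names, and SIZE_THRESHOLD value are assumptions, not patent text.

SIZE_THRESHOLD = 40  # assumed limit separating finger-sized from larger objects

def prioritize(detections, in_region, size_threshold=SIZE_THRESHOLD):
    """detections: list of dicts with 'pos', 'strength', and 'size' keys.
    in_region: predicate for membership in the predetermined region.
    Returns the valid detections ordered by priority (highest first)."""
    # A detection size above the threshold marks the object as invalid.
    valid = [d for d in detections if d["size"] <= size_threshold]
    # Rank: inside the predetermined region first, then by signal strength.
    return sorted(
        valid,
        key=lambda d: (in_region(d["pos"]), d["strength"]),
        reverse=True,
    )
```

With this ordering, the first element of the returned list corresponds to the proximity detection result that would be adopted.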
  • region display corresponding to each manipulation object is performed by the display unit on the basis of the proximity detection results of the plurality of detected manipulation objects, and either a process for setting a display size of the region display according to the signal strength of the sensor signal or a process for displaying a region surrounded by the positions shown by the plurality of proximity detection results is executed by the display unit.
  • detection sensitivity of the proximity detection in the proximity detecting unit is controlled by a proximity detection control unit, and the detection sensitivity at the ends of the region where the detection of the contact of the manipulation object is enabled is decreased, such that it becomes lower than the detection sensitivity of the other portions.
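As an illustrative sketch of this edge-sensitivity control, the following Python function returns a reduced gain near the ends of the active region. The margin width and the gain values are assumptions, not values from the patent.

```python
# Hypothetical sketch of edge-sensitivity control; the margin width and the
# gain values are assumptions, not values from the patent.

def sensitivity_at(x, y, width=1024, height=1024, margin=64,
                   normal=1.0, edge=0.5):
    """Return the proximity-detection gain used at position (x, y):
    a reduced gain near any end of the active region, `normal` elsewhere."""
    near_edge = (x < margin or x >= width - margin or
                 y < margin or y >= height - margin)
    return edge if near_edge else normal
```

Lowering the gain only near the edges keeps full sensitivity in the center while suppressing spurious detections from a gripping hand at the rim.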
  • an information processing method including a process of generating a sensor signal according to proximity and contact of a manipulation object, a process of performing proximity detection of the manipulation object, on the basis of the sensor signal, and a process of determining validity of the proximity detection result, according to whether a detection position detected by the proximity detection is in a manipulatable region provided in a region where detection of the contact of the manipulation object is enabled.
  • a program for causing a computer to execute a step of performing proximity detection of a manipulation object, on the basis of a sensor signal generated by a sensor unit according to proximity and contact of the manipulation object, and a step of determining validity of the proximity detection result, according to whether a detection position detected by the proximity detection is in a manipulatable region provided in a region where detection of the contact of the manipulation object is enabled.
  • the program according to the technology is a program that can be provided, in a computer readable format, to a general-purpose computer that executes various programs and codes, by a storage medium such as an optical disc, a magnetic disc, or a semiconductor memory, or by a communication medium such as a network.
  • because the program is provided in the computer readable format, a process according to the program is realized on the computer.
  • proximity detection of a manipulation object is performed on the basis of a sensor signal generated according to proximity and contact of the manipulation object and validity of a proximity detection result is determined according to whether a detection position detected by the proximity detection is in a manipulatable region provided in a region where detection of the contact of the manipulation object is enabled. For this reason, erroneous detection can be prevented and input manipulation can be correctly performed.
  • FIG. 1 is a schematic cross-sectional view of an information processing apparatus.
  • FIG. 2 is a diagram illustrating a configuration of a touch panel.
  • FIG. 3 is a functional block diagram of an information processing apparatus.
  • FIG. 4 is a flowchart illustrating a first operation.
  • FIG. 5 is a diagram illustrating a manipulatable region and a detection invalidity flag release region.
  • FIG. 6 is a diagram illustrating an example of display of a manipulatable region.
  • FIG. 7 is a diagram illustrating a relation of movement of a manipulation object and a proximity detection result.
  • FIG. 8 is a diagram illustrating a method of setting a region.
  • FIG. 9 is a flowchart illustrating a second operation.
  • FIG. 10 is a diagram illustrating an example of signal strength when a plurality of manipulation objects are detected.
  • FIG. 11 is a diagram illustrating a state in which a grip portion of a hand approaches a sensor unit when a finger is used as a manipulation object.
  • FIG. 12 is a diagram illustrating the case in which a multi-proximity detection mode is selected.
  • FIG. 13 is a diagram illustrating sensitivity adjustment using a register.
  • An information processing apparatus using the technology has a configuration in which a display unit 21 is provided in a casing 50 of the information processing apparatus and a sensor unit 11 is provided on a side of a display surface of the display unit 21 , as illustrated in FIG. 1 .
  • FIG. 2 illustrates a configuration of a touch panel.
  • In the touch panel, a light transmitting substrate is used. In an active region where a manipulation input is received, a plurality of rows of first light transmitting electrode patterns 111 that extend in a first direction and a plurality of rows of second light transmitting electrode patterns 112 that extend in a second direction crossing the first direction are formed.
  • when the manipulation object comes close, capacitance is generated between the manipulation object and the first light transmitting electrode patterns 111 and the second light transmitting electrode patterns 112. Therefore, it can be detected from the change of the capacitance that the manipulation object is positioned in the vicinity, or which place the manipulation object approaches or contacts.
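A minimal sketch of reading this electrode grid, assuming the sensor reports a 2D array of capacitance changes at each crossing (the array layout and function name are assumptions for illustration): the crossing with the largest change gives the detection position.

```python
# Hypothetical sketch: locate the manipulation object from the grid of
# capacitance changes at the electrode crossings.  The grid layout is an
# assumption made for illustration.

def detect_position(delta_c):
    """delta_c: 2D list of capacitance changes, one value per crossing of a
    first-direction and a second-direction electrode pattern.
    Returns (row, col) of the strongest change, or None if nothing changed."""
    best, pos = 0, None
    for r, row in enumerate(delta_c):
        for c, value in enumerate(row):
            if value > best:
                best, pos = value, (r, c)
    return pos
```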
  • the first light transmitting electrode patterns 111 and the second light transmitting electrode patterns 112 are formed by the same layer on the same surface of the light transmitting substrate. For this reason, a plurality of crossing portions of the first light transmitting electrode patterns 111 and the second light transmitting electrode patterns 112 exist.
  • one electrode pattern of the first light transmitting electrode pattern 111 and the second light transmitting electrode pattern 112 is connected in the crossing portion and the other electrode pattern is disconnected.
  • the first light transmitting electrode pattern 111 is connected and the second light transmitting electrode pattern 112 is disconnected.
  • a light transmitting interlayer insulating film is formed on the side of an upper layer of the first light transmitting electrode patterns 111 in the crossing portions.
  • Light transmitting relay electrodes 113 (oblique portions) that electrically connect the second light transmitting electrode patterns 112 disconnected in the crossing portions are formed on an upper layer of the interlayer insulating film. For this reason, the second light transmitting electrode patterns are electrically connected in the second direction.
  • each of the first light transmitting electrode patterns 111 and the second light transmitting electrode patterns 112 includes a large-area pad portion of a rhombic shape that is provided in the region interposed between the crossing portions.
  • a connecting portion that is positioned in the crossing portions in the first light transmitting electrode patterns 111 is formed in a small-width shape with a width smaller than a width of the pad portion.
  • the relay electrode 113 is also formed in a small-width shape with a width smaller than the width of the pad portion and a strip shape.
  • the sensor unit 11 that is configured as described above generates a sensor signal showing the change of the capacitance generated by the proximity or the contact of the manipulation object and outputs the sensor signal to the proximity detecting unit 12 .
  • the display unit 21 is configured using a planar display element such as a liquid crystal display element.
  • the display unit 21 displays a menu screen to perform setting or operation switching of the information processing apparatus.
  • a backlight may be provided between the display unit 21 and the casing 50, and the backlight may emit light to the display unit 21 from the side of its back surface, that is, the surface facing the casing 50, thereby making the display of the display unit 21 easier to view.
  • FIG. 3 illustrates a functional block diagram of the information processing apparatus.
  • An information processing apparatus 10 has a sensor unit 11 , a proximity detecting unit 12 , a determining unit 13 , a proximity detection control unit 14 , a display unit 21 , and a system control unit 30 .
  • the sensor unit 11 generates the sensor signal showing the change of the capacitance generated by the proximity or the contact of the manipulation object and outputs the sensor signal to the proximity detecting unit 12 , as described above.
  • the proximity detecting unit 12 performs proximity detection of the manipulation object, on the basis of the sensor signal from the sensor unit 11 .
  • the proximity detecting unit 12 performs the proximity detection and outputs a detection result to the determining unit 13 .
  • a detection result showing a vicinity detection state or a detection result showing a proximity detection state is generated.
  • the vicinity detection state means a state in which a position of the manipulation object cannot be detected, but the manipulation object exists in the vicinity of the sensor unit 11 .
  • the proximity detection state means a state in which a position of the manipulation object approaching or contacting the sensor unit 11 is detected.
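The two detection states above can be sketched as a simple classification of the sensor signal's peak strength. The threshold values below are assumptions made for illustration, not values from the patent.

```python
# Hypothetical sketch of the two detection states; the threshold values are
# assumptions, not values from the patent.

VICINITY_THRESHOLD = 10   # assumed: object sensed, but position unresolved
PROXIMITY_THRESHOLD = 50  # assumed: position of the object can be detected

def detection_state(peak_strength):
    """Classify the peak sensor-signal strength into a detection state."""
    if peak_strength >= PROXIMITY_THRESHOLD:
        return "proximity"  # position detected (approach or contact)
    if peak_strength >= VICINITY_THRESHOLD:
        return "vicinity"   # object near the sensor unit, position unknown
    return "none"
```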
  • the determining unit 13 determines validity of the proximity detection result, according to whether a detection position detected by the proximity detection is in a manipulatable region provided in a region where detection of the contact of the manipulation object is enabled.
  • the determining unit 13 outputs a valid proximity detection result or the proximity detection result and a determination result of the validity to the system control unit 30 .
  • the determining unit 13 sets a priority order to the detected manipulation objects, adopts the proximity detection result on the basis of the set priority order, and outputs the proximity detection result to the system control unit 30 .
  • the proximity detection control unit 14 controls detection sensitivity of the proximity detection in the proximity detecting unit 12 , on the basis of a control signal supplied from the system control unit 30 or the determination result of the determining unit 13 .
  • the system control unit 30 generates a display signal and outputs the display signal to the display unit 21 .
  • the system control unit 30 executes an individual process or an integration process of the proximity detection result, on the basis of the proximity detection result supplied from the determining unit 13 .
  • the system control unit 30 determines a manipulation performed by the user from the proximity detection result supplied from the determining unit 13 and the display performed by the display unit 21 and performs control such that an operation of the information processing apparatus 10 becomes an operation according to a user manipulation, on the basis of a determination result.
  • the information processing apparatus 10 determines the validity of the proximity detection result, according to whether the detection position of the manipulation object detected by the proximity detection is in the manipulatable region provided in the region where the detection of the contact of the manipulation object is enabled, and prevents the input manipulation from being erroneously detected. In addition, when the plurality of manipulation objects are detected in the proximity detection, the information processing apparatus 10 sets the priority order to the detected manipulation objects and adopts the proximity detection result on the basis of the set priority order, thereby preventing the input manipulation from being erroneously detected. Furthermore, the information processing apparatus 10 controls the detection sensitivity in the proximity detection of the manipulation object and prevents the erroneous detection.
  • FIG. 4 is a flowchart illustrating the first operation of the information processing apparatus 10 .
  • In step ST 1, the information processing apparatus 10 determines whether the state of the manipulation object is the vicinity/proximity detection state. When the manipulation object comes close to the sensor unit 11 and the state thereof becomes the vicinity detection state, or when the manipulation object approaches or contacts the sensor unit 11 and the state thereof becomes the proximity detection state, the information processing apparatus 10 proceeds to step ST 2. In the other cases, the information processing apparatus 10 proceeds to step ST 11.
  • In step ST 2, the information processing apparatus 10 displays the manipulatable region. Because the manipulation object has come close to the touch panel, the information processing apparatus 10 displays the manipulatable region by the display unit 21 and proceeds to step ST 3.
  • a manipulatable region ARp is a region provided in an active region ARb to be a region where the detection of the contact of the manipulation object is enabled, as illustrated in FIG. 5 , and is associated with the display of the display unit 21 .
  • In step ST 3, the information processing apparatus 10 determines whether the state of the manipulation object has transited for the first time from the vicinity detection state to the proximity detection state.
  • When the state has so transited, the information processing apparatus 10 proceeds to step ST 4, and proceeds to step ST 6 in the other cases.
  • In step ST 4, the information processing apparatus 10 proceeds to step ST 6 when the detection position is in the manipulatable region, and proceeds to step ST 5 in the other cases.
  • In step ST 5, the information processing apparatus 10 sets a detection invalidity flag and proceeds to step ST 6.
  • In step ST 6, the information processing apparatus 10 determines whether the detection invalidity flag is set and the detection position is in a detection invalidity flag release region.
  • a detection invalidity flag release region ARq is a region provided in the manipulatable region ARp, as illustrated in FIG. 5 .
  • When the detection invalidity flag is set and the detection position is in the detection invalidity flag release region, the information processing apparatus 10 proceeds to step ST 7.
  • In the other cases, the information processing apparatus 10 proceeds to step ST 8.
  • In step ST 7, the information processing apparatus 10 releases the set detection invalidity flag and proceeds to step ST 8.
  • In step ST 8, the information processing apparatus 10 determines whether the detection invalidity flag is in the set state. When the detection invalidity flag is set, the information processing apparatus 10 proceeds to step ST 9, and when the detection invalidity flag is released, the information processing apparatus 10 proceeds to step ST 10.
  • In step ST 9, the information processing apparatus 10 invalidates the proximity detection result generated by the proximity detecting unit 12 and returns to step ST 1.
  • In step ST 10, the information processing apparatus 10 validates the proximity detection result generated by the proximity detecting unit 12 and returns to step ST 1.
  • In step ST 11, the information processing apparatus 10 sets the manipulatable region as non-display. At this time, the information processing apparatus 10 can turn off the backlight and decrease power consumption.
  • All of the processes from step ST 1 to step ST 11 do not need to be executed, and a part of the processes may be selectively executed. For example, only the processes from step ST 3 to step ST 10 may be executed.
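The flow of steps ST 1 to ST 11 can be sketched as a single update function. The state dictionary and event field names below are assumptions made for illustration, while the branching follows the flow described above.

```python
# Hypothetical sketch of steps ST 1 to ST 11 as one update function; field
# names are assumptions, the branching follows the described flowchart.

def first_operation_step(state, event):
    """state: {'flag': bool, 'display': bool} carried between calls.
    event: {'detected', 'first_proximity', 'in_arp', 'in_arq'} booleans.
    Returns 'valid', 'invalid', or None when nothing is detected."""
    if not event["detected"]:                        # ST 1 -> ST 11
        state["display"] = False                     # hide manipulatable region
        return None
    state["display"] = True                          # ST 2: display region
    if event["first_proximity"] and not event["in_arp"]:
        state["flag"] = True                         # ST 4 -> ST 5: set flag
    if state["flag"] and event["in_arq"]:
        state["flag"] = False                        # ST 6 -> ST 7: release flag
    return "invalid" if state["flag"] else "valid"   # ST 8 -> ST 9 / ST 10
```

Calling the function once per sensor update reproduces the loop back to step ST 1 in the flowchart.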
  • FIG. 6 illustrates an example of display of the manipulatable region.
  • For example, the information processing apparatus 10 sets the manipulatable region as non-display, like "button 1" in FIG. 6(A).
  • Alternatively, the information processing apparatus 10 sets the color of the manipulatable region as a color with low luminosity, like "button 2".
  • Alternatively, the information processing apparatus 10 sets the size of the button display as a small size.
  • When the manipulatable region is set as non-display, for example, the information processing apparatus 10 may turn off the backlight and set the display of the display unit 21 as black display. Alternatively, the information processing apparatus 10 may set the display of the display unit 21 as standby screen display.
  • the information processing apparatus 10 displays the manipulatable region, like a “button 1 ” in FIG. 6(B) .
  • the information processing apparatus 10 sets a color of the manipulatable region as a color with high luminosity, like a “button 2 ”.
  • the information processing apparatus 10 sets a size of button display as a large size, like a “button 3 ”.
  • the information processing apparatus 10 turns on the backlight and performs the display by the display unit 21 .
  • gradations or animations may be added to the button or icon display, so that the display state can be identified.
  • FIG. 7 illustrates a relation of the movement of the manipulation object and the proximity detection result.
  • When the state of the manipulation object becomes the proximity detection state at a position outside the manipulatable region ARp, the detection invalidity flag is set.
  • Because the proximity detection result is then invalidated, erroneous detection is not generated when the manipulation object comes close to the outside of the manipulatable region.
  • While the detection invalidity flag is set, the proximity detection result is invalidated even when the manipulation object is in the manipulatable region ARp, if the state of the manipulation object becomes the proximity detection state at a position outside the detection invalidity flag release region ARq, as illustrated in FIG. 7(B).
  • When the detection position of the manipulation object becomes a position in the detection invalidity flag release region ARq, the detection invalidity flag is released and the proximity detection result is validated.
  • the detection invalidity flag release region is a region existing in the manipulatable region and narrower than the manipulatable region. Therefore, the validity of the proximity detection result can be stably determined.
  • the setting of the manipulatable region or the detection invalidity flag release region can be easily performed by using a register.
  • For example, as illustrated in FIG. 8(A), an 8-bit register is provided; the upper 4 bits of the register are set as a threshold value to set the range of the region in the X-axis direction, and the lower 4 bits of the register are set as a threshold value to set the range of the region in the Y-axis direction.
  • FIG. 8(B) illustrates the case in which the register is set as "0x42" and the manipulatable region is set.
  • Each of the resolution of the X-axis direction and the resolution of the Y-axis direction in the sensor unit 11 is set as “1023”.
  • the manipulatable region can be easily set by using the register.
  • the detection invalidity flag release region can be easily set by using the register.
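The register-based region setting above can be sketched in code. The patent states that the upper 4 bits give the X-direction threshold and the lower 4 bits the Y-direction threshold; how each 4-bit value maps onto the 0-1023 sensor resolution is an assumption here (value/16 of full scale, used as a margin on each side).

```python
# Hypothetical sketch of decoding the 8-bit region register; the scaling of
# each 4-bit threshold onto the 0-1023 coordinate space is an assumption.

RESOLUTION = 1023  # X and Y resolution of the sensor unit

def region_from_register(reg):
    """Decode an 8-bit register value (e.g. 0x42) into a rectangular region
    (x0, y0, x1, y1) inside the sensor coordinate space."""
    tx = (reg >> 4) & 0xF        # upper 4 bits: X-direction threshold
    ty = reg & 0xF               # lower 4 bits: Y-direction threshold
    mx = RESOLUTION * tx // 16   # assumed scaling of the threshold
    my = RESOLUTION * ty // 16
    return (mx, my, RESOLUTION - mx, RESOLUTION - my)
```

Under this assumed scaling, the register value 0x42 yields a region inset by about a quarter of the width in X and an eighth of the height in Y.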
  • the proximity detection of the manipulation object is performed on the basis of the sensor signal generated according to the proximity and the contact of the manipulation object, and the validity of the proximity detection result is determined according to whether the detection position detected by the proximity detection is in the manipulatable region provided in the region where the detection of the contact of the manipulation object is enabled. For this reason, erroneous detection can be prevented and the input manipulation can be correctly performed.
  • When the state of the manipulation object becomes the proximity detection state at a position outside the manipulatable region, the detection invalidity flag is set.
  • When the detection position then moves and becomes a position in the detection invalidity flag release region in the manipulatable region, the detection invalidity flag is released and the proximity detection result is validated. Therefore, a more reliable proximity detection result can be used as compared with an information processing apparatus according to the related art that uses the proximity detection result as soon as the state of the manipulation object becomes the proximity detection state.
  • the detection invalidity flag release region may be set so as to be offset from the center of the manipulatable region.
  • For example, the detection invalidity flag release region of FIG. 7 is set offset in the rightward direction, such that the release region is apart from a portion that the finger easily contacts when the user grips the information processing apparatus.
  • When the detection invalidity flag release region is set in this manner, cases in which the detection invalidity flag is erroneously released decrease, and erroneous detection can be prevented more surely.
  • the manipulatable region is displayed identifiably by the display unit. For this reason, because the position of the manipulatable region can be easily confirmed by the user, the erroneous manipulation can be prevented by referring to the display.
  • FIG. 9 is a flowchart illustrating the second operation of the information processing apparatus 10 .
  • in step ST 21, the information processing apparatus 10 determines whether a manipulation object having a detection area less than a threshold value exists. When no such manipulation object exists, the information processing apparatus 10 proceeds to step ST 22, and when such a manipulation object exists, the information processing apparatus 10 proceeds to step ST 23.
  • in step ST 22, the information processing apparatus 10 invalidates any manipulation object having a detection area equal to or larger than the threshold value.
  • the information processing apparatus 10 invalidates the manipulation object having the detection area equal to or larger than the threshold value (for example, the back of a hand or the palm of the hand, whose detection area is larger than the detection area of the finger when the manipulation is performed by the finger) and proceeds to step ST 23.
  • in step ST 23, the information processing apparatus 10 determines whether a plurality of manipulation objects have been proximally detected. When the number of proximally detected manipulation objects is plural, the information processing apparatus 10 proceeds to step ST 25, and when the number is one, the information processing apparatus 10 proceeds to step ST 24.
  • in step ST 24, the information processing apparatus 10 adopts the detected manipulation object. Because the number of manipulation objects having the detection area less than the threshold value is one, the information processing apparatus 10 adopts the detected manipulation object and proceeds to step ST 25.
  • in step ST 25, the information processing apparatus 10 determines whether a multi-proximity detection mode is selected. When the multi-proximity detection mode using a plurality of proximity detection results is selected, the information processing apparatus 10 proceeds to step ST 31. When the multi-proximity detection mode is not selected, the information processing apparatus 10 proceeds to step ST 26.
  • in step ST 26, the information processing apparatus 10 determines whether a manipulation object exists in a predetermined region.
  • when a manipulation object exists in the predetermined region, the information processing apparatus 10 proceeds to step ST 27.
  • when no manipulation object exists in the predetermined region, the information processing apparatus 10 proceeds to step ST 30.
  • in step ST 27, the information processing apparatus 10 determines whether only one manipulation object exists in the predetermined region. When one manipulation object exists in the predetermined region, the information processing apparatus 10 proceeds to step ST 28. When a plurality of manipulation objects exist in the predetermined region, the information processing apparatus 10 proceeds to step ST 29.
  • in step ST 28, the information processing apparatus 10 adopts the manipulation object in the predetermined region.
  • the information processing apparatus 10 outputs the proximity detection result of the manipulation object in the predetermined region from the determining unit 13 to the system control unit 30 and returns to step ST 21.
  • in step ST 29, the information processing apparatus 10 adopts the manipulation object having the highest signal strength in the predetermined region. Because a plurality of manipulation objects exist in the predetermined region, the information processing apparatus 10 outputs the proximity detection result of the manipulation object having the highest signal strength from the determining unit 13 to the system control unit 30 and returns to step ST 21.
  • when it is determined in step ST 26 that no manipulation object exists in the predetermined region, in step ST 30 the information processing apparatus 10 adopts the manipulation object having the highest signal strength in the entire region.
  • the information processing apparatus 10 outputs the proximity detection result of the manipulation object having the highest signal strength in the entire region from the determining unit 13 to the system control unit 30 and returns to step ST 21.
  • in step ST 31, the information processing apparatus 10 determines whether the individual process mode is selected. When the individual process mode, in which the process is executed using each of the plurality of proximity detection results separately, is selected, the information processing apparatus 10 proceeds to step ST 32. When the individual process mode is not selected, the information processing apparatus 10 proceeds to step ST 33.
  • in step ST 32, the information processing apparatus 10 executes the individual process of the proximity detection results.
  • the information processing apparatus 10 executes the process using each of the proximity detection results individually and returns to step ST 21.
  • the identification display is arranged at the detection position of the manipulation object detected by the proximity detection.
  • a display size of the identification display is set according to the signal strength when the manipulation object is detected.
  • in step ST 33, the information processing apparatus 10 executes the integration process.
  • the information processing apparatus 10 executes the process using all of the proximity detection results collectively and returns to step ST 21.
  • a region surrounded by the detection position of the manipulation object detected by the proximity detection is set as a region of the identification display.
  • all of the processes from step ST 21 to step ST 33 do not need to be executed; a part of the processes may be selectively executed. For example, only the processes from step ST 26 to step ST 30 may be executed, or only the processes from step ST 26 to step ST 33 may be executed.
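The single-adoption path of the flowchart (steps ST 21 to ST 30) can be sketched as follows. The function, the `Detection` record, and the `in_region` predicate are illustrative assumptions, and the multi-proximity branch (steps ST 31 to ST 33) is omitted.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    pos: tuple        # (x, y) detection position
    area: float       # detection area of the manipulation object
    strength: float   # signal strength of the sensor signal

def select_detection(detections, area_threshold, in_region):
    """Return the adopted detection, or None when nothing qualifies."""
    # ST 21 / ST 22: invalidate objects whose area is at or above the
    # threshold (e.g. the palm or the back of a hand).
    candidates = [d for d in detections if d.area < area_threshold]
    if not candidates:
        return None
    # ST 24: only one candidate remains, so adopt it.
    if len(candidates) == 1:
        return candidates[0]
    # ST 26 / ST 27: prefer objects inside the predetermined region.
    inside = [d for d in candidates if in_region(d.pos)]
    if len(inside) == 1:
        return inside[0]                                # ST 28
    if inside:
        return max(inside, key=lambda d: d.strength)    # ST 29
    # ST 30: nothing in the region; adopt the strongest in the entire region.
    return max(candidates, key=lambda d: d.strength)
```

For example, with a palm-sized detection and two finger-sized ones, the palm is dropped by the area filter and the finger inside the predetermined region is adopted, matching the FIG. 10 and FIG. 11 scenarios.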
  • FIG. 10 illustrates an example of the signal strength when a plurality of manipulation objects are detected.
  • two fingers FG 1 and FG 2 are set as manipulation objects, the signal strength in the X-axis direction for the finger FG 1 is set as "Sx 1 ", and the signal strength in the X-axis direction for the finger FG 2 is set as "Sx 2 ".
  • the signal strength in the Y-axis direction for the finger FG 1 is set as "Sy 1 " and the signal strength in the Y-axis direction for the finger FG 2 is set as "Sy 2 ".
  • when the fingers FG 1 and FG 2 are detected and the finger FG 1 is in the predetermined region shown by a broken line, the finger FG 1 is adopted as the manipulation object, because the number of manipulation objects in the predetermined region is one.
  • FIG. 11 illustrates the case in which a grip portion FH of the hand approaches the sensor unit, when the finger FG 1 is used as the manipulation object.
  • the grip portion FH of the hand is large as compared with the finger FG 1 . Therefore, if the finger FG 1 comes close to the sensor unit, the grip portion FH also comes close to the sensor unit.
  • if the threshold value of the detection area is set to be smaller than the detection area Sfh of the grip portion FH and larger than the detection area Sfg 1 of the finger FG 1 , then even when the grip portion FH is detected as a manipulation object, the detection is invalidated. Therefore, only the finger FG 1 can be detected as the manipulation object and the grip portion FH can be prevented from being erroneously detected.
  • FIG. 12 illustrates the case in which the multi-proximity detection mode is selected.
  • FIG. 12(A) illustrates the case in which the multi-proximity detection mode is selected and the individual process mode is selected.
  • the two fingers FG 1 and FG 2 are set as the manipulation objects.
  • the display size of the identification display PA 1 corresponding to the finger FG 1 is larger than the display size of the identification display PA 2 corresponding to the finger FG 2 .
  • the proximity detection result of the manipulation object can be easily confirmed by the display of the display unit 21 .
  • FIG. 12(B) illustrates the case in which the multi-proximity detection mode is selected and the individual process mode is not selected.
  • three fingers FG 1 , FG 2 , and FG 3 are set as manipulation objects.
  • a region surrounded by positions of the fingers FG 1 to FG 3 detected by the proximity detection is displayed as identification display PA 3 .
  • the range set on the basis of the plurality of proximity detection results can be displayed and a setting state of the region can be easily confirmed.
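The two display modes of FIG. 12 can be sketched as follows. This is an illustrative assumption, not the patent's rendering code: marker size is taken as proportional to signal strength, and the integrated region is taken as the bounding box of the detection positions.

```python
def identification_display(detections, base_size=20.0, integrate=False):
    """Sketch of the identification display of FIG. 12 (illustrative names).

    `detections` is a list of ((x, y), strength) pairs."""
    if not integrate:
        # Individual process mode: one marker per detection, with the
        # display size set according to the signal strength.
        return [(pos, base_size * strength) for pos, strength in detections]
    # Integration process: a single region surrounding all detection
    # positions (here, their axis-aligned bounding box).
    xs = [pos[0] for pos, _ in detections]
    ys = [pos[1] for pos, _ in detections]
    return (min(xs), min(ys), max(xs), max(ys))
```

With two fingers at different distances, the individual mode yields two markers of different sizes (PA 1 and PA 2); with three fingers and `integrate=True`, a single surrounding region (PA 3) results.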
  • a priority order is set to the detected manipulation objects.
  • a high priority order is set to the manipulation objects positioned in the predetermined region provided in the region where the detection of the contact of the manipulation objects is enabled, or, among the manipulation objects positioned in the predetermined region, to the manipulation object whose sensor signal strength is high, and the proximity detection result is adopted on the basis of the set priority order. For this reason, even when a plurality of manipulation objects are detected, erroneous detection can be prevented by adopting the proximity detection result having the high priority order.
  • when the detection size of a manipulation object detected by the proximity detection is larger than the threshold value, the manipulation object is determined to be invalid. For this reason, when the finger comes close to the manipulatable region, even if the grip portion of the hand also comes close to the sensor unit, the grip portion can be prevented from being erroneously detected as the manipulation object.
  • the process using the plurality of proximity detection results individually or the process using the plurality of proximity detection results collectively can be executed. For this reason, identification display for each of the multiple manipulation objects can be performed, or a region surrounded by the plurality of manipulation objects can be displayed as the identification display, and various processes can be executed.
  • the case in which the identification display on the display unit 21 is switched according to whether the individual process mode is selected has been described. However, not only the identification display but also various process operations may be switched on the basis of the proximity detection results.
  • the erroneous detection of the manipulation object is prevented on the basis of the position or the detection area of the manipulation object.
  • in the third operation, the case in which the detection sensitivity of the proximity detection is controlled to prevent the erroneous detection will be described.
  • the proximity detection control unit 14 controls the detection sensitivity of the proximity detection performed by the proximity detecting unit 12 and prevents the erroneous detection.
  • the control of the detection sensitivity can be easily performed by using a register. For example, as illustrated in FIG. 13(A), an 8-bit register for sensitivity adjustment is provided; whether or not to perform the sensitivity adjustment is set by the upper 4 bits of the register, and the type of the sensitivity adjustment is set by the lower 4 bits.
  • the most significant bit of the register sets whether or not to perform the sensitivity adjustments of the sensors X 0 and Xm on the left and right end sides illustrated in FIG. 13(B).
  • the second bit sets whether or not to perform the sensitivity adjustments of the left and right sensors X 1 and Xm−1 positioned at the inner sides of the sensors X 0 and Xm.
  • the third bit sets whether or not to perform the sensitivity adjustments of the sensors Y 0 and Yn on the upper and lower end sides.
  • the fourth bit sets whether or not to perform the sensitivity adjustments of the upper and lower sensors Y 1 and Yn−1 positioned at the inner sides of the sensors Y 0 and Yn.
  • when the most significant bit is set, the sensitivity adjustments of the sensors X 0 and Xm are performed.
  • when the second bit is set, the sensitivity adjustments of the sensors X 1 and Xm−1 are performed.
  • when the third bit is set, the sensitivity adjustments of the sensors Y 0 and Yn are performed.
  • when the fourth bit is set, the sensitivity adjustments of the sensors Y 1 and Yn−1 are performed.
  • the sensitivity adjustment of the proximity detection can be performed by adjusting gain of a sensor signal or a level of a threshold value, when the detection of the manipulation object is performed by comparing the sensor signal and the threshold value.
  • FIG. 13(B) illustrates the case in which the sensitivity adjustment register is set as "0xE4".
  • the sensitivity adjustment is performed when bits are set as “1” and the sensitivity adjustment is not performed when bits are set as “0”.
  • in the type setting by the lower bits, when a bit is set as "1", the sensitivity is decreased to "50%", and when a bit is set as "0", the sensitivity is decreased to "0%", that is, the proximity detection is not performed.
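Decoding the sensitivity-adjustment register can be sketched as follows. The pairing of each upper enable bit with the corresponding lower type bit, and the 50%/0% sensitivity levels, are assumptions based on the description above, not details confirmed by FIG. 13.

```python
def decode_sensitivity_register(value):
    """Decode an 8-bit sensitivity-adjustment register (illustrative sketch).

    Upper 4 bits: whether to perform the adjustment for each sensor group
    (MSB = X0/Xm, then X1/Xm-1, Y0/Yn, Y1/Yn-1).
    Lower 4 bits (assumed pairing): type of adjustment, 1 -> 50%, 0 -> 0%.
    Returns {sensor group: sensitivity factor}, 1.0 meaning unadjusted."""
    groups = ["X0/Xm", "X1/Xm-1", "Y0/Yn", "Y1/Yn-1"]
    result = {}
    for i, name in enumerate(groups):
        enable = (value >> (7 - i)) & 1   # upper nibble: perform adjustment?
        level = (value >> (3 - i)) & 1    # lower nibble: adjustment type
        if enable:
            result[name] = 0.5 if level else 0.0  # 50% or 0% (no detection)
        else:
            result[name] = 1.0
    return result
```

For the register value 0xE4 (binary 1110 0100), the three enabled groups get their assumed 0%/50%/0% levels and the fourth group is left unadjusted.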
  • the sensitivity adjustment is not limited to the case in which the sensitivity adjustment is performed using the register. For example, when a sensor signal of each sensor is read from the sensor unit 11 , a table showing a relation of a read position and sensitivity may be prepared and the sensitivity adjustment may be performed on the basis of the table.
  • if the sensitivity of a region where an icon or the like is not displayed by the display unit 21 is set as "0" so that reading of the sensor signal is not performed, power consumption can be decreased. Furthermore, directivity of the detection sensitivity can be maintained according to the obtained proximity detection result. For example, if the detection sensitivity is increased in the movement direction of the finger serving as the manipulation object, the detection precision of the manipulation object can be raised.
  • a ground layer may be provided on a surface of the sensor unit 11 .
  • the ground layer may be provided in a peripheral portion of the sensor unit 11 to suppress the capacitance from being changed by a hand holding the casing.
  • a grounded metal plate or thin conductive film is used as the ground layer.
  • the series of processes described in the specification can be executed by hardware, software, or a complex configuration of the hardware and the software.
  • for example, a program in which the process sequence is recorded is installed in a memory of a computer embedded in dedicated hardware, and the program is executed.
  • alternatively, the program is installed in a general-purpose computer that can execute various processes, and the program is executed.
  • the program can be previously recorded on a hard disk and ROM (Read Only Memory) functioning as recording media.
  • the program may be temporarily or permanently stored (recorded) in removable recording media such as a flexible disc, a CD-ROM (Compact Disc Read Only Memory), an MO (Magneto optical) disc, a DVD (Digital Versatile Disc), a magnetic disc, and semiconductor memory.
  • removable recording media can be provided as so-called package software.
  • the program is installed from the removable recording media described above to the computer.
  • the program is transmitted from a download site to the computer wirelessly, or is transmitted to the computer by wire, through a network such as a LAN (Local Area Network) or the Internet.
  • the computer receives the program transmitted as described above and installs the program in recording media such as an embedded hard disk.
  • the present technology is not to be construed as limited to the embodiments described above.
  • the first to third operations may be performed individually or the first to third operations may be combined and may be performed integrally.
  • the embodiments disclose the present technology in an exemplary form, and it is apparent that those skilled in the art may find modifications and alterations of the embodiments without departing from the scope of the technology. That is, the claims need to be considered to determine the scope of the present technology.
  • the present technology may take the following configurations.
  • An information processing apparatus including:
  • a sensor unit that generates a sensor signal according to proximity and contact of a manipulation object
  • a proximity detecting unit that performs proximity detection of the manipulation object, on the basis of the sensor signal
  • a determining unit that determines validity of a proximity detection result, according to whether a detection position detected by the proximity detection is in a manipulatable region provided in a region where detection of the contact of the manipulation object is enabled.
  • the determining unit validates the proximity detection result, when the detection position is in the manipulatable region, and invalidates the proximity detection result, when the detection position is outside the manipulatable region.
  • the determining unit sets a detection invalidity flag and invalidates the proximity detection result, when the detection position is outside the manipulatable region, and releases the detection invalidity flag and validates the proximity detection result, when the detection position moves and becomes a position of a detection invalidity flag release region in the manipulatable region.
  • a display unit that performs image display
  • the manipulatable region is displayed identifiably by the display unit.
  • the determining unit sets the priority order to the detected manipulation objects and adopts the proximity detection result on the basis of the set priority order.
  • the determining unit sets the high priority order to a manipulation object positioned in a predetermined region provided in the region where the detection of the contact of the manipulation object is enabled.
  • the determining unit sets the high priority order to a manipulation object in which the signal strength of the sensor signal is high.
  • the determining unit determines the manipulation object as invalid.
  • a control unit that executes an individual process or an integration process of proximity detection results, when a plurality of manipulation objects are detected by the proximity detection.
  • a display unit that performs image display
  • the control unit performs region display corresponding to each manipulation object by the display unit, on the basis of the proximity detection results of the plurality of detected manipulation objects, and sets a display size of the region display according to the signal strength of the sensor signal, as the individual process of the proximity detection results.
  • a display unit that performs image display
  • the control unit displays a region surrounded by positions shown by the plurality of proximity detection results by the display unit, on the basis of the proximity detection results of the plurality of detected manipulation objects, as the integration process of the proximity detection results.
  • a proximity detection control unit that controls detection sensitivity of the proximity detection in the proximity detecting unit.
  • the proximity detection control unit decreases the detection sensitivity at the end sides of the region where the detection of the contact of the manipulation object is enabled, such that the detection sensitivity becomes lower than the detection sensitivity of the other portions.
  • the proximity detection control unit maintains directivity of the detection sensitivity according to the obtained proximity detection result.
  • the sensor unit is a capacitive touch panel.
  • proximity detection of a manipulation object is performed on the basis of a sensor signal generated according to proximity and contact of the manipulation object, and validity of a proximity detection result is determined according to whether a detection position detected by the proximity detection is in a manipulatable region provided in a region where detection of the contact of the manipulation object is enabled. For this reason, erroneous detection can be prevented and input manipulation can be correctly performed. Therefore, the technology is suitable for an apparatus that performs input manipulation using a touch panel.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)
US14/116,137 2011-06-16 2012-06-12 Information processing apparatus, information processing method, and program Abandoned US20140071090A1 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2011134263A JP2013003841A (ja) 2011-06-16 2011-06-16 情報処理装置と情報処理方法ならびにプログラム
JP2011-134263 2011-06-16
PCT/JP2012/064983 WO2012173106A1 (ja) 2011-06-16 2012-06-12 情報処理装置と情報処理方法ならびにプログラム

Publications (1)

Publication Number Publication Date
US20140071090A1 true US20140071090A1 (en) 2014-03-13

Family

ID=47357093

Family Applications (1)

Application Number Title Priority Date Filing Date
US14/116,137 Abandoned US20140071090A1 (en) 2011-06-16 2012-06-12 Information processing apparatus, information processing method, and program

Country Status (5)

Country Link
US (1) US20140071090A1 (ja)
EP (1) EP2722735A4 (ja)
JP (1) JP2013003841A (ja)
CN (1) CN103608755A (ja)
WO (1) WO2012173106A1 (ja)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140062958A1 (en) * 2011-06-16 2014-03-06 Sony Corporation Information processing apparatus, information processing method, and program
US20160253016A1 (en) * 2015-02-27 2016-09-01 Samsung Electronics Co., Ltd. Electronic device and method for detecting input on touch panel
US20160286518A1 (en) * 2013-11-07 2016-09-29 Beijing Zhigu Rui Tuo Tech Co., Ltd Determination of a communication object

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5542224B1 (ja) 2013-03-06 2014-07-09 パナソニック株式会社 電子機器および座標検出方法
JP2015053034A (ja) * 2013-08-07 2015-03-19 船井電機株式会社 入力装置
JP2015038695A (ja) * 2013-08-19 2015-02-26 ソニー株式会社 情報処理装置および情報処理方法
CN104423656B (zh) * 2013-08-20 2018-08-17 南京中兴新软件有限责任公司 误触摸识别方法和装置
JP5653558B2 (ja) * 2014-07-25 2015-01-14 パナソニック インテレクチュアル プロパティ コーポレーション オブアメリカPanasonic Intellectual Property Corporation of America 電子機器及び座標検出方法
CN108363507B (zh) * 2018-01-11 2020-06-05 Oppo广东移动通信有限公司 触摸屏死区的补偿方法、装置、电子设备和存储介质

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050052430A1 (en) * 2000-01-19 2005-03-10 Shahoian Erik J. Haptic interface for laptop computers and other portable devices
US20060112353A1 (en) * 2004-11-19 2006-05-25 Nintendo Co., Ltd. Image processing apparatus and storage medium storing image processing program
US20080007434A1 (en) * 2006-07-10 2008-01-10 Luben Hristov Priority and Combination Suppression Techniques (PST/CST) for a Capacitive Keyboard
US20080170042A1 (en) * 2007-01-17 2008-07-17 Samsung Electronics Co., Ltd. Touch signal recognition apparatus and method and medium for the same
US20080305836A1 (en) * 2007-06-07 2008-12-11 Young Hwan Kim Mobile terminal and method of generating key signal therein
US20090021491A1 (en) * 2006-02-23 2009-01-22 Pioneer Corporation Operation input device
US20090247234A1 (en) * 2008-03-25 2009-10-01 Lg Electronics Inc. Mobile terminal and method of displaying information therein
US20100283750A1 (en) * 2009-05-06 2010-11-11 Samsung Electronics Co., Ltd. Method for providing interface
US20100328222A1 (en) * 2009-06-26 2010-12-30 Nokia Corporation Touch input for a user
US20120105481A1 (en) * 2010-11-03 2012-05-03 Samsung Electronics Co. Ltd. Touch control method and portable terminal supporting the same

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000039964A (ja) * 1998-07-22 2000-02-08 Sharp Corp 手書き入力装置
JP2002351613A (ja) * 2001-05-24 2002-12-06 Fanuc Ltd 数値制御装置用表示装置
US8018440B2 (en) * 2005-12-30 2011-09-13 Microsoft Corporation Unintentional touch rejection
KR100826532B1 (ko) * 2006-03-28 2008-05-02 엘지전자 주식회사 이동 통신 단말기 및 그의 키 입력 검출 방법
JP2008009750A (ja) 2006-06-29 2008-01-17 Casio Comput Co Ltd タッチパネル付き液晶表示素子
JP4849042B2 (ja) 2007-09-14 2011-12-28 沖電気工業株式会社 非接触センサ
FR2925716B1 (fr) * 2007-12-19 2010-06-18 Stantum Circuit electronique d'analyse a modulation de caracteristiques de balayage pour capteur tactile multicontacts a matrice passive
US20090174679A1 (en) * 2008-01-04 2009-07-09 Wayne Carl Westerman Selective Rejection of Touch Contacts in an Edge Region of a Touch Surface
JP2009183592A (ja) * 2008-02-08 2009-08-20 Ge Medical Systems Global Technology Co Llc 操作情報入力装置および超音波撮像装置
CN101661362A (zh) * 2008-08-28 2010-03-03 比亚迪股份有限公司 一种多点触摸感应装置
TW201011605A (en) * 2008-09-01 2010-03-16 Turbotouch Technology Inc E Method capable of preventing mistakenly triggering a touch panel
US20110069021A1 (en) * 2009-06-12 2011-03-24 Hill Jared C Reducing false touchpad data by ignoring input when area gesture does not behave as predicted
JP5669169B2 (ja) * 2009-07-28 2015-02-12 Necカシオモバイルコミュニケーションズ株式会社 端末装置及びプログラム


Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140062958A1 (en) * 2011-06-16 2014-03-06 Sony Corporation Information processing apparatus, information processing method, and program
US10082912B2 (en) * 2011-06-16 2018-09-25 Sony Corporation Information processing for enhancing input manipulation operations
US20160286518A1 (en) * 2013-11-07 2016-09-29 Beijing Zhigu Rui Tuo Tech Co., Ltd Determination of a communication object
US9961661B2 (en) * 2013-11-07 2018-05-01 Beijing Zhigu Rui Tuo Tech Co., Ltd. Determination of a communication object
US20160253016A1 (en) * 2015-02-27 2016-09-01 Samsung Electronics Co., Ltd. Electronic device and method for detecting input on touch panel

Also Published As

Publication number Publication date
EP2722735A1 (en) 2014-04-23
WO2012173106A1 (ja) 2012-12-20
EP2722735A4 (en) 2015-02-25
CN103608755A (zh) 2014-02-26
JP2013003841A (ja) 2013-01-07

Similar Documents

Publication Publication Date Title
US10082912B2 (en) Information processing for enhancing input manipulation operations
US20140071090A1 (en) Information processing apparatus, information processing method, and program
JP6618122B2 (ja) 入力装置およびタッチパネルの制御方法
US8638320B2 (en) Stylus orientation detection
KR102257173B1 (ko) 감소된 기생 커패시턴스를 위한 변조된 전력 공급부
EP2538313B1 (en) Touch sensor panel
US8902191B2 (en) Proximity sensing for capacitive touch sensors
KR102040481B1 (ko) 사용자 입력의 유형들을 결정하는 시스템 및 방법
US20140043265A1 (en) System and method for detecting and interpreting on and off-screen gestures
CN102622147B (zh) 位置信息校正装置、触摸传感器以及位置信息校正方法
US20110278078A1 (en) Input device with force sensing
US20140015796A1 (en) Capacitive Sensor with Reduced Noise
CN108885513A (zh) 显示堆叠内的力感测
US20140247238A1 (en) System and method for dual mode stylus detection
US20120120019A1 (en) External input device for electrostatic capacitance-type touch panel
CN111538424B (zh) 具有改进的干扰性能的激活笔
KR20150065657A (ko) 장갑을 낀 그리고 장갑을 끼지 않은 사용자 입력에 대한 센싱 체제들을 스위칭하는 시스템 및 방법
JPWO2015182222A1 (ja) 指示体検出装置及びその信号処理方法
US9519360B2 (en) Palm rejection visualization for passive stylus
US9417782B2 (en) Portable terminal, input control program, and input control method
US20180348954A1 (en) Capacitive sensing using a phase-shifted mixing signal
JP7007984B2 (ja) 位置検出システム、位置検出方法及び位置検出装置
US11592925B1 (en) Low latency input object detection under low ground mass condition
US20230280857A1 (en) Touch sensing using polyvinylidene fluoride piezoelectric film
AU2013270546C1 (en) Methods and apparatus for capacitive sensing

Legal Events

Date Code Title Description
AS Assignment

Owner name: SONY CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:ONISHI, YUSUKE;REEL/FRAME:031558/0981

Effective date: 20131003

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION