WO2019113725A1 - Touch controller, touch device, terminal and touch control method - Google Patents


Info

Publication number: WO2019113725A1
Authority: WIPO (PCT)
Prior art keywords: touch, area, frame, edge, detecting
Application number: PCT/CN2017/115409
Other languages: English (en), Chinese (zh)
Inventor: Li Huafei (李华飞)
Original Assignee: Shenzhen Goodix Technology Co., Ltd. (深圳市汇顶科技股份有限公司)
Application filed by Shenzhen Goodix Technology Co., Ltd. (深圳市汇顶科技股份有限公司)
Priority application: PCT/CN2017/115409 (WO2019113725A1)
Priority application: CN201780078153.XA (CN110192170B)
Publication of WO2019113725A1


Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 — Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 — Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/044 — Digitisers characterised by the transducing means, by capacitive means

Definitions

  • the present application relates to the field of touch technologies, and in particular, to a touch controller, a device, a terminal, and a touch method.
  • touch technology is increasingly used in terminals with medium and large screens, such as large-screen mobile phones and tablet computers.
  • when the user holds such a terminal, part of the user's palm (for example, the web between the thumb and index finger, the "tiger's mouth") may rest on the touch panel. Ordinarily the palm can be distinguished from the user's finger by the screen data it produces (touch area and shape, etc.). However, when the palm is close to the edge of the touch panel, the contact characteristics of the palm on the touch panel become very close to those of a finger, and it is difficult to distinguish a palm touch from a finger touch. The terminal may then respond to such unintentional touch actions, causing inconvenience to the user and reducing user input efficiency.
  • some embodiments of the present application provide a touch controller, a device, a terminal, and a touch control method, to solve the problem that contact by a user's palm or the like with the edge region of the touch panel is likely to cause a false response.
  • an embodiment of the present application provides a touch controller, including a driving sensing unit and a processing unit connected to each other. The driving sensing unit is configured to connect to a touch area and a frame area of the touch panel, and to receive sensing signals of the touch area and the frame area. The processing unit is configured to determine, according to the sensing signals and a preset condition, whether an edge misoperation exists, and to mask the edge misoperation, the preset condition including at least: simultaneously touching the touch area and the frame area. Alternatively, the processing unit is configured to report the sensing signals to a main processor, and the main processor determines, according to the sensing signals and the preset condition, whether the edge misoperation exists and masks it.
  • an embodiment of the present application further provides a touch device, including a touch panel and a touch controller as described above; the touch panel includes a touch area and a frame area on the periphery of the touch area.
  • the touch area is formed with a touch detection unit for detecting touch input information.
  • the frame area is formed with a frame detection unit for detecting frame touch information; the touch detection unit and the frame detection unit are connected to the touch controller.
  • the embodiment of the present application further provides a touch terminal, including: a main processor and a touch device as described above; the main processor is connected to the touch device.
  • an embodiment of the present application further provides a touch method, applied to the touch terminal described above. The touch method includes: acquiring touch input information of the touch area and frame touch information of the frame area; detecting, according to the touch input information and the frame touch information, whether there is an edge misoperation that satisfies a preset condition, the preset condition including at least simultaneously touching the touch area and the frame area; and, if an edge misoperation is detected, masking the detected edge misoperation.
  • compared with the prior art, the embodiments of the present application detect the touch input information of the touch area of the touch panel and the frame touch information of the frame area, determine from this information whether there is a touch operation that touches the touch area and the frame area simultaneously, and treat such a simultaneous touch as an edge misoperation. Because the detection is based on the touch position of the operation (that is, on whether the touch area and the frame area are touched at the same time), there is no need to compute the touch area and shape of the operation, as existing approaches require.
  • this improves both the efficiency and the accuracy of edge-misoperation detection, and more accurate masking of edge misoperations reduces the terminal's responses to unintentional touches, thereby improving the efficiency of user input.
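The position-based rule described above can be sketched as follows. This is a hedged illustration, not the patent's implementation; the function name and data shapes are assumptions.

```python
# Sketch of the edge-misoperation rule: touches in the touch area are masked
# (not reported) when the frame (bezel) area is contacted at the same time.

def detect_and_mask(touch_points, frame_touched):
    """Return the touch points that should be reported to the host.

    touch_points  -- list of (x, y) coordinates detected in the touch area
    frame_touched -- True if the frame area reports a simultaneous contact
    """
    if not frame_touched:
        # No bezel contact: every touch is treated as intentional.
        return list(touch_points)
    # Bezel contact present: the simultaneous screen touches satisfy the
    # preset condition and are suppressed as an edge misoperation.
    return []

print(detect_and_mask([(10, 20)], frame_touched=False))  # reported as normal
print(detect_and_mask([(10, 20)], frame_touched=True))   # masked
```

Note that no shape or area computation is needed here, which is the source of the efficiency gain claimed above.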
  • in an example, the preset condition further includes: the contact area on the touch area is greater than a preset threshold. This makes it possible to accurately mask an edge misoperation that touches the touch area and the frame area simultaneously with a contact area above the threshold.
  • in an example, the driving sensing unit includes a first driving sensing unit and a second driving sensing unit, both connected to the processing unit. The first driving sensing unit is used to provide an excitation signal to the touch area and receive the sensing signal of the touch area; the second driving sensing unit is configured to provide an excitation signal to the frame area and receive the sensing signal of the frame area.
  • one of the processing unit and the main processor is configured to determine, according to the sensing signals provided by the first and second driving sensing units and the preset condition, whether an edge misoperation exists, and to mask it.
  • driving the touch area and the frame area separately with independent driving sensing units simplifies the driving control scheme.
  • the excitation signals of the first driving sensing unit and the second driving sensing unit differ in frequency; or the first driving sensing unit and the second driving sensing unit operate asynchronously. Either approach prevents the frame detecting unit and the touch detecting unit from interfering with each other.
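The two interference-avoidance options above can be illustrated as follows; the scheduling function, frequency values, and unit names are assumptions for illustration only.

```python
# Option 1: asynchronous (time-division) operation - the two driving sensing
# units are never driven in the same time slot.
def schedule_time_division(n_slots):
    """Alternate drive windows between the touch unit and the frame unit."""
    return ["touch_unit" if i % 2 == 0 else "frame_unit"
            for i in range(n_slots)]

# Option 2: frequency separation - each unit drives at its own excitation
# frequency so the other unit's receiver can filter it out. Values assumed.
TOUCH_DRIVE_HZ = 100_000
FRAME_DRIVE_HZ = 250_000
assert FRAME_DRIVE_HZ != TOUCH_DRIVE_HZ  # frequencies must differ

print(schedule_time_division(4))
# ['touch_unit', 'frame_unit', 'touch_unit', 'frame_unit']
```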
  • in an example, the frame detecting unit includes P capacitance detecting wires; the P capacitance detecting wires are arranged as rings outside the touch area, where P is a natural number greater than or equal to 1.
  • At least one of the P capacitance detecting wires simultaneously serves as a shield wire of the touch panel.
  • the touch panel structure can be simplified.
  • the capacitance detecting wire is connected to a driving channel and a sensing channel of the touch controller to form a self-capacitance sensor; or one capacitance detecting wire is connected to the driving channel and another capacitance detecting wire is connected to the sensing channel, thus forming a mutual capacitance sensor.
  • the capacitive sensor is simple in structure and easy to implement, which is conducive to cost saving.
  • in an example, the frame detecting unit includes Q touch sensors, where Q is a natural number greater than or equal to 2; the Q touch sensors are distributed at different positions of the frame area and are respectively used to detect whether those positions are touched. Thereby, multiple frame touch operations can be detected.
  • FIG. 1 is a schematic diagram showing the structure and application of a touch controller according to a first embodiment of the present application
  • FIG. 2 is a schematic structural diagram of a touch panel applied by a touch controller according to a first embodiment of the present application
  • FIG. 3 is a schematic diagram showing the structure and application of a touch device according to a second embodiment of the present application.
  • FIG. 4 is a schematic structural diagram of a self-capacitive frame detecting unit according to a second embodiment of the present application.
  • FIG. 5 is a schematic structural diagram of a mutual capacitance type frame detecting unit according to a second embodiment of the present application.
  • FIG. 6 is a schematic structural diagram of still another touch controller according to a second embodiment of the present application.
  • FIG. 7 is a schematic structural view of a second driving sensing unit for a self-capacitance sensor according to a second embodiment of the present application.
  • FIG. 8 is a schematic structural diagram of a second driving sensing unit for a self-capacitance sensor according to a second embodiment of the present application.
  • FIG. 9 is a schematic structural diagram of a second driving sensing unit for a mutual capacitance sensor according to a second embodiment of the present application.
  • FIG. 10 is a schematic structural diagram of a second driving sensing unit for a mutual capacitance sensor according to a second embodiment of the present application.
  • FIG. 11 is a schematic structural diagram of a bezel detecting unit according to a third embodiment of the present application.
  • FIG. 12 is a flowchart of a touch method according to a fifth embodiment of the present application.
  • FIG. 13 is a flowchart of a touch method according to a sixth embodiment of the present application.
  • the first embodiment of the present application relates to a touch controller applied to a touch device, such as a touch device integrated with a medium and large size touch screen.
  • the touch controller 101 includes a processing unit 1011 and a driving sensing unit 1012, and the processing unit 1011 is connected to the driving sensing unit 1012.
  • the driving sensing unit 1012 is configured to connect the touch area 1021 and the frame area 1022 of the touch panel 102 in the touch device 10.
  • a touch detection unit 1023 (not shown in FIG. 2) is formed on the touch area 1021.
  • the touch detection unit 1023 is configured to detect touch input information on the touch area 1021.
  • a frame detecting unit 1024 (not shown in FIG. 2) is disposed on the frame area 1022 for detecting the frame touch information on the frame area 1022.
  • the driving sensing unit 1012 is configured to provide an excitation signal to the touch detection unit 1023 and the frame detecting unit 1024, and receive the sensing signals of the touch area 1021 and the frame area 1022.
  • the driving sensing unit 1012 sends the received sensing signal to the processing unit 1011.
  • the processing unit 1011 is configured to receive the sensing signals from the touch area 1021 and the frame area 1022, and may also be used to mask a detected edge misoperation that meets the preset condition; that is, the processing unit 1011 acquires the touch coordinates of a touch operation from the touch input information of the touch area 1021, and suppresses those touch coordinates when it determines that the frame area 1022 is touched at the same time.
  • in other words, the processing unit 1011 obtains the touch coordinates on the touch area 1021 from the acquired touch input information and determines, from the touch input information and the frame touch information, whether the current touch is an edge misoperation. If it is, the touch operation is suppressed or masked: the obtained touch coordinates are not reported to the system, so the screen does not respond to the touch operation.
  • alternatively, the processing unit 1011 is configured to report the touch input information and the frame touch information to the main processor 11, so that the main processor 11 masks the detected edge misoperation; here the main processor 11 belongs to the touch terminal.
  • the manner of communication between the main processor 11 and the processing unit 1011 is well known to those skilled in the art and is not described here. That is, the touch controller may include a processing unit that determines and suppresses edge misoperations, or it may omit this capability and leave the determination and suppression to an external processor.
  • the driving sensing unit 1012 can be a single integrated unit that provides the excitation signals to the touch detection unit 1023 and the frame detecting unit 1024 and receives the sensing signals fed back by both. The processing unit 1011 obtains the touch input information from the sensing signal fed back by the touch detecting unit 1023, and obtains the frame touch information from the sensing signal fed back by the frame detecting unit 1024.
  • the touch input information includes, for example, touch input information of a stylus, touch input information of the user's finger, and any other touch input information that the touch detection unit 1023 can detect (for example, touch information of a palm).
  • the processing unit 1011 is specifically configured to detect, according to the touch input information on the touch area 1021 and the frame touch information of the frame area 1022, whether there is an edge misoperation that satisfies a preset condition, and mask the detected edge misoperation.
  • the edge misoperation mainly refers to a touch operation, not intended by the user, that occurs at the edge of the touch area 1021. For example, when using stylus input, the user's palm may rest on the touch area 1021; this palm touch is not intended by the user and may cause the terminal to respond incorrectly.
  • in the prior art, screen data (for example, the shape and area touched on the touch area 1021) is acquired and compared with a preset shape and area, for example the shape and area of a finger, so that screen data caused by a palm touch can be recognized. However, when the palm moves toward the edge of the touch area 1021 and stays there, its shape and area at the edge become close to those of a finger, and it is then difficult to distinguish the palm data from finger data.
  • the preset condition is, for example, simultaneous touch on the touch area and the border area.
  • specifically, the processing unit 1011 may determine from the frame touch information whether the frame area 1022 is touched. If it is, the processing unit may continue, according to the touch input information, to detect the number of touch operations on the touch area 1021. If the number of touch operations on the touch area 1021 matches the number of touched positions on the frame area 1022, all touch operations on the touch area 1021 are masked; for example, if one position on the frame area 1022 is touched and only one touch operation is detected on the touch area 1021, that touch operation is masked.
  • alternatively, the touch positions of the touch operations on the touch area 1021 can be further detected, and only those touch operations at the edge of the touch area 1021, equal in number to the touched positions on the frame area 1022, are masked. Therefore, this embodiment does not need to calculate the touch shape and area of the operations on the touch area; edge misoperations can be detected quickly according to whether the frame area 1022 is touched and, optionally, the touch positions on the touch area 1021.
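The count-matching variant above can be sketched as follows. This is an illustrative reading of the rule, with assumed names; in particular, the edge-first ordering of the touch list is an assumption made so that bezel contacts are paired with the nearest-edge touches.

```python
# Mask touches by matching their count against the number of touched
# positions on the frame area.

def mask_by_count(touch_ops, frame_touch_count):
    """touch_ops: touch operations on the touch area, assumed sorted so
    that touches closest to the edge come first."""
    if frame_touch_count == 0:
        return list(touch_ops)       # no bezel contact: nothing to mask
    if len(touch_ops) == frame_touch_count:
        return []                    # every touch pairs with a bezel contact
    # Counts differ: mask only as many edge-position touches as there are
    # bezel contacts; the remaining touches are still reported.
    return touch_ops[frame_touch_count:]

print(mask_by_count(["edge_touch"], 1))                 # []
print(mask_by_count(["edge_touch", "center_touch"], 1)) # ['center_touch']
```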
  • in an example, the edge misoperation may be further identified according to the contact area of the touch operations on the touch area 1021.
  • the processing unit 1011 can compare the contact areas of several touch operations at the edge of the touch area 1021 (for example, of two touch operations) and treat the one with the larger contact area as the edge misoperation, masking it.
  • since a finger rarely touches the touch area and the frame area at the same time, whereas a palm does so relatively often, detecting touch operations that satisfy the above preset condition identifies edge misoperations quickly and accurately.
  • the preset condition may also be that the touch area and the frame area are touched simultaneously and the contact area on the touch area is greater than a preset threshold.
  • the preset threshold is, for example, the contact area of a user's finger.
  • the difference between the size of another operating body (such as a palm) and the size of a finger can then be used to identify edge misoperations at the edge of the touch area 1021.
  • the embodiment does not specifically limit the touch body.
  • that is, the touch input information on the touch area 1021 (the screen data generated by the palm on the touch area 1021) and the frame touch information on the frame area 1022 (such as whether the frame is touched) are combined, so that screen data caused by palm touch (i.e., an edge misoperation) can be recognized more accurately.
  • the preset condition may further include detecting that the stylus is turned on or detecting the stylus input signal.
  • the present embodiment does not specifically limit the preset condition, as long as the preset condition is based on the touch input information and the frame touch information.
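The preset-condition variants discussed above (simultaneous bezel contact, contact area above a threshold, and an active stylus) can be combined as in the following sketch. The threshold value, parameter names, and the exact combination policy are assumptions for illustration.

```python
# Combined preset condition: bezel contact AND (stylus active OR the
# contact area exceeds a finger-sized threshold).

FINGER_AREA_THRESHOLD = 80.0  # assumed threshold, arbitrary area units

def is_edge_misoperation(contact_area, frame_touched, stylus_active=False):
    if not frame_touched:
        return False  # simultaneous bezel contact is always required
    if stylus_active:
        # With a stylus in use, bezel-adjacent contact is very likely a
        # resting palm, so the position condition alone suffices.
        return True
    return contact_area > FINGER_AREA_THRESHOLD

print(is_edge_misoperation(120.0, frame_touched=True))   # palm-sized: masked
print(is_edge_misoperation(40.0, frame_touched=True))    # finger-sized: kept
print(is_edge_misoperation(120.0, frame_touched=False))  # no bezel contact
```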
  • in summary, from the touch input information on the touch area of the touch panel and the frame touch information on the frame area, it can quickly be detected whether a touch operation touches the touch area and the frame area simultaneously, and such operations can be masked. Since edge misoperation is judged from the touch position (simultaneous contact with the touch area and the frame area), the computation is greatly simplified compared with detecting the shape and area of a touch operation, and detection efficiency improves. This embodiment therefore helps reduce edge false triggers and improve user input efficiency while detecting edge misoperations quickly and accurately.
  • the second embodiment of the present application relates to a touch device, for example a touch terminal having a medium or large touch screen, such as a smart phone or a tablet computer. This embodiment does not specifically limit the type of the touch terminal.
  • the touch device 10 includes the touch controller 101 described in the first embodiment and the touch panel 102; the touch controller 101 is connected to the touch panel 102.
  • the touch panel 102 includes a touch area 1021 and a frame area 1022 .
  • the touch area 1021 is formed with a touch detection unit 1023 (not shown in FIG. 2 ).
  • the touch detection unit 1023 is configured to detect the touch input information on the touch area 1021.
  • a frame detecting unit 1024 (not shown in FIG. 2) is disposed on the frame area 1022 for detecting the frame touch information on the frame area 1022.
  • for the touch controller 101, please refer to the first embodiment; details are not repeated here.
  • the touch detection unit 1023 generally adopts a matrix sensing structure.
  • the types of the sensing structures include, for example, a capacitive type, a resistive type, an ultrasonic type, and an optical type.
  • the structure of the touch detecting unit 1023 is well known to those skilled in the art and is not described in detail here.
  • the frame detecting unit 1024 is configured to detect the frame touch information of the frame area 1022.
  • the frame detecting unit 1024 can adopt a self-capacitance or mutual capacitance type sensor, and the capacitive sensor has a simple structure and is easy to implement, which is beneficial to cost saving.
  • the frame detecting unit 1024 can also reuse the sensing structure of the touch detecting unit 1023; that is, the sensing units located at the edge of the touch detecting unit 1023 can be used to detect the touch information of the frame area. This embodiment does not particularly limit the frame detecting unit 1024, as long as the frame touch information of the frame area 1022 can be detected.
  • when the frame detecting unit 1024 adopts a self-capacitance sensor, the detecting electrode of the sensor is connected to the driving channel and the sensing channel of the driving sensing unit 1012 of the touch controller 101, and the processing unit 1011 of the touch controller 101 detects the frame touch information from the amount of capacitance change on the detecting electrode.
  • when the frame detecting unit 1024 adopts a mutual capacitance sensor, the driving electrode and the sensing electrode of the sensor are connected to the driving channel and the sensing channel of the driving sensing unit 1012 respectively, and the processing unit 1011 detects the frame touch information from the amount of change of the capacitance between the driving electrode and the sensing electrode.
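The capacitance-change detection above can be sketched as a baseline comparison. The directions of change reflect general capacitive-sensing behavior (a touch adds capacitance to a self-capacitance electrode and reduces the coupling of a mutual-capacitance pair); the thresholds and units are assumed values, not from the patent.

```python
# Detect a frame touch from the change of measured capacitance relative to
# a no-touch baseline.

def frame_touched_self(measured_pf, baseline_pf, threshold_pf=0.5):
    """Self-capacitance: a touch ADDS body capacitance to the wire."""
    return (measured_pf - baseline_pf) > threshold_pf

def frame_touched_mutual(measured_pf, baseline_pf, threshold_pf=0.5):
    """Mutual capacitance: a touch shunts field lines, REDUCING coupling."""
    return (baseline_pf - measured_pf) > threshold_pf

print(frame_touched_self(11.2, 10.0))   # +1.2 pF change: touched
print(frame_touched_mutual(9.2, 10.0))  # -0.8 pF change: touched
```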
  • the frame detecting unit 1024 of the touch panel 102 includes, for example, P capacitance detecting wires 10241, where P may be 1.
  • the capacitance detecting wire 10241 is disposed outside the touch area 1021.
  • the capacitance detecting wire 10241 is connected to the driving channel and the sensing channel of the touch controller 101 to form a self-capacitance sensor; that is, the wire is connected to the driving channel and the sensing channel of the driving sensing unit 1012 (not shown in FIG. 4) to form a self-capacitance sensor.
  • the driving sensing unit 1012 detects the edge touch information on the frame area 1022 through the capacitance detecting wire 10241.
  • at least one of the capacitance detecting wires 10241 serves at the same time as a shield wire of the touch panel 102, thereby simplifying the structure.
  • in another example, the frame detecting unit 1024 includes at least two capacitance detecting wires: one is connected to the driving channel and another to the sensing channel to form a mutual capacitance sensor 1042, with the two capacitance detecting wires of the sensor 1042 arranged at a spacing.
  • the processing unit 1011 acquires the frame touch information on the frame area 1022 by detecting the amount of change of the mutual capacitance formed between the two capacitance detecting wires.
  • the frame detecting unit 1024 may also include a plurality of capacitance detecting wires (for example, two), each acting as a self-capacitance sensor; alternatively, by configuring the connections between the capacitance detecting wires and the driving sensing unit 1012, the frame detecting unit 1024 can switch between self-capacitance and mutual-capacitance modes. For example, a multiplexer can be added between the frame detecting unit 1024 and the driving sensing unit 1012, and the wires can be selectively connected to the driving channel or the sensing channel through the multiplexer.
  • in this way, the touch controller 101 can detect a touch at any position of the frame area 1022, so that frame touch detection can be realized with a small number of touch sensors.
  • the touch detection unit 1023 can adopt a structure well known to those skilled in the art, and details are not described herein again.
  • the processing unit 1011 of the touch controller 101 or the main processor 11 can determine that it is an edge misoperation; the touch operation shown by the dotted circle area can be regarded as a normal touch operation because the frame area 1022 is not touched, so an edge misoperation by the palm or the like can be quickly recognized.
  • if the two capacitance detecting wires in FIG. 5 form a mutual capacitance sensor, then as long as the touch operation shown by the circle area touches the mutual capacitance sensor on the frame area 1022, it can be determined to be an edge misoperation.
  • if the two capacitance detecting wires shown in FIG. 5 are self-capacitance sensors, then as the number of capacitance detecting wires increases, the accuracy of frame touch detection on the frame area 1022 improves.
  • the touch controller 101 includes a processing unit 1011, a first driving sensing unit 1013, and a second driving sensing unit 1014.
  • the first driving sensing unit 1013 and the second driving sensing unit 1014 are two independent driving sensing units, both connected to the processing unit 1011.
  • the first driving sensing unit 1013 is configured to provide an excitation signal to the touch detecting unit 1023 of the touch area 1021 and receive the sensing signal of the touch area 1021.
  • the second driving sensing unit 1014 is configured to provide an excitation signal to the bezel detecting unit 1024 and receive the sensing signal of the bezel area 1022.
  • FIG. 7 is a schematic diagram of a second driving sensing unit 1014 for driving a self-capacitance frame detecting unit.
  • the second driving sensing unit 1014 includes an excitation signal circuit 10141, an amplifying circuit 10142, an analog-to-digital conversion circuit 10143, and a transmit/receive (transceiving) switch 10144.
  • the processing unit 1011 is connected to the input of the excitation signal circuit 10141; the output of the excitation signal circuit 10141 is connected to the first end of the transceiving switch 10144; the second end of the transceiving switch 10144 is connected to the capacitance detecting wire 10241; the third end of the transceiving switch 10144 is connected to the input of the amplifying circuit 10142; the output of the amplifying circuit 10142 is connected to the input of the analog-to-digital conversion circuit 10143; and the output of the analog-to-digital conversion circuit 10143 is connected to the processing unit 1011.
  • the processing unit 1011 controls the excitation signal circuit 10141 to input a driving signal to the capacitance detecting wire 10241 through the transceiver switching switch 10144.
  • the electrical signal output by the capacitance detecting wire 10241 is input to the amplifying circuit 10142 through the transceiving switch 10144, amplified, then converted into a digital signal by the analog-to-digital conversion circuit 10143 and input to the processing unit 1011.
  • the processing unit 1011 calculates the capacitance change of the capacitance detecting wire 10241 from the change of the digital signal, and determines from that change whether the frame area 1022 is touched. If there is only one capacitance detecting wire 10241, the transceiving switch 10144 can be connected directly to the input of the amplifying circuit 10142.
  • referring to FIG. 7, in one example, when the frame detecting unit includes a plurality of capacitance detecting wires 10241, a multiplexing switch 10145 can be added to the second driving sensing unit 1014. The multiplexing switch 10145 is a multi-channel input/output switch through which touch detection of a plurality of channels (i.e., a plurality of self-capacitance detecting wires 10241) can be supported. In this case, as shown in FIG. 7, the transceiving switch 10144 is connected to the input of the amplifying circuit 10142 through the multiplexing switch 10145.
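The multiplexed read-out described above can be sketched as a channel scan: one excitation circuit, amplifier, and ADC are shared among the wires through the multiplexing switch. The `read_channel` callable below stands in for the whole drive, amplify, and convert path and is an assumption for illustration.

```python
# Scan several capacitance detecting wires through a shared amplifier/ADC,
# selecting one wire at a time via the multiplexing switch.

def scan_frame_wires(n_wires, read_channel):
    """Select each wire in turn through the multiplexer and sample it.

    read_channel(ch) -- drives wire `ch`, amplifies and digitizes the
                        response, and returns the ADC code (stand-in).
    """
    samples = []
    for channel in range(n_wires):
        # The multiplexer routes exactly one wire to the shared chain.
        samples.append(read_channel(channel))
    return samples

# Example with a fake ADC that returns a per-channel code:
codes = scan_frame_wires(3, read_channel=lambda ch: 1000 + ch)
print(codes)  # [1000, 1001, 1002]
```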
  • FIG. 8 is a schematic structural diagram of a second driving sensing unit 1014 for driving a self-capacitive frame detecting unit.
  • when the frame detecting unit includes a plurality of capacitance detecting wires 10241, the number of amplifying circuits 10142 and analog-to-digital conversion circuits 10143 in the second driving sensing unit 1014 is the same as the number of capacitance detecting wires 10241; that is, each capacitance detecting wire 10241 is provided with its own amplifying circuit and analog-to-digital conversion circuit.
  • each of the capacitance detecting wires 10241 is connected to its respective amplifying circuit 10142.
  • FIG. 9 is a schematic structural diagram of a second driving sensing unit based on a mutual capacitance type frame detecting unit.
  • One end of the mutual capacitance sensor 10242 is connected to the output end of the excitation signal circuit 10141, and the other end of the mutual capacitance sensor 10242 is connected to the input terminal of the amplifying circuit 10142, so that the second driving sensing unit 1014 applies a driving signal to the driving electrode of the mutual capacitance sensor 10242 and receives the sensing signal through its sensing electrode.
  • If the frame detecting unit 1024 has only one mutual capacitance sensor 10242, the mutual capacitance sensor 10242 can be directly connected to the input end of the amplifying circuit 10142.
  • When the frame detecting unit includes a plurality of mutual capacitance sensors 10242, the multiplexing switch 10145 needs to be added to the second driving sensing unit 1014.
  • The multiplexing switch 10145 is equivalent to a multi-channel input/output switch and can support touch detection of a plurality of channels (i.e., a plurality of mutual capacitance sensors 10242).
  • Each of the mutual capacitance sensors 10242 (for example, three) is connected to the input terminal of the amplifying circuit 10142 through the multiplexing switch 10145.
  • FIG. 10 shows another structural diagram of the second driving sensing unit based on a mutual capacitance type frame detecting unit.
  • In this example, the frame detecting unit 1024 includes a plurality of mutual capacitance sensors 10242.
  • The number of amplifying circuits 10142 and analog-to-digital conversion circuits 10143 in the second driving sensing unit 1014 is the same as the number of mutual capacitance sensors 10242; that is, an amplifying circuit and an analog-to-digital conversion circuit are separately provided for each mutual capacitance sensor 10242.
  • Each mutual capacitance sensor 10242 is connected to its own amplifying circuit 10142.
  • the structure of the first driving sensing unit 1013 is well known to those skilled in the art, and details are not described herein again.
  • The excitation signal frequencies of the first driving sensing unit 1013 and the second driving sensing unit 1014 are different, so that the frame detecting unit 1024 and the touch detecting unit 1023 do not interfere with each other when working simultaneously.
  • the first driving sensing unit 1013 and the second driving sensing unit 1014 can also work asynchronously, so that the frame detecting unit 1024 and the touch detecting unit 1023 do not interfere with each other.
  • Alternatively, a shielding layer or a shielding wire may be disposed between the frame detecting unit 1024 and the touch detecting unit 1023, so that the frame detecting unit 1024 and the touch detecting unit 1023 can work synchronously with the same excitation frequency without interfering with each other.
  • The capacitance detecting wires of the frame detecting unit of this embodiment are all disposed outside the touch area. Therefore, even with a small number of capacitance detecting wires, a touch at any position of the frame area can be detected.
  • The frame detecting unit can be driven either by the same driving sensing unit as the touch detecting unit or by a separate driving sensing unit (i.e., the second driving sensing unit 1014).
  • The present invention adds a frame detecting unit to the frame area of the touch panel to detect whether the frame area is touched, so that whether a touch operation is an edge misoperation can be determined according to its touch position. Compared with detecting the shape and area of the touch operation, this greatly simplifies the calculation and improves the detection efficiency.
  • the third embodiment of the present application relates to a touch device.
  • This embodiment can be used as an alternative embodiment of the second embodiment.
  • In the second embodiment, each capacitance detecting wire of the frame detecting unit is disposed on the outer side of the touch area, so that edge touches at any position of the entire frame area (for example, on the left side, the right side, the upper side, or the lower side) can be measured by the same annular capacitance detecting wire or the same set of annular capacitance detecting wires.
  • In this embodiment, the frame detecting unit includes a plurality of capacitive sensors, and each capacitive sensor detects a different position of the frame area.
  • the frame detecting unit includes Q touch sensors 10243 , and Q is a natural number greater than or equal to 2 .
  • the Q touch sensors are distributed at different positions of the bezel area 1022 and are respectively used to detect whether different positions are touched.
  • The Q touch sensors can each be connected to a driving channel and a sensing channel of the touch controller to form Q self-capacitance sensors, or a part of the Q touch sensors can be connected to driving channels and the other part to sensing channels.
  • the frame detecting unit includes four touch sensors, and the four touch sensors are respectively disposed on four sides of the top, bottom, left, and right sides of the frame area, and are respectively used to detect whether each side of the frame area is touched.
  • the fourth embodiment of the present application relates to a touch terminal, such as a tablet computer, a smart phone, a car audio, and the like.
  • the touch terminal of this embodiment includes the touch device and the main processor as described in the second or third embodiment.
  • The touch controller can be used to mask a detected edge misoperation that meets the preset condition, or the touch controller can report the touch input information and the frame touch information it obtains to the main processor, so that the main processor detects and masks the edge misoperation.
  • By detecting the frame touch information on the frame area and combining it with the touch input information on the touch area, the touch terminal of this embodiment can detect edge misoperations more quickly or more accurately, thereby improving user input efficiency.
  • the fifth embodiment of the present application relates to a touch method, which is applied to a touch terminal, such as a tablet computer, as described in the fourth embodiment.
  • the touch method includes:
  • Step 201 Acquire touch input information of the touch area and border touch information of the border area.
  • the touch input information includes, for example, touch input information of the stylus, touch input information of the user's finger, and other touch input information that can be detected (for example, touch information of the palm).
  • the frame touch information includes, for example, that the bezel area is touched or the bezel area is not touched.
  • The frame touch information may further include touch position information of the frame area (for example, whether the left side or the right side of the frame area is touched), and the number of frame touch operations (for example, 0, 1, or 2) can be obtained from the frame touch information.
  • A plurality of touch sensors may be disposed on the frame area. The touch sensors may be evenly distributed in the frame area, or distributed more densely or more sparsely according to actual needs, so that touch information at different positions of the frame area can be detected. For example, when the palm of the left hand touches the left side of the frame area of a tablet and the palm of the right hand touches the right side of the frame area, two frame touches are generated.
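A minimal sketch of how a controller might derive the number of frame touch operations from such per-sensor flags; the sensor naming and the one-operation-per-sensor simplification are assumptions for illustration:

```python
def count_frame_touches(sensor_touched):
    """Count frame touch operations from per-sensor touch flags.

    sensor_touched maps a sensor position on the frame area (e.g. 'left',
    'right', 'top', 'bottom') to whether that sensor reports a touch.
    Each touched sensor is counted as one touch operation; a real
    controller might merge adjacent touched sensors into one operation.
    """
    return sum(1 for touched in sensor_touched.values() if touched)
```

With the left palm on the left side and the right palm on the right side of a tablet's frame area, `count_frame_touches({'left': True, 'right': True, 'top': False, 'bottom': False})` yields 2.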
  • Step 202 Detect whether there is an edge erroneous operation that satisfies the preset condition according to the touch input information and the frame touch information. If an edge erroneous operation is detected, step 203 is performed; otherwise, the process returns to step 201.
  • An edge misoperation refers to a touch operation at the edge of the touch area that is not desired by the user.
  • For example, the palm of the user may rest in the touch area; the touch operation of the palm is not desired by the user and may cause the touch terminal to respond incorrectly.
  • In the prior art, screen data (for example, the shape and area touched on the touch area) is acquired, and the acquired touch shape and area are compared with a preset shape and area (such as the shape and area of a finger) to identify screen data caused by a palm touch. However, when the palm moves toward the edge of the touch area and stays at the edge of the touch area, only part of the palm remains in the touch area, which makes such shape- and area-based identification unreliable.
  • The preset condition is, for example, that the touch area and the frame area are touched simultaneously.
  • When a user normally operates a touch terminal such as a tablet, the finger has much less chance than the palm of touching the touch area and the frame area at the same time. Therefore, an operation that touches the touch area and the frame area at the same time is regarded as an edge misoperation, which reduces erroneous responses of the terminal.
  • Detecting an edge misoperation that meets the preset condition according to the touch input information and the frame touch information includes: detecting whether there is a frame touch operation on the frame area according to the frame touch information; if there is a frame touch operation, detecting an edge misoperation that satisfies the preset condition according to the touch input information; if there is no frame touch operation, it can quickly be determined that there is no edge misoperation, and the process returns to step 201.
  • Detecting the edge misoperation according to the touch input information specifically includes: obtaining the number of frame touch operations from the frame touch information, and detecting the edge misoperation according to that number. For example, when the number of frame touch operations is determined to be 1, it is determined from the touch input information whether there are multiple touch operations in the touch area. If there is only one touch operation, that touch operation may be directly determined to be an edge misoperation. If there are multiple touch operations, the touch position of each touch operation can be further analyzed according to the touch input information, thereby filtering out the one edge misoperation that touches the touch area and the frame area at the same time.
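The branching just described can be sketched as follows; the predicate for "lies at the edge of the touch area" and all names are illustrative assumptions rather than part of the claimed method:

```python
def find_edge_misoperations(touches, frame_touch_count, is_edge_position):
    """Return the touch operations judged to be edge misoperations.

    touches:           (x, y) positions currently reported in the touch area.
    frame_touch_count: number of simultaneous frame touch operations.
    is_edge_position:  predicate for positions at the edge of the touch
                       area, adjacent to the frame area.
    """
    if frame_touch_count == 0:
        return []                 # no frame touch: nothing to mask (return to step 201)
    if frame_touch_count == 1 and len(touches) == 1:
        return list(touches)      # single touch plus frame touch: mask it directly
    # Multiple touches: analyze positions to filter out the ones that
    # span the touch area and the frame area at the same time.
    return [t for t in touches if is_edge_position(t)]
```

On a hypothetical 100-unit-wide panel one could use `is_edge = lambda t: t[0] < 5 or t[0] > 95`; a palm touch at (2, 50) alongside a stylus touch at (50, 50) is then the only one masked.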
  • The preset condition may also be that the touch area and the frame area are touched simultaneously and the contact area on the touch area is greater than a preset threshold.
  • The preset threshold is set in advance according to the area of a user's finger. When a finger touches the touch area and the frame area at the same time, the contact area left on the touch area is small, so a contact area close to that of a finger can be excluded, and an edge misoperation caused by the palm touching the frame area and the touch area at the same time can be recognized more accurately.
  • In other embodiments, the difference between the size of another operating body and the size of a finger may likewise be utilized to identify an edge misoperation caused at the edge of the touch area.
  • the present embodiment does not specifically limit the touch body and the preset condition.
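The area-based variant of the preset condition can be sketched as below; the threshold value and names are assumptions, and in practice the threshold would be tuned to typical finger contact areas:

```python
FINGER_AREA_MAX = 80.0  # assumed threshold, tuned in practice to a finger-sized contact

def is_palm_edge_misoperation(contact_area, frame_touched):
    """Area-based preset condition: treat a touch as a palm edge
    misoperation only when the frame area is touched at the same time AND
    the contact area left on the touch area exceeds a finger-sized
    threshold. A finger spanning both regions leaves only a small contact
    area on the touch area, so it is not masked."""
    return frame_touched and contact_area > FINGER_AREA_MAX
```

A palm resting across the frame and touch areas (large contact area) is masked; a finger brushing the frame while touching the screen (small contact area) is kept.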
  • Step 203 Mask the detected edge misoperation.
  • If the touch controller detects the edge misoperation, the touch controller does not report the coordinates of the edge misoperation to the main processor. If the main processor detects the edge misoperation, the main processor does not report the coordinates corresponding to the edge misoperation to the application or the operating system.
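Masking therefore amounts to suppressing the coordinates of the misoperation at whichever layer detected it. A hedged sketch (the `send` callback stands in for the controller-to-processor or processor-to-OS reporting path, which the patent does not specify in code form):

```python
def report_touches(touches, masked_ops, send):
    """Forward touch coordinates to the next layer (touch controller to
    main processor, or main processor to OS/application), skipping any
    touch identified as an edge misoperation."""
    masked = set(masked_ops)
    for t in touches:
        if t not in masked:
            send(t)  # only intended touches are reported upward

# The masked palm touch at (2, 50) never reaches the upper layer:
reported = []
report_touches([(2, 50), (60, 40)], [(2, 50)], reported.append)
```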
  • By detecting the frame touch information on the frame area and combining it with the touch input information on the touch area, the touch terminal of this embodiment can detect edge misoperations more quickly or more accurately, thereby improving user input efficiency.
  • the sixth embodiment of the present application relates to a touch method, which is improved on the basis of the fifth embodiment, and further defines a detection condition of edge misoperation.
  • the touch method of this embodiment includes steps 301 to 304 .
  • Step 301 is the same as step 201, and steps 303 and 304 are the same as steps 202 and 203, and details are not described herein again.
  • Step 302 Determine whether the touch input information includes stylus input information. If the stylus input information is included, proceed to step 303. If the stylus input information is not included, return to step 301.
  • In this embodiment, contact between the palm and the touch area is often accompanied by an input operation of the stylus. Therefore, when stylus input information is detected, it indicates that the user is using the stylus; at this time, by performing steps 303 and 304, edge misoperations caused by the palm can be effectively masked. When the user is not using the stylus, steps 303 and 304 are not performed, which helps reduce system power consumption.
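The stylus check of step 302 acts as a gate in front of detection and masking. The sketch below assumes caller-supplied `has_stylus_input`, `detect`, and `mask` callbacks; none of these names come from the patent:

```python
def process_frame(touch_input, frame_info, has_stylus_input, detect, mask):
    """Steps 301-304 of the sixth embodiment: edge-misoperation detection
    and masking run only while stylus input is present, which avoids the
    extra work (and power) when no stylus is in use."""
    if not has_stylus_input(touch_input):
        return False                          # step 302: no stylus, return to step 301
    misops = detect(touch_input, frame_info)  # step 303: look for edge misoperations
    if misops:
        mask(misops)                          # step 304: shield them
    return bool(misops)
```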

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)

Abstract

The invention relates to a touch controller (101), a touch device, a touch terminal, and a touch control method. The touch controller (101) comprises a driving sensing unit (1012) and a processing unit (1011) connected to each other; the driving sensing unit (1012) is configured to be connected to a touch area (1021) and a frame area (1022) of a touch panel (102), and to receive detection signals from the touch area (1021) and the frame area (1022); the processing unit (1011) is configured to determine, according to the detection signals and preset conditions, whether an edge misoperation exists, and to mask the edge misoperation, the preset conditions at least comprising the touch area (1021) and the frame area (1022) being touched simultaneously; alternatively, the processing unit (1011) is configured to report the detection signals to a main processor (11), which determines, according to the detection signals and the preset conditions, whether the edge misoperation exists, and masks the edge misoperation. This solution can solve the prior-art problem that an erroneous response easily occurs when a user's palm or the like contacts the edge area of the touch panel (102).
PCT/CN2017/115409 2017-12-11 2017-12-11 Touch controller, device, terminal and touch control method WO2019113725A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2017/115409 WO2019113725A1 (fr) 2017-12-11 2017-12-11 Touch controller, device, terminal and touch control method
CN201780078153.XA CN110192170B (zh) 2017-12-11 2017-12-11 Touch controller, device, terminal and touch control method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2017/115409 WO2019113725A1 (fr) 2017-12-11 2017-12-11 Touch controller, device, terminal and touch control method

Publications (1)

Publication Number Publication Date
WO2019113725A1 true WO2019113725A1 (fr) 2019-06-20

Family

ID=66819843

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/115409 WO2019113725A1 (fr) 2017-12-11 2017-12-11 Touch controller, device, terminal and touch control method

Country Status (2)

Country Link
CN (1) CN110192170B (fr)
WO (1) WO2019113725A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113010042A (zh) * 2021-02-26 2021-06-22 武汉华星光电半导体显示技术有限公司 Touch display panel

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110737357A (zh) * 2019-09-02 2020-01-31 Oppo(重庆)智能科技有限公司 Touch screen point reporting method, terminal, and storage medium
CN114265518B (zh) * 2021-12-28 2023-06-27 武汉华星光电半导体显示技术有限公司 Display panel

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080308323A1 (en) * 2007-06-14 2008-12-18 Chun-Chung Huang Object Location Sensor of Touch Panel
CN101719038A (zh) * 2009-12-30 2010-06-02 友达光电股份有限公司 Touch display panel and touch substrate
CN103336637A (zh) * 2013-06-17 2013-10-02 业成光电(深圳)有限公司 Touch sensing electrode structure and touch display device
CN103914162A (zh) * 2012-12-28 2014-07-09 联想(北京)有限公司 Touch screen mistouch determination method, apparatus and electronic device
CN105786391A (zh) * 2016-03-24 2016-07-20 京东方科技集团股份有限公司 Touch control method and apparatus, and touch display device
CN106569709A (zh) * 2016-10-31 2017-04-19 努比亚技术有限公司 Apparatus and method for controlling a mobile terminal

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8446374B2 (en) * 2008-11-12 2013-05-21 Apple Inc. Detecting a palm touch on a surface
CN102955614B (zh) * 2012-11-02 2016-06-08 深圳市汇顶科技股份有限公司 Anti-interference method and system for touch detection, and touch terminal
CN103279218A (zh) * 2012-12-24 2013-09-04 李永贵 Frameless tablet
CN104375685B (zh) * 2013-08-16 2019-02-19 中兴通讯股份有限公司 Method and apparatus for optimizing edge touch control of a mobile terminal screen
WO2015125170A1 (fr) * 2014-02-18 2015-08-27 ニューコムテクノ株式会社 Designated position detection device
CN105117020A (zh) * 2015-09-23 2015-12-02 努比亚技术有限公司 Method for processing edge interaction operations, and mobile terminal
CN106648190B (zh) * 2015-10-30 2019-07-02 深圳市汇顶科技股份有限公司 Apparatus and method for preventing misoperation at the edge of a touch screen
CN106814901A (zh) * 2015-11-30 2017-06-09 小米科技有限责任公司 Touch signal response method and apparatus
JP6546111B2 (ja) * 2016-03-15 2019-07-17 アルプスアルパイン株式会社 Input device, control method therefor, and program
CN106201304A (zh) * 2016-06-23 2016-12-07 乐视控股(北京)有限公司 Method and apparatus for preventing mistouch operations
CN106406701B (zh) * 2016-09-14 2020-07-21 Tcl科技集团股份有限公司 Method and system for preventing misoperation of a touch terminal, and touch terminal
CN106775084B (zh) * 2016-12-16 2019-04-16 Oppo广东移动通信有限公司 Anti-mistouch method and apparatus for a touch screen, and mobile terminal
CN107340910B (zh) * 2017-06-26 2020-09-01 Oppo广东移动通信有限公司 Touch key response method, apparatus, storage medium and electronic device
CN107390923B (zh) * 2017-06-30 2020-05-12 Oppo广东移动通信有限公司 Screen anti-mistouch method, apparatus, storage medium and terminal

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080308323A1 (en) * 2007-06-14 2008-12-18 Chun-Chung Huang Object Location Sensor of Touch Panel
CN101719038A (zh) * 2009-12-30 2010-06-02 友达光电股份有限公司 Touch display panel and touch substrate
CN103914162A (zh) * 2012-12-28 2014-07-09 联想(北京)有限公司 Touch screen mistouch determination method, apparatus and electronic device
CN103336637A (zh) * 2013-06-17 2013-10-02 业成光电(深圳)有限公司 Touch sensing electrode structure and touch display device
CN105786391A (zh) * 2016-03-24 2016-07-20 京东方科技集团股份有限公司 Touch control method and apparatus, and touch display device
CN106569709A (zh) * 2016-10-31 2017-04-19 努比亚技术有限公司 Apparatus and method for controlling a mobile terminal

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113010042A (zh) * 2021-02-26 2021-06-22 武汉华星光电半导体显示技术有限公司 Touch display panel
CN113010042B (zh) * 2021-02-26 2023-11-28 武汉华星光电半导体显示技术有限公司 Touch display panel

Also Published As

Publication number Publication date
CN110192170A (zh) 2019-08-30
CN110192170B (zh) 2022-10-14

Similar Documents

Publication Publication Date Title
US10884550B2 (en) Method, mobile terminal and non-transitory computer-readable storage medium for response control of touch screen
US9274652B2 (en) Apparatus, method, and medium for sensing movement of fingers using multi-touch sensor array
CN106775087B (zh) Touch screen control method and apparatus for a mobile terminal, and mobile terminal
JP5476368B2 (ja) Multi-touch detection
WO2018107900A1 (fr) Method and device for preventing mistouch on a touch screen, mobile terminal, and storage medium
TWI463386B (zh) A method and an apparatus for improving noise interference of a capacitive touch device
US9778742B2 (en) Glove touch detection for touch devices
US7737955B2 (en) Electronic device and method providing a touch-based interface for a display control
US20130278547A1 (en) Electronic device
WO2019113725A1 (fr) Touch controller, device, terminal and touch control method
TW201510804A Touch panel control method
EP2955619A1 Mobile terminal and associated application control method
US20110216030A1 (en) Signal sensing structure for touch panels
TW201435691A Capacitive touch device
TWI444881B (zh) A touch device and a control method thereof, and an electronic device having the touch device
CN207571719U (zh) Touch panel, device and terminal
US9483137B2 (en) Touch mouse and input method thereof
CN104615345B (zh) Method and apparatus for automatically adjusting the position of a virtual keyboard
TWI700624B (zh) Touch center calculation method, touch system and touch device
WO2016041429A1 (fr) Method and device for avoiding touch screen key failure, and computer storage medium
KR101210991B1 (ko) Touch screen controller IC
CN106293175B (zh) Touch processor, touch device, touch system and touch method
TW201444282A (zh) Key module and signal generation method thereof
JP5610216B2 (ja) Input device and input method for electronic equipment
US9817531B2 (en) Interleaved scanning for capacitive touch sensors

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17934860

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17934860

Country of ref document: EP

Kind code of ref document: A1