WO2020239029A1 - Electronic device and data processing method (一种电子设备及数据处理方法) - Google Patents

Electronic device and data processing method

Info

Publication number
WO2020239029A1
WO2020239029A1 (application PCT/CN2020/092936)
Authority
WO
WIPO (PCT)
Prior art keywords
sensor
area
parameter
sensing
response
Prior art date
Application number
PCT/CN2020/092936
Other languages
English (en)
French (fr)
Other versions
WO2020239029A8 (zh)
Inventor
高营
程孝仁
于宙
Original Assignee
Lenovo (Beijing) Co., Ltd. (联想(北京)有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from CN201910472199.0A external-priority patent/CN110187795B/zh
Priority claimed from CN201910472214.1A external-priority patent/CN110187796B/zh
Application filed by Lenovo (Beijing) Co., Ltd. (联想(北京)有限公司)
Priority to US 17/615,544 (published as US20220236827A1)
Publication of WO2020239029A1
Publication of WO2020239029A8

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 Constructional details or arrangements
    • G06F1/1613 Constructional details or arrangements for portable computers
    • G06F1/1615 Constructional details or arrangements for portable computers with several enclosures having relative motions, each enclosure supporting at least one I/O or computing function
    • G06F1/1616 Constructional details or arrangements for portable computers with several enclosures having relative motions, each enclosure supporting at least one I/O or computing function with folding flat displays, e.g. laptop computers or notebooks having a clamshell configuration, with body parts pivoting to an open position around an axis parallel to the plane they define in closed position
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 Constructional details or arrangements
    • G06F1/1613 Constructional details or arrangements for portable computers
    • G06F1/1633 Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1684 Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • G06F1/169 Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being an integrated pointing device, e.g. trackball in the palm rest area, mini-joystick integrated between keyboard keys, touch pads or touch stripes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 Constructional details or arrangements
    • G06F1/1613 Constructional details or arrangements for portable computers
    • G06F1/1633 Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1684 Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • G06F1/169 Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being an integrated pointing device, e.g. trackball in the palm rest area, mini-joystick integrated between keyboard keys, touch pads or touch stripes
    • G06F1/1692 Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being an integrated pointing device, e.g. trackball in the palm rest area, mini-joystick integrated between keyboard keys, touch pads or touch stripes the I/O peripheral being a secondary touch screen used as control interface, e.g. virtual buttons or sliders
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03547 Touch pads, in which fingers can move on a surface
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0412 Digitisers structurally integrated in a display
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/041 Indexing scheme relating to G06F3/041 - G06F3/045
    • G06F2203/04101 2.5D-digitiser, i.e. digitiser detecting the X/Y position of the input means, finger or stylus, also when it does not touch, but is proximate to the digitiser's interaction surface and also measures the distance of the input means within a short range in the Z direction, possibly with a separate measurement setup
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/041 Indexing scheme relating to G06F3/041 - G06F3/045
    • G06F2203/04106 Multi-sensing digitiser, i.e. digitiser using at least two different sensing technologies simultaneously or alternatively, e.g. for detecting pen and finger, for saving power or for improving position detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416 Control or interface arrangements specially adapted for digitisers
    • G06F3/0418 Control or interface arrangements specially adapted for digitisers for error correction or compensation, e.g. based on parallax, calibration or alignment
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/044 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means

Definitions

  • the present invention relates to the field of terminal technology, and in particular to an electronic device and a data processing method.
  • current notebook devices usually have a touch area, but the user can only perform touch operations such as typing on the keyboard or clicking and sliding on the touch area, resulting in a limited set of input methods and a poor user experience.
  • current notebooks usually provide a touchpad centered below the keyboard to realize touch functionality, but this touchpad can only implement touch control functions, so input operations receive only a single kind of response and cannot offer users a richer experience.
  • This application provides an electronic device, including:
  • the first sensor is used to collect distance sensing parameters
  • the second sensor is used to collect touch sensing parameters
  • the touch sensing area of the second sensor covers a specific area of the first surface of the electronic device; the first sensor can collect distance sensing parameters in the space above the touch sensing area.
  • the first sensor is arranged at the position of a window opened in the circuit board of the second sensor, and the first sensor collects the distance sensing parameters of the space above the touch sensing area through the window.
  • the window is arranged at the middle position of the first edge of the circuit board.
  • preferably, the above electronic device further includes:
  • a protection structure laid on the surface of the second sensor and covering the first sensor
  • the first sensor can collect the distance sensing parameter through the protection structure.
  • preferably, the above electronic device further includes:
  • the processor is configured to generate an operation instruction based on the distance sensing parameter and the touch sensing parameter, and respond to the operation instruction to execute a corresponding function.
  • the processor generates an operation instruction based on the distance sensing parameter and the touch sensing parameter, specifically:
  • if the gesture type is the first type, generating an operation instruction corresponding to the first type;
  • if the gesture type is the second type, the touch sensing parameter is processed using the distance sensing parameter, and an operation instruction is generated based on the processed touch sensing parameter.
  • This application also provides a data processing method, including:
  • the touch sensing area of the second sensor covers a specific area of the first surface of the electronic device; the first sensor can collect distance sensing parameters in the space above the touch sensing area;
  • the first sensor is arranged at a position where a window of the circuit board of the second sensor is opened;
  • generating an operation instruction based on the distance sensing parameter and the touch sensing parameter includes:
  • if the gesture type is the first type, generating an operation instruction corresponding to the first type;
  • if the gesture type is the second type, the touch sensing parameter is processed, and an operation instruction is generated based on the processed touch sensing parameter.
  • processing the touch sensing parameter includes:
  • the parameter corresponding to the position of the window opening among the touch sensing parameters is set as a target parameter.
  • the target parameter is related to the parameter corresponding to the position adjacent to the window in the touch sensing parameter.
  • in the electronic device and data processing method disclosed in the present application, a first sensor capable of collecting distance sensing parameters is provided on the electronic device, so that the first sensor can collect the distance sensing parameters in the space above the touch sensing area of the second sensor.
  • a distance sensor is used to measure the distance sensing parameters of the space above the touch sensing area, so that in addition to touch input, the electronic device can also realize input through the distance sensor, thereby enriching the input methods of the electronic device and improving the user experience.
  • This application also provides a data processing method, including:
  • the second sensor responds to the input operation in the sensing area.
  • determining the response mode of the second sensor in its sensing area based on at least the first parameter includes:
  • determining a response zone in the sensing area based on the first parameter includes:
  • determining a mapping area where the operating body is mapped onto the sensing area;
  • a response zone is determined in the sensing area.
  • determining the response zone in the sensing area based on the mapping area includes:
  • if the mapping area is greater than or equal to a preset threshold, determining the areas in the sensing area other than the mapping area as the response zone, with the mapping area configured to not respond to the input operation;
  • if the mapping area is smaller than the threshold, determining the response zone in the sensing area based on the distribution position of the mapping area in the sensing area.
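The two-branch response-zone rule above can be sketched as follows. This is an illustrative Python sketch: the coordinate layout, the threshold value, and the rule for placing the response zone when the mapping area is small are all assumptions, not specified by the application.

```python
AREA_THRESHOLD = 0.25  # fraction of the sensing area; assumed value

def determine_response_zone(sensing_w, sensing_h, mapping_rect):
    """Decide how the sensing area responds, given the operating body's
    mapping area. mapping_rect is (x, y, w, h) on the sensing area."""
    x, y, w, h = mapping_rect
    mapping_fraction = (w * h) / float(sensing_w * sensing_h)
    if mapping_fraction >= AREA_THRESHOLD:
        # Large mapping area (e.g. a resting palm): everything except the
        # mapping area responds; the mapping area ignores input.
        return {"respond": "all_except", "excluded": mapping_rect}
    # Small mapping area: place the response zone according to the
    # distribution position, here simply on the opposite half.
    if x + w / 2 < sensing_w / 2:
        zone = (sensing_w // 2, 0, sensing_w // 2, sensing_h)  # right half
    else:
        zone = (0, 0, sensing_w // 2, sensing_h)               # left half
    return {"respond": "zone", "zone": zone}
```

A controller would re-run this whenever the first sensor reports a new mapping area, so the active zone tracks the operating body.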
  • determining the response zone in the sensing area based on the mapping area includes:
  • determining the response mode of the second sensor in its sensing area based on at least the first parameter includes:
  • a response mark is output in a display area corresponding to the sensing area, and the second sensor responds to the input operation with the response mark.
  • the above method preferably, based on the first parameter, outputting a response mark in a display area corresponding to the sensing area includes:
  • determining a mapping area where the operating body is mapped onto the sensing area;
  • a response mark is output in the corresponding display area.
  • the above method preferably, based on the mapping area, outputting a response mark in the corresponding display area includes:
  • a response mark is output in a display area, where the display area is related to the relative position of the mapping area on the sensing area.
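The relative-position relationship between the sensing area and the display can be illustrated with simple coordinate scaling; the function name and coordinate conventions below are assumptions for illustration only.

```python
def response_mark_position(touch_pos, sensing_size, display_size):
    """Map a position on the sensing area to the corresponding display
    position, preserving its relative location in both axes."""
    tx, ty = touch_pos
    sw, sh = sensing_size
    dw, dh = display_size
    return (tx / sw * dw, ty / sh * dh)

# A point at the center of a 100x50 sensing area maps to the center of
# a 1920x1080 display.
```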
  • This application also provides an electronic device, including:
  • the first sensor is used to collect the first parameter
  • the second sensor has a sensing area
  • the second sensor is configured to determine the response mode of the second sensor in its sensing area based on at least the first parameter
  • the second sensor responds to the input operation in the sensing area.
  • the first sensor is arranged at a position where a window of the circuit board of the second sensor is opened, and the first sensor can collect the first parameter through the window.
  • the first sensor is a distance sensor
  • the second sensor is a touch sensor
  • the sensing area of the second sensor covers a specific area of the first surface of the electronic device; the first sensor can collect the first parameter in the space above the sensing area.
  • the window opening is arranged at the middle position of the first edge of the circuit board of the second sensor.
  • in the data processing method and electronic device disclosed in the present application, after the first parameter is received from the first sensor, the response mode of the second sensor in its sensing area is determined based on the first parameter, and the second sensor then responds to input operations in the sensing area in the determined response mode. Thus, based on changes or differences in the first parameter, the sensing area of the second sensor responds to input operations in a corresponding response mode, providing a variety of response modes for the user's input operations and significantly improving the user experience.
  • FIG. 1 is a schematic structural diagram of an electronic device provided in Embodiment 1 of this application;
  • FIGS. 2 to 3 are application example diagrams of embodiments of this application.
  • FIG. 8 is an implementation flowchart of a data processing method provided in Embodiment 2 of this application.
  • FIG. 12 is a flowchart of a data processing method provided in Embodiment 1 of this application.
  • FIG. 20 is a schematic structural diagram of an electronic device according to Embodiment 2 of the application.
  • Figures 21-26 are diagrams of other application examples of the embodiments of the application.
  • the electronic device may be a device such as a notebook or a computer with touch function.
  • the electronic device in this embodiment may include the following structure:
  • the first sensor 101 is used to collect distance sensing parameters.
  • the first sensor 101 is a distance sensor, which can collect distance sensing parameters for objects in a specific space, and these distance sensing parameters can be used to detect state parameters such as the distance, position, and posture of the object.
  • the second sensor 102 is used to collect touch sensing parameters.
  • the second sensor 102 is a touch sensor, which can collect touch sensing parameters on the touch sensing area, and the touch sensing parameters can be used to detect parameters such as the position and trajectory of the touch operation.
  • the touch sensing area of the second sensor 102 may cover a specific area of the first surface of the electronic device, and the specific area may be an area convenient for the user to perform touch operations; for example, the area below the notebook keyboard in FIG. 2 is the touch sensing area, and the user can perform touch actions in the touch sensing area to realize touch input.
  • the first sensor 101 in this embodiment is arranged at a position corresponding to the second sensor 102 on the electronic device, so that the first sensor 101 can collect the distance sensing parameters in the space above the touch sensing area.
  • the touch sensing area of the second sensor 102 is a specific area below the notebook keyboard, and the first sensor 101 can collect distance sensing parameters of objects appearing in the space above the specific area.
  • a first sensor capable of collecting distance sensing parameters is provided on the electronic device, so that the first sensor can collect the distance sensing parameters in the space above the touch sensing area of the second sensor.
  • the distance sensor is set to measure the distance sensing parameters of the space above the touch sensing area, so that in addition to touch input, the electronic device can also realize input through the distance sensor, thereby enriching the input methods of the electronic device and improving the user experience.
  • the first sensor 101 may be arranged at the position of a window 121 opened in the circuit board of the second sensor 102. As shown in FIG. 4, the window is opened at a specific position on the circuit board of the second sensor 102, and the first sensor 101 is arranged at the position of the window 121, so that the first sensor 101 can collect the distance sensing parameters of the space above the touch sensing area of the second sensor 102 through the window 121, avoiding situations in which parameter collection fails because the casing of the electronic device or other structures block the sensor.
  • the position of the window 121 may be the middle position of the first edge on the circuit board of the second sensor 102, as shown in FIG. 5.
  • the size of the window 121, such as its length and width, can be determined based on the actual size of the first sensor 101, so that the first sensor 101 can collect the distance sensing parameters of the space above the touch sensing area of the second sensor 102 through the window 121.
  • a protective structure 103 can be laid on the surface of the second sensor 102 in this embodiment. As shown in FIG. 6, the protective structure 103 covers the first sensor 101. The material of the protective structure 103 allows the signal waves with which the first sensor 101 collects the distance sensing parameters to pass through, so that the first sensor 101 can collect, through the protective structure 103, the distance sensing parameters of any object in the space above the touch sensing area of the second sensor 102.
  • the protective structure 103 may be made of a glass or mylar material, etc. that allows signal waves to be transmitted and received. Further, an ink layer can be laid on the protective structure 103; the ink layer does not affect the transmission and reception of signal waves, and its color can be made consistent with the color of the device surface on which the second sensor 102 is located, so that it does not affect the appearance of the electronic device.
  • the electronic device in this embodiment may also include the following structure, as shown in FIG. 7:
  • the processor 104 is, for example, a central processing unit (CPU). After the first sensor 101 collects the distance sensing parameters and the second sensor 102 collects the touch sensing parameters, the processor 104 may generate operation instructions based on the distance sensing parameters and the touch sensing parameters, and respond to the operation instructions to perform the corresponding functions.
  • the processor 104 may generate corresponding operation instructions based on the distance sensing parameters and touch sensing parameters, and respond to the operation instructions to provide the user with the corresponding functions, such as movie playback or shutdown.
  • when the processor 104 generates an operation instruction based on the distance sensing and touch sensing parameters, it can be specifically implemented in the following manner:
  • the gesture type of the operating body is identified.
  • the processor 104 analyzes the distance sensing parameters through a recognition algorithm to identify the gesture of the operating body and then determine its gesture type, such as a single-finger sliding gesture, a palm waving gesture, or a finger pressing gesture;
  • if the gesture type of the operating body is the first type, an operation instruction corresponding to the first type is generated. The first type can be a preset functional gesture type, such as a single-finger sliding gesture or a palm waving gesture. A gesture of the first type indicates that the operating body is performing gesture input through the distance sensor; at this time, an operation instruction corresponding to the first type is generated, such as a page-turning or shutdown instruction, to realize the page-turning or shutdown function;
  • if the gesture type of the operating body is the second type, the touch sensing parameters are processed using the distance sensing parameters, and the operation instructions are generated based on the processed touch sensing parameters.
  • the second type can be a gesture type other than the functional gesture types, such as a gesture in which the distance of the operating body is below a certain threshold, a gesture in which a finger is pressed down with the finger distance below a certain threshold, or a gesture in which the palm is pressed down. A gesture of the second type indicates that the operating body is performing a touch input operation through the touch sensor; at this time, the touch sensing parameters are processed using the distance sensing parameters, and operation instructions can then be generated based on the processed touch sensing parameters, thereby providing responsive touch functions for users.
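The two-branch dispatch described above can be sketched in Python as follows. The gesture names, instruction values, and the placeholder processing step are all invented for illustration; they are not defined by the application.

```python
# "First type": preset functional gestures mapped directly to instructions
FUNCTIONAL_GESTURES = {
    "single_finger_slide": "PAGE_TURN",
    "palm_wave": "SHUTDOWN",
}

def process_touch(touch_params, distance_params):
    """Placeholder processing step: drop readings that the distance
    sensor flags as a resting palm (assumed flag name)."""
    if distance_params.get("is_palm"):
        return None
    return touch_params

def generate_instruction(gesture_type, distance_params, touch_params):
    if gesture_type in FUNCTIONAL_GESTURES:
        # First type: gesture input through the distance sensor
        return FUNCTIONAL_GESTURES[gesture_type]
    # Second type: ordinary touch input; process the touch parameters
    # using the distance parameters, then build the instruction.
    cleaned = process_touch(touch_params, distance_params)
    return ("TOUCH", cleaned) if cleaned is not None else None
```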
  • the processor 104 uses the distance sensing parameter to process the touch sensing parameter; this refers to processing the parameters, within the touch sensing parameters, that correspond to the position of the window in which the first sensor sits. That is, the touch sensing parameters may be missing or change abruptly at the window position because the first sensor is placed in the window.
  • the processor 104 in this embodiment can use the distance sensing parameter to correct the parameters at the positions in the touch sensing area where the touch sensing parameter is missing or changes abruptly. For example, the parameter corresponding to the window position in the touch sensing parameters can be set to a target parameter; the target parameter can be a preset parameter value, or it can be related to the parameters corresponding to the positions adjacent to the window, for example, the parameters at the position of the first sensor can be set to be consistent with the parameters at the positions surrounding the first sensor. The processed touch sensing parameters are thereby obtained, improving the accuracy of the touch sensing parameters and thus achieving higher touch input accuracy.
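The target-parameter correction can be sketched as follows. The 2D grid layout, the neighbour-averaging rule, and the function names are illustrative assumptions; the application only requires that the window-position parameter be set to a target value related to adjacent positions.

```python
def correct_window_cells(grid, window_cells):
    """Replace touch readings under the window (where the first sensor
    sits, so the touch grid has a hole) with the mean of their valid
    neighbours. grid is a 2D list; window_cells is a set of (row, col)."""
    rows, cols = len(grid), len(grid[0])
    out = [row[:] for row in grid]  # corrected copy; input untouched
    for r, c in window_cells:
        neighbours = [
            grid[nr][nc]
            for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1))
            if 0 <= nr < rows and 0 <= nc < cols and (nr, nc) not in window_cells
        ]
        # Target parameter: consistent with the surrounding positions
        out[r][c] = sum(neighbours) / len(neighbours) if neighbours else 0
    return out
```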
  • when the processor 104 processes the touch sensing parameters with the distance sensing parameters, it may first perform accidental-touch rejection on the touch input based on the distance sensing parameters. For example, based on parameters such as the operating body distance and operating body area in the distance sensing parameters, it determines whether the current operating body input is a touch input. Only when it is a touch input are the touch sensing parameters corrected as in the previous solution, and corresponding operation instructions generated based on the corrected touch sensing parameters to realize the corresponding functions. If it is not a touch input, there is no need to respond to the touch sensing parameters: no operation instruction is generated, or the operation instruction is empty, and the operation of the operating body is not responded to.
  • for example, if the operating body distance in the distance sensing parameters is less than a certain threshold and the operating body area is greater than the corresponding threshold (for instance, a palm gesture is recognized from the distance sensing parameters, with the palm distance below a certain threshold and the palm area above the corresponding threshold), the touch is considered a non-input touch. In this case, the touch sensing parameter is set to 0 or empty, and it is considered that no touch input has been received (the subsequently generated operation instruction is a null instruction); the processed touch sensing parameters are thereby obtained, realizing accidental-touch rejection and achieving higher touch input accuracy.
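A minimal sketch of this accidental-touch (palm) rejection; the threshold values and parameter names are made up for illustration.

```python
DIST_MAX = 8.0    # mm: operating body closer than this may be resting; assumed
AREA_MIN = 400.0  # mm^2: contact larger than this looks like a palm; assumed

def filter_touch(distance_param, touch_param):
    """distance_param: {'distance': mm, 'area': mm^2} from the first
    sensor; touch_param: the touch reading. Returns the reading, or
    None when the contact is rejected as a resting palm."""
    if distance_param["distance"] < DIST_MAX and distance_param["area"] > AREA_MIN:
        return None  # non-input touch: treat as "no touch received"
    return touch_param
```

Downstream code would generate a null instruction when `None` comes back, matching the null-instruction behaviour described above.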
  • the data processing method provided in Embodiment 2 of this application (implementation flowchart shown in FIG. 8) can be applied to electronic devices with distance sensors and touch sensors, such as notebooks or computers, and is mainly used to implement multiple input methods for the electronic device.
  • the method in this embodiment may include the following steps:
  • Step 801 In response to the distance sensing parameter collected by the first sensor, obtain the touch sensing parameter collected by the second sensor.
  • the first sensor is a distance sensor, which can collect distance sensing parameters for objects in a specific space, and these distance sensing parameters can be used to detect state parameters such as the distance, position, and posture of the object.
  • the second sensor is a touch sensor, which can collect touch sensing parameters on the touch sensing area, and the touch sensing parameters can be used to detect parameters such as the position and trajectory of the touch operation.
  • the touch sensing area of the second sensor covers a specific area on the first surface of the electronic device, such as the C-side of a notebook.
  • the specific area may be an area convenient for the user to perform touch operations.
  • The first sensor is provided on the electronic device at a position corresponding to the second sensor, which enables the first sensor to collect distance sensing parameters in the space above the touch sensing area.
  • Step 802 Generate an operation instruction based on the distance sensing parameter and the touch sensing parameter.
  • Step 803 Respond to the operation instruction to execute the corresponding function.
  • After the touch sensing parameters are obtained, a corresponding operation instruction can be generated based on the distance sensing parameters and the touch sensing parameters, and the operation instruction is responded to so as to execute the corresponding function for the user.
  • In the data processing method of this embodiment, a first sensor capable of collecting distance sensing parameters is provided on the electronic device, so that the first sensor can collect the distance sensing parameters in the space above the touch sensing area of the second sensor; operation instructions can then be generated based on the distance sensing parameters and the touch sensing parameters, and the corresponding functions realized.
  • Because the distance sensor measures the distance sensing parameters of the space above the touch sensing area, the electronic device can realize input through the distance sensor in addition to touch input, thereby enriching the input methods of the electronic device and improving the user experience.
  • Specifically, the first sensor may be set at the position of a window opened in the circuit board of the second sensor.
  • In this case, step 802 in this embodiment generates the operation instruction based on the distance sensing parameters and the touch sensing parameters as follows: first, the gesture type of the operating body is identified based on the distance sensing parameters.
  • Specifically, the distance sensing parameters are analyzed by a recognition algorithm to identify the gesture of the operating body and determine its gesture type, such as a single-finger swipe gesture, a palm-shaking gesture, or a finger-pressing gesture;
  • If the gesture type of the operating body is the first type, an operation instruction corresponding to the first type is generated, where the first type can be a preset functional gesture type, such as a single-finger sliding gesture or a palm-shaking gesture. A first-type gesture indicates that the operating body is performing gesture input through the distance sensor; an operation instruction corresponding to the first type is then generated, such as a page-turning or shutdown instruction, to realize the page-turning or shutdown function;
  • If the gesture type of the operating body is the second type, the touch sensing parameters are processed with the distance sensing parameters, and the operation instruction is generated based on the processed touch sensing parameters.
  • The second type can be any gesture type other than the functional gesture types, such as a gesture in which the distance of the operating body is below a certain threshold, a finger pressed down with the finger distance below a certain threshold, or a palm pressed down. A second-type gesture indicates that the operating body is performing a touch input operation through the touch sensor; the touch sensing parameters are therefore processed with the distance sensing parameters, operation instructions are generated from the processed touch sensing parameters, and the touch function is realized in response to the user.
  • Here, processing the touch sensing parameters with the distance sensing parameters refers to processing the parameters, within the touch sensing parameters, that correspond to the position of the window where the first sensor is located. That is, because the first sensor is set in the window, the touch sensing parameters at the window position may be missing or may change abruptly; in this embodiment, the parameters at the missing or abrupt position in the touch sensing area are corrected with the distance sensing parameters.
  • Specifically, the parameter corresponding to the position of the window in the touch sensing parameters is set to a target parameter.
  • The target parameter can be a preset parameter value, or it can be related to the parameters corresponding to the positions adjacent to the window in the touch sensing parameters.
  • For example, the parameters at the position of the first sensor in the touch sensing parameters are set to be consistent with the parameters at the positions surrounding the first sensor; the processed touch sensing parameters are thereby obtained, improving the accuracy of the touch sensing parameters and achieving higher touch input accuracy.
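One plausible way to realize "set the window-position parameter consistent with its surroundings" is to replace each window cell of the capacitance grid with the mean of its valid neighbours. The grid layout and function name here are illustrative assumptions, not part of this application:

```python
def patch_window_cells(grid, window_cells):
    """Replace readings at the window cells with the mean of their valid
    (non-window) 4-neighbours, keeping the window position consistent
    with the parameters at its surrounding positions."""
    rows, cols = len(grid), len(grid[0])
    window = set(window_cells)
    patched = [row[:] for row in grid]          # leave the input grid intact
    for (r, c) in window:
        neighbours = [grid[nr][nc]
                      for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1))
                      if 0 <= nr < rows and 0 <= nc < cols and (nr, nc) not in window]
        if neighbours:
            patched[r][c] = sum(neighbours) / len(neighbours)
    return patched
```

A preset constant could be substituted for the neighbour mean, matching the "preset parameter value" alternative mentioned above.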
  • In addition, when the touch sensing parameters are processed with the distance sensing parameters, anti-mistouch processing may be applied to the touch input based on the distance sensing parameters.
  • For example, based on parameters such as the operating body distance and operating body area in the distance sensing parameters, it is determined whether the current input of the operating body is a touch input. Only when it is a touch input are the touch sensing parameters corrected according to the preceding solution, and corresponding operation instructions generated from the corrected parameters to realize the corresponding functions. If it is not a touch input, the touch sensing parameters need not be responded to: no operation instruction is generated, or the operation instruction is empty, and the operation of the operating body receives no response.
  • For example, if the operating body distance in the distance sensing parameters is less than a certain threshold and the area is greater than the corresponding threshold, the operating body is located in the touch sensing area and its area exceeds a certain value, as when a palm gesture is recognized from the distance sensing parameters with the palm distance below the threshold and the palm area above it; the touch is then considered a non-input touch.
  • The touch sensing parameter is set to 0 or empty, and it is considered that no touch input has been received (the subsequently generated operation command is a null command); the processed touch sensing parameters are thereby obtained, realizing anti-mistouch processing and achieving higher touch input accuracy.
  • In a specific implementation, a window is opened in the printed circuit board (PCB) of the touchpad on the C side, and the distance sensor can transmit and receive signals directly through the glass or mylar material in the window area of the touchpad.
  • To keep the appearance consistent, the window area can be painted with ink of the same color as the C surface; the touchpad therefore has a laminated structure, in which the ink material is chosen to be transparent to signal waves of the relevant frequency band.
  • However, after the window is opened in the touchpad, the touchpad has a detection dead zone.
  • Based on this, the following method of processing the touch sensing parameters can be used in this embodiment to remove the touchpad dead zone caused by the PCB window:
  • The signal of the distance sensor is used as the touchpad correction signal.
  • When the user's finger touches the area above the distance sensor and it is determined that the user is not performing gesture input at that moment, the distance sensor transmits its signal to the touchpad, and the touchpad uses the X, Y, Z data in the sensing parameters of the distance sensor to compensate for the missing data in the touch sensing parameters collected by the capacitive sensor in the touchpad, so that the signal of the entire touchpad detection area is complete and, after being transmitted to the CPU, is responded to without a dead zone.
  • The specific process is shown in Figure 11.
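A minimal sketch of this compensation step, assuming the radar reports an (x, y, z) position already expressed in touch-grid coordinates; the cell indexing convention, the contact-height threshold, and the substituted contact value are assumptions for illustration only:

```python
def compensate_dead_zone(touch_grid, window_cell, radar_xyz,
                         contact_z_mm=2.0, contact_value=1.0):
    """Fill the capacitive reading at the window cell from the radar's
    X, Y, Z data so the whole detection area responds without a dead zone."""
    x, y, z = radar_xyz
    r, c = window_cell
    patched = [row[:] for row in touch_grid]
    # Radar sees the finger over the window at contact height: report a contact.
    if (r, c) == (int(round(y)), int(round(x))) and z <= contact_z_mm:
        patched[r][c] = contact_value
    return patched
```

If the radar-reported height z is above the contact threshold, the finger is hovering, so no touch is synthesized at the window cell.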
  • In this way, the user can not only use the original touchpad functions but also realize functions such as non-contact operation and intelligent monitoring, improving the user experience.
  • Moreover, the detection position of the distance sensor better matches the user scenario, which can improve recognition efficiency and the user experience.
  • The touch sensing parameters are corrected to solve the dead-zone problem of the touchpad after the window is opened.
  • FIG. 12 is a flowchart of an implementation of a data processing method provided in the first embodiment of this application.
  • the method is suitable for use in an electronic device, such as a notebook or a computer.
  • the electronic device has a first sensor and a second sensor.
  • the first sensor and the second sensor are of different types.
  • the method in this embodiment may include the following steps:
  • Step 1201 Receive the first parameter of the first sensor.
  • the first sensor may be a distance sensor.
  • the first parameter of the first sensor may be a distance sensing parameter.
  • The distance sensor may transmit a signal wave and receive the signal reflected back after the signal wave encounters an object, thereby generating distance sensing parameters and enabling gesture input; the distance sensing parameters can be used to identify the shape, position, and posture of the object, for example the gesture type of the operating body.
  • Step 1202 Based on at least the first parameter, determine the response mode of the second sensor in its sensing area.
  • The second sensor responds to the input operation in a certain response mode in the sensing area.
  • In this embodiment, the first parameter of the first sensor is recognized, and the response mode of the second sensor in the sensing area is then determined based on the recognition result; different first parameters correspond to different response modes of the second sensor.
  • The first sensor can collect the first parameter of the space above the sensing area of the second sensor. It can thus be seen that in this embodiment the response mode of the second sensor in the sensing area is determined based on the first parameter collected by the first sensor in the space above the sensing area of the second sensor. As shown in FIG. 13, when the first parameter in the space above the sensing area differs, the response mode on the sensing area may differ.
  • The sensing area of the second sensor covers a specific area of the first surface of the electronic device, such as all or part of the area below the keyboard on the C side of a notebook. As shown in FIG. 14, the first sensor can collect the first parameter in the space above the sensing area.
  • In the data processing method provided in the first embodiment of the present application, after the first parameter of the first sensor is received, the response mode of the second sensor in its sensing area is determined based on the first parameter, so that the second sensor responds to the input operation in the sensing area in the determined response mode. It can be seen that in this embodiment, based on changes or differences in the first parameter, the sensing area of the second sensor responds to the input operation in a corresponding response mode, thereby providing a variety of response modes for the user's input operation and improving the user experience.
  • Specifically, based on the first parameter, a response zone is determined in the sensing area, so that the second sensor responds to the input operation with the response zone.
  • The areas of the sensing area other than the response zone are configured to prohibit responding to input operations. That is, in this embodiment, based on the first parameter collected by the first sensor in the space above the sensing area of the second sensor, the response zone that responds to the input operation is determined in the sensing area of the second sensor, and the other areas of the sensing area are prohibited from responding to input operations, as shown in FIG. 15.
  • Specifically, the position parameter of the operating body is first identified from the first parameter, such as the position and distance of the operating body relative to the first sensor.
  • The position parameter may also include morphological parameters of the operating body relative to the first sensor or to the sensing area of the second sensor, such as size or posture.
  • Then, the mapping area of the operating body mapped onto the sensing area is determined. Because the first sensor and the second sensor have a fixed relative positional relationship (the first sensor collects the first parameter in the space above the sensing area of the second sensor), the mapping area of the operating body can be determined in the sensing area of the second sensor based on the position parameter of the operating body, where the mapping area has an area center and an area boundary.
  • The area center corresponds to the position where the operating body maps onto the sensing area.
  • The area boundary corresponds to the shape of the operating body relative to the first sensor or relative to the sensing area.
  • For example, the area boundary is a circular boundary centered on the position where the operating body maps onto the sensing area, whose radius is either derived from the size of the operating body relative to the sensing area or is a preset radius length, as shown in Figure 16.
  • Finally, based on the mapping area, the response zone is determined in the sensing area.
  • For example, the partition corresponding to the distribution position of the mapping area in the sensing area may be determined as the response partition, as shown in FIG. 17:
  • The sensing area is divided into left, middle, and right partitions. After the mapping area of the operating body on the sensing area is determined, the distribution position of the mapping area in the sensing area is determined, and the partition containing that distribution position is determined as the response partition; the right partition in FIG. 17 is the response partition.
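The left/middle/right selection can be sketched as a simple index computation; the coordinate convention (x measured from the left edge of the pad) is an assumption for the example:

```python
def response_partition(mapping_center_x, pad_width, n_partitions=3):
    """Return the partition index (0 = leftmost) containing the mapping
    area's centre; only that partition responds to input operations."""
    idx = int(mapping_center_x / pad_width * n_partitions)
    return min(max(idx, 0), n_partitions - 1)   # clamp to valid partitions
```

With three partitions, a mapping centre in the right third of the pad selects partition 2, matching the right partition of FIG. 17.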
  • In addition, in this embodiment, the operating body can be identified to prevent false touches. For example, it is determined whether the area of the mapping area is greater than or equal to a preset threshold and the distance between the operating body and the sensing area is less than a certain threshold; if so, the operating body is considered not to be performing gesture input or touch input, and its input operation receives no response, preventing accidental touches. If the distance between the operating body and the sensing area is greater than the threshold, the operating body is performing gesture input; gesture recognition is then performed on the operating body to realize the gesture-input function.
  • In addition, in this embodiment, the response area in the sensing area can be determined based on whether the mapping area is a valid input area; for example, whether the area of the mapping area is greater than or equal to a preset threshold can be judged to determine the response zone.
  • If the area of the mapping area is greater than or equal to the preset threshold, the mapping area of the operating body on the sensing area is too large, and the operating body can be considered not to be performing touch input.
  • If, in addition, the distance between the operating body and the sensing area is less than a certain threshold, the areas of the sensing area other than the mapping area are determined as the response area, and the mapping area is configured to prohibit responding to input operations; that is, if the area of the mapping area is greater than or equal to the preset threshold, the mapping area is determined to be a false-touch area of the operating body.
  • Input operations in the mapping area then receive no response, preventing accidental touches.
  • If the area of the mapping area is less than the preset threshold, the response area is determined in the sensing area based on the distribution position of the mapping area in the sensing area; for example, among multiple pre-divided partitions, the partition closest to the distribution position of the mapping area is determined as the response partition, and the other partitions are configured not to respond to input operations.
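Combining the two branches described above, the decision can be sketched as follows; the threshold values, return labels, and the coordinate convention are illustrative assumptions:

```python
def decide_response(mapping_area_mm2, distance_mm, mapping_center_x, pad_width,
                    area_threshold_mm2=1500.0, contact_threshold_mm=20.0,
                    n_partitions=3):
    """Decide how the sensing area responds to the current mapping area."""
    if mapping_area_mm2 >= area_threshold_mm2:
        if distance_mm < contact_threshold_mm:
            # Large and close: false-touch area; everything else responds.
            return ("mask_mapping_area", None)
        # Large but far away: treat as gesture input instead of touch.
        return ("gesture_input", None)
    # Small mapping area: respond in the partition nearest its position.
    idx = int(mapping_center_x / pad_width * n_partitions)
    return ("partition", min(max(idx, 0), n_partitions - 1))
```

The first branch realizes the false-touch masking, the second hands control to gesture recognition, and the last selects a response partition by distribution position.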
  • In addition, in this embodiment, a response mark can be output in the display area at a position corresponding to the response zone.
  • The display area may be an area on the same display screen, with the display divided accordingly.
  • For example, if the right partition is the response zone, the mouse pointer is output to the right display area; alternatively, the display areas can be on different displays, such as the display area of the local display and the display area of an external display.
  • If the display area of the local display corresponds to the response zone, the mouse pointer is output on the display area of the local display.
  • In another implementation, when the response mode of the second sensor in its sensing area is determined based at least on the first parameter, step 1202 can be implemented in the following manner:
  • Based on the first parameter, a response mark is output in the display area corresponding to the sensing area, and the second sensor responds to the input operation with the response mark.
  • The display area refers to a display area on the electronic device where the second sensor is located, and is used to output display content.
  • The display area can be an area on the same display screen, with the display divided accordingly.
  • Different first parameters correspond to different display areas for outputting the response mark, such as the left area or the right area of the same display screen; or,
  • the display areas can be on different displays, such as the display area of the local display and the display area of an external display.
  • Different first parameters then correspond to different display areas for outputting the response mark.
  • For example, the response mark is displayed in the display area of the local display or the display area of the external monitor, as shown in Figure 18.
  • The response mark may be a mouse pointer or another mark.
  • Based on the first parameter, the response mark is output to the display area corresponding to the sensing area, thereby prompting the user to perform input operations for that display area.
  • Specifically, outputting the response mark in the display area corresponding to the sensing area based on the first parameter can be implemented in the following way:
  • First, the position parameter of the operating body is identified from the first parameter, such as the position and distance of the operating body relative to the first sensor; the position parameter may also include morphological parameters of the operating body relative to the first sensor or to the sensing area of the second sensor, such as size or posture.
  • Then, because the first sensor and the second sensor have a fixed relative positional relationship (the first sensor collects the first parameter in the space above the sensing area of the second sensor), the mapping area of the operating body mapped onto the sensing area is determined based on the position parameter of the operating body, where the mapping area has an area center and an area boundary.
  • The area center corresponds to the position where the operating body maps onto the sensing area, and the area boundary corresponds to the shape of the operating body relative to the first sensor or relative to the sensing area; for example, the area boundary is a circular boundary centered on the position where the operating body maps onto the sensing area, with a radius derived from the size of the operating body relative to the sensing area.
  • Finally, based on the mapping area, the response flag is output in the corresponding display area.
  • The display area associated with the mapping area is determined, and the response mark is output on the display area corresponding to the mapping area.
  • The display area is related to the relative position of the mapping area on the sensing area; that is, in this embodiment the display area that outputs the response flag is determined by the location of the mapping area on the sensing area. If the mapping area is on the left side of the sensing area, the display area is the display area of the local display; if the mapping area is on the right side of the sensing area, the display area is the display area of the external display, as shown in FIG. 19, so that the response mark corresponds to the mapping area, that is, to the operating body.
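The left/right mapping onto the local versus the external display can be sketched in one line; the return labels are illustrative, not identifiers from this application:

```python
def select_display(mapping_center_x, pad_width):
    """Left half of the sensing area drives the local display, right half the
    external display; the response mark (e.g. mouse pointer) is output there."""
    return "local_display" if mapping_center_x < pad_width / 2 else "external_display"
```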
  • FIG. 20 is a schematic structural diagram of an electronic device provided in Embodiment 2 of this application.
  • the electronic device may be a notebook or other device, which may include the following structure:
  • the first sensor 901 is used to collect a first parameter.
  • the first sensor 901 may be a distance sensor, and the first parameter is a distance sensing parameter, which can realize gesture input.
  • the second sensor 902 has a sensing area for collecting a second parameter.
  • the second sensor 902 may be a touch sensor, and the second parameter is a touch sensing parameter, which can realize touch input.
  • The second sensor 902 is configured to determine, based at least on the first parameter, the response mode of the second sensor in its sensing area, wherein the second sensor responds to input operations in the sensing area in the determined response mode.
  • In this embodiment, the first parameter of the first sensor is recognized, and the second sensor 902 determines its response mode in the sensing area based on the recognition result; different first parameters correspond to different response modes of the second sensor.
  • The first sensor can collect the first parameter of the space above the sensing area of the second sensor. It can thus be seen that in this embodiment the response mode of the second sensor in the sensing area is determined based on the first parameter collected by the first sensor in the space above the sensing area of the second sensor; when the first parameter in the space above the sensing area differs, the response mode on the sensing area may differ.
  • The sensing area of the second sensor covers a specific area of the first surface of the electronic device, such as all or part of the area below the keyboard on the C side of a notebook.
  • The first sensor can collect the first parameter in the space above the sensing area.
  • In the electronic device provided in the second embodiment of the present application, after the first parameter of the first sensor is received, the response mode of the second sensor in its sensing area is determined based on the first parameter, so that the second sensor responds to the input operation in the sensing area in the determined response mode. It can be seen that in this embodiment, based on changes or differences in the first parameter, the sensing area of the second sensor responds to the input operation in a corresponding response mode, thereby providing a variety of response modes for the user's input operation and improving the user experience.
  • Specifically, the first sensor 901 may be arranged at the position of a window 921 in the circuit board of the second sensor 902. As shown in FIG. 21, the first sensor 901 can collect the first parameter through the window 921. Specifically, the window 921 may be arranged at the middle position of a first edge of the circuit board of the second sensor 902. As shown in Figure 22, the area below the C-side keyboard of the notebook is the sensing area of the touch sensor, and the window 921 is set at the center of the sensing area near the edge of the C-side keyboard, so that the distance sensor can collect distance sensing parameters in the space above the sensing area.
  • Based on this, the touch sensor determines the response mode of the sensing area according to the distance sensing parameters collected by the distance sensor in the space above the sensing area, for example by outputting a response mark on the corresponding display area (the local display or an external display) to prompt the user to input, or by responding to input operations in different response zones of the sensing area, thereby providing the user with different input experiences.
  • In addition, the electronic device in this embodiment can also be configured with a processor that generates operation instructions based on the first parameter and the second parameter of the second sensor, in order to perform the corresponding functions.
  • Specifically, the processor may recognize the gesture type of the operating body based on the distance sensing parameters: if the gesture type is the first type, an operation instruction corresponding to the first type is generated, realizing gesture input; if the gesture type is the second type, the touch sensing parameters are processed with the distance sensing parameters, and operation instructions are generated based on the processed touch sensing parameters, realizing touch input.
  • Among them, false-touch-prevention recognition can first be performed based on the distance sensing parameters.
  • In addition, the processor may process the touch sensing parameters with the distance sensing parameters, for example by setting the parameter corresponding to the position of the window in the touch sensing parameters to a target parameter, where the target parameter is related to the parameters corresponding to the positions adjacent to the window in the touch sensing parameters.
  • In this embodiment, it is proposed to integrate the notebook touchpad with a distance sensor.
  • In this way, the user can use the original touchpad functions on the touchpad.
  • Non-contact operations, such as gesture input, can also be realized through the touchpad, enabling functions such as intelligently turning off the screen or quick operations.
  • The specific method is as follows:
  • The Radar sensor can directly transmit and receive signals through the glass or mylar material in the window area of the touchpad (here the touchpad is a larger one; for example, the entire keyboard area is a touchpad with an integrated Radar sensor).
  • Radar is used to detect the distance L and the orientation a of the user's hand, and the mapping area where the hand falls on, or maps onto, the touchpad is obtained through calculation and prediction.
  • Specifically, the center position Le of the mapping area is calculated, and the touchpad area R(Le, r) is defined as the area where the palm will fall, that is, the mapping area.
  • Here r can be a preset value or a value calculated from the shape parameters of the user's hand detected by Radar.
  • The MCU (Microcontroller Unit) of the touchpad sets the calculated R(Le, r) area in advance as an anti-mistouch area; the set area is then an invalid area.
  • The other areas R~(Le, r) of the sensing area are the effective response areas.
  • The Radar sensor detects the movement state of the hand in real time and adjusts the R(Le, r) and R~(Le, r) areas in real time, realizing intelligent dynamic switching and improving the user experience, as shown in Figure 24.
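The geometry above can be sketched as follows. Projecting range L and azimuth a onto the pad plane is a simplification of whatever calculation and prediction the MCU actually performs; the coordinate conventions and function names are assumptions:

```python
import math

def mapping_center(distance_l, azimuth_a_deg, sensor_xy=(0.0, 0.0)):
    """Project the radar range L and azimuth a of the hand onto the
    touchpad plane to estimate the mapping-area centre Le."""
    a = math.radians(azimuth_a_deg)
    sx, sy = sensor_xy
    return (sx + distance_l * math.cos(a), sy + distance_l * math.sin(a))

def split_regions(cells, center, r):
    """R(Le, r): anti-mistouch cells within radius r of Le.
    R~(Le, r): the remaining cells, kept as the effective response area."""
    cx, cy = center
    inside = {c for c in cells if math.hypot(c[0] - cx, c[1] - cy) <= r}
    return inside, set(cells) - inside
```

Re-running both functions on each radar frame gives the real-time adjustment of R(Le, r) and R~(Le, r) described above.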
  • In addition, the touchpad in the electronic device has a partition mode that allows users to predefine touchpad partitions, such as left and right areas or three areas, so that the user can divide the touchpad into different functional areas.
  • When Radar detects that the user's hand falls into one of these areas, the enabled area of the touchpad is switched intelligently.
  • For example, when the user connects an external display, the right half of the touchpad is set as the controller of the external display and the left half as the controller of the local display.
  • When the hand falls on the left half, the cursor is automatically adjusted to control the local display; otherwise, the cursor is automatically adjusted to control the external display.
  • Therefore, in this embodiment, the operation interface and the response area are dynamically switched by detecting the gesture input of the user's hand, as shown in FIG. 25.
  • In a specific implementation, the touchpad MCU integrates the Radar sensor.
  • The distance sensing signal collected by the Radar sensor is processed to obtain L and a.
  • The touchpad can use the X and Y data of its capacitive sensor to obtain R(Le, r) after calculation.
  • The touchpad then dynamically adjusts the working area R~(Le, r) and the anti-mistouch area R(Le, r).
  • The Radar sensor detects the user's hand gesture input and dynamically adjusts the control interface and the corresponding response area.
  • It can be seen that the integrated Radar sensor provided in this embodiment can dynamically adjust the touchpad response mode, such as mouse display and determination of the response zone, by predicting the position and other gestures of the user's hand.
  • Non-contact detection can be used to improve the prevention of misoperation and the user experience.
  • The touchpad is dynamically partitioned, the touchpad response area is dynamically adjusted, and the control interface is dynamically and intelligently switched, enhancing product competitiveness.
  • the steps of the method or algorithm described in the embodiments disclosed in this document can be directly implemented by hardware, a software module executed by a processor, or a combination of the two.
  • The software module can be placed in random access memory (RAM), internal memory, read-only memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the technical field.


Abstract

An electronic device and a data processing method. The electronic device includes: a first sensor (101) for collecting distance sensing parameters; and a second sensor (102) for collecting touch sensing parameters; wherein the touch sensing area of the second sensor (102) covers a specific area of a first surface of the electronic device, and the first sensor (101) is capable of collecting distance sensing parameters in the space above the touch sensing area. By providing a distance sensor to collect the distance sensing parameters of the space above the touch sensing area, the electronic device can realize input through the distance sensor in addition to touch input, thereby enriching the input methods of the electronic device and improving the user experience.

Description

An Electronic Device and Data Processing Method. Technical Field
The present invention relates to the field of terminal technology, and in particular to an electronic device and a data processing method.
Background Art
Current notebook devices are usually provided with a touch area, but users can only perform touch operations by typing on the keyboard or by tapping or sliding within the touch area, which results in a single input mode and a poor user experience. For example, current notebooks usually place a touch panel centrally below the keyboard to implement the touchpad function, but this touch panel can only provide touch control, so the device responds to input operations in only one way and cannot give the user a richer experience.
Summary of the Invention
The present application provides an electronic device, including:
a first sensor, configured to collect a distance sensing parameter;
a second sensor, configured to collect a touch sensing parameter;
wherein the touch sensing region of the second sensor covers a specific area of a first surface of the electronic device; the first sensor is capable of collecting the distance sensing parameter in the space above the touch sensing region.
In the above electronic device, preferably, the first sensor is arranged at a window opened in the circuit board of the second sensor, and the first sensor collects the distance sensing parameter of the space above the touch sensing region through the window.
In the above electronic device, preferably, the window is arranged at the middle of a first edge of the circuit board.
In the above electronic device, preferably, the device further includes:
a protective structure, laid over the surface of the second sensor and covering the first sensor;
wherein the first sensor is capable of collecting the distance sensing parameter through the protective structure.
In the above electronic device, preferably, the device further includes:
a processor, configured to generate an operation instruction based on the distance sensing parameter and the touch sensing parameter, and to respond to the operation instruction so as to perform the corresponding function.
In the above electronic device, preferably, the processor generates the operation instruction based on the distance sensing parameter and the touch sensing parameter specifically by:
identifying the gesture type of an operating body based on the distance sensing parameter;
if the gesture type is a first type, generating an operation instruction corresponding to the first type;
if the gesture type is a second type, processing the touch sensing parameter with the distance sensing parameter, and generating an operation instruction based on the processed touch sensing parameter.
The present application also provides a data processing method, including:
in response to a distance sensing parameter collected by a first sensor, obtaining a touch sensing parameter collected by a second sensor;
wherein the touch sensing region of the second sensor covers a specific area of a first surface of the electronic device; the first sensor is capable of collecting the distance sensing parameter in the space above the touch sensing region;
generating an operation instruction based on the distance sensing parameter and the touch sensing parameter;
responding to the operation instruction so as to perform the corresponding function.
In the above method, preferably, the first sensor is arranged at a window opened in the circuit board of the second sensor;
wherein generating the operation instruction based on the distance sensing parameter and the touch sensing parameter includes:
identifying the gesture type of an operating body based on the distance sensing parameter;
if the gesture type is a first type, generating an operation instruction corresponding to the first type;
if the gesture type is a second type, processing the touch sensing parameter and generating an operation instruction based on the processed touch sensing parameter.
In the above method, preferably, processing the touch sensing parameter includes:
setting the parameter in the touch sensing parameter that corresponds to the position of the window to a target parameter.
In the above method, preferably, the target parameter is related to the parameters in the touch sensing parameter that correspond to positions adjacent to the window.
It can be seen from the above technical solutions that, in the electronic device and data processing method disclosed in the present application, a first sensor capable of collecting a distance sensing parameter is provided on the electronic device, so that the first sensor can collect the distance sensing parameter in the space above the touch sensing region of the second sensor. Thus, by providing a distance sensor to collect the distance sensing parameter of the space above the touch sensing region, the electronic device supports input through the distance sensor in addition to touch input, which enriches the input modes of the electronic device and improves the user experience.
The present application further provides a data processing method, including:
receiving a first parameter from a first sensor;
determining, based at least on the first parameter, a response mode of a second sensor in its sensing region;
wherein the second sensor responds to input operations in the sensing region.
In the above method, preferably, determining the response mode of the second sensor in its sensing region based at least on the first parameter includes:
determining, based on the first parameter, a response partition in the sensing region, the second sensor responding to the input operation with the response partition;
wherein the regions of the sensing region other than the response partition are configured not to respond to the input operation.
In the above method, preferably, determining the response partition in the sensing region based on the first parameter includes:
identifying a position parameter of an operating body in the first parameter;
determining, based on the position parameter, a mapping region in which the operating body is mapped onto the sensing region;
determining the response partition in the sensing region based on the mapping region.
In the above method, preferably, determining the response partition in the sensing region based on the mapping region includes:
if the area of the mapping region is greater than or equal to a preset threshold, determining the regions of the sensing region other than the mapping region as the response partition, the mapping region being configured not to respond to the input operation;
if the area of the mapping region is smaller than the threshold, determining the response partition in the sensing region based on the distribution position of the mapping region within the sensing region.
In the above method, preferably, determining the response partition in the sensing region based on the mapping region includes:
determining, among multiple partitions of the sensing region, the partition corresponding to the distribution position of the mapping region within the sensing region as the response partition;
wherein the partitions of the sensing region other than the response partition are configured not to respond to the input operation.
In the above method, preferably, determining the response mode of the second sensor in its sensing region based at least on the first parameter includes:
outputting, based on the first parameter, a response flag in the display region corresponding to the sensing region, the second sensor responding to the input operation with the response flag.
In the above method, preferably, outputting the response flag in the display region corresponding to the sensing region based on the first parameter includes:
identifying a position parameter of an operating body in the first parameter;
determining, based on the position parameter, a mapping region in which the operating body is mapped onto the sensing region;
outputting the response flag in the corresponding display region based on the mapping region.
In the above method, preferably, outputting the response flag in the corresponding display region based on the mapping region includes:
determining the display region associated with the mapping region;
outputting the response flag on the display region, the display region being related to the relative position of the mapping region on the sensing region.
The present application also provides an electronic device, including:
a first sensor, configured to collect a first parameter;
a second sensor having a sensing region;
wherein the second sensor is configured to determine, based at least on the first parameter, the response mode of the second sensor in its sensing region;
wherein the second sensor responds to input operations in the sensing region.
In the above electronic device, preferably, the first sensor is arranged at a window opened in the circuit board of the second sensor, and the first sensor is capable of collecting the first parameter through the window.
In the above electronic device, preferably, the first sensor is a distance sensor and the second sensor is a touch sensor;
wherein the sensing region of the second sensor covers a specific area of a first surface of the electronic device; the first sensor is capable of collecting the first parameter in the space above the sensing region.
In the above electronic device, preferably, the window is arranged at the middle of a first edge of the circuit board of the second sensor.
It can be seen from the above technical solutions that, in the data processing method and electronic device disclosed in the present application, after the first parameter of the first sensor is received, the response mode of the second sensor in its sensing region is determined based on the first parameter, so that the second sensor responds to input operations in the sensing region with the determined response mode. Thus, according to changes or differences in the first parameter, input operations are responded to in the sensing region of the second sensor with a corresponding response mode, thereby providing multiple response modes for the user's input operations and markedly improving the user experience.
Brief Description of the Drawings
In order to explain the technical solutions in the embodiments of the present invention or in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present invention, and those of ordinary skill in the art can obtain other drawings from them without creative effort.
FIG. 1 is a schematic structural diagram of an electronic device provided in Embodiment One of the present application;
FIG. 2 and FIG. 3 are application example diagrams of embodiments of the present application;
FIG. 4 to FIG. 7 are other schematic structural diagrams of Embodiment One of the present application;
FIG. 8 is an implementation flowchart of a data processing method provided in Embodiment Two of the present application;
FIG. 9 to FIG. 11 are other example diagrams of embodiments of the present application in practical use;
FIG. 12 is a flowchart of a data processing method provided in Embodiment One of the present application;
FIG. 13 to FIG. 19 are application example diagrams of embodiments of the present application;
FIG. 20 is a schematic structural diagram of an electronic device provided in Embodiment Two of the present application; and
FIG. 21 to FIG. 26 are other application example diagrams of embodiments of the present application.
Detailed Description of the Embodiments
The technical solutions in the embodiments of the present invention will be described below clearly and completely with reference to the drawings in the embodiments of the present invention. Obviously, the described embodiments are only some of the embodiments of the present invention, not all of them. Based on the embodiments of the present invention, all other embodiments obtained by those of ordinary skill in the art without creative effort fall within the protection scope of the present invention.
Referring to FIG. 1, which is a schematic structural diagram of an electronic device provided in Embodiment One of the present application, the electronic device may be a device with a touch function, such as a notebook or a computer.
Specifically, the electronic device in this embodiment may include the following structures:
a first sensor 101, configured to collect a distance sensing parameter.
The first sensor 101 is a distance sensor, which can collect distance sensing parameters for objects in a specific space; these distance sensing parameters can be used to detect state parameters of an object such as its distance, position, and posture.
A second sensor 102, configured to collect a touch sensing parameter.
The second sensor 102 is a touch sensor, which can collect touch sensing parameters on a touch sensing region; the touch sensing parameters can be used to detect parameters such as the position and trajectory of a touch operation.
In this embodiment, the touch sensing region of the second sensor 102 may cover a specific area of a first surface of the electronic device, and this specific area may be an area convenient for the user to perform touch operations. As shown in FIG. 2, the area below the notebook keyboard is the touch sensing region, in which the user can perform touch actions to achieve touch input.
On this basis, the first sensor 101 in this embodiment is arranged at a position on the electronic device corresponding to the second sensor 102, so that the first sensor 101 can collect distance sensing parameters in the space above the touch sensing region. As shown in FIG. 3, the touch sensing region of the second sensor 102 is the specific area below the notebook keyboard, and the first sensor 101 can collect distance sensing parameters of objects appearing in the space above this specific area.
It can be seen from the above technical solutions that, in the electronic device of Embodiment One of the present application, a first sensor capable of collecting a distance sensing parameter is provided on the electronic device, so that the first sensor can collect the distance sensing parameter in the space above the touch sensing region of the second sensor. Thus, by providing a distance sensor to collect the distance sensing parameter of the space above the touch sensing region, the electronic device supports input through the distance sensor in addition to touch input, which enriches the input modes of the electronic device and improves the user experience.
In one implementation, the first sensor 101 may be arranged at a window 121 opened in the circuit board of the second sensor 102. As shown in FIG. 4, a window is opened at a specific position in the circuit board of the second sensor 102, and the first sensor 101 is arranged at the window 121, so that the first sensor 101 can collect distance sensing parameters of the space above the touch sensing region of the second sensor 102 through the window 121 without parameter collection being blocked by the housing or other structures of the electronic device.
Specifically, the window 121 may be located at the middle of a first edge of the circuit board of the second sensor 102, as shown in FIG. 5.
The dimensions of the window 121, such as its length and width, may be determined according to the actual dimensions of the first sensor 101, so that the first sensor 101 can collect the distance sensing parameters of the space above the touch sensing region of the second sensor 102 through the window 121.
In addition, in order to protect the first sensor 101 and the second sensor 102, in this embodiment a protective structure 103 may be laid over the surface of the second sensor 102, as shown in FIG. 6. The protective structure 103 covers the first sensor 101, and the material of the protective structure 103 allows the signal waves used by the first sensor 101 for collecting distance sensing parameters to pass through, so that the first sensor 101 can collect, through the protective structure 103, the distance sensing parameters of any object appearing in the space above the touch sensing region of the second sensor 102.
Specifically, the protective structure 103 may be made of glass or mylar, allowing signal waves to be transmitted and received. Further, an ink layer may be laid on the protective structure 103; the ink layer does not affect the transmission and reception of signal waves, and its color may match the color of the device surface on which the second sensor 102 is located, so as not to affect the appearance of the electronic device.
In one implementation, the electronic device in this embodiment may further include the following structure, as shown in FIG. 7:
a processor 104, such as a central processing unit (CPU). After the first sensor 101 collects the distance sensing parameter and the second sensor 102 collects the touch sensing parameter, the processor 104 may generate an operation instruction based on the distance sensing parameter and the touch sensing parameter, and respond to the operation instruction to perform the corresponding function.
That is to say, after the user performs input operations through the first sensor 101 and the second sensor 102, the processor 104 may generate a corresponding operation instruction based on the distance sensing parameter and the touch sensing parameter, and after responding to the operation instruction provide the corresponding function for the user, such as selecting a movie to play or shutting down.
Specifically, in this embodiment, when the processor 104 generates the operation instruction based on the distance sensing parameter and the touch sensing parameter, it may do so in the following way:
first, the gesture type of the operating body is identified based on the distance sensing parameter; for example, the processor 104 parses the distance sensing parameter with a recognition algorithm to identify the posture of the operating body and thereby determine its gesture type, such as a single-finger sliding gesture, a palm-waving gesture, or a finger-pressing gesture;
if the gesture type of the operating body is a first type, an operation instruction corresponding to the first type is generated. The first type may be a preset functional gesture type, such as a single-finger sliding gesture or a palm-waving gesture. If the gesture type of the operating body is the first type, it indicates that the operating body is performing gesture input through the distance sensor, and at this time an operation instruction corresponding to the first type, such as a page-turning or shutdown instruction, is generated to implement the page-turning or shutdown function;
if the gesture type of the operating body is a second type, the touch sensing parameter is processed with the distance sensing parameter, and an operation instruction is generated based on the processed touch sensing parameter. The second type may be a gesture type other than the functional gesture types, such as a gesture in which the distance value of the operating body is below a certain threshold, a finger-pressing gesture in which the finger distance is below a certain threshold, or a palm-pressing gesture. If the gesture type of the operating body is the second type, it indicates that the operating body is performing touch input through the touch sensor; at this time, the touch sensing parameter is processed with the distance sensing parameter, and an operation instruction can then be generated based on the processed touch sensing parameter, thereby providing the corresponding touch function for the user.
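To make the two-branch decision concrete, here is a minimal Python sketch of the gesture-type dispatch described above. The gesture names, the distance threshold, and the command mapping are illustrative assumptions, not values taken from the disclosure:

```python
# Illustrative sketch of the first-type / second-type gesture dispatch.
# The gesture catalogue and threshold below are hypothetical examples.

FUNCTIONAL_GESTURES = {          # first type: contact-free functional gestures
    "one_finger_swipe": "page_turn",
    "palm_wave": "shutdown",
}
TOUCH_DISTANCE_THRESHOLD = 5.0   # mm; closer than this counts as touch input

def dispatch(gesture, distance_mm):
    """Return the operation for a first-type gesture, the marker
    'process_touch' for a second-type gesture (touch input whose touch
    sensing parameters still need correction), or None when neither applies."""
    if gesture in FUNCTIONAL_GESTURES:
        return FUNCTIONAL_GESTURES[gesture]
    if distance_mm < TOUCH_DISTANCE_THRESHOLD:
        return "process_touch"
    return None
```

A real implementation would first classify the gesture from the raw distance sensing parameters; here the classified label is taken as given.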
Specifically, in this embodiment, the processor 104 processing the touch sensing parameter with the distance sensing parameter means processing the parameters in the touch sensing parameter that correspond to the position of the window where the first sensor is located. That is, the touch sensing parameter may contain missing or abruptly changed values at the window position caused by the placement of the first sensor in the window. In this case, the processor 104 may use the distance sensing parameter to correct the parameters at the positions in the touch sensing region where the touch sensing parameter is missing or abruptly changed. For example, the parameter in the touch sensing parameter corresponding to the window position is set to a target parameter, where the target parameter may be a preset value or may be related to the parameters corresponding to positions adjacent to the window in the touch sensing parameter; for instance, the parameter at the position of the first sensor is set to match the parameters at the positions surrounding the first sensor. A processed touch sensing parameter is thus obtained, which improves the accuracy of the touch sensing parameter and thereby achieves a higher touch input accuracy.
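One simple way to realise the neighbour-based correction described above is sketched below, under the assumption that the touch sensing parameters arrive as a 2-D grid of capacitance values with `None` marking the cells blocked by the sensor window (the grid shape and value convention are illustrative):

```python
# Fill dead-zone cells (None) with the mean of their valid 4-neighbours,
# one possible reading of "set the parameter at the window position to a
# target parameter related to the adjacent positions".

def fill_dead_zone(grid):
    rows, cols = len(grid), len(grid[0])
    filled = [row[:] for row in grid]
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] is None:
                neighbours = [grid[nr][nc]
                              for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1))
                              if 0 <= nr < rows and 0 <= nc < cols
                              and grid[nr][nc] is not None]
                filled[r][c] = sum(neighbours) / len(neighbours) if neighbours else 0.0
    return filled
```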
It should be noted that, in this embodiment, when processing the touch sensing parameter with the distance sensing parameter, the processor 104 may first perform accidental-touch protection on the touch input based on the distance sensing parameter; for example, parameters such as the distance and area of the operating body in the distance sensing parameter are used to determine whether the current input of the operating body is a touch input. Only when it is a touch input is the touch sensing parameter corrected according to the foregoing scheme and a corresponding operation instruction generated based on the corrected touch sensing parameter to implement the corresponding function. If it is not a touch input, there is no need to respond to the touch sensing parameter: no operation instruction is generated, or the operation instruction is empty, and the operation of the operating body is not responded to.
For example, if in the distance sensing parameter the distance of the operating body is smaller than a certain threshold and its area is larger than the corresponding threshold, i.e. the operating body is located on the touch sensing region and the area of the region exceeds a certain value (for instance, a palm gesture is identified based on the distance sensing parameter, with the palm distance smaller than a certain threshold and the palm area larger than the corresponding threshold), the touch is regarded as a non-input touch. In this case the touch sensing parameter is set to 0 or empty, i.e. it is considered that no touch input has been received (the operation instruction subsequently generated is an empty instruction). A processed touch sensing parameter is thus obtained, realizing accidental-touch protection and thereby achieving a higher touch input accuracy.
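The palm-rejection rule in this example reduces to a pair of threshold checks. The following sketch illustrates the idea; the default thresholds and the shape of the parameters are illustrative assumptions:

```python
# Treat a large, close object (e.g. a resting palm) as a non-input touch:
# clear the touch sensing parameters so that no operation instruction results.
# The default thresholds are hypothetical values for illustration.

def apply_palm_rejection(touch_params, distance_mm, area_mm2,
                         dist_thresh=3.0, area_thresh=400.0):
    if distance_mm < dist_thresh and area_mm2 > area_thresh:
        return None  # 'no touch input received'; the later instruction is empty
    return touch_params
```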
Referring to FIG. 8, which is an implementation flowchart of a data processing method provided in Embodiment Two of the present application, the method is applicable to an electronic device having a distance sensor and a touch sensor, such as a notebook or a computer, and is mainly used to provide the electronic device with multiple input modes.
Specifically, the method in this embodiment may include the following steps:
Step 801: in response to a distance sensing parameter collected by a first sensor, obtain a touch sensing parameter collected by a second sensor.
The first sensor is a distance sensor, which can collect distance sensing parameters for objects in a specific space; these distance sensing parameters can be used to detect state parameters of an object such as its distance, position, and posture. The second sensor is a touch sensor, which can collect touch sensing parameters on a touch sensing region; the touch sensing parameters can be used to detect parameters such as the position and trajectory of a touch operation.
In this embodiment, the touch sensing region of the second sensor covers a specific area of a first surface of the electronic device, such as the C surface of a notebook; this specific area may be an area convenient for the user to perform touch operations. The first sensor is arranged at a position on the electronic device corresponding to the second sensor, so that the first sensor can collect distance sensing parameters in the space above the touch sensing region.
Step 802: generate an operation instruction based on the distance sensing parameter and the touch sensing parameter.
Step 803: respond to the operation instruction to perform the corresponding function.
That is to say, after the user performs input operations through the first sensor and the second sensor, in this embodiment a corresponding operation instruction may be generated based on the distance sensing parameter and the touch sensing parameter, and after the operation instruction is responded to, the corresponding function, such as selecting a movie to play or shutting down, is provided for the user.
It can be seen from the above technical solutions that, in the data processing method of Embodiment Two of the present application, a first sensor capable of collecting a distance sensing parameter is provided on the electronic device, so that the first sensor can collect the distance sensing parameter in the space above the touch sensing region of the second sensor, and an operation instruction can then be generated based on the distance sensing parameter and the touch sensing parameter to implement the corresponding function. Thus, by providing a distance sensor to collect the distance sensing parameter of the space above the touch sensing region, the electronic device supports input through the distance sensor in addition to touch input, which enriches the input modes of the electronic device and improves the user experience.
In one implementation, the first sensor may be arranged at a window opened in the circuit board of the second sensor, referring to the structure in FIG. 4. Correspondingly, in this embodiment, step 802 may generate the operation instruction based on the distance sensing parameter and the touch sensing parameter in the following way:
first, the gesture type of the operating body is identified based on the distance sensing parameter; for example, the distance sensing parameter is parsed with a recognition algorithm to identify the posture of the operating body and thereby determine its gesture type, such as a single-finger sliding gesture, a palm-waving gesture, or a finger-pressing gesture;
if the gesture type of the operating body is a first type, an operation instruction corresponding to the first type is generated. The first type may be a preset functional gesture type, such as a single-finger sliding gesture or a palm-waving gesture. If the gesture type of the operating body is the first type, it indicates that the operating body is performing gesture input through the distance sensor, and at this time an operation instruction corresponding to the first type, such as a page-turning or shutdown instruction, is generated to implement the page-turning or shutdown function;
if the gesture type of the operating body is a second type, the touch sensing parameter is processed with the distance sensing parameter, and an operation instruction is generated based on the processed touch sensing parameter. The second type may be a gesture type other than the functional gesture types, such as a gesture in which the distance value of the operating body is below a certain threshold, a finger-pressing gesture in which the finger distance is below a certain threshold, or a palm-pressing gesture. If the gesture type of the operating body is the second type, it indicates that the operating body is performing touch input through the touch sensor; at this time, the touch sensing parameter is processed with the distance sensing parameter, and an operation instruction can then be generated based on the processed touch sensing parameter, thereby providing the corresponding touch function for the user.
Specifically, in this embodiment, processing the touch sensing parameter with the distance sensing parameter means processing the parameters in the touch sensing parameter that correspond to the position of the window where the first sensor is located. That is, the touch sensing parameter may contain missing or abruptly changed values at the window position caused by the placement of the first sensor in the window. In this case, the parameters at the positions in the touch sensing region where the touch sensing parameter is missing or abruptly changed may be corrected using the distance sensing parameter. For example, the parameter in the touch sensing parameter corresponding to the window position is set to a target parameter, where the target parameter may be a preset value or may be related to the parameters corresponding to positions adjacent to the window in the touch sensing parameter; for instance, the parameter at the position of the first sensor is set to match the parameters at the positions surrounding the first sensor. A processed touch sensing parameter is thus obtained, which improves the accuracy of the touch sensing parameter and thereby achieves a higher touch input accuracy.
It should be noted that, in this embodiment, when processing the touch sensing parameter with the distance sensing parameter, accidental-touch protection may first be performed on the touch input based on the distance sensing parameter; for example, parameters such as the distance and area of the operating body in the distance sensing parameter are used to determine whether the current input of the operating body is a touch input. Only when it is a touch input is the touch sensing parameter corrected according to the foregoing scheme and a corresponding operation instruction generated based on the corrected touch sensing parameter to implement the corresponding function. If it is not a touch input, there is no need to respond to the touch sensing parameter: no operation instruction is generated, or the operation instruction is empty, and the operation of the operating body is not responded to.
For example, if in the distance sensing parameter the distance of the operating body is smaller than a certain threshold and its area is larger than the corresponding threshold, i.e. the operating body is located on the touch sensing region and the area of the region exceeds a certain value (for instance, a palm gesture is identified based on the distance sensing parameter, with the palm distance smaller than a certain threshold and the palm area larger than the corresponding threshold), the touch is regarded as a non-input touch. In this case the touch sensing parameter is set to 0 or empty, i.e. it is considered that no touch input has been received (the operation instruction subsequently generated is an empty instruction). A processed touch sensing parameter is thus obtained, realizing accidental-touch protection and thereby achieving a higher touch input accuracy.
The technical solution in this embodiment is illustrated below by taking a notebook with a touch area as an example:
In order to simultaneously implement touch input and contact-free gesture input on a notebook whose C surface carries a keyboard, in this embodiment a window is opened in the printed circuit board (PCB) of the touchpad on the C surface, and a distance sensor is integrated in the window area, as shown in FIG. 9.
The distance sensor can transmit and receive signals directly through the glass or mylar material of the touchpad window area. To ensure visual consistency, as shown in FIG. 9, the window area in this embodiment may be coated with ink of the same color as the C surface; the touchpad thus has a layered structure, and the ink material is chosen to be transparent to signal waves in the same frequency band as the emitted light.
With the above scheme, since the window breaks the touchpad's sensing plane, the touchpad has a detection dead zone. To solve this problem, in this embodiment the touch sensing parameter can be processed in the following way to remove the dead zone caused by the PCB window, specifically as follows:
with reference to the data flow diagram of the notebook hardware architecture in FIG. 10, in this embodiment the signal of the distance sensor can be used as the touchpad correction signal. When the user's finger touches the distance sensor and it is determined that the user is not performing gesture input, the distance sensor passes its signal to the touchpad, and the touchpad can use the X, Y, and Z data in the distance sensor's sensing parameters to compensate for the data missing from the touch sensing parameters collected by the touchpad's capacitive sensor, so that the signal of the entire touchpad detection area is complete and, after being transmitted to the CPU, is responded to without any dead zone. The specific flow is shown in FIG. 11:
after starting, it is first judged whether a finger is touching the touchpad; if so, the presence of a dead zone is continuously detected, and when a dead zone is detected, it is judged whether the parameters of the distance sensor are active; if so, the data Xp, Yp, and Zp in the distance sensing parameters are obtained and used to correct the data at the dead-zone positions X, Y, and Z in the touchpad data, yielding the touchpad output parameters X`, Y`, Z`.
It can be seen that this embodiment provides a new way of integrating a distance sensor while keeping the parameters of the smart touchpad complete.
Specifically, with the notebook in this embodiment, the user can not only use the original touchpad functions but also perform contact-free operations and intelligent monitoring, improving the user experience. Moreover, the detection position of the distance sensor better matches user scenarios, which can improve recognition efficiency and the user experience. Further, in this embodiment the dead-zone problem caused by the touchpad window is solved by correcting the touch sensing parameter.
Referring to FIG. 12, which is an implementation flowchart of a data processing method provided in Embodiment One of the present application, the method is applicable to an electronic device, such as a notebook or a computer, that has a first sensor and a second sensor of different types.
Specifically, the method in this embodiment may include the following steps:
Step 1201: receive a first parameter from a first sensor.
The first sensor may be a distance sensor, and correspondingly the first parameter of the first sensor may be a distance sensing parameter. The distance sensor can generate distance sensing parameters by transmitting signal waves and receiving the signals reflected back when the signal waves meet an object, thereby enabling gesture input. The distance sensing parameters here can be used to identify information such as the shape, position, and posture of an object, for example to identify the gesture type of an operating body, and so on.
Step 1202: determine, based at least on the first parameter, the response mode of a second sensor in its sensing region.
The second sensor responds to input operations in the sensing region with the determined response mode.
It should be noted that, in this embodiment, the first parameter of the first sensor is recognized, and the response mode of the second sensor in the sensing region is then determined based on the recognition result; different first parameters correspond to different response modes on the second sensor.
Specifically, the first sensor can collect the first parameter of the space above the sensing region of the second sensor. Thus, in this embodiment, the response mode of the second sensor in the sensing region is determined based on the first parameter collected by the first sensor in the space above the sensing region of the second sensor. As shown in FIG. 13, when the first parameter in the space above the sensing region differs, the response mode on the sensing region may differ.
It should be noted that the sensing region of the second sensor covers a specific area of a first surface of the electronic device, such as all or part of the area below the keyboard on the C surface of a notebook, as shown in FIG. 14, and the first sensor can collect the first parameter of the space above the sensing region.
It can be seen from the above technical solutions that, in the data processing method provided in Embodiment One of the present application, after the first parameter of the first sensor is received, the response mode of the second sensor in its sensing region is determined based on the first parameter, so that the second sensor responds to input operations in the sensing region with the determined response mode. Thus, in this embodiment, according to changes or differences in the first parameter, input operations are responded to in the sensing region of the second sensor with a corresponding response mode, thereby providing multiple response modes for the user's input operations and markedly improving the user experience.
In one implementation, step 1202 may determine the response mode of the second sensor in its sensing region based at least on the first parameter in the following way:
a response partition is determined in the sensing region based on the first parameter, so that the second sensor responds to the input operation with the response partition.
The regions of the sensing region other than the response partition are configured not to respond to input operations. That is, in this embodiment, based on the first parameter collected by the first sensor in the space above the sensing region of the second sensor, a response partition that responds to input operations is determined in the sensing region of the second sensor, while the other regions of the sensing region outside this response partition do not respond to input operations, as shown in FIG. 15.
Specifically, in this embodiment, determining the response partition in the sensing region based on the first parameter may be achieved as follows:
first, the position parameter of the operating body in the first parameter is identified, such as the orientation and distance of the operating body relative to the first sensor; in addition, the position parameter may also include shape parameters of the operating body relative to the first sensor or to the sensing region of the second sensor, such as size or posture;
then, based on the position parameter, the mapping region in which the operating body is mapped onto the sensing region is determined. The first sensor and the second sensor have a relative positional relationship, namely one in which the first sensor can collect the first parameter of the space above the sensing region of the second sensor. Thus, in this embodiment, based on the position parameter of the operating body, the mapping region of the operating body is mapped onto the sensing region of the second sensor. The mapping region has a region center and a region boundary: the region center corresponds to the position where the operating body is mapped onto the sensing region, and the region boundary corresponds to the shape of the operating body relative to the first sensor or to the sensing region. For example, the region boundary is a circular boundary centered at the position where the operating body is mapped onto the sensing region, with a radius equal to the size radius of the operating body's shape relative to the sensing region or to a preset radius length, as shown in FIG. 16;
finally, the response partition is determined in the sensing region based on the mapping region.
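The mapping region described in these steps can be modeled as a circle with a centre and a radius. The following sketch illustrates this; the coordinate convention, the default radius, and the function names are assumptions for illustration, not part of the disclosure:

```python
import math

# A mapping region is a circle on the sensing region: its centre is where
# the operating body projects onto the region, and its radius comes from
# the detected hand size when available, otherwise from a preset default.

def mapping_region(center_xy, hand_size=None, default_radius=20.0):
    r = hand_size / 2 if hand_size is not None else default_radius
    return center_xy, r

def in_region(point, region):
    (cx, cy), r = region
    return math.hypot(point[0] - cx, point[1] - cy) <= r
```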
In one implementation, in this embodiment the partition, among multiple partitions of the sensing region, corresponding to the distribution position of the mapping region in the sensing region may be determined as the response partition. As shown in FIG. 17, the sensing region has multiple pre-divided partitions, each corresponding to a different function; for example, the sensing region is divided into left, middle, and right partitions. After the mapping region of the operating body on the sensing region is determined, the distribution position of the mapping region in the sensing region is determined, and the partition containing that position is determined as the response partition; in FIG. 17, the right partition is the response partition.
Further, in this embodiment, after the mapping region is determined, or after the response partition is determined based on the mapping region, accidental-touch recognition may be performed on the operating body. For example, it is judged whether the area of the mapping region is greater than or equal to a preset threshold while the distance between the operating body and the sensing region is smaller than a certain threshold; if so, the operating body is considered to be performing neither gesture input nor touch input, and its input operation is not responded to, thereby achieving accidental-touch protection. If the distance between the operating body and the sensing region is larger than a certain threshold, it indicates that the operating body is performing gesture input; in this case gesture recognition is performed on the operating body to implement the gesture input function.
In another implementation, in this embodiment the response region in the sensing region may be determined by judging whether the mapping region is a valid input region; for example, whether the area of the mapping region is greater than or equal to a preset threshold may be judged to determine the response partition.
Specifically, if the area of the mapping region is greater than or equal to the preset threshold, the mapping region of the operating body on the sensing region is too large, and the operating body can be considered not to be performing touch input. In this case, if the distance between the operating body and the sensing region is smaller than a certain threshold, the regions of the sensing region other than the mapping region are determined as the response partition, and the mapping region is configured as a partition that does not respond to input operations. That is, if the area of the mapping region is greater than or equal to the preset threshold, the mapping region is determined to be an accidental-touch region of the operating body; to avoid touch errors, the regions of the sensing region other than the mapping region are determined as the response region, while the mapping region does not respond to input operations, achieving accidental-touch protection.
If the area of the mapping region is smaller than the preset threshold, the response partition is determined in the sensing region based on the distribution position of the mapping region in the sensing region. For example, the sensing region has multiple pre-divided partitions, and in this embodiment the partition of the sensing region closest to the distribution position of the mapping region may be determined as the response partition, while the other partitions are configured not to respond to input operations.
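The two branches above can be sketched as a single decision function. The partition layout and threshold are illustrative assumptions, and "all_except_mapped" stands in for "every region outside the mapping region responds":

```python
# Decide the response partition from the mapped region's area and centre.
# partitions maps a name to a horizontal band (x_min, x_max) of the
# sensing region; the values used here are hypothetical.

def choose_response_partition(region_area, center_x, partitions, area_thresh=500.0):
    if region_area >= area_thresh:
        # Likely a resting palm: only the area outside the mapping region responds.
        return "all_except_mapped"
    for name, (x_min, x_max) in partitions.items():
        if x_min <= center_x < x_max:
            return name  # the partition containing the mapped centre responds
    return None
```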
Further, in this embodiment, after the response region is determined, the response flag is output at the position in the display region corresponding to the response partition. The display region may be a region on a single display screen that is divided; in this case, when the right partition is the response partition, the mouse cursor is output in the right display region of that screen. Alternatively, the display regions may be on different display screens, such as the display region of the local display and that of an external display; in this case, when the display region of the local display is the response region, the mouse cursor is output on the display region of the local display.
In one implementation, step 1202 may determine the response mode of the second sensor in its sensing region based at least on the first parameter in the following way:
based on the first parameter, a response flag is output in the display region corresponding to the sensing region, and the second sensor responds to the input operation with the response flag.
The display region refers to the display region of the electronic device where the second sensor is located, used for outputting display content. The display region may be a region on a single display screen that is divided; different first parameters correspond to different display regions in which the response flag is output, such as the left or right region of the same display screen. Alternatively, the display regions may be on different display screens, such as the display region of the local display and that of an external display; different first parameters correspond to different display regions in which the response flag is output, for example the response flag is displayed in the display region of the local display or in that of the external display, as shown in FIG. 18.
It should be noted that the response flag may be a mouse cursor or another flag. In this embodiment, the response flag is output in the display region corresponding to the sensing region, prompting the user that input operations can be performed for that display region.
Specifically, in this embodiment, outputting the response flag in the display region corresponding to the sensing region based on the first parameter may be achieved as follows:
first, the position parameter of the operating body in the first parameter is identified, such as the orientation and distance of the operating body relative to the first sensor; in addition, the position parameter may also include shape parameters of the operating body relative to the first sensor or to the sensing region of the second sensor, such as size or posture;
then, based on the position parameter, the mapping region in which the operating body is mapped onto the sensing region is determined. The first sensor and the second sensor have a relative positional relationship, namely one in which the first sensor can collect the first parameter of the space above the sensing region of the second sensor. Thus, in this embodiment, based on the position parameter of the operating body, the mapping region of the operating body is mapped onto the sensing region of the second sensor. The mapping region has a region center and a region boundary: the region center corresponds to the position where the operating body is mapped onto the sensing region, and the region boundary corresponds to the shape of the operating body relative to the first sensor or to the sensing region; for example, the region boundary is a circular boundary centered at the position where the operating body is mapped onto the sensing region, with a radius equal to the size radius of the operating body's shape relative to the sensing region;
finally, the response flag is output in the corresponding display region based on the mapping region. For example, the display region associated with the mapping region is determined, and the response flag is output on the display region corresponding to the mapping region; the display region is related to the relative position of the mapping region on the sensing region. That is, in this embodiment, the display region in which the response flag is output is determined based on the position of the mapping region on the sensing region: if the mapping region is on the left side of the sensing region, the display region is that of the local display; if the mapping region is on the right side of the sensing region, the display region is that of the external display, as shown in FIG. 19. The response flag thereby corresponds to the mapping region, i.e. to the operating body.
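A minimal sketch of this cursor hand-off follows, assuming, as in the example just given, that the left half of the sensing region drives the local display and the right half drives the external one (the split point and names are illustrative):

```python
def select_display(mapped_center_x, sensing_width, has_external_display):
    """Pick the display region that should show the response flag (cursor),
    based on which side of the sensing region the hand maps to."""
    if not has_external_display:
        return "local"
    return "local" if mapped_center_x < sensing_width / 2 else "external"
```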
On this basis, after the display region in which the response flag is output is determined, the other display regions, such as the external display region, are configured not to respond to input operations, or are left unprocessed.
Referring to FIG. 20, which is a schematic structural diagram of an electronic device provided in Embodiment Two of the present application, the electronic device may be a notebook or a similar device and may include the following structures:
a first sensor 901, configured to collect a first parameter; for example, the first sensor 901 may be a distance sensor and the first parameter a distance sensing parameter, enabling gesture input.
A second sensor 902 having a sensing region, configured to collect a second parameter; for example, the second sensor 902 may be a touch sensor and the second parameter a touch sensing parameter, enabling touch input.
The second sensor 902 is configured to determine, based at least on the first parameter, the response mode of the second sensor in its sensing region; the second sensor responds to input operations in the sensing region.
It should be noted that, in this embodiment, the first parameter of the first sensor is recognized, and the second sensor 902 then determines the response mode of the second sensor in the sensing region based on the recognition result; different first parameters correspond to different response modes on the second sensor.
Specifically, the first sensor can collect the first parameter of the space above the sensing region of the second sensor. Thus, in this embodiment, the response mode of the second sensor in the sensing region is determined based on the first parameter collected by the first sensor in the space above the sensing region of the second sensor; when the first parameter in the space above the sensing region differs, the response mode on the sensing region may differ.
It should be noted that the sensing region of the second sensor covers a specific area of a first surface of the electronic device, such as all or part of the area below the keyboard on the C surface of a notebook, and the first sensor can collect the first parameter of the space above the sensing region.
It can be seen from the above technical solutions that, in the electronic device provided in Embodiment Two of the present application, after the first parameter of the first sensor is received, the response mode of the second sensor in its sensing region is determined based on the first parameter, so that the second sensor responds to input operations in the sensing region with the determined response mode. Thus, in this embodiment, according to changes or differences in the first parameter, input operations are responded to in the sensing region of the second sensor with a corresponding response mode, thereby providing multiple response modes for the user's input operations and markedly improving the user experience.
In one implementation, the first sensor 901 may be arranged at a window 921 opened in the circuit board of the second sensor 902, as shown in FIG. 21, and the first sensor 901 can collect the first parameter through the window 921. Specifically, the window 921 may be arranged at the middle of a first edge of the circuit board of the second sensor 902. As shown in FIG. 22, the area below the keyboard on the C surface of the notebook is the sensing region of the touch sensor, and the window 921 is located at the center of the edge of the sensing region close to the C-surface keyboard. The distance sensor can thus collect distance sensing parameters of the space above the sensing region, and the touch sensor determines the response mode on the sensing region based on the distance sensing parameters collected by the distance sensor in the space above the sensing region, for example outputting a response flag on the corresponding display region (local or external display) to prompt the user for input, or responding to input operations with different response partitions in the sensing region, providing the user with different input experiences.
In addition, the electronic device in this embodiment may also be provided with a processor for generating an operation instruction based on the first parameter and on the second parameter of the second sensor, so as to perform the corresponding function.
Specifically, the processor may identify the gesture type of the operating body based on the distance sensing parameter and judge the gesture type: if the gesture type is a first type, an operation instruction corresponding to the first type is generated, implementing gesture input; if the gesture type is a second type, the touch sensing parameter is processed with the distance sensing parameter, and an operation instruction is generated based on the processed touch sensing parameter, thereby implementing touch input.
Specifically, during touch input, accidental-touch recognition may be performed based on the distance sensing parameter.
Further, during touch input, the processor may process the touch sensing parameter with the distance sensing parameter, for example setting the parameter in the touch sensing parameter corresponding to the window position to a target parameter, where the target parameter is related to the parameters in the touch sensing parameter corresponding to positions adjacent to the window.
It should be noted that for the specific implementation of the electronic device in this embodiment, reference may be made to the corresponding content above, which is not repeated here.
The technical solution in this embodiment is illustrated below by taking a notebook with a touch area as an example:
First, this embodiment proposes integrating a distance sensor into the notebook touchpad: the user can still use the original touchpad functions on the touchpad, and can also perform contact-free operations through the touchpad, such as gesture input, implementing functions such as intelligently turning off the screen or shortcut operations. The specific approach is as follows:
a window is opened in the printed circuit board (PCB) of the touchpad, and a distance sensor such as a radar sensor is integrated in the window area.
The radar sensor can transmit and receive signals directly through the glass or mylar material of the window area of the touchpad (the touchpad is enlarged; for example, the entire area below the keyboard is the touchpad, with the radar sensor integrated).
Specifically, in this embodiment, the radar detects the distance L and azimuth a of the user's hand, and the region where the hand will land on, or be mapped to, the touchpad is obtained by computational prediction. As shown in FIG. 23, the center position Le of the mapping region is calculated, and the touchpad region corresponding to R(Le, r) is defined as the region the palm is about to fall into, i.e. the mapping region; r may be a preset value or a value calculated from the shape parameters of the user's hand detected by the radar.
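A sketch of this landing prediction follows, assuming a planar geometry in which the radar reports range L and an azimuth a measured in the touchpad plane from the sensor position (the disclosure does not fix these conventions, so they are illustrative assumptions):

```python
import math

def predict_landing(distance_l, azimuth_deg, sensor_xy=(0.0, 0.0)):
    """Project the detected hand (range L, azimuth a) onto the touchpad
    plane to estimate Le, the centre of the landing region R(Le, r)."""
    a = math.radians(azimuth_deg)
    return (sensor_xy[0] + distance_l * math.cos(a),
            sensor_xy[1] + distance_l * math.sin(a))
```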
In one implementation, the microcontroller unit (MCU) of the touchpad sets the calculated R(Le, r) region in advance as the accidental-touch-protection region, which is then an invalid region, while the remaining region R`(Le, r) of the sensing region is the valid response region. Further, in this embodiment the radar sensor detects the movement state of the hand in real time and adjusts the R(Le, r) and R`(Le, r) regions in real time, achieving intelligent dynamic switching and improving the user experience, as shown in FIG. 24.
In another implementation, the touchpad of the electronic device has a partition mode, i.e. the user is allowed to predefine touchpad partitions, such as left and right regions or three regions, dividing the touchpad into different functional regions. When the radar detects the user's hand falling into a region, the enabled region of the touchpad is switched intelligently. Alternatively, when the user connects an external display, the right half of the touchpad is set as the controller for the external display and the left half as the controller for the local display. When the region R(Le, r) into which the user's hand falls is detected to be the left partition of the touchpad, the cursor automatically moves to the local display controller; otherwise, the cursor automatically moves to the external display controller. Thus, in this embodiment, the operating interface and the response region are switched dynamically by detecting the gesture input of the user's hand, as shown in FIG. 25.
In a specific implementation, as shown by the data flow in the hardware architecture of FIG. 26, in this embodiment the radar sensor is integrated with the touchpad MCU. The distance sensing signal collected by the radar sensor is computed to obtain L and a; the touchpad can use the X and Y of its capacitive sensor and, after computation, obtain R(Le, r); the touchpad then dynamically adjusts the working region R`(Le, r) and the accidental-touch-protection region R(Le, r).
Based on the above implementation, in this embodiment, in the touchpad's partition mode, the control interface and the corresponding region are dynamically adjusted based on the radar sensor's detection of the gesture input of the user's hand.
It can be seen that this embodiment can provide an integrated radar sensor that dynamically adjusts the touchpad response mode, such as cursor display and determination of the response partition, by predicting gesture cues such as the position of the user's hand.
Moreover, this embodiment can strengthen protection against accidental operation through contact-free detection, improving the user experience. Further, in this embodiment, based on a self-learning algorithm, the touchpad is dynamically partitioned, its response region is dynamically adjusted, and the control interface is switched dynamically and intelligently, enhancing product competitiveness.
The embodiments in this specification are described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and for the same or similar parts among the embodiments, reference may be made to one another. Since the apparatus disclosed in the embodiments corresponds to the method disclosed in the embodiments, its description is relatively brief, and for relevant details reference may be made to the description of the method.
Those skilled in the art will further appreciate that the units and algorithm steps of the examples described in connection with the embodiments disclosed herein can be implemented in electronic hardware, computer software, or a combination of the two. To clearly illustrate the interchangeability of hardware and software, the composition and steps of the examples have been described above generally in terms of their functions. Whether these functions are performed in hardware or software depends on the specific application and design constraints of the technical solution. Skilled persons may use different methods to implement the described functions for each specific application, but such implementations should not be considered beyond the scope of the present application.
The steps of the method or algorithm described in connection with the embodiments disclosed herein may be implemented directly in hardware, in a software module executed by a processor, or in a combination of the two. The software module may reside in random access memory (RAM), internal memory, read-only memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the technical field.
The above description of the disclosed embodiments enables those skilled in the art to implement or use the present application. Various modifications to these embodiments will be apparent to those skilled in the art, and the general principles defined herein may be implemented in other embodiments without departing from the spirit or scope of the present application. Therefore, the present application will not be limited to the embodiments shown herein, but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (20)

  1. An electronic device, comprising:
    a first sensor, configured to collect a distance sensing parameter;
    a second sensor, configured to collect a touch sensing parameter;
    wherein the touch sensing region of the second sensor covers a specific area of a first surface of the electronic device; the first sensor is capable of collecting the distance sensing parameter in the space above the touch sensing region.
  2. The electronic device according to claim 1, wherein the first sensor is arranged at a window opened in the circuit board of the second sensor, and the first sensor collects the distance sensing parameter of the space above the touch sensing region through the window.
  3. The electronic device according to claim 2, wherein the window is arranged at the middle of a first edge of the circuit board.
  4. The electronic device according to claim 2, further comprising:
    a protective structure, laid over the surface of the second sensor and covering the first sensor;
    wherein the first sensor is capable of collecting the distance sensing parameter through the protective structure.
  5. The electronic device according to claim 1, further comprising:
    a processor, configured to generate an operation instruction based on the distance sensing parameter and the touch sensing parameter, and to respond to the operation instruction so as to perform the corresponding function.
  6. The electronic device according to claim 5, wherein the processor generates the operation instruction based on the distance sensing parameter and the touch sensing parameter specifically by:
    identifying the gesture type of an operating body based on the distance sensing parameter;
    if the gesture type is a first type, generating an operation instruction corresponding to the first type;
    if the gesture type is a second type, processing the touch sensing parameter with the distance sensing parameter, and generating an operation instruction based on the processed touch sensing parameter.
  7. The electronic device according to claim 1, wherein
    the first sensor is configured to collect a first parameter;
    the second sensor has a sensing region, and the second sensor is configured to determine, based at least on the first parameter, the response mode of the second sensor in its sensing region;
    wherein the second sensor responds to input operations in the sensing region.
  8. The electronic device according to claim 7, wherein the first sensor is a distance sensor and the second sensor is a touch sensor;
    wherein the sensing region of the second sensor covers a specific area of the first surface of the electronic device; the first sensor is capable of collecting the first parameter in the space above the sensing region.
  9. A data processing method, comprising:
    in response to a distance sensing parameter collected by a first sensor, obtaining a touch sensing parameter collected by a second sensor;
    wherein the touch sensing region of the second sensor covers a specific area of a first surface of the electronic device; the first sensor is capable of collecting the distance sensing parameter in the space above the touch sensing region;
    generating an operation instruction based on the distance sensing parameter and the touch sensing parameter;
    responding to the operation instruction so as to perform the corresponding function.
  10. The method according to claim 9, wherein the first sensor is arranged at a window opened in the circuit board of the second sensor;
    wherein generating the operation instruction based on the distance sensing parameter and the touch sensing parameter comprises:
    identifying the gesture type of an operating body based on the distance sensing parameter;
    if the gesture type is a first type, generating an operation instruction corresponding to the first type;
    if the gesture type is a second type, processing the touch sensing parameter, and generating an operation instruction based on the processed touch sensing parameter.
  11. The method according to claim 10, wherein processing the touch sensing parameter comprises:
    setting the parameter in the touch sensing parameter corresponding to the position of the window to a target parameter.
  12. The method according to claim 11, wherein the target parameter is related to the parameters in the touch sensing parameter corresponding to positions adjacent to the window.
  13. A data processing method, comprising:
    receiving a first parameter from a first sensor;
    determining, based at least on the first parameter, a response mode of a second sensor in its sensing region;
    wherein the second sensor responds to input operations in the sensing region.
  14. The method according to claim 13, wherein determining the response mode of the second sensor in its sensing region based at least on the first parameter comprises:
    determining, based on the first parameter, a response partition in the sensing region, the second sensor responding to the input operation with the response partition;
    wherein the regions of the sensing region other than the response partition are configured not to respond to the input operation.
  15. The method according to claim 14, wherein determining the response partition in the sensing region based on the first parameter comprises:
    identifying a position parameter of an operating body in the first parameter;
    determining, based on the position parameter, a mapping region in which the operating body is mapped onto the sensing region;
    determining the response partition in the sensing region based on the mapping region.
  16. The method according to claim 15, wherein determining the response partition in the sensing region based on the mapping region comprises:
    if the area of the mapping region is greater than or equal to a preset threshold, determining the regions of the sensing region other than the mapping region as the response partition, the mapping region being configured not to respond to the input operation;
    if the area of the mapping region is smaller than the threshold, determining the response partition in the sensing region based on the distribution position of the mapping region in the sensing region.
  17. The method according to claim 15, wherein determining the response partition in the sensing region based on the mapping region comprises:
    determining, among multiple partitions of the sensing region, the partition corresponding to the distribution position of the mapping region in the sensing region as the response partition;
    wherein the partitions of the sensing region other than the response partition are configured not to respond to the input operation.
  18. The method according to claim 13, wherein determining the response mode of the second sensor in its sensing region based at least on the first parameter comprises:
    outputting, based on the first parameter, a response flag in the display region corresponding to the sensing region, the second sensor responding to the input operation with the response flag.
  19. The method according to claim 18, wherein outputting the response flag in the display region corresponding to the sensing region based on the first parameter comprises:
    identifying a position parameter of an operating body in the first parameter;
    determining, based on the position parameter, a mapping region in which the operating body is mapped onto the sensing region;
    outputting the response flag in the corresponding display region based on the mapping region.
  20. The method according to claim 19, wherein outputting the response flag in the corresponding display region based on the mapping region comprises:
    determining the display region associated with the mapping region;
    outputting the response flag on the display region, the display region being related to the relative position of the mapping region on the sensing region.
PCT/CN2020/092936 2019-05-31 2020-05-28 An electronic device and data processing method WO2020239029A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/615,544 US20220236827A1 (en) 2019-05-31 2020-05-28 Electronic apparatus and data processing method

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CN201910472214.1 2019-05-31
CN201910472199.0A CN110187795B (zh) 2019-05-31 2019-05-31 A data processing method and electronic device
CN201910472214.1A CN110187796B (zh) 2019-05-31 2019-05-31 An electronic device and data processing method
CN201910472199.0 2019-05-31

Publications (2)

Publication Number Publication Date
WO2020239029A1 true WO2020239029A1 (zh) 2020-12-03
WO2020239029A8 WO2020239029A8 (zh) 2021-09-23

Family

ID=73551898

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/092936 WO2020239029A1 (zh) 2019-05-31 2020-05-28 一种电子设备及数据处理方法

Country Status (2)

Country Link
US (1) US20220236827A1 (zh)
WO (1) WO2020239029A1 (zh)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140118276A1 (en) * 2012-10-29 2014-05-01 Pixart Imaging Inc. Touch system adapted to touch control and hover control, and operating method thereof
CN105637467A (zh) * 2013-10-28 2016-06-01 Samsung Electronics Co., Ltd. Electronic device and method for recognizing user gestures
CN105980974A (zh) * 2014-03-03 2016-09-28 Microchip Technology Inc. System and method for gesture control
CN106527804A (zh) * 2016-11-08 2017-03-22 Beijing Yonyou Government Affairs Software Co., Ltd. Method and apparatus for preventing accidental touch based on an intelligent terminal
CN109710119A (zh) * 2018-12-28 2019-05-03 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Control method, control apparatus, electronic apparatus, and storage medium
CN110187795A (zh) * 2019-05-31 2019-08-30 Lenovo (Beijing) Co., Ltd. A data processing method and electronic device
CN110187796A (zh) * 2019-05-31 2019-08-30 Lenovo (Beijing) Co., Ltd. An electronic device and data processing method

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090143141A1 (en) * 2002-08-06 2009-06-04 Igt Intelligent Multiplayer Gaming System With Multi-Touch Display
KR102251356B1 (ko) * 2014-06-20 2021-05-12 Samsung Electronics Co., Ltd. Electronic device using an electric field
US11188143B2 (en) * 2016-01-04 2021-11-30 Microsoft Technology Licensing, Llc Three-dimensional object tracking to augment display area
US10732719B2 (en) * 2016-03-03 2020-08-04 Lenovo (Singapore) Pte. Ltd. Performing actions responsive to hovering over an input surface


Also Published As

Publication number Publication date
WO2020239029A8 (zh) 2021-09-23
US20220236827A1 (en) 2022-07-28

Similar Documents

Publication Publication Date Title
WO2018107900A1 (zh) Method and apparatus for preventing accidental touch on a touch screen, mobile terminal, and storage medium
US20180188922A1 (en) System and Method for Gesture Control
EP3049898B1 (en) Pressure-sensitive trackpad
US9916046B2 (en) Controlling movement of displayed objects based on user operation
US20100253630A1 (en) Input device and an input processing method using the same
US20070268269A1 (en) Apparatus, method, and medium for sensing movement of fingers using multi-touch sensor array
CN109558061B (zh) An operation control method and terminal
KR20140134459A (ko) Portable device including a fingerprint scanner and control method thereof
WO2011023225A1 (en) Interactive surface with a plurality of input detection technologies
US20160054831A1 (en) Capacitive touch device and method identifying touch object on the same
EP3792740A1 (en) Key setting method and device, and storage medium
US20130201129A1 (en) Information processing apparatus, information processing method, and program
US20130293477A1 (en) Electronic apparatus and method for operating the same
US9069431B2 (en) Touch pad
TWI575429B (zh) Operation mode switching method for a capacitive touch panel module
WO2016197714A1 (zh) Operation mode automatic recognition method and terminal
EP3008556A1 (en) Disambiguation of indirect input
WO2020239029A1 (zh) 一种电子设备及数据处理方法
CN116198435A (zh) Vehicle control method and apparatus, vehicle, and storage medium
KR20170108764A (ko) Method for distinguishing touch pressure and mobile terminal therefor
CN107980116B (zh) Hover touch sensing method, hover touch sensing system, and hover touch electronic device
US11803273B2 (en) Touch sensor, touch pad, method for identifying inadvertent touch event and computer device
CN110187796B (zh) An electronic device and data processing method
CN110187795B (zh) A data processing method and electronic device
US9395858B2 (en) Capacitive finger navigation device with hybrid mode and operating method thereof

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20813303

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20813303

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 22.03.2022)