CN110554810A - touch key method, device and readable storage medium - Google Patents

Touch key method, device and readable storage medium

Info

Publication number
CN110554810A
Authority
CN
China
Prior art keywords
touch
sensor
incident angle
processor
operation body
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201810558059.0A
Other languages
Chinese (zh)
Inventor
谭华
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Foshan Shunde Midea Electrical Heating Appliances Manufacturing Co Ltd
Original Assignee
Foshan Shunde Midea Electrical Heating Appliances Manufacturing Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Foshan Shunde Midea Electrical Heating Appliances Manufacturing Co Ltd
Priority to CN201810558059.0A
Publication of CN110554810A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0428 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means by sensing at the edges of the touch surface the interruption of optical paths, e.g. an illumination plane, parallel to the touch surface which may be virtual

Abstract

The embodiments of the invention provide a touch key method, a device and a readable storage medium, wherein the method comprises the following steps: a processor of an electronic device controls a light source of the electronic device to project light rays in a preset area, wherein the preset area comprises a touch area; the processor receives a first incident angle sent by a first sensor of the electronic device after the first sensor collects first reflected light returned by a touch operation body, and receives a second incident angle sent by a second sensor of the electronic device after the second sensor collects second reflected light returned by the touch operation body; and the processor determines a touch point of the touch operation body on the touch area according to the first incident angle, the second incident angle and the distance between the first sensor and the second sensor.

Description

Touch key method, device and readable storage medium
Technical Field
The present invention relates to the field of touch keys, and in particular, to a touch key method, device and readable storage medium.
Background
With the development of intelligent technology, more and more electronic devices enter people's lives. Driven by users' personalized demands, people pay increasing attention to space utilization and expect electronic devices to offer more and better functions, so devices on the market are designed to be ever smaller. As devices shrink, their touch areas shrink with them, and the contradiction between small touch areas and multifunctional touch becomes increasingly prominent. How to achieve multifunctional touch on a small touch area is therefore a problem to be solved.
The prior art offers two main solutions to the above problem. Solution one: multiple touch operations are defined on a single touch area, with each touch operation corresponding to one function; its main drawback is that the touch area is used far more frequently, which greatly shortens the service life of the electronic device. Solution two: touch areas are distributed at multiple positions, increasing the total touch area to achieve multifunctional touch, which occupies more space on the device. Although the prior art can realize multifunctional touch on a small touch area, it brings various problems, so how to enlarge the touch area without occupying usable space remains a problem to be solved.
Disclosure of the Invention
In view of this, embodiments of the present invention provide a touch key method, a device and a readable storage medium, so as to solve the problem in the prior art of how to enlarge the touch area without occupying usable space while still achieving multifunctional touch during use.
The technical solutions of the embodiments of the present invention are implemented as follows:
In a first aspect, an embodiment of the present invention provides a touch key method, where the method includes:
The processor of the electronic device controls a light source of the electronic device to project light rays in a preset area, wherein the preset area comprises a touch area;
The processor receives a first incident angle sent by a first sensor of the electronic device after the first sensor collects first reflected light returned by a touch operation body, and receives a second incident angle sent by a second sensor of the electronic device after the second sensor collects second reflected light returned by the touch operation body;
The first incident angle is an included angle between a first straight line passing through the first sensor and the second sensor and a second straight line passing through the touch operation body and the first sensor; the second incident angle is an included angle between the first straight line and a third straight line passing through the touch operation body and the second sensor;
The processor determines a touch point of the touch operation body on the touch area according to the first incident angle, the second incident angle and the distance between the first sensor and the second sensor.
In a second aspect, an embodiment of the present invention provides an electronic device, where the electronic device at least includes: a light source, a first sensor, a second sensor, a memory, and a processor, wherein:
The light source is used for projecting light rays in a preset area;
The first sensor is used for sending a first incident angle to the processor after collecting first reflected light returned by the touch operation body;
The second sensor is used for acquiring second reflected light returned by the touch operation body and then sending a second incident angle to the processor;
The memory is used for storing a touch key program of an application;
The processor is used for executing a touch key program stored in the memory to realize the following steps:
Controlling a light source of the electronic device to project light rays in a preset area, wherein the preset area comprises a touch area;
Receiving a first incident angle sent by a first sensor of the electronic device after the first sensor collects first reflected light returned by the touch operation body, and receiving a second incident angle sent by a second sensor of the electronic device after the second sensor collects second reflected light returned by the touch operation body;
The first incident angle is an included angle between a first straight line passing through the first sensor and the second sensor and a second straight line passing through the touch operation body and the first sensor; the second incident angle is an included angle between the first straight line and a third straight line passing through the touch operation body and the second sensor;
and determining a touch point of the touch operation body on the touch area according to the first incident angle, the second incident angle and the distance between the first sensor and the second sensor.
In a third aspect, an embodiment of the present invention provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the steps in the touch key method described above.
The touch key method, device and readable storage medium of the embodiments solve the problem in the prior art of how to enlarge the touch area without occupying usable space while achieving multifunctional touch. Two optical sensors measure the incident angles of the reflected light returned by the touch operation body, and the touch point is determined from these incident angles and the distance between the two optical sensors. The touch area is enlarged by the light source projection, the touch point can be located accurately, no hardware needs to be arranged in the touch area, the service life of the electronic device is prolonged, touch operation is convenient, and user experience is improved.
Drawings
FIG. 1 is a schematic flow chart illustrating a method for implementing a touch key according to an embodiment of the present invention;
FIG. 2 is a schematic flowchart illustrating a method for implementing a touch key according to another embodiment of the present invention;
FIG. 3 is a schematic flowchart illustrating another method for implementing a touch key according to an embodiment of the present invention;
FIG. 4 is a system diagram of a touch key according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of a rectangular coordinate system according to an embodiment of the present invention;
FIG. 6 is a diagram of a logic architecture for implementing a touch key according to an embodiment of the present invention;
FIG. 7 is a schematic diagram of the structure of the apparatus according to the embodiment of the present invention.
Detailed Description
It should be understood that the embodiments described herein are only intended to explain the technical solutions of the present invention and are not intended to limit its scope.
An embodiment of the present invention provides a touch key method applied to an electronic device. The functions implemented by the method can be realized by a processor in the electronic device calling program code, and the program code can be stored in a computer storage medium; the electronic device therefore comprises at least the processor and the storage medium.
FIG. 1 is a schematic flow chart of a method for implementing a touch key according to an embodiment of the present invention. As shown in FIG. 1, the method includes the following steps:
Step S101: The processor of the electronic device controls a light source of the electronic device to project light rays in a preset area, wherein the preset area comprises a touch area;
Here, the electronic device is any device to which the touch key method can be applied; for example, the electronic device may be a small-sized electric cooker, a soymilk maker, a power socket, a stewing cup, a kettle or the like, or any household appliance whose housing is not suitable for carrying a touch area. The processor can execute the application programs stored in the electronic device, including the touch key program, and can perform calculations on data.
The light source of the electronic device can emit light that is projected into the preset area. The preset area is the area formed by the light of the light source traveling in a fixed direction toward another object, and it covers all the touch keys. The touch area is the area where the touch keys are located; a touch operation performed on this area triggers a touch key. For example, in FIG. 4 the light source is denoted by reference numeral L1 and the area its light covers is the preset area, reference numeral 400 denotes the touch area, and reference numeral 401 denotes a touch key.
Step S102: The processor receives a first incident angle sent by a first sensor of the electronic device after the first sensor collects first reflected light returned by the touch operation body, and receives a second incident angle sent by a second sensor of the electronic device after the second sensor collects second reflected light returned by the touch operation body;
Here, the touch operation body is an object that can reflect the light projected in the preset area and whose surface is not perfectly smooth; for example, the touch operation body may be a finger or a stylus.
The first sensor can receive all reflected light rays in the preset area and can measure the incident angle of the reflected light; the second sensor can likewise receive all reflected light rays in the preset area and measure the incident angle of the reflected light. For example, the first sensor and the second sensor may be optical instruments such as cameras, optical sensors or beam analyzers.
The reflected light arises when the light projected into the preset area by the light source is diffusely reflected by the touch operation body: because the normal directions at different points of its surface are inconsistent, the light is reflected irregularly in different directions, so the surface of the touch operation body reflects light toward all directions. The first reflected light is the light that is diffusely reflected by the touch operation body and received by the first sensor; the second reflected light is the light that is diffusely reflected by the touch operation body and received by the second sensor. For example, in FIG. 6 the first reflected light is the light received by the sensor S1 and the second reflected light is the light received by the sensor S2.
The first incident angle is the incident angle at which the first reflected light is received by the first sensor; the second incident angle is the incident angle at which the second reflected light is received by the second sensor. For example, referring to FIG. 6, the first incident angle is 55 degrees and the second incident angle is 60 degrees.
The first incident angle is the included angle between a first straight line passing through the first sensor and the second sensor and a second straight line passing through the touch operation body and the first sensor; the second incident angle is the included angle between the first straight line and a third straight line passing through the touch operation body and the second sensor.
Step S103: The processor determines a touch point of the touch operation body on the touch area according to the first incident angle, the second incident angle and the distance between the first sensor and the second sensor.
The distance between the first sensor and the second sensor is the distance along the straight line fixed when the two sensors are installed. For example, in FIG. 6 the distance between the first sensor and the second sensor is b, the distance between S1 and S2.
The touch point is the point at which the touch operation body operates on the touch area, that is, the point in the preset area at which the projected light is reflected. For example, the touch points are the points A and B shown in FIG. 6.
In the embodiment of the present invention, the first sensor and the second sensor measure the first incident angle and the second incident angle of the first reflected light and the second reflected light returned by the touch operation body in the preset area and send the two angles to the processor; the processor receives the first incident angle and the second incident angle and obtains the coordinates of the touch point according to the two angles and the distance between the first sensor and the second sensor. In this way the touch area is enlarged without occupying usable space, multifunctional touch is provided, the touch point can be located accurately, no hardware needs to be arranged in the touch area, the service life of the electronic device is prolonged, and user experience is improved.
FIG. 2 is a schematic flowchart of a method for implementing a touch key according to another embodiment of the present invention. As shown in FIG. 2, the method includes the following steps:
Step S201: The processor of the electronic device controls a light source of the electronic device to project light rays in a preset area, wherein the preset area comprises a touch area;
Step S202: The first sensor of the electronic device sends a first incident angle to the processor after collecting first reflected light returned by the touch operation body, and the second sensor of the electronic device sends a second incident angle to the processor after collecting second reflected light returned by the touch operation body;
Step S203: The processor receives the first incident angle sent by the first sensor and the second incident angle sent by the second sensor;
Here, the first incident angle is the included angle between a first straight line passing through the first sensor and the second sensor and a second straight line passing through the touch operation body and the first sensor.
The second incident angle is the included angle between the first straight line and a third straight line passing through the touch operation body and the second sensor.
Step S204: The processor determines relation data of the touch point according to the first incident angle, the second incident angle and a tangent function relation;
Here, the tangent function relation is the functional relationship, or the resulting relationship data, satisfied by the tangent of an angle in a right triangle. For example, if a rectangular coordinate system is established with the straight line through the first sensor and the second sensor as the horizontal axis, the tangent of the first incident angle equals the ratio of the ordinate to the abscissa of any point on the first reflected light ray; this relationship between the first incident angle and the coordinates is the tangent function relation, and the resulting equations are the functional relationship data.
The relation data of the touch point are the equations, obtained from the tangent function relation, that connect the coordinates of the touch point with the first incident angle and the second incident angle acquired by the first sensor and the second sensor. For example, when a touch operation is performed at a point in the preset area, the point diffusely reflects light to the first sensor and the second sensor, yielding the first incident angle and the second incident angle; a tangent function relation is then written for each of the two angles in terms of the touch point coordinates, and these equations constitute the relation data of the touch point.
Step S205: The processor calculates the coordinates of the touch point on the touch area according to the relation data of the touch point and the distance between the first sensor and the second sensor;
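To make steps S204 and S205 concrete, the coordinate calculation can be sketched as follows. This is a minimal illustrative example and not part of the patent disclosure; the function name, parameter names and units are assumptions, the first sensor is taken as the coordinate origin, and the second sensor lies at distance b along the horizontal axis.

    import math

    def locate_touch_point(theta_deg, beta_deg, b):
        """Triangulate the touch point from the two measured incident angles.

        theta_deg: incident angle at the first sensor S1, in degrees
        beta_deg:  incident angle at the second sensor S2, in degrees
        b:         distance between S1 (at the origin) and S2, e.g. in cm
        """
        tan_theta = math.tan(math.radians(theta_deg))
        tan_beta = math.tan(math.radians(beta_deg))
        # Relation data of the touch point: tan(theta) = y / x and
        # tan(beta) = y / (b - x); solving these two equations gives (x, y).
        x = b * tan_beta / (tan_theta + tan_beta)
        y = b * tan_theta * tan_beta / (tan_theta + tan_beta)
        return x, y

For the values used in the worked example later in the description, locate_touch_point(55, 60, 10) returns approximately (5.48, 7.83), matching the coordinates obtained there with formula (3.5).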
step S206: if the coordinates of the touch point are in a preset trigger coordinate set, the processor determines the touch operation of the touch operation body as effective trigger operation;
here, the touch operation is that the touch operation body performs a key operation in the preset area. For example, the key operation may be an action of pressing, touching, clicking, or the like by a finger of a user at a certain point in the preset area.
the triggering operation is that the touch operation body performs key operation on the touch area. For example, the trigger operation may be an action of pressing, touching, clicking, and the like by a finger of a user at a certain point in the touch area.
the trigger coordinate set is a coordinate set corresponding to the key function item of the electronic equipment in the preset area. For example, the electronic device is an electric cooker, and the electric cooker corresponds to all key function item buttons, such as an opening button, a preset time button, a porridge button, an accelerated boiling button, a small amount of boiling buttons, a soup stewing button and the like, and corresponds to all coordinates in an established rectangular coordinate system.
Step S207: The processor looks up the identifier of the key function item corresponding to the effective trigger operation according to the coordinates of the effective trigger operation;
Here, a key function item starts a specific function of the electronic device by means of a key. For example, if the electronic device is an electric cooker, pressing the start key function item starts the electric cooker.
The coordinates of the trigger operation are the coordinates of the touch point of the touch operation body on the touch area.
The identifier of the key function item is the specific function name of the key function item, or a marker used in place of the name. For example, if the electronic device is an electric cooker, the identifiers of the key function items may be labels such as 'cook porridge', 'quick boil', 'small-amount cook' and 'stew soup'.
Step S208: The processor executes the corresponding operation according to the identifier of the key function item;
Here, the identifier of the key function item is the specific function name of the key function item, or a marker used in place of the name.
The corresponding operation is the function that the electronic device carries out according to the identifier of the key function item.
For example, if the electronic device is an electric cooker and the key identifier is 'cook porridge', then after 'cook porridge' is pressed the electric cooker carries out the porridge-cooking program for the required duration.
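A sketch of the trigger-set check and dispatch in steps S206 to S208 might look like the following; it is illustrative only, and the key identifiers and rectangular key regions are invented for the example rather than taken from the patent.

    # Hypothetical trigger coordinate set: each key function item identifier is
    # mapped to a rectangular region (x_min, y_min, x_max, y_max) of the touch area.
    TRIGGER_COORDINATE_SET = {
        "start": (0.0, 6.0, 2.0, 8.0),
        "cook porridge": (4.0, 6.0, 6.0, 8.0),
        "stew soup": (7.0, 6.0, 9.0, 8.0),
    }

    def find_key_function(x, y):
        """Return the identifier of the key function item containing (x, y), or None."""
        for identifier, (x_min, y_min, x_max, y_max) in TRIGGER_COORDINATE_SET.items():
            if x_min <= x <= x_max and y_min <= y <= y_max:
                return identifier
        return None

    def handle_touch(x, y):
        """Steps S206 to S208: validate the touch and execute the matching key function."""
        identifier = find_key_function(x, y)
        if identifier is None:
            return  # not an effective trigger operation; the touch is discarded
        print("executing key function:", identifier)  # e.g. start the porridge program

In an actual appliance the print statement would be replaced by a call into the device control logic.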
In the embodiment of the present invention, the first sensor and the second sensor measure the first incident angle and the second incident angle of the first reflected light and the second reflected light returned by the touch operation body in the preset area and send the two angles to the processor; the processor receives the first incident angle and the second incident angle and obtains the coordinates of the touch point according to the two angles and the distance between the first sensor and the second sensor; the processor then judges, from the coordinates of the touch point, whether the touch operation of the touch operation body is a trigger operation, so as to decide whether to start the corresponding key function item and execute the key function. In this way the specific position of the touch point can be located accurately and the touch operation is handled according to that position; the touch area is virtual and does not occupy actual space, which improves user experience.
FIG. 3 is a schematic flow chart of a method for implementing a touch key according to another embodiment of the present invention. As shown in FIG. 3, the method includes the following steps:
Step S301: The light emitted by the linear light source L1 travels along straight lines and forms a plane over the key identification area;
Here, the plane formed by the light of the linear light source is coplanar with the key identification area and covers it.
The linear light source L1 emits light at a wavelength that can be received by both optical sensors S1 and S2.
FIG. 4 is a system architecture diagram of a touch key according to an embodiment of the present invention. As shown in FIG. 4, the touch key system consists of the optical sensors S1 and S2 and the linear light source L1 located in the same plane; the Button mark is only used to help the user identify the touch area and does not participate in the touch key identification process. The optical sensor S1 and the optical sensor S2 are distributed on the two sides of the linear light source L1, and the optical sensor S1 is located at the same height and in the same plane as the optical sensor S2 and the linear light source.
The optical sensor S1 is used for receiving the first incident light ray returned from the linear light source L1 by the touch operation body, measuring the incident angle of the first incident light ray and sending the measured angle;
The optical sensor S2 is used for receiving the second incident light ray returned from the linear light source L1 by the touch operation body, measuring the incident angle of the second incident light ray and sending the measured angle;
The linear light source L1 is used for emitting light so that the light travels in one direction and forms a plane;
The Button mark is only used to help the user identify the touch area and does not participate in the touch key identification process.
The key identification area contains all the Button keys that can be identified and is covered by the light source coverage area.
Step S302: When no user key-click operation occurs, the light emitted by the linear light source L1 is not reflected, and the optical sensors S1 and S2 detect no reflected light;
Here, a key-click operation means that the user performs a click on the plane formed in step S301 using the touch operation body.
Detection relies on the principle that light changes its propagation direction at the interface between two media and is reflected back into the original medium; an optical angle sensor can detect the incident angle of the reflected light.
Step S303: When the user clicks in the key identification area, the light of the linear light source L1 is reflected at the clicked position, the optical sensors S1 and S2 detect the reflected light, and the incident angles are measured;
step S304: obtaining touch position information through mathematical calculation and comparing the touch position information with a key area to judge whether to trigger key operation;
the touch position information is coordinates of the touch point;
the Button area is the area marked by the Button;
The obtaining of the touch position information through mathematical calculation specifically includes the following steps:
Step 3041: A point is selected as the origin to construct a rectangular coordinate system; the position of the sensor S1 or S2 is preferably chosen as the origin (0, 0).
As shown in FIG. 5, in the present embodiment the optical sensor S1 is taken as the origin (0, 0) and the coordinate of S2 is (b, 0), where b is the distance between the optical sensor S1 and the optical sensor S2. Let the coordinates of the touch point N be (x_n, y_n), let the incident angle at the optical sensor S1 be θ and let the incident angle at the optical sensor S2 be β. The tangent function formula of trigonometry is:
tan(α) = y / x (3.1)
In formula (3.1), α is an acute angle of a right triangle, y is the length of the right-angle side opposite α, and x is the length of the other right-angle side.
Step 3042: From formula (3.1), the incident angles θ and β at the optical sensors S1 and S2, and the coordinates (x_n, y_n) of the touch point N, the system of tangent equations (3.2) is obtained:
tan(θ) = y_n / x_n, tan(β) = y_n / (b - x_n) (3.2)
In formula (3.2), b is the distance between the optical sensor S1 and the optical sensor S2.
Step 3043: Solving the system (3.2) for the coordinates (x_n, y_n) of the touch point N gives formula (3.3):
x_n = b·tan(β) / (tan(θ) + tan(β)), y_n = b·tan(θ)·tan(β) / (tan(θ) + tan(β)) (3.3)
Step 3044: Substituting the measured values of the incident angles θ and β and the distance b between the optical sensor S1 and the optical sensor S2 into formula (3.3) yields the coordinate values of the touch point N.
Here, the coordinates are those of the touch point N in the coordinate system constructed in step 3041. The plane covered by the linear light source contains the key identification area, and a key identification coordinate set is defined corresponding to the key identification area; once the touch point coordinates are obtained, it is judged whether the touch point lies within the key identification coordinate set.
It can be seen from formula (3.3) that the coordinates of the touch point are determined by the incident angles at the optical sensors S1 and S2 and by the distance between S1 and S2. The distance b between the optical sensor S1 and the optical sensor S2 is fixed by the mounting structure and is known at the product design stage; the incident angles are read from the optical sensors S1 and S2.
As shown in FIG. 6, suppose the user touch points are A and B and the installation distance between the optical sensors S1 and S2 is 10 centimeters (cm). For the user touch point A, the incident angle at the optical sensor S1 is θ = 75 degrees (°), the incident angle at the optical sensor S2 is β = 44°, b = 10 cm and the coordinates are (x_a, y_a); substituting into formula (3.3) gives formula (3.4) and the coordinates of point A, (2.06, 7.67):
x_a = 10·tan(44°) / (tan(75°) + tan(44°)) ≈ 2.06, y_a = 10·tan(75°)·tan(44°) / (tan(75°) + tan(44°)) ≈ 7.67 (3.4)
For the user touch point B, the incident angle at the optical sensor S1 is θ = 55°, the incident angle at the optical sensor S2 is β = 60°, b = 10 cm and the coordinates are (x_b, y_b); substituting into formula (3.3) gives formula (3.5) and the coordinates of point B, (5.48, 7.83):
x_b = 10·tan(60°) / (tan(55°) + tan(60°)) ≈ 5.48, y_b = 10·tan(55°)·tan(60°) / (tan(55°) + tan(60°)) ≈ 7.83 (3.5)
Comparing the touch point coordinates with the coordinates in the Button mark coordinate set shows that point A lies outside the key and point B lies inside the key; the trigger signal of point A is therefore discarded, the key action is triggered for point B, and the key identification process is completed.
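The FIG. 6 scenario can be reproduced end to end with a short standalone check; since the actual Button coordinates are not given in the text, the rectangular key region below is invented so that it contains point B but not point A.

    import math

    def touch_point(theta_deg, beta_deg, b):
        # Formula (3.3), with S1 at the origin and S2 at (b, 0)
        t1 = math.tan(math.radians(theta_deg))
        t2 = math.tan(math.radians(beta_deg))
        return b * t2 / (t1 + t2), b * t1 * t2 / (t1 + t2)

    # Hypothetical Button rectangle (x_min, y_min, x_max, y_max)
    KEY_REGION = (4.0, 7.0, 7.0, 9.0)

    for name, (theta, beta) in {"A": (75, 44), "B": (55, 60)}.items():
        x, y = touch_point(theta, beta, 10)  # A -> about (2.06, 7.67), B -> about (5.48, 7.83)
        x_min, y_min, x_max, y_max = KEY_REGION
        inside = x_min <= x <= x_max and y_min <= y <= y_max
        print(name, round(x, 2), round(y, 2), "triggers the key" if inside else "discarded")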
The embodiment of the present invention provides a touch key method comprising the following steps: first, the light emitted by the linear light source L1 travels along straight lines and forms a plane over the key identification area; when no user key-click operation occurs, the light emitted by the linear light source L1 is not reflected and the optical sensors S1 and S2 detect no reflected light; when the user clicks in the key identification area, the light of the linear light source L1 is reflected at the clicked position, the optical sensors S1 and S2 detect the reflected light and the incident angles are measured; the touch position information is then obtained through mathematical calculation and compared with the key area to judge whether a key operation is triggered. By adopting two optical sensors, the related-art problem that a camera and a light source placed in front of and behind the key position cannot be located at the same height, which makes the system bulky, is solved: the optical sensors and the light source can lie in the same plane, the system height is reduced, and the system can be installed in thinner household appliances. At the same time the sensors are no longer limited to cameras, since they only need to determine the angle of the incident light.
An embodiment of the present invention provides an electronic device. FIG. 7 is a schematic diagram of the composition structure of the device according to an embodiment of the present invention; as shown in FIG. 7, the electronic device 700 at least includes: a light source 701, a first sensor 702, a second sensor 703, a processor 704 and a memory 705, wherein:
The light source 701 is used for projecting light rays in a preset area;
The first sensor 702 is configured to send a first incident angle to the processor after collecting first reflected light returned by the touch operation body;
The second sensor 703 is configured to send a second incident angle to the processor after collecting second reflected light returned by the touch control operation body;
The memory 705 is used for storing a touch key program of an application;
The processor 704 is configured to execute the touch key program stored in the memory to implement the following steps:
Controlling a light source of the electronic device to project light rays in a preset area, wherein the preset area comprises a touch area;
Receiving a first incident angle sent by the first sensor of the electronic device after the first sensor collects first reflected light returned by the touch operation body, and receiving a second incident angle sent by the second sensor of the electronic device after the second sensor collects second reflected light returned by the touch operation body;
The first incident angle is an included angle between a first straight line passing through the first sensor and the second sensor and a second straight line passing through the touch operation body and the first sensor; the second incident angle is an included angle between the first straight line and a third straight line passing through the touch operation body and the second sensor;
and determining a touch point of the touch operation body on the touch area according to the first incident angle, the second incident angle and the distance between the first sensor and the second sensor.
It should be noted that, in the embodiment of the present invention, if the touch key method is implemented in the form of a software functional module and sold or used as an independent product, it may also be stored in a computer-readable storage medium. Based on this understanding, the technical solutions of the embodiments of the present invention, or the part contributing to the prior art, may essentially be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server or a network device) to execute all or part of the methods described in the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a magnetic disk or an optical disk. Thus, embodiments of the invention are not limited to any specific combination of hardware and software.
Correspondingly, an embodiment of the present invention provides a computer storage medium, where computer-executable instructions are stored in the computer storage medium, and the computer-executable instructions are configured to execute the touch key method provided in another embodiment of the present invention.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, a system, or a computer program product. Accordingly, the present invention may take the form of a hardware embodiment, a software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flowcharts and/or block diagrams, and combinations of flows and/or blocks in the flowcharts and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The above description covers only preferred embodiments of the present invention and is not intended to limit the scope of the present invention.

Claims (10)

1. A touch key method, applied to an electronic device, the method comprising:
The processor of the electronic device controls a light source of the electronic device to project light rays in a preset area, wherein the preset area comprises a touch area;
The processor receives a first incident angle sent by a first sensor of the electronic device after the first sensor collects first reflected light returned by a touch operation body, and receives a second incident angle sent by a second sensor of the electronic device after the second sensor collects second reflected light returned by the touch operation body;
The first incident angle is an included angle between a first straight line passing through the first sensor and the second sensor and a second straight line passing through the touch operation body and the first sensor; the second incident angle is an included angle between the first straight line and a third straight line passing through the touch operation body and the second sensor;
The processor determines a touch point of the touch operation body on the touch area according to the first incident angle, the second incident angle and the distance between the first sensor and the second sensor.
2. The method of claim 1, wherein determining, by the processor, the touch point of the touch operation body on the touch area according to the first incident angle, the second incident angle and the distance between the first sensor and the second sensor comprises:
The processor determines relation data of the touch point according to the first incident angle, the second incident angle and a tangent function relation;
The processor calculates the coordinates of the touch point on the touch area according to the relation data of the touch point and the distance between the first sensor and the second sensor.
3. The method of claim 2, further comprising:
If the coordinates of the touch point are in a preset trigger coordinate set, the processor determines the touch operation of the touch operation body as an effective trigger operation;
wherein the trigger coordinate set is a set of coordinates in the preset area corresponding to the key function items of the electronic device.
4. The method of claim 3, further comprising:
The processor looks up the identifier of the key function item corresponding to the effective trigger operation according to the coordinates of the effective trigger operation;
The processor executes the corresponding operation according to the identifier of the key function item.
5. The method according to any one of claims 1 to 4, further comprising:
The first sensor of the electronic device sends a first incident angle to the processor after collecting first reflected light returned by the touch operation body, and the second sensor of the electronic device sends a second incident angle to the processor after collecting second reflected light returned by the touch operation body;
The processor receives the first incident angle sent by the first sensor and receives the second incident angle sent by the second sensor.
6. An electronic device, characterized in that the electronic device at least comprises: a light source, a first sensor, a second sensor, a memory, and a processor, wherein:
The light source is used for projecting light rays in a preset area;
The first sensor is used for sending a first incident angle to the processor after collecting first reflected light returned by the touch operation body;
The second sensor is used for acquiring second reflected light returned by the touch operation body and then sending a second incident angle to the processor;
The memory is used for storing a touch key program of an application;
The processor is configured to execute the touch key program stored in the memory to implement the following steps:
Controlling a light source of the electronic device to project light rays in a preset area, wherein the preset area comprises a touch area;
Receiving a first incident angle sent by a first sensor of the electronic device after the first sensor collects first reflected light returned by the touch operation body, and receiving a second incident angle sent by a second sensor of the electronic device after the second sensor collects second reflected light returned by the touch operation body;
The first incident angle is an included angle between a first straight line passing through the first sensor and the second sensor and a second straight line passing through the touch operation body and the first sensor; the second incident angle is an included angle between the first straight line and a third straight line passing through the touch operation body and the second sensor;
and determining a touch point of the touch operation body on the touch area according to the first incident angle, the second incident angle and the distance between the first sensor and the second sensor.
7. The electronic device as recited in claim 6, wherein said light source, said first sensor and said second sensor are placed at the same height, and said first sensor and said second sensor are placed at a predetermined distance on both sides or on the same side of said light source.
8. The electronic device of claim 6, wherein the determining of the touch point of the touch operation body on the touch area according to the first incident angle, the second incident angle and the distance between the first sensor and the second sensor comprises:
determining relation data of the touch point according to the first incident angle, the second incident angle and a tangent function relation;
and calculating the coordinates of the touch point on the touch area according to the relation data of the touch point and the distance between the first sensor and the second sensor.
9. The electronic device of claim 8, wherein the processor is further configured to execute the touch key program stored in the memory to implement the following steps:
If the coordinates of the touch point are in a preset trigger coordinate set, the processor determines the touch operation of the touch operation body as an effective trigger operation;
wherein the trigger coordinate set is a set of coordinates in the preset area corresponding to the key function items of the electronic device.
10. A computer-readable storage medium on which a touch key program of an application is stored, wherein the program, when executed by a processor, implements the touch key method according to any one of claims 1 to 5.
CN201810558059.0A 2018-06-01 2018-06-01 touch key method, device and readable storage medium Pending CN110554810A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810558059.0A CN110554810A (en) 2018-06-01 2018-06-01 touch key method, device and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810558059.0A CN110554810A (en) 2018-06-01 2018-06-01 touch key method, device and readable storage medium

Publications (1)

Publication Number Publication Date
CN110554810A (en) 2019-12-10

Family

ID=68734919

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810558059.0A Pending CN110554810A (en) 2018-06-01 2018-06-01 touch key method, device and readable storage medium

Country Status (1)

Country Link
CN (1) CN110554810A (en)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN201562261U (en) * 2009-07-28 2010-08-25 联想(北京)有限公司 Electronic equipment
CN202306504U (en) * 2010-08-26 2012-07-04 耀光联有限公司 Optical keyboard based on gesture control
CN102402344A (en) * 2010-09-09 2012-04-04 广达电脑股份有限公司 Optical touch system
CN102622131A (en) * 2011-01-28 2012-08-01 联想(北京)有限公司 Electronic equipment and positioning method
US20130187892A1 (en) * 2011-06-02 2013-07-25 Uc-Logic Technology Corp. Optical touch device
CN104049881A (en) * 2013-03-15 2014-09-17 鸿富锦精密工业(深圳)有限公司 Input device and method
CN104375714A (en) * 2013-08-16 2015-02-25 联想(北京)有限公司 Electric device and recognition operation method
GB201406918D0 (en) * 2014-04-17 2014-06-04 Light Blue Optics Inc Touch sensing systems

Legal Events

PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
Application publication date: 20191210