CN113157147B - Touch position determining method and device - Google Patents

Touch position determining method and device

Info

Publication number
CN113157147B
CN113157147B (application CN202110218162.2A)
Authority
CN
China
Prior art keywords
touch
ultrasonic
distance
preset
ultrasonic probe
Prior art date
Legal status
Active
Application number
CN202110218162.2A
Other languages
Chinese (zh)
Other versions
CN113157147A (en)
Inventor
李帅
Current Assignee
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN202110218162.2A priority Critical patent/CN113157147B/en
Publication of CN113157147A publication Critical patent/CN113157147A/en
Application granted granted Critical
Publication of CN113157147B publication Critical patent/CN113157147B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/043Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means using propagating acoustic waves
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416Control or interface arrangements specially adapted for digitisers
    • G06F3/04162Control or interface arrangements specially adapted for digitisers for exchanging data with external devices, e.g. smart pens, via the digitiser sensing hardware

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Acoustics & Sound (AREA)
  • Length Measuring Devices Characterised By Use Of Acoustic Means (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a touch position determining method, and belongs to the technical field of electronic equipment. The method comprises the following steps: acquiring, through at least two ultrasonic probes, a first distance detected by each ultrasonic probe; and determining the touch position of a touch object in the projection area according to the first distance and a preset second distance between the ultrasonic probes. In the method and the device, the at least two ultrasonic probes are arranged around the projection module of the electronic equipment, and the touch position of the touch object in the projection area is determined by using the first distance detected by each ultrasonic probe and the preset second distance between the ultrasonic probes, so that the touch position of the touch object on the projection display area can be determined.

Description

Touch position determining method and device
Technical Field
The application belongs to the technical field of electronic equipment, and particularly relates to a touch position determining method and device.
Background
With the rapid development of electronic devices, more and more users like to wear smart bracelets or watches to record data such as daily exercise and sleep, so as to monitor their workouts and manage their daily health.
In the prior art, due to the limited size of a smart bracelet or watch, man-machine interaction can be realized only through its small display screen. With the appearance of smart bracelets and watches with a projection function, the content to be displayed can be presented by projecting it onto the wrist, a wall, a desktop, or other surfaces.
However, such an electronic device only has the projection function; when a user performs a touch operation on the projection display area with a touch object, for example a finger or a pen tip, the conventional electronic device cannot determine the touch position.
Disclosure of Invention
An object of the embodiment of the application is to provide a method and a device for determining a touch position, which can solve the problem that in the prior art, an electronic device with a projection function cannot determine a touch position of a touch object on a projection display area.
In order to solve the technical problems, the application is realized as follows:
in a first aspect, an embodiment of the present application provides a method for determining a touch position, which is applied to an electronic device, where the electronic device includes a projection module and at least two ultrasonic probes, the at least two ultrasonic probes being distributed around the projection module, and the detection ranges of the at least two ultrasonic probes covering the projection area of the projection module; the method comprises the following steps:
acquiring a first distance detected by each ultrasonic probe through at least two ultrasonic probes;
and determining the touch position of the touch object in the projection area according to the first distance and a preset second distance between the ultrasonic probes.
In a second aspect, an embodiment of the present application provides a touch position determining apparatus, which is applied to an electronic device, where the electronic device includes a projection module and at least two ultrasonic probes, the at least two ultrasonic probes being distributed around the projection module, and the detection ranges of the at least two ultrasonic probes covering the projection area of the projection module; the apparatus comprises:
the acquisition module is used for acquiring a first distance detected by each ultrasonic probe through at least two ultrasonic probes;
and the determining module is used for determining the touch position of the touch object in the projection area according to the first distance and a preset second distance between the ultrasonic probes.
In a third aspect, an embodiment of the present application provides an electronic device, where the electronic device includes a processor, a memory, and a program or instructions stored on the memory and executable on the processor, where the program or instructions implement the steps of the touch location determining method according to the first aspect when executed by the processor.
In a fourth aspect, embodiments of the present application provide a readable storage medium having stored thereon a program or instructions which, when executed by a processor, implement the steps of the touch location determination method according to the first aspect.
In a fifth aspect, an embodiment of the present application provides a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the touch location determining method according to the first aspect.
According to the embodiment of the application, a first distance detected by each ultrasonic probe is obtained through at least two ultrasonic probes, and the touch position of the touch object in the projection area is determined according to the first distance and a preset second distance between the ultrasonic probes. In the application, at least two ultrasonic probes are arranged around the projection module of the electronic equipment, and the touch position of the touch object in the projection area is determined by using the first distance detected by each ultrasonic probe and the preset second distance between the ultrasonic probes, so that the touch position of the touch object on the projection display area can be determined.
Drawings
FIG. 1 is a flow chart of a touch location determination method of the present application;
FIG. 2 is a flow chart of another touch location determination method of the present application;
FIG. 3 is a projection area calibration plan view of the present application;
FIG. 4 is a schematic structural view of an electronic device of the present application;
FIG. 5 is a schematic structural view of an ultrasonic probe of the present application;
FIG. 6 is a schematic diagram of the operation of the ultrasound probe of the present application;
FIG. 7 is a schematic plan view of the triangular relationship of the present application;
FIG. 8 is a schematic structural view of another electronic device of the present application;
FIG. 9 is a block diagram of a touch location determination apparatus of the present application;
FIG. 10 is a block diagram of an electronic device of the present application;
fig. 11 is a schematic diagram of a hardware structure of an electronic device of the present application.
Detailed Description
The following description of the embodiments of the present application will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are some, but not all, of the embodiments of the present application. All other embodiments, which can be made by one of ordinary skill in the art based on the embodiments herein without making any inventive effort, are intended to be within the scope of the present application.
The terms first, second and the like in the description and in the claims, are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged, as appropriate, such that embodiments of the present application may be implemented in sequences other than those illustrated or described herein, and that the objects identified by "first," "second," etc. are generally of a type and not limited to the number of objects, e.g., the first object may be one or more. Furthermore, in the description and claims, "and/or" means at least one of the connected objects, and the character "/", generally means that the associated object is an "or" relationship.
Before explaining the touch position determining method provided in the embodiment of the present application in detail, the structure of the electronic device adopted in the embodiment of the present application is explained.
The electronic equipment of the application is a smart bracelet or watch comprising a projection module and at least two ultrasonic probes. The projection module is used for projecting a display picture, and the ultrasonic probes are used for transmitting ultrasonic waves toward an object and receiving them. The number of the ultrasonic probes is at least two, and the ultrasonic probes are distributed around the projection module. When the number of the ultrasonic probes is even, the ultrasonic probes are arranged equidistantly and symmetrically on two sides of the projection module and lie on the same horizontal straight line. When the number of the ultrasonic probes is odd, the ultrasonic probes are uniformly distributed around the projection module, so that every two adjacent ultrasonic probes form an included angle with the projection module and all included angles are equal. Moreover, the detection ranges of the at least two ultrasonic probes cover the projection area of the projection module. Each ultrasonic probe includes an ultrasonic transmitting unit and an ultrasonic receiving unit. In the following description, an even number of ultrasonic probes is taken as an example. Those skilled in the art may also employ other types of intelligent wearable devices including a projection module and at least two ultrasonic probes, as desired.
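For illustration only, the following Python sketch lays out hypothetical probe coordinates for the two arrangements described above (an even number of probes on one horizontal line, or an odd number spread at equal included angles around the projection module); the spacing and radius values are assumptions, not taken from the patent.

```python
import math

def probe_positions(n, spacing=0.01):
    """Illustrative probe layout around the projection module at the origin.

    Even n: probes sit symmetrically, equidistantly, on both sides of the
    module on one horizontal line. Odd n: probes are spread evenly around
    the module so that adjacent included angles are all equal.
    """
    if n % 2 == 0:
        # pairs at +/- spacing, +/- 2*spacing, ... along the x axis
        half = n // 2
        return [(s * spacing, 0.0) for k in range(1, half + 1) for s in (-k, k)]
    # evenly distributed on a circle of radius `spacing`
    return [(spacing * math.cos(2 * math.pi * k / n),
             spacing * math.sin(2 * math.pi * k / n)) for k in range(n)]

print(probe_positions(2))  # two probes, one on each side of the module
print(probe_positions(3))  # three probes at 120-degree included angles
```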
In the embodiment of the application, each ultrasonic probe has two working modes: its ultrasonic receiving unit receives the ultrasonic wave transmitted by its own ultrasonic transmitting unit, or its ultrasonic receiving unit receives the ultrasonic wave transmitted by the ultrasonic transmitting unit of at least one other ultrasonic probe.
For a better understanding of the structure of the electronic device of the present application, the following is an exemplary description with reference to the accompanying drawings:
referring to fig. 4, a schematic structural diagram of an electronic device of the present application is shown. As shown in fig. 4, the electronic device 400 includes an ultrasonic probe 401, an ultrasonic probe 402, and a projection module 403. The ultrasonic probe 401 and the ultrasonic probe 402 are symmetrically and equidistantly arranged on two sides of the projection module 403 and are positioned on the same horizontal straight line. After the projection module 403 projects the display content, a projection area 404 can be obtained. In this embodiment, the default projection module 403 projects the display content onto the back of the hand that is level with the wrist, and the complete projection area 404 is displayed on the back of the hand. The ultrasonic probe 401 and the ultrasonic probe 402 have a detectable range
Figure BDA0002954738730000041
Angle (S)/(S)>
Figure BDA0002954738730000042
The respective detectable ranges of the ultrasound probe 401 and the ultrasound probe 402 combine to completely cover the projection area 404.
Accordingly, referring to fig. 8, a schematic structural diagram of another electronic device of the present application is shown. As shown in fig. 8, the electronic device 800 includes an ultrasonic probe 801, an ultrasonic probe 802, an ultrasonic probe 803, an ultrasonic probe 804, and a projection module 805. The ultrasonic probe 801 and the ultrasonic probe 802 are symmetrically and equidistantly arranged on both sides of the projection module 805, and the ultrasonic probe 803 and the ultrasonic probe 804 are also symmetrically and equidistantly arranged on both sides of the projection module 805. The ultrasonic probe 801, the ultrasonic probe 802, the ultrasonic probe 803, and the ultrasonic probe 804 are positioned on the same horizontal straight line.
Referring to fig. 5, a schematic structural view of an ultrasonic probe of the present application is shown. As shown in fig. 5, each of the ultrasonic probes includes an ultrasonic transmitting unit 502 and an ultrasonic receiving unit 501. Wherein the ultrasonic transmitting unit 502 is used for transmitting ultrasonic waves, and the ultrasonic receiving unit 501 is used for receiving ultrasonic waves. The emitted wave emitted from the ultrasonic emission unit 502 of each ultrasonic probe may be received by the own ultrasonic receiving unit 501, or may be received by the ultrasonic receiving units 501 of other ultrasonic probes.
After understanding the structure of the electronic device of the present application, the touch location determining method provided in the embodiment of the present application is described in detail below through specific embodiments and application scenarios thereof.
Referring to fig. 1, a flowchart of a touch location determination method of the present application is shown. The method comprises the following steps:
step 101: acquiring a first distance detected by each ultrasonic probe through at least two ultrasonic probes;
in this embodiment of the present application, after the electronic device starts the projection function, the projection module of the electronic device projects the display screen onto the projection area. The touch object may perform a touch operation on the projected area. The touch object may be a finger, a pen point, or the like of the user. The touch operation may be a click operation. For example, the back of the hand and the wrist of the user keep the same horizontal plane, and after the projection function of the worn intelligent bracelet is started, the projection module of the intelligent bracelet starts the projection function, a projection area with a certain range is displayed on the back of the hand, and the projected picture content is displayed in the projection area. Then, the user can click in the projection area on the back of the hand with the finger, and the intelligent bracelet can determine the specific touch position of the finger in the projection area according to the ultrasonic probe. If the clicking operation of the finger of the user does not fall in the projection area on the back of the hand, the intelligent bracelet does not respond at all.
Specifically, when the electronic device acquires the first distance detected by the ultrasonic probe, the working modes of the ultrasonic probe are different, and the acquired first distance is also different.
When the ultrasonic receiving unit of each ultrasonic probe receives ultrasonic waves sent by the ultrasonic sending unit of the ultrasonic probe, the first distance is the distance between the ultrasonic probe and the touch object. Because the ultrasonic wave received by each ultrasonic probe is the ultrasonic wave emitted by the ultrasonic probe, for any ultrasonic probe, the electronic equipment records the emission time while the ultrasonic emission unit of the ultrasonic probe emits the ultrasonic wave, the ultrasonic wave propagates in the air and is reflected back by a touch object, and when the ultrasonic receiving unit of the ultrasonic probe receives the reflected wave, the electronic equipment records the receiving time. Further, the distance between the ultrasonic probe and the touch object can be obtained in the following manner: distance = 340t/2. Wherein 340m/s is the ultrasonic wave propagation speed in the atmospheric environment at 25 ℃, and t is the time difference between the receiving time and the transmitting time of the ultrasonic wave recorded by the electronic equipment. Illustratively, referring to fig. 6, a schematic diagram of the operation of the ultrasound probe of the present application is shown. As shown in fig. 6, the first distance a from the ultrasonic probe 601 to the touch object and the first distance B from the ultrasonic probe 602 to the touch object can be obtained by determining the first distance as described above.
When the ultrasonic receiving unit of each ultrasonic probe receives ultrasonic waves sent by the ultrasonic transmitting units of other ultrasonic probes, the first distance is the sum of the distance from the ultrasonic probe transmitting the ultrasonic waves to the touch object and the distance from the touch object to the ultrasonic probe receiving the ultrasonic waves. Because the ultrasonic wave received by each ultrasonic probe is the emission wave emitted by another ultrasonic probe, for any ultrasonic probe, the electronic equipment records the emission time while the ultrasonic transmitting unit of that probe emits the ultrasonic wave; the ultrasonic wave propagates in the air, is reflected by the touch object toward some other ultrasonic probe, and when the ultrasonic receiving unit of that other ultrasonic probe receives the ultrasonic wave, the electronic equipment records the receiving time. Further, the first distance may be obtained in the following manner: distance = 340t. Similarly, 340 m/s is the ultrasonic wave propagation speed in the atmospheric environment at 25 °C, and t is the time difference between the receiving time and the transmitting time of the ultrasonic wave recorded by the electronic equipment. Referring to fig. 6, a schematic diagram of the operation of the ultrasonic probe of the present application is shown. As shown in fig. 6, assuming that after the ultrasonic transmitting unit of the ultrasonic probe 601 transmits ultrasonic waves to the touch object, the ultrasonic receiving unit of the ultrasonic probe 602 receives them, the first distance determined as above = the distance A between the ultrasonic probe 601 and the touch object + the distance B between the ultrasonic probe 602 and the touch object.
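As a minimal illustration of the two working modes described above (assuming the 340 m/s speed of sound used in the description; the function names are ours, not the patent's):

```python
SPEED_OF_SOUND = 340.0  # m/s in air at about 25 degrees Celsius, as in the description

def echo_distance(t_send, t_receive):
    """Mode 1: a probe receives its own reflected wave.

    The wave travels to the touch object and back, so the one-way
    distance is half of speed * time difference (distance = 340t/2).
    """
    return SPEED_OF_SOUND * (t_receive - t_send) / 2.0

def path_distance(t_send, t_receive):
    """Mode 2: another probe receives the wave.

    The measured value is the full path length: sender -> touch object
    -> receiver (distance = 340t).
    """
    return SPEED_OF_SOUND * (t_receive - t_send)
```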
Step 102: determining a touch position of the touch object in the projection area according to the first distance and a preset second distance between the ultrasonic probes;
in the embodiment of the application, a distance data table is prestored in the electronic device, and the distance data table comprises distance data between all two ultrasonic probes. Specifically, when the ultrasonic receiving unit of each ultrasonic probe receives the ultrasonic wave sent by the ultrasonic sending unit of the ultrasonic probe, as shown in fig. 6, for example, according to the distance a between the ultrasonic probe 601 and the touch object, the distance B between the ultrasonic probe 602 and the touch object, and the distance D between the ultrasonic probe 601 and the ultrasonic probe 602 queried from the distance data table, the position of the touch object in the projection area can be determined by using a preset triangle relation formula. While the ultrasonic receiving unit of each ultrasonic probe receives the ultrasonic waves transmitted by the ultrasonic transmitting units of the other ultrasonic probes, for example, as shown in fig. 6, according to the first distance a+b determined in the foregoing step 101, the position of the touch object in the projection area may be determined by using a position determining model preset in the electronic device.
According to the embodiment of the application, a first distance detected by each ultrasonic probe is obtained through at least two ultrasonic probes, and the touch position of the touch object in the projection area is determined according to the first distance and a preset second distance between the ultrasonic probes. In the application, at least two ultrasonic probes are arranged around the projection module of the electronic equipment, and the touch position of the touch object in the projection area is determined by using the first distance detected by each ultrasonic probe and the preset second distance between the ultrasonic probes, so that the touch position of the touch object on the projection display area can be determined.
Referring to fig. 2, a flowchart of another touch location determination method of the present application is shown. The method comprises the following steps:
step 201: prompting a touch object to perform touch operation at a preset calibration point of the projection area;
in this embodiment of the present application, when the electronic device is turned on each time and the projection function is started for the first time, a projection area is displayed on the back of the user's hand, and at least one preset calibration point is displayed in the projection area. The preset calibration points are preset position points used for prompting a user to calibrate. Illustratively, referring to FIG. 3, a projection area calibration plan view of the present application is shown. As shown in fig. 3, five calibration points are displayed in the projection area, which are respectively an upper left corner preset calibration point "1", an upper right corner preset calibration point "2", a lower left corner preset calibration point "3", a lower right corner preset calibration point "4" and a center calibration point "5".
And the display screen of the electronic equipment presents prompt information to prompt a user to use the touch object to perform touch operation on preset calibration points of the projection area according to the indication sequence. For example, a prompt "please click on the projection area in order of number size and position" appears on the display screen.
Step 202: receiving a first input for each preset calibration point;
in the embodiment of the application, in order to improve the accuracy of calibration, the first input may be performed multiple times for each preset calibration point. Specifically, as shown in fig. 3, the user may click the position where "1" is located 5 times, click the position where "2" is located 5 times, click the position where "3" is located 5 times, click the position where "4" is located 5 times, and click the position where "5" is located 5 times last in sequence on the projection area on the back of the hand according to the prompt information.
Step 203: responding to the first input, and acquiring touch coordinates of the preset calibration point;
in this embodiment, the touch coordinates of the preset calibration point may be obtained in the manner of the subsequent steps 205 to 206. For example, assuming the electronic device is the one shown in fig. 6 and includes two ultrasonic probes, if the user clicks position "1" in fig. 3 five times, the ultrasonic probe 601 obtains the distance to the touch object five times, namely A1, A2, A3, A4, A5, and the final distance between the ultrasonic probe 601 and the touch object can be determined as (A1+A2+A3+A4+A5)/5. Similarly, the final distance between the ultrasonic probe 602 and the touch object is (B1+B2+B3+B4+B5)/5. The touch coordinates of the preset calibration point "1" can then be determined according to the triangle relation formula.
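A minimal sketch of this averaging step, with hypothetical distance readings in metres:

```python
def average_distance(samples):
    """Average repeated distance readings for one calibration point,
    e.g. (A1 + A2 + A3 + A4 + A5) / 5 for five clicks on point "1"."""
    return sum(samples) / len(samples)

a_final = average_distance([0.051, 0.049, 0.050, 0.052, 0.050])  # probe 601, hypothetical
b_final = average_distance([0.072, 0.070, 0.071, 0.069, 0.073])  # probe 602, hypothetical
# a_final and b_final are then fed into the triangle relation of step 2061
# to obtain the touch coordinates of preset calibration point "1".
```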
Step 204: calibrating the area range of the projection area according to the touch coordinates of each preset calibration point;
in the embodiment of the application, standard coordinates of each preset calibration point are stored in the electronic equipment in advance, and the acquired touch coordinates of each preset calibration point are compared with the standard coordinates to correct the touch coordinates of the preset calibration points, so that the calibration of the projection range of the projection area can be realized.
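The description does not spell out how the measured and standard coordinates are compared; one plausible realization is a least-squares affine correction fitted on the calibration points, sketched below (the function names and the use of NumPy are assumptions):

```python
import numpy as np

def fit_calibration(measured, standard):
    """Fit a 2D affine correction mapping measured calibration-point
    coordinates onto the stored standard coordinates (least squares).

    measured, standard: arrays of shape (n_points, 2).
    Returns a 2x2 matrix M and offset b so that corrected = point @ M + b.
    """
    measured = np.asarray(measured, dtype=float)
    standard = np.asarray(standard, dtype=float)
    # augment with a column of ones so the offset is solved for as well
    X = np.hstack([measured, np.ones((len(measured), 1))])
    params, *_ = np.linalg.lstsq(X, standard, rcond=None)
    return params[:2], params[2]

def apply_calibration(point, M, b):
    """Correct one subsequently measured touch coordinate."""
    return np.asarray(point) @ M + b
```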
In steps 201 to 204, each time the electronic device is powered on and the projection function is started for the first time, the electronic device prompts the user to calibrate the area range of the projection area, so as to ensure that the touch position of the touch object in the projection area can be accurately determined during actual use.
In this application, during the user's use of the electronic device, long-term error accumulation may also cause a deviation in determining the touch position of the touch object in the projection area. In order to ensure the accuracy of touch position determination during use, optionally, the method further includes the following steps:
receiving a second input of the touch object;
and responding to the second input, and entering a step of prompting the touch control object to perform touch control operation at a preset calibration point of the projection area.
Specifically, a hardware button is arranged on the electronic device, or a control with a preset pattern is displayed in the content of the projection area. If the user triggers the hardware button or the control with the preset pattern displayed in the projection area, the electronic device considers that the user needs to recalibrate the projection range of the projection area during use, and the display interface of the electronic device shows the prompt information again to prompt the user to perform a touch operation at the preset calibration points of the projection area with the touch object. In this embodiment, the second input is the user triggering the preset hardware button or the control with the preset pattern in the projection area. Once the user triggers the hardware button or the control with the preset pattern, step 201 is entered to trigger a calibration operation.
Step 205: acquiring a first distance detected by each ultrasonic probe through at least two ultrasonic probes;
in this embodiment of the present application, the modes of operation of the ultrasound probe are different, and the manner of acquiring the first distance is also different, and there are the following cases:
1) When the ultrasonic receiving unit of each ultrasonic probe receives ultrasonic waves sent by the ultrasonic transmitting unit of the ultrasonic probe:
Optionally, when the number of the ultrasonic probes is two, or when the number of the ultrasonic probes is at least three and the contact area between the touch object and the projection area is small, for example a pen tip, the first distance is the distance from the ultrasonic probe to the touch object.
Illustratively, as shown in fig. 6, a first distance a from the ultrasonic probe 601 to the touch object and a first distance B from the ultrasonic probe 602 to the touch object are acquired. The calculation method for determining the distance between the ultrasonic probe and the touch object is referred to the execution process of step 101, and will not be described herein.
Optionally, when the number of the ultrasonic probes is at least three and the contact area between the touch object and the projection area is large, such as a finger, the step 205 includes:
and acquiring first distances between the at least three ultrasonic probes and the touch object.
Specifically, in order to improve the accuracy of determining the touch position of the touch object on the projection area, when the touch object has a large contact area with the projection area, such as a finger, at least three ultrasonic probes can be arranged, and the first distance is the distance between an ultrasonic probe and the edge of the finger contour pressed on the projection area. In this embodiment, the calculation method for obtaining the first distances between the at least three ultrasonic probes and the edge of the finger contour may refer to the execution process of the aforementioned step 101, and will not be described again here.
2) When the ultrasonic receiving unit of each ultrasonic probe receives ultrasonic waves sent by the ultrasonic transmitting units of other ultrasonic probes:
whether the number of the ultrasonic probes is two or at least three, the first distance is the sum of the distance between any ultrasonic probe which emits ultrasonic waves and the touch object and the distance between the touch object and any other ultrasonic probe which receives the ultrasonic waves.
In the embodiment of the application, any two ultrasonic probes can be paired as an ultrasonic probe group, with one ultrasonic probe in each group acting as the sender and the other as the receiver. The correspondence between all paired ultrasonic probes, i.e. which probe transmits ultrasonic waves and which probe receives them, is stored in the electronic device in advance. In the embodiment of the application, by default, of two ultrasonic probes arranged symmetrically and equidistantly on both sides of the projection module, one transmits ultrasonic waves and the other receives them. Illustratively, as shown in fig. 6, the first distance is: the distance A between the ultrasonic probe 601 and the touch object + the distance B between the touch object and the ultrasonic probe 602. Likewise, as shown in fig. 8, the first distance may be the distance between the ultrasonic probe 801 and the touch object + the distance between the touch object and the ultrasonic probe 802, or the distance between the ultrasonic probe 803 and the touch object + the distance between the touch object and the ultrasonic probe 804. The correspondence between other transmitting and receiving ultrasonic probes can be set by a person skilled in the art according to actual requirements. For example, of two ultrasonic probes that are asymmetric and non-equidistant with respect to the projection module, one may transmit ultrasonic waves and the other may receive them. Illustratively, as shown in fig. 8, the first distance may be the distance between the ultrasonic probe 801 and the touch object + the distance between the touch object and the ultrasonic probe 804, or the distance between the ultrasonic probe 803 and the touch object + the distance between the touch object and the ultrasonic probe 802.
Step 206: determining touch coordinates of the touch object in a preset coordinate system according to the first distance and a preset second distance between the ultrasonic probes; wherein the coordinate system is set based on the position of the projection module on the electronic equipment;
in the embodiment of the application, the electronic device establishes a coordinate system based on the projection module. The position of the projection module is taken as the origin of coordinates, the projection direction of forward projection of the projection module is taken as the y axis, and the direction perpendicular to the projection direction is taken as the x axis. In the embodiment of the application, the height difference between the projection module and the projection area in the three-dimensional space is ignored.
Specifically, when determining the touch coordinates of the touch object in the preset coordinate system, there are the following cases:
1) When the ultrasonic receiving unit of each ultrasonic probe receives ultrasonic waves sent by the ultrasonic transmitting unit of the ultrasonic probe:
optionally, when the number of the ultrasonic probes is two, or the number of the ultrasonic probes is at least three, and the contact area between the touch object and the projection area is small, such as a pen point, the step 206 includes:
step 2061: calculating the first distance and a second distance between the preset ultrasonic probes according to a preset triangle relation formula, and determining touch coordinates of the touch object in a preset coordinate system;
In an embodiment of the present application, referring to fig. 7, a schematic plan view of a triangle relationship is shown. The following triangle relation formula can be adopted to determine the touch coordinate (x, y) of the touch object in a preset coordinate system:
x = (A^2 - B^2) / (2D)
y = sqrt(A^2 - (x + D/2)^2)
(with the two ultrasonic probes located at (-D/2, 0) and (D/2, 0) in the preset coordinate system, consistent with the probe coordinates given for fig. 8 below)
wherein A is the distance between one ultrasonic probe and the touch object, B is the distance between the other ultrasonic probe and the touch object, and D is the distance between the two ultrasonic probes.
Specifically, referring to fig. 6 for illustration, when the number of probes is two, the distance from the ultrasonic probe 601 to the touch object is A, the distance from the ultrasonic probe 602 to the touch object is B, and after the distance D between the ultrasonic probe 601 and the ultrasonic probe 602 is queried from the distance data table stored in the electronic device in advance, the touch coordinates (x, y) of the touch object in the preset coordinate system can be determined by substituting into the above formula. Referring to fig. 8 for illustration, when the number of probes is four and the touch object is assumed to be a pen tip, the distance between the ultrasonic probe 801 and the pen tip is obtained as E, the distance between the ultrasonic probe 802 and the pen tip is obtained as F, and the distance between the ultrasonic probe 801 and the ultrasonic probe 802 is queried as D1 in the distance data table stored in the electronic device in advance; substituting into the above formula determines the first touch coordinate (x1, y1) of the pen tip in the preset coordinate system. Similarly, if the distance between the ultrasonic probe 803 and the pen tip is obtained as C, the distance between the ultrasonic probe 804 and the pen tip is obtained as D, and the distance between the ultrasonic probe 803 and the ultrasonic probe 804 is found as D2 in the distance data table, the second touch coordinate (x2, y2) of the pen tip in the preset coordinate system can be determined by substituting into the above formula. Therefore, the final touch coordinates of the pen tip in the preset coordinate system are: ((x1 + x2)/2, (y1 + y2)/2). Obviously, when the number of probes is at least three, a plurality of touch coordinates of the touch object can be calculated and the final touch coordinates determined from them, so that the determined touch coordinates of the touch object are more accurate.
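A sketch of this computation, under the assumption that each probe pair sits at (-D/2, 0) and (D/2, 0) in the preset coordinate system (consistent with the probe coordinates given for fig. 8 below); all numeric readings are hypothetical:

```python
import math

def touch_from_pair(A, B, D):
    """Triangle relation for one probe pair at (-D/2, 0) and (D/2, 0):
    A is the distance from the probe at (-D/2, 0) to the touch object,
    B the distance from the probe at (D/2, 0), D the probe spacing."""
    x = (A**2 - B**2) / (2 * D)
    y = math.sqrt(max(A**2 - (x + D / 2)**2, 0.0))
    return x, y

def touch_from_pairs(measurements):
    """Average the coordinates produced by several probe pairs,
    e.g. [(E, F, D1), (C, D, D2)] as in the four-probe example above."""
    points = [touch_from_pair(A, B, D) for A, B, D in measurements]
    xs, ys = zip(*points)
    return sum(xs) / len(xs), sum(ys) / len(ys)

# Two-probe case from fig. 6 (hypothetical readings, metres):
print(touch_from_pair(0.06, 0.05, 0.03))
```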
Optionally, when the number of the ultrasonic probes is at least three and the contact area between the touch object and the projection area is large, such as a finger, the step 206 includes:
step 2062: according to at least three first distances, a preset model is adopted to determine touch coordinates of a touch center point in a preset coordinate system; the touch center point is a center point of the shape of the touch object;
in the embodiment of the application, when the contact area between the touch object and the projection area is large, in order to improve the accuracy of determining the touch coordinates, after the first distances between the at least three ultrasonic probes and the touch object are obtained in step 205, a preset model may be used to determine the touch coordinates of the shape center point of the touch object in a preset coordinate system. Specifically, the step 2062 includes the following substeps:
sub-step 20621: acquiring at least three first touch coordinates according to at least three first distances;
specifically, when the touch object is a finger, the electronic device may obtain distances from the at least three ultrasonic probes to the finger according to step 2011, where obtaining at least three distances is equivalent to that at least three touch points exist on the contour where the finger contacts the projection area. Further, the aforementioned step 2021 may be adopted, and a preset triangle relation formula is utilized to obtain the touch coordinates of at least three touch points. It should be noted that, since the touch point is in the projection area, any one of the ultrasonic probes can detect the touch point, that is, for any one of the touch points, at least the distance from the two ultrasonic probes to the touch point can be obtained, and then the touch coordinate of the touch point can be obtained by using a preset triangle relation formula, and the touch coordinate can be used as one of the first touch coordinates.
Sub-step 20622: according to the three first touch coordinates, a preset shape center fitting model is adopted to determine touch coordinates of a touch center point;
in the embodiment of the application, a shape center fitting model is preset in the electronic device, and is used to fit the contour shape of the object and the touch coordinates of the touch center point from at least three coordinates. Specifically, after at least three first touch coordinates of at least three touch points are obtained, these coordinates are input into the preset shape center fitting model, and the model determines the touch coordinates of the touch center point. The shape center fitting model can fit the contour shape of the finger in contact with the projection area from the at least three first touch coordinates. For example, the shape fitted for the touch object is a section of arc, and the at least three first touch coordinates lie at different positions on the arc, i.e. at least three points a, b and c exist on the arc. Connecting them gives segments ab and bc; the midpoints of ab and bc are determined and their perpendicular bisectors constructed, and the intersection point of the two perpendicular bisectors is the touch center point, from which the touch coordinates of the touch center point are obtained.
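The shape center fitting model itself is not detailed in the patent beyond the perpendicular-bisector construction; the following sketch computes that intersection (the circumcenter of the three points) directly, with hypothetical coordinates:

```python
def touch_center(p_a, p_b, p_c):
    """Circumcenter of three contour points a, b, c: the intersection of
    the perpendicular bisectors of segments ab and bc."""
    (ax, ay), (bx, by), (cx, cy) = p_a, p_b, p_c
    d = 2 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    if abs(d) < 1e-12:
        raise ValueError("points are collinear; no unique centre")
    ux = ((ax**2 + ay**2) * (by - cy) + (bx**2 + by**2) * (cy - ay)
          + (cx**2 + cy**2) * (ay - by)) / d
    uy = ((ax**2 + ay**2) * (cx - bx) + (bx**2 + by**2) * (ax - cx)
          + (cx**2 + cy**2) * (bx - ax)) / d
    return ux, uy

# Three first touch coordinates on the finger-contour arc (hypothetical):
print(touch_center((0.00, 0.05), (0.01, 0.055), (0.02, 0.05)))
```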
2) When the ultrasonic receiving unit of each ultrasonic probe receives ultrasonic waves sent by the ultrasonic transmitting units of other ultrasonic probes:
optionally, the step 206 includes:
step 2063: inputting the first distances detected by the plurality of groups of ultrasonic probes into a preset position determining model, and determining touch coordinates of the touch object in a preset coordinate system; each ultrasonic probe receives the emission wave of at least one other ultrasonic probe, and in each group of ultrasonic probes, one ultrasonic probe is a sender and one ultrasonic probe is a receiver.
Specifically, any two ultrasonic probes can be used as one ultrasonic probe group, one ultrasonic probe in each group of ultrasonic probes is used as a sender, and the other ultrasonic probe is used as a receiver. In the embodiment of the application, two symmetrical and equidistant ultrasonic probes on two sides of the projection module are used as one ultrasonic probe group. The first distance is: in each group of ultrasonic probes, the sum of the distance from one ultrasonic probe as an ultrasonic wave sender to a touch object and the distance from the touch object to the other ultrasonic probe as an ultrasonic wave receiver.
In the embodiment of the application, a position determination model is preset in the electronic equipment, and the position determination model is obtained through training in the following manner. The coordinate positions of the respective ultrasonic probes in the preset coordinate system are determined first; for example, referring to fig. 8, the coordinates of the ultrasonic probe 803 are (-D2/2, 0), the coordinates of the ultrasonic probe 804 are (D2/2, 0), the coordinates of the ultrasonic probe 801 are (-D1/2, 0), and the coordinates of the ultrasonic probe 802 are (D1/2, 0). Then, the touch object performs a touch operation at at least one position in the projection area, so as to determine the first distance detected by each group of ultrasonic probes; for example, referring to fig. 8, assume the touch object clicks at a certain position in the projection area, and at this time the first distance acquired by the ultrasonic probes 801 and 802 is E+F, and the first distance acquired by the ultrasonic probes 803 and 804 is C+D. Then, the input coordinates of the sample points are obtained from the first distances and the coordinates of the ultrasonic probes. For example, two sample point input coordinates may be obtained: ((-D1/2, 0), (D1/2, 0), E+F) and ((-D2/2, 0), (D2/2, 0), C+D). Moreover, a correspondence table of preset touch coordinates for different ultrasonic probe groups at different touch positions can be preset by a person skilled in the art. Finally, a plurality of such sample input coordinates, together with the correspondence table of preset touch coordinates for different ultrasonic probe groups at different touch positions, are input into a neural network model for training, so as to obtain the position determination model. For example, the touch position (x, y) corresponding to ((-D1/2, 0), (D1/2, 0), E+F) is stored in the position determination model. It should be noted that, when the position determination model is actually trained, touch operations can be performed at as many positions of the projection area as possible so as to determine a large number of sample input point coordinates, so that the trained position determination model can determine the touch coordinates of the touch object in the preset coordinate system more accurately.
After the position determination model is obtained, in actual operation and use, only the first distances detected by the plurality of groups of ultrasonic probes are input into the preset position determination model, and the preset position determination model can output the relative touch position (x, y).
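The patent does not specify the network architecture or training framework; the following sketch, using scikit-learn's MLPRegressor as an assumed stand-in, only illustrates the shape of the training samples and the inference call described above (all numeric values are placeholders, not data from the patent):

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Each training sample flattens the input coordinates of one measurement:
# for every probe group, the two probe positions plus the measured path sum,
# e.g. (-D1/2, 0, D1/2, 0, E+F, -D2/2, 0, D2/2, 0, C+D). The label is the
# touch position (x, y) from the preset correspondence table.
X_train = np.array([
    [-0.010, 0, 0.010, 0, 0.112, -0.020, 0, 0.020, 0, 0.118],
    [-0.010, 0, 0.010, 0, 0.095, -0.020, 0, 0.020, 0, 0.101],
])
y_train = np.array([
    [0.015, 0.050],
    [-0.010, 0.042],
])

model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000, random_state=0)
model.fit(X_train, y_train)

# In use, feed the first distances detected by the probe groups (with the
# fixed probe coordinates) and read off the predicted touch position (x, y).
print(model.predict(X_train[:1]))
```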
Step 207: and determining the touch position of the touch coordinate in the projection area based on the area range of the projection area of the projection module in the coordinate system.
In the embodiment of the application, the content projected by the projection module has a certain area range. The electronic equipment divides the projection content within the projection area range in advance to obtain a plurality of sub-area ranges; after the touch coordinates of the touch object are determined, the sub-area range in which the touch coordinates fall is judged, and the touch position of the touch object can then be determined, for example which part of the projection content the user's finger clicked.
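A minimal sketch of this sub-region lookup (the region names and bounds are hypothetical):

```python
def hit_test(touch_xy, subregions):
    """Return the name of the sub-region the touch coordinate falls in.

    subregions: {name: (x_min, y_min, x_max, y_max)} within the calibrated
    projection area, e.g. one rectangle per on-screen control.
    """
    x, y = touch_xy
    for name, (x0, y0, x1, y1) in subregions.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None  # outside every sub-region: no response

regions = {"button_ok": (0.00, 0.04, 0.02, 0.06),
           "button_back": (-0.02, 0.04, 0.00, 0.06)}
print(hit_test((0.01, 0.05), regions))  # -> "button_ok"
```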
Optionally, in the embodiment of the present application, the working manner of the ultrasonic probe may further be: in the case where at least two ultrasonic probes are operated in a time-sharing manner, the frequencies of the transmission waves of the respective ultrasonic probes are the same, or in the case where at least two ultrasonic probes are operated simultaneously, the frequencies of the transmission waves of the respective ultrasonic probes are different from each other.
With the former approach, since the ultrasonic probes do not operate simultaneously, the power consumption of the electronic device can be reduced. With the latter approach, since all probes operate simultaneously, the detection range is larger and the first distance detected by each ultrasonic probe can be obtained at the same time, so the touch position of the touch object in the projection area can be determined more quickly. In addition, since the frequencies of the emitted waves of the ultrasonic probes are all different, it can be ensured that the probes do not interfere with one another.
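A small sketch of these two operating plans (the frequency and time-slot values are assumptions, not taken from the patent):

```python
def probe_schedule(probe_ids, simultaneous, base_freq=40_000, freq_step=2_000, slot_ms=10):
    """Illustrative operating plan: distinct emission frequencies when the
    probes work simultaneously, or one shared frequency with separate time
    slots when they work in a time-sharing manner."""
    if simultaneous:
        return {p: {"freq_hz": base_freq + i * freq_step}
                for i, p in enumerate(probe_ids)}
    return {p: {"freq_hz": base_freq, "slot_ms": (i * slot_ms, (i + 1) * slot_ms)}
            for i, p in enumerate(probe_ids)}

print(probe_schedule([801, 802, 803, 804], simultaneous=True))
print(probe_schedule([801, 802, 803, 804], simultaneous=False))
```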
According to the embodiment of the application, the touch object is prompted to perform a touch operation at a preset calibration point of the projection area; a first input for each preset calibration point is received; in response to the first input, the touch coordinates of the preset calibration point are acquired; and the area range of the projection area is calibrated according to the touch coordinates of each preset calibration point. After the area range of the projection area is calibrated, the first distance detected by each ultrasonic probe is acquired through the at least two ultrasonic probes, and the touch position of the touch object in the projection area is determined according to the first distance and a preset second distance between the ultrasonic probes. In the application, at least two ultrasonic probes are arranged around the projection module of the electronic equipment; after the area range of the projection area is calibrated, the touch position of the touch object in the projection area is determined by using the first distance detected by each ultrasonic probe and the preset second distance between the ultrasonic probes, so that the touch position of the touch object on the projection display area can be determined accurately.
It should be noted that, in the touch location determining method provided in the embodiments of the present application, the execution body may be a touch location determining device, or a control module in the touch location determining device for executing the touch location determining method. In the embodiment of the present application, a touch position determining device executes a touch position determining method as an example, and the touch position determining device provided in the embodiment of the present application is described.
Referring to fig. 9, a block diagram of a touch location determination apparatus 900 of the present application is shown. The apparatus is applied to an electronic device, and the electronic device comprises a projection module and at least two ultrasonic probes, the at least two ultrasonic probes being distributed around the projection module, and the detection ranges of the at least two ultrasonic probes covering the projection area of the projection module. The apparatus 900 includes:
an acquisition module 901, configured to acquire a first distance detected by each of the at least two ultrasonic probes;
a determining module 902, configured to determine a touch position of the touch object in the projection area according to the first distance and a preset second distance between the ultrasonic probes.
Optionally, the determining module 902 includes:
The first determining module is used for determining touch coordinates of the touch object in a preset coordinate system according to the first distance and a preset second distance between the ultrasonic probes; wherein the coordinate system is set based on the position of the projection module on the electronic equipment;
and the second determining module is used for determining the touch position of the touch coordinate in the projection area based on the area range of the projection area of the projection module in the coordinate system.
Optionally, the apparatus 900 further includes:
the prompting module is used for prompting the touch control object to perform touch control operation at a preset calibration point of the projection area;
a first receiving module for receiving a first input for each preset calibration point;
the first response module is used for responding to the first input and acquiring touch coordinates of the preset calibration point;
and the calibration module is used for calibrating the area range of the projection area according to the touch coordinates of each preset calibration point.
Optionally, the apparatus 900 further includes:
the second receiving module is used for receiving a second input of the touch object;
and the second response module is used for responding to the second input and entering the prompting module.
Optionally, the number of the ultrasonic probes is at least three; the acquiring module 901 is specifically configured to acquire a first distance between the at least three ultrasonic probes and the touch object;
optionally, the determining module 902 includes:
the determining submodule is used for determining touch coordinates of the touch center point in a preset coordinate system by adopting a preset model according to at least three first distances; the touch center point is a center point of the shape of the touch object.
Optionally, the determining submodule includes:
the first acquisition sub-module is used for acquiring at least three first touch coordinates according to at least three first distances;
and the first determining submodule is used for determining the touch coordinates of the touch center point by adopting a preset shape center fitting model according to the three first touch coordinates.
Optionally, the first determining module includes:
the first coordinate determining module is used for calculating the first distance and the second distance according to a preset triangle relation formula and determining touch coordinates of the touch object in a preset coordinate system.
Optionally, the first determining module includes:
The second coordinate determining module is used for inputting the first distances detected by the plurality of groups of ultrasonic probes into a preset position determining model and determining the touch coordinates of the touch object in a preset coordinate system; each ultrasonic probe receives the emission wave of at least one other ultrasonic probe, and in each group of ultrasonic probes, one ultrasonic probe is a sender and one ultrasonic probe is a receiver.
Optionally, the frequencies of the emitted waves of the ultrasonic probes are the same under the condition that the at least two ultrasonic probes work in a time-sharing mode; and under the condition that the at least two ultrasonic probes work simultaneously, the frequencies of the emitted waves of the ultrasonic probes are different from each other.
According to the embodiment of the application, a first distance detected by each ultrasonic probe is obtained through at least two ultrasonic probes, and the touch position of the touch object in the projection area is determined according to the first distance and a preset second distance between the ultrasonic probes. In the application, at least two ultrasonic probes are arranged around the projection module of the electronic equipment, and the touch position of the touch object in the projection area is determined by using the first distance detected by each ultrasonic probe and the preset second distance between the ultrasonic probes, so that the touch position of the touch object on the projection display area can be determined.
The touch position determining device in the embodiment of the application may be a device, or may be a component, an integrated circuit, or a chip in a terminal. The device may be a mobile electronic device or a non-mobile electronic device. By way of example, the mobile electronic device may be a cell phone, tablet computer, notebook computer, palm computer, vehicle-mounted electronic device, wearable device, ultra-mobile personal computer (UMPC), netbook or personal digital assistant (PDA), etc., and the non-mobile electronic device may be a server, network attached storage (NAS), personal computer (PC), television (TV), teller machine or self-service machine, etc.; the embodiments of the present application are not specifically limited in this respect.
The touch location determining device in the embodiment of the present application may be a device with an operating system. The operating system may be an Android operating system, an iOS operating system, or another possible operating system, which is not specifically limited in the embodiments of the present application.
The touch location determining device provided in the embodiment of the present application can implement each process implemented by the method embodiments of fig. 1 to 8, and in order to avoid repetition, a detailed description is omitted here.
Optionally, as shown in fig. 10, the embodiment of the present application further provides an electronic device 1000, including a processor 1001, a memory 1002, and a program or an instruction stored in the memory 1002 and capable of being executed on the processor 1001, where the program or the instruction implements each process of the embodiment of the touch location determination method described above when executed by the processor 1001, and the same technical effects can be achieved, and for avoiding repetition, a description is omitted herein.
The electronic device in the embodiment of the application includes the mobile electronic device and the non-mobile electronic device described above.
Fig. 11 is a schematic hardware structure of an electronic device implementing an embodiment of the present application.
The electronic device 1100 includes, but is not limited to: radio frequency unit 1101, network module 1102, audio output unit 1103, input unit 1104, sensor 1105, display unit 1106, user input unit 1107, interface unit 1108, memory 1109, and processor 1110.
Those skilled in the art will appreciate that the electronic device 1100 may further include a power source (e.g., a battery) for powering the various components, and the power source may be logically connected to the processor 1110 through a power management system, so that functions such as charge management, discharge management, and power consumption management are performed by the power management system. The electronic device structure shown in Fig. 11 does not constitute a limitation of the electronic device; the electronic device may include more or fewer components than illustrated, combine some components, or have a different arrangement of components, which is not described in detail here.
The processor 1110 is configured to obtain, through at least two ultrasonic probes, a first distance detected by each of the ultrasonic probes; and determine the touch position of the touch object in the projection area according to the first distance and a preset second distance between the ultrasonic probes.
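By way of a non-limiting illustration, the following sketch shows how a first distance might be derived from an ultrasonic echo time, consistent with the two cases described for the first distance (a probe receiving its own reflection versus a probe receiving another probe's emitted wave). The speed-of-sound value and the function interface are assumptions for illustration only, not part of the original disclosure.

```python
# Illustrative sketch only; the speed-of-sound constant and this interface are assumptions.
SPEED_OF_SOUND_M_PER_S = 343.0  # approximate speed of sound in air at ~20 degrees C


def first_distance_from_echo(echo_time_s: float, same_probe: bool) -> float:
    """Derive the first distance (in metres) from a measured echo time.

    same_probe=True : the probe receives its own reflection, so the wave travels
                      to the touch object and back; the one-way distance is half
                      the total path.
    same_probe=False: the transmitting and receiving probes differ, so the first
                      distance is the full transmitter-to-object-to-receiver path.
    """
    total_path = SPEED_OF_SOUND_M_PER_S * echo_time_s
    return total_path / 2.0 if same_probe else total_path
```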
Optionally, the processor 1110 is specifically configured to determine touch coordinates of the touch object in a preset coordinate system according to the first distance and a preset second distance between the ultrasonic probes, where the coordinate system is set based on the position of the projection module on the electronic device; and determine the touch position corresponding to the touch coordinates in the projection area based on the area range of the projection area of the projection module in the coordinate system.
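A minimal sketch of the second step above, assuming (purely for illustration) that the area range of the projection region in the preset coordinate system is rectangular; the rectangular form and the function names are assumptions, not part of the disclosure.

```python
from typing import Optional, Tuple


def locate_in_projection_area(
    touch_xy: Tuple[float, float],
    area_range: Tuple[float, float, float, float],  # (x_min, y_min, x_max, y_max)
) -> Optional[Tuple[float, float]]:
    """Map touch coordinates in the preset coordinate system to a touch position
    relative to the projection area, or return None if the touch lies outside it."""
    x, y = touch_xy
    x_min, y_min, x_max, y_max = area_range
    if x_min <= x <= x_max and y_min <= y <= y_max:
        return (x - x_min, y - y_min)
    return None
```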
Optionally, the processor 1110 is further configured to prompt the touch object to perform a touch operation at a preset calibration point of the projection area; receive a first input for each preset calibration point; in response to the first input, acquire the touch coordinates of the preset calibration point; and calibrate the area range of the projection area according to the touch coordinates of each preset calibration point.
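For the calibration flow just described, one plausible (assumed) realisation is to collect the touch coordinates recorded at the preset calibration points, for example the four corners of the projected image, and calibrate the area range as their bounding box; the bounding-box choice is an assumption rather than the disclosed calibration rule.

```python
from typing import List, Tuple


def calibrate_area_range(
    calibration_touches: List[Tuple[float, float]]
) -> Tuple[float, float, float, float]:
    """Calibrate an axis-aligned area range (x_min, y_min, x_max, y_max) from the
    touch coordinates acquired at the preset calibration points."""
    xs = [p[0] for p in calibration_touches]
    ys = [p[1] for p in calibration_touches]
    return (min(xs), min(ys), max(xs), max(ys))
```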
Optionally, the processor 1110 is further configured to receive a second input of the touch object; and in response to the second input, enter the step of prompting the touch object to perform a touch operation at a preset calibration point of the projection area.
Optionally, the number of the ultrasonic probes is at least three; the processor 1110 is specifically configured to obtain a first distance between the at least three ultrasonic probes and the touch object.
Optionally, the processor 1110 is specifically configured to determine, according to at least three first distances, a touch coordinate of a touch center point in a preset coordinate system by adopting a preset model; the touch center point is a center point of the shape of the touch object.
Optionally, the processor 1110 is specifically configured to obtain at least three first touch coordinates according to at least three first distances; and determine the touch coordinates of the touch center point by adopting a preset shape center fitting model according to the three first touch coordinates.
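The preset shape centre fitting model is not specified here; as a simple stand-in, the sketch below averages the first touch coordinates to approximate the centre point of the touch object's contact shape. The centroid choice is an assumption and not the disclosed fitting model.

```python
from typing import Sequence, Tuple


def fit_touch_center(
    first_touch_coords: Sequence[Tuple[float, float]]
) -> Tuple[float, float]:
    """Approximate the touch centre point from at least three first touch coordinates."""
    n = len(first_touch_coords)
    cx = sum(p[0] for p in first_touch_coords) / n
    cy = sum(p[1] for p in first_touch_coords) / n
    return (cx, cy)
```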
Optionally, the processor 1110 is specifically configured to calculate the first distance and the second distance according to a preset triangle relation formula, and determine touch coordinates of the touch object in a preset coordinate system.
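One way to read the preset triangle relation formula, offered only as an assumption, is as the intersection of two circles: with probe A at the coordinate origin, probe B at (d, 0) a second distance d away, and first distances r1 and r2 to the touch object, the touch coordinates follow directly. Choosing the positive-y branch (in front of the probe line) is also an assumption.

```python
import math
from typing import Optional, Tuple


def touch_coords_from_triangle(r1: float, r2: float, d: float) -> Optional[Tuple[float, float]]:
    """Solve the probe-A / probe-B / touch-object triangle for touch coordinates.

    r1, r2: first distances from probe A and probe B to the touch object.
    d     : preset second distance between the two probes.
    """
    if d <= 0:
        return None
    x = (r1 * r1 - r2 * r2 + d * d) / (2.0 * d)
    y_squared = r1 * r1 - x * x
    if y_squared < 0:  # reported distances are inconsistent with the probe geometry
        return None
    return (x, math.sqrt(y_squared))


# Example with illustrative values: probes 0.10 m apart, first distances 0.50 m and 0.45 m.
# touch_coords_from_triangle(0.50, 0.45, 0.10) -> approximately (0.2875, 0.409)
```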
Optionally, the processor 1110 is specifically configured to input the first distances detected by multiple groups of ultrasonic probes into a preset position determining model, and determine the touch coordinates of the touch object in a preset coordinate system; each ultrasonic probe receives the emitted wave of at least one other ultrasonic probe, and in each group of ultrasonic probes, one ultrasonic probe serves as a transmitter and the other serves as a receiver.
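When each group's first distance is a transmitter-to-object-to-receiver path length, every group constrains the touch point to an ellipse whose foci are the two probes of the group. The sketch below recovers the touch coordinates from several such groups with a least-squares solve; the numeric approach and the scipy dependency are assumptions, not the disclosed preset position determining model.

```python
import numpy as np
from scipy.optimize import least_squares


def locate_from_probe_groups(tx_positions, rx_positions, first_distances, x0=(0.0, 0.3)):
    """Estimate touch coordinates from grouped bistatic measurements.

    tx_positions, rx_positions: (N, 2) arrays of transmitter / receiver coordinates per group.
    first_distances           : length-N array of transmitter-to-object-to-receiver path lengths.
    """
    tx = np.asarray(tx_positions, dtype=float)
    rx = np.asarray(rx_positions, dtype=float)
    d = np.asarray(first_distances, dtype=float)

    def residuals(p):
        # For each group: modelled path length minus the measured first distance.
        return np.linalg.norm(tx - p, axis=1) + np.linalg.norm(rx - p, axis=1) - d

    return least_squares(residuals, x0).x
```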
Optionally, when the at least two ultrasonic probes operate in a time-sharing manner, the frequencies of the emitted waves of the ultrasonic probes are the same; when the at least two ultrasonic probes operate simultaneously, the frequencies of the emitted waves of the ultrasonic probes are different from each other.
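A small illustrative helper for the drive configuration just described: in time-shared operation every probe may use the same emission frequency, while in simultaneous operation each probe is given a distinct frequency so that received waves can be attributed to their source. The base frequency and frequency spacing below are assumptions for illustration only.

```python
from typing import List


def probe_emission_frequencies(num_probes: int, simultaneous: bool,
                               base_hz: float = 40_000.0, step_hz: float = 2_000.0) -> List[float]:
    """Assign an emission frequency to each ultrasonic probe."""
    if simultaneous:
        # Distinct frequencies so simultaneously emitted waves can be told apart.
        return [base_hz + i * step_hz for i in range(num_probes)]
    # Time-sharing: all probes may share the same frequency.
    return [base_hz] * num_probes
```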
According to the embodiment of the application, a first distance detected by each ultrasonic probe is obtained through at least two ultrasonic probes, and the touch position of the touch object in the projection area is determined according to the first distance and a preset second distance between the ultrasonic probes. In the application, at least two ultrasonic probes are arranged around the projection module of the electronic device, and the first distances detected by the ultrasonic probes, together with the preset second distance between the ultrasonic probes, are used to determine the touch position of the touch object in the projection area, so that the touch position of the touch object on the projection display area can be determined.
It should be appreciated that in embodiments of the present application, the input unit 1104 may include a graphics processor (Graphics Processing Unit, GPU) 11041 and a microphone 11042, the graphics processor 11041 processing image data of still pictures or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The display unit 1106 may include a display panel 11061, and the display panel 11061 may be configured in the form of a liquid crystal display, an organic light emitting diode, or the like. The user input unit 1107 includes a touch panel 11071 and other input devices 11072. The touch panel 11071 is also referred to as a touch screen. The touch panel 11071 may include two parts, a touch detection device and a touch controller. Other input devices 11072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and so forth, which are not described in detail herein. Memory 1109 may be used to store software programs as well as various data including, but not limited to, application programs and an operating system. The processor 1110 may integrate an application processor that primarily processes operating systems, user interfaces, applications, etc., with a modem processor that primarily processes wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 1110.
The embodiment of the application further provides a readable storage medium, on which a program or an instruction is stored, where the program or the instruction, when executed by a processor, implements each process of the touch position determining method embodiment described above and achieves the same technical effects; to avoid repetition, details are not repeated here.
The processor is the processor in the electronic device described in the above embodiment. The readable storage medium includes a computer readable storage medium, such as a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The embodiment of the application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to run a program or an instruction to implement each process of the above touch position determining method embodiment and achieve the same technical effects; to avoid repetition, details are not repeated here.
It should be understood that the chip referred to in the embodiments of the present application may also be referred to as a system-level chip, a system chip, a chip system, or a system-on-chip chip, etc.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element preceded by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element. Furthermore, it should be noted that the scope of the methods and apparatuses in the embodiments of the present application is not limited to performing the functions in the order shown or discussed; the functions may also be performed in a substantially simultaneous manner or in the reverse order, depending on the functions involved. For example, the described methods may be performed in an order different from that described, and various steps may be added, omitted, or combined. Additionally, features described with reference to certain examples may be combined in other examples.
From the above description of the embodiments, it will be clear to those skilled in the art that the methods of the above embodiments may be implemented by means of software plus a necessary general-purpose hardware platform, or by means of hardware, although in many cases the former is the preferred implementation. Based on such understanding, the technical solution of the present application, or the part thereof contributing to the prior art, may be embodied in the form of a software product stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disk), including several instructions for causing a terminal (which may be a mobile phone, a computer, a server, an air conditioner, or a network device, etc.) to perform the methods described in the embodiments of the present application.
The embodiments of the present application have been described above with reference to the accompanying drawings, but the present application is not limited to the above-described embodiments, which are merely illustrative and not restrictive; in light of the present application, those of ordinary skill in the art may make many variations without departing from the spirit of the present application and the scope of the claims, all of which fall within the protection of the present application.

Claims (10)

1. The touch position determining method is characterized by being applied to electronic equipment, wherein the electronic equipment comprises: a projection module and at least two ultrasonic probes, wherein the at least two ultrasonic probes are distributed around the projection module; the detection ranges of the at least two ultrasonic probes cover the projection area of the projection module; the method comprises the following steps:
acquiring a first distance detected by each ultrasonic probe through at least two ultrasonic probes;
determining a touch position of a touch object in the projection area according to the first distance and a preset second distance between the ultrasonic probes;
when the ultrasonic waves received by the ultrasonic probe come from the ultrasonic probe, the first distance is the distance between the ultrasonic probe and the touch object; when the ultrasonic waves received by the ultrasonic probe are from other ultrasonic probes except the ultrasonic probe, the first distance is the sum of the distance between any ultrasonic probe which transmits ultrasonic waves and the touch object and the distance between the touch object and any other ultrasonic probe which receives the ultrasonic waves;
when the number of the ultrasonic probes is at least three and the contact area of the touch object and the projection area is greater than a preset threshold, the acquiring, by at least two ultrasonic probes, the first distance detected by each ultrasonic probe includes:
acquiring first distances between the at least three ultrasonic probes and the touch object;
acquiring at least three first touch coordinates according to at least three first distances;
and according to the three first touch coordinates, a preset shape center fitting model is adopted to determine the touch coordinates of the touch center point.
2. The method of claim 1, wherein determining the touch position of the touch object in the projection area according to the first distance and a preset second distance between the ultrasonic probes comprises:
determining touch coordinates of the touch object in a preset coordinate system according to the first distance and a preset second distance between the ultrasonic probes; wherein the coordinate system is set based on the position of the projection module on the electronic equipment;
and determining the touch position of the touch coordinate in the projection area based on the area range of the projection area of the projection module in the coordinate system.
3. The method of claim 2, further comprising, prior to acquiring the first distance detected by each of the ultrasonic probes through at least two ultrasonic probes:
prompting a touch object to perform touch operation at a preset calibration point of the projection area;
receiving a first input for each preset calibration point;
responding to the first input, and acquiring touch coordinates of the preset calibration point;
and calibrating the area range of the projection area according to the touch coordinates of each preset calibration point.
4. The method according to claim 3, further comprising:
receiving a second input of the touch object;
and responding to the second input, and entering the step of prompting the touch object to perform a touch operation at a preset calibration point of the projection area.
5. The method according to claim 2, wherein determining the touch coordinates of the touch object in a preset coordinate system according to the first distance and a preset second distance between the ultrasonic probes comprises:
and calculating the first distance and the preset second distance between the ultrasonic probes according to a preset triangle relation formula, and determining the touch coordinates of the touch object in a preset coordinate system.
6. The method according to claim 2, wherein determining the touch coordinates of the touch object in a preset coordinate system according to the first distance and a preset second distance between the ultrasonic probes comprises:
inputting the first distances detected by the plurality of groups of ultrasonic probes into a preset position determining model, and determining touch coordinates of the touch object in a preset coordinate system; each ultrasonic probe receives the emission wave of at least one other ultrasonic probe, and in each group of ultrasonic probes, one ultrasonic probe is a sender and one ultrasonic probe is a receiver.
7. The method according to claim 1, wherein the frequencies of the emitted waves of the respective ultrasonic probes are the same in the case where the at least two ultrasonic probes are operated in a time-sharing manner; and under the condition that the at least two ultrasonic probes work simultaneously, the frequencies of the emitted waves of the ultrasonic probes are different from each other.
8. A touch position determining apparatus, characterized by being applied to an electronic device, the electronic device comprising: a projection module and at least two ultrasonic probes, wherein the at least two ultrasonic probes are distributed around the projection module; the detection ranges of the at least two ultrasonic probes cover the projection area of the projection module; the apparatus comprises:
the acquisition module is used for acquiring a first distance detected by each ultrasonic probe through at least two ultrasonic probes;
the determining module is used for determining the touch position of the touch object in the projection area according to the first distance and a preset second distance between the ultrasonic probes;
when the ultrasonic waves received by the ultrasonic probe come from the ultrasonic probe, the first distance is the distance between the ultrasonic probe and the touch object; when the ultrasonic waves received by the ultrasonic probe are from other ultrasonic probes except the ultrasonic probe, the first distance is the sum of the distance between any ultrasonic probe which transmits ultrasonic waves and the touch object and the distance between the touch object and any other ultrasonic probe which receives the ultrasonic waves;
when the number of the ultrasonic probes is at least three, and the contact area of the touch object and the projection area is larger than a preset threshold value, the acquisition module is further used for acquiring first distances between the at least three ultrasonic probes and the touch object; acquiring at least three first touch coordinates according to at least three first distances; and according to the three first touch coordinates, a preset shape center fitting model is adopted to determine the touch coordinates of the touch center point.
9. An electronic device comprising a processor, a memory, and a program or instruction stored on the memory and executable on the processor, which, when executed by the processor, implements the steps of the touch position determining method of any of claims 1-7.
10. A readable storage medium, wherein a program or instructions is stored on the readable storage medium, which, when executed by a processor, implements the steps of the touch position determining method according to any of claims 1-7.
CN202110218162.2A 2021-02-26 2021-02-26 Touch position determining method and device Active CN113157147B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110218162.2A CN113157147B (en) 2021-02-26 2021-02-26 Touch position determining method and device

Publications (2)

Publication Number Publication Date
CN113157147A CN113157147A (en) 2021-07-23
CN113157147B true CN113157147B (en) 2023-05-23

Family

ID=76883530

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110218162.2A Active CN113157147B (en) 2021-02-26 2021-02-26 Touch position determining method and device

Country Status (1)

Country Link
CN (1) CN113157147B (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108920067A (en) * 2018-05-15 2018-11-30 维沃移动通信有限公司 A kind of camera focusing method and mobile terminal

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20150092962A (en) * 2014-02-06 2015-08-17 삼성전자주식회사 Method for processing data and an electronic device thereof
US10394392B2 (en) * 2015-01-14 2019-08-27 Atmel Corporation Object detection and scan
CN108829294B (en) * 2018-04-11 2021-08-27 卡耐基梅隆大学 Projection touch method and device and projection touch equipment
CN110989844A (en) * 2019-12-16 2020-04-10 广东小天才科技有限公司 Input method, watch, system and storage medium based on ultrasonic waves
CN111813272A (en) * 2020-05-29 2020-10-23 维沃移动通信有限公司 Information input method and device and electronic equipment

Also Published As

Publication number Publication date
CN113157147A (en) 2021-07-23

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant