CN112615615B - Touch positioning method, device, equipment and medium - Google Patents

Touch positioning method, device, equipment and medium

Info

Publication number
CN112615615B
CN112615615B (application CN202011422873.3A)
Authority
CN
China
Prior art keywords
state
positioning
target
key
interface
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011422873.3A
Other languages
Chinese (zh)
Other versions
CN112615615A (en)
Inventor
徐协增
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Anhui Hongcheng Opto Electronics Co Ltd
Original Assignee
Anhui Hongcheng Opto Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Anhui Hongcheng Opto Electronics Co Ltd filed Critical Anhui Hongcheng Opto Electronics Co Ltd
Priority to CN202011422873.3A priority Critical patent/CN112615615B/en
Publication of CN112615615A publication Critical patent/CN112615615A/en
Application granted granted Critical
Publication of CN112615615B publication Critical patent/CN112615615B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • HELECTRICITY
    • H03ELECTRONIC CIRCUITRY
    • H03KPULSE TECHNIQUE
    • H03K19/00Logic circuits, i.e. having at least two inputs acting on one output; Inverting circuits
    • H03K19/0175Coupling arrangements; Interface arrangements

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Physics & Mathematics (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

This application, which belongs to the field of computer technology, provides a touch positioning method comprising the following steps: detecting the key state of a target key; in response to the key state of the target key being a target state, sending a control instruction to a target application, where the control instruction controls the target application to present a positioning interface that displays information guiding the user to perform a touch positioning operation; and, on the basis of the control instruction, receiving the touch positioning operation that the user performs through the positioning interface, thereby achieving touch positioning. By detecting the state of the target key, the method controls the target application to present the positioning interface when the key state is the target state. The user can open the positioning interface quickly and conveniently by pressing the target key, and can then perform the touch positioning operation under the guidance of the positioning interface to achieve touch positioning. This helps to improve the efficiency of positioning the touch position on an electronic whiteboard.

Description

Touch positioning method, device, equipment and medium
Technical Field
The application belongs to the technical field of computers, and particularly relates to a touch positioning method, a touch positioning device, touch positioning equipment and a touch positioning medium.
Background
An electronic whiteboard, such as an interactive electronic whiteboard, provides writing, annotation, drawing, multimedia entertainment, network conferencing and similar functions, and is a preferred product for office work, teaching, and interactive graphic-and-text demonstration in the information age.
Currently, electronic whiteboards are commonly used together with projectors. Fig. 1 is a schematic diagram of the relative positional relationship between an electronic whiteboard and the projection area of a projector according to the related art. As shown in fig. 1, in practical applications the projection area 102 of the projector is generally made smaller than the area of the electronic whiteboard 101 for easy viewing. In actual use, the projection area changes whenever the position of the projector changes; to retain accurate touch control within the projection area, the touch position on the electronic whiteboard must be repositioned whenever the projection area changes.
In the related art, the efficiency of positioning the touch position on an electronic whiteboard leaves room for improvement.
Disclosure of Invention
The embodiments of the present application provide a touch positioning method, device, equipment and medium, aiming to solve the problem that, in the related art, the efficiency of positioning a touch position on an electronic whiteboard is not high enough.
In a first aspect, an embodiment of the present application provides a touch positioning method, where the method includes:
detecting a key state of a target key;
responding to the key state of the target key as a target state, and sending a control instruction to the target application, wherein the control instruction is used for controlling the target application to present a positioning interface which presents information for guiding a user to execute touch positioning operation;
based on the control instruction, receiving touch positioning operation executed by a user based on a positioning interface so as to realize touch positioning.
Further, detecting a key state of the target key includes:
detecting the level change of an interface of the target key;
in response to detecting that the level of the interface changes from a high level to a low level and then to a high level, the key state is determined to be a triggered state.
Further, detecting a key state of the target key includes:
detecting the level change of an interface of the target key;
in response to detecting a change in the level of the interface from a high level to a low level, the key state is determined to be a triggered state.
Further, sending a control instruction to the target application, including:
and sending a control instruction to the target application through a preset universal serial bus interface.
Further, in response to the key state of the target key being the target state, sending a control instruction to the target application, including:
and responding to the key state of the target key as a target state and the value of the parameter for indicating the opening and closing state of the positioning interface as a first preset value, and sending a control instruction to the target application, wherein the first preset value is used for indicating the opening and closing state of the positioning interface as a closing state.
Further, after sending the control instruction to the target application, the method further includes:
and assigning the value of the parameter for indicating the opening and closing state of the positioning interface to a second preset value, wherein the second preset value is used for indicating the opening and closing state of the positioning interface to be the opening state.
Further, the method further comprises:
in response to detecting that the key state of the target key is the target state and the value of the parameter for indicating the opening and closing state of the positioning interface is the second preset value, sending a control instruction for controlling the target application to close the positioning interface to the target application, and assigning the value of the parameter for indicating the opening and closing state of the positioning interface to the first preset value.
Further, the positioning interface presents information for guiding the user to execute the touch positioning operation, including: the positioning interface presents a plurality of contour positioning points and information for guiding a user to click the plurality of contour positioning points, wherein the plurality of contour positioning points are used for determining the position of the projection area.
Further, based on the control instruction, receiving a touch positioning operation performed by the user based on the positioning interface to realize touch positioning, including:
selecting a contour locating point from a plurality of contour locating points presented by a locating interface, and executing information prompting operation on the selected contour locating point;
in response to detecting that the user clicks the contour locating point, acquiring coordinate values of the contour locating point, selecting an unselected contour locating point from a plurality of contour locating points, and continuing to execute information prompt operation until each contour locating point is clicked;
and determining coordinate values of each touch point on the electronic whiteboard according to the coordinate values of the plurality of contour locating points, so as to realize touch location.
In a second aspect, an embodiment of the present application provides a touch positioning device, including:
the state detection unit is used for detecting the key state of the target key;
the instruction sending unit is used for responding to the key state of the target key as a target state and sending a control instruction to the target application, wherein the control instruction is used for controlling the target application to present a positioning interface which presents information for guiding a user to execute touch positioning operation;
and the positioning execution unit is used for receiving touch positioning operation executed by a user based on the positioning interface based on the control instruction so as to realize touch positioning.
In a third aspect, an embodiment of the present application provides an electronic device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor implements the steps of the touch positioning method described above when executing the computer program.
In a fourth aspect, embodiments of the present application provide a computer readable storage medium storing a computer program, where the computer program implements the steps of the touch positioning method when executed by a processor.
In a fifth aspect, embodiments of the present application provide a computer program product, which when run on an electronic device, causes the electronic device to perform the touch positioning method of any one of the above first aspects.
Compared with the related art, the embodiments of the present application have the following beneficial effects. In the related art, to position a touch location on an electronic whiteboard, a user typically opens the target application manually and then searches the opened application for the positioning interface by hand, before finally performing the touch positioning operation under the guidance of the positioning interface. If the user is unfamiliar with the operations of the target application, positioning the touch location on the electronic whiteboard becomes difficult, and its efficiency is low. By detecting the state of the target key, the present application controls the target application to present the positioning interface when the key state is the target state. The user can open the positioning interface quickly and conveniently by pressing the target key, and then perform the touch positioning operation based on the guiding information of the positioning interface to achieve touch positioning. This helps to improve the efficiency of positioning the touch position on the electronic whiteboard.
It will be appreciated that the advantages of the second to fifth aspects may be found in the relevant description of the first aspect, and are not described here again.
Drawings
To illustrate the technical solutions of the embodiments of the present application more clearly, the drawings needed for the embodiments or the related art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application, and a person of ordinary skill in the art may derive other drawings from them without inventive effort.
Fig. 1 is a schematic diagram of a relative positional relationship between an electronic whiteboard and a projection area of a projector according to the related art;
FIG. 2 is a system architecture diagram of an application of a touch location method according to an embodiment of the present disclosure;
FIG. 3 is a flowchart of a touch positioning method according to an embodiment of the present disclosure;
FIG. 4 is a schematic diagram of a circuit for implementing detection of a key state of a target key according to an embodiment of the present application;
FIG. 5 is a schematic diagram of distribution of contour locating points according to an embodiment of the present application;
fig. 6 is a flowchart of a touch positioning method according to another embodiment of the present application;
Fig. 7 is a flowchart of a touch positioning method according to another embodiment of the present disclosure;
fig. 8 is a schematic structural diagram of a touch positioning device according to an embodiment of the disclosure;
fig. 9 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system configurations, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It should be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that the term "and/or" as used in this specification and the appended claims refers to any and all possible combinations of one or more of the associated listed items, and includes such combinations.
As used in this specification and the appended claims, the term "if" may be interpreted, depending on the context, as "when", "upon", "in response to determining" or "in response to detecting". Similarly, the phrase "if it is determined" or "if [a described condition or event] is detected" may be interpreted, depending on the context, as "upon determining", "in response to determining", "upon detecting [the described condition or event]" or "in response to detecting [the described condition or event]".
In addition, in the description of the present application and the appended claims, the terms "first," "second," "third," and the like are used merely to distinguish between descriptions and are not to be construed as indicating or implying relative importance.
Reference in the specification to "one embodiment" or "some embodiments" or the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like in the specification are not necessarily all referring to the same embodiment, but mean "one or more but not all embodiments" unless expressly specified otherwise. The terms "comprising," "including," "having," and variations thereof mean "including but not limited to," unless expressly specified otherwise.
In order to explain the technical aspects of the present application, the following examples are presented.
Example 1
Referring to fig. 2, a system architecture diagram for an application of a touch positioning method according to an embodiment of the present application is provided.
As shown in fig. 2, the system architecture may include an electronic whiteboard 201, a target key 2011 disposed on the electronic whiteboard, a terminal device 202 with a target application 2021 installed, and a projector 203. The terminal device 202 is electrically connected to the electronic whiteboard 201 and the projector 203, respectively.
An electronic whiteboard 201 for detecting a key state of the target key 2011; in response to the key state of the target key 2011 being the target state, a control instruction for controlling the target application 2021 to present a positioning interface is sent to the target application 2021, so that the user performs a touch positioning operation based on the positioning interface projected onto the electronic whiteboard, so as to implement touch positioning, where the positioning interface presents information for guiding the user to perform the touch positioning operation.
Terminal device 202 may be any of various electronic devices with the target application installed, including but not limited to smartphones, tablets, laptop computers, desktop computers, and the like. The terminal device 202 may perform touch positioning based on a touch positioning operation performed by the user on the electronic whiteboard.
The projector 203 may be used to project content presented by the terminal device 202, such as the presented positioning interface, onto the electronic whiteboard 201.
By detecting the state of the target key, the system controls the target application to present the positioning interface when the key state is the target state. The user can open the positioning interface quickly and conveniently by pressing the target key, and then perform the touch positioning operation under the guidance of the positioning interface to achieve touch positioning. This helps to improve the efficiency of positioning the touch position on the electronic whiteboard.
Example 2
Referring to fig. 3, an embodiment of the present application provides a touch positioning method, including:
step 301, detecting a key state of a target key.
The target key may be a preset key.
In practice, the target key is usually fixed on the electronic whiteboard. The target key is typically electrically connected to the controller of the electronic whiteboard through an interface, such as a General-Purpose Input/Output (GPIO) interface.
In this embodiment, the execution body of the touch positioning method may be an electronic whiteboard with a touch function. In practice, the execution body may detect the key state of the target key by detecting a level change at the interface of the target key. As an example, if the level of the interface is detected to change from a low level to a high level, the key state may be determined to be the triggered state. As another example, if the level of the interface is detected to change from a high level to a low level and then back to a high level, the key state may also be determined to be the triggered state.
Fig. 4 is a schematic circuit diagram for detecting the key state of the target key according to an embodiment of the present application. As shown in fig. 4, KEY is the target key, R is a resistor, and the resistance of R may be 10 kΩ. 3.3 V is the supply voltage of the target key circuit. The KEY may be an automatic-reset key: when the KEY is pressed, the circuit is closed and the level at the interface changes from high to low; when the KEY is released, the circuit is opened and the level at the interface changes from low back to high. Thus, if the target key is an automatic-reset key, the execution body can detect the level change at the interface over the course of a press and release of the target key to obtain the key state.
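The press-and-release detection described above can be sketched as a small state machine. This is an illustrative sketch only: the patent gives no firmware code, and the sampled level sequence stands in for successive GPIO reads.

```python
def detect_trigger(levels):
    """Return True if the sampled level sequence contains a full
    high -> low -> high transition, i.e. a press and release of an
    automatic-reset key wired as in Fig. 4 (pull-up to 3.3 V)."""
    HIGH, LOW = 1, 0
    pressed = False
    for level in levels:
        if not pressed and level == LOW:
            pressed = True          # falling edge: key pressed
        elif pressed and level == HIGH:
            return True             # rising edge: key released
    return False
```

In firmware this loop would poll (or be driven by an interrupt on) the GPIO pin; debouncing is omitted for brevity.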
In practice, key states generally include a triggered state and an untriggered state. The triggered state of the target key generally refers to a state in which the level at the interface of the target key has changed; the untriggered state refers to any key state other than the triggered state.
Optionally, detecting a key state of the target key includes: first, a level change of an interface of a target key is detected. Then, in response to detecting that the level of the interface changes from a high level to a low level and then to a high level, it is determined that the key state is a triggered state.
Here, in the case where the target key is an automatic-reset key, when the level at the interface of the target key is detected to change from high to low and then back to high, or from low to high and then back to low, the key state of the target key may be determined to be the triggered state.
Optionally, detecting a key state of the target key includes: first, a level change of an interface of a target key is detected. Then, in response to detecting that the level of the interface changes from a high level to a low level, the key state is determined to be a triggered state.
Here, the execution body may determine that the key state of the target key is the triggered state when a change in the level at the interface of the target key is detected, for example a change from high level to low level.
And step 302, in response to the key state of the target key being the target state, a control instruction is sent to the target application.
The control instruction is used for controlling the target application to present a positioning interface, and the positioning interface presents information for guiding a user to execute touch positioning operation.
Wherein the target state is typically a triggered state.
The target application is generally an application for implementing positioning of a touch position of the electronic whiteboard.
Here, after receiving the control instruction, the target application may open the positioning interface and project the positioning interface onto the electronic whiteboard through the projector based on the control instruction.
The positioning interface may be presented with information for guiding the user to perform touch positioning operations. As an example, information may be presented on the positioning interface for guiding the user to trace on the outline of the projection area. At this time, the touch positioning operation is an operation of tracing on the outline of the projection area.
Optionally, the positioning interface presents information for guiding the user to perform touch positioning operation, including: the positioning interface presents a plurality of contour positioning points and information for guiding a user to click the plurality of contour positioning points, wherein the plurality of contour positioning points are used for determining the position of the projection area. In practice, the positioning interface is typically presented with contour positioning points of the projection area and information for guiding the user to click on the contour positioning points of the projection area. The contour locating point is typically a point for determining the position of the projection area. At this time, the touch positioning operation may be: clicking the contour locating point of the projection area.
As an example, the contour anchor points may be four vertices of the projection area.
As another example, the contour anchor point may be four vertices of the projection area plus a center point of the projection area.
Fig. 5 is a schematic distribution diagram of contour positioning points according to an embodiment of the present application. As shown in fig. 5, P0-P8 are contour anchor points. P0-P8 are distributed over a projection area 502, the projection area 502 being located on an electronic whiteboard 501. Wherein P0 is the center point of the projection area, P1-P4 are the vertices of the projection area, and P5-P8 are the midpoints of the four sides of the projection area.
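The nine-point layout of Fig. 5 can be generated from the projection rectangle as follows. The ordering of P1-P8 within the list is an assumption; the text only fixes P0 as the centre point, P1-P4 as the vertices, and P5-P8 as the edge midpoints.

```python
def contour_anchor_points(x, y, w, h):
    """Nine contour anchor points of a projection rectangle with top-left
    corner (x, y), width w and height h, matching Fig. 5:
    P0 centre, P1-P4 vertices, P5-P8 edge midpoints (ordering assumed)."""
    cx, cy = x + w / 2, y + h / 2
    return [
        (cx, cy),                      # P0: centre of the projection area
        (x, y), (x + w, y),            # P1, P2: top vertices
        (x + w, y + h), (x, y + h),    # P3, P4: bottom vertices
        (cx, y), (x + w, cy),          # P5, P6: top and right midpoints
        (cx, y + h), (x, cy),          # P7, P8: bottom and left midpoints
    ]
```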
Step 303, based on the control instruction, receiving a touch positioning operation performed by the user based on the positioning interface, so as to implement touch positioning.
Here, after the electronic whiteboard sends the control instruction to the target application, the positioning interface may be projected onto the electronic whiteboard. The user can then perform the touch positioning operation on the electronic whiteboard based on the guiding information presented by the positioning interface, so as to achieve touch positioning.
Optionally, based on the control instruction, receiving a touch positioning operation performed by the user based on the positioning interface to implement touch positioning, including: and selecting a contour locating point from a plurality of contour locating points presented by a locating interface, and executing information prompt operation on the selected contour locating point. And responding to the detection that the user clicks the contour locating point, acquiring coordinate values of the contour locating point, selecting an unselected contour locating point from a plurality of contour locating points, and continuing to execute information prompt operation until each contour locating point is clicked. And determining coordinate values of each touch point on the electronic whiteboard according to the coordinate values of the plurality of contour locating points, so as to realize touch location.
The information prompting operation generally refers to an operation of displaying information for prompting a user to click on a contour locating point beside the contour locating point.
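The prompt-and-collect sequence described above can be sketched as a simple loop; `show_prompt` and `wait_for_click` are hypothetical callbacks standing in for the interface's flashing hint and the whiteboard's touch detection, which the text does not specify as an API.

```python
def run_calibration(points, wait_for_click, show_prompt):
    """Sequentially prompt each contour anchor point and collect the
    user's click coordinates. Returns the clicked coordinates in the
    same order as the prompted points."""
    clicked = []
    for p in points:
        show_prompt(p)                    # flash a hint beside this point
        clicked.append(wait_for_click())  # block until the point is clicked
    return clicked
```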
Here, each contour locating point can be displayed on the positioning interface, with a flashing prompt beside one contour locating point prompting the user to click it. After the user clicks that point, the interface flashes the prompt beside another contour locating point, and so on until every contour locating point has been clicked. In this way, the electronic whiteboard can obtain, for each contour locating point, both its position on the electronic whiteboard and its coordinate value in the target coordinate system, where the target coordinate system is the display coordinate system of the terminal device on which the target application runs. From the position of the same contour locating point on the electronic whiteboard and its coordinate value in the target coordinate system, the electronic whiteboard can derive the mapping between its own coordinate system and the target coordinate system. Based on this mapping, the coordinate value of each touch point on the electronic whiteboard in the target coordinate system can be calculated, thereby positioning the touch position on the electronic whiteboard.
It should be noted that the electronic whiteboard may instead only acquire the positions, on the electronic whiteboard, of the contour locating points clicked by the user, and send these positions to the target application. The target application then derives the mapping between the electronic whiteboard's coordinate system and the target coordinate system from the position of the same contour locating point on the whiteboard and its coordinate value in the target coordinate system, and calculates the coordinate value of each touch point on the electronic whiteboard in the target coordinate system from that mapping, thereby positioning the touch position on the electronic whiteboard.
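The text calls this a "mapping relation" without fixing its form. One simple realisation, assuming the mapping is affine (a projective homography would be the more general choice), is a least-squares fit over the collected point pairs:

```python
import numpy as np

def fit_affine(whiteboard_pts, display_pts):
    """Least-squares affine map from whiteboard coordinates to the
    target (display) coordinate system. Returns a 3x2 matrix A such
    that [x, y, 1] @ A approximates the display coordinates."""
    src = np.array([[x, y, 1.0] for x, y in whiteboard_pts])
    dst = np.array(display_pts, dtype=float)
    A, *_ = np.linalg.lstsq(src, dst, rcond=None)
    return A

def map_point(A, p):
    """Map one whiteboard touch point into display coordinates."""
    x, y = p
    return tuple(np.array([x, y, 1.0]) @ A)
```

With the nine anchor points of Fig. 5, the system is overdetermined, so the least-squares fit also averages out small click errors.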
According to the method provided by the embodiment, the state of the target key is detected, so that when the key state is the target state, the target application is controlled to present a positioning interface. The user can conveniently and rapidly open the positioning interface by pressing the target key, so that touch positioning operation is performed based on the guiding information of the positioning interface to realize touch positioning. The method is beneficial to improving the efficiency of positioning the touch position on the electronic whiteboard.
In an optional implementation manner of each embodiment of the present application, the sending a control instruction to the target application includes:
And sending a control instruction to the target application through a preset Universal Serial Bus (USB) interface.
In the implementation manner, the terminal equipment where the target application is located is generally provided with the USB interface, so that the control instruction is sent to the target application through the USB interface, the implementation is easy, and the cost can be saved.
Example 3
This embodiment provides a touch positioning method that further develops Example 2. For parts that are the same as or similar to Example 2, refer to the related description of Example 2; they are not repeated here. Referring to fig. 6, the touch positioning method in this embodiment includes:
step 601, detecting a key state of a target key.
In this embodiment, the specific operation of step 601 is substantially the same as that of step 301 in the embodiment shown in fig. 3, and will not be described here again.
Step 602, in response to the key state of the target key being the target state and the value of the parameter for indicating the on-off state of the positioning interface being the first preset value, sending a control instruction to the target application.
The first preset value is used for indicating that the opening and closing state of the positioning interface is in an off state. Here, the specific value of the first preset value may be "1" or "off", and the specific value form of the first preset value is not limited in this embodiment.
The positioning interface presents information for guiding a user to execute touch positioning operation.
Step 603, based on the control instruction, receiving a touch positioning operation performed by the user based on the positioning interface, so as to implement touch positioning.
In this embodiment, the specific operation of step 603 is substantially the same as that of step 303 in the embodiment shown in fig. 3, and will not be described herein.
In this embodiment, the control instruction is sent to the target application when the key state of the target key is the target state and the positioning interface is currently closed. In this way, a control instruction is sent only when the user's key press should open the interface, so the target application is controlled accurately and reasonably to present the positioning interface.
In some optional implementations of the present embodiment, after sending the control instruction to the target application, the method further includes: and assigning the value of the parameter for indicating the opening and closing state of the positioning interface to a second preset value.
The second preset value is used for indicating that the opening and closing state of the positioning interface is an opening state.
Here, the execution body may update the value of the parameter to the second preset value promptly after sending the control instruction to the target application. Promptly updating the value of the parameter indicating the on-off state of the positioning interface helps achieve accurate control of the target application's positioning interface.
In an alternative implementation manner of the various embodiments of the present application, the method may further include the following steps: in response to detecting that the key state of the target key is the target state and the value of the parameter for indicating the opening and closing state of the positioning interface is the second preset value, sending a control instruction for controlling the target application to close the positioning interface to the target application, and assigning the value of the parameter for indicating the opening and closing state of the positioning interface to the first preset value.
The second preset value is used for indicating that the opening and closing state of the positioning interface is an opening state.
Here, when the key state of the target key is the target state and the positioning interface is currently in the on state, a control instruction for controlling the target application to close the positioning interface is sent to the target application, so that the positioning interface is closed when the user presses the key again. Both the opening and the closing of the positioning interface can thus be controlled through a hardware key, which is convenient to operate, allows the user to quickly open or close the positioning interface, further improves the efficiency of positioning the touch position on the electronic whiteboard, and helps improve the user experience.
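The press-to-toggle behavior described above can be sketched as a small state machine. This is an illustrative sketch only: the class name, command strings, and the concrete parameter values for the two preset states are hypothetical, not taken from the embodiment.

```python
# Illustrative sketch of the key-driven toggle described above.
# The preset values are assumptions: 0 = first preset value (interface
# closed), 1 = second preset value (interface open).

INTERFACE_OFF = 0  # first preset value: positioning interface is closed
INTERFACE_ON = 1   # second preset value: positioning interface is open

class PositioningKeyHandler:
    def __init__(self, send_command):
        self.interface_state = INTERFACE_OFF
        self.send_command = send_command  # callback toward the target application

    def on_key_triggered(self):
        """Called each time the target key enters the target (triggered) state."""
        if self.interface_state == INTERFACE_OFF:
            # Interface closed: instruct the target application to present it,
            # then promptly update the state parameter.
            self.send_command("open_positioning_interface")
            self.interface_state = INTERFACE_ON
        else:
            # Interface open: instruct the target application to close it.
            self.send_command("close_positioning_interface")
            self.interface_state = INTERFACE_OFF
```

Each press alternates between sending an open command and a close command, mirroring the first-preset-value/second-preset-value bookkeeping described in the text.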
Example IV
This embodiment provides a touch positioning method that further elaborates on the second embodiment. For parts that are the same as or similar to the second embodiment, refer to the related description of the second embodiment, which is not repeated here. Referring to fig. 7, the touch positioning method in this embodiment includes:
Step 701, detecting whether the positioning key is operated. If there is an operation, step 702 is performed, otherwise step 701 is continued.
Here, the positioning key is the above-described target key.
If a level change is detected on the interface of the positioning key, for example a change from a high level to a low level, it can be determined that the positioning key has been operated.
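The level-change detection described here can be illustrated as follows. This is a minimal sketch under two assumptions not stated in the embodiment: the key line idles high (e.g. pulled up), and a high-to-low transition marks one operation; the function name and the sampled-levels model are hypothetical.

```python
# Illustrative sketch: treat each high -> low (falling-edge) transition on
# the key's interface as one key operation, as in the example above.

HIGH, LOW = 1, 0

def detect_operations(levels):
    """Return the indices in `levels` where a high -> low transition occurs."""
    operations = []
    previous = HIGH  # assumption: the key line idles at a high level
    for i, level in enumerate(levels):
        if previous == HIGH and level == LOW:
            operations.append(i)  # falling edge: the key was operated
        previous = level
    return operations
```

A real implementation would typically add debouncing so that contact bounce is not counted as multiple operations.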
Step 702, it is determined whether the positioning interface has been opened. If so, step 703 is performed, otherwise, step 704 is performed.
Here, the execution body may determine whether the positioning interface has been opened by checking the value of the parameter indicating the on-off state of the positioning interface. If the value is the first preset value, the positioning interface is in the closed state; if the value is the second preset value, the positioning interface is in the open state.
Step 703, exit the positioning interface.
Here, if the current positioning interface is in the on state, the execution subject may control the target application to exit the positioning interface.
Step 704, a positioning command is sent via USB.
Here, the positioning command is the control instruction for controlling the target application to present the positioning interface.
The execution body may send the positioning command to the terminal device where the target application is located through the USB interface, and the terminal device may transmit the positioning command to the target application.
It should be noted that the execution subject of the steps 701-704 may be an electronic whiteboard.
Step 705, it is determined if a positioning command is received. If a positioning command is received, step 706 is performed, otherwise step 705 is continued.
Step 706, open the positioning interface, display the positioning points used for multi-point positioning, and flash a prompt beside a point to guide the user to click that position.
Here, the anchor point is the above-mentioned contour anchor point. Each contour locating point can be displayed on the locating interface, and prompt information for prompting a user to click on the contour locating point is displayed by flashing beside one contour locating point.
Step 707, receive and store the coordinates of the current positioning point clicked by the user, and prompt the user to click the next positioning point.
Here, after the user clicks the contour locating point, the locating interface flashes beside another contour locating point to display the prompt information for prompting the user to click the contour locating point until each contour locating point is clicked.
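The prompt-and-collect loop of steps 706-707 can be sketched as follows. The function name, the anchor names, and the `wait_for_click` callback are hypothetical placeholders for the positioning interface's actual click handling.

```python
# Illustrative sketch of steps 706-707: prompt at each contour locating
# point in turn, store the clicked coordinates, then move the prompt to
# the next point until every point has been clicked.

def collect_anchor_points(anchor_names, wait_for_click):
    """Collect user-clicked coordinates for each anchor in order.

    `wait_for_click(name)` is assumed to flash the prompt beside the named
    anchor, block until the user clicks it, and return an (x, y) pair.
    """
    coordinates = {}
    for name in anchor_names:
        coordinates[name] = wait_for_click(name)
    return coordinates
```

The loop terminates exactly when each contour locating point has been clicked once, matching the "until each contour locating point is clicked" condition in the text.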
Step 708, after the coordinates of all positioning points are obtained, calculate the accurate touch area of the whole device, completing the positioning function.
Here, the target application may obtain the coordinate values of each contour locating point and, based on them, calculate the coordinate values of each touch point on the electronic whiteboard using a preset coordinate conversion formula that converts display coordinates in the terminal device where the target application is located into display coordinates in the projection area, thereby positioning the touch position on the electronic whiteboard.
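The embodiment refers to a preset coordinate conversion formula without specifying it. As an assumption for illustration only, the sketch below uses bilinear interpolation between four contour locating points (taken as corners of the projection area) to map a normalized touch position into projection-area coordinates; an actual implementation might instead fit an affine or perspective transform to the collected points.

```python
# Illustrative assumption: map a normalized touch position (u, v in [0, 1])
# into the projection area by bilinear interpolation between four contour
# locating points given as (x, y) corners: top-left, top-right,
# bottom-right, bottom-left. This stands in for the unspecified "preset
# coordinate conversion formula".

def bilinear_map(u, v, tl, tr, br, bl):
    """Map normalized coordinates (u, v) to projection-area coordinates."""
    # Interpolate along the top and bottom edges first.
    top_x = tl[0] + (tr[0] - tl[0]) * u
    top_y = tl[1] + (tr[1] - tl[1]) * u
    bot_x = bl[0] + (br[0] - bl[0]) * u
    bot_y = bl[1] + (br[1] - bl[1]) * u
    # Then interpolate vertically between the two edge points.
    x = top_x + (bot_x - top_x) * v
    y = top_y + (bot_y - top_y) * v
    return x, y
```

With the four corners fixed by the user's clicks, every subsequent touch report can be converted through this mapping without further calibration.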
It should be noted that the execution subject of the steps 705-708 may be the terminal device where the target application is located.
Example five
Corresponding to the touch positioning method of the above embodiment, fig. 8 shows a block diagram of a touch positioning apparatus 800 provided in the embodiment of the present application, and for convenience of explanation, only the portion related to the embodiment of the present application is shown.
Referring to fig. 8, the apparatus includes:
a state detection unit 801 for detecting a key state of a target key;
the instruction sending unit 802 is configured to send a control instruction to the target application in response to the key state of the target key being the target state, where the control instruction is used to control the target application to present a positioning interface, and the positioning interface presents information for guiding the user to execute the touch positioning operation;
the positioning execution unit 803 is configured to receive, based on the control instruction, a touch positioning operation performed by the user based on the positioning interface, so as to implement touch positioning.
In one embodiment, the state detection unit 801 is specifically configured to:
detecting the level change of an interface of the target key;
in response to detecting that the level of the interface changes from a high level to a low level and then to a high level, the key state is determined to be a triggered state.
In one embodiment, the state detection unit 801 is specifically configured to:
detecting the level change of an interface of the target key;
in response to detecting a change in the level of the interface from a high level to a low level, the key state is determined to be a triggered state.
In one embodiment, sending control instructions to a target application includes:
and sending a control instruction to the target application through a preset universal serial bus interface.
In one embodiment, in response to the key state of the target key being the target state, sending a control instruction to the target application includes:
and responding to the key state of the target key as a target state and the value of the parameter for indicating the opening and closing state of the positioning interface as a first preset value, and sending a control instruction to the target application, wherein the first preset value is used for indicating the opening and closing state of the positioning interface as a closing state.
In one embodiment, after sending the control instruction to the target application, further comprising:
and assigning the value of the parameter for indicating the opening and closing state of the positioning interface to a second preset value, wherein the second preset value is used for indicating the opening and closing state of the positioning interface to be the opening state.
In an embodiment, the apparatus further comprises an interface closing unit for:
In response to detecting that the key state of the target key is the target state and the value of the parameter for indicating the opening and closing state of the positioning interface is the second preset value, sending a control instruction for controlling the target application to close the positioning interface to the target application, and assigning the value of the parameter for indicating the opening and closing state of the positioning interface to the first preset value.
In one embodiment, the positioning interface presents information for guiding a user to perform touch positioning operations, including: the positioning interface presents a plurality of contour positioning points and information for guiding a user to click the plurality of contour positioning points, wherein the plurality of contour positioning points are used for determining the position of the projection area.
In one embodiment, the positioning execution unit 803 is specifically configured to:
selecting a contour locating point from a plurality of contour locating points presented by a locating interface, and executing information prompting operation on the selected contour locating point;
in response to detecting that the user clicks the contour locating point, acquiring coordinate values of the contour locating point, selecting an unselected contour locating point from a plurality of contour locating points, and continuing to execute information prompt operation until each contour locating point is clicked;
and determining the coordinate values of each touch point on the electronic whiteboard according to the coordinate values of the contour locating points and a preset coordinate conversion formula, so as to realize touch location.
The device provided by this embodiment detects the state of the target key and, when the key state is the target state, controls the target application to present the positioning interface. The user can thus conveniently and quickly open the positioning interface by pressing the target key, and then perform the touch positioning operation based on the guiding information of the positioning interface to realize touch positioning. This helps improve the efficiency of positioning the touch position on the electronic whiteboard.
It should be noted that, because the content of information interaction and execution process between the above devices/units is based on the same concept as the method embodiment of the present application, specific functions and technical effects thereof may be referred to in the method embodiment section, and will not be described herein again.
Example six
Fig. 9 is a schematic structural diagram of an electronic device 900 according to an embodiment of the present application. As shown in fig. 9, the electronic device 900 of this embodiment includes: at least one processor 901 (only one is shown in fig. 9), a memory 902, and a computer program 903, such as a touch positioning program, stored in the memory 902 and executable on the at least one processor 901. When executing the computer program 903, the processor 901 implements the steps in each of the touch positioning method embodiments described above, and performs the functions of the modules/units in the above device embodiments, for example, the functions of the units 801 to 803 shown in fig. 8.
By way of example, the computer program 903 may be partitioned into one or more modules/units, which are stored in the memory 902 and executed by the processor 901 to complete the present application. One or more of the modules/units may be a series of computer program instruction segments capable of performing particular functions to describe the execution of the computer program 903 in the electronic device 900. For example, the computer program 903 may be divided into a state detection unit, an instruction sending unit, and a positioning execution unit, where specific functions of each unit are described in the above embodiments, and are not described herein.
The electronic device 900 may be a computing device such as a server, desktop computer, tablet computer, cloud server, mobile terminal, and the like. Electronic device 900 may include, but is not limited to, a processor 901, a memory 902. It will be appreciated by those skilled in the art that fig. 9 is merely an example of an electronic device 900 and is not intended to limit the electronic device 900, and may include more or fewer components than shown, or may combine certain components, or different components, e.g., an electronic device may also include an input-output device, a network access device, a bus, etc.
The processor 901 may be a central processing unit (Central Processing Unit, CPU), other general purpose processors, digital signal processors (Digital Signal Processor, DSP), application specific integrated circuits (Application Specific Integrated Circuit, ASIC), field programmable gate arrays (Field-Programmable Gate Array, FPGA) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, or the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 902 may be an internal storage unit of the electronic device 900, such as a hard disk or a memory of the electronic device 900. The memory 902 may also be an external storage device of the electronic device 900, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card) or the like, which are provided on the electronic device 900. Further, the memory 902 may also include both internal and external storage units of the electronic device 900. The memory 902 is used to store computer programs and other programs and data required by the electronic device. The memory 902 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-described division of the functional units and modules is illustrated, and in practical application, the above-described functional distribution may be performed by different functional units and modules according to needs, i.e. the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-described functions. The functional units and modules in the embodiment may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit, where the integrated units may be implemented in a form of hardware or a form of a software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working process of the units and modules in the above system may refer to the corresponding process in the foregoing method embodiment, which is not described herein again.
The descriptions of the foregoing embodiments each have their own emphasis; for parts that are not described or detailed in a particular embodiment, refer to the related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/electronic device and method may be implemented in other manners. For example, the apparatus/electronic device embodiments described above are merely illustrative, e.g., the division of modules or units is merely a logical functional division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection via interfaces, devices or units, which may be in electrical, mechanical or other forms.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated modules, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer-readable storage medium. Based on this understanding, the present application may implement all or part of the flow of the methods of the above embodiments by instructing the relevant hardware through a computer program, which may be stored in a computer-readable storage medium and which, when executed by a processor, implements the steps of each of the method embodiments described above. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, or the like. The computer-readable medium may include: any entity or device capable of carrying computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and so on. It should be noted that the content that a computer-readable medium may contain can be added to or removed from as required by legislation and patent practice in a given jurisdiction; for example, in some jurisdictions, computer-readable media do not include electrical carrier signals and telecommunications signals.
The above embodiments are only for illustrating the technical solution of the present application, and are not limiting thereof; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application, and are intended to be included in the scope of the present application.

Claims (9)

1. A touch positioning method, the method comprising:
detecting a key state of a target key;
responding to the key state of the target key as a target state, and sending a control instruction to a target application, wherein the control instruction is used for controlling the target application to present a positioning interface, and the positioning interface presents information for guiding a user to execute touch positioning operation; wherein the target state is a triggered state;
based on the control instruction, receiving touch positioning operation executed by a user based on the positioning interface so as to realize touch positioning;
the response to the key state of the target key being the target state, sending a control instruction to a target application, including: responding to the key state of the target key as a target state and the value of a parameter for indicating the opening and closing state of the positioning interface as a first preset value, and sending a control instruction to a target application, wherein the first preset value is used for indicating the opening and closing state of the positioning interface as a closing state; and sending a control instruction for controlling the target application to close the positioning interface to the target application when the key state of the target key is a target state and the opening and closing state of the positioning interface is an opening state.
2. The method of claim 1, wherein detecting the key state of the target key comprises:
detecting the level change of an interface of the target key;
and in response to detecting that the level of the interface changes from a high level to a low level and then to a high level, determining that the key state is a triggered state.
3. The method of claim 1, wherein detecting the key state of the target key comprises:
detecting the level change of an interface of the target key;
in response to detecting a change in the level of the interface from a high level to a low level, the key state is determined to be a triggered state.
4. The method of claim 1, wherein the sending the control instruction to the target application comprises: and sending a control instruction to the target application through a preset universal serial bus interface.
5. The method of claim 1, wherein the positioning interface presents information for guiding a user to perform a touch positioning operation, comprising: the positioning interface presents a plurality of contour positioning points and information for guiding a user to click the plurality of contour positioning points, wherein the plurality of contour positioning points are used for determining the position of a projection area.
6. The method of claim 5, wherein receiving, based on the control instruction, a touch location operation performed by a user based on the location interface to implement touch location, comprises:
selecting a contour locating point from a plurality of contour locating points presented by the locating interface, and executing information prompting operation on the selected contour locating point;
in response to detecting that the user clicks the contour locating point, acquiring coordinate values of the contour locating point, selecting an unselected contour locating point from the contour locating points, and continuing to execute the information prompt operation until each contour locating point is clicked;
and determining coordinate values of each touch point on the electronic whiteboard according to the coordinate values of the contour locating points, so as to realize touch location.
7. A touch location device, the device comprising:
the state detection unit is used for detecting the key state of the target key;
the instruction sending unit is used for responding to the key state of the target key as a target state and sending a control instruction to a target application, wherein the control instruction is used for controlling the target application to present a positioning interface, and the positioning interface presents information for guiding a user to execute touch positioning operation; wherein the target state is a triggered state;
The positioning execution unit is used for receiving touch positioning operation executed by a user based on the positioning interface based on the control instruction so as to realize touch positioning;
the response to the key state of the target key being the target state, sending a control instruction to a target application, including: responding to the key state of the target key as a target state and the value of a parameter for indicating the opening and closing state of the positioning interface as a first preset value, and sending a control instruction to a target application, wherein the first preset value is used for indicating the opening and closing state of the positioning interface as a closing state; and sending a control instruction for controlling the target application to close the positioning interface to the target application when the key state of the target key is a target state and the opening and closing state of the positioning interface is an opening state.
8. An electronic device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the method according to any of claims 1 to 6 when executing the computer program.
9. A computer readable storage medium storing a computer program, characterized in that the computer program, when executed by a processor, implements the method according to any one of claims 1 to 6.
CN202011422873.3A 2020-12-08 2020-12-08 Touch positioning method, device, equipment and medium Active CN112615615B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011422873.3A CN112615615B (en) 2020-12-08 2020-12-08 Touch positioning method, device, equipment and medium


Publications (2)

Publication Number Publication Date
CN112615615A CN112615615A (en) 2021-04-06
CN112615615B true CN112615615B (en) 2023-12-26

Family

ID=75229601

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011422873.3A Active CN112615615B (en) 2020-12-08 2020-12-08 Touch positioning method, device, equipment and medium

Country Status (1)

Country Link
CN (1) CN112615615B (en)

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101315586A (en) * 2008-07-21 2008-12-03 贾颖 Electronic pen for interactive electronic white board and its interaction control method
CN101727243A (en) * 2010-02-02 2010-06-09 中兴通讯股份有限公司 Method and device for acquiring calibration parameters of touch screen
CN201698327U (en) * 2010-05-14 2011-01-05 杭州高科工贸有限公司 Electronic white board based on digital wireless location
CN102354269A (en) * 2011-08-18 2012-02-15 宇龙计算机通信科技(深圳)有限公司 Method and system for controlling display device
CN102478974A (en) * 2010-11-30 2012-05-30 汉王科技股份有限公司 Electromagnetic pen for electronic whiteboard and control method thereof
CN102880360A (en) * 2012-09-29 2013-01-16 东北大学 Infrared multipoint interactive electronic whiteboard system and whiteboard projection calibration method
CN102880361A (en) * 2012-10-12 2013-01-16 南京芒冠光电科技股份有限公司 Positioning calibration method for electronic whiteboard equipment
CN104133598A (en) * 2013-05-03 2014-11-05 周永清 Driver-free positioning calibration guide method and driver-free positioning calibration guide device for electronic whiteboard system
CN104423629A (en) * 2013-09-11 2015-03-18 联想(北京)有限公司 Electronic equipment and data processing method
CN104636060A (en) * 2014-04-29 2015-05-20 汉王科技股份有限公司 Soft key locating method and system
US10845878B1 (en) * 2016-07-25 2020-11-24 Apple Inc. Input device with tactile feedback

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3734407A1 (en) * 2011-02-10 2020-11-04 Samsung Electronics Co., Ltd. Portable device comprising a touch-screen display, and method for controlling same


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Design of a touch key based on the STM8S103 MCU and the ST05A; Yan Aijun; Fan Haiming; Zhou Jun; Chemical Defence on Ships (Issue 01); 32-35 *



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant