CN112615615A - Touch positioning method, device, equipment and medium - Google Patents

Touch positioning method, device, equipment and medium

Info

Publication number: CN112615615A (granted as CN112615615B)
Application number: CN202011422873.3A
Authority: CN (China)
Original language: Chinese (zh)
Inventor: 徐协增
Assignee (original and current): Anhui Hongcheng Opto Electronics Co Ltd
Legal status: Granted; active
Prior art keywords: positioning, target, key, state, interface

Classifications

    • H — ELECTRICITY
    • H03 — ELECTRONIC CIRCUITRY
    • H03K — PULSE TECHNIQUE
    • H03K19/00 — Logic circuits, i.e. having at least two inputs acting on one output; inverting circuits
    • H03K19/0175 — Coupling arrangements; interface arrangements


Abstract

The application belongs to the technical field of computers and provides a touch positioning method comprising the following steps: detecting the key state of a target key; in response to the key state of the target key being a target state, sending a control instruction to a target application, where the control instruction controls the target application to present a positioning interface, and the positioning interface presents information for guiding a user to perform a touch positioning operation; and receiving, based on the control instruction, the touch positioning operation performed by the user on the positioning interface, so as to realize touch positioning. By detecting the state of the target key, the present application controls the target application to present the positioning interface whenever the key state is the target state. The user can open the positioning interface quickly and conveniently by pressing the target key, and then perform the touch positioning operation following the guidance information of the positioning interface. This improves the efficiency of positioning the touch position on the electronic whiteboard.

Description

Touch positioning method, device, equipment and medium
Technical Field
The present application relates to the field of computer technologies, and in particular, to a touch positioning method, apparatus, device, and medium.
Background
An electronic whiteboard, such as an interactive electronic whiteboard, supports writing, annotation, drawing, multimedia entertainment, web conferencing, and similar functions, making it a preferred product for office work, teaching, and interactive text-and-graphics demonstration in the information era.
At present, an electronic whiteboard is usually used together with a projector. Fig. 1 is a schematic diagram of the relative positions of an electronic whiteboard and the projection area of a projector in the related art. As shown in Fig. 1, the projection area 102 of the projector is generally smaller than the electronic whiteboard 101 so that the projected content is easy to view. In practice, the projection area shifts whenever the position of the projector changes, so in order to touch the projection area on the electronic whiteboard accurately, the touch position on the electronic whiteboard must be recalibrated each time the projection area changes.
In the related art, there is room for improvement in the efficiency of positioning the touch position on an electronic whiteboard.
Disclosure of Invention
The embodiments of the present application provide a touch positioning method, apparatus, device, and medium, aiming to solve the low efficiency of positioning a touch position on an electronic whiteboard in the related art.
In a first aspect, an embodiment of the present application provides a touch positioning method, where the method includes:
detecting the key state of a target key;
in response to the key state of the target key being a target state, sending a control instruction to a target application, where the control instruction controls the target application to present a positioning interface, and the positioning interface presents information for guiding a user to perform a touch positioning operation; and
receiving, based on the control instruction, the touch positioning operation performed by the user on the positioning interface, so as to realize touch positioning.
Further, detecting the key state of the target key comprises:
detecting the level change of an interface of a target key;
and determining the key state to be the triggered state in response to detecting that the level of the interface changes from high level to low level and then to high level.
Further, detecting the key state of the target key comprises:
detecting the level change of an interface of a target key;
and determining the key state as a triggered state in response to detecting that the level of the interface changes from a high level to a low level.
Further, sending a control instruction to the target application includes:
and sending a control instruction to the target application through a preset universal serial bus interface.
Further, sending a control instruction to the target application in response to the key state of the target key being the target state includes:
sending a control instruction to the target application in response to the key state of the target key being the target state and the value of a parameter for indicating the opening and closing state of the positioning interface being a first preset value, where the first preset value indicates that the opening and closing state of the positioning interface is the closed state.
Further, after sending the control instruction to the target application, the method further includes:
and assigning a value of a parameter for indicating the opening and closing state of the positioning interface as a second preset value, wherein the second preset value is used for indicating that the opening and closing state of the positioning interface is an opening state.
Further, the method further comprises:
and in response to the fact that the key state of the target key is detected to be the target state and the value of the parameter for indicating the opening and closing state of the positioning interface is the second preset value, sending a control instruction for controlling the target application to close the positioning interface to the target application, and assigning the value of the parameter for indicating the opening and closing state of the positioning interface to be the first preset value.
Further, the positioning interface presents information for guiding a user to perform a touch positioning operation, including: the positioning interface presents a plurality of contour positioning points and information for guiding a user to click on the plurality of contour positioning points, wherein the plurality of contour positioning points are used for determining the position of the projection area.
Further, receiving, based on the control instruction, the touch positioning operation performed by the user on the positioning interface to realize touch positioning includes:
selecting contour positioning points from a plurality of contour positioning points presented by a positioning interface, and executing information prompt operation on the selected contour positioning points;
in response to detecting that the user has clicked the selected contour positioning point, acquiring its coordinate values, selecting a contour positioning point that has not yet been selected, and continuing the information prompt operation until all contour positioning points have been clicked;
and determining the coordinate value of each touch point on the electronic whiteboard according to the coordinate values of the plurality of contour positioning points, so as to realize touch positioning.
In a second aspect, an embodiment of the present application provides a touch positioning device, including:
the state detection unit is used for detecting the key state of the target key;
the instruction sending unit is configured to send a control instruction to the target application in response to the key state of the target key being the target state, where the control instruction controls the target application to present a positioning interface, and the positioning interface presents information for guiding the user to perform a touch positioning operation;
and the positioning execution unit is configured to receive, based on the control instruction, the touch positioning operation performed by the user on the positioning interface, so as to realize touch positioning.
In a third aspect, an embodiment of the present application provides an electronic device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor implements the steps of the touch location method when executing the computer program.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium, where a computer program is stored, and the computer program, when executed by a processor, implements the steps of the touch positioning method.
In a fifth aspect, an embodiment of the present application provides a computer program product, which when run on an electronic device, causes the electronic device to perform the touch positioning method of any one of the above first aspects.
Compared with the related art, the embodiments of the present application have the following beneficial effects. In the related art, to position the touch location on an electronic whiteboard, a user usually opens the target application manually and then manually navigates to the positioning interface inside it, before finally performing the touch positioning operation based on the interface's guidance information. If the user is unfamiliar with the operations of the target application, locating the touch position on the electronic whiteboard becomes difficult, and efficiency is low. In the present application, the state of the target key is detected, and the target application is controlled to present the positioning interface whenever the key state is the target state. The user can therefore open the positioning interface quickly and conveniently by pressing the target key, and then perform the touch positioning operation following the interface's guidance information. This improves the efficiency of positioning the touch position on the electronic whiteboard.
It is understood that the beneficial effects of the second aspect to the fifth aspect can be referred to the related description of the first aspect, and are not described herein again.
Drawings
To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings used in the embodiments or the related-art descriptions are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application, and other drawings can be obtained from them by those of ordinary skill in the art without creative effort.
Fig. 1 is a schematic diagram of a relative position relationship between an electronic whiteboard and a projection area of a projector provided in the related art;
fig. 2 is a system architecture diagram of an application of a touch positioning method according to an embodiment of the present application;
fig. 3 is a schematic flowchart of a touch positioning method according to an embodiment of the present application;
FIG. 4 is a schematic circuit diagram of a key state detection circuit for detecting a target key according to an embodiment of the present application;
FIG. 5 is a schematic diagram of a distribution of contour localization points provided by an embodiment of the present application;
fig. 6 is a schematic flowchart of a touch positioning method according to another embodiment of the present application;
fig. 7 is a schematic flowchart of a touch positioning method according to another embodiment of the present application;
fig. 8 is a schematic structural diagram of a touch positioning device according to an embodiment of the present application;
fig. 9 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
As used in this specification and the appended claims, the term "if" may be interpreted contextually as "when", "upon", "in response to determining", or "in response to detecting". Similarly, the phrase "if it is determined" or "if [a described condition or event] is detected" may be interpreted contextually to mean "upon determining", "in response to determining", "upon detecting [the described condition or event]", or "in response to detecting [the described condition or event]".
Furthermore, in the description of the present application and the appended claims, the terms "first," "second," "third," and the like are used for distinguishing between descriptions and not necessarily for describing or implying relative importance.
Reference throughout this specification to "one embodiment" or "some embodiments," or the like, means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like in various places throughout this specification do not necessarily all refer to the same embodiment, but mean "in one or more but not all embodiments," unless specifically stated otherwise. The terms "comprising," "including," "having," and variations thereof mean "including, but not limited to," unless expressly specified otherwise.
In order to explain the technical means of the present application, the following examples are given below.
Example one
Fig. 2 is a system architecture diagram for an application of the touch positioning method according to an embodiment of the present application.
As shown in fig. 2, the system architecture may include an electronic whiteboard 201, a target key 2011 provided on the electronic whiteboard, a terminal device 202 installed with a target application 2021, and a projector 203. The terminal apparatus 202 is electrically connected to the electronic whiteboard 201 and the projector 203, respectively.
An electronic whiteboard 201, configured to detect a key state of a target key 2011; in response to that the key state of the target key 2011 is a target state, sending a control instruction for controlling the target application 2021 to present a positioning interface to the target application 2021, so that the user performs a touch positioning operation based on the positioning interface projected onto the electronic whiteboard to implement touch positioning, where the positioning interface presents information for guiding the user to perform the touch positioning operation.
The terminal device 202 may be any of various electronic devices on which the target application is installed, including but not limited to smartphones, tablet computers, laptop computers, and desktop computers. The terminal device 202 can perform touch positioning based on the touch positioning operation performed by the user on the electronic whiteboard.
The projector 203 can be used to project content presented by the terminal device 202, such as the presented positioning interface, onto the electronic whiteboard 201.
According to the method and the device, the target application is controlled to present the positioning interface when the key state is the target state by detecting the state of the target key. The user can conveniently and quickly open the positioning interface by pressing the target key, so that touch positioning operation is executed based on the guiding information of the positioning interface to realize touch positioning. The efficiency of positioning the touch position on the electronic whiteboard is improved.
Example two
Referring to fig. 3, an embodiment of the present application provides a touch positioning method, including:
step 301, detecting the key state of the target key.
The target key may be a preset key.
In practice, the target key is usually fixedly arranged on the electronic whiteboard. The target key is typically electrically connected to the controller of the electronic whiteboard through an interface, for example, a General-Purpose Input/Output (GPIO) interface.
In this embodiment, the execution body of the touch positioning method may be an electronic whiteboard with a touch function. In practice, the execution body may detect the key state of the target key by detecting the level change of the target key's interface. As an example, if the level of the interface is detected to change from low to high, the key state may be determined to be the triggered state. As another example, if the level of the interface is detected to change from high to low and then back to high, the key state may also be determined to be the triggered state.
Fig. 4 is a schematic circuit diagram for detecting the key state of the target key according to an embodiment of the present application. As shown in Fig. 4, KEY is the target key and R is a resistor; R may have a resistance of 10 kilo-ohms. 3.3 V is the supply voltage of the target key circuit. KEY may be an auto-reset key: when KEY is pressed, the circuit closes and the level at the interface changes from high to low; when KEY is released, the circuit opens and the level at the interface changes from low back to high. Thus, if the target key is an auto-reset key, the execution body can detect the level change of the interface over the press-and-release cycle of the target key and thereby obtain its key state.
In practice, the key states generally include a triggered state and an untriggered state. The triggered state of the target key generally refers to a state in which the level of the target key's interface has changed; the untriggered state refers to any other state of the key.
Optionally, detecting a key state of the target key includes: first, a level change of an interface of a target key is detected. Then, in response to detecting that the level of the interface changes from high level to low level and then to high level, the key state is determined to be the triggered state.
Here, in the case where the target key is an auto-reset key, when the level of the target key's interface is detected to change from high to low and then back to high, or from low to high and then back to low, the key state of the target key may be determined to be the triggered state.
Optionally, detecting a key state of the target key includes: first, a level change of an interface of a target key is detected. Then, in response to detecting that the level of the interface changes from a high level to a low level, the key state is determined to be a triggered state.
Here, the execution body may determine that the key state of the target key is the triggered state upon detecting that the level of the target key's interface changes, for example, from a high level to a low level.
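The two detection variants above can be sketched as a small state machine over sampled interface levels. This is an illustrative sketch, not the patent's implementation; the sampling scheme and names are assumptions:

```python
from enum import Enum

class KeyState(Enum):
    NOT_TRIGGERED = 0
    TRIGGERED = 1

def detect_key_state(samples, full_cycle=True):
    """Classify a sequence of sampled interface levels (1 = high, 0 = low).

    With full_cycle=True, the key counts as triggered only after a complete
    high -> low -> high transition (press then release of an auto-reset key);
    with full_cycle=False, a single high -> low falling edge is enough.
    """
    saw_falling = False
    prev = samples[0] if samples else 1
    for level in samples[1:]:
        if prev == 1 and level == 0:        # falling edge: key pressed
            if not full_cycle:
                return KeyState.TRIGGERED
            saw_falling = True
        elif prev == 0 and level == 1 and saw_falling:
            return KeyState.TRIGGERED       # rising edge after a press: released
        prev = level
    return KeyState.NOT_TRIGGERED
```

In a real device the samples would come from the interface pin of Fig. 4, pulled high through R and driven low while KEY is pressed, typically with debouncing added.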
Step 302, in response to the key status of the target key being the target status, sending a control instruction to the target application.
The control instruction is used for controlling the target application to present a positioning interface, and the positioning interface presents information used for guiding a user to execute touch positioning operation.
The target state is usually a triggered state.
The target application is generally an application for positioning a touch position of an electronic whiteboard.
Here, after receiving the control instruction, the target application may open the positioning interface and project the positioning interface onto the electronic whiteboard through the projector based on the control instruction.
The positioning interface may be presented with information for guiding a user in performing a touch positioning operation. As an example, the positioning interface may be presented with information for guiding a user to trace on the outline of the projection area. At this time, the touch positioning operation is an operation of tracing a line on the outline of the projection area.
Optionally, the positioning interface presents information for guiding a user to perform a touch positioning operation, including: the positioning interface presents a plurality of contour positioning points and information for guiding a user to click on the plurality of contour positioning points, wherein the plurality of contour positioning points are used for determining the position of the projection area. In practice, the positioning interface usually presents contour positioning points of the projection area and information for guiding the user to click on the contour positioning points of the projection area. The contour localization points described above are typically points used to determine the location of the projection region. At this time, the touch positioning operation may be: and clicking the contour positioning point of the projection area.
As an example, the contour positioning points may be four vertices of the projection area.
As another example, the contour positioning point may also be four vertices of the projection area plus a center point of the projection area.
Fig. 5 is a schematic distribution diagram of the contour positioning points provided in an embodiment of the present application. As shown in Fig. 5, P0-P8 are contour positioning points distributed on the projection area 502, and the projection area 502 is located on the electronic whiteboard 501. P0 is the center point of the projection area, P1-P4 are the vertices of the projection area, and P5-P8 are the midpoints of its four sides.
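The point layout in Fig. 5 can be computed directly from the projection area's rectangle. The following sketch (naming and coordinate conventions are illustrative assumptions, with the origin at the top-left corner) returns the nine points P0-P8:

```python
def contour_points(x, y, w, h):
    """Return the nine contour positioning points P0..P8 of a projection
    area whose top-left corner is (x, y), with width w and height h.

    P0 is the center, P1..P4 the vertices, and P5..P8 the edge midpoints,
    matching the distribution shown in Fig. 5.
    """
    cx, cy = x + w / 2, y + h / 2
    return {
        "P0": (cx, cy),        # center
        "P1": (x, y),          # top-left vertex
        "P2": (x + w, y),      # top-right vertex
        "P3": (x + w, y + h),  # bottom-right vertex
        "P4": (x, y + h),      # bottom-left vertex
        "P5": (cx, y),         # midpoint of top edge
        "P6": (x + w, cy),     # midpoint of right edge
        "P7": (cx, y + h),     # midpoint of bottom edge
        "P8": (x, cy),         # midpoint of left edge
    }
```

For a projection area at (200, 100) with size 800 x 600, for example, `contour_points(200, 100, 800, 600)["P0"]` is the center point (600.0, 400.0).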
Step 303, receiving a touch positioning operation executed by a user based on the positioning interface based on the control instruction, so as to implement touch positioning.
Here, after the electronic whiteboard issues the control instruction to the target application, a positioning interface may be projected on the electronic whiteboard. In this way, the user can perform touch positioning operation on the electronic whiteboard based on the guidance information presented by the positioning interface to realize touch positioning.
Optionally, receiving, based on the control instruction, the touch positioning operation performed by the user on the positioning interface to implement touch positioning includes: selecting a contour positioning point from the plurality of contour positioning points presented by the positioning interface, and performing an information prompt operation on the selected point. In response to detecting that the user has clicked the selected contour positioning point, acquiring its coordinate values, selecting a contour positioning point that has not yet been selected, and continuing the information prompt operation until all contour positioning points have been clicked. Finally, determining the coordinate values of each touch point on the electronic whiteboard from the coordinate values of the plurality of contour positioning points, thereby realizing touch positioning.
The information prompting operation generally refers to an operation of displaying, near the contour positioning point, information for prompting a user to click the contour positioning point.
Here, each contour positioning point may be displayed on the positioning interface, with a flashing prompt shown beside one contour positioning point asking the user to click it. After the user clicks that point, the positioning interface flashes the prompt beside another contour positioning point, and so on until all contour positioning points have been clicked. In this way, the electronic whiteboard can acquire, for each contour positioning point, both its position on the electronic whiteboard and its coordinate values in the target coordinate system, where the target coordinate system is the display coordinate system of the terminal device on which the target application runs. From the positions of the same contour positioning points on the electronic whiteboard and their coordinate values in the target coordinate system, the electronic whiteboard can derive the mapping between its own coordinate system and the target coordinate system. The coordinate values of every touch point on the electronic whiteboard in the target coordinate system can then be calculated from this mapping, so any touch position on the electronic whiteboard can be located.
It should be noted that the electronic whiteboard may instead acquire only the positions, on the whiteboard, of the contour positioning points clicked by the user, and send those positions to the target application. The target application then derives the mapping between the electronic whiteboard's coordinate system and the target coordinate system from the positions of the same contour positioning points in both systems, and uses this mapping to calculate the coordinate values of every touch point on the electronic whiteboard in the target coordinate system, thereby locating the touch position on the electronic whiteboard.
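The mapping step can be illustrated with a simplified model. The patent does not fix a fitting method, and real setups may need a full perspective (homography) fit; the sketch below assumes an axis-aligned, unrotated projection, so two corner correspondences determine the map:

```python
def fit_axis_aligned_map(pairs):
    """Fit x' = ax*x + bx and y' = ay*y + by from (whiteboard, display)
    point pairs, using the first and last pair to determine each axis.

    Returns a function mapping whiteboard coordinates to display
    (target coordinate system) coordinates.
    """
    (wx0, wy0), (dx0, dy0) = pairs[0]
    (wx1, wy1), (dx1, dy1) = pairs[-1]
    ax = (dx1 - dx0) / (wx1 - wx0)          # x scale
    ay = (dy1 - dy0) / (wy1 - wy0)          # y scale
    bx, by = dx0 - ax * wx0, dy0 - ay * wy0  # offsets

    def to_display(wx, wy):
        return (ax * wx + bx, ay * wy + by)

    return to_display

# Example: a projection area spanning whiteboard region (200, 100)-(1000, 700)
# mapped onto a 1920x1080 display coordinate system.
to_display = fit_axis_aligned_map([((200, 100), (0, 0)),
                                   ((1000, 700), (1920, 1080))])
```

With this mapping, any touch at whiteboard coordinates inside the projection area can be converted to display coordinates; the patent's scheme with nine points would allow a more robust (e.g. least-squares or perspective) fit.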
According to the method provided by the embodiment, the target application is controlled to present the positioning interface when the key state is the target state by detecting the state of the target key. The user can conveniently and quickly open the positioning interface by pressing the target key, so that touch positioning operation is executed based on the guiding information of the positioning interface to realize touch positioning. The efficiency of positioning the touch position on the electronic whiteboard is improved.
In an optional implementation manner of each embodiment of the present application, the sending the control instruction to the target application includes:
and sending a control instruction to the target application through a preset Universal Serial Bus (USB) interface.
In this implementation, the terminal device on which the target application runs usually has a USB interface, so sending the control instruction through the USB interface is easy to implement and saves cost.
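The patent does not specify a wire format for the control instruction sent over the USB interface. Purely as an illustration, a one-byte command could be framed in a small packet with a magic header and checksum (all opcodes and framing below are assumptions, not part of the patent):

```python
# Illustrative opcodes -- not defined by the patent.
CMD_OPEN_POSITIONING_UI = 0x01
CMD_CLOSE_POSITIONING_UI = 0x02

def build_control_packet(command):
    """Frame a command byte as: magic (2 bytes), command (1 byte),
    checksum (1 byte: sum of the preceding bytes modulo 256)."""
    magic = b"\xA5\x5A"
    body = magic + bytes([command])
    checksum = sum(body) % 256
    return body + bytes([checksum])
```

The whiteboard controller would write such a packet to the USB link, and the target application would parse it and open or close the positioning interface accordingly.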
EXAMPLE III
This embodiment of the present application provides a touch positioning method that further develops Embodiment Two; for parts identical or similar to Embodiment Two, refer to the related description there, which is not repeated here. Referring to fig. 6, the touch positioning method in this embodiment includes:
step 601, detecting the key state of the target key.
In this embodiment, the specific operation of step 601 is substantially the same as the operation of step 301 in the embodiment shown in fig. 3, and is not described herein again.
Step 602, in response to the key state of the target key being the target state and the value of the parameter for indicating the opening and closing state of the positioning interface being a first preset value, sending a control instruction to the target application.
The first preset value indicates that the opening and closing state of the positioning interface is the closed state. The specific value of the first preset value may be, for example, "1" or "off"; this embodiment does not limit the form the value takes.
The positioning interface presents information for guiding a user to execute touch positioning operation.
Step 603, receiving, based on the control instruction, the touch positioning operation performed by the user on the positioning interface, so as to realize touch positioning.
In this embodiment, the specific operation of step 603 is substantially the same as the operation of step 303 in the embodiment shown in fig. 3, and is not described herein again.
In this embodiment, a control instruction is sent to the target application only when the key state of the target key is the target state and the positioning interface is currently closed. This ensures that, when a user key press is detected, the control instruction is sent to the target application accurately and reasonably, so that the target application is correctly controlled to present the positioning interface.
In some optional implementations of this embodiment, after the control instruction is sent to the target application, the method further includes: assigning the value of the parameter indicating the open/close state of the positioning interface to a second preset value.
The second preset value indicates that the open/close state of the positioning interface is the open state.
Here, the execution subject may update the value of the parameter to the second preset value promptly after sending the control instruction to the target application. Updating the parameter indicating the open/close state of the positioning interface in time facilitates accurate control of the positioning interface of the target application.
In an optional implementation applicable to each embodiment of the present application, the method may further include the following step: in response to detecting that the key state of the target key is the target state and that the value of the parameter indicating the open/close state of the positioning interface is the second preset value, send the target application a control instruction for closing the positioning interface, and assign the value of that parameter to the first preset value.
The second preset value indicates that the open/close state of the positioning interface is the open state.
Here, when the key state of the target key is the target state and the positioning interface is currently open, a control instruction instructing the target application to close the positioning interface is sent, so that the interface closes when the user presses the key again. A single hardware key thus controls both opening and closing of the positioning interface, which is convenient to operate: the user can open the positioning interface quickly, the efficiency of positioning the touch position on the electronic whiteboard is further improved, and user experience benefits as well.
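The key-driven toggle described above can be sketched as follows. This is an illustrative sketch only, not the patent's implementation: the class name, the instruction strings, and the state constants are assumptions introduced for the example.

```python
# Sketch of the open/close toggle: one hardware key alternately opens and
# closes the positioning interface, tracked by a state parameter.

INTERFACE_CLOSED = 0  # "first preset value": positioning interface is closed
INTERFACE_OPEN = 1    # "second preset value": positioning interface is open

class PositioningKeyHandler:
    def __init__(self, send_instruction):
        # send_instruction(cmd) forwards a control instruction to the target app
        self.send_instruction = send_instruction
        self.interface_state = INTERFACE_CLOSED

    def on_key_triggered(self):
        """Called when the target key reaches the target (triggered) state."""
        if self.interface_state == INTERFACE_CLOSED:
            self.send_instruction("OPEN_POSITIONING_INTERFACE")
            self.interface_state = INTERFACE_OPEN
        else:
            self.send_instruction("CLOSE_POSITIONING_INTERFACE")
            self.interface_state = INTERFACE_CLOSED
```

Repeated presses of the same key then alternate between the open and close instructions, matching the behaviour this implementation describes.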
Example four
The embodiment of the present application provides a touch positioning method that further develops the second embodiment. For content that is the same as or similar to the second embodiment, reference may be made to the related description of that embodiment; the details are not repeated here. Referring to fig. 7, the touch positioning method in this embodiment includes:
Step 701: detect whether the positioning key has been operated. If so, proceed to step 702; otherwise, repeat step 701.
Here, the positioning key is the above-described target key.
If a level change is detected on the interface of the positioning key, for example a change from high level to low level, it can be determined that the positioning key has been operated.
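Both detection modes that appear in the embodiments (a high-to-low transition, or a full high-to-low-to-high press-and-release cycle) can be sketched over a stream of sampled levels. The function name, the sampling model, and the mode strings below are assumptions for illustration, not from the patent; a real device would read the levels from the key's hardware interface rather than from a list.

```python
# Count key operations in a sequence of 0/1 level samples from the key
# interface. mode "falling": a high->low transition counts as an operation.
# mode "press_release": an operation is a full high->low->high cycle.

def detect_key_events(samples, mode="falling"):
    events = 0
    prev = 1          # assume the interface idles at high level
    pressed = False
    for level in samples:
        if mode == "falling":
            if prev == 1 and level == 0:
                events += 1        # falling edge: key went down
        else:  # "press_release"
            if prev == 1 and level == 0:
                pressed = True     # key went down
            elif pressed and prev == 0 and level == 1:
                events += 1        # key released: full press cycle complete
                pressed = False
        prev = level
    return events
```

In the "falling" mode a press is reported as soon as the key goes down; in the "press_release" mode nothing is reported until the key comes back up, which matches the triggered-state definitions in the two detection implementations above.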
Step 702: determine whether the positioning interface is open. If so, proceed to step 703; otherwise, proceed to step 704.
Here, the execution subject may determine whether the positioning interface is open by reading the value of the parameter indicating the open/close state of the positioning interface. If the parameter value is the first preset value, the positioning interface is closed; if it is the second preset value, the positioning interface is open.
Step 703: exit the positioning interface.
Here, if the positioning interface is currently open, the execution subject may control the target application to exit the positioning interface.
Step 704: send a positioning command over the USB interface.
Here, the positioning command is the above-mentioned control instruction for controlling the target application to present the positioning interface.
The execution subject may send the positioning command through the USB interface to the terminal device on which the target application runs, and the terminal device forwards the command to the target application.
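As an illustration of what "sending a positioning command" over such a link might involve, the sketch below frames a one-byte command into a small packet. The byte layout (header, command identifiers, checksum) is entirely hypothetical; the patent only states that a command is sent over a USB interface and does not specify any packet format.

```python
# Hypothetical framing for a positioning command sent over the USB link:
# 2 fixed header bytes, a command id byte, and a 1-byte additive checksum.

CMD_OPEN_POSITIONING = 0x01   # illustrative command ids, not from the patent
CMD_CLOSE_POSITIONING = 0x02

def frame_command(cmd_id):
    """Build a framed packet for one command byte."""
    header = b"\xAA\x55"
    payload = bytes([cmd_id])
    checksum = sum(header + payload) & 0xFF  # low byte of the byte sum
    return header + payload + bytes([checksum])
```

In practice the framed bytes could be written to the USB link with whatever transport the device exposes, for example a USB CDC serial port via pySerial's `Serial.write`.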
It should be noted that the execution subject of steps 701-704 may be an electronic whiteboard.
Step 705: determine whether a positioning command has been received. If so, proceed to step 706; otherwise, repeat step 705.
Step 706: open the positioning interface, display the positioning points used for multi-point positioning, and flash a prompt indicating which position the user should click.
Here, the positioning points are the contour positioning points described above. Each contour positioning point is displayed on the positioning interface, and prompt information flashes beside one of them to prompt the user to click that point.
Step 707: receive and store the coordinates of the positioning point the user has clicked, and prompt the user to click the next positioning point.
After the user clicks a contour positioning point, the positioning interface flashes the prompt beside the next contour positioning point, and so on until all contour positioning points have been clicked.
Step 708: after the coordinates of all positioning points have been obtained, calculate the exact touch area of the whole device, completing the positioning function.
Here, the target application obtains the coordinate value of each contour positioning point. Using these coordinate values and a preset coordinate conversion formula that converts display coordinates in the terminal device where the target application is located into display coordinates in the projection area, it can calculate the coordinate value of each touch point on the electronic whiteboard, thereby positioning the touch position on the electronic whiteboard.
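As a concrete illustration of such a coordinate conversion, the sketch below fits a per-axis scale and offset from the clicked contour positioning points, under the simplifying assumption that the projection is axis-aligned. Real devices may need a full perspective transform, and all names here are illustrative rather than taken from the patent.

```python
# Fit v_display = a * v_whiteboard + b per axis from the extreme contour
# points, then convert any whiteboard touch coordinate to display space.

def fit_axis_map(whiteboard_vals, display_vals):
    """Two-point fit on one axis using the min/max contour coordinates."""
    w0, w1 = min(whiteboard_vals), max(whiteboard_vals)
    d0, d1 = min(display_vals), max(display_vals)
    a = (d1 - d0) / (w1 - w0)   # scale
    b = d0 - a * w0             # offset
    return a, b

def make_converter(contour_points):
    """contour_points: list of ((wx, wy), (dx, dy)) pairs -- the whiteboard
    coordinate at which each contour positioning point was clicked, and the
    display coordinate where that point was drawn."""
    ax, bx = fit_axis_map([w[0] for w, _ in contour_points],
                          [d[0] for _, d in contour_points])
    ay, by = fit_axis_map([w[1] for w, _ in contour_points],
                          [d[1] for _, d in contour_points])
    return lambda wx, wy: (ax * wx + bx, ay * wy + by)
```

With four corner points clicked, the returned converter maps every subsequent touch coordinate on the whiteboard into the display coordinates of the projection area, which is the role the preset coordinate conversion formula plays in step 708.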
It should be noted that the execution subject of steps 705 to 708 may be the terminal device where the target application is located.
Example five
Fig. 8 shows a block diagram of a touch positioning apparatus 800 provided in an embodiment of the present application. The apparatus corresponds to the touch positioning method in the foregoing embodiments; for convenience of description, only the parts related to this embodiment are shown.
Referring to fig. 8, the apparatus includes:
a state detection unit 801 for detecting a key state of a target key;
the instruction sending unit 802 is configured to send a control instruction to the target application in response to the key state of the target key being the target state, where the control instruction is used to control the target application to present a positioning interface, and the positioning interface presents information for guiding a user to perform a touch positioning operation;
the positioning execution unit 803 is configured to receive, based on the control instruction, a touch positioning operation executed by the user based on the positioning interface, so as to implement touch positioning.
In one embodiment, the state detection unit 801 is specifically configured to:
detecting the level change of an interface of a target key;
and determining the key state to be the triggered state in response to detecting that the level of the interface changes from high level to low level and then to high level.
In one embodiment, the state detection unit 801 is specifically configured to:
detecting the level change of an interface of a target key;
and determining the key state as a triggered state in response to detecting that the level of the interface changes from a high level to a low level.
In one embodiment, sending control instructions to the target application includes:
and sending a control instruction to the target application through a preset universal serial bus interface.
In one embodiment, in response to the key state of the target key being the target state, sending a control instruction to the target application includes:
and sending a control instruction to the target application in response to that the key state of the target key is the target state and the value of the parameter for indicating the on-off state of the positioning interface is a first preset value, wherein the first preset value is used for indicating that the on-off state of the positioning interface is the off state.
In one embodiment, after sending the control instruction to the target application, the method further includes:
and assigning a value of a parameter for indicating the opening and closing state of the positioning interface as a second preset value, wherein the second preset value is used for indicating that the opening and closing state of the positioning interface is an opening state.
In one embodiment, the apparatus further comprises an interface closing unit for:
and in response to the fact that the key state of the target key is detected to be the target state and the value of the parameter for indicating the opening and closing state of the positioning interface is the second preset value, sending a control instruction for controlling the target application to close the positioning interface to the target application, and assigning the value of the parameter for indicating the opening and closing state of the positioning interface to be the first preset value.
In one embodiment, the positioning interface is presented with information for guiding a user to perform a touch positioning operation, including: the positioning interface presents a plurality of contour positioning points and information for guiding a user to click on the plurality of contour positioning points, wherein the plurality of contour positioning points are used for determining the position of the projection area.
In one embodiment, the positioning execution unit 803 is specifically configured to:
selecting a contour positioning point from the plurality of contour positioning points presented by the positioning interface, and performing an information prompt operation on the selected contour positioning point;
in response to detecting that the user clicks the contour positioning point, acquiring the coordinate values of the contour positioning point, selecting a contour positioning point that has not yet been selected, and continuing the information prompt operation until all contour positioning points have been clicked;
and determining the coordinate value of each touch point on the electronic whiteboard according to the coordinate values of the plurality of contour positioning points and a preset coordinate conversion formula, thereby realizing touch positioning.
By detecting the state of the target key, the apparatus provided in this embodiment controls the target application to present the positioning interface when the key state is the target state. The user can open the positioning interface quickly and conveniently by pressing the target key, and then perform touch positioning operations guided by the information on the positioning interface. This improves the efficiency of positioning the touch position on the electronic whiteboard.
It should be noted that, for the information interaction, execution process, and other contents between the above-mentioned devices/units, the specific functions and technical effects thereof are based on the same concept as those of the embodiment of the method of the present application, and specific reference may be made to the part of the embodiment of the method, which is not described herein again.
Example six
Fig. 9 is a schematic structural diagram of an electronic device 900 according to an embodiment of the present application. As shown in fig. 9, the electronic device 900 of this embodiment includes: at least one processor 901 (only one is shown in fig. 9), a memory 902, and a computer program 903, such as a touch positioning program, stored in the memory 902 and executable on the at least one processor 901. When the processor 901 executes the computer program 903, it implements the steps in any of the method embodiments described above, as well as the functions of the modules/units in the apparatus embodiments described above, such as the functions of units 801 to 803 shown in fig. 8.
Illustratively, the computer program 903 may be divided into one or more modules/units, which are stored in the memory 902 and executed by the processor 901 to accomplish the present application. One or more modules/units may be a series of computer program instruction segments capable of performing certain functions, which are used to describe the execution of computer program 903 in electronic device 900. For example, the computer program 903 may be divided into a state detection unit, an instruction sending unit, and a positioning execution unit, and specific functions of each unit are described in the foregoing embodiments, which are not described herein again.
The electronic device 900 may be a server, a desktop computer, a tablet computer, a cloud server, a mobile terminal, and other computing devices. The electronic device 900 may include, but is not limited to, a processor 901, a memory 902. Those skilled in the art will appreciate that fig. 9 is merely an example of an electronic device 900 and does not constitute a limitation of the electronic device 900 and may include more or fewer components than shown, or some components may be combined, or different components, e.g., the electronic device may also include input-output devices, network access devices, buses, etc.
The Processor 901 may be a Central Processing Unit (CPU), other general purpose Processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other Programmable logic device, discrete Gate or transistor logic device, discrete hardware component, or the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 902 may be an internal storage unit of the electronic device 900, such as a hard disk or an internal memory of the electronic device 900. The memory 902 may also be an external storage device of the electronic device 900, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, or a Flash Card provided on the electronic device 900. Further, the memory 902 may include both an internal storage unit and an external storage device of the electronic device 900. The memory 902 is used to store the computer program and other programs and data required by the electronic device, and may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules, so as to perform all or part of the functions described above. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/electronic device and method may be implemented in other ways. For example, the above-described apparatus/electronic device embodiments are merely illustrative, and for example, a module or a unit may be divided into only one logic function, and may be implemented in other ways, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
Units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated module, if implemented in the form of a software functional unit and sold or used as a separate product, may be stored in a computer-readable storage medium. Based on this understanding, all or part of the flow in the methods of the embodiments described above may be implemented by a computer program, which is stored in a computer-readable storage medium and, when executed by a processor, implements the steps of the method embodiments described above. The computer program comprises computer program code, which may be in source-code form, object-code form, an executable file, or some intermediate form. The computer-readable medium may include: any entity or device capable of carrying computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like. It should be noted that the content a computer-readable medium may contain can be increased or decreased as required by legislation and patent practice in a jurisdiction; for example, in some jurisdictions, the computer-readable medium does not include electrical carrier signals and telecommunications signals.
The above embodiments are only used to illustrate the technical solutions of the present application, and not to limit the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (10)

1. A touch positioning method is characterized by comprising the following steps:
detecting the key state of a target key;
responding to the fact that the key state of the target key is a target state, and sending a control instruction to a target application, wherein the control instruction is used for controlling the target application to present a positioning interface, and the positioning interface presents information used for guiding a user to execute touch positioning operation;
and receiving touch positioning operation executed by a user based on the positioning interface based on the control instruction so as to realize touch positioning.
2. The method of claim 1, wherein the detecting the key state of the target key comprises:
detecting the level change of the interface of the target key;
and in response to detecting that the level of the interface changes from high level to low level and then to high level, determining that the key state is a triggered state.
3. The method of claim 1, wherein the detecting the key state of the target key comprises:
detecting the level change of the interface of the target key;
and in response to detecting that the level of the interface changes from a high level to a low level, determining that the key state is a triggered state.
4. The method of claim 1, wherein sending control instructions to the target application comprises: and sending a control instruction to the target application through a preset universal serial bus interface.
5. The method of claim 1, wherein sending a control command to a target application in response to the key status of the target key being the target status comprises:
and sending a control instruction to a target application in response to that the key state of the target key is a target state and the value of a parameter for indicating the on-off state of the positioning interface is a first preset value, wherein the first preset value is used for indicating that the on-off state of the positioning interface is a closed state.
6. The method of claim 1, wherein the pointing interface is presented with information for guiding a user to perform a touch pointing operation, comprising: the positioning interface presents a plurality of contour positioning points and information for guiding a user to click the contour positioning points, wherein the contour positioning points are used for determining the position of a projection area.
7. The method according to claim 6, wherein the receiving, based on the control instruction, a touch positioning operation performed by a user based on the positioning interface to achieve touch positioning comprises:
selecting contour positioning points from a plurality of contour positioning points presented by the positioning interface, and executing information prompt operation on the selected contour positioning points;
responding to the detection that the user clicks the contour positioning point, acquiring coordinate values of the contour positioning point, selecting the contour positioning point which is not selected from the plurality of contour positioning points, and continuing to execute the information prompt operation until all the contour positioning points are clicked;
and determining the coordinate value of each touch point on the electronic whiteboard according to the coordinate values of the plurality of contour positioning points, so as to realize touch positioning.
8. A touch location device, the device comprising:
the state detection unit is used for detecting the key state of the target key;
the instruction sending unit is used for responding to the condition that the key state of the target key is a target state and sending a control instruction to a target application, wherein the control instruction is used for controlling the target application to present a positioning interface, and the positioning interface presents information used for guiding a user to execute touch positioning operation;
and the positioning execution unit is used for receiving touch positioning operation executed by a user based on the positioning interface based on the control instruction so as to realize touch positioning.
9. An electronic device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements the method of any of claims 1 to 7 when executing the computer program.
10. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1 to 7.
CN202011422873.3A 2020-12-08 2020-12-08 Touch positioning method, device, equipment and medium Active CN112615615B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011422873.3A CN112615615B (en) 2020-12-08 2020-12-08 Touch positioning method, device, equipment and medium

Publications (2)

Publication Number Publication Date
CN112615615A true CN112615615A (en) 2021-04-06
CN112615615B CN112615615B (en) 2023-12-26

Family

ID=75229601

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011422873.3A Active CN112615615B (en) 2020-12-08 2020-12-08 Touch positioning method, device, equipment and medium

Country Status (1)

Country Link
CN (1) CN112615615B (en)

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101315586A (en) * 2008-07-21 2008-12-03 贾颖 Electronic pen for interactive electronic white board and its interaction control method
CN101727243A (en) * 2010-02-02 2010-06-09 中兴通讯股份有限公司 Method and device for acquiring calibration parameters of touch screen
CN201698327U (en) * 2010-05-14 2011-01-05 杭州高科工贸有限公司 Electronic white board based on digital wireless location
CN102354269A (en) * 2011-08-18 2012-02-15 宇龙计算机通信科技(深圳)有限公司 Method and system for controlling display device
CN102478974A (en) * 2010-11-30 2012-05-30 汉王科技股份有限公司 Electromagnetic pen for electronic whiteboard and control method thereof
CN102880360A (en) * 2012-09-29 2013-01-16 东北大学 Infrared multipoint interactive electronic whiteboard system and whiteboard projection calibration method
CN102880361A (en) * 2012-10-12 2013-01-16 南京芒冠光电科技股份有限公司 Positioning calibration method for electronic whiteboard equipment
US20130321340A1 (en) * 2011-02-10 2013-12-05 Samsung Electronics Co., Ltd. Portable device comprising a touch-screen display, and method for controlling same
CN104133598A (en) * 2013-05-03 2014-11-05 周永清 Driver-free positioning calibration guide method and driver-free positioning calibration guide device for electronic whiteboard system
CN104423629A (en) * 2013-09-11 2015-03-18 联想(北京)有限公司 Electronic equipment and data processing method
CN104636060A (en) * 2014-04-29 2015-05-20 汉王科技股份有限公司 Soft key locating method and system
US10845878B1 (en) * 2016-07-25 2020-11-24 Apple Inc. Input device with tactile feedback

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Yan Aijun; Fan Haiming; Zhou Jun: "Touch key design based on the STM8S103 microcontroller and the ST05A", 舰船防化 (Chemical Defence on Ships), no. 01, pages 32-35 *

Also Published As

Publication number Publication date
CN112615615B (en) 2023-12-26

Similar Documents

Publication Publication Date Title
CN108037888B (en) Skill control method, skill control device, electronic equipment and storage medium
CN102681722B (en) Coordinate detection system, information processor and method
JP2019042509A (en) Visual method and apparatus for compensating sound information, program, storage medium, and electronic device
CN110090444B (en) Game behavior record creating method and device, storage medium and electronic equipment
WO2019085921A1 (en) Method, storage medium and mobile terminal for operating mobile terminal with one hand
CN108159697B (en) Virtual object transmission method and device, storage medium and electronic equipment
US9146667B2 (en) Electronic device, display system, and method of displaying a display screen of the electronic device
US10281996B2 (en) Touch sensitive system and stylus for commanding by maneuvering and method thereof
WO2021239016A1 (en) Application icon display method and apparatus, and electronic device
CN108228065B (en) Method and device for detecting UI control information and electronic equipment
CN103150108A (en) Equipment screen component moving method and device, and electronic equipment
CN105264507A (en) Apparatus and method of recognizing external device in a communication system
CN108762628B (en) Page element mobile display method and device, terminal equipment and storage medium
CN112218134A (en) Input method and related equipment
CN110347324A (en) Shortcut call method, device, storage medium and meeting all-in-one machine
CN113095227A (en) Robot positioning method and device, electronic equipment and storage medium
CN105653177A (en) Method for selecting clickable elements of terminal equipment interface and terminal equipment
CN104202637A (en) Key remote control and target dragging method
CN104714751A (en) Projecting method and mobile terminal
CN103279304B (en) Method and device for displaying selected icon and mobile device
CN112596623A (en) Interaction method and device of interaction equipment, electronic equipment and storage medium
CN112615615B (en) Touch positioning method, device, equipment and medium
CN111522487A (en) Image processing method and device for touch display product, storage medium and electronic equipment
CN110825280A (en) Method, apparatus and computer-readable storage medium for controlling position movement of virtual object
CN110908568A (en) Control method and device for virtual object

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant