CN112462982A - Infrared touch method, device, equipment and computer storage medium - Google Patents

Infrared touch method, device, equipment and computer storage medium

Info

Publication number
CN112462982A
CN112462982A (application CN202011433271.8A)
Authority
CN
China
Prior art keywords
touch object
information
projection
rotation
target touch
Prior art date
Legal status
Withdrawn
Application number
CN202011433271.8A
Other languages
Chinese (zh)
Inventor
于子鹏
戴俊德
Current Assignee
Anhui Hongcheng Opto Electronics Co Ltd
Original Assignee
Anhui Hongcheng Opto Electronics Co Ltd
Priority date
Filing date
Publication date
Application filed by Anhui Hongcheng Opto Electronics Co Ltd
Priority to CN202011433271.8A
Priority to PCT/CN2020/141061 (published as WO2022121036A1)
Publication of CN112462982A
Legal status: Withdrawn

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0421Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means by interrupting or reflecting a light beam, e.g. optical touch-screen

Abstract

Embodiments of the present application disclose an infrared touch method, apparatus, device, and computer storage medium. The infrared touch method includes: when a rotation input of a target touch object on an infrared touch screen is received, acquiring a plurality of pieces of projection information corresponding to the target touch object at a plurality of angular positions of the infrared touch screen while the rotation input rotates the target touch object; determining an object to be manipulated that is associated with the target touch object, and determining rotation information corresponding to the rotation input from the plurality of pieces of projection information; and performing, on the object to be manipulated, a manipulation associated with the rotation information. Embodiments of the present application effectively reduce the complexity of operating the infrared touch screen with the target touch object and improve operating efficiency.

Description

Infrared touch method, device, equipment and computer storage medium
Technical Field
The present application relates to the field of touch technologies, and in particular, to an infrared touch method, apparatus, device, and computer storage medium.
Background
An infrared touch screen is mainly composed of infrared emitting and receiving sensing elements mounted on the outer frame of the screen, which form an infrared detection grid over its surface; any touch object interrupts the infrared rays at its point of contact, enabling operation of the screen. In the prior art, a touch object is generally used to implement a writing function or to trigger a control within a given application. When a cross-application or cross-function operation is required, however, the touch object must typically first switch the application or function and then manipulate it, which makes the operation cumbersome and inefficient.
Disclosure of Invention
Embodiments of the present application provide an infrared touch method, apparatus, device, and computer storage medium, aiming to solve the prior-art problem that cross-application or cross-function operation on an infrared touch screen is cumbersome and inefficient.
In one aspect, an embodiment of the present application provides an infrared touch method, where the method includes:
when a rotation input of a target touch object on an infrared touch screen is received, acquiring a plurality of pieces of projection information corresponding to the target touch object at a plurality of angular positions of the infrared touch screen while the rotation input rotates the target touch object;
determining an object to be manipulated associated with the target touch object, and determining rotation information corresponding to the rotation input according to the plurality of pieces of projection information; and
performing, on the object to be manipulated, a manipulation associated with the rotation information.
On the other hand, an embodiment of the present application provides an infrared touch device, and the device includes:
an acquisition module, configured to acquire, when a rotation input of a target touch object on the infrared touch screen is received, a plurality of pieces of projection information corresponding to the target touch object at a plurality of angular positions of the infrared touch screen while the rotation input rotates the target touch object;
a determining module, configured to determine an object to be manipulated associated with the target touch object and to determine rotation information corresponding to the rotation input according to the plurality of pieces of projection information; and
a manipulation module, configured to perform, on the object to be manipulated, a manipulation associated with the rotation information.
In another aspect, an embodiment of the present application provides an electronic device, where the electronic device includes: a processor and a memory storing computer program instructions;
The processor executes the computer program instructions to implement the infrared touch method described above.
In another aspect, an embodiment of the present application provides a computer storage medium, where computer program instructions are stored on the computer storage medium, and when the computer program instructions are executed by a processor, the infrared touch method is implemented.
According to the infrared touch method provided by the embodiments of the application, when a rotation input of the target touch object on the infrared touch screen is received, projection information corresponding to the target touch object at multiple angles can be acquired, rotation information corresponding to the rotation input can be determined from that projection information, and a manipulation associated with the rotation information can be performed on the object to be manipulated that is associated with the target touch object. Rotating the target touch object thus both invokes and manipulates the object to be manipulated, eliminating the need to first switch the application or function and then manipulate it; this effectively reduces operational complexity and improves efficiency.
Drawings
To illustrate the technical solutions of the embodiments of the present application more clearly, the drawings needed in the embodiments are briefly described below; those skilled in the art can obtain other drawings from these drawings without creative effort.
Fig. 1 is a schematic flowchart of an infrared touch method according to an embodiment of the present disclosure;
FIG. 2 is a schematic diagram illustrating an infrared touch screen scanning a target touch object according to an embodiment of the present disclosure;
Figs. 3a, 3b, and 3c are exemplary diagrams of a rectangular target touch object projected on an infrared touch screen;
fig. 4 is an exemplary diagram of a data table with a mapping relationship of a projected touch object preset in an embodiment of the present application;
Figs. 5a, 5b, and 5c are exemplary diagrams of the projection of a triangular target touch object on an infrared touch screen;
fig. 6 is an exemplary diagram of volume adjustment by using a target touch object in the embodiment of the present application;
fig. 7 is a schematic flowchart illustrating a process of determining an object to be operated by identifying a target touch object in the embodiment of the present application;
fig. 8 is an exemplary diagram comparing a reporting protocol with an existing reporting protocol in the embodiment of the present application;
fig. 9 is a schematic structural diagram of an infrared touch device according to an embodiment of the present disclosure;
fig. 10 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
Features and exemplary embodiments of various aspects of the present application will be described in detail below, and in order to make objects, technical solutions and advantages of the present application more apparent, the present application will be further described in detail below with reference to the accompanying drawings and specific embodiments. It should be understood that the specific embodiments described herein are intended to be illustrative only and are not intended to be limiting. It will be apparent to one skilled in the art that the present application may be practiced without some of these specific details. The following description of the embodiments is merely intended to provide a better understanding of the present application by illustrating examples thereof.
It is noted that, herein, relational terms such as first and second may be used solely to distinguish one entity or action from another, without necessarily requiring or implying any actual such relationship or order between them. Also, the terms "comprises", "comprising", or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to it. Without further limitation, an element introduced by the phrase "comprising a/an …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises it.
In order to solve the prior art problems, embodiments of the present application provide an infrared touch method, an infrared touch device, an infrared touch apparatus, and a computer storage medium. First, the infrared touch method provided by the embodiment of the present application is introduced below.
Fig. 1 shows a schematic flow chart of an infrared touch method according to an embodiment of the present application. As shown in fig. 1, the method includes:
Step 101: when a rotation input of a target touch object on an infrared touch screen is received, acquire a plurality of pieces of projection information corresponding to the target touch object at a plurality of angular positions of the infrared touch screen while the rotation input rotates the target touch object.
Step 102: determine an object to be manipulated associated with the target touch object, and determine rotation information corresponding to the rotation input according to the plurality of pieces of projection information.
Step 103: perform, on the object to be manipulated, the manipulation associated with the rotation information.
In this embodiment, the target touch object may refer to an object placed on the infrared touch screen, such as a stylus, a finger of a user, an eraser, or a rectangular block or a triangular block with a certain size, which is not specifically limited herein.
It is easy to understand that, when a user performs input on the infrared touch screen with a target touch object, the user may rotate the target touch object; in that case, the target touch object can be regarded as performing a rotation input on the infrared touch screen.
During rotation of the target touch object, the infrared touch screen can scan it. Referring to fig. 2, infrared emitting and receiving sensing elements are typically integrated into the infrared touch screen to perform this scan. Specifically, the sensing elements may include infrared emitting lamps and infrared receiving lamps; when a target touch object is placed on the screen, the beams emitted by some emitting lamps are blocked, and from which beams the receiving lamps do or do not receive, touch information of the target touch object on the infrared touch screen can be acquired.
In this embodiment, the touch information may be used to determine projection information of the target touch object on the infrared touch screen.
For example, referring to fig. 2, when a target touch object is placed on the infrared touch screen, the infrared emitting and receiving lamps arranged opposite each other in the vertical direction may be used to obtain the orthographic projection size of the target touch object on the bottom edge in the figure, and those arranged opposite each other in the horizontal direction may be used to obtain its orthographic projection size on the left edge. These correspond to the orthographic projection sizes of the target touch object on two adjacent edges of the infrared touch screen. Of course, this is merely one way of obtaining orthographic projection sizes; they may also include the orthographic projection size of the target touch object on the top or right edge, among others.
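As a concrete illustration of how blocked-beam information might yield such projection sizes, the following is a minimal sketch. The lamp spacing and all names here are hypothetical assumptions, not part of the patent:

```python
# Hypothetical sketch: recovering orthographic projection sizes from the
# indices of blocked receiver lamps on two adjacent screen edges.
LAMP_PITCH_MM = 2.5  # assumed spacing between adjacent infrared lamps

def ortho_projection_size(blocked_bottom, blocked_left):
    """Each argument is the set of blocked receiver indices on one edge.

    The orthographic projection on an edge spans from the first to the
    last blocked receiving lamp along that edge.
    """
    def span(blocked):
        if not blocked:
            return 0.0
        return (max(blocked) - min(blocked) + 1) * LAMP_PITCH_MM
    return span(blocked_bottom), span(blocked_left)

# A 60 mm x 30 mm block, axis-aligned on the screen, might block 24 lamps
# along the bottom edge and 12 along the left edge:
c, d = ortho_projection_size(set(range(10, 34)), set(range(5, 17)))
```

With the assumed 2.5 mm pitch this recovers the 60 × 30 orthographic projection used in the examples below.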
For another example, also referring to fig. 2, the infrared emitting lamps may scan the target touch object in a sector (fan) scanning manner in order to reduce the number of emitting and receiving sensing elements. At a given moment, the emitting lamp and receiving lamp participating in the scan may not be vertically (or horizontally) opposite one another; this produces an oblique projection of the target touch object onto one side of the infrared touch screen, from which corresponding oblique projection information can be acquired.
In this embodiment, the acquired projection information may refer to orthographic projection information or oblique projection information, or both.
It is easy to understand that the angular position of the target touch object generally changes during rotation. For a rectangular block, for example, it may rotate from a horizontal position to an inclined one, and the projection information obtained at different angular positions changes accordingly. For a stylus, transparent and opaque portions may alternate along the circumference of the end contacting the infrared touch screen, so that the projection information obtained likewise changes with angular position. From the change in projection information, the rotation information may be obtained; the rotation information may be a rotation angle or a rotation direction, which is not specifically limited here.
As indicated above, the target touch object may in principle be a touch object of any of various shapes. The infrared touch screen may assume by default that the target touch object has a certain shape, for example a rectangular block. Alternatively, when manipulating the infrared touch screen, the user may select the shape of the currently used target touch object on a dedicated touch-object selection interface, so that the infrared touch screen can determine the target touch object. Or the infrared touch screen may recognize the target touch object from the projection information it currently produces on the screen.
In this embodiment, the target touch object may be associated with an object to be controlled; in other words, an association relationship may be established between the touch object and the object to be operated.
The object to be manipulated may be a target function, such as volume adjustment, brightness adjustment, or application switching; it may also be a target application, such as a calculator, weather, or map application. Different touch objects may be associated with corresponding objects to be manipulated; for example, a rectangular block may be associated with volume adjustment, a triangular block with a map application, and so on.
The following description is given in connection with some application scenarios for performing operations associated with rotation information for an object to be manipulated.
For example, if the object to be manipulated is a volume-adjustment function and the rotation information includes a rotation angle and a rotation direction, the volume can be turned up or down according to the rotation direction, and the adjustment amount determined from the rotation angle. In the prior art, one may need to click the volume control and then drag the volume slider to adjust the volume. By contrast, the present application simplifies the volume-adjustment process.
For another example, if the object to be manipulated is a map application and the rotation information includes a rotation angle, the map application may be opened when the rotation angle exceeds an angle threshold. In the prior art, one may need to search for the map application's icon, or first switch to the desktop and then click the icon. By contrast, the present application opens the map application more efficiently.
For another example, if the object to be manipulated is an application-switching function and the rotation information includes a rotation direction, the application running in the foreground may be switched forward or backward according to the rotation direction. In the prior art, switching the foreground application may require triggering the background process-management interface and then dragging left or right. By contrast, the present application enables fast application switching.
Of course, the above is only an illustration of some application scenarios, and the operation related to the object to be controlled and the rotation information may be set according to actual needs, and is not specifically limited herein.
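The association-and-manipulation logic of these scenarios could be sketched as a simple dispatch. All function names, the threshold, and the angle-to-volume mapping below are illustrative assumptions, not prescribed by the patent:

```python
# Illustrative sketch: map a recognized touch object to its associated
# object-to-be-manipulated, then apply the rotation information to it.
ANGLE_THRESHOLD = 15.0  # degrees; assumed trigger threshold

def manipulate(touch_object, rotation_angle, rotation_direction, state):
    if touch_object == "rectangle":        # associated with volume adjustment
        step = rotation_angle / 10.0        # assumed mapping: 10 deg per unit
        delta = step if rotation_direction == "cw" else -step
        state["volume"] = min(100.0, max(0.0, state["volume"] + delta))
    elif touch_object == "triangle":        # associated with a map application
        if rotation_angle > ANGLE_THRESHOLD:
            state["foreground"] = "map"
    return state

# Rotating a rectangular block 30 degrees clockwise raises the volume:
state = manipulate("rectangle", 30.0, "cw", {"volume": 50.0, "foreground": "home"})
```

The point of the sketch is that rotation both selects and drives the manipulation, with no intermediate switching step.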
According to the infrared touch method provided by the embodiment of the application, under the condition that the rotation input of the target touch object on the infrared touch screen is received, the projection information corresponding to the target touch object in multiple angles can be obtained, the rotation information corresponding to the rotation input can be determined according to the projection information, and the control related to the rotation information can be executed aiming at the object to be controlled related to the target touch object. Therefore, in the embodiment of the application, the target touch object can be rotated, the object to be controlled can be called and controlled, the process of firstly switching the application or function and then controlling is omitted, the complexity of operation is effectively reduced, and the operation efficiency is improved.
In one example, when a sliding input or a click input of the target touch object on the infrared touch screen is received, a preset manipulation may be performed; for example, handwriting may be generated on a sliding input, or the control corresponding to the touch point may be triggered on a click input, and so on. The functions that the target touch object can realize are thus effectively enriched.
It is easy to understand that, during rotation of the target touch object, the infrared touch screen can acquire its projection information in real time. Because the rotation speed is usually far lower than the screen's scan frame rate, the rotation information can be calculated accurately throughout the rotation. For simplicity, the following description mainly takes as an example first projection information generated at a first position and second projection information generated at a second position during the rotation.
The first position and the second position may refer to the positions of the target touch object before and after rotation, respectively. In practice, when a user rotates a target touch object, the user may rotate it through some angle and then lift it off the infrared touch screen, or rotate it through some angle, pause, and then rotate it further. The first position may be where the target touch object starts to rotate; the second position may be the final position reached before it is lifted off, or a position where its dwell time exceeds a time threshold, among others, and is not specifically limited here.
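The selection of the two positions from a stream of scanned frames could be sketched as follows. The frame format, the dwell threshold, and the scan rate are assumptions for illustration only:

```python
# Sketch: pick the first and second positions from a sequence of scanned
# projection frames - the frame where rotation starts, and either a frame
# whose dwell time exceeds a threshold or the last frame before lift-off.
DWELL_FRAMES = 30  # e.g. roughly 0.5 s at an assumed 60 Hz scan rate

def pick_positions(frames):
    """frames: list of (c, d) projection sizes; returns (first, second)."""
    first = frames[0]
    dwell = 1
    for prev, cur in zip(frames, frames[1:]):
        dwell = dwell + 1 if cur == prev else 1
        if dwell >= DWELL_FRAMES:
            return first, cur          # a long dwell ends the rotation
    return first, frames[-1]           # otherwise: last frame before lift-off

first, second = pick_positions(
    [(67.0, 56.0)] * 2 + [(60.0, 62.0)] * 3 + [(56.0, 67.0)] * 4)
```

Here no dwell reaches the threshold, so the second position is simply the final frame before the object is removed.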
Optionally, each piece of projection information includes orthographic projection sizes of the target touch object on two adjacent edges of the infrared touch screen;
determining rotation information corresponding to the rotation input according to the plurality of projection information, including:
determining angle information corresponding to each piece of projection information according to the orthographic projection size it includes, where the angle information includes the angle, relative to a reference position, of the angular position corresponding to that piece of projection information;
determining a rotation angle corresponding to the rotation input according to the angle information respectively corresponding to the plurality of projection information; wherein the rotation information includes a rotation angle.
It is easy to understand that each side of the infrared touch screen can carry many infrared emitting and receiving lamps, and that by properly arranging the working order of these lamps, the orthographic projection size of the target touch object can be acquired. In this embodiment, each piece of projection information includes at least the orthographic projection sizes of the target touch object on two adjacent edges of the infrared touch screen; by obtaining projections of the target touch object in different directions, information such as its angular position and spatial position on the infrared touch screen can be better acquired.
Taking the example that the plurality of projection information includes the first projection information and the second projection information, the orthographic projection size included in the first projection information may reflect the orthographic projection condition of the target touch object on two sides of the infrared touch screen at the first position; the orthographic projection size included by the second projection information can reflect the orthographic projection condition of the target touch object on the two edges of the infrared touch screen at the second position. The first and second positions then correspond to the angular positions mentioned herein.
As indicated above, the target touch object may be determined or identified by the infrared touch screen in advance, in other words, the information of the shape, size, etc. of the target touch object may be acquired by the infrared touch screen in advance.
Assume the target touch object is a rectangular block whose cross-section measures 60 × 30 (length × height; dimensions are in mm unless otherwise specified). At the reference position, its orthographic projection size is 60 × 30. At the first position, the orthographic projection size in the corresponding first projection information is 67.0 × 56.0; at the second position, the orthographic projection size in the corresponding second projection information is 56.0 × 67.0. A first angle and a second angle may then be determined from these orthographic projection sizes, where the first angle is the placement angle of the first position relative to the reference position and the second angle is the placement angle of the second position relative to the reference position.
Specifically, referring to figs. 3a and 3b, the target touch object may be considered to be at the reference position in fig. 3a and at the first or second position in fig. 3b, where it makes a placement angle β with the reference position. The size of the target touch object is a × b; at the angular position with placement angle β, the orthographic projection size is c × d. From fig. 3b, the following dimensional relationships are obtained:
c=b×sinβ+a×cosβ (1)
d=b×cosβ+a×sinβ (2)
On this basis, if a, b, c, and d are known, each placement angle β can be obtained by a table look-up method or a function-calculation method; that is, the first and second angles can be calculated from the above equations, and the rotation angle is obtained as their difference.
The following describes the table lookup method or the function calculation method with reference to two examples.
In an example, the determining the angle information corresponding to each projection information according to the forward projection size included in each projection information respectively includes:
and determining angle information corresponding to the projection information according to a preset first corresponding relation and the orthographic projection size included by the projection information, wherein the first corresponding relation includes the corresponding relation between the orthographic projection size of the target touch object and the angle information.
The first corresponding relationship may be embodied by a table as shown in fig. 4, in which orthographic projection sizes of the target touch object at different placement angles are described, where the placement angle is relative to a preset reference position, and the orthographic projection size of the target touch object at the reference position may be 60 × 30, where 60 × 30 may be regarded as a value of a × b.
If the orthographic projection size in the first projection information is 67.0 × 56.0, the placement angle β can be looked up as 30°; if that in the second projection information is 56.0 × 67.0, the placement angle β is found to be 60°. From these two placement angles, the rotation angle is 30°.
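The table look-up method can be sketched as follows, precomputing (c, d) for each placement angle from formulas (1) and (2) and then finding the nearest entry for a measured projection. The 1° step is an assumption; the table of fig. 4 fixes its own granularity:

```python
import math

# Sketch of the table look-up method for a 60 mm x 30 mm rectangular block.
A, B = 60.0, 30.0  # known cross-section (a x b) of the target touch object

# Precomputed table: placement angle -> (c, d), from formulas (1) and (2).
TABLE = {
    beta: (B * math.sin(math.radians(beta)) + A * math.cos(math.radians(beta)),
           B * math.cos(math.radians(beta)) + A * math.sin(math.radians(beta)))
    for beta in range(0, 91)
}

def lookup_angle(c, d):
    # nearest-neighbour search over the precomputed table
    return min(TABLE, key=lambda beta: (TABLE[beta][0] - c) ** 2 +
                                       (TABLE[beta][1] - d) ** 2)

beta1 = lookup_angle(67.0, 56.0)   # from the first projection information
beta2 = lookup_angle(56.0, 67.0)   # from the second projection information
rotation = beta2 - beta1
```

With the measured sizes from the example above, this reproduces placement angles of 30° and 60° and a rotation angle of 30°.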
Obtaining the rotation angle from such a table can be regarded as the table look-up method, which queries the value of each placement angle directly and efficiently and thus keeps acquisition of the rotation angle fast. To calculate the rotation angle more precisely, the placement angle β and the rotation angle may instead be computed by the function-calculation method.
Specifically, in another example, the determining the angle information corresponding to each projection information according to the front projection size included in each projection information includes:
and determining angle information corresponding to the projection information according to the preset size information of the target touch object and the orthographic projection size included by the projection information.
In the function-calculation method, combining formula (1) and formula (2) above, formula (1) may be multiplied by a and formula (2) by b, obtaining:
a×c=a×b×sinβ+a×a×cosβ (3)
b×d=b×b×cosβ+a×b×sinβ (4)
subtracting formula (4) from formula (3) yields:
cosβ = (a×c − b×d)/(a² − b²), or sinβ = (b×c − a×d)/(b² − a²) (the sinβ form follows by instead multiplying formula (1) by b and formula (2) by a and subtracting)
Similarly, once the touch object information of the target touch object has been identified, the values of a and b are known; with the values of c and d from each orthographic projection size, the placement angle β of the target touch object relative to the reference position before and after rotation can be obtained, and the rotation angle follows from the difference of the two placement angles.
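The function-calculation method reduces to a few lines. This sketch uses the cosβ closed form derived above (valid when a ≠ b); the clamping of the cosine argument is a numerical-safety assumption:

```python
import math

# Sketch of the function-calculation method:
# cos(beta) = (a*c - b*d) / (a^2 - b^2), per the derivation above.
def placement_angle(a, b, c, d):
    cos_beta = (a * c - b * d) / (a ** 2 - b ** 2)
    cos_beta = max(-1.0, min(1.0, cos_beta))  # guard against rounding noise
    return math.degrees(math.acos(cos_beta))

a, b = 60.0, 30.0
beta1 = placement_angle(a, b, 67.0, 56.0)  # approx 30 degrees
beta2 = placement_angle(a, b, 56.0, 67.0)  # exactly 60 degrees here
rotation = beta2 - beta1                   # approx 30 degrees
```

Unlike the table look-up, the result is not quantized to the table's step size; the small deviation from exactly 30° comes only from the 0.1 mm rounding of the measured projections.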
In combination with the above calculation process, in this embodiment the first and second angles are determined relative to the reference position and the rotation angle is calculated from them, so the calculation is simple and helps reduce the consumption of computing resources. Moreover, the accuracy of the function-calculation method does not depend on the step size of the placement angle β in the data table, so it can be more accurate and efficient than the table look-up method.
Of course, the above mainly describes acquiring the motion information of a rectangular target touch object; other target touch objects with non-circular cross-sections can be handled on a similar principle. For example, figs. 5a, 5b, and 5c show a triangular target touch object at three placement angles; specifically, fig. 5a may be regarded as showing the triangular target touch object at the reference position, and figs. 5b and 5c as showing it at the first or second position.
Assuming that the target touch object is an equilateral triangle with side length a, its orthographic projection size c × d at different placement angles β can be calculated as follows:
c = a×sinγ;
d = a×cosβ + a×cosγ;
where γ + β = 120°.
By combining the above formulas, a table of orthographic projection sizes of the triangular target touch object at different placement angles β can likewise be compiled; calculating the rotation angle from such a data table or from the formulas is similar to the calculation for the rectangular target touch object and is not repeated here.
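A minimal sketch of these triangle formulas, including the tabulation step (function and variable names are hypothetical; the 5° step is an arbitrary choice):

```python
import math

def triangle_projection(a, beta_deg):
    """Orthographic projection size (c, d) of an equilateral triangle of
    side a at placement angle beta, using c = a*sin(gamma) and
    d = a*cos(beta) + a*cos(gamma), where gamma = 120 deg - beta."""
    gamma_deg = 120.0 - beta_deg
    c = a * math.sin(math.radians(gamma_deg))
    d = a * math.cos(math.radians(beta_deg)) + a * math.cos(math.radians(gamma_deg))
    return c, d

# Tabulate the projections in 5-degree steps, analogous to the data table
# described for the rectangular touch object.
table = {beta: triangle_projection(60, beta) for beta in range(0, 61, 5)}
```

At β = 0° this gives c = 60·sin 120° ≈ 51.96 and d = 60 + 60·cos 120° = 30 for a triangle of side 60.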
In order to acquire the rotation direction of the target touch object and thus enrich the functions the target touch object can realize, optionally, each piece of projection information includes oblique projection information of the target touch object on any side of the infrared touch screen;
determining rotation information corresponding to the rotation input according to the plurality of projection information, including:
determining a rotation direction corresponding to the rotation input according to oblique projection information respectively included in the plurality of projection information; wherein the rotation information comprises a rotation direction.
Taking the target touch object as a rectangular block as an example, referring to fig. 3a, 3b and 3c, fig. 3a can be regarded as a schematic diagram of the rectangular target touch object at the horizontal position, or reference position, while fig. 3b and 3c respectively show the oblique projection information after the rectangular target touch object rotates counterclockwise and clockwise relative to the horizontal position.
At the same rotation angle, the orthographic projection sizes in fig. 3b and fig. 3c may be identical, yet when an infrared light beam is emitted from the upper right corner towards the lower left corner, the target touch object in fig. 3b may block more of the emitted beams from a certain angle than the one in fig. 3c. Which emitted beams the target touch object blocks is closely related to the positions of the currently working infrared emitting lamp and infrared receiving lamp; that is, when the infrared emitting lamp and the infrared receiving lamp are arranged obliquely, the target touch object forms an oblique projection on one side of the infrared touch screen. The length of the oblique projection, and the position distribution of the infrared receiving lamps that can still receive infrared beams, are influenced by the rotation direction of the target touch object; based on this characteristic, the rotation direction of the target touch object can be determined from the oblique projection information.
In other words, the oblique projection information may be expressed as the number or position distribution of the infrared receiving lamps that receive infrared beams during oblique projection.
Similarly, the determination of the rotation direction of a triangular target touch object can be understood by combining the oblique projection information in fig. 5a, 5b and 5c, which show schematic diagrams of a triangular target touch object at three placement angles. Specifically, fig. 5a can be regarded as a schematic diagram of the triangular target touch object placed at the horizontal position or reference position; in fig. 5b, the target touch object has rotated clockwise, so it blocks more of the oblique light beams emitted from the upper right towards the lower left; in fig. 5c, the target touch object has rotated counterclockwise, so it blocks fewer of those oblique light beams. Based on this difference in the oblique projection information, the rotation direction of the target touch object can be judged.
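The comparison above can be sketched as follows, with the oblique projection information simplified to a count of blocked receivers. Note that the sign convention (more blocked upper-right-to-lower-left beams means clockwise, as for the triangle in fig. 5b) is an assumption and in practice depends on the shape of the touch object and the lamp geometry:

```python
def rotation_direction(ref_blocked, cur_blocked):
    """Infer the rotation direction from oblique projection information,
    here reduced to the number of obliquely arranged infrared receiving
    lamps whose beams are blocked at the reference and current positions.
    The sign convention is a hypothetical simplification."""
    if cur_blocked > ref_blocked:
        return "clockwise"
    if cur_blocked < ref_blocked:
        return "counterclockwise"
    return "unchanged"
```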
Referring to fig. 6, fig. 6 is an exemplary diagram of a scenario in which the volume is adjusted through a target touch object. The target touch object is a rectangular block placed in the display area of the infrared touch screen; its touch object information is identified, and the object to be controlled associated with it is determined to be the volume adjustment function. At this point, the current volume may be displayed at the edge of the initial angular position of the target touch object, for example as the percentage "40%". After a rotation input is received and the target touch object rotates to another angular position, the rotation direction and rotation angle can be obtained from the projection information corresponding to the two angular positions; for example, when the rotation direction is clockwise, the volume can be increased, adjusted to 60% according to the rotation angle, and "60%" displayed at the edge of the target touch object's new angular position.
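The fig. 6 scenario could be sketched as below; the linear mapping of rotation angle to volume and the pct_per_degree scale factor are hypothetical tuning choices, not specified by the description:

```python
def adjust_volume(current_pct, direction, angle_deg, pct_per_degree=1.0):
    """Map a rotation input to a volume change: clockwise raises the
    volume, counterclockwise lowers it, clamped to the 0-100% range."""
    delta = angle_deg * pct_per_degree
    if direction == "counterclockwise":
        delta = -delta
    return max(0.0, min(100.0, current_pct + delta))
```

With these assumptions, a 20° clockwise rotation takes the volume from 40% to 60%, matching the example above.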
Of course, it should be noted that, in general, to determine the rotation angle and rotation direction of the target touch object, the cross-sectional shape of the target touch object should be other than circular. However, in some possible embodiments, rotation information can also be acquired for a target touch object with a circular cross section by structurally modifying it, for example by arranging transparent and non-transparent portions alternately around its circumferential surface, so that the projection information changes as the target touch object rotates and the rotation information can be acquired.
In some application scenarios, the target touch object may be a touch object selected by a user from a plurality of different touch objects, and in order to facilitate the user to implement operations and controls on different applications or functions through different touch objects, optionally, as shown in fig. 7, the determining an object to be operated and controlled associated with the target touch object includes:
step 701, identifying a target touch object according to projection information;
step 702, determining an object to be operated associated with the target touch object according to a preset second corresponding relation;
the second corresponding relation comprises a corresponding relation between the touch object and the object to be operated, and the object to be controlled is a target function or a target application.
It is easy to understand that, since the target touch object may be selected arbitrarily by the user, the infrared touch screen may not be able to directly obtain information such as the size of the target touch object; therefore, the target touch object may need to be identified first so that its information can be obtained.
For example, an area value may be calculated from the orthographic projection sizes in the projection information, and the target touch object identified as a stylus, a user's finger, an eraser or the like according to that value. It is easy to understand that the placement angles of the target touch object on the touch screen differ, so the area values calculated from different projection information of the same target touch object may differ; however, since the area values corresponding to a stylus, a user's finger, an eraser and so on typically fall into different ranges, the target touch object can still be identified by comparing the calculated area value with the area range of each candidate touch object. In addition, the electronic device containing the infrared touch screen can store information about certain touch objects in advance, and when a target touch object is identified, the corresponding information can be retrieved.
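A minimal sketch of this area-based classification; the object names and area ranges are hypothetical and would be calibrated for a real screen:

```python
# Hypothetical area ranges (square millimetres) per candidate touch object;
# insertion order determines the lookup order.
AREA_RANGES = {
    "stylus": (0.0, 50.0),
    "finger": (50.0, 400.0),
    "eraser": (400.0, 5000.0),
}

def identify_by_area(c, d):
    """Classify the touch object from the area of its orthographic projection."""
    area = c * d
    for name, (lo, hi) in AREA_RANGES.items():
        if lo <= area < hi:
            return name
    return None
```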
Of course, this is merely an example of an application scenario for identifying a target touch object according to projection information, and in practical applications, the identification may also be implemented based on other manners, which is not specifically limited herein.
On the basis of identifying the target touch object, when a rotation input is received, the object to be controlled associated with the target touch object can be determined according to the preset second corresponding relation. For example, when the target touch object is a user's finger and the second corresponding relation associates the finger with a map application, receiving a rotation input may start the map application according to the rotation information; likewise, when the target touch object is an eraser and the second corresponding relation associates the eraser with the volume adjustment function, receiving a rotation input may adjust the volume according to the rotation information.
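The second corresponding relation lends itself naturally to a lookup table; a sketch with hypothetical entries mirroring the examples above:

```python
# Hypothetical second corresponding relation: touch object -> object to be
# controlled (a target function or target application).
SECOND_CORRESPONDENCE = {
    "finger": "map_application",
    "eraser": "volume_adjustment",
}

def object_to_control(touch_object):
    """Return the object to be controlled for an identified touch object,
    or None if no correspondence has been configured."""
    return SECOND_CORRESPONDENCE.get(touch_object)
```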
In one example, the identifying the target touch object according to the projection information in step 701 includes:
identifying a target touch object according to a preset projection touch object mapping relation and projection information; the preset projection touch object mapping relation is a corresponding relation between the touch object and a plurality of preset projection information, and each preset projection information is associated with the placement angle of the touch object.
The preset projection touch object mapping relationship can be represented, to a certain extent, by a table such as the one shown in fig. 4: for any touch object, a correspondence can be established with a plurality of pieces of preset projection information, each corresponding to a different placement angle. When identifying the target touch object, if the acquired projection information matches one of the pieces of preset projection information, which touch object the target touch object is can be determined from that preset projection information.
With continued reference to fig. 4, fig. 4 shows preset projection information of a rectangular block with a rectangular cross section and a length × width (a × b) of 60 × 30, where the preset projection information may be a forward projection size (c × d) of the touch object on two adjacent sides of the infrared touch screen at different placement angles β, and may be recorded in a form of a table.
For example, when the user places the rectangular block with the 60 × 30 cross section on the infrared touch screen at a particular angle, and the obtained first projection information indicates that its orthographic projection size on two adjacent edges of the infrared touch screen is 56.0 × 67.0, then by querying the table shown in fig. 4 the entry 55.98 × 66.96 can be found, and the target touch object can be identified as a rectangular block with a 60 × 30 cross section.
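This table lookup can be sketched as a tolerance-based nearest match; the table layout, object name and tolerance value are hypothetical:

```python
def identify_by_projection(observed, preset_tables, tolerance=0.5):
    """Match an observed orthographic projection size (c, d) against
    per-touch-object tables of preset (c, d) values sampled at different
    placement angles. Returns the first object within tolerance, else None."""
    c_obs, d_obs = observed
    for name, entries in preset_tables.items():
        for c, d in entries:
            if abs(c - c_obs) <= tolerance and abs(d - d_obs) <= tolerance:
                return name
    return None

# A 60 x 30 rectangle's table containing the 55.98 x 66.96 entry cited above.
presets = {"rect_60x30": [(60.0, 30.0), (55.98, 66.96)]}
```

With these presets, the observed 56.0 × 67.0 projection matches the 55.98 × 66.96 entry and identifies the 60 × 30 block.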
Compared with identifying the target touch object based on an area value as in the above embodiment, identification based on the preset projection touch object mapping relationship can be more accurate. For example, the target touch object may be a rectangular block with a 60 × 30 cross section or one with a 50 × 30 cross section; under area-based identification, the area ranges of the two blocks may partially overlap, making them difficult to distinguish. With the preset projection touch object mapping relation, however, the specific sizes of the various rectangular blocks are available, so the touch object information of both blocks can be identified accurately.
The projection information used for identifying the target touch object may be oblique projection information in addition to the above orthogonal projection size.
By way of example, if there is a rectangular block having a cross-sectional length x height of 60 x 30, and a triangular block having a cross-sectional length x height of 60 x 30; when the two target touch objects are respectively placed on the infrared touch screen, the orthogonal projection sizes on two adjacent edges of the infrared touch screen may be equal. Therefore, in order to accurately identify the touch object information of the two target touch objects, the oblique projection information may be further used in the present embodiment.
It is easily understood that, for the rectangular block and the triangular block, the size of the oblique projection on one side of the infrared touch screen or the position of the oblique projection on the side is usually different, and based on the difference of the oblique projection information, two target touch objects can be distinguished. Therefore, the touch object information of the target touch object is identified by combining the oblique projection information of the target touch object on any side of the infrared touch screen, and the identification accuracy of the touch object information of the target touch object can be further improved.
In practical application, for an electronic device with an infrared touch screen, the processor performs the corresponding action according to the scanning result for the target touch object, and the scanning result usually follows a specific reporting protocol. Referring to fig. 8, fig. 8 shows the difference between the existing reporting protocol and the reporting protocol of the embodiment of the present application. The existing reporting protocol mainly includes a packet header, a command, a touch ID, a touch attribute, touch coordinates, the touch object's length and width, and a check code; the reporting protocol of the embodiment of the present application differs mainly in the remarks. For example, for the command field, the existing reporting protocol mostly supports writing, whereas the embodiment of the present application can, based on the identified touch object information, distinguish whether a specific application is started by the touch object or writing is performed. In addition, the reporting protocol of the embodiment of the present application can add the rotation direction (clockwise or counterclockwise) and the angular position (or rotation angle, whose value may be 0 to 180°) of the touch object (generally expressed as the Nth point on the infrared touch screen).
An embodiment of the present application further provides an infrared touch device, as shown in fig. 9, the infrared touch device includes:
an obtaining module 901, configured to obtain, when a rotational input of a target touch object on an infrared touch screen is received, a plurality of pieces of projection information corresponding to the target touch object at a plurality of angular positions of the infrared touch screen in a rotation process of the target touch object by the rotational input;
a determining module 902, configured to determine an object to be controlled associated with a target touch object, and determine rotation information corresponding to a rotation input according to a plurality of projection information;
and the control module 903 is used for executing control related to the rotation information for the object to be controlled.
Optionally, each piece of projection information includes orthographic projection sizes of the target touch object on two adjacent edges of the infrared touch screen;
accordingly, the determining module 902 includes:
the first determining unit is used for determining angle information corresponding to each piece of projection information according to the orthographic projection size included in each piece of projection information, and the angle information includes the angle of the angle position corresponding to each piece of projection information relative to the reference position;
the second determining unit is used for determining a rotation angle corresponding to the rotation input according to the angle information respectively corresponding to the plurality of projection information; wherein the rotation information includes a rotation angle.
Optionally, the first determining unit may be specifically configured to:
and determining angle information corresponding to the projection information according to a preset first corresponding relation and the orthographic projection size included by the projection information, wherein the first corresponding relation includes the corresponding relation between the orthographic projection size of the target touch object and the angle information.
Optionally, the first determining unit may be specifically configured to:
and determining angle information corresponding to the projection information according to the preset size information of the target touch object and the orthographic projection size included by the projection information.
Optionally, each piece of projection information includes oblique projection information of the target touch object on any side of the infrared touch screen;
accordingly, the determining module 902 may include:
the third determining unit is used for determining the rotating direction corresponding to the rotating input according to the oblique projection information respectively included in the plurality of projection information; wherein the rotation information comprises a rotation direction.
Optionally, the determining module 902 may include:
the identification unit is used for identifying the target touch object according to the projection information;
the fourth determining unit is used for determining an object to be operated related to the target touch object according to a preset second corresponding relation;
the second corresponding relation comprises a corresponding relation between the touch object and the object to be operated, and the object to be controlled is a target function or a target application.
Optionally, the fourth determining unit may be specifically configured to:
identifying a target touch object according to a preset projection touch object mapping relation and projection information; the preset projection touch object mapping relation is a corresponding relation between the touch object and a plurality of preset projection information, and each preset projection information is associated with the placement angle of the touch object.
It should be noted that the infrared touch device is a device corresponding to the infrared touch method, and all implementation manners in the embodiments of the method are applicable to the embodiments of the device, so that the same technical effect can be achieved.
Fig. 10 shows a hardware structure diagram of an electronic device provided in an embodiment of the present application.
The electronic device may include a processor 1001 and a memory 1002 that stores computer program instructions.
Specifically, the processor 1001 may include a central processing unit (CPU) or an application-specific integrated circuit (ASIC), or may be configured as one or more integrated circuits implementing the embodiments of the present application.
Memory 1002 may include mass storage for data or instructions. By way of example, and not limitation, memory 1002 may include a hard disk drive (HDD), a floppy disk drive, flash memory, an optical disk, a magneto-optical disk, magnetic tape, a Universal Serial Bus (USB) drive, or a combination of two or more of these. Memory 1002 may include removable or non-removable (or fixed) media, where appropriate, and may be internal or external to the electronic device, where appropriate. In a particular embodiment, the memory 1002 is non-volatile solid-state memory.
The memory may include Read Only Memory (ROM), Random Access Memory (RAM), magnetic disk storage media devices, optical storage media devices, flash memory devices, electrical, optical, or other physical/tangible memory storage devices. Thus, in general, the memory includes one or more tangible (non-transitory) computer-readable storage media (e.g., a memory device) encoded with software comprising computer-executable instructions and when the software is executed (e.g., by one or more processors), it is operable to perform the operations described with reference to the method according to an aspect of the disclosure.
The processor 1001 reads and executes the computer program instructions stored in the memory 1002 to implement any one of the infrared touch methods in the above embodiments.
In one example, the electronic device may also include a communication interface 1003 and a bus 1004. As shown in fig. 10, a processor 1001, a memory 1002, and a communication interface 1003 are connected to each other via a bus 1004 to complete mutual communication.
The communication interface 1003 is mainly used for implementing communication between modules, apparatuses, units and/or devices in this embodiment.
Bus 1004 includes hardware, software, or both coupling the components of the electronic device together. By way of example, and not limitation, the bus may include an Accelerated Graphics Port (AGP) or other graphics bus, an Enhanced Industry Standard Architecture (EISA) bus, a Front Side Bus (FSB), a HyperTransport (HT) interconnect, an Industry Standard Architecture (ISA) bus, an InfiniBand interconnect, a Low Pin Count (LPC) bus, a memory bus, a Micro Channel Architecture (MCA) bus, a Peripheral Component Interconnect (PCI) bus, a PCI-Express (PCIe) bus, a Serial Advanced Technology Attachment (SATA) bus, a Video Electronics Standards Association local (VLB) bus, another suitable bus, or a combination of two or more of these. The bus 1004 may include one or more buses, where appropriate. Although specific buses are described and shown in the embodiments of the application, any suitable buses or interconnects are contemplated by the application.
In addition, in combination with the infrared touch method in the foregoing embodiments, embodiments of the present application may provide a computer storage medium to implement. The computer storage medium having computer program instructions stored thereon; the computer program instructions, when executed by a processor, implement any of the above-described embodiments of the infrared touch method.
It is to be understood that the application is not limited to the particular arrangements and instrumentality described above and shown in the drawings. A detailed description of known methods is omitted herein for the sake of brevity. In the above embodiments, several specific steps are described and shown as examples. However, the method processes of the present application are not limited to the specific steps described and illustrated, and those skilled in the art can make various changes, modifications and additions or change the order between the steps after comprehending the spirit of the present application.
The functional blocks shown in the above structural block diagrams may be implemented as hardware, software, firmware, or a combination thereof. When implemented in hardware, it may be, for example, an electronic circuit, an Application Specific Integrated Circuit (ASIC), suitable firmware, plug-in, function card, or the like. When implemented in software, the elements of the present application are the programs or code segments used to perform the required tasks. The program or code segments may be stored in a machine-readable medium or transmitted by a data signal carried in a carrier wave over a transmission medium or a communication link. A "machine-readable medium" may include any medium that can store or transfer information. Examples of a machine-readable medium include electronic circuits, semiconductor memory devices, ROM, flash memory, Erasable ROM (EROM), floppy disks, CD-ROMs, optical disks, hard disks, fiber optic media, Radio Frequency (RF) links, and so forth. The code segments may be downloaded via a computer network, such as the internet, an intranet, etc.
It should also be noted that the exemplary embodiments mentioned in this application describe some methods or systems based on a series of steps or devices. However, the present application is not limited to the order of the above-described steps, that is, the steps may be performed in the order mentioned in the embodiments, may be performed in an order different from the order in the embodiments, or may be performed simultaneously.
Aspects of the present disclosure are described above with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, enable the implementation of the functions/acts specified in the flowchart and/or block diagram block or blocks. Such a processor may be, but is not limited to, a general purpose processor, a special purpose processor, an application specific processor, or a field programmable logic circuit. It will also be understood that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware for performing the specified functions or acts, or combinations of special purpose hardware and computer instructions.
As will be apparent to those skilled in the art, for convenience and brevity of description, the specific working processes of the systems, modules and units described above may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again. It should be understood that the scope of the present application is not limited thereto, and any equivalent modifications or substitutions can be easily made by those skilled in the art within the technical scope of the present application, and are intended to be covered by the present application.

Claims (10)

1. An infrared touch method is characterized by comprising the following steps:
under the condition that a rotation input of a target touch object on an infrared touch screen is received, acquiring a plurality of projection information corresponding to the target touch object on a plurality of angle positions of the infrared touch screen in the rotation process of the target touch object by the rotation input;
determining an object to be controlled associated with the target touch object, and determining rotation information corresponding to the rotation input according to the plurality of projection information;
and executing the operation associated with the rotation information for the object to be controlled.
2. The method of claim 1, wherein each of the projection information includes orthographic projection sizes of the target touch object on two adjacent edges of the infrared touch screen;
the determining rotation information corresponding to the rotation input according to the plurality of projection information includes:
determining angle information corresponding to each piece of projection information according to the orthogonal projection size included in each piece of projection information, wherein the angle information includes an angle of an angle position corresponding to each piece of projection information relative to a reference position;
determining a rotation angle corresponding to the rotation input according to the angle information corresponding to the plurality of projection information respectively; wherein the rotation information includes the rotation angle.
3. The method according to claim 2, wherein the determining the angle information corresponding to each of the projection information according to the orthogonal projection size included in each of the projection information respectively comprises:
determining angle information corresponding to the projection information according to a preset first corresponding relation and a forward projection size included in the projection information, wherein the first corresponding relation includes a corresponding relation between the forward projection size of the target touch object and the angle information.
4. The method according to claim 2, wherein the determining the angle information corresponding to each of the projection information according to the orthogonal projection size included in each of the projection information respectively comprises:
and determining angle information corresponding to the projection information according to the preset size information of the target touch object and the orthographic projection size included by the projection information.
5. The method of claim 1, wherein each of the projection information includes oblique projection information of the target touch object on any side of the infrared touch screen;
the determining rotation information corresponding to the rotation input according to the plurality of projection information includes:
determining a rotation direction corresponding to the rotation input according to the oblique projection information respectively included in the plurality of projection information; wherein the rotation information includes the rotation direction.
6. The method according to claim 1, wherein the determining the object to be manipulated associated with the target touch object comprises:
identifying the target touch object according to the projection information;
determining an object to be operated associated with the target touch object according to a preset second corresponding relation;
the second corresponding relation comprises a corresponding relation between a touch object and an object to be operated, and the object to be operated is a target function or a target application.
7. The method of claim 6, wherein the identifying the target touch object according to the projection information comprises:
identifying the target touch object according to a preset projection touch object mapping relation and the projection information; the preset projection touch object mapping relation is a corresponding relation between a touch object and a plurality of preset projection information, and each preset projection information is associated with the placement angle of the touch object.
8. An infrared touch device, comprising:
an acquisition module, configured to acquire, in a case where a rotation input of a target touch object on an infrared touch screen is received, a plurality of projection information corresponding to the target touch object at a plurality of angular positions of the infrared touch screen during rotation of the target touch object driven by the rotation input;
a determining module, configured to determine an object to be operated associated with the target touch object, and to determine, according to the plurality of projection information, rotation information corresponding to the rotation input; and
a control module, configured to perform, on the object to be operated, control associated with the rotation information.
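The three-module device of claim 8 amounts to a simple pipeline: acquire projections, determine the target and rotation information, then apply the control. A structural sketch (module interfaces are assumptions, not the patent's):

```python
# Illustrative wiring of claim 8's acquisition, determining, and control modules.
class InfraredTouchDevice:
    def __init__(self, acquire, determine, control):
        self.acquire = acquire      # rotation input -> plurality of projection info
        self.determine = determine  # projections -> (object to be operated, rotation info)
        self.control = control      # applies the rotation-related control to the object

    def on_rotation_input(self, rotation_input):
        projections = self.acquire(rotation_input)
        target, rotation_info = self.determine(projections)
        self.control(target, rotation_info)

# Toy module implementations for demonstration only.
actions = []
device = InfraredTouchDevice(
    acquire=lambda evt: evt["angles"],
    determine=lambda angles: (
        "volume_control",
        {"direction": "clockwise" if angles[-1] < angles[0] else "counterclockwise"},
    ),
    control=lambda obj, info: actions.append((obj, info["direction"])),
)
device.on_rotation_input({"angles": [50, 35, 20]})
print(actions)  # prints [('volume_control', 'clockwise')]
```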
9. An electronic device, characterized in that the electronic device comprises: a processor and a memory storing computer program instructions;
the processor, when executing the computer program instructions, implements the infrared touch method of any one of claims 1-7.
10. A computer storage medium having computer program instructions stored thereon, which when executed by a processor implement the infrared touch method of any one of claims 1-7.
CN202011433271.8A 2020-12-10 2020-12-10 Infrared touch method, device, equipment and computer storage medium Withdrawn CN112462982A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202011433271.8A CN112462982A (en) 2020-12-10 2020-12-10 Infrared touch method, device, equipment and computer storage medium
PCT/CN2020/141061 WO2022121036A1 (en) 2020-12-10 2020-12-29 Infrared touch-control method, apparatus, and device, and computer storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011433271.8A CN112462982A (en) 2020-12-10 2020-12-10 Infrared touch method, device, equipment and computer storage medium

Publications (1)

Publication Number Publication Date
CN112462982A true CN112462982A (en) 2021-03-09

Family

ID=74801043

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011433271.8A Withdrawn CN112462982A (en) 2020-12-10 2020-12-10 Infrared touch method, device, equipment and computer storage medium

Country Status (2)

Country Link
CN (1) CN112462982A (en)
WO (1) WO2022121036A1 (en)

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4991458B2 (en) * 2007-09-04 2012-08-01 キヤノン株式会社 Image display apparatus and control method thereof
US9213440B2 (en) * 2010-07-27 2015-12-15 Hewlett-Packard Development Company L.P. System and method for remote touch detection
KR101976605B1 (en) * 2016-05-20 2019-05-09 이탁건 A electronic device and a operation method
CN106781849A (en) * 2016-12-16 2017-05-31 崔熙媛 A kind of laser pen and its control method
CN107562288B (en) * 2017-08-31 2020-03-06 广东美的制冷设备有限公司 Response method based on infrared touch device, infrared touch device and medium
CN108572765B (en) * 2018-04-20 2021-03-12 北京硬壳科技有限公司 Touch control identification method and device
CN111694468B (en) * 2020-06-15 2023-09-05 广州创知科技有限公司 Method and device for determining type of target object

Also Published As

Publication number Publication date
WO2022121036A1 (en) 2022-06-16

Similar Documents

Publication Publication Date Title
US9430093B2 (en) Monitoring interactions between two or more objects within an environment
US9052778B2 (en) Infrared touch screen
US8442269B2 (en) Method and apparatus for tracking target object
US20150026646A1 (en) User interface apparatus based on hand gesture and method providing the same
US9311756B2 (en) Image group processing and visualization
CN104598082B (en) A kind of method and device for determining candidate touch point
EP2996067A1 (en) Method and device for generating motion signature on the basis of motion signature information
EP2984545B1 (en) Virtual touch screen
US20160188178A1 (en) Display Processing Method And Portable Mobile Terminal
CN105593786A (en) Gaze-assisted touchscreen inputs
CN111177869A (en) Method, device and equipment for determining sensor layout scheme
CN110858814B (en) Control method and device for intelligent household equipment
CN106598351A (en) Touch point processing method and equipment
CN112462982A (en) Infrared touch method, device, equipment and computer storage medium
US20160140699A1 (en) Automatically identifying and healing spots in images
CN104766332A (en) Image processing method and electronic device
US10379677B2 (en) Optical touch device and operation method thereof
CN109032354B (en) Electronic device, gesture recognition method thereof and computer-readable storage medium
CN112882594B (en) Touch device, positioning method, device and medium
CN104679428A (en) Method for judging photograph rotation direction according to single finger gestures
JP6796506B2 (en) Input system
Fujiwara et al. Interactions with a line-follower: An interactive tabletop system with a markerless gesture interface for robot control
CN112561995B (en) Real-time and efficient 6D attitude estimation network, construction method and estimation method
CN112418316B (en) Robot repositioning method and device, laser robot and readable storage medium
WO2021007733A1 (en) Method for recognizing gesture for operating terminal device, and terminal device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication

Application publication date: 20210309
