CN110209242B - Button function binding method, button function calling method, button function binding device, button function calling device and projection control equipment - Google Patents

Button function binding method, button function calling method, button function binding device, button function calling device and projection control equipment

Info

Publication number
CN110209242B
CN110209242B (application CN201910432357.XA)
Authority
CN
China
Prior art keywords
button
function
projection
touch
binding
Prior art date
Legal status
Active
Application number
CN201910432357.XA
Other languages
Chinese (zh)
Other versions
CN110209242A (en)
Inventor
林德熙
吕毅
Current Assignee
Guangzhou Shiyuan Electronics Thecnology Co Ltd
Guangzhou Shirui Electronics Co Ltd
Original Assignee
Guangzhou Shiyuan Electronics Thecnology Co Ltd
Guangzhou Shirui Electronics Co Ltd
Priority date
Filing date
Publication date
Application filed by Guangzhou Shiyuan Electronics Thecnology Co Ltd, Guangzhou Shirui Electronics Co Ltd filed Critical Guangzhou Shiyuan Electronics Thecnology Co Ltd
Priority to CN201910432357.XA priority Critical patent/CN110209242B/en
Publication of CN110209242A publication Critical patent/CN110209242A/en
Application granted granted Critical
Publication of CN110209242B publication Critical patent/CN110209242B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1633Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1637Details related to the display arrangement, including those related to the mounting of the display in the housing
    • G06F1/1639Details related to the display arrangement, including those related to the mounting of the display in the housing the display being based on projection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/445Program loading or initiating
    • G06F9/44521Dynamic linking or loading; Link editing at or after load time, e.g. Java class loading
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Hardware Design (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application relates to a button function binding method, a button function calling method, a button function binding apparatus, a button function calling apparatus, and projection control equipment. The method comprises the following steps: acquiring a button position on a projection whiteboard, wherein the projection whiteboard has a touch sensing function and the button position is obtained according to a function-binding touch operation sensed by the projection whiteboard; and acquiring a button function and establishing a binding relationship between the button position and the button function, wherein the binding relationship is used to call, according to a function-calling touch operation sensed by the projection whiteboard, the button function bound to the button position. With this technical scheme, no operation needs to be performed at the projection control end, which improves the convenience of calling program functions for the user.

Description

Button function binding method, button function calling method, button function binding device, button function calling device and projection control equipment
Technical Field
The application relates to the field of interactive intelligent panels, in particular to a button function binding method, a button function calling method, a button function binding device, a button function calling device, a projection control device, a storage medium and a button function calling system.
Background
In the field of interactive intelligent panels, a projection whiteboard is one of important devices in interactive intelligent devices, and is widely applied to various application scenes, so that the working efficiency and the learning efficiency of people are greatly improved.
For example, in a teaching scenario, a teacher may connect a notebook computer to a projector in the classroom and project the picture displayed on the notebook (e.g., PPT material) onto an electronic whiteboard through the projector. The audience can then see the notebook's picture on the electronic whiteboard, which facilitates discussion.
In an actual teaching scene, certain program functions often need to be called. For example, while giving a lesson, a teacher may need to start the notebook's browser to access a certain webpage, or start the notebook's recording function to record the teaching audio.
However, the user can only operate the notebook computer itself to call the corresponding program function; the user cannot start a program function of the notebook directly from the electronic whiteboard, which is inconvenient.
The prior art therefore has the problem that it is inconvenient for the user to call a program function.
Disclosure of Invention
In view of the above technical problems, it is necessary to provide a button function binding method and apparatus, a button function calling method and apparatus, a projection control device, a storage medium, and a button function calling system.
In a first aspect, a button function binding method is provided, including:
acquiring a button position on a projection whiteboard, wherein the projection whiteboard has a touch sensing function and the button position is obtained according to a function-binding touch operation sensed by the projection whiteboard;
and acquiring a button function and establishing a binding relationship between the button position and the button function, wherein the binding relationship is used to call, according to a function-calling touch operation sensed by the projection whiteboard, the button function bound to the button position.
In a second aspect, a button function calling method is provided, including:
acquiring a button position on a projection whiteboard, wherein the projection whiteboard has a touch sensing function and the button position is obtained according to a function-binding touch operation sensed by the projection whiteboard;
acquiring a button function, and establishing a binding relationship between the button position and the button function;
receiving a function-calling touch coordinate from the projection whiteboard, the function-calling touch coordinate being generated according to a function-calling touch operation sensed by the projection whiteboard;
and determining that the function-calling touch coordinate matches the button position, and calling the button function bound to the button position.
In a third aspect, a button function binding apparatus is provided, including:
a position acquisition module, used for acquiring a button position on a projection whiteboard, wherein the projection whiteboard has a touch sensing function and the button position is obtained according to a function-binding touch operation sensed by the projection whiteboard;
and a function acquisition module, used for acquiring a button function and establishing a binding relationship between the button position and the button function, wherein the binding relationship is used to call, according to a function-calling touch operation sensed by the projection whiteboard, the button function bound to the button position.
In a fourth aspect, a button function calling apparatus is provided, including:
a position acquisition module, used for acquiring a button position on a projection whiteboard, wherein the projection whiteboard has a touch sensing function and the button position is obtained according to a function-binding touch operation sensed by the projection whiteboard;
a function acquisition module, used for acquiring a button function and establishing a binding relationship between the button position and the button function;
a coordinate receiving module, used for receiving a function-calling touch coordinate from the projection whiteboard, the function-calling touch coordinate being generated according to a function-calling touch operation sensed by the projection whiteboard;
and a function calling module, used for determining that the function-calling touch coordinate matches the button position and calling the button function bound to the button position.
In a fifth aspect, a projection control apparatus is provided, which includes: a memory, a display screen, and one or more processors;
the memory for storing one or more programs;
the one or more processors, when executing the one or more programs, implement the steps of:
acquiring a button position on a projection whiteboard, wherein the projection whiteboard has a touch sensing function and the button position is obtained according to a function-binding touch operation sensed by the projection whiteboard;
and acquiring a button function and establishing a binding relationship between the button position and the button function, wherein the binding relationship is used to call, according to a function-calling touch operation sensed by the projection whiteboard, the button function bound to the button position.
In a sixth aspect, there is provided a computer readable storage medium having stored thereon a computer program which when executed by a processor performs the steps of:
acquiring a button position on a projection whiteboard, wherein the projection whiteboard has a touch sensing function and the button position is obtained according to a function-binding touch operation sensed by the projection whiteboard;
and acquiring a button function and establishing a binding relationship between the button position and the button function, wherein the binding relationship is used to call, according to a function-calling touch operation sensed by the projection whiteboard, the button function bound to the button position.
In a seventh aspect, a button function calling system is provided, including:
the system comprises an electronic whiteboard and a projection control terminal; the electronic whiteboard has a touch sensing function; the display picture of the projection control end is projected to the electronic whiteboard through a projector;
the electronic whiteboard is used for sensing function binding touch operation, generating function binding touch coordinates according to the function binding touch operation, and sending the function binding touch coordinates to the projection control end;
the projection control terminal is used for acquiring the button position of the electronic whiteboard according to the function binding touch coordinate, acquiring the button function and establishing the binding relationship between the button position and the button function;
the projection control terminal is further used for calling touch operation according to the function sensed by the electronic whiteboard and calling the button function bound with the button position.
According to the button function binding method and apparatus, the button function calling method and apparatus, the projection control device, the storage medium, and the button function calling system described above, the button position on the projection whiteboard is obtained through a function-binding touch operation sensed by the projection whiteboard, and a binding relationship between a button function and the button position is established. During projection, according to this binding relationship, when the projection whiteboard senses a function-calling touch operation at the button position, the button function bound to that position is called. The user can therefore call a program function of the projection control end by touching the projection whiteboard, without operating the projection control end itself, which improves the convenience of calling program functions.
Further, because the button position is determined by the user's function-binding touch operation on the projection whiteboard and the button function is then bound to that position, any button function can be bound to any position on the projection whiteboard through a simple touch operation. The user's need for custom buttons is met without the user having to write complex custom-button code, which reduces the cost of custom buttons. Moreover, since a custom button is created by a simple touch operation, the user can adjust the position and function of a button at any time as needed, which improves the flexibility of custom buttons.
Further, since any button function can be bound to any position on the projection whiteboard, when the touch sensing function at a bound button position becomes abnormal, the user can simply bind the button function to another position to obtain a new button position. This avoids the situation in which a custom button becomes unusable because touch sensing at its position on the projection whiteboard is abnormal. Even then, the button function can continue to be used without maintaining or replacing the whole projection whiteboard, which reduces the maintenance cost of the projection whiteboard.
Drawings
FIG. 1 is a flow chart of a button function binding method according to a first embodiment;
FIG. 2 is a schematic diagram of an application environment for a button function binding method;
FIG. 3A is a schematic diagram of a user clicking on a projection whiteboard when the projection distance is correct;
FIG. 3B is a schematic diagram of a user clicking on a projection whiteboard when the projection distance is incorrect;
FIG. 4 is a schematic diagram of a projected whiteboard having button identifiers, under an embodiment;
FIG. 5 is a diagram of a scenario of a touch prompt of an embodiment;
FIG. 6 is a flowchart of a button function calling method according to the second embodiment;
FIG. 7 is a schematic structural diagram of a button function binding apparatus according to a third embodiment;
FIG. 8 is a schematic structural diagram of a button function calling apparatus according to a fourth embodiment;
FIG. 9 is a schematic structural view of a projection control apparatus according to a fifth embodiment;
fig. 10 is a schematic structural diagram of a button function call system according to a seventh embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
Example one
Fig. 1 is a flowchart of a method for binding a button function according to a first embodiment of the present application. FIG. 2 is a schematic diagram of an application environment for a button function binding method. The button function binding method provided in this embodiment may be executed by the projection control end 210 in fig. 2, where the projection control end 210 may be implemented in a software and/or hardware manner, and the projection control end 210 may be formed by two or more physical entities, or may be formed by one physical entity. The projection control terminal 210 may be a terminal for controlling projection, such as a laptop, a desktop, a mobile phone, a tablet, or an interactive smart tablet.
The projector 220 may be an electronic device with a projection function, the projector 220 and the projection control terminal 210 may be connected by a wire/wireless connection, and the projector 220 may receive a projection image sent by the projection control terminal 210 and project the projection image to the projection whiteboard 230.
The projection whiteboard 230 may be an electronic whiteboard with touch sensing function. In practical application, the projection whiteboard may be specifically an electronic whiteboard provided with an infrared touch screen, an electromagnetic touch screen, a capacitive touch screen, and other touch screens. The projection whiteboard 230 and the projection control terminal 210 may be connected by wire/wireless.
First, in actual use, an electronic whiteboard generally has both a display function and a touch sensing function, but a user may use it only as a surface for displaying a projected picture, without using its built-in display function. In the above scenario, the electronic whiteboard used for projection in this way is the projection whiteboard 230 described above.
The projection whiteboard 230 can sense a user's touch operation through its touch sensing function and generate a touch coordinate according to the sensed operation. The touch coordinate is determined by the position touched: a plane coordinate system can be constructed on the projection whiteboard 230, and the coordinates of the touched position in that coordinate system are the touch coordinates.
When the projection distance and the projection position between the projector 220 and the projection whiteboard 230 are correct, the picture projected by the projector 220 matches the size of the projection whiteboard 230. The touch coordinates generated from touch operations sensed by the projection whiteboard then correspond to the coordinates of each position on the projected picture, that is, on the picture displayed by the projection control end 210. Therefore, when the projection distance and position are correct, a touch operation by the user on the projection whiteboard 230 can simulate a click operation or touch operation on the picture displayed by the projection control end 210.
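The coordinate correspondence described above can be sketched as a simple linear scaling. This is an illustrative reconstruction, not code from the patent; the function name and parameters are assumptions.

```python
def whiteboard_to_screen(x, y, board_w, board_h, screen_w, screen_h):
    """Map a touch coordinate sensed on the projection whiteboard to the
    corresponding coordinate on the projection control end's displayed
    picture, assuming the projected picture exactly fills the whiteboard
    (i.e., the projection distance and position are correct)."""
    return (x * screen_w / board_w, y * screen_h / board_h)

# A click at the center of a 1200x800 whiteboard maps to the center
# of a 1920x1080 display.
print(whiteboard_to_screen(600, 400, 1200, 800, 1920, 1080))
```

When the projected picture does not fill the whiteboard (the mismatch case discussed below), this simple scaling no longer holds, which is exactly why the mapped coordinate can land on the wrong icon.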
Fig. 3A is a schematic diagram of a user clicking on the projection whiteboard when the projection distance is correct. As shown in the figure, when the projection distance is correct, the picture projected by the projector 220 matches the size of the projection whiteboard 230. The projected picture includes an annotation function icon 1 and a recording function icon 2. When the user clicks the annotation function icon 1 on the projection whiteboard 230, the projection whiteboard 230 generates a touch coordinate (x1, y1) according to the sensed touch operation and sends (x1, y1) to the projection control end 210. The projection control end 210 determines that the coordinate corresponding to (x1, y1) on its displayed picture is (x1U, y1U), performs a click operation or touch operation on the annotation function icon 1 at (x1U, y1U), and thereby calls the annotation function.
However, in practical applications, the projection distance or the projection position between the projector 220 and the projection whiteboard 230 may be inappropriate, which may cause a problem that the projection screen of the projector 220 is not matched with the projection whiteboard, making it difficult for the user to invoke the required program function.
Fig. 3B is a schematic diagram of a user clicking on the projection whiteboard when the projection distance is incorrect. As shown, when the projection distance is too short, the picture projected by the projector 220 is smaller than the projection whiteboard 230. The solid-line box in the projection whiteboard 230 shows the picture projected by the projector 220 when the projection distance is too short; for comparison, the dashed box indicates the picture when the projection distance is correct, and the solid-box picture is smaller than the dashed-box picture. When the user clicks icon 1 on the projection whiteboard 230, the projection whiteboard 230 generates a touch coordinate (x2, y2) according to the sensed touch operation and sends (x2, y2) to the projection control end 210. The projection control end 210 determines that the coordinate corresponding to (x2, y2) on its displayed picture is (x2U, y2U) and performs a click operation or touch operation at (x2U, y2U), which falls on the recording function icon 2, thereby calling the recording function. The user intended to call the annotation function but called the recording function instead.
When the projection position between the projector 220 and the projection whiteboard 230 is incorrect, the same problem of calling the wrong function occurs, and it is not described again here.
Because an incorrect projection distance or projection position makes it difficult for the user to call the required program function, and because in the prior art the user cannot control the projection control end 210 to call a program function by touching the projection whiteboard 230 but can only operate the projection control end 210 directly, user operation is inconvenient. To solve this problem, the first embodiment of the present application provides a button function binding method. Taking its application to the projection control end 210 as an example, and referring to fig. 1, the method may specifically include:
step S110, obtaining a button position of the projection whiteboard, wherein the projection whiteboard has a touch sensing function, and the button position is obtained by binding touch operation according to the function sensed by the projection whiteboard.
The button position may be the position on the projection whiteboard of a button used to call a certain program function. For example, an identifier of a button may be provided on the projection whiteboard 230, and accordingly, the position of that identifier on the projection whiteboard 230 is the button position. The button position may be expressed as a set of coordinates (x1, y1), (x2, y2) … (xN, yN), or as a start-point coordinate (x1, y1) together with size parameters such as a height H and a width W.
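The two representations of a button position mentioned above (a set of edge coordinates, or a start point plus size parameters) can be sketched as follows. This is an illustrative sketch; the class and field names are assumptions, not from the patent.

```python
from dataclasses import dataclass


@dataclass
class ButtonPosition:
    """Start-point coordinate (x, y) plus size parameters, the second
    representation described in the text."""
    x: float
    y: float
    width: float
    height: float

    @classmethod
    def from_points(cls, points):
        """Build a position from a set of touch coordinates
        (x1, y1), (x2, y2) ... (xN, yN) sensed on the button's edge,
        i.e. the first representation, reduced to a bounding box."""
        xs = [px for px, _ in points]
        ys = [py for _, py in points]
        return cls(min(xs), min(ys), max(xs) - min(xs), max(ys) - min(ys))
```

For example, clicking the four corners of a 5 cm square identifier yields a start point at the top-left corner with equal width and height.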
The function-binding touch operation is an operation in which the user touches the projection whiteboard 230 during the button-function binding process in order to obtain the button position.
In a specific implementation, during button-function binding, the user may perform a touch operation such as a click or a press at any position on the projection whiteboard 230. For ease of explanation, this touch operation is named the function-binding touch operation.
In practical applications, in order to show the user where a button is on the projection whiteboard 230 and which function it corresponds to, a button identifier may be pasted onto the projection whiteboard 230. A button identifier may be of any shape, such as square, rectangular, or circular, and of any size, for example a 5 cm × 5 cm square. It may be placed anywhere on the projection whiteboard 230, for example in the upper left corner, and there may be any number of identifiers. Each identifier may be printed with an icon and/or the name of its function.
Fig. 4 is a schematic diagram of button identifiers on a projection whiteboard in one embodiment. As shown, square stickers can be pasted on the projection whiteboard 230 as button identifiers. Each sticker may be printed with an icon and the name of its button function; in the figure, button 231 is the "select" function and button 232 is the "eraser" function.
When setting the button position, the user can perform a touch operation such as clicking or pressing on the edge of the button mark. For example, for a square button label, the user can click on each point on the edge of the button label.
In a specific implementation, the projection whiteboard 230 may sense the user's function-binding touch operation and generate touch coordinates according to the sensed operation. The projection whiteboard 230 may send the touch coordinates to the projection control end 210. The projection control end 210 may have custom-button configuration software installed; after receiving the touch coordinates, it may obtain the button position from the touch coordinates through this software.
For example, the user may click each point on the edge of the button identifier. The projection whiteboard 230 senses the user's clicks and generates a series of touch coordinates (x1, y1), (x2, y2) … (xN, yN), which it sends to the projection control end 210. The projection control end 210 may then use the touch coordinates (x1, y1), (x2, y2) … (xN, yN) as the button position.
There may be various specific embodiments of obtaining the button position according to the function-binding touch operation sensed by the projection whiteboard.
For example, for a square button label, the user can click on each point on the edge of the button label. The projection whiteboard 230 senses a user's click and generates touch coordinates. The projection control end 210 may determine a vertex coordinate according to the touch coordinate, calculate a button height and a button width according to the vertex coordinate, and use the vertex coordinate, the button height, and the button width as the button position.
For another example, for a circular button mark, the user may press the center of the button mark, and the projection whiteboard 230 senses the user's press to generate touch coordinates. The projection control end 210 may determine a button circle center according to the touch coordinates, and use the button circle center and a preset button radius as the button position.
For another example, the projection whiteboard generates a touch coordinate according to the sensed function binding touch operation, determines a button position according to the touch coordinate, and then sends the button position to the projection control end 210.
A person skilled in the art may select a specific implementation manner of obtaining the position of the button according to the function binding touch operation sensed by the projection whiteboard according to actual needs, which is not limited in the embodiment of the present application.
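As one illustrative sketch of these alternatives, the circular-identifier case can look like the following. The function names and the default radius value are assumptions for illustration; the patent only states that the sensed press gives the circle center and that the radius is preset.

```python
import math


def circle_button_position(center, radius=40.0):
    """Button position for a circular button identifier: the sensed press
    gives the circle center; the radius is a preset parameter (the default
    value here is an assumption)."""
    return {"cx": center[0], "cy": center[1], "r": radius}


def in_circle(touch, position):
    """True if a touch coordinate falls inside the circular button."""
    dx = touch[0] - position["cx"]
    dy = touch[1] - position["cy"]
    return math.hypot(dx, dy) <= position["r"]
```

A single press at the identifier's center is thus enough to define the whole button region, in contrast to the square case, which needs several edge points.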
Step S120: acquiring a button function and establishing a binding relationship between the button position and the button function, wherein the binding relationship is used to call, according to a function-calling touch operation sensed by the projection whiteboard, the button function bound to the button position.
The button function may include various program functions on the projection control terminal 210, and the button function may be embodied as an identifier corresponding to a certain program function. The program function may specifically be a function of starting a browser, starting electronic whiteboard software, shortcut key operation, sending a global system message, sending a hardware message to a specific device port, and the like.
The function-calling touch operation may be an operation in which the user touches the projection whiteboard during projection in order to call a button function.
In a specific implementation, the projection control terminal 210 may obtain the button function, and then bind the button position with the button function to establish a binding relationship between the button position and the button function.
There are various specific implementations for obtaining the button function. For example, the user may write a function command for calling a button function and input the function command to the projection control end 210. For another example, the projection control end 210 may preset a number of candidate button functions for the user to select from, and the user may select one or more of them. Those skilled in the art can obtain the button function in various specific ways according to actual needs, and the embodiments of the present application do not limit this.
There are many specific ways to establish the binding relationship between the button position and the button function. For example, the button position may specifically be a coordinate and the button function a function command; the projection control terminal 210 may write the coordinate and the function command into a binding relationship configuration file and store the file locally, so that the binding relationship between the button position and the button function is established and recorded by the configuration file. Those skilled in the art may adopt various specific ways to establish the binding relationship according to actual requirements, which is not limited in the embodiments of the present application.
During projection, when a user performs a function call touch operation on the projection whiteboard 230, the projection whiteboard 230 generates a touch coordinate according to the sensed function call touch operation and sends the touch coordinate to the projection control terminal 210. The projection control terminal 210 determines that the touch coordinate matches a button position, and calls the button function bound with that button position according to the binding relationship between the button position and the button function.
For example, a sticker identifying a "start browser button" is affixed to the projection whiteboard 230. During projection, the user clicks the sticker, and the projection whiteboard 230 senses the click, generates a touch coordinate (x1, y1), and transmits the touch coordinate (x1, y1) to the projection control terminal 210. The projection control terminal 210 determines that the touch coordinate (x1, y1) matches a certain button position, and determines that this button position is in a binding relationship with the function of starting the browser, so the projection control terminal 210 calls the program function of starting the browser. After the projection control terminal 210 starts the browser, its display screen shows the browser interface, which can be projected onto the projection whiteboard 230 through the projector 220.
According to the technical solution of this embodiment, the button position on the projection whiteboard is obtained according to the function binding touch operation sensed by the projection whiteboard, and a binding relationship between the button function and the button position is established. During projection, when the projection whiteboard senses a function call touch operation at the button position, the button function bound with that position can be called according to the binding relationship. Therefore, the user can call a program function of the projection control terminal through a touch operation on the projection whiteboard without operating the projection control terminal directly, which improves the convenience of calling program functions.
Further, since the button position is determined by the user's function binding touch operation on the projection whiteboard and the button function is then bound to that position, any button function can be bound to any position on the projection whiteboard through a simple touch operation. The user's requirement for custom buttons is met without the user having to write complex custom button code, which reduces the cost of custom buttons. Moreover, because a custom button can be created through a simple touch operation, the user can adjust the position and function of a button at any time as needed, which improves the flexibility of custom buttons.
Further, since any button function can be bound to any position on the projection whiteboard, when the touch sensing function at a button position where a button function has been bound becomes abnormal, the user can bind the button function to another position to obtain a new button position. This solves the problem of a custom button becoming unusable due to an abnormal touch sensing function at its position. Moreover, even if the touch sensing function at a button position is abnormal, the button function can continue to be used without maintaining or replacing the entire projection whiteboard, which reduces the maintenance cost of the projection whiteboard.
Optionally, the step S110 includes:
receiving a function binding touch coordinate of the projection white board; the function binding touch coordinate is generated by the projection whiteboard according to the function binding touch operation; and generating the button position according to the function binding touch coordinate.
The function-bound touch coordinates may be coordinates generated according to a function-bound touch operation performed by the user on the projection whiteboard 230.
In a specific implementation, the projection whiteboard 230 may sense a function binding touch operation of a user, generate a touch coordinate according to the sensed function binding touch operation, and send the touch coordinate to the projection control end 210.
According to the technical solution of this embodiment, the projection whiteboard generates the function binding touch coordinate according to the function binding touch operation, and the projection control terminal generates the button position according to that coordinate. Therefore, the projection whiteboard only needs a simple touch sensing function to meet the user's requirement for custom buttons, which reduces the design cost and hardware cost of the projection whiteboard.
Optionally, the generating the button position according to the function binding touch coordinate includes:
determining vertex coordinates in the function binding touch coordinates; calculating the coordinate distance between the vertex coordinates to obtain the height and the width of the button; generating the button position; the button position includes the vertex coordinates, the button height, and the button width.
Wherein the vertex coordinates may be coordinates of one or more vertices in the rectangular button. For example, the coordinates of the vertex at the lower right corner of the button, and the coordinates of the vertex at the upper left corner of the button.
Wherein the button height may be the height of the button. The button width may be the width of the button.
Wherein, the coordinate distance may be a distance of the vertex coordinate in a two-dimensional coordinate system.
In a specific implementation, a user can touch at multiple points on the projection whiteboard 230, thereby obtaining multiple function-bound touch coordinates. For the plurality of function-bound touch coordinates, the projection control end 210 may determine one or more vertex coordinates. For example, for a rectangular button, the coordinates of the four vertices of the rectangle may be determined.
After determining the vertex coordinates, the coordinate distance between the vertex coordinates may be calculated, resulting in a button height and a button width. Finally, the vertex coordinates, button height, button width may be taken as button positions.
For example, for a rectangular button, the coordinates of the vertices of the upper left corner, the lower left corner, and the lower right corner of the rectangle may be determined as vertex coordinates. Calculating the coordinate distance between the vertexes of the upper left corner and the lower left corner to obtain the height of the rectangle as the height of the button; and calculating the coordinate distance between the vertexes of the upper left corner and the upper right corner to obtain the width of the rectangle as the width of the button.
For another example, for a rectangular button, the coordinates of the vertices of the upper left corner and the lower right corner of the rectangle may be determined as vertex coordinates. And calculating the longitudinal coordinate distance between the vertexes through the y-axis coordinates of the vertexes of the upper left corner and the lower right corner to obtain the height of the rectangle as the height of the button. And calculating the transverse coordinate distance between the vertexes through the x-axis coordinates of the vertexes of the upper left corner and the lower right corner to obtain the width of the rectangle as the width of the button.
In practical application, the process of obtaining the vertex coordinates, button height, and button width from a plurality of function binding touch coordinates may be understood as solving the circumscribed rectangle of those coordinates: take the plurality of function binding touch coordinates as a set PL of coordinate points, and solve the circumscribed rectangle R of the set PL, where R comprises its vertex coordinates, height, and width.
First, the coordinates of the top-left vertex and the bottom-right vertex are calculated from the set PL. Assuming PLi is an element of the set PL, MinX, MinY, MaxX, and MaxY can be calculated by the following formulas:
MinX=Min(PLi.X);MinY=Min(PLi.Y);MaxX=Max(PLi.X);MaxY=Max(PLi.Y)
then, the coordinates of the top left corner vertex and the coordinates of the bottom right corner vertex are calculated by the following formulas:
LeftTopPoint=(MinX,MinY);RightBottomPoint=(MaxX,MaxY)
the circumscribed rectangle R can be represented as the coordinates of the top left corner vertex and the coordinates of the bottom right corner vertex:
R=(LeftTopPoint,RightBottomPoint)
finally, the height and width of the rectangle are calculated by the following formula:
R.Width=RightBottomPoint.X-LeftTopPoint.X
R.Height=RightBottomPoint.Y-LeftTopPoint.Y
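The formulas above can be sketched as follows (an illustrative Python sketch; the patent's own code examples are in C#, and the function name here is an assumption):

```python
def bounding_rect(points):
    """Solve the circumscribed rectangle R of the point set PL:
    returns the top-left vertex, the width, and the height."""
    min_x = min(p[0] for p in points)  # MinX = Min(PLi.X)
    min_y = min(p[1] for p in points)  # MinY = Min(PLi.Y)
    max_x = max(p[0] for p in points)  # MaxX = Max(PLi.X)
    max_y = max(p[1] for p in points)  # MaxY = Max(PLi.Y)
    left_top = (min_x, min_y)          # LeftTopPoint
    return left_top, max_x - min_x, max_y - min_y  # vertex, R.Width, R.Height
```

For instance, the touch points (1, 2), (5, 3), and (3, 9) yield the top-left vertex (1, 2), a button width of 4, and a button height of 7.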
According to the technical solution of this embodiment, the vertex coordinates are determined from the function binding touch coordinates, the button height and button width are determined from the coordinate distances between the vertex coordinates, and the vertex coordinates, button height, and button width are taken as the button position.
Optionally, the establishing a binding relationship between the button position and the button function includes:
generating function instance code for the button function; generating an instance identifier corresponding to the function instance code; generating array elements; the array element comprises the instance identification and the button position; and adding the array elements to a preset initial array to obtain a binding relation array.
The function instance code may be code for invoking instantiation of a program function on the projection control terminal 210.
The binding relationship array may be an ordered sequence of elements that records the binding relationship between the button position and the button function.
In a specific implementation, a plurality of object instances may be created, i.e., code for calling button functions is generated as function instance code. For example, a configuration file in JSON (JavaScript Object Notation) format may be created, recording the command name and button position corresponding to a button function; a parser is then called to parse the configuration file and create a plurality of object instances, thereby obtaining the function instance code. An object instance may inherit from a Command class; example code for the Command class is as follows:
public class Command
{
    public int Id { get; set; }
    public Button Button { get; set; }
}

public class Button
{
    public int X { get; set; }
    public int Y { get; set; }
    public int Width { get; set; }
    public int Height { get; set; }
}
after generating the function instance code, an instance identification corresponding to the function instance code may be generated accordingly. For example, an ID may be extracted from the function instance code as an instance identification.
Then, an array element containing the instance identifier and the button position is generated, and the array element is added to a preset initial array to obtain a binding relation array. For example, an array L is preset, an array element is formed by the instance identifier ID, the vertex coordinates (x, y), the button height H and the button width W, and the array element is added to the array L to obtain an array L' as the binding relationship array.
Therefore, the binding relationship array comprises a plurality of array elements, each array element records the corresponding relationship between the button position and the instance identifier, the corresponding function instance code can be determined according to the instance identifier, and the function instance code is used for calling the corresponding button function, that is, the binding relationship between the button position and the button function is established.
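The construction of an array element and its addition to the binding relationship array can be sketched as follows (Python for illustration; the patent's own examples are in C#, the dictionary field names mirror the patent's Id/X/Y/Width/Height attributes, and `add_binding` is a hypothetical name):

```python
bindings = []  # the preset initial array L

def add_binding(instance_id, x, y, width, height):
    """Form an array element from the instance identifier and the button
    position, and append it to obtain the binding relationship array L'."""
    element = {"Id": instance_id, "X": x, "Y": y,
               "Width": width, "Height": height}
    bindings.append(element)
    return element
```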
Optionally, the generating a function instance code of the button function includes:
acquiring a function configuration file of the button function; extracting a function name from the function configuration file; determining a function type according to the function name; and analyzing the function configuration file according to the function category to obtain the function instance code.
The function configuration file may be a file for configuring a binding relationship between a button position and a button function.
Wherein the function category may be a category of button functions. For example, functions may be divided into categories such as shortcut keys, launching software, browser open links, sending global system messages, sending hardware messages, and so on.
In a specific implementation, a user may write a configuration file in the projection control end 210, and the projection control end 210 may use the configuration file as a function configuration file. The file format of the function configuration file can be various, and in practical application, the JSON format can be adopted as the file format of the configuration file.
In practical applications, in order to configure the binding relationship between the button function and the button position by using the function profile, a command name and a button position corresponding to the button function may be recorded in the function profile. An example of an initial JSON configuration file is as follows:
{
"X": "x-coordinate of the button",
"Y": "the y-coordinate of the button",
"Width": "width of the button",
"Height": "height of the button",
"Command": "command of button"
}
The values of "X" and "Y" can be obtained from the vertex coordinates, the values of "Width" and "Height" from the button width and button height, and the value of "Command" can be input by the user. The value of "Command" may include a plurality of attributes, such as "Name", "Keys", and "Path"; after the attribute values in the JSON configuration file are filled in, the function configuration file is obtained. The function configuration file may be stored locally at the projection control terminal 210, for example, C:\Program Files (x86)\project File\button.
When the custom button configuration software on the projection control terminal 210 is started, it may run in the background of the projection control terminal 210. While running, it monitors the generation and modification of function configuration files; when a new function configuration file is generated or an existing one is modified, the file is read, thereby obtaining the function configuration file.
After the function profile is obtained, the function name may be extracted from the function profile. For example, from a function configuration file in JSON format, a numerical value of the "Name" field is extracted as a function Name.
The function category can be determined from the extracted function name. For example, in a function configuration file in JSON format, if the value of the "Name" attribute is "MCU", the function category is "send hardware message"; if the value of the "Name" attribute is "ShortKey", the function category is "shortcut key".
After the function type is determined, an analyzer corresponding to the function type can be called to analyze the function configuration file and create a corresponding instance so as to obtain a function instance code.
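The name-based dispatch described above can be sketched as follows (an illustrative Python sketch; the patent's own examples are in C#, and the parser functions shown are simplified stand-ins for the parsers the patent names):

```python
import json

def parse_short_key(cfg):
    # shortcut key category: read the "Keys" attribute
    return {"type": "ShortKey", "keys": cfg["Command"]["Keys"].split(",")}

def parse_launcher(cfg):
    # start-software category: read the "Path" and "Args" attributes
    return {"type": "Excuse", "path": cfg["Command"]["Path"],
            "args": cfg["Command"].get("Args", [])}

PARSERS = {
    "ShortKey": parse_short_key,
    "Excuse": parse_launcher,
}

def parse_function_config(text):
    """Extract the function name from the configuration file and
    dispatch to the parser matching the function category."""
    cfg = json.loads(text)
    return PARSERS[cfg["Command"]["Name"]](cfg)
```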
According to the technical scheme of the embodiment of the application, the button position and the button function are bound in a configuration file mode, the button position and the button function can be adjusted in a configuration file modification mode, a user does not need to perform complex operation, and convenience of user-defined buttons is improved.
Optionally, the function category is a shortcut key category, and the analyzing the function configuration file according to the function category to obtain the function instance code includes:
calling a shortcut key function analyzer to analyze the function configuration file to obtain keyboard keys; generating the function instance code; the function instance code includes the keyboard keys.
In specific implementation, when the function type is a shortcut key type, a shortcut key function parser may be called to parse the function configuration file to obtain the keyboard keys. Then, a function instance code containing the keyboard keys is generated. For example, information of the function profile record in JSON format is as follows:
{
"X": "x-coordinate of the button",
"Y": "the y-coordinate of the button",
"Width": "width of the button",
"Height": "height of the button",
"Command":
{
"Name":"ShortKey",
"Keys":"Ctrl,S"
}
}
According to the value "ShortKey" of "Name", the function category can be determined to be the shortcut key category, and the shortcut key function parser is called to parse the function configuration file. During parsing, the value of the "Keys" attribute is read to obtain "Ctrl" and "S" as the keyboard keys; the key combination Ctrl+S implements the file-save shortcut function. After the function configuration file is parsed, a function instance code is created as follows:
public class ShortKeyCommand : Command
{
    public string[] Keys { get; set; }
}
Executing the function instance code implements the quick file-save function.
Of course, in practical application, different keyboard keys and combinations can be set to implement corresponding shortcut functions. For example, "Win+D" displays the desktop, "Alt+F4" closes the window, and "PageUp" and "PageDown" page through PPT courseware. Those skilled in the art may set different keyboard keys and combinations according to actual needs.
It should be noted that the button function implementing a shortcut key is realized by calling a global keyboard simulator, which simulates the keyboard key presses in software.
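The reading of the "Keys" attribute into individual keyboard keys can be sketched as follows (Python for illustration; the patent's own examples are in C#, and `parse_keys` is a hypothetical helper name):

```python
def parse_keys(keys_value: str):
    """Split the comma-separated "Keys" value into the individual
    keyboard keys that the global keyboard simulator presses together."""
    return [key.strip() for key in keys_value.split(",")]
```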
Optionally, the function type is a software starting type, and the analyzing the function configuration file according to the function type to obtain the function instance code includes:
calling a starting software function analyzer to analyze the function configuration file to obtain a software absolute path and a starting parameter; generating the function instance code; the function instance code includes the software absolute path and the startup parameters.
In specific implementation, when the function type is the start software type, a start software function analyzer can be called to analyze the function configuration file to obtain the software absolute path and the start parameters. Then, function instance code is generated that contains the software absolute path and startup parameters. For example, information of the function profile record in JSON format is as follows:
{
"X": "x-coordinate of the button",
"Y": "the y-coordinate of the button",
"Width": "width of the button",
"Height": "height of the button",
"Command":
{
"Name":"Excuse",
"Path":"C:\Program Files(x86)\Seewo\EasiNote5\swenlauncher\swenlauncher.exe",
"Args":
[
"display"
]
}
}
According to the value "Excuse" of "Name", the function category can be determined to be the start-software category, and the start-software function parser is called to parse the function configuration file. During parsing, the value of the "Path" attribute, C:\Program Files(x86)\Seewo\EasiNote5\swenlauncher\swenlauncher.exe, is read as the absolute path of the whiteboard software, and the value "display" of "Args" is read as a startup parameter of the whiteboard software.
After the function configuration file is analyzed, a function instance code is created as follows:
public class Excuse : Command
{
    /// <summary>
    /// C:\Program Files(x86)\Seewo\EasiNote5\swenlauncher\swenlauncher.exe
    /// </summary>
    public string Path { get; set; }

    /// <summary>
    /// display
    /// </summary>
    public string[] Args { get; set; }
}
Executing the function instance code starts the whiteboard software, which enters the teaching mode according to the startup parameter "display".
It should be noted that the start-software function starts the software by calling the process launcher provided by the system. The process launcher also supports passing parameters to the started software; if the start-software command carries parameters, they are passed on at startup.
It should further be noted that, in practical applications, custom and more complex functions can be implemented in combination with the passed-in parameters. For example, suppose a user needs a button function that downloads a document from a certain website and then opens it with Word software. To implement this, a utility program may be developed: the value of the "Path" attribute is set to the path of the utility program, one value of "Args" is set to the website from which the document is downloaded, and another value of "Args" is set to the path of the Word software. In this way, the functions can be extended to meet custom and complex requirements.
Optionally, the function category is a browser open link category, and the analyzing the function configuration file according to the function category to obtain the function instance code includes:
calling a link analyzer to analyze the function configuration file to obtain a link website; generating the function instance code; the function instance code includes the link website.
In specific implementation, when the function type is a browser open link type, a link parser may be called to parse the function configuration file to obtain a link website. Then, a function instance code containing the linked web address is generated.
For example, information of the function profile record in JSON format is as follows:
{
"X": "x-coordinate of the button",
"Y": "the y-coordinate of the button",
"Width": "width of the button",
"Height": "height of the button",
"Command":
{
"Name":"UrlLauncher",
"Url":"http://seewo.com"
}
}
According to the value "UrlLauncher" of "Name", the function category can be determined to be the browser-open-link category, and the link parser is called to parse the function configuration file. During parsing, the value of the "Url" attribute, http://seewo.com, is read as the link website.
After the function configuration file is analyzed, a function instance code is created as follows:
public class UrlLauncher : Command
{
    /// <summary>
    /// http://seewo.com
    /// </summary>
    public string Url { get; set; }
}
Executing the function instance code accesses the website http://seewo.com through a browser.
It should be noted that invoking the browser is done by calling the process launcher provided by the system to start the system's default browser, with the link passed as a parameter so that the browser opens it.
Optionally, the function type is a type of sending a global system message, and the analyzing the function configuration file according to the function type to obtain the function instance code includes:
calling a system message analyzer to analyze the function configuration file to obtain a global system message; generating the function instance code; the function instance code includes the global system message.
In a specific implementation, when the function type is the type of sending the global system message, a system message parser may be called to parse the function configuration file to obtain the global system message. Then, function instance code is generated that contains the global system message. For example, information of the function profile record in JSON format is as follows:
{
"X": "x-coordinate of the button",
"Y": "the y-coordinate of the button",
"Width": "width of the button",
"Height": "height of the button",
"Command":
{
"Name":"MSG",
"WindowsMessage":"WM_PAINT"
}
}
According to the value "MSG" of "Name", the function category can be determined to be the send-global-system-message category, and the system message parser is called to parse the function configuration file. During parsing, the value "WM_PAINT" of the "WindowsMessage" attribute is read as the global system message. After the function configuration file is parsed, a function instance code is created as follows:
public class WinMsg : Command
{
    /// <summary>
    /// WM_PAINT
    /// </summary>
    public int Msg { get; set; }
}
During projection, a software operation may become stuck, requiring a forced refresh to restore normal operation. Executing the function instance code sends a Windows global system message, implementing the forced software refresh.
It should be noted that the send-global-system-message function is implemented by directly sending a global Windows message.
Optionally, the function type is a hardware message sending type, and the analyzing the function configuration file according to the function type to obtain the function instance code includes:
calling a hardware message analyzer to analyze the function configuration file to obtain message content, message length, hardware identification and a sending port; generating the function instance code; the function instance code includes the message content, the message length, the hardware identification, and the send port.
In specific implementation, when the function type is a hardware message sending type, a hardware message parser may be called to parse the function configuration file to obtain information of message content, message length, hardware identifier, a sending port, and the like. Then, a function instance code is generated that contains the message content, the message length, the hardware identification, and the sending port. For example, information of the function profile record in JSON format is as follows:
{
"X": "x-coordinate of the button",
"Y": "the y-coordinate of the button",
"Width": "width of the button",
"Height": "height of the button",
"Command":
{
"Name":"MCU",
"PID": "PID (Product ID, device identification code) of the USB (Universal Serial Bus) device",
"VID": "VID (Vendor ID, vendor identification code) of the USB device",
"KEY": "sending port",
"Message": "message content",
"Length": "message length"
}
}
According to the value "MCU" of "Name", the function category can be determined to be the send-hardware-message category, and the hardware message parser is called to parse the function configuration file. During parsing, the values of "PID", "VID", "KEY", "Message", and "Length" are read to obtain the hardware identifier, the sending port, the message content, and the message length. After the function configuration file is parsed, a function instance code is created as follows:
public class Mcu : Command
{
    /// <summary>
    /// PID of the USB device
    /// </summary>
    public int Pid { get; set; }

    /// <summary>
    /// VID of the USB device
    /// </summary>
    public int Vid { get; set; }

    /// <summary>
    /// USB device port
    /// </summary>
    public string Key { get; set; }

    /// <summary>
    /// message content
    /// </summary>
    public string Message { get; set; }

    /// <summary>
    /// message length
    /// </summary>
    public int Length { get; set; }
}
Executing the function instance code can send hardware message to the port of the specific USB device according to PID and VID, so as to realize the function of sending the hardware message to the specific hardware. For example, during the projection process, the projection whiteboard 230 and the projector 220 are connected via USB. By sending a hardware message to projector 220, projector 220 may be controlled to adjust the brightness of the projection.
It should be noted that sending a hardware message first requires parsing the message into a byte array. The parsing method may be to group the message string into pairs of characters, convert each pair of characters into one byte to form the byte array, and then compare the byte array's length with the message length to be sent; if the byte array is shorter, it is padded with 0 until the lengths match. The USB connector is then called to find the corresponding USB device and send the hardware message.
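The parsing steps above can be sketched as follows (Python for illustration; the patent's own examples are in C#, and since it does not specify the character-to-byte conversion, hexadecimal encoding is assumed here):

```python
def message_to_bytes(message: str, length: int) -> bytes:
    """Group the message string into two-character pairs, convert each
    pair into one byte (hexadecimal encoding assumed), and pad with
    zero bytes until the array matches the message length to send."""
    data = bytes(int(message[i:i + 2], 16)
                 for i in range(0, len(message), 2))
    if len(data) < length:
        data += b"\x00" * (length - len(data))
    return data
```

For example, the message string "1A2B" with a message length of 4 yields the bytes 1A 2B 00 00.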
Optionally, the method further comprises: receiving a function calling touch coordinate of the projection whiteboard; the function calling touch coordinate is generated according to the function calling touch operation; traversing array elements in the binding relation array to obtain a target array element; the button position of the target array element is matched with the function calling touch coordinate; and calling the button function according to the target array element.
The function call touch coordinates may be coordinates generated according to a function call touch operation performed by the user on the projection whiteboard 230.
In a specific implementation, during the projection process, a user may perform a touch operation such as clicking, pressing, or the like on a specific position on the projection whiteboard 230.
In practical applications, in order to prompt the user of a specific position of a button on the projection whiteboard 230 and a function corresponding to the button, an identifier of the button may be pasted on the projection whiteboard 230. The user can perform a touch operation such as clicking, pressing, or the like on the mark on the projection whiteboard 230.
The projection whiteboard 230 generates a touch coordinate according to the function call touch operation, and obtains a function call touch coordinate. The projection whiteboard 230 transmits the function call touch coordinates to the projection control terminal 210. The projection control end 210 matches the function call touch coordinate with each array element in the binding relationship array, and takes the array element whose button position matches the function call touch coordinate as a target array element.
For example, the coordinates clicked by the user on the projection whiteboard 230 are (x, y), and when the button position (vertex coordinates (X, Y), button Height, button Width) of the nth array element Ln in the L array satisfies the following formula, it can be determined that the coordinates match the button position:
x ≥ Ln.X && x ≤ Ln.X + Ln.Width && y ≥ Ln.Y && y ≤ Ln.Y + Ln.Height
According to the instance identifier in the target array element, the corresponding function instance code can be determined, and by executing the function instance code, the corresponding button function can be called.
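The traversal and matching above can be sketched as follows (a Python sketch under assumed field names; `ArrayElement` and `instance_id` are illustrative, not names fixed by the embodiment):

```python
from dataclasses import dataclass

@dataclass
class ArrayElement:
    X: float           # vertex (top-left) x coordinate of the button
    Y: float           # vertex (top-left) y coordinate of the button
    Width: float       # button width
    Height: float      # button height
    instance_id: str   # identifier of the bound function instance

def find_target_element(binding_array, x, y):
    """Traverse the binding relation array and return the element whose
    button rectangle contains the touch coordinate (x, y), or None."""
    for element in binding_array:
        if (element.X <= x <= element.X + element.Width
                and element.Y <= y <= element.Y + element.Height):
            return element
    return None
```

The returned element's instance identifier would then be used to look up and execute the function instance code.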
For example, when the user clicks the sticker identifier of "start whiteboard software" on the projection whiteboard 230, the projection control terminal executes the function instance code for starting the whiteboard software, and the projection control terminal 210 can start the whiteboard software. The activated whiteboard software is projected onto the projection whiteboard 230 by the projector 220.
According to the technical scheme of this embodiment of the application, during projection the projection whiteboard sends the function call touch coordinates, generated by sensing the user's function call touch operation, to the projection control end, and the projection control end calls the button function according to the result of matching the function call touch coordinates against the button positions. In this process, the projection whiteboard only needs to generate and send coordinates, without performing complex processing, so the button function can be called conveniently while the design cost and hardware cost of the projection whiteboard are saved.
Optionally, the invoking the button function according to the target array element includes:
acquiring an example identifier in the target array element to obtain a target example identifier; acquiring a function instance code corresponding to the target instance identifier to obtain a target instance code; and executing the target instance code to call the button function corresponding to the target instance code.
In a specific implementation, after the target array element is determined, the projection control end 210 may extract an instance identifier from the target array element, and use the instance identifier as a target instance identifier, and according to the target instance identifier, may extract a corresponding function instance code as a target instance code. Executing the object instance code may call the corresponding button function on the projection control terminal 210.
Optionally, after the receiving the function call touch coordinate of the projection whiteboard, the method further includes:
extracting a touch pressing coordinate and a touch lifting coordinate from the function calling touch coordinate; calculating a coordinate distance between the touch down coordinate and the touch up coordinate; determining that a coordinate distance between the touch down coordinate and the touch up coordinate is smaller than a preset distance threshold; and executing the step of traversing the array elements in the binding relation array to obtain target array elements.
The touch down coordinates may be coordinates generated when the user starts a touch operation. The touch-up coordinates may be coordinates generated when the user ends the touch operation. The coordinate distance may be a distance of the coordinates in a two-dimensional coordinate system.
In a specific implementation, the projection control terminal 210 may obtain a plurality of consecutive function call touch coordinates and determine the touch-down coordinate and the touch-up coordinate among them. For example, each time a function call touch operation is sensed and a function call touch coordinate is generated, the current time is recorded as the touch time of that coordinate; according to the touch times, the function call touch coordinate with the earliest touch time is taken as the touch-down coordinate, and the one with the latest touch time is taken as the touch-up coordinate. The coordinate distance between the touch-down coordinate and the touch-up coordinate is then calculated, and only when the calculated coordinate distance is smaller than the preset distance threshold is the step of traversing the array elements in the binding relation array to obtain the target array element executed.
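The misoperation filter can be sketched as follows (a Python sketch; the Euclidean distance is an assumption, since the embodiment only speaks of a "coordinate distance"):

```python
import math

def should_invoke(touch_down, touch_up, distance_threshold):
    """Return True when the coordinate distance between the touch-down
    and touch-up coordinates is below the preset threshold, i.e. the
    touch is treated as a click/press rather than a slide."""
    dx = touch_up[0] - touch_down[0]
    dy = touch_up[1] - touch_down[1]
    return math.hypot(dx, dy) < distance_threshold
```

Only when this check passes would the binding relation array be traversed for a target element.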
In practical applications, a button of the projection whiteboard 230 may be touched by mistake due to an erroneous operation. For example, when explaining a projected image on the projection whiteboard 230, the user may slide a finger across the projection whiteboard 230, pressing down at one position and lifting up at another; if the finger passes through a button position during the slide, the button function may be triggered unintentionally. Therefore, when the coordinate distance between the touch-down coordinate and the touch-up coordinate is large, it indicates that the user is not performing a touch operation such as clicking or pressing at a certain position of the projection whiteboard but is performing some misoperation, and even if the function call touch coordinates match a button position, the corresponding button function need not be called.
In the technical scheme of the embodiment of the application, the touch down coordinate and the touch up coordinate are determined in the function calling touch coordinate, and when the coordinate distance between the touch down coordinate and the touch up coordinate is smaller than a preset distance threshold, the step of traversing the array elements in the binding relation array to obtain the target array element is started to call the button function according to the target array element, so that the problem that the button function is called mistakenly due to misoperation of a user is avoided.
Optionally, after the receiving the function-bound touch coordinates of the projection whiteboard, the method further includes: generating a touch prompt image; the touch prompt image comprises a touch prompt identifier; the position of the touch prompt identifier on the touch prompt image is determined according to the function binding touch coordinate; sending the touch prompt image to a projector; and the touch prompt image is used for projecting to the projection white board by the projector so as to prompt the user of the position of the function binding touch coordinate on the projection white board through the touch prompt identifier.
The touch prompt image may be an image including a touch prompt identifier. The touch prompt identifier may be an identifier for prompting the user of the position of the function binding touch coordinate on the projection whiteboard 230, for example, a highlighted dot or a red-bordered box.
In a specific implementation, after receiving the function binding touch coordinate of the projection whiteboard 230, the projection control end 210 may generate an image, treat the image as a plane coordinate system, determine in that plane coordinate system an identifier coordinate that is the same as the function binding touch coordinate, and add a touch prompt identifier at the identifier coordinate, thereby obtaining the touch prompt image. The touch prompt image is then sent to the projector 220, and the projector 220 projects it onto the projection whiteboard 230 so that the touch prompt image is displayed there. By viewing the projection whiteboard 230, the user can learn, from the position of the touch prompt identifier in the touch prompt image, the position on the projection whiteboard 230 of the function binding touch coordinate that the projection whiteboard 230 generated according to the sensed function binding touch operation.
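A minimal sketch of the prompt-image generation (Python; the monochrome pixel grid and the dot radius are illustrative stand-ins for the actual image format):

```python
def make_touch_prompt_image(width, height, touch_xy, radius=2):
    """Build a monochrome prompt image (0 = background, 1 = highlight)
    whose highlighted dot marks the function binding touch coordinate."""
    tx, ty = touch_xy
    image = [[0] * width for _ in range(height)]
    # Draw a filled dot of the given radius around the touch coordinate.
    for y in range(max(0, ty - radius), min(height, ty + radius + 1)):
        for x in range(max(0, tx - radius), min(width, tx + radius + 1)):
            if (x - tx) ** 2 + (y - ty) ** 2 <= radius ** 2:
                image[y][x] = 1
    return image
```

The resulting image would be sent to the projector for display on the whiteboard.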
In practical applications, if the function-bound touch coordinates are not the coordinates required by the user, the user may perform the touch operation again to re-determine the function-bound touch coordinates. For example, in order to prompt the user of the position of the button on the projection whiteboard 230 and the function corresponding to the button, an identifier of the button may be pasted on the projection whiteboard 230. The user needs to click on the identifier so that the function binding touch coordinate generated by the click coincides with the position of the identifier of the button on the projection whiteboard 230. If the function-bound touch coordinates deviate from the identity of the button, the user needs to re-click to re-determine the function-bound touch coordinates.
FIG. 5 is a diagram of a touch prompt scenario, under an embodiment. As shown in the figure, the user performs a function binding touch operation at a certain position on the projection whiteboard 230. The projection whiteboard 230 senses the function binding touch operation, generates a function binding touch coordinate, and sends the function binding touch coordinate to the projection control end 210. The projection control end 210 generates a touch prompt image 233 according to the function binding touch coordinate and sends the touch prompt image 233 to the projector 220, and the projector 220 projects the touch prompt image onto the projection whiteboard 230, so that the touch prompt image 233 is displayed on the projection whiteboard 230. The touch prompt image 233 contains a highlighted dot serving as the touch prompt identifier, through which the user can learn the position, on the projection whiteboard 230, of the function binding touch coordinate generated by the projection whiteboard 230 according to the sensed function binding touch operation.
According to the technical scheme of this embodiment of the application, a touch prompt image containing a touch prompt identifier is generated according to the function binding touch coordinate and sent to the projector, and the projector projects the touch prompt image onto the projection whiteboard. Through the position of the touch prompt identifier on the touch prompt image, the user can learn the position of the function binding touch coordinate on the projection whiteboard, which solves the problem of button position errors caused by the function binding touch coordinate not matching the coordinate the user intended.
In addition, according to the technical scheme of the embodiment of the application, the projection function of the projector is utilized to assist the user in positioning the button position, and the display function does not need to be added on the projection white board, so that the hardware cost and the design cost of the projection white board are saved.
Example two
Fig. 6 is a flowchart of a method for invoking a button function according to a second embodiment of the present application. The button function calling method provided in this embodiment may be executed by the projection control terminal 210 in fig. 2. Specifically, referring to fig. 6, the method for invoking a button function according to the second embodiment of the present application specifically includes:
step S210, obtaining a button position of the projection whiteboard, wherein the projection whiteboard has a touch sensing function, and the button position is obtained by binding touch operation according to the function sensed by the projection whiteboard.
In a specific implementation, in the process of binding a button function, the user may perform a touch operation such as clicking or pressing at any position on the projection whiteboard 230. For ease of explanation, this touch operation is named a function binding touch operation. The projection whiteboard 230 may sense the user's function binding touch operation, generate touch coordinates according to the sensed operation, and send the touch coordinates to the projection control terminal 210. Custom button configuration software may be installed on the projection control end 210; after receiving the touch coordinates, the projection control end 210 may obtain the button position from the touch coordinates through the custom button configuration software.
The button position may be specifically expressed as a set of coordinates (x1, y1), (x2, y2) … (xN, yN), or as one start point coordinate (x1, y1) together with dimensional parameters such as a height H and a width W.
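For a rectangular button the two representations are equivalent; a conversion from the corner-coordinate set to the start point plus dimensions can be sketched as (Python, illustrative):

```python
def corners_to_rect(corners):
    """Convert a set of corner coordinates (x1, y1) ... (xN, yN) of a
    rectangular button into the (start point, width W, height H) form."""
    xs = [p[0] for p in corners]
    ys = [p[1] for p in corners]
    start = (min(xs), min(ys))          # top-left vertex as start point
    return start, max(xs) - min(xs), max(ys) - min(ys)
```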
Step S220, acquiring a button function, and establishing a binding relationship between the button position and the button function.
In a specific implementation, the projection control terminal 210 may obtain the button function, and then bind the button position with the button function to establish a binding relationship between the button position and the button function.
Step S230, receiving a function calling touch coordinate of the projection whiteboard; and the function calling touch coordinates are generated according to the function calling touch operation sensed by the projection whiteboard.
In a specific implementation, the projection whiteboard 230 generates corresponding touch coordinates as function call touch coordinates according to the function call touch operation. The projection whiteboard 230 transmits the function call touch coordinates to the projection control terminal 210.
Step S240, determining that the function calling touch coordinate matches the button position, and calling a button function bound to the button position.
In a specific implementation, the function call touch coordinates can be matched against the plurality of button positions, and when a button position matching the touch coordinates exists, the button function bound to that button position is called.
For example, when the button position is the start point coordinate (x1, y1) together with the height H and the width W, a coordinate range can be determined; when the touch coordinate (xN, yN) falls within that coordinate range, the touch coordinate is determined to match the button position.
In practical application, an array of button positions and button functions can be pre-established, and the button positions matched with the touch coordinates can be determined in a mode of traversing the array. The way of establishing an array and traversing the array has been described in detail in the first embodiment, and is not described herein again.
According to the technical scheme of the embodiment of the application, the button positions of the projection whiteboard are obtained by binding the touch operation according to the functions sensed by the projection whiteboard, and the binding relation between the button functions and the button positions is established. When projection is carried out, according to the binding relation between the button functions and the button positions, when the projection white board senses function calling touch operation aiming at the button positions, the button functions bound with the button positions can be called. Therefore, the user can call the program function of the projection control terminal through touch operation on the projection white board without operating on the projection control terminal, and convenience of calling the program function by the user is improved.
Further, the button position is determined through the user's function binding touch operation on the projection whiteboard, and the button function is bound to that button position, so that any button function can be bound to any position on the projection whiteboard through a simple touch operation. The user's requirement for custom buttons is met without the user having to write complex custom button code, which reduces the cost of custom buttons. Moreover, since a custom button can be realized through a simple touch operation, the user can adjust the position and function of a button as required at any time, improving the flexibility of custom buttons.
Further, since any button function can be bound to any position on the projection whiteboard, when the touch sensing function of the button position on the projection whiteboard where the button function has been bound is abnormal, the user can bind the button function to another position again to obtain a new button position. Therefore, the problem that the user-defined button cannot be used due to the abnormal touch induction function of the button position on the projection white board is solved. Moreover, even if the touch sensing function of the button position on the projection white board is abnormal, the button function can be continuously used without maintaining or replacing the whole projection white board, and the maintenance cost of the projection white board is reduced.
Optionally, the establishing a binding relationship between the button position and the button function includes:
generating function instance code for the button function; generating an instance identifier corresponding to the function instance code; generating array elements; the array element comprises the instance identification and the button position; and adding the array elements to a preset initial array to obtain a binding relation array.
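The steps above can be sketched as follows (Python; the UUID instance identifier and the dictionary-shaped array element are assumptions — the embodiment only requires that the element contain the instance identifier and the button position):

```python
import uuid

# Registry mapping instance identifier -> function instance code
# (a Python callable stands in here for the generated code).
instance_registry = {}

def add_binding(binding_array, button_position, function_instance):
    """Generate an instance identifier for the function instance, then
    append an array element containing the identifier and the button
    position to the binding relation array."""
    instance_id = str(uuid.uuid4())
    instance_registry[instance_id] = function_instance
    binding_array.append({"instance_id": instance_id,
                          "button_position": button_position})
    return instance_id
```

At call time, the identifier stored in the matched array element would be used to look up and execute the registered instance.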
For the above steps, since the specific implementation manner and beneficial effects have been described in detail in the first embodiment, no further description is given here.
Optionally, the generating a function instance code of the button function includes:
acquiring a function configuration file of the button function; extracting a function name from the function configuration file; determining a function type according to the function name; and analyzing the function configuration file according to the function category to obtain the function instance code.
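The name-to-category step can be sketched as (Python; the JSON configuration format and the concrete name/category values are illustrative assumptions):

```python
import json

# Assumed mapping from function name (found in the configuration file)
# to function category; the concrete strings are illustrative.
NAME_TO_CATEGORY = {
    "shortcut": "shortcut_key",
    "start_software": "start_software",
    "open_link": "browser_open_link",
    "system_message": "send_global_system_message",
    "hardware_message": "send_hardware_message",
}

def determine_category(config_text):
    """Extract the function name from the configuration file and
    determine the function category from it."""
    config = json.loads(config_text)
    return NAME_TO_CATEGORY[config["name"]], config
```

The configuration would then be parsed by the parser for the determined category to obtain the function instance code.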
For the above steps, since the specific implementation manner and beneficial effects have been described in detail in the first embodiment, no further description is given here.
Optionally, the step S240 includes:
traversing array elements in the binding relation array to obtain a target array element; the button position of the target array element is matched with the function calling touch coordinate; and calling the button function according to the target array element. For the above steps, since the specific implementation manner and beneficial effects have been described in detail in the first embodiment, no further description is given here.
Optionally, the invoking the button function according to the target array element includes:
acquiring an example identifier in the target array element to obtain a target example identifier; acquiring a function instance code corresponding to the target instance identifier to obtain a target instance code; and executing the target instance code to call the button function corresponding to the target instance code. For the above steps, since the specific implementation manner and beneficial effects have been described in detail in the first embodiment, no further description is given here.
It should be understood that although the steps in the flowcharts of fig. 1 and 6 are shown in order as indicated by the arrows, the steps are not necessarily performed in that order. Unless explicitly stated otherwise herein, the order of these steps is not strictly limited, and they may be performed in other orders.
EXAMPLE III
Fig. 7 is a schematic structural diagram of a button function binding apparatus according to a third embodiment of the present application. Referring to fig. 7, the button function binding apparatus provided in this embodiment specifically includes: a location acquisition module 310 and a function acquisition module 320; wherein:
the position acquiring module 310 is configured to acquire a button position of a projection whiteboard, where the projection whiteboard has a touch sensing function, and the button position is obtained by binding a touch operation according to the function sensed by the projection whiteboard;
the function obtaining module 320 is configured to obtain a button function, and establish a binding relationship between the button position and the button function, where the binding relationship is used to invoke a button function bound with the button position according to a function invoking touch operation sensed by the projection whiteboard.
Optionally, the position obtaining module 310 includes: the function binding touch coordinate receiving submodule is used for receiving the function binding touch coordinate of the projection whiteboard; the function binding touch coordinate is generated by the projection whiteboard according to the function binding touch operation; and the position generating submodule is used for generating the button position according to the function binding touch coordinate.
Optionally, the position generation sub-module includes: a vertex coordinate determination unit for determining a vertex coordinate in the function binding touch coordinates; the distance calculation unit is used for calculating the coordinate distance between the vertex coordinates to obtain the height and the width of the button; a position generating unit for generating the button position; the button position includes the vertex coordinates, the button height, and the button width.
Optionally, the function acquiring module 320 includes: the instance code generation submodule is used for generating a function instance code of the button function; the identification generation submodule is used for generating an example identification corresponding to the function example code; the array element generation submodule is used for generating array elements; the array element comprises the instance identification and the button position; and the adding submodule is used for adding the array elements to a preset initial array to obtain a binding relation array.
Optionally, the example code generation submodule includes: the configuration unit is used for acquiring a function configuration file of the button function; a name extraction unit for extracting a function name in the function configuration file; the classification unit is used for determining the function category according to the function name; and the file analyzing unit is used for analyzing the function configuration file according to the function type to obtain the function instance code.
Optionally, the apparatus further comprises: the function calling touch coordinate receiving module is used for receiving the function calling touch coordinates of the projection white board; the function calling touch coordinate is generated according to the function calling touch operation; the array traversing module is used for traversing the array elements in the binding relation array to obtain target array elements; the button position of the target array element is matched with the function calling touch coordinate; and the function calling module is used for calling the button function according to the target array element.
Optionally, the function calling module includes: a target instance identifier obtaining submodule, configured to obtain an instance identifier in the target array element, to obtain a target instance identifier; the target instance code acquisition submodule is used for acquiring a function instance code corresponding to the target instance identifier to obtain a target instance code; and the code execution submodule is used for executing the target instance code so as to call the button function corresponding to the target instance code.
Optionally, the apparatus further comprises: the coordinate extraction module is used for extracting a touch pressing coordinate and a touch lifting coordinate from the function calling touch coordinate; the distance calculation module is used for calculating the coordinate distance between the touch pressing coordinate and the touch lifting coordinate; the distance determining module is used for determining that the coordinate distance between the touch pressing coordinate and the touch lifting coordinate is smaller than a preset distance threshold value; and the step execution module is used for executing the step of traversing the array elements in the binding relation array to obtain the target array elements.
Optionally, the apparatus further comprises: the image generation module is used for generating a touch prompt image; the touch prompt image comprises a touch prompt identifier; the position of the touch prompt identifier on the touch prompt image is determined according to the function binding touch coordinate; the image sending module is used for sending the touch prompt image to a projector; and the touch prompt image is used for projecting to the projection white board by the projector so as to prompt the user of the position of the function binding touch coordinate on the projection white board through the touch prompt identifier.
Optionally, the function type is a shortcut key type, and the file parsing unit is specifically configured to: calling a shortcut key function analyzer to analyze the function configuration file to obtain keyboard keys; generating the function instance code; the function instance code includes the keyboard keys.
Optionally, the function type is a software starting type, and the file parsing unit is specifically configured to: calling a starting software function analyzer to analyze the function configuration file to obtain a software absolute path and a starting parameter; generating the function instance code; the function instance code includes the software absolute path and the startup parameters.
Optionally, the function category is a browser open link category, and the file parsing unit is specifically configured to: calling a link analyzer to analyze the function configuration file to obtain a link website; generating the function instance code; the function instance code includes the link website.
Optionally, the function type is a type of sending a global system message, and the file parsing unit is specifically configured to: calling a system message analyzer to analyze the function configuration file to obtain a global system message; generating the function instance code; the function instance code includes the global system message.
Optionally, the function type is a hardware message sending type, and the file parsing unit is specifically configured to: calling a hardware message analyzer to analyze the function configuration file to obtain message content, message length, hardware identification and a sending port; generating the function instance code; the function instance code includes the message content, the message length, the hardware identification, and the send port.
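The category-specific parsers above can be organized as a registry from function category to parser; a sketch follows (Python; the parsers return callables standing in for the generated function instance code, and the configuration keys are assumptions — three of the five categories are shown):

```python
def parse_shortcut(config):
    keys = config["keys"]                 # keyboard keys, e.g. ["ctrl", "z"]
    return lambda: "press " + "+".join(keys)

def parse_start_software(config):
    path = config["path"]                 # software absolute path
    params = config.get("params", "")     # startup parameters
    return lambda: ("start " + path + " " + params).strip()

def parse_open_link(config):
    url = config["url"]                   # link website
    return lambda: "open " + url

PARSERS = {
    "shortcut_key": parse_shortcut,
    "start_software": parse_start_software,
    "browser_open_link": parse_open_link,
    # system-message and hardware-message parsers would be added analogously
}

def build_function_instance(category, config):
    """Look up the parser for the function category and use it to turn
    the configuration into an executable function instance."""
    return PARSERS[category](config)
```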
Example four
Fig. 8 is a schematic structural diagram of a button function calling device according to a fourth embodiment of the present application. Referring to fig. 8, the button function calling apparatus provided in this embodiment specifically includes: a position acquisition module 410, a function acquisition module 420, a coordinate receiving module 430 and a function calling module 440; wherein
A position obtaining module 410, configured to obtain a button position of a projection whiteboard, where the projection whiteboard has a touch sensing function, and the button position is obtained by binding a touch operation according to a function sensed by the projection whiteboard;
a function obtaining module 420, configured to obtain a button function, and establish a binding relationship between a button position and the button function;
a coordinate receiving module 430, configured to receive the function call touch coordinate of the projection whiteboard; the function calling touch coordinates are generated according to function calling touch operation sensed by the projection whiteboard;
and a function calling module 440, configured to determine that the touch coordinate of the function call matches the button position, and call a button function bound to the button position.
The button function binding device and the button function calling device provided by the above can be used for executing the button function binding method and the button function calling method provided by any of the above embodiments, and have corresponding functions and beneficial effects. For the specific limitations of the button function binding means and the button function calling means, reference may be made to the above limitations of the button function binding method and the button function calling method, which are not described herein again. The respective modules in the button function binding means and the button function calling means may be wholly or partially implemented by software, hardware, or a combination thereof. The modules can be embedded in a hardware form or independent from a processor in the computer device, and can also be stored in a memory in the computer device in a software form, so that the processor can call and execute operations corresponding to the modules.
EXAMPLE five
Fig. 9 is a schematic structural diagram of a projection control apparatus according to a fifth embodiment of the present application. As shown in the drawing, the projection control apparatus includes: a processor 50, a memory 51, a display 52, an input device 53, an output device 54, and a communication device 55. The number of the processors 50 in the projection control apparatus may be one or more, and one processor 50 is illustrated as an example. The number of the memories 51 in the projection control apparatus may be one or more, and one memory 51 is illustrated as an example. The processor 50, the memory 51, the display screen 52, the input device 53, the output device 54, and the communication device 55 of the projection control apparatus may be connected by a bus or other means, and the bus connection is taken as an example in the figure. In an embodiment, the projection control device may be a computer, a mobile phone, a tablet, an interactive smart tablet, or the like.
The memory 51 is a computer readable storage medium, and can be used for storing software programs, computer executable programs, and modules, such as program instructions/modules corresponding to the button function binding and button function calling methods described in any embodiment of the present application. The memory 51 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system and an application program required for at least one function, and the storage data area may store data created according to use of the device, and the like. Further, the memory 51 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid state storage device. In some examples, the memory 51 may further include memory located remotely from the processor 50, which may be connected to the device over a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof. The display screen 52 is a display screen with a touch function, which may be a capacitive screen, an electromagnetic screen, or an infrared screen. In general, the display screen 52 is used for displaying data according to instructions from the processor 50, and is also used for receiving touch operations applied to the display screen 52 and sending corresponding signals to the processor 50 or other devices. Optionally, when the display screen 52 is an infrared screen, it further includes an infrared touch frame disposed around the display screen 52, which may also be configured to receive an infrared signal and send the infrared signal to the processor 50 or other devices.
The communication device 55 is used for establishing a communication connection with other devices, and may be a wired communication device and/or a wireless communication device. The input device 53 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the projection control apparatus; it may also include a camera for acquiring images and a sound pickup apparatus for acquiring audio data. The output device 54 may include an audio device such as a speaker. It should be noted that the specific composition of the input device 53 and the output device 54 may be set according to actual conditions. The processor 50 executes the software programs, instructions, and modules stored in the memory 51, thereby performing various functional applications and data processing of the device, that is, implementing the above-described button function binding method or button function calling method.
Specifically, in the embodiment, when the processor 50 executes one or more programs stored in the memory 51, the following operations are specifically implemented:
acquiring a button position of a projection whiteboard, wherein the projection whiteboard has a touch sensing function, and the button position is obtained according to a function binding touch operation sensed by the projection whiteboard;
and acquiring a button function, and establishing a binding relationship between the button position and the button function, wherein the binding relationship is used for calling the button function bound with the button position according to a function calling touch operation sensed by the projection whiteboard.
On the basis of the above embodiment, the one or more processors 50 also implement the following operations: receiving a function binding touch coordinate of the projection whiteboard, wherein the function binding touch coordinate is generated by the projection whiteboard according to the function binding touch operation; and generating the button position according to the function binding touch coordinate.
On the basis of the above embodiment, the one or more processors 50 also implement the following operations: determining vertex coordinates in the function binding touch coordinates; calculating the coordinate distance between the vertex coordinates to obtain the height and the width of the button; generating the button position; the button position includes the vertex coordinates, the button height, and the button width.
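The position-generation step above can be sketched as follows. This is a hypothetical illustration; the coordinate convention and the field names (x, y, width, height) are assumptions made for the sketch, not the patent's actual data layout:

```python
def button_position_from_corners(corners):
    """Derive a rectangular button position from the touch coordinates
    traced along the button edge during a function binding touch operation.

    corners: list of (x, y) touch coordinates.
    """
    xs = [p[0] for p in corners]
    ys = [p[1] for p in corners]
    left, top = min(xs), min(ys)        # top-left vertex coordinate
    width = max(xs) - left              # coordinate distance -> button width
    height = max(ys) - top              # coordinate distance -> button height
    return {"x": left, "y": top, "width": width, "height": height}
```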
On the basis of the above embodiment, the one or more processors 50 also implement the following operations: generating function instance code for the button function; generating an instance identifier corresponding to the function instance code; generating array elements; the array element comprises the instance identification and the button position; and adding the array elements to a preset initial array to obtain a binding relation array.
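The construction of the binding relation array can be sketched as follows; the element layout and the use of a UUID string as the instance identifier are illustrative assumptions:

```python
import uuid

# The preset initial array of binding relationships, and a lookup table
# from instance identifiers to function instance codes (assumed structures).
binding_array = []
instance_codes = {}

def bind_button(position, function_instance_code):
    """Create an array element pairing an instance identifier with a button
    position, and append it to the binding relation array."""
    instance_id = str(uuid.uuid4())
    instance_codes[instance_id] = function_instance_code
    binding_array.append({"id": instance_id, "position": position})
    return instance_id
```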
On the basis of the above embodiment, the one or more processors 50 also implement the following operations: acquiring a function configuration file of the button function; extracting a function name from the function configuration file; determining a function category according to the function name; and parsing the function configuration file according to the function category to obtain the function instance code.
On the basis of the above embodiment, the one or more processors 50 also implement the following operations: receiving a function calling touch coordinate of the projection whiteboard; the function calling touch coordinate is generated according to the function calling touch operation; traversing array elements in the binding relation array to obtain a target array element; the button position of the target array element is matched with the function calling touch coordinate; and calling the button function according to the target array element.
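The traversal-and-match step can be sketched as a simple rectangular hit test; the element layout (an identifier plus a rectangle) is an illustrative assumption:

```python
def find_target_element(binding_array, x, y):
    """Traverse the binding relation array and return the first element whose
    button position contains the function calling touch coordinate (x, y)."""
    for element in binding_array:
        pos = element["position"]
        if (pos["x"] <= x <= pos["x"] + pos["width"]
                and pos["y"] <= y <= pos["y"] + pos["height"]):
            return element
    return None  # no button bound at this coordinate
```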
On the basis of the above embodiment, the one or more processors 50 also implement the following operations: acquiring the instance identifier in the target array element to obtain a target instance identifier; acquiring the function instance code corresponding to the target instance identifier to obtain a target instance code; and executing the target instance code to call the button function corresponding to the target instance code.
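The lookup-and-execute step can be sketched as follows; modeling the function instance code as a plain Python callable is an assumption made for illustration:

```python
def call_button_function(target_element, instance_codes):
    """Obtain the target instance identifier from the target array element,
    look up the corresponding function instance code, and execute it."""
    target_id = target_element["id"]
    target_code = instance_codes[target_id]
    return target_code()
```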
On the basis of the above embodiment, the one or more processors 50 also implement the following operations: extracting a touch down coordinate and a touch up coordinate from the function calling touch coordinates; calculating the coordinate distance between the touch down coordinate and the touch up coordinate; determining that the coordinate distance is smaller than a preset distance threshold; and then executing the step of traversing the array elements in the binding relation array to obtain the target array element.
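The press/lift distance check, which distinguishes a button tap from a drag or writing stroke, can be sketched as follows; the default threshold value is an illustrative assumption, not a value given in the patent:

```python
import math

def is_button_tap(down, up, threshold=20.0):
    """Return True when the coordinate distance between the touch down and
    touch up points is smaller than the preset distance threshold."""
    return math.hypot(up[0] - down[0], up[1] - down[1]) < threshold
```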
On the basis of the above embodiment, the one or more processors 50 also implement the following operations: generating a touch prompt image, wherein the touch prompt image comprises a touch prompt identifier, and the position of the touch prompt identifier on the touch prompt image is determined according to the function binding touch coordinate; and sending the touch prompt image to a projector, wherein the touch prompt image is used for the projector to project onto the projection whiteboard, so as to indicate to the user, through the touch prompt identifier, the position of the function binding touch coordinate on the projection whiteboard.
Based on the above embodiment, where the function category is a shortcut key category, the one or more processors 50 further implement the following operations: calling a shortcut key function parser to parse the function configuration file to obtain keyboard keys; and generating the function instance code, wherein the function instance code includes the keyboard keys.
Based on the above embodiment, where the function category is a launch software category, the one or more processors 50 further implement the following operations: calling a launch software function parser to parse the function configuration file to obtain a software absolute path and startup parameters; and generating the function instance code, wherein the function instance code includes the software absolute path and the startup parameters.
Based on the above embodiment, where the function category is a browser open link category, the one or more processors 50 further implement the following operations: calling a link parser to parse the function configuration file to obtain a link URL; and generating the function instance code, wherein the function instance code includes the link URL.
Based on the above embodiment, where the function category is a send global system message category, the one or more processors 50 further implement the following operations: calling a system message parser to parse the function configuration file to obtain a global system message; and generating the function instance code, wherein the function instance code includes the global system message.
Based on the above embodiment, where the function category is a send hardware message category, the one or more processors 50 further implement the following operations: calling a hardware message parser to parse the function configuration file to obtain message content, a message length, a hardware identification, and a sending port; and generating the function instance code, wherein the function instance code includes the message content, the message length, the hardware identification, and the sending port.
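The category-specific parsing described in the preceding paragraphs can be sketched as a dispatch table of parsers, one per function category. The configuration format (a flat dictionary), the category names, and the parser outputs are illustrative assumptions; three of the five categories are shown:

```python
def parse_shortcut(cfg):
    # Shortcut key category: the instance code carries the keyboard keys.
    return {"keys": cfg["keys"]}

def parse_launch_software(cfg):
    # Launch software category: absolute path plus startup parameters.
    return {"path": cfg["path"], "args": cfg.get("args", "")}

def parse_browser_link(cfg):
    # Browser open link category: the instance code carries the link URL.
    return {"url": cfg["url"]}

PARSERS = {
    "shortcut": parse_shortcut,
    "launch_software": parse_launch_software,
    "browser_link": parse_browser_link,
}

def build_instance_code(cfg):
    """Determine the function category from the function name, then parse the
    configuration with the matching parser to obtain the instance code."""
    return PARSERS[cfg["name"]](cfg)
```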
On the basis of the above embodiment, the one or more processors 50 also implement the following operations: acquiring a button position of a projection whiteboard, wherein the projection whiteboard has a touch sensing function, and the button position is obtained according to a function binding touch operation sensed by the projection whiteboard; acquiring a button function, and establishing a binding relationship between the button position and the button function; receiving a function calling touch coordinate of the projection whiteboard, wherein the function calling touch coordinate is generated according to a function calling touch operation sensed by the projection whiteboard; and determining that the function calling touch coordinate matches the button position, and calling the button function bound with the button position.
Embodiment Six
A storage medium containing computer-executable instructions, which are used, when executed by a computer processor, for performing a button function binding method, the method comprising: acquiring a button position of a projection whiteboard, wherein the projection whiteboard has a touch sensing function, and the button position is obtained according to a function binding touch operation sensed by the projection whiteboard; and acquiring a button function, and establishing a binding relationship between the button position and the button function, wherein the binding relationship is used for calling the button function bound with the button position according to a function calling touch operation sensed by the projection whiteboard.
The computer-executable instructions, when executed by a computer processor, are further used for performing a button function calling method, the method comprising: acquiring a button position of a projection whiteboard, wherein the projection whiteboard has a touch sensing function, and the button position is obtained according to a function binding touch operation sensed by the projection whiteboard; acquiring a button function, and establishing a binding relationship between the button position and the button function; receiving a function calling touch coordinate of the projection whiteboard, wherein the function calling touch coordinate is generated according to a function calling touch operation sensed by the projection whiteboard; and determining that the function calling touch coordinate matches the button position, and calling the button function bound with the button position.
The storage medium containing computer-executable instructions provided by the embodiments of the present application is not limited to the operations described above; it may also perform the related operations in the button function binding method or the button function calling method provided by any embodiment of the present application, and has the corresponding functions and advantages.
Embodiment Seven
Fig. 10 is a schematic structural diagram of a button function calling system according to a seventh embodiment of the present application. As shown in the drawing, the button function calling system may specifically include an electronic whiteboard 710 and a projection control terminal 720. The electronic whiteboard 710 has a touch sensing function, and the display image of the projection control terminal 720 is projected onto the electronic whiteboard 710 through a projector. The electronic whiteboard 710 is configured to sense a function binding touch operation, generate a function binding touch coordinate according to the function binding touch operation, and send the function binding touch coordinate to the projection control terminal 720. The projection control terminal 720 is configured to obtain a button position of the electronic whiteboard 710 according to the function binding touch coordinate, obtain a button function, and establish a binding relationship between the button position and the button function. The projection control terminal 720 is further configured to call the button function bound to the button position according to the function calling touch operation sensed by the electronic whiteboard 710.
The steps performed by the electronic whiteboard 710 and the projection control terminal 720, their specific implementations, and their beneficial effects have been described in detail in the first and second embodiments, and are not repeated here. The button function calling system provided above can be used to execute the button function binding method and the button function calling method provided by any of the above embodiments, and has the corresponding functions and beneficial effects.

Claims (9)

1. A button function binding method is applied to a projection control terminal, and comprises the following steps:
acquiring a button position of a projection whiteboard, wherein the projection whiteboard has a touch sensing function, and the button position is obtained according to a function binding touch operation sensed by the projection whiteboard; the acquiring a button position of a projection whiteboard comprises: receiving a function binding touch coordinate of the projection whiteboard, and generating the button position according to the function binding touch coordinate; the function binding touch coordinate is generated by the projection whiteboard according to the function binding touch operation; the function binding touch operation is a touch operation on the edge of a button identifier, and the button identifier is a sticker at any position on the projection whiteboard; the sticker is printed with an icon and a name indicating the button function; if the button identifier is rectangular, the button position comprises vertex coordinates of the button identifier, a button height, and a button width; if the button identifier is circular, the button position comprises a circle center and a radius of the button identifier;
acquiring a button function, and establishing a binding relationship between the button position and the button function, wherein the binding relationship is used for calling the button function bound with the button position according to a function calling touch operation sensed by the projection whiteboard; the button function is a function of the projection control terminal; the function of the projection control terminal is a function command written by a user for calling the button function;
generating a touch prompt image;
and sending the touch prompt image to a projector, wherein the touch prompt image is used for the projector to project onto the projection whiteboard.
2. The method of claim 1, wherein generating the button location according to the function-bound touch coordinates comprises:
determining vertex coordinates in the function binding touch coordinates;
calculating the coordinate distance between the vertex coordinates to obtain the height and the width of the button;
generating the button position; the button position includes the vertex coordinates, the button height, and the button width.
3. The method of claim 1, wherein the establishing the binding relationship between the button location and the button function comprises:
generating function instance code for the button function;
generating an instance identifier corresponding to the function instance code;
generating array elements; the array element comprises the instance identification and the button position;
and adding the array elements to a preset initial array to obtain a binding relation array.
4. A button function calling method is applied to a projection control terminal, and comprises the following steps:
acquiring a button position of a projection whiteboard, wherein the projection whiteboard has a touch sensing function, and the button position is obtained according to a function binding touch operation sensed by the projection whiteboard; the acquiring a button position of a projection whiteboard comprises: receiving a function binding touch coordinate of the projection whiteboard, and generating the button position according to the function binding touch coordinate; the function binding touch coordinate is generated by the projection whiteboard according to the function binding touch operation; the function binding touch operation is a touch operation on the edge of a button identifier, and the button identifier is a sticker at any position on the projection whiteboard; the sticker is printed with an icon and a name indicating the button function; if the button identifier is rectangular, the button position comprises vertex coordinates of the button identifier, a button height, and a button width; if the button identifier is circular, the button position comprises a circle center and a radius of the button identifier;
acquiring a button function, and establishing a binding relationship between the button position and the button function, wherein the binding relationship is used for calling the button function bound with the button position according to a function calling touch operation sensed by the projection whiteboard; the button function is a function of the projection control terminal; the function of the projection control terminal is a function command written by a user for calling the button function;
generating a touch prompt image;
sending the touch prompt image to a projector, wherein the touch prompt image is projected onto the projection whiteboard by the projector;
receiving a function calling touch coordinate of the projection whiteboard, wherein the function calling touch coordinate is generated according to the function calling touch operation sensed by the projection whiteboard;
and determining that the function calling touch coordinate matches the button position, and calling the button function bound with the button position.
5. A button function binding apparatus, comprising:
the position acquisition module is used for acquiring a button position of a projection whiteboard, wherein the projection whiteboard has a touch sensing function, and the button position is obtained according to a function binding touch operation sensed by the projection whiteboard;
the position acquisition module is further used for receiving a function binding touch coordinate of the projection whiteboard and generating the button position according to the function binding touch coordinate; the function binding touch coordinate is generated by the projection whiteboard according to the function binding touch operation; the function binding touch operation is a touch operation on the edge of a button identifier, and the button identifier is a sticker at any position on the projection whiteboard; the sticker is printed with an icon and a name indicating the button function; if the button identifier is rectangular, the button position comprises vertex coordinates of the button identifier, a button height, and a button width; if the button identifier is circular, the button position comprises a circle center and a radius of the button identifier;
the function acquisition module is used for acquiring a button function and establishing a binding relationship between the button position and the button function, wherein the binding relationship is used for calling the button function bound with the button position according to a function calling touch operation sensed by the projection whiteboard; the button function is a function of the projection control terminal; the function of the projection control terminal is a function command written by a user for calling the button function;
the image generation module is used for generating a touch prompt image;
and the image sending module is used for sending the touch prompt image to a projector, wherein the touch prompt image is used for the projector to project onto the projection whiteboard.
6. A button function calling apparatus, comprising:
the position acquisition module is used for acquiring a button position of a projection whiteboard, wherein the projection whiteboard has a touch sensing function, and the button position is obtained according to a function binding touch operation sensed by the projection whiteboard;
the position acquisition module is further used for receiving a function binding touch coordinate of the projection whiteboard and generating the button position according to the function binding touch coordinate; the function binding touch coordinate is generated by the projection whiteboard according to the function binding touch operation; the function binding touch operation is a touch operation on the edge of a button identifier, and the button identifier is a sticker at any position on the projection whiteboard; the sticker is printed with an icon and a name indicating the button function; if the button identifier is rectangular, the button position comprises vertex coordinates of the button identifier, a button height, and a button width; if the button identifier is circular, the button position comprises a circle center and a radius of the button identifier;
the function acquisition module is used for acquiring a button function and establishing a binding relationship between the button position and the button function, wherein the binding relationship is used for calling the button function bound with the button position according to a function calling touch operation sensed by the projection whiteboard; the button function is a function of the projection control terminal; the function of the projection control terminal is a function command written by a user for calling the button function;
the image generation module is used for generating a touch prompt image;
the image sending module is used for sending the touch prompt image to a projector, wherein the touch prompt image is projected onto the projection whiteboard by the projector;
the coordinate receiving module is used for receiving a function calling touch coordinate of the projection whiteboard, wherein the function calling touch coordinate is generated according to the function calling touch operation sensed by the projection whiteboard;
and the function calling module is used for determining that the function calling touch coordinate matches the button position and calling the button function bound with the button position.
7. A button function calling system, comprising:
the system comprises an electronic whiteboard and a projection control terminal; the electronic whiteboard has a touch sensing function; the display image of the projection control terminal is projected onto the electronic whiteboard through a projector;
the electronic whiteboard is used for sensing a function binding touch operation, generating a function binding touch coordinate according to the function binding touch operation, and sending the function binding touch coordinate to the projection control terminal;
the projection control terminal is used for acquiring a button position of the electronic whiteboard according to the function binding touch coordinate, acquiring a button function, and establishing a binding relationship between the button position and the button function; the button position is obtained according to the function binding touch operation sensed by the electronic whiteboard; the binding relationship is used for calling the button function bound with the button position according to a function calling touch operation sensed by the electronic whiteboard; the button function is a function of the projection control terminal; the function of the projection control terminal is a function command written by a user for calling the button function;
the projection control terminal is further used for receiving the function binding touch coordinate of the electronic whiteboard and generating the button position according to the function binding touch coordinate; the function binding touch coordinate is generated by the electronic whiteboard according to the function binding touch operation; the function binding touch operation is a touch operation on the edge of a button identifier, and the button identifier is a sticker at any position on the electronic whiteboard; the sticker is printed with an icon and a name indicating the button function; if the button identifier is rectangular, the button position comprises vertex coordinates of the button identifier, a button height, and a button width; if the button identifier is circular, the button position comprises a circle center and a radius of the button identifier;
the projection control terminal is further used for generating a touch prompt image and sending the touch prompt image to the projector, wherein the touch prompt image is projected onto the electronic whiteboard by the projector;
and the projection control terminal is further used for calling the button function bound with the button position according to the function calling touch operation sensed by the electronic whiteboard.
8. A projection control apparatus, characterized by comprising: one or more processors;
a memory for storing one or more programs;
wherein, when the one or more programs are executed by the one or more processors, the one or more processors are caused to perform the method according to any one of claims 1-4.
9. A storage medium containing computer-executable instructions for performing the method of any one of claims 1-4 when executed by a computer processor.
CN201910432357.XA 2019-05-23 2019-05-23 Button function binding method, button function calling method, button function binding device, button function calling device and projection control equipment Active CN110209242B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910432357.XA CN110209242B (en) 2019-05-23 2019-05-23 Button function binding method, button function calling method, button function binding device, button function calling device and projection control equipment


Publications (2)

Publication Number Publication Date
CN110209242A CN110209242A (en) 2019-09-06
CN110209242B true CN110209242B (en) 2022-01-11

Family

ID=67788253


Country Status (1)

Country Link
CN (1) CN110209242B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111580830B (en) * 2020-05-12 2023-09-15 北京飞漫软件技术有限公司 Binding and parsing method for hypertext markup language document element

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101071344A (en) * 2006-05-12 2007-11-14 深圳市巨龙科教高技术股份有限公司 Electronic pen for interactive electronic white board
CN101324812A (en) * 2006-12-19 2008-12-17 邱波 Human-machine interactive apparatus, electronic equipment and input method
CN101533316A (en) * 2009-04-10 2009-09-16 梁雨时 Interactive electric whiteboard system and method for using same
CN201340598Y (en) * 2009-01-16 2009-11-04 深圳市巨龙科教高技术股份有限公司 Interactive electronic whiteboard
CN101587399A (en) * 2009-02-09 2009-11-25 鑫能源科技(深圳)有限公司 Electronic writing equipment and electronic writing system
CN106484195A (en) * 2015-08-27 2017-03-08 华为技术有限公司 The control method of electronic whiteboard, device and system




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant