CN118295571A - Event control method based on interactive projection, related device and storage medium - Google Patents

Event control method based on interactive projection, related device and storage medium

Info

Publication number
CN118295571A
Authority
CN
China
Prior art keywords
target
interactive
interaction area
interaction
event
Prior art date
Legal status
Pending
Application number
CN202310002198.6A
Other languages
Chinese (zh)
Inventor
俞焕 (Yu Huan)
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd
Publication of CN118295571A


Abstract

The application discloses an event control method based on interactive projection, a related device, and a storage medium. Application scenarios cover a variety of terminals, such as mobile phones, computers, and vehicle-mounted terminals. The method includes: displaying a projection editing interface; in response to a region selection instruction for an interactive operation panel, displaying a first original interaction region in the interactive operation panel according to the instruction, and displaying an interaction setting panel for the first original interaction region; in response to a key value input operation on the interaction setting panel for the first original interaction region, acquiring a target key value corresponding to the first original interaction region; and, in a case where a projection device projects an interactive projection picture, when a touch operation on the interactive projection picture is detected, controlling at least one device to execute a target event if the positional relationship between the touch point of the touch operation and a first target interaction region satisfies an event trigger condition. The application can save device deployment cost and improve the debugging efficiency of projection interaction.

Description

Event control method based on interactive projection, related device and storage medium
Technical Field
The present application relates to the field of multimedia technologies, and in particular, to an event control method based on interactive projection, a related device, and a storage medium.
Background
The interactive projection technique captures a target image (e.g., of a participant) with a capture device, and an image analysis system then analyzes the captured image to obtain the motion of the captured object. The motion data is combined with a real-time image interaction system, so that a tightly coupled interactive effect is produced between the participant and the screen.
At present, an interactive projection effect can also be realized based on conductive graphite technology. That is, graphite pigment is applied to an object by utilizing the conductive property of graphite, and the object is energized. Because the human body is also conductive, a circuit is closed when a participant touches the object, so that the participant's click position is obtained and the corresponding animation is displayed.
However, the inventors found that the current solution has at least the following problems: before interactive projection can be realized based on conductive graphite technology, a set of conductive interaction devices must also be installed below the plane where the graphite-coated area is located. As a result, the deployment cost is high, the overall deployment process is cumbersome, and the debugging efficiency of projection interaction is low.
Disclosure of Invention
The embodiments of the present application provide an event control method based on interactive projection, a related device, and a storage medium, which can not only save device deployment cost but also improve the debugging efficiency of projection interaction.
In view of this, an aspect of the present application provides an event control method based on interactive projection, including:
displaying a projection editing interface, wherein the projection editing interface provides an interactive operation panel;
in response to a region selection instruction for the interactive operation panel, displaying a first original interaction region in the interactive operation panel according to the region selection instruction, wherein the first original interaction region is used for determining position information of a first target interaction region in an interactive projection picture;
in response to a key value input operation on an interaction setting panel for the first original interaction region, acquiring a target key value corresponding to the first original interaction region, wherein the target key value has a unique mapping relationship with a target event;
in a case where a projection device projects the interactive projection picture, when a touch operation on the interactive projection picture is detected, controlling at least one device to execute the target event if the positional relationship between a touch point of the touch operation and the first target interaction region satisfies an event trigger condition.
Another aspect of the present application provides an event control apparatus, comprising:
the display module is used for displaying a projection editing interface, wherein the projection editing interface provides an interactive operation panel;
The display module is further used for responding to the region selection instruction aiming at the interactive operation panel, and displaying a first original interaction region in the interactive operation panel according to the region selection instruction, wherein the first original interaction region is used for determining the position information of a first target interaction region in an interactive projection picture;
The acquisition module is used for responding to the key value input operation of the interaction setting panel aiming at the first original interaction area and acquiring a target key value corresponding to the first original interaction area, wherein the target key value and the target event have a unique mapping relation;
The control module is used for controlling at least one device to execute a target event if the position relationship between a touch point of the touch operation and the first target interaction area meets an event triggering condition when the touch operation for the interactive projection picture is detected under the condition that the projection device projects the interactive projection picture.
In one possible design, in another implementation of another aspect of the embodiments of the present application,
The display module is specifically used for triggering an area selection instruction when a continuous contact operation with the interactive operation panel is maintained, and displaying a track corresponding to the continuous contact operation in the interactive operation panel;
When the end of the continuous contact operation is detected, a first original interaction area is displayed in the interactive operation panel according to the track corresponding to the continuous contact operation.
In one possible design, in another implementation of another aspect of the embodiments of the present application,
And the display module is also used for displaying contact position information corresponding to the continuous contact operation when the continuous contact operation with the interactive operation panel is maintained, wherein the contact position information is used for describing the real-time position of the contact corresponding to the continuous contact operation in the interactive operation panel.
In one possible design, in another implementation of another aspect of the embodiments of the present application,
The display module is specifically used for triggering an area selection instruction when the clicking operation aiming at the interactive operation panel is detected, and displaying each contact corresponding to the clicking operation and the connecting edge between every two adjacent contacts in the interactive operation panel, wherein each two adjacent contacts represent contacts generated by two adjacent clicking operations;
and displaying a first original interaction area in the interactive operation panel according to the connecting edge between every two adjacent contacts.
In one possible design, in another implementation of another aspect of the embodiments of the present application,
And the display module is also used for displaying contact position information corresponding to the clicking operation when the clicking operation aiming at the interactive operation panel is detected, wherein the contact position information is used for describing the real-time position of the contact corresponding to the clicking operation in the interactive operation panel.
In one possible design, in another implementation of another aspect of the embodiments of the present application, the event control device further includes an operation module and a determination module;
The operation module is used for responding to the hierarchy setting operation aiming at the first original interaction area if the first original interaction area and the second original interaction area have the overlapping area, wherein the second original interaction area is used for determining the position information of the second target interaction area in the interaction projection picture;
The judging module is used for judging the position relationship between the touch point of the touch operation and the first target interaction area when the touch point of the touch operation is positioned in the overlapping area of the first target interaction area and the second target interaction area under the condition that the hierarchy priority of the first original interaction area is higher than that of the second original interaction area;
The judging module is further configured to judge a positional relationship between the touch point of the touch operation and the second target interaction region when the touch point of the touch operation is located in a region where the first target interaction region coincides with the second target interaction region, in a case where the hierarchical priority of the second original interaction region is higher than the hierarchical priority of the first original interaction region.
In one possible design, in another implementation of another aspect of the embodiments of the present application,
The display module is specifically used for displaying an interaction setting panel aiming at the first original interaction area on the projection editing interface when the clicking operation aiming at the first original interaction area is detected, wherein the interaction setting panel is used for setting a key value responding to the type of the clicking operation;
When a long-press operation aiming at the first original interaction area is detected, displaying an interaction setting panel aiming at the first original interaction area on a projection editing interface, wherein the interaction setting panel is used for setting a key value responding to the type of the long-press operation;
When the directional swipe operation for the first original interaction area is detected, displaying an interaction setting panel for the first original interaction area on a projection editing interface, wherein the interaction setting panel is used for setting a key value responding to the type of the directional swipe operation.
In one possible design, in another implementation of another aspect of the embodiments of the present application, the event control device further includes a determination module;
The acquisition module is further used for responding to an area selection instruction aiming at the interactive operation panel, acquiring a length ratio and a width ratio between the interactive projection picture and the interactive operation panel after the first original interactive area is displayed in the interactive operation panel according to the area selection instruction, wherein the length ratio represents a ratio between the corresponding length of the interactive projection picture and the corresponding length of the interactive operation panel, and the width ratio represents a ratio between the corresponding width of the interactive projection picture and the corresponding width of the interactive operation panel;
the determining module is used for determining the position information of the first target interaction area in the interaction projection picture according to the position information, the length ratio and the width ratio of the first original interaction area in the interaction operation panel.
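For illustration, the sketch below shows one way this scaling could be implemented; the types and names are assumptions for the example, not taken from the application.

```typescript
// Hypothetical types; the application does not prescribe data structures.
interface Point { x: number; y: number; }

// Map a region drawn on the interactive operation panel into the
// interactive projection picture by scaling each vertex with the
// length ratio (x axis) and the width ratio (y axis).
function mapRegionToPicture(
  panelRegion: Point[],
  lengthRatio: number, // picture length / panel length
  widthRatio: number,  // picture width / panel width
): Point[] {
  return panelRegion.map(p => ({ x: p.x * lengthRatio, y: p.y * widthRatio }));
}
```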
In one possible design, in another implementation of another aspect of the embodiments of the present application, the touch operation is a click operation or a long press operation;
The acquisition module is further used for acquiring target position information of a touch point of the touch operation when the touch operation for the interactive projection picture is detected, wherein the target position information is used for describing the position of the touch point in the interactive projection picture;
The acquisition module is further used for acquiring a position inclusion relation between a touch point of the touch operation and the first target interaction area if the first target interaction area is determined according to the target position information;
and the determining module is further used for determining that the event triggering condition is met if the position containing relation indicates that the touch point of the touch operation is located in the first target interaction area.
In one possible design, in another implementation of another aspect of the embodiments of the present application,
The acquisition module is specifically used for responding to touch operation for the interactive projection picture to acquire an original image, wherein the original image comprises a bright spot area, and the bright spot area is generated according to the touch operation;
Performing binarization processing on the original image to obtain a binarized image;
performing contour extraction processing on the binarized image to obtain an image to be processed, wherein the image to be processed comprises a region to be processed;
performing contour filling processing on a region to be processed in the image to be processed to obtain a target image, wherein the target image comprises a target rectangular region;
And taking the central position information of the target rectangular area included in the target image as the target position information of the touch point of the touch operation.
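As a rough sketch of this pipeline, the code below thresholds a grayscale frame (the binarization step) and takes the bounding rectangle of the bright-spot pixels as a stand-in for the contour extraction and filling steps, returning the rectangle's center; a production system would use a proper vision library, which the application does not name.

```typescript
// Simplified sketch: binarize a grayscale frame, take the bounding
// rectangle of the bright region, and use its center as the touch point.
// `width`/`height` describe the frame layout; all names are illustrative.
function locateTouchPoint(
  gray: Uint8Array, width: number, height: number, threshold = 200,
): { x: number; y: number } | null {
  let minX = width, minY = height, maxX = -1, maxY = -1;
  for (let y = 0; y < height; y++) {
    for (let x = 0; x < width; x++) {
      if (gray[y * width + x] >= threshold) { // binarization step
        if (x < minX) minX = x;
        if (x > maxX) maxX = x;
        if (y < minY) minY = y;
        if (y > maxY) maxY = y;
      }
    }
  }
  if (maxX < 0) return null; // no bright spot in this frame
  // Center of the (filled) bounding rectangle = target position information.
  return { x: (minX + maxX) / 2, y: (minY + maxY) / 2 };
}
```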
In one possible design, in another implementation of another aspect of the embodiments of the present application, the event control device further includes an execution module;
The acquisition module is further used for acquiring the position information of each target interaction region in K target interaction regions contained in the interaction projection picture, wherein K is an integer greater than or equal to 1;
the determining module is also used for determining K distance information according to the target position information and the position information of each target interaction area;
the determining module is also used for determining minimum distance information from the K pieces of distance information;
And the execution module is used for executing the step of acquiring the position inclusion relation between the touch point of the touch operation and the first target interaction area if the minimum distance information is determined based on the target position information and the position information of the first target interaction area.
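A minimal sketch of this nearest-region pre-selection, assuming the position information of a region is approximated by the average of its vertices (the application leaves the exact position representation unspecified):

```typescript
interface Point { x: number; y: number; }

// Approximate position information of a polygonal target interaction
// region as the average of its vertices (an assumption for this sketch).
function centroid(region: Point[]): Point {
  const n = region.length;
  const sum = region.reduce(
    (a, p) => ({ x: a.x + p.x, y: a.y + p.y }), { x: 0, y: 0 });
  return { x: sum.x / n, y: sum.y / n };
}

// Pick, among the K regions, the one whose position information yields
// the minimum distance to the touch point.
function nearestRegion(touch: Point, regions: Point[][]): Point[] {
  let best = regions[0];
  let bestDist = Infinity;
  for (const region of regions) {
    const c = centroid(region);
    const d = Math.hypot(c.x - touch.x, c.y - touch.y);
    if (d < bestDist) { bestDist = d; best = region; }
  }
  return best;
}
```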
In one possible design, in another implementation of another aspect of the embodiments of the present application,
The acquisition module is specifically used for constructing a ray based on the touch point of the touch operation;
acquiring the number of intersection points generated by the ray and the first target interaction area;
if the number of the intersection points is an odd number, determining that the position inclusion relation is used for indicating touch points of the touch operation to be positioned in the first target interaction area;
If the number of the intersection points is even, determining that the position inclusion relation is used for indicating that the touch point of the touch operation is located outside the first target interaction area.
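This odd/even intersection count is the classic ray-casting point-in-polygon test; a compact sketch follows, with a horizontal ray cast to the right of the touch point.

```typescript
interface Point { x: number; y: number; }

// Ray casting: cast a horizontal ray to the right of the touch point and
// count crossings with the polygon's edges. An odd count means the touch
// point lies inside the first target interaction region.
function pointInPolygon(p: Point, polygon: Point[]): boolean {
  let inside = false;
  for (let i = 0, j = polygon.length - 1; i < polygon.length; j = i++) {
    const a = polygon[i], b = polygon[j];
    const crosses =
      (a.y > p.y) !== (b.y > p.y) && // edge spans the ray's y level
      p.x < ((b.x - a.x) * (p.y - a.y)) / (b.y - a.y) + a.x;
    if (crosses) inside = !inside; // each crossing flips the parity
  }
  return inside;
}
```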
In one possible design, in another implementation of another aspect of the embodiments of the present application, the touch operation is a swipe operation;
The acquisition module is further used for acquiring initial position information of an initial touch point of the touch operation and end position information of an end touch point of the touch operation when the touch operation for the interactive projection picture is detected, wherein the initial position information is used for describing the position of the initial touch point in the interactive projection picture, and the end position information is used for describing the position of the end touch point in the interactive projection picture;
the acquisition module is also used for acquiring a position inclusion relation according to the initial position information and the end position information;
the determining module is further configured to determine that the event triggering condition is satisfied if the position-containing relationship indicates that the start touch point and the end touch point are both located within the first target interaction region.
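For illustration, a swipe trigger check might look as follows, where `contains` is any point-in-region test such as the ray-casting sketch above; the function name is illustrative.

```typescript
interface Point { x: number; y: number; }

// A swipe satisfies the event trigger condition only when both its start
// and end touch points fall inside the first target interaction region.
function swipeTriggersEvent(
  start: Point, end: Point, contains: (p: Point) => boolean,
): boolean {
  return contains(start) && contains(end);
}
```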
In one possible design, in another implementation of another aspect of the embodiments of the present application, the target event is an audio play event;
The display module is also used for displaying an audio playing control in the interactive projection picture, wherein the audio playing control is positioned in the first target interaction area;
The control module is specifically configured to determine that an event triggering condition is met if a touch point of the touch operation is located in the first target interaction area, and control the audio device to play an audio file corresponding to the audio playing event.
In one possible design, in another implementation of another aspect of the embodiments of the present application, the target event is a menu selection event;
the display module is further used for displaying a menu selection control in the interactive projection picture, wherein the menu selection control is positioned in the first target interaction area;
The control module is specifically configured to determine that an event triggering condition is met if a touch point of the touch operation is located within the first target interaction area, and control the projection device to project an animation effect corresponding to the menu selection event.
In one possible design, in another implementation of another aspect of the embodiments of the present application, the target event is a page flip event;
the display module is also used for displaying the current page in the interactive projection picture;
the control module is specifically configured to determine that an event triggering condition is met if both a start touch point of the touch operation and an end touch point of the touch operation are located in the first target interaction area, and control the projection device to project a previous page or a next page of the current page.
In another aspect, the present application provides an interactive projection system, including a computer device, a projection device, and a signal acquisition device;
The projection equipment is used for projecting an interactive projection picture;
the signal acquisition equipment is used for acquiring touch operation aiming at the interactive projection picture;
The computer device is configured to perform the methods of the above aspects.
Another aspect of the application provides a computer device comprising a memory storing a computer program and a processor implementing the methods of the above aspects when the processor executes the computer program.
Another aspect of the application provides a computer readable storage medium having stored thereon a computer program which when executed by a processor performs the method of the above aspects.
In another aspect of the application, a computer program product is provided, comprising a computer program which, when executed by a processor, implements the methods of the above aspects.
From the above technical solutions, the embodiment of the present application has the following advantages:
In the embodiment of the application, an event control method based on interactive projection is provided, in which a projection editing interface is displayed by a computer device. In response to a region selection instruction for the interactive operation panel, a first original interaction region is displayed in the interactive operation panel according to the region selection instruction, and an interaction setting panel for the first original interaction region is displayed on the projection editing interface. The first original interaction region is used for determining the position information of the first target interaction region in the interactive projection picture. Then, in response to a key value input operation on the interaction setting panel for the first original interaction region, a target key value corresponding to the first original interaction region is acquired. In a case where the projection device projects the interactive projection picture, when a touch operation on the interactive projection picture is detected, at least one device is controlled to execute the target event if the positional relationship between the touch point of the touch operation and the first target interaction region satisfies the event trigger condition. In this way, the user can visually edit the original interaction region by freely selecting a region. On this basis, the original interaction region is directly mapped into the interactive projection picture to form the corresponding target interaction region, so the deployment migrates easily to new environments. Therefore, the scheme provided by the application not only saves device deployment cost but also improves the debugging efficiency of projection interaction.
Drawings
FIG. 1 is a schematic diagram of an interactive projection system according to an embodiment of the present application;
FIG. 2 is a schematic diagram of an implementation environment of an event control method according to an embodiment of the present application;
FIG. 3 is a flow chart of an event control method according to an embodiment of the application;
FIG. 4 is a schematic diagram of a projection editing interface according to an embodiment of the present application;
FIG. 5 is a schematic diagram of an interaction flow of an event control method according to an embodiment of the present application;
FIG. 6 is a schematic diagram of a customized first original interaction region according to an embodiment of the present application;
FIG. 7 is another schematic diagram of a customized first original interaction region according to an embodiment of the present application;
FIG. 8 is a schematic diagram illustrating the setting of the hierarchical priority of the original interaction region according to an embodiment of the present application;
FIG. 9 is a schematic diagram of the operation type of setting the original interaction region according to the embodiment of the present application;
FIG. 10 is a schematic diagram of determining target position information of a touch point according to an embodiment of the present application;
FIG. 11 is a schematic diagram of determining a positional relationship between a touch point and a target interaction area according to an embodiment of the present application;
FIG. 12 is a schematic diagram of an embodiment of the present application for implementing audio playback based on interactive projection;
FIG. 13 is a schematic diagram of an interactive projection based menu selection in an embodiment of the present application;
FIG. 14 is a schematic diagram of an embodiment of the present application for implementing electronic reading based on interactive projection;
FIG. 15 is a schematic diagram of an event control device according to an embodiment of the present application;
fig. 16 is a schematic structural diagram of a terminal according to an embodiment of the present application.
Detailed Description
The embodiments of the present application provide an event control method based on interactive projection, a related device, and a storage medium, which can not only save device deployment cost but also improve the debugging efficiency of projection interaction.
The terms "first," "second," "third," "fourth" and the like in the description and in the claims and in the above drawings, if any, are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the application described herein may be implemented, for example, in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "includes" and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed or inherent to such process, method, article, or apparatus.
The interactive projection system mixes virtual reality technology with motion capture technology, and is a further development of virtual reality technology. Virtual reality is a technique in which a computer creates three-dimensional images, providing the user with a three-dimensional space and the ability to interact with it. Through mixed reality, the user can control the virtual image while touching the real environment.
For ease of understanding, referring to fig. 1, fig. 1 is a schematic diagram of an architecture of an interactive projection system according to an embodiment of the present application, where the interactive projection system mainly includes a signal acquisition device, a computer device, a projection device, and an auxiliary device. The signal acquisition equipment is used for capturing and shooting according to interaction requirements. Signal acquisition devices include, but are not limited to, infrared sensors, video cameras, thermal cameras, and the like. The computer device is used for analyzing the data acquired in real time, and the generated data is in butt joint with the virtual scene system. Computer devices include, but are not limited to, servers and terminals. Projection devices use projectors or other visualization devices to present images at specific locations for use as carriers of interactive images. Projection devices include, but are not limited to, projectors, plasma displays, liquid crystal displays, light-emitting diode (LED) screens. Auxiliary equipment includes, but is not limited to, transmission lines, mounting members, audio devices, and the like.
Taking the deployment of the interactive projection system shown in fig. 1 as an example, in order for the computer device to determine which interaction region a user's touch operation on the interactive projection screen falls in, the projection distance, the focal length, the projection screen size, and the like need to be calculated. The deployment process of such an interactive projection system is therefore cumbersome and lacks reusability. In view of this, the application provides an event control method for interactive projection that enables a developer to edit the shape of a touch area of the interactive projection picture by freely selecting regions, so that the interaction area can be adapted without debugging the interactive projection system.
The event control method for interactive projection provided by the application can be applied in at least one of the following scenarios.
(1) Ground interaction;
The interactive projection system uses computer vision technology and projection technology to create an interactive experience. The ground interaction projection system uses a top-hung projection device to project a plane interaction picture onto the ground, and the touch operation behavior in the area is identified through the signal acquisition device. The user may interact with the content in the projection, e.g., simulating a scene of playing a football, simulating a scene of fishing, simulating a scene of playing a chess, etc.
(2) Wall surface interaction;
The wall interactive projection system can display rich contents, such as pictures, characters, video and audio. The wall surface interactive projection system comprises projection equipment for projecting the vertical surface interactive picture to the wall surface. Through the visual recognition system, the behavior of the user in front of the screen can be recognized, and specific actions of the user can interact with the content in the projection, such as viewing pictures, browsing web pages, playing games, etc.
(3) Desktop interaction;
The desktop interactive projection system uses a multipoint interaction technology, combines a projection technology or an infrared touch screen technology, enables a user to freely interact with digital content on a desktop, and enables the user to obtain brand new operation experience. With the audio device, the user can hear the corresponding sound effect and sound while operating, and the shape of the desktop is not limited.
It should be noted that the above application scenario is only an example, and the event control method provided in this embodiment may also be applied to other scenarios, which is not limited herein.
The method provided by the application can be applied in the implementation environment shown in fig. 2, which includes a terminal 110, a server 120, a projection device 140, and a signal acquisition device 150. Communication between the terminal 110 and the server 120 may be performed through a communication network 130. The communication network 130 uses standard communication techniques and/or protocols, typically the Internet, but may be any network, including but not limited to Bluetooth, a local area network (LAN), a metropolitan area network (MAN), a wide area network (WAN), a mobile network, a private network, or any combination of virtual private networks. In some embodiments, custom or dedicated data communication techniques may be used in place of, or in addition to, the data communication techniques described above. The projection device 140 and the signal acquisition device 150 may be placed in parallel and spaced apart by a distance according to service requirements. The input signal of the projection device 140 is provided by the server 120, and the server 120 receives the signal input from the signal acquisition device 150 in real time.
The terminal 110 to which the present application relates includes, but is not limited to, a mobile phone, a computer, an intelligent voice interaction device, an intelligent home appliance, a vehicle-mounted terminal, an aircraft, etc. The client is deployed on the terminal 110, and the client may run on the terminal 110 in a browser mode, or may run on the terminal 110 in an independent APP mode, or the like. The server 120 according to the present application may be an independent physical server, or may be a server cluster or a distributed system formed by a plurality of physical servers, or may be a cloud server that provides cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, content delivery networks (content delivery network, CDN), and basic cloud computing services such as big data and AI platform.
In combination with the above implementation environment, the user triggers the region selection instruction through the projection editing interface provided by the terminal 110, and the terminal 110 sends the region selection instruction to the server 120 through the communication network 130, and simultaneously displays the original interaction region. Then, the user continues to trigger a key value input operation through the terminal 110 to configure a target key value corresponding to the original interaction region. The server 120 transmits the relevant data of the interactive projection screen to the projection device 140. In the case where the projection device 140 projects an interactive projection screen, a touch operation of a user is captured by the signal acquisition device 150, and a corresponding interactive effect is achieved based on the touch operation.
With reference to fig. 3, the event control method based on the interactive projection in the embodiment of the present application may be independently completed by a terminal, may be independently completed by a server, or may be completed by a cooperation of the terminal and the server, and the method of the present application includes:
210. Displaying a projection editing interface, wherein the projection editing interface provides an interactive operation panel;
in one or more embodiments, a user can customize an interactive element through a projection editing interface.
Specifically, for ease of understanding, please refer to fig. 4, which is a schematic diagram of a projection editing interface according to an embodiment of the present application, in which A1 indicates the interactive operation panel. The user may click or slide on the interactive operation panel using an input device (e.g., a mouse or keyboard) or a finger to trigger a region selection instruction for the interactive operation panel.
220. Responding to a region selection instruction aiming at the interactive operation panel, and displaying a first original interaction region in the interactive operation panel according to the region selection instruction, wherein the first original interaction region is used for determining the position information of a first target interaction region in an interaction projection picture;
In one or more embodiments, in response to an area selection instruction for the interactive operation panel, not only a first original interactive area is displayed in the interactive operation panel, but also an interactive setting panel for the first original interactive area is displayed in the projection editing interface. And determining the position information of the first target interaction area in the interaction projection picture according to the position information of the first original interaction area in the interaction operation panel.
It is understood that "in response to" means a condition or state upon which an operation is performed, one or more operations may be performed when certain conditions or states are met. These operations may be real-time or with some delay.
Specifically, a real-time communication (RTC) technique is used to read the region selection instruction, and the first original interaction region is drawn in the interactive operation panel. Taking a web client as an example, Web Real-Time Communication (WebRTC) is used to read the region selection instruction, and the first original interaction region is drawn on the interactive operation panel, namely the Canvas provided by the web platform. WebRTC allows network applications or sites to establish peer-to-peer connections between browsers without an intermediary, so as to transmit video streams, audio streams, or any other data.
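A minimal sketch of drawing such a region on the web platform's Canvas is shown below; the element id and drawing style are assumptions for the example, since the application only states that the region is drawn on the Canvas.

```typescript
// Minimal sketch: render an original interaction region on a web Canvas.
const canvas = document.getElementById('panel') as HTMLCanvasElement;
const ctx = canvas.getContext('2d')!;

function drawRegion(vertices: { x: number; y: number }[]): void {
  if (vertices.length < 2) return;
  ctx.beginPath();
  ctx.moveTo(vertices[0].x, vertices[0].y);
  for (const v of vertices.slice(1)) ctx.lineTo(v.x, v.y);
  ctx.closePath(); // close the polygon outline
  ctx.stroke();    // draw the region border
}
```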
230. Responding to key value input operation of an interaction setting panel aiming at a first original interaction area, and acquiring a target key value corresponding to the first original interaction area, wherein the target key value and a target event have a unique mapping relation;
In one or more embodiments, when the user selects a certain original interaction region in the interactive operation panel, an interaction setting panel corresponding to that original interaction region is displayed on the projection editing interface.
Specifically, for ease of understanding, please refer to fig. 4 again, A2 is used to indicate the first original interaction region. A3 is used for indicating the interaction setting panel aiming at the first original interaction area, namely, the interaction setting panel indicated by A3 can be used for correspondingly setting the first original interaction area. For example, the user triggers a key value input operation by inputting a target key value in the input box indicated by A4. The target key value and the target event have a unique mapping relationship, for example, the target key value is "0001", and the corresponding target event is "play audio a".
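The unique mapping between key values and events could be held in a simple registry, sketched below with hypothetical key values and handlers; none of these names come from the application.

```typescript
// Hypothetical registry: each target key value maps to exactly one event.
const keyValueToEvent = new Map<string, () => void>([
  ['0001', () => console.log('play audio A')],        // audio play event
  ['0002', () => console.log('project menu effect')], // menu selection event
]);

// Invoked when a key value is broadcast for a triggered interaction region.
function executeTargetEvent(keyValue: string): void {
  const handler = keyValueToEvent.get(keyValue);
  if (handler) handler(); // unique mapping: at most one event fires
}
```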
240. Under the condition that the projection equipment projects the interactive projection picture, when touch operation aiming at the interactive projection picture is detected, if the position relation between a touch point of the touch operation and a first target interaction area meets an event triggering condition, controlling at least one equipment to execute a target event.
In one or more embodiments, after the computer device, the projection device, and the signal acquisition device establish a communication connection, data to be projected may be transmitted to the projection device. Based on the above, under the condition that the projection device projects the interactive projection picture, the signal acquisition device captures the touch operation behavior, so that the touch operation on the interactive projection picture is detected. And if the position relation between the touch point of the touch operation and the first target interaction area meets the event triggering condition, controlling at least one device (e.g. a projection device, an auxiliary device and the like) to execute the target event.
In the following description, for convenience of understanding, referring to fig. 5, fig. 5 is a schematic diagram of an interaction flow of an event control method according to an embodiment of the present application, as shown in the drawing, where a platform end includes a computer device, and a projection end includes a projection device and a signal acquisition device. Specifically:
In step B1, a user (e.g., a debugger or developer) selects the original interaction region by drawing it on the interactive operation panel.
In step B2, the user (e.g., debugger, developer) defines the key value of the original interaction area through the interaction setting panel, for example, defines the key value of a certain original interaction area as the target key value.
In step B3, capturing, by the signal acquisition device, a picture captured by the camera in real time, so as to detect a touch operation on the interactive projection picture.
In step B4, a user (e.g., a participant) may perform a touch operation using a finger within the interactive projection screen.
In step B5, when the signal acquisition device detects a touch operation for the interactive projection screen, a touch point location of the touch operation is calculated in real time. For example, in one case, the touch point location may be calculated by the signal acquisition device. In another case, the signal acquisition device may transmit the captured camera frame to the computer device, and the computer device calculates the touch point position according to the camera frame.
In step B6, it is determined whether the touch point of the touch operation is within the target interaction area, and if yes, step B7 is performed. The target interaction area and the original interaction area have a unique mapping relation.
In step B7, the target key value corresponding to the target interaction area is broadcasted to call the interface related to the target event to realize the corresponding function.
In step B8, the interactive projection picture corresponding to the target key value is processed, and the corresponding picture content is displayed.
In step B9, at least one device is controlled to execute the target event, thereby providing the user with an effect corresponding to the target event.
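For illustration only, steps B5 to B9 might be glued together as sketched below, reusing the helper sketches given elsewhere in this description (`locateTouchPoint`, `pointInPolygon`, `executeTargetEvent`); none of these names come from the application.

```typescript
interface Point { x: number; y: number; }

// Illustrative glue for steps B5-B9: locate the touch point in a captured
// frame, find the containing target interaction region, and broadcast its
// key value so the mapped event executes.
function onCapturedFrame(
  gray: Uint8Array, width: number, height: number,
  regions: { polygon: Point[]; keyValue: string }[],
): void {
  const touch = locateTouchPoint(gray, width, height); // step B5
  if (!touch) return;
  for (const r of regions) {
    if (pointInPolygon(touch, r.polygon)) {            // step B6
      executeTargetEvent(r.keyValue);                  // steps B7-B9
      return;
    }
  }
}
```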
In an embodiment of the application, an event control method based on interactive projection is provided. In this way, the user can visually edit the original interaction region by freely selecting a region. On this basis, the original interaction region is directly mapped into the interactive projection picture to form the corresponding target interaction region, so the deployment migrates easily to new environments. Therefore, the scheme provided by the application not only saves device deployment cost but also improves the debugging efficiency of projection interaction.
Optionally, on the basis of the foregoing respective embodiments corresponding to fig. 3, in another optional embodiment provided by the present application, in response to a region selection instruction for an interactive operation panel, displaying a first original interaction region in the interactive operation panel according to the region selection instruction may specifically include:
when a continuous contact operation with the interactive operation panel is maintained, triggering an area selection instruction, and displaying a track corresponding to the continuous contact operation in the interactive operation panel;
When the end of the continuous contact operation is detected, a first original interaction area is displayed in the interactive operation panel according to the track corresponding to the continuous contact operation.
In one or more embodiments, a manner of triggering a region selection instruction is presented. As can be seen from the foregoing embodiments, the projection editing interface provides an interactive operation panel, on the basis of which a user (e.g., a debugger, a developer) can perform continuous contact operation on the interactive operation panel to trigger an area selection instruction. Meanwhile, based on the RTC technology, a track corresponding to the continuous contact operation is displayed in the interactive operation panel.
Specifically, for ease of understanding, referring to fig. 6, fig. 6 is a schematic diagram of a customized first original interaction area according to an embodiment of the present application, as shown in fig. 6 (a), a user may drag a mouse to move a cursor on an interactive operation panel, so as to maintain continuous contact operation with the interactive operation panel. Wherein C1 is used to indicate a trajectory corresponding to the continuous contact operation. As shown in fig. 6 (B), when the user stops the continuous contact operation, that is, according to the trajectory corresponding to the continuous contact operation, the original interaction region is displayed in the interactive operation panel. Wherein C2 is used to indicate the first original interaction region.
It should be noted that the computer device may perform the region selection processing using a Canvas polygon drawing technique, so as to obtain one or more original interaction regions (i.e., visualized region selection).
In this embodiment of the present application, a manner of triggering a region selection instruction is provided. In this way, based on the user's continuous contact operation, the corresponding sliding trajectory can be displayed in the interactive operation panel, achieving a visualized selection effect and making it convenient for the user to observe the region selection in real time.
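A sketch of how such a continuous contact operation might be captured with standard pointer events follows, reusing the `canvas` and `drawRegion` names from the Canvas sketch above; a real implementation would also clear and redraw the panel on each move. All identifiers are illustrative.

```typescript
// Record the trajectory while the pointer is down, and close it into an
// original interaction region on release.
const trajectory: { x: number; y: number }[] = [];
let drawing = false;

canvas.addEventListener('pointerdown', () => {
  drawing = true;
  trajectory.length = 0; // start a new trajectory
});
canvas.addEventListener('pointermove', e => {
  if (!drawing) return;
  trajectory.push({ x: e.offsetX, y: e.offsetY });
  drawRegion(trajectory); // live feedback of the stroke
});
canvas.addEventListener('pointerup', () => {
  drawing = false;
  drawRegion(trajectory); // closed path becomes the original interaction region
});
```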
Optionally, on the basis of the respective embodiments corresponding to fig. 3, another optional embodiment provided by the embodiment of the present application may further include, when continuous contact operation with the interactive operation panel is maintained:
And displaying contact position information corresponding to the continuous contact operation, wherein the contact position information is used for describing the real-time position of the contact corresponding to the continuous contact operation in the interactive operation panel.
In one or more embodiments, a manner of displaying position information based on a swipe operation is presented. As can be seen from the foregoing embodiments, when the continuous contact operation is performed, the point where the cursor or finger contacts the interactive operation panel is the contact point. Based on this, the contact point position information can be displayed on the projection editing interface.
Specifically, for ease of understanding, please refer again to fig. 6 (A), where C3 is used to indicate the contact point generated by the current continuous contact operation, and C4 is used to indicate the real-time position of the contact point. The contact position information corresponding to the contact point indicated by C3 is (120.00, 80.00). If the contact continues to move downward, the contact position information also continues to change.
In the embodiment of the application, a manner of displaying position information based on a swipe operation is provided. By the method, the real-time position information of the touch point can be displayed based on continuous contact operation, so that a user can know the frame selection condition of the interaction area conveniently, and corresponding adjustment can be made in time.
Optionally, on the basis of the foregoing respective embodiments corresponding to fig. 3, in another optional embodiment provided by the present application, in response to a region selection instruction for an interactive operation panel, displaying a first original interaction region in the interactive operation panel according to the region selection instruction may specifically include:
when a click operation for the interactive operation panel is detected, triggering an area selection instruction, and displaying, in the interactive operation panel, each contact corresponding to the click operation and the connecting edge between every two adjacent contacts, wherein two adjacent contacts are contacts generated by two adjacent click operations;
and displaying a first original interaction area in the interactive operation panel according to the connecting edge between every two adjacent contacts.
In one or more embodiments, another manner of triggering area selection instructions is presented. As can be seen from the foregoing embodiments, the projection editing interface provides an interactive operation panel, on the basis of which a user (e.g., a debugger, developer) can perform a click operation on the interactive operation panel to trigger an area selection instruction. Meanwhile, based on the RTC technology, the contact and the connecting edge corresponding to the clicking operation are displayed in the interactive operation panel.
Specifically, for ease of understanding, referring to fig. 7, fig. 7 is another schematic diagram of a customized first original interaction area in an embodiment of the present application, as shown in fig. 7 (a), a user may move a cursor on an interactive operation panel by moving a mouse, and when the user clicks the mouse, a clicking operation for the interactive operation panel is triggered. Illustratively, D1 is a first contact clicked by the user, and D2 is a second contact clicked by the user, based on which a connection edge between two adjacent contacts (i.e., the first contact and the second contact) is automatically generated. D3 is the third contact clicked by the user, based on which the connecting edge between two adjacent contacts (i.e., the second contact and the third contact) is automatically generated. D4 is the fourth contact clicked by the user, based on which the connecting edge between two adjacent contacts (i.e., the third contact and the fourth contact) is automatically generated.
As shown in fig. 7 (B), a polygonal area, which is the original interaction area, can be generated according to the contacts clicked by the user. Here, D6 is used to indicate the first original interaction region.
It should be noted that the computer device may likewise perform the region selection processing using a Canvas polygon drawing technique, so as to obtain one or more original interaction regions (i.e., visualized region selection).
In the embodiment of the application, another mode of triggering the region selection instruction is provided. By the mode, based on clicking operation of a user, corresponding contacts and a connecting edge between the two contacts can be displayed in the interactive operation panel. Therefore, the visual effect of selection is achieved, the user can observe the condition of area selection in real time conveniently, the requirement of polygon selection is met, and therefore the convenience of operation is improved.
Optionally, on the basis of the foregoing respective embodiments corresponding to fig. 3, in another optional embodiment provided by the present application, when a click operation for the interactive operation panel is detected, the method may further include:
And displaying contact position information corresponding to the clicking operation, wherein the contact position information is used for describing the real-time position of the contact corresponding to the clicking operation in the interactive operation panel.
In one or more embodiments, a manner of displaying location information based on a click operation is presented. As can be seen from the foregoing embodiments, when the clicking operation is performed, the point where the cursor or finger contacts the interactive operation panel is the contact point. Based on this, the contact point position information can be displayed on the projection editing interface.
Specifically, for ease of understanding, please refer again to fig. 7 (A), where D4 is used to indicate the contact point generated by the user's current click, and D6 is used to indicate the real-time position of the contact point. The contact position information corresponding to the contact point indicated by D4 is (120.00, 60.00).
In the embodiment of the application, a mode for displaying position information based on clicking operation is provided. By the method, the position information of the touch point can be displayed based on clicking operation, so that a user can know the frame selection condition of the interaction area conveniently, and corresponding adjustment can be made in time.
Optionally, on the basis of the foregoing respective embodiments corresponding to fig. 3, another optional embodiment provided by the embodiment of the present application may further include:
if the first original interaction area and the second original interaction area have the coincident area, responding to the hierarchy setting operation aiming at the first original interaction area, wherein the second original interaction area is used for determining the position information of the second target interaction area in the interaction projection picture;
Under the condition that the hierarchy priority of the first original interaction area is higher than that of the second original interaction area, when the touch point of the touch operation is located in the overlapping area of the first target interaction area and the second target interaction area, judging the position relationship between the touch point of the touch operation and the first target interaction area;
And under the condition that the hierarchy priority of the second original interaction region is higher than that of the first original interaction region, when the touch point of the touch operation is positioned in the overlapping region of the first target interaction region and the second target interaction region, judging the position relationship between the touch point of the touch operation and the second target interaction region.
In one or more embodiments, a manner of customizing interactive region hierarchies is presented. In the foregoing embodiment, the user may select a plurality of original interaction areas in the interactive operation panel, and if the plurality of original interaction areas overlap, the hierarchical priority corresponding to each of the original interaction areas may also be set.
Specifically, for ease of understanding, please refer to fig. 8, fig. 8 is a schematic diagram illustrating a setting of a hierarchical priority of an original interaction region according to an embodiment of the present application, as shown in fig. 8 (a), E1 is used to indicate a first original interaction region, and E2 is used to indicate a second original interaction region. It can be seen that there is a coincidence region between the first original interaction region and the second original interaction region. Based on this, the user can select to perform a hierarchy setting operation on the first original interaction region.
Illustratively, the user clicks on the first original interactive region indicated by E1, thereby displaying the hierarchical priority of the first original interactive region within the region indicated by E3. As shown in fig. 8 (a), the first original interaction region has a hierarchical priority of "1". Based on the above, if the touch point of the touch operation is located in the overlapping area of the first target interaction area and the second target interaction area, the positional relationship between the touch point of the touch operation and the first target interaction area is preferentially judged. And if the touch point is in the first target interaction area, executing an event corresponding to the first target interaction area.
Illustratively, the user clicks on the second original interactive region indicated by E2, thereby displaying the hierarchical priority of the second original interactive region within the region indicated by E3. As shown in fig. 8 (B), the hierarchical priority of the second original interaction region is "1". Based on the above, if the touch point of the touch operation is located in the overlapping area of the first target interaction area and the second target interaction area, the position relationship between the touch point of the touch operation and the second target interaction area is preferentially judged. And if the touch point is in the second target interaction area, executing an event corresponding to the second target interaction area.
It should be noted that, since the original interaction area and the target interaction area have a one-to-one correspondence, the hierarchical priority of the target interaction area can be determined by setting the hierarchical priority of the original interaction area. Based on the above, when the touch point of the touch operation is located in the plurality of target interaction areas, the event corresponding to the target interaction area with the highest hierarchical priority is preferentially triggered.
In the embodiment of the application, a method for customizing the hierarchical relationship of interaction areas is provided. By this method, the user can define a corresponding hierarchy for each original interaction area. Therefore, when overlapping touch areas exist, the interaction area actually touched can be effectively distinguished, thereby improving the debugging efficiency of interactive projection development.
Optionally, on the basis of the foregoing respective embodiments corresponding to fig. 3, in another optional embodiment provided by the present application, displaying, on the projection editing interface, an interaction setting panel for the first original interaction area may specifically include:
When a click operation for the first original interaction area is detected, displaying an interaction setting panel for the first original interaction area on the projection editing interface, wherein the interaction setting panel is used for setting a key value corresponding to the click operation type;
When a long-press operation for the first original interaction area is detected, displaying an interaction setting panel for the first original interaction area on the projection editing interface, wherein the interaction setting panel is used for setting a key value corresponding to the long-press operation type;
When a directional swipe operation for the first original interaction area is detected, displaying an interaction setting panel for the first original interaction area on the projection editing interface, wherein the interaction setting panel is used for setting a key value corresponding to the directional swipe operation type.
In one or more embodiments, a manner of setting the type of interactive zone operation is presented. From the foregoing embodiments, it is also possible for a user (e.g., debugger, developer) to customize the types of operations supported by each original interaction region. Based on this, the user can configure key values for the original interaction areas corresponding to different operation types.
Specifically, for ease of understanding, referring to fig. 9, fig. 9 is a schematic diagram illustrating an operation type of setting an original interaction region according to an embodiment of the present application, as shown in fig. 9 (a), F1 is used to indicate a first original interaction region, and F2 is used to indicate a key value setting region. When the user performs a click operation within the first original interaction region indicated by F1, a key value, for example, "0001", may be input in the key value setting region indicated by F2. Wherein "0001" is the key value of the first original interaction region corresponding to the click operation type.
For example, as shown in fig. 9 (B), when the user performs a long press operation within the first original interaction region indicated by F1, a key value, for example, "0002", may be input in the key value setting region indicated by F2. Wherein "0002" is the key value of the first original interaction area corresponding to the long-press operation type.
For example, as shown in fig. 9 (C), when the user performs a directional swipe operation in the first original interaction region indicated by F1, a key value, for example, "0003", may be input in the key value setting region indicated by F2. Wherein "0003" is the key value of the first original interaction area corresponding to the directional swipe operation type.
It should be noted that, since the original interaction area has a one-to-one correspondence with the target interaction area, the operation type supported by the target interaction area can be determined by setting the operation type of the original interaction area. For example, the first original interaction region corresponds to three operation types, and then the first target interaction region also corresponds to three operation types. For example, when the user triggers a click operation through the interactive projection screen, an event corresponding to "0001" may be performed. For example, when the user triggers a long press operation through the interactive projection screen, an event corresponding to "0002" may be executed. For example, when the user triggers a directional swipe operation through the interactive projection screen, an event corresponding to "0003" may be performed.
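As a hedged sketch, the mapping from operation types to key values described above can be modeled as a lookup table. The event names below are illustrative assumptions, since the text only fixes the key values "0001" to "0003" for the first interaction region and requires the key-value-to-event mapping to be unique.

```python
# Key values configured per (region, operation type) pair in the
# interaction setting panel; each key value maps one-to-one onto a target event.
KEY_VALUES = {
    ("first_region", "click"): "0001",
    ("first_region", "long_press"): "0002",
    ("first_region", "directional_swipe"): "0003",
}

# Hypothetical target events; only the uniqueness of the mapping is required.
TARGET_EVENTS = {
    "0001": "audio_play_event",
    "0002": "menu_selection_event",
    "0003": "page_flip_event",
}

def lookup_target_event(region: str, operation_type: str):
    """Return the target event for a region/operation pair, or None when the
    region does not support that operation type."""
    key_value = KEY_VALUES.get((region, operation_type))
    return TARGET_EVENTS.get(key_value) if key_value else None
```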
Further, in the embodiment of the application, a manner of setting the operation type of an interaction area is provided. In this way, when the interaction area is edited, the operation type can be set according to service requirements. On this basis, in practical application, the click operation, long-press operation, directional swipe operation and the like of the user are supported, thereby improving the flexibility of the scheme.
Optionally, on the basis of the foregoing respective embodiments corresponding to fig. 3, in another optional embodiment provided by the present application, after responding to the area selection instruction for the interactive operation panel and displaying the first original interaction area in the interactive operation panel according to the area selection instruction, the method may further include:
Acquiring a length ratio and a width ratio between the interactive projection picture and the interactive operation panel, wherein the length ratio represents a ratio between the corresponding length of the interactive projection picture and the corresponding length of the interactive operation panel, and the width ratio represents a ratio between the corresponding width of the interactive projection picture and the corresponding width of the interactive operation panel;
And determining the position information of the first target interaction region in the interaction projection picture according to the position information, the length ratio and the width ratio of the first original interaction region in the interaction operation panel.
In one or more embodiments, a manner of determining corresponding location information for a target interaction region is presented. As can be seen from the foregoing embodiments, after the user (e.g., debugger, developer) creates the original interaction region in the interactive operation panel, the original interaction region can be scaled in equal proportion to obtain the target interaction region in the interactive projection screen.
Specifically, assume that the size of the interactive operation panel is 600×800 and the size of the interactive projection screen is 1200×1600. Based on this, the length ratio between the interactive projection screen and the interactive operation panel is 2 (i.e., 1200/600=2), and the width ratio between the interactive projection screen and the interactive operation panel is 2 (i.e., 1600/800=2). Taking the first original interaction region as a rectangular region as an example, the coordinate information of its top-left vertex is (100, 500), of its top-right vertex is (300, 500), of its bottom-right vertex is (300, 80), and of its bottom-left vertex is (100, 80). Then, according to the position information (i.e., the coordinate information of each vertex) of the first original interaction region in the interactive operation panel, the length ratio, and the width ratio, the position information (i.e., the coordinate information of each vertex) of the first target interaction region in the interactive projection screen can be calculated.
Illustratively, the coordinate information of the top-left vertex of the first original interaction region on the interactive operation panel is (100, 500); processing this coordinate information according to the length ratio and the width ratio gives (200, 1000) as the coordinate information of the top-left vertex of the first target interaction region on the interactive projection screen.
Illustratively, the coordinate information of the top-right vertex of the first original interaction region on the interactive operation panel is (300, 500); processing this coordinate information according to the length ratio and the width ratio gives (600, 1000) as the coordinate information of the top-right vertex of the first target interaction region on the interactive projection screen.
Illustratively, the coordinate information of the bottom-right vertex of the first original interaction region on the interactive operation panel is (300, 80); processing this coordinate information according to the length ratio and the width ratio gives (600, 160) as the coordinate information of the bottom-right vertex of the first target interaction region on the interactive projection screen.
Illustratively, the coordinate information of the bottom-left vertex of the first original interaction region on the interactive operation panel is (100, 80); processing this coordinate information according to the length ratio and the width ratio gives (200, 160) as the coordinate information of the bottom-left vertex of the first target interaction region on the interactive projection screen.
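The scaling described above reduces to multiplying each x coordinate by the length ratio and each y coordinate by the width ratio. A minimal sketch, using the worked numbers from this example (function and variable names are illustrative):

```python
def scale_region(vertices, panel_size, screen_size):
    """Map vertices of an original interaction region on the interactive
    operation panel to the corresponding target interaction region on the
    interactive projection screen."""
    length_ratio = screen_size[0] / panel_size[0]   # 1200 / 600 = 2
    width_ratio = screen_size[1] / panel_size[1]    # 1600 / 800 = 2
    return [(x * length_ratio, y * width_ratio) for x, y in vertices]

panel_region = [(100, 500), (300, 500), (300, 80), (100, 80)]
print(scale_region(panel_region, panel_size=(600, 800), screen_size=(1200, 1600)))
# [(200.0, 1000.0), (600.0, 1000.0), (600.0, 160.0), (200.0, 160.0)]
```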
In the embodiment of the application, a method for determining the position information corresponding to the target interaction area is provided. By this method, the position information of the target interaction area on the interactive projection screen is determined based on the position information of the original interaction area on the interactive operation panel. Therefore, the setting of the interaction area is highly portable, so that equipment deployment cost can be saved and the debugging efficiency of projection interaction improved.
Optionally, on the basis of the foregoing respective embodiments corresponding to fig. 3, in another optional embodiment provided by the present application, the touch operation is a click operation or a long press operation;
when the touch operation for the interactive projection screen is detected, the method may further include:
acquiring target position information of a touch point of a touch operation, wherein the target position information is used for describing the position of the touch point in an interactive projection picture;
If the first target interaction area is determined according to the target position information, acquiring a position inclusion relation between a touch point of the touch operation and the first target interaction area;
And if the position containing relation indicates that the touch point of the touch operation is positioned in the first target interaction area, determining that the event triggering condition is met.
In one or more embodiments, a manner of determining whether a single touch point is located in the first target interaction region is described. In the foregoing embodiment, when a touch operation for the interactive projection screen is detected and the touch operation is a click operation or a long-press operation, the target position information of the corresponding touch point needs to be further determined. If the target interaction region closest to the touch point is determined to be the first target interaction region according to the target position information, the position inclusion relationship between the touch point and the first target interaction region is then determined. When the touch point is located within the first target interaction region, the positional relationship between the touch point and the first target interaction region satisfies the event trigger condition.
In the embodiment of the application, a method for determining whether a single touch point is located in a first target interaction area is provided. By the method, the position relation between the single touch point and the target interaction area can be identified, so that whether the corresponding event needs to be executed is further judged. Thereby, the feasibility and operability of the scheme is increased.
Optionally, based on the foregoing respective embodiments corresponding to fig. 3, in another optional embodiment provided by the present application, obtaining target position information of a touch point of a touch operation may specifically include:
responding to touch operation for the interactive projection picture, and acquiring an original image, wherein the original image comprises a bright spot area which is generated according to the touch operation;
Performing binarization processing on the original image to obtain a binarized image;
performing contour extraction processing on the binarized image to obtain an image to be processed, wherein the image to be processed comprises a region to be processed;
performing contour filling processing on a region to be processed in the image to be processed to obtain a target image, wherein the target image comprises a target rectangular region;
And taking the central position information of the target rectangular area included in the target image as the target position information of the touch point of the touch operation.
In one or more embodiments, a manner of determining the target location information corresponding to a touch point is described. As can be seen from the foregoing embodiments, when a user (e.g., a participant) performs a touch operation on the interactive projection screen, the target position information of the touch point can be determined from that operation.
Specifically, for ease of understanding, referring to fig. 10, fig. 10 is a schematic diagram illustrating determination of the target position information of a touch point according to an embodiment of the present application. When a user (e.g., a participant) performs a touch operation on the interactive projection screen, a signal acquisition device captures an original image. The original image includes a bright spot area generated by the touch operation; for example, the contact area between a finger and the interactive projection screen serves as the bright spot area.
Based on this, the original image is subjected to binarization processing to obtain a binarized image. The binarization processing of the image is to set the gray value of the pixel point on the image to 0 or 255, so that the whole image presents obvious black-white effect. As shown, white pixels in the binarized image belong to the bright spot region. And then, carrying out contour extraction processing on the binarized image to obtain a to-be-processed image comprising the to-be-processed area. Wherein the extracted contour constitutes the region to be treated. Next, contour filling processing is performed on the region to be processed, and a target image including a target rectangular region is obtained. The target rectangular area is an circumscribed rectangular frame of the area to be processed. And finally, calculating the position information of the center of the target rectangular area, and taking the position information corresponding to the center position as target position information.
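A minimal sketch of this pipeline follows, assuming OpenCV as the image-processing library (an assumption; the patent does not name one) and an illustrative brightness threshold:

```python
import cv2

def touch_point_from_frame(original_image, threshold=200):
    """Binarize the captured frame, extract the bright-spot contour, fill it
    via its bounding rectangle, and return the rectangle center as the
    target position information of the touch point."""
    gray = cv2.cvtColor(original_image, cv2.COLOR_BGR2GRAY)
    # Binarization: pixels above the threshold become 255, the rest 0
    _, binary = cv2.threshold(gray, threshold, 255, cv2.THRESH_BINARY)
    # Contour extraction on the binarized image
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None   # no bright spot, hence no touch
    # Assume the largest contour is the bright spot produced by the touch
    spot = max(contours, key=cv2.contourArea)
    x, y, w, h = cv2.boundingRect(spot)   # circumscribed (target) rectangle
    return (x + w // 2, y + h // 2)       # center position information
```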
In the embodiment of the application, a method for determining the target position information corresponding to the touch point is provided. Through the mode, the computer equipment can continuously perform image recognition processing on the interactive projection picture, and in the process, the image content of the non-bright spot area is filtered by utilizing an image binarization algorithm, so that the target position information of the touch point is obtained. Thereby improving the feasibility and operability of the scheme.
Optionally, on the basis of the foregoing respective embodiments corresponding to fig. 3, another optional embodiment provided by the embodiment of the present application may further include:
Acquiring position information of each target interaction region in K target interaction regions contained in an interaction projection picture, wherein K is an integer greater than or equal to 1;
Determining K distance information according to the target position information and the position information of each target interaction area;
determining minimum distance information from the K pieces of distance information;
And if the minimum distance information is determined based on the target position information and the position information of the first target interaction area, executing the step of acquiring the position inclusion relation between the touch point of the touch operation and the first target interaction area.
In one or more embodiments, a manner of determining a target interaction region based on target location information is presented. In the foregoing embodiments, the interactive projection screen may include a plurality of interaction areas; therefore, whether the user interacts with a target interaction area can be determined based on the distance information between the target position information and each interaction area. If so, the target interaction area being interacted with is further determined.
Specifically, assume that the interactive projection screen includes three target interaction regions: a first target interaction region, a second target interaction region, and a third target interaction region. The location information of each target interaction region may be the central location information of that region. For ease of understanding, refer to table 1, which lists the distance information between the touch point and each target interaction region.
TABLE 1
Target interaction region        | Distance information to touch point
First target interaction region  | 40
Second target interaction region | 60
Third target interaction region  | 120
As shown in table 1, the distance information between the touch point and the first target interaction region is the smallest. Therefore, the position inclusion relationship between the touch point and the first target interaction region is further obtained. Similarly, if the distance information between the touch point and the second target interaction region were the smallest, the position inclusion relationship between the touch point and the second target interaction region would be obtained instead, and so on; this is not repeated here.
It will be appreciated that, in practical application, a distance threshold may also be applied to the distance information: a distance satisfies the valid touch condition only if it is less than or equal to the threshold (e.g., 100). The distance information satisfying the valid touch condition can then be sorted, and the minimum distance information selected.
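A sketch of this selection step, assuming each region exposes its center position and using the illustrative threshold of 100 from the text:

```python
import math

def nearest_valid_region(touch_point, regions, distance_threshold=100):
    """Keep only regions whose center-to-touch-point distance satisfies the
    valid touch condition (<= threshold) and return the closest one."""
    candidates = []
    for region in regions:
        cx, cy = region["center"]
        distance = math.hypot(touch_point[0] - cx, touch_point[1] - cy)
        if distance <= distance_threshold:
            candidates.append((distance, region))
    if not candidates:
        return None   # the touch does not interact with any target region
    # Minimum distance information wins, e.g., the first region at distance 40
    return min(candidates, key=lambda pair: pair[0])[1]
```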
In an embodiment of the present application, a method for determining a target interaction area based on target location information is provided. By the method, the target interaction area closest to the touch point can be determined according to the target position information, so that the purpose of user operation can be accurately identified, and the effect of projection interaction is improved.
Optionally, based on the foregoing respective embodiments corresponding to fig. 3, in another optional embodiment provided by the present application, acquiring a position inclusion relationship between a touch point of a touch operation and a first target interaction area may specifically include:
constructing a ray based on a touch point of a touch operation;
acquiring the number of intersection points generated by the ray and the first target interaction area;
if the number of the intersection points is an odd number, determining that the position inclusion relation is used for indicating touch points of the touch operation to be positioned in the first target interaction area;
If the number of the intersection points is even, determining that the position inclusion relation is used for indicating that the touch point of the touch operation is located outside the first target interaction area.
In one or more embodiments, a manner of determining the position inclusion relationship using the ray casting method is presented. As described above, the positional relationship between the touch point of the touch operation and a target interaction region can be determined based on the ray method. The present application is described by taking the determination of the positional relationship between the touch point and the first target interaction region as an example.
It will be appreciated that a ray can cross the boundary of a region in only two ways: it either enters the region or exits it. When a point is outside the region, every entry of the ray into the region is matched by an exit, so the total number of boundary crossings is even. When a point is inside the region, the ray's first crossing of the boundary is an exit, and each subsequent entry is again matched by an exit, so the total number of crossings is odd.
Specifically, for ease of understanding, referring to fig. 11, fig. 11 is a schematic diagram illustrating determining a positional relationship between a touch point and a target interaction area in an embodiment of the present application, as shown in the drawing, a ray is first constructed based on the touch point, and then the number of intersections between the ray and the target interaction area is calculated. Based on the above principle, if the number of the intersecting points is odd, it is determined that the touch point is located in the first target interaction area. Otherwise, if the number of the intersection points is even, determining that the touch point is located outside the first target interaction area. For example, assuming that the target interaction area shown in fig. 11 is the first target interaction area, it can be seen that the number of intersections generated by the ray and the first target interaction area is 5, and therefore, it can be determined that the touch point is located within the first target interaction area, that is, the position inclusion relationship is used to indicate that the touch point of the touch operation is located within the first target interaction area.
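A standard ray-casting sketch consistent with this description (the patent does not prescribe an implementation): a horizontal ray is shot from the touch point toward +x, and the boundary crossings are counted.

```python
def point_in_polygon(point, vertices):
    """Return True if the touch point lies inside the polygonal region:
    an odd number of ray/boundary intersections means inside,
    an even number means outside."""
    x, y = point
    inside = False
    n = len(vertices)
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]
        # Does this edge cross the horizontal line through the point?
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x_cross > x:          # intersection lies on the ray, to the right
                inside = not inside  # each crossing flips the parity
    return inside
```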
In the embodiment of the application, a method for determining the position inclusion relationship by using a ray method is provided. By the mode, the ray method is suitable for judging the relation between the points and the polygons, is relatively simple to realize and high in speed, and therefore response efficiency of projection interaction is improved.
Optionally, on the basis of the foregoing respective embodiments corresponding to fig. 3, in another optional embodiment provided in the present application, the touch operation is a swipe operation;
when the touch operation for the interactive projection screen is detected, the method may further include:
Acquiring initial position information of an initial touch point of touch operation and end position information of an end touch point of touch operation, wherein the initial position information is used for describing the position of the initial touch point in an interactive projection picture, and the end position information is used for describing the position of the end touch point in the interactive projection picture;
Acquiring a position inclusion relation according to the initial position information and the end position information;
If the position containing relation indicates that the initial touch point and the end touch point are both located in the first target interaction area, determining that the event triggering condition is met.
In one or more embodiments, a method of determining whether a plurality of touch points are located in a first target interaction region is described. In the foregoing embodiments, when a touch operation is detected for the interactive projection screen, if the touch operation is a swipe operation, it is necessary to further determine the start position information of the start touch point and the end position information of the end touch point corresponding to the touch operation. Based on this, the position inclusion relationship may be acquired from the start position information and/or the end position information.
Specifically, in one case, if the target interaction region closest to the start touch point is determined to be the first target interaction region according to the start position information, the position inclusion relationships between each of the start and end touch points and the first target interaction region are then determined. In another case, if the target interaction region closest to the end touch point is determined to be the first target interaction region according to the end position information, the position inclusion relationships between each of the start and end touch points and the first target interaction region are likewise determined. If both the start touch point and the end touch point are located within the first target interaction region, it is determined that the event trigger condition is satisfied.
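Building on the `point_in_polygon` sketch above, the swipe trigger condition reduces to two inclusion tests (again an illustrative sketch, not the patent's prescribed implementation):

```python
def swipe_satisfies_trigger(start_point, end_point, region_vertices):
    """A swipe satisfies the event trigger condition only if both its start
    touch point and its end touch point fall inside the first target
    interaction region."""
    return (point_in_polygon(start_point, region_vertices)
            and point_in_polygon(end_point, region_vertices))
```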
In another embodiment of the present application, a method for determining whether a plurality of touch points are located in a first target interaction area is provided. By the method, the position relation between the touch points and the target interaction area can be identified, so that whether the corresponding event needs to be executed is further judged. Thereby, the feasibility and operability of the scheme is increased.
Optionally, on the basis of the foregoing respective embodiments corresponding to fig. 3, in another optional embodiment provided by the embodiment of the present application, the target event is an audio play event;
The method may further include:
displaying an audio playing control in the interactive projection picture, wherein the audio playing control is positioned in a first target interaction area;
if the position relationship between the touch point of the touch operation and the first target interaction area meets the event triggering condition, controlling at least one device to execute the target event may specifically include:
If the touch point of the touch operation is located in the first target interaction area, determining that an event triggering condition is met, and controlling the audio equipment to play an audio file corresponding to the audio playing event.
In one or more embodiments, a manner of enabling audio playback based on interactive projection is presented. As can be seen from the foregoing embodiments, the target event may be an audio playing event, and based on this, the audio device may be controlled to play an audio file corresponding to the audio playing event when it is determined to execute the audio playing event. Optionally, the projection device may also be controlled to project an animation effect (e.g., an animation effect of a presentation player) corresponding to the audio play event.
Specifically, for ease of understanding, referring to fig. 12, fig. 12 is a schematic diagram illustrating an implementation of audio playback based on interactive projection according to an embodiment of the present application, as shown in the drawing, an audio playback control indicated by G1 is displayed in an interactive projection screen, where the audio playback control is located in a first target interaction area, and the first target interaction area is a target interaction area determined according to a first original interaction area. When the user clicks the audio playing control indicated by G1, the touch point of the touch operation is located in the first target interaction area. Thus, the computer device may control the audio device to play the audio file corresponding to the audio play event. For example, song A is played.
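As an illustration of the control step, a hedged dispatch sketch follows. The device interfaces (`play`, `show_animation`) and file names are assumptions, not APIs from the patent; the key values reuse the illustrative "0001"/"0002" from fig. 9.

```python
def execute_target_event(key_value, audio_device, projection_device):
    """Once the event trigger condition is met, resolve the target event by
    its key value and drive the corresponding device(s)."""
    if key_value == "0001":       # e.g., audio play event
        audio_device.play("song_a.mp3")
        # Optionally also project a player animation effect
        projection_device.show_animation("player_effect")
    elif key_value == "0002":     # e.g., menu selection event
        projection_device.show_animation("menu_selected")
```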
In the embodiment of the application, a mode for realizing audio playing based on interactive projection is provided. By the mode, the audio playing control is displayed in the interactive projection picture, and a user can perform touch operation on the audio playing control. Based on the method, after the computer equipment responds to the touch operation of the user, the audio equipment can be controlled to play the audio file, so that the flexibility and the interestingness of the interactive projection are improved.
Optionally, on the basis of the foregoing respective embodiments corresponding to fig. 3, in another optional embodiment provided by the embodiment of the present application, the target event is a menu selection event;
The method may further include:
displaying a menu selection control in the interactive projection picture, wherein the menu selection control is positioned in a first target interaction area;
if the position relationship between the touch point of the touch operation and the first target interaction area meets the event triggering condition, controlling at least one device to execute the target event may specifically include:
If the touch point of the touch operation is located in the first target interaction area, determining that an event triggering condition is met, and controlling the projection equipment to project an animation effect corresponding to the menu selection event.
In one or more embodiments, a manner of implementing menu selection based on interactive projection is presented. As can be seen from the foregoing embodiments, the target event may be a menu selection event, and based on this, in a case where it is determined to execute the menu selection event, the projection device may be controlled to project an animation effect corresponding to the menu selection event. Optionally, the audio device may also be controlled to play the corresponding audio (e.g., "you have successfully ordered dishes").
Specifically, for ease of understanding, referring to fig. 13, fig. 13 is a schematic diagram illustrating implementation of menu selection based on interactive projection in an embodiment of the present application. As shown in the drawing, a menu selection control indicated by H1 is displayed in the interactive projection screen, where the menu selection control is located in the first target interaction area, and the first target interaction area is the target interaction area determined according to the first original interaction area. When the user clicks the menu selection control indicated by H1, the touch point of the touch operation is located in the first target interaction area. The computer device may then control the projection device to project the animation effect corresponding to the menu selection event. For example, the selection effect corresponding to "preserved vegetable braised meat" may be highlighted.
Secondly, in the embodiment of the application, a mode for realizing menu selection based on interactive projection is provided. By the mode, the menu selection control is displayed in the interactive projection picture, and a user can perform touch operation on the menu selection control. Based on the method, after the computer equipment responds to the touch operation of the user, the projection equipment can be controlled to project the animation effect corresponding to the menu selection event, so that the flexibility and the interestingness of the interactive projection are improved.
Optionally, on the basis of the foregoing respective embodiments corresponding to fig. 3, in another optional embodiment provided by the embodiment of the present application, the target event is a page flip event;
The method may further include:
displaying a current page in the interactive projection picture;
if the position relationship between the touch point of the touch operation and the first target interaction area meets the event triggering condition, controlling at least one device to execute the target event may specifically include:
if the initial touch point of the touch operation and the final touch point of the touch operation are both located in the first target interaction area, determining that the event triggering condition is met, and controlling the projection device to project a previous page or a next page of the current page.
In one or more embodiments, a manner of implementing electronic reading based on interactive projection is presented. As can be seen from the foregoing embodiments, the target event may be a page flip event, based on which, in the case that it is determined to perform the page flip event, a previous page or a next page of the current page may be controlled to be projected. For example, if the touch operation is a right swipe operation, the control projects a page subsequent to the current page. For another example, if the touch operation is a left swipe operation, the previous page of the current page is controlled to be projected.
Specifically, for ease of understanding, referring to fig. 14, fig. 14 is a schematic diagram illustrating an implementation of electronic reading based on interactive projection according to an embodiment of the present application, and as shown in the drawing, a current page indicated by I1 is displayed in an interactive projection screen. It is assumed that the first target interaction region is located at the lower right of the interaction projection screen. Based on this, when the user swipes from left to right at the lower right of the current page, it can be determined that the touch point of the touch operation is located within the first target interaction region. Thus, the computer device may control the projection device to project a page subsequent to the current page, e.g., to project a page subsequent to that indicated as I2.
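A sketch of the direction decision, assuming screen x grows to the right (names are illustrative):

```python
def page_after_swipe(current_page_index, start_point, end_point):
    """Pick the page to project from the swipe direction: a left-to-right
    swipe flips to the next page, a right-to-left swipe to the previous one."""
    dx = end_point[0] - start_point[0]
    if dx > 0:
        return current_page_index + 1   # right swipe -> next page
    if dx < 0:
        return current_page_index - 1   # left swipe -> previous page
    return current_page_index           # no horizontal movement
```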
In the embodiment of the application, a mode for realizing electronic reading based on interactive projection is provided. By the method, the current page is displayed in the interactive projection picture, and the user can perform touch operation on the current page. Based on the method, after the computer equipment responds to the touch operation of the user, the projection equipment can be controlled to project the previous page or the next page of the current page, so that the flexibility and the interestingness of the interactive projection are improved.
The event control device in the present application is described in detail below with reference to fig. 15. Fig. 15 is a schematic diagram of an embodiment of the event control device according to the present application. The event control device 30 includes:
The display module 310 is configured to display a projection editing interface, where the projection editing interface provides an interactive operation panel;
The display module 310 is further configured to respond to a region selection instruction for the interactive operation panel, and display a first original interaction region in the interactive operation panel according to the region selection instruction, where the first original interaction region is used to determine position information of a first target interaction region in an interactive projection screen;
The obtaining module 320 is configured to obtain a target key value corresponding to the first original interaction area in response to a key value input operation of the interaction setting panel for the first original interaction area, where the target key value has a unique mapping relationship with the target event;
The control module 330 is configured to, when the projection device projects the interactive projection screen and a touch operation for the interactive projection screen is detected, control at least one device to execute the target event if a positional relationship between a touch point of the touch operation and the first target interaction region satisfies an event triggering condition.
In an embodiment of the application, an event control device is provided. By adopting the device, the user can visually edit the original interaction area by freely selecting a region. On this basis, the original interaction area is directly mapped into the interactive projection screen to form the corresponding target interaction area, giving the deployment good portability. Therefore, the scheme provided by the application not only saves equipment deployment cost but also improves the debugging efficiency of projection interaction.
Alternatively, in another embodiment of the event control device 30 provided in the embodiment of the present application based on the embodiment corresponding to fig. 15,
The display module 310 is specifically configured to trigger an area selection instruction when a continuous contact operation with the interactive operation panel is maintained, and display a track corresponding to the continuous contact operation in the interactive operation panel;
When the end of the continuous contact operation is detected, a first original interaction area is displayed in the interactive operation panel according to the track corresponding to the continuous contact operation.
In an embodiment of the application, an event control device is provided. By adopting the device, based on the continuous contact operation of a user, the corresponding stroke track can be displayed in the interactive operation panel. This achieves a visualized selection effect and makes it convenient for the user to observe the area selection in real time.
Alternatively, in another embodiment of the event control device 30 provided in the embodiment of the present application based on the embodiment corresponding to fig. 15,
The display module 310 is further configured to display contact position information corresponding to the continuous contact operation when the continuous contact operation with the interactive operation panel is maintained, where the contact position information is used to describe a real-time position of a contact corresponding to the continuous contact operation in the interactive operation panel.
In an embodiment of the application, an event control device is provided. By adopting the device, the real-time position information of the touch point can be displayed based on continuous contact operation, so that a user can know the frame selection condition of the interaction area conveniently, and corresponding adjustment can be made in time.
Alternatively, in another embodiment of the event control device 30 provided in the embodiment of the present application based on the embodiment corresponding to fig. 15,
The display module 310 is specifically configured to, when a click operation for the interactive operation panel is detected, trigger an area selection instruction, and display, in the interactive operation panel, each contact corresponding to the click operations and the connecting edge between every two adjacent contacts, where two adjacent contacts are the contacts generated by two consecutive click operations;
and displaying a first original interaction area in the interactive operation panel according to the connecting edge between every two adjacent contacts.
In an embodiment of the application, an event control device is provided. By adopting the device, based on the click operations of a user, the corresponding contacts and the connecting edges between them can be displayed in the interactive operation panel. This achieves a visualized selection effect, makes it convenient for the user to observe the area selection in real time, and meets the requirement of polygon selection, thereby improving the convenience of operation.
Alternatively, in another embodiment of the event control device 30 provided in the embodiment of the present application based on the embodiment corresponding to fig. 15,
The display module 310 is further configured to display contact location information corresponding to a click operation when the click operation for the interactive operation panel is detected, where the contact location information is used to describe a real-time location of a contact corresponding to the click operation in the interactive operation panel.
In an embodiment of the application, an event control device is provided. By adopting the device, the position information of the touch point can be displayed based on clicking operation, so that a user can know the frame selection condition of the interaction area conveniently, and corresponding adjustment can be made in time.
Optionally, in another embodiment of the event control device 30 provided in the embodiment of the present application based on the embodiment corresponding to fig. 15, the event control device 30 further includes an operation module 340 and a determining module 350;
The operation module 340 is configured to respond to a hierarchical setting operation for the first original interaction region if there is a coincidence region between the first original interaction region and the second original interaction region, where the second original interaction region is used to determine position information of the second target interaction region in the interactive projection screen;
The determining module 350 is configured to determine, when the level priority of the first original interaction region is higher than the level priority of the second original interaction region and the touch point of the touch operation is located in a region overlapping the first target interaction region and the second target interaction region, a positional relationship between the touch point of the touch operation and the first target interaction region;
The determining module 350 is further configured to determine, when the touch point of the touch operation is located in the overlapping area of the first target interaction area and the second target interaction area under the condition that the level priority of the second original interaction area is higher than the level priority of the first original interaction area, a positional relationship between the touch point of the touch operation and the second target interaction area.
In an embodiment of the application, an event control device is provided. By adopting the device, the user can define a corresponding hierarchy for each original interaction area. Therefore, when overlapping touch areas exist, the interaction area actually touched can be effectively distinguished, thereby improving the debugging efficiency of interactive projection development.
Alternatively, in another embodiment of the event control device 30 provided in the embodiment of the present application based on the embodiment corresponding to fig. 15,
The display module 310 is specifically configured to display, on the projection editing interface, an interaction setting panel for the first original interaction area when a click operation for the first original interaction area is detected, where the interaction setting panel is configured to set a key value corresponding to a type of the click operation;
When a long-press operation for the first original interaction area is detected, displaying an interaction setting panel for the first original interaction area on the projection editing interface, wherein the interaction setting panel is used for setting a key value corresponding to the long-press operation type;
When a directional swipe operation for the first original interaction area is detected, displaying an interaction setting panel for the first original interaction area on the projection editing interface, wherein the interaction setting panel is used for setting a key value corresponding to the directional swipe operation type.
In an embodiment of the application, an event control device is provided. By adopting the device, the operation type can be set according to the service requirement when the interactive area is edited. Based on the above, in practical application, the click operation, the long press operation, the directional swipe operation and the like of the user are supported, so that the flexibility of the scheme is improved.
Optionally, on the basis of the embodiment corresponding to fig. 15, in another embodiment of the event control device 30 provided in the embodiment of the present application, the event control device 30 further includes a determining module 360;
The obtaining module 320 is further configured to, after the first original interaction area is displayed in the interactive operation panel in response to the area selection instruction for the interactive operation panel, obtain a length ratio and a width ratio between the interactive projection screen and the interactive operation panel, where the length ratio represents the ratio between the corresponding length of the interactive projection screen and the corresponding length of the interactive operation panel, and the width ratio represents the ratio between the corresponding width of the interactive projection screen and the corresponding width of the interactive operation panel;
The determining module 360 is configured to determine the position information of the first target interaction region in the interaction projection screen according to the position information, the length ratio and the width ratio of the first original interaction region in the interaction operation panel.
In an embodiment of the application, an event control device is provided. By adopting the device, the position information of the target interaction area on the interactive projection screen is determined based on the position information of the original interaction area on the interactive operation panel. Therefore, the setting of the interaction area is highly portable, so that equipment deployment cost can be saved and the debugging efficiency of projection interaction improved.
Optionally, on the basis of the embodiment corresponding to fig. 15, in another embodiment of the event control device 30 provided in the embodiment of the present application, the touch operation is a click operation or a long press operation;
The obtaining module 320 is further configured to obtain, when a touch operation for the interactive projection screen is detected, target position information of a touch point of the touch operation, where the target position information is used to describe a position of the touch point in the interactive projection screen;
The obtaining module 320 is further configured to obtain a position inclusion relationship between a touch point of the touch operation and the first target interaction region if the first target interaction region is determined according to the target position information;
the determining module 360 is further configured to determine that the event triggering condition is met if the location inclusion relationship indicates that the touch point of the touch operation is located within the first target interaction region.
In an embodiment of the application, an event control device is provided. By adopting the device, the position relation between the single touch point and the target interaction area can be identified, so that whether the corresponding event needs to be executed is further judged. Thereby, the feasibility and operability of the scheme is increased.
Alternatively, in another embodiment of the event control device 30 provided in the embodiment of the present application based on the embodiment corresponding to fig. 15,
The obtaining module 320 is specifically configured to obtain an original image in response to a touch operation for an interactive projection screen, where the original image includes a bright spot area, and the bright spot area is generated according to the touch operation;
Performing binarization processing on the original image to obtain a binarized image;
performing contour extraction processing on the binarized image to obtain an image to be processed, wherein the image to be processed comprises a region to be processed;
performing contour filling processing on a region to be processed in the image to be processed to obtain a target image, wherein the target image comprises a target rectangular region;
And taking the central position information of the target rectangular area included in the target image as the target position information of the touch point of the touch operation.
In an embodiment of the application, an event control device is provided. By adopting the device, the computer equipment can continuously perform image recognition processing on the interactive projection picture, and in the process, the image content of the non-bright spot area can be filtered by utilizing an image binarization algorithm, so that the target position information of the touch point is obtained. Thereby improving the feasibility and operability of the scheme.
Optionally, in another embodiment of the event control device 30 provided in the embodiment of the present application based on the embodiment corresponding to fig. 15, the event control device 30 further includes an execution module 370;
The obtaining module 320 is further configured to obtain location information of each of K target interaction areas included in the interaction projection picture, where K is an integer greater than or equal to 1;
The determining module 360 is further configured to determine K distance information according to the target location information and the location information of each target interaction area;
the determining module 360 is further configured to determine minimum distance information from the K distance information;
The execution module 370 is configured to execute a step of acquiring a positional inclusion relationship between a touch point of the touch operation and the first target interaction region if the minimum distance information is determined based on the target positional information and the positional information of the first target interaction region.
In an embodiment of the application, an event control device is provided. By adopting the device, the target interaction area closest to the touch point can be determined according to the target position information, so that the purpose of user operation can be more accurately identified, and the effect of projection interaction is improved.
Alternatively, in another embodiment of the event control device 30 provided in the embodiment of the present application based on the embodiment corresponding to fig. 15,
The obtaining module 320 is specifically configured to construct a ray based on a touch point of a touch operation;
acquiring the number of intersection points generated by the ray and the first target interaction area;
if the number of the intersection points is an odd number, determining that the position inclusion relation is used for indicating touch points of the touch operation to be positioned in the first target interaction area;
If the number of the intersection points is even, determining that the position inclusion relation is used for indicating that the touch point of the touch operation is located outside the first target interaction area.
In an embodiment of the application, an event control device is provided. By adopting the device, the ray method is suitable for judging the relation between the point and the polygon, and the device is relatively simple to realize and high in speed, so that the response efficiency of projection interaction is improved.
Optionally, on the basis of the embodiment corresponding to fig. 15, in another embodiment of the event control device 30 provided in the embodiment of the present application, the touch operation is a swipe operation;
The obtaining module 320 is further configured to obtain, when a touch operation for the interactive projection screen is detected, starting position information of a starting touch point of the touch operation and ending position information of an ending touch point of the touch operation, where the starting position information is used for describing a position of the starting touch point in the interactive projection screen, and the ending position information is used for describing a position of the ending touch point in the interactive projection screen;
The obtaining module 320 is further configured to obtain a position inclusion relationship according to the start position information and the end position information;
the determining module 360 is further configured to determine that the event triggering condition is satisfied if the position inclusion relationship indicates that the start touch point and the end touch point are both located within the first target interaction region.
In an embodiment of the application, an event control device is provided. By adopting the device, the position relation between the touch points and the target interaction area can be identified, so that whether the corresponding event needs to be executed is further judged. Thereby, the feasibility and operability of the scheme is increased.
Optionally, based on the embodiment corresponding to fig. 15, in another embodiment of the event control device 30 provided in the embodiment of the present application, the target event is an audio playing event;
The display module 310 is further configured to display an audio playing control in the interactive projection screen, where the audio playing control is located in the first target interaction area;
The control module 330 is specifically configured to determine that the event triggering condition is met if the touch point of the touch operation is located within the first target interaction area, and control the audio device to play the audio file corresponding to the audio playing event.
In an embodiment of the application, an event control device is provided. By adopting the device, the audio playing control is displayed in the interactive projection picture, and a user can perform touch operation on the audio playing control. Based on the method, after the computer equipment responds to the touch operation of the user, the audio equipment can be controlled to play the audio file, so that the flexibility and the interestingness of the interactive projection are improved.
Optionally, on the basis of the embodiment corresponding to fig. 15, in another embodiment of the event control device 30 provided in the embodiment of the present application, the target event is a menu selection event;
The display module 310 is further configured to display a menu selection control in the interactive projection screen, where the menu selection control is located in the first target interaction area;
The control module 330 is specifically configured to determine that the event triggering condition is met if the touch point of the touch operation is located within the first target interaction area, and control the projection device to project the animation effect corresponding to the menu selection event.
In an embodiment of the application, an event control device is provided. By adopting the device, the menu selection control is displayed in the interactive projection picture, and a user can perform touch operation on the menu selection control. Based on the method, after the computer equipment responds to the touch operation of the user, the projection equipment can be controlled to project the animation effect corresponding to the menu selection event, so that the flexibility and the interestingness of the interactive projection are improved.
Optionally, on the basis of the embodiment corresponding to fig. 15, in another embodiment of the event control device 30 provided in the embodiment of the present application, the target event is a page flip event;
the display module 310 is further configured to display a current page in the interactive projection screen;
The control module 330 is specifically configured to determine that the event triggering condition is met if the start touch point of the touch operation and the end touch point of the touch operation are both located in the first target interaction area, and control the projection device to project a previous page or a next page of the current page.
In an embodiment of the application, an event control device is provided. By adopting the device, the current page is displayed in the interactive projection picture, and the user can perform touch operation on the current page. Based on the method, after the computer equipment responds to the touch operation of the user, the projection equipment can be controlled to project the previous page or the next page of the current page, so that the flexibility and the interestingness of the interactive projection are improved.
The embodiment of the application also provides a computer device, which may be a server or a terminal. The following description takes a terminal as an example. As shown in fig. 16, for convenience of explanation, only the parts relevant to the embodiment of the present application are shown; for specific technical details that are not disclosed, refer to the method portion of the embodiments of the present application. In the embodiment of the application, the terminal being a mobile phone is taken as an example for description:
Fig. 16 is a block diagram showing a part of the structure of a mobile phone related to the terminal provided by an embodiment of the present application. Referring to fig. 16, the mobile phone includes: radio frequency (RF) circuitry 410, memory 420, input unit 430, display unit 440, sensor 450, audio circuitry 460, wireless fidelity (WiFi) module 470, processor 480, and power supply 490. Those skilled in the art will appreciate that the mobile phone structure shown in fig. 16 does not limit the mobile phone, which may include more or fewer components than shown, combine certain components, or use a different arrangement of components.
The following describes the components of the mobile phone in detail with reference to fig. 16:
The RF circuit 410 may be used for receiving and transmitting signals during information transmission and reception or during a call. In particular, after downlink information of a base station is received, it is delivered to the processor 480 for processing; in addition, uplink data is sent to the base station. In general, the RF circuit 410 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier (LNA), a duplexer, and the like. In addition, the RF circuit 410 may also communicate with networks and other devices via wireless communication. The wireless communication may use any communication standard or protocol, including, but not limited to, Global System for Mobile Communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), email, Short Message Service (SMS), and the like.
The memory 420 may be used to store software programs and modules, and the processor 480 performs the various functional applications and data processing of the mobile phone by running the software programs and modules stored in the memory 420. The memory 420 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, an application program required for at least one function (such as a sound playing function or an image playing function), and the like; the data storage area may store data created according to the use of the mobile phone (such as audio data or a phonebook), and the like. In addition, the memory 420 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or other non-volatile solid-state storage device.
The input unit 430 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the mobile phone. In particular, the input unit 430 may include a touch panel 431 and other input devices 432. The touch panel 431, also referred to as a touch screen, may collect touch operations by a user on or near it (for example, operations by the user on or near the touch panel 431 using a finger, a stylus, or any other suitable object or accessory) and drive the corresponding connection device according to a preset program. Optionally, the touch panel 431 may include two parts: a touch detection device and a touch controller. The touch detection device detects the touch position of the user, detects the signal generated by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, and sends them to the processor 480, and can also receive and execute commands sent by the processor 480. In addition, the touch panel 431 may be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave. In addition to the touch panel 431, the input unit 430 may include other input devices 432. In particular, the other input devices 432 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys and switch keys), a mouse, a joystick, and the like.
The display unit 440 may be used to display information input by a user or information provided to the user as well as various menus of the mobile phone. The display unit 440 may include a display panel 441, and optionally, the display panel 441 may be configured in the form of a Liquid Crystal Display (LCD), an organic light-emitting diode (OLED), or the like. Further, the touch panel 431 may cover the display panel 441, and when the touch panel 431 detects a touch operation thereon or nearby, the touch operation is transmitted to the processor 480 to determine the type of the touch event, and then the processor 480 provides a corresponding visual output on the display panel 441 according to the type of the touch event. Although in fig. 16, the touch panel 431 and the display panel 441 are two independent components to implement the input and output functions of the mobile phone, in some embodiments, the touch panel 431 and the display panel 441 may be integrated to implement the input and output functions of the mobile phone.
The mobile phone may also include at least one sensor 450, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor and a proximity sensor: the ambient light sensor may adjust the brightness of the display panel 441 according to the brightness of ambient light, and the proximity sensor may turn off the display panel 441 and/or the backlight when the mobile phone is moved to the ear. As one type of motion sensor, an accelerometer sensor can detect the magnitude of acceleration in all directions (generally three axes), can detect the magnitude and direction of gravity when stationary, and can be used in applications for recognizing the posture of the mobile phone (such as switching between landscape and portrait modes, related games, and magnetometer posture calibration) and in vibration-recognition-related functions (such as a pedometer and tapping); other sensors that may also be configured on the mobile phone, such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, are not described in detail herein.
The audio circuit 460, a speaker 461, and a microphone 462 may provide an audio interface between the user and the mobile phone. The audio circuit 460 may transmit an electrical signal, converted from received audio data, to the speaker 461, and the speaker 461 converts the electrical signal into a sound signal for output; on the other hand, the microphone 462 converts a collected sound signal into an electrical signal, which is received by the audio circuit 460 and converted into audio data; the audio data is then output to the processor 480 for processing and, for example, sent to another mobile phone via the RF circuit 410, or output to the memory 420 for further processing.
WiFi is a short-range wireless transmission technology. Through the WiFi module 470, the mobile phone can help the user send and receive e-mails, browse web pages, access streaming media, and the like, providing the user with wireless broadband Internet access. Although fig. 16 shows the WiFi module 470, it is understood that it is not an essential component of the mobile phone and may be omitted as required without changing the essence of the invention.
The processor 480 is the control center of the mobile phone. It connects the various parts of the entire mobile phone using various interfaces and lines, and performs the various functions of the mobile phone and processes data by running or executing the software programs and/or modules stored in the memory 420 and invoking the data stored in the memory 420. Optionally, the processor 480 may include one or more processing units; optionally, the processor 480 may integrate an application processor, which mainly handles the operating system, user interfaces, application programs, and the like, and a modem processor, which mainly handles wireless communication. It will be appreciated that the modem processor may alternatively not be integrated into the processor 480.
The mobile phone further includes a power supply 490 (such as a battery) for supplying power to the various components. Optionally, the power supply may be logically connected to the processor 480 through a power management system, so that functions such as charge management, discharge management, and power consumption management are implemented through the power management system.
Although not shown, the mobile phone may further include a camera, a bluetooth module, etc., which will not be described herein.
The steps performed by the terminal in the above embodiments may be based on the terminal structure shown in fig. 16.
Embodiments of the present application also provide a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the methods described in the foregoing embodiments.
Embodiments of the present application also provide a computer program product comprising a computer program which, when executed by a processor, implements the steps of the methods described in the foregoing embodiments.
It should be noted that the specific embodiments of the present application involve related data such as user information and control parameters. When the above embodiments of the present application are applied to specific products or technologies, user permission or consent must be obtained, and the collection, use, and processing of the related data must comply with the relevant laws, regulations, and standards of the relevant countries and regions.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, which are not repeated herein.
In the several embodiments provided in the present application, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a server, a terminal device, or the like) to perform all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, or any other medium that can store a computer program.
The above embodiments are only for illustrating the technical solution of the present application, and not for limiting the same; although the application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application.

Claims (20)

1. An event control method based on interactive projection, characterized by comprising:
displaying a projection editing interface, wherein the projection editing interface provides an interactive operation panel;
in response to an area selection instruction for the interactive operation panel, displaying a first original interaction area in the interactive operation panel according to the area selection instruction, wherein the first original interaction area is used for determining position information of a first target interaction area in an interactive projection picture;
in response to a key value input operation on an interaction setting panel for the first original interaction area, acquiring a target key value corresponding to the first original interaction area, wherein the target key value has a unique mapping relationship with a target event;
in a case where the interactive projection picture is projected by a projection device, when a touch operation for the interactive projection picture is detected, controlling at least one device to execute the target event if the positional relationship between a touch point of the touch operation and the first target interaction area meets an event triggering condition.
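For illustration only, the unique mapping between target key values and target events recited in claim 1 can be thought of as a lookup table. The following is a minimal Python sketch under that assumption; the key names and event names are invented placeholders, not values from the application:

```python
# Hypothetical sketch of claim 1's key-value mapping: each target key value
# set in the interaction setting panel resolves to exactly one target event.
# All names here are illustrative assumptions.
KEY_TO_EVENT = {
    "key_audio": "audio_play_event",
    "key_menu": "menu_select_event",
    "key_swipe": "page_flip_event",
}

def target_event_for(key_value: str) -> str:
    # Unique mapping relation: one key value resolves to one target event.
    return KEY_TO_EVENT[key_value]

print(target_event_for("key_audio"))  # audio_play_event
```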
2. The event control method according to claim 1, wherein the displaying, in response to the area selection instruction for the interactive operation panel, the first original interaction area in the interactive operation panel according to the area selection instruction comprises:
triggering the area selection instruction while a continuous contact operation with the interactive operation panel is maintained, and displaying, in the interactive operation panel, a track corresponding to the continuous contact operation;
when the end of the continuous contact operation is detected, displaying the first original interaction area in the interactive operation panel according to the track corresponding to the continuous contact operation.
3. The event control method according to claim 2, wherein, while the continuous contact operation with the interactive operation panel is maintained, the method further comprises:
displaying contact position information corresponding to the continuous contact operation, wherein the contact position information is used for describing the real-time position, in the interactive operation panel, of a contact corresponding to the continuous contact operation.
4. The event control method according to claim 1, wherein the displaying, in response to the area selection instruction for the interactive operation panel, the first original interaction area in the interactive operation panel according to the area selection instruction comprises:
triggering the area selection instruction when click operations for the interactive operation panel are detected, and displaying, in the interactive operation panel, the contacts corresponding to the click operations and a connecting edge between every two adjacent contacts, wherein every two adjacent contacts are contacts generated by two adjacent click operations;
displaying the first original interaction area in the interactive operation panel according to the connecting edge between every two adjacent contacts.
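As a non-authoritative sketch of claim 4, each click adds a contact, adjacent contacts are joined by a connecting edge, and closing the loop yields the first original interaction area. The class and method names below are assumptions for illustration:

```python
# Illustrative sketch of claim 4: contacts collected from click operations,
# with a connecting edge between every two adjacent contacts.
class RegionEditor:
    def __init__(self):
        self.contacts = []  # contacts in the order the clicks occurred

    def on_click(self, x: float, y: float) -> None:
        self.contacts.append((x, y))

    def edges(self):
        # Connecting edge between every two adjacent contacts; the last
        # contact is joined back to the first to close the region.
        n = len(self.contacts)
        return [(self.contacts[i], self.contacts[(i + 1) % n]) for i in range(n)]

editor = RegionEditor()
for point in [(0, 0), (200, 0), (200, 150), (0, 150)]:
    editor.on_click(*point)
print(editor.edges())  # four edges outlining a rectangular interaction area
```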
5. The event control method according to claim 2, wherein, when the click operation for the interactive operation panel is detected, the method further comprises:
displaying contact position information corresponding to the click operation, wherein the contact position information is used for describing the real-time position, in the interactive operation panel, of the contact corresponding to the click operation.
6. The event control method according to claim 1, further comprising:
if the first original interaction area and a second original interaction area have an overlapping area, responding to a hierarchy setting operation for the first original interaction area, wherein the second original interaction area is used for determining position information of a second target interaction area in the interactive projection picture;
in a case where the hierarchy priority of the first original interaction area is higher than that of the second original interaction area, when the touch point of the touch operation is located in an overlapping area of the first target interaction area and the second target interaction area, judging the positional relationship between the touch point of the touch operation and the first target interaction area;
in a case where the hierarchy priority of the second original interaction area is higher than that of the first original interaction area, when the touch point of the touch operation is located in the overlapping area of the first target interaction area and the second target interaction area, judging the positional relationship between the touch point of the touch operation and the second target interaction area.
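Claim 6 resolves overlapping areas by hierarchy priority. A minimal sketch, assuming priorities are plain integers (a representation the application does not specify):

```python
# Sketch of claim 6: in the overlapping area, the touch point is judged
# against the target interaction area with the higher hierarchy priority.
def area_to_judge(first_area: dict, second_area: dict) -> dict:
    """Each area is an assumed dict such as {'name': ..., 'priority': int}."""
    if first_area["priority"] > second_area["priority"]:
        return first_area
    return second_area

first = {"name": "first target interaction area", "priority": 2}
second = {"name": "second target interaction area", "priority": 1}
print(area_to_judge(first, second)["name"])  # first target interaction area
```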
7. The event control method according to claim 1, wherein displaying the interaction setting panel for the first original interaction area on the projection editing interface comprises:
when a click operation for the first original interaction area is detected, displaying, on the projection editing interface, an interaction setting panel for the first original interaction area, wherein the interaction setting panel is used for setting a key value responding to a click operation type;
when a long-press operation for the first original interaction area is detected, displaying, on the projection editing interface, an interaction setting panel for the first original interaction area, wherein the interaction setting panel is used for setting a key value responding to a long-press operation type; and
when a directional swipe operation for the first original interaction area is detected, displaying, on the projection editing interface, an interaction setting panel for the first original interaction area, wherein the interaction setting panel is used for setting a key value responding to a directional swipe operation type.
8. The event control method according to claim 1, wherein, after the displaying, in response to the area selection instruction for the interactive operation panel, the first original interaction area in the interactive operation panel according to the area selection instruction, the method further comprises:
acquiring a length ratio and a width ratio between the interactive projection picture and the interactive operation panel, wherein the length ratio represents the ratio of the length of the interactive projection picture to the length of the interactive operation panel, and the width ratio represents the ratio of the width of the interactive projection picture to the width of the interactive operation panel;
determining the position information of the first target interaction area in the interactive projection picture according to the position information of the first original interaction area in the interactive operation panel, the length ratio, and the width ratio.
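The mapping in claim 8 amounts to scaling each vertex of the original interaction area by the length ratio and the width ratio. A minimal sketch, assuming areas are vertex lists and sizes are (length, width) tuples; all names are illustrative:

```python
# Illustrative sketch of claim 8's panel-to-projection coordinate mapping.
def map_region_to_projection(panel_points, panel_size, projection_size):
    """panel_points: [(x, y), ...] vertices of the first original interaction
    area; panel_size / projection_size: assumed (length, width) tuples."""
    length_ratio = projection_size[0] / panel_size[0]
    width_ratio = projection_size[1] / panel_size[1]
    # Scale each vertex independently along both axes.
    return [(x * length_ratio, y * width_ratio) for (x, y) in panel_points]

# Example: a 1280x720 editing panel mapped onto a 1920x1080 projection.
print(map_region_to_projection(
    [(100, 100), (300, 100), (300, 250), (100, 250)],
    panel_size=(1280, 720),
    projection_size=(1920, 1080),
))  # [(150.0, 150.0), (450.0, 150.0), (450.0, 375.0), (150.0, 375.0)]
```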
9. The event control method according to claim 1, wherein the touch operation is a click operation or a long-press operation;
when the touch operation for the interactive projection picture is detected, the method further comprises:
acquiring target position information of the touch point of the touch operation, wherein the target position information is used for describing the position of the touch point in the interactive projection picture;
if the first target interaction area is determined according to the target position information, acquiring a position inclusion relationship between the touch point of the touch operation and the first target interaction area;
if the position inclusion relationship indicates that the touch point of the touch operation is located within the first target interaction area, determining that the event triggering condition is met.
10. The event control method according to claim 9, wherein the acquiring the target position information of the touch point of the touch operation comprises:
in response to the touch operation for the interactive projection picture, acquiring an original image, wherein the original image comprises a bright spot area generated according to the touch operation;
performing binarization processing on the original image to obtain a binarized image;
performing contour extraction processing on the binarized image to obtain an image to be processed, wherein the image to be processed comprises a region to be processed;
performing contour filling processing on the region to be processed in the image to be processed to obtain a target image, wherein the target image comprises a target rectangular region;
taking center position information of the target rectangular region included in the target image as the target position information of the touch point of the touch operation.
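One plausible realization of claim 10's pipeline uses OpenCV. The sketch below is an illustration under stated assumptions: the threshold value of 200, the choice of the largest contour, and the specific cv2 calls are editorial choices, not details disclosed in the application:

```python
# Hypothetical OpenCV sketch of claim 10: binarize the original image,
# extract contours, bound the bright-spot region with a rectangle, and
# take the rectangle's center as the touch point's target position.
import cv2

def touch_point_from_frame(original_image):
    gray = cv2.cvtColor(original_image, cv2.COLOR_BGR2GRAY)
    # Binarization; the threshold of 200 is an assumed parameter.
    _, binarized = cv2.threshold(gray, 200, 255, cv2.THRESH_BINARY)
    # Contour extraction on the binarized image.
    contours, _ = cv2.findContours(binarized, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None  # no bright spot area, hence no touch
    # Assume the largest contour is the region to be processed.
    region = max(contours, key=cv2.contourArea)
    # Bounding the filled contour yields the target rectangular region.
    x, y, w, h = cv2.boundingRect(region)
    return (x + w / 2, y + h / 2)  # center position information
```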
11. The event control method according to claim 9, further comprising:
acquiring position information of each of K target interaction areas contained in the interactive projection picture, wherein K is an integer greater than or equal to 1;
determining K pieces of distance information according to the target position information and the position information of each target interaction area;
determining minimum distance information from the K pieces of distance information;
if the minimum distance information is determined based on the target position information and the position information of the first target interaction area, performing the step of acquiring the position inclusion relationship between the touch point of the touch operation and the first target interaction area.
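Claim 11 narrows the containment test to the nearest of the K target interaction areas. A minimal sketch, assuming each area exposes a single reference position (how that position is chosen is not fixed by the claim):

```python
# Illustrative sketch of claim 11: compute K pieces of distance information
# and keep only the area that produced the minimum distance.
import math

def nearest_area(touch_point, area_positions):
    """area_positions: assumed {area_id: (x, y)} reference positions."""
    return min(area_positions,
               key=lambda area_id: math.dist(touch_point, area_positions[area_id]))

areas = {"first": (400.0, 300.0), "second": (1500.0, 800.0)}
# Only the nearest area is then tested for the position inclusion relation.
print(nearest_area((420.0, 310.0), areas))  # first
```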
12. The event control method according to claim 9, wherein the acquiring the position inclusion relationship between the touch point of the touch operation and the first target interaction area comprises:
constructing a ray based on the touch point of the touch operation;
acquiring the number of intersection points generated by the ray and the first target interaction area;
if the number of the intersection points is odd, determining that the position inclusion relationship indicates that the touch point of the touch operation is located within the first target interaction area;
if the number of the intersection points is even, determining that the position inclusion relationship indicates that the touch point of the touch operation is located outside the first target interaction area.
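Claim 12 is the classic even-odd ray-casting test for point-in-polygon containment: a ray from the touch point crosses the area's boundary an odd number of times exactly when the point lies inside. A minimal sketch, assuming the area is given as a vertex list:

```python
# Even-odd ray-casting test matching claim 12: cast a horizontal ray to the
# right of the touch point and count intersections with the area's edges.
def point_in_area(point, vertices):
    x, y = point
    inside = False
    n = len(vertices)
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge straddles the ray's height
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x_cross > x:
                inside = not inside  # odd number of crossings -> inside
    return inside

square = [(0, 0), (10, 0), (10, 10), (0, 10)]
print(point_in_area((5, 5), square))   # True: odd number of intersections
print(point_in_area((15, 5), square))  # False: even number of intersections
```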
13. The event control method according to claim 1, wherein the touch operation is a swipe operation;
when the touch operation for the interactive projection picture is detected, the method further comprises:
acquiring start position information of a start touch point of the touch operation and end position information of an end touch point of the touch operation, wherein the start position information is used for describing the position of the start touch point in the interactive projection picture, and the end position information is used for describing the position of the end touch point in the interactive projection picture;
acquiring a position inclusion relationship according to the start position information and the end position information;
if the position inclusion relationship indicates that the start touch point and the end touch point are both located within the first target interaction area, determining that the event triggering condition is met.
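For claim 13, the event triggering condition requires both the start and end touch points of the swipe to lie in the first target interaction area. A self-contained sketch using an axis-aligned rectangle for brevity (the application allows arbitrarily drawn areas):

```python
# Sketch of claim 13: a swipe triggers the event only when its start and
# end touch points are both inside the target interaction area.
def in_rect(point, rect):
    (x, y), (left, top, right, bottom) = point, rect
    return left <= x <= right and top <= y <= bottom

def swipe_meets_trigger(start_point, end_point, rect):
    return in_rect(start_point, rect) and in_rect(end_point, rect)

area = (100, 100, 500, 400)  # assumed (left, top, right, bottom) layout
print(swipe_meets_trigger((150, 200), (450, 350), area))  # True -> page flip
print(swipe_meets_trigger((150, 200), (600, 350), area))  # False
```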
14. The event control method according to any one of claims 1 to 13, wherein the target event is an audio playing event;
the method further comprises:
displaying an audio playing control in the interactive projection picture, wherein the audio playing control is located within the first target interaction area;
the controlling, if the positional relationship between the touch point of the touch operation and the first target interaction area meets the event triggering condition, at least one device to execute the target event comprises:
if the touch point of the touch operation is located within the first target interaction area, determining that the event triggering condition is met, and controlling an audio device to play an audio file corresponding to the audio playing event.
15. The event control method according to any one of claims 1 to 13, wherein the target event is a menu selection event;
the method further comprises:
displaying a menu selection control in the interactive projection picture, wherein the menu selection control is located within the first target interaction area;
the controlling, if the positional relationship between the touch point of the touch operation and the first target interaction area meets the event triggering condition, at least one device to execute the target event comprises:
if the touch point of the touch operation is located within the first target interaction area, determining that the event triggering condition is met, and controlling the projection device to project an animation effect corresponding to the menu selection event.
16. The event control method according to any one of claims 1 to 13, wherein the target event is a page flip event;
the method further comprises:
displaying a current page in the interactive projection picture;
the controlling, if the positional relationship between the touch point of the touch operation and the first target interaction area meets the event triggering condition, at least one device to execute the target event comprises:
if the start touch point of the touch operation and the end touch point of the touch operation are both located within the first target interaction area, determining that the event triggering condition is met, and controlling the projection device to project a previous page or a next page of the current page.
17. An event control apparatus, comprising:
a display module, configured to display a projection editing interface, wherein the projection editing interface provides an interactive operation panel;
the display module being further configured to display, in response to an area selection instruction for the interactive operation panel, a first original interaction area in the interactive operation panel according to the area selection instruction, wherein the first original interaction area is used for determining position information of a first target interaction area in an interactive projection picture;
an acquisition module, configured to acquire, in response to a key value input operation on an interaction setting panel for the first original interaction area, a target key value corresponding to the first original interaction area, wherein the target key value has a unique mapping relationship with a target event;
a control module, configured to, in a case where the interactive projection picture is projected by a projection device, when a touch operation for the interactive projection picture is detected, control at least one device to execute the target event if the positional relationship between a touch point of the touch operation and the first target interaction area meets an event triggering condition.
18. An interactive projection system, characterized by comprising a computer device, a projection device, and a signal acquisition device;
wherein the projection device is configured to project an interactive projection picture;
the signal acquisition device is configured to acquire a touch operation for the interactive projection picture; and
the computer device is configured to perform the event control method according to any one of claims 1 to 16.
19. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor implements the steps of the event control method of any of claims 1 to 16 when the computer program is executed.
20. A computer readable storage medium having stored thereon a computer program, characterized in that the computer program when executed by a processor realizes the steps of the event control method of any of claims 1 to 16.
CN202310002198.6A 2023-01-03 Event control method based on interactive projection, related device and storage medium Pending CN118295571A (en)

Publications (1)

Publication Number Publication Date
CN118295571A (en) 2024-07-05



Legal Events

PB01 Publication