CN117873322A - Projection picture interaction control method, device and storage medium - Google Patents


Info

Publication number: CN117873322A
Application number: CN202410022597.3A
Authority: CN (China)
Prior art keywords: projection, control object, projection picture, relative position, position change
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Other languages: Chinese (zh)
Inventor: 宁仲
Current and original assignee: Yibin Jimi Photoelectric Co Ltd (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Application filed by Yibin Jimi Photoelectric Co Ltd
Priority to CN202410022597.3A

Landscapes

  • Transforming Electric Information Into Light Information (AREA)

Abstract

The invention discloses a projection picture interaction control method, device, and storage medium. The method acquires multiple frames of continuous images that contain the projection picture and a control object; determines the change in the position of the control object relative to the projection picture from those frames; and moves the projection picture according to that relative position change. The user can thus move the projection picture with hand movements, which simplifies both the operation and the implementation logic of such functions and solves the technical problem in the related art of poor user interaction experience when the projection picture is adjusted with a remote control.

Description

Projection picture interaction control method, device and storage medium
Technical Field
The present invention relates to the field of projection interaction control, and in particular to a projection picture interaction control method, device, and storage medium.
Background
Most projection devices currently on the market offer only a simple display function: the projection picture can be controlled only with a remote control, and there is no control function that responds to the user's movements.
Some projection devices can recognize user gestures for operation, but their operation logic is mostly complicated and hard to implement.
Disclosure of Invention
The embodiments of the present application provide a projection picture interaction control method, device, and storage medium, so as to at least solve the problems in the related art of poor user experience when controlling the projection picture with a remote control, and of gesture interaction logic that is complex and difficult to implement.
According to one aspect of the embodiments of the present application, a projection picture interaction control method is provided, comprising the following steps:
acquiring a plurality of continuous images, wherein the continuous images comprise projection pictures and images of control objects;
determining the relative position change of the control object relative to the projection picture according to the multi-frame continuous images;
and controlling the movement of the projection picture according to the relative position change.
Optionally, the determining, according to the multiple frames of continuous images, a relative position change of the control object with respect to the projection screen includes:
according to the multi-frame continuous images, determining coordinate information of the projection picture and coordinate information of a control object;
and acquiring the relative position change of the control object relative to the projection picture according to the coordinate information of the projection picture and the coordinate information of the control object.
Optionally, the obtaining the relative position change of the control object with respect to the projection picture according to the coordinate information of the projection picture and the coordinate information of the control object includes:
determining key point coordinates of the projection picture from any one or more images in the multi-frame continuous images;
determining key point coordinates of the control object according to a first image and a second image in the multi-frame continuous images, wherein the shooting time of the first image is later than that of the second image;
and acquiring the relative position change of the control object relative to the projection picture according to the key point coordinates of the control object in the first image, the key point coordinates of the control object in the second image and the key point coordinates of the projection picture.
Optionally, the acquiring the relative position change of the control object with respect to the projection screen includes:
calculating to obtain a first difference value between the key point coordinates of the control object in the first image and the key point coordinates of the projection picture;
calculating to obtain a second difference value between the key point coordinates of the control object in the second image and the key point coordinates of the projection picture;
and acquiring the relative position change of the control object relative to the projection picture according to the first difference value and the second difference value.
Optionally, the controlling the movement of the projection screen according to the relative position change includes:
determining a target moving direction of the projection picture according to the relative position change;
and controlling the movement of the projection picture based on the target movement direction.
Optionally, the controlling the movement of the projection screen according to the relative position change further includes:
judging whether the control object is positioned in the projection picture or not, and acquiring a judging result;
and responding to the relative position change according to the judging result, and controlling the projection picture to move.
Optionally, before acquiring the multiple frames of continuous images that contain the projection picture and the image of the control object, the method further includes:
and responding to an interaction control instruction, and controlling a sensor to acquire the multi-frame continuous images based on the interaction control instruction.
Optionally, before controlling the movement of the projection screen according to the relative position change, the method further includes:
calculating the distance between the key point of the control object and the projection picture according to the multi-frame continuous images;
and if the distance is smaller than the preset threshold value, performing the next step.
According to another aspect of the embodiments of the present application, there is further provided a projection screen interaction control apparatus, including:
the image acquisition module is used for acquiring a plurality of continuous images, wherein the continuous images comprise projection pictures and images of control objects;
the position determining module is used for determining the relative position change of the control object relative to the projection picture according to the multi-frame continuous images;
and the projection control module is used for controlling the movement of the projection picture according to the relative position change.
According to another aspect of the embodiments of the present application, there is further provided a storage medium, the storage medium including a stored program, wherein when the program runs, the device on which the storage medium resides is controlled to execute the projection picture interaction control method of any one of claims 1 to 8.
In the embodiments of the present application, multiple frames of continuous images containing the projection picture and a control object are first acquired; the change in the position of the control object relative to the projection picture is then determined from those frames; and finally the projection picture is moved according to that relative position change. The user can thus move the projection picture with hand movements, which simplifies the operation and implementation logic of such functions and solves the technical problem in the related art of poor user interaction experience when the projection picture is adjusted with a remote control.
Drawings
FIG. 1 is a flowchart of an embodiment of a method for interactive control of projection images provided in the present application;
fig. 2 is a schematic diagram of mapping a projection screen to a sensor provided in the present application.
Detailed Description
To help those skilled in the art better understand the solution of the present application, the embodiments of the present application are described below in detail with reference to the accompanying drawings. Clearly, the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on these embodiments without inventive effort shall fall within the scope of protection of the present application.
It should be noted that the terms "first," "second," and the like in the description and claims of the present application and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that embodiments of the present application described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
The present application is described in detail below with reference to the accompanying drawings and examples.
The invention provides a projection picture interaction control method, as shown in fig. 1, which comprises the following steps of S102-S104:
s102: acquiring a plurality of continuous images, wherein the continuous images comprise projection pictures and images of control objects;
In the embodiments provided in the present application, when the user wants to interactively control the picture, the user can enable an interaction control instruction to interactively control the projection picture. In other embodiments, the projection device may also enable this function by default.
Specifically, in response to an interaction control instruction, a sensor is controlled to acquire the multiple frames of continuous images based on the interaction control instruction. For example, the user can trigger the interaction control function of the projection picture through voice control or through an option in a menu interface, and a shortcut key can be set so that the user can trigger the function quickly.
In the embodiments provided in the present application, the hand-gesture interaction scenario to which this solution applies relies on hand-recognition data, namely the multiple frames of continuous images acquired by the sensor.
Specifically, the multiple continuous images are collected by the sensor, the multiple continuous images may include two or more images within a period of time, the two or more images may be images of adjacent frames or images of non-adjacent frames, and the multiple continuous images may be any of RGB, infrared or other types, including projection screen images generated by the projection device and images of the control object. At the same time, the sensor can also acquire distance data and/or depth data.
Here the control object is the user's hand; in some other embodiments, the control object may also be another object, such as a wand, that can be used to indicate and/or facilitate the control operation.
S103: determining the relative position change of the control object relative to the projection picture according to the multi-frame continuous images;
In the embodiments provided in the present application, the coordinate information of the projection picture and the coordinate information of the control object are obtained from the multiple frames of continuous images, and the relative position change of the control object with respect to the projection picture is obtained from those two sets of coordinates. Specifically, because of the characteristics of the sensor and the projection picture, the relative positions of the projection picture and the sensor are fixed in this embodiment. Therefore, only one frame needs to be selected arbitrarily from the continuous image data collected by the sensor to extract the position data, distance data and/or depth data of the projection picture recorded by the sensor. Three-dimensional coordinates of a number of projection picture feature points are obtained from these data, and a plane is fitted to those coordinates using the spatial plane equation Ax + By + Cz + D = 0, yielding a fitting plane P and its normal vector N(A, B, C), where A, B, C are the normal-vector parameters of the fitting plane and x, y, z are the x-axis, y-axis and z-axis coordinates of a three-dimensional point. Once obtained, the fitting plane P serves as the mapping of the projection picture in the three-dimensional coordinate system, since P coincides exactly with the projection area (for example a projection screen) in three-dimensional space.
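The plane-fitting step above can be sketched in a few lines of NumPy. This is a minimal illustration only: the patent does not specify the fitting algorithm, and all function and variable names here are hypothetical.

```python
import numpy as np

def fit_plane(points):
    """Least-squares fit of a plane Ax + By + Cz + D = 0 to Nx3 points.

    Returns a unit normal N = (A, B, C) and offset D. Illustrative
    sketch; the patent does not name a specific fitting method.
    """
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    # The plane normal is the right singular vector of the centered
    # points associated with the smallest singular value.
    _, _, vt = np.linalg.svd(pts - centroid)
    normal = vt[-1]
    d = -normal.dot(centroid)
    return normal, d

# Example: feature points that all lie on the plane z = 2, so the
# fitted normal is parallel to (0, 0, 1) and |D| = 2.
pts = [(0, 0, 2), (1, 0, 2), (0, 1, 2), (1, 1, 2), (0.5, 0.3, 2)]
n, d = fit_plane(pts)
```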
Specifically, a first image and a second image are selected from the multiple continuous images, and the key-point coordinates of the control object are determined from them. It should be noted that in this embodiment the first image is captured later than the second image, so in the sequence of continuous images the first image comes after the second image.
Specifically, in this embodiment, a preset recognition model may be used to recognize a control object region in the first image and the second image, one or more points in the control object region may be selected as key points, and the three-dimensional coordinate points of the control object key points in the first image and the second image may be obtained by using the position data, the distance data and/or the depth data of the control object region.
A two-dimensional coordinate system is established based on the sensor pixel distribution, and the four corner coordinates of the projection picture are mapped to this sensor-based two-dimensional coordinate system using the normal vector N(A, B, C) of the fitting plane P and a mapping relation H, as shown in fig. 2. The mapping relation H is calculated from a formula involving the intrinsic parameters of the sensor, the extrinsic parameters R, T from the sensor to the mated projection light engine, and the intrinsic parameters of the mated projection light engine. The four corner coordinates of the projection picture are then mapped through the formula P_cor = H⁻¹ × S_cor into the sensor-based two-dimensional coordinate system, where S_cor are the four corner coordinates of the projection picture and P_cor the four corresponding corner coordinates in the two-dimensional coordinate system; this yields the coordinate area of the projection picture in that coordinate system. The three-dimensional coordinates of the control-object key points can likewise be mapped into the sensor-based two-dimensional coordinate system with a simple mapping, giving the key-point coordinates of the control object.
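The corner-mapping formula P_cor = H⁻¹ × S_cor amounts to applying the inverse of a 3×3 mapping to points in homogeneous coordinates. A minimal sketch; the helper name and the example H are assumptions, not taken from the patent:

```python
import numpy as np

def map_corners(H, s_cor):
    """Map corner coordinates through P_cor = H^{-1} x S_cor.

    H is a 3x3 mapping between the two coordinate systems; s_cor is an
    Nx2 array of corner points. Hypothetical helper, for illustration.
    """
    H_inv = np.linalg.inv(H)
    pts = np.hstack([s_cor, np.ones((len(s_cor), 1))])  # homogeneous coords
    mapped = (H_inv @ pts.T).T
    return mapped[:, :2] / mapped[:, 2:3]               # de-homogenize

# Example with an assumed pure-scaling H: one frame's coordinates are
# twice the other's, so the mapped corners come out at half scale.
H = np.diag([2.0, 2.0, 1.0])
s_cor = np.array([[0, 0], [200, 0], [200, 100], [0, 100]], dtype=float)
p_cor = map_corners(H, s_cor)
```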
A key point is then selected in the two-dimensional coordinate area of the projection picture (it may be the center point of the projection picture), giving the key-point coordinates of the projection picture, that is, the coordinates of that key point in the sensor-based two-dimensional coordinate system.
It should be noted that, the coordinate system in this embodiment may be established based on the distribution of the sensor pixels, and may also be established by other manners, such as establishing a corresponding coordinate system based on the projector optical machine.
It should be noted that, besides selecting the center point of the projection picture as the key point, any other point in the projection area can be selected as the key point, so that the specific method implementation is not affected.
In the embodiment provided in the present application, after obtaining the coordinate information of the control object and the coordinate information of the projection screen, a preamble detection step may be further performed.
Specifically, calculating the distance between the key point of the control object and the projection picture, and executing the subsequent steps if the distance is smaller than a preset threshold value. In other embodiments, the above-described relative position change may also be calculated when the distance is less than or equal to a preset threshold.
Specifically, the distance Q₁ between the control-object key point and the sensor and the distance Q₂ between the projection picture and the sensor are obtained from the three-dimensional coordinates of the control-object key point and of the projection picture feature points. The distance between the control-object key point and the projection picture is then obtained according to the formula Q₃ = Q₂ − Q₁. When Q₃ is less than or equal to the preset threshold T, the subsequent steps are executed; when Q₃ is greater than the preset threshold T, execution of the subsequent steps is cancelled.
In some other embodiments, the distance between the control-object key point and the projection picture may be calculated in other ways. For example: a control object is identified in the multiple frames of continuous images with a preset recognition model, its sensor-based position information is extracted directly to obtain the two-dimensional key-point coordinates of the control object, and the distance between the control-object key point and the sensor is obtained from the sensor at the same time. The three-dimensional key-point coordinates of the control object follow from these two, and the distance Q′ between the control-object key point and the fitting plane P is then calculated. Since Q′ equals the distance Q₃ between the control-object key point and the projection picture, the subsequent steps are executed when Q′ is less than or equal to the preset threshold T and cancelled when Q′ is greater than the preset threshold T.
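The preliminary distance gate described in the two preceding paragraphs reduces to one subtraction and one comparison. A sketch with hypothetical names and units:

```python
def passes_distance_gate(q1, q2, threshold):
    """Preliminary check: Q3 = Q2 - Q1 is the hand-to-picture distance.

    Proceed only when Q3 is at or below the preset threshold T, so a
    hand far in front of the screen is ignored. Illustrative names.
    """
    q3 = q2 - q1
    return q3 <= threshold

# A hand 2.8 m from the sensor with the screen 3.0 m away is 0.2 m
# from the picture; with an assumed 0.3 m threshold the gate passes.
near = passes_distance_gate(2.8, 3.0, 0.3)
far = passes_distance_gate(2.0, 3.0, 0.3)
```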
The purpose of this preliminary detection step is as follows: after the user enables the interaction control function, the sensor may accidentally detect the hand and move the projection picture by mistake, degrading the user experience. The preliminary detection step greatly reduces such occurrences and improves the user experience.
In the embodiment provided by the application, the relative position change of the control object relative to the projection picture is obtained according to the key point coordinates of the control object in the first image, the key point coordinates of the control object in the second image and the key point coordinates of the projection picture.
Specifically, the first difference N₁ between the key-point coordinates of the control object in the first image and the key-point coordinates of the projection picture, and the second difference N₂ between the key-point coordinates of the control object in the second image and the key-point coordinates of the projection picture, are obtained by the corresponding operations, and the relative position change of the control object with respect to the projection picture is then obtained from N₁ and N₂. For example: let the key-point coordinates of the projection picture be (x, y), the key-point coordinates of the control object in the first image be (X₁, Y₁), and the key-point coordinates of the control object in the second image be (X₂, Y₂). First, the formula Nₙ(Lₙ, Kₙ) = (Xₙ − x, Yₙ − y) gives the first difference N₁ and the second difference N₂; then the formula N′(L′, K′) = N₂ − N₁ gives the relative position change of the control-object key point with respect to the projection picture, where N′ denotes the relative position change, L′ the abscissa change, and K′ the ordinate change.
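The difference formulas above translate directly into code. A minimal sketch with assumed names, following the patent's ordering in which N′ = N₂ − N₁:

```python
def relative_change(kp_first, kp_second, pic_kp):
    """Compute N'(L', K') = N2 - N1 from the formulas above.

    kp_first / kp_second: control-object key-point coordinates in the
    first and second images; pic_kp: projection picture key point.
    Names are illustrative, not from the patent.
    """
    n1 = (kp_first[0] - pic_kp[0], kp_first[1] - pic_kp[1])    # N1
    n2 = (kp_second[0] - pic_kp[0], kp_second[1] - pic_kp[1])  # N2
    return (n2[0] - n1[0], n2[1] - n1[1])                      # (L', K')

# Picture key point at the origin; the hand sits at (3, 1) in the
# first image and (1, 1) in the second, so N' = (-2, 0).
change = relative_change((3, 1), (1, 1), (0, 0))
```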
S104: controlling the projection picture to move according to the relative position change;
In the embodiments provided in the present application, the target moving direction of the projection picture can be determined from the relative position change, and the projection picture is then controlled to move based on that target moving direction.
Specifically, the present application includes 9 target moving directions, determined by the relative position change N′(L′, K′) of the control-object key point with respect to the projection picture:
when L′ > 0 and K′ > 0, the first target moving direction;
when L′ > 0 and K′ < 0, the second target moving direction;
when L′ > 0 and K′ = 0, the third target moving direction;
when L′ < 0 and K′ > 0, the fourth target moving direction;
when L′ < 0 and K′ < 0, the fifth target moving direction;
when L′ < 0 and K′ = 0, the sixth target moving direction;
when L′ = 0 and K′ > 0, the seventh target moving direction;
when L′ = 0 and K′ < 0, the eighth target moving direction;
when L′ = 0 and K′ = 0, the ninth target moving direction.
The first to ninth target moving directions correspond respectively to upper right, lower right, right, upper left, lower left, left, up, down, and stationary.
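The nine-way classification above reduces to a small lookup on the signs of (L′, K′). A sketch with hypothetical naming:

```python
def target_direction(l_change, k_change):
    """Classify the nine target moving directions from the signs of
    (L', K'). Returns 1..9 per the enumeration above; illustrative."""
    sign = lambda v: (v > 0) - (v < 0)
    table = {
        (1, 1): 1,  (1, -1): 2,  (1, 0): 3,   # moving right
        (-1, 1): 4, (-1, -1): 5, (-1, 0): 6,  # moving left
        (0, 1): 7,  (0, -1): 8,  (0, 0): 9,   # vertical / stationary
    }
    return table[(sign(l_change), sign(k_change))]

# N' = (1, 2) points up and to the right: the first direction.
d = target_direction(1, 2)
```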
This embodiment gives a relatively complete correspondence between the target moving direction and the relative position change N′(L′, K′). In some other embodiments, only 4 target moving directions may be included, for example horizontal left-right movement and vertical up-down movement, corresponding to the third, sixth, seventh and eighth target moving directions.
In some other embodiments, the target moving direction may include only 2 target moving directions, such as horizontal left-right movement, corresponding to the third target moving direction and the sixth target moving direction.
In this embodiment, after determining the target moving direction, the projection screen is controlled to move a preset distance in the target moving direction, where the preset distance may be any value, and no constraint is imposed.
In some other embodiments, in addition to the target moving direction, a target moving distance is determined, comprising a horizontal target moving distance |L′| and a vertical target moving distance |K′|. For example, when |L′| = 1, |K′| = 2 and the target moving direction is the first target moving direction, the picture is controlled to move by 1 pixel to the right and then by 2 pixels upward.
It should be noted that this embodiment gives only one method of controlling the movement of the projection picture; other methods may also be used, for example controlling the rotation of the pan-tilt head of the mated projector.
In this embodiment, the position of the control object may be determined, and whether to respond to the relative position change or not and control the movement of the projection screen may be determined based on the result of the determination.
Specifically, one or more virtual decision boxes with one edge being the edge of the coordinate area of the projection picture are adjacently arranged outside the coordinate area of the projection picture, and the one or more virtual decision boxes are also one or more coordinate areas in a two-dimensional coordinate system based on the distribution of the sensor pixels. The size of the virtual decision box can be freely adjusted.
Specifically, the coordinates of the control-object key point, that is, its two-dimensional coordinates in the coordinate system based on the sensor pixel distribution, and the coordinate area of the projection picture in that coordinate system are obtained. When the key-point coordinates fall within the coordinate area of the projection picture, the control object is considered to be inside the projection picture; when they fall outside that coordinate area but inside a virtual decision box, the control object is considered to be at the edge of the projection picture; and when they fall neither within the coordinate area nor within a virtual decision box, the control object is considered to be outside the projection picture.
In some other embodiments, a virtual decision box, likewise a coordinate area, may be placed in the coordinate system so that one or more edges of the projection picture lie inside it. When the key-point coordinates of the control object fall within the coordinate area of the projection picture but not within the virtual decision box, the control object is considered to be inside the projection picture; when they fall within both, it is considered to be at the edge of the projection picture; and when they fall within neither, it is considered to be outside the projection picture.
In some other embodiments, one or more projection picture key points may be set and the distance between the control-object key point and the projection picture key points calculated. When that distance is smaller than a set threshold, the control object is considered to be at the edge of the projection picture; when the control-object key-point coordinates fall within the coordinate area of the projection picture and the distance is larger than the set threshold, the control object is considered to be inside the projection picture; and when the coordinates do not fall within the coordinate area and the distance is larger than the set threshold, the control object is considered to be outside the projection picture.
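The first region test described above (the picture's coordinate area plus adjacent decision boxes) can be sketched by modeling the virtual decision boxes as a uniform band of width `margin` around an axis-aligned picture area. Both the rectangle assumption and all names are illustrative, not from the patent:

```python
def locate(point, picture, margin):
    """Classify a key point as 'inside', 'edge', or 'outside'.

    picture = (x0, y0, x1, y1), an assumed axis-aligned coordinate
    area; the virtual decision boxes are modeled as a band of width
    `margin` adjacent to the picture edges. Illustrative sketch.
    """
    x, y = point
    x0, y0, x1, y1 = picture
    if x0 <= x <= x1 and y0 <= y <= y1:
        return "inside"
    if x0 - margin <= x <= x1 + margin and y0 - margin <= y <= y1 + margin:
        return "edge"
    return "outside"

# A 10x10 picture area with a 2-pixel decision band.
area = (0, 0, 10, 10)
```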
Specifically, the first difference N₁(L₁, K₁) between the key-point coordinates of the control object in the first image and the key-point coordinates of the projection picture, and the second difference N₂(L₂, K₂) between the key-point coordinates of the control object in the second image and the key-point coordinates of the projection picture, are obtained. The distance H₁ between the control-object key point and the projection picture key point in the first image and the distance H₂ between them in the second image are then calculated by the formula Hₙ = √(Lₙ² + Kₙ²), and H₁ and H₂ are compared:
when the control object is in the projection picture, and only when H 1 <H 2 And controlling the movement of the projection picture in response to the corresponding relative position change.
When the control object is at the edge of the projection picture, and only when H 1 >H 2 And controlling the movement of the projection picture in response to the corresponding relative position change.
When the control object is outside the projection picture, no relative position change is responded to. These reasonable restrictions reduce redundant user operations during interaction control, which in effect reduces the user's workload, so that the user can adjust the projection picture to a satisfactory position in a short time.
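The response rules of the last few paragraphs combine into one small predicate. A sketch under the assumption that H₁ and H₂ are the Euclidean norms of the difference vectors N₁ and N₂; the function name is hypothetical:

```python
import math

def should_respond(n1, n2, location):
    """Decide whether to respond to a relative position change.

    n1 / n2: difference vectors N1, N2 of the control-object key point
    from the picture key point in the first and second images; their
    norms give H1 and H2. Inside: respond only when H1 < H2; edge:
    only when H1 > H2; outside: never. Illustrative sketch.
    """
    h1 = math.hypot(*n1)
    h2 = math.hypot(*n2)
    if location == "inside":
        return h1 < h2
    if location == "edge":
        return h1 > h2
    return False
```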
The method provided by the embodiments of the application allows the projection picture to be adjusted in real time as the control object moves, so the user does not need to wait, making the method convenient and fast. Compared with other implementations, the method simplifies the control logic and implementation of projection picture interaction control and greatly improves the user's interaction experience.
In addition, while the above steps are carried out, a real-time correction module and an eye protection module are triggered synchronously, and the adjustment state of the projection picture is obtained in real time. The correction module obtains the movement angle of the projector to dynamically correct the picture, while the eye protection module analyzes whether a human body has entered the projection area; if so, a mask is triggered that covers the head region, preventing the projected light from irritating the user's eyes.
The application also provides a projection picture interaction adjusting device for implementing the projection picture interaction adjusting method described above, comprising:
the image acquisition module is used for acquiring a plurality of continuous images, wherein the continuous images comprise projection pictures and images of control objects;
the position determining module is used for determining the relative position change of the control object relative to the projection picture according to the multi-frame continuous images;
and the projection control module is used for controlling the movement of the projection picture according to the relative position change.
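The three modules above can be sketched as a minimal device class. This is an illustration under stated assumptions: `capture_fn`, `locate_fn`, and `move_fn` are hypothetical stand-ins for the image acquisition, position determining, and projection control back ends, which the patent does not specify.

```python
class ProjectionInteractionDevice:
    """Minimal sketch of the three-module device described above."""

    def __init__(self, capture_fn, locate_fn, move_fn):
        self.capture_fn = capture_fn  # image acquisition module: returns one frame
        self.locate_fn = locate_fn    # position determining module: frames -> change
        self.move_fn = move_fn        # projection control module: applies the change

    def step(self, n_frames=2):
        # Acquire a plurality of continuous images.
        frames = [self.capture_fn() for _ in range(n_frames)]
        # Determine the relative position change of the control object.
        delta = self.locate_fn(frames)
        # Control the movement of the projection picture accordingly.
        self.move_fn(delta)
        return delta
```

Wiring in real camera capture, key-point detection, and projector control would replace the three callables without changing this control flow.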
The application also provides a storage medium comprising a stored program, wherein, when the program runs, the device on which the storage medium is located is controlled to execute the projection picture interaction control method.

Claims (10)

1. A projection screen interaction control method, the method comprising:
acquiring a plurality of continuous images, wherein the continuous images comprise projection pictures and images of control objects;
determining the relative position change of the control object relative to the projection picture according to the multi-frame continuous images;
and controlling the movement of the projection picture according to the relative position change.
2. The method according to claim 1, wherein determining a change in relative position of the control object with respect to the projection screen based on the plurality of frames of continuous images comprises:
according to the multi-frame continuous images, determining coordinate information of the projection picture and coordinate information of a control object;
and acquiring the relative position change of the control object relative to the projection picture according to the coordinate information of the projection picture and the coordinate information of the control object.
3. The method according to claim 2, wherein the obtaining the relative position change of the control object with respect to the projection screen according to the coordinate information of the projection screen and the coordinate information of the control object includes:
determining key point coordinates of the projection picture from any one or more images in the multi-frame continuous images;
determining key point coordinates of the control object according to a first image and a second image in the multi-frame continuous images, wherein the shooting time of the first image is later than that of the second image;
and acquiring the relative position change of the control object relative to the projection picture according to the key point coordinates of the control object in the first image, the key point coordinates of the control object in the second image and the key point coordinates of the projection picture.
4. A projection screen interactive control method according to claim 3, wherein said obtaining a change in relative position of said control object with respect to said projection screen comprises:
calculating to obtain a first difference value between the key point coordinates of the control object in the first image and the key point coordinates of the projection picture;
calculating to obtain a second difference value between the key point coordinates of the control object in the second image and the key point coordinates of the projection picture;
and acquiring the relative position change of the control object relative to the projection picture according to the first difference value and the second difference value.
5. The method according to claim 1, wherein said controlling the movement of the projection screen according to the relative position change comprises:
determining a target moving direction of the projection picture according to the relative position change;
and controlling the movement of the projection picture based on the target movement direction.
6. The method according to claim 1, wherein said controlling the movement of the projection screen according to the relative position change, further comprises:
judging whether the control object is positioned in the projection picture or not, and acquiring a judging result;
and responding to the relative position change according to the judging result, and controlling the projection picture to move.
7. The method for interactive control of projection images according to claim 1, wherein the step of obtaining a plurality of continuous images, the plurality of continuous images including the projection images and the images of the control object, further comprises:
and responding to an interaction control instruction, and controlling a sensor to acquire the multi-frame continuous images based on the interaction control instruction.
8. The method according to claim 1, wherein before controlling the movement of the projection screen according to the relative position change, further comprising:
calculating the distance between the key point of the control object and the projection picture according to the multi-frame continuous images;
and if the distance is smaller than the preset threshold value, performing the next step.
9. A projection screen interaction control apparatus comprising:
the image acquisition module is used for acquiring a plurality of continuous images, wherein the continuous images comprise projection pictures and images of control objects;
the position determining module is used for determining the relative position change of the control object relative to the projection picture according to the multi-frame continuous images;
and the projection control module is used for controlling the movement of the projection picture according to the relative position change.
10. A storage medium comprising a stored program, wherein the program, when run, controls a device in which the storage medium is located to perform the projection screen interaction control method of any one of claims 1 to 8.
CN202410022597.3A 2024-01-07 2024-01-07 Projection picture interaction control method, device and storage medium Pending CN117873322A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410022597.3A CN117873322A (en) 2024-01-07 2024-01-07 Projection picture interaction control method, device and storage medium


Publications (1)

Publication Number Publication Date
CN117873322A true CN117873322A (en) 2024-04-12

Family

ID=90594286

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410022597.3A Pending CN117873322A (en) 2024-01-07 2024-01-07 Projection picture interaction control method, device and storage medium

Country Status (1)

Country Link
CN (1) CN117873322A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination