CN115016716A - Projection interaction method and system - Google Patents

Projection interaction method and system

Info

Publication number
CN115016716A
CN115016716A
Authority
CN
China
Prior art keywords
sub
projection
sequence
preset
region
Prior art date
Legal status
Pending
Application number
CN202210606437.4A
Other languages
Chinese (zh)
Inventor
刘翔宇
危学涛
郭磊
Current Assignee
Southwest University of Science and Technology
Original Assignee
Southwest University of Science and Technology
Priority date
Filing date
Publication date
Application filed by Southwest University of Science and Technology
Priority to CN202210606437.4A
Publication of CN115016716A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04847 Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G06F3/04845 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00 Details of colour television systems
    • H04N9/12 Picture reproducers
    • H04N9/31 Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3179 Video signal processing therefor
    • H04N9/3182 Colour adjustment, e.g. white balance, shading or gamut

Abstract

The application relates to a projection interaction method and system. The method comprises the following steps: acquiring a projection picture sequence of a target projection image; dividing the target projection image into sub-regions; encoding the position coordinates of each sub-region to obtain a coding sequence of the position coordinates of each sub-region, and mapping the coding sequence of each sub-region onto the projection picture sequence of that sub-region; before each projection picture is projected, if the coding bit corresponding to a sub-region of the projection picture in the coding sequence is a first preset coding bit, adjusting the color intensity of that sub-region; and projecting each projection picture. The method can improve the accuracy of target positioning.

Description

Projection interaction method and system
Technical Field
The present application relates to the field of projection technologies, and in particular, to a projection interaction method and system.
Background
With the development of projection technology, human-computer interaction (HCI) technology has emerged. Visual interaction (VI) is one of its main applications: typically, a user can interact with a projected picture using hand gestures.
In existing human-computer interaction methods, a projector embeds beacon information into the pixels of a projection picture and transmits a beacon signal through visible-light modulation, so that a photodiode receiving the beacon signal can perform target positioning.
However, in the above method, abnormal brightness and flicker of the projected picture lead to low target-positioning accuracy.
Disclosure of Invention
In view of the foregoing, it is desirable to provide a projection interaction method and system capable of improving positioning accuracy.
In a first aspect, the present application provides a projection interaction method. The method comprises the following steps: acquiring a projection picture sequence of a target projection image, wherein the projection picture sequence comprises a plurality of consecutive projection pictures; dividing the target projection image into sub-regions; encoding the position coordinates of each sub-region to obtain a coding sequence of the position coordinates of each sub-region, and mapping the coding sequence of each sub-region onto the projection picture sequence of that sub-region; before each projection picture is projected, if the coding bit corresponding to a sub-region of the projection picture in the coding sequence is a first preset coding bit, adjusting the color intensity of that sub-region; and projecting each projection picture.
In one embodiment, encoding the position coordinates of each sub-region to obtain a coding sequence of the position coordinates of each sub-region includes: Gray-coding the position coordinates of each sub-region to obtain a Gray code sequence; and Manchester-coding the Gray code sequence to obtain the Manchester code sequence of the position coordinates of each sub-region.
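The two-stage encoding described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the 3-bit coordinate width and the Manchester convention '0' → "01", '1' → "10" are assumptions chosen for the example.

```python
def gray_encode(value: int, width: int) -> str:
    """Binary-reflected Gray code of an integer coordinate, as a fixed-width bit string."""
    return format(value ^ (value >> 1), f"0{width}b")

def manchester_encode(bits: str) -> str:
    """Manchester-code each bit; here '0' -> '01' and '1' -> '10' (an assumed convention)."""
    return "".join("01" if b == "0" else "10" for b in bits)

def encode_position(x: int, y: int, width: int = 3) -> str:
    """Encode a sub-region coordinate (x, y) into one Manchester-coded sequence."""
    return manchester_encode(gray_encode(x, width) + gray_encode(y, width))
```

For example, `encode_position(2, 2)` Gray-codes both coordinates to "011", concatenates them, and Manchester-codes the result into "011010011010"; each bit of that sequence is then carried by one repeated projection picture.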
In one embodiment, if the coding bit of the sub-region of the projection picture in the coding sequence is the first preset coding bit, adjusting the color intensity of the sub-region includes: increasing the color intensity of the R channel of the sub-region by a first preset value and decreasing the color intensity of the B channel of the sub-region by a second preset value, wherein the second preset value is a preset multiple of the first preset value.
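A per-pixel sketch of this modulation step follows. The concrete values `delta=4` and `multiple=2` are illustrative assumptions; the patent only states that the B-channel decrement is a preset multiple of the R-channel increment.

```python
def modulate_pixel(pixel, delta=4, multiple=2):
    """For a first-preset ('1') coding bit: raise the R channel by delta and
    lower the B channel by multiple * delta, clamped to the 8-bit range."""
    r, g, b = pixel
    return (min(255, r + delta), g, max(0, b - multiple * delta))
```

Applying this to every pixel of a sub-region before projection shifts the sub-region's color slightly, which the color sensor can detect while the change remains hard for the human eye to perceive.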
In a second aspect, the application further provides a projection interaction apparatus. The apparatus comprises:
an acquisition module, configured to acquire a projection picture sequence of a target projection image, the projection picture sequence comprising a plurality of consecutive projection pictures; a dividing module, configured to divide the target projection image into sub-regions; an encoding module, configured to encode the position coordinates of each sub-region to obtain a coding sequence of the position coordinates of each sub-region, the coding sequence of each sub-region corresponding to the projection picture sequence of that sub-region; a color intensity adjusting module, configured to adjust, before each projection picture is projected, the color intensity of a sub-region of the projection picture if the coding bit corresponding to that sub-region in the coding sequence is a first preset coding bit; and a projection module, configured to project each projection picture.
In a third aspect, the present application also provides a computer device. The computer device comprises a memory storing a computer program and a processor implementing the following steps when executing the computer program: acquiring a projection picture sequence of a target projection image, wherein the projection picture sequence comprises a plurality of continuous projection pictures; dividing the target projection image into sub-regions; respectively coding the position coordinates of each sub-region, obtaining a coding sequence of the position coordinates of each sub-region, and corresponding the coding sequence of the sub-region with a projection picture sequence of the sub-region; before each projection picture is projected, if the corresponding coding bit of the sub-area of the projection picture in the coding sequence is a first preset coding bit, adjusting the color intensity of the sub-area; and projecting each projection picture.
In a fourth aspect, the present application further provides a computer-readable storage medium. The computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of: acquiring a projection picture sequence of a target projection image, wherein the projection picture sequence comprises a plurality of continuous projection pictures; dividing the target projection image into sub-regions; respectively coding the position coordinates of each sub-region to obtain a coding sequence of the position coordinates of each sub-region, and corresponding the coding sequence of the sub-region to a projection picture sequence of the sub-region; before each projection picture is projected, if the coding bit of the sub-area of the projection picture in the coding sequence is a first preset coding bit, adjusting the color intensity of the sub-area; and projecting each projection picture.
In a fifth aspect, the present application further provides a computer program product. The computer program product comprising a computer program which when executed by a processor performs the steps of: acquiring a projection picture sequence of a target projection image, wherein the projection picture sequence comprises a plurality of continuous projection pictures; dividing the target projection image into sub-regions; respectively coding the position coordinates of each sub-region to obtain a coding sequence of the position coordinates of each sub-region, and corresponding the coding sequence of the sub-region to a projection picture sequence of the sub-region; before each projection picture is projected, if the corresponding coding bit of the sub-area of the projection picture in the coding sequence is a first preset coding bit, adjusting the color intensity of the sub-area; and projecting each projection picture.
In a sixth aspect, the present application provides a projection interaction method. The method comprises the following steps: acquiring a color sensing signal output by a color sensor within a preset signal period; processing the color sensing signal output within the preset signal period to obtain a corresponding digital voltage signal; digitizing the digital voltage signal into a corresponding digitized sequence, wherein if the voltage value of the digital voltage signal is greater than a voltage threshold, the coded bit after digitization is determined to be a first preset coded bit, and if the voltage value is less than or equal to the voltage threshold, the coded bit after digitization is determined to be a second preset coded bit; and decoding the digitized sequence to obtain the corresponding position coordinates.
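The thresholding step above can be sketched in a few lines. The bit labels ('1' for the first preset coded bit, '0' for the second) are assumptions for illustration:

```python
def digitize(voltages, threshold):
    """Map each digital voltage sample to a coded bit: above the threshold it
    becomes the first preset bit ('1'), otherwise the second preset bit ('0')."""
    return "".join("1" if v > threshold else "0" for v in voltages)
```

For instance, `digitize([0.9, 0.2, 0.7, 0.1], 0.5)` yields the digitized sequence "1010", which is then passed to the decoding step.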
In one embodiment, acquiring the color sensing signal output by the color sensor within the preset signal period includes: acquiring the analog current signals output by the color sensor; and taking the analog current signal with the longest receiving duration, together with the analog current signals between it and the next signal with the longest receiving duration, as the color sensing signal output within the preset signal period.
In one embodiment, processing the color sensing signal output within the preset signal period to obtain the corresponding digital voltage signal includes: converting each analog current signal within the receiving duration into a digital voltage signal; and sampling, quantizing, and filtering each digital voltage signal to obtain the corresponding digital voltage signal.
In one embodiment, decoding the digitized sequence to obtain the corresponding position coordinates includes: Manchester-decoding the digitized sequence to obtain a Manchester-decoded sequence; and Gray-decoding the Manchester-decoded sequence to obtain the position coordinates.
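The receiver-side decoding can be sketched as the inverse pair of operations, assuming the common Manchester convention '0' → "01", '1' → "10" (the patent does not fix the convention):

```python
def manchester_decode(bits: str) -> str:
    """Invert the assumed '0' -> '01', '1' -> '10' Manchester mapping."""
    table = {"01": "0", "10": "1"}
    return "".join(table[bits[i:i + 2]] for i in range(0, len(bits), 2))

def gray_decode(bits: str) -> int:
    """Recover the integer coordinate from its binary-reflected Gray code."""
    n = int(bits, 2)
    mask = n >> 1
    while mask:
        n ^= mask   # fold each higher Gray bit into the lower bits
        mask >>= 1
    return n
```

A digitized sequence such as "011010011010" Manchester-decodes to the Gray sequence "011011"; splitting it into halves and Gray-decoding each half recovers the coordinate (2, 2).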
In one embodiment, the position coordinates include R-channel position coordinates and B-channel position coordinates, and the method further comprises: if the R-channel position coordinates are the same as the B-channel position coordinates and match the position coordinates of any pre-stored sub-region, determining that the obtained position coordinates are valid; otherwise, determining that they are invalid.
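This validity check reduces to a simple predicate; the tuple representation of coordinates here is an assumption for illustration:

```python
def validate_position(r_coord, b_coord, stored_subregions):
    """A decoded coordinate is accepted only if the R- and B-channel results
    agree and match the position of some pre-stored sub-region."""
    return r_coord == b_coord and r_coord in stored_subregions
```

Requiring agreement between the two independently decoded channels rejects most single-channel decoding errors before the coordinate is used for positioning.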
In one embodiment, the method further comprises: determining a sliding track by combining the position coordinates corresponding to a plurality of consecutive preset signal periods.
In a seventh aspect, the present application further provides a projection interaction apparatus. The device comprises: the acquisition module is used for acquiring a color sensing signal output by the color sensor in a preset signal period; the processing module is used for processing the color sensing signals output in the preset signal period to obtain corresponding digital voltage signals; the digital voltage signal is digitized into a corresponding digitized sequence, wherein if the voltage value of the digital voltage signal is greater than a voltage threshold value, a coded bit after the digital voltage signal is digitized is determined as a first preset coded bit; if the voltage value of the digital voltage signal is smaller than or equal to the voltage threshold, determining the coded bit of the digital voltage signal after digitization as a second preset coded bit; and the decoding module is used for decoding the digital sequence to obtain the corresponding position coordinate.
In an eighth aspect, the present application further provides a computer device. The computer device comprises a memory storing a computer program and a processor implementing the following steps when executing the computer program: acquiring a color sensing signal output by a color sensor in a preset signal period; processing color sensing signals output in a preset signal period to obtain corresponding digital voltage signals; digitizing the digital voltage signal into a corresponding digitized sequence, wherein if the voltage value of the digital voltage signal is greater than a voltage threshold, a coded bit after the digital voltage signal is digitized is determined as a first preset coded bit; if the voltage value of the digital voltage signal is smaller than or equal to the voltage threshold, determining the coded bit of the digital voltage signal after digitization as a second preset coded bit; and decoding the digital sequence to obtain the corresponding position coordinate.
In a ninth aspect, the present application further provides a computer-readable storage medium. The computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of: acquiring a color sensing signal output by a color sensor in a preset signal period; processing color sensing signals output in a preset signal period to obtain corresponding digital voltage signals; digitizing the digital voltage signal into a corresponding digitized sequence, wherein if the voltage value of the digital voltage signal is greater than a voltage threshold, a coded bit after the digital voltage signal is digitized is determined as a first preset coded bit; if the voltage value of the digital voltage signal is smaller than or equal to the voltage threshold, determining the coded bit of the digital voltage signal after digitization as a second preset coded bit; and decoding the digital sequence to obtain the corresponding position coordinate.
In a tenth aspect, the present application further provides a computer program product. The computer program product comprising a computer program which when executed by a processor performs the steps of: acquiring a color sensing signal output by a color sensor in a preset signal period; processing color sensing signals output in a preset signal period to obtain corresponding digital voltage signals; digitizing the digital voltage signal into a corresponding digitized sequence, wherein if the voltage value of the digital voltage signal is greater than a voltage threshold, a coded bit after the digital voltage signal is digitized is determined as a first preset coded bit; if the voltage value of the digital voltage signal is smaller than or equal to the voltage threshold, determining the coded bit of the digital voltage signal after digitization as a second preset coded bit; and decoding the digital sequence to obtain the corresponding position coordinate.
In an eleventh aspect, the present application provides a projection interaction system. The system comprises a projection device, a color sensor, and a receiving processing device in communication with the color sensor.
the projection device is configured to: acquiring a projection picture sequence of a target projection image, wherein the projection picture sequence comprises a plurality of continuous projection pictures; dividing the target projection image into sub-regions; respectively coding the position coordinates of each sub-region to obtain a coding sequence of the position coordinates of each sub-region, and corresponding the coding sequence of the sub-region to a projection picture sequence of the sub-region; before each projection picture is projected, if the corresponding coding bit of the sub-area of the projection picture in the coding sequence is a first preset coding bit, adjusting the color intensity of the sub-area; projecting each projection picture;
the color sensor is used for outputting a color sensing signal;
the reception processing device is configured to: acquiring a color sensing signal output by the color sensor in a preset signal period; processing the color sensing signals output in the preset signal period to obtain corresponding digital voltage signals; digitizing the digital voltage signal into a corresponding digitized sequence, wherein if the voltage value of the digital voltage signal is greater than a voltage threshold, a coded bit after the digital voltage signal is digitized is determined as a first preset coded bit; if the voltage value of the digital voltage signal is smaller than or equal to the voltage threshold, determining the coded bit of the digital voltage signal after digitization as a second preset coded bit; and decoding the digital sequence to obtain the corresponding position coordinate.
According to the projection interaction method and system, after the projection device acquires the projection picture sequence of the target projection image, it divides the target projection image into sub-regions, encodes the position coordinates of each sub-region to obtain a coding sequence for each, and maps the coding sequence of each sub-region onto its projection picture sequence. Before the projection device projects each projection picture, if the coding bit corresponding to a sub-region of the picture in the coding sequence is the first preset coding bit, the color intensity of that sub-region is adjusted; the projection device then projects each projection picture. The receiving processing device in communication with the color sensor acquires the color sensing signal output by the color sensor within a preset signal period, processes it to obtain the corresponding digital voltage signal, and digitizes the digital voltage signal into a corresponding digitized sequence: if the voltage value of the digital voltage signal is greater than a voltage threshold, the coded bit after digitization is determined to be the first preset coded bit; otherwise it is determined to be the second preset coded bit. Decoding the digitized sequence then yields the corresponding position coordinates.
By encoding the position coordinates of each sub-region separately, the problems of abnormal brightness and flicker in the projected picture are avoided; and because the color intensity of each sub-region is adjusted in each projection picture, the position coordinates recovered from the resulting change in the color sensing signal are highly accurate.
Drawings
FIG. 1 is a block diagram of a projection interaction system in one embodiment;
FIG. 2 is a flow diagram illustrating a method of projection interaction in one embodiment;
FIG. 3 is a schematic diagram of Gray coding of 8 × 8 sub-regions;
FIG. 4 is a schematic illustration of sub-region division and color intensity adjustment of sub-regions;
FIG. 5 is a schematic diagram of color space conversion modulation based on YCbCr color space;
FIG. 6 is a schematic diagram of beacon information for a sub-region;
FIG. 7 is a flowchart illustrating a method of projection interaction in one embodiment;
FIG. 8 is a schematic diagram of a process for acquiring color sensing signals by the receiving and processing device in one embodiment;
FIGS. 9a-9c are schematic diagrams of a test image;
FIGS. 10a-10c are schematic diagrams illustrating the effect of different images and different frame rates on human perception;
FIGS. 11a-11b are schematic diagrams of the human-eye perception rate versus communication distance and viewing angle;
FIG. 12 is a graph showing the effect of sub-region size on beacon identification rate;
FIGS. 13a-13c are graphs illustrating the effect of different images and different frame rates on beacon identification rate;
FIG. 14 is a graph illustrating the effect of communication distance on beacon identification rate;
FIG. 15 is a schematic illustration of a positioning result;
FIG. 16 is a diagram illustrating the effect of different background colors on the positioning accuracy;
FIG. 17 is a block diagram of a projection interaction device in one embodiment;
FIG. 18 is a block diagram showing the construction of a projection interactive apparatus according to another embodiment;
FIG. 19 is a diagram showing an internal structure of a computer device according to an embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
Visual interaction is one of the main applications of human-computer interaction: a user interacts with projected pictures through hand gestures. Existing human-computer interaction methods based on smart projectors, each seeking to improve user experience and device performance, can be divided into three categories.
The first category is depth-camera-based projection systems, which typically use infrared structured light and time-of-flight (ToF) techniques to obtain three-dimensional information about the interactive scene. However, this requires replacing the existing projector with one equipped with a depth camera, which raises the deployment requirements of the projection system. The second category is systems based on ordinary cameras, which add a camera to the projector and use the captured two-dimensional image information to interact with human motion. Although an ordinary camera is easily deployed on a projector, the extensive image-processing algorithms affect the projector's real-time performance and increase the complexity of the projection system. The third category is systems based on non-visual devices, which use sound, pressure, acceleration, light, and other non-visual media to send and receive data. The user wears a lightweight sensor on a finger to interact with the projector. Such systems require only inexpensive additional hardware circuits, and the complexity of their signal-processing algorithms is low.
Among these, digital micromirror device (DMD) projectors based on visible light communication (VLC) are widely used because they require no added or modified hardware; for example, a DMD projector may be used for lecture and meeting projection in an indoor classroom.
During interaction between the projector and the user, the user wears a lightweight receiving sensor, such as a photodiode (PD). The DMD projector embeds different beacon information (i.e., human-computer interaction data) into each pixel and modulates the transmitted beacon signal over a visible-light channel. The PD-based receiving sensor receives the beacon signals sent by the projector, converts the received light signals into electrical signals, and processes them to obtain the human-computer interaction data, completing the interaction. Because the visible-light channel is considered free of electromagnetic interference, it can provide high positioning accuracy. A PD-based receiver therefore has good real-time performance, can support the positioning and tracking of dynamic targets (e.g., a mouse or gestures), and greatly extends the applications of the projector.
However, when a PD-based receiver is used for human-computer interaction, abnormal brightness and flicker of the projected picture lead to low target-positioning accuracy.
Based on this, the present application provides a projection interaction system capable of improving positioning accuracy. FIG. 1 shows a structural block diagram of the projection interaction system, which includes a projection device 102, a color sensor 104, and a receiving processing device 106.
After the projection device 102 acquires the projection picture sequence of the target projection image, it divides the target projection image into sub-regions and encodes the position coordinates of each sub-region to obtain a coding sequence for each. Because the coding sequence of a sub-region corresponds to the projection picture sequence of that sub-region, before each projection picture is projected, if the coding bit corresponding to the sub-region in the coding sequence is the first preset coding bit, the color intensity of the sub-region is adjusted. After the color sensor 104 acquires the color sensing signal through the visible-light channel, the receiving processing device 106 in communication with the color sensor 104 determines the corresponding position coordinates based on the color sensing signal output by the color sensor 104 within a preset signal period, thereby achieving positioning.
The projection device 102 may be composed of two subsystems: a light engine and a driver board. The light engine comprises optics, an RGB LED, and a smart digital micromirror device (DMD) projector, whose model may be DLPDLCR2000EVM; the driver board includes a DLPC2607 display controller and a DLPA1000 processor.
The projection device 102 may obtain the target projection image from a terminal, which may be, but is not limited to, various personal computers, notebook computers, smart phones, tablet computers, smart televisions, and the like; projection device 102 may also obtain the target projection image from a server; thereafter, the projection device 102 may generate a sequence of projection screens of the target projection image.
The terminal or the server may also generate the projection picture sequence of the target projection image and divide the target projection image into sub-regions, in which case the projection device 102 acquires from the terminal or server both the projection picture sequence and the target projection image with the sub-regions already divided.
The color sensor may include a photodiode array, e.g., an N × N photodiode array, where N may be 8 or another value. When N is 8, the photodiodes are arranged in groups of 16 covered by a blue filter, a green filter, a red filter, or no filter, and all photodiodes with the same filter color are connected in parallel to increase the induced current for the detected light intensity. A convex lens may be embedded in the color sensor to increase the intensity of the received optical signal, which facilitates identification of the three primary colors. When the color sensor is used for human-computer interaction, it may be worn on a user's finger as a light sensor, or embedded in a projection pointing tool such as a presenter wand; the color sensor may also be packaged in other ways, which this embodiment does not limit.
In one embodiment, as shown in FIG. 2, a projection interaction method is provided, illustrated here as applied to the projection device 102 in FIG. 1, and includes the following steps:
step 202, acquiring a projection picture sequence of the target projection image.
The target projection image is the image to be projected. Before projecting it, the projection device needs to acquire a projection picture sequence of the target projection image; the sequence comprises a plurality of consecutive projection pictures, which can be understood as repeated projections of the target projection image.
Step 204, dividing the target projection image into sub-regions.
There are multiple sub-regions, and their number is related to the positioning accuracy: the more sub-regions there are, the smaller each sub-region and the higher the positioning accuracy. For example, the size of a sub-region may be 5 cm × 5 cm, 10 cm × 10 cm, or another value; the specific size may be set according to the actual application scenario, and this embodiment is not limited thereto.
And step 206, respectively coding the position coordinates of each sub-region, obtaining a coding sequence of the position coordinates of each sub-region, and corresponding the coding sequence of the sub-region to the projection picture sequence of the sub-region.
The world position coordinates of each sub-region can be obtained by taking the lower left corner of the target projection image as the origin of coordinates; the world position coordinates of a sub-region comprise a plurality of groups, and the region formed by these groups is the sub-region. After the world position coordinates of each sub-region are processed, the position coordinates of each sub-region are obtained as a sequence consisting of "0" and "1". After the position coordinates of each sub-region are respectively coded, the resulting coded sequence of each sub-region's position coordinates is also a sequence of "0" and "1"; coding the position coordinates of each sub-region separately improves the quality of the projection picture of each sub-region.
Wherein the projection picture sequence of the sub-area comprises a plurality of continuous projection pictures of the sub-area; the coding sequence of the sub-region corresponds to the projection picture sequence of the sub-region, and it can be understood that when the projection picture of the sub-region is projected in one period, the number of bits of the coding bits in the coding sequence of the sub-region is the same as the number of times of projection of the projection picture of the sub-region.
For example, the projection screen sequence of the target projection image may be [ a1, a2, A3, a4], where each of a1 to a4 indicates a repeated projection of the target projection image, and when the target projection image is repeatedly projected, a sub-region is also divided for the target projection image which is repeatedly projected, and the divided sub-regions of the target projection image which is repeatedly projected are the same; for example, after the target projection image is divided into sub-regions, one of the sub-regions is q1, the coding sequence of q1 is "0110", the first bit coding bit "0" in "0110" corresponds to q1 in the target projection image corresponding to a1, the second bit coding bit "1" in "0110" corresponds to q1 in the target projection image corresponding to a2, the third bit coding bit "1" in "0110" corresponds to q1 in the target projection image corresponding to A3, and the fourth bit coding bit "0" in "0110" corresponds to q1 in the target projection image corresponding to a 4.
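The bit-to-frame correspondence described above can be sketched as follows; this is an illustrative example only, and the frame identifiers A1 to A4 and the code "0110" are taken from the example in the text:

```python
def map_code_to_frames(code_sequence, projection_frames):
    """Pair each coding bit of a sub-region with one repeated projection
    of the target image (one bit per projection in the sequence)."""
    assert len(code_sequence) == len(projection_frames)
    return list(zip(projection_frames, code_sequence))

# q1's code "0110" across the four repeated projections A1..A4
pairs = map_code_to_frames("0110", ["A1", "A2", "A3", "A4"])
```

Here `pairs` associates "0" with q1 in A1, "1" with q1 in A2, and so on, matching the correspondence described above.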
Specifically, the number of bits in the encoded sequence of sub-regions is related to projection parameters, including the focal length of the projection device, the projection screen size, the Liquid Crystal Display (LCD) panel size, the distance from the projection device to the projection screen, and the preset sub-region size.
For example, the maximum distance D_max from the projection device to the projection screen and the minimum distance D_min from the projection device to the projection screen satisfy equation (1), as follows:

D_max / f_max = D_min / f_min = S_size / L_size    (1)

where f_max is the maximum focal length of the projection device, f_min is the minimum focal length of the projection device, S_size is the projection screen size, and L_size is the LCD panel size.
When D_max, f_max, and L_size are known, or D_min, f_min, and L_size are known, the projection screen size S_size can be obtained from equation (1). Thus, given the preset sub-region size, the number of equal parts of the long side of the projection screen and the number of equal parts of the wide side of the projection screen can be determined, and the number of coded bits in the coding sequence of a sub-region can then be determined by the following formula:
the number of coded bits in the coding sequence of the sub-regions is

N_bits = ⌈log2(m)⌉ + ⌈log2(n)⌉

where m is the number of equal parts into which the long side of the projection screen is divided, n is the number of equal parts into which the wide side is divided, and ⌈·⌉ indicates rounding up.

For example, when each sub-region is 5 cm × 5 cm, the long side of the projection screen may be divided into m = 20 equal parts and the wide side into n = 15 equal parts, so the coding sequence of a sub-region has ⌈log2(20)⌉ + ⌈log2(15)⌉ = 5 + 4 = 9 coded bits.
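The bit-count calculation can be checked numerically; the sketch below assumes, consistent with the example of m = 20 and n = 15 equal parts, a hypothetical 100 cm × 75 cm screen with 5 cm × 5 cm sub-regions:

```python
import math

def code_bits(long_side_cm, wide_side_cm, sub_region_cm):
    """ceil(log2 m) + ceil(log2 n): bits needed to address every
    sub-region, with m / n the equal parts of the long / wide side."""
    m = math.ceil(long_side_cm / sub_region_cm)   # equal parts of long side
    n = math.ceil(wide_side_cm / sub_region_cm)   # equal parts of wide side
    return math.ceil(math.log2(m)) + math.ceil(math.log2(n))

# ceil(log2 20) + ceil(log2 15) = 5 + 4 = 9 coded bits per sub-region
bits = code_bits(100, 75, 5)
```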
Specifically, the projection device codes the position coordinates of each sub-region to obtain the coded sequence of each sub-region's position coordinates as follows: Gray coding is performed on the position coordinates of each sub-region to obtain a Gray code sequence; Manchester coding is then performed on the Gray code sequence to obtain the Manchester code sequence of the position coordinates of each sub-region, where the number of coded bits in the Manchester code sequence is twice that in the Gray code sequence.
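A minimal sketch of the two-stage coding (binary-reflected Gray code followed by Manchester code); the "0" → "01", "1" → "10" Manchester convention is an assumption, since the text does not state which convention is used:

```python
def gray_encode(value, bits):
    """Binary-reflected Gray code: adjacent indices differ in one bit,
    so adjacent sub-regions show no abrupt multi-bit change."""
    return format(value ^ (value >> 1), f"0{bits}b")

def manchester_encode(bit_string):
    """Manchester code ('0' -> '01', '1' -> '10' assumed): doubles the
    bit count and forbids long runs of identical symbols."""
    return "".join("10" if b == "1" else "01" for b in bit_string)

gray = gray_encode(13, 5)              # e.g. column index 13 of 20 -> '01011'
frame_bits = manchester_encode(gray)   # '0110011010', twice the Gray length
```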
The number of continuous projection pictures in the projection picture sequence of the target projection image is the same as the number of bits of the coding sequence of the sub-region, so that the number of continuous projection pictures in the projection picture sequence of the target projection image can be determined by determining the number of bits of the coding sequence of the sub-region.
When an existing PD-based receiver is used for human-computer interaction, abnormal brightness changes can appear in the projection picture. In the projection interaction method provided by this application, Gray coding the position data of each sub-region separately ensures that the coded bits of adjacent sub-regions differ in only one bit, avoiding obvious boundary differences and brightness changes between adjacent sub-regions; Manchester coding the Gray code sequence avoids the flicker perceived by the human eye when a long run of "0" or "1" is transmitted, thereby improving the user experience.
Because of the Gray coding, the relation between the number of projection images and the number of sub-regions is O(log2 n), where n is the number of equal parts of the wide side of the projection screen. Under the condition that the formula for the number of coded bits in the coding sequence of a sub-region is satisfied, fig. 3 provides a schematic diagram of the Gray coding principle for 8 × 8 sub-regions, where black indicates a coded bit of "1" and white indicates a coded bit of "0".
Step 208, before each projection picture is projected, if the coding bit corresponding to a sub-region of the projection picture in the coding sequence is the first preset coding bit, adjusting the color intensity of that sub-region.
The coding sequence of the sub-region is a sequence formed by "0" and "1", so that the first preset coding bit can be set to "1", and thus, before the projection picture of each sub-region is projected, if the coding bit corresponding to the sub-region in the coding sequence is the first preset coding bit, the color intensity of the sub-region can be adjusted, so that the accuracy of target positioning can be improved based on the change of the color intensity.
Specifically, adjusting the color intensity of a sub-region when its corresponding coding bit in the coding sequence is the first preset coding bit includes: increasing the color intensity of the R channel of the sub-region by a first preset value and decreasing the color intensity of the B channel of the sub-region by a second preset value, where the second preset value is a preset multiple of the first preset value.
The preset multiple may be determined by combining the relationship between (R, G, B) values in the YCbCr color space and the RGB color space, and the requirement of the overall light intensity after adjusting the color brightness.
The YCbCr color space is a color coding scheme commonly used in video products such as cameras and digital televisions, and is not an absolute color space but a compressed and shifted version of the YUV color space (Y: signal luminance, U/V: signal chrominance). In the YCbCr color space, Y refers to a luminance component, Cb refers to a blue chrominance component, and Cr refers to a red chrominance component.
Since the human eye is more sensitive to the Y component in the image/video, the human eye cannot perceive the luminance change in the transmitted image when the chrominance component changes between Cb and Cr. In a projection device, the light intensity and color of each pixel in the projected image is produced by additive mixing in different red, green and blue colors (the three primary colors). The conversion between the YCbCr color space and the RGB color space is as follows:
Y709 = 0.2126·R + 0.7152·G + 0.0722·B
Cb = -0.1146·R - 0.3854·G + 0.5·B    (2)
Cr = 0.5·R - 0.4542·G - 0.0458·B
further derivation of equation (2) can yield equation (3), where equation (3) is shown below:
ΔY709 = 0.2126·δ1 + 0.7152·δ2 + 0.0722·δ3
ΔCb = -0.1146·δ1 - 0.3854·δ2 + 0.5·δ3    (3)
ΔCr = 0.5·δ1 - 0.4542·δ2 - 0.0458·δ3

where (δ1, δ2, δ3) is the adjustment applied to the (R, G, B) value.
In equations (2) and (3), Y709 is the perceivable luminance, and the (R, G, B) values in the RGB color space are integers between 0 and 255. Changing the (R, G, B) value must satisfy both robust beacon communication and a luminance change that is hardly perceptible to human eyes, i.e., the overall light intensity after adjusting the color must remain unchanged. Therefore, the constraint condition is set according to equations (2) and (3) as follows:
ΔY709 = 0.2126·δ1 + 0.7152·δ2 + 0.0722·δ3 = 0    (4)
considering the integer nature of a computer system, the requirement is δ 1 、δ 2 And delta 3 Selecting appropriate values such that C b And C r Tends to 0, while Y is 709 The brightness variation of (a) remains unchanged.
From equation (4), it can be found that the weighting factor of the color intensity of the G channel is the highest; if a nonzero δ2 is added to the color intensity of the G channel, equation (4) becomes a complex multi-solution problem. Solving among multiple (δ1, δ2, δ3) candidates would consume a significant amount of computation time at the color sensor and does not guarantee that the computed solution is the correct one.
Further, it can be found from equation (4) that the weight coefficient of the color intensity of the R channel is almost 3 times that of the B channel (0.2126 ≈ 3 × 0.0722). Therefore, the preset multiple may be set to 3: when the first preset value is δ and the preset multiple is 3, if the coding bit corresponding to the sub-region of the projection picture in the coding sequence is the first preset coding bit, δ is added to the color intensity of the R channel of the sub-region and 3δ is subtracted from the color intensity of the B channel, while the color intensity of the G channel remains unchanged, i.e., (δ1, δ2, δ3) = (δ, 0, -3δ).
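The (δ, 0, -3δ) adjustment can be sketched as follows; clamping to the 0–255 range is an added assumption for values near the ends of the range:

```python
def adjust_pixel(r, g, b, delta, code_bit):
    """If the sub-region's coding bit is the first preset bit '1', add
    delta to the R channel and subtract 3*delta from the B channel; the
    G channel stays unchanged, so (d1, d2, d3) = (delta, 0, -3*delta).
    Clamping to [0, 255] is an added assumption."""
    if code_bit != "1":
        return r, g, b
    clamp = lambda v: max(0, min(255, v))
    return clamp(r + delta), g, clamp(b - 3 * delta)

# luminance change per equation (4): 0.2126*5 - 0.0722*15 ~= -0.02, near zero
adjusted = adjust_pixel(120, 130, 140, 5, "1")
```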
If the coding bit corresponding to the sub-region of the projection picture in the coding sequence is not the first preset coding bit, or it is understood that the coding bit corresponding to the sub-region of the projection picture in the coding sequence is the second preset coding bit, the projection device may not adjust the color intensity of the sub-region; the second predetermined encoding bit may be "0".
As shown in fig. 4, a schematic diagram of sub-region division and color intensity adjustment of the sub-regions is provided, the projection device divides the target projection image into the sub-regions, and after gray coding and manchester coding are performed on the position coordinates of each sub-region, a coding sequence of the position coordinates of the sub-regions can be obtained.
For example, in a projection picture in the projection picture sequence, the encoding bits of the sub-regions in the bottom row are sequentially "100111110", so that before the projection picture is projected, when the encoding bits of the sub-regions in the bottom row are the first preset encoding bits, δ is increased in the color intensity of the R channel of the sub-region and 3 δ is decreased in the color intensity of the B channel; when the encoding bit of each subarea in the lowest row is not the first preset encoding bit, the color intensity of the R channel and the color intensity of the B channel of the subarea are not adjusted.
Since the communication distance affects the intensity of the optical signal received by the color sensor, it also affects the first preset value. The color of the projected image influences the first preset value as well; however, in an application scenario the communication distance is usually determined in advance, so the following analyzes the influence of the color of the projected image on the first preset value.
For example, the colors of the projected image may be configured as a monochrome image, the monochrome image colors including: red, orange, yellow, green, blue, violet, white and black. Then, in the case where the communication distance is determined, for example, the communication distance is 2 meters, the influence of the different first preset values on each monochrome image is tested.
According to the test results, the larger the first preset value, the easier it is to decode the position coordinates corresponding to the color channel. However, when the first preset value is greater than 7, the human eye can perceive the light intensity change for most colors. When the monochrome image color is blue, the human eye can still perceive the intensity variation due to the Purkinje effect. The first preset value may therefore be set to any value from 4 to 8 according to the projection parameters, the color of the projected image, and the ambient light conditions.
As shown in fig. 5, a schematic diagram of color space conversion modulation based on YCbCr color space is provided, and it can be found from fig. 5 and equation (4) that when the parameter δ is set too large, the R channel position coordinate and the B channel position coordinate are easily demodulated/decoded, but the overall brightness or color change of the screen may cause human eye perception; if the parameter δ is set too small, the color sensor may not be able to sense the change of color intensity, i.e., cannot acquire a color sensing signal, thereby causing the failure of analyzing the R channel position coordinate and the B channel position coordinate; wherein the reference red/blue intensity is an unmodified base color intensity.
And step 210, projecting each projection picture.
According to the projection interaction method, the projection equipment can acquire the projection picture sequence of the target projection image, divide the target projection image into the sub-regions, further encode the position coordinates of the sub-regions respectively, acquire the encoding sequence of the position coordinates of the sub-regions, and correspond the encoding sequence of the sub-regions to the projection picture sequence of the sub-regions, so that before projection of each projection picture, if the encoding bit corresponding to the sub-region of the projection picture in the encoding sequence is the first preset encoding bit, the color intensity of the sub-region is adjusted, and then the projection picture is projected. Before the projection picture of each sub-area is projected, the color intensity of each sub-area is adjusted, so that the positioning can be realized based on the change of the color intensity of each sub-area, and the positioning accuracy is improved.
Before the projection device transmits the optical signal corresponding to the coded sequence of the position coordinates of each sub-region, it may add a sequence header before the coded sequence and add a parity bit and an end frame after the coded sequence, thereby composing the complete beacon information of each sub-region.
For example, the projection device has an original display resolution of 640 × 340 pixels and a frame rate of 60 Hz. In the complete beacon information of each sub-region, the sequence header is five coded bits, which can be denoted "11111"; nine coded bits represent the position coordinates of the sub-region; the parity bit and the end frame are one coded bit each, each of which can be expressed as "0". The sequence header, parity bit, and end frame may be used for synchronization and verification.
Since the sequence header is five coded bits long, from the perspective of optical signal reception the optical signal corresponding to the first coded bit in the coded sequence of a sub-region lasts the longest. Equivalently, after the optical signal corresponding to the last coded bit in the sub-region's coded sequence is received, reception of the optical signal corresponding to the first coded bit begins again, and its receiving duration is longer than that of the optical signals corresponding to the other coded bits in the coded sequence.
As shown in fig. 6, a schematic diagram of the beacon information of a sub-region is provided. The sequence header lasts 5 bit slots and each remaining bit lasts 1 slot; in other words, the sequence header is 5 bits, the parity bit and the end frame are 1 bit each, and each bit of the coded sequence occupies 1 slot. For example, when the coded sequence of the sub-region is 18 bits and the frame rate is 60 Hz, the total beacon information transmitted by the projection device for the sub-region is 25 bits, which means 25/60 seconds are required to transmit the beacon information for that sub-region.
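The beacon framing can be sketched as follows; computing even parity over the coded bits is an assumption, since the text only specifies that a one-bit parity field is present:

```python
def build_beacon(coded_bits, header="11111"):
    """Frame = 5-bit sequence header + coded bits + 1 parity bit + 1
    end-frame bit. Even parity over the coded bits is an assumption;
    the text only says a parity bit is present."""
    parity = str(coded_bits.count("1") % 2)
    end_frame = "0"
    return header + coded_bits + parity + end_frame

beacon = build_beacon("01" * 9)   # 18 Manchester-coded bits, as in the example
# 5 + 18 + 1 + 1 = 25 bits -> 25/60 s to transmit at a 60 Hz frame rate
```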
The projection picture sequence, the first preset value, and the receiving duration of the analog current signal change with the projection screen size and the required positioning accuracy. If the projection device works in image projection mode, a projected image usually stays for more than 2 seconds, so the projection device can transmit multiple complete pieces of beacon information; if the projection device works in video projection mode and severe dynamic image changes occur in some frames, the receiving and processing device can wait 1 to 2 seconds and then perform position coordinate recognition again to achieve target positioning.
In one embodiment, as shown in fig. 7, a projection interaction method is provided, which is described by taking the method as an example applied to the receiving processing device in fig. 1, and the method includes the following steps:
step 702, acquiring a color sensing signal output by the color sensor in a preset signal period.
Step 704, processing the color sensing signal output in the preset signal period to obtain a corresponding digital voltage signal.
The color sensor can acquire an optical signal transmitted by the projection device, and the color sensor can acquire a color sensing signal after processing the optical signal, so that the receiving and processing device acquires the color sensing signal output by the color sensor in a preset signal period and processes the color sensing signal output in the preset signal period to acquire a corresponding digital voltage signal; the preset signal period is a period set by the color sensor to receive the signal.
Step 706, digitizing the digital voltage signal into a corresponding digitized sequence, wherein if the voltage value of the digital voltage signal is greater than the voltage threshold, determining the coded bit after the digital voltage signal is digitized as a first preset coded bit; and if the voltage value of the digital voltage signal is less than or equal to the voltage threshold value, determining the coded bit of the digital voltage signal after digitization as a second preset coded bit.
Step 708, decoding the digitized sequence to obtain corresponding position coordinates.
The color sensor can be worn on the user's finger as a light sensor or embedded in the projection command tool, so the corresponding position coordinates indicate the position of the user's finger or of the projection command pen in the projection picture; based on these, the user's finger or the projection command tool can be located, i.e., target positioning can be achieved.
Specifically, the receiving and processing device decodes the digitized sequence to obtain the corresponding position coordinates as follows: Manchester decoding is performed on the digitized sequence to obtain a Manchester-decoded digitized sequence; Gray decoding is then performed on the Manchester-decoded sequence to obtain the position coordinates.
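The receiver-side decoding chain can be sketched as the inverse of the two coding stages; the "01" → "0", "10" → "1" Manchester convention is an assumption, since the text does not state which convention is used:

```python
def manchester_decode(bit_string):
    """Inverse of the assumed '0'->'01', '1'->'10' Manchester mapping."""
    if len(bit_string) % 2:
        raise ValueError("Manchester stream must have even length")
    table = {"01": "0", "10": "1"}
    return "".join(table[bit_string[i:i + 2]]
                   for i in range(0, len(bit_string), 2))

def gray_decode(gray_string):
    """Recover the binary index from a binary-reflected Gray code."""
    g = int(gray_string, 2)
    b = 0
    while g:            # repeatedly fold the shifted bits back in
        b ^= g
        g >>= 1
    return b

index = gray_decode(manchester_decode("0110011010"))   # -> 13
```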
According to the above projection interaction method, the receiving and processing device can acquire the color sensing signal output by the color sensor in the preset signal period and process it to obtain the corresponding digital voltage signal; the digital voltage signal is digitized into a corresponding digitized sequence, where a coded bit is determined as the first preset coded bit if the voltage value is greater than the voltage threshold, and as the second preset coded bit if the voltage value is less than or equal to the voltage threshold; the digitized sequence is then decoded to obtain the corresponding position coordinates. Since the optical signals are transmitted after the color intensity has been adjusted, the position coordinates obtained by the receiving and processing device are based on the changed color intensity, so the positioning success rate is high.
Specifically, the receiving and processing device acquires the color sensing signal output by the color sensor in the preset signal period, an optional implementation manner is shown in fig. 8, and fig. 8 is a schematic flow diagram of the receiving and processing device acquiring the color sensing signal provided in this embodiment, and may include the following steps:
step 802, obtaining an analog current signal output by the color sensor.
Wherein the color sensor receives a light signal from the projection device, the color sensor may convert the light signal to an analog current signal.
Step 804, determining the analog current signal with the longest receiving duration, together with the analog current signals between it and the next longest-duration analog current signal, as the color sensing signal output in the preset signal period.
When the projection device transmits the optical signal corresponding to the coded sequence of each sub-region to the color sensor, it adds a sequence header in front of the coded sequence. Based on this header, the first coded bit in each sub-region's coded sequence has the longest receiving duration, so the receiving and processing device can take the analog current signal with the longest receiving duration, together with the analog current signals between it and the next longest-duration signal, as the color sensing signal output by the color sensor in one preset signal period.
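Locating the sequence header by its duration can be sketched as follows, assuming the analog signal has already been reduced to (duration, level) runs; real signals would first need noise filtering:

```python
def split_on_header(runs):
    """runs: list of (duration, level) pairs recovered from the analog
    current signal. The 5-bit sequence header produces the run with the
    longest duration, so everything from one header onward belongs to one
    preset signal period."""
    longest = max(range(len(runs)), key=lambda i: runs[i][0])
    return runs[longest:], longest

# toy trace: a 5-slot header run followed by single/double-slot data runs
period, start = split_on_header([(1, 0), (5, 1), (1, 0), (2, 1), (1, 0)])
```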
Specifically, processing the color sensing signal output in the preset signal period to obtain the corresponding digital voltage signal includes: converting the analog current signals within the receiving duration into voltage signals, and sampling, quantizing, and filtering these voltage signals to obtain the corresponding digital voltage signals.
The receiving and processing device can include a single-chip microcomputer, whose model may be STM32. The color sensor sends the analog current signals to the microcomputer, which converts the analog current signals within the receiving duration into voltage signals via the USB-RS32 interface and samples, quantizes, and filters them to obtain the corresponding digital voltage signals.
In one mode, the single chip microcomputer can send the digital voltage signal to the receiving processing device, the receiving processing device judges a voltage value in the digital voltage signal, and if the voltage value of the digital voltage signal is greater than a voltage threshold, a coding bit after the digital voltage signal is digitized is determined as a first preset coding bit, and the first preset coding bit can be '1'; if the voltage value of the digital voltage signal is less than or equal to the voltage threshold, the coded bit after the digital voltage signal is digitized is determined as a second preset coded bit, and the second preset coded bit may be "0".
In another mode, the single chip microcomputer can send the digital voltage signal to the server, the server judges the voltage value in the digital voltage signal, and if the voltage value of the digital voltage signal is greater than a voltage threshold value, the coding bit of the digital voltage signal after digitization is determined as a first preset coding bit; if the voltage value of the digital voltage signal is smaller than or equal to the voltage threshold value, determining the coded bit of the digital voltage signal after digitization as a second preset coded bit; then, the server can send the digitized sequence of the digitized digital voltage signal to the receiving processing equipment, and then the receiving processing equipment decodes the digitized sequence to obtain the corresponding position coordinate; or the server decodes the digital sequence to obtain a corresponding position coordinate and sends the position coordinate to the receiving and processing equipment; if the application scene is a conference room projection scene, the server may be a local server set in a conference room.
In an embodiment, another optional method embodiment is provided in which the receiving and processing device verifies the obtained position coordinates, where the position coordinates include an R-channel position coordinate and a B-channel position coordinate. The method includes:
and if the R channel position coordinate is the same as the B channel position coordinate and is the same as the position coordinate of any pre-stored subregion, the receiving processing equipment determines that the obtained position coordinate is valid, otherwise, the receiving processing equipment determines that the obtained position coordinate is invalid.
During communication between the projection device and the color sensor, the transmitted data can be affected by factors such as ambient light noise, the responsivity of the photodiode array in the color sensor, and the amplification factor of the operational amplifier in the color sensor. Therefore, to improve the position coordinate recognition rate, a position coordinate is considered reliable only when it is confirmed by both the R-channel and B-channel position coordinates. Cases where the R-channel position coordinate differs from the B-channel position coordinate, or where the two are the same but do not match the pre-stored position coordinate of any sub-region, occur with some probability.
For the case where the R-channel position coordinate differs from the B-channel position coordinate, the receiving and processing device can instruct the color sensor to output the color sensing signal again, up to a preset number of times, and continue to judge the relation between the R-channel and B-channel position coordinates; the preset number of times may be 5 or another value.
For the case where the R-channel position coordinate is the same as the B-channel position coordinate but does not match the position coordinate of any pre-stored sub-region, suppose the probability of mis-identifying the R-channel position coordinate is P_E1 and the probability of mis-identifying the B-channel position coordinate is P_E2. Since the three primary colors are mutually orthogonal, there is no linear correlation between P_E1 and P_E2; therefore, the probability that both the R-channel and B-channel position coordinates are mis-identified at the same time is P_E1 × P_E2. Compared with the single-channel case, P_E1 × P_E2 &lt; P_E1 and P_E1 × P_E2 &lt; P_E2, so using the position coordinates of both channels for the positioning decision improves the positioning accuracy.
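The dual-channel validation rule can be sketched as:

```python
def validate_position(r_coord, b_coord, known_coords):
    """Accept a decoded position only when the R-channel and B-channel
    coordinates agree AND match a pre-stored sub-region coordinate;
    with independent per-channel error rates p1 and p2, both channels
    erring identically is far rarer (~p1*p2) than either alone."""
    return r_coord == b_coord and r_coord in known_coords

valid = validate_position((3, 4), (3, 4), {(3, 4), (5, 6)})   # accepted
```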
Specifically, the receiving processing device may also determine the sliding track by combining position coordinates corresponding to a plurality of consecutive preset signal periods.
After determining the sliding track in combination with the position coordinates corresponding to the continuous multiple preset signal periods, the receiving processing device may determine a processing operation on the target projection image based on the sliding track, for example, the processing operation may be switching to a next image of the target projection image; the sliding track and the processing operation have a corresponding relation.
The color sensor can be worn on the finger of the user as a light sensor and can also be embedded in the projection command tool, so that the receiving and processing device can also determine the gesture direction by combining the position coordinates of the multiple sub-regions based on the color sensing signals output by the color sensor, and further determine the processing operation on the target projection image, wherein the gesture direction and the processing operation have a corresponding relationship, for example, when the gesture direction is upward, the processing operation on the target projection image is used for closing the target projection image.
The projection interaction method provided by the application is tested in the aspects of human eye perception, beacon identification rate and positioning accuracy under the condition of considering factors such as the size of a subregion, different images, different frame rates, communication distance (namely the distance from a projection device to a projection screen), viewing angle and the like; wherein, the beacon identification rate can be understood as the position coordinate identification rate; the experiment is carried out in a laboratory of 5m multiplied by 6m, the lower left corner of a projection screen is used as a coordinate origin, and the height of projection equipment from the ground is 1.2 m.
For the human eye perception test, experiments were carried out at 10:00 a.m. and 8:00 p.m. so that the influence of ambient light on the projection interaction method could be analyzed; the distance between the observer and the projection screen was set to 2 m, and 5 participants were tested for human eye perception.
Specifically, when testing the influence of the sub-region size on human eye perception, the projection device transmitted the coded sequence using a yellow image and a black image as monochromatic background content; the frame rate for transmitting the projection pictures was 60 fps.
When the sub-region sizes were 5 cm × 5 cm, 15 cm × 15 cm and 25 cm × 25 cm, human eyes could not perceive the brightness change under any of the three sizes in the daytime. At night, with sub-region sizes of 5 cm × 5 cm and 15 cm × 15 cm, human eyes could perceive the brightness change under the yellow background image. The reason is that at night the projection screen is the main light source in the room, so most of the brightness received by human eyes comes from the projection screen; since the projection device encodes the position coordinates of each sub-region separately, the brightness variation between adjacent sub-regions may be insignificant, but the coordinate data of non-adjacent sub-regions may differ significantly, thereby causing human eye perception.
Specifically, when testing the influence of different images and different frame rates on human eye perception, the distance between the projection device and the projection screen was 1.5 m. Figs. 9a-9c provide schematic diagrams of the test images: fig. 9a is the Lena image, fig. 9b is the Peppers image, and fig. 9c is a frame of the cartoon video stream.
Three cases were tested: the Lena image transmitting the coded sequence at different frame rates, the Lena image and the Peppers image appearing alternately while transmitting the coded sequence, and the cartoon video stream transmitting the coded sequence; the test results are shown in figs. 10a-10c.
Figs. 10a-10c show the effect of different images and different frame rates on human eye perception, where fig. 10a shows the perception rate when the Lena image transmits the coded sequence at different frame rates, fig. 10b shows the perception rate when the Lena image and the Peppers image alternately transmit the coded sequence, and fig. 10c shows the perception rate for the cartoon video stream under picture brightness changes (i.e. the normal brightness changes generated by the video stream itself), video stream buffering changes, and color changes.
As can be seen from fig. 10a, different frame rates do not cause the human eye to perceive brightness changes in the Lena image; as can be seen from fig. 10b, alternating transmission of the coded sequence between the Lena image and the Peppers image does not cause human eye perception; as can be seen from fig. 10c, the human eye perceives no abnormality under the picture brightness changes, video stream buffering changes, or color changes.
Specifically, when testing the influence of the communication distance and the viewing angle on human eye perception, the test was divided into day and night, using the Lena image to transmit the coded sequence at a frame rate of 60 fps, with the distance between the projection device and the projection screen ranging from 1 m to 5 m and the observer's viewing angle ranging from −60° to 60°; here, 100% means that the human eye cannot perceive the brightness change.
As shown in fig. 11a and 11b, a schematic diagram of the perception rate of the communication distance and the viewing angle for human eyes is provided, wherein fig. 11a shows the perception rate at different communication distances, and fig. 11b shows the perception rate at different viewing angles.
As can be seen from fig. 11a, as the communication distance increases, human eyes can perceive the brightness change: a larger communication distance enlarges the projected picture, which lowers the brightness of the projection screen and makes it more susceptible to interference from ambient light, so that human eyes may perceive the brightness change. From fig. 11b it can be seen that at a distance of 1.5 m, different viewing angles have no effect on human eye perception. Therefore, as can be seen from figs. 11a and 11b, the communication distance is a more important factor affecting human eye perception performance than the viewing angle.
For the beacon identification rate test, the testing was divided into day and night, with the curtains completely closed so that the room was free of ambient light interference. Specifically, when testing the influence of the sub-region size on the beacon identification rate, a convex lens may be embedded in the color sensor, and the size of the convex lens may be 1.5 cm × 1.5 cm.
Fig. 12 provides a result graph of the sub-region size versus the beacon identification rate. It can be seen from fig. 12 that the beacon identification rate decreases as the sub-region size decreases: when the sub-region is small, the color sensor may receive color intensities from several sub-regions at once, causing beacon identification to fail. In addition, as the sub-region size increases, the beacon identification rate also fluctuates, because the color sensor may sit on the boundary between two sub-regions, which likewise prevents a beacon from being identified.
Specifically, when testing the influence of different images and different frame rates on the beacon identification rate, the test conditions were the same as in the corresponding human eye perception test.
Figs. 13a-13c show the effect of different images and different frame rates on the beacon identification rate, where fig. 13a shows the beacon identification rate when the Lena image transmits the coded sequence at different frame rates, fig. 13b shows the beacon identification rate when the Lena image and the Peppers image alternately transmit the coded sequence, and fig. 13c shows the beacon identification rate for the cartoon video stream under picture brightness changes (i.e. the normal brightness changes generated by the video stream itself), video stream buffering changes, and color changes.
As can be seen from figs. 13a and 13b, different frame rates and different images have no effect on the beacon identification rate. From fig. 13c it can be found that when the projection device encodes the position coordinates of each sub-region in the cartoon video stream, the position coordinates may ultimately fail to be identified. The reason is that, in the present application, the position coordinates are identified from the color intensity changes on the R channel and the B channel; because each frame of the cartoon video stream changes rapidly, the reference color intensities of the R and B channels change constantly, so the receiving processing device cannot set a reasonable voltage threshold and therefore fails to identify the position coordinates.
Specifically, when testing the influence of the communication distance on the beacon identification rate, the sub-region size was 15 cm × 15 cm. Fig. 14 provides a result diagram of the influence of the communication distance on the beacon identification rate. It can be seen from fig. 14 that as the communication distance increases, the beacon identification rate decreases, because a larger communication distance reduces the light intensity received by the color sensor, and the useful light intensity is drowned in ambient light noise, resulting in failed position coordinate identification.
For the positioning accuracy test, specifically, the method was tested in the daytime: the positioning accuracy at the edge and at the center of the projection screen was tested, positioning points and a tracking figure were drawn on the projection screen, and the positioning accuracy was evaluated with 30 random positioning points.
Fig. 15 provides a schematic diagram of the positioning results. It can be seen from fig. 15 that with a sub-region size of 10 cm × 10 cm, centimeter-level positioning accuracy can be achieved, with an average positioning error of 8 cm; the positioning accuracy at the edge of the projection screen is the same as at its center, which shows that the provided projection interaction method achieves target positioning well and can provide effective target positioning for any point on the projection screen.
Also for the positioning accuracy test, the images transmitted in the daytime when the positioning results were obtained were projected onto a yellow desktop and a white wall respectively, to observe the influence of different background color planes on the positioning accuracy. Fig. 16 provides a result schematic diagram of the influence of different background colors on the positioning accuracy; it can be found from fig. 16 that different background color planes have almost no influence on the positioning accuracy, so the projection interaction method provided by the present application can perform effective human-computer interaction on different background planes. The above tests show that the projection interaction method can achieve centimeter-level target positioning and extends visible-light-positioning-based human-computer interaction for intelligent projectors.
It should be understood that, although the steps in the flowcharts related to the embodiments as described above are sequentially displayed as indicated by arrows, the steps are not necessarily performed sequentially as indicated by the arrows. The steps are not limited to being performed in the exact order illustrated and, unless explicitly stated herein, may be performed in other orders. Moreover, at least a part of the steps in the flowcharts related to the embodiments described above may include multiple steps or multiple stages, which are not necessarily performed at the same time, but may be performed at different times, and the execution order of the steps or stages is not necessarily sequential, but may be rotated or alternated with other steps or at least a part of the steps or stages in other steps.
Based on the same inventive concept, the embodiment of the application further provides a projection interaction device for implementing the projection interaction method. The implementation scheme for solving the problem provided by the apparatus is similar to the implementation scheme described in the above method, so specific limitations in one or more embodiments of the projection interaction apparatus provided below may refer to the limitations on the projection interaction method in the foregoing, and details are not described here.
In one embodiment, as shown in fig. 17, there is provided a projection interaction apparatus including: an obtaining module 1702, a dividing module 1704, an encoding module 1706, a color intensity adjusting module 1708, and a projecting module 1710, wherein:
an obtaining module 1702, configured to obtain a projection picture sequence of a target projection image, where the projection picture sequence includes a plurality of continuous projection pictures; a dividing module 1704 for dividing the target projection image into sub-regions; a coding module 1706, configured to code the position coordinates of each sub-region, respectively, obtain a coding sequence of the position coordinates of each sub-region, and correspond the coding sequence of the sub-region to a projection picture sequence of the sub-region; a color intensity adjusting module 1708, configured to, before projecting each projection picture, adjust the color intensity of the sub-region if a coding bit corresponding to the sub-region of the projection picture in the coding sequence is a first preset coding bit; and a projection module 1710, configured to project each projection picture.
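As an illustrative sketch of the dividing module's role (not part of the claims), the projection surface could be partitioned into equal square sub-regions, each assigned a (column, row) position coordinate; the function name, the cell size, and the cm units are assumptions for illustration:

```python
def subregion_coordinates(width_cm, height_cm, cell_cm):
    """Divide the projection surface into square sub-regions and assign
    each a (column, row) position coordinate mapped to the cell's
    top-left corner in cm (hypothetical helper)."""
    cols = width_cm // cell_cm
    rows = height_cm // cell_cm
    return {(c, r): (c * cell_cm, r * cell_cm)
            for r in range(rows) for c in range(cols)}

# A 100 cm x 60 cm surface with the 10 cm cells the experiments above
# report centimeter-level accuracy for gives a 10 x 6 grid.
grid = subregion_coordinates(100, 60, 10)
print(len(grid))     # 60
print(grid[(3, 2)])  # (30, 20)
```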
In one embodiment, the encoding module 1706 is further configured to: carrying out Gray coding on the position coordinates of each sub-region respectively to obtain a Gray coding sequence; and carrying out Manchester coding on the Gray coding sequence to obtain the Manchester coding sequence of the position coordinates of each sub-region.
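The two-stage coding above (Gray code, then Manchester code) can be sketched as follows; the bit width and the chip convention (1 → 10, 0 → 01, one common Manchester convention) are assumptions, since the application does not fix them:

```python
def gray_encode(n, width):
    """Gray-code an integer coordinate to `width` bits, MSB first."""
    g = n ^ (n >> 1)
    return [(g >> i) & 1 for i in reversed(range(width))]

def manchester_encode(bits):
    """Manchester-code each bit as a two-chip transition
    (1 -> 10, 0 -> 01; one common convention, assumed here)."""
    out = []
    for b in bits:
        out.extend((1, 0) if b else (0, 1))
    return out

# Encode a coordinate component, e.g. column 5 in 4 bits.
bits = gray_encode(5, 4)          # 5 = 0b0101 -> Gray 0b0111
chips = manchester_encode(bits)
print(bits)   # [0, 1, 1, 1]
print(chips)  # [0, 1, 1, 0, 1, 0, 1, 0]
```

Gray coding keeps adjacent coordinates one bit apart, and Manchester coding guarantees a transition per bit, which limits visible brightness drift during transmission.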
In an embodiment, the color intensity adjustment module 1708 is further configured to: if the coding bit corresponding to the sub-region of the projection picture in the coding sequence is a first preset coding bit, a first preset value is increased for the color intensity of the R channel of the sub-region, and a second preset value is decreased for the color intensity of the B channel of the sub-region, wherein the second preset value is a preset multiple of the first preset value.
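A minimal sketch of that adjustment, assuming RGB channel order, a hypothetical first preset value `delta`, and a preset multiple of 2 (the application leaves the concrete values open):

```python
import numpy as np

def adjust_subregion(frame, region, delta=2, multiple=2):
    """For a first preset coding bit, raise the R channel of the
    sub-region by `delta` and lower the B channel by `multiple * delta`.
    `region` is (y0, y1, x0, x1); values are illustrative presets."""
    y0, y1, x0, x1 = region
    out = frame.astype(np.int16)              # avoid uint8 wrap-around
    out[y0:y1, x0:x1, 0] += delta             # R channel up
    out[y0:y1, x0:x1, 2] -= multiple * delta  # B channel down
    return np.clip(out, 0, 255).astype(np.uint8)

frame = np.full((4, 4, 3), 128, dtype=np.uint8)  # mid-gray test frame
adjusted = adjust_subregion(frame, (0, 2, 0, 2))
print(adjusted[0, 0])  # [130 128 124]
```

Raising R while lowering B by a proportional amount keeps the overall luminance shift small, which is consistent with the human-eye-imperceptibility results reported above.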
In one embodiment, as shown in fig. 18, there is provided a projection interaction apparatus including: an obtaining module 1802, a processing module 1804, a digitizing module 1806, and a decoding module 1808, wherein:
an obtaining module 1802, configured to obtain a color sensing signal output by a color sensor in a preset signal period; the processing module 1804 is configured to process the color sensing signal output within a preset signal period to obtain a corresponding digital voltage signal; a digitizing module 1806, configured to digitize the digital voltage signal into a corresponding digitized sequence, where if a voltage value of the digital voltage signal is greater than a voltage threshold, a coded bit after the digital voltage signal is digitized is determined as a first preset coded bit; if the voltage value of the digital voltage signal is smaller than or equal to the voltage threshold value, determining the coded bit of the digital voltage signal after digitization as a second preset coded bit; a decoding module 1808, configured to decode the digitized sequence to obtain a corresponding position coordinate.
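The thresholding rule of the digitizing module can be sketched directly from the text (with 1 as the first preset coded bit and 0 as the second, an assumption since the application does not name the bit values):

```python
def digitize(voltages, threshold):
    """Map each filtered digital voltage sample to a coded bit:
    above the threshold -> first preset coded bit (1),
    at or below it     -> second preset coded bit (0)."""
    return [1 if v > threshold else 0 for v in voltages]

print(digitize([0.8, 0.2, 0.9, 0.3], 0.5))  # [1, 0, 1, 0]
```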
In one embodiment, the obtaining module 1802 is further configured to: acquiring an analog current signal output by a color sensor; and determining the analog current signal with the longest receiving duration and the analog current signal between the analog current signal and the next analog current signal with the longest receiving duration as the color sensing signal output in the preset signal period.
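One way to read that delimiting rule, sketched under the assumption that the delimiter symbol is held noticeably longer than any data symbol and that samples are already cleaned to discrete values (the function name and sample format are hypothetical):

```python
def split_periods(samples):
    """Group consecutive equal readings into (value, duration) runs,
    treat the longest-duration run as the period delimiter, and return
    the sample runs between delimiters as the preset signal periods."""
    runs = []
    for s in samples:
        if runs and runs[-1][0] == s:
            runs[-1][1] += 1
        else:
            runs.append([s, 1])
    longest = max(dur for _, dur in runs)
    periods, current = [], []
    for value, dur in runs:
        if dur == longest:      # delimiter: close the current period
            if current:
                periods.append(current)
            current = []
        else:
            current.extend([value] * dur)
    if current:
        periods.append(current)
    return periods

# 9 plays the long-held delimiter; 0/1 are data symbols.
print(split_periods([9, 9, 9, 1, 0, 1, 9, 9, 9, 0, 1, 0, 9, 9, 9]))
# [[1, 0, 1], [0, 1, 0]]
```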
In one embodiment, the processing module 1804 is further configured to: converting the analog current signals corresponding to the receiving duration into digital voltage signals; and sampling, quantizing and filtering each digital voltage signal to obtain a corresponding digital voltage signal.
In one embodiment, the decoding module 1808 is further configured to: manchester decoding is carried out on the digital sequence to obtain a digital sequence after Manchester decoding; and performing Gray decoding on the digitized sequence subjected to Manchester decoding to obtain position coordinates.
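The decoding module inverts the two coding stages in reverse order; a sketch under the same chip convention assumed in the encoding example (1 → 10, 0 → 01):

```python
def manchester_decode(chips):
    """Recover bits from chip pairs (10 -> 1, 01 -> 0; the convention
    assumed on the encoding side)."""
    return [1 if chips[i] == 1 else 0 for i in range(0, len(chips), 2)]

def gray_decode(bits):
    """Invert the MSB-first Gray code back to an integer coordinate."""
    n = 0
    for b in bits:
        n = (n << 1) | (b ^ (n & 1))
    return n

chips = [0, 1, 1, 0, 1, 0, 1, 0]   # Manchester stream for Gray 0111
bits = manchester_decode(chips)     # [0, 1, 1, 1]
print(gray_decode(bits))            # 5
```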
In one embodiment, the processing module 1804 is further configured to: and if the R channel position coordinate is the same as the B channel position coordinate and is the same as the position coordinate of any pre-stored subregion, determining that the obtained position coordinate is valid, otherwise, determining that the obtained position coordinate is invalid.
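That validity check can be sketched in a few lines; the function name and the coordinate-set representation are illustrative:

```python
def coordinate_valid(r_coord, b_coord, known_coords):
    """Accept a position fix only when the coordinate decoded from the
    R channel equals the one from the B channel and both match a
    pre-stored sub-region coordinate; otherwise it is invalid."""
    return r_coord == b_coord and r_coord in known_coords

known = {(0, 0), (0, 1), (1, 0), (1, 1)}   # pre-stored sub-regions
print(coordinate_valid((0, 1), (0, 1), known))  # True
print(coordinate_valid((0, 1), (1, 1), known))  # False
```

Requiring the R-channel and B-channel decodes to agree gives a simple redundancy check against noise on either channel.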
In one embodiment, the processing module 1804 is further configured to: and determining the sliding track by combining position coordinates corresponding to a plurality of continuous preset signal periods.
The modules in the projection interaction device can be wholly or partially realized by software, hardware and a combination thereof. The modules can be embedded in a hardware form or independent from a processor in the computer device, and can also be stored in a memory in the computer device in a software form, so that the processor can call and execute operations corresponding to the modules.
In one embodiment, a computer device is provided, which may be a server, and its internal structure diagram may be as shown in fig. 19. The computer device includes a processor, a memory, and a network interface connected by a system bus. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for the operation of an operating system and computer programs in the non-volatile storage medium. The network interface of the computer device is used for communicating with an external terminal through a network connection. The computer program is executed by a processor to implement a projection interaction method.
Those skilled in the art will appreciate that the architecture shown in fig. 19 is merely a block diagram of some of the structures associated with the disclosed aspects and is not intended to limit the computing devices to which the disclosed aspects apply, as particular computing devices may include more or less components than those shown, or may combine certain components, or have a different arrangement of components.
In one embodiment, a computer device is further provided, which includes a memory and a processor, the memory stores a computer program, and the processor implements the steps of the above method embodiments when executing the computer program.
In an embodiment, a computer-readable storage medium is provided, on which a computer program is stored which, when being executed by a processor, carries out the steps of the above-mentioned method embodiments.
In an embodiment, a computer program product is provided, comprising a computer program which, when being executed by a processor, carries out the steps of the above-mentioned method embodiments.
In one embodiment, there is provided a projection interaction system, comprising: a projection device, a color sensor, and a receive processing device in communication with the color sensor,
a projection device to: acquiring a projection picture sequence of a target projection image, wherein the projection picture sequence comprises a plurality of continuous projection pictures; dividing the target projection image into sub-regions; respectively coding the position coordinates of each sub-region to obtain a coding sequence of the position coordinates of each sub-region, and corresponding the coding sequence of the sub-region to a projection picture sequence of the sub-region; before each projection picture is projected, if a coding bit corresponding to a subarea of the projection picture in a coding sequence is a first preset coding bit, adjusting the color intensity of the subarea; projecting each projection picture;
a color sensor for outputting a color sensing signal;
a reception processing device for: acquiring a color sensing signal output by a color sensor in a preset signal period; processing color sensing signals output in a preset signal period to obtain corresponding digital voltage signals; digitizing the digital voltage signal into a corresponding digitized sequence, wherein if the voltage value of the digital voltage signal is greater than a voltage threshold, determining a coded bit after the digital voltage signal is digitized as a first preset coded bit; if the voltage value of the digital voltage signal is smaller than or equal to the voltage threshold value, determining the coded bit of the digital voltage signal after digitization as a second preset coded bit; and decoding the digital sequence to obtain corresponding position coordinates.
It should be noted that, the user information (including but not limited to user device information, user personal information, etc.) and data (including but not limited to data for analysis, stored data, presented data, etc.) referred to in the present application are information and data authorized by the user or sufficiently authorized by each party.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by hardware instructions of a computer program, which can be stored in a non-volatile computer-readable storage medium, and when executed, can include the processes of the embodiments of the methods described above. Any reference to memory, database, or other medium used in the embodiments provided herein may include at least one of non-volatile and volatile memory. The nonvolatile Memory may include a Read-Only Memory (ROM), a magnetic tape, a floppy disk, a flash Memory, an optical Memory, a high-density embedded nonvolatile Memory, a resistive Random Access Memory (ReRAM), a Magnetic Random Access Memory (MRAM), a Ferroelectric Random Access Memory (FRAM), a Phase Change Memory (PCM), a graphene Memory, and the like. Volatile Memory can include Random Access Memory (RAM), external cache Memory, and the like. By way of illustration and not limitation, RAM can take many forms, such as Static Random Access Memory (SRAM) or Dynamic Random Access Memory (DRAM), for example. The databases referred to in various embodiments provided herein may include at least one of relational and non-relational databases. The non-relational database may include, but is not limited to, a block chain based distributed database, and the like. The processors referred to in the embodiments provided herein may be general purpose processors, central processing units, graphics processors, digital signal processors, programmable logic devices, quantum computing based data processing logic devices, etc., without limitation.
The technical features of the above embodiments can be arbitrarily combined, and for the sake of brevity, all possible combinations of the technical features in the above embodiments are not described, but should be considered as the scope of the present specification as long as there is no contradiction between the combinations of the technical features.
The above-mentioned embodiments only express several embodiments of the present application, and the description thereof is more specific and detailed, but not construed as limiting the scope of the present application. It should be noted that, for a person skilled in the art, several variations and modifications can be made without departing from the concept of the present application, which falls within the scope of protection of the present application. Therefore, the protection scope of the present application should be subject to the appended claims.

Claims (10)

1. A method of projection interaction, the method comprising:
acquiring a projection picture sequence of a target projection image, wherein the projection picture sequence comprises a plurality of continuous projection pictures;
dividing the target projection image into sub-regions;
respectively coding the position coordinates of each sub-region to obtain a coding sequence of the position coordinates of each sub-region, and corresponding the coding sequence of the sub-region to a projection picture sequence of the sub-region;
before each projection picture is projected, if the corresponding coding bit of the sub-area of the projection picture in the coding sequence is a first preset coding bit, adjusting the color intensity of the sub-area;
and projecting each projection picture.
2. The method according to claim 1, wherein the encoding the position coordinates of each of the sub-regions respectively to obtain an encoded sequence of the position coordinates of each of the sub-regions comprises:
carrying out Gray coding on the position coordinates of the sub-regions respectively to obtain a Gray coding sequence;
and carrying out Manchester coding on the Gray coding sequence to obtain the Manchester coding sequence of the position coordinates of each sub-region.
3. The method according to claim 1 or 2, wherein if the corresponding code bit of the sub-region of the projection picture in the code sequence is a first preset code bit, adjusting the color intensity of the sub-region comprises:
if the coding bit of the sub-region of the projection picture in the coding sequence is the first preset coding bit, increasing a first preset value for the color intensity of the R channel of the sub-region, and decreasing a second preset value for the color intensity of the B channel of the sub-region, wherein the second preset value is a preset multiple of the first preset value.
4. A method of projection interaction, the method comprising:
acquiring a color sensing signal output by a color sensor in a preset signal period;
processing the color sensing signals output in the preset signal period to obtain corresponding digital voltage signals;
digitizing the digital voltage signal into a corresponding digitized sequence, wherein if the voltage value of the digital voltage signal is greater than a voltage threshold, a coded bit after the digital voltage signal is digitized is determined as a first preset coded bit; if the voltage value of the digital voltage signal is smaller than or equal to the voltage threshold, determining the coded bit of the digital voltage signal after digitization as a second preset coded bit;
and decoding the digital sequence to obtain corresponding position coordinates.
5. The method of claim 4, wherein the obtaining the color sensing signal output by the color sensor in a preset signal period comprises:
acquiring an analog current signal output by the color sensor;
and determining the analog current signal with the longest receiving duration and the analog current signal between the analog current signal and the next analog current signal with the longest receiving duration as the color sensing signal output in the preset signal period.
6. The method according to claim 5, wherein the processing the color sensing signals output in the preset signal period to obtain corresponding digital voltage signals comprises:
converting the analog current signals corresponding to the receiving duration into digital voltage signals;
and sampling, quantizing and filtering each digital voltage signal to obtain the corresponding digital voltage signal.
7. The method of any one of claims 4 to 6, wherein said decoding the digitized sequence to obtain corresponding position coordinates comprises:
manchester decoding is carried out on the digital sequence to obtain a digital sequence after Manchester decoding;
and carrying out Gray decoding on the digital sequence subjected to Manchester decoding to obtain the position coordinate.
8. The method of claim 7, wherein the location coordinates comprise R-channel location coordinates and B-channel location coordinates, the method further comprising:
and if the R channel position coordinate is the same as the B channel position coordinate and is the same as the position coordinate of any prestored subarea, determining that the obtained position coordinate is valid, otherwise, determining that the obtained position coordinate is invalid.
9. The method of claim 8, further comprising:
and determining a sliding track by combining position coordinates corresponding to a plurality of continuous preset signal periods.
10. A projection interaction system, the system comprising: a projection device, a color sensor, and a receive processing device in communication with the color sensor,
the projection device is configured to:
acquiring a projection picture sequence of a target projection image, wherein the projection picture sequence comprises a plurality of continuous projection pictures;
dividing the target projection image into sub-regions;
respectively coding the position coordinates of each sub-region to obtain a coding sequence of the position coordinates of each sub-region, and corresponding the coding sequence of the sub-region to a projection picture sequence of the sub-region;
before each projection picture is projected, if the corresponding coding bit of the sub-area of the projection picture in the coding sequence is a first preset coding bit, adjusting the color intensity of the sub-area;
projecting each projection picture;
the color sensor is used for outputting a color sensing signal;
the reception processing device is configured to:
acquiring a color sensing signal output by the color sensor in a preset signal period;
processing the color sensing signals output in the preset signal period to obtain corresponding digital voltage signals;
digitizing the digital voltage signal into a corresponding digitized sequence, wherein if the voltage value of the digital voltage signal is greater than a voltage threshold, a coded bit after the digital voltage signal is digitized is determined as a first preset coded bit; if the voltage value of the digital voltage signal is smaller than or equal to the voltage threshold, determining the coded bit of the digital voltage signal after digitization as a second preset coded bit;
and decoding the digital sequence to obtain the corresponding position coordinate.
CN202210606437.4A 2022-05-31 2022-05-31 Projection interaction method and system Pending CN115016716A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210606437.4A CN115016716A (en) 2022-05-31 2022-05-31 Projection interaction method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210606437.4A CN115016716A (en) 2022-05-31 2022-05-31 Projection interaction method and system

Publications (1)

Publication Number Publication Date
CN115016716A true CN115016716A (en) 2022-09-06

Family

ID=83070926

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210606437.4A Pending CN115016716A (en) 2022-05-31 2022-05-31 Projection interaction method and system

Country Status (1)

Country Link
CN (1) CN115016716A (en)

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6480175B1 (en) * 1999-09-17 2002-11-12 International Business Machines Corporation Method and system for eliminating artifacts in overlapped projections
JP2010230925A (en) * 2009-03-26 2010-10-14 Tamura Seisakusho Co Ltd Projection system, projection method, projection program, and projection vector operation device
US20120320042A1 (en) * 2011-06-15 2012-12-20 Scalable Display Technologies, Inc. System and method for color and intensity calibrating of a display system for practical usage
US20150244911A1 (en) * 2014-02-24 2015-08-27 Tsinghua University System and method for human computer interaction
WO2018012929A1 (en) * 2016-07-14 2018-01-18 Samsung Electronics Co., Ltd. Projection system with enhanced color and contrast
CN107728413A (en) * 2016-08-12 2018-02-23 深圳市光峰光电技术有限公司 Optical projection system and the brightness adjusting method applied to optical projection system
CN108332670A (en) * 2018-02-06 2018-07-27 浙江大学 A kind of structured-light system coding method for merging the positive and negative Gray code of RGB channel and striped blocks translation
JP2018197824A (en) * 2017-05-24 2018-12-13 キヤノン株式会社 Projection device, information processing apparatus, and method for controlling them, and program
CN110177265A (en) * 2019-06-11 2019-08-27 成都极米科技股份有限公司 Method for controlling projection, projection control and optical projection system
JP2020053710A (en) * 2018-09-21 2020-04-02 キヤノン株式会社 Information processing apparatus, projection device, control method of projection device, and program
CN112770095A (en) * 2021-01-28 2021-05-07 广州方硅信息技术有限公司 Panoramic projection method and device and electronic equipment
CN113556521A (en) * 2020-04-23 2021-10-26 青岛海信激光显示股份有限公司 Projection image correction method and projection system
US20210409666A1 (en) * 2020-06-29 2021-12-30 Coretronic Corporation Projection positioning system and projection positioning method thereof
WO2022088419A1 (en) * 2020-10-26 2022-05-05 成都极米科技股份有限公司 Projection method, apparatus, and device, and computer-readable storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Zhu Bo; Xie Lijun; Yang Tingjun; Wang Qihui; Zheng Yao: "Projection Image Correction Algorithm Adaptive to Complex Environments", Journal of Computer-Aided Design & Computer Graphics, no. 07, 15 July 2012 (2012-07-15) *

Similar Documents

Publication Publication Date Title
US11373338B2 (en) Image padding in video-based point-cloud compression CODEC
US11348283B2 (en) Point cloud compression via color smoothing of point cloud prior to texture video generation
US11699217B2 (en) Generating gaze corrected images using bidirectionally trained network
CN108206951B (en) Method of encoding an image comprising privacy occlusion
US10582196B2 (en) Generating heat maps using dynamic vision sensor events
US20240129636A1 (en) Apparatus and methods for image encoding using spatially weighted encoding quality parameters
CN111512342A (en) Method and device for processing repeated points in point cloud compression
CN110463206B (en) Image filtering method, device and computer readable medium
US10593028B2 (en) Method and apparatus for view-dependent tone mapping of virtual reality images
CN113841411A (en) Single-pass boundary detection in video-based point cloud compression
US20200267396A1 (en) Human visual system adaptive video coding
CN112887728A (en) Electronic device, control method and system of electronic device
TW201436542A (en) System and method for improving video encoding using content information
Boyle The effects of capture conditions on the CAMSHIFT face tracker
JP5950605B2 (en) Image processing system and image processing method
EP3067857A1 (en) Method and device for processing a peripheral image
CN109191398B (en) Image processing method, image processing device, computer-readable storage medium and electronic equipment
JP2007524286A (en) System and method for processing image data
KR20190080732A (en) Estimation of illumination chromaticity in automatic white balancing
US11025891B2 (en) Method and system for generating a two-dimensional and a three-dimensional image stream
KR20220124151A (en) Passing Attributes in V-PCC
CN115016716A (en) Projection interaction method and system
US11221477B2 (en) Communication apparatus, communication system, and data communication method
CN107318021B (en) Data processing method and system for remote display
CN115086665A (en) Error code masking method, device, system, storage medium and computer equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination