CN117459663B - Projection light self-correction fitting and multicolor repositioning method and device - Google Patents


Info

Publication number
CN117459663B
Authority
CN
China
Prior art keywords
virtual
reality system
array
real
information
Legal status
Active
Application number
CN202311778131.8A
Other languages
Chinese (zh)
Other versions
CN117459663A (en)
Inventor
任志忠 (Ren Zhizhong)
Current Assignee
Beijing Tiantu Wanjing Technology Co., Ltd.
Original Assignee
Beijing Tiantu Wanjing Technology Co., Ltd.
Application filed by Beijing Tiantu Wanjing Technology Co., Ltd.
Priority to CN202311778131.8A
Publication of CN117459663A
Application granted
Publication of CN117459663B


Abstract

An embodiment of the invention provides a method and a device for projection-light self-correcting fitting and multi-color repositioning, comprising the following steps: acquiring data information of a real shooting space and synchronizing it to a virtual reality system; setting up a real joint array in the real shooting space according to user requirements; setting up a virtual joint array in the virtual reality system according to the real joint array; acquiring first mapping information of the virtual reality system; performing parameter adjustment and color fusion of the projection light on the virtual joint array in the virtual reality system according to the working components of the virtual reality system and the first mapping information, and obtaining second mapping information of the virtual reality system after the color fusion; and repositioning and parameterizing the real joint array of the real shooting space according to the second mapping information and the virtual joint array, so that the colors of the real shooting space and the virtual reality system are consistent. The method links the real world and the virtual world interactively and provides users with a convincing virtual reality experience.

Description

Projection light self-correction fitting and multicolor repositioning method and device
Technical Field
The invention relates to the field of virtual production and spatial audio-visual effects, and in particular to a method and a device for projection-light self-correcting fitting and multi-color repositioning.
Background
In the field of audio-visual production, shooting and presentation techniques have gone through many iterations and updates. In the current era of LED virtual production, a high-quality LED screen is used as the background against which the composite picture is shot. During LED shooting, highly reflective objects directly reflect the light of the LED wall, which provides illumination for the people and objects that would otherwise stand before a green screen and effectively reduces green-screen color interference. However, an LED screen has very limited plasticity: it cannot be shaped into complex forms (such as three-dimensional triangles or three-dimensional diamonds) to create spatial effects, and rigging lights along complex three-dimensional folded tracks is costly and yields unsatisfactory results. Therefore, the prior art can only provide a spatially uniform light-attribute effect and cannot present the correct light attributes according to the correct spatial correspondence of the picture.
Disclosure of Invention
The embodiments of the invention aim to provide a method and a device for projection-light self-correcting fitting and multi-color repositioning that map virtual parameters onto the real world and give users a convincing virtual reality experience.
To achieve the above object, an embodiment of the present invention provides a method for projection-light self-correcting fitting and multi-color repositioning, the method comprising:
acquiring data information of a real shooting space, and synchronizing the data information to a virtual reality system;
setting up a real joint array in the real shooting space according to user requirements;
setting up a virtual joint array in the virtual reality system according to the real joint array;
acquiring first mapping information of the virtual reality system;
performing parameter adjustment and color fusion of the projection light on the virtual joint array in the virtual reality system according to the working components of the virtual reality system and the first mapping information, and obtaining second mapping information of the virtual reality system after the color fusion;
repositioning and parameterizing the real joint array of the real shooting space according to the second mapping information and the virtual joint array, so that the colors of the real shooting space and the virtual reality system are consistent;
wherein the real joint array and the virtual joint array both consist of projected light spots.
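To make the claimed flow concrete, here is a minimal Python sketch that strings the six steps together. The patent defines no software interface, so every name in it (projection_light_pipeline, capture_data, fuse_colors and the rest) is a hypothetical stand-in chosen for illustration, not the actual implementation.

```python
# Hypothetical end-to-end sketch of the six claimed steps; all objects and
# method names are illustrative stand-ins, not defined by the patent.
def projection_light_pipeline(real_space, vr_system, user_layout):
    # Step 1: acquire real-space data and synchronize it to the VR system
    data = real_space.capture_data()      # projection origin, imaging plane,
    vr_system.synchronize(data)           # visual, light, positioning info

    # Step 2: lay out the real joint array to the user's requirements
    real_array = real_space.create_joint_array(user_layout)

    # Step 3: mirror it as a virtual joint array with the same form
    virtual_array = vr_system.create_joint_array(real_array.form())

    # Step 4: read the first mapping information
    first_mapping = vr_system.mapping_info(virtual_array)

    # Step 5: adjust parameters and fuse projection-light colors virtually
    second_mapping = vr_system.fuse_colors(virtual_array, first_mapping)

    # Step 6: reposition the real array so real and virtual colors match
    real_array.reposition(second_mapping, virtual_array)
```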
Optionally, the data information comprises at least projection origin information, imaging plane information, visual information, projection light information and positioning information;
and the synchronizing of the data information to the virtual reality system comprises: providing a virtual reality system equipped with at least a virtual camera and a virtual locator; and synchronizing the data information to the virtual camera so as to stay synchronized with the camera and locator of the real shooting space.
Optionally, the setting up of a virtual joint array in the virtual reality system according to the real joint array comprises:
adjusting the projection light of the virtual joint array according to the form of the real joint array in the real shooting space, so that the form of the real joint array in the real shooting space is consistent with the form of the virtual joint array in the virtual reality system.
Optionally, the projection light is emitted by controllably programmable pixel beads in at least one joint array, and the light-efficiency parameters of the projection light include at least one of brightness, color, saturation, color temperature, color rendering and intensity.
Optionally, the repositioning and parameterizing of the real joint array of the real shooting space according to the second mapping information and the virtual joint array comprises:
determining color information of the virtual reality system according to the second mapping information and the virtual joint array;
mapping the color information onto the projection light parameters of the real shooting space;
repositioning the projection light parameters according to the color information of the virtual reality system;
synchronizing the updated color information to the real camera so as to stay synchronized with the camera and locator of the real shooting space;
wherein the repositioning includes calibrating, fitting, adapting and matching the actual projection light.
Optionally, the first mapping information and the second mapping information include at least the position, range, direction, distance, brightness, color, saturation, color temperature, color rendering and intensity of the projected light spots.
In another aspect, the present invention provides a device for projection-light self-correcting fitting and multi-color repositioning, comprising:
an acquisition module for acquiring data information of a real shooting space and synchronizing the data information to a virtual reality system;
a first processing module for setting up a real joint array in the real shooting space according to user requirements;
a second processing module for setting up a virtual joint array in the virtual reality system according to the real joint array;
a third processing module for acquiring first mapping information of the virtual reality system, performing parameter adjustment and color fusion of the projection light on the virtual joint array in the virtual reality system according to the working components of the virtual reality system and the first mapping information, and obtaining second mapping information of the virtual reality system after the color fusion;
a fourth processing module for repositioning and parameterizing the real joint array of the real shooting space according to the second mapping information and the virtual joint array, so that the colors of the real shooting space and the virtual reality system are consistent;
wherein the real joint array and the virtual joint array both consist of projected light spots.
Optionally, the data information comprises at least projection origin information, imaging plane information, visual information, projection light information and positioning information;
the synchronizing of the data information to the virtual reality system comprises: providing a virtual reality system equipped with at least a virtual camera and a virtual locator;
the working components in the virtual reality system include: at least one virtual camera, a virtual locator, a specific component unit, an abnormal mapping processing unit, a calibration fitting mapping unit, an AI illumination analysis unit, an AI mapping analysis unit, an overall color fusion unit, a partial color mapping unit, a virtual display device and a multi-connected controller;
and synchronizing the data information to the virtual camera so as to stay synchronized with the camera and locator of the real shooting space.
Optionally, the setting up of a virtual joint array in the virtual reality system according to the real joint array comprises:
adjusting the projection light of the virtual joint array according to the form of the real joint array in the real shooting space, so that the form of the real joint array in the real shooting space is consistent with the form of the virtual joint array in the virtual reality system.
Optionally, the multi-connected controller and the joint array controller reposition and parameterize the real joint array of the real shooting space according to the second mapping information and the virtual joint array, including:
determining color information of the virtual reality system according to the second mapping information and the virtual joint array;
mapping the color information onto the projection light parameters of the real shooting space;
repositioning the projection light parameters according to the color information of the virtual reality system;
synchronizing the updated color information to the real camera so as to stay synchronized with the camera and locator of the real shooting space;
wherein the repositioning includes calibrating, fitting, adapting and matching the actual projection light.
The method for projection-light self-correcting fitting and multi-color repositioning of the invention comprises: acquiring data information of a real shooting space and synchronizing it to a virtual reality system; setting up a real joint array in the real shooting space according to user requirements; setting up a virtual joint array in the virtual reality system according to the real joint array; acquiring first mapping information of the virtual reality system; performing parameter adjustment and color fusion of the projection light on the virtual joint array in the virtual reality system according to the working components of the virtual reality system and the first mapping information, and obtaining second mapping information of the virtual reality system after the color fusion; and repositioning and parameterizing the real joint array of the real shooting space according to the second mapping information and the virtual joint array, so that the colors of the real shooting space and the virtual reality system are consistent, the real joint array and the virtual joint array both consisting of projected light spots. Through the cooperation of the virtual reality system and the joint array of the real shooting space, the invention restores and maps the projection points of the real shooting space to the corresponding projection points in the virtual reality system, links the real world and the virtual world interactively, and provides users with a convincing virtual reality experience.
Additional features and advantages of embodiments of the invention will be set forth in the detailed description which follows.
Drawings
The accompanying drawings are included to provide a further understanding of embodiments of the invention and constitute a part of this specification; they illustrate embodiments of the invention and, together with the description, serve to explain them without limitation. In the drawings:
FIG. 1 is a schematic flow chart of a method for self-correcting fitting and multi-color repositioning of projected light according to the present invention;
FIG. 2 is a schematic diagram of a joint array of the present invention;
FIG. 3 is a schematic diagram of an embodiment of a projected light self-correction fitting and multi-color repositioning method of the present invention;
FIG. 4 is a schematic diagram of imaging of a virtual three-dimensional space and a real shooting space in a virtual reality system of the present invention;
FIG. 5 is a schematic diagram of a virtual environment import virtual reality system of the present invention;
FIG. 6 is a schematic diagram of the instant-communication and control modes of the multi-connected controller and the joint array controller according to the present invention;
FIG. 7 is a schematic view of the virtual world cyclically and repeatedly mapping projected light to the real world according to the present invention;
FIG. 8 is a schematic diagram of the interior of a joint array controller of the present invention.
Description of the reference numerals
100-a first camera;
200-a first locator;
300-a second camera;
400-a second locator;
1-a first joint array;
2-a second joint array;
3-a third joint array;
4-a fourth joint array.
Detailed Description
The following describes in detail the implementation of embodiments of the present invention with reference to the drawings. It should be understood that the detailed description and specific examples, while indicating and illustrating the invention, are not intended to limit it.
Fig. 1 is a schematic flow chart of the method for projection-light self-correcting fitting and multi-color repositioning according to the present invention. As shown in fig. 1, the method includes the following steps. Step S101: acquire data information of the real shooting space and synchronize it to the virtual reality system. Specifically, the data information includes at least projection origin information, imaging plane information, visual information, projection light information and positioning information. Acquiring the data information of the real shooting space means shooting the imaging space of the real world with a real camera to obtain basic information about the people and the real joint array, and locating, with at least one real locator, the content captured by the real camera, such as the real-world people, scenes, objects and the real joint array.
Specifically, synchronizing the data information to the virtual reality system includes: providing a virtual reality system equipped with at least a virtual camera and a virtual locator, and synchronizing the data information to the virtual camera so as to stay synchronized with the camera and locator of the real shooting space. The working components in the virtual reality system include: at least one virtual camera, a virtual locator, a specific component unit, an abnormal mapping processing unit, a calibration fitting mapping unit, an AI illumination analysis unit, an AI mapping analysis unit, an overall color fusion unit, a partial color mapping unit, a virtual display device and a multi-connected controller.
The virtual reality system is an application scenario, namely a PC display terminal and control terminal with a defined connection to the real world. By feeding in synchronized real-world information, the effect to be achieved in the real world is simulated in the virtual reality system, so that the user can realize the effect a client requires inside a highly realistic three-dimensional virtual environment. The virtual environment comprises elements such as scenes, objects, characters and joint arrays; it can be adjusted until the required effect is reached and the required information obtained, which reduces the consumption of shooting and personnel resources in the real world. A multi-connected controller component exists in the virtual reality system, so that the requirement information of the adjusted virtual-environment effect is transmitted and output to the real world through the multi-connected controller. The components in the virtual reality system include a virtual interaction device, a virtual camera, a virtual locator, a specific component unit, an abnormal mapping processing unit, a calibration fitting mapping unit (i.e., a virtual projection device), an AI illumination analysis unit (i.e., a virtual illumination device), an AI mapping analysis unit, an overall color fusion unit (i.e., a virtual projection environment), a partial color mapping unit, a virtual display device, a multi-connected controller device, and the like. Virtual cameras and virtual locators may be added or removed according to the user's needs, and many may exist at the same time. The virtual interaction device directly presents the interaction result of the virtual display device, i.e., the interactive visual effect of the virtual reality system, in the virtual reality device; it carries the interaction information generated between itself and the whole virtual world whenever elements such as scenes, objects, characters and joint arrays in the virtual world change.
Step S102: set up a real joint array in the real shooting space according to the user's requirements. The real joint array and the virtual joint array both consist of projected light spots. A projected light spot carries at least imaging plane information, visual information, projection light information and positioning information. The projection light is emitted by the pixel beads in at least one joint array, and its light-efficiency parameters include brightness, color, saturation, color temperature, color rendering, intensity and the like. The color fusion of the projection light covers properties such as the mixed projection light direction, range, distance and position. A joint array is preferentially controlled by the joint array controller. For example, a joint array is composed of n pixel beads, which are controllably programmable beads; "controllably programmable" here means that the beads are displayed under direct programmatic control of the virtual reality system. As shown in fig. 2, a joint array may take any form composed of n controllably programmable pixel beads, such as a water-wave shape, a circle, a rectangle, stripes, dots or a three-dimensional body; arrays of such arbitrary shapes are called joint arrays. A joint array controller may control a plurality of joint arrays, which can be freely combined into arrays of new shapes.
Specifically, the joint array controller controls the brightness and color of the controllably programmable pixel beads in each joint array according to the data signals of the multi-connected controller in the virtual reality system, so that the entire array of controllably programmable pixel beads is precisely controlled. The joint array controller can also adjust the arrangement and number of the joint arrays as required, so that different projection and visual effects can be realized. A minimal sketch of such a controller follows.
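As an illustration of how such per-bead control could be organized, the following Python sketch models a joint array and its controller. None of these classes come from the patent; the data model (indexed beads with RGB color and normalized brightness) is an assumption made for the example.

```python
# Illustrative data model only: bead addressing, RGB color and a 0..1
# brightness scale are assumptions, not specified by the patent.
from dataclasses import dataclass, field


@dataclass
class PixelBead:
    color: tuple[int, int, int] = (0, 0, 0)   # 8-bit RGB
    brightness: float = 0.0                   # normalized 0.0 .. 1.0


@dataclass
class JointArray:
    beads: list[PixelBead] = field(default_factory=list)


class JointArrayController:
    """Drives one or more joint arrays from multi-connected-controller signals."""

    def __init__(self, arrays: list[JointArray]):
        self.arrays = arrays

    def apply_signal(self, array_idx: int, bead_idx: int,
                     color: tuple[int, int, int], brightness: float) -> None:
        # Set one bead's color and clamp brightness to the valid range.
        bead = self.arrays[array_idx].beads[bead_idx]
        bead.color = color
        bead.brightness = max(0.0, min(1.0, brightness))
```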
Step S103: set up a virtual joint array in the virtual reality system according to the real joint array. According to a specific embodiment, this includes adjusting the projection light of the virtual joint array according to the form of the real joint array in the real shooting space, so that the form of the real joint array in the real shooting space is consistent with the form of the virtual joint array in the virtual reality system, and adjusting the parameters accordingly.
In particular, a system is created in which at least one second camera 300 can receive and analyze the virtual projection light information of the real joint array, in order to measure the intensity and direction of the mapped projection light at different times and locations. A network of multiple virtual sensors is designed and deployed. First, each virtual sensor can measure the brightness, color, saturation, color temperature, color rendering and intensity of the projection light, as well as its direction, range and distance; this is typically accomplished by the adaptive optics unit of the working components shown in fig. 2. The virtual sensor network must be able to monitor changes of the virtual projection light continuously or periodically over time and location.
At least one second locator 400 is designed that can receive position information from the processed data and convert it into a position in virtual space. The second locator 400 receives position information, which may include position coordinates, direction angles, speed and so on; this information may be transmitted to the second locator 400 over a wireless link or a wired connection. Data conversion: the second locator 400 translates this position information into a position in virtual space. This is achieved with a coordinate transformation algorithm among the AI algorithms of the virtual reality system, which converts position information in physical space into coordinates in virtual space. The accuracy of the coordinate transformation directly affects the accuracy of the position in virtual space, so the AI module selects an appropriate coordinate transformation algorithm and performs fine parameter adjustment and calibration. High-performance computing: to achieve a more accurate position conversion, the second locator 400 has high-performance computing capability, which may be provided by a high-performance computer or dedicated hardware. For example, the AI module may run high-performance computations on the positioning information of the second locator 400 to achieve a more accurate position conversion and ensure the real-time responsiveness and fluency of the system. Data output: the second locator 400 outputs the position information in virtual space to the virtual reality system. A sketch of such a coordinate transformation is given below.
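As a concrete illustration of the kind of physical-to-virtual conversion described above, the sketch below applies a rigid-body transform with rotation R, translation t and uniform scale s. The assumption that the calibration reduces to these three quantities is ours; the patent leaves the algorithm unspecified.

```python
# Assumed similarity transform: virtual = s * R @ physical + t.
import numpy as np


def physical_to_virtual(p_phys: np.ndarray, R: np.ndarray,
                        t: np.ndarray, s: float) -> np.ndarray:
    """Map a 3D point from the real shooting space into virtual space."""
    return s * (R @ p_phys) + t


# Example with an identity calibration (virtual frame equals real frame):
R = np.eye(3)
t = np.zeros(3)
p_virtual = physical_to_virtual(np.array([1.0, 2.0, 0.5]), R, t, s=1.0)
```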
In this way, the AI module ensures the stable transmission and correctness of the data of the second locator 400 and the second camera 300. Second, the virtual sensor network is integrated with the virtual reality system: its data is transmitted to the virtual reality system for processing and analysis. In the virtual reality system, comparing and analyzing the sensor network data against the data in the virtual environment makes it possible to acquire and respond to change information of the projection points in the real-world array, keeping the form of the real joint array in the real shooting space consistent with the form of the virtual joint array in the virtual reality system.
Step S104: obtain the first mapping information of the virtual joint array in the virtual reality system. A specific way of obtaining it is as follows: at least one virtual camera and one virtual locator are arranged in the virtual reality system, and the virtual environment is imported as shown in fig. 2 to obtain the spatial matching information of the virtual environment, the characters and the virtual joint array. The first mapping information includes at least the brightness, color, saturation, color temperature, color rendering and intensity of the projected light spots, as well as the position, direction, range and distance of the projection light. One possible container for this information is sketched below.
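For illustration, the first (and second) mapping information could be carried per projected spot in a record like the following; the field list follows the patent's enumeration, while the class itself, its types and its units are assumptions.

```python
# Hypothetical container for per-spot mapping information; the fields mirror
# the patent's list, the types and units are assumed for the example.
from dataclasses import dataclass


@dataclass
class SpotMappingInfo:
    position: tuple[float, float, float]    # projected-spot position
    direction: tuple[float, float, float]   # projection light direction
    light_range: float                      # projection light range
    distance: float                         # projection light distance
    brightness: float
    color: tuple[int, int, int]
    saturation: float
    color_temperature: float                # in kelvin (assumed unit)
    color_rendering: float                  # CRI-like index (assumed)
    intensity: float
```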
Specifically, importing the virtual environment: the virtual environment may be brought into the virtual reality system in several ways. A model can be created by the AI module and then imported; alternatively, photographs or videos may be used to create the virtual environment. The import is shown in fig. 5. The AI module of the virtual reality system automatically adds a fourth joint array 4 and feeds the virtual environment information into it. The imported virtual environment must be processed by the specific component unit of the working components. The specific component unit analyzes and measures the spectrum of the light projected by the controllably programmable pixel beads in the joint array. Combined with the control system and algorithms in the multi-connected controller, it can automatically correct and optimize the projection light. For example, when the spectral distribution of the projection light changes, the specific component unit detects and analyzes the change and feeds the result back to the control system and algorithms in the multi-connected controller, which automatically adjust parameters such as the brightness and color of the controllably programmable pixel beads of the fourth joint array 4, realizing self-correcting fitting of the projection light. The virtual environment is the virtual background required for shooting: the specific component unit automatically recognizes it, extracts the virtual environment information and passes it into the fourth joint array 4; the AI module fully copies the pixel information of the virtual environment into the fourth joint array 4, which is itself a set of controllably programmable pixel beads. A sketch of this feedback loop follows.
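The feedback described here (detect a drift, then push a correction back to the bead parameters) can be pictured as a simple proportional loop. The measurement callback, the tolerance and the gain below are all assumptions made for the sketch.

```python
# Assumed proportional correction loop; measure_brightness stands in for the
# specific component unit's spectral measurement, and the 5% tolerance and
# 0.5 gain are illustrative choices.
def self_correct(bead, target_brightness: float, measure_brightness,
                 tolerance: float = 0.05) -> None:
    measured = measure_brightness(bead)
    error = target_brightness - measured
    if abs(error) > tolerance * target_brightness:
        # nudge toward the target, clamped to the valid range
        bead.brightness = max(0.0, min(1.0, bead.brightness + 0.5 * error))
```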
Step S105: perform parameter adjustment and color fusion of the projection light on the virtual joint array in the virtual reality system according to the working components of the virtual reality system and the first mapping information, and obtain the second mapping information of the virtual reality system after the color fusion. The working components of the virtual reality system comprise at least one virtual camera, a virtual locator, a specific component unit, an abnormal mapping processing unit, a calibration fitting mapping unit (i.e., a virtual projection device), an AI illumination analysis unit (i.e., a virtual illumination device), an AI mapping analysis unit, an overall color fusion unit (i.e., a virtual projection environment), a partial color mapping unit, a virtual display device, a multi-connected controller device, and the like.
After the working components in the virtual reality system receive the first mapping information, and when the user changes the parameters of the projection light of the controllably programmable pixel beads in the virtual joint array according to their own needs, color fusion and automatic adjustment are performed on the projection light of the virtual environment and of the controllably programmable pixel beads in the virtual joint array according to that information. This process involves adjusting the color of the projected spot, or voxel, of each controllably programmable pixel bead of the virtual fourth joint array 4 of fig. 5 so that the projected spots match in color.
In this process, the colors and shadows between the virtual environment and the light projected by the controllably programmable pixel beads of the real characters, objects and virtual joint array must be matched and coordinated by the fusion unit of the working components. The color fusion performed by controlling each pixel bead of the joint array covers the mixed projection light direction, range, distance and color. The second mapping information includes at least the brightness, color, saturation, color temperature, color rendering and intensity of the projected light spots, the direction, range and distance of the projection light, and the position of the projected light spot, together with other optical properties, enabling rapid switching and mixing between colors. The unit can further use this information to fuse and adjust the overall color and create a richer visual effect.
According to a specific embodiment, the application realizes multi-color repositioning and fusion through the fusion unit. The fusion unit controls the brightness, color, saturation, color temperature, color rendering and intensity of the light projected by each controllably programmable pixel bead, together with its direction, range, distance, position and other optical properties, so as to realize rapid switching and mixing of multiple colors. The fusion unit uses this information to fuse and automatically adjust the overall or partial colors and create rich, colorful visual effects. A simple blend of this kind is sketched below.
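One plausible reading of per-bead color fusion is a weighted blend between the virtual-environment color and the bead's own projected color. The patent fixes no blend formula, so the linear mix below is purely illustrative.

```python
# Assumed linear color blend; weight selects how strongly the environment
# color dominates the bead's own projected color.
def fuse_colors(env_rgb: tuple[int, int, int],
                bead_rgb: tuple[int, int, int],
                weight: float = 0.5) -> tuple[int, ...]:
    """Blend environment and projected-light colors; weight in [0, 1]."""
    return tuple(int(round(weight * e + (1.0 - weight) * b))
                 for e, b in zip(env_rgb, bead_rgb))


# Example: environment-leaning blend of a warm background over a cool bead.
fused = fuse_colors((200, 180, 120), (90, 140, 255), weight=0.6)
```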
Step S106: reposition and parameterize the real joint array of the real shooting space according to the second mapping information and the virtual joint array, so that the colors of the real shooting space and the virtual reality system are consistent.
According to a specific embodiment, this repositioning and parameterizing includes: the multi-connected controller and the joint array controller determine the color information of the virtual reality system according to the second mapping information and the virtual joint array; the multi-connected controller is the working component of the virtual reality system that emits, in real time, the color information determined from the second mapping information and the virtual joint array. The color information is mapped onto the projection light parameters of the real shooting space; the projection light parameters are repositioned according to the color information of the virtual reality system; and the update is synchronized to the real camera so as to stay synchronized with the camera and locator of the real shooting space. The repositioning performed by the joint array controller includes calibrating, fitting, adapting and matching the actual projection light, so that the real joint array setup is consistent with the four virtual joint arrays and real-world photographers can shoot without errors or wasted resources.
In summary, the color information of the virtual reality system is determined according to the second mapping information and the virtual joint array; the projection light parameters of the real shooting space are measured; and the projection light parameters are repositioned according to the color information of the virtual reality system, where repositioning includes calibrating, fitting, adapting and matching the projection light. The real joint array and the virtual joint array both consist of projected light spots. A minimal repositioning pass is sketched below.
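Tying the pieces together, a repositioning pass could simply push the fused per-spot parameters from the second mapping information back onto the real array through the controller. The sketch reuses the hypothetical JointArrayController and SpotMappingInfo from the earlier sketches.

```python
# Illustrative repositioning pass: write the virtually fused color and
# brightness of each spot back to the corresponding real bead.
def reposition_real_array(controller: "JointArrayController", array_idx: int,
                          second_mapping: list["SpotMappingInfo"]) -> None:
    for bead_idx, info in enumerate(second_mapping):
        controller.apply_signal(array_idx, bead_idx,
                                color=info.color,
                                brightness=info.brightness)
```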
Fig. 3 is a schematic diagram of a specific embodiment of the method for projection-light self-correcting fitting and multi-color repositioning according to the present invention. As shown in fig. 3, a real camera and a real locator are first arranged in the real shooting space, and the imaging space is shot to obtain the input information (comprising projection origin information, imaging plane information, visual information and projection light information) and the positioning information, which are synchronized to the virtual camera and virtual locator. The projection origin information includes the projection light direction, range, distance and the position of the projection light of a given color. The imaging plane information is spatial plane information. The visual information is the captured shooting information of the imaging space. The projection light information includes brightness, color, saturation, color temperature, color rendering, intensity and the like.
A plurality of real cameras and real locators may be provided in the real shooting space, and a plurality of virtual cameras and virtual locators may be provided in the virtual reality system. As shown in fig. 4, the first camera 100 and the first locator 200 are arranged in the real shooting space, while the second camera 300 and the second locator 400 are provided in the virtual reality system. The first camera 100 captures images of the real world with accurate color, and the first locator 200 performs spatial positioning. The first camera 100 shoots the green-screen space and acquires an image of the imaging space, which is used to separate background and foreground in post-production. When the image of the imaging space is acquired for creating a virtual joint array in the virtual reality system, the real-world scene of the real joint array can be accurately simulated.
The functions of the first camera 100 include: capturing images of the real world, in particular the reflected light of the projection light striking objects, and generating image data; acquiring visual information about the distribution, shape, color, etc. of the projection light in the real world for the subsequent projection-light self-correcting fitting and multi-color repositioning; and monitoring the actual effect of the projection light. By analyzing the captured images, it is possible to evaluate whether the desired effect is achieved in terms of the brightness, color, saturation, color temperature, color rendering, intensity, uniformity and so on of the projection light.
The first locator 200 locates the content captured by the first camera 100, such as real-world people, scenes and objects. Specifically, it determines the spatial position of the projection light in the real world, accurately measures the coordinates and directions of the projection light in three-dimensional space, and provides position information for the subsequent self-correcting fitting of the projection light of the pixel beads controlled by the joint array. The first locator 200 also performs spatial mapping: by measuring the position of the projection light in the real world, it maps that position information into the virtual environment, establishing the correspondence between the virtual and real environments. The position information of the first locator 200 is fed back to the virtual reality system and its algorithms to realize the self-correcting fitting of the projection light. For example, if the distribution of the projection light is found to deviate from expectation, the virtual reality system can adjust parameters such as the brightness and color of the pixel beads according to the position information provided by the first locator 200, achieving self-correction and fitting. In the real world, the first camera 100 and the first locator 200 thus capture images, acquire visual information and determine spatial positions, respectively. By capturing and analyzing the distribution and position of the projection light in the real world, they provide important data support for the subsequent self-correcting fitting and multi-color repositioning; the same information can also be used to monitor the actual effect of the projection light and ensure it reaches the intended performance.
Specifically, the first camera 100 shoots the real-world green-screen scene and acquires the basic shooting information (including image data, visual information and projection effect information). The first camera 100 shoots the people or objects in the imaging space to obtain a green-screen image; the captured image of the imaging space is processed by the working components of the virtual reality system, suitable parts of the image are selected according to the application scenario and requirements, and the first processing module in the virtual reality system selects the input information from the basic information. The positioning information (including spatial position information, spatial mapping information, the projection origin information of the controllably programmable pixel beads and the position information of their projection light) is obtained from the real space by the first locator 200, which locates the content captured by the first camera 100 such as real-world people, scenes, objects and the joint array: it determines the real-world space and position information of the projection light of the controllably programmable pixel beads, determines the position of the projection light in the real world, and determines the spatial mapping information of the projection light to obtain the input information. The input information and positioning information obtained by the first camera 100 and the first locator 200 are synchronized into the second camera 300 and the second locator 400 in the virtual three-dimensional space of the virtual reality system. The functions of the second camera 300 and the second locator 400 are substantially identical to those of the first camera 100 and the first locator 200, except that the first pair operates on the real world while the second pair operates on the virtual world. The input information and positioning information acquired by the first camera 100 and the first locator 200 are then synchronized to the second camera 300 and the second locator 400 in real time; this may be implemented over a data transmission line or a wireless connection, for example as sketched below.
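The real-time link from the first camera/locator pair to the second could carry simple timestamped records; the JSON-over-socket framing below is an assumption, since the patent only says the link may be wired or wireless.

```python
# Assumed sync-packet format: timestamped camera pose plus locator points,
# serialized as JSON for transport over TCP, UDP or a serial line.
import json
import time


def make_sync_packet(camera_pose: list[float],
                     locator_positions: list[list[float]]) -> bytes:
    record = {
        "timestamp": time.time(),
        "camera_pose": camera_pose,        # e.g. [x, y, z, yaw, pitch, roll]
        "locators": locator_positions,     # list of [x, y, z] points
    }
    return json.dumps(record).encode("utf-8")
```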
Multiple joint arrays are arranged in the real world according to user requirements, and the form of the selected joint arrays is set in the virtual reality system and the virtual environment of the virtual three-dimensional space, so that the joint arrays arranged in the three-dimensional virtual environment are consistent with the real-world joint arrays. Fig. 4 illustrates the interaction between the virtual three-dimensional space in the virtual reality system and the real-world space. A joint array is formed by freely arranging n controllably programmable pixel beads, and the projection light of those beads may be a point light source or parallel light. The projection light is directional, which allows it to simulate a variety of lighting effects such as side lights and top lights; it may also have different colors and intensities, so that different lighting effects can be simulated by adjusting these properties. The irradiation range, position and distance of the projection light can also be adjusted: for example, it can be set to illuminate only a specific object or area, or used for omnidirectional illumination. Finally, the projection light can have different illumination angles and projection modes, enabling it to simulate a variety of light environments and effects in different application scenarios. In the invention, the controllably programmable pixel beads and the projection light are combined to obtain pixel beads that have the properties and attributes of projection light; that is, the controllable pixel beads in the present application project light.
In the invention, the virtual reality system is an important application scenario. By feeding in real-world information, the effect to be achieved in the real world is simulated in the virtual reality system, so that the user obtains a highly realistic three-dimensional virtual environment containing elements such as scenes, objects, people and joint arrays. The real environment likewise contains elements such as scenes (the real-world environment is mapped into the real joint array by the virtual joint array of this application; that is, the environment does not pre-exist in the real world but is formed by the controllably programmable pixel beads), objects, people and joint arrays. The virtual environment is adjusted until it reaches the effect the client requires and the requirement information is obtained. The virtual reality system is provided with a multi-connected controller working component, so that the requirement information of the adjusted virtual-environment effect is converted for the joint array and transmitted and output through the multi-connected controller to the joint array in the real world. The components in the virtual reality system include a virtual interaction device, the second camera 300, the second locator 400, a specific component unit, an abnormal mapping processing unit, a calibration fitting mapping unit (virtual projection device), an AI illumination analysis unit (virtual illumination device), an AI mapping analysis unit, an overall color fusion unit (virtual projection environment), a partial color mapping unit, a virtual display device, a multi-connected controller device, and the like. The virtual interaction device interacts with the whole virtual world whenever elements such as scenes, objects, people and joint arrays in the virtual world change; the virtual display device directly presents the interaction result, i.e., the interactive visual effect of the virtual reality system, and the display effect of the real-world pixel-bead projection lights is automatically controlled by the AI. In the invention, the virtual reality system can process the virtual joint array in many ways.
Multiple sets of joint arrays, at least one set, are deployed in the real world. A joint array can take n forms and be arranged into any shape according to the user's requirements; the required joint array is arranged in the imaging space of the real world. The input information obtained by the first camera 100 and the positioning information obtained by the first locator 200 are synchronized to the second camera 300 and the second locator 400, respectively, which are provided in the virtual world of the virtual reality system, and the input and positioning information are transmitted to the virtual reality system. The virtual joint array is mapped into the virtual three-dimensional space of the virtual reality system by the second locator 400 and the positioning information: with the positioning information fed into the second locator 400, the virtual joint array is arranged in the virtual three-dimensional space using the synchronized positioning information. The abnormal mapping processing unit of the working components of the virtual reality system performs self-correcting fitting of anomalous projection-light data, and accurate measurement and correction of the projection light can be performed through this unit. In a complex real environment, the direction and intensity of the projection light may be affected by many factors; self-correcting fitting techniques can eliminate these effects and yield more accurate measurements.
In the virtual reality system, the second camera 300 receives the input information transmitted from the first camera 100 and generates an image of the virtual three-dimensional space based on it. This image derives from the visual information of the real world captured by the first camera 100 and therefore accurately reflects the selected real-world scene. The rendering-information synchronization system synchronizes and adjusts parameters of the first camera 100 such as position, orientation and phase-misalignment correction.
The second locator 400 receives the positioning information transmitted from the first locator 200, uses it as input, and determines its position in the virtual three-dimensional space from it. This location information is used to map the virtual joint array into the virtual three-dimensional space.
A virtual joint array is a technique for creating and manipulating complex data arrays in a virtual environment. For example, a joint array may contain a series of virtual objects or environments organized and arranged in some manner in virtual space.
Specifically, a system is created that can receive and parse the virtual projection light information of the virtual joint array, and a virtual sensor network is provided: a system in which at least one second camera 300 can receive and analyze the virtual projection light information of the real joint array, in order to measure the intensity and direction of the mapped projection light at different times and locations. A network of multiple virtual sensors is designed and deployed. First, each virtual sensor can measure the brightness, color, saturation, color temperature, color rendering and intensity of the projection light, as well as its direction, range and distance; this is typically accomplished by the adaptive optics unit of the working components shown in fig. 2. The virtual sensor network must be able to monitor changes of the virtual projection light continuously or periodically over time and location. Meanwhile, by comparing and analyzing the virtual sensor network data against the data in the virtual environment, the acquisition of, and response to, change information of the projection points in the real-world array can be realized.
In the present invention, the abnormal mapping processing unit handles abnormal mappings. For various reasons, a non-linear mapping exists between the projected image and the target image; such a mapping may cause the projection light to deviate from, or distort, the target image in brightness, color, saturation, color temperature, color rendering, intensity, and in the direction, range, distance and position of the projection light. By analyzing and modeling this non-linear mapping relation, the abnormal mapping processing unit achieves accurate control and adjustment of the projection light. For example, image processing algorithms may apply geometric correction, color correction and brightness correction to the projected image to accurately restore and present the target image. In addition, AI algorithms can learn and predict the abnormal mapping relation, enabling accurate prediction and control of future projected images. The abnormal mapping processing unit is thus an important component of the projection-light self-correcting fitting and multi-color repositioning technique; it improves the quality and accuracy of the projection effect and ensures the expressiveness and consistency of the visual result. A minimal correction chain is sketched below.
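A minimal version of the correction chain named above (brightness correction followed by color correction) can be written as a gamma step plus a 3x3 color matrix. Both calibration values below are dummies; real ones would come from the unit's analysis of the non-linear mapping.

```python
# Assumed two-stage correction: gamma for brightness, 3x3 matrix for color.
import numpy as np


def correct_projection(rgb: np.ndarray, gamma: float,
                       color_matrix: np.ndarray) -> np.ndarray:
    """rgb: float array in [0, 1]; returns corrected values in [0, 1]."""
    linear = np.clip(rgb, 0.0, 1.0) ** gamma   # brightness (gamma) correction
    corrected = color_matrix @ linear          # color correction
    return np.clip(corrected, 0.0, 1.0)


# Identity matrix as a placeholder calibration:
out = correct_projection(np.array([0.8, 0.4, 0.2]), gamma=2.2,
                         color_matrix=np.eye(3))
```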
Accurate projection light information can be obtained from the input information and positioning information. This information is then fed into the second locator 400, which uses it to determine its position in the virtual three-dimensional space. That position information in turn drives the creation and placement of the corresponding virtual objects or environments in the virtual joint array. In this way, the real-world scene is accurately simulated and rendered in the virtual three-dimensional space, enabling the virtual reality system to reflect and map real-world conditions in real time and enhancing its sense of realism.
According to the AI working components of the virtual reality system, the virtual joint array is set up, the virtual environment is imported, processed and adjusted to obtain a component array, and the parameters of the projected light spots in the component array are changed as required to obtain the first mapping information.
Specifically, the virtual environment may be imported into the virtual reality system in several ways: a model can be created by the AI module and then imported, or photographs or videos may be used to create the virtual environment and import it into the virtual fourth joint array 4. As shown in fig. 5, the virtual environment is imported into the virtual reality system. The first joint array 1, the second joint array 2, the third joint array 3 and the fourth joint array 4 are all arrays after parameter adjustment. The AI module of the virtual reality system automatically adds the fourth joint array 4. The imported virtual environment is the virtual background required for shooting: the AI module automatically recognizes the virtual background, extracts the virtual environment information and passes it into the fourth joint array 4, fully replicating the virtual environment there. The imported virtual environment must also be processed by the specific component unit of the working components, which analyzes and measures the spectrum of the light projected by the controllably programmable pixel beads in the joint array. Combined with the control system and algorithms in the multi-connected controller, the specific component unit can automatically correct and optimize the projection light: when the spectral distribution of the projection light changes, it detects and analyzes the change and feeds the result back to the control system and algorithms in the multi-connected controller, which automatically adjust parameters such as the brightness and color of the controllably programmable pixel beads of the fourth joint array 4, realizing self-correcting fitting of the projection light. The fourth joint array 4 is a set of controllably programmable pixel beads.
The AI components of the virtual reality system adjust the parameters of the virtual joint array according to preset algorithms and programs. This includes determining the shape, size and arrangement of the virtual joint array, and, for each projected light spot in it, the brightness, color, saturation, color temperature, color rendering and intensity, as well as the position, direction, range and distance of the projection light. In the virtual reality system, the user can change the parameters of the projected light spots in the component array as required: for example, through the adaptive optics unit and the multi-color repositioning unit of the working components, the user can change the brightness, color, saturation, color temperature, color rendering, intensity and the position, direction, range and distance of part of the projection light, or of the projection light as a whole, to suit a particular virtual scene or effect. The AI module adjusts automatically when the virtual joint array parameters change, by analyzing the changed input information and positioning information. By processing and modulating the projected spots of the virtual joint array, the specific component unit generates a component array, which may be a modulated array comprising a plurality of projected spots.
The AI illumination analysis unit of the working components analyzes the illumination conditions of the virtual environment and the virtual array and determines the optimal lighting setting according to preset algorithms and programs. For example, it may take into account light reflection, shadows and local lighting, as well as the range, direction, distance and position of the projection light, in order to achieve a more realistic lighting simulation of the virtual three-dimensional space, the virtual joint array and the characters. In the invention, this unit is mainly responsible for analyzing the illumination condition of the projection light: it analyzes parameters such as brightness, color, saturation, color temperature, color rendering, intensity and the position, direction, range and distance of the projection light to determine its illumination condition. The AI illumination analysis unit can further use this information to analyze and correct the projection light and improve the quality and accuracy of the projection effect. A toy analysis pass is sketched below.
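As a toy stand-in for such an analysis, the function below reduces a set of per-spot measurements to a few aggregate statistics a downstream unit might consume; the chosen statistics are an assumption, reusing the hypothetical SpotMappingInfo record from the earlier sketch.

```python
# Illustrative illumination summary over per-spot measurements.
def analyze_illumination(measurements: list["SpotMappingInfo"]) -> dict:
    n = max(len(measurements), 1)
    return {
        "mean_brightness": sum(m.brightness for m in measurements) / n,
        "mean_color_temp": sum(m.color_temperature for m in measurements) / n,
        "max_intensity": max((m.intensity for m in measurements), default=0.0),
    }
```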
The adaptive optics unit is responsible for processing and adjusting the projected light of the virtual joint array. When the parameters of the projected light spots of the joint array are changed, the unit can adjust the shape, size, focus and other attributes of the light according to the analysis result of the AI illumination analysis unit and other factors, realizing a more accurate and clear light projection effect. In the present invention, the adaptive optics unit refers to a wavefront-coding technique that can change the shape and optical characteristics of the light, thereby improving the imaging quality of the virtual camera optics. In the virtual reality system, the position information of the person imaged by the real camera is linked to the virtual camera. The unit remains stable in space and time, thereby improving the display effect; the adaptive optics unit can thus be used to improve effects such as definition, color reproduction and overall visual quality.
Setting the virtual joint array in the virtual reality system, importing the virtual environment, processing and adjusting the projected light through the specific component unit, and processing by the AI illumination analysis unit and the adaptive optics unit together accomplish the complex task of creating and adjusting the projected light in the virtual environment, from which the first mapping information is obtained.
After the working assembly in the virtual reality system receives the first mapping information, and when a user changes the parameters of the projected light of the controllable programmable pixel beads in the virtual joint array according to his own requirements, the working assembly performs color fusion and automatic adjustment of the projected light of the controllable programmable pixel beads in the virtual environment and the virtual joint array according to this information. This process involves adjusting the color of the projected spot or voxel of each controllable programmable pixel bead of the virtual joint array of fig. 5 so that the colors of the projected spots match.
According to the first mapping information, the working assembly in the virtual reality system performs overall color fusion on the virtual three-dimensional space and virtual environment to obtain the second mapping information. In this process, the colors and shadows between the virtual environment and the light projected by the controllable programmable pixel beads of the real characters, objects and virtual joint array must be matched and coordinated by the fusion unit of the working assembly. Color fusion of the projected light, achieved by controlling each pixel bead of the joint array, covers the mixed projected light direction, range, distance and color. The second mapping information includes at least the intensity, brightness, color, saturation, color temperature, color rendering, direction, range, distance and position of the projected light spots, together with other optical properties, so as to achieve fast switching and mixing between colors. The unit can further use this information to fuse and adjust the overall color, creating a more colorful visual effect.
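A minimal sketch of this whole-scene fusion step follows, under the assumption that fusion can be approximated by a linear blend between each spot's color and the virtual environment color behind it; the patent leaves the blend unspecified, and the function names are illustrative.

```python
# Derive second mapping information from the first by fusing every
# projected spot's color with its virtual-environment backdrop.
def fuse_colors(spot_color, environment_color, weight=0.5):
    """Blend a projected spot's color toward the surrounding environment."""
    return tuple((1 - weight) * s + weight * e
                 for s, e in zip(spot_color, environment_color))

def build_second_mapping(first_mapping):
    """Each entry carries the spot color and its backdrop color."""
    second = []
    for entry in first_mapping:
        fused = fuse_colors(entry["spot_color"], entry["env_color"])
        second.append({**entry, "spot_color": fused})
    return second

first_mapping = [{"spot_color": (1.0, 0.2, 0.2), "env_color": (0.2, 0.2, 0.8)}]
print(build_second_mapping(first_mapping))  # fused spot_color (0.6, 0.2, 0.5)
```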
According to a specific embodiment, the application realizes multi-color repositioning and fusion through the fusion unit. The fusion unit controls the intensity, brightness, color, saturation, color temperature, color rendering, direction, range, distance, position and other optical properties of the projected light of each controllable programmable pixel bead, thereby realizing rapid switching and mixing of multiple colors. The fusion unit uses this information to fuse and automatically adjust the overall or partial colors so as to create rich and colorful visual effects.
Color fusion involves adjusting the overall or partial colors of the real characters and objects, based on the virtual environment and the imaging information, so that they appear more coordinated and realistic in the virtual three-dimensional space. This may involve adjusting the color of each pixel's projected light or voxel to match parametric attributes such as the color of the projected spot or other colors in the environment. During color fusion, the colors and shadows in the virtual environment are automatically updated by the AI module to reflect the new projected light points and other objects. Finally, the virtual reality system renders the virtual environment, the real persons and objects, and the virtual joint array together to generate the final image. This involves multiple levels of rendering: rendering the virtual environment first, then the real characters and objects, and finally fusing them together.
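The multi-level rendering order can be illustrated with classic "over" compositing; the layer order and the per-pixel alpha values below are assumptions of this sketch, not the patent's renderer.

```python
# Layered rendering, per pixel: start from the virtual environment,
# composite the real character/object layer, then the joint-array layer.
def over(front_rgba, back_rgb):
    """Composite one RGBA pixel over an opaque RGB background pixel."""
    r, g, b, a = front_rgba
    return tuple(a * f + (1 - a) * bk for f, bk in zip((r, g, b), back_rgb))

def render_final_pixel(env_rgb, character_rgba, joint_array_rgba):
    out = over(character_rgba, env_rgb)      # real characters and objects
    out = over(joint_array_rgba, out)        # joint-array projected light last
    return out

print(render_final_pixel((0.1, 0.1, 0.3),
                         (0.8, 0.6, 0.5, 0.9),
                         (1.0, 0.4, 0.2, 0.25)))
```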
The virtual mapping information describes the colors and locations, in the virtual three-dimensional space, of the fused virtual environment, the real characters and objects, the virtual joint array, and the changed projected light spots of the joint array; this information can be used to generate the final virtual reality experience. The virtual environment is imported into the virtual fourth joint array 4 so that the fourth joint array 4 displays an effect consistent with the virtual environment image.
Color fusion may involve techniques such as color mixing and projected-light mapping blending. Color mixing generates a new color by superimposing two or more colors. In the virtual reality system, color mixing may be used to fuse the colors between the fourth joint array 4 generated by importing the virtual environment and the real persons and objects, making them appear more coordinated. In order to achieve more realistic projected light and color effects, the virtual reality system requires the following. Physics engine: a physics engine is a technology that simulates physical phenomena such as gravity, collisions, illumination and projection effects; in the virtual reality system, it may be used to simulate real-world projected illumination and reflection, making objects in the virtual environment appear more realistic. Global illumination: global illumination is a technique that simulates the behavior of rays in a complex scene, taking into account the interaction and reflection between rays; in the virtual reality system, it may be used to simulate real-world projected illumination effects, again making objects in the virtual environment appear more realistic. The second camera 300 captures the rendered final image and transmits it to the multi-connected controller of the working assembly.
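As a sketch of the color mixing step, a simple clamped additive model is shown below; this is an illustrative choice, since real fixtures may mix non-linearly.

```python
# Additive color mixing: superimpose two or more projected colors to
# obtain a new one, clamping each channel to 1.0.
def mix_additive(*colors):
    """Superimpose any number of (r, g, b) colors additively."""
    return tuple(min(1.0, sum(channel)) for channel in zip(*colors))

# Red light superimposed on blue light reads as magenta.
print(mix_additive((1.0, 0.0, 0.0), (0.0, 0.0, 1.0)))  # (1.0, 0.0, 1.0)
```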
According to a specific embodiment, repositioning and parameter setting of the real joint array of the real shooting space according to the second mapping information and the virtual joint array includes: determining the color information of the virtual reality system from the second mapping information and the virtual joint array, the multi-connected controller being the working component of the virtual reality system that sends out this color information data in real time; mapping the projected light parameters to the real shooting space; repositioning the projected light parameters according to the color information of the virtual reality system; and updating and synchronizing the color information to the real camera, for synchronization with the camera and locator of the real shooting space. Repositioning by the joint array controller includes calibrating, fitting, adapting and matching the actual projected light, so that the real joint array setup is consistent with the virtual joint array and real-world photographers can shoot without errors or wasted resources.
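A minimal sketch of this repositioning step follows, assuming the second mapping information is keyed by spot identifiers shared between the virtual and real arrays; the identifiers and the dictionary layout are assumptions of this illustration.

```python
# Make each real projected spot match its virtual counterpart's
# parameters, then the result would be pushed to the real camera/locator.
def reposition_real_array(second_mapping, real_array):
    """Copy each virtual spot's parameter set onto its real counterpart."""
    for spot_id, params in second_mapping.items():
        real_array[spot_id].update(params)   # color, brightness, range...
    return real_array

second_mapping = {"spot_0": {"color": (0.9, 0.4, 0.1), "brightness": 0.7}}
real_array = {"spot_0": {"color": (1.0, 1.0, 1.0), "brightness": 1.0}}
print(reposition_real_array(second_mapping, real_array))
```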
The second mapping information is calibrated and fitted by the calibration fitting and mapping unit of the working assembly. In the present invention, the mapping display unit refers to a unit for realizing the mapping relationship between the projected light and the display device. Specifically, the mapping display unit can map and convert the intensity, brightness, color, saturation, color temperature, color rendering, direction, range, distance, position and other optical properties of the projected light. The mapping relation can be preset or dynamically adjusted according to actual requirements and scenes. Through the mapping display unit, parameters such as color and brightness between the projected light and the display device can be optimally matched and presented. The mapping display unit generally includes the following. Mapping relation setting: the mapping relation between the projected light and the display device is set according to actual requirements and scenes, including adjustment of parameters such as color matching, brightness matching and contrast matching. Real-time monitoring and adjustment: the mapping relation is adjusted and optimized by monitoring, in real time, the intensity, brightness, color, saturation, color temperature, color rendering, direction, range, distance, position and other optical attribute parameters of the projected light and the display device. Real-time monitoring and adjustment ensures that parameters such as color and brightness between the projected light and the display device achieve the best matching and presentation effect.
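The real-time monitoring and adjustment function can be sketched as a simple drift check against the mapping's target values; the parameter names and tolerance below are assumptions of this illustration.

```python
# Illustrative monitoring step for the mapping display unit: compare the
# measured display parameters with the mapping's targets and return a
# correction for anything that drifted beyond a tolerance.
def monitor_step(measured, target, tolerance=0.02):
    """Return corrections for every parameter outside the tolerance band."""
    corrections = {}
    for name, target_value in target.items():
        drift = measured.get(name, 0.0) - target_value
        if abs(drift) > tolerance:
            corrections[name] = -drift   # move back toward the target
    return corrections

measured = {"brightness": 0.74, "contrast": 0.51}
target   = {"brightness": 0.80, "contrast": 0.50}
print(monitor_step(measured, target))    # ≈ {'brightness': 0.06}
```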
According to different application scenes and requirements, the mapping display unit can switch among multiple modes. For example, different mapping relations and parameter settings may be used in day and night modes to accommodate different light conditions and application requirements. In summary, the mapping display unit is an important component in the method and apparatus of the projected light self-correction fitting and multi-color repositioning technique; it realizes the optimal mapping and matching between the projected light and the display device and ensures the consistency and optimal presentation of the visual effect.

This step mainly calibrates and fits the joint array of the real world to the joint array of the virtual space. Specifically, the coordinate systems of the two arrays must first be determined; correspondence between the real world and the virtual space is then achieved by finding the mapping relationship between them. This process may involve calibration and fitting of attributes such as position, angle and velocity for each projected spot.

After the calibration fitting is completed, multi-color repositioning is required. This step repositions and adjusts the color of each projected spot in the real-world joint array based on the second mapping information: the colors of real-world projected spots are matched and adjusted against the color information in the virtual space, so as to achieve color consistency between the real world and the virtual space.

In this step, the AI module further analyzes and processes the real-world joint array. Based on the second mapping information and the data in the virtual space, the AI module can intelligently analyze and optimize the real-world joint array; specifically, it can automatically adjust and control the projected spots of the joint array by machine learning, deep learning and similar methods, achieving more accurate projected illumination and color effects.

Partial color mapping processing unit: after the above steps are completed, the real-world joint array is already substantially identical to the joint array of the virtual space. In some cases, however, certain specific colors or hues may require separate processing and mapping. This step adjusts and processes specific colors or hues in the real-world joint array according to particular requirements and scenes, to achieve more realistic and richer color effects.
The calibration fitting unit of the present invention refers to a unit for calibrating and fitting the projected light spots of the real world. Specifically, the calibration fitting unit determines the characteristics and distribution of the projected light by analyzing and measuring parameters such as its spectrum, intensity, brightness, color, saturation, color temperature, color rendering, direction, range, distance and position. According to preset standards and requirements, it then calibrates and fits the projected light using a calibration algorithm and a fitting model, so as to achieve accurate control and adjustment of the projected light. The calibration fitting unit typically includes the following functions. Calibration function: the differences and deviations from standard values are determined by measuring and analyzing the above parameters of the projected light; the projected light is then calibrated and corrected using a calibration algorithm to achieve precise control. Fitting function: the distribution and variation law of the projected light are obtained by sampling and analyzing the same parameters; the projected light is then fitted and modeled using the fitting model to achieve an accurate description and control of the projected light. Adaptive adjustment function: according to the sampled data and the calibration and fitting results, the intensity, brightness, color, saturation, color temperature, color rendering, direction, range, distance and position of the pixel beads' projected light are adaptively adjusted, realizing real-time control and adjustment of the projected light. The calibration fitting unit is an important component of the method and apparatus of the projected light self-correction fitting and multi-color repositioning technique; it improves the quality and accuracy of the projection effect and ensures the expressiveness and consistency of the visual effect.
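The fitting function can be illustrated with a least-squares fit of a bead's drive-to-output response. A linear model is assumed here for brevity (real beads may need a gamma or polynomial fit), and all names are illustrative.

```python
# Calibrate a bead's response: fit output = a * drive + b over measured
# samples, then invert the model to find the drive for a desired output.
def fit_linear(samples):
    """Least-squares fit over (drive, output) pairs."""
    n = len(samples)
    sx = sum(d for d, _ in samples)
    sy = sum(o for _, o in samples)
    sxx = sum(d * d for d, _ in samples)
    sxy = sum(d * o for d, o in samples)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

def drive_for_output(target_output, a, b):
    """Invert the fitted model: which drive level yields the target output?"""
    return (target_output - b) / a

samples = [(0.2, 21.0), (0.5, 49.5), (0.8, 81.0)]   # (drive, measured lux)
a, b = fit_linear(samples)
print(drive_for_output(60.0, a, b))                 # drive ≈ 0.6
```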
In the invention, the multi-color repositioning unit turns controllable programmable pixel beads of different projected colors on or off as required, so as to reposition the colors. This increases the displayable color range and the dynamic color range, thereby improving the visual effect. In the method and apparatus of the projected light self-correction fitting and multi-color repositioning technique, the multi-color repositioning unit is the element used to reposition the colors of the projected light in the joint array of controllable programmable pixel beads. Specifically, the multi-color repositioning unit achieves rapid switching and mixing between multiple colors by controlling the brightness, color and other optical properties of each pixel bead. Such color repositioning can be applied in a variety of situations, such as stage performances, large-scale events, commercial presentations and film shoots, to create a more colorful visual effect. In the prior art, multi-color repositioning units have typically been implemented using high-precision, high-speed control systems and algorithms. In short, the multi-color repositioning unit is an important component of the method and apparatus of the projected light self-correction fitting and multi-color repositioning technique; it improves the quality and expressiveness of the visual effect and creates a more striking visual experience for shooting on various occasions.
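The on/off repositioning described above can be sketched as selecting, for a desired target color, only those beads whose native color is close enough; the tolerance and data layout are assumptions of this illustration.

```python
# Illustrative multi-color repositioning: turn on only the beads whose
# native color approximates the target color, and turn the rest off.
def reposition_color(beads, target_color, tolerance=0.15):
    """Enable beads whose per-channel color error stays within tolerance."""
    for bead in beads:
        err = max(abs(b - t) for b, t in zip(bead["color"], target_color))
        bead["on"] = err <= tolerance
    return beads

beads = [{"color": (1.0, 0.1, 0.1)}, {"color": (0.1, 0.1, 1.0)}]
reposition_color(beads, target_color=(1.0, 0.0, 0.0))
print([b["on"] for b in beads])  # [True, False]
```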
Setting the joint array in the real world to coincide with the virtual joint array after parameter adjustment may involve some deviation and inaccuracy, so calibration and multi-color repositioning are required. This involves calibrating the intensity, brightness, color, saturation, color temperature, color rendering, direction, range, distance and position of each projected light spot to ensure that it coincides with the corresponding projected light spot in the virtual world.
The second mapping information is issued to various components in the virtual reality system. This information is used to control projected light points and other objects in the virtual environment, as well as interactions with the real character and the real joint array.
The working assembly automatically controls the communication between the multi-connected controller and the joint array controller: the working components in the virtual reality system manage this communication automatically. The working assembly and the joint array controller are connected wirelessly; in this process, the multi-connected controller of the working assembly in the virtual reality system generates control instructions according to the second mapping information and other input information, so as to control the actions of the joint array controller. As shown in fig. 6, the number of multi-connected controllers in the virtual reality system is adjusted according to the user's needs. In the present invention, a multi-group controller refers to a controller capable of controlling n groups of joint arrays simultaneously; a multi-group joint array refers to the set of all joint arrays controlled by a plurality of joint array controllers; and a multi-connected controller refers to a controller capable of controlling n groups of joint array controllers. Through the use of multi-connected controllers, instant communication with multiple joint array controllers can be achieved, giving accurate control over the projected light of any number of controllable programmable pixel beads and thereby realizing correction of the various colors. Specifically, by receiving signals from the joint array controllers, the multi-connected controller can control, according to those signals, the intensity, brightness, color, saturation, color temperature, color rendering, direction, range, distance and position of the projected light of the programmable pixel beads in each group of joint arrays, thereby achieving accurate control of the entire pixel bead array. In addition, the multi-connected controller can adjust the arrangement and number of the multi-group joint arrays as required; a joint array controller can control the arrangement and number of part of the joint arrays, as well as the arrangement and number of the pixel beads within those joint arrays, so as to achieve different projection and visual effects.
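The controller hierarchy just described can be sketched as a two-level fan-out; the class names and command format below are assumptions of this illustration, not the patent's protocol.

```python
# Two-level fan-out: a multi-connected controller broadcasts to n
# joint-array controllers, each driving the beads of its own arrays.
class JointArrayController:
    def __init__(self, arrays):
        self.arrays = arrays                 # list of lists of bead dicts

    def apply(self, command):
        # Set the commanded parameters on every bead in every array.
        for array in self.arrays:
            for bead in array:
                bead.update(command)

class MultiConnectedController:
    def __init__(self, controllers):
        self.controllers = controllers       # n joint-array controllers

    def broadcast(self, command):
        # Instant fan-out to all joint-array controllers.
        for ctrl in self.controllers:
            ctrl.apply(command)

beads = [[{"color": (1, 1, 1)}], [{"color": (1, 1, 1)}]]
multi = MultiConnectedController([JointArrayController(beads)])
multi.broadcast({"color": (0.9, 0.3, 0.1), "brightness": 0.8})
print(beads)
```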
Communication passes through the multi-connected controller and the joint array controller, the latter being a controller that controls multiple joint arrays. Fig. 8 is a schematic diagram of the internal circuitry of a joint array controller, schematically illustrating its construction. A joint array is composed of n controllable programmable pixel beads and may take any shape, such as a water-wave, circular, rectangular or strip shape. The joint array controller may control a plurality of joint arrays, which can be arbitrarily composed into an array of a new shape. The joint array controller controls a plurality of joint arrays and acquires the change information of the projected points in the real-world array. This information may include the intensity, brightness, color, saturation, color temperature, color rendering, direction, range, distance and position of the projected point, as well as interaction information with the virtual environment and other objects; that is, the change information of the projected points in the virtual joint array is obtained.
According to the change information of the projected points in the virtual joint array, synchronous connection of the real-world projected light and the virtual-world projected light is realized. The above steps are repeated in successive cycles as the second camera 300 captures video, enabling real-time generation of a real joint array that maps the fused images and video into the real world. The joint array controller transmits the change information of the projected points so that the projected points of the real joint array change correspondingly. First, the user changes the parameters of the real joint array in the virtual reality system; the change information of the projected points in the real-world array is then obtained, describing changes in their position, color, intensity and other attributes.
According to the obtained change information of the virtual joint array's projected points, the joint array controller transmits the change information to the real-world array projection system. In this process, the joint array controller may need to communicate with the array projection system to convey the change information and control the action of the projected points. The multi-connected controller maps the projected spot parameter controls of the virtual joint array into the real-world joint array, and the projected light spots of the real-world joint array feed back the information data of the real projected spots. A joint array controller may control one joint array or multiple joint arrays; several joint array controllers may also jointly control the modes of a single joint array, with different user requirements corresponding to different mode settings.
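Putting the loop together, a per-frame synchronization pass might look like the following sketch; every interface here (poll_changes, transmit) is a placeholder name, since the patent does not specify the transport.

```python
# Per-frame synchronization: while the second camera captures, pull the
# virtual joint array's change information and push it through the joint
# array controller so the real spots track their virtual counterparts.
class VirtualArray:
    def __init__(self, updates):
        self._updates = list(updates)

    def poll_changes(self):
        """Return the next batch of position/color/intensity deltas."""
        return self._updates.pop(0) if self._updates else None

class JointArrayController:
    def __init__(self, real_array):
        self.real_array = real_array

    def transmit(self, changes):
        self.real_array.update(changes)   # real spots mirror the virtual ones

def sync_loop(virtual_array, controller, frames):
    """One pass per captured frame: forward virtual changes to the real array."""
    for _ in range(frames):
        changes = virtual_array.poll_changes()
        if changes:
            controller.transmit(changes)

real_array = {}
virtual = VirtualArray([{"spot_0": {"color": (0.2, 0.9, 0.4)}}])
sync_loop(virtual, JointArrayController(real_array), frames=3)
print(real_array)
```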
After receiving the change information issued by the joint array controller, the real-world array projection system performs the corresponding actions according to that information, including changing the intensity, brightness, color, saturation, color temperature, color rendering, direction, range, distance and position of the projected points, so as to match the requirements of the virtual environment and other objects.
As shown in fig. 7, the virtual world repeats the above steps in a cycle so that the projected light is mapped into the real-world presentation. Specifically, parameters are introduced into the virtual environment in the virtual reality system and adjusted by the working components of the virtual reality system; through the cooperative work of the multi-connected controller, the joint array controller and the real-world array projection system, the projected points in the real-world array change in correspondence with the matching projected points in the virtual environment. Such changes may be a match of attributes such as intensity, brightness, color, saturation, color temperature, color rendering, direction, range, distance and position of the projected light, or a corresponding adjustment of position. Through the combined use of these techniques, virtual parameters are mapped into the real world, so that the user can shoot and produce films continuously and synchronously in real time using the first camera 100, providing a more realistic virtual reality experience.
The projection light self-correction fitting and multi-color repositioning method comprises the following steps: acquiring data information of a real shooting space and synchronizing the data information to a virtual reality system; setting a real joint array for the real shooting space according to user requirements; setting a virtual joint array for the virtual reality system according to the real joint array; acquiring first mapping information of the virtual reality system; performing parameter adjustment and color fusion of the projected light on the virtual joint array in the virtual reality system according to the working assembly of the virtual reality system and the first mapping information, and obtaining second mapping information of the virtual reality system after the color fusion; and repositioning and parameter-setting the real joint array of the real shooting space according to the second mapping information of the multi-connected controller and the virtual joint array, so that the colors of the real shooting space and the virtual reality system are consistent. The real joint array and the virtual joint array are both projected light spots. Through the cooperative work of the virtual reality system and the joint array of the real shooting space, the invention realizes the restoration and mapping between the projected points in the real shooting space and the corresponding projected points in the virtual reality system, achieving linkage and interaction between the real world and the virtual world; the real joint array is set according to the adjusted virtual parameters, and the user can shoot a film directly through the first camera 100, providing a realistic virtual reality experience.
The optional implementations of the embodiments of the present invention have been described in detail above with reference to the accompanying drawings; however, the embodiments of the present invention are not limited to the specific details of the foregoing implementations. Within the scope of the technical concept of the embodiments of the present invention, various simple modifications may be made to the technical solution, and all such simple modifications fall within the protection scope of the embodiments of the present invention.
In addition, the specific features described in the above embodiments may be combined in any suitable manner without contradiction. In order to avoid unnecessary repetition, various possible combinations of embodiments of the present invention are not described in detail.
Those skilled in the art will appreciate that all or part of the steps of the methods of the embodiments described above may be implemented by a program stored in a storage medium, the program including instructions for causing a single-chip microcomputer, chip or processor to perform all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disk.
In addition, the various embodiments of the present invention may be combined in any manner, and as long as the concept of the embodiments of the present invention is not violated, such combinations should likewise be regarded as disclosed by the embodiments of the present invention.

Claims (10)

1. A method for self-correcting fitting and multi-color repositioning of projected light, the method comprising:
acquiring data information of a real shooting space, and synchronizing the data information to a virtual reality system;
setting a real joint array for the real shooting space according to the user demand;
setting a virtual joint array for a virtual reality system according to the real joint array;
acquiring first mapping information of the virtual reality system;
performing parameter adjustment and color fusion of projection light on a virtual joint array in the virtual reality system according to the working assembly of the virtual reality system and the first mapping information, and obtaining second mapping information of the virtual reality system after the color fusion;
repositioning and setting parameters of a real joint array of a real shooting space according to the second mapping information and the virtual joint array so that the colors of the real shooting space and a virtual reality system are consistent;
the real joint array and the virtual joint array are both projected light spots;
The obtaining the first mapping information of the virtual reality system includes: setting at least one virtual camera and a virtual positioner in a virtual reality system, importing a virtual environment, and obtaining space matching information of the virtual environment, the character and a virtual joint array;
the first mapping information and the second mapping information at least comprise the intensity, brightness, color, saturation, color temperature, color rendering degree, intensity, direction of the projected light, range of the projected light, distance of the projected light and position of the projected light spot;
the projected light is at least one group of pixel beads in a joint array;
the joint array is an array formed of a plurality of controllable programmable pixel beads;
the performing color fusion of the modulating and projecting light on the virtual joint array in the virtual reality system according to the working component of the virtual reality system and the first mapping information includes: after a working assembly in the virtual reality system receives the first mapping information, a user changes parameters of the projected light of the controllable programming pixel lamp beads in the virtual combined array in the virtual reality system according to own requirements, and color fusion and automatic adjustment are carried out on the projected light of the controllable programming pixel lamp beads in the virtual environment and the virtual combined array according to the parameters;
The color fusion includes mixing a projected light direction, a projected light range, a projected light distance, and a projected light color.
2. The method of claim 1, wherein:
the data information at least comprises projection origin information, imaging surface information, visual information, projection light information and positioning information;
the synchronizing the data information to the virtual reality system includes: setting a virtual reality system, wherein the virtual reality system is at least provided with a virtual camera and a virtual positioner; and synchronizing the data information to the virtual camera for synchronizing with a camera and a positioner of the real shooting space.
3. The method of claim 1, wherein the setting a virtual joint array for a virtual reality system from the real joint array comprises:
and adjusting the projected light of the virtual joint array according to the form of the real joint array in the real shooting space, so that the form of the real joint array in the real shooting space is consistent with the form of the virtual joint array in the virtual reality system.
4. The method of claim 1, wherein:
the projected light is that of controllable programmable pixel beads in at least one group of joint arrays, and the light efficacy parameter of the projected light is at least one of brightness, color, saturation, color temperature, color rendering and intensity.
5. The method of claim 1, wherein the repositioning and parameter setting of the real joint array of the real shooting space according to the second mapping information and the virtual joint array comprises:
determining color information of the virtual reality system according to the second mapping information and the virtual joint array;
mapping the projection light parameters of the real shooting space;
repositioning the projected light parameters according to color information of the virtual reality system;
synchronizing the color information update to a real camera for synchronizing with a camera and a locator of a real shooting space;
the repositioning includes calibrating, fitting, adapting and matching the actual projected light.
6. The method of claim 1, wherein:
the first mapping information and the second mapping information at least comprise the position, the range, the direction, the distance, the brightness, the color, the saturation, the color temperature, the color rendering and the intensity of the projection light spot.
7. A projected light self-correcting fitting and multi-color repositioning device, the device comprising:
the acquisition module is used for acquiring data information of a real shooting space and synchronizing the data information to the virtual reality system;
The first processing module is used for setting a real joint array for the real shooting space according to the user demand;
the second processing module is used for setting a virtual joint array for a virtual reality system according to the real joint array;
the third processing module is used for acquiring first mapping information of the virtual reality system; performing parameter adjustment and color fusion of projection light on a virtual joint array in the virtual reality system according to the working assembly of the virtual reality system and the first mapping information, and obtaining second mapping information of the virtual reality system after the color fusion;
the fourth processing module is used for repositioning and setting parameters of the real joint array of the real shooting space according to the second mapping information and the virtual joint array so that the colors of the real shooting space and the virtual reality system are consistent;
the real joint array and the virtual joint array are both projected light spots;
the obtaining the first mapping information of the virtual reality system includes: setting at least one virtual camera and a virtual positioner in a virtual reality system, importing a virtual environment, and obtaining space matching information of the virtual environment, the character and a virtual joint array;
The first mapping information and the second mapping information at least comprise the intensity, brightness, color, saturation, color temperature, color rendering degree, intensity, direction of the projected light, range of the projected light, distance of the projected light and position of the projected light spot;
the projected light is at least one group of pixel beads in a joint array;
the joint array is an array formed of a plurality of controllable programmable pixel beads;
the performing parameter adjustment and color fusion of the projected light on the virtual joint array in the virtual reality system according to the working component of the virtual reality system and the first mapping information includes: after a working assembly in the virtual reality system receives the first mapping information, a user changes parameters of the projected light of the controllable programmable pixel beads in the virtual joint array in the virtual reality system according to his own requirements, and color fusion and automatic adjustment are carried out on the projected light of the controllable programmable pixel beads in the virtual environment and the virtual joint array according to the parameters;
the color fusion includes mixing a projected light direction, a projected light range, a projected light distance, and a projected light color.
8. The apparatus of claim 7, wherein:
the data information at least comprises projection origin information, imaging surface information, visual information, projection light information and positioning information;
The synchronizing the data information to the virtual reality system includes: setting a virtual reality system, wherein the virtual reality system is at least provided with a virtual camera and a virtual positioner; and synchronizing the data information to the virtual camera for synchronizing with a camera and a positioner of the real shooting space.
9. The apparatus of claim 7, wherein the setting a virtual joint array for a virtual reality system according to the real joint array comprises:
and adjusting the projected light of the virtual joint array according to the form of the real joint array in the real shooting space, so that the form of the real joint array in the real shooting space is consistent with the form of the virtual joint array in the virtual reality system.
10. The apparatus of claim 7, wherein the repositioning and parameter setting of the real joint array of the real shooting space according to the second mapping information and the virtual joint array comprises:
determining color information of the virtual reality system according to the second mapping information and the virtual joint array;
mapping the projection light parameters of the real shooting space;
repositioning the projected light parameters according to color information of the virtual reality system;
Synchronizing the color information update to a real camera for synchronizing with a camera and a locator of a real shooting space;
the repositioning includes calibrating, fitting, adapting and matching the actual projected light.
CN202311778131.8A 2023-12-22 2023-12-22 Projection light self-correction fitting and multicolor repositioning method and device Active CN117459663B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311778131.8A CN117459663B (en) 2023-12-22 2023-12-22 Projection light self-correction fitting and multicolor repositioning method and device

Publications (2)

Publication Number Publication Date
CN117459663A CN117459663A (en) 2024-01-26
CN117459663B true CN117459663B (en) 2024-02-27

Family

ID=89591419

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311778131.8A Active CN117459663B (en) 2023-12-22 2023-12-22 Projection light self-correction fitting and multicolor repositioning method and device

Country Status (1)

Country Link
CN (1) CN117459663B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2009127910A1 (en) * 2008-04-18 2009-10-22 Rajakaruna Wijemuni Gunawardan Realistic parallax visual system
CN103226830A (en) * 2013-04-25 2013-07-31 北京大学 Automatic matching correction method of video texture projection in three-dimensional virtual-real fusion environment
JP2014016563A (en) * 2012-07-11 2014-01-30 National Institute Of Information & Communication Technology Three-dimensional display device
WO2014031899A1 (en) * 2012-08-22 2014-02-27 Goldrun Corporation Augmented reality virtual content platform apparatuses, methods and systems
WO2014191990A1 (en) * 2013-05-26 2014-12-04 Pixellot Ltd. Method and system for low cost television production
CN105027030A (en) * 2012-11-01 2015-11-04 艾卡姆有限公司 Wireless wrist computing and control device and method for 3d imaging, mapping, networking and interfacing
CN114051129A (en) * 2021-11-09 2022-02-15 北京电影学院 Film virtualization production system and method based on LED background wall
CN116486048A (en) * 2022-11-14 2023-07-25 腾讯科技(深圳)有限公司 Virtual-real fusion picture generation method, device, equipment and system

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100253700A1 (en) * 2009-04-02 2010-10-07 Philippe Bergeron Real-Time 3-D Interactions Between Real And Virtual Environments
US20190333541A1 (en) * 2016-11-14 2019-10-31 Lightcraft Technology Llc Integrated virtual scene preview system

Also Published As

Publication number Publication date
CN117459663A (en) 2024-01-26

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant