CN110033521B - Three-dimensional visualization system based on VR and AR technologies - Google Patents

Three-dimensional visualization system based on VR and AR technologies

Info

Publication number
CN110033521B
CN110033521B (application CN201910255016.XA)
Authority
CN
China
Prior art keywords
module
color
target object
image
dimensional
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910255016.XA
Other languages
Chinese (zh)
Other versions
CN110033521A (en)
Inventor
张淼辉
王伦海
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chongqing Gucheng Future Education Technology Co Ltd
Original Assignee
Chongqing Gucheng Future Education Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chongqing Gucheng Future Education Technology Co Ltd filed Critical Chongqing Gucheng Future Education Technology Co Ltd
Priority to CN201910255016.XA priority Critical patent/CN110033521B/en
Publication of CN110033521A publication Critical patent/CN110033521A/en
Application granted granted Critical
Publication of CN110033521B publication Critical patent/CN110033521B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/005 General purpose rendering architectures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/50 Lighting effects
    • G06T15/506 Illumination models
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/006 Mixed reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10016 Video; Image sequence
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10024 Color image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/50 Depth or shape recovery
    • G06T7/55 Depth or shape recovery from multiple images

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention discloses a three-dimensional visualization system based on VR and AR technologies. It addresses the problem that prior-art three-dimensional visualization systems cannot adjust color according to the observer's viewing angle, distance and time period, so the color an observer sees is the same from every angle and the realistic effect is reduced. The system comprises an image acquisition module, an image processing module, a three-dimensional modeling module, an image color module, a color adjustment module, a sunshine simulation module, a visual angle detection module, an interaction module, and VR and AR smart devices. In this system, the color adjustment module recalculates according to the observation angle of the VR and AR smart devices and the input time period, and the new assigned colors of the pixel grids are then used to color the model, so that the colors of the three-dimensional model seen by an observer are closer to reality.

Description

Three-dimensional visualization system based on VR and AR technologies
Technical Field
The invention relates to the technical field of three-dimensional visualization, in particular to a three-dimensional visualization system based on VR and AR technologies.
Background
Visualization technology encompasses the theories, methods and techniques that use computer graphics and image processing to convert data into graphics or images displayed on a screen and to process them interactively. It involves many fields, such as computer graphics, image processing, computer vision and computer-aided design, and has become an integrated technology for studying problems such as data representation, data processing and decision analysis. Over the past decade, advances in computer graphics have gradually matured three-dimensional modeling techniques, making it possible to reproduce objects of the three-dimensional world through computer simulation and to express complex information through three-dimensional shapes. Meanwhile, the rapid rise of parallel computing and graphics-acceleration hardware in recent years has greatly advanced visualization technology.
In prior-art three-dimensional visualization systems, colors cannot be adjusted according to the observer's angle, distance and time period, so the colors observed from any angle are unchanged and the realistic effect is reduced.
Disclosure of Invention
The invention aims to provide a three-dimensional visualization system based on VR and AR technologies.
The technical problems to be solved by the invention are as follows:
(1) how to adjust the color of the three-dimensional visualization system according to the observation angle, distance and time period, so that the observed three-dimensional model is more realistic;
(2) how to record, cache and reasonably delete users' observation videos;
the purpose of the invention can be realized by the following technical scheme: a three-dimensional visualization system based on VR and AR technologies comprises an image acquisition module, an image processing module, a three-dimensional modeling module, an image color module, a color adjustment module, a three-dimensional modeling module, a sunshine simulation module, a visual angle detection module, an interaction module and VR and AR intelligent equipment;
the image acquisition module acquires mobile phone images, unmanned aerial vehicle oblique images, street view vehicle images and internet image data of a target object and sends them to the image processing module; the image processing module receives these images and extracts the target object's visualized data; the visualized data comprises target object feature points extracted with the SIFT algorithm and target object visualized color data; the image processing module sends the feature point data to the three-dimensional modeling module, which performs three-dimensional modeling on the target object feature points;
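For concreteness, the feature-extraction step can be pictured as follows; this is a minimal sketch in which OpenCV's SIFT implementation stands in for the module's SIFT-based extractor, and the image path is a hypothetical placeholder, not part of the patent.

    # Minimal sketch of SIFT feature-point extraction (OpenCV assumed; path hypothetical)
    import cv2

    def extract_feature_points(image_path):
        """Return SIFT keypoints and descriptors for one acquired image."""
        image = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
        if image is None:
            raise FileNotFoundError(image_path)
        sift = cv2.SIFT_create()
        keypoints, descriptors = sift.detectAndCompute(image, None)
        return keypoints, descriptors

    keypoints, descriptors = extract_feature_points("target_object.jpg")
    print(len(keypoints), "feature points extracted")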
The image processing module sends the visualized color data to the image color module; the image color module receives the target object's visualized color data and carries out computer color identification, with the following specific identification steps:
Step one: divide the target object's visualized color data into a number of pixel grids Pij, i = 1…n, j = 1…n, and set the computer color classification as RGBk, k = 1…n;
Step two: match the color in pixel grid Pij against the computer color classification; when the match exceeds a set threshold, the color of pixel grid Pij is taken as the matched color RGBk, as sketched below;
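A minimal sketch of steps one and two follows; the palette values, grid size and similarity threshold are illustrative assumptions, since the patent does not specify them.

    # Sketch of pixel-grid color matching (palette and threshold are assumed values)
    import numpy as np

    PALETTE = {  # computer color classification RGBk (illustrative)
        "red": (255, 0, 0),
        "green": (0, 255, 0),
        "blue": (0, 0, 255),
        "gray": (128, 128, 128),
    }
    MAX_DIST = np.linalg.norm([255, 255, 255])  # largest possible RGB distance

    def match_pixel_grid(grid, threshold=0.85):
        """Return the palette color whose similarity to the grid's mean color
        exceeds the threshold, or None when no classification matches."""
        mean_color = grid.reshape(-1, 3).mean(axis=0)
        best_name, best_score = None, 0.0
        for name, rgb in PALETTE.items():
            similarity = 1.0 - np.linalg.norm(mean_color - np.array(rgb)) / MAX_DIST
            if similarity > best_score:
                best_name, best_score = name, similarity
        return best_name if best_score >= threshold else None

    grid = np.tile(np.array([250.0, 10.0, 5.0]), (8, 8, 1))  # an 8x8 pixel grid Pij
    print(match_pixel_grid(grid))  # -> "red"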
The image color module sends the color matched with the pixel grid to the color adjusting module;
the sunshine simulation module is used for simulating illumination of the target object according to a time sequence and acquiring an illumination intensity value of each pixel grid of the target object; the sunshine simulation module sends the collected illumination intensity value of each pixel grid of the target object to the color adjustment module;
the visual angle detection module detects in real time the observation angle of the VR and AR smart devices and the coordinates of the observation viewpoint and the target object, and sends them to the color adjustment module; the color adjustment module adjusts the color of the target object, and the specific adjustment steps are as follows:
S1: calculate the included angle between each pixel grid within the observation angle of the VR and AR smart devices and the observation viewpoint of the VR and AR smart devices, and record the included angle as Dij;
S2: calculate the distance between each target object pixel grid within the observation angle and the observation viewpoint of the VR and AR smart devices, and record the distance as Sij;
S3: set the illumination intensity value LXij corresponding to each pixel grid Pij;
S4: using the brightness formula (given in the patent as image GDA0002232190550000031), obtain the pixel brightness MDij of the pixel grids at different positions, where u1, u2 and u3 are preset proportionality coefficients, MDb is a brightness standard value, and Db is a preset angle threshold;
S5: reassign pixel grid Pij using its matched color RGBk and the calculated pixel brightness MDij to obtain a new assigned color, recorded as CRGBk; a sketch of steps S1 to S5 follows;
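The patent gives the brightness formula only as an image, so the linear form in the sketch below is an assumption chosen to reproduce the behavior described in the text: brightness rises with the illumination intensity LXij, falls with the viewing distance Sij, and falls as the included angle Dij deviates from the angle threshold Db (ninety degrees). All coefficients are illustrative.

    # Sketch of steps S1-S5 (formula shape and coefficients are assumptions)
    def pixel_brightness(lx_ij, s_ij, d_ij, md_b=1.0, d_b=90.0,
                         u1=0.0005, u2=0.001, u3=0.005):
        """Pixel brightness MDij for one pixel grid."""
        return md_b + u1 * lx_ij - u2 * s_ij - u3 * abs(d_ij - d_b)

    def reassign_color(rgb_k, md_ij):
        """S5: scale the matched color RGBk by MDij to get the new color CRGBk."""
        return tuple(max(0, min(255, round(c * md_ij))) for c in rgb_k)

    md_ij = pixel_brightness(lx_ij=500.0, s_ij=30.0, d_ij=75.0)
    crgb_k = reassign_color((180, 120, 60), md_ij)  # -> (206, 137, 69)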
The color adjustment module sends the new assigned colors of the pixel grids to the three-dimensional modeling module; the three-dimensional modeling module reassigns the target object's colors according to the new assigned colors of the pixel grids; the three-dimensional modeling module sends the three-dimensional model of the target object to a server.
Preferably, the server receives the three-dimensional model of the target object sent by the three-dimensional modeling module and marks it as the three-dimensional visualization system for storage; the server further comprises a sound import module and a history cache module; the sound import module imports sound for the target object's three-dimensional model into the three-dimensional visualization system; the history cache module records and caches users' visits to the three-dimensional visualization system, and comprises a storage unit, a statistical unit, a calculation unit and a deletion unit; the storage unit caches recorded videos of users visiting the three-dimensional visualization system; the statistical unit counts the number of times users access those recorded videos; the calculation unit calculates the cache storage time of the videos cached in the storage unit; the deletion unit deletes the videos cached in the storage unit; the specific calculation steps of the calculation unit are as follows:
Step one: when a user visits the three-dimensional visualization system in the server, timing and caching begin; when the user stops visiting, timing and caching stop; record the duration of the cached video as Ti, i = 1…n, and the size of the cached video as Gi, i = 1…n;
Step two: obtain the cache storage time Hi with the formula Hi = Ha - (Ti - Tb)*z1 - (Gi - Gb)*z2 + Qi*z3, where Qi is the number of times users accessed the recorded video, Ha is a standard value of the cache storage time, Tb is a duration standard value of the cached video, Gb is a size standard value of the cached video, and z1, z2 and z3 are preset fixed proportionality coefficients; the longer the cached video's duration, the shorter the cache storage time; the larger the cached video, the shorter the cache storage time; the more accesses, the longer the cache storage time;
Step three: the date the video was cached plus the cache storage time gives the deletion date; when the deletion date is the same as the system's current date, the calculation unit sends a deletion instruction to the deletion unit, and the deletion unit deletes the cached video, as sketched below;
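A minimal sketch of the cache-lifetime rule follows, using the cache storage time formula as reconstructed above; the standard values and proportionality coefficients are illustrative assumptions.

    # Sketch of the cache-lifetime rule Hi = Ha - (Ti - Tb)*z1 - (Gi - Gb)*z2 + Qi*z3
    from datetime import date, timedelta

    def cache_storage_days(ti_minutes, gi_mb, qi_visits, ha=30.0, tb=10.0,
                           gb=100.0, z1=0.5, z2=0.05, z3=1.0):
        """Cache storage time Hi in days: longer or larger videos are kept for
        less time; frequently accessed videos are kept longer."""
        return ha - (ti_minutes - tb) * z1 - (gi_mb - gb) * z2 + qi_visits * z3

    def deletion_date(cached_on, hi_days):
        """Step three: caching date plus cache storage time gives the deletion date."""
        return cached_on + timedelta(days=hi_days)

    hi = cache_storage_days(ti_minutes=25, gi_mb=300, qi_visits=8)  # -> 20.5 days
    print(deletion_date(date.today(), hi))  # delete when this equals today's date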
Preferably, the interaction module comprises an access unit and a voice unit; the access unit allows a user to log in to and access the server through VR and AR devices; the voice unit provides real-time voice communication between users.
Preferably, the interaction module further comprises an input unit; the input unit allows a user to input a time period and sends the time period to the sunshine simulation module; the sunshine simulation module receives and processes the input time period as follows:
Step one: set time periods STm, and denote the simulated sun position for each as Wm;
step two: matching the input time period with a set time period STm; obtaining a matched sun simulation position Wm;
Step three: illumination intensity detection units are arranged on the target object in different directions; under the simulated sun position Wm, the different directions of the target object model are set as Fd, d = 1…12, and the illumination intensity collected by the detection unit in direction Fd is recorded as LXd;
Step four: determine the direction in which each pixel grid lies; when the direction of a pixel grid belongs to Fd, that pixel grid's corresponding LXij = LXd, as sketched below;
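A minimal sketch of this time-period processing follows; the time periods, sun positions and detector readings are illustrative assumptions.

    # Sketch of sun-position matching and per-direction illumination assignment
    SUN_POSITIONS = {"morning": "east-low", "noon": "south-high", "evening": "west-low"}  # STm -> Wm

    # Illumination LXd read by the detector in each direction Fd, d = 1..12,
    # under each sun position Wm (synthesized here for illustration)
    DIRECTION_INTENSITY = {
        "east-low": {d: 80.0 + 10.0 * d for d in range(1, 13)},
        "south-high": {d: 150.0 + 20.0 * d for d in range(1, 13)},
        "west-low": {d: 70.0 + 10.0 * d for d in range(1, 13)},
    }

    def illumination_for_pixel(time_period, direction_d):
        """Steps two to four for one pixel grid lying in direction Fd."""
        wm = SUN_POSITIONS[time_period]              # step two: matched sun position Wm
        lx_d = DIRECTION_INTENSITY[wm][direction_d]  # step three: detector reading LXd
        return lx_d                                  # step four: LXij = LXd

    lx_ij = illumination_for_pixel("noon", direction_d=7)  # -> 290.0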
The invention has the beneficial effects that:
(1) The invention acquires mobile phone images, unmanned aerial vehicle oblique images, street view vehicle images and internet image data of the target object through the image acquisition module and transmits them to the image processing module; color extraction and modeling of the target object are then performed; the color adjustment module recalculates according to the observation angle of the VR and AR smart devices and the input time period, and the new assigned colors of the pixel grids then color the model, so the colors of the three-dimensional model seen by an observer are closer to reality;
(2) Using the brightness formula (given in the patent as image GDA0002232190550000051), the pixel brightness MDij of the pixel grids at different positions is obtained: the larger the illumination intensity value, the larger the pixel brightness value; the closer the target object pixel grid is to the observation viewpoint of the VR and AR smart devices, the larger the pixel grid brightness value; and the smaller the deviation of the included angle from ninety degrees, the larger the pixel grid brightness value;
(3) The server further comprises a sound import module and a history cache module; the sound import module imports sound for the target object's three-dimensional model into the three-dimensional visualization system; the history cache module records and caches users' visits to the three-dimensional visualization system, and obtains the cache storage time Hi with the formula Hi = Ha - (Ti - Tb)*z1 - (Gi - Gb)*z2 + Qi*z3: the longer the cached video's duration, the shorter the cache storage time; the larger the cached video, the shorter the cache storage time; and the more accesses, the longer the cache storage time; the interaction module comprises an access unit and a voice unit; the access unit allows a user to log in to and access the server through VR and AR devices; the voice unit provides real-time voice communication between users.
Drawings
The invention will be further described with reference to the accompanying drawings.
Fig. 1 is a schematic block diagram of a three-dimensional visualization system based on VR and AR technologies according to the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1, the invention is a three-dimensional visualization system based on VR and AR technologies, which includes an image acquisition module, an image processing module, a three-dimensional modeling module, an image color module, a color adjustment module, a sunshine simulation module, a visual angle detection module, an interaction module, and VR and AR smart devices;
the image acquisition module acquires mobile phone images, unmanned aerial vehicle oblique images, street view vehicle images and internet image data of a target object and sends them to the image processing module; the image processing module receives these images and extracts the target object's visualized data; the visualized data comprises target object feature points extracted with the SIFT algorithm and target object visualized color data; the image processing module sends the feature point data to the three-dimensional modeling module, which performs three-dimensional modeling on the target object feature points;
the image processing module sends the visualized color data to the image color module; the image color module receives the target object's visualized color data and carries out computer color identification, with the following specific identification steps:
Step one: divide the target object's visualized color data into a number of pixel grids Pij, i = 1…n, j = 1…n, and set the computer color classification as RGBk, k = 1…n;
Step two: match the color in pixel grid Pij against the computer color classification; when the match exceeds a set threshold, the color of pixel grid Pij is taken as the matched color RGBk; the target object's colors are magnified to form the pixel grids, and matching the pixel grid colors with the computer colors yields the RGB colors recognized by the computer;
the image color module sends the color matched with the pixel grid to the color adjusting module;
the sunshine simulation module is used for simulating illumination on the target object according to a time sequence and acquiring an illumination intensity value of each pixel grid of the target object; the sunshine simulation module sends the collected illumination intensity value of each pixel grid of the target object to the color adjustment module;
the visual angle detection module detects in real time the observation angle of the VR and AR smart devices and the coordinates of the observation viewpoint and the target object, and sends them to the color adjustment module; the color adjustment module adjusts the color of the target object, and the specific adjustment steps are as follows:
S1: calculate the included angle between each pixel grid within the observation angle of the VR and AR smart devices and the observation viewpoint of the VR and AR smart devices, and record the included angle as Dij;
S2: calculate the distance between each target object pixel grid within the observation angle and the observation viewpoint of the VR and AR smart devices, and record the distance as Sij;
S3: set the illumination intensity value LXij corresponding to each pixel grid Pij;
S4: using the brightness formula, obtain the pixel brightness MDij of the pixel grids at different positions, where u1, u2 and u3 are preset proportionality coefficients, MDb is a brightness standard value, and Db is a preset angle threshold; the larger the illumination intensity value, the larger the pixel brightness value; the closer the target object pixel grid is to the observation viewpoint of the VR and AR smart devices, the larger the pixel grid brightness value; the smaller the deviation of the included angle from ninety degrees, the larger the pixel grid brightness value; the visualized color of the three-dimensional model is adjusted in real time according to the included angle and distance of the observation angle and the simulated illumination intensity, so that observation is closer to reality;
S5: reassign pixel grid Pij using its matched color RGBk and the calculated pixel brightness MDij to obtain a new assigned color, recorded as CRGBk;
The color adjustment module sends the new assigned colors of the pixel grids to the three-dimensional modeling module; the three-dimensional modeling module reassigns the target object's colors according to the new assigned colors of the pixel grids; the three-dimensional modeling module sends the three-dimensional model of the target object to a server;
the server receives the three-dimensional model of the target object sent by the three-dimensional modeling module and marks it as the three-dimensional visualization system for storage; the server further comprises a sound import module and a history cache module; the sound import module imports sound for the target object's three-dimensional model into the three-dimensional visualization system; the three-dimensional model, the reassigned pixel grid colors and the imported sound form a three-dimensional scene that is observed through the VR and AR smart devices; the history cache module records and caches users' visits to the three-dimensional visualization system, and comprises a storage unit, a statistical unit, a calculation unit and a deletion unit; the storage unit caches recorded videos of users visiting the three-dimensional visualization system; the statistical unit counts the number of times users access those recorded videos; the calculation unit calculates the cache storage time of the videos cached in the storage unit; the deletion unit deletes the videos cached in the storage unit; the specific calculation steps of the calculation unit are as follows:
Step one: when a user visits the three-dimensional visualization system in the server, timing and caching begin; when the user stops visiting, timing and caching stop; record the duration of the cached video as Ti, i = 1…n, and the size of the cached video as Gi, i = 1…n;
Step two: obtain the cache storage time Hi with the formula Hi = Ha - (Ti - Tb)*z1 - (Gi - Gb)*z2 + Qi*z3, where Qi is the number of times users accessed the recorded video, Ha is a standard value of the cache storage time, Tb is a duration standard value of the cached video, Gb is a size standard value of the cached video, and z1, z2 and z3 are preset fixed proportionality coefficients; the longer the cached video's duration, the shorter the cache storage time; the larger the cached video, the shorter the cache storage time; the more accesses, the longer the cache storage time;
Step three: the date the video was cached plus the cache storage time gives the deletion date; when the deletion date is the same as the system's current date, the calculation unit sends a deletion instruction to the deletion unit, and the deletion unit deletes the cached video;
the interaction module comprises an access unit and a voice unit; the access unit allows a user to log in to and access the server through VR and AR devices; the voice unit provides real-time voice communication between users;
the interaction module further comprises an input unit; the input unit allows a user to input a time period and sends the time period to the sunshine simulation module; the sunshine simulation module receives and processes the input time period as follows:
Step one: set time periods STm, and denote the simulated sun position for each as Wm;
step two: matching the input time period with a set time period STm; obtaining a matched sun simulation position Wm;
Step three: illumination intensity detection units are arranged on the target object in different directions; under the simulated sun position Wm, the different directions of the target object model are set as Fd, d = 1…12, and the illumination intensity collected by the detection unit in direction Fd is recorded as LXd;
Step four: determine the direction in which each pixel grid lies; when the direction of a pixel grid belongs to Fd, that pixel grid's corresponding LXij = LXd.
The working principle of the invention is as follows: the image acquisition module acquires mobile phone images, unmanned aerial vehicle oblique images, street view vehicle images and internet image data of the target object and transmits them to the image processing module; color extraction and modeling of the target object are then performed; the color adjustment module recalculates according to the observation angle of the VR and AR smart devices and the input time period, and the new assigned colors of the pixel grids then color the model, so the colors of the three-dimensional model seen by an observer are closer to reality; the three-dimensional model, the reassigned pixel grid colors and the imported sound form a three-dimensional scene that is observed through the VR and AR smart devices; using the brightness formula, the pixel brightness MDij of the pixel grids at different positions is obtained: the larger the illumination intensity value, the larger the pixel brightness value; the closer the target object pixel grid is to the observation viewpoint of the VR and AR smart devices, the larger the pixel grid brightness value; the smaller the deviation of the included angle from ninety degrees, the larger the pixel grid brightness value; the server further comprises a sound import module and a history cache module; the sound import module imports sound for the target object's three-dimensional model into the three-dimensional visualization system; the history cache module records and caches users' visits to the three-dimensional visualization system, and obtains the cache storage time Hi with the formula Hi = Ha - (Ti - Tb)*z1 - (Gi - Gb)*z2 + Qi*z3: the longer the cached video's duration, the shorter the cache storage time; the larger the cached video, the shorter the cache storage time; the more accesses, the longer the cache storage time; the interaction module comprises an access unit and a voice unit; the access unit allows a user to log in to and access the server through VR and AR devices; the voice unit provides real-time voice communication between users.
The foregoing is merely exemplary and illustrative of the present invention and various modifications, additions and substitutions may be made by those skilled in the art to the specific embodiments described without departing from the scope of the invention as defined in the following claims.

Claims (4)

1. A three-dimensional visualization system based on VR and AR technologies, characterized by comprising an image acquisition module, an image processing module, a three-dimensional modeling module, an image color module, a color adjustment module, a sunshine simulation module, a visual angle detection module, an interaction module, and VR and AR smart devices;
wherein the image acquisition module is used for acquiring mobile phone images, unmanned aerial vehicle oblique images, street view vehicle images and internet image data of a target object and sending them to the image processing module; the image processing module receives these images and extracts the target object's visualized data; the visualized data comprises target object feature points extracted with the SIFT algorithm and target object visualized color data; the image processing module sends the feature point data to the three-dimensional modeling module, which performs three-dimensional modeling on the target object feature points;
the image processing module sends the visualized color data to the image color module; the image color module receives the target object's visualized color data and carries out computer color identification, with the following specific identification steps:
Step one: divide the target object's visualized color data into a number of pixel grids Pij, i = 1…n, j = 1…n, and set the computer color classification as RGBk, k = 1…n;
Step two: match the color in pixel grid Pij against the computer color classification; when the match exceeds a set threshold, the color of pixel grid Pij is taken as the matched color RGBk;
The image color module sends the color matched with the pixel grid to the color adjusting module;
the sunshine simulation module is used for simulating illumination of the target object according to a time sequence and acquiring an illumination intensity value of each pixel grid of the target object; the sunshine simulation module sends the collected illumination intensity value of each pixel grid of the target object to the color adjustment module;
the visual angle detection module detects in real time the observation angle of the VR and AR smart devices and the coordinates of the observation viewpoint and the target object, and sends them to the color adjustment module; the color adjustment module adjusts the color of the target object, and the specific adjustment steps are as follows:
S1: calculate the included angle between each pixel grid within the observation angle of the VR and AR smart devices and the observation viewpoint of the VR and AR smart devices, and record the included angle as Dij;
S2: calculate the distance between each target object pixel grid within the observation angle and the observation viewpoint of the VR and AR smart devices, and record the distance as Sij;
S3: set the illumination intensity value LXij corresponding to each pixel grid Pij;
S4: using the brightness formula (given in the patent as image FDA0002232190540000021), obtain the pixel brightness MDij of the pixel grids at different positions, where u1, u2 and u3 are preset proportionality coefficients, MDb is a brightness standard value, and Db is a preset angle threshold;
S5: reassign pixel grid Pij using its matched color RGBk and the calculated pixel brightness MDij to obtain a new assigned color, recorded as CRGBk;
The color adjustment module sends the new assigned colors of the pixel grids to the three-dimensional modeling module; the three-dimensional modeling module reassigns the target object's colors according to the new assigned colors of the pixel grids; the three-dimensional modeling module sends the three-dimensional model of the target object to a server.
2. The three-dimensional visualization system according to claim 1, wherein the server receives the three-dimensional model of the target object sent by the three-dimensional modeling module and marks it as the three-dimensional visualization system for storage; the server further comprises a sound import module and a history cache module; the sound import module imports sound for the target object's three-dimensional model into the three-dimensional visualization system; the history cache module records and caches users' visits to the three-dimensional visualization system, and comprises a storage unit, a statistical unit, a calculation unit and a deletion unit; the storage unit caches recorded videos of users visiting the three-dimensional visualization system; the statistical unit counts the number of times users access those recorded videos; the calculation unit calculates the cache storage time of the videos cached in the storage unit; the deletion unit deletes the videos cached in the storage unit; the specific calculation steps of the calculation unit are as follows:
Step one: when a user visits the three-dimensional visualization system in the server, timing and caching begin; when the user stops visiting, timing and caching stop; record the duration of the cached video as Ti, i = 1…n, and the size of the cached video as Gi, i = 1…n;
Step two: obtain the cache storage time Hi with the formula Hi = Ha - (Ti - Tb)*z1 - (Gi - Gb)*z2 + Qi*z3, where Qi is the number of times users accessed the recorded video, Ha is a standard value of the cache storage time, Tb is a duration standard value of the cached video, Gb is a size standard value of the cached video, and z1, z2 and z3 are preset fixed proportionality coefficients; the longer the cached video's duration, the shorter the cache storage time; the larger the cached video, the shorter the cache storage time; the more accesses, the longer the cache storage time;
Step three: the date the video was cached plus the cache storage time gives the deletion date; when the deletion date is the same as the system's current date, the calculation unit sends a deletion instruction to the deletion unit, and the deletion unit deletes the cached video.
3. The three-dimensional visualization system according to claim 1, wherein the interaction module comprises an access unit and a voice unit; the access unit allows a user to log in to and access the server through VR and AR devices; the voice unit provides real-time voice communication between users.
4. The three-dimensional visualization system according to claim 1, wherein the interaction module further comprises an input unit; the input unit allows a user to input a time period and sends the time period to the sunshine simulation module; the sunshine simulation module receives and processes the input time period as follows:
Step one: set time periods STm, and denote the simulated sun position for each as Wm;
step two: matching the input time period with a set time period STm; obtaining a matched sun simulation position Wm;
Step three: illumination intensity detection units are arranged on the target object in different directions; under the simulated sun position Wm, the different directions of the target object model are set as Fd, d = 1…12, and the illumination intensity collected by the detection unit in direction Fd is recorded as LXd;
Step four: determine the direction in which each pixel grid lies; when the direction of a pixel grid belongs to Fd, that pixel grid's corresponding LXij = LXd.
CN201910255016.XA 2019-04-01 2019-04-01 Three-dimensional visualization system based on VR and AR technologies Active CN110033521B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910255016.XA CN110033521B (en) 2019-04-01 2019-04-01 Three-dimensional visualization system based on VR and AR technologies

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910255016.XA CN110033521B (en) 2019-04-01 2019-04-01 Three-dimensional visualization system based on VR and AR technologies

Publications (2)

Publication Number Publication Date
CN110033521A (en) 2019-07-19
CN110033521B (en) 2020-01-14

Family

ID=67237127

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910255016.XA Active CN110033521B (en) 2019-04-01 2019-04-01 Three-dimensional visualization system based on VR and AR technologies

Country Status (1)

Country Link
CN (1) CN110033521B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114397960B (en) * 2021-12-28 2024-05-31 深圳潜行创新科技有限公司 Flight control direction visualization method based on intelligent mobile equipment

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101489022A (en) * 2008-01-16 2009-07-22 延世大学工业学术合作社 Color recovery method and system
CN106796771A (en) * 2014-10-15 2017-05-31 精工爱普生株式会社 The method and computer program of head-mounted display apparatus, control head-mounted display apparatus

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102938057B (en) * 2012-10-19 2015-09-23 株洲南车时代电气股份有限公司 A kind of method for eliminating vehicle shadow and device
JP6329417B2 (en) * 2014-03-31 2018-05-23 株式会社Subaru Outside environment recognition device
US9361670B2 (en) * 2014-09-04 2016-06-07 National Taipei University Of Technology Method and system for image haze removal based on hybrid dark channel prior
IL236243A (en) * 2014-12-14 2016-08-31 Elbit Systems Ltd Visual perception enhancement of displayed color symbology
CN106296621B (en) * 2015-05-22 2019-08-23 腾讯科技(深圳)有限公司 Image processing method and device
CN109409251B (en) * 2015-08-18 2023-05-16 奇跃公司 Virtual and augmented reality systems and methods
JP6930092B2 (en) * 2016-11-17 2021-09-01 セイコーエプソン株式会社 Electro-optic equipment, manufacturing method of electro-optical equipment, and electronic devices

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101489022A (en) * 2008-01-16 2009-07-22 延世大学工业学术合作社 Color recovery method and system
CN106796771A (en) * 2014-10-15 2017-05-31 精工爱普生株式会社 The method and computer program of head-mounted display apparatus, control head-mounted display apparatus

Also Published As

Publication number Publication date
CN110033521A (en) 2019-07-19

Similar Documents

Publication Publication Date Title
US11538229B2 (en) Image processing method and apparatus, electronic device, and computer-readable storage medium
CN107247834B (en) A kind of three dimensional environmental model reconstructing method, equipment and system based on image recognition
CN106033435B (en) Item identification method and device, indoor map generation method and device
EP3550516B1 (en) Environmental parameter based selection of a data model for recognizing an object of a real environment
WO2023093217A1 (en) Data labeling method and apparatus, and computer device, storage medium and program
CN110322564B (en) Three-dimensional model construction method suitable for VR/AR transformer substation operation environment
CN109887003A (en) A kind of method and apparatus initialized for carrying out three-dimensional tracking
CN105427385A (en) High-fidelity face three-dimensional reconstruction method based on multilevel deformation model
CN110428449A (en) Target detection tracking method, device, equipment and storage medium
CN107944459A (en) A kind of RGB D object identification methods
CN110660125B (en) Three-dimensional modeling device for power distribution network system
CN102932638B (en) 3D video monitoring method based on computer modeling
WO2020211427A1 (en) Segmentation and recognition method, system, and storage medium based on scanning point cloud data
CN112818925A (en) Urban building and crown identification method
CN107066605A (en) Facility information based on image recognition has access to methods of exhibiting automatically
CN112562056A (en) Control method, device, medium and equipment for virtual light in virtual studio
KR20210129360A (en) System for providing 3D model augmented reality service using AI and method thereof
CN110033521B (en) Three-dimensional visualization system based on VR and AR technologies
Zhai et al. Image real-time augmented reality technology based on spatial color and depth consistency
CN112396831B (en) Three-dimensional information generation method and device for traffic identification
CN103646397A (en) Real-time synthetic aperture perspective imaging method based on multi-source data fusion
CN116935008A (en) Display interaction method and device based on mixed reality
CN114255328A (en) Three-dimensional reconstruction method for ancient cultural relics based on single view and deep learning
Wang et al. Research and implementation of the sports analysis system based on 3D image technology
CN116612256B (en) NeRF-based real-time remote three-dimensional live-action model browsing method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant