CN109829954A - 3D pseudo-color mapping system, mapping method thereof, and application of the 3D pseudo-color mapping system - Google Patents

3D pseudo-color mapping system, mapping method thereof, and application of the 3D pseudo-color mapping system Download PDF

Info

Publication number
CN109829954A
CN109829954A (application number CN201711182607.6A / CN201711182607A)
Authority
CN
China
Prior art keywords
data
depth
value
mapping
module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201711182607.6A
Other languages
Chinese (zh)
Other versions
CN109829954B (en)
Inventor
田新蕾
王正
洪玉芳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Sunny Optical Intelligent Technology Co Ltd
Original Assignee
Zhejiang Sunny Optical Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Sunny Optical Intelligent Technology Co Ltd filed Critical Zhejiang Sunny Optical Intelligent Technology Co Ltd
Priority to CN201711182607.6A priority Critical patent/CN109829954B/en
Publication of CN109829954A publication Critical patent/CN109829954A/en
Application granted granted Critical
Publication of CN109829954B publication Critical patent/CN109829954B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Landscapes

  • Image Processing (AREA)
  • Processing Or Creating Images (AREA)
  • Image Generation (AREA)

Abstract

The invention discloses a 3D pseudo-color mapping system, a mapping method thereof, and an application of the 3D pseudo-color mapping system. The 3D pseudo-color mapping system includes an interactive unit, a preprocessing unit, and a processing unit. The interactive unit is communicatively coupled to an acquisition device so as to receive initial image data from the acquisition device. The preprocessing unit is communicatively coupled with the interactive unit and can receive and preprocess the initial image data to generate depth range data. The processing unit is communicatively coupled with the preprocessing unit and the interactive unit, respectively, and can process the initial image data based on the depth range data to generate pseudo-color image data. The interactive unit is also communicatively coupled to an output device, so that after the interactive unit receives the pseudo-color image data it can display the corresponding pseudo-color image through the output device, thereby distinguishing the different depths of a measured target by means of different colors.

Description

3D pseudo-color mapping system, mapping method thereof, and application of the 3D pseudo-color mapping system
Technical field
The present invention relates to the field of pseudo-color technology, and more specifically to a 3D pseudo-color mapping system, a mapping method thereof, and an application of the 3D pseudo-color mapping system, wherein the 3D pseudo-color mapping system can convert 3D depth data into an image that can be viewed directly.
Background technique
In recent years, with the rapid development of microelectronics, computer technology, and modern communication technology, human society is striding into the information age. Accordingly, depth cameras have also developed and spread rapidly, and more and more scenes and fields call for them, for example three-dimensional modeling, face recognition, gesture recognition, and target tracking.
In general, depending on the hardware implementation, the mainstream 3D machine vision techniques used by depth cameras in industry today fall into roughly three categories: structured light, time-of-flight (TOF cameras), and binocular stereo imaging. What distinguishes a depth camera from a traditional camera is that it can capture grayscale image information and three-dimensional information including depth at the same time. The design principle of such a camera is to emit a reference beam toward the measured target and, by calculating the time difference or phase difference of the returned light, convert it into the distance to the measured target, thereby generating depth information. However, due to the limitation of its own working principle, the depth camera cannot obtain the color information of the measured target; that is, although the depth camera can acquire the depth information of the measured target, the image it captures is a grayscale image containing depth information rather than a color image.
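For the time-difference case described above, the conversion from round-trip time to distance is d = c·Δt/2, since the light travels to the target and back. The following sketch is purely illustrative (the function name and sample value are assumptions, not from the patent):

```python
# Illustrative: distance from a round-trip time-of-flight measurement.
C = 299_792_458.0  # speed of light in m/s

def tof_distance(round_trip_time_s: float) -> float:
    """Distance to the target; the beam covers the path twice."""
    return C * round_trip_time_s / 2.0

# A 10 ns round trip corresponds to roughly 1.5 m.
print(round(tof_distance(10e-9), 3))  # 1.499
```

A TOF camera performs this conversion per pixel, which is why its raw output is a depth map rather than a color image.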
It should be understood that human vision has a notable characteristic: the human eye can only distinguish roughly twenty-odd gray levels (its ability to discriminate grayscale is poor, generally limited to a few dozen levels). In other words, even if a captured grayscale image has very high resolution, say 1000 gray levels, a person can unfortunately only make out twenty-odd of them, roughly a fifty-fold loss of information. The human eye is, however, quite strong at resolving color and can distinguish thousands of hues. Therefore, if the grayscale image containing depth information is mapped to a pseudo-color image through pseudo-color mapping (or the depth information of the measured target is mapped directly to a pseudo-color image), the human eye can extract far more information from it.
Normally, the existing pseudo-color mapping method presets a depth range manually, divides that range into several segments, matches each segment with an RGB color family, and then maps the acquired depth information of the measured target (the depth value of each pixel) to the corresponding RGB color family in a linear or nonlinear manner. The depth information collected by the depth camera is thus turned, through pseudo-color mapping, into a pseudo-color image that the human eye can view directly to obtain visual information, distinguishing the different depths of the measured target by color.
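A hypothetical sketch of this conventional fixed-range approach may help make the later criticism concrete. The preset range and the five-color palette below are illustrative assumptions; the patent does not specify them:

```python
# Conventional approach: a fixed, manually preset depth range is divided
# into segments, each matched to one RGB color family.
PALETTE = [(0, 0, 255), (0, 255, 255), (0, 255, 0), (255, 255, 0), (255, 0, 0)]

def fixed_range_pseudocolor(depth_mm, d_min=500.0, d_max=4000.0):
    """Map one depth value into the preset range; out-of-range depths clip."""
    t = (depth_mm - d_min) / (d_max - d_min)
    t = min(max(t, 0.0), 1.0)  # depths outside the preset range saturate
    idx = min(int(t * len(PALETTE)), len(PALETTE) - 1)
    return PALETTE[idx]

print(fixed_range_pseudocolor(500))   # (0, 0, 255): nearest segment
print(fixed_range_pseudocolor(4000))  # (255, 0, 0): farthest segment
```

Note how any depth outside [d_min, d_max] clips to the end colors, and a target whose depths cluster inside one segment collapses to a single color family; these are exactly the failure modes the next paragraph describes.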
However, because the existing pseudo-color mapping method presets the depth range manually and empirically, it accounts neither for the differences between measured targets nor for the viewing distance of the depth camera. When the viewing distance is inappropriate or the depth levels of the measured target are not pronounced, all or part of the depth values of the measured target will fall outside the preset depth range, and normal pseudo-color mapping cannot be carried out. Alternatively, the depth information of the measured target may be overly concentrated, that is, most of its depth values fall within a small section of the depth range, so that most pixels of the measured target are mapped to the same color family. The resulting pseudo-color image is then not rich enough in color, the color contrast is small, and the human eye finds it difficult to obtain accurate visual information intuitively and easily. In particular, once the depth values of the measured target are overly concentrated and mapped to the same color family (or even to the same or a similar color), the boundary of the measured target cannot be clearly distinguished, which directly degrades the visual effect of direct viewing and in turn hinders the popularization and wide application of the depth camera.
Summary of the invention
An object of the present invention is to provide a 3D pseudo-color mapping system, a mapping method thereof, and an application of the 3D pseudo-color mapping system, wherein the 3D pseudo-color mapping system can convert initial image data of a measured target into pseudo-color image data, so as to distinguish the different depths of the measured target by different colors.
Another object of the present invention is to provide a 3D pseudo-color mapping system, a mapping method thereof, and an application of the 3D pseudo-color mapping system, wherein the 3D pseudo-color mapping system can display the pseudo-color image data as a pseudo-color image through an output device, so that the human eye can view it directly and distinguish the boundary of the measured target.
Another object of the present invention is to provide a 3D pseudo-color mapping system, a mapping method thereof, and an application of the 3D pseudo-color mapping system, wherein the 3D pseudo-color mapping system can automatically calculate a depth range of the measured target based on the initial image data and then map corresponding colors according to that depth range, so as to enrich the colors in the resulting pseudo-color image while also effectively avoiding the problem of small color contrast caused by overly concentrated depth information.
Another object of the present invention is to provide a 3D pseudo-color mapping system, a mapping method thereof, and an application of the 3D pseudo-color mapping system, wherein the 3D pseudo-color mapping system can map pixels with different depth values to colors of different color families, so as to highlight the different depths of the measured target and thus clearly distinguish its boundary.
Another object of the present invention is to provide a 3D pseudo-color mapping system, a mapping method thereof, and an application of the 3D pseudo-color mapping system, wherein the 3D pseudo-color mapping system can update the set depth range in real time to meet the needs of different shooting distances or different measured targets, thereby broadening the range of use of the system. Compared with the traditional approach of setting the depth range manually, updating the depth range in real time not only improves the quality of the pseudo-color image but also significantly expands the application range of the 3D pseudo-color mapping system.
Another object of the present invention is to provide a 3D pseudo-color mapping system, a mapping method thereof, and an application of the 3D pseudo-color mapping system, wherein the 3D pseudo-color mapping system can automatically calculate and adjust the set depth range based on changes of the field of view or the distance, so as to obtain a high-quality pseudo-color image.
Another object of the present invention is to provide a 3D pseudo-color mapping system, a mapping method thereof, and an application of the 3D pseudo-color mapping system, wherein the 3D pseudo-color mapping system can synchronously map different brightness levels based on the depth information of the measured target, which helps enhance the recognizability of the resulting pseudo-color image. Specifically, the closer a point on the measured target is to an observer (the smaller its depth value), the brighter the corresponding pixel is displayed in the pseudo-color image.
Another object of the present invention is to provide a 3D pseudo-color mapping system, a mapping method thereof, and an application of the 3D pseudo-color mapping system, wherein the 3D pseudo-color mapping system can synchronously map the depth information of the measured target to both a corresponding color and a corresponding brightness, increasing the distinguishing characteristics of the resulting pseudo-color image and helping to distinguish the different depths of the measured target.
Another object of the present invention is to provide a 3D pseudo-color mapping system, a mapping method thereof, and an application of the 3D pseudo-color mapping system, wherein the 3D pseudo-color mapping system can map the depth information of the measured target to an HSV color model, so as to obtain a pseudo-color image carrying the depth information, from which the boundary of the measured target can be viewed directly.
According to one aspect of the present invention, the present invention provides a 3D pseudo-color mapping system, comprising:
an analysis unit, wherein the analysis unit obtains depth range data by analyzing initial image data of a measured target; and
a processing unit, wherein the processing unit is communicatively connected to the analysis unit, and wherein the processing unit processes the initial image data based on the depth range data obtained by the analysis unit, so as to obtain pseudo-color image data of the measured target, whereby the initial image data of the measured target is mapped to the pseudo-color image data.
According to one embodiment of the present invention, the 3D pseudo-color mapping system further comprises an interactive unit, wherein the interactive unit includes an output module communicatively connected to the processing unit, and wherein the output module is communicatively connected to an output device so as to display the pseudo-color image data obtained by the processing unit, thereby forming a pseudo-color image of the measured target on the output device.
According to one embodiment of the present invention, the 3D pseudo-color mapping system further comprises an interactive unit, wherein the interactive unit includes a receiving module, the analysis unit is communicatively connected to the receiving module, the receiving module is communicatively connected to an acquisition device so as to obtain the initial image data of the measured target, and the analysis unit obtains the initial image data of the measured target from the receiving module.
According to one embodiment of the present invention, the interactive unit includes a receiving module, and the analysis unit can be communicatively connected to the receiving module, wherein the receiving module is communicatively connected to an acquisition device so as to obtain the initial image data of the measured target, and the analysis unit obtains the initial image data of the measured target from the receiving module.
According to one embodiment of the present invention, the analysis unit includes a statistics module that can be communicatively connected to the receiving module and the processing unit, wherein the statistics module receives the initial image data from the receiving module and automatically calculates a depth range of the pixels by counting the depth values of all pixels in the initial image data, so as to obtain the depth range data.
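The statistics step described in this embodiment can be sketched minimally as follows. This is an assumed implementation: treating zero as an invalid depth is a common convention in depth imaging, but the patent does not prescribe it:

```python
# Assumed sketch of the statistics module: scan all pixel depth values
# and derive the depth range automatically from their extremes.
def depth_range(depth_values):
    """Return (min, max) over all valid pixel depths; 0 marks invalid pixels."""
    valid = [d for d in depth_values if d > 0]
    return min(valid), max(valid)

lo, hi = depth_range([0, 812, 955, 1430, 0, 2010])
print(lo, hi)  # 812 2010
```

Because the range is computed from the data actually captured, it adapts per frame to the measured target and shooting distance, which is the core advantage the invention claims over a manually preset range.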
According to one embodiment of the present invention, the processing unit further comprises a mapping module and a configuration module that are communicatively coupled with each other, wherein the mapping module is communicatively connected to the receiving module and the analysis unit, receives the initial image data from the receiving module and the depth range data from the analysis unit, and generates depth mapping data by mapping the initial image data to a preset mapping range based on the depth range data, and wherein the configuration module receives the depth mapping data and configures it so as to generate the pseudo-color image data.
According to one embodiment of the present invention, the analysis unit includes a statistics module and a judgment module that are communicatively coupled with each other, the statistics module being communicatively connected to the receiving module and the judgment module being communicatively connected to the processing unit, wherein the statistics module receives the initial image data from the receiving module and automatically calculates a depth range of the pixels by counting the depth values of all pixels in the initial image data, so as to obtain initial depth range data, and wherein the judgment module obtains the depth range data from the initial depth range data.
According to one embodiment of the present invention, the statistics module periodically receives the initial image data from the receiving module.
According to one embodiment of the present invention, the processing unit further comprises a mapping module, together with a configuration module and a correction module that are each communicatively connected to the mapping module, the correction module being communicatively connected to the receiving module and the analysis unit, wherein the correction module corrects the initial image data received from the receiving module based on the depth range data received from the analysis unit so as to obtain corrected initial image data, wherein the mapping module obtains depth mapping data by mapping the corrected initial image data received from the correction module to a predetermined mapping range based on the depth range data received from the analysis unit, and wherein the configuration module receives the depth mapping data and configures it so as to generate the pseudo-color image data.
According to one embodiment of the present invention, the mapping module synchronously receives the initial image data from the receiving module and the depth range data from the analysis unit.
According to one embodiment of the present invention, the mapping module further comprises an integration module, together with a hue mapping module and a brightness mapping module that are each communicatively connected to the integration module, the hue mapping module being communicatively connected to the receiving module and the analysis unit, and the brightness mapping module being communicatively connected to the receiving module and the analysis unit, wherein the hue mapping module obtains hue mapping data by mapping the initial image data to a preset hue mapping range based on the depth range data, the brightness mapping module obtains brightness mapping data by mapping the initial image data to a predetermined brightness mapping range based on the depth range data, and the integration module integrates the hue mapping data and the brightness mapping data so as to obtain the depth mapping data.
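A hedged sketch of the combined hue-and-brightness mapping into the HSV color model that this embodiment describes. The particular hue sweep and brightness falloff below are illustrative assumptions; the description only requires that nearer pixels (smaller depth values) display brighter:

```python
import colorsys

def hsv_pseudocolor(depth, d_min, d_max):
    """Map one depth value to an (r, g, b) triple via the HSV model."""
    t = (depth - d_min) / (d_max - d_min)  # 0 = nearest, 1 = farthest
    hue = t * 240.0 / 360.0                # assumed red-to-blue hue sweep
    value = 1.0 - 0.7 * t                  # nearer pixels render brighter
    return colorsys.hsv_to_rgb(hue, 1.0, value)

# Nearest depth: full brightness, hue 0 (pure red).
print(hsv_pseudocolor(500, 500, 4000))  # (1.0, 0.0, 0.0)
```

Varying hue and value together gives each depth both a distinct color family and a distinct brightness, which is what lets the boundary of the measured target stand out.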
According to one embodiment of the present invention, the mapping module further comprises a hue mapping module communicatively connected to the receiving module and the analysis unit, wherein the hue mapping module obtains hue mapping data by mapping the initial image data to a preset hue mapping range based on the depth range data, the hue mapping data forming the depth mapping data.
According to one embodiment of the present invention, the mapping relation of the mapping module is a linear mapping relation.
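The linear mapping relation referred to here is an affine rescaling of the measured depth range onto the preset mapping range. A one-line illustrative sketch (the bounds are assumed, not specified by the patent):

```python
# Affine rescale of [d_min, d_max] onto the preset mapping range [m_lo, m_hi].
def linear_map(depth, d_min, d_max, m_lo=0.0, m_hi=255.0):
    return m_lo + (depth - d_min) * (m_hi - m_lo) / (d_max - d_min)

print(linear_map(1500, 1000, 2000))  # 127.5: midpoint maps to midpoint
```

Because d_min and d_max come from the automatically computed depth range rather than a fixed preset, the full mapping range is always used, regardless of how the target's depths are distributed.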
According to another aspect of the present invention, the present invention further provides an electronic device, comprising:
an acquisition device;
an output device; and
a 3D pseudo-color mapping system, wherein the 3D pseudo-color mapping system further comprises:
an interactive unit, wherein the interactive unit includes a receiving module and an output module, the receiving module being communicatively coupled with the acquisition device and the output module being communicatively connected to the output device;
an analysis unit, wherein the analysis unit is communicatively connected to the interactive unit, and wherein the analysis unit obtains depth range data after analyzing the initial image data of a measured target acquired by the acquisition device and received by the receiving module; and
a processing unit, wherein the processing unit is communicatively connected to the analysis unit, and wherein the processing unit processes the initial image data based on the depth range data obtained by the analysis unit so as to obtain pseudo-color image data of the measured target, and wherein the output module can display the pseudo-color image data obtained by the processing unit on the output device, so as to form a pseudo-color image on the output device.
According to another aspect of the present invention, the present invention further provides a 3D pseudo-color mapping method, wherein the 3D pseudo-color mapping method comprises the following steps:
(a) obtaining, by an analysis unit, depth range data by analyzing initial image data of a measured target; and
(b) processing, by a processing unit, the initial image data based on the depth range data obtained by the analysis unit, so as to obtain pseudo-color image data of the measured target, whereby the initial image data of the measured target is mapped to the pseudo-color image data.
According to one embodiment of the present invention, the step (a) further comprises the steps of:
counting the depth values of all pixels in the initial image data, so as to obtain the maximum and minimum of the pixel depth values; and
obtaining the depth range data according to the maximum and minimum of the depth values.
According to one embodiment of the present invention, the step (b) further comprises the steps of:
mapping the depth values of the pixels of the initial image data respectively to depth mapping values of the corresponding pixels according to the depth range data, so as to obtain depth mapping data; and
configuring a corresponding HSV value for each pixel according to the depth mapping data, so as to obtain the pseudo-color image data.
According to one embodiment of the present invention, the step of mapping the depth values of the pixels of the initial image data respectively to the depth mapping values of the corresponding pixels according to the depth range data, so as to obtain the depth mapping data, further comprises the steps of:
mapping, by a hue mapping module and based on the depth range data, the depth values of the pixels of the initial image data respectively to hue mapping values of the corresponding pixels, so as to obtain hue mapping data;
mapping, by a brightness mapping module and based on the depth range data, the depth values of the pixels of the initial image data respectively to brightness mapping values of the corresponding pixels, so as to obtain brightness mapping data; and
integrating, by an integration module, the hue mapping data and the brightness mapping data, so as to obtain the depth mapping data.
According to one embodiment of the present invention, the step of mapping the depth values of the pixels of the initial image data respectively to the depth mapping values of the corresponding pixels according to the depth range data, so as to obtain the depth mapping data, further comprises the step of: mapping, by a hue mapping module and based on the depth range data, the depth values of the pixels of the initial image data respectively to hue mapping values of the corresponding pixels, so as to obtain hue mapping data, wherein the hue mapping data forms the depth mapping data.
According to one embodiment of the present invention, the step (a) further comprises the steps of:
obtaining initial depth range data by counting the depth values of all pixels in the initial image data so as to obtain the maximum and minimum of the pixel depth values; and
judging the initial depth range data by a judgment module, so as to obtain the depth range data.
According to one embodiment of the present invention, the step (b) further comprises the steps of:
correcting the initial image data according to the depth range data, so as to obtain corrected initial image data;
mapping the depth values of the pixels of the corrected initial image data respectively to depth mapping values of the corresponding pixels according to the depth range data, so as to obtain depth mapping data; and
configuring a corresponding HSV value for each pixel according to the depth mapping data, so as to obtain the pseudo-color image data.
According to one embodiment of the present invention, the step of mapping the depth values of the pixels of the corrected initial image data respectively to the depth mapping values of the corresponding pixels according to the depth range data, so as to obtain the depth mapping data, further comprises the steps of:
mapping, by a hue mapping module and based on the depth range data, the depth values of the pixels of the corrected initial image data respectively to hue mapping values of the corresponding pixels, so as to obtain hue mapping data;
mapping, by a brightness mapping module and based on the depth range data, the depth values of the pixels of the corrected initial image data respectively to brightness mapping values of the corresponding pixels, so as to obtain brightness mapping data; and
integrating, by an integration module, the hue mapping data and the brightness mapping data, so as to obtain the depth mapping data.
According to one embodiment of the present invention, the step of mapping the depth values of the pixels of the corrected initial image data respectively to the depth mapping values of the corresponding pixels according to the depth range data, so as to obtain the depth mapping data, further comprises the step of: mapping, by a hue mapping module and based on the depth range data, the depth values of the pixels of the corrected initial image data respectively to hue mapping values of the corresponding pixels, so as to obtain hue mapping data, wherein the hue mapping data forms the depth mapping data.
According to another aspect of the present invention, the present invention further provides a method of displaying a pseudo-color image by an electronic device, wherein the display method comprises the following steps:
(A) acquiring initial image data of a measured target by an acquisition device of the electronic device;
(B) receiving, by a receiving module, the initial image data from the acquisition device;
(C) obtaining depth range data by analyzing the initial image data with an analysis unit;
(D) processing, by a processing unit, the initial image data based on the depth range data obtained by the analysis unit, so as to obtain pseudo-color image data of the measured target; and
(E) displaying, by an output module communicatively connected to an output device of the electronic device, the pseudo-color image data, so as to form a pseudo-color image of the measured target on the output device.
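Steps (A) through (E) can be condensed into one illustrative routine. This is an assumed sketch standing in for the analysis and processing units; the "pseudo-color" output is reduced here to a per-pixel hue index rather than a full HSV configuration:

```python
# Assumed end-to-end sketch of steps (A)-(E): take an acquired depth
# image, derive the depth range automatically, and map each valid pixel
# onto the full 0-255 mapping range (0 reserved for invalid pixels).
def render_pseudocolor(depth_image):
    valid = [d for row in depth_image for d in row if d > 0]
    d_min, d_max = min(valid), max(valid)          # analysis unit: depth range
    span = (d_max - d_min) or 1                    # guard against a flat scene
    return [[(d - d_min) * 255 // span if d > 0 else 0 for d in row]
            for row in depth_image]                # processing unit: mapping

img = render_pseudocolor([[1000, 1500], [2000, 0]])
print(img)  # [[0, 127], [255, 0]]
```

In the actual system each index would then be configured into an HSV value and handed to the output module for display.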
Further objects and advantages of the present invention will become fully apparent from an understanding of the ensuing description and the accompanying drawings. These and other objects, features, and advantages of the present invention are fully embodied by the following detailed description, the drawings, and the claims.
Brief description of the drawings
Fig. 1 is a schematic diagram of the system architecture of a 3D pseudo-color mapping system according to a first preferred embodiment of the present invention.
Fig. 2 is a schematic structural diagram of a mapping module of the 3D pseudo-color mapping system according to the above first preferred embodiment of the present invention.
Fig. 3 is a block diagram showing the 3D pseudo-color mapping system according to the above first preferred embodiment of the present invention generating depth range data.
Fig. 4 is a block diagram showing the 3D pseudo-color mapping system according to the above first preferred embodiment of the present invention generating pseudo-color image data.
Fig. 5 is a block diagram showing the 3D pseudo-color mapping system according to the above first preferred embodiment of the present invention generating depth mapping data.
Fig. 6 is a flow diagram of a 3D pseudo-color mapping method according to the above first preferred embodiment of the present invention.
Fig. 7 is a schematic diagram of the system architecture of a 3D pseudo-color mapping system according to a second preferred embodiment of the present invention.
Fig. 8 is a schematic structural diagram of a mapping module of the 3D pseudo-color mapping system according to the above second preferred embodiment of the present invention.
Fig. 9 is a block diagram showing the 3D pseudo-color mapping system according to the above second preferred embodiment of the present invention generating depth range data.
Fig. 10 is a block diagram showing the 3D pseudo-color mapping system according to the above second preferred embodiment of the present invention generating pseudo-color image data.
Fig. 11 is a block diagram showing the 3D pseudo-color mapping system according to the above second preferred embodiment of the present invention generating depth mapping data.
Fig. 12 is a flow diagram of a 3D pseudo-color mapping method according to the above second preferred embodiment of the present invention.
Specific embodiments
The following description serves to disclose the present invention so as to enable those skilled in the art to realize it. The preferred embodiments in the following description are given only by way of example, and other obvious variations will occur to those skilled in the art. The basic principles of the invention as defined in the following description may be applied to other embodiments, variants, improvements, equivalents, and other technical solutions that do not depart from the spirit and scope of the present invention.
Those skilled in the art will understand that, in the disclosure of the present invention, the orientations or positional relationships indicated by terms such as "longitudinal", "transverse", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", and "outer" are based on the orientations or positional relationships shown in the drawings; they serve merely to facilitate and simplify the description of the invention, and do not indicate or imply that the device or element referred to must have a particular orientation or be constructed and operated in a particular orientation. The above terms are therefore not to be understood as limiting the invention.
In the present invention, the term "a" or "an" in the claims and the specification should be understood as "one or more"; that is, in one embodiment the quantity of an element may be one, while in another embodiment the quantity of that element may be plural. Unless the disclosure of the invention clearly states that the quantity of the element is only one, the term "a" or "an" cannot be interpreted as unique or singular, and should not be understood as a limitation on quantity.
Referring to Figs. 1 to 6 of the accompanying drawings, a 3D pseudo-color mapping system and a mapping method according to a first preferred embodiment of the present invention are shown. According to the first preferred embodiment, the 3D pseudo-color mapping system 20 comprises an interactive unit 21, an analysis unit 22, and a processing unit 23 that are communicatively coupled with one another, wherein the interactive unit 21 can receive initial image data D1 from an acquisition device 10 and transmit the initial image data D1 to the analysis unit 22 and the processing unit 23 respectively, wherein the analysis unit 22 can receive the initial image data D1 and preprocess it to calculate a depth range, thereby generating depth range data D2, wherein the processing unit 23 can receive the initial image data D1 and the depth range data D2 and process the initial image data D1 based on the depth range data D2 so as to generate pseudo-color image data D4, which it can then transmit to the interactive unit 21, and wherein the interactive unit 21 receives the pseudo-color image data D4 and can display a pseudo-color image based on it through an output device 30, so that the human eye can view the pseudo-color image directly and distinguish the different depths of a measured target W by the different colors of the image.
In other words, the present invention also provides an electronic device, wherein the electronic device comprises the acquisition device 10, the output device 30 and the 3D pseudo-color mapping system 20, wherein the 3D pseudo-color mapping system 20 comprises the interactive unit 21, the analytical unit 22 and the processing unit 23, wherein the analytical unit 22 and the processing unit 23 are each communicatively connected with the interactive unit 21, the processing unit 23 is communicatively connected with the analytical unit 22, and the interactive unit 21 is communicatively connected with the acquisition device 10 and the output device 30.
It is worth noting that the acquisition device 10 may be, but is not limited to being, implemented as a depth camera for acquiring the depth information of a measured target W (or measured object) so as to generate the initial image data D1, wherein the initial image data D1 comprises the depth value of each pixel in an initial image (a grayscale image) of the measured target W. It is worth noting that the pixels in the initial image may be, but are not limited to being, implemented as valid pixels, so as to obtain more accurate initial image data D1. It will be appreciated by those skilled in the art that the depth camera may be a structured-light depth camera, a binocular stereo imaging depth camera, or a TOF (time-of-flight) camera. By way of example, the acquisition device 10 is implemented as a TOF camera, wherein the TOF camera comprises a light source emitting module and a photosensitive receiving module, the light source emitting module cooperates with the photosensitive receiving module, and the depth information of the measured target W is generated based on TOF depth measurement. More specifically, the light source emitting module emits a light wave of a specific band; the emitted light wave is reflected at the surface of the measured target and received by the photosensitive receiving module; the photosensitive receiving module then calculates the depth information of the measured target W according to the time difference or phase difference between the emitted light wave and the received light wave, so as to obtain the initial image data of the measured target W.
In addition, the output device 30 may be, but is not limited to being, implemented as a display screen for displaying the pseudo-color image of the measured target W based on the pseudo-color image data D4 generated by the 3D pseudo-color mapping system 20. It should be pointed out that the output device 30 may also be a device having a display function, such as a mobile phone, a computer, a tablet or a touch screen.
According to the first preferred embodiment of the present invention, as shown in Fig. 1 and Fig. 3, the interactive unit 21 of the 3D pseudo-color mapping system 20 comprises a receiving module 211 and an output module 212 that are communicatively connected with each other, wherein the receiving module 211 of the interactive unit 21 is communicatively connected with the acquisition device 10 to receive the initial image data D1 generated by the acquisition device 10, and the output module 212 of the interactive unit 21 is communicatively connected with the output device 30 to transmit the pseudo-color image data D4 generated by the processing unit 23 to the output device 30, so as to display the pseudo-color image of the measured target W through the output device 30. It is worth noting that the manner of the communication connection is illustratively, but not limited to being, implemented as a wired or wireless connection, such as an electrical connection, a signal connection or a data connection.
It should be noted that the 3D pseudo-color mapping system 20 may be, but is not limited to being, implemented as a stand-alone system, wherein the receiving module 211 and the output module 212 of the interactive unit 21 of the 3D pseudo-color mapping system 20 may each be implemented as a USB data interface, so that the receiving module 211 is communicatively connected with the acquisition device 10 through a data line, and the output module 212 is communicatively connected with the output device 30 through a data line. Of course, it will be appreciated by those skilled in the art that, even in the specific example in which the 3D pseudo-color mapping system 20 is implemented as a stand-alone system, the scheme in which the receiving module 211 and the output module 212 are each implemented as a USB data interface is merely an example and shall not be construed as a limitation on the content and scope of the 3D pseudo-color mapping system 20 of the present invention. For example, in other possible examples, at least one of the receiving module 211 and the output module 212 may be implemented as a wireless module. It should be pointed out that the 3D pseudo-color mapping system 20 may also be implemented as an embedded system, wherein the 3D pseudo-color mapping system 20 is embedded in the acquisition device 10 or in the output device 30, so that the acquisition device 10 or the output device 30 has the function of automatically performing pseudo-color processing.
In the first preferred embodiment of the present invention, the receiving module 211 of the interactive unit 21 is communicatively connected with the analytical unit 22 and the processing unit 23 respectively, and transmits the initial image data D1 to the analytical unit 22 and the processing unit 23 respectively, wherein the analytical unit 22 receives and pre-processes the initial image data D1 to generate the depth range data D2, and the processing unit 23 receives and processes the initial image data D1 and the depth range data D2 to generate the pseudo-color image data D4.
More specifically, as shown in Fig. 1 and Fig. 3, the analytical unit 22 comprises a statistical module 221, wherein the statistical module 221 is communicatively connected with the receiving module 211 to receive and pre-process the initial image data D1 from the receiving module 211, thereby generating the depth range data D2. In other words, the statistical module 221 counts, based on the depth values of all pixels in the initial image data D1, a maximum value and a minimum value of the depth values in the initial image data D1, so as to automatically calculate the depth range of the pixels in the initial image data D1 (that is, the minimum value of the depth values is the nearest value of the depth range, and the maximum value of the depth values is the farthest value of the depth range), thereby generating the depth range data D2. Notably, since the initial image data D1 comprises the depth information of the measured target W, the depth range data D2 generated by the analytical unit 22 comprises the depth range of the measured target W. That is to say, the 3D pseudo-color mapping system can automatically calculate the depth range of a measured target W based on an initial image data, and then map the corresponding colors according to the depth range, so as to enrich the colors in the pseudo-color image obtained, while also effectively avoiding the problem of small color differences caused by overly concentrated depth information. In addition, when the angle coverage or viewing distance of the acquisition device 10 changes, the 3D pseudo-color mapping system 20 can still automatically calculate and adjust the depth range data, so that the colors of the pseudo-color image obtained always remain rich rather than concentrated, thereby producing a high-quality pseudo-color image suitable for direct viewing by the human eye.
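As a minimal sketch (not the patent's implementation), the statistical pre-processing described above can be expressed with NumPy, under the assumption that invalid pixels are marked with a depth value of 0 and are excluded from the statistics:

```python
import numpy as np

def depth_range(depth_image):
    """Compute the depth range data: the minimum depth value is the
    nearest value, the maximum depth value is the farthest value.
    Zero-valued pixels (assumed invalid) are excluded."""
    valid = depth_image[depth_image > 0]
    return int(valid.min()), int(valid.max())

# A toy 2x3 depth image (values in millimeters, an assumed unit).
depth = np.array([[0, 820, 905],
                  [640, 0, 1310]])
print(depth_range(depth))  # (640, 1310)
```

Running the statistics once per frame keeps the range tight around the measured target, which is what prevents the colors from clustering when the depth values are concentrated.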
As shown in Fig. 1 and Fig. 4, the processing unit 23 comprises a mapping module 231 and a configuration module 232, wherein the mapping module 231 of the processing unit 23 is communicatively connected with the receiving module 211 of the interactive unit 21 and the statistical module 221 of the analytical unit 22 respectively, and the mapping module 231 synchronously receives the initial image data D1 from the receiving module 211 and the depth range data D2 from the statistical module 221, wherein the mapping module 231 maps, based on the depth range data D2, the depth value of each pixel in the initial image data D1 into a predetermined mapping range, so as to calculate a depth mapping value of each pixel and thereby generate a depth mapping data D3, wherein the depth mapping data D3 comprises the depth mapping value of each pixel in the initial image data D1.
The configuration module 232 of the processing unit 23 is communicatively connected with the mapping module 231, and the configuration module 232 receives the depth mapping data D3 from the mapping module 231 and, based on the depth mapping data D3, configures a corresponding HSV value for each pixel, so as to generate the pseudo-color image data D4. It is worth noting that the configuration module 232 performs the configuration based on an HSV color model, wherein in the HSV value the hue value H ranges from 0 to 360 degrees, the saturation value S ranges from 0 to 1 (in the present invention, the value range of the saturation value S is proportionally expanded to 0 to 255, where S=255 represents the saturation at S=1), and the brightness value V ranges from 0 to 255 (where a brightness value V of 0 represents black and a brightness value V of 255 represents white). Different HSV values represent different colors; therefore, the corresponding HSV value is obtained from the depth value of each pixel, so that pixels with different depth values are configured with different HSV values, that is, different colors, whereby the different depths of the measured target W can be distinguished by the colors of the pixels included in the pseudo-color image data D4, and the boundary of the measured target W can be viewed directly.
In the first preferred embodiment of the present invention, as shown in Fig. 2 and Fig. 5, the mapping module 231 of the processing unit 23 comprises a tone mapping module 2311, a brightness mapping module 2312 and an integrating module 2313, wherein the tone mapping module 2311 maps, based on the depth range data D2, the depth value of each pixel in the initial image data D1 into a predetermined tone mapping range, so as to calculate a tone mapping value of each pixel in the initial image data D1 and thereby generate a tone mapping data D31, and the brightness mapping module 2312 maps, based on the depth range data D2, the depth value of each pixel in the initial image data D1 into a predetermined brightness mapping range, so as to calculate a brightness mapping value of each pixel in the initial image data D1 and thereby generate a brightness mapping data D32. The integrating module 2313 is communicatively connected with the tone mapping module 2311 and the brightness mapping module 2312 respectively, and the integrating module 2313 receives and integrates the tone mapping data D31 and the brightness mapping data D32 to generate the depth mapping data D3, wherein the depth mapping data D3 comprises the tone mapping data and the brightness mapping data of each pixel; that is to say, the depth mapping value of the depth mapping data D3 comprises the tone mapping value of the tone mapping data and the brightness mapping value of the brightness mapping data, so that the 3D pseudo-color mapping system can synchronously map the depth information of a measured target to corresponding colors and brightness, thereby increasing the distinguishability of the pseudo-color image obtained and helping to distinguish the different depths of the measured target.
Preferably, the predetermined tone mapping range is implemented as 0 to 180 degrees. Since, in the HSV color model, the hue values H of complementary colors differ by 180 degrees, when the predetermined tone mapping range is selected as 0 to 180 degrees, the tone mapping values of the pixels in the tone mapping data D31 take values between complementary colors, so that the colors of the pseudo-color image obtained are relatively rich, and it is ensured that pixels of different depths have different colors, without the concern that pixels of considerably different depths might have similar colors, so that the depth information of the pseudo-color image can be distinguished by color. That is to say, the 3D pseudo-color mapping system can map pixels with different depth values to colors of different color families, so as to highlight the different depths of the measured target and clearly distinguish the boundary of the measured target. Those skilled in the art will appreciate that the predetermined tone mapping range may also be implemented as any value range spanning 180 degrees, such as 60 to 240 degrees, 120 to 300 degrees, or 180 to 360 degrees. Of course, the predetermined tone mapping range may also be implemented as a value range not spanning 180 degrees.
It is worth noting that the predetermined brightness mapping range may be, but is not limited to being, implemented as 0 to 255, so that a smaller depth value is mapped to a larger brightness mapping value and, correspondingly, a larger depth value is mapped to a smaller brightness mapping value; in this way, the 3D pseudo-color mapping system synchronously maps out different brightness based on the depth information of the measured target W, which helps to enhance the recognizability of the pseudo-color image obtained. That is, a pixel with a smaller depth value has a brighter brightness and a pixel with a larger depth value has a darker brightness; in other words, a more forward region of the measured target W has a higher brightness in the pseudo-color image, and a more rearward region of the measured target W has a lower brightness in the pseudo-color image, which conforms to the viewing habits of the human eye and helps the human eye distinguish the depths or boundary of the measured target W by directly observing the pseudo-color image. It will be appreciated by those skilled in the art that the predetermined brightness mapping range may also be implemented as any range between 0 and 255, achieving the same brightness recognition effect. In addition, by changing the mapping algorithm, a smaller depth value may instead be mapped to a smaller brightness mapping value, so that a more forward region of the measured target W has a lower brightness in the pseudo-color image.
The configuration module 232 of the processing unit 23 receives the depth mapping data D3 from the mapping module 231, assigns the tone value H of each pixel to the tone mapping value of the corresponding pixel in the depth mapping data D3, assigns the brightness value V of each pixel to the brightness mapping value of the corresponding pixel in the depth mapping data D3, and assigns the saturation value S of each pixel to 255, so as to configure the HSV value of the pixel and thereby generate the pseudo-color image data D4.
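As an illustrative sketch of the configuration step described above (H from the tone mapping value, V from the brightness mapping value, S fixed at 255), Python's standard colorsys module can convert the configured HSV value to a displayable RGB color; colorsys expects H, S and V in the 0 to 1 range, so the patent's 0-360 degree hue and 0-255 scales are normalized first. The function name is hypothetical:

```python
import colorsys

def configure_pixel(tone_deg, brightness):
    """Build an HSV triple (H in degrees 0-360, S and V in 0-255)
    with saturation fixed at 255, and convert it to 8-bit RGB."""
    h, s, v = tone_deg, 255, brightness
    r, g, b = colorsys.hsv_to_rgb(h / 360.0, s / 255.0, v / 255.0)
    return tuple(round(c * 255) for c in (r, g, b))

# A near pixel (tone 0, full brightness) becomes pure red.
print(configure_pixel(0, 255))    # (255, 0, 0)
# A far pixel (tone 180, half brightness) becomes a dim cyan.
print(configure_pixel(180, 128))  # (0, 128, 128)
```

Because saturation is held at its maximum, every pixel receives a fully saturated color, and only hue and brightness carry the depth information.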
It should be noted that, in some embodiments of the present invention, the mapping module 231 of the processing unit 23 comprises only the tone mapping module 2311, so that the depth value of each pixel in the initial image data D1 is mapped into the predetermined tone mapping range by the tone mapping module 2311 to calculate the tone mapping value of each pixel in the initial image data D1 and thereby generate the tone mapping data D31, and the mapping module 231 generates the depth mapping data based on the tone mapping data, so that the depth mapping data D3 generated by the processing unit 23 comprises only the tone mapping value of each pixel; that is to say, the depth mapping data D3 generated by the processing unit 23 is equal to the tone mapping data D31. The configuration module 232 of the processing unit 23 receives the depth mapping data D3 from the mapping module 231, assigns the tone value H of each pixel to the tone mapping value of the corresponding pixel in the depth mapping data D3, assigns the brightness value V of each pixel to 255, and assigns the saturation value S of each pixel to 255, so as to configure the HSV value of the pixel and thereby generate the pseudo-color image data D4.
According to the first preferred embodiment of the present invention, preferably, the mapping module 231 of the processing unit 23 maps the depth values of the pixels in the initial image data D1 to the depth mapping values of the pixels in a linear mapping manner. It is worth noting that the linear mapping manner may be a forward linear mapping manner, or a reverse linear mapping manner.
More preferably, the tone mapping module 2311 of the mapping module 231 maps the depth values of the pixels in the initial image data D1 to the tone mapping values of the pixels in a forward linear mapping manner, and the brightness mapping module 2312 of the mapping module 231 maps the depth values of the pixels in the initial image data D1 to the brightness mapping values of the pixels in a reverse linear mapping manner. By way of example, the mapping relationship of the forward linear mapping manner is: tone mapping value = ((depth value − nearest value) / depth range) × predetermined tone mapping range + predetermined tone mapping minimum value = ((depth value − nearest value) / (farthest value − nearest value)) × 180. The mapping relationship of the reverse linear mapping manner is: brightness mapping value = (1 − ((depth value − nearest value) / depth range)) × predetermined brightness mapping range + predetermined brightness mapping minimum value = (1 − ((depth value − nearest value) / (farthest value − nearest value))) × 255. It is worth noting that the predetermined tone mapping minimum value is the minimum value of the predetermined tone mapping range, and the predetermined brightness mapping minimum value is the minimum value of the predetermined brightness mapping range; in the first preferred embodiment of the present invention, the predetermined tone mapping minimum value and the predetermined brightness mapping minimum value are both zero.
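The two mapping relationships above can be transcribed directly into code. The sketch below assumes depth values in millimeters (a unit the patent does not specify) and the preferred ranges of 0-180 degrees for tone and 0-255 for brightness:

```python
def tone_mapping(depth, nearest, farthest, tone_range=180, tone_min=0):
    """Forward linear mapping: the nearest depth maps to tone_min,
    the farthest depth maps to tone_min + tone_range."""
    return (depth - nearest) / (farthest - nearest) * tone_range + tone_min

def brightness_mapping(depth, nearest, farthest, v_range=255, v_min=0):
    """Reverse linear mapping: the nearest depth is brightest,
    the farthest depth is darkest."""
    return (1 - (depth - nearest) / (farthest - nearest)) * v_range + v_min

# With a depth range of 500 mm (nearest) to 1500 mm (farthest):
print(tone_mapping(500, 500, 1500))        # 0.0   nearest pixel, hue 0
print(tone_mapping(1500, 500, 1500))       # 180.0 farthest pixel, hue 180
print(brightness_mapping(500, 500, 1500))  # 255.0 nearest pixel, brightest
print(brightness_mapping(1000, 500, 1500)) # 127.5 mid-depth pixel
```

Note how the two mappings run in opposite directions over the same normalized depth, which is what makes hue and brightness vary together yet distinguishably across the scene.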
It should be noted that, as will be appreciated by those skilled in the art, the mapping module 231 may also map the depth values of the pixels in the initial image data D1 to the depth mapping values of the pixels in a nonlinear mapping manner, so as to map different depth values to different HSV values, whereby different colors represent different depths.
According to the first preferred embodiment of the present invention, as shown in Fig. 1 to Fig. 6, the present invention also provides a 3D pseudo-color mapping method, wherein the pseudo-color mapping method comprises the following steps:
S2: pre-processing, by an analytical unit 22 of a 3D pseudo-color mapping system 20, an initial image data D1 to generate a depth range data D2; and
S3: processing the initial image data D1, by a processing unit 23 of the 3D pseudo-color mapping system 20 and based on the depth range data D2, to generate a pseudo-color image data D4.
In the first preferred embodiment of the present invention, before the step S2 the pseudo-color mapping method further comprises a step S1: receiving, by a receiving module 211, the initial image data D1 from an acquisition device 10. Preferably, the acquisition device 10 may be, but is not limited to being, implemented as a depth camera.
More specifically, the step S2 of the pseudo-color mapping method comprises:
S21: counting, by a statistical module 221 of the analytical unit 22, the depth values of the pixels in the initial image data D1 to obtain the maximum value and the minimum value of the depth values of the pixels; and
S22: generating, by the statistical module 221, the depth range data D2 based on the maximum value and the minimum value of the depth values.
It should be noted that the step S3 of the pseudo-color mapping method comprises the following steps:
S31: mapping, by a mapping module 231 of the processing unit 23 and based on the depth range data D2, the depth values of the pixels of the initial image data D1 respectively to the depth mapping values of the corresponding pixels, to generate a depth mapping data D3; and
S32: configuring, by a configuration module 232 of the processing unit 23 and based on the depth mapping data D3, the HSV values corresponding to the pixels, to generate the pseudo-color image data D4.
Preferably, the step S31 comprises the following steps:
S311: mapping, by a tone mapping module 2311 of the mapping module 231 and based on the depth range data D2, the depth values of the pixels of the initial image data D1 respectively to the tone mapping values of the corresponding pixels, to generate a tone mapping data D31;
S312: mapping, by a brightness mapping module 2312 of the mapping module 231 and based on the depth range data D2, the depth values of the pixels of the initial image data D1 respectively to the brightness mapping values of the corresponding pixels, to generate a brightness mapping data D32; and
S313: integrating, by an integrating module 2313 of the mapping module 231, the tone mapping data D31 and the brightness mapping data D32, to generate the depth mapping data D3.
In some embodiments of the present invention, the step S31 comprises the following steps:
S311: mapping, by a tone mapping module 2311 of the mapping module 231 and based on the depth range data D2, the depth values of the pixels of the initial image data respectively to the tone mapping values of the corresponding pixels, to generate a tone mapping data; and
S312: generating, by the mapping module, the depth mapping data based on the tone mapping data.
It is worth noting that, after the step S3, the pseudo-color mapping method further comprises a step S4: transmitting, by an output module 212, the pseudo-color image data D4 to an output device 30, so as to display a corresponding pseudo-color image through the output device 30. Preferably, the output device 30 may be, but is not limited to being, implemented as a display screen.
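The steps S2 to S32 above can be sketched end to end as one function. This is an illustrative NumPy rendering (not the patent's implementation) using the preferred 0-180 degree tone range, the 0-255 brightness range, and a fixed saturation of 255:

```python
import numpy as np

def pseudo_color_map(depth_image):
    """Map a 2-D array of depth values to an HSV pseudo-color image:
    S2 statistics, S311/S312 tone and brightness mapping, S32 configuration."""
    d = depth_image.astype(np.float64)
    near, far = d.min(), d.max()           # S21/S22: depth range data
    norm = (d - near) / (far - near)       # 0 at the nearest pixel, 1 at the farthest
    hue = norm * 180.0                     # S311: forward linear mapping, 0-180 degrees
    value = (1.0 - norm) * 255.0           # S312: reverse linear mapping, 0-255
    sat = np.full_like(d, 255.0)           # S32: saturation fixed at 255
    return np.stack([hue, sat, value], axis=-1)

depth = np.array([[500, 1000],
                  [1500, 750]])
hsv = pseudo_color_map(depth)
print(hsv[0, 0])  # [  0. 255. 255.] nearest pixel: hue 0, full brightness
print(hsv[1, 0])  # [180. 255.   0.] farthest pixel: hue 180, dark
```

A display pipeline (step S4) would then convert the HSV array to RGB for the output device; that conversion is omitted here for brevity.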
In the first preferred embodiment of the present invention, when multiple frames of images are shot by the acquisition device 10 to generate a series of initial image data D1, the 3D pseudo-color mapping system 20 performs pseudo-color mapping processing on the initial image data D1 of each frame, so as to obtain the pseudo-color image data D4 of each frame. That is to say, when pseudo-color processing is performed on each frame, a pre-processing process (a statistical process) is executed to obtain the corresponding depth range data D2, and the corresponding colors are then mapped according to the depth range, so as to enrich the colors in the pseudo-color image obtained, while also effectively avoiding the problem of small color differences caused by overly concentrated depth information.
Referring to Fig. 7 to Fig. 12 of the accompanying drawings, a 3D pseudo-color mapping system and method according to a second preferred embodiment of the present invention are illustrated. According to the second preferred embodiment of the present invention, the 3D pseudo-color mapping system 20' comprises an interactive unit 21, an analytical unit 22' and a processing unit 23' that are communicatively connected with one another, wherein the interactive unit 21 receives an initial image data D1 from an acquisition device 10 and transmits the initial image data D1 to the analytical unit 22' and the processing unit 23' respectively, wherein the analytical unit 22' receives the initial image data D1 and periodically pre-processes the initial image data D1 to calculate a depth range, so as to generate a depth range data D2', wherein the processing unit 23' receives the initial image data D1 and the depth range data D2', processes the initial image data D1 based on the depth range data D2' to generate a pseudo-color image data D4', and transmits the pseudo-color image data D4' to the interactive unit 21, wherein the interactive unit 21 receives the pseudo-color image data D4' and, through an output device 30, displays a pseudo-color image based on the pseudo-color image data for direct viewing by the human eye, so that the different depths of a measured target W can be distinguished by the different colors of the pseudo-color image.
It is worth noting that the acquisition device 10 may be, but is not limited to being, implemented as a depth camera for acquiring the depth information of a measured target W (or measured object) so as to generate the initial image data D1, wherein the initial image data D1 comprises the depth value of each pixel in an initial image (a grayscale image) of the measured target W.
In addition, the output device 30 may be, but is not limited to being, implemented as a display screen for displaying the pseudo-color image of the measured target W based on the pseudo-color image data D4' generated by the 3D pseudo-color mapping system 20'. It should be pointed out that the output device 30 may also be a device having a display function, such as a mobile phone, a computer, a tablet or a touch screen.
According to the second preferred embodiment of the present invention, as shown in Fig. 7, the interactive unit 21 of the 3D pseudo-color mapping system 20' comprises a receiving module 211 and an output module 212, wherein the receiving module 211 of the interactive unit 21 is communicatively connected with the acquisition device 10 to receive the initial image data D1 generated by the acquisition device 10, and the output module 212 of the interactive unit 21 is communicatively connected with the output device 30 to transmit the pseudo-color image data D4' generated by the processing unit 23' to the output device 30, so as to display the pseudo-color image of the measured target W through the output device 30. It is worth noting that the manner of the communication connection is illustratively, but not limited to being, implemented as a wired or wireless connection, such as an electrical connection, a signal connection or a data connection.
The receiving module 211 of the interactive unit 21 is communicatively connected with the analytical unit 22' and the processing unit 23' respectively, and transmits the initial image data D1 to the analytical unit 22' and the processing unit 23' respectively, wherein the analytical unit 22' receives and pre-processes the initial image data D1 to generate the depth range data D2', and the processing unit 23' receives and processes the initial image data D1 and the depth range data D2' to generate the pseudo-color image data D4'.
More specifically, as shown in Fig. 9, the analytical unit 22' comprises a statistical module 221' and a judgment module 222' that are communicatively connected with each other. The receiving module 211 of the interactive unit 21 is communicatively connected with the statistical module 221', and the statistical module 221' periodically receives the initial image data D1 from the receiving module 211 and periodically pre-processes the initial image data D1 from the receiving module 211, thereby generating an initial depth range data D21'. In other words, the statistical module 221' counts, based on the depth values of all pixels in the initial image data D1, the maximum value and the minimum value of the depth values in the initial image data D1, so as to automatically calculate the depth range of the initial image data D1 and generate the initial depth range data D21' (that is, the farthest value of the initial depth range data D21' is the maximum value of the depth values, and the nearest value of the initial depth range data D21' is the minimum value of the depth values), and transmits the initial depth range data D21' to the judgment module 222'.
The judgment module 222' receives and judges the initial depth range data D21' to generate the depth range data D2'. When the judgment module 222' receives the initial depth range data D21' for the first time, the judgment module 222' generates the depth range data D2' based on the initial depth range data D21'; that is, the depth range data D2' is equal to the first initial depth range data D21'. When the judgment module 222' receives the initial depth range data D21' not for the first time, the farthest value of the initial depth range data D21' is compared with the farthest value of the previous depth range data D2', to judge whether the farthest value of the initial depth range data D21' is greater than a predetermined multiple of the farthest value of the depth range data D2'. If the farthest value of the initial depth range data D21' is greater than the predetermined multiple of the farthest value of the depth range data D2', the depth range data D2' is assigned the initial depth range data D21'; that is to say, the depth range data D2' is updated to the initial depth range data D21'. If the farthest value of the initial depth range data D21' is not greater than the predetermined multiple of the farthest value of the depth range data D2', the depth range data D2' keeps its previous value; that is to say, the depth range data D2' does not change. Preferably, the predetermined multiple may be, but is not limited to being, implemented as 1.2 times; that is, the depth range data D2' is updated when the farthest value of the initial depth range data D21' varies beyond 20% of the farthest value of the depth range data D2'.
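The update rule of the judgment module 222' can be sketched as a small function. The tuple representation and the function name are illustrative assumptions; the 1.2x predetermined multiple is the preferred value stated above:

```python
def update_depth_range(current, incoming, multiple=1.2):
    """Update the depth range (near, far) only when the incoming farthest
    value exceeds `multiple` times the current farthest value; on the
    first call (current is None) the incoming range is adopted as-is."""
    if current is None:
        return incoming
    _, cur_far = current
    _, new_far = incoming
    if new_far > multiple * cur_far:   # farthest value moved beyond 20%
        return incoming
    return current                     # otherwise keep the previous range

r = update_depth_range(None, (400, 1000))
print(r)  # (400, 1000)  first range is adopted
r = update_depth_range(r, (420, 1100))
print(r)  # (400, 1000)  1100 <= 1.2 * 1000, no update
r = update_depth_range(r, (500, 1300))
print(r)  # (500, 1300)  1300 > 1200, range is updated
```

Keeping the range stable across small fluctuations avoids the colors of the whole image re-mapping on every frame, which would otherwise cause visible flicker.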
It is worth noting that, in some embodiments of the present invention, the judgment module 222' also synchronously compares the nearest value of the initial depth range data D21' with the nearest value of the depth range data D2', to judge whether the nearest value of the initial depth range data D21' is less than a predetermined multiple of the nearest value of the depth range data D2'. If the nearest value of the initial depth range data D21' is less than the predetermined multiple of the nearest value of the depth range data D2', the depth range data D2' is assigned the initial depth range data D21'; that is to say, the depth range data D2' is updated to the initial depth range data D21'. If the nearest value of the initial depth range data D21' is not less than the predetermined multiple of the nearest value of the depth range data D2', the depth range data D2' keeps its previous value. Preferably, the predetermined multiple may be, but is not limited to being, implemented as 1.2 times.
It will be appreciated by those skilled in the art that the statistical module 221' of the analytical unit 22' may periodically receive and count the initial image data D1 at a predetermined time interval, or at an interval of a predetermined number of frames, or the receiving and counting of the initial image data D1 by the statistical module 221' may be triggered and controlled manually. By way of example, a depth camera can typically shoot 30 frames of images per second, so one statistical operation may be executed at an interval of every ten frames to judge the difference between the initial depth range data D21' and the depth range data D2', and thereby judge whether the depth information of the measured target W has changed. When the measured target changes or the viewing distance changes, the pseudo-color mapping system 20' can automatically calculate and adjust the depth range data D2' to meet the needs of different viewing distances or different measured targets, thereby broadening the application range of the 3D pseudo-color mapping system 20'.
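The frame-interval triggering described above can be sketched as a simple predicate; the ten-frame interval is the example given in the text, not a fixed requirement, and the function name is hypothetical:

```python
def should_run_statistics(frame_index, interval=10):
    """Trigger one statistical pass every `interval` frames
    (frames 0, 10, 20, ... for the default interval)."""
    return frame_index % interval == 0

# At 30 fps, a ten-frame interval yields three statistical passes per second.
triggered = [i for i in range(30) if should_run_statistics(i)]
print(triggered)  # [0, 10, 20]
```

A time-interval trigger or a manual trigger would replace this predicate without changing the rest of the pipeline.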
According to the second preferred embodiment of the present invention, as illustrated in Fig. 7 and Fig. 10, the processing unit 23' of the 3D pseudo-color mapping system 20' includes a mapping module 231', a configuration module 232' and a correction module 233', wherein the correction module 233' of the processing unit 23' is communicatively coupled to the receiving module 211 of the interactive unit 21 and to the analytical unit 22' respectively, and the correction module 233' can synchronously receive the original image data D1 from the receiving module 211 and the depth range data D2' from the analytical unit 22', and compare the depth value of each pixel in the original image data D1 with the farthest value of the depth range data D2'. When the depth value of a pixel of the original image data D1 is greater than the farthest value of the depth range data D2', the depth value of that pixel is assigned the farthest value of the depth range data D2'; when the depth value of a pixel of the original image data D1 is not greater than the farthest value of the depth range data D2', the depth value of that pixel is retained, so as to generate a corrected original image data D1'. That is to say, the depth values of all pixels exceeding the farthest value of the depth range data D2' are corrected to the farthest value, so that the depth values of all pixels in the corrected original image data D1' lie between the nearest value and the farthest value of the depth range data D2'. This prevents any depth value from falling outside the depth range data D2', thereby ensuring that all pixels in the original image data D1 can be pseudo-color mapped.
In some embodiments of the present invention, the correction module 233' can also synchronously compare the depth value of each pixel in the original image data D1 with the nearest value of the depth range data D2'. When the depth value of a pixel of the original image data D1 is less than the nearest value of the depth range data D2', the depth value of that pixel is assigned the nearest value of the depth range data D2'; that is to say, the depth values of all pixels less than the nearest value of the depth range data D2' are corrected to the nearest value, so that the depth values of all pixels lie between the nearest value and the farthest value of the depth range data D2'. This ensures that pseudo-color mapping can be performed for all pixels of the original image data D1 in every frame, so that each pixel is mapped to a corresponding color.
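The correction performed by the correction module 233' amounts to clamping each depth value into the depth range. A minimal sketch, assuming the frame is held as a flat list of numeric depth values (the function and variable names are illustrative only):

```python
def correct_frame(depths, near, far):
    """Clamp every pixel depth into [near, far], as the correction module
    233' does when generating the corrected original image data D1'."""
    return [max(near, min(d, far)) for d in depths]
```

For example, with a depth range of (100, 1000), the depths [50, 500, 2000] become [100, 500, 1000]: the too-near pixel is raised to the nearest value and the too-far pixel is lowered to the farthest value.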
It will be appreciated by those skilled in the art that, normally, the nearest value of the depth range data D2' is zero, so that the depth value of any pixel cannot be less than the nearest value. Therefore, in the second preferred embodiment of the present invention, the correction module 233' does not need to compare the depth value of each pixel with the nearest value.
As shown in Fig. 10, the mapping module 231' of the processing unit 23' is communicatively coupled to the correction module 233' and the analytical unit 22' respectively, wherein the correction module 233' can transmit the corrected original image data D1' to the mapping module 231', the analytical unit 22' can transmit the depth range data D2' to the mapping module 231', and the mapping module 231' can synchronously receive the corrected original image data D1' and the depth range data D2'. Based on the depth range data D2', the mapping module 231' can map the depth value of each pixel in the corrected original image data D1' into a predetermined mapping range, so as to calculate a depth mapping value for each pixel and generate a depth mapping data D3', wherein the depth mapping data D3' includes the depth mapping value of each pixel in the corrected original image data D1'.
The configuration module 232' of the processing unit 23' is communicatively coupled to the mapping module 231', and the configuration module 232' can receive the depth mapping data D3' from the mapping module 231' and, based on the depth mapping data D3', configure a corresponding HSV value for each pixel, so as to generate the pseudo-color image data D4'. It is worth noting that the configuration module 232' performs the configuration based on the HSV color model, wherein the value range of the tone value H in the HSV value is 0 to 360 degrees, the value range of the saturation value S is 0 to 1 (in the present invention, the value range of the saturation value S is proportionally expanded to 0 to 255, where S=255 represents the saturation of S=1), and the value range of the brightness value V is 0 to 255 (where a brightness value V of 0 represents black and a brightness value V of 255 represents white). Different HSV values represent different colors; therefore, a corresponding HSV value is obtained from the depth value of each pixel, so that pixels with different depth values are configured with different HSV values, that is, different colors, and the different depths of the measured target W can be distinguished by the colors of the pixels included in the pseudo-color image data D4'.
In the second preferred embodiment of the present invention, as shown in Fig. 8 and Fig. 11, the mapping module 231' of the processing unit 23' includes a tone mapping module 2311', a brightness mapping module 2312' and an integration module 2313', wherein the tone mapping module 2311' can, based on the depth range data D2', map the depth value of each pixel in the corrected original image data D1' into a predetermined tone mapping range, so as to calculate a tone mapping value of each pixel in the corrected original image data D1' and generate a tone mapping data D31', and the brightness mapping module 2312' can, based on the depth range data D2', map the depth value of each pixel in the corrected original image data D1' into a predetermined brightness mapping range, so as to calculate a brightness mapping value of each pixel in the corrected original image data D1' and generate a brightness mapping data D32'. The integration module 2313' is communicatively coupled to the tone mapping module 2311' and the brightness mapping module 2312' respectively, and the integration module 2313' can receive and integrate the tone mapping data D31' and the brightness mapping data D32', so as to generate the depth mapping data D3', wherein the depth mapping data D3' includes the tone mapping data and the brightness mapping data of each pixel; that is to say, the depth mapping value of the depth mapping data D3' includes the tone mapping value of each pixel in the tone mapping data and the brightness mapping value of each pixel in the brightness mapping data.
Preferably, the predetermined tone mapping range is implemented as 0 to 180 degrees. Since, in the HSV color model, the tone values H of complementary colors differ by 180 degrees, when the predetermined tone mapping range is selected as 0 to 180 degrees, the tone mapping values of the pixels in the tone mapping data D31' take values between complementary colors, so that the colors of the obtained pseudo-color image are relatively rich and pixels of different depths are ensured to have different colors, without the concern that pixels of considerably different depths might have similar colors; the depth information of the pseudo-color image can thus be distinguished by color. It will be understood by those skilled in the art that the predetermined tone mapping range may also be implemented as any value range spanning 180 degrees, such as 60 to 240 degrees, 120 to 300 degrees or 180 to 360 degrees. Of course, the predetermined tone mapping range may also be implemented as a value range whose span is not 180 degrees.
It is worth noting that the predetermined brightness mapping range can be, but is not limited to, 0 to 255, such that a smaller depth value is mapped to a larger brightness mapping value and, correspondingly, a larger depth value is mapped to a smaller brightness mapping value. That is to say, a pixel with a smaller depth value has a brighter brightness and a pixel with a larger depth value has a darker brightness; in other words, a more forward region of the measured target W has a higher brightness in the pseudo-color image, and a more rearward region of the measured target W has a lower brightness in the pseudo-color image. This conforms to the viewing habits of the human eye and helps the human eye distinguish the depths or boundaries of the measured target W by directly observing the pseudo-color image. Those skilled in the art should appreciate that the predetermined brightness mapping range may also be implemented as any range between 0 and 255 and can achieve the same brightness recognition effect. In addition, by changing the mapping algorithm, a smaller depth value may instead be mapped to a smaller brightness mapping value, so that the more forward regions of the measured target W have a lower brightness in the pseudo-color image.
The configuration module 232' of the processing unit 23' receives the depth mapping data D3' from the mapping module 231', assigns the tone value H of each pixel the tone mapping value of the corresponding pixel in the depth mapping data D3', assigns the brightness value V of each pixel the brightness mapping value of the corresponding pixel in the depth mapping data D3', and assigns the saturation value S of each pixel the value 255, so as to configure the HSV value of each pixel and thereby generate the pseudo-color image data D4'.
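The per-pixel HSV configuration can be sketched as below. The function name is hypothetical, and the conversion to an 8-bit RGB triple via the standard library is an added illustration for display purposes, not part of the described system:

```python
import colorsys

def configure_hsv(tone_mapping_value, brightness_mapping_value):
    """Configure one pixel as the configuration module 232' does:
    H = tone mapping value (degrees), S = 255 (full saturation),
    V = brightness mapping value. Also return an 8-bit RGB triple
    derived with the standard HSV model, for display."""
    h, s, v = tone_mapping_value, 255, brightness_mapping_value
    r, g, b = colorsys.hsv_to_rgb(h / 360.0, s / 255.0, v / 255.0)
    return (h, s, v), (round(r * 255), round(g * 255), round(b * 255))
```

For example, a tone mapping value of 0 degrees at full brightness yields pure red, and 120 degrees yields pure green, so pixels at different depths receive clearly distinct colors.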
It is noted that in some embodiments of the invention, the mapping block 231 ' of the processing unit 23 ' Only include the tone mapping module 2311 ', with by the tone mapping module 2311 ' by the amendment original data The depth value of each pixel maps in the predetermined tone mapping range in D1', to calculate the amendment initial graph The tone mapping value of each pixel as described in data D1', to generate the tone map data D31 ', so that the place The depth map data D3 ' generated of the mapping block 231 ' for managing unit 23 ' only includes the institute of each pixel State tone mapping value.The configuration module 232 ' of the processing unit 23 ' receives described from the mapping block 231 ' The tone value H of the pixel is assigned a value of corresponding institute in the depth map data D3 ' by depth map data D3 ' The brightness value V of the pixel is assigned a value of 255 by the tone mapping value for stating pixel, while will be described in the pixel Intensity value S is assigned a value of 255, to be configured to the HSV value of the pixel, and then generates the pcolor as data D4 '.
According to the second preferred embodiment of the present invention, preferably, the mapping module 231' of the processing unit 23' maps the depth value of each pixel in the corrected original image data D1' to the depth mapping value of the pixel in a linear mapping manner. It is worth noting that the linear mapping manner can be a forward linear mapping manner or a reverse linear mapping manner.
More preferably, the tone mapping module 2311' of the mapping module 231' maps the depth value of each pixel in the corrected original image data D1' to the tone mapping value of the pixel in a forward linear mapping manner, and the brightness mapping module 2312' of the mapping module 231' maps the depth value of each pixel in the corrected original image data D1' to the brightness mapping value of the pixel in a reverse linear mapping manner. For example, the mapping relation of the forward linear mapping manner is: tone mapping value = ((depth value - nearest value) / depth range) × predetermined tone mapping range + predetermined tone mapping minimum value = ((depth value - nearest value) / (farthest value - nearest value)) × 180; the mapping relation of the reverse linear mapping manner is: brightness mapping value = (1 - ((depth value - nearest value) / depth range)) × predetermined brightness mapping range + predetermined brightness mapping minimum value = (1 - ((depth value - nearest value) / (farthest value - nearest value))) × 255. It is worth noting that the predetermined tone mapping minimum value is the minimum value of the predetermined tone mapping range and the predetermined brightness mapping minimum value is the minimum value of the predetermined brightness mapping range; in this preferred embodiment of the present invention, the predetermined tone mapping minimum value and the predetermined brightness mapping minimum value are both zero.
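The two linear mapping relations above can be written directly in code. This is a sketch under the stated preferences (tone range 0 to 180 degrees, brightness range 0 to 255, both minimum values zero); the depth value is assumed to have already been clamped into [nearest, farthest] by the correction module, and the function name is illustrative:

```python
def map_pixel(depth, nearest, farthest, tone_range=180.0, tone_min=0.0,
              bright_range=255.0, bright_min=0.0):
    """Forward linear mapping for the tone mapping value and reverse linear
    mapping for the brightness mapping value, per the formulas in the text."""
    t = (depth - nearest) / (farthest - nearest)        # normalized depth in [0, 1]
    tone = t * tone_range + tone_min                    # forward linear mapping
    brightness = (1.0 - t) * bright_range + bright_min  # reverse linear mapping
    return tone, brightness
```

The nearest pixel thus receives tone 0 with full brightness 255, and the farthest pixel receives tone 180 with brightness 0, matching the intended viewing habit of brighter foreground regions.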
It is noted that, as will be appreciated by those skilled in the art, the mapping module 231' can also map the depth value of each pixel in the corrected original image data D1' to the depth mapping value of the pixel in a nonlinear mapping manner, so as to map different depth values to different HSV values such that different colors represent different depths.
According to the second preferred embodiment of the present invention, as shown in Fig. 12, the present invention also provides a 3D pseudo-color mapping method, wherein the pseudo-color mapping method comprises the following steps:
S2': pre-processing an original image data D1 by an analytical unit 22' of a 3D pseudo-color mapping system 20', so as to generate a depth range data D2'; and
S3': based on the depth range data D2', processing the original image data D1 by the processing unit 23', so as to generate a pseudo-color image data D4'.
In the second preferred embodiment of the present invention, the pseudo-color mapping method further includes, before the step S2', a step S1': receiving the original image data D1 from an acquisition device 10 by a receiving module 211 of an interactive unit 21 of the 3D pseudo-color mapping system 20'. Preferably, the acquisition device 10 can be, but is not limited to, a depth camera.
More specifically, the step S2' of the pseudo-color mapping method includes:
S21': counting, by a statistical module 221' of the analytical unit 22', the depth values of the pixels in the original image data D1, so as to obtain a maximum value and a minimum value of the depth values of the pixels and generate an initial depth range data D21'; and
S22': judging, by a judgment module 222' of the analytical unit 22', the initial depth range data D21', so as to generate the depth range data D2'.
It is noted that the step S3' of the pseudo-color mapping method comprises the following steps:
S31': based on the depth range data D2', correcting the original image data D1 by a correction module 233' of a processing unit 23' of the 3D pseudo-color mapping system 20', so as to generate a corrected original image data D1';
S32': based on the depth range data D2', mapping, by a mapping module 231' of the processing unit 23', the depth values of the pixels of the corrected original image data D1' respectively to a depth mapping value of each corresponding pixel, so as to generate a depth mapping data D3'; and
S33': based on the depth mapping data D3', configuring, by a configuration module 232' of the processing unit 23', a corresponding HSV value for each pixel, so as to generate the pseudo-color image data D4'.
Preferably, the step S32' comprises the following steps:
S321': based on the depth range data D2', mapping, by a tone mapping module 2311' of the mapping module 231', the depth values of the pixels of the corrected original image data D1' respectively to a tone mapping value of each corresponding pixel, so as to generate a tone mapping data D31';
S322': based on the depth range data D2', mapping, by a brightness mapping module 2312' of the mapping module 231', the depth values of the pixels of the corrected original image data D1' respectively to a brightness mapping value of each corresponding pixel, so as to generate a brightness mapping data D32'; and
S323': integrating, by the mapping module 231', the tone mapping data D31' and the brightness mapping data D32', so as to generate the depth mapping data D3'.
It is worth noting that, in the step S32', the step S321' and the step S322' may be performed in either order.
After the step S3', the pseudo-color mapping method further includes a step S4': transmitting the pseudo-color image data D4' to an output device 30 by an output module 212 of the interactive unit 21, so as to display the corresponding pseudo-color image by the output device 30. Preferably, the output device 30 can be, but is not limited to, a display screen.
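The steps S2' and S3' above can be combined into a single end-to-end sketch for one frame. This is a simplified illustration, not the claimed implementation: the depth range is taken directly from the frame statistics (step S21') without the judgment step S22', the frame is a flat list of depth values, and the function name is hypothetical:

```python
def pseudo_color_frame(depths):
    """One-frame sketch: statistics (S21'), correction (S31'), tone and
    brightness linear mappings (S321'/S322'), HSV configuration (S33')."""
    near, far = min(depths), max(depths)                  # S21': depth range
    if far == near:
        far = near + 1                                    # guard: flat frame
    corrected = [max(near, min(d, far)) for d in depths]  # S31': correction
    hsv = []
    for d in corrected:
        t = (d - near) / (far - near)
        tone = t * 180.0                                  # S321': tone mapping
        brightness = (1.0 - t) * 255.0                    # S322': brightness mapping
        hsv.append((tone, 255, brightness))               # S33': HSV value
    return hsv
```

A frame with depths [0, 500, 1000] thus yields HSV values (0, 255, 255), (90, 255, 127.5) and (180, 255, 0): the nearest pixel is bright red and the farthest is dark cyan.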
It is noted that, according to the second preferred embodiment of the present invention, when the pre-processing interval time of the statistical module 221' of the analytical unit 22' of the 3D pseudo-color mapping system 20' is set to every frame, the original image data D1 of every frame is pre-processed to generate the depth range data D2'. In this case, compared with the 3D pseudo-color mapping system 20 according to the first preferred embodiment of the present invention, the 3D pseudo-color mapping system 20' of the second preferred embodiment adds an adjustment margin: only when the farthest value of the initial depth range data D21' of the initial image of the current frame exceeds the predetermined multiple of the farthest value of the depth range data D2' of the initial image of the previous frame is the depth range data D2' assigned the initial depth range data D21'; otherwise the depth range data D2' is not adjusted. This prevents the depth range data D2' from being adjusted because of slight changes in the depth values of the initial images, so as to effectively keep the color consistency of the pseudo-color images obtained from consecutive frames, while also automatically judging whether the captured measured target W has changed, thereby meeting the mapping needs of different measured targets W.
According to another aspect of the present invention, the present invention further provides a 3D pseudo-color mapping method, wherein the 3D pseudo-color mapping method includes the following steps:
(a) obtaining a depth range data D2 by an analytical unit 22 analyzing an original image data D1 of a measured target W; and
(b) processing, by a processing unit 23, the original image data D1 based on the depth range data D2 obtained by the analytical unit 22, so as to obtain a pseudo-color image data D4 of the measured target W, thereby mapping the original image data D1 of the measured target W into the pseudo-color image data D4.
According to another aspect of the present invention, the present invention further provides a method of displaying a pseudo-color image by an electronic device, wherein the display method of the pseudo-color image includes the following steps:
(A) acquiring the original image data D1 about a measured target W by an acquisition device 10 of an electronic device;
(B) receiving the original image data D1 from the acquisition device 10 by a receiving module 211;
(C) obtaining a depth range data D2 by an analytical unit 22 analyzing the original image data D1;
(D) processing, by a processing unit 23, the original image data D1 based on the depth range data D2 obtained by the analytical unit 22, so as to obtain a pseudo-color image data D4 of the measured target W; and
(E) displaying the pseudo-color image data D4 by an output module 212 communicatively connected to an output device 30 of the electronic device, so as to form a pseudo-color image of the measured target on the output device 30.
It should be understood by those skilled in the art that the foregoing description and the embodiments of the present invention shown in the drawings are only illustrative and are not intended to limit the present invention. The objectives of the present invention have been fully and effectively achieved. The functions and structural principles of the present invention have been shown and illustrated in the embodiments; without departing from these principles, the embodiments of the present invention may have any deformation or modification.

Claims (24)

1. A 3D pseudo-color mapping system, characterized by comprising:
an analytical unit, wherein the analytical unit obtains a depth range data by analyzing an original image data of a measured target; and
a processing unit, wherein the processing unit is communicatively connected to the analytical unit, wherein the processing unit processes the original image data based on the depth range data obtained by the analytical unit, so as to obtain a pseudo-color image data about the measured target, thereby mapping the original image data of the measured target into the pseudo-color image data.
2. The 3D pseudo-color mapping system according to claim 1, further comprising an interactive unit, wherein the interactive unit includes an output module, wherein the output module is communicatively connected to the processing unit, and wherein the output module displays the pseudo-color image data obtained by the processing unit by being communicatively connected to an output device, so as to form a pseudo-color image of the measured target on the output device.
3. The 3D pseudo-color mapping system according to claim 1, further comprising an interactive unit, wherein the interactive unit includes a receiving module, wherein the analytical unit is communicatively connected to the receiving module, wherein the receiving module obtains the original image data of the measured target by being communicatively connected to an acquisition device, and the analytical unit obtains the original image data of the measured target from the receiving module.
4. The 3D pseudo-color mapping system according to claim 2, wherein the interactive unit includes a receiving module, the analytical unit is communicatively connected to the receiving module, wherein the receiving module obtains the original image data of the measured target by being communicatively connected to an acquisition device, and the analytical unit obtains the original image data of the measured target from the receiving module.
5. The 3D pseudo-color mapping system according to claim 4, wherein the analytical unit includes a statistical module, the statistical module is communicatively connected to the receiving module and the processing unit, wherein the statistical module receives the original image data from the receiving module, and the statistical module automatically calculates a depth range of the pixels by counting the depth values of all pixels in the original image data, so as to obtain the depth range data.
6. The 3D pseudo-color mapping system according to claim 5, wherein the processing unit further comprises a mapping module and a configuration module communicatively coupled to each other, wherein the mapping module is communicatively connected to the receiving module and the analytical unit, wherein the mapping module receives the original image data from the receiving module and receives the depth range data from the analytical unit, and the mapping module generates a depth mapping data by mapping the original image data to a predetermined mapping range based on the depth range data, and wherein the configuration module receives the depth mapping data and configures the depth mapping data to generate the pseudo-color image data.
7. The 3D pseudo-color mapping system according to claim 4, wherein the analytical unit includes a statistical module and a judgment module communicatively coupled to each other, the statistical module is communicatively connected to the receiving module, and the judgment module is communicatively connected to the processing unit, wherein the statistical module receives the original image data from the receiving module and automatically calculates a depth range of the pixels by counting the depth values of all pixels in the original image data, so as to obtain an initial depth range data, and wherein the judgment module obtains the depth range data according to the initial depth range data.
8. The 3D pseudo-color mapping system according to claim 7, wherein the statistical module periodically receives the original image data from the receiving module.
9. The 3D pseudo-color mapping system according to claim 7, wherein the processing unit further comprises a mapping module, a configuration module communicatively connected to the mapping module, and a correction module communicatively connected to the mapping module, the correction module being communicatively connected to the receiving module and the analytical unit, wherein the correction module corrects the original image data received from the receiving module based on the depth range data received from the analytical unit, so as to obtain a corrected original image data, wherein the mapping module maps the corrected original image data received from the correction module to a predetermined mapping range based on the depth range data received from the analytical unit, so as to obtain a depth mapping data, and wherein the configuration module receives the depth mapping data and configures the depth mapping data to generate the pseudo-color image data.
10. The 3D pseudo-color mapping system according to claim 6 or 9, wherein the mapping module synchronously receives the original image data from the receiving module and receives the depth range data from the analytical unit.
11. The 3D pseudo-color mapping system according to claim 6 or 9, wherein the mapping module further comprises an integration module and a tone mapping module and a brightness mapping module respectively communicatively connected to the integration module, the tone mapping module being communicatively connected to the receiving module and the analytical unit, and the brightness mapping module being communicatively connected to the receiving module and the analytical unit, wherein the tone mapping module obtains a tone mapping data by mapping the original image data to a predetermined tone mapping range based on the depth range data, and the brightness mapping module obtains a brightness mapping data by mapping the original image data to a predetermined brightness mapping range based on the depth range data, and wherein the integration module integrates the tone mapping data and the brightness mapping data to obtain the depth mapping data.
12. The 3D pseudo-color mapping system according to claim 6 or 9, wherein the mapping module further comprises a tone mapping module, the tone mapping module is communicatively connected to the receiving module and the analytical unit, wherein the tone mapping module obtains a tone mapping data by mapping the original image data to a predetermined tone mapping range based on the depth range data, and the tone mapping data forms the depth mapping data.
13. The 3D pseudo-color mapping system according to claim 6 or 9, wherein the mapping relation of the mapping module is a linear mapping relation.
14. An electronic device, characterized by comprising:
an acquisition device;
an output device; and
a 3D pseudo-color mapping system, wherein the 3D pseudo-color mapping system further comprises:
an interactive unit, wherein the interactive unit includes a receiving module and an output module, the receiving module is communicatively connected to the acquisition device, and the output module is communicatively connected to the output device;
an analytical unit, wherein the analytical unit is communicatively connected to the interactive unit, wherein the analytical unit obtains a depth range data after analyzing an original image data of a measured target acquired by the acquisition device and received by the receiving module; and
a processing unit, wherein the processing unit is communicatively connected to the analytical unit, wherein the processing unit processes the original image data based on the depth range data obtained by the analytical unit, so as to obtain a pseudo-color image data about the measured target, and wherein the output module can display the pseudo-color image data obtained by the processing unit on the output device, so as to form a pseudo-color image on the output device.
15. A 3D pseudo-color mapping method, characterized in that the 3D pseudo-color mapping method includes the following steps:
(a) obtaining a depth range data by an analytical unit analyzing an original image data of a measured target; and
(b) processing, by a processing unit, the original image data based on the depth range data obtained by the analytical unit, so as to obtain a pseudo-color image data of the measured target, thereby mapping the original image data of the measured target into the pseudo-color image data.
16. The 3D pseudo-color mapping method according to claim 15, wherein the step (a) further comprises the steps of:
counting the depth values of all pixels in the original image data, so as to obtain a maximum value and a minimum value of the depth values of the pixels; and
obtaining the depth range data according to the maximum value and the minimum value of the depth values.
17. The 3D pseudo-color mapping method according to claim 15 or 16, wherein the step (b) further comprises the steps of:
mapping the depth values of the pixels of the original image data respectively to a depth mapping value of each corresponding pixel according to the depth range data, so as to obtain a depth mapping data; and
configuring a corresponding HSV value for each pixel according to the depth mapping data, so as to obtain the pseudo-color image data.
18. The 3D pseudo-color mapping method according to claim 17, wherein the step of mapping the depth values of the pixels of the original image data respectively to the depth mapping value of each corresponding pixel according to the depth range data, so as to obtain the depth mapping data, further comprises the steps of:
based on the depth range data, mapping, by a tone mapping module, the depth values of the pixels of the original image data respectively to a tone mapping value of each corresponding pixel, so as to obtain a tone mapping data;
based on the depth range data, mapping, by a brightness mapping module, the depth values of the pixels of the original image data respectively to a brightness mapping value of each corresponding pixel, so as to obtain a brightness mapping data; and
integrating the tone mapping data and the brightness mapping data by an integration module, so as to obtain the depth mapping data.
19. The 3D pseudo-color mapping method according to claim 17, wherein the step of mapping the depth values of the pixels of the original image data respectively to the depth mapping value of each corresponding pixel according to the depth range data, so as to obtain the depth mapping data, further comprises the step of: based on the depth range data, mapping, by a tone mapping module, the depth values of the pixels of the original image data respectively to a tone mapping value of each corresponding pixel, so as to obtain a tone mapping data, wherein the tone mapping data forms the depth mapping data.
20. The 3D pseudo-color mapping method according to claim 15, wherein the step (a) further comprises the steps of:
counting the depth values of all pixels in the initial image data, so as to obtain initial depth range data from the maximum value and the minimum value of the depth values of the pixels; and
judging the initial depth range data by a judgment module, so as to obtain the depth range data.
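The statistics-then-judgment flow of claim 20 can be sketched as follows. The treatment of zero as an invalid-pixel marker and the sensor limits used by the judgment module are invented for illustration; the patent leaves the judgment criteria open.

```python
def initial_depth_range(depth_image):
    """Count all pixel depth values and return (min, max) as the initial range."""
    valid = [d for d in depth_image if d > 0]  # assume 0 marks an invalid pixel
    return (min(valid), max(valid)) if valid else None

def judge_range(range_pair, sensor_min=200, sensor_max=10000):
    """Judgment module: clamp the statistical range to plausible sensor limits,
    falling back to the full sensor range when the statistics are degenerate."""
    if range_pair is None:
        return (sensor_min, sensor_max)
    lo, hi = range_pair
    lo = max(lo, sensor_min)
    hi = min(hi, sensor_max)
    return (lo, hi) if lo < hi else (sensor_min, sensor_max)
```

The judgment step protects the later mapping stages: a range derived from a frame of noise (or a single flat surface) would otherwise stretch the whole color scale over a meaningless interval.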
21. The 3D pseudo-color mapping method according to claim 15 or 20, wherein the step (b) further comprises the steps of:
correcting the initial image data according to the depth range data, so as to obtain corrected initial image data;
mapping the depth values of the pixels of the corrected initial image data respectively to a depth mapping value of the corresponding pixel according to the depth range data, so as to obtain depth mapping data; and
constructing a corresponding HSV value for each pixel according to the depth mapping data, so as to obtain the pseudo-color image data.
22. The 3D pseudo-color mapping method according to claim 21, wherein the step of mapping the depth values of the pixels of the corrected initial image data respectively to the depth mapping values of the corresponding pixels according to the depth range data, so as to obtain the depth mapping data, further comprises the steps of:
based on the depth range data, mapping the depth values of the pixels of the corrected initial image data respectively to a tone mapping value of the corresponding pixel by a tone mapping module, so as to obtain tone mapping data;
based on the depth range data, mapping the depth values of the pixels of the corrected initial image data respectively to a brightness mapping value of the corresponding pixel by a brightness mapping module, so as to obtain brightness mapping data; and
integrating the tone mapping data and the brightness mapping data by an integration module, so as to obtain the depth mapping data.
23. The 3D pseudo-color mapping method according to claim 21, wherein the step of mapping the depth values of the pixels of the corrected initial image data respectively to the depth mapping values of the corresponding pixels according to the depth range data, so as to obtain the depth mapping data, further comprises the step of: based on the depth range data, mapping the depth values of the pixels of the corrected initial image data respectively to a tone mapping value of the corresponding pixel by a tone mapping module, so as to obtain tone mapping data, wherein the tone mapping data forms the depth mapping data.
24. A method for displaying a pseudo-color image by an electronic device, characterized in that the display method of the pseudo-color image comprises the following steps:
(A) acquiring initial image data about a measured target by an acquisition device of an electronic device;
(B) receiving the initial image data from the acquisition device by a receiving module;
(C) obtaining depth range data by analyzing the initial image data with an analysis unit;
(D) processing the initial image data by a processing unit based on the depth range data obtained by the analysis unit, so as to obtain pseudo-color image data of the measured target; and
(E) displaying the pseudo-color image data, by an output module communicatively connected to an output device of the electronic device, so as to form a pseudo-color image of the measured target on the output device.
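The five steps of claim 24 can be put end to end as a small pipeline sketch. Every function below is an illustrative stand-in for the claimed device, module, or unit; the acquisition and output sides are passed in as callables since the patent does not prescribe their interfaces.

```python
def analyse(depth_image):
    """Analysis unit (step C): derive depth range data from the frame."""
    valid = [d for d in depth_image if d > 0]
    return (min(valid), max(valid))

def process(depth_image, depth_range):
    """Processing unit (step D): hue-only pseudo-color from the normalized depth."""
    lo, hi = depth_range
    span = max(hi - lo, 1e-9)
    return [min(max((d - lo) / span, 0.0), 1.0) * 240.0 for d in depth_image]

def display_pipeline(acquire, output):
    """Run steps (A)/(B) through (E) once and return the pseudo-color data."""
    depth_image = acquire()                      # steps (A)/(B): capture + receive
    depth_range = analyse(depth_image)           # step (C): analysis unit
    pseudo = process(depth_image, depth_range)   # step (D): processing unit
    output(pseudo)                               # step (E): output module
    return pseudo
```

Keeping the analysis unit separate from the processing unit, as the claim structure does, means the same per-frame depth range can also feed the correction step of claims 21–23 without recomputation.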
CN201711182607.6A 2017-11-23 2017-11-23 3D pseudo color mapping system and mapping method thereof, and application of 3D pseudo color mapping system Active CN109829954B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711182607.6A CN109829954B (en) 2017-11-23 2017-11-23 3D pseudo color mapping system and mapping method thereof, and application of 3D pseudo color mapping system

Publications (2)

Publication Number Publication Date
CN109829954A true CN109829954A (en) 2019-05-31
CN109829954B CN109829954B (en) 2021-04-23

Family

ID=66858495

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711182607.6A Active CN109829954B (en) 2017-11-23 2017-11-23 3D pseudo color mapping system and mapping method thereof, and application of 3D pseudo color mapping system

Country Status (1)

Country Link
CN (1) CN109829954B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110251117A (en) * 2019-07-04 2019-09-20 Electroencephalogram time-frequency information visualization method

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101546426A (en) * 2009-04-30 2009-09-30 Shanghai Jiao Tong University Low-light-level image processing method based on regional enhancement and regional extraction
CN103035026A (en) * 2012-11-24 2013-04-10 Zhejiang University Maximum intensity projection method based on enhanced visual perception
CN103808804A (en) * 2014-03-06 2014-05-21 Beijing Institute of Technology Method for rapid pseudo-color mapping and imaging with ultrasonic microscopy
CN104748729A (en) * 2015-03-19 2015-07-01 Institute of Semiconductors of Chinese Academy of Sciences Optimized display device and display method for range-gated super-resolution three-dimensional imaging distance maps
CN105225256A (en) * 2015-06-10 2016-01-06 Xi'an Jiaotong University Display method for high gray-depth images



Similar Documents

Publication Publication Date Title
KR102336064B1 (en) Imaging apparatus and imaging method thereof, image processing apparatus and image processing method thereof, and program
EP2940658B1 (en) Information processing device, information processing system, and information processing method
CN104272717B (en) For performing projected image to the method and system of the alignment of infrared ray (IR) radiation information detected
US8786679B2 (en) Imaging device, 3D modeling data creation method, and computer-readable recording medium storing programs
US6839088B2 (en) System and method for estimating physical properties of objects and illuminants in a scene using modulated light emission
CN110168562B (en) Depth-based control method, depth-based control device and electronic device
CN108337433A (en) Photographing method, mobile terminal and computer-readable storage medium
US11818513B2 (en) Self-adaptive adjustment method and adjustment system for brightness of projection apparatus
WO2018049849A9 (en) Light-splitting combined image collection device
CN108182923A (en) Method for displaying an image on a display device, display device and electronic equipment
CN106851122A (en) Calibration method and device for auto-exposure parameters based on a dual-camera system
CN105933617A (en) High dynamic range image fusion method for overcoming the influence of dynamic scenes
CN111345029A (en) Target tracking method and device, movable platform and storage medium
US6829383B1 (en) Stochastic adjustment of differently-illuminated images
CN112361990B (en) Laser pattern extraction method and device, laser measurement equipment and system
CN109565577B (en) Color correction device, color correction system and color correction method
CN110533709A (en) Depth image acquisition method, apparatus and system, image capture device
CN107533765A (en) Device, method and system for tracking an optical object
WO2018206691A1 (en) Method for controlling a display parameter of a mobile device and computer program product
CN108140362A (en) Display method, display device, electronic equipment and computer program product
CN109905691A (en) Depth image acquisition device and depth image acquisition system and its image processing method
CN114627562A (en) Multispectral face recognition module and method
CN106683133A (en) Method for acquiring target depth image
CN109829954A (en) 3D pseudo color mapping system and mapping method thereof, and application of 3D pseudo color mapping system
AU2017218753A1 (en) Imaging device with white balance compensation and related systems and methods

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant