CN109672876A - Depth map processing device and depth image processing unit - Google Patents
Depth map processing device and depth image processing unit
- Publication number
- CN109672876A (application CN201710966971.5A)
- Authority
- CN
- China
- Prior art keywords
- image
- processing unit
- sensor
- imaging sensor
- depth
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Landscapes
- Studio Devices (AREA)
- Image Processing (AREA)
Abstract
The present invention provides a depth map processing device and a depth image processing unit. The depth map processing device includes an image acquisition unit, a depth image processing unit, and an application processing unit. The depth image processing unit is connected to the image acquisition unit and receives image sequences from it; the depth image processing unit is also connected to the application processing unit, processes the received image sequences to generate a depth map, and sends the depth map to the application processing unit. The invention achieves dual control of the image acquisition unit by the depth image processing unit and the application processing unit: each can control the image acquisition unit independently, and the two can also cooperate, with the application processing unit directing the image acquisition unit to acquire image data and the depth image processing unit performing depth map processing on that data. This effectively increases the efficiency of depth map acquisition and improves system performance.
Description
Technical field
The present invention relates to the field of image processing, and more particularly to depth map technology; specifically, it concerns a depth map processing device and a depth image processing unit.
Background technique
Currently, when a camera acquires image data, the initiation of acquisition instructions, the setting of acquisition parameters, and the depth map processing of the captured images are all completed by the AP (application processor). The AP and the camera are directly connected by data lines and control lines, with unidirectional control and serial operation. Because every operation must be initiated by the AP, the efficiency of image acquisition and processing is reduced, and system performance suffers.
Summary of the invention
To solve the above and other potential technical problems, embodiments of the present invention provide a depth map processing device. The depth map processing device includes an image acquisition unit, a depth image processing unit, and an application processing unit. The depth image processing unit is connected to the image acquisition unit and receives image sequences from it; the depth image processing unit is also connected to the application processing unit, processes the received image sequences to generate a depth map, and sends the depth map to the application processing unit.
In one embodiment of the invention, the depth image processing unit sends the depth map and the image sequence received from the image acquisition unit to the application processing unit simultaneously.
In one embodiment of the invention, the image acquisition unit includes one image sensor; the image sensor is an RGB image sensor, a MONO image sensor, a structured-light image sensor, or a TOF image sensor.
In one embodiment of the invention, the application processing unit sends a first control instruction to the image acquisition unit.
In one embodiment of the invention, the application processing unit sends a first control instruction to the depth image processing unit; the depth image processing unit receives the first control instruction from the application processing unit and generates a second control instruction according to the first control instruction; the depth image processing unit sends the second control instruction to the image acquisition unit.
In one embodiment of the invention, the image acquisition unit includes a first image sensor and a second image sensor; the depth image processing unit is connected to the first image sensor and the second image sensor respectively and receives image sequences from both simultaneously; the depth image processing unit processes the two received image sequences to generate a depth map and sends the depth map to the application processing unit.
In one embodiment of the invention, the application processing unit sends a first control instruction to the first image sensor and the second image sensor respectively.
In one embodiment of the invention, the application processing unit simultaneously receives image sequences from the first image sensor and the second image sensor.
In one embodiment of the invention, the application processing unit sends a first control instruction to the depth image processing unit; the depth image processing unit receives the first control instruction from the application processing unit and generates a second control instruction according to the first control instruction; the depth image processing unit sends the second control instruction to the first image sensor and the second image sensor respectively.
In one embodiment of the invention, the first image sensor and the second image sensor are both RGB image sensors, MONO image sensors, or structured-light image sensors; or one of the first image sensor and the second image sensor is an RGB image sensor and the other is a MONO image sensor, a structured-light image sensor, or a TOF image sensor.
In one embodiment of the invention, the image acquisition unit includes a first image sensor and a second image sensor; the application processing unit is connected to the first image sensor and receives image sequences from it; the depth image processing unit is connected to the second image sensor and receives image sequences from it; the depth image processing unit processes the received image sequences to generate a depth map and sends the depth map to the application processing unit.
In one embodiment of the invention, the application processing unit processes the image sequence received from the first image sensor and sends it to the depth image processing unit; the depth image processing unit processes the image sequence received from the application processing unit together with the image sequence received from the second image sensor, generates a depth map, and sends the depth map to the application processing unit.
In one embodiment of the invention, the application processing unit sends a first control instruction to the first image sensor and the second image sensor respectively.
In one embodiment of the invention, the application processing unit sends a first control instruction to the depth image processing unit; the depth image processing unit receives the first control instruction from the application processing unit and generates a second control instruction according to the first control instruction; the depth image processing unit sends the second control instruction to the first image sensor and the second image sensor respectively.
In one embodiment of the invention, the first image sensor is an RGB image sensor or a MONO image sensor; the second image sensor is an RGB image sensor, a MONO image sensor, a structured-light image sensor, or a TOF image sensor.
In one embodiment of the invention, the image acquisition unit includes a first image sensor, a second image sensor, and a third image sensor; the first image sensor and the second image sensor are both RGB image sensors, MONO image sensors, or structured-light image sensors, or one of the first image sensor and the second image sensor is an RGB image sensor and the other is a MONO image sensor; the third image sensor is a structured-light image sensor or a TOF image sensor.
In one embodiment of the invention, the depth image processing unit is connected to any one of the first image sensor, the second image sensor, and the third image sensor and obtains an image sequence from the connected image sensor; the application processing unit is connected to the remaining two image sensors and obtains image sequences from both simultaneously.
In one embodiment of the invention, the application processing unit processes the image sequences it receives and sends them to the depth image processing unit; the depth image processing unit processes the image sequences received from the application processing unit together with the image sequence received from its connected image sensor, generates a depth map, and sends the depth map to the application processing unit.
In one embodiment of the invention, the depth image processing unit is connected to any two of the first image sensor, the second image sensor, and the third image sensor and obtains image sequences from both simultaneously; the application processing unit is connected to the remaining image sensor and obtains an image sequence from it.
In one embodiment of the invention, the application processing unit processes the image sequence it receives and sends it to the depth image processing unit; the depth image processing unit processes the image sequence received from the application processing unit together with the image sequences received from its two connected image sensors, generates a depth map, and sends the depth map to the application processing unit.
In one embodiment of the invention, the depth image processing unit is connected to the first image sensor, the second image sensor, and the third image sensor respectively and receives image sequences from all three simultaneously; the depth image processing unit processes the three received image sequences to generate a depth map and sends the depth map to the application processing unit.
In one embodiment of the invention, the application processing unit sends a first control instruction to the depth image processing unit; the depth image processing unit receives the first control instruction from the application processing unit and generates a second control instruction according to the first control instruction; the depth image processing unit sends the second control instruction to the first image sensor, the second image sensor, and the third image sensor respectively.
In one embodiment of the invention, the application processing unit sends a first control instruction to the first image sensor, the second image sensor, and the third image sensor respectively.
Embodiments of the present invention also provide a depth image processing unit, including an image processor, an image connecting interface, and an application connecting interface. The image connecting interface is connected to the image acquisition unit and receives image sequences from it; the image processor processes the received image sequences to generate a depth map; the application connecting interface is connected to the application processing unit and sends the depth map to the application processing unit.
In one embodiment of the invention, the application connecting interface sends the depth map and the image sequence received from the image acquisition unit to the application processing unit simultaneously.
In one embodiment of the invention, there is a single image connecting interface, and the image acquisition unit includes one image sensor; the image sensor is an RGB image sensor, a MONO image sensor, a structured-light image sensor, or a TOF image sensor.
In one embodiment of the invention, the application processing unit sends a first control instruction to the application connecting interface.
In one embodiment of the invention, the application processing unit sends a first control instruction to the application connecting interface; the image processor generates a second control instruction according to the first control instruction and sends the second control instruction to the image connecting interface.
In one embodiment of the invention, the image connecting interface includes a first image connecting interface and a second image connecting interface, and the image acquisition unit includes a first image sensor and a second image sensor; the first image connecting interface is connected to the first image sensor and receives image sequences from it, and the second image connecting interface is connected to the second image sensor and receives image sequences from it; the image processor processes the two image sequences received by the first image connecting interface and the second image connecting interface to generate a depth map.
In one embodiment of the invention, the application processing unit sends a first control instruction to the application connecting interface; the image processor generates a second control instruction according to the first control instruction and sends the second control instruction to the first image connecting interface and the second image connecting interface respectively.
In one embodiment of the invention, the image connecting interface includes a first image connecting interface and a second image connecting interface, and the image acquisition unit includes a first image sensor and a second image sensor; the application processing unit is connected to the first image sensor and receives image sequences from it; the second image connecting interface is connected to the second image sensor and receives image sequences from it; the image processor processes the image sequence received by the second image connecting interface to generate a depth map; the application connecting interface sends the depth map to the application processing unit.
In one embodiment of the invention, the application processing unit processes the image sequence received from the first image sensor and sends it to the application connecting interface; the image processor processes the image sequence received from the application connecting interface together with the image sequence received from the second image connecting interface to generate a depth map; the application connecting interface sends the depth map to the application processing unit.
In one embodiment of the invention, the application connecting interface receives a first control instruction from the application processing unit; the image processor generates a second control instruction according to the first control instruction, and the second image connecting interface sends the second control instruction to the second image sensor.
In one embodiment of the invention, the application connecting interface receives a first control instruction from the application processing unit; the image processor generates a second control instruction according to the first control instruction; the first image connecting interface sends the second control instruction to the first image sensor, and the second image connecting interface sends the second control instruction to the second image sensor.
In one embodiment of the invention, the image connecting interface includes a first image connecting interface, a second image connecting interface, and a third image connecting interface, and the image acquisition unit includes a first image sensor, a second image sensor, and a third image sensor; the first image sensor and the second image sensor are both RGB image sensors, MONO image sensors, or structured-light image sensors, or one of the first image sensor and the second image sensor is an RGB image sensor and the other is a MONO image sensor; the third image sensor is a structured-light image sensor or a TOF image sensor.
In one embodiment of the invention, any one of the first image connecting interface, the second image connecting interface, and the third image connecting interface is connected to any one of the first image sensor, the second image sensor, and the third image sensor and obtains an image sequence from the connected image sensor; the application processing unit is connected to the remaining two image sensors and obtains image sequences from both simultaneously.
In one embodiment of the invention, any two of the first image connecting interface, the second image connecting interface, and the third image connecting interface are connected to any two of the first image sensor, the second image sensor, and the third image sensor and obtain image sequences from the two connected image sensors; the application processing unit is connected to the remaining image sensor and obtains an image sequence from it.
In one embodiment of the invention, the application connecting interface receives image sequences from the application processing unit; the image processor processes the image sequences received from the application connecting interface together with the image sequences received from the image connecting interfaces to generate a depth map; the application connecting interface sends the depth map to the application processing unit.
In one embodiment of the invention, the first image connecting interface, the second image connecting interface, and the third image connecting interface are connected to the first image sensor, the second image sensor, and the third image sensor respectively, each obtaining an image sequence from its corresponding sensor; the image processor processes the three image sequences received from the first, second, and third image connecting interfaces to generate a depth map; the application connecting interface sends the depth map to the application processing unit.
In one embodiment of the invention, the application processing unit sends a first control instruction to the application connecting interface; the image processor generates a second control instruction according to the first control instruction and sends the second control instruction to at least one of the first image connecting interface, the second image connecting interface, and the third image connecting interface.
As described above, the depth map processing device and the depth image processing unit of the present invention have the following beneficial effects. In the present invention, dual control of the image acquisition unit is achieved through the depth image processing unit and the application processing unit: each can control the image acquisition unit independently, and the two can also cooperate, with the application processing unit directing the image acquisition unit to acquire image data and the depth image processing unit performing depth map processing on that data. This effectively increases the efficiency of depth map acquisition and improves system performance.
Detailed description of the invention
To describe the technical solutions in the embodiments of the present invention more clearly, make required in being described below to embodiment
Attached drawing is briefly described, it should be apparent that, drawings in the following description are only some embodiments of the invention, for
For those of ordinary skill in the art, without creative efforts, it can also be obtained according to these attached drawings other
Attached drawing.
Fig. 1 shows the overall structural diagram of the depth map processing device of the present invention.
Figs. 2 to 3 are signal-control schematics of the depth map processing device of the present invention when one image sensor is connected.
Figs. 4 to 8 are signal-control schematics of the depth map processing device of the present invention when two image sensors are connected.
Figs. 9 to 17 are signal-control schematics of the depth map processing device of the present invention when three image sensors are connected.
Reference numerals
100 depth map processing device
110 depth image processing unit
120 image acquisition unit
121 first image sensor
122 second image sensor
123 third image sensor
130 application processing unit
Specific embodiment
Illustrate embodiments of the present invention below by way of specific specific example, those skilled in the art can be by this specification
Other advantages and efficacy of the present invention can be easily understood for disclosed content.The present invention can also pass through in addition different specific realities
The mode of applying is embodied or practiced, the various details in this specification can also based on different viewpoints and application, without departing from
Various modifications or alterations are carried out under spirit of the invention.It should be noted that in the absence of conflict, following embodiment and implementation
Feature in example can be combined with each other.
Please refer to Figs. 1 to 17. It should be noted that the structures, proportions, and sizes depicted in the drawings of this specification are intended only to complement the disclosed content for the understanding of those skilled in the art, and are not meant to limit the conditions under which the invention can be implemented; they therefore carry no essential technical significance. Any modification of structure, change of proportional relationship, or adjustment of size that does not affect the effects and objectives achievable by the invention shall still fall within the scope covered by the disclosed technical content. Meanwhile, terms such as "upper", "lower", "left", "right", "middle", and "a/one" cited in this specification are used merely for clarity of description rather than to limit the implementable scope of the invention; changes or adjustments of their relative relationships, without substantial change to the technical content, shall also be regarded as within the implementable scope of the invention.
The purpose of this embodiment is to provide a depth map processing device and a depth image processing unit, to solve the prior-art problem that all depth map processing operations must be initiated and handled by the AP (application processing unit), which reduces the efficiency of image acquisition and processing and degrades system performance. The principles and implementations of the depth map processing device and the depth image processing unit of the present invention are described in detail below, so that those skilled in the art can understand them without creative effort.
Fig. 1 is a schematic diagram of an embodiment of the depth map processing device of the present invention. Specifically, as shown in Fig. 1, this embodiment provides a depth map processing device 100, which is applied in an intelligent electronic device, for example a smartphone, tablet computer, game console, or any other electronic device with photo or camera functionality.
Specifically, the depth map processing device 100 includes an image acquisition unit 120, a depth image processing unit 110, and an application processing unit 130 (AP). The image acquisition unit 120 is an electronic component with image/video acquisition capability, such as a camera module. The image acquisition unit 120 can receive external control instructions and, according to them, collect or set image acquisition parameters, adjust its internal optical components, and capture images/video. In addition, the image acquisition unit 120 can generate and output related data according to external control instructions. The depth image processing unit 110 is connected to the image acquisition unit 120 and receives image sequences from it; the depth image processing unit 110 is also connected to the application processing unit 130, processes the received image sequences to generate a depth map, and sends the depth map to the application processing unit 130.
In some embodiments, the depth image processing unit 110 is an electronic component that can control the image acquisition unit 120 and process control instructions and image/video data. Specifically, the depth image processing unit 110 has an instruction-processing function and a data-processing function. "Instruction-processing function" refers to the ability to handle control instructions for controlling the image acquisition unit 120, or to generate control instructions that can control the image acquisition unit 120; "data-processing function" refers to the ability to process the data generated by the image acquisition unit 120, or to generate data related to the data the image acquisition unit 120 produces. The application processing unit 130 is an electronic component that can control both the image acquisition unit 120 and the depth image processing unit 110.
In some embodiments, the depth image processing unit 110 includes a digital signal processor and a cache module. The digital signal processor may be any hardware module with digital signal processing capability, such as a DSP (Digital Signal Processor). The cache module is a hardware storage module (such as flash, RAM, ROM, or cache) that provides storage for the various data of the depth image processing unit 110. The application processing unit 130 includes a main control module and a storage module; the main control module may be any hardware module with computing capability (such as a CPU or DSP), and the storage module is a hardware storage module that provides data storage for the application processing unit 130. The application processing unit 130 may be an application processor (Application Processor, AP), a central processing unit (Central Processing Unit, CPU), or a system on chip (System on Chip, SoC).
In some embodiments, a control instruction generated or issued by the application processing unit 130 may be called a "first control instruction"; it includes instructions for controlling the camera together with the relevant acquisition parameters, as well as instructions for controlling the depth image processing unit 110 together with the relevant information. A control instruction issued by the depth image processing unit 110 may be regarded as a "second control instruction", including instructions for controlling the camera together with the relevant acquisition parameters. In other words, the second control instruction may be generated according to the first control instruction. The second control instruction may also contain part or all of the first control instruction, instructions generated according to the first control instruction, instructions unrelated to the first control instruction, or various combinations of the above. For example, the depth image processing unit 110 may repeat, copy, add to, modify, replace, or delete parts of the first control instruction to generate the second control instruction, or it may generate the second control instruction independently of any first control instruction.
Several schemes can be used to obtain a depth map in this embodiment: 1) the image acquisition unit 120 includes two ordinary image sensors (RGB sensors or Mono sensors); 2) the image acquisition unit 120 includes one TOF sensor; 3) the image acquisition unit 120 includes one or two structured-light sensors; 4) the image acquisition unit 120 includes two ordinary sensors plus a TOF sensor or a structured-light sensor.
Both TOF sensors and structured-light sensors require an additional light-emitting source.
The image sensors serve as the image input sources: they transmit image sequences to the depth image processing unit 110, which finally computes an exportable depth map through certain computational algorithms. That is, in this embodiment, the computation of the depth map is completed in the depth image processing unit 110.
The process by which the depth image processing unit 110 computes the depth map is as follows. The input to the depth image processing unit 110 consists of digital images, including dual-channel digital image sequences and single-channel image sequences. The input sources include RGB sensors, Mono sensors, IR sensors, TOF sensors, and so on. The digital images transferred to the depth image processing unit 110 are preferably raw images. For an RGB image, the most common raw format is the Bayer pattern, though other arrangements are possible; to facilitate depth map generation, these arrangements are generally converted into conventional RGB or YUV images. If the raw image is produced by a monochrome sensor, it can be used directly without conversion. The general algorithm first computes a disparity map from the input images; disparity can be simply regarded as the inverse of depth.
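For a rectified stereo pair, this inverse relation is the standard triangulation identity (a well-known result, stated here for clarity rather than taken from the original text), where $Z$ is depth, $f$ the focal length, $B$ the baseline between the two sensors, and $d$ the disparity:

$$Z = \frac{f \cdot B}{d}$$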
The basic steps for generating a disparity map from two ordinary image sensors are as follows (with dual inputs, the images must first be epipolar-rectified):
1) Preprocessing: some preprocessing is applied to the input images. The specific processing varies with the use case; common steps include denoising, demosaicing, scaling, and color space conversion.
2) Matching cost computation: common cost measures include windowed methods such as SSD and SAD, whose results depend heavily on the pixel values themselves. In dual-camera and multi-camera systems, the exposure (gain) and offset (bias) of each camera differ, so such systems generally use costs that are insensitive to exposure and offset, such as the census transform. Cost matching is performed along the epipolar direction, finally producing one matching cost map per depth level.
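The following sketch shows one plausible implementation of a census-based matching cost in Python/NumPy; the window size and wrap-around edge handling are simplifying assumptions, not choices made by the patent.

```python
import numpy as np

def census_transform(img, w=3):
    """Per-pixel bit vector: each bit records whether a window neighbour is
    darker than the centre. Insensitive to per-camera gain and bias."""
    h = w // 2
    H, W = img.shape
    bits = np.zeros((H, W, w * w - 1), dtype=bool)
    k = 0
    for dy in range(-h, h + 1):
        for dx in range(-h, h + 1):
            if dy == 0 and dx == 0:
                continue
            # np.roll wraps at the borders; acceptable for a sketch.
            neighbour = np.roll(np.roll(img, dy, axis=0), dx, axis=1)
            bits[..., k] = neighbour < img
            k += 1
    return bits

def census_cost_volume(left, right, max_disp):
    """Hamming-distance matching cost along the horizontal epipolar
    direction: one cost slice per disparity (depth) level."""
    cl, cr = census_transform(left), census_transform(right)
    H, W, nbits = cl.shape
    cost = np.full((max_disp, H, W), nbits, dtype=np.int32)
    for d in range(max_disp):
        # left pixel x is compared against right pixel x - d
        cost[d, :, d:] = np.count_nonzero(cl[:, d:] != cr[:, :W - d], axis=-1)
    return cost
```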
3) Cost aggregation and disparity computation: the matching costs are aggregated, using either a fixed window or a window of varying size. After aggregation, the disparity with the minimum cost is selected for each pixel. This step yields a preliminary disparity map.
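As an illustration, fixed-window aggregation followed by winner-takes-all selection might look like this (the box filter size is an arbitrary assumption):

```python
import numpy as np
from scipy.ndimage import uniform_filter

def wta_disparity(cost, window=5):
    """Aggregate each disparity slice of the cost volume with a fixed box
    window, then pick the minimum-cost disparity per pixel."""
    aggregated = np.stack(
        [uniform_filter(slice_.astype(np.float32), size=window)
         for slice_ in cost])
    return np.argmin(aggregated, axis=0)   # preliminary disparity map
```

Applied to the census cost volume sketched above, this yields the preliminary disparity map the text describes.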
4) Post-processing: this step optimizes the disparity map by deleting partially erroneous depths, filling holes, and optimizing the disparity boundaries according to the edges of objects in the image. After the above procedure, the depth map generally has few depth levels and appears relatively coarse. For some applications, such as robot navigation and target tracking, fewer disparity levels are sufficient; but applications such as image-based rendering place higher demands on disparity resolution, which requires refining the disparity map produced by the preceding steps, usually by sub-pixel disparity estimation.
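One common refinement fits a parabola through the costs at d-1, d, and d+1; it is offered here as a sketch of sub-pixel disparity estimation, not as the patent's prescribed method:

```python
import numpy as np

def subpixel_refine(cost, disp):
    """Parabola fit through the aggregated costs at (d-1, d, d+1) to place
    the cost minimum between integer disparity levels."""
    D, H, W = cost.shape
    d = np.clip(disp, 1, D - 2)
    ys, xs = np.mgrid[0:H, 0:W]
    c0 = cost[d - 1, ys, xs].astype(np.float64)
    c1 = cost[d,     ys, xs].astype(np.float64)
    c2 = cost[d + 1, ys, xs].astype(np.float64)
    denom = c0 - 2.0 * c1 + c2
    safe = np.where(np.abs(denom) > 1e-9, denom, 1.0)
    offset = np.where(np.abs(denom) > 1e-9, 0.5 * (c0 - c2) / safe, 0.0)
    return d + np.clip(offset, -0.5, 0.5)
```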
In this embodiment, the depth image processing unit 110 (Pre-ISP) mainly completes the computation of the depth map: it generates the depth map and receives the image sequences transmitted by all or some of the image sensors. The AP (application processor) primarily receives the generated depth map for further practical applications; the AP may also control the image sensors and receive the image sequences transmitted by some of them. In this embodiment, the depth image processing unit 110 sends the depth map and the image sequence received from the image acquisition unit 120 to the application processing unit 130 simultaneously.
Several cases in which the depth map processing device 100 contains different numbers of image sensors are described in detail below.
In one embodiment, as shown in Figs. 2 and 3, the image acquisition unit 120 includes one image sensor; the image sensor is an RGB image sensor, a MONO image sensor, a structured-light image sensor, or a TOF image sensor. The depth image processing unit 110 receives the image sequence from the image sensor and controls the sensor's 3A (auto exposure, auto focus, auto white balance). After a series of processing steps on the image sequence, the depth image processing unit 110 generates a depth map and transmits it to the application processing unit 130.
In one embodiment, as shown in Fig. 2, the application processing unit 130 sends the first control instruction to the image acquisition unit 120; or, as shown in Fig. 3, the application processing unit 130 sends the first control instruction to the depth image processing unit 110; the depth image processing unit 110 receives the first control instruction from the application processing unit 130 and generates a second control instruction according to it; the depth image processing unit 110 sends the second control instruction to the image acquisition unit 120.
In one embodiment, as shown in Figs. 4 and 5, the image acquisition unit 120 includes a first image sensor 121 and a second image sensor 122; the depth image processing unit 110 is connected to the first image sensor 121 and the second image sensor 122 respectively and receives image sequences from both simultaneously; the depth image processing unit 110 processes the two received image sequences to generate a depth map and sends the depth map to the application processing unit 130.
In one embodiment, as shown in Fig. 4, the application processing unit 130 sends the first control instruction to the first image sensor 121 and the second image sensor 122 respectively.
Here, as shown in Fig. 4, the application processing unit 130 can receive image sequences from the first image sensor 121 and the second image sensor 122 simultaneously.
That is, the depth image processing unit 110 simultaneously receives the image sequences of the first image sensor 121 and the second image sensor 122, while the application processing unit 130 performs 3A control of the first image sensor 121 and the second image sensor 122. After a series of processing steps on the two image sequences, the depth image processing unit 110 generates a depth map and transfers it to the application processing unit 130, or transmits the data of the first image sensor 121 and the second image sensor 122 together with the depth map to the application processing unit 130.
In one embodiment, as shown in Fig. 5, the application processing unit 130 sends the first control instruction to the depth image processing unit 110; the depth image processing unit 110 receives the first control instruction from the application processing unit 130 and generates a second control instruction according to it; the depth image processing unit 110 sends the second control instruction to the first image sensor 121 and the second image sensor 122 respectively.
As shown in Fig. 5, the depth image processing unit 110 simultaneously receives the image sequences of the first image sensor 121 and the second image sensor 122, and also performs 3A control of both sensors. After a series of processing steps on the two image sequences, the depth image processing unit 110 generates a depth map and transfers it to the application processing unit 130, or transmits the data of the first image sensor 121 and the second image sensor 122 together with the depth map to the application processing unit 130.
In one embodiment, the first image sensor 121 and the second image sensor 122 are both RGB image sensors, MONO image sensors, or structured-light image sensors; or one of the first image sensor 121 and the second image sensor 122 is an RGB image sensor and the other is a MONO image sensor, a structured-light image sensor, or a TOF image sensor.
In one embodiment, as shown in Fig. 6, the image acquisition unit 120 includes a first image sensor 121 and a second image sensor 122; the first image sensor 121 is an RGB image sensor or a MONO image sensor, and the second image sensor 122 is an RGB image sensor, a MONO image sensor, a structured-light image sensor, or a TOF image sensor. The application processing unit 130 is connected to the first image sensor 121 and receives image sequences from it; the depth image processing unit 110 is connected to the second image sensor 122 and receives image sequences from it; the depth image processing unit 110 processes the received image sequences to generate a depth map and sends the depth map to the application processing unit 130.
In one embodiment, as shown in Fig. 7, the application processing unit 130 processes the image sequence received from the first image sensor 121 and sends it to the depth image processing unit 110; the depth image processing unit 110 processes the image sequence received from the application processing unit 130 together with the image sequence received from the second image sensor 122, generates a depth map, and sends the depth map to the application processing unit 130.
In one embodiment, as shown in Fig. 6, the application processing unit 130 sends the first control instruction to the first image sensor 121 and the second image sensor 122 respectively.
Or, as shown in Fig. 7, the application processing unit 130 sends the first control instruction to the first image sensor 121 and to the depth image processing unit 110; the depth image processing unit 110 receives the first control instruction from the application processing unit 130 and generates a second control instruction according to it; the depth image processing unit 110 sends the second control instruction to the second image sensor 122.
Or, as shown in Fig. 8, the application processing unit 130 sends the first control instruction to the depth image processing unit 110; the depth image processing unit 110 receives the first control instruction from the application processing unit 130 and generates a second control instruction according to it; the depth image processing unit 110 sends the second control instruction to the first image sensor 121 and the second image sensor 122 respectively.
As shown in Fig. 7, the depth image processing unit 110 receives the image sequence of the second image sensor 122 and controls its 3A; the application processing unit 130 receives the image sequence of the first image sensor 121 and controls its 3A. After a series of processing steps on the image sequence of the second image sensor 122, the depth image processing unit 110 generates a depth map and transfers it to the application processing unit 130. The application processing unit 130 can combine the image sequence of the first image sensor 121 with the depth map generated by the depth image processing unit 110 to further optimize the depth map.
In one embodiment, as shown in Figs. 9 to 17, the image acquisition unit 120 includes a first image sensor 121, a second image sensor 122, and a third image sensor 123; the first image sensor 121 and the second image sensor 122 are both RGB image sensors, MONO image sensors, or structured-light image sensors, or one of the first image sensor 121 and the second image sensor 122 is an RGB image sensor and the other is a MONO image sensor; the third image sensor 123 is a structured-light image sensor or a TOF image sensor.
In one embodiment, as shown in Figs. 9 to 11, the depth image processing unit 110 is connected to any one of the first image sensor 121, the second image sensor 122, and the third image sensor 123 and obtains an image sequence from the connected image sensor; the application processing unit 130 is connected to the remaining two image sensors and obtains image sequences from both simultaneously.
In one such embodiment, the application processing unit 130 processes the image sequences it receives and sends them to the depth image processing unit 110; the depth image processing unit 110 processes the image sequences received from the application processing unit 130 together with the image sequence received from its connected image sensor, generates a depth map, and sends the depth map to the application processing unit 130.
In one embodiment, as shown in Figs. 12 to 14, the depth image processing unit 110 is connected to any two of the first image sensor 121, the second image sensor 122, and the third image sensor 123 and obtains image sequences from both simultaneously; the application processing unit 130 is connected to the remaining image sensor and obtains an image sequence from it.
In one such embodiment, the application processing unit 130 processes the image sequence it receives and sends it to the depth image processing unit 110; the depth image processing unit 110 processes the image sequence received from the application processing unit 130 together with the image sequences received from its two connected image sensors, generates a depth map, and sends the depth map to the application processing unit 130.
As shown in Fig. 14, the depth image processing unit 110 receives the image sequences of the first image sensor 121 and the second image sensor 122 and, after a series of processing steps, generates a depth map and transmits it to the application processing unit 130, or transmits the data of the first image sensor 121 and the second image sensor 122 together with the depth map to the application processing unit 130. The application processing unit 130 receives the image sequence of the third image sensor 123 and obtains a sparser depth map; combined with the depth map generated by the depth image processing unit 110, a more complete depth map can be produced.
In one embodiment, as shown in Figs. 15 to 17, the depth image processing unit 110 is connected to the first image sensor 121, the second image sensor 122, and the third image sensor 123 respectively and receives image sequences from all three simultaneously; the depth image processing unit 110 processes the three received image sequences to generate a depth map and sends the depth map to the application processing unit 130.
In one such embodiment, as shown in Fig. 15, the application processing unit 130 sends the first control instruction to the depth image processing unit 110; the depth image processing unit 110 receives the first control instruction from the application processing unit 130 and generates a second control instruction according to it; the depth image processing unit 110 sends the second control instruction to the first image sensor 121, the second image sensor 122, and the third image sensor 123 respectively.
Or, in one embodiment, as shown in Fig. 16, the application processing unit 130 sends the first control instruction to the first image sensor 121, the second image sensor 122, and the third image sensor 123 respectively.
Alternatively, the application processing unit 130 sends the first control instruction to any one image sensor and to the depth image processing unit 110; the depth image processing unit 110 receives the first control instruction from the application processing unit 130 and generates a second control instruction according to it; the depth image processing unit 110 sends the second control instruction to the remaining two image sensors.
Or the application processing unit 130 sends the first control instruction to any two image sensors and to the depth image processing unit 110; the depth image processing unit 110 receives the first control instruction from the application processing unit 130 and generates a second control instruction according to it; the depth image processing unit 110 sends the second control instruction to the remaining image sensor.
Or, as shown in Fig. 17, the application processing unit 130 sends the first control instruction to the depth image processing unit 110 as well as to the second image sensor 122 and the third image sensor 123; the depth image processing unit 110 receives the first control instruction from the application processing unit 130 and generates a second control instruction according to it; the depth image processing unit 110 sends the second control instruction to the first image sensor 121.
Embodiments of the present invention also provide a depth image processing unit 110, including an image processor, an image connecting interface, and an application connecting interface. The image connecting interface is connected to the image acquisition unit 120 and receives image sequences from it; the image processor processes the received image sequences to generate a depth map; the application connecting interface is connected to the application processing unit 130 and sends the depth map to the application processing unit 130.
In one embodiment, the application connecting interface sends the depth map and the image sequence received from the image acquisition unit 120 to the application processing unit 130 simultaneously.
In one embodiment, there is a single image connecting interface, and the image acquisition unit 120 includes one image sensor; the image sensor is an RGB image sensor, a MONO image sensor, a structured-light image sensor, or a TOF image sensor.
In one embodiment, the application processing unit 130 sends a first control instruction to the application connecting interface. Alternatively, the application processing unit 130 sends the first control instruction to the application connecting interface, and the image processor generates a second control instruction according to the first control instruction and sends the second control instruction to the image connecting interface.
In one embodiment, the image connecting interface includes a first image connecting interface and a second image connecting interface, and the image acquisition unit 120 includes a first image sensor 121 and a second image sensor 122; the first image connecting interface is connected to the first image sensor 121 and receives image sequences from it, and the second image connecting interface is connected to the second image sensor 122 and receives image sequences from it; the image processor processes the two image sequences received by the first image connecting interface and the second image connecting interface to generate a depth map.
In one embodiment, the application processing unit 130 sends a first control instruction to the application connecting interface; the image processor generates a second control instruction according to the first control instruction and sends the second control instruction to the first image connecting interface and the second image connecting interface respectively.
In one embodiment, the image connecting interface includes a first image connecting interface and a second image connecting interface, and the image acquisition unit 120 includes a first image sensor 121 and a second image sensor 122; the application processing unit 130 is connected to the first image sensor 121 and receives image sequences from it; the second image connecting interface is connected to the second image sensor 122 and receives image sequences from it; the image processor processes the image sequence received by the second image connecting interface to generate a depth map; the application connecting interface sends the depth map to the application processing unit 130.
In one embodiment, the application processing unit 130 processes the image sequence received from the first image sensor 121 and sends it to the application connecting interface; the image processor processes the image sequence received from the application connecting interface together with the image sequence received from the second image connecting interface to generate a depth map; the application connecting interface sends the depth map to the application processing unit 130.
In one embodiment, the application connecting interface receives a first control instruction from the application processing unit 130; the image processor generates a second control instruction according to the first control instruction, and the second image connecting interface sends the second control instruction to the second image sensor 122.
In one embodiment, the application connecting interface receives a first control instruction from the application processing unit 130; the image processor generates a second control instruction according to the first control instruction; the first image connecting interface sends the second control instruction to the first image sensor 121, and the second image connecting interface sends the second control instruction to the second image sensor 122.
In one embodiment, the image connecting interface includes a first image connecting interface, a second image connecting interface, and a third image connecting interface, and the image acquisition unit 120 includes a first image sensor 121, a second image sensor 122, and a third image sensor 123; the first image sensor 121 and the second image sensor 122 are both RGB image sensors, MONO image sensors, or structured-light image sensors, or one of the first image sensor 121 and the second image sensor 122 is an RGB image sensor and the other is a MONO image sensor; the third image sensor 123 is a structured-light image sensor or a TOF image sensor.
In one embodiment, any one of the first image connecting interface, the second image connecting interface, and the third image connecting interface is connected to any one of the first image sensor 121, the second image sensor 122, and the third image sensor 123 and obtains an image sequence from the connected image sensor; the application processing unit 130 is connected to the remaining two image sensors and obtains image sequences from both simultaneously.
In one embodiment, any two of the first image connecting interface, the second image connecting interface, and the third image connecting interface are connected to any two of the first image sensor 121, the second image sensor 122, and the third image sensor 123 and obtain image sequences from the two connected image sensors; the application processing unit 130 is connected to the remaining image sensor and obtains an image sequence from it.
In one embodiment, the application connecting interface receives image sequences from the application processing unit 130; the image processor processes the image sequences received from the application connecting interface together with the image sequences received from the image connecting interfaces to generate a depth map; the application connecting interface sends the depth map to the application processing unit 130.
In one embodiment, the first image connecting interface, the second image connecting interface, and the third image connecting interface are connected to the first image sensor 121, the second image sensor 122, and the third image sensor 123 respectively, each obtaining an image sequence from its corresponding sensor; the image processor processes the three image sequences received from the first, second, and third image connecting interfaces to generate a depth map; the application connecting interface sends the depth map to the application processing unit 130.
In one embodiment, the application processing unit 130 sends a first control instruction to the application connecting interface; the image processor generates a second control instruction according to the first control instruction and sends the second control instruction to at least one of the first image connecting interface, the second image connecting interface, and the third image connecting interface.
In conclusion being realized by depth map processing unit and application processing unit 130 to Image Acquisition list in the present invention
The double control of member, the two not only can individually control image acquisition units, but also can cooperate, by applying processing unit
130 are acquired image data to control image acquisition units, then carry out depth map to image data by depth map processing unit
Processing improves system performance to effectively increase the efficiency of depth map acquisition.So the present invention effectively overcome it is existing
Various shortcoming in technology and have high industrial utilization value.
The above embodiments merely illustrate the principles and effects of the present invention and are not intended to limit it. Anyone familiar with this technology may modify or change the above embodiments without departing from the spirit and scope of the invention. Accordingly, all equivalent modifications or changes completed by persons of ordinary skill in the art without departing from the spirit and technical ideas disclosed by the present invention shall be covered by the claims of the present invention.
Claims (40)
1. A depth map processing device, characterized in that the depth map processing device comprises: an image acquisition unit, a depth image processing unit, and an application processing unit;
the depth image processing unit is connected to the image acquisition unit and receives image sequences from the image acquisition unit;
the depth image processing unit is connected to the application processing unit, processes the received image sequences to generate a depth map, and sends the depth map to the application processing unit.
2. The depth map processing unit according to claim 1, characterized in that: the depth image processing unit sends the depth map and the image sequence received from the image acquisition unit to the application processing unit simultaneously.
3. The depth map processing unit according to claim 1 or 2, characterized in that: the image acquisition unit comprises one image sensor; the image sensor is an RGB image sensor, a MONO image sensor, a structured light image sensor or a TOF image sensor.
4. The depth map processing unit according to claim 3, characterized in that: the application processing unit sends a first control instruction to the image acquisition unit.
5. The depth map processing unit according to claim 3, characterized in that: the application processing unit sends a first control instruction to the depth image processing unit; the depth image processing unit receives the first control instruction from the application processing unit and generates a second control instruction according to the first control instruction; the depth image processing unit sends the second control instruction to the image acquisition unit.
6. The depth map processing unit according to claim 1 or 2, characterized in that: the image acquisition unit comprises a first image sensor and a second image sensor; the depth image processing unit is connected to the first image sensor and the second image sensor respectively and receives image sequences from the first image sensor and the second image sensor simultaneously; the depth image processing unit generates a depth map after processing the two received image sequences and sends the depth map to the application processing unit.
7. The depth map processing unit according to claim 6, characterized in that: the application processing unit sends a first control instruction to the first image sensor and the second image sensor respectively.
8. The depth map processing unit according to claim 6, characterized in that: the application processing unit receives image sequences from the first image sensor and the second image sensor simultaneously.
9. The depth map processing unit according to claim 6, characterized in that: the application processing unit sends a first control instruction to the depth image processing unit; the depth image processing unit receives the first control instruction from the application processing unit and generates a second control instruction according to the first control instruction; the depth image processing unit sends the second control instruction to the first image sensor and the second image sensor respectively.
10. The depth map processing unit according to any one of claims 7 to 9, characterized in that: the first image sensor and the second image sensor are both RGB image sensors, both MONO image sensors or both structured light image sensors; or one of the first image sensor and the second image sensor is an RGB image sensor, and the other is a MONO image sensor, a structured light image sensor or a TOF image sensor.
11. The depth map processing unit according to claim 1 or 2, characterized in that: the image acquisition unit comprises a first image sensor and a second image sensor; the application processing unit is connected to the first image sensor and receives an image sequence from the first image sensor; the depth image processing unit is connected to the second image sensor and receives an image sequence from the second image sensor; the depth image processing unit generates a depth map after processing the received image sequence and sends the depth map to the application processing unit.
12. The depth map processing unit according to claim 11, characterized in that: the application processing unit processes the image sequence received from the first image sensor and then sends it to the depth image processing unit; the depth image processing unit generates a depth map after processing the image sequence received from the application processing unit together with the image sequence received from the second image sensor, and sends the depth map to the application processing unit.
13. The depth map processing unit according to claim 11, characterized in that: the application processing unit sends a first control instruction to the first image sensor and the second image sensor respectively.
14. The depth map processing unit according to claim 11, characterized in that: the application processing unit sends a first control instruction to the depth image processing unit; the depth image processing unit receives the first control instruction from the application processing unit and generates a second control instruction according to the first control instruction; the depth image processing unit sends the second control instruction to the first image sensor and the second image sensor respectively.
15. The depth map processing unit according to any one of claims 12 to 14, characterized in that: the first image sensor is an RGB image sensor or a MONO image sensor; the second image sensor is an RGB image sensor, a MONO image sensor, a structured light image sensor or a TOF image sensor.
16. The depth map processing unit according to claim 1 or 2, characterized in that: the image acquisition unit comprises a first image sensor, a second image sensor and a third image sensor; the first image sensor and the second image sensor are both RGB image sensors, both MONO image sensors or both structured light image sensors, or one of the first image sensor and the second image sensor is an RGB image sensor and the other is a MONO image sensor; the third image sensor is a structured light image sensor or a TOF image sensor.
17. The depth map processing unit according to claim 16, characterized in that: the depth image processing unit is connected to any one of the first image sensor, the second image sensor and the third image sensor and acquires an image sequence from that connected image sensor; the application processing unit is connected to the remaining two image sensors and acquires image sequences from the two image sensors simultaneously.
18. The depth map processing unit according to claim 17, characterized in that: the application processing unit processes the received image sequences and then sends them to the depth image processing unit; the depth image processing unit generates a depth map after processing the image sequences received from the application processing unit together with the image sequence received from its connected image sensor, and sends the depth map to the application processing unit.
19. The depth map processing unit according to claim 16, characterized in that: the depth image processing unit is connected to any two of the first image sensor, the second image sensor and the third image sensor and acquires image sequences from the two image sensors simultaneously; the application processing unit is connected to the remaining image sensor and acquires an image sequence from that image sensor.
20. The depth map processing unit according to claim 19, characterized in that: the application processing unit processes the received image sequence and then sends it to the depth image processing unit; the depth image processing unit generates a depth map after processing the image sequence received from the application processing unit together with the image sequences received from its two connected image sensors, and sends the depth map to the application processing unit.
21. The depth map processing unit according to claim 16, characterized in that: the depth image processing unit is connected to the first image sensor, the second image sensor and the third image sensor respectively, and receives image sequences from the first image sensor, the second image sensor and the third image sensor simultaneously; the depth image processing unit generates a depth map after processing the three received image sequences and sends the depth map to the application processing unit.
22. The depth map processing unit according to claim 17, 19 or 21, characterized in that: the application processing unit sends a first control instruction to the depth image processing unit; the depth image processing unit receives the first control instruction from the application processing unit and generates a second control instruction according to the first control instruction; the depth image processing unit sends the second control instruction to the first image sensor, the second image sensor and the third image sensor respectively.
23. The depth map processing unit according to claim 17, 19 or 21, characterized in that: the application processing unit sends a first control instruction to the first image sensor, the second image sensor and the third image sensor respectively.
24. A depth image processing unit, characterized by comprising: an image processor, an image connecting interface and an application connecting interface;
the image connecting interface is connected to an image acquisition unit and receives an image sequence from the image acquisition unit;
the image processor processes the received image sequence and generates a depth map;
the application connecting interface is connected to an application processing unit and sends the depth map to the application processing unit.
25. The depth image processing unit according to claim 24, characterized in that: the application connecting interface sends the depth map and the image sequence received from the image acquisition unit to the application processing unit simultaneously.
26. The depth image processing unit according to claim 24 or 25, characterized in that: there is one image connecting interface, and the image acquisition unit comprises one image sensor; the image sensor is an RGB image sensor, a MONO image sensor, a structured light image sensor or a TOF image sensor.
27. The depth image processing unit according to claim 26, characterized in that: the application processing unit sends a first control instruction to the application connecting interface.
28. The depth image processing unit according to claim 26, characterized in that: the application processing unit sends a first control instruction to the application connecting interface; the image processor generates a second control instruction according to the first control instruction and sends the second control instruction to the image connecting interface.
29. The depth image processing unit according to claim 24 or 25, characterized in that: the image connecting interface comprises a first image connecting interface and a second image connecting interface, and the image acquisition unit comprises a first image sensor and a second image sensor; the first image connecting interface is connected to the first image sensor and receives an image sequence from the first image sensor, and the second image connecting interface is connected to the second image sensor and receives an image sequence from the second image sensor; the image processor generates a depth map after processing the two image sequences received by the first image connecting interface and the second image connecting interface.
30. The depth image processing unit according to claim 29, characterized in that: the application processing unit sends a first control instruction to the application connecting interface; the image processor generates a second control instruction according to the first control instruction and sends the second control instruction to the first image connecting interface and the second image connecting interface respectively.
31. The depth image processing unit according to claim 24 or 25, characterized in that: the image connecting interface comprises a first image connecting interface and a second image connecting interface, and the image acquisition unit comprises a first image sensor and a second image sensor; the application processing unit is connected to the first image sensor and receives an image sequence from the first image sensor; the second image connecting interface is connected to the second image sensor and receives an image sequence from the second image sensor; the image processor generates a depth map after processing the image sequence received by the second image connecting interface; the application connecting interface sends the depth map to the application processing unit.
32. The depth image processing unit according to claim 31, characterized in that: the application processing unit processes the image sequence received from the first image sensor and then sends it to the application connecting interface; the image processor generates a depth map after processing the image sequence received from the application connecting interface together with the image sequence received from the second image connecting interface; the application connecting interface sends the depth map to the application processing unit.
33. The depth image processing unit according to claim 31, characterized in that: the application connecting interface receives a first control instruction from the application processing unit; the image processor generates a second control instruction according to the first control instruction; the second image connecting interface sends the second control instruction to the second image sensor.
34. The depth image processing unit according to claim 31, characterized in that: the application connecting interface receives a first control instruction from the application processing unit; the image processor generates a second control instruction according to the first control instruction; the first image connecting interface sends the second control instruction to the first image sensor, and the second image connecting interface sends the second control instruction to the second image sensor.
35. The depth image processing unit according to claim 24 or 25, characterized in that: the image connecting interface comprises a first image connecting interface, a second image connecting interface and a third image connecting interface, and the image acquisition unit comprises a first image sensor, a second image sensor and a third image sensor; the first image sensor and the second image sensor are both RGB image sensors, both MONO image sensors or both structured light image sensors, or one of the first image sensor and the second image sensor is an RGB image sensor and the other is a MONO image sensor; the third image sensor is a structured light image sensor or a TOF image sensor.
36. The depth image processing unit according to claim 35, characterized in that: any one of the first image connecting interface, the second image connecting interface and the third image connecting interface is connected to any one of the first image sensor, the second image sensor and the third image sensor and acquires an image sequence from that connected image sensor; the application processing unit is connected to the remaining two image sensors and acquires image sequences from the two image sensors simultaneously.
37. The depth image processing unit according to claim 35, characterized in that: any two of the first image connecting interface, the second image connecting interface and the third image connecting interface are connected to two of the first image sensor, the second image sensor and the third image sensor and acquire image sequences from the two connected image sensors; the application processing unit is connected to the remaining image sensor and acquires an image sequence from that image sensor.
38. The depth image processing unit according to claim 36 or 37, characterized in that: the application connecting interface receives an image sequence from the application processing unit; the image processor generates a depth map after processing the image sequence received from the application connecting interface together with the image sequence received from each image connecting interface; the application connecting interface sends the depth map to the application processing unit.
39. The depth image processing unit according to claim 35, characterized in that: the first image connecting interface, the second image connecting interface and the third image connecting interface are respectively connected to the first image sensor, the second image sensor and the third image sensor, and correspondingly acquire image sequences from the first image sensor, the second image sensor and the third image sensor; the image processor generates a depth map after processing the three image sequences received from the first image connecting interface, the second image connecting interface and the third image connecting interface; the application connecting interface sends the depth map to the application processing unit.
40. The depth image processing unit according to claim 36, 37 or 39, characterized in that: the application processing unit sends a first control instruction to the application connecting interface; the image processor generates a second control instruction according to the first control instruction and sends the second control instruction to at least one of the first image connecting interface, the second image connecting interface and the third image connecting interface.
Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
---|---|---|---
CN201710966971.5A | 2017-10-17 | 2017-10-17 | Depth map processing unit and depth image processing unit
Publications (1)

Publication Number | Publication Date
---|---
CN109672876A | 2019-04-23
Family ID: 66141394
Legal Events

Date | Code | Title | Description
---|---|---|---
| PB01 | Publication |
| SE01 | Entry into force of request for substantive examination |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20190423