CN106572340B - Camera system, mobile terminal and image processing method - Google Patents

Camera system, mobile terminal and image processing method

Info

Publication number
CN106572340B
Authority
CN
China
Prior art keywords
image
light
camera device
depth
invisible light
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201610959236.7A
Other languages
Chinese (zh)
Other versions
CN106572340A (en)
Inventor
黄源浩
刘龙
肖振中
许星
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Maurio Technology Co., Ltd.
Original Assignee
Shenzhen Orbbec Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Orbbec Co Ltd filed Critical Shenzhen Orbbec Co Ltd
Priority to CN201610959236.7A priority Critical patent/CN106572340B/en
Publication of CN106572340A publication Critical patent/CN106572340A/en
Application granted granted Critical
Publication of CN106572340B publication Critical patent/CN106572340B/en


Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/10 Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
    • H04N23/13 Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths with multiple sensors
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 Constructional details or arrangements
    • G06F1/1613 Constructional details or arrangements for portable computers
    • G06F1/1633 Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1684 Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • G06F1/1686 Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being an integrated camera
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/02 Constructional features of telephone sets
    • H04M1/0202 Portable telephone sets, e.g. cordless phones, mobile phones or bar type handsets
    • H04M1/026 Details of the structure or mounting of specific components
    • H04M1/0264 Details of the structure or mounting of specific components for a camera module assembly
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00 Details of colour television systems
    • H04N9/12 Picture reproducers
    • H04N9/31 Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Computer Hardware Design (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Studio Devices (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a camera system including a first camera device, a second camera device, a laser projection device, and a processor. The laser projection device projects a structured-light pattern of invisible laser light onto the shared field of view of the first camera device and the second camera device. The first camera device captures a first visible-light color image and a first invisible-light image of the shared field of view; the second camera device captures a visible-light grayscale image and a second invisible-light image of the shared field of view. The processor synthesizes a second visible-light color image from the first visible-light color image and the visible-light grayscale image, and computes a depth image from the first invisible-light image and the second invisible-light image. The invention can acquire a high-quality color image and a depth image of a target with a smaller volume and lower power consumption.

Description

Camera system, mobile terminal and image processing method
[Technical Field]
The present invention relates to the fields of optical measurement and electronics, and in particular to a camera system, a mobile terminal, and an image processing method.
[Background Art]
RGB images record the color characteristics of objects and have been widely used in pattern recognition, such as face recognition and object detection. Recently, with the development of depth measurement technology, and in particular of structured-light techniques, applications that use a depth camera to measure object depth for motion-sensing control, 3D reconstruction, tracking, and obstacle avoidance have also gradually attracted attention.
Equipping a computing device with both a color camera and a depth camera gives the device richer functionality. However, for small computing devices such as mobile phones and tablets, installing many cameras is inadvisable: on the one hand it occupies a large amount of space, and on the other hand it also brings greater power consumption.
[Summary of the Invention]
To overcome the deficiencies of the prior art, the present invention provides a camera system, a mobile terminal, and an image processing method that achieve high-quality color image acquisition and depth image acquisition with fewer resources.
A camera system includes a first camera device, a second camera device, a laser projection device, and a processor.
The laser projection device is configured to project a structured-light pattern of invisible laser light onto the shared field of view of the first camera device and the second camera device.
The first camera device is configured to capture a first visible-light color image and a first invisible-light image of the shared field of view.
The second camera device is configured to capture a visible-light grayscale image and a second invisible-light image of the shared field of view.
The processor is configured to synthesize a second visible-light color image from the first visible-light color image and the visible-light grayscale image, and to compute a depth image from the first invisible-light image and the second invisible-light image.
Preferably, the processor obtains the depth image from the first invisible-light image and the second invisible-light image according to the binocular vision principle.
Preferably, the processor computes a first depth image from the first invisible-light image, computes a second depth image from the second invisible-light image, and fuses the first depth image and the second depth image into the depth image.
Preferably, the laser projection device alternately projects the structured-light pattern onto the shared field of view and stops projecting it.
The first camera device captures a first-1 invisible-light image while the laser projection device is projecting the structured-light pattern and a first-2 invisible-light image while the projection is off; the second camera device captures a second-1 invisible-light image while the pattern is being projected and a second-2 invisible-light image while the projection is off. The processor is further configured to compute the depth image from the difference image of the first-1 and first-2 invisible-light images and the difference image of the second-1 and second-2 invisible-light images.
Preferably, within each period in which the laser projection device alternately projects and stops projecting the structured-light pattern, the duration of projecting the structured-light pattern is longer than the duration of not projecting it.
Preferably, the lens focal length of the first camera device differs from the lens focal length of the second camera device.
Preferably, the invisible laser light is infrared laser light.
Preferably, the laser projection device is arranged between the first camera device and the second camera device.
The present invention also provides a mobile terminal that includes any of the above camera systems.
The present invention also provides an image acquisition method, including the following steps:
Projection step: the laser projection device projects a structured-light pattern of invisible laser light onto the shared field of view of the first camera device and the second camera device;
Acquisition step: the first camera device captures a first visible-light color image and a first invisible-light image of the shared field of view, and the second camera device captures a visible-light grayscale image and a second invisible-light image of the shared field of view;
Processing step: the processor synthesizes a second visible-light color image from the first visible-light color image and the visible-light grayscale image, and computes a depth image from the first invisible-light image and the second invisible-light image.
The beneficial effects of the present invention are as follows:
Compared with the prior art, the present invention acquires a high-quality color image and a depth image of a target with a smaller volume and lower power consumption.
[Brief Description of the Drawings]
Fig. 1 is a schematic diagram of the camera system according to an embodiment of the present invention;
Fig. 2 is a schematic structural diagram of a camera device according to an embodiment of the present invention;
Fig. 3 is a schematic diagram of the optical filter of the first camera device according to an embodiment of the present invention;
Fig. 4 is a schematic diagram of the optical filter of the second camera device according to an embodiment of the present invention;
Fig. 5 is a projection timing diagram of the laser projection device according to an embodiment of the present invention;
Fig. 6 is a schematic diagram of the mobile terminal according to an embodiment of the present invention.
[Detailed Description of Embodiments]
The preferred embodiments of the invention are described in further detail below.
As shown in Fig. 1, a camera system of one embodiment includes a first camera device, a second camera device, a laser projection device, and a processor; the processor is electrically connected to the first camera device, the second camera device, and the laser projection device.
The laser projection device projects a structured-light pattern of invisible laser light onto the shared field of view of the first camera device and the second camera device. The first camera device captures a first visible-light color image and a first invisible-light image of the shared field of view, and the second camera device captures a visible-light grayscale image and a second invisible-light image of the shared field of view. The processor synthesizes a second visible-light color image from the first visible-light color image and the visible-light grayscale image, and computes a depth image from the first invisible-light image and the second invisible-light image.
In Fig. 1, FOV denotes the field of view of the first camera device and the second camera device; the projection range of the laser projection device covers the shared field of view of the two camera devices.
Fig. 2 shows the typical structure of a camera device, including an image sensor 1, an optical filter 2, and a lens 3. Light converged by the lens 3 passes through the optical filter 2 and is then captured by the image sensor 1. The optical filter 2 passes light of specific wavelengths, and the image sensor 1 converts the light into digital electrical signals.
A common RGB camera uses a Bayer filter, whose filter units are equal in number to, and in one-to-one correspondence with, the pixels of the image sensor. A Bayer filter has filter units that pass red, green, and blue light respectively; light passing through a given filter unit falls on the image-sensor pixel corresponding to that unit. Since the human eye is more sensitive to green light, the ratio of the three is usually set to R (25%) : G (50%) : B (25%).
As shown in Fig. 3, in one embodiment the first camera device is an RGB-IR camera whose optical filter differs from a Bayer filter. The filter of this embodiment consists of four kinds of filter units, passing the R, G, B, and infrared (IR) components respectively, in the ratio R (25%) : G (25%) : B (25%) : IR (25%). With this optical filter 2, the first camera device can capture both visible-light color images and invisible infrared images. Of course, the IR filter units may be replaced with filter units for other invisible light, to match other invisible laser light projected by the laser projection device. The arrangement and proportions of the filter units in this embodiment are not exclusive; other arrangements and distributions are also possible.
After the image sensor in the first camera device obtains the optical information of each component (for example R, G, B, IR), each component occupies only part of the pixels, so the intensity information of the other three components at each pixel must be recovered by interpolation, finally achieving synchronous acquisition of an RGB image and an IR image. There are many interpolation methods, such as weighted averaging; as these are prior art, they are not described in detail here.
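As an illustration of such interpolation (a minimal sketch only, not part of the patented method; the 2x2 mosaic layout, function names, and the box-filter weighting are assumptions made for this example), the missing components can be recovered by averaging the nearby samples of each component:

    import numpy as np

    def convolve2d(img, kernel):
        """Naive same-size 2D convolution with zero padding."""
        kh, kw = kernel.shape
        ph, pw = kh // 2, kw // 2
        padded = np.pad(img, ((ph, ph), (pw, pw)))
        out = np.zeros_like(img, dtype=np.float64)
        for i in range(kh):
            for j in range(kw):
                out += kernel[i, j] * padded[i:i + img.shape[0], j:j + img.shape[1]]
        return out

    def demosaic_rgbir(raw):
        """Recover full-resolution R, G, B and IR planes from a 2x2 R/G/B/IR mosaic
        (assumed layout: R at (0,0), G at (0,1), B at (1,0), IR at (1,1) in each cell)
        by weighted averaging of the samples in each pixel's 3x3 neighborhood."""
        h, w = raw.shape
        yy, xx = np.mgrid[0:h, 0:w]
        offsets = {"R": (0, 0), "G": (0, 1), "B": (1, 0), "IR": (1, 1)}
        kernel = np.ones((3, 3))
        planes = {}
        for name, (dy, dx) in offsets.items():
            mask = ((yy % 2) == dy) & ((xx % 2) == dx)
            sparse = np.where(mask, raw, 0).astype(np.float64)
            weight = mask.astype(np.float64)
            planes[name] = convolve2d(sparse, kernel) / np.maximum(convolve2d(weight, kernel), 1e-9)
        return planes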
As shown in Fig. 4, in one embodiment the second camera device captures visible-light grayscale images and invisible-light images. Fig. 4 shows the optical filter used by the second camera device in this embodiment, where IR denotes an infrared filter unit and W denotes a white filter unit, that is, a transparent unit that passes light of any wavelength.
The structure of the laser projection device depends on the structured-light pattern it projects. This embodiment takes a speckle-dot pattern as the structured-light pattern. A laser projection device generally consists of a light source, a collimating lens, and a diffractive optical element (DOE). In this embodiment the light source is an infrared laser, which may be a single edge-emitting laser or a vertical-cavity surface-emitting laser (VCSEL) array. Because the light emitted by the laser source has a certain divergence angle, a collimating lens is needed to produce a collimated beam. After passing through the DOE, the laser beam is split into many beams, forming a speckle-dot pattern in space.
In many usage scenarios, images are captured under illumination such as sunlight, which usually contains invisible light (such as infrared light) at the same wavelength as the invisible light projected by the laser projection device. This ambient illumination affects the captured invisible-light images; especially under strong ambient light, it leads to low contrast, high noise, and other undesirable effects in the captured invisible-light images. In one embodiment, this adverse effect is eliminated by controlling the laser projection device to project the structured-light pattern onto the shared field of view intermittently. Fig. 5 shows the intermittent projection timing of the laser projection device in one embodiment: for example, the projection is turned off during period T1 and the corresponding invisible-light image I1 is captured, the projection is turned on during the adjacent period T2 and the corresponding invisible-light image I2 is captured, and so on. As long as the alternation frequency of the adjacent periods does not exceed the capture frame rate of the invisible-light images, one or more invisible-light images can be captured within each period. The case where the two frequencies are identical is used below to illustrate how the influence of ambient invisible light on the captured invisible-light images is eliminated.
During period T1, the laser projection device is off, and the captured invisible-light image I1 contains only the ambient invisible-light component. During period T2, the captured invisible-light image I2 contains both the invisible light projected by the laser projection device and the ambient invisible light. One processing method assumes that the ambient illumination does not change between the two adjacent periods and that the captured object does not move; the influence can then be removed by infrared differencing: the difference image I2' = I2 - I1 has higher contrast. Another processing method takes the difference against at least two invisible-light images captured before and after, for example I2' = I2 - (I3 + I1)/2. There are also other differencing methods.
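A minimal sketch of this differencing (illustrative only; the function and argument names are assumptions), where the projector-off frames estimate the ambient infrared component that is subtracted from the projector-on frame:

    import numpy as np

    def ambient_suppressed(frame_on, frame_off_prev, frame_off_next=None):
        """Remove the ambient invisible-light component from a projector-on frame.

        frame_on:       image captured while the pattern is projected (I2)
        frame_off_prev: image captured with the projection off just before (I1)
        frame_off_next: optional image captured with the projection off just after (I3)
        """
        on = frame_on.astype(np.float32)
        if frame_off_next is None:
            ambient = frame_off_prev.astype(np.float32)              # I2' = I2 - I1
        else:
            ambient = (frame_off_prev.astype(np.float32) +
                       frame_off_next.astype(np.float32)) / 2.0      # I2' = I2 - (I1 + I3)/2
        return np.clip(on - ambient, 0, None)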
Because the laser projection device is off for part of the time, the total number of invisible-light images captured at a constant frame rate is reduced, which somewhat affects the subsequent depth computation. To reduce this influence, within each on/off cycle of the laser projection device the projection-on interval is set longer than the projection-off interval (for example, T2 longer than T1), so that a higher number of usable invisible-light frames is captured.
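One simple way (an assumption for illustration, not the patented control scheme) to realize an on interval longer than the off interval at a fixed frame rate is a per-frame schedule such as:

    def projector_schedule(num_frames, on_frames=3, off_frames=1):
        """Yield True (project pattern) or False (projection off) per frame so that
        the projection-on interval is longer than the projection-off interval,
        e.g. 3 frames on followed by 1 frame off."""
        period = on_frames + off_frames
        for i in range(num_frames):
            yield (i % period) < on_frames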
In one embodiment, an image acquisition method includes the following steps:
S1: the laser projection device projects a structured-light pattern of invisible laser light onto the shared field of view of the first camera device and the second camera device.
S2: the first camera device captures a first visible-light color image and a first invisible-light image of the shared field of view, and the second camera device captures a visible-light grayscale image and a second invisible-light image of the shared field of view.
S3: the processor synthesizes a second visible-light color image from the first visible-light color image and the visible-light grayscale image.
In this embodiment the image sensors of the first camera device and the second camera device have the same size and resolution, so synthesizing the visible-light grayscale image with the first visible-light color image only requires solving the image registration problem caused by the positional difference between the two sensors. In other embodiments the parameters of the first and second camera devices may differ. Image registration algorithms are mature, well-known techniques and are not elaborated here. Because the visible-light grayscale image is captured from visible light received directly without color filtering, its sensitivity is higher, and synthesizing it with the first visible-light color image can achieve the following effects (a fusion sketch follows this list):
Detail enhancement. In low-light environments, many details in the first visible-light color image are relatively blurred; after the corresponding details from the visible-light grayscale image are combined with the first visible-light color image, the resulting second visible-light color image is clearer.
Image zoom. The lens focal length of the first camera device is set to differ from that of the second camera device, for example one near focus and the other far focus. Since the first visible-light color image is captured at the far focal length, nearby objects in it appear relatively blurred, while the visible-light grayscale image is captured in the near-focus mode; the information in the visible-light grayscale image is then used to compensate for the blurred parts of the first visible-light color image, ultimately forming a sharper second visible-light color image.
S4: the processor computes a depth image from the first invisible-light image and the second invisible-light image.
In another embodiment, an image acquisition method includes the following steps:
S11: the laser projection device alternately projects a structured-light pattern of invisible laser light onto the shared field of view of the first camera device and the second camera device and stops projecting it.
S12: the first camera device captures a first visible-light color image of the shared field of view, captures a first-1 invisible-light image while the laser projection device is projecting the structured-light pattern, and captures a first-2 invisible-light image while the projection is off.
S13: the second camera device captures a visible-light grayscale image of the shared field of view, captures a second-1 invisible-light image while the laser projection device is projecting the structured-light pattern, and captures a second-2 invisible-light image while the projection is off.
S14: the processor computes the depth image according to the binocular vision principle from the difference image of the first-1 and first-2 invisible-light images and the difference image of the second-1 and second-2 invisible-light images.
Because the laser projection device casts a coded structured-light pattern into the object space, the invisible-light patterns captured by the first camera device and the second camera device have very distinct fine features. Compared with an ordinary binocular-vision camera pair, computing depth from the first and second invisible-light images according to the binocular vision principle yields a depth image with higher accuracy. Compared with the structured-light depth measurement principle that uses a single camera and a laser projector, this method needs no additional memory to store a reference image (such as a reference speckle image), and it is more robust to ambient light interference. In addition, computing the depth map from difference images further improves accuracy.
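For illustration only (not the patent's algorithm; the block size, search range, and function names are assumptions), the sketch below shows the standard binocular relationship this relies on: block matching between the two rectified difference images gives a disparity d, and depth follows from Z = f * B / d, where f is the focal length in pixels and B the baseline between the two camera devices.

    import numpy as np

    def block_match_disparity(left, right, block=9, max_disp=64):
        """Brute-force SAD block matching on rectified images (epipolar rows aligned)."""
        h, w = left.shape
        half = block // 2
        disp = np.zeros((h, w), dtype=np.float32)
        for y in range(half, h - half):
            for x in range(half + max_disp, w - half):
                patch = left[y - half:y + half + 1, x - half:x + half + 1]
                best_cost, best_d = np.inf, 0
                for d in range(max_disp):
                    cand = right[y - half:y + half + 1, x - d - half:x - d + half + 1]
                    cost = np.abs(patch - cand).sum()   # sum of absolute differences
                    if cost < best_cost:
                        best_cost, best_d = cost, d
                disp[y, x] = best_d
        return disp

    def disparity_to_depth(disp, focal_px, baseline_m):
        """Triangulation: Z = f * B / d (marked invalid where disparity is zero)."""
        with np.errstate(divide="ignore"):
            depth = focal_px * baseline_m / disp
        depth[disp == 0] = 0.0
        return depth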
In another embodiment, an image acquisition method includes the following steps:
S21: the laser projection device alternately projects a structured-light pattern of invisible laser light onto the shared field of view of the first camera device and the second camera device and stops projecting it.
S22: the first camera device captures a first visible-light color image of the shared field of view, captures a first-1 invisible-light image while the laser projection device is projecting the structured-light pattern, and captures a first-2 invisible-light image while the projection is off.
S23: the second camera device captures a visible-light grayscale image of the shared field of view, captures a second-1 invisible-light image while the laser projection device is projecting the structured-light pattern, and captures a second-2 invisible-light image while the projection is off.
S24: the processor computes a first depth image from the difference image of the first-1 and first-2 invisible-light images, computes a second depth image from the difference image of the second-1 and second-2 invisible-light images, and then fuses the first depth image and the second depth image into a more accurate depth image.
The first camera device and the laser projection device form a depth measurement unit based on structured-light triangulation. The principle of obtaining depth information is to match the captured invisible-light image against a reference image stored in the system to obtain pixel offset values, and then, by the triangulation principle, to use the one-to-one relationship between pixel offset and actual depth to obtain the depth image of the object space. Here the reference image is obtained by capturing an invisible-light image of a plane at a known distance from the depth camera.
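As a sketch of that principle (the relation below is the common structured-light triangulation model, stated here as an assumption rather than text from the patent): matching the captured speckle image against the stored reference image, for example with block matching as in the earlier sketch, gives a per-pixel offset, and with baseline b between projector and camera, focal length f, and reference-plane distance Z_ref, depth follows from 1/Z = 1/Z_ref + offset / (f * b), up to the sign convention of the offset.

    import numpy as np

    def offset_to_depth(offset_px, z_ref_m, focal_px, baseline_m):
        """Convert per-pixel offsets (captured speckle image matched against the
        reference image of a plane at known distance z_ref_m) into depth values,
        using 1/Z = 1/Z_ref + offset / (f * b)."""
        inv_z = 1.0 / z_ref_m + offset_px / (focal_px * baseline_m)
        return np.where(inv_z > 0, 1.0 / np.maximum(inv_z, 1e-9), 0.0)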
Because of the shooting angle, a single camera can obtain depth information for only one side of an object such as a human body; if the camera is on the left side of the body, data on the right side are missing. The first depth image and the second depth image can therefore be fused so that their data complement each other, and the resulting third depth image has a comparatively higher spatial resolution.
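A minimal sketch of such a fusion (an illustration under the assumption that the two depth images have already been mapped into a common view; not the patented method): holes in one view are filled from the other, and pixels valid in both are averaged.

    import numpy as np

    def fuse_depth(depth_a, depth_b, invalid=0.0):
        """Fuse two registered depth images: fill missing pixels from the other
        view and average where both views hold valid measurements."""
        a_valid = depth_a != invalid
        b_valid = depth_b != invalid
        return np.where(a_valid & b_valid, (depth_a + depth_b) / 2.0,
                        np.where(a_valid, depth_a, depth_b))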
As shown in Fig. 6, a mobile terminal 7 of one embodiment contains this camera system, including a processor, a first camera device 4, a laser projection device 5, and a second camera device 6. The first camera device 4, the laser projection device 5, and the second camera device 6 may be arranged on the same surface of the mobile terminal; the laser projection device 5 is preferably arranged between the first camera device 4 and the second camera device 6, and the three are preferably arranged on the same straight line. The processor may consist of one or more of the CPU, a dedicated processor, or other microelectronic components in the mobile terminal. The mobile terminal may be a mobile phone, a tablet, a computer, or the like.
The above is a further detailed description of the present invention in conjunction with specific preferred embodiments, and the specific implementation of the invention is not limited to these descriptions. For those of ordinary skill in the art to which the present invention belongs, a number of simple deductions or substitutions may be made without departing from the concept of the invention, and all of them shall be regarded as falling within the scope of patent protection determined by the submitted claims.

Claims (8)

1. A camera system, characterized by comprising a first camera device, a second camera device, a laser projection device, and a processor;
wherein the laser projection device is configured to project a coded structured-light pattern of invisible laser light onto the shared field of view of the first camera device and the second camera device;
the first camera device is configured to capture a first visible-light color image and a first invisible-light image of the shared field of view;
the second camera device is configured to capture a visible-light grayscale image and a second invisible-light image of the shared field of view, the visible-light grayscale image being captured from visible light received directly without color filtering by the optical filter;
the processor is configured to synthesize a second visible-light color image from the first visible-light color image and the visible-light grayscale image, and to compute a depth image from the first invisible-light image and the second invisible-light image;
the lens focal length of the first camera device differs from the lens focal length of the second camera device;
the first camera device and the laser projection device form a depth measurement unit based on structured-light triangulation; the principle of obtaining depth information is to match the captured invisible-light image against a reference image stored in the system to obtain pixel offset values, and then, by the triangulation principle, to use the one-to-one relationship between pixel offset and actual depth to obtain the depth image of the object space, the reference image being obtained by capturing an invisible-light image of a plane at a known distance from the depth camera;
and the processor is configured to compute a first depth image from the first invisible-light image, compute a second depth image from the second invisible-light image, and fuse the first depth image and the second depth image into a third depth image.
2. The camera system according to claim 1, characterized in that
the processor is configured to obtain the depth image from the first invisible-light image and the second invisible-light image according to the binocular vision principle, so as to obtain a depth image with higher accuracy.
3. The camera system according to claim 1, characterized in that
the laser projection device is configured to alternately project the structured-light pattern onto the shared field of view and stop projecting it;
the first camera device is configured to capture a first-1 invisible-light image while the laser projection device is projecting the structured-light pattern and a first-2 invisible-light image while the projection is off; the second camera device is configured to capture a second-1 invisible-light image while the laser projection device is projecting the structured-light pattern and a second-2 invisible-light image while the projection is off; and the processor is further configured to compute the depth image from the difference image of the first-1 and first-2 invisible-light images and the difference image of the second-1 and second-2 invisible-light images.
4. The camera system according to claim 3, characterized in that
within each period in which the laser projection device alternately projects the structured-light pattern onto the shared field of view and stops projecting it, the duration of projecting the structured-light pattern is longer than the duration of not projecting it.
5. The camera system according to claim 1, characterized in that
the invisible laser light is infrared laser light.
6. The camera system according to claim 1, characterized in that
the laser projection device is arranged between the first camera device and the second camera device.
7. A mobile terminal, characterized by comprising the camera system according to any one of claims 1 to 6.
8. An image acquisition method, characterized by comprising the following steps:
a projection step: a laser projection device projects a coded structured-light pattern of invisible laser light onto the shared field of view of a first camera device and a second camera device;
an acquisition step: the first camera device captures a first visible-light color image and a first invisible-light image of the shared field of view, and the second camera device captures a visible-light grayscale image and a second invisible-light image of the shared field of view;
a processing step: a processor synthesizes a second visible-light color image from the first visible-light color image and the visible-light grayscale image, and computes a depth image from the first invisible-light image and the second invisible-light image;
wherein the lens focal length of the first camera device differs from the lens focal length of the second camera device;
the first camera device and the laser projection device form a depth measurement unit based on structured-light triangulation; the principle of obtaining depth information is to match the captured invisible-light image against a reference image stored in the system to obtain pixel offset values, and then, by the triangulation principle, to use the one-to-one relationship between pixel offset and actual depth to obtain the depth image of the object space, the reference image being obtained by capturing an invisible-light image of a plane at a known distance from the depth camera;
and the processor computes a first depth image from the first invisible-light image, computes a second depth image from the second invisible-light image, and fuses the first depth image and the second depth image into a third depth image.
CN201610959236.7A 2016-10-27 2016-10-27 Camera system, mobile terminal and image processing method Active CN106572340B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610959236.7A CN106572340B (en) 2016-10-27 2016-10-27 Camera system, mobile terminal and image processing method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610959236.7A CN106572340B (en) 2016-10-27 2016-10-27 Camera system, mobile terminal and image processing method

Publications (2)

Publication Number Publication Date
CN106572340A CN106572340A (en) 2017-04-19
CN106572340B true CN106572340B (en) 2019-05-10

Family

ID=58535654

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610959236.7A Active CN106572340B (en) 2016-10-27 2016-10-27 Camera system, mobile terminal and image processing method

Country Status (1)

Country Link
CN (1) CN106572340B (en)

Families Citing this family (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107343122A (en) * 2017-08-02 2017-11-10 深圳奥比中光科技有限公司 3D imaging devices
CN107507272A (en) * 2017-08-09 2017-12-22 广东欧珀移动通信有限公司 Establish the method, apparatus and terminal device of human 3d model
CN107395974B (en) * 2017-08-09 2019-09-13 Oppo广东移动通信有限公司 Image processing system and method
CN107493411B (en) * 2017-08-09 2019-09-13 Oppo广东移动通信有限公司 Image processing system and method
CN107493412B (en) * 2017-08-09 2019-09-13 Oppo广东移动通信有限公司 Image processing system and method
EP3697078B1 (en) 2017-10-27 2022-05-25 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Image processing method and device as well as electronic device
CN107800963B (en) * 2017-10-27 2019-08-30 Oppo广东移动通信有限公司 Image processing method, device, electronic device and computer readable storage medium
CN108195291B (en) * 2018-01-03 2020-05-05 中山大学 Moving vehicle three-dimensional detection method and detection device based on differential light spots
CN110099226B (en) * 2018-01-31 2024-04-09 宁波舜宇光电信息有限公司 Array camera module, depth information acquisition method thereof and electronic equipment
CN108648225B (en) * 2018-03-31 2022-08-02 奥比中光科技集团股份有限公司 Target image acquisition system and method
EP3611810B1 (en) 2018-04-28 2021-06-02 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Control system and control method for laser projector, laser projection assembly and terminal
CN110533709B (en) * 2018-05-23 2023-02-07 杭州海康威视数字技术股份有限公司 Depth image acquisition method, device and system and image acquisition equipment
CN110619200B (en) * 2018-06-19 2022-04-08 Oppo广东移动通信有限公司 Verification system and electronic device
WO2019227975A1 (en) 2018-05-30 2019-12-05 Oppo广东移动通信有限公司 Control system of laser projector, terminal and control method of laser projector
CN109190484A (en) 2018-08-06 2019-01-11 北京旷视科技有限公司 Image processing method, device and image processing equipment
CN109327653B (en) * 2018-10-31 2021-01-26 Oppo广东移动通信有限公司 Image acquisition method, image acquisition device, structured light assembly and electronic device
CN111901502A (en) * 2019-05-06 2020-11-06 三赢科技(深圳)有限公司 Camera module
JP2022007139A (en) * 2020-06-25 2022-01-13 株式会社リコー Reading device, image forming apparatus, and image reading method
CN111866490A (en) * 2020-07-27 2020-10-30 支付宝(杭州)信息技术有限公司 Depth image imaging system and method
CN112738386A (en) * 2021-03-30 2021-04-30 北京芯海视界三维科技有限公司 Sensor, shooting module and image acquisition method
CN114284306A (en) * 2021-12-15 2022-04-05 武汉新芯集成电路制造有限公司 Depth and image sensor device, manufacturing method thereof and depth and image sensor chip

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101652393B1 (en) * 2010-01-15 2016-08-31 삼성전자주식회사 Apparatus and Method for obtaining 3D image
KR101966976B1 (en) * 2012-09-03 2019-04-08 엘지이노텍 주식회사 3-dimensional image processing system
KR101966975B1 (en) * 2012-09-03 2019-04-08 엘지이노텍 주식회사 Apparatus for stereo matching
KR101954192B1 (en) * 2012-11-15 2019-03-05 엘지전자 주식회사 Array camera, Moblie terminal, and method for operating the same
KR102070778B1 (en) * 2012-11-23 2020-03-02 엘지전자 주식회사 Rgb-ir sensor with pixels array and apparatus and method for obtaining 3d image using the same

Also Published As

Publication number Publication date
CN106572340A (en) 2017-04-19

Similar Documents

Publication Publication Date Title
CN106572340B (en) Camera system, mobile terminal and image processing method
CN106454287B (en) Combination shot system, mobile terminal and image processing method
US10345684B2 (en) Pattern projection and imaging using lens arrays
CN110530286B (en) Novel single-camera three-dimensional digital image correlation system using light-combining prism
CN107172407B (en) Suitable for generating the electronic device and method of depth map
US8334893B2 (en) Method and apparatus for combining range information with an optical image
CN104335005B (en) 3D is scanned and alignment system
Skocaj et al. Range image acquisition of objects with non-uniform albedo using structured light range sensor
CN107302667B (en) Camera-interchangeable dynamic spectral imaging system and method for applying same to high dynamic imaging
CN108055452A (en) Image processing method, device and equipment
CN107343130A (en) High dynamic imaging module based on DMD dynamic light splitting
CN106534633A (en) Combined photographing system, mobile terminal and image processing method
CN108307675A (en) More baseline camera array system architectures of depth enhancing in being applied for VR/AR
JP2010528499A (en) Single lens, single sensor 3D imaging device with central aperture opening to obtain camera position
JP2010517039A (en) Method and apparatus for quantitative three-dimensional imaging
EP3381015B1 (en) Systems and methods for forming three-dimensional models of objects
JP2000065542A (en) Three-dimensional image photographing device
CN108053438A (en) Depth of field acquisition methods, device and equipment
JP4193342B2 (en) 3D data generator
Beigpour et al. A comprehensive multi-illuminant dataset for benchmarking of the intrinsic image algorithms
CN208536839U (en) Image capture device
JP6367803B2 (en) Method for the description of object points in object space and combinations for its implementation
CN107392955B (en) Depth of field estimation device and method based on brightness
KR102184210B1 (en) 3d camera system
CN210201927U (en) Double-fisheye panoramic image information acquisition device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20191114

Address after: 518057 13th floor, United headquarters building, high tech Zone, No. 63, Xuefu Road, Yuehai street, Nanshan District, Shenzhen City, Guangdong Province

Patentee after: Shenzhen Maurio Technology Co., Ltd.

Address before: 518000 Guangdong city of Shenzhen province Nanshan District Hing Road three No. 8 China University of Geosciences research base in building A808

Patentee before: Shenzhen Aobi Zhongguang Science & Technology Co., Ltd.