CN106997595A - Depth-of-field-based image color processing method, processing apparatus, and electronic device - Google Patents

Depth-of-field-based image color processing method, processing apparatus, and electronic device

Info

Publication number
CN106997595A
CN106997595A (Application CN201710138670.3A)
Authority
CN
China
Prior art keywords
image
color
scene
depth
master image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201710138670.3A
Other languages
Chinese (zh)
Inventor
孙剑波 (Sun Jianbo)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201710138670.3A priority Critical patent/CN106997595A/en
Publication of CN106997595A publication Critical patent/CN106997595A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10024Color image

Abstract

The invention discloses a depth-of-field-based image color processing method. The image color processing method includes: processing scene data to obtain depth information of a scene master image; determining the foreground part of the scene master image according to the depth information; and processing the scene master image to raise the color saturation of the foreground part. The invention also discloses an image color processing apparatus and an electronic device. The image color processing method, processing apparatus, and electronic device of the present invention determine the foreground part of an image according to depth information and raise the color saturation of the foreground part by processing the image, thereby obtaining an adjusted image in which the foreground part stands out.

Description

Depth-of-field-based image color processing method, processing apparatus, and electronic device
Technical field
The present invention relates to imaging technology, and more particularly to a depth-of-field-based image color processing method, a processing apparatus, and an electronic device.
Background technology
The color processing performed by existing imaging devices is not thorough enough, so the main subject of the image does not stand out and the user experience suffers.
Summary of the invention
The present invention aims to solve at least one of the technical problems existing in the prior art. To this end, the present invention provides a depth-of-field-based image color processing method, a processing apparatus, and an electronic device.
A depth-of-field-based image color processing method is provided for processing scene data collected by an imaging device, the scene data including a scene master image. The image color processing method comprises the following steps:
processing the scene data to obtain depth information of the scene master image;
determining the foreground part of the scene master image according to the depth information; and
processing the scene master image to raise the color saturation of the foreground part.
A depth-of-field-based image color processing apparatus is provided for processing scene data collected by an imaging device, the scene data including a scene master image. The image color processing apparatus includes a first processing module, a first determining module, and a second processing module.
The first processing module is configured to process the scene data to obtain depth information of the scene master image.
The first determining module is configured to determine the foreground part of the scene master image according to the depth information.
The second processing module is configured to process the scene master image to raise the color saturation of the foreground part.
An electronic device includes an imaging device and the image color processing apparatus described above.
The image color processing method, processing apparatus, and electronic device of the present invention determine the foreground part of an image according to depth information and raise the color saturation of the foreground part by processing the image, thereby obtaining an adjusted image in which the foreground part stands out.
Additional aspects and advantages of the present invention will be set forth in part in the following description, and in part will become apparent from the description or be learned through practice of the present invention.
Brief description of the drawings
The above and/or additional aspects and advantages of the present invention will become apparent and readily understood from the following description of the embodiments taken in conjunction with the accompanying drawings, in which:
Fig. 1 is a schematic flowchart of an image color processing method according to an embodiment of the present invention.
Fig. 2 is a plan view of an electronic device according to an embodiment of the present invention.
Fig. 3 is a schematic flowchart of an image color processing method according to some embodiments of the present invention.
Fig. 4 is a functional block diagram of a first processing module according to some embodiments of the present invention.
Fig. 5 is a schematic flowchart of an image color processing method according to some embodiments of the present invention.
Fig. 6 is another functional block diagram of the first processing module according to some embodiments of the present invention.
Fig. 7 is a schematic flowchart of an image color processing method according to some embodiments of the present invention.
Fig. 8 is a functional block diagram of a first determining module according to some embodiments of the present invention.
Fig. 9 is a schematic flowchart of an image color processing method according to some embodiments of the present invention.
Fig. 10 is a schematic flowchart of an image color processing method according to some embodiments of the present invention.
Fig. 11 is a functional block diagram of a second processing module according to some embodiments of the present invention.
Fig. 12 is a schematic flowchart of an image color processing method according to some embodiments of the present invention.
Fig. 13 is a functional block diagram of an image color processing apparatus according to some embodiments of the present invention.
Description of main element reference numerals:
Electronic device 100, image color processing apparatus 10, first processing module 11, first processing unit 112, second processing unit 114, third processing unit 116, fourth processing unit 118, first determining module 13, fifth processing unit 132, finding unit 134, second processing module 15, sixth processing unit 152, seventh processing unit 154, second determining module 17, third processing module 19, imaging device 20.
Detailed description of the embodiments
Embodiments of the present invention are described in detail below, and examples of the embodiments are shown in the accompanying drawings, in which the same or similar reference numerals denote, throughout, the same or similar elements or elements having the same or similar functions. The embodiments described below with reference to the drawings are exemplary, are intended only to explain the present invention, and are not to be construed as limiting the present invention.
Referring to Fig. 1 and Fig. 2 together, the depth-of-field-based image color processing method of the embodiment of the present invention can be used to process scene data collected by an imaging device 20. The scene data includes a scene master image. The image color processing method comprises the following steps:
S11: processing the scene data to obtain depth information of the scene master image;
S13: determining the foreground part of the scene master image according to the depth information; and
S15: processing the scene master image to raise the color saturation of the foreground part.
Referring to Fig. 2, the image color processing apparatus 10 of the embodiment of the present invention can be used to process the scene data collected by the imaging device 20. The scene data includes a scene master image. The image color processing apparatus 10 includes a first processing module 11, a first determining module 13, and a second processing module 15. The first processing module 11 is configured to process the scene data to obtain depth information of the scene master image. The first determining module 13 is configured to determine the foreground part of the scene master image according to the depth information. The second processing module 15 is configured to process the scene master image to raise the color saturation of the foreground part.
In other words, the image color processing method of the embodiment of the present invention can be implemented by the image color processing apparatus 10 of the embodiment of the present invention, where step S11 can be implemented by the first processing module 11, step S13 by the first determining module 13, and step S15 by the second processing module 15.
In some embodiments, the image color processing apparatus 10 of the embodiment of the present invention can be applied to the electronic device 100 of the embodiment of the present invention; in other words, the electronic device 100 of the embodiment of the present invention can include the image color processing apparatus 10 of the embodiment of the present invention. In addition, the electronic device 100 of the embodiment of the present invention also includes the imaging device 20, and the imaging device 20 is electrically connected to the image color processing apparatus 10.
The image color processing method, processing apparatus 10, and electronic device 100 of the embodiment of the present invention determine the foreground part of an image according to depth information and raise the color saturation of the foreground part by processing the image, thereby obtaining an adjusted image in which the foreground part stands out.
In some embodiments, the electronic device 100 includes a mobile phone or a tablet computer. In the embodiment of the present invention, the electronic device 100 is a mobile phone.
In some embodiments, the imaging device 20 includes a front camera and/or a rear camera, without limitation here. In the embodiment of the present invention, the imaging device 20 is a front camera.
Referring to Fig. 3, in some embodiments, the scene data includes a depth image corresponding to the scene master image, and step S11 comprises the following steps:
S112: processing the depth image to obtain depth data of the scene master image; and
S114: processing the depth data to obtain the depth information.
Referring to Fig. 4, in some embodiments, the scene data includes a depth image corresponding to the scene master image, and the first processing module 11 includes a first processing unit 112 and a second processing unit 114. The first processing unit 112 is configured to process the depth image to obtain depth data of the scene master image. The second processing unit 114 is configured to process the depth data to obtain the depth information.
In other words, step S112 can be implemented by the first processing unit 112, and step S114 can be implemented by the second processing unit 114.
In this way, the depth image can be used to obtain the depth information of the scene master image quickly.
It can be understood that the scene master image is an RGB color image, and the distance of each person or object in the scene relative to the imaging device 20 can be characterized by the depth image: each pixel value in the depth image, that is, the depth data, represents the distance between a point in the scene and the imaging device 20, and the depth data of the points making up a person or object in the scene gives the depth information of that person or object. Since the color information of the scene master image and the depth information of the depth image are in one-to-one correspondence, the depth information of the scene master image can be obtained.
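The patent leaves the conversion from raw depth data to depth information abstract. As a minimal sketch of the per-pixel correspondence described above (NumPy assumed; the millimetre raw unit, the scale factor, and the function name are illustrative assumptions, not taken from the patent):

```python
import numpy as np

def depth_data_to_metres(depth_image, scale=0.001):
    # Each depth-image pixel holds the distance between a scene point and
    # the imaging device, aligned pixel-for-pixel with the RGB master
    # image; here raw millimetre values are converted to metres.
    return depth_image.astype(np.float32) * scale

raw = np.array([[500, 1500],
                [2500, 4000]], dtype=np.uint16)  # raw sensor values (mm, assumed)
depth_m = depth_data_to_metres(raw)              # per-pixel distance in metres
```

Because the depth map and the master image share the same pixel grid, indexing either array with the same coordinates relates a color pixel to its distance.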
In some embodiments, the depth image corresponding to the scene master image can be acquired in two ways: using structured-light depth ranging, or using a time-of-flight (TOF) depth camera.
When the depth image is acquired using structured-light depth ranging, the imaging device 20 includes a camera and a projector.
It can be understood that in structured-light depth ranging, the projector projects light with a certain pattern onto the surface of an object, forming on that surface a three-dimensional light-stripe image modulated by the shape of the object under measurement. The camera captures the light-stripe image to obtain a two-dimensional distorted light-stripe image. The degree of distortion of the stripes depends on the relative position between the projector and the camera and on the surface profile or height of the object. The displacement along the stripes is proportional to the height of the object's surface, a kink indicates a change of plane, and a discontinuity shows a physical gap in the surface. When the relative position between the projector and the camera is fixed, the three-dimensional profile of the object's surface can be reproduced from the coordinates of the distorted two-dimensional light-stripe image, and the depth information can thus be obtained. Structured-light depth ranging has relatively high resolution and measurement accuracy.
When the depth image is acquired using a TOF depth camera, the imaging device 20 includes a TOF depth camera.
It can be understood that a TOF depth camera records, via a sensor, the phase change between the modulated infrared light emitted from a light-emitting unit toward the object and the light reflected back from the object and, based on the speed of light, can obtain the depth distance of the entire scene in real time within the range of one wavelength. The TOF depth camera computes depth information without being affected by the gray level and features of the object's surface, and can compute it rapidly, offering very high real-time performance.
Referring to Fig. 5, in some embodiments, the scene data includes a scene sub-image corresponding to the scene master image, and step S11 comprises the following steps:
S116: processing the scene master image and the scene sub-image to obtain depth data of the scene master image; and
S118: processing the depth data to obtain the depth information.
Referring to Fig. 6, in some embodiments, the scene data includes a scene sub-image corresponding to the scene master image, and the first processing module 11 includes a third processing unit 116 and a fourth processing unit 118. The third processing unit 116 is configured to process the scene master image and the scene sub-image to obtain depth data of the scene master image. The fourth processing unit 118 is configured to process the depth data to obtain the depth information.
In other words, step S116 can be implemented by the third processing unit 116, and step S118 can be implemented by the fourth processing unit 118.
In this way, the depth information of the scene master image can be obtained by processing the scene master image and the scene sub-image.
In some embodiments, the imaging device 20 includes a main camera and a secondary camera.
It can be understood that the depth information can be obtained by binocular stereo vision ranging, in which case the scene data includes the scene master image and a scene sub-image. The scene master image is captured by the main camera, and the scene sub-image is captured by the secondary camera. In binocular stereo vision ranging, the same object is imaged from different positions with two identical cameras to obtain a stereo image pair of the object; the corresponding image points of the stereo pair are then matched by an algorithm to compute the disparity; and finally the depth information is recovered using a triangulation-based method. In this way, the depth information of the scene master image can be obtained by matching the stereo image pair formed by the scene master image and the scene sub-image.
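The triangulation step above reduces, for rectified cameras, to the classic relation Z = f·B/d (depth equals focal length times baseline over disparity). A minimal sketch with NumPy; the focal length and baseline values, and the treatment of zero disparity as "infinitely far", are illustrative assumptions:

```python
import numpy as np

def depth_from_disparity(disparity, focal_px, baseline_m):
    # Triangulation for a rectified stereo pair: Z = f * B / d.
    # Pixels with zero disparity are treated as infinitely far.
    disparity = np.asarray(disparity, dtype=np.float32)
    depth = np.full(disparity.shape, np.inf, dtype=np.float32)
    valid = disparity > 0
    depth[valid] = focal_px * baseline_m / disparity[valid]
    return depth

d = np.array([[8.0, 4.0],
              [2.0, 0.0]])          # matched disparities in pixels
z = depth_from_disparity(d, focal_px=800.0, baseline_m=0.02)
```

Larger disparity means a nearer point: the pixel with disparity 8 maps to 800 × 0.02 / 8 = 2 m, while the one with disparity 2 maps to 8 m.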
Referring to Fig. 7, in some embodiments, step S13 comprises the following steps:
S132: obtaining the first point of the scene master image according to the depth information; and
S134: finding the region adjoining the first point and continuously varying in depth as the foreground part.
Referring to Fig. 8, in some embodiments, the first determining module 13 includes a fifth processing unit 132 and a finding unit 134. The fifth processing unit 132 is configured to obtain the first point of the scene master image according to the depth information. The finding unit 134 is configured to find the region adjoining the first point and continuously varying in depth as the foreground part.
In other words, step S132 can be implemented by the fifth processing unit 132, and step S134 can be implemented by the finding unit 134.
In this way, a physically connected foreground part of the scene master image can be obtained. In a real scene, the foreground part is usually connected. Taking the physically connected foreground part as the subject, the extent of the foreground part can be obtained intuitively.
Specifically, the first point of the scene master image is obtained first according to the depth information; the first point serves as the starting point of the foreground part. Diffusing outward from the first point, the regions adjoining the first point and continuously varying in depth are obtained, and these regions are merged with the first point into the foreground region.
It should be noted that the first point refers to the pixel corresponding to the object of minimum depth, that is, the pixel corresponding to the object with the smallest object distance, or closest to the imaging device 20. Adjoining means that two pixels are connected to each other. Depth varying continuously means that the depth difference between two adjoining pixels is less than a predetermined difference; in other words, two adjoining pixels whose depth difference is less than the predetermined difference vary continuously in depth.
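The diffusion from the first point described above is, in effect, a region-growing flood fill over the depth map. A minimal sketch under stated assumptions (NumPy; 4-connected adjacency and the `max_step` value are illustrative choices, not specified by the patent):

```python
from collections import deque
import numpy as np

def foreground_by_region_growing(depth, max_step=0.1):
    # Start at the nearest pixel (the "first point") and grow across
    # 4-connected neighbours whose depth changes by less than max_step.
    h, w = depth.shape
    seed = np.unravel_index(np.argmin(depth), depth.shape)
    mask = np.zeros((h, w), dtype=bool)
    mask[seed] = True
    queue = deque([seed])
    while queue:
        y, x = queue.popleft()
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if 0 <= ny < h and 0 <= nx < w and not mask[ny, nx]:
                if abs(depth[ny, nx] - depth[y, x]) < max_step:
                    mask[ny, nx] = True
                    queue.append((ny, nx))
    return mask

depth = np.array([[1.00, 1.05, 3.00],
                  [1.08, 1.12, 3.00],
                  [3.00, 3.00, 3.00]])   # metres; top-left block is nearest
fg = foreground_by_region_growing(depth)
```

In the toy depth map, growth stops wherever the depth jumps from roughly 1 m to 3 m, so only the physically connected near block is kept as foreground.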
Referring to Fig. 9, in some embodiments, step S13 may comprise the following steps:
S136: obtaining the first point of the scene master image according to the depth information; and
S138: finding the region whose depth difference from the first point is less than a predetermined threshold as the foreground part.
In this way, a logically connected foreground part of the scene master image can be obtained. In a real scene, the foreground part may not be physically connected but may still satisfy some logical relationship, as in the scene of an eagle diving to snatch a chick: the eagle and the chick may not be physically connected, but logically it can be determined that they belong together.
Specifically, the first point of the scene master image is obtained first according to the depth information; the first point serves as the starting point of the foreground part. Diffusing outward from the first point, the regions whose depth difference from the first point is less than the predetermined threshold are obtained, and these regions are merged with the first point into the foreground region.
In some embodiments, the predetermined threshold can be a value set by the user. In this way, the user can determine the extent of the foreground part according to his or her own needs, thereby obtaining an ideal extent of the foreground and achieving an ideal composition.
In some embodiments, the predetermined threshold can be a value determined by the image color processing apparatus 10, without limitation here. The predetermined threshold determined by the image color processing apparatus 10 can be a fixed value stored internally, or a value computed according to circumstances, such as the depth of the first point.
In some embodiments, step S13 may comprise the following steps:
finding the region whose depth is within a predetermined interval as the foreground part; and
determining the region of the scene master image other than the foreground part as the background part.
In this way, a foreground part whose depth is within an appropriate range can be obtained.
It can be understood that in some shooting situations the foreground part is not the frontmost part but a part slightly behind the frontmost part. For example, when a person sits behind a computer, the computer is in front, but the person is the main subject; taking the region whose depth falls within the predetermined interval as the foreground part therefore effectively avoids selecting the wrong subject.
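The interval-based variant above amounts to a simple depth mask. A minimal sketch (NumPy assumed; the interval bounds and the half-open convention are illustrative assumptions):

```python
import numpy as np

def foreground_by_interval(depth, near, far):
    # Keep pixels whose depth falls inside [near, far); everything
    # else is the background part.
    fg = (depth >= near) & (depth < far)
    return fg, ~fg

depth = np.array([[0.4, 1.2],
                  [1.8, 3.5]])          # metres
fg, bg = foreground_by_interval(depth, near=1.0, far=2.0)
```

Note how the nearest pixel (0.4 m, the "computer" in the example above) is excluded from the foreground because it lies in front of the chosen interval.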
Referring to Fig. 10, in some embodiments, step S15 comprises the following steps:
S152: processing the scene master image to recognize the main color of the foreground part; and
S154: raising the saturation of the main color.
Referring to Fig. 11, in some embodiments, the second processing module 15 includes a sixth processing unit 152 and a seventh processing unit 154. The sixth processing unit 152 is configured to process the scene master image to recognize the main color of the foreground part. The seventh processing unit 154 is configured to raise the saturation of the main color.
In other words, step S152 can be implemented by the sixth processing unit 152, and step S154 can be implemented by the seventh processing unit 154.
In this way, the color saturation of the foreground part can be raised quickly and effectively, so that the foreground part, that is, the main subject, stands out.
Specifically, the main color refers to the color whose channel accounts for the largest share in the RGB image. For example, if the red channel accounts for the largest share among the red, green, and blue channels of a pixel, red is the main color, and the visual effect of the foreground part can be enhanced by raising the color saturation of the red channel. One way to raise the saturation of the main color is to increase the share of the main color.
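The two steps just described, recognizing the main color and increasing its share, can be sketched as follows (NumPy assumed; the per-region channel sum as the "largest share" criterion and the gain value are illustrative assumptions, not the patent's definitive method):

```python
import numpy as np

def boost_main_color(rgb, foreground, gain=1.3):
    # The main color is the channel with the largest total share inside
    # the foreground region; raising its saturation is sketched here by
    # scaling that channel up within the region only.
    img = rgb.astype(np.float32)
    main = int(np.argmax(img[foreground].sum(axis=0)))  # 0=R, 1=G, 2=B
    out = img.copy()
    channel = out[..., main]                 # view into out
    channel[foreground] = np.clip(channel[foreground] * gain, 0.0, 255.0)
    return np.rint(out).astype(np.uint8), main

rgb = np.zeros((2, 2, 3), dtype=np.uint8)
rgb[..., 0] = 100                            # red dominates everywhere
rgb[..., 1] = 50
fg = np.ones((2, 2), dtype=bool)             # whole image as foreground, for brevity
out, main = boost_main_color(rgb, fg)
```

A production implementation would more likely work in an HSV or HSL color space, but the channel-share formulation mirrors the wording of this paragraph.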
Referring to Fig. 12, in some embodiments, the image color processing method comprises the following steps:
S17: determining the background part of the scene master image other than the foreground part; and
S19: processing the scene master image to reduce the color saturation of the background part.
Referring to Fig. 13, in some embodiments, the image color processing apparatus 10 includes a second determining module 17 and a third processing module 19. The second determining module 17 is configured to determine the background part of the scene master image other than the foreground part. The third processing module 19 is configured to process the scene master image to reduce the color saturation of the background part.
In other words, step S17 can be implemented by the second determining module 17, and step S19 can be implemented by the third processing module 19.
In this way, the color saturation of the background part can be reduced, and by contrast the foreground part, that is, the main subject, can be made to stand out.
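Reducing background saturation can be sketched as blending background pixels toward their grayscale value (NumPy assumed; the Rec. 601 luma weights and the blend amount are illustrative choices, not specified by the patent):

```python
import numpy as np

def desaturate_background(rgb, background, amount=0.6):
    # Blend background pixels toward their grayscale (luma) value while
    # leaving foreground pixels untouched, so the subject stands out
    # by contrast.
    img = rgb.astype(np.float32)
    luma = img @ np.array([0.299, 0.587, 0.114], dtype=np.float32)
    blended = img * (1.0 - amount) + luma[..., None] * amount
    out = img.copy()
    out[background] = blended[background]
    return np.rint(out).astype(np.uint8)

rgb = np.array([[[200, 0, 0], [0, 200, 0]]], dtype=np.uint8)
bg = np.array([[True, False]])               # left pixel is background
out = desaturate_background(rgb, bg)
```

With `amount=1.0` the background becomes fully grayscale; intermediate values keep a muted hint of the original color.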
In the description of the embodiments of the present invention, the terms "first" and "second" are used for descriptive purposes only and are not to be understood as indicating or implying relative importance, or as implicitly indicating the number of the technical features indicated. Thus, a feature defined with "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the embodiments of the present invention, "multiple" means two or more, unless specifically defined otherwise.
In the description of the embodiments of the present invention, it should be noted that, unless otherwise explicitly specified and limited, the terms "mounted", "connected", and "coupled" are to be understood broadly: for example, as a fixed connection, a detachable connection, or an integral connection; as a mechanical connection, an electrical connection, or mutual communication; or as a direct connection, an indirect connection through an intermediary, internal communication between two elements, or an interaction between two elements. For those of ordinary skill in the art, the specific meanings of the above terms in the embodiments of the present invention can be understood according to the specific circumstances.
In the description of this specification, reference to the terms "an embodiment", "some embodiments", "a schematic embodiment", "an example", "a specific example", or "some examples" means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present invention. In this specification, schematic uses of these terms do not necessarily refer to the same embodiment or example. Moreover, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
Any process or method description in a flowchart, or otherwise described herein, may be understood as representing a module, segment, or portion of code that includes one or more executable instructions for implementing specific logical functions or steps of the process; and the scope of the preferred embodiments of the present invention includes additional implementations in which functions may be executed out of the order shown or discussed, including substantially concurrently or in the reverse order, as would be understood by those skilled in the art to which the embodiments of the present invention pertain.
The logic and/or steps represented in the flowcharts or otherwise described herein, for example an ordered list of executable instructions that can be considered to implement logical functions, may be embodied in any computer-readable medium for use by, or in connection with, an instruction execution system, apparatus, or device (such as a computer-based system, a system including a processing module, or another system that can fetch and execute instructions from an instruction execution system, apparatus, or device). For the purposes of this specification, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport a program for use by, or in connection with, an instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium include: an electrical connection (an electronic device) with one or more wirings, a portable computer diskette (a magnetic device), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a fiber-optic device, and a portable compact disc read-only memory (CD-ROM). In addition, the computer-readable medium may even be paper or another suitable medium on which the program is printed, since the program can be obtained electronically, for example by optically scanning the paper or other medium and then editing, interpreting, or otherwise processing it in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that each part of the embodiments of the present invention may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, multiple steps or methods may be implemented by software or firmware stored in a memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, they may be implemented using any one or a combination of the following techniques known in the art: a discrete logic circuit with logic gate circuits for implementing logical functions on data signals, an application-specific integrated circuit with suitable combinational logic gate circuits, a programmable gate array (PGA), a field-programmable gate array (FPGA), and so on.
Those of ordinary skill in the art can understand that all or part of the steps carried by the methods of the above embodiments can be completed by a program instructing the relevant hardware; the program can be stored in a computer-readable storage medium and, when executed, includes one of or a combination of the steps of the method embodiments.
In addition, the functional units in the various embodiments of the present invention may be integrated into one processing module, or each unit may exist physically separately, or two or more units may be integrated into one module. The integrated module may be implemented in the form of hardware or in the form of a software function module. If the integrated module is implemented in the form of a software function module and sold or used as an independent product, it may also be stored in a computer-readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic disk, an optical disc, or the like.
Although the embodiments of the present invention have been shown and described above, it can be understood that the above embodiments are exemplary and are not to be construed as limiting the present invention; those of ordinary skill in the art can make changes, modifications, substitutions, and variations to the above embodiments within the scope of the present invention.

Claims (17)

1. A depth-of-field-based image color processing method for processing scene data collected by an imaging device, the scene data including a scene master image, characterized in that the image color processing method comprises the following steps:
processing the scene data to obtain depth information of the scene master image;
determining the foreground part of the scene master image according to the depth information; and
processing the scene master image to raise the color saturation of the foreground part.
2. The image color processing method according to claim 1, characterized in that the scene data includes a depth image corresponding to the scene master image, and the step of processing the scene data to obtain the depth information of the scene master image comprises the following steps:
processing the depth image to obtain depth data of the scene master image; and
processing the depth data to obtain the depth information.
3. The image color processing method according to claim 1, characterized in that the scene data includes a scene sub-image corresponding to the scene master image, and the step of processing the scene data to obtain the depth information of the scene master image comprises the following steps:
processing the scene master image and the scene sub-image to obtain depth data of the scene master image; and
processing the depth data to obtain the depth information.
4. The image color processing method according to claim 1, characterized in that the step of determining the foreground part of the scene master image according to the depth information comprises the following steps:
obtaining the first point of the scene master image according to the depth information; and
finding the region adjoining the first point and continuously varying in depth as the foreground part.
5. The image color processing method as claimed in claim 1, characterized in that the step of processing the scene master image to enhance the color saturation of the foreground part comprises the following steps:
processing the scene master image to identify the main color of the foreground part; and
increasing the saturation of the main color.
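Claim 5 first identifies the foreground's main color. One plausible reading (an assumption; the claim specifies no method) is a dominant-hue histogram over foreground pixels; only pixels falling in the winning hue band would then have their saturation increased.

```python
import colorsys
import numpy as np

def main_color_band(rgb, mask, bins=12):
    # Coarse hue histogram over foreground pixels; returns the [lo, hi)
    # hue interval of the most populated bin as the "main color".
    hues = [colorsys.rgb_to_hsv(*rgb[y, x])[0] for y, x in zip(*np.nonzero(mask))]
    hist, edges = np.histogram(hues, bins=bins, range=(0.0, 1.0))
    k = int(np.argmax(hist))
    return float(edges[k]), float(edges[k + 1])
```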
6. The image color processing method as claimed in claim 1, characterized in that the image color processing method further comprises the following steps:
determining the background part of the scene master image other than the foreground part; and
processing the scene master image to reduce the color saturation of the background part.
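Claims 1 and 6 together describe a "color splash" rendering: a saturated foreground over a muted background. The background half can be vectorized by blending each background pixel toward its grey value (Rec. 601 luma here; the weights and `bg_strength` are illustrative assumptions):

```python
import numpy as np

def desaturate_background(rgb, mask, bg_strength=0.8):
    # Blend background pixels toward their Rec. 601 grey value; the
    # foreground (mask == True) is left untouched.
    gray = rgb @ np.array([0.299, 0.587, 0.114])
    out = rgb.astype(float).copy()
    bg = ~mask
    out[bg] = (1 - bg_strength) * rgb[bg] + bg_strength * gray[bg, None]
    return out
```

With `bg_strength=1.0` the background becomes fully greyscale; intermediate values keep a hint of its original color.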
7. An image color processing apparatus based on the depth of field, for processing scene data collected by an imaging device, the scene data comprising a scene master image, characterized in that the image color processing apparatus comprises:
a first processing module, configured to process the scene data to obtain depth information of the scene master image;
a first determining module, configured to determine a foreground part of the scene master image according to the depth information; and
a second processing module, configured to process the scene master image to enhance the color saturation of the foreground part.
8. The image color processing apparatus as claimed in claim 7, characterized in that the scene data comprises a depth image corresponding to the scene master image, and the first processing module comprises:
a first processing unit, configured to process the depth image to obtain depth data of the scene master image; and
a second processing unit, configured to process the depth data to obtain the depth information.
9. The image color processing apparatus as claimed in claim 7, characterized in that the scene data comprises a scene sub-image corresponding to the scene master image, and the first processing module comprises:
a third processing unit, configured to process the scene master image and the scene sub-image to obtain depth data of the scene master image; and
a fourth processing unit, configured to process the depth data to obtain the depth information.
10. The image color processing apparatus as claimed in claim 7, characterized in that the first determining module comprises:
a fifth processing unit, configured to obtain the foremost point of the scene master image according to the depth information; and
a finding unit, configured to find, as the foreground part, the region that adjoins the foremost point and varies continuously in depth.
11. The image color processing apparatus as claimed in claim 7, characterized in that the second processing module comprises:
a sixth processing unit, configured to process the scene master image to identify the main color of the foreground part; and
a seventh processing unit, configured to increase the saturation of the main color.
12. The image color processing apparatus as claimed in claim 7, characterized in that the image color processing apparatus further comprises:
a second determining module, configured to determine the background part of the scene master image other than the foreground part; and
a third processing module, configured to process the scene master image to reduce the color saturation of the background part.
13. An electronic device, characterized by comprising:
an imaging device; and
the image color processing apparatus as claimed in any one of claims 7 to 12.
14. The electronic device as claimed in claim 13, characterized in that the electronic device comprises a mobile phone or a tablet computer.
15. The electronic device as claimed in claim 13, characterized in that the imaging device comprises a primary camera and a secondary camera.
16. The electronic device as claimed in claim 13, characterized in that the imaging device comprises a camera and a projector.
17. The electronic device as claimed in claim 13, characterized in that the imaging device comprises a TOF depth camera.
CN201710138670.3A 2017-03-09 2017-03-09 Color of image processing method, processing unit and electronic installation based on the depth of field Pending CN106997595A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710138670.3A CN106997595A (en) 2017-03-09 2017-03-09 Color of image processing method, processing unit and electronic installation based on the depth of field

Publications (1)

Publication Number Publication Date
CN106997595A true CN106997595A (en) 2017-08-01

Family

ID=59431777

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710138670.3A Pending CN106997595A (en) 2017-03-09 2017-03-09 Color of image processing method, processing unit and electronic installation based on the depth of field

Country Status (1)

Country Link
CN (1) CN106997595A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1878316A (en) * 2005-06-10 2006-12-13 中华映管股份有限公司 Image color adjusting method
CN102034230A (en) * 2010-12-17 2011-04-27 清华大学 Method for enhancing visibility of image
US20140304271A1 (en) * 2011-10-27 2014-10-09 Agency For Science, Technology And Research Method To Identify An Object And A System For Doing The Same
CN105027145A (en) * 2012-12-13 2015-11-04 微软技术许可有限责任公司 Automatic classification and color enhancement of a markable surface
CN105303543A (en) * 2015-10-23 2016-02-03 努比亚技术有限公司 Image enhancement method and mobile terminal
CN106327473A (en) * 2016-08-10 2017-01-11 北京小米移动软件有限公司 Method and device for acquiring foreground images

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Zhang Fengshou, Song Weidong, Li Zhenwei: "Digital Image Processing Technology and Applications", 30 April 2015, China Water & Power Press *
Chen Jimin: "Fundamentals of 3D Printing Technology", 31 January 2016, National Defense Industry Press *
Ma Guangtao, Yao Chong, Zhang Na: "Fashion Design with Photoshop CC (Chinese Edition)", 31 May 2016, China Youth Press *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109816663A (en) * 2018-10-15 2019-05-28 华为技术有限公司 A kind of image processing method, device and equipment
CN113112505A (en) * 2018-10-15 2021-07-13 华为技术有限公司 Image processing method, device and equipment
CN113112505B (en) * 2018-10-15 2022-04-29 华为技术有限公司 Image processing method, device and equipment
WO2021098518A1 (en) * 2019-11-18 2021-05-27 RealMe重庆移动通信有限公司 Image adjustment method and apparatus, electronic device, and storage medium

Similar Documents

Publication Publication Date Title
CN106851123A Exposure control method, exposure control device and electronic installation
CN106851238B (en) Method for controlling white balance, white balance control device and electronic device
US8406510B2 (en) Methods for evaluating distances in a scene and apparatus and machine readable medium using the same
CN106851124A (en) Image processing method, processing unit and electronic installation based on the depth of field
US10839530B1 (en) Moving point detection
CN106993112A (en) Background-blurring method and device and electronic installation based on the depth of field
US10477178B2 (en) High-speed and tunable scene reconstruction systems and methods using stereo imagery
US8319854B2 (en) Shadow removal in an image captured by a vehicle based camera using a non-linear illumination-invariant kernel
CN107172353B Automatic exposure method, device and computer equipment
CN103916603B Backlight detection method and equipment
US8345100B2 (en) Shadow removal in an image captured by a vehicle-based camera using an optimized oriented linear axis
CN107025635A Image saturation processing method, processing unit and electronic installation based on the depth of field
CN111462503B (en) Vehicle speed measuring method and device and computer readable storage medium
CN101527046A (en) Motion detection method, device and system
CN106851107A Switching control method, control device and electronic installation for camera-assisted composition
CN110378946B (en) Depth map processing method and device and electronic equipment
US10764561B1 (en) Passive stereo depth sensing
CN106998389A Automatic composition control method, control device and electronic installation
CN106875433A Composition cropping control method, control device and electronic installation
CN110443186B (en) Stereo matching method, image processing chip and mobile carrier
CN104424640A (en) Method and device for carrying out blurring processing on images
CN111837158A (en) Image processing method and device, shooting device and movable platform
CN115035235A (en) Three-dimensional reconstruction method and device
CN106997595A (en) Color of image processing method, processing unit and electronic installation based on the depth of field
CN107016651A (en) Image sharpening method, image sharpening device and electronic installation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20170801
