CN109598753A - Image processing method and device - Google Patents

Image processing method and device

Info

Publication number
CN109598753A
CN109598753A (application number CN201811440471.9A)
Authority
CN
China
Prior art keywords
depth information
background object
pixel
equation
depth
Prior art date
Legal status
Granted
Application number
CN201811440471.9A
Other languages
Chinese (zh)
Other versions
CN109598753B (en)
Inventor
尚砚娜
杨汇成
Current Assignee
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date
Filing date
Publication date
Application filed by Lenovo Beijing Ltd
Priority to CN201811440471.9A
Publication of CN109598753A
Application granted
Publication of CN109598753B
Legal status: Active
Anticipated expiration

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/50: Depth or shape recovery
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/10: Image acquisition modality
    • G06T2207/10028: Range image; Depth image; 3D point clouds

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The present disclosure provides an image processing method, including: obtaining an image to be processed, where the image to be processed contains a foreground object and a background object; obtaining first depth information, where the first depth information characterizes the depths at which multiple pixels in the image correspond to the surface of the foreground object; obtaining second depth information, where the second depth information characterizes the depths at which the multiple pixels correspond to the surface of the background object; and, for each of the multiple pixels, amplifying the difference between the corresponding depth values in the first depth information and the second depth information. The present disclosure also provides an image processing apparatus.

Description

Image processing method and device
Technical field
The present disclosure relates to an image processing method and an image processing apparatus.
Background art
When extracting regions from an image that contains depth variation, if the background portion of the measured object itself varies greatly in depth while the foreground portion varies only slightly relative to its surrounding background, the contrast between foreground and background in the image is low, which makes subsequent region extraction and detection very difficult.
For example, suppose characters are engraved on the surface of a ceramic cup and the cup is the measured object. The foreground portion of the image may be the characters on the cup, and the background portion may be the area around the characters. Because the cup has a certain curvature, the depth of the background portion itself varies considerably, whereas the area occupied by the characters is small and the engraving is shallow, so the depth variation of the foreground relative to the surrounding background is slight.
Summary of the invention
One aspect of the present disclosure provides an image processing method that processes an image by amplifying the difference between foreground depth and background depth, the method comprising: obtaining an image to be processed, wherein the image to be processed contains a foreground object and a background object; obtaining first depth information, wherein the first depth information characterizes the depths at which multiple pixels in the image correspond to the surface of the foreground object; obtaining second depth information, wherein the second depth information characterizes the depths at which the multiple pixels correspond to the surface of the background object; and, for each pixel of the multiple pixels, amplifying the difference between the corresponding depth values in the first depth information and the second depth information.
Optionally, obtaining the second depth information comprises: determining a fit equation for fitting the surface of the background object; and taking each pixel of the multiple pixels as a variable of the fit equation, calculating the depth at which each pixel corresponds to the surface of the background object, to obtain the second depth information.
Optionally, determining the fit equation for fitting the surface of the background object comprises: if the surface of the background object is planar, determining a line equation or a plane equation for fitting the surface of the background object; or, if the surface of the background object is curved, determining a curve equation or a curved-surface equation for fitting the surface of the background object.
Optionally, determining the fit equation for fitting the surface of the background object comprises: determining a predetermined number of pixels located on the surface of the background object; measuring the depth information corresponding to the predetermined number of pixels; and fitting the surface of the background object using the predetermined number of pixels and the measured depth information, to obtain the fit equation.
Optionally, the multiple pixels include any one of the following: all pixels located on the surface of the foreground object; multiple feature pixels located on the surface of the foreground object; or all pixels located on the edge where the foreground object adjoins the background object.
Another aspect of the present disclosure provides an image processing apparatus, comprising: a first acquisition module for obtaining an image to be processed, wherein the image to be processed contains a foreground object and a background object; a second acquisition module for obtaining first depth information, wherein the first depth information characterizes the depths at which multiple pixels in the image correspond to the surface of the foreground object; a third acquisition module for obtaining second depth information, wherein the second depth information characterizes the depths at which the multiple pixels correspond to the surface of the background object; and a processing module for amplifying, for each pixel of the multiple pixels, the difference between the corresponding depth values in the first depth information and the second depth information.
Optionally, the third acquisition module comprises: a determination unit for determining a fit equation for fitting the surface of the background object; and a computing unit for taking each pixel of the multiple pixels as a variable of the fit equation and calculating the depth at which each pixel corresponds to the surface of the background object, to obtain the second depth information.
Optionally, the determination unit is further configured to: if the surface of the background object is planar, determine a line equation or a plane equation for fitting the surface of the background object; or, if the surface of the background object is curved, determine a curve equation or a curved-surface equation for fitting the surface of the background object.
Optionally, the determination unit comprises: a determination subunit for determining a predetermined number of pixels located on the surface of the background object; a measurement subunit for measuring the depth information corresponding to the predetermined number of pixels; and a fitting subunit for fitting the surface of the background object using the predetermined number of pixels and the measured depth information, to obtain the fit equation.
Optionally, the multiple pixels include any one of the following: all pixels located on the surface of the foreground object; multiple feature pixels located on the surface of the foreground object; or all pixels located on the edge where the foreground object adjoins the background object.
Another aspect of the present disclosure provides a computer device, comprising: one or more processors; and a memory for storing one or more programs, wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method described above.
Another aspect of the present disclosure provides a computer-readable storage medium storing computer-executable instructions which, when executed, implement the method described above.
Another aspect of the present disclosure provides a computer program comprising computer-executable instructions which, when executed, implement the method described above.
Brief description of the drawings
For a better understanding of the present disclosure and its advantages, reference is now made to the following description taken in conjunction with the accompanying drawings, in which:
Fig. 1 schematically illustrates an application scenario of the image processing method and apparatus according to an embodiment of the present disclosure;
Fig. 2 schematically illustrates a flowchart of the image processing method according to an embodiment of the present disclosure;
Fig. 3A schematically illustrates the effect before image enhancement according to an embodiment of the present disclosure;
Fig. 3B schematically illustrates the effect after image enhancement according to an embodiment of the present disclosure;
Fig. 4 schematically illustrates the determination of the fit equation of the background surface according to an embodiment of the present disclosure;
Fig. 5 schematically illustrates a block diagram of the image processing apparatus according to an embodiment of the present disclosure;
Fig. 6 schematically illustrates a block diagram of the third acquisition module according to an embodiment of the present disclosure;
Fig. 7 schematically illustrates a block diagram of the determination unit according to an embodiment of the present disclosure; and
Fig. 8 schematically illustrates a block diagram of a computer system suitable for implementing the image processing method and apparatus according to an embodiment of the present disclosure.
Detailed description of embodiments
Embodiments of the present disclosure will be described below with reference to the accompanying drawings. It should be understood, however, that these descriptions are merely exemplary and are not intended to limit the scope of the present disclosure. In the following detailed description, numerous specific details are set forth for ease of explanation in order to provide a thorough understanding of the embodiments of the present disclosure. It will be apparent, however, that one or more embodiments may also be practiced without these specific details. In addition, in the following description, descriptions of well-known structures and techniques are omitted to avoid unnecessarily obscuring the concepts of the present disclosure.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to limit the present disclosure. The terms "include", "comprise", and the like used herein indicate the presence of the stated features, steps, operations, and/or components, but do not exclude the presence or addition of one or more other features, steps, operations, or components.
All terms used herein (including technical and scientific terms) have the meanings commonly understood by those skilled in the art, unless otherwise defined. It should be noted that the terms used herein should be interpreted as having meanings consistent with the context of this specification, and should not be interpreted in an idealized or overly rigid manner.
Where an expression similar to "at least one of A, B, and C" is used, it should in general be interpreted according to the meaning commonly understood by those skilled in the art (for example, "a system having at least one of A, B, and C" includes, but is not limited to, systems having A alone, B alone, C alone, A and B, A and C, B and C, and/or A, B, and C). Where an expression similar to "at least one of A, B, or C" is used, it should likewise be interpreted according to the meaning commonly understood by those skilled in the art (for example, "a system having at least one of A, B, or C" includes, but is not limited to, systems having A alone, B alone, C alone, A and B, A and C, B and C, and/or A, B, and C).
Some block diagrams and/or flowcharts are shown in the drawings. It should be understood that some blocks of the block diagrams and/or flowcharts, or combinations thereof, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, a special-purpose computer, or another programmable data-processing apparatus, so that the instructions, when executed by the processor, create means for implementing the functions/operations illustrated in these block diagrams and/or flowcharts. The techniques of the present disclosure may be implemented in the form of hardware and/or software (including firmware, microcode, and the like). In addition, the techniques of the present disclosure may take the form of a computer program product on a computer-readable storage medium storing instructions, the computer program product being for use by, or in connection with, an instruction execution system.
An embodiment of the present disclosure provides an image processing method that processes an image by amplifying the difference between foreground depth and background depth, and an image processing apparatus to which the method can be applied. The method includes obtaining an image to be processed, wherein the image to be processed contains a foreground object and a background object; obtaining first depth information, wherein the first depth information characterizes the depths at which multiple pixels in the image correspond to the surface of the foreground object; obtaining second depth information, wherein the second depth information characterizes the depths at which the multiple pixels correspond to the surface of the background object; and, for each pixel of the multiple pixels, amplifying the difference between the corresponding depth values in the first depth information and the second depth information.
Fig. 1 schematically illustrates an application scenario of the image processing method and apparatus according to an embodiment of the present disclosure. It should be noted that Fig. 1 shows only an example of a scenario to which embodiments of the present disclosure may be applied, to help those skilled in the art understand the technical content of the present disclosure; it does not mean that the embodiments of the present disclosure cannot be used in other devices, systems, environments, or scenarios.
When extracting regions from an image that contains depth variation, if the background portion of the measured object itself varies greatly in depth while the foreground portion (also called the foreground object) varies only slightly relative to its surrounding background (also called the background object), the contrast between foreground and background in the image is low, which makes subsequent region extraction and detection very difficult.
In the application scenario shown in Fig. 1, the depth of the background portion of the image clearly varies considerably, while the depth of the foreground portion varies only slightly relative to the surrounding background, so the contrast between the two is low and image extraction/detection is difficult. Using the technical solution provided by the embodiments of the present disclosure, the contrast between the foreground portion and the background portion can be enhanced, which greatly facilitates subsequent image extraction/detection.
Fig. 2 schematically illustrates a flowchart of the image processing method according to an embodiment of the present disclosure.
As shown in Fig. 2, the method includes operations S210 to S240:
In operation S210, an image to be processed is obtained, wherein the image to be processed contains a foreground object and a background object.
It should be noted that foreground and background are relative concepts within an image. In general, certain objects in the image can be set as the foreground object according to the user's actual needs, and the remaining objects are set as the background object.
For example, defects or characters in the image can be set as the foreground object, and everything other than the defects or characters can be set as the background object.
In operation S220, first depth information is obtained, wherein the first depth information characterizes the depths at which multiple pixels in the image correspond to the surface of the foreground object.
In other words, the first depth information characterizes the depth of the surface of the foreground object. Specifically, this depth information can be determined by directly reading the depths of the pixels corresponding to the surface of the foreground object.
It should be noted that the multiple pixels corresponding to the surface of the foreground object may include, but are not limited to, any one of the following: all pixels located on the surface of the foreground object; multiple feature pixels located on the surface of the foreground object (i.e., those of the aforementioned "all pixels" that represent features of the surface of the foreground object); or all pixels located on the edge where the foreground object adjoins the background object. A minimal sketch of how such pixel sets might be selected follows.
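As an illustration only, the following minimal numpy sketch shows how two of these candidate pixel sets (all foreground pixels, and the foreground pixels on the edge adjoining the background) might be derived from a boolean foreground mask; the mask name fg_mask, the function name, and the choice of 4-connectivity are assumptions made for this sketch and are not prescribed by the disclosure.

```python
import numpy as np

def foreground_pixel_sets(fg_mask: np.ndarray):
    """Given a boolean foreground mask, return two of the candidate pixel sets
    mentioned above: all foreground pixels, and the foreground pixels that border
    the background (edge pixels). Feature-pixel selection is application-specific
    (e.g. corners of engraved characters) and is not shown here."""
    all_fg = np.argwhere(fg_mask)                 # every pixel on the foreground surface

    # A pixel counts as an edge pixel if it is foreground and has at least one
    # 4-connected background neighbour.
    padded = np.pad(fg_mask, 1, constant_values=False)
    has_bg_neighbour = (
        ~padded[:-2, 1:-1] | ~padded[2:, 1:-1] |  # neighbour above / below
        ~padded[1:-1, :-2] | ~padded[1:-1, 2:]    # neighbour left / right
    )
    edge_fg = np.argwhere(fg_mask & has_bg_neighbour)
    return all_fg, edge_fg
```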
In operation S230, second depth information is obtained, wherein the second depth information characterizes the depths at which the multiple pixels correspond to the surface of the background object.
In other words, the second depth information characterizes the depth of the surface of the background object; more specifically, it characterizes the depth of the portion of the background surface that is occluded by the foreground object. Because that portion of the background surface is in fact covered by the foreground object, this depth information cannot be determined by directly reading the depths of the corresponding pixels.
Specifically, in an embodiment of the present disclosure, the second depth information can be determined by fitting an equation to the surface of the background object. How the equation is fitted to the surface of the background object is described in detail in the following embodiments and is not repeated here.
In operation S240, for each pixel of the multiple pixels, the difference between the corresponding depth values in the first depth information and the second depth information is amplified.
For example, suppose the fit equation of the surface of the background object is g, the point (x0, y0) lies on the surface of the background object, and the point (x0, y1) lies on the surface of the foreground object. Substituting the point (x0, y0) into the equation g yields the value of y0, while the value of y1 can be obtained by measuring or reading the depth of the corresponding pixel. The aforementioned second depth information can be denoted by y0 and the aforementioned first depth information by y1. If (y1 - y0) is denoted y2, then y2 represents the difference between the two depth values, and amplifying y2 enhances the contrast between the foreground object and the background object; specifically, y2 can be mapped proportionally into the value range 0 to 255.
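Reading operation S240 and the y2 example above literally, the following sketch amplifies the per-pixel difference and maps it proportionally into the range 0 to 255; the min-max scaling and the array names are assumptions of this sketch, since the disclosure does not fix a particular mapping.

```python
import numpy as np

def enhance_contrast(depth_fg: np.ndarray, depth_bg_fit: np.ndarray) -> np.ndarray:
    """depth_fg:     measured depths of the pixels on the foreground surface (y1 above).
       depth_bg_fit: fitted depths of the occluded background surface at the same
                     pixels (y0 above).
       Returns the difference y2 = y1 - y0 rescaled proportionally into 0..255."""
    diff = depth_fg.astype(float) - depth_bg_fit.astype(float)   # y2 = y1 - y0
    span = diff.max() - diff.min()
    if span == 0:                                   # flat difference: nothing to amplify
        return np.zeros_like(diff, dtype=np.uint8)
    scaled = (diff - diff.min()) / span * 255.0     # map the difference into 0~255
    return scaled.astype(np.uint8)
```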
Compared with the prior art, in which region extraction/detection is difficult, or even impossible to perform accurately, when the contrast between foreground and background is low, the embodiment of the present disclosure amplifies the contrast between foreground and background and thereby facilitates image region extraction/detection.
For example, the upper image in Fig. 3A shows the effect before image enhancement, and the lower image in Fig. 3A shows the result of image segmentation based on the upper image in Fig. 3A; the segmentation result is clearly poor. The upper image in Fig. 3B shows the effect after image enhancement obtained with the solution provided by the present disclosure, and the lower image in Fig. 3B shows the result of image segmentation based on the upper image in Fig. 3B; clearly, compared with Fig. 3A, segmentation based on the upper image in Fig. 3B performs better.
The method shown in Fig. 2 is further described below with reference to Fig. 4 and specific embodiments.
As an optional embodiment, obtaining the second depth information includes:
determining a fit equation for fitting the surface of the background object; and
taking each pixel of the multiple pixels as a variable of the fit equation, calculating the depth at which each pixel corresponds to the surface of the background object, to obtain the second depth information.
As noted above, the second depth information characterizes the depth of the surface of the background object; more specifically, it characterizes the depth of the portion of the background surface occluded by the foreground object. Because that portion of the background surface is in fact covered by the foreground object, this depth information cannot be determined by directly reading or directly measuring the depths of the corresponding pixels.
Suppose the cross-section of the real object corresponding to the foreground and background portions of an image is as shown in Fig. 4, where the shaded part corresponds to the foreground object in the image and the straight part corresponds to the background object. For the surface of the background object shown in Fig. 4, a line equation or a plane equation can therefore be fitted.
Taking the fitting of a line equation as an example (a code sketch follows the steps below):
(1) Assume the fit equation for the surface of the background object is y = kx + b;
(2) Measure the depths of any two points on the surface of the background object (in the portion not occluded by the foreground object), for example (x2, y2) and (x3, y3);
(3) Substitute (x2, y2) and (x3, y3) into y = kx + b to obtain the value of k (say k = k1) and the value of b (say b = b1), thereby determining the fit equation y = k1x + b1 for the surface of the background object;
(4) Calculate the second depth information. For example, let y1 denote the depth at point 1 on the surface of the foreground object corresponding to x0, and let y0 denote the depth at point 2 on the surface of the background object corresponding to x0, where point 1 and point 2 correspond to each other. Substituting x0 into y = k1x + b1 gives y0 = k1·x0 + b1, and (k1·x0 + b1) is one value of the second depth information;
(5) The depths at which other points correspond to the surface of the background object can be obtained in the same way as in step (4).
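A minimal sketch of steps (1) through (5), assuming the depth is taken along a single image row so that each pixel contributes a point (x, y) with x its coordinate and y its depth; the function names and the numbers in the commented usage are invented for illustration.

```python
import numpy as np

def fit_line(p2, p3):
    """Solve y = k*x + b through the two measured background points (x2, y2) and (x3, y3)."""
    (x2, y2), (x3, y3) = p2, p3
    k = (y3 - y2) / (x3 - x2)          # slope k1
    b = y2 - k * x2                    # intercept b1
    return k, b

def second_depth_info(xs_fg, p2, p3):
    """For every foreground pixel coordinate x0, estimate the depth the occluded
    background surface would have there: y0 = k1*x0 + b1 (one value of the
    'second depth information' per pixel)."""
    k, b = fit_line(p2, p3)
    return k * np.asarray(xs_fg, dtype=float) + b

# Hypothetical usage with made-up measurements:
# y0 = second_depth_info(xs_fg=[12, 13, 14], p2=(10, 5.0), p3=(20, 7.0))
```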
According to this embodiment of the present disclosure, because the portion of the background surface corresponding to the aforementioned multiple pixels is in fact covered by the surface of the foreground object and is difficult to measure directly, the depth of the background surface at those pixels can be estimated by means of the fit equation.
As an optional embodiment, determining the fit equation for fitting the surface of the background object includes:
if the surface of the background object is planar, determining a line equation or a plane equation for fitting the surface of the background object; or
if the surface of the background object is curved, determining a curve equation or a curved-surface equation for fitting the surface of the background object.
That is, when fitting an equation to the surface of the background object, the type of the surface can be determined first and a corresponding equation then fitted. If the surface of the background object is planar, either of two kinds of equations can be fitted: a line equation or a plane equation. If the surface of the background object is curved, either of two kinds of equations can likewise be fitted: a curve equation or a curved-surface equation.
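For the curved case the disclosure does not prescribe a particular model. One possible choice, sketched below under that assumption, is to fit a quadratic surface z = a*x^2 + b*y^2 + c*x*y + d*x + e*y + f to unoccluded background samples by least squares and then evaluate it at the occluded pixels; the quadratic form and the numpy solver are choices made for this sketch only.

```python
import numpy as np

def fit_quadratic_surface(xs, ys, zs):
    """Least-squares fit of z = a*x^2 + b*y^2 + c*x*y + d*x + e*y + f to background
    sample points (xs[i], ys[i]) with measured depths zs[i] that are NOT occluded
    by the foreground. Returns the six coefficients (a, b, c, d, e, f)."""
    xs, ys, zs = (np.asarray(v, dtype=float) for v in (xs, ys, zs))
    A = np.column_stack([xs**2, ys**2, xs * ys, xs, ys, np.ones_like(xs)])
    coeffs, *_ = np.linalg.lstsq(A, zs, rcond=None)
    return coeffs

def eval_quadratic_surface(coeffs, x, y):
    """Estimated background depth at pixel (x, y), i.e. one value of the second depth information."""
    a, b, c, d, e, f = coeffs
    return a * x**2 + b * y**2 + c * x * y + d * x + e * y + f
```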
As an optional embodiment, determining the fit equation for fitting the surface of the background object includes:
determining a predetermined number of pixels located on the surface of the background object;
measuring the depth information corresponding to the predetermined number of pixels; and
fitting the surface of the background object using the predetermined number of pixels and the measured depth information, to obtain the fit equation.
Taking the case where the surface of the background object is planar as an example, the procedure for fitting a line equation is as follows: assume the line equation is y = kx + b, measure the depths of the two points (x2, y2) and (x3, y3), and substitute them into y = kx + b to obtain the fitted line equation for the surface of the background object. The procedure for fitting a plane equation is as follows: assume the plane equation is Ax + By + Cz + D = 0, measure the depths of the three points (x1, y1), (x2, y2), and (x3, y3), and substitute them into Ax + By + Cz + D = 0 to obtain the fitted plane equation for the surface of the background object.
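A sketch of the plane-fitting step just described: the plane Ax + By + Cz + D = 0 through three measured background points can be obtained from the normal vector given by a cross product, after which the depth z of the occluded background at any pixel (x, y) follows by rearranging the equation. The helper names are invented for this sketch.

```python
import numpy as np

def fit_plane(p1, p2, p3):
    """Plane A*x + B*y + C*z + D = 0 through three measured background points,
    each given as (x, y, depth)."""
    p1, p2, p3 = (np.asarray(p, dtype=float) for p in (p1, p2, p3))
    normal = np.cross(p2 - p1, p3 - p1)      # (A, B, C)
    A, B, C = normal
    D = -normal.dot(p1)
    return A, B, C, D

def background_depth(A, B, C, D, x, y):
    """Estimated depth z of the occluded background surface at pixel (x, y);
    assumes the plane is not parallel to the depth axis (C != 0)."""
    return -(A * x + B * y + D) / C
```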
The method of fitting a corresponding equation when the surface of the background object is curved is similar to the method of fitting a corresponding equation when the surface of the background object is planar, and is not repeated here.
According to this embodiment of the present disclosure, a fit equation matching the type of the surface of the background object (for example, planar or curved) can be chosen, the depths of certain points can be measured and substituted into the chosen equation, and the fit equation of the surface of the background object can thereby be obtained.
Fig. 5 schematically illustrates a block diagram of the image processing apparatus according to an embodiment of the present disclosure.
As shown in Fig. 5, the image processing apparatus 500 includes a first acquisition module 510, a second acquisition module 520, a third acquisition module 530, and a processing module 540. The image processing apparatus 500 can perform the method described above, so as to enhance the contrast between foreground and background in an image and thereby facilitate image detection.
Specifically, the first acquisition module 510 is configured to obtain an image to be processed, wherein the image to be processed contains a foreground object and a background object;
the second acquisition module 520 is configured to obtain first depth information, wherein the first depth information characterizes the depths at which multiple pixels in the image correspond to the surface of the foreground object;
the third acquisition module 530 is configured to obtain second depth information, wherein the second depth information characterizes the depths at which the multiple pixels correspond to the surface of the background object; and
the processing module 540 is configured to amplify, for each pixel of the multiple pixels, the difference between the corresponding depth values in the first depth information and the second depth information. A sketch of how these modules might be composed is given below.
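Purely to illustrate how the four modules cooperate, the following sketch composes them into an apparatus object; every method name used here (get_image, get_first_depth, get_second_depth, amplify) is hypothetical and is not taken from the disclosure.

```python
class ImageProcessingApparatus:
    """Illustrative composition of the four modules described above."""

    def __init__(self, first_acq, second_acq, third_acq, processing):
        self.first_acquisition = first_acq    # obtains the image to be processed
        self.second_acquisition = second_acq  # obtains the first (foreground) depth information
        self.third_acquisition = third_acq    # obtains the second (background) depth information
        self.processing = processing          # amplifies the per-pixel depth difference

    def run(self):
        image = self.first_acquisition.get_image()
        depth_fg = self.second_acquisition.get_first_depth(image)
        depth_bg = self.third_acquisition.get_second_depth(image)
        return self.processing.amplify(depth_fg, depth_bg)
```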
Compared with the prior art, in which region extraction/detection is difficult, or even impossible to perform accurately, when the contrast between foreground and background is low, the embodiment of the present disclosure amplifies the contrast between foreground and background and thereby facilitates image region extraction/detection.
As an optional embodiment, as shown in Fig. 6, the third acquisition module 530 includes: a determination unit 610 for determining a fit equation for fitting the surface of the background object; and a computing unit 620 for taking each pixel of the multiple pixels as a variable of the fit equation and calculating the depth at which each pixel corresponds to the surface of the background object, to obtain the second depth information.
According to this embodiment of the present disclosure, because the portion of the background surface corresponding to the aforementioned multiple pixels is in fact covered by the surface of the foreground object and is difficult to measure directly, the depth of the background surface at those pixels can be estimated by means of the fit equation.
As an optional embodiment, the determination unit is further configured to: if the surface of the background object is planar, determine a line equation or a plane equation for fitting the surface of the background object; or, if the surface of the background object is curved, determine a curve equation or a curved-surface equation for fitting the surface of the background object.
As an optional embodiment, as shown in Fig. 7, the determination unit 610 includes: a determination subunit 710 for determining a predetermined number of pixels located on the surface of the background object; a measurement subunit 720 for measuring the depth information corresponding to the predetermined number of pixels; and a fitting subunit 730 for fitting the surface of the background object using the predetermined number of pixels and the measured depth information, to obtain the fit equation.
According to this embodiment of the present disclosure, a fit equation matching the type of the surface of the background object (for example, planar or curved) can be chosen, the depths of certain points can be measured and substituted into the chosen equation, and the fit equation of the surface of the background object can thereby be obtained.
As an optional embodiment, the multiple pixels include any one of the following: all pixels located on the surface of the foreground object; multiple feature pixels located on the surface of the foreground object; or all pixels located on the edge where the foreground object adjoins the background object.
Any number of the modules, units, and subunits according to embodiments of the present disclosure, or at least part of the functions of any number of them, may be implemented in a single module. Any one or more of the modules, units, and subunits according to embodiments of the present disclosure may be split into multiple modules. Any one or more of the modules, units, and subunits according to embodiments of the present disclosure may be implemented at least in part as a hardware circuit, such as a field-programmable gate array (FPGA), a programmable logic array (PLA), a system on chip, a system on a substrate, a system in a package, or an application-specific integrated circuit (ASIC), or may be implemented by hardware or firmware in any other reasonable way of integrating or packaging circuits, or may be implemented by any one of, or an appropriate combination of any of, the three implementation modes of software, hardware, and firmware. Alternatively, one or more of the modules, units, and subunits according to embodiments of the present disclosure may be implemented at least in part as computer program modules which, when run, perform the corresponding functions.
For example, any number of the first acquisition module 510, the second acquisition module 520, the third acquisition module 530, and the processing module 540 may be combined and implemented in one module, or any one of them may be split into multiple modules. Alternatively, at least part of the functions of one or more of these modules may be combined with at least part of the functions of other modules and implemented in one module. According to an embodiment of the present disclosure, at least one of the first acquisition module 510, the second acquisition module 520, the third acquisition module 530, and the processing module 540 may be implemented at least in part as a hardware circuit, such as a field-programmable gate array (FPGA), a programmable logic array (PLA), a system on chip, a system on a substrate, a system in a package, or an application-specific integrated circuit (ASIC), or may be implemented by hardware or firmware in any other reasonable way of integrating or packaging circuits, or may be implemented by any one of, or an appropriate combination of any of, the three implementation modes of software, hardware, and firmware. Alternatively, at least one of the first acquisition module 510, the second acquisition module 520, the third acquisition module 530, and the processing module 540 may be implemented at least in part as a computer program module which, when run, performs the corresponding function.
Fig. 8 schematically illustrates a block diagram of a computer system suitable for implementing the image processing method and apparatus according to an embodiment of the present disclosure. The computer system shown in Fig. 8 is only an example and should not impose any limitation on the functionality or scope of use of the embodiments of the present disclosure.
As shown in Fig. 8, the computer system 800 includes a processor 810 and a computer-readable storage medium 820. The computer system 800 can perform the method according to an embodiment of the present disclosure.
Specifically, the processor 810 may include, for example, a general-purpose microprocessor, an instruction set processor and/or a related chipset, and/or a special-purpose microprocessor (for example, an application-specific integrated circuit (ASIC)), and so on. The processor 810 may also include onboard memory for caching purposes. The processor 810 may be a single processing unit or multiple processing units for performing the different actions of the method flow according to an embodiment of the present disclosure.
The computer-readable storage medium 820 may be, for example, a non-volatile computer-readable storage medium; specific examples include, but are not limited to: magnetic storage devices, such as magnetic tape or hard disk (HDD); optical storage devices, such as optical disc (CD-ROM); memories, such as random access memory (RAM) or flash memory; and so on.
The computer-readable storage medium 820 may include a computer program 821, which may include code/computer-executable instructions that, when executed by the processor 810, cause the processor 810 to perform the method according to an embodiment of the present disclosure or any variant thereof.
The computer program 821 may be configured with computer program code including, for example, computer program modules. For example, in an exemplary embodiment, the code in the computer program 821 may include one or more program modules, for example module 821A, module 821B, and so on. It should be noted that the manner of dividing the modules and the number of modules are not fixed; those skilled in the art may use suitable program modules or combinations of program modules according to the actual situation, and when these program module combinations are executed by the processor 810, the processor 810 performs the method according to an embodiment of the present disclosure or any variant thereof.
According to an embodiment of the present disclosure, the processor 810 may perform the method according to an embodiment of the present disclosure or any variant thereof.
According to an embodiment of the present disclosure, at least one of the first acquisition module 510, the second acquisition module 520, the third acquisition module 530, and the processing module 540 may be implemented as a computer program module described with reference to Fig. 8, which, when executed by the processor 810, can implement the corresponding operations described above.
The present disclosure also provides a computer-readable storage medium, which may be included in the device/apparatus/system described in the above embodiments, or may exist separately without being incorporated into the device/apparatus/system. The computer-readable storage medium carries one or more programs which, when executed, implement the method according to an embodiment of the present disclosure.
According to an embodiment of the present disclosure, the computer-readable storage medium may be a non-volatile computer-readable storage medium, which may include, but is not limited to: a portable computer diskette, a hard disk, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above. In the present disclosure, a computer-readable storage medium may be any tangible medium that contains or stores a program which can be used by, or in connection with, an instruction execution system, apparatus, or device.
The flowcharts and block diagrams in the drawings illustrate the possible architectures, functions, and operations of the apparatus, method, and computer program product according to various embodiments of the present disclosure. In this regard, each block in a flowchart or block diagram may represent a module, a program segment, or part of code, and the module, program segment, or part of code contains one or more executable instructions for implementing the specified logical function. It should also be noted that, in some alternative implementations, the functions noted in the blocks may occur in an order different from that noted in the drawings. For example, two blocks shown in succession may in fact be executed substantially in parallel, or they may sometimes be executed in the reverse order, depending on the functions involved. It should also be noted that each block in a block diagram or flowchart, and combinations of blocks in a block diagram or flowchart, can be implemented by a dedicated hardware-based system that performs the specified functions or operations, or by a combination of dedicated hardware and computer instructions.
Those skilled in the art will understand that the features recited in the various embodiments and/or claims of the present disclosure may be combined in many ways, even if such combinations are not explicitly recited in the present disclosure. In particular, without departing from the spirit or teachings of the present disclosure, the features recited in the various embodiments and/or claims of the present disclosure may be combined in many ways. All such combinations fall within the scope of the present disclosure.
Although the present disclosure has been shown and described with reference to certain exemplary embodiments thereof, those skilled in the art should understand that various changes in form and detail may be made to the present disclosure without departing from the spirit and scope of the present disclosure as defined by the appended claims and their equivalents. Therefore, the scope of the present disclosure should not be limited to the above embodiments, but should be determined not only by the appended claims but also by the equivalents of the appended claims.

Claims (10)

1. An image processing method, comprising:
obtaining an image to be processed, wherein the image to be processed contains a foreground object and a background object;
obtaining first depth information, wherein the first depth information characterizes the depths at which multiple pixels in the image correspond to the surface of the foreground object;
obtaining second depth information, wherein the second depth information characterizes the depths at which the multiple pixels correspond to the surface of the background object; and
for each pixel of the multiple pixels, amplifying the difference between the corresponding depth values in the first depth information and the second depth information.
2. The method according to claim 1, wherein obtaining the second depth information comprises:
determining a fit equation for fitting the surface of the background object; and
taking each pixel of the multiple pixels as a variable of the fit equation, calculating the depth at which each pixel corresponds to the surface of the background object, to obtain the second depth information.
3. The method according to claim 2, wherein determining the fit equation for fitting the surface of the background object comprises:
if the surface of the background object is planar, determining a line equation or a plane equation for fitting the surface of the background object; or
if the surface of the background object is curved, determining a curve equation or a curved-surface equation for fitting the surface of the background object.
4. The method according to claim 2, wherein determining the fit equation for fitting the surface of the background object comprises:
determining a predetermined number of pixels located on the surface of the background object;
measuring the depth information corresponding to the predetermined number of pixels; and
fitting the surface of the background object using the predetermined number of pixels and the measured depth information, to obtain the fit equation.
5. The method according to claim 1, wherein the multiple pixels comprise any one of the following:
all pixels located on the surface of the foreground object;
multiple feature pixels located on the surface of the foreground object; or
all pixels located on the edge where the foreground object adjoins the background object.
6. An image processing apparatus, comprising:
a first acquisition module, configured to obtain an image to be processed, wherein the image to be processed contains a foreground object and a background object;
a second acquisition module, configured to obtain first depth information, wherein the first depth information characterizes the depths at which multiple pixels in the image correspond to the surface of the foreground object;
a third acquisition module, configured to obtain second depth information, wherein the second depth information characterizes the depths at which the multiple pixels correspond to the surface of the background object; and
a processing module, configured to amplify, for each pixel of the multiple pixels, the difference between the corresponding depth values in the first depth information and the second depth information.
7. The apparatus according to claim 6, wherein the third acquisition module comprises:
a determination unit, configured to determine a fit equation for fitting the surface of the background object; and
a computing unit, configured to take each pixel of the multiple pixels as a variable of the fit equation and calculate the depth at which each pixel corresponds to the surface of the background object, to obtain the second depth information.
8. The apparatus according to claim 7, wherein the determination unit is further configured to:
if the surface of the background object is planar, determine a line equation or a plane equation for fitting the surface of the background object; or
if the surface of the background object is curved, determine a curve equation or a curved-surface equation for fitting the surface of the background object.
9. The apparatus according to claim 7, wherein the determination unit comprises:
a determination subunit, configured to determine a predetermined number of pixels located on the surface of the background object;
a measurement subunit, configured to measure the depth information corresponding to the predetermined number of pixels; and
a fitting subunit, configured to fit the surface of the background object using the predetermined number of pixels and the measured depth information, to obtain the fit equation.
10. The apparatus according to claim 6, wherein the multiple pixels comprise any one of the following:
all pixels located on the surface of the foreground object;
multiple feature pixels located on the surface of the foreground object; or
all pixels located on the edge where the foreground object adjoins the background object.
CN201811440471.9A 2018-11-28 2018-11-28 Image processing method and device Active CN109598753B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811440471.9A CN109598753B (en) 2018-11-28 2018-11-28 Image processing method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811440471.9A CN109598753B (en) 2018-11-28 2018-11-28 Image processing method and device

Publications (2)

Publication Number Publication Date
CN109598753A (en) 2019-04-09
CN109598753B CN109598753B (en) 2021-02-19

Family

ID=65959832

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811440471.9A Active CN109598753B (en) 2018-11-28 2018-11-28 Image processing method and device

Country Status (1)

Country Link
CN (1) CN109598753B (en)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101504771A (en) * 2009-03-20 2009-08-12 北京航空航天大学 Vision tracing method for non-parameterized model
US20120087572A1 (en) * 2010-10-11 2012-04-12 Goksel Dedeoglu Use of Three-Dimensional Top-Down Views for Business Analytics
US20120086780A1 (en) * 2010-10-12 2012-04-12 Vinay Sharma Utilizing Depth Information to Create 3D Tripwires in Video
CN102609934A (en) * 2011-12-22 2012-07-25 中国科学院自动化研究所 Multi-target segmenting and tracking method based on depth image
CN104246822A (en) * 2012-03-22 2014-12-24 高通股份有限公司 Image enhancement
CN103310231A (en) * 2013-06-24 2013-09-18 武汉烽火众智数字技术有限责任公司 Auto logo locating and identifying method
CN104657993A (en) * 2015-02-12 2015-05-27 北京格灵深瞳信息技术有限公司 Lens shielding detection method and device
CN106204617A (en) * 2016-07-21 2016-12-07 大连海事大学 Adapting to image binarization method based on residual image rectangular histogram cyclic shift

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
LIN G S et al.: "2D to 3D Image Conversion Based on Classification of Background Depth Profiles", PSIVT 2011: Advances in Image and Video Technology *
李逸伦 et al.: "2D转3D中基于物体建模的深度图生成" [Depth map generation based on object modelling in 2D-to-3D conversion], 北京: 中国科技论文在线 *
王晏民 et al. (eds.): 《深度图像化点云数据管理》 [Management of depth-imaged point cloud data], 31 December 2013, 北京: 测绘出版社 *
邵俊棋: "利用邊界特性改善立體影像的深度估測" [Improving depth estimation of stereoscopic images using boundary characteristics], degree thesis, 臺北科技大學資訊工程系研究所 *

Also Published As

Publication number Publication date
CN109598753B (en) 2021-02-19


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant