CN109544616A - Depth information determination method and terminal - Google Patents
Depth information determination method and terminal
- Publication number
- CN109544616A CN109544616A CN201811509856.6A CN201811509856A CN109544616A CN 109544616 A CN109544616 A CN 109544616A CN 201811509856 A CN201811509856 A CN 201811509856A CN 109544616 A CN109544616 A CN 109544616A
- Authority
- CN
- China
- Prior art keywords
- depth information
- area object
- subject
- effective
- area
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
Landscapes
- Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Studio Devices (AREA)
Abstract
The present invention provides a depth information determination method and terminal. The method includes: obtaining first depth information of a photographed subject through the TOF camera, and obtaining second depth information of the subject through the TOF camera and the color camera; and determining effective depth information of the subject from the first depth information and the second depth information. With this embodiment, the terminal can obtain depth information with better performance.
Description
Technical field
The present invention relates to the field of information acquisition technology, and in particular to a depth information determination method and terminal.
Background technique
Nowadays, more and more terminals are equipped with camera devices so that users can take pictures anytime and anywhere. In practice, existing camera devices generally use a time-of-flight (Time of Flight, TOF) camera to obtain the depth information of an object. Often the depth information of an object can only be obtained through the TOF camera, and the depth information obtained directly from the TOF camera is taken as the effective depth information. However, in some scenes the accuracy of the depth information obtained through the TOF camera may be relatively low, resulting in poor performance when the terminal obtains depth information.
Summary of the invention
The embodiments of the present invention provide a depth information determination method and terminal, to solve the problem of poor performance when a terminal obtains depth information.
In a first aspect, an embodiment of the present invention provides a depth information determination method, applied to a terminal including a TOF camera and a color camera, where the TOF camera and the color camera are located on the same side of the terminal. The method includes:
obtaining first depth information of a photographed subject through the TOF camera, and obtaining second depth information of the subject through the TOF camera and the color camera; and
determining effective depth information of the subject from the first depth information and the second depth information.
In a second aspect, an embodiment of the present invention further provides a terminal including a TOF camera and a color camera, where the TOF camera and the color camera are located on the same side of the terminal. The terminal further includes:
an obtaining module, configured to obtain first depth information of a photographed subject through the TOF camera, and to obtain second depth information of the subject through the TOF camera and the color camera; and
a determining module, configured to determine effective depth information of the subject from the first depth information and the second depth information.
In a third aspect, an embodiment of the present invention further provides a mobile terminal, including a memory, a processor, and a computer program stored on the memory and executable on the processor, where the processor implements the steps of the above depth information determination method when executing the computer program.
In a fourth aspect, an embodiment of the present invention further provides a computer-readable storage medium storing a computer program, where the computer program, when executed by a processor, implements the steps of the above depth information determination method.
In the embodiments of the present invention, first depth information of a photographed subject is obtained through the TOF camera, and second depth information of the subject is obtained through the TOF camera and the color camera; effective depth information of the subject is then determined from the first depth information and the second depth information. In this way, the terminal can obtain the depth information of the subject through the TOF camera alone, or through the TOF camera and the color camera together, so that the terminal obtains depth information with better performance.
Detailed description of the invention
To describe the technical solutions in the embodiments of the present invention more clearly, the following briefly introduces the accompanying drawings required for describing the embodiments. Apparently, the accompanying drawings in the following description show merely some embodiments of the present invention, and a person of ordinary skill in the art may still derive other drawings from these accompanying drawings without creative efforts.
Fig. 1 is a flowchart of a depth information determination method provided by an embodiment of the present invention;
Fig. 2 is a flowchart of another depth information determination method provided by an embodiment of the present invention;
Fig. 3 is an application scenario diagram of another depth information determination method provided by an embodiment of the present invention;
Fig. 4 is a schematic structural diagram of a terminal provided by an embodiment of the present invention;
Fig. 5 is a schematic structural diagram of another terminal provided by an embodiment of the present invention;
Fig. 6 is a schematic structural diagram of another terminal provided by an embodiment of the present invention;
Fig. 7 is a schematic diagram of the hardware structure of a mobile terminal provided by an embodiment of the present invention.
Specific embodiment
The technical solutions in the embodiments of the present invention will be described below clearly and completely with reference to the accompanying drawings in the embodiments of the present invention. Apparently, the described embodiments are some rather than all of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative efforts shall fall within the protection scope of the present invention.
Referring to Fig. 1, Fig. 1 is a flowchart of a depth information determination method provided by an embodiment of the present invention. The method is applied to a terminal including a TOF camera and a color camera, where the TOF camera and the color camera are located on the same side of the terminal. As shown in Fig. 1, the method includes the following steps:
Step 101: obtain first depth information of a photographed subject through the TOF camera, and obtain second depth information of the subject through the TOF camera and the color camera.
Here, the photographed subject may be all objects included in an image or video obtained by the terminal through the TOF camera, or through the TOF camera and the color camera. For example, if the terminal obtains an image, the photographed subject may be all objects in the image, such as a person and the shooting background. It should be noted that the above image may be an image in the preview interface of the terminal, or an image shot by the terminal through the above cameras. In addition, the depth information of the photographed subject may also be called the depth information of the image.
Furthermore, the terminal can measure the time interval between the infrared laser emitter emitting light and the infrared receiver receiving the reflected light; multiplying this time interval by the speed of light and then dividing by 2 yields the depth information of the photographed subject.
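The round-trip calculation above can be sketched as follows; this is only a minimal illustration, assuming the time interval has already been measured in seconds (the function and constant names are ours, not from the patent):

```python
# Sketch of the ToF depth calculation described above:
# depth = (round-trip time x speed of light) / 2

SPEED_OF_LIGHT_M_S = 299_792_458  # metres per second

def tof_depth_m(round_trip_time_s: float) -> float:
    """Depth in metres from the emit-to-receive time interval."""
    return round_trip_time_s * SPEED_OF_LIGHT_M_S / 2

# A pulse returning after ~6.67 nanoseconds corresponds to roughly 1 m.
print(round(tof_depth_m(6.67e-9), 3))  # prints 1.0
```

The division by 2 accounts for the light travelling to the subject and back, which is why the measured interval covers twice the distance.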
In addition, the number of color cameras included in the terminal is not limited here. For example, there may be one or more color cameras: the terminal may include 1 TOF camera and 1 color camera, or 1 TOF camera and 2 color cameras. It should be noted that the color camera may be an RGB (Red Green Blue, RGB) camera.
When there are multiple color cameras, the terminal can collect images of the photographed subject through the TOF camera and the multiple color cameras, and determine the effective depth information of the subject according to the collected images.
For example, the TOF camera and each color camera can each collect an image of the photographed subject, so that multiple images of the subject are obtained. Since the positions of the TOF camera and each color camera on the terminal are fixed, the positional relationships between the TOF camera, each color camera, and the photographed subject are also known. In this way, from the differences between the multiple images together with these positional relationships, the terminal can determine the effective depth information of the photographed subject.
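One common way to turn such image differences plus a known camera separation into depth is stereo triangulation. The patent does not spell out the computation, so the following is only an illustrative sketch under that assumption; the focal length, baseline, and disparity values are invented for the example:

```python
# Illustrative stereo triangulation for two cameras with a fixed, known
# separation: depth = focal_length * baseline / disparity.
# The patent itself does not specify this exact computation; it is shown
# here only as one standard way of exploiting the fixed camera positions.

def stereo_depth_m(focal_length_px: float, baseline_m: float,
                   disparity_px: float) -> float:
    """Depth of a point seen by two horizontally separated cameras."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px

# Example: 1000 px focal length, 5 cm baseline, 25 px disparity -> 2 m.
print(stereo_depth_m(1000.0, 0.05, 25.0))  # prints 2.0
```

The larger the pixel offset (disparity) of the same point between the two images, the closer the point is to the cameras.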
Step 102: determine effective depth information of the photographed subject from the first depth information and the second depth information.
Here, it may first be judged whether the difference between the first depth information and the second depth information is less than a target difference. When the difference is less than the target difference, the accuracies of the first depth information and the second depth information can be compared, and the effective depth information of the photographed subject determined from the two accordingly. For example, if the difference between the first depth information and the second depth information is 0.1 unit or 0.2 unit, and the target difference is 1 unit, the accuracies of the two can be compared; if the first depth information is accurate to 1 decimal place and the second depth information is accurate to 2 decimal places, the second depth information can be determined as the effective depth information of the photographed subject. The unit here can be determined according to actual needs.
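A minimal sketch of this selection rule, assuming each depth reading carries its value together with the number of decimal places it is accurate to (the data shape and names are our own illustration, not from the patent):

```python
# Pick the effective depth per the rule above: if the two readings agree to
# within a target difference, prefer the reading with more decimal precision.

def choose_effective_depth(first: tuple[float, int],
                           second: tuple[float, int],
                           target_diff: float = 1.0) -> float:
    """Each reading is (depth_value, decimal_places_of_accuracy)."""
    (v1, p1), (v2, p2) = first, second
    if abs(v1 - v2) < target_diff:
        return v2 if p2 > p1 else v1
    # The patent does not specify this branch; falling back to the
    # TOF (first) reading here is our assumption.
    return v1

# First depth accurate to 1 decimal, second to 2 decimals, difference 0.1.
print(choose_effective_depth((2.1, 1), (2.0, 2)))  # prints 2.0
```

This mirrors the worked example in the text: with a difference of 0.1 unit against a target of 1 unit, the more precise second depth information wins.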
Through the above steps, the first depth information of the photographed subject is obtained through the TOF camera, the second depth information is obtained through the TOF camera and the color camera, and the effective depth information of the subject is then selected from the first and second depth information. Compared with the prior art, in which the effective depth information of the subject can only be obtained through the TOF camera, the terminal has more ways of obtaining the effective depth information of the subject, so that the terminal obtains depth information with better performance.
In the embodiments of the present invention, the terminal may be a mobile terminal or a non-mobile terminal. The above mobile terminal may be a mobile phone, a tablet personal computer (Tablet Personal Computer), a laptop computer (Laptop Computer), a personal digital assistant (Personal Digital Assistant, PDA for short), a mobile Internet device (Mobile Internet Device, MID), a wearable device (Wearable Device), or the like.
In the embodiments of the present invention, first depth information of a photographed subject is obtained through the TOF camera, and second depth information of the subject is obtained through the TOF camera and the color camera; effective depth information of the subject is determined from the first depth information and the second depth information. In this way, the terminal can obtain the depth information of the subject through the TOF camera alone, or through the TOF camera and the color camera together, so that the terminal obtains depth information with better performance.
Referring to Fig. 2, Fig. 2 is a flowchart of another depth information determination method provided by an embodiment of the present invention. The main difference between this embodiment and the previous one is that the photographed subject includes a first area object and a second area object, and that the depth information of the first area object in the first depth information can be determined as the effective depth information of the first area object, while the depth information of the second area object in the second depth information is determined as the effective depth information of the second area object. As shown in Fig. 2, the method includes the following steps:
Step 201: obtain first depth information of a photographed subject through the TOF camera, and obtain second depth information of the subject through the TOF camera and the color camera.
It should be noted that the number of cameras included in the terminal is not limited here.
Step 202: determine the depth information of the first area object in the first depth information as the effective depth information of the first area object, and determine the depth information of the second area object in the second depth information as the effective depth information of the second area object; alternatively, determine the first depth information as the effective depth information of the photographed subject.
Here, the depth information of the first area object may be less than the depth information of the second area object in the second depth information. The types of the first area object and the second area object are not limited here; for example, the first area object may be a person, and the second area object may be a tree.
For example, referring to Fig. 3, a terminal 301 is provided with a TOF camera 3011 and a color camera 3012. The terminal 301 can obtain the first depth information of a photographed subject 302 through the TOF camera 3011, and can obtain the second depth information of the subject 302 through the TOF camera 3011 and the color camera 3012, where the subject 302 includes a first area object 3021 and a second area object 3022. In Fig. 3, the arrows between the terminal 301 and the subject 302 indicate the transmission direction of infrared light: the infrared light is emitted from the infrared emitter in the TOF camera 3011 and, after hitting the subject 302, is reflected back to the infrared receiver of the TOF camera 3011 or to the color camera 3012.
In this way, the terminal 301 can determine the depth information of the first area object 3021 in the first depth information as the effective depth information of the first area object 3021, and determine the depth information of the second area object 3022 in the second depth information as the effective depth information of the second area object 3022; alternatively, it can determine the first depth information as the effective depth information of the subject 302. The terminal 301 thus determines the effective depth information of the subject 302 in a more flexible manner.
In addition, the effective depth information of the first area object may be the average depth information of multiple points included in the first area object, the median of the depth information of those points, or the like. It should be noted that the effective depth information of the second area object can be described in the same way as that of the first area object above.
Optionally, the step of determining the depth information of the first area object in the first depth information as the effective depth information of the first area object, and determining the depth information of the second area object in the second depth information as the effective depth information of the second area object, includes:
if the difference between the depth information of the first area object and the depth information of the second area object in the second depth information is greater than or equal to a preset threshold, determining the depth information of the first area object in the first depth information as the effective depth information of the first area object, and determining the depth information of the second area object in the second depth information as the effective depth information of the second area object; alternatively,
if the depth information of the first area object in the second depth information is less than or equal to a first preset value, and the depth information of the second area object in the second depth information is greater than or equal to a second preset value, determining the depth information of the first area object in the first depth information as the effective depth information of the first area object, and determining the depth information of the second area object in the second depth information as the effective depth information of the second area object, where the first preset value is less than the second preset value.
When the difference between the depth information of the first area object and the depth information of the second area object in the second depth information is greater than or equal to the preset threshold, the depth information of the first area object is less than that of the second area object in the second depth information, and in the first depth information obtained through the TOF camera, the depth information of the second area object is much smaller than the depth information of the second area object in the second depth information. If that area object is the background, this phenomenon of lower accuracy in the depth information of the second area object obtained through the TOF camera may be called the "background receding" phenomenon.
In this way, when the difference between the depth information of the first area object and that of the second area object in the second depth information is greater than or equal to the preset threshold, the depth information of the first area object in the first depth information can be determined as the effective depth information of the first area object, and the depth information of the second area object in the second depth information can be determined as the effective depth information of the second area object.
Similarly, when the depth information of the first area object in the second depth information is less than or equal to the first preset value, and the depth information of the second area object in the second depth information is greater than or equal to the second preset value, the above "background receding" phenomenon likewise appears. For example, the first preset value may be 20 centimeters and the second preset value may be 1 meter; of course, the specific values are not limited here. It should be noted that depth information can be understood as the distance between the photographed subject and the terminal.
In this implementation, when the accuracy of the depth information of the photographed subject obtained through the TOF camera is low, i.e. when the "background receding" phenomenon occurs, the depth information of the first area object in the first depth information can be determined as the effective depth information of the first area object, and the depth information of the second area object in the second depth information as the effective depth information of the second area object, thereby improving the accuracy of the depth information of the photographed subject.
Optionally, the step of determining the first depth information as the effective depth information of the photographed subject includes:
if the difference between the depth information of the first area object and the depth information of the second area object in the second depth information is less than the preset threshold, determining the first depth information as the effective depth information of the photographed subject; alternatively,
if the depth information of the first area object in the second depth information is greater than the first preset value, and/or the depth information of the second area object in the second depth information is less than the second preset value, determining the first depth information as the effective depth information of the photographed subject, where the first preset value is less than the second preset value.
Here, when the difference between the depth information of the first area object and that of the second area object in the second depth information is less than the preset threshold, the first depth information obtained through the TOF camera does not exhibit the "background receding" phenomenon. Moreover, the depth information obtained through the TOF camera is also available in the absence of visible light, which makes obtaining the depth information more convenient and makes the way the effective depth information of the photographed subject is obtained more flexible.
If the depth information of the first area object in the second depth information is greater than the first preset value, and/or the depth information of the second area object in the second depth information is less than the second preset value, it can likewise be concluded that the first depth information obtained through the TOF camera does not exhibit the "background receding" phenomenon.
In addition, for the descriptions of the preset threshold, the first preset value, and the second preset value, reference may be made to the previous implementation; details are not repeated here.
In this implementation, when the first depth information of the photographed subject obtained through the TOF camera does not exhibit the "background receding" phenomenon, i.e. when the accuracy of the first depth information is high, the first depth information can be determined as the effective depth information of the subject, so that the precision of the effective depth information of the subject is higher.
Optionally, the filter of the color camera is a dual-pass filter.
Here, when the intensity of visible light is low, for example at night or in a dark room, the terminal can receive, through the color camera, the detection infrared light that passes through the dual-pass filter, and obtain the second depth information of the photographed subject according to the amount of that detection infrared light. The detection infrared light is emitted by the infrared emitter in the TOF camera and, after hitting the photographed subject, returns to the color camera.
Since the infrared filter in a conventional color camera blocks the infrared band almost completely, i.e. its transmittance of infrared light is low, the filter of the color camera in this embodiment is a dual-pass filter. Through a modified coating design, the dual-pass filter passes infrared light in the 830-1020 nanometer band and visible light in the 400-600 nanometer band, thereby completing the acquisition of the second depth information of the photographed subject.
In addition, it should be noted that the transmittance in the 830-1020 nanometer infrared band is low, generally below 2%, with an average transmittance of about 0.5%, while the transmittance in the 400-600 nanometer visible band is high and can reach 90% or more. Therefore, when a photo is shot through the TOF camera and the color camera, the infrared light passing through the filter of the color camera has little effect on the image quality of the photo and can essentially be ignored.
In this implementation, the filter of the color camera is a dual-pass filter, so that even when the visible light intensity is low, the TOF camera and the color camera can still obtain the second depth information of the photographed subject well, which makes obtaining the second depth information more convenient.
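As a rough model of the pass bands and transmittances described above (the step-function shape is our simplification; a real filter curve is continuous):

```python
# Toy model of the dual-pass filter: ~90% transmittance for 400-600 nm
# visible light, ~0.5% average for 830-1020 nm infrared, ~0 elsewhere.
# The band edges and percentages are the figures given in the text.

def transmittance(wavelength_nm: float) -> float:
    if 400 <= wavelength_nm <= 600:    # visible pass band
        return 0.90
    if 830 <= wavelength_nm <= 1020:   # infrared pass band (low but nonzero)
        return 0.005
    return 0.0

# Green light passes almost fully; the ToF infrared passes just enough for
# depth sensing while barely affecting the photo; 700 nm is blocked.
print(transmittance(550), transmittance(940), transmittance(700))
```

The large asymmetry between the two bands is exactly why the leaked infrared has negligible impact on photo quality.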
In the embodiments of the present invention, through steps 201 and 202, the way the effective depth information of the photographed subject is determined becomes more flexible, so that the terminal obtains depth information with better performance.
Referring to Fig. 4, Fig. 4 is a structural diagram of a terminal provided by an embodiment of the present invention, which can implement the details of the depth information determination method in the above embodiments and achieve the same effect. As shown in Fig. 4, a terminal 400 includes a TOF camera and a color camera located on the same side of the terminal, and the terminal 400 further includes:
an obtaining module 401, configured to obtain first depth information of a photographed subject through the TOF camera, and to obtain second depth information of the subject through the TOF camera and the color camera; and
a determining module 402, configured to determine effective depth information of the subject from the first depth information and the second depth information.
Optionally, referring to Fig. 5, the photographed subject includes a first area object and a second area object, and the determining module 402 includes:
a first determining submodule 4021, configured to determine the depth information of the first area object in the first depth information as the effective depth information of the first area object, and to determine the depth information of the second area object in the second depth information as the effective depth information of the second area object, where the first preset value is less than the second preset value.
Optionally, referring to Fig. 6, the determining module 402 includes: a second determining submodule 4022, configured to determine the first depth information as the effective depth information of the photographed subject.
Optionally, the first determining submodule 4021 is further configured to: if the difference between the depth information of the first area object and the depth information of the second area object in the second depth information is greater than or equal to a preset threshold, determine the depth information of the first area object in the first depth information as the effective depth information of the first area object, and determine the depth information of the second area object in the second depth information as the effective depth information of the second area object; alternatively,
the first determining submodule 4021 is further configured to: if the depth information of the first area object in the second depth information is less than or equal to a first preset value, and the depth information of the second area object in the second depth information is greater than or equal to a second preset value, determine the depth information of the first area object in the first depth information as the effective depth information of the first area object, and determine the depth information of the second area object in the second depth information as the effective depth information of the second area object, where the first preset value is less than the second preset value.
Optionally, the second determining submodule 4022 is further configured to: if the difference between the depth information of the first area object and the depth information of the second area object in the second depth information is less than the preset threshold, determine the first depth information as the effective depth information of the photographed subject; alternatively,
the second determining submodule 4022 is further configured to: if the depth information of the first area object in the second depth information is greater than the first preset value, and/or the depth information of the second area object in the second depth information is less than the second preset value, determine the first depth information as the effective depth information of the photographed subject.
Optionally, the filter of the color camera is a dual-pass filter.
The terminal provided by the embodiments of the present invention can implement each process implemented by the terminal in the method embodiments of Fig. 1 and Fig. 2; to avoid repetition, details are not described here again. In this embodiment, the terminal can likewise obtain depth information with better performance.
Fig. 7 is a schematic diagram of the hardware structure of a mobile terminal implementing each embodiment of the present invention.
The mobile terminal 700 includes but is not limited to: a radio frequency unit 701, a network module 702, an audio output unit 703, an input unit 704, a sensor 705, a display unit 706, a user input unit 707, an interface unit 708, a memory 709, a processor 710, a power supply 711, and other components. Those skilled in the art will understand that the mobile terminal structure shown in Fig. 7 does not constitute a limitation on the mobile terminal; the mobile terminal may include more or fewer components than illustrated, combine certain components, or have a different component arrangement. In the embodiments of the present invention, the mobile terminal includes but is not limited to a mobile phone, a tablet computer, a laptop, a palmtop computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
The processor 710 is configured to:
obtain first depth information of a photographed subject through the TOF camera, and obtain second depth information of the subject through the TOF camera and the color camera; and
determine effective depth information of the subject from the first depth information and the second depth information.
Optionally, the photographed subject includes a first area object and a second area object, and the step, executed by the processor 710, of determining the effective depth information of the subject from the first depth information and the second depth information includes:
determining the depth information of the first area object in the first depth information as the effective depth information of the first area object, and determining the depth information of the second area object in the second depth information as the effective depth information of the second area object; alternatively,
determining the first depth information as the effective depth information of the photographed subject, where the first preset value is less than the second preset value.
Optionally, the step, executed by the processor 710, of determining the depth information of the first area object in the first depth information as the effective depth information of the first area object, and determining the depth information of the second area object in the second depth information as the effective depth information of the second area object, includes:
if the difference between the depth information of the first area object and the depth information of the second area object in the second depth information is greater than or equal to a preset threshold, determining the depth information of the first area object in the first depth information as the effective depth information of the first area object, and determining the depth information of the second area object in the second depth information as the effective depth information of the second area object; alternatively,
if the depth information of the first area object in the second depth information is less than or equal to a first preset value, and the depth information of the second area object in the second depth information is greater than or equal to a second preset value, determining the depth information of the first area object in the first depth information as the effective depth information of the first area object, and determining the depth information of the second area object in the second depth information as the effective depth information of the second area object, where the first preset value is less than the second preset value.
Optionally, the step, performed by the processor 710, of determining that the first depth information is the effective depth information of the subject includes:
if the difference between the depth information of the first area object and the depth information of the second area object in the second depth information is less than the preset threshold, determining that the first depth information is the effective depth information of the subject; alternatively,
if the depth information of the first area object in the second depth information is greater than the first preset value, and/or the depth information of the second area object in the second depth information is less than the second preset value, determining that the first depth information is the effective depth information of the subject.
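The selection rules in the optional steps above can be sketched as a single decision function. This is only an illustrative sketch, not the patented implementation: the function name, the scalar per-region depth values, and the concrete threshold and preset values are all assumptions introduced for clarity (in practice the depth information would be per-pixel maps).

```python
def select_effective_depth(d1_region1, d1_region2, d2_region1, d2_region2,
                           threshold=0.5, preset_low=0.3, preset_high=2.0):
    """Choose effective depth per region.

    d1_* come from the first depth information (TOF camera alone);
    d2_* come from the second depth information (TOF + color camera).
    preset_low corresponds to the first preset value and must be less
    than preset_high, the second preset value.
    """
    assert preset_low < preset_high
    # Branch 1: the two regions differ strongly in the TOF+color map, or
    # region 1 is very near while region 2 is very far. Mix the sources:
    # TOF-only depth for region 1, TOF+color depth for region 2.
    if (abs(d2_region1 - d2_region2) >= threshold or
            (d2_region1 <= preset_low and d2_region2 >= preset_high)):
        return d1_region1, d2_region2
    # Branch 2 (fallback): the TOF-only depth is taken as the effective
    # depth information for the whole subject.
    return d1_region1, d1_region2
```

With these assumed parameters, regions at roughly 1 m and 3 m trigger the mixed-source branch, while two regions both near 1 m fall through to the TOF-only branch.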
Optionally, the filter of the color camera is a dual-pass filter.
The mobile terminal in this embodiment of the present invention can obtain the depth information of the subject through the TOF camera alone, or obtain the depth information of the subject jointly through the TOF camera and the color camera, so that the obtained depth information is of better quality.
It should be understood that, in this embodiment of the present invention, the radio frequency unit 701 may be used to receive and send signals during information transmission and reception or during a call. Specifically, after receiving downlink data from a base station, the radio frequency unit 701 delivers it to the processor 710 for processing, and it also sends uplink data to the base station. In general, the radio frequency unit 701 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 701 may communicate with a network and other devices through a wireless communication system.
The mobile terminal provides wireless broadband Internet access for the user through the network module 702, for example, helping the user send and receive e-mail, browse web pages, and access streaming media.
The audio output unit 703 may convert audio data received by the radio frequency unit 701 or the network module 702, or stored in the memory 709, into an audio signal and output it as sound. Moreover, the audio output unit 703 may also provide audio output related to a specific function performed by the mobile terminal 700 (for example, a call signal reception sound or a message reception sound). The audio output unit 703 includes a loudspeaker, a buzzer, a receiver, and the like.
The input unit 704 is used to receive audio or video signals. The input unit 704 may include a graphics processing unit (GPU) 7041 and a microphone 7042. The graphics processor 7041 processes image data of still pictures or video obtained by an image capture apparatus (such as a camera) in a video capture mode or an image capture mode. The processed image frames may be displayed on the display unit 706. The image frames processed by the graphics processor 7041 may be stored in the memory 709 (or another storage medium) or sent via the radio frequency unit 701 or the network module 702. The microphone 7042 can receive sound and process it into audio data. In a telephone call mode, the processed audio data may be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 701 and then output.
The mobile terminal 700 further includes at least one sensor 705, such as an optical sensor, a motion sensor, and other sensors. Specifically, the optical sensor includes an ambient light sensor and a proximity sensor, where the ambient light sensor can adjust the brightness of the display panel 7061 according to the brightness of ambient light, and the proximity sensor can turn off the display panel 7061 and/or the backlight when the mobile terminal 700 is moved to the ear. As a kind of motion sensor, an accelerometer can detect the magnitude of acceleration in all directions (generally three axes), can detect the magnitude and direction of gravity when stationary, and can be used to identify the posture of the mobile terminal (such as landscape/portrait switching, related games, and magnetometer attitude calibration) and for vibration-recognition-related functions (such as a pedometer or tap detection). The sensor 705 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and the like, which are not described in detail herein.
The display unit 706 is used to display information input by the user or information provided to the user. The display unit 706 may include a display panel 7061, which may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, or the like.
The user input unit 707 may be used to receive input numeric or character information and to generate key signal input related to user settings and function control of the mobile terminal. Specifically, the user input unit 707 includes a touch panel 7071 and other input devices 7072. The touch panel 7071, also referred to as a touch screen, collects touch operations performed by the user on or near it (for example, operations performed by the user on or near the touch panel 7071 using a finger, a stylus, or any other suitable object or accessory). The touch panel 7071 may include a touch detection apparatus and a touch controller. The touch detection apparatus detects the touch position of the user, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives touch information from the touch detection apparatus, converts it into contact coordinates, sends them to the processor 710, and receives and executes commands sent by the processor 710. In addition, the touch panel 7071 may be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave. Besides the touch panel 7071, the user input unit 707 may also include other input devices 7072. Specifically, the other input devices 7072 may include, but are not limited to, a physical keyboard, function keys (such as a volume control key and a switch key), a trackball, a mouse, and a joystick, which are not described in detail herein.
Further, the touch panel 7071 may cover the display panel 7061. After detecting a touch operation on or near it, the touch panel 7071 transmits the operation to the processor 710 to determine the type of the touch event, and the processor 710 then provides corresponding visual output on the display panel 7061 according to the type of the touch event. Although in Fig. 7 the touch panel 7071 and the display panel 7061 are implemented as two independent components to realize the input and output functions of the mobile terminal, in some embodiments the touch panel 7071 and the display panel 7061 may be integrated to realize the input and output functions of the mobile terminal, which is not specifically limited herein.
The interface unit 708 is an interface for connecting an external device to the mobile terminal 700. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 708 may be used to receive input (for example, data information or power) from an external device and transmit the received input to one or more elements in the mobile terminal 700, or may be used to transmit data between the mobile terminal 700 and an external device.
The memory 709 may be used to store software programs and various data. The memory 709 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, an application program required by at least one function (such as a sound playback function or an image playback function), and the like; the data storage area may store data created according to the use of the mobile phone (such as audio data and a phone book). In addition, the memory 709 may include a high-speed random access memory and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another volatile solid-state storage device.
The processor 710 is the control center of the mobile terminal. It connects all parts of the entire mobile terminal through various interfaces and lines, and performs the various functions of the mobile terminal and processes data by running or executing software programs and/or modules stored in the memory 709 and invoking data stored in the memory 709, thereby monitoring the mobile terminal as a whole. The processor 710 may include one or more processing units; preferably, the processor 710 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, the user interface, application programs, and the like, and the modem processor mainly handles wireless communication. It can be understood that the modem processor may alternatively not be integrated into the processor 710.
The mobile terminal 700 may also include a power supply 711 (such as a battery) for supplying power to the various components. Preferably, the power supply 711 may be logically connected to the processor 710 through a power management system, so as to realize functions such as charge management, discharge management, and power consumption management through the power management system.
In addition, the mobile terminal 700 includes some functional modules that are not shown, which are not described in detail herein.
Preferably, an embodiment of the present invention also provides a mobile terminal, including a processor 710, a memory 709, and a computer program stored in the memory 709 and executable on the processor 710. When executed by the processor 710, the computer program implements each process of the above depth information determination method embodiment and can achieve the same technical effect; to avoid repetition, details are not described herein again.
An embodiment of the present invention also provides a computer-readable storage medium on which a computer program is stored. When executed by a processor, the computer program implements each process of the above depth information determination method embodiment and can achieve the same technical effect; to avoid repetition, details are not described herein again. The computer-readable storage medium is, for example, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
It should be noted that, in this document, the terms "include", "comprise", or any other variant thereof are intended to cover a non-exclusive inclusion, so that a process, method, article, or apparatus that includes a series of elements not only includes those elements but also includes other elements not explicitly listed, or further includes elements inherent to such a process, method, article, or apparatus. In the absence of more restrictions, an element defined by the phrase "including a ..." does not exclude the existence of other identical elements in the process, method, article, or apparatus that includes the element.
Through the above description of the embodiments, those skilled in the art can clearly understand that the methods of the above embodiments can be implemented by means of software plus a necessary general-purpose hardware platform, and of course also by hardware, but in many cases the former is the better implementation. Based on this understanding, the technical solution of the present invention, in essence, or the part contributing to the prior art, can be embodied in the form of a software product, which is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disc) and includes several instructions to enable a terminal (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to execute the methods described in the embodiments of the present invention.
The embodiments of the present invention have been described above with reference to the accompanying drawings, but the present invention is not limited to the above specific embodiments. The above specific embodiments are merely illustrative rather than restrictive. Under the inspiration of the present invention, those skilled in the art can also make many forms without departing from the purpose of the present invention and the scope protected by the claims, all of which fall within the protection of the present invention.
Claims (12)
1. A depth information determination method, applied to a terminal including a time-of-flight (TOF) camera and a color camera, the TOF camera and the color camera being located on the same side of the terminal, the method comprising:
obtaining first depth information of a subject through the TOF camera, and obtaining second depth information of the subject through the TOF camera and the color camera;
determining effective depth information of the subject from the first depth information and the second depth information.
2. The method according to claim 1, wherein the subject includes a first area object and a second area object, and the step of determining the effective depth information of the subject from the first depth information and the second depth information comprises:
determining that the depth information of the first area object in the first depth information is the effective depth information of the first area object, and determining that the depth information of the second area object in the second depth information is the effective depth information of the second area object; alternatively,
determining that the first depth information is the effective depth information of the subject.
3. The method according to claim 2, wherein the step of determining that the depth information of the first area object in the first depth information is the effective depth information of the first area object and determining that the depth information of the second area object in the second depth information is the effective depth information of the second area object comprises:
if the difference between the depth information of the first area object and the depth information of the second area object in the second depth information is greater than or equal to a preset threshold, determining that the depth information of the first area object in the first depth information is the effective depth information of the first area object, and determining that the depth information of the second area object in the second depth information is the effective depth information of the second area object; alternatively,
if the depth information of the first area object in the second depth information is less than or equal to a first preset value, and the depth information of the second area object in the second depth information is greater than or equal to a second preset value, determining that the depth information of the first area object in the first depth information is the effective depth information of the first area object, and determining that the depth information of the second area object in the second depth information is the effective depth information of the second area object, wherein the first preset value is less than the second preset value.
4. The method according to claim 2, wherein the step of determining that the first depth information is the effective depth information of the subject comprises:
if the difference between the depth information of the first area object and the depth information of the second area object in the second depth information is less than a preset threshold, determining that the first depth information is the effective depth information of the subject; alternatively,
if the depth information of the first area object in the second depth information is greater than a first preset value, and/or the depth information of the second area object in the second depth information is less than a second preset value, determining that the first depth information is the effective depth information of the subject, wherein the first preset value is less than the second preset value.
5. The method according to any one of claims 1-4, wherein the filter of the color camera is a dual-pass filter.
6. A terminal, including a TOF camera and a color camera, the TOF camera and the color camera being located on the same side of the terminal, the terminal further comprising:
an obtaining module, configured to obtain first depth information of a subject through the TOF camera, and to obtain second depth information of the subject through the TOF camera and the color camera;
a determining module, configured to determine effective depth information of the subject from the first depth information and the second depth information.
7. The terminal according to claim 6, wherein the subject includes a first area object and a second area object, and the determining module comprises:
a first determining submodule, configured to determine that the depth information of the first area object in the first depth information is the effective depth information of the first area object, and to determine that the depth information of the second area object in the second depth information is the effective depth information of the second area object; alternatively,
a second determining submodule, configured to determine that the first depth information is the effective depth information of the subject.
8. The terminal according to claim 7, wherein the first determining submodule is further configured to: if the difference between the depth information of the first area object and the depth information of the second area object in the second depth information is greater than or equal to a preset threshold, determine that the depth information of the first area object in the first depth information is the effective depth information of the first area object, and determine that the depth information of the second area object in the second depth information is the effective depth information of the second area object; alternatively,
the first determining submodule is further configured to: if the depth information of the first area object in the second depth information is less than or equal to a first preset value, and the depth information of the second area object in the second depth information is greater than or equal to a second preset value, determine that the depth information of the first area object in the first depth information is the effective depth information of the first area object, and determine that the depth information of the second area object in the second depth information is the effective depth information of the second area object, wherein the first preset value is less than the second preset value.
9. The terminal according to claim 7, wherein the second determining submodule is further configured to: if the difference between the depth information of the first area object and the depth information of the second area object in the second depth information is less than a preset threshold, determine that the first depth information is the effective depth information of the subject; alternatively,
the second determining submodule is further configured to: if the depth information of the first area object in the second depth information is greater than a first preset value, and/or the depth information of the second area object in the second depth information is less than a second preset value, determine that the first depth information is the effective depth information of the subject, wherein the first preset value is less than the second preset value.
10. The terminal according to any one of claims 6-9, wherein the filter of the color camera is a dual-pass filter.
11. A mobile terminal, comprising: a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein when the processor executes the computer program, the steps in the depth information determination method according to any one of claims 1-5 are implemented.
12. A computer-readable storage medium, wherein a computer program is stored on the computer-readable storage medium, and when executed by a processor, the computer program implements the steps in the depth information determination method according to any one of claims 1-5.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811509856.6A CN109544616B (en) | 2018-12-11 | 2018-12-11 | Depth information determination method and terminal |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109544616A true CN109544616A (en) | 2019-03-29 |
CN109544616B CN109544616B (en) | 2021-02-26 |
Family
ID=65854007
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811509856.6A Active CN109544616B (en) | 2018-12-11 | 2018-12-11 | Depth information determination method and terminal |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109544616B (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2021087812A1 (en) * | 2019-11-06 | 2021-05-14 | Oppo广东移动通信有限公司 | Method for determining depth value of image, image processor and module |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120162390A1 (en) * | 2010-12-24 | 2012-06-28 | Wen-Chin Chang | Method of Taking Pictures for Generating Three-Dimensional Image Data |
CN106612387A (en) * | 2015-10-15 | 2017-05-03 | 杭州海康威视数字技术股份有限公司 | Combined depth map acquisition method and depth camera |
CN107403447A (en) * | 2017-07-14 | 2017-11-28 | 梅卡曼德(北京)机器人科技有限公司 | Depth image acquisition method |
CN107851322A (en) * | 2015-07-13 | 2018-03-27 | 皇家飞利浦有限公司 | Method and apparatus for determining depth map for image |
US20180205926A1 (en) * | 2017-01-17 | 2018-07-19 | Seiko Epson Corporation | Cleaning of Depth Data by Elimination of Artifacts Caused by Shadows and Parallax |
CN108564613A (en) * | 2018-04-12 | 2018-09-21 | 维沃移动通信有限公司 | A kind of depth data acquisition methods and mobile terminal |
CN108777784A (en) * | 2018-06-06 | 2018-11-09 | Oppo广东移动通信有限公司 | Depth acquisition methods and device, electronic device, computer equipment and storage medium |
Also Published As
Publication number | Publication date |
---|---|
CN109544616B (en) | 2021-02-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110266930A (en) | Light compensation method, mobile terminal and the light adjusting method of image taking | |
CN107846537B (en) | A kind of CCD camera assembly, image acquiring method and mobile terminal | |
CN108989678A (en) | A kind of image processing method, mobile terminal | |
CN107864336B (en) | A kind of image processing method, mobile terminal | |
CN110298304A (en) | A kind of skin detecting method and terminal | |
CN110113528A (en) | A kind of parameter acquiring method and terminal device | |
CN108174103A (en) | A kind of shooting reminding method and mobile terminal | |
CN109788174A (en) | A kind of light compensation method and terminal | |
CN109144361A (en) | A kind of image processing method and terminal device | |
CN109068116A (en) | Image processing method, device, mobile terminal and storage medium based on light filling | |
CN108881719A (en) | A kind of method and terminal device switching style of shooting | |
CN110290331A (en) | A kind of screen control method and terminal | |
CN108174081B (en) | A kind of image pickup method and mobile terminal | |
CN110213412A (en) | A kind of display methods and terminal | |
CN110138967A (en) | A kind of method of controlling operation thereof and terminal of terminal | |
CN109445653A (en) | A kind of icon processing method and mobile terminal | |
CN108718388A (en) | A kind of photographic method and mobile terminal | |
CN109192153A (en) | A kind of terminal and terminal control method | |
CN108650466A (en) | The method and electronic equipment of photo tolerance are promoted when a kind of strong light or reversible-light shooting portrait | |
CN110276329A (en) | A kind of skin detecting method and terminal | |
CN109639981A (en) | A kind of image capturing method and mobile terminal | |
CN109358913A (en) | A kind of the starting method and terminal device of application program | |
CN109274957A (en) | A kind of depth image image pickup method and mobile terminal | |
CN109547700A (en) | Photographic method and terminal | |
CN109729336A (en) | A kind of display methods and device of video image |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
GR01 | Patent grant | |