CN108259727A - Depth-of-field image generation method and mobile terminal - Google Patents

Depth-of-field image generation method and mobile terminal

Info

Publication number
CN108259727A
Authority
CN
China
Prior art keywords
camera lens
mobile terminal
image
target plane
motor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201810234190.1A
Other languages
Chinese (zh)
Inventor
王鹏
黎冠英
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN201810234190.1A priority Critical patent/CN108259727A/en
Publication of CN108259727A publication Critical patent/CN108259727A/en
Priority to PCT/CN2019/078635 priority patent/WO2019179413A1/en
Pending legal-status Critical Current

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/50 Constructional details
    • H04N 23/54 Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
    • H04N 23/55 Optical parts specially adapted for electronic image sensors; Mounting thereof
    • H04N 23/57 Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/67 Focus control based on electronic image sensor signals
    • H04N 23/676 Bracketing for image capture at varying focusing conditions
    • H04N 23/80 Camera processing pipelines; Components thereof

Abstract

The present invention provides a depth-of-field image generation method and a mobile terminal. The mobile terminal includes a lens, an image chip and a motor. The method includes: controlling the motor to drive the lens to move within a target plane, the target plane being parallel to the photosensitive surface of the image chip; when the lens is located at each of at least two target positions in the target plane, obtaining the target image acquired by the image chip at that target position; and obtaining a depth-of-field image from the target images. Because the motor drives the lens and images are captured at at least two different positions, a depth-of-field image can be obtained. In this way, there is no need to fit the mobile terminal with dual cameras or with a dual-pixel autofocus sensor in a single camera, which reduces the cost of the mobile terminal.

Description

Depth-of-field image generation method and mobile terminal
Technical field
The present invention relates to the field of communications technology, and in particular to a depth-of-field image generation method and a mobile terminal.
Background art
With the adoption of cameras in mobile phones, users' expectations of consumer electronics have kept rising now that the basic functions of smart products are met, and the demand around photography in particular has grown: pictures are expected not only to be sharp and rich in detail, but also to offer a visual experience comparable to the background-blurring effect of an SLR camera. A professional camera achieves this by adjusting the aperture size; when shooting macro or portrait scenes, it can make the subject stand out clearly from the background, producing a pleasing shallow depth-of-field effect. A mobile terminal, however, is a portable device and clearly lacks the space for an aperture mechanism comparable to that of a professional camera lens, and a fixed aperture cannot produce a specific depth-of-field effect.
In existing mobile terminals, in order to obtain a depth-of-field image, the mobile terminal is usually provided with dual cameras, or a dual-pixel autofocus sensor is fitted in a single camera. This makes the mobile terminal more expensive.
Summary of the invention
Embodiments of the present invention provide a depth-of-field image generation method and a mobile terminal, to solve the problem that mobile terminals are relatively costly.
To solve the above technical problem, the present invention is implemented as follows:
In a first aspect, an embodiment of the present invention provides a depth-of-field image generation method, applied to a mobile terminal, the mobile terminal including a lens, an image chip and a motor, the method including:
controlling the motor to drive the lens to move within a target plane, the target plane being parallel to the photosensitive surface of the image chip;
when the lens is located at each of at least two target positions in the target plane, obtaining the target image acquired by the image chip at that target position;
obtaining a depth-of-field image from the target images.
In a second aspect, an embodiment of the present invention further provides a mobile terminal. The mobile terminal includes a lens, an image chip and a motor, and further includes:
a control module, configured to control the motor to drive the lens to move within a target plane, the target plane being parallel to the photosensitive surface of the image chip;
an acquisition module, configured to obtain, when the lens is located at each of at least two target positions in the target plane, the target image acquired by the image chip at that target position;
a processing module, configured to obtain a depth-of-field image from the target images.
In the embodiments of the present invention, the motor is controlled to drive the lens to move within a target plane that is parallel to the photosensitive surface of the image chip; when the lens is located at each of at least two target positions in the target plane, the target image acquired by the image chip at that position is obtained; and a depth-of-field image is obtained from the target images. Because the motor drives the lens and images are captured at at least two different positions, a depth-of-field image can be obtained. In this way, there is no need to fit the mobile terminal with dual cameras or with a dual-pixel autofocus sensor in a single camera, which reduces the cost of the mobile terminal.
Description of the drawings
To illustrate the technical solutions in the embodiments of the present invention more clearly, the accompanying drawings required for describing the embodiments are briefly introduced below. Apparently, the accompanying drawings in the following description show only some embodiments of the present invention, and a person of ordinary skill in the art may derive other drawings from these drawings without creative effort.
Fig. 1 is a flowchart of a depth-of-field image generation method according to an embodiment of the present invention;
Fig. 2 is an exploded view of an OIS module to which the depth-of-field image generation method according to an embodiment of the present invention is applied;
Fig. 3 is a structural diagram of the motor in the OIS module to which the depth-of-field image generation method according to an embodiment of the present invention is applied;
Fig. 4 is a first diagram showing the position of the lens in the OIS module to which the depth-of-field image generation method according to an embodiment of the present invention is applied;
Fig. 5 is a second diagram showing the position of the lens in the OIS module;
Fig. 6 is a third diagram showing the position of the lens in the OIS module;
Fig. 7 is a fourth diagram showing the position of the lens in the OIS module;
Fig. 8 is a fifth diagram showing the position of the lens in the OIS module;
Fig. 9 is a first structural diagram of a mobile terminal according to an embodiment of the present invention;
Fig. 10 is a second structural diagram of a mobile terminal according to an embodiment of the present invention.
Detailed description of the embodiments
The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings in the embodiments of the present invention. Apparently, the described embodiments are only some rather than all of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.
Referring to Fig. 1, Fig. 1 is a flowchart of a depth-of-field image generation method according to an embodiment of the present invention. The method is applied to a mobile terminal and is used to control the generation of a depth-of-field image. Specifically, as shown in Fig. 2, the mobile terminal includes an OIS module, and the OIS module includes a lens 201, a motor 202, an optical filter 203, a bracket 204, an image chip 205, a circuit board 206, a connector 207, resistors and capacitors 208, and a driver IC (Integrated Circuit) 209. As shown in Fig. 1, the method includes the following steps:
Step 101: control the motor to drive the lens to move within a target plane, where the target plane is parallel to the photosensitive surface of the image chip.
In this step, a depth-of-field image may be generated in application scenarios such as taking a photo or shooting a video. For example, when taking a photo, if the mobile terminal receives a photographing instruction, it may first generate a depth-of-field image and then apply post-processing blurring to the photo according to the generated depth-of-field image. When shooting a video, if the mobile terminal receives a video-shooting instruction, it may generate a depth-of-field image and then perform post-processing 3D effects or image blurring according to the generated depth-of-field image. The specific process of performing the post-processing 3D effects or image blurring according to the depth-of-field image is not further limited here.
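As a concrete illustration of the post-processing blurring mentioned above, the following sketch blends a sharp frame with a blurred copy, weighting by the depth-of-field image. It is an editorial example only, not part of the patent: the file names are placeholders, and the convention that larger depth-map values mean a nearer subject is an assumption.

```python
import cv2
import numpy as np

# Placeholder inputs: the captured photo and the depth-of-field image produced in step 103.
photo = cv2.imread("photo.png")                                      # BGR, uint8
depth = cv2.imread("depth_of_field_map.png", cv2.IMREAD_GRAYSCALE)   # assumed: 255 = near, 0 = far

# Blur the whole frame once, then keep sharp pixels where the subject is near
# and blurred pixels where the background is far.
blurred = cv2.GaussianBlur(photo, (31, 31), 0)
nearness = (depth.astype(np.float32) / 255.0)[..., None]             # 1.0 = near subject, 0.0 = far background
bokeh = (photo * nearness + blurred * (1.0 - nearness)).astype(np.uint8)

cv2.imwrite("photo_with_background_blur.png", bokeh)
```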
In this embodiment, the processor of the mobile terminal may output a control signal to control the motor to drive the lens to move within the target plane. Specifically, upon receiving a photographing instruction or a video-shooting instruction, the processor may output a control signal to the driver IC 209, and the driver IC 209 controls the motor 202 to operate, so as to drive the lens 201 to move within the target plane. As shown in Fig. 3, the motor 202 can drive the lens 201 to move along the X axis and the Y axis within the target plane (the X/Y plane) formed by these two axes.
Step 102: when the lens is located at each target position in the target plane, obtain the target image acquired by the image chip at that target position, where there are at least two target positions.
It should be understood that the specific locations and the number of the target positions can be set as needed. For example, in this embodiment, the target positions include a first position and a second position obtained after the lens 201 moves in a first axial direction; alternatively, the target positions include a first position and a second position obtained after the lens 201 moves in the first axial direction, and a third position and a fourth position obtained after the lens 201 moves in a second axial direction. The first axial direction may be the direction of the X axis and the second axial direction may be the direction of the Y axis; they may, of course, be other axial directions, for example directions obtained by rotating the X axis clockwise or counterclockwise by a certain angle, which is not further limited here. In this embodiment, because the lens 201 is controlled to move axially within a plane and only two or four target positions are obtained, the movement is simple to implement and the difficulty of generating the depth-of-field image is reduced.
It should be noted that the specific locations of the first, second, third and fourth positions can be set as needed. For example, in this embodiment, the first position and the second position are symmetric about the center of the region within which the lens can move in the first axial direction, and the third position and the fourth position are symmetric about the center of the region within which the lens can move in the second axial direction. Further, to obtain a better depth-of-field image and avoid depth information that is shallow and inaccurate, the spacing between the first position and the second position may be maximized, and likewise the spacing between the third position and the fourth position may be maximized. That is, in this embodiment, the first position and the second position are the extreme positions to which the lens can move in the first axial direction, and the third position and the fourth position are the extreme positions to which the lens can move in the second axial direction. The specific process by which the motor 202 drives the lens 201 to move within the target plane is not further limited here and is described in detail below.
As shown in Fig. 4, the region within which the lens 201 can move is region 401. When the lens 201 is not being driven, it sits at the center of region 401, which is the origin of the X and Y axes. Take the first axial direction as the X axis and the second axial direction as the Y axis as an example. Specifically, if there are two target positions, the motor 202 may drive the lens 201 in the negative X direction until it reaches the boundary of region 401; the lens 201 is then at the first position (as shown in Fig. 5). The motor 202 may then drive the lens 201 in the positive X direction until it reaches the boundary of region 401; the lens 201 is then at the second position (as shown in Fig. 6). If there are four target positions, after moving to the first position and the second position, the motor 202 may drive the lens 201 back to the origin, then drive it in the negative Y direction to the boundary of region 401, where the lens 201 is at the third position (as shown in Fig. 7), and finally drive it in the positive Y direction to the boundary of region 401, where the lens 201 is at the fourth position (as shown in Fig. 8).
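The movement sequence just described can be modeled with a short, self-contained sketch. This is purely illustrative and not part of the patent: the LensActuator class, the coordinate convention and the millimetre reach are assumptions, not a real driver-IC interface.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class LensActuator:
    """Toy model of the motor moving the lens within the target (X/Y) plane."""
    half_range_mm: float = 0.25  # assumed reach from the origin along each axis

    def target_positions(self, four_positions: bool = False) -> List[Tuple[float, float]]:
        r = self.half_range_mm
        # First and second positions: the extreme positions along the X axis (Figs. 5 and 6).
        positions = [(-r, 0.0), (+r, 0.0)]
        if four_positions:
            # Third and fourth positions: the extreme positions along the Y axis (Figs. 7 and 8).
            positions += [(0.0, -r), (0.0, +r)]
        return positions

if __name__ == "__main__":
    for i, (x, y) in enumerate(LensActuator().target_positions(four_positions=True), start=1):
        print(f"move lens to position {i}: x={x:+.2f} mm, y={y:+.2f} mm, then capture a frame")
```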
Because in this embodiment the first position and the second position are set to the extreme positions in the first axial direction, the distance between their centers is relatively large and the blurring produced from the generated depth-of-field image is relatively good. In addition, by taking photos at the first, second, third and fourth positions, the subject can be located in space, enabling a 3D effect. Specifically, with an existing OIS motor, the displacement between the first position and the second position can reach 0.5 mm; the center-to-center spacing between the shots is much larger than that of a dual-pixel arrangement, so it can be used for depth-of-field applications with better results than dual pixel.
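For context, the classical pinhole-stereo relation links the disparity d observed between two captures separated by a baseline B to the subject distance Z (this formula is not stated in the patent, and the lens-shift geometry here only approximates the full-camera-translation case):

```latex
d = \frac{f\,B}{Z} \qquad \Longleftrightarrow \qquad Z = \frac{f\,B}{d}
```

where f is the focal length. A larger baseline B yields larger, more easily measured disparities, which is the advantage the description above attributes to the roughly 0.5 mm spacing between the two extreme lens positions compared with a dual-pixel arrangement.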
Step 103: obtain a depth-of-field image from the target images.
In this embodiment, digital signal processing (DSP) may be performed on the target images to obtain the depth-of-field image. Specifically, a disparity map (Disparity Map) is computed from the differences between the two target images; the disparity map represents the displacement of identical points between the two images. Because, in triangulation, this displacement is related to the distance between the lens and the subject, the disparity map is in many cases used directly as the depth map, and it is usually stored in the form of a grayscale image.
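The patent only states that the disparity map is obtained through digital signal processing and does not prescribe an algorithm. As one common choice, the block-matching sketch below uses OpenCV; the file names stand in for the two frames captured at the first and second lens positions.

```python
import cv2
import numpy as np

# Frames captured by the image chip at the two extreme lens positions (placeholder file names).
left = cv2.imread("frame_position1.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("frame_position2.png", cv2.IMREAD_GRAYSCALE)

# Classic block matching: one common way to turn two shifted views into a disparity map.
matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = matcher.compute(left, right).astype(np.float32) / 16.0  # StereoBM returns fixed-point values

# As described above, the disparity map is often used directly as the depth map
# and stored as a grayscale image.
depth_map = cv2.normalize(disparity, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
cv2.imwrite("depth_of_field_map.png", depth_map)
```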
In the embodiments of the present invention, the motor is controlled to drive the lens to move within a target plane parallel to the photosensitive surface of the image chip; when the lens is located at each of at least two target positions in the target plane, the target image acquired by the image chip at that position is obtained; and a depth-of-field image is obtained from the target images. Because the motor drives the lens and images are captured at at least two different positions, a depth-of-field image can be obtained. In this way, there is no need to fit the mobile terminal with dual cameras or with a dual-pixel autofocus sensor in a single camera, which reduces the cost of the mobile terminal.
It should be noted that the structure of the motor 202 can be configured as needed. For example, in this embodiment, an existing OIS (Optical Image Stabilizer) motor may be used as the motor 202. Specifically, the OIS motor includes an AF direction coil, two first magnets arranged opposite each other, and two second magnets arranged opposite each other. The first magnets are used to control the lens to move in a first direction of the target plane; the second magnets are used to control the lens to move in a second direction of the target plane, the second direction being perpendicular to the first direction; and the AF direction coil is used to control the lens to move in a third direction, perpendicular to the target plane, to perform AF focusing. This AF direction is the Z-axis direction and is perpendicular to the target plane.
Here, the first direction is the direction of the X axis and the second direction is the direction of the Y axis.
Based on the above, the motor 202 may be an OIS motor, or a motor structure obtained by modifying an OIS motor, as long as it can control the lens 201 to move within the target plane. In this embodiment, the motor 202 may take any of the following four forms.
First case: the motor 202 is an OIS motor, that is, the motor 202 includes an AF direction coil, two first magnets arranged opposite each other, and two second magnets arranged opposite each other. Because an existing OIS motor is used, no change to the structure of the mobile terminal itself is required, so this case has the widest applicability.
Second case: the motor 202 is an OIS motor with the second magnets removed, that is, the motor 202 includes an AF direction coil and two first magnets arranged opposite each other. Omitting the second magnets reduces the volume of the motor 202 and further reduces the cost of the mobile terminal.
Third case: the motor 202 is an OIS motor with the AF direction coil removed, that is, the motor 202 includes two first magnets arranged opposite each other and two second magnets arranged opposite each other. Omitting the AF direction coil reduces the volume of the motor 202 and further reduces the cost of the mobile terminal.
Fourth case: the motor 202 is an OIS motor with both the AF direction coil and the second magnets removed, that is, the motor 202 includes two first magnets arranged opposite each other. Omitting the AF direction coil and the second magnets reduces the volume of the motor 202 and further reduces the cost of the mobile terminal.
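The four motor variants differ only in which actuators remain, which determines the movements each variant supports. The sketch below (an editorial illustration, not part of the patent) makes that mapping explicit.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class MotorConfig:
    """Actuators kept in an OIS-derived motor; the two first magnets are always present."""
    has_second_magnets: bool  # Y-axis movement within the target plane
    has_af_coil: bool         # Z-axis movement (AF focusing) perpendicular to the plane

    def movable_axes(self) -> str:
        axes = ["X"]  # the two first magnets always provide X-axis movement
        if self.has_second_magnets:
            axes.append("Y")
        if self.has_af_coil:
            axes.append("Z (AF)")
        return ", ".join(axes)

CASES = {
    "1: full OIS motor": MotorConfig(has_second_magnets=True, has_af_coil=True),
    "2: second magnets removed": MotorConfig(has_second_magnets=False, has_af_coil=True),
    "3: AF coil removed": MotorConfig(has_second_magnets=True, has_af_coil=False),
    "4: AF coil and second magnets removed": MotorConfig(has_second_magnets=False, has_af_coil=False),
}

if __name__ == "__main__":
    for name, cfg in CASES.items():
        print(f"Case {name}: movable axes = {cfg.movable_axes()}")
```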
It should be noted that the optional embodiments described in the embodiments of the present invention may be implemented in combination with each other or separately; this is not limited in the embodiments of the present invention.
Referring to Fig. 9, Fig. 9 is a structural diagram of a mobile terminal according to an embodiment of the present invention. The mobile terminal provided by this embodiment includes a lens, an image chip and a motor. As shown in Fig. 9, the mobile terminal further includes:
a control module 901, configured to control the motor to drive the lens to move within a target plane, the target plane being parallel to the photosensitive surface of the image chip;
an acquisition module 902, configured to obtain, when the lens is located at each of at least two target positions in the target plane, the target image acquired by the image chip at that target position;
a processing module 903, configured to obtain a depth-of-field image from the target images.
Optionally, the target positions include a first position and a second position obtained after the lens moves in a first axial direction; alternatively, the target positions include a first position and a second position obtained after the lens moves in the first axial direction, and a third position and a fourth position obtained after the lens moves in a second axial direction.
Optionally, the first position and the second position are the extreme positions to which the lens can move in the first axial direction, and the third position and the fourth position are the extreme positions to which the lens can move in the second axial direction.
Optionally, the first axial direction is perpendicular to the second axial direction.
Optionally, the motor includes two first magnets arranged opposite each other, and the first magnets are used to control the lens to move in a first direction of the target plane.
Optionally, the motor further includes an AF direction coil and/or two second magnets arranged opposite each other. The second magnets are used to control the lens to move in a second direction of the target plane, the second direction being perpendicular to the first direction; the AF direction coil is used to control the lens to move in a third direction, perpendicular to the target plane, to perform AF focusing.
The mobile terminal provided by this embodiment of the present invention can implement each process implemented by the mobile terminal in the method embodiments of Fig. 1 to Fig. 8; to avoid repetition, details are not described here again. Because the motor drives the lens and images are captured at at least two different positions, a depth-of-field image can be obtained. In this way, there is no need to fit the mobile terminal with dual cameras or with a dual-pixel autofocus sensor in a single camera, which reduces the cost of the mobile terminal.
Fig. 10 is a schematic diagram of the hardware structure of a mobile terminal for implementing the embodiments of the present invention.
The mobile terminal 1000 includes, but is not limited to: a radio frequency unit 1001, a network module 1002, an audio output unit 1003, an input unit 1004, a sensor 1005, a display unit 1006, a user input unit 1007, an interface unit 1008, a memory 1009, a processor 1010, a power supply 1011, a lens, an image chip and a motor, among other components. A person skilled in the art will understand that the mobile terminal structure shown in Fig. 10 does not constitute a limitation on the mobile terminal; the mobile terminal may include more or fewer components than shown, or combine certain components, or have a different arrangement of components. In the embodiments of the present invention, the mobile terminal includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palmtop computer, an in-vehicle terminal, a wearable device, a pedometer, and the like.
The processor 1010 is configured to: control the motor to drive the lens to move within a target plane, the target plane being parallel to the photosensitive surface of the image chip; when the lens is located at each of at least two target positions in the target plane, obtain the target image acquired by the image chip at that target position; and obtain a depth-of-field image from the target images.
Optionally, the target positions include a first position and a second position obtained after the lens moves in a first axial direction; alternatively, the target positions include a first position and a second position obtained after the lens moves in the first axial direction, and a third position and a fourth position obtained after the lens moves in a second axial direction.
Optionally, the first position and the second position are the extreme positions to which the lens can move in the first axial direction, and the third position and the fourth position are the extreme positions to which the lens can move in the second axial direction.
Optionally, the first axial direction is perpendicular to the second axial direction.
Optionally, the motor includes two first magnets arranged opposite each other, and the first magnets are used to control the lens to move in a first direction of the target plane.
Optionally, the motor further includes an AF direction coil and/or two second magnets arranged opposite each other. The second magnets are used to control the lens to move in a second direction of the target plane, the second direction being perpendicular to the first direction; the AF direction coil is used to control the lens to move in a third direction, perpendicular to the target plane, to perform AF focusing.
In the embodiments of the present invention, the motor is controlled to drive the lens to move within a target plane parallel to the photosensitive surface of the image chip; when the lens is located at each of at least two target positions in the target plane, the target image acquired by the image chip at that position is obtained; and a depth-of-field image is obtained from the target images. Because the motor drives the lens and images are captured at at least two different positions, a depth-of-field image can be obtained. In this way, there is no need to fit the mobile terminal with dual cameras or with a dual-pixel autofocus sensor in a single camera, which reduces the cost of the mobile terminal.
It should be understood that, in the embodiments of the present invention, the radio frequency unit 1001 may be used to receive and send signals during the sending and receiving of information or during a call. Specifically, after receiving downlink data from a base station, it passes the data to the processor 1010 for processing, and it sends uplink data to the base station. Generally, the radio frequency unit 1001 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier, a duplexer and the like. In addition, the radio frequency unit 1001 can also communicate with the network and other devices via a wireless communication system.
The mobile terminal provides the user with wireless broadband Internet access through the network module 1002, for example helping the user to send and receive e-mails, browse web pages and access streaming media.
The audio output unit 1003 may convert audio data received by the radio frequency unit 1001 or the network module 1002, or stored in the memory 1009, into an audio signal and output it as sound. Moreover, the audio output unit 1003 may also provide audio output related to a specific function performed by the mobile terminal 1000 (for example, call signal reception sound or message reception sound). The audio output unit 1003 includes a speaker, a buzzer, a receiver and the like.
The input unit 1004 is used to receive audio or video signals. The input unit 1004 may include a graphics processing unit (GPU) 10041 and a microphone 10042. The graphics processing unit 10041 processes image data of still pictures or video obtained by an image capturing device (such as a camera) in video capture mode or image capture mode. The processed image frames may be displayed on the display unit 1006. The image frames processed by the graphics processing unit 10041 may be stored in the memory 1009 (or other storage medium) or sent via the radio frequency unit 1001 or the network module 1002. The microphone 10042 can receive sound and process it into audio data; in telephone call mode, the processed audio data can be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 1001 and output.
The mobile terminal 1000 further includes at least one sensor 1005, such as a light sensor, a motion sensor and other sensors. Specifically, the light sensor includes an ambient light sensor and a proximity sensor; the ambient light sensor can adjust the brightness of the display panel 10061 according to the brightness of the ambient light, and the proximity sensor can turn off the display panel 10061 and/or the backlight when the mobile terminal 1000 is moved to the ear. As a kind of motion sensor, an accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes) and, when stationary, can detect the magnitude and direction of gravity; it can be used to identify the posture of the mobile terminal (such as landscape/portrait switching, related games, magnetometer attitude calibration) and vibration-recognition-related functions (such as pedometer or tapping). The sensor 1005 may further include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor and the like, which are not described here.
The display unit 1006 is used to display information entered by the user or provided to the user. The display unit 1006 may include a display panel 10061, which may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, or the like.
The user input unit 1007 may be used to receive entered numeric or character information and to generate key signal inputs related to the user settings and function control of the mobile terminal. Specifically, the user input unit 1007 includes a touch panel 10071 and other input devices 10072. The touch panel 10071, also called a touch screen, can collect the user's touch operations on or near it (for example, operations performed by the user on or near the touch panel 10071 with a finger, a stylus or any other suitable object or attachment). The touch panel 10071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the position touched by the user and the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends them to the processor 1010, and receives and executes commands sent by the processor 1010. In addition, the touch panel 10071 may be implemented in multiple types such as resistive, capacitive, infrared and surface acoustic wave. Besides the touch panel 10071, the user input unit 1007 may further include other input devices 10072. Specifically, the other input devices 10072 may include, but are not limited to, a physical keyboard, function keys (such as volume control keys and a power key), a trackball, a mouse and a joystick, which are not described here.
Further, the touch panel 10071 may be overlaid on the display panel 10061. When the touch panel 10071 detects a touch operation on or near it, it transmits the operation to the processor 1010 to determine the type of touch event, and the processor 1010 then provides a corresponding visual output on the display panel 10061 according to the type of the touch event. Although in Fig. 10 the touch panel 10071 and the display panel 10061 implement the input and output functions of the mobile terminal as two independent components, in some embodiments the touch panel 10071 and the display panel 10061 may be integrated to implement the input and output functions of the mobile terminal, which is not specifically limited here.
The interface unit 1008 is an interface through which an external device is connected to the mobile terminal 1000. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port and the like. The interface unit 1008 may be used to receive input (for example, data information or power) from an external device and transmit the received input to one or more elements within the mobile terminal 1000, or may be used to transmit data between the mobile terminal 1000 and an external device.
The memory 1009 may be used to store software programs and various data. The memory 1009 may mainly include a program storage area and a data storage area; the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function or an image playing function), and the like, and the data storage area may store data created according to the use of the mobile phone (such as audio data and a phone book). In addition, the memory 1009 may include a high-speed random access memory, and may further include a non-volatile memory, for example at least one magnetic disk storage device, a flash memory device or another volatile solid-state storage device.
The processor 1010 is the control center of the mobile terminal. It connects all parts of the entire mobile terminal through various interfaces and lines, and performs the various functions of the mobile terminal and processes data by running or executing the software programs and/or modules stored in the memory 1009 and invoking data stored in the memory 1009, so as to monitor the mobile terminal as a whole. The processor 1010 may include one or more processing units. Preferably, the processor 1010 may integrate an application processor and a modem processor; the application processor mainly handles the operating system, the user interface, application programs and so on, while the modem processor mainly handles wireless communication. It can be understood that the modem processor may also not be integrated into the processor 1010.
The mobile terminal 1000 may further include a power supply 1011 (such as a battery) that supplies power to the components. Preferably, the power supply 1011 may be logically connected to the processor 1010 through a power management system, so that functions such as charging, discharging and power consumption management are implemented through the power management system.
In addition, the mobile terminal 1000 includes some functional modules that are not shown, which are not described here.
Preferably, an embodiment of the present invention further provides a mobile terminal, including a processor 1010, a memory 1009, and a computer program stored in the memory 1009 and executable on the processor 1010. When the computer program is executed by the processor 1010, each process of the foregoing depth-of-field image generation method embodiments is implemented, and the same technical effect can be achieved; to avoid repetition, details are not described here again.
An embodiment of the present invention further provides a computer-readable storage medium on which a computer program is stored. When the computer program is executed by a processor, each process of the foregoing depth-of-field image generation method embodiments is implemented, and the same technical effect can be achieved; to avoid repetition, details are not described here again. The computer-readable storage medium is, for example, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disc.
A person of ordinary skill in the art will appreciate that the units and algorithm steps of the examples described with reference to the embodiments disclosed herein can be implemented by electronic hardware or by a combination of computer software and electronic hardware. Whether these functions are performed in hardware or software depends on the specific application and design constraints of the technical solution. A skilled person may use different methods to implement the described functions for each specific application, but such implementation should not be considered to go beyond the scope of the present invention.
A person skilled in the art can clearly understand that, for convenience and brevity of description, the specific working processes of the systems, devices and units described above may refer to the corresponding processes in the foregoing method embodiments, and are not described here again.
In the embodiments provided in this application, it should be understood that the disclosed devices and methods may be implemented in other ways. For example, the device embodiments described above are merely illustrative; for example, the division into units is only a division by logical function, and there may be other divisions in actual implementation, for example multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the mutual couplings, direct couplings or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, devices or units, and may be electrical, mechanical or in other forms.
The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; they may be located in one place or distributed over multiple network elements. Some or all of the units may be selected according to actual needs to achieve the objectives of the embodiments of the present invention.
In addition, the functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit.
If the functions are implemented in the form of software functional units and sold or used as independent products, they may be stored in a computer-readable storage medium. Based on such an understanding, the technical solution of the present invention, in essence, or the part contributing to the prior art, or part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device or the like) to perform all or some of the steps of the methods described in the embodiments of the present invention. The foregoing storage medium includes various media that can store program code, such as a USB flash drive, a removable hard disk, a ROM, a RAM, a magnetic disk or an optical disc.
The foregoing is only a specific embodiment of the present invention, but the protection scope of the present invention is not limited thereto. Any person familiar with the technical field can easily think of changes or substitutions within the technical scope disclosed by the present invention, and such changes or substitutions shall be covered by the protection scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (14)

1. A depth-of-field image generation method, applied to a mobile terminal, the mobile terminal comprising a lens, an image chip and a motor, wherein the method comprises:
controlling the motor to drive the lens to move within a target plane, the target plane being parallel to a photosensitive surface of the image chip;
when the lens is located at each of at least two target positions in the target plane, obtaining the target image acquired by the image chip at that target position; and
obtaining a depth-of-field image from the target images.
2. The method according to claim 1, wherein the target positions comprise a first position and a second position obtained after the lens moves in a first axial direction; or, the target positions comprise a first position and a second position obtained after the lens moves in the first axial direction, and a third position and a fourth position obtained after the lens moves in a second axial direction.
3. The method according to claim 2, wherein the first position and the second position are the extreme positions to which the lens can move in the first axial direction, and the third position and the fourth position are the extreme positions to which the lens can move in the second axial direction.
4. The method according to claim 2, wherein the first axial direction is perpendicular to the second axial direction.
5. The method according to claim 1, wherein the motor comprises two first magnets arranged opposite each other, and the first magnets are used to control the lens to move in a first direction of the target plane.
6. The method according to claim 5, wherein the motor further comprises an AF direction coil and/or two second magnets arranged opposite each other, the second magnets being used to control the lens to move in a second direction of the target plane, the second direction being perpendicular to the first direction, and the AF direction coil being used to control the lens to move in a third direction, perpendicular to the target plane, to perform AF focusing.
7. A mobile terminal, comprising a lens, an image chip and a motor, wherein the mobile terminal further comprises:
a control module, configured to control the motor to drive the lens to move within a target plane, the target plane being parallel to a photosensitive surface of the image chip;
an acquisition module, configured to obtain, when the lens is located at each of at least two target positions in the target plane, the target image acquired by the image chip at that target position; and
a processing module, configured to obtain a depth-of-field image from the target images.
8. The mobile terminal according to claim 7, wherein the target positions comprise a first position and a second position obtained after the lens moves in a first axial direction; or, the target positions comprise a first position and a second position obtained after the lens moves in the first axial direction, and a third position and a fourth position obtained after the lens moves in a second axial direction.
9. The mobile terminal according to claim 8, wherein the first position and the second position are the extreme positions to which the lens can move in the first axial direction, and the third position and the fourth position are the extreme positions to which the lens can move in the second axial direction.
10. The mobile terminal according to claim 8, wherein the first axial direction is perpendicular to the second axial direction.
11. The mobile terminal according to claim 7, wherein the motor comprises two first magnets arranged opposite each other, and the first magnets are used to control the lens to move in a first direction of the target plane.
12. The mobile terminal according to claim 11, wherein the motor further comprises an AF direction coil and/or two second magnets arranged opposite each other, the second magnets being used to control the lens to move in a second direction of the target plane, the second direction being perpendicular to the first direction, and the AF direction coil being used to control the lens to move in a third direction, perpendicular to the target plane, to perform AF focusing.
13. A mobile terminal, comprising a processor, a memory, and a computer program stored in the memory and executable on the processor, wherein when the computer program is executed by the processor, the steps of the depth-of-field image generation method according to any one of claims 1 to 6 are implemented.
14. A computer-readable storage medium on which a computer program is stored, wherein when the computer program is executed by a processor, the steps of the depth-of-field image generation method according to any one of claims 1 to 6 are implemented.
CN201810234190.1A 2018-03-21 2018-03-21 A kind of depth image generation method and mobile terminal Pending CN108259727A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201810234190.1A CN108259727A (en) 2018-03-21 2018-03-21 A kind of depth image generation method and mobile terminal
PCT/CN2019/078635 WO2019179413A1 (en) 2018-03-21 2019-03-19 Depth-of-field image generating method and mobile terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810234190.1A CN108259727A (en) 2018-03-21 2018-03-21 A kind of depth image generation method and mobile terminal

Publications (1)

Publication Number Publication Date
CN108259727A true CN108259727A (en) 2018-07-06

Family

ID=62747021

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810234190.1A Pending CN108259727A (en) 2018-03-21 2018-03-21 A kind of depth image generation method and mobile terminal

Country Status (2)

Country Link
CN (1) CN108259727A (en)
WO (1) WO2019179413A1 (en)

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007058066A (en) * 2005-08-26 2007-03-08 Olympus Corp Measuring microscope
US9307134B2 (en) * 2011-03-25 2016-04-05 Sony Corporation Automatic setting of zoom, aperture and shutter speed based on scene depth map
CN202837765U (en) * 2012-09-17 2013-03-27 硕颖数码科技(中国)有限公司 Camera lens module
CN203365778U (en) * 2013-07-08 2013-12-25 爱佩仪光电技术(深圳)有限公司 Motor spring structure for controlling translational motion of lens
CN108259727A (en) * 2018-03-21 2018-07-06 维沃移动通信有限公司 A kind of depth image generation method and mobile terminal

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102168954A (en) * 2011-01-14 2011-08-31 浙江大学 Monocular-camera-based method for measuring depth, depth field and sizes of objects
US20140226041A1 (en) * 2013-02-14 2014-08-14 Canon Kabushiki Kaisha Image processing apparatus, image pickup apparatus, image processing method, and non-transitory computer-readable storage medium
CN104010178A (en) * 2014-06-06 2014-08-27 深圳市墨克瑞光电子研究院 Binocular image parallax adjusting method and device and binocular camera
CN104902190A (en) * 2015-06-24 2015-09-09 联想(北京)有限公司 Control method, photographic device and electronic device
CN107026969A (en) * 2016-02-01 2017-08-08 中兴通讯股份有限公司 The determination method and device of phase difference

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019179413A1 (en) * 2018-03-21 2019-09-26 维沃移动通信有限公司 Depth-of-field image generating method and mobile terminal
CN109241832A (en) * 2018-07-26 2019-01-18 维沃移动通信有限公司 A kind of method and terminal device of face In vivo detection

Also Published As

Publication number Publication date
WO2019179413A1 (en) 2019-09-26

Similar Documents

Publication Publication Date Title
CN107872623B (en) A kind of image pickup method, mobile terminal and computer readable storage medium
CN110495819B (en) Robot control method, robot, terminal, server and control system
CN108513070A (en) A kind of image processing method, mobile terminal and computer readable storage medium
CN104134230A (en) Image processing method, image processing device and computer equipment
CN107592471A (en) A kind of high dynamic range images image pickup method and mobile terminal
CN107483836B (en) A kind of image pickup method and mobile terminal
CN107592466A (en) A kind of photographic method and mobile terminal
CN109922356B (en) Video recommendation method and device and computer-readable storage medium
CN107817939A (en) A kind of image processing method and mobile terminal
CN103473804A (en) Image processing method, device and terminal equipment
CN108459815A (en) A kind of display control method and mobile terminal
CN107580209A (en) Take pictures imaging method and the device of a kind of mobile terminal
CN108989672A (en) A kind of image pickup method and mobile terminal
CN109922179A (en) A kind of camera module, camera control method and terminal
CN108038825A (en) A kind of image processing method and mobile terminal
CN108337381A (en) A kind of lens control method and mobile terminal
CN107682639B (en) A kind of image processing method, device and mobile terminal
CN106203254A (en) A kind of adjustment is taken pictures the method and device in direction
CN108833709A (en) A kind of the starting method and mobile terminal of camera
CN108156374A (en) A kind of image processing method, terminal and readable storage medium storing program for executing
CN108320263A (en) A kind of method, device and mobile terminal of image procossing
CN108881544A (en) A kind of method taken pictures and mobile terminal
CN109688341A (en) A kind of method for polishing and terminal device
CN105635553B (en) Image shooting method and device
CN108984143A (en) A kind of display control method and terminal device

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
RJ01 Rejection of invention patent application after publication

Application publication date: 20180706