CN106921825A - A kind of method of focal imaging, device and terminal - Google Patents
- Publication number
- CN106921825A (application CN201510990082.3A)
- Authority
- CN
- China
- Prior art keywords
- information
- distance
- camera lens
- focusing
- preview image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/67—Focus control based on electronic image sensor signals
Abstract
The embodiment of the invention discloses a method, an apparatus, and a terminal for focused imaging. The method may include: receiving a first instruction to acquire image information and generate a preview image, wherein acquiring the image information includes acquiring object information through a camera lens and acquiring coordinate information corresponding to the object information through a sensor; receiving a second instruction to select first object information in the preview image; and performing focusing processing on the preview image according to the selected first object information and the coordinate information of the first object.
Description
Technical Field
The present invention relates to imaging technologies, and in particular, to a method, an apparatus, and a terminal for focused imaging.
Background
When a single camera lens is used for imaging in a portable device such as a mobile terminal, the depth of field of the single camera lens extends only about 2.5 m before and after the focal point; a target image outside this depth of field is out of focus, causing the problem of imaging blur.
At present, in order to solve the problem of single-lens imaging blur in portable devices, a main camera lens and an auxiliary camera lens are generally adopted to realize rapid focusing: the main camera lens takes the picture and collects the image information, while the auxiliary camera lens, a wide-depth-of-field lens, collects the distance between the main camera lens and the photographed object and feeds the position information back to the image processing system. This yields the capability of shooting over a larger depth-of-field range, and high-definition imaging output can be obtained through image processing. However, since the auxiliary camera lens is usually an optical device, it is easily affected by external light while collecting the distance between the main camera lens and the photographed object, and it cannot measure distance under low-illuminance conditions.
Disclosure of Invention
In order to solve the foregoing technical problems, embodiments of the present invention desirably provide a method, an apparatus, and a terminal for focused imaging, which can avoid the influence of external light and can still collect the distance between the main camera lens and the object to be photographed under the condition of low illuminance.
The technical scheme of the invention is realized as follows:
in a first aspect, an embodiment of the present invention provides an apparatus for focused imaging, where the apparatus includes: the device comprises an acquisition module, a selection module and a processing module;
the acquisition module is used for receiving a first instruction to acquire image information and generate a preview image; wherein the acquiring image information comprises: acquiring object information through a camera lens and acquiring coordinate information corresponding to the object information through a sensor;
the selecting module is used for receiving a second instruction to select the first object information in the preview image;
and the processing module is used for focusing the preview image according to the selected first object information and the coordinate information of the first object.
In the above solution, the processing module includes: a ranging sub-module, a selection sub-module, and a focusing sub-module, wherein,
the distance measuring submodule is used for determining a first distance between the first object and the camera lens according to the coordinate information corresponding to the first object information; and
determining a second distance between a second object except the first object and the camera lens according to the coordinate information of the second object in the image information;
the selecting submodule is used for selecting a target object with a second distance which is the same as the first distance from the second object;
and the focusing sub-module is used for focusing the first object and the target object in the preview image to obtain a focused image.
In the above solution, the apparatus further includes a determining module, configured to receive a third instruction to determine the selected area;
correspondingly, the selecting sub-module is used for selecting a target object with a second distance which is the same as the first distance from a second object in the selected area.
In the above scheme, the focusing sub-module is further configured to perform non-focusing processing on the objects in the preview image other than the first object and the target object.
In the above scheme, the acquisition module is specifically configured to:
acquiring, by an ultrasonic sensor, coordinate information corresponding to the object information through ultrasonic waves with a frequency of 40 kHz according to the viewing range of the camera lens;
or acquiring coordinate information corresponding to the object information by using an infrared sensor through a plurality of infrared rays with the same frequency and different emission angles in the view finding range of the camera lens.
In a second aspect, an embodiment of the present invention provides a method of focused imaging, where the method includes:
receiving a first instruction to acquire image information and generating a preview image; wherein the acquiring image information comprises: acquiring object information through a camera lens and acquiring coordinate information corresponding to the object information through a sensor;
receiving a second instruction to select first object information in the preview image;
and focusing the preview image according to the selected first object information and the coordinate information of the first object.
In the foregoing solution, the focusing the preview image according to the selected first object information and the coordinate information of the first object specifically includes:
determining a first distance between the first object and the camera lens according to the coordinate information corresponding to the first object information;
determining a second distance between a second object except the first object and the camera lens according to the coordinate information of the second object in the image information;
selecting a target object with a second distance identical to the first distance from the second object;
and focusing the first object and the target object in the preview image to obtain a focused image.
In the above aspect, the method further includes: receiving a third instruction to determine a selected area;
correspondingly, selecting a target object with a second distance equal to the first distance from the second object specifically includes:
and selecting a target object with the second distance being the same as the first distance from a second object in the selected area.
In the above aspect, the method further includes: performing non-focusing processing on the objects in the preview image other than the first object and the target object.
In the above scheme, the acquiring, by the sensor, the coordinate information corresponding to the object information includes:
the ultrasonic sensor acquires coordinate information corresponding to the object information through ultrasonic waves with a frequency of 40 kHz according to the viewing range of the camera lens;
or the infrared sensor collects coordinate information corresponding to the object information through a plurality of infrared rays with the same frequency and different emission angles in the view finding range of the camera lens.
In a third aspect, an embodiment of the present invention provides a terminal, where the terminal includes: the device comprises a camera lens, a sensor, a memory, a display and a processor;
the processor is used for receiving a second instruction to select first object information in a preview image which is generated according to the received first instruction and displayed by the display; the image information comprises object information acquired by the camera lens and coordinate information corresponding to the object information acquired by the sensor, and the image information is stored in the memory;
and the processor is further used for performing focusing processing on the preview image according to the selected first object information and the coordinate information of the first object.
The embodiments of the present invention provide a method, an apparatus, and a terminal for focused imaging, in which a propagation wave is used instead of a traditional auxiliary camera lens to measure the distance between the main camera lens and the photographed object. The influence of external light is thus avoided, and the distance between the main camera lens and the photographed object can still be collected under low-illuminance conditions.
Drawings
Fig. 1 is a schematic structural diagram of a focusing imaging apparatus according to an embodiment of the present invention;
fig. 2 is a schematic diagram of a photographing lens provided in an embodiment of the present invention;
FIG. 3 is a schematic diagram of a distance measurement of an ultrasonic distance measuring sensor according to an embodiment of the present invention;
FIG. 4 is a schematic structural diagram of another focusing imaging apparatus provided in the embodiment of the present invention;
FIG. 5 is a flowchart illustrating a method of focused imaging according to an embodiment of the present invention;
FIG. 6 is a flowchart illustrating a focusing process according to an embodiment of the present invention;
fig. 7 is a schematic diagram of a hardware structure of a terminal according to an embodiment of the present invention.
Detailed Description
The technical solution in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention.
Example one
Referring to fig. 1, an apparatus 10 for focused imaging according to an embodiment of the present invention is shown. The apparatus 10 may be applied to a mobile terminal with a shooting function, such as a smart phone, a tablet computer, or a notebook computer, and may include: an acquisition module 101, a selection module 102, and a processing module 103.
The acquisition module 101 is configured to receive a first instruction to acquire image information and generate a preview image; wherein the acquiring image information comprises: acquiring object information through a camera lens and acquiring coordinate information corresponding to the object information through a sensor;
the selecting module 102 is configured to receive a second instruction to select the first object information in the preview image;
the processing module 103 is configured to perform focusing processing on the preview image according to the selected first object information and the coordinate information of the first object.
It can be understood that, in the focused imaging apparatus 10 provided in this embodiment, the acquisition module 101 measures the distance of the scenery with a propagation wave through the sensor and focuses the framing preview image according to that propagation-wave ranging. The focusing process is therefore not affected by the external light environment, and the distance between the main camera lens and the photographed object can still be collected under low illuminance for focusing the framing preview image.
For example, in the focusing imaging apparatus 10 shown in fig. 1, the acquisition module 101 may specifically include an image pickup lens having auto-focusing and shooting functions as shown in fig. 2, and may select shooting parameters such as a focal length range, a lens field angle, and an aperture size according to requirements for shooting. When the mobile terminal starts the device 10 for focusing and imaging, the camera lens is opened, a framing preview image is acquired in a framing range of the camera lens according to preset shooting parameters, and the framing preview image can be displayed in the display unit.
For example, in the focusing imaging apparatus 10 shown in fig. 1, the acquisition module 101 may further include a propagation wave distance measuring sensor, such as an ultrasonic distance measuring sensor or an infrared distance measuring sensor. It is to be understood that the propagating wave may be other propagating waves, and the embodiment is not limited thereto.
For the ultrasonic ranging sensor, the ultrasonic waves emitted by the ultrasonic ranging sensor need to satisfy the view range of the camera lens in the acquisition module 101, i.e., the lens field angle of the camera lens.
Specifically, ultrasonic waves are longitudinally vibrating elastic mechanical waves that propagate by means of the molecular motion of the propagation medium. The wave equation for an ultrasonic wave is therefore similar to that of an electromagnetic wave, as shown in equation (1):

A(x, t) = A0 · e^(−αx) · cos(ωt − kx)    (1)

wherein A(x, t) is the amplitude, A0 is a constant, ω is the circular frequency, t is the time, x is the propagation distance of the ultrasonic wave, k = 2π/λ is the wave number, λ is the wavelength of the ultrasonic wave, and the attenuation constant α = a·f², wherein a is a constant of the propagation medium and f is the vibration frequency of the ultrasonic wave.
For example, the medium constant of air is a = 2 × 10⁻¹³ s²/cm. When the vibration frequency of the ultrasonic wave is f = 40 kHz, α = a·f² = 3.2 × 10⁻⁴ cm⁻¹ and 1/α ≈ 31 m; when f = 30 kHz, 1/α ≈ 56 m. It can thus be seen that the higher the frequency of the ultrasonic wave, the greater the attenuation and the shorter the distance travelled.
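The attenuation figures above can be checked numerically. A short sketch follows, with the constant a and both frequencies taken from the text; the function names are illustrative:

```python
def attenuation_constant(f_hz, a=2e-13):
    """Attenuation constant alpha = a * f^2, in cm^-1 (a in s^2/cm)."""
    return a * f_hz ** 2

def effective_range_m(f_hz):
    """Characteristic propagation distance 1/alpha, converted cm -> m."""
    return 1.0 / attenuation_constant(f_hz) / 100.0

print(round(effective_range_m(40_000)))  # 31, matching 1/alpha = 31 m
print(round(effective_range_m(30_000)))  # 56, matching 1/alpha = 56 m
```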
In addition, the pointing angle θ of the ultrasonic ranging sensor, i.e., the half-power angle of the ultrasonic beam, directly influences the ranging resolution and is an important technical parameter affecting the ranging accuracy. Taking a wafer (circular-disc) ultrasonic ranging sensor as an example, the pointing angle θ is related to the ultrasonic wavelength λ and the radius r of the wafer, with the specific relationship shown in equation (2):

sin θ = 0.61 · λ / r    (2)

As can be seen from equation (2), the smaller the pointing angle θ of the ultrasonic ranging sensor, and hence the higher the spatial resolution, the larger the required radius r of the wafer. When f = 40 kHz, the wavelength of the ultrasonic wave is λ = C/f = 340/40000 m = 8.5 mm, wherein C is the propagation speed of the ultrasonic wave in air. On the basis of the correspondence between the pointing angle θ, the wafer radius r, and the wavelength λ expressed by equation (2), the vibration frequency of the ultrasonic wave in this embodiment is preferably set to f = 40 kHz.
Therefore, the acquisition module 101 may specifically acquire the coordinate information corresponding to the object information by using an ultrasonic sensor through ultrasonic waves with a frequency of 40 kHz in the viewing range of the imaging lens.
For the infrared sensor, the pointing angle of an infrared ray is much smaller than that of an ultrasonic wave, so one infrared beam can only measure an object in a single direction or at a single angle. Therefore, to collect the coordinate information of the objects in the viewing range of the camera lens, the infrared sensor needs to measure with multiple infrared rays: the acquisition module 101 collects the coordinate information corresponding to the object information by using the infrared sensor through a plurality of infrared rays with the same frequency and different emission angles in the viewing range of the camera lens.
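One way such a multi-angle infrared scan could yield coordinate information is to convert each (emission angle, measured distance) pair into a position relative to the lens. The planar (x, y) model and the function name below are illustrative assumptions, not the patent's implementation:

```python
import math

def ir_coordinates(measurements):
    """Convert (emission_angle_deg, distance_m) pairs from an assumed
    infrared scan into (x, y) metres relative to the camera lens, with
    the y axis along the lens's optical axis."""
    coords = []
    for angle_deg, dist in measurements:
        a = math.radians(angle_deg)
        coords.append((dist * math.sin(a), dist * math.cos(a)))
    return coords

# An object straight ahead at 2 m, and one 30 degrees off-axis at 1 m:
print(ir_coordinates([(0.0, 2.0), (30.0, 1.0)]))
```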
For example, in the device 10 for focused imaging shown in fig. 1, the distance between the object in the preview image and the camera lens of the acquisition module 101 may be obtained by the difference between the emission time of the propagating wave and the receiving time of the propagating wave echo.
Further, when the propagation wave is an ultrasonic wave, the distance L between the object in the preview image and the camera lens of the acquisition module 101 can be obtained from equation (3):

L = C · T / 2    (3)

wherein C is the propagation speed of the ultrasonic wave in air.
wherein T is the transit time, i.e., the time taken for the ultrasonic wave emitted from the transmitting end to propagate through the gas medium to the receiving end. It can be understood that, as shown in fig. 3, when the ultrasonic ranging sensor uses remote pulse-echo reflection to measure the distance between the object in the preview image and the camera lens of the acquisition module 101, that distance is half of the total path travelled by the sound wave. When the propagation wave is infrared, the distance is obtained in the same manner and is not described again.
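Equation (3) in code form, with the 340 m/s speed of sound in air used elsewhere in the text (the helper name is illustrative):

```python
C_AIR = 340.0  # propagation speed of sound in air, m/s

def echo_distance_m(transit_time_s):
    """Equation (3): object distance is half the round-trip sound path."""
    return C_AIR * transit_time_s / 2.0

print(echo_distance_m(0.01))  # 1.7 m for a 10 ms round trip
```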
For example, in the focusing imaging apparatus 10 shown in fig. 1, the view preview image generated by the capture module 101 may be displayed by a display unit with touch sensing, and when the selection module 102 receives a second instruction of a user to select an object in the displayed preview image, the first object information in the preview image is selected according to the selection instruction of the user.
Specifically, referring to fig. 4, the processing module 103 includes: a ranging sub-module 1031, a selection sub-module 1032, and a focusing sub-module 1033, wherein,
the distance measuring submodule 1031 is configured to determine a first distance between the first object and the camera lens according to the coordinate information corresponding to the first object information; and
determining a second distance between a second object except the first object and the camera lens according to the coordinate information of the second object in the image information;
a selecting submodule 1032, configured to select a target object with a second distance that is the same as the first distance from the second object;
the focusing sub-module 1033 is configured to perform focusing processing on the first object and the target object in the preview image to obtain a focused image.
Preferably, the distance between the target object and the camera lens of the acquisition module 101 is equal to the distance between the selected first object and that camera lens, so that the target object and the selected first object lie in a common plane, and the focusing sub-module 1033 can focus on the scenery in that common plane.
Further, the area range of the common plane can also be set according to the user's requirements; therefore, the apparatus 10 further includes a determining module 104, configured to receive a third instruction to determine the selected area;
accordingly, the selecting sub-module 1032 is configured to select a target object with a second distance equal to the first distance from the second object located in the selected area.
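The selection logic of the sub-modules above can be sketched as follows; the data layout (per-object distance and preview position), the names, and the matching tolerance are all illustrative assumptions rather than the patent's implementation:

```python
def select_targets(objects, first_id, selected_area=None, tol_m=0.05):
    """objects maps an id to {'dist': metres to the lens,
    'pos': (x, y) in the preview}. Return the ids of the second objects
    whose distance matches the first object's (within tol_m), optionally
    restricted to a selected area given as (x0, y0, x1, y1)."""
    first_dist = objects[first_id]['dist']
    targets = []
    for oid, info in objects.items():
        if oid == first_id:
            continue
        if selected_area is not None:
            x, y = info['pos']
            x0, y0, x1, y1 = selected_area
            if not (x0 <= x <= x1 and y0 <= y <= y1):
                continue
        if abs(info['dist'] - first_dist) <= tol_m:
            targets.append(oid)
    return targets
```

Objects that pass the filter are focused together with the first object; the remaining objects are candidates for the non-focusing processing.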
Further, in the process of obtaining the focused image, the focusing sub-module 1033 may be further configured to perform non-focusing processing on the objects in the preview image that are not in the same plane as the selected object, that is, the objects other than the first object and the target object, for example, image processing such as sharpening, blurring, and color rendering, so as to achieve the aperture-blurring effect of a camera lens. It is to be understood that the focusing sub-module 1033 may also set the degree of blurring.
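A toy 1-D version of that non-focusing step, keeping masked (in-focus) pixels sharp and replacing the rest with a neighbourhood average; this only illustrates the idea and is not the patent's specific sharpening, blurring, or color-rendering processing:

```python
def defocus_outside(pixels, in_focus, radius=1):
    """Return pixels with a simple box blur applied wherever the
    in_focus mask is False; in-focus pixels pass through unchanged."""
    out = []
    n = len(pixels)
    for i, (p, keep) in enumerate(zip(pixels, in_focus)):
        if keep:
            out.append(float(p))
        else:
            lo, hi = max(0, i - radius), min(n, i + radius + 1)
            window = pixels[lo:hi]
            out.append(sum(window) / len(window))
    return out

print(defocus_outside([0, 10, 0, 10], [True, True, False, False]))
```

A larger radius corresponds to a stronger degree of blurring, mirroring the adjustable blur mentioned above.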
The present embodiment provides a device 10 for focusing and imaging, which uses a propagation wave to replace a conventional auxiliary camera lens to measure the distance between a main camera lens and a shot object, so as to avoid the influence of external light and still collect the distance between the main camera lens and the shot object under the condition of low illuminance.
Example two
Based on the same technical concept as the foregoing embodiment, referring to fig. 5, a method for focused imaging provided by an embodiment of the present invention is shown, and the method may be applied to an apparatus for focused imaging in the foregoing embodiment, and the method may include:
s501: receiving a first instruction to acquire image information and generating a preview image; wherein the acquiring image information comprises: acquiring object information through a camera lens and acquiring coordinate information corresponding to the object information through a sensor;
s502: receiving a second instruction to select first object information in the preview image;
s503: and focusing the preview image according to the selected first object information and the coordinate information of the first object.
For example, referring to fig. 6, the performing a focusing process on the preview image according to the selected first object information and the coordinate information of the first object may specifically include:
s5031: determining a first distance between a first object and the camera lens according to coordinate information corresponding to the first object information;
s5032: determining a second distance between a second object and the camera lens according to coordinate information of the second object except the first object in the image information;
s5033: selecting a target object with the second distance being the same as the first distance from the second object;
s5034: And focusing the first object and the target object in the preview image to obtain a focused image.
Further, the method further comprises: receiving a third instruction to determine a selected area;
correspondingly, selecting a target object with a second distance equal to the first distance from the second object specifically includes:
and selecting a target object with the second distance being the same as the first distance from a second object in the selected area.
Further, the method further comprises: performing non-focusing processing on the objects in the preview image other than the first object and the target object.
In the foregoing scheme, the acquiring, by the sensor, the coordinate information corresponding to the object information may specifically include: the ultrasonic sensor acquires coordinate information corresponding to the object information through ultrasonic waves with a frequency of 40 kHz according to the viewing range of the camera lens;
or the infrared sensor collects coordinate information corresponding to the object information through a plurality of infrared rays with the same frequency and different emission angles in the view finding range of the camera lens.
The embodiment provides a focusing imaging method, which adopts a propagation wave to replace a traditional auxiliary camera lens to measure the distance between a main camera lens and a shot object, thereby avoiding the influence of external light and still collecting the distance between the main camera lens and the shot object under the condition of low illuminance.
EXAMPLE III
Based on the same technical concept as the foregoing embodiments, fig. 7 shows a hardware structure of a terminal 70 according to an embodiment of the present invention. The terminal 70 may be a smart phone, a tablet computer, a notebook computer, or the like, and may include components such as a camera lens 701, a sensor 702, a memory 703, a display 704, and a processor 705, which are communicatively connected by one or more buses 706. Those skilled in the art will appreciate that the structure shown in fig. 7 does not limit the embodiments of the present invention: the buses may be arranged in a bus or star topology, and the terminal may include more or fewer components than those shown, combine certain components, or arrange the components differently. Wherein,
a processor 705, configured to receive a second instruction to select first object information in the preview image that is generated by acquiring image information according to the received first instruction and is displayed by the display 704; the image information includes object information acquired by the camera lens 701 and coordinate information corresponding to the object information acquired by the sensor 702, and the image information is stored in the memory 703;
and the processor 705 is further configured to perform focusing processing on the preview image according to the selected first object information and the coordinate information of the first object.
In the foregoing solution, the processor 705 is specifically configured to determine a first distance between the first object and the imaging lens 701 according to the coordinate information corresponding to the first object information; and
determine a second distance between a second object other than the first object and the camera lens 701 according to the coordinate information of the second object in the image information; and
select a target object with a second distance identical to the first distance from the second object; and
and focusing the first object and the target object in the preview image to obtain a focused image.
In the above solution, the processor 705 is further configured to receive a third instruction to determine the selected area;
and selecting a target object with the second distance being the same as the first distance from a second object in the selected area.
In the above solution, the processor 705 is further configured to perform non-focusing processing on the objects in the preview image other than the first object and the target object.
In the above solution, the sensor 702 may include an ultrasonic sensor 7021 or an infrared sensor 7022; wherein,
the ultrasonic sensor 7021 is configured to acquire coordinate information corresponding to the object information through ultrasonic waves with a frequency of 40 kHz in the viewing range of the imaging lens 701;
the infrared sensor 7022 is configured to acquire coordinate information corresponding to the object information by using a plurality of infrared rays having the same frequency and different emission angles within a viewing range of the imaging lens 701.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of a hardware embodiment, a software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The above description is only a preferred embodiment of the present invention, and is not intended to limit the scope of the present invention.
Claims (11)
1. An apparatus for focused imaging, the apparatus comprising: the device comprises an acquisition module, a selection module and a processing module;
the acquisition module is used for receiving a first instruction to acquire image information and generate a preview image; wherein the acquiring image information comprises: acquiring object information through a camera lens and acquiring coordinate information corresponding to the object information through a sensor;
the selecting module is used for receiving a second instruction to select the first object information in the preview image;
and the processing module is used for focusing the preview image according to the selected first object information and the coordinate information of the first object.
2. The apparatus of claim 1, wherein the processing module comprises: a ranging sub-module, a selecting sub-module, and a focusing sub-module, wherein,
the ranging sub-module is used for determining a first distance between the first object and the camera lens according to the coordinate information corresponding to the first object information; and,
determining a second distance between a second object, other than the first object, and the camera lens according to the coordinate information of the second object in the image information;
the selecting sub-module is used for selecting, from the second object, a target object whose second distance is the same as the first distance;
and the focusing sub-module is used for focusing the first object and the target object in the preview image to obtain a focused image.
3. The apparatus of claim 2, further comprising a determination module configured to receive a third instruction to determine the selected area;
correspondingly, the selecting sub-module is used for selecting, from a second object in the selected area, a target object whose second distance is the same as the first distance.
4. The apparatus according to claim 2 or 3,
the focusing sub-module is further configured to perform non-focusing processing on objects in the preview image other than the first object and the target object.
5. The apparatus according to claim 1, wherein the acquisition module is specifically configured to:
acquire coordinate information corresponding to the object information by using an ultrasonic sensor, through ultrasonic waves at a frequency of 40 kHz, according to the framing range of the camera lens;
or acquire coordinate information corresponding to the object information by using an infrared sensor, through a plurality of infrared rays with the same frequency and different emission angles, within the framing range of the camera lens.
6. A method of focused imaging, the method comprising:
receiving a first instruction to acquire image information and generating a preview image; wherein the acquiring image information comprises: acquiring object information through a camera lens and acquiring coordinate information corresponding to the object information through a sensor;
receiving a second instruction to select first object information in the preview image;
and focusing the preview image according to the selected first object information and the coordinate information of the first object.
7. The method according to claim 6, wherein performing the focusing processing on the preview image according to the selected first object information and the coordinate information of the first object specifically includes:
determining a first distance between the first object and the camera lens according to the coordinate information corresponding to the first object information;
determining a second distance between a second object, other than the first object, and the camera lens according to the coordinate information of the second object in the image information;
selecting, from the second object, a target object whose second distance is the same as the first distance;
and focusing the first object and the target object in the preview image to obtain a focused image.
8. The method of claim 7, further comprising: receiving a third instruction to determine a selected area;
correspondingly, selecting, from the second object, a target object whose second distance is the same as the first distance specifically includes:
selecting, from a second object in the selected area, a target object whose second distance is the same as the first distance.
9. The method according to claim 7 or 8, characterized in that the method further comprises: performing non-focusing processing on objects in the preview image other than the first object and the target object.
10. The method according to claim 6, wherein the acquiring, by a sensor, coordinate information corresponding to the object information includes:
the ultrasonic sensor acquires coordinate information corresponding to the object information through ultrasonic waves at a frequency of 40 kHz, according to the framing range of the camera lens;
or the infrared sensor acquires coordinate information corresponding to the object information through a plurality of infrared rays with the same frequency and different emission angles, within the framing range of the camera lens.
11. A terminal, characterized in that the terminal comprises: the device comprises a camera lens, a sensor, a memory, a display and a processor;
the processor is configured to receive a second instruction to select first object information in a preview image that is generated according to a received first instruction and displayed by the display; wherein the image information comprises object information acquired by the camera lens and coordinate information, corresponding to the object information, acquired by the sensor, and the image information is stored in the memory;
and the processor is further configured to perform focusing processing on the preview image according to the selected first object information and the coordinate information of the first object.
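The method of claims 6 to 9 can be illustrated with a short sketch. This is a hypothetical illustration, not the patented implementation: the object names, the Euclidean distance model, and the `tolerance` parameter used to decide that two distances are "the same" are assumptions introduced here.

```python
from dataclasses import dataclass


@dataclass
class SceneObject:
    name: str
    coord: tuple  # (x, y, z) coordinates reported by the sensor


def distance_to_lens(obj, lens=(0.0, 0.0, 0.0)):
    # The "first distance"/"second distance" of the claims: distance
    # from an object's sensed coordinates to the camera lens.
    return sum((a - b) ** 2 for a, b in zip(obj.coord, lens)) ** 0.5


def focus_plan(objects, first, tolerance=0.05):
    """Split the scene into objects to focus (the selected first object plus
    every target object at the same distance) and objects left unfocused."""
    d1 = distance_to_lens(first)
    targets = [o for o in objects
               if o is not first and abs(distance_to_lens(o) - d1) <= tolerance]
    focused = [first] + targets
    # Non-focusing processing (claim 9) applies to everything else.
    blurred = [o for o in objects if o not in focused]
    return focused, blurred
```

For example, with a person 2.0 m from the lens, a tree whose coordinates also place it 2.0 m away, and a car at 5.0 m, selecting the person focuses both the person and the tree while the car is left unfocused.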
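The coordinate acquisition of claims 5 and 10 reduces to standard time-of-flight ranging plus a polar-to-Cartesian conversion. A minimal sketch, assuming the speed of sound in air and helper names that are not present in the claims:

```python
import math

# Approximate speed of sound in air at 20 degrees C (assumed, not from the claims).
SPEED_OF_SOUND_M_S = 343.0


def echo_distance(round_trip_s):
    """Object distance from an ultrasonic echo's round-trip time (the 40 kHz variant)."""
    return SPEED_OF_SOUND_M_S * round_trip_s / 2.0


def polar_to_coord(distance_m, emission_angle_rad):
    """Planar coordinate of an object hit by a ray emitted at a known angle
    (the infrared variant, which uses several rays at different emission angles)."""
    return (distance_m * math.sin(emission_angle_rad),
            distance_m * math.cos(emission_angle_rad))
```

A 10 ms round trip thus corresponds to roughly 1.7 m, and combining each ray's measured distance with its emission angle yields the per-object coordinate information the focusing step consumes.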
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510990082.3A CN106921825A (en) | 2015-12-24 | 2015-12-24 | A kind of method of focal imaging, device and terminal |
PCT/CN2016/086903 WO2016198014A1 (en) | 2015-12-24 | 2016-06-23 | Focusing imaging device, method, and terminal |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510990082.3A CN106921825A (en) | 2015-12-24 | 2015-12-24 | A kind of method of focal imaging, device and terminal |
Publications (1)
Publication Number | Publication Date |
---|---|
CN106921825A true CN106921825A (en) | 2017-07-04 |
Family
ID=57503004
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201510990082.3A Pending CN106921825A (en) | 2015-12-24 | 2015-12-24 | A kind of method of focal imaging, device and terminal |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN106921825A (en) |
WO (1) | WO2016198014A1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113838118B (en) * | 2021-09-08 | 2024-07-09 | 杭州逗酷软件科技有限公司 | Distance measurement method and device and electronic equipment |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1763225A2 (en) * | 2005-09-13 | 2007-03-14 | Canon Kabushiki Kaisha | Lens apparatus |
CN104052932A (en) * | 2014-07-03 | 2014-09-17 | 深圳市世尊科技有限公司 | Rapidly-focusing mobile phone camera shooting module |
CN105100615A (en) * | 2015-07-24 | 2015-11-25 | 青岛海信移动通信技术股份有限公司 | Image preview method, apparatus and terminal |
CN105100605A (en) * | 2015-06-18 | 2015-11-25 | 惠州Tcl移动通信有限公司 | Mobile terminal and quick focusing method for photographing with the same |
- 2015-12-24: CN application CN201510990082.3A, published as CN106921825A (status: Pending)
- 2016-06-23: WO application PCT/CN2016/086903, published as WO2016198014A1 (status: Application Filing)
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1763225A2 (en) * | 2005-09-13 | 2007-03-14 | Canon Kabushiki Kaisha | Lens apparatus |
CN104052932A (en) * | 2014-07-03 | 2014-09-17 | 深圳市世尊科技有限公司 | Rapidly-focusing mobile phone camera shooting module |
CN105100605A (en) * | 2015-06-18 | 2015-11-25 | 惠州Tcl移动通信有限公司 | Mobile terminal and quick focusing method for photographing with the same |
CN105100615A (en) * | 2015-07-24 | 2015-11-25 | 青岛海信移动通信技术股份有限公司 | Image preview method, apparatus and terminal |
Also Published As
Publication number | Publication date |
---|---|
WO2016198014A1 (en) | 2016-12-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5134694B2 (en) | Image processing apparatus and image processing method | |
CN105208287B (en) | A kind of image pickup method and device | |
CN107735713A (en) | Medical image processing unit, medical image processing method and medical viewing system | |
CN105704380A (en) | Camera focusing method and electric device | |
US20160173762A1 (en) | Image-capturing apparatus | |
CN104410783A (en) | Focusing method and terminal | |
CN105340267A (en) | Method for generating picture and twin-lens device | |
JP2014007580A (en) | Imaging apparatus, method of controlling the same and program therefor | |
CN107517345B (en) | Shooting preview method and capture apparatus | |
CN110246188B (en) | Internal reference calibration method and device for TOF camera and camera | |
CN104811613A (en) | Camera focusing method | |
Martel et al. | Real-time depth from focus on a programmable focal plane processor | |
CN105549299B (en) | Control method, control device and electronic device | |
CN112333379A (en) | Image focusing method and device and image acquisition equipment | |
JP6152772B2 (en) | Imaging apparatus, semiconductor integrated circuit, and imaging method | |
CN105657253B (en) | A kind of focusing method and electronic equipment | |
CN105301864B (en) | Liquid crystal lens imaging device and liquid crystal lens imaging method | |
JP2013145982A (en) | Imaging apparatus, image processing apparatus and method | |
CN108169996A (en) | A kind of test method of stereo camera shooting module motor characteristics, apparatus and system | |
JP2015194686A (en) | Imaging apparatus | |
US10880536B2 (en) | Three-dimensional image capturing device and method | |
JP6149717B2 (en) | Imaging apparatus and imaging method | |
JP6326631B2 (en) | Imaging device | |
CN110225247B (en) | Image processing method and electronic equipment | |
CN106921825A (en) | A kind of method of focal imaging, device and terminal |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | ||
Application publication date: 20170704 |