CN105323481A - Ultrasonic-based photographing method and device - Google Patents
- Publication number
- CN105323481A CN105323481A CN201510671450.8A CN201510671450A CN105323481A CN 105323481 A CN105323481 A CN 105323481A CN 201510671450 A CN201510671450 A CN 201510671450A CN 105323481 A CN105323481 A CN 105323481A
- Authority
- CN
- China
- Prior art keywords
- camera
- multiple subject
- subject
- ultrasonic sensor
- image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- H04N23/67 — Focus control based on electronic image sensor signals (H: Electricity; H04N: Pictorial communication, e.g. television; H04N23/00: Cameras or camera modules comprising electronic image sensors; control thereof; H04N23/60: Control of cameras or camera modules)
- H04N23/611 — Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Studio Devices (AREA)
- Automatic Focus Adjustment (AREA)
- Focusing (AREA)
Abstract
The invention discloses an ultrasonic-based photographing method and device. The method comprises the following steps: upon detecting a photographing trigger event, obtaining the distance values between at least one camera of an intelligent terminal and each of a plurality of photographed objects via a plurality of ultrasonic sensors in the intelligent terminal; controlling the at least one camera to focus on each of the plurality of photographed objects according to the obtained distance values, and controlling the at least one camera to photograph the plurality of photographed objects according to the focusing results. According to the technical scheme provided by the embodiments of the invention, the camera is controlled to focus on the photographed objects individually according to the distance values between the camera and the objects; this increases the number of focusing operations the camera performs on the plurality of photographed objects and thereby improves the quality of the photographed image.
Description
Technical field
Embodiments of the present invention relate to the field of electronic terminal technology, and in particular to an ultrasonic-based photographing method and device.
Background art
An ultrasonic sensor comprises two parts: one or more transmitters for emitting ultrasonic waves, and one or more receivers for receiving them.

When an object approaches the ultrasonic sensor, the emitted wave is reflected by the object and received back. From the difference between the transmission time and the reception time, the speed of sound, and a differential filtering algorithm, the distance to the approaching object can be computed; this is ultrasonic ranging. Depending on the IC (Integrated Circuit) and the software algorithm used, the measured distance is accurate to the millimetre (mm) level, with a full scale of roughly 150-200 cm. At present, when ultrasonic ranging is used for photographing, a plurality of photographed objects are focused on only once, so the quality of the captured image is poor.
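The time-of-flight relation behind the ranging described above can be sketched as follows. This is an illustrative example, not code from the patent: the constant and function names are assumptions, and the differential-filtering step is omitted.

```python
# Ultrasonic time-of-flight ranging: the pulse travels to the object and back,
# so the one-way distance is speed_of_sound * (t_receive - t_transmit) / 2.

SPEED_OF_SOUND_CM_PER_S = 34_300  # ~343 m/s in air at 20 degrees C (assumed)

def ultrasonic_distance_cm(t_transmit_s: float, t_receive_s: float) -> float:
    """Return the one-way distance to the reflecting object in centimetres."""
    round_trip_s = t_receive_s - t_transmit_s
    return SPEED_OF_SOUND_CM_PER_S * round_trip_s / 2

# An echo arriving 10 ms after transmission corresponds to about 171.5 cm,
# near the ~150-200 cm full scale mentioned above.
print(ultrasonic_distance_cm(0.0, 0.010))
```

In practice a real driver would also reject spurious echoes (the "differential filtering" the text mentions), which is not modelled here.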
Summary of the invention
The invention provides an ultrasonic-based photographing method and device that use ultrasonic ranging to focus on each of a plurality of photographed objects individually, improving the quality of the captured image.
In a first aspect, an embodiment of the invention provides an ultrasonic-based photographing method, comprising:

upon detecting a photographing trigger event, obtaining, via a plurality of ultrasonic sensors in an intelligent terminal, the distance values between at least one camera of the intelligent terminal and each of a plurality of photographed objects;

according to the obtained distance values, controlling the at least one camera to focus on each of the plurality of photographed objects, and controlling the at least one camera to photograph the plurality of photographed objects according to the focusing results.
In a second aspect, an embodiment of the invention provides an ultrasonic-based photographing device, comprising:

a distance obtaining unit, configured to obtain, upon detection of a photographing trigger event, the distance values between at least one camera of an intelligent terminal and each of a plurality of photographed objects via a plurality of ultrasonic sensors in the terminal;

a focusing unit, configured to control the at least one camera to focus on each of the plurality of photographed objects according to the obtained distance values;

a photographing unit, configured to control the at least one camera to photograph the plurality of photographed objects according to the focusing results of the focusing unit.
In the technical scheme provided by the embodiments of the present invention, the camera is controlled to focus on the photographed objects individually according to the distance values between the camera and the objects. This increases the number of focusing operations performed on the plurality of photographed objects and thereby improves the quality of the captured image.
Brief description of the drawings
Fig. 1 is a schematic flowchart of an ultrasonic-based photographing method provided by embodiment one of the present invention;

Fig. 2 is a schematic flowchart of an ultrasonic-based photographing method provided by embodiment two of the present invention;

Fig. 3 is a schematic flowchart of an ultrasonic-based photographing method provided by embodiment three of the present invention;

Fig. 4 is a schematic structural diagram of an ultrasonic-based photographing device provided by embodiment four of the present invention.
Detailed description of the embodiments
The present invention is described in further detail below with reference to the drawings and embodiments. It should be understood that the specific embodiments described herein merely explain the present invention and do not limit it. It should also be noted that, for ease of description, the drawings show only the parts related to the present invention rather than the entire structure.
Embodiment one
Fig. 1 is a schematic flowchart of an ultrasonic-based photographing method provided by embodiment one of the present invention. The method may be performed by an ultrasonic-based photographing device, which may be implemented in software and/or hardware and built into an intelligent terminal such as a smartphone or tablet computer. As shown in Fig. 1, the flow may comprise:
Step 11: upon detecting a photographing trigger event, obtain the distance values between at least one camera of the intelligent terminal and each of a plurality of photographed objects via a plurality of ultrasonic sensors in the terminal.
In this embodiment, when it is detected that the user has pressed a preset photographing button or performed a photographing trigger operation, a photographing trigger event is generated and the camera enters the autofocus state.
Specifically, the distance values between each camera and the plurality of photographed objects are obtained via the plurality of ultrasonic sensors. For example, when the intelligent terminal includes only one camera, the average of the distance values obtained by the plurality of ultrasonic sensors for each photographed object may be used as the distance value between that camera and the object. When the intelligent terminal includes multiple cameras, the number of ultrasonic sensors may equal the number of cameras, with one sensor arranged adjacent to each camera; the distance values obtained by each sensor may then be used as the distance values between its adjacent camera and the photographed objects.
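The two sensor-to-camera mappings just described can be sketched as follows. This is a hypothetical illustration; the patent does not specify data structures, so the list-of-lists layout and the function names are assumptions.

```python
from statistics import mean

def single_camera_distances(sensor_readings: list[list[float]]) -> list[float]:
    """sensor_readings[i][j] is the distance (cm) from sensor i to subject j.
    For a single camera: use the per-subject mean across all sensors."""
    return [mean(per_subject) for per_subject in zip(*sensor_readings)]

def multi_camera_distances(sensor_readings: list[list[float]]) -> list[list[float]]:
    """One sensor adjacent to each camera: camera i uses sensor i's readings."""
    return sensor_readings

# Two sensors, three subjects:
readings = [[50.0, 120.0, 80.0],
            [54.0, 118.0, 84.0]]
print(single_camera_distances(readings))  # [52.0, 119.0, 82.0]
```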
Step 12: according to the obtained distance values, control the at least one camera to focus on each of the plurality of photographed objects, and control the at least one camera to photograph the plurality of photographed objects according to the focusing results.
In this embodiment, after the distance values between the camera and the plurality of photographed objects are obtained, the camera in the autofocus state focuses on each photographed object and photographs it. Because the camera focuses on each object individually, every photographed object in the resulting image is sharp, yielding an image with a large depth of field. Compared with the prior art, in which the camera focuses on multiple objects only once, this improves the quality of the captured image.
In the ultrasonic-based photographing method provided by this embodiment, the camera is controlled to focus on the photographed objects individually according to the distance values between the camera and the objects. This increases the number of focusing operations performed on the plurality of photographed objects and thereby improves the quality of the captured image.
Embodiment two
This embodiment provides an ultrasonic-based photographing method building on embodiment one. Fig. 2 is a schematic flowchart of an ultrasonic-based photographing method provided by embodiment two of the present invention. As shown in Fig. 2, the flow may comprise:
Step 21: upon detecting a photographing trigger event, obtain the distance values between at least one camera of the intelligent terminal and each of a plurality of photographed objects via a plurality of ultrasonic sensors in the terminal.
In this embodiment, the cameras may comprise a first camera and a second camera, the first and second cameras both being rear-facing or both being front-facing.
In this embodiment, the plurality of ultrasonic sensors may comprise a first ultrasonic sensor and a second ultrasonic sensor. When the first and second cameras are rear-facing, both ultrasonic sensors are arranged on the back of the intelligent terminal; when the cameras are front-facing, both are arranged on the front. Specifically, the first ultrasonic sensor may be placed adjacent to the first camera and the second ultrasonic sensor adjacent to the second camera; the distance values between the first camera and the photographed objects are then obtained via the first ultrasonic sensor, and those between the second camera and the objects via the second ultrasonic sensor.
Step 22: according to the obtained distance values, control the at least one camera to focus on each of the plurality of photographed objects.
Specifically, the first camera focuses on each photographed object according to its own distance values to the objects, and the second camera does likewise.
Step 23: control the first camera to photograph the plurality of photographed objects according to the focusing results, to form a first image.

Step 24: control the second camera to photograph the plurality of photographed objects according to the focusing results, to form a second image.
Because the first and second cameras are in different positions, the first and second images they capture differ, which enriches the styles of the captured images.
In the ultrasonic-based photographing method provided by this embodiment, the first and second cameras are each controlled to focus on and photograph the plurality of photographed objects, which not only improves the quality of the captured images but also enriches their styles.
Optionally, after the second camera is controlled to focus on and photograph the plurality of photographed objects to form the second image, the method may further comprise: selecting, from the first image and the second image, the image whose brightness and/or colour temperature characteristics meet a set condition as the final image.
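The optional selection step could, for instance, compare a simple brightness statistic of the two images. The sketch below is a hypothetical illustration: the patent leaves the "set condition" open, so the target value, the grayscale representation, and the function names are assumptions.

```python
def pick_final(image_a: list[int], image_b: list[int],
               target_brightness: float = 128.0) -> list[int]:
    """Pick the image whose mean brightness is closer to a target set-point.
    Each image is a flat list of grayscale pixel values (0-255)."""
    def score(img: list[int]) -> float:
        return abs(sum(img) / len(img) - target_brightness)
    return image_a if score(image_a) <= score(image_b) else image_b

dark = [40, 60, 50, 55]        # mean 51.25
bright = [120, 140, 130, 125]  # mean 128.75, closer to the target
print(pick_final(dark, bright) is bright)  # True
```

A colour-temperature criterion would work the same way with a different per-image statistic.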
Embodiment three
This embodiment provides an ultrasonic-based photographing method building on embodiments one and two. Fig. 3 is a schematic flowchart of an ultrasonic-based photographing method provided by embodiment three of the present invention. As shown in Fig. 3, the flow may comprise:
Step 31: upon detecting a photographing trigger event, obtain the distance values between at least one camera of the intelligent terminal and each of a plurality of photographed objects via a plurality of ultrasonic sensors in the terminal.
Step 32: according to the obtained distance values, determine the positional information of the plurality of photographed objects.
Specifically, from the distance values between each camera and the photographed objects, the relative distances between different objects can be estimated, and the positional information of the objects thereby determined.
Step 33: group the plurality of photographed objects according to their positional information.
Specifically, photographed objects whose positions are adjacent may be placed in the same group.
Step 34: obtain the average distance between each group of photographed objects and the at least one camera.
Step 35: according to the obtained average distances, control the at least one camera to focus on each group of photographed objects.
In this embodiment, the camera focuses on each group of photographed objects rather than on each object individually, which reduces the number of focusing operations and increases the focusing speed.
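Steps 32 to 35 can be sketched as a simple distance-based clustering. This is only an illustrative interpretation, since the patent fixes no particular algorithm; the greedy grouping strategy and the 20 cm threshold are assumptions.

```python
def group_subjects(distances_cm: list[float], gap_cm: float = 20.0) -> list[list[float]]:
    """Greedily group sorted subject distances whose neighbouring gap is < gap_cm."""
    ordered = sorted(distances_cm)
    groups: list[list[float]] = [[ordered[0]]]
    for d in ordered[1:]:
        if d - groups[-1][-1] < gap_cm:
            groups[-1].append(d)   # close to the previous subject: same group
        else:
            groups.append([d])     # large gap: start a new group
    return groups

def focus_points(distances_cm: list[float]) -> list[float]:
    """One focus distance per group: the group's average distance (steps 34-35)."""
    return [sum(g) / len(g) for g in group_subjects(distances_cm)]

# Five subjects collapse to two focus operations instead of five:
print(focus_points([52.0, 55.0, 60.0, 140.0, 150.0]))
```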
Step 36: control the at least one camera to photograph the plurality of photographed objects according to the focusing results.
In the ultrasonic-based photographing method provided by this embodiment, the camera focuses on and photographs each group of photographed objects. Compared with focusing on each object individually, this reduces the number of focusing operations and increases the focusing speed; compared with prior-art cameras that focus on multiple photographed objects only once, it improves the quality of the captured image.
Embodiment four
Fig. 4 is a schematic structural diagram of an ultrasonic-based photographing device provided by embodiment four of the present invention. The device may be built into an intelligent terminal. As shown in Fig. 4, the device may be structured as follows:
a distance obtaining unit 41, configured to obtain, upon detection of a photographing trigger event, the distance values between at least one camera of an intelligent terminal and each of a plurality of photographed objects via a plurality of ultrasonic sensors in the terminal;

a focusing unit 42, configured to control the at least one camera to focus on each of the plurality of photographed objects according to the obtained distance values;

a photographing unit 43, configured to control the at least one camera to photograph the plurality of photographed objects according to the focusing results of the focusing unit.
Exemplarily, when the at least one camera comprises a first camera and a second camera, the first and second cameras both being rear-facing or both front-facing, the photographing unit 43 may comprise:
a first photographing subunit, configured to control the first camera to photograph the plurality of photographed objects according to the focusing results, to form a first image;

a second photographing subunit, configured to control the second camera to photograph the plurality of photographed objects according to the focusing results, to form a second image.
Exemplarily, the device may further comprise:
an image selection unit, configured to select, from the first image formed by the first photographing subunit and the second image formed by the second photographing subunit, the image whose brightness and/or colour temperature characteristics meet a set condition as the final image.
Exemplarily, when the at least one camera comprises a first camera and a second camera, the plurality of ultrasonic sensors comprises a first ultrasonic sensor and a second ultrasonic sensor;

when the first and second cameras are rear-facing, both ultrasonic sensors are arranged on the back of the intelligent terminal;

when the first and second cameras are front-facing, both ultrasonic sensors are arranged on the front of the intelligent terminal.
Exemplarily, the focusing unit 42 may comprise:
a positioning subunit, configured to determine the positional information of the plurality of photographed objects according to the obtained distance values;

a grouping subunit, configured to group the plurality of photographed objects according to their positional information;

a distance averaging subunit, configured to obtain the average distance between each group of photographed objects and the at least one camera;

a focusing subunit, configured to control the at least one camera to focus on each group of photographed objects according to the obtained average distances.
The ultrasonic-based photographing device provided by this embodiment belongs to the same inventive concept as the ultrasonic-based photographing method provided by any embodiment of the present invention; it can perform the ultrasonic-based photographing method provided by any embodiment and has the corresponding functional modules and beneficial effects. For details not described in this embodiment, refer to the ultrasonic-based photographing method provided by any embodiment of the present invention.
The above are merely preferred embodiments of the present invention and are not intended to limit it; those skilled in the art may make various modifications and changes to the embodiments of the present invention. Any modification, equivalent replacement, or improvement made within the spirit and principles of the embodiments of the present invention shall fall within their protection scope.
Claims (10)
1. An ultrasonic-based photographing method, characterized by comprising:

upon detecting a photographing trigger event, obtaining, via a plurality of ultrasonic sensors in an intelligent terminal, the distance values between at least one camera of the intelligent terminal and each of a plurality of photographed objects;

according to the obtained distance values, controlling the at least one camera to focus on each of the plurality of photographed objects, and controlling the at least one camera to photograph the plurality of photographed objects according to the focusing results.
2. The method according to claim 1, characterized in that the at least one camera comprises a first camera and a second camera, the first and second cameras both being rear-facing or both front-facing;

the controlling the at least one camera to photograph the plurality of photographed objects according to the focusing results comprises:

controlling the first camera to photograph the plurality of photographed objects according to the focusing results, to form a first image;

controlling the second camera to photograph the plurality of photographed objects according to the focusing results, to form a second image.
3. The method according to claim 2, characterized in that, after controlling the second camera to focus on and photograph the plurality of photographed objects to form the second image, the method further comprises:

selecting, from the first image and the second image, the image whose brightness and/or colour temperature characteristics meet a set condition as the final image.
4. The method according to claim 2, characterized in that the plurality of ultrasonic sensors comprises a first ultrasonic sensor and a second ultrasonic sensor;

when the first and second cameras are rear-facing, both ultrasonic sensors are arranged on the back of the intelligent terminal;

when the first and second cameras are front-facing, both ultrasonic sensors are arranged on the front of the intelligent terminal.
5. The method according to any one of claims 1-4, characterized in that the controlling the at least one camera to focus on each of the plurality of photographed objects according to the obtained distance values comprises:

determining the positional information of the plurality of photographed objects according to the obtained distance values;

grouping the plurality of photographed objects according to their positional information;

obtaining the average distance between each group of photographed objects and the at least one camera;

controlling the at least one camera to focus on each group of photographed objects according to the obtained average distances.
6. An ultrasonic-based photographing device, characterized by comprising:

a distance obtaining unit, configured to obtain, upon detection of a photographing trigger event, the distance values between at least one camera of an intelligent terminal and each of a plurality of photographed objects via a plurality of ultrasonic sensors in the terminal;

a focusing unit, configured to control the at least one camera to focus on each of the plurality of photographed objects according to the obtained distance values;

a photographing unit, configured to control the at least one camera to photograph the plurality of photographed objects according to the focusing results of the focusing unit.
7. The device according to claim 6, characterized in that, when the at least one camera comprises a first camera and a second camera and the first and second cameras are both rear-facing or both front-facing, the photographing unit comprises:

a first photographing subunit, configured to control the first camera to photograph the plurality of photographed objects according to the focusing results, to form a first image;

a second photographing subunit, configured to control the second camera to photograph the plurality of photographed objects according to the focusing results, to form a second image.
8. The device according to claim 7, characterized by further comprising:

an image selection unit, configured to select, from the first image formed by the first photographing subunit and the second image formed by the second photographing subunit, the image whose brightness and/or colour temperature characteristics meet a set condition as the final image.
9. The device according to claim 7, characterized in that the plurality of ultrasonic sensors comprises a first ultrasonic sensor and a second ultrasonic sensor;

when the first and second cameras are rear-facing, both ultrasonic sensors are arranged on the back of the intelligent terminal;

when the first and second cameras are front-facing, both ultrasonic sensors are arranged on the front of the intelligent terminal.
10. The device according to any one of claims 6-9, characterized in that the focusing unit comprises:

a positioning subunit, configured to determine the positional information of the plurality of photographed objects according to the obtained distance values;

a grouping subunit, configured to group the plurality of photographed objects according to their positional information;

a distance averaging subunit, configured to obtain the average distance between each group of photographed objects and the at least one camera;

a focusing subunit, configured to control the at least one camera to focus on each group of photographed objects according to the obtained average distances.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510671450.8A CN105323481B (en) | 2015-10-15 | 2015-10-15 | Ultrasonic-based photographing method and device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN105323481A true CN105323481A (en) | 2016-02-10 |
CN105323481B CN105323481B (en) | 2018-11-20 |
Family
ID=55249985
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201510671450.8A Expired - Fee Related CN105323481B (en) | 2015-10-15 | 2015-10-15 | Ultrasonic-based photographing method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN105323481B (en) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106151802A (en) * | 2016-07-27 | 2016-11-23 | 广东思锐光学股份有限公司 | Intelligent gimbal and self-photographing method using the same |
CN106993137A (en) * | 2017-04-26 | 2017-07-28 | 广东小天才科技有限公司 | Method and device for determining terminal shooting mode |
CN106993130A (en) * | 2017-03-09 | 2017-07-28 | 北京小米移动软件有限公司 | Gather method, device and the mobile device of image |
CN108289166A (en) * | 2017-12-12 | 2018-07-17 | 北京臻迪科技股份有限公司 | Underwater target automatic photographing method and system |
CN112135034A (en) * | 2019-06-24 | 2020-12-25 | Oppo广东移动通信有限公司 | Photographing method and device based on ultrasonic waves, electronic equipment and storage medium |
Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4531157A (en) * | 1982-04-28 | 1985-07-23 | West Electric Co., Ltd. | Method and apparatus for automatic focusing in video camera |
CN1525763A (en) * | 2002-12-12 | 2004-09-01 | Samsung Electronics Co., Ltd. | Method and apparatus for generating user preference data regarding color characteristic of image and method and apparatus for converting image color preference using the method and apparatus |
CN1767597A (en) * | 2002-09-13 | 2006-05-03 | 佳能株式会社 | Focusing controller, image pickup device |
CN101771810A (en) * | 2008-12-29 | 2010-07-07 | 上海乐金广电电子有限公司 | Method and device for obtaining clear images |
CN102455568A (en) * | 2010-10-28 | 2012-05-16 | 安讯士有限公司 | Method for focusing |
CN103984186A (en) * | 2014-05-04 | 2014-08-13 | 深圳市阿格斯科技有限公司 | Optical zooming vidicon and automatic focusing control method and device thereof |
CN104184935A (en) * | 2013-05-27 | 2014-12-03 | 鸿富锦精密工业(深圳)有限公司 | Image shooting device and method |
CN104243828A (en) * | 2014-09-24 | 2014-12-24 | 宇龙计算机通信科技(深圳)有限公司 | Method, device and terminal for shooting pictures |
CN104270560A (en) * | 2014-07-31 | 2015-01-07 | 三星电子(中国)研发中心 | Multi-point focusing method and device |
CN104680563A (en) * | 2015-02-15 | 2015-06-03 | 青岛海信移动通信技术股份有限公司 | Image data generating method and device |
CN104811613A (en) * | 2015-04-10 | 2015-07-29 | 深圳市金立通信设备有限公司 | Camera focusing method |
Also Published As
Publication number | Publication date |
---|---|
CN105323481B (en) | 2018-11-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN105323481A (en) | Ultrasonic-based photographing method and device | |
US11568517B2 (en) | Electronic apparatus, control method, and non-transitory computer readable medium | |
CN102957862B (en) | Image pickup apparatus and control method of the image pickup apparatus | |
CN108028887B (en) | Photographing focusing method, device and equipment for terminal | |
US9300858B2 (en) | Control device and storage medium for controlling capture of images | |
CN110035218B (en) | Image processing method, image processing device and photographing equipment | |
CN103728813B (en) | Method of synchronizing focus during zoom | |
CN105629628B (en) | Automatic focusing method and device | |
CN104363378A (en) | Camera focusing method, camera focusing device and terminal | |
CN105141840B (en) | Information processing method and electronic equipment | |
CN110572574A (en) | System and method for multi-focus imaging | |
KR20140140855A (en) | Method and Apparatus for controlling Auto Focus of an photographing device | |
CN103200361A (en) | Video signal processing apparatus | |
CN101221341A (en) | Initialization method for depth-of-field composition | |
CN106331438A (en) | Lens focus method and device, and mobile device | |
US20140307054A1 (en) | Auto focus method and auto focus apparatus | |
CN105635571B (en) | Camera control method, photographing control device and camera system | |
CN105323480A (en) | Ultrasonic-based photographing method and device | |
CN106415348A (en) | Image capture device and focus control method | |
CN105960604A (en) | Image capture device and focus control method | |
CN105872384A (en) | Photographing method and terminal | |
CN105306819A (en) | Gesture-based photographing control method and device | |
KR20160002331A (en) | Photographing apparatus capable of bracketing photography, photographing control method, and storage medium | |
CN105301279B (en) | Camera-based speed measurement method, device and mobile terminal | |
CN107682691B (en) | Camera focus calibration method, terminal and computer-readable storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| C06 | Publication | |
| PB01 | Publication | |
| C10 | Entry into substantive examination | |
| SE01 | Entry into force of request for substantive examination | |
| GR01 | Patent grant | |
| CP01 | Change in the name or title of a patent holder | Address after: No. 18 Wusha Beach Road, Chang'an Town, Dongguan, Guangdong 523860; Patentee after: GUANGDONG OPPO MOBILE TELECOMMUNICATIONS Corp.,Ltd.; Address before: No. 18 Wusha Beach Road, Chang'an Town, Dongguan, Guangdong 523860; Patentee before: GUANGDONG OPPO MOBILE TELECOMMUNICATIONS Corp.,Ltd. |
| CF01 | Termination of patent right due to non-payment of annual fee | Granted publication date: 20181120 |