WO2014101722A1 - Photographic apparatus and photographing method (一种摄像装置和摄像方法) - Google Patents

Photographic apparatus and photographing method (一种摄像装置和摄像方法)

Info

Publication number
WO2014101722A1
Authority
WO
WIPO (PCT)
Prior art keywords
photometric
select
unit module
pattern
focus
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/CN2013/090176
Other languages
English (en)
French (fr)
Chinese (zh)
Inventor
赵蕴泽
井洪亮
崔小辉
申世安
里强
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen ZTE Mobile Telecom Co Ltd
Original Assignee
Shenzhen ZTE Mobile Telecom Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen ZTE Mobile Telecom Co Ltd filed Critical Shenzhen ZTE Mobile Telecom Co Ltd
Priority to US14/432,776 priority Critical patent/US20150281561A1/en
Priority to KR1020157018323A priority patent/KR101691764B1/ko
Priority to JP2015549968A priority patent/JP2016510522A/ja
Priority to IN4078DEN2015 priority patent/IN2015DN04078A/en
Priority to EP13869429.4A priority patent/EP2933998A4/en
Publication of WO2014101722A1 publication Critical patent/WO2014101722A1/zh
Anticipated expiration legal-status Critical
Priority to US14/754,672 priority patent/US20150304567A1/en
Ceased legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/62Control of parameters via user interfaces
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04812Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842Selection of displayed objects or displayed text elements
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/63Control of cameras or camera modules by using electronic viewfinders
    • H04N23/631Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/71Circuitry for evaluating the brightness variation
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04803Split screen, i.e. subdividing the display area or the window area into separate subareas
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04804Transparency, e.g. transparent or translucent windows
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04808Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen

Definitions

  • the present invention relates to the field of imaging technology, and in particular, to an imaging device and an imaging method.
  • in the existing scheme, the focal length value and the light information of the touch point are measured simultaneously; that is, when the user taps the screen to focus manually, the metering value of the framing interface also changes at the same time.
  • the technical problem to be solved by the present invention is to provide an image capturing apparatus and an image capturing method that overcome a shortcoming of the existing photographing and framing process, namely that the focus point and the light metering point cannot be separated, which makes it inconvenient for the user to frame a shot according to the background light in different environments. That is, the invention provides a framing scheme in which the focus point and the photometric point are independent, the focus value and the photometric value are set separately, and a corresponding imaging apparatus and imaging method.
  • a camera device includes:
  • a display unit module for displaying an image
  • a receiving unit module configured to receive an instruction to operate on the image displayed on the display unit module; and a processing unit module configured to select a photometric position and/or a focus position according to the instruction received by the receiving unit module.
  • An imaging method includes:
  • the focus area and the metering area can be selected separately as needed, and the shot can be composed according to different scenes, thereby improving the user experience. The user can drag the focus frame or the metering frame independently for focusing and metering during framing, which offers high playability and a good user experience.
  • FIG. 1 is a schematic block diagram of an image pickup apparatus according to a first embodiment of the present invention.
  • FIG. 2 is a schematic diagram of a screen of a display unit module according to an embodiment of the present invention.
  • FIG. 3 is a flowchart of an image capturing method according to an embodiment of the present invention.
  • FIG. 4 is a flowchart illustrating an example on a mobile phone running the Android platform according to an embodiment of the present invention.
  • Embodiment 1:
  • FIG. 1 is a schematic diagram of a module of an image pickup apparatus according to a first embodiment of the present invention.
  • An image pickup apparatus includes: a display unit module 11, a receiving unit module 12, and a processing unit module 13.
  • the display unit module 11 is configured to display an image; wherein, the image may be an image captured by the camera, an image received by the camera, or an image stored in the camera. In addition, the display unit module 11 is further configured to display a photometry pattern and/or a focus pattern.
  • the camera and the display unit module 11 are connected to each other. The camera may be a front camera or a rear camera disposed on the image pickup apparatus, or a separate camera connected to the display unit module 11 via a data line.
  • the display unit module 11 can be a liquid crystal display or an OLED display.
  • the receiving unit module 12 is configured to receive an instruction to operate on the image displayed on the display unit module 11.
  • the command may be a gesture command, a voice command, or a touch input command.
  • the instructions include: an instruction to select a metering position by dragging or clicking on the displayed metering pattern; and/or an instruction to select a focus position by dragging or clicking on the displayed focus pattern.
  • the receiving unit module 12 can be a mouse, a keyboard, a microphone, a trackpad, a projection device, or any combination of these.
  • the processing unit module 13 is configured to select a photometry position and/or a focus position according to an instruction received by the receiving unit module 12.
  • the processing unit module 13 includes a calculation unit configured to take the brightness values of the pixels covered by the photometry pattern at the at least one selected photometry position as input, calculate an output value with a preset function, and have the imaging device perform photographing based on that output value. For example, as shown in FIG. 2, a touch input command is a touch performed by a finger on the screen of the display unit module 11 displaying the image: the user can select a photometry position, for example by dragging the photometry pattern 100 with a finger while observing the change in exposure, and pick the metering point as needed; the user can likewise select a focus position, for example by dragging the focus pattern 200 to the position where focusing is required.
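  • As an illustration of the calculation unit, the sketch below shows one plausible preset function: averaging the luma of the pixels covered by a metering pattern's rectangle. It is a minimal sketch under assumed names (MeteringCalculator, meanLuma), not the patent's actual implementation.

```java
import android.graphics.Bitmap;
import android.graphics.Color;
import android.graphics.Rect;

/**
 * Minimal sketch of a "preset function" for the calculation unit:
 * the mean luma of the preview pixels covered by one metering pattern.
 * Class and method names are illustrative, not from the patent.
 */
public final class MeteringCalculator {

    /** Returns a mean brightness in [0, 255] for the region covered by the pattern. */
    public static double meanLuma(Bitmap frame, Rect pattern) {
        // Clip the pattern rectangle to the frame bounds.
        int left = Math.max(0, pattern.left);
        int top = Math.max(0, pattern.top);
        int right = Math.min(frame.getWidth(), pattern.right);
        int bottom = Math.min(frame.getHeight(), pattern.bottom);

        long sum = 0;
        long count = 0;
        for (int y = top; y < bottom; y++) {
            for (int x = left; x < right; x++) {
                int c = frame.getPixel(x, y);
                // Rec. 601 luma approximation from the RGB channels.
                sum += (long) (0.299 * Color.red(c) + 0.587 * Color.green(c) + 0.114 * Color.blue(c));
                count++;
            }
        }
        return count == 0 ? 0.0 : (double) sum / count;
    }
}
```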
  • there may be at least two photometric positions and/or at least two focus positions.
  • the photometry pattern has a different color or a different shape from the focus pattern.
  • FIG. 3 is a schematic flowchart of an imaging method of the present invention.
  • An imaging method includes the following steps:
  • Step S1: displaying an image.
  • the image may be an image captured by a camera, an image received by a camera, or an image stored in a camera.
  • the image may also include a photometric pattern and/or a focus pattern.
  • Step S2: receiving an instruction to operate on the displayed image.
  • the instruction includes: an instruction to select a photometry position and/or an instruction to select a focus position.
  • Step S3: selecting the metering position and/or the focus position according to the received instruction.
  • the photometric position is selected according to the received instruction to select a photometric position, and the focus position is selected according to the received instruction to select a focus position.
  • the steps of selecting a photometric position and/or a focus position further include:
  • there may be at least two photometric positions and/or at least two focus positions.
  • the photometry pattern differs from the focus pattern in color or in shape, so that the user can easily distinguish the two during operation.
  • the brightness values of the pixels covered by the photometry pattern at the at least one selected photometry position are taken as input values and processed by a preset function, and the photograph is taken according to the calculated output value.
  • the focus parameter is adjusted according to the different focus positions before photographing, thereby optimizing the sharpness of the photographed image.
  • Embodiment 2:
  • this embodiment uses Android's own dispatchTouchEvent (the Android system's built-in touch-screen event dispatching method) to distribute and process touch-screen events. The coordinates of the touched area are compared with the previous positions of the focus frame and the metering frame to determine whether a drag or click event targets focusing or metering. After this judgment, calculateTapArea (which computes the rectangular area around the contact point) performs a coordinate transformation from the UI's screen coordinates to driver coordinates that the bottom layer can use, as sketched below.
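  • The sketch below illustrates this dispatch-and-judge step in application code. The overlay view, its field names, and the 200-unit tap size are assumptions for illustration; the (-1000,-1000) to (1000,1000) driver coordinate space is the one defined for android.hardware.Camera.Area.

```java
import android.content.Context;
import android.graphics.Rect;
import android.view.MotionEvent;
import android.view.View;

/**
 * Sketch of the touch-dispatch and coordinate-transform step.
 * Field and method names (focusFrame, meteringFrame, tapSize) are
 * illustrative assumptions, not the patent's implementation.
 */
public class FocusMeteringOverlay extends View {
    private final Rect focusFrame = new Rect();     // previous focus frame, screen coords
    private final Rect meteringFrame = new Rect();  // previous metering frame, screen coords

    public FocusMeteringOverlay(Context ctx) { super(ctx); }

    @Override
    public boolean dispatchTouchEvent(MotionEvent ev) {
        int x = (int) ev.getX(), y = (int) ev.getY();
        // Compare the touch coordinates with the previous frame positions to
        // decide whether the click/drag targets focusing, metering, or neither.
        if (focusFrame.contains(x, y)) {
            Rect driverRect = calculateTapArea(x, y, 200);
            // ...hand driverRect to the focus path (e.g. setFocusAreas)...
        } else if (meteringFrame.contains(x, y)) {
            Rect driverRect = calculateTapArea(x, y, 200);
            // ...hand driverRect to the metering path (e.g. setMeteringAreas)...
        }
        return super.dispatchTouchEvent(ev);
    }

    /**
     * Builds a rectangle around the contact point and maps it from UI screen
     * coordinates to the legacy camera driver space of (-1000,-1000)..(1000,1000).
     */
    private Rect calculateTapArea(int x, int y, int tapSize) {
        int left = clamp(x * 2000 / getWidth() - 1000 - tapSize / 2);
        int top = clamp(y * 2000 / getHeight() - 1000 - tapSize / 2);
        return new Rect(left, top, clamp(left + tapSize), clamp(top + tapSize));
    }

    private static int clamp(int v) { return Math.max(-1000, Math.min(1000, v)); }
}
```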
  • the scheme of framing with the focus point and the metering point separated involves the following three modules:
  • 1. Touch-screen event acquisition and area judgment for the focus area and the metering area: WindowManagerService (the window management service of the Android framework layer, which manages the views inside windows) distributes the contact through its dispatchPointer method, sending a message to the corresponding IWindow client proxy object, that is, to the IWindow server side, which is an IWindow.Stub subclass. The dispatchPointer implementation of the IWindow.Stub subclass is then invoked, which in turn calls the View's dispatchTouchEvent method, at which point the touch-screen event has been obtained. The currently acquired touch-screen coordinates are compared with the previous focus-area and metering-area coordinates to determine whether the current click or drag falls in the focus effective area, the metering effective area, or an invalid area.
  • 2. Parameter delivery to the bottom layer: the parameters are passed through the framework layer's setMeteringArea and setFocusArea (the interfaces that set the metering and focus areas and transfer them to the bottom layer) to JNI (Java Native Interface, which completes the call from the upper-layer Java code into the underlying C code), then via android_hardware_Camera (the JNI-layer component that handles the Java-to-C calls of the camera module) to the HAL layer, where they are finally applied by native_set_parms. An application-level sketch of this path follows.
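  • At the application level, the public interface corresponding to this path in the legacy android.hardware.Camera API is setFocusAreas/setMeteringAreas on Camera.Parameters. A minimal sketch, assuming the rectangles have already been transformed into the driver coordinate space as above; the helper class name and the weight value of 1000 are illustrative.

```java
import java.util.Collections;
import android.graphics.Rect;
import android.hardware.Camera;

/**
 * Sketch: pushing independent focus and metering rectangles to the driver
 * through the legacy android.hardware.Camera API.
 */
public final class AreaSetter {

    /** Rectangles are in the driver space (-1000..1000); weight is 1..1000. */
    public static void apply(Camera camera, Rect focusRect, Rect meteringRect) {
        Camera.Parameters params = camera.getParameters();
        if (params.getMaxNumFocusAreas() > 0) {
            params.setFocusAreas(Collections.singletonList(new Camera.Area(focusRect, 1000)));
        }
        if (params.getMaxNumMeteringAreas() > 0) {
            params.setMeteringAreas(Collections.singletonList(new Camera.Area(meteringRect, 1000)));
        }
        camera.setParameters(params); // framework -> JNI -> HAL, as described above
    }
}
```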
  • the above embodiment is only an example on the Android platform; the scheme is not limited to Android and can also be implemented on other platforms or operating systems such as Apple's iOS or Microsoft's Windows.
  • in summary, the focus area and the light metering area can be selected separately as needed, and the shot can be composed according to different scenes, thereby improving the user experience.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Studio Devices (AREA)
  • User Interface Of Digital Computer (AREA)
  • Indication In Cameras, And Counting Of Exposures (AREA)
  • Focusing (AREA)
  • Exposure Control For Cameras (AREA)
PCT/CN2013/090176 2012-12-28 2013-12-22 Photographic apparatus and photographing method (一种摄像装置和摄像方法) Ceased WO2014101722A1 (zh)

Priority Applications (6)

Application Number Priority Date Filing Date Title
US14/432,776 US20150281561A1 (en) 2012-12-28 2013-12-22 Photographic apparatus and photographing method
KR1020157018323A KR101691764B1 (ko) 2012-12-28 2013-12-22 Photographing apparatus and photographing method
JP2015549968A JP2016510522A (ja) 2012-12-28 2013-12-22 Imaging apparatus and imaging method
IN4078DEN2015 IN2015DN04078A (en) 2012-12-28 2013-12-22
EP13869429.4A EP2933998A4 (en) 2012-12-28 2013-12-22 RECORDING DEVICE AND RECORDING METHOD
US14/754,672 US20150304567A1 (en) 2012-12-28 2015-06-29 Photographic apparatus and photographing method

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CN201210587006 2012-12-28
CN201210587006.4 2012-12-28
CN201310557312.8 2013-11-08
CN201310557312 2013-11-08

Related Child Applications (2)

Application Number Title Priority Date Filing Date
US14/432,776 A-371-Of-International US20150281561A1 (en) 2012-12-28 2013-12-22 Photographic apparatus and photographing method
US14/754,672 Continuation US20150304567A1 (en) 2012-12-28 2015-06-29 Photographic apparatus and photographing method

Publications (1)

Publication Number Publication Date
WO2014101722A1 true WO2014101722A1 (zh) 2014-07-03

Family

ID=51019876

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2013/090176 Ceased WO2014101722A1 (zh) Photographic apparatus and photographing method (一种摄像装置和摄像方法)

Country Status (6)

Country Link
US (2) US20150281561A1 (en)
EP (1) EP2933998A4 (en)
JP (1) JP2016510522A (en)
KR (1) KR101691764B1 (en)
IN (1) IN2015DN04078A (en)
WO (1) WO2014101722A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104079928A (zh) * 2014-07-07 2014-10-01 广东欧珀移动通信有限公司 Consistency calibration method and device for cameras, and mobile device
WO2016142757A1 (en) * 2015-03-06 2016-09-15 Sony Corporation Method for independently determining exposure and focus settings of a digital camera

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016510522A (ja) * 2012-12-28 2016-04-07 ヌビア テクノロジー カンパニー リミテッド Imaging apparatus and imaging method
CN106101680B (zh) * 2015-05-06 2018-12-25 小米科技有限责任公司 Shooting parameter setting method and device
CN106713886B (zh) * 2016-12-20 2018-11-02 深圳Tcl数字技术有限公司 White balance adjustment device and white balance adjustment method
CN108737717A (zh) * 2018-03-21 2018-11-02 北京猎户星空科技有限公司 Photographing method and device, intelligent equipment and storage medium
CN110177218B (zh) * 2019-06-28 2021-06-04 广州鲁邦通物联网科技有限公司 Photographed image processing method for an Android device

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030179310A1 (en) * 1997-05-12 2003-09-25 Yoshiaki Irie Camera having light measuring device operable based on focus detecting device
CN101582985A (zh) * 2008-12-09 2009-11-18 王玉龙 Digital camera with touch-scene technology
US20100027983A1 (en) * 2008-07-31 2010-02-04 Fuji Xerox Co., Ltd. System and method for manual selection of multiple evaluation points for camera control
CN101669358A (zh) * 2007-04-30 2010-03-10 索尼爱立信移动通讯有限公司 Digital camera and method of operation

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3829144B2 (ja) * 2004-11-25 2006-10-04 シャープ株式会社 Portable terminal with camera capable of adjusting the focusing area
JP4929630B2 (ja) * 2005-07-06 2012-05-09 ソニー株式会社 Imaging apparatus, control method, and program
KR100805293B1 (ko) * 2006-11-16 2008-02-20 삼성전자주식회사 Portable device and photographing method thereof
JP2008166926A (ja) * 2006-12-27 2008-07-17 Nikon Corp Backlight determination device and subject imaging method
JP4879127B2 (ja) * 2007-09-21 2012-02-22 富士フイルム株式会社 Digital camera and focus area selection method for a digital camera
JP2009273033A (ja) * 2008-05-09 2009-11-19 Olympus Imaging Corp Camera system, controller control method, and controller program
US8237807B2 (en) * 2008-07-24 2012-08-07 Apple Inc. Image capturing device with touch screen for adjusting camera settings
KR101505681B1 (ko) * 2008-09-05 2015-03-30 엘지전자 주식회사 Mobile terminal with touch screen and image capturing method using the same
US8233789B2 (en) * 2010-04-07 2012-07-31 Apple Inc. Dynamic exposure metering based on face detection
JP2012095236A (ja) * 2010-10-28 2012-05-17 Sanyo Electric Co Ltd Imaging apparatus
US20120242852A1 (en) * 2011-03-21 2012-09-27 Apple Inc. Gesture-Based Configuration of Image Processing Techniques
JP2016510522A (ja) * 2012-12-28 2016-04-07 ヌビア テクノロジー カンパニー リミテッド Imaging apparatus and imaging method

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030179310A1 (en) * 1997-05-12 2003-09-25 Yoshiaki Irie Camera having light measuring device operable based on focus detecting device
CN101669358A (zh) * 2007-04-30 2010-03-10 索尼爱立信移动通讯有限公司 Digital camera and method of operation
US20100027983A1 (en) * 2008-07-31 2010-02-04 Fuji Xerox Co., Ltd. System and method for manual selection of multiple evaluation points for camera control
CN101582985A (zh) * 2008-12-09 2009-11-18 王玉龙 Digital camera with touch-scene technology

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP2933998A4 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104079928A (zh) * 2014-07-07 2014-10-01 广东欧珀移动通信有限公司 Consistency calibration method and device for cameras, and mobile device
WO2016142757A1 (en) * 2015-03-06 2016-09-15 Sony Corporation Method for independently determining exposure and focus settings of a digital camera

Also Published As

Publication number Publication date
KR101691764B1 (ko) 2016-12-30
IN2015DN04078A (en) 2015-10-09
JP2016510522A (ja) 2016-04-07
EP2933998A1 (en) 2015-10-21
US20150304567A1 (en) 2015-10-22
US20150281561A1 (en) 2015-10-01
EP2933998A4 (en) 2016-08-24
KR20150095787A (ko) 2015-08-21

Similar Documents

Publication Publication Date Title
CN103139481B (zh) Photographic apparatus and photographing method
JP6834056B2 (ja) Photographing mobile terminal
WO2014101722A1 (zh) Photographic apparatus and photographing method
CN104735355B (zh) Photographing method and device for an intelligent terminal
JP2019135879A (ja) Preview image display method and apparatus, and terminal
CN105744175B (zh) Screen fill-light method and device, and mobile terminal
WO2015003604A1 (zh) Image processing method, device and terminal
CN107743197A (zh) Screen fill-light method and device, and mobile terminal
CN105049695A (zh) Video recording method and device
CN106231175A (zh) Method for gesture photographing on a terminal
WO2019214574A1 (zh) Image photographing method and apparatus, and electronic terminal
CN107637063B (zh) Method and photographing apparatus for controlling a function based on a user's gesture
JP2010016826A (ja) System and method for efficiently performing image processing operations
WO2015149525A1 (zh) Photographing method and photographing device
CN112532881A (zh) Image processing method and device, and electronic equipment
CN105530421B (zh) Dual-camera-based focusing method, device and terminal
CN112637515A (zh) Photographing method and device, and electronic equipment
CN115967854B (zh) Photographing method and device, and electronic equipment
CN105472245A (zh) Photographing method and electronic equipment
CN113873160B (zh) Image processing method and device, electronic equipment and computer storage medium
TW201618532A (zh) Auxiliary photographing system and method
CN112672051B (zh) Photographing method and device, and electronic equipment
CN104410787B (zh) Photographic apparatus and photographing method
CN110533729A (zh) Intelligent terminal, pixel processing method and computer-readable storage medium
CN105791665A (zh) Photographing method and device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13869429

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 14432776

Country of ref document: US

ENP Entry into the national phase

Ref document number: 2015549968

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 20157018323

Country of ref document: KR

Kind code of ref document: A

WWE Wipo information: entry into national phase

Ref document number: 2013869429

Country of ref document: EP