WO2018196854A1 - Photographing method, photographing apparatus and mobile terminal - Google Patents

Photographing method, photographing apparatus and mobile terminal

Info

Publication number
WO2018196854A1
Authority
WO
WIPO (PCT)
Prior art keywords
layer
distance
camera
image
sharpness
Prior art date
Application number
PCT/CN2018/084823
Other languages
English (en)
Chinese (zh)
Inventor
张晓亮
Original Assignee
中兴通讯股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 中兴通讯股份有限公司
Publication of WO2018196854A1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/45: Cameras or camera modules for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
    • H04N 23/60: Control of cameras or camera modules
    • H04N 23/67: Focus control based on electronic image sensor signals
    • H04N 23/95: Computational photography systems, e.g. light-field imaging systems
    • H04N 23/951: Computational photography systems using two or more images to influence resolution, frame rate or aspect ratio
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 11/00: 2D [Two Dimensional] image generation
    • G06T 11/60: Editing figures and text; Combining figures or text

Definitions

  • the present disclosure relates to the field of photographing technology, and in particular, to a photographing method, a photographing device, and a mobile terminal.
  • Some dual cameras have the same pixel count, some have different pixel counts, some are divided into a telephoto lens and a wide-angle lens, and some include a color sensor and a black-and-white sensor.
  • When a photographed object (for example, a person) is shot in front of a distant background object (for example, a landscape), either the proportion of the photographed object in the picture is small, or the background scenery is not well represented.
  • In the related art, usually only a blurring operation is applied to the background object; the landscape in the distance cannot be zoomed in or enlarged to show its details.
  • A photographing method is provided, which includes: causing a first camera and a second camera to focus on two different positions in a photo preview interface to obtain a first in-focus object corresponding to the first camera and a second in-focus object corresponding to the second camera; extracting an image around the first in-focus object as a first layer, and extracting an image around the second in-focus object as a second layer; processing the first layer and/or the second layer; and merging the first layer and the second layer with the photo preview interface to generate a composite photo.
  • A photographing apparatus is provided, which includes: a focusing module configured to cause a first camera and a second camera to focus on two different positions in a photo preview interface to obtain a first in-focus object corresponding to the first camera and a second in-focus object corresponding to the second camera; an image layering module configured to extract an image around the first in-focus object as a first layer and extract an image around the second in-focus object as a second layer; a layer processing module configured to process the first layer and/or the second layer; and a photo composition module configured to merge the first layer and the second layer with the photo preview interface to generate a composite photo.
  • A mobile terminal is provided, comprising a memory, a processor, and at least one application stored in the memory and configured to be executed by the processor, the application being configured to perform the photographing method described above.
  • A computer-readable storage medium storing a computer-executable program is provided.
  • When the computer-executable program is run on a processor, a photographing method according to an embodiment of the present disclosure may be implemented.
  • FIG. 1 is a flowchart of a photographing method according to an embodiment of the present disclosure;
  • FIG. 2 is a schematic diagram of photographing a person against a landscape using a photographing method provided according to an embodiment of the present disclosure;
  • FIG. 3 is a schematic diagram of a sharpness-based focusing method according to an embodiment of the present disclosure;
  • FIG. 4 is a flowchart of step S20 in FIG. 1;
  • FIG. 5 is a flowchart of step S40 in FIG. 1;
  • FIG. 6 is a schematic structural block diagram of a photographing apparatus according to an embodiment of the present disclosure;
  • FIG. 7 is a schematic structural block diagram of the image layering module of FIG. 6;
  • FIG. 8 is a schematic structural block diagram of the layer processing module of FIG. 6;
  • FIG. 9 is a schematic structural block diagram of the photo composition module of FIG. 6.
  • FIG. 1 is a flowchart of a photographing method according to an embodiment of the present disclosure.
  • the method can be applied to a photographing device having a dual camera, such as a camera, an electronic device having a photographing function, and the like.
  • the photographing method includes steps S10 to S40:
  • In step S10, the first camera and the second camera are caused to focus on two different positions in the photo preview interface to obtain a first in-focus object corresponding to the first camera and a second in-focus object corresponding to the second camera.
  • In step S20, an image around the first in-focus object is extracted as a first layer, and an image around the second in-focus object is extracted as a second layer.
  • In step S30, the first layer and/or the second layer are processed.
  • In step S40, the first layer and the second layer are merged with the photo preview interface to generate a composite photo.
  • By using the dual cameras to focus simultaneously on objects at two different focus points, the captured image is divided into two layers.
  • The two layers can be adjusted individually to synthesize the photo the user needs, improving the user experience.
  • As shown in FIG. 2, when photographing a person against a landscape, the first camera may be used to capture the person image (the first layer in the figure) and the second camera may be used to capture the landscape image (the second layer in the figure).
  • Because the landscape is far away, the pixels near the selected focal plane of the landscape image can be used as a layer and that layer can be enlarged, so that the person image and the landscape image are better proportioned, thereby improving the user's photographing experience.
  • The two cameras are mounted on the same side of the mobile terminal.
  • The camera preview interface of the mobile terminal displays the viewfinder image of the first camera.
  • The two cameras are capable of auto-focusing and focus simultaneously on two different positions in the photo preview interface; these two positions are the focus positions of the two cameras, and a focal plane is formed at each of the two focus positions.
  • Focusing is determined by image sharpness judgment, which may also be called contrast focusing.
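  • As an illustration of this contrast-focusing idea, the sketch below scores candidate lens positions by the variance of the Laplacian inside a window around the focus point and keeps the sharpest one. The capture_at callback, window size and list of lens positions are hypothetical; this is a minimal sketch of the technique under those assumptions, not the terminal's actual focusing code.

```python
# Hedged sketch of contrast (sharpness-based) focusing. Assumes a hypothetical
# capture_at(pos) callback that returns a grayscale frame (numpy array) taken
# with the lens at position pos; nothing here comes from the patent itself.
import numpy as np
import cv2


def sharpness_score(gray_patch: np.ndarray) -> float:
    # Variance of the Laplacian: higher values mean more high-frequency detail.
    return float(cv2.Laplacian(gray_patch, cv2.CV_64F).var())


def contrast_focus(capture_at, lens_positions, focus_point, window=64):
    # Evaluate each candidate lens position and keep the sharpest one.
    x, y = focus_point
    best_pos, best_score = None, -1.0
    for pos in lens_positions:
        frame = capture_at(pos)
        patch = frame[max(0, y - window):y + window, max(0, x - window):x + window]
        score = sharpness_score(patch)
        if score > best_score:
            best_pos, best_score = pos, score
    return best_pos
```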
  • Before step S10, the method further includes: receiving a shooting instruction of the mobile terminal.
  • The first camera is used to take a picture of a first in-focus object (such as a person).
  • The second camera is used to take a picture of a second in-focus object (such as the background behind the person).
  • the step S20 includes a sub-step S21 and a sub-step S22.
  • In sub-step S21, sharpness identification is performed on the image around the first in-focus object, and all pixels around the first in-focus object whose sharpness is higher than a pre-stored sharpness threshold are extracted to form the first layer.
  • In sub-step S22, sharpness identification is performed on the image around the second in-focus object, and all pixels around the second in-focus object whose sharpness is higher than the sharpness threshold are extracted to form the second layer.
  • the sharpness threshold is set by the user according to experience, and may also be preset in the camera by the research and development personnel.
  • the two cameras share the same sharpness threshold to form two layers.
  • the two cameras may also employ different sharpness thresholds to perform finer adjustment operations on the two layers.
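  • A minimal sketch of sub-steps S21/S22 under illustrative assumptions is given below: it computes a per-pixel sharpness map and keeps only the pixels above a threshold as a layer with a transparent background. The smoothed-Laplacian sharpness measure and the example threshold values are assumptions; the patent only requires that pixels above a pre-stored sharpness threshold form the layer.

```python
# Hedged sketch of extracting a layer from the pixels whose sharpness exceeds a
# threshold (sub-steps S21/S22). The sharpness measure and thresholds are
# illustrative assumptions, not values taken from the patent.
import numpy as np
import cv2


def extract_layer(image_bgr: np.ndarray, sharpness_threshold: float) -> np.ndarray:
    # Per-pixel sharpness: absolute Laplacian, blurred so that the flat interior
    # of an in-focus region is not discarded along with the out-of-focus areas.
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    sharpness = cv2.GaussianBlur(np.abs(cv2.Laplacian(gray, cv2.CV_64F)), (15, 15), 0)
    mask = (sharpness > sharpness_threshold).astype(np.uint8) * 255
    layer = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2BGRA)
    layer[:, :, 3] = mask  # fully transparent where the image is out of focus
    return layer


# The two cameras may share one threshold or use two different ones, e.g.:
# first_layer = extract_layer(first_camera_frame, sharpness_threshold=12.0)
# second_layer = extract_layer(second_camera_frame, sharpness_threshold=8.0)
```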
  • the step S30 includes a sub-step of performing an enlargement, reduction or movement operation on the first layer or the second layer.
  • the step S30 includes the sub-steps:
  • When two fingers are detected on the display screen, the center position of the line connecting the two fingers is obtained, and it is determined whether that center position is in the area of the first layer or of the second layer; when the distance between the two fingers becomes larger, the layer in which the center position is located is enlarged, and when the distance becomes smaller, that layer is reduced;
  • The preview interface and the image content of each layer are saved; on the saved image content, enlargement, reduction and movement operations are performed on the images of the first layer and the second layer to adjust the size and position of each layer's image.
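  • The pinch sub-step above can be sketched as follows: take the midpoint of the two touch points, hit-test it against each layer's rectangle, and enlarge or reduce the hit layer as the finger distance grows or shrinks. The Layer rectangle model and the 5% scale step are illustrative assumptions; a real terminal would hook into its UI toolkit's gesture events.

```python
# Hedged sketch of the two-finger scaling sub-step; layer geometry and the scale
# step are illustrative assumptions.
from dataclasses import dataclass
import math


@dataclass
class Layer:
    name: str
    x: float       # top-left corner in preview coordinates
    y: float
    width: float
    height: float
    scale: float = 1.0

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px <= self.x + self.width and self.y <= py <= self.y + self.height


def on_pinch(layers, p1, p2, previous_distance):
    # Midpoint of the line connecting the two fingers, and the current spread.
    cx, cy = (p1[0] + p2[0]) / 2.0, (p1[1] + p2[1]) / 2.0
    distance = math.hypot(p1[0] - p2[0], p1[1] - p2[1])
    for layer in layers:
        if layer.contains(cx, cy):
            if distance > previous_distance:      # fingers moving apart: enlarge
                layer.scale *= 1.05
            elif distance < previous_distance:    # fingers moving together: reduce
                layer.scale /= 1.05
            break
    return distance  # the caller stores this as the next previous_distance
```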
  • the first layer or the second layer may also be edited, modified, and the like.
  • the person and/or the background is beautified by image processing.
  • The second layer (the landscape image in FIG. 2) may be processed (for example, enlarged or reduced), the first layer may be processed, or the first layer and the second layer may be processed simultaneously.
  • Step S40 includes sub-steps S41 to S45.
  • In sub-step S41, a first distance between the first camera and the first in-focus object is acquired according to the distance between the first camera lens and the image sensor, and a second distance between the second camera and the second in-focus object is acquired according to the distance between the second camera lens and the image sensor.
  • In sub-step S42, it is determined whether the first distance is smaller than the second distance; if the first distance is smaller than the second distance, the method proceeds to sub-step S43, and if the first distance is not smaller than the second distance, the method proceeds to sub-step S44.
  • In sub-step S43, the first layer is used as the uppermost layer, the second layer is used as the middle layer, and the photo preview interface is used as the lowermost layer.
  • In sub-step S44, the second layer is used as the uppermost layer, the first layer is used as the middle layer, and the photo preview interface is used as the lowermost layer.
  • In sub-step S45, the layers are merged in the order of the uppermost layer, the middle layer, and the lowermost layer to generate a composite photo.
  • The uppermost layer covers the middle layer when an enlargement, reduction or movement operation is performed; that is, the layer closer to the camera has higher priority during operation.
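  • Sub-steps S41 to S45 can be sketched as below: the layer whose camera focused at the shorter distance goes on top, the other becomes the middle layer, and both are alpha-composited over the preview. In this sketch the distances are simply passed in (in the patent they come from each camera's lens-to-sensor geometry), and the BGRA layers are assumed to match the preview size.

```python
# Hedged sketch of ordering and merging the layers (sub-steps S41-S45). Assumes
# BGRA layers of the same size as the BGR preview; distance values are inputs.
import numpy as np


def alpha_over(background_bgr: np.ndarray, layer_bgra: np.ndarray) -> np.ndarray:
    # Simple alpha compositing: paint the layer over the background.
    alpha = layer_bgra[:, :, 3:4].astype(np.float32) / 255.0
    blended = (layer_bgra[:, :, :3].astype(np.float32) * alpha
               + background_bgr.astype(np.float32) * (1.0 - alpha))
    return blended.astype(np.uint8)


def compose_photo(preview_bgr, first_layer_bgra, first_distance,
                  second_layer_bgra, second_distance):
    # S42-S44: the layer focused at the shorter distance becomes the top layer.
    if first_distance < second_distance:
        middle, top = second_layer_bgra, first_layer_bgra
    else:
        middle, top = first_layer_bgra, second_layer_bgra
    # S45: merge bottom (preview), then middle, then top.
    composite = alpha_over(preview_bgr, middle)
    return alpha_over(composite, top)
```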
  • On the display module, if the first layer is the uppermost layer, the first layer and the second layer are displayed on the photo preview interface, where the image of the first layer blocks part of the second layer and the images of the first and second layers block part of the photo preview interface.
  • The images of the first layer and the second layer change as the terminal's cameras move, and their image content changes accordingly.
  • the method further includes: saving the synthesized photo to the storage device, and displaying the synthesized photo through the display module according to the user request.
  • FIG. 6 is a schematic structural block diagram of a photographing apparatus according to an embodiment of the present disclosure.
  • the photographing apparatus 100 includes a focus module 10, an image layering module 20, a layer processing module 30, and a photo composition module 40.
  • the focusing module 10 is configured to focus the first camera and the second camera on two different positions in the photo preview interface to obtain a first in-focus object corresponding to the first camera and a second in-focus object corresponding to the second camera.
  • the image layering module 20 is configured to extract an image around the first in-focus object as a first layer, and extract an image around the second in-focus object as a second layer.
  • the layer processing module 30 is configured to process the first layer and/or the second layer.
  • the photo composition module 40 is configured to merge the first layer and the second layer with the photo preview interface to generate a composite photo.
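  • As a rough, hypothetical sketch of how these four modules could cooperate in code (the class and method names below are illustrative assumptions, not taken from the patent or any real SDK):

```python
# Hypothetical skeleton tying the four modules of FIG. 6 together; every name
# here is an illustrative assumption, not an actual API.
class PhotographingApparatus:
    def __init__(self, focus_module, image_layering_module,
                 layer_processing_module, photo_composition_module):
        self.focus = focus_module                      # module 10
        self.layering = image_layering_module          # module 20
        self.processing = layer_processing_module      # module 30
        self.composition = photo_composition_module    # module 40

    def shoot(self, preview, first_position, second_position):
        # Focus the two cameras on two different positions in the preview.
        first_obj, second_obj = self.focus.focus(first_position, second_position)
        # Extract one layer around each in-focus object.
        first_layer, second_layer = self.layering.extract(first_obj, second_obj)
        # Optionally enlarge, reduce, move or otherwise edit the layers.
        first_layer, second_layer = self.processing.process(first_layer, second_layer)
        # Merge the layers with the preview into a composite photo.
        return self.composition.merge(preview, first_layer, second_layer)
```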
  • By using the dual cameras to focus simultaneously on objects at two different focus points, the captured image is divided into two layers.
  • The two layers can be adjusted individually to synthesize the photo the user needs, improving the user experience.
  • As shown in FIG. 2, when photographing a person against a landscape, the first camera may be used to capture the person image (the first layer in the figure) and the second camera may be used to capture the landscape image (the second layer in the figure).
  • Because the landscape image is far away, the pixels near the selected focal plane of the landscape image can be used as a layer and that layer can be enlarged, so that the person image and the landscape image are better proportioned, thereby improving the user's photographing experience.
  • The two cameras are mounted on the same side of the mobile terminal.
  • The camera preview interface of the mobile terminal displays the viewfinder image of the first camera.
  • The two cameras are capable of auto-focusing and focus simultaneously on two different positions in the photo preview interface; these two positions are the focus positions of the two cameras, and a focal plane is formed at each of the two focus positions.
  • Focusing is determined by image sharpness judgment, which may also be called contrast focusing.
  • the image layering module includes: a first sharpness identifying unit 21 and a second sharpness identifying unit 22.
  • The first sharpness recognition unit 21 is configured to perform sharpness recognition on the image around the first in-focus object and to extract all pixels around the first in-focus object whose sharpness is higher than a pre-stored sharpness threshold to form the first layer.
  • The second sharpness recognition unit 22 is configured to perform sharpness recognition on the image around the second in-focus object and to extract all pixels around the second in-focus object whose sharpness is higher than the sharpness threshold to form the second layer.
  • the sharpness threshold is set by the user according to experience, and may also be preset in the camera by the research and development personnel.
  • the two cameras share the same sharpness threshold to form two layers.
  • the two cameras may also employ different sharpness thresholds to perform finer adjustment operations on the two layers.
  • the layer processing module is further configured to perform an enlargement, reduction or movement operation on the image in the first layer or the second layer.
  • the layer processing module includes a layer scaling unit 31, and a layer moving unit 32.
  • The layer scaling unit 31 is configured to, when two fingers are detected on the display screen, acquire the center position of the line connecting the two fingers and determine whether that center position is in the area of the first layer or of the second layer; when the distance between the two fingers becomes larger, the layer in which the center position is located is enlarged, and when the distance becomes smaller, that layer is reduced.
  • The layer moving unit 32 is configured to, when a single finger is detected on the display screen, determine whether the finger position is in the area of the first layer or of the second layer; when the finger moves, the layer in which the finger is located follows the finger.
  • The preview interface and the image content of each layer are saved; on the saved image content, enlargement, reduction and movement operations are performed on the images of the first layer and the second layer to adjust the size and position of each layer's image.
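  • The layer moving unit can be sketched in the same illustrative style as the pinch example earlier, reusing that hypothetical Layer rectangle model: hit-test the finger's starting position and translate the hit layer by the drag delta.

```python
# Hedged sketch of the single-finger move behaviour; reuses the illustrative
# Layer dataclass from the pinch sketch above.
def on_single_finger_move(layers, start_point, current_point):
    # Translate the layer under the finger's starting position by the drag delta.
    dx = current_point[0] - start_point[0]
    dy = current_point[1] - start_point[1]
    for layer in layers:
        if layer.contains(start_point[0], start_point[1]):
            layer.x += dx
            layer.y += dy
            break
```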
  • the first layer or the second layer may also be edited, modified, and the like.
  • the person and/or the background is beautified by image processing.
  • The second layer (such as the landscape image in FIG. 2) may be processed (for example, enlarged or reduced), the first layer may be processed, or the first layer and the second layer may be processed simultaneously.
  • the photo synthesis module includes: a distance acquisition unit 41, a determination unit 42, and a merging unit 43.
  • The distance acquiring unit 41 is configured to acquire a first distance between the first camera and the first in-focus object according to the distance between the first camera lens and the image sensor, and to acquire a second distance between the second camera and the second in-focus object according to the distance between the second camera lens and the image sensor.
  • The determining unit 42 is configured to determine whether the first distance is smaller than the second distance; if so, the first layer is used as the uppermost layer, the second layer as the middle layer, and the photo preview interface as the lowermost layer; if not, the second layer is used as the uppermost layer, the first layer as the middle layer, and the photo preview interface as the lowermost layer.
  • The merging unit 43 is configured to merge the layers in the order of the uppermost layer, the middle layer, and the lowermost layer to generate a composite photo.
  • The uppermost layer covers the middle layer when an enlargement, reduction or movement operation is performed; that is, the layer closer to the camera has higher priority during operation.
  • On the display module, if the first layer is the uppermost layer, the first layer and the second layer are displayed on the photo preview interface, where the image of the first layer blocks part of the second layer and the images of the first and second layers block part of the photo preview interface.
  • The images of the first layer and the second layer change as the terminal's cameras move, and their image content changes accordingly.
  • the photographing apparatus further includes: a saving module configured to save the synthesized photo to the storage device; and a display module configured to display the synthesized photo according to the user request.
  • A mobile terminal is provided, comprising a memory, a processor, and at least one application stored in the memory and configured to be executed by the processor, the application being configured to perform a photographing method according to an embodiment of the present disclosure.
  • a computer readable storage medium storing a computer executable program.
  • When the computer-executable program is run on a processor, a photographing method according to an embodiment of the present disclosure may be implemented.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computing Systems (AREA)
  • Studio Devices (AREA)
  • Image Processing (AREA)

Abstract

The present invention relates to a photographing method, a photographing apparatus and a mobile terminal. The method comprises the steps of: focusing a first camera and a second camera on two different positions in a photo preview interface, so as to acquire a first in-focus object corresponding to the first camera and a second in-focus object corresponding to the second camera; extracting an image surrounding the first in-focus object as a first image layer, and extracting an image surrounding the second in-focus object as a second image layer; processing the first and/or the second image layer; and merging the first and second image layers with the photo preview interface, so as to generate a composite photograph.
PCT/CN2018/084823 2017-04-27 2018-04-27 Photographing method, photographing apparatus and mobile terminal WO2018196854A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201710289228.0A CN108810326B (zh) 2017-04-27 2017-04-27 Photographing method and device, and mobile terminal
CN201710289228.0 2017-04-27

Publications (1)

Publication Number Publication Date
WO2018196854A1 true WO2018196854A1 (fr) 2018-11-01

Family

ID=63919499

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/084823 WO2018196854A1 (fr) 2018-04-27 Photographing method, photographing apparatus and mobile terminal

Country Status (2)

Country Link
CN (1) CN108810326B (fr)
WO (1) WO2018196854A1 (fr)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110418056A (zh) * 2019-07-16 2019-11-05 Oppo广东移动通信有限公司 Image processing method and device, storage medium, and electronic device
CN112351204A (zh) * 2020-10-27 2021-02-09 歌尔智能科技有限公司 Photographing method and device, mobile terminal, and computer-readable storage medium
CN112616022A (zh) * 2020-12-21 2021-04-06 努比亚技术有限公司 Multi-point focusing method and device, terminal, and readable storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101025405B1 (ko) * 2009-11-02 2011-03-28 중앙대학교 산학협력단 Focus-adjustable image generation system and method, and photographing apparatus employing the same
CN104270573A (zh) * 2014-10-27 2015-01-07 上海斐讯数据通信技术有限公司 Multi-touch focusing imaging system and method, and applicable mobile terminal
CN104349063A (zh) * 2014-10-27 2015-02-11 东莞宇龙通信科技有限公司 Method, device and terminal for controlling camera shooting
CN104662463A (zh) * 2012-09-28 2015-05-27 佳能株式会社 Image processing device, imaging system and image processing system

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20140106927A (ko) * 2013-02-27 2014-09-04 한국전자통신연구원 Apparatus and method for generating a panorama
CN103647903B (zh) * 2013-12-31 2016-09-07 广东欧珀移动通信有限公司 Mobile terminal photographing method and system
CN104780315A (zh) * 2015-04-08 2015-07-15 广东欧珀移动通信有限公司 Method and system for shooting with an imaging device
CN105120178A (zh) * 2015-09-21 2015-12-02 宇龙计算机通信科技(深圳)有限公司 Focusing and photographing method and system for a multi-camera terminal, and mobile terminal
CN106060386A (zh) * 2016-06-08 2016-10-26 维沃移动通信有限公司 Preview image generation method and mobile terminal
CN106375662B (zh) * 2016-09-22 2019-04-12 宇龙计算机通信科技(深圳)有限公司 Dual-camera-based shooting method and device, and mobile terminal

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101025405B1 (ko) * 2009-11-02 2011-03-28 중앙대학교 산학협력단 Focus-adjustable image generation system and method, and photographing apparatus employing the same
CN104662463A (zh) * 2012-09-28 2015-05-27 佳能株式会社 Image processing device, imaging system and image processing system
CN104270573A (zh) * 2014-10-27 2015-01-07 上海斐讯数据通信技术有限公司 Multi-touch focusing imaging system and method, and applicable mobile terminal
CN104349063A (zh) * 2014-10-27 2015-02-11 东莞宇龙通信科技有限公司 Method, device and terminal for controlling camera shooting

Also Published As

Publication number Publication date
CN108810326B (zh) 2022-11-18
CN108810326A (zh) 2018-11-13

Similar Documents

Publication Publication Date Title
US9578260B2 (en) Digital photographing apparatus and method of controlling the digital photographing apparatus
US9712751B2 (en) Camera field of view effects based on device orientation and scene content
KR101899877B1 Apparatus and method for improving the image quality of an enlarged image
JP5036599B2 Imaging device
JP4497211B2 Imaging device, imaging method, and program
JP6157242B2 Image processing device and image processing method
US8274572B2 (en) Electronic camera capturing a group of a plurality of specific objects
US10237466B2 (en) Recognition of degree of focus of an image
WO2022083229A1 Image processing method, electronic device, and non-volatile computer-readable storage medium
JPWO2017037978A1 Detection device, detection method, detection program, and imaging device
JP4802884B2 Imaging device, captured-image recording method, and program
WO2018196854A1 Photographing method, photographing apparatus and mobile terminal
US9167150B2 (en) Apparatus and method for processing image in mobile terminal having camera
JP2013183306A Imaging device, imaging method, and program
JP2009089220A Imaging device
EP2200275B1 Method and apparatus for displaying a portrait on a screen
JP2011239267A Imaging device and image processing device
JP6483661B2 Imaging control device, imaging control method, and program
JP2010141609A Imaging device
JP2020160773A Image processing device, imaging device, image processing method, and program
KR101392382B1 Method and apparatus for implementing a camera panorama function in a portable terminal
US20230088309A1 (en) Device and method for capturing images or video
JP2010028370A Imaging device
TWI465108B Image capturing device and image processing method
JP2024002631A Image processing device, imaging device, image processing method, computer program, and recording medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18791109

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18791109

Country of ref document: EP

Kind code of ref document: A1