CN105554369A - Electronic device and method for processing image - Google Patents

Electronic device and method for processing image

Info

Publication number
CN105554369A
Authority
CN
China
Prior art keywords
image
electronic equipment
depth information
information
display
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201510696218.XA
Other languages
Chinese (zh)
Other versions
CN105554369B (en)
Inventor
尹泳权
金汶洙
金兑澔
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co Ltd filed Critical Samsung Electronics Co Ltd
Publication of CN105554369A
Application granted
Publication of CN105554369B
Legal status: Active
Anticipated expiration


Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/20 Image signal generators
    • H04N 13/271 Image signal generators wherein the generated image signals comprise depth maps or disparity maps
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 3/00 Geometric image transformations in the plane of the image
    • G06T 3/40 Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/50 Depth or shape recovery
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/45 Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/62 Control of parameters via user interfaces
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/63 Control of cameras or camera modules by using electronic viewfinders
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/63 Control of cameras or camera modules by using electronic viewfinders
    • H04N 23/631 Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • H04N 23/632 Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters for displaying or modifying preview images prior to image capturing, e.g. variety of image resolutions or capturing parameters
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/67 Focus control based on electronic image sensor signals
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/80 Camera processing pipelines; Components thereof
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/95 Computational photography systems, e.g. light-field imaging systems
    • H04N 23/951 Computational photography systems, e.g. light-field imaging systems by using two or more images to influence resolution, frame rate or aspect ratio
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00 Details of television systems
    • H04N 5/222 Studio circuitry; Studio devices; Studio equipment
    • H04N 5/262 Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B 11/24 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Theoretical Computer Science (AREA)
  • Computing Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • User Interface Of Digital Computer (AREA)
  • Studio Devices (AREA)

Abstract

Electronic devices and methods for processing images are provided. The method includes obtaining a first image and a second image through a first image sensor, extracting depth information from at least one third image obtained through a second image sensor, applying the extracted depth information to the obtained first image and displaying the first image, and applying the extracted depth information to the obtained second image.

Description

Electronic device and method for processing an image
Technical field
The present disclosure relates to an electronic device and method for processing an image.
Background
In recent years, portable electronic devices have been providing more diversified services and additional functions. Various applications executable on electronic devices are being developed to meet the demands of diverse users and to raise the utility of electronic devices. Smartphones, mobile phones, laptop computers, tablet personal computers (PCs), and other state-of-the-art mobile electronic devices with touchscreens may retain at least several to hundreds of applications.
Such an electronic device may have two cameras for capturing images. The electronic device may synthesize images through post-processing using depth information about the images.
However, because related-art electronic devices perform image processing as post-processing, they cannot use depth information for real-time image display or video capture.
Accordingly, there is a need for previewing or capturing images by processing them using depth information extracted from a plurality of images.
The above information is presented as background information only to assist with an understanding of the present disclosure. No determination has been made, and no assertion is made, as to whether any of the above might be applicable as prior art with regard to the present disclosure.
Summary of the invention
Aspects of the present disclosure address at least the above-mentioned problems and/or disadvantages and provide at least the advantages described below. Accordingly, an aspect of the present disclosure provides an electronic device and method for processing an image.
According to an aspect of the present disclosure, a method for processing an image by an electronic device is provided. The method includes obtaining a first image and a second image through a first image sensor; extracting depth information from at least one third image obtained through a second image sensor; applying the extracted depth information to the obtained first image and displaying the first image; and applying the extracted depth information to the obtained second image.
According to another aspect of the present disclosure, an electronic device for processing an image is provided. The electronic device includes an image obtaining module including a first image sensor configured to obtain a first image and a second image and a second image sensor configured to obtain at least one third image; an image processor configured to extract depth information from the at least one third image, apply the extracted depth information to the obtained first image and display the first image, and apply the extracted depth information to the obtained second image; and a display configured to display the obtained first image.
Other aspects, advantages, and salient features of the present disclosure will become apparent to those skilled in the art from the following detailed description, which, taken in conjunction with the accompanying drawings, discloses various embodiments of the present disclosure.
Brief description of the drawings
The above and other aspects, features, and advantages of certain embodiments of the present disclosure will be more apparent from the following description taken in conjunction with the accompanying drawings, in which:
Fig. 1 illustrates a network environment including an electronic device according to an embodiment of the present disclosure;
Fig. 2 is a block diagram illustrating an image processor of an electronic device according to an embodiment of the present disclosure;
Fig. 3A illustrates a front surface of an electronic device according to an embodiment of the present disclosure;
Fig. 3B illustrates a rear surface of an electronic device according to an embodiment of the present disclosure;
Fig. 4 is a flowchart illustrating a method for processing an image by an electronic device according to an embodiment of the present disclosure;
Fig. 5 is a flowchart illustrating a method for processing an image by an electronic device according to an embodiment of the present disclosure;
Fig. 6A is a view illustrating an example in which a first object in an image is focused according to an embodiment of the present disclosure;
Fig. 6B is a view illustrating an example in which a second object in an image is focused according to an embodiment of the present disclosure;
Fig. 6C is a view illustrating an example in which a third object in an image is focused according to an embodiment of the present disclosure;
Fig. 6D is a view illustrating an example of an image obtained by an electronic device by synthesizing the differently focused images illustrated in Figs. 6A, 6B, and 6C according to an embodiment of the present disclosure;
Fig. 7 is a flowchart illustrating a process for controlling the focus of an object in an image displayed on a display according to an embodiment of the present disclosure;
Figs. 8A and 8B are views illustrating an example of controlling focus when a first object is selected in an image displayed on a display according to an embodiment of the present disclosure;
Figs. 8C and 8D are views illustrating an example of controlling focus when a second object is selected in an image displayed on a display according to an embodiment of the present disclosure;
Fig. 9 is a flowchart illustrating a process for applying an exchange-lenses effect to an image displayed on a display according to an embodiment of the present disclosure;
Fig. 10A is a view illustrating an example of applying an exchange-lenses effect to an image displayed on a display according to an embodiment of the present disclosure;
Fig. 10B is a view illustrating an example of applying a lens effect to an image displayed on a display according to an embodiment of the present disclosure;
Fig. 10C is a view illustrating an example of applying a lens effect to an image displayed on a display according to an embodiment of the present disclosure;
Fig. 10D is a view illustrating an example of applying a lens effect to an image displayed on a display according to an embodiment of the present disclosure;
Fig. 10E is a view illustrating an example of applying a lens effect to an image displayed on a display according to an embodiment of the present disclosure;
Fig. 10F is a view illustrating an example of applying a lens effect to an image displayed on a display according to an embodiment of the present disclosure;
Fig. 11A illustrates an example of displaying an image for applying a lens effect according to an embodiment of the present disclosure;
Fig. 11B is a view illustrating an example of lens attribute information for providing an image effect to a displayed image according to an embodiment of the present disclosure;
Fig. 11C is a view illustrating an example of providing an image effect by adjusting the aperture value of a lens according to an embodiment of the present disclosure;
Fig. 11D is a view illustrating an example of providing an image effect by adjusting the shutter speed of a lens according to an embodiment of the present disclosure;
Fig. 11E is a view illustrating an example of providing an image effect by adjusting the focal length and shutter speed of a lens according to an embodiment of the present disclosure;
Fig. 12 is a block diagram illustrating an electronic device according to an embodiment of the present disclosure; and
Fig. 13 illustrates a communication protocol 1300 between a plurality of electronic devices (e.g., a first electronic device 1310 and a second electronic device 1330) according to an embodiment of the present disclosure.
Throughout the drawings, like reference numerals will be understood to refer to like parts, components, and structures.
Detailed description
The following description with reference to the accompanying drawings is provided to assist in a comprehensive understanding of various embodiments of the present disclosure as defined by the claims and their equivalents. It includes various specific details to assist in that understanding, but these are to be regarded as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the various embodiments described herein can be made without departing from the scope and spirit of the present disclosure. In addition, descriptions of well-known functions and constructions may be omitted for clarity and conciseness.
The terms and words used in the following description and claims are not limited to their bibliographical meanings, but are merely used by the inventor to enable a clear and consistent understanding of the present disclosure. Accordingly, it should be apparent to those skilled in the art that the following description of various embodiments of the present disclosure is provided for illustration purposes only and not for the purpose of limiting the present disclosure as defined by the appended claims and their equivalents.
It is to be understood that the singular forms "a", "an", and "the" include plural referents unless the context clearly dictates otherwise. Thus, for example, reference to "a component surface" includes reference to one or more of such surfaces.
When used herein, term " comprise " and/or " comprising " specify disclosed in function, operation or assembly existence, but do not get rid of and exist or add other functions one or more, operation or assembly.Be to be understood that further, when using in this manual, term " comprise " and/or " having " specify set forth feature, integer, operation, element and/or assembly existence, but do not get rid of and exist or add other features one or more, integer, operation, element and/or assembly and/or its group.
When used herein, term "and/or" comprises one or more aforementioned any and whole combination listing item.Such as, " A or B " can comprise A or comprise B or comprise A and B.
Ordinal numbers as used herein, such as "first" and "second", may modify various components of various embodiments, but do not limit those components. For example, these terms do not limit the order and/or importance of the components. These terms are only used to distinguish one component from another. For example, a first user device and a second user device are different user devices from each other. For example, a first component may be denoted a second component, and vice versa, without departing from the scope of the present disclosure.
When a component is "connected to" or "coupled to" another component, the component may be directly connected or coupled to the other component, or other component(s) may intervene therebetween. In contrast, when a component is "directly connected to" or "directly coupled to" another component, no intervening components may be interposed therebetween.
According to an embodiment of the present disclosure, the electronic device disclosed herein may be a device with a display function. For example, examples of the electronic device may include, but are not limited to, a smartphone, a tablet personal computer (PC), a mobile phone, a video phone, an e-book reader, a desktop PC, a laptop computer, a netbook, a personal digital assistant (PDA), a portable multimedia player (PMP), a Moving Picture Experts Group phase 1 or phase 2 (MPEG-1 or MPEG-2) audio layer 3 (MP3) player, a mobile medical device, a camera, or a wearable device (e.g., a head-mounted device (HMD), electronic clothes, an electronic bracelet, an electronic necklace, an electronic accessory, an electronic tattoo, or a smart watch).
According to an embodiment of the present disclosure, the electronic device may be a smart home appliance with a display function. For example, examples of the smart home appliance may include, but are not limited to, a television (TV), a digital versatile disc (DVD) player, an audio player, a refrigerator, an air conditioner, a vacuum cleaner, an oven, a microwave oven, a washer, a dryer, an air purifier, a set-top box, a TV box (e.g., Samsung HomeSync™, Apple TV™, or Google TV™), a game console, an electronic dictionary, a camcorder, or an electronic photo frame.
According to an embodiment of the present disclosure, examples of the electronic device may include, but are not limited to, various medical devices (e.g., a magnetic resonance angiography (MRA) device, a magnetic resonance imaging (MRI) device, a computed tomography (CT) device, an imaging device, or an ultrasonic device), a navigation device, a global positioning system (GPS) receiver, an event data recorder (EDR), a flight data recorder (FDR), an automotive infotainment device, a marine electronic device (e.g., a marine navigation device, a gyroscope, or a compass), an avionics device, a security device, a vehicle head unit, an industrial or home robot, an automated teller machine (ATM), or a point of sale (POS) device.
According to various embodiments of the present disclosure, examples of the electronic device may include, but are not limited to, part of furniture or a building/structure with a biometric function, an electronic board, an electronic signature receiving device, a projector, or various measuring devices (e.g., devices for measuring water, electricity, gas, or electromagnetic waves). According to an embodiment of the present disclosure, the electronic device may be one or a combination of the above-listed devices. According to an embodiment of the present disclosure, the electronic device may be a flexible device. According to an embodiment of the present disclosure, the electronic device is not limited to the above-listed devices.
Various embodiments of the present disclosure are now described with reference to the accompanying drawings. As used herein, the term "user" may refer to a person using the electronic device or another device using the electronic device.
Fig. 1 illustrates a network environment 100 including an electronic device according to an embodiment of the present disclosure.
Referring to Fig. 1, the electronic device 101 may include a bus 110, a processor 120, a memory 130, an input/output interface 140, a display 150, a communication interface 160, and an image processor 170.
According to an embodiment of the present disclosure, the electronic device may be any of various electronic devices that can communicate data and can transmit or receive biometric information to perform operations. The electronic device may include a smartphone, a mobile phone, a laptop computer, a door lock, an air conditioner, a washer, a notebook PC, a tablet PC, or a smart TV.
The bus 110 connects the other components to each other, and the bus 110 may carry communications (e.g., control messages) between the other components.
The processor 120 may receive a command from the other components (e.g., the memory 130, the input/output interface 140, the display 150, the communication interface 160, or the image processor 170) through, for example, the bus 110, may interpret the received command, and may execute computation or data processing according to the interpreted command.
The memory 130 may store a command or data received from other components (e.g., the input/output interface 140, the display 150, the communication interface 160, or the image processor 170) or a command or data generated by the processor 120 or other components. The memory 130 may retain programming modules including, for example, a kernel 131, middleware 132, an application programming interface (API) 133, or an application 134. The programming modules may be configured in software, firmware, hardware, or a combination of two or more thereof.
The kernel 131 may control or manage system resources (e.g., the bus 110, the processor 120, or the memory 130) used to perform operations or functions implemented in the other programming modules, such as the middleware 132, the API 133, or the application 134. The kernel 131 may provide an interface allowing the middleware 132, the API 133, or the application 134 to access the individual components of the electronic device 101 to control or manage them.
The middleware 132 may function as a relay to allow the API 133 or the application 134 to communicate data with the kernel 131. A plurality of applications 134 may be provided. The middleware 132 may control work requests received from the applications 134, for example, by assigning priorities for using the system resources of the electronic device 101 (e.g., the bus 110, the processor 120, or the memory 130) to at least one of the plurality of applications 134.
The API 133 is an interface allowing the application 134 to control functions provided from the kernel 131 or the middleware 132. For example, the API 133 may include at least one interface or function (e.g., a command) for file control, window control, image processing, or text control.
According to an embodiment of the present disclosure, a plurality of applications 134 may be provided, including a short message service (SMS)/multimedia messaging service (MMS) application, an email application, a calendar application, an alarm application, a healthcare application (e.g., for measuring exercise amount or blood sugar), or an environmental information application (e.g., an application providing atmospheric pressure, humidity, or temperature information). Additionally or alternatively, the application 134 may be an application related to information exchange between the electronic device 101 and an external electronic device (e.g., the electronic device 104). Examples of the information-exchange-related application may include, but are not limited to, a notification relay application for transferring specific information to the external electronic device, or a device management application for managing the external electronic device.
For example, the notification relay application may include a function for relaying notification information generated by other applications of the electronic device 101 (e.g., the SMS/MMS application, email application, healthcare application, or environmental information application) to the external electronic device (e.g., the electronic device 104). Additionally or alternatively, the notification relay application may receive notification information from, for example, the external electronic device (e.g., the electronic device 104) and may provide the received notification information to the user. The device management application may perform at least some functions of the external electronic device (e.g., the electronic device 104) communicating with the electronic device 101 (for example, turning on/off the external electronic device (or some components thereof) or controlling the brightness (or resolution) of its display), and the device management application may manage (e.g., install, delete, or update) an application operating in the external electronic device or a service (e.g., a call service or messaging service) provided from the external electronic device.
According to an embodiment of the present disclosure, the application 134 may include an application designated depending on an attribute (e.g., the type) of the external electronic device (e.g., the electronic device 104). For example, when the external electronic device is an MP3 player, the application 134 may include an application related to music playback. Similarly, when the external electronic device is a mobile medical device, the application 134 may include an application related to healthcare. According to an embodiment of the present disclosure, the application 134 may include an application designated to the electronic device 101 or an application received from the external electronic device (e.g., the server 106 or the electronic device 104).
The input/output interface 140 may transfer commands or data input by the user through an input/output device (e.g., a sensor, a keyboard, or a touchscreen) to the processor 120, the memory 130, the communication interface 160, or the image processor 170 through, for example, the bus 110. For example, the input/output interface 140 may provide the processor 120 with data regarding a user touch input made through the touchscreen. The input/output interface 140 may output, through an input/output device (e.g., a speaker or a display), commands or data received from the processor 120, the memory 130, the communication interface 160, or the image processor 170 through, for example, the bus 110. For example, the input/output interface 140 may output voice data processed by the processor 120 to the user through a speaker.
The display 150 may display various types of information (e.g., multimedia data or text data) to the user.
The communication interface 160 may interface communication between the electronic device 101 and an external electronic device (e.g., the electronic device 104 or the server 106). For example, the communication interface 160 may be connected with the network 162 through wired or wireless communication to communicate with the external electronic device. The wireless connection may be made by various radio communication protocols, including, but not limited to, Wi-Fi, Bluetooth (BT), near field communication (NFC), GPS, or cellular communication protocols (e.g., long term evolution (LTE), LTE-advanced (LTE-A), code division multiple access (CDMA), wideband CDMA (WCDMA), universal mobile telecommunications service (UMTS), wireless broadband (WiBro), or global system for mobile communications (GSM)). The wired connection may be made by various wired communication protocols, including, but not limited to, universal serial bus (USB), high definition multimedia interface (HDMI), recommended standard 232 (RS-232), or plain old telephone service (POTS).
According to an embodiment of the present disclosure, the network 162 may be a communication network. The communication network may include a computer network, the Internet, the Internet of things (IoT), or a telephone network. According to an embodiment of the present disclosure, protocols for communication between the electronic device 101 and the external electronic device (examples of such protocols include, but are not limited to, transport layer protocols, data link layer protocols, or physical layer protocols) may be supported by the application 134, the API 133, the middleware 132, the kernel 131, or the communication interface 160.
The server 106 may support driving the electronic device 101 by performing at least one of the operations (or functions) implemented on the electronic device 101. For example, the server 106 may include an image processing server module 108 that may support the image processor 170 implemented in the electronic device 101. For example, the image processing server module 108 may include at least one element of the image processor 170 to perform at least one of the operations performed by the image processor 170.
The image processor 170 may process at least part of the information obtained from other elements (e.g., the processor 120, the memory 130, the input/output interface 140, or the communication interface 160), and may provide the processed information to the user in various manners. For example, the image processor 170 may control at least some functions of the electronic device 101 using the processor 120 or independently from it so that the electronic device 101 may interwork with other electronic devices (e.g., the electronic device 104 or the server 106). According to an embodiment of the present disclosure, at least one configuration of the image processor 170 may be included in the server 106 (e.g., the image processing server module 108), and at least one operation implemented on the image processor 170 may be supported by the server 106. Additional information regarding the image processor 170 is provided through Figs. 2 to 13, described below.
Fig. 2 is a block diagram illustrating an image processor of an electronic device according to an embodiment of the present disclosure.
Referring to Fig. 2, the image processor 170 may include an image obtaining module 210, a depth information extracting module 220, and an applying module 230.
According to an embodiment of the present disclosure, the image obtaining module 210 may obtain at least one image (or picture) from at least one image sensor. The at least one image sensor may include at least one of at least one array camera, a stereo camera, a time-of-flight (TOF) sensor, a structured light sensor, and an infrared (IR) sensor. An array camera may include a plurality of camera modules. The image obtaining module 210 may obtain at least one image (or picture) through an image sensor or an array camera. The image obtaining module 210 may obtain a first image and a second image through a first image sensor, and may obtain at least one third image through a second image sensor. The second image sensor may be included in at least one image sensor included in at least one array camera. The third image may be an image obtained by at least one array camera or an image obtained from a stereo camera. The image obtaining module 210 may also obtain at least one image through at least one array camera. An image sensor may obtain one image, and an array camera may obtain at least one image through at least one camera module provided therein. An image sensor may obtain two images of the same scene. Of the two images, the first image may be an image previewed on the display 150, and the second image may be an image temporarily stored in a buffer or memory. The first image may have a resolution lower than that of the second image.
According to an embodiment of the present disclosure, when the first image sensor of the image obtaining module 210 obtains the first image and the second image and the second image sensor obtains at least one third image, the image processor 170 may extract depth information using the image obtained through the second image sensor, apply the extracted depth information to the first image, and display the first image with the depth information applied on the display 150. Further, the image processor 170 may extract depth information by synthesizing the first image obtained through the first image sensor with the third image obtained through the second image sensor. The image processor 170 may apply the extracted depth information to the second image obtained through the first image sensor. The second image may be temporarily stored in a buffer. The image processor 170 may apply upscaled depth information to the second image temporarily stored in the buffer or memory. The image processor 170 may store the upscaled depth information and the captured image in the memory 130. The stored image may include at least one object, and the upscaled depth information may be included in the image or stored separately from the image. The image processor 170 may use the depth information. The image processor 170 may adjust the extracted depth information when displaying the image, apply the adjusted depth information to the first image, and display the first image with the depth information applied. The operation of upscaling the depth information may include changing the resolution of the depth information to fit the resolution of the image.
According to an embodiment of the present disclosure, the image processor 170 may preview the first image with the depth information applied on the display 150, and, when sensing during the preview an input for applying an exchange-lenses effect to the previewed image, may display information related to lens exchange. This information is information for adjusting attributes of a lens and may include at least one of an aperture setting, a shutter speed, manual, and program. When selection of at least one information item of the displayed information is sensed, the image processor 170 may apply an image effect (e.g., blurring) corresponding to the selected information item to the previewed image and display the image with the effect applied. The image processor 170 may display, together with the image, a menu or icon for receiving input from the user to apply the exchange-lenses effect to the image displayed on the display 150.
According to an embodiment of the present disclosure, the depth information extracting module 220 may extract a depth map from the images obtained from the image sensors. The depth map may be extracted by a stereo matching scheme. The depth information extracting module 220 may apply an image effect to at least one of the first image and the second image. The depth information extracting module 220 may estimate the relative distance of the same object included in the first image and the second image, and may apply an image effect corresponding to the estimated distance.
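By way of illustration only, a stereo matching scheme of the kind referred to above may be sketched as follows. The present disclosure does not prescribe any particular library; the use of OpenCV block matching below, and the chosen parameters, are assumptions of this sketch:

```python
import cv2

# Two views of the same scene (e.g., the first image and a third image
# from the second image sensor), loaded as grayscale for block matching.
left = cv2.imread("view_left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("view_right.png", cv2.IMREAD_GRAYSCALE)

# Block-matching stereo correspondence: larger disparity means a closer object.
stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = stereo.compute(left, right)  # 16x fixed-point disparities

# Normalize to an 8-bit map usable as per-pixel depth information.
depth_map = cv2.normalize(disparity, None, 0, 255,
                          cv2.NORM_MINMAX).astype("uint8")
```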
According to an embodiment of the present disclosure, the applying module 230 may upscale the depth information extracted by the depth information extracting module 220, and may apply the upscaled information to at least one image obtained by the image obtaining module 210. The applying module 230 may apply an image effect (e.g., blurring) to at least part of the obtained image using the upscaled depth information. When a touch on the display 150 is sensed while the image with the depth information applied is displayed on the display 150, the applying module 230 may focus on an object within a predetermined area around the position where the touch is sensed, while applying an image effect (e.g., blurring) to at least one object other than that object in the previewed image. The applying module 230 may apply the image effect (e.g., blurring) to an object using the depth information corresponding to that object. The image effect may include adjusting at least one of blurriness, color, brightness, mosaic, and resolution.
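The touch-to-refocus behavior described above can be sketched as follows; the depth tolerance and blur kernel size are invented for this illustration, not required by the disclosure:

```python
import cv2
import numpy as np

def selective_focus(image, depth_map, touch_xy, depth_tolerance=10, blur_ksize=21):
    """Keep pixels near the touched object's depth sharp; blur the rest."""
    x, y = touch_xy
    focus_depth = int(depth_map[y, x])  # depth at the touched position

    # Pixels whose depth is close to the touched object stay in focus.
    in_focus = np.abs(depth_map.astype(np.int32) - focus_depth) <= depth_tolerance

    blurred = cv2.GaussianBlur(image, (blur_ksize, blur_ksize), 0)
    mask = in_focus[..., np.newaxis]    # broadcast the mask over color channels
    return np.where(mask, image, blurred)
```

Under these assumptions, selective_focus(preview, depth_map, (x, y)) would reproduce the behavior of focusing the touched object while blurring the remaining objects in the previewed image.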
According to an embodiment of the present disclosure, the image processor 170 may include a session connecting module (not shown) that may control interoperation between the image obtaining module 210, the depth information extracting module 220, and the applying module 230 of the image processor 170. According to an embodiment of the present disclosure, the session connecting module may control the connection between at least one component of the electronic device 101. Further, the session connecting module may control the connection between the electronic device 101 and the server 106 and/or a session connection between the electronic device 101 and at least one peripheral electronic device.
Further, the various functions performed by the image obtaining module 210, the depth information extracting module 220, and the applying module 230, respectively, may be fulfilled by the image processor 170 or the processor 120.
Figs. 3A and 3B illustrate a front surface (Fig. 3A) and a rear surface (Fig. 3B) of an electronic device according to an embodiment of the present disclosure.
The electronic device 101 may have at least one camera on its front surface or rear surface. The electronic device 101 may have a first camera 301 and a second camera 302 on its front surface, or may have a first camera 303 and a second camera 304 on its rear surface. At least one of the first cameras 301 and 303 and the second cameras 302 and 304 may include at least one of an array camera, a stereo camera, a TOF sensor, a structured light sensor, and an IR sensor. An array camera may include a plurality of camera modules. A camera may transfer an image obtained by capturing an object to the image processor 170 or the processor 120. A camera may collect images including a plurality of color pixels and transfer the images to the image processor 170 or the processor 120. A camera may include at least one image (or picture) sensor module connected to the electronic device 101. A camera may include a depth sensor. The depth sensor may include at least one sensor implemented as an infrared pulse laser operated in a TOF scheme to output depth information about a measured object.
The second camera 302 and/or 304 may at least partially share a viewing angle with the first camera 301 and/or 303. The electronic device 101 with the second camera 302 and/or 304 may include a TOF-based sensor that determines depth information based on the time taken for an emitted IR pulse laser beam to be received after being reflected by an object. Further, the second camera 302 and/or 304 may include an array camera with at least two cameras instrumentally connected to each other. At least one of the first camera and the second camera may be installed anywhere in a surface of the electronic device 101. According to an embodiment of the present disclosure, the depth information may vary depending on the distance (e.g., the baseline) between the first camera and the second camera.
Fig. 4 is a flowchart illustrating a method for processing an image by an electronic device according to an embodiment of the present disclosure.
A method for processing an image by an electronic device according to an embodiment of the present disclosure is now described with reference to Fig. 4.
In operation 410, the electronic device 101 may obtain a first image and a second image through a first image sensor, and store the second image in a buffer. The electronic device 101 may obtain the first image (e.g., a preview image) and the second image through an image sensor, and may store the obtained second image in the buffer. The electronic device 101 may have at least one camera that may include an image sensor capable of obtaining an image (or picture). The image sensor may generate the preview image and an actually captured image simultaneously or sequentially, and may display the preview image on the display 150. The preview image may have a resolution lower than that of the actually captured image, and may have a resolution corresponding to that of the display 150. The electronic device 101 may generate at least two images of each same scene obtained through the image sensor, using the first of the two generated images for preview while temporarily storing the second image in the buffer. The first image may have a resolution lower than that of the second image. For example, the first image may have a resolution of 3.7 megapixels, while the second image may have a resolution of 16 megapixels.
In operation 420, the electronic device 101 may extract depth information from at least one third image obtained through at least one second image sensor. The at least one second image sensor may include at least one array camera. The electronic device 101 may have at least one array camera capable of capturing a plurality of images (or pictures) with a resolution lower than that of the image output from the first image sensor. The electronic device 101 may obtain depth information from at least one image obtained from the at least one second image sensor. The electronic device 101 may estimate the relative distance of the same object included in at least two images. The electronic device 101 may apply an image effect to at least one of the first image and the second image using the extracted depth information. The depth information may include a depth map of the first image or the second image.
In operation 430, the electronic device 101 may apply the extracted depth information to the first image, and may display the first image with the depth information applied on the display 150. The electronic device 101 may scale the depth information obtained from the at least one image obtained from the at least one second image sensor so that it can be processed correspondingly to the preview image or the image actually captured by the first image sensor. The electronic device 101 may apply the extracted depth information to the corresponding one (e.g., the first image) of the images obtained through the first image sensor, and may display the image with the depth information applied on the display 150. Further, the electronic device 101 may upscale the extracted depth information, apply the upscaled depth information to the image stored in the buffer (e.g., the second image), and store the image with the depth information applied. The operation of upscaling the depth information may include changing the resolution of the depth information to fit the resolution of the image. For example, when the image obtained by the array camera has a resolution of 2 megapixels, the electronic device 101 may upscale it from 2 megapixels to fit a 3.7-megapixel image or a 16-megapixel image.
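A minimal sketch of this upscaling step, assuming an OpenCV-style resize (the interpolation choice is an assumption of this sketch; the disclosure only requires that the depth resolution be changed to fit the target image):

```python
import cv2

def upscale_depth(depth_map, target_image):
    """Resize a low-resolution depth map (e.g., from a 2 MP array camera)
    to match a higher-resolution target (e.g., a 16 MP captured image)."""
    h, w = target_image.shape[:2]
    # Nearest-neighbor keeps depth discontinuities at object edges crisp.
    return cv2.resize(depth_map, (w, h), interpolation=cv2.INTER_NEAREST)
```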
In operation 440, when an input for capturing an image is sensed during preview, then in operation 450 the electronic device 101 may upscale the extracted depth information, apply the upscaled depth information to the stored second image, and store the second image with the depth information applied. When an input for capturing an object is sensed while the image with the depth information applied (e.g., the first image) is displayed on the display 150, the electronic device 101 may upscale the depth information extracted in operation 420, apply the upscaled depth information to the image stored in the buffer in operation 410 (e.g., the second image), and store the image with the depth information applied. Further, according to the present disclosure, the electronic device 101 may upscale the depth information extracted in operation 420 and, in response to sensing the input for capturing the object, store the second image in the memory 130 with the upscaled depth information applied, in the state where the image (e.g., the second image) has been stored in the buffer in operation 410. In order to upscale the extracted depth information and apply it to the stored image (e.g., the second image), the electronic device 101 may upscale the depth map extracted in operation 420 and use the upscaled depth map to apply an image effect (e.g., blurring) to at least part of the stored image. The electronic device 101 may store, in the memory 130, the image with the upscaled depth information applied, together with the depth information. The stored image may include at least one object. According to an embodiment of the present disclosure, the image may be temporarily stored in the buffer or may be stored in the memory 130. Thereafter, when a capture request is received from the user, the image present in the buffer or memory at the time of the capture request may be encoded or stored in another area of the memory 130. Alternatively, the captured image may be temporarily stored in the buffer or stored in the memory 130 before undergoing image processing. Then, when the user requests a capture, the image present in the buffer or memory at the time of the capture request may be encoded or stored in another area of the memory 130.
When a touch on the display 150 is sensed while the image with the depth information applied in operation 430 (e.g., the first image) is displayed on the display 150, the electronic device 101 may focus on an object within a predetermined area around the position where the touch is sensed, while applying an image effect (e.g., blurring) to at least one other object in the previewed image. The depth information about the at least one object may be used to apply the image effect (e.g., blurring) to that object. When an input for applying an exchange-lenses effect to the displayed image is sensed while the image with the depth information applied in operation 430 (e.g., the first image) is displayed on the display 150, the electronic device 101 may display information related to lens exchange on the display 150. When selection of at least one information item of the displayed information is sensed, the electronic device 101 may use the selected information item to apply an image effect (e.g., blurring) corresponding to the selected information item to at least one object in the previewed image, and may display the resulting image. The information is information for adjusting attributes of a lens, and may include at least one of an aperture setting, a shutter speed, manual, and program.
Fig. 5 is a flowchart illustrating a method for processing an image by an electronic device according to an embodiment of the present disclosure.
A method for processing an image by an electronic device according to an embodiment of the present disclosure is now described with reference to Fig. 5.
In operation 510, the electronic device 101 may obtain a first image and a second image through a first image sensor, and obtain a third image through a second image sensor. The electronic device 101 may have at least two cameras, each including at least one image sensor capable of obtaining an image (or picture). The first image sensor may generate the first image and the second image simultaneously or sequentially, and may display the first image on the display 150. The first image (e.g., a preview image) may have a resolution lower than that of the second image, and may have a resolution corresponding to that of the display 150. The second image may be stored in a buffer or the memory 130. An image stored in the buffer may be automatically deleted after a predetermined time.
In operation 520, the electronic device 101 may extract depth information using at least one of the obtained first image and second image. The electronic device 101 may synthesize the first image obtained through the first image sensor and the third image obtained through the at least one second image sensor. The synthesized image may have a resolution including at least part of the first image and the second image. The electronic device 101 may extract depth information from the first image and the third image. The electronic device 101 may adjust at least one of the focus, exposure condition, white balance, rotation, and movement of at least one of the first image and the second image. The operation of adjusting the first image may include estimating the relative distance of an object commonly included in the first image and the third image.
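The adjustment of rotation and movement between the two views can be illustrated, purely as an assumption of this sketch (the disclosure does not name an alignment algorithm), with OpenCV's ECC image registration:

```python
import cv2
import numpy as np

def align_views(first_gray, third_gray):
    """Estimate the rotation/translation between the two grayscale views so
    the same object can be compared at the same coordinates."""
    warp = np.eye(2, 3, dtype=np.float32)
    criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 50, 1e-6)
    _, warp = cv2.findTransformECC(first_gray, third_gray, warp,
                                   cv2.MOTION_EUCLIDEAN, criteria, None, 5)
    h, w = first_gray.shape
    # Warp the third image into the first image's frame of reference.
    return cv2.warpAffine(third_gray, warp, (w, h),
                          flags=cv2.INTER_LINEAR | cv2.WARP_INVERSE_MAP)
```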
In operation 530, the electronic device 101 may apply the extracted depth information to the first image, and display the first image with the depth information applied on the display 150. The electronic device 101 may display the image synthesized in operation 520 on the display 150, or may apply an image effect (e.g., blurring) to the first image using the extracted depth information and display the image with the image effect applied. When a touch is sensed while the first image is displayed on the display 150, the electronic device 101 may focus on an object within a predetermined area around where the touch is sensed, while applying an image effect (e.g., blurring) to at least one other object in the displayed image. The depth information about the at least one object may be used to apply the image effect (e.g., blurring) to that object. When an input for applying the exchange-lenses effect is sensed while the image is previewed, the electronic device 101 may display information related to lens exchange, and when selection of at least one information item of the displayed information is sensed, may use the selected information item to apply the image effect (e.g., blurring) corresponding to the selected information item to at least one object in the image and display the image with the image effect applied. The information is information for adjusting attributes of a lens, and may include at least one of an aperture setting, a shutter speed, manual, and program.
In operation 540, when an input for capturing an image is sensed, then in operation 550 the electronic device 101 may upscale the extracted depth information, apply the upscaled depth information to the second image, and store the second image with the depth information applied. The electronic device 101 may display on the display 150 the images obtained through the first image sensor and the at least one second image sensor. When an input for capturing an object is sensed while the image is displayed on the display 150, the electronic device 101 may upscale the depth information extracted in operation 520, apply the upscaled depth information to the second image obtained in operation 510, and store the second image with the depth information applied. Further, according to the present disclosure, the electronic device 101 may upscale the depth information extracted in operation 520 and, in response to sensing the input for capturing the object, store the second image in the memory 130 with the upscaled depth information applied to the second image obtained in operation 510. In order to upscale the extracted depth information and apply it to the second image, the electronic device 101 may upscale the depth map extracted in operation 520 and use the upscaled depth map to apply an image effect (e.g., blurring) to at least part of the stored image. For example, the electronic device 101 may apply different blurring to each object depending on the depth information, thereby achieving selective focusing. The electronic device 101 may store the first image in the memory 130.
Fig. 6A is a view illustrating an example in which a first object in an image is focused according to an embodiment of the present disclosure. Fig. 6B is a view illustrating an example in which a second object in an image is focused according to an embodiment of the present disclosure. Fig. 6C is a view illustrating an example in which a third object in an image is focused according to an embodiment of the present disclosure. Fig. 6D is a view illustrating an example of an image obtained by an electronic device by synthesizing the differently focused images illustrated in Figs. 6A, 6B, and 6C according to an embodiment of the present disclosure.
Referring to Fig. 6A, in the image 610 illustrated in Fig. 6A, the first object 611, corresponding to a person, is focused, while the second objects 612, 613, and 614, included in the background, and the third object 615, corresponding to a person, are out of focus. Fig. 6A illustrates an example in which the first object 611 has undergone selective focusing.
Referring to Fig. 6B, in the image 620 illustrated in Fig. 6B, the second objects 622, 623, and 624, included in the background, are focused, while the first object 611, corresponding to a person, and the third object 615 are out of focus. Fig. 6B illustrates an example in which the second objects 622, 623, and 624 have undergone selective focusing.
Referring to Fig. 6C, in the image 630 illustrated in Fig. 6C, the third object 635, corresponding to a person, is focused, while the second objects 612, 613, and 614, included in the background, and the first object 611, corresponding to a person, are out of focus. Fig. 6C illustrates an example in which the third object 635 has undergone selective focusing.
Referring to Fig. 6D, the image 640 of Fig. 6D is the result of synthesizing differently focused images, such as those illustrated in Figs. 6A to 6C. According to the present disclosure, objects respectively included in a plurality of images may be synthesized into a single image. The electronic device 101 may sense a command for synthesizing a plurality of images. For example, the electronic device 101 may sense the command for synthesizing the plurality of images based on information input through at least one of the input/output interface 140 and the display 150. Further, the electronic device 101 may synthesize the plurality of images by sensing a user gesture for synthesizing a plurality of objects, or may automatically synthesize the plurality of images when the objects included in the plurality of images have different focuses.
When such a command is sensed, the electronic device 101 may extract at least one focused object from each image or picture. For example, the electronic device 101 may extract the focused first object 611 from the image 610 of Fig. 6A, extract the focused second objects 622, 623, and 624 from the image 620 of Fig. 6B, and extract the focused third object 635 from the image 630 of Fig. 6C. The electronic device 101 may synthesize the objects respectively extracted from Figs. 6A to 6C into a single image 640, as illustrated in Fig. 6D. The image 640 may include the plurality of focused objects as included in Figs. 6A to 6C. The electronic device 101 may store the synthesized image 640 and the depth information about each object in the memory 130 or the buffer, or may display the synthesized image 640 on the display 150. The electronic device 101 may apply the depth information corresponding to each of the images 610, 620, and 630 to the synthesized image and display it on the display 150. The electronic device 101 may generate the image 640 using the images 610, 620, and 630, and display the generated image 640.
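One simple way to realize this all-in-focus synthesis is focus stacking: pick, per pixel, the source image that is sharpest there. The sketch below is an illustration under that assumption (the Laplacian sharpness measure is not specified by the disclosure):

```python
import cv2
import numpy as np

def synthesize_focused(images):
    """Merge differently focused shots of the same scene into one image in
    which every object appears in focus (cf. Figs. 6A-6D)."""
    stack = np.stack(images)                     # (N, H, W, 3)

    # Per-pixel sharpness: magnitude of the Laplacian of each grayscale frame.
    sharpness = np.stack([
        np.abs(cv2.Laplacian(cv2.cvtColor(img, cv2.COLOR_BGR2GRAY), cv2.CV_64F))
        for img in images
    ])

    best = np.argmax(sharpness, axis=0)          # sharpest source per pixel
    return np.take_along_axis(stack, best[None, ..., None], axis=0)[0]
```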
Fig. 7 is a flowchart illustrating a process for controlling the focus of an object in an image displayed on a display according to an embodiment of the present disclosure.
A process for controlling the focus of an object in an image displayed on a display according to an embodiment of the present disclosure is now described with reference to Fig. 7.
Referring to Fig. 7, in operation 710, the electronic device 101 may display an image including at least one object. The display 150 of the electronic device 101 may synthesize and display a plurality of differently focused images or pictures. The electronic device 101 may synthesize a plurality of images or pictures captured through a plurality of cameras or image sensors into a new image or picture, and may display the synthesized image or picture on the display 150.
When selection of an object is sensed in operation 720, the electronic device 101 may, in operation 730, apply an image effect (e.g., blurring) to the other objects except the selected object and display the objects with the image effect applied. While the image or picture is displayed on the display 150, the electronic device 101 may focus on the object in the synthesized image corresponding to an input based on information entered through at least one of the input/output interface 140 and the display 150, while applying an image effect (e.g., blurring) to at least one other object not corresponding to the input and displaying the objects with the image effect applied. When a touch on the display 150 is sensed while the image with the depth information applied is displayed, the electronic device 101 may use the stored depth information to focus on an object within a predetermined area around the position where the touch is sensed, while applying an image effect (e.g., blurring) to at least one object other than the focused object in the previewed image and displaying the objects with the image effect applied. The depth information about an object may be used to apply the image effect. The image effect may include adjusting at least one of blurriness, color, brightness, mosaic, and resolution. When a touch on the display 150 is sensed while the image is displayed, the electronic device 101 may use the stored depth information to focus on a first object corresponding to the place where the touch is sensed, and may apply a relatively light image effect (e.g., blurring) to a second object positioned near the first object, while applying a relatively heavy image effect (e.g., blurring) to a third object relatively far from the first object.
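The distance-graded blurring at the end of this process can be sketched as below; scaling the blur kernel with the depth difference from the focused object is an illustrative assumption of this sketch, not a requirement of the disclosure:

```python
import cv2
import numpy as np

def depth_graded_blur(image, depth_map, focus_depth):
    """Blur each pixel more strongly the farther its depth is from the
    focused object's depth (light blur near it, heavy blur far away)."""
    diff = np.abs(depth_map.astype(np.float32) - focus_depth)
    # Quantize the depth difference into four levels: 0 (sharp) to 3 (heaviest).
    levels = np.minimum((diff / max(float(diff.max()), 1e-6) * 3).astype(int), 3)

    out = image.copy()
    for level in range(1, 4):
        k = level * 10 + 1                       # 11, 21, 31: odd kernel sizes
        blurred = cv2.GaussianBlur(image, (k, k), 0)
        out[levels == level] = blurred[levels == level]
    return out
```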
Figs. 8A and 8B illustrate an example of controlling the focus when a first object is selected from an image displayed on the display, according to an embodiment of the disclosure. Figs. 8C and 8D illustrate an example of controlling the focus when a second object is selected from an image displayed on the display, according to an embodiment of the disclosure.
Referring to Figs. 8A and 8B, the image 810 of Fig. 8A may include a first object 811 corresponding to a person, second objects 812, 813, and 814 corresponding to the background, and a third object 815 corresponding to a person. According to an embodiment of the disclosure, objects having different focuses and respectively included in multiple images may be synthesized into a single image; the image 810 of Fig. 8A is such a synthesized image. The memory 130 of the electronic device 101 may store the image and the depth information about each object included in the image. The depth information may indicate the distance between each object and its neighboring objects, and the user may perceive the image blur through it. For example, when the first object 811 is selected from the image 810 of Fig. 8A (for example, in operation 816), the electronic device 101 may apply an image effect (for example, blur) to the other objects 822, 823, 824, and 825 except for the selected object 811, as shown in the image 820 of Fig. 8B. The electronic device 101 may make an adjustment so as to focus on the objects within a predetermined region of the position where the user input is sensed in the image or picture displayed on the display 150, while defocusing the other objects outside that region. The image 820 of Fig. 8B illustrates an example in which the image effect is applied to the second objects 822, 823, and 824 and the third object 825, other than the first object 811. Fig. 8A illustrates an example in which the first object 811 has undergone selective focusing.
Referring to Figs. 8C and 8D, the image 830 of Fig. 8C may include a first object 831 corresponding to a person, second objects 832, 833, and 834 corresponding to the background, and a third object 835 corresponding to a person. The electronic device 101 may adjust the depth information about the selected object and may apply an image effect (for example, blur) according to the depth information about each object, allowing the user to recognize the image blur through the applied effect. For example, when the third object 835 is selected from the image 830 of Fig. 8C (for example, in operation 836), the electronic device 101 may make an adjustment so as to defocus the other objects 841, 842, 843, and 844 except for the selected object 835, as shown in the image 840 of Fig. 8D. The electronic device 101 may make an adjustment so as to focus on the objects within a predetermined region of the position where the user input is sensed in the image or picture displayed on the display 150, while defocusing the other objects outside that region.
Fig. 9 is a flowchart illustrating a process of applying a lens exchange effect to an image displayed on the display, according to an embodiment of the disclosure.
The process of applying a lens exchange effect to an image displayed on the display according to an embodiment of the disclosure is now described with reference to Fig. 9.
Referring to Fig. 9, in operation 910, the electronic device may display an image including at least one object. The electronic device 101 may display an image or picture captured by an image sensor on the display 150, or may display an image or picture stored in the memory 130. The image or picture may include multiple objects, and each object may have depth information. The electronic device 101 may display a menu for applying a lens effect to the displayed image together with the image. A command to apply the lens effect may be input through the menu or through a separate button or key provided on the electronic device 101.
When a command to apply the lens exchange effect is input in operation 920, the electronic device 101 may display lens exchange-related information in operation 930. The electronic device 101 may receive the command to apply the lens exchange effect through at least one of the input/output interface 140 and the display 150, or may sense the command through a user gesture. Upon sensing the command, the electronic device 101 may display the lens exchange-related information so as to visually provide the lens exchange effect to the user. The electronic device 101 may display information including the attributes of lenses applicable to the image displayed on the display 150. The information may include various items, such as the focal length, aperture setting, or shutter speed of at least one camera included in the electronic device 101, and may also include the focal lengths, aperture settings, and shutter speeds of commercially available lenses.
When a lens is selected in operation 940, the electronic device 101 may, in operation 950, apply the attributes of the selected lens to the displayed image and display the image with the attributes applied. The electronic device 101 may display information including the attributes of lenses applicable to the image displayed on the display 150. When information corresponding to the attributes of a lens is selected from the displayed information, the electronic device 101 may apply an image effect to the displayed image using the selected information. For example, when the user adjusts the aperture setting, the electronic device 101 may apply an image effect that renders a depth of field corresponding to the aperture setting.
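The depth of field rendered for a chosen aperture can be approximated with the standard thin-lens circle-of-confusion model. The sketch below shows one plausible mapping from a lens attribute to a per-object blur amount; converting the blur-circle diameter into a filter kernel size is an assumption of the sketch, not something the disclosure specifies.

```python
def coc_diameter(f_mm, f_number, focus_mm, subject_mm):
    """Thin-lens circle-of-confusion diameter (mm) for an object at
    subject_mm when the simulated lens (focal length f_mm, aperture
    f_number) is focused at focus_mm. Smaller f-numbers (wider
    apertures) yield larger blur circles, i.e. a shallower depth of
    field."""
    aperture_mm = f_mm / f_number
    return (aperture_mm * f_mm * abs(subject_mm - focus_mm)
            / (subject_mm * (focus_mm - f_mm)))

# e.g. a background object 3 m away with a 60 mm f/2.8 lens focused
# at 1.5 m: coc_diameter(60, 2.8, 1500, 3000) -> ~0.45 mm, which an
# implementation might map linearly to a blur kernel radius in pixels.
```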
Fig. 10A illustrates an example of applying a lens exchange effect to an image displayed on the display, according to an embodiment of the disclosure. Figs. 10B, 10C, 10D, 10E, and 10F illustrate examples of applying lens effects to an image displayed on the display, according to embodiments of the disclosure.
Referring to Fig. 10A, the image 1010 of Fig. 10A may include a first object 1011, second objects 1012, 1013, and 1014, and a third object 1015. According to an embodiment of the disclosure, an "all-in-focus" image in which all objects are focused may be displayed; the image 1010 of Fig. 10A is such an image. The image 1010 may include a menu 1016 for applying a lens effect, and the memory 130 of the electronic device 101 may store the image and the depth information about each object included in the image 1010. When the menu 1016 is selected (for example, in operation 1017), the electronic device may display information indicating multiple lens attributes, as shown in the image 1010 of Fig. 10B. The information may include the attributes of a first lens 1021 with, for example, a 10 mm focal length and a 3.5 aperture setting; a second lens 1022 with, for example, a 16 mm focal length and a 2.4 aperture setting; a third lens 1023 with, for example, a 30 mm focal length and a 2.0 aperture setting; a fourth lens 1024 with, for example, a 45 mm focal length and a 1.8 aperture setting; a fifth lens 1025 with, for example, a 60 mm focal length and a 2.8 aperture setting; and a sixth lens 1026 with, for example, an 85 mm focal length and a 1.4 aperture setting. According to an embodiment of the disclosure, the information may also include information about commercially available lenses. When the user selects the fifth lens 1025 with the 60 mm focal length and the 2.8 aperture setting from the information (for example, in operation 1018), the electronic device 101 may apply an image effect to the image using the characteristics of a lens corresponding to the 60 mm focal length and the 2.8 aperture setting, that is, a depth of field corresponding to the selected aperture setting. Upon sensing the selection of the fifth lens 1025, the electronic device 101 may use the stored depth information to apply an image effect (for example, blur) corresponding to the aperture setting of the selected lens, or may crop a portion corresponding to the selected focal length, and may display the result on the display 150, as shown in the image 1030 of Fig. 10C. The image 1030 of Fig. 10C illustrates an example of applying the image effect of a lens with a 60 mm focal length and a 2.8 aperture setting. For example, it can be seen that the first object 1031 is focused while the image effect (for example, blur) is applied to the second objects 1032, 1033, and 1034 and the third object 1035. In addition, a region corresponding to the focal length may be cropped from a specific portion (for example, the center) of the image.
Referring to Figs. 10D, 10E, and 10F, when the menu 1016 is selected from the image 1010 of Fig. 10D (for example, in operation 1019), the electronic device may display information indicating lens attributes, as shown in the image 1040 of Fig. 10E. The information may include values 1042 of a focal length 1041 from 10 mm to 85 mm and values 1044 of an aperture setting 1043 from 1.8 to 22.0. When the user selects a 60 mm focal length and a 2.8 aperture setting from the information, the electronic device 101 may apply blur to the image according to the characteristics of a lens with a 60 mm focal length and a 2.8 aperture setting, and may display the image corresponding to the selection as the image 1050 of Fig. 10F. It can be seen that, in the image 1050 of Fig. 10F, the first object 1051 is focused while the image effect (for example, blur) is applied to the second objects 1052, 1053, and 1054 and the third object 1055. In addition, a region corresponding to the focal length may be cropped from a specific portion (for example, the center) of the image.
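The focal-length crop mentioned in both examples can be modelled as a center crop whose scale is the ratio of the capture focal length to the selected one; the base focal length in the sketch below is an assumed parameter, since the disclosure does not state what the images were captured at.

```python
def crop_for_focal_length(image, base_f_mm, selected_f_mm):
    """Simulate a longer focal length by center-cropping: selecting
    60 mm on imagery captured at an assumed 30 mm equivalent keeps
    the central 30/60 = 1/2 of each dimension; the result would then
    be rescaled to the display size."""
    h, w = image.shape[:2]
    scale = min(base_f_mm / selected_f_mm, 1.0)  # <= 1 for longer lenses
    ch, cw = int(h * scale), int(w * scale)
    top, left = (h - ch) // 2, (w - cw) // 2
    return image[top:top + ch, left:left + cw]
```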
Fig. 11A illustrates an example of displaying an image for applying a lens effect, according to an embodiment of the disclosure. Fig. 11B illustrates an example of lens attribute information for providing an image effect to a displayed image, according to an embodiment of the disclosure. Fig. 11C illustrates an example of providing an image effect by adjusting the f-number of a lens, according to an embodiment of the disclosure. Fig. 11D illustrates an example of providing an image effect by adjusting the shutter speed of a lens, according to an embodiment of the disclosure. Fig. 11E illustrates an example of providing an image effect by adjusting both the f-number and the shutter speed of a lens, according to an embodiment of the disclosure.
Referring to Fig. 11A, the electronic device 101 may display a menu 1111 for showing lens attributes so as to provide an image effect to an image 1110. When the menu 1111 is selected, the electronic device 101 may display an image 1120 together with an aperture menu 1130, a shutter speed menu 1140, a manual menu 1150, and a program menu 1160, as shown in Fig. 11B. When the user selects any one lens, the electronic device 101 may display, on the display functionally connected to it, the aperture menu 1130 for selecting an aperture setting, the shutter speed menu 1140 for adjusting the shutter speed, the manual menu 1150 for allowing the user to manually adjust the shutter speed, and the program menu 1160 for allowing the electronic device 101 to automatically adjust the aperture setting and the shutter speed.
When the aperture menu 1130 is selected from the image 1120 of Fig. 11B, the electronic device 101 may display various aperture settings 1132 on the screen, enabling an aperture setting to be selected on an image 1121, as shown in Fig. 11C. When the user selects aperture priority, the electronic device 101 may display information enabling the selection of an aperture 1131 on the display 150 functionally connected to the electronic device 101. When the user selects any one value from the displayed information, a depth of field corresponding to the selected aperture setting may be rendered through image processing. The electronic device 101 may semi-automatically adjust the shutter speed based on the characteristics of the aperture setting and enable image capture.
When the shutter speed menu 1140 is selected in Fig. 11B, the electronic device 101 may display various shutter speeds 1142 on the screen, allowing a shutter speed to be selected on the image 1121, as shown in Fig. 11D. When the user selects shutter priority, the electronic device 101 may display information 1142 enabling the selection of a shutter speed 1141 on the display 150 functionally connected to the electronic device 101. When the user selects any one value from the displayed information 1142, the electronic device 101 may fix the shutter speed to the selected value and capture the picture accordingly by adjusting the sensor sensitivity. The electronic device 101 may vary the aperture setting while the selected shutter speed remains fixed, and may apply an image effect that provides a depth-of-field change according to the aperture setting.
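The sensitivity adjustment behind shutter priority follows ordinary exposure arithmetic; the sketch below is a hedged illustration of that bookkeeping, with the base exposure values assumed rather than taken from the disclosure.

```python
def iso_for_fixed_shutter(base_iso, base_shutter_s, base_f_number,
                          new_shutter_s, new_f_number):
    """Scale the sensor sensitivity (ISO) so that exposure stays
    constant once the user fixes the shutter speed: halving the
    shutter time costs one stop, doubling the f-number costs two."""
    return (base_iso
            * (base_shutter_s / new_shutter_s)
            * (new_f_number / base_f_number) ** 2)

# e.g. moving from 1/60 s at f/2.8, ISO 100 to a user-picked 1/250 s
# at the same aperture: iso_for_fixed_shutter(100, 1/60, 2.8, 1/250, 2.8)
# -> ~417, so the device would raise the ISO by roughly two stops.
```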
When the manual menu 1150 is selected in Fig. 11B, the electronic device 101 may display, on the display 150 functionally connected to the electronic device 101, information items 1152 and 1154 enabling the selection of an aperture setting 1151 and a shutter speed 1153, as shown in Fig. 11E. When any one value is selected from the displayed information items 1152 and 1154, the electronic device 101 may display, through image processing, an effect as if the depth of field were adjusted depending on the selected value, and may adjust the sensor sensitivity (ISO) to suit the fixed shutter speed and perform image capture.
Fig. 12 is a block diagram illustrating an electronic device according to an embodiment of the disclosure.
The electronic device may include all or part of the configuration of, for example, the electronic device 101 shown in Fig. 1. Referring to Fig. 12, the electronic device 1201 may include one or more application processors (APs) 1210, a communication unit 1220, a subscriber identification module (SIM) card 1224, a memory 1230, a sensor module 1240, an input device 1250, a display 1260, an interface 1270, an audio unit 1280, a camera module 1291, a power management unit 1295, a battery 1296, an indicator 1297, and a motor 1298.
The AP 1210 may control multiple hardware and software components connected thereto by running an operating system or application programs, and may process and compute various data including multimedia data. The AP 1210 may be implemented in, for example, a system on chip (SoC). According to an embodiment of the disclosure, the AP 1210 may further include a graphics processing unit (GPU) (not shown).
The communication unit 1220 (for example, the communication interface 160) may perform data communication with other electronic devices (for example, the electronic device 104 or the server 106) connected to the electronic device 1201 (for example, the electronic device 101) via a network. According to an embodiment of the disclosure, the communication unit 1220 may include a cellular module 1221, a Wi-Fi module 1223, a BT module 1225, a GPS module 1227, an NFC module 1228, and a radio frequency (RF) module 1229.
The cellular module 1221 may provide voice call, video call, text, or Internet services through a communication network (for example, an LTE, LTE-A, CDMA, WCDMA, UMTS, WiBro, or GSM network). The cellular module 1221 may perform identification and authentication of the electronic device in the communication network using, for example, a SIM (for example, the SIM card 1224). According to an embodiment of the disclosure, the cellular module 1221 may perform at least some of the functions providable by the AP 1210; for example, the cellular module 1221 may perform at least some multimedia control functions.
According to an embodiment of the disclosure, the cellular module 1221 may include a communication processor (CP), and may be implemented in, for example, an SoC. Although the cellular module 1221 (for example, the CP), the memory 1230, and the power management unit 1295 are shown as separate from the AP 1210 in Fig. 12, according to an embodiment of the disclosure, the AP 1210 may be configured to include at least some of the above-listed components (for example, the cellular module 1221).
According to an embodiment of the disclosure, the AP 1210 or the cellular module 1221 (for example, the CP) may load commands or data received from a non-volatile memory or other components connected thereto and process the loaded commands or data. The AP 1210 or the cellular module 1221 may store data received from or generated by other components in the non-volatile memory.
The Wi-Fi module 1223, the BT module 1225, the GPS module 1227, and the NFC module 1228 may each include a processor for, for example, processing data communicated through the module. Although the cellular module 1221, the Wi-Fi module 1223, the BT module 1225, the GPS module 1227, and the NFC module 1228 are shown in their respective separate blocks in Fig. 12, at least some of them (for example, two or more) may be included in a single integrated circuit (IC) or an IC package. For example, at least some of the processors respectively corresponding to the cellular module 1221, the Wi-Fi module 1223, the BT module 1225, the GPS module 1227, and the NFC module 1228 (for example, the CP corresponding to the cellular module 1221 and the Wi-Fi processor corresponding to the Wi-Fi module 1223) may be implemented in a single SoC.
The RF module 1229 may communicate data, for example, RF signals. The RF module 1229 may include, for example, a transceiver, a power amplifier module (PAM) (not shown), a frequency filter (not shown), or a low noise amplifier (LNA) (not shown). The RF module 1229 may further include parts (for example, conductors or wires) for communicating radio waves in free space upon performing wireless communication. Although the cellular module 1221, the Wi-Fi module 1223, the BT module 1225, the GPS module 1227, and the NFC module 1228 share a single RF module 1229 in Fig. 12, any of these modules may communicate RF signals through a separate RF module.
The SIM card 1224 may include a SIM and may be inserted into a slot formed at a predetermined position of the electronic device. The SIM card 1224 may contain unique identification information (for example, an integrated circuit card identifier (ICCID)) or subscriber information (for example, an international mobile subscriber identity (IMSI)).
The memory 1230 (for example, the memory 130) may include an internal memory 1232 or an external memory 1234. The internal memory 1232 may include, for example, volatile memory (for example, dynamic RAM (DRAM), static RAM (SRAM), or synchronous dynamic RAM (SDRAM)) or non-volatile memory (for example, one-time programmable ROM (OTPROM), programmable ROM (PROM), erasable and programmable ROM (EPROM), electrically erasable and programmable ROM (EEPROM), mask ROM, flash ROM, NAND flash memory, or NOR flash memory).
According to an embodiment of the disclosure, the internal memory 1232 may be a solid state drive (SSD). The external memory 1234 may include a flash drive, for example, a compact flash (CF) memory, a secure digital (SD) memory, a micro-SD memory, a mini-SD memory, an extreme digital (xD) memory, or a Memory Stick™. The external memory 1234 may be functionally connected with the electronic device 1201 via various interfaces. According to an embodiment of the disclosure, the electronic device 1201 may further include a storage device (or storage medium) such as a hard drive.
The sensor module 1240 may measure a physical quantity or detect an operational state of the electronic device 1201, and may convert the measured or detected information into an electrical signal. The sensor module 1240 may include at least one of, for example, a gesture sensor 1240A, a gyro sensor 1240B, an atmospheric pressure sensor 1240C, a magnetic sensor 1240D, an acceleration sensor 1240E, a grip sensor 1240F, a proximity sensor 1240G, a color sensor 1240H such as a red-green-blue (RGB) sensor, a biometric sensor 1240I, a temperature/humidity sensor 1240J, an illuminance sensor 1240K, or an ultraviolet (UV) sensor 1240M. Additionally or alternatively, the sensor module 1240 may include, although not shown, for example, an e-nose sensor, an electromyography (EMG) sensor, an electroencephalogram (EEG) sensor, an electrocardiogram (ECG) sensor, an IR sensor, an iris sensor, or a fingerprint sensor. The sensor module 1240 may include at least one sensor capable of sensing or recognizing biometric information, such as a hand or foot fingerprint, iris, face, heart rate, brain waves, joints, or pulse. In addition to the above sensors, the sensor module 1240 may include various sensors capable of sensing or recognizing information according to the bending of the user's joints or other biometric information of the user. The sensor module 1240 may further include a control circuit for controlling at least one or more of the sensors included therein.
The input unit 1250 may include a touch panel 1252, a (digital) pen sensor 1254, a key 1256, or an ultrasonic input device 1258. The touch panel 1252 may recognize touch inputs in at least one of capacitive, resistive, infrared, or ultrasonic methods, and may further include a control circuit. With the capacitive method, physical contact or proximity detection is possible. The touch panel 1252 may further include a tactile layer; in this regard, the touch panel 1252 may provide a haptic response to the user.
The (digital) pen sensor 1254 may be implemented, for example, in the same or a similar manner as receiving the user's touch input, or by using a separate sheet for recognition. The key 1256 may include, for example, a physical button, an optical key, or a keypad. The ultrasonic input device 1258 may use an input tool that generates an ultrasonic signal and enable the electronic device 1201 to identify data by sensing the ultrasonic signal with a microphone (for example, the microphone 1288). According to an embodiment of the disclosure, the electronic device 1201 may receive a user input from an external electronic device (for example, a network, a computer, or a server) connected to the electronic device 1201 using the communication unit 1220.
The display 1260 (for example, the display 150) may include a panel 1262, a hologram device 1264, or a projector 1266. The panel 1262 may be, for example, a liquid crystal display (LCD) or an active matrix organic light emitting diode (AMOLED) display, and may be implemented to be flexible, transparent, or wearable. The panel 1262 may also be incorporated with the touch panel 1252 in a single module. The hologram device 1264 may produce three-dimensional (3D) images (holograms) in the air by using light interference. The projector 1266 may display an image by projecting light onto a screen, which may be positioned inside or outside the electronic device 1201. According to an embodiment, the display 1260 may further include a control circuit for controlling the panel 1262, the hologram device 1264, or the projector 1266.
The interface 1270 may include, for example, an HDMI 1272, a USB 1274, an optical interface 1276, or a D-subminiature (D-sub) 1278. The interface 1270 may be included in, for example, the communication interface 160 shown in Fig. 1. Additionally or alternatively, the interface 1270 may include a mobile high-definition link (MHL) interface, an SD card/multimedia card (MMC) interface, or an IrDA standard interface.
The audio unit 1280 may perform various processes (for example, encoding or decoding) related to converting sound waves or audio signals into electrical signals, or vice versa. At least a part of the audio unit 1280 may be included in, for example, the input/output interface 140 shown in Fig. 1. The audio unit 1280 may process sound information input or output through, for example, a speaker 1282, a receiver 1284, an earphone 1286, or the microphone 1288.
The camera module 1291 may be a device for capturing still images and videos, and according to an embodiment of the disclosure may include one or more image sensors (for example, front and rear sensors) (not shown), a lens (not shown), an image signal processor (ISP) (not shown), or a flash such as an LED or a xenon lamp (not shown).
The power management unit 1295 may manage the power supply of the electronic device 1201. Although not shown, the power management unit 1295 may include, for example, a power management IC (PMIC), a charger IC, or a battery or fuel gauge.
The PMIC may be mounted on, for example, an IC or an SoC. Charging methods may be divided into wired and wireless charging. The charger IC may charge the battery and prevent overvoltage or overcurrent from being introduced from the charger. According to an embodiment of the disclosure, the charger IC may be used for at least one of a cable charging scheme and a wireless charging scheme. The wireless charging scheme may include, for example, a magnetic resonance scheme, a magnetic induction scheme, or an electromagnetic wave-based scheme, and additional circuits such as a coil loop, a resonance circuit, or a rectifier may be added for wireless charging.
The battery gauge may measure the remaining capacity, voltage, current, or temperature of the battery 1296 while the battery 1296 is being charged. The battery 1296 may store or generate electricity and supply power to the electronic device 1201 with the stored or generated electricity. The battery 1296 may include, for example, a rechargeable battery or a solar battery.
The indicator 1297 may indicate a particular state of the electronic device 1201 or a part thereof (for example, the AP 1210), including, for example, a booting state, a message state, or a recharging state. The motor 1298 may convert an electrical signal into a mechanical vibration. Although not shown, a processing circuit for supporting mobile TV, such as a GPU, may be included in the electronic device 1201. The processing circuit for supporting mobile TV may process media data conforming to standards such as digital multimedia broadcasting (DMB), digital video broadcasting (DVB), or MediaFLO™.
Each of the aforementioned components of the electronic device may include one or more parts, and the names of the components may vary with the type of electronic device. The electronic device according to various embodiments of the disclosure may include at least one of the aforementioned components, may omit some of them, or may include other additional components. Some of the components may be combined into one entity that performs the same functions as the individual components.
Fig. 13 illustrates a communication protocol 1300 between multiple electronic devices (for example, a first electronic device 1310 and a second electronic device 1330) according to an embodiment of the disclosure.
Referring to Fig. 13, the communication protocol 1300 may include, for example, a device discovery protocol 1351, a capability exchange protocol 1353, a network protocol 1355, and an application protocol 1357.
According to an embodiment of the disclosure, the device discovery protocol 1351 may be a protocol by which each electronic device (for example, the first electronic device 1310 or the second electronic device 1330) detects an external electronic device with which it can communicate or links itself to the detected external electronic device. For example, the first electronic device 1310 (for example, the electronic device 101) may detect the second electronic device 1330 (for example, the electronic device 104) through a communication scheme available on the first electronic device 1310 (for example, Wi-Fi, BT, or USB) using the device discovery protocol 1351. The first electronic device 1310 may obtain and store identification information about the second electronic device 1330 detected using the device discovery protocol 1351, in order to establish a communication link with the second electronic device 1330. The first electronic device 1310 may establish the communication link with the second electronic device 1330 based on, for example, at least the identification information.
According to an embodiment of the disclosure, the device discovery protocol 1351 may be a protocol for mutual authentication between multiple electronic devices. For example, the first electronic device 1310 may perform authentication between the first electronic device 1310 and the second electronic device 1330 based on at least communication information for linking with the second electronic device 1330 (for example, a media access control (MAC) address, a universally unique identifier (UUID), a service set identifier (SSID), or an Internet Protocol (IP) address).
According to an embodiment of the disclosure, the capability exchange protocol 1353 may be a protocol for exchanging information related to service capabilities supportable by at least one of the first electronic device 1310 and the second electronic device 1330. For example, the first electronic device 1310 and the second electronic device 1330 may exchange information about the currently provided service capabilities through the capability exchange protocol 1353. The exchangeable information may include identification information indicating particular services supportable by the first electronic device 1310 and the second electronic device 1330. For example, the first electronic device 1310 may receive, from the second electronic device 1330 through the capability exchange protocol 1353, identification information about a particular service provided by the second electronic device 1330. In this case, based on the received identification information, the first electronic device 1310 may determine whether it can support the particular service.
According to an embodiment of the disclosure, the network protocol 1355 may be a protocol for controlling the flow of data communicated between electronic devices (for example, the first electronic device 1310 and the second electronic device 1330) communicably connected with each other, for example, so that the electronic devices can provide services while interworking with each other. For example, at least one of the first electronic device 1310 and the second electronic device 1330 may perform error control or data quality control using the network protocol 1355. Additionally or alternatively, the network protocol 1355 may determine the transmission format of data communicated between the first electronic device 1310 and the second electronic device 1330. At least one of the first electronic device 1310 and the second electronic device 1330 may use the network protocol 1355 to at least manage sessions (for example, session connection or session termination) for the data exchange between them.
According to an embodiment of the disclosure, the application protocol 1357 may be a protocol for providing procedures or information for exchanging data related to services provided to an external electronic device. For example, the first electronic device 1310 (for example, the electronic device 101) may provide a service to the second electronic device 1330 (for example, the electronic device 104 or the server 106) through the application protocol 1357.
According to an embodiment of the disclosure, the communication protocol 1300 may include standard communication protocols, protocols designed by individuals or organizations (for example, communication device/system manufacturers or network providers), or combinations thereof.
The term "module" may refer to a unit including one of hardware, software, and firmware, or a combination thereof. The term "module" may be used interchangeably with "unit," "logic," "logical block," "component," or "circuit." A module may be a minimum unit or part of an integrated component, or a minimum unit or part that performs one or more functions. A module may be implemented mechanically or electronically. For example, a module may include at least one of an application-specific integrated circuit (ASIC) chip, a field-programmable gate array (FPGA), or a programmable logic array (PLA) that performs certain operations, whether already known or to be developed in the future.
At least a part of the device (for example, modules or their functions) or method (for example, operations) may be implemented as instructions stored in a non-transitory computer-readable storage medium, for example, in the form of a programming module. When executed by one or more processors (for example, the processor 120), the instructions may cause the processors to perform corresponding functions. The non-transitory computer-readable storage medium may be, for example, the memory 130. At least a part of the programming module may be implemented (for example, executed) by, for example, the processor 120, and may include, for example, modules, programs, routines, instruction sets, or processes for performing one or more functions.
The non-transitory computer-readable storage medium may include hardware devices configured to store and execute program commands (for example, programming modules), such as magnetic media including hard disks, floppy disks, and magnetic tape, optical media such as compact disc ROMs (CD-ROMs) and DVDs, magneto-optical media such as floptical disks, ROMs, RAMs, and flash memories. Examples of program commands may include not only machine language code but also high-level language code executable by various computing means using an interpreter. The aforementioned hardware devices may be configured to operate as one or more software modules to carry out various embodiments of the disclosure, and vice versa.
Modules or programming modules according to various embodiments of the disclosure may include at least one or more of the aforementioned components, may omit some of them, or may further include other additional components. Operations performed by modules, programming modules, or other components according to various embodiments of the disclosure may be carried out sequentially, simultaneously, repeatedly, or heuristically. Furthermore, some operations may be performed in a different order, omitted, or supplemented with other additional operations. According to an embodiment of the disclosure, there may be provided a storage medium storing commands configured to be executed by at least one processor to cause the at least one processor to perform at least one operation, the at least one operation including: a first command set obtaining an image through an image sensor; a second command set extracting depth information using multiple images obtained through an array camera; a third command set applying the extracted depth information to the obtained image and previewing the image with the depth information applied; and a fourth command set upscaling the extracted depth information in response to an input for capturing the previewed image, applying the upscaled depth information to the obtained image, and capturing the image with the upscaled depth information applied.
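One plausible reading of the fourth command set's upscaling step is resampling a depth map estimated at preview resolution up to the full capture resolution before it is applied; the nearest-neighbor sketch below is an assumption chosen because it preserves depth edges, not the claimed method.

```python
import numpy as np

def upscale_depth(depth, out_h, out_w):
    """Nearest-neighbor upscaling of a low-resolution depth map (for
    example, one estimated from array-camera preview frames) to the
    capture resolution out_h x out_w."""
    rows = np.arange(out_h) * depth.shape[0] // out_h
    cols = np.arange(out_w) * depth.shape[1] // out_w
    return depth[rows[:, None], cols[None, :]]
```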
As is apparent from the foregoing description, according to embodiments of the disclosure, images may be synthesized using the depth information about multiple images, enabling real-time image preview and capture.
Further, according to embodiments of the disclosure, multiple images may be synthesized, previewed in real time, and captured, thereby increasing user convenience.
While the disclosure has been shown and described with reference to various embodiments thereof, it will be understood by those skilled in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the disclosure as defined by the appended claims and their equivalents.

Claims (20)

1. A method for processing an image by an electronic device, the method comprising:
obtaining a first image and a second image through a first image sensor;
extracting depth information from at least one third image obtained through a second image sensor;
applying the extracted depth information to the obtained first image and displaying the first image; and
applying the extracted depth information to the obtained second image.
2. The method of claim 1, wherein applying the depth information to at least one of the obtained first image and the obtained second image comprises: scaling the extracted depth information to fit a size of at least one of the obtained first image and the obtained second image; and applying the scaled depth information to at least one of the obtained first image and the obtained second image.
3. The method of claim 1, wherein the displaying comprises applying an image effect to at least a portion of the obtained first image using the extracted depth information.
4. The method of claim 1, wherein applying the extracted depth information to the obtained second image is performed during a period including at least one of before an input for capturing the displayed first image is sensed and after the input for capturing the displayed first image is sensed.
5. The method of claim 1, wherein the second image sensor comprises at least one of image sensors included in an array camera, a stereoscopic camera, a time-of-flight (TOF) sensor, a structured light sensor, and an infrared (IR) sensor.
6. The method of claim 5, wherein the depth information is extracted from an image obtained by at least one of the image sensors included in the array camera, the stereoscopic camera, the TOF sensor, the structured light sensor, and the IR sensor.
7. The method of claim 1, further comprising storing at least one of the depth information and the image to which the depth information is applied.
8. The method of claim 3, wherein the image effect adjusts at least one of blur, color, brightness, mosaic, and resolution.
9. The method of claim 1, further comprising:
when a touch is sensed while the first image is displayed, focusing on an object within a predetermined region of a position where the touch is sensed; and
applying an image effect to at least one object in the displayed first image other than the object.
10. The method of claim 1, further comprising:
when an input for applying a lens exchange effect to the displayed first image is sensed, displaying lens exchange-related information; and
applying an image effect to the displayed first image using selected information corresponding to at least one information item selected from the displayed information.
11. The method of claim 10, wherein the information is information for adjusting attributes of a lens and comprises at least one of an aperture setting, a shutter speed, a manual setting, and a program.
12. An electronic device for processing an image, the electronic device comprising:
an image obtaining module comprising:
a first image sensor configured to obtain a first image and a second image, and
a second image sensor configured to obtain at least one third image;
an image processor configured to:
extract depth information from the at least one third image,
apply the extracted depth information to the obtained first image and display the first image, and
apply the extracted depth information to the obtained second image; and
a display configured to display the obtained first image.
13. The electronic device of claim 12, wherein the image processor is configured to:
scale the extracted depth information to fit a size of the image; and
apply the scaled depth information to the image.
14. The electronic device of claim 12, wherein the image processor is configured to apply an image effect to at least a portion of the obtained first image using the extracted depth information.
15. The electronic device of claim 12, wherein the image processor is configured to apply the extracted depth information to the obtained second image during a period including at least one of before an input for capturing the displayed first image is sensed and after the input for capturing the displayed first image is sensed.
16. The electronic device of claim 12, wherein the second image sensor comprises at least one of image sensors included in an array camera, a stereoscopic camera, a time-of-flight (TOF) sensor, a structured light sensor, and an infrared (IR) sensor.
17. The electronic device of claim 16, wherein the image processor is configured to extract the depth information from an image obtained by at least one of the image sensors included in the array camera, the stereoscopic camera, the TOF sensor, the structured light sensor, and the IR sensor.
18. The electronic device of claim 12, further comprising a memory configured to store at least one of the depth information and the image to which the depth information is applied.
19. The electronic device of claim 12, wherein, when a touch is sensed while the first image is displayed, the image processor is configured to focus on an object within a predetermined region of a position where the touch is sensed, and to apply an image effect to at least one object in the displayed first image other than the object.
20. The electronic device of claim 12, wherein, when an input for applying a lens exchange effect to the displayed first image is sensed, the image processor is configured to display lens exchange-related information, and to apply an image effect to the displayed first image using selected information corresponding to at least one information item selected from the displayed information.
CN201510696218.XA 2014-10-23 2015-10-23 Electronic device and method for processing image Active CN105554369B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020140144398A KR102251483B1 (en) 2014-10-23 2014-10-23 Electronic device and method for processing image
KR10-2014-0144398 2014-10-23

Publications (2)

Publication Number Publication Date
CN105554369A true CN105554369A (en) 2016-05-04
CN105554369B CN105554369B (en) 2020-06-23

Family

ID=55761193

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510696218.XA Active CN105554369B (en) 2014-10-23 2015-10-23 Electronic device and method for processing image

Country Status (6)

Country Link
US (4) US9990727B2 (en)
EP (1) EP3210376A4 (en)
KR (1) KR102251483B1 (en)
CN (1) CN105554369B (en)
AU (1) AU2015337185B2 (en)
WO (1) WO2016064248A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108040207A (en) * 2017-12-18 2018-05-15 信利光电股份有限公司 A kind of image processing method, device, equipment and computer-readable recording medium
CN110611755A (en) * 2018-05-29 2019-12-24 广州印芯半导体技术有限公司 Image sensing system and multifunctional image sensor thereof
CN113965678A (en) * 2016-06-30 2022-01-21 三星电子株式会社 Electronic device and image capturing method thereof

Families Citing this family (61)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9826159B2 (en) 2004-03-25 2017-11-21 Clear Imaging Research, Llc Method and apparatus for implementing a digital graduated filter for an imaging apparatus
US10721405B2 (en) 2004-03-25 2020-07-21 Clear Imaging Research, Llc Method and apparatus for implementing a digital graduated filter for an imaging apparatus
US8331723B2 (en) 2004-03-25 2012-12-11 Ozluturk Fatih M Method and apparatus to correct digital image blur due to motion of subject or imaging device
TWI439960B (en) 2010-04-07 2014-06-01 Apple Inc Avatar editing environment
EP2890125B1 (en) 2013-12-24 2021-10-13 Sony Depthsensing Solutions A time-of-flight camera system
CN111371986A (en) * 2015-03-16 2020-07-03 深圳市大疆创新科技有限公司 Apparatus and method for focus adjustment and depth map determination
US9979890B2 (en) 2015-04-23 2018-05-22 Apple Inc. Digital viewfinder user interface for multiple cameras
US9681111B1 (en) 2015-10-22 2017-06-13 Gopro, Inc. Apparatus and methods for embedding metadata into video stream
US9667859B1 (en) 2015-12-28 2017-05-30 Gopro, Inc. Systems and methods for determining preferences for capture settings of an image capturing device
JP6637767B2 (en) * 2016-01-05 2020-01-29 キヤノン株式会社 Electronic apparatus, control method thereof, and remote imaging system
US9922387B1 (en) 2016-01-19 2018-03-20 Gopro, Inc. Storage of metadata and images
US9967457B1 (en) 2016-01-22 2018-05-08 Gopro, Inc. Systems and methods for determining preferences for capture settings of an image capturing device
US9665098B1 (en) 2016-02-16 2017-05-30 Gopro, Inc. Systems and methods for determining preferences for flight control settings of an unmanned aerial vehicle
US9912860B2 (en) 2016-06-12 2018-03-06 Apple Inc. User interface for camera effects
CN105974427B (en) * 2016-06-24 2021-05-04 上海图漾信息科技有限公司 Structured light distance measuring device and method
KR102529120B1 (en) * 2016-07-15 2023-05-08 삼성전자주식회사 Method and device for acquiring image and recordimg medium thereof
US10564740B2 (en) 2016-07-21 2020-02-18 Samsung Electronics Co., Ltd. Pen device—panel interaction based on electromagnetic signals output by the pen device
KR102488563B1 (en) * 2016-07-29 2023-01-17 삼성전자주식회사 Apparatus and Method for Processing Differential Beauty Effect
KR102622754B1 (en) * 2016-09-07 2024-01-10 삼성전자주식회사 Method for image composition and electronic device supporting the same
US10234524B2 (en) * 2016-10-18 2019-03-19 Siemens Healthcare Gmbh Shifted pulses for simultaneous multi-slice imaging
US9973792B1 (en) 2016-10-27 2018-05-15 Gopro, Inc. Systems and methods for presenting visual information during presentation of a video segment
EP3525447B1 (en) * 2016-10-28 2023-10-25 Huawei Technologies Co., Ltd. Photographing method for terminal, and terminal
US10719927B2 (en) * 2017-01-04 2020-07-21 Samsung Electronics Co., Ltd. Multiframe image processing using semantic saliency
US10187607B1 (en) 2017-04-04 2019-01-22 Gopro, Inc. Systems and methods for using a variable capture frame rate for video capture
KR102390184B1 (en) 2017-04-26 2022-04-25 삼성전자주식회사 Electronic apparatus and method for displaying image thereof
WO2018214067A1 (en) 2017-05-24 2018-11-29 SZ DJI Technology Co., Ltd. Methods and systems for processing an image
DK180859B1 (en) * 2017-06-04 2022-05-23 Apple Inc USER INTERFACE CAMERA EFFECTS
CN107493427A (en) * 2017-07-31 2017-12-19 广东欧珀移动通信有限公司 Focusing method, device and the mobile terminal of mobile terminal
KR102338576B1 (en) * 2017-08-22 2021-12-14 삼성전자주식회사 Electronic device which stores depth information associating with image in accordance with Property of depth information acquired using image and the controlling method thereof
CN108055452B (en) * 2017-11-01 2020-09-18 Oppo广东移动通信有限公司 Image processing method, device and equipment
CN107959778B (en) 2017-11-30 2019-08-20 Oppo广东移动通信有限公司 Imaging method and device based on dual camera
US11112964B2 (en) 2018-02-09 2021-09-07 Apple Inc. Media capture lock affordance for graphical user interface
US10713756B2 (en) * 2018-05-01 2020-07-14 Nvidia Corporation HW-assisted upscaling and multi-sampling using a high resolution depth buffer
US11012694B2 (en) 2018-05-01 2021-05-18 Nvidia Corporation Dynamically shifting video rendering tasks between a server and a client
US11722764B2 (en) * 2018-05-07 2023-08-08 Apple Inc. Creative camera
DK201870374A1 (en) 2018-05-07 2019-12-04 Apple Inc. Avatar creation user interface
US10375313B1 (en) 2018-05-07 2019-08-06 Apple Inc. Creative camera
DK201870623A1 (en) 2018-09-11 2020-04-15 Apple Inc. User interfaces for simulated depth effects
EP3627511A1 (en) * 2018-09-21 2020-03-25 Samsung Electronics Co., Ltd. Method and system for automatically adding effect while recording
US11770601B2 (en) 2019-05-06 2023-09-26 Apple Inc. User interfaces for capturing and managing visual media
US10645294B1 (en) 2019-05-06 2020-05-05 Apple Inc. User interfaces for capturing and managing visual media
US11128792B2 (en) 2018-09-28 2021-09-21 Apple Inc. Capturing and displaying images with multiple focal planes
US11321857B2 (en) 2018-09-28 2022-05-03 Apple Inc. Displaying and editing images with depth information
JP7171406B2 (en) * 2018-12-12 2022-11-15 キヤノン株式会社 Electronic device and its control method
US10902265B2 (en) * 2019-03-27 2021-01-26 Lenovo (Singapore) Pte. Ltd. Imaging effect based on object depth information
KR20200117562A (en) * 2019-04-04 2020-10-14 삼성전자주식회사 Electronic device, method, and computer readable medium for providing bokeh effect in video
US11706521B2 (en) 2019-05-06 2023-07-18 Apple Inc. User interfaces for capturing and managing visual media
US11786694B2 (en) 2019-05-24 2023-10-17 NeuroLight, Inc. Device, method, and app for facilitating sleep
WO2020247109A1 (en) * 2019-06-06 2020-12-10 Applied Materials, Inc. Imaging system and method of creating composite images
US20210044742A1 (en) * 2019-08-05 2021-02-11 Facebook Technologies, Llc Dynamically programmable image sensor
CN111246106B (en) * 2020-01-22 2021-08-03 维沃移动通信有限公司 Image processing method, electronic device, and computer-readable storage medium
KR20210098292A (en) * 2020-01-31 2021-08-10 삼성전자주식회사 Electronic device including camera and operating method thereof
CN113359950A (en) * 2020-03-06 2021-09-07 苹果公司 Housing structure for handheld electronic device
DK202070625A1 (en) 2020-05-11 2022-01-04 Apple Inc User interfaces related to time
US11921998B2 (en) 2020-05-11 2024-03-05 Apple Inc. Editing features of an avatar
US11039074B1 (en) 2020-06-01 2021-06-15 Apple Inc. User interfaces for managing media
US11212449B1 (en) * 2020-09-25 2021-12-28 Apple Inc. User interfaces for media capture and management
US11778339B2 (en) 2021-04-30 2023-10-03 Apple Inc. User interfaces for altering visual media
US11539876B2 (en) 2021-04-30 2022-12-27 Apple Inc. User interfaces for altering visual media
US11776190B2 (en) 2021-06-04 2023-10-03 Apple Inc. Techniques for managing an avatar on a lock screen
US11632601B1 (en) * 2021-11-11 2023-04-18 Qualcomm Incorporated User interface for camera focus


Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100891549B1 (en) * 2007-05-22 2009-04-03 광주과학기술원 Method and apparatus for generating depth information supplemented using depth-range camera, and recording medium storing program for performing the method thereof
JP5124372B2 (en) 2008-07-10 2013-01-23 株式会社リコー Image processing apparatus, image processing method, and digital still camera
KR101629459B1 (en) 2009-11-05 2016-06-10 삼성전자주식회사 Image processing apparatus and method
JP2012044564A (en) 2010-08-20 2012-03-01 Sanyo Electric Co Ltd Imaging apparatus
KR20120060549A (en) * 2010-12-02 2012-06-12 전자부품연구원 Mobile Terminal And Method Of Providing Video Call Using Same
KR101862199B1 (en) * 2012-02-29 2018-05-29 삼성전자주식회사 Method and Fusion system of time-of-flight camera and stereo camera for reliable wide range depth acquisition
KR20140067253A (en) * 2012-11-26 2014-06-05 삼성전자주식회사 Image processing apparatus and method thereof
KR102124802B1 (en) * 2013-06-04 2020-06-22 엘지전자 주식회사 Mobile terminal and control method for the mobile terminal

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110298898A1 (en) * 2010-05-11 2011-12-08 Samsung Electronics Co., Ltd. Three dimensional image generating system and method accomodating multi-view imaging
US20120050480A1 (en) * 2010-08-27 2012-03-01 Nambi Seshadri Method and system for generating three-dimensional video utilizing a monoscopic camera
CN102314683A (en) * 2011-07-15 2012-01-11 清华大学 Computational imaging method and imaging system based on nonplanar image sensor
EP2579572A1 (en) * 2011-10-07 2013-04-10 LG Electronics A mobile terminal and method for generating an out-of-focus image
EP2683169A2 (en) * 2012-07-03 2014-01-08 Woodman Labs, Inc. Image blur based on 3D depth information
KR101391095B1 (en) * 2012-11-14 2014-05-07 한양대학교 에리카산학협력단 Method and apparatus for improving stereo vision image using depth information of an image

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113965678A (en) * 2016-06-30 2022-01-21 三星电子株式会社 Electronic device and image capturing method thereof
CN113965678B (en) * 2016-06-30 2024-04-02 三星电子株式会社 Electronic device and image capturing method thereof
CN108040207A (en) * 2017-12-18 2018-05-15 信利光电股份有限公司 A kind of image processing method, device, equipment and computer-readable recording medium
CN110611755A (en) * 2018-05-29 2019-12-24 广州印芯半导体技术有限公司 Image sensing system and multifunctional image sensor thereof

Also Published As

Publication number Publication date
US9990727B2 (en) 2018-06-05
US11455738B2 (en) 2022-09-27
AU2015337185A1 (en) 2017-03-23
KR20160047891A (en) 2016-05-03
KR102251483B1 (en) 2021-05-14
CN105554369B (en) 2020-06-23
US20180276833A1 (en) 2018-09-27
US10970865B2 (en) 2021-04-06
US20200027226A1 (en) 2020-01-23
AU2015337185B2 (en) 2019-06-13
EP3210376A4 (en) 2017-10-18
US20160117829A1 (en) 2016-04-28
EP3210376A1 (en) 2017-08-30
US20210225019A1 (en) 2021-07-22
US10430957B2 (en) 2019-10-01
WO2016064248A1 (en) 2016-04-28

Similar Documents

Publication Publication Date Title
CN105554369A (en) Electronic device and method for processing image
KR102444085B1 (en) Portable communication apparatus and method for displaying images thereof
CN108289161B (en) Electronic device and image capturing method thereof
US10003785B2 (en) Method and apparatus for generating images
CN109644229B (en) Method for controlling camera and electronic device thereof
US9794441B2 (en) Electronic device using composition information of picture and shooting method using the same
US20180070023A1 (en) Image composition method and electronic device for supporting the same
US10440262B2 (en) Electronic device and method for processing image
EP3687157A1 (en) Method for capturing images and electronic device
KR20160055337A (en) Method for displaying text and electronic device thereof
KR20150141426A (en) Electronic device and method for processing an image in the electronic device
KR20150134916A (en) Electronic device and method for controlling photograph function in electronic device

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant