CN108632529B - Electronic device providing a graphical indicator for a focus and method of operating an electronic device - Google Patents

Info

Publication number
CN108632529B
Authority
CN
China
Prior art keywords
electronic device
region
image
graphical indicator
displaying
Prior art date
Legal status
Active
Application number
CN201810249601.4A
Other languages
Chinese (zh)
Other versions
CN108632529A (en)
Inventor
李承翰
千钟爀
金杓宰
尹泳权
金汶洙
元钟勋
李基赫
Current Assignee
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date
Filing date
Publication date
Application filed by Samsung Electronics Co Ltd
Publication of CN108632529A
Application granted
Publication of CN108632529B

Classifications

    • H: Electricity > H04: Electric communication technique > H04N: Pictorial communication, e.g., television > H04N 23/00: Cameras or camera modules comprising electronic image sensors; control thereof > H04N 23/60: Control of cameras or camera modules
        • H04N 23/62: Control of parameters via user interfaces
        • H04N 23/63: Control of cameras or camera modules by using electronic viewfinders
            • H04N 23/633: Displaying additional information relating to control or operation of the camera
                • H04N 23/635: Region indicators; field of view indicators
        • H04N 23/67: Focus control based on electronic image sensor signals
            • H04N 23/675: Focus control comprising setting of focusing regions
        • H04N 23/69: Control of means for changing angle of the field of view, e.g., optical zoom objectives or electronic zooming
        • H04N 23/70: Circuitry for compensating brightness variation in the scene
            • H04N 23/741: Increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors
    • G: Physics > G06: Computing; calculating or counting > G06T: Image data processing or generation, in general > G06T 7/00: Image analysis
        • G06T 7/50: Depth or shape recovery

Abstract

An electronic device is provided. The electronic device includes a camera, a display, a memory, and at least one processor. The at least one processor may be configured to: generating depth information corresponding to at least one subject based on information about pixels included in an image that includes the at least one subject and is obtained using the camera; determining, based on the depth information and in a case where a focus of the camera is controlled, a first region, of the image displayed on the display, in which a first graphical indicator indicating a focus region on which the camera is focused is displayed; and displaying the first graphical indicator on at least a portion of the at least one subject included in the first region.

Description

Electronic device providing a graphical indicator for a focus and method of operating an electronic device
Technical Field
The present disclosure relates to an electronic device. More particularly, the present disclosure relates to an electronic device for providing a graphical indicator for a focus and a method of operating the electronic device.
Background
In a manual focus operation for manually controlling the focus of a camera, an electronic device including the camera provides a graphical indicator, such as focus peaking, on a subject included in the focus area, so that the focus area can be recognized and the focus can be controlled manually with ease.
According to the related art, an electronic device including a camera uses contrast information of an image to provide a graphical indicator for focus peaking on a subject included in the focus area. However, a method using contrast information may display the graphical indicator at inaccurate positions depending on the edge information, the contrast information, and/or the illuminance and brightness of the subject being imaged, and thus may reduce accuracy and usability.
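For illustration only, the related-art contrast-based approach can be sketched as a simple gradient threshold. This is a hedged reconstruction, not the disclosure's method; the function name and threshold value are assumptions.

    import numpy as np

    def contrast_peaking_mask(gray: np.ndarray, threshold: float = 30.0) -> np.ndarray:
        # Related-art sketch: mark pixels whose local contrast (gradient
        # magnitude) exceeds a threshold. The result depends on edge and
        # illumination conditions, which is the weakness noted above.
        gy, gx = np.gradient(gray.astype(np.float32))
        return np.hypot(gx, gy) > threshold  # boolean mask of "peaked" pixels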
The above information is presented as background information only to assist with an understanding of the present disclosure. No determination has been made, and no assertion is made, as to whether any of the above might be applicable as prior art with regard to the present disclosure.
Disclosure of Invention
Various aspects of the present disclosure are to address at least the above-mentioned problems and/or disadvantages and to provide at least the advantages described below. Accordingly, an aspect of the present disclosure is to provide an electronic device, and a method of operating the same, that provide a graphical indicator on a subject included in a focus area using depth information of an image when the focus of a camera is manually operated, so that the focus area can be identified.
According to one aspect of the present disclosure, an electronic device is provided. The electronic device includes a camera, a display, a memory, and at least one processor. The at least one processor may be configured to: generating depth information corresponding to at least one subject based on information about pixels included in an image that includes the at least one subject and is obtained using the camera; determining, based on the depth information and in a case where a focus of the camera is controlled, a first region, of the image displayed on the display, in which a first graphical indicator indicating a focus region on which the camera is focused is displayed; and displaying the first graphical indicator on at least a portion of the subject included in the first region.
According to another aspect of the present disclosure, a method of operating an electronic device is provided. The method includes: generating depth information corresponding to at least one subject based on information about pixels included in an image that includes the at least one subject and is obtained using a camera included in the electronic device; determining, based on the depth information and in a case where a focus of the camera is controlled, a first region, of the image displayed on a display, in which a first graphical indicator for indicating a focus region on which the camera is focused is displayed; and displaying the first graphical indicator on at least a portion of the subject included in the first region.
According to another aspect of the present disclosure, an electronic device is provided. The electronic device includes a camera, a display, a memory, and at least one processor. The at least one processor may be configured to: generating depth information corresponding to at least one subject based on information about pixels included in an image that includes the at least one subject and is obtained using the camera; determining, according to an input signal, a focus area on which the camera focuses in an image displayed on the display; and displaying a first graphical indicator on at least a portion of the subject included in the focus area based on the depth information.
According to various embodiments of the present disclosure, when the manual focus of a camera is operated, the electronic device displays a graphical indicator on a subject included in the focus area using depth information of the image. Therefore, the focus area can be identified accurately and efficiently.
Other aspects, advantages and salient features of the disclosure will become apparent to those skilled in the art from the following detailed description, which, taken in conjunction with the annexed drawings, discloses various embodiments of the disclosure.
Drawings
The above and other aspects, features and advantages of particular embodiments of the present disclosure will become more apparent from the following description taken in conjunction with the accompanying drawings, in which:
FIG. 1 is a block diagram illustrating an electronic device and a network in accordance with various embodiments of the present disclosure;
FIG. 2 is a block diagram of an electronic device according to various embodiments of the present disclosure;
FIG. 3 is a block diagram of program modules according to various embodiments of the present disclosure;
FIG. 4 is a block diagram schematically illustrating an electronic device according to various embodiments of the present disclosure;
FIGS. 5A and 5B are flow diagrams illustrating operation of an electronic device according to various embodiments of the present disclosure;
FIG. 6 is a flow chart illustrating operation of an electronic device according to various embodiments of the present disclosure;
FIGS. 7A, 7B, and 7C illustrate user interfaces for describing graphical indicators provided from an electronic device, according to various embodiments of the present disclosure;
FIG. 8 is a flow chart illustrating operation of an electronic device according to various embodiments of the present disclosure;
FIG. 9 illustrates a user interface for describing graphical indicators provided from an electronic device, in accordance with various embodiments of the present disclosure;
FIG. 10 is a flow chart illustrating operation of an electronic device according to various embodiments of the present disclosure;
FIGS. 11A, 11B, 11C, 11D, 11E, and 11F illustrate graphical user interfaces provided from an electronic device, according to various embodiments of the present disclosure;
FIG. 12 is a flow chart illustrating operation of an electronic device according to various embodiments of the present disclosure;
FIGS. 13A, 13B, 13C, 13D, and 13E illustrate user interfaces for describing graphical indicators provided from an electronic device, according to various embodiments of the present disclosure;
FIG. 14 is a flow chart illustrating operation of an electronic device according to various embodiments of the present disclosure; and
FIGS. 15A, 15B, and 15C illustrate user interfaces for describing graphical indicators provided from an electronic device, according to various embodiments of the present disclosure.
It should be noted that throughout the drawings, like reference numerals are used to describe the same or similar elements, features and structures.
Detailed Description
The following description with reference to the accompanying drawings is provided to assist in a comprehensive understanding of various embodiments of the present disclosure as defined by the claims and their equivalents. It includes various specific details to assist in that understanding, but these are to be regarded as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the various embodiments described herein can be made without departing from the scope and spirit of the present disclosure. In addition, descriptions of well-known functions and constructions may be omitted for clarity and conciseness.
The terms and words used in the following description and claims are not limited to a literal meaning, but are used only by the inventors to enable a clear and consistent understanding of the disclosure. Accordingly, it should be apparent to those skilled in the art that the following description of the various embodiments of the present disclosure is provided for illustration only and not for the purpose of limiting the disclosure as defined by the appended claims and their equivalents.
It should be understood that the singular forms "a," "an," and "the" include plural referents unless the context clearly dictates otherwise. Thus, for example, reference to a "component surface" includes reference to one or more of such surfaces.
The term "substantially" means that the recited feature, parameter, or value need not be achieved exactly, but that deviations or variations, including such things as tolerances, measurement error, measurement accuracy limitations, and other factors known to those skilled in the art, may occur in amounts that do not preclude the effect that the feature is intended to provide.
The expression "configured to" as used in various embodiments may be used interchangeably in terms of hardware or software, with for example "adapted to", "having. Alternatively, in some cases, the expression "a device configured.. may mean that the device is" capable. "along with other devices or components. For example, the phrase "a processor adapted (or configured) to perform A, B and C" may mean a dedicated processor (e.g., an embedded processor) for performing the corresponding operations only, or a general-purpose processor (e.g., a Central Processing Unit (CPU) or an Application Processor (AP)) that may perform the corresponding operations by executing one or more software programs stored in a memory device.
An electronic device according to various embodiments may include at least one of, for example, a smartphone, a tablet personal computer (PC), a mobile phone, a video phone, an electronic book reader (e-book reader), a desktop PC, a laptop PC, a netbook computer, a workstation, a server, a personal digital assistant (PDA), a portable multimedia player (PMP), a moving picture experts group (MPEG-1 or MPEG-2) audio layer-3 (MP3) player, a mobile medical device, a camera, and a wearable device. According to various embodiments of the present disclosure, the wearable device may include at least one of: an accessory type (e.g., a watch, a ring, a bracelet, an anklet, a necklace, glasses, a contact lens, or a head-mounted device (HMD)), a fabric- or clothes-integrated type (e.g., electronic clothes), a body-mounted type (e.g., a skin pad or a tattoo), and a bio-implantable type (e.g., an implantable circuit). In embodiments of the present disclosure, the electronic device may include at least one of, for example, a television, a digital video disc (DVD) player, an audio device, a refrigerator, an air conditioner, a vacuum cleaner, an oven, a microwave oven, a washing machine, an air purifier, a set-top box, a home automation control panel, a security control panel, a TV box (e.g., Samsung HomeSync™, Apple TV™, or Google TV™), a game console (e.g., Xbox™ and PlayStation™), an electronic dictionary, an electronic key, a camcorder, and an electronic photo frame.
In other embodiments of the present disclosure, the electronic device may include at least one of: various medical devices (e.g., various portable medical measurement devices (such as a blood glucose monitoring device, a heart rate monitoring device, a blood pressure measuring device, and a body temperature measuring device), a magnetic resonance angiography (MRA) machine, a magnetic resonance imaging (MRI) machine, a computed tomography (CT) machine, and an ultrasound scanner), a navigation device, a global positioning system (GPS) receiver, an event data recorder (EDR), a flight data recorder (FDR), a vehicle infotainment device, marine electronic devices (e.g., a marine navigation device and a gyrocompass), avionics, a security device, a vehicle head unit, an industrial or home robot, a bank's automated teller machine (ATM), a store's point of sale (POS) terminal, and Internet of Things devices (e.g., a light bulb, various sensors, an electricity or gas meter, a sprinkler device, a fire alarm, a thermostat, a streetlamp, a toaster, sports equipment, a hot water tank, a heater, and a boiler). According to some embodiments of the present disclosure, the electronic device may include at least one of furniture or a part of a building/structure, an electronic board, an electronic signature receiving device, a projector, and various measuring instruments (e.g., a water meter, an electric meter, a gas meter, and a radio wave meter). In various embodiments of the present disclosure, the electronic device may be flexible, or may be a combination of one or more of the various devices described above. The electronic device according to an embodiment is not limited to the above-described devices. In embodiments of the present disclosure, the term "user" may indicate a person who uses an electronic device or a device (e.g., an artificial intelligence electronic device) that uses an electronic device.
Fig. 1 is a block diagram illustrating an electronic device and a network according to various embodiments of the present disclosure.
Referring to fig. 1, an electronic device 101 in a network environment 100 according to various embodiments is described. The electronic device 101 may include a bus 110, a processor 120, a memory 130, an input/output interface 150, a display 160, and a communication interface 170. In some embodiments of the present disclosure, the electronic device 101 may omit at least one of the above elements, or may further include other elements. The bus 110 may include, for example, a circuit that interconnects the elements 110 to 170 and delivers communication (e.g., control messages or data) between these elements. The processor 120 may include one or more of a CPU, an AP, and a communication processor (CP). For example, the processor 120 may perform operations or data processing related to control and/or communication of at least one other element of the electronic device 101.
The memory 130 may include volatile and/or non-volatile memory. The memory 130 may store, for example, instructions or data related to at least one other element of the electronic device 101. According to an embodiment of the present disclosure, the memory 130 may store software and/or a program 140. The program 140 may include a kernel 141, middleware 143, an application programming interface (API) 145, and/or application programs (or "applications") 147. At least some of the kernel 141, the middleware 143, and the API 145 may be referred to as an operating system (OS). The kernel 141 may control or manage system resources (e.g., the bus 110, the processor 120, or the memory 130) used for performing operations or functions implemented in the other programs (e.g., the middleware 143, the API 145, or the applications 147). Furthermore, the kernel 141 may provide an interface through which the middleware 143, the API 145, or the applications 147 may access the individual elements of the electronic device 101 to control or manage the system resources.
Middleware 143 can, for example, act as an intermediary, allowing an API 145 or application 147 to communicate with kernel 141 to exchange data. Further, middleware 143 can process one or more task requests received from application 147 according to priority. For example, middleware 143 can assign priority to one or more applications 147 for using system resources (e.g., bus 110, processor 120, memory 130, etc.) of electronic device 101, and can process one or more task requests. The API 145 is an interface by which the application 147 controls functions provided by the kernel 141 or the middleware 143, and may include at least one interface or function (e.g., an instruction) for file control, window control, image processing, or text control, for example. For example, the input/output interface 150 may forward instructions or data input from a user or an external device to other elements of the electronic device 101, or may output instructions or data received from other elements of the electronic device 101 to a user or an external device.
The display 160 may include, for example, a Liquid Crystal Display (LCD), a Light Emitting Diode (LED) display, an Organic Light Emitting Diode (OLED) display, a micro-electro-mechanical system (MEMS) display, or an electronic paper display. For example, the display 160 may display various types of content (e.g., text, images, videos, icons, and/or symbols) to a user. The display 160 may include a touch screen and may receive touch, gesture, proximity, or hover input using, for example, an electronic pen or a portion of a user's body. For example, the communication interface 170 may set up communication between the electronic device 101 and an external device (e.g., the first external electronic device 102, the second external electronic device 104, or the server 106). For example, the communication interface 170 may connect to the network 162 via wireless or wired communication to communicate with an external device (e.g., the second external electronic device 104 or the server 106), as illustrated by element 164.
The wireless communication may include, for example, cellular communication using at least one of: long-term evolution (LTE), LTE-advanced (LTE-A), code division multiple access (CDMA), wideband CDMA (WCDMA), universal mobile telecommunications system (UMTS), wireless broadband (WiBro), global system for mobile communications (GSM), and the like. According to an embodiment of the present disclosure, as illustrated by element 164 of fig. 1, the wireless communication may include, for example, at least one of: wireless fidelity (WiFi), light fidelity (LiFi), Bluetooth, Bluetooth low energy (BLE), Zigbee, near field communication (NFC), magnetic secure transmission, radio frequency (RF), and body area network (BAN). According to an embodiment of the present disclosure, the wireless communication may include a global navigation satellite system (GNSS). The GNSS may be, for example, a GPS, a global navigation satellite system (Glonass), a Beidou navigation satellite system (hereinafter "Beidou"), or Galileo (the European global satellite-based navigation system). In this document, the term "GPS" may be used interchangeably with the term "GNSS". The wired communication may include, for example, at least one of: a universal serial bus (USB), a high-definition multimedia interface (HDMI), recommended standard 232 (RS-232), power line communication, a plain old telephone service (POTS), and the like. The network 162 may include a telecommunications network, for example, at least one of a computer network (e.g., a local area network (LAN) or a wide area network (WAN)), the Internet, and a telephone network.
Each of the first external electronic device 102 and the second external electronic device 104 may be of a type identical to or different from that of the electronic device 101. According to various embodiments of the present disclosure, all or some of the operations performed by the electronic device 101 may be performed by another electronic device or a plurality of electronic devices (e.g., the first external electronic device 102 and the second external electronic device 104) or the server 106. According to an embodiment of the present disclosure, when the electronic device 101 has to perform a function or service automatically or in response to a request, the electronic device 101, instead of performing the function or service by itself or in addition thereto, may request another device (e.g., the first external electronic device 102, the second external electronic device 104, or the server 106) to perform at least some functions related to the function or service. The other electronic device (e.g., the first external electronic device 102, the second external electronic device 104, or the server 106) may perform the requested functions or the additional functions, and may deliver the result of the execution to the electronic device 101. The electronic device 101 may provide the received result as it is, or may additionally process the received result to provide the requested function or service. To this end, for example, cloud computing, distributed computing, or client-server computing technology may be used.
Fig. 2 is a block diagram illustrating an electronic device according to various embodiments of the present disclosure.
Referring to fig. 2, electronic device 201 may comprise, for example, all or part of electronic device 101 shown in fig. 1. The electronic device 201 may include at least one processor 210 (e.g., an AP), a communication module 220, a subscriber identification module 224, a memory 230, a sensor module 240, an input device 250, a display 260, an interface 270, an audio module 280, a camera module 291, a power management module 295, a battery 296, an indicator 297, and a motor 298. The processor 210 may control a plurality of hardware or software elements connected to the processor 210 by running, for example, an OS or an application program, and may perform processing and arithmetic operations on various types of data. Processor 210 may be implemented, for example, by a system on a chip (SoC). According to an embodiment of the present disclosure, the processor 210 may further include a Graphics Processing Unit (GPU) and/or an Image Signal Processor (ISP). The processor 210 may also include at least some of the elements shown in fig. 2 (e.g., cellular module 221). The processor 210 may load instructions or data received from at least one other element (e.g., non-volatile memory) into volatile memory, process the loaded instructions or data, and store the resulting data in non-volatile memory.
The communication module 220 may have the same or similar configuration as the communication interface 170 shown in fig. 1. For example, the communication module 220 may include a cellular module 221, a Wi-Fi module 223, a bluetooth module 225, a GNSS module 227, an NFC module 228, and an RF module 229. For example, the cellular module 221 may provide a voice call, a video call, a short message service, or an internet service through a communication network. According to embodiments of the present disclosure, the cellular module 221 may use a subscriber identification module (e.g., a Subscriber Identification Module (SIM) card) 224 to identify and authenticate the electronic device 201 in the communication network. In accordance with embodiments of the present disclosure, the cellular module 221 may perform at least some of the functions that the AP 210 may provide. According to an embodiment of the present disclosure, the cellular module 221 may include a CP. At least some (i.e., two or more) of the cellular module 221, the Wi-Fi module 223, the bluetooth module 225, the GNSS module 227, and the NFC module 228 may be included in a single Integrated Chip (IC) or IC package according to an embodiment of the present disclosure. The RF module 229 may transmit/receive, for example, communication signals (e.g., RF signals). The RF module 229 may include, for example, a transceiver, a Power Amplification Module (PAM), a frequency filter, a Low Noise Amplifier (LNA), an antenna, and the like. According to another embodiment of the present disclosure, at least one of the cellular module 221, the Wi-Fi module 223, the BT module 225, the GNSS module 227, and the NFC module 228 may transmit/receive an RF signal through a separate RF module. Subscriber identification module 224 may include, for example, a card with a SIM and/or an embedded SIM, and may contain unique identification information (e.g., an Integrated Circuit Card Identifier (ICCID)) or subscriber information (e.g., an International Mobile Subscriber Identity (IMSI)).
Memory 230 (e.g., memory 130) may include, for example, internal memory 232 or external memory 234. The internal memory 232 may include, for example, at least one of: volatile memory (e.g., dynamic random access memory (DRAM), static RAM (SRAM), synchronous dynamic RAM (SDRAM), etc.) and non-volatile memory (e.g., one-time programmable read-only memory (OTPROM), programmable ROM (PROM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), mask ROM, flash memory, a hard drive, or a solid state drive (SSD)). The external memory 234 may include a flash memory drive such as compact flash (CF), secure digital (SD), micro-SD, mini-SD, extreme digital (xD), multi-media card (MMC), memory stick, and the like. The external memory 234 may be functionally or physically connected with the electronic device 201 via various interfaces.
The sensor module 240 may, for example, measure a physical quantity or detect an operating state of the electronic device 201, and may convert the measured or detected information into an electrical signal. The sensor module 240 may include, for example, at least one of: a gesture sensor 240A, a gyroscope sensor 240B, an atmospheric pressure sensor 240C, a magnetic sensor 240D, an acceleration sensor 240E, a grip sensor 240F, a proximity sensor 240G, a color sensor 240H (e.g., a red, green, blue (RGB) sensor), a biometric sensor 240I, a temperature/humidity sensor 240J, an illuminance sensor 240K, and an Ultraviolet (UV) sensor 240M. Additionally or alternatively, the sensor module 240 may include, for example, an electronic nose sensor, an Electromyography (EMG) sensor, an electroencephalogram (EEG) sensor, an Electrocardiogram (ECG) sensor, an Infrared (IR) sensor, an iris sensor, and/or a fingerprint sensor. The sensor module 240 may also include control circuitry for controlling one or more sensors included therein. In some embodiments of the present disclosure, the electronic device 201 may further include a processor configured to control the sensor module 240, either as part of the processor 210 or separate from the processor 210, to control the sensor module 240 when the processor 210 is in a sleep state.
Input device 250 may include, for example, a touch panel 252, a (digital) pen sensor 254, keys 256, or an ultrasonic input device 258. The touch panel 252 may use, for example, at least one of a capacitive type, a resistive type, an IR type, and an ultrasonic type. In addition, the touch panel 252 may further include a control circuit. The touch panel 252 may also include a tactile layer for providing tactile responses to a user. The (digital) pen sensor 254 may include, for example, an identification patch that is part of or separate from the touch panel. The keys 256 may include, for example, physical buttons, optical keys, or a keypad. The ultrasonic input device 258 may detect ultrasonic waves generated by the input tool through a microphone (e.g., microphone 288) to identify data corresponding to the detected ultrasonic waves.
Display 260 (e.g., display 160) may include a panel 262, a holographic device 264, a projector 266, and/or control circuitry for controlling them. The panel 262 may be implemented, for example, as flexible, transparent, or wearable. The panel 262 may be configured as one or more modules along with the touch panel 252. According to an embodiment of the present disclosure, the panel 262 may include a pressure sensor (or a force sensor) that may measure the intensity of the pressure of a user's touch. The pressure sensor may be implemented as integrated with the touch panel 252, or may be implemented as one or more sensors separate from the touch panel 252. The holographic device 264 may display a three-dimensional image in the air by using interference of light. The projector 266 may display an image by projecting light onto a screen. The screen may be located, for example, inside or outside the electronic device 201. The interface 270 may include, for example, an HDMI 272, a USB 274, an optical interface 276, or a D-subminiature (D-sub) interface 278. The interface 270 may be included in, for example, the communication interface 170 in fig. 1. Additionally or alternatively, the interface 270 may include, for example, a mobile high-definition link (MHL) interface, an SD card/MMC interface, or an infrared data association (IrDA) standard interface.
The audio module 280 may, for example, convert sound and electrical signals bi-directionally. At least some of the elements of audio module 280 may be included in, for example, input/output interface 150 shown in fig. 1. Audio module 280 may process sound information input or output through, for example, speaker 282, earpiece 284, headset 286, microphone 288, and the like. The camera module 291 is a device that can capture still images and moving images. According to embodiments of the present disclosure, the camera module 291 may include one or more image sensors (e.g., a front sensor or a rear sensor), lenses, ISPs, or flash lights (e.g., LEDs or xenon lights). The power management module 295 may manage power of the electronic device 201, for example. According to embodiments of the present disclosure, the power management module 295 may include a Power Management Integrated Circuit (PMIC), a charger IC, or a battery or fuel gauge. The PMIC may use wired and/or wireless charging methods. Examples of the wireless charging method may include a magnetic resonance method, a magnetic induction method, an electromagnetic wave method, and the like. Additional circuitry for wireless charging (e.g., coil loops, resonant circuits, rectifiers, etc.) may also be included. The battery gauge may measure, for example, the remaining charge of the battery 296, as well as the voltage, current, or temperature during charging. For example, the battery 296 may include a rechargeable battery and/or a solar cell.
The indicator 297 may display a particular state of the electronic device 201 or a part thereof (e.g., the processor 210), such as a booting state, a message state, a charging state, and the like. The motor 298 may convert an electrical signal into a mechanical vibration, and may generate a vibration or a haptic effect. The electronic device 201 may include a mobile TV support device (e.g., a GPU) that can process media data according to a standard, such as digital multimedia broadcasting (DMB), digital video broadcasting (DVB), or mediaFlo™. Each of the above-described constituent elements of hardware according to an embodiment of the present disclosure may be configured with one or more components, and the names of the corresponding constituent elements may vary based on the type of electronic device. According to various embodiments of the present disclosure, an electronic device (e.g., the electronic device 201) may omit some elements, or may further include additional elements. Some of the elements may be combined into one entity, which may perform functions identical to those of the corresponding elements before being combined.
FIG. 3 is a block diagram of program modules according to various embodiments of the present disclosure.
Referring to fig. 3, according to an embodiment of the present disclosure, program modules 310 (e.g., program 140) may include an OS for controlling resources related to an electronic device (e.g., electronic device 101) and/or various applications (e.g., application programs 147) executed in the OS. The OS may include, for example, Android™, iOS™, Windows™, Symbian™, Tizen™, or Bada™. The program modules 310 may include a kernel 320 (e.g., kernel 141), middleware 330 (e.g., middleware 143), an API 360 (e.g., API 145), and/or applications 370 (e.g., application programs 147). At least some of the program modules 310 may be preloaded on the electronic device, or may be downloaded from an external electronic device (e.g., the first external electronic device 102, the second external electronic device 104, or the server 106).
The kernel 320 may include, for example, a system resource manager 321 and/or a device driver 323. The system resource manager 321 may control, allocate, or acquire system resources. According to an embodiment of the present disclosure, the system resource manager 321 may include a process manager, a memory manager, a file system manager, and the like. The device drivers 323 may include, for example, a display driver, a camera driver, a bluetooth driver, a shared memory driver, a USB driver, a keypad driver, a WiFi driver, an audio driver, or an inter-process communication (IPC) driver. The middleware 330 may provide, for example, functionality commonly required by the applications 370, or may provide various functionality to the applications 370 through the API 360 so that the applications 370 may efficiently use limited system resources within the electronic device. According to embodiments of the present disclosure, the middleware 330 may include at least one of: runtime library 335, application manager 341, window manager 342, multimedia manager 343, resource manager 344, power manager 345, database manager 346, package manager 347, connection manager 348, notification manager 349, location manager 350, graphics manager 351, and security manager 352.
Runtime libraries 335 may include library modules used, for example, by a compiler to add new functionality through a programming language during execution of application 370. Runtime library 335 may manage input/output, manage memory, or handle arithmetic functions. The application manager 341 may manage, for example, the lifecycle of the application 370. The window manager 342 may manage the Graphical User Interface (GUI) resources used by the screen. The multimedia manager 343 can recognize a format required for reproducing various media files and can encode or decode the media files using a codec suitable for the corresponding format. The resource manager 344 may manage the space of the source code or memory of the application 370. The power manager 345 may manage, for example, the capacity or power of a battery, and may provide power information required for operating the electronic device. According to embodiments of the present disclosure, the power manager 345 may operate in conjunction with a basic input/output system (BIOS). The database manager 346 may, for example, generate, search, or change a database to be used by the application 370. The package manager 347 may manage installation or update of an application distributed in the form of a package file.
The connection manager 348 may manage, for example, wireless connections. The notification manager 349 may provide events, such as received messages, appointments, and proximity notifications, to the user. The location manager 350 may manage, for example, location information of the electronic device. The graphic manager 351 may manage graphic effects to be provided to the user and user interfaces related to the graphic effects. The security manager 352 may provide, for example, system security or user authentication. According to an embodiment of the present disclosure, the middleware 330 may include a telephony manager for managing a voice or video call function of the electronic device, or a middleware module including a combination of the functions of the above-described elements. According to an embodiment of the present disclosure, the middleware 330 may provide a module specialized for each type of OS. The middleware 330 may dynamically remove some of the existing elements, or may add new elements. The API 360 is, for example, a set of API programming functions, and may be provided in different configurations depending on the OS. For example, in the case of Android or iOS, one API set may be provided for each platform, and in the case of Tizen, two or more API sets may be provided for each platform.
The applications 370 may include applications such as: a home application 371, a dialer application 372, a Short Message Service (SMS)/Multimedia Message Service (MMS) application 373, an Instant Messaging (IM) application 374, a browser application 375, a camera application 376, an alarm application 377, a contacts application 378, a voice dialing application 379, an email application 380, a calendar application 381, a media player application 382, an album application 383, a clock application 384, a healthcare application (e.g., measuring workout volume or blood glucose), or an environmental information application (e.g., barometric pressure, humidity, or temperature information). According to an embodiment of the present disclosure, the applications 370 may include an information exchange application capable of supporting information exchange between an electronic device and an external electronic device. The information exchange application may include, for example, a notification relay application for relaying specific information to the external electronic device or a device management application for managing the external electronic device. For example, the notification relay application may relay notification information generated in other applications of the electronic device to the external electronic device, or may receive the notification information from the external electronic device and provide the received notification information to the user. The device management application may, for example, install, delete, or update a function of an external electronic device that communicates with the electronic device (e.g., a function of turning on/off the external electronic device itself (or some component thereof) or a function of adjusting the brightness (or resolution) of a display) or an application running in the external electronic device. According to embodiments of the present disclosure, the applications 370 may include applications specified according to attributes of the external electronic device (e.g., healthcare applications of the ambulatory medical device). According to an embodiment of the present disclosure, the application 370 may include an application received from an external electronic device. At least some of program modules 310 may be implemented (e.g., executed) by software, firmware, hardware (e.g., processor 210), or a combination of two or more thereof, and may include a module, program, routine, set of instructions, or process for performing one or more functions.
The term "module" as used herein may include a unit comprised of hardware, software, or firmware, and may be used interchangeably with the terms "logic," "logic block," "component," "circuitry," and the like, for example. A "module" may be an integrated component or a minimal unit or a portion thereof for performing one or more functions. A "module" may be implemented mechanically or electrically and may include, for example, an Application Specific Integrated Circuit (ASIC) chip, a Field Programmable Gate Array (FPGA) or a programmable logic device, as is known or later developed to perform certain operations.
At least some of the apparatus (e.g., modules or functions thereof) or methods (e.g., operations) according to various embodiments may be implemented by instructions stored in the form of program modules in a computer-readable storage medium (e.g., memory 130). The instructions, when executed by a processor (e.g., processor 120), may cause the processor to perform functions corresponding to the instructions. The computer-readable storage medium may include a hard disk, a floppy disk, magnetic media (e.g., magnetic tape), optical media (e.g., CD-ROM, DVD), magneto-optical media (e.g., a floptical disk), an internal memory, and the like. The instructions may include code generated by a compiler or code executable by an interpreter. A program module according to an embodiment of the present disclosure may include one or more of the above-described elements, may further include other additional elements, or may omit some of the above-described elements. Operations performed by modules, program modules, or other elements according to various embodiments may be executed sequentially, in parallel, repeatedly, or in a heuristic manner. At least some of the operations may be executed in a different order or may be omitted, or other operations may be added.
Fig. 4 is a block diagram that schematically illustrates an electronic device, in accordance with various embodiments of the present disclosure.
Referring to fig. 4, the electronic device 401 may include a configuration substantially similar or identical to the configuration of the electronic device 101 or 201 of fig. 1 or 2.
Electronic device 401 may include a processor 410 (e.g., processor 120 or 210 of fig. 1 or 2), a camera module 420 (e.g., camera module 291 of fig. 2), a display 430 (e.g., display 160 or 260 of fig. 1 or 2), a memory 440 (e.g., memory 130 or 230 of fig. 1 or 2), and an input device 450 (e.g., input device 250 of fig. 2).
The processor 410 may control the overall operation of the electronic device 401.
According to an embodiment of the present disclosure, the processor 410 may acquire an image IM using the camera module 420. For example, the image IM may include a still image and/or a video. For example, the image IM may include at least one subject. In addition, the image IM may include a plurality of pixels.
According to an embodiment of the present disclosure, the processor 410 may generate the depth information DI corresponding to at least one subject included in the image IM based on information about a plurality of pixels included in the image IM obtained using the camera module 420. The processor 410 may use a distance measuring sensor (not shown) to generate the depth information DI. For example, the distance measurement sensor may comprise a time-of-flight (TOF) camera.
The depth information DI may refer to distance information of at least one object included in the image IM. For example, the depth information DI may include a depth map for at least one subject included in the image IM.
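As a minimal sketch of the depth map described here, assuming a per-pixel array of distances in meters (the array values, region, and use of a median are illustrative assumptions, not details from the disclosure):

    import numpy as np

    # A depth map assigns one distance value (here, meters) to each pixel.
    depth_map = np.array([[0.8, 0.8, 2.5],
                          [0.8, 0.9, 2.5],
                          [3.0, 3.0, 2.5]], dtype=np.float32)

    # Distance of a subject occupying a known pixel region (top-left
    # 2x2 block), summarized with the median for robustness to noise.
    subject_mask = np.zeros(depth_map.shape, dtype=bool)
    subject_mask[:2, :2] = True
    print(float(np.median(depth_map[subject_mask])))  # 0.8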
The processor 410 may generate contrast information for the image IM.
The processor 410 may display the image IM on the display 430. For example, the processor 410 may display the image IM as a preview image for imaging on the display 430.
The processor 410 may control a focus of the camera module 420 according to an input signal IN of the user in the image IM displayed on the display 430. For example, in a manual focus operation, the processor 410 may move a position of a lens included in the camera module 420 according to the input signal IN of the user, and may thereby control a focus of the lens.
The processor 410 may determine a focus area on which the camera module 420 focuses in the image IM displayed on the display 430, according to the input signal IN of the user.
For example, the focus area may refer to an area in which the camera module 420 focuses on the image IM displayed on the display 430. For example, the focus area may refer to an in-focus area in the image IM. Meanwhile, a region different from the focused region may refer to a region where the camera module 420 is not focused. For example, a region different from the focus region may be referred to as an out-of-focus (out-focus) region.
For example, the input signal IN may refer to a signal corresponding to an input of a user for controlling a focus in a manual focusing operation. In addition, the input signal IN may also refer to a signal corresponding to an input for controlling the depth.
According to an embodiment of the present disclosure, in a case of controlling a focus of a camera included in the camera module 420 according to the input signal IN, the processor 410 may determine, based on the depth information DI, a first region displaying the graphical indicator GI in the image IM displayed on the display 430. For example, the first region may refer to a region of the image IM in which a graphical indicator for indicating a focus region of the camera is displayed.
According to an embodiment of the present disclosure, the processor 410 may further determine the first region displaying the graphical indicator GI based on the depth information DI and the contrast information of the image IM.
The graphical indicator GI may refer to an indicator for indicating a focus area of the camera. The graphical indicator GI may also refer to an indicator (or mark) displayed on the subject focused on the image IM by highlighting at least a part of the subject using a line and/or dot of a specific color. For example, the graphical indicator GI may refer to an indicator (or mark) displayed on an object included in the focus area with a focus peaking function.
According to an embodiment of the present disclosure, in the image IM displayed on the display 430, the processor 410 may display, based on the depth information DI, a graphical indicator GI on at least a part of a subject included in the focus region of the image IM (or in the vicinity of the subject, e.g., an edge, a face, and/or the outside of the edge). The processor 410 may also display the graphical indicator GI on at least a part of the subject included in the focus area (or in the vicinity of the subject) based on the depth information DI together with the contrast information of the image IM.
For example, the processor 410 may obtain distance information of at least one object included in the image IM based on the depth information DI. In addition, the processor 410 may identify at least one subject included in the image IM based on the depth information DI.
The processor 410 may detect a portion (in general, an edge portion of a subject) of the subject included in the image IM whose contrast value is higher than a predetermined value based on the contrast information. The processor 410 may identify at least one subject included in the image IM based on the detected portion.
The processor 410 may determine a first region displaying a graphical indicator GI indicating a focus region based on the depth information DI and the contrast information of the image IM. For example, the processor 410 may determine the first region displaying the graphical indicator GI based on distance information about at least one object included in the image IM obtained by the depth information DI and a portion with high contrast in the image IM obtained by the contrast information.
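One plausible realization of this combination, gating high-contrast (edge) pixels by a depth-derived focus band, is sketched below; the tolerance and threshold values are assumptions, and the disclosure leaves the exact combination open:

    import numpy as np

    def first_region_mask(gray, depth_map, focus_distance,
                          tolerance=0.1, reference_value=30.0):
        # Depth cue: pixels whose depth lies within a band around the
        # lens's current focus distance form the candidate focus region.
        in_focus = np.abs(depth_map - focus_distance) <= tolerance
        # Contrast cue: high-gradient (edge) pixels.
        gy, gx = np.gradient(gray.astype(np.float32))
        edges = np.hypot(gx, gy) > reference_value
        # First region: draw the graphical indicator GI where both hold.
        return in_focus & edges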
The processor 410 may display a graphical indicator GI on at least a part of the subject included in the determined first region.
According to an embodiment of the present disclosure, the processor 410 may identify a focus area (e.g., a position of the focus area) of the image IM based on the input signal IN of the user and the depth information DI.
The processor 410 may display a graphical indicator GI on at least a part of the subject included in the focus area of the image IM, based on the depth information DI, in the image IM displayed on the display 430. For example, the processor 410 may obtain distance information about at least one object included in the image IM based on the depth information DI. In addition, the processor 410 may identify at least one subject included in the image IM based on the information on the identified focus area and the depth information DI. The processor 410 may display a graphical indicator GI on at least a part of the identified subject (e.g., a subject included in the focus area) in the image IM displayed on the display 430.
According to an embodiment of the present disclosure, the processor 410 may display a first graphic indicator using a first attribute (e.g., a first color) on at least a portion of a subject included in a first region of the image IM corresponding to the focus region. In addition, the processor 410 may display the graphical indicator using an attribute different from the first attribute on at least a portion of the other area of the image IM different from the first area.
The processor 410 may determine, based on the depth information DI, a second region farther from the electronic device 401 (e.g., the camera module 420 of the electronic device 401) than the first region and a third region closer to the electronic device 401 than the first region, among other regions different from the first region corresponding to the focus region. The processor 410 may display a second graphical indicator using a second attribute (e.g., a second color) on at least a part of the subject included in the second region, and may display a third graphical indicator using a third attribute (e.g., a third color) on at least a part of the subject included in the third region.
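A hedged sketch of this three-way split by depth follows; the colors and tolerance are placeholder choices, and in practice the overlay would typically be restricted to the edge pixels computed above:

    import numpy as np

    def overlay_indicators(image_rgb, depth_map, focus_distance, tolerance=0.1):
        out = image_rgb.copy()
        # First attribute (in-focus first region): green.
        out[np.abs(depth_map - focus_distance) <= tolerance] = (0, 255, 0)
        # Second attribute (second region, farther than the first): blue.
        out[depth_map > focus_distance + tolerance] = (0, 0, 255)
        # Third attribute (third region, nearer than the first): red.
        out[depth_map < focus_distance - tolerance] = (255, 0, 0)
        return out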
For example, the first attribute, the second attribute, and the third attribute may refer to attributes different from each other. For example, the first attribute, the second attribute, and the third attribute may refer to means for highlighting a subject included in the first region corresponding to the focused region in the image IM, such as a specific color, shape, size, and/or brightness.
The processor 410 may display a graphical interface for setting a focus region on the display 430 in response to the input signal IN. For example, the processor 410 may display an object corresponding to the focus area on the graphical interface. In the graphical interface, the processor 410 may display a first portion corresponding to the focused region using the first attribute, a second portion corresponding to the second region using the second attribute, and a third portion corresponding to the third region using the third attribute.
The processor 410 may determine a depth corresponding to the focus region. The processor 410 may also display an object on the graphical interface having attributes (e.g., size and/or shape) corresponding to the determined depth.
The processor 410 may control the reference value for displaying the graphical indicator GI based on the depth information DI. For example, the reference value may refer to a contrast threshold for displaying the graphical indicator GI. If the reference value is high, the contrast threshold is high, and the processor 410 may display the graphical indicator GI only where high contrast is detected. Conversely, if the reference value is low, the contrast threshold is low, and the processor 410 may display the graphical indicator GI even where only low contrast is detected.
According to an embodiment of the present disclosure, the processor 410 may determine whether to perform a zoom function of the image IM. The processor 410 may control the reference value for displaying the graphical indicator GI based on the zoom ratio of the image IM and the depth information DI. For example, in case the zoom ratio of the image IM displayed on the display 430 increases, the processor 410 may decrease the reference value for displaying the graphical indicator GI. In addition, in the case where the zoom ratio of the image IM displayed on the display 430 decreases, the processor 410 may increase the reference value for displaying the graphic indicator GI.
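Read literally, this is an inverse relation between the zoom ratio and the contrast reference value; the scaling law below is only one assumed form of that relation:

    def reference_value_for_zoom(base_reference: float, zoom_ratio: float) -> float:
        # Higher zoom ratio -> lower reference value, so the indicator
        # still appears on magnified (softer) edges; lower zoom ratio ->
        # higher reference value. Inverse scaling is an assumption.
        return base_reference / max(zoom_ratio, 0.1)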
According to an embodiment of the present disclosure, the processor 410 may determine a distance of at least one object included in the image IM based on the depth information DI.
For example, a subject located farther away than a predetermined first reference distance tends to have a high frequency, so the amount of the graphical indicator displayed on that subject may become excessive. In a case where the subject included in the first region of the image IM is located at a greater distance (or longer distance) from the electronic device 401 (e.g., the camera module 420 of the electronic device 401) than the first reference distance, the processor 410 may increase the reference value for displaying the graphical indicator GI. The first reference distance may refer to a distance value used to determine whether the subject is located at a long distance.
Conversely, a subject located closer than a predetermined second reference distance may have a low frequency, so the amount of the graphical indicator displayed on that subject may be small. In a case where the subject included in the first region of the image IM is located at a closer distance (or shorter distance) from the electronic apparatus 401 than the second reference distance, the processor 410 may decrease the reference value for displaying the graphical indicator GI. The second reference distance may refer to a distance value used to determine whether the subject is located at a short distance.
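Combining the zoom-ratio rule above with the two reference distances, a hedged sketch of the reference-value control might look as follows; all numeric constants are hypothetical, since the disclosure specifies only the direction of each adjustment.

```python
# Hypothetical constants; the disclosure specifies only the directions of change.
BASE_REFERENCE = 40.0
FIRST_REFERENCE_DISTANCE = 3.0   # meters: beyond this, a subject counts as "far"
SECOND_REFERENCE_DISTANCE = 0.5  # meters: within this, a subject counts as "near"

def adjust_reference_value(zoom_ratio, subject_distance):
    """Raise the threshold for far (high-frequency) subjects and zoomed-out
    previews; lower it for near subjects and zoomed-in previews."""
    ref = BASE_REFERENCE
    if zoom_ratio > 0:
        ref /= zoom_ratio            # zoom in (>1.0) lowers, zoom out (<1.0) raises
    if subject_distance > FIRST_REFERENCE_DISTANCE:
        ref *= 1.5                   # far subject: damp the excess of indicator points
    elif subject_distance < SECOND_REFERENCE_DISTANCE:
        ref *= 0.5                   # near subject: make sparse indicators easier to see
    return ref
```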
The camera module 420 may capture an image IM and send the captured image IM to the processor 410.
The camera module 420 may include at least one camera. For example, the camera module 420 may include a single camera or two or more cameras.
The camera module 420 may include an image sensor. For example, an image sensor may include a plurality of pixels. Each of the plurality of pixels may include one or more photodiodes (e.g., two or more photodiodes).
According to an embodiment of the present disclosure, in case the camera module 420 includes one camera, the camera may include an image sensor having a plurality of photodiodes. The processor 410 may obtain a phase difference of at least one object included in the image IM using an image sensor including a plurality of photodiodes, and may generate the depth information DI based on the obtained phase difference.
According to another embodiment of the present disclosure, in a case where the camera module 420 includes two or more cameras, each of the two or more cameras may include an image sensor having one or more photodiodes. The processor 410 may obtain a phase difference of at least one object included in the image IM using two or more cameras, and may generate the depth information DI based on the obtained phase difference.
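As an illustration of how a phase difference could be turned into depth, the deliberately naive sketch below block-matches the two photodiode views (or the views of two cameras) along image rows; the window size, shift range, and pinhole parameters are hypothetical and not taken from the disclosure.

```python
import numpy as np

def disparity_1d(left, right, max_shift=16, window=9):
    """Estimate per-pixel horizontal disparity between two grayscale views by
    picking, per pixel, the shift with the lowest windowed absolute difference.
    Naive reference implementation; edge wrap-around from np.roll is ignored."""
    h, w = left.shape
    kernel = np.ones(window, dtype=np.float32) / window
    best = np.zeros((h, w), dtype=np.int32)
    best_cost = np.full((h, w), np.inf, dtype=np.float32)
    for s in range(max_shift):
        cost = np.abs(left.astype(np.float32) - np.roll(right, s, axis=1))
        # aggregate the matching cost over a horizontal window (box filter)
        cost = np.apply_along_axis(lambda r: np.convolve(r, kernel, "same"), 1, cost)
        better = cost < best_cost
        best[better] = s
        best_cost[better] = cost[better]
    return best

def depth_from_disparity(disparity, focal_px, baseline_m):
    """Classic pinhole relation: depth = focal_length * baseline / disparity."""
    return focal_px * baseline_m / np.maximum(disparity, 1)
```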
The display 430 may display the image IM. For example, the display 430 may be implemented by a touch screen. The display 430 may receive a touch input IN of a user.
The memory 440 may store data related to the operation of the electronic device 401. For example, the memory 440 may be implemented by a nonvolatile memory or a volatile memory.
According to an embodiment of the present disclosure, the memory 440 may store the depth information DI generated by the processor 410. In addition, the memory 440 may also store contrast information about the image IM.
The input device 450 may receive an input signal IN of a user and may send the input signal IN to the processor 410. For example, the processor 410 may receive the input signal IN via the display 430 and/or the input device 450.
According to an embodiment of the present disclosure, the input device 450 may refer to a device that controls a focus of a camera included in the camera module 420 according to an input signal IN of the user. The input device 450 may be implemented substantially the same as or similar to the input device 250 described with reference to fig. 2. For example, the input device 450 may include a device that can receive a user input, such as a physical user interface (PUI), a button, or a jog switch (jog dial).
Fig. 5A and 5B are flow diagrams illustrating operation of an electronic device according to various embodiments of the present disclosure.
Referring to fig. 5A, at operation 501, a processor 410 (e.g., the processor 410 of fig. 4) may generate depth information DI about an image IM obtained using a camera included in the camera module 420. For example, the processor 410 may generate the depth information DI based on information about a plurality of pixels included in the image IM.
The processor 410 may control a focus of a camera included in the camera module 420 according to the input signal IN.
In the case of controlling the focus of the camera included in the camera module 420, at operation 503 the processor 410 may determine, based on the depth information DI and the contrast information, a first region for displaying a graphical indicator indicating a focus region in the image IM displayed on the display 430.
At operation 505, the processor 410 may display a graphical indicator GI on at least a portion of the first region. For example, the processor 410 may determine a subject included in the first area based on the depth information DI and the contrast information, and may display a graphical indicator GI on at least a part of the determined subject.
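Read this way, operations 503 and 505 intersect two masks; a minimal sketch (assuming NumPy, with the contrast map, focus depth, and tolerance supplied elsewhere):

```python
import numpy as np

def first_region_mask(depth_map, contrast, focus_depth, tolerance, reference_value):
    """The first region: pixels whose depth lies in the focused band AND whose
    local contrast exceeds the reference value, per the depth + contrast rule."""
    in_focus_band = np.abs(depth_map - focus_depth) <= tolerance
    sharp_enough = contrast > reference_value
    return np.logical_and(in_focus_band, sharp_enough)
```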
Referring to fig. 5B, at operation 511, the processor 410 (e.g., the processor 410 of fig. 4) may generate depth information DI about an image IM obtained using a camera included in the camera module 420. For example, the processor 410 may generate the depth information DI based on information about a plurality of pixels included in the image IM.
At operation 513, the processor 410 may determine a focus area in which the camera is focused in an image IM (e.g., a preview image) displayed on the display 430. For example, the processor 410 may determine the focus area from the input signal IN of the user.
The processor 410 may identify a focus area of the image IM based on the input signal IN of the user and the depth information DI.
At operation 515, the processor 410 may display a graphical indicator GI on at least a portion of the focus area based on the depth information DI. For example, the processor 410 may determine a subject included in the focus area based on the depth information DI, and may display a graphical indicator GI on at least a part of the subject.
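One plausible reading of operations 513 to 515, sketched below: the depth under the touched pixel defines the focus depth, and the focus area is the band of pixels near that depth (NumPy assumed; the tolerance value and coordinate convention are hypothetical).

```python
import numpy as np

def focus_area_from_touch(depth_map, touch_xy, tolerance=0.1):
    """Take the depth at the touched pixel and keep every pixel whose depth
    lies within an absolute tolerance band around it (units of the depth map);
    depth_map is assumed to be aligned with the displayed preview."""
    x, y = touch_xy                      # display coordinates of the touch input IN
    focus_depth = depth_map[y, x]
    return np.abs(depth_map - focus_depth) <= tolerance
```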
Fig. 6 is a flow chart illustrating operation of an electronic device according to various embodiments of the present disclosure.
Referring to fig. 6, at operation 601, a processor 410 (e.g., the processor 410 of fig. 4) may generate depth information DI about an image IM obtained using a camera included in the camera module 420.
At operation 603, the processor 410 may determine a first region displaying a first graphical indicator in an image IM (e.g., a preview image) displayed on the display 430. The processor 410 may determine a first region displaying a first graphical indicator for indicating a focus region of the image IM based on the depth information DI and the contrast information.
At operation 605, the processor 410 may display a first graphical indicator using a first attribute (e.g., a first color) on at least a portion of the subject included in the first region.
The processor 410 may identify an area (hereinafter, the out-of-focus area) different from the first area corresponding to the in-focus area in the image IM displayed on the display 430 using the depth information DI. For example, the processor 410 may identify, in the image, a region at a greater distance (or longer distance) than the focus region (hereinafter referred to as the second region), and a region at a closer distance (or shorter distance) than the focus region (hereinafter referred to as the third region).
At operation 607, the processor 410 may display a second graphical indicator using a second attribute (e.g., a second color) in a second region of the out-of-focus region corresponding to the long distance.
At operation 609, the processor 410 may display a third graphical indicator using a third attribute (e.g., a third color) in a third region of the out-of-focus region corresponding to the short distance.
The electronic apparatus 401 of the present disclosure displays graphical indicators having different attributes (e.g., different colors) on the subjects included in the image IM, according to their positions relative to the focus area. Therefore, a user of the electronic device 401 can intuitively recognize the focus area and can efficiently set the focus area.
Fig. 7A, 7B, and 7C illustrate user interfaces for describing graphical indicators provided from an electronic device, according to various embodiments of the present disclosure.
Referring to fig. 7A, 7B, and 7C, electronic device 701 may be implemented substantially the same as or similar to electronic device 401 described with reference to fig. 4.
The electronic device 701 may display the image IM on the display 730. The image IM may include a plurality of subjects 751, 752, 761, 762, 771, 772, 781, 782, 791, and 792.
Referring to fig. 7A, the electronic device 701 may focus on a center portion of a first image 731 using a camera. For example, the electronic device 701 may focus on the first subjects 751 and 752 located at a center portion of the first image 731.
According to an embodiment of the present disclosure, the electronic device 701 may display a first graphical indicator having a first attribute (e.g., a first color) on at least a portion of the first subjects 751 and 752. For example, the first graphical indicator may refer to a marker composed of "green" lines and dots.
The electronic device 701 may display a second graphical indicator having a second attribute (e.g., a second color) on at least a portion of the second subjects 761, 762, 771, and 772 located farther from the electronic device 701 (e.g., a camera of the electronic device 701) than the focus area. For example, the second graphical indicator may refer to a marker composed of "blue" lines and dots.
The electronic device 701 may display a third graphical indicator having a third attribute (e.g., a third color) on at least a portion of the third subjects 781, 782, 791, and 792 located closer to the electronic device 701 (e.g., a camera of the electronic device 701) than the focus area. For example, the third graphical indicator may refer to a marker composed of "red" lines and dots.
Referring to fig. 7B, the electronic device 701 may focus on a long-distance portion of the second image 732 using a camera. For example, the electronic device 701 may focus on the first subjects 761 and 762 located at a long-distance portion of the second image 732.
According to an embodiment of the present disclosure, the electronic device 701 may display a first graphical indicator having a first attribute (e.g., a first color) on at least a portion of the first subjects 761 and 762. For example, the first graphical indicator may refer to a marker composed of "green" lines and dots.
The electronic device 701 may display a second graphical indicator having a second attribute (e.g., a second color) on at least a portion of the second subjects 771 and 772 located farther from the electronic device 701. For example, the second graphical indicator may refer to a marker composed of "blue" lines and dots.
The electronic device 701 may display a third graphical indicator having a third attribute (e.g., a third color) on at least a portion of the third subjects 751, 752, 781, 782, 791, and 792 located closer to the electronic device 701. For example, the third graphical indicator may refer to a marker composed of "red" lines and dots.
Referring to fig. 7C, the electronic device 701 may focus on a short-distance portion of the third image 733 using a camera. For example, the electronic device 701 may focus on the first subjects 781 and 782 positioned at a short distance portion of the third image 733.
According to an embodiment of the present disclosure, the electronic device 701 may display a first graphical indicator having a first attribute (e.g., a first color) on at least a portion of the first subjects 781 and 782. For example, the first graphical indicator may refer to a marker composed of "green" lines and dots.
The electronic device 701 may display a second graphical indicator having a second attribute (e.g., a second color) on at least a portion of the second subjects 751, 752, 761, 762, 771, and 772 located farther from the electronic device 701. For example, the second graphical indicator may refer to a marker composed of "blue" lines and dots.
The electronic device 701 may display a third graphical indicator having a third attribute (e.g., a third color) on at least a portion of the third subjects 791 and 792 located closer to the electronic device 701. For example, the third graphical indicator may refer to a marker composed of "red" lines and dots.
Fig. 8 is a flow chart illustrating operation of an electronic device according to various embodiments of the present disclosure.
Referring to fig. 8, at operation 801, a processor 410 (e.g., the processor 410 of fig. 4) may generate depth information DI about at least one subject included in an image IM using the image IM obtained with a camera included in a camera module 420.
At operation 803, the processor 410 may determine a first region corresponding to a focus region in which the camera is focused and other regions (hereinafter, referred to as out-of-focus regions) different from the first region in an image IM (e.g., a preview image) displayed on the display 430 based on the depth information DI and the contrast information. For example, the out-of-focus region may refer to a region where the camera is out of focus.
The processor 410 may display the in-focus area and the out-of-focus area so that they are visually distinguished. For example, the processor 410 may distinguish the in-focus area from the out-of-focus area by adding a particular effect to the out-of-focus area.
According to an embodiment of the present disclosure, at operation 805, the processor 410 may add a blur effect in the out-of-focus region. In addition, the processor 410 may control the intensity of the blurring effect of the out-of-focus region based on the depth information DI.
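A minimal sketch of such depth-controlled blurring, assuming OpenCV and NumPy are available; blending a single blurred copy by a per-pixel weight is one simple approximation, not the patented method itself.

```python
import numpy as np
import cv2  # OpenCV, assumed available

def depth_weighted_blur(image, depth_map, focus_depth, max_sigma=8.0):
    """Blend a blurred copy over the out-of-focus area; the blur weight grows
    with the depth distance from the focus plane (operation 805)."""
    dist = np.abs(depth_map - focus_depth)
    weight = np.clip(dist / max(float(dist.max()), 1e-6), 0.0, 1.0)[..., None]
    blurred = cv2.GaussianBlur(image, (0, 0), sigmaX=max_sigma)
    return (image * (1.0 - weight) + blurred * weight).astype(image.dtype)
```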
FIG. 9 illustrates a user interface for describing graphical indicators provided from an electronic device, in accordance with various embodiments of the present disclosure.
Referring to fig. 9, electronic device 901 may be implemented substantially the same as or similar to electronic device 401 described with reference to fig. 4. The electronic device 901 may display the image IM on the display 930.
The electronic device 901 can focus on an area of the image 961 where the subject 951 is located using a camera. For example, the focused region may be a region where the subject 951 of the image 961 is located.
The electronic device 901 can display a graphical indicator on at least a portion of the subject 951 included in the image 961 displayed on the display 930. The electronic device 901 may add a blur effect in the out-of-focus area and display the out-of-focus area on the display 930.
Fig. 10 is a flow chart illustrating operation of an electronic device according to various embodiments of the present disclosure.
Referring to fig. 10, at operation 1001, a processor 410 (e.g., the processor 410 of fig. 4) may display a GUI for manual focus control.
At operation 1003, based on the depth information DI, the processor 410 may display a portion of the GUI corresponding to the focus area of the image IM so that it is distinguished from a portion corresponding to the area other than the focus area (hereinafter referred to as the out-of-focus area). For example, the processor 410 may display the portion of the GUI corresponding to the focus region using the same attributes as those of the graphical indicator displayed on the focus region. In addition, the processor 410 may display the portion of the GUI corresponding to the out-of-focus area using the same attributes as those of the graphical indicator displayed on the out-of-focus area.
At operation 1005, the processor 410 may display the GUI by reflecting the depth of the image IM (e.g., the preview image) displayed on the display 430. For example, the processor 410 may display the portion of the GUI corresponding to the focus area of the image IM by reflecting the depth of the image IM. In the case where the depth of the focus area is deep, the processor 410 may display the portion corresponding to the focus area of the image IM wider (or larger) in the GUI. In contrast, in the case where the depth of the focus area is shallow, the processor 410 may display the portion corresponding to the focus area of the image IM narrower (or smaller) in the GUI.
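For illustration, the in-focus depth interval could be mapped onto the GUI as follows, a sketch under the assumption that scene depth maps linearly onto the slider; all names are hypothetical.

```python
def focus_portion_bounds(near_depth, far_depth, dof_near, dof_far, slider_px):
    """Map the in-focus interval [dof_near, dof_far] onto a slider of slider_px
    pixels spanning the scene range [near_depth, far_depth]; a deeper focus
    area therefore occupies a wider (larger) portion of the GUI."""
    span = float(far_depth - near_depth)
    lo = int((dof_near - near_depth) / span * slider_px)
    hi = int((dof_far - near_depth) / span * slider_px)
    return lo, hi  # pixel bounds of the portion drawn with the first attribute
```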
11A, 11B, 11C, 11D, 11E, and 11F illustrate user interfaces for graphical user interfaces provided from an electronic device, according to various embodiments of the present disclosure.
Referring to fig. 11A, 11B, 11C, 11D, 11E, and 11F, the electronic device 1101 may be implemented substantially the same as or similar to the electronic device 401 described with reference to fig. 4.
The electronic device 1101 may display the image IM on the display 1130. The image IM may include a plurality of subjects 1151, 1152, 1161, 1162, 1171, 1172, 1181, 1182, 1191, and 1192.
Referring to fig. 11A, the electronic device 1101 may focus on a central portion of the first image 1131 using a camera. For example, the electronic device 1101 may focus on the subjects 1151 and 1152 located at the central portion of the first image 1131.
According to an embodiment of the disclosure, the electronic device 1101 may display a graphical user interface 1150 for manually controlling the focus. The graphical user interface 1150 may include a moving object 1153 for moving the focus area. For example, the moving object 1153 may move vertically in response to a touch input of the user. The processor 410 (e.g., the processor 410 of fig. 4) may map the position of the moving object 1153 to the position of the focus area. For example, the processor 410 may control the focus area by moving a lens of a camera included in the camera module 420 (e.g., the camera module 420 of fig. 4) in response to an input signal IN for controlling the position of the moving object 1153.
According to an embodiment of the present disclosure, the moving object 1153 may be displayed using an attribute identical or similar to the first attribute (e.g., the first color) of the first graphical indicator displayed on the subjects 1151 and 1152 included in the first area corresponding to the focus area. For example, where the first graphical indicator is displayed in "green", the moving object 1153 may also be displayed in "green".
The graphical user interface 1150 may further include a first portion 1154 and a second portion 1155, the first portion 1154 corresponding to the second region located at a longer distance than the focus region, and the second portion 1155 corresponding to the third region located at a shorter distance than the focus region.
The first portion 1154 may be displayed using an attribute identical or similar to the second attribute (e.g., the second color) of the second graphical indicator displayed on the subjects 1161, 1162, 1171, and 1172 included in the second area. For example, where the second graphical indicator is displayed in "blue", the first portion 1154 may be displayed in "blue". In the same manner, the second portion 1155 may be displayed using an attribute identical or similar to the third attribute (e.g., the third color) of the third graphical indicator displayed on the subjects 1181, 1182, 1191, and 1192 included in the third area. For example, where the third graphical indicator is displayed in "red", the second portion 1155 may be displayed in "red".
Referring to fig. 11B, the electronic device 1101 may focus on a portion of the second image 1132 corresponding to a long distance in response to the input signal IN. For example, in a case where the moving object 1153 of the graphical user interface 1150 moves upward, the electronic device 1101 may focus on the subjects 1161 and 1162 located in the long-distance portion of the second image 1132.
For example, in a case where the moving object 1153 moves upward, the electronic device 1101 may display a first graphic indicator having a first attribute (for example, green) on the subjects 1161 and 1162 included in the first area corresponding to the focused area, may display a second graphic indicator having a second attribute (for example, blue) on the subjects 1171 and 1172 included in the second area, and may display a third graphic indicator having a third attribute (for example, red) on the subjects 1151, 1152, 1181, 1182, 1191, and 1192 included in the third area.
Referring to fig. 11C, the electronic device 1101 may focus on a portion of the third image 1133 corresponding to a short distance in response to the input signal IN. For example, in a case where the moving object 1153 of the graphical user interface 1150 moves downward, the electronic device 1101 may focus on the subjects 1181 and 1182 located in the short-distance portion of the third image 1133.
For example, in a case where the moving object 1153 moves downward, the electronic device 1101 may display a first graphic indicator having a first attribute (for example, green) on the subjects 1181 and 1182 included in the first area corresponding to the focused area, may display a second graphic indicator having a second attribute (for example, blue) on the subjects 1151, 1152, 1161, 1162, 1171, and 1172 included in the second area, and may display a third graphic indicator having a third attribute (for example, red) on the subjects 1191 and 1192 included in the third area.
Referring to fig. 11D, the electronic device 1101 may focus on a central portion of the fourth image 1134 using a camera. For example, the electronic device 1101 may focus on the subjects 1151 and 1152 located at the central portion of the fourth image 1134.
The processor 410 (e.g., the processor 410 of fig. 4) may display a focus position display portion 1163 indicating the position of the focus area. The processor 410 may display the focus position display portion 1163 by reflecting the depth of the focus area. The processor 410 may control the size of the focus position display portion 1163 to increase or decrease the focus control resolution.
According to an embodiment of the disclosure, the electronic device 1101 may display a graphical user interface 1160 for manually controlling focus. The graphic user interface 1160 may include a moving object 1153 for moving a focus area, and an expanded focus position display portion 1163.
For example, in the case where the depth of the focus area is deep, the processor 410 may display the portion corresponding to the focus area of the image IM (for example, the focus position display portion 1163) small in the graphical user interface 1160, either automatically or according to an input of the user. In addition, in the case where the depth of the focus area is shallow, the processor 410 may display the portion corresponding to the focus area of the image IM (for example, the focus position display portion 1163) large, either automatically or according to an input of the user.
For example, in the case where the size of the focus position display portion 1163 is enlarged, the range over which the focus area can be controlled via the moving object 1153 increases. In the case where the size of the focus position display portion 1163 is reduced, that range decreases. Accordingly, the processor 410 may control the focus area more precisely in the enlarged focus position display portion 1163 according to the movement of the moving object 1153.
Referring to fig. 11E, the electronic device 1101 may control the size of the focus position display portion 1163 corresponding to the focus area. The electronic device 1101 may control the size of the focus position display portion 1163 indicating the focus area in response to an input for controlling the size of the focus position display portion 1163. For example, the electronic device 1101 may enlarge or reduce an area corresponding to the focus area for more precise focus control.
According to an embodiment of the present disclosure, the electronic device 1101 may enlarge the focus position display portion 1163 indicating the focus area in response to an input for enlarging its size. Conversely, the electronic device 1101 may narrow the focus position display portion 1163 indicating the focus area in response to an input for reducing its size.
According to another embodiment of the present disclosure, the electronic device 1101 may control the size or length of the graphical user interface 1160. For example, in the case where the size of the graphical user interface 1160 is enlarged, the range over which the focus area can be controlled via the moving object 1153 increases. In the case where the size of the graphical user interface 1160 is reduced, that range decreases. Accordingly, the processor 410 may control the focus area more precisely in the enlarged graphical user interface 1160 according to the movement of the moving object 1153.
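The resolution argument can be made concrete with a small sketch: if the same focus range is stretched over more slider pixels, each pixel of drag moves the focus by a smaller step. The names below are hypothetical.

```python
def drag_to_focus_change(drag_px, portion_px, focus_range):
    """Convert a drag over the focus position display portion into a focus
    change; enlarging the portion (larger portion_px) shrinks the step per
    pixel of movement, i.e., increases the focus control resolution."""
    return drag_px / float(portion_px) * focus_range
```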
Referring to fig. 11F, the electronic device 1101 may focus on a central portion and a part of the long distance portion of the fifth image 1135 using a camera.
According to an embodiment of the disclosure, the electronic device 1101 may display a graphical user interface 1160 for manually controlling the focus. The graphical user interface 1160 may include a moving object 1153 for moving the focus region, and an expanded focus position display portion 1163 corresponding to the focus region of the fifth image 1135. For example, in the case where the moving object 1153 moves upward from the center portion of the graphical user interface 1160, the focus region may also move from the center portion toward a long-distance region. For example, the electronic device 1101 may focus on the subjects 1151, 1152, 1171, and 1172 located in the center portion and a part of the long-distance portion of the fifth image 1135 based on the position of the moving object 1153.
The electronic device 1101 can more accurately control the focus in accordance with the movement of the moving object 1153 using the expanded focus position display portion 1163.
Fig. 12 is a flow chart illustrating operation of an electronic device according to various embodiments of the present disclosure.
Referring to fig. 12, at operation 1201, a processor 410 (e.g., the processor 410 of fig. 4) may display a graphic indicator on at least a portion of a first area (e.g., at least a portion of a subject) corresponding to a focus area of an image IM using depth information DI and contrast information of the image IM. In addition, the processor 410 may also display a graphical indicator on at least a part (for example, at least a part of the subject) of a region (hereinafter, out-of-focus region) of the image IM different from the first region.
At operation 1203, the processor 410 may determine whether to perform a zoom function of a camera included in the camera module 420 (e.g., the camera module 420 of fig. 4).
In the case of performing a zoom function of the camera, the processor 410 may determine a zoom ratio of the camera at operation 1205.
At operation 1207, the processor 410 may control a reference value for displaying the graphical indicator based on the zoom ratio and the depth information DI. For example, the reference value may refer to a contrast threshold for displaying the graphical indicator.
For example, the processor 410 may obtain contrast information about the image IM via the camera module 420. In the event that a contrast greater than a contrast threshold is detected for the subject included in the image IM, the processor 410 may display a graphical indicator on the subject included in the image IM.
According to an embodiment of the present disclosure, the processor 410 may increase the resolution of the image IM and/or the depth resolution of the depth information DI according to zoom-in (magnification), and may then realign the pixels of the enlarged image IM with the enlarged depth information DI.
According to an embodiment of the present disclosure, the processor 410 may likewise reduce the resolution of the image IM and/or the depth resolution of the depth information DI according to zoom-out, and may then realign the pixels of the reduced image IM with the reduced depth information DI.
According to an embodiment of the present disclosure, in the case of zoom-in (e.g., a state in which the zoom ratio is greater than 1.0), the processor 410 may decrease the reference value for displaying the graphic indicator. In the case of zoom-out (e.g., a state in which the zoom ratio is less than 1.0), the processor 410 may increase the reference value for displaying the graphic indicator.
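A sketch of the resolution matching described above, assuming the depth map and preview initially share one resolution; zoom-in crops the central field of view and resamples (nearest neighbor), while zoom-out can only resample, since no depth data exists outside the original field of view.

```python
import numpy as np

def match_depth_to_zoom(depth_map, zoom_ratio):
    """Align the depth map with the zoomed preview so that each depth pixel
    again corresponds to one image pixel."""
    h, w = depth_map.shape
    if zoom_ratio > 1.0:  # zoom in: central crop, then nearest-neighbor upscale
        ch, cw = int(h / zoom_ratio), int(w / zoom_ratio)
        y0, x0 = (h - ch) // 2, (w - cw) // 2
        crop = depth_map[y0:y0 + ch, x0:x0 + cw]
    else:                 # zoom out: keep the full map and just resample
        crop = depth_map
    ys = np.arange(h) * crop.shape[0] // h
    xs = np.arange(w) * crop.shape[1] // w
    return crop[np.ix_(ys, xs)]
```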
Fig. 13A, 13B, 13C, 13D, and 13E illustrate user interfaces for describing graphical indicators provided from an electronic device, according to various embodiments of the present disclosure.
Referring to fig. 13A, 13B, 13C, 13D, and 13E, electronic device 1301 may be implemented substantially the same as or similar to electronic device 401 described with reference to fig. 4.
Referring to fig. 13A, the electronic device 1301 may display an image IM on the display 1330. The electronic apparatus 1301 can focus on an area corresponding to the position of the object 1351 in the image IM. In addition, the electronic device 1301 may display a graphical indicator on at least a portion of the subject 1351.
The electronic device 1301 may display a zoom function window 1370 for performing the zoom function on the display 1330. The zoom function window 1370 may include an enlargement key 1371 for zooming in and a reduction key 1372 for zooming out.
Referring to fig. 13B, the electronic device 1301 may enlarge the image IM in response to a touch input to the enlargement key 1371. For example, the electronic device 1301 may enlarge the image IM by 1.5 times in response to the touch input. The electronic device 1301 may display an information window 1381 indicating the enlargement of the image IM.
In the case where the image IM is enlarged, the electronic device 1301 may display a graphical indicator 1352 on the enlarged subject. For example, the frequency of pixels corresponding to an enlarged subject can be reduced. Accordingly, the amount of the graphical indicator 1352 displayed on the enlarged subject can be reduced. The electronic device 1301 may display a graphical indicator 1352 corresponding to the enlarged subject.
Referring to fig. 13C, the electronic device 1301 may enlarge the image IM in response to a touch input to the enlargement key 1371. In the case where the image IM is enlarged, the electronic device 1301 may decrease the reference value for displaying the graphic indicator.
In the case where the reference value is decreased, the electronic device 1301 may increase the amount of the graphic indicator 1353 displayed on the enlarged subject. The electronic device 1301 may display a graphical indicator 1353 corresponding to the decreased reference value.
Referring to fig. 13D, the electronic device 1301 may reduce the image IM in response to a touch input to the reduction key 1372. For example, the electronic device 1301 may reduce the image IM to 0.7 times in response to the touch input. The electronic device 1301 may display an information window 1382 indicating the reduction of the image IM.
In the case where the image IM is reduced, the electronic apparatus 1301 may display a graphic indicator 1354 on the reduced subject. For example, the frequency of pixels corresponding to the reduced subject may be increased. Accordingly, the amount of the graphic indicator 1354 displayed on the reduced subject can be increased. The electronic device 1301 may display a graphical indicator 1354 corresponding to the zoomed-out subject.
Referring to fig. 13E, the electronic device 1301 may reduce the image IM in response to a touch input to a reduction key 1372. In the case where the image IM is reduced, the electronic device 1301 may increase the reference value for displaying the graphic indicator.
In the case where the reference value is increased, the electronic apparatus 1301 may decrease the amount of the graphic indicator 1355 displayed on the reduced subject. The electronic device 1301 may display the graphical indicator 1355 corresponding to the increased reference value.
Fig. 14 is a flow chart illustrating operation of an electronic device according to various embodiments of the present disclosure.
Referring to fig. 14, at operation 1401, a processor 410 (e.g., processor 410 of fig. 4) may display a graphical indicator on at least a portion of a first region corresponding to a focus region using depth information DI and contrast information of an image IM.
The processor 410 may determine a distance of at least one subject included in the image IM displayed on the display 430 based on the depth information DI.
For example, at operation 1403, the processor 410 may determine a distance of at least one subject included in a first region corresponding to the focus region of the image IM (e.g., a preview image) displayed on the display 430 based on the depth information DI. In addition, the processor 410 may also determine the distance of at least one object included in a region (hereinafter, referred to as out-of-focus region) different from the first region of the image IM displayed on the display 430 based on the depth information DI.
At operation 1405, the processor 410 may compare the determined distance of the subject to a reference distance. For example, the reference distance may refer to a distance for determining whether the distance between the electronic device 401 and the subject is a long distance or a short distance. In addition, the reference distance may include a first reference distance for determining whether the subject is located at a long distance and a second reference distance for determining whether the subject is located at a short distance.
At operation 1407, the processor 410 may control a reference value for displaying the graphical indicator according to the comparison result.
For example, in a case where the subject is located at a distance smaller than the second reference distance, the processor 410 may determine that the subject is located at a short distance. In the case where the subject is at a short distance, the processor 410 may decrease the reference value for displaying the graphical indicator.
For example, in a case where the subject is located at a distance equal to or greater than the first reference distance, the processor 410 may determine that the subject is located at a long distance. In the case where the subject is at a long distance, the processor 410 may increase the reference value for displaying the graphical indicator.
15A, 15B, and 15C illustrate user interfaces for describing graphical indicators provided from an electronic device, according to various embodiments of the present disclosure.
Referring to fig. 15A, the electronic device 1501 may display an image IM on the display 1530. The electronic device 1501 may focus on an area corresponding to the position of the object 1550 in the image IM. In addition, the electronic device 1501 may display a graphical indicator on at least a portion of the subject 1550.
The electronic device 1501 may determine the distance of the object 1550 based on the depth information DI.
Referring to fig. 15B, the electronic device 1501 may display a graphic indicator 1551 corresponding to a subject 1550 located at a short distance.
In the event that it is determined that the subject 1550 is located at a short distance, the electronic device 1501 may display the graphical indicator 1551 on the subject 1550. For example, in the case where the subject 1550 is located at a short distance, the frequency of the pixels corresponding to the subject 1550 may be low, so the amount of the graphical indicator 1551 displayed on the subject 1550 may be small. With so few indicator points, a manual focusing operation may not be easy. Accordingly, the electronic device 1501 can increase the amount of the graphical indicator 1551 by decreasing the reference value used to display the graphical indicator 1551, so that the manual focus operation becomes easy.
Referring to fig. 15C, the electronic device 1501 can display a graphical indicator 1552 corresponding to the decreased reference value.
In the case where it is determined that the subject 1550 is located at a short distance, the electronic device 1501 may decrease the reference value for displaying the graphical indicator. For example, the electronic device 1501 can reduce the reference value used to display the graphical indicator 1551 so that manual focus operations are easy.
In the event that the reference value is decreased, the electronic device 1501 may increase the amount of the graphical indicator 1552 displayed on the object 1550. Accordingly, the amount of the graphic indicator 1552 is increased, and thus a manual focusing operation may be easy.
An electronic device according to various embodiments may include a camera, a display, a memory, and a processor. The processor may be configured to: generating depth information corresponding to at least one subject based on information on pixels included in an image including the at least one subject obtained using the camera; determining, based on the depth information, a first region in the image displayed on the display that displays a first graphical indicator indicating a focus region in which the camera is focused, with control of a focus of the camera; and displaying the first graphical indicator on at least a portion of the subject included in the first region.
The processor may be configured to: displaying the first graphical indicator using a first attribute on at least a portion of a subject included in the first region; and displaying a graphical indicator using an attribute different from the first attribute on at least a portion of other regions of the image different from the first region based on the depth information.
The processor may be configured to: determining a second region farther from the electronic device than the first region and a third region closer to the electronic device than the first region among the other regions based on the depth information; displaying a second graphical indicator using a second attribute on at least a portion of the second region; and displaying a third graphical indicator using a third attribute on at least a portion of the third region.
The processor may be configured to: displaying, on the display, a graphical interface for obtaining an input signal for controlling a focus of the camera, in which a first portion corresponding to the first region is displayed using the first attribute, a second portion corresponding to the second region is displayed using the second attribute, and a third portion corresponding to the third region is displayed using the third attribute.
The processor may be configured to: determining a depth of the focus region, and displaying an attribute corresponding to the depth on the graphical interface.
The processor may be configured to: determining the first region based on depth information and contrast information of the image.
The processor may be configured to: controlling a reference value for displaying the first graphical indicator based on the depth information.
The processor may be configured to: in a case where the subject included in the first area is located closer to the electronic apparatus than a predetermined reference distance, the reference value for displaying the first graphical indicator is decreased.
The processor may be configured to: controlling a reference value for displaying the first graphical indicator based on the zoom ratio and the depth information of the image.
The processor may be configured to: decreasing the reference value for displaying the first graphical indicator if the zoom ratio of the image is increased, and increasing the reference value for displaying the first graphical indicator if the zoom ratio of the image is decreased.
Methods of operating an electronic device according to various embodiments may include: generating depth information corresponding to at least one subject based on information on pixels included in an image including the at least one subject obtained using a camera included in the electronic device; determining, based on the depth information, a first region in the image displayed on a display, the first region displaying a first graphical indicator for indicating a focus region in which the camera is focused, with control of a focus of the camera; and displaying the first graphical indicator on at least a portion of the subject included in the first region.
Displaying the first graphical indicator may include: displaying the first graphical indicator using a first attribute on at least a portion of a subject included in the first region, and displaying the graphical indicator using an attribute different from the first attribute on at least a portion of another region of the image different from the first region.
The method may further comprise: determining a second region farther from the electronic device than the first region and a third region closer to the electronic device than the first region among the other regions based on the depth information; displaying a second graphical indicator using a second attribute on at least a portion of the second region; and displaying a third graphical indicator using a third attribute on at least a portion of the third region.
The method may further comprise: displaying, on the display, a graphical interface for obtaining an input signal for controlling a focus of the camera, in which a first portion corresponding to the first region is displayed using the first attribute, a second portion corresponding to the second region is displayed using the second attribute, and a third portion corresponding to the third region is displayed using the third attribute.
Displaying the graphical interface may include: determining a depth of the focus region, and displaying an attribute corresponding to the depth on the graphical interface.
Determining the first region may include: determining the first region based on depth information and contrast information of the image.
The method may further comprise: controlling a reference value for displaying the first graphical indicator based on the depth information.
Controlling the reference value may include: in a case where the subject included in the first area is located closer to the electronic apparatus than a predetermined reference distance, the reference value for displaying the first graphical indicator is decreased.
The method may further comprise: controlling a reference value for displaying the first graphical indicator based on the zoom ratio and the depth information of the image.
An electronic device according to various embodiments may include a camera, a display, a memory, and a processor. The processor may be configured to: generating depth information corresponding to at least one subject based on information on pixels included in an image including the at least one subject obtained using the camera; determining a focus area on which the camera focuses according to an input signal in an image displayed on the display; and displaying a first graphical indicator on at least a portion of the subject included in the focus area based on the depth information.
Certain aspects of the present disclosure may also be embodied as computer readable code on a non-transitory computer readable recording medium. The non-transitory computer readable recording medium is any data storage device that can store data which can be thereafter read by a computer system. Examples of the non-transitory computer readable recording medium include read-only memory (ROM), random-access memory (RAM), compact disc ROM (CD-ROM), magnetic tapes, floppy disks, and optical data storage devices. The non-transitory computer readable recording medium may also be distributed over network coupled computer systems so that the computer readable code is stored and executed in a distributed fashion. Also, functional programs, codes, and code segments for accomplishing the present disclosure can be easily construed by programmers skilled in the art to which the present disclosure pertains.
In this regard, it should be noted that the various embodiments of the present disclosure as described above generally involve input data processing and output data generation to some extent. Such input data processing and output data generation may be implemented in hardware or software in combination with hardware. For example, certain electronic components may be employed in a mobile device or similar or related circuitry for implementing the functions associated with the various embodiments of the present disclosure described above. Alternatively, one or more processors operating in accordance with stored instructions may implement the functions associated with the various embodiments of the present disclosure described above. If this is the case, it is within the scope of the disclosure that such instructions may be stored on one or more non-transitory processor-readable media. Examples of the processor-readable medium include a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, and an optical data storage device. The processor-readable medium can also be distributed over network coupled computer systems so that the instructions are stored and executed in a distributed fashion. Furthermore, functional computer programs, instructions and instruction segments for implementing the present disclosure can be easily interpreted by a programming skilled person in the art to which the present disclosure pertains.
A computer-readable recording medium according to various embodiments may store a program for executing the method of operating an electronic device. The method may include: generating depth information corresponding to at least one subject based on information on pixels included in an image including the at least one subject obtained using a camera included in the electronic device; determining, based on the depth information, a first region in the image displayed on a display, the first region displaying a first graphical indicator for indicating a focus region in which the camera is focused, with control of a focus of the camera; and displaying the first graphical indicator on at least a portion of the subject included in the first region.
Each component of the electronic device according to the present disclosure may be implemented by one or more components, and the name of the corresponding component may vary according to the type of the electronic device. In various embodiments of the present disclosure, the electronic device may include at least one of the above elements. The electronic device may omit some of the above elements, or may further include additional elements. Furthermore, some of the components of the electronic device according to various embodiments may be combined to form a single entity that equally performs the functions of the respective elements before combination.
While the disclosure has been shown and described with reference to various embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the disclosure as defined by the appended claims and their equivalents.

Claims (10)

1. An electronic device, comprising:
a camera;
a display;
a memory; and
at least one processor configured to:
displaying an image including at least one subject obtained using the camera on the display,
obtaining depth information corresponding to the at least one subject based on information on pixels included in the image obtained using the camera,
in response to identifying a manual focus mode to manually control a focus of the camera, identifying, based on the depth information, a first region in the image displayed on the display that the camera is focused on, and a second region that is further from the electronic device than the first region and a third region that is closer to the electronic device than the first region among regions in the image other than the first region, wherein the second region and the third region are regions in which the camera is not focused,
displaying a first graphical indicator including a plurality of points of a first color on at least a portion of an outline of at least one first object included in the first area,
displaying a second graphic indicator including a plurality of points of a second color on at least a part of an outline of at least one second object included in the second area, and
displaying a third graphical indicator including a plurality of points of a third color on at least a part of an outline of at least one third object included in the third area,
wherein the first graphical indicator, the second graphical indicator, and the third graphical indicator are displayed together in the image,
wherein the at least one processor is further configured to: controlling a reference value for displaying the first graphical indicator based on the depth information and displaying the first graphical indicator according to the reference value such that the number of points comprised by the first graphical indicator increases or decreases based on the reference value,
wherein a display position of each of the first graphical indicator, the second graphical indicator, and the third graphical indicator changes based on manually changing a focus of the camera.
2. The electronic device of claim 1, wherein the at least one processor is further configured to:
displaying on the display a graphical interface for obtaining input signals for controlling a focus of the camera, in which graphical interface:
displaying a first portion corresponding to the first region using the first color,
displaying a second portion corresponding to the second area using the second color, an
Displaying a third portion corresponding to the third area using the third color.
3. The electronic device of claim 2, wherein the at least one processor is further configured to:
identifying a depth of the first region, an
Displaying the attribute corresponding to the depth on the graphical interface.
4. The electronic device of claim 1, wherein the at least one processor is further configured to: identifying the first region based on depth information and contrast information of the image.
5. The electronic device of claim 1, wherein the at least one processor is further configured to: in a case where the at least one first subject included in the first area is located closer to the electronic device than a predetermined reference distance, decreasing a reference value for displaying the first graphical indicator.
6. The electronic device of claim 1, wherein the at least one processor is further configured to: controlling a reference value for displaying the first graphical indicator based on the zoom ratio of the image and the depth information.
7. The electronic device of claim 6, wherein the at least one processor is further configured to:
in the case where the zoom ratio of the image is increased, decreasing a reference value for displaying the first graphic indicator, and
increasing a reference value for displaying the first graphical indicator in a case where a zoom ratio of the image decreases.
8. A method of operating an electronic device, the method comprising:
displaying an image including at least one subject obtained using a camera included in the electronic device;
obtaining depth information corresponding to the at least one subject based on information on pixels included in the image obtained using the camera;
in response to identifying a manual focus mode to manually control a focus of the camera, identifying, based on the depth information, a first region in the image displayed on a display, the first region displaying a first graphical indicator indicating a focused region in which the camera is focused and a second region, among regions in the image other than the first region, farther from the electronic device than the first region and a third region closer to the electronic device than the first region, the second and third regions being regions in which the camera is unfocused; and
displaying a first graphical indicator including a plurality of points of a first color on at least a part of an outline of at least one first subject included in the first region, displaying a second graphical indicator including a plurality of points of a second color on at least a part of an outline of at least one second subject included in the second region, and displaying a third graphical indicator including a plurality of points of a third color on at least a part of an outline of at least one third subject included in the third region,
wherein the first graphical indicator, the second graphical indicator, and the third graphical indicator are displayed together in the image,
wherein the method further comprises: controlling a reference value for displaying the first graphical indicator based on the depth information and displaying the first graphical indicator according to the reference value such that the number of points comprised by the first graphical indicator increases or decreases based on the reference value,
wherein a display position of each of the first graphical indicator, the second graphical indicator, and the third graphical indicator changes based on manually changing a focus of the camera.
9. The method of claim 8, further comprising:
displaying a graphical interface on the display for receiving an input signal to control a focus of the camera; and
in the graphical interface, a first portion corresponding to the first region is displayed using the first color, a second portion corresponding to the second region is displayed using the second color, and a third portion corresponding to the third region is displayed using the third color.
10. The method of claim 9, wherein displaying the graphical interface comprises: determining a depth of the first region, and displaying a property corresponding to the depth on the graphical interface.
CN201810249601.4A 2017-03-24 2018-03-23 Electronic device providing a graphical indicator for a focus and method of operating an electronic device Active CN108632529B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2017-0037826 2017-03-24
KR1020170037826A KR102379898B1 (en) 2017-03-24 2017-03-24 Electronic device for providing a graphic indicator related to a focus and method of operating the same

Publications (2)

Publication Number Publication Date
CN108632529A CN108632529A (en) 2018-10-09
CN108632529B true CN108632529B (en) 2021-12-03

Family

ID=61912953

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810249601.4A Active CN108632529B (en) 2017-03-24 2018-03-23 Electronic device providing a graphical indicator for a focus and method of operating an electronic device

Country Status (4)

Country Link
US (1) US10868954B2 (en)
EP (1) EP3379821A1 (en)
KR (1) KR102379898B1 (en)
CN (1) CN108632529B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110149482B (en) * 2019-06-28 2021-02-02 Oppo广东移动通信有限公司 Focusing method, focusing device, electronic equipment and computer readable storage medium
JP6961889B1 (en) * 2020-06-30 2021-11-05 エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co., Ltd Control device, imaging device, control method, and program
US11632601B1 (en) * 2021-11-11 2023-04-18 Qualcomm Incorporated User interface for camera focus
DE102022207014A1 (en) * 2022-07-08 2024-01-11 Arnold & Richter Cine Technik Gmbh & Co. Betriebs Kg Camera assistance system

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160065943A1 (en) * 2014-09-03 2016-03-03 Samsung Electronics Co., Ltd. Method for displaying images and electronic device thereof
AU2003226336A1 (en) * 2002-04-09 2003-10-27 University Of Iowa Research Foundation Reconstruction and motion analysis of an embryo
JP2005278053A (en) 2004-03-26 2005-10-06 Sony Corp Imaging apparatus
KR20100082147A (en) * 2009-01-08 2010-07-16 삼성전자주식회사 Method for enlarging and changing captured image, and phographed apparatus using the same
US20120249550A1 (en) * 2009-04-18 2012-10-04 Lytro, Inc. Selective Transmission of Image Data Based on Device Attributes
US8176509B2 (en) * 2009-06-30 2012-05-08 Yahoo! Inc. Post processing video to identify interests based on clustered user interactions
US8600226B2 (en) 2010-08-30 2013-12-03 Samsung Electronics Co., Ltd. Focusing methods and apparatus, and recording media for recording the methods
KR101797040B1 (en) 2011-11-28 2017-11-13 삼성전자주식회사 Digital photographing apparatus and control method thereof
US20150146072A1 (en) 2012-04-19 2015-05-28 Sony Mobile Communications Ab Image focusing
US20150010236A1 (en) 2013-07-08 2015-01-08 Htc Corporation Automatic image refocusing method
US9554031B2 (en) * 2013-12-31 2017-01-24 Light Labs Inc. Camera focusing related methods and apparatus
KR102326275B1 (en) 2014-02-25 2021-11-16 삼성전자주식회사 Image displaying method and apparatus

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101753844A (en) * 2008-12-18 2010-06-23 三洋电机株式会社 Image display apparatus and image sensing apparatus
CN102625044A (en) * 2011-01-31 2012-08-01 三洋电机株式会社 Image pickup apparatus
CN105210018A (en) * 2013-05-16 2015-12-30 索尼公司 User interface for selecting a parameter during image refocusing

Also Published As

Publication number Publication date
US10868954B2 (en) 2020-12-15
EP3379821A1 (en) 2018-09-26
CN108632529A (en) 2018-10-09
KR102379898B1 (en) 2022-03-31
KR20180108280A (en) 2018-10-04
US20180278837A1 (en) 2018-09-27

Similar Documents

Publication Title
CN108289161B (en) Electronic device and image capturing method thereof
US11442580B2 (en) Screen configuration method, electronic device, and storage medium
CN110462572B (en) Electronic device and control method thereof
US10574895B2 (en) Image capturing method and camera equipped electronic device
CN107800930B (en) Image synthesizing method and electronic device supporting the same
CN107257954B (en) Apparatus and method for providing screen mirroring service
US11030288B2 (en) Electronic device for authenticating using biometric information and method of operating electronic device
CN108427533B (en) Electronic device and method for determining environment of electronic device
CN108632529B (en) Electronic device providing a graphical indicator for a focus and method of operating an electronic device
US10412339B2 (en) Electronic device and image encoding method of electronic device
US10466856B2 (en) Electronic device having two displays and a method for executing a different application on each display of the electronic device based on simultaneous inputs into a plurality of application icons
US10719209B2 (en) Method for outputting screen and electronic device supporting the same
US9942467B2 (en) Electronic device and method for adjusting camera exposure
KR20160031217A (en) Method for controlling and an electronic device thereof
US10033921B2 (en) Method for setting focus and electronic device thereof
US10198828B2 (en) Image processing method and electronic device supporting the same
KR20160137258A (en) Electronic apparatus and method for displaying screen thereof
US20170111608A1 (en) Method for recording execution screen and electronic device for processing the same
US10334174B2 (en) Electronic device for controlling a viewing angle of at least one lens and control method thereof
US11210828B2 (en) Method and electronic device for outputting guide
CN110537177B (en) Electronic device and operation method thereof

Legal Events

Code Title
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant