CN110996013B - Electronic device and method for processing image

Info

Publication number: CN110996013B
Application number: CN201911355618.9A
Authority: CN (China)
Prior art keywords: image, image data, electronic device, data, processor
Legal status: Active (assumed status; not a legal conclusion)
Other languages: Chinese (zh)
Other versions: CN110996013A
Inventor: 白宇铉
Current and original assignee: Samsung Electronics Co Ltd
Priority claimed from: KR1020140025000A (KR102124188B1)
Application filed by: Samsung Electronics Co Ltd
Publication of application: CN110996013A
Publication of grant: CN110996013B

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00 Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/32 Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
    • H04N1/32101 Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00 Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/0035 User-machine interface; Control console
    • H04N1/00405 Output means
    • H04N1/00408 Display of information to the user, e.g. menus
    • H04N1/0044 Display of information to the user, e.g. menus for image preview or review, e.g. to help the user position a sheet
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00 Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/32 Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
    • H04N1/32101 Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
    • H04N1/32106 Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title separate from the image data, e.g. in a different computer file
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/45 Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/63 Control of cameras or camera modules by using electronic viewfinders
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/63 Control of cameras or camera modules by using electronic viewfinders
    • H04N23/631 Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • H04N23/632 Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters for displaying or modifying preview images prior to image capturing, e.g. variety of image resolutions or capturing parameters
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/95 Computational photography systems, e.g. light-field imaging systems
    • H04N23/951 Computational photography systems, e.g. light-field imaging systems by using two or more images to influence resolution, frame rate or aspect ratio
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/222 Studio circuitry; Studio devices; Studio equipment
    • H04N5/262 Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2101/00 Still video cameras
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2201/00 Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
    • H04N2201/0008 Connection or combination of a still picture apparatus with another apparatus
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2201/00 Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
    • H04N2201/32 Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
    • H04N2201/3201 Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
    • H04N2201/3225 Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to an image, a page or a document
    • H04N2201/325 Modified version of the image, e.g. part of the image, image reduced in size or resolution, thumbnail or screennail
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2201/00 Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
    • H04N2201/32 Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
    • H04N2201/3201 Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
    • H04N2201/3273 Display
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/222 Studio circuitry; Studio devices; Studio equipment
    • H04N5/262 Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects
    • H04N5/2621 Cameras specially adapted for the electronic generation of special effects during image pickup, e.g. digital cameras, camcorders, video cameras having integrated special effects capability
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/01 Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level
    • H04N7/0117 Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level involving conversion of the spatial resolution of the incoming video signal

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Theoretical Computer Science (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

An electronic device and a method of processing an image are provided. The electronic device includes a first image sensor, a second image sensor, one or more image processing modules, a display, and a thumbnail generation module. The first image sensor generates first image data. The second image sensor generates second image data. The one or more image processing modules process one or more of the first image data and the second image data. The display displays one or more of the first image data and the second image data processed by the one or more image processing modules. The thumbnail generation module generates thumbnail data using one or more of the first image data and the second image data processed by the one or more image processing modules. The method includes converting a plurality of image data into a format displayable on a display and generating thumbnail data using the image data in the displayable format.

Description

Electronic device and method for processing image
The present application is a divisional application of the patent application having a filing date of March 13, 2014, an application number of 201480014885.9, and the title "electronic device and method of processing image".
Technical Field
Various embodiments of the present disclosure relate to a method for processing an image and an electronic device thereof.
Background
With the development of information technology (IT) and semiconductor technology, various types of electronic devices have evolved into multimedia devices that provide a variety of multimedia services. For example, a portable electronic device may provide various multimedia services such as a broadcasting service, a wireless Internet service, and a music playing service.
The electronic device may provide a variety of services using one or more images acquired by the image sensor. For example, an electronic device may perform image processing such as level adjustment, noise removal, gamma correction, color space conversion, and the like on image data acquired through an image sensor using an Image Signal Processor (ISP), and provide various services.
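By way of illustration only, the following Python sketch (using NumPy, with illustrative constants that are not taken from the patent) shows the kinds of stages an ISP pipeline of this sort chains together: level adjustment, gamma correction, and a color space conversion toward a display- and storage-friendly format.

```python
import numpy as np

def level_adjust(img, black=16, white=235):
    # Stretch the [black, white] range to the full [0, 255] range.
    out = (img.astype(np.float32) - black) * (255.0 / (white - black))
    return np.clip(out, 0.0, 255.0)

def gamma_correct(img, gamma=2.2):
    # Power-law gamma correction on values normalized to [0, 1].
    return 255.0 * (img / 255.0) ** (1.0 / gamma)

def rgb_to_yuv(img):
    # BT.601 color space conversion: one way an ISP turns sensor RGB
    # into the YUV commonly used for preview, storage, and encoding.
    m = np.array([[ 0.299,  0.587,  0.114],
                  [-0.147, -0.289,  0.436],
                  [ 0.615, -0.515, -0.100]], dtype=np.float32)
    return img @ m.T

frame = np.random.randint(0, 256, (480, 640, 3)).astype(np.float32)
yuv = rgb_to_yuv(gamma_correct(level_adjust(frame)))  # display-ready data
```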
However, since the electronic device performs a variety of image processing using a single ISP, the processing speed of the image data may be reduced.
Disclosure of Invention
In order to address the above-mentioned deficiencies, it is a primary object to provide an apparatus and method for efficiently processing image data acquired by one or more image sensors in an electronic device.
Embodiments of the present disclosure may provide an apparatus and method of reducing a processing delay of image data acquired by one or more sensors in an electronic device.
Embodiments of the present disclosure may provide an apparatus and method for efficiently generating thumbnail data regarding captured image data in an electronic device.
Embodiments of the present disclosure may provide an apparatus and method for generating thumbnail data regarding captured image data using a processor different from an image processing unit (i.e., an Image Signal Processor (ISP)) in an electronic device.
Embodiments of the present disclosure may provide an apparatus and method for generating, in a different processor of an electronic device, thumbnail data regarding captured image data using one or more data generated in an image processing unit (i.e., ISP).
Embodiments of the present disclosure may provide an apparatus and method for interlocking (i.e., associating) and storing captured image data and thumbnail data generated using a processor different from an image processing unit (i.e., ISP) in an electronic device.
Embodiments of the present disclosure may provide an apparatus and method for interlocking and storing, in a different processor of an electronic device, captured image data and thumbnail data using metadata generated in an image processing unit (i.e., ISP).
The above aspects are achieved by providing an electronic device and a method for processing an image.
According to an embodiment of the present disclosure, an electronic device includes a first image sensor, a second image sensor, one or more image processing modules, a display, and a thumbnail generation module. The first image sensor generates first image data. The second image sensor generates second image data. The one or more image processing modules process one or more of the first image data and the second image data. The display displays one or more of the first image data or the second image data processed by the one or more image processing modules. The thumbnail generation module generates thumbnail data using one or more of the first image data or the second image data processed by the one or more image processing modules.
According to an embodiment of the present disclosure, an electronic device includes one or more processors and a display unit. The one or more processors receive the image data, process the image data, and generate a preview image. A display unit displays the preview image generated by the one or more processors. The one or more processors are configured to generate an image smaller in size than the preview image using at least a portion of the preview image in response to a signal corresponding to the capture instruction.
According to an embodiment of the present disclosure, there is provided an operating method of an electronic device. The method includes the following operations: generating a plurality of image data using a plurality of image sensors; converting, by one or more image processing modules, the plurality of image data into a format displayable on a display unit; and generating, in another module separate from the image processing modules, thumbnail data using the image data converted into the displayable format by the image processing modules.
According to an embodiment of the present disclosure, there is provided an operating method of an electronic device. The method comprises the following operations: storing one or more image data; converting, by one or more processors, one or more of the one or more image data into a preview image; in response to the signal indicative of the capture instruction, generating, by the processor, an image of a size smaller than the preview image using at least a portion of the preview image.
According to an embodiment of the present disclosure, an electronic device includes one or more image sensors and an interface. The one or more image sensors generate image data. An interface processes image data generated in the one or more image sensors. The interface sends the image data to one or more modules. The one or more modules change a format of the image data based on an image data processing method of the corresponding module.
Before proceeding to the following detailed description, it may be helpful to set forth definitions of certain words and phrases used throughout this patent document: the terms "include" and "comprise," as well as derivatives thereof, are intended to be inclusive and not limiting; the term "or" is inclusive, meaning and/or; the phrases "associated with" and "associated therewith," as well as derivatives thereof, may mean to include, be included within, interconnect with, contain, be contained within, connect to or with, be coupled to or with, communicate with, cooperate with, interleave, juxtapose, be proximate to, be bound to or with, have, or have a property of; the term "controller" means any device, system, or part thereof that controls at least one operation, and such a device may be implemented in hardware, firmware, or software, or some combination of at least two of the same. It should be noted that the functionality associated with any particular controller may be centralized or distributed, whether locally or remotely. Definitions for certain words and phrases are provided throughout this patent document; those of ordinary skill in the art should understand that in many, if not most, instances such definitions apply to prior as well as future uses of such defined words and phrases.
Drawings
For a more complete understanding of the present disclosure and the advantages thereof, reference is now made to the following descriptions taken in conjunction with the accompanying drawings, in which like reference numbers indicate like parts:
FIG. 1A is a diagram illustrating a network environment including an electronic device, according to an embodiment of the present disclosure;
FIG. 1B is a block diagram illustrating an electronic device according to an embodiment of the present disclosure;
FIG. 2 is a detailed block diagram illustrating a processor according to an embodiment of the present disclosure;
FIG. 3 is a block diagram illustrating an electronic device according to another embodiment of the present disclosure;
FIG. 4 is a detailed block diagram illustrating a processor according to another embodiment of the present disclosure;
FIG. 5 is a block diagram illustrating an electronic device according to yet another embodiment of the present disclosure;
FIG. 6 is a detailed block diagram illustrating a processor according to yet another embodiment of the present disclosure;
FIG. 7 is a detailed block diagram illustrating an external image processing unit according to an embodiment of the present disclosure;
FIG. 8 is a detailed block diagram illustrating a memory according to an embodiment of the present disclosure;
FIG. 9 is a block diagram illustrating an interface according to an embodiment of the present disclosure;
FIG. 10 is a flowchart illustrating a process for generating thumbnail data in an electronic device according to an embodiment of the present disclosure;
FIG. 11 is a flowchart illustrating a process for interlocking and storing thumbnail data and acquired image data in an electronic device according to an embodiment of the present disclosure;
FIGS. 12A and 12B are diagrams illustrating a structure of divided and stored image data according to an embodiment of the present disclosure;
FIG. 13 is a block diagram illustrating an electronic device according to yet another embodiment of the present disclosure.
Detailed Description
FIGS. 1A through 13, discussed below, and the various embodiments used to describe the principles of the present disclosure in this patent document are by way of illustration only and should not be construed in any way to limit the scope of the disclosure. Those skilled in the art will understand that the principles of the present disclosure may be implemented in any suitably arranged electronic device. The present disclosure is described below with reference to the accompanying drawings. The present disclosure is susceptible to various modifications and embodiments; specific embodiments are shown by way of example in the drawings and disclosed in the related detailed description. However, these are not intended to limit the present disclosure to specific embodiment forms, and the present disclosure should be understood to include all changes, equivalents, and substitutions falling within its spirit and technical scope. In the description of the drawings, the same reference numerals are used for the same constituent elements.
Expressions such as "include" and "may include" used in the present disclosure indicate the presence of the disclosed corresponding functions, operations, constituent elements, and the like, and do not limit one or more additional functions, operations, constituent elements, and the like. Further, in the present disclosure, it should be understood that the terms "comprises," "comprising," "includes," "including," "has," and "having" specify the presence of stated features, quantities, steps, operations, elements, components, or combinations thereof, and do not preclude the presence or addition of one or more other features, quantities, steps, operations, elements, components, or combinations thereof.
In this disclosure, the expression "or" and the like includes any and all combinations of the words listed together. For example, "A or B" may include A, may include B, or may include both A and B.
In the present disclosure, expressions such as "first" and "second" may modify various constituent elements of the present disclosure but do not limit the corresponding constituent elements. For example, the expressions do not limit the order and/or importance of the corresponding constituent elements. The expressions may be used to distinguish one constituent element from another. For example, a first user device and a second user device are both user devices and represent different user devices. A first constituent element may be named a second constituent element without departing from the spirit and scope of the present disclosure, and likewise a second constituent element may be named a first constituent element.
When one constituent element is referred to as being "connected" or "accessed" to another constituent element, it should be understood that the one constituent element may be directly connected or accessed to the other constituent element, or a third constituent element may exist between the two constituent elements. In contrast, when one constituent element is referred to as being "directly connected" or "directly accessing" another constituent element, it should be understood that no third constituent element exists between the two constituent elements.
The terminology used in the present disclosure is for the purpose of describing particular embodiments only and is not intended to limit the spirit and scope of the present disclosure. Unless the context clearly dictates otherwise, singular expressions include plural expressions.
Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meanings as commonly understood by one of ordinary skill in the art. Unless clearly defined in the present disclosure, terms as defined in a general dictionary should be interpreted as having meanings consistent with the context of the related art and are not to be interpreted in an idealized or overly formal manner.
An electronic device according to an embodiment of the present disclosure may be a device including a camera function. For example, the electronic device may include at least one of a smart phone, a tablet Personal Computer (PC), a mobile phone, a video phone, an e-book reader, a desktop PC, a laptop PC, a netbook computer, a Personal Digital Assistant (PDA), a Portable Media Player (PMP), an MPEG audio layer 3 (MP3) player, a mobile medical instrument, a camera, and a wearable device (e.g., a Head Mounted Display (HMD) such as electronic glasses, an electronic garment, an electronic bracelet, an electronic necklace, an electronic accessory, an electronic tattoo, or a smart watch).
According to some embodiments, the electronic device may be a smart electronic home appliance having a camera function. For example, the smart electronic home appliance may include at least one of a television, a Digital Versatile Disk (DVD) player, an audio device, a refrigerator, an air conditioner, a cleaner, an oven, a microwave oven, a washing machine, an air purifier, a set-top box, a TV box (e.g., Samsung HomeSync™, Apple TV™, or Google TV™), a game console, an electronic dictionary, an electronic lock system, a video camera, and an electronic photo frame.
According to some embodiments, the electronic device may include at least one of various medical instruments (e.g., Magnetic Resonance Angiography (MRA), Magnetic Resonance Imaging (MRI), Computerized Tomography (CT), movie cameras, ultrasound machines, and the like), navigation devices, Global Positioning System (GPS) receivers, vehicle entertainment devices, marine electronics (e.g., marine navigation devices, gyroscopic compasses, and the like), avionics, security instruments, and industrial or domestic robots.
According to some embodiments, the electronic device may include at least one of furniture or a part of a building/structure having a camera function, an electronic board, an electronic signature input device, a projector, and various measuring instruments (e.g., tap water, electricity, gas, or radio wave measuring instruments). The electronic device according to the present disclosure may be one of the foregoing various devices or a combination of two or more of the foregoing. Furthermore, it is apparent to those skilled in the art that the electronic device according to the present disclosure is not limited to the aforementioned devices.
An electronic device according to various embodiments will be described below with reference to the accompanying drawings. The term "user" as used in various embodiments may refer to a person using an electronic device or a device that uses an electronic device (e.g., an artificial intelligence electronic device).
In the following, embodiments of the present disclosure describe techniques for processing image data acquired by multiple image sensors in an electronic device.
Fig. 1A is a diagram illustrating a network environment 100 including an electronic device according to an embodiment of the present disclosure.
Referring to fig. 1A, an electronic device 101 may include a bus 110, a processor 120, a memory 130, an input/output interface 140, a display 150, a communication interface 160, an image processing module 170, and an image sensor module 180.
The bus 110 may be a circuit that connects the aforementioned constituent elements to each other and transmits a communication signal (e.g., a control message) between the aforementioned constituent elements.
The processor 120 may receive instructions from the aforementioned other constituent elements (e.g., the memory 130, the input/output interface 140, the display 150, the communication interface 160, the image processing module 170, or the image sensor module 180), for example, through the bus 110, interpret the received instructions, and perform operations or data processing according to the interpreted instructions.
The memory 130 may store instructions or data received from or generated by the processor 120 or the other constituent elements (e.g., the input/output interface 140, the display 150, the communication interface 160, the image processing module 170, or the image sensor module 180). The memory 130 may include program modules such as a kernel 131, middleware 132, an Application Programming Interface (API) 133, or applications 134. Each of the aforementioned program modules may be composed of software, firmware, hardware, or a combination of at least two of them.
The kernel 131 may control or manage the system resources (e.g., the bus 110, the processor 120, or the memory 130) used to perform operations or functions implemented in the other program modules (e.g., the middleware 132, the API 133, or the applications 134). Further, the kernel 131 may provide an interface that enables the middleware 132, the API 133, or the applications 134 to access and control or manage the individual constituent elements of the electronic device 101.
The middleware 132 may act as a relay that enables the API 133 or the applications 134 to communicate and exchange data with the kernel 131. Further, regarding work requests received from the applications 134, the middleware 132 may perform control (e.g., scheduling or load balancing) of the work requests, for example, by assigning at least one of the applications 134 a priority order for using the system resources (e.g., the bus 110, the processor 120, or the memory 130) of the electronic device 101.
The API 133, which is an interface that enables the applications 134 to control functions provided by the kernel 131 or the middleware 132, may include, for example, at least one interface or function (e.g., an instruction) for file control, window control, image processing, or character control.
According to various embodiments, the applications 134 may include a Short Message Service (SMS)/Multimedia Message Service (MMS) application, an electronic mail (e-mail) application, a calendar application, an alarm clock application, a healthcare application (e.g., an application that measures an amount of exercise, blood glucose, or the like), or an environmental information application (e.g., an application that provides pressure, humidity, or temperature information, or the like), among others. Additionally or alternatively, the applications 134 may include an application related to the exchange of information between the electronic device 101 and an external electronic device (e.g., the electronic device 102 or the electronic device 104). The applications related to information exchange may include, for example, a notification push application for pushing specific information to an external electronic device or a device management application for managing the external electronic device.
For example, the notification push application may include functionality to push notification information generated in other applications of the electronic device 101 (e.g., an SMS/MMS application, an e-mail application, a healthcare application, or an environmental information application, etc.) to an external electronic device (e.g., the electronic device 102 or the electronic device 104). Additionally or alternatively, for example, the notification push application may receive notification information from an external electronic device (e.g., electronic device 102 or electronic device 104) and provide the received notification information to, for example, a user. For example, the device management application may manage (e.g., install, delete, or update) a function of at least a part of an external electronic device (e.g., the electronic device 102 or the electronic device 104) that communicates with the electronic device 101 (e.g., enable/disable the external electronic device itself or some component thereof, or adjust display brightness or resolution), an application running in the external electronic device, or a service (e.g., a call service or an information service) provided in the external electronic device.
According to various embodiments, the applications 134 may include applications that are specified according to attributes (e.g., categories) of an external electronic device (e.g., the electronic device 102 or the electronic device 104). For example, when the external electronic device is an MP3 player, the applications 134 may include applications related to music playing. Similarly, when the external electronic device is a mobile medical instrument, the applications 134 may include healthcare-related applications. According to one embodiment, the applications 134 may include at least one of an application designated for the electronic device 101 and an application received from an external electronic device (e.g., the server 106, the electronic device 102, or the electronic device 104).
The input/output interface 140 may transfer instructions or data input from a user through sensors (e.g., acceleration sensors and gyro sensors) or input devices (e.g., a keyboard or a touch screen) to, for example, the processor 120, the memory 130, the communication interface 160, or the image processing module 170 via the bus 110. For example, the input/output interface 140 may provide data regarding a user's touch input through the touch screen to the processor 120. Further, the input/output interface 140 may output instructions or data received from the processor 120, the memory 130, the communication interface 160, or the image processing module 170 via the bus 110 through an output device (e.g., a speaker or display). For example, the input/output interface 140 may output sound data processed by the processor 120 to a user via a speaker.
The display 150 may display various information (e.g., multimedia data or text data, etc.) to a user.
The communication interface 160 may connect communication between the electronic device 101 and an external device (e.g., the electronic device 102, the electronic device 104, or the server 106). For example, the communication interface 160 may support network communication 162 (e.g., the Internet, a Local Area Network (LAN), a Wide Area Network (WAN), a telecommunications network, a cellular network, a satellite network, or a Plain Old Telephone System (POTS)), short-range communication 164 (e.g., Wireless Fidelity (WiFi), Bluetooth (BT), or Near Field Communication (NFC)), and wired communication (e.g., Universal Serial Bus (USB), High-Definition Multimedia Interface (HDMI), Recommended Standard 232 (RS-232), or POTS). According to one embodiment, a protocol (e.g., a short-range communication protocol, a network communication protocol, or a wired communication protocol) for communication between the electronic device 101 and an external device may be supported in at least one of the API 133 and the middleware 132. The electronic devices 102 and 104 can each be the same device (e.g., the same kind of device) as the electronic device 101 or a different device (e.g., a different kind of device).
The image sensor module 180 may provide image data acquired through subject photographing to the image processing module 170. At this time, the image sensor module 180 may include at least one image sensor module functionally connected to the electronic device 101.
The image processing module 170 may perform image processing on image data provided from the image sensor module 180 or the external electronic devices 102 and 104. For example, the image processing module 170 may perform one or more of level adjustment, noise removal, gamma correction, and conversion of image data into a format that can be displayed on the display 150. The image processing module 170 may control storing the image-processed image data in the memory 130 or displaying the image data on the display 150. For example, the image processing module 170 may transmit image data (e.g., YUV data) displayed on the display 150 and metadata related to the corresponding image data to the memory 130. Here, the image processing for converting into a format that can be displayed on the display 150 may include color space conversion.
The image processing module 170 may select and synthesize at least two image data among the image data acquired through the at least one image sensor module 180. For example, the image processing module 170 may select and synthesize at least two image data using the image acquisition time stamps corresponding to the image data, or the image processing delay times together with the image acquisition time stamps.
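By way of illustration only, the following sketch shows one way such timestamp-based selection could work (a hypothetical Python helper, not the patent's algorithm): each candidate frame's acquisition time stamp is corrected by its sensor's processing delay, and the pair with the smallest corrected skew is chosen for synthesis.

```python
def select_pair(frames_a, frames_b, delay_a=0.0, delay_b=0.0):
    # frames_a / frames_b: non-empty lists of (acquisition_timestamp, image)
    # tuples from two sensors; delay_a / delay_b model per-sensor
    # image processing delay.
    best = None
    for ts_a, img_a in frames_a:
        for ts_b, img_b in frames_b:
            skew = abs((ts_a - delay_a) - (ts_b - delay_b))
            if best is None or skew < best[0]:
                best = (skew, img_a, img_b)
    return best[1], best[2]  # the two frames to synthesize
```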
As another example, when a capture event occurs, the image processing module 170 may generate thumbnail data regarding the acquired image data using the image data (e.g., preview images) stored in the memory 130 and the metadata about each image data. For example, the image processing module 170 may generate the thumbnail data regarding the captured image data using a different module that is logically or physically separate from the module that performs image processing on the image data provided from the image sensor module 180. The thumbnail data may be a reduced image that facilitates retrieval of the corresponding image, or image data that allows a user to easily recognize the corresponding image.
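A minimal sketch of this idea, assuming the preview is a NumPy array of display-ready YUV data (the function name and decimation factor are illustrative assumptions, not the patent's implementation):

```python
def make_thumbnail(preview_yuv, metadata, factor=8):
    # Reuse the display-ready preview instead of re-running the ISP on the
    # full-resolution capture: simple decimation yields a much smaller image,
    # and the metadata (e.g., time stamp, frame id) travels with it so the
    # thumbnail can later be matched to the captured image.
    return preview_yuv[::factor, ::factor], dict(metadata)
```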
Fig. 1B is a block diagram illustrating an electronic device according to an embodiment of the present disclosure.
Referring to fig. 1B, the electronic device 101 may include a processor 120, a memory 130, image sensors 180-1 to 180-N, an input unit (input interface) 140, and a display unit (i.e., a display) 150. Here, the processor 120 may include an Application Processor (AP).
The processor 120 may control the electronic device 101 to provide a variety of services.
The processor 120 may interpret instructions received from one or more other constituent elements included in the electronic device 101 (e.g., the memory 130, the image sensors 180-1 to 180-N, the display unit 150, and the input unit 140) and perform operations or data processing according to the interpreted instructions. For example, the processor 120 may perform one or more image processing operations such as level adjustment, noise removal, gamma correction, and conversion of the image data provided from the image sensors 180-1 to 180-N into a format displayable on the display unit 150. The processor 120 may control storing the image-processed image data in the memory 130 or displaying it on the display unit 150. For example, the processor 120 may transmit the image data (e.g., YUV data) displayed on the display unit 150 and the metadata about the corresponding image data to the memory 130. Here, the image processing for converting into a format displayable on the display unit 150 may include color space conversion.
The processor 120 may execute one or more programs stored in the memory 130 and may control the electronic device 101 to provide a variety of multimedia services. For example, the processor 120 may execute a program stored in the memory 130 and select and synthesize at least two image data among the image data acquired by the image sensors 180-1 to 180-N. For example, the processor 120 may select and synthesize at least two image data using the image acquisition time stamps corresponding to the image data, or the image processing delay times together with the image acquisition time stamps.
As another example, when a capture event occurs, the processor 120 may generate thumbnail data regarding the captured image data using the image-processed image data (e.g., preview image) and metadata regarding each image data stored in the memory 130. For example, the processor 120 may generate thumbnail data regarding captured image data using a different module logically or physically separate from a module (e.g., ISP) that image-processes image data provided from the image sensors 180-1 to 180-N.
The memory 130 may store instructions or data received from or generated by one or more constituent elements included in the electronic device 101. For example, the memory 130 may include an internal memory or an external memory. The internal memory may include, for example, at least one of volatile memory (e.g., Dynamic Random Access Memory (DRAM), Static Random Access Memory (SRAM), Synchronous Dynamic Random Access Memory (SDRAM), and the like) and non-volatile memory (e.g., One-Time Programmable Read Only Memory (OTPROM), Programmable Read Only Memory (PROM), Erasable PROM (EPROM), Electrically Erasable Programmable ROM (EEPROM), mask ROM, flash ROM, NAND flash memory, NOR flash memory, and the like). The external memory may include a flash drive, for example, at least one of Compact Flash (CF), Secure Digital (SD), micro-SD, extreme digital (xD), and a memory stick. The external memory may be functionally connected with the electronic device 101 via various interfaces.
The image sensors 180-1 to 180-N may provide image data acquired through subject photographing. At this time, the image sensors 180-1 to 180-N may transmit image data to the processor 120 through a serial interface, such as a Mobile Industry Processor Interface (MIPI) or a Mobile Display Digital Interface (MDDI), or a parallel interface, such as a parallel bus. Here, the first image sensor 180-1 may be located on the front of the electronic device 101, and the Nth image sensor 180-N may be located on the rear of the electronic device 101.
The input unit 140 may transmit instructions or data input by a user to the processor 120 or the memory 130. For example, the input unit 140 may include a touch input unit, a pen sensor, a key, or an ultrasonic input device.
The display unit 150 may provide status information, still pictures, moving pictures, or data of the electronic device 101 through a graphic user interface. For example, the display unit 150 may display one or more images provided from the processor 120. As another example, the display unit 150 may display at least one image selected by the processor 120 based on the image acquisition time stamps, or the image processing delay times and the image acquisition time stamps.
Although not shown, the electronic device 101 may further include a communication unit capable of connecting to other electronic devices or servers via voice or data communication. Here, the communication unit may be divided into a plurality of communication sub-modules supporting different communication networks.
In the foregoing embodiments, the electronic device 101 may include a plurality of image sensors 180-1 to 180-N. At this time, one or more image sensors among the plurality of image sensors 180-1 to 180-N may be selectively connected to the electronic device 101. For example, one or more of the plurality of image sensors 180-1 through 180-N may be selectively connected to the electronic device 101 via a wired interface. As another example, among the plurality of image sensors 180-1 to 180-N, one or more image sensors may be selectively connected to the electronic device 101 via a wireless interface such as Bluetooth and wireless LAN.
Fig. 2 is a detailed block diagram illustrating a processor according to an embodiment of the present disclosure.
Referring to fig. 2, the processor 120 may include an image processing unit (i.e., an Image Signal Processor (ISP)) 200, a display control unit 210, an image generation control unit 220, a thumbnail generation unit 230, and a moving picture generation unit 240.
The image processing unit 200 may perform one or more of image processing of level adjustment, noise removal, gamma correction, and color space conversion on the image data supplied from the respective image sensors 180-1 to 180-N. The image processing unit 200 may transmit the image-processed image data to one or more of the memory 130 and the display control unit 210. For example, the image processing unit 200 may transmit image data (e.g., YUV data) displayed on the display unit 150 and metadata about the corresponding image data to the memory 130.
The display control unit 210 may control the graphic user interface provided through the display unit 150. For example, the display control unit 210 may control display of image data (e.g., a preview image) supplied from the image processing unit 200 or the memory 130 on the display unit 150. For example, the display control unit 210 may control image data supplied from the image sensors 180-1 to 180-N via the image processing unit 200 to be displayed together on the display unit 150.
The image generation control unit 220 may select and synthesize at least two image data among the image data acquired via the image sensors 180-1 to 180-N. For example, when a capture event occurs, the image generation control unit 220 may select and synthesize at least two image data using the image acquisition time stamps of the image data stored in the memory 130, or the image processing delay times together with the image acquisition time stamps.
The thumbnail generation unit 230 may generate thumbnail data using the image-processed image data (e.g., preview image) stored in the memory 130 or the metadata about the respective image data. For example, when a capture event occurs, the thumbnail generation unit 230 may generate thumbnail data using the YUV data of the image data stored in the memory 130 and the metadata about the corresponding image data. For example, when the captured image data is generated by synthesizing at least two image data acquired by the plurality of image sensors 180-1 to 180-N, the thumbnail generation unit 230 may synthesize the image data based on the processing delay time of each image data and generate the thumbnail data. At this time, the thumbnail generation unit 230 may interlock the captured image data and the thumbnail data using the image acquisition time stamp or the frame identification information included in the metadata, and store the interlocked result in the memory 130.
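As an illustration only, the interlocking described above can be pictured as storing both records under a shared key taken from the metadata; the store and helper names below are hypothetical, not the patent's structures:

```python
captured_store = {}   # frame key -> full-resolution captured image
thumbnail_store = {}  # frame key -> thumbnail generated from the preview

def store_interlocked(metadata, captured_image, thumbnail):
    # The frame identification information (or the image acquisition time
    # stamp) carried in the metadata is the shared key that interlocks the
    # captured image data with its thumbnail data.
    key = metadata.get("frame_id", metadata.get("timestamp"))
    captured_store[key] = captured_image
    thumbnail_store[key] = thumbnail

def thumbnail_for(key):
    # A gallery view can fetch the small image without decoding the capture.
    return thumbnail_store.get(key)
```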
The moving picture generation unit 240 may encode the image-processed image data stored in the memory 130 and generate moving picture data. For example, the moving picture generation unit 240 may include a video preprocessor and a video encoder. The video preprocessor may perform preprocessing such as scaling, rotation, color space conversion, and flipping on the image-processed image data stored in the memory 130 and store the preprocessing result in the memory 130. The video encoder encodes the image data preprocessed by the video preprocessor and stored in the memory 130 according to a preset encoding method, and generates moving picture data.
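For orientation, here is a hedged sketch of that two-stage flow (the preprocessing is simple decimation and horizontal flipping standing in for scaling, rotation, color space conversion, and flip; the encoder is an injected callback, since a real device would use a hardware video encoder rather than Python):

```python
def preprocess_frame(frame, scale=2, flip=False):
    # Decimate (stand-in for scaling) and optionally flip horizontally;
    # assumes NumPy-style 2D indexing on the frame.
    out = frame[::scale, ::scale]
    return out[:, ::-1] if flip else out

def generate_moving_picture(frames, encoder):
    # Preprocess each frame into a buffer (the "store in memory" stage),
    # then hand the buffer to the encoder to produce moving picture data.
    buffered = [preprocess_frame(f) for f in frames]
    return encoder(buffered)
```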
Although not shown, the processor 120 may further include a time setting unit capable of setting an image acquisition time stamp for one or more image data provided from the image sensors 180-1 to 180-N. For example, the time setting unit may record the time corresponding to each image data supplied from the image sensors 180-1 to 180-N in the metadata of the corresponding image data in units of each frame. As another example, when one or more image sensors among the image sensors 180-1 to 180-N can be selectively connected to the electronic device 101, the time setting unit may set an image acquisition time stamp in the metadata of the one or more image data provided from the one or more image sensors connected to the electronic device 101. At this time, the image acquisition time stamp may be set, via a separate module included in each image sensor, for an image acquired by one or more image sensors that can be selectively connected to the electronic device 101.
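A minimal sketch of such per-frame stamping (the metadata keys are illustrative assumptions):

```python
import time

def stamp_acquisition(metadata, frame_index):
    # Record, per frame, when the image data was acquired; downstream
    # modules use this to select, synthesize, and interlock frames.
    metadata["frame_id"] = frame_index
    metadata["acquired_at"] = time.monotonic()
    return metadata
```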
In the foregoing embodiments, the processor 120 may process image data provided from the image sensors 180-1 to 180-N by one image processing unit 200.
In another embodiment, the processor 120 may include a plurality of image processing units and process image data provided from the respective image sensors 180-1 to 180-N.
Fig. 3 is a block diagram illustrating an electronic device according to another embodiment of the present disclosure.
Referring to fig. 3, the electronic device 300 may include a processor 310, a memory 320, image sensors 330-1 to 330-N, external image processing units 340-1 to 340-(N-1), an input unit 350, and a display unit 360. Here, the processor 310 may include an AP.
The processor 310 may control the electronic device 300 to provide a variety of services.
The processor 310 may interpret instructions received from one or more other constituent elements included in the electronic device 300 (e.g., the memory 320, the image sensors 330-1 to 330-N, the external image processing units 340-1 to 340-(N-1), the input unit 350, and the display unit 360) and perform operations or data processing according to the interpreted instructions. For example, the processor 310 may perform one or more of level adjustment, noise removal, gamma correction, and conversion of the image data provided from the first image sensor 330-1 into a format displayable on the display unit 360. The processor 310 may control storing the image-processed image data in the memory 320 or displaying the image data on the display unit 360. For example, the processor 310 may transmit the image data (e.g., YUV data) displayed on the display unit 360 and the metadata about the corresponding image data to the memory 320. As another example, the processor 310 may control converting the image stored in the memory 320 into a format that can be displayed on the display unit 360 through the external image processing units 340-1 to 340-(N-1) and displaying the converted image on the display unit 360. Here, the image processing for converting into a format displayable on the display unit 360 may include color space conversion.
The processor 310 may execute one or more programs stored in the memory 320 and control the electronic device 300 to provide a variety of multimedia services. For example, the processor 310 may execute a program stored in the memory 320 and select and synthesize at least two image data among the image data acquired by the image sensors 330-1 to 330-N. For example, the processor 310 may select and synthesize at least two image data using the image acquisition time stamps corresponding to the image data, or the image processing delay times together with the image acquisition time stamps.
As another example, when a capture event occurs, the processor 310 may generate thumbnail data regarding the captured image data using the image-processed image data (e.g., preview image) and metadata regarding each image data stored in the memory 320. For example, the processor 310 may generate thumbnail data regarding captured image data using a different module logically or physically separate from an internal module (e.g., ISP) of the processor 310 that image-processes image data provided from the first image sensor 330-1. At this time, the different modules may be logically or physically separated from the internal modules within the processor 310 that process the image or physically separated from the processor 310.
The memory 320 may store instructions or data received from or generated by one or more constituent elements included in the electronic device 300.
The image sensors 330-1 to 330-N may provide image data acquired through subject photographing to the processor 310. At this time, the image sensors 330-1 to 330-N may transmit image data to the processor 310 or the external image processing units 340-1 to 340-(N-1) through a serial interface (such as MIPI or MDDI) or a parallel interface (such as a parallel bus). Here, the first image sensor 330-1 may be located on the front of the electronic device 300, and the Nth image sensor 330-N may be located on the rear of the electronic device 300.
The external image processing units 340-1 to 340-(N-1) may perform image processing such as level adjustment, noise removal, and gamma correction on the images provided from the image sensors 330-2 to 330-N, and may store the processing results in the memory 320 through the processor 310. Here, the external image processing units 340-1 to 340-(N-1) may further include a time setting unit capable of setting an image acquisition time stamp in the image data related to the images supplied from the image sensors 330-2 to 330-N. For example, the time setting unit may record the time corresponding to each image data supplied from the image sensors 330-2 to 330-N in the metadata of the corresponding image data in units of each frame.
The input unit 350 may transmit instructions or data input by a user to the processor 310 or the memory 320. For example, the input unit 350 may include a touch input unit, a pen sensor, a key, or an ultrasonic input device.
The display unit 360 may provide status information, still pictures, moving pictures, or data of the electronic device 300 through a graphic user interface. For example, the display unit 360 may display one or more image data provided from the processor 310. As another example, the display unit 360 may display at least one image data selected by the processor 310 based on the image acquisition time stamps, or on the image processing delay times together with the image acquisition time stamps.
Although not shown, the electronic device 300 may further include a communication unit capable of establishing voice communication or data communication with other electronic devices or servers. Here, the communication unit may be divided into a plurality of communication sub-modules supporting different communication networks.
In the foregoing embodiments, the electronic device 300 may include a plurality of image sensors 330-1 to 330-N. At this time, one or more image sensors among the plurality of image sensors 330-1 to 330-N may be selectively connected to the electronic device 300. For example, among the plurality of image sensors 330-1 to 330-N, one or more image sensors may be selectively connected to the electronic device 300 via a wired interface. In this case, an external image processing unit connected to one or more image sensors that can be selectively connected to the electronic device 300 may be installed in the electronic device 300, or may be selectively connected to the electronic device 300 together with the image sensors.
As another example, among the plurality of image sensors 330-1 to 330-N, one or more image sensors may be selectively connected to the electronic device 300 via a wireless interface such as Bluetooth or wireless LAN. In this case, an external image processing unit connected to one or more image sensors that may be selectively connected to the electronic device 300 may be connected to the electronic device 300, or may be selectively connected to the electronic device 300 together with the image sensors.
Fig. 4 is a detailed block diagram illustrating a processor according to another embodiment of the present disclosure.
Referring to fig. 4, the processor 310 may include an image processing unit (i.e., an ISP) 400, an internal interface 410, a format change unit 420, a display control unit 430, an image generation control unit 440, a thumbnail generation unit 450, and a moving picture generation unit 460.
The image processing unit 400 may perform one or more of level adjustment, noise removal, gamma correction, and color space conversion on the image data provided from the first image sensor 330-1. The image processing unit 400 may transmit the image-processed image data to one or more of the memory 320 and the display control unit 430. For example, the image processing unit 400 may transmit image data to be displayed on the display unit 360 (e.g., YUV data) and metadata about the corresponding image data to the memory 320.
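To make one of the steps named above concrete, the following is a minimal gamma-correction sketch; the 2.2 exponent and 8-bit range are assumptions for illustration, not values from the disclosure.

```python
def gamma_correct(sample: int, gamma: float = 2.2) -> int:
    """Map one 8-bit sample (0..255) through a power-law gamma curve."""
    return round(255 * (sample / 255) ** (1 / gamma))

# example: mid-gray 128 brightens to roughly 186 under gamma 2.2
assert gamma_correct(128) == 186
```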
The internal interface 410 may transmit image data provided from the respective external image processing units 340-1 to 340-(N-1) to the memory 320. For example, the internal interface 410 may include one or more of MIPI and CAMIF.
The format change unit 420 may change the image data supplied from the external image processing units 340-1 to 340-(N-1) and stored in the memory 320 into a format displayable on the display unit 360. For example, the format change unit 420 may perform color space conversion on the image data supplied from the memory 320 and transmit the converted image data to the display control unit 430. As another example, the format change unit 420 may control the image data supplied from the external image processing units 340-1 to 340-(N-1), after changing it into a format displayable on the display unit 360, to be stored in the memory 320.
The display control unit 430 may control the graphic user interface provided through the display unit 360. For example, the display control unit 430 may control display of an image provided from one or more of the image processing unit 400 and the format change unit 420 on the display unit 360. For example, the display control unit 430 may control the image data of the first image sensor 330-1 provided via the image processing unit 400 and the image data of the Nth image sensor 330-N provided through the format change unit 420 to be displayed together on the display unit 360.
The image generation control unit 440 may select and synthesize at least two image data among the image data acquired via the image sensors 330-1 to 330-N. For example, when a capture event occurs, the image generation control unit 440 may select and synthesize the at least two image data using the image acquisition time stamps of the images stored in the memory 320, or using the image processing delay times together with the image acquisition time stamps.
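A rough sketch of this selection rule, assuming hypothetical frame records that carry an acquisition time stamp: the frame from each sensor stream whose stamp is nearest the capture event is chosen, and a stream whose stamps reflect arrival rather than acquisition can have its known processing delay subtracted first.

```python
from collections import namedtuple

Frame = namedtuple("Frame", "sensor_id data acquired_at")  # hypothetical record

def select_for_synthesis(streams, capture_time, delays=None):
    """Pick one frame per stream whose time stamp is nearest the capture
    event; `delays` maps a stream index to a known image processing delay
    to subtract when that stream's stamps reflect arrival time."""
    delays = delays or {}
    picked = []
    for i, frames in enumerate(streams):
        offset = delays.get(i, 0.0)
        picked.append(min(frames,
                          key=lambda f: abs((f.acquired_at - offset) - capture_time)))
    return picked  # at least two frames, ready to be synthesized
```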
The thumbnail generating unit 450 may generate thumbnail data using the image-processed image data and the metadata about the respective image data stored in the memory 320. For example, when a capture event occurs, the thumbnail generating unit 450 may generate thumbnail data using the YUV data of each image data and the metadata about the corresponding image data stored in the memory 320. For example, in the case of synthesizing at least two image data acquired by the plurality of image sensors 330-1 to 330-N to generate captured image data, the thumbnail generation unit 450 may synthesize the image data based on the processing delay time of each image data and generate the thumbnail data accordingly. At this time, the thumbnail generation unit 450 may interlock the captured image data and the thumbnail data using the image acquisition time stamp or the frame identification information included in the metadata, and store the interlock result in the memory 320.
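The interlocking step amounts to keying the captured image and its thumbnail under one shared identifier; a minimal sketch, assuming a dictionary-like memory and the hypothetical metadata fields used above:

```python
def interlock_and_store(memory: dict, captured_image, thumbnail, metadata: dict):
    """Store the captured image and its thumbnail under one shared key
    (frame identification information, else the image acquisition time
    stamp) so that either can be retrieved from the other."""
    key = metadata.get("frame_id", metadata.get("acquisition_timestamp"))
    memory[key] = {"captured": captured_image,
                   "thumbnail": thumbnail,
                   "metadata": metadata}
    return key
```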
The moving picture generation unit 460 may encode the image-processed image data stored in the memory 320 and generate moving picture data. For example, the moving picture generation unit 460 may include a video preprocessor and a video encoder. The video preprocessor may perform preprocessing such as scaling, rotation, color space conversion, and flipping on the image-processed image data stored in the memory 320 and store the preprocessing result in the memory 320. The video encoder may encode the image data preprocessed by the video preprocessor and stored in the memory 320 according to a preset encoding method, and generate moving picture data.
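The two-stage structure can be expressed as a short pipeline sketch; `preprocess` is a stand-in stub and `encoder` is any callable, neither taken from a real codec API.

```python
def preprocess(frame):
    # stand-in for scaling, rotation, color space conversion, and flipping
    return frame

def generate_moving_picture(frames, encoder):
    """Video preprocessor stage followed by a video encoder stage."""
    staged = [preprocess(f) for f in frames]  # results would be staged in memory
    return encoder(staged)                    # preset encoding method -> moving picture data
```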
Although not shown, the processor 310 may further include a time setting unit capable of setting an image acquisition time stamp on the image data supplied from the first image sensor 330-1, or from both the first image sensor 330-1 and the external image processing units 340-1 to 340-(N-1). For example, the time setting unit may record, frame by frame, a time corresponding to the image data supplied from the first image sensor 330-1 in the metadata of the corresponding image data. At this time, the image acquisition time stamp may be set on the image data acquired through the second to Nth image sensors 330-2 to 330-N via the external image processing unit connected to each image sensor.
In the foregoing embodiment, the processor 310 may include the format change unit 420 for changing the image data supplied from the external image processing units 340-1 to 340-(N-1) into a format displayable on the display unit 360.
In another embodiment, if the external image processing units 340-1 to 340-(N-1) can themselves change the image data into a format displayable on the display unit 360, the processor 310 may be configured without the format change unit 420.
Fig. 5 is a block diagram illustrating an electronic device according to yet another embodiment of the present disclosure.
Referring to fig. 5, the electronic device 500 may include a processor 510, memories 520 and 550, image sensors 530-1 to 530-N, external image processing units 540-1 to 540-(N-1), a display unit 560, and an input unit 570. Here, the processor 510 may include an AP.
The processor 510 may control the electronic device 500 to provide a variety of services.
The processor 510 may interpret instructions received from one or more other constituent elements included in the electronic device 500 and perform operations or data processing according to the interpreted instructions. For example, the processor 510 may perform one or more image processes of level adjustment, noise removal, gamma correction, and conversion of image data provided from the first image sensor 530-1 into a format displayable on the display unit 560. The processor 510 may control storing the image-processed image data in the first memory 520 or displaying it on the display unit 560. For example, the processor 510 may transmit image data to be displayed on the display unit 560 (e.g., YUV data) and metadata about the corresponding image data to the first memory 520. Here, the image processing for conversion into a format displayable on the display unit 560 may include color space conversion.
The processor 510 may execute one or more programs stored in the first memory 520 and control the electronic device 500 to provide a variety of multimedia services. For example, the processor 510 may execute a program stored in the first memory 520 and select and synthesize at least two image data among the image data acquired through the image sensors 530-1 to 530-N. For example, the processor 510 may select the at least two image data using the image acquisition time stamps corresponding to the image data, or using the image processing delay times together with the image acquisition time stamps.
As another example, when a capture event occurs, the processor 510 may generate thumbnail data regarding the captured image data using the image-processed image data (e.g., a preview image) and the metadata regarding each image data stored in the first memory 520. For example, the processor 510 may generate the thumbnail data using a module separate from the module (e.g., an ISP) that image-processes the image data provided from the first image sensor 530-1.
The first memory 520 may store instructions or data received from or generated by one or more constituent elements included in the electronic device 500.
The image sensors 530-1 to 530-N may provide images acquired by photographing a subject to the processor 510. At this time, the image sensors 530-1 to 530-N may transmit images to the processor 510 or the external image processing units 540-1 to 540-(N-1) through a serial interface (such as MIPI or MDDI) or a parallel interface (such as a parallel bus). Here, the first image sensor 530-1 may be located on the front of the electronic device 500, and the Nth image sensor 530-N may be located on the rear of the electronic device 500.
The external image processing units 540-1 to 540-(N-1) may perform image processing such as level adjustment, noise removal, gamma correction, and color space conversion on the image data supplied from the image sensors 530-2 to 530-N, and may store the processing results in the first memory 520. In addition, the external image processing units 540-1 to 540-(N-1) may set time information on the image data provided from the image sensors 530-2 to 530-N and store the image data with the set time information in the second memory 550. For example, the external image processing units 540-1 to 540-(N-1) may set the time information in the metadata of the corresponding image data.
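A condensed sketch of this dual path, with `image_process` as a stand-in stub and a hypothetical metadata field for the time information:

```python
import time

def image_process(frame):
    # stand-in for level adjustment, noise removal, gamma correction,
    # and color space conversion performed by the external unit
    return frame

def handle_sensor_frame(frame, metadata, first_memory, second_memory):
    """Processed result to the first memory; the raw frame, with time
    information set in its metadata, to the second memory."""
    metadata["acquisition_timestamp"] = time.monotonic()  # hypothetical field
    first_memory.append((image_process(frame), metadata))
    second_memory.append((frame, metadata))  # unprocessed (raw) image data
```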
The second memory 550 may store unprocessed image data supplied from the external image processing units 540-1 to 540-(N-1). For example, the second memory 550 may store raw image data provided from the external image processing units 540-1 to 540-(N-1). At this time, a second memory 550 may exist for each of the external image processing units 540-1 to 540-(N-1).
The display unit 560 may provide status information, a still picture, a moving picture, or data of the electronic device 500 through a graphic user interface. For example, the display unit 560 may display one or more image data provided from the processor 510. As another example, the display unit 560 may display at least two image data selected by the processor 510 based on the image acquisition time stamps, or on the image processing delay times together with the image acquisition time stamps.
The input unit 570 may transmit instructions or data input by a user to the processor 510 or the first memory 520. For example, the input unit 570 may include a touch input unit, a pen sensor, a key, or an ultrasonic input device.
Although not shown, the electronic device 500 may further include a communication unit capable of establishing voice communication or data communication with other electronic devices or servers. Here, the communication unit may be divided into a plurality of communication sub-modules supporting different communication networks.
In the foregoing embodiments, the electronic device 500 may include a plurality of image sensors 530-1 to 530-N. At this time, one or more image sensors among the plurality of image sensors 530-1 to 530-N may be selectively connected to the electronic device 500. For example, among the plurality of image sensors 530-1 to 530-N, one or more image sensors may be selectively connected to the electronic device 500 via a wired interface. In this case, an external image processing unit connected to one or more image sensors that can be selectively connected to the electronic device 500 may be installed in the electronic device 500, or may be selectively connected to the electronic device 500 together with the image sensors.
As another example, among the plurality of image sensors 530-1 to 530-N, one or more image sensors may be selectively connected to the electronic device 500 via a wireless interface such as Bluetooth or wireless LAN. In this case, an external image processing unit connected to one or more image sensors that may be selectively connected to the electronic device 500 may be connected to the electronic device 500, or may be selectively connected to the electronic device 500 together with the image sensors.
Fig. 6 is a detailed block diagram illustrating a processor according to yet another embodiment of the present disclosure.
Referring to fig. 6, the processor 510 may include an image processing unit (i.e., an ISP) 600, an internal interface 610, a display control unit 620, an image generation control unit 630, a moving picture generation unit 640, and a thumbnail generation unit 650.
The image processing unit 600 may perform one or more of level adjustment, noise removal, gamma correction, and color space conversion on the image data provided from the first image sensor 530-1. The image processing unit 600 may transmit the image-processed image data to one or more of the first memory 520 and the display control unit 620. For example, the image processing unit 600 may transmit image data to be displayed on the display unit 560 (e.g., YUV data) and metadata about the corresponding image data to the first memory 520.
The internal interface 610 may transmit images provided from the respective external image processing units 540-1 to 540-(N-1) to the first memory 520. For example, the internal interface 610 may include one or more of MIPI and CAMIF, as well as an RDI for transmitting an image that the external image processing units 540-1 to 540-(N-1) have converted into a format displayable on the display unit 560.
The display control unit 620 may control the graphic user interface provided through the display unit 560. For example, the display control unit 620 may control display of image data provided from one or more of the image processing unit 600 and the first memory 520 on the display unit 560. For example, the display control unit 620 may control the image data of the first image sensor 530-1 supplied via the image processing unit 600 and the image data of the Nth image sensor 530-N acquired from the first memory 520 to be displayed together on the display unit 560.
The image generation control unit 630 may select and synthesize at least two image data among the image data acquired through the image sensors 530-1 to 530-N. For example, when a capture event occurs, the image generation control unit 630 may select and synthesize the at least two image data using the image acquisition time stamps of the image data stored in the first memory 520 and the second memory 550, or using the image processing delay times together with the image acquisition time stamps.
The moving picture generation unit 640 may encode the image-processed image data stored in the first memory 520 and the second memory 550 and generate moving picture data. For example, the moving picture generation unit 640 may include a video preprocessor and a video encoder. The video preprocessor may perform preprocessing such as scaling, rotation, color space conversion, and flipping on the image data stored in the first and second memories 520 and 550 and store the preprocessing result in one or more of the first and second memories 520 and 550. The video encoder may encode the image data preprocessed by the video preprocessor and stored in one or more of the first memory 520 and the second memory 550 according to a preset encoding method and generate moving picture data.
The thumbnail generating unit 650 may generate thumbnail data using the image-processed image data (e.g., a preview image) stored in the first memory 520 and the corresponding metadata. For example, when a capture event occurs, the thumbnail generating unit 650 may generate thumbnail data using the YUV data of each image data and the metadata about the corresponding image data stored in the first memory 520. For example, in the case of synthesizing at least two image data acquired by the plurality of image sensors 530-1 to 530-N to generate captured image data, the thumbnail generation unit 650 may synthesize the image data based on the processing delay time of each image data and generate the thumbnail data accordingly. At this time, the thumbnail generation unit 650 may interlock the captured image data and the thumbnail data using the image acquisition time stamp or the frame identification information included in the metadata, and store the interlock result in the first memory 520.
Although not shown, the processor 510 may further include a time setting unit capable of setting an image acquisition time stamp on the image data provided from the first image sensor 530-1, or from both the first image sensor 530-1 and the external image processing units 540-1 to 540-(N-1). For example, the time setting unit may record, frame by frame, a time corresponding to the image data supplied from the first image sensor 530-1 in the metadata of the corresponding image data. At this time, the image acquisition time stamp may be set on the image data acquired through the second to Nth image sensors 530-2 to 530-N via the external image processing unit connected to each image sensor. As another example, the time setting unit may record, frame by frame, a time corresponding to the image data supplied from the external image processing units 540-1 to 540-(N-1) in the metadata of the corresponding image data. In this case, the image generation control unit 630 may select at least two images for synthesis based on the image acquisition time stamps of the images stored in the first memory 520.
Fig. 7 is a detailed block diagram illustrating an external image processing unit according to an embodiment of the present disclosure.
Referring to fig. 7, the external image processing unit 540 may include an image processing control unit 700 and a time setting unit 710.
The image processing control unit 700 may perform one or more image processes of level adjustment, noise removal, gamma correction, and conversion of the image data provided from the image sensors 530-2 to 530-N into a format displayable on the display unit 560. For example, the image processing control unit 700 may color-space-convert YUV 422 image data supplied from the image sensors 530-2 to 530-N into YUV 420 image data, thereby converting it into a format displayable on the display unit 560.
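The YUV 422 to YUV 420 step halves the vertical chroma resolution; a minimal sketch, assuming planar input with an even number of rows, each plane given as a list of rows:

```python
def yuv422_to_yuv420(y, u, v):
    """Average each vertical pair of chroma rows; the luma plane is kept.
    In both formats the chroma rows are half the luma width, so only the
    chroma row count changes (halved)."""
    u420 = [[(a + b) // 2 for a, b in zip(u[r], u[r + 1])]
            for r in range(0, len(u), 2)]
    v420 = [[(a + b) // 2 for a, b in zip(v[r], v[r + 1])]
            for r in range(0, len(v), 2)]
    return y, u420, v420
```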
The image processing control unit 700 may convert one or more image data stored in the second memory 550 into a format displayable on the display unit 560 and transmit the converted image data to the image generation control unit 630. For example, under the control of the image generation control unit 630 of fig. 6, the image processing control unit 700 may receive image data selected for image synthesis from the second memory 550, convert the received image data into a format displayable on the display unit 560, and transmit the converted image data to the image generation control unit 630. As another example, when a capture event occurs, the image processing control unit 700 may convert one or more image data among the images stored in the second memory 550 into a format displayable on the display unit 560 and transmit the converted image data to the image generation control unit 630.
The time setting unit 710 may set an image acquisition time stamp on the image data provided from the image sensor 530-2 or 530-N. For example, the time setting unit 710 may include a time inserting unit and a frame setting unit, and may record, frame by frame, a time corresponding to the image data supplied from the image sensor 530-2 or 530-N.
In the foregoing embodiment, the external image processing unit 540 may include the image processing control unit 700 and the time setting unit 710. In another embodiment, the time setting unit 710 may be located outside the external image processing unit 540.
Fig. 8 is a detailed block diagram illustrating a memory according to an embodiment of the present disclosure.
Referring to fig. 8, the first memory 520 may be logically or physically divided into a plurality of blocks 800, 810, and 820 to store data. For example, image data provided from the image processing unit 600 of the processor 510 may be stored in the third block 820 of the first memory 520.
The image data supplied from the external image processing units 540-1 to 540-(N-1) may be stored in the first block 800 of the first memory 520. At this time, the image data may be divided into Y data, UV data, and metadata, which may be stored in the internal blocks 802, 804, and 806 of the first block 800. Here, the metadata may include one or more of a frame identifier of the image data, an image acquisition time stamp, focus data, and image setting information (EXIF).
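The logical layout of the first block can be pictured with simple record types; the field names are hypothetical and only mirror the internal blocks and metadata fields listed above.

```python
from dataclasses import dataclass, field

@dataclass
class FrameMetadata:                              # internal block 806
    frame_id: int
    acquisition_timestamp: float
    focus_data: bytes = b""
    exif: dict = field(default_factory=dict)      # image setting information

@dataclass
class FirstBlock:                                 # first block 800
    y_data: bytes = b""                           # internal block 802
    uv_data: bytes = b""                          # internal block 804
    metadata: list = field(default_factory=list)  # FrameMetadata entries
```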
When a capture event occurs, the image data stored in the second memory 550 may be stored in the third block 820 of the first memory 520 through the external image processing units 540-1 to 540-(N-1).
In the foregoing embodiments, the electronic device may transmit image data generated by an image sensor to each module using a serial interface or a parallel interface. For example, the electronic device may transmit image data generated through an image sensor to each module using a MIPI interface constructed as shown in fig. 9.
Fig. 9 is a block diagram illustrating an interface according to an embodiment of the present disclosure.
Referring to fig. 9, the MIPI interface may include a plurality of lanes 900 according to the format of an image. For example, the MIPI interface may be constructed of a MIPI 4-Lane PHY, a MIPI 2-Lane PHY, and a MIPI 1-Lane PHY according to the transmission data capacity.
The MIPI interface may transmit image data to a corresponding module 930 through a serial interface (e.g., a Camera Serial Interface (CSI)) 910 corresponding to each lane 900. For example, the MIPI 4-Lane PHY may transmit image data to one or more modules through MIPI CSI_0, the MIPI 2-Lane PHY may transmit image data to one or more modules through MIPI CSI_1, and the MIPI 1-Lane PHY may transmit image data to one or more modules through MIPI CSI_2.
The modules receiving the image data through the MIPI interface may process the format of the image data according to the characteristics of each module. For example, the VPE module may perform image processing such as scaling, rotation, color space conversion, and flipping on image data provided through the MIPI interface. A Joint Photographic Experts Group (JPEG) decoding (DCD) module may support the hardware acceleration functions required to decode JPEG-formatted image data provided through the MIPI interface. The VFE module may apply various effects such as color change to image data provided through the MIPI interface. The offline JPEG module may support the hardware acceleration functions required to decode JPEG-formatted image data provided through the MIPI interface.
When image data is transmitted through the MIPI interface configured as above, the electronic device may use a divided transmission method of dividing the image data before transmission, owing to memory limitations and the transmission capacity of the MIPI interface. For example, when transmitting 11 megabytes (MB) of image data, the electronic device may divide the 11 MB of image data into 8 MB of data 1200 and 3 MB of data 1210 and transmit the divided image data through the MIPI interface. For example, the electronic device may divide the 11 MB of image data into the 8 MB data 1200 and the 3 MB data 1210 and store them, as shown in fig. 12A, and transmit the divided image data 1200 and 1210 by the PIPE method. The electronic device receiving the data 1200 and 1210 divided through the MIPI interface may aggregate the divided data 1200 and 1210 into one data in its memory, or may store them in their divided format. As shown in fig. 12B, the electronic device can flexibly set the size of the memory (e.g., the size of the divided data) and the number of divisions of the data. As another example, if the electronic device can transmit 11 MB of data at a time through the MIPI interface, as shown in fig. 12B, the electronic device may use a preset data capacity (e.g., 3 MB) of the 11 MB for a preview image and transmit original image data using the remaining capacity (e.g., 8 MB). In this case, the electronic device may transmit the original image data at one time, or divide and transmit it through the MIPI interface based on its size. For example, if the original image data is 7 MB, the electronic device may transmit the 3 MB preview image and the 7 MB of original image data at one time through the MIPI interface. If the original image data is 15 MB, the electronic device may keep 3 MB fixed for the preview image, divide the original image data into 8 MB and 7 MB portions, and transmit them through the MIPI interface over two transfers. When dividing and transmitting the original image data, the electronic device may combine the divided original image data into one image using the metadata.
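The arithmetic of the divided transmission is easy to check with a sketch; the helper names are hypothetical, and the capacity is passed in rather than read from any real MIPI driver.

```python
def divide_for_transmission(image_data: bytes, capacity: int):
    """Split a payload into interface-sized parts, e.g., 11 MB into an
    8 MB part and a 3 MB part when at most 8 MB moves per transfer."""
    return [image_data[i:i + capacity]
            for i in range(0, len(image_data), capacity)]

def reassemble(parts):
    # the receiving side may aggregate the divided data into one buffer
    return b"".join(parts)

MB = 1024 * 1024
parts = divide_for_transmission(bytes(11 * MB), 8 * MB)
assert [len(p) for p in parts] == [8 * MB, 3 * MB]
assert reassemble(parts) == bytes(11 * MB)
```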
According to an embodiment of the present disclosure, an electronic device includes a first image sensor, a second image sensor, one or more image processing modules, a display, and a thumbnail generation module. The first image sensor generates first image data. The second image sensor generates second image data. The one or more image processing modules process one or more of the first image data and the second image data. The display displays one or more of the first image data and the second image data processed by the one or more image processing modules. The thumbnail generation module generates thumbnail data using one or more of the first image data and the second image data processed by the one or more image processing modules.
The one or more image processing modules include: a first image processing module configured to process first image data received from a first image sensor; a second image processing module configured to process second image data received from the second image sensor, wherein the first image processing module is formed in the Application Processor (AP).
The thumbnail generation module is formed in the application processor.
The one or more image processing modules are configured to process one or more of the first image data and the second image data and generate one or more preview data in a format displayable on a display and metadata regarding the corresponding image data.
When a capture event occurs, the thumbnail generation module is configured to generate thumbnail data regarding the captured image data using the one or more preview data and the metadata regarding the corresponding image data generated in the one or more image processing modules.
The electronic device may further include a memory, wherein the thumbnail generation module is configured to interlock the thumbnail data and the captured image data using an image acquisition time stamp or frame identification information included in the metadata, and store the interlock result in the memory.
Fig. 10 is a flowchart illustrating a process for generating thumbnail data in an electronic device according to an embodiment of the present disclosure.
Referring to fig. 10, in step 1001, an electronic device may generate image data using a plurality of image sensors. For example, the electronic device may generate image data using a first image sensor located on the front of the electronic device and a second image sensor located on the rear of the electronic device.
When generating the image data, the electronic device may convert the image data into a preview format that can be displayed on the display unit in step 1003. For example, the electronic device may convert the image data into a preview format displayable on the display unit using one or more image processing units (i.e., ISPs). For example, referring to fig. 2, the electronic device 100 may convert image data generated through the image sensors 130-1 to 130-N into a preview format (e.g., YUV data) displayable on the display unit 140 using the image processing unit 200. At this time, the image processing unit 200 may also generate metadata about the image data converted into the preview format displayable on the display unit 140, and store the image data and the metadata about the image data in the memory 120. As another example, referring to fig. 5 and 6, the electronic device 500 may convert image data generated through the first image sensor 530-1 into a preview format (e.g., YUV data) displayable on the display unit 560 using the image processing unit 600, and may convert image data generated through the second to Nth image sensors 530-2 to 530-N into a preview format displayable on the display unit 560 using the external image processing units 540-1 to 540-(N-1). At this time, the image processing unit 600 and the external image processing units 540-1 to 540-(N-1) may also generate metadata about the image data converted into the preview format displayable on the display unit 560, and store the image data and the metadata about the image data in one or more of the first memory 520 and the second memory 550. Here, the metadata may include one or more of a frame identifier (ID), an image acquisition time stamp, and image setting information (EXIF) of the corresponding image data.
When the image data has been converted into the preview format displayable on the display unit, the electronic device may generate thumbnail data regarding the captured image data using the data in the preview format in step 1005. For example, the electronic device may generate the thumbnail data using a module separate from the image processing unit. For example, referring to fig. 2, the thumbnail generation unit 230 of the electronic device 100 may generate thumbnail data regarding the captured image data using the image data in the preview format and the metadata of the corresponding image data stored in the memory 120. As another example, referring to fig. 5 and 6, the thumbnail generation unit 650 of the electronic device 500 may generate thumbnail data regarding the captured image data using the image data in the preview format and the metadata of the corresponding image data stored in one or more of the first memory 520 and the second memory 550.
Fig. 11 is a flowchart illustrating a process for interlocking and storing thumbnail data and captured image data in an electronic device according to an embodiment of the present disclosure.
Referring to fig. 11, in step 1101, an electronic device may generate image data using a plurality of image sensors. For example, the electronic device may generate image data using a first image sensor located on the front of the electronic device and a second image sensor located on the rear of the electronic device.
When generating the image data, the electronic device may convert the image data into a preview format that can be displayed on the display unit in step 1103. For example, the electronic device may convert the image data into a preview format displayable on the display unit using one or more image processing units (i.e., ISPs). At this time, the electronic device may store the image data converted into the preview format by the one or more image processing units, together with metadata regarding the corresponding image data, in the memory. Here, the metadata may include one or more of a frame ID, an image acquisition time stamp, and image setting information (EXIF) of the corresponding image data.
When the image data has been converted into a preview format displayable on the display unit, the electronic device may display the image data in the preview format on the display unit in step 1105.
At step 1107, the electronic device may determine whether a capture event occurred. For example, the electronic device may determine whether a hardware key corresponding to the capture event is sensed. As another example, the electronic device may determine whether selection of an icon corresponding to the capture event is sensed. As yet another example, the electronic device may determine whether a gesture of the user corresponding to the capture event is sensed.
If a capture event has not occurred at step 1107, the electronic device may return to step 1101 and use multiple image sensors to generate image data.
If a capture event occurs at step 1107, the electronic device may use the image data in the preview format to generate thumbnail data for the captured image data in step 1109. For example, when a first image sensor having low performance is located on the front of the electronic device and a second image sensor having high performance is located on the rear of the electronic device, the electronic device may use the low-performance image data generated by the first image sensor directly as preview image data displayable on the display unit, and may convert the high-performance image data generated by the second image sensor into the preview format to generate its preview image data. Accordingly, when a capture event occurs at a time when the processing of the high-performance image data is delayed, the electronic device can identify the low-performance image data corresponding to the preview image displayed on the display unit at the capture event occurrence time point, together with the high-performance image data whose image acquisition time stamp corresponds to that time point, and synthesize them as the captured image data. In generating the thumbnail data at the capture time, the electronic device may synthesize the preview image data of the low-performance image data displayed on the display unit at the capture event occurrence time point and the preview image of the high-performance image data whose image acquisition time stamp corresponds to the capture event occurrence time point. For example, the electronic device may generate the thumbnail data using a module separate from the image processing unit.
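A sketch of that matching step, under the assumption that both streams carry acquisition time stamps and that at least one preview frame precedes the event; the record type is hypothetical.

```python
from collections import namedtuple

Frame = namedtuple("Frame", "data acquired_at")  # hypothetical record

def frames_at_capture(preview_frames, full_frames, event_time):
    """Return the low-performance frame shown at the capture event and
    the high-performance frame whose acquisition time stamp matches it,
    even if the latter finishes processing only later."""
    shown = max((f for f in preview_frames if f.acquired_at <= event_time),
                key=lambda f: f.acquired_at)      # frame on screen at the event
    full = min(full_frames, key=lambda f: abs(f.acquired_at - event_time))
    return shown, full
```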
After generating the thumbnail data, the electronic device may interlock the thumbnail data and the captured image data using the metadata of the thumbnail data and store the interlock result in step 1111. For example, the electronic device may interlock the captured image data and the thumbnail data using the frame identification information or the image acquisition time stamp included in the metadata used for generating the thumbnail data, and store the interlock result in the memory.
According to an embodiment of the present disclosure, there is provided an operating method of an electronic device. The method includes the operations of: generating a plurality of image data using a plurality of image sensors; converting, by one or more image processing modules, the plurality of image data into a format displayable on a display unit; and generating thumbnail data, in another module separate from the image processing modules, using the image data converted into the displayable format in the image processing modules.
The method as described above, wherein the step of generating image data includes: generating the plurality of image data using a plurality of image sensors included in the electronic device or connected to the electronic device through a wired interface or a wireless interface.
The method may further include: displaying, on the display, one or more of the image data converted into the displayable format in the image processing modules.
The step of converting into the displayable format includes: converting first image data generated using a first image sensor of the plurality of image sensors into a format displayable on the display using a first image processing module formed in an Application Processor (AP); and converting second image data generated using a second image sensor of the plurality of image sensors into a format displayable on the display using a second image processing module formed separately from the AP.
The generating of the thumbnail data includes the operation of: generating the thumbnail data, in another module included in the AP, using the image data converted into the displayable format in the first image processing module and the second image processing module.
The step of converting into the displayable format includes: processing one or more image data of the plurality of image data using the one or more image processing modules; and generating one or more preview data in a format displayable on the display and metadata about the corresponding image data.
The generating of the thumbnail data includes: when a capture event occurs, generating thumbnail data regarding the captured image data, in the other module, using the one or more preview data generated in the one or more image processing modules and the metadata regarding the corresponding image data.
The method may further include: interlocking the thumbnail data and the captured image data using the image acquisition time stamp or the frame identification information included in the metadata; and storing the interlock result in a memory.
According to an embodiment of the present disclosure, an electronic device includes one or more image sensors and an interface. The one or more image sensors generate image data. The interface processes the image data generated in the one or more image sensors and sends it to one or more modules. The one or more modules change the format of the image data based on the image data processing method of the corresponding module.
The interface includes an interface of a Mobile Industry Processor Interface (MIPI) method.
The one or more modules include one or more of an Image Signal Processor (ISP), a video pre-processing (VPE) module, a Video Front End (VFE) module, and a preview image generation module.
When the format of the image data is changed based on an image data processing method of one or more modules, the ISP generates metadata including information about the format change of the image data.
The metadata includes one or more of frame identification information, an image acquisition time stamp, and image setting information (exchangeable image file format (EXIF)) of the image data.
The ISP is configured to generate thumbnail data of image data generated in one or more image sensors using the metadata.
The interface is configured to divide the image data into a plurality of parts based on a transmission performance of the interface and transmit the divided image data to one or more modules.
Fig. 13 is a block diagram illustrating an electronic device according to yet another embodiment of the present disclosure. In the following description, the electronic device 1300 may constitute, for example, a part or all of the electronic device 101 shown in fig. 1.
Referring to fig. 13, the electronic device 1300 may include one or more processors 1310, a SIM card 1314, memory 1320, a communication module 1330, a sensor module 1340, an input module 1350, a display module 1360, an interface 1370, an audio module 1380, a camera module 1391, a power management module 1395, a battery 1396, an indicator 1397, or a motor 1398.
The processor 1310 (e.g., processor 120) may include one or more APs 1311 or one or more Communication Processors (CPs) 1313. In fig. 13, the AP 1311 and the CP 1313 are shown to be included within the processor 1310, but the AP 1311 and the CP 1313 may be included within different IC packages, respectively. According to one embodiment, the AP 1311 and the CP 1313 may be included within one IC package.
The AP 1311 may drive an operating system or an application program and control a plurality of hardware or software constituent elements connected to the AP 1311, and may perform processing and operations of various data including multimedia data. The AP 1311 may be implemented as, for example, a system on chip (SoC). According to one embodiment, processor 1310 may also include a Graphics Processing Unit (GPU) (not shown).
The CP 1313 may perform functions of managing a data link and converting a communication protocol in communication between the electronic device 1300 (e.g., the electronic device 101) and other electronic devices (e.g., the electronic device 102, the electronic device 104, or the server 106) connected to the network. The CP 1313 may be implemented as, for example, an SoC. According to one embodiment, the CP 1313 may perform at least a portion of the multimedia control functions. The CP 1313 may perform differentiation and authentication of electronic devices within the communication network using a subscriber identity module (e.g., SIM card 1314). In addition, the CP 1313 may provide a service of a voice call, a video call, text information, or packet data, etc. to the user.
In addition, the CP 1313 may control data transmission/reception of the communication module 1330. In fig. 13, the constituent elements such as the CP 1313, the power management module 1395, or the memory 1320 are shown as separate constituent elements from the AP 1311, but according to one embodiment, the AP 1311 may be implemented to include at least some of the aforementioned constituent elements (e.g., the CP 1313).
According to one embodiment, the AP 1311 or the CP 1313 may load instructions or data, received from a nonvolatile memory connected thereto or from at least one of the other constituent elements, into a volatile memory and process the loaded instructions or data. In addition, the AP 1311 and the CP 1313 may store data received from or generated by at least one other constituent element in the nonvolatile memory.
The SIM card 1314 may be a card including a subscriber identity module and may be inserted into a slot provided at a particular location of the electronic device. The SIM card 1314 may include unique identification information (e.g., an integrated circuit card identifier (ICCID)) or subscriber information (e.g., an International Mobile Subscriber Identity (IMSI)).
Memory 1320 (e.g., memory 130) may include internal memory 1322 or external memory 1324. The internal memory 1322 may include, for example, at least one of volatile memory (e.g., DRAM, SRAM, SDRAM, and the like) and non-volatile memory (e.g., OTPROM, PROM, EPROM, EEPROM, mask ROM, flash ROM, NAND flash memory, NOR flash memory, and the like). According to one embodiment, the internal memory 1322 may be an SSD. The external memory 1324 may further include a flash drive, such as CF, SD, micro-SD, Mini-SD, xD, or a memory stick. The external memory 1324 may be functionally connected to the electronic device 1300 through various interfaces. According to one embodiment, the electronic device 1300 may further include a storage device (or storage medium) such as a hard disk.
Communication module 1330 (e.g., communication interface 160) may include a wireless communication module 1331 or a Radio Frequency (RF) module 1334. The wireless communication module 1331 may include, for example, WiFi 1333, BT 1335, GPS 1337, or NFC 1339. For example, the wireless communication module 1331 may provide wireless communication functions using radio frequencies. Additionally or alternatively, the wireless communication module 1331 may include a network interface (e.g., a LAN card) or modem to connect the electronic device 1300 with a network (e.g., the internet, a LAN, a WAN, a telecommunications network, a cellular network, a satellite network, or POTS, etc.).
The RF module 1334 may perform transmission/reception of data, for example, transmission/reception of RF signals. Although not shown, the RF module 1334 may include, for example, a transceiver, a power amplifier module (PAM), a frequency filter, or a low noise amplifier (LNA), etc. Further, the RF module 1334 may include a component, such as a conductor or a wire, for transmitting/receiving electromagnetic waves in free space in wireless communication.
The sensor module 1340 may measure a physical quantity, sense an activation state of the electronic device 1300, and convert the measured or sensed information into an electrical signal. The sensor module 1340 may include, for example, at least one of a gesture sensor 1340-A, a gyroscope sensor 1340-B, a pressure sensor 1340-C, a magnetic sensor 1340-D, an accelerometer sensor 1340-E, a grip sensor 1340-F, a proximity sensor 1340-G, a color sensor 1340-H (e.g., an RGB sensor), a biosensor 1340-I, a temperature/humidity sensor 1340-J, a light sensor 1340-K, and an Ultraviolet (UV) sensor 1340-M. Additionally or alternatively, the sensor module 1340 may include, for example, an odor sensor (not shown), an Electromyography (EMG) sensor (not shown), an electroencephalogram (EEG) sensor (not shown), an Electrocardiography (ECG) sensor (not shown), an Infrared (IR) sensor (not shown), an iris sensor (not shown), or a fingerprint sensor (not shown), among others. The sensor module 1340 may also include control circuitry that controls at least one or more sensors belonging to the sensor module 1340.
The input module 1350 may include a touch panel 1352, a (digital) pen sensor 1354, keys 1356, or an ultrasonic input device 1358. The touch panel 1352 may recognize a touch input, for example, in at least one of a capacitive method, a pressure sensing method, an infrared method, and an ultrasonic method. In addition, the touch panel 1352 may further include a control circuit. In the capacitive method, physical contact or proximity recognition may be performed. Touch panel 1352 may also include a tactile layer. In such a case, the touch panel 1352 may provide a tactile response to the user.
The (digital) pen sensor 1354 may be implemented, for example, in the same or a similar method as receiving the user's touch input, or with a separate recognition sheet. The keys 1356 may include, for example, physical keys, optical keys, a keypad, or touch keys. The ultrasonic input device 1358 is a device that can determine data by sensing, through a microphone (e.g., the microphone 1388), the sound waves generated by an input tool emitting ultrasonic signals, and it is capable of wireless recognition. According to one embodiment, the electronic device 1300 may receive user input from an external device (e.g., a network, a computer, or a server) connected thereto using the communication module 1330.
The display module 1360 (e.g., the display 150) may include a panel 1362, a hologram 1364, or a projector 1366. The panel 1362 may be, for example, a Liquid Crystal Display (LCD) or an Active Matrix Organic Light Emitting Diode (AMOLED) display, etc. The panel 1362 may be implemented, for example, to be flexible, transparent, or wearable. The panel 1362 may be constructed as one module with the touch panel 1352. The hologram 1364 may present a three-dimensional image in the air using interference of light. The projector 1366 may project light onto a screen and display video. The screen may be located, for example, inside or outside the electronic device 1300. According to one embodiment, the display module 1360 may further include control circuitry for controlling the panel 1362, the hologram 1364, or the projector 1366.
The interface 1370 may include, for example, an HDMI 1372, a USB 1374, an optical communication terminal 1376, or a D-subminiature (D-sub) connector 1378. The interface 1370 may include, for example, the communication interface 160 shown in fig. 1. Additionally or alternatively, the interface 1370 may include, for example, a Mobile High-definition Link (MHL) (not shown), a secure digital/multimedia card (SD/MMC) (not shown), or an Infrared Data Association (IrDA) interface (not shown).
The audio module 1380 may convert between sound and electrical signals. At least some constituent elements of the audio module 1380 may be included in, for example, the input/output interface 140 shown in fig. 1. The audio module 1380 may process sound information input or output through, for example, a speaker 1382, a receiver 1384, an earphone 1386, or a microphone 1388.
The camera module 1391 is a device capable of acquiring a still picture and a moving picture. According to one embodiment, the camera module 1391 may include one or more image sensors (e.g., front or rear sensors), lenses (not shown), ISPs (not shown), or flash lights (not shown) (e.g., LEDs or xenon lights).
The power management module 1395 may manage the power of the electronic device 1300. Although not shown, the power management module 1395 may include, for example, a Power Management Integrated Circuit (PMIC), a charging Integrated Circuit (IC), and a battery or fuel gauge.
The PMIC may be mounted, for example, in an integrated circuit or an SoC semiconductor. Charging methods may be classified into wired and wireless types. The charging IC may charge the battery and may prevent an overvoltage or overcurrent from flowing in from the charger. According to one embodiment, the charging IC may include a charging IC for at least one of the wired charging method and the wireless charging method. Wireless charging methods include, for example, the magnetic resonance method, the magnetic induction method, and the electromagnetic method, and the charging IC may be supplemented with an auxiliary circuit for wireless charging, for example, a coil circuit, a resonance circuit, or a rectifier circuit.
The battery gauge may measure, for example, the level of the battery 1396 and the charging voltage, current, or temperature. The battery 1396 may store and generate electric power, and may use the stored or generated electric power to provide power to the electronic device 1300. The battery 1396 may include, for example, a rechargeable cell or a solar cell.
The indicator 1397 may display a particular status of the electronic device 1300 or a part thereof, for example, a booting status, a message status, or a charging status. The motor 1398 may convert an electrical signal into mechanical vibration. Although not shown, the electronic device 1300 may include a processing device (e.g., a GPU) for mobile TV support. The processing device for mobile TV support may process media data conforming to standards such as Digital Multimedia Broadcasting (DMB), Digital Video Broadcasting (DVB), or media streaming.
The aforementioned constituent elements of the electronic device according to embodiments of the present disclosure may each be constituted by one or more components, and the names of the respective components may differ according to the kind of electronic device. An electronic device according to the present disclosure may include at least one of the aforementioned constituent elements; some constituent elements may be omitted, or other constituent elements may be further included. In addition, some constituent elements of the electronic device according to the present disclosure may be combined into one entity, which performs the same functions as the respective constituent elements performed before being combined.
As described above, embodiments of the present disclosure can improve the image processing speed of the image processing unit (i.e., the ISP) of an electronic device by generating thumbnail data in a module different from the image processing unit, interlocking the thumbnail data with the captured data, and storing the interlock result.
Embodiments of the present disclosure may improve the processing speed of thumbnails by generating the thumbnail data, in a module different from the image processing unit of the electronic device, using the metadata generated in the image processing unit. Here, the metadata may include a frame identifier of the image data, an image acquisition time stamp, focus information, image setting information (EXIF), flash information, and the like.
While the present disclosure has been shown and described with reference to certain preferred embodiments thereof, it will be understood by those skilled in the art that the operations of the electronic device may be changed, combined, or reused, and that various changes such as omissions, substitutions, and alterations may be made therein without departing from the spirit and scope of the present disclosure. Therefore, the spirit and scope of the present disclosure should not be limited to the described embodiments, but should be defined by the claims and their equivalents.

Claims (28)

1. An electronic device, comprising:
a first image sensor;
a second image sensor;
a touch screen display; and
a processor configured to:
obtaining first image data using a first image sensor;
controlling the touch screen display to display at least a portion of the first image data as a preview image;
receiving user input for capturing a captured image via a touch screen display;
in response to the user input, generating a first thumbnail image corresponding to the preview image using the at least a portion of the first image data, and generating a photographed image using the at least a portion of the first image data and at least a portion of the second image data obtained using the second image sensor.
2. The electronic device of claim 1, wherein the processor comprises:
a first image processing module configured to process first image data received from a first image sensor; and
a second image processing module configured to process second image data received from a second image sensor.
3. The electronic device of claim 1, wherein the processor is configured to process the first image data and generate the preview image and metadata regarding the corresponding first image data in a format displayable on a touch screen display.
4. The electronic device of claim 1, wherein the first image data comprises an image of less performance than the second image data.
5. The electronic device of claim 1, wherein the first thumbnail image is displayable while the camera application is being executed.
6. The electronic device of claim 1, wherein the processor is configured to:
selecting, in response to the user input, second image data from a plurality of image data obtained using a second image sensor based on metadata of the first image data; and
generating the photographic image using the at least a portion of the first image data and the selected at least a portion of the second image data.
7. The electronic device of claim 6, wherein the metadata of the first image data includes at least one of a frame identifier of the image data, an image acquisition timestamp, focus information, or image setting information EXIF.
8. The electronic device of claim 1, wherein the processor is configured to:
synchronizing the first image data and the second image data based on a processing delay time of the first image data and a processing delay time of the second image data; and
generating the captured image based on at least a portion of the synchronized first image data and at least a portion of the synchronized second image data.
9. The electronic device of claim 8, wherein the processing delay time of the second image data is greater than the processing delay time of the first image data.
10. The electronic device of claim 1, wherein a performance of the first image data is lower than a performance of the second image data.
11. A method in an electronic device, the method comprising:
obtaining first image data using a first image sensor;
displaying the first image data as a preview image;
receiving a user input for capturing an image; and
generating, in response to the user input, a first thumbnail image corresponding to the preview image using at least a portion of the first image data, and generating a captured image using the at least a portion of the first image data and at least a portion of second image data obtained using a second image sensor.
12. The method of claim 11, wherein the step of displaying the first image data comprises:
converting the first image data, among one or more image data, into the preview image using a first image processing module.
13. The method of claim 12, wherein generating the first thumbnail image comprises:
generating, in response to receiving the user input, an image smaller in size than the preview image using at least a portion of the preview image.
14. The method of claim 11, wherein the first thumbnail image is displayable while a camera application is being executed.
15. The method of claim 11, wherein generating the captured image comprises:
selecting, in response to the user input and based on metadata of the first image data, the second image data from a plurality of image data obtained using the second image sensor; and
generating the captured image using the at least a portion of the first image data and at least a portion of the selected second image data.
16. The method of claim 15, wherein the metadata of the first image data comprises at least one of a frame identifier of the image data, an image acquisition timestamp, focus information, or image setting information (EXIF).
17. The method of claim 11, wherein generating the captured image comprises:
synchronizing the first image data and the second image data based on a processing delay time of the first image data and a processing delay time of the second image data; and
generating the captured image based on at least a portion of the synchronized first image data and at least a portion of the synchronized second image data.
18. The method of claim 17, wherein the processing delay time of the second image data is greater than the processing delay time of the first image data.
19. The method of claim 11, wherein the performance of the first image data is lower than the performance of the second image data.
20. A portable communication device, comprising:
a first image sensor;
a second image sensor;
a touch screen display; and
at least one processor adapted to:
obtaining a first image at a first performance using a first image sensor, wherein the obtaining includes generating acquisition time information or frame identification information corresponding to the first image as at least a portion of metadata corresponding to the first image;
presenting at least a portion of the first image as a preview image via the touch screen display;
receiving, via the touch screen display, a user input for capturing an image while the preview image is displayed;
in response to the user input, selecting at least one of a plurality of second images based at least in part on the acquisition time information or the frame identification information, wherein the plurality of second images are obtained using a second image sensor at a second performance higher than the first performance;
generating the captured image using the first image and the at least one second image; and
generating a thumbnail image corresponding to the captured image using at least a portion of the first image and a corresponding one of the acquisition time information and the frame identification information corresponding to the first image,
wherein the at least one processor is configured to:
synchronizing the first image and the at least one second image based on the processing delay time of the first image and the processing delay time of the at least one second image; and
generating the captured image based on the synchronized first image and the synchronized at least one second image.
21. The portable communication device of claim 20, further comprising:
an interface connected to the at least one processor, the first image sensor and the second image sensor,
wherein the at least one processor is adapted to:
receiving the first image from the first image sensor via the interface; and
receiving a second image from the second image sensor via the interface.
22. The portable communication device of claim 20, wherein the at least one processor is adapted to:
recording acquisition time information of the first image as metadata of one or more second images among the plurality of second images, wherein the one or more second images correspond to the first image.
23. The portable communication device of claim 20, wherein the at least one processor is adapted to:
selecting the at least one second image based at least in part on a processing delay of the second image caused by the processing of the first image.
24. The portable communication device of claim 20, wherein the second image sensor is mounted on an opposite side of the portable communication device from the first image sensor.
25. The portable communication device of claim 20, wherein the at least one processor is adapted to:
generating the thumbnail image to be smaller than the preview image.
26. The portable communication device of claim 20, wherein the at least one processor is adapted to:
displaying the preview image, as at least a portion of the presenting, via a first area of the touch screen display.
27. The portable communication device of claim 20, further comprising:
a memory,
wherein the at least one processor is adapted to:
associating the thumbnail image with the captured image, wherein the thumbnail image is selected based at least in part on the acquisition time information or frame identification information; and
storing the thumbnail image in the memory in association with the captured image.
28. An electronic device, comprising:
a first image sensor;
a second image sensor;
a touch screen display; and
a processor configured to:
obtaining first image data using the first image sensor;
controlling the touch screen display to display at least a portion of the first image data as a preview image;
receiving, via the touch screen display, a user input for capturing an image while the preview image is displayed; and
generating, in response to the user input, a first thumbnail image corresponding to the preview image using the first image data, and generating a captured image using the first image data and second image data obtained using the second image sensor,
wherein the processor is configured to:
synchronizing the first image data and the second image data based on a processing delay time of the first image data and a processing delay time of the second image data; and
generating the captured image based on the synchronized first image data and the synchronized second image data.
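As a rough illustration of the synchronization recited in claims 6 through 9, 15 through 18, and 20 (matching the previewed frame to a full-resolution frame using metadata and the two pipelines' processing delay times), consider the Python sketch below. It reuses the hypothetical `FrameMeta` record from the earlier sketch; `buffer_depth`, `select_synchronized_frame`, and the nearest-timestamp fallback are assumptions for illustration, not the claimed method.

```python
def buffer_depth(frames_per_sec: int, delay_first_us: int,
                 delay_second_us: int) -> int:
    """How many full-resolution frames to buffer so the frame being
    previewed at capture time is still available, given that the second
    pipeline's processing delay exceeds the first's (claims 9 and 18)."""
    extra_us = max(0, delay_second_us - delay_first_us)
    return 1 + (extra_us * frames_per_sec) // 1_000_000

def select_synchronized_frame(first_meta, second_frames):
    """Pick, from buffered (FrameMeta, image) pairs of the second sensor,
    the frame matching the previewed first frame: prefer an exact frame
    identifier match, else fall back to the nearest acquisition time."""
    for meta, image in second_frames:
        if meta.frame_id == first_meta.frame_id:
            return image
    return min(second_frames,
               key=lambda pair: abs(pair[0].timestamp_us
                                    - first_meta.timestamp_us))[1]
```

The buffer-depth helper reflects the design point in claims 9 and 18: because the full-resolution pipeline lags the preview pipeline, enough of its frames must remain addressable for the metadata-based match to succeed.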
CN201911355618.9A 2013-03-13 2014-03-13 Electronic device and method for processing image Active CN110996013B (en)

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
US201361780635P 2013-03-13 2013-03-13
US61/780,635 2013-03-13
KR1020140025000A KR102124188B1 (en) 2013-03-13 2014-03-03 Electronic device and method for processing image
KR10-2014-0025000 2014-03-03
CN201480014885.9A CN105075241B (en) 2013-03-13 2014-03-13 Electronic device and method for processing image
PCT/KR2014/002091 WO2014142564A1 (en) 2013-03-13 2014-03-13 Electronic device and method for processing image

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201480014885.9A Division CN105075241B (en) 2013-03-13 2014-03-13 Electronic device and method for processing image

Publications (2)

Publication Number Publication Date
CN110996013A (en) 2020-04-10
CN110996013B (en) 2022-07-01

Family

ID=51537108

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911355618.9A Active CN110996013B (en) 2013-03-13 2014-03-13 Electronic device and method for processing image

Country Status (2)

Country Link
CN (1) CN110996013B (en)
WO (1) WO2014142564A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113055585B (en) * 2019-12-27 2022-06-10 青岛海信移动通信技术股份有限公司 Thumbnail display method of shooting interface and mobile terminal

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20080006774A (en) * 2006-07-13 2008-01-17 엠텍비젼 주식회사 Method and device for transmitting thumbnail data
CN101420504A (en) * 2007-10-25 2009-04-29 鸿富锦精密工业(深圳)有限公司 Image viewing system and method
CN101742035A (en) * 2008-11-11 2010-06-16 夏普株式会社 Image forming apparatus and preview display method
CN102546925A (en) * 2010-12-29 2012-07-04 Lg电子株式会社 Mobile terminal and controlling method thereof

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3820086B2 (en) * 2000-07-31 2006-09-13 オリンパス株式会社 Electronic camera
JP4432233B2 (en) * 2000-08-25 2010-03-17 株式会社ニコン Electronic camera
WO2003024094A1 (en) * 2001-09-10 2003-03-20 Nikon Corporation Digital camera system, image storage apparatus, and digital camera
US7480864B2 (en) * 2001-10-12 2009-01-20 Canon Kabushiki Kaisha Zoom editor
US20060187227A1 (en) * 2005-01-31 2006-08-24 Jung Edward K Storage aspects for imaging device
US20080030592A1 (en) * 2006-08-01 2008-02-07 Eastman Kodak Company Producing digital image with different resolution portions
KR20100018335A (en) * 2008-08-06 2010-02-17 삼성디지털이미징 주식회사 Method and apparatus for controlling thumbnail display, and digital photographing apparatus
KR20100032135A (en) * 2008-09-17 2010-03-25 주식회사 케이티테크 Method and apparatus for displaying thumbnail image
US20100118175A1 (en) * 2008-11-10 2010-05-13 Victor Charles Bruce Imaging Apparatus For Image Integration
KR101500505B1 (en) * 2009-01-29 2015-03-09 엘지전자 주식회사 Mobile terminal and operation method thereof
KR101691833B1 (en) * 2010-11-04 2017-01-09 엘지전자 주식회사 Mobile terminal and Method for controlling photographing image thereof
US9584735B2 (en) * 2010-11-12 2017-02-28 Arcsoft, Inc. Front and back facing cameras
CN102915669A (en) * 2012-10-17 2013-02-06 中兴通讯股份有限公司 Method and device for manufacturing live-action map

Also Published As

Publication number Publication date
WO2014142564A1 (en) 2014-09-18
CN110996013A (en) 2020-04-10

Similar Documents

Publication Publication Date Title
US11509807B2 (en) Electronic device and method for generating thumbnails based on captured images
US10348971B2 (en) Electronic device and method for generating thumbnails based on captured images
EP3506617A1 (en) Method for controlling camera and electronic device therefor
KR102122476B1 (en) Apparatas and method for controlling a rotation of screen in an electronic device
US20160127653A1 (en) Electronic Device and Method for Providing Filter in Electronic Device
CN104869305B (en) Method and apparatus for processing image data
US20150058630A1 (en) Electronic device and method for extracting encrypted message
US10999501B2 (en) Electronic device and method for controlling display of panorama image
US9508383B2 (en) Method for creating a content and electronic device thereof
US10747489B2 (en) Method for displaying content and electronic device therefor
US20150063778A1 (en) Method for processing an image and electronic device thereof
KR20150043894A (en) Apparatas and method for adjusting a preview area of multi image in an electronic device
CN110996013B (en) Electronic device and method for processing image
CN114630152A (en) Parameter transmission method and device for image processor and storage medium
KR20150050695A (en) Method for processing image and an electronic device thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant