CN106228511B - Electronic device and method for generating image file in electronic device - Google Patents


Info

Publication number
CN106228511B
CN106228511B
Authority
CN
China
Prior art keywords
image
display
images
electronic device
combination file
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201610384197.2A
Other languages
Chinese (zh)
Other versions
CN106228511A (en)
Inventor
李慈卿
李奎浩
Current Assignee
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date
Filing date
Publication date
Application filed by Samsung Electronics Co Ltd filed Critical Samsung Electronics Co Ltd
Publication of CN106228511A publication Critical patent/CN106228511A/en
Application granted granted Critical
Publication of CN106228511B publication Critical patent/CN106228511B/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 Geometric image transformations in the plane of the image
    • G06T3/40 Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T3/4038 Image mosaicing, e.g. composing plane images from plane sub-images
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • G06T11/60 Editing figures and text; Combining figures or text
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482 Interaction with lists of selectable items, e.g. menus
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04847 Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00 General purpose image data processing
    • G06T1/60 Memory management
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00 Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/21 Intermediate information storage
    • H04N1/2104 Intermediate information storage for one or a few pictures
    • H04N1/2112 Intermediate information storage for one or a few pictures using still video cameras
    • H04N1/2116 Picture signal recording combined with imagewise recording, e.g. photographic recording
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00 Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/32 Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00 Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/387 Composing, repositioning or otherwise geometrically modifying originals
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00 Indexing scheme for image data processing or generation, in general
    • G06T2200/24 Indexing scheme for image data processing or generation, in general involving graphical user interfaces [GUIs]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00 Indexing scheme for image data processing or generation, in general
    • G06T2200/32 Indexing scheme for image data processing or generation, in general involving image mosaicing

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Television Signal Processing For Recording (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An electronic device and a method for generating an image file in the electronic device are provided. The electronic device includes a memory for storing a plurality of images captured intermittently, and a processor for selecting at least some of the plurality of images, generating an image combination file in a format for sequentially playing the selected images by combining the selected images, and storing the image combination file in the memory. The apparatus may also provide a display for adding, deleting, and arranging images of the image combination file.

Description

Electronic device and method for generating image file in electronic device
This application claims the benefit of Korean patent application No. 10-2015-.
Technical Field
The present disclosure relates to a method for generating an image file using an image stored in an electronic device.
Background
People capture photographs to preserve and share memories of events such as trips and anniversaries. As cameras became standard on devices such as smartphones and tablet Personal Computers (PCs), capturing and sharing photos has become part of everyday life.
Typically, the photos may be stored in multiple folders in the electronic device. The photos stored in each folder may be displayed in the form of thumbnails on one screen, or one photo selected by a user of the electronic device may be displayed.
Because digital devices make it easy to capture and delete images, the number of photos people take has grown dramatically. With the advancement of communication technologies and Social Network Services (SNS), sharing images of events (such as travel) with others through the Internet has also increased greatly. However, as the number of photographs grows, managing them becomes difficult. To transmit photos stored in the electronic device to another electronic device, the user must select them one by one. Furthermore, as the data size of each photo increases, communication charges may become significant when the user sends or uploads multiple photos.
The above information is presented merely as background information to aid in understanding the present disclosure. No determination has been made, nor is a statement made, as to whether any of the above information is applicable as prior art against the present disclosure.
Disclosure of Invention
Aspects of the present disclosure are provided to address at least the above problems and/or disadvantages and to provide at least the advantages described below. Accordingly, an aspect of the present disclosure is to provide an electronic device for easily playing, managing, and sharing an image by combining a plurality of photographs to generate one image file, and a method for generating an image file in the electronic device.
According to an aspect of the present disclosure, an electronic device is provided. The electronic device includes a memory for storing a plurality of images captured intermittently, and a processor for selecting at least some of the plurality of images, generating an image combination file in a format for sequentially playing the selected images by combining the selected images, and storing the image combination file in the memory.
According to another aspect of the present disclosure, a method for generating an image file in an electronic device is provided. The method includes: selecting at least some of a plurality of images captured in an intermittent manner and stored in a memory, generating an image combination file in a format for sequentially playing the selected images by combining the selected images, and storing the image combination file in the memory.
According to another aspect of the present disclosure, there is provided a computer-readable recording medium. The computer-readable recording medium includes a program for executing the method of: the method includes selecting at least some of a plurality of images intermittently captured and stored in a memory, generating an image combination file of a format for sequentially playing the selected images by combining the selected images, and storing the image combination file in the memory.
Other aspects, advantages, and salient features of the disclosure will become apparent to those skilled in the art from the following detailed description, which, taken in conjunction with the annexed drawings, discloses various embodiments of the disclosure.
Drawings
The above and other aspects, features and advantages of particular embodiments of the present disclosure will become more apparent from the following description when taken in conjunction with the accompanying drawings, in which:
Fig. 1 is a block diagram illustrating a configuration of an electronic device according to various embodiments of the present disclosure;
Figs. 2A-2C are diagrams illustrating a user interface for generating an image combination file according to various embodiments of the present disclosure;
Figs. 3A-3F are diagrams illustrating user interfaces for setting image selection conditions according to various embodiments of the present disclosure;
Figs. 4A-4C are diagrams illustrating a user interface for editing an image combination file according to various embodiments of the present disclosure;
Figs. 5A-5C are diagrams illustrating a user interface for generating an image combination file according to various embodiments of the present disclosure;
Figs. 6A-6C are diagrams illustrating a user interface for selecting a plurality of images according to various embodiments of the present disclosure;
Figs. 7A-7D are diagrams illustrating a user interface for editing an image combination file according to various embodiments of the present disclosure;
Figs. 8A-8D are diagrams illustrating a user interface for editing an image combination file according to various embodiments of the present disclosure;
Figs. 9A-9D are diagrams illustrating a user interface for playing an image combination file according to various embodiments of the present disclosure;
Fig. 10 is a diagram illustrating operations of playing an image combination file according to various embodiments of the present disclosure;
Fig. 11 is a flow diagram illustrating a method for generating an image file in an electronic device according to various embodiments of the present disclosure;
Fig. 12 is a block diagram illustrating a configuration of an electronic device in a network environment according to various embodiments of the present disclosure;
Fig. 13 is a block diagram illustrating a configuration of an electronic device according to various embodiments of the present disclosure;
Fig. 14 is a block diagram illustrating a configuration of program modules according to various embodiments of the present disclosure.
Throughout the drawings, it should be noted that: the same reference numerals are used to describe the same or similar elements, features and structures.
Detailed Description
The following description with reference to the accompanying drawings is provided to assist in a comprehensive understanding of various embodiments of the disclosure as defined by the claims and their equivalents. While the following description includes various specific details to aid understanding, these specific details are to be considered exemplary only. Thus, one of ordinary skill in the art will recognize that: various changes and modifications may be made to the various embodiments described herein without departing from the scope and spirit of the present disclosure. Moreover, descriptions of well-known functions and constructions may be omitted for clarity and conciseness.
The terms and words used in the following description and claims are not limited to their dictionary meanings, but are merely used to enable a clear and consistent understanding of the disclosure. Accordingly, it should be apparent to those skilled in the art that the following description of various embodiments of the present disclosure is provided for illustration only and not for the purpose of limiting the disclosure as defined by the appended claims and their equivalents.
It will be understood that: the singular forms "a", "an" and "the" include plural referents unless the context clearly dictates otherwise. Thus, for example, reference to "a component surface" includes reference to one or more of such surfaces.
In the disclosure, the expressions "have", "may have", "include", and "may include" used herein indicate the presence of the corresponding feature (e.g., an element such as a value, a function, an operation, or a component) without excluding the presence of additional features.
In the disclosure, the expressions "A or B", "at least one of A and/or B", "one or more of A and/or B", and the like, as used herein, may include any and all combinations of one or more of the associated listed items. For example, the term "A or B", "at least one of A and B", or "at least one of A or B" may indicate all of the following cases: (1) including at least one A, (2) including at least one B, or (3) including both at least one A and at least one B.
Expressions such as "first" and "second" used in various embodiments of the present disclosure may indicate various elements regardless of the order and/or priority of the respective elements, and do not limit the respective elements. The expressions may be used to distinguish one element from another. For example, both a "first user device" and a "second user device" indicate user devices that are different from each other regardless of their order and/or priority. For example, a first element may be termed a second element, and vice versa, without departing from the scope of the present disclosure.
It should be understood that when an element (e.g., a first element) is referred to as being "(operatively or communicatively) coupled with/to" or "connected to" another element (e.g., a second element), the element may be directly coupled with/to or connected to the other element, or an intervening element (e.g., a third element) may be present. In contrast, when an element (e.g., a first element) is referred to as being "directly coupled with/to" or "directly connected to" another element (e.g., a second element), it should be understood that there is no intervening element (e.g., a third element).
As used herein, the expression "configured to" may be used interchangeably with, for example, "suitable for", "having the capacity to", "designed to", "adapted to", "made to", or "capable of". The term "configured to" does not mean only "specifically designed to" in hardware. Instead, the expression "a device configured to" may mean that the device is "capable of" operating together with another device or other components. For example, a "processor configured to perform A, B, and C" may mean a dedicated processor (e.g., an embedded processor) for performing the corresponding operations, or a general-purpose processor (e.g., a Central Processing Unit (CPU) or an Application Processor (AP)) that may perform the corresponding operations by executing one or more software programs stored in a memory device.
Unless otherwise defined herein, all terms used herein, including technical or scientific terms, have the same meaning as commonly understood by one of ordinary skill in the art. Terms defined in dictionaries should be interpreted as having their conventional contextual meaning and are not to be interpreted in an idealized or overly formal sense unless expressly so defined herein. In some cases, even terms defined in this specification may not be construed as excluding embodiments of the present disclosure.
Electronic devices according to various embodiments of the present disclosure may include, for example, at least one of: a smartphone, a tablet Personal Computer (PC), a mobile phone, a video phone, an e-book reader, a desktop PC, a laptop PC, a netbook computer, a workstation, a server, a Personal Digital Assistant (PDA), a Portable Multimedia Player (PMP), a Moving Picture Experts Group phase 1 or phase 2 (MPEG-1 or MPEG-2) Audio Layer 3 (MP3) player, a mobile medical device, a camera, or a wearable device. According to various embodiments, the wearable device may include at least one of: an accessory-type wearable device (e.g., a watch, a ring, a bracelet, an anklet, a necklace, glasses, a contact lens, or a Head-Mounted Device (HMD)), a fabric- or garment-integrated wearable device (e.g., electronic clothing), a body-mounted wearable device (e.g., a skin pad or a tattoo), or an implantable wearable device (e.g., an implantable circuit).
In various embodiments, the electronic device may also be a smart home appliance. The smart home appliance may include, for example, at least one of: a television (TV), a Digital Versatile Disc (DVD) player, a stereo, a refrigerator, an air conditioner, a vacuum cleaner, an oven, a microwave oven, a washing machine, an air purifier, a set-top box, a home automation control panel, a security control panel, a TV box (e.g., Samsung HomeSync™, Apple TV™, or Google TV™), a game console (e.g., Xbox™ or PlayStation™), an electronic dictionary, an electronic key, a camera, or an electronic photo frame.
In various embodiments, the electronic device may also include at least one of: various medical devices (e.g., portable medical measuring devices such as blood glucose meters, heart rate meters, blood pressure meters, or thermometers, Magnetic Resonance Angiography (MRA), Magnetic Resonance Imaging (MRI), Computed Tomography (CT), scanners, ultrasound devices, etc.), navigation devices, Global Navigation Satellite System (GNSS) receivers, Event Data Recorders (EDRs), Flight Data Recorders (FDRs), vehicle infotainment devices, electronic equipment for ships (e.g., navigation systems, gyrocompasses, etc.), avionics, security devices, vehicle head units, industrial or household robots, Automated Teller Machines (ATMs), Point-of-Sale (POS) terminals, or Internet of Things (IoT) devices (e.g., light bulbs, various sensors, electricity or gas meters, sprinkler devices, fire alarms, thermostats, street lights, toasters, exercise equipment, hot water tanks, heaters, boilers, etc.).
According to various embodiments of the present disclosure, the electronic device may also include at least one of: a part of furniture or a building/structure, an electronic board, an electronic signature receiving device, a projector, or various measuring instruments (e.g., a water meter, an electricity meter, a gas meter, a radio wave meter, etc.). Electronic devices according to various embodiments of the present disclosure may also be a combination of one or more of the foregoing devices, and may be flexible electronic devices. However, electronic devices according to various embodiments of the present disclosure are not limited to the foregoing devices and may include new electronic devices as technology develops.
Hereinafter, an electronic device according to various embodiments will be described with reference to the accompanying drawings. The term "user" as used herein may refer to a person who uses an electronic device or a device (e.g., an artificial intelligence electronic device) that uses an electronic device.
Fig. 1 is a block diagram illustrating a configuration of an electronic device according to various embodiments of the present disclosure.
Referring to fig. 1, the electronic device 100 may include a camera module 110, a sensor module 120, a memory 130, an input module 140, a display 150, a communication module 160, and a processor 170.
The camera module 110 may capture an image (or photograph). For example, the camera module 110 may generate an image file by capturing an image based on a user instruction.
The sensor module 120 may sense a state of the electronic device 100. According to an embodiment of the present disclosure, the sensor module 120 may include a time sensor (not shown). The time sensor maintains a continuously updated clock and can provide the current time (including year, month, and day). According to an embodiment of the present disclosure, when the camera module 110 captures an image, the time sensor may record the time at which the image was captured and provide the time information to the processor 170.
The memory 130 may store images (or image files) captured by the camera module 110. Alternatively, the memory 130 may store an image captured by and then transmitted from an external device. According to an embodiment of the present disclosure, the memory 130 may store a plurality of images captured in an intermittent manner. The intermittent manner may refer to, for example, a temporally or spatially intermittent manner. According to embodiments of the present disclosure, the memory 130 may store multiple images captured at different times and/or different locations: images captured at different times are intermittent in time, and images captured at different locations are intermittent in space. For example, photographs captured by a user of the electronic device 100 correspond to a plurality of images captured in an intermittent manner, whereas the frames of a moving image captured by the user correspond to a plurality of images captured in a continuous manner.
According to an embodiment of the present disclosure, the memory 130 may store an image combination file generated by combining a plurality of images captured in an intermittent manner. The image combination file may be an image file in a format for sequentially (or consecutively) playing a plurality of images included in the image file.
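The patent does not specify the container layout, but the idea of a single file whose images play back in sequence can be sketched with a minimal hypothetical format: a signature, an image count, a length table, and the encoded image blobs in play order. All names and the layout below are illustrative assumptions, not the actual file format.

```python
import struct

MAGIC = b"IMGC"  # hypothetical 4-byte signature for the combined file

def write_combination_file(path, image_blobs):
    """Pack already-encoded image byte strings into one sequential container.

    Layout (assumed): MAGIC | count (uint32) | count x length (uint32) | blobs.
    """
    with open(path, "wb") as f:
        f.write(MAGIC)
        f.write(struct.pack("<I", len(image_blobs)))
        for blob in image_blobs:
            f.write(struct.pack("<I", len(blob)))
        for blob in image_blobs:
            f.write(blob)

def read_combination_file(path):
    """Return the list of image byte strings, in their stored play order."""
    with open(path, "rb") as f:
        assert f.read(4) == MAGIC, "not an image combination file"
        (count,) = struct.unpack("<I", f.read(4))
        lengths = [struct.unpack("<I", f.read(4))[0] for _ in range(count)]
        return [f.read(n) for n in lengths]
```

Because the images are stored in play order with an up-front length table, a player can seek directly to any image without decoding the preceding ones.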
The input module 140 may receive various user instructions. According to an embodiment of the present disclosure, the input module 140 may include a touch sensor for sensing a touch operation of a user or a pen sensor panel for sensing a pen operation of a user.
According to an embodiment of the present disclosure, the input module 140 may detect both a user operation input in direct contact with a panel (e.g., a touch sensor panel or a pen sensor panel) and a hovering operation input within a certain distance of the panel without direct contact.
According to an embodiment of the present disclosure, the input module 140 may also accept a user instruction to select an image to be included in the image combination file.
According to an embodiment of the present disclosure, the input module 140 may receive a user instruction to set a selection condition of an image included in the image combination file.
According to embodiments of the present disclosure, the input module 140 may also receive a user instruction to add a new image to the image combination file. According to an embodiment of the present disclosure, the input module 140 may also receive a user instruction to delete some of the images included in the image combination file. According to an embodiment, the input module 140 may also receive a user instruction to change the order in which the images included in the image combination file are arranged.
The display 150 may display a user interface. For example, if a certain event or a predetermined event occurs in the electronic device 100, the display 150 may display a corresponding user interface. For example, the display 150 may display an application execution screen, a content play screen, a menu screen, a lock screen, a notification message, and the like.
According to an embodiment of the present disclosure, the display 150 may display a user interface for setting an image selection condition. The image selection condition may include, for example, at least one of time, place, person, tag, or image mode.
According to an embodiment of the present disclosure, the display 150 may display a user interface for editing the image combination file. The user may delete some of the images included in the image combination file through the user interface or may change the order in which the images included in the image combination file are arranged. In addition, the user may add a new image to the image combination file through the user interface.
According to an embodiment of the present disclosure, the display 150 may also display a play screen of the image combination file. For example, the display 150 may sequentially play a plurality of images included in the image combination file at a certain time interval or a predetermined time interval. According to an embodiment of the present disclosure, the display 150 may display an image corresponding to a region where a user operation is input, among a plurality of images included in an image combination file.
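The text does not detail how a touch region maps to an image in the combination file; a minimal sketch, assuming the display width is simply divided into equal regions (one per image), could look like this:

```python
def image_index_for_touch(x, width, count):
    """Map a horizontal touch position to an image in the combination file.

    Divides the display width into `count` equal regions, one per image.
    The equal-width mapping is an assumption for illustration.
    """
    region = int(x * count / width)
    return min(max(region, 0), count - 1)  # clamp to a valid image index
```

Dragging a finger across the screen would then scrub through the images as the touch position crosses region boundaries.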
According to an embodiment of the present disclosure, the input module 140 and the display 150 may be implemented using, for example, a touch screen in which a touch sensor panel is arranged on a display panel to simultaneously display an image and detect a touch operation.
The communication module 160 may perform data communication through a network, such as a mobile communication network or an internet network. According to an embodiment of the present disclosure, the communication module 160 may include a cellular module (not shown), a Wi-Fi module (not shown), a Bluetooth (BT) module (not shown), a Near Field Communication (NFC) module (not shown), a GNSS module (not shown), and the like.
According to an embodiment of the present disclosure, the cellular module may communicate with a base station, wherein the base station provides a mobile communication service to an area in which the electronic device 100 is located. According to an embodiment, the cellular module may send information associated with the serving cell to the processor 170.
According to an embodiment of the present disclosure, a Wi-Fi module may communicate with an Access Point (AP) providing wireless internet service within a determined range or a predetermined range. According to an embodiment of the present disclosure, the Wi-Fi module may obtain information about an AP currently communicating with the electronic device 100 (e.g., a location of the AP, an identification number of the AP, etc.).
According to an embodiment of the present disclosure, the GNSS module may determine the current location (e.g., latitude/longitude) of the electronic device 100 using information received from satellites. According to an embodiment of the present disclosure, if the camera module 110 captures an image, the GNSS module may verify the location of the captured image and may provide the verified location to the processor 170.
According to an embodiment of the present disclosure, the communication module 160 may communicate with an external device to communicate an image or an image combination file with the external device. For example, the communication module 160 may transmit an image combination file to a Social Network Service (SNS) server or receive an image combination file from an SNS server.
The processor 170 may control the overall operation of the electronic device 100. According to an embodiment of the present disclosure, the processor 170 may control the camera module 110, the sensor module 120, the memory 130, the input module 140, and the communication module 160 to generate, manage, and play an image combination file according to various embodiments of the present disclosure.
According to an embodiment of the present disclosure, the processor 170 may execute an application for generating, managing, and playing an image combination file, and may provide an image combination file service to a user. According to an embodiment of the present disclosure, the processor 170 may be implemented with a system on a chip (SoC).
According to an embodiment of the present disclosure, the processor 170 may store the image captured by the camera module 110 in the memory 130. According to an embodiment of the present disclosure, the processor 170 may also generate metadata associated with the image captured by the camera module 110 and may store the generated metadata in the memory 130 along with the image. The metadata may include, for example, information such as the time at which the image was captured, the location at which the image was captured, people included in the image, and tags inserted into the image (e.g., time, location, people, user's feelings, events associated with the image (e.g., holidays, birthdays, travel locations, etc.)). According to an embodiment of the present disclosure, time information or location information may be received from the sensor module 120 (e.g., a time sensor) or the communication module 160 (e.g., a GNSS module).
According to an embodiment, information about a person included in an image may be generated by recognizing the person through a face recognition algorithm, or may be received from the user. According to an embodiment of the present disclosure, the user's feelings or an event associated with an image may also be received from the user. The metadata may be transmitted with the image. For example, if the electronic device 100 receives an image captured by an external device, the metadata may be included in the received image.
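As a rough illustration of the metadata described above (capture time from the time sensor, location from the GNSS module, recognized people, and user-supplied tags), a simple per-image record might look like the following. The class, its field names, and the example values are assumptions, not the patent's actual metadata format:

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List, Optional, Tuple

@dataclass
class ImageMetadata:
    """Illustrative per-image metadata record; all names are assumptions."""
    captured_at: datetime                            # from the time sensor
    location: Optional[Tuple[float, float]] = None   # (lat, lon) from the GNSS module
    people: List[str] = field(default_factory=list)  # face recognition or user input
    tags: List[str] = field(default_factory=list)    # e.g. "holiday", "birthday", "travel"

# Example: metadata attached to one captured photo (hypothetical values)
meta = ImageMetadata(
    captured_at=datetime(2015, 8, 15, 14, 30),
    location=(37.566, 126.978),
    people=["Alice"],
    tags=["travel"],
)
```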
According to an embodiment of the present disclosure, the processor 170 may select at least some of the plurality of images stored in the memory 130. For example, the processor 170 may select a subset of the plurality of images stored in the memory 130.
According to an embodiment of the present disclosure, the processor 170 may select an image selected by a user. For example, the processor 170 may display a user interface for selecting at least some of the plurality of images on the display 150. The user may select at least some of the plurality of images stored in memory 130 through a user interface displayed on display 150.
According to an embodiment of the present disclosure, the processor 170 may also use the metadata to select at least some of the plurality of images stored in the memory 130. According to an embodiment of the present disclosure, the processor 170 may select at least some of the plurality of images stored in the memory 130 based on an image selection condition set by a user. For example, the processor 170 may display a user interface for setting image selection conditions on the display 150. The user may set the image selection conditions through the user interface displayed on the display 150, and the processor 170 may then select at least some of the images satisfying the set conditions. The image selection condition may include, for example, at least one of: the time when the image was captured, the location where the image was captured, a person included in the image, a tag inserted into the image, or an image pattern.
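The condition-based selection described above can be sketched as a generic filter over image metadata. The predicate-per-field structure below is an assumption for illustration, not the disclosed implementation:

```python
# Illustrative sketch: select images whose metadata satisfies every enabled
# condition. Each condition is a predicate keyed by the metadata field it
# inspects; a disabled ('off') condition is simply omitted from the mapping.
def select_images(images, conditions):
    """images: list of dicts with a 'meta' mapping; conditions: {field: predicate}."""
    selected = []
    for img in images:
        meta = img["meta"]
        if all(pred(meta.get(field)) for field, pred in conditions.items()):
            selected.append(img)
    return selected

images = [
    {"name": "a.jpg", "meta": {"place": "Seoul", "tags": ["travel"]}},
    {"name": "b.jpg", "meta": {"place": "Busan", "tags": ["birthday"]}},
]
picked = select_images(images, {"place": lambda p: p == "Seoul"})
```

With an empty condition mapping, every image is selected, which matches the behavior when all condition toggles are off.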
According to an embodiment of the present disclosure, the processor 170 may generate an image combination file by combining the selected images. According to an embodiment of the present disclosure, the processor 170 may compress the selected images and may generate an image combination file by combining the compressed images. Accordingly, if the image combination file includes many images, the processor 170 may reduce the data amount of the image combination file to easily play and share the image combination file.
According to an embodiment of the present disclosure, the processor 170 may arrange the selected images based on at least one of a time at which each image is captured, a place at which each image is captured, a person included in each image, a tag inserted into each image, or an image pattern, and may generate an image combination file by combining the selected images in the arranged order.
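A minimal sketch of the arrange-and-combine step, assuming the images are byte payloads and inventing a simple length-prefixed container for illustration (the disclosure does not define the file format):

```python
import zlib

# Illustrative sketch: sort the selected images by capture time, compress
# each one to reduce the data amount of the file, and pack them into a single
# combination file as length-prefixed records. The container layout is an
# assumption made for this example.
def make_combination_file(images):
    ordered = sorted(images, key=lambda img: img["meta"]["captured_at"])
    blob = bytearray()
    for img in ordered:
        data = zlib.compress(img["data"])     # per-image compression
        blob += len(data).to_bytes(4, "big")  # record length prefix
        blob += data
    return bytes(blob)
```

The sort key could equally be place, person, tag, or image pattern, per the arrangement options listed above.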
According to an embodiment of the present disclosure, the processor 170 may store the generated image combination file in the memory 130. According to an embodiment of the present disclosure, the processor 170 may generate metadata associated with the image combination file and may store the generated metadata in the memory 130 along with the image combination file. The metadata may include, for example, information such as the time when each image included in the image combination file was captured, the place where it was captured, the people included in it, and the tags inserted into it (e.g., a time, a place, a person, a feeling of the user, or an event associated with the image (e.g., a holiday, a birthday, a travel place, etc.)). The metadata of the image combination file may be generated based on the metadata of the images included in the image combination file, or may be received from a user.
According to an embodiment of the present disclosure, the processor 170 may also edit the image combination file. As one example, the processor 170 may delete some of the images included in the image combination file based on a user instruction. For another example, the processor 170 may change the order in which the images included in the image combination file are arranged based on a user instruction. For another example, the processor 170 may add a new image to the image combination file based on a user instruction.
According to an embodiment of the present disclosure, the processor 170 may also transmit the image combination file to an external device through the communication module 160. For example, the processor 170 may share the image combination file by uploading it to a social networking service (SNS) or sending it to an electronic device of another user based on a user instruction.
According to an embodiment of the present disclosure, the processor 170 may play and display the image combination file on the display 150. According to an embodiment of the present disclosure, processor 170 may play and display the image combination file on at least a portion of display 150 based on a user instruction.
According to an embodiment of the present disclosure, if the image combination file is played, the processor 170 may sequentially display a plurality of images included in the image combination file at a determined time interval or a predetermined time interval on the display 150.
According to an embodiment of the present disclosure, if the image combination file is played, the processor 170 may divide an area in which the image combination file is displayed into a plurality of areas corresponding to the number of images included in the image combination file. According to an embodiment of the present disclosure, if a user operation is input on an area where an image combination file is displayed, the processor 170 may display an image corresponding to the area where the user operation is input on the display 150.
Fig. 2A-2C are diagrams illustrating a user interface for generating an image combination file according to various embodiments of the present disclosure.
Referring to fig. 2A, the display 150 may display an image combination file 1 and an image combination file 2 stored in the memory 130 of fig. 1. For example, the display 150 may display the image combination files stored in the memory 130 in order of the time at which each image combination file was generated. According to an embodiment of the present disclosure, the display 150 may display, for each of the image combination files 1 and 2, a single representative image from among the plurality of images included in that file. According to an embodiment of the present disclosure, the display 150 may display an object 3 for generating a new image combination file. For example, referring to fig. 2A, the display 150 may display a plus-shaped icon object 3.
Referring to fig. 2B, if the user of the electronic device 100 of fig. 1 selects the icon object 3, the display 150 may display a plurality of images stored in the memory 130. The user may select at least some of the plurality of images displayed on the display 150. For example, the user may touch a particular image displayed on the display 150 to select it. According to an embodiment of the present disclosure, the display 150 may display the images selected by the user using a color, a shade, or a brightness different from that of the images not selected. According to an embodiment, the display 150 may display an object 4 indicating the number of selected images. For example, referring to fig. 2B, the display 150 may display a text object 4 indicating the number of selected images.
According to an embodiment of the present disclosure, the display 150 may also display an object 5 for entering a menu for setting image selection conditions. If the user selects the condition setting object 5, the display 150 may display the user interface illustrated in fig. 3A. A description of this user interface will be provided with reference to fig. 3A to 3F.
When the selection of the images is completed, the processor 170 may generate an image combination file by combining the selected images and store the generated image combination file in the memory 130. Referring to fig. 2C, the display 150 may display a newly generated image combination file 6 in addition to the previously stored image combination file 1 and image combination file 2.
Fig. 3A to 3F are drawings illustrating a user interface for setting image selection conditions according to various embodiments of the present disclosure.
If the user of the electronic device 100 of fig. 1 selects the condition setting object 5 illustrated in fig. 2B, the display 150 may display the user interface illustrated in fig. 3A.
Referring to fig. 3A, the display 150 may display a menu for setting the time 11 at which an image is captured, a menu for setting the place 13 at which an image is captured, a menu for setting a person 15 included in an image, a menu for setting a tag 17 inserted into an image, and a menu for setting an image pattern 19. Each menu may include an object 20 for switching the corresponding condition on or off. According to an embodiment of the present disclosure, if the object 20 is in an 'on' state, the processor 170 of fig. 1 may select images in consideration of the corresponding condition. If the object 20 is in the 'off' state, the processor 170 may select images without considering the corresponding condition. For example, referring to fig. 3A, the processor 170 may select images in consideration of the time 11 at which each image was captured and the location 13 at which it was captured.
According to an embodiment of the present disclosure, if a user selects a specific condition, the display 150 may display a user interface for setting details of the selected condition.
According to an embodiment of the present disclosure, if a user selects a menu for setting the time 11 at which an image is captured on the user interface illustrated in fig. 3A, the display 150 may display the user interface illustrated in fig. 3B.
Referring to fig. 3B, the display 150 may display an object for setting the start time 21 and an object for setting the end time 22. If the user sets a start time and an end time, the processor 170 may select images captured between the set times.
According to an embodiment of the present disclosure, if the user selects a menu for setting the location 13 where the image is captured on the user interface illustrated in fig. 3A, the display 150 may display the user interface illustrated in fig. 3C.
Referring to fig. 3C, the display 150 may display a search window 23 for searching for an area or place. The user can search for an area or a place using the search window 23, and can set the place 13 where the image is captured. If the user sets a place to capture an image, the processor 170 may select an image captured at the set place.
The display 150 may also display an object 24 for setting a range and a map 25 for setting a position. If the user sets the location and range, the processor 170 may select an image captured within the set range.
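The place-and-range condition can be sketched with a great-circle distance test; the haversine formula and the metadata field names below are assumptions for illustration, not the disclosed implementation:

```python
import math

# Illustrative sketch: decide whether an image was captured within a
# user-set radius (km) of a chosen spot, using the haversine great-circle
# distance. Coordinates are assumed to be decimal degrees in the metadata.
def within_range(meta, center, radius_km):
    lat1, lon1 = map(math.radians, (meta["lat"], meta["lon"]))
    lat2, lon2 = map(math.radians, center)
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = (math.sin(dlat / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2)
    distance_km = 6371.0 * 2 * math.asin(math.sqrt(a))  # mean Earth radius
    return distance_km <= radius_km
```

Applied as the "place" predicate, this keeps exactly the images captured inside the set range on the map 25.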
According to an embodiment of the present disclosure, if the user selects a menu for setting the person 15 included in the image on the user interface illustrated in fig. 3A, the display 150 may display the user interface illustrated in fig. 3D.
Referring to fig. 3D, the display 150 may arrange and display images stored in the memory 130. The user may select an image displayed on the display 150 or a specific person included in the image. If the user selects a person (or landmark), the processor 170 may recognize the face of the selected person (or the shape of the selected landmark) using a face recognition algorithm (or a landmark recognition algorithm), and may select an image including the corresponding face (or the corresponding landmark).
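The disclosure names a face recognition algorithm without specifying one. As a hedged sketch, each detected face may be represented by a feature vector (produced by any face-embedding model), and an image is selected when some face in it is sufficiently close to the chosen person's vector; the vectors and threshold here are illustrative:

```python
# Illustrative sketch only: 'faces' holds precomputed feature vectors for the
# faces detected in each image. An image matches when any of its face vectors
# lies within `threshold` (Euclidean distance) of the selected person's vector.
def images_with_person(images, person_vec, threshold=0.5):
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return [img for img in images
            if any(dist(face, person_vec) <= threshold for face in img["faces"])]
```

A landmark recognition algorithm could plug into the same structure, with landmark-shape vectors in place of face vectors.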
According to an embodiment of the present disclosure, if the user selects a menu for setting the tab 17 inserted into the image on the user interface illustrated in fig. 3A, the display 150 may display the user interface illustrated in fig. 3E.
Referring to fig. 3E, the display 150 may display a search window 26 for searching for a tag inserted into an image. The user may set the tab using the search window 26.
The display 150 may also display a list 27 of tags inserted into the images stored in the memory 130. The user may select at least one of the tags included in the tag list and may set the tag. If the user sets a tag, the processor 170 may select an image into which the set tag is inserted.
According to an embodiment of the present disclosure, if the user selects a menu for setting the image pattern 19 on the user interface illustrated in fig. 3A, the display 150 may display the user interface illustrated in fig. 3F.
Referring to fig. 3F, the display 150 may arrange and display images stored in the memory 130. The user may select at least one of the images displayed on the display 150. If the image is selected, the processor 170 may analyze a pattern of the image using an image pattern analysis algorithm, and may select an image having a pattern similar to the analyzed pattern.
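As one hedged illustration of the image pattern analysis, images might be compared by coarse grayscale histograms; real implementations would use richer features, and nothing in this sketch is prescribed by the disclosure:

```python
# Illustrative sketch: bucket grayscale pixel values (0..255) into a few
# bins, normalize, and treat two images as having a similar pattern when the
# L1 distance between their histograms is small.
def histogram(pixels, bins=4):
    counts = [0] * bins
    for p in pixels:
        counts[min(p * bins // 256, bins - 1)] += 1
    total = len(pixels)
    return [c / total for c in counts]

def similar_pattern(ref_pixels, candidate_pixels, max_l1=0.5):
    h1, h2 = histogram(ref_pixels), histogram(candidate_pixels)
    return sum(abs(a - b) for a, b in zip(h1, h2)) <= max_l1
```

The processor would run this comparison between the user-selected reference image and each stored image, keeping the ones that pass.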
Although not shown in fig. 3A to 3F, the display 150 may also display a calendar including schedule information. If the user selects a specific schedule, the processor 170 may select an image captured between times corresponding to the corresponding schedule or an image captured at a place corresponding to the corresponding schedule.
Fig. 4A-4C are diagrams illustrating a user interface for editing an image combination file according to various embodiments of the present disclosure.
Referring to fig. 4A, the display 150 of fig. 1 may arrange and display a plurality of images included in an image combination file. According to an embodiment of the present disclosure, the user interface displayed on the display 150 may include a deletion area 31 for deleting images included in the image combination file.
According to an embodiment of the present disclosure, the display 150 may also display an object 32 indicating the number of images included in the image combination file.
Referring to fig. 4B, the user of the electronic device 100 of fig. 1 may select the image 33 displayed on the display 150 and may move the selected image 33 to the deletion area 31. If the image 33 is moved to the deletion area 31, the processor 170 of FIG. 1 may delete the corresponding image from the image combination file.
Referring to fig. 4C, the display 150 may arrange and display the remaining images except for the deleted image. Further, the display 150 may change the object 32 indicating the number of images, and may display the changed object 32.
Fig. 5A to 5C are diagrams illustrating a user interface for editing an image combination file according to various embodiments of the present disclosure.
Referring to fig. 5A, the display 150 of fig. 1 may arrange and display a plurality of images included in an image combination file. The user of the electronic device 100 of FIG. 1 may select the images 35 displayed on the display 150 and may change the position of the images (or the order in which the images are arranged).
For example, referring to fig. 5B, the user may select the image 35 displayed on the display 150 and move the selected image 35 between the fifth image 36 and the sixth image 37. According to an embodiment, when the movement of the image is completed, the processor 170 may rearrange the images included in the image combination file.
Referring to fig. 5C, the display 150 may display the rearranged image. For example, the display 150 may arrange and display the moved image 35 between the fifth image 36 and the sixth image 37.
In fig. 4A to 4C and 5A to 5C, an embodiment of the present disclosure is illustrated in which the user deletes or moves only one image included in the image combination file. However, the scope and spirit of the present disclosure may not be limited thereto. For example, the user may simultaneously select a plurality of images included in the image combination file, and may delete or move the plurality of images. An operation of simultaneously selecting a plurality of images will be described with reference to fig. 6A to 6C.
Fig. 6A-6C are diagrams illustrating a user interface for selecting a plurality of images according to various embodiments of the present disclosure.
Referring to fig. 6A, the display 150 of fig. 1 may arrange and display a plurality of images included in an image combination file. The user of the electronic device 100 of fig. 1 may input a sliding operation in a lateral direction across a plurality of images to select them. For example, the user may input the sliding operation using multiple fingers (e.g., two or three fingers). The processor 170 of fig. 1 may then select the images displayed on the area where the sliding operation is input.
Referring to fig. 6B, the user may input a touch operation in a graphic form to select a plurality of images. For example, the user may input a circular or elliptical touch operation. The processor 170 may then select an image to include in the circle or ellipse or an image that passes through the circle or ellipse.
Referring to fig. 6C, the user may input a sliding operation in a longitudinal or diagonal direction across the plurality of images to select them. If a longitudinal or diagonal sliding operation is input, the processor 170 may select the images between the image corresponding to the start point of the sliding operation and the image corresponding to the end point of the sliding operation.
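The multi-select gestures of figs. 6A to 6C reduce to mapping touch points onto image indices in the displayed grid and taking the inclusive range between the start and end images; the row-major layout below is an assumption for illustration:

```python
# Illustrative sketch: map a touch point to the row-major index of the image
# under it, then select every image between the start and end indices of the
# sliding operation, inclusive.
def touch_to_index(x, y, cols, cell_w, cell_h):
    """Grid position of the image under the touch point (row-major)."""
    return (y // cell_h) * cols + (x // cell_w)

def select_range(images, start_index, end_index):
    """Images between the slide's start and end, regardless of direction."""
    lo, hi = sorted((start_index, end_index))
    return images[lo:hi + 1]
```

The circular/elliptical selection of fig. 6B would instead test each image's cell against the drawn shape, but the index mapping is the same.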
Fig. 7A-7D are diagrams illustrating a user interface for editing an image combination file according to various embodiments of the present disclosure.
Referring to fig. 7A, the display 150 of fig. 1 may arrange and display a plurality of images included in an image combination file. According to an embodiment of the present disclosure, the display 150 may display an object 41 indicating the number of images included in the image combination file. The user of the electronic device 100 of fig. 1 may input a plurality of sliding operations directed away from the boundary between the adjacent images 42 and 43 (e.g., a spread gesture) to add a new image between the arranged images 42 and 43.
Referring to fig. 7B, the processor 170 of fig. 1 may generate a new blank area 44 between the two images 42 and 43, to which the sliding operation is input. If the user selects the blank area 44, the display 150 may display the user interface shown in FIG. 7C.
Referring to fig. 7C, the display 150 may arrange and display images stored in the memory 130 of fig. 1. According to an embodiment of the present disclosure, the processor 170 may exclude the images already included in the image combination file being edited from the displayed images. The user may select at least some of the images 45 displayed on the display 150. For example, the user may select each image displayed on the display 150 individually, or may simultaneously select a plurality of images using one operation as described with reference to fig. 6A to 6C. According to an embodiment, the display 150 may also display an object 46 indicating the number of selected images. When the selection of the images is completed, the processor 170 may add the newly selected images to the image combination file. For example, the processor 170 may compress the selected images and may add the compressed images to the image combination file.
Referring to fig. 7D, the display 150 may arrange and display the newly added image 45 in the area 44. Further, the display 150 may change the object 41 indicating the number of images, and may display the changed object 41.
Fig. 8A to 8D are diagrams illustrating a user interface for editing an image combination file according to various embodiments of the present disclosure.
Referring to fig. 8A, the display 150 of fig. 1 may arrange and display a plurality of images included in an image combination file. According to an embodiment of the present disclosure, the display 150 may display an object 51 indicating the number of images included in the image combination file. The user of the electronic device 100 of fig. 1 may input a plurality of sliding operations directed away from the boundary between the adjacent images 52 and 53 (e.g., a spread gesture) to add a new image between the arranged images 52 and 53.
Referring to fig. 8B, the processor 170 of fig. 1 may generate a new blank region 54 between two images to which a slide operation is input. If the user selects blank area 54, display 150 may display the user interface shown in FIG. 8C.
Referring to fig. 8C, the display 150 may arrange and display the image combination files stored in the memory 130 of fig. 1. According to an embodiment of the present disclosure, the processor 170 may exclude the image combination file currently being edited from the displayed image combination files. The user may select at least one image combination file 55 of the image combination files displayed on the display 150. The selected image combination file 55 may include images 57. If the image combination file 55 is selected, the processor 170 may add its images 57 to the image combination file being edited.
Referring to fig. 8D, the display 150 may arrange and display the newly added image 57 in the area 54. Further, the display 150 may change the object 51 indicating the number of images, and may display the changed object 51.
Fig. 9A to 9D are drawings illustrating a user interface for playing an image combination file according to various embodiments of the present disclosure.
Referring to fig. 9A, the display 150 of fig. 1 may arrange and display the image combination file stored in the memory 130 of fig. 1. According to an embodiment of the present disclosure, the display 150 may display the image combination file along with metadata associated with the image combination file (e.g., a date the image was captured or a tag inserted into the image combination file). If the user of the electronic device 100 of fig. 1 selects one of the image combination files 61 displayed on the display 150, the processor 170 of fig. 1 may play the selected image combination file.
Referring to fig. 9B, the display 150 may display the played image combination file 61. For example, the display 150 may sequentially display the images included in the image combination file 61 at a determined time interval or a predetermined time interval (e.g., one second) on the determined or predetermined area 62 (e.g., at least part of the display 150).
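Sequential playback at a fixed interval amounts to mapping the elapsed play time to an image index; the one-second interval and the hold-last-frame behavior below are illustrative assumptions, not specified by the disclosure:

```python
# Illustrative sketch: given the elapsed play time, pick which image of the
# combination file to display. The last image is held once playback reaches
# the end (no looping, for simplicity).
def frame_for_time(num_images, elapsed_s, interval_s=1.0):
    index = int(elapsed_s // interval_s)
    return min(index, num_images - 1)
```

A display loop would call this on each refresh and swap the shown image whenever the returned index changes.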
According to an embodiment of the present disclosure, the display 150 may also display a status bar 63 indicating a play status (sequence or time) of the image combination file. According to an embodiment of the present disclosure, the status bar 63 may be included in an area where an image is displayed.
According to an embodiment of the present disclosure, the display 150 may also display thumbnails 64 of a plurality of images included in the image combination file 61. The thumbnail images 64 may be displayed, for example, on the lower end of the area where the image combination files 61 are displayed.
According to an embodiment of the present disclosure, the status bar 63 may be gradually moved rightward as the image combination file 61 is played. According to an embodiment, the image displayed on the display 150 may be changed based on the position of the status bar 63. For example, the display 150 may display the image 66 corresponding to the current position of the status bar 63 among the thumbnails 64.
Referring to fig. 9C, the processor 170 may change the image displayed on the display 150 based on a user operation. If a user operation is input on a point 65 to the right of the status bar 63 while the image combination file is played, the display 150 may display the image corresponding to the position 65 where the user operation is input, as shown in fig. 9D.
According to an embodiment of the present disclosure, the processor 170 may continuously change the image displayed on the display 150 based on the movement of the position of the user operation. For example, the user may continuously browse the plurality of images included in the image combination file using one touch operation (e.g., touching to the left or right of the status bar 63, or dragging the status bar 63 to the left or right).
In fig. 9A to 9D, an embodiment of the present disclosure is illustrated as the display 150 displaying the status bar 63 in the form of a line across the image. However, the scope and spirit of the present disclosure may not be limited thereto. For example, the status bar 63 may be displayed in various ways and may indicate the play status (sequence or time) of the image combination file. As another example, the status bar 63 may be displayed on one side of the image (e.g., the lower end of the image or thumbnail). As another example, the status bar 63 may be displayed as a highlight effect rather than in the form of a line, or may be displayed as an icon on a thumbnail.
Fig. 10 is a diagram illustrating an operation of playing an image combination file according to various embodiments of the present disclosure.
Referring to FIG. 10, if the processor 170 of FIG. 1 plays an image combination file, the display 150 of FIG. 1 may display the image combination file on at least a portion 71 of the display 150.
According to an embodiment of the present disclosure, the processor 170 may divide the area 71 displaying the image combination file into a plurality of areas corresponding to the number of images included in the image combination file. For example, referring to fig. 10, if 10 images (image 1 to image 10) are included in the image combination file, the processor 170 may divide the area 71 displaying the image combination file into 10 areas a1 to a10. Each of the divided areas a1 to a10 may correspond to one image of the plurality of images (image 1 to image 10) included in the image combination file.
According to an embodiment of the present disclosure, a status bar (e.g., the status bar 63 of fig. 9B) displayed on the screen on which the image combination file is played may be continuously moved from the area a1 to the area a10. The display 150 may display the image corresponding to the area where the status bar is displayed.
According to an embodiment of the present disclosure, if a user operation such as a touch or a touch-and-drag is input on the region 71 displaying the image combination file, the display 150 may display the image corresponding to the sub-area in which the user operation is input. For example, if a user operation is input on the area a7 in a state where the display 150 displays the second image img2, the display 150 may change the second image img2 to the seventh image img7 and may display the seventh image img7.
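The area division of fig. 10 can be sketched as follows, assuming equal-width sub-areas along the horizontal axis; a touch at offset x then selects the image whose sub-area contains x:

```python
# Illustrative sketch: the strip showing the combination file is split into
# as many equal-width regions as there are images; a touch at horizontal
# offset x (pixels from the strip's left edge) selects the matching image.
def image_for_touch(x, area_width, num_images):
    region_width = area_width / num_images
    index = int(x // region_width)
    return min(index, num_images - 1)   # clamp a touch on the right edge
```

With 10 images in a 1000-pixel-wide area, a touch at x = 650 falls in the seventh region (a7) and thus selects the seventh image, matching the fig. 10 example.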
Fig. 11 is a flowchart illustrating a method for generating an image file in an electronic device according to various embodiments of the present disclosure.
Referring to fig. 11, in operation 1110, the electronic device of fig. 1 may select at least some of a plurality of images stored in the memory 130 of fig. 1. According to an embodiment, the electronic device 100 may select an image selected by a user.
According to an embodiment of the present disclosure, the electronic device 100 may use the metadata to select at least some of the plurality of images stored in the memory 130. For example, the electronic device 100 may select at least some of the plurality of images stored in the memory 130 based on an image selection condition set by a user.
According to an embodiment of the present disclosure, the electronic device 100 may combine the selected images to generate an image combination file in a format for sequentially playing the selected images in operation 1120. According to an embodiment of the present disclosure, the electronic device 100 may compress the selected images, and may generate an image combination file by combining the compressed images.
According to an embodiment of the present disclosure, the electronic device 100 may arrange the selected images based on at least one of a time when each image is captured, a place where each image is captured, a person included in each image, a tag inserted into each image, or an image pattern, and may generate an image combination file by combining the selected images in the order of arrangement.
According to an embodiment of the present disclosure, in operation 1130, the electronic device 100 may store the image combination file in the memory 130. According to an embodiment of the present disclosure, the electronic device 100 may also generate metadata associated with the image combination file and store the generated metadata in the memory 130 along with the image combination file. The metadata may include, for example, information such as a time when the image included in the image combination file was captured, a place where the image was captured, a person included in the image, and a tag inserted into the image (e.g., a time, a place, a person, a feeling of the user, an event associated with the image (e.g., a holiday, a birthday, a travel place, etc.)).
According to an embodiment of the present disclosure, the electronic device 100 may edit and play the image combination file or may transmit the image combination file to an external device in operation 1140.
According to an embodiment of the present disclosure, the electronic device 100 may also edit the image combination file. As one example, the electronic apparatus 100 may delete some of the images included in the image combination file based on a user instruction. As another example, the electronic apparatus 100 may change the order in which the images included in the image combination file are arranged based on a user instruction. As another example, the electronic device 100 may add a new image to the image combination file based on a user instruction.
According to an embodiment of the present disclosure, the electronic device 100 may transmit the image combination file to an external device or elsewhere. For example, the electronic device 100 may upload the image combination file to an SNS server or transmit the image combination file to an electronic device of another user based on a user instruction, in order to share the image combination file with other users.
According to an embodiment of the present disclosure, the electronic device 100 may play and display the image combination file on the display 150 of fig. 1. According to an embodiment, if the image combination file is played, the electronic device 100 may sequentially display a plurality of images included in the image combination file at a determined time interval or a predetermined time interval on the display 150.
According to an embodiment of the present disclosure, if the image combination file is played, the electronic device 100 may divide an area in which the image combination file is displayed into a plurality of areas corresponding to the number of images included in the image combination file.
According to an embodiment of the present disclosure, if a user operation is input on an area where an image combination file is displayed, the electronic device 100 may display an image corresponding to the area where the user operation is input on the display 150.
Fig. 12 is a block diagram illustrating a configuration of an electronic device in a network environment according to various embodiments of the present disclosure.
Referring to fig. 12, a description will be provided of an electronic apparatus 1201 in a network environment 1200. Electronic device 1201 may include all or some of the components of electronic device 100 shown in fig. 1, for example. The electronic device 1201 may also include a bus 1210, a processor 1220, a memory 1230, an input and output interface 1250, a display 1260, and a communication interface 1270. In various embodiments, at least one of the above components may be omitted from electronic device 1201, and other components may be additionally included in electronic device 1201.
The bus 1210 may be, for example, circuitry that interconnects the components 1220-1270 and transmits communication signals (e.g., control messages and/or data) between the components.
Processor 1220 may include one or more of a CPU, an AP, or a Communication Processor (CP). For example, processor 1220 may perform calculations or data processing related to control and/or communication of at least one of the components of electronic device 1201.
The memory 1230 can include volatile and/or nonvolatile memory. Memory 1230 may store, for example, instructions or data associated with at least one of the components of electronic device 1201. Memory 1230 may also store software and/or programs 1240, according to an embodiment.
Programs 1240 can include, for example, a kernel 1241, middleware 1243, an Application Program Interface (API) 1245, an application program (or "app") 1247, and the like. At least a portion of the kernel 1241, the middleware 1243, or the API 1245 may be referred to as an Operating System (OS).
The kernel 1241 may control or manage system resources (e.g., the bus 1210, the processor 1220, the memory 1230, etc.) used for performing operations or functions implemented in other programs (e.g., the middleware 1243, the API 1245, or the application program 1247). Further, the kernel 1241 may provide an interface through which the middleware 1243, the API 1245, or the application program 1247 may access individual components of the electronic device 1201 to control or manage the system resources.
The middleware 1243 may function as, for example, a mediator, so that the API 1245 or the application 1247 communicates with the kernel 1241 to exchange data. Further, the middleware 1243 may process one or more work requests received from the application 1247 in order of priority. For example, middleware 1243 may assign a priority for using system resources of electronic device 1201 (e.g., bus 1210, processor 1220, memory 1230, etc.) to at least one of application programs 1247. For example, the middleware 1243 may perform scheduling or load balancing on one or more work requests by processing the one or more work requests in order of priority provided to at least one of the applications 1247.
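One way to picture the priority handling described for the middleware 1243 is the sketch below, which drains work requests with a heap. The numeric priorities (lower value meaning higher priority) and the request tuples are illustrative assumptions:

```python
import heapq

def process_work_requests(requests):
    """Pop (priority, app, request) tuples in priority order, modeling
    middleware that schedules application work requests according to the
    priority assigned to each application (lower number = higher priority)."""
    heap = list(requests)
    heapq.heapify(heap)
    handled = []
    while heap:
        _, app, request = heapq.heappop(heap)
        handled.append((app, request))  # dispatch to system resources here
    return handled

order = process_work_requests([
    (2, "app_b", "read file"),
    (1, "app_a", "draw frame"),
    (3, "app_c", "sync data"),
])
```

Regardless of arrival order, the request from the application with the highest priority (`app_a`) is handled first, which is the scheduling/load-balancing effect the paragraph describes.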
The API 1245 may be, for example, an interface through which the application 1247 controls functions provided from the kernel 1241 or the middleware 1243. For example, the API 1245 may include at least one interface or function (e.g., instructions) for file control, window control, image processing, text control, and the like.
The input and output interface 1250 may serve as an interface that can transmit instructions or data input from a user or another external device to another component (or other components) of the electronic device 1201, for example. Further, the input and output interface 1250 may output instructions or data received from another component (or other components) of the electronic device 1201 to a user or another external device.
The display 1260 may include, for example, a Liquid Crystal Display (LCD), a Light Emitting Diode (LED) display, an Organic LED (OLED) display, a micro-electro-mechanical systems (MEMS) display, or an electronic paper display. Display 1260 may display, for example, various content (e.g., text, images, videos, icons, symbols, etc.) to a user. The display 1260 may include a touch screen and may receive, for example, touch input, gesture input, proximity input, or hover input using an electronic pen or a part of the user's body.
Communication interface 1270 (e.g., communication module 160 of fig. 1) may establish communication between, for example, electronic device 1201 and an external device (e.g., first external electronic device 1202, second external electronic device 1204, or server 1206). For example, communication interface 1270 may be connected to network 1262 via wireless or wired communication, and may communicate with external devices (e.g., second external electronic device 1204 or server 1206).
The wireless communication may use, for example, at least one of the following as a cellular communication protocol: Long Term Evolution (LTE), LTE-Advanced (LTE-A), Code Division Multiple Access (CDMA), Wideband CDMA (WCDMA), Universal Mobile Telecommunications System (UMTS), Wireless Broadband (WiBro), or Global System for Mobile Communications (GSM), among others. Further, the wireless communication may include, for example, local area network communication 1264. Local network communications 1264 may include, for example, at least one of: Wi-Fi communication, BT communication, Bluetooth Low Energy (BLE) communication, ZigBee communication, NFC, Magnetic Secure Transmission (MST) communication, GNSS communication, and the like. The GNSS may comprise, for example, at least one of: the Global Positioning System (GPS), Glonass, the Beidou navigation satellite system (hereinafter referred to as "Beidou"), or Galileo (i.e., the European global satellite-based navigation system). Hereinafter, the term "GPS" as used herein may be used interchangeably with the term "GNSS".
The wired communication may include, for example, at least one of: Universal Serial Bus (USB) communication, High-Definition Multimedia Interface (HDMI) communication, Recommended Standard 232 (RS-232) communication, power line communication, Plain Old Telephone Service (POTS) communication, and the like. Network 1262 may include a telecommunications network, such as at least one of a computer network (e.g., a Local Area Network (LAN) or a Wide Area Network (WAN)), the Internet, or a telephone network.
Each of the first external electronic device 1202 and the second external electronic device 1204 may be a device of the same type as or a different type than the electronic device 1201. According to an embodiment, the server 1206 may comprise a group of one or more servers. According to various embodiments, all or some of the operations performed in the electronic device 1201 may be performed in one or more other electronic devices (e.g., the first external electronic device 1202, the second external electronic device 1204, or the server 1206). According to an embodiment of the present disclosure, if the electronic device 1201 should perform a function or service automatically or in response to a request, the electronic device 1201 may request another device (e.g., the first external electronic device 1202, the second external electronic device 1204, or the server 1206) to perform at least part of the function or service, instead of or in addition to performing the function or service itself. The other electronic device may perform the requested function or an additional function and transmit the result of the execution to the electronic device 1201. The electronic device 1201 may provide the requested function or service by using the received result as it is or after additionally processing it. To this end, for example, cloud computing technology, distributed computing technology, or client-server computing technology may be used.
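The offloading flow above — run locally when possible, otherwise delegate to another device and use or post-process its reply — can be condensed into a sketch. Both callables below are hypothetical placeholders, not APIs from the disclosure:

```python
def run_function(name, can_run_locally, request_remote):
    """Perform a function or service locally when possible; otherwise ask
    another device to perform it and use (or further process) the result
    it returns, mirroring the delegation described above."""
    if can_run_locally(name):
        return f"local:{name}"
    result = request_remote(name)      # delegate to the external device/server
    return f"processed({result})"      # use the result as-is or refine it

local = run_function("resize", lambda n: True, lambda n: None)
remote = run_function("ocr", lambda n: False, lambda n: "remote:ocr")
```

In a real system the `request_remote` step would go over the network (e.g., to the server 1206), which is where the cloud, distributed, or client-server computing technologies mentioned above come in.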
Fig. 13 is a block diagram illustrating a configuration of an electronic device according to various embodiments of the present disclosure.
Referring to fig. 13, an electronic device 1301 may include all or part of the electronic device 100 shown in fig. 1, for example. The electronic device 1301 may include one or more processors 1310 (e.g., APs), a communication module 1320, a Subscriber Identity Module (SIM) 1324, memory 1330, a sensor module 1340, an input device 1350, a display 1360, an interface 1370, an audio module 1380, a camera module 1391, a power management module 1395, a battery 1396, an indicator 1397, and a motor 1398.
The processor 1310 may drive, for example, an OS or an application program to control a plurality of hardware or software components connected to the processor 1310, and may process and compute various data. The processor 1310 may be implemented with, for example, a SoC. According to an embodiment, the processor 1310 may further include a graphics processing unit (GPU) (not shown) and/or an Image Signal Processor (ISP) (not shown). The processor 1310 may include at least some of the other components shown in fig. 13 (e.g., the cellular module 1321). The processor 1310 may load instructions or data received from at least one of the other components (e.g., a non-volatile memory) into a volatile memory to process the data, and may store various data and processing results in the non-volatile memory.
The communications module 1320 may have the same or similar configuration as the configuration of the communications interface 1270 of fig. 12. The communication module 1320 may include, for example, a cellular module 1321, a Wi-Fi module 1323, a BT module 1325, a GNSS module 1327 (e.g., a GPS module, a Glonass module, a beidou module, or a Galileo module), an NFC module 1328, and a Radio Frequency (RF) module 1329.
The cellular module 1321 may provide, for example, a voice call service, a video call service, a short message service, an internet service, etc. through a communication network. According to embodiments of the present disclosure, the cellular module 1321 may use the SIM 1324 (e.g., a SIM card) to identify and authenticate the electronic device 1301 in the communication network. According to an embodiment, the cellular module 1321 may perform at least part of the functionality provided by the processor 1310. According to an embodiment, the cellular module 1321 may also include a CP.
At least some (e.g., two or more) of the cellular module 1321, the Wi-Fi module 1323, the BT module 1325, the GNSS module 1327, or the NFC module 1328 may be included in one Integrated Chip (IC) or one IC package, according to various embodiments of the present disclosure.
The RF module 1329 may transmit and receive, for example, communication signals (e.g., RF signals). Although not shown, the RF module 1329 may include, for example, a transceiver, a Power Amplification Module (PAM), a frequency filter, a Low Noise Amplifier (LNA), an antenna, and the like. According to another embodiment, at least one of the cellular module 1321, the Wi-Fi module 1323, the BT module 1325, the GNSS module 1327, or the NFC module 1328 may transmit and receive an RF signal through a separate RF module.
The SIM 1324 may include, for example, a card containing a SIM and/or an embedded SIM. The SIM 1324 may include unique identification information (e.g., an integrated circuit card identification number (ICCID)) or subscriber information (e.g., an International Mobile Subscriber Identity (IMSI)).
Memory 1330 may include, for example, embedded or internal memory 1332, or external memory 1334. Embedded memory 1332 may include, for example, at least one of: volatile memory (e.g., Dynamic Random Access Memory (DRAM), Static RAM (SRAM), Synchronous Dynamic RAM (SDRAM), etc.) or non-volatile memory (e.g., One-Time Programmable Read Only Memory (OTPROM), Programmable ROM (PROM), Erasable Programmable ROM (EPROM), Electrically Erasable Programmable ROM (EEPROM), mask ROM, flash memory (e.g., NAND flash memory, NOR flash memory, etc.), a hard disk drive, or a Solid State Drive (SSD)).
External memory 1334 may also include a flash drive, for example, a Compact Flash (CF) card, a Secure Digital (SD) card, a micro SD card, a mini SD card, an extreme digital (xD) card, a memory stick, and so forth. The external memory 1334 may be functionally and/or physically connected with the electronic device 1301 through various interfaces.
The sensor module 1340 may measure, for example, physical quantities or may detect an operating state of the electronic device 1301, and may convert the measured or detected information into an electrical signal. The sensor module 1340 may include, for example, at least one of: a gesture sensor 1340A, a gyroscope sensor 1340B, an atmospheric pressure sensor 1340C, a magnetic sensor 1340D, an acceleration sensor 1340E, a grip sensor 1340F, a proximity sensor 1340G, a color sensor 1340H (e.g., a red, green, blue (RGB) sensor), a biometric sensor 1340I, a temperature/humidity sensor 1340J, an illuminance sensor 1340K, or an Ultraviolet (UV) sensor 1340M. Additionally or alternatively, the sensor module 1340 may also include, for example, an electronic nose sensor (not shown), an Electromyography (EMG) sensor (not shown), an electroencephalogram (EEG) sensor (not shown), an Electrocardiogram (ECG) sensor (not shown), an Infrared (IR) sensor (not shown), an iris sensor (not shown), a fingerprint sensor (not shown), and/or the like. The sensor module 1340 may also include control circuitry for controlling at least one of the sensors included in the sensor module 1340. According to various embodiments, the electronic device 1301 may further include a processor configured to control the sensor module 1340, either as part of the processor 1310 or separate from the processor 1310, so that the electronic device 1301 may control the sensor module 1340 while the processor 1310 is in a sleep state.
The input device 1350 may include, for example, a touch panel 1352, a (digital) pen sensor 1354, keys 1356, or an ultrasonic input unit 1358. The touch panel 1352 may include, for example, at least one of a capacitive type, a resistive type, an IR type, or an ultrasonic type. In addition, the touch panel 1352 may further include a control circuit. Touch panel 1352 may also include a tactile layer and may provide a tactile response to the user.
The (digital) pen sensor 1354 may be part of the touch panel 1352, for example, or may include a separate sheet for recognition. The keys 1356 may include, for example, physical buttons, optical keys, or a keypad. The ultrasonic input unit 1358 may detect, through a microphone (e.g., the microphone 1388), ultrasonic waves generated by an input tool, and may identify data corresponding to the detected ultrasonic waves.
The display 1360 (e.g., the display 150 of fig. 1) may include a panel 1362, a holographic device 1364, or a projector 1366. The panel 1362 may include the same or a similar configuration as the display 1260 of fig. 12. The panel 1362 may be implemented to be, for example, flexible, transparent, impact resistant, and/or wearable. The panel 1362 and the touch panel 1352 may also be integrated into one module. The holographic device 1364 may show a stereoscopic image in the air using interference of light. The projector 1366 may project light onto a screen to display an image. The screen may be located, for example, inside or outside of the electronic device 1301. According to an embodiment, the display 1360 may also include control circuitry for controlling the panel 1362, the holographic device 1364, or the projector 1366.
The interface 1370 may include, for example, an HDMI 1372, a USB 1374, an optical interface 1376, or a D-subminiature (D-sub) 1378. The interface 1370 may be included in, for example, the communication interface 1270 shown in fig. 12. Additionally or alternatively, the interface 1370 may include, for example, a Mobile High-Definition Link (MHL) interface, an SD card/multimedia card (MMC) interface, or an infrared data association (IrDA) standard interface.
The audio module 1380 may perform bidirectional conversion of sound and electrical signals. At least some of the components of audio module 1380 may be included in, for example, input and output interface 1250 shown in fig. 12. The audio module 1380 may process sound information input or output through, for example, a speaker 1382, a receiver 1384, an earphone 1386, or a microphone 1388.
The camera module 1391 may be a device that captures still images and moving images. According to an embodiment of the present disclosure, the camera module 1391 may include one or more image sensors (not shown) (e.g., a front sensor or a rear sensor), a lens (not shown), an ISP (not shown), or a flash (not shown) (e.g., an LED or a xenon lamp).
The power management module 1395 may manage power of the electronic device 1301, for example. According to an embodiment of the present disclosure, although not shown, the power management module 1395 may include a Power Management Integrated Circuit (PMIC), a charger IC, or a battery gauge or fuel gauge. The PMIC may have a wired charging method and/or a wireless charging method. The wireless charging method may include, for example, a magnetic resonance method, a magnetic induction method, an acoustic resonance method, an electromagnetic method, and the like. Additional circuitry for wireless charging may also be provided, e.g., a coil loop, a resonance circuit, a rectifier, and the like. The battery gauge may measure, for example, the remaining capacity of the battery 1396 and the voltage, current, or temperature of the battery 1396 while the battery 1396 is charged. The battery 1396 may include, for example, a rechargeable battery and/or a solar cell.
The indicator 1397 may display a particular state of the electronic device 1301 or a part thereof (e.g., the processor 1310), such as a booting state, a message state, a charging state, and the like. The motor 1398 may convert an electrical signal into mechanical vibration and may generate a vibration, a haptic effect, and the like. Although not shown, the electronic device 1301 may include a processing unit (e.g., a GPU) for supporting mobile TV. The processing unit for supporting mobile TV may process media data according to, for example, the Digital Multimedia Broadcasting (DMB) standard, the Digital Video Broadcasting (DVB) standard, the mediaFLO™ standard, and the like.
Each of the above-described elements of the electronic device according to various embodiments of the present disclosure may be configured with one or more components, and names of the respective elements may vary according to a type of the electronic device. Electronic devices according to various embodiments of the present disclosure may include at least one of the above-described elements, some elements may be omitted from the electronic device, or other additional elements may also be included in the electronic device. Furthermore, some elements of the electronic device according to various embodiments of the present disclosure may be combined with each other to form one entity, so that functions of the respective elements may be performed in the same manner as before the combination.
Fig. 14 is a block diagram illustrating a configuration of program modules according to various embodiments of the present disclosure.
Referring to fig. 14, program modules 1410 (e.g., programs 1240 of fig. 12) may include an OS for controlling resources associated with an electronic device (e.g., electronic device 1201 of fig. 12) and/or various applications executing on the OS (e.g., application programs 1247 of fig. 12). The OS may be, for example, Android, iOS, Windows, Symbian, Tizen, Bada, etc.
Program modules 1410 may include a kernel 1420, middleware 1430, an API 1460, and/or applications 1470. At least a portion of the program modules 1410 may be pre-loaded on the electronic device or may be downloaded from an external electronic device (e.g., first external electronic device 1202, second external electronic device 1204, server 1206, etc. of fig. 12).
The kernel 1420 (e.g., kernel 1241 of FIG. 12) may include, for example, a system resource manager 1421 and/or a device driver 1423. The system resource manager 1421 may control, allocate, collect, etc., system resources. The system resource manager 1421 may include a process management unit, a storage management unit, a file system management unit, and the like according to an embodiment of the present disclosure. The device driver 1423 may include, for example, a display driver, a camera driver, a BT driver, a shared memory driver, a USB driver, a keyboard driver, a Wi-Fi driver, an audio driver, or an inter-process communication (IPC) driver.
Middleware 1430 (e.g., middleware 1243 of fig. 12) may provide, for example, functions commonly required by the application 1470, and may provide various functions to the application 1470 through the API 1460, so that the application 1470 efficiently uses limited system resources in the electronic device. According to an embodiment, middleware 1430 (e.g., middleware 1243) can include at least one of: runtime library 1435, application manager 1441, window manager 1442, multimedia manager 1443, resource manager 1444, power manager 1445, database manager 1446, package manager 1447, connection manager 1448, notification manager 1449, location manager 1450, graphics manager 1451, or security manager 1452.
Runtime library 1435 may include, for example, a library module used by a compiler to add a new function through a programming language while the application 1470 is executed. The runtime library 1435 may perform functions for input and output management, memory management, or arithmetic functions.
The application manager 1441 may manage a lifecycle of at least one application, e.g., of the applications 1470. The window manager 1442 may manage Graphical User Interface (GUI) resources used on a screen of the electronic device. The multimedia manager 1443 may confirm formats required for playing various media files, and may encode or decode the media files using codecs corresponding to the respective formats. Resource manager 1444 may manage source code for at least one of applications 1470 and may manage resources such as memory or storage space.
The power manager 1445 may operate with, for example, a basic input/output system (BIOS) or the like to manage battery or power, and may provide power information required for the operation of the electronic device. Database manager 1446 may generate, search, or change a database to be used in at least one of applications 1470. The package manager 1447 may manage installation or updating of applications distributed in the form of a package file.
The connection manager 1448 may manage, for example, wireless connections, such as Wi-Fi connections, BT connections, and the like. Notification manager 1449 can display or notify events, such as incoming messages, appointments, and proximity notifications, in a manner that does not disturb the user. The location manager 1450 may manage location information of the electronic device. The graphic manager 1451 may manage a graphic effect to be provided to a user or a User Interface (UI) related to the graphic effect. The security manager 1452 may provide security functions required for system security, user authentication, and the like. According to an embodiment of the present disclosure, when an electronic device (e.g., the electronic device 1201) has a phone function, the middleware 1430 may further include a phone manager (not shown) for managing a voice or video communication function of the electronic device.
Middleware 1430 may include a middleware module that configures a combination of the various functionalities of the components described above. The middleware 1430 may provide modules dedicated according to various types of OS to provide different functions. In addition, the middleware 1430 can dynamically delete some old components and/or can add new components.
The API 1460 (e.g., API 1245 of fig. 12) may be, for example, a collection of API programming functions and may be provided with different components depending on the OS. For example, in the case of Android or iOS, one API set may be provided according to the platform. In the case of Tizen, two or more API sets may be provided according to the platform.
Applications 1470 (e.g., application 1247 of fig. 12) may include, for example, one or more of the following: a home application 1471, a dialer application 1472, an SMS/Multimedia Messaging Service (MMS) application 1473, an Instant Messaging (IM) application 1474, a browser application 1475, a camera application 1476, an alarm application 1477, a contacts application 1478, a voice dialing application 1479, an email application 1480, a calendar application 1481, a media player application 1482, an album application 1483, a clock application 1484, a healthcare application (e.g., an application for measuring an amount of exercise or blood glucose, etc.), an environmental information application (e.g., an application for providing barometric pressure information, humidity information, or temperature information), and so forth.
According to an embodiment of the present disclosure, the application 1470 may include an application (hereinafter, referred to as an "information exchange application" for better understanding and ease of description) for exchanging information between an electronic device (e.g., the electronic device 1201) and an external electronic device (e.g., the first external electronic device 1202 or the second external electronic device 1204). The information exchange application may include, for example, a notification forwarding application for transmitting specific information to the external electronic device or a device management application for managing the external electronic device.
For example, the notification forwarding application may include functionality to send notification information generated by other applications of the electronic device (e.g., an SMS/MMS application, an email application, a healthcare application, or an environmental information application, etc.) to an external electronic device (e.g., the first external electronic device 1202 or the second external electronic device 1204). Further, the notification forwarding application may receive, for example, notification information from an external electronic device, and may provide the received notification information to a user of the electronic device.
The device management application may manage (e.g., install, delete, or update) at least one of functions of, for example, an external electronic device (e.g., the first external electronic device 1202 or the second external electronic device 1204) communicating with the electronic device (e.g., a function of turning on/off the external electronic device itself (or part of components) or a function of adjusting the brightness (or resolution) of a display), an application operating in the external electronic device, or a service (e.g., a call service or a message service) provided from the external electronic device.
According to an embodiment of the present disclosure, the application 1470 may include an application (e.g., a health card application of an ambulatory medical device) preset according to attributes of an external electronic device (e.g., the first external electronic device 1202 or the second external electronic device 1204).
According to an embodiment of the present disclosure, the application 1470 may include an application received from an external electronic device (e.g., the server 1206, the first external electronic device 1202, or the second external electronic device 1204). According to embodiments of the present disclosure, the application 1470 may include a preloaded application or a third party application that may be downloaded from a server. Names of components of the program module 1410 according to various embodiments of the present disclosure may differ according to kinds of OS.
According to various embodiments of the disclosure, at least a portion of program module 1410 may be implemented in software, firmware, hardware, or a combination of at least two or more of software, firmware, and hardware. At least a portion of program modules 1410 may be implemented (e.g., executed) by, for example, a processor (e.g., processor 1220 in fig. 12). At least some of program modules 1410 may include, for example, modules, programs, routines, instruction sets, processes, etc. for performing one or more functions.
Each of the above-described elements of the electronic device according to various embodiments of the present disclosure may be configured with one or more components, and names of the respective elements may vary according to a type of the electronic device. Electronic devices according to various embodiments of the present disclosure may include at least one of the above-described elements, some elements may be omitted from the electronic device, or other additional elements may also be included in the electronic device. Furthermore, some elements of the electronic device according to various embodiments of the present disclosure may be combined with each other to form one entity, so that functions of the respective elements may be performed in the same manner as before the combination.
The term "module" as used herein may refer to, for example, a unit comprising one of hardware, software, and firmware, or a combination of two or more of hardware, software, and firmware. The term "module" may be used interchangeably with, for example, the terms "unit," "logic block," "component," "circuitry," and the like. A "module" may be a minimal unit or part of a minimal unit of an integrated component. A "module" may also be the smallest unit that performs one or more functions or part of the smallest unit that performs one or more functions. The "module" may also be implemented mechanically or electronically. For example, a "module" may include at least one of: an Application Specific Integrated Circuit (ASIC) chip, a Field Programmable Gate Array (FPGA), or a programmable logic device known or to be developed in the future to perform certain operations.
According to various embodiments of the disclosure, at least part of an apparatus (e.g., a module or function) or a method (e.g., an operation) may be implemented with instructions stored in a non-transitory computer-readable storage medium having program modules, for example. When the instructions are executed by a processor (e.g., processor 170 of fig. 1), one or more processors may perform functions corresponding to the instructions.
The computer readable storage medium may include a hard disk, a floppy disk, a magnetic medium (e.g., magnetic tape), an optical medium (e.g., compact disc read only memory (CD-ROM) and DVD), a magneto-optical medium (e.g., a floptical disk), a hardware device (e.g., ROM, RAM, or flash memory), and so on. Further, the program instructions may include not only machine code compiled by a compiler, but also high-level language code that can be executed by a computer using an interpreter or the like. The hardware devices described above may be configured to operate as one or more software modules to perform operations according to various embodiments of the present disclosure, and vice versa.
Modules or program modules according to various embodiments may include at least one or more of the above components, some of which may be omitted, or other additional components may also be included. The operations performed by the modules, program modules or other components may be performed in a sequential manner, a parallel manner, an iterative manner or a heuristic manner. Further, some operations may be performed in a different order or may be omitted, and other operations may be added.
According to various embodiments of the present disclosure, a user of an electronic device may combine a plurality of images into one file and may manage that one file, thereby easily playing, managing, and sharing the images. In addition, the user can conveniently enjoy the plurality of photographs included in the combined file.
While the disclosure has been shown and described with reference to various embodiments thereof, it will be understood by those skilled in the art that: various changes in form and detail may be made therein without departing from the spirit and scope of the disclosure as defined by the following claims and their equivalents.

Claims (14)

1. An electronic device, comprising:
a memory configured to store a plurality of images captured intermittently;
a processor configured to:
select at least some of the plurality of images;
generate an image combination file in a format for sequentially playing the selected images by combining the selected images; and
store the image combination file in the memory,
wherein the electronic device further comprises a display configured to display the image combination file,
wherein the processor is further configured to:
control the display to arrange and display images included in a first image combination file;
generate a blank area between two consecutive images in the first image combination file in response to input of a plurality of slide operations directed from a boundary between the two images toward opposite sides;
control the display to display second image combination files stored in the memory if the blank area is selected; and
in response to at least one of the second image combination files being selected, control the display to arrange and display, in the blank area, images included in the selected image combination file.
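To make claim 1's "image combination file" concrete, the sketch below packs already-encoded image payloads into a single file whose order supports sequential playback. The `IMGC` magic value, the header layout, and the function names are assumptions introduced for illustration only; the patent does not disclose a container format.

```python
import struct
from typing import List

MAGIC = b"IMGC"  # hypothetical container signature, not from the patent

def combine_images(image_blobs: List[bytes]) -> bytes:
    """Pack already-encoded images into one sequential-play container.

    Assumed layout: 4-byte magic, 4-byte image count, a table of
    4-byte payload lengths, then the image payloads back to back.
    """
    header = MAGIC + struct.pack("<I", len(image_blobs))
    table = b"".join(struct.pack("<I", len(b)) for b in image_blobs)
    return header + table + b"".join(image_blobs)

def split_images(blob: bytes) -> List[bytes]:
    """Recover the individual images in playback order."""
    assert blob[:4] == MAGIC, "not an IMGC container"
    (count,) = struct.unpack_from("<I", blob, 4)
    lengths = struct.unpack_from(f"<{count}I", blob, 8)
    images, offset = [], 8 + 4 * count
    for n in lengths:
        images.append(blob[offset:offset + n])
        offset += n
    return images
```

A length-table header of this kind is one simple way to allow random access to any image in the file without decoding the others.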
2. The electronic device of claim 1, wherein the processor is further configured to:
compress the selected images; and
generate the image combination file by combining the compressed images.
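Claim 2 adds a compression step before the images are combined. As a minimal sketch, zlib stands in here for whatever codec the device actually uses; the patent does not name one.

```python
import zlib
from typing import List

def compress_images(image_blobs: List[bytes]) -> List[bytes]:
    """Compress each selected image before it is added to the
    combination file (claim 2). zlib is an illustrative stand-in."""
    return [zlib.compress(blob) for blob in image_blobs]

def decompress_images(compressed: List[bytes]) -> List[bytes]:
    """Restore the original payloads for playback."""
    return [zlib.decompress(blob) for blob in compressed]
```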
3. The electronic device of claim 1, wherein the display is further configured to display a user interface for setting an image selection condition,
wherein the image selection condition includes at least one of a time when an image is captured, a place where the image is captured, a person included in the image, a tag inserted into the image, or an image mode.
4. The electronic device of claim 1, further comprising:
an input module configured to receive a user input for setting an image selection condition,
wherein the processor is configured to select the at least some of the plurality of images based on the set image selection condition.
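Claims 3 and 4 describe selecting images against a user-set condition over metadata such as capture time, place, person, or tag. A minimal sketch, assuming each stored image carries a metadata dictionary (the key names are hypothetical, not from the patent):

```python
from typing import Dict, List

def select_images(images: List[dict], condition: Dict[str, object]) -> List[dict]:
    """Return the stored images whose metadata satisfies every
    key/value pair of the user-set selection condition (claim 4)."""
    def matches(meta: dict) -> bool:
        return all(meta.get(key) == value for key, value in condition.items())
    return [img for img in images if matches(img["meta"])]
```

For example, the condition `{"place": "Seoul"}` keeps only images whose `place` metadata equals `"Seoul"`; an empty condition keeps everything.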
5. The electronic device of claim 1, wherein the processor is further configured to:
arrange the selected images based on at least one of a time when each image is captured, a place where each image is captured, a person included in each image, a tag inserted into each image, or an image mode, and
generate the image combination file by combining the selected images in the arranged order.
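Claim 5's arrangement step reduces to sorting the selected images by the chosen criterion before they are combined in that order. A sketch, again assuming the hypothetical metadata dictionary from the selection example:

```python
from typing import List

def arrange_images(images: List[dict], criterion: str = "captured_at") -> List[dict]:
    """Arrange selected images by one claim-5 criterion (capture
    time by default) so they can be combined in that order."""
    return sorted(images, key=lambda img: img["meta"][criterion])
```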
6. The electronic device of claim 1, further comprising:
a communication module configured to communicate with an external device,
wherein the processor is further configured to transmit the image combination file stored in the memory to an external device.
7. The electronic device of claim 1, wherein the processor is further configured to play the image combination file stored in the memory.
8. The electronic device of claim 7, wherein the processor is further configured to sequentially display the plurality of images included in the image combination file at a determined time interval.
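Claim 8's interval playback amounts to a fixed-interval display schedule. The sketch below only computes when each image should appear; the interval value and function name are illustrative assumptions.

```python
from typing import List

def playback_schedule(image_count: int, interval_ms: int = 500,
                      start_ms: int = 0) -> List[int]:
    """Timestamps (ms) at which each image of the combination file
    should be displayed for sequential playback at a fixed interval
    (claim 8)."""
    return [start_ms + i * interval_ms for i in range(image_count)]
```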
9. The electronic device of claim 7, wherein the processor is further configured to:
display the images on the display in an order corresponding to a user operation if the user operation is input on an area in which the image combination file is displayed.
10. The electronic device of claim 7, wherein the processor is further configured to:
divide an area in which the image combination file is displayed into a plurality of sub-areas corresponding to the number of images included in the image combination file, and
display the image corresponding to the sub-area on which a user operation is input if the user operation is input on the area in which the image combination file is displayed.
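Claim 10 divides the display area into as many equal sub-areas as there are images and shows the image for the touched sub-area. Assuming a horizontal split, the mapping from touch position to image index is a single integer division:

```python
def image_index_at(touch_x: int, area_width: int, image_count: int) -> int:
    """Map a touch position inside the display area to the image it
    selects, after the area is divided into image_count equal
    horizontal sub-areas (claim 10)."""
    if not 0 <= touch_x < area_width:
        raise ValueError("touch outside the display area")
    return int(touch_x * image_count // area_width)
```

Multiplying before dividing keeps the sub-area boundaries exact even when `area_width` is not a multiple of `image_count`.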
11. A method for generating an image file in an electronic device, the method comprising:
selecting at least some of a plurality of images captured intermittently and stored in a memory;
generating an image combination file in a format for sequentially playing the selected images by combining the selected images; and
storing the image combination file in the memory,
wherein the method further comprises:
arranging and displaying images included in a first image combination file;
generating a blank area between two consecutive images in the first image combination file in response to input of a plurality of slide operations directed from a boundary between the two images toward opposite sides;
displaying second image combination files stored in the memory if the blank area is selected; and
in response to at least one of the second image combination files being selected, arranging and displaying, in the blank area, images included in the selected image combination file.
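The blank-area merge described in claims 1 and 11 — opening a gap between two consecutive images and filling it with the images of a selected second combination file — reduces to a list splice once the gap position is known. A sketch (names are illustrative):

```python
from typing import List, Sequence

def insert_at_blank(first_images: List[bytes], blank_index: int,
                    second_images: Sequence[bytes]) -> List[bytes]:
    """Merge the images of a selected second combination file into
    the blank area opened between two consecutive images of the
    first file; blank_index is the position of the gap."""
    if not 0 <= blank_index <= len(first_images):
        raise ValueError("blank area outside the first file")
    return (first_images[:blank_index]
            + list(second_images)
            + first_images[blank_index:])
```

The result preserves sequential-play order: the first file's leading images, then the inserted file, then the remainder.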
12. The method of claim 11, wherein the generating of the image combination file comprises:
compressing the selected images; and
generating the image combination file by combining the compressed images.
13. The method of claim 11, further comprising:
displaying, on a display, a user interface for setting an image selection condition,
wherein the image selection condition includes at least one of a time when an image is captured, a place where the image is captured, a person included in the image, a tag inserted into the image, or an image mode.
14. The method of claim 11, wherein selecting at least some of the plurality of images comprises:
receiving a user input for setting an image selection condition; and
selecting the at least some of the plurality of images based on the set image selection condition.
CN201610384197.2A 2015-06-02 2016-06-02 Electronic device and method for generating image file in electronic device Active CN106228511B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020150078156A KR102411281B1 (en) 2015-06-02 2015-06-02 Electronic device and image file generating method
KR10-2015-0078156 2015-06-02

Publications (2)

Publication Number Publication Date
CN106228511A CN106228511A (en) 2016-12-14
CN106228511B true CN106228511B (en) 2020-02-28

Family

ID=56409458

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610384197.2A Active CN106228511B (en) 2015-06-02 2016-06-02 Electronic device and method for generating image file in electronic device

Country Status (4)

Country Link
US (1) US10510170B2 (en)
EP (1) EP3110122B1 (en)
KR (1) KR102411281B1 (en)
CN (1) CN106228511B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10120542B2 (en) 2014-10-08 2018-11-06 International Business Machines Corporation Reproducing state of source environment when image was screen captured on a different computing device using resource location, resource navigation and positional metadata embedded in image
CN106960467A (en) * 2017-03-22 2017-07-18 北京太阳花互动科技有限公司 A kind of face reconstructing method and system with bone information
KR101990689B1 (en) * 2017-08-31 2019-10-01 주식회사 엘지유플러스 Apparatus for providing image data and method thereof
JP7171349B2 (en) * 2018-09-28 2022-11-15 富士フイルム株式会社 Image processing device, image processing method, program and recording medium
CN110209310B (en) * 2019-04-19 2021-04-27 维沃软件技术有限公司 Picture processing method and terminal equipment
CN111970554B (en) * 2020-08-07 2022-09-09 海信视像科技股份有限公司 Picture display method and display device
CN112866739A (en) * 2021-01-21 2021-05-28 商客通尚景科技(上海)股份有限公司 Method for playing live photos in turn in real time

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4680643A (en) * 1984-03-19 1987-07-14 Olympus Optical Co., Ltd. Image display apparatus
EP1128656A2 (en) * 2000-02-21 2001-08-29 Fujitsu Limited Image photographing system having data management function, data management device and storage medium
US7882258B1 (en) * 2003-02-05 2011-02-01 Silver Screen Tele-Reality, Inc. System, method, and computer readable medium for creating a video clip
CN102347045A (en) * 2011-05-20 2012-02-08 合一网络技术(北京)有限公司 Synchronous display control system used for embedded media player and device thereof
CN104202661A (en) * 2014-09-15 2014-12-10 厦门美图之家科技有限公司 Automatic picture-to-video conversion method
EP2819288A1 (en) * 2013-06-25 2014-12-31 ST-Ericsson SA Method of valley inductance current polarity detection in a pulse width modulated circuit with an inductive charge

Family Cites Families (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5815645A (en) 1996-07-29 1998-09-29 Eastman Kodak Company Method of combining two digital images
US20020087546A1 (en) * 2000-01-31 2002-07-04 Michael Slater Apparatus, methods, and systems for digital photo management
KR20000036810A (en) 2000-03-29 2000-07-05 문제갑 Pohto movie
US6996782B2 (en) * 2001-05-23 2006-02-07 Eastman Kodak Company Using digital objects organized according to a histogram timeline
US7970240B1 (en) * 2001-12-17 2011-06-28 Google Inc. Method and apparatus for archiving and visualizing digital images
US7656543B2 (en) * 2004-11-12 2010-02-02 Hewlett-Packard Development Company, L.P. Albuming images
US20060204141A1 (en) 2005-03-01 2006-09-14 James Modrall Method and system of converting film images to digital format for viewing
JP4861711B2 (en) * 2005-07-27 2012-01-25 株式会社リコー Image processing apparatus, image compression method, image compression program, and recording medium
US20080235595A1 (en) * 2007-03-20 2008-09-25 At&T Knowledge Ventures, Lp Device and method for accessing a multimedia timeline
KR101398471B1 (en) 2007-10-18 2014-05-26 삼성전자주식회사 Image processing apparatus and controlling method thereof
CN101557453B (en) 2008-04-09 2010-09-29 鸿富锦精密工业(深圳)有限公司 Image acquisition device and picture arrangement method thereof
US20090280859A1 (en) 2008-05-12 2009-11-12 Sony Ericsson Mobile Communications Ab Automatic tagging of photos in mobile devices
KR20120028491A (en) 2010-09-15 2012-03-23 삼성전자주식회사 Device and method for managing image data
JP2012156640A (en) * 2011-01-24 2012-08-16 Olympus Imaging Corp Image display apparatus and imaging apparatus
US8832560B2 (en) * 2011-09-21 2014-09-09 Facebook, Inc. Displaying social networking system user information via a historical newsfeed
US8885960B2 (en) * 2011-10-05 2014-11-11 Microsoft Corporation Linking photographs via face, time, and location
US8891883B2 (en) 2012-05-15 2014-11-18 Google Inc. Summarizing a photo album in a social network system
US9300841B2 (en) 2012-06-25 2016-03-29 Yoldas Askan Method of generating a smooth image from point cloud data
US9377933B2 (en) * 2012-09-24 2016-06-28 Facebook, Inc. Displaying social networking system entity information via a timeline interface
US8725800B1 (en) 2012-10-02 2014-05-13 Nextbit Systems Inc. Mobile photo application migration to cloud computing platform
US20140298265A1 (en) 2013-03-04 2014-10-02 Triptease Limited Photo-review creation
US20140317480A1 (en) 2013-04-23 2014-10-23 Microsoft Corporation Automatic music video creation from a set of photos
KR102172354B1 (en) 2013-06-28 2020-10-30 삼성전자주식회사 Image file generating method and apparatus thereof
US20150130816A1 (en) 2013-11-13 2015-05-14 Avincel Group, Inc. Computer-implemented methods and systems for creating multimedia animation presentations
US20160125062A1 (en) * 2014-10-30 2016-05-05 Futurewei Technologies, Inc. Multi-scale timeling photograph album management with incremental spectral photograph clustering


Also Published As

Publication number Publication date
KR20160142163A (en) 2016-12-12
EP3110122A1 (en) 2016-12-28
CN106228511A (en) 2016-12-14
EP3110122B1 (en) 2023-04-26
KR102411281B1 (en) 2022-06-21
US10510170B2 (en) 2019-12-17
US20160358359A1 (en) 2016-12-08

Similar Documents

Publication Publication Date Title
US11265275B2 (en) Electronic device and method for image control thereof
US10021569B2 (en) Theme applying method and electronic device for performing the same
CN110083730B (en) Method and apparatus for managing images using voice tags
CN106228511B (en) Electronic device and method for generating image file in electronic device
CN115097981B (en) Method for processing content and electronic device thereof
CN108028891B (en) Electronic apparatus and photographing method
CN108351892B (en) Electronic device and method for providing object recommendation
US10659933B2 (en) Electronic device and information processing system including the same
CN108475162B (en) Method for displaying user interface and electronic device for supporting the same
CN108475165B (en) Electronic device and control method thereof
US10108391B2 (en) Audio data operating method and electronic device supporting the same
US10198828B2 (en) Image processing method and electronic device supporting the same
US20180173701A1 (en) Method for contents tagging and electronic device supporting the same
US10845940B2 (en) Electronic device and display method of electronic device
US11210828B2 (en) Method and electronic device for outputting guide
US10956592B2 (en) Contents securing method and electronic device supporting the same
KR20160039334A (en) Method for configuring screen, electronic apparatus and storage medium
US20180074697A1 (en) Method for outputting screen according to force input and electronic device supporting the same
KR102247673B1 (en) Electronic apparatus and method for displaying screen thereof
KR20190065886A (en) Sharing captured multi-media taken with a shooting icon that includes a profile image
KR20180063763A (en) Sharing captured multi-media

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant