KR20170098113A - Method for creating image group of electronic device and electronic device thereof - Google Patents

Method for creating image group of electronic device and electronic device thereof

Info

Publication number
KR20170098113A
KR20170098113A (application KR1020160020043A)
Authority
KR
South Korea
Prior art keywords
image
group
images
information
electronic device
Prior art date
Application number
KR1020160020043A
Other languages
Korean (ko)
Inventor
김규현
권순범
이창선
정승환
김대희
김일섭
양재용
김경애
박성주
유보라
이규호
전성익
조진형
김광태
박지윤
전용준
Original Assignee
삼성전자주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 삼성전자주식회사
Priority to KR1020160020043A
Publication of KR20170098113A

Classifications

    • G06F17/30268
    • G06F17/30073
    • G06F17/3028
    • G06F17/3071

Abstract

A method of generating an image group of an electronic device according to various embodiments may include: generating a group including at least one image; adding the at least one image included in the generated group to a first image group that corresponds to the generated group among at least one image group stored in the electronic device and that has a first title; and changing the first title of the first image group to a second title using image information of the at least one image included in the generated group. Other embodiments are possible.

Description

TECHNICAL FIELD [0001] The present invention relates to a method of generating an image group of an electronic device, and an electronic device thereof.

Various embodiments relate to a method of generating an image group of an electronic device and an electronic device thereof.

As the use of portable electronic devices with camera functions, such as smart phones and tablet PCs, becomes common, users can shoot pictures (or videos) anytime and anywhere. Accordingly, in recent years, users of electronic devices have taken many pictures in order to record their daily lives as photographs. As many pictures are stored in the memory of an electronic device, there has been demand for techniques to easily organize and manage the stored pictures. Accordingly, techniques have conventionally been developed for organizing and/or managing photographs by creating image groups based on the date and position information of photographs stored in the electronic device.

Conventional techniques for creating image groups to organize and/or manage images such as photographs provide automatic classification by analyzing images on a server basis. In addition, a conventional technique detects images based on a specific keyword, so that the electronic device creates an image group only after limiting the classification range to a specific period and/or a specific place. Further, a conventional technique displays the title of the generated image group simply as position information.

Because conventional image group generation technology creates image groups based on simple rules, it divides images without distinguishing near-duplicate images. Also, no technique has conventionally been developed to update a generated image group when an image related to an image group generated in the electronic device is added, or when an image included in the generated image group is edited.

Accordingly, various embodiments are directed to a method of generating an image group of an electronic device capable of providing electronic device-based automatic image classification, and an electronic device thereof. For example, the electronic device can automatically classify and group many photographs, which are difficult for a user to classify directly, by using tag information such as time, place, and person obtained from image analysis. Also, a meaningful title can be automatically generated from the tag information of the grouped pictures. This allows the electronic device to present many pictures to the user in a meaningful way and to facilitate sharing and/or consumption.
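The tag-based grouping and title generation described above can be sketched as follows. This is a minimal illustration, not the patented algorithm: the `Image` fields, the fixed one-hour gap, and the title template are assumptions made for the example.

```python
from collections import Counter
from dataclasses import dataclass, field

@dataclass
class Image:
    # Hypothetical tag fields; a real device would derive them from
    # EXIF metadata and on-device image analysis.
    time: int                                             # capture time (epoch seconds)
    place: str                                            # place tag
    people: frozenset = field(default_factory=frozenset)  # person tags

def group_by_tags(images, max_gap=3600):
    """Group time-sorted images, starting a new group when the capture-time
    gap exceeds max_gap seconds or the place tag changes."""
    groups = []
    for img in sorted(images, key=lambda i: i.time):
        if groups and img.time - groups[-1][-1].time <= max_gap \
                and img.place == groups[-1][-1].place:
            groups[-1].append(img)
        else:
            groups.append([img])
    return groups

def make_title(group):
    """Generate a title from the most common place and person tags."""
    places = Counter(img.place for img in group)
    people = Counter(p for img in group for p in img.people)
    title = places.most_common(1)[0][0]
    if people:
        title += " with " + people.most_common(1)[0][0]
    return title
```

For example, photos taken minutes apart in one place with one person would form a single group titled after that place and person.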

In addition, various embodiments provide an image group generation method of an electronic device, and an electronic device thereof, that can provide a follow-up (recollection) view using tag information of an image.
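One plausible reading of such a follow-up view is an "on this day" selection built from time tags. The sketch below is an assumption for illustration; the dictionary shape and the one-year lookback are not from the patent.

```python
import datetime

def recollection_view(images, today, years_back=1):
    """Return images whose time tag falls on the same month and day in a
    past year, as a simple 'a year ago today' follow-up view."""
    target = today.replace(year=today.year - years_back).date()
    return [img for img in images if img["date"].date() == target]
```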

In addition, various embodiments provide a method of generating an image group of an electronic device capable of automatically updating the image group when an image related to an image group created in the electronic device is added, and an electronic device thereof.
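The automatic update on image acquisition could work as sketched below: a newly acquired image joins an existing group when it matches that group's tags, and otherwise starts a new group. The time/place matching rule here is an illustrative assumption.

```python
def update_groups(groups, new_image, max_gap=3600):
    """Route a newly acquired image into an existing group whose last image
    is close in time and shares the place tag; otherwise start a new group.
    Returns the index of the group that received the image."""
    for idx, group in enumerate(groups):
        last = group[-1]
        if abs(new_image["time"] - last["time"]) <= max_gap \
                and new_image["place"] == last["place"]:
            group.append(new_image)
            return idx
    groups.append([new_image])
    return len(groups) - 1
```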

In addition, various embodiments provide an image group generation method of an electronic device, and an electronic device thereof, that can adjust the range of image clustering according to the photographing pattern of the user as a dynamic classification standard.
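A dynamic classification standard could, for instance, derive the clustering time gap from the user's own shooting intervals. The median-based rule, the factor of 3, and the 10-minute floor below are assumptions, not the patent's formula.

```python
import statistics

def adaptive_gap(capture_times, factor=3.0, floor=600):
    """Derive a clustering time-gap threshold (seconds) from the user's
    shooting pattern: a multiple of the median interval between shots,
    clamped to a minimum floor. Frequent shooters get tighter clusters."""
    capture_times = sorted(capture_times)
    intervals = [b - a for a, b in zip(capture_times, capture_times[1:])]
    if not intervals:
        return floor
    return max(floor, factor * statistics.median(intervals))
```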

A method of generating an image group of an electronic device according to an embodiment may include: generating a group including at least one image; adding the at least one image included in the generated group to a first image group that corresponds to the generated group among at least one image group stored in the electronic device and that has a first title; and changing the first title of the first image group to a second title using image information of the at least one image included in the generated group.

An electronic device according to an embodiment may include: a memory for storing at least one image group; and a control unit configured to generate a group including at least one image, to add the at least one image included in the generated group to a first image group that corresponds to the generated group among the at least one image group stored in the memory and that has a first title, and to change the first title of the first image group to a second title using image information of the at least one image included in the generated group.

In a recording medium according to an embodiment, a program may be recorded for performing operations of: generating a group including at least one image; adding the at least one image included in the generated group to a first image group having a first title; and changing the first title of the first image group to a second title using image information of the at least one image included in the generated group.
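Taken together, the claimed operations (add the new group's images to a corresponding stored group, then regenerate its title) can be sketched as below. Matching stored groups by shared place tag and the `title_fn` callback are illustrative assumptions.

```python
def merge_and_retitle(stored_groups, new_group, title_fn):
    """Find the stored image group that corresponds to the new group (here:
    one sharing its place tag), add the new group's images to it, and change
    its first title to a second title derived from the combined images."""
    place = new_group["images"][0]["place"]
    for g in stored_groups:
        if g["images"] and g["images"][0]["place"] == place:
            g["images"].extend(new_group["images"])
            g["title"] = title_fn(g["images"])  # first title -> second title
            return g
    # No corresponding group: store the new group with its own title.
    stored_groups.append({"title": title_fn(new_group["images"]),
                          "images": new_group["images"]})
    return stored_groups[-1]
```

A `title_fn` such as `lambda imgs: f"{imgs[0]['place']} ({len(imgs)} photos)"` would regenerate the title from the merged image information.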

A method of generating an image group of an electronic device according to various embodiments, and the electronic device thereof, can provide electronic device-based automatic image classification. In addition, a follow-up view using tag information of an image can be provided. Further, when an image related to an image group generated in the electronic device is added, the image group can be automatically updated. In addition, the range of image clustering can be adjusted according to the photographing pattern of the user as a dynamic classification standard.

FIG. 1 illustrates a network environment including an electronic device according to various embodiments.
FIG. 2 shows a block diagram of an electronic device according to various embodiments.
FIG. 3 shows a block diagram of a program module according to various embodiments.
FIG. 4 illustrates a system including an electronic device for generating a group of images according to various embodiments.
FIG. 5 shows a configuration of an electronic device for generating a group of images according to various embodiments.
FIG. 6 is a flowchart of an image group creation operation of an electronic device (e.g., electronic device 101) according to various embodiments.
FIG. 7 is a flowchart of an image group creation operation of an electronic device (e.g., electronic device 101) according to various embodiments.
FIG. 8 is a flowchart of a time information-based image group creation operation of an electronic device (e.g., electronic device 101) according to various embodiments.
FIGS. 9 and 10 are views for explaining generation of an image group based on time information according to various embodiments.
FIG. 11 is an exemplary view of a screen for inducing a user to perform name tagging for images.
FIG. 12 is a flowchart of an image group update operation in response to image acquisition of an electronic device (e.g., electronic device 101) according to various embodiments.
FIG. 13 is a flowchart of operations for merging image groups upon creating a new image group of an electronic device according to various embodiments.
FIG. 14 is an exemplary view of an image search screen using tag information according to various embodiments.
FIG. 15 is an illustration of an image containing a plurality of tag information according to various embodiments.
FIG. 16 is an exemplary view of a screen displaying image groups according to various embodiments.

Hereinafter, various embodiments of the present document will be described with reference to the accompanying drawings. It is to be understood that the embodiments and terminology used herein are not intended to limit the invention to the particular embodiments described, but to include various modifications, equivalents, and/or alternatives of the embodiments. In connection with the description of the drawings, like reference numerals may be used for similar components. Singular expressions may include plural expressions unless the context clearly dictates otherwise. In this document, expressions such as "A or B" or "at least one of A and/or B" may include all possible combinations of the items listed together. Expressions such as "first" or "second" may modify the corresponding components regardless of order or importance, are used to distinguish one component from another, and do not limit those components. When it is mentioned that some (e.g., first) component is "(functionally or communicatively) connected" or "coupled" to another (e.g., second) component, the component may be connected directly to the other component, or may be connected through yet another component (e.g., a third component).

In this document, the expression "configured to (or set to)" may be used interchangeably with, for example, "suitable for", "having the capacity to", "adapted to", "made to", "capable of", or "designed to", in hardware or software. In some situations, the expression "a device configured to" may mean that the device can do something together with other devices or components. For example, the phrase "a processor configured (or set) to perform A, B, and C" may mean a dedicated processor (e.g., an embedded processor) for performing the corresponding operations, or a general-purpose processor (e.g., a CPU or an application processor) capable of performing the corresponding operations by executing one or more software programs stored in a memory device.

Electronic devices in accordance with various embodiments of this document may include, for example, at least one of a smart phone, a tablet PC, a mobile phone, a video phone, an electronic book reader, a desktop PC, a laptop PC, a netbook computer, a workstation, a server, a PDA, a portable multimedia player, an MP3 player, a medical device, a camera, or a wearable device. The wearable device may include at least one of an accessory type (e.g., a watch, a ring, a bracelet, an anklet, a necklace, glasses, a contact lens, or a head-mounted device (HMD)), a fabric- or clothing-integrated type, a body-attached type (e.g., a skin pad or a tattoo), or a bio-implantable circuit. In some embodiments, the electronic device may include, for example, at least one of a television, a digital video disk (DVD) player, an audio player, a refrigerator, an air conditioner, a vacuum cleaner, an oven, a microwave oven, a washing machine, an air purifier, a set-top box, a home automation control panel, a security control panel, a media box (e.g., Samsung HomeSync™, Apple TV™, or Google TV™), a game console (e.g., Xbox™, PlayStation™), an electronic dictionary, an electronic key, a camcorder, or an electronic frame.

In another embodiment, the electronic device may include at least one of various medical devices (e.g., various portable medical measurement devices such as a blood glucose meter, a heart rate meter, a blood pressure meter, or a body temperature meter, a magnetic resonance angiography (MRA) device, a magnetic resonance imaging (MRI) device, a computed tomography (CT) device, an imaging device, or an ultrasonic device), a navigation device, a global navigation satellite system (GNSS), an event data recorder (EDR), a flight data recorder (FDR), an automobile infotainment device, marine electronic equipment (e.g., a marine navigation system, a gyro compass, etc.), avionics, a security device, a head unit for a vehicle, an industrial or home robot, a drone, an ATM of a financial institution, a point-of-sale (POS) terminal of a store, or an Internet-of-Things device (e.g., a light bulb, a fire detector, a fire alarm, a thermostat, a streetlight, a toaster, a fitness device, a hot water tank, a heater, a boiler, etc.). According to some embodiments, the electronic device may include a piece of furniture, a part of a building/structure or automobile, an electronic board, an electronic signature receiving device, a projector, or various measuring devices (e.g., water, electricity, gas, or radio wave measuring instruments). In various embodiments, the electronic device may be flexible, or may be a combination of two or more of the various devices described above. The electronic device according to an embodiment of this document is not limited to the above-described devices. In this document, the term "user" may refer to a person using an electronic device or a device using an electronic device (e.g., an artificial intelligence electronic device).

Referring to FIG. 1, an electronic device 101 in a network environment 100 is described in various embodiments. The electronic device 101 may include a bus 110, a processor 120, a memory 130, an input/output interface 150, a display 160, and a communication interface 170. In some embodiments, the electronic device 101 may omit at least one of the components, or may additionally include other components. The bus 110 may include circuitry that connects the components 110-170 to one another and delivers communications (e.g., control messages or data) between the components. The processor 120 may include one or more of a central processing unit (CPU), an application processor (AP), or a communication processor (CP). The processor 120 may perform, for example, computations or data processing related to control and/or communication of at least one other component of the electronic device 101.

The memory 130 may include volatile and/or non-volatile memory. The memory 130 may store, for example, instructions or data related to at least one other component of the electronic device 101. According to one embodiment, the memory 130 may store software and/or a program 140. The program 140 may include, for example, a kernel 141, middleware 143, an application programming interface (API) 145, and/or an application program 147. At least some of the kernel 141, the middleware 143, or the API 145 may be referred to as an operating system. The kernel 141 may control or manage system resources (e.g., the bus 110, the processor 120, or the memory 130) used to execute operations or functions implemented in the other programs (e.g., the middleware 143, the API 145, or the application program 147). The kernel 141 may also provide an interface through which the middleware 143, the API 145, or the application program 147 can access individual components of the electronic device 101 to control or manage system resources.

The middleware 143 can perform an intermediary role so that the API 145 or the application program 147 can communicate with the kernel 141 to exchange data. In addition, the middleware 143 may process one or more task requests received from the application program 147 according to priority. For example, the middleware 143 may assign a priority for using the system resources (e.g., the bus 110, the processor 120, or the memory 130) of the electronic device 101 to at least one of the application programs 147, and process the one or more task requests accordingly. The API 145 is an interface through which the application 147 controls functions provided by the kernel 141 or the middleware 143, and may include, for example, at least one interface or function (e.g., a command) for file control, window control, image processing, or character control. The input/output interface 150 may transfer commands or data entered from a user or another external device to the other component(s) of the electronic device 101, or output commands or data received from the other component(s) of the electronic device 101 to the user or another external device.

The display 160 may include, for example, a liquid crystal display (LCD), a light-emitting diode (LED) display, an organic light-emitting diode (OLED) display, a microelectromechanical systems (MEMS) display, or an electronic paper display. The display 160 may display, for example, various content (e.g., text, images, videos, icons, and/or symbols) to a user. The display 160 may include a touch screen and may receive, for example, a touch, gesture, proximity, or hovering input using an electronic pen or a part of the user's body. The communication interface 170 may establish communication between the electronic device 101 and an external device (e.g., the first external electronic device 102, the second external electronic device 104, or the server 106). For example, the communication interface 170 may be connected to a network 162 via wireless or wired communication to communicate with an external device (e.g., the second external electronic device 104 or the server 106).

The wireless communication may include, for example, cellular communication using at least one of LTE, LTE-A (LTE Advance), code division multiple access (CDMA), wideband CDMA (WCDMA), universal mobile telecommunications system (UMTS), wireless broadband (WiBro), or Global System for Mobile Communications (GSM). According to one embodiment, the wireless communication may include, for example, at least one of wireless fidelity (WiFi), Bluetooth, Bluetooth low energy (BLE), Zigbee, near field communication (NFC), magnetic secure transmission, radio frequency (RF), or body area network (BAN). According to one embodiment, the wireless communication may include GNSS. The GNSS may be, for example, the Global Positioning System (GPS), the Global Navigation Satellite System (Glonass), the Beidou Navigation Satellite System (Beidou), or Galileo, the European global satellite-based navigation system. Hereinafter, in this document, "GPS" may be used interchangeably with "GNSS". The wired communication may include, for example, at least one of universal serial bus (USB), high definition multimedia interface (HDMI), recommended standard 232 (RS-232), power line communication, or plain old telephone service (POTS). The network 162 may include at least one of a telecommunications network, for example, a computer network (e.g., LAN or WAN), the Internet, or a telephone network.

Each of the first and second external electronic devices 102 and 104 may be the same kind of device as the electronic device 101, or a different kind. According to various embodiments, all or a portion of the operations performed in the electronic device 101 may be performed in one or more other electronic devices (e.g., the electronic devices 102 and 104, or the server 106). According to one embodiment, when the electronic device 101 has to perform a function or service automatically or on request, the electronic device 101 may request another device (e.g., the electronic device 102 or 104, or the server 106) to perform at least some functions associated with the function or service, instead of or in addition to executing the function or service itself. The other electronic device may execute the requested function or an additional function and transfer the result to the electronic device 101. The electronic device 101 may provide the requested function or service by processing the received result as-is or additionally. To this end, for example, cloud computing, distributed computing, or client-server computing technology may be used.

FIG. 2 is a block diagram of an electronic device 201 according to various embodiments. The electronic device 201 may include, for example, all or part of the electronic device 101 shown in FIG. 1. The electronic device 201 may include one or more processors (e.g., an AP) 210, a communication module 220, a subscriber identification module 224, a memory 230, a sensor module 240, an input device 250, a display 260, an interface 270, an audio module 280, a camera module 291, a power management module 295, a battery 296, an indicator 297, and a motor 298. The processor 210 may control a plurality of hardware or software components connected to the processor 210, for example, by driving an operating system or an application program, and may perform various data processing and calculations. The processor 210 may be implemented with, for example, a system on chip (SoC). According to one embodiment, the processor 210 may further include a graphics processing unit (GPU) and/or an image signal processor. The processor 210 may include at least some of the components shown in FIG. 2 (e.g., the cellular module 221). The processor 210 may load instructions or data received from at least one of the other components (e.g., non-volatile memory) into volatile memory, process them, and store the resulting data in non-volatile memory.

The communication module 220 may have the same or similar configuration as the communication interface 170. The communication module 220 may include, for example, a cellular module 221, a WiFi module 223, a Bluetooth module 225, a GNSS module 227, an NFC module 228, and an RF module 229. The cellular module 221 can provide, for example, voice calls, video calls, text services, or Internet services over a communication network. According to one embodiment, the cellular module 221 may use a subscriber identification module (e.g., a SIM card) 224 to perform identification and authentication of the electronic device 201 within the communication network. According to one embodiment, the cellular module 221 may perform at least some of the functions that the processor 210 can provide. According to one embodiment, the cellular module 221 may include a communication processor (CP). According to some embodiments, at least some (e.g., two or more) of the cellular module 221, the WiFi module 223, the Bluetooth module 225, the GNSS module 227, or the NFC module 228 may be included in one integrated chip (IC) or IC package. The RF module 229 can, for example, send and receive communication signals (e.g., RF signals). The RF module 229 may include, for example, a transceiver, a power amplifier module (PAM), a frequency filter, a low noise amplifier (LNA), or an antenna. According to another embodiment, at least one of the cellular module 221, the WiFi module 223, the Bluetooth module 225, the GNSS module 227, or the NFC module 228 may transmit and receive RF signals through a separate RF module. The subscriber identification module 224 may include, for example, a card containing a subscriber identity module or an embedded SIM, and may include unique identification information (e.g., an integrated circuit card identifier (ICCID)) or subscriber information (e.g., an international mobile subscriber identity (IMSI)).

The memory 230 (e.g., the memory 130) may include, for example, an internal memory 232 or an external memory 234. The internal memory 232 may include, for example, at least one of a volatile memory (e.g., DRAM, SRAM, or SDRAM) or a non-volatile memory (e.g., OTPROM, PROM, EPROM, EEPROM, mask ROM, flash ROM, flash memory, a hard drive, or a solid state drive (SSD)). The external memory 234 may include a flash drive, for example, compact flash (CF), secure digital (SD), micro-SD, mini-SD, extreme digital (xD), a multi-media card (MMC), or a memory stick. The external memory 234 may be functionally or physically connected to the electronic device 201 through various interfaces.

The sensor module 240 may, for example, measure a physical quantity or sense an operating state of the electronic device 201, and convert the measured or sensed information into an electrical signal. The sensor module 240 may include, for example, at least one of a gesture sensor 240A, a gyro sensor 240B, an atmospheric pressure sensor 240C, a magnetic sensor 240D, an acceleration sensor 240E, a grip sensor 240F, a proximity sensor 240G, a color sensor 240H (e.g., an RGB (red, green, blue) sensor), a biometric sensor 240I, a temperature/humidity sensor 240J, an illuminance sensor 240K, or a UV sensor 240M. Additionally or alternatively, the sensor module 240 may include, for example, an e-nose sensor, an electromyography (EMG) sensor, an electroencephalogram (EEG) sensor, an electrocardiogram (ECG) sensor, an infrared (IR) sensor, an iris sensor, and/or a fingerprint sensor. The sensor module 240 may further include a control circuit for controlling at least one sensor belonging to the sensor module 240. In some embodiments, the electronic device 201 may further include a processor configured to control the sensor module 240, either as part of the processor 210 or separately, so that the sensor module 240 can be controlled while the processor 210 is in a sleep state.

The input device 250 may include, for example, a touch panel 252, a (digital) pen sensor 254, a key 256, or an ultrasonic input device 258. The touch panel 252 may use, for example, at least one of an electrostatic type, a pressure-sensitive type, an infrared type, or an ultrasonic type. The touch panel 252 may further include a control circuit. The touch panel 252 may further include a tactile layer to provide a tactile response to the user. The (digital) pen sensor 254 may be, for example, part of the touch panel, or may include a separate recognition sheet. The key 256 may include, for example, a physical button, an optical key, or a keypad. The ultrasonic input device 258 can sense an ultrasonic wave generated by an input tool through a microphone (e.g., the microphone 288) and identify the data corresponding to the detected ultrasonic wave.

The display 260 (e.g., the display 160) may include a panel 262, a hologram device 264, a projector 266, and/or control circuitry for controlling them. The panel 262 may be implemented, for example, to be flexible, transparent, or wearable. The panel 262, together with the touch panel 252, may be configured as one or more modules. According to one embodiment, the panel 262 may include a pressure sensor (or force sensor) capable of measuring the intensity of pressure of a user's touch. The pressure sensor may be integrated with the touch panel 252, or may be implemented as one or more sensors separate from the touch panel 252. The hologram device 264 can display a stereoscopic image in the air using the interference of light. The projector 266 can display an image by projecting light onto a screen. The screen may be located, for example, inside or outside the electronic device 201. The interface 270 may include, for example, an HDMI 272, a USB 274, an optical interface 276, or a D-subminiature (D-sub) 278. The interface 270 may, for example, be included in the communication interface 170 shown in FIG. 1. Additionally or alternatively, the interface 270 may include, for example, a mobile high-definition link (MHL) interface, an SD card/multi-media card (MMC) interface, or an infrared data association (IrDA) standard interface.

The audio module 280 can, for example, convert between sound and electrical signals in both directions. At least some of the components of the audio module 280 may be included, for example, in the input/output interface 150 shown in FIG. 1. The audio module 280 may process sound information input or output through, for example, a speaker 282, a receiver 284, an earphone 286, or a microphone 288. The camera module 291 is, for example, a device capable of capturing still images and moving images, and according to one embodiment, may include one or more image sensors (e.g., a front sensor or a rear sensor), a lens, an image signal processor (ISP), or a flash (e.g., an LED or a xenon lamp). The power management module 295 can, for example, manage the power of the electronic device 201. According to one embodiment, the power management module 295 may include a power management integrated circuit (PMIC), a charging IC, or a battery or fuel gauge. The PMIC may have a wired and/or wireless charging scheme. The wireless charging scheme may include, for example, a magnetic resonance scheme, a magnetic induction scheme, or an electromagnetic wave scheme, and may further include an additional circuit for wireless charging, for example, a coil loop, a resonant circuit, or a rectifier. The battery gauge can measure, for example, the remaining capacity of the battery 296, or the voltage, current, or temperature during charging. The battery 296 may include, for example, a rechargeable battery and/or a solar cell.

The indicator 297 may indicate a particular state of the electronic device 201 or a part thereof (e.g., the processor 210), for example, a booting state, a message state, or a charging state. The motor 298 can convert an electrical signal into mechanical vibration, and can generate vibration, haptic effects, and the like. The electronic device 201 may include, for example, a mobile TV support device (e.g., a GPU) capable of processing media data according to standards such as digital multimedia broadcasting (DMB), digital video broadcasting (DVB), or MediaFLO™. Each of the components described in this document may be composed of one or more parts, and the name of the component may vary according to the type of the electronic device. In various embodiments, an electronic device (e.g., the electronic device 201) may omit some components, include additional components, or combine some of the components into one entity that performs the same functions as those components did before being combined.

FIG. 3 is a block diagram of a program module according to various embodiments. According to one embodiment, the program module 310 (e.g., the program 140) may include an operating system that controls resources associated with an electronic device (e.g., the electronic device 101) and/or various applications (e.g., the application program 147) running on the operating system. The operating system may include, for example, Android™, iOS™, Windows™, Symbian™, Tizen™, or Bada™. Referring to FIG. 3, the program module 310 may include a kernel 320 (e.g., the kernel 141), middleware 330 (e.g., the middleware 143), an API 360 (e.g., the API 145), and/or an application 370 (e.g., the application program 147). At least a portion of the program module 310 may be preloaded on the electronic device, or may be downloaded from an external electronic device (e.g., the electronic device 102 or 104, or the server 106).

The kernel 320 may include, for example, a system resource manager 321 and/or a device driver 323. The system resource manager 321 can perform control, allocation, or recovery of system resources. According to one embodiment, the system resource manager 321 may include a process manager, a memory manager, or a file system manager. The device driver 323 may include, for example, a display driver, a camera driver, a Bluetooth driver, a shared memory driver, a USB driver, a keypad driver, a WiFi driver, an audio driver, or an inter-process communication (IPC) driver. The middleware 330 may, for example, provide functions commonly needed by the application 370, or provide various functions to the application 370 through the API 360 so that the application 370 can use the limited system resources within the electronic device. According to one embodiment, the middleware 330 may include at least one of a runtime library 335, an application manager 341, a window manager 342, a multimedia manager 343, a resource manager 344, a power manager 345, a database manager 346, a package manager 347, a connectivity manager 348, a notification manager 349, a location manager 350, a graphic manager 351, or a security manager 352.

The runtime library 335 may include, for example, a library module that the compiler uses to add new functionality via a programming language while the application 370 is executing. The runtime library 335 may perform input / output management, memory management, or arithmetic function processing. The application manager 341 can manage the life cycle of the application 370, for example. The window manager 342 can manage GUI resources used in the screen. The multimedia manager 343 can recognize the format required for reproducing the media files and can perform encoding or decoding of the media file using a codec according to the format. The resource manager 344 can manage the source code of the application 370 or the space of the memory. The power manager 345 may, for example, manage the capacity or power of the battery and provide the power information necessary for operation of the electronic device. According to one embodiment, the power manager 345 may interoperate with a basic input / output system (BIOS). The database manager 346 may create, retrieve, or modify the database to be used in the application 370, for example. The package manager 347 can manage installation or update of an application distributed in the form of a package file.

The connectivity manager 348 may, for example, manage the wireless connection. The notification manager 349 may provide the user with an event such as, for example, an arrival message, an appointment, or a proximity notification. The location manager 350 can manage the location information of the electronic device, for example. The graphic manager 351 may, for example, manage the graphical effects to be presented to the user or a user interface associated therewith. The security manager 352 may provide, for example, system security or user authentication. According to one embodiment, the middleware 330 may include a telephony manager for managing the voice or video call function of the electronic device, or a middleware module capable of forming a combination of the functions of the above-described components. According to one embodiment, the middleware 330 may provide a module specialized for each type of operating system. The middleware 330 may dynamically delete some existing components or add new ones. The API 360 is, for example, a set of API programming functions, and may be provided in a different configuration depending on the operating system. For example, in the case of Android or iOS, one API set may be provided per platform, and in the case of Tizen, two or more API sets may be provided per platform.

The application 370 may include a home 371, a dialer 372, an SMS/MMS 373, an instant message 374, a browser 375, a camera 376, an alarm 377, a contact 378, a voice dial 379, an email 380, a calendar 381, a media player 382, an album 383, a watch 384, a healthcare application (e.g., for measuring exercise or blood glucose), or an environmental information application (e.g., for air pressure, humidity, or temperature information). According to one embodiment, the application 370 may include an information exchange application capable of supporting the exchange of information between the electronic device and an external electronic device. The information exchange application may include, for example, a notification relay application for communicating specific information to an external electronic device, or a device management application for managing an external electronic device. For example, the notification relay application can transmit notification information generated in another application of the electronic device to the external electronic device, or receive notification information from the external electronic device and provide it to the user. The device management application may, for example, control the turn-on/turn-off or brightness (or resolution) of an external electronic device communicating with the electronic device (e.g., of the external electronic device itself or some of its components), or install, delete, or update an application running on the external electronic device. According to one embodiment, the application 370 may include an application (e.g., a healthcare application of a mobile medical device) designated according to the attributes of the external electronic device. According to one embodiment, the application 370 may include an application received from an external electronic device.
At least some of the program module 310 may be implemented (e.g., executed) in software, firmware, hardware (e.g., the processor 210), or a combination of at least two thereof, and may include a module, a program, a routine, an instruction set, or a process for performing one or more functions.

FIG. 4 illustrates a system including an electronic device for generating a group of images according to various embodiments. Referring to FIG. 4, a first electronic device 410 (e.g., the electronic device 101) may transmit an image and/or a generated group of images through a server 430 to a second electronic device 450 (e.g., the electronic device 104) to share the image and/or group of images with the second electronic device 450. Alternatively, the first electronic device 410 may receive, via the server 430, a group of images and/or images generated by the second electronic device 450. In addition, the first electronic device 410 may upload or download images and/or groups of images to or from the content sharing service 490 via the cloud gateway 470.

The system 400 may include a first electronic device 410, a server 430, a second electronic device 450, a cloud gateway 470, and a content sharing service 490.

The first electronic device 410 may include a contents management hub (CMH) module 411, a camera 414, a gallery application 416, a sharing agent 417, and a cloud gateway platform 418.

The CMH module 411 is a module for managing image group creation and image group updating, and may be, for example, the processor 120 shown in FIG. 1 or included in the processor 120. The CMH module 411 extracts tag information of an image through image analysis, and can automatically generate an image group using the tag information and/or automatically update the generated image group. The tag information may include, for example, at least one of location information, time information, image type information, or user input information.
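The document does not show the tag-information record in code; a minimal sketch of what the CMH module's per-image tag record might look like is given below. All field names (`gps`, `taken_at`, `kind`, `user_tags`) are assumptions for illustration, not the patent's actual data model.

```python
def extract_tag_info(image_record):
    """Build a tag-information dict (location, time, type, user input)
    from a hypothetical per-image metadata record."""
    return {
        "location": image_record.get("gps"),          # location information
        "time": image_record.get("taken_at"),         # time information
        "type": image_record.get("kind", "photo"),    # image type information
        "user_input": image_record.get("user_tags"),  # user input information
    }

images = [
    {"gps": (37.58, 126.98), "taken_at": "2016-02-21T10:00", "user_tags": ["palace"]},
    {"taken_at": "2016-02-21T10:05", "kind": "video"},
]
tags = [extract_tag_info(img) for img in images]
```

Any of the four fields may be missing for a given image, so downstream grouping logic has to tolerate `None` values.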

The camera 414 may be the camera module 291 shown in FIG. 2.

The gallery application 416 can display image files such as photographs and moving pictures in folder and channel formats.

The sharing agent 417 may be the communication interface 170 shown in FIG. 1. The sharing agent 417 may manage the connection between the server 430 and the first electronic device 410. For example, the sharing agent 417 can transmit and receive a group of images, in cooperation with the first server 431 to the fourth server 437 described later, with an agent (e.g., the sharing agent 457) of the second electronic device 450.

The cloud gateway platform 418 may be the communication interface 170 shown in FIG. 1. The cloud gateway platform 418 may be a communication interface that communicates with a cloud service through the cloud gateway 470. The cloud service may be, for example, the content sharing service 490, and the content sharing service 490 may include a first content sharing service 491, a second content sharing service 493, and the like. The first content sharing service 491 may be, for example, Dropbox, and the second content sharing service 493 may be, for example, OneDrive. The cloud gateway 470 may be a gateway that can control the contents (e.g., images) of each cloud.

The server 430 may include at least one of a first server 431, a second server 433, a third server 435, and a fourth server 437.

The first server 431 may perform communication functions and may give a notification to the sharing agent (e.g., 417, 457) of each electronic device for sharing and/or updating the image group. For example, the first server 431 may be an SPP (Samsung push platform) server. For example, the notification may be delivered in various ways using at least one of message transmission, e-mail transmission, and the like, in addition to the push function.

The second server 433 may be operable to store actual images and / or moving images when the group of images is shared. For example, the second server 433 may be a storage server.

The third server 435 may manage the sync of the shared images. For example, the third server 435 may be a share server.

The fourth server 437 may be an authentication server based on at least one of telephone number information of the electronic device, identification information of the electronic device, or user information such as a user ID. The user information can be authenticated by using a telephone number, an e-mail address, an ID, a model number unique to the electronic device, and the like.

For example, the fourth server 437 may be an easy sign up (ESU) server.

The second electronic device 450 may include a gallery application 456 and a sharing agent 457. The gallery application 456 can display image files such as photographs and moving pictures in folder and channel formats. The sharing agent 457 may perform the same or similar operations as the communication interface 170 shown in FIG. 1. The sharing agent 457 may manage the connection between the server 430 and the second electronic device 450.

FIG. 5 shows a configuration of an electronic device for generating a group of images according to various embodiments.

The electronic device 500 may include an application 510, an integrated metadata database 520, a CMH module 530, an external provider module 550, an image processing library 560, and a memory (e.g., the memory 130).

The application 510 may include various applications such as a Highlight Video 511, a Gallery 513, and an S-Health 515, for example.

The Highlight Video 511 can provide a function of automatically detecting some important images and important parts of moving pictures included in an image group according to a specified condition and automatically generating a video.

The Gallery 513 can display contents such as pictures and moving pictures taken directly by the user using the camera 414, contents such as transmitted and received pictures and moving pictures, and shared and updated contents, as well as groups of pictures, moving pictures, and images.

The S-Health 515 may provide a health care function.

The CMH module 530 may be, for example, the CMH module 411 of FIG. 4. The CMH module 530 may include a CMH provider 531 and a CMH engine 533.

The CMH provider 531 may manage the integrated metadata database 520 and may provide image group and image analysis information to the application 510. The integrated metadata database 520 may store image-specific metadata stored in the electronic device 500.

The CMH engine 533 may include a samsung extended format (SEF) extractor 5330, a POI (point of interest) generator 5332, an event extractor 5334, and a story generator 5336.

The SEF extractor 5330 may extract tag information from images of the electronic device 500. For example, tag information can be extracted from photographs taken by a camera (e.g., the camera 414) of the electronic device 500.

The POI generator 5332 may obtain information such as a landmark of an image through GPS coordinates acquired by a GPS module (e.g., the GNSS module 227) of the electronic device 500. For example, when a user takes a picture using a camera (e.g., the camera 414) while the GPS function of the electronic device 500 is active, the POI generator 5332 transmits the GPS coordinates of the electronic device to a POI server (not shown) and receives, from the POI server, information such as a landmark, a nearby cafe, or a nearby restaurant corresponding to the GPS coordinates, thereby acquiring information such as a landmark, a cafe, and a restaurant corresponding to the image.
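The POI server protocol is not disclosed in this document; as a rough, self-contained illustration of the lookup described above, a nearest-landmark query could be sketched as follows. The POI entries and the 1 km search radius are assumptions for the example.

```python
import math

def haversine_km(a, b):
    """Great-circle distance in km between two (lat, lon) pairs."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    h = math.sin(dlat / 2) ** 2 + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2
    return 2 * 6371 * math.asin(math.sqrt(h))

# Hypothetical POI-server entries: landmark name -> (lat, lon)
POI_DB = {
    "Gyeongbokgung Palace": (37.5796, 126.9770),
    "N Seoul Tower": (37.5512, 126.9882),
}

def nearest_poi(gps, max_km=1.0):
    """Return the closest known POI within max_km of the GPS fix, else None."""
    name, dist = min(((n, haversine_km(gps, c)) for n, c in POI_DB.items()),
                     key=lambda x: x[1])
    return name if dist <= max_km else None
```

A photo whose GPS coordinates fall near the palace entry would then resolve to that landmark name, which can be folded into the image's tag information.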

The event extractor 5334 can provide event information registered by a user in a specific application (e.g., a schedule management application).

The story generator 5336 can automatically create event groups. The event group creation will be described in detail later.

The external provider module 550 may include a Media Provider 551, a Cloud Manager 553, and a Calendar Provider 555.

The media provider 551 may be a basic database storing basic information of an image, for example, photographing time, GPS coordinates, image file type, image size, image capacity information, and the like. The basic information of the image may be included in the tag information.

The Cloud Manager 553 can perform a function of connecting with a cloud (not shown).

The calendar provider 555 can store and provide holiday information for each country.

The image processing library 560 may include a SAIV (Samsung Advanced Image & Video) Library 561, and the SAIV Library 561 may perform image analysis.

The CMH module 530 may provide a first service 541, a second service 543, a third service 545, a fourth service 547, and a fifth service 549, or at least one of the first service 541 to the fifth service 549. The CMH module 530 may provide at least one of the first service 541 to the fifth service 549 to the application 510 in an inter-process communication (IPC) manner.

The first service 541 can provide a digital content management (DCM) function for managing digital contents through tag information of images and videos analyzed by at least one of the SEF extractor 5330, the POI generator 5332, the event extractor 5334, the Media Provider 551, the Cloud Manager 553, or the Calendar Provider 555.

The second service 543 may create and/or update image groups based on tag information (e.g., time, location, or user-specified information), and may provide a Story function that can generate and/or update the titles of the image groups. The creation and/or updating of an image group and of its title will be described in detail later.

The third service 545 and the fourth service 547 may be provided by the SAIV Library 561.

The third service 545 can provide an object recognition function that can recognize an object by searching for feature points of the object in an image, for example through face detection and recognition-related operations.

The fourth service 547 can provide an image processing function that performs, for each image, category classification, color extraction, image quality checking, and duplicate or similar image detection according to specified conditions of the image.
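How the fourth service detects duplicate or similar images is not specified here; one common technique it could plausibly use is a perceptual (average) hash, sketched below on toy grayscale pixel lists. The 8-pixel "images" and the 2-bit threshold are illustrative assumptions only.

```python
def average_hash(pixels):
    """Tiny perceptual hash: 1 bit per pixel, set if above the mean."""
    mean = sum(pixels) / len(pixels)
    return tuple(1 if p > mean else 0 for p in pixels)

def hamming(h1, h2):
    """Number of differing bits between two hashes."""
    return sum(a != b for a, b in zip(h1, h2))

def find_similar(images, threshold=2):
    """Return id pairs whose hashes differ in at most `threshold` bits."""
    hashes = {i: average_hash(px) for i, px in images.items()}
    ids = sorted(hashes)
    return [(ids[i], ids[j])
            for i in range(len(ids))
            for j in range(i + 1, len(ids))
            if hamming(hashes[ids[i]], hashes[ids[j]]) <= threshold]

images = {
    0: [10, 200, 15, 190, 20, 210, 12, 195],   # a shot
    1: [12, 198, 16, 192, 22, 208, 14, 196],   # near-duplicate of 0
    2: [200, 10, 190, 15, 210, 20, 195, 12],   # unrelated scene
}
```

In a real implementation the pixel lists would come from a downscaled grayscale thumbnail of each photo, but the comparison logic stays the same.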

The fifth service 549 can provide an Enhance service that allows the user to remember a picture more meaningfully, for example by providing, through the Highlight Video 511, a short video generated using some of the images included in an image group when the image group is created by the story generator 5336, by generating an animation graphic interchange format (AGIF) file, by providing sound information acquired with an image at the time of photographing together with that image, or by acquiring sound information related to the images included in the image group and providing it together with the images.

For example, when creating an image group, the AGIF file of an image group composed of optimal images may be generated after selecting at least one optimal image according to a predetermined condition for a plurality of similar images.
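Selecting the "optimal" image among several similar ones could, for example, rank the candidates by a quality score. The `sharpness` field below is an assumed stand-in for whatever the predetermined condition actually measures; the selected representatives would then become the AGIF frames.

```python
def pick_representative(similar_images, condition="sharpness"):
    """Return the single best image of a similar-image cluster according
    to a quality score (the scoring condition is an assumption)."""
    return max(similar_images, key=lambda img: img[condition])

cluster = [
    {"id": "a", "sharpness": 0.71},
    {"id": "b", "sharpness": 0.93},  # sharpest of the near-duplicates
    {"id": "c", "sharpness": 0.88},
]
best = pick_representative(cluster)
```

Running this over every similar-image cluster in the group yields one frame per cluster, avoiding visually redundant frames in the resulting AGIF file.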

The CMH module 530 and the application 510 can be connected to each other via DB Query & Intent.

An electronic device according to various embodiments may include a memory (e.g., the memory 130) storing a plurality of images, and a controller (e.g., the processor 120) that, when a group including at least one image is generated, includes the at least one image included in the generated group in a first image group that corresponds to the generated group and has a first title among at least one image group stored in the memory, and changes the first title of the first image group to a second title using image information of the at least one image included in the generated group.

According to various embodiments, the controller (e.g., the processor 120) may analyze the at least one image and, if the at least one image is a plurality of images, generate the group using the analysis result of the plurality of images.

According to various embodiments, the controller (e.g., the processor 120) may further determine the first image group corresponding to the generated group according to the information of the generated group, and the group information may include at least one of location information, time information, image type information, or user input information.

According to various embodiments, the controller (e.g., the processor 120) may check, using the at least one image included in the first image group and the at least one image included in the generated group, whether there are similar images, and, if it is determined that there are similar images, may select one of the similar images according to a specified condition and prevent the images other than the selected one from being included in the first image group.

According to various embodiments, the image information of the at least one image included in the generated group may include at least one of position information, time information, image type information, or user input information.

According to various embodiments, when changing the first title of the first image group to the second title, the controller (e.g., the processor 120) may include, in the second title, information corresponding to at least one piece of information included in the image information.

According to various embodiments, the controller (e.g., the processor 120) may further generate at least one new image group using the at least one image included in the generated group, and generate a title of the generated new image group using at least one piece of image information of the images included in the generated new image group.
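The document leaves open exactly how a title is composed from image information. One plausible scheme, assuming per-image `location` and `date` fields, combines the most frequent location with the date range; this is a sketch of the idea, not the patent's actual title logic.

```python
def make_title(image_infos):
    """Compose a group title from the most common location and date range."""
    locations = [i["location"] for i in image_infos if i.get("location")]
    dates = sorted(i["date"] for i in image_infos)
    # Fall back to a generic word when no image carries location information.
    place = max(set(locations), key=locations.count) if locations else "Memories"
    if dates[0] == dates[-1]:
        return f"{place}, {dates[0]}"
    return f"{place}, {dates[0]} - {dates[-1]}"

infos = [
    {"location": "Seoul", "date": "2016-02-20"},
    {"location": "Seoul", "date": "2016-02-21"},
    {"location": "Busan", "date": "2016-02-21"},
]
```

When new images join the group, rerunning `make_title` over the enlarged set corresponds to changing the first title to the second title as described above.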

According to various embodiments, when the at least one image included in the generated group is a plurality of images, the controller (e.g., the processor 120) may classify the plurality of images into corresponding ones of designated time intervals using the time information of each of the plurality of images, check at least one peak time interval having a number of images equal to or larger than the average over the corresponding time intervals, determine, for each peak time interval, a first time period extending from the nearest previous time interval in which the number of images equals a specified number to the nearest subsequent time interval in which the number of images equals the specified number, and create the group including the images included in the first time period.
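The peak-interval procedure above can be sketched in a few lines. This is an interpretation under stated assumptions: the designated time intervals are fixed-width buckets (1 "hour" here), the "specified number" bounding the expansion is taken to be 0 (an empty bucket), and timestamps are plain numbers.

```python
from collections import Counter

def group_by_activity(timestamps, bucket_hours=1, floor=0):
    """Group timestamps around activity peaks.

    Images are binned into fixed intervals; a bucket whose count is at or
    above the average of non-empty buckets is a peak. Each peak is expanded
    backward/forward to the nearest bucket whose count equals `floor`
    (assumed 0), and all images in that span form one group.
    """
    buckets = Counter(int(t // bucket_hours) for t in timestamps)
    if not buckets:
        return []
    avg = sum(buckets.values()) / len(buckets)
    spans = []
    for peak in (b for b, n in buckets.items() if n >= avg):
        lo = peak
        while buckets.get(lo - 1, 0) != floor:   # expand to previous boundary
            lo -= 1
        hi = peak
        while buckets.get(hi + 1, 0) != floor:   # expand to next boundary
            hi += 1
        if (lo, hi) not in spans:
            spans.append((lo, hi))
    return [[t for t in timestamps
             if lo * bucket_hours <= t < (hi + 1) * bucket_hours]
            for lo, hi in spans]

ts = [0.5, 1.2, 1.5, 2.1, 10.3]  # a burst of shots, then one stray photo
groups = group_by_activity(ts)
```

With this input, the burst between hours 0 and 3 forms one group while the isolated shot at hour 10 falls below the average and is left out.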

According to various embodiments, when the generated group is a plurality of groups, the controller (e.g., the processor 120) may calculate, for each of the plurality of groups, the time intervals to the preceding and succeeding groups, divide the time intervals into short time intervals and long time intervals, determine a first time interval that is a representative value (for example, an average value, a minimum value, a maximum value, or the like) of the time intervals classified as short and a second time interval that is a representative value of the time intervals classified as long, and create at least one new image group including adjacent groups whose time interval is shorter than the first time interval among the plurality of groups.

According to various embodiments, the controller (e.g., the processor 120) may further generate at least one new image group that includes adjacent groups whose time intervals are less than the second time interval among the plurality of groups.
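The gap-based merging of the two preceding paragraphs can be illustrated as follows. This sketch assumes groups are given as chronologically sorted `(start, end)` pairs, the short/long split point is supplied by the caller, and the representative value is the average; only the short-gap (first) threshold is applied here, but the long-gap threshold would be used the same way to build coarser groups.

```python
def merge_adjacent_groups(group_times, split):
    """Merge neighboring groups whose time gap is below the short-gap average.

    Gaps between consecutive groups are classified as "short" if below
    `split`; the average of the short gaps is the first threshold, and
    neighbors at most that far apart are merged into one group.
    """
    gaps = [group_times[i + 1][0] - group_times[i][1]
            for i in range(len(group_times) - 1)]
    short = [g for g in gaps if g < split]
    t1 = sum(short) / len(short) if short else 0  # first (short) threshold
    merged = [list(group_times[0])]
    for (start, end), gap in zip(group_times[1:], gaps):
        if gap <= t1:
            merged[-1][1] = end   # extend the current merged group
        else:
            merged.append([start, end])
    return [tuple(m) for m in merged]

times = [(0, 1), (1.5, 2), (10, 11), (11.2, 12)]
merged = merge_adjacent_groups(times, split=5)
```

Here the two closely spaced evening groups collapse into one, while groups separated by a long gap stay apart.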

According to various embodiments, when the at least one image included in the generated group is a plurality of images, the controller (e.g., the processor 120) may classify the plurality of images by images that include the same person to generate at least one group for each person, and, if the number of images included in the at least one group is equal to or greater than a specified number, create at least one new image group using the images included in the at least one group.

According to various embodiments, the controller (e.g., the processor 120) may use at least one of face recognition technology or name tag information to classify the plurality of images by images that include the same person.
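Person-based grouping with a minimum-count filter, as described in the two paragraphs above, can be sketched like this. The `people` lists stand in for the output of face recognition or name tagging; the field names and the threshold of 2 are assumptions.

```python
def group_by_person(images, min_count=2):
    """Cluster image ids by tagged person, keeping only clusters that
    contain at least `min_count` images."""
    clusters = {}
    for img in images:
        for person in img.get("people", []):
            clusters.setdefault(person, []).append(img["id"])
    return {p: ids for p, ids in clusters.items() if len(ids) >= min_count}

images = [
    {"id": 1, "people": ["Alice"]},
    {"id": 2, "people": ["Alice", "Bob"]},
    {"id": 3, "people": ["Bob"]},
    {"id": 4, "people": ["Carol"]},  # only one photo: below the threshold
]
result = group_by_person(images, min_count=2)
```

An image showing two people naturally lands in both clusters, matching the idea that each person gets their own image group.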

According to various embodiments, the electronic device further includes a display (e.g., the display 160), and the controller (e.g., the processor 120) may display, on the display, a screen for setting a name tag for the person corresponding to the at least one group if the number of images included in the at least one group is equal to or greater than a specified number.

According to various embodiments, when the at least one image included in the generated group is a plurality of images, the controller (e.g., the processor 120) may classify the plurality of images by images having corresponding image type information to generate at least one group, and, if the number of images included in the at least one group is equal to or greater than a designated number, create at least one new image group using the images included in the at least one group.

According to various embodiments, when the at least one image included in the generated group is a plurality of images, the controller (e.g., the processor 120) may classify the plurality of images by images including corresponding position information to generate at least one group for each position, and, if the number of images included in a first group generated according to first position information is equal to or greater than a designated number, create at least one new image group using the images included in the first group.

According to various embodiments, when the number of images included in a second group generated according to second position information is equal to or greater than the designated number, the controller (e.g., the processor 120) may also create the at least one new image group using the images included in the second group.

According to various embodiments, when there is a third group of images generated according to the first position information and corresponding to a time after the time corresponding to the first group and before the time corresponding to the second group, the controller (e.g., the processor 120) may create a new image group using the images included in the first group, the second group, and the third group.

According to various embodiments, the controller (e.g., the processor 120) may further generate an animation graphic interchange format (AGIF) file from the images included in the first image group including the at least one image.

According to various embodiments, the controller (e.g., the processor 120) may divide the first image group including the at least one image into at least two image groups using at least one piece of image information of the images included in the first image group.

According to various embodiments, the controller (e.g., the processor 120) may generate one new image group using one or more of the plurality of image groups.

According to various embodiments, an electronic device (e.g., the electronic device 101) may include a memory (e.g., the memory 130) storing a first image and a second image forming a first image group, and a third image and a fourth image forming a second image group, and a processor (e.g., the processor 120), wherein the processor is configured to: identify first attribute information corresponding to the first image group; identify second attribute information corresponding to the second image group; create, based on at least the first attribute information and the second attribute information, a third image group including at least one of the first image and the second image and at least one of the third image and the fourth image; and associate and display, through a display functionally connected with the processor, the images forming the third image group.

According to various embodiments, when the third image group including the first image, the second image, the third image, and the fourth image is created, the processor (e.g., the processor 120) may generate the title of the third image group using the related information of each of the first image, the second image, the third image, and the fourth image.

According to various embodiments, when the third image group includes the first image or the second image, and the third image and the fourth image, the processor (e.g., the processor 120) may generate the title of the third image group using information related to the first image or the second image included in the third image group and information related to each of the third image and the fourth image; and when the third image group includes the first image and the second image, and the third image or the fourth image, the processor (e.g., the processor 120) may generate the title of the third image group using the image information of the first image and the second image and the image information of the third image or the fourth image included in the third image group.

According to various embodiments, the first attribute information includes first location information associated with the first image or the second image, and the second attribute information includes second location information associated with the third image or the fourth image, and the processor (e.g., the processor 120) is configured to compare the first location information with the second location information, and to perform the generation of the third image group if the result of the comparison satisfies a specified condition.

According to various embodiments, the first attribute information includes first time information associated with the first image or the second image, and the second attribute information includes second time information associated with the third image or the fourth image, and the processor (e.g., the processor 120) is configured to compare the first time information with the second time information, and to perform the generation of the third image group if the result of the comparison satisfies a specified condition.

According to various embodiments, the first attribute information includes first tag information associated with the first image or the second image, and the second attribute information includes second tag information associated with the third image or the fourth image, and the processor (e.g., the processor 120) is configured to compare the first tag information with the second tag information, and to perform the generation of the third image group if the result of the comparison satisfies a specified condition.

According to various embodiments, the first attribute information includes first object information associated with the first image or the second image, and the second attribute information includes second object information associated with the third image or the fourth image, and the processor (e.g., the processor 120) is configured to compare the first object information with the second object information, and to perform the generation of the third image group if the result of the comparison satisfies a specified condition.

According to various embodiments, the first attribute information may include at least one of first location information, first time information, first tag information, and first object information related to the first image or the second image, and the second attribute information may include at least one of second location information, second time information, second tag information, and second object information related to the third image or the fourth image.

According to various embodiments, the processor (e.g., the processor 120) may be configured such that, when the electronic device acquires a fifth image and the acquired fifth image corresponds to third attribute information corresponding to the third image group, the fifth image is included in the third image group.

According to various embodiments, the processor (e.g., the processor 120) may further change the title of the third image group using related information of each of the images included in the third image group.

According to various embodiments, the processor (e.g., the processor 120) may, when at least two pieces of information associated with the first image included in the third image group correspond to at least two pieces of information associated with the third image, perform generation of a fourth image group including the first image and the third image.

According to various embodiments, the processor (e.g., the processor 120) may generate an animation graphic interchange format (AGIF) file using at least one of the first image and the second image and at least one of the third image and the fourth image included in the third image group.

According to various embodiments, the processor (e.g., the processor 120) may be set to display the images forming the third image group in association with each other in a Metro user-interface form, a card form in which at least some of the images are overlaid, an album form, or a list form.

The album form may be configured to display the images forming the third image group in various arrangements and sizes. For example, the size of each of the images forming the third image group may be set differently. The album form may include text corresponding to the images forming the third image group. The list form may display the images arranged in a line in a specified direction.

The operation of associating and displaying the images forming the third image group may be performed in various ways, in addition to the examples described above, that indicate that the images forming the third image group are one image group.

FIG. 6 is a flow diagram of an image group creation operation of an electronic device (e.g., the electronic device 101) according to various embodiments.

In operation 610, the electronic device may acquire a plurality of images through photographing, downloading images, receiving images, and the like. The image may be various, such as a photograph, a video, and the like.

In operation 620, the electronic device may analyze the acquired images using image analysis techniques.

In operation 630, the electronic device may execute the above-described DCM function (e.g., the function of the first service 541). For example, the electronic device can extract and manage the tag information of the analyzed images according to the analysis result of operation 620. The tag information may include at least one of location information, time information, image type information, or user input information.

In operation 640, the electronic device may execute the above-described object recognition function (e.g., the function of the third service 545). For example, the electronic device may perform an operation of recognizing an object by searching for feature points of the object, such as face detection and recognition-related operations, in the obtained images.

In operation 650, the electronic device may execute the image processing function described above (e.g., the function of the fourth service 547). For example, the electronic device can use the tag information to perform, on the images, category classification (classification by type), color extraction, quality checking, and duplicate or similar image detection according to specified conditions.

In operation 660, the electronic device may execute the Story function described above (e.g., the function of the second service 543). For example, the electronic device may create an image group and/or update an existing image group using the object recognition result of operation 640 and/or the category classification, color extraction, quality check, and duplicate or similar image detection results of operation 650. Also, when creating the image group, a title of the image group can be generated. In addition, upon updating the existing image group, the title of the previously created image group may be updated. Operation 660 will be described in detail later.

In operation 670, the electronic device may execute the above-described Enhance function (e.g., the function of the fifth service 549). For example, the electronic device may generate an AGIF file of the images included in the generated image group, and may also generate an AGIF file of the images included in an updated image group. For example, the electronic device may provide a short moving image generated using some of the images included in the image group through a Highlight Video (e.g., the Highlight Video 511). For example, the electronic device may include in the group of images, together with an image, the sound information obtained with the image at the time of photographing, or may acquire sound information relating to the images contained in the group of images and provide the acquired sound information together with the images, thereby allowing the user to remember the pictures more meaningfully.

FIG. 7 is a flowchart of an image group creation operation of an electronic device (e.g., electronic device 101) according to various embodiments.

In operation 705, the electronic device may analyze each of the plurality of images. For example, the electronic device can identify tag information of each of a plurality of images. The tag information may include at least one of location information, time information, image type information, or user input information, for example.

In operation 710, the electronic device may use the result of the analysis to cluster the plurality of images based on location information. Clustering refers collectively to techniques for grouping data (e.g., images) into groups based on concepts such as similarity.

For example, the electronic device may classify the plurality of images according to the position information of each image. For example, the electronic device may generate at least one group by grouping together images that include the same and/or similar position information.
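As an illustration of the grouping in operation 710, a minimal sketch is given below. The dictionary fields, the rounding-based notion of "same and/or similar position", and the two-decimal precision (roughly 1 km) are assumptions of the sketch, not details of the disclosed embodiment.

```python
from collections import defaultdict

def cluster_by_location(images, precision=2):
    """Group images whose GPS coordinates round to the same grid cell.

    `images` is a list of dicts carrying 'lat'/'lon' tag information;
    rounding to `precision` decimal places stands in for grouping by
    "same and/or similar position information".
    """
    groups = defaultdict(list)
    for img in images:
        key = (round(img["lat"], precision), round(img["lon"], precision))
        groups[key].append(img)
    return list(groups.values())

photos = [
    {"name": "a.jpg", "lat": 37.5796, "lon": 126.9770},  # Gyeongbokgung
    {"name": "b.jpg", "lat": 37.5801, "lon": 126.9768},  # nearby
    {"name": "c.jpg", "lat": 35.1796, "lon": 129.0756},  # far away
]
groups = cluster_by_location(photos)
```

A real implementation would more likely use distance-threshold clustering (merging points within a radius), but grid rounding keeps the example short.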

The positional information may be information about the photographing position obtained by the electronic device when the GPS function of the electronic device is activated at the time of taking a photograph. Alternatively, the location information may be obtained through image analysis; for example, if the image includes a specific building such as a palace, the electronic device can determine through the image analysis of operation 705 that the image includes Gyeongbokgung, and can determine Gyeongbokgung as the location information. Alternatively, if the GPS function of the electronic device is off, or the electronic device cannot acquire the location information for some other reason, and photographing has been performed at a location stored in the electronic device, previously obtained position information (or position information similar to the previously obtained position information) may be determined as the position information of the photographed photograph. Alternatively, when the electronic device stores a user profile (e.g., schedule information), the location information of the user can be determined based on the schedule, for example when the user arrives at a similar time on average. Alternatively, when the electronic device accesses a specific access point (AP), the location information may be confirmed based on a signal of the specific AP (e.g., location information included in a beacon signal).

In operation 715, the electronic device may determine whether the group generated according to the position information based clustering satisfies a first condition (also referred to as a first image group generation condition). For example, the first condition may be that the number of images included in the generated group is equal to or greater than a specified number. For example, the first condition may be that two or more groups are created for the same position. For example, the first condition may be that a specific object or person is photographed. For example, the first condition may be that the user is located at a designated place; if the user is located at the designated place, the first condition can be satisfied even when the number of images is less than the specified number. For example, the first condition may be that the schedule information of the user is confirmed and the images match the schedule; for example, when an office meeting at 2:00 on Saturday is stored in the electronic device as a schedule, the actual photographing location may correspond to the location of the schedule.

In operation 715, if the electronic device determines that the group generated according to the location-based clustering meets the specified first condition, then the operation 720 may be performed, otherwise the operation of the present invention may be terminated.

In operation 720, the electronic device may generate a location information based image group using the images included in the group generated according to the location information based clustering. For example, the electronic device may designate the group generated by the location information based clustering as a location information based image group.

In operation 730, the electronic device may cluster the plurality of images based on time information, using the analysis result of operation 705.

For example, the electronic device confirms the time information of each of the plurality of images and classifies the plurality of images by predetermined time zones according to date, thereby generating at least one group (hereinafter referred to as a moment). A detailed description of this group generating operation will be given in the description of FIG. 8 below.

For example, the time information may be the photographing time of an image (or the time when a photograph was captured and stored), or the time when the image was otherwise acquired by the electronic device.

In operation 735, the electronic device may determine whether the group generated according to the time information based clustering satisfies a specified second condition (also referred to as a second image group generation condition). For example, the second condition may be that the number of images included in the group generated according to the time information based clustering is equal to or greater than a specified number. For example, the second condition may be that the time interval with at least one immediately adjacent group generated according to the time information based clustering is less than a predetermined time interval. For example, the second condition may be that a specific object or person is photographed. For example, the second condition may be that the user is located at a designated place; if the user is located at the designated place, the second condition may be satisfied even if the number of images is less than the specified number. For example, the second condition may be that the schedule information of the user is confirmed and the images match the schedule; for example, when an office meeting at 2:00 on Saturday is stored in the electronic device as a schedule, the actual photographing location may correspond to the location of the schedule.

In operation 735, if the electronic device determines that the group generated according to the temporal information based clustering meets the second condition specified, then the operation 740 may be performed, otherwise the operation of the present invention may be terminated.

In operation 740, the electronic device may generate a time information based image group using the images included in the group generated according to the time information based clustering. For example, the electronic device can designate the group generated according to the time information based clustering as a time information based image group.

In operation 750, the electronic device may cluster the plurality of images based on image type information (also referred to as image analysis information), using the analysis results.

For example, the image type information may be information indicating what is included in the image (i.e., what the image is of), such as information indicating that the image includes a specific person, food, or the like. For example, the image type information may be information capable of identifying feature points of an object included in the image and thereby identifying the object.

For example, the image type information may be information capable of identifying a person included in the image using a face recognition technique.

The electronic device may generate at least one group by grouping images that include the same and/or similar image type information, according to the image type information of each of the plurality of images. For example, if the image type information of some of the plurality of images includes information such as food, those images may be generated as one group for food.

In operation 755, the electronic device may determine whether the group generated according to the image type information based clustering satisfies a third condition (also referred to as a third image group generation condition). For example, the minimum number of images for generating an image group may be preset for each kind of image type information. Accordingly, the third condition may be that the number of images included in the generated group is equal to or greater than a specified number (the predetermined minimum number of images for the image type information corresponding to the generated group). For example, the third condition may be that a specific object or person is photographed. For example, the third condition may be that the user is located at a designated place; if the user is located at the designated place, the third condition can be satisfied even when the number of images is less than the specified number. For example, the third condition may be that the schedule information of the user is confirmed and the images match the schedule; for example, when an office meeting at 2:00 on Saturday is stored in the electronic device as a schedule, the actual photographing location may correspond to the location of the schedule.

In operation 755, if the electronic device determines that the group generated according to the image analysis information-based clustering satisfies the third condition, it can execute operation 760, otherwise the operation of the present invention can be terminated.

In operation 760, the electronic device may use the images contained in the group generated according to the image type information-based clustering to generate an image type information based image group. For example, the electronic device may generate a group generated according to the image type information-based clustering into an image type information based image group.

In operation 770, the electronic device may generate a title of the generated image group according to the specified conditions. The title generation of the image group will be described in detail later.

FIG. 8 is a flow diagram of a time information based image group creation operation of an electronic device (e.g., electronic device 101) according to various embodiments.

Referring to FIG. 8, the operation of generating an image group based on time information will be described taking a photograph as an example of an image, but the present invention can also be applied to various contents such as moving images in addition to still images.

The electronic device can confirm the daily image acquisition times and classify time intervals in which a plurality of images were acquired in nearby time zones as unit moments. For example, if three or more pictures per hour are taken or stored on average in a day, one or more unit moments may be generated. The electronic device can generate one larger group (hereinafter referred to as a moment) including the unit moments by connecting unit moments that are close in time.

In operation 810, the electronic device may generate moments (also referred to as groups) by classifying the plurality of images using time information of a plurality of images.

The plurality of images may be obtained in various manners, such as photographing, image copying, image capturing or downloading, image reception through file transfer, and sharing through the cloud.

The unit moment is for sorting images based on time information, and may be a time interval classified according to a specified condition.

The electronic device may generate moments according to the following operations.

The electronic device can check the time periods during which photographs are taken, copied, or downloaded more often than the daily average, and can classify a set of photographs in a nearby time zone as a significant moment. An image group can then be generated by connecting moments that are close in time around the significant moment. For example, when three or more pictures per hour are taken or stored on average in a day, one or more significant moments can be identified.

The electronic device can confirm the number of images acquired in each predetermined time interval on a daily basis. According to various embodiments, the electronic device can identify time intervals (referred to as peak time intervals) having a number of images greater than the average number of images per unit moment among the time intervals of the same date. There may be one or a plurality of peak time intervals; a plurality of peak time intervals will be described below as an example. For each peak time interval, the electronic device can confirm the nearest previous time interval and the nearest subsequent time interval in which the number of acquired images is a designated number (e.g., 0). The electronic device can then determine, for each peak time interval, the span from the nearest previous time interval in which the number of acquired images is the designated number to the nearest subsequent time interval in which the number of acquired images is the designated number as one moment.
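The peak-interval segmentation just described can be sketched with hourly buckets. The one-hour bucket size, the designated count of 0, and the merging of overlapping spans are assumptions of the sketch, not details of the disclosed embodiment.

```python
def find_moments(counts):
    """Segment one day of hourly image counts into moments.

    counts[h] is the number of images acquired in hour h. An hour whose
    count exceeds the day's per-hour average is treated as a peak time
    interval; each peak is extended left and right to the nearest hour
    whose count equals the designated number (0 here), and overlapping
    spans are merged into a single moment.
    """
    avg = sum(counts) / len(counts)
    spans = []
    for h, c in enumerate(counts):
        if c > avg:
            lo = h
            while lo > 0 and counts[lo - 1] > 0:
                lo -= 1
            hi = h
            while hi < len(counts) - 1 and counts[hi + 1] > 0:
                hi += 1
            if spans and lo <= spans[-1][1]:
                # overlaps the previous span: merge into one moment
                spans[-1] = (spans[-1][0], max(spans[-1][1], hi))
            else:
                spans.append((lo, hi))
    return spans

# 24 hourly counts with bursts around hours 9-11 and 19-20
counts = [0] * 9 + [4, 6, 2] + [0] * 7 + [5, 3] + [0] * 3
moments = find_moments(counts)
```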

In operation 820, the electronic device computes, for each moment, the time intervals to the preceding and following moments (the moments in the nearest time periods before and after it), and classifies the two intervals into a short time interval and a long time interval.

For example, as shown in FIG. 9, when the moments are arranged in a line in time sequence over several days, the time intervals of the moments with respect to each moment can be calculated.

Referring to FIG. 9, the electronic device can calculate the time interval between the second moment 902 and the first moment 901, which is the moment immediately before it, as 14 hours 58 minutes, and the time interval between the second moment 902 and the third moment 903 immediately after it as 2 hours 33 minutes. The electronic device may likewise calculate the time intervals to the preceding and following moments for each of the remaining moments (e.g., 901, 903, 904, ...) other than the second moment 902.

At this time, the short time interval may be denoted T1 and the long time interval T2, where T1 ≤ T2. For example, the minimum of the value obtained by subtracting the last time of the (n-1)-th moment from the first time of the n-th moment and the value obtained by subtracting the last time of the n-th moment from the first time of the (n+1)-th moment may be set to T1. Likewise, the maximum of those two values may be set to T2.

T1 and T2 can be calculated by the following equation (1).

[Equation 1]

T1 = Min{Ns - (N-1)e, (N+1)s - Ne}

T2 = Max{Ns - (N-1)e, (N+1)s - Ne}

(where Ns is the first (start) time of the n-th moment, Ne is the last (end) time of the n-th moment, (N-1)e is the last time of the (n-1)-th moment, and (N+1)s is the first time of the (n+1)-th moment)
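Written as code, Equation 1 becomes a two-line computation; representing a moment by its first and last timestamps is an assumption of this sketch.

```python
def short_long_intervals(prev_end, start, end, next_start):
    """Equation 1: T1/T2 for the n-th moment are the smaller and larger
    of the gap to the (n-1)-th moment's end and the gap to the
    (n+1)-th moment's start."""
    gap_before = start - prev_end    # Ns - (N-1)e
    gap_after = next_start - end     # (N+1)s - Ne
    return min(gap_before, gap_after), max(gap_before, gap_after)

# times in minutes: previous moment ends at 0; this moment spans
# 898..1051 (14 h 58 min after the previous end); the next moment
# starts at 1204 (2 h 33 min after this one ends), as in FIG. 9
t1, t2 = short_long_intervals(0, 898, 1051, 1204)
```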

In operation 830, the electronic device may calculate an average time interval of the short time intervals separated by moments and an average time interval of the long time intervals separated by moments, respectively.

For example, the electronic device can generate a time interval table of the short and long time intervals calculated for each moment with respect to its preceding and following moments, as shown in FIG. 10. For example, the electronic device can specify that the maximum value of T1 does not exceed 24 hours, and can limit the maximum value of T2 to one week. For example, when the short time interval of the first moment 901 of FIG. 9 is 14 hours 58 minutes, the short time interval of the second moment 902 is 2 hours 33 minutes, and the short time interval of the third moment 903 is 2 hours 33 minutes, a short time interval table can be generated as shown in FIG. 10. When the long time interval of the first moment 901 of FIG. 9 is 7 days 1 hour 56 minutes, the long time interval of the second moment 902 is 14 hours 58 minutes, and the long time interval of the third moment 903 is 12 hours 2 minutes, a long time interval table can be generated as shown in FIG. 10. The electronic device then calculates the average of the short time intervals in the short time interval table as 11 hours 24 minutes 20 seconds, and the average of the long time intervals in the long time interval table as 2 days 12 hours 43 minutes. Time intervals exceeding the specified maximum values of T1 and T2 may be excluded when calculating the averages of the short and long time intervals.

In operation 840, the electronic device may determine whether the image information of the plurality of images corresponds to the specified travel information. For example, it is possible to determine whether the plurality of images are photographed images while traveling.

For example, the designated travel information may be when the location information included in the image information is overseas. For example, when the location information included in the image information of the plurality of images is overseas, it can be determined that the image information of the plurality of images corresponds to the designated travel information. When the location information included in the image information of the plurality of images is domestic, it can be determined that the image information of the plurality of images does not correspond to the designated travel information.

For example, the designated travel information may be a case where the distance from the user's home location stored in the electronic device is equal to or greater than a specified distance. For example, when the location information included in the image information of the plurality of images corresponds to a distance equal to or greater than the specified distance from the user's home location, it can be determined that the image information of the plurality of images corresponds to the designated travel information. When the location information corresponds to a distance smaller than the specified distance from the user's home location, it can be determined that the image information of the plurality of images does not correspond to the designated travel information.

For example, the designated travel information may be a case where event information of a schedule management application, which is installed in the electronic device and registers and manages the schedule of the user, is travel information. For example, when a trip for a specified period is recorded as event information of the schedule management application and the time information of the plurality of images falls within the specified period recorded as the trip, it can be determined that the image information of the plurality of images corresponds to the designated travel information; otherwise, it can be determined that the image information of the plurality of images does not correspond to the designated travel information.
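The three examples above (overseas location, distance from home, schedule events recorded as trips) can be combined into one predicate. Every field name, the 100 km threshold, and the flat-earth distance approximation are assumptions of the sketch, not details of the disclosed embodiment.

```python
import math

def is_travel(images, home, schedule_events, threshold_km=100.0):
    """True if the image information corresponds to the designated
    travel information: an image taken abroad, taken farther than
    threshold_km from the stored home location, or taken during a
    schedule event recorded as a trip."""
    def distance_km(a, b):
        # equirectangular approximation; adequate for a coarse check
        x = math.radians(b[1] - a[1]) * math.cos(math.radians((a[0] + b[0]) / 2))
        y = math.radians(b[0] - a[0])
        return math.hypot(x, y) * 6371.0

    for img in images:
        if img["country"] != home["country"]:
            return True
        if distance_km((img["lat"], img["lon"]),
                       (home["lat"], home["lon"])) >= threshold_km:
            return True
        if any(ev["type"] == "travel" and ev["start"] <= img["time"] <= ev["end"]
               for ev in schedule_events):
            return True
    return False
```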

In operation 840, the electronic device may execute operation 850 if it determines that the image information of the plurality of images corresponds to the designated travel information; otherwise, it may execute operation 860.

In operation 850, the electronic device may generate at least one large-range moment by collecting moments whose time interval to the immediately adjacent moment is less than or equal to the average of the long time intervals, thereby creating at least one image group. For example, the electronic device may generate the images included in the generated large-range moment as one image group. Referring to FIGS. 9 and 10, when the average of the long time intervals is 2 days 12 hours 43 minutes, the time interval between the first moment 901 and the second moment 902 in FIG. 9 is 14 hours 58 minutes, the time interval between the second moment 902 and the third moment 903 is 2 hours 33 minutes, and the time interval between the third moment 903 and the fourth moment 904 is 23 hours 2 minutes, so the first moment 901 through the fourth moment 904 can be collected into one large-range moment, and the images included in the first moment 901 through the fourth moment 904 can be generated as one image group.

In operation 860, the electronic device may generate at least one large-range moment by collecting moments whose time interval to the immediately adjacent moment is not more than the average of the short time intervals, thereby creating at least one image group. For example, the electronic device may generate the images included in the generated large-range moment as one image group. Referring to FIGS. 9 and 10, when the average of the short time intervals is 11 hours 24 minutes 20 seconds, the time interval between the second moment 902 and the immediately adjacent third moment 903 of FIG. 9 is 2 hours 33 minutes, so a large-range moment including the second moment 902 and the third moment 903 can be generated, and the images included in the second moment 902 and the third moment 903 may be created as one image group.
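Operations 850 and 860 differ only in the threshold applied to the gaps between adjacent moments, so a single routine can serve both branches; the moment labels and minute-valued gaps below are illustrative.

```python
def merge_moments(moments, gaps, threshold):
    """Collect consecutive moments into large-range moments: moment i+1
    joins the current range when gaps[i] (the interval between moments
    i and i+1) does not exceed the threshold."""
    ranges = [[moments[0]]]
    for m, gap in zip(moments[1:], gaps):
        if gap <= threshold:
            ranges[-1].append(m)
        else:
            ranges.append([m])
    return ranges

moments = ["m901", "m902", "m903", "m904"]
gaps = [898, 153, 1382]                      # 14 h 58 m, 2 h 33 m, 23 h 2 m
travel = merge_moments(moments, gaps, 3643)  # avg long interval, 2 d 12 h 43 m
daily = merge_moments(moments, gaps, 684)    # avg short interval, ~11 h 24 m
```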

According to various embodiments, in order to minimize the amount of computation of an electronic device (e.g., electronic device 101) during the time information based image group generation operation, the electronic device may operate according to the following criteria.

The generation time of moments can be set in advance for a specified time (for example, 5 o'clock every day, 5 o'clock every two days, 5 o'clock at one-week intervals, etc.). For example, for photographed images, as opposed to images added by download or MTP (media transfer protocol) copy, the electronic device can be preset to generate moments only at the designated time.

The short and long time intervals of the moments depend on the images, which are user data, but the generation of the image group depends on the computed average of the short time intervals and the average of the long time intervals of the moments. Accordingly, the electronic device can store the averages once they are calculated, instead of recomputing the short and long time intervals of all moments from the beginning, and can calculate the new average short time interval and the new average long time interval using the stored values as follows.

[Equation 2]

Average of new short time intervals = (Average of existing short time intervals × Number of existing moments + New short time interval 1 + New short time interval 2 + ... + New short time interval N) / (Number of existing moments + Number of new moments)

[Equation 3]

Average of new long time intervals = (Average of existing long time intervals × Number of existing moments + New long time interval 1 + New long time interval 2 + ... + New long time interval N) / (Number of existing moments + Number of new moments)
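Equations 2 and 3 are the standard incremental-mean update, so one helper covers both; the sample numbers are illustrative.

```python
def updated_average(old_avg, old_count, new_intervals):
    """Fold newly measured intervals into a stored average without
    revisiting the intervals of the existing moments (Equations 2, 3)."""
    total = old_avg * old_count + sum(new_intervals)
    return total / (old_count + len(new_intervals))

# stored average over 4 existing moments is 600 min; two new moments
# contribute intervals of 120 and 480 min
new_avg = updated_average(600.0, 4, [120.0, 480.0])
```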

The electronic device need not take the dynamic parameters (peak time interval, short time interval, long time interval, etc.) into account when updating an image group after image group creation, such as when adding at least one image to the generated image group. For example, when the electronic device acquires at least one image, if the acquired image satisfies the image group generation condition of a specific previously generated image group, the acquired image is added to the specific image group, and the sorting rule can be applied to update the image group by excluding duplicate or similar images. For example, when the electronic device deletes one or more images and, as a result, the image group that contained an image no longer meets the specified minimum number of images (e.g., 7), the electronic device may delete the image group that contained the image.

The electronic device can perform the image group generation operation on images that have already been clustered by using a service flag indicating that the images have been analyzed and processed, excluding images that have been processed once. However, even if an image group was deleted because the minimum-image-count condition was not satisfied, the image group may be newly created if images are later added by MTP transfer or the like and the criterion is satisfied.

According to various embodiments, an electronic device (e.g., electronic device 101) may analyze a plurality of images to generate a group of images based on image analysis results and/or image tag information. When the image type information of the analysis result is a person, the electronic device can create one image group including the images estimated to contain the same person, when there are more than a specified number of such images among the plurality of images. In addition, the electronic device may generate one image group including pictures of a person tagged with the same name, if there are more than a specified number (e.g., 10) of pictures of the person tagged with the same name.

If the number of images estimated to be of the same person is equal to or greater than a designated number (e.g., 15), the electronic device can display a screen prompting the user to tag names for the images, as shown in FIG. 11. Referring to FIG. 11, the electronic device may display a screen 1103 asking the user to tag names for the images and generate a group, together with the images 1101 estimated to be of the same person. Depending on the user input, the electronic device may tag names for the photographs and may also create the photographs as one image group.

According to various embodiments, an electronic device (e.g., electronic device 101) may use the tag information of each of the images to generate a group of images. For example, the electronic device can automatically generate an image group when the number of pictures designated by the tag information is satisfied, according to the priority of the tag information, as shown in Table 1 below.

<Table 1>
Priority (minimum number of images) | Tags
First priority (15 images) | User input information, Name, Documents, Food, Fashion
Second priority (50 images) | Coast, Forest, Mountain, Night scene, Street, Vehicles, Bike, Car, Airplane, Boat, Train
Third priority (100 images) | Group, Scenery, City Overlook, Building, Open Country, Snow, Waterfall, Pet

In consideration of the importance of the image group, the electronic device can designate the tag information as first priority, second priority, and third priority in advance, as shown in Table 1, and can specify in advance the minimum number of images that each priority must include. The tag information included in the first priority has a higher priority than the tag information included in the second priority, and the tag information included in the second priority has a higher priority than the tag information included in the third priority.

For example, as shown in Table 1, when the first priority includes Name and the minimum image count of the first priority is designated as 15, if the electronic device confirms that there are 15 images having the same name tag, the 15 images having the same name tag may be generated as one image group.

In addition, when one image includes a plurality of pieces of tag information, the electronic device checks whether an image group can be generated according to the image group generation condition using the tag information with the higher priority, and can generate the image group accordingly. If an image group cannot be generated according to the image group generation condition using the higher-priority tag information, the electronic device can check whether image group creation is possible according to the image group generation condition using the lower-priority tag information.
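The priority-ordered check can be sketched as follows; the tier membership sets are abbreviated from Table 1, and the rule that an image is consumed by the first tier whose minimum is met is an assumption about how multi-tag images are resolved.

```python
# (tags in tier, minimum image count) per Table 1, abbreviated
PRIORITY_RULES = [
    ({"Name", "Documents", "Food", "Fashion"}, 15),
    ({"Coast", "Forest", "Mountain", "Street"}, 50),
    ({"Group", "Scenery", "Building", "Pet"}, 100),
]

def groups_from_tags(images):
    """Create one image group per tag, trying higher-priority tiers
    first; images placed in a group are excluded from lower tiers."""
    groups = {}
    remaining = list(images)
    for tier_tags, minimum in PRIORITY_RULES:
        by_tag = {}
        for img in remaining:
            for tag in img["tags"]:
                if tag in tier_tags:
                    by_tag.setdefault(tag, []).append(img)
        for tag, members in by_tag.items():
            if len(members) >= minimum:
                groups[tag] = members
                remaining = [i for i in remaining if i not in members]
    return groups
```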

According to various embodiments, an electronic device (e.g., electronic device 101) may analyze a plurality of images to generate a group of images based on the location information of the plurality of images.

For example, when the number of photographs taken at the same place is equal to or greater than a specified number, the electronic device can generate a photograph group of the photographs taken at the same place.

For example, when there are photographs taken at the same place on a designated number of days or more, and at the same time the photographs taken per day satisfy a specified condition, the electronic device can generate the photographs taken at the same place as one image group. The designated days may include immediately consecutive dates, such as May 1 and May 2, and/or non-consecutive dates separated by some interval, such as May 1 and June 3.

For example, suppose the designated number of days is two and the specified condition is that the number of photographs at the same place is a designated number, for example, 15 or more. Under such conditions, when a user takes 20 pictures at night in Apgujeong-dong using the electronic device on May 15, the electronic device can store the tag information of the photographs, according to automatic and/or user input, as May 15, night, Apgujeong-dong. In this case, the specified condition is satisfied, but the electronic device confirms that the designated number of days is not satisfied and does not generate an image group for Apgujeong-dong.

Thereafter, when the user takes three photographs in Apgujeong-dong using the electronic device on June 8, the electronic device can store the tag information of each photograph, according to automatic and/or user input, as June 8, Apgujeong-dong. In this case, the designated number of days is satisfied, but the specified condition is not, and the electronic device does not generate an image group for Apgujeong-dong.

Then, when the user takes 17 photographs in the afternoon in Apgujeong-dong using the electronic device on June 22, the electronic device can store the tag information of each photograph, according to automatic and/or user input, as June 22, afternoon, Apgujeong-dong. In this case, the electronic device confirms that both the designated number of days and the specified condition are satisfied, that is, confirms that the specified condition is satisfied on June 22, the second date after May 15, and can create an image group that includes all the photographs taken in Apgujeong-dong on May 15 and June 22.
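The Apgujeong-dong example follows a simple rule: a group is created once the per-day minimum is met on the required number of days at the same place. A sketch (the date strings, thresholds, and function name are illustrative):

```python
def qualifying_days(day_counts, required_days=2, min_per_day=15):
    """Return the dates at the same place that satisfy the per-day
    minimum, or an empty list if too few days qualify to form a group."""
    days = [d for d, n in sorted(day_counts.items()) if n >= min_per_day]
    return days if len(days) >= required_days else []

# per-day photo counts at the same place, as in the example
apgujeong = {"05-15": 20, "06-08": 3, "06-22": 17}
days = qualifying_days(apgujeong)   # the group is created on the second date
```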

On the other hand, if there are also photographs taken at the same place on at least one date that does not satisfy the specified condition, between the dates that do satisfy the specified condition, the electronic device may include those photographs as well, creating one image group containing all of them. For example, an image group including all the photographs taken in Apgujeong-dong on May 15, June 8, and June 22 may be created.

According to various embodiments, an image group for a plurality of images can be generated by combining two or more of time information, image analysis results, tag information, or position information.

According to various embodiments, an electronic device (e.g., electronic device 101) may automatically generate the title of a created or updated image group. For example, the electronic device can dynamically generate the title of an image group using the tag information of the images included in the image group. For example, the electronic device can automatically generate a title using various combinations such as location + time, POI + time, time + image analysis information (situation information), person + place, travel, weather, or holiday.

Tables 2 and 3 show an example of a title generation rule of an image group.

<Table 2>
Time
- Date slot: generates month + week + day of the week, for example, "the second Monday of March"
- Time-of-day slot: generates morning, afternoon, evening, night, or dawn
- National holiday: generates, for example, "Liberation Day"
(skip)

<Table 3>
Formulas (only one formula is applied, in the order A, B, C)
- Formula A: applied when at least one face-tagged image is included
  - If there is only one person, generates "with Jinhyung"
  - If there are two people, generates "with" followed by the two tagged names
  - If there are three or more people, generates "with Jinhyung and 2 others"
- Formula B: applied when the following tag information is contained in 30% or more of the images
  - Street: generates "on the street"
  - Scenery: generates "outdoors"
  - Coast: generates "at the beach"
  - Mountain: generates "in the mountains"
- Formula C: applied when the following tag information is included in at least one image
  - Group image (more than 5 faces detected): generates "special"
  - Party shot (for example, a wedding party): generates "fun"
  - Night time information together with night-view tag information in the image: generates "romantic"

Using the title generation rules of Tables 2 and 3, the generated expressions can be arranged in the order of date + (formula C) + time zone + (formula A or formula B); only one of formulas A, B, and C may be applied according to the designated priority, or none of formulas A to C may be applied.

If the electronic device determines that the date is "the second Monday of March", formula C is "romantic", the time zone is "evening", formula A is "with Jinhyung", and formula B is "at the beach", at least one of the following image group titles may be generated.

1) Applying formula A: With Jinhyung on the evening of the second Monday of March (generated as "2nd week of March Sunday Evening with Jinhyung" when the electronic device is set to support English)

2) Applying formula B: At the beach on the evening of the second Monday of March (generated as "2nd week of March Sunday Evening at the beach" when the electronic device is set to support English)

3) Applying formula C: A romantic evening on the second Monday of March (generated as "2nd week of March Sunday Romantic Night" when the electronic device is set to support English)

4) No formula applied: The evening of the second Monday of March (generated as "2nd week of March Sunday Night" when the electronic device is set to support English)

If the electronic device determines that the date is "Christmas", formula C is "special", the time zone is "evening", formula A is "with Jinhyung", and formula B is "at the beach", it can generate at least one of the following image group titles.

1) Applying formula A: With Jinhyung on Christmas evening (generated as "Christmas Evening with Jinhyung" when the electronic device is set to support English)

2) Applying formula B: At the beach on Christmas evening (generated as "Christmas Evening at the beach" when the electronic device is set to support English)

3) Applying formula C: A special Christmas evening (generated as "Christmas Special Evening" when the electronic device is set to support English)

4) No formula applied: Christmas evening (generated as "Christmas Evening" when the electronic device is set to support English)

If the electronic device determines that the date is "Christmas", there is no formula C, there is no time zone, formula A is "with Jinhyung", and formula B is "at the beach", it can generate at least one of the following image group titles.

1) Applying formula A: With Jinhyung on Christmas (generated as "Christmas with Jinhyung" when the electronic device is set to support English)

2) Applying formula B: At the beach on Christmas (generated as "Christmas at the beach" when the electronic device is set to support English)

3) No formula applied: Christmas (generated as "Christmas" when the electronic device is set to support English)
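The composition rule illustrated above (date + formula C + time zone + formula A or B, with only one of A, B, C applied in priority order A, B, C per Table 3) can be sketched as follows. The function signature and the tag-to-phrase mappings are illustrative assumptions drawn from the examples; the person name "Jinhyung" is taken from the examples above.

```python
def make_title(date, time_zone=None, faces=None, scene_tags=None, context_tag=None):
    """Compose an image-group title: date + (C) + time_zone + (A or B).
    Only the highest-priority applicable formula (A > B > C) is applied."""
    faces = faces or []
    scene_tags = scene_tags or []
    scene_phrases = {"Street": "on the street", "Scenery": "outdoors",
                     "Coast": "at the beach", "Mountain": "in the mountains"}
    context_phrases = {"group": "Special", "party": "Fun", "night_view": "Romantic"}

    a = f"with {faces[0]}" if faces else None                       # formula A
    b = next((scene_phrases[t] for t in scene_tags if t in scene_phrases), None)  # B
    c = context_phrases.get(context_tag)                            # formula C

    parts = [date]
    if a:
        parts += [time_zone, a]       # date + time zone + A
    elif b:
        parts += [time_zone, b]       # date + time zone + B
    elif c:
        parts += [c, time_zone]       # date + C + time zone
    else:
        parts += [time_zone]          # no formula applied
    return " ".join(p for p in parts if p)
```

For instance, `make_title("Christmas", "Evening", faces=["Jinhyung"])` yields "Christmas Evening with Jinhyung", matching the first Christmas example.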

According to various embodiments, an electronic device (e.g., electronic device 101) may update the group of images stored in the electronic device upon acquisition of a new image.

For example, when the electronic device acquires a new image, the electronic device can automatically extract and / or analyze and store tag information of the image. For example, the electronic device can acquire a new image through a camera, an MTP, a cloud, a download, and the like.

If the tag information of the image, when compared with tag information such as the time and location of an image group previously stored in the electronic device, corresponds to that image group, the electronic device may include the image in the stored image group. At this time, the electronic device can also automatically update the title of the image group using the tag information of the added image. In addition, the electronic device may check the similarity of the image with images not included in any stored image group, and generate a new image group if the image group creation condition is satisfied.

For example, suppose an image group with the title "In Han River Park" is stored in the electronic device, and a newly stored image received through a messenger is a photograph including a face-tagged person named Mike. If the image corresponds to the time and location information of the photos in the image group, the newly stored image is added to the stored image group, and the title of the stored image group is automatically updated to "In Han River Park with Mike".

When the electronic device removes an image stored in the electronic device according to a user input, the electronic device searches for the image group including the image, automatically deletes the corresponding image from it, and also automatically updates the title of that image group using the images currently remaining in the group.

FIG. 12 is a flowchart of image group creation and update operations according to image acquisition of an electronic device (e.g., electronic device 101) according to various embodiments. Referring to FIG. 12, according to various embodiments, the electronic device may analyze a user's usage pattern to generate a group of images, and dynamically change the group according to the user pattern. For example, the time interval at which pictures are taken differs for each user of an electronic device, and the pattern of taking pictures in daily life may differ from the pattern while traveling abroad. In consideration of this, the electronic device generates image groups from photographs and updates the generated image groups, so that different image groups can be generated for each electronic device even when the same or similar photographs are acquired.

In operation 1210, the electronic device may acquire at least one new image.

In operation 1220, the electronic device may analyze the acquired image.

In operation 1230, the electronic device may use the analysis results to detect image information of the obtained image. The image information may include at least one of location information, time information, image type information, tag information, or user input information, for example.

In operation 1240, the electronic device may determine whether the detected image information corresponds to at least one of the image groups stored in the electronic device. For example, the electronic device confirms whether the detected image information and the image group information (image group creation conditions) of at least one of the stored image groups correspond (are identical or similar), and thereby determines whether the acquired image corresponds to at least one of the stored image groups.

In operation 1240, if the electronic device determines that the acquired image corresponds to at least one of the stored image groups, operation 1250 may be performed; otherwise, operation 1270 may be performed.

In operation 1250, the electronic device may include the acquired image in the at least one image group determined to correspond.

In operation 1260, the electronic device may update the existing title of the at least one image group determined to correspond to a new title, according to the detected image information. For example, using the image information of the acquired image, the existing title of the at least one image group determined to correspond may be changed to a new title.

In operation 1270, the electronic device may determine, based on the image information of the acquired image, whether to create a new image group from the acquired image together with images stored in the electronic device. For example, the electronic device identifies images stored in the electronic device that have image information corresponding to the image information of the acquired image, and determines whether generating one image group from the acquired image and the identified images would meet the image group creation condition stored in the electronic device.

The image group creation condition may be a condition for generating an image group based on time information, image analysis results, tag information, or position information, or a condition combining two or more of these.

If, in operation 1270, the electronic device determines that it can create the new image group, it may execute operation 1280; otherwise, the operation may be terminated.

In operation 1280, the electronic device may generate a new image group from the images stored in the electronic device together with the acquired image.

In operation 1290, the title of the created new image group can be determined.
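Operations 1210 to 1290 can be sketched as a single update routine. The dict-based group representation, the correspondence test, and the callback parameters below are illustrative assumptions; the embodiment does not prescribe data structures.

```python
def on_new_image(image_info, groups, creation_condition, make_title):
    """Sketch of FIG. 12: place a newly acquired image into a corresponding
    stored image group (operations 1240-1260), or create a new image group
    when the creation condition is met (operations 1270-1290).
    groups: list of dicts {"title": str, "info": dict, "images": list}."""
    def matches(a, b):
        # Illustrative correspondence test: a key present in both with equal value.
        keys = ("place", "time", "person")
        return any(k in a and k in b and a[k] == b[k] for k in keys)

    for group in groups:
        if matches(image_info, group["info"]):        # operation 1240
            group["images"].append(image_info)        # operation 1250
            group["title"] = make_title(group)        # operation 1260
            return group
    if creation_condition(image_info):                # operation 1270
        group = {"title": None, "info": dict(image_info), "images": [image_info]}
        group["title"] = make_title(group)            # operations 1280-1290
        groups.append(group)
        return group
    return None
```

Note that the title callback runs in both branches, mirroring the embodiment's point that adding an image both extends a group and refreshes its title.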

According to various embodiments, when an electronic device (e.g., electronic device 101) confirms the information of a new image group generated by the addition of the acquired image, and confirms that the information of the identified image group satisfies a designated classification condition, the electronic device can classify the new image group into at least two groups. Alternatively, for example, when the electronic device confirms the information of the new image group generated by the addition of the acquired image and confirms that it corresponds to the designated image group classification condition, it can generate at least two classified image groups from within the new image group. The image classification condition may be that the number of images corresponding to specific information is equal to or greater than a specified number.

For example, although the photos included in a new image group were grouped based on place information, at least some of the photos included in the new image group may have first time information and the others second time information. If the number of the at least some photographs is equal to or greater than a specified number and the number of the remaining photographs is also equal to or greater than the specified number, the electronic device can classify the photographs included in the new image group into two image groups based on place information + time information.
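The classification condition above (split a place-based group by a secondary attribute such as time only when every resulting bucket reaches the specified number) can be sketched as follows. The function shape and the default threshold are illustrative assumptions.

```python
from collections import defaultdict

def split_group(images, key, min_count=15):
    """Split a group's images by a secondary attribute (e.g., time info).
    Returns sub-groups only if there are at least two buckets and every
    bucket reaches min_count; otherwise the group is left whole."""
    buckets = defaultdict(list)
    for img in images:
        buckets[img[key]].append(img)
    if len(buckets) >= 2 and all(len(v) >= min_count for v in buckets.values()):
        return [{"value": k, "images": v} for k, v in buckets.items()]
    return [{"value": None, "images": list(images)}]
```

With 15 "morning" photos and 16 "evening" photos, the group splits in two; if either bucket fell below the threshold, the original group would be kept intact.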

FIG. 13 is a flowchart of operations for merging image groups according to the creation of a new image group of an electronic device (e.g., electronic device 101) according to various embodiments. In an existing image group generation method, the image group generation range is determined through image analysis, or, when the image group generation condition is satisfied within a predetermined range, an image group is generated by classifying the images according to the conditions. After that, when the user takes a picture, the newly taken and stored photographs within the determined image group generation range are analyzed to again determine the range and create a new image group. In contrast, in the image group creation method according to the embodiment of FIG. 13, when the user takes a new photograph after image group creation is completed, the electronic device analyzes the photographed picture, and if it meets the generation condition of an existing image group, the photographed picture can be included in the existing image group. For example, when an image group for a specific place has been created and stored, and the electronic device takes and stores a photograph at that specific place, the electronic device confirms that the position information of the photographed picture corresponds to the position information of the image group, and can include the photographed picture in the image group for the specific place. Accordingly, the user can experience the effect of having his or her lifestyle logged, can easily find the user's continuous experiences from the past to the present, and can recall the important moments, important places, important persons, and important experiences within each experience.

In operation 1310, the electronic device may generate a new image group based on location information, in accordance with the image group creation conditions specified in the electronic device. Embodiments of generating an image group based on position information have been described above and are thus omitted here.

In operation 1320, the electronic device can confirm position information of the generated image group.

In operation 1330, the electronic device may determine whether the position information of the generated image group corresponds to the position information of at least one of the stored image groups.

In operation 1330, if the electronic device determines that the position information of the generated image group corresponds to the position information of at least one of the stored image groups, operation 1340 may be performed; otherwise, the operation may be terminated. For example, when the position information of the generated image group indicates a region (or place) narrower than the position information of the at least one image group, it can be determined that the position information of the generated image group corresponds to the position information of the at least one image group. When the position information of the generated image group is the same as the position information of the at least one image group, it can likewise be determined that they correspond. And when the position information of the generated image group is similar to the position information of the at least one image group, it can also be determined that they correspond.

In operation 1340, the electronic device may merge the generated image group and the at least one image group.

For example, the electronic device may include the images included in the generated image group in the at least one image group, thereby updating the at least one image group.

For example, the electronic device may create, in addition to the generated image group and the at least one image group, a new image group including the images included in the generated image group and the images included in the at least one image group.

In operation 1350, the electronic device may generate a new title for the merged image group.
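Operations 1310 to 1350 can be sketched as a merge-by-location routine. Modeling locations as region tuples and reducing "same, similar, or narrower region" to a prefix check are simplifying assumptions for illustration.

```python
def merge_by_location(new_group, stored_groups, make_title):
    """Sketch of FIG. 13: if the new group's location corresponds to a
    stored group's location (same, or a narrower sub-region), merge them
    (operation 1340) and retitle the merged group (operation 1350).
    Locations are modeled as tuples like ("Jeju",) or ("Jeju", "Seogwipo")."""
    def corresponds(loc_a, loc_b):
        # Same location, or loc_a is a narrower region inside loc_b.
        return loc_a == loc_b or loc_a[:len(loc_b)] == loc_b

    for group in stored_groups:                           # operations 1320-1330
        if corresponds(new_group["location"], group["location"]):
            group["images"].extend(new_group["images"])   # operation 1340
            group["title"] = make_title(group)            # operation 1350
            return group
    stored_groups.append(new_group)                       # no correspondence
    return new_group
```

A group created for a narrower place (e.g., a district of Jeju) is folded into the broader stored "Jeju" group, matching the narrower-region correspondence described in operation 1330.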

According to the above-described embodiments, when a user of the electronic device travels for four days in Jeju Island in 2013 and takes pictures at several places using the electronic device, the user can create an image group related to the Jeju Island trip from the photographed pictures. Then, when the user travels to Jeju Island again in 2015, takes photographs at various places using the electronic device, and takes photographs after meeting Kim Yeon-ah again, the existing image group can be updated according to the embodiment of FIG. 12, or according to the embodiment of FIG. 13. For example, the electronic device updates the image group of the Jeju Island trip created in 2013 according to the similar experiences (trips to Jeju Island) of 2013 and 2015, and can generate an image group about the trips to Jeju Island from 2013 to 2015 and an image group of memories with Kim Yeon-ah in Jeju Island from 2013 to 2015. Alternatively, the electronic device can continue to maintain the existing image group for the Jeju Island trip created in 2013, and create a new image group for the Jeju Island trip in 2015. As described above, according to the continuity of similarity in categories such as the time at which a photograph was taken, the place where it was taken, and the persons included in it, an image group can be updated by confirming its similarity with previously stored image groups.

According to various embodiments, images may be grouped by a single condition such as place information, time information, person information, or tag information; however, image clustering can also be automatically performed based on various compound conditions such as place + time, place + person, time + person, or place + time + tag + category, and each condition can be changed dynamically based on the image pattern (the tag information detected by image analysis).

For example, if the electronic device stores an image group related to a trip to Jeju Island in 2013, and photographs are taken using the electronic device while traveling to Jeju Island in 2015, the image group called 'trip to Jeju Island' created in 2013 will continue to include the images taken in 2015, updating the image group on the trip to Jeju Island.

Further, the electronic device can automatically or manually generate another image group from some images in an image group, using a designated image group creation condition, according to the image group information of the stored image group. The designated image group creation condition may be, for example, that another image group is created from a specified number or more of images when the images of the image group including at least two identical pieces of image information number the specified number or more.

For example, if an image group for the trip to Jeju Island in 2013 has been created in the electronic device, and photographs are taken using the electronic device with Kim Yeon-ah in Jeju Island in 2015, the place-based image group can be continuously updated, and a person-based image group can be generated within the image group in a dependent manner to create an image group called 'Kim Yeon-ah in Jeju Island'. Thus, the image group can be continuously updated with meaningful (user-desired) photographs according to the user's photographic pattern, and important moments can again be found among the updated photographs to create new image groups.

FIG. 14 is an exemplary view of an image search screen using tag information according to various embodiments. Referring to FIG. 14, a user can search for meaningful photographs through tag information, which is the image analysis value used for creating an image group, and an electronic device (e.g., electronic device 101) can also create an image group from the search results.

The image search screen of FIG. 14 may include tag information (My own tags) 1401, person information (With whom) 1403, time information (When) 1405, location information 1407, category information 1409, color information 1411, image analysis information (not shown), and the like.

Items displayed on the image search screen may be dynamically changed and displayed. For example, if the user selects Feb 1413 in the time information 1405, a screen may be displayed allowing selection of subsequent dependent steps (week, day, day of the week, etc.), and the items can be dynamically changed according to the analysis information of the images owned by the electronic device.

In addition, the image search screen may include a search tag item 1415 for directly inputting a query for an image the user wants to find. According to the user's input, meaningful images can be retrieved automatically or manually through the tag information to generate an image group, or the image group can easily be shared and transmitted to another person.

Meanwhile, the tag information, which is the image analysis value for generating an image group, is generated by automatically analyzing an image in the electronic device, and can also be directly input by a user. In addition, the electronic device can extract various pieces of analysis information from one image, thereby generating a plurality of pieces of image information (tag information). Through the image information, the user can easily identify images of similar kinds.

FIG. 15 is an illustration of an image containing a plurality of pieces of tag information according to various embodiments. Referring to FIG. 15, an electronic device (e.g., electronic device 101) may display an image including tag information such as 2014 1501, Australia 1503, and Food 1505. If the user selects the tag information labeled Food 1505, the electronic device may display a plurality of Food-related images so that the user can easily identify other images classified as Food. In addition, when the user selects other tag information displayed on any one of the other images, meaningful images that may be related to that tag information can be displayed.

According to various embodiments, the electronic device (e.g., electronic device 101) may exclude low-quality images and the like when creating an image group, using image analysis results, according to a specified sorting rule. For example, the electronic device may exclude overexposed, underexposed, and/or blurred images (e.g., images with motion blur >= 0.5 or under exposure > 0.03).

For example, the electronic device can detect similar images (duplicate images) among a plurality of images, and can select an optimal image among the similar images under the following three conditions.

1) Among duplicate images, include only the best image

The following score is calculated for each of the duplicate images, and the image with the highest score can be determined as the optimal image.

brightness + contrast + feature point distribution − motion blur − under exposure − over exposure − blur − out-of-focus blur + feature number

For example, the quality of the image can be determined using the first condition and / or the second condition described below, and images of good quality among the plurality of photographs can be confirmed.

For example, using the following first, second, and third conditions, an optimal image among the duplicated images can be determined.

[First condition] Overall brightness, contrast, and feature point distribution: added

[Second condition] Motion blur, basic blur, out-of-focus blur, under exposure, and over exposure: subtracted

[Third condition] Number of similarity features: added (based on overall brightness, subject edge detection, and the sharpness of the region excluding out-of-focus areas)

2) If there are more than five person/animal-related images, remove duplicate pictures

3) Determine whether or not to display the image group based on a minimum number of images (the group is not displayed to the user when it contains fewer than 7 images)
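The scoring and pre-filtering described by the sorting rules above can be sketched as follows. The metric names mirror the three conditions; all values are assumed to be pre-computed, normalized numbers, and the exact combination is an illustrative assumption.

```python
def quality_score(m):
    """Score one image from pre-computed metrics (dict of floats).
    First condition (added): brightness, contrast, feature-point distribution.
    Second condition (subtracted): motion blur, basic blur, out-of-focus
    blur, under exposure, over exposure.
    Third condition (added): number of similarity features."""
    added = m["brightness"] + m["contrast"] + m["feature_distribution"]
    subtracted = (m["motion_blur"] + m["blur"] + m["out_focus_blur"]
                  + m["under_exposure"] + m["over_exposure"])
    return added - subtracted + m["feature_count"]

def pick_best(duplicates):
    """Rule 1: among duplicate images, keep only the highest-scoring one."""
    return max(duplicates, key=quality_score)

def usable(m, max_motion_blur=0.5, max_under_exposure=0.03):
    """Pre-filter matching the thresholds mentioned above (motion blur >= 0.5
    or under exposure > 0.03 excluded)."""
    return m["motion_blur"] < max_motion_blur and m["under_exposure"] <= max_under_exposure
```

A heavily motion-blurred frame both loses the duplicate comparison and fails the pre-filter, so it never reaches the displayed image group.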

According to various embodiments, images of an image group generated by applying the above-described sorting rule can be generated as an AGIF file.

According to the various embodiments described above, an electronic device (e.g., electronic device 101) may analyze the pictures stored in the electronic device to identify meaningful moments, such as snapshots, group the photographs of such moments into one image group, generate a title using the image information (tag information) of the images included in the image group, and display it together with the photographs. Also, the created image group can be shared and updated in real time.

Referring to FIG. 16A, the electronic device may display a title 1601 of an image group automatically generated based on information of images included in the image group, according to the generation of the image group. In addition, the electronic device may display the shared member list 1603 when sharing the generated image group. In addition, the electronic device may display time information and position information 1605 of an image group. In addition, the electronic device can display a plurality of images in a thumbnail format together with an image 1607 of a face-tagged acquaintance included in the image group.

Referring to FIG. 16B, the electronic device may display a title 1611 of an image group automatically generated based on image information included in the image group, according to the generation of the image group. In addition, the electronic device may display an icon 1613 for adding members to share for sharing of the generated image group. In addition, the electronic device may display time information 1615 of a group of images. In addition, the electronic device may display a plurality of images included in the image group in a thumbnail format.

According to various embodiments, there is provided a method of generating an image group of an electronic device (e.g., electronic device 101), the method comprising: generating a group including at least one image; including the at least one image included in the generated group in a first image group corresponding to the generated group and having a first title; and changing the first title of the first image group to a second title using image information of the at least one image included in the generated group.

According to various embodiments, the operation of generating the group including the at least one image may comprise: analyzing the at least one image; and, if the at least one image is a plurality of images, clustering the plurality of images according to the analysis results.

According to various embodiments, the method may further comprise determining the first image group corresponding to the generated group according to information of the generated group, wherein the information of the generated group may include at least one of position information, time information, image type information, or user input information.

According to various embodiments, the operation of including the at least one image included in the generated group in the first image group may comprise: determining whether there are similar images among the at least one image included in the first image group and the at least one image included in the generated group; and, if it is determined that there are similar images, selecting one of the similar images according to a specified condition, wherein images other than the selected one of the similar images may not be included in the stored image group.

According to various embodiments, the image information of the at least one image included in the generated group may include at least one of position information, time information, image type information, or user input information.

According to various embodiments, the operation of changing the first title of the first image group to the second title may comprise including, in the second title, a designated term corresponding to at least one piece of information contained in the image information.

According to various embodiments, the method may further comprise: generating at least one new image group using the at least one image included in the generated group; and generating a title of the generated new image group using image information of at least one image included in the generated new image group.

According to various embodiments, when the at least one image included in the generated group is a plurality of images, the operation of generating the group may comprise: identifying, using the time information of each of the plurality of images, at least one peak time interval having a larger number of images than the time intervals before and after it; determining a first time period from the closest preceding time interval in which the number of images is a designated number to the closest following time interval in which the number of images is the designated number; and generating the group to include the images included in the first time period.
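The peak-interval rule above can be sketched as follows: find intervals holding more images than both neighbors, then expand each peak outward to the nearest intervals holding the designated number of images. Bucketing photos into fixed consecutive time intervals beforehand is an assumption of this sketch.

```python
def peak_periods(counts, designated=1):
    """counts: number of images in consecutive time intervals.
    A peak interval holds more images than both of its neighbors; each
    peak is expanded to the nearest preceding and following intervals
    whose count equals `designated`, giving one group's time period."""
    periods = set()
    for i, n in enumerate(counts):
        prev_n = counts[i - 1] if i > 0 else -1
        next_n = counts[i + 1] if i + 1 < len(counts) else -1
        if n > prev_n and n > next_n:                 # peak interval
            start = i
            while start > 0 and counts[start] != designated:
                start -= 1
            end = i
            while end + 1 < len(counts) and counts[end] != designated:
                end += 1
            periods.add((start, end))
    return sorted(periods)
```

With the image counts [1, 3, 7, 4, 1, 2, 6, 1] and a designated number of 1, the peaks at indices 2 and 6 expand into the periods (0, 4) and (4, 7).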

According to various embodiments, when the generated group is a plurality of groups, the operation of generating the at least one new image group may comprise: calculating, for each of the plurality of groups, the time intervals to the preceding and following groups; calculating a first time interval that is the average of the shorter of the calculated time intervals, and a second time interval that is the average of the longer of the calculated time intervals; and generating at least one new image group including adjacent groups, among the plurality of groups, whose time interval is shorter than the first time interval.

According to various embodiments, the method may further comprise generating at least one new image group that includes adjacent groups, among the plurality of groups, whose time interval is shorter than the second time interval.
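The adjacent-group merging rule in the two paragraphs above can be sketched as follows: the gaps between consecutive groups are divided into a shorter half and a longer half, the first and second time intervals are the means of those halves, and neighbors closer than the chosen threshold merge into one new group. Representing each group by a single timestamp and splitting the gaps at the median are simplifying assumptions.

```python
def merge_adjacent(group_times, use_second=False):
    """group_times: sorted representative timestamps of existing groups.
    The first threshold is the mean of the shorter half of the gaps, the
    second the mean of the longer half; adjacent groups whose gap is
    below the chosen threshold are merged into one new group."""
    gaps = [b - a for a, b in zip(group_times, group_times[1:])]
    if not gaps:
        return [list(group_times)]
    ordered = sorted(gaps)
    half = len(ordered) // 2
    short, long_ = ordered[:half] or ordered[:1], ordered[half:]
    first = sum(short) / len(short)       # first time interval (short-gap mean)
    second = sum(long_) / len(long_)      # second time interval (long-gap mean)
    threshold = second if use_second else first
    merged = [[group_times[0]]]
    for gap, t in zip(gaps, group_times[1:]):
        if gap < threshold:
            merged[-1].append(t)          # gap shorter than threshold: merge
        else:
            merged.append([t])            # otherwise start a new group
    return merged
```

Using the first time interval merges only the very closest neighbors; the second, larger threshold produces the coarser grouping the second paragraph describes.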

According to various embodiments, when the at least one image included in the generated group is a plurality of images, the operation of generating the group including the at least one image may comprise classifying the plurality of images into images containing the same person and generating at least one group for each set of images including the same person, and the operation of generating the at least one new image group may comprise, if the number of images included in the at least one group is greater than or equal to a specified number, generating at least one new image group using the images contained in the at least one group.

According to various embodiments, classifying the plurality of images by images containing the same person may include using at least one of face recognition technology or name tag information.

According to various embodiments, the method may further comprise, when the number of images included in the at least one group is equal to or greater than a designated number, displaying a screen for setting a name tag for the person corresponding to the group.

According to various embodiments, when the at least one image included in the generated group is a plurality of images, the operation of generating the group may comprise classifying the plurality of images into images containing the same type information and generating at least one group for each set of images including the same type information, and the operation of generating the at least one new image group may comprise, if the number of images included in the at least one group is greater than or equal to a specified number, generating at least one new image group using the images contained in the at least one group.

According to various embodiments, when the at least one image included in the generated group is a plurality of images, the operation of generating the group including the at least one image may comprise classifying the plurality of images into images containing corresponding position information and generating at least one group for each set of images including the corresponding position information, and the operation of generating the at least one new image group may comprise, if the number of images included in a first group generated according to first position information is greater than or equal to a designated number, generating at least one new image group using the images included in the first group.

According to various embodiments, the operation of generating the at least one new image group may further include generating the at least one new image group using the images included in the first group and a second group generated according to second position information if the number of images included in the second group is equal to or greater than the designated number of images.

According to various embodiments, the operation of generating the at least one new image group may include generating the at least one new image group using the images included in the first group, the second group, and a third group if the third group, generated according to the first position information and corresponding to a time after the time corresponding to the first group and before the time corresponding to the second group, is present.
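A rough sketch of this merging rule, assuming each group records its location, a representative time, and its images (all names are hypothetical; the patent does not prescribe data structures):

```python
def merge_location_groups(groups, min_count=3):
    """Merge location-based groups into one new image group.

    Groups meeting the image-count threshold anchor the merge; any group
    whose representative time falls between the first and last anchors
    (e.g. a stop made while travelling between the two places) is merged too.
    """
    anchors = [g for g in groups if len(g["images"]) >= min_count]
    if len(anchors) < 2:
        return None  # nothing to merge into a new image group
    first, last = anchors[0], anchors[-1]
    merged = []
    for g in sorted(groups, key=lambda g: g["time"]):
        if g is first or g is last or first["time"] < g["time"] < last["time"]:
            merged.extend(g["images"])
    return merged
```

With two qualifying groups at times 1 and 3 and a small intervening group at time 2, all three sets of images end up in the new image group.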

According to various embodiments, the method may further include generating the images included in the first image group including the at least one image as an animation graphic interchange format (AGIF) file.
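The AGIF (animated GIF) generation could be realised, for example, with the Pillow imaging library (an assumption — the document does not name any library or API):

```python
from PIL import Image  # Pillow

def images_to_agif(paths, out_path, frame_ms=500):
    """Combine the images of one image group into a single animated GIF,
    showing each image as one frame."""
    frames = [Image.open(p).convert("P") for p in paths]
    frames[0].save(out_path, save_all=True, append_images=frames[1:],
                   duration=frame_ms, loop=0)  # loop=0 repeats forever
```

`save_all=True` together with `append_images` writes every frame into one file, which is Pillow's standard way of producing animated GIFs.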

According to various embodiments, the method may further include classifying the first image group including the at least one image into at least two image groups using image information of at least one image included in the first image group.

According to various embodiments, the method may further include creating a new image group using images that contain at least two pieces of identical image information in the first image group including the at least one image.

According to various embodiments, there is provided a method of generating an image group of an electronic device (e.g., electronic device 101), the method including: identifying first attribute information corresponding to a first image group including a first image and a second image; identifying second attribute information corresponding to a second image group including a third image and a fourth image; generating, based on the first attribute information and the second attribute information, a third image group including at least one of the first image and the second image and at least one of the third image and the fourth image; and displaying the images forming the third image group in association with each other through a display of the electronic device.

According to various embodiments, when the third image group includes the first image, the second image, the third image, and the fourth image, the method may further include generating a title of the third image group using related information of each of the first image, the second image, the third image, and the fourth image.

According to various embodiments, when the third image group includes the first image or the second image together with the third image and the fourth image, the method may further include generating a title of the third image group using information related to the first image or the second image included in the third image group and information related to each of the third image and the fourth image; when the third image group includes the first image and the second image together with the third image or the fourth image, the method may further include generating a title of the third image group using information related to each of the first image and the second image and information related to the third image or the fourth image included in the third image group.

According to various embodiments, the first attribute information may include first location information related to the first image or the second image, and the second attribute information may include second location information related to the third image or the fourth image, wherein the operation of generating the third image group includes: comparing the first location information with the second location information; and performing the generation of the third image group if the result of the comparison satisfies a specified condition.

According to various embodiments, the first attribute information may include first time information related to the first image or the second image, and the second attribute information may include second time information related to the third image or the fourth image, wherein the operation of generating the third image group includes: comparing the first time information with the second time information; and performing the generation of the third image group if the result of the comparison satisfies a specified condition.

According to various embodiments, the first attribute information may include first tag information related to the first image or the second image, and the second attribute information may include second tag information related to the third image or the fourth image, wherein the operation of generating the third image group includes: comparing the first tag information with the second tag information; and performing the generation of the third image group if the result of the comparison satisfies a specified condition.

According to various embodiments, the first attribute information may include first object information related to the first image or the second image, and the second attribute information may include second object information related to the third image or the fourth image, wherein the operation of generating the third image group includes: comparing the first object information with the second object information; and performing the generation of the third image group if the result of the comparison satisfies a specified condition.

According to various embodiments, the first attribute information may include at least one of first location information, first time information, first tag information, and first object information related to the first image or the second image, and the second attribute information may include at least one of second location information, second time information, second tag information, and second object information related to the third image or the fourth image.
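The attribute comparison that gates the creation of the third image group might look like the following sketch (the specific condition — an exact match on a categorical attribute, or a bounded time gap — is an illustrative assumption, not the patent's definition):

```python
def should_merge(attr1, attr2, max_time_gap_s=3600,
                 keys=("location", "tag", "object")):
    """Return True if two groups' attribute information satisfies a
    specified condition: any matching categorical attribute, or
    representative times within max_time_gap_s seconds of each other."""
    for key in keys:
        if key in attr1 and attr1.get(key) == attr2.get(key):
            return True
    if "time" in attr1 and "time" in attr2:
        return abs(attr1["time"] - attr2["time"]) <= max_time_gap_s
    return False
```

An electronic device could call this with each pair of image groups' attribute information and merge the pair into a third image group whenever it returns True.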

According to various embodiments, when the electronic device acquires a fifth image and the acquired fifth image corresponds to third attribute information corresponding to the third image group, the method may further include including the fifth image in the third image group.

According to various embodiments, the method may further include changing the title of the third image group using related information of each of the images included in the third image group.

According to various embodiments, when at least two pieces of information related to the first image included in the third image group and at least two pieces of information related to the third image included in the third image group correspond to each other, the method may further include generating a fourth image group including the first image and the third image.

According to various embodiments, the method may further include generating at least one of the first image and the second image and at least one of the third image and the fourth image included in the third image group as an animation graphic interchange format (AGIF) file.

According to various embodiments, the operation of associating and displaying the images forming the third image group may include displaying the images forming the third image group in a metro user interface form in which at least portions of the images are displayed in a superimposed manner, in an album form, or in a list form. The album form may display the images forming the third image group in various arrangements and sizes; for example, the size of each of the images forming the third image group may be set differently. The album form may also include text corresponding to the images forming the third image group. The list form may display the images arranged in a line in a specified direction.

In addition to the examples described above, the operation of associating and displaying the images forming the third image group may be performed in various ways that indicate that the images forming the third image group constitute one image group.

According to various embodiments, there is provided a recording medium for generating an image group of an electronic device (e.g., electronic device 101), the recording medium storing a program for performing: identifying first attribute information corresponding to a first image group including a first image and a second image; identifying second attribute information corresponding to a second image group including a third image and a fourth image; generating, based on the first attribute information and the second attribute information, a third image group including at least one of the first image and the second image and at least one of the third image and the fourth image; and displaying the images forming the third image group in association with each other through a display of the electronic device.

As used herein, the term "module" includes a unit composed of hardware, software, or firmware, and may be used interchangeably with terms such as logic, logic block, component, or circuit. A "module" may be an integrally constructed component, or a minimum unit or part thereof, that performs one or more functions. A "module" may be implemented mechanically or electronically and may include, for example, application-specific integrated circuit (ASIC) chips, field-programmable gate arrays (FPGAs), or programmable logic devices that perform certain operations. At least some of the devices (e.g., modules or functions thereof) or methods (e.g., operations) according to various embodiments may be implemented as instructions stored in a computer-readable storage medium (e.g., memory 130) in the form of a program module. When the instructions are executed by a processor (e.g., processor 120), the processor may perform the functions corresponding to the instructions. The computer-readable recording medium may include a hard disk, a floppy disk, magnetic media such as magnetic tape, optical recording media such as CD-ROM and DVD, and magneto-optical media such as a floptical disk. The instructions may include code generated by a compiler or code executable by an interpreter. Modules or program modules according to various embodiments may include at least one of the components described above, may omit some of them, or may further include other components. Operations performed by modules, program modules, or other components according to various embodiments may be executed sequentially, in parallel, iteratively, or heuristically, or at least some operations may be executed in a different order or omitted, or other operations may be added.

Claims (41)

A method of generating an image group of an electronic device,
Creating a group comprising at least one image;
Including the at least one image included in the generated group in a first image group corresponding to the generated group and having a first title among at least one image group stored in the electronic device,
And changing the first title of the first group of images to a second title using image information of the at least one image included in the generated group.
2. The method of claim 1, wherein the act of creating a group comprising the at least one image comprises:
Analyzing the at least one image;
And clustering the plurality of images according to a result of the analysis if the at least one image is a plurality of images.
The method according to claim 1,
Further comprising: determining the first image group corresponding to the generated group according to the information of the generated group,
The information of the generated group may be,
Location information, time information, image type information, or user input information.
4. The method of claim 3, wherein the act of including the at least one image in the generated group in the first group of images comprises:
Determining, using image analysis, whether there are similar images among at least one image included in the first image group and the at least one image included in the generated group;
Further comprising selecting an image of one of the similar images according to a specified condition if it is determined that the similar images are present,
Wherein an image other than the selected one of the similar images is not included in the stored image group.
2. The image processing apparatus according to claim 1, wherein the image information of the at least one image included in the generated group includes:
Wherein the information includes at least one of location information, time information, image type information, or user input information.
6. The method of claim 5,
Wherein changing the first title of the first group of images to the second title comprises:
Wherein the second title includes a specified term corresponding to the at least one piece of information included in the image information.
The method according to claim 1,
Generating at least one new image group using the at least one image included in the generated group;
And generating a title of the generated new image group using at least one piece of image information included in the generated new image group.
8. The method of claim 7, wherein when the at least one image included in the generated group is a plurality of images,
Classifying the plurality of images in a corresponding time interval among the designated time intervals using time information of each of the plurality of images;
Identifying at least one peak time interval having a number of images that is greater than the number of images in the forward and backward time intervals of the time intervals;
Determining a first time period with respect to the at least one peak time interval, the first time period extending from the nearest previous time interval in which the number of images is a designated number to the nearest subsequent time interval in which the number of images is the designated number; and
And creating the group of images including the images included in the first time period as the group.
9. The method of claim 8, wherein when the generated group is a plurality of groups,
Dividing time intervals of the groups into a short time interval and a long time interval by calculating time intervals with respect to the groups before and after each of the plurality of groups,
Calculating a first time interval which is an average value of the time intervals divided by the short time interval and a second time interval which is an average value of the time intervals divided by the long time interval;
And generating at least one new image group including adjacent groups of the plurality of groups having a time interval shorter than the first time interval.
10. The method of claim 9,
Further comprising generating at least one new image group that includes adjacent groups at a time interval less than or equal to the second time interval among the plurality of groups.
8. The method of claim 7, wherein, when the at least one image included in the generated group is a plurality of images,
And grouping the plurality of images by images including the same person to generate at least one group for each image including the same person,
Wherein the act of creating the at least one new image group comprises:
Generating at least one new image group using images contained in the at least one group if the number of images included in the at least one group is greater than or equal to a specified number .
The method of claim 11, wherein classifying the plurality of images by images including the same person comprises:
Using at least one of face recognition technology or name tag information.
The method according to claim 11, further comprising displaying a screen for setting a name tag for a person corresponding to the group if the number of images included in the at least one group is equal to or greater than a specified number.
The method of claim 7, wherein, when the at least one image included in the generated group is a plurality of images, the act of creating a group comprising the plurality of images comprises:
And generating at least one group for each image including the same type information by classifying the plurality of images according to images including the same type information,
Wherein the act of creating the at least one new image group comprises:
Creating at least one new image group using images contained in the at least one group if the number of images included in the at least one group is greater than or equal to a specified number of images.
8. The method of claim 7, wherein, when the at least one image included in the generated group is a plurality of images,
Classifying the plurality of images by images including corresponding position information and generating at least one group for each image including the corresponding position information,
Wherein the act of creating the at least one new image group comprises:
And generating at least one new image group using the images included in the first group if the number of images included in the first group generated according to first position information, which is the corresponding position information, is greater than or equal to a designated number of images.
16. The method of claim 15, wherein the act of creating the at least one new image group comprises:
Further comprising generating the at least one new image group using the images included in the first group and the second group if the number of images included in the second group generated according to second position information is greater than or equal to the designated number of images.
17. The method of claim 16,
Wherein the act of creating the at least one new image group comprises:
Generating the at least one new image group using the images included in the first group, the second group, and the third group if there is a third group generated according to the first position information corresponding to a time after the time corresponding to the first group and a time before the time corresponding to the second group.
The method of claim 1, further comprising generating the images included in the first image group including the at least one image as an animation graphic interchange format (AGIF) file.
The method of claim 1, further comprising classifying the first image group including the at least one image into at least two image groups using image information of at least one image included in the first image group.
The method of claim 1, further comprising creating a new image group using images comprising at least two pieces of identical image information in the first image group including the at least one image.
In an electronic device,
A memory,
And a processor configured to create a group including at least one image, include the at least one image included in the generated group in a first image group corresponding to the generated group and having a first title among at least one image group stored in the memory, and change the first title of the first image group to a second title using image information of the at least one image included in the generated group.
22. The apparatus of claim 21,
Analyzing the at least one image and clustering the plurality of images according to a result of the analysis if the at least one image is a plurality of images.
22. The apparatus of claim 21,
And determining the first image group corresponding to the generated group according to the generated group information,
The information of the generated group may be,
Location information, time information, image type information, or user input information.
24. The apparatus of claim 23,
Wherein the processor determines, using image analysis, whether there are similar images among at least one image included in the first image group and the at least one image included in the generated group, selects one of the similar images according to a specified condition if it is determined that the similar images are present, and does not include an image other than the selected one of the similar images in the stored image group.
22. The apparatus of claim 21, wherein the image information of the at least one image included in the generated group includes:
Location information, time information, image type information, or user input information.
26. The method of claim 25,
Wherein,
When the first title of the first group of images is changed to the second title, a specified term corresponding to the at least one piece of information included in the image information is included in the second title.
22. The apparatus of claim 21,
Generating at least one new image group using the at least one image included in the generated group, and generating a title of the generated new image group using image information of at least one image included in the generated new image group.
28. The apparatus of claim 27, wherein when the at least one image included in the generated group is a plurality of images,
Wherein the processor classifies the plurality of images into corresponding time intervals among designated time intervals using the time information of each of the plurality of images, identifies at least one peak time interval having a number of images greater than the number of images in the preceding and subsequent time intervals, determines, with respect to the at least one peak time interval, a first time period from the nearest previous time interval in which the number of images is a designated number to the nearest subsequent time interval in which the number of images is the designated number, and generates the group including the images included in the first time period.
29. The apparatus of claim 28, wherein when the generated group is a plurality of groups,
Calculating time intervals with the preceding and succeeding groups for each of the plurality of groups, dividing time intervals with respect to the groups into a short time interval and a long time interval,
Calculating a first time interval which is an average value of the time intervals divided by the short time interval and a second time interval which is an average value of the time intervals divided by the long time interval,
And generating at least one new image group comprising adjacent ones of the plurality of groups having time intervals that are equal to or less than the first time interval.
30. The apparatus of claim 29,
Further comprising generating at least one new image group comprising adjacent groups of the plurality of groups at time intervals less than the second time interval.
28. The apparatus of claim 27, wherein when the at least one image included in the generated group is a plurality of images,
Classifying the plurality of images by images including the same person to generate at least one group for each image including the same person,
Generating at least one new image group using images contained in the at least one group if the number of images included in the at least one group is greater than or equal to a specified number.
32. The apparatus of claim 31,
And classifying the plurality of images by images including the same person using at least one of face recognition technology and name tag information.
32. The method of claim 31,
Further comprising a display,
Wherein,
Further comprising displaying on the display a screen for setting a name tag for a person corresponding to the group if the number of images included in the at least one group is greater than or equal to a specified number.
28. The apparatus of claim 27, wherein when the at least one image included in the generated group is a plurality of images,
Classifies the plurality of images by images including the same type information to generate at least one group for each image including the same type information, and creates at least one new image group using the images contained in the at least one group if the number of images included in the at least one group is greater than or equal to a specified number.
28. The apparatus of claim 27, wherein when the at least one image included in the generated group is a plurality of images,
Classifies the plurality of images by images including corresponding position information to generate at least one group for each image including the corresponding position information, and generates at least one new image group using the images included in a first group generated according to first position information if the number of images included in the first group is equal to or greater than the designated number of images.
The apparatus of claim 35,
Generates the at least one new image group using the images included in the first group and the second group if the number of images included in the second group generated according to second position information is greater than or equal to the designated number of images.
37. The apparatus of claim 36,
Generates the at least one new image group using the images included in the first group, the second group, and the third group if there is a third group generated according to the first position information corresponding to a time after the time corresponding to the first group and a time before the time corresponding to the second group.
22. The apparatus of claim 21,
Further comprising generating, as an animation graphic interchange format (AGIF) file, images included in the first image group including the at least one image.
22. The apparatus of claim 21,
Further comprising classifying the first image group including the at least one image into at least two image groups using at least one image information included in the first image group including the at least one image Lt; / RTI &gt;
22. The apparatus of claim 21,
Further comprising generating one new image group using images comprising at least two identical image information in the first image group including the at least one image.
A recording medium for generating a group of images of an electronic device,
Creating a group comprising at least one image;
Including the at least one image included in the generated group in a first image group corresponding to the generated group and having a first title among at least one image group stored in the electronic device,
And a program recorded thereon for performing an operation of changing the first title of the first image group to the second title using the image information of the at least one image included in the generated group.
KR1020160020043A 2016-02-19 2016-02-19 Method for creating image group of electronic device and electronic device thereof KR20170098113A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020160020043A KR20170098113A (en) 2016-02-19 2016-02-19 Method for creating image group of electronic device and electronic device thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020160020043A KR20170098113A (en) 2016-02-19 2016-02-19 Method for creating image group of electronic device and electronic device thereof

Publications (1)

Publication Number Publication Date
KR20170098113A true KR20170098113A (en) 2017-08-29

Family

ID=59760141

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020160020043A KR20170098113A (en) 2016-02-19 2016-02-19 Method for creating image group of electronic device and electronic device thereof

Country Status (1)

Country Link
KR (1) KR20170098113A (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108521422A (en) * 2018-04-09 2018-09-11 四川超影科技有限公司 Common communication protocol based on intelligent inspection robot
CN108888892A (en) * 2018-05-25 2018-11-27 石家庄学院 Fire protection patrol method, fire-fightingl patrol-checking device and electronic equipment
CN109813480A (en) * 2019-03-22 2019-05-28 陈利娟 A kind of monitoring of construction status cloud and pressure relieving system based on Internet of Things
US11132398B2 (en) 2018-12-05 2021-09-28 Samsung Electronics Co., Ltd. Electronic device for generating video comprising character and method thereof
US11531702B2 (en) 2018-12-05 2022-12-20 Samsung Electronics Co., Ltd. Electronic device for generating video comprising character and method thereof

Similar Documents

Publication Publication Date Title
CN110083730B (en) Method and apparatus for managing images using voice tags
KR102545768B1 (en) Method and apparatus for processing metadata
KR102349428B1 (en) Method for processing image and electronic device supporting the same
US10021569B2 (en) Theme applying method and electronic device for performing the same
EP2892208B1 (en) Method and apparatus for operating electronic device
CN105825522B (en) Image processing method and electronic device supporting the same
KR20180074316A (en) System for providing plce information and electronic device and method supporting the same
KR20170136920A (en) Method for Outputting Screen and the Electronic Device supporting the same
US10922354B2 (en) Reduction of unverified entity identities in a media library
KR20170097980A (en) Method for sharing content group of electronic device and electronic device thereof
KR20170076380A (en) Electronic device and method for image control thereof
KR20160105239A (en) Electronic device and method for displaying picture thereof
CN106228511B (en) Electronic device and method for generating image file in electronic device
US10504560B2 (en) Electronic device and operation method thereof
US10839002B2 (en) Defining a collection of media content items for a relevant interest
KR20180094290A (en) Electronic device and method for determining underwater shooting
KR20170019809A (en) Method and Apparatus for Generating a Video Content
KR20170054746A (en) Method and electronic device selecting an area associated with contents
KR20160114434A (en) Electronic Device And Method For Taking Images Of The Same
KR20170098113A (en) Method for creating image group of electronic device and electronic device thereof
KR20180121273A (en) Method for outputting content corresponding to object and electronic device thereof
KR20170096711A (en) Electronic device and method for clustering photo therein
KR20170046496A (en) Electronic device having camera and image processing method of the same
KR20160134428A (en) Electronic device for processing image and method for controlling thereof
KR20180122137A (en) Method for giving dynamic effect to video and electronic device thereof