KR20170096711A - Electronic device and method for clustering photo therein - Google Patents

Electronic device and method for clustering photo therein

Info

Publication number
KR20170096711A
Authority
KR
South Korea
Prior art keywords
time
processor
photographs
pictures
electronic device
Prior art date
Application number
KR1020160018278A
Other languages
Korean (ko)
Inventor
권순범
이창선
정승환
김대희
김일섭
Original Assignee
삼성전자주식회사 (Samsung Electronics Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co., Ltd. (삼성전자주식회사)
Priority to KR1020160018278A
Publication of KR20170096711A

Classifications

    • G06F17/30268
    • G06F17/30041
    • G06F17/30044
    • G06F17/30705
    • G06F17/3082
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/14 - Digital output to display device; Cooperation and interconnection of the display device with other functional units

Abstract

Various embodiments disclosed in the present invention provide an electronic device and a method for clustering pictures of the electronic device. Based on the shooting time information of pictures taken through the electronic device, the electronic device and its picture clustering method automatically cluster numerous shot and stored pictures by segmenting them into a plurality of groups by time within the corresponding date, and provide the automatically clustered picture groups to the user, thereby facilitating management such as saving, editing, deleting, and sharing of the pictures. The electronic device comprises a camera, a touch screen, a processor electrically connected to the camera and the touch screen, and a memory electrically connected to the processor.

Description

[0001] The present invention relates to an electronic device and a method for clustering photos in the electronic device.

Various embodiments disclosed in this document relate to a method of automatically clustering captured and stored photographs based on the photographing time information of photographs taken through an electronic device.

As electronic device technology develops, high-pixel, high-performance cameras are being mounted on portable electronic devices.

Users of portable electronic devices equipped with cameras often photograph and store everyday pictures.

The user can sort the photographed photographs by theme, store them in album form, and output specific photographs online or offline.

As the number of photographs taken and stored through the electronic device increases, it becomes difficult for the user to classify the photographs by theme, and a lot of time is spent sorting them.

Until now, a method of sorting photographs by year, month, and date, based on the time sequence of photographing, has been used.

For example, a method is used in which photographs are sorted by day using the time information added to the photographed photographs, and the classified photo folders are then browsed.

However, this conventional method simply classifies photographs in order of date, so photographs of different character, taken for various activities within the same date, may be mixed and classified into one group.

In this case, since groups of photographs for the various activities cannot be generated by time division, the user cannot easily retrieve the desired photographs and may have difficulty managing them.

Thus, the electronic device may need to group and manage, in units of time, the large number of photographs taken and stored.

Various embodiments disclosed in this document provide an electronic device, and a photo clustering method thereof, that can automatically classify a large number of captured and stored photographs into a plurality of time-unit groups within the corresponding date, based on the photographing time information of photographs taken through the electronic device.

For example, if there are several photographing events in a day, a peak value, which is the average number of pictures per hour over the 24-hour period, is calculated, and a method of automatically clustering the groups of photographs in the time zones around the peak value can be provided.
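For illustration only, the following is a minimal sketch of the peak-value idea described above, assuming the peak value is the average number of photos per hour computed over the hours of the day that actually contain photos (consistent with the 190/10 = 19 example given later in this document); all function and variable names are assumptions, not part of the disclosed implementation.

```python
from collections import Counter
from datetime import datetime
from typing import Iterable, List

def peak_hours(timestamps: Iterable[datetime]) -> List[int]:
    """Return the hours (0-23) whose photo count reaches or exceeds the peak value."""
    per_hour = Counter(ts.hour for ts in timestamps)      # photos taken in each hour of the day
    if not per_hour:
        return []
    peak_value = sum(per_hour.values()) / len(per_hour)   # average photos per active hour
    return sorted(h for h, n in per_hour.items() if n >= peak_value)
```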

An electronic device according to various embodiments of the present disclosure includes: a camera; a touch screen; a processor electrically connected to the camera and the touch screen; and a memory electrically connected to the processor, wherein the memory stores instructions that, when executed, cause the processor to identify the photo data stored in the memory, analyze the number of photographs with respect to the photographing time of the photo data, analyze the photographing time zones and the number of photographs per hour, and classify, in units of time based on the analyzed number of photographs per hour, a photograph group that includes the time zone in which photographing is concentrated and the time zones before and after it.

Also, a method of clustering photos of an electronic device according to various embodiments of the present disclosure includes: verifying, by the processor, the photo data stored in the memory; analyzing, by the processor, the number of photographs with respect to the photographing time of the photo data; analyzing, by the processor, the photographing time zones and the number of photographs per hour of the photo data; and classifying, by the processor, a group of photographs in the time zone in which photographing is concentrated and the time zones before and after it, based on the analyzed number of photographs per hour.

In addition, a storage medium storing a program for controlling functions of an electronic device according to various embodiments of the present disclosure stores instructions causing the electronic device to verify the photo data stored in the memory, analyze the number of photographs with respect to the photographing time of the photo data, analyze the photographing time zones and the number of photographs per hour, and classify, in units of time based on the analyzed number of photographs per hour, a group of photographs in the time zone in which photographing is concentrated.

Further, an electronic device according to various embodiments of the present disclosure includes: a memory for storing at least one photograph; and a processor, wherein the processor is set to check the number of photographs taken, based on a first time unit, in the time period in which the at least one photograph was taken; to check, if the number of photographs falls within a first range, the number of photographs taken based on the first time unit in that time period and the consecutive time periods; to check, if the number of photographs falls within a second range, the number of photographs taken based on a second time unit in the time period and the consecutive time periods; and to designate as one group the photographs corresponding to the time period and the consecutive time periods, based on the number of photographs identified for those periods under the corresponding one of the first time unit and the second time unit.

According to various embodiments disclosed in this document, a large number of photographs taken and stored can be subdivided into a plurality of groups by time within the corresponding date and automatically clustered, based on the photographing time information of the photographs taken through the electronic device.

Also, according to various embodiments disclosed herein, management such as storage, editing, deletion, and sharing of photographs can be facilitated by providing the automatically clustered groups of photographs to the user.

In addition, according to various embodiments disclosed in this document, a simple browsing service can be provided by presenting a timeline clustered in units of time, so that the clustered groups of photographs can be easily browsed.

1 is a block diagram illustrating a network environment including an electronic device according to various embodiments of the present disclosure;
2 is a block diagram of an electronic device according to various embodiments of the present disclosure;
3 is a block diagram of a program module in accordance with various embodiments of the present disclosure;
4 is a diagram showing the configuration of an electronic device according to various embodiments of the present disclosure.
5 is a diagram illustrating a detailed configuration of a processor according to various embodiments of the present disclosure;
6 is a flow chart illustrating a method of clustering photos of an electronic device according to various embodiments of the present disclosure.
7 is a diagram illustrating a list of photo data stored in a memory of an electronic device according to various embodiments of the present disclosure.
8 is a diagram showing an example of a picture data analysis screen according to various embodiments of the present disclosure.
9 is a diagram showing an example of a photographing time zone analysis screen according to various embodiments of the present disclosure.
10 is a diagram showing an example of a screen for checking pictures before and after a peak time zone according to various embodiments of the present disclosure.
11 is a diagram illustrating an example of a screen in which a group of pictures classified by time unit is applied to an application according to various embodiments of the present disclosure.
12 is a flow chart specifically illustrating photographic time zone analysis and photo classification operations according to various embodiments of the present disclosure.
FIG. 13 is a diagram showing an example in which pictures of a specific date are classified by time according to various embodiments of the present disclosure.
14 is a diagram showing an example in which a photographing time zone of a specific date is concentrated according to various embodiments of the present disclosure.
FIG. 15 is a diagram showing an example in which a group of photographs in a time zone in which photographs are intensively taken can be classified according to various embodiments of the present disclosure.
FIG. 16 is a diagram showing an example in which a group of pictures can be classified when continuous photographing exists according to various embodiments of the present disclosure.
FIG. 17 is a diagram showing an example in which the photographing time zone can be classified in units of 30 minutes according to various embodiments of the present disclosure.
FIG. 18 is a diagram showing an example in which the photographing time zone can be classified in units of 15 minutes according to various embodiments of the present disclosure.
FIGS. 19-25 illustrate various examples of applying a group of pictures classified by time unit to an application according to various embodiments of the present disclosure.

Hereinafter, various embodiments of the present invention will be described with reference to the accompanying drawings. It is to be understood that the embodiments and terminologies used herein are not intended to limit the invention to the particular embodiments described, but to include various modifications, equivalents, and / or alternatives of the embodiments.

In connection with the description of the drawings, like reference numerals may be used for similar components. The singular expressions may include plural expressions unless the context clearly dictates otherwise.

In this specification, the expressions "have," "may have," "include," or "may include" denote the presence of features such as numerical values, functions, operations, or components, and do not exclude the presence of additional features.

In this document, expressions such as "A or B" or "at least one of A and/or B" may include all possible combinations of the items listed together. For example, "A or B," "at least one of A and B," or "at least one of A or B" may refer to (1) at least one A, (2) at least one B, or (3) both at least one A and at least one B.

Expressions such as "first" and "second" may modify various components regardless of order or importance and are used only to distinguish one component from another, without limiting those components. For example, a first electronic device and a second electronic device may represent different electronic devices regardless of order or importance. For example, without departing from the scope of rights described in the various embodiments of the present invention, a first component may be named a second component, and similarly a second component may be named a first component.

When it is mentioned that a certain (e.g., first) component is "(functionally or communicatively) coupled" or "connected" to another (e.g., second) component, the component may be connected directly to the other component or may be connected through yet another component (e.g., a third component). On the other hand, when it is mentioned that a component (e.g., a first component) is "directly coupled" or "directly connected" to another component (e.g., a second component), it can be understood that no other component (e.g., a third component) exists between them.

In this document, the expression "configured to (or set to)" may be used interchangeably with, for example, "suitable for," "having the capacity to," "adapted to," "made to," "capable of," or "designed to," depending on the situation. In some situations, the expression "a device configured to" may mean that the device "can" do something together with other devices or components. For example, the phrase "a processor configured (or set) to perform A, B, and C" may mean a dedicated processor (e.g., an embedded processor) for performing the corresponding operations, or a general-purpose processor (e.g., a CPU or an application processor) capable of performing the corresponding operations by executing one or more software programs stored in a memory device.

The terminology used herein is for the purpose of describing particular embodiments only and is not intended to limit the scope of other embodiments. Singular expressions may include plural expressions unless the context clearly dictates otherwise. Terms used herein, including technical or scientific terms, may have the same meanings as commonly understood by one of ordinary skill in the art described in this document. Among the terms used herein, terms defined in general dictionaries may be interpreted as having the same or similar meanings as their contextual meanings in the related art, and are not to be interpreted as having ideal or overly formal meanings unless explicitly defined herein. In some cases, even terms defined herein cannot be construed to exclude embodiments of the present disclosure.

Electronic devices in accordance with various embodiments of this document may include, for example, smartphones, tablet PCs, mobile phones, video phones, electronic book readers, desktop PCs, laptop PCs, netbook computers, workstations, portable multimedia players, MP3 players, medical devices, cameras, or wearable devices. Wearable devices may be of an accessory type (e.g., watches, rings, bracelets, anklets, necklaces, glasses, contact lenses, or head-mounted devices (HMDs)), a body-attached type (e.g., a skin pad or a tattoo), or a bio-implantable circuit.

In some embodiments, the electronic device may include at least one of, for example, a television, a digital video disk (DVD) player, an audio system, a refrigerator, an air conditioner, a vacuum cleaner, an oven, a microwave oven, a washing machine, a home automation control panel, a security control panel, a media box (e.g., Samsung HomeSync, Apple TV, or Google TV), a game console (e.g., Xbox, PlayStation), an electronic dictionary, an electronic key, or a camcorder.

In an alternative embodiment, the electronic device may include at least one of various medical devices (e.g., various portable medical measurement devices such as a blood glucose meter, a heart rate meter, a blood pressure meter, or a body temperature meter, and magnetic resonance angiography (MRA) equipment), a navigation system, a global navigation satellite system (GNSS), an event data recorder (EDR), a flight data recorder (FDR), an automotive infotainment device, marine electronic equipment (e.g., a marine navigation system or a gyro compass), avionics, a security device, a head unit for a vehicle, an industrial or domestic robot, a drone, an ATM of a financial institution, or an Internet-of-Things device (e.g., a light bulb, a fire detector, a fire alarm, a thermostat, a streetlight, a toaster, a fitness device, a hot water tank, a heater, or a boiler).

According to some embodiments, the electronic device may include at least one of a piece of furniture or a part of a building/structure, an electronic board, an electronic signature receiving device, a projector, or various measuring instruments (e.g., for water, electricity, gas, or radio waves). In various embodiments, the electronic device may be a combination of one or more of the various devices described above. An electronic device according to some embodiments may be a flexible electronic device. Further, the electronic device according to the embodiments of this document is not limited to the above-described devices and may include new electronic devices according to technological advancement.

Hereinafter, an electronic device according to various embodiments of the present disclosure will be described with reference to the accompanying drawings. In this document, the term "user" may refer to a person who uses the electronic device or a device that uses the electronic device (e.g., an artificial intelligence electronic device).

1 is a block diagram illustrating a network environment 100 including an electronic device 101 in accordance with various embodiments of the present disclosure.

Referring to FIG. 1, the electronic devices 101, 102, and 104 or the server 106 may be connected to each other via a network 162 or short-range communication.

The electronic device 101 may include a bus 110, a processor 120, a memory 130, an input / output interface 150, a display 160 and a communication interface 170. In some embodiments, the electronic device 101 may omit at least one of the components or additionally comprise other components.

The bus 110 may include, for example, circuitry that interconnects the components 120-170 and communicates (e.g., control messages and / or data) between the components.

The processor 120 may include one or more of a central processing unit (CPU), an application processor (AP), or a communication processor (CP). The processor 120 may perform computations or data processing related to, for example, control and / or communication of at least one other component of the electronic device 101.

Memory 130 may include volatile and / or non-volatile memory. Memory 130 may store instructions or data related to at least one other component of electronic device 101, for example. According to one embodiment, the memory 130 may store software and / or programs 140.

The program 140 may include, for example, a kernel 141, middleware 143, an application programming interface (API) 145, and/or an application program 147. At least a portion of the kernel 141, the middleware 143, or the API 145 may be referred to as an operating system (OS).

The kernel 141 may control or manage system resources (e.g., the bus 110, the processor 120, or the memory 130) used to execute operations or functions implemented in other programs (e.g., the middleware 143, the API 145, or the application program 147). The kernel 141 may also provide an interface through which the middleware 143, the API 145, or the application program 147 can access individual components of the electronic device 101 to control or manage the system resources.

The middleware 143 can act as an intermediary so that the API 145 or the application program 147 can communicate with the kernel 141 to exchange data. In addition, the middleware 143 may process one or more task requests received from the application program 147 according to priority. For example, the middleware 143 may assign to at least one application program 147 a priority for using the system resources (e.g., the bus 110, the processor 120, or the memory 130) of the electronic device 101 and process the one or more task requests accordingly.

The API 145 is an interface through which the application program 147 controls functions provided by the kernel 141 or the middleware 143, and may include, for example, at least one interface or function (e.g., an instruction) for file control, window control, image processing, or character control.

The input/output interface 150 may transfer commands or data input from a user or another external device to the other component(s) of the electronic device 101, or output commands or data received from the other component(s) of the electronic device 101 to the user or another external device.

The display 160 may include, for example, a liquid crystal display (LCD), a light-emitting diode (LED) display, an organic light-emitting diode (OLED) display, a microelectromechanical systems (MEMS) display, or an electronic paper display. The display 160 may display various content (e.g., text, images, video, icons, or symbols) to the user. The display 160 may include a touch screen and may receive a touch, gesture, proximity, or hovering input using, for example, an electronic pen or a part of the user's body.

The communication interface 170 establishes communication between the electronic device 101 and an external device (e.g., the first external electronic device 102, the second external electronic device 104, or the server 106) . For example, communication interface 170 may be connected to network 162 via wireless or wired communication to communicate with an external device (e.g., second external electronic device 104 or server 106).

The wireless communication may include cellular communication using at least one of, for example, LTE, LTE-A (LTE Advanced), code division multiple access (CDMA), wideband CDMA (WCDMA), universal mobile telecommunications system (UMTS), wireless broadband (WiBro), or global system for mobile communications (GSM). According to one embodiment, the wireless communication may include at least one of, for example, wireless fidelity (WiFi), Bluetooth, Bluetooth low energy (BLE), Zigbee, near field communication (NFC), magnetic secure transmission (MST), radio frequency (RF), or body area network (BAN). According to one embodiment, the wireless communication may include GNSS. The GNSS may be, for example, the Global Positioning System (GPS), the Global Navigation Satellite System (Glonass), the Beidou Navigation Satellite System (Beidou), or Galileo, the European global satellite-based navigation system. Hereinafter, in this document, "GPS" may be used interchangeably with "GNSS." The wired communication may include at least one of, for example, a universal serial bus (USB), a high definition multimedia interface (HDMI), recommended standard 232 (RS-232), power line communication, or a plain old telephone service. The network 162 may include at least one of telecommunications networks, for example, a computer network (e.g., LAN or WAN), the Internet, or a telephone network.

Each of the first and second external electronic devices 102 and 104 may be a device of the same or a different type from the electronic device 101. According to various embodiments, all or a portion of the operations performed in the electronic device 101 may be performed in one or more other electronic devices (e.g., the electronic devices 102 and 104, or the server 106). According to one embodiment, when the electronic device 101 has to perform a function or service automatically or on request, the electronic device 101 may request another device (e.g., the electronic device 102 or 104, or the server 106) to perform at least some functions associated therewith, instead of or in addition to executing the function or service by itself. The other electronic device may execute the requested functions or additional functions and transfer the results to the electronic device 101. The electronic device 101 may provide the requested function or service by processing the received results as-is or additionally. To this end, for example, cloud computing, distributed computing, or client-server computing techniques can be used.

2 is a block diagram of an electronic device 201 in accordance with various embodiments of the present disclosure.

The electronic device 201 may include all or part of the electronic device 101 shown in FIG. 1, for example. The electronic device 201 may include one or more processors (e.g., AP) 210, a communication module 220, a subscriber identity module 229, a memory 230, a security module 236, a sensor module 240, an input device 250, a display 260, an interface 270, an audio module 280, a camera module 291, a power management module 295, a battery 296, an indicator 297, and a motor 298.

The processor 210 may control a plurality of hardware or software components connected to the processor 210, for example, by driving an operating system or an application program, and may perform various data processing and calculations. The processor 210 may be implemented with, for example, a system on chip (SoC). According to one embodiment, the processor 210 may further include a graphics processing unit (GPU) and / or an image signal processor. Processor 210 may include at least some of the components shown in FIG. 1 (e.g., cellular module 221). Processor 210 may load and process instructions or data received from at least one of the other components (e.g., non-volatile memory) into volatile memory and store the resulting data in non-volatile memory.

The communication module 220 may have the same or a similar configuration to the communication interface 170 of FIG. 1. The communication module 220 may include a cellular module 221, a WiFi module 222, a Bluetooth module 223, a GNSS module 224 (e.g., a GPS module, a Glonass module, a Beidou module, or a Galileo module), an NFC module 225, an MST module 226, and a radio frequency (RF) module 227.

The cellular module 221 can provide voice calls, video calls, text services, or Internet services, for example, over a communication network. According to one embodiment, cellular module 221 may utilize a subscriber identity module (e.g., a SIM card) 229 to perform the identification and authentication of electronic device 201 within the communication network. According to one embodiment, the cellular module 221 may perform at least some of the functions that the processor 210 may provide. According to one embodiment, the cellular module 221 may comprise a communications processor (CP).

According to some embodiments, at least some (e.g., two or more) of the cellular module 221, the WiFi module 222, the Bluetooth module 223, the GNSS module 224, or the NFC module 225 may be included in one integrated circuit (IC) or IC package. The RF module 227 can transmit and receive communication signals (e.g., RF signals).

The RF module 227 may include, for example, a transceiver, a power amplifier module (PAM), a frequency filter, a low noise amplifier (LNA), or an antenna. According to another embodiment, at least one of the cellular module 221, the WiFi module 222, the Bluetooth module 223, the GNSS module 224, or the NFC module 225 may transmit and receive RF signals through a separate RF module.

The subscriber identity module 229 may include, for example, a card containing a subscriber identity module or an embedded SIM, and may contain unique identification information (e.g., an ICCID) or subscriber information (e.g., an international mobile subscriber identity (IMSI)).

The memory 230 (e.g., the memory 130 of FIG. 1) may include, for example, an internal memory 232 or an external memory 234. The internal memory 232 may include at least one of a volatile memory (e.g., a DRAM, an SRAM, or an SDRAM) or a non-volatile memory (e.g., an OTPROM, a PROM, an EPROM, an EEPROM, a mask ROM, a flash ROM, a flash memory, a hard drive, or a solid state drive (SSD)).

The external memory 234 may include a flash drive, for example, a compact flash (CF), a secure digital (SD), a micro-SD, a mini-SD, an extreme digital (xD), or a memory stick. The external memory 234 may be functionally or physically connected to the electronic device 201 via various interfaces.

The security module 236 is a module including a storage space having a relatively higher security level than the memory 230, and may be a circuit that guarantees safe data storage and a protected execution environment. The security module 236 may be implemented as a separate circuit and may include a separate processor. The security module 236 may reside, for example, in a removable smart chip or a secure digital (SD) card, or may include an embedded secure element (eSE) built into a fixed chip of the electronic device 201. In addition, the security module 236 may run an operating system different from the operating system (OS) of the electronic device 201. For example, it may operate on the Java Card Open Platform (JCOP) operating system.

The sensor module 240 may, for example, measure a physical quantity or sense an operating state of the electronic device 201 and convert the measured or sensed information into an electrical signal. The sensor module 240 may include at least one of a gesture sensor 240A, a gyro sensor 240B, an air pressure sensor 240C, a magnetic sensor 240D, an acceleration sensor 240E, a grip sensor 240F, a proximity sensor 240G, a color sensor 240H (e.g., an RGB (red, green, blue) sensor), a biometric sensor 240I, a temperature/humidity sensor, an illuminance sensor, and an ultraviolet (UV) sensor 240M. Additionally or alternatively, the sensor module 240 may include, for example, an e-nose sensor, an electromyography (EMG) sensor, an electroencephalogram (EEG) sensor, an electrocardiogram (ECG) sensor, an infrared (IR) sensor, an iris sensor, and/or a fingerprint sensor. The sensor module 240 may further include a control circuit for controlling at least one sensor belonging thereto. In some embodiments, the electronic device 201 may further include a processor configured to control the sensor module 240, either as part of the processor 210 or separately, so that the sensor module 240 can be controlled while the processor 210 is in a sleep state.

The input device 250 may include, for example, a touch panel 252, a (digital) pen sensor 254, a key 256, or an ultrasonic input device 258. The touch panel 252 may use, for example, at least one of electrostatic (capacitive), pressure-sensitive (resistive), infrared, and ultrasonic methods. The touch panel 252 may further include a control circuit. The touch panel 252 may further include a tactile layer to provide a tactile response to the user. The (digital) pen sensor 254 may, for example, be part of the touch panel or include a separate recognition sheet. The key 256 may include, for example, a physical button, an optical key, or a keypad. The ultrasonic input device 258 can sense, through a microphone (e.g., the microphone 288), ultrasonic waves generated by an input tool and identify data corresponding to the sensed ultrasonic waves.

Display 260 (e.g., display 160) may include panel 262, hologram device 264, projector 266, and / or control circuitry for controlling them. The panel 262 may be embodied, for example, flexibly, transparently, or wearably. The panel 262 may comprise a touch panel 252 and one or more modules. The hologram device 264 can display a stereoscopic image in the air using interference of light. The projector 266 can display an image by projecting light onto a screen. The screen may be located, for example, inside or outside the electronic device 201.

The interface 270 may include, for example, an HDMI 272, a USB 274, an optical interface 276, or a D-sub (D-subminiature) 278.

The interface 270 may, for example, be included in the communication interface 170 shown in FIG. 1. Additionally or alternatively, the interface 270 may include, for example, a mobile high-definition link (MHL) interface, an SD card/multi-media card (MMC) interface, or an infrared data association (IrDA) interface.

The audio module 280 can, for example, convert sounds and electrical signals bidirectionally. At least some components of the audio module 280 may be included, for example, in the input/output interface 150 shown in FIG. 1. The audio module 280 may process sound information input or output through, for example, a speaker 282, a receiver 284, earphones 286, or a microphone 288.

The camera module 291 is, for example, a device capable of capturing still images and moving images, and according to one embodiment may include one or more image sensors (e.g., a front sensor or a rear sensor), a lens, an image signal processor (ISP), or a flash (e.g., an LED or a xenon lamp).

The power management module 295 can, for example, manage the power of the electronic device 201. According to one embodiment, the power management module 295 may include a power management integrated circuit (PMIC), a charger IC, or a battery or fuel gauge. The PMIC may have a wired and/or wireless charging scheme. The wireless charging scheme may include, for example, a magnetic resonance scheme, a magnetic induction scheme, or an electromagnetic wave scheme, and may further include an additional circuit for wireless charging, for example, a coil loop, a resonant circuit, or a rectifier. The battery gauge can measure, for example, the remaining charge of the battery 296, or the voltage, current, or temperature during charging.

The battery 296 may include, for example, a rechargeable battery and / or a solar cell.

The indicator 297 may indicate a particular state of the electronic device 201 or a portion thereof (e.g., processor 210), e.g., a boot state, a message state, or a state of charge.

The motor 298 can convert an electrical signal into mechanical vibration and can generate vibration, haptic effects, and the like. The electronic device 201 may include a mobile TV support device (e.g., a GPU) capable of processing media data according to standards such as digital multimedia broadcasting (DMB), digital video broadcasting (DVB), or MediaFLO™.

Each of the components described in this document may be composed of one or more parts, and the name of a component may vary according to the type of the electronic device. In various embodiments, an electronic device (e.g., the electronic device 201) may omit some components, include additional components, or combine some of the components into a single entity that performs the same functions as the original components.

3 is a block diagram of a program module according to various embodiments. According to one embodiment, the program module 310 (e.g., the program 140) may include an operating system that controls resources associated with the electronic device (e.g., the electronic device 101) and/or various applications (e.g., the application program 147) running on the operating system. The operating system may include, for example, Android™, iOS™, Windows™, Symbian™, Tizen™, or Bada™.

Referring to FIG. 3, the program module 310 may include a kernel 320 (e.g., the kernel 141), middleware 330 (e.g., the middleware 143), an API 360 (e.g., the API 145), and/or an application 370 (e.g., the application program 147). At least a portion of the program module 310 may be preloaded on the electronic device or downloaded from an external electronic device (e.g., the electronic devices 102 and 104, the server 106, and the like).

The kernel 320 may include, for example, a system resource manager 321 and/or a device driver 323. The system resource manager 321 can perform control, allocation, or recovery of system resources. According to one embodiment, the system resource manager 321 may include a process manager, a memory manager, or a file system manager. The device driver 323 may include, for example, a display driver, a camera driver, a Bluetooth driver, a shared memory driver, a USB driver, a keypad driver, a WiFi driver, an audio driver, or an inter-process communication (IPC) driver.

The middleware 330 may, for example, provide functions commonly needed by the applications 370, or provide various functions to the applications 370 through the API 360 so that the applications 370 can use the limited system resources within the electronic device. According to one embodiment, the middleware 330 may include a runtime library 335, an application manager 341, a window manager 342, a multimedia manager 343, a resource manager 344, a power manager 345, a database manager 346, a package manager 347, a connectivity manager 348, a notification manager 349, a location manager 350, a graphic manager 351, a security manager 352, or a payment manager 354.

The runtime library 335 may include, for example, a library module that the compiler uses to add new functionality via a programming language while the application 370 is executing. The runtime library 335 may perform input / output management, memory management, or arithmetic function processing.

The application manager 341 can manage the life cycle of the application 370, for example. The window manager 342 can manage GUI resources used in the screen. The multimedia manager 343 can recognize the format required for reproducing the media files and can perform encoding or decoding of the media file using a codec according to the format. The resource manager 344 can manage the source code of the application 370 or the space of the memory. The power manager 345 may, for example, manage the capacity or power of the battery and provide the power information necessary for operation of the electronic device.

According to one embodiment, the power manager 345 may interoperate with a basic input / output system (BIOS). The database manager 346 may create, retrieve, or modify the database to be used in the application 370, for example. The package manager 347 can manage installation or update of an application distributed in the form of a package file.

The connectivity manager 348 may, for example, manage wireless connections. The notification manager 349 may provide the user with events such as an arrival message, an appointment, or a proximity notification. The location manager 350 can, for example, manage the location information of the electronic device. The graphic manager 351 may, for example, manage graphical effects to be provided to the user or a related user interface. The security manager 352 may provide, for example, system security or user authentication. The payment manager 354 can relay payment information from the application 370 to another application 370 or the kernel 320. In addition, it can store information related to payments received from an external device in the electronic device 201, or transmit information stored in the electronic device 201 to the external device.

According to one embodiment, the middleware 330 may include a telephony manager for managing voice or video call functions of the electronic device, or a middleware module capable of forming combinations of the functions of the above-described components. According to one embodiment, the middleware 330 may provide modules specialized for each type of operating system. The middleware 330 may dynamically delete some existing components or add new components.

The API 360 is, for example, a set of API programming functions, and may be provided in different configurations depending on the operating system. For example, in the case of Android or iOS, one API set may be provided per platform, and in the case of Tizen, two or more API sets may be provided per platform.

The application 370 may include, for example, home 371, dialer 372, SMS/MMS 373, instant message 374, browser 375, camera 376, alarm 377, contacts 378, voice dial 379, email 380, calendar 381, media player 382, album 383, clock 384, payment 385, health care (e.g., blood glucose measurement), or environmental information (e.g., pressure, humidity, or temperature information) applications.

According to one embodiment, the application 370 may include an information exchange application capable of supporting information exchange between the electronic device and an external electronic device. The information exchange application may include, for example, a notification relay application for delivering specific information to the external electronic device, or a device management application for managing the external electronic device. For example, the notification relay application can transmit notification information generated in another application of the electronic device to the external electronic device, or receive notification information from the external electronic device and provide it to the user. The device management application may, for example, control functions of an external electronic device communicating with the electronic device (e.g., turning on or off the external electronic device itself (or some components thereof) or adjusting the brightness (or resolution) of its display), or install, delete, or update an application running on the external electronic device.

According to one embodiment, the application 370 may include an application designated according to the attributes of the external electronic device (e.g., a healthcare application of a mobile medical device). According to one embodiment, the application 370 may include an application received from an external electronic device. At least a part of the program module 310 may be implemented (e.g., executed) in software, firmware, hardware (e.g., a processor), or a combination of at least two of them, and may include a module, a program, a routine, an instruction set, or a process for performing one or more functions.

As used herein, the term "module" includes a unit composed of hardware, software, or firmware, and may be used interchangeably with terms such as logic, logic block, component, or circuit. A "module" may be an integrally constructed component or a minimum unit, or part thereof, that performs one or more functions. A "module" may be implemented mechanically or electronically and may include, for example, an application-specific integrated circuit (ASIC) chip, field-programmable gate arrays (FPGAs), or programmable logic devices that perform certain operations.

At least some of the devices (e.g., modules or functions thereof) or methods (e.g., operations) according to various embodiments may be implemented as instructions stored in a computer-readable storage medium (e.g., the memory 130) in the form of a program module. When the instructions are executed by a processor (e.g., the processor 120), the processor may perform functions corresponding to the instructions.

The computer-readable recording medium may include a hard disk, a floppy disk, magnetic media (e.g., magnetic tape), optical recording media (e.g., CD-ROM, DVD), or magneto-optical media. The instructions may include code generated by a compiler or code that can be executed by an interpreter. A module or program module according to various embodiments may include at least one of the above-described components, may omit some of them, or may further include other components.

Operations performed by modules, program modules, or other components according to various embodiments may be executed sequentially, in parallel, repetitively, or heuristically, or at least some operations may be executed in a different order or omitted, or other operations may be added.

The embodiments disclosed in this document are presented for the purpose of explanation and understanding of the disclosed technical content and do not limit the scope of the technology described in this document. Accordingly, the scope of this document should be interpreted as including all modifications or various other embodiments based on the technical idea of this document.

4 is a diagram showing the configuration of an electronic device according to various embodiments of the present disclosure.

Referring to FIG. 4, an electronic device 400 according to various embodiments of the present disclosure may include a camera 410, a wireless communication unit 420, a memory 430, a touch screen 440, and a processor 450.

According to various embodiments of the present disclosure, the camera 410 may perform photographing of a subject and convert the photographed subject to digital information. The photographs may include text information, image information, and file information on which moving pictures are recorded. The photo may include location information. If the photo contains location information, it can be displayed in order of year, month, day, time, and location (e.g., year - month - day - time - location) taken.

According to one embodiment, when the photograph includes position information, the electronic device 400 may further include a position sensing unit. The position sensing unit may sense the position of the electronic device 400 periodically. The photographing position information sensed by the position sensing unit may be attached to the photograph as a tag by the processor 450. For the position measurement of the position sensing unit, a cell-based method, a Global Positioning System (GPS) method, an Angle of Arrival (AOA) method, a Time Difference of Arrival (TDOA) method, or an Enhanced Observed Time Difference (E-OTD) method may be used.

According to various embodiments, the wireless communication unit 420 may perform the communication functions of the electronic device 400. The wireless communication unit 420 may form a communication channel with a network and perform communication with at least one external device to support at least one of a voice call, a video call, and a data communication function. The wireless communication unit 420 may include various communication modules such as a mobile communication module (at least one module capable of supporting various communication methods such as 2G, 3G, and 4G), a WiFi module, and a short-range communication module. The wireless communication unit 420 may include an RF transmitter that up-converts and amplifies the frequency of a transmitted signal, and an RF receiver that low-noise-amplifies a received signal and down-converts its frequency. The wireless communication unit 420 may receive data through a wireless channel and transmit the data to the processor 450, and may transmit data output from the processor 450 to an external device through a wireless channel.

According to various embodiments of the present disclosure, the wireless communication unit 420 may support communication, for example, by forming a communication channel with an external electronic device or a cloud server so as to exchange or share a predetermined photograph.

According to various embodiments, the memory 430 may store a program for processing and control by the processor 450, an operating system (OS), various applications, and input/output data, and may store programs that control the overall operation of the electronic device 400. The memory 430 may store a user interface (UI) provided by the electronic device 400 and various setting information necessary for performing functions in the electronic device 400.

According to various embodiments of the present disclosure, the memory 430 may store content such as photographs taken via the camera 410 and at least one photograph shared via the wireless communication unit 420. The memory 430 may store the mathematical formulas and source code that can be used to sort the photographed and stored photographs and to detect the peak value, which is the average number of photographs per hour.

According to various embodiments, the touch screen 440 may perform an input function and a display function. To this end, the touch screen 440 may include a touch panel 441 and a display unit 445. The touch panel 441 may be composed of a touch sensing sensor of a capacitive overlay, resistive overlay, or infrared beam type, or may be composed of a pressure sensor. In addition to these sensors, any kind of sensor device capable of sensing contact or pressure of an object may constitute the touch panel 441 of the embodiment of the present invention. The touch panel 441 senses a user's touch input, generates a sensing signal, and transmits the sensing signal to the processor 450. The sensing signal may include coordinate data of the user's input. When the user inputs a touch-position movement operation, the touch panel 441 may generate a sensing signal including coordinate data of the touch-position movement path and transmit it to the processor 450.

According to various embodiments of the present disclosure, the touch panel 441 may include a force touch (pressure) sensor 442. The pressure sensor 442 can sense the depth (pressure) with which the user touches the touch panel 441, and can easily support selection of a group of pictures classified by, for example, year, month, day, and time.

According to various embodiments, the display unit 445 may display various types of menus of the electronic device 400, information that the user inputs, or information to provide to the user. The display unit 445 may be formed of a liquid crystal display, an organic light emitting diode (OLED), an active matrix organic light emitting diode (AMOLED), a flexible display, a transparent display, or the like. The display unit 445 may provide various screens such as a home screen, a menu screen, a lock screen, a game screen, a web page screen, a call screen, a music or a moving picture playback screen according to the use of the electronic device 400.

According to various embodiments of the present disclosure, the display unit 445 may display the photographed and stored photographs in various manners under the control of the processor 450. For example, the display unit 445 may display the pictures stored in the memory 430 in a stacked-layer manner, in a manner in which the corresponding layer pops up when the pressure sensor 442 is pressed, or in a horizontally aligned planar arrangement. In addition, various other display manners such as a card-type UI, a rotating-plate UI, and an album-type arrangement may be used.

According to various embodiments, the processor 450 may perform functions of controlling the overall operation of the electronic device 400 and the signal flow between its internal components, and of processing data. The processor 450 may be formed of, for example, a central processing unit (CPU), an application processor, or a communication processor. The processor 450 may be formed of a single-core or multi-core processor, and may be composed of a plurality of processors.

In accordance with various embodiments of the present disclosure, the processor 450 may automatically subdivide a large number of photographs, taken via the camera 410 and stored in the memory 430 or shared with an external device or cloud server via the wireless communication unit 420, into a plurality of time-unit groups within the corresponding date, based on the photographing time information of the photographs. For example, if there are a plurality of photographing events in a day, a peak value, which is the average number of pictures per hour of the day calculated on a 24-hour basis, is computed, and groups of photographs in the time zones around the peak value can be automatically clustered by time division.

According to various embodiments, when many pictures are taken in a specific time zone, the processor 450 may cluster the taken pictures by subdividing the time in units of minutes. Conversely, when few photographs are taken in a specific time period, the processor 450 may apply a wider time interval to cluster the photographed photographs. That is, the processor 450 may dynamically change the clustering interval according to the number or frequency of photographs taken in a specific time period.
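As a hedged sketch of the dynamic interval described above, the following picks a finer interval for denser time zones; the thresholds of 30 and 100 photos and the 60/30/15-minute intervals are illustrative assumptions, not values taken from the disclosure.

```python
def clustering_interval_minutes(photo_count_in_zone: int) -> int:
    """Pick a finer clustering interval when a time zone contains many photos."""
    if photo_count_in_zone >= 100:
        return 15   # very dense shooting: split into 15-minute groups
    if photo_count_in_zone >= 30:
        return 30   # moderately dense: 30-minute groups
    return 60       # sparse shooting: keep a wider, hour-level group
```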

According to various embodiments, the processor 450 may check the number of pictures photographed, based on a first time unit, in the time period in which at least one photograph was taken. If the number of photographs falls within a first range, the number of photographs taken based on the first time unit can be checked for that time period and the consecutive time periods. For example, the first time unit may be 24 hours, and the number of photographs may be checked in order to calculate the peak value, which is the average number of photographs per hour on a 24-hour basis.

According to various embodiments, if the number of photographed pictures falls within a second range, the processor 450 may check the number of pictures photographed based on a second time unit in the time period and the consecutive time periods. For example, the second time unit may be a unit such as 45 minutes, 30 minutes, or 15 minutes.

According to various embodiments, the processor 450 may set as one group the photographs corresponding to the time period and the consecutive time periods, based on the number of photographs identified for those periods under either the first time unit or the second time unit.
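The following sketch illustrates one possible reading of the two-stage check above: photographs are first counted per first time unit (hour buckets of the day), dense hours are re-bucketed with a second, finer unit, and consecutive qualifying buckets are merged into one group. The threshold of 20 photos per hour and the 15-minute second unit are assumptions made for illustration only.

```python
from collections import Counter
from datetime import datetime
from typing import List, Tuple

def cluster_day(timestamps: List[datetime],
                dense_per_hour: int = 20,
                fine_unit_min: int = 15) -> List[Tuple[int, int]]:
    """Return photo groups as (start_minute, end_minute) spans within one day."""
    per_hour = Counter(ts.hour for ts in timestamps)
    buckets = set()                                   # time buckets that contain photos
    for ts in timestamps:
        minute_of_day = ts.hour * 60 + ts.minute
        # Dense hours are split with the finer second unit; sparse hours stay hour-wide.
        unit = fine_unit_min if per_hour[ts.hour] >= dense_per_hour else 60
        start = (minute_of_day // unit) * unit
        buckets.add((start, start + unit))

    groups: List[Tuple[int, int]] = []
    for start, end in sorted(buckets):                # merge consecutive time periods
        if groups and start <= groups[-1][1]:
            groups[-1] = (groups[-1][0], max(groups[-1][1], end))
        else:
            groups.append((start, end))
    return groups
```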

5 is a diagram illustrating a detailed configuration of a processor according to various embodiments of the present disclosure;

According to one embodiment, the processor 450 may include a photo data analysis unit 451, a peak detection unit 453, a photo classification unit 455, and an application application unit 457. The photo data analysis unit 451 can analyze, for a plurality of photo files photographed and stored in the memory 430, the number of photographs taken per date and the number taken per year, month, day, hour, and minute. The peak detection unit 453 can analyze the average number of pictures per hour of the day on a 24-hour basis. The photo classification unit 455 may classify groups of photographs in the time zones before and after the period in which photographing is concentrated, based on the average number of photographs per hour calculated by the peak detection unit 453. The application application unit 457 may display the photographs subdivided by the photo classification unit 455 in various manners, for example, in a pop-up manner in which the corresponding layer is displayed when the pressure sensor 442 is pressed, or in a scroll manner.
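A minimal structural sketch of how the four sub-units named above could hand data to one another; every name and body below is an illustrative assumption showing the data flow, not the actual implementation of the units 451 through 457.

```python
from collections import Counter
from datetime import datetime
from typing import Dict, List

def analyze_photo_data(timestamps: List[datetime]) -> Dict[int, int]:
    """Photo data analysis unit: count photos per hour of the day."""
    return dict(Counter(ts.hour for ts in timestamps))

def detect_peak(per_hour: Dict[int, int]) -> float:
    """Peak detection unit: average photos per active hour."""
    return sum(per_hour.values()) / max(len(per_hour), 1)

def classify_photos(timestamps: List[datetime],
                    per_hour: Dict[int, int], peak: float) -> List[datetime]:
    """Photo classification unit: keep photos in hours at or above the peak."""
    return [ts for ts in timestamps if per_hour.get(ts.hour, 0) >= peak]

def apply_to_application(group: List[datetime]) -> None:
    """Application unit: hand the grouped photos to the gallery UI (stubbed)."""
    print(f"photo group with {len(group)} pictures")
```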

Although various embodiments of the present disclosure describe the processor 450 as separately including the photo data analysis unit 451, the peak detection unit 453, the photo classification unit 455, and the application application unit 457 to perform the corresponding functions, the processor 450 may perform these functions directly without separately including those units.

The electronic device 400 according to various embodiments of the present disclosure may be modified in various ways depending on the convergence trend of digital devices. For example, the electronic device 400 of the present disclosure may further include other components, such as a position sensing unit for sensing information related to position changes and a GPS module for measuring the position of the electronic device 400.

6 is a flow chart illustrating a method of clustering photos of an electronic device according to various embodiments of the present disclosure.

According to one embodiment, the processor 450 may, in operation 610, check the collection of photo data stored in the memory 430. The photo data may be data photographed by the camera 410 or shared with an external electronic device or a cloud server through the wireless communication unit 420. Although the present disclosure describes an example of clustering photo data, text files, images, and moving image files can also be clustered. As shown in FIG. 7, the photo data may be stored in the memory 430 in the order of year, month, day, and serial number (e.g., 20150501_194352 to 20150329_155513.jpg). If the photo data includes position information, the pictures may be stored in the memory 430 in the order of year, month, day, photographing time, and photographing position (e.g., YYYY-MM-DD_PM00 at Kangnam station).
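
As a rough, hypothetical illustration (not part of the disclosed embodiments), the shooting timestamps could be recovered from file names of the YYYYMMDD_HHMMSS form shown above; the directory path, file extension, and naming pattern assumed below are illustrative, and a real implementation would more likely read the EXIF capture time.

from datetime import datetime
from pathlib import Path

def collect_shot_times(photo_dir):
    # Gather shooting timestamps from file names like 20150501_194352.jpg.
    # Assumes the YYYYMMDD_HHMMSS naming shown in FIG. 7; production code
    # would fall back to the EXIF capture time when the name does not match.
    times = []
    for path in sorted(Path(photo_dir).glob("*.jpg")):
        parts = path.stem.split("_")
        try:
            times.append(datetime.strptime(parts[0] + parts[1], "%Y%m%d%H%M%S"))
        except (IndexError, ValueError):
            continue  # skip files that do not follow the naming scheme
    return sorted(times)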

According to one embodiment, in operation 620, the processor 450 may analyze the number of pictures photographed per year, month, day, hour, and minute for the plurality of photo data. For example, as shown in FIGS. 8 and 9, the processor 450 can determine that a total of three pictures were taken at approximately 13:05 on May 16, 2015, that a total of 10 pictures were taken between 20:12 and 20:15, and that a total of 18 pictures were taken between 19:27 and 19:29 on May 22, 2015. In the embodiments of the present disclosure, the time information of photographs is analyzed; however, the time information of other content carrying EXIF information, such as moving image files, may be analyzed in the same way.
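
The per-hour tally of operation 620 could then be sketched as follows; the function name and the (date, hour) key shape are assumptions made for illustration, not the disclosed implementation.

from collections import Counter

def tally_by_hour(shot_times):
    # Count photographs per (date, hour) slot, e.g. 18 pictures in the
    # 19:00-19:59 slot of 2015-05-22 as in the example above.
    counts = Counter()
    for t in shot_times:
        counts[(t.date(), t.hour)] += 1
    return counts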

According to one embodiment, in operation 630, the processor 450 may analyze, for each year, month, and day, the average number of pictures per hour and the time zones in which the pictures were taken, based on the 24 hours of the day. For example, as shown in FIG. 9, a total of 15 pictures were taken between 18:00 and 18:59 on January 5, 2014, and a total of 13 pictures were taken between 14:00 and 14:59 on March 15, 2014.

According to one embodiment, in operation 640, the processor 450 subdivides and automatically classifies groups of pictures in the time zones in which photographing is concentrated, based on the average number of pictures per hour analyzed in operation 630. For example, as shown in FIG. 10, the processor 450 can detect the peak time of the corresponding day in which photographing is concentrated (e.g., 15:00 on November 15, 2014, where the count is equal to or greater than the average of 190/10 = 19) and classify, as a group, the pictures of the peak time zones of that day (e.g., 12:00 to 17:00 on November 15, 2014).

According to one embodiment, the processor 450 may, in operation 650, apply the groups of pictures classified by time in operation 640 to an application and display them to the user. For example, as shown in FIG. 11, the processor 450 may provide the groups of pictures classified by time so that the user can easily manage the photographs, for example store, edit, delete, and share them. In addition, the processor 450 can provide an easy photo browsing service by providing a timeline classified by time unit as shown in FIG. 11, and may mark the important photo groups that include the peak photographing time zones with markers 1101 and 1102 for the respective dates. The processor 450 may also display a title indicating an important moment, derived from the subject of the first photo group, while displaying the corresponding photo group.

FIG. 12 is a flow chart specifically describing the photographing time zone analysis of operation 630 and the photo classification operation of operation 640 according to various embodiments of the present disclosure.

According to one embodiment, the processor 450 may, in operation 1210, classify the pictures of a specific date by time. For example, as shown in FIG. 13, the processor 450 divides each specific date (e.g., Day 1 and Day 2) into the 24 hours from 0 to 24 and stores the number of pictures taken in each hour. Referring to FIG. 13, it can be confirmed that two pictures were taken between 0 and 1 o'clock on Day 1, and that 10 pictures were taken between 5 and 6 o'clock on Day 2.

According to one embodiment, the processor 450 may, in operation 1220, obtain the average number of pictures per hour of the specific date. For example, the average number of pictures per hour of a specific date can be obtained by the following Equations (1) and (2) stored in the memory 430.

[Equation 1]

Pv = TIpD / h1

Here, Pv represents the average number of pictures per hour of the specific date, TIpD (total images per day) represents the total number of pictures of that date, and h1 represents the corresponding number of hours.

[Equation 2]

AIpH = TIpD / h1 {}

Here, AIpH represents the average number of pictures per hour, TIpD represents the total number of pictures of the date, and h1{} represents the set of hourly slots in which pictures exist (24 minus the number of hours in which no pictures were taken).

For example, the AIpH (average number of pictures per hour) of Day 1 shown in FIG. 13 is 37 / (24 - 10) = 2.6428..., because the total number of pictures of Day 1 is 37 and 10 of the 24 hourly slots contain no pictures.

According to one embodiment, in operation 1230, the processor 450 may convert the average number of pictures obtained in operation 1220 into an integer, to reduce the amount of calculation, using the following Equation (3).

[Equation 3]

f(AIpH) = ⌈AIpH⌉ (ceiling function)

Here, the ceiling function rounds its input up to the next integer unconditionally, so the value AIpH = 37 / (24 - 10) = 2.6428... obtained above is converted into f(AIpH) = 3.
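
Taken together, Equations (2) and (3) amount to the following sketch, assuming the pictures of a day have already been binned into 24 hourly counts; applied to the Day 1 figures above (37 pictures spread over 14 occupied hours) it returns roughly 2.64 and a threshold of 3. The function names are illustrative.

import math

def average_images_per_hour(hour_counts):
    # Equation (2): AIpH = TIpD / h1{}, where h1{} counts only the hourly
    # slots of the day that actually contain pictures.
    total = sum(hour_counts)                          # TIpD
    occupied = sum(1 for c in hour_counts if c > 0)   # 24 minus the empty hours
    return total / occupied if occupied else 0.0

def peak_threshold(hour_counts):
    # Equation (3): round AIpH up to the next integer unconditionally.
    return math.ceil(average_images_per_hour(hour_counts))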

According to one embodiment, in operation 1240, the processor 450 may detect as a "peak" any time zone whose picture count is equal to or greater than the specific value obtained in operation 1230 (e.g., the integer 3). A "peak" may be, for example, a time zone in which three or more pictures were taken intensively. For example, as shown in FIG. 14, the processor 450 can confirm for a specific date that photographing was concentrated with three pictures between 9 and 10 o'clock, three between 11 and 12 o'clock, four between 12 and 13 o'clock, five between 13 and 14 o'clock, three between 19 and 20 o'clock, four between 21 and 22 o'clock, and five between 22 and 23 o'clock.
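
Operation 1240 then reduces to comparing each hourly count against that threshold; the sketch below assumes the same 24-slot count list as above.

def detect_peaks(hour_counts, threshold):
    # The "peaks" are the hourly slots whose picture count reaches the
    # integerized threshold (3 in the Day 1 example).
    return [hour for hour, count in enumerate(hour_counts) if count >= threshold]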

According to one embodiment, the processor 450 may, in operation 1250, cluster the photographs into groups that exist in the time zones before and after the peaks detected in operation 1240. For example, as shown in FIG. 15, the processor 450 can classify into photo groups the photographs taken between 9 and 14 o'clock and between 20 and 24 o'clock, the periods before and after the times at which three or more photographs were taken intensively.

In this case, in order to generate the dividing points of the photo groups in which photographing is concentrated, the processor 450 can record the start and end times of the time zones whose picture counts reach the specified number.
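
One plausible reading of operations 1240 to 1250 is sketched below: each peak hour is grown into a group by absorbing the neighbouring hours that still contain pictures, and touching groups are merged. How far the before-and-after time zones extend is not fully specified in the text, so this growth rule is an assumption.

def cluster_around_peaks(hour_counts, peaks):
    # Returns (start_hour, end_hour_exclusive) pairs; `peaks` is assumed to
    # be the ascending list produced by detect_peaks above.
    groups = []
    for peak in peaks:
        start = peak
        while start > 0 and hour_counts[start - 1] > 0:
            start -= 1
        end = peak
        while end < len(hour_counts) - 1 and hour_counts[end + 1] > 0:
            end += 1
        if groups and start <= groups[-1][1]:
            # Overlaps or touches the previous group, so extend it instead.
            groups[-1] = (groups[-1][0], max(groups[-1][1], end + 1))
        else:
            groups.append((start, end + 1))
    return groups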

According to various embodiments, by applying operations 1210 to 1250 described above, the processor 450 can group the photographs even when photographing continues throughout the entire day (e.g., the travel photographs shown in FIG. 16) by subdividing the "peak" into units of 1 hour, 30 minutes, or 15 minutes. Referring to FIG. 16, when 187 photographs are taken over the 14 hours from 6 to 19 o'clock during a trip, Equation (2) gives 187 / (24 - 10) = 13.3571, and applying Equation (3) to this value yields a peak threshold of 14.

According to various embodiments, as shown in FIG. 17, when the length of a photo group is 6 hours or more, the processor 450 may double the number of time divisions so that the 24 hours are divided into 48 intervals, and check the photographing time zones on that finer scale.

According to various embodiments, as shown in FIG. 18, when the user continues photographing for more than 12 hours, the processor 450 may quadruple the number of time divisions so that the 24 hours are divided into 96 intervals, and check the photographing time zones accordingly.

According to one embodiment, the processor 450 may use an algorithm such as Equation (4) below to derive the dividing points of a photo group.

[Equation 4]

if EndTime - StartTime ≥ 12 then
    threshold ← 15min
else if EndTime - StartTime ≥ 6 then
    threshold ← 30min
else
    threshold ← 1hour
end if

Using the equation (4), it is possible to prevent the photographs of the entire day from being merged into one group of photographs.
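
In Python, Equation (4) might look like the following sketch; the 12-hour test is evaluated first so that both the 30-minute and 15-minute branches are reachable, mirroring the 48- and 96-division cases described above. The parameter names are assumptions that mirror EndTime and StartTime.

from datetime import timedelta

def group_threshold(start_time, end_time):
    # Choose the sub-division width of a photo group from its span.
    span = end_time - start_time
    if span >= timedelta(hours=12):
        return timedelta(minutes=15)   # 96 divisions of the day
    if span >= timedelta(hours=6):
        return timedelta(minutes=30)   # 48 divisions of the day
    return timedelta(hours=1)          # default one-hour divisions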

According to various embodiments, when the photographed pictures include position information, the processor 450 may select position information representative of a photo group and store it in the memory 430. The processor 450 may also generate a classification event that merges photo groups having close position information into an important photo group, taking into account the spacing between clusters around a photo group that the user is likely to consider important.
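
The text does not say how the representative position is chosen; one simple, hypothetical choice is the photo coordinate closest to the group's mean position, as in the sketch below.

def representative_position(coords):
    # `coords` is a non-empty list of (latitude, longitude) pairs taken from
    # the photos of one group; returns the pair nearest the mean position.
    mean_lat = sum(lat for lat, _ in coords) / len(coords)
    mean_lon = sum(lon for _, lon in coords) / len(coords)
    return min(coords, key=lambda c: (c[0] - mean_lat) ** 2 + (c[1] - mean_lon) ** 2)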

FIGS. 19 to 25 illustrate various examples in which the groups of pictures classified by time unit in operation 650 are applied to an application, according to various embodiments of the present disclosure.

According to one embodiment, as shown in FIG. 19, the processor 450 may present the photo groups in a stacked-layer manner through the display unit 445. For example, referring to FIG. 19, the photo groups may be displayed on the display screen as stacked layers in the order of year (1), month (2), and day (3, DayCluster). In this case, the photographs can easily be navigated through pinch zoom in/out on the layers.

According to one embodiment, as shown in FIG. 20, the processor 450 may assist in retrieving a photo group by a force touch through the touch panel 441 provided with the pressure sensor 442. For example, referring to FIG. 20, when the photo groups are displayed on the display screen as stacked layers in the order of year (1), month (2), and day (3, DayCluster), the user can quickly navigate to the photo group of a desired layer by touch and pressure. In this case, the photo group of the layer is directly enlarged according to the user's touch and pressure; for example, the user can adjust the degree of touch and pressure on the touch panel 441 to move from the year unit (1) directly to the photo group of a lower layer (5). When the pressure sensor 442 is not provided in the touch panel 441, the processor 450 can support entering and leaving the photo group of the corresponding layer through pinch zoom in/out.

According to one embodiment, as shown in FIG. 21, the processor 450 may arrange the photo groups in a horizontally aligned planar manner through the display unit 445. For example, referring to FIG. 21, the photo groups may be displayed on the display screen in the order of year (1), month (2), and day (3, DayCluster), and the user can access the photo group of the corresponding layer through left and right scrolling. In this case as well, the processor 450 can directly control entering and leaving the photo group of a layer through pinch zoom in/out. In addition, unnecessary photo groups can be hidden through the radio button 6 arranged at the top of a specific layer (see FIG. 22), so that the user can intuitively access the photo group of the corresponding layer.

According to one embodiment, as shown in FIG. 22, the processor 450 can quickly move to a group of pictures desired by the user using the scroll bar 7 disposed on the side of the layer.

According to one embodiment, as shown in FIG. 23, the processor 450 may provide a timeline 8 on the side of the layer to facilitate browsing of the grouped pictures. For example, referring to FIG. 23, with the photo groups automatically classified by subject, the user can select the corresponding timeline 8 and enter a pop-up menu for that photo group. This allows folders to be created for photo groups, photos to be edited and deleted, shared with other external devices, and uploaded to an SNS. In addition, the processor 450 may display a title indicating an important moment, derived from the subject of the first photo group, while displaying the corresponding photo group. For example, based on the time information of the photographs, titles can be generated automatically for holidays and anniversaries, for the user's vacations, meetings, and birthdays, and, when additional information about people or places is included, titles indicating with whom and where the pictures were taken.

According to one embodiment, as shown in FIG. 24, the processor 450 may upload a photo group to an SNS or recommend it to another user in an SNS application or the like. For example, referring to FIG. 24, when the user touches and selects a specific photograph, the processor 450 increases the user's convenience by recommending the entire classified photo group associated with that photograph.

According to one embodiment, as shown in FIG. 25, the processor 450 may classify and organize all the pictures stored in the memory 430 of the electronic device 400 into folders, group by group. For example, referring to FIG. 25, when the user selects Auto packing from the menu 9 displayed through the display unit 445, the processor 450 can send the automatically classified folders to an external electronic device so that the same photo groups can be shared with other users. In this way the pictures can also be uploaded and managed on a folder basis in an SNS that is not linked to the automatic photo classification function. In this case, the processor 450 can support generating and managing folders for photo groups automatically classified by year, month, day (DayCluster: day 1 to day 31), and hour.
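
A minimal sketch of such an Auto packing step is shown below, under the assumption that each photo group is copied into its own folder; the folder-naming scheme and the copy-rather-than-move choice are not specified in the text.

import shutil
from pathlib import Path

def auto_pack(groups, dest_root):
    # `groups` maps a group name (e.g. "2014-11-15_12-17", a hypothetical
    # scheme) to the list of photo paths belonging to that group.
    for name, paths in groups.items():
        folder = Path(dest_root) / name
        folder.mkdir(parents=True, exist_ok=True)
        for p in paths:
            shutil.copy2(p, folder / Path(p).name)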

The various embodiments disclosed in the present specification and drawings are only specific examples presented to aid understanding, and are not intended to limit the scope of the various embodiments of the present invention. Accordingly, the scope of the various embodiments of the present invention should be interpreted as including not only the embodiments described above but also all changes or modifications derived from the technical idea of the various embodiments of the present invention.

400: electronic device 410: camera
420: wireless communication unit 430: memory
440: touch screen 441: touch panel
442: pressure sensor 445: display unit
450: processor 451: photo data analysis unit
453: peak detection unit 455: photo classification unit
457: application application unit

Claims (21)

An electronic device comprising:
a camera;
a touch screen;
a processor electrically connected to the camera and the touch screen; and
a memory electrically connected to the processor,
wherein the memory stores instructions that, when executed, cause the processor to:
check picture data stored in the memory,
analyze the number of photographs with respect to the photographing times of the picture data,
analyze the photographing time zones of the picture data and the number of photographs per hour, and
classify, in units of time, based on the analyzed number of photographs per hour, a group of photographs including the time zone in which photographing is concentrated and the time zones before and after it.
The electronic device of claim 1,
wherein the processor applies the groups of pictures classified by time unit to an application and displays them.
The electronic device of claim 1,
wherein the photograph comprises at least one of text, an image, and a moving image.
The electronic device of claim 1,
wherein, when the photograph includes position information, the photograph includes year, month, day, photographing time, and photographing position information.
The electronic device of claim 1,
further comprising a wireless communication unit configured to establish a communication channel with an external electronic device or a cloud server to support sharing of a predetermined photograph.
The electronic device of claim 1,
further comprising a touch screen,
wherein the touch screen displays the photographs stored in the memory using at least one of a stacked-layer method, a pop-up method triggered by a touch on a layer, a horizontally aligned planar arrangement method, a scroll method, and a timeline method.
The electronic device of claim 1,
wherein, when there are a plurality of photographing events on a date, the processor calculates a peak value, which is the average number of photographs per hour of that day based on 24 hours, and classifies, in time units, the groups of photographs of the time zones around the times at which the number of photographs is equal to or greater than the calculated peak value.
The electronic device of claim 1,
wherein the processor comprises:
a photo data analysis unit configured to analyze, for a plurality of photo files stored in the memory, the number of photographs taken per date and per year, month, day, hour, and minute;
a peak detection unit configured to analyze the average number of photographs per hour of the date; and
a photo classification unit configured to classify, based on the number of photographs per hour analyzed through the peak detection unit, the groups of photographs of the time zones before and after the period in which photographing is concentrated.
The electronic device of claim 8,
wherein the processor further comprises an application application unit configured to display the photographs classified by the photo classification unit using various application methods.
A method of clustering photos of an electronic device, the method comprising:
checking, by a processor, a collection of photo data stored in a memory;
analyzing, by the processor, the number of photographs with respect to the photographing times of the photo data;
analyzing, by the processor, the photographing time zones of the photo data and the number of photographs per hour; and
classifying, by the processor, in units of time, based on the analyzed number of photographs per hour, a group of photographs of the time zone in which photographing is concentrated and the time zones before and after it.
The method of claim 10,
further comprising applying, by the processor, the groups of pictures classified by time unit to an application and displaying them.
The method of claim 10, further comprising:
classifying, by the processor, the pictures of a specific date by time;
obtaining, by the processor, the number of pictures per hour of the specific date;
converting, by the processor, the obtained number of pictures per hour into an integer;
detecting, by the processor, as a peak, a photographing time zone having a number of pictures equal to or greater than the integerized value; and
clustering, by the processor, the photographs into groups of photographs existing in the time zones before and after the detected peak.
The method of claim 12,
wherein the processor distinguishes the beginning and the end of a group of pictures based on the time zones in which the number of photographs reaches a specified number.
The method of claim 12,
wherein, when photographing is performed continuously over a given date, the processor variably subdivides the time unit and checks the photographing time zones.
The method of claim 12,
wherein the processor derives the dividing points of a group of pictures to prevent the pictures of the time zones outside the peak and its preceding and following time zones from being merged into the group of pictures.
The method of claim 10,
wherein the processor displays the groups of pictures stored in the memory in a stacked-layer manner using a touch screen, and supports entering and leaving the group of pictures of a corresponding layer through pinch zoom in/out.
The method of claim 10,
wherein the processor displays the groups of pictures stored in the memory using a touch screen so that a group of pictures can be retrieved by a force touch method.
The method of claim 10,
wherein the processor supports, using a touch screen, selecting a timeline for the groups of pictures stored in the memory and entering a pop-up menu for the corresponding group of pictures.
The method of claim 10,
wherein the processor supports sending folders of the classified photo groups to an external electronic device or sharing them with a cloud server.
A storage medium storing a program for controlling a function of an electronic device, wherein the program causes the electronic device to:
check picture data stored in a memory,
analyze the number of photographs with respect to the photographing times of the picture data,
analyze the photographing time zones of the picture data and the number of photographs per hour, and
classify, in units of time, based on the analyzed number of photographs per hour, the group of photographs of the time zone in which photographing is concentrated and the time zones before and after it.
An electronic device comprising:
a memory configured to store at least one photograph; and
a processor,
wherein the processor is configured to:
check the number of photographs taken, based on a first time unit, in the time period in which the at least one photograph was photographed,
if the number of photographs falls within a first range, check the number of photographs taken, based on the first time unit, in a time period consecutive with the photographed time period,
if the number of photographs falls within a second range, check the number of photographs taken, based on a second time unit, in the time period and the consecutive time period, and
when the number of photographs corresponding to the time period and the consecutive time period, checked based on either the first time unit or the second time unit, satisfies a specified condition, designate the photographs corresponding to those time periods as one group.
KR1020160018278A 2016-02-17 2016-02-17 Electronic device and method for clustering photo therein KR20170096711A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020160018278A KR20170096711A (en) 2016-02-17 2016-02-17 Electronic device and method for clustering photo therein

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020160018278A KR20170096711A (en) 2016-02-17 2016-02-17 Electronic device and method for clustering photo therein

Publications (1)

Publication Number Publication Date
KR20170096711A true KR20170096711A (en) 2017-08-25

Family

ID=59761396

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020160018278A KR20170096711A (en) 2016-02-17 2016-02-17 Electronic device and method for clustering photo therein

Country Status (1)

Country Link
KR (1) KR20170096711A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110969195A (en) * 2019-11-22 2020-04-07 中国南方电网有限责任公司超高压输电公司天生桥局 Intelligent classification method for inspection photos of power transmission line
WO2020145653A1 (en) * 2019-01-09 2020-07-16 삼성전자 주식회사 Electronic device and method for recommending image capturing place
KR20200127367A (en) 2019-05-02 2020-11-11 지현종 Apparatus and method for managing images
US11961217B2 (en) 2020-12-29 2024-04-16 Pusan National University Industry—University Cooperation Foundation Device and method for storing image data for surface defect detection scanner
