KR20150058607A - Method for outputting synthesized image, a terminal and a server thereof - Google Patents

Method for outputting synthesized image, a terminal and a server thereof

Info

Publication number
KR20150058607A
KR20150058607A
Authority
KR
South Korea
Prior art keywords
image
information
images
user
terminal
Prior art date
Application number
KR1020130139973A
Other languages
Korean (ko)
Inventor
천강우
Original Assignee
인포뱅크 주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 인포뱅크 주식회사
Priority to KR1020130139973A
Publication of KR20150058607A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50 Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/58 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/583 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/01 Social networking
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • G06T11/60 Editing figures and text; Combining figures or text
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/30 Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/33 Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Library & Information Science (AREA)
  • Business, Economics & Management (AREA)
  • Health & Medical Sciences (AREA)
  • Human Resources & Organizations (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Economics (AREA)
  • General Health & Medical Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Marketing (AREA)
  • Primary Health Care (AREA)
  • Strategic Management (AREA)
  • Tourism & Hospitality (AREA)
  • General Business, Economics & Management (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Information Transfer Between Computers (AREA)

Abstract

The present invention relates to a method and a device for executing an application. The method includes the steps of: capturing a screen of a first application being executed; registering the captured screen, together with identification information and user information for the first application, in a second application; displaying the captured screen as a thumbnail image when the second application is executed; and, when the displayed thumbnail image is selected, executing the first application using the registered identification information and user information to display a screen corresponding to the captured screen.

Description

Technical Field [0001] The present invention relates to an image output method, and to a terminal and a server device using the image output method.

More specifically, the present invention relates to a method of outputting an image through a terminal, such as a cellular phone, or through a server device.

Terminals may be divided into mobile terminals and stationary terminals depending on whether they are movable. Mobile terminals may further be divided into handheld terminals and vehicle-mounted terminals depending on whether the user can carry them directly.

As the services provided by terminals have diversified in recent years, improvements to both the structural and the software parts of the terminal have been considered.

In addition, due to the development of operating systems and applications for terminals such as smart phones, the terminal can perform various roles that improve user convenience beyond the conventional simple data communication function. The development of the Internet and of wireless communication technology also enables the terminal to access, in real time, a server that provides a service through the network and to transmit and receive data to and from it.

In particular, recently emerging social network services allow users who access them through terminals to share information such as text, images, and videos, providing rich information exchange and data sharing services.

However, conventional social network services merely provide a content sharing function; little effort has been made to efficiently manage and process the shared data so as to extend the functions of the terminal.

An object of the present invention is to provide an image output method capable of extending the functions of a terminal through a social network service and improving user convenience, and a terminal and a server using the same.

According to an embodiment of the present invention, there is provided an image output method using a network, the method comprising: obtaining a theme content to be searched in a social network server through the network; extracting image identification information from the theme content; searching a plurality of images uploaded to the social network server through a user account registered in the social network server using the extracted image identification information, and connecting them according to preset conditions to generate a composite image; and outputting the generated composite image.
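
For illustration only, the following Python sketch mimics the claimed flow with an in-memory stand-in for the social network server; all names (UploadedImage, ImageIdentifiers, search, compose) are hypothetical and not part of the disclosure.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import List, Optional

@dataclass
class UploadedImage:            # stand-in for an image post on the social network
    owner: str
    taken_at: datetime
    location: str
    pattern_hash: int           # coarse visual fingerprint (see later sketches)

@dataclass
class ImageIdentifiers:         # "image identification information"
    location: Optional[str] = None
    day: Optional[str] = None   # e.g. "2013-09-21"

def search(server_images: List[UploadedImage], ids: ImageIdentifiers) -> List[UploadedImage]:
    """Return uploaded images that satisfy every available identifier."""
    hits = []
    for img in server_images:
        if ids.location and img.location != ids.location:
            continue
        if ids.day and img.taken_at.strftime("%Y-%m-%d") != ids.day:
            continue
        hits.append(img)
    return hits

def compose(images: List[UploadedImage]) -> List[UploadedImage]:
    """Preset condition for this sketch: connect left-to-right in time order."""
    return sorted(images, key=lambda i: i.taken_at)

uploads = [
    UploadedImage("kim", datetime(2013, 9, 21, 14, 0), "Seoraksan", 0xA1),
    UploadedImage("lee", datetime(2013, 9, 21, 9, 30), "Seoraksan", 0xB2),
    UploadedImage("park", datetime(2013, 9, 22, 11, 0), "Seoul", 0xC3),
]
ids = ImageIdentifiers(location="Seoraksan", day="2013-09-21")
print([img.owner for img in compose(search(uploads, ids))])   # ['lee', 'kim']
```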

According to another aspect of the present invention, there is provided a terminal for outputting an image, the terminal comprising: a user input unit; a communication unit for connecting to a social network; an output unit for outputting an image; and a control unit for obtaining a theme content to be searched based on user input through the user input unit, extracting image identification information from the theme content, searching a plurality of user images uploaded to the social network using the extracted image identification information, and connecting the plurality of searched user images according to preset conditions to generate a composite image, wherein the output unit outputs the generated composite image.

According to another aspect of the present invention, there is provided a server apparatus comprising: a communication unit connected to a social network and receiving a theme content to be searched from a user terminal through the social network; and a control unit for extracting image identification information from the theme content and generating a composite image by connecting a plurality of images retrieved from the social network according to preset conditions using the extracted image identification information, wherein the communication unit transmits the generated composite image to the terminal.

Meanwhile, the operations of the terminal and the server device may be embodied as a computer-readable recording medium on which a program for executing the operations is stored.

According to an embodiment of the present invention, a plurality of user images uploaded to a social network server can be searched according to previously acquired theme contents, so that users can conveniently acquire images required for composition.

In addition, according to an embodiment of the present invention, all or some of the searched images of a plurality of users may be connected to automatically generate a composite image according to a condition. This simplifies the image composition process, allows expert-looking images to be produced easily, and improves user convenience.

FIG. 1 is a diagram illustrating an entire system including a terminal and a social network server according to an embodiment of the present invention.
FIG. 2 is a diagram illustrating a configuration of a terminal according to an embodiment of the present invention.
FIG. 3 is a diagram illustrating a configuration of a server apparatus according to an embodiment of the present invention.
FIG. 4 is a flowchart illustrating an operation of a terminal according to an exemplary embodiment of the present invention.
FIGS. 5 to 11 are views showing examples of a terminal operation screen according to user input.

Hereinafter, an image output method according to an embodiment of the present invention, and a terminal and a server using the method, will be described in detail with reference to the accompanying drawings.

The foregoing objects, features, and advantages of the present invention will become more apparent from the following detailed description taken in conjunction with the accompanying drawings. Reference will now be made in detail to the preferred embodiments of the present invention, examples of which are illustrated in the accompanying drawings. Like reference numerals designate like elements throughout the specification. In the following description, well-known functions or constructions are not described in detail, since they would obscure the invention with unnecessary detail.

Hereinafter, a terminal and a server apparatus related to the present invention, and their operation, will be described in detail with reference to the drawings. The suffixes "module" and "unit" for the components used in the following description are given or used interchangeably only for ease of description and do not have distinct meanings or roles by themselves.

FIG. 1 is a diagram illustrating an entire service providing system including a terminal and a social network server according to an embodiment of the present invention.

Referring to FIG. 1, an entire service providing system according to an embodiment of the present invention includes a plurality of terminals 100, a social network server 200, and a communication network 300 for connecting them.

The social network server 200 performs user authentication of a terminal 100 when at least one terminal 100 accesses it as a client, and imports or exports external data at the request of the terminal 100.

In particular, the social network server 200 may provide a social network service (SNS) through data transmission and reception with the terminal 100. The social network server 200 provides a function of sharing text, image, and video data uploaded from the terminal 100 with other users.

In addition, in the embodiment of the present invention, the social network server 200 may receive the theme content to be searched from the terminal 100 through the social network, extract the image identification information from the theme content, connect a plurality of images retrieved from the social network according to preset conditions using the image identification information to generate a composite image, and transmit the generated composite image to the terminal 100.

Meanwhile, the terminal 100 may be any device that can connect to the social network server 200 through the communication network 300. As a client, the terminal 100 may access the server device 200 to transmit user authentication information, and may transmit and receive data to and from the server device 200 once authenticated.

The terminal 100 described in this specification may include a mobile phone, a smart phone, a laptop computer, a digital broadcasting terminal, a PDA (Personal Digital Assistant), and a PMP (Portable Multimedia Player), but the present invention is not limited thereto; it may be any of various devices capable of user input and information display.

The terminal 100 obtains the theme content to be searched in the social network server 200, extracts image identification information from the theme content, searches a plurality of images uploaded to the social network server 200 through a user account registered in the social network server 200 using the extracted image identification information, connects them according to preset conditions to generate a composite image, and outputs the generated composite image.

The communication network 300 may provide a network that supports communication between the terminal 100, as a client, and the social network server 200. Conditional access or content protection may be used as a means of protecting content transmitted through the communication network 300; for example, schemes such as CableCARD or DCAS (Downloadable Conditional Access System) may be used.

Also, the communication network 300 may include a mobile station (MS) for network relay. The mobile station may also be referred to as a terminal, a mobile terminal (MT), a subscriber station (SS), a portable subscriber station (PSS), user equipment (UE), an access terminal (AT), or the like.

FIG. 2 is a diagram illustrating a configuration of a terminal according to an embodiment of the present invention.

Referring to FIG. 2, the terminal 100 may include a wireless communication unit 110, an A/V (Audio/Video) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory unit 160, an interface unit 170, a control unit 180, a power supply unit 190, and the like. The components shown in FIG. 2 are not essential; a terminal having more or fewer components may be implemented.

Hereinafter, the components will be described in order.

The wireless communication unit 110 may include one or more modules that enable wireless communication between the terminal 100 and a wireless communication system, or between the terminal 100 and the network in which the terminal 100 is located. For example, the wireless communication unit 110 may include a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short-range communication module 114, and a location information module 115.

The broadcast receiving module 111 receives broadcast signals and / or broadcast-related information from an external broadcast management server through a broadcast channel.

The broadcast-related information may refer to a broadcast channel, a broadcast program, or information related to a broadcast service provider. The broadcast-related information may also be provided through a mobile communication network. In this case, it may be received by the mobile communication module 112.

The broadcast signal and / or broadcast related information received through the broadcast receiving module 111 may be stored in the memory unit 160.

The mobile communication module 112 transmits and receives radio signals to at least one of a base station, an external terminal, and a server on a mobile communication network. The wireless signal may include various types of data depending on a voice call signal, a video call signal or a text / multimedia message transmission / reception.

The wireless Internet module 113 is a module for wireless Internet access; it can be built into or externally attached to the terminal 100. WLAN (Wi-Fi), WiBro (Wireless Broadband), WiMAX (World Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), and the like can be used as wireless Internet technologies.

The short-range communication module 114 refers to a module for short-range communication. Bluetooth, radio frequency identification (RFID), infrared data association (IrDA), ultra wideband (UWB), ZigBee, and the like can be used as the short distance communication technology.

The location information module 115 is a module for confirming or obtaining the location of the terminal. The location information module 115 may obtain location information using a global navigation satellite system (GNSS). Here, GNSS is a term describing radio navigation satellite systems in which satellites orbiting the Earth transmit reference signals that allow certain types of radio navigation receivers to determine their position on or near the Earth's surface. Examples of GNSS include GPS (Global Positioning System) operated by the United States, Galileo operated by Europe, GLONASS (Global Orbiting Navigational Satellite System) operated by Russia, COMPASS operated by China, and QZSS (Quasi-Zenith Satellite System) operated by Japan.

As a representative example of GNSS, the location information module 115 may be a GPS (Global Positioning System) module. The GPS module calculates distance information from three or more satellites to one point (object) and time information of when the distance information was measured, and then applies trigonometry to the calculated distance information to obtain three-dimensional position information of the point (object) in terms of latitude, longitude, and altitude. Further, a method of calculating position and time information using three satellites and correcting the error of the calculated position and time information using another satellite is also used. The GPS module continuously calculates the current position in real time and uses it to calculate velocity information.
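
As an illustrative aside, the trilateration idea described above can be sketched numerically as follows; the satellite positions and ranges are made up, and a fourth satellite is used so the linearized system is fully determined (numpy assumed available).

```python
import numpy as np

# Hypothetical satellite positions (km) and the receiver position to recover.
sats = np.array([
    [15600.0,  7540.0, 20140.0],
    [18760.0,  2750.0, 18610.0],
    [17610.0, 14630.0, 13480.0],
    [19170.0,   610.0, 18390.0],
])
truth = np.array([-41.77, -16.79, 6370.06])
ranges = np.linalg.norm(sats - truth, axis=1)        # ideal, noise-free ranges

# Subtracting the first sphere equation from the others gives a linear system:
#   2*(p_i - p_0) . x = (|p_i|^2 - |p_0|^2) - (r_i^2 - r_0^2)
A = 2.0 * (sats[1:] - sats[0])
b = (np.sum(sats[1:] ** 2, axis=1) - np.sum(sats[0] ** 2)
     - ranges[1:] ** 2 + ranges[0] ** 2)
pos = np.linalg.lstsq(A, b, rcond=None)[0]
print(np.round(pos, 2))   # approximately [-41.77 -16.79 6370.06]
```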

Referring to FIG. 2, the A/V (Audio/Video) input unit 120 is for inputting an audio signal or a video signal, and may include a camera 121 and a microphone 122. The camera 121 processes image frames, such as still images or moving images, obtained by the image sensor in the video call mode or the photographing mode. The processed image frames can be displayed on the display unit 151.

The image frames processed by the camera 121 may be stored in the memory unit 160 or transmitted to the outside through the wireless communication unit 110. Two or more cameras 121 may be provided depending on the configuration of the terminal.

The microphone 122 receives an external sound signal in a call mode, a recording mode, a voice recognition mode, or the like, and processes it into electrical voice data. In the call mode, the processed voice data can be converted into a form that can be transmitted to a mobile communication base station through the mobile communication module 112 and then output. Various noise removal algorithms may be implemented in the microphone 122 to remove noise generated while receiving the external sound signal.

The user input unit 130 generates input data for the user to control the operation of the terminal. The user input unit 130 may include a keypad, a dome switch, a touch pad (static pressure / capacitance), a jog wheel, a jog switch, and the like.

The sensing unit 140 senses the current state of the terminal 100, such as the open/closed state of the terminal 100, the position of the terminal 100, whether the user is in contact with it, and the orientation of the terminal, and generates a sensing signal for controlling the operation of the terminal 100. For example, when the terminal 100 is a slide phone, the sensing unit 140 can sense whether the slide phone is opened or closed. It may also sense whether the power supply unit 190 is supplying power, whether the interface unit 170 is connected to an external device, and the like. Meanwhile, the sensing unit 140 may include a proximity sensor.

The output unit 150 generates output related to the visual, auditory, or tactile senses, and may include a display unit 151, an audio output module 152, an alarm unit 153, and a haptic module 154.

The display unit 151 displays and outputs information to be processed by the terminal 100. For example, when the terminal is in the call mode, a UI (User Interface) or GUI (Graphic User Interface) associated with the call is displayed. When the terminal 100 is in the video communication mode or the photographing mode, the photographed and / or received video or UI and GUI are displayed.

The display unit 151 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT-LCD), an organic light-emitting diode (OLED) display, a flexible display, and a three-dimensional (3D) display.

Some of these displays may be transparent or light-transmissive so that the outside can be seen through them. These may be referred to as transparent displays, a typical example of which is a transparent LCD. The rear structure of the display unit 151 may also be of a light-transmissive type. With this structure, the user can see an object located behind the terminal body through the area occupied by the display unit 151.

There may be two or more display units 151 depending on the implementation of the terminal 100. For example, a plurality of display units may be arranged on one surface of the terminal 100, spaced apart or integrated, or may be arranged on different surfaces.

When the display unit 151 and a sensor for sensing a touch operation (hereinafter, 'touch sensor') form a mutual layer structure (hereinafter, 'touch screen'), the display unit 151 can be used as an input device in addition to an output device. The touch sensor may take the form of, for example, a touch film, a touch sheet, or a touch pad.

The touch sensor may be configured to convert a change in the pressure applied to a specific portion of the display unit 151, or a change in the capacitance generated at a specific portion of the display unit 151, into an electrical input signal. The touch sensor can be configured to detect not only the touched position and area but also the pressure at the time of the touch.

If there is a touch input to the touch sensor, the corresponding signal(s) is sent to a touch controller. The touch controller processes the signal(s) and transmits the corresponding data to the control unit 180. Thus, the control unit 180 can determine which area of the display unit 151 has been touched.

Referring to FIG. 2, a proximity sensor may be disposed in an inner region of the terminal surrounded by the touch screen, or in the vicinity of the touch screen. The proximity sensor detects, without mechanical contact, the presence or absence of an object approaching a predetermined detection surface or an object existing nearby, using the force of an electromagnetic field or infrared rays. The proximity sensor has a longer lifespan and higher utility than a contact sensor.

Examples of the proximity sensor include a transmission type photoelectric sensor, a direct reflection type photoelectric sensor, a mirror reflection type photoelectric sensor, a high frequency oscillation type proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor.

When the touch screen is capacitive, the proximity of the pointer is detected by a change in the electric field according to the proximity of the pointer. In this case, the touch screen (touch sensor) may be classified as a proximity sensor.

Hereinafter, for convenience of explanation, the act of bringing the pointer close to the touch screen so that it is recognized as being positioned on the touch screen, without actually touching it, is referred to as a "proximity touch", and the act of actually bringing the pointer into contact with the touch screen is referred to as a "contact touch". The position of a proximity touch on the touch screen is the position at which the pointer vertically corresponds to the touch screen when the proximity touch is made.

The proximity sensor detects a proximity touch and a proximity touch pattern (e.g., a proximity touch distance, a proximity touch direction, a proximity touch speed, a proximity touch time, a proximity touch position, a proximity touch movement state, and the like). Information corresponding to the detected proximity touch operation and the proximity touch pattern may be output on the touch screen.

The audio output module 152 may output audio data received from the wireless communication unit 110 or stored in the memory unit 160 in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, and the like. The audio output module 152 also outputs sound signals related to functions performed in the terminal 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output module 152 may include a receiver, a speaker, a buzzer, and the like. The audio output module 152 may also output sound through an earphone jack 116; the user can connect earphones to the earphone jack 116 to hear the output sound.

The alarm unit 153 outputs a signal notifying the occurrence of an event of the terminal 100. Examples of events occurring in the terminal include call signal reception, message reception, key signal input, and touch input. The alarm unit 153 may output a signal notifying the occurrence of an event in a form other than a video or audio signal, for example, by vibration. The video or audio signal may also be output through the display unit 151 or the audio output module 152.

The haptic module 154 generates various tactile effects that the user can feel. A typical example of a haptic effect generated by the haptic module 154 is vibration. The intensity and pattern of the vibration generated by the haptic module 154 can be controlled. For example, different vibrations may be combined and output, or output sequentially.

In addition to vibration, the haptic module 154 can generate a variety of tactile effects, such as the effect of stimulation by a pin arrangement moving vertically against the contacted skin surface, the effect of stimulation by a jetting or suction force of air through a jet opening or suction opening, the effect of stimulation through contact with an electrode, the effect of stimulation by an electrostatic force, and the effect of reproducing a cold sensation using a heat-absorbing or heat-emitting element.

The haptic module 154 can be implemented not only to transmit tactile effects through direct contact but also to allow the user to feel tactile effects through the muscle sense of a finger or arm. Two or more haptic modules 154 may be provided depending on the configuration of the terminal 100.

The memory unit 160 may store programs for the operation of the control unit 180 and temporarily store input/output data (e.g., a phone book, messages, still images, moving pictures, etc.). The memory unit 160 may also store data on the vibrations and sounds of various patterns output when a touch is input on the touch screen.

The memory unit 160 may include at least one storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card type memory (for example, SD or XD memory), RAM (Random Access Memory), SRAM (Static Random Access Memory), ROM (Read Only Memory), EEPROM (Electrically Erasable Programmable Read-Only Memory), PROM (Programmable Read-Only Memory), a magnetic disk, and/or an optical disk. The terminal 100 may also operate in connection with web storage that performs the storage function of the memory unit 160 on the Internet.

The interface unit 170 serves as a path to all external devices connected to the terminal 100. The interface unit 170 receives data from an external device, receives power and delivers it to each component inside the terminal 100, or transmits internal data to an external device. For example, the interface unit 170 may include a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video input/output (I/O) port, an earphone port, and the like.

The identification module is a chip storing various kinds of information for authenticating the usage right of the terminal 100, and may include a User Identity Module (UIM), a Subscriber Identity Module (SIM), a Universal Subscriber Identity Module (USIM), and the like. A device having an identification module (hereinafter, an 'identification device') may be manufactured in the form of a smart card. Accordingly, the identification device can be connected to the terminal 100 through a port.

When the terminal 100 is connected to an external cradle, the interface unit may be a path through which power from the cradle is supplied to the terminal 100, or a path through which various command signals input by the user at the cradle are transmitted to the terminal. The various command signals or the power input from the cradle may operate as a signal for recognizing that the terminal is correctly mounted on the cradle.

The control unit 180 typically controls the overall operation of the terminal, for example, voice calls, data communication, and video calls. The control unit 180 may include a multimedia module 181 for multimedia playback. The multimedia module 181 may be implemented within the control unit 180 or separately from it.

The controller 180 may perform a pattern recognition process for recognizing handwriting input or drawing input performed on the touch screen as characters and images, respectively.

The power supply unit 190 receives external power and internal power under the control of the controller 180 and supplies power necessary for operation of the respective components.

In particular, the control unit 180 accesses the social network server 200 through the network, acquires the theme content to be searched according to the user input, and extracts the image identification information from the theme content.

In addition, the control unit 180 may search a plurality of images uploaded to the social network server 200 using the extracted image identification information, connect them according to preset conditions to generate a composite image, and control the output unit 150 to output the generated composite image. The image identification information may include at least one of image pattern information, position information, and time information that can be used to search for the plurality of images.
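
A minimal sketch of how such identification information might be matched against an uploaded image, assuming the three kinds of information are represented as a latitude/longitude pair, a timestamp, and a 64-bit visual fingerprint; the thresholds and field names are illustrative, not part of the disclosure.

```python
from datetime import datetime, timedelta
from math import radians, sin, cos, asin, sqrt

def haversine_km(a, b):
    """Great-circle distance between two (lat, lon) pairs in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (*a, *b))
    h = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(h))

def matches(candidate, ref, max_km=5.0, max_dt=timedelta(hours=12), max_bits=10):
    """candidate/ref: dicts with 'latlon', 'time' and 'phash' (64-bit int) keys."""
    if haversine_km(candidate["latlon"], ref["latlon"]) > max_km:
        return False
    if abs(candidate["time"] - ref["time"]) > max_dt:
        return False
    # Hamming distance between coarse visual fingerprints
    return bin(candidate["phash"] ^ ref["phash"]).count("1") <= max_bits

theme = {"latlon": (38.119, 128.465), "time": datetime(2013, 9, 21, 10, 0), "phash": 0x0F0F0F0F0F0F0F0F}
photo = {"latlon": (38.121, 128.470), "time": datetime(2013, 9, 21, 13, 30), "phash": 0x0F0F0F0F0F0F0F1F}
print(matches(photo, theme))   # True
```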

The control unit 180 may search for the event information associated with the theme content through the social network, provide the searched event information, and extract the image identification information from the first event selected in the provided event information.

The control unit 180 may display the plurality of searched images in thumbnail form, display a setting menu for setting the condition, and display a preview image corresponding to the composite image by connecting the thumbnail images corresponding to the plurality of images in a predetermined direction according to a condition selected in the setting menu.

Meanwhile, the control unit 180 may generate the composite image by connecting the plurality of images in a predetermined direction according to the time order of each of the plurality of images.

In addition, the control unit 180 may generate the composite image by connecting the plurality of images in a predetermined direction according to the position information of each of the plurality of images.

The controller 180 may acquire the theme content based on at least one of image information, video information, and keyword information input through an input interface for inputting the theme content.

The composite image may be output as a single image including the plurality of images. For example, the composite image may be output as a panoramic image in which a plurality of images are connected in one direction, or in a merged form in which a plurality of images are placed in a specific graphic image.
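
A rough sketch of the panorama-style single-image output, using Pillow with height-matched horizontal concatenation; the solid-colour stand-in images are only there so the example runs without any photo files.

```python
from PIL import Image

def panorama_strip(images, height=240, gap=4, background=(255, 255, 255)):
    """Resize every image to a common height and join them left to right."""
    scaled = [im.resize((round(im.width * height / im.height), height)) for im in images]
    total_w = sum(im.width for im in scaled) + gap * (len(scaled) - 1)
    canvas = Image.new("RGB", (total_w, height), background)
    x = 0
    for im in scaled:
        canvas.paste(im, (x, 0))
        x += im.width + gap
    return canvas

# Stand-in images (solid colours) so the sketch is self-contained.
parts = [Image.new("RGB", (320, 240), c) for c in ("tomato", "gold", "seagreen")]
panorama_strip(parts).save("composite_panorama.png")
```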

The various embodiments described herein may be embodied in a recording medium readable by a computer or similar device using, for example, software, hardware, or a combination thereof.

According to a hardware implementation, the embodiments described herein may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and electrical units for performing functions. In some cases, such embodiments may be implemented by the control unit 180 itself.

According to a software implementation, embodiments such as procedures or functions may be implemented with separate software modules, each of which performs at least one function or operation. The software code may be implemented by a software application written in a suitable programming language. The software code may be stored in the memory unit 160 and executed by the control unit 180.

According to an embodiment of the present invention, all or some of the images of a plurality of users retrieved from the social network server 200 are connected to one another, so that composite images such as panoramic photographs can be easily produced and user convenience can be improved.

FIG. 3 is a diagram illustrating a configuration of a social network server according to another embodiment of the present invention.

The social network server 200 according to another embodiment of the present invention may include a communication unit 210, a control unit 220, a user management unit 230, a user database 240, a content management unit 250, and a content database 260.

The communication unit 210 can transmit and receive data for providing the social network service through the communication network 300, and can receive the theme content to be searched from the user's terminal 100 through the social network. To this end, the communication unit 210 may include at least one of a mobile communication module, a wireless Internet module, and a short-range communication module.

The user management unit 230 may store and manage user account information registered in the social network service in the user database 240 to provide the social network service. In addition, the user management unit 230 may store and manage connection relationship information between users in the user database 240. For example, the user management unit 230 can store and manage the connection relationships between users by classifying them into friends, family, or small groups.

The content management unit 250 can store and manage the content uploaded through the social network in the content database 260 to provide the social network service. Also, the content management unit 250 can generate, store, and manage attribute information of content to be uploaded from each user. For example, the attribute information may include at least one of position information, time information, event information, and pattern information.
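
One possible shape for such per-upload attribute records, sketched as a small in-memory content database; the field names and lookup method are assumptions, not the disclosed schema.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Dict, List, Optional, Tuple

@dataclass
class ContentAttributes:                 # per-upload metadata kept alongside the content
    uploader: str
    uploaded_at: datetime
    latlon: Optional[Tuple[float, float]] = None
    event: Optional[str] = None
    pattern_hash: Optional[int] = None   # coarse visual fingerprint

@dataclass
class ContentDatabase:
    items: Dict[str, ContentAttributes] = field(default_factory=dict)

    def register(self, content_id: str, attrs: ContentAttributes) -> None:
        self.items[content_id] = attrs

    def find(self, **criteria) -> List[str]:
        """Return ids whose attributes equal every given criterion."""
        return [cid for cid, a in self.items.items()
                if all(getattr(a, k) == v for k, v in criteria.items())]

db = ContentDatabase()
db.register("img-001", ContentAttributes("kim", datetime(2013, 9, 21, 14, 0), (38.12, 128.47), "Seoraksan hike"))
db.register("img-002", ContentAttributes("lee", datetime(2013, 9, 22, 9, 0), (37.57, 126.98), "Seoul expo"))
print(db.find(event="Seoraksan hike"))   # ['img-001']
```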

The control unit 220 controls the overall operation of the server 200, for example, data communication, database management, and the like.

In particular, according to an embodiment of the present invention, the control unit 220 may obtain the image identification information either by extracting it from the theme content received through the communication unit 210, or by receiving image identification information that has already been extracted by and transmitted from the terminal 100.

In addition, the control unit 220 can perform a search in the social network based on the image identification information. The control unit 220 may control the user management unit 230 and the content management unit 250 to acquire a plurality of user images corresponding to the image identification information from among the various contents uploaded by a plurality of users.

A combining unit 270 may generate a composite image by connecting a plurality of images according to preset conditions under the control of the control unit 220; the generated composite image may be output to the outside through an output unit 280 or transmitted to a specific terminal 100 through the network. However, the functions of the combining unit 270 and the output unit 280 may instead be performed in the terminal 100, depending on the communication environment.

FIG. 4 is a flowchart illustrating an image output method according to an exemplary embodiment of the present invention. The image output method illustrated in FIG. 4 may be described with reference to the block diagram of the terminal shown in FIG. 2 and the block diagram of the server shown in FIG. 3.

Referring to FIG. 4, the terminal 100 accesses a social network service (S101).

The terminal 100 can access the social network server 200 through the communication network 300 and access the social network service by performing the authentication.

Then, the terminal 100 determines a theme content for image composition (S102).

The control unit 180 of the terminal 100 may determine the theme content to be searched according to user input. In particular, the control unit 180 provides an interface for inputting the theme content through the display unit 151, and can determine the theme content according to the user input.

The theme content may include, for example, at least one of an image, a moving image, and text. To this end, the control unit 180 may provide an image input interface for inputting an image, a moving image input interface for inputting a moving image, or a keyword or voice input interface for inputting text. In addition, the control unit 180 can provide a menu around an image provided through the social network service so that the user can select that image as the theme content.

Then, the terminal 100 extracts image identification information from the theme content, and sets a search condition (S103). This process may be performed in the social network server 200.

The control unit 180 of the terminal 100 may extract the image identification information individually according to the property of the theme content.

When the theme content corresponds to the image, the control unit 180 may extract at least one of the pattern information, the time information, and the position information from the image as the image identification information. The pattern information may include at least one of, for example, point information, line information or plane information. The extracted image identification information according to the pattern information can be used to search for similar images. Further, the image may additionally include time information and position information. The control unit 180 may extract at least one of the time information and the position information as image identification information.
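
One common way to turn an image into searchable pattern information is a coarse perceptual fingerprint. The sketch below computes a simple average hash with Pillow; it is only an illustrative stand-in for the point/line/plane pattern information mentioned above.

```python
from PIL import Image

def average_hash(img: Image.Image, size: int = 8) -> int:
    """64-bit fingerprint: 1 where an 8x8 grayscale pixel is above the mean."""
    small = img.convert("L").resize((size, size))
    pixels = list(small.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    return bin(a ^ b).count("1")

# Two synthetic images: a gradient and a slightly brightened copy hash close together.
grad = Image.linear_gradient("L").resize((64, 64))
brighter = grad.point(lambda v: min(255, v + 20))
print(hamming(average_hash(grad), average_hash(brighter)))   # small Hamming distance
```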

In addition, when the theme content corresponds to the moving image, the control unit 180 may extract the image identification information by extracting a specific frame from the moving image and obtaining pattern information from the frame.

If the theme content corresponds to the text, the control unit 180 can extract the keyword information from the text. The keyword information may include time information, location information, and event information. For example, the keyword information may include a keyword corresponding to at least one of a specific performance, an exhibition, a travel destination, a place, and time information.
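
A toy sketch of pulling time, place, and event keywords out of a text theme; a real system would use proper language processing, and the gazetteer and patterns here are purely illustrative.

```python
import re
from typing import Dict, List

PLACES = {"seoraksan", "seoul", "busan"}          # illustrative gazetteer

def extract_keywords(text: str) -> Dict[str, List[str]]:
    dates = re.findall(r"\b\d{4}-\d{2}-\d{2}\b", text)
    words = re.findall(r"[A-Za-z가-힣]+", text)
    places = [w for w in words if w.lower() in PLACES]
    events = re.findall(r"\b\w+ (?:festival|concert|exhibition)\b", text, re.IGNORECASE)
    return {"time": dates, "place": places, "event": events}

print(extract_keywords("Photos from the Seoraksan autumn festival on 2013-09-21"))
# {'time': ['2013-09-21'], 'place': ['Seoraksan'], 'event': ['autumn festival']}
```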

In addition, the controller 180 searches the network for event information associated with the theme content, provides the retrieved event information to the user, and extracts the image identification information from the specific event selected from the provided event information.

For example, when the theme content corresponds to an image photographed at a performance, the control unit 180 may retrieve information about a plurality of performances highly related to the photographed image and provide the information to a user. When the user selects a specific performance, the controller 180 may acquire at least one of pattern information, position information, and time information corresponding to the selected performance as image identification information.

Then, the terminal 100 searches a plurality of images uploaded to the social network server 200 based on the image identification information (S105). This step may also be performed in the social network server 200.

For example, the control unit 180 of the terminal 100 may search the content database 260 of the social network server 200 using the image identification information and receive a plurality of user images corresponding to the image identification information. Alternatively, the social network server 200 may search the plurality of user images uploaded to the content database 260 based on image identification information received from the terminal 100, or extracted from the theme content received from the terminal 100, and transmit the search result to the terminal 100 or pass it to the combining unit 270.

In addition, the control unit 180 may set a search condition or a search range in advance according to user input. For example, the search range may be limited to content uploaded from user accounts set as friends, to content uploaded from user accounts set as family, or to content uploaded from user accounts set as a specific group. The control unit 180 may also perform a search, based on the image identification information, over all content set to public, according to the user's selection input.
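
A minimal sketch of restricting the search range to accounts in the chosen relation groups; the relation structure and names are assumptions for illustration only.

```python
from typing import Dict, Set

def allowed_uploaders(me: str,
                      relations: Dict[str, Dict[str, Set[str]]],
                      scope: Set[str]) -> Set[str]:
    """scope is any subset of {'friends', 'family', 'group', 'all'}."""
    if "all" in scope:
        return {"*"}                                # wildcard: search all public content
    out: Set[str] = set()
    for kind in scope:
        out |= relations.get(me, {}).get(kind, set())
    return out

relations = {"me": {"friends": {"kim", "lee"}, "family": {"mom"}, "group": {"park", "choi"}}}
uploads = [("kim", "img-1"), ("mom", "img-2"), ("park", "img-3")]

who = allowed_uploaders("me", relations, {"friends"})
print([img for owner, img in uploads if owner in who or "*" in who])   # ['img-1']
```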

Thereafter, the terminal 100 links the plurality of searched images according to preset conditions (S107). This step may be performed by the combining unit 270 of the social network server 200.

In particular, in one embodiment of the present invention, the terminal 100 may provide a composite interface for connecting a plurality of retrieved images.

The control unit 180 may display a plurality of searched images in a thumbnail form via a composite interface and provide a setting menu for setting the condition.

In addition, the controller 180 may connect the thumbnail images corresponding to the plurality of images in a predetermined direction according to the condition selected in the setting menu, and display the preview image corresponding to the composite image.

The selectable conditions in the setting menu may include at least one of arrangement order, arrangement type, and size adjustment.

Further, various connection conditions can be used to suitably connect the plurality of images. For example, the connection condition may be a position-based order or a time-based order, or a pattern-recognition connection method may be used in which the images are automatically classified and connected according to a pattern recognition technique based on image feature information. Accordingly, the control unit 180 can arrange and connect the plurality of images in a preset order and adjust their sizes.

In particular, the control unit 180 may connect the plurality of images in a predetermined direction according to the time order of each of the plurality of images, or according to the position information of each of the plurality of images.

In addition, the predetermined direction may vary depending on the condition selected in the setting menu, and may include one direction or a plurality of directions.
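
The two orderings mentioned above can be sketched as simple sort keys chosen by the selected condition; interpreting "position order" as west-to-east by longitude is an assumption made only for this illustration.

```python
from datetime import datetime

photos = [
    {"id": "A", "time": datetime(2013, 9, 21, 14, 0), "latlon": (38.119, 128.465)},
    {"id": "B", "time": datetime(2013, 9, 21, 9, 30), "latlon": (38.121, 128.450)},
    {"id": "C", "time": datetime(2013, 9, 21, 11, 0), "latlon": (38.120, 128.480)},
]

def order(photos, condition="time"):
    """Pick the left-to-right connection order for the composite."""
    if condition == "time":
        return sorted(photos, key=lambda p: p["time"])
    if condition == "position":                     # west-to-east by longitude
        return sorted(photos, key=lambda p: p["latlon"][1])
    return photos                                   # 'random' / as-searched

print([p["id"] for p in order(photos, "time")])       # ['B', 'C', 'A']
print([p["id"] for p in order(photos, "position")])   # ['B', 'A', 'C']
```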

Thereafter, when the plurality of images are connected, the terminal 100 performs the composite image generation based on the plurality of connected images (S109). This step may be performed by the combining unit 270 of the social network server 200.

The generated composite image is output to the outside through the terminal 100 or the social network server 200 (S111).

The generated composite image may be output through the display unit 151 of the terminal 100. In addition, the composite image generated by the social network server 200 may be transmitted to the terminal 100 through the communication unit 210.

The composite image may have a single-image format. Alternatively, the composite image may be generated as a file in a separate format that includes additional data. The control unit 180 may further generate producer information corresponding to the generated composite image. The producer information may be included in the composite image or stored in the user database 240.
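
One way to attach producer information to the single output file is to embed it as metadata, sketched below with Pillow's PNG text chunks; a server could equally keep the same information in the user database, as described. The key names are illustrative.

```python
from PIL import Image
from PIL.PngImagePlugin import PngInfo

composite = Image.new("RGB", (640, 240), "lightsteelblue")   # stands in for the real composite

meta = PngInfo()
meta.add_text("Author", "user-account-123")       # producer information (illustrative keys)
meta.add_text("Source", "social-network-composite")
composite.save("composite_with_producer.png", pnginfo=meta)

reread = Image.open("composite_with_producer.png")
print(reread.text["Author"])                       # 'user-account-123'
```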

Also, according to one embodiment, there may be a plurality of social network servers 200, each providing a different social network service. In this case, the control unit 180 may identify the search target social network database from which the plurality of user images are to be searched, based on the user account information connected to the social network.

For example, the control unit 180 may further display a selection menu of a social network service to be searched when setting a search range for searching the plurality of user images according to a user input to the search interface. The search range may also include at least one of a full public network, a friend associated with the user account, and a group to which the user account is subscribed.

FIGS. 5 to 11 are views showing examples of terminal operation screens according to user input.

FIG. 5 illustrates a theme content input interface according to an embodiment of the present invention.

As shown in FIG. 5, the theme content input interface may include at least one of a video / image input menu, a keyword input menu, and an event search menu. The user can input the theme content by uploading the image through the video / image input menu of the theme content input interface, inputting the URL corresponding to the video / image, or uploading the video. Also, the user can set the search keyword through the keyword input menu of the theme content input interface. The user can select an event by searching for a specific performance, event, or exhibition through the event search menu.

According to the theme content input through the interface as shown in FIG. 5, the terminal 100 can acquire the image identification information to perform the search.

As shown in FIG. 6, when the image identification information is extracted from a video or image, the terminal 100 may display the extracted image identification information on the display unit 151. FIG. 6 shows an example in which a pattern image (the uploaded photo image), position information (Mt. Seorak), and time information (September 21, 2013) extracted from a user-input video or image are displayed.

In addition, as shown in FIG. 7, the image identification information according to a keyword input or event search input may be displayed on the display unit 151 of the terminal 100. In this case, the input keyword information or event search information may be displayed. The event search information may include the place, date, and/or performance/exhibition/event name associated with the event. The user can confirm the image identification information through the interface of FIG. 6 or FIG. 7 and decide whether to perform the search.

FIG. 8 shows a search range setting interface for searching a plurality of images according to an embodiment of the present invention.

Referring to FIG. 8, the search range may include at least one of friends, family, a group, and the entire network.

For example, when the user wants to search only images uploaded by other users in a group to which the user belongs, the user can select only the group and deselect friends and family to designate the search range. If the user does not want to limit the search range, the user can search all images set to public.

As described above, the content management unit 250 manages the attributes of the content stored in the content database 260, so that it can perform a search within the search range requested by the terminal 100. To improve search convenience, the content attributes may also include at least one of uploader information, upload date information, upload location information, image pattern information, and associated event information.

FIG. 9 shows a search result screen in which the retrieved images are received by the terminal 100 and displayed through the display unit 151. When the user chooses to proceed with the composition, a preview interface such as that shown in FIG. 10 is further displayed.

In particular, the control unit 180 of the terminal 100 may display a plurality of searched images in a thumbnail form and display a setting menu for setting a connection condition at a predetermined position.

In addition, the controller 180 may connect thumbnail images corresponding to the plurality of images in a predetermined direction according to the condition selected in the setting menu, and display a preview image corresponding to the composite image.

As described above, the arrangement order and arrangement type for connection may include various schemes. In addition, according to the embodiment of the present invention, since the size adjustment of each image can be set in detail, various needs for user's image synthesis can be reflected.

For example, as shown in FIG. 10, the arrangement order may be determined by at least one of position order, time order, automatic connection by pattern recognition, and random order. The arrangement type may be determined by at least one of a tile arrangement, a column-number setting, a panoramic arrangement, and a random method. In addition, the image resizing menu may include at least one of height match, width match, pattern recognition, and no adjustment.
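
A sketch of the tile arrangement with a column-number setting and fixed-cell resizing, again using Pillow with synthetic tiles so it runs standalone; cell size and colours are arbitrary.

```python
from PIL import Image

def tile_grid(images, columns=3, cell=(160, 120), background="white"):
    """Resize each image to a fixed cell and lay the cells out row by row."""
    rows = -(-len(images) // columns)                       # ceiling division
    canvas = Image.new("RGB", (columns * cell[0], rows * cell[1]), background)
    for i, im in enumerate(images):
        x, y = (i % columns) * cell[0], (i // columns) * cell[1]
        canvas.paste(im.resize(cell), (x, y))
    return canvas

tiles = [Image.new("RGB", (320, 240), c) for c in ("navy", "crimson", "olive", "teal", "plum")]
tile_grid(tiles, columns=2).save("composite_tiles.png")
```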

The preview image can be modified in real time according to the user's input to the setting menu. When the preview image exceeds the screen size of the terminal 100, the control unit 180 may connect the plurality of images in a scrollable structure and display only part of them on the screen.

In addition, the user can remove or move a specific user image by using a touch input or gesture input on the preview image. For example, the control unit 180 may display a removal icon 101 for each of the plurality of user images, and the user may select the removal icon 101 to delete the corresponding image. The control unit 180 can also scroll the connected images according to the user's touch input, and can switch to an edit mode when a specific user image is touched for a predetermined time or longer. In the edit mode, at least one of the arrangement order, photo position, and size can be changed according to the user's input.

FIG. 11 shows a synthesized result screen synthesized in a panoramic format according to an embodiment of the present invention.

As shown in FIG. 11, an image of Seoraksan is determined as the theme content according to the user's input, and the plurality of retrieved user images can thus be connected in a panoramic manner. When these images are connected by position information or by an image pattern recognition method, a panoramic image according to position or pattern can be generated.

As described above, according to the embodiment of the present invention, the user can conveniently search various user images uploaded through the social network service, and can easily generate a synthesized image by selecting a plurality of searched images. The generated image can be shared through the social network service or separately stored in the terminal 100.

Meanwhile, the operation of the terminal 100 described above may be performed through a predetermined application. The application may be any of various function applications provided in the terminal 100, such as a map, the Internet, mail, a messenger, or navigation, and is not limited to a specific function application. In addition, the application may have various functions in addition to the image output method according to the embodiment of the present invention. Data for installing and operating the application can be transmitted and received through the server 200.

The method according to the present invention may be implemented as a program to be executed on a computer and stored in a computer-readable recording medium. Examples of the computer-readable recording medium include ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage devices; the method may also be implemented in the form of a carrier wave (for example, transmission over the Internet).

The computer-readable recording medium may be distributed over networked computer systems so that the computer-readable code is stored and executed in a distributed manner. Functional programs, codes, and code segments for implementing the above method can be easily inferred by programmers in the technical field to which the present invention belongs.

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, the invention is not limited to the disclosed embodiments; it should be understood that various modifications may be made by those skilled in the art without departing from the spirit and scope of the present invention.

Claims (16)

A method for outputting an image at a terminal using a network, the method comprising:
connecting to a social network server;
obtaining a theme content to be searched through the social network server;
extracting image identification information from the theme content;
retrieving a plurality of user images uploaded to the social network server using the extracted image identification information, and connecting them according to preset conditions to generate a composite image; and
outputting the generated composite image.
The method according to claim 1, further comprising:
displaying the retrieved plurality of user images in thumbnail form and displaying a setting menu for setting the condition; and
displaying a preview image corresponding to the composite image by connecting thumbnail images corresponding to the plurality of user images in a predetermined direction according to a condition selected in the setting menu.
The method according to claim 1,
Wherein generating the composite image comprises:
connecting the plurality of images in a predetermined direction according to a time order of each of the plurality of images to generate the composite image.
The method according to claim 1,
Wherein generating the composite image comprises:
connecting the plurality of images in a predetermined direction according to position information of each of the plurality of images to generate the composite image.
The method according to claim 1,
Wherein the obtaining of the theme content comprises:
acquiring the theme content based on at least one of image information, video information, and keyword information input through an input interface for inputting the theme content.
6. The method according to claim 1,
Wherein the image identification information includes at least one of image pattern information, position information, and time information usable for searching the plurality of user images.
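A hypothetical container for these three kinds of identification information might look like the following; the concrete field types are assumptions, not part of the claim:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class ImageIdentificationInfo:
    # Mirrors the three kinds of information named in the claim; any subset
    # may be present, and the concrete types are illustrative assumptions.
    image_pattern: Optional[bytes] = None              # e.g. a feature descriptor or template
    position: Optional[Tuple[float, float]] = None      # (latitude, longitude)
    time_range: Optional[Tuple[float, float]] = None    # (start, end) epoch seconds

    def is_empty(self) -> bool:
        return (self.image_pattern is None
                and self.position is None
                and self.time_range is None)
```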
7. The method according to claim 1,
Wherein the extracting of the image identification information comprises:
Retrieving event information associated with the theme content through the social network;
Providing the retrieved event information; and
Extracting the image identification information from a first event selected from the provided event information.
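A sketch of this event-driven extraction, with events modeled as plain dictionaries whose field names are hypothetical:

```python
def search_events(theme: str, event_db: list) -> list:
    """Find events whose title or description mentions the theme content."""
    theme = theme.lower()
    return [e for e in event_db
            if theme in e.get("title", "").lower()
            or theme in e.get("description", "").lower()]

def identification_info_from_event(event: dict) -> dict:
    """Derive image search keys (place and time window) from the selected event."""
    return {
        "position": event.get("venue_coords"),          # (latitude, longitude) or None
        "time_range": (event["start"], event["end"]),   # epoch seconds
    }
```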
8. The method according to claim 1,
Wherein the composite image is output as a single image including the plurality of images.
9. A recording medium on which a program for causing a computer to execute the method according to any one of claims 1 to 8 is recorded.
10. A terminal for outputting an image, the terminal comprising:
A user input unit;
A communication unit for connecting to a social network;
An output unit for outputting an image; and
A control unit configured to obtain a theme content to be searched through the social network based on a user input through the user input unit, extract image identification information from the theme content, retrieve a plurality of user images from the social network using the extracted image identification information, and connect the plurality of user images according to preset conditions to generate a composite image,
Wherein the output unit outputs the generated composite image.
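The terminal of claim 10 can be pictured as four cooperating units; the classes below are illustrative stand-ins only, with the control unit driving the claimed sequence:

```python
class UserInputUnit:
    """Collects the theme content from the user."""
    def get_theme(self) -> str:
        return input("Theme to search: ")

class CommunicationUnit:
    """Connects to the social network and retrieves matching user images."""
    def search_images(self, ident: dict) -> list:
        return []            # a real unit would query the social network server

class OutputUnit:
    """Outputs the generated composite image."""
    def show(self, composite) -> None:
        print(f"Displaying a composite built from {len(composite)} images")

class ControlUnit:
    """Drives the claimed sequence: obtain theme, extract identification
    information, retrieve images, connect them, hand off for output."""
    def __init__(self, user_input: UserInputUnit, comm: CommunicationUnit, out: OutputUnit):
        self.user_input, self.comm, self.out = user_input, comm, out

    def run(self) -> None:
        theme = self.user_input.get_theme()
        ident = {"keyword": theme.lower()}          # identification info (simplified)
        images = self.comm.search_images(ident)
        composite = sorted(images, key=lambda img: img.timestamp)  # preset condition: time order
        self.out.show(composite)

if __name__ == "__main__":
    ControlUnit(UserInputUnit(), CommunicationUnit(), OutputUnit()).run()
```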
11. The terminal of claim 10,
Wherein the control unit identifies a search target social network database in which the plurality of user images are to be searched, based on user account information connected to the social network.
12. The terminal of claim 11,
Wherein the control unit sets a search range in which to search for the plurality of user images according to a user input through a search interface,
Wherein the search range includes at least one of an entire network, a friend connected to the user account, and a group to which the user account is subscribed.
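One way to model the three search ranges of claim 12 and filter retrieved images accordingly; the account structure and the per-image owner/group attributes are assumptions of the sketch:

```python
from enum import Enum

class SearchRange(Enum):
    ENTIRE_NETWORK = "all"
    FRIENDS = "friends"
    GROUPS = "groups"

def filter_by_search_range(images, search_range: SearchRange, account: dict):
    """Keep only the images visible under the chosen range; `account` is a
    hypothetical dict with 'friends' and 'groups' membership sets, and each
    image is assumed to carry .owner and .group attributes."""
    if search_range is SearchRange.ENTIRE_NETWORK:
        return list(images)
    if search_range is SearchRange.FRIENDS:
        return [img for img in images if img.owner in account["friends"]]
    if search_range is SearchRange.GROUPS:
        return [img for img in images if img.group in account["groups"]]
    raise ValueError(f"unknown search range: {search_range}")
```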
13. The terminal of claim 10,
Wherein the control unit connects the plurality of user images in a predetermined direction according to at least one of position information and time information of each of the plurality of user images to generate the composite image.
14. The terminal of claim 10,
Wherein the control unit searches the social network for event information associated with the theme content, provides the retrieved event information, and extracts the image identification information from a first event selected from the provided event information.
15. The terminal of claim 10,
Wherein the composite image is output as a single image comprising the plurality of images.
16. A server apparatus comprising:
A communication unit for accessing a network providing a social network service and receiving a theme content to be searched from a user terminal; and
A control unit for extracting image identification information from the theme content and creating a composite image by connecting, according to preset conditions, a plurality of images retrieved from a database using the extracted image identification information,
Wherein the communication unit transmits the generated composite image to the user terminal.
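A server-side counterpart to claim 16 could be sketched as a single HTTP endpoint; Flask, the route name, and the placeholder composite are illustrative choices of the sketch, not part of the disclosure:

```python
# Hypothetical server sketch: receive a theme, build a composite, return it.
import io

from flask import Flask, request, send_file
from PIL import Image

app = Flask(__name__)

@app.route("/composite", methods=["POST"])
def composite():
    theme = request.form.get("theme", "")
    # A real server would extract identification information from `theme`,
    # retrieve matching images from its database, and connect them.
    # Placeholder composite so the sketch stays self-contained:
    img = Image.new("RGB", (640, 200), "lightgray")
    buf = io.BytesIO()
    img.save(buf, format="JPEG")
    buf.seek(0)
    return send_file(buf, mimetype="image/jpeg")

if __name__ == "__main__":
    app.run()
```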
KR1020130139973A 2013-11-18 2013-11-18 Method for oupputing synthesized image, a terminal and a server thereof KR20150058607A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020130139973A KR20150058607A (en) 2013-11-18 2013-11-18 Method for oupputing synthesized image, a terminal and a server thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020130139973A KR20150058607A (en) 2013-11-18 2013-11-18 Method for oupputing synthesized image, a terminal and a server thereof

Publications (1)

Publication Number Publication Date
KR20150058607A true KR20150058607A (en) 2015-05-29

Family

ID=53392781

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020130139973A KR20150058607A (en) 2013-11-18 2013-11-18 Method for oupputing synthesized image, a terminal and a server thereof

Country Status (1)

Country Link
KR (1) KR20150058607A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20200055209A (en) * 2018-11-12 2020-05-21 주식회사 로뎀마이크로시스템 System, apparatus and method for producing experience based content
WO2020176398A1 (en) * 2019-02-25 2020-09-03 Life Impact Solutions, Inc. Media alteration based on variable geolocation metadata
US11763503B2 (en) 2019-02-25 2023-09-19 Life Impact Solutions Media alteration based on variable geolocation metadata
WO2023191121A1 (en) * 2022-03-29 2023-10-05 엘지전자 주식회사 Display device

Similar Documents

Publication Publication Date Title
KR101753031B1 (en) Mobile terminal and Method for setting metadata thereof
KR101651191B1 (en) Mobile terminal and control method thereof
US8666454B2 (en) Mobile terminal and method of controlling the same
KR20120029234A (en) Mobile terminal, electronic system and method of transferring and receiving data using the same
KR20170029178A (en) Mobile terminal and method for operating thereof
KR20120035292A (en) Electronic device and operating method thereof
KR20110133713A (en) Mobile terminal and method for controlling the same
KR20150059344A (en) Mobile terminal and controlling method thereof
KR101712666B1 (en) Electronic device and contents sharing method for electronic device
KR20150058607A (en) Method for oupputing synthesized image, a terminal and a server thereof
KR20110132031A (en) Mobile terminal and image processing method for mobile terminal
KR102079696B1 (en) A server appratus for synthesizing multimedia using network
KR101253754B1 (en) Electronic Device and the Operating Method Thereof
KR20160061154A (en) Mobile terminal and method for controlling the same
KR20150068838A (en) Electronic device and method for controlling of the same
KR102079691B1 (en) A terminal for synthesizing multimedia using network
KR20150092624A (en) Electronic device and control method thereof
KR102107554B1 (en) A Method for synthesizing multimedia using network
KR101769934B1 (en) Mobile terminal and control method therof
KR101070679B1 (en) Mobile terminal and control method for mobile terminal
KR101531194B1 (en) Method of controlling application interworking with map key and mobile terminal using the same
KR20150069133A (en) Electronic device and control method thereof
KR20100105191A (en) Mobile terminal and information displaying method thereof
KR20150092875A (en) Electronic Device And Method Of Controlling The Same
KR101987462B1 (en) Mobile terminal and method for controlling of the same

Legal Events

Date Code Title Description
WITN Withdrawal due to no request for examination