KR20120014794A - Mobile terminal and method for guiding photography thereof - Google Patents

Info

Publication number
KR20120014794A
Authority
KR
South Korea
Prior art keywords
information
mobile terminal
camera
location
recommended
Prior art date
Application number
KR1020100076985A
Other languages
Korean (ko)
Inventor
권순모
Original Assignee
엘지전자 주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 엘지전자 주식회사 (LG Electronics Inc.)
Priority to KR1020100076985A priority Critical patent/KR20120014794A/en
Publication of KR20120014794A publication Critical patent/KR20120014794A/en

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04B: TRANSMISSION
    • H04B 1/00: Details of transmission systems, not covered by a single one of groups H04B3/00 - H04B13/00; Details of transmission systems not characterised by the medium used for transmission
    • H04B 1/38: Transceivers, i.e. devices in which transmitter and receiver form a structural unit and in which at least one part is used for functions of transmitting and receiving
    • H04B 1/40: Circuits
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04W: WIRELESS COMMUNICATION NETWORKS
    • H04W 88/00: Devices specially adapted for wireless communication networks, e.g. terminals, base stations or access point devices
    • H04W 88/02: Terminal devices
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04M: TELEPHONIC COMMUNICATION
    • H04M 2250/00: Details of telephonic subscriber devices
    • H04M 2250/52: Details of telephonic subscriber devices including functional features of a camera

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Telephone Function (AREA)

Abstract

The present invention relates to a mobile terminal and a photographing guide method thereof. A camera is driven, location-related information of the camera is obtained, and the obtained location-related information is transmitted to a server. When recommended photo information related to that location is provided by the server, photographing is performed using the recommended photo information.

Description

Mobile terminal and photographing guide method thereof {MOBILE TERMINAL AND METHOD FOR GUIDING PHOTOGRAPHY THEREOF}

The present invention relates to a mobile terminal that guides photographing by using recommended photographs taken by other users at the photographing location, and to a photographing guide method thereof.

Terminals such as personal computers, laptops, and mobile phones have diversified in function and are now implemented as multimedia players with complex functions such as taking pictures or videos, playing music or video files, playing games, and receiving broadcasts.

Terminals may be divided into mobile terminals and stationary terminals according to their mobility. The mobile terminal may be further classified into a handheld terminal and a vehicle mount terminal according to whether a user can directly carry it.

When taking a picture using a camera built into the terminal, only screen division information such as a crosshair or grid is provided on the preview screen to help the user compose the picture. This is not enough to satisfy the camera user's desire for a better photograph, because it provides no information about the object or background being photographed. Accordingly, research aimed at letting users take good pictures more conveniently and easily continues.

The present invention has been made to solve the above problems, and its object is to provide a mobile terminal, and a photographing guide method, that allow a picture to be taken using recommended pictures taken by other people at the location where the terminal is currently located.

A photographing guide method of a mobile terminal according to an embodiment of the present invention for realizing the above object includes driving the camera and obtaining location-related information, transmitting the obtained location-related information to a server, receiving recommendation photo information related to the location-related information from the server, and taking a picture using the received recommendation photo information.

In addition, a mobile terminal according to the present invention includes a camera, a location-related information acquisition unit for acquiring location-related information of the camera, a display unit for displaying an image input through the camera, and a control unit configured to transmit the location-related information to a server, receive recommendation photo information related to the location-related information from the server, and control the camera to take a photograph using the recommendation photo information.

According to at least one embodiment of the present invention configured as described above, a picture can be taken using recommended pictures of other users taken at the point where the terminal is currently located. Therefore, the present invention makes it convenient for a user who is not skilled with the camera, or who otherwise finds it difficult to take a good picture, to do so.

FIG. 1 is a block diagram of a mobile terminal according to an embodiment of the present invention.
FIG. 2A is a front perspective view of an example of a mobile terminal according to an embodiment of the present invention.
FIG. 2B is a rear perspective view of the mobile terminal shown in FIG. 2A.
FIG. 3 is a flowchart illustrating a photographing guide method of a mobile terminal according to an embodiment of the present invention.
FIGS. 4A and 4B illustrate an example in which a mobile terminal associated with an embodiment of the present invention recognizes a photographing target using augmented reality.
FIG. 5 illustrates an example of recognizing a photographing target through communication with a location information providing apparatus at the location where a mobile terminal according to an embodiment of the present invention is situated.
FIGS. 6A to 6C are screens showing the received recommended picture information of the mobile terminal according to the present invention.
FIGS. 7A and 7B are screens displaying a recommended picture selected by a mobile terminal according to the present invention.
FIGS. 8A to 8C are screens showing the photographing steps of the mobile terminal according to the present invention.
FIGS. 9A and 9B illustrate an example in which a mobile terminal according to the present invention takes a picture using recommended picture information and a face recognition function.

Hereinafter, a mobile terminal related to the present invention will be described in detail with reference to the drawings. The suffixes "module" and "unit" for the components used in the following description are given or used interchangeably only for ease of writing the specification, and do not themselves have distinct meanings or roles.

The mobile terminal described herein may include a mobile phone, a smart phone, a laptop computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), navigation, and the like. However, it will be readily apparent to those skilled in the art that the configuration according to the embodiments described herein may also be applied to fixed terminals such as digital TVs, desktop computers, and the like, except when applicable only to mobile terminals.

FIG. 1 is a block diagram of a mobile terminal according to an embodiment of the present invention.

The mobile terminal 100 may include a wireless communication unit 110, an A/V input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, and a power supply unit 190. The components shown in FIG. 1 are not essential, so a mobile terminal having more or fewer components may be implemented.

Hereinafter, the components will be described in order.

The wireless communication unit 110 may include one or more modules that enable wireless communication between the mobile terminal 100 and the wireless communication system or between the mobile terminal 100 and a network in which the mobile terminal 100 is located. For example, the wireless communication unit 110 may include a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short range communication module 114, and a location information module 115 .

The broadcast receiving module 111 receives a broadcast signal and / or broadcast related information from an external broadcast center through a broadcast channel.

The broadcast channel may include a satellite channel and a terrestrial channel. The broadcast center may refer to a server that generates and transmits a broadcast signal and / or broadcast related information or a server that receives a previously generated broadcast signal and / or broadcast related information and transmits the same to a terminal. The broadcast signal may include not only a TV broadcast signal, a radio broadcast signal, and a data broadcast signal, but also a broadcast signal having a data broadcast signal combined with a TV broadcast signal or a radio broadcast signal.

The broadcast related information may mean information related to a broadcast channel, a broadcast program, or a broadcast service provider. The broadcast related information may also be provided through a mobile communication network. In this case, it may be received by the mobile communication module 112.

The broadcast related information may exist in various forms. For example, it may exist in the form of Electronic Program Guide (EPG) of Digital Multimedia Broadcasting (DMB) or Electronic Service Guide (ESG) of Digital Video Broadcast-Handheld (DVB-H).

The broadcast receiving module 111 may receive digital broadcast signals using digital broadcasting systems such as Digital Multimedia Broadcasting-Terrestrial (DMB-T), Digital Multimedia Broadcasting-Satellite (DMB-S), Media Forward Link Only (MediaFLO), Digital Video Broadcast-Handheld (DVB-H), and Integrated Services Digital Broadcast-Terrestrial (ISDB-T). Of course, the broadcast receiving module 111 may be configured to be suitable not only for the above-described digital broadcasting systems but also for other broadcasting systems.

The broadcast signal and / or broadcast related information received through the broadcast receiving module 111 may be stored in the memory 160.

The mobile communication module 112 transmits and receives a wireless signal with at least one of a base station, an external terminal, and a server on a mobile communication network. The wireless signal may include various types of data according to transmission and reception of a voice call signal, a video call call signal, or a text / multimedia message.

The wireless internet module 113 refers to a module for wireless internet access and may be embedded or external to the mobile terminal 100. Wireless Internet technologies may include Wireless LAN (Wi-Fi), Wireless Broadband (Wibro), World Interoperability for Microwave Access (Wimax), High Speed Downlink Packet Access (HSDPA), and the like.

The short range communication module 114 refers to a module for short range communication. As a short range communication technology, Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, and the like may be used.

The location information module 115 is a module for obtaining the location of the mobile terminal, and a representative example thereof is a Global Positioning System (GPS) module.

Referring to FIG. 1, the A / V input unit 120 is for inputting an audio signal or a video signal, and may include a camera 121 and a microphone 122. The camera 121 processes image frames such as still images or moving images obtained by the image sensor in the video call mode or the photographing mode. The processed image frame may be displayed on the display unit 151.

The image frame processed by the camera 121 may be stored in the memory 160 or transmitted to the outside through the wireless communication unit 110. Two or more cameras 121 may be provided according to the use environment.

The microphone 122 receives an external sound signal in a call mode, a recording mode, a voice recognition mode, and the like, and processes it into electrical voice data. In the call mode, the processed voice data can be converted into a form transmittable to a mobile communication base station through the mobile communication module 112 and then output. The microphone 122 may implement various noise removal algorithms for removing noise generated in the process of receiving an external sound signal.

The user input unit 130 generates input data for the user to control the operation of the terminal. The user input unit 130 may include a keypad, a dome switch, a touch pad (static pressure/capacitance), a jog wheel, a jog switch, and the like.

The sensing unit 140 detects the current state of the mobile terminal 100, such as the open/closed state of the mobile terminal 100, the position of the mobile terminal 100, the presence or absence of user contact, the orientation of the mobile terminal, and the acceleration/deceleration of the mobile terminal, and generates a sensing signal for controlling the operation of the mobile terminal 100. For example, if the mobile terminal 100 is a slide phone, it may sense whether the slide phone is opened or closed. It may also sense whether the power supply unit 190 is supplying power and whether the interface unit 170 is coupled to an external device. The sensing unit 140 may include a proximity sensor 141.

The output unit 150 is used to generate output related to sight, hearing, or touch, and may include a display unit 151, an audio output module 152, an alarm unit 153, and a haptic module 154.

The display unit 151 displays (outputs) information processed by the mobile terminal 100. For example, when the mobile terminal is in a call mode, the mobile terminal displays a user interface (UI) or a graphic user interface (GUI) related to the call. When the mobile terminal 100 is in a video call mode or a shooting mode, the mobile terminal 100 displays a photographed and / or received image, a UI, or a GUI.

The display unit 151 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT-LCD), an organic light-emitting diode (OLED) display, a flexible display, and a 3D display.

Some of these displays can be configured to be transparent or light-transmissive so that the outside can be seen through them. Such a display may be referred to as a transparent display, a representative example of which is the transparent OLED (TOLED). The rear structure of the display unit 151 may also be configured as a light-transmissive structure. With this structure, the user can see an object located behind the terminal body through the area occupied by the display unit 151 of the terminal body.

Two or more display units 151 may exist according to the implementation form of the mobile terminal 100. For example, a plurality of display units may be spaced apart or integrally disposed on one surface of the mobile terminal 100, or may be disposed on different surfaces.

When the display unit 151 and a sensor for detecting a touch operation (hereinafter referred to as a "touch sensor") form a mutual layer structure (hereinafter referred to as a "touch screen"), the display unit 151 may be used as an input device in addition to an output device. The touch sensor may have, for example, the form of a touch film, a touch sheet, or a touch pad.

The touch sensor may be configured to convert a change in pressure applied to a specific portion of the display unit 151 or capacitance generated in a specific portion of the display unit 151 into an electrical input signal. The touch sensor may be configured to detect not only the position and area of the touch but also the pressure at the touch.

If there is a touch input to the touch sensor, the corresponding signal (s) is sent to the touch controller. The touch controller processes the signal (s) and then transmits the corresponding data to the controller 180. As a result, the controller 180 can know which area of the display unit 151 is touched.

Referring to FIG. 1, a proximity sensor 141 may be disposed in an inner region of a mobile terminal surrounded by the touch screen or near the touch screen. The proximity sensor refers to a sensor that detects the presence or absence of an object approaching a predetermined detection surface or an object present in the vicinity without using a mechanical contact by using an electromagnetic force or infrared rays. Proximity sensors have a longer life and higher utilization than touch sensors.

Examples of the proximity sensor include a transmission photoelectric sensor, a direct reflection photoelectric sensor, a mirror reflection photoelectric sensor, a high frequency oscillation proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor. When the touch screen is capacitive, the touch screen is configured to detect the proximity of the pointer by the change of the electric field according to the proximity of the pointer. In this case, the touch screen (touch sensor) may be classified as a proximity sensor.

Hereinafter, for convenience of explanation, the act of bringing the pointer close to the touch screen so that it is recognized without contacting the touch screen is referred to as a "proximity touch", and the act of actually bringing the pointer into contact with the touch screen is referred to as a "contact touch". The position of a proximity touch of the pointer on the touch screen is the position at which the pointer is perpendicular to the touch screen during the proximity touch.

The proximity sensor detects a proximity touch and a proximity touch pattern (for example, a proximity touch distance, a proximity touch direction, a proximity touch speed, a proximity touch time, a proximity touch position, and a proximity touch movement state). Information corresponding to the sensed proximity touch operation and proximity touch pattern may be output on the touch screen.

The sound output module 152 may output audio data received from the wireless communication unit 110 or stored in the memory 160 in a call signal reception, a call mode or a recording mode, a voice recognition mode, a broadcast reception mode, and the like. The sound output module 152 may also output a sound signal related to a function (eg, a call signal reception sound, a message reception sound, etc.) performed by the mobile terminal 100. The sound output module 152 may include a receiver, a speaker, a buzzer, and the like.

The alarm unit 153 outputs a signal for notifying occurrence of an event of the mobile terminal 100. Examples of events occurring in the mobile terminal include call signal reception, message reception, key signal input, and touch input. The alarm unit 153 may output a signal for notifying occurrence of an event in a form other than a video signal or an audio signal, for example, vibration. The video signal or the audio signal may be output through the display unit 151 or the audio output module 152, so that they 151 and 152 may be classified as part of the alarm unit 153.

The haptic module 154 generates various tactile effects that the user can feel. Vibration is a representative example of the haptic effect generated by the haptic module 154. The intensity and pattern of vibration generated by the haptic module 154 can be controlled. For example, different vibrations may be synthesized and output or may be sequentially output.

In addition to vibration, the haptic module 154 may generate various tactile effects, such as a pin array moving vertically against the skin surface, a jetting or suction force of air through a nozzle or inlet, grazing of the skin surface, contact of an electrode, an electrostatic force, and the reproduction of a sense of cold or warmth using an element capable of absorbing or generating heat.

The haptic module 154 may not only deliver a haptic effect through direct contact, but may also allow the user to feel a haptic effect through a muscle sense such as that of a finger or arm. Two or more haptic modules 154 may be provided depending on the configuration of the mobile terminal 100.

The memory 160 may store a program for the operation of the controller 180 and may temporarily store input / output data (for example, a phone book, a message, a still image, a video, etc.). The memory 160 may store data relating to various patterns of vibration and sound output when a touch input on the touch screen is performed.

The memory 160 may include at least one type of storage medium among flash memory, hard disk, multimedia card micro, card-type memory (for example, SD or XD memory), random access memory (RAM), static random access memory (SRAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), magnetic memory, magnetic disk, and optical disk. The mobile terminal 100 may operate in association with web storage that performs the storage function of the memory 160 on the Internet.

The interface unit 170 serves as a passage with all external devices connected to the mobile terminal 100. The interface unit 170 receives data from an external device, receives power, transfers the power to each component inside the mobile terminal 100, or transmits the data inside the mobile terminal 100 to an external device. For example, a wired / wireless headset port, an external charger port, a wired / wireless data port, a memory card port, a port for connecting a device having an identification module, an audio I / O port, A video input / output (I / O) port, an earphone port, and the like may be included in the interface unit 170.

The identification module is a chip that stores various types of information for authenticating the usage authority of the mobile terminal 100, and may include a User Identity Module (UIM), a Subscriber Identity Module (SIM), a Universal Subscriber Identity Module (USIM), and the like. A device having an identification module (hereinafter, an "identification device") may be manufactured in a smart card format. Accordingly, the identification device can be connected to the terminal 100 through a port.

When the mobile terminal 100 is connected to an external cradle, the interface unit may serve as a passage through which power from the cradle is supplied to the mobile terminal 100, or through which various command signals input by the user at the cradle are delivered to the mobile terminal. Such command signals or power input from the cradle may operate as signals for recognizing that the mobile terminal is correctly mounted on the cradle.

The controller 180 typically controls the overall operation of the mobile terminal. For example, it performs the control and processing related to voice calls, data communications, video calls, and the like. The controller 180 may include a multimedia module 181 for playing multimedia. The multimedia module 181 may be implemented within the controller 180 or separately from it.

The controller 180 may perform a pattern recognition process for recognizing a writing input or a drawing input performed on the touch screen as text and an image, respectively.

The power supply unit 190 receives an external power source and an internal power source under the control of the controller 180 to supply power for operation of each component.

Various embodiments described herein may be implemented in a recording medium readable by a computer or similar device using, for example, software, hardware or a combination thereof.

According to a hardware implementation, the embodiments described herein include application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), and the like. It may be implemented using at least one of processors, controllers, micro-controllers, microprocessors, and electrical units for performing other functions. The described embodiments may be implemented by the controller 180 itself.

According to the software implementation, embodiments such as the procedures and functions described herein may be implemented as separate software modules. Each of the software modules may perform one or more functions and operations described herein. Software code can be implemented in a software application written in a suitable programming language. The software code may be stored in the memory 160 and executed by the controller 180.

FIG. 2A is a front perspective view of an example of a mobile terminal according to the present invention.

The disclosed mobile terminal 100 has a bar-type terminal body. However, the present invention is not limited thereto and may be applied to various structures, such as a slide type, folder type, swing type, or swivel type, in which two or more bodies are coupled so as to be relatively movable.

The body includes a casing (casing, housing, cover, etc.) that forms an exterior. In this embodiment, the case may be divided into a front case 101 and a rear case 102. Various electronic components are built in the space formed between the front case 101 and the rear case 102. At least one intermediate case may be further disposed between the front case 101 and the rear case 102.

The cases may be formed by injecting synthetic resin or may be formed of a metal material, for example, a metal material such as stainless steel (STS) or titanium (Ti).

The display unit 151, the audio output unit 152, the camera 121, the user input units 130 (131 and 132), the microphone 122, and the interface 170 may be disposed in the terminal body, mainly in the front case 101.

The display unit 151 occupies most of the main surface of the front case 101. The sound output unit 152 and the camera 121 are disposed in regions adjacent to one end of both ends of the display unit 151, and the user input unit 131 and the microphone 122 are disposed in regions adjacent to the other end. The user input unit 132 and the interface 170 may be disposed on side surfaces of the front case 101 and the rear case 102.

The user input unit 130 is manipulated to receive commands for controlling the operation of the mobile terminal 100 and may include a plurality of manipulation units 131 and 132. The manipulation units 131 and 132 may be collectively referred to as a manipulating portion and may employ any method as long as the user can operate them with a tactile feeling.

The content input by the first or second manipulation unit 131 or 132 may be set in various ways. For example, the first manipulation unit 131 receives commands such as start, end, and scroll, and the second manipulation unit 132 receives commands such as adjusting the volume of the sound output from the sound output unit 152 or switching the display unit 151 to the touch recognition mode of the touch screen.

FIG. 2B is a rear perspective view of the mobile terminal shown in FIG. 2A.

Referring to FIG. 2B, a camera 121' may be additionally mounted on the rear of the terminal body, that is, on the rear case 102. The camera 121' may have a photographing direction substantially opposite to that of the camera 121 (see FIG. 2A) and may have a different pixel count from the camera 121.

For example, the camera 121 preferably has a low pixel count, since it is mainly used to photograph the user's face and transmit it to the counterpart during a video call, whereas the camera 121' preferably has a high pixel count, since it often photographs a general subject that is not transmitted immediately. The cameras 121 and 121' may be installed in the terminal body so as to be rotatable or able to pop up.

A flash 123 and a mirror 124 are further disposed adjacent to the camera 121 '. The flash 123 shines light toward the subject when the subject is photographed by the camera 121 '. The mirror 124 allows the user to see his / her own face or the like when photographing (self-photographing) the user using the camera 121 '.

The sound output unit 152 'may be further disposed on the rear surface of the terminal body. The sound output unit 152 ′ may implement a stereo function together with the sound output unit 152 (see FIG. 2A), and may be used to implement a speakerphone mode during a call.

In addition to the antenna for talking and the like, a broadcast signal reception antenna 116 may be additionally disposed on the side of the terminal body. The antenna 116, which forms part of the broadcast receiving module 111 (see FIG. 1), can be installed to be able to be drawn out from the terminal body.

The terminal body is equipped with a power supply unit 190 for supplying power to the mobile terminal 100. The power supply unit 190 may be embedded in the terminal body or may be directly detachable from the outside of the terminal body.

The rear case 102 may be further equipped with a touch pad 135 for sensing a touch. Like the display unit 151, the touch pad 135 may also be configured to have a light transmission type. In this case, if the display unit 151 is configured to output visual information from both sides, the visual information may be recognized through the touch pad 135. The information output on both surfaces may be controlled by the touch pad 135. Alternatively, a display may be additionally mounted on the touch pad 135, and a touch screen may be disposed on the rear case 102 as well.

The touch pad 135 operates in association with the display unit 151 of the front case 101. The touch pad 135 may be disposed in parallel to the rear of the display unit 151. The touch pad 135 may have the same or smaller size as the display unit 151.

Hereinafter, a method in which the mobile terminal according to the present invention searches for information on a subject using the location-related information of the terminal, finds a recommended photograph of the subject through the search, and guides photographing based on the information of the recommended photograph will be described in detail with reference to the drawings.

Here, the mobile terminal 100 includes a location-related information acquisition unit for obtaining location-related information of the terminal. The location-related information acquisition unit includes a GPS module 115 for acquiring location information such as the longitude, latitude, and altitude of the point where the terminal is located, an electronic compass (gyro sensor, geomagnetic sensor) for measuring the direction in which the terminal is pointed, and a tilt sensor for measuring the degree to which the terminal is tilted.

FIG. 3 is a flowchart illustrating a photographing guide method of a mobile terminal according to an embodiment of the present invention.

Referring to FIG. 3, the controller 180 of the mobile terminal 100 drives the camera 121 according to a user input in operation S101. When the camera 121 is driven, the controller 180 transmits an image input through the camera 121 to the display unit 151. The display unit 151 outputs an image received from the camera 121 as a preview screen under the control of the controller 180.

After driving the camera 121, the controller 180 sets the operation mode to guide photographing (S102). Guide photographing may be set according to a user's menu operation, or may be set when the camera 121 faces a subject (a photographing target) for a predetermined time or longer.

After setting guide photographing, the controller 180 obtains location-related information of the mobile terminal 100 (S103). The location-related information includes location information (longitude, latitude, altitude) of the terminal, direction information indicating where the terminal is pointed, and inclination information indicating the degree of inclination of the terminal. The controller 180 controls the location information module 115 to obtain the location information of the mobile terminal from GPS satellites and measures the direction in which the terminal (camera) is pointed using the electronic compass. In addition, the controller 180 controls the tilt sensor to measure the degree to which the terminal (camera) is tilted.
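As a concrete illustration of step S103, the following minimal Python sketch gathers the three kinds of location-related information named above (GPS position, compass heading, tilt). The sensor-reading helpers read_gps_fix, read_compass_heading, and read_tilt_angles are hypothetical placeholders, not part of the patent or of any real terminal API.

```python
from dataclasses import dataclass

@dataclass
class LocationRelatedInfo:
    latitude: float     # degrees, from the GPS module 115
    longitude: float    # degrees, from the GPS module 115
    altitude: float     # metres, from the GPS module 115
    heading: float      # degrees clockwise from north, from the electronic compass
    pitch: float        # degrees, from the tilt sensor
    roll: float         # degrees, from the tilt sensor

def acquire_location_related_info(sensors) -> LocationRelatedInfo:
    """Collect the position, direction, and tilt of the terminal (step S103).

    'sensors' is assumed to expose read_gps_fix(), read_compass_heading(), and
    read_tilt_angles(); these helpers are hypothetical.
    """
    lat, lon, alt = sensors.read_gps_fix()         # GPS module
    heading = sensors.read_compass_heading()       # electronic compass
    pitch, roll = sensors.read_tilt_angles()       # tilt sensor
    return LocationRelatedInfo(lat, lon, alt, heading, pitch, roll)
```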

When the acquisition of the location-related information is completed, the controller 180 transmits the obtained location-related information to the server 200, which provides recommendation pictures (guide pictures), through the wireless communication unit 110 (S104). In other words, the controller 180 requests information about the photographing place and the subject based on the obtained location-related information.

The server 200 receives the location-related information transmitted from the controller 180 of the mobile terminal 100 and recognizes the photographing place and the subject based on the location-related information. Then, the server 200 searches for recommended photograph information (subject-related information) related to the recognized photographing place and/or subject (S105). That is, the server 200 searches for recommended photographs taken at the photographing place and/or of the subject, the photographing information of those photographs (shooting date, shooting time, illuminance, white balance, scene mode, etc.), the locations where the recommended photographs were taken, the locations of nearby recommended photographing spots, and recommended photographs taken from those spots. The server 200 may search its own database, or search through a portal site or a blog.
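A rough sketch of the server-side search in step S105 follows, assuming the server keeps a simple in-memory list of photo records. The record fields, the 100 m search radius, and the 45 degree heading tolerance are illustrative assumptions and do not come from the patent.

```python
import math
from typing import Dict, List

def haversine_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance in metres between two WGS-84 points."""
    r = 6_371_000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def search_recommended_photos(db: List[Dict], lat: float, lon: float, heading: float,
                              radius_m: float = 100.0, heading_tol: float = 45.0) -> List[Dict]:
    """Return photos taken near the reported position and facing roughly the
    same direction as the requesting camera (step S105)."""
    results = []
    for photo in db:   # each record is assumed to carry lat, lon, heading, exif, ...
        if haversine_m(lat, lon, photo["lat"], photo["lon"]) > radius_m:
            continue
        diff = abs((photo["heading"] - heading + 180.0) % 360.0 - 180.0)
        if diff <= heading_tol:
            results.append(photo)
    return results
```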

The server 200 transmits the found recommendation photo information to the mobile terminal 100 (S106). The recommendation photo information includes the recommendation photo (original image and/or thumbnail image), its photographing information (shooting time, illuminance, white balance, mode, etc.), the photographing position, and the like. The recommendation photo information may further include tourist information for the region adjacent to the photographing location, information about other recommended photographing locations, and the like.

The controller 180 of the mobile terminal 100 performs picture taking using the recommended picture information received through the wireless communication unit 110 (S107). The controller 180 displays the object outline of the recommended picture included in the recommended picture information superimposed on the preview screen as a guide line. The user can therefore take a picture by aligning the subject with the object outline.
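Putting steps S101 to S107 together, one possible client-side flow is sketched below. The camera, sensors, send_to_server, and overlay_guideline objects are hypothetical stand-ins for the camera driver, the location-related information acquisition unit, the wireless communication unit, and the display overlay; acquire_location_related_info is reused from the earlier sketch.

```python
def guided_shooting(camera, sensors, send_to_server, overlay_guideline):
    """One pass through the photographing guide flow (S101 to S107), sketched."""
    camera.start_preview()                               # S101: drive the camera
    # S102: guide photographing is assumed to have been enabled via the menu
    info = acquire_location_related_info(sensors)        # S103 (see the earlier sketch)
    reply = send_to_server({"lat": info.latitude,        # S104: send location-related info
                            "lon": info.longitude,
                            "alt": info.altitude,
                            "heading": info.heading})
    photos = reply.get("photos", [])                     # S105/S106: server's recommendations
    if photos:
        guide = photos[0]                                # e.g. take the first suggestion
        overlay_guideline(camera.preview_frame(), guide) # superimpose the object outline
    return camera.capture()                              # S107: take the picture
```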

In the present embodiment, the subject is recognized using the location information of the terminal as an example, but image recognition technology may be used instead. For example, the controller 180 of the mobile terminal 100 may transmit the image input through the camera 121 to the server 200, and the server 200 may recognize the object (subject) included in the image using image recognition technology. Alternatively, the controller 180 may receive information about the subject from a location information providing device installed at the subject and perform picture taking.

FIGS. 4A and 4B illustrate an example in which a mobile terminal according to an embodiment of the present invention recognizes a photographing target by using augmented reality.

Referring to the drawings, the controller 180 of the mobile terminal 100 drives the camera 121 according to a user input and outputs the image input through the camera 121 to the display unit 151 as a preview screen. In addition, the controller 180 measures the location-related information of the camera 121 when the camera 121 faces the photographing target for a predetermined time or longer. Alternatively, the measurement of the location-related information may be implemented to start when the camera 121 is driven.

When the measurement of the location related information is completed, the controller 180 transmits the measured location related information to a location information server (not shown). The location information server accesses the point of interest information corresponding to the location related information from a database and transmits it to the mobile terminal 100. The point of interest information includes a building name, a store name, a place name, a road name, and the like.

The controller 180 of the mobile terminal 100 receives the point of interest information through the wireless communication unit 110 and displays the received point of interest information superimposed on the preview screen of the display unit 151, as shown in FIG. 4A. That is, the controller 180 displays information corresponding to each of the objects shown on the preview screen of the camera 121. For example, if there is a building on the preview screen, the controller 180 may display the building name or information on shops located in the building as information corresponding to that building.

When a touch on any one piece of the point of interest information displayed on the preview screen is detected through the sensing unit 140, the controller 180 transmits the point of interest information selected by the touch to the recommendation photo providing server 200. In other words, when any one of the objects displayed on the preview screen is selected, the controller 180 requests the recommendation photo providing server 200 to provide recommendation photo information related to the selected object. For example, when 'A cafe' is selected from the point of interest information displayed on the preview screen as shown in FIG. 4A, the controller 180 sends a recommendation photo providing request message, including the store name 'A cafe' and the location information of the store, to the server 200.

In addition, the controller 180 may include category information of the desired recommendation photo in the recommendation photo providing request message. For example, as illustrated in FIG. 4B, when a point of interest is selected, the controller 180 outputs a category selection screen on the display unit 151 so that the category of the photo to be recommended can be chosen. When any one of the categories displayed on the category selection screen is selected, the controller 180 includes the selected category information in the recommendation photo providing request message and transmits it to the recommendation photo providing server 200.
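The request message described above might be serialized along the following lines. The JSON field names, the example coordinates, and the helper itself are illustrative assumptions, since the patent does not define a message format.

```python
import json
from typing import Optional

def build_recommendation_request(poi_name: str, poi_lat: float, poi_lon: float,
                                 category: Optional[str] = None) -> str:
    """Serialize a recommendation-photo request for a selected point of interest.

    Field names are assumptions; the patent only says the message carries the
    selected POI (e.g. the store name and its location) and, optionally, the
    category chosen on the category selection screen.
    """
    payload = {"poi": poi_name, "lat": poi_lat, "lon": poi_lon}
    if category is not None:
        payload["category"] = category      # e.g. "landscape" or "portrait"
    return json.dumps(payload)

# Example: the 'A cafe' case of FIG. 4A with an illustrative location and category
message = build_recommendation_request("A cafe", 37.5665, 126.9780, "portrait")
```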

FIG. 5 is a diagram illustrating an example of recognizing a photographing target through communication with a location information providing apparatus at a location where a mobile terminal according to an embodiment of the present invention is located.

Referring to FIG. 5, the location information providing apparatus 300 is disposed adjacent to a building (a photographing target) and provides location information and additional information (eg, building name and tourist attractions information) about the building. The location information providing apparatus 300 may be implemented to broadcast information about a building at predetermined time intervals.

When the mobile terminal 100 enters the communication radius (range) of the location information providing apparatus 300 (dotted line), the location information providing apparatus 300 detects the entry of the mobile terminal 100. The location information providing apparatus 300 transmits information about a building to the mobile terminal 100. In this case, the building information includes a building name, location information, place name, information on nearby attractions, and the like.

The controller 180 transmits the building information provided from the location information providing apparatus 300 to the recommended picture providing server 200. For example, if the building information provided from the location information providing apparatus 300 is 'Chosungdae', the controller 180 requests the recommendation photo providing server 200 to provide a recommendation photo of another user photographed at the 'Chosungdae'.

FIGS. 6A to 6C illustrate screens displaying the received recommended photo information of a mobile terminal according to the present invention.

Referring to FIG. 6A, the controller 180 receives recommendation photos of other users taken at the current location of the terminal from the recommendation photo providing server 200 and displays thumbnail images 311 of the recommendation photos superimposed on the preview screen. For example, when attempting to take a picture at a specific tourist spot, the controller 180 of the mobile terminal 100 receives, from the recommendation photo providing server 200, pictures taken by other users at the tourist spot where the terminal is currently located. That is, the controller 180 receives from the server information about places and photographing compositions suitable for taking pictures at that tourist spot and displays it on the preview screen. In addition, the controller 180 displays the positions 312, 313, and 314 where the recommended pictures were taken on the preview screen. Here, when the photographing location of a recommended picture lies within the preview screen, the controller 180 displays a location display icon 312 or 313 indicating the corresponding location; when the location does not lie within the preview screen, the controller 180 displays an icon 314 indicating the direction and distance of the photographing position of the recommended picture relative to the current location.
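For the off-screen case, icon 314 needs the direction and the distance from the current position to the recommended shooting spot. A minimal sketch of that computation is given below, using the standard initial-bearing formula and an equirectangular distance approximation; the helper is an assumption, not code from the patent.

```python
import math
from typing import Tuple

def bearing_and_distance(lat1: float, lon1: float,
                         lat2: float, lon2: float) -> Tuple[float, float]:
    """Initial bearing (degrees clockwise from north) and approximate distance
    in metres from the current position to a recommended shooting spot."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dl = math.radians(lon2 - lon1)
    # Initial great-circle bearing
    y = math.sin(dl) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dl)
    bearing = (math.degrees(math.atan2(y, x)) + 360.0) % 360.0
    # Equirectangular approximation: adequate for nearby spots
    r = 6_371_000.0
    dx = dl * math.cos((p1 + p2) / 2)
    dy = p2 - p1
    distance = r * math.hypot(dx, dy)
    return bearing, distance
```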

In addition, when any one of the thumbnail images 311 displayed on the preview screen is selected, the controller 180 gives a visual effect to the icon 312 displaying the photographing position corresponding to the selected image. For example, the controller 180 changes the color or size of the icon 312.

When the selected recommended picture is set as the guide picture, the controller 180 compares the location where that picture was taken with the current location. If, as a result of the comparison, the current position of the terminal and the photographing position of the selected recommendation photo do not match, the controller 180 executes a navigation function, sets the photographing position of the recommendation photo as the destination, and provides road guidance to that destination.

Referring to FIG. 6B, the controller 180 may classify and display the recommended photo by category (by subject). For example, the controller 180 classifies the recommended pictures provided from the recommended picture providing server 200 into landscape pictures and portrait pictures. The controller 180 displays the divided landscape photos and the portrait photos on different pop-up windows 321 and 322.

Referring to FIG. 6C, the controller 180 displays a menu for selecting a category on a display screen. When any one of the categories displayed on the display screen is selected, the controller 180 displays a list of recommended photos of the selected category on the preview screen as shown in FIG. 6A.

FIGS. 7A and 7B illustrate screens displaying a recommended picture selected by a mobile terminal according to the present invention.

When one recommendation photo is selected from the list of recommended photos displayed on the preview screen, the controller 180 displays the corresponding recommendation photo on the full screen of the display unit 151 and, using the EXIF (Exchangeable Image File Format) information included in the recommendation photo, displays the shooting information of the recommendation photo in a pop-up window as shown in FIG. 7A. The pop-up window is displayed superimposed on the recommendation photo. Alternatively, as shown in FIG. 7B, when a recommendation photo is selected, the controller 180 may divide the display screen, display the recommendation photo in one of the divided areas, and display the shooting information of the recommendation photo in the other area. The shooting information includes the shooting time, illuminance, white balance, shutter speed, and the like. Alternatively, the controller 180 may display the thumbnail image of the selected recommendation photo and its shooting information superimposed on the preview screen.
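If the recommendation photo arrives as an ordinary JPEG, its shooting information can be read from the EXIF block. The sketch below uses the Pillow library for illustration; which EXIF tags a given recommendation photo actually carries is an assumption.

```python
from PIL import Image, ExifTags  # Pillow

def read_shooting_info(path: str) -> dict:
    """Extract a few shooting parameters from a photo's EXIF data."""
    exif = Image.open(path).getexif()
    merged = dict(exif)
    # 0x8769 is the Exif sub-IFD holding exposure, ISO, white balance, etc.
    # (Image.Exif.get_ifd is available in recent Pillow versions.)
    merged.update(exif.get_ifd(0x8769))
    named = {ExifTags.TAGS.get(tag_id, tag_id): value for tag_id, value in merged.items()}
    wanted = ("DateTime", "ExposureTime", "ISOSpeedRatings", "WhiteBalance", "FNumber")
    return {key: named[key] for key in wanted if key in named}

# info = read_shooting_info("recommended_photo.jpg")   # path is illustrative
```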

After checking the shooting information of the recommended picture displayed as shown in FIGS. 7A and 7B, if the user wants to change the camera settings to match the shooting information of the recommended picture, the controller 180 changes the camera settings to that shooting information. For example, if the user touches the setting icon 331 after checking the shooting information of the recommended picture, the controller 180 recognizes this as a camera setting change command and changes the camera settings based on the shooting information of the recommended picture. When photographing with the changed camera settings is completed, the controller 180 restores the camera settings that were in effect before the change. On the other hand, when the user touches the cancel icon 332, the controller 180 dismisses the displayed recommended picture and its shooting information without changing the camera settings, and may switch to the recommendation photo selection screen.
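The apply-then-restore behaviour around the setting icon 331 amounts to a save/restore pattern. The sketch below assumes a hypothetical camera.get_settings() / camera.apply_settings() interface, which the patent does not specify.

```python
from contextlib import contextmanager

@contextmanager
def borrowed_settings(camera, shooting_info: dict):
    """Temporarily apply the recommended photo's shooting information, then
    restore the previous camera settings once the shot is taken (hypothetical API)."""
    previous = camera.get_settings()          # remember the user's own settings
    camera.apply_settings(shooting_info)      # e.g. white balance, ISO, shutter speed
    try:
        yield camera
    finally:
        camera.apply_settings(previous)       # restore after photographing completes

# Usage sketch:
# with borrowed_settings(camera, shooting_info) as cam:
#     cam.capture()
```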

FIGS. 8A to 8C illustrate screens showing the photographing steps of a mobile terminal according to the present invention.

The controller 180 displays the recommended picture information provided from the recommended picture providing server 200 as shown in FIG. 8A. When a predetermined theme is selected on the guide picture selection screen, the controller 180 displays a list of recommended pictures of the selected theme on the display screen, in thumbnail form. When one recommendation picture is selected from the list, the controller 180 sets the selected recommendation picture as the guide picture. As shown in FIG. 8B, the controller 180 extracts the object outline of the guide picture and generates it as a guide line. Here, the guide line of the recommendation picture may instead be generated by the recommendation photo providing server 200. The controller 180 displays the generated guide line overlapped on the preview screen. In this case, the controller 180 may also change the camera settings using the shooting information of the recommended picture.
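One way to turn the guide picture into a guide line is plain edge detection. The sketch below uses OpenCV's Canny detector as a stand-in, since the patent does not say how the object outline is extracted, and the thresholds and overlay colour are arbitrary choices.

```python
import cv2
import numpy as np

def make_guide_overlay(guide_photo: np.ndarray, preview: np.ndarray) -> np.ndarray:
    """Extract the object outline of the guide picture and superimpose it,
    semi-transparently, on a preview frame."""
    gray = cv2.cvtColor(guide_photo, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)                        # thresholds chosen arbitrarily
    edges = cv2.resize(edges, (preview.shape[1], preview.shape[0]))
    guide = np.zeros_like(preview)
    guide[edges > 0] = (0, 255, 0)                          # draw the guide line in green
    return cv2.addWeighted(preview, 1.0, guide, 0.6, 0)     # overlay on the preview frame
```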

Thereafter, as shown in FIG. 8C, the user inputs a shooting button (shooting icon) after aligning a shooting target with the guide line. The controller 180 captures an image displayed on the preview screen according to a user input. In other words, when a photographing command is input, the controller 180 controls the camera 121 to perform photographing.

FIGS. 9A and 9B illustrate an example in which a mobile terminal according to the present invention takes a picture by using recommended picture information and a face recognition function.

Referring to FIG. 9A, when any one piece of the recommended picture information is selected, the controller 180 sets the selected recommended picture as the guide picture, generates the subject outline of the set recommended picture as a guide line, and displays it on the preview screen. In this case, if there is a person in the recommended picture, the controller 180 recognizes the face of the person and displays a face guide line 351 indicating the face area. The controller 180 also recognizes the face of the person (the subject to be photographed) in the image input through the camera 121 using the face recognition function, and displays a face recognition mark 352 indicating the recognized face region of the subject.

When the user aligns the photographing target with the guide line displayed on the preview screen, the controller 180 compares the size and position of the face guide line 351 and the face recognition mark 352. According to the comparison result, the controller 180 outputs a guide message through the speaker so that the photographing subject can move. For example, as shown in FIG. 9A, when the face area of the guide picture is larger than the face area of the subject, the controller 180 may output a message such as "Move forward" so that the subject moves forward.

Alternatively, the controller 180 adjusts the zoom (enlargement/reduction) function of the camera 121 according to the comparison result. For example, as shown in FIG. 9A, when the face area of the guide picture is larger than the face area of the person to be photographed, the controller 180 zooms in on the subject until the face area of the guide picture matches the face area of the person, as shown in FIG. 9B.
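The zoom adjustment of FIGS. 9A and 9B amounts to comparing the heights of the two face boxes. A minimal sketch follows; the (x, y, width, height) box format, the zoom limits, and the camera.get_zoom() / camera.set_zoom() calls are assumptions.

```python
from typing import Tuple

def zoom_to_match_faces(camera, guide_face: Tuple[int, int, int, int],
                        subject_face: Tuple[int, int, int, int],
                        max_zoom: float = 4.0) -> float:
    """Zoom the camera so the subject's face approaches the size of the face
    in the guide picture. Face boxes are (x, y, width, height); the zoom API
    and the limits are assumptions."""
    guide_h = guide_face[3]
    subject_h = subject_face[3]
    if subject_h == 0:
        return camera.get_zoom()                  # no face recognised yet
    factor = guide_h / subject_h                  # > 1 means the guide face is larger
    target = min(max(camera.get_zoom() * factor, 1.0), max_zoom)
    camera.set_zoom(target)                       # hypothetical zoom control
    return target
```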

According to an embodiment of the present invention, the above-described method may be embodied as computer-readable code on a medium on which a program is recorded. The computer-readable medium includes all kinds of recording devices in which data readable by a computer system is stored. Examples of the computer-readable medium include ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage; the method may also be implemented in the form of a carrier wave (for example, transmission over the Internet). The computer may also include the controller 180 of the terminal.

The above-described terminal is not limited to the configurations and methods of the embodiments described above; all or parts of the embodiments may be selectively combined so that various modifications can be made.

100: mobile terminal 110: wireless communication unit
120: A / V input unit 121: camera
130: user input unit 140: sensing unit
150: output unit 160: memory
170: interface unit 180: control unit
190: power supply

Claims (12)

A photographing guide method of a mobile terminal, comprising:
driving a camera and obtaining location-related information;
transmitting the obtained location-related information to a server;
receiving recommendation photo information related to the location-related information from the server; and
taking a picture using the received recommendation photo information.
The method of claim 1, wherein the location-related information
comprises position information, direction information, and tilt information of the camera.
The method of claim 1, wherein the recommendation photo information
includes shooting information of the recommended photo, a shooting location, an original image, recommended shooting location information, and nearby tourist information.
The method of claim 3, wherein the taking of the picture comprises
changing camera settings using the shooting information and then taking the picture.
The method of claim 3, wherein the taking of the picture comprises:
generating outlines of objects in the recommended photo as guide lines;
displaying the generated guide lines superimposed on a preview screen; and
photographing a target aligned with the displayed guide lines.
The method of claim 3, wherein the taking of the picture comprises,
when the shooting position of the recommended photo included in the recommendation photo information and the current position of the terminal do not match, providing road guidance to the shooting position of the recommended photo.
The method of claim 3, wherein the taking of the picture comprises:
recognizing a face area in the recommended photo;
recognizing the face of the subject through face recognition;
comparing the size and position of the face area of the recommended photo with those of the face area of the subject; and
guiding the position of the photographing subject according to the comparison result.
A mobile terminal comprising:
a camera;
a location-related information acquisition unit for obtaining location-related information of the camera;
a display unit for displaying an image input through the camera; and
a control unit which transmits the location-related information to a server, receives recommendation photo information related to the location-related information from the server, and controls the camera to take a picture using the recommendation photo information.
The mobile terminal of claim 8, wherein the location-related information acquisition unit comprises:
a location information module for obtaining location information of the camera;
an electronic compass for measuring the direction in which the camera is pointed; and
a tilt sensor for measuring the degree to which the camera is tilted.
The mobile terminal of claim 8, wherein the control unit
changes the settings of the camera to the shooting information included in the recommendation photo information.
The mobile terminal of claim 8, wherein the control unit
generates a guide line from the outline of an object in the original image included in the recommendation photo information and displays it superimposed on the preview screen.
The mobile terminal of claim 11, wherein the control unit
compares the face area of the original image with the size and position of the face area of the subject input through the camera and guides the movement of the subject according to the comparison result.
KR1020100076985A 2010-08-10 2010-08-10 Mobile terminal and method for guiding photography thereof KR20120014794A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020100076985A KR20120014794A (en) 2010-08-10 2010-08-10 Mobile terminal and method for guiding photography thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020100076985A KR20120014794A (en) 2010-08-10 2010-08-10 Mobile terminal and method for guiding photography thereof

Publications (1)

Publication Number Publication Date
KR20120014794A true KR20120014794A (en) 2012-02-20

Family

ID=45837744

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020100076985A KR20120014794A (en) 2010-08-10 2010-08-10 Mobile terminal and method for guiding photography thereof

Country Status (1)

Country Link
KR (1) KR20120014794A (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20150134960A (en) * 2014-05-23 2015-12-02 주식회사 카카오 Method and apparatus for recommending photo composition
US9866709B2 (en) 2013-12-13 2018-01-09 Sony Corporation Apparatus and method for determining trends in picture taking activity
US9955068B2 (en) 2013-07-15 2018-04-24 Samsung Electronics Co., Ltd. Apparatus and method for operating a composition of a picture in a portable terminal
KR20190026286A (en) 2017-09-04 2019-03-13 중앙대학교 산학협력단 Apparatus and Method of Image Support Technology Using OpenCV
KR20190026636A (en) 2018-12-10 2019-03-13 중앙대학교 산학협력단 Apparatus and Method of Image Support Technology Using OpenCV
KR20200018530A (en) 2020-02-10 2020-02-19 중앙대학교 산학협력단 Apparatus and Method of Image Support Technology Using OpenCV
US11678047B2 (en) 2019-02-19 2023-06-13 Samsung Electronics Co., Ltd. Electronic device and method providing content associated with image to application
US11869213B2 (en) 2020-01-17 2024-01-09 Samsung Electronics Co., Ltd. Electronic device for analyzing skin image and method for controlling the same


Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E601 Decision to refuse application