KR101622680B1 - Mobile terminal and method for handling image thereof - Google Patents

Mobile terminal and method for handling image thereof Download PDF

Info

Publication number
KR101622680B1
Authority
KR
South Korea
Prior art keywords
image
thumbnail
unit
mobile terminal
Prior art date
Application number
KR1020090115344A
Other languages
Korean (ko)
Other versions
KR20110058522A (en)
Inventor
권성민
아나톨리 티호트스키
알렉산더 마이보로다
이현구
Original Assignee
LG Electronics Inc. (엘지전자 주식회사)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by LG Electronics Inc.
Priority to KR1020090115344A
Publication of KR20110058522A
Application granted
Publication of KR101622680B1

Landscapes

  • Controls And Circuits For Display Device (AREA)
  • Telephone Function (AREA)

Abstract

The present invention relates to a mobile terminal and an image processing method thereof. When a control command for loading and displaying a given image file is input, the mobile terminal requests a view picture according to the control command. If the requested view picture is a first view picture, the image processing unit generates and displays it from the thumbnail of the image file. If the requested view picture is a second view picture, the image processing unit decodes the image data of the image file, stores the decoded data in a buffer, and generates and displays the second view picture from the decoded data. If the requested view picture is neither the first nor the second view picture, the requested view picture is generated from the image data stored in the buffer and displayed.


Description

MOBILE TERMINAL AND METHOD FOR HANDLING IMAGE THEREOF

BACKGROUND OF THE INVENTION

1. Field of the Invention

The present invention relates to a mobile terminal that rapidly processes and displays an image, and to an image processing method thereof.

A terminal such as a personal computer, a notebook computer, or a mobile phone can be configured to perform various functions. Examples of such functions include data and voice communication, capturing still images or video through a camera, voice recording, music file playback through a speaker system, and displaying images or video. Some terminals include additional functions to run games, and other terminals are implemented as multimedia devices. Moreover, recent terminals can receive broadcast or multicast signals to present video or television programs.

In general, terminals can be divided into mobile terminals and stationary terminals according to whether they can be moved, and mobile terminals can be further divided into handheld terminals and vehicle-mounted terminals according to whether the user can carry them directly.

Efforts are continuing to support and enhance the functions of terminals. The foregoing efforts include not only changes and improvements in structural components that form the terminal, but also improvements in software or hardware.

As terminals equipped with cameras become popular, it has become necessary to process digital images quickly and efficiently in a mobile terminal with limited memory space and processing power. In particular, digital image processing techniques are important for quickly displaying high-resolution images on a display with relatively low resolution. Accordingly, there has been much research on image processing technology for quickly processing high-resolution images.

The present invention provides a mobile terminal that, when storing image data acquired from a camera or another terminal, forms an image data access structure and a thumbnail and stores them together with the image data, and an image processing method thereof.

The present invention also provides a mobile terminal that decodes image data, stores the decoded data in a dedicated buffer, and extracts a desired image fragment from the buffered data when part of the image is requested by a pan or zoom operation, and an image processing method thereof.

According to an aspect of the present invention, an image processing method for a mobile terminal includes: receiving a control command for loading and displaying a given image file; requesting a view picture according to the control command; generating and displaying a first view picture from the thumbnail of the image file if the requested view picture is a first view picture; decoding the image data of the image file, storing the decoded data in a buffer, and generating and displaying a second view picture from the decoded data if the requested view picture is a second view picture; and, if the requested view picture is neither the first nor the second view picture, extracting the requested view picture from the image data stored in the buffer and displaying it.
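The three-way dispatch among view pictures described above can be sketched as follows. This is a minimal illustration, not the patent's implementation; the names `ImageFile` and `handle_view_request` and the string placeholders standing in for pixel data are hypothetical.

```python
from dataclasses import dataclass


@dataclass
class ImageFile:
    thumbnail: str   # placeholder for the stored thumbnail
    image_data: str  # placeholder for the full-resolution image data


def handle_view_request(view: str, image_file: ImageFile, buffer: dict) -> str:
    """Dispatch a requested view picture.

    "first"  -> build a fast preview from the thumbnail.
    "second" -> decode the full image data and cache it in the buffer.
    anything else (e.g. a pan/zoom view) -> reuse the buffered decode
    instead of decoding again.
    """
    if view == "first":
        return f"picture-from:{image_file.thumbnail}"
    if view == "second":
        buffer["decoded"] = f"decoded:{image_file.image_data}"  # cache the decode
        return f"picture-from:{buffer['decoded']}"
    # Third and later views: extract from the already-decoded buffer.
    return f"fragment-from:{buffer['decoded']}"
```

The key property the method relies on is that only the "second" branch pays the decoding cost; every later view is served from the buffer.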

A mobile terminal according to an exemplary embodiment of the present invention includes: a user input unit for receiving a control command; a memory for storing an image file; an image processing unit that, according to the view picture required by the control command, generates the requested view picture from the thumbnail of the image file, or decodes the image data of the image file and stores the decoded data in a buffer, or extracts an image fragment from the buffered data to generate the requested view picture; a display unit for displaying the view picture output from the image processing unit; and a controller for controlling the operations of these components.

The mobile terminal related to at least one embodiment of the present invention, configured as described above, forms an image data access structure and a thumbnail according to the storage scenario when storing images acquired from a camera or another terminal, and stores them together with the image data. Thus, the present invention can quickly decode image data using the image data access structure or the thumbnail.

In addition, the present invention decodes the image data, stores it in a dedicated buffer for image processing, and extracts the requested image fragment from the buffered data when partial decoding is required. Accordingly, the mobile terminal of the present invention need not decode the image data again when partial decoding is required, so partial decoding and presentation can be performed quickly.
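The fragment extraction just described amounts to cropping a window out of the already-decoded pixel buffer rather than re-running the decoder. A sketch, with a row-major list of lists standing in for the dedicated buffer (the function name and buffer layout are illustrative assumptions, not taken from the patent):

```python
def extract_fragment(decoded, x, y, width, height):
    """Crop a width x height window at (x, y) from a row-major pixel buffer.

    `decoded` is the full decoded image held in the dedicated buffer; a pan
    or zoom request only re-crops this buffer instead of decoding again.
    """
    return [row[x:x + width] for row in decoded[y:y + height]]
```

For example, panning to a new position only changes `x` and `y`; the expensive decode that filled `decoded` is never repeated.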

Hereinafter, a mobile terminal related to the present invention will be described in more detail with reference to the drawings. The suffixes "module" and "unit" for the components used in the following description are given or used interchangeably only for ease of writing the specification, and do not themselves have distinct meanings or roles.

The mobile terminal described in this specification may include a mobile phone, a smart phone, a laptop computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), a navigation device, and the like. However, it will be understood by those skilled in the art that the configurations according to the embodiments described herein may also be applied to fixed terminals such as a digital TV and a desktop computer, except where a configuration is applicable only to a mobile terminal.

FIG. 1 is a block diagram of a mobile terminal according to an embodiment of the present invention.

The mobile terminal 100 may include a wireless communication unit 110, an audio/video (A/V) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, a power supply unit 190, and the like. The components shown in FIG. 1 are not essential, so a mobile terminal having more or fewer components may be implemented.

Hereinafter, the components will be described in order.

The wireless communication unit 110 may include one or more modules enabling wireless communication between the mobile terminal 100 and a wireless communication system, or between the mobile terminal 100 and a network in which the mobile terminal 100 is located. For example, the wireless communication unit 110 may include a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short-range communication module 114, and a location information module 115.

The broadcast receiving module 111 receives broadcast signals and / or broadcast-related information from an external broadcast management server through a broadcast channel.

The broadcast channel may include a satellite channel and a terrestrial channel. The broadcast management server may refer to a server that generates and transmits broadcast signals and/or broadcast-related information, or a server that receives previously generated broadcast signals and/or broadcast-related information and transmits them to a terminal. The broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and a broadcast signal in which a data broadcast signal is combined with a TV broadcast signal or a radio broadcast signal.

The broadcast-related information may refer to a broadcast channel, a broadcast program, or information related to a broadcast service provider. The broadcast-related information may also be provided through a mobile communication network. In this case, it may be received by the mobile communication module 112.

The broadcast-related information may exist in various forms, for example, in the form of an Electronic Program Guide (EPG) of Digital Multimedia Broadcasting (DMB) or an Electronic Service Guide (ESG) of Digital Video Broadcast-Handheld (DVB-H).

For example, the broadcast receiving module 111 may receive digital broadcast signals using digital broadcasting systems such as Digital Multimedia Broadcasting-Terrestrial (DMB-T), Digital Multimedia Broadcasting-Satellite (DMB-S), Media Forward Link Only (MediaFLO), Digital Video Broadcast-Handheld (DVB-H), and Integrated Services Digital Broadcast-Terrestrial (ISDB-T). Of course, the broadcast receiving module 111 may be adapted to other broadcasting systems as well as the digital broadcasting systems described above.

The broadcast signal and / or broadcast related information received through the broadcast receiving module 111 may be stored in the memory 160.

The mobile communication module 112 transmits and receives radio signals to at least one of a base station, an external terminal, and a server on a mobile communication network. The wireless signal may include various types of data depending on a voice call signal, a video call signal or a text / multimedia message transmission / reception.

The wireless Internet module 113 is a module for wireless Internet access, and may be built in or externally attached to the mobile terminal 100. WLAN (Wi-Fi), Wibro (Wireless broadband), Wimax (World Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access) and the like can be used as wireless Internet technologies.

The short-range communication module 114 refers to a module for short-range communication. Bluetooth, Radio Frequency Identification (RFID), infrared data association (IrDA), Ultra Wideband (UWB), ZigBee, and the like can be used as a short range communication technology.

The position information module 115 is a module for obtaining the position of the mobile terminal, and a representative example thereof is a Global Position System (GPS) module.

Referring to FIG. 1, the A/V (audio/video) input unit 120 is for inputting an audio signal or a video signal, and may include a camera 121 and a microphone 122. The camera 121 processes image frames, such as still images or moving images, obtained by the image sensor in the video communication mode or the photographing mode. The processed image frames can be displayed on the display unit 151.

The image frames processed by the camera 121 may be stored in the memory 160 or transmitted to the outside through the wireless communication unit 110. Two or more cameras 121 may be provided depending on the use environment.

The microphone 122 receives an external sound signal in a communication mode, a recording mode, a voice recognition mode, or the like, and processes it into electrical voice data. In the call mode, the processed voice data can be converted into a form transmittable to a mobile communication base station through the mobile communication module 112 and output. Various noise reduction algorithms may be implemented in the microphone 122 to remove noise generated while receiving the external sound signal.

The user input unit 130 generates input data with which the user controls the operation of the terminal. The user input unit 130 may include a keypad, a dome switch, a touch pad (static pressure/capacitance), a jog wheel, a jog switch, and the like.

The sensing unit 140 senses the current state of the mobile terminal 100, such as the open/close state of the mobile terminal 100, the position of the mobile terminal 100, whether a user is in contact, and the orientation and acceleration/deceleration of the mobile terminal, and generates a sensing signal for controlling the operation of the mobile terminal 100. For example, when the mobile terminal 100 is in the form of a slide phone, the sensing unit 140 can sense whether the slide phone is opened or closed. It can also sense whether the power supply unit 190 is supplying power, whether the interface unit 170 is connected to an external device, and the like. Meanwhile, the sensing unit 140 may include a proximity sensor 141.

The output unit 150 generates output related to the visual, auditory, or tactile senses, and may include a display unit 151, an audio output module 152, an alarm unit 153, and a haptic module 154.

The display unit 151 displays (outputs) information processed by the mobile terminal 100. For example, when the mobile terminal is in the call mode, a UI (User Interface) or a GUI (Graphic User Interface) associated with a call is displayed. When the mobile terminal 100 is in the video communication mode or the photographing mode, the photographed and / or received video or UI and GUI are displayed.

The display unit 151 may include a display buffer 155 for storing data to be output on the display screen.

The display unit 151 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT-LCD), an organic light-emitting diode (OLED) display, a flexible display, and a 3D display.

Some of these displays may be transparent or light transmissive so that they can be seen through. This can be referred to as a transparent display, and a typical example of the transparent display is TOLED (Transparent OLED) and the like. The rear structure of the display unit 151 may also be of a light transmission type. With this structure, the user can see an object located behind the terminal body through the area occupied by the display unit 151 of the terminal body.

There may be two or more display units 151 according to the embodiment of the mobile terminal 100. For example, in the mobile terminal 100, a plurality of display portions may be spaced apart from one another or may be disposed integrally with each other, or may be disposed on different surfaces.

When the display unit 151 and a sensor for sensing a touch operation (hereinafter, a 'touch sensor') form a mutual layer structure (hereinafter, a 'touch screen'), the display unit 151 can be used as an input device in addition to an output device. The touch sensor may take the form of, for example, a touch film, a touch sheet, or a touch pad.

The touch sensor may be configured to convert a change in a pressure applied to a specific portion of the display unit 151 or a capacitance generated in a specific portion of the display unit 151 into an electrical input signal. The touch sensor can be configured to detect not only the position and area to be touched but also the pressure at the time of touch.

If there is a touch input to the touch sensor, the corresponding signal(s) are sent to a touch controller. The touch controller processes the signal(s) and transmits the corresponding data to the controller 180. Thus, the controller 180 can know which area of the display unit 151 has been touched.

Referring to FIG. 1, the proximity sensor 141 may be disposed in an inner region of the mobile terminal surrounded by the touch screen, or in the vicinity of the touch screen. The proximity sensor is a sensor that detects, without mechanical contact, the presence or absence of an object approaching a predetermined detection surface or an object in the vicinity, using the force of an electromagnetic field or infrared rays. The proximity sensor has a longer lifespan than a contact sensor, and its utility is also higher.

Examples of the proximity sensor include a transmission-type photoelectric sensor, a direct-reflection-type photoelectric sensor, a mirror-reflection-type photoelectric sensor, a high-frequency oscillation proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor. When the touch screen is capacitive, it is configured to detect the proximity of a pointer by the change in the electric field caused by the pointer's approach. In this case, the touch screen (touch sensor) may be classified as a proximity sensor.

Hereinafter, for convenience of explanation, the act of bringing a pointer close to the touch screen without contact, so that the pointer is recognized as positioned on the touch screen, is referred to as a "proximity touch", and the act of actually bringing the pointer into contact with the touch screen is referred to as a "contact touch". The position at which a proximity touch is made on the touch screen is the position at which the pointer vertically corresponds to the touch screen when the pointer makes the proximity touch.

The proximity sensor detects a proximity touch and a proximity touch pattern (e.g., a proximity touch distance, a proximity touch direction, a proximity touch speed, a proximity touch time, a proximity touch position, a proximity touch movement state, and the like). The information corresponding to the detected proximity touch operation and the proximity touch pattern may be output on the touch screen.

The audio output module 152 may output audio data received from the wireless communication unit 110 or stored in the memory 160 in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, and the like. The audio output module 152 also outputs sound signals related to functions performed in the mobile terminal 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output module 152 may include a receiver, a speaker, a buzzer, and the like.

The alarm unit 153 outputs a signal for notifying the occurrence of an event of the mobile terminal 100. Examples of events that occur in the mobile terminal include call signal reception, message reception, key signal input, touch input, and the like. The alarm unit 153 may output a signal for notifying the occurrence of an event in a form other than a video signal or an audio signal, for example, vibration. The video signal or the audio signal may also be output through the display unit 151 or the audio output module 152, so these components may be classified as part of the alarm unit 153.

The haptic module 154 generates various tactile effects that the user can feel. A typical example of the tactile effect generated by the haptic module 154 is vibration. The intensity and pattern of the vibration generated by the haptic module 154 can be controlled. For example, different vibrations may be synthesized and output, or output sequentially.

In addition to vibration, the haptic module 154 can generate various tactile effects, such as a pin arrangement moving vertically against the contacted skin surface, a jet or suction force of air through a jet port or suction port, brushing against the skin surface, contact with an electrode, an electrostatic force, and the effect of reproducing a sense of cold or warmth using an endothermic or exothermic element.

The haptic module 154 can be implemented not only to transmit the tactile effect through the direct contact but also to allow the user to feel the tactile effect through the muscular sensation of the finger or arm. At least two haptic modules 154 may be provided according to the configuration of the mobile terminal 100.

The memory 160 may store a program for the operation of the controller 180 and temporarily store input / output data (e.g., a phone book, a message, a still image, a moving picture, etc.). The memory 160 may store data on vibration and sound of various patterns outputted when a touch is input on the touch screen.

The memory 160 may include at least one type of storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card-type memory (e.g., SD or XD memory), random access memory (RAM), static random access memory (SRAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), a magnetic memory, a magnetic disk, and an optical disk. The mobile terminal 100 may also operate in association with web storage that performs the storage function of the memory 160 on the Internet.

The interface unit 170 serves as a path to all external devices connected to the mobile terminal 100. The interface unit 170 receives data from an external device, receives power and delivers it to each component in the mobile terminal 100, or transmits data from the mobile terminal 100 to an external device. For example, a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video input/output (I/O) port, an earphone port, and the like may be included in the interface unit 170.

The identification module is a chip that stores various kinds of information for authenticating the usage rights of the mobile terminal 100, and may include a user identity module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM), and the like. A device having an identification module (hereinafter, an "identification device") can be manufactured in a smart card format. Accordingly, the identification device can be connected to the terminal 100 through a port.

When the mobile terminal 100 is connected to an external cradle, the interface unit 170 may serve as a path through which power from the cradle is supplied to the mobile terminal 100, or a path through which various command signals input by the user at the cradle are transmitted to the mobile terminal. The various command signals or the power input from the cradle may serve as a signal for recognizing that the mobile terminal is correctly mounted on the cradle.

The controller 180 typically controls the overall operation of the mobile terminal. For example, it performs the control and processing related to voice calls, data communication, video calls, and the like. The controller 180 may include a multimedia module 181 for multimedia playback and an image processing unit 200 for decoding an image (still image). The multimedia module 181 and the image processing unit 200 may be implemented within the controller 180 or separately from the controller 180.

The image processing unit 200 processes image data input under the control of the controller 180 and stores the processed image data in the memory 160 and/or the display buffer 155. The image data may be obtained from the camera 121 or downloaded from another terminal through the wireless communication unit 110.

The image processing unit 200 generates a thumbnail and image-related additional information for the image data acquired through the camera 121, and stores them together with the acquired image data (original data) as an image file. In other words, the image file may contain the original image data, a thumbnail, and image information.
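One way to picture such a container is as length-prefixed blocks for the original data, the thumbnail, and the image information. The byte layout below is purely hypothetical, since the patent does not specify one; it only illustrates the idea of storing the three parts in a single file.

```python
import json
import struct


def pack_image_file(original: bytes, thumbnail: bytes, info: dict) -> bytes:
    """Pack original data, thumbnail, and image info into one blob.

    Hypothetical layout: three blocks, each prefixed by a 4-byte
    big-endian length (original, thumbnail, JSON-encoded info).
    """
    meta = json.dumps(info).encode()
    out = b""
    for block in (original, thumbnail, meta):
        out += struct.pack(">I", len(block)) + block
    return out


def unpack_image_file(blob: bytes):
    """Recover the three blocks written by pack_image_file."""
    blocks = []
    offset = 0
    for _ in range(3):
        (length,) = struct.unpack_from(">I", blob, offset)
        offset += 4
        blocks.append(blob[offset:offset + length])
        offset += length
    original, thumbnail, meta = blocks
    return original, thumbnail, json.loads(meta.decode())
```

In practice a format like JPEG/EXIF already embeds a thumbnail and metadata alongside the compressed image, which is the same idea with a standardized layout.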

In addition, when an image file is downloaded from another terminal, the image processing unit 200 checks whether the image file includes a thumbnail. If the image file does not include a thumbnail, the image processing unit 200 forms a thumbnail and an image data access structure for the downloaded image and inserts them into the image file.
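That check-and-insert step can be sketched as follows. The dictionary representation of an image file, the `ensure_thumbnail` name, and the placeholder access structure are all illustrative assumptions rather than details from the patent.

```python
def ensure_thumbnail(image_file: dict, make_thumbnail) -> dict:
    """After download, verify the file carries a thumbnail; create and
    insert a thumbnail and an access structure if it does not.

    make_thumbnail is any callable that derives a thumbnail from the
    original image data.
    """
    if image_file.get("thumbnail") is None:
        image_file["thumbnail"] = make_thumbnail(image_file["image_data"])
        # Placeholder for the image data access structure the patent
        # describes; its real contents are not specified here.
        image_file["access_structure"] = {"generated": True}
    return image_file
```

A file that already carries a thumbnail passes through unchanged, so the extra work is only done once per downloaded image.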

The controller 180 may perform a pattern recognition process for recognizing handwriting input or drawing input performed on the touch screen as characters and images, respectively.

The power supply unit 190 receives external power and internal power under the control of the controller 180 and supplies power necessary for operation of the respective components.

The various embodiments described herein may be embodied in a recording medium readable by a computer or similar device using, for example, software, hardware, or a combination thereof.

According to a hardware implementation, the embodiments described herein may be implemented using at least one of application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and other electronic units for performing functions. In some cases, the embodiments described herein may be implemented by the controller 180 itself.

According to a software implementation, embodiments such as the procedures and functions described herein may be implemented as separate software modules. Each of the software modules may perform one or more of the functions and operations described herein. The software code can be implemented as a software application written in a suitable programming language. The software code may be stored in the memory 160 and executed by the controller 180.

FIG. 2A is a perspective view of a mobile terminal according to an embodiment of the present invention.

The disclosed mobile terminal 100 includes a bar-shaped terminal body. However, the present invention is not limited thereto, and can be applied to various structures such as a slide type, a folder type, a swing type, and a swivel type in which two or more bodies are relatively movably coupled.

The body includes a case (casing, housing, cover, and the like) that forms its appearance. In this embodiment, the case may be divided into a front case 101 and a rear case 102. Various electronic components are embedded in the space formed between the front case 101 and the rear case 102. At least one intermediate case may be additionally disposed between the front case 101 and the rear case 102.

The cases may be formed by injection molding of synthetic resin, or may be formed of a metal material such as stainless steel (STS) or titanium (Ti).

The display unit 151, the sound output unit 152, the camera 121, the user input units 130 (131 and 132), the microphone 122, the interface 170, and the like may be disposed in the front case 101.

The display unit 151 occupies most of the main surface of the front case 101. The sound output unit 152 and the camera 121 are disposed in an area adjacent to one end of the display unit 151, and the user input unit 131 and the microphone 122 are disposed in an area adjacent to the other end. The user input unit 132, the interface 170, and the like may be disposed on the side surfaces of the front case 101 and the rear case 102.

The user input unit 130 is operated to receive a command for controlling the operation of the mobile terminal 100 and may include a plurality of operation units 131 and 132. The operation units 131 and 132 may be collectively referred to as a manipulating portion and may be employed in any manner as long as the user operates in a tactile manner.

The contents input by the first or second operation unit 131 or 132 may be variously set. For example, the first operation unit 131 may receive commands such as start, end, and scroll, and the second operation unit 132 may receive commands such as adjusting the volume of the sound output from the sound output unit 152 or switching the display unit 151 to the touch recognition mode.

FIG. 2B is a rear perspective view of the mobile terminal shown in FIG. 2A.

Referring to FIG. 2B, a camera 121 'may be further mounted on the rear surface of the terminal body, that is, the rear case 102. The camera 121 'may have a photographing direction substantially opposite to the camera 121 (see FIG. 2A), and may be a camera having different pixels from the camera 121.

For example, the camera 121 preferably has a low pixel count, so that the user's face can be captured and transmitted to the other party during a video call, while the camera 121' preferably has a high pixel count, since it typically captures general subjects that are not transmitted immediately. The cameras 121 and 121' may be installed in the terminal body so as to be rotatable or pop-up capable.

A flash 123 and a mirror 124 may be additionally disposed adjacent to the camera 121'. The flash 123 illuminates the subject when the subject is photographed by the camera 121'. The mirror 124 allows the user to view his or her own face when taking a self-portrait with the camera 121'.

An acoustic output 152 'may be additionally disposed on the rear surface of the terminal body. The sound output unit 152 'may implement the stereo function together with the sound output unit 152 (see FIG. 2A), and may be used for the implementation of the speakerphone mode during a call.

In addition to an antenna for calls and the like, a broadcast signal reception antenna 116 may be additionally disposed on the side of the terminal body. The antenna 116, which forms part of the broadcast receiving module 111 (see FIG. 1), may be installed so as to be retractable from the terminal body.

A power supply unit 190 for supplying power to the mobile terminal 100 is mounted on the terminal body. The power supply unit 190 may be built in the terminal body or may be detachable from the outside of the terminal body.

The rear case 102 may further include a touch pad 135 for sensing a touch. The touch pad 135 may also be of a light transmission type, like the display unit 151. In this case, if the display unit 151 is configured to output visual information on both sides, the visual information can also be recognized through the touch pad 135. The information output on both sides may all be controlled by the touch pad 135. Alternatively, a display may be additionally mounted on the touch pad 135, so that a touch screen may also be disposed on the rear case 102.

The touch pad 135 operates in mutual correlation with the display unit 151 of the front case 101. The touch pad 135 may be disposed parallel to and behind the display unit 151. The touch pad 135 may be equal in size to, or smaller than, the display unit 151.

Hereinafter, a related operation of the display unit 151 and the touch pad 135 will be described with reference to FIG.

FIG. 3 is a front view of the mobile terminal for explaining an operation state of the mobile terminal according to the present invention.

Various types of visual information can be displayed on the display unit 151. This information can be displayed in the form of letters, numbers, symbols, graphics, or icons.

At least one of the letters, numbers, symbols, graphics, or icons may be displayed in a predetermined arrangement for inputting such information, thereby being implemented as a keypad. Such a keypad may be called a so-called 'soft key'.

FIG. 3 shows that a touch applied to a soft key is inputted through a front surface of a terminal body.

The display unit 151 may operate as a single entire area or may be divided into a plurality of areas. In the latter case, the plurality of areas can be configured to operate in association with each other.

For example, an output window 151a and an input window 151b are displayed on the upper and lower portions of the display unit 151, respectively. The output window 151a and the input window 151b are areas allocated for outputting or inputting information, respectively. In the input window 151b, a soft key 151c for displaying a number for inputting a telephone number or the like is output. When the soft key 151c is touched, a number corresponding to the touched soft key is displayed on the output window 151a. When the first operation unit 131 is operated, call connection to the telephone number displayed on the output window 151a is attempted.

Although the above embodiment discloses receiving a touch applied to a soft key through a front surface of a terminal body, in the case of a terminal having a transparent display, a touch applied to a soft key may be received through a rear surface of the terminal.

Although not shown in FIG. 3, when the terminal body is changed from a portrait arrangement to a landscape arrangement, the terminal may convert the output screen displayed on the display unit 151 according to the arrangement direction of the terminal body.

The terminal 100 shown in FIGS. 1 to 3 is operable in a communication system capable of transmitting data through frames or packets, including wired/wireless communication systems and satellite-based communication systems.

FIG. 4 is a flowchart illustrating a procedure for preparing an image file from an image input from a camera, according to an embodiment of the present invention. The present embodiment is described taking the case of acquiring an original image from a camera as an example.

Referring to FIG. 4, the controller 180 of the mobile terminal 100 acquires image data from the camera 121 (S101). For example, when the user selects the 'camera' menu through a menu operation, the control unit 180 drives the camera 121. When the camera 121 is driven, the control unit 180 displays the image input from the camera 121 on the display unit 151. Here, the display unit 151 serves as a viewfinder. When a subject to be photographed is identified through the viewfinder, the user presses a photographing button of the user input unit 130. When the control signal input from the user input unit 130 is recognized as a photographing instruction, the control unit 180 controls the camera 121 to photograph. When the photographing is completed, the camera 121 compresses the captured full-resolution digital image data (original image) and transmits the compressed digital image data to the image processing unit 200.

The image processing unit 200 decodes the bit stream of the image data transmitted from the camera 121 under the control of the control unit 180 (S102). In other words, the image processing unit 200 decodes the image data transmitted from the camera 121 bit by bit.

The image processing unit 200 generates a thumbnail of the image data acquired through the camera 121 from the decoded image data and the terminal parameters (S103). In other words, the image processing unit 200 reduces the decoded image data by a predetermined ratio to form a small image. Here, the terminal parameters include display parameters such as display resolution (screen size), and camera parameters such as lens focal length, exposure time, aperture value, whether the flash was used, and the like.

For example, when the display screen size of the terminal is 320x240, the image processing unit 200 generates a thumbnail by reducing a high-resolution original image (2048x1536, 1600x1212, 1280x1024) to the screen size of the terminal. Alternatively, the image processing unit 200 generates a thumbnail by reducing a low-resolution original image (640x480, 320x240) to a 72x54 size.
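As an illustration of this sizing rule, here is a minimal Python sketch. The 640x480 cutoff between "high" and "low" resolution is an assumption for illustration; the patent only gives example sizes.

```python
def thumbnail_size(width, height, screen=(320, 240), small=(72, 54)):
    """Pick a thumbnail size following the rule described above.

    Low-resolution originals (here: at most 640x480, an assumed cutoff)
    get a fixed 72x54 index thumbnail; high-resolution originals are
    reduced to fit the display screen, preserving the aspect ratio.
    """
    if width <= 640 and height <= 480:
        return small
    scale = min(screen[0] / width, screen[1] / height)
    return (round(width * scale), round(height * scale))
```

A 4:3 original such as 2048x1536 maps exactly onto the 320x240 screen, while a 5:4 original like 1280x1024 is letterboxed to 300x240 to preserve its aspect ratio.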

When the thumbnail is generated, the image processing unit 200 generates additional information related to the acquired image data (S104). Here, the additional information includes an image data access structure, camera information, a compression mode, a color space, the number of pixels, and the like. The camera information includes the camera maker, camera model, software, Exif version, shooting date, image size, exposure time (Exposure Time), exposure program (Exposure Program), lens focal length (Focal Length), aperture value (F-Number), and the like.

In addition, the image data access structure is used for partial decoding of image data. The image data access structure is an indicator that indicates a requested area if a certain area of the image data is requested according to a scenario. For example, the image data access structure includes information on a start point and an end point of a corresponding image fragment in accordance with a movement or zoom operation step.
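The access structure described above can be pictured as a small table of start/end indicators keyed by the zoom or move operation step. A hypothetical Python sketch follows; the field names, function name, and byte offsets are illustrative, not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class Fragment:
    """Start/end indicator of one image fragment in the compressed stream.
    Field names and byte offsets are illustrative, not from the patent."""
    start: int
    end: int

# An assumed access structure: one indicator per zoom/move step, so a
# requested region can be located without decoding the whole image.
access_structure = {
    1: Fragment(start=0, end=65536),
    2: Fragment(start=4096, end=32768),
    3: Fragment(start=8192, end=16384),
}

def fragment_for(step):
    """Look up the start and end points for a given operation step."""
    return access_structure[step]
```

With such a table, partial decoding only needs to read the bytes between `start` and `end` for the requested step.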

When the additional information is generated, the image processing unit 200 stores the image data, the thumbnail, and the additional information acquired from the camera 121 as one image file (S105). The image file is stored as an Exif (exchangeable image file format) image file. The Exif image file includes the original image data, a thumbnail serving as index information of the original image data, and additional information about the image. A separate program supporting the Exif standard is required to view the additional information.

The original image data may be implemented as any one of TIFF (Tagged Image File Format)-based RGB uncompressed data, YCbCr uncompressed data, and JPEG compressed data. In addition, position information can be obtained through the position information module 115 and recorded in the additional information in the image file. The mobile terminal 100 can efficiently manage the image file using the additional information recorded in the image file and obtain information on the image data.

FIG. 5 is a flowchart illustrating a procedure in which a mobile terminal according to an embodiment of the present invention prepares an image file from an image input from another source. The present embodiment describes an example of acquiring an image file from another source.

Referring to FIG. 5, the image processing unit 200 loads an image file from a source other than the camera 121 under the control of the control unit 180 (S201). That is, the control unit 180 acquires an image file from the memory 160 or another terminal. The mobile terminal 100 of the present invention can download and load an image file stored in another terminal through the wireless communication unit 110.

Then, the image processing unit 200 determines whether the loaded image file includes a thumbnail (S202). For example, the image processing unit 200 determines whether the loaded image file is an Exif file. That is, the image processing unit 200 confirms the file type of the image file downloaded from the other terminal.

If it is determined that the loaded image file does not include a thumbnail, the controller 180 controls the image processing unit 200 to decode the image data of the loaded image file (S203).

When the decoding of the image data is completed, the image processing unit 200 generates a thumbnail from the original image data and the terminal parameters (S204). For example, the image processing unit 200 determines the size of the thumbnail in consideration of the display resolution of the mobile terminal and the image size of the image data, and generates the thumbnail by resizing the decoded image data to the determined size.

When the thumbnail is generated, the image processing unit 200 generates additional information on the image data (S205). For example, if the zoom operation provided by the image viewer of the terminal is made up of four stages, the image processing unit 200 generates an indicator indicating the start point and the end point of the corresponding image fragment for each zoom step, and records it in the additional information. In other words, the image processing unit 200 generates image data access structure information and inserts it into the additional information.

When the additional information is generated, the image processing unit 200 inserts the generated thumbnail and additional information into the image file, and stores the image file in the memory 160 (S206).

If the loaded image file contains a thumbnail in step S202, the controller 180 directly stores the loaded image file in the memory 160. That is, the control unit 180 of the mobile terminal 100 stores the image file downloaded from the other terminal in the memory 160 without processing.
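The branch of FIG. 5 (steps S201 to S206) can be sketched as follows. The four callbacks passed in (`decode`, `make_thumbnail`, `make_indicators`, `store`) are assumed, illustrative stand-ins for the decoder, thumbnail generator, indicator generator, and storage step.

```python
def prepare_image_file(image_file, decode, make_thumbnail, make_indicators, store):
    """Sketch of the FIG. 5 flow (S201-S206) for a file loaded from
    another source. The four callbacks are illustrative stand-ins,
    not names from the patent."""
    if image_file.get("thumbnail") is None:                 # S202
        pixels = decode(image_file["image_data"])           # S203
        image_file["thumbnail"] = make_thumbnail(pixels)    # S204
        image_file["indicators"] = make_indicators(pixels)  # S205
    # A file that already has a thumbnail is stored without processing.
    store(image_file)                                       # S206
    return image_file
```

The key design point is that decoding (S203) only happens on the thumbnail-less path; a well-formed Exif file from another terminal is stored as-is.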

FIG. 6 is an exemplary diagram illustrating an original image processed in a mobile terminal and a derivative image thereof according to an embodiment of the present invention.

FIG. 6(a) shows an original image having full resolution and images obtained by adjusting the original image to basic scaling levels. Here, the basic scaling levels are set to a total of four, including the original image (1:1). In this embodiment, the basic scaling levels are set to 1:1, 1:2, 1:4, and 1:8, but the number of basic scaling levels can be changed.

FIG. 6(b) shows a thumbnail and a first view picture. The thumbnail may be an image smaller than the original image data reduced to the basic scaling level of 1:8. The first view picture is generated from the thumbnail without decoding the original image.

FIG. 6(c) shows the corresponding image fragments (a-d) for each zoom step. Each time an enlargement command is input while the first view picture is displayed, a view picture is generated from the image fragment corresponding to the relevant step. For example, when zooming is performed in four steps, the image processing unit 200 of the mobile terminal 100 extracts the image fragment (a) corresponding to the first zoom step, and when the enlargement command is input again, extracts the image fragment (b) corresponding to the second step. At this time, the image processing unit 200 may extract the corresponding image fragments by using the image data access structure recorded in the additional information of the image file.
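One simple way to model how the fragments (a-d) shrink with each zoom step is a centered crop that halves each dimension per step. This is an assumption for illustration; the patent leaves the exact fragment geometry to the access structure recorded in the additional information.

```python
def zoom_fragment(width, height, step):
    """Return the centered crop rectangle (x, y, w, h) for a zoom step.

    Assumed geometry: each zoom step keeps 1/2**step of each dimension,
    centered on the image (matching the 1:2/1:4/1:8 basic scaling levels).
    """
    fw, fh = width >> step, height >> step  # fragment size at this step
    x = (width - fw) // 2
    y = (height - fh) // 2
    return (x, y, fw, fh)
```

For a 2048x1536 original, step 1 keeps the centered 1024x768 region and step 2 the centered 512x384 region, which is then scaled up to the display size.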

FIG. 7 is a schematic diagram showing an image processing unit of a mobile terminal according to an embodiment of the present invention.

Referring to FIG. 7, the image processing unit 200 includes first and second switchers 210 and 215, a thumbnail decoder 220, an image decoder 225, a scaler 230, a buffer 235, an image extracting unit 240, a post-processing unit 250, and the like. In addition, the image processing unit 200 controls the operation of each component constituting the image processing unit 200 under the control of the control unit 180.

The first and second switchers 210 and 215 set a path for image data processing according to the view picture (image fragment) required by the controller 180. For example, when a specific image is selected while the album view is being executed, the control unit 180 requests the image processing unit 200 for a first view picture of the selected image. The image processing unit 200 switches the first switcher 210 to a closed state so that the image file is input to the thumbnail decoder 220 through the first switcher 210. Alternatively, if an enlargement command is input by the user while the first view picture of the selected image is displayed, the controller 180 requests the image processing unit 200 for a second view picture. The image processing unit 200 switches the first switcher 210 to the open state and the second switcher 215 to the closed state so that the image file is input to the image decoder 225.

In other words, the image processing unit 200 controls the first and second switchers 210 and 215 according to whether the first view picture or another view picture (second view picture, third view picture, etc.) is requested, and sets the movement path of the selected image file.
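The switcher routing reduces to a two-way branch on the requested view picture. A minimal sketch, with the two decoder paths passed in as callbacks (the callback names are illustrative assumptions):

```python
def route_image_file(requested_view, to_thumbnail_decoder, to_image_decoder):
    """Route the selected image file according to the requested view picture.

    First view picture  -> first switcher closed  -> thumbnail decoder (220).
    Other view pictures -> second switcher closed -> image decoder (225).
    The two callbacks stand in for the decoder paths (illustrative names).
    """
    if requested_view == 1:
        return to_thumbnail_decoder()
    return to_image_decoder()
```

This captures why the first view picture is cheap: it never touches the full-resolution decoder path.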

The thumbnail decoder 220 decodes the thumbnail included in the image file. The thumbnail decoder 220 outputs the decoded thumbnail to the scaler 230.

The scaler 230 resizes the image size of the decoded thumbnail to the same size as the screen size of the display unit 151. In other words, the scaler 230 generates a first view picture by scaling the image size of the decoded thumbnail at a predetermined ratio. The scaler 230 outputs the scaled image data.

The image decoder 225 decodes the original image data of the image file. At this time, the image decoder 225 converts the image data into YUV data when decoding. The image decoder 225 also scales the image size at a predetermined rate while decoding the image data. For example, the image decoder 225 reduces the original image to a 1:2 scale in each dimension (decoded image size : original image size = 1:2), i.e., to 1/4 of the original area.

In the buffer 235, decoded image data output from the image decoder 225 is stored. Here, the image data stored in the buffer 235 is used for image processing according to the movement or zooming operation.

The image extracting unit 240 extracts the image fragment requested by the control unit 180 from the image data stored in the buffer 235. For example, when any one of the up, down, left, and right movement commands (upward, downward, leftward, or rightward scroll) or a zoom-in command is input by the user, the control unit 180 requests the corresponding view picture from the image processing unit 200. The image processing unit 200 extracts an image fragment corresponding to the requested view picture through the image extracting unit 240.

In addition, the image extracting unit 240 scales the extracted image fragment to the display screen size. In other words, the image extracting unit 240 generates the second to Nth view pictures from the image data stored in the buffer 235. Here, N refers to another view picture level in the zooming or moving operation step.

For example, the image extracting unit 240 may extract a requested image fragment from the image stored in the buffer 235. Then, the image extracting unit 240 scales the cropped image fragment to the display screen size.

The postprocessor 250 post-processes the image data output from the image extracting unit 240 or the image data output from the scaler 230. Here, the post-processing is processing for improving the image quality by adjusting the contrast, sharpness, saturation, dithering, etc. of the image.

When post-processing of the image data is completed, the post-processing unit 250 outputs the processed image data to the display buffer 155. The display unit 151 displays the image data stored in the display buffer 155 on the screen.

FIG. 8 is a flowchart illustrating a method of displaying a first view picture of a mobile terminal according to an embodiment of the present invention. The present embodiment is described taking the case of displaying the first view picture as an example.

First, the control unit 180 executes the album function according to a control command input from the user input unit 130 (S301). For example, when the menu button of the user input unit 130 is pressed, the controller 180 displays the menus, and executes the album function when the 'view album' menu is selected.

When the album function is executed, the controller 180 displays a list of image files stored in the memory 160 on the display screen (S302). Here, the control unit 180 may display the image file list in a text form or in a thumbnail form.

If any of the image files in the image file list is selected, the controller 180 controls the image processing unit 200 to decode the thumbnail stored in the selected image file (S303, S304). In other words, the image processing unit 200 switches the first switcher 210 and the second switcher 215 to the closed state and the open state, respectively, and inputs the selected image file to the thumbnail decoder 220. The thumbnail decoder 220 decodes a thumbnail in the selected image file. Here, the thumbnail decoder 220 converts the thumbnail into YUV data.

When the decoding of the thumbnail is completed, the image processing unit 200 scales the image size of the decoded thumbnail to the display screen size and outputs the scaled image to the display buffer 155 (S305). Here, the image processing unit 200 may pass the scaled image data through the post-processing unit 250 to improve the image quality. The post-processing unit 250 adjusts the contrast, sharpness, saturation, dithering, etc. of the image. Enhancement procedures such as the contrast, sharpness, and saturation procedures are performed in the YUV color space, while the dithering procedure is performed in the RGB (Red Green Blue) color space. Thus, the dithering procedure can be integrated with the process of converting YUV data to RGB data.
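Since the dithering procedure runs in RGB, it can be folded into the YUV-to-RGB conversion step. A sketch of that conversion using the standard BT.601 full-range coefficients (an assumption for illustration; the patent does not name a specific YUV variant):

```python
def yuv_to_rgb(y, u, v):
    """Convert one full-range YUV sample to RGB.

    Enhancement steps run in YUV; dithering runs in RGB, so it can be
    merged into this conversion. Coefficients are the standard BT.601
    values (an assumption -- not taken from the patent).
    """
    r = y + 1.402 * (v - 128)
    g = y - 0.344136 * (u - 128) - 0.714136 * (v - 128)
    b = y + 1.772 * (u - 128)
    clamp = lambda c: max(0, min(255, round(c)))
    return (clamp(r), clamp(g), clamp(b))
```

Neutral samples pass through unchanged: `yuv_to_rgb(128, 128, 128)` returns mid-gray `(128, 128, 128)`.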

When the image data output from the thumbnail decoder is scaled, the controller 180 controls the display unit 151 to display the scaled image stored in the display buffer 155 on the screen (S306).

FIG. 9 is a flowchart showing an image processing method according to a control command of a mobile terminal according to an embodiment of the present invention. The present embodiment is described taking as an example a case of performing up, down, left, and right movement (scrolling) or enlargement/reduction operations while displaying a specific image through a viewer.

First, the controller 180 displays any one of the images stored in the memory 160 on the display screen (S401). For example, when a specific image file is selected by the user, a first view picture is generated from a thumbnail of the selected image file and displayed on the display screen. Alternatively, the control unit 180 enlarges specific image data and displays it on the display screen.

When a control command is input from the user input unit 130 while a specific image is displayed on the display screen, the controller 180 requests the corresponding view picture (first view picture, second view picture, third view picture, etc.) from the image processing unit 200 (S402). Here, the control command is any one of the up, down, left, and right movement commands and the enlargement/reduction commands.

The image processing unit 200 determines whether the view picture requested by the controller 180 is a first view picture (S403). In other words, the mobile terminal 100 confirms whether a full screen view of the image displayed on the display screen is requested. For example, when the image data is enlarged at a predetermined ratio and a reduction command is input, the controller 180 requests the first view picture, and requests the third view picture when an enlargement command is input.

If the first view picture is requested in step S403, the image processing unit 200 decodes the thumbnail stored in the image file in step S404. The image processing unit 200 transmits the image file to the thumbnail decoder 220. The thumbnail decoder 220 decodes a thumbnail in the image file. The mobile terminal 100 according to the present invention decodes the thumbnail without decoding the original image data in the image file.

When the thumbnail is decoded, the image processing unit 200 scales the image size of the decoded thumbnail to the display screen size (S405). In other words, the image processing unit 200 generates the first view picture from the thumbnail. The thumbnail decoder 220 outputs the decoded thumbnail to the scaler 230. The scaler 230 scales the image size of the decoded thumbnail to the display screen size.

The image processing unit 200 processes the scaled image data and displays it on the display screen (S406, S407). The scaler 230 transmits the scaled image to the post-processing unit 250. The post-processing unit 250 performs processing for improving the image quality of the scaled image, and then transmits the processed image to the display buffer 155. The display unit 151 displays the image data stored in the display buffer 155 on the display screen under the control of the controller 180.

If the first view picture is not requested in step S403, the image processing unit 200 determines whether the second view picture is requested (S411).

If the second view picture is requested, the image processing unit 200 decodes the original image data in the image file (S412). At this time, the image processing unit 200 decodes the original image data and scales the image size of the decoded original image data. For example, when the image processing unit 200 inputs the original image data of the image file to the image decoder 225, the image decoder 225 decodes the original image data in the image file. Further, the image decoder 225 reduces the decoded image data to a 1:2 scale. That is, the image decoder 225 reduces the decoded image data to 1/4 of the original area.

When the original image data is decoded, the image processing unit 200 stores the decoded image data in the buffer 235 (S413). The buffer 235 deletes the stored image data when the viewing of the image data other than the image data currently displayed on the display screen is requested or the viewing of the image data currently displayed is completed.

The image processing unit 200 extracts an image fragment corresponding to the second view picture from the decoded image data, and scales the image size of the extracted image fragment to the display screen size (S414). The image processing unit 200 controls the image extracting unit 240 to extract the requested image fragment. The image extracting unit 240 cuts out the requested image fragment from the image data stored in the buffer 235. Then, the image extracting unit 240 scales the image size of the cut-out image fragment.

When the requested image fragment is extracted, the image processing unit 200 post-processes the extracted image fragment and displays it on the display screen (S406, S407). The image processing unit 200 controls the post-processing unit 250 to transmit the image fragment output from the image extracting unit 240 to the display buffer 155. The display unit 151 displays the image fragment stored in the display buffer 155 on the screen.

If the second view picture is not requested in step S411, the image processing unit 200 extracts the requested image fragment from the image data stored in the buffer 235 and scales the extracted image fragment to the display screen size (S414). For example, when a third view picture is requested, the image processing unit 200 extracts an image fragment required to form the third view picture from the image data stored in the buffer 235 through the image extracting unit 240. The image extracting unit 240 scales the image size of the extracted image fragment to the display screen size and transmits the scaled image to the display buffer 155.
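The overall dispatch of FIG. 9 (steps S403, S411, and S414) can be summarized as follows; the three helper callbacks are illustrative assumptions, not names from the patent.

```python
def handle_view_request(view, image_file, buffer, decode_half, decode_thumb, extract):
    """Dispatch a requested view picture (sketch of S403/S411/S414).

    decode_thumb: build the first view picture from the file's thumbnail.
    decode_half:  decode the original image data at a 1:2 scale.
    extract:      crop and scale a fragment from the buffered data.
    All three are illustrative callbacks, not names from the patent.
    """
    if view == 1:
        return decode_thumb(image_file)           # S404-S405
    if view == 2:
        buffer["data"] = decode_half(image_file)  # S412-S413
        return extract(buffer["data"], view)      # S414
    # Third and later view pictures reuse the buffered data without re-decoding.
    return extract(buffer["data"], view)
```

The design point the flowchart makes is visible in the last branch: once the half-scale decode has filled the buffer, further zoom or scroll operations only crop and scale, never re-decode.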

FIG. 10 shows screens displaying processed images of a mobile terminal according to an embodiment of the present invention.

First, when the user selects the album view in the menu, the controller 180 displays a list of image files stored in the memory 160 (a). If any of the image files in the image file list is selected, the control unit 180 requests the image processing unit 200 for the first view picture. The image processing unit 200 decodes a thumbnail in the selected image file when the first view picture is requested. The image processing unit 200 enlarges the decoded thumbnail to a display screen size and displays the enlarged thumbnail on a display screen (b).

When an enlargement command is input while the first view picture is displayed, the controller 180 requests the image processing unit 200 for the second view picture. The image processing unit 200 drives the image decoder 225 to decode the original image data of the image file, and stores the decoded image data in the buffer 235. At this time, the image decoder 225 decodes the original image data at a 1:2 scale. Then, the image processing unit 200 extracts the image fragment F1 corresponding to the second view picture from the decoded image data. The image processing unit 200 scales the extracted image fragment F1 to the display screen size, and displays it on the display screen after an image quality improvement procedure (c).

If the enlargement command is input again in the state of displaying the second view picture on the display screen, the controller 180 requests the image processor 200 for the third view picture. The image processing unit 200 extracts an image fragment F2 for generating a third view picture and scales the extracted image fragment F2 to a display screen size. Then, the image processing unit 200 displays the scaled image fragment on the display screen (d).

If a reduction command is input while the third view picture is displayed, the control unit 180 requests the image processing unit 200 for the second view picture. The image processing unit 200 extracts the image fragment F1 for generating the second view picture from the image data stored in the buffer 235 according to the request from the controller 180, scales it to the display screen size, and displays it on the display screen (c).

Further, according to an embodiment of the present invention, the above-described method can be implemented as computer-readable code on a medium on which a program is recorded. The computer-readable medium includes all kinds of recording devices in which data that can be read by a computer system is stored. Examples of the computer-readable medium include ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage devices, and the medium may also be implemented in the form of a carrier wave (for example, transmission over the Internet). In addition, the computer may include the control unit 180 of the terminal.

The configuration and method of the above-described embodiments are not limitedly applied to the above-described terminal; all or some of the embodiments may be selectively combined so that various modifications can be made.

FIG. 1 is a block diagram of a mobile terminal according to an embodiment of the present invention;

FIG. 2A is a front perspective view of a mobile terminal according to an embodiment of the present invention;

FIG. 2B is a rear perspective view of a mobile terminal according to an embodiment of the present invention;

FIG. 3 is a front view of a mobile terminal for explaining an operation state of the mobile terminal according to the present invention;

FIG. 4 is a flowchart illustrating a procedure in which a mobile terminal according to an embodiment of the present invention prepares an image file from an image input from a camera;

FIG. 5 is a flowchart illustrating a procedure in which a mobile terminal according to an embodiment of the present invention prepares an image file from an image input from another source;

FIG. 6 is an exemplary diagram showing an original image processed in a mobile terminal and a derivative image thereof according to an embodiment of the present invention;

FIG. 7 is a block diagram illustrating an image processing unit of a mobile terminal according to an embodiment of the present invention;

FIG. 8 is a flowchart illustrating a method of displaying a first view picture of a mobile terminal according to an embodiment of the present invention;

FIG. 9 is a flowchart showing an image processing method according to a control command of a mobile terminal according to an embodiment of the present invention;

FIG. 10 is a view illustrating a screen displaying a processed image according to an embodiment of the present invention.

DESCRIPTION OF THE REFERENCE NUMERALS

100: mobile terminal 110: wireless communication unit

120: A / V input unit 130: user input unit

140: sensing unit 150: output unit

160: memory 170: interface section

180: control unit 190: power supply unit

Claims (25)

Claims 1-17. (Deleted)

18. A method comprising: loading an image from another terminal; determining whether the image includes a thumbnail; generating, based on the determination result, a thumbnail of the image at a different size according to the resolution of the display unit and the size of the image file; inserting and storing the generated thumbnail in the image; inserting and storing, when the thumbnail is generated, a plurality of indicators associated with enlargement of the image, the plurality of indicators defining different areas of the entire area of the image; displaying the generated thumbnail in an image display area of the display unit in response to a display request for the image; and displaying, when a zoom-in request is applied while the thumbnail is displayed, an area corresponding to one of the indicators on the entire image display area by using the one of the indicators.

19. The method of claim 18, wherein the indicators define areas of different sizes, the method further comprising displaying an area corresponding to another one of the indicators on the entire image display area when the enlargement request is further applied while the area corresponding to the one of the indicators is displayed, the other indicator defining an area smaller in size than the one of the indicators.

20. (Deleted)

21. The method of claim 18, further comprising, if the image includes a thumbnail, storing the image as it is.
22. A mobile terminal comprising: a display unit configured to display screen information; a wireless communication unit configured to transmit and receive data to and from another terminal; and a control unit configured to control the wireless communication unit to load an image from the other terminal, determine whether the image includes a thumbnail, generate a thumbnail of the image at a different size according to the resolution of the display unit and the size of the image file, and insert and store the generated thumbnail in the image, wherein the control unit inserts and stores, when the thumbnail is generated, a plurality of indicators associated with enlargement of the image, the plurality of indicators defining different areas of the entire area of the image; controls the display unit, in response to a display request for the image, such that the generated thumbnail is displayed in an image display area of the display unit; and controls the display unit, when a zoom-in request is applied while the thumbnail is displayed, such that an area corresponding to one of the indicators is displayed on the entire image display area by using the one of the indicators.

23. The mobile terminal of claim 22, wherein the indicators define areas of different sizes, and the control unit controls the display unit such that an area corresponding to another one of the indicators is displayed on the entire image display area when the enlargement request is additionally applied while the area corresponding to the one of the indicators is displayed, the other indicator defining an area smaller in size than the one of the indicators.

24. (Deleted)

25. The mobile terminal of claim 22, wherein, if the image includes a thumbnail, the control unit stores the image as it is.
KR1020090115344A 2009-11-26 2009-11-26 Mobile terminal and method for handling image thereof KR101622680B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020090115344A KR101622680B1 (en) 2009-11-26 2009-11-26 Mobile terminal and method for handling image thereof


Publications (2)

Publication Number Publication Date
KR20110058522A KR20110058522A (en) 2011-06-01
KR101622680B1 true KR101622680B1 (en) 2016-05-19

Family

ID=44394001

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020090115344A KR101622680B1 (en) 2009-11-26 2009-11-26 Mobile terminal and method for handling image thereof

Country Status (1)

Country Link
KR (1) KR101622680B1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101363325B1 (en) * 2013-12-11 2014-02-17 (주)진명아이앤씨 A hierarchical switching apparatus for covering uhd resolutions and the method thereof
US9749649B2 (en) 2014-08-27 2017-08-29 Fingram Co., Ltd. Method and system for generating and displaying thumbnail images from original images

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2003034709A1 (en) * 2001-10-17 2003-04-24 Naltec Inc. Decompressing method and data processor

Also Published As

Publication number Publication date
KR20110058522A (en) 2011-06-01

Similar Documents

Publication Publication Date Title
US9124785B2 (en) Method for receiving low-resolution and high-resolution images and device therefor
US8264566B2 (en) Method for processing image and portable terminal having camera thereof
US9313409B2 (en) Mobile terminal and control method thereof
US8339480B2 (en) Mobile terminal with image magnification and image magnification controlling method of a mobile terminal
US8565831B2 (en) Mobile terminal and method for controlling the same
US20100066810A1 (en) Mobile terminal having a panorama photographing function and method for controlling operation thereof
US10771691B2 (en) Mobile terminal and controlling method thereof
KR101622599B1 (en) Mobile terminal and zoom image controlling method thereof
CN111355998B (en) Video processing method and device
KR20170016211A (en) Mobile terminal and control method for the mobile terminal
KR20130010590A (en) Electronic device and the operating method thereof
KR20110054196A (en) Method for editing image usign touch at mobile terminal and apparatus therefor
KR101595380B1 (en) Mobile terminal and control method thereof
KR101622680B1 (en) Mobile terminal and method for handling image thereof
KR101587137B1 (en) Mobile terminal and method for controlling the same
KR20110003705A (en) Method for displaying information in mobile terminal and mobile terminal using the same
KR101649637B1 (en) Mobile terminal and method for controlling the same
KR101709504B1 (en) Mobile Terminal and Method for managing list thereof
KR101537698B1 (en) Terminal and method for controlling the same
KR101709503B1 (en) Mobile terminal and image display method
KR101531506B1 (en) Mobile terminal and method for controlling display thereof
KR101531917B1 (en) Mobile terminal and input controlling method of the same
KR101476450B1 (en) Portable terminal
KR20150014266A (en) Display device and method for controlling the same
KR101922008B1 (en) Control apparatus of mobile terminal and method thereof

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E90F Notification of reason for final refusal
E701 Decision to grant or registration of patent right
FPAY Annual fee payment

Payment date: 20190424

Year of fee payment: 4