KR20170025406A - Mobile terminal using virtual fitting solution for shopping and method for controlling thereof - Google Patents


Info

Publication number
KR20170025406A
Authority
KR
South Korea
Prior art keywords
information
user
character
touch screen
displayed
Prior art date
Application number
KR1020150121801A
Other languages
Korean (ko)
Other versions
KR101731718B1 (en)
Inventor
이형철
이지유
이지이
Original Assignee
이형철
이지이
이지유
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 이형철, 이지이, 이지유
Priority to KR1020150121801A
Publication of KR20170025406A
Application granted granted Critical
Publication of KR101731718B1


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 30/00 Commerce
    • G06Q 30/06 Buying, selling or leasing transactions
    • G06Q 30/0601 Electronic shopping [e-shopping]
    • G06Q 30/0641 Shopping interfaces
    • G06Q 30/0643 Graphical representation of items or shoppers
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 30/00 Commerce
    • G06Q 30/06 Buying, selling or leasing transactions
    • G06Q 30/0601 Electronic shopping [e-shopping]
    • G06Q 30/0641 Shopping interfaces
    • G PHYSICS
    • G08 SIGNALLING
    • G08C TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
    • G08C 17/00 Arrangements for transmitting signals characterised by the use of a wireless electrical link
    • G08C 17/02 Arrangements for transmitting signals characterised by the use of a wireless electrical link using a radio link

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Finance (AREA)
  • Accounting & Taxation (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Strategic Management (AREA)
  • General Engineering & Computer Science (AREA)
  • Development Economics (AREA)
  • Economics (AREA)
  • Marketing (AREA)
  • General Business, Economics & Management (AREA)
  • Human Computer Interaction (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention relates to a mobile terminal used to purchase products and a method for controlling the same; more specifically, the mobile terminal enables a user to check how a product will look when worn by using a virtual character created in the mobile terminal. According to an embodiment of the present invention, the mobile terminal comprises: a camera which obtains an image of a user; a touch screen which receives input of first information related to the user's body; and a control unit which recognizes the user's face in the image, generates a first character by using the recognized face and the first information, and controls the first character to be displayed on the touch screen. When the user selects a first object, which is a product to be purchased, on the touch screen, the control unit controls the touch screen to display information requesting additional photographing, through the camera, of second information, which is information on the user's body part related to the first object, changes the shape of the first character to correspond to the second information, and controls the first character, in its changed shape, to be displayed on the touch screen wearing the first object.

Description

TECHNICAL FIELD [0001] The present invention relates to a mobile terminal using a virtual fitting solution for shopping, and a control method thereof.

The present invention relates to a mobile terminal using a virtual fitting solution for shopping and a control method thereof. More particularly, the present invention relates to a mobile terminal capable of confirming the wearing style of a product by using a virtual character implemented in the mobile terminal and a control method thereof.

Traditionally, goods were traded offline, but online commerce has also been activated by advances in computer technology and Internet communication technology. In particular, online commerce is characterized by the fact that the seller does not have to operate a physical store, so distribution costs are reduced and the purchaser can buy at a lower price.

In addition, the recent development of smartphones has changed the way products are purchased on the Internet. Whereas in the past a user made purchases through a personal computer at a fixed place, purchases are now in fact made through mobile terminals regardless of time and place.

Many products sold online are inexpensive but do not fit the purchaser's body, differ from the expected design, or look different when actually worn from how they appear on the screen. In such cases, there is the problem that additional costs arise in the process of exchanging or refunding the purchased goods.

To reduce these costs, some buyers make purchases in the form of "showrooming", that is, trying out a product in a physical store and then buying the same product online at a lower price. This behavior has the potential to cause dissatisfaction among offline sellers.

In addition, even when purchasing an article offline, there is the troublesome problem of repeatedly putting on and taking off clothes, and damage to displayed articles frequently occurs.

In order to solve such problems, there is a demand for an apparatus with which a consumer can easily purchase a desired article without actually trying on clothes.

Korean Patent Application Publication No. 10-2004-0042324

Disclosure of Invention: Technical Problem [8] The present invention has been made in order to solve the above problems, and it is an object of the present invention to provide the user with a mobile terminal that can form a virtual character modeled on the user and, through the formed character, allow the user to confirm how a product will look when worn.

Further, it is an object of the present invention to provide the user with a function of analyzing the user's taste and recommending products.

It is also an object of the present invention to provide the user with a function of notifying, through the mobile terminal, whether a product matching the user's taste is available for purchase.

It is to be understood that both the foregoing general description and the following detailed description of the present invention are exemplary and explanatory, and are not intended to limit the invention to the precise form disclosed.

According to an example of the present invention for realizing the above objects, a mobile terminal comprises: a camera for acquiring an image of a user; a touch screen for receiving first information related to the body of the user; and a controller for recognizing the face of the user in the image, generating a first character using the recognized face and the first information, and controlling the first character to be displayed on the touch screen. In this mobile terminal, when the user selects on the touch screen a first object to be purchased, the controller controls the touch screen to display information requesting that second information, which is body part information of the user related to the first object, be further photographed through the camera, changes the shape of the first character to correspond to the second information, and controls the first character, whose shape has been changed, to be displayed on the touch screen wearing the first object.

Meanwhile, according to another example of the present invention for realizing the above objects, a mobile terminal comprises: a camera for acquiring an image of an animal; a touch screen for receiving, from a user, first information related to the body of the animal; and a controller for recognizing the face of the animal in the image, generating a first character using the recognized face and the first information, and controlling the first character to be displayed on the touch screen. In this mobile terminal, when the user selects on the touch screen a first object to be purchased, the controller controls the touch screen to display information requesting that second information, which is body part information of the animal related to the first object, be further photographed through the camera, changes the shape of the first character to correspond to the second information, and controls the first character, whose shape has been changed, to be displayed on the touch screen wearing the first object.

The mobile terminal may further include a memory for storing the second information, wherein the first object and the second information are each plural, and the controller may cumulatively change the shape of the first character using the plurality of second information stored in the memory.

In addition, the control unit may divide the entire area of the touch screen into a plurality of areas and control each of a plurality of first characters to be displayed in a respective divided area.
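As an illustration only (the specification discloses no source code), dividing the touch-screen area among several characters might be sketched as follows; the function name and the vertical-strip layout are assumptions, not details from the patent:

```python
def split_display(width, height, n):
    """Divide the full touch-screen area into n equal vertical regions,
    returning one (x, y, w, h) rectangle per character to display."""
    region_w = width // n
    return [(i * region_w, 0, region_w, height) for i in range(n)]

# e.g. three characters side by side on a 1080x1920 screen
regions = split_display(1080, 1920, 3)
```

Each first character would then be rendered inside its own rectangle; a grid layout would work the same way with a second division along the height.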

Also, the first object may be plural, and the control unit may control each of a plurality of first characters to be displayed wearing a respective one of the plurality of first objects.

In addition, the plurality of first objects may belong to the same category or to different categories.

The mobile terminal may further include a wireless communication unit that transmits third information related to the first character to the outside, and the wireless communication unit can receive information related to a second character from the outside.

The wireless communication unit may also receive, from the outside, feedback information on the first object worn by the first character, and may transmit, to the outside, feedback information on a first object worn by the second character.

In addition, the first object and the character may each be plural, and the control unit may control each of the plurality of first characters, wearing a respective one of the plurality of first objects, to be transmitted to the outside through the wireless communication unit.

In addition, an external user who has received the third information can pay for the first object worn by the first character, and the user can pay for the first object worn by the second character.

According to another aspect of the present invention, there is provided a method of controlling a mobile terminal, the method comprising: acquiring an image of a user through a camera; receiving first information related to the user's body through a touch screen; recognizing the face of the user in the image; generating a first character using the recognized face and the first information; displaying the first character on the touch screen; when the user selects on the touch screen a first object to be purchased, displaying on the touch screen information requesting that second information, which is body part information of the user related to the first object, be further photographed through the camera; changing the shape of the first character to correspond to the second information; and displaying on the touch screen the first character, whose shape has been changed, wearing the first object.
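The claimed steps can be sketched as the following control flow. This is a minimal illustration, not the patented implementation: the class, field, and function names (Character, fit_and_display, capture_part, and so on) are all assumptions, and real face recognition and rendering are replaced by stand-in strings:

```python
from dataclasses import dataclass, field

@dataclass
class Character:
    face: str                 # result of face recognition on the camera image
    body_info: dict           # first information (e.g. height, weight)
    part_photos: dict = field(default_factory=dict)  # accumulated second information
    worn: list = field(default_factory=list)         # objects the character wears

def fit_and_display(camera_image, first_info, selected_object, capture_part):
    """Recognize the face, build the character, request an extra photo of the
    body part related to the selected object, reshape, dress, and return."""
    face = f"face<{camera_image}>"                    # stand-in for recognition
    character = Character(face=face, body_info=first_info)
    # The terminal requests additional photographing of the related body part
    # (the 'second information') and reshapes the character accordingly.
    part = selected_object["related_part"]
    character.part_photos[part] = capture_part(part)
    character.worn.append(selected_object["name"])    # display wearing the object
    return character

avatar = fit_and_display(
    "selfie.jpg",
    {"height_cm": 170},
    {"name": "jacket", "related_part": "shoulders"},
    lambda part: f"photo<{part}>",
)
```

The same flow covers the animal variant of the method, with the pet's image and body information in place of the user's.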

According to yet another aspect of the present invention, there is provided a method of controlling a mobile terminal, the method comprising: acquiring an image of an animal through a camera; receiving, from a user through a touch screen, first information related to the body of the animal; recognizing the animal's face in the image; generating a first character using the recognized face and the first information; displaying the first character on the touch screen; when the user selects on the touch screen a first object to be purchased, displaying on the touch screen information requesting that second information, which is body part information of the animal related to the first object, be further photographed through the camera; changing the shape of the first character to correspond to the second information; and displaying on the touch screen the first character, whose shape has been changed, wearing the first object.

The present invention can provide the user with a mobile terminal capable of forming a virtual character modeled on the user and of virtually dressing the formed character in a garment so that the wearing style of a product can be confirmed.

In addition, the user can be provided with a function of analyzing the user's taste and recommending products accordingly.

The user can also be provided with a function of being notified, through the mobile terminal, whether a product that matches the user's taste is available for purchase.

The present invention also allows the character to wear products of the same or of different categories, so that the user can purchase a product that exactly matches the user's taste.

In addition, the present invention forms a network with nearby friends so that feedback information on a product worn by a character can easily be received and used in deciding whether to purchase the product.

In addition, the present invention may provide a function of mutually paying for products selected by nearby friends forming a network.

It should be understood, however, that the effects obtainable by the present invention are not limited to the above-mentioned effects, and other effects not mentioned will be clearly understood from the following description by those skilled in the art to which the present invention belongs.

BRIEF DESCRIPTION OF THE DRAWINGS: The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate preferred embodiments of the invention and, together with the description, serve to further the understanding of the technical idea of the invention; they should not be construed as limiting the invention.
FIG. 1 shows an example of a block diagram of a mobile terminal that can be applied to the present invention.
FIGS. 2A to 2C show an example of a character formed to be modeled on the user, as applicable to the present invention.
FIG. 3 shows an example of using augmented reality to display on the touch screen, in place of the user, a character modeled on the user wearing clothes displayed offline.
FIG. 4 shows an example of displaying information on offline clothes on the touch screen using augmented reality.
FIG. 5 shows an example of using virtual reality to display on the touch screen, on behalf of the user, a character modeled on the user wearing clothing displayed on an online market.
FIG. 6 shows an example of displaying information on clothing displayed on the online market on the touch screen using virtual reality.
FIGS. 7A and 7B illustrate an example in which a virtual character is changed by input of a user gesture and displayed on the touch screen.
FIG. 8 is a flowchart illustrating a specific example of shopping through a process of displaying a character, determined according to the user, wearing a shopping object, as proposed by the present invention.
FIGS. 9A to 9D are diagrams for explaining concrete steps of determining the shape of the character corresponding to the user shown in FIG. 8.
FIG. 10 is a flowchart for explaining a concrete example of shopping through a process of displaying a character, determined according to a pet of the user, wearing a shopping object.
FIGS. 11(a) and 11(b) are diagrams for explaining concrete steps of making a character using accumulated information, in the context of the present invention.
FIG. 12 is a diagram for explaining how the display area is divided into a plurality of areas and a plurality of characters are displayed in the divided areas to support shopping.
FIG. 13 is a diagram for explaining how easy shopping is supported by comparing a plurality of characters wearing shopping items of the same category, in the context of the present invention.
FIG. 14 is a diagram for explaining how easy shopping is supported by comparing a plurality of characters wearing shopping items of different categories, in relation to the present invention.
FIG. 15 is a diagram for explaining contents for facilitating easy shopping using a plurality of characters corresponding to a plurality of users forming a network.
FIG. 16 is a diagram for explaining a method by which the plurality of users in FIG. 15 can settle payments for each other.

Hereinafter, a preferred embodiment of the present invention will be described with reference to the drawings. The embodiments described below do not unduly limit the content of the present invention described in the claims, and not every structure described in the embodiments is necessarily essential as a solution means of the present invention.

Hereinafter, the mobile terminal of the present invention will be described in detail.

The mobile terminal described herein may include a mobile phone, a smart phone, a laptop computer, a digital broadcasting terminal, a PDA (Personal Digital Assistant), a PMP (Portable Multimedia Player), a navigation device, and the like. However, it will be readily apparent to those skilled in the art that the configuration according to the embodiments described herein may also be applied to fixed terminals such as a digital TV or a desktop computer, except where the configuration is applicable only to a portable terminal.

FIG. 1 shows an example of a block diagram of a mobile terminal that can be applied to the present invention.

Referring to FIG. 1, the portable terminal 100 includes a wireless communication unit 110, an A/V (Audio/Video) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a control unit 180, a battery 190, and the like. The components shown in FIG. 1 are not essential, and a portable terminal having more or fewer components may be implemented.

Hereinafter, the components will be described in order.

The wireless communication unit 110 may include one or more modules that enable wireless communication between the portable terminal 100 and a wireless communication system, or between the portable terminal 100 and a network in which the portable terminal 100 is located. For example, the wireless communication unit 110 may include a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short range communication module 114, and a location information module 115.

The broadcast receiving module 111 receives broadcast signals and / or broadcast-related information from an external broadcast management server through a broadcast channel.

The broadcast channel may include a satellite channel and a terrestrial channel. The broadcast management server may refer to a server that generates and transmits broadcast signals and/or broadcast-related information, or a server that receives previously generated broadcast signals and/or broadcast-related information and transmits them to a terminal. The broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and a broadcast signal in which a data broadcast signal is combined with a TV broadcast signal or a radio broadcast signal.

The broadcast-related information may refer to a broadcast channel, a broadcast program, or information related to a broadcast service provider. The broadcast-related information may also be provided through a mobile communication network. In this case, it may be received by the mobile communication module 112.

The broadcast-related information may exist in various forms, for example, in the form of an EPG (Electronic Program Guide) of DMB (Digital Multimedia Broadcasting) or an ESG (Electronic Service Guide) of DVB-H (Digital Video Broadcast-Handheld).

For example, the broadcast receiving module 111 may receive digital broadcast signals using a digital broadcasting system such as DMB-T (Digital Multimedia Broadcasting-Terrestrial), DMB-S (Digital Multimedia Broadcasting-Satellite), MediaFLO (Media Forward Link Only), DVB-H, OMA-BCAST, or ISDB-T (Integrated Services Digital Broadcast-Terrestrial). Of course, the broadcast receiving module 111 may be adapted to other broadcasting systems as well as the digital broadcasting systems described above.

The broadcast signal and / or broadcast related information received through the broadcast receiving module 111 may be stored in the memory 160.

The mobile communication module 112 transmits and receives radio signals to and from at least one of a base station, an external terminal, and a server on a mobile communication network. The radio signals may include various types of data according to the transmission and reception of a voice call signal, a video call signal, or a text/multimedia message.

The wireless Internet module 113 refers to a module for wireless Internet access, and may be built in or externally mounted in the mobile terminal 100.

WLAN (Wi-Fi), WiBro (Wireless Broadband), WiMAX (World Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), and the like can be used as wireless Internet technologies.

The short-range communication module 114 refers to a module for short-range communication. Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, and the like can be used as the short range communication technology.

The location information module 115 is a module for obtaining the position of the portable terminal 100, and a representative example thereof is a GPS (Global Position System) module. According to current technology, the GPS module 115 calculates distance information from three or more satellites and accurate time information, and then applies triangulation to the calculated information to accurately calculate three-dimensional current position information according to latitude, longitude, and altitude. At present, a method of calculating position and time information using three satellites and correcting the error of the calculated position and time information using one more satellite is widely used. In addition, the GPS module 115 can calculate speed information by continuously calculating the current position in real time.
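The triangulation principle mentioned above can be illustrated with a two-dimensional toy example: subtracting the circle equation of the first anchor from those of the other two turns three distance measurements into a 2x2 linear system. This is a sketch of the geometry only; real GPS works in three dimensions and also solves for the receiver clock bias, which is what the fourth-satellite correction addresses:

```python
def trilaterate_2d(p1, p2, p3, d1, d2, d3):
    """Locate a point from three known anchors p1..p3 and measured
    distances d1..d3 by solving the linearized circle equations."""
    # 2(x2-x1)x + 2(y2-y1)y = d1^2 - d2^2 + |p2|^2 - |p1|^2  (likewise for p3)
    a1, b1 = 2 * (p2[0] - p1[0]), 2 * (p2[1] - p1[1])
    c1 = d1**2 - d2**2 + (p2[0]**2 + p2[1]**2) - (p1[0]**2 + p1[1]**2)
    a2, b2 = 2 * (p3[0] - p1[0]), 2 * (p3[1] - p1[1])
    c2 = d1**2 - d3**2 + (p3[0]**2 + p3[1]**2) - (p1[0]**2 + p1[1]**2)
    det = a1 * b2 - a2 * b1          # non-zero when anchors are not collinear
    return (c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det

# A point at (3, 4) measured from anchors (0, 0), (10, 0), (0, 10):
x, y = trilaterate_2d((0, 0), (10, 0), (0, 10), 5.0, 65 ** 0.5, 45 ** 0.5)
```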

Referring to FIG. 1, the A/V (Audio/Video) input unit 120 is for inputting an audio signal or a video signal, and may include a camera 121 and a microphone 122. The camera 121 processes image frames, such as still images or moving images, obtained by an image sensor in the video communication mode or the photographing mode. The processed image frames can be displayed on the display unit 151.

The image frames processed by the camera 121 may be stored in the memory 160 or transmitted to the outside through the wireless communication unit 110.

At this time, two or more cameras 121 may be provided depending on the use environment.

For example, the camera 121 may include first and second cameras 121a and 121b for capturing 3D images on the side opposite to the display unit 151 of the portable terminal 100, and a third camera 121c for self-photographing the user may be provided on a part of the surface bearing the display unit 151 of the terminal 100.

In this case, the first camera 121a is for capturing a left-eye image, which is a source image of the 3D image, and the second camera 121b is for capturing a right-eye image.

The microphone 122 receives an external sound signal in a communication mode, a recording mode, a voice recognition mode, or the like, and processes it into electrical voice data. In the call mode, the processed voice data can be converted into a form transmittable to a mobile communication base station through the mobile communication module 112 and output. Various noise reduction algorithms may be implemented in the microphone 122 to remove noise generated while receiving the external sound signal.

The user input unit 130 generates input data for a user to control the operation of the terminal.

The user input unit 130 may receive from the user a signal designating two or more contents among the displayed contents according to the present invention. A signal for designating two or more contents may be received via the touch input, or may be received via the hard key and soft key input.

The user input unit 130 may receive an input from the user for selecting the one or more contents. In addition, an input for generating an icon related to a function that the portable terminal 100 can perform can be received from the user.

The user input unit 130 may include a directional keypad, a keypad, a dome switch, a touch pad (static pressure/capacitance), a jog wheel, a jog switch, and the like.

The sensing unit 140 senses the current state of the portable terminal 100, such as the open/closed state of the portable terminal 100, its position, whether the user is in contact with it, and its orientation, and generates a sensing signal for controlling the operation of the portable terminal 100. For example, when the portable terminal 100 is in the form of a slide phone, it is possible to sense whether the slide phone is opened or closed. It is also possible to sense whether the battery 190 is supplying power, whether the interface unit 170 is coupled to an external device, and the like. Meanwhile, the sensing unit 140 may include a proximity sensor 141. The proximity sensor 141 will be described later in relation to the touch screen.

The output unit 150 is for generating output related to the visual, auditory, or tactile sense, and includes a display unit 151, an audio output module 152, an alarm unit 153, a haptic module 154, and the like.

The display unit 151 displays (outputs) the information processed in the portable terminal 100. For example, when the portable terminal is in the call mode, a UI (User Interface) or a GUI (Graphic User Interface) associated with a call is displayed. When the portable terminal 100 is in the video communication mode or the photographing mode, the photographed and / or received image, UI, or GUI is displayed.

In addition, the display unit 151 according to the present invention supports 2D and 3D display modes.

That is, the display unit 151 according to the present invention may have a configuration in which a switch liquid crystal 151b is combined with a general display device 151a, as shown in FIG. 1. By using the switch liquid crystal 151b, an optical parallax barrier 50 can be operated to control the traveling direction of light, so that different light reaches the left and right eyes. Therefore, when an image combining a right-eye image and a left-eye image is displayed on the display device 151a, the user sees the image corresponding to each eye and perceives the display as three-dimensional.

That is, in the 2D display mode, under the control of the control unit 180, the display unit 151 drives only the display device 151a, without driving the switch liquid crystal 151b and the optical parallax barrier 50, and performs a normal 2D display operation.

In the 3D display mode, the display unit 151 drives the switch liquid crystal 151b, the optical parallax barrier 50, and the display device 151a under the control of the controller 180 to perform the 3D display operation.

The display unit 151 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT-LCD), an organic light-emitting diode (OLED) display, a flexible display, and a three-dimensional display (3D display).

Some of these displays may be transparent or light transmissive so that they can be seen through. This can be referred to as a transparent display, and a typical example of the transparent display is TOLED (Transparent OLED) and the like. The rear structure of the display unit 151 may also be of a light transmission type. With this structure, the user can see an object located behind the terminal body through the area occupied by the display unit 151 of the terminal body.

There may be two or more display units 151 depending on the implementation of the portable terminal 100. For example, in the portable terminal 100, a plurality of display units may be spaced apart from one another, disposed integrally with each other, or disposed on different surfaces.

When the display unit 151 and a sensor for sensing a touch operation (hereinafter referred to as a "touch sensor") form a mutual layer structure (hereinafter referred to as a "touch screen"), the display unit 151 can be used as an input device as well as an output device. The touch sensor may have the form of, for example, a touch film, a touch sheet, a touch pad, or the like.

The touch sensor may be configured to convert a change in pressure applied to a specific portion of the display unit 151, or a change in capacitance generated at a specific portion of the display unit 151, into an electrical input signal. The touch sensor can be configured to detect not only the touched position and area but also the pressure at the time of the touch.

If there is a touch input to the touch sensor, the corresponding signal(s) is sent to a touch controller (not shown). The touch controller processes the signal(s) and transmits the corresponding data to the controller 180. Thus, the control unit 180 can know which area of the display unit 151 has been touched.
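The signal path just described (touch sensor, touch controller, control unit) might be sketched as below; the pressure threshold, function names, and grid layout are illustrative assumptions, not details from the patent:

```python
def touch_controller(raw_x, raw_y, pressure, threshold=0.1):
    """Package a raw sensor reading into a touch event, or reject it."""
    if pressure < threshold:
        return None                      # too light to count as a touch
    return {"x": raw_x, "y": raw_y, "pressure": pressure}

def locate_touch(event, width, height, cols=2, rows=2):
    """Control-unit side: decide which cell of a cols x rows grid over the
    display area the event falls in, returning a cell index."""
    col = min(event["x"] * cols // width, cols - 1)
    row = min(event["y"] * rows // height, rows - 1)
    return row * cols + col

event = touch_controller(900, 1500, 0.5)
cell = locate_touch(event, 1080, 1920)   # lower-right cell of a 2x2 grid
```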

The proximity sensor 141 may be disposed in an inner area of the portable terminal surrounded by the touch screen, or in the vicinity of the touch screen. The proximity sensor refers to a sensor that detects, without mechanical contact, the presence or absence of an object approaching a predetermined detection surface or an object existing nearby, using the force of an electromagnetic field or infrared rays. The proximity sensor has a longer life span than a contact-type sensor, and its utility is also high.

Examples of the proximity sensor include a transmissive photoelectric sensor, a direct-reflective photoelectric sensor, a mirror-reflective photoelectric sensor, a high-frequency oscillation proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor. When the touch screen is electrostatic, it is configured to detect the proximity of a pointer by a change in the electric field according to the proximity of the pointer. In this case, the touch screen (touch sensor) may be classified as a proximity sensor.

Hereinafter, for convenience of explanation, the act of recognizing that the pointer is positioned over the touch screen without the pointer contacting the touch screen is referred to as a "proximity touch", and the act of actually bringing the pointer into contact with the touch screen is referred to as a "contact touch". The position at which a proximity touch is made with the pointer on the touch screen means the position at which the pointer vertically corresponds to the touch screen when the proximity touch is made.

The proximity sensor detects a proximity touch and a proximity touch pattern (e.g., a proximity touch distance, a proximity touch direction, a proximity touch speed, a proximity touch time, a proximity touch position, a proximity touch movement state, and the like). Information corresponding to the detected proximity touch operation and the proximity touch pattern may be output on the touch screen.

The audio output module 152 may output audio data received from the wireless communication unit 110 or stored in the memory 160 in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, and the like. The audio output module 152 also outputs sound signals related to functions performed in the portable terminal 100 (e.g., a call signal reception tone, a message reception tone, etc.). The audio output module 152 may include a receiver, a speaker, a buzzer, and the like.

The alarm unit 153 outputs a signal for notifying the occurrence of an event of the portable terminal 100. Examples of events occurring in the portable terminal include reception of a call signal, reception of a message, input of a key signal, and a touch input. The alarm unit 153 may output a signal for notifying the occurrence of an event in a form other than a video signal or an audio signal, for example, as vibration. Since the video signal or the audio signal can also be output through the display unit 151 or the audio output module 152, the display unit 151 and the audio output module 152 may be regarded as a type of the alarm unit 153.

The haptic module 154 generates various tactile effects that the user can feel. A typical example of the haptic effect generated by the haptic module 154 is vibration. The intensity and pattern of the vibration generated by the haptic module 154 can be controlled. For example, different vibrations may be synthesized and output, or output sequentially.

In addition to vibration, the haptic module 154 can generate various tactile effects, such as a pin arrangement moving vertically against the contacted skin surface, a spraying or suction force of air through an injection port or a suction port, brushing against the skin surface, contact with an electrode, an electrostatic force, and an effect of reproducing a sense of cold or warmth using an endothermic or exothermic element.

The haptic module 154 can be implemented not only to transmit a tactile effect through direct contact but also to allow the user to feel a tactile effect through the muscular sensation of a finger or an arm. Two or more haptic modules 154 may be provided according to the configuration of the portable terminal 100.

The projector module 155 is a component for performing an image projection function using the portable terminal 100, and can display, on an external screen or wall, an image identical to or at least partially different from the image displayed on the display unit 151, in accordance with a control signal of the controller 180.

Specifically, the projector module 155 may include a light source (not shown) that generates light (for example, laser light) for outputting an image to the outside, an image generating means (not shown) for generating the image to be output using the light generated by the light source, and a lens (not shown) for enlarging the image and outputting it to the outside at a predetermined focal distance. Further, the projector module 155 may include a device (not shown) capable of mechanically moving the lens or the entire module to adjust the image projection direction.

The projector module 155 can be divided into a CRT (Cathode Ray Tube) module, an LCD (Liquid Crystal Display) module, and a DLP (Digital Light Processing) module according to the type of display means. In particular, the DLP module may be advantageous for miniaturization of the projector module 155, since it enlarges and projects an image generated by reflecting light from the light source onto a DMD (Digital Micromirror Device) chip.

Preferably, the projector module 155 may be provided on the side surface, the front surface, or the back surface of the portable terminal 100 in the longitudinal direction. It goes without saying that the projector module 155 may be provided at any position of the portable terminal 100 as needed.

The memory 160 may store a program for the processing and control of the controller 180, and may temporarily store input/output data (e.g., a telephone directory, messages, audio, still images, electronic books, history, and the like). The memory 160 may also store the frequency of use of each of these data (for example, the frequency of use of each telephone number, each message, and each piece of multimedia). In addition, the memory 160 may store data on the vibrations and sounds of various patterns output when a touch is input on the touch screen.

The memory 160 also stores a web browser displaying 3D or 2D web pages in accordance with the present invention.

The memory 160 may include at least one storage medium of a flash memory type, a hard disk type, a multimedia card micro type, a card type memory (for example, SD or XD memory), a random access memory (RAM), a static random access memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, and an optical disk. The portable terminal 100 may also operate in association with a web storage that performs the storage function of the memory 160 on the Internet.

The interface unit 170 serves as a path to all external devices connected to the portable terminal 100. The interface unit 170 receives data or power from an external device and transmits it to each component in the portable terminal 100, or allows data in the portable terminal 100 to be transmitted to an external device. For example, a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video input/output (I/O) port, an earphone port, and the like may be included in the interface unit 170.

The identification module is a chip for storing various kinds of information for authenticating the usage right of the portable terminal 100, and includes a User Identity Module (UIM), a Subscriber Identity Module (SIM), a Universal Subscriber Identity Module (USIM), and the like. A device having an identification module (hereinafter referred to as an "identification device") can be manufactured in a smart card format. Accordingly, the identification device can be connected to the terminal 100 through a port.

When the portable terminal 100 is connected to an external cradle, the interface unit may serve as a path through which power from the cradle is supplied to the portable terminal 100, or as a path through which various command signals input from the cradle by the user are transmitted to the portable terminal. The various command signals or the power input from the cradle may operate as a signal for recognizing that the portable terminal is correctly mounted on the cradle.

The controller 180 typically controls the overall operation of the portable terminal, performing related control and processing for, for example, voice calls, data communication, and video calls. The controller 180 may include a multimedia module 181 for multimedia playback. The multimedia module 181 may be implemented within the controller 180 or may be implemented separately from the controller 180.

The controller 180 may perform a pattern recognition process for recognizing handwriting input or drawing input performed on the touch screen as characters and images, respectively.

When the display unit 151 is formed of an organic light-emitting diode (OLED) or a transparent OLED (TOLED), the controller 180 can reduce the power consumption supplied from the power supply unit 190 to the display unit 151 as follows: when a preview image is brought up on the screen of the OLED or TOLED and the size of the preview image is adjusted according to the user's operation, the controller 180 turns off the driving of the pixels in a second area other than a first area in which the size-adjusted preview image is displayed.
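A rough sketch of this power-saving idea follows. Because each OLED pixel emits (and draws power) independently, driving only the first area in which the resized preview is shown reduces consumption roughly in proportion to its area. The pixel counts and the uniform-power assumption are illustrative only.

```python
# Sketch: count how many pixels remain driven when only the preview's
# first area is displayed and the remaining second area is switched off.

def powered_pixels(screen_w, screen_h, preview_rect):
    """Return how many pixels stay driven when only preview_rect is shown."""
    x, y, w, h = preview_rect
    # Clamp the preview rectangle to the physical screen.
    w = min(w, screen_w - x)
    h = min(h, screen_h - y)
    return w * h

full = 720 * 1280
active = powered_pixels(720, 1280, (0, 0, 720, 640))  # preview on top half
print(f"driven pixels: {active}/{full} ({100 * active / full:.0f}%)")
```

Under the uniform-power assumption, showing the preview on half the screen would roughly halve the display's drive power.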

The power supply unit 190 receives external power and internal power under the control of the controller 180 and supplies power necessary for operation of the respective components.

The various embodiments described herein may be embodied in a recording medium readable by a computer or similar device using, for example, software, hardware, or a combination thereof.

According to a hardware implementation, the embodiments described herein may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and other electronic units for performing the functions described herein. In some cases, the embodiments described herein may be implemented by the controller 180 itself.

According to a software implementation, embodiments such as the procedures and functions described herein may be implemented with separate software modules. Each of the software modules may perform one or more of the functions and operations described herein. The software code can be implemented as a software application written in a suitable programming language. The software code may be stored in the memory 160 and executed by the controller 180.

Hereinafter, an embodiment of a mobile terminal using a virtual character proposed by the present invention will be described in detail.

However, although the embodiments described below are based on a person, this is only for convenience of explanation; the present invention is not limited to a person and can also be applied to an animal (for example, a pet dog).

<Embodiment 1>

The first embodiment of the present invention relates to an invention in which a virtual character 20 having the same body shape as the user is created in the mobile terminal, the virtual character 20 wears, in place of the user, an object 30 such as clothes, a bag, a hat, glasses, or sunglasses captured in an acquired image, and the result is displayed on the touch screen of the mobile terminal, so that the user can see how the object 30 looks when worn without actually wearing it.

The augmented reality (AR) is a technique of superimposing a three-dimensional virtual object on the real world. In other words, it is a technology that superimposes a virtual object on the real world seen by the user.

It is also called mixed reality (MR) because it combines the real world with a virtual world carrying additional information and displays them in real time as a single image.

It is a hybrid VR system combining the real environment with a virtual environment, and it has been the subject of research and development, centered on the US and Japan, since the late 1990s.

Augmented reality, a concept that complements the real world with a virtual world, uses a virtual environment created by computer graphics, but the protagonist is a real environment.

Here, the computer graphic serves to provide additional information necessary for the real environment.

By superimposing a three-dimensional virtual image on the real image that the user is viewing, the distinction between the real environment and the virtual screen becomes blurred.

Virtual reality technology immerses users in a virtual environment so that they cannot see the actual environment. In contrast, augmented reality technology, a mixture of the real environment and virtual objects, allows the user to see the real environment, providing better realism and additional information.

For example, when the user points the camera 121 of the mobile terminal 100 at the surroundings, information such as the locations and telephone numbers of nearby shops is displayed as a stereoscopic image.

Augmented reality is internally a very complicated and difficult imaging technology, but it basically operates according to the following principles and procedures.

Several elements are necessary for applying augmented reality technology: a GPS device for transmitting and receiving geographical/positional information, a gravity (tilt + electronic compass) sensor (or gyroscope sensor), a location information system that provides detailed information for a given location (an Internet connection is required), an augmented reality application that receives the detailed information and displays it over the real background, and an IT device (the mobile terminal 100, a tablet PC) that finally outputs it to a display.

However, the present invention is not limited thereto, and it is also possible to apply augmented reality technology with more or fewer components.

First, after executing the augmented reality application, the user may point the built-in camera of the mobile terminal 100 at a specific street or building. Latitude/longitude information, tilt/gravity information, and the like are then input through the GPS of the mobile terminal 100.

The GPS information is then transmitted over the Internet to a specific location information system. This is because it is practically impossible to store all the detailed information of the area or the building of the position radius in the mobile terminal 100.

In addition, the location information system receiving the GPS information such as the position and the inclination from the user searches the database for the detailed information of the corresponding area or object, and transmits the result to the mobile terminal 100 again.

This includes, of course, the name of the particular building, the telephone number, and so on.

The mobile terminal 100 receiving the data can display the real time image after matching with the current map information through the augmented reality application.

Since the data transmission and reception steps are performed continuously, detailed information on the surrounding area appears on the screen in sequence as the mobile terminal 100 is moved around.
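The round trip described in the preceding steps can be sketched as follows. The in-memory dictionary stands in for the remote location information system, and all place names and coordinates are made up; the point is the flow: GPS fix out, detailed information back, overlay on the live view.

```python
# Illustrative sketch of the AR round trip: the terminal sends its GPS fix
# to a (hypothetical) location-information system, receives details for
# nearby objects, and overlays them on the camera view.

LOCATION_DB = {  # (lat, lon) rounded to 2 decimals -> nearby place details
    (37.50, 127.03): [{"name": "Gangnam Books", "phone": "02-555-0101"}],
    (37.57, 126.98): [{"name": "City Hall Cafe", "phone": "02-555-0202"}],
}

def lookup_nearby(lat, lon):
    """Server side: search the database for objects around the GPS fix."""
    return LOCATION_DB.get((round(lat, 2), round(lon, 2)), [])

def overlay_labels(camera_frame_desc, places):
    """Terminal side: combine the live view with the received details."""
    labels = [f'{p["name"]} ({p["phone"]})' for p in places]
    return f"{camera_frame_desc} | overlays: {', '.join(labels) or 'none'}"

# As the terminal moves, each new GPS fix triggers another lookup.
print(overlay_labels("street view", lookup_nearby(37.501, 127.031)))
```

In a real system the lookup would be an Internet request, since, as noted above, the terminal cannot store all detailed information for every location locally.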

Next, concrete contents in which the augmented reality is provided using the mobile terminal 100 will be described.

If the user of the mobile terminal 100 uses the augmented reality function, the user can have a different kind of experience.

For example, if the user wants detailed information about a book a friend is reading, the user can execute the augmented reality application in the mobile terminal 100 and then point the camera at the book cover or press the shutter.

The application then reads the screen information and displays the title of the book, author, publisher, book review rating, and price on the Internet database.

Of course, this requires Internet access via 3G/4G mobile communication or Wi-Fi. Information can be found for most books, rare books excepted.

If you know the book information and decide to buy it, you can order it on the internet, but you may prefer to browse nearby bookstores and buy it directly.

In this case, augmented reality map retrieval application can be utilized.

That is, GPS information of the mobile terminal 100 or the tablet PC is received, and the current location of the mobile terminal 100 or the tablet PC is determined and then the bookstore located at the closest distance is searched.

In addition, how to go to a nearby bookstore can be guided through augmented reality applications.

It is possible to provide a virtual navigation function when walking on the road as well as a car moving route, public transportation, and transfer information.

That is, when the camera of the mobile terminal 100 is pointed at the street, the augmented reality application indicates the direction with a virtual arrow or the like.

Also, if you have bought a book at a nearby bookstore and want to stop at a quiet café to read slowly, you can check in the augmented reality application whether there is a decent café nearby.

Likewise, when the camera of the mobile terminal 100 is pointed at the street, building and shop-name information is automatically displayed on the screen.

Also, the path to the desired café is indicated by an arrow, so you can easily find it.

There are a wide variety of fields in which augmented reality technology is applied as well as real life.

Nowadays, it is attracting attention in advertising and public relations. In other words, we can create a unique atmosphere by putting a virtual image on our products.

In addition, it is actively utilized in the field of TV broadcasting.

A typical example is a virtual weather map and an information graph behind a weather caster.

If the virtual display technology and the 3D stereoscopic technology develop further, then the case where the augmented reality can be applied is expected to be greatly expanded.

As described above, the augmented reality can be applied to the present invention.

Hereinafter, a first embodiment of the present invention using an augmented reality will be described in detail with reference to the drawings.

FIGS. 2A to 2C show a character formed so as to be integrated with the user, which can be applied to the present invention.

More specifically, FIG. 2A is a view showing the photographing of the user's appearance, including the face, using the camera of the mobile terminal; FIG. 2B is a drawing showing the creation of a virtual character having the face of the user photographed with the mobile terminal; and FIG. 2C is a diagram illustrating the completion of a user character synchronized with the user's appearance by receiving the user's personal information on the mobile terminal.

Referring to FIG. 2A, the external appearance of the user 10, including the face, can be photographed using the camera 121 attached to the mobile terminal 100, and the photographed image can be stored in the memory 160. At this time, it is preferable to photograph the front of the face of the user 10.

The controller 180 can recognize the face of the user 10 from the photographed image, extract the recognized face from the image, and store the extracted face in the memory 160.

Referring to FIG. 2B, a virtual character 20 can be created in the mobile terminal 100 in a human body shape, adjusted for the user's sex, hair style, and the like.

At this time, the face portion of the virtual character 20 may be generated to include the face of the user 10 extracted from the photographed image.

Referring to FIG. 2C, the body shape of the virtual character 20 may be changed to match the body shape of the user by receiving personal information about the body of the user 10 in the mobile terminal 100.

At this time, examples of personal information about the user's body include the bust circumference, waist circumference, hip circumference, back neck point to waist, neck circumference, neck length, sleeve length (back neck point-shoulder point-wrist), top arm, pants length, thigh circumference, ankle circumference, and waist length.

In the case of an adult woman, the lower chest circumference may additionally be included, alongside the bust circumference, waist circumference, hip circumference, back neck point to waist, front shoulder to waist, center back full length, neck circumference, neck length, sleeve length (back neck point-shoulder point-wrist), top arm, pants length, thigh circumference, ankle circumference, and waist length.
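The measurements listed above can be represented as a simple record used to morph the virtual character 20 toward the user's proportions. The field names follow the text; the values, the reference body, and the per-measurement scaling scheme are illustrative assumptions, not the patent's actual method.

```python
# Sketch: body measurements as a record, plus a naive per-region scaling
# of a reference character mesh toward the user's measurements (in cm).

from dataclasses import dataclass, asdict

@dataclass
class BodyMeasurements:
    bust_circumference: float
    waist_circumference: float
    hip_circumference: float
    back_neck_point_to_waist: float
    neck_circumference: float
    sleeve_length: float          # back neck point - shoulder point - wrist
    pants_length: float
    thigh_circumference: float
    ankle_circumference: float

def scale_character(base_mesh_scale, user, reference):
    """Derive per-region scale factors to morph the virtual character 20
    from a reference body toward the user's measurements."""
    u, r = asdict(user), asdict(reference)
    return {k: base_mesh_scale * u[k] / r[k] for k in u}

user = BodyMeasurements(92, 78, 96, 42, 38, 60, 100, 56, 24)
ref  = BodyMeasurements(90, 75, 95, 41, 37, 59, 98, 55, 23)
factors = scale_character(1.0, user, ref)
print(round(factors["waist_circumference"], 3))
```

A production character generator would map these factors onto mesh bones or blend shapes rather than scaling regions independently.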

FIG. 3 is a diagram showing, using augmented reality on the touch screen, a character formed in a state integrated with the user being displayed as wearing offline clothing.

Referring to FIG. 3, an external object can be photographed using a camera 121 attached to the mobile terminal 100. The objects may include clothing, shoes, bags, watches, glasses, hats, and the like.

The image including the object 30 photographed through the camera 121 is displayed through the display unit 151. The virtual character 20 is displayed together on the display unit 151 so as to wear the object 30 in place of the user 10. In this state, it is preferable that the object is photographed against the virtual character 20, in consideration of the actual position where the user would wear it.

At this time, if necessary, the virtual character 20 can take a set gesture, such as changing its shape or moving in response to a touch by the user 10.

On the other hand, FIG. 4 shows an example of displaying information on offline clothes on the touch screen using the augmented reality.

Referring to FIG. 4, first information related to the photographed object can be transmitted to and received from outside the mobile terminal 100 using the wireless communication unit 110, wherein the first information may include a trade name, a product name, and the external features, shape, color, size, appearance, style, and the like of the object.

By transmitting the first information to an external goods information database through the wireless communication unit 110, second information corresponding to the first information can be retrieved from the goods information database and received by the mobile terminal 100.

At this time, the second information may include the price information of the object, inventory information for the surrounding area, sales information on the online market, product reviews, the length and circumference of clothes, and the length, height, and width of a bag, and the like.

The received second information may be displayed on the display unit 151 of the mobile terminal 100, either together with the object 30 currently being photographed or overlaid on the object 30.
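The first-to-second information lookup above can be sketched as a simple attribute match. The in-memory list stands in for the external goods information database, and every product name, price, and URL is invented for illustration.

```python
# Hypothetical sketch: the terminal sends first information (recognized
# attributes of the photographed object 30) to a goods-information database
# and receives second information (price, nearby stock, online listings).

GOODS_DB = [
    {"product_name": "Denim Jacket A", "color": "blue", "style": "casual",
     "price": 49000, "nearby_stock": 3,
     "listings": [{"market": "ShopX", "url": "https://example.com/x"}]},
    {"product_name": "Wool Coat B", "color": "gray", "style": "formal",
     "price": 120000, "nearby_stock": 0, "listings": []},
]

def fetch_second_information(first_info):
    """Match first information against the database; return details or None."""
    for item in GOODS_DB:
        if all(item.get(k) == v for k, v in first_info.items()):
            return {k: item[k] for k in ("price", "nearby_stock", "listings")}
    return None

second = fetch_second_information({"color": "blue", "style": "casual"})
print(second["price"])
```

In practice the match would be a server-side search (possibly image-based) rather than exact attribute equality.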

Also, the sales information on the online market included in the second information may include a hyperlink to each online market selling the object, so that the user can easily access an online market for purchase.

At this time, the online markets selling the object can be displayed in order of price, using the price information included in the second information, so that the user 10 can easily access the online market where the object is sold at the lowest price.
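A minimal sketch of that price ordering: listings are sorted so the lowest price is presented first, letting the user jump straight to the cheapest seller. Market names and prices are made up.

```python
# Sketch: order online-market listings so the lowest price comes first.

def order_markets_by_price(listings):
    """Sort market listings so the lowest price is presented first."""
    return sorted(listings, key=lambda entry: entry["price"])

listings = [
    {"market": "ShopA", "price": 52000},
    {"market": "ShopB", "price": 47000},
    {"market": "ShopC", "price": 49900},
]
for entry in order_markets_by_price(listings):
    print(entry["market"], entry["price"])
```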

In addition, if the user 10 accesses an online market to purchase an object but the object is out of stock, the user can make a reservation. If stock of the reserved object comes in at the online market, a notification and information about this can be displayed on the display unit 151 so that the user can recognize it. One example of how to implement this is to use the push functionality of smartphones.

In addition, a function of recognizing the taste of the user 10 and recommending objects the user may want, or that match the user, may be included. The first information obtained from recording a plurality of objects is stored in the memory; the stored first information is analyzed to produce third information including the user's taste, which is also stored in the memory; and the third information is transmitted to the external goods information database. The fourth information received in response is then displayed on the display unit 151. In addition, a notification function can be included so that the user can recognize that a product has been recommended.

At this time, the fourth information may include the price information of the object, inventory information for the surrounding area, sales information on the online market, product reviews, and the like.
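The recommendation flow above can be illustrated as follows: first information gathered from many photographed objects is analyzed into third information (the user's taste), which is then matched against a goods database to produce fourth information (recommended products). The frequency-count analysis here is only an assumption about one possible design, not the patent's specified method.

```python
# Sketch: derive "third information" (taste) from accumulated "first
# information", then produce "fourth information" (recommendations).

from collections import Counter

def derive_taste(first_infos):
    """Third information: the user's most frequent style and color."""
    styles = Counter(info["style"] for info in first_infos)
    colors = Counter(info["color"] for info in first_infos)
    return {"style": styles.most_common(1)[0][0],
            "color": colors.most_common(1)[0][0]}

def recommend(taste, goods_db):
    """Fourth information: goods matching the derived taste."""
    return [g for g in goods_db
            if g["style"] == taste["style"] and g["color"] == taste["color"]]

history = [{"style": "casual", "color": "blue"},
           {"style": "casual", "color": "black"},
           {"style": "formal", "color": "blue"}]
db = [{"product_name": "Cap C", "style": "casual", "color": "blue", "price": 19000},
      {"product_name": "Coat D", "style": "formal", "color": "gray", "price": 99000}]

taste = derive_taste(history)
print([g["product_name"] for g in recommend(taste, db)])
```

The matching step would run on the goods-information database server, with the result pushed back to the terminal for display.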

In addition, the sales information on the online market included in the fourth information may include a hyperlink to each online market selling the object, so that the user can easily access an online market for purchase.

At this time, the online markets selling the object can be displayed in order of price, using the price information included in the fourth information, so that the user 10 can easily access the online market where the object is sold at the lowest price.

The mobile terminal 100 may generate image data including the image displayed on the display unit 151, the image obtained through the camera 121, and the voice obtained through the microphone 122, and store it in the memory 160. In addition, the stored image data can be transmitted to the outside and sent directly to other users.

Meanwhile, the above function can be implemented through an application used in a smart phone.

Also, the control unit 180 may control the virtual character 20 to be displayed on the display unit 151 only when a predetermined application is executed.

That is, the functions of the present invention may not be provided while the specific application is not executed, and the contents of the present invention may be applied only when an image is acquired while the specific application is executed.

When the mobile terminal is turned off or an error occurs while an image is being stored, the point of the event occurrence is stored in the memory, so that when the mobile terminal operates again, recording can be resumed from the stored point.

In this case, under the control of the controller 180, the memory 160 may store the progress of the terminal up to the point at which the power of the terminal was interrupted, together with that point.

Thereafter, when the power of the mobile terminal 100 is restored, the controller 180 may resume the recording step from the point where it was interrupted, adding new contents to the contents recorded up to that point.
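The recovery behavior above can be sketched with a checkpoint-and-resume loop. The dictionary standing in for the memory 160, the frame-based model of "progress", and all names are illustrative assumptions.

```python
# Sketch: while recording, the terminal checkpoints its progress; if power
# is interrupted, the stored event point lets recording resume from where
# it stopped once the terminal operates again.

class RecordingSession:
    def __init__(self, memory):
        self.memory = memory  # persists across "power loss" in this sketch

    def record(self, frames, fail_at=None):
        # Resume from the last checkpoint, if one was stored.
        start = self.memory.get("event_point", 0)
        for i in range(start, frames):
            if fail_at is not None and i == fail_at:
                # Power interrupted: store where we stopped and bail out.
                self.memory["event_point"] = i
                return f"interrupted at frame {i}"
        self.memory.pop("event_point", None)  # finished cleanly
        return f"recorded frames {start}..{frames - 1}"

memory_160 = {}
session = RecordingSession(memory_160)
print(session.record(10, fail_at=4))            # power cut mid-recording
print(RecordingSession(memory_160).record(10))  # restarted terminal resumes
```

On real hardware the checkpoint would have to live in non-volatile storage (flash, not RAM) to survive the power interruption.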

Meanwhile, a network can be formed in which information can be shared among a plurality of users who have installed a specific application providing the functions proposed by the present invention.

Here, as telecommunication technologies for forming the network, code division multiple access (CDMA), frequency division multiple access (FDMA), time division multiple access (TDMA), orthogonal frequency division multiple access (OFDMA), and single carrier frequency division multiple access (SC-FDMA) techniques may be used.

In addition, as short-range communication technologies for forming the network, at least one of WiFi, Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), and Ultra Wideband (UWB) can be used.

When the above-described configuration of the present invention is applied, it is possible to provide the user with a mobile terminal capable of forming a virtual character integrated with the user and of wearing a virtual object on the formed character so that the wearing style can be confirmed.

<Embodiment 2>

The second embodiment of the present invention relates to an invention in which a virtual character 20 having the same body shape as the user is created in the mobile terminal by receiving the user's body information and transforming the body shape of the virtual character 20 accordingly, and a form in which the virtual character wears, in place of the user 10, clothes, a bag, a hat, glasses, sunglasses, or the like displayed on the online market is displayed on the touch screen of the mobile terminal, so that the user can see the worn appearance.

In the second embodiment, virtual reality may be used. Virtual reality is also referred to as artificial reality, cyberspace, a virtual world, a virtual environment, a synthetic environment, or an artificial environment.

The purpose of using virtual reality is to allow people to see and manipulate an environment as if they were actually in it, without directly experiencing an environment that is difficult to experience in everyday life.

Conventional virtual reality (VR) is used in various fields such as the military, entertainment, medicine, learning, film, architectural design, and tourism. Applications include education, advanced programming, remote operation, remote satellite surface exploration, exploration-data analysis, and scientific visualization.

In a virtual reality system, human participants and the real and virtual workspaces are interconnected by hardware. The system also helps participants perceive, primarily visually, what is happening in the virtual environment, using hearing and touch as auxiliary senses.

Virtual reality is not real, but it can create a virtual space in which whatever is possible within a computer software program can be experienced as if it were real. For example, it is possible to walk around a virtual space in which a virtual character exists, select a favorite product in a favorite store, and pay for it electronically.

Virtual reality is mainly used for amusement modeled on the real world, and it provides an opportunity to encounter such experiences easily.

The foregoing has outlined the virtual reality that can be applied to the present invention. Hereinafter, a second embodiment of the present invention using a virtual reality will be described in detail with reference to the drawings.

FIG. 5 shows an example, using virtual reality, of displaying on the touch screen a character formed so as to be integrated with the user wearing, on behalf of the user, clothing displayed on an online market.

Referring to FIG. 5, a virtual character 20 having the same body shape as the user 10 is created in the mobile terminal, as described in the first embodiment. The mobile terminal connects to the online market and receives first information related to an object 30 displayed on the online market, and the virtual character 20 can wear the object 30 reconstructed from the first information.

At this time, the first information may include the external features, form, shape, color, and the like of the object 30. In the case of clothes for an adult man, it may include the bust circumference, waist circumference, hip circumference, back neck point to waist, front shoulder to waist, center back full length, neck circumference, neck length, sleeve length, back neck point-shoulder point-wrist, top arm, pants length, thigh circumference, ankle circumference, and waist length.

In the case of an adult woman, it may further include the lower chest circumference, in addition to the bust circumference, waist circumference, hip circumference, back neck point to waist, front shoulder to waist, center back full length, neck circumference, neck length, sleeve length, back neck point-shoulder point-wrist, top arm, pants length, thigh circumference, ankle circumference, and waist length.

For a hat, the first information may include the head circumference; for glasses and sunglasses, it may include the length and height of the lenses and the distance between them.

For a bag it can include length, height and width.

Through the first information, the object 30 can be represented in the virtual reality in the form in which it exists in reality, and can be worn by the virtual character 20.
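One way to use the garment measurements carried in the first information is to compare them against the virtual character's body measurements before rendering the worn form. The fit thresholds, names, and values below are illustrative assumptions, not the patent's specified method.

```python
# Hypothetical sketch: compare the garment's measurements (from the first
# information) with the virtual character's body measurements to decide
# how the object 30 would fit when worn.

def fit_assessment(body, garment, ease=4.0):
    """Compare shared measurements; positive slack means room to spare."""
    report = {}
    for key in sorted(body.keys() & garment.keys()):
        slack = garment[key] - body[key]
        if slack < 0:
            report[key] = "too tight"
        elif slack > ease:
            report[key] = "loose"
        else:
            report[key] = "fits"
    return report

body = {"bust_circumference": 92.0, "waist_circumference": 78.0}
garment = {"bust_circumference": 96.0, "waist_circumference": 76.0}
print(fit_assessment(body, garment))
```

Such a per-measurement report could drive both the rendering (how tightly the clothes drape on the character 20) and a simple size warning before purchase.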

Meanwhile, FIG. 6 shows an example of displaying information on clothes displayed on the online market on a touch screen using a virtual reality.

Referring to FIG. 6, the first information on the clothes is transmitted to the goods information database, second information corresponding to the first information is retrieved from the goods information database, and the received second information can be displayed on the display unit, either together with the object 30 currently shown or overlaid on the object 30.

At this time, the online markets selling the object can be displayed in order of price, using the price information included in the second information, so that the user 10 can easily access the online market where the object is sold at the lowest price.

In addition, if the user 10 accesses an online market to purchase an object but the object is out of stock, the user can place a reservation. When the online market receives new stock of the reserved object, a notification and information about this fact can be displayed on the display unit 151 so that the user can recognize it. One way to implement this is to use the push functionality of smartphones.
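The reservation-and-notification flow could be sketched as follows; the class and the in-memory queue standing in for a smartphone push service are assumptions, not the patent's implementation.

```python
# Sketch of the reservation flow: a user reserves an out-of-stock object,
# and when the market reports new stock, a push message is queued.

class ReservationBook:
    def __init__(self):
        self.reservations = {}   # object_id -> list of user ids
        self.notifications = []  # queued push messages (stand-in for a push service)

    def reserve(self, user_id, object_id):
        self.reservations.setdefault(object_id, []).append(user_id)

    def stock_arrived(self, object_id):
        # Notify every user who reserved this object.
        for user_id in self.reservations.pop(object_id, []):
            self.notifications.append((user_id, f"{object_id} is back in stock"))

book = ReservationBook()
book.reserve("user10", "jacket-42")
book.stock_arrived("jacket-42")
```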

In addition, a function of recognizing the taste of the user 10 and recommending objects that suit or match that taste may be included. The first information obtained from recording a plurality of objects is stored in the memory, the stored first information is analyzed to produce third information representing the user's taste, and the third information is stored in the memory and transmitted to the external goods information database. The fourth information received in response is displayed on the display unit 151. In addition, a notification function can be included so that the user can recognize that a product has been recommended.

At this time, the fourth information may include price information of the object, inventory information of nearby stores, sales information on the online market, product reviews, and the like.
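A hedged sketch of this pipeline: first information accumulated from recorded objects is analyzed into third information (the user's taste), which is matched against a product database to yield fourth information. Reducing taste to a single color attribute, and all field names, are illustrative assumptions.

```python
from collections import Counter

def derive_taste(first_infos):
    """Analyze stored first information into 'third information' (taste)."""
    colors = Counter(info["color"] for info in first_infos)
    return {"preferred_color": colors.most_common(1)[0][0]}

def recommend(taste, catalog):
    """Match taste against a goods database to produce 'fourth information'."""
    return [p for p in catalog if p["color"] == taste["preferred_color"]]

history = [{"color": "navy"}, {"color": "navy"}, {"color": "beige"}]
taste = derive_taste(history)                     # third information
catalog = [{"name": "coat", "color": "navy", "price": 50000},
           {"name": "cap", "color": "red", "price": 12000}]
picks = recommend(taste, catalog)                 # fourth information
```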

In addition, the online-market sales information included in the fourth information may include a hyperlink to each online market selling the object, so that the user can easily access an online market to purchase it.

At this time, the online markets selling the object can be sorted by the price information included in the fourth information, with the lowest price listed first, so that the user 10 can easily access the online market offering the lowest price.

Meanwhile, FIGS. 7A and 7B show the character and clothes in virtual reality being deformed and displayed according to the user's touch.

Referring to FIGS. 7A and 7B, the user can input a touch of a predetermined pattern on the display unit 151 including the touch screen, and according to the input touch, the virtual character 20 and the object 30 may be deformed and displayed on the display unit 151.

For example, when a pinch-to-zoom touch is input, the virtual character 20 and the object 30 can be enlarged or reduced. When a rotation touch is input, the virtual character 20 and the object 30 rotate together in the direction of the touch. In addition, the virtual character 20 can perform these operations while wearing the object 30.

Through such touches, the user 10 can easily examine the overall appearance of the virtual character 20 wearing the object 30.
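The pinch and rotation behavior of FIGS. 7A and 7B can be sketched as state updates applied to the character and its worn object together; the class below is an illustrative stand-in for the terminal's rendering state, not the patent's implementation.

```python
# The character and its worn object share one transform, so pinching or
# rotating deforms both at once, as described for FIGS. 7A and 7B.

class DisplayedCharacter:
    def __init__(self):
        self.scale = 1.0
        self.angle = 0.0  # rotation in degrees

    def pinch(self, factor):
        """Enlarge (factor > 1) or reduce (factor < 1) character and object."""
        self.scale *= factor

    def rotate(self, degrees):
        """Rotate character and worn object together in the touch direction."""
        self.angle = (self.angle + degrees) % 360

avatar = DisplayedCharacter()
avatar.pinch(1.5)   # enlarge both together
avatar.rotate(90)   # turn both to view the outfit from the side
```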

In addition, a function of recognizing the taste of the user 10 and recommending objects that suit or match that taste may be included. The first information obtained from the displayed object 30 connected to the online market is stored in the memory; the stored first information is analyzed to extract detailed items matching the detailed items constituting the first information, and the collected third information is stored in the memory. The third information is transmitted to the external goods information database, the corresponding fourth information is received, and the received fourth information is displayed on the display unit 151. At this time, the fourth information may include price information of the object, sales information on the online market, product reviews, and the like.

The mobile terminal 100 may generate image data including the image displayed on the display unit 151, the image obtained through the camera 121, and the voice obtained through the microphone 122, and store it in the memory 160. In addition, the stored image data can be transmitted to other users.

Meanwhile, the above function can be implemented through an application used in a smart phone.

Also, the control unit 180 may control the virtual character 20 to be displayed on the display unit 151 only when a predetermined application is executed.

That is, the functions of the present invention are not provided while the specific application is not executed, and the contents of the present invention apply when an image is acquired while the specific application is executed.

When the power of the mobile terminal is cut off or an error occurs while the image is being stored, the point at which the event occurred is stored in the memory, so that when the mobile terminal operates again, recording can resume from the stored point.

In this case, under the control of the control unit 180, the memory 160 may store the progress made up to the point where the power of the terminal was interrupted, together with that point itself.

Thereafter, when the power of the mobile terminal 100 is restored, the control unit 180 may perform the recording step from the point where it was interrupted, and perform an operation of adding new contents to the contents proceeded to the interrupted point.
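The checkpoint-and-resume behavior above can be sketched as follows, with a plain dict standing in for the memory 160; this simplification is an assumption, not the terminal's actual storage mechanism.

```python
# Progress is checkpointed as the recording advances, so after a power loss
# a new session restarts from the last stored point, not the beginning.

class RecordingSession:
    def __init__(self, memory):
        self.memory = memory  # stand-in for the terminal's memory 160

    def record(self, frame_index):
        # Checkpoint each step so an interruption loses nothing.
        self.memory["last_frame"] = frame_index

    def resume_point(self):
        return self.memory.get("last_frame", 0)

memory = {}
session = RecordingSession(memory)
for frame in range(5):
    session.record(frame)
# ...power interrupted; once restored, a new session reads the stored point
restarted = RecordingSession(memory)
```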

In addition, the mobile terminal 100 can interoperate with a nearby mobile terminal or a stationary terminal such as a TV, receive information of an object from the other terminal, and display it on the mobile terminal 100, so that the virtual character 20 can be seen in the mobile terminal 100 wearing a commodity such as a garment, a bag, shoes, a hat, or sunglasses.

Conversely, by interfacing the virtual character created in the mobile terminal 100 with a nearby mobile terminal or a stationary terminal such as a TV, the virtual character can be displayed on that nearby terminal, wearing a product displayed there.

For example, when a user watching a broadcast such as home shopping on a stationary terminal such as a TV wants to confirm whether a garment on the broadcast suits him or her, the virtual character 20 generated in the user's mobile terminal 100 can be transmitted to the stationary terminal, and the virtual character 20 displayed on the stationary terminal can wear the garment on behalf of the user. The dressed virtual character 20 can be displayed on the stationary terminal or on the user's mobile terminal 100, and its shape can be changed according to the user's gesture input. Also, the virtual character 20 displayed on the stationary terminal may be changed according to the user's gesture input entered on the mobile terminal 100.

Meanwhile, a network capable of sharing information among a plurality of users installing specific applications providing the functions proposed by the present invention can be formed.

Here, as the telecommunication technology for forming the network, at least one of code division multiple access (CDMA), frequency division multiple access (FDMA), time division multiple access (TDMA), orthogonal frequency division multiple access (OFDMA), and single carrier frequency division multiple access (SC-FDMA) techniques may be used.

In addition, as the short-range communication technology for forming the network, at least one of WiFi, Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), and Ultra Wideband (UWB) may be used.

<Third Embodiment>

According to another embodiment of the present invention, a virtual character is created using only basic body information set in advance and the user's face; when the user selects an article to purchase, the body part of the user corresponding to the selected article is photographed, the virtual character is updated using the photographed body part, and the user can shop easily by checking in advance how the updated character looks wearing the selected article.

That is, instead of capturing or inputting the user's entire body information from the beginning, a virtual character is created using only the most basic body information and the user's face, and only the body part corresponding to the article selected by the user is newly photographed to update the character. This lets the user select a product that suits him or her through a character wearing the product, while reducing the effort and time required to make the character.

Reference is made to Fig. 8 to illustrate the method proposed by the present invention.

FIG. 8 is a flowchart illustrating a specific example of shopping through a process of displaying a character wearing a shopping object determined according to a user proposed by the present invention.

Referring to FIG. 8, first, a step (S110) of acquiring an image of the user through a camera is performed.

In step S110, it is sufficient to capture an image including only the user's face; the image need not expose the user's entire body.

That is, as described above, the character is created using only the user's face and the basic body information described later, rather than from all of the user's body parts. Therefore, any image that includes the user's face can be used.

Next, step S120 of receiving first information related to the user's body through the touch screen is performed.

In step S120, it is not necessary to input all of the user's body information; only simple body information such as height, weight, and body type needs to be input.

In addition, information to be input according to the preference of the user may be specified or changed in advance by the user.

Thereafter, the step of recognizing the user's face in the image (S130) is performed.

That is, a step of extracting only the part corresponding to the face of the user in the image acquired in step S110 is performed.

Accordingly, as described above, in step S110, the image may be captured by including only the face of the user's body.

In addition, the step of generating the first character using the recognized face and the first information through the control unit (S140) proceeds.

That is, a simple character corresponding to the face and the basic body information obtained in step S120 can be generated. Thus, the user can create a simple virtual character reflecting his or her body without much time and effort.

Thereafter, the first character is displayed on the touch screen (S150).

In addition, a step (S160) in which the user selects on the touch screen the first object to be purchased may be performed.

Step S160 covers both general online purchases and offline purchases.

That is, selecting the first object in step S160 includes selecting a product to purchase at an online shopping mall, and also photographing an offline product with the camera and then selecting the product to purchase from the photographed image.

At this time, step S170 is performed in which the touch screen displays information requesting that the second information, which is the body part information of the user related to the first object, is further photographed through the camera.

For example, if the product the user wants to purchase is pants, the user may be required to take a picture of the user's legs specifically.

As another example, if the product the user wants to purchase is a t-shirt, the user may be required to specifically photograph the upper body part of the user.

Thereafter, when the second information is obtained through step S170, a step of changing the shape of the first character to correspond to the second information (S180) is performed.

That is, the simple character generated in step S140 from the face and the basic body information obtained in step S120 is updated and displayed using the concrete second information.

The updates of the character according to the second information can be accumulated.

For example, if the product the user wants to purchase is pants, the user's leg portion is specifically photographed through the camera, and the character is updated with respect to the leg portion.

In addition, when the updated character is maintained and the product that the user intends to purchase is a t-shirt, the upper body part of the user is specifically photographed through the camera, and the character can be additionally updated with respect to the upper body part.

Therefore, as the user continues shopping, a character that exactly matches the user's body is gradually completed.
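The accumulation described in steps S160 through S180 can be sketched as merging per-part scans into one character model; the product-to-part mapping and the part names are illustrative assumptions.

```python
# Each purchase triggers a scan of only the related body part; the scans
# accumulate, so the character grows more accurate as shopping proceeds.

PRODUCT_TO_PART = {"pants": "legs", "t-shirt": "upper_body"}

def update_character(character, product, scan):
    part = PRODUCT_TO_PART[product]
    updated = dict(character)
    updated[part] = scan  # replace only the newly photographed part
    return updated

# Initial character: real face, everything else estimated from basic info.
character = {"face": "captured", "legs": "estimated", "upper_body": "estimated"}
character = update_character(character, "pants", "scanned")    # FIG. 9(c)-(d)
character = update_character(character, "t-shirt", "scanned")  # later purchase
# Both scanned parts persist while the rest stays estimated.
```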

This can increase the user's interest, further stimulating the desire to shop, and also provides a better criterion for determining whether the product selected by the user really suits the user.

Thereafter, a step (S190) of displaying the reshaped first character wearing the first object on the touch screen is performed, and the user can decide whether to purchase the product by looking at the changed character.

Reference is made to Fig. 9 to describe steps S150 through S190 in more detail.

FIGS. 9A to 9D are diagrams for explaining the concrete steps of determining the shape of the character corresponding to the user shown in FIG. 8.

FIG. 9(a) shows step S150, in which a simple character corresponding to the face and the basic body information obtained in step S120 is displayed.

FIG. 9(b) shows step S160, in which the user selects on the touch screen the first object to be purchased. In FIG. 9(b), it is assumed that the user has selected a pants product.

In addition, FIG. 9 (c) shows that the user photographs a leg region of a body part in response to a message to specifically photograph a user's leg through a camera.

FIG. 9D shows a step S190 in which a character whose leg parts are updated through the information of the leg areas obtained in FIG. 9C is expressed.

The method of shopping by the user through FIG. 8 can be applied to a case where a user purchases a product for a pet. This will be described in detail with reference to FIG.

FIG. 10 is a flowchart for explaining a concrete example of shopping through a process of displaying a character, determined according to a pet of a user proposed by the present invention, wearing a shopping object.

Referring to FIG. 10, first, a step (S210) of acquiring an image of the animal through a camera is performed.

In the same manner as in FIG. 8, an image including only the face portion of the animal may be acquired in Step S210.

Thereafter, step S220 of receiving first information related to the animal's body from the user through the touch screen is performed.

As in FIG. 8, it is not necessary to input all of the pet's body information; only basic information needs to be input.

It is also possible for the user to specify or modify the information to be input according to the user's preference.

Thereafter, step S230 of recognizing the animal's face in the image is performed.

In addition, a step S240 of generating a first character using the recognized face and the first information is performed.

Thereafter, the first character is displayed on the touch screen (S250), and the user selects the first object on the touch screen (S260).

At this time, in step S270, the touch screen displays information requesting that the second information, which is the body part information of the animal related to the first object, is additionally photographed through the camera.

In addition, a step (S280) of changing the shape of the first character to correspond to the second information and a step (S290) of displaying the changed first character wearing the first object on the touch screen are performed.

Therefore, a character for the user's pet can be created easily, products that really suit the pet can be found, and a character that perfectly matches the pet can be completed step by step through the accumulated information.

The steps of completing the character using the accumulated information will be described in detail with reference to FIG.

FIGS. 11(a) and 11(b) are diagrams for explaining the concrete steps of building a character using accumulated information in the context of the present invention.

As FIG. 9(c) shows the user photographing the leg region in response to a message requesting that the user's legs be specifically photographed through the camera, and FIG. 9(d) shows the character updated with that leg-region information, FIG. 11(a) likewise represents the character whose leg portion has been updated.

FIG. 11(b) shows the user photographing the upper-body region in response to a message requesting that the user's upper body be specifically photographed through the camera; the character is additionally updated for the upper-body part while the earlier leg update is maintained.

<Fourth Embodiment>

According to another embodiment of the present invention, there is provided a method of dividing a display area of a display unit into a plurality of areas and displaying a plurality of characters in the divided areas.

FIG. 12 is a diagram for explaining how a display area is divided into a plurality of areas and a plurality of characters are displayed in divided areas to support shopping.

Referring to FIG. 12, the entire area of the display unit or touch screen is divided into four areas, and a character is displayed in each of the divided areas.

However, the four-way division is merely an example of applying the present invention; the display may be divided into fewer or more than four areas.

Also, the user can select a plurality of articles, dress the plurality of characters differently with them, and thereby choose the product that suits him or her best.
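The division of the display into regions, as in FIG. 12, can be sketched with normalized coordinates; the grid layout and the rectangle tuple format are assumptions for illustration.

```python
# Divide the display into a rows-by-cols grid and assign one character's
# region per cell. Coordinates are normalized to the 0..1 range.

def split_display(rows, cols):
    """Return (left, top, width, height) rectangles for each region."""
    w, h = 1.0 / cols, 1.0 / rows
    return [(c * w, r * h, w, h) for r in range(rows) for c in range(cols)]

regions = split_display(2, 2)  # the four-way split shown in FIG. 12
# One differently dressed character per region:
assignments = dict(zip(regions, ["hat A", "hat B", "hat C", "hat D"]))
```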

Fig. 13 is a diagram for explaining how easy shopping is supported by comparing a plurality of characters wearing a same category shopping item with each other in the context of the present invention.

Referring to FIG. 13, four virtual characters of the user are displayed wearing hats of different styles, and the user examines the displayed characters to select the hat that suits him or her best.

Fig. 14 is a diagram for explaining how a plurality of characters wearing different kinds of category shopping articles are compared with each other to facilitate easy shopping, in connection with the present invention.

In FIG. 14, unlike FIG. 13, articles of different categories, rather than the same category, are freely mixed and worn by the characters; the worn appearances are compared with each other, and the article or articles that suit the user best are selected.

Accordingly, the user can select the products that suit him or her best by putting a plurality of articles of the same category on characters modeled after the user, or by applying a plurality of articles of different categories and comparing them with each other.
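Dressing several character copies in combinations of items, whether of one category (FIG. 13) or of mixed categories (FIG. 14), can be sketched by enumerating one item per category; the category and item names are illustrative assumptions.

```python
from itertools import product

# Enumerate every outfit built from one item per category, so each outfit
# can be put on one displayed character copy for side-by-side comparison.

def outfit_combinations(items_by_category):
    """Return every combination of one item per category as dicts."""
    cats = sorted(items_by_category)
    return [dict(zip(cats, combo))
            for combo in product(*(items_by_category[c] for c in cats))]

outfits = outfit_combinations({"hat": ["cap", "beanie"],
                               "top": ["t-shirt", "hoodie"]})
# Four candidate outfits, one per character region of FIG. 12
```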

<Fifth Embodiment>

According to another embodiment of the present invention, a user can share a character with friends who form a network through a wireless communication unit to receive feedback on a selected item or to substitute for payment.

FIG. 15 is a diagram for explaining contents for facilitating an easy shopping using a plurality of characters corresponding to a plurality of users forming a network.

The user can establish a network with a user of another mobile terminal using at least one of the above-described short-range communication and long-distance communication. At this time, the number of other users forming the network may be plural.

Referring to FIG. 15, a plurality of characters of a user and a character of another user forming the network are displayed.

Users forming the network can send feedback comments on the articles worn by each character, and a user who receives feedback comments can use them in deciding whether to purchase the article.

Furthermore, a user forming the network can pay for an item selected by another user on that user's behalf.

FIG. 16 is a diagram for explaining a method by which a plurality of users in FIG. 15 can settle each other.

Referring to FIG. 16, a user can select a character to be paid out among characters corresponding to other users, and a message asking whether or not to settle the payment can be displayed on the terminal.

If the user selects "YES" in response to this message, payment for the selected item can be made on the counterpart's behalf.

In this way, users forming a network can pay for each other's items, which can serve as a way of giving gifts.

When the above-described configuration of the present invention is applied, it is possible to provide the user with a mobile terminal capable of forming a virtual character that can stand in for the user and of virtually trying on garments through the formed character.

Also, it is possible to provide the user with a function of analyzing the user's taste and recommending the corresponding product to the user, and to provide the user with a function of notifying the user of the availability of the product meeting the user's taste through the mobile terminal.

In addition, according to the present invention, products of the same or different categories can be put on the character, so that the user's taste can be matched accurately.

In addition, the present invention provides functions for easily receiving feedback on products put on the character by forming a network with nearby friends, deciding whether to purchase based on that feedback, and paying on one another's behalf for products selected by the nearby friends forming the network.

The present invention can also be embodied as computer-readable code on a computer-readable recording medium. A computer-readable recording medium includes all kinds of recording devices in which data readable by a computer system is stored. Examples of the computer-readable recording medium include a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, and an optical data storage device, and it may also be implemented in the form of a carrier wave (for example, transmission over the Internet). In addition, the computer-readable recording medium may be distributed over network-connected computer systems so that the computer-readable code can be stored and executed in a distributed manner. Functional programs, codes, and code segments for implementing the present invention can be easily inferred by programmers in the technical field to which the present invention belongs.

It should be noted that the above-described apparatus and method are not limited to the configurations and methods of the embodiments described above; the embodiments may be modified such that all or some of them are selectively combined.

Claims (12)

A camera for acquiring an image for a user; A touch screen for inputting first information related to the body of the user; And a controller for recognizing the face of the user in the image, generating a first character using the recognized face and first information, and controlling the first character to be displayed on the touch screen In the terminal,
When the user selects a first object to be purchased on the touch screen,
Wherein,
Controlling the touch screen to display information requesting to photograph second information, which is body part information of the user related to the first object, through the camera,
Wherein the control unit changes the shape of the first character to correspond to the second information and controls the first character whose shape is changed to be displayed on the touch screen by wearing the first object.
A camera for acquiring an image of an animal; A touch screen for receiving first information related to the body of the animal from a user; And a controller for recognizing the face of the animal in the image, generating a first character using the recognized face and first information, and controlling the first character to be displayed on the touch screen In the terminal,
When the user selects a first object to be purchased on the touch screen,
Wherein,
Controlling the touch screen to display information requesting that the second information, which is the body part information of the animal related to the first object, is further photographed through the camera,
Wherein the control unit changes the shape of the first character to correspond to the second information and controls the first character whose shape is changed to be displayed on the touch screen by wearing the first object.
3. The method according to claim 1 or 2,
Wherein the first object and the second information are plural,
And a memory for storing the plurality of second information,
Wherein the control unit accumulates and changes the shape of the first character so as to correspond to a plurality of second information stored in the memory.
3. The method according to claim 1 or 2,
Wherein,
The entire area of the touch screen is divided into a plurality of areas,
And controls each of the plurality of first characters to be displayed in the separated area.
5. The method of claim 4,
Wherein the first object is a plurality of objects,
Wherein the control unit controls each of the plurality of first characters to be displayed by wearing each of the plurality of first objects.
6. The method of claim 5,
Wherein the plurality of first objects are the same category or different categories.
3. The method according to claim 1 or 2,
And a wireless communication unit for transmitting third information related to the first character to the outside,
Wherein the wireless communication unit receives information related to the second character from the outside.
8. The method of claim 7,
The wireless communication unit includes:
From the outside, feedback information on a first object included in the first character,
And transmits feedback information on a first object included in the second character to the outside.
8. The method of claim 7,
Wherein the first object and the character are plural,
Wherein,
And controls each of the plurality of first characters to be transmitted to the outside through the wireless communication unit by wearing each of the plurality of first objects.
8. The method of claim 7,
Wherein the external user who has received the third information is able to pay for the first object included in the first character,
Wherein the user is able to pay for the first object included in the second character.
Acquiring an image for a user through a camera;
Receiving first information related to the user's body through a touch screen;
Recognizing the face of the user in the image;
Generating a first character using the recognized face and first information; And
And displaying the first character on the touch screen,
When the user selects a first object to be purchased on the touch screen,
The touch screen displaying information requesting to further photograph second information, which is body part information of the user related to the first object, through the camera;
Changing a shape of the first character to correspond to the second information; And
The first character having the changed shape is displayed on the touch screen by wearing the first object.
Acquiring an image of the animal through a camera;
Receiving first information related to the body of the animal from a user through a touch screen;
Recognizing the animal's face in the image;
Generating a first character using the recognized face and first information; And
And displaying the first character on the touch screen,
When the user selects a first object to be purchased on the touch screen,
The touch screen displaying information requesting to further photograph second information, which is body part information of the animal associated with the first object, through the camera;
Changing a shape of the first character to correspond to the second information; And
The first character having the changed shape is displayed on the touch screen by wearing the first object.
KR1020150121801A 2015-08-28 2015-08-28 Mobile terminal using virtual fitting solution for shopping and method for controlling thereof KR101731718B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150121801A KR101731718B1 (en) 2015-08-28 2015-08-28 Mobile terminal using virtual fitting solution for shopping and method for controlling thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150121801A KR101731718B1 (en) 2015-08-28 2015-08-28 Mobile terminal using virtual fitting solution for shopping and method for controlling thereof

Publications (2)

Publication Number Publication Date
KR20170025406A true KR20170025406A (en) 2017-03-08
KR101731718B1 KR101731718B1 (en) 2017-04-28

Family

ID=58403547

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150121801A KR101731718B1 (en) 2015-08-28 2015-08-28 Mobile terminal using virtual fitting solution for shopping and method for controlling thereof

Country Status (1)

Country Link
KR (1) KR101731718B1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020077835A1 (en) * 2018-10-15 2020-04-23 广东美的白色家电技术创新中心有限公司 Product display method, apparatus, and system
KR20220040051A (en) * 2020-09-23 2022-03-30 주식회사 넥스트키 Apparel wearing system based on face application, and methoe thereof
KR20220105306A (en) * 2021-01-20 2022-07-27 강승진 Virtual wardrobe-based apparel sales application, method, and apparatus therefor

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20040042324A (en) 2002-11-14 2004-05-20 조성억 System and method for avatar service

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20040042324A (en) 2002-11-14 2004-05-20 조성억 System and method for avatar service

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020077835A1 (en) * 2018-10-15 2020-04-23 广东美的白色家电技术创新中心有限公司 Product display method, apparatus, and system
KR20220040051A (en) * 2020-09-23 2022-03-30 주식회사 넥스트키 Apparel wearing system based on face application, and methoe thereof
KR20220105306A (en) * 2021-01-20 2022-07-27 강승진 Virtual wardrobe-based apparel sales application, method, and apparatus therefor

Also Published As

Publication number Publication date
KR101731718B1 (en) 2017-04-28

Similar Documents

Publication Publication Date Title
CN111652678B (en) Method, device, terminal, server and readable storage medium for displaying article information
US20220301231A1 (en) Mirroring device with whole-body outfits
US11694280B2 (en) Systems/methods for identifying products for purchase within audio-visual content utilizing QR or other machine-readable visual codes
KR101894021B1 (en) Method and device for providing content and recordimg medium thereof
TWI567670B (en) Method and system for management of switching virtual-reality mode and augmented-reality mode
US9395875B2 (en) Systems, methods, and computer program products for navigating through a virtual/augmented reality
US20150170256A1 (en) Systems and Methods for Presenting Information Associated With a Three-Dimensional Location on a Two-Dimensional Display
US9830388B2 (en) Modular search object framework
US11061533B2 (en) Large format display apparatus and control method thereof
KR20160067373A (en) System of giving clothes wearing information with AVATA and operating method thereof
WO2010121110A1 (en) Apparatus, systems, and methods for a smart fixture
JP6720385B1 (en) Program, information processing method, and information processing terminal
EP2940607A1 (en) Enhanced search results associated with a modular search object framework
US20220327747A1 (en) Information processing device, information processing method, and program
KR101731718B1 (en) Mobile terminal using virtual fitting solution for shopping and method for controlling thereof
EP4217953A1 (en) Providing ar-based clothing in messaging system
CN113393290A (en) Live broadcast data processing method and device, computer equipment and medium
KR20230122642A (en) 3D painting on eyewear devices
WO2020189341A1 (en) Image display system, image distribution method, and program
JP7458363B2 (en) Information processing device, information processing method, and information processing program
JP7458362B2 (en) Information processing device, information processing method, and information processing program
JP2018160227A (en) Browsing system and program
Hasan et al. Augmented Reality in E-commerce sites of Bangladesh
KR20240056477A (en) Device and method for allowing transaction of point of interest to enter metaverse and purchase activity in metaverse store
KR20230156467A (en) Clothes sale method using augmented reality

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant