KR20170025406A - Mobile terminal using virtual fitting solution for shopping and method for controlling thereof - Google Patents
- Publication number
- KR20170025406A (application number KR1020150121801A)
- Authority
- KR
- South Korea
- Prior art keywords
- information
- user
- character
- touch screen
- displayed
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/06—Buying, selling or leasing transactions
- G06Q30/0601—Electronic shopping [e-shopping]
- G06Q30/0641—Shopping interfaces
- G06Q30/0643—Graphical representation of items or shoppers
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/06—Buying, selling or leasing transactions
- G06Q30/0601—Electronic shopping [e-shopping]
- G06Q30/0641—Shopping interfaces
-
- G—PHYSICS
- G08—SIGNALLING
- G08C—TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
- G08C17/00—Arrangements for transmitting signals characterised by the use of a wireless electrical link
- G08C17/02—Arrangements for transmitting signals characterised by the use of a wireless electrical link using a radio link
Landscapes
- Engineering & Computer Science (AREA)
- Business, Economics & Management (AREA)
- Finance (AREA)
- Accounting & Taxation (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- Strategic Management (AREA)
- General Engineering & Computer Science (AREA)
- Development Economics (AREA)
- Economics (AREA)
- Marketing (AREA)
- General Business, Economics & Management (AREA)
- Human Computer Interaction (AREA)
- Computer Networks & Wireless Communication (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
The present invention relates to a mobile terminal using a virtual fitting solution for shopping and a control method thereof. More particularly, the present invention relates to a mobile terminal capable of confirming the wearing style of a product by using a virtual character implemented in the mobile terminal and a control method thereof.
Traditionally, goods were traded offline, but online commerce has also flourished owing to advances in computer technology and Internet communication technology. In particular, online commerce is characterized by the fact that the seller does not have to operate a physical store and distribution costs are reduced, so the purchaser can buy at a low price.
In addition, the recent development of smart phones has changed the way products are purchased on the Internet. In the past, a user had to make purchases through a personal computer at a fixed location; now, purchases are made through mobile terminals regardless of time and place.
Many products sold on the market are inexpensive but do not fit the body, differ from the expected design, or appear on the screen differently from how they actually look when worn. In such cases, additional costs arise in the process of exchanging or refunding the purchased goods.
To reduce these costs, some buyers shop in the form of showrooming, that is, experiencing a product in a store and then buying the same product online at a lower price. This behavior can cause dissatisfaction among offline sellers.
In addition, even when purchasing an article offline, there is the inconvenience of repeatedly putting on and taking off clothes, and damage to displayed articles frequently occurs.
To solve these problems, there is a demand for an apparatus with which a consumer can easily purchase a desired article without actually trying on clothes.
Disclosure of Invention - Technical Problem
The present invention has been made to solve the above problems, and it is an object of the present invention to provide the user with a mobile terminal that can form a virtual character integrated with the user, so that the wearing style of a product can be confirmed through the character.
Further, it is an object of the present invention to provide the user with a function of analyzing the user's taste and recommending products accordingly.
It is also an object of the present invention to provide the user with a function of informing, through the mobile terminal, whether a product that matches the user's taste is available for purchase.
It is to be understood that both the foregoing general description and the following detailed description of the present invention are exemplary and explanatory and are not intended to limit the invention to the precise form disclosed.
A mobile terminal related to an example of the present invention for realizing the above-mentioned object includes: a camera for acquiring an image of a user; a touch screen for receiving first information related to the user's body; and a controller for recognizing the user's face in the image, generating a first character using the recognized face and the first information, and controlling the first character to be displayed on the touch screen. When the user selects a first object to be purchased on the touch screen, the controller controls the touch screen to display information requesting that second information, which is body-part information of the user related to the first object, be further photographed through the camera, changes the shape of the first character to correspond to the second information, and controls the first character whose shape has been changed to be displayed on the touch screen wearing the first object.
Meanwhile, a mobile terminal related to another example of the present invention for realizing the above-mentioned object includes: a camera for acquiring an image of an animal; a touch screen for receiving, from a user, first information related to the body of the animal; and a controller for recognizing the animal's face in the image, generating a first character using the recognized face and the first information, and controlling the first character to be displayed on the touch screen. When the user selects a first object to be purchased on the touch screen, the controller controls the touch screen to display information requesting that second information, which is body-part information of the animal related to the first object, be further photographed through the camera, changes the shape of the first character to correspond to the second information, and controls the first character whose shape has been changed to be displayed on the touch screen wearing the first object.
The mobile terminal may further include a memory for storing the second information. The first object and the second information may each be plural, and the controller may cumulatively change the shape of the first character using the plurality of second information stored in the memory.
In addition, the controller may divide the entire area of the touch screen into a plurality of areas and control each of a plurality of first characters to be displayed in one of the divided areas.
Also, the first object may be plural, and the controller may control each of the plurality of first characters to be displayed wearing one of the plurality of first objects.
In addition, the plurality of first objects may belong to the same category or to different categories.
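As a rough illustration of the screen-division behavior described above, the display width could be split into equal regions, one per character. All names and dimensions here are assumptions for illustration only, not taken from the patent:

```python
# Illustrative sketch: divide the touch-screen width into equal regions,
# one per character. Dimensions and names are assumed, not from the patent.

def divide_screen(width: int, n_characters: int):
    """Return (left, right) pixel bounds for each character's display region."""
    step = width // n_characters
    return [(i * step, (i + 1) * step) for i in range(n_characters)]

# Three characters side by side on a 1080-pixel-wide screen.
regions = divide_screen(1080, 3)
```

Each character wearing a different candidate product would then be rendered into its own region, which is one simple way the side-by-side comparison described later could be laid out.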
The mobile terminal may further include a wireless communication unit that transmits third information related to the first character to the outside and receives information related to a second character from the outside.
The wireless communication unit may receive feedback information on the first object included in the first character from the outside, and may transmit feedback information on the first object included in the second character to the outside.
In addition, the first object and the character may be plural, and the controller may control each of the plurality of first characters, wearing one of the plurality of first objects, to be transmitted to the outside through the wireless communication unit.
In addition, the external user who has received the third information can pay for the first object included in the first character, and the user can pay for the first object included in the second character.
According to another aspect of the present invention, there is provided a method of controlling a mobile terminal, the method comprising: acquiring an image of a user through a camera; receiving first information related to the user's body through a touch screen; recognizing the user's face in the image; generating a first character using the recognized face and the first information; displaying the first character on the touch screen; when the user selects a first object, which is the object of purchase, on the touch screen, displaying on the touch screen information requesting that second information, which is body-part information of the user related to the first object, be further photographed through the camera; changing the shape of the first character to correspond to the second information; and displaying the first character whose shape has been changed on the touch screen wearing the first object.
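The claimed control flow above can be sketched in code. This is a hedged, minimal illustration: the names `Character`, `generate_character`, and `fit_product` are hypothetical, and the face and body data are placeholders standing in for the camera image and touch-screen input:

```python
from dataclasses import dataclass, field

@dataclass
class Character:
    face: str                                  # recognized face (placeholder)
    body_info: dict                            # first information: body data
    worn_items: list = field(default_factory=list)

def generate_character(face_image: str, body_info: dict) -> Character:
    """Create the first character from the recognized face and body info."""
    return Character(face=face_image, body_info=body_info)

def fit_product(character: Character, product: str, part_info: dict) -> Character:
    """Reshape the character using the photographed body-part info
    (second information) and make it wear the selected product."""
    character.body_info.update(part_info)      # change the character's shape
    character.worn_items.append(product)       # wear the first object
    return character

# Usage: the user photographs a face, enters measurements, selects a coat,
# then photographs the relevant body part when prompted.
c = generate_character("face.jpg", {"waist_cm": 80})
c = fit_product(c, "coat", {"shoulder_cm": 45})
```

The actual terminal would of course drive this from camera frames and touch events; the sketch only shows the order of operations the method claims describe.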
According to another aspect of the present invention, there is provided a method of controlling a mobile terminal, the method comprising: acquiring an image of an animal through a camera; receiving, from a user through a touch screen, first information related to the body of the animal; recognizing the animal's face in the image; generating a first character using the recognized face and the first information; displaying the first character on the touch screen; when the user selects a first object, which is the object of purchase, on the touch screen, displaying on the touch screen information requesting that second information, which is body-part information of the animal related to the first object, be further photographed through the camera; changing the shape of the first character to correspond to the second information; and displaying the first character whose shape has been changed on the touch screen wearing the first object.
The present invention can provide a user with a mobile terminal capable of forming a virtual character that can be integrated with a user and wearing a garment virtually through a formed character to confirm the wearing form of the product.
In addition, the user can be provided with a function of analyzing the user's taste and recommending the product.
The user can be provided with a function of notifying the user whether or not the user can purchase a product that matches the user's taste through the mobile terminal.
The present invention allows a character to purchase a commodity that exactly matches a user's taste by wearing a homogeneous or heterogeneous category product.
In addition, the present invention forms a network with neighboring friends to easily receive feedback information on a product put on a character, and determine whether to purchase the product.
In addition, the present invention may provide a function of mutually paying for products selected by nearby friends forming a network.
It should be understood, however, that the effects obtained by the present invention are not limited to the above-mentioned effects, and that other effects not mentioned will be clearly understood by those skilled in the art to which the present invention belongs.
BRIEF DESCRIPTION OF THE DRAWINGS The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate preferred embodiments of the invention and, together with the description, serve to further the understanding of the technical idea of the invention; they should not be construed as limiting the invention.
FIG. 1 shows an example of a block diagram of a mobile terminal that can be applied to the present invention.
2A to 2C show an example of a character formed to be integrated with a user that can be applied to the present invention.
FIG. 3 shows an example of using augmented reality to display on a touch screen that a character formed to integrate with a user wears clothes that are offline, instead of a user.
Fig. 4 shows an example of displaying information on offline clothes on a touch screen using an augmented reality.
FIG. 5 shows an example of using a virtual reality to display on a touch screen that a character formed to integrate with a user wears clothing displayed on an online market on behalf of a user.
6 shows an example of displaying information on clothing displayed on the online market on a touch screen using a virtual reality.
7A and 7B illustrate an example in which a virtual character is changed by input by a user's gesture and displayed on the touch screen.
FIG. 8 is a flowchart illustrating a specific example of shopping through a process of displaying a character wearing a shopping object determined according to a user proposed by the present invention.
FIGS. 9A to 9D are diagrams for explaining concrete steps of determining the shape of a character corresponding to the user shown in FIG. 8.
FIG. 10 is a flowchart for explaining a concrete example of shopping through a process of displaying a character, determined according to a pet of a user proposed by the present invention, wearing a shopping object.
11 (a) and 11 (b) are diagrams for explaining concrete steps of making a character using accumulated information in the context of the present invention.
FIG. 12 is a diagram for explaining how a display area is divided into a plurality of areas and a plurality of characters are displayed in divided areas to support shopping.
Fig. 13 is a diagram for explaining, in the context of the present invention, how easy shopping is supported by comparing a plurality of characters wearing shopping items of the same category.
FIG. 14 is a diagram for explaining how easy shopping is supported by comparing a plurality of characters wearing different kinds of category shopping items, in relation to the present invention.
FIG. 15 is a diagram for explaining contents for facilitating an easy shopping using a plurality of characters corresponding to a plurality of users forming a network.
FIG. 16 is a diagram for explaining a method by which a plurality of users in FIG. 15 can settle each other.
Hereinafter, a preferred embodiment of the present invention will be described with reference to the drawings. The embodiment described below does not unduly limit the content of the present invention described in the claims, and not every structure described in this embodiment is necessarily essential as a solving means of the present invention.
Hereinafter, the mobile terminal of the present invention will be described in detail.
The mobile terminal described herein may include a mobile phone, a smart phone, a laptop computer, a digital broadcasting terminal, a PDA (Personal Digital Assistant), a PMP (Portable Multimedia Player), a navigation device, and the like. However, it will be readily apparent to those skilled in the art that the configuration according to the embodiments described herein may also be applied to fixed terminals such as a digital TV and a desktop computer, unless the configuration is applicable only to the portable terminal.
FIG. 1 shows an example of a block diagram of a mobile terminal that can be applied to the present invention.
Referring to FIG. 1, the mobile terminal may include the components described below.
Hereinafter, the components will be described in order.
The
The
The broadcast channel may include a satellite channel and a terrestrial channel. The broadcast management server may refer to a server for generating and transmitting broadcast signals and / or broadcast related information, or a server for receiving broadcast signals and / or broadcast related information generated by the broadcast management server and transmitting the generated broadcast signals and / or broadcast related information. The broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and a broadcast signal in which a data broadcast signal is combined with a TV broadcast signal or a radio broadcast signal.
The broadcast-related information may refer to a broadcast channel, a broadcast program, or information related to a broadcast service provider. The broadcast-related information may also be provided through a mobile communication network. In this case, it may be received by the
The broadcast-related information may exist in various forms. For example, an EPG (Electronic Program Guide) of DMB (Digital Multimedia Broadcasting) or an ESG (Electronic Service Guide) of Digital Video Broadcast-Handheld (DVB-H).
For example, the
The broadcast signal and / or broadcast related information received through the
The
The
WLAN (Wi-Fi), WiBro (Wireless Broadband), WiMAX (World Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), and the like can be used as wireless Internet technologies.
The short-
The
Referring to FIG. 1, an A / V (Audio / Video)
The image frame processed by the
At this time, two or
For example, the
In this case, the first camera 121a is for capturing the left eye image, which is the source image of the 3D image, and the second camera 121b, for the right eye image capturing.
The
The
The
The
The
The
The
The
In addition, the
That is, the
That is, under the control of the
The
The
Some of these displays may be transparent or light transmissive so that they can be seen through. This can be referred to as a transparent display, and a typical example of the transparent display is TOLED (Transparent OLED) and the like. The rear structure of the
There may be two or
(Hereinafter, referred to as a 'touch screen') in which a
The touch sensor may be configured to convert a change in a pressure applied to a specific portion of the
If there is a touch input to the touch sensor, the corresponding signal (s) is sent to the touch controller (not shown). The touch controller processes the signal (s) and transmits the corresponding data to the
The
Examples of the proximity sensor include a transmission type photoelectric sensor, a direct reflection type photoelectric sensor, a mirror reflection type photoelectric sensor, a high frequency oscillation type proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor. And to detect the proximity of the pointer by the change of the electric field along the proximity of the pointer when the touch screen is electrostatic. In this case, the touch screen (touch sensor) may be classified as a proximity sensor.
Hereinafter, for convenience of explanation, the act of recognizing that the pointer is positioned on the touch screen while the pointer is not in contact with the touch screen is referred to as a "proximity touch", and the act of actually touching the pointer on the screen is called a "contact touch". The position where the pointer is proximity-touched on the touch screen means the position at which the pointer corresponds vertically to the touch screen when the pointer is proximity-touched.
The proximity sensor detects a proximity touch and a proximity touch pattern (e.g., a proximity touch distance, a proximity touch direction, a proximity touch speed, a proximity touch time, a proximity touch position, a proximity touch movement state, and the like). Information corresponding to the detected proximity touch operation and the proximity touch pattern may be output on the touch screen.
The
The
The
In addition to the vibration, the
The
The
Specifically, the
The
Preferably, the
The
The
The
The
The identification module is a chip for storing various information for authenticating the usage right of the
When the
The
The
When the
The
The various embodiments described herein may be embodied in a recording medium readable by a computer or similar device using, for example, software, hardware, or a combination thereof.
According to a hardware implementation, the embodiments described herein may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and other electronic units for carrying out the functions described herein. In some cases, the embodiments described herein may be implemented by the controller itself.
According to a software implementation, embodiments such as the procedures and functions described herein may be implemented with separate software modules. Each of the software modules may perform one or more of the functions and operations described herein. Software code can be implemented in a software application written in a suitable programming language. The software code is stored in the
Hereinafter, an embodiment of a mobile terminal using a virtual character proposed by the present invention will be described in detail.
However, although the embodiment described below is based on a person, this is for convenience only, and is not limited to a person but can be applied to an animal (for example, a pet dog).
<Embodiment 1>
The first embodiment of the present invention creates a
The augmented reality (AR) is a technique of superimposing a three-dimensional virtual object on the real world. In other words, it is a technology that superimposes a virtual object on the real world seen by the user.
It is called Mixed Reality (MR) because it combines real world and virtual world with additional information in real time and displays it as one image.
Augmented reality is a hybrid VR system that combines the real environment with the virtual environment, and has been the subject of research and development, centered on the US and Japan, since the late 1990s.
Augmented reality, a concept that complements the real world with a virtual world, uses a virtual environment created by computer graphics, but the protagonist is a real environment.
Here, the computer graphic serves to provide additional information necessary for the real environment.
By overlapping the three-dimensional virtual image on the real image that the user is viewing, it means that the distinction between the real environment and the virtual screen becomes blurred.
Virtual reality technology immerses users in a virtual environment so that they cannot see the actual environment. In contrast, augmented reality technology, which mixes the real environment with virtual objects, allows the user to see the real environment, providing better realism and additional information.
For example, when the
Augmented reality is a very complicated and difficult imaging technology internally, but basically it works with the following principles and procedures.
Several elements are necessary for applying augmented reality technology: a GPS device for transmitting and receiving geographical and positional information, a gravity (tilt + electronic compass) sensor or gyroscope sensor, a location information system (an Internet connection is required), an augmented reality application that receives the detailed information and displays it over the real background, and an IT device such as a smart phone.
However, the present invention is not limited thereto, and it is also possible to apply augmented reality technology with more or fewer components.
First, after executing the augmented reality application, the user may point a built-in camera, such as the camera of the mobile terminal, at a specific street or building. GPS information about the current position and tilt information from the gravity sensor are then passed to the application.
The GPS information is then transmitted over the Internet to a specific location information system. This is because it is practically impossible to store in the mobile terminal all the detailed information about every area or building within the position radius.
In addition, the location information system receiving the GPS information such as the position and the inclination from the user searches the database for the detailed information of the corresponding area or object, and transmits the result to the
This includes, of course, the name of the particular building, the telephone number, and so on.
The
Since the data transmission and reception steps are performed continuously, detailed information about the area and its surroundings appears on the screen sequentially as the mobile terminal is moved.
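The request/response cycle described above can be condensed into a short sketch: the device sends position data to a location information system, which returns detail records to be overlaid on the camera view. Everything here is illustrative; the fake database, `lookup_details`, and `overlay` are assumptions standing in for the real networked components:

```python
# Hedged sketch of the AR data flow: GPS -> location system -> overlay.
# The in-memory FAKE_DB stands in for the remote location information system.

FAKE_DB = {
    (37.57, 126.98): {"name": "Example Building", "phone": "02-000-0000"},
}

def lookup_details(lat: float, lon: float) -> dict:
    """Stand-in for the location information system query over the Internet."""
    return FAKE_DB.get((round(lat, 2), round(lon, 2)), {})

def overlay(frame: str, details: dict) -> str:
    """Superimpose the returned details on the live camera frame."""
    if not details:
        return frame
    return f"{frame} [{details['name']} / {details['phone']}]"

# One iteration of the continuously repeated cycle.
frame = overlay("camera_frame", lookup_details(37.5700, 126.9800))
```

In the real system this loop runs continuously as the terminal moves, which is why the on-screen detail updates sequentially.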
Next, concrete contents in which the augmented reality is provided using the
If the user of the
For example, if the user is interested in detailed information about a book a friend is reading, the user can execute the augmented reality application in the mobile terminal and point the camera at the book.
The application then reads the screen information and retrieves the title of the book, its author, publisher, review rating, and price from an Internet database for display.
Of course, Internet access through 3G/4G mobile communication or Wi-Fi is required. Most book information, except for rare books, can be retrieved this way.
Once the user knows the book information and decides to buy it, the book can be ordered on the Internet, but the user may prefer to browse nearby bookstores and buy it directly.
In this case, augmented reality map retrieval application can be utilized.
That is, GPS information of the
In addition, how to go to a nearby bookstore can be guided through augmented reality applications.
It is possible to provide a virtual navigation function when walking on the road as well as a car moving route, public transportation, and transfer information.
That is, when the
Also, if you have a book at a nearby bookstore and you want to stop at a quiet café to read slowly, you can also check if there is a decent cafe around in the augmented reality application.
Likewise, when the distance is illuminated by the
Also, the path to the desired café is indicated by an arrow, so you can easily find it.
There are a wide variety of fields in which augmented reality technology is applied as well as real life.
Nowadays, it is attracting attention in advertising and public relations. In other words, we can create a unique atmosphere by putting a virtual image on our products.
In addition, it is actively utilized in the field of TV broadcasting.
A typical example is a virtual weather map and an information graph behind a weather caster.
If the virtual display technology and the 3D stereoscopic technology develop further, then the case where the augmented reality can be applied is expected to be greatly expanded.
As described above, the augmented reality can be applied to the present invention.
Hereinafter, a first embodiment of the present invention using an augmented reality will be described in detail with reference to the drawings.
2A to 2C show a character formed to be integrated with a user that can be applied to the present invention.
More specifically, FIG. 2A is a view showing photographing of the user's appearance, including the face, using a camera of the mobile terminal; FIG. 2B is a drawing showing creation of a virtual character having the face of the user photographed by the mobile terminal; and FIG. 2C is a diagram illustrating receiving the user's personal information through the mobile terminal and completing the user's character so that it is synchronized with the user's appearance.
2A, an external appearance of a
The
2B, a
At this time, the face portion of the
2C, the body shape of the
At this time, examples of personal information about the user's body include bust circumference, waist circumference, hip circumference, back neck point to waist length, neck circumference, neck length, sleeve length, back neck point-shoulder point-wrist length, upper arm circumference, pants length, thigh circumference, ankle circumference, and the like.
For adult women, the items tend to additionally include the under-bust circumference and the front shoulder to waist length, in addition to the measurements listed above.
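One way to hold such a measurement record in code is a simple structured container. The field names below mirror the enumerated items but are assumptions, as is the toy waist-to-size mapping; the patent does not specify either:

```python
from dataclasses import dataclass

# Illustrative container for the body measurements listed above.
# Field names and the size_label mapping are assumed for the sketch.

@dataclass
class BodyMeasurements:
    bust_cm: float
    waist_cm: float
    hip_cm: float
    back_neck_to_waist_cm: float
    sleeve_length_cm: float
    pants_length_cm: float
    thigh_cm: float
    ankle_cm: float

    def size_label(self) -> str:
        """Toy mapping from waist circumference to a size label."""
        if self.waist_cm < 70:
            return "S"
        if self.waist_cm < 85:
            return "M"
        return "L"

# Example record entered through the touch screen (first information).
m = BodyMeasurements(90, 75, 95, 40, 58, 100, 55, 23)
```

A real fitting solution would feed these values into the character's 3D body model rather than a coarse size label, but the record shape is the same idea.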
3 is a diagram showing a character formed in a state integrated with a user by using an augmented reality on a touch screen by projecting the character on the off-line clothing.
Referring to FIG. 3, an external object can be photographed using a
The image including the
At this time, if necessary, the
On the other hand, FIG. 4 shows an example of displaying information on offline clothes on the touch screen using the augmented reality.
Referring to FIG. 4, first information related to an object photographed using the
The second information corresponding to the first information can be retrieved from the goods information database by transmitting the first information to the external goods information database through the
At this time, the second information may include price information for the object, inventory information for nearby stores, sales information on the online market, product reviews, the length and circumference of clothing, and dimensions such as length, height, and width for other objects.
The received second information may be displayed on the
Also, the sales information on the online market included in the second information may include a hyperlink to each online market selling the object, so that the user can easily access an online market to purchase it.
At this time, the online markets selling the object can be displayed in descending order of price using the price information included in the second information.
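A minimal sketch of ordering the market listings by price, each with its hyperlink as described above. The listing fields and example shops are invented for illustration:

```python
# Hypothetical online-market listings; fields and URLs are assumptions.
listings = [
    {"market": "Shop A", "price": 32000, "url": "https://a.example"},
    {"market": "Shop B", "price": 28000, "url": "https://b.example"},
    {"market": "Shop C", "price": 35000, "url": "https://c.example"},
]

def order_by_price(items, descending=True):
    """Return listings ordered by price; the display order is a UI choice."""
    return sorted(items, key=lambda x: x["price"], reverse=descending)

ordered = order_by_price(listings)                   # descending, as described
cheapest = order_by_price(listings, descending=False)[0]
```

Tapping a listing would then follow its hyperlink to the corresponding market page.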
In addition, if the
In addition, it may include a function of recognizing the taste of the
At this time, the fourth information may include price information of the object, inventory information of the surrounding area, sales information on the online market, evaluation of the product, and the like.
In addition, the sales information on the online market included in the fourth information may include a hyperlink to each online market selling the object, so that the user can easily access an online market to purchase it.
At this time, the online markets selling the object can be displayed in descending order of price using the price information included in the fourth information.
The
Meanwhile, the above function can be implemented through an application used in a smart phone.
Also, the
That is, the contents of the present invention can be applied only when the function of the present invention is not provided in a state in which a specific application is not executed, and an image is acquired in a state in which a specific application is executed.
When the mobile terminal is turned off or an error occurs while an image is being stored, the event occurrence point is stored in the memory, and when the mobile terminal is operated again, recording can be resumed from that point.
In this case, according to the control of the control unit, the event occurrence point at which the interruption occurred is stored in the memory.
Thereafter, when the power of the mobile terminal is turned on again, the operation can resume from the stored event occurrence point.
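The checkpoint-and-resume behaviour described above can be sketched as follows; the file name and checkpoint structure are illustrative assumptions.

```python
# Sketch of storing an event occurrence point when recording is
# interrupted, and resuming from that point on restart.
import json
import os

CHECKPOINT = "record_checkpoint.json"

def save_event_point(frame_index, path=CHECKPOINT):
    """Persist the point at which recording was interrupted."""
    with open(path, "w") as f:
        json.dump({"frame_index": frame_index}, f)

def resume_point(path=CHECKPOINT):
    """Return the saved frame index, or 0 when no checkpoint exists."""
    if not os.path.exists(path):
        return 0
    with open(path) as f:
        return json.load(f)["frame_index"]

save_event_point(1234)  # e.g. power loss occurred at frame 1234
print(resume_point())   # on restart, recording resumes from this point
```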
Meanwhile, a network capable of sharing information among a plurality of users installing specific applications providing the functions proposed by the present invention can be formed.
Here, as telecommunication technologies for forming the network, code division multiple access (CDMA), frequency division multiple access (FDMA), time division multiple access (TDMA), orthogonal frequency division multiple access (OFDMA), and single carrier frequency division multiple access (SC-FDMA) techniques may be used.
In addition, as short-range communication technologies for forming the network, at least one of wireless communication techniques such as Wi-Fi, Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), and Ultra Wideband (UWB) may be used.
When the above-described configuration of the present invention is applied, it is possible to provide the user with a mobile terminal capable of forming a virtual character modeled on the user and virtually wearing an object through the formed character so that the wearing style can be confirmed.
<Second Embodiment>
The second embodiment of the present invention creates a character that, in a virtual reality, wears clothing displayed on an online market on behalf of the user.
In the second embodiment, a virtual reality may be used. Here, virtual reality is also referred to as artificial reality, cyberspace, a virtual world, a virtual environment, a synthetic environment, or an artificial environment.
The purpose of using virtual reality is to allow people to see and manipulate an environment as if they were in it, without having to directly experience environments that are difficult to encounter in everyday life.
Conventional virtual reality (VR) is used in various fields such as the military, entertainment, medicine, learning, film, architectural design, and tourism. Applications include education, advanced programming, remote operation, remote satellite surface exploration, data analysis, and scientific visualization.
In a virtual reality system, human participants and real and virtual workspaces are interconnected by hardware. The system helps participants perceive, primarily visually, what is happening in the virtual environment, using hearing and touch as auxiliary senses.
Virtual reality is not real, but it creates a virtual space in which what is possible within a computer software program can be experienced as if it were real. For example, a virtual character can walk around a virtual space, select a favorite product in a favorite store, and pay for it electronically.
Virtual reality is mainly encountered through amusement in the real world, which gives people an easy opportunity to come into contact with it.
The foregoing has outlined the virtual reality that can be applied to the present invention. Hereinafter, a second embodiment of the present invention using a virtual reality will be described in detail with reference to the drawings.
FIG. 5 shows an example of using virtual reality to display on the touch screen a character, formed to resemble the user, wearing clothing displayed on an online market on behalf of the user.
Referring to FIG. 5, a character formed to resemble the user is displayed on the touch screen wearing clothing displayed on the online market on behalf of the user.
At this time, the first information may include external features of the object, such as its form, shape, and color.
For an adult woman, for example, the first information may include various measurements such as bust circumference, under-bust circumference, waist circumference, hip circumference, back neck point to waist, front shoulder to waist, neck circumference, neck length, sleeve length, back neck point-shoulder point-wrist length, upper arm circumference, pants length, thigh circumference, ankle circumference, and waist length.
For a hat, the first information may include the head circumference; for glasses and sunglasses, it may include the length and height of the frame and the distance between the lenses. For a bag, it may include the length, height, and width.
Through the first information, the object can be displayed on the character as if actually worn.
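The per-category measurements listed above could be organised as a simple schema; the category names follow the text, but the key names and validation logic are illustrative assumptions, not part of the disclosure.

```python
# Sketch of organising the "first information" by product category and
# checking that the measurements needed for a category are present.

REQUIRED_MEASUREMENTS = {
    "clothes": ["length", "circumference"],
    "hat": ["head_circumference"],
    "glasses": ["frame_length", "frame_height", "lens_distance"],
    "bag": ["length", "height", "width"],
}

def missing_measurements(category, info):
    """Return the measurement keys still missing for the given category."""
    return [k for k in REQUIRED_MEASUREMENTS[category] if k not in info]

# A bag with only length and height recorded still needs its width.
print(missing_measurements("bag", {"length": 30, "height": 20}))
```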
Meanwhile, FIG. 6 shows an example of displaying information on clothes displayed on the online market on a touch screen using a virtual reality.
Referring to FIG. 6, the first information on the clothes is transmitted to the goods information database, the second information corresponding to the first information is retrieved from the goods information database, and the received second information can be displayed on the touch screen together with the character.
At this time, the online markets selling the object can be displayed in descending order of price using the price information included in the second information.
In addition, if the
In addition, a function of recognizing the user's taste and displaying fourth information on a product recommended according to that taste may be included.
At this time, the fourth information may include price information of the object, inventory information of the surrounding area, sales information on the online market, evaluation of the product, and the like.
In addition, the sales information on the online market included in the fourth information may include a hyperlink to each online market selling the object, so that the user can easily access the online market for purchase.
At this time, the online markets selling the object can be displayed in descending order of price using the price information included in the fourth information.
Meanwhile, FIGS. 7A and 7B show the character and clothes in a virtual reality being displayed in a deformed manner according to the user's touch.
Referring to FIGS. 7A and 7B, the user can input a touch of a predetermined pattern on the touch screen.
For example, when an enlargement or reduction touch is input, the size of the displayed character and clothes can be enlarged or reduced accordingly.
Through this touch, the character and the clothes can be displayed deformed as the user desires.
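The enlargement and reduction touch described above can be sketched as a pinch-distance ratio applied to the character's display scale; the scale limits are illustrative assumptions.

```python
# Sketch of scaling the displayed character and clothes with a pinch
# gesture: the new scale is proportional to the ratio of the current
# finger distance to the starting finger distance, clamped to limits.

def apply_pinch(scale, start_distance, current_distance,
                min_scale=0.5, max_scale=3.0):
    """Return the new display scale after a pinch gesture."""
    new_scale = scale * (current_distance / start_distance)
    return max(min_scale, min(max_scale, new_scale))

print(apply_pinch(1.0, 100.0, 150.0))  # fingers spread: enlarge to 1.5
print(apply_pinch(1.0, 100.0, 50.0))   # fingers pinched: reduce to 0.5
```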
In addition, if the
In addition, a function of recognizing the user's taste and displaying fourth information on a product recommended according to that taste may be included.
In addition, the sales information on the online market included in the fourth information may include a hyperlink to each online market selling the object, so that the user can easily access the online market for purchase.
At this time, the online markets selling the object can be displayed in descending order of price using the price information included in the fourth information.
The
Meanwhile, the above functions can be implemented through an application running on a smartphone.
Also, the
That is, the functions of the present invention are not provided while the specific application is not running, and the contents of the present invention apply only when an image is acquired while the specific application is running.
When the mobile terminal is turned off or an error occurs while an image is being stored, the event occurrence point is stored in the memory, and when the mobile terminal is operated again, recording can be resumed from that point.
In this case, according to the control of the control unit, the event occurrence point at which the interruption occurred is stored in the memory.
Thereafter, when the power of the mobile terminal is turned on again, the operation can resume from the stored event occurrence point.
In addition, the
The virtual character generated in the mobile terminal can also be used in connection with a stationary terminal.
For example, when the user wants to confirm whether a garment shown in a broadcast, such as a home shopping broadcast on a stationary terminal such as a TV, suits him or her, the generated character can be displayed wearing that garment.
Meanwhile, a network capable of sharing information among a plurality of users installing specific applications providing the functions proposed by the present invention can be formed.
Here, as telecommunication technologies for forming the network, code division multiple access (CDMA), frequency division multiple access (FDMA), time division multiple access (TDMA), orthogonal frequency division multiple access (OFDMA), and single carrier frequency division multiple access (SC-FDMA) techniques may be used.
In addition, as short-range communication technologies for forming the network, at least one of wireless communication techniques such as Wi-Fi, Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), and Ultra Wideband (UWB) may be used.
<Third Embodiment>
According to another embodiment of the present invention, a virtual character is created using only basic body information and the user's face set in advance; when the user selects an article to be purchased, the body part of the user corresponding to the selected article is photographed, the virtual character is updated using the photographed body part, and the user can shop easily by confirming in advance how the updated character looks wearing the selected article.
That is, instead of capturing or inputting the user's entire body information from the beginning, a virtual character is created using only the most basic body information and the user's face, and only the body part corresponding to the article selected by the user is newly photographed to update the character. In this way, the user can select a product that suits him or her through a character wearing the product, while reducing the effort and time required to create the character.
Reference is made to Fig. 8 to illustrate the method proposed by the present invention.
FIG. 8 is a flowchart illustrating a specific example of shopping through a process of displaying a character, determined according to the user as proposed by the present invention, wearing a shopping object.
Referring to FIG. 8, first, a step (S110) of acquiring an image of the user through a camera is performed.
In step S110, it is sufficient to capture an image that includes only the user's face; the image need not show the user's entire body.
That is, as described above, the character is created using only the user's face and the basic body information to be described later, rather than using all of the user's body parts. Therefore, any image that includes the user's face can be used.
Next, step S120 of receiving first information related to the user's body through the touch screen is performed.
In step S120, rather than inputting all of the user's body information, only simple body information such as height, weight, and body type may be input.
In addition, the information to be input may be specified or changed in advance by the user according to the user's preference.
Thereafter, the step of recognizing the user's face in the image (S130) is performed.
That is, a step of extracting only the part corresponding to the user's face from the image acquired in step S110 is performed.
Accordingly, as described above, the image captured in step S110 may include only the face among the user's body parts.
In addition, a step (S140) of generating the first character through the control unit using the recognized face and the first information proceeds.
That is, a simple character corresponding to the face and the basic body information obtained in step S120 can be generated. Thus, the user can create, without much time and effort, a simple virtual character modeled on his or her own body.
Thereafter, the first character is displayed on the touch screen (S150).
In addition, a step (S160) in which the user selects, on the touch screen, the first object to be purchased may be performed.
Step S160 applies to both general online purchases and offline purchases.
That is, selecting the first object in step S160 includes selecting a product to be purchased at an online shopping mall, and selecting a product to be purchased from an image captured by photographing an offline product with the camera.
At this time, a step (S170) is performed in which the touch screen displays information requesting that the second information, which is body part information of the user related to the first object, be additionally photographed through the camera.
For example, if the product the user wants to purchase is pants, the user may be asked to photograph his or her legs in detail.
As another example, if the product the user wants to purchase is a t-shirt, the user may be asked to photograph his or her upper body in detail.
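The relation between a selected product and the body part requested in step S170 can be sketched as a simple mapping; only the pants/legs and t-shirt/upper body pairs come from the text, and the message wording and fallback are illustrative assumptions.

```python
# Sketch of step S170: map the selected product category to the body
# part the user is asked to photograph for updating the character.

BODY_PART_FOR_CATEGORY = {
    "pants": "legs",
    "t-shirt": "upper body",
}

def photograph_request(category):
    """Return the message asking the user to photograph a body part."""
    part = BODY_PART_FOR_CATEGORY.get(category)
    if part is None:
        # Unknown category: fall back to a generic request.
        return "Please photograph your whole body."
    return f"Please photograph your {part} in detail."

print(photograph_request("pants"))
```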
Thereafter, when the second information is obtained through step S170, a step of changing the shape of the first character to correspond to the second information (S180) is performed.
That is, the simple character generated in step S140 from the face and the basic body information obtained in step S120 is updated and displayed using the specific second information.
Updates of the character according to the second information can be accumulated.
For example, if the product the user wants to purchase is pants, the user's leg portion is specifically photographed through the camera, and the character is updated with respect to the leg portion.
In addition, while the updated character is maintained, if the next product the user intends to purchase is a t-shirt, the user's upper body is photographed in detail through the camera, and the character can be additionally updated with respect to the upper body.
Therefore, as the user continues shopping, a character that exactly matches the user's body can be completed.
This can increase the user's interest and further stimulate the desire to shop, and it also provides a better index for determining whether a selected product suits the user.
Thereafter, a step (S190) in which the first character, with its shape changed, is displayed on the touch screen wearing the first object is performed, and the user can decide whether to purchase the product through the changed character.
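The overall flow of steps S110 to S190, including the accumulation of body-part updates, can be sketched as follows; all class, field, and file names are illustrative assumptions, not part of the disclosure.

```python
# Sketch of steps S110-S190: build a simple character from the face and
# basic body information, then accumulate per-body-part updates as
# products are selected, so the character converges on the real body.

class VirtualCharacter:
    def __init__(self, face, height_cm, weight_kg):
        self.face = face            # S130: recognised face region
        self.height_cm = height_cm  # S120: basic body information
        self.weight_kg = weight_kg
        self.body_parts = {}        # accumulated detailed scans (S180)

    def update_body_part(self, part, scan):
        # Updates accumulate: earlier scans are kept while new parts
        # are added, refining the character as shopping continues.
        self.body_parts[part] = scan

    def try_on(self, product):
        # S190: render the product on the current character (stubbed).
        return f"character wearing {product} (detail: {sorted(self.body_parts)})"

c = VirtualCharacter(face="face.png", height_cm=170, weight_kg=60)
c.update_body_part("legs", "legs_scan.png")         # after selecting pants
c.update_body_part("upper body", "torso_scan.png")  # after selecting a t-shirt
print(c.try_on("jacket"))
```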
Reference is made to Fig. 9 to describe steps S150 through S190 in more detail.
FIGS. 9A to 9D are diagrams for explaining the concrete steps of determining the shape of the character corresponding to the user shown in FIG. 8.
FIG. 9(a) shows step S150, in which a simple character corresponding to the face and the basic body information obtained in step S120 is displayed.
FIG. 9(b) shows step S160, in which the user selects the first object to purchase on the touch screen. In FIG. 9(b), it is assumed that the user has selected a pants product.
In addition, FIG. 9(c) shows the user photographing the leg region of the body in response to a message requesting a detailed photograph of the user's legs through the camera.
FIG. 9(d) shows step S190, in which the character, with its leg portions updated using the leg-region information obtained in FIG. 9(c), is displayed.
The shopping method of FIG. 8 can also be applied when the user purchases a product for a pet. This will be described in detail with reference to FIG. 10.
FIG. 10 is a flowchart for explaining a concrete example of shopping through a process of displaying a character, determined according to the user's pet as proposed by the present invention, wearing a shopping object.
Referring to FIG. 10, first, a step (S210) of acquiring an image of the animal through a camera is performed.
In the same manner as in FIG. 8, an image including only the face portion of the animal may be acquired in Step S210.
Thereafter, step S220 of receiving first information related to the animal's body from the user through the touch screen is performed.
As in FIG. 8, it is not necessary to input all of the pet's body information; only basic information needs to be input.
It is also possible for the user to specify or modify the information to be input according to the user's preference.
Thereafter, step S230 of recognizing the animal's face in the image is performed.
In addition, a step S240 of generating a first character using the recognized face and the first information is performed.
Thereafter, the first character is displayed on the touch screen (S250), and the user selects the first object on the touch screen (S260).
At this time, in step S270, the touch screen displays information requesting that the second information, which is the body part information of the animal related to the first object, is additionally photographed through the camera.
In addition, a step (S280) of changing the shape of the first character to correspond to the second information and a step (S290) of displaying the changed first character on the touch screen wearing the first object are performed.
Therefore, it is possible to easily create a character for the user's pet, select a product that truly suits the pet, and sequentially complete a character that perfectly matches the pet through accumulated information.
The steps of completing the character using the accumulated information will be described in detail with reference to FIG. 11.
FIGS. 11(a) and 11(b) are diagrams for explaining the concrete steps of building a character using accumulated information, in the context of the present invention.
As in FIGS. 9(c) and 9(d), the user photographs the leg region in response to a message requesting a detailed photograph of the user's legs through the camera, and FIG. 11(a) shows the character with its leg portion updated using the obtained leg-region information.
FIG. 11(b) shows the user photographing the upper body region in response to a message requesting a detailed photograph of the user's upper body through the camera, and the character additionally updated with respect to the upper body portion.
<Fourth Embodiment>
According to another embodiment of the present invention, there is provided a method of dividing a display area of a display unit into a plurality of areas and displaying a plurality of characters in the divided areas.
FIG. 12 is a diagram for explaining how a display area is divided into a plurality of areas and a plurality of characters are displayed in divided areas to support shopping.
Referring to FIG. 12, the entire area of the display unit or touch screen is divided into four areas, and a character is displayed in each of the divided areas.
However, the division into four areas is merely an example for applying the present invention; the display may be divided into fewer or more than four areas.
Also, the user can select a plurality of articles and have the plurality of characters wear them differently, so that the user can choose the product that suits him or her best.
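The division of the display into a plurality of areas can be sketched as a simple grid computation; the geometry and region format are illustrative assumptions.

```python
# Sketch of dividing the display area into near-square regions so that
# one character can be shown in each region. FIG. 12 uses four areas,
# but any count works.
import math

def split_display(width, height, count):
    """Divide the screen into `count` regions, each as (x, y, w, h)."""
    cols = math.ceil(math.sqrt(count))
    rows = math.ceil(count / cols)
    w, h = width // cols, height // rows
    return [((i % cols) * w, (i // cols) * h, w, h) for i in range(count)]

print(split_display(1080, 1920, 4))  # four quadrant-like regions
```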
FIG. 13 is a diagram for explaining how shopping is made easier by comparing a plurality of characters wearing shopping items of the same category, in the context of the present invention.
Referring to FIG. 13, four virtual characters of the user are displayed wearing hats of different styles, and the user can examine the displayed characters to select the hat that suits him or her best.
FIG. 14 is a diagram for explaining how shopping is made easier by comparing a plurality of characters wearing shopping items of different categories, in the context of the present invention.
In FIG. 14, unlike FIG. 13, articles of different kinds, rather than the same kind, are mixed and worn by the characters; the worn appearances are compared with each other, and the article or articles that suit the user best are selected.
Accordingly, the user can select the product that suits him or her best by putting a plurality of articles of the same kind, or of different kinds, on characters modeled on the user and comparing them with each other.
<Fifth Embodiment>
According to another embodiment of the present invention, a user can share a character, through the wireless communication unit, with friends who form a network, to receive feedback on a selected item or to have another user pay on his or her behalf.
FIG. 15 is a diagram for explaining contents for facilitating an easy shopping using a plurality of characters corresponding to a plurality of users forming a network.
The user can establish a network with users of other mobile terminals using at least one of the short-range and long-distance communication technologies described above. The number of other users forming the network may be more than one.
Referring to FIG. 15, a plurality of characters, including the user's character and the characters of other users forming the network, are displayed.
Users forming the network can send feedback comments on the articles worn by each character, and a user who receives feedback comments can use them in deciding whether to purchase the article.
Furthermore, a user forming the network can pay for an item selected by another user.
FIG. 16 is a diagram for explaining a method by which the plurality of users in FIG. 15 can pay on one another's behalf.
Referring to FIG. 16, the user can select, among the characters corresponding to other users, a character whose item is to be paid for, and a message asking whether to proceed with the payment can be displayed on the terminal.
If the user selects "YES" in response to this message, the payment for the selected item can be made on the counterpart's behalf.
In this way, users forming a network can give gifts to one another by paying on each other's behalf.
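The payment-on-behalf (gift) flow described above can be sketched as follows; the confirmation prompt, user names, and receipt structure are illustrative assumptions, not part of the disclosure.

```python
# Sketch of the gift-payment flow: a network member confirms a prompt
# and pays for an item another user selected.

def pay_on_behalf(payer, recipient, item, confirm):
    """Ask the payer to confirm; if confirmed, record who paid for what."""
    answer = confirm(f"{payer}, pay for {recipient}'s {item}?")
    if answer != "YES":
        return None  # payment declined, nothing recorded
    return {"payer": payer, "recipient": recipient, "item": item}

# A network friend gifts the hat the other user selected; here the
# confirmation dialog is simulated by a lambda that answers "YES".
receipt = pay_on_behalf("Alice", "Bob", "hat", confirm=lambda msg: "YES")
print(receipt)
```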
When the above-described configuration of the present invention is applied, it is possible to provide the user with a mobile terminal capable of forming a virtual character modeled on the user and virtually wearing a garment through the formed character so that the wearing style can be confirmed.
Also, it is possible to provide the user with a function of analyzing the user's taste and recommending corresponding products, and a function of notifying the user through the mobile terminal when a product matching the user's taste becomes available.
In addition, according to the present invention, a character can wear products of the same or different categories, so that products accurately matching the user's taste can be purchased.
In addition, the present invention provides functions of forming a network with nearby friends, easily receiving feedback on a product worn by a character, deciding whether to purchase the product, and mutually paying for products selected by the friends forming the network.
The present invention can also be embodied as computer-readable code on a computer-readable recording medium. A computer-readable recording medium includes all kinds of recording apparatuses in which data readable by a computer system is stored. Examples of the computer-readable recording medium include ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage devices; it may also be implemented in the form of a carrier wave (for example, transmission over the Internet). In addition, the computer-readable recording medium may be distributed over network-connected computer systems so that the computer-readable code can be stored and executed in a distributed manner. Functional programs, code, and code segments for implementing the present invention can easily be inferred by programmers in the technical field to which the present invention belongs.
It should be noted that the above-described apparatus and method are not limited to the configurations and methods of the embodiments described above; the embodiments may be modified such that all or some of them are selectively combined.
Claims (12)
When the user selects a first object to be purchased on the touch screen,
Wherein,
Controlling the touch screen to display information requesting to photograph second information, which is body part information of the user related to the first object, through the camera,
Wherein the control unit changes the shape of the first character to correspond to the second information and controls the first character whose shape is changed to be displayed on the touch screen by wearing the first object.
When the user selects a first object to be purchased on the touch screen,
Wherein,
Controlling the touch screen to display information requesting that the second information, which is the body part information of the animal related to the first object, is further photographed through the camera,
Wherein the control unit changes the shape of the first character to correspond to the second information and controls the first character whose shape is changed to be displayed on the touch screen by wearing the first object.
Wherein the first object and the second information are plural,
And a memory for storing the plurality of second information,
Wherein the control unit accumulates and changes the shape of the first character so as to correspond to a plurality of second information stored in the memory.
Wherein,
The entire area of the touch screen is divided into a plurality of areas,
And controls each of the plurality of first characters to be displayed in the separated area.
Wherein the first object is a plurality of objects,
Wherein the control unit controls each of the plurality of first characters to be displayed by wearing each of the plurality of first objects.
Wherein the plurality of first objects are the same category or different categories.
And a wireless communication unit for transmitting third information related to the first character to the outside,
Wherein the wireless communication unit receives information related to the second character from the outside.
Wherein the wireless communication unit receives, from the outside, feedback information on a first object included in the first character, and transmits, to the outside, feedback information on a first object included in the second character.
Wherein the first object and the character are plural,
Wherein,
And controls each of the plurality of first characters to be transmitted to the outside through the wireless communication unit by wearing each of the plurality of first objects.
Wherein the external user who has received the third information is able to pay for the first object included in the first character,
Wherein the user is able to pay for the first object included in the second character.
Receiving first information related to the user's body through a touch screen;
Recognizing the face of the user in the image;
Generating a first character using the recognized face and first information; And
And displaying the first character on the touch screen,
When the user selects a first object to be purchased on the touch screen,
The touch screen displaying information requesting to further photograph second information, which is body part information of the user related to the first object, through the camera;
Changing a shape of the first character to correspond to the second information; And
The first character having the changed shape is displayed on the touch screen by wearing the first object.
Receiving first information related to the body of the animal from a user through a touch screen;
Recognizing the animal's face in the image;
Generating a first character using the recognized face and first information; And
And displaying the first character on the touch screen,
When the user selects a first object to be purchased on the touch screen,
The touch screen displaying information requesting to further photograph second information, which is body part information of the animal associated with the first object, through the camera;
Changing a shape of the first character to correspond to the second information; And
The first character having the changed shape is displayed on the touch screen by wearing the first object.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150121801A KR101731718B1 (en) | 2015-08-28 | 2015-08-28 | Mobile terminal using virtual fitting solution for shopping and method for controlling thereof |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150121801A KR101731718B1 (en) | 2015-08-28 | 2015-08-28 | Mobile terminal using virtual fitting solution for shopping and method for controlling thereof |
Publications (2)
Publication Number | Publication Date |
---|---|
KR20170025406A true KR20170025406A (en) | 2017-03-08 |
KR101731718B1 KR101731718B1 (en) | 2017-04-28 |
Family
ID=58403547
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020150121801A KR101731718B1 (en) | 2015-08-28 | 2015-08-28 | Mobile terminal using virtual fitting solution for shopping and method for controlling thereof |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR101731718B1 (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2020077835A1 (en) * | 2018-10-15 | 2020-04-23 | 广东美的白色家电技术创新中心有限公司 | Product display method, apparatus, and system |
KR20220040051A (en) * | 2020-09-23 | 2022-03-30 | 주식회사 넥스트키 | Apparel wearing system based on face application, and methoe thereof |
KR20220105306A (en) * | 2021-01-20 | 2022-07-27 | 강승진 | Virtual wardrobe-based apparel sales application, method, and apparatus therefor |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20040042324A (en) | 2002-11-14 | 2004-05-20 | 조성억 | System and method for avatar service |
-
2015
- 2015-08-28 KR KR1020150121801A patent/KR101731718B1/en active IP Right Grant
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20040042324A (en) | 2002-11-14 | 2004-05-20 | 조성억 | System and method for avatar service |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2020077835A1 (en) * | 2018-10-15 | 2020-04-23 | 广东美的白色家电技术创新中心有限公司 | Product display method, apparatus, and system |
KR20220040051A (en) * | 2020-09-23 | 2022-03-30 | 주식회사 넥스트키 | Apparel wearing system based on face application, and methoe thereof |
KR20220105306A (en) * | 2021-01-20 | 2022-07-27 | 강승진 | Virtual wardrobe-based apparel sales application, method, and apparatus therefor |
Also Published As
Publication number | Publication date |
---|---|
KR101731718B1 (en) | 2017-04-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111652678B (en) | Method, device, terminal, server and readable storage medium for displaying article information | |
US20220301231A1 (en) | Mirroring device with whole-body outfits | |
US11694280B2 (en) | Systems/methods for identifying products for purchase within audio-visual content utilizing QR or other machine-readable visual codes | |
KR101894021B1 (en) | Method and device for providing content and recordimg medium thereof | |
TWI567670B (en) | Method and system for management of switching virtual-reality mode and augmented-reality mode | |
US9395875B2 (en) | Systems, methods, and computer program products for navigating through a virtual/augmented reality | |
US20150170256A1 (en) | Systems and Methods for Presenting Information Associated With a Three-Dimensional Location on a Two-Dimensional Display | |
US9830388B2 (en) | Modular search object framework | |
US11061533B2 (en) | Large format display apparatus and control method thereof | |
KR20160067373A (en) | System of giving clothes wearing information with AVATA and operating method thereof | |
WO2010121110A1 (en) | Apparatus, systems, and methods for a smart fixture | |
JP6720385B1 (en) | Program, information processing method, and information processing terminal | |
EP2940607A1 (en) | Enhanced search results associated with a modular search object framework | |
US20220327747A1 (en) | Information processing device, information processing method, and program | |
KR101731718B1 (en) | Mobile terminal using virtual fitting solution for shopping and method for controlling thereof | |
EP4217953A1 (en) | Providing ar-based clothing in messaging system | |
CN113393290A (en) | Live broadcast data processing method and device, computer equipment and medium | |
KR20230122642A (en) | 3D painting on eyewear devices | |
WO2020189341A1 (en) | Image display system, image distribution method, and program | |
JP7458363B2 (en) | Information processing device, information processing method, and information processing program | |
JP7458362B2 (en) | Information processing device, information processing method, and information processing program | |
JP2018160227A (en) | Browsing system and program | |
Hasan et al. | Augmented Reality in E-commerce sites of Bangladesh | |
KR20240056477A (en) | Device and method for allowing transaction of point of interest to enter metaverse and purchase activity in metaverse store | |
KR20230156467A (en) | Clothes sale method using augmented reality |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
A201 | Request for examination | ||
E902 | Notification of reason for refusal | ||
E902 | Notification of reason for refusal | ||
E701 | Decision to grant or registration of patent right | ||
GRNT | Written decision to grant |