CN113902466A - Unmanned store interaction method, unmanned store and storage medium - Google Patents
- Publication number: CN113902466A
- Application number: CN202111055963.8A
- Authority
- CN
- China
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/02—Marketing; Price estimation or determination; Fundraising
- G06Q30/0281—Customer communication at a business location, e.g. providing product or service information, consulting
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3179—Video signal processing therefor
Abstract
The application relates to the technical field of unmanned stores, and in particular to an unmanned store interaction method, an unmanned store and a storage medium. The method comprises: projecting first information, wherein the first information comprises first commodity information and key information, and the key information indicates a function key on the function keyboard; and projecting second information in response to the user pressing the function key, wherein the second information comprises description information of the first information. Embodiments of the application provide a means of interaction between a user and an unmanned store and a way for the user to learn more about commodities: by pressing keys on the function keyboard, the user can obtain further commodity information, which can improve the user's shopping experience.
Description
Technical Field
The application relates to the technical field of unmanned stores, in particular to an unmanned store interaction method, an unmanned store and a storage medium.
Background
The operation flow in existing unmanned stores is handled intelligently and automatically by technical means, with little or no manual intervention, which makes shopping more convenient. However, existing unmanned stores rely only on simple intelligent processing equipment and programs to provide basic shopping instructions; when a user wants to learn about a particular commodity in detail, hardly any consultation help is available in the store, resulting in an unpleasant shopping experience.
Disclosure of Invention
Embodiments of the application provide an unmanned store interaction method, an unmanned store and a storage medium, with which a user can obtain commodity information by pressing keys on a function keyboard, improving the user's shopping experience.
In a first aspect, an embodiment of the present application provides an unmanned store interaction method, which is applied to an unmanned store, where the unmanned store is provided with a projection device and a function keyboard, and the method includes:
projecting first information, wherein the first information comprises first commodity information and key information, and the key information is used for indicating a function key on the function keyboard;
projecting second information in response to the user pressing the function key;
wherein the second information includes descriptive information of the first information.
In some embodiments, the method further comprises:
projecting third information in response to the user pressing the function key, wherein the third information comprises second commodity information;
wherein the second commodity information and the first commodity information belong to different commodities.
In some embodiments, the description information comprises an animated demonstration of the commodity content.
In some embodiments, the animated demonstration comprises at least one of an installation demonstration, a disassembly demonstration, a commodity deformation demonstration and a commodity part demonstration.
In some embodiments, the unmanned store further comprises a voice device, the method further comprising:
and when the projected second information comprises the animated demonstration, playing an explanatory voice through the voice device.
In some embodiments, the method further comprises:
projecting a virtual shopping guide;
and issuing a voice introduction of the commodity of interest in coordination with the actions of the virtual shopping guide.
In some embodiments, the function keyboard further comprises an activation key, and the projecting first information comprises:
and projecting the first information in response to the user pressing the activation key.
In a second aspect, an embodiment of the present application provides an unmanned store, including:
the projection device is used for projecting a virtual image;
a function keyboard, the function keyboard comprising at least one function key;
a control system;
the control system includes:
at least one processor, connected to the projection device and the function keyboard respectively; and
a memory communicatively coupled to the at least one processor, the memory storing instructions executable by the at least one processor to enable the at least one processor to perform the method described above.
In some embodiments, the unmanned store further comprises:
and a voice device, connected to the processor, for voice recognition and voice output.
In a third aspect, the present application provides a readable storage medium on which a program or instructions are stored; when the program or instructions are executed by a processor, the method described above is implemented.
The beneficial effects of the embodiments of the application are as follows. Unlike the prior art, an embodiment of the application can project, in the unmanned store, first information comprising first commodity information and key information, the key information indicating function keys on the function keyboard; when a user presses a function key, the unmanned store projects second information comprising description information of the first information, through which the user can learn more about the commodity. Embodiments of the application thus provide a means of interaction between the user and the unmanned store and a way for the user to learn more about commodities: by pressing keys on the function keyboard, the user can obtain further commodity information, improving the shopping experience.
Drawings
To illustrate the technical solutions of the embodiments of the present application more clearly, the drawings required by the embodiments are briefly described below. Obviously, the drawings described below show only some embodiments of the application, and a person skilled in the art can derive other drawings from them without inventive effort.
FIG. 1 is a schematic diagram of an unmanned store to which embodiments of the present application are applied;
FIG. 2 is a schematic diagram of the structure of an appliance part of an unmanned shop to which an embodiment of the present application is applied;
FIG. 3 is a schematic diagram of a function keyboard in an unmanned shop, to which an embodiment of the present application is applied;
FIG. 4 is a schematic diagram of a hardware structure of a control system in an unmanned shop to which an embodiment of the present application is applied;
FIG. 5 is a flow chart of an unmanned store interaction method according to an embodiment of the present application;
FIG. 6 is a schematic diagram of first information in an embodiment of the unmanned store interaction method of the present application;
FIG. 7 is a schematic diagram of first information in another embodiment of the unmanned store interaction method of the present application;
FIG. 8 is a diagram illustrating third information in an embodiment of the unmanned store interaction method of the present application;
FIG. 9 is a flowchart of an unmanned store interaction method according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used in the description of the present application is for the purpose of describing particular embodiments only and is not intended to be limiting of the present application.
In the description of the present application, it is to be understood that the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Further, references to "horizontal", "vertical", etc. indicate orientations or positional relationships based only on those shown in the drawings; they are used for describing the application or for convenience of description, do not indicate or imply that the referenced device or element must have a particular orientation or be constructed and operated in a particular orientation, and therefore should not be construed as limiting the application.
The unmanned store interaction method provided by embodiments of the present application can be applied to an unmanned store. Fig. 1 shows a building structure of an unmanned store 100, which generally forms an internal space and an external space in a building form such as a house. The internal space is used for storing commodities and has an entrance and an exit, through which a user can enter the internal space to shop autonomously.
Fig. 2 shows a schematic configuration diagram of the electrical part of the unmanned store. As shown in fig. 2, the unmanned store 100 includes a projection device 10, a function keyboard 20, and a control system 30. The projection device 10 and the function keyboard 20 are each communicatively coupled to the control system 30.
The projection device 10 is used to project an image onto a projection bearing surface for display, where the projection bearing surface may be any suitable plane or curved surface, such as a wall surface. The projection device 10 may be any suitable projection apparatus having a projection function, such as a projector, and may utilize any suitable projection technology, such as CRT, LCD, DLP, or DLV technology. The projection device typically includes a light source, a lens, and the like.
The projection device 10 may project various suitable virtual images in the unmanned store, such as commodity information, commodity recommendation information, further description information of the commodity information, virtual animals, virtual shopping guides, and the like.
The function keyboard 20 includes a plurality of function keys. When a function key is pressed, the function keyboard generates a key signal; after receiving the key signal, the control system 30 identifies the function key corresponding to the key signal and can control the projection device 10 to project the commodity information corresponding to that function key.
The function keyboard 20 may further include an activation key. When the activation key is pressed, the function keyboard generates another key signal; after receiving it, the control system 30 identifies the activation key and starts to control the projection device 10 to project the commodity information.
The function keyboard 20 may further include a translation key for switching languages, so that the projected text is displayed in a language the user understands.
Fig. 3 shows a structure of a function keyboard, and in the embodiment shown in fig. 3, the function keyboard 20 includes an activation key, a translation key, and 10 function keys. In other embodiments, the function keyboard may also include more keys, or omit some of the keys.
The function keyboard may be a device similar to a conventional keyboard, such as a mechanical keyboard, and may include touch switches and a key circuit: when a touch switch is pressed, its contact closes, the key circuit conducts, and the key circuit generates a key signal.
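The key-signal path just described (touch switch closes, key circuit conducts, a key signal reaches the control system) can be sketched as follows. This is a minimal illustrative model, not the patent's implementation; the class and field names are assumptions.

```python
class FunctionKeyboard:
    """Illustrative model of the key circuit: pressing a key emits a key signal."""

    def __init__(self, on_key_signal):
        # Callback standing in for the control system 30 receiving signals.
        self.on_key_signal = on_key_signal

    def press(self, key_id):
        # Touch switch contact closes -> key circuit conducts -> key signal.
        self.on_key_signal({"type": "key_signal", "key": key_id})


received = []
keyboard = FunctionKeyboard(received.append)
keyboard.press("key_2")
assert received == [{"type": "key_signal", "key": "key_2"}]
```

The callback makes the sketch easy to test in isolation; in a real system the control system would subscribe to the keyboard's signal line instead.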
The control system 30 is a control center of the unmanned shop 100, and is configured to coordinate various components of the unmanned shop 100 to implement functions of the unmanned shop 100. The control system 30 may be a single controller, or may include a plurality of controllers, and in the case where a plurality of controllers are included, the control system 30 may be a combination of controllers provided in each component (e.g., a function keyboard, a projector).
Fig. 4 exemplarily shows the hardware structure of the control system 30. As shown in fig. 4, the controller includes a memory 31 and a processor 32.
The memory 31, as a non-volatile computer-readable storage medium, may be used to store non-volatile software programs and non-volatile computer-executable program instructions. The memory 31 may include a program storage area and a data storage area: the program storage area may store an operating system and an application program required for at least one function, while the data storage area may store data created according to the use of the terminal, and the like.
Further, the memory 31 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid state storage device. In some embodiments, the memory 31 may optionally include memory located remotely from the processor 32, which may be connected to the terminal over a network.
Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The processor 32 connects the various parts of the entire unmanned store 100 using various interfaces and lines, and performs the functions of the unmanned store 100 and processes data by running or executing software programs stored in the memory 31 and calling data stored in the memory 31, for example to implement an interaction method or shopping guide method described in an embodiment of the present application.
The processor 32 may be one or more, and one processor 32 is illustrated in fig. 4. The processor 32 and the memory 31 may be connected by a bus or other means, such as the bus connection shown in fig. 4.
The processor 32 may include a Central Processing Unit (CPU), a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) device, or the like. The processor 32 may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.
In other embodiments, the unmanned store 100 may further include a camera device for capturing images and transmitting the images to the control system 30, so that the control system 30 performs image recognition and the like, such as identity confirmation recognition (including face recognition), commodity recognition, location recognition and the like, based on the images. The camera device may be any suitable device having image capturing capabilities, such as a camera, monitor, etc.
The unmanned store 100 may also include a voice device communicatively coupled to the control system 30 for emitting voice or recognizing voice, which may include, for example, a microphone, a speaker, and a voice recognition chip, among others.
The unmanned store may further include a rotating device fixedly coupled to the projection device 10 and communicatively coupled to the control system 30. The rotating device drives the projection device 10 to rotate so that the projected virtual image moves in real space. The rotating device may be any suitable device with a rotating function, such as a pan-tilt head, and can adjust the angle of the projection device, for example its horizontal angle and pitch angle.
The rotating device generally includes a body and a driving mechanism, a transmission mechanism, a motor, etc. disposed on the body. Where the rotating device is used to adjust both the horizontal angle and the pitch angle of the projection device, the motor may include a horizontal motor and a pitch motor.
The driving mechanism may be communicatively connected to the control system 30, receive its control signal and generate a driving signal based on it; the motor rotates under the driving signal, moving the transmission mechanism, which in turn moves the projection device, for example to adjust its horizontal and pitch angles.
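The drive chain above (control signal, driving signal, motors, new orientation) can be modeled with a small sketch. The pitch limit and angle conventions here are assumptions for illustration, not values from the patent.

```python
class RotatingDevice:
    """Illustrative pan-tilt mount carrying the projection device."""

    PITCH_LIMIT_DEG = 45.0  # assumed mechanical limit, not from the patent

    def __init__(self):
        self.horizontal_deg = 0.0  # driven by the horizontal motor
        self.pitch_deg = 0.0       # driven by the pitch motor

    def apply_control_signal(self, d_horizontal, d_pitch):
        # Control signal -> driving signal -> motor rotation -> transmission
        # mechanism -> new orientation of the projection device.
        self.horizontal_deg = (self.horizontal_deg + d_horizontal) % 360.0
        self.pitch_deg = max(-self.PITCH_LIMIT_DEG,
                             min(self.PITCH_LIMIT_DEG, self.pitch_deg + d_pitch))


mount = RotatingDevice()
mount.apply_control_signal(30.0, -10.0)
assert (mount.horizontal_deg, mount.pitch_deg) == (30.0, -10.0)
```

Wrapping the pan angle and clamping the pitch keeps commanded motion inside the mount's assumed mechanical range.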
It will be understood by those skilled in the art that the above is only an example of the hardware structure of the unmanned store 100, and in practical applications, more components may be provided for the unmanned store 100 according to the actual functional requirements, and of course, one or more components may be omitted according to the functional requirements.
In practical applications, the projection device and the camera device can be placed at the top of the internal space of the unmanned store or on a wall surface, and the function keyboard can be placed on a table on which commodities are placed.
At present, an unmanned store only relies on simple intelligent processing equipment and programs to provide simple shopping instructions for a user, and when the user wants to know a certain commodity in detail in the store, the user cannot obtain corresponding help.
According to embodiments of the application, a projection device and a function keyboard are arranged in the unmanned store. The projection device projects first information comprising first commodity information and key information, the key information indicating function keys on the function keyboard. When the user presses a function key, the unmanned store projects second information comprising description information of the first commodity information, through which the user can learn more about the commodity.
The embodiment of the present application provides an unmanned store interaction method, which may be applied to an unmanned store, such as the unmanned store 100 in fig. 1, and as shown in fig. 5, the method includes:
101: and projecting first information, wherein the first information comprises first commodity information and key information, and the key information is used for indicating function keys on the function keyboard.
In some embodiments, images in front of a commodity shelf can be captured in real time by the camera device and transmitted to the control system; when the control system determines through image recognition that a user is staying in front of a certain commodity shelf, it controls the projection device to project the first information for that commodity.
In other embodiments, an activation key may be provided on the function keyboard: when the user presses it, the function keyboard generates the corresponding key signal, and after receiving the signal the control system controls the projection device to project the first information of the commodity in response to the user's pressing operation.
The first commodity information includes, for example, the commodity's place of production, manufacturer, shelf life, material content, nutrient content, number of parts, deformation modes, and the like, and gives the user a rough idea of the commodity. The key information indicates the function keys on the function keyboard, for example the function corresponding to each key, and tells the user which function key to press to obtain further information on a particular aspect of the commodity.
102: and projecting second information in response to the pressing operation of the user on the function key, wherein the second information comprises description information of the first information.
The second information includes further description of the first information; the first information generally only gives the user a rough idea of the commodity and often cannot satisfy a user who wants to understand it comprehensively. Because the key information is displayed in the first information, the user knows the function corresponding to each function key and can press the corresponding key to generate a key signal; after receiving the key signal, the control system switches the projection picture of the projection device so that it projects the second information, through which the user can learn more about the commodity.
The second information usually describes some piece of commodity information in the first information in more detail. For example, if the first commodity information states that the commodity has two deformation modes, the description information may describe those two deformation modes further, for example as a projection picture showing each one specifically.
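The binding between key information in the first projection and the description content projected as second information can be sketched as a small dispatch table. The key IDs and content strings below are hypothetical examples in the spirit of the Fig. 6 embodiment, not data from the patent.

```python
# First information: commodity info plus the key mapping projected with it.
FIRST_INFO = {
    "commodity": "Toy A",
    "key_info": {
        "key_3": "part_display",         # "part display - key 3"
        "key_2": "deformation_display",  # "deformation display - key 2"
    },
}

# Second information: description content bound to each function.
SECOND_INFO = {
    "part_display": "animation: shape of each building block",
    "deformation_display": "animation: the two deformation modes of Toy A",
}


def on_function_key(key_id):
    """Resolve a pressed function key to the second information to project."""
    content_id = FIRST_INFO["key_info"].get(key_id)
    return SECOND_INFO.get(content_id, "no content bound to this key")


assert on_function_key("key_2") == "animation: the two deformation modes of Toy A"
```

Keeping the key mapping inside the first information mirrors the patent's idea that the projected key information itself tells the user what each key will show.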
In some embodiments, the description information included in the second information may be an animated demonstration of the commodity content, such as a commodity demonstration, a commodity part demonstration, a commodity deformation demonstration, an installation demonstration, a disassembly demonstration, and the like. When a user wants to purchase a commodity, information such as place of production and brand provides only auxiliary reference; the user cares more about the commodity content itself, and an animated demonstration presents that content in a very intuitive way for the user's reference.
A commodity demonstration means demonstrating the commodity inside its packing carton. Commodities often have external packaging, and at present a user can only learn about the commodity from the picture on that packaging; the embodiment of the application shows the user a virtual image of the commodity through the projection device, so that the user can understand the commodity intuitively.
A commodity part demonstration means projecting each part of a commodity through the projection device; for example, when the commodity is a building-block set, the shape of each block can be projected.
The commodity deformation demonstration refers to various deformation modes of the commodity projected by the projection device.
The installation demonstration and the disassembly demonstration refer to the installation mode and the disassembly mode of projecting commodities through a projection device.
In addition to the animated demonstration, the commodity content may also be presented statically.
Fig. 6 illustrates a form of the first information, taking a toy A (e.g., a building-block set) as an example. In the embodiment shown in fig. 6, the first commodity information (including place of production, manufacturer, shelf life, production date, parts and deformation modes) is projected on the projection picture, together with the key information (including "part display — key 3" and "deformation display — key 2"). The key information tells the user that detailed part information can be obtained by pressing key 3, and the specific deformation modes can be seen by pressing key 2.
After the projection device projects the first information shown in fig. 6, if the user wants to know more about the commodity, for example its deformation modes, the user can press key 2 on the function keyboard. After receiving the key signal corresponding to key 2, the control system 30 switches the projection content so that the projection device projects the two deformation modes of toy A, which can be displayed alternately at an interval, for example 10 s. Through this virtual display the user can intuitively see the commodity's various deformation states. In this embodiment, the second information is the projection picture of each deformation mode of toy A.
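The timed alternation described above (two deformation modes, switching every 10 seconds) can be sketched by deriving the current frame from elapsed time rather than sleeping, which keeps the logic testable. Frame names and the two-frame cycle are illustrative assumptions.

```python
DEFORMATION_FRAMES = ["deformation_mode_1", "deformation_mode_2"]
INTERVAL_S = 10  # switching interval from the 10 s example above


def frame_at(elapsed_s):
    """Return the deformation frame the projector shows after elapsed_s seconds."""
    index = (elapsed_s // INTERVAL_S) % len(DEFORMATION_FRAMES)
    return DEFORMATION_FRAMES[index]


assert frame_at(0) == "deformation_mode_1"
assert frame_at(10) == "deformation_mode_2"
```

A real controller would call something like this from a display loop; computing the frame from a clock also keeps the cycle stable if a frame is dropped.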
In the unmanned store, embodiments of the application project first information comprising first commodity information and key information, the key information indicating function keys on the function keyboard; when the user presses a function key, the unmanned store projects second information comprising description information of the first information, through which the user can learn more about the commodity. This provides a means of interaction between the user and the unmanned store and a way for the user to learn more about commodities: by pressing keys on the function keyboard, the user can obtain further commodity information, improving the shopping experience.
In other embodiments, to provide more reference and help the user understand the commodity better, the control system may also play an explanatory voice for the commodity through the voice device while displaying the commodity content, so that the user learns about the commodity from image and sound at the same time.
In other embodiments, the control system may further control the projection device to project a virtual shopping guide, and have the voice device issue a voice introduction in coordination with the actions of the virtual shopping guide.
Specifically, the control system may project a display screen of the commodity content, such as an animated demonstration of the installation mode, through one projection device, and project the virtual shopping guide through another projection device, while also controlling the voice device to issue the corresponding voice.
For example, the control system controls the virtual shopping guide projected by the projection device to point at the display screen while controlling the voice device to issue a voice introduction of the display screen (for example, when explaining the installation mode of the commodity), so that the actions of the virtual shopping guide match the voice introduction.
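The two-projector-plus-voice coordination described above can be sketched as a single dispatch step. All names here (`present_with_guide`, the `project` and `play` methods) are hypothetical stand-ins for whatever interfaces the control system exposes.

```python
def present_with_guide(content, content_projector, guide_projector, voice):
    """Coordinate commodity display, virtual guide, and voice introduction."""
    content_projector.project(content)                  # e.g. installation animation
    guide_projector.project("guide points at display")  # virtual shopping guide gesture
    voice.play(f"introduction: {content}")              # voice matched to the gesture
```

The point of the sketch is the ordering: both projections and the voice output are driven from one call site, which is how the gesture stays synchronized with the introduction.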
The virtual shopping guide may take a human or an animal form.
Various commodities are placed on the shelves. Each commodity may use its own function keyboard, through which its information is acquired, or several commodities may share one function keyboard. In some embodiments of the present application, for cost reasons, a plurality of commodities share one function keyboard, and the interaction method further comprises:
responding to the pressing operation of the user on the function key, and projecting third information, wherein the third information comprises second commodity information;
the second commodity information and the first commodity information belong to different commodities.
In this embodiment, the key information of the first information includes key information indicating other commodities, for example, that a certain key switches the display to another commodity. Fig. 7 shows another form of the first information; the embodiment shown in Fig. 7 further includes the key information "Switch to toy B — key 0".
The third information comprises second commodity information. When the user presses the corresponding function key, the control system, after receiving the key signal corresponding to that function key, controls the projection device to project the third information. Again taking Fig. 7 as an example, when the user presses key 0, the projection device projects information of toy B. Fig. 8 shows one form of the third information.
A correspondence between each function keyboard and its commodities can be established, for example, function keyboard 1 corresponding to toys A to E. The correspondence between the information to be displayed for each commodity and the function keys can be established in advance; when the projection device projects commodity information, this correspondence can be shown through the first information, so that the user learns the functions of the keys from the first information and can obtain the desired information by pressing the corresponding key.
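The two correspondence tables described above can be sketched as plain lookup tables. The table contents below are illustrative assumptions drawn from the Fig. 6/Fig. 7 examples, not a prescribed data model.

```python
# Which commodities each function keyboard serves (keyboard 1 -> toys A–E,
# as in the example above).
KEYBOARD_COMMODITIES = {1: ["toy A", "toy B", "toy C", "toy D", "toy E"]}

# Pre-established (commodity, key) -> information-to-display mapping,
# mirroring the key information shown in Figs. 6 and 7.
KEY_TO_INFO = {
    ("toy A", 3): "part display",
    ("toy A", 2): "deformation display",
    ("toy A", 0): "switch to toy B",
}

def info_for_key(commodity, key):
    """Look up what to project when `key` is pressed while `commodity` is shown."""
    return KEY_TO_INFO.get((commodity, key), "unknown key")
```

Projecting the first information then amounts to rendering the entries of `KEY_TO_INFO` for the current commodity, so the user sees exactly the keys that are live.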
It should be noted that "first" and "third" in the first information and the third information are used only for convenience, to indicate that the two belong to different commodities, and are not intended as a limitation. For example, relative to toy B, the third information is toy B's first information; it is the third information only relative to toy A.
The embodiments of the present application further provide an unmanned store shopping guide method, which may be applied to an unmanned store such as the unmanned store 100 in Fig. 1. As shown in Fig. 9, the method includes:
901: and obtaining an interest commodity and an interest user corresponding to the interest commodity.
The commodity of interest is a commodity in which a user is interested, and the interested user is the person interested in that commodity.
In some embodiments, the commodity of interest and its corresponding interested user may be determined from a user's selection operation on a commodity.
For example, information on at least one commodity may be displayed outside the unmanned store, such as hot-selling products, special offers, or other in-store commodity information projected outside the store or shown on a display screen, from which the user may select one or more commodities. That user is then an interested user, and the commodity the user selects is a commodity of interest.
For example, the commodity information is projected outside the store, and the user performs a selection operation on the projected screen (for example, pointing at a certain commodity by hand). The camera device captures the user's selection action, and the control system recognizes the selection through image recognition, extracts the user's face features and the commodity features, and records them. The record, stored in the memory of the control system, contains at least the face features, corresponding to the interested user, and the commodity features, corresponding to the commodity of interest.
902: and projecting a first virtual animal by using a first projection device, and controlling the first virtual animal to move towards the direction of the interest commodity, wherein the first virtual animal is used for indicating the position of the interest commodity for the interest user.
When a user enters the unmanned store, the camera device acquires an image of the user and transmits it to the control system, which extracts the user's face features and compares them with the stored face features of interested users. If the match succeeds, the user is an interested user; the features of the corresponding commodity of interest are then retrieved and the position of that commodity is determined.
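The record-and-match flow of steps 901–902 can be sketched as below. The feature vectors, the cosine-similarity comparison, and the threshold value are all illustrative assumptions; the patent does not specify how face features are represented or matched.

```python
import numpy as np

MATCH_THRESHOLD = 0.9  # assumed cosine-similarity threshold
records = []           # each entry: {"face": vector, "commodity": name}

def record_interest(face_feature, commodity):
    """Store a (face feature, commodity feature) record, as in step 901."""
    records.append({"face": np.asarray(face_feature, float),
                    "commodity": commodity})

def match_entering_user(face_feature):
    """Compare an entering user's face against stored records (step 902).

    Returns the commodity of interest on a successful match, else None.
    """
    f = np.asarray(face_feature, float)
    for r in records:
        sim = float(f @ r["face"] /
                    (np.linalg.norm(f) * np.linalg.norm(r["face"])))
        if sim >= MATCH_THRESHOLD:
            return r["commodity"]
    return None
```

On a successful match, the control system would look up the commodity's shelf position and start the virtual-animal guidance described next.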
The control system controls the first projection device to project a first virtual animal, presents it with an animation effect, and moves it toward the commodity of interest so as to indicate the commodity's position to the interested user. Specifically, the rotating device can be controlled to rotate, driving the projection device so that the first virtual animal moves. The first virtual animal may be any suitable animal figure, such as a panda, a koala, or a lizard.
According to this embodiment of the present application, the first projection device projects a first virtual animal and controls it to move toward the commodity in which the user is interested, thereby indicating the commodity's position to the user. This effectively helps the user find the commodity, and guiding by means of a virtual animal is novel and engaging, improving the user's shopping experience.
Because an unmanned store usually contains a plurality of shelves, the virtual image projected by the first projection device is easily blocked by the shelves, so that it cannot cover every corner of the store. For example, when the first projection device is mounted on one wall, if the interested user walks around a corner toward another wall, the first virtual animal may be blocked by a shelf and no longer be fully visible to the interested user.
Accordingly, in some embodiments, the unmanned store shopping guide method further comprises:
and acquiring the position of the interested user, projecting a second virtual animal by using a second projection device when the position of the interested user is positioned at the corner of the unmanned store, and controlling the second virtual animal to move towards the direction of the interested commodity, wherein the second virtual animal is used for indicating the position of the interested commodity for the interested user.
Specifically, the camera device can acquire images of the interested user in real time and transmit them to the control system, which determines through image recognition whether the user is at a corner of the unmanned store.
When the interested user is located at a corner of the unmanned store, the control system controls the second projection device to project a second virtual animal and controls it to move toward the commodity of interest, so as to continue indicating the commodity's position to the user.
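The corner hand-off described above amounts to choosing which projection device presents the virtual animal based on the user's recognized zone. The zone names and device identifiers below are hypothetical placeholders for whatever the image-recognition step reports.

```python
# Assumed zone -> projection-device table: each corner is covered by the
# device whose beam is not blocked by shelves there.
PROJECTOR_FOR_ZONE = {
    "corner_1": "projection_device_2",
    "corner_2": "projection_device_3",
}

def projector_for(user_zone, default="projection_device_1"):
    """Pick the projection device that should present the virtual animal."""
    return PROJECTOR_FOR_ZONE.get(user_zone, default)
```

With such a table, guidance continues seamlessly: as the user turns each corner, the control system re-queries `projector_for` and hands the virtual animal to the next device.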
It can be understood that if, on reaching another corner, the interested user has still not reached the commodity of interest, the control system may control a third projection device to project a third virtual animal and have it continue guiding the interested user until the commodity of interest is reached.
The figure of the second virtual animal may be the same as or different from that of the first virtual animal, and may likewise be any suitable animal figure, such as a panda, a koala, or a lizard.
While moving toward the commodity of interest, the interested user may be attracted by other commodities and stop to look at them. In some embodiments, to further improve the guiding effect, the method further includes:
controlling the first virtual animal or the second virtual animal to stop moving when the interested user stops walking; and/or
controlling the first virtual animal or the second virtual animal to continue moving toward the commodity of interest when the interested user starts walking again.
That is, when the interested user stops walking, the first or second virtual animal also stops, and when the interested user starts walking again, it also resumes moving, presenting the effect of the virtual animal accompanying and guiding the interested user to the commodity of interest.
In practical application, images of the interested user can be acquired in real time and transmitted to the control system. When the control system determines through image recognition that the interested user has stopped, it controls the rotating device to stop rotating; when it determines that the user is walking again, it controls the rotating device to start rotating so that the virtual animal (the first virtual animal, the second virtual animal, or another virtual animal, hereinafter simply the virtual animal) starts moving.
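The accompany-and-guide behaviour above reduces to mirroring the user's walking state onto the rotating device. A minimal sketch, assuming a `rotator` object with hypothetical `start`/`stop` methods and a walking state reported by image recognition:

```python
def update_guide(user_state, rotator):
    """Start or stop the rotating device according to the user's state.

    The rotating device drives the projection device, so stopping it
    pauses the virtual animal alongside the user.
    """
    if user_state == "stopped":
        rotator.stop()   # virtual animal waits with the user
    elif user_state == "walking":
        rotator.start()  # virtual animal resumes toward the commodity
```

Calling `update_guide` on every recognized state change yields the pause-and-resume companionship effect without any extra bookkeeping.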
In other embodiments, the shopping guide method further comprises:
and responding to the input operation of a user, projecting a third virtual animal, and controlling the third virtual animal to move towards the outlet direction, wherein the third virtual animal is used for indicating the outlet position for the user.
When the user has finished shopping and wants to leave the unmanned store, the user can press a certain key (for example, a leave-indication key provided on the function keyboard). The key generates a key signal, upon receipt of which the control system controls the projection device to project a third virtual animal and move it toward the exit, indicating the exit direction to the user.
The third virtual animal may likewise be any suitable animal figure, such as a panda, a koala, or a lizard.
According to the embodiments of the present application, projecting a virtual animal through the projection device provides a vivid, three-dimensional figure that attracts the user's attention and gives the user a clear sense of direction when guiding. The virtual animal may exhibit different behavioral states, such as crawling, lifting a leg, shaking its head, and the like.
Embodiments of the present application further provide a computer-readable storage medium storing computer-executable instructions which, when executed by one or more processors, such as the processor 31 in Fig. 4, cause the one or more processors to perform the shopping guide method or the interaction method in any of the above method embodiments, for example, method steps 101 to 102 in Fig. 5 and method steps 901 to 902 in Fig. 9.
Embodiments of the present application also provide a computer program product comprising a computer program stored on a non-volatile computer-readable storage medium, the computer program comprising program instructions which, when executed by a machine, cause the machine to perform the aforementioned shopping guide method or interaction method, for example, method steps 101 to 102 in Fig. 5 and method steps 901 to 902 in Fig. 9.
Finally, it should be noted that the above embodiments are only used to illustrate the technical solutions of the present application, not to limit them. Within the concept of the present application, the technical features of the above embodiments or of different embodiments may be combined, the steps may be implemented in any order, and many other variations of the different aspects of the present application exist which, for brevity, are not provided in detail. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some technical features equivalently replaced, and that such modifications or substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application.
Claims (10)
1. An unmanned store interaction method applied to an unmanned store, wherein the unmanned store is provided with a projection device and a function keyboard, and the method comprises the following steps:
projecting first information, wherein the first information comprises first commodity information and key information, and the key information is used for indicating a function key on the function keyboard;
responding to the pressing operation of the user on the function key, and projecting second information;
wherein the second information includes descriptive information of the first information.
2. The method of claim 1, further comprising:
responding to the pressing operation of the user on the function key, and projecting third information, wherein the third information comprises second commodity information;
the second commodity information and the first commodity information belong to different commodities.
3. The method of claim 1 or 2, wherein the descriptive information comprises an animated demonstration of the commodity content.
4. The method of claim 3, wherein the animated demonstration comprises at least one of an installation demonstration, a disassembly demonstration, a commodity deformation demonstration, and a commodity component demonstration.
5. The method of claim 3, wherein the unmanned store further comprises a voice device, the method further comprising:
and when the projected second information comprises the animation demonstration, playing the explaining voice through the voice device.
6. The method according to claim 1 or 2, characterized in that the method further comprises:
projecting a virtual shopping guide;
and sending out the voice introduction of the interested commodities by matching with the action of the virtual shopping guide.
7. The method of claim 1, wherein the function keyboard further comprises an activation key, and wherein projecting the first information comprises:
and projecting first information in response to the operation of pressing the starting key by the user.
8. An unmanned store, comprising:
the projection device is used for projecting a virtual image;
a function keyboard, the function keyboard comprising at least one function key;
a control system;
the control system includes:
at least one processor, respectively connected with the projection device and the function keyboard; and
a memory communicatively coupled to the at least one processor, the memory storing instructions executable by the at least one processor to enable the at least one processor to perform the method of any of claims 1-7.
9. The unmanned store of claim 8, further comprising:
and the voice device is connected with the processor and is used for voice recognition and voice output.
10. A readable storage medium, on which a program or instructions are stored, which when executed by a processor, implement the method of any one of claims 1-7.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111055963.8A CN113902466A (en) | 2021-09-09 | 2021-09-09 | Unmanned store interaction method, unmanned store and storage medium |
PCT/CN2021/136491 WO2023035442A1 (en) | 2021-09-09 | 2021-12-08 | Self-service store interaction method, and self-service store and storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111055963.8A CN113902466A (en) | 2021-09-09 | 2021-09-09 | Unmanned store interaction method, unmanned store and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN113902466A true CN113902466A (en) | 2022-01-07 |
Family
ID=79028066
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111055963.8A Pending CN113902466A (en) | 2021-09-09 | 2021-09-09 | Unmanned store interaction method, unmanned store and storage medium |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN113902466A (en) |
WO (1) | WO2023035442A1 (en) |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102411749A (en) * | 2011-08-24 | 2012-04-11 | 厦门市鼎朔信息技术有限公司 | Virtual guiding system based on positioning information and network display terminal |
JP2017059175A (en) * | 2015-09-18 | 2017-03-23 | 株式会社日本総合研究所 | Merchandise comparing terminal, control method for the same, and merchandise comparing program |
CN106663277A (en) * | 2014-03-13 | 2017-05-10 | 电子湾有限公司 | Interactive displays based on user interest |
CN109959386A (en) * | 2018-09-29 | 2019-07-02 | 大连艾米移动科技有限公司 | A kind of supermarket shopping projection navigation system |
CN110021062A (en) * | 2018-01-08 | 2019-07-16 | 佛山市顺德区美的电热电器制造有限公司 | A kind of acquisition methods and terminal, storage medium of product feature |
CN110378773A (en) * | 2019-07-26 | 2019-10-25 | 织网(上海)互联网科技有限公司 | Merchandise display processing method, device, storage medium and plateform system |
CN111199443A (en) * | 2018-11-20 | 2020-05-26 | 北京京东尚科信息技术有限公司 | Commodity information processing method, commodity information processing device and computer-readable storage medium |
WO2020232615A1 (en) * | 2019-05-20 | 2020-11-26 | 深圳市欢太科技有限公司 | Information recommendation method and apparatus, and electronic device and storage medium |
WO2021088889A1 (en) * | 2019-11-04 | 2021-05-14 | 青岛海信激光显示股份有限公司 | Display system, display method, and computing device |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6070149A (en) * | 1998-07-02 | 2000-05-30 | Activepoint Ltd. | Virtual sales personnel |
CN109224442B (en) * | 2018-09-03 | 2021-06-11 | 腾讯科技(深圳)有限公司 | Data processing method and device for virtual scene and storage medium |
Also Published As
Publication number | Publication date |
---|---|
WO2023035442A1 (en) | 2023-03-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109085966B (en) | Three-dimensional display system and method based on cloud computing | |
CN111970533B (en) | Interaction method and device for live broadcast room and electronic equipment | |
CA2997034C (en) | Method and apparatus for playing video content from any location and any time | |
WO2019216419A1 (en) | Program, recording medium, augmented reality presentation device, and augmented reality presentation method | |
US20160267577A1 (en) | Holographic interactive retail system | |
EP3937154A1 (en) | Method for video interaction and electronic device | |
CN110888532A (en) | Man-machine interaction method and device, mobile terminal and computer readable storage medium | |
CN108038726A (en) | Article display method and device | |
WO2019221952A1 (en) | Viewing a virtual reality environment on a user device | |
WO2014190221A1 (en) | Object display with visual verisimilitude | |
CN108805766B (en) | AR somatosensory immersive teaching system and method | |
EP3655204A1 (en) | Electronic device capable of moving and operating method thereof | |
CN111857335A (en) | Virtual object driving method and device, display equipment and storage medium | |
CN109714647B (en) | Information processing method and device | |
US20160048311A1 (en) | Augmented reality context sensitive control system | |
CN112269553B (en) | Display system, display method and computing device | |
CN113902466A (en) | Unmanned store interaction method, unmanned store and storage medium | |
CN108022134A (en) | A kind of unmanned shopping guide's display systems for integrating sensing, voice, picture, video, light, scene | |
CN113888206A (en) | Unmanned store shopping guide method, unmanned store and storage medium | |
WO2018121542A1 (en) | Operation method and apparatus for service object, and electronic device | |
JP2022095625A (en) | System, method, and program for creating video | |
TW202002665A (en) | Intelligent product introduction system and method thereof | |
US11449451B2 (en) | Information processing device, information processing method, and recording medium | |
JP2020103685A (en) | Moving image distribution system, moving image distribution method, and moving image distribution program | |
CN112764522A (en) | Electronic advertisement display equipment and electronic advertisement display method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||