CN110895769A - Information display method, device, system and storage medium - Google Patents

Information display method, device, system and storage medium

Info

Publication number
CN110895769A
CN110895769A
Authority
CN
China
Prior art keywords
target user
user
data
image
consumption behavior
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201811070731.8A
Other languages
Chinese (zh)
Other versions
CN110895769B (en)
Inventor
陈煌
王骏
唐方
周燕
蔡孟媛
沈俊佚
郭奥申
廖志伟
徐远同
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Alibaba Group Holding Ltd
Original Assignee
Alibaba Group Holding Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Alibaba Group Holding Ltd filed Critical Alibaba Group Holding Ltd
Priority to CN201811070731.8A
Publication of CN110895769A
Application granted
Publication of CN110895769B
Legal status: Active (current)
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00 Commerce
    • G06Q30/02 Marketing; Price estimation or determination; Fundraising
    • G06Q30/0201 Market modelling; Market analysis; Collecting market data
    • G06Q30/06 Buying, selling or leasing transactions
    • G06Q30/0601 Electronic shopping [e-shopping]
    • G06Q30/0631 Item recommendations

Landscapes

  • Business, Economics & Management (AREA)
  • Finance (AREA)
  • Accounting & Taxation (AREA)
  • Engineering & Computer Science (AREA)
  • Development Economics (AREA)
  • Strategic Management (AREA)
  • General Business, Economics & Management (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Economics (AREA)
  • Marketing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Game Theory and Decision Science (AREA)
  • Data Mining & Analysis (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of the present application provide an information display method, device, system, and storage medium. In the information display system, a camera installed in a physical store captures images containing the behavior tracks of users in the store; data related to a target user is then acquired based on the captured images and presented in an animated manner on a display terminal installed in the physical store, thereby increasing the interest of offline shopping and enriching the offline consumption flow.

Description

Information display method, device, system and storage medium
Technical Field
The present application relates to the field of internet technologies, and in particular, to an information display method, device, system, and storage medium.
Background
At present, the offline consumption flow is single: a user selects commodities, settles the bill, and leaves. During this process, the offline merchant has no way to summarize the user's consumption behavior, make it more engaging, or guide it. Therefore, a solution is yet to be proposed.
Disclosure of Invention
Aspects of the present application provide an information display method, device, system, and storage medium for increasing interest of offline shopping and enriching the flow of offline consumption.
An embodiment of the present application provides an information display system, including: a camera and a display terminal arranged in a physical store; the camera is used for capturing images containing user behavior tracks in the physical store and sending the images to the display terminal; and the display terminal is used for determining a target user, acquiring data related to the target user according to the images, and presenting the data related to the target user in an animated (dynamic-effect) manner.
An embodiment of the present application further provides an information display method, including: capturing images containing user behavior tracks in a physical store; determining a target user, and acquiring data related to the target user according to the images; and presenting the data related to the target user in an animated manner.
An embodiment of the present application further provides a display terminal, including: a memory, a processor, and a display component; the memory is configured to store one or more computer instructions; the processor is configured to execute the one or more computer instructions to: acquire an image containing a user behavior track in a physical store; determine a target user, and acquire data related to the target user according to the image; and present the data related to the target user in an animated manner.
Embodiments of the present application further provide a computer-readable storage medium storing a computer program, where the computer program can execute the steps in the information display method provided in the embodiments of the present application.
In the embodiments of the present application, based on a camera arranged in a physical store, images containing the behavior tracks of users in the physical store can be captured; data related to a target user is then acquired based on the captured images and presented in an animated manner on a display terminal arranged in the physical store, thereby increasing the interest of offline shopping and enriching the offline consumption flow.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
FIG. 1 is a schematic diagram of an information display system according to an exemplary embodiment of the present application;
fig. 2a is a schematic diagram of global tracking according to another exemplary embodiment of the present application;
FIG. 2b is a schematic diagram of an information display system according to another exemplary embodiment of the present application;
FIG. 2c is a schematic diagram of a time-machine energy-charging animation provided by an exemplary embodiment of the present application;
FIG. 2d is a schematic diagram of a time machine presenting the consumption behavior data most strongly associated with the consumption intention of a target user according to another exemplary embodiment of the present application;
FIG. 2e is a schematic diagram of a time machine presenting consumption behavior data weakly associated with the consumption intention of a target user according to another exemplary embodiment of the present application;
FIG. 2f is a schematic diagram of a time machine presenting the consumption behavior data least associated with the consumption intention of a target user according to an exemplary embodiment of the present application;
FIG. 2g is a schematic diagram of a time machine presenting an overview page according to another exemplary embodiment of the present application;
fig. 3 is a schematic flowchart of an information display method according to an embodiment of the present application;
fig. 4 is a schematic structural diagram of a display terminal according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the technical solutions of the present application will be described in detail and completely with reference to the following specific embodiments of the present application and the accompanying drawings. It should be apparent that the described embodiments are only some of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Aiming at the technical problem that the existing offline consumption flow is single, in some embodiments of the present application, based on a camera arranged in a physical store, images containing the behavior tracks of users in the physical store are captured, data related to a target user is acquired based on the captured images, and the data is presented in an animated manner through a display terminal arranged in the physical store, so that the interest of offline shopping can be increased and the offline consumption flow enriched. The technical solutions provided by the embodiments of the present application are described in detail below with reference to the accompanying drawings.
Fig. 1 is a schematic structural diagram of an information display system according to an exemplary embodiment of the present application, and as shown in fig. 1, the information display system 100 includes: a camera 10 and a display terminal 20 installed in a physical store.
The physical stores include physical supermarkets, physical convenience stores, physical shopping malls, unmanned stores and the like. In this embodiment, the camera 10 is mainly used for acquiring an image including a user behavior track in a physical store, and sending the acquired image to the display terminal 20. Wherein the behavior trace comprises: all behaviors (such as commodity browsing behaviors, commodity purchase behaviors, specific commodity area standing behaviors and the like) and motion tracks (such as browsing sequences, walking paths and the like in different commodity areas) of a user in a physical store. The number of the cameras 10 may be one or plural. When the number of the cameras 10 is plural, the plural cameras can be uniformly distributed at various positions of the physical store, and the shooting range of the cameras can be ensured to completely cover the area which can be reached by the user in the physical store. For example, multiple cameras may be mounted at the top of a brick and mortar store, on shelves, at shelf aisles, at doorways, and so forth, respectively. The camera 10 may be implemented as a general camera or as a camera having a function of rotating up, down, left, and right and capable of performing a target tracking function.
In an alternative embodiment, the camera 10 includes a plurality of cameras without a target tracking function, the plurality of cameras may send respective captured images to the display terminal 20, the display terminal 20 performs feature analysis on the received images, and determines a behavior track of the user in the brick and mortar store according to a result of the feature analysis.
In another alternative embodiment, the camera 10 includes a plurality of cameras with a target tracking function, each camera can perform global tracking on behavior tracks of different users in the physical store, capture images including the behavior tracks of the users, send the images to the display terminal 20, perform feature analysis by the display terminal 20 according to the received images, and determine the behavior tracks of the users in the physical store according to results of the feature analysis.
The two embodiments are optional embodiments of the present application, and will be specifically described in the following examples.
The display terminal 20 is used for determining a target user, acquiring data related to the target user according to the images sent by the camera 10, and presenting the data related to the target user in an animated manner. The target user refers to one or more users currently in the physical store. The data related to the target user may include any data associated with the target user, such as identification data of the target user, membership data, historical consumption data, favorites data, shopping data at the physical store, and possibly favored commodity data, which is not limited in this embodiment. The animated manner includes one of, or a combination of, visual effects, auditory effects, and sensory effects, so as to provide the target user with an interesting data presentation process.
In practice, the display terminal 20 may be implemented as a smart device having a display function and a communication function. There may be one or more display terminals 20. Typically, one display terminal 20 includes at least one processing unit, at least one memory, and a display. The numbers of processing units and memories depend on the configuration and type of the display terminal 20. The display may include a screen, mainly for displaying various types of data, such as data related to the target user. Optionally, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gesture operations on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect information such as the duration and pressure related to the touch or slide operation.
The memory may include volatile memory, such as RAM, non-volatile memory, such as Read-Only Memory (ROM) and flash memory, or both. The memory typically stores an Operating System (OS), one or more application programs, and may also store program data and the like.
Optionally, in addition to the processing unit, the memory and the display, the display terminal 20 further includes some basic configurations, such as a network card chip, an IO bus, an audio/video component, a communication component, and the like. Optionally, the display terminal 20 may further include some peripheral devices, such as a keyboard, a mouse, an input pen, a printer, a motion sensing device, a virtual reality device, and the like, which are not described herein again.
In the information display system 100, the camera 10 and the display terminal 20 may be connected by wire or wireless communication to implement the image transmission and reception process described in the above embodiment. For example, the communication connection is made through a wireless/wired switch within a local area network, or the communication connection is made through a mobile network. When the mobile network is connected through communication, the network format of the mobile network may be any one of 2G (gsm), 2.5G (gprs), 3G (WCDMA, TD-SCDMA, CDMA2000, UTMS), 4G (LTE), 4G + (LTE +), WiMax, and the like. Optionally, when the physical store occupies a small space, the camera 10 and the display terminal 20 may be communicatively connected through communication methods such as bluetooth, ZigBee, infrared ray, LoRA, and the like, which is not limited in this embodiment.
In this embodiment, based on the camera arranged in the physical store, images containing the behavior tracks of users in the physical store can be captured; data related to the target user is then acquired based on the captured images and presented in an animated manner through the display terminal arranged in the physical store, thereby increasing the interest of offline shopping and enriching the offline consumption flow.
In an alternative embodiment, in which the display terminal 20 performs feature analysis according to the image captured by the camera 10 and determines the behavior track of the user in the physical store according to the result of the feature analysis, the camera 10 is fixedly installed at the top, on the shelf, at the shelf aisle, and at the entrance and exit of the physical store.
Fig. 2a illustrates the principle of this embodiment by taking the user a as an example. When the user a enters the physical store, the camera installed at the entrance of the physical store can acquire the biometric features of the user a and send the biometric features to the display terminal 20 for storage. When the display terminal 20 receives the images including the behavior tracks of the respective users captured by the camera 10, the images including the behavior tracks of the a user can be determined therefrom based on the biometric features of the a user. Then, the position change information, the bone motion information, the face orientation information, and the like of the user are recognized from the image including the behavior trajectory of the user a, the behavior trajectory of the user a is determined, and the correspondence relationship between the biometric feature of the user a and the behavior trajectory is established.
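By way of illustration only (this sketch is not part of the original disclosure), the following Python code shows one way the association between biometric features registered at the entrance and per-frame detections could be organized to build a behavior track. All class and function names, the feature comparison, and the matching threshold are hypothetical assumptions rather than the patented implementation.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

@dataclass
class TrackPoint:
    timestamp: float
    position: Tuple[float, float]   # position in store-floor coordinates
    facing_deg: float               # face orientation, in degrees

@dataclass
class UserTrack:
    user_id: str
    points: List[TrackPoint] = field(default_factory=list)

class BehaviorTracker:
    """Matches per-frame detections against biometric features registered at the entrance."""

    def __init__(self, match_threshold: float = 0.8):
        self.registered: Dict[str, List[float]] = {}   # user_id -> biometric feature vector
        self.tracks: Dict[str, UserTrack] = {}
        self.match_threshold = match_threshold

    def register_at_entrance(self, user_id: str, feature: List[float]) -> None:
        self.registered[user_id] = feature
        self.tracks[user_id] = UserTrack(user_id)

    def ingest_detection(self, feature: List[float], point: TrackPoint) -> None:
        # Compare the detection against every registered user and extend the best-matching track.
        best_id, best_score = None, 0.0
        for user_id, ref in self.registered.items():
            score = similarity(ref, feature)
            if score > best_score:
                best_id, best_score = user_id, score
        if best_id is not None and best_score >= self.match_threshold:
            self.tracks[best_id].points.append(point)

def similarity(a: List[float], b: List[float]) -> float:
    # Placeholder comparison: dot product of (assumed) unit-normalised feature vectors.
    return sum(x * y for x, y in zip(a, b))
```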
In another alternative embodiment, in which the display terminal 20 performs feature analysis on the image captured by the camera 10 and determines the behavior track of the user in the physical store according to the result of the feature analysis, the camera 10 may be implemented as a dynamically movable image capturing device, for example, a robot having a visual tracking function, which can lock the user and perform tracking shooting on the user. In this embodiment, after the user a enters the physical store, the display terminal 20 may allocate an available camera a 'to the user a, track the user a all the way through the camera a', capture an image including the behavior of the user a in the physical store, and send the image to the display terminal 20. The display terminal 20 determines the behavior of the user a according to the image sent by the camera a ', and may determine the motion trajectory of the user a according to the motion trajectory of the camera a'. Optionally, in this embodiment, when the camera 10 is implemented as a robot with visual tracking, the robot may further have a shopping guide function, a commodity inquiry function, and the like, which will not be described in detail.
In some exemplary embodiments, the manner in which the display terminal 20 determines the target user includes:
mode 1: the user included in the received image is identified in position, and the identified user located within the setting range of the display terminal 20 is set as the target user. The images captured by the camera 10 may contain the behavior tracks of the various users. After receiving the image sent by the camera 10, the display terminal 20 may determine the real-time position of each user according to the behavior track of the user included in the image, and display data related to the target user in an effective manner by taking the user located within the setting range of the display terminal 20 as the target user. The setting range of the display terminal 20 refers to a distance range convenient for viewing the content displayed on the display terminal 20, and may be, for example, a range defined by a semicircle having the center of the display terminal 20 and a radius of 1 m.
Alternatively, the display terminal 20 may obtain the biological characteristics of each user in advance, for example, when each user enters a physical store, the camera 10 obtains the biological characteristics of each user separately, and establishes an association relationship between the biological characteristics of each user and data related to the user. When there is a user in the setting range of the display terminal 20, the display terminal 20 may obtain the biometric features of the user according to the image captured by the camera 10, and based on the association relationship, determine the user matched with the biometric features as the target user and obtain data related to the target user, thereby ensuring that the displayed data is matched with the user located in the setting range of the display terminal 20. For example, when the a-user is within the setting range of the display terminal 20, the display terminal 20 will present the data related to the a-user.
Mode 2: and when the voice awakening instruction is received, taking the user sending the voice awakening instruction as a target user.
The voice wake-up instruction may include specific voice content, so that the display terminal 20 can recognize the voice wake-up instruction among different voice signals. For example, the content of the voice wake-up instruction may be: "Time machine, start!" Optionally, the specific voice content included in the voice wake-up instruction may be customized by the user, which is not described in detail.
In this embodiment, the display terminal 20 may collect voice samples of each user in advance, extract corresponding voiceprint features from the voice samples, and establish an association relationship between the user and data related to the user based on the voiceprint features. When the display terminal 20 receives the voice wake-up instruction, it may obtain the voiceprint features included in the voice wake-up instruction, determine the identity of the user who sent the voice wake-up instruction based on the association relationship, determine the data related to the user who sent the voice wake-up instruction, and further ensure that the displayed data matches the user who sent the voice wake-up instruction.
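The following sketch (not part of the patent text) illustrates, under simplifying assumptions, the two determination modes above: Mode 1 selects users whose latest tracked position falls inside the terminal's set range, and Mode 2 identifies the speaker of the wake-up instruction by comparing a voiceprint feature against pre-collected samples. The wake phrase, the thresholds, and the feature comparison are all hypothetical.

```python
import math
from typing import Dict, List, Optional, Tuple

TERMINAL_POS = (0.0, 0.0)   # assumed terminal position in store-floor coordinates
SET_RANGE_M = 1.0           # "set range": semicircle of radius 1 m in front of the screen
WAKE_PHRASE = "time machine, start"   # assumed wake phrase; the patent allows custom phrases

def users_in_set_range(latest_positions: Dict[str, Tuple[float, float]],
                       screen_facing_deg: float = 0.0) -> List[str]:
    """Mode 1: pick users whose latest position lies in the half-disc in front of the screen."""
    targets = []
    for user_id, (x, y) in latest_positions.items():
        dx, dy = x - TERMINAL_POS[0], y - TERMINAL_POS[1]
        if math.hypot(dx, dy) > SET_RANGE_M:
            continue
        # relative bearing of the user, normalised to (-180, 180]
        bearing = (math.degrees(math.atan2(dy, dx)) - screen_facing_deg + 180.0) % 360.0 - 180.0
        if -90.0 <= bearing <= 90.0:
            targets.append(user_id)
    return targets

def identify_speaker(transcript: str, voiceprint: List[float],
                     enrolled: Dict[str, List[float]], threshold: float = 0.75) -> Optional[str]:
    """Mode 2: if the wake phrase is heard, match the voiceprint against enrolled users."""
    if WAKE_PHRASE not in transcript.lower():
        return None
    best_id, best_score = None, 0.0
    for user_id, ref in enrolled.items():
        num = sum(a * b for a, b in zip(ref, voiceprint))
        den = math.sqrt(sum(a * a for a in ref)) * math.sqrt(sum(b * b for b in voiceprint))
        score = num / den if den else 0.0
        if score > best_score:
            best_id, best_score = user_id, score
    return best_id if best_score >= threshold else None

# Example: user "A" standing 0.5 m in front of the screen becomes the target user.
print(users_in_set_range({"A": (0.5, 0.1), "B": (3.0, 2.0)}))   # -> ['A']
```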
In some other alternative embodiments, the display terminal 20 may also present the data related to the target user in an animated manner when a specific wake-up condition is met. Optionally, the specific wake-up condition may be that a predetermined wake-up time arrives, or that a wake-up message from another device is received, or that a trigger operation by the user on the touch screen or a peripheral hardware device of the display terminal 20 is received, which is not limited in this embodiment.
In addition to the system architecture and functions described in the above embodiment, as shown in fig. 2b, the information display system 100 provided in this embodiment further includes: an online transaction server 30. The online transaction server 30 is used for storing online identity information and online consumption behavior data of registered users. In this embodiment, the display terminal 20 is further configured to obtain a network account number registered by the user on the online transaction server 30, and obtain at least one of online identity information and online consumption behavior data of the user from the online transaction server 30 according to the network account number.
The way for the display terminal 20 to obtain the network account number registered on the online transaction server 30 includes:
mode 1: the display terminal 20 presents on its display screen a corresponding information code (e.g., a barcode or a two-dimensional code) for identifying the display terminal 20. The user scans the information code through an online consumption application program installed on an intelligent terminal device (such as a mobile phone, a tablet computer and the like) held by a person. After the online consumption application scans the information code, the information code and the network account number that the user has logged in on the online application are sent to the online transaction server 30. Then, the online transaction server 30 may send the network account number to the display terminal 20 corresponding to the information code. In some embodiments, the online transaction server 30 may further obtain online consumption data corresponding to the network account number, and send the online consumption data to the display terminal 20.
Mode 2: the display terminal 20 may acquire a biometric feature (e.g., a face feature or an iris feature) of the user through the camera 10, and request the online transaction server 30 to acquire a network account number matching the biometric feature based on the biometric feature. In this embodiment, the online transaction server 30 stores a correspondence between the biometric of the user and the network account number. After receiving the biometric features sent by the display terminal 20, the online transaction server 30 may query a network account number matched with the received biometric features based on the correspondence relationship, and send the network account number to the display terminal 20.
Mode 3: the display terminal 20 may obtain a network account number and a corresponding password registered on the online transaction server 30 and input by the user through a peripheral input device (e.g., a keyboard or a touch panel) provided thereon, and transmit the network account number and the corresponding password to the online transaction server 30 for verification. In this embodiment, an account password input interface may be displayed on the display terminal 20, and the user may be prompted to input corresponding content. In some embodiments, the display terminal 20 may request the online transaction server 30 for the online consumption behavior data corresponding to the user identified by the network account number according to the network account number and the password input by the user.
Of course, the above listed embodiments are only provided for exemplary illustration, and in practice, other manners for acquiring the network account of the user are included, which are not described herein again.
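To make Mode 1 above more concrete, the following sketch (purely illustrative, not part of the disclosure) models the information-code binding flow: the terminal registers a code with the online transaction server 30, the shopper's app scans it while logged in, and the server pushes the logged-in account back to the terminal that displayed the code. The class names and the use of a UUID as the information code are assumptions.

```python
import uuid

class OnlineTransactionServer:
    """Toy stand-in for server 30: resolves (info code, logged-in account) to a terminal."""
    def __init__(self):
        self.code_to_terminal = {}

    def register_terminal(self, terminal) -> str:
        info_code = str(uuid.uuid4())          # the code shown on the terminal's screen
        self.code_to_terminal[info_code] = terminal
        return info_code

    def scan(self, info_code: str, account: str) -> None:
        terminal = self.code_to_terminal.get(info_code)
        if terminal is not None:
            terminal.receive_account(account)  # push the account to the matching terminal

class DisplayTerminal:
    def __init__(self, server: OnlineTransactionServer):
        self.info_code = server.register_terminal(self)
        self.bound_account = None

    def receive_account(self, account: str) -> None:
        self.bound_account = account

# Flow: the terminal shows its code, the shopper's app scans it while logged in as "alice01",
# and the server forwards that account to the terminal that displayed the code.
server = OnlineTransactionServer()
terminal = DisplayTerminal(server)
server.scan(terminal.info_code, "alice01")
assert terminal.bound_account == "alice01"
```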
The online transaction server 30 is a server capable of performing transaction processing in a network virtual environment, and generally refers to a server for performing online commodity transaction using a network, and a user generally needs to register identity information in the server to perform transaction activities such as purchasing commodities using a registered account number, and may be a transaction server of each e-commerce platform or an online transaction website, or may be a third-party server. In physical implementation, the online transaction server 30 may be any device capable of providing computing services, responding to service requests, and performing processing, and may be, for example, a conventional server, a cloud host, a virtual center, and the like. The server mainly comprises a processor, a hard disk, a memory, a system bus and the like, and is similar to a general computer framework. Based on this embodiment, when the camera 10 set in the physical store is a camera with a target tracking function, the display terminal 20 may send a tracking instruction to an available camera in the camera 10 when acquiring a network account registered on the online transaction server 30 by the user, establish a corresponding relationship between the camera and the network account, so that the camera tracks the user to which the network account belongs, and capture an image including a behavior track of the target user. Furthermore, when receiving images sent by different cameras in the cameras 10, the display terminal 20 may determine data related to users to which the different network accounts belong from the images sent by the different cameras based on the correspondence between each camera and the network account.
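Below is a hedged sketch (not from the original text) of how the correspondence between tracking cameras and network accounts described above might be bookkept; send_tracking_instruction is a stand-in for whatever command protocol the cameras actually use, and all names are invented for illustration.

```python
from typing import Dict, Optional

class CameraAllocator:
    """Assigns an idle tracking camera to a newly bound account and remembers the mapping."""
    def __init__(self, camera_ids):
        self.idle = list(camera_ids)
        self.camera_to_account: Dict[str, str] = {}

    def assign(self, account: str) -> Optional[str]:
        if not self.idle:
            return None                      # no tracking camera currently available
        camera_id = self.idle.pop(0)
        self.camera_to_account[camera_id] = account
        send_tracking_instruction(camera_id, account)   # placeholder for the real command
        return camera_id

    def account_for_frame(self, camera_id: str) -> Optional[str]:
        # When a frame arrives, the sender's id tells us whose behavior track it belongs to.
        return self.camera_to_account.get(camera_id)

    def release(self, camera_id: str) -> None:
        self.camera_to_account.pop(camera_id, None)
        self.idle.append(camera_id)

def send_tracking_instruction(camera_id: str, account: str) -> None:
    print(f"camera {camera_id}: start tracking the user of account {account}")
```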
In some exemplary embodiments, the display terminal 20 may employ visual, auditory and/or sensory effects when presenting the data related to the target user in an animated manner, which is not limited in this embodiment. Taking visual effects as an example, in some embodiments, the display terminal 20 may present the data related to the target user with a time-machine animation. The time-machine animation can be regarded as an animated presentation carrying timeline information. The time machine is one kind of animation effect and may be implemented, for example, as an animated machine that moves across the display screen of the display terminal 20. During the presentation, the screen of the display terminal 20 may show the time machine passing through a time tunnel and, during the passage, display the data related to the target user, so as to achieve a more interesting effect.
In an alternative embodiment, when the display terminal 20 presents the data related to the target user with the time-machine animation, the following presentation sequence may be adopted: first, a time-machine energy-charging animation is presented, as shown in FIG. 2c; the energy-charging animation may be presented when a specific wake-up condition is met or when a target user is determined, so as to attract the target user to stop and watch. While the energy-charging animation is presented, the user name and avatar of the target user may be displayed to further attract the target user to stay. Then, a charging-complete animation is presented; after that, a movement animation of the time machine along a set direction is presented, and the data related to the target user is dynamically presented along the set direction during the movement animation.
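As a non-authoritative illustration of the presentation sequence just described (energy charging, charging complete, then movement while data is revealed), the following sketch models the stages as a small state machine; the render_* calls are placeholders for the terminal's actual animation framework, and the names are assumptions.

```python
from enum import Enum, auto

class Stage(Enum):
    IDLE = auto()
    CHARGING = auto()          # energy-charging animation (FIG. 2c), with name and avatar shown
    CHARGE_COMPLETE = auto()   # charging-complete animation
    MOVING = auto()            # time machine moves along the set direction while data is revealed

class TimeMachinePresentation:
    def __init__(self):
        self.stage = Stage.IDLE

    def on_target_user(self, user_name: str, avatar: str) -> None:
        self.stage = Stage.CHARGING
        render_charging(user_name, avatar)

    def on_charge_done(self) -> None:
        self.stage = Stage.CHARGE_COMPLETE
        render_charge_complete()

    def on_start_moving(self, data_items: list) -> None:
        self.stage = Stage.MOVING
        for item in data_items:            # items are revealed one by one along the set direction
            render_moving_item(item)

# The render_* functions below are placeholders for the real animation framework.
def render_charging(name, avatar): print(f"charging... hello {name} ({avatar})")
def render_charge_complete(): print("charge complete")
def render_moving_item(item): print(f"presenting: {item}")

p = TimeMachinePresentation()
p.on_target_user("user A", "avatar.png")
p.on_charge_done()
p.on_start_moving(["carried items", "browse durations", "walking distance"])
```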
In some application scenarios, the data related to the target user includes offline consumption behavior data of the target user, for example, the target user's consumption behavior data at the physical store. In this embodiment, the display terminal 20 may acquire the offline consumption behavior data of the target user from the images sent by the camera 10 and may present the offline consumption behavior data of the target user in an animated manner.
Optionally, when the display terminal 20 acquires the offline consumption behavior data of the target user from the image sent by the camera 10, it is specifically configured to perform at least one of the following operations: acquiring commodity information browsed by a target user from an image sent by the camera 10; acquiring commodity information of a target user added in a shopping cart or carried about from an image sent by the camera 10; acquiring a browsing sequence of a target user to a commodity area from an image sent by the camera 10; acquiring the commodity browsing time of a target user from an image sent by the camera 10; and acquiring the walking track and the distance of the target user in the physical store from the image sent by the camera 10.
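The items listed above can be thought of as a per-user record. The following dataclass is a minimal, assumed representation of the extracted offline consumption behavior data (the field names are invented for illustration and are not taken from the patent).

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

@dataclass
class OfflineConsumptionBehavior:
    """Container for the per-user offline behavior data listed above (all fields optional)."""
    browsed_items: List[str] = field(default_factory=list)        # commodity info browsed
    cart_or_carried_items: List[str] = field(default_factory=list)
    area_browsing_order: List[str] = field(default_factory=list)  # order of commodity areas visited
    browse_durations: Dict[str, float] = field(default_factory=dict)  # item -> seconds spent browsing
    walking_path: List[Tuple[float, float]] = field(default_factory=list)

    @property
    def walking_distance_m(self) -> float:
        # Distance is the sum of straight-line segments along the recorded path.
        total = 0.0
        for (x1, y1), (x2, y2) in zip(self.walking_path, self.walking_path[1:]):
            total += ((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
        return total
```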
Optionally, in this embodiment, when the movement animation of the time machine along the set direction is presented and the data related to the target user is dynamically presented along the set direction, the data may be presented in order of the degree of association between the consumption behavior data and the consumption intention of the target user, and the target user may be guided to consume through guiding scripts during the presentation.

For example, the consumption behavior data most strongly associated with the consumption intention of the target user may be presented first, including: the commodity information the target user has added to the shopping cart or carries along, the target user's commodity browsing durations, and data such as the target user's walking track and distance in the physical store, as shown in fig. 2d. Then, the consumption behavior data weakly associated with the consumption intention of the target user is presented, including: the commodity information browsed by the target user, together with a guiding script that encourages the target user to add the browsed commodities to the shopping cart, where the guiding script is wording that guides the target user to consume, for example presenting commodity advantages, user feedback on the commodity, and the like, as shown in fig. 2e. Next, the consumption behavior data least associated with the consumption intention of the target user is presented, including: the order in which the target user browsed the commodity areas, the number of times the target user has visited the physical store, an image of the target user entering the store, and so on, as shown in fig. 2f. Finally, an overview page may be presented, on which the target user is guided to take a photo for sharing or to make further purchases, or on which the previously presented content is summarized, and so on, as shown in fig. 2g. When the presentation duration of the display terminal 20 is reached or the target user is no longer within the set range of the display terminal 20, the display terminal 20 may stop the presentation and switch its display content back to the initial state.
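A minimal sketch of the relevance-tiered ordering above, assuming the hypothetical field names used earlier: it simply groups the acquired data into the strongest, weaker, and weakest tiers plus an overview entry, interleaving guiding scripts. It is an illustration of the ordering idea, not the patented presentation logic.

```python
def build_presentation_order(behavior: dict, guiding_scripts: dict) -> list:
    """Group offline behavior data into the relevance tiers described above."""
    return [
        {   # strongest association with the consumption intention (fig. 2d)
            "cart_or_carried_items": behavior.get("cart_or_carried_items", []),
            "browse_durations": behavior.get("browse_durations", {}),
            "walking_distance_m": behavior.get("walking_distance_m", 0.0),
        },
        {   # weaker association (fig. 2e): browsed items plus an add-to-cart guiding script
            "browsed_items": behavior.get("browsed_items", []),
            "guiding_script": guiding_scripts.get("add_to_cart", ""),
        },
        {   # weakest association (fig. 2f): browsing order, visit count, entry image
            "area_browsing_order": behavior.get("area_browsing_order", []),
            "visit_count": behavior.get("visit_count", 0),
        },
        {   # overview page (fig. 2g)
            "guiding_script": guiding_scripts.get("share_or_buy_more", ""),
        },
    ]
```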
Optionally, after the display terminal 20 obtains the offline consumption behavior data of the target user, in addition to displaying the offline consumption behavior data, in some embodiments, the display terminal 20 may also upload the offline consumption behavior data of the target user to the online transaction server 30. The online transaction server 30 may store the offline consumption behavior data of the target user, and may issue corresponding data to the requesting device when receiving the data acquisition request.
In other application scenarios, the data related to the target user includes: fused data formed from at least one of the online identity information and the online consumption behavior data of the target user, together with the offline consumption behavior data of the target user.
In such an embodiment, optionally, the display terminal 20 may fuse the online identity information of the target user with the offline consumption behavior data and present the fused data in an animated manner. The online identity information of the target user may include an avatar uploaded to the online transaction server 30 by the target user, the membership level of the target user, the target user's current level, the consumption amount remaining before the target user reaches the next level of benefits, and the like. Optionally, fusing the online identity information of the target user with the offline consumption behavior data may be embodied as: performing data association between the online identity information and the offline consumption behavior data, for example, calculating discounts for the commodities included in the offline consumption behavior data according to the membership level and membership benefits included in the online identity information; or adding membership points to the member account included in the online identity information according to the commodity information included in the offline consumption behavior data, and the like, which are not described again.
Optionally, the display terminal 20 may fuse the online consumption behavior data and the offline consumption behavior data of the target user, and present the fused data in an animated manner. The online consumption behavior data of the target user may include the target user's historical online orders, preferred commodity categories, preferred brands, ratings of different orders, and the like. Based on this, the display terminal 20 may determine, as the fused data, commodity information having different degrees of association with the consumption intention of the target user according to the online consumption behavior data and the offline consumption behavior data of the target user. For example, from among the commodities in the target user's shopping cart, the commodities carried along, and the commodities browsed in the physical store, the commodities most strongly associated with the consumption intention of the target user, those less strongly associated, and those least associated are determined.
Alternatively, the display terminal 20 may determine recommended commodity information that conforms to the consumption habits of the target user as fusion data according to the online consumption behavior data and the offline consumption behavior data of the target user. In the embodiment, the recommended commodity may be a commodity browsed by the target user or a commodity not browsed by the target user, and the mode of recommending the commodity to the target user based on the historical consumption data of the target user enables the target user to receive the recommended commodity according with the consumption habit of the target user when the target user shops in the physical store, so that the shopping enthusiasm and the interestingness of the target user in the physical store are greatly improved.
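The following sketch is an assumption-laden illustration (not the patented method) of the two fusion outputs just described: it combines online preferences with in-store behavior to rank in-store items by association with the consumption intention and to pick recommendations matching the target user's habits. The scoring weights and field names are invented.

```python
def fuse_and_recommend(online_history: dict, offline: dict, catalog: list) -> dict:
    """Rank in-store items by association with the consumption intention and recommend items."""
    preferred_brands = set(online_history.get("preferred_brands", []))
    preferred_categories = set(online_history.get("preferred_categories", []))

    def score(item: dict) -> int:
        s = 0
        if item["name"] in offline.get("cart_or_carried_items", []):
            s += 3                                   # already in cart or carried: strongest signal
        if item["name"] in offline.get("browsed_items", []):
            s += 1
        if item.get("brand") in preferred_brands:
            s += 1
        if item.get("category") in preferred_categories:
            s += 1
        return s

    ranked = sorted(catalog, key=score, reverse=True)
    recommendations = [i["name"] for i in ranked
                       if i["name"] not in offline.get("cart_or_carried_items", [])][:3]
    return {"ranked_in_store_items": [i["name"] for i in ranked],
            "recommended_items": recommendations}
```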
Optionally, the display terminal 20 may fuse the online identity information, the online consumption behavior data, and the offline consumption behavior data of the target user at the same time, and present the fused data in an animated manner; for the specific implementation, reference may be made to the above description, which is not repeated here.
It should be noted that the above embodiments describe the case in which the display terminal 20 forms the fused data from at least one of the online identity information and the online consumption behavior data of the target user together with the offline consumption behavior data of the target user; in some other alternative embodiments, this step may also be performed by the online transaction server 30. Optionally, after receiving the offline consumption behavior data of the target user uploaded by the display terminal 20, the online transaction server 30 may form fused data from the locally stored online identity information and online consumption behavior data of the target user together with the offline consumption behavior data, which is not described again.
In a typical application scenario, in the information display system 100 provided in this embodiment, the camera 10 may be implemented as cameras installed at the entrance, in the commodity areas, or at the shelves of a physical store, and the display terminal 20 may be implemented as a smart shopping time machine installed at the entrance or the cash register of the physical store.
A two-dimensional code for scanning and binding an online shopping account may be provided on the smart shopping time machine. When user A enters the physical store, user A can scan the two-dimensional code at the smart shopping time machine to quickly log in to a personal online shopping account. The smart shopping time machine then sends a tracking instruction to an available camera in the physical store. The camera tracks the behavior track of user A in the physical store, captures images containing user A's behavior track, and sends them to the smart shopping time machine. When the smart shopping time machine determines, from the received images, that user A is approaching its 1 m range, it presents the time-machine energy-charging animation and displays user A's avatar and user name to attract the user to come closer; when the smart shopping time machine determines, from the received images, that user A is within 1 m of it, it can present, in the interesting form of passing through a time tunnel, the browsing duration since user A entered the store, user A's behavior track and commodity browsing records, and the commodity information purchased this time, so as to guide the user to consume.
The above embodiment describes the system architecture and system functions of the information display system 100, and the present embodiment further provides an information display method, which can be executed by the information display system 100, as shown in fig. 3, and the method includes:
step 301, collecting images containing user behavior tracks in a physical store.
Step 302, a target user is determined, and data related to the target user is obtained according to the image.
Step 303, presenting the data related to the target user in an animated manner.
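Tying steps 301 to 303 together, the sketch below (illustrative only; the stub classes are placeholders standing in for the camera, the terminal, and the animation layer described earlier) shows the overall control flow: collect images, determine the target user and the related data, then hand the data to an animated presentation.

```python
class _StubCamera:
    def collect_images(self): return ["frame-001"]

class _StubTerminal:
    def determine_target_user(self, images): return "user-A"
    def get_user_data(self, user, images): return {"browsed_items": ["milk"]}

class _StubAnimation:
    def present(self, data): print("time machine presents:", data)

def information_display_flow(camera, terminal, animation):
    """End-to-end sketch of steps 301-303 under the assumptions of the earlier sketches."""
    images = camera.collect_images()                        # step 301: images with behavior tracks
    target_user = terminal.determine_target_user(images)    # step 302: e.g. set range or wake word
    if target_user is None:
        return
    data = terminal.get_user_data(target_user, images)      # step 302 (cont.): related data
    animation.present(data)                                 # step 303: animated presentation

information_display_flow(_StubCamera(), _StubTerminal(), _StubAnimation())
```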
In some exemplary embodiments, a way of determining a target user includes: identifying the position of the user contained in the image, and taking the identified user positioned in the set range of the display terminal as the target user; or when receiving a voice awakening instruction, taking the user who sends the voice awakening instruction as the target user.
In some exemplary embodiments, after the user who issues the voice wakeup instruction is taken as the target user when the voice wakeup instruction is received, the method further includes: and performing voiceprint recognition according to the voice awakening instruction to determine the user who sends the voice awakening instruction, and determining data related to the user who sends the voice awakening instruction.
In some exemplary embodiments, one way of obtaining data related to the target user from the image includes: and acquiring offline consumption behavior data of the target user from the image.
In some exemplary embodiments, further comprising: and acquiring a network account number registered on the online transaction server by the target user, and acquiring at least one of online identity information and online consumption behavior data of the target user from the online transaction server according to the network account number.
In some exemplary embodiments, one way of obtaining data related to the target user from the image includes: acquiring at least one of online identity information and online consumption behavior data of the target user; acquiring offline consumption behavior data of the target user from the image; and forming fused data according to at least one of the online identity information and the online consumption behavior data of the target user and the offline consumption behavior data of the target user.
In some exemplary embodiments, a manner of obtaining offline consumption behavior data of the target user from the image includes: acquiring commodity information browsed by the target user from the image; acquiring commodity information of the target user added into the shopping cart or carried about from the image; acquiring the browsing sequence of the target user to the commodity area from the image; acquiring the commodity browsing time of the target user from the image; and acquiring the walking track and the distance of the target user in the physical store from the image.
In some exemplary embodiments, further comprising: and uploading the offline consumption behavior data of the target user to an online transaction server, so that the online transaction server stores the offline consumption behavior data of the target user.
In some exemplary embodiments, a method of forming fused data from at least one of online identity information and online consumption behavior data of the target user and offline consumption behavior data of the target user, comprises: determining commodity information with different association degrees with the consumption intention of the target user as fusion data according to the online consumption behavior data and the offline consumption behavior data of the target user; or determining recommended commodity information which accords with the consumption habits of the target user as fusion data according to the online consumption behavior data and the offline consumption behavior data of the target user.
In some exemplary embodiments, a way of presenting the data related to the target user in an animated manner includes: presenting the data related to the target user with a time-machine animation.

In some exemplary embodiments, a way of presenting the data related to the target user with a time-machine animation includes: presenting a time-machine energy-charging animation; presenting a charging-complete animation of the time machine; and presenting a movement animation of the time machine along a set direction, and dynamically presenting the data related to the target user along the set direction during the movement animation.
In this embodiment, based on the camera arranged in the physical store, images containing the behavior tracks of users in the physical store can be captured; data related to the target user is then acquired based on the captured images and presented in an animated manner through the display terminal arranged in the physical store, thereby increasing the interest of offline shopping and enriching the offline consumption flow.
It should be noted that the execution subjects of the steps of the methods provided in the above embodiments may be the same device, or different devices may serve as the execution subjects of the methods. For example, the execution subject of steps 301 to 303 may be device A; for another example, the execution subject of steps 301 and 302 may be device A, and the execution subject of step 303 may be device B; and so on.

In addition, some of the flows described in the above embodiments and the drawings include a plurality of operations in a specific order, but it should be clearly understood that the operations may be executed out of the order presented herein or in parallel; the sequence numbers of the operations, such as 301 and 302, are merely used for distinguishing different operations, and the sequence numbers themselves do not represent any execution order. Additionally, the flows may include more or fewer operations, and the operations may be performed sequentially or in parallel. It should be noted that the descriptions of "first", "second", etc. in this document are used for distinguishing different messages, devices, modules, etc., and do not represent a sequential order, nor limit "first" and "second" to being of different types.
Fig. 4 is a schematic structural diagram of a display terminal according to an exemplary embodiment of the present application. As shown in fig. 4, the display terminal includes: memory 401, processor 402, communication component 403, display component 404, audio component 405, and power component 406.
The memory 401 is used for storing a computer program and may be configured to store other various data to support operations on the terminal. Examples of such data include instructions for any application or method operating on the terminal, contact data, phonebook data, messages, pictures, videos, etc.
The memory 401 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
Wherein the communication component 403 is configured to facilitate communication between the device in which the communication component is located and other devices in a wired or wireless manner. The device in which the communication component is located may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In an exemplary embodiment, the communication component receives a broadcast signal or broadcast related information from an external broadcast management system via a broadcast channel. In one exemplary embodiment, the communication component further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
The display component 404 includes a screen, which may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gesture operations on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. Further optionally, the audio component 405 may be configured to output and/or input audio signals. For example, the audio component includes a Microphone (MIC) configured to receive an external audio signal when the device in which the audio component is located is in an operational mode, such as a call mode, a recording mode, or a voice recognition mode. The received audio signal may further be stored in the memory or transmitted via the communication component. In some embodiments, the audio component further comprises a speaker for outputting audio signals.
A power supply component 406 for providing power to various components of the device in which the power supply component is located. The power components may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the device in which the power component is located.
A processor 402, coupled to the memory 401, for executing the computer program in the memory 401 to: acquire images containing user behavior tracks in a physical store; determine a target user, and acquire data related to the target user according to the images; and present the data related to the target user in an animated manner.
Further optionally, when determining the target user, the display component 404 is specifically configured to: identifying the position of the user contained in the image through the processor 402, and taking the identified user positioned in the setting range of the display terminal as the target user; alternatively, upon receiving a voice wake-up instruction through the audio component 405, the user who issued the voice wake-up instruction is taken as the target user.
Further optionally, the processor 402, after receiving the voice wake-up instruction of the target user, before presenting the data related to the target user in an active manner, is further configured to: and performing voiceprint recognition according to the voice awakening instruction to determine the user who sends the voice awakening instruction, and determining data related to the user who sends the voice awakening instruction.
Further optionally, when the processor 402 acquires the data related to the target user according to the image, it is specifically configured to: and acquiring offline consumption behavior data of the target user from the image.
Further optionally, the processor 402 is further configured to: a network account registered by the target user on the online transaction server is obtained, and at least one of online identity information and online consumption behavior data of the target user is obtained from the online transaction server through the communication component 403 according to the network account.
Further optionally, when the processor 402 acquires the data related to the target user according to the image, it is specifically configured to: acquiring at least one of online identity information and online consumption behavior data of the target user; acquiring offline consumption behavior data of the target user from the image; and forming fused data according to at least one of the online identity information and the online consumption behavior data of the target user and the offline consumption behavior data of the target user.
Further optionally, when the processor 402 acquires the offline consumption behavior data of the target user from the image, it is specifically configured to: acquiring commodity information browsed by the target user from the image; acquiring commodity information of the target user added into the shopping cart or carried about from the image; acquiring the browsing sequence of the target user to the commodity area from the image; acquiring the commodity browsing time of the target user from the image; and acquiring the walking track and the distance of the target user in the physical store from the image.
Further optionally, the processor 402 is further configured to: the offline consumption behavior data of the target user is uploaded to the online transaction server through the communication component 403, so that the online transaction server stores the offline consumption behavior data of the target user.
Further optionally, when forming the fused data according to at least one of the online identity information and the online consumption behavior data of the target user and the offline consumption behavior data of the target user, the processor 402 is specifically configured to: determining commodity information with different association degrees with the consumption intention of the target user as fusion data according to the online consumption behavior data and the offline consumption behavior data of the target user; or determining recommended commodity information which accords with the consumption habits of the target user as fusion data according to the online consumption behavior data and the offline consumption behavior data of the target user.
Further optionally, the display component 404, when presenting the data related to the target user in an animated manner, is specifically configured to: present the data related to the target user with a time-machine animation.

Further optionally, the display component 404, when presenting the data related to the target user with a time-machine animation, is specifically configured to: present a time-machine energy-charging animation; present a charging-complete animation of the time machine; and present a movement animation of the time machine along a set direction, and dynamically present the data related to the target user along the set direction during the movement animation.
In this embodiment, the display terminal may acquire data related to the target user from images containing the behavior track of the user in the physical store, and present the data related to the target user in an animated manner, so as to increase the interest of offline shopping and enrich the offline consumption flow.
Accordingly, the present application further provides a computer-readable storage medium storing a computer program, where the computer program is capable of implementing the steps that can be executed by the display terminal in the foregoing method embodiments when executed.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer readable medium, such as Random Access Memory (RAM), and/or non-volatile memory, such as Read Only Memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase change memory (PRAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read Only Memory (ROM), Electrically Erasable Programmable Read Only Memory (EEPROM), flash memory or other memory technology, compact disc read only memory (CD-ROM), Digital Versatile Discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information that can be accessed by a computing device. As defined herein, computer readable media does not include transitory computer readable media (transitory media), such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
The above description is only an example of the present application and is not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.

Claims (20)

1. An information display system, comprising: a camera and a display terminal arranged in a physical store; wherein the camera is configured to acquire images containing user behavior tracks in the physical store and to send the images to the display terminal;
and the display terminal is configured to determine a target user, acquire data related to the target user according to the images, and present the data related to the target user in an animated manner.
2. The system of claim 1, wherein the display terminal is specifically configured to: identifying the position of the user contained in the image, and taking the identified user located within a set range of the display terminal as the target user; or, when receiving a voice wake-up instruction, taking the user who sends the voice wake-up instruction as the target user.
3. The system of claim 2, wherein the display terminal is specifically configured to: carrying out voiceprint recognition according to the voice wake-up instruction, so as to determine the user who sends the voice wake-up instruction and determine data related to the user who sends the voice wake-up instruction.
4. The system of claim 1, wherein the display terminal is specifically configured to: acquiring the offline consumption behavior data of the target user from the image, and displaying the offline consumption behavior data of the target user in an animated manner.
5. The system of claim 1, further comprising: the online transaction server is used for storing online identity information and online consumption behavior data of the registered user;
the display terminal is further configured to acquire a network account registered by the target user on the online transaction server, and acquire at least one of online identity information and online consumption behavior data of the target user from the online transaction server according to the network account.
6. The system of claim 5, wherein the display terminal is specifically configured to: acquiring the offline consumption behavior data of the target user from the image, and forming fused data according to at least one of the online identity information and the online consumption behavior data of the target user and the offline consumption behavior data of the target user, so as to display the fused data in an animated manner.
7. The system according to claim 4 or 6, wherein the display terminal is specifically configured to perform at least one of the following operations when acquiring the offline consumption behavior data of the target user from the image: acquiring commodity information browsed by the target user from the image;
acquiring commodity information of the target user added into the shopping cart or carried about from the image;
acquiring a browsing sequence of the target user to the commodity area from the image;
acquiring the commodity browsing time of the target user from the image;
and acquiring the walking track and the distance of the target user in the physical store from the image.
8. The system of claim 5, wherein the display terminal is further configured to: uploading the offline consumption behavior data of the target user to the online transaction server;
the online transaction server is also used for storing offline consumption behavior data of the target user.
9. The system according to claim 6, wherein the display terminal, when forming the fused data, is specifically configured to:
determining commodity information having different degrees of association with the consumption intention of the target user as the fused data according to the online consumption behavior data and the offline consumption behavior data of the target user; or,
determining recommended commodity information that matches the consumption habits of the target user as the fused data according to the online consumption behavior data and the offline consumption behavior data of the target user.
10. The system according to any of claims 1-9, wherein the display terminal, when presenting the data related to the target user in an animated manner, is specifically configured to:
displaying the data related to the target user with a time machine animation effect.
11. The system of claim 10, wherein the display terminal is specifically configured to:
displaying an energy-storing animation of the time machine;
displaying an animation of the time machine completing energy storage; and
displaying an animation of the time machine moving in a set direction, and dynamically displaying the data related to the target user along the set direction while the moving animation is displayed.
12. An information display method, comprising:
acquiring images containing user behavior tracks in a physical store;
determining a target user, and acquiring data related to the target user according to the image;
presenting the data related to the target user in an animated manner.
13. The method of claim 12, wherein determining a target user comprises:
identifying the position of the user contained in the image, and taking the identified user located within a set range of the display terminal as the target user; or, when receiving a voice wake-up instruction, taking the user who sends the voice wake-up instruction as the target user.
14. The method according to claim 13, wherein, after taking the user who sends the voice wake-up instruction as the target user upon receiving the voice wake-up instruction, the method further comprises:
carrying out voiceprint recognition according to the voice wake-up instruction, so as to determine the user who sends the voice wake-up instruction and determine data related to the user who sends the voice wake-up instruction.
15. The method of claim 12, wherein obtaining data related to the target user from the image comprises:
and acquiring offline consumption behavior data of the target user from the image.
16. The method of claim 12, wherein obtaining data related to the target user from the image comprises:
acquiring at least one of online identity information and online consumption behavior data of the target user;
acquiring offline consumption behavior data of the target user from the image;
and forming fused data according to at least one of the online identity information and the online consumption behavior data of the target user and the offline consumption behavior data of the target user.
17. The method of claim 16, wherein forming fused data from at least one of online identity information and online consumption behavior data of the target user and offline consumption behavior data of the target user comprises:
determining commodity information having different degrees of association with the consumption intention of the target user as the fused data according to the online consumption behavior data and the offline consumption behavior data of the target user; or,
determining recommended commodity information that matches the consumption habits of the target user as the fused data according to the online consumption behavior data and the offline consumption behavior data of the target user.
18. The method of any of claims 12-17, wherein presenting the data related to the target user in an animated manner comprises:
displaying the data related to the target user with a time machine animation effect.
19. A display terminal, comprising: a memory, a processor, and a display component;
the memory is configured to store one or more computer instructions;
The processor is configured to execute the one or more computer instructions to: acquiring an image containing a user behavior track in a physical store; determining a target user, and acquiring data related to the target user according to the image; and presenting the data related to the target user in an animated manner.
20. A computer-readable storage medium storing a computer program which, when executed, is capable of performing the method of any one of claims 12 to 18.
CN201811070731.8A 2018-09-13 2018-09-13 Information display method, device, system and storage medium Active CN110895769B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811070731.8A CN110895769B (en) 2018-09-13 2018-09-13 Information display method, device, system and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811070731.8A CN110895769B (en) 2018-09-13 2018-09-13 Information display method, device, system and storage medium

Publications (2)

Publication Number Publication Date
CN110895769A true CN110895769A (en) 2020-03-20
CN110895769B CN110895769B (en) 2023-11-17

Family

ID=69785605

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811070731.8A Active CN110895769B (en) 2018-09-13 2018-09-13 Information display method, device, system and storage medium

Country Status (1)

Country Link
CN (1) CN110895769B (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090287587A1 (en) * 2008-05-15 2009-11-19 Bloebaum L Scott Systems methods and computer program products for providing augmented shopping information
CN105869015A (en) * 2016-03-28 2016-08-17 联想(北京)有限公司 Information processing method and system
CN105869025A (en) * 2016-04-21 2016-08-17 深圳市杰博科技有限公司 Interactive mirror information media electronic business system
CN107464136A (en) * 2017-07-25 2017-12-12 苏宁云商集团股份有限公司 A kind of merchandise display method and system
CN107767168A (en) * 2017-09-19 2018-03-06 神策网络科技(北京)有限公司 User behavior data processing method and processing device, electronic equipment and storage medium
CN107742229A (en) * 2017-10-26 2018-02-27 厦门物之联智能科技有限公司 Consumer behavior information collection method and system based on shared equipment
CN108510364A (en) * 2018-03-30 2018-09-07 杭州法奈昇科技有限公司 Big data intelligent shopping guide system based on voiceprint identification

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
张荣君; 张守文; 邵明凤; 吴芳芳; 王洁; 魏延津; 王佳艺; 李新华: "Comparison of different catheter ablation approaches for paroxysmal supraventricular tachycardia under zero X-ray exposure", Journal of Practical Electrocardiology (实用心电学杂志), no. 01, pages 29-34 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111784396A (en) * 2020-06-30 2020-10-16 广东奥园奥买家电子商务有限公司 Double-line shopping tracking system and method based on user image
CN113129060A (en) * 2021-04-19 2021-07-16 深圳市极客智能科技有限公司 Offline commodity recommendation display system, method, device, equipment and storage medium

Also Published As

Publication number Publication date
CN110895769B (en) 2023-11-17

Similar Documents

Publication Publication Date Title
US11238654B2 (en) Offline shopping guide method and apparatus
CN107909443B (en) Information pushing method, device and system
CN111383039B (en) Information pushing method, device and information display system
RU2731362C1 (en) Method of providing information on purchases in live video
US20140079281A1 (en) Augmented reality creation and consumption
US20150095228A1 (en) Capturing images for financial transactions
CN110858134B (en) Data, display processing method and device, electronic equipment and storage medium
US20140078174A1 (en) Augmented reality creation and consumption
CN109213310B (en) Information interaction equipment, data object information processing method and device
CN112749350B (en) Information processing method and device of recommended object, storage medium and electronic equipment
CN110033293A (en) Obtain the method, apparatus and system of user information
US10692103B2 (en) Systems and methods for hashtag embedding based on user generated content for creating user specific loyalty identifiers
CN108389268B (en) Payment method, system and equipment
CN111090327A (en) Commodity information processing method and device and electronic equipment
CN112000266B (en) Page display method and device, electronic equipment and storage medium
CN110895769B (en) Information display method, device, system and storage medium
US11361333B2 (en) System and method for content recognition and data categorization
CN113781058A (en) Payment method and device for unmanned vehicle, electronic equipment and storage medium
CN111178923A (en) Offline shopping guide method and device and electronic equipment
CN111178920B (en) Commodity object information recommendation method, device and system
CN110213307B (en) Multimedia data pushing method and device, storage medium and equipment
US9817849B2 (en) Image recognition method for offline and online synchronous operation
CN114333185A (en) Payment method and device, electronic equipment and storage medium
CN208938000U (en) Intelligent mirror equipment
KR101927078B1 (en) Method for providing image based information relating to user and device thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant