CN110602680A - Near scene information interaction method, server and system - Google Patents

Near scene information interaction method, server and system

Info

Publication number
CN110602680A
Authority
CN
China
Prior art keywords
user
service
interaction
information
intelligent equipment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910893872.8A
Other languages
Chinese (zh)
Inventor
任斌
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Beiguang Ladder Shadow Advertising Co Ltd
Original Assignee
Beijing Beiguang Ladder Shadow Advertising Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Beiguang Ladder Shadow Advertising Co Ltd
Priority to CN201910893872.8A
Publication of CN110602680A
Legal status: Pending

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04W: WIRELESS COMMUNICATION NETWORKS
    • H04W4/00: Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02: Services making use of location information
    • H04W4/021: Services related to particular areas, e.g. point of interest [POI] services, venue services or geofences
    • H04W4/024: Guidance services
    • H04W4/80: Services using short range communication, e.g. near-field communication [NFC], radio-frequency identification [RFID] or low energy communication
    • H04W4/30: Services specially adapted for particular environments, situations or purposes
    • H04W4/35: Services specially adapted for particular environments, situations or purposes for the management of goods or merchandise

Abstract

The invention relates to a near-field scene information interaction method, a server and a system. In the method, a server receives identification information of an intelligent device located in a service scene, the identification information being uploaded by a mobile device and obtained after a user uses the mobile device to detect and identify a near-field signal sent by the intelligent device; the server generates an interaction strategy for the user in the service scene according to the identification information of the intelligent device; and the server performs information interaction with the mobile device and/or the intelligent device according to the interaction strategy, so as to provide the user with the business service contained in the service scene. Embodiments of the invention enable users in different service scenes to participate in scene service interaction and enjoy richer, more personalized scene services.

Description

Near scene information interaction method, server and system
Technical Field
The invention belongs to the technical field of communication, and particularly relates to a near-field scene information interaction method, a server and a system.
Background
At present, as the number of intelligent devices and Internet of Things devices grows, such devices are applied in more and more scenes, for example vending machines, electronic advertisement screens, electronic express lockers, in-elevator advertisement projection, electronic access gates and the like. A user can operate such a device directly to obtain a specified service function, or can scan a two-dimensional code on the device, or in an image displayed by the device, to obtain preset information.
However, in these service scenarios, devices in the same service scene can only provide simple, uniform service functions, and there is no personalized interaction with users in their different specific scenes.
Disclosure of Invention
The invention provides a near-field scene information interaction method, a server and a system, which enable users in different service scenes to participate in scene service interaction and enjoy richer, more personalized scene services.
In order to achieve the above purpose, the embodiment of the invention adopts the following technical scheme:
in a first aspect, a near-field scene information interaction method is provided, including:
the method comprises the steps that a server receives identification information of intelligent equipment located in a service scene, the identification information being uploaded by mobile equipment and obtained after a user detects and identifies, using the mobile equipment, a near-field signal sent by the intelligent equipment;
the server generates an interaction strategy of the user in the service scene according to the identification information of the intelligent equipment;
and the server performs information interaction with the mobile equipment and/or the intelligent equipment according to the interaction strategy so as to provide the service contained in the service scene for the user.
In a second aspect, a server is provided, including:
the information receiving module is used for receiving identification information of the intelligent equipment in a service scene, which is uploaded by the mobile equipment, wherein the identification information is obtained after a user detects and identifies a near-field signal sent by the intelligent equipment by using the mobile equipment;
the strategy generation module is used for generating an interaction strategy of the user in the service scene according to the identification information of the intelligent equipment;
and the information interaction module is used for performing information interaction with the mobile equipment and/or the intelligent equipment according to the interaction strategy so as to provide the business service contained in the business scene for the user.
In a third aspect, a near-field scene information interaction system is provided, comprising: a server, and intelligent equipment and mobile equipment which are positioned in a service scene;
the server is used for receiving identification information of the intelligent equipment in a service scene uploaded by the mobile equipment, wherein the identification information is obtained after a user detects and identifies a near-field signal sent by the intelligent equipment by using the mobile equipment; the server generates an interaction strategy of the user in the service scene according to the identification information of the intelligent equipment; and the server performs information interaction with the mobile equipment and/or the intelligent equipment according to the interaction strategy so as to provide the service contained in the service scene for the user.
According to the near-field scene information interaction method, server and system, a user can use a mobile device to detect and identify a near-field signal sent by an intelligent device in a service scene, so as to obtain the identification information of the intelligent device and upload it to the server; the server then generates an interaction strategy for the user in the service scene according to the identification information of the intelligent device, and performs information interaction with the mobile device and/or the intelligent device according to the interaction strategy, so as to provide the user with the business service contained in the service scene.
In the invention, for each specific service scene, the server generates an interaction strategy based on the identification information of the intelligent device in that scene, uploaded by the user within a limited time, and then indirectly realizes service interaction with the user through information interaction with the mobile device and/or the intelligent device. This interaction mode makes it easy for the service provider to flexibly set the interaction mode and adjust the service content, thereby providing the user with scene services that are closer to the user's needs and more personalized.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present invention, and those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic flowchart of a near-field scene information interaction method according to an embodiment of the present invention;
FIG. 2 is a schematic flowchart of a method for generating an interaction strategy in an embodiment of the present invention;
FIG. 3 is a schematic structural diagram of a server according to a second embodiment of the present invention;
fig. 4 is a schematic structural diagram of a near-field scene information interaction system in the second embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the technical solutions of the present invention will be described in detail below. It is to be understood that the described embodiments are merely exemplary of the invention, and not restrictive of the full scope of the invention. All other embodiments, which can be derived by a person skilled in the art from the examples given herein without any inventive step, are within the scope of the present invention.
Example one
Fig. 1 is a flowchart of the near-field scene information interaction method provided in an embodiment of the present invention. The method involves three entities: an intelligent device, a mobile device and a server. The intelligent device is located in a designated service scene and can provide the business service of that scene. For example, the intelligent device may be, but is not limited to: a vending machine, an electronic advertising screen, an electronic express locker, an in-elevator advertisement projection, an electronic access gate, and the like. The mobile device may be a mobile terminal held by the user, e.g. a mobile phone, and can detect the intelligent device through near-field signal detection techniques. The server can perform information interaction with the intelligent device and with the mobile device, and thereby indirectly realize service interaction with the user in the corresponding service scene. As shown in Fig. 1, the near-field scene information interaction method includes the following steps:
s110, the server receives identification information of the intelligent device in the service scene, wherein the identification information is uploaded by the mobile device and acquired after the user utilizes the mobile device to detect and identify a near-field signal sent by the intelligent device.
Specifically, the intelligent device located in the service scene may send a near-field signal into the surrounding environment through a built-in near-field scene recognition apparatus, the near-field signal carrying the identification information of the intelligent device. When the user uses an APP on a mobile terminal such as a mobile phone to detect near-field signals in the surrounding environment, the identification information of the intelligent device can be extracted from the near-field signal, so that the intelligent device is discovered and identified. The identification information uniquely identifies an intelligent device and may be, but is not limited to, the device ID of the intelligent device.
In an embodiment, the near-field scene recognition apparatus may include a plurality of communication sensing modules, including but not limited to Bluetooth, wireless Wi-Fi, a high-frequency sound emission module, an NFC tag, and the like. Multiple sensing modules may broadcast information simultaneously. The mobile device may detect the near-field scene recognition apparatus through a variety of built-in detection components (Bluetooth, Wi-Fi, microphone, NFC, etc.). For example, the mobile device may detect and identify the near-field signal sent by the intelligent device using at least one of the following near-field signal detection modes, and acquire the identification information of the intelligent device from the signal: Bluetooth, wireless Wi-Fi, audio signal, NFC tag. In a practical application scene, the mobile terminal can further determine the near-field scene recognition apparatus closest to the user according to the signal strength, sound level, sound-field orientation, distance and the like measured by its detection components.
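By way of illustration only, the following sketch shows one way the nearest-device judgment described above could be made on the mobile terminal by comparing received signal strengths; the data structure and function names are assumptions for this example and are not defined by the present disclosure.

```python
# Minimal sketch (illustrative, not part of the disclosure): choose, from several
# detected near-field broadcasts, the smart device most likely to be closest to the user.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Detection:
    device_id: str          # identification information carried by the near-field signal
    channel: str            # "bluetooth" | "wifi" | "audio" | "nfc"
    signal_strength: float  # comparable strength measure, e.g. Bluetooth/Wi-Fi RSSI in dBm

def pick_nearest_device(detections: List[Detection]) -> Optional[Detection]:
    """Return the detection most likely to come from the device closest to the user."""
    if not detections:
        return None
    # An NFC read implies near-contact with the tag, so it wins outright;
    # otherwise, treat the strongest signal as the nearest device.
    nfc_hits = [d for d in detections if d.channel == "nfc"]
    return nfc_hits[0] if nfc_hits else max(detections, key=lambda d: d.signal_strength)
```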
The APP on the mobile terminal confirms, from one or more detected near-field signals, the intelligent device located in the current service scene, acquires the identification information of the intelligent device, and uploads the identification information to the server. The interaction between the server and the APP on the mobile terminal may be encrypted at the link layer and follow the OAuth 2.0 protocol to ensure the security of the interaction.
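A minimal sketch of this upload step follows, assuming the APP has already obtained an OAuth 2.0 access token and that the server exposes an HTTPS endpoint; the endpoint path and payload fields are hypothetical and only illustrate the flow described above.

```python
import requests  # third-party HTTP client, used here for illustration

def upload_device_identification(server_url: str, access_token: str,
                                 device_id: str, channel: str) -> dict:
    """Upload the detected smart-device identification to the server over TLS,
    authenticating with an OAuth 2.0 bearer token (endpoint and fields are assumed)."""
    response = requests.post(
        f"{server_url}/api/v1/near-field/identify",          # hypothetical endpoint
        headers={"Authorization": f"Bearer {access_token}"},  # OAuth 2.0 bearer credential
        json={"device_id": device_id, "channel": channel},    # identification information
        timeout=5,
    )
    response.raise_for_status()
    return response.json()  # e.g. the first message of the generated interaction strategy
```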
And S120, the server generates an interaction strategy of the user in the service scene according to the identification information of the intelligent equipment.
For the intelligent devices deployed in a service scene, their identification information can be authenticated and the corresponding interaction strategies registered on the server in advance. An interaction strategy refers to the operation flow by which a user executes a related business service in a given service scene; the user performs information interaction with the server through the intelligent device and/or a handheld mobile device in the service scene, thereby realizing service interaction of the user in that scene.
Specifically, after the user enters a certain service scene and the detected identification information of the intelligent device in that scene is sent to the server through the mobile terminal, the server may verify the identification information and determine the service scene type. After the verification is passed, the server can further determine whether an interaction strategy related to the service scene type, or configuration information for generating one, already exists on the server. If so, an interaction strategy for the user in the service scene is generated based on the identification information of the intelligent device.
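The following sketch illustrates this server-side check with in-memory tables; in practice the registered devices and strategy configuration would live in a database, and all names here are assumptions for the example.

```python
# Illustrative server-side check: verify the uploaded device ID against devices
# registered in advance, resolve the service scene type, and confirm that
# interaction-strategy configuration exists for that type.
REGISTERED_DEVICES = {               # device_id -> service scene type (registered in advance)
    "vm-0012": "vending_machine",
    "ad-0345": "electronic_ad_screen",
}
POLICY_CONFIG = {                    # scene type -> configuration used to build strategies
    "vending_machine": {"template": "coupon_push"},
    "electronic_ad_screen": {"template": "content_detail_push"},
}

def verify_and_lookup(device_id: str):
    """Return (scene_type, config) for a known device, or raise if verification fails."""
    scene_type = REGISTERED_DEVICES.get(device_id)
    if scene_type is None:
        raise ValueError("unknown or unauthenticated smart device")
    config = POLICY_CONFIG.get(scene_type)
    if config is None:
        raise LookupError(f"no interaction-strategy configuration for {scene_type}")
    return scene_type, config
```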
In a specific embodiment, as shown in fig. 2, the generating, by the server, the interaction policy of the user in the service scenario according to the identification information of the smart device may include:
s210, the server determines the type of the service scene where the user is located according to the identification information of the intelligent device.
The service scene type may be defined according to the service function specifically provided by the intelligent device, and may include, but is not limited to: vending machine, electronic advertising screen, electronic express locker, in-elevator advertisement projection, electronic access gate, and the like. Therefore, the service function of an intelligent device can be determined from its identification information, and the service scene type where the user is located can be further determined.
And S220, the server generates an interaction strategy of the user in the service scene by taking the scene attribute information corresponding to the user as a constraint condition according to the service scene type of the user.
In real life, service scenes differ in factors such as the specific service content provided to the user, the geographic environment in which they are located, and the time at which the service is provided; these factors should also be taken into account when generating an interaction strategy for the user in a service scene. In this embodiment, the personalized information of the service scene at the time the user performs service interaction is defined as the scene attribute information corresponding to the user. The scene attribute information corresponding to the user may include, but is not limited to, at least one of: the time at which the user uploads the identification information of the intelligent device, the position of the intelligent device, and the portrait feature information of the user.
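As a sketch of how these constraint conditions might be carried through the strategy-generation step, the following data structures group the three kinds of scene attribute information named above; the field names are illustrative assumptions.

```python
# Illustrative containers for the scene attribute information used as constraints
# when generating an interaction strategy, and for the resulting strategy itself.
from dataclasses import dataclass, field
from datetime import datetime
from typing import List, Optional, Tuple

@dataclass
class SceneAttributes:
    upload_time: Optional[datetime] = None                 # moment the user uploaded the device ID
    device_location: Optional[Tuple[float, float]] = None  # (latitude, longitude) of the smart device
    user_portrait: dict = field(default_factory=dict)      # e.g. {"age": 30, "gender": "F", ...}

@dataclass
class InteractionStrategy:
    scene_type: str
    actions: List[dict] = field(default_factory=list)      # ordered steps for mobile and/or smart device
```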
In a specific embodiment, the server generating the interaction strategy of the user in the service scene may include the following steps:
The server determines the service data displayed by the intelligent device at the relevant moment according to the service scene type where the user is located and the time at which the user uploaded the identification information of the intelligent device.
For example, for an intelligent device such as an electronic display screen that can display service data, the full set of service data displayed in the service scene can be obtained from the specific service scene type of the intelligent device; by further pinning down the time at which the user uploaded the identification information, the service data being displayed at that moment can be determined from the display schedule of all the service data, for example the video currently being played by the intelligent device.
The server then generates, according to the service data displayed by the intelligent device at that moment, an interaction strategy for performing information interaction with the mobile device and/or the intelligent device, so as to provide the user with a business service based on the displayed data.
Specifically, the interaction strategy may be a business process of any form and content derived from the displayed content. For example, if a video is currently being shown on an electronic display screen, the corresponding interaction strategy may be to send the content details of that video to the mobile device, such as a mobile phone, held by the user. In this way, the user can learn, through service interaction, more and richer details of the service data presented on or contained in the intelligent device.
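A minimal sketch of this time-based case, assuming the server keeps a display schedule for the screen as (start, end, content_id) entries; the schedule format and action fields are assumptions for illustration.

```python
from datetime import datetime

def build_time_based_strategy(schedule, upload_time: datetime) -> dict:
    """schedule: list of (start, end, content_id) tuples covering the screen's programming.
    Look up what was on screen when the user uploaded the device ID and build a strategy
    that pushes the corresponding content details to the user's mobile device."""
    shown = next((cid for start, end, cid in schedule if start <= upload_time < end), None)
    if shown is None:
        return {"actions": []}                 # nothing was being displayed; no strategy
    return {
        "actions": [
            {"target": "mobile_device",        # push details of the displayed video
             "type": "send_content_detail",
             "content_id": shown}
        ]
    }
```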
In another embodiment, the server generating the interaction strategy of the user in the service scene may include the following steps:
The server generates, according to the service scene type where the user is located and the position information of the intelligent device, an interaction strategy for performing information interaction with the mobile device and/or the intelligent device, so as to provide the user with a business service based on the position of the intelligent device.
Specifically, service scenes at different geographic locations correspond to different service contents even when they belong to the same service scene type. Therefore, when forming the interaction strategy, the position of the service scene needs to be considered as well, and this position can be represented by the position information of the intelligent device. For example, data about the surrounding environment of the current business scene, such as nearby shops, entertainment venues, shopping places, schools and hospitals, can be determined from the position information of the intelligent device. Accordingly, when forming the interaction strategy, this surrounding-environment data can be combined with the basic service content of the original service scene to form an interaction strategy based on the position of the intelligent device. For example, when the current business scene provides environmental navigation for the user, navigation information for nearby destinations that the current user is more likely to visit can be actively provided in combination with the position information of the intelligent device, making it more convenient for the user to get around.
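The following sketch illustrates the location-based case; the point-of-interest index and distance threshold are assumptions, and a real deployment would query a map or geographic information service instead of an in-memory list.

```python
from math import radians, sin, cos, asin, sqrt

def _distance_km(a, b) -> float:
    """Great-circle distance between two (latitude, longitude) pairs (haversine formula)."""
    lat1, lon1, lat2, lon2 = map(radians, (*a, *b))
    h = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(h))

def build_location_based_strategy(device_location, poi_index, radius_km: float = 1.0) -> dict:
    """poi_index: iterable of dicts like {"name": ..., "category": ..., "location": (lat, lon)}.
    Combine the basic navigation service with destinations near the smart device."""
    nearby = [p for p in poi_index
              if _distance_km(device_location, p["location"]) <= radius_km]
    return {
        "actions": [
            {"target": "mobile_device",
             "type": "send_navigation_suggestions",
             "destinations": [p["name"] for p in nearby]}
        ]
    }
```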
In another embodiment, the server generating the interaction strategy of the user in the service scene may include the following steps:
The server determines the interest point information of the user according to the portrait feature information of the user.
For example, the mobile device and/or the intelligent terminal may actively acquire the portrait feature information of the user, such as the user's height, weight, gender and age. Based on this portrait feature information, the interest point information of the user, such as the user's fields of interest and habitual communication modes, can be inferred.
The server then generates, according to the service scene type where the user is located and the interest point information of the user, an interaction strategy for performing information interaction with the mobile device and/or the intelligent device, so as to provide the user with a business service based on the user's interest point information.
The basic full set of services in the service scene can be obtained from the specific service scene type of the intelligent device; by further pinning down the user's interest point information, the service content or the interaction mode preferred by the user can be selected from the full set of services to form the interaction strategy. For example, an intelligent device providing restaurant navigation may actively suggest the type of restaurant preferred by the current user (e.g. Chinese, Western, fast food, snacks, buffet); an electronic advertisement screen may play advertisement videos the user finds interesting; an intelligent shopping-guide robot may guide the user to shops of interest; and a shared car may be unlocked and provide navigation to the destination the user wants, and so on.
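A sketch of this interest-based selection follows, scoring the scene's full service catalogue by tag overlap with the user's interest point information; the catalogue format and the overlap scoring are assumptions, not something prescribed by the present disclosure.

```python
def build_interest_based_strategy(catalogue, interest_tags, top_n: int = 3) -> dict:
    """catalogue: list of dicts like {"service_id": "ad-video-17", "tags": {"western", "fast_food"}}.
    Keep the services whose tags best overlap the user's interest point information."""
    interest = set(interest_tags)
    scored = sorted(catalogue,
                    key=lambda s: len(set(s["tags"]) & interest),
                    reverse=True)
    chosen = [s["service_id"] for s in scored[:top_n] if set(s["tags"]) & interest]
    return {
        "actions": [
            {"target": "smart_device",        # e.g. the ad screen plays a preferred video
             "type": "present_services",
             "service_ids": chosen}
        ]
    }
```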
S130, the server performs information interaction with the mobile equipment and/or the intelligent equipment according to the interaction strategy so as to provide the business service contained in the business scene for the user.
Specifically, the interaction strategy formed by the server may have the server interact with the mobile device alone, or with the intelligent device in combination with the mobile device. During the interaction, the server and the mobile device and/or the intelligent device perform at least one round of information interaction according to the interaction strategy, so that the business service contained in the business scene is provided to the user. By interacting with the server through the mobile device and/or the intelligent device, the user enjoys the service content provided by the current service scene.
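Finally, a sketch of how the server might dispatch the generated strategy; the two send callbacks stand in for whatever push channels a deployment actually uses toward the mobile device and the smart device, and are assumptions for this example.

```python
def execute_strategy(strategy: dict, send_to_mobile, send_to_smart_device) -> None:
    """Carry out one round of information interaction according to the strategy:
    each action is dispatched to the mobile device or to the smart device."""
    for action in strategy.get("actions", []):
        if action["target"] == "mobile_device":
            send_to_mobile(action)            # e.g. push a message to the user's APP
        elif action["target"] == "smart_device":
            send_to_smart_device(action)      # e.g. tell the screen or locker what to do
        else:
            raise ValueError(f"unknown interaction target: {action['target']}")
```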
According to the near-field scene information interaction method provided by the embodiment of the present invention, a user can use a mobile device to detect and identify a near-field signal sent by an intelligent device in a service scene, so as to obtain the identification information of the intelligent device and upload it to the server; the server then generates an interaction strategy for the user in the service scene according to the identification information of the intelligent device, and performs information interaction with the mobile device and/or the intelligent device according to the interaction strategy, so as to provide the user with the business service contained in the service scene.
In the invention, for each specific service scene, the server generates an interaction strategy based on the identification information of the intelligent device in that scene, uploaded by the user within a limited time, and then indirectly realizes service interaction with the user through information interaction with the mobile device and/or the intelligent device. This interaction mode makes it easy for the service provider to flexibly set the interaction mode and adjust the service content, thereby providing the user with scene services that are closer to the user's needs and more personalized.
Example two
Fig. 3 is a schematic structural diagram of a server according to an embodiment of the present invention, where the server is configured to execute the method steps shown in the first embodiment, and specifically includes:
the information receiving module 310 is configured to receive identification information of the intelligent device in a service scene, which is uploaded by the mobile device, where the identification information is obtained after a user detects and identifies a near-field signal sent by the intelligent device by using the mobile device;
the policy generation module 320 is configured to generate an interaction policy of the user in the service scenario according to the identification information of the smart device;
and the information interaction module 330 is configured to perform information interaction with the mobile device and/or the smart device according to the interaction policy, so as to provide the service included in the service scenario to the user.
In a specific embodiment, the detecting and identifying, by the mobile device, of the near-field signal emitted by the smart device may include: detecting and identifying the near-field signal sent by the intelligent equipment by adopting at least one of the following near-field signal detection modes, and acquiring the identification information of the intelligent equipment from the signal: Bluetooth, wireless Wi-Fi, audio signal, NFC tag.
In a specific embodiment, the policy generating module 320 may be configured to determine a service scenario type where the user is located according to the identification information of the smart device; and generating an interaction strategy of the user in the service scene by taking the scene attribute information corresponding to the user as a constraint condition according to the service scene type of the user.
In a specific embodiment, the scene attribute information corresponding to the user may include: at least one of time information when the user uploads the identification information of the intelligent device, position information of the intelligent device and portrait characteristic information of the user.
In a specific embodiment, the policy generating module 320 may be configured to determine, according to the type of the service scenario in which the user is located and time information when the user uploads the identification information of the intelligent device, service data displayed by the intelligent device at the time; and generating an interaction strategy for performing information interaction with the mobile equipment and/or the intelligent equipment according to the business data displayed by the intelligent equipment at the moment so as to provide the user with the business service based on the displayed data.
In a specific embodiment, the policy generating module 320 may be configured to generate an interaction policy for performing information interaction with the mobile device and/or the smart device according to the type of the service scenario where the user is located and the location information where the smart device is located, so as to provide the user with a service based on the location where the smart device is located.
In one embodiment, the policy generation module 320 may be configured to determine interest point information of the user according to the portrait feature information of the user; and generating an interaction strategy for performing information interaction with the mobile equipment and/or the intelligent equipment according to the type of the service scene where the user is located and the interest point information of the user so as to provide the user with the service based on the interest point information of the user.
In an embodiment, the information interaction module 330 may be configured to perform at least one round of information interaction with the mobile device and/or the smart device according to the interaction policy, so as to provide the business service included in the business scenario to the user.
In a specific embodiment, the information receiving module 310 may be configured to, after receiving the identification information of the smart device in the service scenario uploaded by the mobile device, verify the received identification information of the smart device, and after the verification is passed, trigger the policy generating module 320 to perform an operation of generating the interaction policy of the user in the service scenario according to the identification information of the smart device.
Further, as shown in fig. 4, an embodiment of the present invention further provides a near-field scene information interaction system, including: a server 410, and an intelligent device 420 and a mobile device 430 located in a business scene; wherein:
the server 410 may be configured to receive identification information of the smart device 420 located in a service scene, which is uploaded by the mobile device 430, where the identification information is obtained after a user detects and identifies a near-field signal sent by the smart device 420 by using the mobile device 430; the server 410 generates an interaction strategy of the user in the service scene according to the identification information of the intelligent device 420; the server 410 performs information interaction with the mobile device 430 and/or the smart device 420 according to the interaction policy to provide the user with the service included in the service scenario.
In some embodiments, the server 410 may be a server structure shown in the second embodiment, and details are not described herein.
It is understood that the same or similar parts in the above embodiments may be mutually referred to, and the same or similar parts in other embodiments may be referred to for the content which is not described in detail in some embodiments.
Any process or method description in a flowchart, or otherwise described herein, may be understood as representing a module, segment, or portion of code that includes one or more executable instructions for implementing specific logical functions or steps of the process. The scope of the preferred embodiments of the present application also includes implementations in which functions are executed out of the order shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those skilled in the art.
It should be understood that portions of the present application may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
It will be understood by those skilled in the art that all or part of the steps carried by the method for implementing the above embodiments may be implemented by hardware related to instructions of a program, which may be stored in a computer readable storage medium, and when the program is executed, the program includes one or a combination of the steps of the method embodiments.
In addition, functional units in the embodiments of the present application may be integrated into one processing module, or each unit may exist alone physically, or two or more units are integrated into one module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode. The integrated module, if implemented in the form of a software functional module and sold or used as a stand-alone product, may also be stored in a computer readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic or optical disk, etc.
In the description herein, reference to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the application. In this specification, the schematic representations of the terms used above do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
Although embodiments of the present application have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present application, and that variations, modifications, substitutions and alterations may be made to the above embodiments by those of ordinary skill in the art within the scope of the present application.

Claims (11)

1. A near-field scene information interaction method is characterized by comprising the following steps:
the method comprises the steps that a server receives identification information of intelligent equipment in a service scene, wherein the identification information is uploaded by mobile equipment, and is obtained after a user detects and identifies a near-field signal sent by the intelligent equipment by using the mobile equipment;
the server generates an interaction strategy of the user in the service scene according to the identification information of the intelligent equipment;
and the server performs information interaction with the mobile equipment and/or the intelligent equipment according to the interaction strategy so as to provide the service contained in the service scene for the user.
2. The method of claim 1, wherein the detecting and identifying, by the mobile device, of the near-field signal emitted by the smart device comprises:
detecting and identifying the near-field signal sent by the intelligent equipment by adopting at least one of the following near-field signal detection modes, and acquiring the identification information of the intelligent equipment from the signal: Bluetooth, wireless Wi-Fi, audio signal, NFC tag.
3. The method of claim 1, wherein the server generating the interaction policy of the user in the service scenario according to the identification information of the smart device comprises:
the server determines the type of the service scene where the user is located according to the identification information of the intelligent device;
and the server generates an interaction strategy of the user in the service scene according to the service scene type of the user, with the scene attribute information corresponding to the user as a constraint condition.
4. The method of claim 3, wherein the scene attribute information corresponding to the user comprises: at least one of time information when the user uploads the identification information of the intelligent equipment, position information of the intelligent equipment, and portrait characteristic information of the user.
5. The method of claim 4, wherein the server generating the interaction policy of the user in the service scenario comprises:
the server determines the service data displayed by the intelligent equipment at the moment according to the service scene type of the user and the moment information when the user uploads the identification information of the intelligent equipment;
and the server generates, according to the business data displayed by the intelligent equipment at the moment, an interaction strategy for performing information interaction with the mobile equipment and/or the intelligent equipment, so as to provide the user with a business service based on the displayed data.
6. The method of claim 4, wherein the server generating the interaction policy of the user in the service scenario comprises:
and the server generates, according to the type of the service scene where the user is located and the position information of the intelligent equipment, an interaction strategy for performing information interaction with the mobile equipment and/or the intelligent equipment, so as to provide the user with a service based on the position of the intelligent equipment.
7. The method of claim 4, wherein the server generating the interaction policy of the user in the service scenario comprises:
the server determines interest point information of the user according to the portrait feature information of the user;
and the server generates an interaction strategy for performing information interaction with the mobile equipment and/or the intelligent equipment according to the type of the service scene where the user is located and the interest point information of the user so as to provide the user with a service based on the interest point information of the user.
8. The method according to any one of claims 1-7, wherein the server performing information interaction with the mobile device and/or the smart device according to the interaction policy comprises:
and the server performs at least one round of information interaction with the mobile equipment and/or the intelligent equipment according to the interaction strategy so as to provide the business service contained in the business scene for the user.
9. The method of claim 1, wherein after the server receives the identification information of the smart device located in the service scenario uploaded by the mobile device, the method further comprises:
and the server verifies the received identification information of the intelligent equipment, and executes the operation of generating the interaction strategy of the user in the service scene according to the identification information of the intelligent equipment after the verification is passed.
10. A server, comprising:
the information receiving module is used for receiving identification information of the intelligent equipment in a service scene, which is uploaded by the mobile equipment, wherein the identification information is obtained after a user detects and identifies a near-field signal sent by the intelligent equipment by using the mobile equipment;
the strategy generation module is used for generating an interaction strategy of the user in the service scene according to the identification information of the intelligent equipment;
and the information interaction module is used for performing information interaction with the mobile equipment and/or the intelligent equipment according to the interaction strategy so as to provide the business service contained in the business scene for the user.
11. A near field scene information interaction system, comprising: a server, and intelligent equipment and mobile equipment which are positioned in a service scene;
the server is used for receiving identification information of the intelligent equipment in a service scene uploaded by the mobile equipment, wherein the identification information is obtained after a user detects and identifies a near-field signal sent by the intelligent equipment by using the mobile equipment; the server generates an interaction strategy of the user in the service scene according to the identification information of the intelligent equipment; and the server performs information interaction with the mobile equipment and/or the intelligent equipment according to the interaction strategy so as to provide the service contained in the service scene for the user.
CN201910893872.8A 2019-09-20 2019-09-20 Near scene information interaction method, server and system Pending CN110602680A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910893872.8A CN110602680A (en) 2019-09-20 2019-09-20 Near scene information interaction method, server and system

Publications (1)

Publication Number Publication Date
CN110602680A true CN110602680A (en) 2019-12-20

Family

ID=68861774

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910893872.8A Pending CN110602680A (en) 2019-09-20 2019-09-20 Near scene information interaction method, server and system

Country Status (1)

Country Link
CN (1) CN110602680A (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105099511A (en) * 2014-04-17 2015-11-25 阿里巴巴集团控股有限公司 Method and device for realizing near-field services based on mobile network
CN105162846A (en) * 2015-08-10 2015-12-16 深圳市联信通信息科技有限公司 System and method for providing customized service for user based on identity identification
CN106028262A (en) * 2016-05-12 2016-10-12 腾讯科技(深圳)有限公司 Near-field service acquisition method and device for application
CN106856488A (en) * 2015-12-08 2017-06-16 阿里巴巴集团控股有限公司 A kind of scene perception and the method and device of offer service
CN107534849A (en) * 2015-07-24 2018-01-02 谷歌有限责任公司 System and method for personalized common equipment
CN107580063A (en) * 2017-09-18 2018-01-12 何杰斌 User terminal and business end polymerize by scene on the spot, the System and method for of interaction
KR20180067268A (en) * 2016-12-12 2018-06-20 현대자동차주식회사 Method for advertising a personal service and providing method of a personal service thereof
CN109905878A (en) * 2019-02-28 2019-06-18 阿里巴巴集团控股有限公司 The method and apparatus of information push

Similar Documents

Publication Publication Date Title
US10028146B2 (en) Management server and method for controlling device, user terminal apparatus and method for controlling device, and user terminal apparatus and control method thereof
US11004241B2 (en) Method and apparatus for producing and reproducing augmented reality contents in mobile terminal
US20210200501A1 (en) Projection, control, and management of user device applications using a connected resource
US9749808B2 (en) Method and apparatus for recommending content based on a travel route
CN107113520B (en) System and method for testing the media device used in the media environment connected with certification
KR102071579B1 (en) Method for providing services using screen mirroring and apparatus thereof
EP3221817B1 (en) Screenshot based indication of supplemental information
US9503854B1 (en) Criteria-associated media content
CN102348014B (en) For using sound that the apparatus and method of augmented reality service are provided
US20210056762A1 (en) Design and generation of augmented reality experiences for structured distribution of content based on location-based triggers
WO2015058623A1 (en) Multimedia data sharing method and system, and electronic device
CN112242980A (en) Screen projection method and device
US20150002506A1 (en) Method and apparatus for providing augmented reality display spaces
US20210095986A1 (en) Travel based notifications
CN111031391A (en) Video dubbing method, device, server, terminal and storage medium
CN111628925A (en) Song interaction method and device, terminal and storage medium
CN105608095B (en) Multimedia playing method and device and mobile terminal
CN111435377A (en) Application recommendation method and device, electronic equipment and storage medium
CN110602680A (en) Near scene information interaction method, server and system
CN210405658U (en) Near-field scene recognition device and information interaction system
CN104205902A (en) Management server and method for controlling device, user terminal apparatus and method for controlling device, and user terminal apparatus and control method thereof
CN112786022A (en) Terminal, first voice server, second voice server and voice recognition method
KR102647904B1 (en) Method, system, and computer program for classify place review images based on deep learning
KR101572349B1 (en) Voting system and object presence system using computing device and operatiog method thereof
TW201621274A (en) Cloud image positioning and navigation method and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information
Address after: Room 801-805, 8/F, Building 10, Huaweili, Chaoyang District, Beijing 100020
Applicant after: Beijing tiying Media Technology Co., Ltd.
Address before: 801-806, 8th Floor, Building 10, Huaweili, Chaoyang District, Beijing 100020
Applicant before: BEIJING BAC TIKINMEDIA ADVERTISING MEDIA Co., Ltd.
RJ01 Rejection of invention patent application after publication
Application publication date: 20191220