CN112272352B - Device interaction method and device, storage medium and computer device

Info

Publication number: CN112272352B
Application number: CN202011461597.1A
Authority: CN (China)
Prior art keywords: information, target, interactive, equipment, scene information
Legal status: Active (the legal status is an assumption and is not a legal conclusion)
Other languages: Chinese (zh)
Other versions: CN112272352A
Inventors: 张延�, 沈国斌
Current Assignee: Zhejiang Koubei Network Technology Co Ltd (the listed assignee may be inaccurate)
Original Assignee: Zhejiang Koubei Network Technology Co Ltd
Application filed by Zhejiang Koubei Network Technology Co Ltd
Priority to CN202011461597.1A
Publication of CN112272352A, followed by grant and publication of CN112272352B

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04W: WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00: Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/02: Services making use of location information
    • H04W 4/025: Services making use of location information using location based information parameters
    • H04W 4/20: Services signalling; Auxiliary data signalling, i.e. transmitting data via a non-traffic channel
    • H04W 4/30: Services specially adapted for particular environments, situations or purposes
    • H04W 4/80: Services using short range communication, e.g. near-field communication [NFC], radio-frequency identification [RFID] or low energy communication

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Telephonic Communication Services (AREA)

Abstract

The application discloses a device interaction method and device, a storage medium, and a computer device. The method comprises the following steps: acquiring target scene information and sending the target scene information to a server, so that the server returns interactive information of a target device matched with the target scene information based on pre-acquired scene information, wherein the target scene information comprises target positioning information and target environment information; receiving and outputting the interactive information; and acquiring an interactive instruction corresponding to the interactive information, and controlling the target device to respond to the interactive instruction. Compared with the prior-art approach of establishing an interactive connection with the target device by recognizing a two-dimensional code, the interactive information of the target device can be acquired from target scene information collected at low cost, and the target device can be controlled to respond to the interactive instruction. This reduces the operation cost of information acquisition, improves the accuracy of information acquisition, and thereby improves the success rate of contactless interaction.

Description

Device interaction method and device, storage medium and computer device
Technical Field
The present application relates to the field of computer technologies, and in particular, to a device interaction method and apparatus, a storage medium, and a computer device.
Background
In certain scenarios, contactless interaction between a person and a physical device is of great significance. For example, during a disease outbreak, various physical devices can easily become carriers of viruses. Elevator buttons are one example: a user has to touch the buttons when taking an elevator, so if a virus is present on the buttons, the user can easily pick it up and become infected.
In existing contactless interaction schemes for physical devices, the device information is generally acquired by scanning a two-dimensional code, and a communication connection with the physical device is then established, so that interaction with the device is realized. However, recognizing the two-dimensional code requires the user to be close enough to the code mark, to hold the mobile phone steady, and to have good lighting conditions.
Disclosure of Invention
According to an aspect of the present application, there is provided a device interaction method, including:
acquiring target scene information, and sending the target scene information to a server so as to enable the server to return interactive information of target equipment matched with the target scene information based on pre-acquired scene information, wherein the target scene information comprises target positioning information and target environment information;
receiving and outputting the interactive information;
and acquiring an interactive instruction corresponding to the interactive information, and controlling the target equipment to respond to the interactive instruction.
Optionally, the target location information includes, but is not limited to, any one of latitude and longitude information, WiFi information, base station information, or a combination thereof, and the target environment information includes, but is not limited to, any one of lighting information, magnetic field information, air pressure information, or a combination thereof.
Optionally, after the target scene information is sent to the server, the method further includes:
receiving equipment identification code inquiry information returned by the server and outputting the equipment identification code inquiry information;
and acquiring equipment identification code input information corresponding to the equipment identification code inquiry information, and sending the equipment identification code to the server so as to enable the server to return the interactive information matched with the target scene information and the equipment identification code.
Optionally, the sending the target scene information to a server specifically includes:
acquiring a device identification code corresponding to the target device;
and sending the target scene information and the equipment identification code to the server so as to enable the server to return the interactive information matched with the target scene information and the equipment identification code.
Optionally, the receiving and outputting the interactive information specifically includes:
receiving the interactive information;
generating an interaction output signal corresponding to the interactive information, and outputting the interaction output signal, wherein the interaction output signal includes but is not limited to an interaction display interface and/or interaction sound wave information, and the interaction instruction includes but is not limited to a display interface touch interaction instruction and/or a sound wave input interaction instruction.
Optionally, the method further comprises:
receiving a communication address and a communication key of the target device;
the obtaining of the interactive instruction corresponding to the interactive information and the controlling of the target device to respond to the interactive instruction specifically include:
establishing a communication connection with the target device based on the communication address and the communication key;
and acquiring an interactive instruction corresponding to the interactive information, and sending the interactive instruction to the target equipment so as to enable the target equipment to respond to the interactive instruction.
Optionally, the obtaining an interactive instruction corresponding to the interactive information, and controlling the target device to respond to the interactive instruction specifically include:
and acquiring an interactive instruction corresponding to the interactive information, and sending the interactive instruction to the server so that the server controls the target device to respond to the interactive instruction.
According to another aspect of the present application, there is provided a device interaction method, including:
receiving target scene information, wherein the target scene information comprises target positioning information and target environment information;
determining target equipment matched with the target scene information according to pre-collected scene information, and acquiring interactive information corresponding to the target equipment, wherein the pre-collected scene information comprises preset scene information corresponding to a plurality of preset equipment;
and sending the interactive information to enable the terminal equipment to control the target equipment according to the interactive information.
Optionally, the target location information includes, but is not limited to, any one of latitude and longitude information, WiFi information, base station information, or a combination thereof, and the target environment information includes, but is not limited to, any one of lighting information, magnetic field information, air pressure information, or a combination thereof.
Optionally, the determining, according to the pre-acquired scene information, the target device matched with the target scene information specifically includes:
screening the preset scene information according to the target positioning information, and determining alternative equipment corresponding to the target positioning information;
selecting the target device matched with the target scene information from the alternative devices.
Optionally, the selecting, among the candidate devices, the target device matched with the target scene information specifically includes:
respectively determining the matching degree of preset scene information corresponding to each alternative device and the target scene information based on the weight of each environment information corresponding to each alternative device;
and acquiring the target equipment with the matching degree larger than a preset matching degree threshold value from the alternative equipment.
Optionally, after selecting the target device matching the target scene information, the method further includes:
if a plurality of target devices are included, sending device identification code inquiry information to the terminal device;
receiving a device identification code corresponding to the device identification code inquiry information;
determining a target device matching the device identification code among the plurality of target devices.
Optionally, the method further comprises:
receiving a device identification code;
the selecting, among the candidate devices, the target device matched with the target scene information specifically includes:
selecting the target device matching the target scene information and the device identification code among the candidate devices.
Optionally, the method further comprises:
acquiring a communication identifier and a communication key corresponding to the target equipment;
and sending the communication identifier and the communication key to the terminal equipment.
Optionally, after the interactive information is sent, the method further includes:
and receiving an interactive instruction corresponding to the interactive information, and controlling the target equipment to respond to the interactive instruction.
Optionally, before receiving the target scenario information, the method further includes:
acquiring multiple groups of original scene information corresponding to each preset device;
clustering multiple groups of original scene information, and determining the preset scene information corresponding to each preset device;
if the similarity between the preset scene information corresponding to any plurality of preset devices is greater than a preset similarity threshold, respectively allocating device identification codes to the plurality of preset devices;
and determining the pre-acquisition scene information based on the corresponding relation between the preset equipment and the preset scene information and the equipment identification code.
According to another aspect of the present application, there is provided a device interaction apparatus, including:
the scene information acquisition module is used for acquiring target scene information and sending the target scene information to the server so as to enable the server to return interactive information of target equipment matched with the target scene information based on pre-acquired scene information, wherein the target scene information comprises target positioning information and target environment information;
the interactive information output module is used for receiving and outputting the interactive information;
and the target equipment control module is used for acquiring an interactive instruction corresponding to the interactive information and controlling the target equipment to respond to the interactive instruction.
Optionally, the target location information includes, but is not limited to, any one of latitude and longitude information, WiFi information, base station information, or a combination thereof, and the target environment information includes, but is not limited to, any one of lighting information, magnetic field information, air pressure information, or a combination thereof.
Optionally, the apparatus further comprises:
the identification code query module is used for receiving equipment identification code query information returned by the server after the target scene information is sent to the server, and outputting the equipment identification code query information according to a preset output rule;
and the identification code sending module is used for acquiring equipment identification code input information corresponding to the equipment identification code inquiry information and sending the equipment identification code to the server so as to enable the server to return the interactive information matched with the target scene information and the equipment identification code.
Optionally, the scene information obtaining module specifically includes:
an identification code acquisition unit for acquiring a device identification code corresponding to the target device;
and the identification code sending unit is used for sending the target scene information and the equipment identification code to the server so as to enable the server to return the interactive information matched with the target scene information and the equipment identification code.
Optionally, the interactive information output module specifically includes:
an interactive information receiving unit, configured to receive the interactive information;
the interactive information output unit is used for generating an interactive output signal corresponding to the interactive information and outputting the interactive output signal, wherein the interactive output signal includes but is not limited to an interactive display interface and/or interactive sound wave information, and the interactive instruction includes but is not limited to a display interface touch interactive instruction and/or a sound wave input interactive instruction.
Optionally, the apparatus further comprises:
the communication information receiving module is used for receiving the communication address and the communication key of the target device;
the target device control module specifically includes:
the communication connection unit is used for establishing communication connection with the target equipment;
and the first instruction sending unit is used for acquiring an interactive instruction corresponding to the interactive information and sending the interactive instruction to the target equipment so as to enable the target equipment to respond to the interactive instruction.
Optionally, the target device control module specifically includes:
and the second instruction sending unit is used for acquiring an interactive instruction corresponding to the interactive information and sending the interactive instruction to the server so that the server controls the target device to respond to the interactive instruction.
According to another aspect of the present application, there is provided a device interaction apparatus, including:
the scene information receiving module is used for receiving target scene information, wherein the target scene information comprises target positioning information and target environment information;
the interactive information acquisition module is used for determining target equipment corresponding to the target scene information according to pre-acquired scene information and acquiring interactive information corresponding to the target equipment, wherein the pre-acquired scene information comprises preset scene information corresponding to a plurality of preset equipment;
and the interactive information sending module is used for sending the interactive information so as to enable the terminal equipment to control the target equipment according to the interactive information.
Optionally, the target location information includes, but is not limited to, any one of latitude and longitude information, WiFi information, base station information, or a combination thereof, and the target environment information includes, but is not limited to, any one of lighting information, magnetic field information, air pressure information, or a combination thereof.
Optionally, the interactive information acquisition module specifically includes:
the alternative equipment determining unit is used for screening the preset scene information according to the target positioning information and determining alternative equipment corresponding to the target positioning information;
and the target device determining unit is used for selecting the target device matched with the target scene information from the candidate devices.
Optionally, the target device determining unit specifically includes:
the matching degree calculation subunit is configured to determine, based on each environment information weight corresponding to each candidate device, a matching degree between preset scene information corresponding to each candidate device and the target scene information, respectively;
and the target equipment determining subunit is used for acquiring the target equipment of which the matching degree is greater than a preset matching degree threshold value from the alternative equipment.
Optionally, the apparatus further comprises:
an inquiry information sending module, configured to send device identification code inquiry information to the terminal device if a plurality of target devices are included after the target device matched with the target scene information is selected;
the first identification code receiving module is used for receiving the equipment identification code corresponding to the equipment identification code inquiry information;
and the target equipment matching module is used for determining the target equipment matched with the equipment identification code in the plurality of target equipment.
Optionally, the apparatus further comprises:
the second identification code receiving module is used for receiving the equipment identification code;
the target device determining unit is specifically configured to: selecting the target device matching the target scene information and the device identification code among the candidate devices.
Optionally, the apparatus further comprises:
the communication information acquisition module is used for acquiring a communication identifier and a communication key corresponding to the target equipment;
and the communication information sending module is used for sending the communication identifier and the communication key to the terminal equipment.
Optionally, the apparatus further comprises:
and the target equipment control module is used for receiving an interactive instruction corresponding to the interactive information after the interactive information is sent, and controlling the target equipment to respond to the interactive instruction.
Optionally, the apparatus further comprises:
the original information acquisition module is used for acquiring multiple groups of original scene information corresponding to each preset device before receiving the target scene information;
the original information clustering module is used for clustering a plurality of groups of original scene information and determining the preset scene information corresponding to each preset device;
the identification code distribution module is used for respectively distributing device identification codes to a plurality of preset devices if the similarity between the preset scene information corresponding to the plurality of preset devices is greater than a preset similarity threshold;
and the scene information determining module is used for determining the pre-acquisition scene information based on the corresponding relation between the preset equipment and the preset scene information and the equipment identification code.
According to yet another aspect of the present application, there is provided a storage medium having stored thereon a computer program which, when executed by a processor, implements the above-described device interaction method.
According to yet another aspect of the present application, there is provided a computer device comprising a storage medium, a processor, and a computer program stored on the storage medium and executable on the processor, the processor implementing the device interaction method when executing the program.
By means of the above technical solution, in the device interaction method and device, the storage medium, and the computer device, the target scene information is sent to the server, so that the server determines the target device matched with the target scene information and obtains the interactive information of the target device; the interactive information is output, the interactive instruction issued by the user after obtaining the interactive information is received, and the target device is controlled to respond to the interactive instruction, thereby realizing non-contact interaction between the user and the target device. Compared with the prior-art approach of establishing an interactive connection with the target device by recognizing a two-dimensional code, the interactive information of the target device can be acquired from target scene information collected at low cost, and the target device can be controlled to respond to the interactive instruction, which reduces the operation cost of information acquisition, improves the accuracy of information acquisition, improves the success rate of contactless interaction, and improves the user experience.
The foregoing is only an overview of the technical solutions of the present application. In order that the technical means of the present application may be understood more clearly and implemented according to the content of this description, and in order that the above and other objects, features, and advantages of the present application may become more apparent, a detailed description of the present application is given below.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
fig. 1 is a flowchart illustrating a device interaction method according to an embodiment of the present application;
fig. 2 is a flowchart illustrating another device interaction method provided in an embodiment of the present application;
fig. 3 is a flowchart illustrating another device interaction method provided in an embodiment of the present application;
fig. 4 is a flowchart illustrating another device interaction method provided in an embodiment of the present application;
fig. 5 is a schematic structural diagram illustrating a device interaction apparatus according to an embodiment of the present application;
fig. 6 shows a schematic structural diagram of another device interaction apparatus provided in an embodiment of the present application.
Detailed Description
The present application will be described in detail below with reference to the accompanying drawings in conjunction with embodiments. It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict.
In this embodiment, a device interaction method is provided, as shown in fig. 1, the method includes:
step 101, acquiring target scene information, and sending the target scene information to a server so that the server returns interactive information of target equipment matched with the target scene information based on pre-acquired scene information, wherein the target scene information comprises target positioning information and target environment information;
the embodiment of the application is applied to terminal equipment, specifically, the terminal equipment can be intelligent equipment held by a user, such as a smart phone, an intelligent tablet computer and the like, and the terminal equipment can be provided with a signal acquisition module capable of acquiring scene signals in a built-in or peripheral mode, such as a positioning module for realizing a positioning function and an optical sensor for acquiring illumination signals. In the embodiment of the present application, various scene signals are collected by a signal collection module equipped to a terminal device, so that the terminal device can perform signal analysis on the collected scene signals to obtain target scene information, for example, an analog signal collected by an optical sensor is converted into a digital signal capable of describing illumination intensity, and the digital signal can be used as target scene information, and further the target scene information is sent to a server, it should be noted that the target scene information in the embodiment of the present application may include target positioning information and target environment information collected in a target scene, for example, when a user walks to a vicinity of an elevator in a shopping mall with a smart phone, and needs to use the elevator in the shopping mall, the location where the user is located may be used as the target scene, and various scene signals can be collected in the location by the signal collection module equipped to the smart phone, and the acquired scene signals are processed into target scene information through the smart phone and then sent to the server. In addition, the target scene information may be acquired by the background of the intelligent terminal by acquiring information of each signal acquisition module at intervals through calling the API of the operating system, or acquired by acquiring information of each signal acquisition module through calling the API of the operating system based on active operation of the user.
In the embodiment of the application, the interactive information of the target device is determined based on the target scene information, which includes the target positioning information corresponding to the target scene and the target environment information corresponding to the target scene. The geographic position of the target scene can therefore be determined from the target positioning information, so as to roughly screen out which controllable devices correspond to that geographic position, and the environment around the target scene can then be determined from the target environment information. For example, if the interactive devices in a certain mall include an elevator and an intelligent question-and-answer robot, and the light around the elevator is dim while the light around the robot is strong, the target device can be determined based on the target positioning information together with the illumination information. When the target scene information is collected, neither the target positioning information nor the target environment information places high operational demands on the user; to collect the illumination information, for instance, the user only needs to remove anything covering the optical sensor of the terminal device. The target scene information is therefore easy to acquire at low cost and, because acquisition does not depend on the user's level of skill, its accuracy is high.
Specifically, after receiving the target scene information, the server matches it against the pre-stored, pre-acquired scene information and determines the target device corresponding to the target scene information as well as the interactive information of that target device. For example, if the server determines, based on the pre-acquired scene information, that the target device corresponding to the received target scene information is an elevator, it further obtains the function information of the elevator as the interactive information to be sent to the terminal device held by the user; if the function of the elevator is transportation between floors minus 2 and 15, the interactive information of the target device indicates that the elevator can provide transportation service between floors minus 2 and 15.
In addition, the pre-acquired scene information may also be stored in advance in the terminal device, in which case the terminal device determines the target device and the interactive information corresponding to the target device on the terminal device side, based on the target scene information and the pre-acquired scene information.
Step 102, receiving and outputting the interactive information;
and step 103, acquiring an interactive instruction corresponding to the interactive information, and controlling the target device to respond to the interactive instruction.
After the terminal device receives the interactive information, it can output the information in a display output mode or a voice output mode. Once the user sees the interactive information on the display interface of the terminal device, or hears it output by the loudspeaker of the terminal device, the user can issue an interactive instruction according to his or her own needs. Non-contact interaction between the user and the target device is thus realized, and the interaction between them becomes more convenient and safer.
It should be noted that the target device in the embodiment of the present application may include various physical devices, such as an elevator, an intelligent question and answer robot, a sweeping robot, an air conditioner, an intelligent kitchen tool, and the like, which is not limited herein.
By applying the technical scheme of this embodiment, the target scene information is sent to the server, so that the server determines the target device matched with the target scene information and obtains the interactive information of the target device; the interactive information is output, the interactive instruction issued by the user after obtaining the interactive information is received, and the target device is controlled to respond to the interactive instruction, thereby realizing non-contact interaction between the user and the target device. Compared with the prior-art approach of establishing an interactive connection with the target device by recognizing a two-dimensional code, the interactive information of the target device can be acquired from target scene information collected at low cost, and the target device can be controlled to respond to the interactive instruction, which reduces the operation cost of information acquisition, improves the accuracy of information acquisition, improves the success rate of contactless interaction, and improves the user experience.
In the embodiment of the present application, specifically, the target location information includes, but is not limited to, any one of latitude and longitude information, WiFi information, base station information, or a combination thereof, and the target environment information includes, but is not limited to, any one of lighting information, magnetic field information, air pressure information, or a combination thereof.
In the above embodiment, the WiFi information may specifically be WiFi network address information, and the base station information may specifically be base station address information, so as to implement positioning of a target scene. The target environment information is generally collected by a sensor built in or externally arranged on the terminal device, such as an optical sensor, an air pressure sensor, and the like. The information acquisition difficulty is low, the user operation cost is low, the realization is convenient, and the accuracy is high.
Further, as a refinement and an extension of the specific implementation of the above embodiment, in order to fully illustrate the specific implementation process of the embodiment, another device interaction method is provided, as shown in fig. 2, and the method includes:
step 201, acquiring target scene information and sending the target scene information to a server;
step 202, receiving equipment identification code inquiry information returned by the server, and outputting the equipment identification code inquiry information according to a preset output rule;
step 203, acquiring equipment identification code input information corresponding to the equipment identification code inquiry information, and sending the equipment identification code to a server so as to enable the server to return interactive information matched with the target scene information and the equipment identification code;
in the embodiment of the present application, it is considered that if a plurality of devices are located close to each other, corresponding scene information of the devices will be very similar, so that target devices actually required by a user cannot be well distinguished. For example, there are three elevators in an office building, and due to the proximity of the three elevators, the scene information collected by the user around the three elevators is also very close. In this case, in order to accurately find out the target device required by the user, an identification code may be set for each interactive device, and after receiving the target scene information, if a plurality of interactive preset devices are matched based on the target scene information, and it is impossible to accurately determine which device is the target device required by the user, the server may send device identification code query information to the terminal device. For example, for three elevators with close positions, the corresponding equipment identification code can be displayed at the corresponding position of each elevator, such as a display at an elevator entrance for display, so as to be convenient for human eye recognition. For the terminal device, the terminal device receives the device identification code query information sent by the server, and queries according to a preset output rule, for example, popping up a device identification code input box to indicate a user to input an identification code of the target device, and after the user inputs the device identification code, the terminal device returns the identification code to the server. And the server further determines target equipment required by the user based on the equipment identification code and the target scene information, and sends interactive information of the target equipment to the terminal equipment.
In addition, in the embodiment of the present application, optionally, the "sending the target scene information to the server" in step 201, together with steps 202 and 203, may be replaced with: acquiring a device identification code corresponding to the target device; and sending the target scene information and the device identification code to the server, so that the server returns the interactive information matched with the target scene information and the device identification code.
In this alternative, the device identification code is sent together with the target scene information; after the server receives the target scene information and the device identification code, it uses both to match the target device and feeds the interactive information corresponding to the target device back to the terminal device.
Step 204, receiving interactive information;
step 205, generating an interaction output signal corresponding to the interactive information, and outputting the interaction output signal, wherein the interaction output signal includes but is not limited to an interaction display interface and/or interaction sound wave information;
in steps 204 and 205, the terminal device receives the interactive information from the server, and generates an interactive output signal corresponding to the interactive information according to an output rule preset in the terminal device or an output rule carried in the interactive information, and outputs the interactive output signal, so that the user issues an interactive instruction based on the output interactive output signal, where the interactive output signal may be in various forms, for example, the interactive output signal may be output in the form of an interactive display interface, or may be output in the form of interactive sound wave information, that is, voice prompt information, or may be a combination of both of outputting the interactive display interface and outputting the interactive sound wave information. For example, the interactive information corresponding to the target device displayed on the display screen of the mobile phone may be a floor reachable by the elevator, or may be information such as an air conditioner switch and adjustable temperature and humidity. Therefore, the contact type interaction interface of the entity equipment is converted into an interaction output signal, and the user can realize interaction without contacting with the entity equipment.
And step 206, acquiring an interactive instruction corresponding to the interactive information, and controlling the target device to respond to the interactive instruction, wherein the interactive instruction includes but is not limited to a display interface touch interactive instruction and/or a sound wave input interactive instruction.
In step 206, after the user sees or hears the interaction output signal, an interaction instruction may be issued in a display interface touch manner or a sound wave input (i.e., voice input) manner, that is, the terminal device may obtain the interaction instruction input by the user through the touch display panel or through voice, and then control the target device to respond to the interaction instruction based on the interaction instruction.
Specifically, while receiving the interactive information in step 204, the method may further include: receiving a communication address and a communication key of the target device. Accordingly, the manner of controlling the target device to respond to the interactive instruction in step 206 may include: establishing a communication connection with the target device based on the communication address and the communication key; and acquiring an interactive instruction corresponding to the interactive information and sending it to the target device, so that the target device responds to the interactive instruction. It may alternatively include: acquiring an interactive instruction corresponding to the interactive information and sending it to the server, so that the server controls the target device to respond to the interactive instruction.
In this embodiment, the terminal device may control the target device directly, or it may send the interactive instruction to the server so that the server controls the target device. When the terminal device controls the target device directly, the communication connection can be established through a protocol such as Bluetooth. Specifically, the connection can be established using the received communication address and communication key of the target device: the terminal device searches for the target device at the communication address and sends a connection request carrying the communication key to that address; after receiving the connection request, the target device verifies the communication key and, if verification succeeds, establishes the communication connection with the terminal device. In addition, while sending the communication address and the communication key to the terminal device, the server may simultaneously send the communication key to the target device so that the target device can verify the terminal device's connection request. The interactive instruction input by the user is then transmitted to the target device; for example, the smartphone held by the user transmits the target floor information to the elevator, and the elevator responds to the information after receiving it. When the server is used to control the target device, for example when the target device does not support communication protocols such as Bluetooth, the terminal device may send the interactive instruction to the server, which forwards it to the target device control terminal, so that the target device is controlled to respond to the interactive instruction and contactless interaction between the user and the target device is finally realized.
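A minimal sketch of the direct-control path follows. The Bluetooth link itself is abstracted away; the sketch only models the exchange in which the connection request carries the communication key and the target device verifies it before responding to the interactive instruction. The class and function names are assumptions for illustration.

    import hmac

    class TargetDevice:
        # Stand-in for the physical device reachable at a communication address.

        def __init__(self, communication_key: bytes):
            self._key = communication_key

        def accept_connection(self, presented_key: bytes) -> bool:
            # The target device verifies the communication key carried in the
            # connection request (constant-time comparison).
            return hmac.compare_digest(self._key, presented_key)

        def respond(self, instruction: dict):
            print(f"target device responding to: {instruction}")

    def control_directly(device: TargetDevice, key: bytes, instruction: dict):
        # Terminal side: establish the connection with the received communication
        # address and key, then send the interactive instruction.
        if not device.accept_connection(key):
            raise ConnectionError("communication key rejected by the target device")
        device.respond(instruction)

    # e.g. the smartphone sends the target floor to the elevator
    control_directly(TargetDevice(b"shared-key"), b"shared-key", {"target_floor": 12})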
In this embodiment, a device interaction method is provided, as shown in fig. 3, the method includes:
step 301, receiving target scene information, wherein the target scene information includes target positioning information and target environment information;
step 302, screening preset scene information according to the target positioning information, and determining alternative equipment corresponding to the target positioning information;
and step 303, sending the interactive information to enable the terminal device to receive and output the interactive information, and responding to the interactive instruction corresponding to the interactive information to control the target device to execute the interactive instruction.
The above embodiment may be applied to a server. The server receives the target scene information sent by the terminal device and determines the interactive information of the target device based on it, where the target scene information includes the target positioning information and the target environment information corresponding to the target scene. The geographic location of the target scene can be determined from the target positioning information, so as to roughly screen out which controllable devices correspond to that location, and the environment around the target scene can then be determined from the target environment information. For example, if the interactive devices in a certain mall include an elevator and an intelligent question-and-answer robot, and the light around the elevator is dim while the light around the robot is strong, the target device can be determined based on the target positioning information together with the illumination information. When the target scene information is collected, neither the target positioning information nor the target environment information places high operational demands on the user; to collect the illumination information, for instance, the user only needs to remove anything covering the optical sensor of the terminal device, so the target scene information is easy to acquire and, because acquisition does not depend on the user's skill, highly accurate. After the terminal device receives the interactive information, it can output the information by display or by voice; once the user sees the interactive information on the display interface of the terminal device or hears it from the loudspeaker, the user can issue an interactive instruction according to his or her own needs. Non-contact interaction between the user and the target device is thus realized, and the interaction becomes more convenient and safer.
Further, as a refinement and an extension of the specific implementation of the above embodiment, in order to fully illustrate the specific implementation process of the embodiment, another device interaction method is provided, as shown in fig. 4, and the method includes:
step 401, receiving target scene information, wherein the target scene information includes target positioning information and target environment information;
step 402, screening preset scene information according to the target positioning information, and determining alternative equipment corresponding to the target positioning information;
step 403, selecting target devices matched with the target scene information from the alternative devices;
in steps 401 to 403, after receiving the target scene information, the server may roughly screen the candidate devices within a certain spatial range according to the target positioning information, in particular, may implement spatial range screening based on a GeoHash algorithm, determine the candidate devices within the preset range of the target scene, further determine the target devices matched with the target scene information based on the target environment information, the target positioning information, and the preset scene information corresponding to the candidate devices, for example, perform similarity calculation based on the target scene information and the preset scene information corresponding to each candidate device, and use the candidate devices with similarity greater than the preset value as the target devices, for example, implement accurate matching of the target devices through a K-D tree algorithm, in consideration of the fact that the data size of the pre-acquired scene information may be large.
In this embodiment, optionally, step 403 may specifically include: respectively determining the matching degree of preset scene information and target scene information corresponding to each alternative device based on each environment information weight corresponding to each alternative device; and acquiring the target equipment with the matching degree larger than a preset matching degree threshold value from the alternative equipment.
In this embodiment, each candidate device further corresponds to its own environment-information weights. For example, suppose that among the preset devices the preset scene information of each device includes illumination information, magnetic field information, and air pressure information, that the illumination information of device A differs significantly from that of the other devices, and that the magnetic field information of device B differs significantly from that of the other devices. The illumination information of device A and the magnetic field information of device B can then each be given a larger weight, so that when the target device is matched, the scene information with the most prominent differences has the greatest influence on the matching result. After the matching degree between each candidate device and the target scene information is calculated using that device's environment-information weights, the target devices whose matching degree exceeds a preset matching degree threshold are selected from the candidate devices.
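A minimal sketch of the weighted matching degree follows. Each candidate device carries its own environment-information weights, and the candidate devices whose matching degree exceeds the threshold are kept as target devices. The scoring formula (a weighted complement of the relative deviation), the weights, and the values are illustrative assumptions.

    def matching_degree(preset_env, target_env, weights):
        # Weighted agreement between preset and target scene information; 1.0 means
        # the environment information is identical.
        score, total_weight = 0.0, sum(weights.values())
        for name, weight in weights.items():
            deviation = abs(preset_env[name] - target_env[name]) / max(abs(preset_env[name]), 1e-6)
            score += weight * max(0.0, 1.0 - deviation)
        return score / total_weight

    CANDIDATES = {
        # device A: its illumination is the distinctive signal, so it gets a larger weight
        "device-A": ({"lux": 40.0, "mag": 50.0, "hpa": 1012.0},
                     {"lux": 0.6, "mag": 0.2, "hpa": 0.2}),
        # device B: its magnetic field is the distinctive signal
        "device-B": ({"lux": 200.0, "mag": 80.0, "hpa": 1012.0},
                     {"lux": 0.2, "mag": 0.6, "hpa": 0.2}),
    }

    def select_target_devices(target_env, threshold=0.9):
        return [name for name, (preset, weights) in CANDIDATES.items()
                if matching_degree(preset, target_env, weights) > threshold]

    print(select_target_devices({"lux": 42.0, "mag": 51.0, "hpa": 1012.2}))  # -> ['device-A']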
Step 404, if a plurality of target devices are included, sending device identification code inquiry information to the terminal device;
step 405, receiving a device identification code corresponding to the device identification code inquiry information;
step 406, determining a target device matched with the device identification code in the plurality of target devices;
in steps 404 to 406, it is considered that if multiple devices are located close to each other, the corresponding scene information will be so similar that the target devices actually required by the user cannot be well distinguished. For example, there are three elevators in an office building, and due to the proximity of the three elevators, the scene information collected by the user around the three elevators is also very close. In this case, in order to accurately find out the target device required by the user, an identification code may be set for each interactive device, and after receiving the target scene information, if a plurality of interactive alternative devices are matched based on the target scene information and it cannot be accurately determined which is the target device required by the user, the server may send device identification code query information to the terminal device. For example, for three elevators with close positions, the corresponding equipment identification code can be displayed at the corresponding position of each elevator, such as a display at an elevator entrance for display, so as to be convenient for human eye recognition. For the terminal device, the terminal device receives the device identification code query information sent by the server, and queries according to a preset output rule, for example, popping up a device identification code input box to indicate a user to input an identification code of the target device, and after the user inputs the device identification code, the terminal device returns the identification code to the server. And the server further selects one target device matched with the device identification code from the plurality of target devices based on the device identification code, and acquires the interactive information corresponding to the target device.
Step 407, sending the interactive information to enable the terminal device to receive and output the interactive information, and controlling the target device to execute the interactive instruction in response to the interactive instruction corresponding to the interactive information.
In step 407, the terminal device may control the target device directly, or it may send the interactive instruction to the server so that the server controls the target device. When the server is used to control the target device, for example when the target device does not support communication protocols such as Bluetooth, the terminal device may send the interactive instruction to the server, which forwards it to the target device control terminal; the target device is thus controlled to respond to the interactive instruction, and contactless interaction between the user and the target device is finally realized.
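A minimal sketch of this relay path follows: the terminal posts the interactive instruction to the server, and the server forwards it to the target device control terminal. The endpoints and the forwarding stub are assumptions for illustration.

    import requests

    INSTRUCTION_URL = "https://example.com/api/device/instruction"  # hypothetical

    def send_instruction_via_server(device_id, instruction):
        # Terminal side: hand the interactive instruction to the server.
        return requests.post(INSTRUCTION_URL,
                             json={"device_id": device_id, "instruction": instruction},
                             timeout=5).json()

    def forward_to_device_control(device_id, instruction):
        # Stand-in for the server's real channel to the target device control terminal.
        print(f"forwarding {instruction} to the control terminal of {device_id}")

    def handle_instruction(payload):
        # Server side: forward the instruction so that the target device responds to it.
        forward_to_device_control(payload["device_id"], payload["instruction"])
        return {"status": "forwarded"}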
In a manner of directly controlling the target device by using the terminal device, this embodiment further includes: acquiring the communication identifier and the communication key corresponding to the target device; and sending the communication identifier and the communication key to the terminal device. Specifically, the terminal device can establish a communication connection with the target device using the received communication identifier (for example, a communication address) and communication key: it searches for the target device at that address and sends a connection request carrying the communication key; after receiving the connection request, the target device verifies the communication key and, if verification succeeds, establishes the communication connection with the terminal device. The interactive instruction input by the user is then transmitted to the target device; for example, the smartphone held by the user transmits the target floor information to the elevator, and the elevator responds to the information after receiving it.
In this embodiment of the present application, optionally, the steps 402 to 406 may be replaced with: receiving a device identification code; screening preset scene information according to the target positioning information, and determining alternative equipment corresponding to the target positioning information; and selecting the target device matched with the target scene information and the device identification code from the alternative devices.
In the embodiment, the terminal device sends the target scene information to the server and simultaneously sends the device identification code, and after receiving the target scene information and the device identification code, the server matches the target device with the information and feeds back interactive information corresponding to the target device to the terminal device.
In addition, the embodiment of the present application further includes a method for determining the pre-acquired scene information, which optionally, before step 301 or step 401, further includes:
acquiring multiple groups of original scene information corresponding to each preset device;
clustering multiple groups of original scene information, and determining preset scene information corresponding to each preset device;
if the similarity between the preset scene information corresponding to any plurality of preset devices is greater than a preset similarity threshold, respectively allocating device identification codes to the plurality of preset devices;
and determining the pre-acquisition scene information based on the corresponding relation between the preset equipment and the preset scene information and the equipment identification code.
The service provider of a preset device can capture original scene information for the scene corresponding to that device in advance through an intelligent device. To ensure the accuracy of scene-information collection, multiple groups of original scene information can be collected in advance, and cluster analysis is then performed on the multiple groups corresponding to each preset device. Specifically, cluster analysis is applied to the non-directional information in the original scene information, which reduces the data volume and noise interference: scene information strongly correlated with the operation mode, such as longitude and latitude or illumination intensity, is clustered, while directional information in the original scene information, such as base station addresses and network addresses, does not need clustering.

In addition, for several preset devices located close together, the corresponding scene information is highly similar, and an ordinary user would need considerable operating skill to capture scene information precise enough to distinguish them. To meet the contactless-interaction needs of ordinary users, a device identification code can be allocated to each of the preset devices whose preset scene information is highly similar. For example, three elevators at nearby positions can be assigned the device identification codes A, B, and C respectively; when the user needs to use one of the elevators, both the target scene information and the device identification code are sent to the server, and the server can then determine the target device the user requires. In a practical application scenario, the device identification code may be displayed at a specific location on the target device. On the server, the correspondence between the preset scene information of the preset devices and the device identification codes is stored as the pre-acquired scene information, so that the received target scene information and device identification code can be identified and matched.
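A minimal sketch of this preparation flow is given below, under assumed data shapes: each device's raw scene samples are clustered into a representative preset scene vector, and devices whose preset scenes are too similar to tell apart are assigned distinct identification codes. The clustering algorithm, the cosine similarity measure, and the threshold are illustrative choices, not specified by the patent.

```python
# Sketch of the preparation step: cluster raw scene samples per device and
# assign identification codes to devices with highly similar preset scenes.
import numpy as np
from sklearn.cluster import KMeans


def preset_scene(raw_samples: np.ndarray, k: int = 2) -> np.ndarray:
    """Cluster raw samples (rows = [lat, lon, lux, ...]) and keep the densest centroid."""
    km = KMeans(n_clusters=min(k, len(raw_samples)), n_init=10).fit(raw_samples)
    largest = np.bincount(km.labels_).argmax()
    return km.cluster_centers_[largest]


def assign_identification_codes(preset_scenes: dict, sim_threshold: float = 0.95) -> dict:
    """Give devices whose preset scenes exceed the similarity threshold distinct codes."""
    codes, next_code = {}, iter("ABCDEFGH")
    names = list(preset_scenes)
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            va, vb = preset_scenes[a], preset_scenes[b]
            sim = float(np.dot(va, vb) / (np.linalg.norm(va) * np.linalg.norm(vb) + 1e-9))
            if sim > sim_threshold:
                for name in (a, b):
                    if name not in codes:
                        codes[name] = next(next_code)
    return codes  # e.g. {"elevator-1": "A", "elevator-2": "B"}
```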
Further, as a specific implementation of the method in fig. 1, an embodiment of the present application provides a device interaction apparatus, as shown in fig. 5, where the apparatus includes:
the scene information acquiring module 501 is configured to acquire target scene information and send the target scene information to a server, so that the server returns interactive information of a target device matched with the target scene information based on pre-acquired scene information, where the target scene information includes target positioning information and target environment information;
an interactive information output module 502, configured to receive and output interactive information;
and the target device control module 503 is configured to obtain an interactive instruction corresponding to the interactive information, and control the target device to respond to the interactive instruction.
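Purely for illustration, the sketch below strings the three modules together on the terminal side: collect the target scene information, send it to the server, output the returned interactive information, and forward the user's instruction. The fixed sensor values, endpoint paths, and field names are placeholders assumed for this sketch.

```python
# Hedged end-to-end sketch of modules 501-503 on the terminal device.
import requests


def acquire_target_scene_info() -> dict:
    # In a real terminal these values would come from GPS, Wi-Fi, light,
    # magnetic-field, and barometric sensors; fixed values stand in here.
    return {
        "positioning": {"lat": 30.2741, "lon": 120.1551, "wifi_bssid": "aa:bb:cc:dd:ee:ff"},
        "environment": {"lux": 320.0, "magnetic_uT": 48.5, "pressure_hPa": 1003.2},
    }


def interact(server_url: str, instruction: dict) -> dict:
    scene = acquire_target_scene_info()
    # Module 501: send the target scene information to the server.
    match = requests.post(f"{server_url}/scene/match", json=scene, timeout=5).json()
    # Module 502: output the returned interactive information (here: print it).
    print("interactive info:", match.get("interactive_info"))
    # Module 503: send the user's interactive instruction for the matched device.
    resp = requests.post(f"{server_url}/interaction",
                         json={"device_id": match.get("device_id"), "instruction": instruction},
                         timeout=5)
    return resp.json()
```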
Optionally, the apparatus further comprises:
the identification code query module is used for receiving equipment identification code query information returned by the server after the target scene information is sent to the server, and outputting the equipment identification code query information according to a preset output rule;
and the identification code sending module is used for acquiring equipment identification code input information corresponding to the equipment identification code inquiry information and sending the equipment identification code to the server so as to enable the server to return interactive information matched with the target scene information and the equipment identification code.
Optionally, the scene information obtaining module 501 specifically includes:
an identification code acquisition unit for acquiring a device identification code corresponding to a target device;
and the identification code sending unit is used for sending the target scene information and the equipment identification code to the server so as to enable the server to return interactive information matched with the target scene information and the equipment identification code.
Optionally, the interactive information output module 502 specifically includes:
the interactive information receiving unit is used for receiving interactive information;
the interactive information output unit is used for generating an interactive output signal corresponding to the interactive information and outputting the interactive output signal, wherein the interactive output signal includes but is not limited to an interactive display interface and/or interactive sound wave information, and the interactive instruction includes but is not limited to a display interface touch interactive instruction and/or a sound wave input interactive instruction.
Optionally, the apparatus further comprises:
the communication information receiving module is used for receiving a communication address and a communication key of the target equipment;
the target device control module 503 specifically includes:
a communication connection unit for establishing a communication connection with a target device;
and the first instruction sending unit is used for acquiring the interactive instruction corresponding to the interactive information and sending the interactive instruction to the target equipment so as to enable the target equipment to respond to the interactive instruction.
Optionally, the target device control module 503 specifically includes:
and the second instruction sending unit is used for acquiring the interactive instruction corresponding to the interactive information and sending the interactive instruction to the server so as to enable the server to control the target equipment to respond to the interactive instruction.
Optionally, the target location information includes, but is not limited to, any one of latitude and longitude information, WiFi information, base station information, or a combination thereof, and the target environment information includes, but is not limited to, any one of lighting information, magnetic field information, air pressure information, or a combination thereof.
Further, as a specific implementation of the method in fig. 3, an embodiment of the present application provides a device interaction apparatus, as shown in fig. 6, where the apparatus includes:
a scene information receiving module 601, configured to receive target scene information, where the target scene information includes target positioning information and target environment information;
an interactable information obtaining module 602, configured to determine, according to pre-collected scene information, a target device corresponding to the target scene information, and obtain interactable information corresponding to the target device, where the pre-collected scene information includes preset scene information corresponding to a plurality of preset devices;
an interactive information sending module 603, configured to send interactive information, so that the terminal device controls the target device according to the interactive information.
Optionally, the interactable information obtaining module 602 specifically includes:
the alternative equipment determining unit is used for screening preset scene information according to the target positioning information and determining alternative equipment corresponding to the target positioning information;
and the target equipment determining unit is used for selecting the target equipment matched with the target scene information from the alternative equipment.
Optionally, the target device determining unit specifically includes:
the matching degree calculation subunit is used for respectively determining the matching degree of the preset scene information and the target scene information corresponding to each alternative device based on each environment information weight corresponding to each alternative device;
and the target equipment determining subunit is used for acquiring the target equipment of which the matching degree is greater than a preset matching degree threshold value from the alternative equipment.
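The weighted matching degree used by the target device determining unit might be computed as sketched below, under assumed numeric forms: each environment field of a candidate device carries its own weight, the matching degree is a weighted sum of per-field similarities, and candidates above the preset threshold are retained. The similarity function, weights, scales, and threshold are illustrative assumptions.

```python
# Sketch of a weighted matching degree between preset and target environment info.
def field_similarity(preset: float, observed: float, scale: float) -> float:
    """1.0 when identical, decaying toward 0 as the difference approaches `scale`."""
    return max(0.0, 1.0 - abs(preset - observed) / scale)


def matching_degree(preset_env: dict, target_env: dict, weights: dict, scales: dict) -> float:
    total_w = sum(weights.values()) or 1.0
    score = sum(weights[k] * field_similarity(preset_env[k], target_env[k], scales[k])
                for k in weights if k in preset_env and k in target_env)
    return score / total_w


def pick_targets(candidates: dict, target_env: dict, weights: dict, scales: dict,
                 threshold: float = 0.8) -> list:
    """Return candidate devices whose matching degree exceeds the preset threshold."""
    return [dev for dev, env in candidates.items()
            if matching_degree(env, target_env, weights, scales) > threshold]


# Example: illumination weighted more heavily than air pressure for indoor elevators.
candidates = {"elevator-A": {"lux": 300.0, "pressure_hPa": 1002.0},
              "elevator-B": {"lux": 80.0, "pressure_hPa": 1002.5}}
weights = {"lux": 0.7, "pressure_hPa": 0.3}
scales = {"lux": 200.0, "pressure_hPa": 5.0}
print(pick_targets(candidates, {"lux": 310.0, "pressure_hPa": 1002.2}, weights, scales))
```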
Optionally, the apparatus further comprises:
the inquiry information sending module is used for, if a plurality of target devices match the target scene information, sending equipment identification code inquiry information to the terminal equipment after the target equipment matched with the target scene information is selected;
the first identification code receiving module is used for receiving the equipment identification code corresponding to the equipment identification code inquiry information;
and the target device selection module is used for determining a target device matched with the device identification code in the plurality of target devices.
Optionally, the apparatus further comprises:
the second identification code receiving module is used for receiving the equipment identification code;
the target device determining unit is specifically configured to: and selecting the target device matched with the target scene information and the device identification code from the alternative devices.
Optionally, the apparatus further comprises:
the communication information acquisition module is used for acquiring a communication identifier and a communication key corresponding to the target equipment;
and the communication information sending module is used for sending the communication identifier and the communication key to the terminal equipment.
Optionally, the apparatus further comprises:
and the target equipment control module is used for receiving the interactive instruction corresponding to the interactive information after the interactive information is sent, and controlling the target equipment to respond to the interactive instruction.
Optionally, the target location information includes, but is not limited to, any one of latitude and longitude information, WiFi information, base station information, or a combination thereof, and the target environment information includes, but is not limited to, any one of lighting information, magnetic field information, air pressure information, or a combination thereof.
Optionally, the apparatus further comprises:
the original information acquisition module is used for acquiring multiple groups of original scene information corresponding to each preset device before receiving the target scene information;
the original information clustering module is used for clustering multiple groups of original scene information and determining the preset scene information corresponding to each preset device;
the identification code distribution module is used for respectively distributing equipment identification codes for any plurality of preset equipment if the similarity between the preset scene information corresponding to any plurality of preset equipment is greater than a preset similarity threshold;
and the scene information determining module is used for determining the pre-acquisition scene information based on the corresponding relation between the preset equipment and the preset scene information and the equipment identification code.
It should be noted that other corresponding descriptions of the functional units related to the device interaction apparatus provided in the embodiment of the present application may refer to corresponding descriptions in the methods in fig. 1 to fig. 4, and are not described herein again.
Based on the method shown in fig. 1 to 4, correspondingly, the embodiment of the present application further provides a storage medium, on which a computer program is stored, and the computer program, when executed by a processor, implements the device interaction method shown in fig. 1 to 4.
Based on such understanding, the technical solution of the present application may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a USB flash drive, a removable hard disk, or the like) and includes several instructions for enabling a computer device (which may be a personal computer, a server, a network device, or the like) to execute the method according to the implementation scenarios of the present application.
Based on the method shown in fig. 1 to 4 and the virtual device embodiment shown in fig. 5 to 6, in order to achieve the above object, the present application further provides a computer device, which may specifically be a personal computer, a server, a network device, and the like, where the computer device includes a storage medium and a processor; a storage medium for storing a computer program; a processor for executing a computer program to implement the device interaction method as described above and shown in fig. 1 to 4.
Optionally, the computer device may also include a user interface, a network interface, a camera, Radio Frequency (RF) circuitry, sensors, audio circuitry, a WI-FI module, and so forth. The user interface may include a Display screen (Display), an input unit such as a keypad (Keyboard), etc., and the optional user interface may also include a USB interface, a card reader interface, etc. The network interface may optionally include a standard wired interface, a wireless interface (e.g., a bluetooth interface, WI-FI interface), etc.
It will be appreciated by those skilled in the art that the computer device structure provided in this embodiment does not limit the computer device, which may include more or fewer components, combine certain components, or arrange the components differently.
The storage medium may further include an operating system and a network communication module. An operating system is a program that manages and maintains the hardware and software resources of a computer device, supporting the operation of information handling programs, as well as other software and/or programs. The network communication module is used for realizing communication among components in the storage medium and other hardware and software in the entity device.
Through the description of the above embodiments, those skilled in the art can clearly understand that the present application can be implemented by software plus a necessary general hardware platform, or by hardware. The target scene information is sent to a server so that the server determines the target device matched with the target scene information and obtains the interactive information of that device; the interactive information is then output, the interactive instruction issued by the user after viewing it is received, and the target device is controlled to respond to the interactive instruction, thereby realizing contactless interaction with the target device. Compared with the prior-art approach of establishing an interactive connection with the target device by scanning a two-dimensional code, the interactive information of the target device can be acquired from target scene information collected at low cost and the target device can be controlled to respond to the interactive instruction, which reduces the operation cost of information acquisition, improves the accuracy of information acquisition, increases the success rate of contactless interaction, and improves the user experience.
Those skilled in the art will appreciate that the figures are merely schematic representations of one preferred implementation scenario and that the blocks or flow diagrams in the figures are not necessarily required to practice the present application. Those skilled in the art will appreciate that the modules in the devices in the implementation scenario may be distributed in the devices in the implementation scenario according to the description of the implementation scenario, or may be located in one or more devices different from the present implementation scenario with corresponding changes. The modules of the implementation scenario may be combined into one module, or may be further split into a plurality of sub-modules.
The above application serial numbers are for description purposes only and do not represent the superiority or inferiority of the implementation scenarios. The above disclosure is only a few specific implementation scenarios of the present application, but the present application is not limited thereto, and any variations that can be made by those skilled in the art are intended to fall within the scope of the present application.

Claims (8)

1. A device interaction method, comprising:
acquiring target scene information, sending the target scene information to a server, so that the server screens candidate equipment matched with target positioning information based on pre-acquired scene information, respectively determining the matching degree of preset scene information corresponding to each candidate equipment and the target scene information based on each environment information weight corresponding to each candidate equipment, acquiring the target equipment with the matching degree being greater than a preset matching degree threshold value from the candidate equipment, and returning interactive information of the target equipment matched with the target scene information, wherein the target scene information comprises target positioning information and target environment information;
receiving and outputting the interactive information;
and acquiring an interactive instruction corresponding to the interactive information, and controlling the target equipment to respond to the interactive instruction.
2. The method of claim 1, wherein the target location information includes, but is not limited to, any one of latitude and longitude information, WiFi information, base station information, or a combination thereof, and wherein the target environment information includes, but is not limited to, any one of lighting information, magnetic field information, barometric information, or a combination thereof.
3. The method of claim 1, wherein after sending the target scenario information to a server, the method further comprises:
receiving equipment identification code inquiry information returned by the server and outputting the equipment identification code inquiry information;
and acquiring equipment identification code input information corresponding to the equipment identification code inquiry information, and sending the equipment identification code to the server so as to enable the server to return the interactive information matched with the target scene information and the equipment identification code.
4. The method according to claim 1, wherein the sending the target scene information to a server specifically includes:
acquiring a device identification code corresponding to the target device;
and sending the target scene information and the equipment identification code to the server so as to enable the server to return the interactive information matched with the target scene information and the equipment identification code.
5. The method according to claim 1, wherein the receiving and outputting the interactable information specifically includes:
receiving the interactive information;
generating an interaction output signal corresponding to the interactive information, and outputting the interaction output signal, wherein the interaction output signal includes but is not limited to an interaction display interface and/or interaction sound wave information, and the interaction instruction includes but is not limited to a display interface touch interaction instruction and/or a sound wave input interaction instruction.
6. A device interaction method, comprising:
receiving target scene information, wherein the target scene information comprises target positioning information and target environment information;
screening candidate devices matched with the target positioning information according to pre-collected scene information, respectively determining the matching degree of the preset scene information corresponding to each candidate device and the target scene information based on the weight of each environment information corresponding to each candidate device, acquiring target devices with the matching degrees larger than a preset matching degree threshold value from the candidate devices, and acquiring interactive information corresponding to the target devices, wherein the pre-collected scene information comprises the preset scene information corresponding to a plurality of preset devices;
and sending the interactive information to enable the terminal equipment to control the target equipment according to the interactive information.
7. The method of claim 6, wherein the target location information includes, but is not limited to, any one of latitude and longitude information, WiFi information, base station information, or a combination thereof, and wherein the target environment information includes, but is not limited to, any one of lighting information, magnetic field information, barometric information, or a combination thereof.
8. The method of claim 6, wherein prior to receiving the target context information, the method further comprises:
acquiring multiple groups of original scene information corresponding to each preset device;
clustering multiple groups of original scene information, and determining the preset scene information corresponding to each preset device;
if the similarity between the preset scene information corresponding to any plurality of preset devices is greater than a preset similarity threshold, respectively allocating device identification codes to the any plurality of preset devices;
and determining the pre-acquisition scene information based on the corresponding relation between the preset equipment and the preset scene information and the equipment identification code.
CN202011461597.1A 2020-12-14 2020-12-14 Device interaction method and device, storage medium and computer device Active CN112272352B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011461597.1A CN112272352B (en) 2020-12-14 2020-12-14 Device interaction method and device, storage medium and computer device

Publications (2)

Publication Number Publication Date
CN112272352A CN112272352A (en) 2021-01-26
CN112272352B true CN112272352B (en) 2021-04-13

Family

ID=74350103

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011461597.1A Active CN112272352B (en) 2020-12-14 2020-12-14 Device interaction method and device, storage medium and computer device

Country Status (1)

Country Link
CN (1) CN112272352B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113014633B (en) * 2021-02-20 2022-07-01 杭州云深科技有限公司 Method and device for positioning preset equipment, computer equipment and storage medium
US20220291726A1 (en) * 2021-03-09 2022-09-15 Apple Inc. Transferrable interface

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103428250A (en) * 2012-05-23 2013-12-04 联想(北京)有限公司 Equipment matching Method, server and terminal device
CN111204625A (en) * 2020-04-17 2020-05-29 北京云迹科技有限公司 Method and device for contactless elevator riding
CN111204626A (en) * 2020-04-17 2020-05-29 北京云迹科技有限公司 Method and device for contactless elevator riding
CN111422709A (en) * 2020-02-13 2020-07-17 深圳市旺龙智能科技有限公司 Method, system, control device and storage medium for taking public elevator without contact

Also Published As

Publication number Publication date
CN112272352A (en) 2021-01-26

Similar Documents

Publication Publication Date Title
CN111417028B (en) Information processing method, information processing device, storage medium and electronic equipment
CN105228106B (en) One kind being based on indoor medical staff's lookup method, location-server and system
US11340088B2 (en) Navigation system including navigation server, mobile terminals, and beacon signal generating devices
CN105008858B (en) For user's framework in the circle of indoor positioning
CN107979628B (en) Method, device and system for acquiring virtual article
KR101895455B1 (en) Method and apparatus for providing semantic location in electronic device
CN112272352B (en) Device interaction method and device, storage medium and computer device
US11265363B2 (en) IOT interaction system
CN109409244B (en) Output method of object placement scheme and mobile terminal
CN108347512B (en) Identity recognition method and mobile terminal
US10002584B2 (en) Information processing apparatus, information providing method, and information providing system
KR20170029178A (en) Mobile terminal and method for operating thereof
US11251887B2 (en) Signal strength band-based device management method and electronic device therefor
TW202105370A (en) Identity recognition preprocessing method and system and identity recognition method and system
CN112312308A (en) Indoor positioning method and terminal equipment
CN111723843B (en) Sign-in method, sign-in device, electronic equipment and storage medium
CN112929224A (en) Network distribution method and device of equipment, server and computer readable storage medium
CN108055644A (en) Position control method, device, storage medium and terminal device
KR20200085508A (en) Service request device
CN111065126B (en) Hot spot sharing method and device, storage medium and electronic equipment
CN106411681B (en) Information processing method, initiating device, server and participating device
CN110677537B (en) Note information display method, note information sending method and electronic equipment
CN110839205A (en) WiFi-based resource recommendation method and device
CN105425936B (en) Method and device for adapting terminal to external device and terminal
KR101858935B1 (en) Labelling apparatus and labelling method of wlan fingerprint based on recognition of location based off-line to on-line item

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant