CN110111785B - Communication interaction method, device, equipment and computer readable storage medium - Google Patents

Communication interaction method, device, equipment and computer readable storage medium

Info

Publication number
CN110111785B
Authority
CN
China
Prior art keywords
robot
terminal
connection request
connection
instruction
Prior art date
Legal status
Active
Application number
CN201910353738.9A
Other languages
Chinese (zh)
Other versions
CN110111785A (en)
Inventor
姚军
Current Assignee
WeBank Co Ltd
Original Assignee
WeBank Co Ltd
Priority date
Filing date
Publication date
Application filed by WeBank Co Ltd
Priority to CN201910353738.9A
Publication of CN110111785A
Application granted
Publication of CN110111785B
Status: Active

Classifications

    • G PHYSICS
      • G10 MUSICAL INSTRUMENTS; ACOUSTICS
        • G10L SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
          • G10L 15/00 Speech recognition
            • G10L 15/22 Procedures used during a speech recognition process, e.g. man-machine dialogue
            • G10L 15/26 Speech to text systems
    • H ELECTRICITY
      • H04 ELECTRIC COMMUNICATION TECHNIQUE
        • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
          • H04L 67/00 Network arrangements or protocols for supporting network services or applications
            • H04L 67/01 Protocols
              • H04L 67/02 Protocols based on web technology, e.g. hypertext transfer protocol [HTTP]
        • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
          • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
            • H04N 23/60 Control of cameras or camera modules
              • H04N 23/61 Control of cameras or camera modules based on recognised objects
        • H04W WIRELESS COMMUNICATION NETWORKS
          • H04W 4/00 Services specially adapted for wireless communication networks; Facilities therefor
            • H04W 4/80 Services using short range communication, e.g. near-field communication [NFC], radio-frequency identification [RFID] or low energy communication
          • H04W 76/00 Connection management
            • H04W 76/10 Connection setup

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Computational Linguistics (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • Acoustics & Sound (AREA)
  • Telephone Function (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a communication interaction method, which comprises the following steps: when a wake-up instruction is detected, activating a connection channel of a first robot corresponding to the wake-up instruction, and detecting whether there is a connection request triggered through the connection channel; if such a connection request exists, establishing a connection between the terminal corresponding to the connection request and the first robot, and displaying a corresponding access interface on the display interface of the terminal, so that the terminal can interact with the first robot through the access interface. The invention also discloses a communication interaction device, equipment and a computer readable storage medium. By activating the connection channel with the wake-up instruction and establishing the interactive connection between the terminal and the robot over that channel, the user performs man-machine interaction through the terminal and the interaction takes place on the access interface, which effectively improves the anti-interference capability and achieves intelligent interaction.

Description

Communication interaction method, device, equipment and computer readable storage medium
Technical Field
The invention relates to the technical field of financial technology (Fintech), and in particular to a communication interaction method, device, equipment and computer readable storage medium.
Background
In recent years, with the development of financial technology (Fintech), and of internet finance in particular, man-machine interaction technology has been introduced into the daily business of financial institutions such as banks. Robots have a certain degree of intelligence and mobility and are popular with users in many application scenarios, especially reception, guidance and interaction, so how to establish a man-machine connection and carry out the interaction is an important task.
In the prior art, the touch interface of a robot offers options for related functions: the user performs a manual click operation on the robot's touch interface, and the robot feeds back a corresponding action based on that click, such as displaying the interface corresponding to the click operation, executing the action corresponding to the click operation, or playing the sound corresponding to the click operation. The user can also issue commands to the robot by voice, and the robot executes the corresponding action after recognizing the command in the speech.
However, in current human-computer interaction the user must be in close range of the robot for the interaction to work, the interaction functions are limited, and the process is easily disturbed by the outside world. For example, when the user issues a voice command, it is easily interfered with by other external sounds; when the user manually clicks the touch interface, clicks from other people can cause instruction conflicts. As a result, the man-machine connection in existing human-computer interaction is not robust, and the interaction process is not sufficiently intelligent.
Disclosure of Invention
The invention mainly aims to provide a communication interaction method, device, equipment and computer readable storage medium, with the goal of improving the robustness of the man-machine connection and the intelligence of the interaction in human-computer interaction.
In order to achieve the above object, the present invention provides a communication interaction method, which comprises the following steps:
when a wake-up instruction is detected, activating a connection channel of a first robot corresponding to the wake-up instruction, and detecting whether a connection request triggered based on the connection channel exists;
if the connection request exists, the connection between the terminal corresponding to the connection request and the first robot is established, and a corresponding access interface is displayed on a display interface of the terminal, so that the terminal can interact with the first robot based on the access interface.
Preferably, when a wake-up command is detected, the step of activating a connection channel of the first robot corresponding to the wake-up command and detecting whether a connection request triggered based on the connection channel exists includes:
starting a camera and detecting whether the camera captures a preset action; or starting a voice recognition device and detecting whether the voice recognition device recognizes a preset voice command;
if so, displaying the two-dimensional code of the first robot corresponding to the preset action or the preset voice instruction;
detecting whether a connection request triggered based on the two-dimensional code exists.
Preferably, when a wake-up command is detected, the step of activating a connection channel of the first robot corresponding to the wake-up command and detecting whether a connection request triggered based on the connection channel exists includes:
when a wake-up instruction is detected, determining identification information of a first robot corresponding to the wake-up instruction, and broadcasting an iBeacon signal corresponding to the identification information;
detecting whether there is a connection request triggered based on the iBeacon signal.
Preferably, if the connection request exists, the step of establishing the connection between the terminal corresponding to the connection request and the first robot, and displaying a corresponding access interface on a display interface of the terminal includes:
if yes, determining whether the connection request meets a preset condition;
and if so, establishing the connection between the terminal corresponding to the connection request and the first robot, and displaying a corresponding access interface on a display interface of the terminal.
Preferably, if the connection request exists, the step of determining whether the connection request meets a preset condition includes:
if so, acquiring first position information corresponding to the connection request and second position information corresponding to the awakening instruction, and determining whether the first position information is consistent with the second position information;
and if so, determining that the connection request meets a preset condition.
Preferably, after the step of establishing a connection between the terminal corresponding to the connection request and the first robot and displaying a corresponding access interface on a display interface of the terminal if the connection exists, the method further includes:
when a preset instruction sent by the terminal based on the access interface is received, controlling the first robot to execute an execution action corresponding to the preset instruction;
and/or sending data generated by the first robot based on the execution action to the terminal based on a telemetry transmission protocol, a WebSocket communication protocol and an Http protocol, so that the terminal can display the data on the access interface.
Preferably, when a preset instruction sent by the terminal based on the access interface is received, the step of controlling the first robot to execute an execution action corresponding to the preset instruction is as follows:
when a view sharing instruction sent by the terminal based on the access interface is received, determining a second robot corresponding to the view sharing instruction, and acquiring the view of the second robot;
and displaying the visual field of the second robot on the display interface of the first robot.
In addition, to achieve the above object, the present invention further provides an interactive communication device, including:
the detection module is used for activating a connection channel of the first robot corresponding to the awakening instruction when the awakening instruction is detected, and detecting whether a connection request triggered based on the connection channel exists or not;
and the connection module is used for establishing the connection between the terminal corresponding to the connection request and the first robot if the connection request exists, and displaying a corresponding access interface on a display interface of the terminal so that the terminal can interact with the first robot based on the access interface.
Preferably, the detection module is further configured to:
starting a camera and detecting whether the camera captures a preset action; or starting a voice recognition device and detecting whether the voice recognition device recognizes a preset voice command;
if so, displaying the two-dimensional code of the first robot corresponding to the preset action or the preset voice instruction;
detecting whether a connection request triggered based on the two-dimensional code exists.
Preferably, the detection module is further configured to:
when a wake-up instruction is detected, determining identification information of a first robot corresponding to the wake-up instruction, and broadcasting an iBeacon signal corresponding to the identification information;
detecting whether there is a connection request triggered based on the iBeacon signal.
Preferably, the connection module is further configured to:
if yes, determining whether the connection request meets a preset condition;
and if so, establishing the connection between the terminal corresponding to the connection request and the first robot, and displaying a corresponding access interface on a display interface of the terminal.
Preferably, the connection module is further configured to:
if so, acquiring first position information corresponding to the connection request and second position information corresponding to the awakening instruction, and determining whether the first position information is consistent with the second position information;
and if so, determining that the connection request meets a preset condition.
Preferably, the communication interaction device further comprises an interaction module, configured to:
when a preset instruction sent by the terminal based on the access interface is received, controlling the first robot to execute an execution action corresponding to the preset instruction;
and/or sending data generated by the first robot based on the execution action to the terminal based on a telemetry transmission protocol, a WebSocket communication protocol and an Http protocol, so that the terminal can display the data on the access interface.
Preferably, the interaction module is further configured to:
when a view sharing instruction sent by the terminal based on the access interface is received, determining a second robot corresponding to the view sharing instruction, and acquiring the view of the second robot;
and displaying the visual field of the second robot on the display interface of the first robot.
In addition, to achieve the above object, the present invention further provides a communication interaction device, including: a memory, a processor and a communication interaction program which is stored on the memory and can run on the processor, wherein the communication interaction program, when executed by the processor, implements the steps of the communication interaction method described above.
In addition, to achieve the above object, the present invention further provides a computer readable storage medium, on which a communication interaction program is stored, and the communication interaction program, when executed by a processor, implements the steps of the communication interaction method as described above.
According to the communication interaction method of the invention, when a wake-up instruction is detected, a connection channel of the first robot corresponding to the wake-up instruction is activated, and whether a connection request triggered based on the connection channel exists is detected; if the connection request exists, the connection between the terminal corresponding to the connection request and the first robot is established, and a corresponding access interface is displayed on the display interface of the terminal, so that the terminal can interact with the first robot based on the access interface. The invention also provides a communication interaction device, equipment and a computer readable storage medium. By activating the connection channel with the wake-up instruction and establishing the interactive connection between the terminal and the robot based on that channel, the user performs man-machine interaction through the terminal and the interaction takes place on the access interface, which effectively improves the anti-interference capability and achieves intelligent interaction.
Drawings
FIG. 1 is a schematic diagram of an apparatus architecture of a hardware operating environment according to an embodiment of the present invention;
FIG. 2 is a schematic flow chart illustrating a communication interaction method according to a first embodiment of the present invention;
FIG. 3 is a flowchart illustrating a communication interaction method according to a second embodiment of the present invention.
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
As shown in fig. 1, fig. 1 is a schematic device structure diagram of a hardware operating environment according to an embodiment of the present invention.
The device of the embodiment of the invention can be a PC or a server device.
As shown in fig. 1, the apparatus may include: a processor 1001 (such as a CPU), a network interface 1004, a user interface 1003, a memory 1005 and a communication bus 1002. The communication bus 1002 is used to realize connection and communication between these components. The user interface 1003 may include a display screen (Display) and an input unit such as a keyboard (Keyboard), and may optionally also include a standard wired interface and a wireless interface. The network interface 1004 may optionally include a standard wired interface and a wireless interface (e.g., a WI-FI interface). The memory 1005 may be a high-speed RAM memory or a non-volatile memory (e.g., a magnetic disk memory), and may alternatively be a storage device separate from the processor 1001.
Those skilled in the art will appreciate that the configuration of the apparatus shown in fig. 1 is not intended to be limiting of the apparatus and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
As shown in fig. 1, a memory 1005, which is a kind of computer storage medium, may include an operating system, a network communication module, a user interface module, and a communication interactive program therein.
The operating system is a program that manages and controls the communication interaction equipment and its software resources, and supports the operation of the network communication module, the user interface module, the communication interaction program and other programs or software; the network communication module is used to manage and control the network interface 1004; the user interface module is used to manage and control the user interface 1003.
In the interactive communication apparatus shown in fig. 1, the interactive communication apparatus calls an interactive communication program stored in a memory 1005 through a processor 1001 and performs operations in various embodiments of the interactive communication method described below.
Based on the hardware structure, the embodiment of the communication interaction method is provided.
Referring to fig. 2, fig. 2 is a schematic flow chart of a first embodiment of the communication interaction method of the present invention, which includes:
step S10, when a wake-up command is detected, activating a connection channel of the first robot corresponding to the wake-up command, and detecting whether a connection request triggered based on the connection channel exists;
and step S20, if the connection request exists, establishing the connection between the terminal corresponding to the connection request and the first robot, and displaying a corresponding access interface on a display interface of the terminal so that the terminal can interact with the first robot based on the access interface.
The communication interaction method can be applied to interaction equipment of financial institutions such as banks or banking systems. The interaction equipment comprises at least one robot, a background server and a terminal used for connecting with the robot, and the robot includes a robot display screen. A user can use a terminal device such as a mobile phone to connect with the robot and transmit data information through the background server, thereby realizing man-machine interaction. This can also be called large-and-small-screen interaction, where the large screen refers to the display screen of the robot and the small screen refers to the display interface of the terminal.
According to the embodiment, the connection channel is activated through the awakening instruction, interactive connection between the terminal and the robot is established based on the connection channel, so that human-computer interaction of a user is achieved through the terminal, the interaction process is conducted on the access interface, the anti-interference capability is effectively improved, and intelligent interaction is achieved.
The respective steps will be described in detail below:
step S10, when a wake-up command is detected, activating a connection channel of the first robot corresponding to the wake-up command, and detecting whether a connection request triggered based on the connection channel exists.
In this embodiment, there may be multiple robots, and the user can select the robot to interact with. It can be understood that, in order to avoid keeping the robot in a working state at all times and wasting power, the robot is in a low-power state when it has not been woken up. In the low-power state the robot keeps only a small number of sensors working, for example only the camera, or only the voice recognition module.
When the interaction device detects a wake-up instruction triggered by the user, the connection channel of the first robot corresponding to the wake-up instruction is activated; that is, only the robot the user has woken up has its connection channel activated, while the other robots remain in the low-power state. The connection channel can take any form such as a two-dimensional code, WiFi or Bluetooth, and the user can initiate a connection request to the robot by scanning the two-dimensional code with the terminal, connecting to the WiFi, connecting over Bluetooth, and so on.
Further, whether a connection request triggered based on the connection channel exists or not is detected within a preset time.
In this step, it can be understood that if the user wakes up the robot but never uses the terminal to initiate a connection request, the robot will keep consuming power. Therefore, a time value is preset, and within the preset time it is detected whether a connection request triggered based on the connection channel exists. If not, the current wake-up instruction is regarded as an accidental wake-up, the connection channel of the first robot is closed, and the first robot returns to the low-power state.
Further, before the connection channel of the first robot is closed, a prompt message may be sent through the first robot, the prompt message including connection guidance for the first robot and a notice that the connection channel of the first robot is about to be closed.
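Purely as an illustration of the wake-up and timeout behaviour described above (and not the patent's own implementation), the following Python sketch shows a robot that stays in a low-power state, activates its connection channel on a wake-up instruction, and closes the channel again if no connection request arrives within a preset time. All names (Robot, CHANNEL_TIMEOUT_S, poll_connection_request) are hypothetical.

```python
import time


class Robot:
    CHANNEL_TIMEOUT_S = 60  # preset time to wait for a connection request (assumed value)

    def __init__(self, robot_id):
        self.robot_id = robot_id
        self.channel_active = False

    def detect_wake_up(self):
        """Placeholder for gesture / voice / physical-button wake-up detection."""
        return False

    def poll_connection_request(self):
        """Placeholder: returns a terminal ID if a request arrived, else None."""
        return None

    def activate_channel(self):
        # e.g. show a two-dimensional code, enable Bluetooth, or start an iBeacon broadcast
        self.channel_active = True
        print(f"[{self.robot_id}] connection channel activated")

    def close_channel(self):
        # A prompt message could be shown here before closing, as described above.
        print(f"[{self.robot_id}] no connection request received; channel closed, back to low power")
        self.channel_active = False

    def run_once(self):
        if not self.detect_wake_up():
            return None  # stay in the low-power state
        self.activate_channel()
        deadline = time.monotonic() + self.CHANNEL_TIMEOUT_S
        while time.monotonic() < deadline:
            terminal_id = self.poll_connection_request()
            if terminal_id is not None:
                return terminal_id  # a connection request was triggered on the channel
            time.sleep(0.5)
        self.close_channel()  # the wake-up is treated as accidental
        return None
```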
Further, step S10 includes:
Step a, starting a camera and detecting whether the camera captures a preset action; or starting a voice recognition device and detecting whether the voice recognition device recognizes a preset voice command.
In this step, the interaction device starts only the camera of the robot, or only the voice recognition device of the robot, so that the robot stays in a low-power state, and detects through the camera whether a preset action appears in the current field of view, or detects through the voice recognition device whether a preset voice command appears within a preset recognition range. In other words, in this step the ways a user can wake up the robot include gesture wake-up and voice wake-up. It can be understood that the user can also wake up the robot through a preset physical button on the robot.
Step b, if so, displaying the two-dimensional code of the first robot corresponding to the preset action or the preset voice instruction.
In this step, if it is detected that the camera has captured a preset action or the voice recognition device has recognized a preset voice command, the two-dimensional code of the corresponding first robot is displayed for the user to scan and connect. Taking voice recognition as an example, the user speaks a sentence containing a start-robot command, the interaction device recognizes it through the first robot, and the two-dimensional code of the first robot is displayed.
Step c, detecting whether a connection request triggered based on the two-dimensional code exists.
In this step, after the two-dimensional code is displayed, it is detected whether a connection request triggered based on the two-dimensional code exists, specifically whether a user has scanned the current two-dimensional code with the terminal.
That is, the wake-up mode of this embodiment may be gesture wake-up or voice wake-up, and the connection channel is a two-dimensional code: the interaction device displays the two-dimensional code for the user to scan with the terminal, and by scanning it the user initiates a connection request to the robot.
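As a sketch only, the payload that such a two-dimensional code might encode is shown below: a URL carrying the first robot's ID and a short-lived token, so that scanning it with the terminal triggers an HTTP connection request to the background server. The backend URL, parameter names and expiry time are assumptions for illustration.

```python
import secrets
import time
from urllib.parse import urlencode

BACKEND_CONNECT_URL = "https://backend.example.com/connect"  # assumed backend endpoint


def build_qr_payload(robot_id, ttl_s=120):
    """Return the string the first robot's display would render as a QR code."""
    token = secrets.token_urlsafe(16)       # one-time token tied to this wake-up
    expires = int(time.time()) + ttl_s      # matches the preset waiting time
    query = urlencode({"robot": robot_id, "token": token, "exp": expires})
    return f"{BACKEND_CONNECT_URL}?{query}"


# Example: shown after a preset gesture or voice command wakes robot-001.
print(build_qr_payload("robot-001"))
```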
It should be further noted that the connection channel may also be a Bluetooth connection channel. In that case, activating the connection channel means displaying Bluetooth access guidance on the display interface of the robot, so that the user can view the guidance and, following it, turn on the Bluetooth function of the terminal to initiate a connection request to the robot.
Further, step S10 includes:
Step d, when a wake-up instruction is detected, determining identification information of the first robot corresponding to the wake-up instruction, and broadcasting an iBeacon signal corresponding to the identification information.
In this step, when a wake-up instruction is detected, the identification information of the first robot corresponding to the wake-up instruction is determined; the identification information is specifically information such as the UUID, Major and Minor of the robot. An iBeacon signal corresponding to the identification information is then broadcast. The iBeacon signal is a signal broadcast based on the iBeacon technology, which uses BLE (Bluetooth Low Energy), specifically the broadcast frame known in BLE as an advertising frame. The advertising frame is transmitted periodically and can be received by any device that supports BLE.
Step e, detecting whether a connection request triggered based on the iBeacon signal exists.
In this step, it is detected whether a connection request triggered based on the iBeacon signal exists. That is, the interaction device broadcasts the iBeacon signal of the first robot, and the user can use the Bluetooth function of the terminal to identify the information such as the UUID (Universally Unique Identifier), Major and Minor carried by the iBeacon signal, where the UUID is a 128-bit identifier specified in the ISO/IEC 11578:1996 standard and Major and Minor are 16-bit values set by the iBeacon deployer. For example, a robot deployed at a trading venue may write the trading area information into Major and the ID of the trading booth into Minor. When the iBeacon function is embedded in the robot, Major may denote the model number and Minor an error code, so as to notify the outside of a failure or the like.
Specifically, the iBeacon technology can be accessed through the WeChat interface: the user only needs to open WeChat and use the Shake feature nearby to identify the iBeacon signal broadcast by the robot and initiate a connection request.
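For illustration, the sketch below lays out the manufacturer-specific data block of a standard iBeacon advertising frame from the identification information mentioned above (a 128-bit UUID plus 16-bit Major and Minor values). Actually broadcasting it would require the robot's BLE stack, which is not shown, and the example UUID and values are placeholders.

```python
import struct
import uuid


def ibeacon_payload(proximity_uuid, major, minor, tx_power=-59):
    """Build the 25-byte manufacturer-specific data of an iBeacon advertising frame."""
    uid = uuid.UUID(proximity_uuid).bytes  # 16-byte (128-bit) UUID
    # 0x004C = Apple company ID (bytes 0x4C 0x00 on the wire), 0x02 0x15 = iBeacon
    # type and length, Major/Minor are big-endian 16-bit values, and the last byte
    # is the measured TX power used for distance estimation.
    return struct.pack(">HBB16sHHb", 0x4C00, 0x02, 0x15, uid, major, minor, tx_power)


# Example: Major carries an area code and Minor a booth ID, as in the text above.
payload = ibeacon_payload("fda50693-a4e2-4fb1-afcf-c6eb07647825", major=10001, minor=19641)
print(payload.hex())
```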
And step S20, if the connection request exists, establishing the connection between the terminal corresponding to the connection request and the first robot, and displaying a corresponding access interface on a display interface of the terminal so that the terminal can interact with the first robot based on the access interface.
In this embodiment, if there is a connection request, the ID of the terminal corresponding to the connection request and the ID of the first robot are determined, a point-to-point connection is established based on the two IDs, and after the connection succeeds a corresponding access interface is displayed on the display interface of the terminal for the user to operate in, thereby realizing man-machine interaction.
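A minimal backend-side sketch of this pairing step is given below, assuming a registry that keys one active session on the terminal ID and the first robot's ID and hands the terminal the URL of its access interface; the class names and URL scheme are hypothetical.

```python
import time
import uuid
from dataclasses import dataclass, field


@dataclass
class Session:
    terminal_id: str
    robot_id: str
    session_id: str = field(default_factory=lambda: uuid.uuid4().hex)
    created_at: float = field(default_factory=time.time)


class PairingRegistry:
    def __init__(self):
        self._by_robot = {}  # robot_id -> Session (one active terminal per robot, an assumption)

    def connect(self, terminal_id, robot_id):
        """Establish the point-to-point pairing once a connection request is accepted."""
        session = Session(terminal_id, robot_id)
        self._by_robot[robot_id] = session
        return session

    def access_interface_url(self, session):
        # The page the terminal's display interface loads for this pairing.
        return f"https://backend.example.com/ui/{session.session_id}"


registry = PairingRegistry()
session = registry.connect(terminal_id="terminal-42", robot_id="robot-001")
print(registry.access_interface_url(session))
```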
The specific interaction process includes the following. The user initiates, through the access interface, a request to the first robot for business marketing content stored by the first robot or obtained in real time by the background server, such as coupons, advertisements or tips on using the robot, and the first robot receives the request and delivers the corresponding content to the terminal through the background server. The user can also send preset instructions to the first robot through the access interface, such as sound-and-light effects, bullet-screen comments, theme interaction or photographing; the first robot receives the instruction and displays the bullet-screen comment or theme on its display screen, or starts the camera to take a photo and shows the photo on the access interface. The user can also issue a movement instruction through the access interface, and the first robot moves to the corresponding destination according to that instruction.
It can be understood that the terminal and the first robot can interact in many ways. The access interface of this embodiment therefore includes a function list corresponding to the first robot, through which the user can invoke the various functions of the first robot. At the same time, the robot has a data processing capability and can process the data sent from the terminal; for example, the user sends a piece of speech to the first robot through the terminal, and after receiving it the robot applies voice-changing processing, plays it back, and can turn on a lighting effect while the speech is playing.
In this embodiment, when a wake-up instruction is detected, a connection channel of a first robot corresponding to the wake-up instruction is activated, and whether a connection request triggered based on the connection channel exists is detected; if the connection request exists, the connection between the terminal corresponding to the connection request and the first robot is established, and a corresponding access interface is displayed on a display interface of the terminal, so that the terminal can interact with the first robot based on the access interface. According to the invention, the connection channel is activated through the awakening instruction, and the interactive connection between the terminal and the robot is established based on the connection channel, so that the user realizes man-machine interaction through the terminal, and the interaction process is carried out at an access interface, thereby effectively improving the anti-interference capability and realizing intelligent interaction.
Further, based on the first embodiment of the communication and interaction method of the present invention, a second embodiment of the communication and interaction method of the present invention is provided.
The second embodiment of the communication interaction method is different from the first embodiment of the communication interaction method in that, referring to fig. 3, the step S20 includes:
step S21, if yes, determining whether the connection request meets a preset condition;
and step S22, if yes, establishing the connection between the terminal corresponding to the connection request and the first robot, and displaying a corresponding access interface on a display interface of the terminal.
In this embodiment, when the connection request is detected, it is determined whether the connection request meets a preset condition, and the connection between the terminal and the first robot is established only when the connection request meets the preset condition, so that the connection between the terminal and the first robot is more rigorous, and an invalid connection request is effectively avoided.
The respective steps will be described in detail below:
and step S21, if yes, determining whether the connection request meets a preset condition.
In this embodiment, if a connection request based on a connection channel is detected, the connection request is verified, and it is determined whether the current connection request meets a preset condition.
The specific verification steps include:
if so, acquiring first position information corresponding to the connection request and second position information corresponding to the awakening instruction, and determining whether the first position information is consistent with the second position information;
in this step, if a connection request based on a connection channel is detected, first location information corresponding to the connection request, that is, location information of a terminal that initiated the connection request to the first robot, and second location information corresponding to a wake-up instruction, that is, location information of a user that wakens the first robot are obtained, and the first location information and the second location information are compared to determine whether the first location information and the second location information are consistent, it can be understood that, if the user that wakens the first robot and the user corresponding to the terminal that initiated the connection request to the first robot are the same person, the first location information and the second location information are consistent. This is to avoid a situation where the user a cannot connect to the robot, as the user a wakes up the robot and the user B connects to the robot.
If they are consistent, determining that the connection request meets the preset condition; and if not, determining that the connection request does not meet the preset condition.
In this step, if the first position information and the second position information are determined to be consistent, it is determined that the connection request meets the preset condition; otherwise, it is determined that the connection request does not meet the preset condition.
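A hedged sketch of this position-consistency check follows: the position reported with the connection request is compared with the position at which the wake-up occurred, and the request is accepted only if the two agree within a small radius. The haversine distance and the 10 m tolerance are assumptions added for illustration.

```python
import math

MAX_DISTANCE_M = 10.0  # assumed tolerance between the two positions


def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two latitude/longitude points."""
    r = 6_371_000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))


def meets_preset_condition(first_position, second_position):
    """first_position: (lat, lon) of the connection request; second_position: of the wake-up."""
    return haversine_m(*first_position, *second_position) <= MAX_DISTANCE_M


# Example: the same user wakes the robot and connects from (almost) the same spot.
print(meets_preset_condition((22.54310, 114.05790), (22.54312, 114.05791)))  # True
```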
Further, the verifying step further comprises:
when a connection request based on the connection channel is detected, displaying an identification code corresponding to the connection request on the display screen of the first robot, and sending an input box corresponding to the identification code to the access interface so that the user can enter the identification code to complete the verification.
That is, in this step, while using the terminal to initiate the connection request to the first robot, the user needs to enter, in the access interface of the terminal, the identification code displayed by the first robot in order to complete the verification.
And step S22, if yes, establishing the connection between the terminal corresponding to the connection request and the first robot, and displaying a corresponding access interface on a display interface of the terminal.
In this embodiment, if it is determined that the current connection request meets the preset condition, the connection between the terminal corresponding to the connection request and the first robot is established, and a corresponding access interface is displayed on a display interface of the terminal.
Further, if the connection request does not meet the preset condition, a prompt message indicating the current connection error is sent through the first robot, and connection guidance is displayed so that the user can correct the connection.
In this embodiment, when the connection request is detected, it is determined whether the connection request meets a preset condition, and the connection between the terminal and the first robot is established only when the connection request meets the preset condition, so that the connection between the terminal and the first robot is more rigorous, an invalid connection request is effectively avoided, and the intelligent connection between the terminal and the robot is realized.
Further, a third embodiment of the communication interaction method is provided based on the first and second embodiments of the communication interaction method of the present invention.
The third embodiment of the communication interaction method differs from the first and second embodiments of the communication interaction method in that the method further comprises:
step f, when a preset instruction sent by the terminal based on the access interface is received, controlling the first robot to execute an execution action corresponding to the preset instruction;
and g, based on a telemetry transmission protocol, a WebSocket communication protocol and an Http protocol, sending data generated by the first robot based on the execution action to the terminal, so that the terminal can display the data on the access interface.
The embodiment can receive a preset instruction sent by the terminal based on the access interface, and control the first robot to execute an execution action corresponding to the preset instruction, or send data generated by the first robot to the terminal, so as to realize interaction between the terminal and the first robot.
The respective steps will be described in detail below:
and f, when a preset instruction sent by the terminal based on the access interface is received, controlling the first robot to execute an execution action corresponding to the preset instruction.
In this embodiment, the user can issue a preset instruction on the access interface of the terminal. Specifically, the access interface includes function items for each function of the first robot; the user can click the corresponding function icon on the access interface, or enter the preset information corresponding to the preset instruction in the input box of the access interface and send it. When the interaction device receives the preset instruction sent by the terminal, it controls the first robot to execute the corresponding execution action; for example, if the preset instruction is to take a photo, the interaction device controls the first robot to start the camera and take the corresponding photo.
And g, based on a telemetry transmission protocol, sending data generated by the first robot based on the execution action to the terminal, so that the terminal can display the data on the access interface.
In this embodiment, it can be understood that the first robot may generate data while executing the corresponding execution action; for example, photo data is generated when the first robot takes a picture. Therefore, according to MQTT (Message Queuing Telemetry Transport, the telemetry transmission protocol), the WebSocket communication protocol and the HTTP protocol, the data generated by the first robot based on the execution action can be sent to the terminal, so that the terminal can display the data on the access interface.
Specifically, a subscription request sent by the terminal can be received, and when a change in the item corresponding to the subscription request is detected, the data corresponding to that item is sent to the terminal based on the telemetry transmission protocol, the WebSocket communication protocol and the HTTP protocol, so that the terminal can display the data on the access interface. For example, if the user subscribes to discount information through the terminal, the interaction device sends newly released discount information to the terminal as soon as it detects the release, so that the terminal can display it on the access interface.
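The following simplified, in-process sketch illustrates the subscribe/publish pattern described above; in the actual system the messages would travel over MQTT, WebSocket or HTTP via the background server, so the tiny in-memory broker and the topic names here are only stand-ins.

```python
from collections import defaultdict


class MiniBroker:
    """In-memory stand-in for the MQTT / WebSocket / HTTP transport."""

    def __init__(self):
        self._subscribers = defaultdict(list)  # topic -> list of callbacks

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)

    def publish(self, topic, payload):
        for callback in self._subscribers[topic]:
            callback(topic, payload)


broker = MiniBroker()


# Terminal side: subscribe to the items of interest and render them on the access interface.
def show_on_access_interface(topic, payload):
    print(f"access interface <- {topic}: {payload}")


broker.subscribe("robot-001/photos", show_on_access_interface)
broker.subscribe("promotions/discounts", show_on_access_interface)

# Robot / backend side: publish data generated by an execution action or a new release.
broker.publish("robot-001/photos", "photo_001.jpg")
broker.publish("promotions/discounts", "new coupon: 10% off")
```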
Further, when a view sharing instruction sent by the terminal based on the access interface is received, a second robot corresponding to the view sharing instruction is determined, and the view of the second robot is obtained;
and displaying the visual field of the second robot on the display interface of the first robot.
In this step, the user can interact not only with the current first robot but also with a second robot through the terminal. When the interaction device receives a view sharing instruction sent by the terminal based on the access interface, it determines the second robot corresponding to the view sharing instruction and then obtains the view of the second robot; that is, the user can obtain the view of the area where the second robot is located by means of the second robot's camera.
It can be understood that, if a user is using the second robot, the current user may interact with the user corresponding to the second robot.
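Finally, a hypothetical sketch of this view-sharing step: on a view sharing instruction from the terminal, the backend fetches the second robot's current camera frame and forwards it to the first robot's display. The endpoints and content types below are assumptions, not an API defined by the patent.

```python
from urllib import request

BACKEND = "https://backend.example.com"  # assumed background server


def share_view(first_robot_id, second_robot_id):
    """Fetch the second robot's current view and push it to the first robot's display."""
    # Ask the backend for the second robot's current field of view (e.g. a JPEG frame).
    with request.urlopen(f"{BACKEND}/robots/{second_robot_id}/view") as resp:
        frame = resp.read()

    # Forward the frame so the first robot shows it on its display interface.
    push = request.Request(
        f"{BACKEND}/robots/{first_robot_id}/display",
        data=frame,
        headers={"Content-Type": "image/jpeg", "X-Source-Robot": second_robot_id},
        method="POST",
    )
    request.urlopen(push)


# Example (commented out because the backend URL above is a placeholder):
# share_view("robot-001", "robot-002")
```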
The embodiment can receive a preset instruction sent by the terminal based on the access interface, control the first robot to execute an execution action corresponding to the preset instruction, or send data generated by the first robot to the terminal so as to realize communication interaction between the terminal and the first robot, and meanwhile, the terminal can be connected with the second robot by means of the first robot so as to realize cross-region connection and cross-region interaction.
The invention also provides an interactive communication device. The communication interaction device of the invention comprises:
the detection module is used for activating a connection channel of the first robot corresponding to the awakening instruction when the awakening instruction is detected, and detecting whether a connection request triggered based on the connection channel exists or not;
and the connection module is used for establishing the connection between the terminal corresponding to the connection request and the first robot if the connection request exists, and displaying a corresponding access interface on a display interface of the terminal so that the terminal can interact with the first robot based on the access interface.
Further, the detection module is further configured to:
starting a camera and detecting whether the camera captures a preset action; or starting a voice recognition device and detecting whether the voice recognition device recognizes a preset voice command;
if so, displaying the two-dimensional code of the first robot corresponding to the preset action or the preset voice instruction;
detecting whether a connection request triggered based on the two-dimensional code exists.
Further, the detection module is further configured to:
when a wake-up instruction is detected, determining identification information of a first robot corresponding to the wake-up instruction, and broadcasting an iBeacon signal corresponding to the identification information;
detecting whether there is a connection request triggered based on the iBeacon signal.
Further, the connection module is further configured to:
if yes, determining whether the connection request meets a preset condition;
and if so, establishing the connection between the terminal corresponding to the connection request and the first robot, and displaying a corresponding access interface on a display interface of the terminal.
Further, the connection module is further configured to:
if so, acquiring first position information corresponding to the connection request and second position information corresponding to the awakening instruction, and determining whether the first position information is consistent with the second position information;
and if so, determining that the connection request meets a preset condition.
Further, the communication interaction device further comprises an interaction module, configured to:
when a preset instruction sent by the terminal based on the access interface is received, controlling the first robot to execute an execution action corresponding to the preset instruction;
and/or sending data generated by the first robot based on the execution action to the terminal based on a telemetry transmission protocol, a WebSocket communication protocol and an Http protocol, so that the terminal can display the data on the access interface.
Further, the interaction module is further configured to:
when a view sharing instruction sent by the terminal based on the access interface is received, determining a second robot corresponding to the view sharing instruction, and acquiring the view of the second robot;
and displaying the visual field of the second robot on the display interface of the first robot.
The invention also provides a computer readable storage medium.
The computer readable storage medium of the present invention stores a communication interaction program, and the communication interaction program implements the steps of the communication interaction method as described above when executed by a processor.
The method implemented when the communication interaction program running on the processor is executed may refer to each embodiment of the communication interaction method of the present invention, and details are not described herein.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or system that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or system. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or system that comprises the element.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium (e.g., ROM/RAM, magnetic disk, optical disk) as described above and includes instructions for enabling a terminal device (e.g., a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
The above description is only a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (12)

1. A communication interaction method is characterized by comprising the following steps:
when a wake-up instruction is detected, activating a connection channel of a first robot corresponding to the wake-up instruction, and detecting whether a connection request triggered based on the connection channel exists;
if the connection request exists, establishing the connection between the terminal corresponding to the connection request and the first robot, and displaying a corresponding access interface on a display interface of the terminal so that the terminal can interact with the first robot based on the access interface;
if the connection request exists, the step of establishing the connection between the terminal corresponding to the connection request and the first robot and displaying a corresponding access interface on a display interface of the terminal comprises the following steps:
if yes, determining whether the connection request meets a preset condition;
if so, establishing the connection between the terminal corresponding to the connection request and the first robot, and displaying a corresponding access interface on a display interface of the terminal;
if yes, the step of determining whether the connection request meets a preset condition comprises:
if so, acquiring first position information corresponding to the connection request and second position information corresponding to the awakening instruction, and determining whether the first position information is consistent with the second position information;
and if so, determining that the connection request meets a preset condition.
2. The interactive method for communication according to claim 1, wherein the step of activating a connection channel of the first robot corresponding to the wake-up command and detecting whether there is a connection request triggered based on the connection channel when the wake-up command is detected comprises:
starting a camera and detecting whether the camera captures a preset action; or starting a voice recognition device and detecting whether the voice recognition device recognizes a preset voice command;
if so, displaying the two-dimensional code of the first robot corresponding to the preset action or the preset voice instruction;
detecting whether a connection request triggered based on the two-dimensional code exists.
3. The interactive method for communication according to claim 1, wherein the step of activating a connection channel of the first robot corresponding to the wake-up command and detecting whether there is a connection request triggered based on the connection channel when the wake-up command is detected comprises:
when a wake-up instruction is detected, determining identification information of a first robot corresponding to the wake-up instruction, and broadcasting an iBeacon signal corresponding to the identification information;
detecting whether there is a connection request triggered based on the iBeacon signal.
4. A communication interaction method as claimed in any one of claims 1 to 3, wherein, after the step of establishing a connection between the terminal corresponding to the connection request and the first robot and displaying the corresponding access interface on the display interface of the terminal, the method further comprises:
when a preset instruction sent by the terminal based on the access interface is received, controlling the first robot to execute an execution action corresponding to the preset instruction;
and/or sending data generated by the first robot based on the execution action to the terminal based on a telemetry transmission protocol, a WebSocket communication protocol and an Http protocol, so that the terminal can display the data on the access interface.
5. The communication interaction method of claim 4, wherein when receiving a preset instruction sent by the terminal based on the access interface, the step of controlling the first robot to execute an execution action corresponding to the preset instruction is:
when a view sharing instruction sent by the terminal based on the access interface is received, determining a second robot corresponding to the view sharing instruction, and acquiring the view of the second robot;
and displaying the visual field of the second robot on the display interface of the first robot.
6. An interactive communication device, comprising:
the detection module is used for activating a connection channel of the first robot corresponding to the awakening instruction when the awakening instruction is detected, and detecting whether a connection request triggered based on the connection channel exists or not;
the connection module is used for establishing the connection between the terminal corresponding to the connection request and the first robot if the connection request exists, and displaying a corresponding access interface on a display interface of the terminal so that the terminal can interact with the first robot based on the access interface;
the connection module is further configured to:
if yes, determining whether the connection request meets a preset condition;
if so, establishing the connection between the terminal corresponding to the connection request and the first robot, and displaying a corresponding access interface on a display interface of the terminal;
the connection module is further configured to:
if so, acquiring first position information corresponding to the connection request and second position information corresponding to the awakening instruction, and determining whether the first position information is consistent with the second position information;
and if so, determining that the connection request meets a preset condition.
7. The interactive communication device of claim 6, wherein the detection module is further configured to:
starting a camera and detecting whether the camera captures a preset action; or starting a voice recognition device and detecting whether the voice recognition device recognizes a preset voice command;
if so, displaying the two-dimensional code of the first robot corresponding to the preset action or the preset voice instruction;
detecting whether a connection request triggered based on the two-dimensional code exists.
8. The interactive communication device of claim 6, wherein the detection module is further configured to:
when a wake-up instruction is detected, determining identification information of a first robot corresponding to the wake-up instruction, and broadcasting an iBeacon signal corresponding to the identification information;
detecting whether there is a connection request triggered based on the iBeacon signal.
9. An interactive communication device as claimed in any one of claims 6 to 8, further comprising an interaction module for:
when a preset instruction sent by the terminal based on the access interface is received, controlling the first robot to execute an execution action corresponding to the preset instruction;
and/or sending data generated by the first robot based on the execution action to the terminal based on a telemetry transmission protocol, a WebSocket communication protocol and an Http protocol, so that the terminal can display the data on the access interface.
10. The interactive communication device of claim 9, wherein the interactive module is further configured to:
when a view sharing instruction sent by the terminal based on the access interface is received, determining a second robot corresponding to the view sharing instruction, and acquiring the view of the second robot;
and displaying the visual field of the second robot on the display interface of the first robot.
11. An interactive communication device, comprising: memory, a processor and a communication and interaction program stored on the memory and executable on the processor, the communication and interaction program, when executed by the processor, implementing the steps of the communication and interaction method according to any one of claims 1 to 5.
12. A computer-readable storage medium, wherein a communication interaction program is stored on the computer-readable storage medium, and when executed by a processor, the communication interaction program implements the steps of the communication interaction method according to any one of claims 1 to 5.
CN201910353738.9A 2019-04-29 2019-04-29 Communication interaction method, device, equipment and computer readable storage medium Active CN110111785B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910353738.9A CN110111785B (en) 2019-04-29 2019-04-29 Communication interaction method, device, equipment and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910353738.9A CN110111785B (en) 2019-04-29 2019-04-29 Communication interaction method, device, equipment and computer readable storage medium

Publications (2)

Publication Number Publication Date
CN110111785A CN110111785A (en) 2019-08-09
CN110111785B true CN110111785B (en) 2021-04-23

Family

ID=67487432

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910353738.9A Active CN110111785B (en) 2019-04-29 2019-04-29 Communication interaction method, device, equipment and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN110111785B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110610704A (en) * 2019-09-09 2019-12-24 上海赛连信息科技有限公司 Method, medium and device for displaying identification and computing equipment
CN111489510A (en) * 2020-04-15 2020-08-04 青岛海信智能商用系统股份有限公司 Communication method and contactless self-service shopping settlement system

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8064092B2 (en) * 1999-12-01 2011-11-22 Silverbrook Research Pty Ltd System for retrieving display data for handheld display device
CN203786752U (en) * 2014-03-19 2014-08-20 北京兆维电子(集团)有限责任公司 Bank remote virtual service business hall terminal
CN104540088A (en) * 2014-12-23 2015-04-22 小米科技有限责任公司 Connection establishment method, terminal and device
CN105511722A (en) * 2015-12-04 2016-04-20 广东威创视讯科技股份有限公司 Display screen control method and system
CN107623714A (en) * 2017-07-28 2018-01-23 平安科技(深圳)有限公司 Data sharing method, device and computer-readable recording medium
CN108170285A (en) * 2018-03-01 2018-06-15 贵州小爱机器人科技有限公司 The interaction control method and device of a kind of electronic equipment
CN108170277A (en) * 2018-01-08 2018-06-15 杭州赛鲁班网络科技有限公司 A kind of device and method of intelligent visual interaction

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8064092B2 (en) * 1999-12-01 2011-11-22 Silverbrook Research Pty Ltd System for retrieving display data for handheld display device
CN203786752U (en) * 2014-03-19 2014-08-20 北京兆维电子(集团)有限责任公司 Bank remote virtual service business hall terminal
CN104540088A (en) * 2014-12-23 2015-04-22 小米科技有限责任公司 Connection establishment method, terminal and device
CN105511722A (en) * 2015-12-04 2016-04-20 广东威创视讯科技股份有限公司 Display screen control method and system
CN107623714A (en) * 2017-07-28 2018-01-23 平安科技(深圳)有限公司 Data sharing method, device and computer-readable recording medium
CN108170277A (en) * 2018-01-08 2018-06-15 杭州赛鲁班网络科技有限公司 A kind of device and method of intelligent visual interaction
CN108170285A (en) * 2018-03-01 2018-06-15 贵州小爱机器人科技有限公司 The interaction control method and device of a kind of electronic equipment

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
"大屏"与"小屏"的深度融合;刘芯玮;《新闻研究导刊》;20190228;第10卷(第3期);第190-191页 *
智能网联汽车人机交互手势识别设计;刘华仁;《北京汽车》;20171231;第16-28页 *

Also Published As

Publication number Publication date
CN110111785A (en) 2019-08-09

Similar Documents

Publication Publication Date Title
CN110430558B (en) Device control method, device, electronic device and storage medium
EP3612925B1 (en) Electronic device and method for processing user speech
US11955124B2 (en) Electronic device for processing user speech and operating method therefor
US10706847B2 (en) Method for operating speech recognition service and electronic device supporting the same
US20190370525A1 (en) Fingerprint recognition method, electronic device, and storage medium
US20170060599A1 (en) Method and apparatus for awakening electronic device
US10402625B2 (en) Intelligent electronic device and method of operating the same
CN106201491B (en) Mobile terminal and method and device for controlling remote assistance process of mobile terminal
EP3726376B1 (en) Program orchestration method and electronic device
KR20180083587A (en) Electronic device and operating method thereof
KR102389996B1 (en) Electronic device and method for screen controlling for processing user input using the same
CN109451141B (en) Operation control method and related terminal
CN107450838B (en) Response method and device of black screen gesture, storage medium and mobile terminal
CN110968362B (en) Application running method, device and storage medium
US11394671B2 (en) Method for providing transaction history-based service and electronic device therefor
CN110111785B (en) Communication interaction method, device, equipment and computer readable storage medium
KR20140036532A (en) Method and system for executing application, device and computer readable recording medium thereof
CN104184890A (en) Information processing method and electronic device
CN104407865A (en) Method and device for displaying window
CN110175063B (en) Operation assisting method, device, mobile terminal and storage medium
CN110955332A (en) Man-machine interaction method and device, mobile terminal and computer readable storage medium
CN111027963A (en) Payment code sharing method and electronic equipment
WO2023222128A1 (en) Display method and electronic device
CN110766396A (en) Graphic code display method and electronic equipment
EP3866482A1 (en) Method and device for processing information

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant