CN117762288A - Emergency station system with man-machine interaction function and man-machine interaction emergency processing method - Google Patents


Info

Publication number
CN117762288A
CN117762288A
Authority
CN
China
Prior art keywords
interaction
emergency
user
voice
emergency station
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311534045.2A
Other languages
Chinese (zh)
Inventor
闫新
张海
叶甜香
Current Assignee
Shenzhen Putian Yitong Technologies Co ltd
Original Assignee
Shenzhen Putian Yitong Technologies Co ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Putian Yitong Technologies Co ltd filed Critical Shenzhen Putian Yitong Technologies Co ltd
Priority to CN202311534045.2A priority Critical patent/CN117762288A/en
Publication of CN117762288A publication Critical patent/CN117762288A/en
Pending legal-status Critical Current

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses an emergency station system with a man-machine interaction function and a man-machine interaction emergency processing method. The system comprises: a touch display screen for displaying various emergency touch interaction functions and receiving the user's touch interaction operation instructions; a camera component for acquiring face interaction images and gesture interaction images of the user; a voice recognition module for recognizing the user's voice interaction instructions; a code scanning interaction module for displaying a two-dimensional code for code-scanning interaction; a distance sensor for detecting and sensing the user's distance information; a function key for receiving the user's function key instructions; and an emergency station main controller connected respectively to the touch display screen, the camera component, the voice recognition module, the code scanning interaction module and the distance sensor. The invention adds new functions to the emergency station, including interaction through a large touch screen, through a camera, through voice, through code scanning with an intelligent terminal, and through a distance sensor, greatly improving the convenience of emergency use for users.

Description

Emergency station system with man-machine interaction function and man-machine interaction emergency processing method
Technical Field
The invention relates to the technical field of emergency stations, in particular to an emergency station system with a man-machine interaction function and a man-machine interaction emergency processing method.
Background
With the development of society, emergency stations are becoming more and more important. However, existing emergency stations have essentially no intelligent interaction function: only simple interactions, such as dispensing the corresponding emergency materials at the press of a key, can be performed, and prior-art emergency stations cannot handle more complex interactions. In other words, emergency stations in the prior art have the defect of a single interaction function: they cannot satisfy further user needs such as real-time emergency service, real-time emergency consultation, and emergency communication and feedback, which at times makes emergency use inconvenient for users.
Accordingly, there is a need for improvement and development in the art.
Disclosure of Invention
In view of the defects in the prior art, the invention provides a man-machine interaction emergency station that adds new functions to the emergency station. The interaction modes include interaction via a large touch screen, via a camera, via voice, via code scanning with an intelligent terminal, via distance-sensor sensing, and the like, providing great convenience for emergency use by users.
The technical scheme adopted by the invention for solving the problems is as follows:
an emergency station system with man-machine interaction function, comprising:
the touch display screen is used for displaying various emergency touch interaction functions and receiving touch interaction operation instructions of a user;
the camera component is used for acquiring face interaction images and gesture interaction images of the user;
the voice recognition module is used for recognizing a voice interaction instruction of a user;
the code scanning interaction module is used for displaying code scanning interaction two-dimensional codes;
the distance sensor is used for detecting and sensing the distance information of the user;
the function key is used for receiving a function key instruction of a user;
the emergency station main controller, connected respectively to the touch display screen, the camera component, the voice recognition module, the code scanning interaction module and the distance sensor, and used for: receiving the touch interaction operation instructions sent by the touch display screen and controlling and running the corresponding emergency station operation functions according to those instructions; analyzing the face interaction images and gesture interaction images sent by the camera component, deriving the corresponding user requirement, and driving execution of the function corresponding to that requirement; parsing the voice interaction instructions sent by the voice recognition module, deriving the corresponding user voice requirement, controlling realization of the corresponding voice function, and processing voice information for voice output; processing the function key requests sent by the function keys and controlling execution of the corresponding function key operations; and processing the human-presence information sensed by the distance sensor, starting the emergency public-welfare publicity function when a user is sensed within a preset surrounding range;
and the communication module, connected to the emergency station master controller and used for receiving and sending interaction information.
The emergency station system with the man-machine interaction function further comprises:
and the mobile terminal is in communication connection with the communication module and is used for scanning the two-dimensional code of the code scanning interaction module and generating code scanning interaction instructions to perform code scanning interaction.
The emergency station system with the man-machine interaction function further comprises:
the emergency management terminal is in communication connection with the communication module and is used for performing operation control on each functional module of the emergency station system with the man-machine interaction function.
The emergency station system with the man-machine interaction function, wherein the voice recognition module comprises:
an audio module connected with the emergency station master controller,
a microphone connected to the audio module,
and the loudspeaker is connected with the audio module.
The emergency station system with the man-machine interaction function is characterized in that the code scanning interaction module is a display screen for displaying code scanning interaction two-dimensional codes.
A human-computer interaction emergency treatment method based on any one of the emergency station systems with human-computer interaction functions, comprising the steps of:
Controlling the touch display screen to display various emergency touch interaction functions, and detecting whether the touch display screen receives a touch interaction operation instruction of a user in real time;
the camera component is controlled to start, and whether the camera component acquires a face interaction image and a gesture interaction image of a user or not is detected;
detecting whether the voice recognition module recognizes a voice interaction instruction of the user;
the code scanning interaction module is controlled to display a two-dimensional code for interaction, and whether a code scanning interaction instruction of the user mobile terminal is received and acquired or not is detected in real time;
the distance sensor is controlled to start distance sensing, and whether the distance sensor senses a person standing within a specified distance range of the emergency station is confirmed;
detecting whether the function key receives a function key instruction of a user or not;
when the emergency station master controller receives a touch interaction operation instruction sent by the touch display screen, controlling and running corresponding emergency station operation functions according to the touch interaction operation instruction;
when the emergency station master controller receives the face interaction image and the gesture interaction image sent by the camera component, analyzing out corresponding user demands, and driving to execute demand functions corresponding to the user demands;
When the emergency station master controller receives a voice interaction instruction sent by the voice recognition module, the instruction is parsed, the corresponding user voice requirement is derived, realization of the corresponding voice function is controlled, and the result is processed into voice information for voice output;
when the emergency station master controller receives a function key request sent by a function key, the request is processed and execution of the corresponding function key operation is controlled;
when the emergency station master controller learns from the distance sensor that a person is within the specified distance range of the emergency station, starting of the emergency public-welfare publicity function is controlled.
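The poll-and-dispatch flow in the method steps above can be sketched as a small event loop. This is a minimal illustration only: the channel names, the `Event` type, and the handler signature below are assumptions, not part of the claimed system.

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class Event:
    channel: str   # e.g. "touch", "gesture", "voice", "scan", "key", "proximity"
    payload: str   # the raw instruction or sensed value

class EmergencyStationController:
    """Hypothetical master-controller sketch: each input module detects
    events on its channel and the controller routes them to a handler."""

    def __init__(self) -> None:
        self._handlers: dict[str, Callable[[str], str]] = {}

    def register(self, channel: str, handler: Callable[[str], str]) -> None:
        self._handlers[channel] = handler

    def dispatch(self, event: Event) -> Optional[str]:
        # Run the emergency-station operation function bound to this
        # channel, or ignore the event when no handler is configured.
        handler = self._handlers.get(event.channel)
        return handler(event.payload) if handler else None
```

For instance, registering a touch handler and dispatching `Event("touch", "first-aid-menu")` would run the corresponding operation function, while an event on an unregistered channel is simply ignored.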
According to the man-machine interaction emergency processing method, when the emergency station master controller receives the touch interaction operation instruction sent by the touch display screen, the step of controlling and running the corresponding emergency station operation function according to the touch interaction operation instruction comprises the following steps:
the emergency station master controller detects whether the touch display screen receives a touch interaction operation instruction in real time;
when the touch display screen is detected to have received a touch interaction operation instruction, the touch display screen is controlled to send the instruction to the emergency station master controller;
When the emergency station master controller receives a touch interaction operation instruction sent by the touch display screen, controlling and running corresponding emergency station operation functions according to the touch interaction operation instruction;
and while running the corresponding emergency station operation function, the touch display screen is controlled to continue receiving the user's touch interaction instructions and to keep fulfilling the corresponding interaction requirements.
According to the man-machine interaction emergency processing method, when the emergency station master controller receives the face interaction image and the gesture interaction image sent by the camera component, the face interaction image and the gesture interaction image sent by the camera component are analyzed, corresponding user requirements are analyzed, and the step of executing the requirement function corresponding to the user requirements is driven to be executed comprises the following steps:
the emergency station master controller detects whether the camera shoots a face interaction image and a gesture interaction image of a user in real time;
when the camera is detected to shoot the face interaction image and the gesture interaction image of the user, the shot face interaction image and gesture interaction image of the user are sent to the emergency station main controller;
when the emergency station master controller receives the face interaction image and the gesture interaction image sent by the camera assembly, analyzing actions of a user, and analyzing corresponding user requirements;
And driving to execute a demand function corresponding to the user demand according to the analyzed user demand.
According to the man-machine interaction emergency processing method, when the emergency station main controller receives a voice interaction instruction sent by the voice recognition module, the step of parsing the instruction, deriving the corresponding user voice requirement, controlling realization of the corresponding voice function, and processing the result into voice information for voice output comprises the following steps:
the emergency station master controller detects whether the voice recognition module receives a voice interaction instruction in real time;
when the voice interaction instruction is detected to be received by the voice recognition module, the voice interaction instruction is sent to an emergency station master controller;
when the emergency station main controller receives the voice interaction instruction sent by the voice recognition module, the instruction is parsed, the corresponding user voice requirement is derived, realization of the function corresponding to that requirement is controlled, and the result is processed into voice information for voice output.
According to the man-machine interaction emergency processing method, when the emergency station master controller receives a function key request sent by a function key, the step of processing the request and controlling execution of the corresponding function key operation comprises the following steps:
The emergency station master controller detects whether the function key receives a function key request in real time;
when the function key is detected to have received a function key request, the request sent by the function key is received;
according to the received request, the function key request is processed and execution of the corresponding function key operation is controlled;
when the emergency station master controller learns from the distance sensor that a person is within the specified distance range of the emergency station, the step of controlling start of the emergency public-welfare publicity function comprises the following steps:
the emergency station master controller detects whether the distance sensor senses that a person is in a specified distance range from the emergency station in real time;
when the distance sensor senses that a person is within the specified distance range of the emergency station, the emergency public-welfare publicity function is started.
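The distance-sensing step above reduces to a simple threshold check. In the sketch below, the 3-metre range is an assumed placeholder for the "specified distance range", which the patent text does not quantify.

```python
PROMO_RANGE_M = 3.0  # assumed value for the specified distance range

def proximity_action(distance_m: float, threshold_m: float = PROMO_RANGE_M) -> str:
    """Start the public-welfare broadcast only while a person is sensed
    within the specified range of the emergency station; stay idle otherwise."""
    if distance_m <= threshold_m:
        return "start_public_welfare_broadcast"
    return "idle"
```

A real station would also debounce the sensor reading so that a person walking past does not repeatedly restart the broadcast.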
The invention has the beneficial effects that: the invention provides an emergency station system with a man-machine interaction function and a man-machine interaction emergency processing method, which add new functions to the emergency station, including touch large-screen interaction, camera interaction, voice interaction, intelligent-terminal code-scanning interaction, distance-sensor sensing interaction and the like, so that users are better served through multifunctional man-machine interaction. The invention has the following advantages:
1): the emergency station can interact through a touch large screen: touch interaction;
2): the emergency station can interact through the camera: image interaction and gesture interaction;
3): the emergency station can interact through voice: voice and semantic interaction;
4): the emergency station can scan the code interaction through the intelligent terminal: code scanning interaction;
5): distance sensing interaction: when users are sensed nearby, the emergency station plays public-welfare programs such as emergency knowledge.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings that are required to be used in the embodiments or the description of the prior art will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments described in the present invention, and other drawings may be obtained according to the drawings without inventive effort to those skilled in the art.
Fig. 1 is a schematic diagram of a system frame of an emergency station system with man-machine interaction function according to an embodiment of the present invention.
Fig. 2 is a schematic flow chart of man-machine interaction through a touch display screen in the man-machine interaction emergency processing method of the emergency station system with man-machine interaction function provided by the embodiment of the invention.
Fig. 3 is a schematic flow chart of interaction between a camera component and an emergency station in the man-machine interaction emergency processing method of the emergency station system with man-machine interaction function provided by the embodiment of the invention.
Fig. 4 is a schematic diagram of an interaction flow for implementing a user and an emergency station through a voice recognition module in the man-machine interaction emergency processing method of the emergency station system with man-machine interaction function provided by the embodiment of the invention.
Fig. 5 is a schematic diagram of an interaction flow for implementing a user and an emergency station through function keys in a man-machine interaction emergency processing method of an emergency station system with man-machine interaction function provided by an embodiment of the invention.
Fig. 6 is a schematic flow chart of a human-computer interaction emergency processing method of an emergency station system with a human-computer interaction function, in which an emergency station senses a user through a distance sensor and realizes interaction with the user.
Fig. 7 is a schematic flow chart of a human-computer interaction emergency processing method of an emergency station system with a human-computer interaction function, in which an emergency station senses a user through a distance sensor and realizes interaction with the user.
Fig. 8 is a schematic diagram of a process of interaction between a user and an emergency station through a mobile phone by using bluetooth pairing connection in the man-machine interaction emergency processing method of the emergency station system with man-machine interaction function provided by the embodiment of the invention.
Fig. 9 is a schematic diagram of a process flow for realizing interaction between an emergency station and a cloud end and interaction between the emergency station and a remote user and a manager through a communication module and a communication network in a man-machine interaction emergency processing method of an emergency station system with a man-machine interaction function provided by an embodiment of the invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more clear and clear, the present invention will be further described in detail below with reference to the accompanying drawings and examples. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention.
It should be noted that, if directional indications (such as up, down, left, right, front, and rear) are included in the embodiments of the present invention, the directional indications are merely used to explain the relative positional relationship, movement conditions, etc. between the components in a specific posture (as shown in the drawings), and if the specific posture changes, the directional indications change correspondingly.
As shown in fig. 1, an emergency station system with man-machine interaction function provided in an embodiment of the present invention includes: the system comprises a touch display screen 110, a camera assembly 120, a voice recognition module 130, a code scanning interaction module 140, a distance sensor 150, function keys 160, an emergency station master controller 170 and a communication module 180;
The touch display screen 110 is connected with the emergency station main controller 170, and is used for displaying various emergency touch interaction functions and receiving touch interaction operation instructions of a user; the touch display screen can be arranged at a position of the emergency station, which is convenient for a user to watch and operate, for example, the touch display screen is arranged at a door of the emergency station; the touch display screen in the embodiment of the invention is used for displaying various emergency touch interaction functions, can display various touch interaction functions in emergency, and can receive touch operation instructions of users so as to perform corresponding emergency treatment.
The camera assembly 120 is connected with the emergency station main controller 170, and is used for acquiring a face interaction image and a gesture interaction image of a user; in this embodiment, the camera module 120 includes a camera, and the capturing of the facial interaction image and the gesture interaction image of the user refers to capturing facial expressions and gesture actions of the user through the camera or other sensor technologies, and converting the facial expressions and gesture actions into digital image data, so that the computer system can understand and respond to the interaction actions of the user.
For example, a facial interaction image of a user can be obtained through a camera, and the expression of the user can be identified through a facial recognition technology. In a word, the method for acquiring the human face interaction image and the gesture interaction image of the user can help the computer system to better understand and respond to the interaction behavior of the user, so that the user experience and interaction efficiency are improved.
In the embodiment of the invention, the emergency station can capture the image of the user through the camera component 120 and transmit the image to the emergency station main controller, and the emergency station main controller analyzes according to the image captured by the camera component 120 and distinguishes the intention of the user according to the action of the user, thereby meeting the requirements of some special users and realizing the interaction between the user and the emergency station through the action of the user in an emergency state.
The voice recognition module 130 is connected with the emergency station master controller 170 and is used for recognizing voice interaction instructions of users; in the embodiment of the invention, the emergency user can issue a voice emergency instruction through the voice recognition module 130, and convenience is provided for the user in the emergency situation.
In the implementation of the present invention, preferably, as shown in fig. 1, the voice recognition module 130 includes: an audio module 131 connected to the emergency station master 170, a microphone 132 connected to the audio module 131, and a speaker 133 connected to the audio module 131.
In the embodiment of the invention, the microphone and the loudspeaker of the voice recognition module 130 can be arranged at the entrance and exit of the emergency station and are used to realize interaction between the user and the emergency station through sound. The user can speak emergency demands into the microphone; the microphone picks up the user's voice signal and converts it into an analog electrical signal, which the audio module processes and converts into a digital signal transmitted to the emergency station main controller. After analyzing the user's voice signal, the emergency station main controller can fulfil the user's demands, such as dispensing emergency materials or providing emergency services, and at the same time use the loudspeaker to inform the user of the results and of the demand information.
The code scanning interaction module 140 is connected with the emergency station main controller 170 and is used for displaying code scanning interaction two-dimensional codes; preferably, the code scanning interaction module 140 is a display screen for displaying the code scanning interaction two-dimensional code. In the embodiment of the invention, some emergency functions can be connected with the interactive functions of the interactive emergency station after verification by scanning the two-dimensional code displayed by the code scanning interaction module 140 through the user mobile phone terminal 302.
For example, intelligent interaction between the emergency station and the user can be realized by scanning the two-dimensional code displayed by the code scanning interaction module 140 with the user's mobile phone terminal 302, which then performs the related emergency function. Interaction between the user and the emergency station through the mobile phone can, of course, also be realized through a Bluetooth pairing connection. In addition, the emergency station realizes remote interaction with the mobile phone through the communication module and the communication network, so that interaction between the mobile phone and the emergency station is possible even when the user is not on site.
In the embodiment of the present invention, the distance sensor 150 is connected to the emergency station master controller 170 and is used for detecting and sensing the user's distance information. The distance sensor senses whether users are present within a preset surrounding range; if so, the emergency station can play preset videos, such as emergency public-welfare videos, for the on-site user, realizing automatic interaction between the user and the emergency station.
The function key 160 is connected with the emergency station master controller 170, and is used for receiving a function key instruction of a user; the function key may be an operation keyboard. In the embodiment of the invention, the function keys arranged on the emergency station are used for enabling a user to interact with the emergency station directly through the keys and realizing related emergency functions, such as key emergency calling, key emergency alarming, key emergency material taking and the like. The key interaction is concise, convenient and quick, and is the most rapid interaction mode for emergency.
In the embodiment of the present invention, the emergency station master controller 170 is respectively connected to the touch display screen 110, the camera assembly 120, the voice recognition module 130, the code scanning interaction module 140, and the distance sensor 150, where the emergency station master controller 170 is configured to receive a touch interaction operation instruction sent by the touch display screen 110, and control and run a corresponding emergency station operation function according to the touch interaction operation instruction;
that is, in this embodiment, the emergency station master controller 170 may receive touch interaction operation instructions from the touch display screen and control the operation functions of the emergency station according to those instructions. In other words, when a user performs a touch operation on the touch display screen, the system receives the corresponding instruction and performs the related functions of the emergency station accordingly. For example, assuming a touch screen on a fire emergency station, a user may select different operations, such as raising an alarm or calling for rescue, by touching the display screen. When the user operates the touch display screen, the system receives the corresponding instructions and then performs the corresponding emergency station operation functions, such as sending an alarm to a fire department or triggering a fire suppression system. Thus, the operation functions of the emergency station are controlled through touch interaction operation instructions.
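The fire-station example above amounts to a lookup from a touch instruction to a station operation. In this sketch, the instruction names and the resulting actions are hypothetical labels, not operations defined by the patent.

```python
# Hypothetical mapping from touch instructions to station operations.
TOUCH_ACTIONS: dict[str, str] = {
    "alarm": "send_alarm_to_fire_department",
    "call_rescue": "dial_rescue_line",
    "suppress_fire": "trigger_fire_suppression_system",
}

def run_touch_instruction(instruction: str) -> str:
    """Run the operation bound to the touch instruction, falling back
    to a help screen for anything unrecognised."""
    return TOUCH_ACTIONS.get(instruction, "show_help_screen")
```

Keeping the mapping in data rather than branching code makes it easy for the emergency management terminal to reconfigure the screen's functions remotely.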
The emergency station master controller 170 is further configured to analyze the face interaction image and the gesture interaction image sent by the camera assembly 120, analyze a corresponding user requirement, and drive and execute a requirement function corresponding to the user requirement; that is, the emergency station master controller 170 in this embodiment may analyze the face interaction image and the gesture interaction image sent from the camera module 120, and analyze the corresponding user requirements therefrom, and then drive to execute the requirement functions corresponding to the user requirements. In other words, the user's needs are understood by analyzing the user's face and gesture interaction behavior, and corresponding functions are performed accordingly.
For example, if the intelligent interactive camera assembly 120 is equipped at the emergency station, the user may control the emergency station to initiate the relevant emergency functions through facial expressions and gestures. When a user makes a certain emergency facial expression or emergency gesture (e.g., a two-hand lifting gesture as an emergency gesture, an OK gesture corresponding to dialing 110, etc.), the camera assembly captures these interaction images and sends them to the system. The images are analyzed, and the user's needs, such as an emergency call, an emergency alarm, or emergency supplies, are resolved. The functions corresponding to those needs are then driven and executed, realizing emergency calling, emergency alarming, and emergency material dispensing.
In short, by analyzing the face interaction images and gesture interaction images sent by the camera component, the user's requirements can be understood and the corresponding functions driven and executed accordingly, improving the user's interaction experience and efficiency of use.
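The gesture examples above can be modelled as a table from recognised gesture labels to user requirements. The labels are assumed to come from an upstream image classifier that is not described here, and the label strings themselves are illustrative.

```python
from typing import Optional

# Mapping mirrors the gestures named in the text: a two-hand lifting
# gesture as an emergency gesture, and an OK gesture for dialing 110.
GESTURE_REQUIREMENTS: dict[str, str] = {
    "two_hands_raised": "emergency_call",
    "ok_sign": "dial_110",
}

def resolve_gesture(label: str) -> Optional[str]:
    """Translate a recognised gesture label into a user requirement,
    or None when the gesture carries no emergency meaning."""
    return GESTURE_REQUIREMENTS.get(label)
```

The master controller would then drive the function corresponding to the resolved requirement, exactly as it does for touch instructions.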
Further, the emergency station master controller 170 is further configured to parse the voice interaction instruction sent by the voice recognition module 130, determine the corresponding user voice requirement, control execution of the matching voice requirement function, and process the result into voice information for spoken output; in the embodiment of the invention, the voice interaction instruction sent from the voice recognition module 130 is parsed, the user's voice emergency requirement is determined, the corresponding voice emergency function is executed, and the result is converted into voice information and spoken aloud. In other words, the present invention recognizes the user's voice command through the voice recognition module, executes the corresponding emergency function, and outputs the result as speech using voice synthesis technology.
For example, the user may trigger an emergency function by voice command. When the user says "call the police" or "fire alarm", the voice recognition module sends the instruction to the emergency station master controller for parsing, which recognizes the user's emergency requirement as calling the police or raising an alarm. The controller then executes the corresponding emergency function, such as sending an alarm or triggering a fire suppression system, and uses voice synthesis technology to process the result into a voice message, for example responding "the police have been called" or "the fire alarm has been triggered", which is then played aloud.
In short, by parsing the voice interaction instructions sent by the voice recognition module, the system can understand the user's spoken emergency requirement, execute the corresponding emergency function, and convert the result into voice information that is spoken back to the user.
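The command parsing just described can be sketched as keyword matching over the recognized transcript, paired with a spoken confirmation; the phrases, action names, and replies below are illustrative assumptions, not the patent's recognition grammar.

```python
# Hypothetical sketch: keyword-based parsing of a recognized voice command
# and generation of a spoken confirmation. Phrases, actions, and replies
# are illustrative assumptions.

VOICE_INTENTS = {
    "call the police": ("dial_110", "the police have been called"),
    "fire alarm": ("trigger_fire_alarm", "the fire alarm has been triggered"),
}

def handle_voice_command(transcript):
    """Map a recognized transcript to (action, spoken confirmation)."""
    text = transcript.lower()
    for phrase, (action, reply) in VOICE_INTENTS.items():
        if phrase in text:
            return action, reply
    return None, "sorry, the command was not understood"
```

The reply string would then be passed to a speech-synthesis engine for the audible response described above.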
The emergency station master controller 170 is further configured to process the function key request sent by the function key 160 and control execution of the corresponding key operation; that is, the function key request transmitted from the function key 160 is processed and the corresponding function operation is performed according to the request. In other words, when the user presses the function key 160, the system recognizes the request and performs the operation mapped to that key.
For example, a user may control the emergency alarm switch of the emergency station by pressing a function key. When the user presses the emergency alarm function key, the system recognizes the request and performs the corresponding operation, such as issuing an emergency alarm, according to the function mapped to that key. In this way, by processing the function key requests sent by the function keys 160, the present invention can identify the user's requirement and execute the corresponding function operation, realizing intelligent control.
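The key-to-operation binding can be sketched as a table from key codes to operations; the key codes and operation names below are hypothetical, since the patent does not specify them.

```python
# Hypothetical sketch: a function-key table mapping key codes to
# emergency operations. Key codes and operation names are illustrative.

KEY_OPERATIONS = {
    1: "emergency_alarm",
    2: "emergency_call",
    3: "dispense_supplies",
}

def handle_key_press(key_code):
    """Return the operation bound to a pressed function key."""
    op = KEY_OPERATIONS.get(key_code)
    if op is None:
        raise ValueError(f"unmapped function key: {key_code}")
    return op
```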
The emergency station master controller 170 is further configured to process the human presence information sensed by the distance sensor 150 and start the emergency public welfare propaganda function when a user is sensed within a predetermined surrounding range; that is, in the embodiment of the present invention, the presence information sensed by the distance sensor 150 may be processed and the emergency public welfare propaganda function started according to it. In other words, when the distance sensor 150 senses that a user is within the predetermined surrounding range, the system activates the propaganda function. For example, assume a public place is equipped with a distance sensor 150 for detecting whether a person is present. When the sensor detects someone, the system processes the sensed presence information and starts the propaganda function accordingly, for example playing a public service announcement reminding people to pay attention to safety, or playing an epidemic prevention video reminding people to take protective measures.
In this way, by processing the human presence information sensed by the distance sensor 150, the system can recognize that a user is present and start the corresponding emergency public welfare propaganda function, thereby improving users' safety awareness and health awareness.
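The trigger condition above reduces to a range check on the sensed distance; a minimal sketch follows, in which the 5-metre threshold is an illustrative assumption (the patent leaves the predetermined range unspecified).

```python
# Hypothetical sketch: start propaganda playback when a person is sensed
# within a configurable range. The 5-metre threshold is an assumption.

PROPAGANDA_RANGE_M = 5.0

def should_start_propaganda(distance_m, playing):
    """Start playback only when a person is in range and nothing is
    already playing; a None distance means nothing was sensed."""
    return (distance_m is not None
            and distance_m <= PROPAGANDA_RANGE_M
            and not playing)
```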
Further, the communication module 180 is connected to the emergency station master controller 170 and is configured to receive and transmit interaction information. That is, in embodiments of the present invention, interaction information is received and transmitted through the communication module 180. The communication module 180 may be any hardware module capable of communication, such as a Wi-Fi module, a Bluetooth module, or a mobile communication module. Through the communication module 180, the system can communicate with users or other devices to exchange interaction information, realizing functions such as intelligent control and information transfer.
Further, as shown in fig. 1, the emergency station system with man-machine interaction function further includes:
the mobile terminal 301 is communicatively connected to the communication module 180 and is configured to scan the two-dimensional code of the code scanning interaction module 140 and generate a code scanning interaction instruction for code scanning interaction. In the embodiment of the invention, scanning the two-dimensional code on the code scanning interaction module 140 generates a code scanning interaction instruction, enabling emergency code scanning interaction. In other words, when the user scans the two-dimensional code on the code scanning interaction module 140, the system generates the corresponding instruction and performs the emergency code scanning interaction operation.
For example, it is assumed that the emergency station system with man-machine interaction function of the present invention is equipped with a code scanning interaction module 140 for performing emergency code scanning interaction. When an emergency occurs in a place, a user can acquire related information or perform emergency operation by scanning the two-dimensional code on the code scanning interaction module 140. For example, when a fire occurs in a site, a user may obtain information such as an escape route map or a fire emergency treatment guide by scanning the two-dimensional code on the code scanning interaction module 140.
In short, by scanning the two-dimensional code on the code scanning interaction module 140, the user's requirement can be identified, the corresponding instruction generated, and the emergency code scanning interaction performed, improving the user's emergency response capability.
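One way to carry a code scanning interaction instruction is to encode the station and function in the QR payload itself; the `station://` scheme below is a hypothetical format invented for illustration, not a published specification.

```python
# Hypothetical sketch: encoding an emergency function in a QR payload and
# parsing it on the mobile terminal. The "station://" scheme is an
# illustrative assumption, not a published format.

def build_qr_payload(station_id, function):
    """Compose the payload encoded into the station's displayed QR code."""
    return f"station://{station_id}/{function}"

def parse_qr_payload(payload):
    """Return (station_id, function) from a scanned payload, or None
    when the payload is not a station QR code."""
    prefix = "station://"
    if not payload.startswith(prefix):
        return None
    parts = payload[len(prefix):].split("/", 1)
    if len(parts) != 2:
        return None
    return parts[0], parts[1]
```

On the escape-route example above, a station might display `build_qr_payload("A12", "escape_map")`, and the mobile terminal would parse it back to decide which information page to fetch.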
The emergency management terminal 200 is communicatively connected to the communication module 180 and is used to operate and control each functional module of the emergency station system with man-machine interaction function. That is, the emergency management terminal 200 in the embodiment of the present invention may exercise operational control over each functional module of the system. In other words, through the emergency management terminal 200, each functional module of the emergency station system can be operated remotely.
For example, when an emergency such as a natural disaster or traffic accident occurs, emergency managers may operate and control the emergency station system with man-machine interaction function through the emergency management terminal 200. They can send instructions to the emergency station system through the terminal, such as issuing emergency notifications, scheduling rescue resources, and monitoring the disaster situation. In this way, emergency management personnel can operate and control the entire emergency station system in real time through the emergency management terminal 200.
In short, through the emergency management terminal 200's operational control over each functional module of the emergency station system with man-machine interaction function, emergency management personnel can respond quickly to emergencies and effectively carry out command scheduling and resource management.
Based on the emergency station system with the man-machine interaction function in the above embodiment, the embodiment of the present invention further provides a man-machine interaction emergency processing method of the emergency station system with the man-machine interaction function in any one of the above embodiments, including the steps of:
s100, controlling the touch display screen 110 to display various emergency touch interaction functions, and detecting whether the touch display screen 110 receives a touch interaction operation instruction of a user in real time;
And controlling the camera component 120 to start, and detecting whether the camera component 120 acquires the face interaction image and the gesture interaction image of the user;
and detecting whether the voice recognition module 130 recognizes a voice interaction instruction of the user;
and controlling the code scanning interaction module 140 to display the two-dimensional code for interaction, and detecting whether the code scanning interaction instruction of the user mobile terminal is received and acquired in real time;
and controlling the distance sensor 150 to start distance sensing, and confirming whether the distance sensor 150 senses that a person is within a specified distance range of the emergency station;
and detecting whether the function key 160 receives a function key instruction of a user;
s200, when the emergency station main controller 170 receives a touch interaction operation instruction sent by the touch display screen 110, controlling to run corresponding emergency station operation functions according to the touch interaction operation instruction;
when the emergency station master controller 170 receives the face interaction image and the gesture interaction image sent by the camera assembly 120, the face interaction image and the gesture interaction image sent by the camera assembly 120 are analyzed, corresponding user requirements are analyzed, and a requirement function corresponding to the user requirements is driven to be executed;
When the emergency station master controller 170 receives the voice interaction instruction sent by the voice recognition module 130, the instruction is parsed, the corresponding user voice requirement is determined and the matching voice requirement function is executed, and the result is processed into voice information for spoken output;
when the emergency station master controller 170 receives the function key request sent by the function key 160, the request is processed and the corresponding key operation is executed;
when the emergency station master 170 receives that the distance sensor 150 senses that the person is within a specified distance range from the emergency station, the emergency public welfare propaganda function is controlled to be started.
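The detection-and-dispatch cycle of steps S100 and S200 can be sketched as the master controller polling each input source in turn and dispatching any pending event; the source and handler objects below are hypothetical stand-ins for the touch screen, voice module, and function keys.

```python
# Hypothetical sketch of the S100/S200 cycle: poll every interaction
# source once and dispatch pending events. Sources and handlers are
# illustrative stubs.

def run_cycle(sources, handlers):
    """`sources` maps a name to a zero-argument poll function returning
    an event or None; `handlers` maps the same names to event handlers.
    Returns the list of handler results for this cycle."""
    results = []
    for name, poll in sources.items():
        event = poll()
        if event is not None:
            results.append(handlers[name](event))
    return results

# Example wiring with stub sources: only the touch screen has an event.
sources = {
    "touch": lambda: "fire_alarm",
    "voice": lambda: None,
    "keys": lambda: None,
}
handlers = {
    "touch": lambda e: f"ran touch function {e}",
    "voice": lambda e: f"ran voice function {e}",
    "keys": lambda e: f"ran key function {e}",
}
```

A real controller would run this cycle in a loop, with the camera and distance sensor added as further sources.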
In the embodiment of the present invention, when the emergency station master controller 170 receives the touch interaction operation instruction sent by the touch display screen 110, the step of controlling to run the corresponding emergency station operation function according to the touch interaction operation instruction includes:
the emergency station master controller 170 detects whether the touch display screen 110 receives a touch interactive operation instruction in real time;
when it is detected that the touch display screen 110 has received a touch interaction operation instruction, controlling the touch display screen 110 to send the touch interaction operation instruction to the emergency station master controller 170;
When the emergency station master controller 170 receives the touch interaction operation instruction sent by the touch display screen 110, controlling to run corresponding emergency station operation functions according to the touch interaction operation instruction;
while running the corresponding emergency station operation function, the touch display screen 110 is controlled to continuously receive the touch interaction instruction of the user to continuously complete the corresponding interaction requirement.
Specifically, the flow of man-machine interaction through the touch display screen is shown in fig. 2, and the method comprises the following steps:
s211, displaying various touch function information on a touch display screen of the emergency station;
this step describes displaying the various available function options on the touch screen of the emergency station; the user can select the corresponding function by touching. For example: the touch display screen of the emergency station displays function options such as "call rescue", "view map", and "report fault".
S212, a user touches a corresponding function according to the function displayed on the touch display screen;
in this step, the user selects the corresponding function by touching the function options displayed on the touch screen, triggering the corresponding operation. For example: the user clicks the "call rescue" button on the touch screen to request help.
S213, the emergency station operates corresponding functions according to the touch result of the user;
in this embodiment, the emergency station may perform corresponding functional operations, such as calling for rescue, displaying a map, recording fault information, and the like, according to a selection of a user on the touch screen. For example: when the user clicks the 'call rescue' button, the emergency station system can automatically send out a distress signal and inform relevant personnel.
S214, the emergency station continuously interacts with the user through the touch display screen in the process of operating the function selected by the user;
that is, in this embodiment, the emergency station system may remain interactive with the user during the execution of the user-selected function, which may require the user to provide additional information or confirmation. For example: after the user calls for rescue, the emergency station system can ask the user to confirm the help seeking information through the prompt on the touch screen.
S215, the user and the emergency station complete the interaction through the touch display screen, and the user's requirement is met.
The final objective in the embodiment of the invention is to realize good interaction between the user and the emergency station system through the touch screen, meet the requirements of the user, and ensure that the system can effectively respond to the operation of the user. For example: the user selects 'view map' on the touch screen, and the emergency station system immediately displays the map and allows the user to perform operations such as zooming in and zooming out so as to meet the requirement of the user on map information.
From the above, the invention realizes the interaction between the user and the emergency station system through the touch screen, and the user can select different functions through the touch screen and interact with the system in real time so as to meet the requirements of the user in emergency.
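The continued interaction of steps S214 and S215 can be sketched as a session that asks the user to confirm before dispatching; the prompt and result strings below are illustrative assumptions following the call-rescue example.

```python
# Hypothetical sketch of steps S214/S215: a rescue request that stays
# interactive, asking the user to confirm before dispatch. Prompt and
# result strings are illustrative assumptions.

def rescue_session(confirm):
    """Run a call-rescue flow; `confirm` is the user's touch answer
    (True/False) to the confirmation prompt. Returns (prompt, outcome)."""
    prompt = "confirm the rescue request?"
    if confirm:
        return prompt, "rescue request sent"
    return prompt, "rescue request cancelled"
```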
Further, in the human-computer interaction emergency processing method, when the emergency station master controller 170 receives the face interaction image and the gesture interaction image sent by the camera assembly 120, the steps of analyzing the face interaction image and the gesture interaction image sent by the camera assembly 120, analyzing the corresponding user requirements, and driving to execute the requirement functions corresponding to the user requirements include:
the emergency station main controller 170 detects whether the camera shoots a face interaction image and a gesture interaction image of a user in real time;
when it is detected that the camera has captured face interaction images and gesture interaction images of the user, the captured images are sent to the emergency station master controller 170;
when the emergency station master controller 170 receives the face interaction image and the gesture interaction image sent by the camera assembly 120, the face interaction image and the gesture interaction image sent by the camera assembly 120 are analyzed, actions of a user are analyzed, and corresponding user requirements are analyzed;
And driving to execute a demand function corresponding to the user demand according to the analyzed user demand.
According to the embodiment of the invention, the camera of the emergency station can capture images of the user and transmit them to the emergency station CPU. The CPU analyzes the images captured by the camera and infers the user's intention from the user's actions, so that in an emergency the needs of certain special users can be met and interaction between the user and the emergency station realized through the user's actions.
specifically, in the embodiment of the present invention, a flow of interaction between the camera assembly and the emergency station is shown in fig. 3, and the method includes the following steps:
s221, when the emergency station master controller detects that a user exists, a camera is opened to acquire a user image;
in the step, when the emergency station master controller detects that a user enters the system range, the camera is automatically opened to acquire the image information of the user. For example: when a user enters the monitoring range of the emergency station, the emergency station main controller can automatically open the monitoring camera to acquire the image of the user.
S222, the camera transmits the image of the user to the emergency station for analysis, in particular analysis of the user's actions;
in this embodiment, the camera may transmit the acquired user image to the emergency station master, and the emergency station master may analyze the image, in particular, analyze the user's action. For example: the camera transmits the acquired user image to the emergency station main controller, and the emergency station main controller analyzes the image, for example, identifies information such as gestures and postures of the user.
S223, analyzing the requirements of the user by the emergency station master controller through analyzing the actions of the user;
in this step, the emergency station master controller can analyze the user's needs, such as the user needs to seek help, acquire certain information, etc., through analysis of the user's actions. For example: through analysis of the user gestures, the emergency station master controller can analyze the requirements of the user for help.
S224, driving corresponding functions by the emergency station master controller according to the requirements of the user, and completing the corresponding requirements of the user;
in the step, the emergency station master controller drives corresponding functional modules according to the requirements of the user to finish the requirements of the user, such as calling rescue, displaying a map and the like. For example: when the emergency station master controller analyzes the demand of the user for help, the system can automatically call rescue and inform related personnel.
S225, the emergency station master controller controls the camera to capture the action of the user and analyze the user demand, so that the user demand is met;
in the embodiment, the emergency station master controller can capture the action of the user by controlling the camera, and analyze the requirement of the user by analyzing the action of the user so as to meet the requirement of the user. For example: the emergency station master controller can capture gesture actions of the user by controlling the camera, and analyze the requirements of the user by analyzing the gestures, so that the requirements of the user are met.
From the above, in the embodiment of the invention, the camera is used for acquiring the image information of the user, and the user requirement is analyzed through analysis of the user action, so that the corresponding functional module is driven to meet the user requirement.
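The S221-S225 flow forms a small pipeline: detect presence, capture an image, analyze the action, and drive the matching function. The sketch below wires that pipeline with stub stages; the analyzer and action names are illustrative assumptions.

```python
# Hypothetical sketch of the S221-S225 pipeline. The capture, analyze,
# and drive stages are illustrative stubs, not real vision code.

def camera_pipeline(user_present, capture, analyze, drive):
    """Run one camera interaction cycle; returns the driven function's
    result, or None when no user is present."""
    if not user_present:
        return None
    image = capture()          # S221: open camera, acquire user image
    need = analyze(image)      # S222-S223: analyze action, infer need
    return drive(need)         # S224: drive the matching function

result = camera_pipeline(
    True,
    capture=lambda: "frame-001",
    analyze=lambda img: "call_rescue",
    drive=lambda need: f"executed {need}",
)
```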
In a further embodiment, when the emergency station master controller 170 receives the voice interaction instruction sent by the voice recognition module 130, the steps of parsing the instruction, determining the corresponding user voice requirement, controlling execution of the matching voice requirement function, and processing the result into spoken voice information include:
the emergency station master controller 170 detects in real time whether the voice recognition module 130 receives a voice interaction instruction;
when the voice recognition module 130 receives the voice interaction instruction, the voice interaction instruction is sent to the emergency station master controller 170;
when the emergency station master controller 170 receives the voice interaction instruction sent by the voice recognition module 130, the instruction is parsed, the corresponding user voice requirement is determined, the function matching that requirement is executed, and the result is processed into voice information for spoken output.
In specific implementation, as shown in fig. 4, the interactive flow between the user and the emergency station is implemented through the voice recognition module 130 according to the embodiment of the present invention, which includes the following steps:
s231, the emergency station master controller controls a microphone of the voice recognition module to receive the voice of the user;
in the step, the emergency station master controller controls a microphone of the voice recognition module and is used for receiving voice instructions or information of a user. For example: when the user speaks "i need help", the emergency station master controls the microphone of the speech recognition module to receive the user's voice.
S232, the emergency station master controller processes the received user voice signals and analyzes the requirements of the user;
in the step, the emergency station master controller processes the received user voice signals and analyzes the requirements expressed by the user through a voice recognition technology. For example: when the voice recognition module receives the 'I need help' uttered by the user, the emergency station master controller can analyze the requirement that the user needs to seek help.
S233, after the emergency station master controller analyzes the requirements of the user, driving the execution mechanism to complete the requirements of the user;
in the step, the emergency station master controller drives a corresponding executing mechanism to meet the requirements of the user according to the analyzed user requirements, such as calling rescue, information displaying and the like. For example: when the emergency station master controller analyzes the demand of the user for help, the system can automatically drive the execution mechanism to call rescue and inform related personnel.
S234, the emergency station master controller feeds the execution result back to the user through the speaker and the screen, completing the interaction;
in this embodiment, the emergency station master controller feeds back the execution result to the user through the equipment such as the loudspeaker and the screen, and the interaction process of the user requirement is completed. For example: when the emergency station master controller finishes processing the user demands, the system feeds back execution results to the user through a loudspeaker and a screen, such as playing prompt tones and displaying information that help is in place.
In this embodiment, the microphone and the speaker of the emergency station enable the user to interact with the emergency station through sound. The user speaks the emergency requirement into the microphone; the microphone picks up the user's voice signal and converts it into an analog electrical signal, the audio module processes this into a digital signal, and the digital signal is transmitted to the emergency station CPU for processing. The CPU analyzes the user's voice signal and then fulfils the user's requirement accordingly, such as providing emergency materials or emergency services, while the speaker is used to inform the user of the result and related information.
Therefore, the invention can receive the voice instruction or information of the user through the voice recognition module, drive the corresponding executing mechanism to meet the user requirement through processing and analyzing the voice signal, and finally feed back the executing result to the user through equipment such as a loudspeaker, a screen and the like, thereby completing the interaction process of the user requirement.
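The end-to-end S231-S234 flow (receive transcript, resolve need, drive actuator, feed back over speaker and screen) can be sketched as follows; the resolver, actuator, and feedback channels are illustrative stubs.

```python
# Hypothetical sketch of S231-S234: resolve a recognized transcript,
# drive an actuator, and feed the result back over speaker and screen.
# All stages here are illustrative stubs.

def voice_flow(transcript, resolve, actuate, speak, display):
    need = resolve(transcript)       # S232: parse the user's need
    result = actuate(need)           # S233: drive the actuator
    speak(result)                    # S234: audible feedback
    display(result)                  # S234: on-screen feedback
    return result

spoken, shown = [], []
outcome = voice_flow(
    "i need help",
    resolve=lambda t: "call_rescue",
    actuate=lambda n: "help is on the way",
    speak=spoken.append,
    display=shown.append,
)
```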
In a further embodiment, when the emergency station master controller 170 receives a function key request sent by a function key, the steps of processing the function key request sent by the function key and controlling to execute the corresponding function key implementation operation include:
the emergency station master controller 170 detects in real time whether the function key 160 receives a function key request;
when the function key 160 is detected to receive the function key request, controlling to receive the function key request sent by the function key 160;
according to the function key request sent by the function key 160, the request is processed and the corresponding key operation is executed.
In specific implementation, as shown in fig. 5, the embodiment of the present invention implements the interaction flow between the user and the emergency station through function keys, including the following steps:
s241, an emergency station (a master controller) detects whether a function key is pressed;
in this step, the emergency station master controller determines whether a function key is pressed by detecting the sensor or key state so as to trigger a corresponding function module. For example: when the user presses the "call rescue" button, the emergency station master controller detects that the key is pressed.
S242, when the emergency station (main controller) detects that a functional key is pressed, the functional module corresponding to the key is controlled to be started;
namely, in the step: after the emergency station master controller detects that the function key is pressed, the function module corresponding to the key is controlled to be started so as to meet the requirement of a user. For example: when the emergency station master controller detects that the call rescue button is pressed, the system starts a call rescue function module.
S243, the emergency station (master controller) interacts with the user through a screen and a loudspeaker in the process of completing the function required by the corresponding user;
in this step, when the emergency station master controller completes the function required by the user, the emergency station master controller interacts with the user through devices such as a screen and a loudspeaker to feed back the execution result or provide further guidance. For example: when the user presses the 'call rescue' button, the emergency station master controller can display the 'rescue request sent' through a screen and play a prompt tone through a loudspeaker to interact with the user.
S244, the emergency station (main controller) rapidly and conveniently completes the user demand through key and user interaction;
according to the embodiment of the invention, the emergency station master controller can rapidly and conveniently complete the requirements of the user through the key and the user interaction, so that the user experience and the emergency response efficiency are improved. For example: after a user presses a 'call rescue' button, the emergency station master controller rapidly starts a rescue flow and interacts with the user through a screen and a loudspeaker, so that the user demand is rapidly and conveniently completed.
From the above, the function keys of the emergency station enable a user to interact with the station directly through the keys and trigger the related emergency functions, such as one-key emergency calling, one-key emergency alarming, and one-key retrieval of emergency materials. Key interaction is concise, convenient, and fast, and is the quickest interaction mode in an emergency. In this embodiment, the state of the function key is detected to trigger the corresponding functional module, and the user interacts with devices such as the screen and speaker, so that the user's requirement is completed quickly and conveniently.
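Step S241's key-state detection typically needs debouncing so that contact bounce is not read as repeated presses; a minimal software debounce sketch follows, in which the sample count is an illustrative assumption.

```python
# Hypothetical sketch of step S241: software debouncing of a function
# key sampled from a GPIO-style input. The required sample count is an
# illustrative assumption.

def debounced_press(samples, required=3):
    """Report a press only after `required` consecutive active samples,
    filtering contact bounce."""
    run = 0
    for s in samples:
        run = run + 1 if s else 0
        if run >= required:
            return True
    return False
```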
In a further embodiment of the present invention, when the emergency station master controller 170 receives that the distance sensor 150 senses that the person is within a specified distance range from the emergency station, the step of controlling to start the emergency public welfare propaganda function includes:
the emergency station master 170 detects in real time whether the distance sensor 150 senses that the person is within a specified distance range from the emergency station;
when the distance sensor 150 senses that the person is within a specified distance range from the emergency station, an emergency public welfare propaganda function is started.
As shown in fig. 6, the emergency station of the embodiment senses a user through a distance sensor and implements a process of interacting with the user, and includes the following steps:
S251, an emergency station (a master controller) senses whether a user is around the emergency station through a distance sensor;
in this embodiment, the emergency station master controller detects whether there are users around through the distance sensor, so as to respond to the needs of the users in time or provide corresponding services. For example: when a user approaches the emergency station, the distance sensor detects the presence of the user.
S252, starting to start corresponding functions when the emergency station (master controller) detects that the distance sensor senses that users exist around;
in this step, when detecting that the distance sensor senses that there is a user within a predetermined range around, for example, within 50 meters, the emergency station master controller starts a corresponding functional module to meet the needs of the user or provide related services. For example: when the distance sensor detects that a user approaches, the emergency station master controller can start a user navigation function to guide the user to reach a destination.
S253, when the emergency station (master controller) learns from the distance sensor that users are nearby, it starts related functions such as emergency public welfare propaganda, so that users can obtain the related emergency functions immediately;
in this step, on learning from the distance sensor that users are nearby, the emergency station master controller starts related functions such as emergency public welfare propaganda, so that users obtain the related emergency functions or information immediately. For example: when the distance sensor detects an approaching user, the emergency station master controller can start playing emergency public welfare propaganda information, so that the user obtains relevant emergency knowledge and information at once.
From the above, the distance sensor of the emergency station disclosed by the invention senses whether users are nearby; if so, the emergency station plays emergency public welfare videos and other preset videos for on-site users, realizing automatic interaction between the users and the emergency station. The invention detects the presence of a user through the distance sensor and starts the corresponding functional module according to the detection result, so that the user obtains emergency-related functions or information immediately.
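The proximity-triggered playback in steps S251 to S253 can be sketched as a per-reading decision function. This is a minimal sketch, not the patented implementation; the 50 m threshold follows the example in the text, and the sensor and player are stubbed out:

```python
# Illustrative sketch of the proximity-triggered public welfare playback
# (steps S251-S253). The 50 m range is the example from the description;
# the function decides the controller action for one distance reading.

TRIGGER_RANGE_M = 50.0  # example detection range given in the text

def on_distance_reading(distance_m: float, playing: bool) -> str:
    """Decide the master controller's action for one sensor reading."""
    if distance_m <= TRIGGER_RANGE_M and not playing:
        # a user entered the range: start the public welfare video
        return "start_public_welfare_video"
    if distance_m > TRIGGER_RANGE_M and playing:
        # no user left in range: stop playback
        return "stop_playback"
    return "no_change"
```

A real controller would call this in a polling loop (the "detects in real time" of the description) and debounce readings so that a person briefly passing by does not toggle playback.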
In this embodiment, as shown in fig. 7, the process by which the emergency station interacts with the user through a two-dimensional code shown on its screen includes the following steps:
S251, the emergency station (master controller) controls the display screen to display two-dimensional codes for various functions;
in this step, the emergency station master controller can display function-related two-dimensional codes on the display screen, so that a user can acquire the related functions or information by scanning them. For example: the emergency station master controller displays a "call for rescue" two-dimensional code on the display screen, so that a user can initiate a rescue request by scanning it.
S252, the user scans the two-dimensional code on the emergency station screen with a mobile phone and performs the related operation;
in this step, the user can scan the two-dimensional code displayed on the emergency station screen with the mobile phone camera and perform the related operations indicated by the code, for example initiating a rescue request or acquiring emergency information. For example: the user scans the "call for rescue" two-dimensional code displayed on the emergency station screen to initiate a rescue request.
S253, after scanning the two-dimensional code and entering the related interface, the user can interact with the emergency station through the mobile phone;
in this step, once the user has entered the related interface by scanning the two-dimensional code, the user can interact with the emergency station through the mobile phone to obtain the required emergency service or information. For example: after scanning the "call for rescue" two-dimensional code, the user can communicate with the emergency station by text or voice on the mobile phone interface to describe the specific emergency.
From the above, two-dimensional codes can be displayed on the display screen, so that a user can acquire related functions or information by scanning them with a mobile phone and can interact with the emergency station through the phone, meeting the user's emergency needs.
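The per-function codes of step S251 can be sketched as URL payloads that the station encodes into the on-screen two-dimensional codes. The URL scheme, endpoint and parameter names below are assumptions for illustration only; the patent does not specify the payload format:

```python
# Illustrative sketch of building per-function QR payloads for the display
# screen. The endpoint and parameter names are hypothetical assumptions.
from urllib.parse import urlencode

BASE_URL = "https://example-station.invalid/interact"  # hypothetical endpoint

def qr_payload(function: str, station_id: str) -> str:
    """Encode one on-screen function as a scannable URL payload."""
    # The phone's camera app opens this URL, landing the user on the
    # related interface (step S253) for that function and station.
    return BASE_URL + "?" + urlencode({"fn": function, "station": station_id})

print(qr_payload("call_rescue", "ES-001"))
```

The payload string would then be rendered as a QR image by any standard QR library and drawn on the touch display.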
In this embodiment, as shown in fig. 8, the flow by which the user interacts with the emergency station through the mobile phone via a Bluetooth pairing connection includes the steps of:
S261, the user successfully pairs with the emergency station through the mobile phone's Bluetooth;
in this step, the user pairs with the emergency station through the mobile phone's Bluetooth function to establish a Bluetooth communication connection, so that the phone can exchange data with and control the emergency station. For example: the user searches for and connects to the nearby emergency station device in the phone's Bluetooth settings, successfully establishing a Bluetooth connection.
S262, the emergency station master controller projects its operation interface onto the screen of the user's mobile phone;
in this embodiment: the emergency station master controller can cast the operation interface of the emergency station onto the user's phone screen, so that the user can operate the station's functional interface directly from the phone. For example: the emergency station master controller projects its operation interface onto the user's phone screen in real time over the Bluetooth connection, and the user sees the emergency station's operation interface directly on the phone.
S263, the user operates the projected interface of the emergency station on the mobile phone to realize various functions;
in this embodiment: the user can operate the emergency station's projected interface on the phone to realize various functions, such as calling for rescue, obtaining emergency information, or navigation. For example: through the projected interface on the phone, the user can call for rescue, send emergency information, or look up the nearest emergency facility and navigate to it.
From the above, in this embodiment of the invention the mobile phone is connected to the emergency station over Bluetooth, so that the user can operate the emergency station's functional interface through the phone and thereby carry out various emergency operations.
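The pairing-then-mirroring flow of steps S261 to S263 can be modeled as a small session state machine. This is a sketch under stated assumptions: the states and events are illustrative, and no real Bluetooth stack is used:

```python
# Illustrative state machine for the Bluetooth session flow (S261-S263).
# States and event names are assumptions for illustration only.

TRANSITIONS = {
    ("idle", "pair_request"): "paired",          # S261: phone pairs with station
    ("paired", "start_mirroring"): "mirroring",  # S262: controller casts its UI
    ("mirroring", "user_action"): "mirroring",   # S263: user operates via phone
    ("mirroring", "disconnect"): "idle",         # session torn down
    ("paired", "disconnect"): "idle",
}

def step(state: str, event: str) -> str:
    """Advance the session; unknown events leave the state unchanged."""
    return TRANSITIONS.get((state, event), state)
```

Keeping unknown events as no-ops makes the controller robust to out-of-order phone messages, e.g. a `user_action` arriving before mirroring has started.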
In the embodiment of the invention, as shown in fig. 9, the process by which the emergency station interacts with the cloud and with remote users and managers through the communication module and the communication network comprises the following steps:
S271, the emergency station receives data from the communication module or sends data to it;
in this embodiment: the emergency station can receive data from other devices or users through the communication module, and can send data to them through the communication module, realizing information exchange and communication. For example: the emergency station receives a call-for-rescue message from a user's mobile phone through the communication module and sends emergency information to the cloud through the communication module.
S272, the communication module connects to a communication network and exchanges data with it;
in this step: the communication module can be connected to a communication network, through which it exchanges data and communicates with other devices or users. For example: the communication module connects to a 4G network, exchanges data with the cloud server, and receives emergency instructions from the cloud or sends emergency information to it.
S273, the emergency station connects to the cloud through the communication module and the communication network, and connects to a remote user terminal or an emergency personnel terminal;
in this embodiment: the emergency station can be connected to the cloud server through the communication module and the communication network, and at the same time to a remote user terminal or an emergency personnel terminal, realizing remote communication and control. For example: the emergency station connects to the cloud server through the communication module to exchange data with and be controlled from the cloud, and can also connect to a rescue worker's terminal for real-time communication and command.
S274, the emergency station interacts with the cloud and with a remote user or a remote emergency rescuer through the communication network;
in this embodiment, the emergency station can exchange data with the cloud through the communication network, and can also communicate and interact in real time with a remote user or a remote emergency rescuer. For example: the emergency station synchronizes data and exchanges information with the cloud server through the communication network, and can also hold voice or text communication with a remote user's mobile phone to obtain guidance or provide real-time information.
From the above, these functions enable the emergency station to connect and interact with the cloud and with remote users through the communication module and the communication network, achieving more efficient emergency response and rescue service.
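The uplink/downlink exchange of steps S271 to S274 can be sketched as serialized messages flowing through the communication module. The message fields and routing keys below are illustrative assumptions; in practice this traffic would ride the 4G (or similar) network link mentioned above:

```python
# Hedged sketch of station <-> cloud messaging (steps S271-S274).
# Field names and command strings are assumptions, not from the patent.
import json

def build_uplink(station_id: str, event: str, payload: dict) -> str:
    """Serialize a station event (e.g. a call for rescue) for the cloud."""
    return json.dumps({"station": station_id, "event": event, "data": payload})

def route_downlink(message: str) -> str:
    """Pick the local module that handles a command received from the cloud."""
    cmd = json.loads(message).get("cmd")
    return {"play_notice": "speaker", "open_locker": "supplies"}.get(cmd, "ignore")
```

Here an uplink carries a rescue request or status report to the cloud server, while a downlink command from the cloud (or a remote manager) is routed to the matching local module; unrecognized commands are ignored rather than crashing the controller.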
In summary, the invention provides an emergency station system with a man-machine interaction function and a man-machine interaction emergency processing method. New functions are added to the emergency station, including touch large-screen interaction, camera interaction, voice interaction, intelligent-terminal code-scanning interaction and distance-sensor sensing interaction, so that through multifunctional man-machine interaction the system can better serve users. The invention has the following advantages:
1): the emergency station can interact through a touch large screen: touch interaction;
2): the emergency station can interact through the camera: image interaction and gesture interaction;
3): the emergency station can interact through voice: voice and semantic interaction;
4): the emergency station can scan the code interaction through the intelligent terminal: code scanning interaction;
5): distance sensing interaction: when users are sensed nearby, the emergency station plays public welfare programs such as emergency knowledge.
It is to be understood that the invention is not limited in its application to the examples described above, but is capable of modification and variation in light of the above teachings by those skilled in the art, and that all such modifications and variations are intended to be included within the scope of the appended claims.

Claims (10)

1. An emergency station system with man-machine interaction function, comprising:
The touch display screen is used for displaying various emergency touch interaction functions and receiving touch interaction operation instructions of a user;
the camera component is used for acquiring face interaction images and gesture interaction images of the user;
the voice recognition module is used for recognizing a voice interaction instruction of a user;
the code scanning interaction module is used for displaying code scanning interaction two-dimensional codes;
the distance sensor is used for detecting and sensing the distance information of the user;
the function key is used for receiving a function key instruction of a user;
the emergency station main controller is respectively connected with the touch display screen, the camera component, the voice recognition module, the code scanning interaction module and the distance sensor, and is used for receiving touch interaction operation instructions sent by the touch display screen and controlling and running corresponding emergency station operation functions according to the touch interaction operation instructions; the system comprises a camera component, a gesture interaction image and a human face interaction image, wherein the camera component is used for analyzing the human face interaction image and the gesture interaction image sent by the camera component, analyzing out corresponding user requirements and driving to execute requirement functions corresponding to the user requirements; the voice interaction instruction sent by the voice recognition module is analyzed, the voice demand control of the corresponding user is analyzed to realize the corresponding voice demand function, and voice information is processed to output voice sound; the function key processing module is used for processing the function key request sent by the function key and controlling and executing the corresponding function key to realize operation; and for processing the human body information sensed by the distance sensor, when a user is sensed to exist in a preset surrounding range, an emergency public welfare propaganda function is started;
And the communication module is connected with the emergency station master controller and is used for receiving and sending interaction information through the communication module.
2. The emergency station system with man-machine interaction function according to claim 1, further comprising:
and the mobile terminal is in communication connection with the communication module and is used for scanning the two-dimensional code of the code scanning interaction module and generating code scanning interaction instructions to perform code scanning interaction.
3. The emergency station system with man-machine interaction function according to claim 1, further comprising:
the emergency management terminal is in communication connection with the communication module and is used for performing operation control on each functional module of the emergency station system with the man-machine interaction function.
4. The emergency station system with man-machine interaction function according to claim 1, wherein the voice recognition module comprises:
an audio module connected with the emergency station master controller,
a microphone connected to the audio module,
and the loudspeaker is connected with the audio module.
5. The emergency station system with the human-computer interaction function according to claim 1, wherein the code scanning interaction module is a display screen for displaying code scanning interaction two-dimensional codes.
6. A man-machine interaction emergency treatment method based on the emergency station system with man-machine interaction function as set forth in any one of claims 1 to 5, comprising the steps of:
controlling the touch display screen to display various emergency touch interaction functions, and detecting whether the touch display screen receives a touch interaction operation instruction of a user in real time;
the camera component is controlled to start, and whether the camera component acquires a face interaction image and a gesture interaction image of a user or not is detected;
and realizing detection of whether the voice recognition module recognizes the voice interaction instruction of the user;
the code scanning interaction module is controlled to display a two-dimensional code for interaction, and whether a code scanning interaction instruction of the user mobile terminal is received and acquired or not is detected in real time;
the distance sensor is controlled to start distance sensing, and it is confirmed whether the distance sensor senses a person standing within a specified distance range of the emergency station;
detecting whether the function key receives a function key instruction of a user or not;
when the emergency station master controller receives a touch interaction operation instruction sent by the touch display screen, controlling and running corresponding emergency station operation functions according to the touch interaction operation instruction;
When the emergency station master controller receives the face interaction image and the gesture interaction image sent by the camera component, analyzing out corresponding user demands, and driving to execute demand functions corresponding to the user demands;
when the emergency station master controller receives the voice interaction instruction sent by the voice recognition module, the voice interaction instruction is analyzed, the corresponding user's voice requirement is derived, the corresponding voice-requirement function is controlled and realized, and voice information is processed to output voice sound;
when the emergency station master controller receives a function key request sent by a function key, the function key request sent by the function key is processed, and the corresponding function key is controlled to be executed to realize operation;
when the emergency station master controller receives that the distance sensor senses that the person is in a specified distance range from the emergency station, the emergency public welfare propaganda function is controlled to be started.
7. The human-computer interaction emergency processing method according to claim 6, wherein when the emergency station master controller receives a touch interaction operation instruction sent by the touch display screen, the step of controlling to run the corresponding emergency station operation function according to the touch interaction operation instruction comprises:
The emergency station master controller detects whether the touch display screen receives a touch interaction operation instruction in real time;
when the touch display screen is detected to have received a touch interaction operation instruction, the touch display screen is controlled to send the touch interaction operation instruction to the emergency station master controller;
when the emergency station master controller receives a touch interaction operation instruction sent by the touch display screen, controlling and running corresponding emergency station operation functions according to the touch interaction operation instruction;
and controlling the touch display screen to continuously receive the touch interaction instruction of the user and continuously finish the corresponding interaction requirement while running the corresponding emergency station operation function.
8. The human-computer interaction emergency processing method according to claim 6, wherein when the emergency station master controller receives the face interaction image and the gesture interaction image sent by the camera assembly, the step of analyzing the face interaction image and the gesture interaction image sent by the camera assembly, analyzing out a corresponding user requirement, and driving to execute a requirement function corresponding to the user requirement comprises:
the emergency station master controller detects whether the camera shoots a face interaction image and a gesture interaction image of a user in real time;
When the camera is detected to shoot the face interaction image and the gesture interaction image of the user, the shot face interaction image and gesture interaction image of the user are sent to the emergency station main controller;
when the emergency station master controller receives the face interaction image and the gesture interaction image sent by the camera assembly, analyzing actions of a user, and analyzing corresponding user requirements;
and driving to execute a demand function corresponding to the user demand according to the analyzed user demand.
9. The human-computer interaction emergency processing method according to claim 6, wherein when the emergency station master controller receives the voice interaction instruction sent by the voice recognition module, the step of analyzing the voice interaction instruction sent by the voice recognition module, deriving the corresponding user's voice requirement, controlling realization of the corresponding voice-requirement function, and processing voice information to output voice sound comprises:
the emergency station master controller detects whether the voice recognition module receives a voice interaction instruction in real time;
when the voice interaction instruction is detected to be received by the voice recognition module, the voice interaction instruction is sent to an emergency station master controller;
When the emergency station main controller receives the voice interaction instruction sent by the voice recognition module, it analyzes the voice interaction instruction, derives the corresponding user's voice requirement, controls realization of the function corresponding to that voice requirement, and processes the voice information to output voice sound.
10. The human-computer interaction emergency processing method according to claim 6, wherein when the emergency station master controller receives a function key request sent by a function key, the step of processing the function key request sent by the function key and controlling to execute a corresponding function key implementation operation includes:
the emergency station master controller detects whether the function key receives a function key request in real time;
when the function key is detected to receive the function key request, controlling to receive the function key request sent by the function key;
according to the function key request sent by the function key, controlling to process the function key request sent by the function key, and controlling to execute the corresponding function key to realize operation;
when the emergency station master controller receives that the distance sensor senses that the person is in a specified distance range from the emergency station, the step of controlling to start the emergency public welfare propaganda function comprises the following steps of:
The emergency station master controller detects whether the distance sensor senses that a person is in a specified distance range from the emergency station in real time;
when the distance sensor senses that the distance between the person and the emergency station is within a specified distance range, the emergency public welfare propaganda function is controlled to be started.
CN202311534045.2A 2023-11-16 2023-11-16 Emergency station system with man-machine interaction function and man-machine interaction emergency processing method Pending CN117762288A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311534045.2A CN117762288A (en) 2023-11-16 2023-11-16 Emergency station system with man-machine interaction function and man-machine interaction emergency processing method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311534045.2A CN117762288A (en) 2023-11-16 2023-11-16 Emergency station system with man-machine interaction function and man-machine interaction emergency processing method

Publications (1)

Publication Number Publication Date
CN117762288A true CN117762288A (en) 2024-03-26

Family

ID=90313370

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311534045.2A Pending CN117762288A (en) 2023-11-16 2023-11-16 Emergency station system with man-machine interaction function and man-machine interaction emergency processing method

Country Status (1)

Country Link
CN (1) CN117762288A (en)

Similar Documents

Publication Publication Date Title
US8606316B2 (en) Portable blind aid device
US20160258202A1 (en) Garage door communication systems and methods
US20220295119A1 (en) Method and apparatus for interacting in live stream
CN108154579A (en) A kind of intelligent access control system and exchange method that can be interacted with visitor
TW201923737A (en) Interactive Method and Device
CN111971647A (en) Speech recognition apparatus, cooperation system of speech recognition apparatus, and cooperation method of speech recognition apparatus
KR100811077B1 (en) Individual security system using of a mobile phone and method of the same
KR20170018140A (en) Method for emergency diagnosis having nonlinguistic speech recognition function and apparatus thereof
CN110830771A (en) Intelligent monitoring method, device, equipment and computer readable storage medium
JP7340063B2 (en) Security system and surveillance display
JP2004214895A (en) Auxiliary communication apparatus
CN110248144B (en) Video conference control method, device, equipment and computer readable storage medium
US20180039836A1 (en) Single call-to-connect live communication terminal, method and tool
CN114363547A (en) Double-recording device and double-recording interaction control method
JP2778488B2 (en) Awareness control device
CN115733918A (en) Flight mode switching method and device, electronic equipment and storage medium
CN113473062A (en) Intercom with visual function
CN117762288A (en) Emergency station system with man-machine interaction function and man-machine interaction emergency processing method
JP2002261966A (en) Communication support system and photographing equipment
CN216748889U (en) Service guiding device and service system of bank self-service network
KR101260879B1 (en) Method for Search for Person using Moving Robot
CN114999496A (en) Audio transmission method, control equipment and terminal equipment
CN113741910A (en) Scene interaction method and device, electronic equipment and storage medium
KR102113419B1 (en) System and method for reporting disaster using automatic video call switching
CN111586335A (en) Remote meeting method, device and system and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication