CN111523427B - Behavior information transmission method and electronic equipment during disaster occurrence - Google Patents

Behavior information transmission method and electronic equipment during disaster occurrence

Info

Publication number
CN111523427B
Authority
CN
China
Prior art keywords
behavior
user
disaster
behavior information
bluetooth
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010296566.9A
Other languages
Chinese (zh)
Other versions
CN111523427A (en)
Inventor
张腾飞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Genius Technology Co Ltd
Original Assignee
Guangdong Genius Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Genius Technology Co Ltd
Priority to CN202010296566.9A
Publication of CN111523427A
Application granted
Publication of CN111523427B
Legal status: Active (current)
Anticipated expiration

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20Movements or behaviour, e.g. gesture recognition
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/22Matching criteria, e.g. proximity measures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10Services
    • G06Q50/26Government or public services
    • G06Q50/265Personal security, identity or safety
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/10Terrestrial scenes
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/80Services using short range communication, e.g. near-field communication [NFC], radio-frequency identification [RFID] or low energy communication
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/90Services for handling of emergency or hazardous situations, e.g. earthquake and tsunami warning systems [ETWS]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W76/00Connection management
    • H04W76/10Connection setup
    • H04W76/14Direct-mode setup
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W8/00Network data management
    • H04W8/005Discovery of network devices, e.g. terminals
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02DCLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D30/00Reducing energy consumption in communication networks
    • Y02D30/70Reducing energy consumption in communication networks in wireless communication networks

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Data Mining & Analysis (AREA)
  • Tourism & Hospitality (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Marketing (AREA)
  • Environmental & Geological Engineering (AREA)
  • Development Economics (AREA)
  • Economics (AREA)
  • Human Resources & Organizations (AREA)
  • Computer Security & Cryptography (AREA)
  • Primary Health Care (AREA)
  • Strategic Management (AREA)
  • General Business, Economics & Management (AREA)
  • Databases & Information Systems (AREA)
  • Human Computer Interaction (AREA)
  • Social Psychology (AREA)
  • Emergency Management (AREA)
  • Educational Administration (AREA)
  • Public Health (AREA)
  • Psychiatry (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Alarm Systems (AREA)
  • Telephone Function (AREA)

Abstract

The embodiment of the application relates to the technical field of communications, and discloses a behavior information transmission method and an electronic device for use when a disaster occurs. The method comprises the following steps: detecting the response behavior of a user when it is determined that a disaster is currently occurring; determining behavior information corresponding to the response behavior, wherein the behavior information at least comprises a safety degree, the safety degree being the estimated safety effect of the response behavior during the disaster; and sending the behavior information corresponding to the response behavior to a preset terminal device. In this way, the electronic device transmits the user's behavior information to the preset terminal device in real time when the disaster occurs, without requiring user operation, so that escape state information can be transmitted effectively and in time.

Description

Behavior information transmission method and electronic equipment during disaster occurrence
Technical Field
The present invention relates to the field of communications technologies, and in particular, to a behavior information transmission method and an electronic device when a disaster occurs.
Background
Natural disasters often occur suddenly. Taking earthquakes as an example, an earthquake is a highly destructive natural disaster and poses a great threat to human life. When an earthquake occurs, people need to take countermeasures quickly (such as escaping to an open area), so relatives and friends who are in different locations during the earthquake can only learn of each other's escape state by calling or sending short messages after the earthquake. Existing ways of transmitting information, such as phone calls, therefore cannot transmit escape state information effectively and in time while the disaster is occurring.
Disclosure of Invention
The embodiment of the application discloses a behavior information transmission method and an electronic device for use when a disaster occurs, which can transmit escape state information effectively and in time while the disaster is occurring.
The first aspect of the embodiment of the application discloses a behavior information transmission method when a disaster occurs, which comprises the following steps:
detecting the corresponding behavior of the user when the disaster is determined to occur currently;
determining behavior information corresponding to the response behavior, wherein the behavior information at least comprises a safety degree, and the safety degree is the safety effect brought by the evaluated response behavior when the disaster occurs;
and sending the behavior information corresponding to the response behavior to preset terminal equipment.
As an optional implementation manner, in the first aspect of the embodiment of the present application, when determining that the disaster currently occurs, detecting a response behavior of the user includes:
when the disaster is determined to occur currently, starting a preset behavior detection function to obtain behavior parameters; the behavior detection function comprises at least one of a motion sensor function, a positioning function and an audio-video recording function;
and determining the coping behavior of the user according to the behavior parameters.
As an optional implementation manner, in the first aspect of the embodiment of the present application, after the sending the behavior information corresponding to the coping behavior to a preset terminal device, the method further includes:
when the user is detected to be in a static state, acquiring an image through a camera;
detecting whether a person other than the user exists based on the image;
if the person does not exist, searching and obtaining a plurality of Bluetooth devices through the Bluetooth module;
determining a target Bluetooth device nearest to the electronic device from the plurality of Bluetooth devices;
sending a request for establishing Bluetooth connection to the target Bluetooth device;
and after the Bluetooth connection is established with the target Bluetooth device, sending a friend adding request to the target Bluetooth device.
As an optional implementation manner, in the first aspect of the embodiment of the present application, after determining, from the plurality of bluetooth devices, a target bluetooth device closest to the user, the method further includes:
determining the relative position of the electronic equipment and the target Bluetooth equipment, wherein the relative position at least comprises the distance between the electronic equipment and the target Bluetooth equipment and the azimuth angle of the electronic equipment relative to the target Bluetooth equipment;
Generating a position diagram according to the relative position, wherein the position diagram comprises a first coordinate point representing the electronic equipment, a second coordinate point representing the target Bluetooth equipment and a direction indication icon, the direction indication icon is used for indicating a specific direction, and the specific direction is north, south, east or west;
and outputting the position schematic diagram on a display screen.
As an optional implementation manner, in the first aspect of the embodiment of the present application, after the capturing, by the camera, an image when the user is detected to be in a stationary state, the method further includes:
extracting image features from the image;
determining a target disaster scene matched with the image features from a plurality of preset disaster scenes of the disaster;
acquiring a coping strategy of the target disaster scene;
determining an output mode corresponding to the file format of the coping strategy;
and outputting the coping strategy in the output mode.
As an optional implementation manner, in the first aspect of the embodiment of the present application, the method further includes:
detecting vital sign parameters of a user, the vital sign parameters including at least one of body temperature and pulse;
And sending the vital sign parameters to the preset terminal equipment.
As an optional implementation manner, in the first aspect of the embodiment of the present application, after the sending the behavior information corresponding to the coping behavior to a preset terminal device, the method further includes:
when detecting that the user generates a new coping action, acquiring action information corresponding to the new coping action;
and sending the behavior information corresponding to the new response behavior to the preset terminal equipment.
A second aspect of an embodiment of the present application discloses an electronic device, including:
the first detection unit is used for detecting the response behavior of the user when the current disaster is determined;
a first determining unit, configured to determine behavior information corresponding to the response behavior, where the behavior information includes at least a security degree, and the security degree is an estimated security effect of the response behavior when the disaster occurs;
and the first sending unit is used for sending the behavior information corresponding to the response behavior to preset terminal equipment.
A third aspect of an embodiment of the present application discloses an electronic device, including:
A memory storing executable program code;
a processor coupled to the memory;
the processor invokes the executable program code stored in the memory to execute a behavior information transmission method when a disaster occurs, which is disclosed in the first aspect of the embodiment of the present application.
A fourth aspect of the embodiments of the present application is a computer readable storage medium storing a computer program, where the computer program is configured to execute a behavior information transfer method when a disaster occurs as disclosed in the first aspect of the embodiments of the present application.
A fifth aspect of the embodiments of the present application discloses a computer program product which, when run on a computer, causes the computer to perform part or all of the steps of any one of the methods of the first aspect.
A sixth aspect of the embodiments of the present application discloses an application publishing platform for publishing a computer program product, wherein the computer program product, when run on a computer, causes the computer to perform some or all of the steps of any one of the methods of the first aspect.
Compared with the prior art, the embodiment of the application has the following beneficial effects:
When detecting that the disaster occurs, the electronic device detects the user's response behavior on its own initiative and determines the behavior information of that response behavior, including the safety degree, where the safety degree is the estimated safety effect of the response behavior during the disaster, and then sends the behavior information corresponding to the response behavior to the preset terminal device. In this way, escape state information is transmitted to the preset terminal device in real time without user operation, so that it can be transmitted effectively and in time when the disaster occurs.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a flow chart of a behavior information transmission method at the time of disaster occurrence according to an embodiment of the present application;
FIG. 2 is a flow chart of another method for delivering behavior information when a disaster occurs according to an embodiment of the present application;
Fig. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure;
FIG. 4 is a schematic structural diagram of another electronic device disclosed in an embodiment of the present application;
FIG. 5 is a schematic structural diagram of yet another electronic device disclosed in an embodiment of the present application;
fig. 6 is a schematic diagram of displaying behavior information on a preset terminal device according to an embodiment of the present application;
fig. 7 is an exemplary diagram of a schematic diagram of an output position of an electronic device on a display screen according to an embodiment of the present application.
Detailed Description
The following description of the technical solutions in the embodiments of the present application will be made clearly and completely with reference to the drawings in the embodiments of the present application, and it is apparent that the described embodiments are only some embodiments of the present invention, not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
It should be noted that the terms "first," "second," "third," and "fourth," etc. in the description and claims of the present invention are used for distinguishing between different objects and not for describing a particular sequential order. The terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed or inherent to such process, method, article, or apparatus.
The embodiment of the application discloses a behavior information transmission method and electronic equipment in disaster occurrence, which can effectively transmit escape state information in time in disaster occurrence. The following detailed description is made with reference to the accompanying drawings.
Referring to fig. 1, fig. 1 is a flow chart of a behavior information transmission method at the time of disaster occurrence according to an embodiment of the present application. As shown in fig. 1, the behavior information transfer method at the time of occurrence of the disaster may include the following steps.
101. And the electronic equipment detects the coping behavior of the user when determining that the disaster occurs currently.
In this embodiment of the present application, the electronic device may be a wearable smart device (such as a phone watch) or a smart phone, which is not limited in this embodiment. Specifically, the electronic device may be provided with a motion sensor such as a triaxial accelerometer, a gyroscope, a geomagnetic sensor, and a positioning module such as a global positioning system (Global Positioning System, GPS), an Ultra wideband positioning module (UWB), and an ultrasonic positioning module.
In this embodiment of the present application, the disaster may be any disaster such as an earthquake, a fire disaster, or a typhoon, which is not limited in this embodiment.
In this embodiment of the present application, the manner in which the electronic device determines that a disaster occurs currently may include: the electronic equipment receives disaster early warning information, and detects a preset shortcut instruction indicating the occurrence of the disaster, wherein the shortcut instruction can be a preset gesture instruction and/or a voice instruction, and at least one of the detected current occurrence of the disaster is confirmed by the electronic equipment, so that the confirmation efficiency can be improved. The disaster early-warning information may be disaster early-warning information sent by any one of the servers or the terminal device. For example, after receiving the disaster early warning information, the electronic device performs semantic recognition to determine a place where the disaster occurs, and if the distance between the place and the current position of the electronic device is smaller than a preset distance threshold, the electronic device determines that the disaster occurs currently; the shortcut command can be preset by a user, for example, gestures of an index finger, a middle finger and a ring finger are compared, and when the electronic equipment detects the shortcut command, the current disaster is determined. The electronic device may also preset disaster characteristics, such as preset fire characteristics: and when the electronic equipment detects that the environmental temperature is 60 ℃ and the environmental humidity is 20%, the electronic equipment determines that the disaster occurs currently.
In the embodiment of the present application, the response behavior refers to the escape measures taken by the user when the disaster occurs. The electronic device detecting the response behavior of the user may specifically include: the electronic device acquires an escape image of the user through the camera, identifies limb features and environmental features of the user from the escape image, and analyzes the response behavior of the user according to the limb features and the environmental features. For example, if the limb features identified in the escape image collected by the electronic device are a bent elbow and a palm placed on the head, and the identified environmental feature is the space under a desk and chair, the response behavior of the user can be determined as "holding the head and hiding under the desk and chair".
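The mapping from recognized features to a response behavior can be pictured, very roughly, as a rule table; the sketch below only encodes the worked example above, and a practical implementation would rely on trained image-recognition models. The second rule and all identifiers are hypothetical.

```python
# Illustrative rule table mapping recognized limb/environmental features to a
# response behavior label. Only the first rule comes from the description; the
# second rule and all names are hypothetical.

COPING_RULES = [
    # (required limb features, required environmental features, response behavior)
    ({"elbow bent", "palm on head"}, {"under desk and chair"},
     "holding the head and hiding under the desk and chair"),
    ({"running"}, {"open ground"},
     "escaping to a clear place"),
]

def infer_response_behavior(limb_features, env_features):
    for limbs, env, behavior in COPING_RULES:
        if limbs <= set(limb_features) and env <= set(env_features):
            return behavior
    return "response behavior undetermined"

print(infer_response_behavior({"elbow bent", "palm on head"}, {"under desk and chair"}))
```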
As an alternative embodiment, after step 101, the method further comprises:
the electronic device searches for escape methods for the disaster and judges whether the response behavior is a correct escape method for the disaster; if it is not, the electronic device adjusts the volume to a preset volume and broadcasts prompt information by voice, where the prompt information instructs the user to adopt a correct escape method.
Therefore, by implementing this embodiment, the user can correct an erroneous response behavior in time; since the preset volume is generally relatively loud and the prompt is broadcast by voice, the user does not need to stop to check the screen, and the user's escape is not hindered.
102. The electronic device determines behavior information corresponding to the response behavior, wherein the behavior information at least comprises a safety degree, and the safety degree is the safety effect of the estimated response behavior when the disaster occurs.
In this embodiment of the present application, the behavior information refers to information related to the behavior, and in addition to the security degree, the behavior information may further include the behavior itself, the time when the behavior occurs, and the place where the behavior occurs; the safety degree can be expressed in the form of at least one of the expression of characters, numbers, letters and colors, the characters are taken as an example, when an earthquake occurs at present, the electronic equipment detects that the user makes a corresponding action of squatting under a desk, the electronic equipment evaluates that the safety effect brought by the corresponding action of squatting under the desk in the earthquake is an effective head protection, and the safety degree can be that the user squats under the desk and can effectively protect the head.
In some embodiments, the electronic device may also express the security level in combination with numbers, letters or colors, where the electronic device divides the security level into several levels in advance, and different numbers or letters indicate different levels of the security level, for example, numbers "1-10" are used to sequentially indicate levels of increasing (or decreasing) the security level, or letters "a-F" are used to sequentially indicate levels of increasing (or decreasing) the security level, or different colors are used to indicate different levels of the security level, so that the security level of the countermeasures is more intuitively and clearly expressed.
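A minimal sketch of expressing the safety degree in these combined forms is given below, assuming a ten-level numeric scale mapped onto six letter/color buckets; the concrete boundaries, letters and colors are assumptions, not values taken from the embodiment.

```python
# Sketch: express a 1-10 safety level additionally as a letter and a color.
# Bucket boundaries, letters and colors are illustrative assumptions.

SAFETY_LETTERS = ["F", "E", "D", "C", "B", "A"]                      # A = safest
SAFETY_COLORS = ["red", "orange", "yellow", "blue", "light green", "green"]

def express_safety_degree(level_1_to_10, text_description):
    index = min(5, (max(1, level_1_to_10) - 1) * 6 // 10)  # map 1-10 onto buckets 0-5
    return {"number": level_1_to_10,
            "letter": SAFETY_LETTERS[index],
            "color": SAFETY_COLORS[index],
            "text": text_description}

print(express_safety_degree(8, "squatting under the desk can effectively protect the head"))
```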
103. And the electronic equipment sends the behavior information corresponding to the corresponding behavior to the preset terminal equipment.
In this embodiment of the present application, the preset terminal device may be a smart phone, a computer, or an intelligent wearable device, and specifically, the preset terminal device may be a terminal device bound to an electronic device, and may generally be a terminal device of an emergency contact person or a terminal device of a relatives, and specifically, the electronic device stores a device identifier of the preset terminal device in a preset storage space, and step 103 may specifically include: the electronic equipment invokes a preset equipment identification code of the terminal equipment from a preset storage space, and sends the behavior information corresponding to the response behavior to the preset terminal equipment according to the equipment identification code.
After the electronic device sends the behavior information corresponding to the corresponding behavior to the preset terminal device, the terminal device is triggered to display the behavior information on a display screen of the terminal device after receiving the behavior information, please refer to fig. 6, and fig. 6 is a schematic diagram of displaying the behavior information by the preset terminal device according to the embodiment of the present application. Optionally, when the terminal device receives the behavior information, the behavior information may perform highlighting processing, such as font amplifying processing, font thickening processing, or highlighting processing, on the behavior information, so as to obtain target behavior information after highlighting processing, and display the target behavior information on a display screen of the terminal device. Performing the highlighting process can assist a user at the terminal device in speeding up the review efficiency.
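A compact sketch of step 103 follows, assuming the behavior information is packaged as a small record and the transmission channel is abstracted behind a stand-in function; the storage layout, identifiers and `send_to_device` are illustrative assumptions, not real APIs.

```python
# Sketch of step 103: retrieve the preset terminal's device identification code
# and send the behavior information. The transport is a stand-in, not a real API.
import json
import time

PRESET_STORAGE = {"preset_terminal_id": "TERMINAL-001"}  # assumed preset storage space

def send_to_device(device_id, payload):
    # Stand-in for the actual transmission channel (cellular, Wi-Fi, ...).
    print(f"-> {device_id}: {payload}")

def send_behavior_info(behavior, safety_degree, place):
    behavior_info = {
        "behavior": behavior,                        # the response behavior itself
        "time": time.strftime("%Y-%m-%d %H:%M:%S"),  # when the behavior occurred
        "place": place,                              # where the behavior occurred
        "safety_degree": safety_degree,              # estimated safety effect
    }
    device_id = PRESET_STORAGE["preset_terminal_id"]  # device identification code
    send_to_device(device_id, json.dumps(behavior_info, ensure_ascii=False))

send_behavior_info("hiding under the desk", "can effectively protect the head", "3rd-floor office")
```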
According to the method provided by this embodiment, the electronic device transmits the behavior information of the user to the preset terminal device in real time when the disaster occurs, without user operation; that is, escape state information can be transmitted effectively and in time when the disaster occurs. In addition, when the user's response behavior is wrong, it can be corrected in time through voice-broadcast prompt information.
Referring to fig. 2, fig. 2 is a flow chart of another behavior information transmission method at the time of disaster according to an embodiment of the present application. As shown in fig. 2, the behavior information transfer method at the time of occurrence of the disaster may include the following steps.
201. When the electronic equipment determines that a disaster occurs currently, starting a preset behavior detection function to obtain behavior parameters; the behavior detection function includes at least one of a motion sensor function, a positioning function, and an audio-video recording function.
In the embodiment of the application, behavior detection is performed using the motion sensor function, the positioning function, and the audio-video recording function, and detection from multiple angles makes the detection result more accurate. The behavior parameters may include at least one of acceleration parameters, position parameters, and audio-video parameters. The electronic device can detect the acceleration acting on it using the motion sensor function and analyze it to obtain acceleration parameters; the electronic device can perform positioning at a preset frequency using the positioning function to obtain position parameters, where the position parameters include the positions of the electronic device at different times; and the electronic device records audio and video through the audio-video recording function to obtain audio-video data, and extracts audio parameters and video parameters from the audio-video data, where the audio-video parameters may be audio semantics, person images, object images, and the like recognized from the audio and video.
202. And the electronic equipment determines the coping behavior of the user according to the behavior parameters.
In some embodiments, the determining, by the electronic device, the coping behavior of the user according to the behavior parameters may include:
the electronic device obtains the user's movement mode by analyzing the acceleration parameters, obtains the user's position change information by analyzing the position parameters, and obtains information on changes of the user's posture (such as from sitting to squatting) by analyzing the audio-video parameters; the electronic device then determines the response behavior of the user by comprehensively analyzing the movement mode, the position change information, and the posture change information.
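The fusion described above might look like the following sketch, in which the thresholds, labels and parameter encodings are illustrative assumptions rather than values specified by the embodiment.

```python
# Sketch of fusing acceleration, position and posture-change information into a
# response behavior label. Thresholds and labels are illustrative assumptions.

def movement_mode(acceleration_g):
    if acceleration_g < 0.1:
        return "still"
    return "running" if acceleration_g > 0.5 else "walking"

def distance_moved_m(positions):
    (x0, y0), (x1, y1) = positions[0], positions[-1]
    return ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5

def determine_response_behavior(acceleration_g, positions, posture_change):
    mode = movement_mode(acceleration_g)
    moved = distance_moved_m(positions)
    if mode == "still" and posture_change == "sitting to squatting":
        return "squatting in place and taking cover"
    if mode == "running" and moved > 20:
        return "escaping to a clear place"
    return "response behavior undetermined"

print(determine_response_behavior(0.05, [(0.0, 0.0), (0.4, 0.3)], "sitting to squatting"))
```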
203. The electronic equipment determines behavior information corresponding to the response behavior, wherein the behavior information at least comprises a safety degree, and the safety degree is the safety effect of the evaluated response behavior when the disaster occurs.
204. And the electronic equipment sends the behavior information corresponding to the corresponding behavior to the preset terminal equipment.
As an alternative embodiment, the method may further comprise the steps of:
the electronic device detects vital sign parameters of the user, the vital sign parameters including at least one of body temperature and pulse;
and the electronic equipment sends the vital sign parameters to preset terminal equipment.
Further optionally, before the electronic device sends the vital sign parameters to the preset terminal device, the electronic device may further determine a physical state of the user according to the vital sign parameters; the sending, by the electronic device, the vital sign parameters to the preset terminal device may include: and the electronic equipment sends the vital sign parameters and the physical state of the user to a preset terminal device.
In some embodiments, the electronic device may be provided with a body temperature sensor, a pulse sensor, and the like. The physical state is used to indicate the health condition of the user and may specifically include normal body temperature, low body temperature, high body temperature, normal pulse, slow pulse, fast pulse, and the like, which is not limited in this embodiment. By implementing this method, the electronic device sends the vital sign parameters to the preset terminal device, so that relatives and friends holding the terminal device can learn the user's health state in time; further, also sending the determined physical state makes the information more intuitive and clear.
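A possible derivation of the physical state from the vital sign parameters is sketched below; the reference ranges are ordinary physiological values chosen for illustration and are not specified by the embodiment.

```python
# Sketch: derive the physical state (health condition labels) from the vital
# sign parameters. Reference ranges are common values, not from the embodiment.

def physical_state(body_temp_c=None, pulse_bpm=None):
    states = []
    if body_temp_c is not None:
        if body_temp_c < 36.0:
            states.append("low body temperature")
        elif body_temp_c > 37.3:
            states.append("high body temperature")
        else:
            states.append("normal body temperature")
    if pulse_bpm is not None:
        if pulse_bpm < 60:
            states.append("slow pulse")
        elif pulse_bpm > 100:
            states.append("fast pulse")
        else:
            states.append("normal pulse")
    return states

print(physical_state(body_temp_c=36.6, pulse_bpm=112))  # ['normal body temperature', 'fast pulse']
```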
As an alternative embodiment, following step 204, the following steps may be included:
when detecting that a user generates a new coping action, the electronic equipment acquires action information corresponding to the new coping action;
And the electronic equipment sends the behavior information corresponding to the new response behavior to the preset terminal equipment.
It can be seen that when the user makes a new countermeasure, the implementation of the embodiment sends the new countermeasure to the preset terminal device, so that information can be updated in time.
205. When the user is detected to be in a static state, the electronic equipment collects images through the camera.
In some embodiments, the electronic device may use the positioning function to detect whether the distance the user has moved within a preset time period exceeds a preset movement-distance threshold, and if the movement distance within the preset time period does not exceed the threshold, the electronic device determines that the user is in a stationary state. For example, if the electronic device detects that the user has moved 1.2 meters within 3 minutes, and 1.2 meters is less than the preset movement-distance threshold of 240 meters, it determines that the user is in a stationary state.
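The stationary-state check reduces to a single comparison, as in the sketch below; the 3-minute window and 240-meter threshold are the values used in the example above, and the function name is illustrative.

```python
# Sketch of the stationary-state check: the user is treated as stationary when
# the distance moved within the preset time window (3 minutes in the example)
# stays below the preset movement-distance threshold.

MOVE_DISTANCE_THRESHOLD_M = 240.0  # preset threshold from the example

def is_stationary(distance_moved_m):
    return distance_moved_m < MOVE_DISTANCE_THRESHOLD_M

print(is_stationary(1.2))  # True: 1.2 m moved in 3 minutes is below the threshold
```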
As an alternative embodiment, after step 205, the following steps may be further included:
the electronic equipment extracts image characteristics from the image;
the electronic equipment determines a target disaster scene matched with the image characteristics from a plurality of preset disaster scenes of the disaster;
The electronic equipment acquires a coping strategy of a target disaster scene;
the electronic equipment determines an output mode corresponding to a file format of the coping strategy;
the electronic device outputs the coping strategy in the output mode.
In this embodiment of the present application, a disaster scene refers to a scene that may be generated after a disaster occurs, for example, a scene that a user is covered in a certain narrow space by a collapse object may occur during an earthquake, and different disaster scenes have different characteristics. If a user is covered by the collapse object in a scene of a certain narrow space, the table and the chair are inclined, and the object interval is small. Features of some common disaster-stricken scenes can be stored in advance, and if the image features extracted by the electronic equipment have the characteristics of inclination of a table and a chair, small article intervals and the like, the target disaster-stricken scene can be determined to be a scene in which a user is covered in a certain narrow space by a collapse object. In addition, the coping strategies can comprise calling for help, saving equipment power, moving collapsed objects under what conditions, moving collapsed objects, and the like, can be searched for the coping strategies of the scene in a networking manner, and can be stored in a local storage space in advance for later calling; the output mode may be text output, voice output, video output or picture output, and the present embodiment is not limited thereto. Therefore, by implementing the embodiment, the coping strategy of the current disaster scene is provided for the user, which is beneficial to further improving the safety guarantee of the user.
206. The electronic device detects whether people except the user exist according to the image; if not, executing the steps 207 to 210; if yes, the process is ended.
207. The electronic device searches and obtains a plurality of Bluetooth devices through the Bluetooth module.
208. The electronic device determines a target Bluetooth device nearest to the electronic device from a plurality of Bluetooth devices.
In some embodiments, the electronic device may determine the target Bluetooth device closest to it according to signal strength, or may determine the target Bluetooth device by other screening methods, which is not limited in the embodiments of the present application.
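Selecting the nearest device by signal strength can be sketched with mocked scan results, as below; no real Bluetooth API is used, and the addresses and RSSI values are placeholders introduced for illustration.

```python
# Sketch: treat the scanned Bluetooth device with the strongest RSSI as the
# target (nearest) device. Scan results are mocked; no Bluetooth API is used.

scan_results = [            # (device address, RSSI in dBm) - mocked scan output
    ("AA:BB:CC:00:00:01", -72),
    ("AA:BB:CC:00:00:02", -48),
    ("AA:BB:CC:00:00:03", -90),
]

def nearest_bluetooth_device(results):
    # A higher (less negative) RSSI generally indicates a closer device.
    return max(results, key=lambda r: r[1])[0]

print(nearest_bluetooth_device(scan_results))  # AA:BB:CC:00:00:02
```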
As an alternative embodiment, after step 208, the following steps may be further included:
the electronic equipment determines the relative position of the electronic equipment and the target Bluetooth equipment, wherein the relative position at least comprises the distance between the electronic equipment and the target Bluetooth equipment and the azimuth angle of the electronic equipment relative to the target Bluetooth equipment; the Bluetooth module configured by the electronic device can be provided with Bluetooth 5.1 characteristics, so that the electronic device can measure the distance and azimuth angle between the electronic device and another Bluetooth device with Bluetooth 5.1 characteristics by analyzing Bluetooth signals.
The electronic equipment generates a position diagram according to the relative position, wherein the position diagram comprises a first coordinate point representing the current position of the electronic equipment, a second coordinate point representing the current position of the target Bluetooth equipment and a direction indication icon, the direction indication icon is used for indicating a specific direction, and the specific direction is north, south, east or west;
the electronic device outputs a schematic of the location on the display screen. Referring specifically to fig. 7, fig. 7 is an exemplary diagram illustrating an output position of an electronic device on a display screen according to an embodiment of the present disclosure.
Therefore, according to the implementation of the embodiment, the electronic device side displays the position schematic diagram comprising the first coordinate point representing the electronic device, the second coordinate point representing the target Bluetooth device and the direction indication icon on the display screen, the relative position of the electronic device and the target Bluetooth device is intuitively and clearly displayed, and a user can be assisted in finding the position of the target Bluetooth device.
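Turning the measured distance and azimuth into the two coordinate points of the position diagram is essentially a polar-to-Cartesian conversion; the sketch below places the electronic device at the origin and assumes the azimuth is measured clockwise from north, which is a convention chosen for illustration only.

```python
# Sketch: derive the two coordinate points of the position diagram from the
# measured relative position (distance + azimuth). The electronic device sits
# at the origin; azimuth measured clockwise from north is an assumed convention.
import math

def position_diagram(distance_m, azimuth_deg):
    first_point = (0.0, 0.0)                                 # first coordinate point: electronic device
    rad = math.radians(azimuth_deg)
    second_point = (round(distance_m * math.sin(rad), 2),    # x grows to the east
                    round(distance_m * math.cos(rad), 2))    # y grows to the north
    return {"first_point": first_point,
            "second_point": second_point,                    # second point: target Bluetooth device
            "direction_icon": "N"}                           # direction indication icon (north)

print(position_diagram(distance_m=12.0, azimuth_deg=45.0))
# {'first_point': (0.0, 0.0), 'second_point': (8.49, 8.49), 'direction_icon': 'N'}
```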
209. The electronic device sends a request to establish a bluetooth connection to the target bluetooth device.
210. And the electronic equipment sends a friend adding request to the target Bluetooth equipment after the Bluetooth connection is established with the target Bluetooth equipment.
In the embodiment of the present invention, the friend adding request may include a social account of the user, current location information of the user, and the like, which is not limited in the embodiment of the present invention.
By implementing steps 205-210, when the electronic device cannot detect any person other than the user, which indicates that the user may be alone, it sends a friend adding request to the target Bluetooth device nearest to the electronic device, so that the user can seek help or find a companion.
According to the method provided by the embodiment, the electronic equipment transmits the behavior information of the user when the disaster occurs to the preset terminal equipment in real time without user operation, namely, escape state information can be effectively transmitted in time when the disaster occurs. In addition, the behavior detection is carried out at multiple angles, and the accuracy of the detection result is high. In addition, the electronic equipment sends vital sign parameters to preset terminal equipment, so that relatives and friends holding the terminal equipment can know the health state of the user in time. In addition, when the user makes a new countermeasure, the new countermeasure is sent to the preset terminal equipment, so that information can be updated in time. In addition, a friend adding request is sent to the target Bluetooth device nearest to the electronic device, so that the user can help or find a companion. In addition, the electronic equipment displays a position schematic diagram comprising a first coordinate point representing the electronic equipment, a second coordinate point representing the target Bluetooth equipment and a direction indication icon on the display screen, the relative position of the electronic equipment and the target Bluetooth equipment is intuitively and clearly displayed, and a user can be assisted in finding the position of the target Bluetooth equipment.
Referring to fig. 3, fig. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure. As shown in fig. 3, the electronic device may include:
a first detecting unit 301, configured to detect a handling behavior of a user when it is determined that a disaster currently occurs;
in this embodiment, the manner in which the first detection unit 301 is configured to determine that a disaster occurs at present may specifically be: the first detection unit 301 receives disaster early warning information, a preset shortcut instruction indicating occurrence of a disaster is detected by the first detection unit 301, and at least one of the currently occurring disasters is detected by the first detection unit 301, and whether the disaster occurs currently or not is confirmed in various manners, so that confirmation efficiency can be improved. The disaster early-warning information may be disaster early-warning information sent by any one of the servers or the terminal device. For example, after receiving the disaster early warning information, the first detection unit 301 performs semantic recognition to determine a place where the disaster occurs, and if the distance between the place where the disaster occurs and the current position of the first detection unit 301 is less than a preset distance threshold, the first detection unit 301 determines that the disaster occurs currently; the shortcut command may be preset by the user, for example, when the first detecting unit 301 detects the shortcut command by comparing the gestures of the index finger, the middle finger and the ring finger, it is determined that a disaster occurs currently. The first detecting unit 301 may also preset disaster characteristics, such as preset fire characteristics: when the environmental temperature is greater than 45 degrees celsius and the environmental humidity is lower than 28 degrees celsius, the first detecting unit 301 determines that a disaster is currently occurring when the first detecting unit 301 detects that the environmental temperature is 60 degrees celsius and the environmental humidity is 20 degrees celsius.
In some embodiments, the manner in which the first detection unit 301 detects the response behavior of the user may specifically be: the first detection unit 301 acquires an escape image of the user through the camera, identifies limb features and environmental features of the user from the escape image, and analyzes the response behavior of the user according to the limb features and the environmental features.
As an alternative embodiment, the first detecting unit 301 may be further configured to search for an escape method of the disaster after detecting the response behavior of the user, and determine whether the response behavior is a correct escape method in the disaster; if not, the volume is adjusted to the preset volume, and the prompt information is broadcasted in a voice mode, wherein the prompt information indicates the user to adopt a correct escape method.
Therefore, since the preset volume is generally relatively loud and the prompt is broadcast by voice, the user does not need to stop to check the screen, and the user's escape is not hindered.
A first determining unit 302, configured to determine behavior information corresponding to the response behavior, where the behavior information includes at least a security degree, and the security degree is a security effect of the evaluated response behavior when the disaster occurs;
In some embodiments, the first determining unit 302 may also express the safety degree with numbers, letters, or colors. In this case, the first determining unit 302 divides the safety degree into several levels in advance, and different numbers or letters indicate different safety levels; for example, the numbers "1 to 10" indicate levels of increasing (or decreasing) safety in order, the letters "A to F" indicate levels of increasing (or decreasing) safety in order, or different colors indicate different safety levels, so that the safety degree of the response behavior is expressed more intuitively and clearly.
A first sending unit 303, configured to send behavior information corresponding to the coping behavior to a preset terminal device.
In some embodiments, the first sending unit 303 is further configured to store a device identifier of a preset terminal device in a preset storage space, and the manner in which the first sending unit 303 is configured to send behavior information corresponding to a response behavior to the preset terminal device may specifically be: the first sending unit 303 invokes a device identifier of a preset terminal device from a preset storage space, and sends the behavior information corresponding to the response behavior to the preset terminal device according to the device identifier.
With the electronic device provided by this embodiment, no user operation is required: the electronic device transmits the behavior information of the user to the preset terminal device in real time when the disaster occurs, that is, escape state information can be transmitted effectively and in time when the disaster occurs. In addition, when the user's response behavior is wrong, it can be corrected in time through voice-broadcast prompt information.
Referring to fig. 4, fig. 4 is a schematic structural diagram of another electronic device according to an embodiment of the present disclosure. The electronic device shown in fig. 4 is obtained by optimizing the electronic device shown in fig. 3. Compared to the electronic device shown in fig. 3, the electronic device shown in fig. 4 may further include:
the first detection unit 301 includes:
the starting subunit 3011 is configured to start a preset behavior detection function when determining that a disaster currently occurs, so as to obtain behavior parameters; the behavior detection function comprises at least one of a motion sensor function, a positioning function and an audio and video recording function;
In the embodiment of the application, behavior detection is performed using the motion sensor function, the positioning function, and the audio-video recording function, and detection from multiple angles makes the detection result more accurate. The behavior parameters may include at least one of acceleration parameters, position parameters, and audio-video parameters. The starting subunit 3011 may detect the acceleration acting on the electronic device using the motion sensor function and analyze it to obtain acceleration parameters; it may perform positioning at a preset frequency using the positioning function to obtain position parameters, where the position parameters include the positions of the electronic device at different times; and it may record audio and video through the audio-video recording function to obtain audio-video data, and extract audio parameters and video parameters from the audio-video data, where the audio-video parameters may be audio semantics, person images, object images, and the like recognized from the audio and video.
A determining subunit 3012 is configured to determine the coping behavior of the user according to the behavior parameters.
In some embodiments, the determining subunit 3012 may specifically determine the coping behavior of the user according to the behavior parameters by: the determining subunit 3012 obtains the movement mode of the user by analyzing the acceleration parameter, obtains the position change information of the user by analyzing the position parameter, and obtains the change information of the gesture of the user (such as sitting to squat) by analyzing the audio-video parameter; the determination subunit 3012 determines the coping behavior of the user by comprehensively analyzing the movement pattern of the user, the positional change information of the user, and the change information of the user posture.
The acquisition unit 304 is configured to acquire an image through the camera when detecting that the user is in a stationary state after sending behavior information corresponding to the response behavior to a preset terminal device;
a second detecting unit 305 for detecting whether or not a person other than the user exists from the image;
a searching unit 306 for searching and obtaining a plurality of bluetooth devices through the bluetooth module when the second detecting unit 305 detects that the person is not present;
a second determining unit 307, configured to determine a target bluetooth device closest to the electronic device from the plurality of bluetooth devices;
A second sending unit 308, configured to send a request for establishing a bluetooth connection to a target bluetooth device;
the second sending unit 308 is further configured to send a friend adding request to the target bluetooth device after the bluetooth connection is established with the target bluetooth device.
The second determining unit 307 is further configured to determine, after determining, from the plurality of bluetooth devices, a target bluetooth device closest to the user, a relative position between the electronic device and the target bluetooth device, where the relative position includes at least a distance between the electronic device and the target bluetooth device and an azimuth angle of the electronic device relative to the target bluetooth device;
a generating unit 309, configured to generate a position diagram according to the relative position, where the position diagram includes a first coordinate point representing the electronic device, a second coordinate point representing the target bluetooth device, and a direction indication icon, where the direction indication icon is used to indicate a specific direction, and the specific direction is north, south, east or west;
the first output unit 310 is configured to output a schematic position diagram on the display screen.
As an alternative embodiment, the electronic device may include:
an extracting unit 311, configured to extract image features from an image after the image is acquired by the camera when it is detected that the user is in a stationary state;
In some embodiments, the extracting unit 311 may use the positioning function to detect whether the distance the user has moved within a preset time period exceeds a preset movement-distance threshold, and if the movement distance within the preset time period does not exceed the threshold, the extracting unit 311 determines that the user is in a stationary state. For example, if the extracting unit 311 detects that the user has moved 1.2 meters within 3 minutes, and 1.2 meters is less than the preset movement-distance threshold of 240 meters, it determines that the user is in a stationary state.
A third determining unit 312, configured to determine a target disaster-affected scene that matches the image feature from a plurality of preset disaster-affected scenes of the disaster;
an obtaining unit 313, configured to obtain a coping strategy of the target disaster scene;
the third determining unit 312 is further configured to determine an output manner corresponding to a file format of the coping strategy;
and a second output unit 314 for outputting the coping strategy in the above-described output manner.
Different disaster scenes can be formed after the disaster occurs, for example, a scene that a user is covered in a certain narrow space by a collapsed object can occur during an earthquake, the coping strategies can comprise calling for help, saving equipment power, moving the collapsed object under the condition, how to move the collapsed object and the like, the coping strategies of the scene can be searched in a networking mode, and the coping strategies can be stored in a local storage space in advance for later calling; the output format may be text output, voice output, video output, or picture output, and the present embodiment is not limited thereto. Therefore, by implementing the embodiment, the coping strategy of the current disaster scene is provided for the user, which is beneficial to further improving the safety guarantee of the user.
As an alternative embodiment, the electronic device may include:
a third detecting unit 315 for detecting vital sign parameters of the user, the vital sign parameters including at least one of a body temperature and a pulse;
the first sending unit 303 is further configured to send the vital sign parameter to a preset terminal device.
Further optionally, the third detecting unit 315 may be further configured to determine, before sending the vital sign parameter to a preset terminal device, a physical state of the user according to the vital sign parameter; the manner in which the first sending unit 303 is configured to send the vital sign parameters to the preset terminal device may specifically be: the first sending unit 303 is configured to send the vital sign parameter and the physical state of the user to a preset terminal device.
In this embodiment of the present application, the physical state is used to indicate the health condition of the user, and may specifically include normal body temperature, low body temperature, high body temperature, normal pulse, slow pulse, fast pulse, and the like, which is not limited in this embodiment; according to the implementation method, the electronic equipment sends the vital sign parameters to the preset terminal equipment, so that relatives and friends holding the terminal equipment can know the health state of the user in time, and further, the physical state of the user is determined through sending, so that the method is more visual and clear.
As an alternative embodiment, in the electronic device described above:
the first determining unit 302 is further configured to, after sending the behavior information corresponding to the coping behavior to a preset terminal device, obtain the behavior information corresponding to the coping behavior when detecting that a new coping behavior occurs to the user;
the first sending unit 303 is further configured to send behavior information corresponding to the new coping behavior to a preset terminal device.
It can be seen that when the user makes a new countermeasure, the implementation of the embodiment sends the new countermeasure to the preset terminal device, so that information can be updated in time.
The electronic device provided by the embodiment does not need user operation, and the electronic device can timely and effectively transmit escape state information when disasters occur by transmitting behavior information when the disasters occur to the preset terminal device in real time. In addition, the behavior detection is carried out at multiple angles, and the accuracy of the detection result is high. In addition, the electronic equipment sends vital sign parameters to preset terminal equipment, so that relatives and friends holding the terminal equipment can know the health state of the user in time. In addition, when the user makes a new countermeasure, the new countermeasure is sent to the preset terminal equipment, so that information can be updated in time. In addition, a friend adding request is sent to the target Bluetooth device nearest to the electronic device, so that the user can help or find a companion. In addition, the electronic equipment displays a position schematic diagram comprising a first coordinate point representing the electronic equipment, a second coordinate point representing the target Bluetooth equipment and a direction indication icon on the display screen, the relative position of the electronic equipment and the target Bluetooth equipment is intuitively and clearly displayed, and a user can be assisted in finding the position of the target Bluetooth equipment.
Referring to fig. 5, fig. 5 is a schematic structural diagram of another electronic device according to an embodiment of the present disclosure. As shown in fig. 5, the electronic device may include:
a memory 501 in which executable program codes are stored;
a processor 502 coupled to the memory 501;
the processor 502 calls executable program codes stored in the memory 501, and executes the behavior information transfer method at the time of disaster occurrence described in the above embodiments.
The present embodiment discloses a computer-readable storage medium storing a computer program, wherein the computer program causes a computer to execute the behavior information transfer method at the time of disaster occurrence described in the above embodiments.
The present application also discloses a computer program product, wherein the computer program product, when run on a computer, causes the computer to perform some or all of the steps of the methods in the above method embodiments.
The embodiment of the present application also discloses an application release platform for releasing a computer program product, wherein the computer program product, when run on a computer, causes the computer to perform some or all of the steps of the methods in the above method embodiments.
Those of ordinary skill in the art will appreciate that all or part of the steps of the methods in the above embodiments may be implemented by a program instructing related hardware. The program may be stored in a computer-readable storage medium, including a Read-Only Memory (ROM), a Random Access Memory (RAM), a Programmable Read-Only Memory (PROM), an Erasable Programmable Read-Only Memory (EPROM), a One-Time Programmable Read-Only Memory (OTPROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a Compact Disc Read-Only Memory (CD-ROM) or other optical disk memory, magnetic disk memory, tape memory, or any other medium that can be used to carry or store computer-readable data.
The behavior information transmission method during disaster occurrence and the electronic device disclosed in the embodiments of the present application have been described in detail above. Specific examples are used herein to illustrate the principles and implementations of the present invention, and the description of the above embodiments is only intended to help understand the method and its core idea. Meanwhile, those skilled in the art may make changes to the specific implementations and the application scope according to the idea of the present invention. In view of the above, the content of this description should not be construed as limiting the present invention.

Claims (9)

1. A behavior information transmission method during disaster occurrence, the method comprising:
detecting a coping behavior of a user when it is determined that a disaster is currently occurring;
determining behavior information corresponding to the coping behavior, wherein the behavior information at least comprises a safety degree, and the safety degree is an evaluated safety effect brought by the coping behavior when the disaster occurs;
sending the behavior information corresponding to the coping behavior to a preset terminal device;
when it is detected that the user is in a stationary state, acquiring an image through a camera;
detecting, based on the image, whether a person other than the user exists;
if no such person exists, searching for and obtaining a plurality of Bluetooth devices through a Bluetooth module;
determining a target Bluetooth device nearest to the electronic device from the plurality of Bluetooth devices;
sending a request for establishing Bluetooth connection to the target Bluetooth device;
and after the Bluetooth connection is established with the target Bluetooth device, sending a friend adding request to the target Bluetooth device.
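For illustration, the following Python sketch walks through the companion-seeking branch of claim 1 under the assumption that a Bluetooth scan yields (device_id, RSSI) pairs and that a stronger RSSI indicates a nearer device; scan_bluetooth_devices, establish_connection and send_friend_request are hypothetical stand-ins for the device's actual Bluetooth stack, not an implementation from the patent.

```python
# Sketch of the companion-seeking branch of claim 1 (assumed helpers only).
from typing import List, Tuple

def scan_bluetooth_devices() -> List[Tuple[str, int]]:
    """Hypothetical scan result: (device_id, RSSI in dBm); higher RSSI ~ nearer."""
    return [("watch-A", -72), ("phone-B", -45), ("tracker-C", -88)]

def nearest_device(devices: List[Tuple[str, int]]) -> str:
    # RSSI is negative; the largest value is the strongest, i.e. closest, signal.
    return max(devices, key=lambda d: d[1])[0]

def establish_connection(device_id: str) -> bool:
    print(f"requesting Bluetooth connection to {device_id}")
    return True  # assume the peer accepts

def send_friend_request(device_id: str) -> None:
    print(f"sending friend-adding request to {device_id}")

def seek_companion(person_in_image: bool) -> None:
    if person_in_image:
        return                      # someone is already nearby; nothing to do
    devices = scan_bluetooth_devices()
    if not devices:
        return
    target = nearest_device(devices)
    if establish_connection(target):
        send_friend_request(target)

if __name__ == "__main__":
    seek_companion(person_in_image=False)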
2. The method of claim 1, wherein the detecting a coping behavior of a user when it is determined that a disaster is currently occurring comprises:
when it is determined that the disaster is currently occurring, starting a preset behavior detection function to obtain behavior parameters, wherein the behavior detection function comprises at least one of a motion sensor function, a positioning function, and an audio and video recording function;
and determining the coping behavior of the user according to the behavior parameters.
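As a hedged illustration of claim 2, the sketch below maps behavior parameters from the three detection functions to a coping behavior with simple rules; the thresholds and behavior labels are invented for the example and are not taken from the patent.

```python
# Illustrative rule-based mapping from behavior parameters to a coping behavior.
from dataclasses import dataclass

@dataclass
class BehaviorParameters:
    acceleration_m_s2: float    # from the motion sensor function
    displacement_m: float       # from the positioning function, over a time window
    loud_noise_detected: bool   # from the audio and video recording function

def infer_coping_behavior(p: BehaviorParameters) -> str:
    # Thresholds are illustrative assumptions, not values from the patent.
    if p.displacement_m > 20 and p.acceleration_m_s2 > 3:
        return "running to evacuate"
    if p.displacement_m < 1 and not p.loud_noise_detected:
        return "sheltering in place"
    if p.loud_noise_detected:
        return "calling for help"
    return "unknown"

if __name__ == "__main__":
    params = BehaviorParameters(acceleration_m_s2=4.2, displacement_m=35.0, loud_noise_detected=False)
    print(infer_coping_behavior(params))  # -> "running to evacuate"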
3. The method of claim 1, wherein after the determining a target Bluetooth device nearest to the electronic device from the plurality of Bluetooth devices, the method further comprises:
determining the relative position of the electronic equipment and the target Bluetooth equipment, wherein the relative position at least comprises the distance between the electronic equipment and the target Bluetooth equipment and the azimuth angle of the electronic equipment relative to the target Bluetooth equipment;
generating a position diagram according to the relative position, wherein the position diagram comprises a first coordinate point representing the electronic equipment, a second coordinate point representing the target Bluetooth equipment and a direction indication icon, the direction indication icon is used for indicating a specific direction, and the specific direction is north, south, east or west;
and outputting the position schematic diagram on a display screen.
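One possible way to realize the position schematic of claim 3 is sketched below: the electronic device is placed at the first coordinate point (the origin) and the target Bluetooth device at a second coordinate point computed from the measured distance and azimuth, with the azimuth assumed to be measured clockwise from north; on-screen rendering is only approximated with print statements.

```python
# Sketch: derive plotting coordinates for the position schematic diagram.
import math

def target_coordinates(distance_m: float, azimuth_deg: float) -> tuple:
    """Azimuth measured clockwise from north (an assumption for this sketch)."""
    rad = math.radians(azimuth_deg)
    east = distance_m * math.sin(rad)   # x axis: east
    north = distance_m * math.cos(rad)  # y axis: north
    return (round(east, 1), round(north, 1))

def describe_position_diagram(distance_m: float, azimuth_deg: float) -> None:
    first_point = (0.0, 0.0)                                    # the electronic device
    second_point = target_coordinates(distance_m, azimuth_deg)  # the target Bluetooth device
    print(f"first coordinate point (this device): {first_point}")
    print(f"second coordinate point (target device): {second_point}")
    print("direction indication icon: N (pointing north)")

if __name__ == "__main__":
    describe_position_diagram(distance_m=12.0, azimuth_deg=45.0)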
4. The method of claim 1, wherein after the acquiring an image through a camera when it is detected that the user is in a stationary state, the method further comprises:
extracting image features from the image;
determining a target disaster scene matched with the image features from a plurality of preset disaster scenes of the disaster;
acquiring a coping strategy of the target disaster scene;
determining an output mode corresponding to the file format of the coping strategy;
and outputting the coping strategy in the output mode.
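The following sketch illustrates the two selection steps of claim 4 with invented feature vectors and files: the target disaster scene is chosen as the preset scene whose reference features are most similar to the image features, and the output mode is selected from the coping strategy's file extension. A real device would use a trained image model; all names and values here are assumptions for the example.

```python
# Sketch: match image features to a preset disaster scene, then pick an output
# mode from the coping strategy's file format. Feature vectors are invented.
import math
from typing import Dict, List

PRESET_SCENES: Dict[str, List[float]] = {      # scene name -> reference feature vector
    "collapsed building": [0.9, 0.1, 0.2],
    "flooded street": [0.1, 0.8, 0.3],
    "smoke-filled room": [0.2, 0.3, 0.9],
}

STRATEGY_FILES = {                             # scene -> coping-strategy file (hypothetical)
    "collapsed building": "strategy_collapse.mp3",
    "flooded street": "strategy_flood.txt",
    "smoke-filled room": "strategy_smoke.mp4",
}

OUTPUT_MODE_BY_FORMAT = {".txt": "display on screen", ".mp3": "play audio", ".mp4": "play video"}

def cosine_similarity(a: List[float], b: List[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

def match_scene(image_features: List[float]) -> str:
    return max(PRESET_SCENES, key=lambda scene: cosine_similarity(image_features, PRESET_SCENES[scene]))

def output_strategy(image_features: List[float]) -> None:
    scene = match_scene(image_features)
    strategy_file = STRATEGY_FILES[scene]
    extension = strategy_file[strategy_file.rfind("."):]
    print(f"scene: {scene}; strategy: {strategy_file}; output mode: {OUTPUT_MODE_BY_FORMAT[extension]}")

if __name__ == "__main__":
    output_strategy([0.15, 0.25, 0.95])   # should match the smoke-filled room scene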
5. The method according to claim 1 or 2, characterized in that the method further comprises:
detecting vital sign parameters of a user, the vital sign parameters including at least one of body temperature and pulse;
and sending the vital sign parameters to the preset terminal device.
6. The method according to claim 1 or 2, wherein after the sending the behavior information corresponding to the coping behavior to the preset terminal device, the method further comprises:
when it is detected that the user performs a new coping behavior, acquiring behavior information corresponding to the new coping behavior;
and sending the behavior information corresponding to the new coping behavior to the preset terminal device.
7. An electronic device, the electronic device comprising:
a first detection unit, configured to detect a coping behavior of a user when it is determined that a disaster is currently occurring;
a first determining unit, configured to determine behavior information corresponding to the coping behavior, wherein the behavior information at least comprises a safety degree, and the safety degree is an evaluated safety effect brought by the coping behavior when the disaster occurs;
a first sending unit, configured to send the behavior information corresponding to the coping behavior to a preset terminal device;
an acquisition unit, configured to acquire an image through a camera when it is detected that the user is in a stationary state;
a second detection unit, configured to detect, based on the image, whether a person other than the user exists;
a searching unit, configured to search for and obtain a plurality of Bluetooth devices through a Bluetooth module when the second detection unit detects that no such person exists;
a second determining unit, configured to determine a target Bluetooth device nearest to the electronic device from the plurality of Bluetooth devices;
a second sending unit, configured to send a request for establishing a Bluetooth connection to the target Bluetooth device, and further configured to send a friend adding request to the target Bluetooth device after the Bluetooth connection with the target Bluetooth device is established.
8. An electronic device, the electronic device comprising:
a memory storing executable program code;
a processor coupled to the memory;
the processor calls the executable program code stored in the memory to execute the behavior information transmission method during disaster occurrence according to any one of claims 1 to 6.
9. A computer-readable storage medium storing a computer program, wherein the computer program causes a computer to execute the behavior information transmission method during disaster occurrence according to any one of claims 1 to 6.
CN202010296566.9A 2020-04-15 2020-04-15 Behavior information transmission method and electronic equipment during disaster occurrence Active CN111523427B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010296566.9A CN111523427B (en) 2020-04-15 2020-04-15 Behavior information transmission method and electronic equipment during disaster occurrence

Publications (2)

Publication Number Publication Date
CN111523427A CN111523427A (en) 2020-08-11
CN111523427B true CN111523427B (en) 2023-07-25

Family

ID=71910692

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010296566.9A Active CN111523427B (en) 2020-04-15 2020-04-15 Behavior information transmission method and electronic equipment during disaster occurrence

Country Status (1)

Country Link
CN (1) CN111523427B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102984656A (en) * 2012-11-27 2013-03-20 广东欧珀移动通信有限公司 Automatic help-seeking method on disaster site based on mobile terminal (MT) and MT
WO2013136976A1 (en) * 2012-03-16 2013-09-19 シャープ株式会社 Terminal device, and safety confirmation system
US10104527B1 (en) * 2017-04-13 2018-10-16 Life360, Inc. Method and system for assessing the safety of a user of an application for a proactive response

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3874672B2 (en) * 2002-02-25 2007-01-31 富士通株式会社 Disaster-related information processing method

Also Published As

Publication number Publication date
CN111523427A (en) 2020-08-11

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant