CN112150667A - Information processing method and device and information recorder - Google Patents

Information processing method and device and information recorder

Info

Publication number
CN112150667A
CN112150667A (application CN202011003777.5A)
Authority
CN
China
Prior art keywords
image
value
information
evaluation value
time period
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011003777.5A
Other languages
Chinese (zh)
Inventor
李晓峰 (Li Xiaofeng)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Hikvision Digital Technology Co Ltd
Original Assignee
Hangzhou Hikvision Digital Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Hikvision Digital Technology Co Ltd filed Critical Hangzhou Hikvision Digital Technology Co Ltd
Priority to CN202011003777.5A priority Critical patent/CN112150667A/en
Publication of CN112150667A publication Critical patent/CN112150667A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G07 CHECKING-DEVICES
    • G07C TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C5/00Registering or indicating the working of vehicles
    • G07C5/08Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle or waiting time
    • G07C5/0841Registering performance data
    • G07C5/085Registering performance data using electronic data carriers
    • G07C5/0866Registering performance data using electronic data carriers the electronic data carrier being a digital video recorder in combination with video camera
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01P MEASURING LINEAR OR ANGULAR SPEED, ACCELERATION, DECELERATION, OR SHOCK; INDICATING PRESENCE, ABSENCE, OR DIRECTION, OF MOVEMENT
    • G01P15/00Measuring acceleration; Measuring deceleration; Measuring shock, i.e. sudden change of acceleration
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01P MEASURING LINEAR OR ANGULAR SPEED, ACCELERATION, DECELERATION, OR SHOCK; INDICATING PRESENCE, ABSENCE, OR DIRECTION, OF MOVEMENT
    • G01P3/00Measuring linear or angular speed; Measuring differences of linear or angular speeds
    • G01P3/36Devices characterised by the use of optical means, e.g. using infrared, visible, or ultraviolet light
    • G01P3/38Devices characterised by the use of optical means, e.g. using infrared, visible, or ultraviolet light using photographic means

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Electromagnetism (AREA)
  • Power Engineering (AREA)
  • Alarm Systems (AREA)

Abstract

The embodiment of the application provides an information processing method, an information processing device and an information recorder, wherein the information processing method comprises the following steps: determining a first number of people, an image moving speed and an image moving acceleration according to the images captured in the first time period; calculating a target emergency situation evaluation value according to the first number of people, the image moving speed and the image moving acceleration; and executing a processing plan corresponding to the target emergency situation evaluation value, wherein the processing plan comprises a storage plan of the information recorder. The embodiment of the application can improve the automation degree of the information recorder.

Description

Information processing method and device and information recorder
Technical Field
The present application relates to the field of information processing technologies, and in particular, to an information processing method and apparatus, and an information recorder.
Background
The information recorder, also called an on-site information recorder, integrates the functions of digital video recording, digital photography, and an intercom transmitter. It can make dynamic and/or static digital recordings of on-site conditions during work, and is convenient for users such as field workers in a variety of work settings.
Existing information recorders mainly perform basic functions such as video recording, photographing, and intercom under the control of a user, and their degree of automation is low.
Disclosure of Invention
The application provides an information processing method, an information processing device and an information recorder, which can detect the situation of a working site, automatically execute a processing plan based on the situation of the working site, and improve the automation degree of the information recorder.
In a first aspect, an embodiment of the present application provides an information processing method applied to an information recorder, including:
determining a first number of people, an image moving speed and an image moving acceleration according to images captured in a first time period, wherein the image moving speed represents the moving speed of the information recorder through the change of the images, and the image moving acceleration is used for representing the moving acceleration of the information recorder through the change of the images;
calculating a target emergency situation evaluation value according to the first number of people, the image moving speed and the image moving acceleration, wherein the target emergency situation evaluation value is used for representing the situation emergency degree; the first number of people, the image moving speed and the image moving acceleration are positively correlated with the target emergency situation evaluation value respectively;
and executing a processing plan corresponding to the target emergency situation evaluation value, wherein the processing plan comprises a storage plan of the information recorder.
In one possible implementation manner, the method further includes:
determining a first image information entropy according to the images captured in the first time period; the first image information entropy represents the moving speed of the information recorder through the degree of blur of the images;
correspondingly, the target emergency situation evaluation value is calculated according to the first image information entropy, and the first image information entropy is inversely related to the target emergency situation evaluation value.
In one possible implementation manner, the method further includes:
determining a first abnormal degree value of the movement acceleration of the information recorder according to the movement acceleration curve of the information recorder in the first time period; the movement acceleration curve is detected by an acceleration sensor of the information recorder; the first abnormal degree value represents the movement of the information recorder through the degree of abnormality of its movement acceleration;
correspondingly, the target emergency situation evaluation value is calculated according to the first abnormal degree value, and the first abnormal degree value is positively correlated with the target emergency situation evaluation value.
In one possible implementation manner, the method further includes:
determining a first sound intensity value of the first audio signal and/or a keyword hit rate of text information corresponding to the first audio signal according to the first audio signal picked up in the first time period; the first sound intensity value is used for representing the sound intensity of the environment where the information recorder is located; the keyword hit rate is used for representing the probability of the preset keywords appearing in the sound of the environment where the information recorder is located;
correspondingly, the target emergency situation evaluation value is further calculated according to the first sound intensity value and/or the keyword hit rate, and the first sound intensity value and the keyword hit rate are respectively positively correlated with the target emergency situation evaluation value.
In one possible implementation manner, before determining the first number of people, the image moving speed, and the image moving acceleration, the method further includes:
calculating a first emergency situation evaluation value according to the average sound intensity of a second audio signal picked up by the information recorder in a second time period and the keyword hit rate of the text information corresponding to the second audio signal; the average sound intensity of the second audio signal and the keyword hit rate of the text information corresponding to the second audio signal are positively correlated with the first emergency situation evaluation value respectively;
and judging that the first emergency situation evaluation value exceeds a first evaluation threshold value.
In a possible implementation manner, the executing the processing plan corresponding to the target emergency situation assessment value includes:
when the target emergency situation evaluation value reaches a segmentation threshold value, segmenting the video data of the information recorder;
and when the target emergency situation evaluation value is reduced to be below the segmentation threshold value, segmenting the video data of the information recorder again, and marking the video segment obtained between the two segmentations.
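As an illustrative sketch (not the claimed implementation), the two-cut segmentation logic above can be modeled as a small state machine that cuts when the evaluation value first reaches the segmentation threshold, cuts again when it drops back below, and marks the segment recorded in between; all names are hypothetical:

```python
class SegmentMarker:
    """Sketch of the storage plan's segmentation step: cut the video
    stream when the evaluation value reaches the segmentation threshold,
    cut again when it falls back below, and mark the segment in between."""
    def __init__(self, seg_threshold):
        self.seg_threshold = seg_threshold
        self.above = False   # whether the value is currently above the threshold
        self.events = []     # recorded cut/mark actions

    def update(self, evaluation_value):
        if not self.above and evaluation_value >= self.seg_threshold:
            self.events.append("cut")            # first segmentation
            self.above = True
        elif self.above and evaluation_value < self.seg_threshold:
            self.events.append("cut")            # second segmentation
            self.events.append("mark_previous")  # mark the segment between cuts
            self.above = False

marker = SegmentMarker(seg_threshold=60.0)
for value in [10, 30, 65, 70, 55]:
    marker.update(value)
print(marker.events)  # ['cut', 'cut', 'mark_previous']
```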
In a possible implementation manner, the executing the processing plan corresponding to the target emergency situation assessment value includes:
determining a preset interval where the target emergency situation evaluation value is located; the preset interval is obtained by dividing the value range of the target emergency situation evaluation value according to at least one preset threshold value;
and executing the determined preset processing plan corresponding to the preset interval.
In one possible implementation, the determining a first person number according to the image captured in the first time period includes:
for each image captured in the first time period, calculating the number of people in the image;
and calculating a median or an average value according to the number of people in the image captured in the first time period to obtain the first number of people.
In one possible implementation, the determining the image moving speed according to the image captured in the first time period includes:
respectively calculating the image moving speed corresponding to the two adjacent images for the image captured in the first time period;
and calculating a median or an average value according to the image moving speeds corresponding to the two adjacent images to obtain the image moving speed.
In a possible implementation manner, the determining a first image information entropy according to the image captured in the first time period includes:
respectively calculating the image information entropy of each gray image according to the gray images of the images captured in the first time period;
and calculating a median or an average value according to the image information entropy of the image captured in the first time period to obtain the first image information entropy.
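A minimal sketch of this computation, assuming 8-bit grayscale pixel values and using the Shannon entropy of the gray-level histogram (the exact entropy definition is not specified in the source):

```python
import math
from collections import Counter
from statistics import mean, median

def image_entropy(gray_pixels):
    """Shannon entropy (bits) of the gray-level histogram of one image.
    Blurrier images concentrate the histogram, lowering the entropy."""
    total = len(gray_pixels)
    counts = Counter(gray_pixels)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def first_image_entropy(grayscale_images, use_median=False):
    """Aggregate the per-image entropies over the first time period
    with a median or an average, as described above."""
    entropies = [image_entropy(img) for img in grayscale_images]
    return median(entropies) if use_median else mean(entropies)

sharp   = [0, 64, 128, 192] * 4  # varied gray levels: entropy 2.0 bits
blurred = [128] * 16             # uniform gray: entropy 0.0 bits
print(first_image_entropy([sharp, blurred]))  # 1.0
```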
In a possible implementation manner, the determining a first abnormal degree value of the movement acceleration of the information recorder according to the movement acceleration curve of the information recorder in the first time period includes:
determining the moving acceleration of at least one time point in the first time period according to the moving acceleration curve;
calculating the difference value between the moving acceleration of each time point and the moving acceleration mean value in the second time period;
and calculating a median or an average value according to the difference corresponding to each time point to obtain the first abnormal degree value.
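A minimal sketch of this computation, assuming the acceleration samples are given as plain numbers and that the absolute difference from the baseline mean is intended (the source says only "difference value"):

```python
from statistics import mean, median

def abnormality_value(accels_first_period, accels_second_period, use_median=False):
    """Sketch of the first abnormal degree value: compare each movement
    acceleration sampled in the first time period against the mean
    acceleration of the second time period, then aggregate the absolute
    differences with a median or an average."""
    baseline = mean(accels_second_period)
    diffs = [abs(a - baseline) for a in accels_first_period]
    return median(diffs) if use_median else mean(diffs)

# Hypothetical accelerometer samples (m/s^2): calm baseline, then a jolt.
baseline_samples = [0.1, 0.0, -0.1, 0.0]  # second time period
current_samples  = [2.0, 2.5, 1.5, 2.0]   # first time period
print(abnormality_value(current_samples, baseline_samples))  # 2.0
```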
In a second aspect, an embodiment of the present application provides an information processing apparatus, including:
a first determination unit configured to determine a first number of persons, an image movement speed, and an image movement acceleration from an image captured within a first time period, wherein the image movement speed represents a movement speed of the information recorder by a change in the image, and the image movement acceleration represents a movement acceleration of the information recorder by a change in the image;
a calculating unit configured to calculate a target emergency evaluation value indicating an event emergency degree, based on the first number of people, the image moving speed, and the image moving acceleration; the first number of people, the image moving speed and the image moving acceleration are positively correlated with the target emergency situation evaluation value respectively;
and the execution unit is used for executing a processing plan corresponding to the target emergency situation evaluation value, and the processing plan comprises a storage plan of the information recorder.
In a third aspect, an embodiment of the present application provides an information recorder, including:
one or more processors; a memory; and one or more computer programs, wherein the one or more computer programs are stored in the memory, the one or more computer programs comprising instructions which, when executed by the apparatus, cause the apparatus to perform the method of any of the first aspects.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium, in which a computer program is stored, which, when run on a computer, causes the computer to perform the method of any one of the first aspect.
In a fifth aspect, the present application provides a computer program for performing the method of the first aspect when the computer program is executed by a computer.
In a possible design, the program of the fifth aspect may be stored in whole or in part on a storage medium packaged with the processor, or in part or in whole on a memory not packaged with the processor.
Drawings
FIG. 1 is a diagram illustrating an exemplary structure of an information recorder according to the present application;
FIG. 2 is a flow chart of an embodiment of an information processing method of the present application;
FIG. 3 is a flow chart of another embodiment of an information processing method of the present application;
FIG. 4 is a flow chart of yet another embodiment of the information processing method of the present application;
FIG. 5 is a flow chart of yet another embodiment of the information processing method of the present application;
FIG. 6 is a flow chart of yet another embodiment of the information processing method of the present application;
fig. 7 is a schematic structural diagram of an embodiment of an information processing apparatus according to the present application.
Detailed Description
The terminology used in the description of the embodiments section of the present application is for the purpose of describing particular embodiments of the present application only and is not intended to be limiting of the present application.
In the existing implementation scheme, a user controls the information recorder to execute the functions of video recording, photographing, talkback, voice transmission and the like according to the condition of a working site, and the automation degree is low.
The application provides an information processing method, an information processing device and electronic equipment, which can detect the situation of a working site, automatically execute a processing plan based on the situation of the working site, improve the automation degree of an information recorder and further improve the working efficiency of a user.
The information processing method can be suitable for the information recorder.
Fig. 1 is a structural example diagram of an information recording apparatus according to the present application, and as shown in fig. 1, the information recording apparatus may include: processor 11, memory 12, camera 13, microphone 14, gyroscope sensor 15, acceleration sensor 16, speaker 17, laser light 18, communication module 19, etc. The processor 11, the memory 12, the camera 13, the microphone 14, the gyro sensor 15, the acceleration sensor 16, the speaker 17, the laser light 18, and the communication module 19 may communicate with each other through an internal connection path to transmit control and/or data signals.
It is to be understood that the illustrated structure of the embodiments of the present application does not constitute a specific limitation to the information recording apparatus. In other embodiments of the present application, the information recorder may include more or fewer components than shown, or some components may be combined, some components may be separated, or a different arrangement of components may be provided. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The memory 12 may be used to store a computer program, and the processor 11 may call and run the computer program from the memory 12. The processor 11 and the memory 12 may be combined into one processing device, although they are more commonly separate components; the processor 11 executes the program code stored in the memory 12 to implement the corresponding functions. In particular, the memory 12 may be integrated in the processor 11, or may be independent of the processor 11.
The camera 13 may be used to perform a video recording or photographing function, and send a photographed video or picture to the processor 11.
The microphone 14 may be used to pick up sound from the environment in which the information recorder is located, and send the picked-up audio signal to the processor 11.
The gyro sensor 15 may be used to determine the movement posture of the information recorder. The acceleration sensor 16 may be used to detect the magnitude of acceleration of the information recorder in various directions (e.g., x, y, z axes preset in the information recorder).
The speaker 17 may be used to play sound under the control of the processor 11.
The laser light 18 may be used to emit a laser beam under the control of the processor 11.
The communication module 19 can provide a solution including 2G/3G/4G/5G wireless communication applied to the information recorder, and implement data communication between the information recorder and a work control center, and between the information recorder and other electronic devices, such as other information recorders, mobile phones, and the like.
It should be understood that the information recorder shown in fig. 1 is capable of implementing the processes of the methods provided by the embodiments of the present application described below. The operation and/or function of each module in the information recorder are respectively for realizing the corresponding flow in the following method embodiments. Reference may be made specifically to the description of the method embodiments illustrated in the present application, and a detailed description is appropriately omitted herein to avoid redundancy.
The information processing method of the present application is exemplarily described below with reference to fig. 1.
Fig. 2 is a flowchart of an embodiment of an information processing method according to an embodiment of the present application, which may be applied to an information recorder, and as shown in fig. 2, the method may include:
step 201: and determining a first number of people, the moving speed of the image and the moving acceleration of the image according to the image captured in the first time period.
The first number of people can, to a certain extent, represent the number of participants in the incident at the work site: generally, the greater the first number of people, the higher the degree of urgency, and the fewer the first number of people, the lower the degree of urgency. The image moving speed can be used to estimate, through changes in the images, the moving speed of the information recorder and of its user: generally, the faster the image moving speed, the faster the information recorder is moving and the higher the degree of urgency of the situation, and conversely the lower. The image moving acceleration can likewise be used to estimate, through changes in the images, the movement acceleration of the information recorder and of its user: generally, the larger the image moving acceleration, the larger the recorder's movement acceleration and the higher the degree of urgency of the situation, and conversely the lower.
The information recorder may process the information according to a certain preset period, and in this case, the first time period may be one period. The specific duration of one period is not limited in the embodiments of the present application.
The time interval at which images are captured within the first time period is not limited in the embodiments of the application. In general, the smaller the interval, the more images are captured, and the more accurate the first number of people, image moving speed, and image moving acceleration determined in this step; that is, the more accurate the estimates of the number of people at the work site and the user's moving speed, and the more accurate the subsequent situation evaluation and judgment.
Alternatively, determining the first number of people from the captured images in the first time period may be achieved by taking a median or average of the number of people in the images in the first time period, and so on. For example, determining a first number of persons from images captured during a first time period may include: for each image captured in the first time period, calculating the number of people in the image; and calculating the median or the average value according to the number of people in the captured image in the first time period to obtain the first number of people.
For example, assuming that 10 images are captured in the first time period, the number of people in each image is calculated respectively to obtain the number of people in each image in the 10 images, the number of people corresponding to the 10 images is averaged, and the average value is used as the first number of people; alternatively, the median of the number of persons corresponding to 10 images may be calculated, and the median may be the first number of persons.
When calculating the number of people in each image, the image may first be converted to grayscale, and the number of people in the grayscale image may then be calculated using a histogram of oriented gradients, a feature-point algorithm, or the like.
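The aggregation described above can be sketched as follows, with the per-image person counts assumed to be already available (e.g. from a person detector); the function name is illustrative:

```python
from statistics import mean, median

def first_number_of_people(per_image_counts, use_median=False):
    """Aggregate the per-image person counts captured in the first time
    period into the first number of people via a median or an average."""
    return median(per_image_counts) if use_median else mean(per_image_counts)

# Hypothetical per-image counts for 10 captured frames.
counts = [3, 4, 4, 5, 3, 4, 6, 4, 3, 4]
print(first_number_of_people(counts))                  # 4.0 (average)
print(first_number_of_people(counts, use_median=True)) # 4.0 (median)
```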
Optionally, determining the image moving speed according to the image captured in the first time period may include:
respectively calculating the image moving speed corresponding to two adjacent images for the image captured in the first time period;
and calculating the median or the average value according to the image moving speeds corresponding to the two adjacent images to obtain the image moving speed.
For example, assuming that 10 images are captured in the first time period, namely, image 1 to image 10, the image moving speed may be calculated according to image 1 and image 2, the image moving speed may be calculated according to image 2 and image 3, and so on, to obtain 9 image moving speeds, calculate an average value of the 9 image moving speeds, and use the average value as the image moving speed; alternatively, the median of the 9 image moving speeds may be calculated, and the median may be set as the image moving speed.
Optionally, the method for calculating the image moving speed corresponding to two adjacent images may include:
selecting a first pixel point in a first image for the gray images of two adjacent images; the first image is a gray image of an image which is shot relatively earlier in time in two adjacent images;
determining an image area in a second image by taking a pixel point corresponding to the first coordinate as a center according to the first coordinate of the first pixel point in the first image; the second image is a gray image of an image with a later shooting time in two adjacent images;
searching a second pixel point which has the same gray value as the first pixel point in the determined image area;
for each second pixel point, calculating the displacement deltas of the second pixel point relative to the first pixel point according to the second coordinate of the second pixel point and the first coordinate of the first pixel point;
and calculating the image moving speed according to the displacement of the second pixel point relative to the first pixel point.
The calculated image moving speed is an image moving speed corresponding to the first image and the second image.
The first pixel point may be any pixel point in the first image, or may also be any pixel point in the first image having a preset gray value; preferably, in order to ensure that an image region centered on the pixel point of the first coordinate can be obtained in the second image, the first pixel point may be relatively close to the center position of the first image.
The first image and the second image can use the same coordinate system, so that the pixel points which are located at the same position as the first pixel points in the second image can be obtained on the second image based on the first coordinate.
The image area determined by taking the pixel point corresponding to the first coordinate as the center may be a circular area taking the pixel point corresponding to the first coordinate as the center of a circle, and the specific value of the radius of the circular area is not limited in the embodiment of the present application.
Wherein, calculating the image moving speed according to the displacement of the second pixel points relative to the first pixel point may include:
for each second pixel point, calculating its moving speed relative to the first pixel point as v = Δs / Δt, where Δt is the difference in shooting time between the first image and the second image; determining the median of the moving speeds corresponding to the second pixel points, and taking the determined median as the image moving speed; alternatively,
calculating the average of the moving speeds corresponding to the second pixel points, v̄ = (v₁ + v₂ + … + vₙ) / n, and taking the calculated average as the image moving speed, where n is the total number of second pixel points.
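A toy sketch of this pixel-matching speed estimate, representing each grayscale image as a dictionary mapping coordinates to gray values rather than a real image array (an assumption for illustration only):

```python
from statistics import median

def estimate_pair_speed(img1, img2, point, radius, dt):
    """Sketch of the per-pair estimate described above: pick a first
    pixel in the earlier grayscale image, search a circular region
    around the same coordinate in the later image for pixels with the
    same gray value, convert each displacement Δs to a speed Δs / Δt,
    and take the median of those speeds."""
    x0, y0 = point
    target_gray = img1[(x0, y0)]
    speeds = []
    for (x, y), gray in img2.items():
        dist_sq = (x - x0) ** 2 + (y - y0) ** 2
        if gray == target_gray and dist_sq <= radius ** 2:
            ds = dist_sq ** 0.5          # displacement Δs in pixels
            speeds.append(ds / dt)       # speed in pixels per second
    return median(speeds) if speeds else 0.0

# Toy feature that moved 3 pixels to the right between frames 0.5 s apart.
frame1 = {(5, 5): 200}
frame2 = {(8, 5): 200, (5, 5): 10}
print(estimate_pair_speed(frame1, frame2, point=(5, 5), radius=5, dt=0.5))  # 6.0
```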
It should be noted that the method exemplified above for calculating the image moving speed corresponding to two adjacent images selects a single first pixel point as an example. In other possible implementations, two or more pixel points may be selected, the image moving speed corresponding to each pixel point calculated as described above, and the median or average of those speeds taken as the image moving speed of the two adjacent images.
The image moving acceleration may be calculated according to the image moving speeds corresponding to the two adjacent images, and specifically, the acceleration of the two adjacent images may be calculated first, and then the median or the average of the acceleration is calculated to obtain the image moving acceleration.
For example, suppose that 10 images are captured in the first time period, which are image 1 to image 10, the image moving speed 1 can be calculated according to image 1 and image 2, the image moving speed 2 is calculated according to image 2 and image 3, and so on, 9 image moving speeds are obtained, which are image moving speeds 1 to 9, the image moving acceleration 1 is calculated according to image moving speed 1 and image moving speed 2, the image moving acceleration 2 is calculated according to image moving speed 2 and image moving speed 3, and so on, 8 image moving accelerations are obtained, and the median or average of 8 image moving accelerations is calculated, so that the image moving acceleration is obtained.
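The worked example above, deriving accelerations from consecutive pairwise speeds and then aggregating, can be sketched as follows (the frame interval and speed values are hypothetical):

```python
from statistics import mean, median

def image_moving_acceleration(pair_speeds, frame_dt, use_median=False):
    """Sketch: compute a per-step acceleration Δv / Δt for each pair of
    adjacent image moving speeds, then aggregate the accelerations with
    a median or an average, as in the worked example above."""
    accels = [(v2 - v1) / frame_dt for v1, v2 in zip(pair_speeds, pair_speeds[1:])]
    return median(accels) if use_median else mean(accels)

# Nine hypothetical pairwise speeds (pixels/s) from ten captured frames.
speeds = [10, 12, 15, 15, 18, 22, 25, 25, 30]
print(image_moving_acceleration(speeds, frame_dt=0.5))  # 5.0
```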
Step 202: and calculating a target emergency situation evaluation value according to the first number of people, the image moving speed and the image moving acceleration.
Wherein the target emergency situation evaluation value is used to indicate the degree of situation urgency. Optionally, the target emergency situation evaluation value is positively correlated with the situation urgency, that is: the larger the target emergency situation evaluation value, the higher the degree of urgency, and the smaller the value, the lower the degree of urgency. Correspondingly,
the first number of people is positively correlated with the target emergency situation evaluation value, that is: the larger the first number of people, the larger the target emergency situation evaluation value, and the smaller the first number of people, the smaller the evaluation value;
the image moving speed is positively correlated with the target emergency situation evaluation value, that is: the faster the image moving speed, the larger the target emergency situation evaluation value, and the slower the image moving speed, the smaller the evaluation value;
the image moving acceleration is positively correlated with the target emergency situation evaluation value, that is: the larger the image moving acceleration, the larger the target emergency situation evaluation value, and the smaller the image moving acceleration, the smaller the evaluation value.
Wherein, this step can include:
calculating the target emergency situation evaluation value L using the following formula: L = A1 × C1 + A2 × C2 + a + B1, where A1 is the weight corresponding to the first number of people, C1 is the numerical value corresponding to the first number of people, A2 is the weight corresponding to the image moving speed, C2 is the numerical value corresponding to the image moving speed, a is the numerical value corresponding to the image moving acceleration, and B1 is a constant.
In general, the target emergency situation evaluation value is positively correlated with the first number of people and the other parameters, but the relationship may not be strictly linear; the constant B1 is added to allow tuning of the target emergency situation evaluation value.
The value C1 corresponding to the first number of people may be the first number of people itself, or a value that has a mapping relationship with the first number of people. For example, assuming the first number of people is 10, one possible implementation takes C1 = 10; in another possible implementation, a mapping between the range of the first number of people and a value range is preset, for example between the people range [0, 100] and the value range [0, 10), so that 10 maps to 1 and C1 = 1. The value C2 corresponding to the image moving speed and the value a corresponding to the image moving acceleration can be determined in the same way as C1, which is not repeated here.
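A sketch of the evaluation formula and the range-mapping example, with the weights A1, A2 and constant B1 chosen arbitrarily for illustration (the source does not fix their values):

```python
def map_to_range(value, src_lo, src_hi, dst_lo, dst_hi):
    """Linearly map a raw measurement into a preset value range, as in
    the example mapping the people range [0, 100] onto [0, 10)."""
    frac = (value - src_lo) / (src_hi - src_lo)
    return dst_lo + frac * (dst_hi - dst_lo)

def target_evaluation_value(c1, c2, a, a1=1.0, a2=1.0, b1=0.0):
    """L = A1 * C1 + A2 * C2 + a + B1. The default weights and constant
    are placeholders, not values from the source."""
    return a1 * c1 + a2 * c2 + a + b1

c1 = map_to_range(10, 0, 100, 0, 10)  # first number of people 10 -> 1.0
print(c1)                             # 1.0
print(target_evaluation_value(c1, c2=2.0, a=0.5, a1=3.0, a2=2.0, b1=1.0))  # 8.5
```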
In this step, the first number of people can represent the number of incident participants at the work site, while the image moving speed and the image moving acceleration can represent changes in the speed and acceleration of the user carrying the information recorder. The urgency of the work-site incident can therefore be estimated quantitatively from these three parameters.
Step 203: and executing a processing plan corresponding to the target emergency situation evaluation value, wherein the processing plan comprises a storage plan of the information recorder.
This step may include: determining a preset interval where the target emergency situation evaluation value is located; the preset interval is obtained by dividing the value range of the target emergency situation evaluation value according to at least one preset threshold value; and executing a preset processing plan corresponding to the determined preset interval.
For example, at least one level threshold may be preset in the information recorder, the value range of the target emergency situation evaluation value is divided into at least two sections according to the preset at least one level threshold, and a corresponding processing plan is set for each section; correspondingly, the processing plan corresponding to the section where the target emergency situation evaluation value is located can be executed according to the section where the target emergency situation evaluation value is located.
Wherein each interval may correspond to an event level.
Different processing plans are set for different intervals, so that different processing plans can be set according to the emergency degree of the situation, the information recorder can perform corresponding reasonable processing according to the situation, the automation and the intelligent degree of the information recorder are improved, and the use is convenient for a user.
For example, a first level threshold q1 and a second level threshold q2 may be preset, with the first level threshold smaller than the second, so that the value range (b1, b2) of the target emergency situation assessment value is divided into three intervals, (b1, q1), [q1, q2), and [q2, b2), corresponding respectively to three event levels: third-level, second-level, and first-level.
The lower the level number, the higher the urgency of the event; the following processing plans may be set for the three event levels according to their urgency.
Three-level state: carrying out local processing; local processing includes, but is not limited to: and controlling the laser lamp to flicker and controlling the loudspeaker to play a preset audio signal.
The second-level state: performing local processing and remote processing; remote processing includes, but is not limited to: and uploading the emergency state information to the control center, and waiting for the control center to issue a command instruction.
The first-level state: local processing and remote processing are performed and emergency status information is transmitted directly to associated electronic equipment, such as an information recorder, requesting immediate emergency support, without passing through a control center.
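The interval lookup behind this level assignment can be sketched as follows; the threshold values here are illustrative placeholders, not values specified by the text:

```python
import bisect

# Illustrative level thresholds q1 < q2; actual values are configuration choices.
THRESHOLDS = [40.0, 70.0]
LEVELS = ["third-level", "second-level", "first-level"]

def event_level(evaluation_value):
    """Map a target emergency situation evaluation value to an event level:
    (b1, q1) -> third-level, [q1, q2) -> second-level, [q2, b2) -> first-level."""
    return LEVELS[bisect.bisect_right(THRESHOLDS, evaluation_value)]
```

Because `bisect_right` counts thresholds less than or equal to the value, a value exactly at q1 falls into the second-level interval [q1, q2), matching the half-open intervals above.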
The information recorder can automatically start peripheral devices such as the laser lamp and loudspeaker to alert and prompt target personnel, helping maintain order at the work site. The flashing frequency of the laser lamp can adapt to the emergency level: the more urgent the incident, the higher the flashing frequency can be. The preset audio signal played by the loudspeaker can be predefined by workers, and its playback volume can likewise adapt to the emergency level: the more urgent the incident, the louder the volume can be.
Optionally, this step may include:
when the target emergency situation evaluation value reaches a segmentation threshold value, segmenting the video data of the information recorder;
when the target emergency situation evaluation value is reduced to be below the segmentation threshold value, segmenting the video data of the information recorder again, and marking the video segment obtained between the two segmentations.
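The two segmentation rules above can be sketched as a small state machine; the class and field names are hypothetical:

```python
# Minimal sketch of threshold-based segmentation: cut when the evaluation
# value rises to the segmentation threshold, cut again and mark the enclosed
# clip when it falls back below the threshold.
class VideoSegmenter:
    def __init__(self, threshold):
        self.threshold = threshold
        self.above = False    # whether the value is currently at/above threshold
        self.events = []      # recorded (action, marked) pairs, for illustration

    def update(self, evaluation_value):
        if not self.above and evaluation_value >= self.threshold:
            self.above = True
            self.events.append(("segment", False))    # cut at the rising edge
        elif self.above and evaluation_value < self.threshold:
            self.above = False
            self.events.append(("segment", True))     # cut again and mark clip

seg = VideoSegmenter(threshold=40.0)
for value in [10, 20, 45, 50, 30]:
    seg.update(value)
# seg.events records one unmarked cut (value reached 45) and one marked cut
# (value dropped back to 30), delimiting the marked emergency clip.
```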
During use of the information recorder, the user generally keeps its video recording function on throughout the working process, so the whole process is recorded on video. When this video data is stored, the related art generally segments it by a preset duration, for example 10 minutes, to obtain video segments. Because the segment duration is fixed and the information is complicated and redundant, this is unfavorable for subsequently retrieving and reviewing the key information of the working process. In the embodiments of the present application, the video data can instead be segmented flexibly according to the target emergency situation evaluation value: by setting a segmentation threshold, the video in which the target emergency situation evaluation value reaches or exceeds the threshold is divided into its own segment and marked. The emergency incident is thus recorded completely within one video segment, and the mark helps subsequent workers retrieve and review it, improving their working efficiency.
In addition, the method may further include: when the information recorder uploads video data to the control center, the marked video segments are uploaded preferentially. By preferentially uploading the marked video segments, the key information in the information recorder can be uploaded to the control center in real time, which is convenient for workers at the control center to process efficiently.
Alternatively, the segmentation threshold may be equal to one of the level thresholds, so that video segmentation is performed as part of the processing plan corresponding to one of the event levels.
Taking the above three event levels as an example, the segmentation threshold may be set to the first level threshold q1, so that the video is segmented when the target emergency evaluation value reaches q1.
The information recorder can also upload the situation level to the control center. The control center can preferentially process events with high emergency degree according to the event grades uploaded by the information recorders, and can carry out resource scheduling more efficiently.
When the information recorder preferentially uploads the marked video segments, the control center can quickly retrieve and match the target personnel from the database, feed the target personnel's information back to the information recorder, and adjust work strategies in time.
In the method shown in fig. 2, in order to improve the accuracy of evaluating the work-site situation, a first image information entropy for the first time period may additionally be calculated, and the target emergency situation evaluation value may be calculated from it as well. In this case, step 201 may further include: determining the first image information entropy according to the images captured in the first time period. Accordingly, when the target emergency situation evaluation value is calculated in step 202, the calculation is also based on the first image information entropy.
The image information entropy can represent the degree of blur of an image. Generally, the faster the information recorder moves, the more blurred the images captured by its camera and the smaller the image information entropy; the slower it moves, the less blurred and clearer the images and the larger the image information entropy. The image information entropy can therefore characterize the moving speed of the information recorder from another dimension. Accordingly, the first image information entropy of the first time period in the embodiments of the present application can represent the moving speed of the information recorder in that period: the larger the first image information entropy, the slower the information recorder moved in the first time period and the lower the urgency of the incident; the smaller the entropy, the faster it moved and the higher the urgency.
Alternatively, if the event urgency is positively correlated with the target emergency assessment value, the first image information entropy and the target emergency assessment value are negatively correlated, that is: the larger the first image information entropy is, the smaller the target emergency state evaluation value is, and the smaller the first image information entropy is, the larger the target emergency state evaluation value is.
At this time, the calculation formula of the target emergency situation evaluation value may be: L = a1 × C1 + a2 × C2 × a + a3 × C3 + B1, where a3 is the weight corresponding to the first image information entropy, and C3 is the value corresponding to the first image information entropy. The method for determining the value C3 corresponding to the first image information entropy may refer to the method for determining the value C1 corresponding to the first number of people, which is not repeated here.
Determining the first image information entropy according to the image captured in the first time period may include:
respectively calculating the image information entropy of each gray image according to the gray images of the images captured in the first time period;
and calculating an average value according to the image information entropy of the image captured in the first time period to obtain a first image information entropy.
The image information entropy of a grayed image may be calculated as:

H = - Σ (i = 0 to 255) P_i × log2(P_i)

where P_i is the proportion of pixels with gray value i among all pixels of the grayed image. The formula above takes the gray value range 0 to 255 as an example and is not intended to limit the gray value range of the grayed image.
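Under this formula, a sketch of computing the per-image entropy and the first image information entropy (the average over the snapshots of the first time period); images are represented as flat lists of gray values for simplicity:

```python
import math
from collections import Counter

def image_entropy(gray_pixels):
    """H = -sum over gray levels i of P_i * log2(P_i), where P_i is the
    proportion of pixels whose gray value is i."""
    total = len(gray_pixels)
    return -sum((n / total) * math.log2(n / total)
                for n in Counter(gray_pixels).values())

def first_image_entropy(snapshot_images):
    """Average of the per-image entropies over the first time period."""
    return sum(image_entropy(img) for img in snapshot_images) / len(snapshot_images)

h_flat = image_entropy([128] * 100)            # single gray level: 0 bits
h_two = image_entropy([0] * 50 + [255] * 50)   # two equal levels: 1 bit
```

A sharp image spreads its histogram over many gray levels and so yields higher entropy than a blurred one, which is the property the evaluation relies on.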
Unlike the method shown in fig. 2, in order to improve the accuracy of evaluating the work-site situation, the target emergency situation evaluation value may also be calculated based on a parameter related to the moving acceleration of the information recorder. Referring to fig. 3, step 301 is added before step 202, and the calculation method in step 202 is adapted accordingly. Specifically, the method includes:
step 301: and determining a first abnormal degree value of the movement acceleration of the information recorder according to the movement acceleration curve of the information recorder in the first time period.
The moving acceleration curve of the information recorder can be detected by an acceleration sensor arranged in the information recorder.
Wherein, this step can include: determining the moving acceleration of a plurality of preset time points in a first time period according to the moving acceleration curve; calculating the difference between the moving acceleration of each time point and the average moving acceleration in the second time period; and calculating the median or average value of the difference to obtain a first abnormal degree value of the moving acceleration.
Wherein the second time period is a time period before the first time period. Optionally, if the first time period is one cycle, the second time period is the cycle preceding it; for example, if the first time period is the 5th cycle, the second time period may be the 4th cycle.
The first abnormal degree value of the moving acceleration can represent the degree of abnormality of the moving acceleration of the information recorder, that is, abnormal movement of the user carrying it, and can therefore represent the urgency of the incident. Generally, the larger the first abnormal degree value, the higher the urgency of the incident, and the smaller the value, the lower the urgency.
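The three sub-steps above can be sketched as follows; taking absolute deviations is an assumption here, since the text speaks only of "the difference":

```python
import statistics

def acceleration_abnormality(accels_first_period, mean_accel_second_period,
                             use_median=True):
    """First abnormal degree value: deviation of each sampled acceleration in
    the first time period from the second period's average acceleration,
    reduced to a single value by median or mean. Absolute deviations are an
    assumption, not stated explicitly in the text."""
    diffs = [abs(a - mean_accel_second_period) for a in accels_first_period]
    return statistics.median(diffs) if use_median else statistics.fmean(diffs)

# Sampled accelerations 1.0, 2.0, 3.0 against a previous-period mean of 1.0
# give deviations 0, 1, 2 and a median abnormality value of 1.0.
value = acceleration_abnormality([1.0, 2.0, 3.0], 1.0)
```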
Accordingly, the target emergency situation evaluation value is also calculated in step 202 according to the first abnormal degree value of the moving acceleration. Alternatively, if the event urgency is positively correlated with the target emergency assessment value, then the first anomaly magnitude is positively correlated with the target emergency assessment value, that is: the larger the first abnormality degree value is, the larger the target emergency evaluation value is, and the smaller the first abnormality degree value is, the smaller the target emergency evaluation value is.
In this embodiment, assuming that the target emergency situation evaluation value is calculated in step 202 based on the first number of people, the image moving speed, the image moving acceleration, the first abnormal degree value of the moving acceleration, and the first image information entropy, this step may include:
the target emergency evaluation value L is calculated using the following formula: L = a1 × C1 + a2 × C2 × a + a3 × C3 + a4 × C4 + B1, where a4 is the weight corresponding to the first abnormal degree value of the moving acceleration, C4 is the value corresponding to the first abnormal degree value of the moving acceleration, and B1 is a constant.
It should be noted that the execution sequence between step 201 and step 301 is not limited in this application. The execution of step 301 after step 201 in fig. 3 is merely an example.
In order to improve the accuracy of evaluating the work-site situation, relevant parameters may also be calculated from the first audio signal picked up by the information recorder during the first time period, and the target emergency situation evaluation value may accordingly also be calculated from these parameters. The following describes this processing as an addition to the embodiment shown in fig. 3. Referring to fig. 4, step 401 is added before step 201, and the calculation method in step 202 is adapted accordingly. Specifically:
step 401: and determining a first sound intensity value of the first audio signal and/or a keyword hit rate of text information corresponding to the first audio signal according to the first audio signal picked up by the information recorder in the first time period.
Correspondingly, in step 202, the target emergency situation assessment value is also calculated according to the first sound intensity value and/or the keyword hit rate.
Wherein the first sound intensity value of the first audio signal may be: the median or average of the sound intensities of the first audio signal at several time points within the first time period; the detailed calculation method is omitted here. The intervals between the time points may be the same or different; preferably they are the same, so that the first sound intensity value better represents the sound intensity of the first audio signal. The number of time points is not limited in the embodiments of the present application; generally, the more time points, the better the calculated first sound intensity value represents the sound intensity of the first audio signal.
The first sound intensity value can represent the sound intensity of the environment where the information recorder is located, and therefore the emergency degree of the situation can be represented from the dimension of the sound intensity. Generally, the greater the first sound intensity value, the higher the event urgency, and the smaller the first sound intensity value, the lower the event urgency. If the state emergency degree and the target emergency evaluation value are positively correlated, correspondingly, the first sound intensity value and the target emergency evaluation value are positively correlated, that is: the larger the first sound intensity value is, the larger the target emergency state evaluation value is, and conversely, the smaller the target emergency state evaluation value is.
The keyword hit rate of the text information corresponding to the first audio signal is the proportion of characters in the text information that hit the preset keywords. The preset keywords can be words, obtained through statistics, that are likely to appear in emergency incidents, so the keyword hit rate can represent the probability that preset keywords appear in the sound of the environment where the information recorder is located, and further represent the urgency of the incident. Generally, the higher the keyword hit rate, the higher the incident urgency, and the lower the hit rate, the lower the urgency. If the incident urgency is positively correlated with the target emergency evaluation value, then correspondingly the keyword hit rate is positively correlated with the target emergency evaluation value, that is: the larger the keyword hit rate, the larger the target emergency evaluation value, and conversely, the smaller.
For example, assume the text information corresponding to the first audio signal is "我要投诉你" ("I want to complain about you"), a string of five characters, and the preset keywords include only "投诉" ("complain"), which covers two of them. The keyword hit rate of this text information is then 2/5 = 40%.
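A character-based sketch of this hit-rate calculation, matching the Chinese example above (overlapping keyword occurrences are ignored for simplicity):

```python
def keyword_hit_rate(text, keywords):
    """Proportion of characters in text that fall inside an occurrence of a
    preset keyword. Counts non-overlapping occurrences of each keyword."""
    hit_chars = 0
    for kw in keywords:
        start = 0
        while (idx := text.find(kw, start)) != -1:
            hit_chars += len(kw)
            start = idx + len(kw)
    return hit_chars / len(text)

# "我要投诉你" has 5 characters, of which the 2 characters of "投诉" hit.
rate = keyword_hit_rate("我要投诉你", ["投诉"])   # 2/5 = 0.4
```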
In this embodiment, assuming that the target emergency situation evaluation value is calculated in step 202 based on the first number of people, the image moving speed, the image moving acceleration, the first abnormal degree value of the moving acceleration, the first image information entropy, the first sound intensity value, and the keyword hit rate, this step may include:
the target emergency evaluation value L is calculated using the following formula: L = a1 × C1 + a2 × C2 × a + a3 × C3 + a4 × C4 + a5 × C5 + a6 × C6 + B1, where a5 is the weight corresponding to the first sound intensity value, C5 is the value corresponding to the first sound intensity value, a6 is the weight corresponding to the keyword hit rate, C6 is the value corresponding to the keyword hit rate, and B1 is a constant.
The method for determining the value C6 corresponding to the keyword hit rate may refer to the method for determining the value C1 corresponding to the first number of people, which is not repeated here.
Alternatively, the value C5 corresponding to the first sound intensity value may be calculated using the following formula: C5 = a1 × (S1 - S0) × T1 + B2;
where a1 is a scaling factor, S1 is a first sound intensity value, S0 is a preset sound intensity threshold, T1 is a duration of time during which the average sound intensity value of the first audio signal exceeds the preset sound intensity threshold during the first time period, and B2 is a constant.
The method for determining T1 is as follows: the first time period is divided into several sub-time periods, the average sound intensity value of each sub-time period is calculated, and each is compared with the preset sound intensity threshold; the duration T1 is then the total duration of the sub-time periods whose average sound intensity value exceeds the threshold. The durations of the sub-time periods may be the same or different; preferably, they are the same. For example, if the first time period lasts 5 minutes and is divided into 5 sub-time periods, of which sub-time periods 2 to 4 exceed the preset sound intensity threshold, then the duration T1 is 3 minutes.
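The determination of T1 can be sketched as follows, with each sub-time period given as an (average intensity, duration) pair; the intensity values are illustrative:

```python
def duration_above_threshold(sub_periods, s0):
    """T1: total duration of the sub-time periods whose average sound
    intensity value exceeds the preset sound intensity threshold S0."""
    return sum(duration for avg_intensity, duration in sub_periods
               if avg_intensity > s0)

# 5-minute first time period split into five 1-minute sub-periods; only
# sub-periods 2-4 exceed the threshold, so T1 = 3 minutes.
t1 = duration_above_threshold(
    [(50, 1), (80, 1), (85, 1), (90, 1), (40, 1)], s0=70)
```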
It should be noted that the execution sequence among steps 201, 301, and 401 is not limited in this application. The execution of step 401 before step 201 in fig. 4 is merely an example.
In order to reduce the data processing amount and power consumption of the information recorder, the information recorder may perform steps 201 and 301 not continuously but only when a trigger condition is met. The embodiment shown in fig. 5 illustrates this by adding a trigger condition to the embodiment shown in fig. 4; in this case, step 401 may be preceded by:
step 501: and calculating a first emergency state evaluation value according to the first sound intensity value of the second audio signal in the second time period and the keyword hit rate of the text information corresponding to the second audio signal, and judging that the first emergency state evaluation value exceeds a preset trigger threshold value.
Wherein the first emergency situation assessment value is also used for representing the emergency degree of the situation. Optionally, the state emergency degree and the first emergency state evaluation value are positively correlated, and correspondingly, the first sound intensity value and the keyword hit rate of the second audio signal are positively correlated with the first emergency state evaluation value respectively.
The first emergency state evaluation value may be calculated by the following formula: V = x1 × D1 + x2 × D2 + B3;
wherein D1 is a value corresponding to the first sound intensity value of the second audio signal, and D2 is a value corresponding to the keyword hit rate of the second audio signal; x1 is the weight corresponding to the first sound intensity value, and x2 is the weight corresponding to the keyword hit rate; b3 is a constant.
For determining the first sound intensity value of the second audio signal and its corresponding value, reference may be made to the method for determining the first sound intensity value of the first audio signal and its corresponding value described in step 401, which is not repeated here.
Alternatively, assuming that the target emergency situation assessment value is calculated in step 202 according to the first number of people, the image moving speed, the image moving acceleration, the first abnormal degree value of the moving acceleration, the first image information entropy, the first sound intensity value, and the keyword hit rate, step 202 in this embodiment may include:
calculating a first emergency situation evaluation value of a first time period according to a first sound intensity value of the first audio signal and a keyword hit rate of the first audio signal;
calculating a second emergency situation evaluation value of the first time period according to the first number of people in the first time period, the image moving speed, the image moving acceleration, the first image information entropy and the first abnormal degree value of the moving acceleration;
and calculating a target emergency evaluation value according to the first emergency evaluation value and the second emergency evaluation value of the first time period.
Wherein the second emergency situation assessment value is also used to represent the emergency degree of the situation. Optionally, the second emergency situation evaluation value is positively correlated with the situation emergency degree, correspondingly, the first person number, the image moving speed, the image moving acceleration and the first abnormal degree value of the moving acceleration are respectively positively correlated with the second emergency situation evaluation value, and the first image information entropy is negatively correlated with the second emergency situation evaluation value.
For calculating the first emergency state evaluation value of the first time period according to the first sound intensity value and the keyword hit rate of the first audio signal, reference may be made to the calculation method of the first emergency state evaluation value for the second time period in step 501, which is not repeated here.
Wherein calculating the second emergency situation assessment value for the first time period according to the first number of people for the first time period, the image moving speed, the image moving acceleration, the first image information entropy, and the first abnormal degree value of the moving acceleration may include:
the second emergency situation evaluation value P for the first time period is calculated using the following formula: P = y1 × C1 + y2 × C2 × a + y3 × C3 + y4 × C4 + B4; where y1 is the weight corresponding to the first number of people, y2 is the weight corresponding to the image moving speed and the image moving acceleration, y3 is the weight corresponding to the first image information entropy, y4 is the weight corresponding to the first abnormal degree value, and B4 is a constant.
Note that the first image information entropy and the first abnormal degree value of the moving acceleration are optional parameters. If the second emergency evaluation value is not calculated from the first image information entropy, the term y3 × C3 may be omitted from the above formula; similarly, if it is not calculated from the first abnormal degree value, the term y4 × C4 may be omitted.
The calculation formula of the target emergency situation evaluation value may be: L = α × V + β × P + K;
where α is a weight of the first emergency state evaluation value, β is a weight of the second emergency state evaluation value, V is the first emergency state evaluation value, P is the second emergency state evaluation value, and K is a constant.
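The final combination can be sketched as follows; the default weights and constant are illustrative placeholders, not values specified by the text:

```python
def target_evaluation(v, p, alpha=0.5, beta=0.5, k=0.0):
    """L = alpha * V + beta * P + K, combining the audio-based first
    emergency state evaluation value V with the image/motion-based second
    emergency situation evaluation value P."""
    return alpha * v + beta * p + k

# With illustrative equal weights, V = 10 and P = 20 combine to L = 15.
l_value = target_evaluation(10.0, 20.0)
```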
The following describes the information processing method of the embodiments of the present application with a specific example. Fig. 6 is a flowchart of another embodiment of the information processing method of the present application, described taking the case where the first time period and the second time period are each one cycle. As shown in fig. 6, the method may include:
step 601: the information recorder uses a microphone to pick up sound in the environment in each period to obtain an audio signal of each period.
Optionally, in order to avoid interference from environmental sounds in subsequent processing, the information recorder may apply front-end enhancement filtering to the picked-up audio signal, filtering out common environmental noise while retaining a clear voice signal; the specific filtering method is not limited in the embodiments of the present application. Accordingly, the audio signal processed in subsequent steps may be the filtered audio signal.
Step 602: the information recorder detects the average sound intensity value of the audio signal in each period, and if it is detected that the duration of the average sound intensity value exceeding the preset sound intensity threshold exceeds the time length threshold in the nth period, step 503 is performed. N is a natural number.
When a fierce dispute occurs at the workplace, people's emotions and voices fluctuate strongly. Therefore, detecting that the duration for which the sound intensity exceeds the sound intensity threshold exceeds the time length threshold indicates that the work-site situation may be tense, and triggers the subsequent steps to further analyze the incident.
Triggering step 603 only upon detecting that the duration for which the sound intensity exceeds the sound intensity threshold exceeds the time length threshold is optional. Performing this optional step makes the execution of the subsequent steps more purposeful and efficient, and reduces the power consumption of the information recorder compared with executing the subsequent steps continuously.
If, in some period after the Nth period, the information recorder detects that the duration for which the average sound intensity value exceeds the preset sound intensity threshold does not exceed the time length threshold, execution of step 603 and the subsequent steps may be stopped, until step 602 again detects in a later period that the duration exceeds the time length threshold, which re-triggers execution of step 603 and the subsequent steps.
Step 603: and the information recorder performs character conversion on the audio signal in the (N + 1) th period to obtain character information corresponding to the audio signal, matches the character information with preset keywords, and calculates the keyword hit rate of the character information.
Step 604: the information recorder calculates a first sound intensity value of the audio signal in the (N + 1) th period and a keyword hit rate of the text information corresponding to the audio signal, calculates a first emergency evaluation value according to the first sound intensity value of the audio signal in the (N + 1) th period and the keyword hit rate of the text information corresponding to the audio signal, judges that the first emergency evaluation value exceeds a preset first threshold value, and executes steps 605 to 607.
Step 604 is an optional step.
Step 605: the information recorder calculates a first sound intensity value of the audio signal in the (N + 2) th period and a keyword hit rate of character information corresponding to the audio signal, and calculates a first emergency situation evaluation value according to the first sound intensity value and the keyword hit rate of the (N + 2) th period; step 609 is performed.
Step 606: the information recorder starts a timing snapshot function of the camera in the (N + 2) th period, acquires images captured at a timing snapshot, and calculates the first number of people, the image moving speed, the image moving acceleration and the first image information entropy in the (N + 2) th period according to the captured images in the (N + 2) th period; step 608 is performed.
Since the user points the camera at the work site when using the information recorder, the images captured at regular intervals are generally images of the work site, so the development of the work-site situation can be estimated by acquiring the first number of people, the image moving speed, and the image moving acceleration.
Step 607: the information recorder acquires a moving acceleration curve of the information recorder in the (N + 2) th cycle through a gyroscope sensor and an acceleration sensor, and calculates a first abnormal degree value of the moving acceleration of the information recorder according to the moving acceleration curve; step 608 is performed.
Step 608: the information recorder calculates a second emergency situation evaluation value according to the first number of people, the image moving speed, the image moving acceleration, the first image information entropy and the first abnormal degree value.
Step 609: the information recorder calculates a target emergency evaluation value according to the first emergency evaluation value and the second emergency evaluation value.
Step 610: the information recorder executes a processing plan corresponding to the target emergency situation evaluation value.
According to the embodiments of the present application, the camera, microphone and related sensors already present in the information recorder are fully utilized to efficiently extract environment information of the work site for emergency situation assessment of an incident. Files such as video data are intelligently segmented and grade-marked according to the target emergency situation evaluation value of the incident, which reduces the workload of retrieval, avoids burdening information transmission with redundant data, and makes information transmission faster and more effective.
For different situation levels, corresponding processing plans are provided, and effective work-site information is transmitted to the control center and to team members; processing priorities are differentiated by level, ensuring that high-level emergencies are resolved first. The scheme safeguards the personal safety of staff, protects the legal rights and interests of target personnel, and improves working efficiency.
It is to be understood that some or all of the steps or operations in the above-described embodiments are merely examples, and embodiments of the present application may perform other operations or variations of the various operations. Further, the various steps may be performed in a different order than presented in the above-described embodiments, and it is possible that not all of the operations in the above-described embodiments are performed.
Fig. 7 is a schematic structural diagram of an embodiment of an information processing apparatus according to the present application, and as shown in fig. 7, the apparatus 700 may include:
a determining unit 710 for determining a first number of people, an image moving speed and an image moving acceleration from the captured images in the first time period, wherein the image moving speed represents the moving speed of the information recorder through the change of the images, and the image moving acceleration represents the moving acceleration of the information recorder through the change of the images;
a calculating unit 720 for calculating a target emergency evaluation value indicating an emergency degree according to the first number of persons, the image moving speed, and the image moving acceleration; the first number of people, the image moving speed and the image moving acceleration are positively correlated with the target emergency situation evaluation value respectively;
the execution unit 730 is configured to execute a processing plan corresponding to the target emergency situation evaluation value, where the processing plan includes a storage plan of the information recorder.
Optionally, the determining unit 710 may be further configured to:
determining a first image information entropy according to the images captured in the first time period; the first image information entropy represents the moving speed of the information recorder through the degree of blur of the images;
accordingly, the calculation unit 720 may also calculate the target emergency evaluation value from the first image information entropy, which is inversely related to the target emergency evaluation value.
Optionally, the determining unit 710 may be further configured to:
determining a first abnormal degree value of the movement acceleration of the information recorder according to the movement acceleration curve of the information recorder in a first time period; the moving acceleration curve is detected by an acceleration sensor of the information recorder; the first abnormal degree value represents the moving speed of the information recorder through the abnormal degree of the moving acceleration of the information recorder;
accordingly, the calculation unit 720 may further calculate the target emergency evaluation value according to the first abnormality degree value, which is positively correlated with the target emergency evaluation value.
Optionally, the determining unit 710 may be further configured to:
determining a first sound intensity value of a first audio signal and/or a keyword hit rate of text information corresponding to the first audio signal according to the first audio signal picked up in a first time period; the first sound intensity value is used for representing the sound intensity of the environment where the information recorder is located; the keyword hit rate is used for representing the probability of the preset keywords appearing in the sound of the environment where the information recorder is located;
accordingly, the calculation unit 720 may further calculate the target emergency situation evaluation value according to the first sound intensity value and/or the keyword hit rate, which are respectively positively correlated with the target emergency situation evaluation value.
Optionally, the computing unit 720 may be specifically configured to: the target emergency evaluation value L is calculated using the following formula: L = a1 × C1 + a2 × C2 × a + a3 × C3 + a4 × C4 + a5 × C5 + a6 × C6 + B1, where a1 is the weight corresponding to the first number of people, C1 is the numerical value corresponding to the first number of people, a2 is the weight corresponding to the image moving speed, C2 is the numerical value corresponding to the image moving speed, a is the image moving acceleration, a3 is the weight corresponding to the first image information entropy, C3 is the numerical value corresponding to the first image information entropy, a4 is the weight corresponding to the first abnormal degree value, C4 is the numerical value corresponding to the first abnormal degree value, a5 is the weight corresponding to the first sound intensity value, C5 is the numerical value corresponding to the first sound intensity value, a6 is the weight corresponding to the keyword hit rate, C6 is the numerical value corresponding to the keyword hit rate, and B1 is a constant.
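As a minimal sketch of the weighted sum above (the function name, tuple layout and any concrete weight values are illustrative, not from the patent):

```python
def target_evaluation(weights, values, accel, b1=0.0):
    """L = a1*C1 + a2*C2*a + a3*C3 + a4*C4 + a5*C5 + a6*C6 + B1.
    `weights` holds a1..a6 and `values` holds C1..C6; the image-moving-speed
    term (C2) is additionally scaled by the image moving acceleration
    `accel`, i.e. the factor `a` in the formula."""
    a1, a2, a3, a4, a5, a6 = weights
    c1, c2, c3, c4, c5, c6 = values
    return a1*c1 + a2*c2*accel + a3*c3 + a4*c4 + a5*c5 + a6*c6 + b1
```

Because the speed term carries the extra acceleration factor, a rapidly accelerating recorder amplifies the contribution of the observed image motion.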
Optionally, the computing unit 720 may be further configured to: the value C5 corresponding to the first sound intensity value is calculated using the following formula: C5 = a1 × (S1 - S0) × T1 + B2; where a1 is a scaling factor, S1 is the first sound intensity value, S0 is a preset sound intensity threshold, T1 is the duration for which the first sound intensity value of the first audio signal exceeds the preset sound intensity threshold within the first time period, and B2 is a constant.
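The same term can be sketched directly (the default values for a1 and B2 are illustrative assumptions):

```python
def sound_intensity_value(s1, s0, t1, a1=1.0, b2=0.0):
    """C5 = a1*(S1 - S0)*T1 + B2: how far the measured intensity S1
    exceeds the threshold S0, scaled by how long (T1) it stayed above."""
    return a1 * (s1 - s0) * t1 + b2
```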
Optionally, the computing unit 720 may be further configured to: calculating a first emergency situation evaluation value according to the average sound intensity of a second audio signal picked up by the information recorder in a second time period and the keyword hit rate of the text information corresponding to the second audio signal; the average sound intensity of the second audio signal and the keyword hit rate of the text information corresponding to the second audio signal are positively correlated with the first emergency situation evaluation value respectively; and judging that the first emergency situation evaluation value exceeds a first evaluation threshold value.
Optionally, the computing unit 720 may be specifically configured to: the first emergency situation evaluation value V is calculated using the following formula: V = x1 × D1 + x2 × D2 + B3; where D1 is the numerical value corresponding to the average sound intensity of the second audio signal, D2 is the numerical value corresponding to the keyword hit rate of the second audio signal, x1 is the weight corresponding to the average sound intensity, x2 is the weight corresponding to the keyword hit rate, and B3 is a constant.
Optionally, the computing unit 720 may be specifically configured to: calculating a first emergency situation evaluation value of a first time period according to a first sound intensity value of the first audio signal and a keyword hit rate of the first audio signal; calculating a second emergency situation evaluation value of the first time period according to the first number of people in the first time period, the image moving speed, the image moving acceleration, the first image information entropy and the first abnormal degree value of the moving acceleration; and calculating a target emergency evaluation value according to the first emergency evaluation value and the second emergency evaluation value of the first time period.
Optionally, the computing unit 720 may be specifically configured to: the second emergency situation evaluation value P for the first time period is calculated using the following formula: P = y1 × C1 + y2 × C2 × a + y3 × C3 + y4 × C4 + B4; where y1 is the weight corresponding to the first number of people, C1 is the numerical value corresponding to the first number of people, y2 is the weight corresponding to the image moving speed, C2 is the numerical value corresponding to the image moving speed, a is the image moving acceleration, y3 is the weight corresponding to the first abnormal degree value, C3 is the numerical value corresponding to the first abnormal degree value, y4 is the weight corresponding to the first image information entropy, C4 is the numerical value corresponding to the first image information entropy, and B4 is a constant.
Optionally, the computing unit 720 may be specifically configured to: the target emergency evaluation value L is calculated using the following formula: L = α × V + β × P + K; where α is the weight corresponding to the first emergency situation evaluation value, β is the weight corresponding to the second emergency situation evaluation value, V is the first emergency situation evaluation value, P is the second emergency situation evaluation value, and K is a constant.
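Taken together, the staged formulas for V, P and L might be sketched as follows (the default weights are illustrative assumptions; the patent leaves them unspecified):

```python
def first_eval(d1, d2, x1=0.6, x2=0.4, b3=0.0):
    # V = x1*D1 + x2*D2 + B3 (audio: sound intensity and keyword hit rate)
    return x1 * d1 + x2 * d2 + b3

def second_eval(c1, c2, a, c3, c4, y=(0.3, 0.3, 0.2, 0.2), b4=0.0):
    # P = y1*C1 + y2*C2*a + y3*C3 + y4*C4 + B4 (image and motion cues)
    y1, y2, y3, y4 = y
    return y1 * c1 + y2 * c2 * a + y3 * c3 + y4 * c4 + b4

def target_eval(v, p, alpha=0.5, beta=0.5, k=0.0):
    # L = alpha*V + beta*P + K (fuse the audio- and image-based evaluations)
    return alpha * v + beta * p + k
```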
Optionally, the execution unit 730 may specifically be configured to: when the target emergency situation evaluation value reaches a segmentation threshold value, segmenting the video data of the information recorder; when the target emergency situation evaluation value is reduced to be below the segmentation threshold value, segmenting the video data of the information recorder again, and marking the video segment obtained between the two segmentations.
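The two-cut segmentation rule above can be sketched as a scan over successive evaluation values (the function name and list-of-values interface are illustrative):

```python
def segment_marks(evals, threshold):
    """Return (start, end) index pairs for runs where the target
    emergency evaluation value stays at or above the segmentation
    threshold: a cut is made when the value reaches the threshold and
    again when it drops back below, and the run in between is marked."""
    marks, start = [], None
    for i, v in enumerate(evals):
        if start is None and v >= threshold:
            start = i                      # first cut: threshold reached
        elif start is not None and v < threshold:
            marks.append((start, i))       # second cut: dropped below
            start = None
    if start is not None:                  # still above threshold at end
        marks.append((start, len(evals)))
    return marks
```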
Optionally, the execution unit 730 may specifically be configured to: determining a preset interval where the target emergency situation evaluation value is located; the preset interval is obtained by dividing the value range of the target emergency situation evaluation value according to at least one preset threshold value; and executing a preset processing plan corresponding to the determined preset interval.
Optionally, the determining unit 710 may specifically be configured to: for each image captured in the first time period, calculating the number of people in the image; and calculating the median or the average value according to the number of people in the captured image in the first time period to obtain the first number of people.
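The median-or-mean aggregation above is straightforward (the helper name is illustrative; per-image head counts would come from a person detector the patent does not specify):

```python
import statistics

def first_person_count(per_image_counts, use_median=True):
    """Aggregate per-image head counts over the first time period into
    the first number of people, by median (robust to detector spikes)
    or by mean, matching the two options described."""
    if use_median:
        return statistics.median(per_image_counts)
    return statistics.fmean(per_image_counts)
```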
Optionally, the determining unit 710 may specifically be configured to: respectively calculating the image moving speed corresponding to two adjacent images for the image captured in the first time period; and calculating the median or the average value according to the image moving speeds corresponding to the two adjacent images to obtain the image moving speed.
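The patent does not fix the estimator for the per-pair image moving speed; the sketch below uses mean absolute pixel change per frame interval purely as an assumed proxy, then takes the median as described:

```python
import statistics

def pairwise_speeds(frames, dt=1.0):
    """frames: equally sized 2-D grayscale images (lists of rows).
    Speed proxy for each adjacent pair = mean absolute pixel change / dt."""
    speeds = []
    for prev, cur in zip(frames, frames[1:]):
        n = len(prev) * len(prev[0])
        diff = sum(abs(a - b)
                   for row_p, row_c in zip(prev, cur)
                   for a, b in zip(row_p, row_c))
        speeds.append(diff / (n * dt))
    return speeds

def image_moving_speed(frames, dt=1.0):
    # median over adjacent-pair speeds, as in the described aggregation
    return statistics.median(pairwise_speeds(frames, dt))
```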
Optionally, the determining unit 710 may specifically be configured to: respectively calculating the image information entropy of each gray image according to the gray images of the images captured in the first time period; and calculating a median or an average value according to the image information entropy of the image captured in the first time period to obtain a first image information entropy.
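The per-image entropy here is the Shannon entropy of the gray-level histogram; motion blur smooths pixels toward similar values, narrowing the histogram and lowering the entropy, which is consistent with the inverse correlation described earlier. A minimal sketch:

```python
import math
from collections import Counter

def image_entropy(gray):
    """Shannon entropy H = -sum(p_i * log2(p_i)) over the gray-level
    histogram of a 2-D grayscale image (list of rows)."""
    pixels = [v for row in gray for v in row]
    total = len(pixels)
    return -sum((c / total) * math.log2(c / total)
                for c in Counter(pixels).values())
```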
Optionally, the determining unit 710 may specifically be configured to: determining the moving acceleration of at least one time point in a first time period according to the moving acceleration curve; calculating the difference value between the moving acceleration of each time point and the moving acceleration mean value in the second time period; and calculating a median or an average value according to the difference corresponding to each time point to obtain a first abnormal degree value.
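A sketch of the abnormal-degree computation (taking the absolute deviation is an assumption; the patent only says "difference"):

```python
import statistics

def abnormal_degree(accels, baseline_accels, use_median=True):
    """First abnormal degree value: deviation of each sampled moving
    acceleration in the first time period from the mean acceleration of
    the second time period, aggregated by median or mean as described."""
    baseline = statistics.fmean(baseline_accels)
    diffs = [abs(a - baseline) for a in accels]
    return statistics.median(diffs) if use_median else statistics.fmean(diffs)
```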
The apparatus provided in the embodiment shown in fig. 7 may be used to implement the technical solutions of the method embodiments shown in fig. 2 to fig. 6 of the present application, and the implementation principles and technical effects thereof may be further referred to in the related description of the method embodiments.
It should be understood that the division of the units of the apparatus shown in fig. 7 is only a logical division; in an actual implementation they may be wholly or partially integrated into one physical entity, or may be physically separated. These units may be implemented entirely as software invoked by a processing element, entirely as hardware, or partly as software invoked by a processing element and partly as hardware. For example, the computing unit may be a separate processing element, or may be integrated into a chip of the electronic device; the other units are implemented similarly. In addition, all or part of the units may be integrated together or implemented independently. In implementation, the steps of the above method or units may be completed by hardware integrated logic circuits in a processor element or by software instructions.
For example, the above units may be one or more integrated circuits configured to implement the above methods, such as one or more Application Specific Integrated Circuits (ASICs), one or more Digital Signal Processors (DSPs), or one or more Field Programmable Gate Arrays (FPGAs). For another example, these units may be integrated together and implemented in the form of a System-On-a-Chip (SOC).
The application also provides an information recorder, which includes a storage medium and a central processing unit. The storage medium may be a non-volatile storage medium storing a computer-executable program; the central processing unit is connected to the non-volatile storage medium and executes the computer-executable program to implement the methods provided by the embodiments shown in fig. 2 to fig. 6 of the present application.
An embodiment of the present application further provides a computer-readable storage medium, in which a computer program is stored, and when the computer program runs on a computer, the computer is enabled to execute the method provided by the embodiment shown in fig. 2 to 6 of the present application.
Embodiments of the present application further provide a computer program product, which includes a computer program, when the computer program runs on a computer, the computer executes the method provided by the embodiments shown in fig. 2 to fig. 6 of the present application.
In the embodiments of the present application, "at least one" means one or more, and "a plurality" means two or more. "And/or" describes the association relationship of associated objects and indicates that three relationships may exist; for example, "A and/or B" may mean that A exists alone, that both A and B exist, or that B exists alone, where A and B may each be singular or plural. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship. "At least one of the following" and similar expressions refer to any combination of the items, including any combination of single or plural items. For example, "at least one of a, b and c" may represent: a; b; c; a and b; a and c; b and c; or a, b and c; where each of a, b and c may be singular or plural.
Those of ordinary skill in the art will appreciate that the various elements and algorithm steps described in connection with the embodiments disclosed herein can be implemented as electronic hardware, computer software, or a combination of the two. Whether such functionality is implemented as hardware or software depends upon the particular application and the design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, any function, if implemented in the form of a software functional unit and sold or used as a separate product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The above description is only for the specific embodiments of the present application, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present disclosure, and all the changes or substitutions should be covered by the protection scope of the present application. The protection scope of the present application shall be subject to the protection scope of the claims.

Claims (14)

1. An information processing method applied to an information recorder is characterized by comprising the following steps:
determining a first number of people, an image moving speed and an image moving acceleration according to images captured in a first time period, wherein the image moving speed represents the moving speed of the information recorder through the change of the images, and the image moving acceleration is used for representing the moving acceleration of the information recorder through the change of the images;
calculating a target emergency situation evaluation value according to the first number of people, the image moving speed and the image moving acceleration, wherein the target emergency situation evaluation value is used for representing the situation emergency degree; the first number of people, the image moving speed and the image moving acceleration are positively correlated with the target emergency situation evaluation value respectively;
and executing a processing plan corresponding to the target emergency situation evaluation value, wherein the processing plan comprises a storage plan of the information recorder.
2. The method of claim 1, further comprising:
determining a first image information entropy according to the image captured in the first time period; the first image information entropy represents the moving speed of the information recorder through the degree of blur of the image;
correspondingly, the target emergency situation evaluation value is calculated according to the first image information entropy, and the first image information entropy is inversely related to the target emergency situation evaluation value.
3. The method of claim 1 or 2, further comprising:
determining a first abnormal degree value of the movement acceleration of the information recorder according to the movement acceleration curve of the information recorder in the first time period; the moving acceleration curve is detected by an acceleration sensor of the information recorder; the first abnormal degree value represents the moving speed of the information recorder through the abnormal degree of the moving acceleration of the information recorder;
correspondingly, the target emergency situation evaluation value is calculated according to the first abnormal degree value, and the first abnormal degree value is positively correlated with the target emergency situation evaluation value.
4. The method of any of claims 1 to 3, further comprising:
determining a first sound intensity value of the first audio signal and/or a keyword hit rate of text information corresponding to the first audio signal according to the first audio signal picked up in the first time period; the first sound intensity value is used for representing the sound intensity of the environment where the information recorder is located; the keyword hit rate is used for representing the probability of the preset keywords appearing in the sound of the environment where the information recorder is located;
correspondingly, the target emergency situation evaluation value is further calculated according to the first sound intensity value and/or the keyword hit rate, and the first sound intensity value and the keyword hit rate are respectively positively correlated with the target emergency situation evaluation value.
5. The method of claim 4, wherein, before the determining of the first number of people, the image moving speed and the image moving acceleration, the method further comprises:
calculating a first emergency situation evaluation value according to the average sound intensity of a second audio signal picked up by the information recorder in a second time period and the keyword hit rate of the text information corresponding to the second audio signal; the average sound intensity of the second audio signal and the keyword hit rate of the text information corresponding to the second audio signal are positively correlated with the first emergency situation evaluation value respectively;
and judging that the first emergency situation evaluation value exceeds a first evaluation threshold value.
6. The method according to any one of claims 1 to 5, wherein the executing the processing plan corresponding to the target emergency situation assessment value comprises:
when the target emergency situation evaluation value reaches a segmentation threshold value, segmenting the video data of the information recorder;
and when the target emergency situation evaluation value is reduced to be below the segmentation threshold value, segmenting the video data of the information recorder again, and marking the video segment obtained between the two segmentations.
7. The method according to any one of claims 1 to 5, wherein the executing the processing plan corresponding to the target emergency situation assessment value comprises:
determining a preset interval where the target emergency situation evaluation value is located; the preset interval is obtained by dividing the value range of the target emergency situation evaluation value according to at least one preset threshold value;
and executing the determined preset processing plan corresponding to the preset interval.
8. The method of any of claims 1 to 5, wherein determining the first person number from the images captured during the first time period comprises:
for each image captured in the first time period, calculating the number of people in the image;
and calculating a median or an average value according to the number of people in the image captured in the first time period to obtain the first number of people.
9. The method of any one of claims 1 to 5, wherein determining the image movement speed from the image captured during the first time period comprises:
respectively calculating the image moving speed corresponding to the two adjacent images for the image captured in the first time period;
and calculating a median or an average value according to the image moving speeds corresponding to the two adjacent images to obtain the image moving speed.
10. The method according to any one of claims 2 to 5, wherein the determining the first image information entropy from the images captured in the first time period comprises:
respectively calculating the image information entropy of each gray image according to the gray images of the images captured in the first time period;
and calculating a median or an average value according to the image information entropy of the image captured in the first time period to obtain the first image information entropy.
11. The method of any one of claims 3 to 5, wherein determining a first abnormal degree value of the movement acceleration of the information recorder from the movement acceleration profile of the information recorder over the first time period comprises:
determining the moving acceleration of at least one time point in the first time period according to the moving acceleration curve;
calculating the difference value between the moving acceleration of each time point and the moving acceleration mean value in the second time period;
and calculating a median or an average value according to the difference corresponding to each time point to obtain the first abnormal degree value.
12. An information processing apparatus characterized by comprising:
a first determination unit configured to determine a first number of persons, an image movement speed, and an image movement acceleration from an image captured within a first time period, wherein the image movement speed represents a movement speed of the information recorder by a change in the image, and the image movement acceleration represents a movement acceleration of the information recorder by a change in the image;
a calculating unit configured to calculate a target emergency evaluation value indicating an event emergency degree, based on the first number of people, the image moving speed, and the image moving acceleration; the first number of people, the image moving speed and the image moving acceleration are positively correlated with the target emergency situation evaluation value respectively;
and the execution unit is used for executing a processing plan corresponding to the target emergency situation evaluation value, and the processing plan comprises a storage plan of the information recorder.
13. An information recording apparatus, comprising:
one or more processors; a memory; and one or more computer programs, wherein the one or more computer programs are stored in the memory, the one or more computer programs comprising instructions which, when executed by the apparatus, cause the apparatus to perform the method of any of claims 1 to 11.
14. A computer-readable storage medium, in which a computer program is stored which, when run on a computer, causes the computer to carry out the method according to any one of claims 1 to 11.
CN202011003777.5A 2020-09-22 2020-09-22 Information processing method and device and information recorder Pending CN112150667A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011003777.5A CN112150667A (en) 2020-09-22 2020-09-22 Information processing method and device and information recorder


Publications (1)

Publication Number Publication Date
CN112150667A true CN112150667A (en) 2020-12-29

Family

ID=73896882

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011003777.5A Pending CN112150667A (en) 2020-09-22 2020-09-22 Information processing method and device and information recorder

Country Status (1)

Country Link
CN (1) CN112150667A (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102055959A (en) * 2009-10-29 2011-05-11 株式会社日立制作所 Central monitoring system based on a plurality of monitoring cameras and central monitoring method
CN105430245A (en) * 2014-09-17 2016-03-23 奥林巴斯株式会社 Camera device and image shake correction method
CN105447458A (en) * 2015-11-17 2016-03-30 深圳市商汤科技有限公司 Large scale crowd video analysis system and method thereof
CN106886764A (en) * 2017-02-22 2017-06-23 深圳市深网视界科技有限公司 A kind of panic degree computational methods and device based on deep learning
CN109766762A (en) * 2018-12-15 2019-05-17 深圳壹账通智能科技有限公司 Intelligent emergent processing method, device, computer equipment and storage medium
CN110633593A (en) * 2018-06-21 2019-12-31 北京嘀嘀无限科技发展有限公司 Malignant event prediction method and system
CN110971817A (en) * 2019-11-18 2020-04-07 青岛海信移动通信技术股份有限公司 Monitoring equipment and determination method of monitoring image
CN111063162A (en) * 2019-12-05 2020-04-24 恒大新能源汽车科技(广东)有限公司 Silent alarm method and device, computer equipment and storage medium
CN111600938A (en) * 2020-04-29 2020-08-28 深圳警翼智能科技股份有限公司 Law enforcement data acquisition method and system


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
叶兆言: "《人类的起源》", 31 August 2020 *

Similar Documents

Publication Publication Date Title
CN110291489B (en) Computationally efficient human identification intelligent assistant computer
CN109152542B (en) System and method for detecting when a sensor is worn
CN108304758B (en) Face characteristic point tracking method and device
US20120109862A1 (en) User device and method of recognizing user context
CN110784628A (en) Image data acquisition processing method and system, intelligent camera and server
US10752213B2 (en) Detecting an event and automatically obtaining video data
JP2016503913A (en) Security monitoring system and corresponding alarm triggering method
CN102740121B (en) Be applied to video quality diagnostic control system and the method for video surveillance network
Han et al. GlimpseData: Towards continuous vision-based personal analytics
AU2018200557A1 (en) System and method for driver profiling corresponding to automobile trip
US11514713B2 (en) Face quality of captured images
KR101454644B1 (en) Loitering Detection Using a Pedestrian Tracker
CN112150667A (en) Information processing method and device and information recorder
WO2019132972A2 (en) Smart context subsampling on-device system
CN114125148A (en) Control method of electronic equipment operation mode, electronic equipment and readable storage medium
WO2021199315A1 (en) Monitoring device, monitoring method, and recording medium
US10726270B2 (en) Selecting media from mass social monitoring devices
CN109508703A (en) A kind of face in video determines method and device
CN111651690A (en) Case-related information searching method and device and computer equipment
CN109740525A (en) The determination method and device of object in a kind of video
WO2021199311A1 (en) Monitoring device, monitoring method, and recording medium
JPWO2019188429A1 (en) Moving object management device, moving object management system, moving object management method, and computer program
US20230011547A1 (en) Optimizing continuous media collection
US11785338B1 (en) Initiating content capture based on priority sensor data
JP2019179321A (en) Wearable camera and server

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20201229