CN114419859A - Safety detection method, processor and device for supporting leg and fire fighting truck - Google Patents

Safety detection method, processor and device for supporting leg and fire fighting truck

Info

Publication number
CN114419859A
CN114419859A (application CN202111617737.4A)
Authority
CN
China
Prior art keywords: leg, processor, moving direction, area, determined
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111617737.4A
Other languages
Chinese (zh)
Inventor
熊忆
杨懿
熊顺进
周敏
颜江鲁
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hunan Zoomlion Emergency Equipment Co Ltd
Original Assignee
Hunan Zoomlion Emergency Equipment Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hunan Zoomlion Emergency Equipment Co Ltd filed Critical Hunan Zoomlion Emergency Equipment Co Ltd
Priority to CN202111617737.4A
Publication of CN114419859A
Legal status: Pending

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00 Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/18 Status alarms
    • G08B21/24 Reminder alarms, e.g. anti-loss alarms
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T7/0004 Industrial image inspection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G06T7/246 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30108 Industrial image inspection
    • G06T2207/30164 Workpiece; Machine component

Abstract

The application relates to the field of engineering machinery, and in particular to a support leg safety detection method, a processor, a device, a support leg, a fire fighting truck and a storage medium. The method comprises the following steps: acquiring, via an image acquisition device, a field image of the area where the support leg is located; continuously inputting the field image into a multi-target tracking detection model so as to determine, in real time, the position of each object contained in the image; determining the moving direction of each object according to the change in its position; and generating an alarm prompt when it is determined that an object is moving toward the support leg. In this technical scheme, the processor acquires field images of the area where the support legs are located through the image acquisition device, obtains the position of every object in the images from the multi-target tracking detection model, determines each object's moving direction, and generates the corresponding alarm prompt, thereby averting the danger caused by a moving object approaching the support legs of the fire fighting truck.

Description

Safety detection method, processor and device for supporting leg and fire fighting truck
Technical Field
The application relates to the field of engineering machinery, and in particular to a support leg safety detection method, a processor, a device, a support leg, a fire fighting truck and a storage medium.
Background
When a fire fighting truck operates at a fire scene, its support legs must be extended to support and fix the vehicle body so that the truck can work stably. Because the environment at a fire scene is complex, the operators cannot watch the entire surroundings of the vehicle while the support legs are moving, or several people are needed to watch the operation simultaneously. It is therefore impossible to determine whether a person, animal, vehicle or other moving object is approaching a support leg of the truck, which may result in injury to the person, the animal, or the like.
Disclosure of Invention
The purpose of the embodiments of the application is to provide a support leg safety detection method, a processor, a device, a support leg, a fire fighting truck and a storage medium, which can avert the danger caused by moving objects such as people, animals and vehicles approaching the support legs of the fire fighting truck.
In order to achieve the above object, a first aspect of the present application provides a safety monitoring method for a support leg, comprising:
acquiring, via an image acquisition device, a field image of the area where the support leg is located;
continuously inputting the field image into a multi-target tracking detection model so as to determine, in real time, the position of each object contained in the image;
determining the moving direction of each object according to the change in its position; and
generating an alarm prompt when it is determined that an object is moving toward the support leg.
In one embodiment of the application, determining the moving direction of each object according to the change in its position comprises: determining a previous position and a next position of each object from successively acquired field images; and taking the direction from the previous position to the next position as the moving direction of each object.
In one embodiment of the present application, determining the moving direction of each object according to the change in its position comprises: determining the distance difference between the position of each object and a preset warning area, wherein the preset warning area is set according to the area where the support leg is located; determining that the object is moving toward the preset warning area when the distance difference keeps decreasing; determining that the object is moving away from the preset warning area when the distance difference keeps increasing; and determining that the object is not moving when the distance difference remains unchanged.
In an embodiment of the present application, generating the alarm prompt when it is determined that an object is moving toward the support leg comprises: determining the distance difference between the object and the preset warning area; and generating the alarm prompt when that distance difference is smaller than a preset warning threshold.
In one embodiment of the application, when it is determined that an object is moving toward the support leg, the real-time position of that object is displayed on a display device.
In one embodiment of the present application, when it is determined that no object is moving toward the support leg, the field image is displayed on the display device.
In one embodiment of the present application, the object includes at least one of a human, an animal, and other movable object.
A second aspect of the present application provides a processor configured to perform any one of the above-described safety monitoring methods for a leg.
A third aspect of the application provides a safety monitoring device for a leg comprising a processor as described above.
A fourth aspect of the present application provides a support leg, comprising:
an image acquisition device for acquiring a field image of the region where the support leg is located; and the above safety monitoring device for the support leg.
A fifth aspect of the present application provides a fire fighting vehicle comprising at least one leg as described above.
A sixth aspect of the present application provides a machine-readable storage medium having instructions stored thereon, which when executed by a processor, cause the processor to be configured to perform the safety monitoring method for a leg of any one of the above.
In the above technical scheme, the processor acquires a field image of the area where the support leg is located through the image acquisition device, and obtains from the multi-target tracking detection model the positions of all objects included in that image. The processor determines the moving direction of each object from the detected positions, judges from those directions whether any object is approaching a support leg of the fire fighting truck, and generates an alarm prompt accordingly, thereby averting the danger caused by a moving object approaching the support leg.
Additional features and advantages of embodiments of the present application will be described in detail in the detailed description which follows.
Drawings
The accompanying drawings, which are included to provide a further understanding of the embodiments of the disclosure and are incorporated in and constitute a part of this specification, illustrate embodiments of the disclosure and together with the description serve to explain the embodiments of the disclosure, but are not intended to limit the embodiments of the disclosure. In the drawings:
fig. 1 schematically shows a flow diagram of a safety monitoring method for a leg according to an embodiment of the present application;
FIG. 2 schematically illustrates a block diagram of a leg according to an embodiment of the present application;
fig. 3 schematically shows an internal structure diagram of a computer device according to an embodiment of the present application.
Detailed Description
The following detailed description of embodiments of the present application will be made with reference to the accompanying drawings. It should be understood that the detailed description and specific examples, while indicating the present application, are given by way of illustration and explanation only, and are not intended to limit the present application.
It should be noted that any directional indications referred to in the embodiments of the present application (such as up, down, left, right, front and back) are only used to explain the relative positional relationship, movement and the like between the components in a specific posture (as shown in the drawings); if that specific posture changes, the directional indication changes accordingly.
In addition, if there is a description of "first", "second", etc. in the embodiments of the present application, such description is for descriptive purposes only and is not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. Technical solutions of different embodiments may be combined with each other, but only where a person skilled in the art can realize the combination; when the technical solutions are contradictory or cannot be realized, such a combination should be considered not to exist and falls outside the protection scope of the present application.
As shown in fig. 1, which schematically shows a flow chart of a safety monitoring method for a support leg in an embodiment of the present application, the method comprises the following steps:
step 101, acquiring, via an image acquisition device, a field image of the area where the support leg is located;
step 102, continuously inputting the field image into a multi-target tracking detection model so as to determine, in real time, the position of each object contained in the image;
step 103, determining the moving direction of each object according to the change in its position;
step 104, generating an alarm prompt when it is determined that an object is moving toward the support leg.
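The four steps above amount to a small monitoring loop: track per-object positions across frames, derive each object's motion relative to the leg, and raise an alarm for any object that is closing in. A minimal sketch in Python, assuming the tracking model has already produced per-frame (x, y) positions for each object id; the function names and the planar-coordinate model are illustrative assumptions, not taken from the patent:

```python
import math

def is_moving_toward(positions, leg_pos, eps=1e-6):
    """True if the object's distance to the leg decreases frame over frame."""
    dists = [math.dist(p, leg_pos) for p in positions]
    return all(b < a - eps for a, b in zip(dists, dists[1:]))

def monitor(tracks, leg_pos):
    """tracks: {object_id: [(x, y), ...]} per-frame positions from the tracker.
    Returns the object ids that should trigger an alarm prompt."""
    return [oid for oid, pos in tracks.items()
            if len(pos) >= 2 and is_moving_toward(pos, leg_pos)]
```

For example, an object tracked at (10, 0), (8, 0), (6, 0) with the leg at the origin is flagged, while one drifting outward is not.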
The image acquisition device can capture a field image of the area where the support legs of the fire fighting truck are located. The device may be a video camera or a still camera, i.e. any equipment able to capture video and/or images. The processor acquires this field image through the image acquisition device and inputs it into the multi-target tracking detection model. After receiving the field image transmitted by the processor, the model analyses it, so that the positions of all objects in the field image of the area where the support legs are located are determined in real time.
The processor can acquire the positions of all the objects determined by the multi-target tracking detection model in real time. The processor may determine a variation of the position of each object according to the acquired positions of the respective objects, and determine a moving direction of each object according to the variation of the position of each object. After the processor determines the moving direction of each object, the processor can detect the moving direction of each object, and in the case that the processor determines that the moving direction of the object is towards the direction of the legs of the fire fighting truck, the processor can generate a related alarm prompt.
In one embodiment, determining the moving direction of each object according to the change of the position of each object comprises: determining the previous position and the next position of each object according to successively acquired field images; the moving direction of each object from the previous position to the next position is determined as the moving direction of each object.
The processor can obtain a field image of the area where the supporting legs of the fire fighting truck are located through the image acquisition device, and input the field image of the area where the supporting legs of the fire fighting truck are located into the multi-target tracking detection model so as to determine the positions of all objects included in the field image of the area where the supporting legs of the fire fighting truck are located in real time.
The processor can successively and repeatedly obtain field images of the area where the supporting leg of the fire fighting truck is located through the image acquisition device, and input the images into the multi-target tracking detection model, and the multi-target tracking detection model can successively output the position of each object according to the sequence of successively inputting the images by the processor. Therefore, the processor can acquire the position of each object output by the multi-target tracking detection model in sequence. The processor determines a moving direction of each object according to the determined previous position and the determined next position of each object, wherein the moving direction of each object is the moving direction of each object from the previous position to the next position.
For example, the processor obtains a field image of an area where the supporting legs of the fire fighting truck are located through the image acquisition device, inputs the image into the multi-target tracking detection model, and the multi-target tracking detection model outputs a first position of each object according to the image input by the processor, and assumes that the first position of the first object output by the multi-target tracking detection model is a point A. And the processor obtains the field image of the area where the supporting legs of the fire fighting truck are located again through the image acquisition device, inputs the image into the multi-target tracking detection model, and the multi-target tracking detection model outputs the second position of each object according to the image input by the processor, wherein the second position of the first object output by the multi-target tracking detection model is assumed to be a point B. The processor may determine that the former position of the first object is point a and the latter position of the first object is point B. The processor may determine a moving direction from the point a to the point B as a moving direction of the first object.
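The A-to-B construction in this example can be written directly: the moving direction is the normalized displacement from the previous position to the next one. A hedged sketch, with coordinates assumed to be planar points in image or ground space:

```python
import math

def moving_direction(prev, nxt):
    """Unit vector pointing from the previous position (point A)
    to the next position (point B); (0, 0) if the object did not move."""
    dx, dy = nxt[0] - prev[0], nxt[1] - prev[1]
    norm = math.hypot(dx, dy)
    if norm == 0.0:
        return (0.0, 0.0)
    return (dx / norm, dy / norm)
```

An object moving from (4, 0) to (2, 0) yields direction (-1.0, 0.0), i.e. straight toward the origin along the x axis.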
In one embodiment, determining the moving direction of each object according to the change of the position of each object comprises: determining a distance difference between the position of each object and a preset warning area, wherein the preset warning area is determined according to the area where the supporting legs are located; under the condition that the distance difference value is continuously reduced, determining that the moving direction of the object moves towards the preset warning area; under the condition that the distance difference value is continuously increased, determining that the moving direction of the object is towards moving away from the preset warning area; and under the condition that the distance difference value is kept unchanged, determining that the object does not move.
The processor is provided with a preset warning area of the support legs of the fire fighting truck, and the preset warning area is set according to the area where the support legs of the fire fighting truck are located, so that the safety of people or animals passing outside the preset warning area is guaranteed. The processor can determine the distance difference between the position of each object and a preset warning region set by the processor according to the position of each object output by the multi-target tracking detection model. The processor obtains the field images of the area where the supporting legs are located for multiple times, and obtains the position of each object through the multi-target tracking detection model for multiple times, so that the distance difference between the position of the object output each time and the preset warning area is determined. In the case where the processor determines that the distance difference between the position of the object and the preset alert zone is continuously decreased, the processor may determine that the moving direction of the object is toward the preset alert zone. In the case where the processor determines that the difference in distance between the position of the object and the preset alert zone is increasing, the processor may determine that the moving direction of the object is moving away from the preset alert zone. The processor may determine that the object has not moved when the processor determines that the difference in distance between the position of the object and the preset alert zone remains constant.
For example, the processor obtains a field image of the area where the support legs of the fire fighting truck are located through the image acquisition device and inputs it into the multi-target tracking detection model, which outputs a first position for each object. Assuming the first position of a first object is point A, the processor can determine a distance difference A between point A and the preset warning area. The processor then obtains the field image again, inputs it into the model, and the model outputs a second position for each object; assume the second position of the first object is point B. The processor can determine a distance difference B between point B and the preset warning area. The processor compares the two: if distance difference B is smaller than distance difference A, the distance between the position of the first object and the preset warning area is decreasing, and the processor can determine that the first object is moving toward the preset warning area. If distance difference B is larger than distance difference A, that distance is increasing, and the processor can determine that the first object is moving away from the preset warning area.
If distance difference B equals distance difference A, the distance between the position of the first object and the preset warning area is unchanged, and the processor can determine that the first object has not moved.
In one embodiment, generating the alarm prompt when it is determined that an object is moving toward the leg comprises: determining the distance difference between the object and the preset warning area; and generating the alarm prompt when the distance difference is smaller than a preset warning threshold.
When the processor determines that an object is moving toward the support legs, it can determine the position of that object through the multi-target tracking detection model and, from that position, the distance difference between the object and the preset warning area. The processor may generate an alarm prompt when this distance difference is smaller than the preset warning threshold set by the processor.
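The alarm rule itself is a single comparison against the warning threshold. The sketch below models the warning area as a circle of radius `zone_radius` around the leg; the circular shape and the parameter names are assumptions for illustration, as the patent does not fix the zone's geometry:

```python
import math

def distance_to_warning_area(pos, leg_pos, zone_radius):
    """Distance from the object to the edge of a circular warning area
    centred on the leg (0.0 if the object is already inside it)."""
    return max(0.0, math.dist(pos, leg_pos) - zone_radius)

def should_alarm(moving_toward_leg, pos, leg_pos, zone_radius, threshold):
    """Alarm only when the object is approaching AND closer than the threshold."""
    return moving_toward_leg and distance_to_warning_area(pos, leg_pos, zone_radius) < threshold
```

An object 5 m from the leg with a 2 m warning area is 3 m from the zone edge, so a 4 m threshold triggers the alarm only if it is also moving toward the leg.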
In one embodiment, when it is determined that an object is moving toward the leg, the real-time position of that object is displayed on the display device.
The processor can determine the positions of all objects in the field image of the region where the supporting legs are located through the multi-target tracking detection model, and determine the moving direction of the objects according to the positions of the objects obtained for multiple times. In the case that the processor determines that the object moving towards the direction of the leg exists in the live image of the area where the leg is located, the processor may present the real-time position of the object moving towards the direction of the leg in the live image of the area where the leg is located on the display device. That is, the processor may present the real-time position of the object moving in the direction of the leg in the live image of the area where the leg is located on the display device and generate the corresponding safety prompt.
In one embodiment, the field image is displayed on the display device when it is determined that no object is moving toward the leg.
When the processor determines that no object is moving toward the support legs, it can display on the display device the field image of the area where the support legs of the fire fighting truck are located, as captured by the image acquisition device. The display device may be one installed in the fire fighting truck.
For example, suppose a fire fighting truck is operating at a fire and its support legs must be extended to secure the vehicle body. While the support legs are moving, the truck's image acquisition device captures a field image of the area where the support legs are located. After acquiring this image through the image acquisition device, the processor inputs it into the multi-target tracking detection model. Upon receiving the image of the surroundings of the support legs, the model determines the positions of all objects included in the field image, where an object may be at least one of a person, an animal and another movable object. The processor can then determine, from each position output by the model, the distance between each object and the preset warning area of the truck. When the processor determines from those distances that no object is moving toward the support legs, it can conclude that there is no danger of a moving object getting too close to the support legs, and can display on the truck's display device the field image captured by the image acquisition device.
In one embodiment, the object includes at least one of a human, an animal, or other movable object.
The multi-target tracking detection model can determine the positions of all objects according to the field images of the areas where the legs of the fire fighting truck are located, wherein the objects can be at least one of people, animals and other movable objects. Since the above-mentioned objects may move and may cause danger due to being too close to the legs, they may be determined as objects that need to be detected so as not to cause danger due to being too close to the legs.
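Restricting detection to people, animals and other movable objects can be done by filtering the per-detection class labels the tracking model emits. The label set below is a placeholder; the real set of classes depends on how the multi-target tracking detection model was trained:

```python
# Placeholder label set; the real labels depend on the model's training data.
MONITORED_CLASSES = {"person", "animal", "vehicle"}

def filter_detections(detections):
    """Keep only detections whose class is a movable object we must watch.
    detections: list of (class_label, (x, y)) pairs from the tracker."""
    return [(label, pos) for label, pos in detections if label in MONITORED_CLASSES]
```

Static scene elements (e.g. a hydrant) are dropped before any direction analysis, so they can never raise a false alarm.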
In one embodiment, a processor configured to perform a safety monitoring method for a leg according to the above is provided.
The image acquisition device can capture a field image of the area where the support legs of the fire fighting truck are located; the device may be a video camera or a still camera. The processor acquires this field image through the image acquisition device and inputs it into the multi-target tracking detection model. After receiving the field image transmitted by the processor, the model analyses it, so that the positions of all objects in the field image of the area where the support legs are located are determined in real time.
The processor can acquire the positions of all the objects determined by the multi-target tracking detection model in real time. The processor may determine a variation of the position of each object according to the acquired positions of the respective objects, and determine a moving direction of each object according to the variation of the position of each object. After the processor determines the moving direction of each object, the processor can detect the moving direction of each object, and in the case that the processor determines that the moving direction of the object is towards the direction of the legs of the fire fighting truck, the processor can generate a related alarm prompt.
The processor can obtain a field image of the area where the supporting legs of the fire fighting truck are located through the image acquisition device, and input the field image of the area where the supporting legs of the fire fighting truck are located into the multi-target tracking detection model so as to determine the positions of all objects included in the field image of the area where the supporting legs of the fire fighting truck are located in real time.
The processor can successively and repeatedly obtain field images of the area where the supporting leg of the fire fighting truck is located through the image acquisition device, and input the images into the multi-target tracking detection model, and the multi-target tracking detection model can successively output the position of each object according to the sequence of successively inputting the images by the processor. Therefore, the processor can acquire the position of each object output by the multi-target tracking detection model in sequence. The processor determines a moving direction of each object according to the determined previous position and the determined next position of each object, wherein the moving direction of each object is the moving direction of each object from the previous position to the next position.
For example, the processor obtains a field image of an area where the supporting legs of the fire fighting truck are located through the image acquisition device, inputs the image into the multi-target tracking detection model, and the multi-target tracking detection model outputs a first position of each object according to the image input by the processor, and assumes that the first position of the first object output by the multi-target tracking detection model is a point A. And the processor obtains the field image of the area where the supporting legs of the fire fighting truck are located again through the image acquisition device, inputs the image into the multi-target tracking detection model, and the multi-target tracking detection model outputs the second position of each object according to the image input by the processor, wherein the second position of the first object output by the multi-target tracking detection model is assumed to be a point B. The processor may determine that the former position of the first object is point a and the latter position of the first object is point B. The processor may determine a moving direction from the point a to the point B as a moving direction of the first object.
The processor may also determine the moving direction of each object from the change in the distance difference between the object's position and a preset warning area. The preset warning area is configured in the processor according to the area where the supporting legs of the fire fighting truck are located, so that people or animals passing outside it remain safe. From each position output by the multi-target tracking detection model, the processor determines the distance difference between the object and the preset warning area. Because the processor obtains live images of the area where the supporting legs are located multiple times, it obtains each object's position multiple times and hence a sequence of distance differences. If the distance difference between an object's position and the preset warning area decreases continuously, the processor may determine that the object is moving toward the preset warning area. If it increases continuously, the processor may determine that the object is moving away from the preset warning area. If it remains unchanged, the processor may determine that the object has not moved.
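The distance-difference classification can be sketched as follows (the distance sequence is a hypothetical stand-in for values the processor derives from the model's position outputs; the tolerance `eps` is an added assumption to absorb measurement noise):

```python
def classify_motion(distances, eps=1e-6):
    """Classify an object's motion relative to the preset warning area
    from successive distance differences, one per captured image."""
    diffs = [b - a for a, b in zip(distances, distances[1:])]
    if diffs and all(d < -eps for d in diffs):
        return "toward"       # distance continuously decreasing
    if diffs and all(d > eps for d in diffs):
        return "away"         # distance continuously increasing
    if all(abs(d) <= eps for d in diffs):
        return "stationary"   # distance unchanged
    return "undetermined"     # mixed readings: no direction decided yet
```

The "undetermined" branch is an addition beyond the three cases in the text: with real measurements a sequence may be neither monotone nor constant, and deferring the decision to the next frame is one reasonable choice.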
When the processor determines that an object is moving toward the supporting legs, it can continue to determine the object's position through the multi-target tracking detection model and determine, from each position obtained, the distance difference between the object and the preset warning area. The processor may generate an alarm prompt when this distance difference falls below a preset warning threshold.
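The alarm condition can be sketched as a small stateful check over one object's successive distance readings (the class name and the `update` interface are illustrative assumptions):

```python
class LegAlarm:
    """Tracks one object's distance difference to the preset warning
    area across frames and signals an alarm when the object is
    approaching and the distance is below the warning threshold."""

    def __init__(self, warn_threshold):
        self.warn_threshold = warn_threshold
        self.prev_distance = None

    def update(self, distance):
        # Approaching means the current reading is smaller than the last.
        approaching = (self.prev_distance is not None
                       and distance < self.prev_distance)
        self.prev_distance = distance
        return approaching and distance < self.warn_threshold
```

An object that is approaching but still outside the threshold produces no alarm; the alarm fires only once both conditions hold in the same frame.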
The processor can determine the positions of all objects in the live image of the area where the supporting legs are located through the multi-target tracking detection model, and determine each object's moving direction from the positions obtained over multiple frames. When the processor determines that some object in the live image is moving toward the supporting legs, it may present that object's real-time position in the live image on the display device and generate a corresponding safety prompt.
When the processor determines that an object is moving toward the supporting legs, it can display the live image of the area where the supporting legs of the fire fighting truck are located, as captured by the image acquisition device, on the display device, which may be a display installed in the fire fighting truck. The object may be at least one of a human, an animal, or another movable object. Because such objects can move and may be endangered by coming too close to the supporting legs, they are treated as objects to be detected.
For example, assume a fire fighting truck is operating at a fire and needs to extend its supporting legs to secure the vehicle body. When the supporting legs are deployed, the image acquisition device of the fire fighting truck captures a live image of the area where they are located. The processor inputs this image into the multi-target tracking detection model, which outputs a first position for each object; assume the first position of a first object is point A, and the processor determines a distance difference A between point A and the preset warning area. The processor then obtains another live image of the same area, inputs it into the model, and the model outputs a second position for each object; assume the second position of the first object is point B. The processor determines a distance difference B between point B and the preset warning area. The processor compares the two values: if distance difference B is smaller than distance difference A, the distance between the first object and the preset warning area is decreasing, and the processor may determine that the first object is moving toward the preset warning area.
If distance difference B is greater than distance difference A, the distance between the first object and the preset warning area is increasing, and the processor may determine that the first object is moving away from the preset warning area. If distance difference B equals distance difference A, the distance remains unchanged, and the processor may determine that the first object has not moved.
Having determined from distance differences A and B that the first object is moving toward the preset warning area, the processor can track the first object's position in real time through the multi-target tracking detection model. When the distance difference between the first object's position and the preset warning area falls below the preset warning threshold, the processor may generate an alarm prompt, warning the operator that an object is approaching the supporting legs of the fire fighting truck and warning the first object not to come closer, so as to avoid danger. Likewise, when the processor determines that the first object is moving toward the supporting legs, it may display the live image captured by the image acquisition device, together with the first object's real-time position, on the display device of the fire fighting truck, thereby alerting the relevant operators.
In the above technical scheme, the processor acquires live images of the area where the supporting leg is located through the image acquisition device and obtains, through the multi-target tracking detection model, the positions of all objects contained in those images. By examining the successive positions, the processor determines each object's moving direction and judges whether the object is approaching the supporting legs of the fire fighting truck, generating an alarm prompt when it is, thereby preventing danger caused by a moving object approaching the supporting legs. In particular, the processor can determine the previous and subsequent positions of each object from successively acquired images and take the direction from the previous position to the subsequent position as the object's moving direction, or it can determine whether the object is moving toward the preset warning area from the change in the distance difference between the object's position and that area. The processor then decides from the corresponding result whether an alarm prompt needs to be issued.
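The overall scheme summarized above can be sketched end to end (the per-frame dictionaries are a hypothetical stand-in for the model's outputs, with distance differences to the preset warning area already derived from the positions; the function name and interface are assumptions):

```python
def monitor_supporting_leg(frames, warn_threshold):
    """Return the ids of objects that trigger an alarm prompt.

    frames: list of dicts, one per captured live image, each mapping an
    object id to its distance difference from the preset warning area.
    An object alarms when its distance is decreasing (moving toward the
    legs) and has fallen below the warning threshold.
    """
    prev = {}       # last seen distance per object id
    alarms = set()  # object ids that have triggered an alarm
    for positions in frames:
        for obj_id, dist in positions.items():
            approaching = obj_id in prev and dist < prev[obj_id]
            if approaching and dist < warn_threshold:
                alarms.add(obj_id)
            prev[obj_id] = dist
    return alarms
```

For instance, an object whose distance readings run 5.0, 4.0, 1.0 against a threshold of 2.0 alarms on the third frame, while an object drifting away never does.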
In one embodiment, a safety monitoring device for a leg is provided, comprising the processor described above.
In one embodiment, as shown in fig. 2, a block diagram of a leg 200 of the present application is schematically illustrated, including: the image acquisition device 201 is used for acquiring a field image of the area where the supporting leg 200 is located; and the safety monitoring device 202 for the legs described above.
In one embodiment, a fire fighting vehicle is provided, comprising at least one leg as described above.
The processor comprises a kernel, and the kernel calls the corresponding program unit from the memory. One or more kernels may be provided, and the safety monitoring method for the supporting leg is implemented by adjusting kernel parameters.
The memory may include volatile memory in a computer-readable medium, such as random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM); the memory includes at least one memory chip.
An embodiment of the present application provides a storage medium, on which a program is stored, and the program, when executed by a processor, implements the above-mentioned safety monitoring method for a leg.
The embodiment of the invention provides a processor, wherein the processor is used for running a program, and the safety monitoring method for supporting legs is executed when the program runs.
In one embodiment, a computer device is provided, which may be a server, and whose internal structure may be as shown in fig. 3. The computer device includes a processor A01, a network interface A02, a memory (not shown), and a database (not shown) connected by a system bus. The processor A01 of the computer device provides computing and control capabilities. The memory of the computer device comprises an internal memory A03 and a non-volatile storage medium A04. The non-volatile storage medium A04 stores an operating system B01, a computer program B02, and a database (not shown in the figure). The internal memory A03 provides an environment for the operation of the operating system B01 and the computer program B02 in the non-volatile storage medium A04. The database of the computer device stores the live images of the area where the supporting legs are located, acquired by the image acquisition device, and the data of the multi-target tracking detection model. The network interface A02 of the computer device communicates with an external terminal through a network connection. The computer program B02 is executed by the processor A01 to implement the safety monitoring method for a leg.
Those skilled in the art will appreciate that the architecture shown in fig. 3 is merely a block diagram of some of the structures associated with the disclosed aspects and is not intended to limit the computing devices to which the disclosed aspects apply; a particular computing device may include more or fewer components than those shown, combine certain components, or have a different arrangement of components.
Fig. 1 is a schematic flow chart of a safety monitoring method for a leg according to an embodiment. It should be understood that, although the steps in the flowchart of fig. 1 are shown in the order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated otherwise herein, the steps are not strictly limited in order and may be performed in other orders. Moreover, at least a portion of the steps in fig. 1 may include multiple sub-steps or stages that are not necessarily performed at the same moment but may be performed at different moments, and not necessarily in sequence; they may be performed in turn or alternately with other steps or with at least a portion of the sub-steps or stages of other steps.
The embodiment of the invention provides a device comprising a processor, a memory, and a program stored on the memory and runnable on the processor, wherein the processor, when executing the program, implements the following steps: acquiring a live image, captured by the image acquisition device, of the area where the supporting leg is located; continuously inputting the live image into the multi-target tracking detection model, so as to determine in real time, through the multi-target tracking detection model, the position of each object contained in the live image; determining the moving direction of each object according to the change in the position of each object; and generating an alarm prompt when the moving direction of an object is determined to be towards the supporting leg.
In one embodiment, the previous position and the subsequent position of each object are determined from successively acquired live images, and the direction in which each object moves from its previous position to its subsequent position is determined as the object's moving direction.
In one embodiment, determining the moving direction of each object according to the change in the position of each object comprises: determining a distance difference between the position of each object and a preset warning area, wherein the preset warning area is determined according to the area where the supporting legs are located; determining that the object is moving toward the preset warning area when the distance difference continuously decreases; determining that the object is moving away from the preset warning area when the distance difference continuously increases; and determining that the object does not move when the distance difference remains unchanged.
In one embodiment, generating the alarm prompt when it is determined that an object is moving in the direction of the leg comprises: determining the distance difference between the object and the preset warning area when the object's moving direction is determined to be towards the leg; and generating an alarm prompt when the distance difference is smaller than a preset warning threshold.
In one embodiment, when it is determined that an object is moving in the direction of the leg, the real-time position of that object is displayed through the display device.
In one embodiment, when it is determined that no object is moving in the direction of the leg, the live image is displayed through the display device.
In one embodiment, the object includes at least one of a human, an animal, or other movable object.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include forms of volatile memory in a computer readable medium, Random Access Memory (RAM) and/or non-volatile memory, such as Read Only Memory (ROM) or flash memory (flash RAM). The memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
The above are merely examples of the present application and are not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.

Claims (12)

1. A safety monitoring method for a leg, comprising:
acquiring a field image of an area where the supporting leg is located, wherein the field image is acquired by image acquisition equipment;
continuously inputting the field image into a multi-target tracking detection model so as to determine the position of each object contained in the field image in real time through the multi-target tracking detection model;
determining the moving direction of each object according to the change condition of the position of each object;
and generating an alarm prompt when the moving direction of the object is determined to be towards the direction of the support leg.
2. The safety monitoring method for the supporting leg according to claim 1, wherein the determining the moving direction of each object according to the position change of each object comprises:
determining the previous position and the next position of each object according to successively acquired field images;
the moving direction of each object from the previous position to the next position is determined as the moving direction of each object.
3. The safety monitoring method for the supporting leg according to claim 1, wherein the determining the moving direction of each object according to the position change of each object comprises:
determining a distance difference between the position of each object and a preset warning area, wherein the preset warning area is determined according to the area where the supporting legs are located;
under the condition that the distance difference value is continuously reduced, determining that the moving direction of the object is towards the preset warning area;
under the condition that the distance difference value is continuously increased, determining that the moving direction of the object is moving away from the preset warning area;
determining that the object does not move if the distance difference remains unchanged.
4. The safety monitoring method for the supporting leg according to claim 2 or 3, wherein the generating of the alarm prompt in the case that the moving direction of the object is determined to be moving towards the supporting leg comprises:
under the condition that the moving direction of the object is determined to be towards the direction of the supporting legs, the distance difference between the object and a preset warning area is determined;
and generating an alarm prompt under the condition that the distance difference is smaller than a preset warning threshold value.
5. The safety monitoring method for a support leg of claim 1, further comprising:
and displaying the real-time position of the object moving towards the direction of the support leg through a display device under the condition that the moving direction of the object is determined to be towards the direction of the support leg.
6. The safety monitoring method for a support leg of claim 1, further comprising:
displaying, by the display device, the live image in a case where it is determined that no object is moving in the direction of the leg.
7. The safety monitoring method for a leg of claim 1, wherein the object comprises at least one of a human, an animal, or other movable object.
8. A processor configured to perform the safety monitoring method for a leg according to any one of claims 1 to 7.
9. A safety monitoring device for a leg, comprising the processor of claim 8.
10. A leg, comprising:
the image acquisition equipment is used for acquiring a field image of the area where the supporting leg is located; and
a safety monitoring device for a support leg as claimed in claim 9.
11. A fire fighting vehicle characterized by comprising at least one leg according to claim 10.
12. A machine-readable storage medium having instructions stored thereon which, when executed by a processor, cause the processor to perform the safety monitoring method for a leg according to any of claims 1 to 7.
CN202111617737.4A 2021-12-27 2021-12-27 Safety detection method, processor and device for supporting leg and fire fighting truck Pending CN114419859A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111617737.4A CN114419859A (en) 2021-12-27 2021-12-27 Safety detection method, processor and device for supporting leg and fire fighting truck


Publications (1)

Publication Number Publication Date
CN114419859A true CN114419859A (en) 2022-04-29

Family

ID=81270327

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111617737.4A Pending CN114419859A (en) 2021-12-27 2021-12-27 Safety detection method, processor and device for supporting leg and fire fighting truck

Country Status (1)

Country Link
CN (1) CN114419859A (en)


Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE202007002115U1 (en) * 2006-12-30 2007-05-31 Sany Heavy Industry Co., Ltd. Concrete pump vehicle e.g. support arrangement for concrete pump vehicle, has axis of rotation of front supporting legs which are arranged near rotating mechanism
CN103454639A (en) * 2012-05-31 2013-12-18 现代自动车株式会社 Apparatus and method for detecting moving-object around vehicle
CN106314266A (en) * 2016-09-07 2017-01-11 长沙中联消防机械有限公司 The warning control method of vehicle, control device, control system and fire fighting truck
CN109243150A (en) * 2018-09-30 2019-01-18 深圳市金豪泰科技有限公司 A kind of vehicle early warning method and terminal
CN209183003U (en) * 2018-10-22 2019-07-30 中国水利水电第七工程局有限公司 A kind of beam car deviation correcting alarm device
US20190379864A1 (en) * 2016-11-29 2019-12-12 Lumileds Llc Vehicle surveillance
CN111738240A (en) * 2020-08-20 2020-10-02 江苏神彩科技股份有限公司 Region monitoring method, device, equipment and storage medium
CN113807163A (en) * 2021-07-28 2021-12-17 中科云谷科技有限公司 Method for placing support legs of pump truck, device for placing support legs of pump truck and storage medium


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117392616A (en) * 2023-12-13 2024-01-12 深圳鲲云信息科技有限公司 Method and device for identifying supervision behaviors of garbage throwing, electronic equipment and medium
CN117392616B (en) * 2023-12-13 2024-04-02 深圳鲲云信息科技有限公司 Method and device for identifying supervision behaviors of garbage throwing, electronic equipment and medium

Similar Documents

Publication Publication Date Title
CN110494861B (en) Image-based anomaly detection method and system
CN106959696B (en) Control method and device for moving target
EP2685421A1 (en) Determining objects present in a process control system
US20170280026A1 (en) Image Processing Apparatus and Image Processing Method
CN109214258B (en) Method and device for detecting illegal driving of non-driving personnel
JP2009545457A (en) Monitoring method and apparatus using camera for preventing collision of machine
KR20140004291A (en) Forward collision warning system and forward collision warning method
JP6841608B2 (en) Behavior detection system
CN114419859A (en) Safety detection method, processor and device for supporting leg and fire fighting truck
CN111488835A (en) Method and device for identifying fellow persons
CN110839051B (en) Service providing method, device, robot and storage medium
CN108665663B (en) Monitoring method, air conditioner and computer readable storage medium
CN114432636B (en) Method for identifying and positioning dangerous objects by using intelligent fire fighting truck and intelligent fire fighting truck
CN110930437B (en) Target tracking method and device
CN112802100A (en) Intrusion detection method, device, equipment and computer readable storage medium
CN116052103A (en) Method, device, computer equipment and storage medium for processing monitoring data
CN114368693B (en) Anti-collision method and device for arm support, processor and crane
CN115419839A (en) Monitoring method, controller, device, system and storage medium for pipeline
CN114432635B (en) Method for identifying and positioning fire source of intelligent fire truck and intelligent fire truck
CN109142354B (en) System, method and device for acquiring product images on production line
CN113997943A (en) Automatic driving vehicle control method, equipment and medium based on semantic clustering
JP7176429B2 (en) Gas leak localization system and gas leak localization program
CN114445858A (en) Method for identifying and positioning personnel for intelligent fire truck and intelligent fire truck
CN113379830A (en) Anti-collision method and device, storage medium and electronic equipment
CN112863096A (en) Monitoring method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination