CN114449217A - Equipment and method for dynamically monitoring video of shipborne retraction equipment based on machine vision - Google Patents


Info

Publication number
CN114449217A
CN114449217A (Application CN202111599690.3A)
Authority
CN
China
Prior art keywords
equipment
machine vision
dynamic video
chain
laser radar
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111599690.3A
Other languages
Chinese (zh)
Inventor
王俊雷
姜晔明
顾海东
刘文峰
林尚飞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
715th Research Institute of CSIC
Original Assignee
715th Research Institute of CSIC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 715th Research Institute of CSIC filed Critical 715th Research Institute of CSIC
Priority to CN202111599690.3A
Publication of CN114449217A
Legal status: Pending

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/86Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/76Television signal recording

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • General Physics & Mathematics (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The invention belongs to the field of image processing and moving-target identification, and specifically discloses a machine-vision-based device and method for dynamic video monitoring of shipborne retraction equipment. It addresses the problem that, during operation of such equipment, conventional sensors cannot monitor the motion state of key components, particularly flexible bodies; it also reduces the operating burden on users and improves the intelligence level of the retraction equipment. The device comprises the control system of the retraction equipment, a laser radar, a monitoring suite, a network switch, a host, and supporting software, all connected via Ethernet. Based on machine-vision technology, the invention uses the point-cloud data of the laser radar and the image data of the camera, sets regions of interest (ROIs) in the laser and visual perception areas for user-defined area detection, and thereby achieves dynamic monitoring of the motion process of the retraction equipment.

Description

Equipment and method for dynamically monitoring video of shipborne retraction equipment based on machine vision
Technical Field
The invention belongs to the field of image processing and moving-target identification, and specifically relates to a machine-vision-based device and method for dynamic video monitoring of shipborne retraction equipment.
Background
At present, shipborne equipment such as imaging sonar is widely used in marine resource exploration. The retraction equipment is responsible for deploying and recovering the detection device and is an important component of the shipborne detection system. Most existing shipborne retraction equipment is automatically controlled, but state monitoring still relies largely on manual judgment, built-in sensors, and video surveillance. Manual judgment places high demands on personnel and invisibly increases operator workload; built-in sensors mostly monitor parameters such as displacement, speed, pressure, and flow, and cannot monitor the motion state of key components (particularly flexible bodies); and in most current applications video surveillance merely presents pictures to the operator, so judgment and decision-making still depend on the operator, demands on users remain high, and the intelligence level of the equipment is insufficient.
Disclosure of Invention
To solve these problems, the invention provides a machine-vision-based device and method for dynamic video monitoring of shipborne retraction equipment. By identifying and analyzing the collected radar point-cloud data, video image data, and the like, it realizes functions such as dynamic monitoring and fault alarming for the shipborne retraction equipment, and improves the reliability and intelligence level of the equipment.
The invention provides the following technical scheme:
a dynamic video monitoring device of shipborne retraction equipment based on machine vision is disclosed, wherein the retraction equipment comprises a laser radar, a monitoring suite, a network switch, a host and matched software, the host is communicated with a control system arranged in the retraction equipment in real time, and an image recognition result is shared and used by the control system of the retraction equipment; the monitoring suite comprises a camera and an NVR recorder, and the matched software is used for processing and analyzing point cloud data provided by the laser radar and image data provided by the camera, generating a detection result required by a user and providing a human-computer interaction interface.
Preferably, the overall architecture of the supporting software is divided into four layers, arranged from bottom to top: physical sensors, software function drivers, software function implementation, and human-computer interaction.
Preferably, the laser radar, the monitoring suite and the host are respectively connected with the network switch through network cables.
A machine-vision-based method for dynamic video monitoring of shipborne retraction equipment comprises a personnel-intrusion detection method, a transmission-chain appearance-anomaly detection method, a chain-state detection method at the sprocket, and a key-component abnormal-state detection method.
Preferably, the personnel-intrusion detection method arranges a camera in the hazardous area of the equipment, takes the camera's image data as input, and detects personnel intrusion through a preset ROI.
Preferably, the transmission-chain appearance-anomaly detection method takes the point-cloud data of the laser radar as input and detects chain appearance anomalies by extracting the contour and width within a preset region of interest (ROI).
Preferably, the chain-state detection method at the sprocket takes the point-cloud data of the laser radar as input and monitors the meshing state of the chain and sprocket in real time.
Preferably, the key-component abnormal-state detection method detects, when a component is found to have reached a designated position, the current state of the component through a preset ROI and judges whether that state is normal.
In general, compared with the prior art, the technical solution contemplated by the invention achieves the following beneficial effects:
1. While monitoring the shipborne retraction equipment in real time, the invention automatically identifies key components, personnel entering hazardous areas, and the like; the detection results can serve as input signals to the control system of the retraction equipment, improving its emergency-handling capability;
2. Experiments on a test system built in the laboratory showed that the invention significantly reduces operator workload, reduces the number of personnel required to operate the retraction equipment, and improves its intelligence level.
Drawings
FIG. 1 is a block diagram of the system of the present invention;
FIG. 2 is a software architecture diagram of the present invention;
FIG. 3 is a flow chart of chain appearance anomaly detection according to the present invention;
FIG. 4 is a flow chart of chain condition detection at a sprocket of the present invention;
FIG. 5 is a flow chart of element flap missed-operation detection according to the present invention.
Detailed Description
The technical solutions in the embodiments are described clearly and completely below with reference to the drawings in the embodiments of the present application. The described embodiments are only a part of the embodiments of the present application, not all of them. All other embodiments obtained by a person skilled in the art from these examples without creative effort fall within the protection scope of the invention.
In this embodiment, ROI denotes a region of interest, divided into image-coordinate ROIs (2-D boxes) and laser-coordinate ROIs (3-D boxes). An image ROI triggers when a preset color appears within its range; a laser ROI triggers when the number of reflection points within its range exceeds a preset count; a person ROI triggers when a person appears within its range; and a laser-to-image ROI is the projection of a laser ROI onto the image plane.
The shipborne retraction equipment in this embodiment is a large deployment-and-recovery system in which a chain serves as the transmission carrier and a hydraulic motor serves as the drive, responsible for deploying the elements underwater and recovering them; the maximum load capacity of the retraction equipment is 50 tons.
The machine-vision-based dynamic video monitoring method for the shipborne retraction equipment mainly comprises: image ROI detection, laser ROI detection, laser-to-image ROI detection, and person ROI detection.
The image ROI detection method monitors an ROI in the image with an HSV color-space threshold: if a preset color appears, the ROI triggers; otherwise it remains silent.
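To make the trigger rule concrete, here is a minimal dependency-free sketch (illustrative, not code from the patent): it counts ROI pixels whose channels fall inside a preset band. It assumes the frame has already been converted to the working color space (in practice HSV, e.g. with OpenCV's `cvtColor`); the function name and the `min_pixels` debounce parameter are assumptions for illustration.

```python
import numpy as np

def roi_color_triggered(frame, roi, lo, hi, min_pixels=50):
    """Trigger when enough ROI pixels fall inside the preset color band.

    frame: HxWx3 array already in the working color space (e.g. HSV).
    roi:   (x, y, w, h) rectangle in image coordinates.
    lo/hi: per-channel lower/upper thresholds, e.g. an HSV band for one color.
    """
    x, y, w, h = roi
    patch = frame[y:y + h, x:x + w]
    # A pixel matches when every channel lies inside [lo, hi].
    mask = np.all((patch >= lo) & (patch <= hi), axis=-1)
    # Trigger only above a pixel-count floor to reject isolated noise.
    return int(mask.sum()) >= min_pixels
```

With a red-ish HSV band such as lo=(0, 100, 100), hi=(10, 255, 255), the ROI triggers as soon as `min_pixels` matching pixels appear; otherwise it stays silent.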
The laser ROI detection method monitors an ROI in the three-dimensional space output by the laser radar, with a point-cloud reflection-intensity threshold and a triggering point-count threshold: if the number of reflection points exceeds the count threshold and those points satisfy the intensity constraint, the ROI triggers; otherwise it remains silent.
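A minimal sketch of this trigger rule (illustrative, not the patent's code), assuming the laser ROI is an axis-aligned 3-D box and that the radar driver delivers an N×3 point array plus per-point reflection intensities:

```python
import numpy as np

def laser_roi_triggered(points, intensities, roi_min, roi_max,
                        min_intensity=30.0, min_points=20):
    """Trigger when enough sufficiently strong reflections fall inside the ROI box.

    points:      (N, 3) xyz coordinates from the lidar.
    intensities: (N,) per-point reflection intensity.
    roi_min/max: opposite corners of an axis-aligned 3-D ROI box.
    """
    inside = np.all((points >= roi_min) & (points <= roi_max), axis=1)
    strong = intensities >= min_intensity   # reflection-intensity constraint
    # Trigger only when the in-ROI, in-band point count exceeds the threshold.
    return int(np.count_nonzero(inside & strong)) >= min_points
```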
The laser-to-image ROI detection method monitors an ROI in the three-dimensional space output by the laser radar, sets a point-cloud reflection-intensity threshold and a triggering point-count threshold, and reduces the three-dimensional point-cloud data to the image plane.
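The dimensionality reduction can be sketched as a standard pinhole projection (an illustrative assumption — the patent does not give its projection model): lidar points, already transformed into the camera frame by extrinsic calibration, are mapped to pixel coordinates with the camera intrinsics and rasterized into an ROI binary image.

```python
import numpy as np

def project_to_image(points_xyz, fx, fy, cx, cy):
    """Pinhole projection of points already in the camera frame (z forward)."""
    pts = np.asarray(points_xyz, dtype=float)
    z = pts[:, 2]
    valid = z > 0                      # keep points in front of the camera
    u = fx * pts[valid, 0] / z[valid] + cx
    v = fy * pts[valid, 1] / z[valid] + cy
    return np.stack([u, v], axis=1)

def rasterize(uv, shape):
    """Mark each projected point's pixel to form an ROI binary image."""
    img = np.zeros(shape, dtype=np.uint8)
    u = np.clip(uv[:, 0].astype(int), 0, shape[1] - 1)
    v = np.clip(uv[:, 1].astype(int), 0, shape[0] - 1)
    img[v, u] = 255
    return img
```

A pseudo-RGB image (as used later for the sprocket view) would instead assign each pixel a colormap value derived from the point's reflection intensity.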
The person ROI detection method monitors an ROI in the image: if a person target appears, the ROI triggers; otherwise it remains silent.
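The person-ROI trigger is detector-agnostic; the sketch below shows only the trigger logic (the person detector itself — any pretrained pedestrian model — is assumed external and is not specified by the patent):

```python
def person_roi_triggered(detections, roi):
    """Trigger when any detected person's box centre lies inside the ROI.

    detections: list of (x, y, w, h) person bounding boxes from an external
                detector; roi: (x, y, w, h) monitored rectangle.
    """
    rx, ry, rw, rh = roi
    for (x, y, w, h) in detections:
        cx, cy = x + w / 2.0, y + h / 2.0   # box centre
        if rx <= cx <= rx + rw and ry <= cy <= ry + rh:
            return True
    return False
```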
As shown in fig. 1, the device for monitoring the dynamic video of the onboard deploying and retracting device based on the machine vision of the embodiment includes a laser radar, a monitoring suite, a network switch, a host and supporting software.
The laser radar is a domestic Livox-series non-repetitive-scanning laser radar; its non-repetitive scanning allows dense three-dimensional reconstruction of the target area, yielding centimetre-level point-cloud data. It has an Ethernet communication interface, a detection range of 260 m at 80% target reflectivity, 2 cm range accuracy, 0.1° angular accuracy, and a 905 nm laser wavelength. The laser radar is mounted near the running path of the chain and the elements; after a static IP address is set, it is connected to the network switch by network cable.
The camera and NVR recorder are a domestic Hikvision NVR-series monitoring suite, allowing convenient access, playback, and real-time access to camera data. The camera's main stream supports a maximum frame rate of 20 fps and its sub-stream 25 fps (50 Hz systems), with an adaptive Ethernet interface; the camera is connected to the NVR recorder, and the NVR recorder is connected to the switch by network cable.
The network switch is a domestic H3C 5-port gigabit Ethernet switch used to connect the host, the laser radar, and the camera.
The host is a commercial PC with 32 GB of memory and two RTX 2080 graphics cards, running Ubuntu 18.04 with CUDA 10.1. The host communicates in real time with the control system of the retraction equipment, and the image-recognition results are shared with and used by that control system.
The supporting-software framework, shown in fig. 2, processes and analyzes the point-cloud data provided by the laser radar and the image data provided by the camera, generates the detection results required by the user, and provides a human-computer interaction interface.
The overall framework of the supporting software is divided into four layers, from bottom to top: physical sensors, software function drivers, software function implementation, and human-computer interaction. Each layer exchanges data only with the layers immediately above and below it; cross-layer access is not allowed.
The physical-sensor layer is the producer of data and ensures stable, reliable data sources; it mainly comprises laser-radar data and image data. Laser-radar data perceives three-dimensional space with centimetre-level accuracy; it carries no color information, only the reflection intensity of the target, so it can be used to distinguish objects of different reflectivity. Image data carries rich color information but cannot determine the three-dimensional distance of the target, so it is suited to detecting targets with distinctive colors.
The software-function-driver layer processes the data generated by the hardware sensors to produce user-defined ROI signals, local point clouds, and image-block data; it abstracts the sensor data into the data required by the upper layers.
The software-function-implementation layer receives the ROI signals, local point clouds, and image-block data it requires, performs secondary processing according to the trigger logic of each module, and generates the detection-result signals the system needs, covering personnel-intrusion detection, chain appearance-anomaly detection, and key-component abnormal-state detection.
The human-computer-interaction layer receives the recognition results from the lower layer, renders graphic displays for the different detection items, and provides certain interactive functions.
The machine-vision-based method for dynamic video monitoring of shipborne retraction equipment comprises a personnel-intrusion detection method, a transmission-chain appearance-anomaly detection method, a chain-state detection method at the sprocket, and a key-component abnormal-state detection method.
The personnel-intrusion detection method adopts the person-ROI detection method: a camera is arranged in the hazardous area of the equipment and its image data is taken as input; the ROI in the image is monitored, and if a person target appears the ROI triggers, otherwise it remains silent, thereby detecting personnel intrusion.
As shown in fig. 3, the transmission-chain appearance-anomaly detection method adopts the laser-to-image ROI detection method: by monitoring an ROI in three-dimensional space, the three-dimensional point-cloud data of the laser radar is reduced to the image plane to form an ROI point cloud, an ROI binary image, and an ROI pseudo-RGB image. When the laser radar is deployed, a position giving a front view of the chain is selected for monitoring. After the chain comes to rest, acquisition is triggered by personnel or by the control system; the system enters an integration state for 3-5 seconds and then automatically exits it, after which contour detection and width extraction are performed on the preset area to obtain the chain width. If no target is present in the preset area, the chain's appearance contour is abnormal.
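The width-extraction step can be sketched on the ROI binary image produced above. This is an illustrative per-row width scan (the patent does not specify the extraction algorithm); the tolerance band [w_min, w_max] is an assumed calibration parameter.

```python
import numpy as np

def chain_width_profile(binary_roi):
    """Per-row chain width in pixels on an ROI binary image; 0 where empty."""
    cols = np.arange(binary_roi.shape[1])
    widths = []
    for row in binary_roi > 0:
        idx = cols[row]
        widths.append(int(idx[-1] - idx[0] + 1) if idx.size else 0)
    return widths

def chain_abnormal(binary_roi, w_min, w_max):
    """Abnormal if the preset area is empty, or any row's width leaves the band."""
    widths = [w for w in chain_width_profile(binary_roi) if w > 0]
    if not widths:
        return True   # no target in the preset area: contour is abnormal
    return any(w < w_min or w > w_max for w in widths)
```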
As shown in fig. 4, the chain-state detection method at the sprocket adopts the laser-to-image ROI detection method and runs as a continuous check. When the program is deployed, the point cloud at the sprocket is captured by the laser radar, adjusted to a front-view angle, and the 3-D point cloud is drawn on a 2-D image for visualization, with pseudo-RGB colors mapped from reflection intensity. Standard convex-hull data of the sprocket is collected in the deployment stage; in the operation stage the convex-hull segment is monitored in real time to track the meshing state of the chain and sprocket, and if it differs greatly from the standard, an anomaly is reported.
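As a simplified stand-in for the convex-hull comparison (the patent compares hull segments against standard data recorded at deployment; the binned height profile below is a plain substitute, not the patent's method), the meshing region's front-view point cloud can be reduced to a 1-D profile and checked against the stored standard:

```python
import numpy as np

def upper_envelope(points_xz, bins, x_min, x_max):
    """Maximum height per horizontal bin over the front-view meshing region."""
    x, z = points_xz[:, 0], points_xz[:, 1]
    edges = np.linspace(x_min, x_max, bins + 1)
    prof = np.full(bins, np.nan)
    which = np.digitize(x, edges) - 1
    for b in range(bins):
        sel = which == b
        if sel.any():
            prof[b] = z[sel].max()
    return prof

def mesh_abnormal(profile, standard, tol=0.05):
    """Report an anomaly when the live profile deviates from the standard."""
    valid = ~np.isnan(profile) & ~np.isnan(standard)
    if not valid.any():
        return True   # nothing to compare: treat as abnormal
    return float(np.max(np.abs(profile[valid] - standard[valid]))) > tol
```

The standard profile plays the role of the "standard convex-hull data" collected at deployment; `tol` is an assumed deviation threshold in the lidar's distance units.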
The key-component abnormal-state detection method is, specifically, detection of a missed element flap operation, as shown in fig. 5. The elements are the underwater parts of the detection device, and the flaps are spoilers mounted on the elements; when an element is paid out to a designated position, an operator manually flips the flap up (rotating it through a certain angle) before deployment can continue. To detect a missed flap operation, the laser ROI detection method is used, triggered per element. When an element appears, the laser ROI of the flap area is checked; if a flap signal is present there, the flip is judged to have been missed and alarm information is generated. Conversely, if the element signal is present but no flap signal appears during detection, the result is judged normal, and the system returns to standby after the element disappears.
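The per-element trigger logic above can be sketched as a small state machine (illustrative; the class and signal names are assumptions — the inputs are the element and flap laser-ROI trigger signals defined earlier):

```python
class FlapMonitor:
    """Alarm when an element passes the checkpoint with its flap ROI still
    triggered (i.e. the flap was not flipped up); otherwise return to standby."""

    def __init__(self):
        self.state = "standby"
        self.alarm = False

    def update(self, element_triggered, flap_triggered):
        if self.state == "standby":
            if element_triggered:
                self.state = "checking"
                self.alarm = bool(flap_triggered)   # flap signal => missed flip
        elif self.state == "checking":
            if flap_triggered:
                self.alarm = True
            if not element_triggered:               # element has passed
                self.state = "standby"
        return self.alarm
```

The alarm latches once raised; a real system would add an operator acknowledgement to clear it.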
The above is only a preferred embodiment of the invention, but the scope of the invention is not limited thereto; any modification or replacement based on the technical solution and inventive concept provided by the invention falls within its protection scope.

Claims (8)

1. A machine-vision-based device for dynamic video monitoring of shipborne retraction equipment, characterized in that: the device comprises a laser radar, a monitoring suite, a network switch, a host, and supporting software, wherein the host communicates in real time with a control system built into the retraction equipment, and the image-recognition results are shared with and used by the control system of the retraction equipment;
the monitoring suite comprises a camera and an NVR recorder, and the supporting software processes and analyzes the point-cloud data provided by the laser radar and the image data provided by the camera, generates the detection results required by the user, and provides a human-computer interaction interface.
2. The machine-vision-based device for dynamic video monitoring of shipborne retraction equipment according to claim 1, characterized in that: the overall architecture of the supporting software is divided into four layers, namely: physical sensors, software function drivers, software function implementation, and human-computer interaction.
3. The machine-vision-based device for dynamic video monitoring of shipborne retraction equipment according to claim 1, characterized in that: the laser radar, the monitoring suite, and the host are each connected to the network switch by network cables.
4. A machine-vision-based method for dynamic video monitoring of shipborne retraction equipment using the device according to any of claims 1 to 3, characterized in that: the method comprises a personnel-intrusion detection method, a transmission-chain appearance-anomaly detection method, a chain-state detection method at the sprocket, and a key-component abnormal-state detection method.
5. The machine-vision-based method for dynamic video monitoring of shipborne retraction equipment according to claim 4, characterized in that: the personnel-intrusion detection method arranges a camera in the hazardous area of the equipment, takes the camera's image data as input, and detects personnel intrusion through a preset ROI.
6. The machine-vision-based method for dynamic video monitoring of shipborne retraction equipment according to claim 4, characterized in that: the transmission-chain appearance-anomaly detection method takes the point-cloud data of the laser radar as input and detects chain appearance anomalies by extracting the contour and width within a preset region of interest (ROI).
7. The machine-vision-based method for dynamic video monitoring of shipborne retraction equipment according to claim 4, characterized in that: the chain-state detection method at the sprocket takes the point-cloud data of the laser radar as input and monitors the meshing state of the chain and sprocket in real time.
8. The machine-vision-based method for dynamic video monitoring of shipborne retraction equipment according to claim 4, characterized in that: the key-component abnormal-state detection method detects, upon a component reaching a designated position, the current state of the component through a preset ROI and judges whether that state is normal.
Application CN202111599690.3A (priority date 2021-12-24, filed 2021-12-24) — Equipment and method for dynamically monitoring video of shipborne retraction equipment based on machine vision — Pending — CN114449217A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111599690.3A CN114449217A (en) 2021-12-24 2021-12-24 Equipment and method for dynamically monitoring video of shipborne retraction equipment based on machine vision


Publications (1)

Publication Number Publication Date
CN114449217A 2022-05-06

Family

ID=81364006

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111599690.3A Pending CN114449217A (en) 2021-12-24 2021-12-24 Equipment and method for dynamically monitoring video of shipborne retraction equipment based on machine vision

Country Status (1)

Country Link
CN (1) CN114449217A (en)


Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101881602A (en) * 2010-07-06 2010-11-10 西安交通大学 Assembly accuracy detection method of large complicated blade parts
CN108802758A (en) * 2018-05-30 2018-11-13 北京应互科技有限公司 A kind of Intelligent security monitoring device, method and system based on laser radar
CN110007315A (en) * 2019-04-09 2019-07-12 深圳市速腾聚创科技有限公司 Laser radar detection device, detection method and control system
US10893183B1 (en) * 2019-11-18 2021-01-12 GM Global Technology Operations LLC On-vehicle imaging system
CN112822348A (en) * 2019-11-18 2021-05-18 通用汽车环球科技运作有限责任公司 Vehicle-mounted imaging system
CN112083437A (en) * 2020-07-10 2020-12-15 南京智慧水运科技有限公司 Marine laser radar and video combined target capturing system and method
CN112634566A (en) * 2020-12-16 2021-04-09 路晟悠拜(重庆)科技有限公司 Intelligent electronic fence construction method and system based on millimeter waves
CN113320924A (en) * 2021-05-17 2021-08-31 中国矿业大学(北京) Belt longitudinal tearing detection device based on single line laser radar
CN113506416A (en) * 2021-07-03 2021-10-15 山东省水利勘测设计院 Engineering abnormity early warning method and system based on intelligent visual analysis

Similar Documents

Publication Publication Date Title
CN107369337A (en) Actively anti-ship hits monitoring and pre-warning system and method to bridge
CN102236947B (en) Flame monitoring method and system based on video camera
US6816184B1 (en) Method and apparatus for mapping a location from a video image to a map
KR101350777B1 (en) Image monitoring apparatus and system
CN113395491A (en) Remote monitoring and alarming system for marine engine room
JP6764481B2 (en) Monitoring device
CN112800860A (en) Event camera and visual camera cooperative high-speed scattered object detection method and system
CA2505841C (en) Foreign object detection system and method
KR102247359B1 (en) Image analysis system and method for remote monitoring
CN207517196U (en) Actively anti-ship hits monitoring and warning system to bridge
CN107679471A (en) Indoor occupant sky hilllock detection method based on video monitoring platform
CN101944267A (en) Smoke and fire detection device based on videos
CN112819068A (en) Deep learning-based real-time detection method for ship operation violation behaviors
CN112184773A (en) Helmet wearing detection method and system based on deep learning
CN117409191B (en) Fire inspection early warning method based on unmanned aerial vehicle and improved YOLOv8 target detection algorithm
CN111625159B (en) Man-machine interaction operation interface display method and device for remote driving and terminal
CN105632115A (en) Offshore oilfield security system
JP3997062B2 (en) Image monitoring device
CN111474916B (en) Ship navigation autonomous collision avoidance algorithm testing method and device
CN112598865B (en) Monitoring method and system for preventing cable line from being damaged by external force
CN114449217A (en) Equipment and method for dynamically monitoring video of shipborne retraction equipment based on machine vision
CN117953578A (en) Elevator passenger behavior detection method based on depth vision technology
CN112377265A (en) Rock burst alarm method based on image recognition acceleration characteristics
CN208087074U (en) A kind of humanoid anti-collision alarm system of harbour equipment based on monocular vision
CN113283273A (en) Front obstacle real-time detection method and system based on vision technology

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination