EP3921804A1 - Dispositif d'étalonnage pour un dispositif de surveillance, dispositif de surveillance pour la détection d'homme à la mer et procédé d'étalonnage - Google Patents

Calibration device for a monitoring device, monitoring device for man-overboard detection, and calibration method

Info

Publication number
EP3921804A1
Authority
EP
European Patent Office
Prior art keywords
calibration
ship
module
designed
monitoring
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP20700878.0A
Other languages
German (de)
English (en)
Inventor
Hans-Dieter Bothe
Holger Fillbrandt
Gabriele Mangiafico
Claudio Scaravati
Stefano Riboli
Paolo Mario France Terzon
Sarah Schuette
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Robert Bosch GmbH
Original Assignee
Robert Bosch GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Robert Bosch GmbH
Publication of EP3921804A1
Legal status: Pending

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N17/00 Diagnosis, testing or measuring for television systems or their details
    • H04N17/002 Diagnosis, testing or measuring for television systems or their details for television cameras
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30248 Vehicle exterior or interior
    • G06T2207/30252 Vehicle exterior; Vicinity of vehicle

Definitions

  • the monitoring device is designed for man-overboard monitoring of a ship section.
  • the monitoring device has at least one camera for video monitoring of the ship section, the video monitoring being provided as video data.
  • the camera has intrinsic calibration parameters and extrinsic calibration parameters.
  • the video data are provided to the calibration device.
  • With contactless monitoring devices there is the conceivable problem that moving objects, such as spray, birds or things thrown overboard, could be incorrectly detected as a man-overboard event.
  • a monitoring device must reduce such false detections below a certain value in order to be practical.
  • the standard ISO/PAS 21195 "Ships and marine technology" requires that monitoring systems for man-overboard monitoring have a "true positive" detection rate of more than 95 percent and, on average, a false alarm rate ("false positive rate") of less than a single false alarm per ship per day.
  • a calibration device for a monitoring device having the features of claim 1 is proposed. Furthermore, a monitoring device for man-overboard monitoring and a method for calibration are proposed.
  • the calibration device forms, in particular, a human-machine interface for the monitoring device.
  • the monitoring device can be calibrated, in particular initially calibrated or recalibrated.
  • the calibration device is designed, for example, as a graphical user interface for initializing, setting up and / or setting the monitoring device.
  • the monitoring device is designed for man-overboard monitoring of a ship section.
  • the monitoring device is part of and / or can be integrated into a ship, in particular a passenger ship.
  • Man-overboard monitoring by means of the monitoring device provides, for example, for distinguishing whether a person, and/or a test manikin (dummy) modeled on a person, has gone overboard or whether it was another object.
  • the ship section is, for example, part of an area of the ship where people stay, preferably a section where a person can go overboard, in particular an area with railings or windows, but also mechanically unsecured areas, such as places where service work is carried out or where rescue boats are located.
  • the ship section should ideally also capture part of the water surface; however, this is not absolutely necessary.
  • the monitoring device can also monitor a plurality of ship sections, and a plurality of monitoring devices can in particular monitor a surrounding area around the ship. For example, the monitoring device is designed to output an alarm when a man-overboard event is detected.
  • the monitoring device has at least one camera.
  • the camera is preferably a color camera.
  • the camera or at least one camera is designed as an infrared camera.
  • one camera comprises two camera units, one camera unit being a normal video camera in the visual wavelength range (color or black-and-white camera) and the other camera unit being an infrared camera, preferably a thermal camera (far-infrared camera).
  • Such cameras image the thermal radiation from objects (thermography). This makes it possible to monitor the ship section 24 hours a day, i.e. both in the dark using thermal image data and in daylight using color or black-and-white image data, provided each camera has sufficient sensitivity for round-the-clock operation.
  • the monitoring device has a plurality of cameras, the cameras being able to monitor a plurality of ship sections; the ship sections can be arranged overlapping or non-overlapping.
  • it is particularly preferred that the camera is mounted laterally, for example on the ship's side wall.
  • the camera monitors the ship section using video and / or image technology. For example, an area of the ship and an outer area of the ship facing the sea and / or the sea surface are monitored by video technology.
  • the video monitoring is provided by the camera and/or cameras as video data.
  • Monitoring can also be provided by the camera as metadata (man overboard event: when, at which ship position, fall trajectory in the image and in 3D, object size, etc.).
  • the camera has at least one intrinsic calibration parameter and at least one extrinsic calibration parameter.
  • each camera has a plurality of intrinsic calibration parameters and a plurality of extrinsic calibration parameters.
  • the intrinsic calibration parameters are, in particular, calibration parameters that are specified by the camera itself, in particular as a result of manufacture, which means in particular that they do not depend on how the camera is installed.
  • intrinsic calibration parameters are, for example, imaging parameters of the camera, for example the focal length of the camera.
  • the intrinsic calibration parameters are fixed parameters, that is to say parameters which ideally do not change at all or in reality only change slightly over the lifetime and / or functional time of the camera and / or the monitoring device.
  • the intrinsic calibration parameters can be known in advance and / or initially determined during or before the installation of the monitoring device and / or calibration device and / or camera.
  • Extrinsic calibration parameters are understood to mean, in particular, camera parameters that depend on the arrangement, orientation and/or mounting of the camera. For example, extrinsic calibration parameters describe the alignment of the optical axis of a camera with respect to a horizontal plane, for example the sea surface, or a vertical plane, for example the ship plane defined by a ship's side.
  • the orientation can include, for example, an angle of inclination or the distance between the camera and a reference surface.
  • the extrinsic calibration parameters are assembly-dependent parameters; they are therefore preferably determined by the installation and/or arrangement of the camera and can be changed by shifting the camera, changing its alignment and/or dismantling it.
  • the calibration device is connected to the cameras in terms of data.
  • the video data are provided to the calibration device.
  • the calibration device can be provided with the intrinsic calibration parameters.
  • the calibration device can preferably form a decentralized module.
  • the calibration device forms a computer unit or a software module.
  • the calibration device has an input module.
  • the input module is designed in particular for graphical input by a user. By means of the input module, graphic areas can be displayed, drawn in and / or selected. In particular, a numeric or alphanumeric input can preferably be made by means of the input module.
  • the input module is designed as a touchscreen.
  • a user can enter, select, define and / or draw in a calibration element, the calibration element preferably having an orientation and / or an extension.
  • a calibration element is understood to mean, for example, information and / or a structure in the image and / or the ship section.
  • it can be provided that the user and/or the calibration device define, select and/or specify the calibration elements.
  • Calibration elements are, for example, lines, areas, intersections, two-dimensional or three-dimensional objects.
  • a calibration element has, in particular, an orientation and/or an extension and/or a distance from the camera; the extension is, for example, the distance in 3D between two points and/or a length, a width or a depth.
  • the orientation of the calibration element is, for example, a direction in a world coordinate system and/or another reference coordinate system; it can in particular also include and/or describe angles and/or orientations of calibration elements relative to one another.
  • a calibration element can be a perpendicular to a water surface, a ship's plane or the horizon.
  • Calibration elements can also be angular positions on the hull, for example of the railing relative to the ship's surface and/or the ship's bottom.
  • the calibration element can, for example, be specified and / or entered manually in the form of lines, points and / or polygons.
  • the orientation and / or the extent can be entered in the form of numerical information by the user using the input module. For example, the user assigns lengths, angles and / or orientation numerically or alpha-numerically to the calibration element.
  • the calibration device has an evaluation module.
  • the evaluation module forms, for example, a software module or a computer module.
  • the evaluation module can be implemented on the input module and / or form a common module with it.
  • the calibration elements and/or the video data and/or a scene model are provided to the evaluation module.
  • the evaluation module is designed to determine the still unknown calibration parameters based on the calibration elements, in particular including all associated information, the calibration parameters known in advance and/or the scene model. The calibration parameters are determined once, for example when the monitoring device is set up.
  • the evaluation module is designed to determine the orientation, viewing direction and/or mounting arrangement of the camera, for example the distances from reference surfaces, that is in particular the water surface or the ship plane, based on the intrinsic calibration parameters, for example focal length and lens distortion, as well as the calibration elements provided.
  • a calibration element is, for example, a visible line in the real ship section which has a known length and/or orientation. This line is imaged by the camera, so that the evaluation module can determine unknown calibration parameters based on the image of the line and its known length and/or orientation.
  • determining the unknown calibration parameters from the known calibration parameters, the known calibration elements and/or the scene model is preferably based on the lens equation and/or imaging equation of the camera.
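
As an aside (not part of the patent disclosure), the kind of relation used here can be illustrated with a minimal pinhole-camera sketch in Python; the focal length, horizon row and pixel measurements below are made-up values, and the camera is assumed to have zero roll and square pixels:

```python
import math

# Minimal pinhole-camera sketch (illustrative only, not the patented method).
# Assumptions: square pixels, zero roll, focal length in pixels known
# (intrinsic), image origin at the top-left with v increasing downward.

focal_px = 1200.0          # intrinsic: focal length in pixels (assumed known)
principal_v = 540.0        # intrinsic: vertical principal point (image centre row)

# Calibration element 1: the horizon line, visible at image row v_horizon.
# For a camera pitched down by theta, the horizon projects
# focal_px * tan(theta) pixels above the principal point.
v_horizon = 320.0
tilt_rad = math.atan2(principal_v - v_horizon, focal_px)
print(f"estimated camera tilt: {math.degrees(tilt_rad):.2f} deg below horizontal")

# Calibration element 2: a vertical line of known real length (e.g. a 2 m
# post on deck) imaged near the image centre with a pixel height dv.
# Under the pinhole imaging equation, distance ~= focal_px * L / dv.
known_length_m = 2.0
dv_pixels = 115.0
distance_m = focal_px * known_length_m / dv_pixels
print(f"estimated distance to calibration element: {distance_m:.1f} m")
```

The distance relation holds only for a calibration element imaged near the optical axis; in general the evaluation module would solve the full imaging equation as described above.
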
  • the invention is based on the idea that when monitoring a ship section with cameras for man-overboard monitoring, the cameras must be calibrated with sufficient precision in order to enable good and safe monitoring with few false alarms.
  • by means of the calibration device, calibration elements which are located in the ship section can be specified optically and/or numerically by a user, an evaluation module determining the required unknown calibration parameters based on this selection. This provides a calibration device that is particularly easy to operate and that enables the monitoring device to be calibrated reliably and precisely. In particular, a large number of cameras can be calibrated by means of the calibration device.
  • the extrinsic calibration parameters describe an alignment of the camera in a three-dimensional world coordinate system.
  • the three-dimensional world coordinate system is preferably a Cartesian coordinate system.
  • the world coordinate system is, for example, the coordinate system of the ship or the ship section, and/or is spanned by the sea surface and a vertical plane, for example the "ship plane".
  • the extrinsic calibration parameters can also include an angle of inclination and a distance, in particular in the three-dimensional coordinate system and / or to the water surface.
  • extrinsic calibration parameters can include, for example, a distance and an orientation to a vertical ship plane, for example to the center of the ship, and/or to a side wall.
  • the intrinsic calibration parameters are stored and/or implemented by the manufacturer in terms of data in the camera; when the camera is connected to the calibration device, they can be transmitted to it.
  • calibration parameters are the parameters that are required and/or necessary to describe the optical imaging by the camera.
  • One embodiment of the invention provides that the input module is designed so that the user can input, set and/or select the intrinsic calibration parameters.
  • for example, the input module has an input mask into which the user can enter the necessary parameters and/or values. Furthermore, it can be provided that a type or model designation of the camera can be selected by means of the input module, a data record with the intrinsic calibration parameters being stored and/or retrievable in the input module or evaluation module for this type and/or model designation.
  • a calibration element is formed by at least two alignment lines.
  • An alignment line is formed, for example, by a horizontal line on the outer side wall of the ship, for example by a section of the railing or a ship deck.
  • an alignment line can generally be formed by a horizontal line on another structure, for example a wall of cabins, in the ship section.
  • a calibration element is formed from a horizon line, for example the water surface at a great distance, or from the vanishing point of the ship plane at a great distance.
  • the calibration device comprises a model module which can model the view of the ship section in a more complex manner than through a single plane.
  • information for the model module can be made available by the setter via the input module, the setter marking and/or spatially delimiting structures on the ship in the displayed video image, for example with a polygon, and assigning them an orientation and position in 3D.
  • This information can, for example, be taken manually or automatically from a CAD plan of the ship.
  • the model module can be part of the evaluation module.
  • the model module comprises a 3D model of the ship and / or a ship section.
  • the 3D model is a model of the ship section monitored by video technology.
  • the 3D model can form a CAD model.
  • the 3D model includes, for example, dimensions, angles and/or orientations of structures in the ship section.
  • the evaluation module is designed to determine the unknown, intrinsic and/or extrinsic, calibration parameters based on the 3D model. For example, the evaluation module can extract orientations of lines and/or surfaces relative to one another from the 3D model and, from their mapping into the video data, determine the unknown calibration parameters.
  • the evaluation module can consider the video data as an image of the 3D model with the intrinsic calibration parameters and determine the unknown calibration parameters based on this.
  • provision can be made for the user to select, dimension and / or set a calibration element in the 3D model. This refinement is based on the consideration of providing a calibration device which enables the cameras and / or the monitoring device to be calibrated intuitively and easily.
  • the calibration device has a calibration control module for checking calibration parameters.
  • the calibration device also includes a display module for displaying the video data.
  • the calibration control module and the display module form, for example, a common module.
  • the calibration control module is designed to make one or more model calibration elements definable, selectable and/or adjustable.
  • a model calibration element is, for example, a calibration element which has a spatial reference to the 3D model; for example, the base point of a vertically aligned linear calibration element is located on a visible surface of the 3D model.
  • the model calibration element also has, in particular, a known orientation and/or extension.
  • the model calibration element can be selected, marked and/or adjusted in the 3D model.
  • the calibration control module is designed to map a model calibration element, based on its specified reference point in the image, for example the base point, and the calibration parameters to be checked, into the video data and to display and/or draw it in on the display module ("overlay" display). By comparing the displayed and/or drawn-in model calibration element with the video data on the display module, the user can assess whether the calibration parameters to be checked, for example the initially determined calibration parameters, have been determined with sufficient accuracy and without errors.
  • for example, the user can determine this from the fact that the drawn-in model calibration element does not deviate, or deviates only insignificantly, from the corresponding structure in the video data. For example, the user defines a line, or in particular a door or area, in the model, this element defined in the model being converted and displayed optically in the video data.
  • a calibration device that can be operated simply and intuitively is thus provided which in particular makes it possible to quickly and easily determine the correctness of calibration parameters and / or a decalibration. If inaccurate or incorrect calibration parameters are found, the calibration parameters should be determined again.
  • the calibration device has a change determination module.
  • the change determination module is designed to determine changes in the video data, in particular whether there is a temporal, long-term and persistent variation in the environment, the ship section or other parts of the scene.
  • the camera attachment may have loosened and / or the ship section may have been converted so that the extrinsic calibration parameters have changed or the 3D ship model no longer accurately reflects the 3D structure of the scene.
  • by means of a superimposed representation ("overlay") of the old and the current image, for example as an edge image, such a change can be determined.
  • furthermore, the change determination module can be designed to report the decalibration to a user so that the user can perform a new calibration.
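
One conceivable way to implement such an overlay-based change check (purely illustrative, not the patent's implementation) is to compare an edge image of a reference frame stored at calibration time with an edge image of a current frame; the file names and the threshold below are assumptions:

```python
import cv2
import numpy as np

# Sketch of a change-determination check. Assumptions: OpenCV is available;
# "reference.png" and "current.png" are hypothetical frame grabs of the same
# ship section, taken at calibration time and now.
reference = cv2.imread("reference.png", cv2.IMREAD_GRAYSCALE)
current = cv2.imread("current.png", cv2.IMREAD_GRAYSCALE)

# Edge images suppress lighting changes and keep the static structure
# (railing, side wall, window openings).
ref_edges = cv2.Canny(reference, 50, 150)
cur_edges = cv2.Canny(current, 50, 150)

# Fraction of reference edge pixels that still coincide with current edges.
overlap = np.logical_and(ref_edges > 0, cur_edges > 0).sum()
coverage = overlap / max((ref_edges > 0).sum(), 1)

# Threshold chosen arbitrarily for illustration; a persistent drop would be
# reported to the user as a possible decalibration (camera moved, refit, ...).
if coverage < 0.6:
    print(f"possible decalibration: only {coverage:.0%} of structure edges match")
else:
    print(f"structure edges match ({coverage:.0%}); calibration still plausible")
```
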
  • the input module is a module for graphical input, definition and / or selection of the calibration element and / or the model calibration element.
  • the input module is provided with a touch screen or a computer mouse, with the aid of which the user can draw in the calibration element or the model calibration element or can define it by means of points or a polygon. This refinement is based on the idea of providing a calibration device that is easy to operate and does not require any complex programming languages or numerical inputs.
  • the input module is designed to enable the user to dimension calibration elements.
  • the dimensions can be entered numerically or alphanumerically.
  • known calibration elements are already stored and / or can be selected with dimensions and / or additional information as a data record.
  • Another object of the invention is a monitoring device for man-overboard monitoring of a ship section.
  • the monitoring device has at least one camera, the camera being designed to monitor a section of the ship using video technology.
  • the camera is part of the surveillance device.
  • the monitoring device has the calibration device as previously described.
  • the calibration device is connected to the camera in terms of data, the video data being provided to the calibration device.
  • the monitoring device also has an evaluation device.
  • the evaluation device is designed, for example, as a computer unit.
  • the evaluation device is designed to determine a kinematic variable of an object moving in the monitored area based on the video data. For example, the evaluation device detects a moving object and determines the kinematic variable for it.
  • the kinematic variable can be a speed or an acceleration, for example.
  • the evaluation device is designed to infer the presence of a man-overboard event.
  • the evaluation device is designed to determine the path and/or speed or acceleration profile of the moving object and to infer the presence of a man-overboard event using known fall hypotheses such as a fall parabola.
  • the evaluation device can preferably be designed to infer a size of the moving object based on the kinematic variable, the size being, for example, the extent and / or the length or the diameter of the object.
  • the evaluation device is designed to determine a starting position of the moving object based on the intrinsic calibration parameters and/or the extrinsic calibration parameters and/or the kinematic variable.
  • since the extrinsic calibration parameters define, for example, the alignment and position of the camera, and the intrinsic calibration parameters define the imaging equation, a starting position in three dimensions can be determined from the kinematic variable, for example by comparison with the acceleration due to gravity. Based on the starting position, it can be concluded whether a man-overboard event has occurred.
  • a development of the monitoring device provides that a danger area can be defined by means of the input module.
  • the danger area is, for example, an area of the image and/or of the ship section selected by the user.
  • the danger area is characterized in particular by the fact that it comprises those areas of the ship from which a person could go overboard.
  • Sections of the ship outside and / or on the ship's side wall are, for example, part of the danger area.
  • the evaluation device can, for example, discard moving objects whose movement does not start in the danger area; these are not evaluated as man-overboard events.
  • the evaluation device is designed to determine the starting position, in the event of a person falling overboard, in a superordinate ship coordinate system, in particular to determine a deck number, a ship's side, and a longitudinal and lateral coordinate.
  • the monitoring device is designed to output the starting position in the ship's coordinate system on an, in particular central, output device.
  • This refinement is based on the consideration that the 3D starting position relative to the camera position and the mounting location of the camera are known in the superordinate ship coordinate system and the starting position in the superordinate ship coordinate system can be determined based on this.
  • Another object of the invention is a method for calibrating a camera of a monitoring device for man-overboard monitoring or a monitoring device for man-overboard monitoring with the camera.
  • the method can be carried out by a user or a setter.
  • a calibration element is an element which, for example, has a position, an orientation and an extent.
  • the calibration element is an element or a structure that is located in the ship section and / or can be fixed.
  • the camera has intrinsic and extrinsic calibration parameters. In the method, the unknown calibration parameters are determined based on the known calibration parameters and/or the scene model. For this purpose, it is calculated with which values for the calibration parameters a correct mapping of a calibration element from the 3D world into the 2D video data is achieved. For example, it is determined how the calibration element in reality (the 3D world) is mapped, using the calibration parameters, onto the calibration element imaged by video technology. The unknown calibration parameters can be determined through this comparison.
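
To make that comparison concrete, the following sketch (an assumption-laden illustration, not the claimed method) fits two unknown extrinsic parameters, the tilt angle and the camera height above the water plane, so that known 3D calibration points map onto their observed image rows. It assumes a pinhole camera with zero roll, known intrinsic parameters, and calibration points lying in the vertical plane through the optical axis; all measurements are invented:

```python
import numpy as np
from scipy.optimize import least_squares

# Known intrinsic parameters (assumed): focal length in pixels, principal row.
f_px, cy = 1200.0, 540.0

# Calibration elements: points on the water/deck plane at known horizontal
# distances from the camera (metres) and the image rows where they appear.
dist_m = np.array([12.0, 20.0, 40.0, 80.0])
row_px = np.array([1066.0, 791.0, 567.0, 450.0])   # hypothetical measurements

def project_row(params, d):
    tilt, height = params
    # Ray to a ground point at distance d is depressed by atan(height/d);
    # its offset from the optical axis is that depression minus the tilt.
    return cy + f_px * np.tan(np.arctan2(height, d) - tilt)

def residuals(params):
    # Difference between the mapping predicted by the trial parameters
    # and the calibration elements actually observed in the video data.
    return project_row(params, dist_m) - row_px

fit = least_squares(residuals, x0=[0.1, 10.0])     # initial guess: ~6 deg, 10 m
tilt_deg, height_m = np.degrees(fit.x[0]), fit.x[1]
print(f"estimated tilt {tilt_deg:.1f} deg, camera height {height_m:.1f} m")
```
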
  • FIG. 1a shows a ship section monitored by a monitoring device
  • FIG. 1b schematically shows a monitoring device with a calibration device
  • FIG. 2 shows a picture of the ship section with a calibration element
  • FIG. 3 shows a further recording with an alternative calibration element
  • FIG. 4 shows a picture of the ship section with defined danger areas.
  • Figures 1a and 1b show the monitoring of a ship section 4 with a monitoring device 1 with a calibration device 2.
  • the monitoring device 1 is designed to monitor a ship 3.
  • a ship section 4 of the ship 3 is monitored by video using the monitoring device 1.
  • the monitoring device 1 has two cameras 5a and 5b.
  • the camera 5a is designed as a camera for recording images in the visual area, the camera 5b being designed as an infrared camera and being able to record and / or create recordings even in complete darkness.
  • the monitoring device can also consist of just one of the two cameras.
  • the cameras 5a and 5b are aimed at the ship section 4 and reproduce it with video and / or image technology.
  • the recordings are made available to the calibration device 2 as video data.
  • Man-overboard monitoring takes place in the ship section 4 by means of the monitoring device 1. This monitors whether a person 6 goes overboard and is in danger. To this end, the monitoring device 1 determines a moving object in the ship section 4.
  • the moving object can be a person 6, for example.
  • the moving object is a person 6 or another object such as garbage or water.
  • An alarm is only output if the moving object has been characterized as a falling person 6.
  • other falling or flying objects, like cigarettes, birds or the like, are not counted as a man-overboard event, so no alarm is given.
  • a person 6 who falls overboard describes a parabolic trajectory 19.
  • the trajectory ends at the water surface 18 of the sea.
  • the person 6 and also objects are accelerated towards the sea with the acceleration of gravity. Any horizontal speed components of a falling person 6 and also falling objects are in particular not or almost not accelerated.
  • the trajectory can be described by the horizontal and vertical object positions or speeds (v_x, v_y) over time.
  • the speed v_y represents the accelerated movement perpendicular to the sea surface, the speed v_x being a constant or almost constant speed parallel to the water surface.
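
Purely as an illustration of this kinematic description (not taken from the patent), one can fit the tracked positions and test the fall hypothesis; the synthetic track below assumes positions already converted to metres by the calibration:

```python
import numpy as np

# Illustrative check: does a tracked object follow the fall hypothesis
# described above, i.e. constant horizontal speed v_x and a vertical motion
# accelerated by gravity?
t = np.linspace(0.0, 1.2, 13)                            # frame timestamps (s)
x = 0.4 * t + 0.01 * np.random.randn(t.size)             # synthetic track, v_x ~ 0.4 m/s
y = 0.5 * 9.81 * t**2 + 0.02 * np.random.randn(t.size)   # downward fall distance (m)

# Vertical motion: fit y(t) = 0.5*a*t^2 + v0*t + y0 and compare a with g.
a_fit, v0_fit, y0_fit = np.polyfit(t, y, 2)
vertical_accel = 2.0 * a_fit

# Horizontal motion: a falling object should show (almost) no acceleration.
horizontal_accel = 2.0 * np.polyfit(t, x, 2)[0]

is_fall_parabola = abs(vertical_accel - 9.81) < 1.5 and abs(horizontal_accel) < 1.0
print(f"vertical accel {vertical_accel:.2f} m/s^2, horizontal accel {horizontal_accel:.2f} m/s^2")
print("consistent with a fall trajectory" if is_fall_parabola else "not a fall trajectory")
```
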
  • the person 6 and / or the falling object has a length which is understood as a diameter, for example.
  • the diameter and / or the length can also be determined in that a rectangle is circumscribed around the falling object, the diameter describing the diagonal of the rectangle.
  • the video data are provided to the evaluation device 7.
  • the evaluation device 7 is part of the monitoring device 1 and is provided with the video data.
  • an evaluation device 7 can in particular be connected to one camera 5a or 5b or to a plurality of cameras 5a and 5b. Based on the video data, the evaluation device determines whether the moving object is a person 6. The video data from a plurality of cameras are evaluated without being linked to one another. In particular, the moving object is tracked by the evaluation device 7, for example in successive images of the video data.
  • the evaluation device 7 is designed to determine a kinematic variable for the moving object based on the video data.
  • the kinematic variable is, for example, a speed profile and / or acceleration values of the moving object.
  • the evaluation device is designed to determine, based on the kinematic variable, a size and/or an extension or the diameter of the moving object. For example, the evaluation device uses the acceleration due to gravity for this purpose.
  • a metric size can be assigned to a pixel by comparing the pixels traversed per second, or the pixels per second squared, of the moving object with the acceleration due to gravity. By determining the pixels along the diagonal or the extent of the moving object, the size of the moving object can be deduced. If the determined size of the moving object corresponds to an expected size of a person, an assessment is made as to whether or not it is a man-overboard event.
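
A hedged numerical example of this gravity-based scaling (the numbers are invented, not from the patent):

```python
# Illustrative arithmetic only: derive a metre-per-pixel scale from the fact
# that a free-falling object accelerates at g = 9.81 m/s^2, then convert the
# object's pixel extent into metres.
G = 9.81                      # m/s^2

pixel_accel = 410.0           # measured acceleration of the track, px/s^2
metres_per_pixel = G / pixel_accel          # ~0.024 m per pixel at that depth

bounding_diagonal_px = 72.0   # diagonal of the rectangle around the object
object_size_m = bounding_diagonal_px * metres_per_pixel
print(f"scale {metres_per_pixel * 100:.1f} cm/px, object size {object_size_m:.2f} m")

# A size around 1.5-2 m is compatible with a falling person; a few centimetres
# (cigarette, spray) is not. The bounds below are purely illustrative.
if 1.2 <= object_size_m <= 2.5:
    print("size plausible for a person -> evaluate as possible man-overboard event")
```
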
  • the monitoring device 1 and in particular the cameras 5a and 5b must be calibrated.
  • calibration is understood to mean the setting and/or determination of the intrinsic and extrinsic calibration parameters.
  • extrinsic calibration parameters are understood to mean parameters which depend on the ship section 4 and/or on the ship 3 due to the installation, alignment and/or removal of the camera 5a or 5b.
  • an extrinsic calibration parameter is the viewing angle and / or inclination angle of the camera 5a, 5b with respect to a horizontal and / or the water surface.
  • Intrinsic calibration parameters are understood to be parameters of the camera 5a, 5b which are particularly dependent on the imaging and / or imaging ratio of the cameras 5a, 5b.
  • intrinsic calibration parameters are, for example, a lens distortion and a focal length.
  • the intrinsic calibration parameters can in particular be set and/or established numerically; these can, for example, be provided to the calibration device 2.
  • the monitoring device 1 comprises the calibration device 2.
  • the calibration device 2 has an evaluation module 8 and an input module 9.
  • the input module 9 is connected to the evaluation module 8 in terms of data.
  • the input module 9 is designed so that a user can enter data graphically.
  • the input module 9 has for this purpose a display, for example a screen, on which a model of the ship 3, of the ship section 4 or the video data are displayed.
  • the user can select calibration elements 10 by means of the input module 9.
  • a calibration element 10 is a geometric object which has a position, length and orientation in the image, for example given by the start and end point of a line, and also has an orientation and/or length in the 3D world, for example perpendicular to the water surface.
  • the user can use the input module 9 to assign dimensions to the calibration element 10, for example the length of a line in the 3-D world.
  • the video data are provided to the evaluation module 8.
  • the intrinsic calibration parameters 11 are also provided to the evaluation module 8. These can, for example, have been sent and / or transmitted by the camera 5a, 5b. Alternatively, the intrinsic calibration parameters 11 can be transferred to the evaluation module 8 by input by the user on the input module 9
  • the evaluation module 8 is designed to determine the extrinsic calibration parameters 12 based on the intrinsic calibration parameters 11, the calibration element 10 and the video data.
  • the calibration element 10 is structural information in the video data.
  • the calibration element 10 is designed as an alignment line, as the horizon line or a fixed line in the video data.
  • the calibration elements 10 can also encompass known angles on the ship 3 and/or in the ship section 4. For example, known angles are that one object is perpendicular to another. In particular, the evaluation module 8 can comprise a ship model 13.
  • the ship model 13 is designed, for example, as a 3D model.
  • the calibration element 10 can also be selected in the displayed 3D model.
  • the evaluation module 8 is now designed to determine the extrinsic calibration parameters 12, such as the orientation and/or inclination of the view of the camera 5a, 5b onto the ship section 4, based on the information from the calibration element 10, such as position, length and/or orientation, and a comparison of how this calibration element appears in the video data.
  • the determined extrinsic calibration parameters 12 are provided in terms of data, in particular to the evaluation module 8.
  • Figure 2 schematically shows an image and/or a recording of the ship section 4 in the video data. The ship section 4 and/or its image shows the ship 3 recorded from the perspective of the camera 5a.
  • the image has several alignment lines 24.
  • the alignment lines 24 are set for example by horizontal lines in the 3D world on the deck of the ship and / or its side wall 14, which run towards the horizon and / or the horizon line 22.
  • the alignment lines 24 intersect with a ship horizon 23 at the vanishing point 58.
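
For illustration only (not part of the disclosure), the vanishing point of two drawn-in alignment lines can be computed from their homogeneous image coordinates; the pixel coordinates below are hypothetical:

```python
import numpy as np

# Sketch: intersect two drawn-in alignment lines (e.g. deck edge and railing)
# to obtain their common vanishing point, using homogeneous image coordinates.
def line_through(p, q):
    # homogeneous line through two image points (x, y)
    return np.cross([p[0], p[1], 1.0], [q[0], q[1], 1.0])

deck_edge = line_through((120.0, 820.0), (900.0, 600.0))
railing = line_through((150.0, 640.0), (910.0, 560.0))

vp = np.cross(deck_edge, railing)        # intersection of the two lines
vp = vp[:2] / vp[2]                      # back to pixel coordinates
print(f"vanishing point of the ship's horizontal lines: ({vp[0]:.0f}, {vp[1]:.0f})")

# Together with the intrinsic parameters, such a vanishing point constrains
# the camera's orientation relative to the ship's horizontal directions.
```
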
  • the ship 3 floats on the water, so that the height of the ship is perpendicular to the sea.
  • a line on the ship's wall 14, for example, has been selected by the user in the model as the calibration element 10. This is perpendicular to the surface of the water, and the user can enter and / or store this information, for example, by means of the input module 9.
  • the calibration element 10 in particular also has a length and/or an extension, the length here being, for example, two meters.
  • the user can also enter the information about the extension by means of the input module 9.
  • the evaluation module 8 is now designed to combine all information, i.e. the previously known calibration parameters and calibration elements 10, with one another in such a way that the unknown calibration parameters are determined.
  • the calibration parameters can be made available to the evaluation device 7.
  • FIG. 3 shows a further example of a recording of the ship section 4 by the camera 5a.
  • the alignment line is defined, for example, by a railing 15 and / or set by the user.
  • from the course of the railing 15 as an alignment line, the orientation, position and/or rotation of the camera when recording the ship section 4 can be determined.
  • This rotation and / or orientation is provided to the evaluation module as an extrinsic calibration parameter after the analysis, for example.
  • a post 26 of known height, which is arranged on the deck 27, can be selected as the calibration element 10.
  • FIG. 4 shows the ship section 4 in the form of an image in the video data, recorded with the camera 5a.
  • This image is drawn and / or displayed on the input module 9, for example.
  • the user can define areas in this image for which a monitoring and / or evaluation of moving objects is provided if the starting point of the movement of the moving object is in this area.
  • the areas selected in this way form danger areas 16.
  • the danger areas 16 can be selected and / or entered, for example, in the form of a surface.
  • the user defines the corner points and / or edge points for this purpose, with a closed contour then being formed that is stored as the danger area 16.
  • a section along the railing 15 and the window openings 28 form the danger areas 16.
  • the evaluation module 8 does not interpret a moving object that has no starting point in the danger area 16 as a man overboard event.
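
A minimal sketch of such a starting-point test (illustrative only; the polygon and track coordinates are invented) could use a ray-casting point-in-polygon check:

```python
from typing import List, Tuple

Point = Tuple[float, float]

def point_in_polygon(pt: Point, polygon: List[Point]) -> bool:
    """Ray-casting test: is the image point inside the closed polygon?"""
    x, y = pt
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        crosses = (y1 > y) != (y2 > y)
        if crosses and x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
            inside = not inside
    return inside

# Hypothetical danger area along the railing, entered as corner points (px).
danger_area = [(60.0, 380.0), (620.0, 300.0), (620.0, 360.0), (60.0, 460.0)]

track_start = (310.0, 365.0)   # first detected position of the moving object
if point_in_polygon(track_start, danger_area):
    print("start point inside danger area -> evaluate as possible MOB event")
else:
    print("start point outside danger area -> discard track")
```
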

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Health & Medical Sciences (AREA)
  • Image Processing (AREA)
  • Emergency Alarm Devices (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

Calibration device (2) for a monitoring device (1), the monitoring device (1) being designed for man-overboard monitoring of a ship section (4), the monitoring device comprising at least one camera (5a, 5b) for video monitoring of the ship section (4) and for providing video data, the camera (5a, 5b) having at least one intrinsic calibration parameter (11) and at least one extrinsic calibration parameter (12), the video data being provided to the calibration device (2). The calibration device (2) comprises an input module (9) for the input of one or more calibration elements (10) by the user, and an evaluation module (8) designed to determine the unknown calibration parameters (11, 12) based on the calibration elements (10), in particular based on their orientation and/or extension.
EP20700878.0A 2019-02-06 2020-01-13 Dispositif d'étalonnage pour un dispositif de surveillance, dispositif de surveillance pour la détection d'homme à la mer et procédé d'étalonnage Pending EP3921804A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102019201490.9A DE102019201490A1 (de) 2019-02-06 2019-02-06 Kalibriereinrichtung für eine Überwachungsvorrichtung, Überwachungsvorrichtung zur Man-Overboard-Überwachung sowie Verfahren zur Kalibrierung
PCT/EP2020/050667 WO2020160874A1 (fr) 2019-02-06 2020-01-13 Dispositif d'étalonnage pour un dispositif de surveillance, dispositif de surveillance pour la détection d'homme à la mer et procédé d'étalonnage

Publications (1)

Publication Number Publication Date
EP3921804A1 true EP3921804A1 (fr) 2021-12-15

Family

ID=69172788

Family Applications (1)

Application Number Title Priority Date Filing Date
EP20700878.0A Pending EP3921804A1 (fr) 2019-02-06 2020-01-13 Dispositif d'étalonnage pour un dispositif de surveillance, dispositif de surveillance pour la détection d'homme à la mer et procédé d'étalonnage

Country Status (4)

Country Link
US (1) US11595638B2 (fr)
EP (1) EP3921804A1 (fr)
DE (1) DE102019201490A1 (fr)
WO (1) WO2020160874A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111915033A (zh) * 2020-08-13 2020-11-10 日照古工船舶服务有限公司 一种船舶检修监控系统及方法
CN112802117B (zh) * 2020-12-31 2022-04-08 清华大学苏州汽车研究院(吴江) 一种激光雷达和摄像头标定参数盲复原方法

Family Cites Families (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5699444A (en) * 1995-03-31 1997-12-16 Synthonics Incorporated Methods and apparatus for using image data to determine camera location and orientation
DE10318500A1 (de) * 2003-04-24 2004-11-25 Robert Bosch Gmbh Vorrichtung und Verfahren zur Kalibrierung eines Bildsensors
US7884849B2 (en) * 2005-09-26 2011-02-08 Objectvideo, Inc. Video surveillance system with omni-directional camera
US7949150B2 (en) * 2007-04-02 2011-05-24 Objectvideo, Inc. Automatic camera calibration and geo-registration using objects that provide positional information
US8031210B2 (en) * 2007-09-30 2011-10-04 Rdv Systems Ltd. Method and apparatus for creating a composite image
US8059154B1 (en) * 2008-10-06 2011-11-15 Verint Systems Ltd. Systems and methods for automatic camera calibration
MX2012011118A (es) * 2010-03-26 2013-04-03 Fortem Solutions Inc Navegacion sin esfuerzo a traves de camaras y control cooperativo de camaras.
TWI426775B (zh) * 2010-12-17 2014-02-11 Ind Tech Res Inst 攝影機再校正系統及其方法
US8810436B2 (en) * 2011-03-10 2014-08-19 Security Identification Systems Corporation Maritime overboard detection and tracking system
GB2493390A (en) * 2011-08-05 2013-02-06 Marine & Remote Sensing Solutions Ltd System for detecting a person overboard event
US20130214942A1 (en) * 2012-02-21 2013-08-22 Stephen Howard Joss Man Overboard Detection, Tracking and Recovery
EP2779102A1 (fr) * 2013-03-12 2014-09-17 E.sigma Systems GmbH Procédé de génération d'une séquence vidéo animée
EP2866052A1 (fr) * 2013-10-23 2015-04-29 Ladar Limited Système pour surveiller un environnement maritime
US11288517B2 (en) * 2014-09-30 2022-03-29 PureTech Systems Inc. System and method for deep learning enhanced object incident detection
US9569671B1 (en) * 2014-09-30 2017-02-14 Puretech Systems, Inc. System and method for man overboard incident detection
US10007849B2 (en) * 2015-05-29 2018-06-26 Accenture Global Solutions Limited Predicting external events from digital video content
CA2961921C (fr) * 2016-03-29 2020-05-12 Institut National D'optique Methode d'etalonnage de camera au moyen d'une cible d'etalonnage
GB2550112B (en) * 2016-04-29 2019-10-09 Marss Ventures S A Method of verifying a potential detection of a man overboard event and alert verification processing apparatus
GB2550111B (en) * 2016-04-29 2019-10-09 Marss Ventures S A Method of verifying a triggered alert and alert verification processing apparatus
US9896170B1 (en) * 2016-08-12 2018-02-20 Surveillance International, Inc. Man overboard detection system
US10372970B2 (en) * 2016-09-15 2019-08-06 Qualcomm Incorporated Automatic scene calibration method for video analytics
CN110463182B (zh) * 2017-03-31 2021-07-13 松下知识产权经营株式会社 摄像系统以及校正方法
US10635844B1 (en) * 2018-02-27 2020-04-28 The Mathworks, Inc. Methods and systems for simulating vision sensor detection at medium fidelity
US10839557B1 (en) * 2018-04-03 2020-11-17 A9.Com, Inc. Camera calibration for augmented reality
WO2019206247A1 (fr) * 2018-04-27 2019-10-31 Shanghai Truthvision Information Technology Co., Ltd Système et procédé d'étalonnage de caméra
WO2019222255A1 (fr) * 2018-05-14 2019-11-21 Sri International Systèmes et procédés d'inspection assistée par ordinateur
DE102018215125A1 (de) * 2018-09-06 2020-03-12 Robert Bosch Gmbh Überwachungsvorrichtung und Verfahren zur Mann-über-Bord-Überwachung
DE102019201493A1 (de) * 2019-02-06 2020-08-06 Robert Bosch Gmbh Überwachungsvorrichtung und Verfahren zur Man-Overboard-Überwachung eines Schiffsabschnitts
US20220301302A1 (en) * 2019-08-29 2022-09-22 FLIR Belgium BVBA Air and sea based fishing data collection and analysis systems and methods
US11270467B2 (en) * 2020-01-21 2022-03-08 Compound Eye, Inc. System and method for camera calibration

Also Published As

Publication number Publication date
US20220191467A1 (en) 2022-06-16
WO2020160874A1 (fr) 2020-08-13
DE102019201490A1 (de) 2020-08-06
US11595638B2 (en) 2023-02-28

Similar Documents

Publication Publication Date Title
DE102007011616B4 (de) Fahrzeugumgebung-Überwachungsgerät
DE102016220075A1 (de) Kraftfahrzeug und Verfahren zur 360°-Umfelderfassung
DE102014213981A1 (de) Parkplatz-Erfassungsvorrichtung und entsprechendes Verfahren
DE102009054698A1 (de) Verfahren zum Positionieren wenigstens einer Komponente, insbesondere eines Sitzes, in oder an einem Luft- oder Raumfahrzeug, sowie Luft- oder Raumfahrzeug
WO2005038395A1 (fr) Procede et dispositif pour determiner la position courante d'un instrument geodesique
EP1913565B1 (fr) Dispositif de detection
EP3921804A1 (fr) Dispositif d'étalonnage pour un dispositif de surveillance, dispositif de surveillance pour la détection d'homme à la mer et procédé d'étalonnage
EP3921819B1 (fr) Dispositif de surveillance et procédé de surveillance d'une partie de navire pour la détection d'homme à la mer
DE102011078746A1 (de) Abstands- und Typenbestimmung von Flugzeugen während des Andockens an das Gate
EP3847575A1 (fr) Dispositif de surveillance et procédé de détection d'homme à la mer
DE102016201741A1 (de) Verfahren zur Höhenerkennung
DE10151983A1 (de) Verfahren zur Dokumentation einer Unfallsituation
DE112019004963T5 (de) Optikbasiertes mehrdimensionales Ziel- und Mehrfachobjekterkennungs- und verfolgungsverfahren
EP2502205A2 (fr) Procédé pour générer une représentation d'un environnement
EP3435030A1 (fr) Procédé de création d'un modèle tridimensionnel d'un objet
DE10153113A1 (de) Verfahren und Vorrichtung zur Entfernungsbestimmung
DE19517026B4 (de) Verfahren zur Bestimmung der Geschwindigkeit eines Fahrzeuges mit Hilfe einer das Fahrzeug aufnehmenden Videokamera und Vorrichtung zur Durchführung des Verfahrens
DE102007046288B4 (de) Verfahren und Sensoranordnung zur Vermessung optischer Merkmale
DE19828318C2 (de) Drahthervorhebung
DE19517031A1 (de) Verfahren zur Bestimmung der Länge eines Fahrzeuges mit Hilfe einer das Fahrzeug aufnehmenden Videokamera und Vorrichtung zur Durchführung des Verfahrens
EP3844947A1 (fr) Procédé et ensemble de génération d'une représentation de l'environnement d'un véhicule et véhicule pourvu d'un tel ensemble
DE102006014546B4 (de) Verfahren und Vorrichtung zum sensorbasierten Überwachen einer Umgebung
DE10065180A1 (de) Sensorsystem mit optischen sensoren zur Kollisionsvermeidung von Flugzeugen sowie ein Verfahren zu deren Durchführung
DE19744694A1 (de) Videobewegungsmeldeeinrichtung
WO2017161401A1 (fr) Procédé d'auto-localisation de véhicules

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20210906

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS