WO2017089364A1 - Multifunctional camera unit for coupling to a first mobile carrier system - Google Patents

Multifunctional camera unit for coupling to a first mobile carrier system

Info

Publication number
WO2017089364A1
Authority
WO
WIPO (PCT)
Prior art keywords
camera
camera unit
multifunctional
interest
carrier system
Prior art date
Application number
PCT/EP2016/078485
Other languages
German (de)
English (en)
Inventor
Jens Müller
Joachim Horn
Original Assignee
Helmut-Schmidt Universität
Hamburg-Innovation Gmbh
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Helmut-Schmidt Universität, Hamburg-Innovation Gmbh filed Critical Helmut-Schmidt Universität
Priority to DE112016005353.3T priority Critical patent/DE112016005353A5/de
Publication of WO2017089364A1 publication Critical patent/WO2017089364A1/fr


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U20/00Constructional aspects of UAVs
    • B64U20/80Arrangement of on-board electronics, e.g. avionics systems or wiring
    • B64U20/87Mounting of imaging devices, e.g. mounting of gimbals
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B15/00Special procedures for taking photographs; Apparatus therefor
    • G03B15/006Apparatus mounted on flying objects
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B15/00Special procedures for taking photographs; Apparatus therefor
    • G03B15/02Illuminating scene
    • G03B15/03Combinations of cameras with lighting apparatus; Flash units
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B17/00Details of cameras or camera bodies; Accessories therefor
    • G03B17/56Accessories
    • G03B17/561Support related camera accessories
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/45Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50Constructional details
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/56Cameras or camera modules comprising electronic image sensors; Control thereof provided with illuminating means
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/188Capturing isolated or intermittent images triggered by the occurrence of a predetermined event, e.g. an object reaching a predetermined position
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2101/00UAVs specially adapted for particular uses or applications
    • B64U2101/30UAVs specially adapted for particular uses or applications for imaging, photography or videography

Definitions

  • Multifunctional camera unit for coupling to a first mobile carrier system
  • the invention relates to a multifunctional camera unit for coupling to a first mobile carrier system, in particular a mobile robot system. Furthermore, the invention relates to a method for determining position data of at least one object of interest, in particular a second mobile carrier system, using a multifunctional camera unit with a plurality of cameras, wherein the multifunctional camera unit is coupled to a first mobile carrier system. In a further aspect, the invention comprises a use of a multifunctional camera unit according to the invention for tracking at least one object of interest, in particular a second carrier system.
  • US 2007/0288132 A1 describes how a cooperative swarm of unmanned robots works using three different systems.
  • a global satellite-based location system is used for the respective self-localization in the swarm participating robots.
a separate transmitter with a so-called omnipresent signal is used, which is available at all times to the robots involved in the swarm.
the routing is ensured by a separate system, which is located on board each robot alongside the two aforementioned systems.
  • GPS external positioning system
Positioning inaccuracies due to weather-related signal disturbances, or due to unfavorable local constellations of the satellite positions relative to surrounding signal-influencing buildings or geodetically relevant elevations, can lead to significant deviations from the set goals and to increased energy requirements in the spatial coordination of swarm participants.
A1 describes, for realizing a swarm consisting of multi-legged robots, star-shaped mobile base stations connected to further multi-legged robots that are operated externally by a person; an external positioning system (GPS) is used for positioning.
  • the operator has the option of setting targets and retrieving recorded sensor data.
  • the robots may view the surrounding area and, for example, respond to fire through a dedicated program.
the communication between the multi-legged robot systems takes place exclusively via the entrained star-shaped mobile base station; moreover, the robots can communicate only indirectly, via the operator, with other robots operated in a further star, if so provided.
the problem here is again the dependence on an external location system: in the absence of a signal, the lack of information about the local whereabouts of the robots participating in the swarm makes their coordination much more difficult and calls realistic use into question.
US 2010/0268409 A1 describes a method for inspecting structures such as buildings, bridges, railway infrastructure etc. with the help of unmanned drones organized in a swarm, which are equipped with application-oriented sensors and cameras.
a mobile centralized station, e.g. an aircraft, is provided for the inspection of the collected data; it has a data connection based on a suitable wireless technology, receives the collected image and sensor data from the drones, and computationally evaluates or processes them.
  • the centralized station also creates the flight plans, transfers them to the swarm participants, adapts them dynamically to local conditions and, after successful completion of a flight plan, distributes new tasks to the participating drones.
  • the application focus is outside and based on an external location system.
US 2009/0099768 A1 describes a navigation system which, in addition to the known so-called target attraction beacons, also has a so-called collision avoidance beacon. This is able to receive collision avoidance signals globally from surrounding drones, but also to broadcast collision avoidance signals globally itself. With regard to the collision signal, it is always assumed that the aircraft drones involved also emit such a signal; otherwise, the risk of a momentous blind flight is significantly increased and the described functionalities are challenged in terms of their practical applicability. Because real obstacles such as buildings or walls do not send collision signals, local sensors are used to detect such obstacles. The decision as to which direction to take to avoid the obstacle is based on knowledge of how the obstacle is structured globally, in order to achieve the intended goal in an efficient, energy-saving and forward-looking manner. If the participating drones did not rely essentially on the described simple radio signals alone for target tracking and collision avoidance, but supplemented them with imaging sensors, use in urban areas would be much easier to achieve thanks to an early-adjusted data situation.
the object of the invention is to further develop a multifunctional camera unit in such a way that it enables its user, e.g. a robot system, to identify and classify static and dynamic objects, such as further airborne drones, from its own motion, and to organize them locally via internally calculated spatial coordinates, so that highly complex, fully autonomous tasks, such as search and rescue operations, can be performed in mobile robotics and in other application-relevant areas such as security technology.
a multifunctional camera unit for coupling to a first mobile carrier system, in particular a mobile robot system, wherein the camera unit can be used statically and/or mobile, and wherein the camera unit is designed for recognition and tracking of static and dynamic objects.
  • a mobile carrier system is to be understood primarily as a mobile robot system.
the term mobile carrier system also covers unmanned mobile robots, so-called UAVs (Unmanned Aerial Vehicles).
the carrier system is in particular designed to be airworthy.
  • the mobile carrier system is preferably a drone, especially the aircraft drone of the type ATC 3.0 (Airshark Technology Carrier).
Mobile carrier systems form groups for certain tasks, especially a swarm, in which each mobile carrier system represents a swarm participant.
  • a swarm is understood to mean at least two swarm participants who act in the context of a superordinate task in the network.
  • such a swarm comprises several mobile carrier systems.
  • the multifunctional camera unit is designed for coupling to the first mobile carrier system.
  • a coupling is understood to be, in particular, the production of a detachable connection.
  • the multifunctional camera unit is arranged on the first mobile carrier system.
  • multifunctional camera unit is to be understood in particular as meaning a multifunctional recording unit of optical information.
  • the multifunctional camera unit is referred to as “swarm flight unit” (SFU).
SFU: swarm flight unit
the multifunctional camera unit comprises imaging sensors and movable components that give it its multifunctionality.
  • the invention relates to a system comprising a multifunctional camera unit and a first mobile carrier system, wherein the multifunctional camera unit is coupled to the first mobile carrier system.
  • the multifunctional camera unit comprises a plurality, preferably at least two, furthermore preferably at least three, in particular at least four camera slides movably arranged on a circular path, the camera slides each having at least one camera, at least one ring illumination and at least one point projector.
  • Each camera carriage is preferably assigned exactly one camera, a ring illumination and a point projector.
the camera slides lend the cameras their mobility.
  • the cameras are horizontally and / or vertically movable due to their arrangement on the camera carriage.
each camera carriage preferably includes two drives: one drive is responsible for the horizontal movement of the camera carriage, the other for the vertical circular movement of the camera arranged on the carriage.
  • the mobility or the position data of the camera slides or cameras are mainly determined with reference to the geometric origin of the camera unit, which is primarily the center of the camera unit.
the cameras are adjustable in pairs for stereoscopic operation.
  • the camera slides are arranged in particular movable on the same circular path.
  • the cameras are in particular designed such that they cooperate with an electronic evaluation unit and image processing algorithms for evaluating the data recorded by them.
  • the point projector is a light projector, especially a laser diode designed to project patterns such as points or lines onto objects of interest.
  • the projection of such patterns as visible structures facilitates the detection and tracking of objects and / or distance determination to objects, especially at less favorable illumination and contrast ratios.
the arrangement of the point projectors in relation to the respective recording plane is preferably orthogonal, but may also assume an arbitrary other angle, provided the projection remains visible in the image, with regard to additional distance information about the target. In a further embodiment variant, these setting angles can therefore be adapted dynamically.
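The distance determination via a projected laser dot described above can be illustrated with a minimal sketch (not from the patent; the parallel-beam geometry, function name and parameter values are illustrative assumptions): if the projector beam runs parallel to the camera's optical axis at a known lateral baseline, the dot's pixel offset from the principal point encodes the range.

```python
def spot_distance(pixel_offset: float, focal_px: float, baseline_m: float) -> float:
    """Range to the surface hit by the laser dot.

    Illustrative assumption: the point projector's beam is parallel to
    the camera's optical axis at lateral baseline `baseline_m`.  The dot
    then images at offset x = f * b / Z from the principal point, so
    Z = f * b / x.
    """
    if pixel_offset <= 0:
        raise ValueError("dot must be offset from the principal point")
    return focal_px * baseline_m / pixel_offset


# Example: 800 px focal length, 5 cm baseline, dot seen 20 px off-center
# -> the illuminated surface is 2 m away.
print(spot_distance(20.0, 800.0, 0.05))
```

With a dynamically adapted (non-orthogonal) projector angle, the same idea applies with the angular offset folded into the triangulation, as the embodiment variant above suggests.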
  • the ring illumination is arranged such that the camera is located in the center of the ring formed by the ring illumination.
  • the ring illumination illuminates the scene or picture taken by the camera.
  • the multifunctional camera unit has a static central camera with at least one point projector and at least one ring illumination.
  • static is to be understood as meaning that the central camera is not movable in relation to the camera unit.
the central camera, in particular, is not arranged on a camera carriage that could lend it mobility. Even when the mobile carrier system is not moving, the application certainly deals with highly dynamically changing image sequences.
the multifunctional camera unit comprises a plurality, preferably four, static cameras, each having at least one, preferably exactly one, point projector and at least one, preferably exactly one, ring illumination.
  • the static cameras are not designed to be mobile and, in particular, are not arranged on a camera carriage.
  • the four stationary cameras are in particular arranged around a neck of the camera unit, preferably at regular intervals from one another.
  • the areas between the lateral arms of the first mobile carrier system are optically covered.
  • the multifunctional camera unit is in particular designed to determine position data between the camera unit and the first mobile carrier system and / or position data between the camera unit and at least one object of interest, in particular a second carrier system.
position data is to be understood in particular as meaning three-dimensional data about the position, i.e. the exact location, preferably together with the angular position.
  • angular position means the angle at which the camera is arranged and thus takes pictures.
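A minimal sketch (not from the patent; function name and angle conventions are illustrative assumptions) of how a measured range plus the horizontal and vertical viewing angles of a camera can be turned into three-dimensional position data relative to the geometric origin of the camera unit:

```python
import math

def to_cartesian(distance_m: float, azimuth_deg: float, elevation_deg: float):
    """Convert range + horizontal (azimuth) and vertical (elevation)
    viewing angles into 3D coordinates relative to the camera unit's
    geometric origin (its center).  Angle conventions are illustrative.
    """
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = distance_m * math.cos(el) * math.cos(az)
    y = distance_m * math.cos(el) * math.sin(az)
    z = distance_m * math.sin(el)
    return (x, y, z)


# An object 10 m away, straight ahead and level, lies on the x-axis.
print(to_cartesian(10.0, 0.0, 0.0))
```

Adding the known mounting offset between the camera unit and the first mobile carrier system then yields the position data in the carrier system's frame.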
the multifunctional camera unit is preferably designed to determine the position data without the aid of additional external data, in particular GPS data. In other words, the multifunctional camera unit is designed to determine position data exclusively on the basis of the data recorded by itself.
a second carrier system is an airworthy object, in particular a drone, especially an aircraft drone.
  • the first mobile carrier system and the second mobile carrier system are swarm participants of the same swarm.
  • the swarm comprises further swarm participants, wherein the multifunctional camera unit is designed to determine the position of all the carrier systems to be involved or already involved in the swarm.
  • the other carrier systems do not have to have a multifunctional camera unit.
in particular, neither the second carrier system nor any other swarm participant of the swarm has a multifunctional camera unit.
  • the other carrier systems may include a multifunctional camera unit.
  • the multifunctional camera unit further preferably comprises a means for image processing and calculation of position data between the camera unit and the first mobile carrier system and / or position data between the camera unit and at least one object of interest, in particular a second carrier system, and preferably a means for data transmission of the calculated position data, in particular a data interface between the camera unit and the first mobile carrier system.
the interface is above all an intelligent interface, as described in particular in the published patent application DE 10 2013 100 155 A1.
the interface is designed primarily as a functional, hybrid interface.
the means for image processing and calculation of position data is mainly suitable for object recognition, i.e. the detection of objects of interest, which may be static or dynamic.
the transmission of the calculated position data to the first mobile carrier system serves, for example, to control the first mobile carrier system, in particular in order to fly in a swarm, to convey electronic and/or mechanical loads, to change over automatically to a suitable ground station, and/or to make a landing approach.
the determination of the positional data between the camera unit, in particular the different cameras of the camera unit, and the first mobile carrier system is crucial, because only then can the various data recorded by the cameras be put in spatial relation to the cameras and thus to the first mobile carrier system.
  • at least one camera of the camera unit and at least one point projector assigned to it form a distance warning system.
  • the distance warning system is designed to determine the respective distance between the cameras of the camera unit and at least one object of interest, in particular a second mobile carrier system.
  • all the cameras of the camera unit and the respectively assigned point projectors form a distance warning system.
  • the distance warning system serves in particular as a means for spatial collision avoidance of the first mobile carrier system and an object of interest, in particular a second carrier system. Furthermore, the distance warning system serves to support the landing process of the first mobile carrier system and / or of a second mobile carrier system. Furthermore, the distance warning system can provide distance information of an object of interest.
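The distance warning system's roles above (collision avoidance, landing support, distance information) can be sketched as a simple range classifier. This is an illustrative sketch, not from the patent; the function name and the threshold values are assumptions.

```python
def collision_warning(distance_m: float,
                      warn_m: float = 5.0,
                      critical_m: float = 1.5) -> str:
    """Map a measured range to an object of interest onto a warning
    level for the carrier system.  Thresholds are illustrative.
    """
    if distance_m <= critical_m:
        return "critical"   # e.g. initiate an evasive manoeuvre
    if distance_m <= warn_m:
        return "warn"       # e.g. reduce closing speed, keep tracking
    return "clear"          # no action needed, distance only reported


# During a landing approach the same classifier can gate the descent rate.
print(collision_warning(3.0))
```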
  • the camera carriages are movably arranged on a common circular path, wherein each camera carriage is assigned a working area, wherein adjacent working areas overlap in an overlapping area.
  • the camera unit is designed such that the camera slides are secured against collision.
  • the movement of a camera slide is assigned a segment of the circular path.
  • each camera carriage is assigned a quadrant as a work area.
A work area is thus to be understood as a range between defined mechanical movement limits of a camera slide of the camera unit.
  • each workspace is assigned only one camera slide.
  • the work areas are arranged in an overlapping manner in the plane, so that overlap areas form at the ends of the adjacent work areas.
  • An overlap region preferably extends over 5 ° to 15 ° of the common circular path.
the camera unit is designed in such a way that, despite the overlapping areas, a collision of adjacent cameras or camera slides is avoided.
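The quadrant-plus-overlap geometry described above can be sketched as follows (illustrative, not from the patent; the 10° overlap is one value within the 5°–15° range stated above, and names are assumptions): each of the four carriages owns one 90° quadrant, widened at both ends so adjacent work areas share a small band.

```python
OVERLAP_DEG = 10.0              # within the 5°-15° band named above
N_CARRIAGES = 4
SEGMENT = 360.0 / N_CARRIAGES   # each carriage owns one quadrant

def carriages_covering(angle_deg: float) -> set:
    """Indices of camera carriages whose work area contains the angle.

    Quadrant i nominally spans [i*90°, (i+1)*90°) and is widened by half
    the overlap on both ends, so adjacent work areas overlap slightly.
    """
    a = angle_deg % 360.0
    half = OVERLAP_DEG / 2.0
    covering = set()
    for i in range(N_CARRIAGES):
        lo = i * SEGMENT - half
        hi = (i + 1) * SEGMENT + half
        # test in unwrapped coordinates to handle the 0°/360° seam
        for shift in (-360.0, 0.0, 360.0):
            if lo <= a + shift < hi:
                covering.add(i)
    return covering


# An angle inside a quadrant belongs to one carriage; a boundary angle
# falls into the overlap area shared by two carriages.
print(carriages_covering(45.0), carriages_covering(90.0))
```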
  • the camera unit is preferably designed to determine the position data of the camera slides and the position data of the cameras arranged on the camera carriage, in particular the angular positions of the cameras.
  • the camera unit has an incremental encoder, comprising an optical transmitter, an incremental disk and a receiver.
both a horizontal and a vertical angle are assigned to an object of interest captured relative to the first mobile carrier system.
the incremental disk provided for the horizontal movement of the camera carriages has a number of spatially separated tracks which overlap piecewise at an angle.
  • the number of tracks corresponds to the number of camera slides arranged on the incremental disk.
  • the incremental disk has four spatially separated tracks that overlap each other at an angle, so that the camera slides can, if necessary, also operate outside their assigned working area in an overlapping area.
  • each camera carriage has an incremental disk, via which the vertical angular position of the camera can be determined.
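The incremental encoder (optical transmitter, incremental disk, receiver) described above yields the angular position by counting track transitions. A minimal sketch, not from the patent: a standard quadrature decode of two-channel (A, B) samples; the transition table, resolution and function names are illustrative assumptions.

```python
# Gray-code transition table for a quadrature (A, B) incremental encoder:
# +1 for one step in the positive direction, -1 for the negative one.
_STEP = {
    (0, 0): {(0, 1): +1, (1, 0): -1},
    (0, 1): {(1, 1): +1, (0, 0): -1},
    (1, 1): {(1, 0): +1, (0, 1): -1},
    (1, 0): {(0, 0): +1, (1, 1): -1},
}

def decode(samples, counts_per_rev: int = 1024) -> float:
    """Fold a stream of (A, B) receiver samples into an angle in degrees.

    Invalid transitions (a skipped state) contribute 0; a real decoder
    would flag them as an error.  Resolution is illustrative.
    """
    count = 0
    prev = samples[0]
    for cur in samples[1:]:
        count += _STEP[prev].get(cur, 0)
        prev = cur
    return count * 360.0 / counts_per_rev


# Four forward steps on a 1024-count disk advance the angle by ~1.4°.
print(decode([(0, 0), (0, 1), (1, 1), (1, 0), (0, 0)]))
```

The same decode serves both the shared horizontal disk (one track per carriage) and the per-carriage vertical disks.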
  • the invention relates to a method for determining position data of at least one object of interest, in particular a second mobile carrier system, using a multifunctional camera unit with a plurality of cameras, wherein the multifunctional camera unit is coupled to a first mobile carrier system.
  • the method comprises the determination of position data of at least one object of interest, in particular of the second mobile carrier system, exclusively based on data recorded by the camera unit.
  • the determination of the position data thus takes place explicitly without the aid of GPS data or other external data.
  • the method further comprises a collision check of the cameras of the camera unit.
  • the method comprises using a multifunctional camera unit according to the invention described above coupled to a first mobile carrier system.
  • the collision check relates in particular to the movement of the camera carriages on which the cameras of the camera unit are arranged.
  • the collision check prevents adjacent moving camera carriages from entering overlapping areas where they could collide with adjacent camera carriages.
the collision test temporally precedes the movement of the camera carriage. This allows the cameras to move freely within their work areas without risk of collision.
  • the collision check takes place above all at regular intervals, in particular permanently.
the collision check comprises checking whether a camera is already present in a work area, in particular in an overlapping area, before another camera is moved into this area. Within the overlapping areas, the entry of a camera into a "foreign" work area, i.e. a work area of another camera, is only allowed if that area is not in use, i.e. is free. If a camera is already present in the overlap area, the entry of another camera is not permitted.
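The entry rule above amounts to mutual exclusion on each overlap area. A minimal sketch (not from the patent; names and the dict-based occupancy bookkeeping are illustrative assumptions):

```python
def may_enter_overlap(overlap_id: str, requester: str, occupancy: dict) -> bool:
    """A carriage may enter an overlap area only if it is free or
    already held by the requester itself."""
    holder = occupancy.get(overlap_id)
    return holder is None or holder == requester

def enter_overlap(overlap_id: str, requester: str, occupancy: dict) -> bool:
    """Run the collision check before the movement: try to claim the
    overlap area; return True only if entry is permitted."""
    if may_enter_overlap(overlap_id, requester, occupancy):
        occupancy[overlap_id] = requester
        return True
    return False


# cam0 claims the overlap between quadrants 0 and 1; cam1 is refused
# until cam0 releases it (occupancy.pop("q0q1")).
occupancy = {}
print(enter_overlap("q0q1", "cam0", occupancy),
      enter_overlap("q0q1", "cam1", occupancy))
```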
  • the method comprises a regular distance monitoring of the distances between the cameras of the camera unit and / or regular distance monitoring of the distance between the multifunctional camera unit and the object of interest, in particular the second mobile carrier system, wherein the distance monitors are formed separately from each other.
  • the distance monitoring is carried out in particular at regular intervals, preferably permanently.
  • the method may include projecting patterns onto the object of interest by means of point projectors facilitating the recognition of the object.
the evaluation of the propagation information of this point projection, coupled with the measured viewing direction, i.e. the exact vertical and horizontal position data of the camera slides or the camera, takes place within the camera unit by determining the projected area in relation to the projection pattern and the surrounding contour of the object of interest.
the projected patterns are not formed orthogonally to the respective recording plane, so that additional distance information can be evaluated from the resulting angular offset.
  • the method may include dynamically adjusting the adjustment angles.
  • the method comprises storing all of the raw data recorded by the camera unit, in particular image recordings, within the multifunctional camera unit. These can be used in particular for downstream evaluations with regard to special application-related measurement tasks, not just those that originate from the field of geodesy.
Specific measuring tasks include, for example, the optical recording of infrastructure (e.g. superstructure, signal systems, parking facilities, railway stations) of local rail transport and of rail-bound long-distance domestic and international transport, whereby the condition and completeness of system components can be checked relatively quickly with regard to operational safety and the general planning of maintenance measures.
the inventive method can be very cost-saving compared to conventional test drives, which are only possible after advance notification, application for a timetable slot, fees for energy supply, track and station use, disposition of a rail vehicle and provision of a local train driver, and which in metropolitan areas must moreover be planned up to several months in advance.
  • the method comprises the object recognition with the aid of digital image processing, with a subdivision into preprocessing, segmentation, feature extraction and analysis taking place here.
in preprocessing, the acquired digital color image is pretreated by means of point operators as well as local and global filter operations, with constant information content, so that the image size to be processed decreases significantly in terms of the required storage volume.
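A minimal sketch of such preprocessing (illustrative, not the patent's actual operators): a point operator (luma conversion) followed by a local operator (2x2 mean pooling) that shrinks the image to a quarter of its pixel count while keeping the scene content.

```python
def preprocess(rgb_image):
    """Point operator + local filter operation, as a toy example.

    `rgb_image` is a list of rows of (r, g, b) tuples.  The luma
    weights are the common Rec. 601 values; the 2x2 pooling is an
    illustrative stand-in for the patent's filter operations.
    """
    # Point operator: per-pixel color -> grayscale conversion.
    gray = [[0.299 * r + 0.587 * g + 0.114 * b for (r, g, b) in row]
            for row in rgb_image]
    # Local operator: 2x2 mean pooling halves each dimension.
    pooled = []
    for y in range(0, len(gray) - 1, 2):
        pooled.append([
            (gray[y][x] + gray[y][x + 1] +
             gray[y + 1][x] + gray[y + 1][x + 1]) / 4.0
            for x in range(0, len(gray[0]) - 1, 2)
        ])
    return pooled


# A 2x2 white image collapses to a single full-brightness pixel.
print(preprocess([[(255, 255, 255)] * 2] * 2))
```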
If an object of interest has been found, it is listed system-internally and tracked within the work area by so-called mechanical and software-based tracking. For runtime reasons, an expensive pattern recognition is not constantly performed anew; rather, the object of interest is optically substituted by a simple geometric primitive, for example a line, a triangle or a quadrangle, and tracked in the image.
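The primitive-substitution idea above can be sketched as follows (illustrative, not from the patent; class and method names are assumptions): once recognized, the object is represented only by a simple rectangle whose position is updated per frame.

```python
class PrimitiveTrack:
    """Tracks an already-recognized object as a simple geometric
    primitive (here: an axis-aligned rectangle), so the expensive
    pattern recognition is not rerun every frame."""

    def __init__(self, cx: float, cy: float, w: float, h: float):
        self.cx, self.cy, self.w, self.h = cx, cy, w, h

    def update(self, detections):
        """Snap to the candidate centroid nearest the current estimate;
        with no detections (e.g. brief occlusion) keep the last one."""
        if not detections:
            return (self.cx, self.cy)
        self.cx, self.cy = min(
            detections,
            key=lambda p: (p[0] - self.cx) ** 2 + (p[1] - self.cy) ** 2)
        return (self.cx, self.cy)


# The rectangle follows the nearest centroid frame by frame.
track = PrimitiveTrack(10, 10, 4, 4)
print(track.update([(50, 50), (11, 9)]))
```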
the invention relates to the use of a multifunctional camera unit described above for tracking at least one object of interest, in particular a second carrier system, the use comprising the steps of system-internal listing of the object of interest and tracking of the object of interest by mechanical and software-based tracking.
the listing of the object of interest preferably also comprises the creation of an internal recognition statistic. Listing an object of interest requires an object recognition as described above; the list of the respectively identified objects of interest allows a seamless transfer to the camera located in the next work area, which can likewise track the object of interest, without further pattern recognition, using simple geometric objects.
the use comprises that a first camera tracks the object of interest in a first work area and, when the object leaves the first work area for a second work area of a second camera, seamlessly hands over the tracking, and thus the responsibility, to the second camera. This requires in particular the distance monitoring and the collision check. This ensures a seamless transition and thus permanent tracking of the object of interest.
the mechanical object tracking is done fully automatically within a work area by moving the respective camera. Possible occlusions of the target by other moving objects, which could otherwise lead to erroneous interpretations, are compensated by forward-looking estimation of the current position in space based on physical principles.
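The forward-looking position estimation mentioned above can be sketched, under the simplest physical assumption, as constant-velocity extrapolation from the two most recent position fixes (illustrative, not the patent's actual estimator):

```python
def predict(last_positions):
    """Constant-velocity extrapolation of the next 3D position from the
    two most recent fixes; bridges short occlusions by other objects.

    `last_positions` is a sequence of (x, y, z) tuples; a real system
    would use time stamps and a proper filter (e.g. Kalman).
    """
    (x0, y0, z0), (x1, y1, z1) = last_positions[-2], last_positions[-1]
    return (x1 + (x1 - x0), y1 + (y1 - y0), z1 + (z1 - z0))


# An object that moved from the origin to (1, 2, 3) in one step is
# expected at (2, 4, 6) next.
print(predict([(0, 0, 0), (1, 2, 3)]))
```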
  • Figure 1: a flowchart of the internal operation of a multifunctional camera unit in connection with a first mobile carrier system
  • Figure 2: an overview of the global mode of operation of the camera unit in conjunction with a first mobile carrier system
  • Figure 3: a side sectional view of a multifunctional camera unit
  • Figure 4: a plan view of the division into working and overlapping areas
  • Figure 5: a perspective view of the internal structure of the camera unit
  • Figure 6: a perspective view of the four camera carriages together with the gearwheel
  • Figure 7: a perspective view of a camera slide
  • Figure 8: a plan view of the incremental disks for the horizontal movement
  • Figure 9: a perspective view of the downward-facing static cameras
  • Figure 10: a perspective view of a combination of a multifunctional camera unit and a first mobile carrier system
  • FIG. 1 shows a program flow chart of the internal functioning of a multifunctional camera unit (100).
  • FIG. 1 illustrates the sequence of a method according to the invention and / or the sequence of using a multifunctional camera unit (100) for tracking at least one object of interest.
  • the first step (500) is an image acquisition by means of at least one camera (18) of the multifunctional camera unit (100), preferably by means of all the cameras (18).
the data of the image acquisition, i.e. the originally recorded image data of the at least one camera, are stored in a further step (501) as raw data and can be used for downstream evaluations. In parallel, the image processing (502) of the recorded data takes place.
this is followed by the recognition (503) of an object of interest. For this purpose, at least one camera (18), or the associated camera slide (14, 15, 16, 17), is moved in the respective work area (24); in particular, all cameras (18) or associated camera slides (14, 15, 16, 17) are moved in their respective work areas (24). If no object of interest is found, the process continues again from the image processing (502), whereby new image data recorded in the meantime are processed. If an object of interest has been found, it is checked in a further step (504) whether its given properties with regard to geometric features, for example size relationships between optical features such as lateral arms and body, ascents and curve transitions, and the surface texture match the system information stored in the multifunctional camera unit (100).
in addition, a digital signature is interrogated which, in conjunction with the previously obtained positive optical confirmation of belonging to the further mobile carrier systems known in the multifunctional camera unit (100), allows classification (505) as a friendly or unfriendly object.
for an unfriendly object, a security program is activated which, with regard to the superordinate task, either merely lists this object with its given optical properties and its position, or actively provides tools attached to the first mobile carrier system (1000) for the deactivation of this object, preferably a flying object, so that it no longer poses a danger to the scheduled execution of the superordinate task.
  • in a step (508), the movement limits of the camera (18) following, i.e. tracking, the object of interest and of the associated camera carriage (14, 15, 16, 17) are monitored. If the local limits are reached in the acquired image of the camera (18), mechanical tracking (509) is activated. As part of the mechanical tracking (509), tracking of the object of interest is handed over seamlessly to an adjacent camera (18) as soon as the previously responsible camera (18) can no longer monitor the object of interest due to its limited work area (24). The handover is made to the camera (18) in whose work area (24) the object of interest now falls.
  • in a further step (511), the distance to the object of interest is determined and the 3D position data of the object of interest are transmitted to the first mobile carrier system (1000) on which the multifunctional camera unit (100) is mounted. Finally, it is assessed whether the method should be stopped at this point or continued. In the latter case, the process begins again with the image processing (502) of newly acquired image data.
  • the above-described stop of the method may, for example, be triggered by the need for energy-saving measures in order to reach a safe landing position.
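The decision flow of steps (500) to (511) described above can be summarized as a small control loop. The following Python sketch is illustrative only: all helper callables (`acquire`, `detect`, and so on) are hypothetical placeholders, since the patent does not disclose their implementation.

```python
# Minimal sketch of the control flow of steps (500)-(511).
# Every helper callable is a hypothetical placeholder.

def run_cycle(acquire, detect, matches_geometry, signature_is_friendly,
              list_or_deactivate, track, determine_position, should_stop,
              max_iterations=10):
    """Run the detection/tracking loop and return the visited step numbers."""
    trace = [500]                    # (500) image acquisition
    raw = acquire()
    trace.append(501)                # (501) store raw data for later evaluation
    for _ in range(max_iterations):
        trace.append(502)            # (502) image processing
        obj = detect(raw)
        trace.append(503)            # (503) recognition of an object of interest
        if obj is None:
            raw = acquire()          # process newly recorded image data
            continue
        trace.append(504)            # (504) compare geometric features / texture
        if matches_geometry(obj):
            trace.append(505)        # (505) friend/foe classification via signature
            if not signature_is_friendly(obj):
                list_or_deactivate(obj)   # security program for unfriendly objects
        trace.append(508)            # (508) monitor camera/carriage movement limits
        if track(obj):               # local limits reached in the image ...
            trace.append(509)        # (509) mechanical tracking / camera handover
        trace.append(511)            # (511) distance and 3D position to carrier
        determine_position(obj)
        if should_stop():            # e.g. energy-saving stop before landing
            break
        raw = acquire()
    return trace
```

Running the loop with stub callbacks yields the step sequence for a single successful detection, which matches the order given in the description.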
  • FIG. 2 shows an overview of the global mode of operation of the multifunctional camera unit (100) in conjunction with a first mobile carrier system (1000).
  • the multifunctional camera unit (100) is assigned the functions of image acquisition, image processing, determination of the positional data of an object of interest, mechanical tracking of the object of interest, and collision monitoring.
  • the first mobile carrier system (1000) is assigned the following functions: monitoring of the underlying superordinate task (mission), monitoring of navigation, data collection and data filtering, monitoring of safety, monitoring of the landing process, monitoring of the engines of the first mobile carrier system (1000), and monitoring of external loads.
  • the multifunctional camera unit (100) and the first mobile carrier system (1000) are connected via a mechanical interface (200), which is designed in particular to be functionally hybrid.
  • with regard to the safety monitoring of the first mobile carrier system (1000), the interface (200) is assigned the following functions: a secure, fully automatic coupling, which is controlled by a magnetic feedback, as well as an automatic, safe switching on and off of the signal and power paths.
  • during the coupling process, a pulse-based identification signal is transmitted from the load (300), which is attached to the so-called tool side of the interface (200), to the so-called machine side of the interface (200), which is attached to the first mobile carrier system (1000), so that in this way the basic familial affiliation between the load (300) and the first mobile carrier system (1000) can be determined.
  • after the transmission and evaluation of the above-described identification signal, the load (300), which is fastened to the tool side of the interface (200), is required to generate a so-called vital sign, which signals at all times the functional presence of the load (300) coupled to the first mobile carrier system (1000).
  • the absence of this signal, which is permanently checked within the interface (200), immediately results in a shutdown of all signal and power paths, and an error message is generated which is made available to the first mobile carrier system (1000) for further evaluation.
  • the interface (200) has an internal intelligence that can distinguish and process the following information: distinguishing loads, creating protocols, monitoring the number of couplings and operating hours, and avoiding unnecessary couplings, i.e. unnecessary loads.
  • the interface has the function of statistically summarizing the aforementioned information for further later evaluation.
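The safety behavior of the interface (200) — identification at coupling, a periodic vital sign from the load, and shutdown of all paths when the vital sign is absent — can be sketched as a small state machine. The class, the timeout value, and the set of known loads below are illustrative assumptions, not part of the disclosure.

```python
# Hypothetical sketch of the safety logic of the interface (200):
# pulse-based identification at coupling, a "vital sign" from the
# load (300), and shutdown of all signal and power paths on its absence.

KNOWN_LOADS = {"load-300"}          # assumed family of known loads

class Interface:
    def __init__(self, vital_timeout=1.0):
        self.vital_timeout = vital_timeout   # seconds; assumed value
        self.paths_enabled = False
        self.errors = []
        self._last_vital = None

    def couple(self, identification_signal, now=0.0):
        """Fully automatic coupling: check the pulse-based identification."""
        if identification_signal not in KNOWN_LOADS:
            self.errors.append("unknown load")
            return False
        self.paths_enabled = True            # switch signal and power paths on
        self._last_vital = now
        return True

    def vital_sign(self, now):
        """Called by the coupled load to signal its functional presence."""
        self._last_vital = now

    def check(self, now):
        """Permanent check: shut down if the vital sign is absent too long."""
        if self.paths_enabled and now - self._last_vital > self.vital_timeout:
            self.paths_enabled = False       # shut down signal and power paths
            self.errors.append("vital sign lost")
        return self.paths_enabled
```

As long as the load keeps supplying the vital sign, the paths stay enabled; once the sign stays away longer than the timeout, the paths are shut down and an error message is recorded for the carrier system.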
  • FIG. 3 shows a side sectional view of a multifunctional camera unit (100) and thus the internal structure thereof.
  • the multifunctional camera unit (100) is funnel-shaped on both end faces.
  • the camera unit (100) has a housing (28) which widens conically in the direction of both end faces.
  • in an upper part of the housing (28), the image-recording parts of the camera unit (100) are arranged, while the electronics, in particular an electronic implementation (22) of the image processing and the communication channels, are located in the lower part.
  • At least one upper part of the housing (28) of the camera unit (100) is designed as a transparent shell (10), so that image recordings of the cameras (18) arranged within the housing can take place.
  • the multifunctional camera unit (100) has a static central camera (11) which is designed to image the airspace above it. Furthermore, the camera unit (100) comprises a point projector (12) and a ring illumination (13) for the central camera (11).
  • in the sectional view of Figure 3, one (14) of the four camera carriages (14, 15, 16, 17) can also be seen, on each of which a horizontally and vertically movable camera (18) is arranged.
  • each camera (18) of a camera carriage (14, 15, 16, 17) is assigned a dot projector (18a) and a ring illumination (18b).
  • the camera carriages (14, 15, 16, 17) each have a drive (19) for the vertical circular movement of the associated camera (18).
  • the camera unit (100) has an incremental disc (20) for the horizontal movement, that is to say circular movement, of the camera carriage (14, 15, 16, 17) around the geometric origin of the camera unit (100).
  • the incremental disc (20) has four tracks.
  • for this movement, the camera unit (100) in each case has a drive (23).
  • for imaging the regions between the lateral arms (1000a) of the first mobile carrier system (1000), the multifunctional camera unit (100) comprises four static cameras (21), one of which can be seen in the sectional view of FIG. 3.
  • the static cameras (21) are arranged below the camera slides (14, 15, 16, 17).
  • the static cameras (21) allow a downward view between the lateral arms (1000a) of the first mobile carrier system (1000), and thus a downwardly directed image acquisition.
  • FIG. 4 shows a plan view of the division into working areas (24) and overlapping areas (26) of the camera carriages (14, 15, 16, 17) of the camera unit (100).
  • the angular range of 360 ° is divided sectorally into four quadrants (I, II, III, IV).
  • Each camera carriage (14, 15, 16, 17) is associated with a work area (24), wherein adjacent work areas (24) can overlap at their ends in overlapping areas (26). Thus, extended work areas (25) with overlap are created.
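The sectoral division into four quadrants with overlapping ends can be illustrated numerically. In the sketch below, the 90° quadrant size follows the figure, while the overlap half-width of 5° is an assumed value for illustration only.

```python
# Illustration of the division into four work areas (24) with overlap
# areas (26) at their ends, yielding extended work areas (25).
# The 5-degree overlap half-width is an assumed value.

QUADRANTS = {1: (0, 90), 2: (90, 180), 3: (180, 270), 4: (270, 360)}
OVERLAP = 5.0   # degrees, assumed

def responsible_carriages(azimuth_deg):
    """Return the carriages whose extended work area (25) covers the angle."""
    a = azimuth_deg % 360.0
    hits = []
    for carriage, (lo, hi) in QUADRANTS.items():
        lo_ext, hi_ext = lo - OVERLAP, hi + OVERLAP
        # extended work area, wrapping around the 0/360-degree boundary
        if lo_ext <= a < hi_ext or lo_ext <= a - 360.0 or a + 360.0 < hi_ext:
            hits.append(carriage)
    return hits
```

An angle well inside a quadrant yields a single responsible carriage, while an angle inside an overlap area (26) yields two, which is where the seamless handover between adjacent cameras described above takes place.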
  • Figure 5 shows a perspective view of the internal structure of the camera unit (100).
  • a protective ring (27) for the mechanical shielding of the camera carriages (14, 15, 16, 17) is shown.
  • FIG. 6 shows a perspective view of the four camera carriages (14, 15, 16, 17) and of the toothed wheel (29) with which the camera carriages (14, 15, 16, 17) are connected.
  • the toothed wheel (29), together with the drive (23) for the horizontal movement, serves for the movement of the camera carriages (14, 15, 16, 17) in the horizontal plane.
  • the central camera (11) is arranged on the gear (29).
  • FIG. 7 shows a perspective view of a camera slide (14).
  • the camera unit (100) has a frame (30).
  • the camera carriage (14) has a single-track incremental disc (31) which serves for the vertical circular movement of the camera (18) arranged on the camera carriage (14).
  • also shown are the drive (23) for the horizontal movement of the camera carriage (14) and the drive (19) for the vertical circular movement of the camera (18).
  • the two drives (19, 23) are formed by electronically commutated motors.
  • FIG. 8 shows the incremental disks (20, 31) of the multifunctional camera unit (100).
  • in the left half of the figure, the incremental disc (20) for the horizontal movement of a camera carriage (14, 15, 16, 17) is shown, while in the right half the incremental disc (31) for the vertical circular movement of a camera (18) arranged on the camera carriage is shown.
  • a first track (32) on which a first camera carriage (14) can move is arranged on the incremental disc (20) for the horizontal movement of a camera carriage (14, 15, 16, 17). Furthermore, the incremental disc (20) has a second track (33) for a second camera carriage (15), a third track (34) for a third camera carriage (16) and a fourth track (35) for a fourth camera carriage (17).
  • Each track (32, 33, 34, 35) has an initialization area (36).
  • the tracks are offset relative to one another in such a way that overlapping areas arise between the tracks (32, 33, 34, 35) of adjacent camera slides (14, 15, 16, 17).
  • the second track (33) is arranged offset radially inwards relative to the first track (32). The same applies to the third track (34) compared to the second track (33) and to the fourth track (35) in comparison to the third track (34).
  • on the incremental disc (31) for the vertical circular movement of a camera (18) arranged on the camera carriage, a track (37) for this movement is likewise arranged, which also has an initialization area (38).
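The role of an initialization area (36, 38) on an incremental track can be sketched as follows: pulse counts are only relative until the initialization mark is passed, after which an absolute angle can be derived. The class, the counts-per-revolution value, and the pulse interface are assumptions for illustration; the patent does not specify the encoder electronics.

```python
# Sketch of position determination on one incremental track with an
# initialization area: relative counting until the mark defines the origin.
# The counts-per-revolution value is assumed.

COUNTS_PER_REV = 360

class IncrementalTrack:
    def __init__(self):
        self.count = 0
        self.initialized = False

    def pulse(self, direction=1, at_init_mark=False):
        """Process one encoder pulse; direction is +1 or -1."""
        if at_init_mark:
            self.count = 0          # the initialization area defines the origin
            self.initialized = True
        else:
            self.count += direction

    def angle(self):
        """Absolute angle in degrees, or None before initialization."""
        if not self.initialized:
            return None
        return (self.count * 360.0 / COUNTS_PER_REV) % 360.0
```

Before the initialization area has been passed, no absolute angle is available; afterwards, every pulse updates the absolute position of the camera or carriage on its track.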
  • FIG. 9 shows a perspective view of the downwardly directed static cameras (21) of the camera unit (100).
  • FIG. 9 shows in particular the dot projector (21a) respectively associated with the static camera (21) and the ring illumination (21b), above all in the enlarged representation shown in the right-hand half of the figure.
  • the static cameras (21) can produce downwardly directed image recordings.
  • FIG. 10 shows a perspective view of a combination of a multifunctional camera unit (100) and a first mobile carrier system (1000).
  • the combination also has an interface (200) between the camera unit (100) and the first mobile carrier system (1000).
  • the first mobile carrier system (1000) comprises an electronic load (300).
  • the first mobile carrier system (1000) is formed by a flying drone, in particular an Airshark Carrier System, which has four lateral outriggers (1000a) each with a propeller (1000b).
  • the multifunctional camera unit (100) is arranged in a central area, directed upwards, on the first mobile carrier system (1000).

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Mechanical Engineering (AREA)
  • Remote Sensing (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Human Computer Interaction (AREA)
  • Studio Devices (AREA)

Abstract

The invention relates to a multifunctional camera unit (100) for coupling to a first mobile carrier system (1000), in particular to a mobile robot system, wherein the camera unit (100) can be mounted statically and/or movably, and wherein the camera unit (100) is primarily intended for the recognition and tracking of static and dynamic objects.
PCT/EP2016/078485 2015-11-23 2016-11-23 Unité de caméra multifonctionnelle destinée à être couplée à un premier système de support moile WO2017089364A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
DE112016005353.3T DE112016005353A5 (de) 2015-11-23 2016-11-23 Multifunktionale Kameraeinheit zur Kopplung an einem ersten mobilen Trägersystem

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102015120256 2015-11-23
DE102015120256.5 2015-11-23

Publications (1)

Publication Number Publication Date
WO2017089364A1 true WO2017089364A1 (fr) 2017-06-01

Family

ID=57421834

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2016/078485 WO2017089364A1 (fr) 2015-11-23 2016-11-23 Unité de caméra multifonctionnelle destinée à être couplée à un premier système de support moile

Country Status (2)

Country Link
DE (1) DE112016005353A5 (fr)
WO (1) WO2017089364A1 (fr)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090284596A1 (en) * 2008-05-19 2009-11-19 Camdeor Technology Co., Ltd. Infrared surveillance camera having detachable lighting device rotatable with camera lens
EP2348280A2 (fr) * 2010-01-20 2011-07-27 Honeywell International Inc. Systèmes et procédé pour la détection monoculaire d'objets aériens
US20120242795A1 (en) * 2011-03-24 2012-09-27 Paul James Kane Digital 3d camera using periodic illumination

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
FLORIAN BÄTHGE: "3D-Objektverfolgung mit Stereokameras zur bildbasierten Navigation autonom fliegender Luftfahrzeuge Bachelorarbeit zur Erlangung des Grades eines Bachelor of Science (B.Sc.) im Studiengang Computervisualistik", 2 April 2012 (2012-04-02), Magdeburg, XP055349571, Retrieved from the Internet <URL:http://studios.tinytall.de/wp-content/uploads/downloads/2012/05/Florian_Baethge_2012_Bachelor_Thesis.pdf> [retrieved on 20170227] *
HANS-ARTHUR MARSISKE: "Die Macht der Drohnen", C'T MAGAZIN, 12 June 2015 (2015-06-12), XP055347234, Retrieved from the Internet <URL:https://www.heise.de/ct/ausgabe/2015-14-Das-Wettruesten-bei-bewaffneten-UAVs-hat-begonnen-2683558.html> [retrieved on 20170217] *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102017119149A1 (de) 2016-09-20 2018-03-22 Mayser Holding Gmbh & Co. Kg Verfahren zur berührungslosen Erfassung relativ zueinander bewegter Objekte und Sensoreinrichtung
EP3448012A1 (fr) * 2017-08-25 2019-02-27 Canon Kabushiki Kaisha Appareil de capture d'images
US10659702B2 (en) 2017-08-25 2020-05-19 Canon Kabushiki Kaisha Image capturing apparatus that matches an imaging range with an irridation range
EP3751836A1 (fr) * 2017-08-25 2020-12-16 Canon Kabushiki Kaisha Appareil de capture d'images
CN112594144A (zh) * 2020-12-10 2021-04-02 安徽农业大学 一种基于无人机搭载的智能化风电机组桨叶监测探伤机构

Also Published As

Publication number Publication date
DE112016005353A5 (de) 2018-08-02

Similar Documents

Publication Publication Date Title
EP3430368B1 (fr) Aéronef pour balayer un objet et système d&#39;analyse d&#39;endommagement de l&#39;objet
DE112014001058T5 (de) Verfahren und System zur Steuerung autonomer Fahrzeuge
DE102015205032A1 (de) Fahrzeugverbund und Verfahren zum Bilden und Betreiben eines Fahrzeugverbundes
DE112010003000T5 (de) Sichtsystem zum Überwachen von Menschen in dynamischen Umgebungen
EP2835973A1 (fr) Caméra 3D et un procédé de capture de données d&#39;image tridimensionnelles
DE102011082478A1 (de) Verfahren, System sowie Vorrichtung zur Lokalisation eines Fahrzeugs relativ zu einem vordefinierten Bezugssystem
WO2017089364A1 (fr) Unité de caméra multifonctionnelle destinée à être couplée à un premier système de support moile
DE102020203054B4 (de) Verfahren zur Steuerung einer Formation eines zusammenwirkenden Schwarms von unbemannten mobilen Einheiten
EP3663252B1 (fr) Procédé pour faire fonctionner un agv et système intralogistique avec un agv
DE102017213601A1 (de) Verfahren zum Erstellen einer Objektkarte für eine Fabrikumgebung
DE112014001069T5 (de) Steuerungssystem und -verfahren zur Steuerung eines Fahrzeugs im Zusammenhang mit der Erkennung eines Hindernisses
EP3482622B1 (fr) Procédé de guidage automatique d&#39;un véhicule le long d&#39;un système de rails virtuel
WO2020053215A1 (fr) Système et procédé pour la localisation basée sur les ondes radio et transformation de coordonnées
EP4116790B1 (fr) Dispositif de commande et de navigation pour système mobile autonome et système mobile autonome
DE112017008156T5 (de) Bestimmung von objektstandortkoordinaten
EP3676733A1 (fr) Procédé de mesure d&#39;espace au moyen d&#39;un véhicule de mesure
DE102014221763A1 (de) Verfahren zur automatischen Steuerung von Objekten innerhalb eines räumlich abgegrenzten Bereichs, der für die Herstellung oder Wartung oder das Parken eines Fahrzeugs vorgesehen ist
DE102014224884A1 (de) Verfahren und System zum Überwachen von Logistikeinrichtungen
EP3815012A1 (fr) Dispositif mobile pour inventorier des stocks
WO2011157723A1 (fr) Système et procédé d&#39;évitement de collisions
DE102015217314A1 (de) Überwachungssystem
DE102021108357A1 (de) Verfahren zum Betreiben eines Flurförderzeugs in einem intralogistischen System sowie intralogistisches System
DE102019108256A1 (de) Anordnung und Verfahren zur Ermöglichung einer autonomen Landung
DE102018002499A1 (de) Verfahren und Vorrichtung zur Bestimmung der Position von Objekten in einer dreidimensionalen Umgebung
DE102016209514B4 (de) Verfahren zum Optimieren eines Verkehrsflusses

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16802015

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 112016005353

Country of ref document: DE

REG Reference to national code

Ref country code: DE

Ref legal event code: R225

Ref document number: 112016005353

Country of ref document: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16802015

Country of ref document: EP

Kind code of ref document: A1