CN115469601A - Method for positioning a module, first module, second module and automation system

Method for positioning a module, first module, second module and automation system

Info

Publication number
CN115469601A (application CN202210645664.8A)
Authority
CN
China
Prior art keywords: module, marking, optical, sensor unit, positioning
Legal status: Pending
Application number
CN202210645664.8A
Other languages
Chinese (zh)
Inventor
S·施罗克
邱智宏
D·坎珀特
J·弗兰根
M·M·沃尔德雷尔
T·贝齐扎
U·勒伯勒
Current Assignee
Robert Bosch GmbH
Original Assignee
Robert Bosch GmbH
Priority date
Filing date
Publication date
Application filed by Robert Bosch GmbH
Publication of CN115469601A

Classifications

    • G05B19/19: Numerical control (NC) characterised by positioning or contouring control systems, e.g. to control position from one programmed point to another or to control movement along a programmed continuous path
    • G01B11/002: Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates
    • B25J11/008: Manipulators for service tasks
    • G01B11/26: Measuring arrangements characterised by the use of optical techniques for measuring angles or tapers; for testing the alignment of axes
    • G01C11/04: Photogrammetry or videogrammetry; interpretation of pictures
    • G05B19/41815: Total factory control characterised by the cooperation between machine tools, manipulators and conveyor or other workpiece supply system, workcell
    • G05D1/0234: Control of position or course in two dimensions, specially adapted to land vehicles, using optical markers or beacons
    • G06T7/74: Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
    • G05B2219/35349: Display part, programmed locus and tool path, traject, dynamic locus
    • G06T2207/30204: Indexing scheme for image analysis; subject of image: marker

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Manufacturing & Machinery (AREA)
  • Human Computer Interaction (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • General Engineering & Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Electromagnetism (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Automatic Assembly (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

A method for positioning modules 2, 3 of an automation system, wherein the automation system comprises a first module 2 and a second module 3, wherein the first module 2 has a marking area 4 comprising at least one optical marking 6, wherein the optical markings 6 each comprise position information about the position of the optical marking within the marking area 4, wherein the second module 3 has a sensor unit 8 for recording at least one image of the optical marking 6 of the first module 2, and wherein a relative position, orientation and/or velocity of the sensor unit 8 with respect to the marking area 4 is determined on the basis of the at least one image of the optical marking 6.

Description

Method for positioning a module, first module, second module and automation system
Technical Field
The invention relates to a method for positioning a module of an automation system having the features of claim 1.
Background
Automation systems, for example in industrial production, must be increasingly adaptable. Modular automation systems are therefore of growing interest, particularly in the Industry 4.0 context. For the modules of an automation system to cooperate effectively, their orientation and positioning relative to each other is of utmost importance. Previous methods for positioning modules of an automation system are based on the specification of a route, for example by means of guide rails.
The published document DE 10 2016 201 619 A1 describes a sensor system with a sensor unit for recording an image of a marking area and an evaluation unit, wherein the relative orientation of the sensor unit with respect to the marking area is determined on the basis of the image.
Disclosure of Invention
A method for positioning a module of an automation system having the features of claim 1 is specified. A first module, a second module and an automation system having these modules are also proposed. Preferred and/or advantageous embodiments emerge from the dependent claims, the description and the accompanying figures.
A method for positioning a module of an automation system is proposed. The method serves in particular for operating the automation system and/or the module. The method is in particular designed to be executed by a control unit, for example of the module or of the automation system, and can be carried out at least partially in software. The method serves for positioning, in particular orienting, the module relative to a further module, in particular in six degrees of freedom. The positioning method is used, for example, during and/or after a retrofit, a conversion and/or a module exchange of the automation system, and/or for determining changes due to environmental influences, for example collisions. The automation system is used in particular for production and/or manufacturing. The automation system is, for example, a production system, wherein the modules are designed to assume at least one substep, for example a function. These modules can be, for example, processing modules, work cells, transport modules and/or transport vehicles, conveyor belts and/or robot-like handling modules, and can be mobile or stationary. A module is designed, for example, for processing, handling, transporting and/or manipulating workpieces and/or objects, in particular for assuming and/or performing functions, in particular process functions, such as drilling or measuring, and for interacting with another module and/or with the workpiece. The automation system comprises at least a first module and a second module, in particular a plurality of modules, for example a first module, a second module and further modules. The first module is designed in particular for interacting and/or participating in production and/or in the automation system. For commissioning and/or operating the automation system, the first and second modules are and/or can be arranged in a defined position, location and/or orientation relative to each other; this arrangement is referred to in particular as positioning. Based on the determined position and/or the determined orientation and/or the determined velocity, the actuator of the first module and/or the actuator of the second module is actuated such that the two modules are positioned relative to one another in a program-controlled manner in a specified position and/or a predetermined orientation.
The first module has at least one marking area, in particular a plurality of marking areas. The marking area is in particular a flat, planar surface portion. The marking area comprises at least one optical marking, in particular a plurality of optical markings. The optical markings are, for example, arranged uniformly in the marking area. The optical marking is designed, for example, as a code, in particular as a dot code. The marking area is arranged on the first module, for example on a housing of the module. The marking areas and/or optical markings are arranged, for example, as labels, stickers, imprints, colorings, laser markings and/or chemical markings, for example etchings. The optical marking and/or the optical markings each comprise and/or describe position information. The position information is, for example, the position of the optical marking within the marking area, for example as X and Y coordinates. The position information can in particular describe, in particular in coded form, the position of the optical marking on the module and/or on the housing of the module. The position information or the optical marking can in particular also comprise and/or describe size information of the optical marking, for example an areal extent and/or a shape of the optical marking.
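To make the encoded position information concrete, the following minimal Python sketch shows one conceivable encoding, assuming a regular grid of dot-code markers whose decoded integer ID directly encodes the row and column within the marking area; the grid size, pitch and function names are illustrative assumptions, not taken from the patent.

```python
# Illustrative sketch only: the patent does not specify an encoding, so this
# assumes a regular grid of dot-code markers whose decoded integer ID encodes
# the row/column within the marking area. All names and values are assumptions.

GRID_COLUMNS = 16        # assumed number of marker columns in the marking area
MARKER_PITCH_MM = 10.0   # assumed center-to-center marker spacing in millimeters

def marker_position_mm(marker_id: int) -> tuple[float, float]:
    """Map a decoded marker ID to its (x, y) position within the marking area."""
    row, col = divmod(marker_id, GRID_COLUMNS)
    return (col * MARKER_PITCH_MM, row * MARKER_PITCH_MM)
```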
The second module comprises at least one sensor unit. The sensor unit is designed, for example, as a camera and/or comprises a camera. The sensor unit is preferably fixedly comprised by the second module, in particular firmly and/or in a positionally fixed manner. The sensor unit can in particular be retrofitted to the second module. The sensor unit is designed and/or arranged to record at least one image of the optical marking, of one of the optical markings and/or of the marking area of the first module, in particular when the first and second modules are arranged for operating and/or commissioning the automation system.
Based on at least one image of the optical marking and/or of the optical markings recorded by the sensor unit, the relative position, orientation and/or velocity of the sensor unit with respect to the marking area is determined, calculated and/or ascertained. This corresponds in particular to the relative orientation, position and/or velocity of the first and second modules with respect to each other. The relative orientation is determined in particular in six degrees of freedom, i.e. the orientation of the sensor unit relative to the marking area with respect to three translational and three rotational degrees of freedom and/or dimensions. The determination of the relative position, orientation and/or velocity from the at least one image is based in particular on knowledge of the focus and/or recording parameters of the sensor unit.
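As an illustration of how such a six-degree-of-freedom pose can be computed from a single image given known recording parameters, the following sketch uses OpenCV's solvePnP; marker detection and decoding are assumed to have already happened, and all function and variable names are assumptions rather than interfaces from the patent.

```python
import cv2
import numpy as np

# Minimal pose-estimation sketch, assuming each detected marker corner is
# known both in image pixels and in metric marking-area coordinates (Z = 0
# plane). The camera matrix K and distortion coefficients dist are assumed
# to be calibrated beforehand.
def relative_pose(object_pts_mm: np.ndarray,  # N x 3 marker corners in the marking area
                  image_pts_px: np.ndarray,   # N x 2 corresponding image points
                  K: np.ndarray,
                  dist: np.ndarray):
    ok, rvec, tvec = cv2.solvePnP(object_pts_mm.astype(np.float64),
                                  image_pts_px.astype(np.float64), K, dist)
    if not ok:
        raise RuntimeError("pose estimation failed")
    R, _ = cv2.Rodrigues(rvec)  # 3x3 rotation: marking-area frame -> camera frame
    return R, tvec              # together: the full 6-DOF relative pose
```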
The invention enables faster, simplified and improved positioning of a module relative to another module of a production and/or automation system. The proposed positioning requires no rigid guidance of the module, for example by rails or other mechanical guides. The method allows in particular positioning in six degrees of freedom and/or in the submillimeter range. In particular, existing automation systems and/or modules can be retrofitted for the method, for example by subsequently attaching optical markings and/or sensor units. In particular, different first modules and different second modules with optical markings and/or sensor units can be positioned relative to each other and/or used jointly for applying the method. Furthermore, the method can be applied substantially automatically, for example by automated image evaluation and/or calculation.
Particularly preferably, positioning information is determined based on the determined and/or calculated relative position, orientation and/or velocity. The positioning information describes, comprises and/or indicates how the actual positioning deviates from the target positioning. The actual positioning is, for example, the current relative position, orientation and/or velocity between the first module and the second module, in particular between the marking area of the first module and the sensor unit of the second module, determined on the basis of the at least one image. The target positioning is, for example, a positioning of the first module and the second module for commissioning and/or for operating the automation system. The target positioning can in particular have a tolerance measure, can be adjusted and/or can depend on the automation system and/or on the process and/or function the automation system is intended to perform; for example, it may differ from a previous target positioning after a retrofit and/or a production changeover. The positioning information can in particular describe the deviation and/or a measure of the deviation. The positioning information can be reproduced, for example, on a display element such as a light, a lamp and/or an LED, in particular via the signal color and/or the signal strength; for example, the display indicates when the actual positioning corresponds to the target positioning, possibly within tolerances, and the signal color and/or signal strength can indicate a measure of the deviation.
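A minimal sketch of how such positioning information could be derived, assuming the pose estimate from the previous sketch and an assumed tolerance value; the mapping of the deviation onto a signal color mirrors the lamp/LED display mentioned above and is an illustrative choice.

```python
import numpy as np

# Sketch: deviation of the actual pose (estimated from the image) from a
# target pose, plus a tolerance check that could drive a signal lamp/LED.
# The tolerance value and the green/red mapping are assumptions.
def positioning_info(t_actual_mm: np.ndarray, t_target_mm: np.ndarray,
                     tol_mm: float = 0.5):
    deviation = t_target_mm - t_actual_mm          # per-axis deviation in mm
    in_tolerance = bool(np.linalg.norm(deviation) <= tol_mm)
    signal_color = "green" if in_tolerance else "red"
    return deviation, in_tolerance, signal_color
```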
In particular, provision is made for displacement information to be determined, calculated and/or ascertained based on the positioning information. The displacement information can be understood, for example, as a description, specification and/or navigation command of how a transition from the actual positioning to the target positioning is possible, for example particularly quickly and/or while observing other or further boundary conditions. The displacement information can be presented and/or output, for example, as a direction of movement and/or a speed of movement, and can in particular be ascertained and/or determined as a target course or trajectory. The displacement information can be output and/or presented, in particular, by means of arrows and/or navigation information.
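The following sketch illustrates one way displacement information could be computed from the positioning information, here as a simple proportional motion command; the gain and speed limit are assumed values, and a full trajectory planner could replace this.

```python
import numpy as np

# Sketch: turn the deviation (positioning information) into displacement
# information, i.e. a direction of movement and a speed toward the target.
# Gain and speed limit are assumptions for illustration.
def displacement_command(deviation_mm: np.ndarray,
                         gain: float = 0.5, v_max_mm_s: float = 100.0):
    distance = float(np.linalg.norm(deviation_mm))
    direction = deviation_mm / (distance + 1e-9)   # unit vector toward target
    speed = min(gain * distance, v_max_mm_s)       # slows down near the target
    return direction, speed
```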
Particularly preferably, the displacement information is provided to the user for display, in particular on a mobile display device, for example a terminal device. The mobile display device can be, for example, a smart device such as a tablet or a smartphone. Preferably, the displacement information is provided via a wireless connection, such as Bluetooth, radio and/or WiFi.
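Purely as an illustration of providing displacement information to a mobile display device over a wireless connection, this sketch posts it to a hypothetical HTTP endpoint; the URL, port and payload schema are assumptions and not part of the patent.

```python
import requests  # assumes the display device exposes an HTTP endpoint over WiFi

# Sketch: push displacement information to a mobile display device.
# The URL, port and payload schema are purely hypothetical.
def publish_displacement(direction, speed_mm_s,
                         url: str = "http://tablet.local:8080/displacement"):
    payload = {"direction": [float(c) for c in direction],
               "speed_mm_s": float(speed_mm_s)}
    requests.post(url, json=payload, timeout=1.0)
```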
In particular, provision is made for the optical markings to comprise position information. The position information reflects, for example, the position of the respective optical marking within the marking area; alternatively, the position information reflects the position of the optical marking on and/or at the module. Particularly preferably, the optical marking comprises and/or presents further information, in particular in encoded form. The optical markings comprise, for example, information on the size, geometry, orientation and/or structure of the markings. In particular, the optical markings can comprise and/or describe characteristics and/or parameters of the module, such as functions, movement possibilities, actuation and/or limitations, as well as the target positioning and/or the positioning to be achieved.
In particular, provision is made for the relative position, orientation and/or velocity between the first module and the second module, in particular between the marking area and the sensor unit, to be determined in at least four degrees of freedom, preferably in six degrees of freedom. These six degrees of freedom comprise, in particular, three translational degrees of freedom and three rotational degrees of freedom, the latter specified for example as Euler angles.
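A short sketch of packing such a pose into six degrees of freedom, with the rotation expressed as Euler angles via SciPy; the 'xyz' axis convention is an assumption, since the patent mentions Euler angles without fixing a convention.

```python
import numpy as np
from scipy.spatial.transform import Rotation

# Sketch: pack the estimated pose into the six degrees of freedom named
# above, i.e. three translations plus three rotations as Euler angles.
def six_dof(R: np.ndarray, tvec: np.ndarray) -> np.ndarray:
    angles_deg = Rotation.from_matrix(R).as_euler("xyz", degrees=True)
    return np.concatenate([tvec.ravel(), angles_deg])  # [x, y, z, rx, ry, rz]
```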
Particularly preferably, the first module has a further sensor unit, in particular one like the sensor unit of the second module. The further sensor unit of the first module is designed to record a further marking area, in particular a marking area of a further module. Alternatively and/or additionally, the second module comprises a further marking area, in particular one like the marking area of the first module. The further marking area is designed to be recorded and/or detected by a sensor unit of a further module. The further module is designed in particular like the first module and/or the second module. The first module, the second module and/or the further module can in particular be designed in the same way, where "in the same way" is to be understood as comprising both a marking area and a sensor unit, so that these modules can be combined and/or positioned with respect to one another in any arrangement. For example, the first module can act as a second module with respect to a further module; alternatively and/or additionally, the second module can act as a first module with respect to a further module.
In particular, provision is made for the first module and/or the second module to be designed as a mobile and/or movable module, in particular designed to be autonomously driven and/or operated. The second module is designed, for example, to carry out, control and/or regulate the positioning and/or orientation by actuating a drive device on the basis of the determined relative position, orientation and/or velocity, in particular the positioning information and/or displacement information. For example, the first module and/or the second module is a robot module and/or an autonomously driven, in particular autonomously guided, vehicle. The first module and/or the second module are in particular mobile modules for intralogistics, for example for transporting workpieces, fittings and/or materials, for example between two processing stations and/or automation stations.
Particularly preferably, the marking areas are arranged on the side of the respective automation module, for example on a housing surface, in particular visibly and/or unobstructed. Particularly preferably, the marking area is arranged and/or designed as a vertical surface, in particular a substantially vertical surface, for example with an inclination of between 75° and 115° relative to the ground. In this case, provision is made for the sensor unit to have a recording direction for recording the marking area, wherein the recording direction is horizontal and/or substantially horizontal, i.e. substantially parallel to the ground.
Alternatively and/or additionally, provision is made for the marking area to form and/or comprise a horizontal surface. For example, the marking area is arranged on the underside, e.g. the bottom and/or standing surface; alternatively, the marking area can be arranged on the top and/or top surface. In this case, the marking area is oriented substantially horizontally, for example with an inclination angle of between -20° and 20° relative to the ground. For example, marking areas can be arranged at corner and/or edge regions of the module, in which case such marking areas can have horizontal and vertical surface portions.
Particularly preferably, the first module and/or the second module has a module arm, designed for example like a crane arm. If the first module has a module arm, provision is made in particular for the marking area to be arranged on the module arm, for example in an end portion in the longitudinal direction. If the second module has a module arm, it is preferably provided that the sensor unit is arranged on the module arm, in particular in an end portion in the longitudinal direction. In order to position the first and second modules relative to each other, provision is made, for example, for the module arm to engage partially into and/or under the other module. For example, the optical markings of the first module are arranged on the underside and/or the bottom, and the module arm of the second module with the sensor unit is moved underneath the first module, for example between the ground and the chassis, so that the marking area pointing toward the ground can be detected and recorded as an image.
In particular, provision is made for the optical markings and/or marking areas to be glued, magnetically attached, or positively or non-positively fixed to the module, for example via a clamping connection or screw clamps. Alternatively and/or additionally, the optical marking and/or the marking area is lasered, printed, etched and/or clamped in place. The optical marking and/or the marking area can in particular be reversibly attached to the module, for example depending on the use in the automation system.
A further subject of the invention is a first module, in particular for carrying out and/or applying the proposed method. The first module is in particular a module of an automation system, for example a module for manufacturing, production and/or industrial use. In particular, the first module is designed as an autonomously driven and/or controlled module. The first module is in particular designed to perform and/or assume functions, such as production and/or processing, alternatively and/or additionally transport and/or handling functions. The first module has a marking area, in particular a flat and/or planar marking area. The marking area comprises at least one optical marking, in particular a plurality of optical markings. The optical markings comprise and/or describe, in particular, their position, orientation and/or size within the marking area and/or on the module. The optical marking and/or the marking area is arranged in particular on a visible part of the module, for example on a housing part.
A further subject of the invention is a second module, in particular for carrying out the proposed method. The second module is designed in particular for the automation system. The second module comprises a sensor unit, wherein the sensor unit comprises and/or is, in particular, a camera. The sensor unit is designed and/or arranged to record at least one image, in particular at least one image of the marking area and/or of the optical marking in the marking area of the first module. The marking area and/or the sensor unit are in particular arranged and/or designed such that they can be used for applying the method and/or for image recording, for example arranged at a corresponding height and/or with a corresponding orientation.
Another subject is an automation system, wherein the automation system is designed for carrying out and/or applying the method. The automation system comprises a first module and a second module. The automation system is in particular designed such that a relative position, orientation and/or velocity is determined and/or calculated on the basis of at least one image recorded by the sensor unit, in particular on the basis of and/or analogously to the proposed method. Preferably, the automation system is designed such that, based on the determined position and/or the determined orientation and/or the determined velocity, the actuator of the first module and/or the actuator of the second module is actuated so that the two modules are positioned relative to one another in a program-controlled manner in a specified position and/or a predetermined orientation.
Drawings
Further advantages, effects and embodiments emerge from the attached figures and their description. In the figures:
FIG. 1 shows a schematic arrangement of a sensor unit and a marking area;
FIG. 2 illustrates an embodiment of two modules;
FIG. 3 illustrates an embodiment of an arrangement of autonomously driven modules with a horizontally oriented marking area;
FIG. 4 illustrates another embodiment of an autonomously driven module having a vertical marking area;
FIG. 5 illustrates an embodiment of an autonomously driven module having a marking area;
FIG. 6 illustrates an embodiment of an autonomously driven module with a sensor unit and a marking area.
Detailed Description
Fig. 1 shows an exemplary embodiment of an automation system 1 having a first module 2 and a second module 3. The automation system 1 is designed, for example, as a production system and/or a production line, wherein the modules 2, 3 and/or further modules assume processes and/or functions. To ensure cooperation and/or interaction of the modules 2, 3, the arrangement, positioning and/or orientation of the modules with respect to each other is important and/or should be observed within a tolerance measure.
For the purpose of positioning, the first module 2 comprises a marking area 4. The marking area 4 is a surface portion which is arranged on the housing 5 of the first module 2. The marking area 4 has, for example, a rectangular outline and/or border. A plurality of optical markings 6 are arranged within the marking area 4. The optical marking 6 is designed, for example, as an optical code, for example a dot code or a bar code. The optical markings 6 each comprise and/or reflect the position, orientation and/or size of the optical marking within the marking area 4 and/or on the first module 2. The optical markings 6 are each designed with the same outline and/or boundary, here for example rectangular or square. Within their boundaries, the optical markings each comprise symbols, in this embodiment dots, alternatively circles as dots. The optical marking 6 or the marking area 4 can have, for example, an offset O, wherein the offset O is defined, for example, with respect to a world coordinate origin, for example of the first module 2.
The second module 3 comprises a module arm 7a. A sensor unit 8 is arranged on the module arm 7a, wherein the sensor unit 8 has a camera 9 with camera parameters. The sensor unit 8 or the camera 9 is arranged such that, in the case of the desired positioning of the first and second modules 2, 3 relative to one another, or also in the case of a deviation and/or during a positioning attempt, it detects and/or records the optical markings 6 at least partially in its recording direction. The sensor unit 8 is designed to detect at least one section of the marking area 4, in particular the optical markings 6, and to provide this section as an image. Based on the images recorded by the sensor unit 8, in particular the position information and/or the size information of the optical markings 6, the relative or absolute position, orientation and/or velocity of the first and second modules 2, 3 can be calculated.
Fig. 2 shows an exemplary embodiment of an automation system 1 having a first module 2 and a second module 3. In this case, the first module 2 and the second module 3 are designed in substantially the same way, although they can have different functions, for example a drilling or sawing function. These modules 2, 3 are similar or identical in particular with regard to the presence and/or arrangement of the sensor unit 8 and the marking area 4.
In this case, the modules 2, 3 have, for example, conveyor belt sections 11 which are designed for transporting objects, for example workpieces or goods. To ensure the transport, the modules 2, 3 should be correspondingly positioned and/or oriented relative to each other. For this purpose, the first and second modules 2, 3 each have a module arm 7a, b. These module arms 7a, b are preferably arranged on the same side, for example with respect to the direction of travel or transport. For example, the module arm 7a is arranged with its longitudinal extension against the direction of travel and/or transport, while the module arm 7b is arranged with its longitudinal extension along the direction of travel and/or transport.
The modules 2, 3 each have a marking area 4 with optical markings 6 on the upper side of the module arm 7a. The optical markings 6 describe, in particular, the position of these optical markings in the marking area 4 or on the module 2, 3, in particular on the module arm 7a. In this case, the optical markings 6 are, for example, glued on, lasered or inset.
The modules 2, 3 each comprise a sensor unit 8 on the module arm 7b, which in this case comprises a camera 9 arranged with its recording direction 10 facing downwards. The height and/or arrangement of the arms 7a, b is selected in particular such that, during the positioning of the modules 2, 3 relative to one another, the sensor unit 8 can record and/or sharply image the optical markings 6, for example such that the marking area 4 lies in the focal region and/or focal plane. Based on the image recorded by the sensor unit 8 and the position information in the optical markings 6, the position and/or orientation of the sensor unit 8 and the marking area 4 of the two modules relative to each other can be determined and calculated. Based on the determined position and/or orientation, the actuator of the first module 2 and/or the actuator of the second module 3 is actuated such that the two modules 2, 3 are positioned relative to one another in a program-controlled manner in a defined position and/or a predetermined orientation. This positioning is performed correspondingly in the embodiments presented below with reference to figs. 3 to 6.
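As a sketch of this program-controlled positioning, the following loop combines the earlier illustrative helpers into a capture-estimate-actuate cycle; the camera, drive and marker-detection interfaces are hypothetical and not part of the patent.

```python
# Hypothetical closed-loop positioning sketch. It reuses the illustrative
# helpers from the earlier sketches (relative_pose, positioning_info,
# displacement_command); camera/drive/detect_markers are assumed interfaces.
def position_until_aligned(camera, drive, detect_markers,
                           K, dist, t_target_mm, tol_mm=0.5):
    while True:
        frame = camera.capture()                          # hypothetical camera API
        corners_mm, corners_px = detect_markers(frame)    # decode markers + corners
        _, tvec = relative_pose(corners_mm, corners_px, K, dist)
        deviation, in_tol, _ = positioning_info(tvec.ravel(), t_target_mm, tol_mm)
        if in_tol:
            drive.stop()                                  # hypothetical drive API
            break
        direction, speed = displacement_command(deviation)
        drive.move(direction, speed)                      # hypothetical drive API
```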
Fig. 3 shows an exemplary embodiment of an automation system 1 having a first module 2 and a second module 3. The second module 3 is designed as an autonomously driven and/or autonomously controlled module and has a drive. The drive is actuated, for example, on the basis of the determined, or to-be-determined, relative position, orientation and/or velocity between the first module 2 and the second module 3.
The second module 3 has a module arm 7a at the free end of which a sensor unit 8 is arranged, wherein the sensor unit 8 has an upward recording direction 10.
The first module 2 has a conveyor belt 11 and a body portion 12. Goods to be transported, for example on a pallet, in a bucket or in a container, can be transported and/or held in the body portion 12. The first module 2 has, on a ground-facing part, a marking area 4 with optical markings 6 according to the invention. This part is in particular spaced apart from the ground and/or the roadway, for example by the chassis of the first module 2 or by a clearance, for example a tunnel.
The module arm 7a can engage or protrude into the tunnel and/or the clearance such that the sensor unit 8 can be positioned and/or arranged so that at least a part of the marking area 4 and/or of the optical markings 6 can be recorded as an image. Based on the recorded image of the optical markings 6, the relative position, orientation and/or velocity between the modules 2, 3 can be determined. In the present case, the marking area 4 is arranged in a horizontal plane; in particular, the optical markings 6 and/or the marking area 4 face the ground.
Fig. 4 shows a further exemplary embodiment of the automation system 1, which corresponds substantially to the structure of the automation system 1 from fig. 3. In this exemplary embodiment, the marking area 4 and/or the optical markings 6 lie in a vertical plane, in particular a plane perpendicular to the direction of travel V. The sensor unit 8 is arranged with its recording direction 10 in the direction of travel and/or line of sight, in particular a substantially horizontal recording direction 10. The sensor unit 8 is arranged for recording the marking area 4, wherein the positioning of the modules 2, 3 relative to each other is realized on the basis of the recorded images.
Fig. 5 shows an exemplary embodiment of an automation system 1 having a first module 2, a second module 3 and a further module 13. The first module 2 and the second module 3 are designed substantially as in the exemplary embodiment according to fig. 4. In addition, a further marking area 4 is arranged on the first module 2, which likewise forms a vertical plane, in particular a vertical plane perpendicular to the direction of travel V. This marking area 4 is arranged on the opposite side of the first module 2 in the direction of travel.
The further module 13 can likewise be, for example, a machine, a processing unit or an autonomously driven module. The further module 13 comprises a sensor unit 8 of the same kind, which has a recording direction 10 along or against the direction of travel, in particular oriented substantially horizontally. This sensor unit 8 is designed to record a further image of the marking area 4 of the first module 2, wherein the relative position, orientation and/or velocity between the further module 13 and the first module 2 is determined on the basis of the image showing at least one optical marking 6 of the marking area 4, on the basis of which the modules 2, 13, and in particular also the module 3, can be positioned relative to one another.
Fig. 6 shows an exemplary embodiment of an automation system 1 which is designed substantially like the automation system 1 from fig. 5, with the difference that the first module 2 has a further sensor unit 8 instead of the further marking area 4, and the further module 13 has a further marking area 4 instead of the further sensor unit 8. The further optical sensor unit 8 is designed to record a further recording, in particular an image, of the further marking area 4, on the basis of which the relative position, orientation and/or velocity between the further module 13 and the first module 2 is determined, and on the basis of which the two modules 2, 13, and in particular also the module 3, are positioned relative to one another.

Claims (14)

1. A method for positioning a module (2, 3) of an automation system,
wherein the automation system comprises a first module (2) and a second module (3),
wherein the first module (2) has a marking area (4) comprising at least one optical marking (6), wherein the optical markings (6) each comprise position information about the position of the optical marking within the marking area (4),
wherein the second module (3) has a sensor unit (8) for recording at least one image of the optical markings (6) of the first module (2),
wherein a relative position, orientation and/or velocity of the sensor unit (8) with respect to the marking area (4) is determined based on the at least one image of the optical marking (6).
2. Method according to claim 1, characterized in that positioning information is determined based on the determined relative position, orientation and/or velocity, wherein the positioning information describes and/or comprises a deviation of the actual positioning of the first and second modules (2, 3) relative to each other from a target positioning.
3. The method of claim 2, wherein displacement information is determined based on the positioning information, wherein the displacement information describes a transition from an actual positioning to a target positioning.
4. The method of claim 3, wherein the displacement information is provided to a user for display on a mobile display device or a local display device.
5. The method according to any of the preceding claims, characterized in that the optical marker (6) comprises position information about the position of the optical marker within the marker area (4).
6. A method according to any of the preceding claims, characterized in that the relative orientation and/or position is determined in six degrees of freedom.
7. Method according to any of the preceding claims, characterized in that the first module (2) has a further sensor unit (8) for coupling with a further module (13); and/or the second module (3) has a further marking area (4) for coupling with a further module (13).
8. The method according to any one of the preceding claims, characterized in that the first module (2) and/or the second module (3) are autonomously driven modules (2, 3).
9. Method according to any of the preceding claims, characterized in that the marking areas (4) are arranged at the sides and/or form vertical surfaces.
10. The method according to any one of claims 1 to 8, wherein the marking region (4) forms a horizontal surface.
11. Method according to any of the preceding claims, characterized in that the first module (2) has a module arm (7a) comprising the marking area (4) or the second module (3) has a module arm (7b) comprising the sensor unit (8), wherein the module arm (7a, b) engages into or under a further module (3) for positioning.
12. Method according to any of the preceding claims, characterized in that the optical marking (6) is glued, magnetically attached, or form-fittingly or force-fittingly fixed to the module (2, 3).
13. A module (2) for an automation system, wherein the module (2) has a marking area (4) comprising at least one optical marking (6), wherein the optical markings (6) each comprise position information about the position of the optical marking within the marking area (4).
14. An automation system having a first module (2) and a second module (3), wherein the first module (2) has a marking area (4) which comprises at least one optical marking (6), wherein the optical markings (6) each comprise position information about the position of the optical marking within the marking area (4), wherein the second module (3) has a sensor unit (8) for recording at least one image of the optical marking (6) of the first module (2), the automation system having an evaluation module, wherein the evaluation module is designed to determine a relative position, orientation and/or velocity of the sensor unit (8) with respect to the marking area (4) based on at least one image of the optical marking (6).
CN202210645664.8A (priority date 2021-06-10, filing date 2022-06-09): Method for positioning a module, first module, second module and automation system. Status: Pending.

Applications Claiming Priority (2)

DE102021205887.6A (priority date 2021-06-10, filing date 2021-06-10): Method for positioning a module, first module, second module and automation system
DE102021205887.6 (priority date 2021-06-10)

Publications (1)

CN115469601A, published 2022-12-13


Family Applications (1)

CN202210645664.8A (priority date 2021-06-10, filing date 2022-06-09): Method for positioning a module, first module, second module and automation system; published as CN115469601A, Pending

Country Status (2)

CN: CN115469601A
DE: DE102021205887A1

Family Cites Families (1)

DE102016201619A1 (priority date 2016-02-03, published 2017-08-03, Mahle International GmbH): Piston of an internal combustion engine

Also Published As

DE102021205887A1, published 2022-12-15


Legal Events

PB01: Publication