CN112099025B - Method, device, equipment and storage medium for positioning vehicle under bridge crane


Info

Publication number
CN112099025B
Authority
CN
China
Prior art keywords
unmanned vehicle
bridge crane
marker
point cloud
cloud data
Prior art date
Legal status
Active
Application number
CN202010841115.9A
Other languages
Chinese (zh)
Other versions
CN112099025A
Inventor
杨政
刘飞
邓丹
钱炜
Current Assignee
Hangzhou Fabu Technology Co Ltd
Original Assignee
Hangzhou Fabu Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Hangzhou Fabu Technology Co Ltd filed Critical Hangzhou Fabu Technology Co Ltd
Priority to CN202010841115.9A
Publication of CN112099025A
Application granted
Publication of CN112099025B

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06Systems determining position data of a target
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B66HOISTING; LIFTING; HAULING
    • B66CCRANES; LOAD-ENGAGING ELEMENTS OR DEVICES FOR CRANES, CAPSTANS, WINCHES, OR TACKLES
    • B66C13/00Other constructional features or details
    • B66C13/16Applications of indicating, registering, or weighing devices

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Mechanical Engineering (AREA)
  • Control And Safety Of Cranes (AREA)

Abstract

The embodiments of the disclosure provide a method, device and equipment for positioning a vehicle under a bridge crane, and a storage medium. The method comprises: acquiring point cloud data obtained by scanning with a radar device on an unmanned vehicle; and determining the relative position of the unmanned vehicle and the bridge crane according to the scanned point cloud data. In this way, the positioning of the unmanned vehicle under the bridge crane is achieved by means of the radar device, and the positioning accuracy of the unmanned vehicle under the bridge crane is improved.

Description

Method, device, equipment and storage medium for positioning vehicle under bridge crane
Technical Field
The embodiment of the disclosure relates to the field of artificial intelligence, in particular to a method, a device, equipment and a storage medium for positioning a vehicle under a bridge crane, which can be used in the field of unmanned driving.
Background
During port operations, in order to ensure that the lifting appliance on a bridge crane can smoothly grab a container from a vehicle or accurately place a container at a designated position on the vehicle, accurate alignment of the vehicle with the bridge crane is key to the success or failure of the operation.
Typically, port operations require a driver to rely on experience to drive the vehicle to a given location under the bridge crane. For an unmanned vehicle, the position of the vehicle relative to the bridge crane must be determined before the vehicle can be aligned with the bridge crane. However, the bridge crane is usually large and easily blocks the satellite positioning signals of the unmanned vehicle, and the position of the bridge crane itself is not fixed, so the relative position of the unmanned vehicle and the bridge crane obtained by positioning has low accuracy.
Disclosure of Invention
The embodiment of the disclosure provides a vehicle positioning method, device and equipment under a bridge crane and a storage medium, which are used for improving the accuracy of the relative positions of an unmanned vehicle and the bridge crane obtained by positioning.
In a first aspect, an embodiment of the present disclosure provides a method for positioning a vehicle under a bridge crane, where a radar device is disposed on an unmanned vehicle, the method including:
acquiring point cloud data obtained by scanning the radar device;
and determining the relative position of the unmanned vehicle and the bridge crane according to the point cloud data.
In one possible embodiment, the determining the relative position of the unmanned vehicle and the bridge crane according to the point cloud data includes:
determining the relative distance between at least one marker on the bridge crane and the unmanned vehicle according to the point cloud data;
and obtaining the relative position of the unmanned vehicle and the bridge crane according to the relative distance between the at least one marker and the unmanned vehicle.
In one possible implementation manner, the determining, according to the point cloud data, a relative distance between at least one marker on the bridge crane and the unmanned vehicle includes:
determining a region of interest corresponding to the at least one marker;
screening the point cloud data according to the region of interest corresponding to the at least one marker, wherein the screened point cloud data is located in the region of interest corresponding to the at least one marker;
determining the position of the at least one marker in a preset coordinate system according to the screened point cloud data;
and determining the relative distance between the at least one marker and the unmanned vehicle according to the position of the at least one marker.
In one possible implementation, the number of markers is 2 or more; the obtaining the relative position of the unmanned vehicle and the bridge crane according to the relative distance between the at least one marker and the unmanned vehicle comprises the following steps:
determining the relative distance between the unmanned vehicle and a preset reference position on the bridge crane according to the relative distance between each of at least two markers on the bridge crane and the unmanned vehicle;
and determining the relative distance between the unmanned vehicle and the reference position as the relative position between the unmanned vehicle and the bridge crane.
In one possible implementation, the scanning range of the radar device includes an upper region of the unmanned vehicle.
In one possible implementation, the radar apparatus includes one or more of the following: rotary lidar, solid-state lidar.
In one possible implementation, the rotary lidar is a multi-line lidar.
In one possible implementation, the radar device is fixed above the head of the unmanned vehicle.
In a second aspect, embodiments of the present disclosure provide a vehicle positioning device under a bridge crane, where a radar device is provided on an unmanned vehicle, the vehicle positioning device including:
the acquisition module is used for acquiring the point cloud data obtained by scanning the radar device;
and the positioning module is used for determining the relative position of the unmanned vehicle and the bridge crane according to the point cloud data.
In a possible embodiment, the positioning module is specifically configured to:
determining the relative distance between at least one marker on the bridge crane and the unmanned vehicle according to the point cloud data;
and obtaining the relative position of the unmanned vehicle and the bridge crane according to the relative distance between the at least one marker and the unmanned vehicle.
In a possible implementation manner, the positioning module is further specifically configured to:
determining a region of interest corresponding to the at least one marker;
screening the point cloud data according to the region of interest corresponding to the at least one marker, wherein the screened point cloud data is located in the region of interest corresponding to the at least one marker;
determining the position of the at least one marker in the preset coordinate system according to the screened point cloud data;
and determining the relative distance between the at least one marker and the unmanned vehicle according to the position of the at least one marker.
In one possible implementation, the number of markers is 2 or more; the positioning module is specifically configured to:
determining the relative distance between the unmanned vehicle and a preset reference position on the bridge crane according to the relative distance between each of at least two markers on the bridge crane and the unmanned vehicle;
and determining the relative distance between the unmanned vehicle and the reference position as the relative position between the unmanned vehicle and the bridge crane.
In one possible implementation, the scanning range of the radar device includes an upper region of the unmanned vehicle.
In one possible implementation, the radar apparatus includes one or more of the following: rotary lidar, solid-state lidar.
In one possible implementation, the rotary lidar is a multi-line lidar.
In one possible implementation, the radar device is fixed above the head of the unmanned vehicle.
In a third aspect, embodiments of the present disclosure provide an electronic device, comprising:
a memory and a processor;
the memory is used for storing program instructions;
the processor is configured to invoke the program in the memory to perform the method according to the first aspect or each possible implementation manner of the first aspect.
In a fourth aspect, embodiments of the present disclosure provide an unmanned vehicle comprising a radar device and an electronic device as described in the third aspect.
In a fifth aspect, embodiments of the present disclosure provide a computer readable storage medium, on which a computer program is stored, the computer program, when executed, implementing a method as described in the first aspect, or each possible implementation of the first aspect.
In a sixth aspect, embodiments of the present disclosure provide a program product comprising a computer program stored in a storage medium; at least one processor can read the computer program from the storage medium, and when the at least one processor executes the computer program, it implements the method according to the first aspect or each possible implementation of the first aspect.
According to the vehicle positioning method, device and equipment under the bridge crane and the storage medium, point cloud data obtained by scanning of the radar device on the unmanned vehicle are obtained, and the relative position of the unmanned vehicle and the bridge crane is determined according to the point cloud data. Therefore, the positioning of the unmanned vehicle under the bridge crane is realized under the condition of not depending on a satellite positioning mode, and the positioning accuracy of the unmanned vehicle under the bridge crane is improved.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure.
FIG. 1 is an exemplary diagram of a bridge crane and an unmanned vehicle provided by an embodiment of the present disclosure;
FIG. 2 is a flow chart of a method for positioning a vehicle under a bridge crane according to an embodiment of the disclosure;
FIG. 3 is a flow chart of a method of positioning a vehicle under a bridge crane according to another embodiment of the present disclosure;
FIG. 4 is a flow chart of a method of positioning a vehicle under a bridge crane according to another embodiment of the present disclosure;
FIG. 5 is a flow chart of a method of positioning a vehicle under a bridge crane according to another embodiment of the present disclosure;
FIG. 6 is a schematic view of a vehicle positioning device under a bridge crane according to an embodiment of the disclosure;
Fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the disclosure.
Specific embodiments of the present disclosure have been shown by way of the above drawings and will be described in more detail below. These drawings and the written description are not intended to limit the scope of the disclosed concepts in any way, but rather to illustrate the disclosed concepts to those skilled in the art by reference to specific embodiments.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary examples are not representative of all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with some aspects of the present disclosure as detailed in the accompanying claims.
First, terms involved in embodiments of the present disclosure will be explained:
bridge crane: refers to a crane used at a port for loading and unloading containers, which may also be referred to as a quayside container crane, a container handling bridge or a loading bridge. Generally, the bridge crane comprises a portal frame with a travelling mechanism, a boom, tie rods that bear the weight of the boom, and other structures. The portal frame is generally A-shaped or H-shaped, that is, its outline resembles the letter A or the letter H; it comprises a front door frame and a rear door frame, and at least one cross beam is arranged on each door frame. Taking a quayside container crane as an example, the boom of the bridge crane comprises a sea-side boom (the boom extending toward the water), a land-side boom (the boom on the land side) and a middle boom (the boom located in the middle of the portal frame). A trolley with a hoisting mechanism runs on the boom; the hoisting mechanism bears the weight of the container lifting appliance and the container, and the container lifting appliance is used for grabbing containers. The structure of the bridge crane is known in the prior art and is not described in detail here.
Radar device: a device that scans the surrounding environment to obtain point cloud data; the scanned point cloud data comprises a plurality of points, each with corresponding three-dimensional coordinates.
During port operations, in order to ensure that the lifting appliance on a bridge crane can smoothly grab a container from a vehicle or place a container at a designated position on the vehicle, accurate alignment of the vehicle with the bridge crane is key to the success or failure of the operation. Vehicles in port operations are typically container trucks consisting of a tractor head and a trailer, and the bridge crane usually needs to grab containers from, or place containers onto, the trailer.
Generally, a driver with extensive driving experience drives the vehicle to the specified position under the bridge crane. In order to improve the efficiency and level of intelligence of port operations and reduce their labor cost, unmanned vehicles can be used to transport containers to and from the bridge crane, so the problem of how an unmanned vehicle drives to and stops at the designated position under the bridge crane, that is, how to determine the position of the unmanned vehicle under the bridge crane, needs to be solved.
In general, satellite positioning may be used to determine the location of an unmanned vehicle. However, the inventor finds that the bridge crane is usually large, when the unmanned vehicle runs near the bridge crane, the satellite positioning signal is easily blocked by the bridge crane, and the finally received satellite positioning signal is weak, so that the relative position of the unmanned vehicle and the bridge crane cannot be obtained through the satellite positioning signal, or the relative position of the unmanned vehicle and the bridge crane obtained through the satellite positioning signal is inaccurate. In addition, the bridge crane is also movable, so that the relative position of the unmanned vehicle and the bridge crane cannot be accurately known.
In the vehicle positioning method under the bridge crane provided by the embodiment of the disclosure, the radar device is arranged on the unmanned vehicle, and the scanning area of the radar device comprises the upper area of the unmanned vehicle. And acquiring point cloud data obtained by scanning of the radar device, and determining the relative position of the unmanned vehicle and the bridge crane according to the point cloud data, so that the positioning of the unmanned vehicle under the bridge crane is realized by means of the radar device, and the positioning accuracy of the unmanned vehicle under the bridge crane is effectively improved.
The vehicle positioning method under the bridge crane provided by the embodiments of the present disclosure may be applied to the unmanned vehicle 101 shown in fig. 1. As shown in fig. 1, a radar device 102 is provided on the unmanned vehicle 101, and the radar device 102 can scan the region above the unmanned vehicle 101.
The vehicle positioning method under the bridge crane provided by the embodiment of the disclosure can be suitable for the application scene of port operation as shown in fig. 1. The application scene comprises: the unmanned vehicle 101 and the bridge crane 103, the unmanned vehicle 101 is provided with a radar device 102, and the bridge crane 103 is provided with a beam 104. In port operations, the bridge crane 103 travels along a preset track parallel to the shore and the unmanned vehicle 101 transports containers back and forth along the roadway on the shore. For example, the unmanned vehicle 101 carries away the container unloaded from the ship by the bridge crane 103, and performs the ship unloading operation; or the unmanned vehicle 101 transports the container under the bridge crane 103 so that the bridge crane 103 puts the container on the ship for shipping.
In the process from the unmanned vehicle 101 entering the bridge crane 103 to exiting the bridge crane 103, the unmanned vehicle 101 passes through the front door frame and the rear door frame in sequence. For example, before the unmanned vehicle 101 drives into the bridge crane 103, the radar device 102 may scan the outer vertical surface of the beam 104 on the front door frame and the inner vertical surface of the beam 104 on the rear door frame; when the unmanned vehicle 101 passes through the front door frame and enters the bridge crane 103, the radar device 102 may scan the bottom surface of the beam 104 on the front door frame and the inner vertical surface of the beam 104 on the rear door frame; after the unmanned vehicle 101 has entered the bridge crane 103 and before it passes the rear door frame, the radar device 102 may scan the inner vertical surface of the beam 104 on the front door frame and the inner vertical surface of the beam 104 on the rear door frame; when the unmanned vehicle 101 passes through the rear door frame, the radar device 102 may scan the inner vertical surface of the beam 104 on the front door frame and the bottom surface of the beam 104 on the rear door frame; after the unmanned vehicle 101 exits the bridge crane 103, the radar device 102 may scan the inner vertical surface of the beam 104 on the front door frame and the outer vertical surface of the beam 104 on the rear door frame. The beam 104 has two vertical surfaces: an inner vertical surface and an outer vertical surface. The inner vertical surface of the beam 104 is the vertical surface facing the center of the bridge crane 103, and the outer vertical surface of the beam 104 is the vertical surface facing the outside of the bridge crane 103.
The following describes in detail, with specific embodiments, a technical solution of an embodiment of the present disclosure and how the technical solution of the present application solves the foregoing technical problems. The following embodiments may be combined with each other, and the same or similar concepts or processes may not be described in detail in some embodiments. Embodiments of the present disclosure will be described below with reference to the accompanying drawings.
Fig. 2 is a flow chart of a method for positioning a vehicle under a bridge crane according to an embodiment of the disclosure. As shown in fig. 2, the method includes:
s201, acquiring point cloud data obtained by scanning of a radar device, wherein a scanning area of the radar device comprises an upper area of an unmanned vehicle.
Specifically, in the process of the unmanned vehicle driving towards the bridge crane, the point cloud data obtained by scanning of the radar device are acquired. The three-dimensional coordinates of the points in the scanned point cloud data can reflect the relative positions between the radar device and objects in its vicinity, including the bridge crane scanned by the radar device.
In one possible embodiment, the scanning area of the radar device includes the area above the unmanned vehicle. The bridge crane is typically located above the unmanned vehicle; for example, it is in front of and above the unmanned vehicle while the vehicle drives into the bridge crane, and behind and above the unmanned vehicle while the vehicle drives out. Therefore, the radar device can scan the bridge crane while the unmanned vehicle drives into or out of the bridge crane.
In one possible embodiment, an imaging device may be disposed on the unmanned vehicle to capture an image of the area in front of the vehicle, and the estimated relative distance between the unmanned vehicle and the bridge crane is determined by recognizing and analyzing the captured image. The estimated relative distance may be used to decide whether to activate the radar device and determine the relative position of the unmanned vehicle and the bridge crane from the point cloud data scanned by the radar device. Specifically, if the estimated relative distance is less than or equal to a preset distance threshold, the radar device is started to perform radar positioning. The estimated relative distance is a rough measure of the distance between the unmanned vehicle and the bridge crane.
In one possible embodiment, the estimated relative distance between the unmanned vehicle and the bridge crane can be determined by a preset positioning device (such as a satellite positioning device, a bluetooth positioning device, etc.) on the unmanned vehicle and a preset positioning device (such as a satellite positioning device, a bluetooth positioning device, etc.) on the bridge crane, so as to determine whether to turn on the radar device for positioning.
S202, determining the relative position of the unmanned vehicle and the bridge crane according to the point cloud data.
Specifically, the point cloud data obtained by scanning of the radar device can reflect the relative position of the bridge crane and the radar device. Because the radar device is mounted on the unmanned vehicle, the relative position of the bridge crane and the radar device can either be taken directly as the relative position of the bridge crane and the unmanned vehicle, or be converted into the relative position of the bridge crane and the unmanned vehicle according to the positional relationship between the radar device and the unmanned vehicle.
In one possible implementation, three-dimensional modeling can be performed according to point cloud data obtained by scanning of the radar device, so as to obtain a bridge crane model and a relative position of the bridge crane model and the radar device. The position of the unmanned vehicle model established in advance is determined according to the position of the radar device. According to the position of the unmanned vehicle model and the relative position of the bridge crane model and the radar device, the relative position of the unmanned vehicle model and the bridge crane model can be determined, and then the relative position of the unmanned vehicle and the bridge crane can be determined. Therefore, by carrying out three-dimensional modeling on the point cloud data, the accuracy of positioning the unmanned vehicle under the bridge crane is improved. In this case, since the radar device may scan only a local area of the bridge crane, the bridge crane model may also be a three-dimensional model of the local area.
According to the embodiment of the disclosure, the radar device on the unmanned vehicle scans to obtain the point cloud data, and the relative position of the unmanned vehicle and the bridge crane is determined according to the point cloud data, so that the relative position of the unmanned vehicle and the bridge crane is determined without depending on satellite positioning, and the accuracy of the relative position of the unmanned vehicle and the bridge crane is improved.
Fig. 3 is a flow chart of a method for positioning a vehicle under a bridge crane according to another embodiment of the disclosure. As shown in fig. 3, the method includes:
s301, acquiring point cloud data obtained by scanning of a radar device.
Specifically, step S301 may refer to step S201, and detailed descriptions of each possible embodiment in step S201 are omitted here.
S302, determining the relative distance between at least one marker on the bridge crane and the unmanned vehicle according to the point cloud data.
At least one marker may be provided on the bridge crane in advance, and the marker may be an object that is easily recognized, such as an object of a preset shape (e.g., a cuboid, a sphere, etc.).
If there is one marker, it can be scanned by the radar device on the unmanned vehicle while the unmanned vehicle drives into the bridge crane, so that the relative position of the unmanned vehicle and the bridge crane during this process can be determined from the relative distance between the marker and the unmanned vehicle. If there are multiple markers, they are located at different positions of the bridge crane, so that the radar device can always scan at least one marker during the whole process from the unmanned vehicle driving into the bridge crane to driving out of it, and the relative position of the unmanned vehicle and the bridge crane can be determined from the relative distances between the markers and the unmanned vehicle throughout this process.
Specifically, based on the easily identifiable characteristic of the markers, points located on at least one marker may be identified in the point cloud data, and a relative distance between the at least one marker and the radar device may be determined based on the points located on the at least one marker. The radar device is positioned on the unmanned vehicle, so that the relative distance between the at least one marker and the radar device can be determined as the relative distance between the at least one marker and the unmanned vehicle, and the relative distance between the at least one marker and the radar device can be converted according to the position relation between the radar device and the unmanned vehicle to obtain the relative distance between the at least one marker and the unmanned vehicle.
The portal frame of the bridge crane comprises a front door frame and a rear door frame, where the rail direction of the bridge crane is taken as the front-rear direction, and at least one cross beam is arranged on each door frame. In the process of the unmanned vehicle driving into and out of the bridge crane, that is, driving in through the front door frame and out through the rear door frame, the radar device on the unmanned vehicle can scan the front door frame and the rear door frame, including at least one beam on the front door frame and at least one beam on the rear door frame. Thus, in one possible embodiment, at least one marker may be provided on the front or rear door frame, so that the radar device scans the marker while the unmanned vehicle drives into and/or out of the bridge crane, which facilitates determining the relative position of the bridge crane and the unmanned vehicle from the relative distance between the marker and the unmanned vehicle.
Further, the marker may include at least one reflective strip disposed on the front door frame and/or at least one reflective strip disposed on the rear door frame. Because a reflective strip reflects radar signals strongly, points falling on the reflective strip can be identified more accurately in the point cloud data, so that the relative distance between the at least one marker and the radar device can be determined more accurately.
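As a hedged sketch of this idea in Python (using NumPy), points falling on a reflective strip could be pre-selected by their return intensity before further processing. The existence of a per-point intensity channel, its normalization, and the threshold value are assumptions of this sketch, not requirements of the disclosure.

```python
import numpy as np

def select_high_intensity_points(points_xyz: np.ndarray,
                                 intensity: np.ndarray,
                                 intensity_threshold: float = 0.8) -> np.ndarray:
    """Keep only points whose (normalized) return intensity exceeds a threshold,
    as candidates for points falling on a reflective strip."""
    mask = intensity >= intensity_threshold
    return points_xyz[mask]
```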
Further, the marker may include at least one beam of the front door frame and/or at least one beam of the rear door frame. Because the beam is fixed in position on the bridge crane, has an easily recognized shape, and can be scanned by the radar device while the unmanned vehicle drives in and/or out, the points falling on the beam can be determined more accurately in the point cloud data, the relative distance between the at least one marker and the radar device can in turn be determined more accurately, and no additional marker needs to be added to the bridge crane.
Further, the marker may include at least one light reflecting strip provided on the cross beam of the front door frame and/or at least one light reflecting strip provided on the cross beam of the rear door frame, so that the relative distance of the at least one marker to the radar apparatus is more accurately determined in combination with the above-described features of the cross beam and the light reflecting strips.
S303, obtaining the relative position of the unmanned vehicle and the bridge crane according to the relative distance between at least one marker and the unmanned vehicle.
Specifically, after the relative distance between the at least one marker and the unmanned vehicle is obtained, the relative position of the unmanned vehicle and the bridge crane can be obtained according to the position of the at least one marker on the bridge crane and the relative distance between the at least one marker and the unmanned vehicle.
In a possible embodiment, in the case that the at least one marker includes at least one beam of the front side door frame and/or at least one beam of the rear side door frame, the relative distance between the at least one beam and the unmanned vehicle reflects the relative distance between the unmanned vehicle and the front side door frame and/or the rear side door frame. From this distance, the position of the unmanned vehicle relative to the bridge crane can be clearly known, so the relative position of the unmanned vehicle and the bridge crane can be obtained from the relative distance between the at least one beam and the unmanned vehicle; the relative position in this case includes the relative distance between the unmanned vehicle and the at least one beam, or the relative distance between the unmanned vehicle and the front side door frame and/or the rear side door frame. In addition, the case in which the at least one marker includes at least one reflective strip disposed on the beam of the front side door frame and/or at least one reflective strip disposed on the beam of the rear side door frame can be handled with reference to the case in which the at least one marker includes at least one beam of the front side door frame and/or at least one beam of the rear side door frame, and is not described in detail.
According to the embodiment of the disclosure, the radar device on the unmanned vehicle scans to obtain the point cloud data, the relative distance between the unmanned vehicle and at least one marker on the bridge crane is determined according to the point cloud data, and the relative position between the unmanned vehicle and the bridge crane is obtained according to the relative distance between the unmanned vehicle and at least one marker on the bridge crane, so that satellite positioning is not required, the relative position between the unmanned vehicle and the bridge crane is determined, and the accuracy of the relative position between the unmanned vehicle and the bridge crane is improved.
Fig. 4 is a flow chart of a method for positioning a vehicle under a bridge crane according to another embodiment of the disclosure. As shown in fig. 4, the method includes:
s401, acquiring point cloud data obtained by scanning of a radar device.
Specifically, step S401 may refer to step S201 and the detailed description of each possible embodiment in step S201, which will not be repeated.
In one possible implementation manner, after the point cloud data obtained by scanning the radar device are obtained, the point cloud data can be converted from the radar coordinate system of the radar device to the vehicle body coordinate system of the unmanned vehicle according to the conversion relation between the radar coordinate system and the vehicle body coordinate system, and then the relative positions of the unmanned vehicle and the bridge crane can be determined in the vehicle body coordinate system according to the converted point cloud data, so that the convenience of subsequent point cloud data processing is improved, and the positioning efficiency of the unmanned vehicle under the bridge crane is further improved.
Wherein, the radar coordinate system and the vehicle body coordinate system are predefined or preset three-dimensional coordinate systems. The radar coordinate system is a three-dimensional coordinate system established with the position of the radar device as its origin, and the three-dimensional coordinates of each point in the point cloud data scanned by the radar device are the coordinates of that point in the radar coordinate system; the vehicle body coordinate system of the unmanned vehicle is a three-dimensional coordinate system established with the unmanned vehicle as its origin. For example, to simplify data processing, the positive x-axis may be taken as pointing straight ahead of the vehicle body, the positive y-axis as pointing directly to the left of the vehicle body, and the positive z-axis as pointing directly above the vehicle body, and an origin may be chosen arbitrarily within the unmanned vehicle; for example, if the unmanned vehicle is represented by a cuboid, the center of the cuboid can be taken as the origin, and the origin together with the x-axis, y-axis and z-axis constitutes the vehicle body coordinate system of the unmanned vehicle.
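As an illustration of converting points from the radar coordinate system to the vehicle body coordinate system, the sketch below (Python with NumPy) applies an extrinsic rotation and translation to radar-frame points. The specific extrinsic values and function names are assumptions made for illustration, not parameters given in the disclosure.

```python
import numpy as np

def radar_to_body(points_radar: np.ndarray, R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Convert an (N, 3) array of radar-frame points into the vehicle body frame.

    R (3x3 rotation) and t (3,) translation describe the radar pose in the body
    frame; they are assumed to come from an offline extrinsic calibration.
    """
    return points_radar @ R.T + t

# Example with assumed extrinsics: radar mounted 2.0 m ahead of the body origin
# and 3.5 m above it, with no rotation between the two frames.
R_example = np.eye(3)
t_example = np.array([2.0, 0.0, 3.5])
points_body = radar_to_body(np.array([[1.0, 0.5, 10.0]]), R_example, t_example)
```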
S402, determining the position of at least one marker in a preset coordinate system according to the point cloud data.
The preset coordinate system can be a radar coordinate system or a vehicle body coordinate system.
Specifically, the three-dimensional coordinates, in a preset coordinate system, of each point on the bridge crane scanned by the radar device can be obtained from the point cloud data, and the position of the at least one marker can be determined in the preset coordinate system from these coordinates. For example, the three-dimensional coordinates of the points located on the marker are weighted and averaged to obtain the three-dimensional coordinates of the marker in the preset coordinate system. As another example, the point closest to the center of the marker is selected from the points located on the marker, and its three-dimensional coordinates are determined as the three-dimensional coordinates of the marker in the preset coordinate system.
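The two example strategies above (weighted averaging and nearest-to-center selection) might look as follows, assuming the points on the marker have already been extracted from the point cloud; the weighting scheme, the center hint, and the function names are illustrative assumptions of this sketch.

```python
import numpy as np

def marker_position_centroid(marker_points: np.ndarray, weights=None) -> np.ndarray:
    """Weighted average of the (N, 3) points that fall on the marker."""
    return np.average(marker_points, axis=0, weights=weights)

def marker_position_nearest(marker_points: np.ndarray,
                            marker_center_hint: np.ndarray) -> np.ndarray:
    """Coordinates of the scanned point closest to an assumed marker center."""
    d = np.linalg.norm(marker_points - marker_center_hint, axis=1)
    return marker_points[np.argmin(d)]
```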
In one possible embodiment, since the radar device generally scans only one or more of the marker's sides, and the sides scanned at different times may differ as the position of the unmanned vehicle changes, the position of one or more sides of the marker in the preset coordinate system may be determined from the three-dimensional coordinates of the points on the marker in the point cloud data, and the position of the one or more sides may be determined as the position of the marker in the preset coordinate system, thereby improving the accuracy of the marker position calculation.
Further, when the point cloud data is located on a plurality of sides of the marker, the side with the largest number of points in the point cloud data can be selected from the plurality of sides, and the position of the side in the preset coordinate system is determined as the position of the marker, so that the accuracy of calculating the position of the marker is improved.
The scanning area of the radar apparatus includes an upper area of the unmanned vehicle. In a possible embodiment, in case the at least one marker comprises at least one beam on the front door frame and/or at least one beam on the rear door frame, the radar device may scan for an outer vertical surface of the at least one beam on the front door frame and an inner vertical surface of the at least one beam on the rear door frame before the unmanned vehicle is driven into the bridge crane; when an unmanned vehicle passes through the front door frame and enters the bridge crane, the radar device can scan to obtain the bottom surface of at least one cross beam on the front door frame and the inner vertical surface of at least one cross beam on the rear door frame; after the unmanned vehicle is driven into the bridge crane and before the unmanned vehicle passes through the rear side door frame, the radar device can scan to obtain the inner vertical surface of at least one cross beam on the front side door frame and the inner vertical surface of at least one cross beam on the rear side door frame; when the unmanned vehicle passes through the rear side door frame, the radar device can scan to obtain the inner vertical surface of at least one cross beam on the front side door frame and the bottom surface of at least one cross beam on the rear side door frame; after the unmanned vehicle exits the bridge crane, the radar device may scan to obtain the inside vertical surface of at least one beam on the front door frame and the outside vertical surface of at least one beam on the rear door frame.
Thus, in case the at least one marker comprises at least one beam on the front door frame and/or at least one beam on the rear door frame, the point located on the vertical face of the at least one beam may be determined from the three-dimensional coordinates of the points in the point cloud data, and the position of the vertical face of the at least one beam may be determined in a preset coordinate system from the point falling on the vertical face of the at least one beam. In addition, the case that the at least one marker includes at least one light reflecting strip disposed on the beam of the front side door frame and/or at least one light reflecting strip disposed on the beam of the rear side door frame may refer to the case that the at least one marker includes at least one beam of the front side door frame and/or at least one beam of the rear side door frame, which will not be described in detail.
S403, determining the relative distance between at least one marker and the unmanned vehicle according to the position of at least one marker in a preset coordinate system.
Specifically, after the position of the at least one marker in the preset coordinate system is determined, if the preset coordinate system is the vehicle body coordinate system, the relative distance between the position of the at least one marker and the origin of the coordinate system can be determined as the relative distance between the at least one marker and the unmanned vehicle. If the preset coordinate system is the radar coordinate system, the position coordinates corresponding to the origin of the vehicle body coordinate system are first determined in the radar coordinate system according to the conversion relationship between the vehicle body coordinate system and the radar coordinate system, and the relative distance between the position of the at least one marker and these position coordinates is then determined to obtain the relative distance between the at least one marker and the unmanned vehicle.
S404, obtaining the relative position of the unmanned vehicle and the bridge crane according to the relative distance between at least one marker and the unmanned vehicle.
Specifically, after the relative distance between the vertical surface of at least one marker and the unmanned vehicle is obtained, the relative distance between the vertical surface of the at least one marker and the unmanned vehicle can be determined as the relative position between the unmanned vehicle and the bridge crane.
In one possible embodiment, the number of the markers is greater than or equal to 2, in the process of determining the relative positions of the unmanned vehicle and the bridge crane, the position of each of the at least two markers can be determined in a preset coordinate system according to the point cloud data, the relative distance between each marker and the unmanned vehicle is determined according to the position of each marker, and the relative position between the unmanned vehicle and the bridge crane is determined according to the relative distance between each marker and the unmanned vehicle, so that the accuracy of the relative positions of the unmanned vehicle and the bridge crane is improved through the relative distances between the at least two markers and the unmanned vehicle.
Further, the at least two markers include at least one beam on the front door frame and at least one beam on the rear door frame. Therefore, the relative positions of the unmanned vehicle and the front side door frame and the rear side door frame of the bridge crane can be determined through the relative distances between the at least one cross beam positioned on the front side door frame and the at least one cross beam positioned on the rear side door frame and the unmanned vehicle, so that whether the unmanned vehicle enters the bridge crane or exits the bridge crane or not can be determined, and the accuracy of the relative positions of the unmanned vehicle and the bridge crane can be improved.
In a possible embodiment, when determining the relative position of the unmanned vehicle and the bridge crane according to the relative distance between each of the at least two markers and the unmanned vehicle, the relative distances between the markers and the unmanned vehicle may be combined, for example summed and averaged, to obtain the relative distance between the unmanned vehicle and a preset reference position on the bridge crane, and this relative distance is determined as the relative position of the unmanned vehicle and the bridge crane. In this way, by presetting a reference position on the bridge crane, the relative distance between the unmanned vehicle and the reference position on the bridge crane can be determined. The reference position is located in the middle of the at least two markers. Furthermore, the reference position can be set at the parking position under the bridge crane, such as the middle position of the bridge crane, so that the unmanned vehicle can travel to the reference position according to the relative distance between the unmanned vehicle and the reference position, which facilitates loading and unloading of the unmanned vehicle by the lifting appliance on the bridge crane.
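For instance, with one beam on the front door frame and one on the rear door frame as the two markers, the distance to a reference position midway between them could be computed as in the sketch below; treating the two distances as signed values along the driving direction is an assumption of this illustration, not a requirement of the disclosure.

```python
def distance_to_reference(front_beam_dist: float, rear_beam_dist: float) -> float:
    """Signed distance (along the driving direction) from the vehicle to a
    reference position midway between the front and rear door-frame beams.

    Positive values mean the reference position is still ahead of the vehicle.
    """
    return (front_beam_dist + rear_beam_dist) / 2.0

# Example: front beam 12.0 m ahead, rear beam 4.0 m behind (-4.0)
# -> the midpoint reference position is 4.0 m ahead of the vehicle.
offset = distance_to_reference(12.0, -4.0)
```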
According to the embodiment of the disclosure, the radar device on the unmanned vehicle scans to obtain the point cloud data, the relative distance between the unmanned vehicle and at least one marker on the bridge crane is determined according to the point cloud data, and the relative position between the unmanned vehicle and the bridge crane is obtained according to the relative distance between the unmanned vehicle and at least one marker, so that the relative position between the unmanned vehicle and the bridge crane is determined without depending on satellite positioning, and the accuracy of the relative position between the unmanned vehicle and the bridge crane is effectively improved.
Fig. 5 is a flow chart of a method for positioning a vehicle under a bridge crane according to another embodiment of the disclosure. As shown in fig. 5, the method includes:
s501, acquiring point cloud data obtained by scanning of a radar device.
Specifically, step S501 may refer to step S401 and detailed descriptions of each possible implementation in step S401, which are not repeated.
S502, screening the point cloud data, wherein the screened point cloud data are located in the interest area corresponding to at least one marker.
Specifically, the region of interest corresponding to the at least one marker in the current driving scenario may be determined, and the point cloud data are screened according to the region of interest to obtain the point cloud data within the region of interest corresponding to the marker. If there are multiple markers, the point cloud data within the region of interest corresponding to each marker are obtained respectively. The region of interest corresponding to the at least one marker is the position region occupied by the at least one marker in a preset coordinate system.
In a possible implementation manner, in the case that the marker includes at least one reflective strip disposed on the bridge crane, based on the fact that the reflective strip reflects radar signals strongly, the region of interest corresponding to the reflective strip can be determined according to the degree of aggregation of the points in the point cloud data together with the size and shape of the reflective strip, and the point cloud data within that region of interest can then be obtained.
In one possible embodiment, where the marker comprises at least one beam on the front door frame and/or at least one beam on the rear door frame, an estimated relative position of the unmanned vehicle and the bridge crane may be obtained, and the region of interest corresponding to the at least one beam is determined based on this estimated relative position, the position of the at least one beam on the bridge crane, and the size of the at least one beam. The point cloud data are then screened according to the region of interest corresponding to the at least one beam to obtain the screened point cloud data: during screening, the points within the region of interest corresponding to the at least one beam are obtained from the point cloud data, and these points form the screened point cloud data. The position of the at least one beam on the bridge crane and the size of the at least one beam are preset bridge crane parameters. The region of interest corresponding to the at least one beam is the position region occupied by the at least one beam in a preset coordinate system, and its shape and size are the same as those of the beam. The preset coordinate system is the radar coordinate system of the radar device or the vehicle body coordinate system of the unmanned vehicle.
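A minimal sketch of this screening step, assuming the region of interest is approximated by an axis-aligned box in the preset coordinate system (derived, for example, from the estimated relative position and the known beam size); the box representation and the example bounds are assumptions of the sketch.

```python
import numpy as np

def filter_points_in_roi(points: np.ndarray,
                         roi_min: np.ndarray,
                         roi_max: np.ndarray) -> np.ndarray:
    """Return the subset of (N, 3) points that lie inside an axis-aligned ROI box."""
    inside = np.all((points >= roi_min) & (points <= roi_max), axis=1)
    return points[inside]

# Example ROI (assumed values): a box around the expected location of the front beam.
roi_min = np.array([8.0, -15.0, 10.0])
roi_max = np.array([12.0, 15.0, 14.0])
```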
Further, an imaging device can be arranged on the unmanned vehicle to capture an image of the area in front of the vehicle, and the estimated relative position of the unmanned vehicle and the bridge crane is determined by recognizing and analyzing the captured image; the estimated relative position is a rough measure of the relative position of the unmanned vehicle and the bridge crane. Alternatively, the estimated relative position of the unmanned vehicle and the bridge crane can be obtained through a preset positioning device (such as a satellite positioning device, a Bluetooth positioning device, etc.) on the unmanned vehicle and a preset positioning device (such as a satellite positioning device, a Bluetooth positioning device, etc.) on the bridge crane, thereby improving the efficiency of obtaining the estimated relative position.
S503, determining the position of at least one marker in a preset coordinate system according to the screened point cloud data.
Specifically, step S503 may refer to the detailed description of step 402, and will not be described in detail.
In a possible implementation manner, one or more sides of at least one marker can be obtained by fitting according to the screened point cloud data in a preset coordinate system, so that accuracy of determining the sides of the at least one marker is effectively improved in a plane fitting mode.
In a possible embodiment, in the case that the marker includes at least one beam on the front door frame and/or at least one beam on the rear door frame, the vertical surface of each beam may be obtained by fitting the screened point cloud data. In the process of fitting the vertical surface of a beam, the normal vector of each point is calculated for the point cloud data in the region of interest corresponding to the beam, where the normal vector can be expressed as (n_x, n_y, n_z) and satisfies n_x² + n_y² + n_z² = 1. The calculated normal vector is projected onto the XOY plane of the preset coordinate system to obtain a corresponding two-dimensional vector, which can be expressed as (n_x, n_y). According to the two-dimensional vector obtained by projection, the points whose normal vectors are approximately parallel to the XOY plane are obtained from the point cloud data and determined as points falling on the vertical surface of the beam. The vertical plane equation of the beam, i.e. the vertical surface of the beam, is then obtained by plane-fitting the points falling on the vertical surface of the beam.
As an example, when obtaining the points whose normal vectors are parallel to the XOY plane according to the two-dimensional vector obtained by projection, the points may be screened from the point cloud data with a preset threshold and a screening formula. The screening formula can be expressed, for example, as n_x² + n_y² ≥ T, where T is a preset threshold close to 1; since the normal vector has unit length, this is equivalent to requiring that |n_z| be small, that is, that the normal vector be approximately parallel to the XOY plane.
In one possible implementation manner, a random sample consensus (RANdom SAmple Consensus, RANSAC) approach may be adopted to perform plane fitting on the points falling on the vertical surface of the beam to obtain the vertical plane equation of the beam. At the same time, the influence of stray points in the point cloud data on the fitting can be eliminated, yielding the inliers of the fitting process, that is, the points that are finally used to determine the vertical plane equation.
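The normal-vector screening and RANSAC fitting described above can be sketched as follows in Python with NumPy. This is a minimal illustration: per-point unit normals are assumed to be available (for example, estimated beforehand from local neighborhoods), and the thresholds, iteration count, and function names are assumptions of the sketch rather than values given in the disclosure.

```python
import numpy as np

def vertical_surface_points(points, normals, t=0.95):
    """Keep points whose unit normal is approximately parallel to the XOY plane,
    i.e. n_x**2 + n_y**2 >= t (equivalently, |n_z| is small)."""
    mask = normals[:, 0] ** 2 + normals[:, 1] ** 2 >= t
    return points[mask]

def ransac_plane(points, n_iters=200, dist_thresh=0.05, rng=None):
    """Minimal RANSAC plane fit: returns (n, d, inlier_mask) for the plane
    n . x + d = 0 supported by the largest number of inliers."""
    rng = np.random.default_rng() if rng is None else rng
    best_inliers, best_plane = None, None
    for _ in range(n_iters):
        sample = points[rng.choice(len(points), size=3, replace=False)]
        n = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(n)
        if norm < 1e-9:                      # skip degenerate (collinear) samples
            continue
        n = n / norm
        d = -n.dot(sample[0])
        inliers = np.abs(points @ n + d) < dist_thresh
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers, best_plane = inliers, (n, d)
    return best_plane[0], best_plane[1], best_inliers
```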
In a possible implementation manner, after the vertical surface of the at least one beam is obtained by fitting, whether the vertical surface of the at least one beam obtained by fitting is the inner vertical surface of the at least one beam relative to the center of the bridge crane or the outer vertical surface of the at least one beam relative to the center of the bridge crane can be determined according to the point cloud data in the region of interest corresponding to the at least one beam, so that the accuracy of the relative position of the unmanned vehicle and the bridge crane is improved.
In one possible embodiment, in determining whether the fitted vertical surface of the at least one beam is an inner vertical surface of the at least one beam with respect to the center of the bridge crane or an outer vertical surface of the at least one beam with respect to the center of the bridge crane, the determination may be made according to whether the at least one beam is located in the front door frame or the rear door frame, and whether the distance between the vertical surface of the at least one beam and the unmanned vehicle is a positive value or a negative value, thereby improving the efficiency of determining whether the vertical surface of the beam is an inner vertical surface or an outer vertical surface of the beam.
Specifically, when the beam is located at the front side door frame and the distance between the vertical surface of the beam obtained by fitting and the unmanned vehicle is a negative value, the beam on the front side door frame is described as being located at the rear of the unmanned vehicle, and the vertical surface is the inner vertical surface of the beam; when the beam is positioned on the front side door frame and the distance between the vertical surface of the beam obtained by fitting and the unmanned vehicle is a positive value, the beam on the front side door frame is positioned in front of the unmanned vehicle, and the vertical surface is the outer vertical surface of the beam; when the beam is positioned on the rear side door frame and the distance between the vertical surface of the beam obtained by fitting and the unmanned vehicle is a negative value, the beam on the rear side door frame is positioned behind the unmanned vehicle, and the vertical surface is the outer vertical surface of the beam; when the beam is positioned on the rear side door frame and the distance between the vertical surface of the beam obtained by fitting and the unmanned vehicle is a positive value, the beam on the rear side door frame is positioned in front of the unmanned vehicle, and the vertical surface is the inner vertical surface of the beam.
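The sign-based decision above could be sketched as follows, assuming each fitted vertical surface is summarized by its signed distance along the vehicle's forward x-axis (positive ahead of the vehicle, negative behind); the function name and this signed-distance convention are illustrative assumptions.

```python
def classify_vertical_surface(is_front_door_frame: bool, signed_distance_x: float) -> str:
    """Decide whether a fitted vertical surface of a door-frame beam is the inner
    or outer vertical surface, based on which door frame the beam belongs to and
    whether the beam lies ahead of (+) or behind (-) the unmanned vehicle."""
    if is_front_door_frame:
        return "inner" if signed_distance_x < 0 else "outer"
    return "outer" if signed_distance_x < 0 else "inner"
```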
S504, determining the relative distance between at least one marker and the unmanned vehicle according to the position of at least one marker in a preset coordinate system.
Specifically, step S504 may refer to the detailed description of step S403, and will not be described in detail.
In a possible implementation, when the marker includes at least one beam on the front door frame and/or at least one beam on the rear door frame, and taking the vehicle body coordinate system as the preset coordinate system, the following may be performed after the vertical surfaces of the at least one beam are obtained by fitting. For each vertical surface, the inliers used to fit that surface are projected onto the XOY plane of the preset coordinate system to obtain their projection points; a straight line is fitted to these projection points to obtain the corresponding straight-line equation; the relative distance between the line defined by that equation and the origin of the vehicle body coordinate system is calculated; and this distance is taken as the relative distance between the vertical surface and the origin of the vehicle body coordinate system. This improves both the accuracy and the computational efficiency of the relative distance between the vertical surface of the beam and the unmanned vehicle.
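As a rough sketch of this projection-and-line-fit step, the Python function below assumes the inliers are given as an N×3 array in the vehicle body coordinate system and fits the 2D line by its principal direction; the original implementation may fit the line differently, and the function name is an illustrative assumption.

```python
import numpy as np

def surface_distance_to_origin(inlier_points):
    """Project plane-fit inliers onto the XOY plane, fit a 2D line through them,
    and return the perpendicular distance from that line to the body-frame origin."""
    xy = np.asarray(inlier_points)[:, :2]        # projection of the inliers onto the XOY plane
    centroid = xy.mean(axis=0)
    _, _, vt = np.linalg.svd(xy - centroid)      # principal direction = direction of the fitted line
    dx, dy = vt[0]
    # distance from the origin (0, 0) to the line through `centroid` with unit direction (dx, dy)
    return abs(centroid[0] * dy - centroid[1] * dx)
```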
S505, obtaining the relative position of the unmanned vehicle and the bridge crane according to the relative distance between at least one marker and the unmanned vehicle.
Specifically, step S505 may refer to step S404, and detailed descriptions of each possible implementation of step S404 are omitted herein.
In the embodiment of the disclosure, the radar device on the unmanned vehicle scans the area above the vehicle to obtain point cloud data; the point cloud data are screened and plane-fitted to determine the relative distance between the unmanned vehicle and at least one marker on the bridge crane; and the relative position of the unmanned vehicle and the bridge crane is obtained from that relative distance. The relative position is thus determined without relying on satellite positioning, which effectively improves the accuracy of the relative position of the unmanned vehicle and the bridge crane.
In the examples shown in fig. 2 to 5, the following possible implementations may also be included:
In one possible embodiment, the radar device comprises one or more lidars. The point cloud data scanned by a lidar is of high quality, so scanning with a lidar improves the accuracy of the relative position determined between the unmanned vehicle and the bridge crane.
In one possible embodiment, the radar apparatus includes one or more of the following: a rotary lidar, a solid-state lidar. A rotary lidar can rotate through 360 degrees, and the range of its rotational scan covers the area above the unmanned vehicle. A solid-state lidar has no rotating component; it steers the laser beam by a phased-array principle and scans objects within a preset range, which likewise covers the area above the unmanned vehicle.
In one possible embodiment, the rotary lidar is a multi-line lidar, which emits laser beams on multiple lines to enlarge the scanning coverage and improve the scanning efficiency of the radar apparatus.
Further, the transmitter in the multi-line lidar rotates about the Y axis of the vehicle body coordinate system of the unmanned vehicle, where the Y axis points directly to the left or directly to the right of the vehicle. The multi-line lidar can therefore sweep through a full 360 degrees in a vertical plane of the vehicle body coordinate system, covering the Z-axis direction, so that the bridge crane can be fully scanned.
In one possible embodiment, the radar device is disposed above the head of the unmanned vehicle. A multi-line lidar mounted above the vehicle head can fully scan the area above the unmanned vehicle, and this placement also facilitates control of the position of the vehicle head.
Fig. 6 is a schematic structural diagram of a vehicle positioning device under a bridge crane according to an embodiment of the disclosure, where a radar device is disposed on an unmanned vehicle. As shown in fig. 6, the vehicle positioning device under the bridge crane includes:
an acquisition module 601, configured to acquire point cloud data obtained by scanning by a radar device;
The positioning module 602 is configured to determine a relative position of the unmanned vehicle and the bridge crane according to the point cloud data.
In one possible implementation, the positioning module 602 is specifically configured to:
determining the relative distance between at least one marker on the bridge crane and the unmanned vehicle according to the point cloud data;
and obtaining the relative position of the unmanned vehicle and the bridge crane according to the relative distance between the at least one marker and the unmanned vehicle.
In one possible implementation, the positioning module 602 is specifically configured to:
determining a region of interest corresponding to the at least one marker;
screening the point cloud data according to the region of interest corresponding to the at least one marker, wherein the screened point cloud data is located in the region of interest corresponding to the at least one marker;
determining the position of at least one marker in a preset coordinate system according to the screened point cloud data;
based on the location of the at least one marker, a relative distance between the at least one marker and the unmanned vehicle is determined.
In one possible embodiment, the number of markers is 2 or more; the positioning module 602 is specifically configured to:
determining the relative distance between the unmanned vehicle and a preset reference position on the bridge crane according to the relative distance between each of at least two markers on the bridge crane and the unmanned vehicle;
and determining the relative distance between the unmanned vehicle and the reference position as the relative position of the unmanned vehicle and the bridge crane.
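As an illustration only, if the preset reference position is taken to be the midline between the front and rear door-frame beams (an assumption, since the disclosure leaves the choice of reference position open), the combination step could be as simple as the sketch below; the sign convention and function name are illustrative.

```python
def vehicle_to_reference_position(d_front: float, d_rear: float) -> float:
    """Distance from the vehicle to an assumed reference position: the midline
    between the front and rear door-frame beams of the bridge crane.

    d_front / d_rear: signed distances from the vehicle to the front and rear
    beams (positive = ahead of the vehicle, negative = behind it)."""
    return (d_front + d_rear) / 2.0   # midpoint of the two beams
```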
In one possible embodiment, the scanning range of the radar apparatus includes an upper region of the unmanned vehicle.
In one possible embodiment, the radar apparatus comprises one or more lidars.
In one possible embodiment, the radar apparatus includes one or more of the following: rotary lidar, solid-state lidar.
In one possible embodiment, the rotary lidar is a multi-line lidar.
Further, the transmitter in the multi-line lidar rotates about the Y axis of the vehicle body coordinate system of the unmanned vehicle, where the Y axis points directly to the left or directly to the right of the vehicle. The multi-line lidar can therefore sweep through a full 360 degrees in a vertical plane of the vehicle body coordinate system, covering the Z-axis direction, so that the bridge crane can be fully scanned.
In one possible embodiment, the radar device is fixed above the head of the unmanned vehicle.
The vehicle positioning device under the bridge crane shown in fig. 6 may perform the corresponding method embodiments described above; its implementation principle and technical effects are similar and are not repeated here.
Fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the disclosure. As shown in fig. 7, the electronic device may include: a processor 701 and a memory 702. The memory 702 is used for storing computer-executable instructions, and the processor 701 implements the method of any of the embodiments described above when executing those instructions.
The processor 701 may be a general-purpose processor, such as a central processing unit (CPU) or a network processor (NP); it may also be a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component. The memory 702 may include random access memory (RAM) and may also include non-volatile memory, such as at least one magnetic disk memory.
An embodiment of the present disclosure also provides an unmanned vehicle including a radar apparatus and an electronic device provided by the embodiment shown in fig. 7. Therefore, the unmanned vehicle can realize automatic positioning under the bridge crane and determine the relative position of the unmanned vehicle and the bridge crane.
In one possible embodiment, the scanning range of the radar apparatus includes an upper region of the unmanned vehicle.
In one possible embodiment, the radar apparatus comprises one or more lidars.
In one possible embodiment, the radar apparatus includes one or more of the following: rotary lidar, solid-state lidar.
In one possible embodiment, the rotary lidar is a multi-line lidar.
Further, the transmitter in the multi-line lidar rotates about the Y axis of the vehicle body coordinate system of the unmanned vehicle, where the Y axis points directly to the left or directly to the right of the vehicle. The multi-line lidar can therefore sweep through a full 360 degrees in a vertical plane of the vehicle body coordinate system, covering the Z-axis direction, so that the bridge crane can be fully scanned.
In one possible embodiment, the radar device is fixed above the head of the unmanned vehicle.
An embodiment of the present disclosure also provides a computer-readable storage medium having instructions stored therein, which when run on a computer, cause the computer to perform the method of any of the embodiments described above.
An embodiment of the present disclosure also provides a program product containing instructions. The program product comprises a computer program stored in a storage medium; at least one processor can read the computer program from the storage medium and, by executing it, implement the method of any of the embodiments described above.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure. The embodiments of the present disclosure are intended to cover any variations, uses, or adaptations of the disclosure following its general principles and including such departures from the present disclosure as come within known or customary practice in the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with the true scope and spirit of the disclosure being indicated by the following claims.
It is to be understood that the present disclosure is not limited to the precise arrangements and instrumentalities shown in the drawings, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (10)

1. A vehicle positioning method under a bridge crane is characterized in that a radar device and a camera device are arranged on an unmanned vehicle; the method comprises the following steps:
acquiring a front image of the unmanned vehicle, which is obtained by shooting by the shooting device;
determining an estimated relative distance between the unmanned vehicle and the bridge crane according to the front image;
if the estimated relative distance is smaller than or equal to a preset threshold value, starting the radar device;
acquiring point cloud data obtained by scanning the radar device;
determining a region of interest corresponding to the at least one marker; the region of interest corresponding to the at least one marker is a position region occupied by the at least one marker in a preset coordinate system; the at least one marker comprises at least one beam on the front side door frame and/or at least one beam on the rear side door frame; the preset coordinate system is a vehicle body coordinate system of the unmanned vehicle;
screening the point cloud data according to the interest area corresponding to the at least one marker, wherein the screened point cloud data is located in the interest area corresponding to the at least one marker;
determining the position of the at least one marker in a preset coordinate system according to the screened point cloud data;
Determining a relative distance between the at least one marker and the unmanned vehicle according to the position of the at least one marker;
obtaining the relative position of the unmanned vehicle and the bridge crane according to the relative distance between the at least one marker and the unmanned vehicle;
the determining the position of the at least one marker in a preset coordinate system according to the screened point cloud data comprises the following steps:
aiming at point cloud data in an interest area corresponding to a cross beam, calculating normal vectors of points in the point cloud data;
projecting the normal vector obtained by calculation to an XOY plane of a preset coordinate system to obtain a corresponding two-dimensional vector;
according to the two-dimensional vector obtained by projection, obtaining points with normal vectors parallel to an XOY plane in the point cloud data, and determining the points with the normal vectors parallel to the XOY plane in the point cloud data as points falling on the vertical plane of the cross beam;
obtaining the vertical surface of the cross beam by carrying out plane fitting on points falling on the vertical surface of the cross beam;
the determining the relative distance between the at least one marker and the unmanned vehicle according to the position of the at least one marker comprises:
For each vertical plane, projecting the inner points for fitting the vertical plane to an XOY plane of a preset coordinate system to obtain projection points, corresponding to the inner points, projected on the XOY plane;
performing straight line fitting according to each projection point to obtain a straight line equation;
calculating the relative distance between the straight line corresponding to the straight line equation and the origin of the vehicle body coordinate system, and determining the relative distance between the straight line corresponding to the straight line equation and the origin of the vehicle body coordinate system as the relative distance between the vertical plane and the origin of the vehicle body coordinate system;
after fitting to obtain the vertical plane of the beam, the method further comprises:
according to the point cloud data in the interest area corresponding to the cross beam, determining whether the vertical surface of the cross beam obtained by fitting is the inner vertical surface of the cross beam relative to the center of the bridge crane or the outer vertical surface of the cross beam relative to the center of the bridge crane;
in determining whether the fitted vertical surface of the beam is an inner vertical surface of the beam with respect to the center of the bridge crane or an outer vertical surface of the beam with respect to the center of the bridge crane, it is determined whether the beam is located in the front door frame or the rear door frame, and whether the distance between the vertical surface of the beam and the unmanned vehicle is a positive value or a negative value.
2. The method of claim 1, wherein the number of markers is 2 or more; the obtaining the relative position of the unmanned vehicle and the bridge crane according to the relative distance between the at least one marker and the unmanned vehicle comprises the following steps:
determining the relative distance between the unmanned vehicle and a preset reference position on the bridge crane according to the relative distance between each of at least two markers on the bridge crane and the unmanned vehicle;
and determining the relative distance between the unmanned vehicle and the reference position as the relative position between the unmanned vehicle and the bridge crane.
3. The method according to claim 1 or 2, characterized in that the scanning range of the radar device comprises an upper area of the unmanned vehicle.
4. A method according to claim 3, wherein the radar apparatus comprises one or more of: rotary lidar, solid-state lidar.
5. The method of claim 4, wherein the rotary lidar is a multi-line lidar.
6. The method of claim 4, wherein the radar device is disposed above a head of the unmanned vehicle.
7. A vehicle positioning device under a bridge crane, characterized in that a radar device and a camera device are arranged on an unmanned vehicle, the device comprising:
the acquisition module is used for acquiring the point cloud data obtained by scanning the radar device;
the positioning module is used for determining the relative position of the unmanned vehicle and the bridge crane according to the point cloud data;
the acquisition module is also used for acquiring the front image of the unmanned vehicle, which is obtained by shooting by the shooting device;
the processing module is used for determining the estimated relative distance between the unmanned vehicle and the bridge crane according to the front image; if the estimated relative distance is smaller than or equal to a preset threshold value, starting the radar device;
the positioning module is specifically configured to determine a region of interest corresponding to at least one marker; the region of interest corresponding to the at least one marker is a position region occupied by the at least one marker in a preset coordinate system; the at least one marker comprises at least one beam on the front side door frame and/or at least one beam on the rear side door frame; the preset coordinate system is a vehicle body coordinate system of the unmanned vehicle; screening the point cloud data according to the interest area corresponding to the at least one marker, wherein the screened point cloud data is located in the interest area corresponding to the at least one marker; determining the position of the at least one marker in a preset coordinate system according to the screened point cloud data; determining a relative distance between the at least one marker and the unmanned vehicle according to the position of the at least one marker; obtaining the relative position of the unmanned vehicle and the bridge crane according to the relative distance between the at least one marker and the unmanned vehicle;
The positioning module is specifically used for calculating normal vectors of points in point cloud data aiming at the point cloud data in the interest area corresponding to the cross beam; projecting the normal vector obtained by calculation to an XOY plane of a preset coordinate system to obtain a corresponding two-dimensional vector; according to the two-dimensional vector obtained by projection, obtaining points with normal vectors parallel to an XOY plane in the point cloud data, and determining the points with the normal vectors parallel to the XOY plane in the point cloud data as points falling on the vertical plane of the cross beam; obtaining the vertical surface of the cross beam by carrying out plane fitting on points falling on the vertical surface of the cross beam; for each vertical plane, projecting the inner points for fitting the vertical plane to an XOY plane of a preset coordinate system to obtain projection points, corresponding to the inner points, projected on the XOY plane; performing straight line fitting according to each projection point to obtain a straight line equation; calculating the relative distance between the straight line corresponding to the straight line equation and the origin of the vehicle body coordinate system, and determining the relative distance between the straight line corresponding to the straight line equation and the origin of the vehicle body coordinate system as the relative distance between the vertical plane and the origin of the vehicle body coordinate system;
The positioning module is further used for determining whether the vertical surface of the cross beam obtained by fitting is the inner vertical surface of the cross beam relative to the center of the bridge crane or the outer vertical surface of the cross beam relative to the center of the bridge crane according to the point cloud data in the region of interest corresponding to the cross beam; in determining whether the fitted vertical surface of the beam is an inner vertical surface of the beam with respect to the center of the bridge crane or an outer vertical surface of the beam with respect to the center of the bridge crane, it is determined whether the beam is located in the front door frame or the rear door frame, and whether the distance between the vertical surface of the beam and the unmanned vehicle is a positive value or a negative value.
8. An electronic device, the device comprising: a memory and a processor;
the memory is used for storing program instructions;
the processor is configured to invoke program instructions in the memory to perform the method of any of claims 1-6.
9. An unmanned vehicle, the vehicle comprising:
radar apparatus and electronic device according to claim 8.
10. A computer readable storage medium having a computer program stored thereon; the computer program, when executed, implements the method of any of claims 1-6.
CN202010841115.9A 2020-08-20 2020-08-20 Method, device, equipment and storage medium for positioning vehicle under bridge crane Active CN112099025B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010841115.9A CN112099025B (en) 2020-08-20 2020-08-20 Method, device, equipment and storage medium for positioning vehicle under bridge crane

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010841115.9A CN112099025B (en) 2020-08-20 2020-08-20 Method, device, equipment and storage medium for positioning vehicle under bridge crane

Publications (2)

Publication Number Publication Date
CN112099025A (en) 2020-12-18
CN112099025B (en) 2024-04-02

Family

ID=73754018

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010841115.9A Active CN112099025B (en) 2020-08-20 2020-08-20 Method, device, equipment and storage medium for positioning vehicle under bridge crane

Country Status (1)

Country Link
CN (1) CN112099025B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112897345B (en) * 2021-01-27 2023-06-23 上海西井信息科技有限公司 Alignment method of container truck and crane and related equipment
CN113460888B (en) * 2021-05-24 2023-11-24 武汉港迪智能技术有限公司 Automatic box grabbing method for gantry crane lifting appliance
CN113759906B (en) * 2021-08-30 2024-07-12 广州文远知行科技有限公司 Vehicle alignment method and device, computer equipment and storage medium
CN115258959B (en) * 2022-09-19 2023-01-03 杭州飞步科技有限公司 Sling control method, equipment and storage medium
CN117387491B (en) * 2023-12-11 2024-04-05 南京理工大学 Binocular vision marker positioning device and method suitable for bridge girder erection machine

Citations (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2006031126A1 (en) * 2004-09-16 2006-03-23 Juralco As Collision-safe frame for large traffic gantries
WO2014191618A1 (en) * 2013-05-31 2014-12-04 Konecranes Plc Cargo handling by a spreader
CN105787921A (en) * 2015-08-19 2016-07-20 南京大学 Method for reconstructing large-scale complex flyover 3D model by using airborne LiDAR data
CN107521478A (en) * 2017-07-10 2017-12-29 浙江亚太机电股份有限公司 Control method based on ultrasonic radar and millimetre-wave radar
CN108564525A (en) * 2018-03-31 2018-09-21 上海大学 A kind of 3D point cloud 2Dization data processing method based on multi-line laser radar
CN108583432A (en) * 2018-07-05 2018-09-28 广东机电职业技术学院 A kind of intelligent pillar A blind prior-warning device and method based on image recognition technology
CN108845579A (en) * 2018-08-14 2018-11-20 苏州畅风加行智能科技有限公司 A kind of automated driving system and its method of port vehicle
CN108873904A (en) * 2018-07-04 2018-11-23 北京踏歌智行科技有限公司 The unmanned parking scheme of mine vehicle, equipment and readable storage medium storing program for executing
CN109062205A (en) * 2018-07-26 2018-12-21 武汉水草能源科技研发中心(有限合伙) Artificial intelligence automobile Unmanned Systems
CN109828577A (en) * 2019-02-25 2019-05-31 北京主线科技有限公司 The opposite automation field bridge high accuracy positioning parking method of unmanned container truck
CN109872384A (en) * 2018-12-29 2019-06-11 中国科学院遥感与数字地球研究所 A kind of shaft tower automation modeling method based on airborne LIDAR point cloud data
CN109941274A (en) * 2019-03-01 2019-06-28 武汉光庭科技有限公司 Parking method and system, server and medium based on radar range finding identification gantry crane
CN110082775A (en) * 2019-05-23 2019-08-02 北京主线科技有限公司 Vehicle positioning method and system based on laser aid
CN110262508A (en) * 2019-07-06 2019-09-20 深圳数翔科技有限公司 Applied to the automated induction systems and method on the closing unmanned goods stock in place
CN110728753A (en) * 2019-10-09 2020-01-24 湖南大学 Target point cloud 3D bounding box fitting method based on linear fitting
CN111175788A (en) * 2020-01-20 2020-05-19 北京主线科技有限公司 Transverse positioning method and positioning system for automatic driving vehicle
CN111369779A (en) * 2018-12-26 2020-07-03 北京图森智途科技有限公司 Accurate parking method, equipment and system for truck in shore crane area
CN111369780A (en) * 2018-12-26 2020-07-03 北京图森智途科技有限公司 Accurate parking method, equipment and system for truck in shore crane area


Also Published As

Publication number Publication date
CN112099025A (en) 2020-12-18

Similar Documents

Publication Publication Date Title
CN112099025B (en) Method, device, equipment and storage medium for positioning vehicle under bridge crane
US10928508B2 (en) Camera and radar fusion
US11276189B2 (en) Radar-aided single image three-dimensional depth reconstruction
CN111223135B (en) System and method for enhancing range estimation by monocular cameras using radar and motion data
US11508122B2 (en) Bounding box estimation and object detection
CN113748357B (en) Attitude correction method, device and system of laser radar
US9903946B2 (en) Low cost apparatus and method for multi-modal sensor fusion with single look ghost-free 3D target association from geographically diverse sensors
US11620760B2 (en) Ranging method based on laser-line scanning imaging
WO2020191978A1 (en) Sar imaging method and imaging system thereof
CN110573907A (en) dynamic sensor operation and data processing based on motion information
CN112102396B (en) Method, device, equipment and storage medium for positioning vehicle under bridge crane
CN113625271B (en) Simultaneous positioning and mapping method based on millimeter wave radar and binocular camera
CN114935747B (en) Laser radar calibration method, device, equipment and storage medium
CN116215520A (en) Vehicle collision early warning and processing method and device based on ultrasonic waves and 3D looking around
US20210149412A1 (en) Position estimating apparatus, method for determining position of movable apparatus, and non-transitory computer readable medium
US11662740B2 (en) Position estimating apparatus, method for determining position of movable apparatus, and non-transitory computer readable medium
US20240320978A1 (en) Method for monitoring a loading space
Borthwick et al. Mining haul truck localization using stereo vision
EP4345750A1 (en) Position estimation system, position estimation method, and program
US20230243666A1 (en) Method for mapping, mapping device, computer program, computer readable medium, and vehicle
CN116263952A (en) Method, device, system and storage medium for measuring car hopper
Jiang et al. Real-time hatch detection during unloading of dry bulk carriers with side rolling hatchcover
CN117132633A (en) Method, device, equipment and medium for estimating loading rate based on monocular camera
CN114868150A (en) Information processing device, sensing device, moving object, and information processing method
JPH07334680A (en) Three-dimensional shape processing device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant