CN117037086A - Head recognition method and device, vehicle scanning system, device and readable medium - Google Patents


Info

Publication number
CN117037086A
Authority
CN
China
Prior art keywords
vehicle
image
area
front wheel
laser
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311064592.9A
Other languages
Chinese (zh)
Inventor
孟娈
戴诗语
王永明
涂俊杰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nuctech Technology Jiangsu Co ltd
Nuctech Co Ltd
Original Assignee
Nuctech Technology Jiangsu Co ltd
Nuctech Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nuctech Technology Jiangsu Co Ltd and Nuctech Co Ltd
Priority to CN202311064592.9A
Publication of CN117037086A
Legal status: Pending

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/52Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G06V20/54Surveillance or monitoring of activities, e.g. for recognising suspicious objects of traffic, e.g. cars on the road, trains or boats
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N23/00Investigating or analysing materials by the use of wave or particle radiation, e.g. X-rays or neutrons, not covered by groups G01N3/00 – G01N17/00, G01N21/00 or G01N22/00
    • G01N23/02Investigating or analysing materials by the use of wave or particle radiation, e.g. X-rays or neutrons, not covered by groups G01N3/00 – G01N17/00, G01N21/00 or G01N22/00 by transmitting the radiation through the material
    • G01N23/04Investigating or analysing materials by the use of wave or particle radiation, e.g. X-rays or neutrons, not covered by groups G01N3/00 – G01N17/00, G01N21/00 or G01N22/00 by transmitting the radiation through the material and forming images of the material
    • G01N23/046Investigating or analysing materials by the use of wave or particle radiation, e.g. X-rays or neutrons, not covered by groups G01N3/00 – G01N17/00, G01N21/00 or G01N22/00 by transmitting the radiation through the material and forming images of the material using tomography, e.g. computed tomography [CT]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • G06T7/74Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/10Image acquisition
    • G06V10/12Details of acquisition arrangements; Constructional details thereof
    • G06V10/14Optical characteristics of the device performing the acquisition or on the illumination arrangements
    • G06V10/147Details of sensors, e.g. sensor lenses
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/60Type of objects
    • G06V20/64Three-dimensional objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10028Range image; Depth image; 3D point clouds
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30248Vehicle exterior or interior
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00Indexing scheme relating to image or video recognition or understanding
    • G06V2201/08Detecting or categorising vehicles
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00Indexing scheme relating to image or video recognition or understanding
    • G06V2201/12Acquisition of 3D measurements of objects


Abstract

The present disclosure provides a vehicle head identification method and apparatus, a vehicle scanning system, a device and a readable medium, and relates to the field of vehicle scanning detection. The method includes: acquiring an area laser image of a vehicle, wherein the area laser image includes depth information and height information; determining a front wheel position of the vehicle based on the area laser image; determining a height change point in the area laser image at which the gradient of the vehicle body height exceeds a preset threshold; and determining the boundary position between the vehicle head and the cabin from the positional relationship between the height change point and the front wheel position, thereby determining the vehicle head position. The vehicle head recognition method provided by the disclosure can automatically and conveniently recognize the vehicle head position during vehicle scanning, so that the cabin can be scanned independently, and has strong practicability.

Description

Head recognition method and device, vehicle scanning system, device and readable medium
This application is a divisional application of application No. 201811615134.9, entitled "Head recognition method and device, vehicle scanning system, device and readable medium", filed on December 27, 2018.
Technical Field
The present disclosure relates to the field of vehicle scanning detection, and in particular, to a vehicle head recognition method and apparatus, a vehicle scanning method and system, an electronic device, and a computer readable medium.
Background
Smuggling and the transport of dangerous goods can cause great damage to social security and compromise national security, so out of consideration for national and public safety, the information of vehicles entering and leaving the country is strictly checked at every customs port. However, daily customs cargo throughput is large, and existing systems detect slowly and depend heavily on technicians, so existing vehicle detection systems cannot meet daily customs inspection requirements. Increasing the degree of automation and the speed of detection systems is therefore a major concern today.
Based on this, non-stop automatic vehicle detection has become a focus of attention. Non-stop automatic vehicle detection generally scans the moving vehicle with radiation to identify the cargo in the cabin. However, radiation is very harmful to the human body, so to avoid injuring the driver, the vehicle head must be avoided when scanning the vehicle with radiation, and only the cabin of the vehicle is scanned.
It should be noted that the information disclosed in the above background section is only for enhancing understanding of the background of the invention and thus may include information that does not constitute prior art already known to those of ordinary skill in the art.
Disclosure of Invention
In view of this, the present disclosure provides a vehicle head recognition method and apparatus, a vehicle scanning method and system, an electronic device, and a computer readable medium, which can quickly and accurately recognize a vehicle head position, and further facilitate different scanning processes for a vehicle head and a vehicle cabin during vehicle scanning detection.
Other features and advantages of the present disclosure will be apparent from the following detailed description, or may be learned in part by the practice of the disclosure.
According to a first aspect of the present disclosure, a vehicle head identification method is provided, the method comprising:
acquiring an area laser image of a vehicle, wherein the area laser image includes depth information and height information; determining a front wheel position of the vehicle based on the area laser image; determining a height change point in the area laser image at which the gradient of the vehicle body height exceeds a preset threshold; and determining the boundary position between the vehicle head and the cabin from the positional relationship between the height change point and the front wheel position, thereby determining the vehicle head position.
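The gradient-threshold step above can be illustrated with a minimal sketch. This is a hypothetical implementation, not the patent's own: the function name, the one-dimensional per-column height profile, and the example numbers are all assumptions.

```python
import numpy as np

def find_height_change_points(column_heights, threshold):
    """Columns where the body height jumps by more than `threshold`
    relative to the previous column, i.e. candidate head/cabin boundaries."""
    heights = np.asarray(column_heights, dtype=float)
    gradient = np.abs(np.diff(heights))  # height change between adjacent columns
    return np.nonzero(gradient > threshold)[0] + 1  # column just after each jump

# Hypothetical profile: cab roof (~3.0 m), a low gap, then a taller cargo box.
profile = [3.0, 3.0, 3.0, 1.2, 1.2, 3.5, 3.5]
print(find_height_change_points(profile, threshold=1.0))  # columns 3 and 5
```

A change point would then be accepted as the head/cabin boundary only when it lies in the expected position relative to the front wheel, as the step above describes.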
In an exemplary embodiment of the present disclosure, the vehicle head identification method further includes: generating a regional binary plane image of the vehicle according to the regional laser image; determining a vehicle air-conditioning position based on the area binary plane image; and removing the vehicle air conditioning information in the area laser image according to the vehicle air conditioning position.
In one exemplary embodiment of the present disclosure, the determining the front wheel position of the vehicle based on the area laser image includes: if the front wheel position of the vehicle cannot be determined based on the area laser image, an area binary plane image of the vehicle is generated from the area laser image, and the front wheel position of the vehicle is determined based on the area binary plane image.
In one exemplary embodiment of the present disclosure, the acquiring the area laser image of the vehicle includes: acquiring sequence image information of the vehicle; and generating the regional laser image according to the sequence image information.
According to a second aspect of the present disclosure, there is provided a vehicle head identification method, the method comprising: acquiring an area laser image of a vehicle, wherein the area laser image includes depth information and height information; determining a front wheel position of the vehicle based on the area laser image; determining a far point region in the area laser image based on the depth information of the area laser image; and determining the boundary position between the vehicle head and the cabin from the positional relationship between the far point region and the front wheel position, thereby determining the vehicle head position.
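The far-point step of this second aspect can likewise be sketched: since depth encodes distance from the laser, a gap between cab and cargo box appears as a run of columns whose depth is unusually large. The sketch below is a hypothetical illustration only; the function name, thresholds, and the run-based heuristic are assumptions, not the patent's method.

```python
import numpy as np

def find_far_point_region(depth_image, far_threshold, min_width=2):
    """Return the (start, end) column span whose mean depth exceeds
    `far_threshold`, i.e. where the laser sees past the vehicle body."""
    col_depth = np.asarray(depth_image, dtype=float).mean(axis=0)
    far = col_depth > far_threshold
    start = None
    for i, flag in enumerate(far):       # locate the first sufficiently wide run
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            if i - start >= min_width:
                return start, i
            start = None
    if start is not None and len(far) - start >= min_width:
        return start, len(far)
    return None

# Hypothetical depth image: background depth 2.0, a deep gap in columns 4-6.
depth = np.full((5, 10), 2.0)
depth[:, 4:7] = 8.0
print(find_far_point_region(depth, far_threshold=5.0))  # (4, 7)
```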
In an exemplary embodiment of the present disclosure, the vehicle head identification method further includes: generating a regional binary plane image of the vehicle according to the regional laser image; determining a vehicle air-conditioning position based on the area binary plane image; and removing the vehicle air conditioning information in the area laser image according to the vehicle air conditioning position.
In one exemplary embodiment of the present disclosure, the determining the front wheel position of the vehicle based on the area laser image includes: if the front wheel position of the vehicle cannot be determined based on the area laser image, an area binary plane image of the vehicle is generated from the area laser image, and the front wheel position of the vehicle is determined based on the area binary plane image.
In one exemplary embodiment of the present disclosure, the acquiring the area laser image of the vehicle includes: acquiring sequence image information of the vehicle; and generating the regional laser image according to the sequence image information.
According to a third aspect of the present disclosure, there is provided a vehicle scanning method, the method comprising: determining the vehicle head position according to any of the vehicle head recognition methods described above; and scanning the vehicle based on the vehicle head position.
In one exemplary embodiment of the present disclosure, the scanning the vehicle based on the vehicle head position includes: based on the vehicle head position, individual scanning of the vehicle cabin is achieved.
According to a fourth aspect of the present disclosure, there is provided a vehicle head recognition apparatus, the apparatus comprising: an area laser image acquisition module configured to acquire an area laser image of a vehicle, the area laser image including depth information and height information; a front wheel position determining module configured to determine a front wheel position of the vehicle based on the area laser image; a height change point determining module configured to determine a height change point in the area laser image at which the gradient of the vehicle body height exceeds a preset threshold; and a vehicle head position determining module configured to determine the boundary position between the vehicle head and the cabin from the positional relationship between the height change point and the front wheel position, thereby determining the vehicle head position.
According to a fifth aspect of the present disclosure, there is provided a vehicle head recognition apparatus, the apparatus comprising: an area laser image acquisition module configured to acquire an area laser image of a vehicle, the area laser image including depth information and height information; a front wheel position determining module configured to determine a front wheel position of the vehicle based on the area laser image; a far point region position determining module configured to determine a far point region in the area laser image based on the depth information of the area laser image; and a vehicle head position determining module configured to determine the boundary position between the vehicle head and the cabin from the positional relationship between the far point region and the front wheel position, thereby determining the vehicle head position.
According to a sixth aspect of the present disclosure, there is provided a vehicle scanning system, comprising: a laser device for acquiring an area laser image of a vehicle; a scanning device for scanning the vehicle; and a control device for receiving the area laser image of the vehicle, the area laser image including depth information and height information; determining a front wheel position of the vehicle based on the area laser image; determining a height change point in the area laser image at which the gradient of the vehicle body height exceeds a preset threshold; determining the boundary position between the vehicle head and the cabin from the positional relationship between the height change point and the front wheel position, thereby determining the vehicle head position; and controlling the scanning device to scan the vehicle based on the vehicle head position.
According to a seventh aspect of the present disclosure, there is provided a vehicle scanning system, comprising: a laser device for acquiring an area laser image of a vehicle; a scanning device for scanning the vehicle; and a control device for receiving the area laser image of the vehicle, the area laser image including depth information and height information; determining a front wheel position of the vehicle based on the area laser image; determining a far point region in the area laser image based on the depth information of the area laser image; determining the boundary position between the vehicle head and the cabin from the positional relationship between the far point region and the front wheel position, thereby determining the vehicle head position; and controlling the scanning device to scan the vehicle based on the vehicle head position.
According to an eighth aspect of the present disclosure, there is provided an electronic device including: one or more processors; and a storage device for storing one or more programs that, when executed by the one or more processors, cause the one or more processors to implement any of the methods of head recognition described above.
According to a ninth aspect of the present disclosure, there is provided a computer readable medium having stored thereon a computer program, characterized in that the program, when executed by a processor, implements a method of identifying a vehicle head as described in any of the above.
According to the vehicle head identification method and apparatus, the vehicle scanning system, the electronic device and the computer readable medium provided by the present disclosure, the vehicle is scanned by laser to obtain an area laser image, the vehicle head position is identified based on the area laser image, and the vehicle head and the cabin are then scanned according to the vehicle head position. The vehicle head recognition method provided by the disclosure can automatically and conveniently recognize the position of the vehicle head during vehicle scanning, so that the vehicle head and the cabin can be scanned separately, and has strong practicability.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure. The drawings described below are merely examples of the present disclosure and other drawings may be obtained from these drawings without inventive effort for a person of ordinary skill in the art.
Fig. 1 is a schematic diagram illustrating an application scenario that may be used for a vehicle scanning method, according to an exemplary embodiment.
Fig. 2 is a flow chart illustrating a method of identifying a vehicle head according to an exemplary embodiment.
Fig. 3 is a schematic diagram showing a laser device emitting laser light according to an exemplary embodiment.
Fig. 4 is a schematic diagram showing gradual increase of sequence data in a region laser image according to an exemplary embodiment.
FIG. 5 is a schematic diagram illustrating identification of a front wheel of a vehicle from a regional laser image, according to an exemplary embodiment.
Fig. 6 is a binary plan view of a region generated from a region laser image, according to an example embodiment.
FIG. 7 is a schematic diagram illustrating identification of a front wheel of a vehicle from a regional laser image, according to an example embodiment.
Fig. 8 is a schematic diagram illustrating identifying a vehicle air conditioning location from a zone binary plan view according to an exemplary embodiment.
FIG. 9 is a schematic diagram illustrating determining a boundary of a vehicle head and a vehicle cabin from a point of height change in a zone laser image, according to an example embodiment.
Fig. 10 is a flowchart illustrating a head recognition method according to another exemplary embodiment.
FIG. 11 is a schematic diagram illustrating determining an image far point from a region laser image, according to an example embodiment.
Fig. 12 is a flowchart illustrating a vehicle scanning method according to an exemplary embodiment.
Fig. 13 is a flowchart illustrating a vehicle scanning method according to another exemplary embodiment.
Fig. 14 is a block diagram illustrating a head recognition device according to an exemplary embodiment.
Fig. 15 is a block diagram of a head recognition apparatus according to another exemplary embodiment.
Fig. 16 is a block diagram illustrating a head recognition apparatus according to another exemplary embodiment.
Fig. 17 is a block diagram illustrating a head recognition apparatus according to another exemplary embodiment.
Fig. 18 is a block diagram illustrating a head recognition apparatus according to another exemplary embodiment.
Fig. 19 is a block diagram illustrating a head recognition apparatus according to another exemplary embodiment.
Fig. 20 is a block diagram illustrating a head recognition apparatus according to another exemplary embodiment.
Fig. 21 is a block diagram illustrating a head recognition apparatus according to another exemplary embodiment.
Fig. 22 is a block diagram of a vehicle scanning system according to another exemplary embodiment.
Fig. 23 is a schematic diagram showing a structure of a computer system applied to a head recognition device according to an exemplary embodiment.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. However, the exemplary embodiments can be embodied in many forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of the example embodiments to those skilled in the art. The same reference numerals in the drawings denote the same or similar parts, and thus a repetitive description thereof will be omitted.
The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the invention. One skilled in the relevant art will recognize, however, that the invention may be practiced without one or more of the specific details, or with other methods, components, devices, steps, etc. In other instances, well-known methods, devices, implementations, or operations are not shown or described in detail to avoid obscuring aspects of the invention.
The drawings are merely schematic illustrations of the present invention, in which like reference numerals denote the same or similar parts, and thus a repetitive description thereof will be omitted. Some of the block diagrams shown in the figures do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in software or in one or more hardware modules or integrated circuits or in different networks and/or processor devices and/or microcontroller devices.
The flow diagrams depicted in the figures are exemplary only, and not necessarily all of the elements or steps are included or performed in the order described. For example, some steps may be decomposed, and some steps may be combined or partially combined, so that the order of actual execution may be changed according to actual situations.
In the present specification, the terms "a," "an," "the," "said" and "at least one" are used to indicate the presence of one or more elements/components/etc.; the terms "comprising," "including," and "having" are intended to be inclusive and mean that there may be additional elements/components/etc., in addition to the listed elements/components/etc.; the terms "first," "second," and "third," etc. are used merely as labels, and do not limit the number of their objects.
The following describes example embodiments of the invention in detail with reference to the accompanying drawings.
Fig. 1 is a schematic diagram illustrating an application scenario that may be used for a vehicle scanning method, according to an exemplary embodiment.
As shown in fig. 1, a vehicle scanning scene includes a laser device 101, a vehicle 102, a scanning device 103, and a control device 104. As the vehicle travels, the laser device 101 continuously scans it from the head to acquire sequence image information and transmits that information to the control device 104. The control device 104 generates an area laser image from the sequence image information obtained by the laser device 101 and determines the boundary between the vehicle head and the cabin from the area laser image, thereby determining the vehicle head position. Once the control device 104 has determined the vehicle head position, it controls the laser device 101 to stop acquiring vehicle sequence image information and controls the scanning device 103 to start scanning the vehicle.
The laser device 101 may be a laser image data acquisition device, such as an area laser, for acquiring sequence image information of the vehicle as it passes through the system.
The scanning device 103 may be an X-ray image security device for scanning a vehicle, which may be used for example to determine cargo conditions in the cabin of the vehicle.
The control device 104 is used to identify the vehicle head position from the image information acquired by the laser device 101 and to control the scanning device 103 to scan the vehicle. During head recognition, the control device 104 may, for example, acquire an area laser image of the vehicle, the area laser image including depth information and height information; determine a front wheel position of the vehicle based on the area laser image; determine a height change point in the area laser image at which the gradient of the vehicle body height exceeds a predetermined threshold; and determine the boundary position between the vehicle head and the cabin from the positional relationship between the height change point and the front wheel position, thereby determining the vehicle head position.
In another embodiment, during head recognition the control device 104 may likewise, for example, acquire an area laser image of the vehicle, the area laser image including depth information and height information; determine a front wheel position of the vehicle based on the area laser image; determine a far point region in the area laser image based on the depth information of the area laser image; and determine the boundary position between the vehicle head and the cabin from the positional relationship between the far point region and the front wheel position, thereby determining the vehicle head position.
Fig. 2 is a flow chart illustrating a method of identifying a vehicle head according to an exemplary embodiment.
Referring to fig. 2, the head recognition method may include the following steps.
Step S201, a region laser image of the vehicle is acquired, the region laser image including depth information and height information.
In some embodiments, the area laser image of the vehicle may refer to an image of a portion of the vehicle body scanned by the laser device, where the area laser image information includes depth information of the vehicle and actual height information of the vehicle body.
In some embodiments, the laser device may acquire sequence image information of the vehicle, where sequence image information refers to the column data that make up an image and includes the straight-line distance from the body of the scanned part of the vehicle to the laser device. Depth information for the scanned part of the vehicle body can be derived from this straight-line distance, and the area laser image is then obtained after the multiple columns of depth information derived from the sequence image information are corrected using the vehicle's speed information.
In some embodiments, as shown in FIG. 3, in one laser scan the laser device emits a plurality of laser beams 301 in different directions, where the beams propagate in the same laser plane perpendicular to the ground 303, and the angle between a beam's propagation trajectory and the vertical within the laser plane is θi, with θi ranging from 0 to 180 degrees. When a beam reaches the vehicle, it is reflected from the vehicle surface, and the reflected laser light is received by the laser receiving device. From the propagation time of the laser in air, the straight-line distance Li between the laser reflection point and the laser device can be calculated, and the actual height of the reflection point then follows from the formula: height of reflection point = Li·cosθi + actual height of the laser device. This yields the vehicle sequence image information, which comprises the straight-line distance of each laser reflection point on the vehicle body from the laser device and the actual height of that reflection point.
The sequence image information collected for the vehicle is only a single column of data; the area laser image information is obtained from multiple columns of sequence image information. For example, given the sequence image information of the vehicle, which contains the straight-line distance Li from the laser reflection point to the laser device, the depth value (image gray value) of the reflection point can be calculated by the formula: depth value = Li·sinθi, and its actual height by the formula: height of reflection point = Li·cosθi + actual height of the laser device. With the depth and height information of multiple columns known, an area laser image of the vehicle as shown in fig. 4 can be derived. Fig. 4 shows, from left to right, how the sequence data in the generated area laser image gradually accumulate as the laser device scans along the vehicle body.
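The two formulas can be collected into one helper. This is an illustrative sketch under the stated geometry (θi measured from the vertical within the laser plane, device mounting height known); the function and its argument names are hypothetical, not part of the patent.

```python
import math

def reflection_point(L_i, theta_deg, device_height):
    """Convert one laser return into (depth, height) of the reflection point.

    L_i           : straight-line distance from laser device to reflection point
    theta_deg     : beam angle theta_i from the vertical, 0..180 degrees
    device_height : mounting height of the laser device above the ground

    depth  = L_i * sin(theta_i)                 (used as the image gray value)
    height = L_i * cos(theta_i) + device_height
    """
    theta = math.radians(theta_deg)
    depth = L_i * math.sin(theta)
    height = L_i * math.cos(theta) + device_height
    return depth, height

# A horizontal beam (theta = 90 deg) from a device mounted at 2 m:
# the reflection point lies L_i away in depth, at the device's own height.
print(reflection_point(3.0, 90.0, 2.0))
```

With θi past 90 degrees, cosθi is negative and the computed height drops below the device, which is why the single formula covers the whole 0-180 degree fan.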
Step S202 determines a front wheel position of the vehicle based on the area laser image.
In some embodiments, after the area laser image of the vehicle has been obtained, the position of the front wheel of the vehicle in the area laser image must also be determined. The area laser image is known to contain the height information of the vehicle and, based on common knowledge, the actual height of the wheel position is the lowest, so the wheel position can be determined by finding areas in the area laser image whose height is below a determined threshold. As shown in fig. 5, a point below 10 cm in the area laser image may be determined to be a point on a wheel, and since the laser device starts scanning from the head of the vehicle, the wheel found first while the laser device scans the vehicle is the front wheel 501.
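The height-threshold search described above can be sketched as follows; the function name, the NaN convention for missing returns, and the 10 cm default are illustrative assumptions:

```python
import numpy as np

def find_front_wheel_column(height_img, wheel_thresh=0.10):
    """Return the first (leftmost) column containing a point lower than wheel_thresh.

    height_img: 2-D array of reflection-point heights in metres, one column per
    scan, columns ordered from the vehicle head backwards; NaN marks "no return".
    Since scanning starts at the head, the first matching column is the front wheel.
    """
    low = np.nan_to_num(height_img, nan=np.inf) < wheel_thresh
    cols = np.where(low.any(axis=0))[0]
    return int(cols[0]) if cols.size else None
```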
In some embodiments, the wheel position cannot be found in this way because of footrests outside some vehicle cabs. As shown in fig. 5, the position of the footrest 502 is also very low, so if the front wheel position were judged purely from the height information of the points in the area laser image, it could not be determined because of the footrest. For this reason, an area binary plane image of the vehicle is introduced for auxiliary judgment.
In some embodiments, the area laser image includes the actual height of each laser reflection point. The reflection point with the greatest height is stored in the top row of the binary data, the reflection point with the lowest height is stored in the bottom row, and all other reflection points are filled in between the top and bottom rows according to their height. If the actual heights of the laser reflection points in a plurality of columns are known, an area binary plane image of the vehicle can be generated in this way, as shown in fig. 6.
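A minimal sketch of this row-filling scheme, under the assumption that heights are mapped linearly onto a fixed number of image rows (row 0 is the top row; the row count and names are illustrative):

```python
import numpy as np

def region_binary_plane_image(column_heights, max_height, n_rows=64):
    """Build a binary occupancy image from per-column reflection-point heights.

    column_heights: list of height lists (metres), one per scan column.
    max_height: height that maps to the top row; heights are clamped to [0, max].
    """
    n_cols = len(column_heights)
    img = np.zeros((n_rows, n_cols), dtype=np.uint8)
    for col, heights in enumerate(column_heights):
        for h in heights:
            # highest points go to the top row (row 0), lowest to the bottom row
            frac = min(max(h, 0.0), max_height) / max_height
            img[int((1.0 - frac) * (n_rows - 1)), col] = 1
    return img
```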
As shown in fig. 7, after obtaining the area binary plane image of the vehicle, the wheel position in the image can be found by using the image processing knowledge in combination with the information that the wheel is circular. Similarly, since the laser device scans from the head of the vehicle, the wheel that is preferentially found is the front wheel 701.
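One way to realise the circular-wheel search in the binary image is a brute-force circular Hough vote. The sketch below is illustrative only (the scoring rule, parameters, and left-to-right scan order are assumptions; a production system would more likely use an optimised routine such as OpenCV's HoughCircles):

```python
import numpy as np

def detect_leftmost_circle(binary_img, radii, n_angles=64, min_score=0.8):
    """Return (row, col, radius) of the leftmost circle found in a binary image.

    Scans column-by-column from the left, so with head-first scanning the first
    circle detected corresponds to the front wheel.
    """
    rows, cols = binary_img.shape
    angles = np.linspace(0, 2 * np.pi, n_angles, endpoint=False)
    for c in range(cols):              # left to right: the front wheel wins
        for r0 in range(rows):
            for rad in radii:
                ys = (r0 + rad * np.sin(angles)).round().astype(int)
                xs = (c + rad * np.cos(angles)).round().astype(int)
                ok = (ys >= 0) & (ys < rows) & (xs >= 0) & (xs < cols)
                if ok.sum() < n_angles:    # circle must lie fully inside the image
                    continue
                # fraction of the sampled perimeter that is set in the image
                if binary_img[ys, xs].mean() >= min_score:
                    return (r0, c, rad)
    return None
```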
Step S203, determining a height change point in the area laser image where the gradient value of the vehicle body height is greater than a predetermined threshold value.
In some embodiments, for some vehicles there may be an air conditioner above the junction of the head and the carriage in the area laser image. The presence of the air conditioner in the area laser image can seriously affect the determination of a height change point that satisfies the condition. Therefore, before the height change points satisfying the condition are determined, the air conditioner information in the area laser image is first removed.
In some embodiments, to remove the air conditioner information from the area laser image, an area binary plane image is first generated from the area laser image. In some embodiments, the area laser image includes the actual height of each laser reflection point. The highest reflection point is stored in the top row of the data, the lowest reflection point in the bottom row, and so on, with all reflection points filled in between the top and bottom rows according to their height. If the actual heights of the laser reflection points in a plurality of columns are known, an area binary plane image of the vehicle can be generated in this way, as shown in fig. 6.
In some embodiments, as shown in fig. 8, in the area binary plane image of the vehicle, the position of the leftmost vehicle point in each row of the image data, i.e. the front edge of the vehicle, is obtained first; we refer to this as the air sequence information. From the air sequence information a change curve 801 of the leftmost position of the vehicle is obtained, and the position of the vehicle's air conditioner can be found from an inflection point 802 on the change curve 801, the inflection point being the point where the curve 801 changes between concave and convex. Finding the inflection point on the change curve 801 thus locates the air conditioner information; and if the air conditioner image information can be found in the area binary plane image of the vehicle, the corresponding air conditioner image information can also be found in the area laser image.
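The inflection search on the leftmost-position curve can be sketched as a sign change of the discrete second derivative; the function name and the exact detection rule are illustrative assumptions:

```python
import numpy as np

def find_air_sequence_inflection(leftmost_cols):
    """Return the row index of the first inflection point of the curve, or None.

    leftmost_cols: leftmost vehicle position per image row (the air sequence).
    An inflection point is where the curve switches between concave and convex,
    i.e. where the discrete second derivative changes sign.
    """
    x = np.asarray(leftmost_cols, dtype=float)
    d2 = np.diff(x, n=2)                       # discrete second derivative
    sign = np.sign(d2)
    flips = np.where(sign[:-1] * sign[1:] < 0)[0]
    # +2 maps the second-difference index back to an index on the original curve
    return int(flips[0]) + 2 if flips.size else None
```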
In some embodiments, if an air conditioner is mounted above the vehicle, the air conditioner position is determined from the area binary plane information of the vehicle, and the corresponding air conditioner image information is found and removed in the area laser image. After the air conditioner information in the area laser image has been removed, a height change point whose body-height gradient value is greater than the predetermined threshold is determined from the height information of the area laser image; if no air conditioner is mounted above the vehicle, the height change point whose body-height gradient value is greater than the predetermined threshold is determined directly from the height information of the area laser image.
Here, a height change point is a position where the height of the vehicle body changes. Determining the height change points whose body-height gradient value is greater than the predetermined threshold may be implemented as follows: calculate the height gradient between the highest points of adjacent columns of the area laser image from the height information it contains, and find the positions where this gradient exceeds the predetermined threshold; for example, a position where the height gradient is greater than 1 meter can be determined to be a height change point satisfying the condition. A vehicle body may contain several height change points that satisfy the condition: for example, the junction connecting the head and the carriage is usually well below both, so the boundaries between that junction and the head or the carriage are height change points.
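A minimal sketch of this gradient test over the column maxima (names and the 1 m default are illustrative):

```python
import numpy as np

def height_change_points(height_img, grad_thresh=1.0):
    """Return column indices where the top-of-vehicle height jumps past grad_thresh.

    height_img: 2-D array of reflection-point heights (metres), one scan per column.
    """
    col_max = np.nanmax(height_img, axis=0)     # highest point in each column
    grad = np.abs(np.diff(col_max))             # gradient between adjacent columns
    return np.where(grad > grad_thresh)[0] + 1  # index of the column after the jump
```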
Step S204, determining the boundary position of the vehicle head and the carriage according to the positional relationship between the height change point and the front wheel position, so as to determine the vehicle head position.
In some embodiments, once a height change point whose body-height gradient exceeds the predetermined threshold has been found in the area laser image, the boundary between the vehicle head and the carriage is determined from the positional relationship between the height change point and the front wheel position. For example, for each height change point that has been found, it is first judged whether the point lies behind the front wheel position of the vehicle; if it does not, the point is discarded and the remaining height change points are judged. If the point does lie behind the front wheel, it is then judged whether the horizontal distance from the point to the center of the front wheel falls within the range of 50 cm to 100 cm. If it does, the position of that height change point is taken as the boundary between the vehicle head and the carriage; if it does not, the point is discarded and the remaining height change points are judged. As shown in fig. 9, the height change point 902 lies just behind the front wheel 901 and its actual horizontal distance to the center of the front wheel is only 80 cm, so according to the above steps the position of the height change point 902 is the boundary between the vehicle head and the carriage, and the vehicle head position can be found from this boundary.
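The candidate-screening rule above can be sketched as follows (all names are illustrative; distances in metres, increasing toward the rear of the vehicle):

```python
def head_boundary(change_points_x, wheel_center_x, min_dist=0.5, max_dist=1.0):
    """Pick the head/carriage boundary among candidate height-change points.

    A candidate qualifies only if it lies behind the front wheel and its
    horizontal distance to the wheel centre is within [min_dist, max_dist].
    """
    for x in change_points_x:
        if x <= wheel_center_x:                  # not behind the front wheel: discard
            continue
        if min_dist <= x - wheel_center_x <= max_dist:
            return x                             # boundary between head and carriage
    return None                                  # no candidate satisfied the condition
```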
However, in actual operation, not all vehicles have a height change point satisfying the requirement, for example, some vehicles have a head and a cabin connected together and have the same height, and at this time, the height change point of the vehicle body satisfying the requirement cannot be determined.
Based on this, the present disclosure also proposes another head recognition method in some embodiments, that is, head recognition is performed based on the position of the far point region in the region laser image.
Fig. 10 is a flowchart illustrating a head recognition method according to another exemplary embodiment.
In step S1001, a region laser image of a vehicle is acquired, the region laser image including depth information and height information.
In some embodiments, the area laser image of the vehicle may refer to an image of a portion of the vehicle body scanned by the laser device, where the area laser image information includes depth information of the vehicle and actual height information of the vehicle body.
In some embodiments, the laser device may acquire sequence image information of the vehicle, where the sequence image information refers to the column data making up an image and includes the straight-line distance of the scanned part of the vehicle body from the laser device. The depth information of the scanned part of the vehicle body can be further obtained from this straight-line distance, and the area laser image is then obtained after the plurality of columns of depth information derived from the sequence image information are corrected according to the speed information of the vehicle.
In some embodiments, as shown in FIG. 3, in one laser scan the laser device emits a plurality of laser beams 301 in different directions, wherein the beams all propagate in the same laser plane, which is perpendicular to the ground 303, and the angle between a beam's propagation trajectory and the vertical within the laser plane is θ_i, where θ_i ranges from 0 to 180 degrees. When a beam reaches the vehicle, reflection occurs at the vehicle surface, and the reflected laser light is received by the laser receiving device. The straight-line distance L_i between a laser reflection point and the laser device can be calculated from the propagation time of the laser in the air, and by the formula: vehicle body height = L_i·cos θ_i + actual height of the laser device, the actual height of the reflection point is calculated, and thus the vehicle sequence image information is obtained. The vehicle sequence image information includes the straight-line distance of each laser reflection point on the vehicle body from the laser device and the actual height of the reflection point.
The collected sequence image information of the vehicle is only one column of data; the area laser image information is obtained from a plurality of columns of sequence image information. For example, suppose the sequence image information of the vehicle is known, and that it contains the straight-line distance L_i of each laser reflection point from the laser device. By the formula: depth value (image gray value) = L_i·sin θ_i, the depth value of the laser reflection point can be calculated. In addition, by the formula: vehicle body height = L_i·cos θ_i + actual height of the laser device, the actual height of the laser reflection point can be calculated. Knowing the depth information and the height information of the reflection points in a plurality of columns, an area laser image of the vehicle as shown in fig. 4 can be derived. Fig. 4 shows, in order from left to right, how the sequence data in the generated area laser image gradually accumulates as the laser device scans along the vehicle body.
Step S1002 determines a front wheel position of the vehicle based on the area laser image.
In some embodiments, after obtaining the area laser image of the vehicle, it is also desirable to determine the position of the front wheels of the vehicle in the area laser image. Knowing the height information of the vehicle in the area laser image information, and based on common knowledge, the actual height of the position of the wheel is the lowest, the wheel position can be determined by finding an area in the area laser image where the height is below a determined threshold. As shown in fig. 5, a point below 10 cm in the area laser image may be determined as a point on the wheel, and since the laser device starts scanning from the head of the vehicle, the wheel that is preferentially found during the scanning of the vehicle by the laser device is the front wheel 501.
In some embodiments, the wheel position cannot be found in this way because of footrests outside some vehicle cabs. As shown in fig. 5, the position of the footrest 502 is also very low, so if the front wheel position were judged purely from the height information of the points in the area laser image, it could not be determined because of the footrest. For this reason, an area binary plane image of the vehicle is introduced for auxiliary judgment.
In some embodiments, the area laser image also includes the actual height of each laser reflection point. The highest reflection point is stored in the top row of the data, the lowest reflection point in the bottom row, and so on, with all reflection points filled in between the top and bottom rows according to their height. If the actual heights of the columns of laser reflection points on the vehicle are known, an area binary plane image of the vehicle can be generated, as shown in FIG. 6.
As shown in fig. 7, after obtaining the area binary plane image of the vehicle, the wheel position in the image can be found by using the image processing knowledge in combination with the information that the wheel is circular. Similarly, since the laser device scans from the head of the vehicle, the wheel that is preferentially found is the front wheel 701.
Step S1003, determining a far point region in the region laser image based on the depth information of the region laser image.
A far point region is a region of the area laser image in which the depth information is greater than a predetermined threshold. For example, a region composed of points whose gray value is greater than 200 in the area laser image may be called a far point region. Fig. 11 shows two far point regions found in the area laser image according to an exemplary embodiment.
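The thresholding step can be sketched by collecting connected runs of columns that contain at least one far pixel; grouping per column is an illustrative simplification of a full connected-component analysis, and the 200 default mirrors the example above:

```python
import numpy as np

def far_point_regions(gray_img, gray_thresh=200):
    """Return (start_col, end_col) pairs of column runs containing far points.

    gray_img: 2-D array of depth values rendered as image gray values.
    """
    far_cols = (np.asarray(gray_img) > gray_thresh).any(axis=0)
    regions, start = [], None
    for i, flag in enumerate(far_cols):
        if flag and start is None:
            start = i                      # a new run of far columns begins
        elif not flag and start is not None:
            regions.append((start, i - 1)) # the run ended at the previous column
            start = None
    if start is not None:
        regions.append((start, len(far_cols) - 1))
    return regions
```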
Step S1004, determining the boundary position of the vehicle head and the carriage according to the positional relationship between the far point region and the front wheel position, so as to determine the vehicle head position.
In some embodiments, once a far point region has been found in the area laser image, the boundary between the vehicle head and the carriage is determined from the positional relationship between the far point region and the front wheel position. For example, for each far point region that has been found, it is first judged whether the region lies behind the front wheel position of the vehicle; if it does not, the region is discarded and the remaining far point regions are judged. If the region does lie behind the front wheel, it is then judged whether the horizontal distance from the end of the region nearest the front wheel to the center of the front wheel falls within the range of 50 cm to 100 cm. If it does, the position of that far point region is taken as the boundary between the vehicle head and the carriage; if it does not, the region is discarded and the remaining far point regions are judged. As shown in fig. 11, the two brighter areas are far point regions; the far point region on the right lies just behind the front wheel, and its actual horizontal distance to the center of the front wheel is only 80 cm, so according to the above steps its position is the boundary between the vehicle head and the carriage, and the vehicle head position can be found from this boundary.
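The same 50 cm to 100 cm screening rule, applied to the near (front-wheel-side) end of each far point region, can be sketched as follows (names illustrative; distances in metres):

```python
def head_boundary_from_far_regions(regions_x, wheel_center_x,
                                   min_dist=0.5, max_dist=1.0):
    """Pick the head/carriage boundary among far point regions.

    regions_x: (near_end, far_end) horizontal extents of each far point region,
    ordered front to back. A region qualifies only if its near end lies behind
    the front wheel, within [min_dist, max_dist] of the wheel centre.
    """
    for near_end, _far_end in regions_x:
        if near_end <= wheel_center_x:           # not behind the front wheel: discard
            continue
        if min_dist <= near_end - wheel_center_x <= max_dist:
            return near_end                      # boundary between head and carriage
    return None
```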
Fig. 12 shows a flow chart of a vehicle scanning method according to an exemplary embodiment.
Step S1201, determining a vehicle head position according to a head recognition method.
In an exemplary embodiment, when it is desired to scan the head and the car separately in different ways, the head position is first determined.
In some embodiments, the head position of the vehicle may be determined by the head recognition method provided in the above embodiments.
Step S1202, scanning the vehicle based on the vehicle head position.
If the head position of the vehicle has been determined, the head and the carriage may be scanned in different ways based on the vehicle head position. For example, when customs inspects vehicles, the moving vehicle must be scanned with X-rays, but X-rays are relatively harmful to the human body. Therefore, when scanning a vehicle with X-rays, the vehicle head position is determined first, and the carriage is then scanned with X-rays while the vehicle head position is avoided.
Fig. 13 shows a flow chart of a vehicle scanning method according to another exemplary embodiment.
Referring to fig. 13, the vehicle scanning method mainly includes the following steps.
In step S1301, the laser apparatus collects sequential image information of the vehicle.
Sequential image information of the vehicle is acquired using a laser device.
In step S1302, the laser apparatus transmits the sequence image information to the control apparatus.
The laser device transmits the acquired sequence image information to the control device, so that the control device can identify the vehicle head position.
In step S1303, the control apparatus generates a region laser image from the sequence image and identifies a head position from the generated region laser image.
The control device generates an area laser image from the sequence image information transmitted by the laser device and identifies the vehicle head according to any of the head recognition methods described above.
Step S1304, it is determined whether the control device has recognized the vehicle head position. If it is determined that the control device has not recognized the vehicle head position, step S1301 is performed to continue collecting vehicle sequence image information; if it is determined that the control device has recognized the vehicle head position, step S1305 is performed.
After the recognition of the vehicle head is finished, the control device judges whether the vehicle head position is already recognized. If it is judged that the head position has been recognized, step S1305 is executed to control the scanning device to scan the vehicle; if it is determined that the head position is not recognized, step S1301 is performed to continue to acquire the sequential image information of the vehicle to continue to recognize the head position.
In step S1305, the control device controls the laser device to stop working, and starts the scanning device to scan the vehicle.
If it is judged that the vehicle head position has been recognized, the control device controls the laser device to stop working and controls the scanning device to scan the vehicle. The control device may control the scanning device to scan the vehicle carriage individually; for example, it may control the scanning device to scan the carriage with X-rays while not scanning the vehicle head.
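The overall control flow of steps S1301 to S1305 can be sketched as a loop. The callables are stand-ins for the real laser, recogniser, and scanner interfaces, which are assumptions and not an API specified by the text:

```python
def scan_control_loop(acquire_column, recognise_head, scan_carriage,
                      max_columns=10000):
    """Keep collecting laser columns until the head position is recognised,
    then stop the laser and hand over to the scanning device."""
    columns = []
    for _ in range(max_columns):
        columns.append(acquire_column())     # S1301/S1302: collect and transmit
        head_pos = recognise_head(columns)   # S1303: build image, identify head
        if head_pos is not None:             # S1304: head position recognised?
            return scan_carriage(head_pos)   # S1305: stop laser, scan carriage
    return None                              # safety bound; no head found
```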
Fig. 14 is a block diagram illustrating a head recognition device according to an exemplary embodiment. Referring to fig. 14, the apparatus 140 includes a region laser image acquisition module 1401, a region laser image determination front wheel position module 1402, a height change point determination module 1403, and a head position determination module 1404.
Wherein the region laser image acquisition module 1401 may be configured to acquire a region laser image of the vehicle, the region laser image including depth information and height information.
The area laser image determination front wheel position module 1402 may be configured to determine a front wheel position of the vehicle based on the area laser image.
The height change point determination module 1403 may be configured to determine a height change point in the area laser image where a gradient value of the vehicle body height is greater than a predetermined threshold.
The head position determination module 1404 may be configured to determine the head and carriage interface positions based on the positional relationship of the elevation change point to the front wheel positions, thereby determining the head position.
In some embodiments, as shown in fig. 15, the apparatus 140 may further include a de-air conditioning module 1405, where the de-air conditioning module 1405 may be configured to generate a region binary plane image of the vehicle from the region laser image; determining a vehicle air-conditioning position based on the area binary plane image; and removing the vehicle air conditioning information in the area laser image according to the vehicle air conditioning position.
In an exemplary embodiment, as shown in fig. 16, the area laser image acquisition module 1401 includes: a sequential image acquisition unit 14011, a region laser image generation unit 14012. The sequence image acquisition unit 14011 may be configured to acquire sequence image information of the vehicle, and the region laser image generation unit 14012 may be configured to generate the region laser image from the sequence image information.
In an exemplary embodiment, as shown in fig. 17, if the vehicle front wheel position cannot be determined based on the area laser image, the area laser image determination front wheel position module 1402 may further include: a region binary plane image generation unit 14021 that may be configured to generate a region binary plane image of the vehicle from the region laser image; the area binary plane image determination front wheel unit 14022 may be configured to determine a front wheel position of the vehicle based on the area binary plane image.
Fig. 18 is a block diagram illustrating a head recognition device according to an exemplary embodiment. Referring to fig. 18, the apparatus 180 includes an acquisition area laser image module 1801, an area laser image determination front wheel position module 1802, an image far point area determination module 1803, and a vehicle head position determination module 1804.
Wherein the acquisition area laser image module 1801 may be configured to acquire an area laser image of the vehicle, the area laser image including depth information and height information.
The determine front wheel position module 1802 may be configured to determine a front wheel position of the vehicle based on the area laser image.
The image far point region determination module 1803 may be configured to determine a far point region in the area laser image based on depth information of the area laser image.
The vehicle head position determination module 1804 may be configured to determine a head and cabin interface position based on a positional relationship of the far spot area and the front wheel position, thereby determining a head position.
In an exemplary embodiment, as shown in fig. 19, the apparatus 180 may further include a de-air conditioning module 1805, and the de-air conditioning module 1805 may be configured to generate a region binary plane image of the vehicle from the region laser image; determining a vehicle air-conditioning position based on the area binary plane image; and removing the vehicle air conditioning information in the area laser image according to the vehicle air conditioning position.
In an exemplary embodiment, as shown in fig. 20, the acquisition area laser image module 1801 includes: a sequential image acquisition unit 18011, and a region laser image generation unit 18012. The sequence image acquisition unit 18011 may be configured to acquire sequence image information of the vehicle, and the area laser image generation unit 18012 may be configured to generate the area laser image from the sequence image information.
In an exemplary embodiment, as shown in fig. 21, if the vehicle front wheel position cannot be determined based on the area laser image, the determining front wheel position module 1802 may further include: the area binary plane image generating unit 18021 may be configured to generate an area binary plane image of the vehicle from the area laser image; the area binary plane image determination front wheel position unit 18022 may be configured to determine a front wheel position of the vehicle based on the area binary plane image.
Since the respective functional modules of the head recognition apparatuses 140 and 180 of the exemplary embodiments of the present invention correspond to the steps of the exemplary embodiments of the head recognition methods described above, a detailed description thereof is omitted here.
Fig. 22 is a block diagram of a vehicle scanning system, according to an exemplary embodiment. The system includes a laser device 2201, a scanning device 2202, and a control device 2203. Wherein the laser device 2201 is configured to acquire a region laser image of a vehicle, the scanning device 2202 is configured to scan the vehicle, and the control device 2203 is configured to: receive the region laser image of the vehicle, the region laser image including depth information and height information; determine a front wheel position of the vehicle based on the region laser image; determine a height change point in the region laser image where the gradient value of the vehicle body height is greater than a predetermined threshold; determine the boundary position of the vehicle head and the carriage according to the positional relationship between the height change point and the front wheel position, so as to determine the vehicle head position; and control the scanning device to scan the vehicle based on the vehicle head position.
In another embodiment, the control device 2203 is configured to: receive the region laser image of the vehicle, the region laser image including depth information and height information; determine a front wheel position of the vehicle based on the region laser image; determine a far point region in the region laser image based on the depth information of the region laser image; determine the boundary position of the vehicle head and the carriage according to the positional relationship between the far point region and the front wheel position, so as to determine the vehicle head position; and control the scanning device to scan the vehicle based on the vehicle head position.
Referring now to FIG. 23, there is shown a schematic diagram of a computer system 230 suitable for use in implementing an embodiment of the present application. The terminal device shown in fig. 23 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiment of the present application.
As shown in fig. 23, the computer system 230 includes a Central Processing Unit (CPU) 2301, which can perform various appropriate actions and processes according to programs stored in a Read Only Memory (ROM) 2302 or programs loaded from a storage portion 2308 into a Random Access Memory (RAM) 2303. The RAM 2303 also stores various programs and data required for the operation of the system 230. The CPU 2301, ROM 2302, and RAM 2303 are connected to each other through a bus 2304. An input/output (I/O) interface 2305 is also connected to the bus 2304.
The following components are connected to the I/O interface 2305: an input section 2306 including a keyboard, a mouse, and the like; an output section 2307 including a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), a speaker, and the like; a storage section 2308 including a hard disk and the like; and a communication section 2309 including a network interface card such as a LAN card or a modem. The communication section 2309 performs communication processing via a network such as the Internet. A drive 2310 is also coupled to the I/O interface 2305 as needed. A removable medium 2311, such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, is mounted on the drive 2310 as needed, so that a computer program read therefrom is installed into the storage section 2308 as needed.
In particular, according to embodiments of the present disclosure, the processes described above with reference to flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method shown in the flowcharts. In such an embodiment, the computer program can be downloaded and installed from a network via the communication portion 2309, and/or installed from a removable medium 2311. The above-described functions defined in the system of the present application are performed when the computer program is executed by a Central Processing Unit (CPU) 2301.
The computer readable medium shown in the present application may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present application, however, the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units involved in the embodiments of the present application may be implemented in software or in hardware. The described units may also be provided in a processor, for example described as: a processor comprising a transmitting unit, an acquiring unit, a determining unit, and a first processing unit. In some cases, the names of these units do not limit the units themselves.
As another aspect, the present application also provides a computer-readable medium, which may be included in the apparatus described in the above embodiments, or may exist separately without being assembled into the apparatus. The computer-readable medium carries one or more programs which, when executed by a device, cause the device to perform functions including: acquiring an area laser image of a vehicle, wherein the area laser image comprises depth information and height information; determining a front wheel position of the vehicle based on the area laser image; determining a height change point at which the gradient value of the vehicle body height in the area laser image is greater than a preset threshold; and determining the boundary position of the vehicle head and the carriage according to the positional relationship between the height change point and the front wheel position, thereby determining the vehicle head position.
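The boundary-finding step summarized above can be sketched as follows. This is a minimal illustration assuming a per-column height profile extracted from the area laser image; the function name, scan resolution, and numeric thresholds are illustrative assumptions, not values taken from the disclosure:

```python
import numpy as np

def find_head_boundary(height_profile, front_wheel_center,
                       wheel_range=(1.0, 4.0), grad_threshold=0.5,
                       resolution=0.05):
    """Locate the head/carriage boundary from a per-column height profile.

    height_profile: 1-D array of vehicle-body heights (metres), one value
        per scan column along the driving direction.
    front_wheel_center: column index of the front-wheel center point.
    wheel_range: permitted horizontal distance (metres) from the wheel
        center to the boundary -- illustrative values only.
    resolution: metres per scan column -- illustrative value only.
    """
    # Height change points: columns where the height gradient exceeds a threshold.
    gradient = np.abs(np.diff(height_profile))
    candidates = np.where(gradient > grad_threshold)[0]

    for col in candidates:
        if col <= front_wheel_center:
            # Discard change points that are not behind the front wheel.
            continue
        distance = (col - front_wheel_center) * resolution
        if wheel_range[0] <= distance <= wheel_range[1]:
            # Boundary between the vehicle head and the carriage.
            return int(col)
    return None  # no candidate qualified; fall back to a depth-based search
```

For a truck whose low cab steps up to a taller container behind the front wheel, the single large height gradient between the two regions is the returned boundary column.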
From the above description of embodiments, those skilled in the art will readily appreciate that the example embodiments described herein may be implemented in software, or in software combined with the necessary hardware. Thus, the technical solution of the embodiments of the present application may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (such as a CD-ROM, a USB flash drive, or a removable hard disk) and which includes several instructions for causing a computing device (such as a personal computer, a server, a mobile terminal, or a smart device) to perform a method according to the embodiments of the present application.
Furthermore, the above-described drawings are only schematic illustrations of the processes included in the method according to exemplary embodiments of the present invention, and are not intended to be limiting. It will be readily appreciated that the processes shown in these figures do not indicate or limit their temporal order. It is also readily understood that these processes may be performed synchronously or asynchronously, for example in a plurality of modules.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This disclosure is intended to cover any variations, uses, or adaptations of the invention following, in general, the principles of the invention and including such departures from the present disclosure as come within known or customary practice within the art to which the invention pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the invention being indicated by the following claims.
It is to be understood that the invention is not limited to the precise constructions, drawings, or implementations set forth herein; on the contrary, it is intended to cover various modifications and equivalent arrangements included within the spirit and scope of the appended claims.

Claims (13)

1. A method for identifying a vehicle head, comprising:
acquiring an area laser image of the vehicle, wherein the area laser image comprises actual height information of the vehicle, the actual height information being obtained from the straight-line distance between the vehicle body of the scanned portion of the vehicle and the laser device and the included angle between the laser propagation path and the vertical direction of the laser plane;
determining a front wheel position of the vehicle based on the area laser image, wherein the front wheel position is determined by finding an area in the area laser image having a height below a determination threshold;
determining a height change point at which the gradient value of the vehicle body height in the area laser image is greater than a preset threshold;
discarding the height change point if the height change point is not behind the front wheel;
if the height change point is behind the front wheel, continuing to judge whether the horizontal distance between the height change point and the center point of the front wheel is within a preset range;
if the horizontal distance between the height change point and the center point of the front wheel is within the preset range, determining that the position of the height change point is the boundary position of the vehicle head and the carriage, thereby determining the vehicle head position.
2. The method of claim 1, wherein acquiring an area laser image of the vehicle comprises:
acquiring sequence image information of the vehicle, wherein the sequence image information is column data constituting an image and includes the straight-line distance between the vehicle body of the scanned portion of the vehicle and the laser device;
determining depth information of the vehicle body of the scanned portion of the vehicle according to the straight-line distance between the vehicle body of the scanned portion and the laser device in the sequence image;
correcting the depth information of the vehicle body of the scanned portion of the vehicle according to speed information of the vehicle;
and determining the area laser image according to the corrected depth information of the vehicle body of the scanned portion of the vehicle.
3. The method as recited in claim 1, further comprising:
generating an area binary plane image of the vehicle according to the area laser image;
determining a vehicle air conditioner position based on the area binary plane image;
and removing the vehicle air conditioner information from the area laser image according to the vehicle air conditioner position.
4. The method according to claim 3, wherein the area laser image comprises the actual height of each laser reflection point, and wherein generating the area binary plane image of the vehicle according to the area laser image comprises:
filling the laser reflection points into binary data between a highest row and a lowest row according to their actual heights, wherein the laser reflection point with the highest actual height is stored in the highest row of the binary data and the reflection point with the lowest actual height is stored in the lowest row;
and generating the area binary plane image according to the binary data.
5. The method of claim 3, wherein determining a vehicle air conditioner position based on the area binary plane image comprises:
determining, for each row of data in the binary plane image, air sequence information representing the distance from the front position of the vehicle head to the leftmost end of the area binary plane image;
determining a change curve of the leftmost position of the vehicle according to the air sequence information;
and determining the vehicle air conditioner position according to an inflection point on the change curve of the leftmost position of the vehicle, the inflection point being the concavity-convexity demarcation point of the change curve.
6. The method of claim 1, wherein the determining a front wheel position of the vehicle based on the area laser image comprises:
if the front wheel position of the vehicle cannot be determined based on the area laser image, an area binary plane image of the vehicle is generated from the area laser image, and the front wheel position of the vehicle is determined based on the area binary plane image.
7. The method of claim 1, wherein the area laser image further comprises depth information; wherein the method further comprises:
determining that the boundary position of the vehicle head and the carriage cannot be confirmed through the height change point;
determining a far point area in the area laser image based on the depth information of the area laser image, wherein an area consisting of points whose depth information is greater than a predetermined threshold in the area laser image is the far point area;
determining whether the far point area is behind the front wheel position;
if the far point area is behind the front wheel position, continuing to judge whether the distance from the end of the far point area nearest the front wheel to the center point of the front wheel is within a preset range;
if the distance from the end of the far point area nearest the front wheel to the center point of the front wheel is not within the preset range, discarding the far point area;
if the distance from the end of the far point area nearest the front wheel to the center point of the front wheel is within the preset range, determining that the position of the far point area is the boundary position of the vehicle head and the carriage, thereby determining the vehicle head position.
8. A vehicle scanning method, characterized by comprising:
determining a vehicle head position by the vehicle head identification method according to any one of claims 1 to 7;
and scanning the vehicle based on the vehicle head position.
9. The method of claim 8, wherein scanning the vehicle based on the vehicle head position comprises: scanning the vehicle carriage separately based on the vehicle head position.
10. A vehicle head identification device, comprising:
an area laser image acquisition module configured to acquire an area laser image of the vehicle, wherein the area laser image comprises actual height information of the vehicle, the actual height information being obtained from the straight-line distance between the vehicle body of the scanned portion of the vehicle and the laser device and the included angle between the laser propagation path and the vertical direction of the laser plane;
a front wheel position determination module configured to determine a front wheel position of the vehicle based on the area laser image, wherein the front wheel position is determined by finding an area in the area laser image having a height below a determination threshold;
a height change point determination module configured to determine a height change point at which the gradient value of the vehicle body height in the area laser image is greater than a preset threshold;
a head position determination module configured to discard the height change point if the height change point is not behind the front wheel; if the height change point is behind the front wheel, to continue to judge whether the horizontal distance between the height change point and the center point of the front wheel is within a preset range; and if the horizontal distance between the height change point and the center point of the front wheel is within the preset range, to determine that the position of the height change point is the boundary position of the vehicle head and the carriage, thereby determining the vehicle head position.
11. A vehicle scanning system, comprising:
a laser device for acquiring an area laser image of the vehicle;
a scanning device for scanning the vehicle;
a control device for receiving an area laser image of the vehicle, the area laser image comprising actual height information of the vehicle, the actual height information being obtained from the straight-line distance between the vehicle body of the scanned portion of the vehicle and the laser device; determining a front wheel position of the vehicle based on the area laser image, wherein the front wheel position is determined by finding an area in the area laser image having a height below a determination threshold; determining a height change point at which the gradient value of the vehicle body height in the area laser image is greater than a preset threshold; determining the boundary position of the vehicle head and the carriage according to the positional relationship between the height change point and the front wheel position, thereby determining the vehicle head position; and controlling the scanning device to scan the vehicle based on the vehicle head position.
12. An electronic device, comprising:
one or more processors;
storage means for storing one or more programs,
the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any of claims 1-7.
13. A computer readable medium, on which a computer program is stored, characterized in that the program, when being executed by a processor, implements the method according to any of claims 1-7.
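The actual-height computation recited in claim 1 (the height obtained from the straight-line distance to the laser device and the included angle with the vertical) can be sketched as follows. The mounting-height parameter, function name, and exact trigonometric form are illustrative assumptions; the claim only states that the height follows from the distance and the angle:

```python
import math

def actual_height(distance, angle_deg, laser_mount_height):
    """One plausible reading of the geometry in claim 1.

    Assumes the laser sits at a known mounting height (metres) and the
    ray to the reflection point makes `angle_deg` degrees with the
    vertical within the scan plane. The vertical drop from the laser to
    the reflection point is then distance * cos(angle), and the point's
    actual height is the mounting height minus that drop.
    """
    vertical_drop = distance * math.cos(math.radians(angle_deg))
    return laser_mount_height - vertical_drop
```

For example, a reflection point 2 m away along a ray 60 degrees off vertical, from a laser mounted at 5 m, would lie at 5 − 2·cos 60° = 4 m under these assumptions.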
CN202311064592.9A 2018-12-27 2018-12-27 Head recognition method and device, vehicle scanning system, device and readable medium Pending CN117037086A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311064592.9A CN117037086A (en) 2018-12-27 2018-12-27 Head recognition method and device, vehicle scanning system, device and readable medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201811615134.9A CN111382597B (en) 2018-12-27 2018-12-27 Head recognition method and device, vehicle scanning system, device and readable medium
CN202311064592.9A CN117037086A (en) 2018-12-27 2018-12-27 Head recognition method and device, vehicle scanning system, device and readable medium

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201811615134.9A Division CN111382597B (en) 2018-12-27 2018-12-27 Head recognition method and device, vehicle scanning system, device and readable medium

Publications (1)

Publication Number Publication Date
CN117037086A true CN117037086A (en) 2023-11-10

Family

ID=71216523

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201811615134.9A Active CN111382597B (en) 2018-12-27 2018-12-27 Head recognition method and device, vehicle scanning system, device and readable medium
CN202311064592.9A Pending CN117037086A (en) 2018-12-27 2018-12-27 Head recognition method and device, vehicle scanning system, device and readable medium

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN201811615134.9A Active CN111382597B (en) 2018-12-27 2018-12-27 Head recognition method and device, vehicle scanning system, device and readable medium

Country Status (1)

Country Link
CN (2) CN111382597B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112462086B (en) * 2020-10-23 2023-08-15 宁波傲视智绘光电科技有限公司 Speed measuring method and device and readable storage medium

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5478419B2 (en) * 2009-10-22 2014-04-23 三菱電機株式会社 Axle detection system
CN104050811B (en) * 2014-06-13 2017-05-24 深圳市砝石激光雷达有限公司 Laser motor vehicle model classification system and method
CN104391339B (en) * 2014-12-17 2018-02-09 同方威视技术股份有限公司 Model recognizing method and the quick inspection system of vehicle using this method
CN105427614A (en) * 2015-08-28 2016-03-23 北京动视元科技有限公司 Model classification system and method

Also Published As

Publication number Publication date
CN111382597A (en) 2020-07-07
CN111382597B (en) 2023-09-12

Similar Documents

Publication Publication Date Title
CN110286387B (en) Obstacle detection method and device applied to automatic driving system and storage medium
US10824880B2 (en) Methods and systems for detecting environmental information of a vehicle
CN110163930B (en) Lane line generation method, device, equipment, system and readable storage medium
CN109284348B (en) Electronic map updating method, device, equipment and storage medium
CN106919908B (en) Obstacle identification method and device, computer equipment and readable medium
CN108509820B (en) Obstacle segmentation method and device, computer equipment and readable medium
CN106845412B (en) Obstacle identification method and device, computer equipment and readable medium
CN106934347B (en) Obstacle identification method and device, computer equipment and readable medium
US7826666B2 (en) Methods and apparatus for runway segmentation using sensor analysis
US10132919B2 (en) Object detecting device, radar device, and object detection method
CN112897345B (en) Alignment method of container truck and crane and related equipment
CN112258519B (en) Automatic extraction method and device for way-giving line of road in high-precision map making
US20230005278A1 (en) Lane extraction method using projection transformation of three-dimensional point cloud map
CN112613424A (en) Rail obstacle detection method, rail obstacle detection device, electronic apparatus, and storage medium
CN113835102B (en) Lane line generation method and device
CN109447003A (en) Vehicle checking method, device, equipment and medium
KR101178508B1 (en) Vehicle Collision Alarm System and Method
CN113514847A (en) Vehicle outer contour dimension detection method and system and storage medium
CN111382597B (en) Head recognition method and device, vehicle scanning system, device and readable medium
WO2021005105A1 (en) Imaging systems, devices and methods
US20220171975A1 (en) Method for Determining a Semantic Free Space
CN109827610B (en) Method and device for verifying sensor fusion result
CN110727269A (en) Vehicle control method and related product
CN115063771B (en) Error correction method, system, storage medium and device for detecting distance of obstacle
CN113936493B (en) Image processing method, apparatus, computer device and medium for automatic driving

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination