CN118135521A - Infrared aerial view sensing device, aerial view sensing method, controller and medium - Google Patents

Infrared aerial view sensing device, aerial view sensing method, controller and medium

Info

Publication number
CN118135521A
Authority
CN
China
Prior art keywords
infrared
aerial view
sensing
bird
surrounding environment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202410274886.2A
Other languages
Chinese (zh)
Inventor
童城
陈洋
梁林林
石本义
王星宇
冯站银
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Infiray Technologies Co Ltd
Original Assignee
Infiray Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Infiray Technologies Co Ltd
Priority to CN202410274886.2A
Publication of CN118135521A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 Geometric image transformations in the plane of the image
    • G06T3/40 Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T3/4038 Image mosaicing, e.g. composing plane images from plane sub-images

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Traffic Control Systems (AREA)

Abstract

The application provides an infrared aerial view sensing device, an aerial view sensing method, a controller and a medium, belonging to the technical field of driver assistance. The infrared aerial view sensing device comprises a mounting body and a plurality of infrared cameras arranged on the mounting body. The mounting body is used to fit the infrared aerial view sensing device to a vehicle, and the plurality of infrared cameras are arranged in a ring and synchronously capture infrared images of the vehicle's surroundings from different acquisition viewing angles. Because the device can be fitted to a vehicle via the mounting body, it offers high flexibility in use; it can clearly capture omnidirectional infrared images of the vehicle's surroundings even where visibility is poor, so that the aerial view sensing method based on the device achieves high environment sensing accuracy across different application environments and helps improve driving safety.

Description

Infrared aerial view sensing device, aerial view sensing method, controller and medium
Technical Field
The application relates to the technical field of driver assistance, and in particular to an infrared aerial view sensing device, an aerial view sensing method, a controller and a medium.
Background
In the field of autonomous driving, perception tasks based on the Bird's Eye View (BEV) representation are attracting increasing attention. Current BEV sensing schemes include pure vision sensing schemes and multi-sensor fusion sensing schemes. In a pure vision scheme, several visible light cameras are arranged on the vehicle body to achieve omnidirectional sensing of the vehicle's surroundings. In a multi-sensor fusion scheme, additional sensors such as lidar and millimeter-wave radar are added on top of the pure vision scheme, and the environment information collected by the visible light cameras is fused in BEV space with the information collected by the lidar and/or millimeter-wave radar, improving the accuracy of environment perception.
However, in environments with poor visibility such as night, haze or dust, visible light cameras capture little usable information, which degrades the accuracy of the environment perception result.
Disclosure of Invention
To solve the existing technical problems, the application provides an infrared aerial view sensing device, an aerial view sensing method, a controller and a computer-readable storage medium; the aerial view sensing method based on the infrared aerial view sensing device achieves high environment sensing accuracy across different application environments.
According to a first aspect of the embodiments of the present application, an infrared aerial view sensing device is provided, comprising a mounting body and a plurality of infrared cameras arranged on the mounting body;
the mounting body is used to fit the infrared aerial view sensing device to a vehicle;
the plurality of infrared cameras are arranged in a ring and synchronously capture infrared images of the vehicle's surroundings from different acquisition viewing angles; the infrared images can be stitched into an omnidirectional infrared image of the surroundings, and an omnidirectional aerial view sensing result of the surroundings can be obtained based on the infrared images.
According to a second aspect of the embodiments of the present application, an aerial view sensing method is provided, comprising:
acquiring environment sensing data of the surroundings of a vehicle, the environment sensing data comprising infrared images obtained by synchronously capturing the surroundings from different acquisition viewing angles with the above infrared aerial view sensing device;
obtaining a target aerial view feature of the surroundings based on the environment sensing data comprising the infrared images;
and performing environment sensing based on the target aerial view feature to obtain an aerial view sensing result of the surroundings.
According to a third aspect of the embodiments of the present application, a controller is provided, comprising a processor and a memory for storing computer program instructions, the processor performing the aerial view sensing method when executing the computer program instructions.
According to a fourth aspect of the embodiments of the present application, a computer-readable storage medium is provided, having computer program instructions stored thereon;
when the computer program instructions are executed by a processor, the aerial view sensing method is implemented.
As can be seen from the above, the infrared aerial view sensing device provided by the embodiments of the application can be fitted to a vehicle via the mounting body, the plurality of infrared cameras capture synchronously, and the synchronously captured infrared images can be stitched into an omnidirectional infrared image of the vehicle's surroundings. The device can therefore clearly capture omnidirectional infrared images of the surroundings even where visibility is poor, so that the aerial view sensing method based on the device achieves high environment sensing accuracy across different application environments and helps improve driving safety. In addition, because the device is fitted via the mounting body, it offers high flexibility in use: it does not require fixed configuration before the vehicle leaves the factory, a user can choose to fit it to the vehicle as required, and replacement and maintenance are convenient.
Drawings
The drawings are only for purposes of illustrating embodiments and are not to be construed as limiting the invention. Also, like reference numerals are used to designate like parts throughout the figures. In the drawings:
FIG. 1 is a schematic diagram of an infrared bird's eye view sensing device according to some embodiments of the present application;
FIG. 2 is a schematic illustration of an infrared bird's eye view sensing device according to some embodiments of the present application mounted on a vehicle;
FIG. 3 is a flow chart of a bird's eye view sensing method according to some embodiments of the present application;
FIG. 4 is a bird's-eye view image carrying bird's-eye view sensing results obtained by a bird's-eye view sensing method according to some embodiments of the present application;
FIG. 5 is a flow chart of obtaining a target aerial view feature in an aerial view sensing method according to some embodiments of the present application;
FIG. 6 is an infrared image carrying 3D detection results obtained by a bird's-eye view sensing method according to some embodiments of the present application;
FIG. 7 is a schematic structural diagram of a controller according to some embodiments of the present application.
Detailed Description
The technical solution of the application is further described below with reference to the accompanying drawings and specific embodiments.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used herein in the description of the application is for the purpose of describing particular embodiments only and is not intended to be limiting of the implementations of the application. The term "and/or" as used herein includes any and all combinations of one or more of the associated listed items.
In the description of the present application, it should be understood that the terms "center," "upper," "lower," "front," "rear," "left," "right," "vertical," "horizontal," "top," "bottom," "inner," "outer," and the like indicate orientations or positional relationships based on those shown in the drawings, merely to facilitate and simplify the description of the present application; they do not indicate or imply that the devices or elements referred to must have a specific orientation or be constructed and operated in a specific orientation, and thus should not be construed as limiting the present application. In the description of the present application, unless otherwise indicated, "a plurality" means two or more.
In the description of the present application, it should be noted that, unless explicitly specified and limited otherwise, the terms "mounted" and "connected" are to be construed broadly; for example, a connection may be a fixed connection, a detachable connection or an integral connection; it may be a direct connection, an indirect connection through an intermediate medium, or a communication between two elements. The specific meaning of the above terms in the present application can be understood by those of ordinary skill in the art according to the specific circumstances.
Some vehicles are fitted with a bird's-eye view sensing system before they leave the factory to provide driver assistance functions. Such an on-board system usually comprises a plurality of visible light cameras arranged around the vehicle body and performs bird's-eye view sensing from the surround images they collect. Visible light cameras, however, are strongly affected by light intensity: in scenes with weak light, the sensing accuracy of the vehicle's bird's-eye view sensing system is low, which is detrimental to safe driving. To address the problem that the factory-fitted bird's-eye view sensing system cannot meet the sensing requirements under low light, the application provides an infrared aerial view sensing device that can be retrofitted to the vehicle, so that the vehicle's bird's-eye view sensing system can obtain high-accuracy sensing results even when the ambient light intensity is low.
Referring to fig. 1 and 2, fig. 1 is a schematic structural diagram of an infrared aerial view sensing device 10 according to some embodiments of the present application, and fig. 2 is a schematic diagram of the infrared aerial view sensing device 10 mounted on a vehicle 20. In some embodiments, the infrared aerial view sensing device 10 may be retrofitted to the vehicle 20 to capture infrared images of the surroundings of the vehicle 20 from different acquisition viewing angles. That is, after the vehicle 20 has left the factory, a user may still install the infrared aerial view sensing device 10 provided by the embodiments of the application on the vehicle 20 as required, so as to perform bird's-eye view sensing based on it.
Specifically, the infrared aerial view sensing device 10 comprises a mounting body 101 and a plurality of infrared cameras 102 arranged on the mounting body 101. The mounting body 101 is used to fit the infrared aerial view sensing device 10 to the vehicle 20. The plurality of infrared cameras 102 are arranged in a ring and synchronously capture infrared images of the surroundings of the vehicle 20 from different acquisition viewing angles; the infrared images can be stitched into an omnidirectional infrared image of the surroundings, and an omnidirectional bird's-eye view sensing result of the surroundings of the vehicle 20 can be obtained based on them.
The omnidirectional infrared image is also called a surround-view infrared image of the environment in which the vehicle 20 is located. The acquisition times of the infrared cameras 102 in the infrared aerial view sensing device 10 are synchronized, and their acquisition viewing angles satisfy preset conditions, so that stitching the infrared images synchronously captured by the cameras yields an omnidirectional image of the surroundings of the vehicle 20. The image features of the synchronously captured infrared images are converted into bird's-eye view features in BEV space, so that omnidirectional bird's-eye view sensing can be performed based on the bird's-eye view features corresponding to the infrared images, yielding an omnidirectional bird's-eye view sensing result of the surroundings of the vehicle 20.
The infrared aerial view sensing device 10 provided by the embodiments of the application can be fitted to the vehicle 20 via the mounting body 101, the plurality of infrared cameras 102 capture synchronously, and the synchronously captured infrared images can be stitched into an omnidirectional infrared image of the surroundings of the vehicle 20. The device can therefore clearly capture omnidirectional infrared images of the surroundings even in environments with poor visibility such as night, haze or dust, allowing the surroundings of the vehicle 20 to be sensed accurately in different application environments and helping to improve the driving safety of the vehicle 20.
In some embodiments, the infrared aerial view sensing device 10 is mounted on the roof of the vehicle 20 via the mounting body 101, which gives it a good field of view and helps improve the environment sensing accuracy of the bird's-eye view sensing method based on it. Specifically, the infrared aerial view sensing device 10 is detachably mounted on the vehicle 20 via the mounting body 101, for example at the middle of the roof. The mounting body 101 may be secured to the vehicle 20 by suction cups or other fasteners. Each infrared camera 102 may be connected to a controller in the vehicle 20 by wired or wireless communication, so that the controller can perform omnidirectional bird's-eye view sensing from the infrared images collected by the device. Specifically, the controller may be a vehicle controller installed in the vehicle 20. The infrared aerial view sensing device 10 thus helps improve the vehicle's environment sensing accuracy where visibility is poor. In addition, because the device is fitted via the mounting body 101, it offers high flexibility in use: it does not require fixed configuration before the vehicle 20 leaves the factory, and a user can choose to fit it to the vehicle 20 as required, which also makes replacement and maintenance convenient.
In some embodiments, the vehicle 20 may further include an ambient light sensor. When the ambient light sensor detects that the light intensity of the surroundings of the vehicle 20 is below a set value, the infrared aerial view sensing device 10 is used to collect information about the surroundings of the vehicle 20, either alone or together with the vehicle's other environment sensing devices, to achieve multi-sensor fused bird's-eye view sensing.
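As a rough illustration of this switching logic, the sketch below (Python) selects the perception sources from an ambient light reading; the lux threshold, the source names and the sensor interface are illustrative assumptions, not values taken from this application.

```python
# Hypothetical source-selection helper; LUX_THRESHOLD is an assumed switch-over point.
LUX_THRESHOLD = 50.0

def select_sources(ambient_lux, fuse_with_onboard_sensors=True):
    """Return which perception sources feed the bird's-eye view pipeline."""
    if ambient_lux < LUX_THRESHOLD:
        # Low light: use the infrared ring camera, optionally together with the
        # vehicle's own sensors for multi-sensor fusion.
        extra = ["visible", "lidar", "radar"] if fuse_with_onboard_sensors else []
        return ["infrared"] + extra
    # Sufficient light: the factory-fitted sensors alone may be adequate.
    return ["visible", "lidar", "radar"]

print(select_sources(ambient_lux=12.0))   # ['infrared', 'visible', 'lidar', 'radar']
```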
In some embodiments, the plurality of infrared cameras 102 are arranged at equal intervals along the circumferential side surface of the mounting body 101, and the acquisition viewing angles of adjacent infrared cameras 102 overlap, so that all the infrared cameras 102 can share the same field-of-view specification, which helps reduce the production cost of the infrared aerial view sensing device 10.
With continued reference to fig. 1, in some embodiments the cross section of the mounting body 101 is a regular hexagon, and the infrared cameras 102 are arranged to correspond to the six circumferential sides of the mounting body 101. That is, at least one infrared camera 102 is arranged on each circumferential side, with its lens facing that side to collect environment information in the direction that side faces. A mounting body 101 with a regular hexagonal cross section divides the surroundings of the vehicle 20 into six directions, each circumferential side facing one of them, and each infrared camera 102 collects the environment information of the direction of its circumferential side. Arranging the infrared cameras 102 to correspond to the six circumferential sides of the mounting body 101 makes it convenient to collect omnidirectional information about the surroundings of the vehicle 20, while the corresponding circumferential side of the mounting body 101 protects each camera's lens.
Specifically, in some embodiments the number of infrared cameras 102 is six, the cross section of the mounting body 101 is a regular hexagon, and one infrared camera 102 is arranged on each circumferential side. The six infrared cameras 102 are equally spaced, so that together they can collect omnidirectional information about the surroundings of the vehicle 20. Arranging the six infrared cameras 102 one-to-one and equidistantly along the circumferential sides of the regular-hexagonal mounting body 101 relaxes the requirements on each camera's horizontal field of view, vertical field of view and resolution, which helps reduce the cost of the infrared aerial view sensing device 10. For example, each of the six infrared cameras 102 may have a horizontal field of view of 75°, a vertical field of view of 63° and a resolution of 640 x 512, a specification that most infrared cameras on the market can meet.
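To see why the quoted specification covers the full ring, the short check below computes the axis spacing and pairwise overlap for six cameras with a 75° horizontal field of view on a regular hexagonal body; the arithmetic, not the particular numbers, is the point.

```python
# Coverage check: 6 cameras with 75 deg horizontal FOV on a regular hexagon.
num_cams, hfov_deg = 6, 75.0
spacing_deg = 360.0 / num_cams        # 60 deg between adjacent optical axes
overlap_deg = hfov_deg - spacing_deg  # 15 deg shared by each adjacent pair
assert overlap_deg > 0, "adjacent views would leave blind sectors"
print(f"axis spacing: {spacing_deg} deg, pairwise overlap: {overlap_deg} deg")
```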
The application further provides a bird's-eye view sensing method based on the infrared aerial view sensing device 10 provided by the embodiments of the application; a flow diagram of the method is shown in fig. 3. In some embodiments, the bird's-eye view sensing method comprises S02, S04 and S06, which are described in detail below.
S02: the method comprises the steps of obtaining environment sensing data of the surrounding environment of the vehicle, wherein the environment sensing data comprise infrared images obtained by synchronously collecting the surrounding environment at different collecting visual angles by adopting the infrared aerial view sensing equipment provided by any embodiment of the application.
After the infrared aerial view sensing device 10 is switched on, each infrared camera 102 in the device begins to synchronously capture infrared images of the surroundings of the vehicle 20 from its acquisition viewing angle, so that omnidirectional infrared information about the surroundings of the vehicle 20 is obtained, and the captured infrared images of all directions are sent to the vehicle controller for bird's-eye view sensing.
S04: obtaining a target bird's-eye view feature of the surroundings based on the environment sensing data containing the infrared images.
The environment sensing data are data acquired by environment sensing devices mounted on the vehicle, where the environment sensing devices include the infrared aerial view sensing device 10 provided by the embodiments of the application. The image features of the infrared images synchronously captured by the plurality of infrared cameras 102 are converted into BEV space to obtain the target bird's-eye view feature. The target bird's-eye view feature may be an infrared bird's-eye view feature converted only from the image features of the infrared images; it may be a fused bird's-eye view feature obtained by first fusing each infrared image with the environment sensing data collected in the corresponding direction by other sensors and then converting the image features of the fused images; or it may be a fused bird's-eye view feature obtained by fusing the infrared bird's-eye view feature with bird's-eye view features obtained from other environment sensing devices. The other environment sensing devices used to acquire environment information include, for example, a visible light camera, a lidar or a millimeter-wave radar.
S06: performing environment sensing based on the target bird's-eye view feature to obtain a bird's-eye view sensing result of the surroundings.
Environment sensing based on the target bird's-eye view feature includes, but is not limited to, 3D target detection, lane line recognition, target tracking, semantic segmentation and/or dynamic target trajectory prediction performed on the target bird's-eye view feature to obtain the corresponding bird's-eye view sensing results. Fig. 4 shows a bird's-eye view image carrying the bird's-eye view sensing results obtained by sensing the targets in the surroundings based on the target bird's-eye view feature; the white rectangles in the figure represent targets in the surroundings, and in other embodiments different types of targets may be represented by rectangles of different colors or by other shapes.
In the bird's-eye view sensing method provided by the embodiments of the application, the target bird's-eye view feature used for environment sensing is obtained from the infrared images captured by the infrared aerial view sensing device 10. Since the infrared aerial view sensing device 10 can capture more useful information about the surroundings where visibility is poor, the method has higher sensing accuracy and helps improve driving safety.
Fig. 5 is a flow chart of obtaining the target bird's-eye view feature in the bird's-eye view sensing method according to some embodiments of the present application. Specifically, in some embodiments, S04 comprises S042a, S044a, S046a and S048a.
S042a: extracting features from the infrared images to obtain the corresponding infrared image features.
After the infrared aerial view sensing device 10 is switched on, the image features of each infrared image it captures synchronously are extracted by an image feature extraction network to obtain the infrared image feature corresponding to each infrared image. Specifically, the image feature extraction network includes, but is not limited to, DLA34, ResNet or MobileNet networks.
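As a minimal sketch of this feature-extraction step, the snippet below runs a ResNet-18 trunk (via torchvision) over a batch of six synchronized frames; the choice of ResNet-18, the untrained weights and the replication of the single-channel infrared frames to three channels are illustrative assumptions, not details from this application.

```python
import torch
import torchvision

# Truncate a ResNet-18 before its pooling/classification layers so it returns 2D feature maps.
backbone = torchvision.models.resnet18()
feature_extractor = torch.nn.Sequential(*list(backbone.children())[:-2])

# Six synchronized 640 x 512 infrared frames, replicated to 3 channels for the RGB trunk.
imgs = torch.randn(6, 3, 512, 640)
feats = feature_extractor(imgs)       # (6, 512, 16, 20): per-camera 2D image features
print(feats.shape)
```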
S044a: performing dimension-raising processing on the infrared image features to obtain the corresponding view cone point features.
Dimension-raising the infrared image features means converting the two-dimensional infrared image features into the corresponding three-dimensional spatial features, namely the view cone (frustum) point features.
Specifically, in some embodiments a depth estimation network may be used to estimate the depth of the pixels in each infrared image and obtain a depth feature for each image; each pixel of the two-dimensional infrared image feature is then projected into three-dimensional space according to the depth feature and the intrinsic and extrinsic parameters of the corresponding infrared camera 102, forming a view cone point cloud based on the infrared image. Compared with the traditional inverse perspective transformation used to convert from image space to BEV space, this way of lifting two-dimensional image features into three-dimensional space does not rely on a ground-plane assumption and is suitable for detecting objects with height.
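A compact sketch of this lifting step is given below, in the spirit of lift-splat-style view transforms: each feature-map pixel is pushed out along a set of candidate depths and mapped into the vehicle frame with the camera intrinsics K and extrinsics (R, t). The discrete depth bins, the tensor shapes and the depth-probability weighting are illustrative assumptions; K must correspond to the (possibly downsampled) feature-map resolution.

```python
import torch

def lift_to_frustum(feat, depth_prob, K, R, t, depth_bins):
    # feat:       (C, H, W) 2D image features from one infrared camera
    # depth_prob: (D, H, W) per-pixel depth distribution from the depth estimation network
    # depth_bins: (D,)      candidate metric depths
    C, H, W = feat.shape
    D = depth_bins.shape[0]
    us, vs = torch.meshgrid(torch.arange(W), torch.arange(H), indexing="xy")
    pix = torch.stack([us, vs, torch.ones_like(us)], dim=-1).float()   # (H, W, 3) homogeneous pixels
    rays = pix @ torch.linalg.inv(K).T                                 # back-project to camera-frame rays
    pts_cam = depth_bins.view(D, 1, 1, 1) * rays                       # (D, H, W, 3) frustum points
    pts_ego = pts_cam @ R.T + t                                        # camera frame -> vehicle frame
    frustum_feat = depth_prob.unsqueeze(1) * feat.unsqueeze(0)         # (D, C, H, W) depth-weighted features
    return pts_ego, frustum_feat
```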
S046a: generating a fixed bird's-eye view grid, and converting the view cone point features into an initial bird's-eye view feature based on the bird's-eye view grid.
A bird's-eye view grid of a certain length and width is set, the physical size of each grid cell being fixed in BEV space, and each view cone point is placed into the corresponding grid cell according to the position represented by its view cone point feature, yielding the initial bird's-eye view feature of all view cone points of the surroundings.
In some embodiments, when one bird's-eye view grid cell corresponds to multiple view cone points, sum pooling is used to add together all the view cone point features falling in that cell.
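The following sketch illustrates this sum-pooling ("splat") step under assumed grid settings: frustum points whose (x, y) positions fall into the same fixed-size bird's-eye view cell have their features added together with a scatter-add. The grid extent and the 0.5 m cell size are illustrative, not values from this application.

```python
import torch

def splat_to_bev(pts_ego, frustum_feat, x_range=(-50.0, 50.0), y_range=(-50.0, 50.0), cell=0.5):
    # pts_ego:      (N, 3) frustum point positions in the vehicle frame (flattened from the lift step)
    # frustum_feat: (N, C) feature carried by each frustum point
    nx = int((x_range[1] - x_range[0]) / cell)
    ny = int((y_range[1] - y_range[0]) / cell)
    ix = ((pts_ego[:, 0] - x_range[0]) / cell).long()
    iy = ((pts_ego[:, 1] - y_range[0]) / cell).long()
    keep = (ix >= 0) & (ix < nx) & (iy >= 0) & (iy < ny)
    ix, iy, feat = ix[keep], iy[keep], frustum_feat[keep]
    bev = torch.zeros(nx * ny, feat.shape[1])
    bev.index_add_(0, ix * ny + iy, feat)         # sum-pool all points sharing one BEV cell
    return bev.view(nx, ny, -1).permute(2, 0, 1)  # (C, nx, ny) initial bird's-eye view feature
```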
S048a: performing multi-scale feature fusion on the initial bird's-eye view feature to obtain the target bird's-eye view feature.
The initial bird's-eye view feature may be encoded with an FPN structure to extract features at multiple scales, which are then fused to obtain the target bird's-eye view feature. A multi-scale fused target bird's-eye view feature handles scale changes better and captures the features of targets of different sizes more effectively, improving the accuracy of bird's-eye view sensing.
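The block below is one possible stand-in for the FPN-style encoding described above: the initial bird's-eye view feature is downsampled to two coarser scales, and the scales are then fused back at full resolution. Channel counts and depth are assumptions for illustration only.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BevFPN(nn.Module):
    def __init__(self, c_in=64, c_out=128):
        super().__init__()
        self.down1 = nn.Sequential(nn.Conv2d(c_in, c_in * 2, 3, stride=2, padding=1), nn.ReLU())
        self.down2 = nn.Sequential(nn.Conv2d(c_in * 2, c_in * 4, 3, stride=2, padding=1), nn.ReLU())
        self.fuse = nn.Conv2d(c_in + c_in * 2 + c_in * 4, c_out, kernel_size=1)

    def forward(self, bev):                 # bev: (B, c_in, H, W) initial BEV feature
        s1 = self.down1(bev)                # 1/2 resolution
        s2 = self.down2(s1)                 # 1/4 resolution
        up1 = F.interpolate(s1, size=bev.shape[-2:], mode="bilinear", align_corners=False)
        up2 = F.interpolate(s2, size=bev.shape[-2:], mode="bilinear", align_corners=False)
        return self.fuse(torch.cat([bev, up1, up2], dim=1))   # target BEV feature

print(BevFPN()(torch.randn(1, 64, 200, 200)).shape)   # torch.Size([1, 128, 200, 200])
```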
Further, in some embodiments, S06 specifically includes: performing 3D target detection based on the target bird's-eye view feature to obtain a 3D target detection result for the targets in the surroundings.
As stated above, the bird's-eye view sensing results include, but are not limited to, 3D target detection results, lane line recognition results, target tracking results and dynamic target trajectory prediction results. For 3D target detection, a 3D detection head performs detection on the target bird's-eye view feature to obtain the 3D information of each target in the surroundings of the vehicle 20. The 3D detection head detects the center coordinates of each target's 2D detection box, the offset between those coordinates and the projection of the center of the target's 3D detection box into the 2D detection box, the depth of the center point of the 2D detection box, the size of the 3D detection box and the observation angle, and determines the target's 3D information, i.e. the 3D target detection result, from this detection information.
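To make the decoding step concrete, the sketch below turns one set of head outputs (2D box center, offset to the projected 3D center, center depth, 3D size and observation angle) into a 3D box in the camera frame; the intrinsics and all numbers are illustrative, and the alpha-to-yaw conversion follows the common monocular convention rather than anything specified here.

```python
import numpy as np

def decode_3d_box(uv, offset, depth, size_lwh, alpha, K):
    # uv: 2D box center; offset: shift to the projected 3D center; depth: center depth in meters
    u, v = uv[0] + offset[0], uv[1] + offset[1]
    center_cam = depth * (np.linalg.inv(K) @ np.array([u, v, 1.0]))   # back-project the 3D center
    x, _, z = center_cam
    yaw = alpha + np.arctan2(x, z)                                    # observation angle -> global yaw
    return {"center": center_cam, "size_lwh": size_lwh, "yaw": yaw}

K = np.array([[720.0, 0.0, 320.0], [0.0, 720.0, 256.0], [0.0, 0.0, 1.0]])  # assumed IR camera intrinsics
print(decode_3d_box(uv=(300, 260), offset=(2.5, -1.0), depth=18.0,
                    size_lwh=(4.3, 1.8, 1.6), alpha=0.2, K=K))
```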
Further, the bird's-eye view sensing method according to some embodiments of the present application also comprises displaying the infrared image captured by each infrared camera 102 on a display in the vehicle 20 and displaying the 3D target detection results on each infrared image. By way of example, fig. 6 shows one of the infrared images carrying the 3D target detection results, captured by the infrared camera 102 that collects environment information in front of the vehicle 20. Displaying the 3D target detection results on the infrared image includes drawing the 3D detection box of each target, such as the white cuboids in fig. 6. In other embodiments, different types of targets may be represented on the infrared image by 3D detection boxes of different colors and/or shapes. Additionally, in some embodiments, one or more of the infrared images carrying the 3D target detection results and/or the bird's-eye view image carrying the bird's-eye view sensing results may be selected for display.
In some embodiments, the environment sensing data further include a surround-view visible light image, lidar point cloud data and/or millimeter-wave radar point cloud data, and S04 specifically includes: obtaining an infrared bird's-eye view feature of the surroundings based on the infrared image data; obtaining a bird's-eye view feature to be fused of the surroundings based on the surround-view visible light image, the lidar point cloud data and/or the millimeter-wave radar point cloud data; and fusing the infrared bird's-eye view feature with the bird's-eye view feature to be fused to obtain the target bird's-eye view feature.
The surround-view visible light image is captured by a plurality of visible light cameras arranged on the body of the vehicle 20, the lidar point cloud data are collected by a lidar on the vehicle 20, and the millimeter-wave radar point cloud data are collected by a millimeter-wave radar on the vehicle 20. The visible light cameras, lidar and/or millimeter-wave radar are the vehicle's own environment sensing devices, fitted to the vehicle 20 before it leaves the factory. The bird's-eye view sensing method of this embodiment fuses, at the feature level in BEV space, the infrared images captured by the infrared aerial view sensing device 10 with the environment sensing data collected by the vehicle's own devices, achieving multi-sensor fusion sensing and improving the accuracy of bird's-eye view sensing.
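A minimal sketch of such feature-level fusion, assuming both features have already been projected onto the same bird's-eye view grid, is to concatenate them along the channel axis and mix them with a small convolutional block; channel sizes are assumptions.

```python
import torch
import torch.nn as nn

class BevFeatureFusion(nn.Module):
    def __init__(self, c_ir=128, c_other=128, c_out=128):
        super().__init__()
        self.mix = nn.Sequential(
            nn.Conv2d(c_ir + c_other, c_out, kernel_size=3, padding=1),
            nn.BatchNorm2d(c_out),
            nn.ReLU(inplace=True),
        )

    def forward(self, bev_ir, bev_other):   # both (B, C, H, W) on the same BEV grid
        return self.mix(torch.cat([bev_ir, bev_other], dim=1))

fused = BevFeatureFusion()(torch.randn(1, 128, 200, 200), torch.randn(1, 128, 200, 200))
print(fused.shape)                           # torch.Size([1, 128, 200, 200])
```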
The above embodiment fuses bird's-eye view features in BEV space based on the infrared images captured by the infrared aerial view sensing device 10 and the environment sensing data collected by the vehicle's own devices. In other embodiments, the bird's-eye view sensing method further includes performing bird's-eye view sensing separately based on the data from the infrared aerial view sensing device 10 and from the other environment sensing devices of the vehicle 20, and then performing decision-level fusion on the sensing results to obtain the final bird's-eye view sensing result. Specifically, in such embodiments S04 includes obtaining a first target bird's-eye view feature of the surroundings based on the infrared image data, and obtaining a second target bird's-eye view feature of the surroundings based on the surround-view visible light image, the lidar point cloud data and/or the millimeter-wave radar point cloud data, where the second target bird's-eye view feature is obtained from the environment sensing data collected by the vehicle's own devices. S06 then includes: performing environment sensing based on the first target bird's-eye view feature to obtain a first bird's-eye view sensing result of the surroundings, performing environment sensing based on the second target bird's-eye view feature to obtain a second bird's-eye view sensing result of the surroundings, and fusing the two to obtain the bird's-eye view sensing result.
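As a toy illustration of such decision-level fusion for 3D detections, the sketch below pools the two result lists and greedily keeps the higher-confidence box when two centers are closer than a threshold; the detection format and the 1 m threshold are assumptions, and a production system would use a proper matching or NMS scheme.

```python
import numpy as np

def fuse_detections(dets_ir, dets_other, dist_thresh=1.0):
    # each detection: {"center": np.ndarray of shape (3,), "score": float, ...}
    pooled = sorted(dets_ir + dets_other, key=lambda d: d["score"], reverse=True)
    fused = []
    for det in pooled:
        if all(np.linalg.norm(det["center"] - kept["center"]) > dist_thresh for kept in fused):
            fused.append(det)               # keep; any closer, lower-scoring duplicate is dropped
    return fused

ir = [{"center": np.array([10.0, 2.0, 0.0]), "score": 0.9}]
other = [{"center": np.array([10.3, 2.1, 0.0]), "score": 0.7},
         {"center": np.array([-5.0, 1.0, 0.0]), "score": 0.8}]
print(len(fuse_detections(ir, other)))      # 2: the near-duplicate detection is merged away
```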
The bird's-eye view sensing method provided by the embodiments of the application can thus perform bird's-eye view sensing based on the infrared images synchronously captured at different acquisition angles by the infrared aerial view sensing device 10 alone, or perform multi-sensor fusion sensing based on the infrared aerial view sensing device 10 together with the environment sensing devices of the vehicle 20, improving the accuracy of environment sensing.
As shown in fig. 7, in some embodiments the application also provides a controller comprising a processor 100 and a memory 200. The steps of the bird's-eye view sensing method of any embodiment of the application are performed by the processor 100 when it executes the computer program instructions stored in the memory 200. The controller achieves the same technical effects as the bird's-eye view sensing method provided by the embodiments of the application, which are not repeated here. The controller may specifically be a vehicle controller arranged in the vehicle 20 and communicatively connected to each infrared camera in the infrared aerial view sensing device 10 provided by the embodiments of the application.
Furthermore, the application also provides a computer-readable storage medium storing computer program instructions; when executed by a processor, the computer program instructions implement the steps of the bird's-eye view sensing method of any embodiment of the application. The computer-readable storage medium achieves the same technical effects as the bird's-eye view sensing method provided by the embodiments of the application, which are not repeated here.
The processor may be a CPU (Central Processing Unit), an ASIC (Application Specific Integrated Circuit), or one or more integrated circuits configured to implement embodiments of the present application. The one or more processors included in the controller may be of the same type, such as one or more CPUs, or of different types, such as one or more CPUs together with one or more ASICs or FPGAs.
The memory may include high-speed RAM (Random Access Memory) and may further include NVM (Non-Volatile Memory), such as at least one magnetic disk memory.
The foregoing is merely illustrative of embodiments of the present application and is not intended to limit it; any variation or substitution that a person skilled in the art could readily conceive of within the scope disclosed herein shall fall within the protection scope of the application. Therefore, the protection scope of the application shall be subject to the protection scope of the claims.

Claims (10)

1. An infrared aerial view sensing device, characterized by comprising a mounting body and a plurality of infrared cameras arranged on the mounting body;
the mounting body is used to fit the infrared aerial view sensing device to a vehicle;
the plurality of infrared cameras are arranged in a ring and synchronously capture infrared images of the vehicle's surroundings from different acquisition viewing angles; the infrared images can be stitched into an omnidirectional infrared image of the surroundings, and an omnidirectional aerial view sensing result of the surroundings can be obtained based on the infrared images.
2. The infrared aerial view sensing device of claim 1, wherein a plurality of the infrared cameras are equally spaced along a circumferential side of the mounting body.
3. The infrared aerial view sensing device of claim 2, wherein the cross section of the mounting body is a regular hexagon, and the infrared cameras are arranged corresponding respectively to six circumferential sides of the mounting body.
4. A bird's eye view sensing method, comprising:
acquiring environment sensing data of the surroundings of a vehicle, the environment sensing data comprising infrared images obtained by synchronously capturing the surroundings from different acquisition viewing angles using the infrared aerial view sensing device according to any one of claims 1 to 3;
obtaining a target bird's eye view feature of the surroundings based on the environment sensing data comprising the infrared images;
and performing environment sensing based on the target bird's eye view feature to obtain a bird's eye view sensing result of the surroundings.
5. The bird's eye view sensing method according to claim 4, wherein obtaining the target bird's eye view feature of the surroundings based on the environment sensing data comprising the infrared images comprises:
extracting features from the infrared images to obtain corresponding infrared image features;
performing dimension lifting processing on the infrared image features to obtain corresponding view cone point features;
generating a fixed bird's eye view grid, and converting the view cone point features into an initial bird's eye view feature based on the bird's eye view grid;
and performing multi-scale feature fusion processing based on the initial bird's eye view feature to obtain the target bird's eye view feature.
6. The bird's eye view sensing method according to claim 4, wherein performing environment sensing based on the target bird's eye view feature to obtain the bird's eye view sensing result of the surroundings comprises:
performing 3D target detection based on the target bird's eye view feature to obtain a 3D target detection result of targets in the surroundings.
7. The bird's eye view sensing method according to claim 4, wherein the environment sensing data further comprise a surround-view visible light image, lidar point cloud data and/or millimeter-wave radar point cloud data, and obtaining the target bird's eye view feature of the surroundings based on the environment sensing data comprising the infrared images comprises:
obtaining an infrared bird's eye view feature of the surroundings based on the infrared image data;
obtaining a bird's eye view feature to be fused of the surroundings based on the surround-view visible light image, the lidar point cloud data and/or the millimeter-wave radar point cloud data;
and fusing the infrared bird's eye view feature with the bird's eye view feature to be fused to obtain the target bird's eye view feature.
8. The bird's eye view sensing method according to claim 4, wherein the environment sensing data further comprise a surround-view visible light image, lidar point cloud data and/or millimeter-wave radar point cloud data, and obtaining the target bird's eye view feature of the surroundings based on the environment sensing data comprising the infrared images comprises:
obtaining a first target bird's eye view feature of the surroundings based on the infrared image data;
obtaining a second target bird's eye view feature of the surroundings based on the surround-view visible light image, the lidar point cloud data and/or the millimeter-wave radar point cloud data;
and performing environment sensing based on the target bird's eye view feature to obtain the bird's eye view sensing result of the surroundings comprises:
performing environment sensing based on the first target bird's eye view feature to obtain a first bird's eye view sensing result of the surroundings, and performing environment sensing based on the second target bird's eye view feature to obtain a second bird's eye view sensing result of the surroundings;
and fusing the first bird's eye view sensing result and the second bird's eye view sensing result to obtain the bird's eye view sensing result.
9. A controller comprising a processor and a memory for storing computer program instructions, the processor, when executing the computer program instructions, performing the bird's eye view sensing method according to any of claims 4 to 8.
10. A computer readable storage medium, wherein the computer readable storage medium stores computer program instructions;
the computer program instructions, when executed by a processor, implement the bird's eye view sensing method of any of claims 4 to 8.
CN202410274886.2A 2024-03-12 2024-03-12 Infrared aerial view sensing device, aerial view sensing method, controller and medium Pending CN118135521A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410274886.2A CN118135521A (en) 2024-03-12 2024-03-12 Infrared aerial view sensing device, aerial view sensing method, controller and medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202410274886.2A CN118135521A (en) 2024-03-12 2024-03-12 Infrared aerial view sensing device, aerial view sensing method, controller and medium

Publications (1)

Publication Number Publication Date
CN118135521A 2024-06-04

Family

ID=91242449

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410274886.2A Pending CN118135521A (en) 2024-03-12 2024-03-12 Infrared aerial view sensing device, aerial view sensing method, controller and medium

Country Status (1)

Country Link
CN (1) CN118135521A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination