CN113741429A - Automatic driving method and system based on infrared polarization image sensor - Google Patents

Info

Publication number
CN113741429A
Authority
CN
China
Prior art keywords
vehicle
polarization image
infrared polarization
infrared
motion control
Prior art date
Legal status
Pending
Application number
CN202110937088.XA
Other languages
Chinese (zh)
Inventor
赵照
林晓东
Current Assignee
Hefei Xinfoo Sensor Technology Co ltd
Original Assignee
Hefei Xinfoo Sensor Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Hefei Xinfoo Sensor Technology Co ltd
Priority to CN202110937088.XA
Publication of CN113741429A
Legal status: Pending

Classifications

    • G05D1/0246 — Control of position or course in two dimensions, specially adapted to land vehicles, using optical position detecting means: a video camera in combination with image processing means
    • G01S17/89 — Lidar systems specially adapted for mapping or imaging
    • G01S17/931 — Lidar systems specially adapted for anti-collision purposes of land vehicles
    • G06T19/003 — Navigation within 3D models or images
    • G06T19/20 — Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • G06T7/246 — Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06T7/73 — Determining position or orientation of objects or cameras using feature-based methods
    • G06T2207/10048 — Image acquisition modality: infrared image
    • G06T2207/30252 — Subject of image: vehicle exterior; vicinity of vehicle
    • G06T2219/2004 — Aligning objects, relative positioning of parts

Abstract

The application discloses an automatic driving method and system based on an infrared polarization image sensor. A sensor system containing the infrared polarization image sensor acquires a three-dimensional infrared polarization image of the vehicle's surroundings that carries polarization information. A machine learning unit in the host system determines the category of each target object in the image, measures the distance to the target with a vision algorithm, and predicts the target's motion. It then sends a motion control signal to the motion control system, which adjusts the motion state of the vehicle to realize the automatic driving function. Because the contrast between target and background in an infrared polarization image is high and the target's detail features are prominent, target recognition is markedly improved.

Description

Automatic driving method and system based on infrared polarization image sensor
Technical Field
The application relates to the field of automatic driving, in particular to an automatic driving method and system based on an infrared polarization image sensor.
Background
Ranging is a core function of vehicle automatic driving, and the most common technique is laser radar (LIDAR). As vehicles with automatic driving become commercially widespread, however, many vehicles on the road scan with the same kind of radar and interfere with one another: a radar easily confuses pulses emitted by nearby vehicles with the echoes of its own signals, which can cause radar failure and even threaten the safety of vehicles and people. In addition, the detection range of automotive lidar is limited; measuring objects beyond about 150 meters is difficult, and the short detection range restricts vehicle speed, which is unfavorable for high-speed driving. Finally, lidar struggles to work in all weather and often cannot operate normally in strong light, rain, or fog.
Disclosure of Invention
In view of the above, the present application provides an automatic driving method and system based on an infrared polarization image sensor.
To solve this technical problem, the application adopts the following technical scheme:
in a first aspect of the application, an automatic driving method based on an infrared polarization image sensor is provided, and the method comprises the following steps:
a sensor system comprising an infrared polarization image sensor acquires an infrared polarization image of the vehicle surroundings;
the vehicle positioning system acquires position information, route information and motion state of the vehicle when the infrared polarization image is obtained;
the host system receives the infrared polarization image, the position information, the route information and the motion state, and sends a motion control signal to a motion control system according to the infrared polarization image, the position information, the route information and the motion state;
and the motion control system receives the motion control signal and adjusts the motion state of the vehicle according to the motion control signal.
Preferably, the sending a motion control signal to a motion control system according to the infrared polarization image, the position information, the route information, and the motion state includes:
identifying target objects in the infrared polarization image, determining the horizontal distance between a target object and the vehicle according to the infrared polarization image and the position information, and sending a motion control signal to the motion control system according to the target object's category, the horizontal distance, the route information, and the motion state.
Preferably, the determining the horizontal distance between the target object and the vehicle according to the infrared polarization image and the position information includes:
determining the coordinate values of the target object in two successively acquired images, determining the straight-line distance between the two vehicle positions from the position information recorded when the two images were captured, and determining the horizontal distance between the target object and the vehicle from the two coordinate values and the straight-line distance.
Preferably, the acquiring an infrared polarization image of the vehicle surroundings includes:
and acquiring original data of the surrounding environment of the vehicle, and performing three-dimensional reconstruction on the original data of the infrared polarization image to obtain the infrared polarization image comprising three-dimensional information.
In a second aspect of the present application, there is provided an infrared polarization image sensor-based automatic driving system, the system comprising:
a sensor system including an infrared polarization image sensor for acquiring an infrared polarization image of an environment around the vehicle;
the vehicle positioning system is used for recording the position information, the route information and the motion state of the vehicle when the infrared polarization image is acquired;
the host system is used for receiving the infrared polarization image, the position information, the route information and the motion state and sending a motion control signal to a motion control system according to the infrared polarization image, the position information, the route information and the motion state;
and the motion control system is used for receiving the motion control signal and adjusting the motion state of the vehicle according to the motion control signal.
Preferably, the infrared polarization image sensor comprises a plurality of pixel groups, and each pixel group comprises four pixels with polarization angles of 0 °, 45 °, 90 ° and 135 °, respectively.
Preferably, the host system includes a display device interface for transmitting the infrared polarized image to an external display device electrically connected to the display device interface.
Preferably, the host system comprises a convolutional neural network-based machine learning unit.
Compared with the prior art, the method has the following beneficial effects:
based on the technical scheme, the automatic driving method and system based on the infrared polarization image sensor comprise the steps that the sensor system of the infrared polarization image sensor obtains a three-dimensional infrared polarization image containing polarization information of the surrounding environment of a vehicle, the machine learning unit in the host system determines the type of a target object of the infrared polarization image, the target object is subjected to distance measurement based on a visual algorithm, the motion state of the target object is predicted, and a motion control signal is sent to the motion control system to adjust the motion state of the vehicle so as to achieve the automatic driving function. The contrast ratio of the target and the background in the infrared polarization image is high, the detail characteristics of the target are prominent, and the target identification effect is obvious. In addition, the three-dimensional infrared polarization image containing the polarization information and the calculated target object distance information can be transmitted to a system external display screen, so that personnel in the vehicle can conveniently, comprehensively and deeply know the surrounding environment information of the vehicle.
Drawings
To more clearly illustrate the embodiments of the present application and the technical solutions in the prior art, the drawings needed for describing them are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application; those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a flowchart of an automatic driving method based on an infrared polarization image sensor according to an embodiment of the present application.
Fig. 2 is a schematic diagram of a comparison between a conventional infrared image and an infrared polarization image provided in an embodiment of the present application.
Fig. 3 is a schematic diagram of a wire grid layer structure of an infrared polarization image sensor including different polarization angles according to an embodiment of the present application.
Fig. 4 is a flowchart of an example of a process for acquiring an infrared polarization image of an environment around a vehicle according to an embodiment of the present application.
Fig. 5 is a flowchart of an example of a process for issuing a motion control signal according to an infrared polarization image, position information, route information, and a motion state provided by an embodiment of the present application.
Fig. 6 is a schematic diagram of monocular visual ranging according to an embodiment of the present application.
Fig. 7 is a schematic diagram of an automatic driving system framework based on an infrared polarization image sensor according to an embodiment of the present application.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in detail below.
In the following description, numerous specific details are set forth to provide a thorough understanding of the present invention. The invention may, however, be practiced in ways other than those specifically described here, as will be apparent to those of ordinary skill in the art without departing from its spirit; the invention is therefore not limited to the specific embodiments disclosed below.
It is noted that, herein, relational terms such as first and second may be used solely to distinguish one entity or action from another without necessarily requiring or implying any actual relationship or order between them. The terms "comprises," "comprising," and any variation thereof are intended to cover a non-exclusive inclusion, so that an article or device comprising a list of elements includes not only those elements but may also include others not expressly listed. Without further limitation, an element introduced by the phrase "comprising a …" does not exclude the presence of additional identical elements in the article or device that comprises it.
Before describing the embodiments provided herein, polarized-light imaging is first introduced. The electric-field vibration of polarized light either has a fixed direction or changes direction regularly; accordingly, polarized light is classified as linearly, circularly, or elliptically polarized.
When natural light strikes an object, the amplitudes of the perpendicular and parallel components of the electric vector in the reflected light change, so the reflected light is no longer isotropic natural light but partially or linearly polarized light, with polarization characteristics determined by the properties of the reflecting surface and by the object's electromagnetic radiation process. In the thermal infrared band, different objects, or the same object in different states (roughness, water content, physical and chemical properties of the material, and so on), often exhibit different polarization states. Infrared polarization imaging exploits the polarization information that a target radiates or reflects: it captures multi-dimensional characteristics of the target such as intensity, polarization, and image structure, effectively raises the contrast between target and background, highlights the target's detail features, strengthens target recognition, and gives a fuller, deeper picture of the target's attributes and behavior.
Whereas a traditional thermal imager measures an object's radiation intensity, polarization measurement captures the contrast of that radiation across different polarization directions, so infrared polarization imaging can also distinguish objects that have the same radiation intensity but different polarization characteristics.
The infrared degree of polarization of natural terrain backgrounds is small (generally below 1.5%), while that of metal targets is comparatively large, reaching 2%-7%. The degree of polarization of a mostly-metal vehicle therefore differs greatly from that of the terrain background, and infrared polarization imaging has a clear advantage over traditional infrared imaging in identifying vehicle targets against such a background.
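The degree-of-polarization figures above can be made concrete. The sketch below applies the standard Stokes-vector formula for the degree of linear polarization (DoLP) to four intensities such as a 0°/45°/90°/135° pixel group delivers; the intensity values are invented for illustration and are not measured data.

```python
import math

def degree_of_linear_polarization(i0, i45, i90, i135):
    """DoLP from four polarizer-angle intensities via the Stokes parameters.

    S0 = (I0 + I45 + I90 + I135) / 2   (total intensity, averaged)
    S1 = I0 - I90                      (horizontal vs. vertical preference)
    S2 = I45 - I135                    (diagonal preference)
    DoLP = sqrt(S1^2 + S2^2) / S0
    """
    s0 = (i0 + i45 + i90 + i135) / 2.0
    s1 = i0 - i90
    s2 = i45 - i135
    return math.sqrt(s1 * s1 + s2 * s2) / s0

# Illustrative pixels: a weakly polarizing terrain pixel vs. a metal (vehicle) pixel.
background = degree_of_linear_polarization(100.0, 100.5, 99.5, 100.0)  # well below 1.5%
vehicle = degree_of_linear_polarization(104.0, 100.0, 96.0, 100.0)     # in the 2%-7% band
```

With these invented intensities the background pixel comes out around 0.35% and the vehicle pixel at 4%, matching the contrast the text describes.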
Referring to fig. 1, fig. 1 is a flowchart of an automatic driving method based on an infrared polarization image sensor according to an embodiment of the present application. The process 100 includes: s101, acquiring an infrared polarization image of the surrounding environment of the vehicle; s102, acquiring position information, route information and motion state of the vehicle when the infrared polarization image is obtained; s103, sending a motion control signal according to the infrared polarization image, the position information, the route information and the motion state; and S104, adjusting the motion state of the vehicle according to the motion control signal.
In S101, a sensor system including an infrared polarization image sensor acquires an infrared polarization image of the vehicle surroundings.
The infrared polarization image sensor receives electromagnetic radiation in the thermal spectral range reflected or emitted by objects around the vehicle (for example, electromagnetic waves with wavelengths of 8 to 14 micrometers), converts the optical signal into raw data in the form of an electrical signal through a thermosensitive film, processes the raw data into an infrared image containing the polarization information of the reflecting objects, and sends the resulting infrared polarization image to the host system. Referring to fig. 2, which compares a conventional infrared image with an infrared polarization image, the resulting infrared polarization image is a depth image with three-dimensional information and contains multi-dimensional information about the target such as intensity and polarization; the contrast between target and background is high, the target's detail features are prominent, target recognition is excellent, and the target's attributes and behavior can be understood more fully and deeply.
The sensor system includes at least one infrared polarization image sensor mounted on the vehicle. The vehicle may be an automobile, a motorcycle, or a robot with a motion function. The infrared polarization image sensor includes a polarizing device, which may be, for example, a crystal, a dichroic element, a thin film, or a wire grid. The polarizing device can be integrated into the sensor's pixel structure or placed in the optical module. It has a structure with a specific polarization angle, so that after passing through it, light has a fixed vibration direction or a regularly changing one. In one embodiment, the polarizing device is a wire grid layer integrated on the upper layer of the MEMS pixels of the infrared polarization image sensor, and the sensor comprises a plurality of pixel groups, each consisting of four adjacent pixels. Referring to fig. 3, which shows the wire grid layer structure with different polarization angles, the wire grid regions above the four pixels have mutually different polarization angles of 0°, 45°, 90°, and 135°.
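A raw frame from such a mosaic sensor interleaves the four polarization angles in every 2x2 pixel group. The sketch below splits a raw frame into four per-angle sub-images by strided slicing; the particular 2x2 layout used here is an assumption for illustration, since the actual arrangement is sensor-specific.

```python
import numpy as np

def split_polarization_mosaic(raw):
    """Split a 2x2 polarizer-mosaic frame into four per-angle sub-images.

    Assumed layout of each pixel group (an assumption; real layouts vary):
        [  0°,  45°]
        [135°,  90°]
    Each returned sub-image has half the rows and half the columns of `raw`.
    """
    i0 = raw[0::2, 0::2]
    i45 = raw[0::2, 1::2]
    i135 = raw[1::2, 0::2]
    i90 = raw[1::2, 1::2]
    return i0, i45, i90, i135
```

The four sub-images can then feed the Stokes-vector reconstruction that merges each group into a single reconstructed pixel.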
In S102, the vehicle positioning system acquires position information, route information, and motion state of the vehicle when the infrared polarization image is obtained.
The vehicle positioning system can be based on the Global Positioning System (GPS) or the BeiDou Navigation Satellite System (BDS) and receives data transmitted by satellites. When the host system receives an infrared polarization image output by the sensor system, it sends a request signal to the vehicle positioning system, which responds with the vehicle's position information, current route information, and motion state. The position information may be longitude, latitude, or altitude; the route information may describe an intersection, a turning lane, and the like; and the motion state may be speed or direction.
In S103, the host system receives the infrared polarization image, the position information, the route information, and the motion state, and sends a motion control signal to the motion control system according to the infrared polarization image, the position information, the route information, and the motion state.
the host system includes a host interface and a machine learning unit. The sensor system is connected to the host interface, and the infrared polarized image frames are transmitted to the host system via the host interface. The host interface may be one of a variety of high speed or high bandwidth interfaces, for example, the host interface may be a peripheral component interconnect bus (PCI).
The machine learning unit may include a convolutional neural network, a support vector machine, or a Bayesian network. It receives the infrared polarization images from the sensor system and the vehicle position information, route information, and vehicle motion state from the vehicle positioning system. The unit is trained to compute a motion control signal from these inputs and transmits the signal to the motion control system, which then adjusts the vehicle's motion state.
In S104, the motion control system receives the motion control signal and adjusts the motion state of the vehicle according to the motion control signal.
The motion control system may include a powertrain, a braking system, and a steering system. The motion control system is configured to receive motion control signals output by the host system, and according to the motion control signals, the power system, the braking system or the steering system make corresponding adjustment to change the motion state of the vehicle so as to realize the automatic driving function.
The power system of a vehicle includes a power source (e.g., a battery or an internal combustion engine) connected to the wheels through the drivetrain; it rotates the wheels and controls their rotational speed so the vehicle can run, accelerate, or decelerate. The steering system steers the vehicle by angling the wheels relative to the body, for example by controlling the vehicle's angular velocity or yaw angle. The braking system applies force to the wheels to brake them to a certain degree, forcing the moving vehicle to slow down or stop.
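The routing of one motion control signal to the three subsystems can be sketched as below. The patent does not specify the signal's format, so the fields and the dispatch rule (brake takes priority over throttle) are illustrative assumptions only.

```python
from dataclasses import dataclass

@dataclass
class MotionControlSignal:
    # Illustrative fields only; the actual signal format is not specified.
    throttle: float            # 0..1, requested drive torque fraction
    brake: float               # 0..1, requested braking force fraction
    steering_angle_deg: float  # requested wheel angle, positive = left

def dispatch(signal):
    """Route a motion control signal to the power, braking, and steering systems.

    Braking takes priority over throttle (an assumed safety convention);
    a nonzero steering angle always reaches the steering system.
    """
    actions = []
    if signal.brake > 0:
        actions.append(("braking_system", signal.brake))
    elif signal.throttle > 0:
        actions.append(("power_system", signal.throttle))
    if signal.steering_angle_deg != 0:
        actions.append(("steering_system", signal.steering_angle_deg))
    return actions
```

For example, a signal requesting half braking while steering 3° right would reach the braking and steering systems but not the power system.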
FIG. 4 is a flow chart of an example of a process 400 for acquiring infrared polarization images of the vehicle surroundings. Specifically, process 400 includes: S401, acquiring raw image data of the vehicle's surroundings; and S402, reconstructing the raw infrared polarization data to obtain an infrared polarization image.
In S401, an infrared polarization image sensor in the sensor system receives electromagnetic radiation reflected from objects in the vehicle's surroundings, converts the optical signal into an electrical signal through a thermosensitive film, and reads out the electrical signal through a readout circuit to obtain raw image data. Because the sensor includes a polarizing device with specific polarization angles, the raw data carries polarization information about objects around the vehicle. In one embodiment, the sensor comprises a plurality of pixel groups of four adjacent pixels each, whose upper wire grid regions have mutually different polarization angles of 0°, 45°, 90°, and 135°, so the raw image data contains information for all four polarization angles.
In S402, three-dimensional reconstruction is performed on the original infrared polarization image data to obtain an infrared polarization image including three-dimensional information.
The sensor system comprises an image processing module, and the image raw data is transmitted to the image processing module through a sensor interface to be subjected to three-dimensional reconstruction. The image processing module may be one or more processors, such as a Digital Signal Processor (DSP), a Field Programmable Gate Array (FPGA), an Application Specific Integrated Circuit (ASIC), a graphics processor, and so forth.
In one embodiment, the reconstruction merges four adjacent pixels with different polarization angles into one pixel. Specifically, the image processing module reconstructs the target's Stokes vector from the raw data of the four pixels using the Stokes formulas and computes the degree of polarization of the pixel to be reconstructed from the Stokes vector. From the degree of polarization it then computes the incident azimuth and zenith angles of the corresponding target surface, from which the surface normal can be derived. Finally, the three-dimensional structure of the target is recovered from the normal field by surface integration with the Frankot-Chellappa algorithm, yielding an infrared polarization image with three-dimensional information. The finished image is then transmitted to the host system through the data transmission module of the sensor system.
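The surface-integration step names the Frankot-Chellappa algorithm, which projects a (possibly inconsistent) gradient field onto the nearest integrable surface in the Fourier domain. A minimal NumPy sketch, assuming a periodic gradient field p = dz/dx, q = dz/dy sampled on a regular grid with unit spacing:

```python
import numpy as np

def frankot_chellappa(p, q):
    """Recover a surface z from its gradient field (p = dz/dx, q = dz/dy)
    by least-squares integration in the Fourier domain (Frankot-Chellappa).
    Assumes periodic boundary conditions and unit grid spacing."""
    rows, cols = p.shape
    wx = 2.0 * np.pi * np.fft.fftfreq(cols)  # angular frequency per column
    wy = 2.0 * np.pi * np.fft.fftfreq(rows)  # angular frequency per row
    WX, WY = np.meshgrid(wx, wy)
    P, Q = np.fft.fft2(p), np.fft.fft2(q)
    denom = WX ** 2 + WY ** 2
    denom[0, 0] = 1.0   # avoid 0/0 at the DC term
    Z = (-1j * WX * P - 1j * WY * Q) / denom
    Z[0, 0] = 0.0       # the mean height is unrecoverable; pin it to zero
    return np.real(np.fft.ifft2(Z))
```

Feeding it the gradients derived from the polarization-estimated surface normals yields the three-dimensional target structure described above, up to an overall height offset.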
Fig. 5 is a flow chart of an example of a process 500 for issuing a motion control signal based on an infrared polarization image, location information, route information, and motion state. Specifically, the process 500 includes: S501, determining the target object category from the infrared polarization image; S502, determining the horizontal distance between the target object and the vehicle from the infrared polarization image and the position information; and S503, sending the motion control signal according to the target object category, the horizontal distance, the route information, and the motion state.
In S501, the recognition unit of the machine learning unit in the host system identifies target objects in the infrared polarization image.
The host system includes a host interface and a machine learning unit. The data transmission module of the sensor system is connected to the host interface, and the infrared polarization image frames are sent through it to the recognition unit in the machine learning unit. Determining a target object in an infrared polarization image takes two steps: first, the recognition unit detects pixel clusters corresponding to thermal radiation from objects; then the pre-trained recognition unit determines the specific class of each target object from the image.
The recognition unit may detect one or more objects as clusters of pixels of reflected thermal radiation from objects in the vehicle's surroundings (e.g., people, animals, vehicles, or obstacles) in the infrared polarization image, and it may use object recognition algorithms (e.g., the Felzenszwalb image segmentation algorithm), video tracking, and other computer vision techniques to detect pixel clusters associated with objects appearing in the infrared polarization sensor's field of view.
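The cluster-detection step can be sketched with a simple flood fill that groups above-threshold "hot" pixels into 4-connected clusters. This is a stand-in for illustration only: the Felzenszwalb algorithm the text cites is a graph-based segmentation method, which this threshold-plus-labeling sketch merely approximates.

```python
from collections import deque

def hot_pixel_clusters(image, threshold):
    """Group above-threshold pixels into 4-connected clusters via BFS labeling.

    `image` is a 2-D list of intensities; returns a list of clusters,
    each a list of (row, col) pixel coordinates.
    """
    rows, cols = len(image), len(image[0])
    seen = [[False] * cols for _ in range(rows)]
    clusters = []
    for r in range(rows):
        for c in range(cols):
            if image[r][c] >= threshold and not seen[r][c]:
                queue, members = deque([(r, c)]), []
                seen[r][c] = True
                while queue:
                    y, x = queue.popleft()
                    members.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and image[ny][nx] >= threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                clusters.append(members)
    return clusters
```

Each returned cluster is a candidate object region that the pre-trained classifier would then label.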
The identification unit can classify objects appearing in the field of view of the infrared polarization sensor and determine the category of the target object. The unit's model is trained in advance on input training data and is configured to determine the category of a specific object from the image. The training data may be a limited number of examples of the specific object or of its image features; the classification categories may include human, animal, traffic sign, static object, or dynamic object. The identification unit may comprise a convolutional neural network, a support vector machine, or a Bayesian network.
In S502, the ranging unit in the machine learning unit determines the horizontal distance between the target object and the vehicle based on the infrared polarization image and the vehicle position information.
After the host system receives the infrared polarization image output by the sensor system, it sends a request signal to the vehicle positioning system, which returns the position information of the vehicle (e.g., longitude, latitude, or altitude) and the motion state of the vehicle (e.g., speed or heading). From the longitude and latitude of the vehicle at the moments the two successive infrared polarization images were acquired, the ranging unit can compute, via the distance formula between two points, the straight-line distance d between the two capture positions.
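A common way to realize the "distance formula between two points" for longitude/latitude pairs is the haversine great-circle distance. This is one plausible implementation under a spherical-Earth assumption, not necessarily the formula the patent has in mind.

```python
import math

def straight_line_distance_m(lat1, lon1, lat2, lon2):
    """Haversine distance in meters between the two vehicle positions
    (degrees latitude/longitude) at which successive frames were taken."""
    r = 6_371_000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))
```

One degree of latitude comes out near 111 km, and identical positions give zero, which is enough of a sanity check for frame-to-frame baselines of a few meters.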
The ranging unit detects and classifies objects in the infrared polarization images, obtains the coordinates of the ranging target in two successive frames, and from the two coordinate values and the straight-line distance d calculates the horizontal distance x between the vehicle and the target object at the time the later frame was captured.
In some embodiments, the ranging unit uses similar-triangle monocular ranging. The measured distance x between the vehicle and the target object is the distance between a point on the three-dimensional target in the world coordinate system and the center of the lens. Referring to fig. 6, two pixel offsets L1 and L2 are obtained from the coordinates of the target in the two frames; L1 and L2 are the height differences, on the respective two-dimensional image planes, between the projected target point and the horizontal plane through the camera's optical axis. From the geometry of similar triangles, the formula x = d × L1 / (L2 − L1) follows, and the horizontal distance x between the vehicle and the target object at the later frame is computed with it. It should be noted that the ranging unit may also use other ranging algorithms to compute the distance between the target object and the vehicle.
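The similar-triangle formula can be encoded directly. The guard against L2 ≤ L1 reflects the geometric assumption that the target appears larger (further from the optical axis) in the later frame as the vehicle approaches; the function name is illustrative.

```python
def horizontal_distance(d, l1, l2):
    """Similar-triangle monocular ranging: x = d * L1 / (L2 - L1).

    d  -- straight-line distance the vehicle moved between the frames
    l1 -- target pixel offset from the optical axis in the earlier frame
    l2 -- target pixel offset from the optical axis in the later frame
    """
    if l2 <= l1:
        raise ValueError("target must appear larger in the later frame")
    return d * l1 / (l2 - l1)
```

For example, if the vehicle advances d = 10 m while the offset grows from 50 to 100 pixels, the target is 10 m ahead at the later frame.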
In S503, the prediction unit in the machine learning unit transmits a motion control signal to the motion control system according to the target object class, the horizontal distance between the vehicle and the target object, the vehicle motion state, and the vehicle position information.
The position information that the prediction unit acquires from the vehicle positioning system includes route information, which may describe road features such as intersections and turning lanes. The prediction unit is trained in advance on route information and on training data containing specific objects, and is configured to predict the motion state of the target object from the input route information and the identified target object. Combining the computed horizontal distance between the target object and the vehicle with the vehicle's current motion state, it outputs a motion control signal that adjusts the vehicle's motion state to realize automatic driving.
For example, in S501 the identification unit identifies a target vehicle traveling opposite to the own vehicle's direction of travel, and in S502 the ranging unit measures the distance between the target vehicle and the own vehicle. In S503, given the input route information, if the target vehicle is in a dedicated turning lane and the intersection has no traffic lights, the prediction unit may predict that the target vehicle is likely to turn and may predict its turning route. If that turning route intersects the own vehicle's route and the own vehicle's current speed is high, the prediction unit sends a deceleration motion control signal to the motion control system, slowing the vehicle and avoiding a traffic accident.
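The decision in this example can be caricatured as a simple rule. All names, inputs, and the speed threshold below are illustrative; in the patent the prediction is made by a trained model, not a hand-written rule.

```python
def motion_control_signal(target_class, target_may_turn,
                          routes_intersect, own_speed_mps,
                          safe_speed_mps=8.0):
    """Toy decision rule for the turning-lane scenario: command
    deceleration only if an oncoming vehicle may turn across our
    route while our speed exceeds an (arbitrary) safe threshold."""
    if (target_class == "vehicle" and target_may_turn
            and routes_intersect and own_speed_mps > safe_speed_mps):
        return "decelerate"
    return "maintain"
```

At 15 m/s the rule commands deceleration; at 5 m/s the current state is held.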
In some embodiments, the host system of the automatic driving system based on the infrared polarization image sensor provided by the present application may be electrically connected to an external device, such as an input/output device: a display screen, a mouse, a keyboard, a USB device, or the like.
In some embodiments, the host system further includes a display device interface. After the ranging unit in the host system finishes ranging the target object, the infrared polarization image and the target distance information are sent through the display device interface to a display screen outside the system, so that occupants can conveniently and thoroughly grasp the vehicle's surroundings; the displayed image is an infrared image carrying the target's three-dimensional information and distance information.
Fig. 7 is a schematic diagram of an automatic driving system framework based on an infrared polarization image sensor according to an embodiment of the present application. The system includes a sensor system, a vehicle positioning system, a host system, and a motion control system. The sensor system comprises an infrared polarization sensor, a sensor interface, an image processing module and a data transmission module. The host system includes a host interface, a display device interface, and a machine learning unit. The machine learning unit includes an identification unit, a ranging unit, and a prediction unit. The motion control system includes a powertrain, a braking system, and a steering system.
The foregoing is only a preferred embodiment of the present invention; although the invention has been disclosed in terms of preferred embodiments, they are not intended to limit it. Using the methods and technical content disclosed above, those skilled in the art can make numerous possible variations and modifications to the technical solution, or rework it into equivalent embodiments, without departing from the scope of the technical solution. Therefore, any simple modification, equivalent change, or refinement made to the above embodiments in accordance with the technical essence of the present invention, without departing from the content of the technical solution, still falls within the scope of protection of the technical solution.

Claims (8)

1. An autonomous driving method, the method comprising:
a sensor system comprising an infrared polarization image sensor acquires an infrared polarization image of the vehicle surroundings;
the vehicle positioning system acquires position information, route information and motion state of the vehicle when the infrared polarization image is obtained;
the host system receives the infrared polarization image, the position information, the route information and the motion state, and sends a motion control signal to a motion control system according to the infrared polarization image, the position information, the route information and the motion state;
and the motion control system receives the motion control signal and adjusts the motion state of the vehicle according to the motion control signal.
2. The method of claim 1, wherein said issuing a motion control signal to a motion control system based on said infrared polarization image, said location information, said route information, and said motion state comprises:
identifying a target object in the infrared polarization image, determining a horizontal distance between the target object and the vehicle according to the infrared polarization image and the position information, and sending a motion control signal to the motion control system according to the category of the target object, the horizontal distance, the route information, and the motion state.
3. The method of claim 2, wherein said determining a horizontal distance of the target object from the vehicle from the infrared polarized image and the location information comprises:
determining two coordinate values of the target object in two images from the infrared polarization images, determining the straight-line distance between the two positions from the position information of the vehicle at the time each image was captured, and determining the horizontal distance between the target object and the vehicle from the two coordinate values and the straight-line distance.
4. The method of claim 1, wherein said acquiring an infrared polarized image of the vehicle surroundings comprises:
and acquiring original data of the surrounding environment of the vehicle, and performing three-dimensional reconstruction on the original data of the infrared polarization image to obtain the infrared polarization image comprising three-dimensional information.
5. An autopilot system, the system comprising:
a sensor system including an infrared polarization image sensor for acquiring an infrared polarization image of an environment around the vehicle;
the vehicle positioning system is used for recording the position information, the route information and the motion state of the vehicle when the infrared polarization image is acquired;
the host system is used for receiving the infrared polarization image, the position information, the route information and the motion state and sending a motion control signal to a motion control system according to the infrared polarization image, the position information, the route information and the motion state;
and the motion control system is used for receiving the motion control signal and adjusting the motion state of the vehicle according to the motion control signal.
6. The system of claim 5, wherein the infrared polarization image sensor comprises a plurality of pixel groups, each of the pixel groups comprising four pixels having polarization angles of 0 °, 45 °, 90 °, and 135 °, respectively.
7. The system of claim 5, wherein the host system comprises a display device interface for transmitting the infrared polarized image to an external display device electrically connected to the display device interface.
8. The system of claim 5, wherein the host system comprises a convolutional neural network-based machine learning unit.
CN202110937088.XA 2021-08-16 2021-08-16 Automatic driving method and system based on infrared polarization image sensor Pending CN113741429A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110937088.XA CN113741429A (en) 2021-08-16 2021-08-16 Automatic driving method and system based on infrared polarization image sensor

Publications (1)

Publication Number Publication Date
CN113741429A true CN113741429A (en) 2021-12-03


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009130709A (en) * 2007-11-26 2009-06-11 Clarion Co Ltd Near infrared camera system
JP2016168874A (en) * 2015-03-11 2016-09-23 トヨタ自動車株式会社 Road face projection system
CN110959143A (en) * 2017-08-04 2020-04-03 索尼公司 Information processing device, information processing method, program, and moving object
US20200272834A1 (en) * 2017-08-04 2020-08-27 Sony Corporation Information processing apparatus, information processing method, program, and mobile object
CN111158061A (en) * 2019-12-31 2020-05-15 中国船舶重工集团公司第七一七研究所 Multi-dimensional information detection device and measurement method thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination