WO2020154903A1 - 一种确定高程的方法、装置及雷达 - Google Patents

一种确定高程的方法、装置及雷达 Download PDF

Info

Publication number
WO2020154903A1
WO2020154903A1 PCT/CN2019/073735 CN2019073735W WO2020154903A1 WO 2020154903 A1 WO2020154903 A1 WO 2020154903A1 CN 2019073735 W CN2019073735 W CN 2019073735W WO 2020154903 A1 WO2020154903 A1 WO 2020154903A1
Authority
WO
WIPO (PCT)
Prior art keywords
antenna
elevation
target object
imaging information
transmitting antenna
Prior art date
Application number
PCT/CN2019/073735
Other languages
English (en)
French (fr)
Inventor
周鹏
苏箐
陈佳民
吴祖光
王筱治
Original Assignee
华为技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司 filed Critical 华为技术有限公司
Priority to CN201980089284.7A priority Critical patent/CN113302519A/zh
Priority to PCT/CN2019/073735 priority patent/WO2020154903A1/zh
Publication of WO2020154903A1 publication Critical patent/WO2020154903A1/zh

Links

Images

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/93Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/03Details of HF subsystems specially adapted therefor, e.g. common to transmitter and receiver

Definitions

  • This application relates to the field of automatic driving, and in particular to a method, device and radar for determining elevation.
  • Artificial Intelligence is a theory, method, technology and application system that uses digital computers or machines controlled by digital computers to simulate, extend and expand human intelligence, perceive the environment, acquire knowledge, and use knowledge to obtain the best results.
  • artificial intelligence is a branch of computer science that attempts to understand the essence of intelligence and produce a new kind of intelligent machine that can react in a similar way to human intelligence.
  • Artificial intelligence is to study the design principles and implementation methods of various intelligent machines, so that the machines have the functions of perception, reasoning and decision-making.
  • Research in the field of artificial intelligence includes robotics, natural language processing, computer vision, decision-making and reasoning, human-computer interaction, recommendation and search, and basic AI theories.
  • Autonomous driving is a mainstream application in the field of artificial intelligence.
  • Autonomous driving technology relies on the collaboration of computer vision, radar, monitoring devices, and global positioning systems to enable motor vehicles to achieve autonomous driving without requiring human active operations.
  • Self-driving vehicles use various computing systems to help transport passengers from one location to another. Some autonomous vehicles may require some initial input or continuous input from an operator (such as a navigator, driver, or passenger). The self-driving vehicle allows the operator to switch from the manual mode to the automatic driving mode or a mode in between. Since autonomous driving technology does not require humans to drive motor vehicles, it can theoretically effectively avoid human driving errors, reduce traffic accidents, and improve highway transportation efficiency. Therefore, autonomous driving technology has received more and more attention.
  • radar can realize obstacle measurement, collision prediction, adaptive cruise control and other functions in the vehicle environment, effectively reducing driving difficulty and reducing the incidence of accidents, it has been widely used in the automotive field.
  • Figure 1a is a schematic diagram of radar transmitting and receiving signals.
  • the radar transmits the detection signal (electromagnetic wave) and receives the signal reflected by the target object through the antenna, and amplifies and down-converts the signal reflected by the target object to obtain The relative distance and relative speed between the vehicle and the target object are then tracked, identified and classified according to the obtained information. After a reasonable decision, the driver is notified or warned by sound, light, and touch. Or actively intervene in the vehicle in time to ensure the safety and comfort of the driving process and reduce the probability of accidents.
  • the detection signal electromagnetic wave
  • a possible method for determining the elevation is to use Lidar (Light Detection And Ranging, LIDAR) to measure the elevation of the target object.
  • LIDAR Light Detection And Ranging
  • the working principle of Lidar is to transmit a detection signal (laser beam) to the target, and then the received slave
  • the echo signal reflected by the target object is compared with the transmitted signal, and after proper processing, relevant information of the target object can be obtained, such as target distance, azimuth, elevation, speed and other parameters.
  • relevant information of the target object can be obtained, such as target distance, azimuth, elevation, speed and other parameters.
  • lidar has higher requirements on the level of technology and higher cost, which leads to higher costs for using lidar to determine elevation.
  • the present application provides a method, device and radar for determining elevation, which are used to solve the technical problem of high cost caused by using lidar to determine elevation.
  • this application provides a method for determining elevation, including:
  • the first echo signal is that the target object receives the first transmission An echo signal reflected after the first transmission signal transmitted by the antenna, where the second echo signal is the echo signal reflected after the target object receives the second transmission signal transmitted by the second transmission antenna;
  • the elevation of the target object is determined.
  • the first imaging information and the second imaging information can be respectively generated based on the first echo signal and the second echo signal reflected by the target object
  • the elevation of the target object can be determined based on the first imaging information and the second imaging information.
  • the method is simple to implement and has a lower cost than the lidar to determine the elevation.
  • the embodiments of the present application can implement the use of the vehicle-mounted radar to measure the elevation of the target object without changing the main structure of the vehicle-mounted radar.
  • the first transmitting antenna, the second transmitting antenna, and the receiving antenna are located on a straight line perpendicular to the reference plane of the elevation.
  • the receiving antenna is located between the first transmitting antenna and the second transmitting antenna.
  • the average value of the distance between the receiving antenna and the first transmitting antenna and the distance between the receiving antenna and the second transmitting antenna is less than or equal to the wavelength of the first transmitting signal or the wavelength of the second transmitting signal.
  • the wavelength of the first transmit signal and the wavelength of the second transmit signal are the same.
  • determining the elevation of the target object according to the first imaging information and the second imaging information includes:
  • the distance between the receiving antenna and the first transmitting antenna, the distance between the receiving antenna and the second transmitting antenna, and the wavelength of the first transmission signal or the wavelength of the second transmission signal Obtaining the receiving angle at which the receiving antenna receives the first echo signal or the second echo signal;
  • the elevation of the target object is determined.
  • the distance between the receiving antenna and the first transmitting antenna, the distance between the receiving antenna and the second transmitting antenna, and the wavelength of the first transmitting signal or The wavelength of the second transmitted signal to obtain the receiving angle includes:
  • ⁇ R represents the distance difference
  • the distance difference is the difference between the transmission distance of the first transmission signal and the transmission distance of the second transmission signal
  • represents the receiving angle
  • ⁇ h1 represents the distance between the receiving antenna and the first transmitting antenna
  • ⁇ h2 represents the distance between the receiving antenna and the second transmitting antenna
  • the receiving angle conforms to the following formula:
  • determining the elevation of the target object according to the receiving angle includes:
  • the elevation of the target object is calculated by the following formula:
  • h represents the elevation of the target object
  • H represents the distance between the receiving antenna and the reference plane of the elevation
  • R centre represents the distance between the receiving antenna and the target object
  • represents the receiving angle
  • an embodiment of the present application provides a device for determining elevation.
  • the device may be a radar or a semiconductor chip set in the radar.
  • the device has the function of realizing various possible designs of the first aspect. This function can be realized by hardware, or by hardware executing corresponding software.
  • the hardware or software includes one or more units or modules corresponding to the above-mentioned functions.
  • an embodiment of the present application provides a device that includes a processor, a memory, and an instruction stored in the memory and executable on the processor. When the instruction is executed, the device executes the first aspect. The method described.
  • an embodiment of the present application provides a radar, which includes a first transmitting antenna, a second transmitting antenna, a receiving antenna, and the device described in the third aspect.
  • embodiments of the present application provide a computer-readable storage medium, including instructions, which when run on a computer, cause the computer to execute the method described in any possible design of the first aspect.
  • the embodiments of the present application provide a computer program product, which when running on a computer, causes the computer to execute the method described in any one of the possible designs of the first aspect.
  • FIG. 1a is a schematic diagram of transmitting and receiving signals of a vehicle-mounted radar according to an embodiment of the application
  • FIG. 1b is a schematic structural diagram of an autonomous vehicle provided by an embodiment of the application.
  • FIGS. 2a and 2b are schematic diagrams of the hardware structure of the radar provided by an embodiment of the application.
  • FIG. 3 is a schematic diagram of a possible sequence provided by an embodiment of this application.
  • Fig. 4a is an example diagram of a positional relationship between the first transmitting antenna, the second transmitting antenna and the receiving antenna;
  • Fig. 4b is a diagram showing another example of the positional relationship between the first transmitting antenna, the second transmitting antenna and the receiving antenna;
  • Fig. 4c is a diagram showing another example of the positional relationship between the first transmitting antenna, the second transmitting antenna and the receiving antenna;
  • FIG. 5a is a schematic flow diagram corresponding to the method for determining elevation provided by an embodiment of the application.
  • FIG. 5b is a schematic diagram of the working process of the method shown in FIG. 5a executed by radar according to an embodiment of the application;
  • FIG. 6 is a schematic diagram of a process for determining the elevation of a target object provided by an embodiment of the application
  • Fig. 7 is a schematic diagram of determining the elevation of a target object provided by an embodiment of the application.
  • FIG. 8 is a schematic diagram of the resolution of a millimeter wave radar provided by an embodiment of the application.
  • FIG. 9 is a schematic diagram of a three-dimensional feature of a target object provided by an embodiment of the application.
  • Fig. 10 is a possible exemplary block diagram of a device for determining elevation involved in an embodiment of the application.
  • Radar generally working in the ultrashort wave or microwave band, among them, the radar working in the ultrashort wave band is called ultrashort wave radar or meter wave radar; the radar working in the microwave band is generally called microwave radar.
  • Microwave radar is sometimes subdivided into decimeter wave radar, centimeter wave radar, millimeter wave radar and so on.
  • Millimeter wave radar Working in the millimeter wave band (wavelength is 1-10mm), the working frequency is usually selected in the range of 30-300GHz.
  • the wavelength of millimeter wave is between centimeter wave and light wave, so millimeter wave has the advantages of microwave guidance and photoelectric guidance.
  • the millimeter waveguide seeker Compared with the centimeter waveguide seeker, the millimeter waveguide seeker has the characteristics of small size, light weight and high spatial resolution. Because of its small size and light weight, millimeter-wave radar can make up for the use scenarios that other sensor radars such as infrared, laser, ultrasonic, camera, etc. do not have in vehicle applications.
  • Target object it can be any target that needs to measure distance and/or speed, for example, it can be a moving object or a stationary object.
  • Elevation It can refer to the height of a certain point relative to the reference surface.
  • the elevation of the target object refers to the height of the target object relative to the reference surface.
  • the reference plane may also be referred to as a reference plane or a reference plane, and may be a plane assumed in advance, which is not specifically limited.
  • Fig. 1b is a functional block diagram of a vehicle 100 provided by an embodiment of the present application.
  • the vehicle 100 is configured in a fully or partially autonomous driving mode.
  • the vehicle 100 can control itself while in the automatic driving mode, and can determine the current state of the vehicle and its surrounding environment through human operations, determine the possible behavior of at least one other vehicle in the surrounding environment, and determine the other vehicle
  • the confidence level corresponding to the possibility of performing possible actions is controlled based on the determined information.
  • the vehicle 100 can be placed to operate without human interaction.
  • the vehicle 100 may include various subsystems, such as a travel system 102, a sensor system 104, a control system 106, one or more peripheral devices 108 and a power supply 110, a computer system 112, and a user interface 116.
  • the vehicle 100 may include more or fewer subsystems, and each subsystem may include multiple elements.
  • each of the subsystems and elements of the vehicle 100 may be wired or wirelessly interconnected.
  • the travel system 102 may include components that provide power movement for the vehicle 100.
  • the travel system 102 may include an engine 118, an energy source 119, a transmission 120, and wheels/tires 121.
  • the engine 118 may be an internal combustion engine, an electric motor, an air compression engine, or other types of engine combinations, such as a hybrid engine composed of a gas oil engine and an electric motor, or a hybrid engine composed of an internal combustion engine and an air compression engine.
  • the engine 118 converts the energy source 119 into mechanical energy.
  • Examples of energy sources 119 include gasoline, diesel, other petroleum-based fuels, propane, other compressed gas-based fuels, ethanol, solar panels, batteries, and other sources of electricity.
  • the energy source 119 may also provide energy for other systems of the vehicle 100.
  • the transmission device 120 can transmit the mechanical power from the engine 118 to the wheels 121.
  • the transmission device 120 may include a gearbox, a differential, and a drive shaft.
  • the transmission device 120 may also include other devices, such as a clutch.
  • the drive shaft may include one or more shafts that can be coupled to one or more wheels 121.
  • the sensor system 104 may include several sensors that sense information about the environment around the vehicle 100.
  • the sensor system 104 may include a positioning system 122 (the positioning system may be a GPS system, a Beidou system or other positioning systems), an inertial measurement unit (IMU) 124, a radar 126, a laser rangefinder 128, and Camera 130.
  • the sensor system 104 may also include sensors of the internal system of the monitored vehicle 100 (for example, an in-vehicle air quality monitor, a fuel gauge, an oil temperature gauge, etc.). Sensor data from one or more of these sensors can be used to detect objects and their corresponding characteristics (position, shape, direction, speed, etc.). Such detection and identification are key functions for the safe operation of the autonomous vehicle 100.
  • the positioning system 122 can be used to estimate the geographic location of the vehicle 100.
  • the IMU 124 is used to sense changes in the position and orientation of the vehicle 100 based on inertial acceleration.
  • the IMU 124 may be a combination of an accelerometer and a gyroscope.
  • the radar 126 may use radio signals to sense objects in the surrounding environment of the vehicle 100. In some embodiments, in addition to sensing the object, the radar 126 may also be used to sense the speed and/or direction of the object.
  • the laser rangefinder 128 can use laser light to sense objects in the environment where the vehicle 100 is located.
  • the laser rangefinder 128 may include one or more laser sources, laser scanners, and one or more detectors, as well as other system components.
  • the camera 130 may be used to capture multiple images of the surrounding environment of the vehicle 100.
  • the camera 130 may be a still camera or a video camera.
  • the control system 106 controls the operation of the vehicle 100 and its components.
  • the control system 106 may include various components, including a steering system 132, a throttle 134, a braking unit 136, a sensor fusion algorithm 138, a computer vision system 140, a route control system 142, and an obstacle avoidance system 144.
  • the steering system 132 is operable to adjust the forward direction of the vehicle 100.
  • it may be a steering wheel system.
  • the throttle 134 is used to control the operating speed of the engine 118 and thereby control the speed of the vehicle 100.
  • the braking unit 136 is used to control the vehicle 100 to decelerate.
  • the braking unit 136 may use friction to slow down the wheels 121.
  • the braking unit 136 may convert the kinetic energy of the wheels 121 into electric current.
  • the braking unit 136 may also take other forms to slow down the rotation speed of the wheels 121 to control the speed of the vehicle 100.
  • the computer vision system 140 may be operable to process and analyze the images captured by the camera 130 in order to identify objects and/or features in the surrounding environment of the vehicle 100.
  • the objects and/or features may include traffic signals, road boundaries and obstacles.
  • the computer vision system 140 may use object recognition algorithms, Structure from Motion (SFM) algorithms, video tracking, and other computer vision technologies.
  • SFM Structure from Motion
  • the computer vision system 140 may be used to map the environment, track objects, estimate the speed of objects, and so on.
  • the route control system 142 is used to determine the travel route of the vehicle 100.
  • the route control system 142 may combine data from the sensor fusion algorithm 138, the GPS 122, and one or more predetermined maps to determine the driving route for the vehicle 100.
  • the obstacle avoidance system 144 is used to identify, evaluate, and avoid or otherwise cross potential obstacles in the environment of the vehicle 100.
  • control system 106 may additionally or alternatively include components other than those shown and described. Alternatively, a part of the components shown above may be reduced.
  • the vehicle 100 interacts with external sensors, other vehicles, other computer systems, or users through peripheral devices 108.
  • the peripheral device 108 may include a wireless communication system 146, an onboard computer 148, a microphone 150, and/or a speaker 152.
  • the peripheral device 108 provides a means for the user of the vehicle 100 to interact with the user interface 116.
  • the onboard computer 148 may provide information to the user of the vehicle 100.
  • the user interface 116 can also operate the onboard computer 148 to receive user input.
  • the on-board computer 148 can be operated through a touch screen.
  • the peripheral device 108 may provide a means for the vehicle 100 to communicate with other devices located in the vehicle.
  • the microphone 150 may receive audio (eg, voice commands or other audio input) from a user of the vehicle 100.
  • the speaker 152 may output audio to the user of the vehicle 100.
  • the wireless communication system 146 may wirelessly communicate with one or more devices directly or via a communication network.
  • the wireless communication system 146 may use 3G cellular communication, such as CDMA, EVDO, GSM/GPRS, or 4G cellular communication, such as LTE, or 5G cellular communication.
  • the wireless communication system 146 may use WiFi to communicate with a wireless local area network (WLAN).
  • WLAN wireless local area network
  • the wireless communication system 146 may directly communicate with the device using an infrared link, Bluetooth, or ZigBee.
  • Other wireless protocols such as various vehicle communication systems.
  • the wireless communication system 146 may include one or more dedicated short-range communication (DSRC) devices, which may include vehicles and/or roadside stations. Public and/or private data communications.
  • DSRC dedicated short-range communication
  • the power supply 110 may provide power to various components of the vehicle 100.
  • the power source 110 may be a rechargeable lithium ion or lead acid battery.
  • One or more battery packs of such batteries may be configured as a power source to provide power to various components of the vehicle 100.
  • the power source 110 and the energy source 119 may be implemented together, such as in some all-electric vehicles.
  • the computer system 112 may include at least one processor 113 that executes instructions 115 stored in a non-transitory computer readable medium such as the memory 114.
  • the computer system 112 may also be multiple computing devices that control individual components or subsystems of the vehicle 100 in a distributed manner.
  • the processor 113 may be any conventional processor, such as a commercially available CPU. Alternatively, the processor may be a dedicated device such as an ASIC or other hardware-based processor.
  • Figure 1b functionally illustrates the processor, memory, and other elements of the computer 110 in the same block, those of ordinary skill in the art should understand that the processor, computer, or memory may actually include Multiple processors, computers, or memories stored in the same physical enclosure.
  • the memory may be a hard disk drive or other storage medium located in a housing other than the computer 110. Therefore, a reference to a processor or computer will be understood to include a reference to a collection of processors or computers or memories that may or may not operate in parallel. Rather than using a single processor to perform the steps described here, some components such as steering components and deceleration components may each have its own processor that only performs calculations related to component-specific functions .
  • the processor may be located away from the vehicle and communicate wirelessly with the vehicle.
  • some of the processes described herein are executed on a processor disposed in the vehicle and others are executed by a remote processor, including taking the necessary steps to perform a single manipulation.
  • the memory 114 may include instructions 115 (eg, program logic), which may be executed by the processor 113 to perform various functions of the vehicle 100, including those functions described above.
  • the memory 114 may also contain additional instructions, including those for sending data to, receiving data from, interacting with, and/or controlling one or more of the traveling system 102, the sensor system 104, the control system 106, and the peripheral device 108. instruction.
  • the memory 114 may also store data, such as road maps, route information, the location, direction, and speed of the vehicle, and other such vehicle data, as well as other information. Such information may be used by the vehicle 100 and the computer system 112 during the operation of the vehicle 100 in autonomous, semi-autonomous, and/or manual modes.
  • the user interface 116 is used to provide information to or receive information from a user of the vehicle 100.
  • the user interface 116 may include one or more input/output devices in the set of peripheral devices 108, such as a wireless communication system 146, an in-vehicle computer 148, a microphone 150, and a speaker 152.
  • the computer system 112 may control the functions of the vehicle 100 based on inputs received from various subsystems (eg, the travel system 102, the sensor system 104, and the control system 106) and from the user interface 116. For example, the computer system 112 may utilize input from the control system 106 in order to control the steering unit 132 to avoid obstacles detected by the sensor system 104 and the obstacle avoidance system 144. In some embodiments, the computer system 112 is operable to provide control of many aspects of the vehicle 100 and its subsystems.
  • various subsystems eg, the travel system 102, the sensor system 104, and the control system 106
  • the computer system 112 may utilize input from the control system 106 in order to control the steering unit 132 to avoid obstacles detected by the sensor system 104 and the obstacle avoidance system 144.
  • the computer system 112 is operable to provide control of many aspects of the vehicle 100 and its subsystems.
  • one or more of these components described above may be installed or associated with the vehicle 100 separately.
  • the storage 114 may exist partially or completely separately from the vehicle 100.
  • the aforementioned components may be communicatively coupled together in a wired and/or wireless manner.
  • FIG. 1b should not be construed as a limitation to the embodiments of the present application.
  • An autonomous vehicle traveling on a road can recognize objects in its surrounding environment to determine the adjustment to the current speed.
  • the object may be other vehicles, traffic control equipment, or other types of objects.
  • each recognized object can be considered independently, and based on the respective characteristics of the object, such as its current speed, acceleration, distance from the vehicle, etc., can be used to determine the speed to be adjusted by the autonomous vehicle.
  • the self-driving car vehicle 100 or the computing device associated with the self-driving vehicle 100 may be based on the characteristics of the recognized object and the state of the surrounding environment (E.g., traffic, rain, ice on the road, etc.) to predict the behavior of the identified object.
  • each recognized object depends on each other's behavior, so all recognized objects can also be considered together to predict the behavior of a single recognized object.
  • the vehicle 100 can adjust its speed based on the predicted behavior of the identified object.
  • an autonomous vehicle can determine what stable state the vehicle will need to adjust to (for example, accelerate, decelerate, or stop) based on the predicted behavior of the object.
  • other factors may also be considered to determine the speed of the vehicle 100, such as the lateral position of the vehicle 100 on the road on which it is traveling, the curvature of the road, the proximity of static and dynamic objects, and so on.
  • the computing device can also provide instructions to modify the steering angle of the vehicle 100 so that the self-driving car follows a given trajectory and/or maintains an object near the self-driving car (such as , The safe horizontal and vertical distances of cars in adjacent lanes on the road.
  • the above-mentioned vehicle 100 can be a car, truck, motorcycle, bus, boat, airplane, helicopter, lawn mower, recreational vehicle, playground vehicle, construction equipment, tram, golf cart, train, and trolley, etc.
  • the application examples are not particularly limited.
  • FIGS. 2a and 2b are schematic diagrams of the hardware structure of the radar provided by an embodiment of the application.
  • the radar 200 may include an antenna 210, a transceiver 220, one or more processors 230 (only one is shown in Figures 2a and 2b), and one or more memories 240 ( Figures 2a and 2b). Only one is shown in).
  • 210 may include a receiving antenna 211 and a transmitting antenna 212, where the transmitting antenna 212 is used to transmit a signal to a target object, and the receiving antenna 211 is used to receive a signal transmitted or reflected by the target object.
  • the transceiver 220 may be called a transceiver or a transceiver circuit, etc., and is used to implement the transceiver function of the radar.
  • the transceiver 220 may include a frequency synthesizer 224, and the frequency synthesizer 224 is configured to synthesize a chirp through a voltage-controlled oscillator (VCO) in the frequency synthesizer 224 under the control of the processor 220
  • VCO voltage-controlled oscillator
  • the transceiver 220 may further include a mixer 223, which is used to down-convert the signal received by the receiving antenna 211, so as to filter out frequency components related to the target object; specifically, the mixer 223 has Two input signals (one input signal is the signal received by the receiving antenna 211, the other input signal is the signal generated by the voltage-controlled oscillator), the output signal can be obtained by multiplying the two input signals, and the frequency of the output signal It can be the sum (or difference) of the frequencies of the two input signals, so as to transform the signal frequency from one magnitude to another.
  • the transceiver 220 may also include an analog-to-digital converter (ADC) 221 and a digital down converter (DDC) 222.
  • ADC analog-to-digital converter
  • DDC digital down converter
  • the ADC 221 is used to perform analog-to-digital conversion on the signal (the frequency meets the Nyquist sampling law) that is down-converted by the mixer 213 under the control of the processor 230.
  • the digital signal output by the ADC 221 can also be used to generate a zero-IF signal through the DDC 222.
  • the transceiver 220 also includes an amplifier (not shown in FIG. 2b) for power amplifying the received signal after the receiving antenna 211 receives the signal transmitted or reflected by the target object, or for Before the transmitting antenna 212 transmits a signal, the power to be transmitted is amplified.
  • the processor 230 may include a general-purpose processor (for example, a central processing unit) and/or a dedicated processor (for example, a baseband processor). Taking the processor 230 including the central processor 232 and the baseband processor 231 as an example, the baseband processor 231 can determine whether there is a target object according to the signal processed by the DDC 222, and after determining the presence of the target object, measure the target object relative to the radar The angle, speed, distance of 200 and the method in the embodiments of this application are used to determine the elevation of the target object; further, the signal processed by DDC222 can be performed before measuring the angle, speed, and distance of the target object relative to the radar 200 Anti-interference processing in the time domain, such as cross-correlation processing, filtering, etc.
  • a general-purpose processor for example, a central processing unit
  • a dedicated processor for example, a baseband processor
  • the central processing unit 232 can implement certain control functions (such as controlling the transceiver 220 and the baseband processor 231 to perform corresponding operations), and can also perform operations such as target clustering, target tracking, and target association according to the measurement results of the baseband processor 231 .
  • the memory 240 may store instructions, and the instructions may be executed on the processor 230 so that the radar 200 executes the method described in the embodiments of the present application.
  • data may also be stored in the memory.
  • instructions and/or data may also be stored in the processor, and the instructions and/or data may be executed by the processor, so that the radar 1100 executes the method described in the embodiments of the present application.
  • the processor and memory can be provided separately or integrated together.
  • the processors and transceivers described in this application can be implemented in integrated circuits (IC), analog ICs, radio frequency integrated circuits, mixed signal ICs, application specific integrated circuits (ASICs), printed circuit boards (printed circuit boards). circuit board, PCB), electronic equipment, etc.
  • the processor and transceiver can also be manufactured by various IC process technologies, such as complementary metal oxide semiconductor (CMOS), nMetal-oxide-semiconductor (NMOS), and P-type Metal oxide semiconductor (positive channel metal oxide semiconductor, PMOS), bipolar junction transistor (Bipolar Junction Transistor, BJT), bipolar CMOS (BiCMOS), silicon germanium (SiGe), gallium arsenide (GaAs), etc.
  • CMOS complementary metal oxide semiconductor
  • NMOS nMetal-oxide-semiconductor
  • PMOS bipolar junction transistor
  • BiCMOS bipolar CMOS
  • SiGe silicon germanium
  • GaAs gallium arsenide
  • the transmitting antenna 212 may include a first transmitting antenna and a second transmitting antenna.
  • the first transmitting antenna and the second transmitting antenna are used for time-sharing transmitting signals, that is, transmitting different signals at different times, such as the first transmitting antenna.
  • the transmitting antenna transmits the first transmitting signal at time t1, and the second transmitting antenna transmits the second transmitting signal at time t2.
  • the wavelength of the first transmitting signal and the wavelength of the second transmitting signal may be the same.
  • Figure 3 is a possible timing diagram. Referring to FIG. 3, the frequency synthesizer 214 generates multiple transmit signals in sequence, and the first transmit antenna and the second transmit antenna time-sharing transmit multiple transmit signals, and then the receive antennas sequentially receive echo signals; further, from FIG. 3 It can be seen that since the transmission signals transmitted by the first transmission antenna and the second transmission antenna are both generated by the frequency synthesizer 214, the wavelengths of the transmission signals transmitted by the first transmission antenna and the second transmission antenna are the same.
  • the first transmitting antenna may include one or more antennas.
  • the first transmitting antenna includes antenna a1 and antenna a2, and the first transmitting antenna transmits the first transmission signal, which may mean that the antenna a1 and the antenna a2 respectively transmit The signal is transmitted after different amplitude modulation and phase modulation, and a certain directional gain (that is, beam pointing) is formed.
  • the second transmitting antenna may also include one or more antennas, and the receiving antenna may also include one or more antennas.
  • the first transmitting antenna, the second transmitting antenna, and the receiving antenna may be located on a straight line perpendicular to the reference plane of the elevation of the target object. Further, the embodiment of the present application does not limit the positional relationship between the first transmitting antenna, the second transmitting antenna, and the receiving antenna.
  • the receiving antenna (including antenna R0, antenna R1, antenna R2, and antenna R3) may be located in the Between a transmitting antenna (including antenna T0) and a second transmitting antenna (including antenna T1), as shown in Figure 4a, at this time, the distance between the receiving antenna and the first transmitting antenna (which can be expressed as ⁇ h1) and the receiving antenna
  • the distance from the second transmitting antenna (which can be expressed as ⁇ h2) may be equal, or may not be equal, and is not specifically limited; for another example, the receiving antenna may be located above the first transmitting antenna and the second transmitting antenna, as shown in the figure As shown in 4b, alternatively, the receiving antenna may also be located below the first transmitting antenna and the second transmitting antenna, as shown in FIG. 4c.
  • ⁇ h1 and ⁇ h2 are both positive values, that is to say: when the first transmitting antenna is located above the receiving antenna, the distance between the first transmitting antenna and the receiving antenna It is expressed as a positive value; when the second transmitting antenna is located below the receiving antenna, the distance between the second transmitting antenna and the receiving antenna is expressed as a positive value.
  • ⁇ h1 is a negative value; since the second transmitting antenna is located below the receiving antenna, ⁇ h2 is a positive value.
  • ⁇ h1 since the first transmitting antenna is located above the receiving antenna, ⁇ h1 is a positive value; since the second transmitting antenna is located above the receiving antenna, ⁇ h2 is a negative value.
  • Fig. 5a is a schematic diagram of the process corresponding to the method for determining elevation provided by an embodiment of the application, as shown in Fig. 5a, including:
  • Step 501 Generate first imaging information according to the first echo signal received by the receiving antenna, and generate second imaging information according to the second echo signal received by the receiving antenna; the first echo signal is the target object An echo signal reflected after receiving the first transmission signal transmitted by the first transmission antenna, and the second echo signal is an echo signal reflected after the target object receives the second transmission signal transmitted by the second transmission antenna.
  • the first imaging information may be understood as the reflection of the target object on the first emission signal, and is mainly image information formed by backscattering of the target object. Further, the first imaging information may include a variety of information, such as phase information, amplitude information, and so on.
  • a possible implementation of generating the first imaging information according to the first echo signal reflected by the target object is to pass the echo after receiving the first echo signal (the first echo signal includes amplitude information and phase information)
  • Signal processing such as down-conversion, analog-to-digital conversion, etc., of the echo signal, and then according to the processed signal, using a synthetic aperture radar (synthetic aperture radar, SAR) imaging algorithm to obtain the first imaging information.
  • SAR synthetic aperture radar
  • Step 502 Determine the elevation of the target object according to the first imaging information and the second imaging information.
  • the above steps 501 and 502 may be performed by the radar 200 shown in FIG. 2a or FIG. 2b or a semiconductor chip provided in the radar 200.
  • the radar 200 may be an ultrashort wave radar working in an ultrashort wave band, or may also be a microwave radar (such as a millimeter wave radar) working in a microwave wave band.
  • the following will mainly take the millimeter wave radar as an example for description.
  • the embodiment of the present application adds a functional module (referred to as the elevation extraction module for short) in the radar 200, for example, in the processor (specifically, the baseband processor) )
  • Add an elevation extraction module can also be understood as an extension of the baseband processor function).
  • the elevation extraction module can receive information through two channels, such as acquiring DDC through channel 0 to process the first echo signal, and through channel 1 to acquire DDC to process the second echo signal After the information, the first imaging information and the second imaging information are generated respectively, and the elevation of the target object is determined.
  • the above method is adopted, since the first transmission signal and the second transmission signal are transmitted through different transmission antennas, the first imaging can be generated based on the first echo signal and the second echo signal reflected by the target object. The information and the second imaging information can then be used to determine the elevation of the target object based on the first imaging information and the second imaging information.
  • the method is simple to implement, and the cost is lower than that of lidar to determine the elevation.
  • the embodiments of the present application can implement the use of the vehicle-mounted radar to measure the elevation of the target object without changing the main structure of the vehicle-mounted radar.
  • determining the elevation of the target object according to the first imaging information and the second imaging information may be: determining according to the first imaging information, the second imaging information, and the distance difference The elevation of the target object; wherein the distance difference is the difference between the transmission distance of the first transmission signal and the transmission distance of the second transmission signal.
  • the elevation of the target object can be determined based on the process shown in FIG. 6.
  • Figure 7 illustrates a schematic diagram for determining the elevation of the target object.
  • Figure 7 is an example of the antenna layout shown in Figure 4a. If the antenna layout shown in Figure 4b or Figure 4c is used, the process shown in Figure 6 can also be used to determine the target object.
  • the difference between Figure 4a, Figure 4b and Figure 4c is that ⁇ h1 and ⁇ h2 may be different.
  • ⁇ h1 and ⁇ h2 in Figure 4a are both positive values
  • ⁇ h1 in Figure 4b is negative.
  • ⁇ h2 is a positive value.
  • ⁇ h1 is a positive value
  • ⁇ h2 is a negative value; for other contents, please refer to the implementation.
  • H represents the distance between the receiving antenna and the reference plane of the elevation of the target object
  • R 0 represents the distance between the first transmitting antenna and the target object
  • R 1 represents the second transmitting antenna
  • R centre represents the distance between the receiving antenna and the target object
  • h centre represents the distance between the receiving antenna and the target object in the vertical direction
  • R ground represents the distance between the antenna module and the target object in the horizontal direction
  • ⁇ h1 represents the distance between the receiving antenna and the first transmitting antenna (specifically, it may refer to the distance between the center position of the receiving antenna and the center position of the first transmitting antenna)
  • ⁇ h2 represents the distance between the receiving antenna and the second transmitting antenna Distance (specifically, it can refer to the distance between the center position of the receiving antenna and the center position of the second transmitting antenna)
  • represents the receiving angle of the receiving antenna to receive the first echo signal or the second echo signal, for the same target object ,
  • the receiving angle at which the receiving antenna receives the first echo signal is the
  • the horizontal direction may refer to a direction parallel to a reference plane of the elevation of the target object
  • the vertical direction may refer to a direction perpendicular to the reference plane of the elevation of the target object.
  • ⁇ h1 and ⁇ h2 can be obtained according to the structural parameters of the radar, H, R centre , R 0 , R 1 can be measured and calculated, and h centre , R ground and ⁇ are all unknown quantities.
  • the process includes:
  • Step 601 Determine the relationship between the distance difference and the receiving angle.
  • the distance difference is the difference between the transmission distance of the first transmission signal and the transmission distance of the second transmission signal, which can be expressed as ⁇ R.
  • the transmission distance of the first transmitted signal conforms to the following formula:
  • the transmission distance of the second transmitted signal conforms to the following formula:
  • Step 602 Determine the phase difference between the first imaging information and the second imaging information (which can be expressed as ) And the distance difference.
  • the first imaging information and the second imaging information may be expressed as two-dimensional complex number data including amplitude information and phase information.
  • the first imaging information may be expressed as c 1 represents the first imaging information, A 1 represents the amplitude of the first imaging information, Represents the phase of the first imaging information;
  • the second imaging information can be expressed as c 2 represents the second imaging information, A 2 represents the amplitude of the second imaging information, Indicates the phase of the second imaging information.
  • the complex interferogram can be obtained as The * in the formula means taking the conjugate, and the interference phase (that is, the phase difference between the first imaging information and the second imaging information) can be expressed as:
  • phase of the first imaging information includes distance information (such as the transmission distance of the first transmitted signal and the transmission distance of the first echo signal) and the scattering characteristics of the target object
  • phase of the first imaging information may be exemplarily Expressed as:
  • the phase of the second imaging information includes distance information (such as the transmission distance of the second transmission signal and the transmission distance of the second echo signal) and the scattering characteristics of the target object.
  • the phase of the second imaging information can be expressed as: Is the scattering phase of the target object. Therefore, the relationship between the phase difference and the distance difference can be obtained in accordance with the following formula:
  • represents the wavelength of the first transmission signal or the wavelength of the second transmission signal.
  • ⁇ h1 and ⁇ h2 are determined by the positional relationship between the first transmitting antenna, the second transmitting antenna, and the receiving antenna.
  • Those skilled in the art can perform the Set the positional relationship between the antenna and the receiving antenna.
  • the positional relationship between the first transmitting antenna, the second transmitting antenna and the receiving antenna can be set according to the wavelength of the first transmitting signal or the wavelength of the second transmitting signal, so that the average value of ⁇ h1 and ⁇ h2 Less than or equal to the wavelength.
  • ⁇ R is less than Therefore, when the average value of ⁇ h1 and ⁇ h2 is less than or equal to the wavelength, ⁇ R is less than ⁇ , so that the phase difference obtained by the conjugate multiplication of the first imaging information and the second imaging information is the phase of the first imaging information and The phase difference of the second imaging information (that is, the true phase difference), there is no phase blur, and no phase unwrapping operation is required.
  • the value range of the receiving angle can also be preset to avoid phase ambiguity, for example:
  • the beam width of the receiving antenna can be set so that the receiving angle meets the foregoing value range, which is not specifically limited.
  • the position resolution capability of a radar is divided into range resolution and azimuth resolution, and one range resolution and one azimuth resolution constitute a resolution unit.
  • the range resolution is inversely proportional to the signal bandwidth of the millimeter wave radar signal, that is, the larger the signal bandwidth, the smaller the range resolution; the azimuth resolution is determined by the antenna size, and the larger the antenna size, the smaller the azimuth resolution.
  • the coordinates of the resolution unit where the target object is located can be used as the coordinates of the target object.
  • FIG. 8 it is a schematic diagram of the resolution of a millimeter wave radar provided by an embodiment of this application.
  • the distance between B1B4 or B2B3 represents the range resolution
  • the distance between B1B2 or B3B4 represents the azimuth resolution
  • the area enclosed by B1, B2, B3, and B4 is a resolution unit. Since ⁇ R is much smaller than the range resolution, it can be explained that the object reflecting the first echo signal and the object reflecting the second echo signal are located in the same resolution unit, that is, it can be considered that the first echo signal and the second echo signal are Reflected by an object (that is, a target object) in a certain resolution unit, so as to ensure that the determined elevation is the elevation of the target object.
  • the first imaging information and the second imaging information cannot be completely overlapped on the pixels, so the first imaging can be performed first.
  • the information and the second imaging information are image registered to achieve a one-to-one correspondence between the pixels of the two images, and then a conjugate multiplication operation is performed based on the registered first imaging information and the second imaging information.
  • Step 603 Obtain the receiving angle according to the relationship between the distance difference and the receiving angle and the relationship between the phase difference value and the distance difference.
  • Step 604 Obtain the elevation of the target object according to the distance between the receiving antenna and the reference plane of the elevation of the target object, the distance between the receiving antenna and the target object, and the receiving angle.
  • the elevation of the target object (which can be expressed as h) can be obtained based on the following formula:
  • the difference between the transmission distance of the first transmission signal and the transmission distance of the second transmission signal is smaller than the wavelength of the transmission signal, so that the phase difference between the first imaging information and the second imaging information is There is no phase ambiguity.
  • the elevation of the target object can be determined by the simplified method described above, which can greatly reduce the complexity of calculation and improve the processing efficiency.
  • the above content specifically describes the manner of determining the elevation of the target object.
  • the three-dimensional feature of the target object can also be determined according to the elevation of the target object.
  • the range of the radar is the x-axis
  • the azimuth is the y-axis
  • the elevation of the target object determined above is That is, the coordinates of the target object on the z axis.
  • the radar can also determine the coordinates of the target object on the x-axis and on the y-axis, thereby obtaining the three-dimensional characteristics of the target object. In this way, in a vehicle-mounted environment, after the radar has determined the three-dimensional characteristics of multiple target objects in the above-mentioned manner, the analysis of road conditions can be realized.
  • the device for determining the elevation may include hardware structures and/or software modules corresponding to each function.
  • the embodiments of the present application can be implemented in the form of hardware or a combination of hardware and computer software. Whether a certain function is executed by hardware or computer software-driven hardware depends on the specific application and design constraint conditions of the technical solution. Professionals and technicians can use different methods for each specific application to implement the described functions, but such implementation should not be considered beyond the scope of this application.
  • FIG. 10 shows a possible exemplary block diagram of the apparatus for determining elevation involved in an embodiment of the present application.
  • the device 1000 may include: a generating unit 1001 and a determining unit 1002.
  • the device 1000 may be a semiconductor chip set in a radar.
  • the generating unit 1001 and the determining unit 1002 may be understood as the elevation extraction module shown in FIG. 5b, the generating unit 1001 and the determining unit
  • the specific functions of 1002 can be implemented by a processor (such as a baseband processor).
  • the device 1000 may be a radar.
  • the device 1000 may further include a communication unit 1003 and a storage unit 1004.
  • the generating unit 1001 and the determining unit 1002 can also be collectively referred to as a processing unit, which is used to control and manage the actions of the device 1000
  • the communication unit 1003 is also referred to as a transceiving unit, which can include a receiving unit and/or a sending unit, which are used to execute Receive and send operations.
  • the storage unit 1004 is used to store the program code and/or data of the device 1000.
  • the generating unit 1001 and the determining unit 1002 may be processors, which may implement or execute various exemplary logical blocks, modules, and circuits described in conjunction with the disclosure of the embodiments of the present application.
  • the communication unit 1003 may be a communication interface, a transceiver, or a transceiver circuit, etc., where the communication interface is a general term. In a specific implementation, the communication interface may include multiple interfaces.
  • the storage unit 1004 may be a memory.
  • the generating unit 1001 and the determining unit 1002 can support the device 1000 to perform the actions in the above method examples.
  • the generating unit 1001 is used to perform step 501 in FIG. 5a
  • the determining unit 1002 is used to perform step 502 in FIG. 5a and steps 601 to 604 in FIG. 6.
  • the generating unit 1001 is configured to generate first imaging information according to the first echo signal received by the receiving antenna, and generate second imaging information according to the second echo signal received by the receiving antenna Information; the first echo signal is the echo signal reflected by the target object after receiving the first transmission signal transmitted by the first transmitting antenna, and the second echo signal is the target object receiving the second transmitting antenna and transmitting The echo signal reflected after the second transmission signal of
  • the determining unit 1002 is configured to determine the elevation of the target object according to the first imaging information and the second imaging information.
  • the determining unit 1002 is specifically configured to:
  • the distance between the receiving antenna and the first transmitting antenna, the distance between the receiving antenna and the second transmitting antenna, and the wavelength of the first transmitting signal or the The wavelength of the second transmission signal obtaining the receiving angle of the first echo signal or the second echo signal received by the receiving antenna;
  • the elevation of the target object is determined.
  • the determining unit 1002 is specifically configured to:
  • the distance difference is the difference between the transmission distance of the first transmission signal and the transmission distance of the second transmission signal
  • represents the receiving angle
  • ⁇ h1 represents the receiving antenna and the second transmission distance.
  • the distance between a transmitting antenna, ⁇ h2 represents the distance between the receiving antenna and the second transmitting antenna;
  • the receiving angle conforms to the following formula:
  • the determining unit 1002 is specifically configured to:
  • the elevation of the target object is calculated by the following formula:
  • H represents the elevation of the target object
  • H represents the distance between the receiving antenna and the reference plane of the elevation
  • R centre represents the distance between the receiving antenna and the target object
  • represents the Receiving angle
  • modules in the embodiments of the present application is illustrative, and is only a logical function division, and there may be other division methods in actual implementation.
  • the functional modules in the embodiments of the present application may be integrated into one processing module, or each module may exist alone physically, or two or more modules may be integrated into one module.
  • the above-mentioned integrated modules can be implemented in the form of hardware or software functional modules.
  • the integrated module is implemented in the form of a software function module and sold or used as an independent product, it can be stored in a computer readable storage medium.
  • the technical solutions of the embodiments of the present application are essentially or the part that contributes to the prior art, or all or part of the technical solutions can be embodied in the form of software products, and the computer software products are stored in a storage
  • the medium includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) or a processor to execute all or part of the steps of the methods described in the various embodiments of the present application.
  • the aforementioned storage medium may be various mediums capable of storing program codes, such as a memory.
  • the computer program product includes one or more computer instructions.
  • the computer may be a general-purpose computer, a special-purpose computer, a computer network, or other programmable devices.
  • the computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium. For example, the computer instructions may be transmitted from a website, computer, server, or data center.
  • the computer-readable storage medium may be any available medium that can be accessed by a computer or a data storage device such as a server or data center integrated with one or more available media.
  • the usable medium may be a magnetic medium (for example, a floppy disk, a hard disk, a magnetic tape), an optical medium (for example, a DVD), or a semiconductor medium (for example, a solid state disk (SSD)).
  • These computer program instructions can also be stored in a computer-readable memory that can direct a computer or other programmable data processing equipment to work in a specific manner, so that the instructions stored in the computer-readable memory produce an article of manufacture including the instruction device.
  • the device implements the functions specified in one process or multiple processes in the flowchart and/or one block or multiple blocks in the block diagram.
  • These computer program instructions can also be loaded on a computer or other programmable data processing equipment, so that a series of operation steps are executed on the computer or other programmable equipment to produce computer-implemented processing, so as to execute on the computer or other programmable equipment.
  • the instructions provide steps for implementing the functions specified in one process or multiple processes in the flowchart and/or one block or multiple blocks in the block diagram.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

一种确定高程的方法、装置及雷达,涉及人工智能领域中自动驾驶技术。其中,确定高程的方法包括:根据接收天线接收到的第一回波信号生成第一成像信息,以及根据该接收天线接收到的第二回波信号生成第二成像信息;该第一回波信号为目标对象接收到第一发射天线发射的第一发射信号后反射的回波信号,该第二回波信号为该目标对象接收到第二发射天线发射的第二发射信号后反射的回波信号;根据该第一成像信息和该第二成像信息,确定该目标对象的高程。该方法的实现简单,且相比于激光雷达确定高程来说,成本较低。

Description

一种确定高程的方法、装置及雷达 技术领域
本申请涉及自动驾驶领域,特别涉及一种确定高程的方法、装置及雷达。
背景技术
人工智能(Artificial Intelligence,AI)是利用数字计算机或者数字计算机控制的机器模拟、延伸和扩展人的智能,感知环境、获取知识并使用知识获得最佳结果的理论、方法、技术及应用系统。换句话说,人工智能是计算机科学的一个分支,它企图了解智能的实质,并生产出一种新的能以人类智能相似的方式作出反应的智能机器。人工智能也就是研究各种智能机器的设计原理与实现方法,使机器具有感知、推理与决策的功能。人工智能领域的研究包括机器人,自然语言处理,计算机视觉,决策与推理,人机交互,推荐与搜索,AI基础理论等。
自动驾驶是人工智能领域的一种主流应用,自动驾驶技术依靠计算机视觉、雷达(Radar)、监控装置和全球定位系统等协同合作,让机动车辆可以在不需要人类主动操作下,实现自动驾驶。自动驾驶的车辆使用各种计算系统来帮助将乘客从一个位置运输到另一位置。一些自动驾驶车辆可能要求来自操作者(诸如,领航员、驾驶员、或者乘客)的一些初始输入或者连续输入。自动驾驶车辆准许操作者从手动模操作式切换到自动驾驶模式或者介于两者之间的模式。由于自动驾驶技术无需人类来驾驶机动车辆,所以理论上能够有效避免人类的驾驶失误,减少交通事故的发生,且能够提高公路的运输效率。因此,自动驾驶技术越来越受到重视。
由于雷达能够在车载环境中实现障碍物测量、碰撞预测、自适应巡航控制等功能,有效地降低驾驶难度和减少事故的发生率,因而在汽车领域得到了广泛应用。
图1a为雷达发射和接收信号示意图,如图1a所示,雷达通过天线向外发射检测信号(电磁波)以及接收目标对象反射的信号,对目标对象反射的信号进行放大以及下变频等处理,得到车辆与目标对象之间的相对距离、相对速度等信息,然后根据得到的信息对目标对象进行跟踪和识别分类,经合理决策后,以声、光及触觉等多种方式告知或警告驾驶员,或者及时对车辆做出主动干预,从而保证驾驶过程的安全性和舒适性,减少事故发生几率。
以自动驾驶为例,除了需要获知车辆与目标对象之间的相对距离、相对速度等,往往还需要获知车辆周围的对象的高程,比如车辆前方路面的高度,从而能够及时根据路面状况作出相应的调整(比如调整驾驶速度)。目前,一种可能的确定高程的方法为使用激光雷达(Light Detection And Ranging,LIDAR)测量目标对象的高程,激光雷达的工作原理为向目标发射探测信号(激光束),然后将接收到的从目标对象反射回来的回波信号与发射信号进行比较,作适当处理后,可获得目标对象的有关信息,如目标距离、方位、高程、速度等参数。然而,激光雷达对工艺水平要求较高,造价也比较高,从而导致使用激光雷达确定高程的成本较高。
综上,目前亟需一种确定高程的方法,用于解决使用激光雷达确定高程导致成本较高的技术问题。
发明内容
本申请提供一种确定高程的方法、装置及雷达,用于解决使用激光雷达确定高程导致成本较高的技术问题。
第一方面,本申请提供一种确定高程的方法,包括:
根据接收天线接收到的第一回波信号生成第一成像信息,以及根据该接收天线接收到的第二回波信号生成第二成像信息;该第一回波信号为目标对象接收到第一发射天线发射的第一发射信号后反射的回波信号,该第二回波信号为该目标对象接收到第二发射天线发射的第二发射信号后反射的回波信号;
根据该第一成像信息和该第二成像信息,确定该目标对象的高程。
采用上述方法,由于通过不同的发射天线发射第一发射信号和第二发射信号,从而可以基于目标对象反射的第一回波信号和第二回波信号分别生成第一成像信息和第二成像信息,进而能够基于第一成像信息和第二成像信息确定目标对象的高程,该方法的实现简单,且相比于激光雷达确定高程来说,成本较低。比如在车载环境下,本申请实施例可以在不改变车载雷达的主要架构的前提下,来实现利用车载雷达对目标对象的高程进行测量。
在一种可能的设计中,该第一发射天线、该第二发射天线和该接收天线位于与该高程的参考平面垂直的直线上。
在一种可能的设计中,该接收天线位于该第一发射天线和该第二发射天线之间。
该接收天线与该第一发射天线之间的距离和该接收天线与该第二发射天线之间的距离的平均值小于或等于该第一发射信号的波长或该第二发射信号的波长。
在一种可能的设计中,该第一发射信号的波长和该第二发射信号的波长相同。
在一种可能的设计中,根据该第一成像信息和该第二成像信息,确定该目标对象的高程,包括:
根据第一成像信息、第二成像信息,得到该第一成像信息和该第二成像信息的相位差值;
根据该相位差值、该接收天线与该第一发射天线之间的距离、该接收天线与该第二发射天线之间的距离以及该第一发射信号的波长或该第二发射信号的波长,得到该接收天线接收该第一回波信号或该第二回波信号的接收角度;
根据该接收角度,确定该目标对象的高程。
在一种可能的设计中,根据该相位差值、该接收天线与该第一发射天线之间的距离、该接收天线与该第二发射天线之间的距离以及该第一发射信号的波长或该第二发射信号的波长,得到该接收角度,包括:
确定距离差与该接收角度的关系符合如下公式:
Figure PCTCN2019073735-appb-000001
其中,ΔR表示该距离差,该距离差为该第一发射信号的传输距离和该第二发射信号的传输距离的差值,θ表示该接收角度,Δh1表示接收天线与第一发射天线之间的距离,Δh2表示接收天线与第二发射天线之间的距离;
确定该相位差值与该距离差的关系符合如下公式:
Figure PCTCN2019073735-appb-000002
其中,
Figure PCTCN2019073735-appb-000003
表示该相位差值,λ表示该第一发射信号或该第二发射信号的波长;
根据该距离差与该接收角度的关系以及该相位差值与该距离差的关系,可得到该接收角度符合下公式:
Figure PCTCN2019073735-appb-000004
在一种可能的设计中,根据该接收角度,确定该目标对象的高程,包括:
通过如下公式计算该目标对象的高程:
h=H-R centreCOS(θ)
其中,h表示该目标对象的高程,H表示该接收天线与该高程的参考平面之间的距离,R centre表示该接收天线与该目标对象之间的距离,θ表示该接收角度。
第二方面,本申请实施例提供一种确定高程的装置,该装置可以是雷达或者也可以是设置在雷达中的半导体芯片。该装置具有实现上述第一方面的各种可能的设计的功能。该功能可以通过硬件实现,也可以通过硬件执行相应的软件实现。该硬件或软件包括一个或多个与上述功能相对应的单元或模块。
第三方面,本申请实施例提供一种装置,该装置包括处理器、存储器以及存储在存储器上并可在处理器上运行的指令,当该指令被运行时,使得该装置执行上述第一方面所述的方法。
第四方面,本申请实施例提供一种雷达,该雷达包括第一发射天线、第二发射天线、接收天线和上述第三方面所述的装置。
第五方面,本申请实施例提供一种计算机可读存储介质,包括指令,当其在计算机上运行时,使得计算机执行如第一方面任一种可能的设计中所述的方法。
第六方面,本申请实施例提供一种计算机程序产品,当其在计算机上运行时,使得计算机执行如上述第一方面任一种可能的设计中所述的方法。
本申请的这些方面或其他方面在以下实施例的描述中会更加简明易懂。
附图说明
图1a为本申请实施例提供的车载雷达发射和接收信号示意图;
图1b为本申请实施例提供的一种自动驾驶汽车的结构示意图;
图2a和图2b为本申请实施例提供的雷达的硬件结构示意图;
图3为本申请实施例提供的一种可能的时序示意图;
图4a为第一发射天线、第二发射天线和接收天线的一种位置关系示例图;
图4b为第一发射天线、第二发射天线和接收天线的又一种位置关系示例图;
图4c为第一发射天线、第二发射天线和接收天线的又一种位置关系示例图;
图5a为本申请实施例提供的确定高程的方法所对应的流程示意图;
图5b为本申请实施例提供的由雷达执行图5a所示意的方法的工作过程示意图;
图6为本申请实施例提供的确定目标对象的高程的流程示意图;
图7为本申请实施例提供的确定目标对象的高程的原理图;
图8为本申请实施例提供的一种毫米波雷达分辨率示意图;
图9为本申请实施例提供的目标对象的三维特征示意图;
图10为本申请实施例中所涉及的确定高程的装置的可能的示例性框图。
具体实施方式
为了使本申请的目的、技术方案和优点更加清楚,下面将结合附图对本申请作进一步地详细描述。
首先,对本申请中的部分用语进行解释说明,以便于本领域技术人员理解。
(1)雷达:一般工作在超短波或微波波段,其中,工作在超短波波段的雷达称为超短波雷达或米波雷达;工作在微波波段的雷达通称为微波雷达。微波雷达有时还细分为分米波雷达、厘米波雷达、毫米波雷达等。
(2)毫米波雷达:工作在毫米波波段(波长为1~10mm),工作频率通常选在30~300GHz范围内。毫米波的波长介于厘米波和光波之间,因此毫米波兼有微波制导和光电制导的优点。同厘米波导引头相比,毫米波导引头具有体积小、质量轻和空间分辨率高的特点。毫米波雷达由于尺寸小、重量轻等特性,很好的弥补了如红外、激光、超声波、摄像头等其他传感器雷达在车载应用中所不具备的使用场景。
(3)目标对象:可以为任意一种需要测量距离和/或速度的目标,比如可以为移动的物体,或者也可以为静止的物体。
(4)高程:可以是指某一点相对于基准面的高度,比如目标对象的高程是指目标对象相对于基准面的高度。其中,基准面,也可以称为参考面或参考平面,可以为预先假设的一个平面,具体不做限定。
(5)本申请实施例中涉及的第一、第二等各种数字编号仅为描述方便进行的区分,并不用来限制本申请实施例的范围,也不表示先后顺序。“和/或”,描述关联对象的关联关系,表示可以存在三种关系,例如,A和/或B,可以表示:单独存在A,同时存在A和B,单独存在B这三种情况。“至少一个”是指一个或者多个。至少两个是指两个或者多个。“至少一个”、“任意一个”或其类似表达,是指的这些项中的任意组合,包括单项(个)或复数项(个)的任意组合。例如,a,b,或c中的至少一项(个、种),可以表示:a,b,c,a-b,a-c,b-c,或a-b-c,其中a,b,c可以是单个,也可以是多个。
图1b是本申请实施例提供的车辆100的功能框图。在一个实施例中,将车辆100配置为完全或部分地自动驾驶模式。例如,车辆100可以在处于自动驾驶模式中的同时控制自身,并且可通过人为操作来确定车辆及其周边环境的当前状态,确定周边环境中的至少一个其他车辆的可能行为,并确定该其他车辆执行可能行为的可能性相对应的置信水平,基于所确定的信息来控制车辆100。在车辆100处于自动驾驶模式中时,可以将车辆100置为在没有和人交互的情况下操作。
车辆100可包括各种子系统,例如行进系统102、传感器系统104、控制系统106、一个或多个外围设备108以及电源110、计算机系统112和用户接口116。可选地,车辆100可包括更多或更少的子系统,并且每个子系统可包括多个元件。另外,车辆100的每个子系统和元件可以通过有线或者无线互连。
行进系统102可包括为车辆100提供动力运动的组件。在一个实施例中,行进系统102 可包括引擎118、能量源119、传动装置120和车轮/轮胎121。引擎118可以是内燃引擎、电动机、空气压缩引擎或其他类型的引擎组合,例如气油发动机和电动机组成的混动引擎,内燃引擎和空气压缩引擎组成的混动引擎。引擎118将能量源119转换成机械能量。
能量源119的示例包括汽油、柴油、其他基于石油的燃料、丙烷、其他基于压缩气体的燃料、乙醇、太阳能电池板、电池和其他电力来源。能量源119也可以为车辆100的其他系统提供能量。
传动装置120可以将来自引擎118的机械动力传送到车轮121。传动装置120可包括变速箱、差速器和驱动轴。在一个实施例中,传动装置120还可以包括其他器件,比如离合器。其中,驱动轴可包括可耦合到一个或多个车轮121的一个或多个轴。
传感器系统104可包括感测关于车辆100周边的环境的信息的若干个传感器。例如,传感器系统104可包括定位系统122(定位系统可以是GPS系统,也可以是北斗系统或者其他定位系统)、惯性测量单元(inertial measurement unit,IMU)124、雷达126、激光测距仪128以及相机130。传感器系统104还可包括被监视车辆100的内部系统的传感器(例如,车内空气质量监测器、燃油量表、机油温度表等)。来自这些传感器中的一个或多个的传感器数据可用于检测对象及其相应特性(位置、形状、方向、速度等)。这种检测和识别是自主车辆100的安全操作的关键功能。
定位系统122可用于估计车辆100的地理位置。IMU 124用于基于惯性加速度来感测车辆100的位置和朝向变化。在一个实施例中,IMU 124可以是加速度计和陀螺仪的组合。
雷达126可利用无线电信号来感测车辆100的周边环境内的物体。在一些实施例中,除了感测物体以外,雷达126还可用于感测物体的速度和/或前进方向。
激光测距仪128可利用激光来感测车辆100所位于的环境中的物体。在一些实施例中,激光测距仪128可包括一个或多个激光源、激光扫描器以及一个或多个检测器,以及其他系统组件。
相机130可用于捕捉车辆100的周边环境的多个图像。相机130可以是静态相机或视频相机。
控制系统106为控制车辆100及其组件的操作。控制系统106可包括各种元件,其中包括转向系统132、油门134、制动单元136、传感器融合算法138、计算机视觉系统140、路线控制系统142以及障碍规避系统144。
转向系统132可操作来调整车辆100的前进方向。例如在一个实施例中可以为方向盘系统。
油门134用于控制引擎118的操作速度并进而控制车辆100的速度。
制动单元136用于控制车辆100减速。制动单元136可使用摩擦力来减慢车轮121。在其他实施例中,制动单元136可将车轮121的动能转换为电流。制动单元136也可采取其他形式来减慢车轮121转速从而控制车辆100的速度。
计算机视觉系统140可以操作来处理和分析由相机130捕捉的图像以便识别车辆100周边环境中的物体和/或特征。所述物体和/或特征可包括交通信号、道路边界和障碍物。计算机视觉系统140可使用物体识别算法、运动中恢复结构(Structure from Motion,SFM)算法、视频跟踪和其他计算机视觉技术。在一些实施例中,计算机视觉系统140可以用于为环境绘制地图、跟踪物体、估计物体的速度等等。
路线控制系统142用于确定车辆100的行驶路线。在一些实施例中,路线控制系统142 可结合来自传感器融合算法138、GPS 122和一个或多个预定地图的数据以为车辆100确定行驶路线。
障碍规避系统144用于识别、评估和避免或者以其他方式越过车辆100的环境中的潜在障碍物。
当然,在一个实例中,控制系统106可以增加或替换地包括除了所示出和描述的那些以外的组件。或者也可以减少一部分上述示出的组件。
车辆100通过外围设备108与外部传感器、其他车辆、其他计算机系统或用户之间进行交互。外围设备108可包括无线通信系统146、车载电脑148、麦克风150和/或扬声器152。
在一些实施例中,外围设备108提供车辆100的用户与用户接口116交互的手段。例如,车载电脑148可向车辆100的用户提供信息。用户接口116还可操作车载电脑148来接收用户的输入。车载电脑148可以通过触摸屏进行操作。在其他情况中,外围设备108可提供用于车辆100与位于车内的其它设备通信的手段。例如,麦克风150可从车辆100的用户接收音频(例如,语音命令或其他音频输入)。类似地,扬声器152可向车辆100的用户输出音频。
无线通信系统146可以直接地或者经由通信网络来与一个或多个设备无线通信。例如,无线通信系统146可使用3G蜂窝通信,例如CDMA、EVD0、GSM/GPRS,或者4G蜂窝通信,例如LTE,或者5G蜂窝通信。无线通信系统146可利用WiFi与无线局域网(wireless local area network,WLAN)通信。在一些实施例中,无线通信系统146可利用红外链路、蓝牙或ZigBee与设备直接通信。其他无线协议,例如各种车辆通信系统,例如,无线通信系统146可包括一个或多个专用短程通信(dedicated short range communications,DSRC)设备,这些设备可包括车辆和/或路边台站之间的公共和/或私有数据通信。
电源110可向车辆100的各种组件提供电力。在一个实施例中,电源110可以为可再充电锂离子或铅酸电池。这种电池的一个或多个电池组可被配置为电源为车辆100的各种组件提供电力。在一些实施例中,电源110和能量源119可一起实现,例如一些全电动车中那样。
车辆100的部分或所有功能受计算机系统112控制。计算机系统112可包括至少一个处理器113,处理器113执行存储在例如存储器114这样的非暂态计算机可读介质中的指令115。计算机系统112还可以是采用分布式方式控制车辆100的个体组件或子系统的多个计算设备。
处理器113可以是任何常规的处理器,诸如商业可获得的CPU。替选地,该处理器可以是诸如ASIC或其它基于硬件的处理器的专用设备。尽管图1b功能性地图示了处理器、存储器、和在相同块中的计算机110的其它元件,但是本领域的普通技术人员应该理解该处理器、计算机、或存储器实际上可以包括可以或者可以不存储在相同的物理外壳内的多个处理器、计算机、或存储器。例如,存储器可以是硬盘驱动器或位于不同于计算机110的外壳内的其它存储介质。因此,对处理器或计算机的引用将被理解为包括对可以或者可以不并行操作的处理器或计算机或存储器的集合的引用。不同于使用单一的处理器来执行此处所描述的步骤,诸如转向组件和减速组件的一些组件每个都可以具有其自己的处理器,所述处理器只执行与特定于组件的功能相关的计算。
在此处所描述的各个方面中,处理器可以位于远离该车辆并且与该车辆进行无线通 信。在其它方面中,此处所描述的过程中的一些在布置于车辆内的处理器上执行而其它则由远程处理器执行,包括采取执行单一操纵的必要步骤。
在一些实施例中,存储器114可包含指令115(例如,程序逻辑),指令115可被处理器113执行来执行车辆100的各种功能,包括以上描述的那些功能。存储器114也可包含额外的指令,包括向行进系统102、传感器系统104、控制系统106和外围设备108中的一个或多个发送数据、从其接收数据、与其交互和/或对其进行控制的指令。
除了指令115以外,存储器114还可存储数据,例如道路地图、路线信息,车辆的位置、方向、速度以及其它这样的车辆数据,以及其他信息。这种信息可在车辆100在自主、半自主和/或手动模式中操作期间被车辆100和计算机系统112使用。
用户接口116,用于向车辆100的用户提供信息或从其接收信息。可选地,用户接口116可包括在外围设备108的集合内的一个或多个输入/输出设备,例如无线通信系统146、车车在电脑148、麦克风150和扬声器152。
计算机系统112可基于从各种子系统(例如,行进系统102、传感器系统104和控制系统106)以及从用户接口116接收的输入来控制车辆100的功能。例如,计算机系统112可利用来自控制系统106的输入以便控制转向单元132来避免由传感器系统104和障碍规避系统144检测到的障碍物。在一些实施例中,计算机系统112可操作来对车辆100及其子系统的许多方面提供控制。
可选地,上述这些组件中的一个或多个可与车辆100分开安装或关联。例如,存储器114可以部分或完全地与车辆100分开存在。上述组件可以按有线和/或无线方式来通信地耦合在一起。
可选地,上述组件只是一个示例,实际应用中,上述各个模块中的组件有可能根据实际需要增添或者删除,图1b不应理解为对本申请实施例的限制。
在道路行进的自动驾驶汽车,如上面的车辆100,可以识别其周围环境内的物体以确定对当前速度的调整。所述物体可以是其它车辆、交通控制设备、或者其它类型的物体。在一些示例中,可以独立地考虑每个识别的物体,并且基于物体的各自的特性,诸如它的当前速度、加速度、与车辆的间距等,可以用来确定自动驾驶汽车所要调整的速度。
可选地,自动驾驶汽车车辆100或者与自动驾驶车辆100相关联的计算设备(如图1b的计算机系统112、计算机视觉系统140、存储器114)可以基于所识别的物体的特性和周围环境的状态(例如,交通、雨、道路上的冰、等等)来预测所述识别的物体的行为。可选地,每一个所识别的物体都依赖于彼此的行为,因此还可以将所识别的所有物体全部一起考虑来预测单个识别的物体的行为。车辆100能够基于预测的所述识别的物体的行为来调整它的速度。换句话说,自动驾驶汽车能够基于所预测的物体的行为来确定车辆将需要调整到(例如,加速、减速、或者停止)什么稳定状态。在这个过程中,也可以考虑其它因素来确定车辆100的速度,诸如,车辆100在行驶的道路中的横向位置、道路的曲率、静态和动态物体的接近度等等。
除了提供调整自动驾驶汽车的速度的指令之外,计算设备还可以提供修改车辆100的转向角的指令,以使得自动驾驶汽车遵循给定的轨迹和/或维持与自动驾驶汽车附近的物体(例如,道路上的相邻车道中的轿车)的安全横向和纵向距离。
上述车辆100可以为轿车、卡车、摩托车、公共汽车、船、飞机、直升飞机、割草机、娱乐车、游乐场车辆、施工设备、电车、高尔夫球车、火车、和手推车等,本申请实施例 不做特别的限定。
图2a和图2b为本申请实施例提供的雷达的硬件结构示意图。
如图2a所示,该雷达200可以包括天线210、收发器220、一个或多个处理器230(图2a和图2b中仅示意出一个)和一个或多个存储器240(图2a和图2b中仅示意出一个)。
具体来说,结合图2b所示,210可以包括接收天线211、发射天线212,其中,发射天线212用于向目标对象发射信号,接收天线211用于接收目标对象发射或者反射的信号。
收发器220可以称为收发机或者收发电路等,用于实现雷达的收发功能。具体来说,收发器220可以包括频率合成器224,频率合成器224用于在处理器220的控制下,通过频率合成器224中的压控振荡器(voltage-controlled oscillator,VCO)合成线性调频信号(该信号的相位随着时间发生变化,是时间的二次函数,该信号的频率,即相位的一阶导数,是时间的一次函数,即频率随着时间做线性变化),通过发射天线212发射出去。
收发器220还可以包括混频器223,混频器223用于对接收天线211接收到的信号进行下变频,以便于筛选出与目标对象相关的频率分量;具体来说,混频器223具有两个输入信号(其中一个输入信号为接收天线211接收到的信号,另一个输入信号为压控振荡器产生的信号),通过将两个输入信号相乘即可得到输出信号,输出信号的频率可以为两个输入信号的频率之和(或差),进而实现将信号频率由一个量值变换为另一个量值。
收发器220还可以模数转换器(analog to digital converter,ADC)221和数字下变频器(digital down converter,DDC)222。其中,ADC 221用于在处理器230的控制下,对经过混频器213下变频后的信号(频率满足奈奎斯特采样定律)进行模数转换。为了方便之后处理器230的信号处理,ADC 221输出的数字信号还可以通过DDC 222产生零中频信号。
另外,收发器220还包括放大器(未在图2b中示出),用于在接收天线211接收到目标对象发射或者反射的信号后,对接收到的信号进行功率放大,或者用于在所述发射天线212发射信号之前,对所要发射的信号进行功率放大。
处理器230可以包括通用处理器(例如可以是中央处理器)和/或专用处理器(例如可以是基带处理器)。以处理器230包括中央处理器232和基带处理器231为例,基带处理器231可以根据DDC 222处理得到的信号,确定是否存在目标对象,以及在确定存在目标对象后,测量目标对象相对于雷达200的角度、速度、距离以及执行本申请实施例中的方法确定目标对象的高程;进一步地,还可以在测量目标对象相对于雷达200的角度、速度以及距离之前,对DDC222处理得到的信号进行时域上的抗干扰处理,例如互相关处理、滤波等。中央处理器232可以实现一定的控制功能(比如控制收发器220、基带处理器231执行相应的操作),还可以根据基带处理器231的测量结果,进行目标聚类、目标跟踪以及目标关联等操作。
存储器240上可以存有指令,所述指令可在所述处理器230上被运行,使得所述雷达200执行本申请实施例中描述的方法。可选的,所述存储器中还可以存储有数据。可选的,处理器中也可以存储指令和/或数据,所述指令和/或数据可以被所述处理器运行,使得所述雷达1100执行本申请实施例中描述的方法。所述处理器和存储器可以单独设置,也可以集成在一起。
本申请中描述的处理器和收发器可实现在集成电路(integrated circuit,IC)、模拟IC、射频集成电路、混合信号IC、专用集成电路(application specific integrated circuit,ASIC)、 印刷电路板(printed circuit board,PCB)、电子设备等上。该处理器和收发器也可以用各种IC工艺技术来制造,例如互补金属氧化物半导体(complementary metal oxide semiconductor,CMOS)、N型金属氧化物半导体(nMetal-oxide-semiconductor,NMOS)、P型金属氧化物半导体(positive channel metal oxide semiconductor,PMOS)、双极结型晶体管(Bipolar Junction Transistor,BJT)、双极CMOS(BiCMOS)、硅锗(SiGe)、砷化镓(GaAs)等。
下面对本申请实施例提供的雷达的天线进行具体描述。
具体来说,发射天线212可以包括第一发射天线和第二发射天线,第一发射天线用于和第二发射天线用于分时发射信号,即在不同的时刻发射不同的信号,比如第一发射天线在t1时刻发射第一发射信号,第二发射天线在t2时刻发射第二发射信号,第一发射信号的波长和第二发射信号的波长可以相同。图3为一种可能的时序示意图。参见图3,频率合成器214依次生成多个发射信号,由第一发射天线和第二发射天线分时发射多个发射信号,进而由接收天线依次接收回波信号;进一步地,从图3可以看出,由于第一发射天线和第二发射天线发射的发射信号均是由频率合成器214生成的,因此第一发射天线和第二发射天线发射的发射信号的波长均相同。
第一发射天线可以包括一根或多根天线,比如第一发射天线包括天线a1和天线a2,则第一发射天线发射第一发射信号,可以是指:天线a1和天线a2分别对第一发射信号进行不同的幅度调制和相位调制后发射出去,并形成一定的方向性增益(即波束指向),具体实现可以参见现有技术。同样地,第二发射天线也可以包括一根或多根天线,接收天线也可以包括一根或多个天线。
具体实施中,第一发射天线、第二发射天线和接收天线可以位于与目标对象的高程的参考平面垂直的直线上。进一步地,本申请实施例对第一发射天线、第二发射天线和接收天线之间的位置关系不做限定,比如,接收天线(包括天线R0、天线R1、天线R2、天线R3)可以位于第一发射天线(包括天线T0)和第二发射天线(包括天线T1)之间,如图4a所示,此时,接收天线与第一发射天线之间的距离(可以表示为Δh1)和接收天线与第二发射天线之间的距离(可以表示为Δh2)可以相等,或者也可以不相等,具体不做限定;又比如,接收天线可以位于第一发射天线和第二发射天线的上方,如图4b所示,或者,接收天线也可以位于第一发射天线和第二发射天线的下方,如图4c所示。
在一个示例中,可以假设图4a所示意的情形中,Δh1和Δh2均为正值,也就是说:当第一发射天线位于接收天线的上方时,第一发射天线与接收天线之间的距离表示为正值;当第二发射天线位于接收天线的下方时,第二发射天线与接收天线之间的距离表示为正值。相应地,在图4b所示意的情形中,由于第一发射天线位于接收天线的下方,因此Δh1为负值;由于第二发射天线位于接收天线的下方,因此Δh2为正值。在图4c所示意的情形中,由于第一发射天线位于接收天线的上方,因此Δh1为正值;由于第二发射天线位于接收天线的上方,因此Δh2为负值。后文中主要以此示例来进行描述。
图5a为本申请实施例提供的确定高程的方法所对应的流程示意图,如图5a所示,包括:
步骤501,根据接收天线接收到的第一回波信号生成第一成像信息,以及根据所述接收天线接收到的第二回波信号生成第二成像信息;所述第一回波信号为目标对象接收到第一发射天线发射的第一发射信号后反射的回波信号,所述第二回波信号为所述目标对象接 收到第二发射天线发射的第二发射信号后反射的回波信号。
本申请实施例中,第一成像信息可以理解为目标对象对第一发射信号的反映,主要是目标对象的后向散射形成的图像信息。进一步地,第一成像信息可以包括多种信息,比如相位信息、幅度信息等。根据目标对象反射的第一回波信号生成第一成像信息的一种可能的实现方式为,接收到第一回波信号(第一回波信号中包括幅度信息和相位信息)后,通过回波信号进行处理,比如对回波信号进行下变频、模数转换等,进而根据处理得到的信号,采用合成孔径雷达(synthetic aperture radar,SAR)成像算法,可以得到第一成像信息。第二成像信息可以参照第一成像信息的描述,不再赘述。
步骤502,根据所述第一成像信息和所述第二成像信息,确定所述目标对象的高程。
本申请实施例中,可以由图2a或图2b中所示意的雷达200或设置在雷达200中的半导体芯片来执行上述步骤501和步骤502。其中,雷达200可以为工作在超短波波段的超短波雷达,或者也可以为工作在微波波段的微波雷达(比如毫米波雷达),下文中将主要以毫米波雷达为例进行描述。
以雷达200来执行上述步骤501和步骤502为例,可以理解为,本申请实施例是在雷达200中增加一个功能模块(简称为高程提取模块),比如在处理器(具体可以为基带处理器)中增加高程提取模块(也可以理解为扩展基带处理器的功能)。参见图5b所示,高程提取模块可以通过两个通道来接收信息,比如通过通道0获取DDC对第一回波信号进行处理后的信息,以及通过通道1获取DDC对第二回波信号进行处理后的信息,进而分别生成第一成像信息和第二成像信息,并确定目标对象的高程。
本申请实施例中,采用上述方法,由于通过不同的发射天线发射第一发射信号和第二发射信号,从而可以基于目标对象反射的第一回波信号和第二回波信号分别生成第一成像信息和第二成像信息,进而能够基于第一成像信息和第二成像信息确定目标对象的高程,该方法的实现简单,且相比于激光雷达确定高程来说,成本较低。比如在车载环境下,本申请实施例可以在不改变车载雷达的主要架构的前提下,来实现利用车载雷达对目标对象的高程进行测量。
在一种可能的实现方式中,根据所述第一成像信息和所述第二成像信息,确定所述目标对象的高程,可以为:根据第一成像信息、第二成像信息以及距离差,确定所述目标对象的高程;其中,所述距离差是所述第一发射信号的传输距离和所述第二发射信号的传输距离的差值。
具体来说,在一个示例中,可以基于图6所示意的流程,确定目标对象的高程。为便于描述,图7示意出了确定目标对象的高程的原理图。需要说明的是,图7是以图4a所示意的天线布局方式为例来示意的,若以图4b或图4c所示意的天线布局方式同样可以采用图6中所示意的流程来确定目标对象的高程,也就是说,具体计算过程中,图4a、图4b和图4c差异之处在于Δh1和Δh2可能不同,比如图4a中Δh1和Δh2均为正值,图4b中Δh1为负值,Δh2为正值,图4c中Δh1为正值,Δh2为负值;除此之外的其它内容,可以参照执行。
下面对可能涉及到的参数进行说明:H表示接收天线与目标对象的高程的参考平面之间的距离,R 0表示第一发射天线与目标对象之间的距离,R 1表示第二发射天线与目标对象之间的距离,R centre表示接收天线与目标对象之间的距离,h centre表示接收天线与目标对象 在垂直方向上的距离,R ground表示天线模块与目标对象在水平方向上的距离,Δh1表示接收天线与第一发射天线之间的距离(具体可以是指接收天线的中心位置与第一发射天线的中心位置之间的距离),Δh2表示接收天线与第二发射天线之间的距离(具体可以是指接收天线的中心位置与第二发射天线的中心位置之间的距离),θ表示接收天线接收第一回波信号或第二回波信号的接收角度,针对于同一目标对象,接收天线接收第一回波信号的接收角度和接收第二回波信号的接收角度相同。所述水平方向可以是指与目标对象的高程的参考平面平行的方向,所述垂直方向可以是指与目标对象的高程的参考平面垂直的方向。其中,Δh1和Δh2可以根据雷达的结构参数得到,H、R centre、R 0、R 1可以测算得到,而h centre、R ground、θ均为未知量。
如图6所示,该流程包括:
步骤601,确定距离差与接收角度的关系,距离差为所述第一发射信号的传输距离和所述第二发射信号的传输距离的差值,可以表示为ΔR。
根据图7可知,第一发射信号的传输距离符合如下公式:
Figure PCTCN2019073735-appb-000005
由于Δh1远小于h centre,因此根据泰勒级数展开可得:
Figure PCTCN2019073735-appb-000006
第二发射信号的传输距离符合如下公式:
Figure PCTCN2019073735-appb-000007
由于Δh2远小于h centre,因此根据泰勒级数展开可得:
Figure PCTCN2019073735-appb-000008
进一步地,距离差与接收角度的关系符合如下公式:
Figure PCTCN2019073735-appb-000009
本申请实施例中,若Δh1=Δh2,则距离差与接收角度的关系可以表示为:
ΔR=COS(θ)Δh1=COS(θ)Δh2
步骤602,确定第一成像信息和第二成像信息的相位差值(可以表示为
Figure PCTCN2019073735-appb-000010
)与距离差的关系。
本申请实施例中,第一成像信息和第二成像信息可以表示为包含幅度信息和相位信息的二维复数数据,比如,第一成像信息可以表示为
Figure PCTCN2019073735-appb-000011
c 1表示第一成像信息,A 1表示第一成像信息的幅度,
Figure PCTCN2019073735-appb-000012
表示第一成像信息的相位;第二成像信息可以表示为
Figure PCTCN2019073735-appb-000013
c 2表示第二成像信息,A 2表示第二成像信息的幅度,
Figure PCTCN2019073735-appb-000014
表示第二成像信息的相位。通过将第一成像信息和第二成像信息进行共轭相乘,可得复干涉图为
Figure PCTCN2019073735-appb-000015
式中的*表示取共轭,进而干涉相位(即第一成像信息和第二成像信息的相位差值)可以表示为:
Figure PCTCN2019073735-appb-000016
进一步地,由于第一成像信息的相位包含距离信息(比如第一发射信号的传输距离和第一回波信号的传输距离)和目标对象的散射特性,示例性地,第一成像信息的相位可以表示为:
Figure PCTCN2019073735-appb-000017
第二成像信息的相位包含距离信息(比如第二发射信号的传输距离和第二回波信号的传输距离)和目标对象的散射特性,示例性地,第二成像信息的相位可以表示为:
Figure PCTCN2019073735-appb-000018
为目标对象的散射相位。因此,可得到相位差值与距离差的关系符合如下公式:
Figure PCTCN2019073735-appb-000019
其中,λ表示所述第一发射信号的波长或所述第二发射信号的波长。
需要说明的是:(1)Δh1和Δh2是由第一发射天线、第二发射天线和接收天线之间的位置关系决定的,本领域技术人员可以根据实际需要对第一发射天线、第二发射天线和接收天线之间的位置关系进行设置。在一个示例中,可以根据第一发射信号的波长或第二发射信号的波长来设置第一发射天线、第二发射天线和接收天线之间的位置关系,从而使得Δh1和Δh2的平均值
Figure PCTCN2019073735-appb-000020
小于或等于所述波长。由于ΔR小于
Figure PCTCN2019073735-appb-000021
因此当Δh1和Δh2的平均值小于或等于所述波长时,ΔR小于λ,从而使得上述第一成像信息和第二成像信息共轭相乘得到的相位差值即为第一成像信息的相位与第二成像信息的相位的差值(即真实的相位差值),不存在相位模糊,无需进行相位解缠操作。
或者,本申请实施例中也可以预先设置接收角度的取值范围,来避免相位模糊,比如:
Figure PCTCN2019073735-appb-000022
在一个示例中,可以通过设置接收天线的波束宽度使得接收角度符合上述取值范围,具体不做限定。
(2)本申请实施例中,雷达(比如毫米波雷达)对位置的分辨能力分为距离向分辨率和方位向分辨率,一个距离向分辨率和一个方位向分辨率构成一个分辨单元。距离向分辨率与毫米波雷达发射信号的信号带宽成反比,即信号带宽越大,距离向分辨率越小;方位向分辨率由天线尺寸决定,天线尺寸越大,方位向分辨率越小。毫米波雷达在定位目标对象时,可以将目标对象所在的分辨单元的坐标作为目标对象的坐标。如图8所示,为本申请实施例提供的一种毫米波雷达分辨率示意图。图8中,B1B4或B2B3之间的距离表示距离向分辨率,B1B2或B3B4之间的距离表示方位向分辨率,B1、B2、B3、B4围成的区域为一个分辨单元。由于ΔR远小于距离向分辨率,可以说明反射第一回波信号的对象和第二回波信号的对象位于同一分辨单元,也就是说,可以认为第一回波信号和第二回波信号是由某一分辨单元内的对象(即为目标对象)反射的,从而可以保证确定出的高程为目标对象的高程。
(3)本申请实施例中,由于第一发射天线和第二发射天线的下视角不同,从而使得第一成像信息和第二成像信息在像素上并不能完全重合,因此可以先对第一成像信息和第二成像信息进行图像配准,以达到两幅图像的像素点一一对应,进而基于配准后的第一成像信息和第二成像信息进行共轭相乘运算。
步骤603,根据距离差与接收角度的关系以及相位差值与距离差的关系,得到接收角度。
具体来说,结合上述公式,可知:
Figure PCTCN2019073735-appb-000023
Figure PCTCN2019073735-appb-000024
当Δh1=Δh2时,可以得到:
Figure PCTCN2019073735-appb-000025
步骤604,根据接收天线与目标对象的高程的参考平面之间的距离、接收天线和目标对象之间的距离和接收角度,得到目标对象的高程。
比如,基于下述公式可以得到目标对象的高程(可以表示为h):
h=H-h centre=H-R centreCOS(θ)
通过步骤601至步骤604可以看出,由于第一发射信号的传输距离和第二发射信号的传输距离的差值小于发射信号的波长,从而使得第一成像信息和第二成像信息的相位差值不存在相位模糊,此时可通过上述所描述的简化方式来确定目标对象的高程,从而能大大降低计算的复杂度,提高处理效率。
上述内容具体描述了确定目标对象的高程的方式,本申请实施例中还可以根据目标对象的高程确定出目标对象的三维特征。参见图9所示,以第一发射天线、第二发射天线和接收天线所在的直线为z轴,以雷达的距离向为x轴,方位向为y轴,则上述确定出的目标对象的高程即为目标对象在z轴上的坐标。进一步地,雷达还可以确定出目标对象在x轴和在y轴上的坐标,从而得到目标对象的三维特征。如此,在车载环境中,雷达通过采用上述方式确定出多个目标对象的三维特征后,即可实现对路面状况的分析。
上述主要对本申请实施例提供的方案进行了介绍。可以理解的是,为了实现上述功能,确定高程的装置可以包括执行各个功能相应的硬件结构和/或软件模块。本领域技术人员应该很容易意识到,结合本文中所公开的实施例描述的各示例的单元及算法步骤,本申请的实施例能够以硬件或硬件和计算机软件的结合形式来实现。某个功能究竟以硬件还是计算机软件驱动硬件的方式来执行,取决于技术方案的特定应用和设计约束条件。专业技术人员可以对每个特定的应用来使用不同方法来实现所描述的功能,但是这种实现不应认为超出本申请的范围。
在采用集成的单元(模块)的情况下,图10示出了本申请实施例中所涉及的确定高程的装置的可能的示例性框图。该装置1000可以包括:生成单元1001和确定单元1002。
进一步地,在一个示例中,该装置1000可以为设置在雷达中的半导体芯片,此时,生成单元1001和确定单元1002可以理解为图5b中所示意的高程提取模块,生成单元1001和确定单元1002的具体功能可以由处理器(比如基带处理器)来实现。
在又一个示例中,该装置1000可以为雷达,此时,该装置1000还可以包括通信单元1003、存储单元1004。其中,生成单元1001和确定单元1002也可以统称为处理单元,用 于对装置1000的动作进行控制管理,通信单元1003也称为收发单元,可以包括接收单元和/或发送单元,分别用于执行接收和发送操作。存储单元1004用于存储装置1000的程序代码和/或数据。具体来说,生成单元1001和确定单元1002可以为处理器,其可以实现或执行结合本申请的实施例公开内容所描述的各种示例性的逻辑方框,模块和电路。通信单元1003可以是通信接口、收发器或收发电路等,其中,该通信接口是统称,在具体实现中,该通信接口可以包括多个接口。存储单元1004可以是存储器。
本申请实施例中,生成单元1001和确定单元1002可以支持装置1000执行上文方法示例中的动作。例如,生成单元1001用于执行图5a中的步骤501,确定单元1002用于执行图5a中的步骤502和图6中的步骤601至步骤604。
具体地,在一个实施例中,生成单元1001,用于根据接收天线接收到的第一回波信号生成第一成像信息,以及根据所述接收天线接收到的第二回波信号生成第二成像信息;所述第一回波信号为目标对象接收到第一发射天线发射的第一发射信号后反射的回波信号,所述第二回波信号为所述目标对象接收到第二发射天线发射的第二发射信号后反射的回波信号;
确定单元1002,用于根据所述第一成像信息和所述第二成像信息,确定所述目标对象的高程。
在一种可能的设计中,所述确定单元1002具体用于:
根据第一成像信息、第二成像信息,得到所述第一成像信息和所述第二成像信息的相位差值;
根据所述相位差值、所述接收天线与所述第一发射天线之间的距离、所述接收天线与所述第二发射天线之间的距离以及所述第一发射信号的波长或所述第二发射信号的波长,得到所述接收天线接收所述第一回波信号或所述第二回波信号的接收角度;
根据所述接收角度,确定所述目标对象的高程。
在一种可能的设计中,所述确定单元1002具体用于:
确定距离差与所述接收角度的关系符合如下公式:
Figure PCTCN2019073735-appb-000026
其中,ΔR表示所述距离差,所述距离差为所述第一发射信号的传输距离和所述第二发射信号的传输距离的差值,θ表示所述接收角度,Δh1表示接收天线与第一发射天线之间的距离,Δh2表示接收天线与第二发射天线之间的距离;
确定所述相位差值与所述距离差的关系符合如下公式:
Figure PCTCN2019073735-appb-000027
其中,
Figure PCTCN2019073735-appb-000028
表示所述相位差值,λ表示所述第一发射信号或所述第二发射信号的波长;
根据所述距离差与所述接收角度的关系以及所述相位差值与所述距离差的关系,可得到所述接收角度符合下公式:
Figure PCTCN2019073735-appb-000029
在一种可能的设计中,所述确定单元1002具体用于:
通过如下公式计算所述目标对象的高程:
h=H-R centreCOS(θ)
其中,h表示所述目标对象的高程,H表示所述接收天线与所述高程的参考平面之间的距离,R centre表示所述接收天线与所述目标对象之间的距离,θ表示所述接收角度。
需要说明的是,本申请实施例中对单元(模块)的划分是示意性的,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式。在本申请的实施例中的各功能模块可以集成在一个处理模块中,也可以是各个模块单独物理存在,也可以两个或两个以上模块集成在一个模块中。上述集成的模块既可以采用硬件的形式实现,也可以采用软件功能模块的形式实现。
所述集成的模块如果以软件功能模块的形式实现并作为独立的产品销售或使用时,可以存储在一个计算机可读取存储介质中。基于这样的理解,本申请实施例的技术方案本质上或者说对现有技术做出贡献的部分或者该技术方案的全部或部分可以以软件产品的形式体现出来,该计算机软件产品存储在一个存储介质中,包括若干指令用以使得一台计算机设备(可以是个人计算机,服务器,或者网络设备等)或处理器(processor)执行本申请各个实施例所述方法的全部或部分步骤。而前述的存储介质可以为存储器等各种可以存储程序代码的介质。
在上述实施例中,可以全部或部分地通过软件、硬件、固件或者其任意组合来实现。当使用软件实现时,可以全部或部分地以计算机程序产品的形式实现。所述计算机程序产品包括一个或多个计算机指令。在计算机上加载和执行所述计算机程序指令时,全部或部分地产生按照本发明实施例所述的流程或功能。所述计算机可以是通用计算机、专用计算机、计算机网络、或者其他可编程装置。所述计算机指令可以存储在计算机可读存储介质中,或者从一个计算机可读存储介质向另一个计算机可读存储介质传输,例如,所述计算机指令可以从一个网站站点、计算机、服务器或数据中心通过有线(例如同轴电缆、光纤、数字用户线(DSL))或无线(例如红外、无线、微波等)方式向另一个网站站点、计算机、服务器或数据中心进行传输。所述计算机可读存储介质可以是计算机能够存取的任何可用介质或者是包含一个或多个可用介质集成的服务器、数据中心等数据存储设备。所述可用介质可以是磁性介质(例如,软盘、硬盘、磁带)、光介质(例如,DVD)、或者半导体介质(例如固态硬盘(solid state disk,SSD))等。
本发明实施例是参照根据本发明实施例的方法、设备(系统)、和计算机程序产品的流程图和/或方框图来描述的。应理解可由计算机程序指令实现流程图和/或方框图中的每一流程和/或方框、以及流程图和/或方框图中的流程和/或方框的结合。可提供这些计算机程序指令到通用计算机、专用计机、嵌入式处理机或其他可编程数据处理设备的处理器以产生一个机器,使得通过计算机或其他可编程数据处理设备的处理器执行的指令产生用于实现在流程图一个流程或多个流程和/或方框图一个方框或多个方框中指定的功能的装置。
这些计算机程序指令也可存储在能引导计算机或其他可编程数据处理设备以特定方式工作的计算机可读存储器中,使得存储在该计算机可读存储器中的指令产生包括指令装置的制造品,该指令装置实现在流程图一个流程或多个流程和/或方框图一个方框或多个方框中指定的功能。
这些计算机程序指令也可装载到计算机或其他可编程数据处理设备上,使得在计算机或其他可编程设备上执行一系列操作步骤以产生计算机实现的处理,从而在计算机或其他 可编程设备上执行的指令提供用于实现在流程图一个流程或多个流程和/或方框图一个方框或多个方框中指定的功能的步骤。
显然,本领域的技术人员可以对本发明实施例进行各种改动和变型而不脱离本申请的精神和范围。这样,倘若本发明实施例的这些修改和变型属于本申请权利要求及其等同技术的范围之内,则本申请也意图包含这些改动和变型在内。

Claims (18)

  1. 一种确定高程的方法,其特征在于,所述方法包括:
    根据接收天线接收到的第一回波信号生成第一成像信息,以及根据所述接收天线接收到的第二回波信号生成第二成像信息;所述第一回波信号为目标对象接收到第一发射天线发射的第一发射信号后反射的回波信号,所述第二回波信号为所述目标对象接收到第二发射天线发射的第二发射信号后反射的回波信号;
    根据所述第一成像信息和所述第二成像信息,确定所述目标对象的高程。
  2. 根据权利要求1所述的方法,其特征在于:
    所述第一发射天线、所述第二发射天线和所述接收天线位于与所述高程的参考平面垂直的直线上。
  3. 根据权利要求2所述的方法,其特征在于:
    所述接收天线位于所述第一发射天线和所述第二发射天线之间。
  4. 根据权利要求3所述的方法,其特征在于:
    所述接收天线与所述第一发射天线之间的距离和所述接收天线与所述第二发射天线之间的距离的平均值小于或等于所述第一发射信号的波长或所述第二发射信号的波长。
  5. 根据权利要求4所述的方法,其特征在于,所述第一发射信号的波长和所述第二发射信号的波长相同。
  6. 根据权利要求1至5中任一项所述的方法,其特征在于,根据所述第一成像信息和所述第二成像信息,确定所述目标对象的高程,包括:
    根据第一成像信息、第二成像信息,得到所述第一成像信息和所述第二成像信息的相位差值;
    根据所述相位差值、所述接收天线与所述第一发射天线之间的距离、所述接收天线与所述第二发射天线之间的距离以及所述第一发射信号的波长或所述第二发射信号的波长,得到所述接收天线接收所述第一回波信号或所述第二回波信号的接收角度;
    根据所述接收角度,确定所述目标对象的高程。
  7. 根据权利要求6所述的方法,其特征在于,所述根据所述接收角度确定所述目标对象的高程包括:
    通过如下公式计算所述目标对象的高程:
    h=H-R centreCOS(θ)
    其中,h表示所述目标对象的高程,H表示所述接收天线与所述高程的参考平面之间的距离,R centre表示所述接收天线与所述目标对象之间的距离,θ表示所述接收角度。
  8. 一种确定高程的装置,其特征在于,所述装置包括:
    生成单元,用于根据接收天线接收到的第一回波信号生成第一成像信息,以及根据所述接收天线接收到的第二回波信号生成第二成像信息;所述第一回波信号为目标对象接收到第一发射天线发射的第一发射信号后反射的回波信号,所述第二回波信号为所述目标对象接收到第二发射天线发射的第二发射信号后反射的回波信号;
    确定单元,用于根据所述第一成像信息和所述第二成像信息,确定所述目标对象的高程。
  9. 根据权利要求8所述的装置,其特征在于:
    所述第一发射天线、所述第二发射天线和所述接收天线位于与所述高程的参考平面垂直的直线上。
  10. 根据权利要求9所述的装置,其特征在于:
    所述接收天线位于所述第一发射天线和所述第二发射天线之间。
  11. 根据权利要求10所述的装置,其特征在于:
    所述接收天线与所述第一发射天线之间的距离和所述接收天线与所述第二发射天线之间的距离的平均值小于或等于所述第一发射信号的波长或所述第二发射信号的波长。
  12. 根据权利要求11所述的装置,其特征在于,所述第一发射信号的波长和所述第二发射信号的波长相同。
  13. 根据权利要求8至12中任一项所述的装置,其特征在于,所述确定单元具体用于:
    根据第一成像信息、第二成像信息,得到所述第一成像信息和所述第二成像信息的相位差值;
    根据所述相位差值、所述接收天线与所述第一发射天线之间的距离、所述接收天线与所述第二发射天线之间的距离以及所述第一发射信号的波长或所述第二发射信号的波长,得到所述接收天线接收所述第一回波信号或所述第二回波信号的接收角度;
    根据所述接收角度,确定所述目标对象的高程。
  14. 根据权利要求13所述的装置,其特征在于,所述确定单元具体用于:
    通过如下公式计算所述目标对象的高程:
    h=H-R centreCOS(θ)
    其中,h表示所述目标对象的高程,H表示所述接收天线与所述高程的参考平面之间的距离,R centre表示所述接收天线与所述目标对象之间的距离,θ表示所述接收角度。
  15. 一种确定高程的装置,其特征在于,所述装置包括处理器、存储器以及存储在存储器上并可在处理器上运行的指令,当所述指令被运行时,使得所述装置执行如权利要求1至7中任一项所述的方法。
  16. 一种雷达,其特征在于,所述雷达包括第一发射天线、第二发射天线、接收天线和如权利要求15所述的装置。
  17. 一种计算机可读存储介质,其特征在于,包括指令,当其在计算机上运行时,使得计算机执行如权利要求1至7中任一项所述的方法。
  18. 一种计算机程序产品,其特征在于,当其在计算机上运行时,使得计算机执行如权利要求1至7中任一项所述的方法。
PCT/CN2019/073735 2019-01-29 2019-01-29 一种确定高程的方法、装置及雷达 WO2020154903A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201980089284.7A CN113302519A (zh) 2019-01-29 2019-01-29 一种确定高程的方法、装置及雷达
PCT/CN2019/073735 WO2020154903A1 (zh) 2019-01-29 2019-01-29 一种确定高程的方法、装置及雷达

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2019/073735 WO2020154903A1 (zh) 2019-01-29 2019-01-29 一种确定高程的方法、装置及雷达

Publications (1)

Publication Number Publication Date
WO2020154903A1 true WO2020154903A1 (zh) 2020-08-06

Family

ID=71841744

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/073735 WO2020154903A1 (zh) 2019-01-29 2019-01-29 一种确定高程的方法、装置及雷达

Country Status (2)

Country Link
CN (1) CN113302519A (zh)
WO (1) WO2020154903A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220196827A1 (en) * 2020-12-22 2022-06-23 Electronics And Telecommunications Research Institute Method and apparatus for calculating altitude of target

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024113254A1 (zh) * 2022-11-30 2024-06-06 北京小米移动软件有限公司 散射体位置确定方法、装置及系统

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101017202A (zh) * 2006-12-18 2007-08-15 电子科技大学 一种雷达高度表及采用该表对飞行器位置的测量方法
CN103412308A (zh) * 2013-08-21 2013-11-27 中国科学院电子学研究所 一种高精度干涉合成孔径雷达系统
CN103713287A (zh) * 2013-12-26 2014-04-09 中国科学院电子学研究所 一种基于互质多基线的高程重建方法及装置
CN104380136A (zh) * 2012-06-25 2015-02-25 奥托里夫Asp股份有限公司 用于三维探测的双通道单脉冲雷达
US20170187102A1 (en) * 2015-12-24 2017-06-29 Nidec Elesys Corporation On-vehicle radar device
CN109239710A (zh) * 2018-08-31 2019-01-18 中国科学院电子学研究所 雷达高程信息的获取方法及装置、计算机可读存储介质

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101017202A (zh) * 2006-12-18 2007-08-15 电子科技大学 一种雷达高度表及采用该表对飞行器位置的测量方法
CN104380136A (zh) * 2012-06-25 2015-02-25 奥托里夫Asp股份有限公司 用于三维探测的双通道单脉冲雷达
CN103412308A (zh) * 2013-08-21 2013-11-27 中国科学院电子学研究所 一种高精度干涉合成孔径雷达系统
CN103713287A (zh) * 2013-12-26 2014-04-09 中国科学院电子学研究所 一种基于互质多基线的高程重建方法及装置
US20170187102A1 (en) * 2015-12-24 2017-06-29 Nidec Elesys Corporation On-vehicle radar device
CN109239710A (zh) * 2018-08-31 2019-01-18 中国科学院电子学研究所 雷达高程信息的获取方法及装置、计算机可读存储介质

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ZHANG, JICHAO: "Research on Practical Methods of the DEM Generation from InSAR", DOCTORAL DISSERTATION, no. 2, 15 December 2002 (2002-12-15), pages 1 - 57, XP009522372, ISSN: 1671-6779 *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220196827A1 (en) * 2020-12-22 2022-06-23 Electronics And Telecommunications Research Institute Method and apparatus for calculating altitude of target

Also Published As

Publication number Publication date
CN113302519A (zh) 2021-08-24

Similar Documents

Publication Publication Date Title
US11821990B2 (en) Scene perception using coherent doppler LiDAR
US11520024B2 (en) Automatic autonomous vehicle and robot LiDAR-camera extrinsic calibration
CN113287036B (zh) 一种速度解模糊的方法及回波信号处理装置
US11729520B2 (en) Sensor layout for autonomous vehicles
US11892560B2 (en) High precision multi-sensor extrinsic calibration via production line and mobile station
US10818110B2 (en) Methods and systems for providing a mixed autonomy vehicle trip summary
US11247675B2 (en) Systems and methods of autonomously controlling vehicle states
EP3669246B1 (en) Detecting motion of an autonomous vehicle using radar technology
WO2022020995A1 (zh) 一种信号处理方法、装置以及存储介质
US10522887B2 (en) Communication system for a vehicle comprising a dual channel rotary joint coupled to a plurality of interface waveguides for coupling electromagnetic signals between plural communication chips
US11688917B2 (en) Radar system for use in a vehicle comprising a rotary joint where a non-rotational unit is fixed to the vehicle and a rotational unit includes antennas configured for use with radar signals
US20210213930A1 (en) Lane prediction and smoothing for extended motion planning horizon
US20200094687A1 (en) Supplemental battery system
WO2020154903A1 (zh) 一种确定高程的方法、装置及雷达
CN112714877A (zh) 信号传输方法及装置、信号处理方法及装置以及雷达系统
CN115087881B (zh) 一种波达角aoa估计方法和装置
CN112673271B (zh) 一种近场估计的方法及装置
US11571987B2 (en) Optimization of battery pack size using swapping
EP3936896A1 (en) Distance measurement method and device based on detection signal
CN115407344B (zh) 栅格地图创建方法、装置、车辆及可读存储介质
US20220334244A1 (en) Radar ranging method and device, radar and in-vehicle system
WO2024113207A1 (zh) 一种数据处理方法以及装置
CN114640152A (zh) 充电器基础设施上的天线

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19913585

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19913585

Country of ref document: EP

Kind code of ref document: A1