CN114789734A - Perception information compensation method, device, vehicle, storage medium, and program - Google Patents

Perception information compensation method, device, vehicle, storage medium, and program Download PDF

Info

Publication number
CN114789734A
CN114789734A CN202210280799.9A CN202210280799A CN114789734A CN 114789734 A CN114789734 A CN 114789734A CN 202210280799 A CN202210280799 A CN 202210280799A CN 114789734 A CN114789734 A CN 114789734A
Authority
CN
China
Prior art keywords
vehicle
relative
moving object
target moving
relative position
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210280799.9A
Other languages
Chinese (zh)
Inventor
朱浩天
肖晖
程洋
郭丰瑜
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chery Automobile Co Ltd
Original Assignee
Chery Automobile Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chery Automobile Co Ltd filed Critical Chery Automobile Co Ltd
Priority to CN202210280799.9A priority Critical patent/CN114789734A/en
Publication of CN114789734A publication Critical patent/CN114789734A/en
Pending legal-status Critical Current

Links

Images

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W60/00Drive control systems specially adapted for autonomous road vehicles
    • B60W60/001Planning or execution of driving tasks
    • B60W60/0015Planning or execution of driving tasks specially adapted for safety
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2554/00Input parameters relating to objects
    • B60W2554/80Spatial relation or speed relative to objects

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Human Computer Interaction (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Traffic Control Systems (AREA)

Abstract

The application discloses a perception information compensation method, a device, a vehicle, a storage medium and a program, and belongs to the field of intelligent driving. The method comprises the following steps: determining the relative position and relative speed of the first vehicle and the second vehicle, acquiring perception information of the second vehicle, converting the relative position and relative speed of the target moving object and the second vehicle into the relative position and relative speed of the target moving object and the first vehicle based on the relative position and relative speed of the first vehicle and the second vehicle, and compensating the perception information of the first vehicle. Since the perception information of the second vehicle includes the relative position and the relative speed of the target moving object and the second vehicle, the second vehicle is located within the perception range of the first vehicle. Therefore, the first vehicle can convert the relative position and the relative velocity of the target moving object and the second vehicle into the relative position and the relative velocity of the target moving object and the first vehicle based on the relative position and the relative velocity of the first vehicle and the second vehicle.

Description

Perception information compensation method, device, vehicle, storage medium, and program
Technical Field
The present application relates to the field of intelligent driving, and in particular, to a method and an apparatus for compensating perception information, a vehicle, a storage medium, and a program.
Background
In the field of intelligent driving, a vehicle can sense road conditions through various sensors mounted on the vehicle body so as to obtain road condition information. In this way, the driver can control the vehicle based on the road condition information. However, in practical applications, because the road conditions are complicated and changeable, various sensors mounted on the vehicle body may be shielded, so that the sensors cannot sense part of the road conditions, thereby influencing the control of the vehicle by the driver. Therefore, how to compensate the perception information of the vehicle becomes a problem which needs to be solved urgently at present.
Disclosure of Invention
The application provides a perception information compensation method, a device, a vehicle, a storage medium and a program, which can solve the problem that the perception information of the vehicle in the related art cannot be compensated. The technical scheme is as follows:
in one aspect, a perceptual information compensation method is provided, applied to a first vehicle, the method comprising:
determining a relative position and a relative speed of the first vehicle and a second vehicle, the second vehicle being within a perception range of the first vehicle;
acquiring perception information of the second vehicle, wherein the perception information of the second vehicle comprises a relative position and a relative speed of a target moving object and the second vehicle, the target moving object is located in a perception range of the second vehicle, and the target moving object is located in a blind area of the first vehicle;
and converting the relative position and the relative speed of the target moving object and the second vehicle into the relative position and the relative speed of the target moving object and the first vehicle based on the relative position and the relative speed of the first vehicle and the second vehicle so as to compensate the perception information of the first vehicle.
Optionally, the acquiring perception information of the second vehicle includes:
acquiring actual positioning information of the first vehicle and actual positioning information of the second vehicle;
determining theoretical positioning information of the second vehicle based on actual positioning information of the first vehicle and relative positions and relative speeds of the first vehicle and the second vehicle;
and acquiring perception information of the second vehicle under the condition that the actual positioning information of the second vehicle is the same as the theoretical positioning information of the second vehicle.
Optionally, the perception information of the second vehicle further includes a data check code;
the converting the relative position and the relative speed of the target moving object and the second vehicle into the relative position and the relative speed of the target moving object and the first vehicle based on the relative position and the relative speed of the first vehicle and the second vehicle includes:
coding the relative position and the relative speed of the target moving object and the second vehicle to obtain a data code;
and under the condition that the data code is the same as the data check code, converting the relative position and the relative speed of the target moving object and the second vehicle into the relative position and the relative speed of the target moving object and the first vehicle on the basis of the relative position and the relative speed of the first vehicle and the second vehicle.
Optionally, the relative position comprises a relative distance and a relative angle;
the converting the relative position and the relative speed of the target moving object and the second vehicle into the relative position and the relative speed of the target moving object and the first vehicle based on the relative position and the relative speed of the first vehicle and the second vehicle includes:
determining a relative distance of the target moving object from the first vehicle based on a relative distance of the first vehicle from the second vehicle, a relative distance of the target moving object from the second vehicle, a relative angle of the first vehicle from the second vehicle, and a relative angle of the target moving object from the second vehicle;
determining a relative angle of the target moving object to the first vehicle based on a relative angle of the first vehicle to the second vehicle and a relative angle of the target moving object to the second vehicle;
subtracting the relative speed of the target moving object and the second vehicle from the relative speed of the first vehicle and the second vehicle to obtain the relative speed of the target moving object and the first vehicle.
Optionally, the determining the relative distance of the target moving object from the first vehicle based on the relative distance of the first vehicle from the second vehicle, the relative distance of the target moving object from the second vehicle, the relative angle of the first vehicle from the second vehicle, and the relative angle of the target moving object from the second vehicle comprises:
determining a relative distance of the target moving object from the first vehicle according to the following formula based on a relative distance of the first vehicle from the second vehicle, a relative distance of the target moving object from the second vehicle, a relative angle of the first vehicle from the second vehicle, and a relative angle of the target moving object from the second vehicle;
Figure BDA0003556790700000031
wherein, in the above formula, d BC Representing the relative distance of the target moving object from the first vehicle, d BA Representing the relative distance of the first vehicle from the second vehicle, d AC Representing the relative distance, θ, of the target moving object from the second vehicle BA Represents a relative angle, θ, of the first vehicle and the second vehicle AC Representing a relative angle of the target moving object and the second vehicle.
Optionally, the target moving object comprises a moving vehicle, a pedestrian, an animal.
In another aspect, there is provided a perceptual information compensating apparatus applied to a first vehicle, the apparatus comprising:
the determining module is used for determining the relative position and the relative speed of the first vehicle and a second vehicle, wherein the second vehicle is positioned in the perception range of the first vehicle;
the acquisition module is used for acquiring perception information of the second vehicle, wherein the perception information of the second vehicle comprises a relative position and a relative speed of a target moving object and the second vehicle, the target moving object is located in a perception range of the second vehicle, and the target moving object is located in a blind area of the first vehicle;
the conversion module is used for converting the relative position and the relative speed of the target moving object and the second vehicle into the relative position and the relative speed of the target moving object and the first vehicle based on the relative position and the relative speed of the first vehicle and the second vehicle so as to compensate the perception information of the first vehicle.
Optionally, the obtaining module is specifically configured to:
acquiring actual positioning information of the first vehicle and actual positioning information of the second vehicle;
determining theoretical positioning information of the second vehicle based on actual positioning information of the first vehicle and relative positions and relative speeds of the first vehicle and the second vehicle;
and acquiring the perception information of the second vehicle under the condition that the actual positioning information of the second vehicle is the same as the theoretical positioning information of the second vehicle.
Optionally, the perception information of the second vehicle further includes a data check code;
the conversion module is specifically configured to:
coding the relative position and the relative speed of the target moving object and the second vehicle to obtain a data code;
and under the condition that the data code is the same as the data check code, converting the relative position and the relative speed of the target moving object and the second vehicle into the relative position and the relative speed of the target moving object and the first vehicle on the basis of the relative position and the relative speed of the first vehicle and the second vehicle.
Optionally, the relative position comprises a relative distance and a relative angle;
the conversion module includes:
a first determination unit configured to determine a relative distance of the target moving object from the first vehicle based on a relative distance of the first vehicle from the second vehicle, a relative distance of the target moving object from the second vehicle, a relative angle of the first vehicle from the second vehicle, and a relative angle of the target moving object from the second vehicle;
a second determination unit configured to determine a relative angle of the target moving object with respect to the first vehicle based on a relative angle of the first vehicle with respect to the second vehicle and a relative angle of the target moving object with respect to the second vehicle;
an obtaining unit, configured to subtract the relative speed between the target moving object and the second vehicle from the relative speed between the first vehicle and the second vehicle to obtain a relative speed between the target moving object and the first vehicle.
Optionally, the first determining unit is specifically configured to:
determining a relative distance of the target moving object from the first vehicle according to the following formula based on a relative distance of the first vehicle from the second vehicle, a relative distance of the target moving object from the second vehicle, a relative angle of the first vehicle from the second vehicle, and a relative angle of the target moving object from the second vehicle;
Figure BDA0003556790700000041
wherein, in the above formula, d BC Representing the relative distance of the target moving object from the first vehicle, d BA Representing the relative distance of the first vehicle from the second vehicle, d AC Representing movement of said objectRelative distance of object from the second vehicle, θ BA Representing the relative angle, θ, of the first vehicle and the second vehicle AC Representing a relative angle of the target moving object and the second vehicle.
Optionally, the target moving object comprises a moving vehicle, a pedestrian, an animal.
In another aspect, a vehicle is provided, which includes a memory for storing a computer program and a processor for executing the computer program stored in the memory to implement the steps of the perceptual information compensation method described above.
In another aspect, a computer-readable storage medium is provided, in which a computer program is stored, which, when being executed by a processor, carries out the steps of the perceptual information compensation method described above.
In another aspect, a computer program is provided comprising instructions which, when run on a computer, cause the computer to perform the steps of the perceptual information compensation method described above.
The technical scheme provided by the application can at least bring the following beneficial effects:
in a case where the target moving object is located within the blind area of the first vehicle, the first vehicle may acquire perception information of the second vehicle. Since the perception information of the second vehicle includes the relative position and the relative speed of the target moving object and the second vehicle, the second vehicle is located within the perception range of the first vehicle. Therefore, the first vehicle can convert the relative position and the relative speed of the target moving object and the second vehicle into the relative position and the relative speed of the target moving object and the first vehicle based on the relative position and the relative speed of the first vehicle and the second vehicle, so as to compensate the perception information of the first vehicle.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed to be used in the description of the embodiments are briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
FIG. 1 is a schematic structural diagram of a vehicle according to an embodiment of the present disclosure;
fig. 2 is a flowchart of a method for compensating perceptual information according to an embodiment of the present application;
FIG. 3 is a schematic diagram of determining relative positions of a first vehicle and a second vehicle according to an embodiment of the present disclosure;
FIG. 4 is a schematic diagram for determining a relative angle between a target moving object and a first vehicle according to an embodiment of the present disclosure;
fig. 5 is a schematic structural diagram of a perceptual information compensation apparatus according to an embodiment of the present application;
fig. 6 is a schematic structural diagram of a perceptual information compensation apparatus according to an embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the embodiments of the present application clearer, the embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
Before explaining the perceptual information compensation method provided by the embodiment of the present application in detail, an application scenario provided by the embodiment of the present application is introduced.
The perception information compensation method provided by the embodiment of the application can be applied to various scenes, for example, in a vehicle driving scene, a first vehicle can perceive road conditions through various sensors arranged on a vehicle body, and therefore a driver is assisted in controlling the vehicle. However, in some cases, the second vehicle may occlude portions of the sensors of the first vehicle, such that the occluded sensors cannot perceive the target moving object. At this time, the driver of the first vehicle cannot acquire the relative position and the relative speed of the target moving object and the first vehicle, so that the first vehicle cannot be controlled to avoid a possible risk. Therefore, the first vehicle can perform perception information compensation according to the perception information compensation method provided by the embodiment of the application, so that the relative position and the relative speed of the target moving object and the first vehicle are obtained, and the driver can avoid possible risks in advance.
The perception information compensation method provided by the embodiment of the application can be executed by a vehicle. Furthermore, the perception information compensation method provided by the embodiment of the present application may also be implemented by a perception information compensation device, and a vehicle is taken as an example to be described later.
Referring to fig. 1, a first vehicle includes a positioning module, a sensing module, and a communication module. The positioning module is used for determining actual positioning information of the first vehicle, the sensing module is used for determining the relative position and the relative speed of the first vehicle and the second vehicle, and the communication module is used for receiving the sensing information of the second vehicle and the actual positioning information of the second vehicle.
The perception information compensation device may be any electronic product that can perform human-Computer interaction with a user through one or more modes such as a keyboard, a touch pad, a touch screen, a remote controller, voice interaction or handwriting equipment, for example, a PC (Personal Computer), a mobile phone, a smart phone, a PDA (Personal Digital Assistant), a handheld PC (pocket PC), a tablet Computer, a smart car machine, a smart television, a smart sound box, and the like.
Those skilled in the art should understand that the above-mentioned application scenarios and perceptual information compensation apparatuses are only examples, and other existing or future application scenarios and perceptual information compensation apparatuses that may be present in the present application embodiments, such as may be applicable to the present application embodiments, are also included in the scope of the present application embodiments and are incorporated herein by reference.
It should be noted that the service scenario described in the embodiment of the present application is for more clearly illustrating the technical solution of the embodiment of the present application, and does not constitute a limitation to the technical solution provided in the embodiment of the present application, and as a person having ordinary skill in the art knows, with the occurrence of a new service scenario, the technical solution provided in the embodiment of the present application is also applicable to similar technical problems.
Next, a method for compensating for perceptual information provided in an embodiment of the present application will be explained in detail.
Fig. 2 is a flowchart of a perceptual information compensation method according to an embodiment of the present application, please refer to fig. 2, which includes the following steps.
Step 201: the first vehicle determines a relative position and a relative speed of the first vehicle to a second vehicle, the second vehicle being within a perception range of the first vehicle.
In some embodiments, the first vehicle determines the relative position and relative velocity of the first vehicle and the second vehicle via a perception module. Wherein, perception module includes camera and radar, and the radar includes millimeter wave radar and ultrasonic radar etc..
The relative position of the first vehicle and the second vehicle includes the relative distance of the first vehicle and the second vehicle, and the relative angle of the first vehicle and the second vehicle. And the first vehicle determines the relative angle between the first vehicle and the second vehicle by taking the vehicle body center point as a starting point and a ray with the direction as the horizontal direction as a reference line, wherein the relative angle between the first vehicle and the second vehicle is an included angle between a connecting line of the vehicle body center point of the first vehicle and the vehicle body center point of the second vehicle and the reference line.
For example, please refer to FIG. 3, d BA Representing the relative distance between the first vehicle and the second vehicle, taking the vehicle body center point B as a starting point and a ray with the direction as the horizontal direction as a reference line, and taking the relative angle theta of the first vehicle and the second vehicle as a reference line BA Is the included angle between the connecting line of the vehicle body center point B of the first vehicle and the vehicle body center point A of the second vehicle and the reference line.
The first vehicle transmits an electromagnetic wave signal to the second vehicle through the radar and receives an echo signal reflected by the second vehicle, so that the total time from transmitting the electromagnetic wave signal to receiving the echo signal is determined. Then, based on the total time duration and the propagation speed of the electromagnetic wave signal, the relative distance of the first vehicle from the second vehicle is determined.
As an example, the first vehicle may determine the relative distance of the first vehicle from the second vehicle according to equation (1) below.
Figure BDA0003556790700000071
Wherein, in the above formula (1), d BA Represents the relative distance between the first vehicle and the second vehicle, c represents the propagation speed of the electromagnetic wave signal, and T represents the total time length from the transmission of the electromagnetic wave signal to the reception of the echo signal.
In some embodiments, the second vehicle may also transmit electromagnetic wave signals to the first vehicle via a radar, such that each of the plurality of radars of the first vehicle may receive the electromagnetic wave signals transmitted by the second vehicle. Since the plurality of radars of the first vehicle are in different positions, distances between the plurality of radars and the second vehicle are different. Thus, when the plurality of radars of the first vehicle receive the electromagnetic wave signal transmitted by the second vehicle, different phases are generated. Furthermore, the electromagnetic wave signals emitted by the second vehicle may be attenuated during transmission. That is, as the transmission distance increases, the energy of the electromagnetic wave signal gradually decreases. Thus, when the plurality of radars of the first vehicle receive the electromagnetic wave signal transmitted by the second vehicle, different amplitudes are generated. The first vehicle may determine a relative angle of the first vehicle and the second vehicle based on the phase difference and the amplitude difference between the electromagnetic wave signals received by the plurality of radars.
The first vehicle determines a radar having the largest amplitude of the received electromagnetic wave signal among the plurality of radars as a target radar. Then, a phase difference between the electromagnetic wave signal received by the target radar and the electromagnetic wave signal received by the adjacent radar is acquired, and a relative angle of the first vehicle and the second vehicle is determined based on the phase difference.
As an example, the first vehicle may determine the relative angle of the first vehicle and the second vehicle according to the following equation (2).
Figure BDA0003556790700000081
Wherein, in the above formula (2), θ BA Representing the relative angle of the first vehicle to the second vehicle,
Figure BDA0003556790700000083
represents a phase difference between an electromagnetic wave signal received by a target radar and an electromagnetic wave signal received by an adjacent radar, λ represents a wavelength of the electromagnetic wave signal, and d represents a distance between the target radar and the adjacent radar.
The first vehicle transmits an electromagnetic wave signal to the second vehicle through a radar, receives an echo signal reflected by the second vehicle, and further performs Fast Fourier Transform (FFT) on the echo signal to obtain a frequency of the doppler shift. Then, based on the frequency of the doppler shift, the transmission frequency of the electromagnetic wave signal, and the propagation speed of the electromagnetic wave signal, the relative speed of the first vehicle and the second vehicle is determined.
As one example, the first vehicle may determine the relative speed of the first vehicle and the second vehicle according to equation (3) below.
Figure BDA0003556790700000082
Wherein, in the above formula (3), v BA Representing the relative speed of the first vehicle and the second vehicle, c representing the propagation speed of the electromagnetic wave signal, f d Frequency, f, representing the Doppler shift 0 Representing the transmission frequency of the electromagnetic wave signal.
It should be noted that, the determination of the relative position and the relative speed of the first vehicle and the second vehicle by the sensing module for the first vehicle is an example. In other embodiments, the first vehicle may also determine the relative position and relative speed of the first vehicle and the second vehicle through other modules. Of course, the first vehicle may also determine the relative position and the relative speed of the first vehicle and the second vehicle in other ways, which is not limited in this application.
Step 202: the first vehicle acquires perception information of the second vehicle, the perception information of the second vehicle comprises a relative position and a relative speed of the target moving object and the second vehicle, the target moving object is located in a perception range of the second vehicle, and the target moving object is located in a blind area of the first vehicle.
The method comprises the steps that a first vehicle obtains actual positioning information of the first vehicle and actual positioning information of a second vehicle, theoretical positioning information of the second vehicle is determined based on the actual positioning information of the first vehicle and relative positions and relative speeds of the first vehicle and the second vehicle, and perception information of the second vehicle is obtained under the condition that the actual positioning information of the second vehicle is the same as the theoretical positioning information of the second vehicle.
That is, the positioning module of the second vehicle determines the actual positioning information of the second vehicle and sends the actual positioning information to the communication module of the first vehicle through the communication module of the second vehicle. And the communication module of the first vehicle receives the actual positioning information of the second vehicle, so that the first vehicle can acquire the actual positioning information of the second vehicle. The positioning module of the first vehicle determines actual positioning information of the first vehicle, and the sensing module of the first vehicle determines relative position and relative speed of the first vehicle and the second vehicle. Then, the first vehicle determines theoretical positioning information of the second vehicle according to a related algorithm based on the actual positioning information of the first vehicle and the relative position and relative speed of the first vehicle and the second vehicle. And if the actual positioning information of the second vehicle is the same as the theoretical positioning information of the second vehicle, indicating that the actual positioning information of the second vehicle is accurate. At this time, the communication module of the first vehicle receives the perception information of the second vehicle, so that the first vehicle can acquire the perception information of the second vehicle. And if the actual positioning information of the second vehicle is different from the theoretical positioning information of the second vehicle, indicating that the actual positioning information of the second vehicle is inaccurate. At this time, the communication module of the first vehicle does not receive the perception information of the second vehicle.
The process of determining the relative position and the relative speed between the target moving object and the second vehicle by the second vehicle is similar to the process of determining the relative position and the relative speed between the first vehicle and the second vehicle by the first vehicle in step 201, and therefore, reference may be made to the relevant content of step 201, which is not described herein again. The target moving object comprises a moving vehicle, a pedestrian and an animal.
The Positioning module may be a GPS (Global Positioning System) module. Of course, the positioning module may also be another module, and the communication module may be TBOX (telematics box). Of course, the communication module may also be another module, which is not limited in this application.
It should be noted that, in the above description, the perception information of the second vehicle is obtained in the case where the actual positioning information of the second vehicle determined by the first vehicle is the same as the theoretical positioning information of the second vehicle. That is, the second vehicle determines the relative position and relative speed of the target moving object and the second vehicle to obtain the perception information of the second vehicle. The second vehicle then broadcasts the perception information of the second vehicle. In this way, the sensory information of the second vehicle broadcast by the second vehicle is received in the event that the first vehicle determines that the actual positioning information of the second vehicle is accurate. In an actual application process, the second vehicle may also send the perception information to the first vehicle when determining that the actual positioning information of the first vehicle is the same as the theoretical positioning information of the first vehicle. That is, the second vehicle determines the relative position and the relative speed of the target moving object and the second vehicle to obtain the perception information of the second vehicle. Then, the second vehicle receives the perception information access request sent by the first vehicle and sends the perception information of the second vehicle to the first vehicle.
In some embodiments, the first vehicle sends an awareness information access request to the second vehicle, the awareness information access request for accessing awareness information of the second vehicle. After the second vehicle receives the perception information access request sent by the first vehicle, the theoretical positioning information of the first vehicle can be determined based on the actual positioning information of the first vehicle carried by the perception information access request, and the relative position and the relative speed of the first vehicle and the second vehicle. And under the condition that the actual positioning information of the first vehicle is the same as the theoretical positioning information of the first vehicle, the second vehicle directly sends the perception information of the second vehicle to the first vehicle. And under the condition that the actual positioning information of the first vehicle is not the same as the theoretical positioning information of the first vehicle, the second vehicle refuses to send the perception information of the second vehicle to the first vehicle.
Step 203: the first vehicle converts the relative position and the relative speed of the target moving object and the second vehicle into the relative position and the relative speed of the target moving object and the first vehicle based on the relative position and the relative speed of the first vehicle and the second vehicle so as to compensate perception information of the first vehicle.
In some embodiments, the first vehicle may convert the relative position and relative velocity of the target moving object and the second vehicle into the relative position and relative velocity of the target moving object and the first vehicle based directly on the relative position and relative velocity of the first vehicle and the second vehicle to compensate for the perceptual information of the first vehicle.
In some embodiments, the perception information of the second vehicle further includes a data check code, the first vehicle encodes the relative position and relative speed of the target moving object and the second vehicle to obtain a data code, and in the case that the data code is the same as the data check code, the relative position and relative speed of the target moving object and the second vehicle are converted into the relative position and relative speed of the target moving object and the first vehicle based on the relative position and relative speed of the first vehicle and the second vehicle.
And after the second vehicle determines the relative position and the relative speed of the target moving object and the second vehicle, coding the relative position and the relative speed of the target moving object and the second vehicle according to a preset algorithm to obtain a data check code. After the first vehicle acquires the perception information of the second vehicle, the relative position and the relative speed of the target moving object and the second vehicle, which are included in the perception information of the second vehicle, are coded according to the same algorithm, so that a data code is obtained. Then, the first vehicle compares the data code with the data check code included in the perception information of the second vehicle, so as to judge whether the perception information of the second vehicle is tampered in the transmission process. If the data code is the same as the data check code, the perception information of the second vehicle is not tampered in the transmission process. At this time, the first vehicle converts the relative position and the relative velocity of the target moving object and the second vehicle into the relative position and the relative velocity of the target moving object and the first vehicle based on the relative position and the relative velocity of the first vehicle and the second vehicle. And if the data code is not the same as the data check code, the sensing information of the second vehicle is tampered in the transmission process. At this time, the first vehicle ends the process of compensating the perception information of the first vehicle this time.
Wherein the perception information of the second vehicle may further include a vehicle identification of the second vehicle. Alternatively, actual positioning information of the second vehicle may also be included. Of course, in practical applications, the perception information of the second vehicle may also include other contents, which is not limited in this embodiment of the application.
The vehicle identification is used to uniquely identify the vehicle, and may be the vehicle's number, manufacturer, make, model, etc., or may be a combination of such information.
Based on the above description, the relative position includes the relative distance and the relative angle. In some embodiments, the first vehicle may convert the relative position and the relative velocity of the target moving object and the second vehicle into the relative position and the relative velocity of the target moving object and the first vehicle based on the relative position and the relative velocity of the first vehicle and the second vehicle according to the following steps (1) - (3).
(1) The first vehicle determines the relative distance between the target moving object and the first vehicle based on the relative distance between the first vehicle and the second vehicle, the relative distance between the target moving object and the second vehicle, the relative angle between the first vehicle and the second vehicle, and the relative angle between the target moving object and the second vehicle.
As an example, the first vehicle may determine the relative distance of the target moving object from the first vehicle according to the following formula (4).
Figure BDA0003556790700000111
Wherein, in the above formula (4), d BC Representing the relative distance of the target moving object from the first vehicle, d BA Representing the relative distance of the first vehicle from the second vehicle, d AC Representing the relative distance, θ, of the target moving object from the second vehicle BA Representing the relative angle of the first vehicle to the second vehicle, theta AC Representing the relative angle of the target moving object and the second vehicle.
(2) The first vehicle determines the relative angle of the target moving object and the first vehicle based on the relative angle of the first vehicle and the second vehicle and the relative angle of the target moving object and the second vehicle.
For example, referring to fig. 4, assume a relative angle θ between the first vehicle and the second vehicle BA Is the included angle between the connecting line of the center point B of the first vehicle body and the center point A of the second vehicle body and the reference line, and the relative angle theta of the target moving object and the second vehicle AC Is the included angle between the reference line and the connecting line of the center point C of the target moving object and the center point A of the vehicle body of the second vehicle. At this time, the relative angle θ of the target moving object to the first vehicle BC =θ BAAC /2。
The first vehicle determines the relative distance between the target moving object and the first vehicle, and the relative angle between the target moving object and the first vehicle in the above manner are an example. In other embodiments, the first vehicle may also determine the relative distance between the target moving object and the first vehicle and the relative angle between the target moving object and the first vehicle in other manners. Illustratively, the first vehicle converts the relative position of the first vehicle and the second vehicle into a corresponding vector to derive a first vector. The first vehicle converts the relative position of the target moving object and the second vehicle into a corresponding vector to obtain a second vector. Then, the first vehicle may determine the relative position of the target moving object and the first vehicle according to the following formula (5).
Figure BDA0003556790700000121
Wherein, in the above formula (5),
Figure BDA0003556790700000122
a vector representing a relative position of the target moving object to the first vehicle,
Figure BDA0003556790700000123
which represents a first vector of the first vector,
Figure BDA0003556790700000124
which represents the second vector of the first vector,
Figure BDA0003556790700000125
die of
Figure BDA0003556790700000126
For indicating the relative distance of the target moving object from the first vehicle,
Figure BDA0003556790700000127
corresponding unit vector
Figure BDA0003556790700000128
For indicating the relative angle of the target moving object and the first vehicle.
(3) The first vehicle subtracts the relative speed of the target moving object and the second vehicle from the relative speed of the first vehicle and the second vehicle to obtain the relative speed of the target moving object and the first vehicle.
In the embodiment of the present application, in the case where the target moving object is located within the blind area of the first vehicle, the first vehicle may acquire the perception information of the second vehicle. Since the perception information of the second vehicle includes the relative position and the relative speed of the target moving object and the second vehicle, the second vehicle is located within the perception range of the first vehicle. Therefore, the first vehicle can convert the relative position and the relative speed of the target moving object and the second vehicle into the relative position and the relative speed of the target moving object and the first vehicle based on the relative position and the relative speed of the first vehicle and the second vehicle, so as to compensate the perception information of the first vehicle. In addition, after the first vehicle acquires the perception information of the second vehicle, whether the perception information of the second vehicle is tampered in the transmission process can be judged based on the data codes and the data check codes, and the accuracy of the relative position and the relative speed of the converted target moving object and the first vehicle is further improved.
Fig. 5 is a schematic structural diagram of a perceptual information compensation apparatus provided in an embodiment of the present application, where the perceptual information compensation apparatus may be implemented as part or all of a perceptual information compensation device by software, hardware, or a combination of the two. Referring to fig. 5, the apparatus includes: a determination module 501, an acquisition module 502 and a conversion module 503.
The determining module 501 is configured to determine a relative position and a relative speed of a first vehicle and a second vehicle, where the second vehicle is located within a sensing range of the first vehicle. For the detailed implementation process, reference is made to corresponding contents in the above embodiments, and details are not repeated here.
The obtaining module 502 is configured to obtain perception information of a second vehicle, where the perception information of the second vehicle includes a relative position and a relative speed of a target moving object and the second vehicle, the target moving object is located within a perception range of the second vehicle, and the target moving object is located in a blind area of the first vehicle. For the detailed implementation process, reference is made to corresponding contents in the above embodiments, and details are not repeated here.
A conversion module 503, configured to convert the relative position and the relative speed of the target moving object and the second vehicle into the relative position and the relative speed of the target moving object and the first vehicle based on the relative position and the relative speed of the first vehicle and the second vehicle, so as to compensate for the perception information of the first vehicle. For the detailed implementation process, reference is made to corresponding contents in the foregoing embodiments, and details are not repeated here.
Optionally, the obtaining module 502 is specifically configured to:
acquiring actual positioning information of a first vehicle and actual positioning information of a second vehicle;
determining theoretical positioning information of the second vehicle based on the actual positioning information of the first vehicle and the relative position and the relative speed of the first vehicle and the second vehicle;
and acquiring the perception information of the second vehicle under the condition that the actual positioning information of the second vehicle is the same as the theoretical positioning information of the second vehicle.
Optionally, the perception information of the second vehicle further includes a data check code;
the conversion module 503 is specifically configured to:
coding the relative position and the relative speed of the target moving object and the second vehicle to obtain a data code;
and under the condition that the data code is the same as the data check code, converting the relative position and the relative speed of the target moving object and the second vehicle into the relative position and the relative speed of the target moving object and the first vehicle on the basis of the relative position and the relative speed of the first vehicle and the second vehicle.
Optionally, the relative position comprises a relative distance and a relative angle;
the conversion module 503 includes:
a first determination unit configured to determine a relative distance of the target moving object from the first vehicle based on a relative distance of the first vehicle from the second vehicle, a relative distance of the target moving object from the second vehicle, a relative angle of the first vehicle from the second vehicle, and a relative angle of the target moving object from the second vehicle;
a second determination unit configured to determine a relative angle of the target moving object and the first vehicle based on the relative angle of the first vehicle and the second vehicle and the relative angle of the target moving object and the second vehicle;
and the obtaining unit is used for subtracting the relative speed of the target moving object and the second vehicle from the relative speed of the first vehicle and the second vehicle to obtain the relative speed of the target moving object and the first vehicle.
Optionally, the first determining unit is specifically configured to:
determining the relative distance between the target moving object and the first vehicle according to the following formula based on the relative distance between the first vehicle and the second vehicle, the relative distance between the target moving object and the second vehicle, the relative angle between the first vehicle and the second vehicle, and the relative angle between the target moving object and the second vehicle;
Figure BDA0003556790700000141
wherein, in the above formula, d BC Representing the relative distance of the target moving object from the first vehicle, d BA Representing the relative distance of the first vehicle from the second vehicle, d AC Representing the relative distance, θ, of the target moving object from the second vehicle BA Representing the relative angle of the first vehicle to the second vehicle, θ AC Representing the relative angle of the target moving object and the second vehicle.
Optionally, the target moving object includes a moving vehicle, a pedestrian, an animal.
In the embodiment of the present application, in the case where the target moving object is located within the blind area of the first vehicle, the first vehicle may acquire the perception information of the second vehicle. Since the perception information of the second vehicle includes the relative position and the relative speed of the target moving object and the second vehicle, the second vehicle is located within the perception range of the first vehicle. Therefore, the first vehicle can convert the relative position and the relative speed of the target moving object and the second vehicle into the relative position and the relative speed of the target moving object and the first vehicle based on the relative position and the relative speed of the first vehicle and the second vehicle, so as to compensate the perception information of the first vehicle. In addition, after the first vehicle acquires the perception information of the second vehicle, whether the perception information of the second vehicle is tampered in the transmission process can be judged based on the data code and the data check code, and the accuracy of the relative position and the relative speed of the converted target moving object and the first vehicle is further improved.
It should be noted that: in the embodiment, when the sensing information compensation device performs sensing information compensation, only the division of the functional modules is illustrated, and in practical applications, the function distribution may be completed by different functional modules according to needs, that is, the internal structure of the device is divided into different functional modules to complete all or part of the functions described above. In addition, the perceptual information compensation apparatus and the perceptual information compensation method provided in the above embodiments belong to the same concept, and specific implementation processes thereof are described in detail in the method embodiments, and are not described herein again.
Fig. 6 is a block diagram of a perceptual information compensating apparatus 600 according to an embodiment of the present application. The perception information compensating apparatus 600 may be a portable mobile terminal such as: a smart phone, a tablet computer, an MP3 player (Moving Picture Experts Group Audio Layer III, motion video Experts compression standard Audio Layer 3), an MP4 player (Moving Picture Experts Group Audio Layer IV, motion video Experts compression standard Audio Layer 4), a notebook computer or a desktop computer. Perceptual information compensating device 600 may also be referred to by other names such as user equipment, portable terminal, laptop terminal, desktop terminal, and the like.
In general, the perceptual information compensating apparatus 600 includes: a processor 601 and a memory 602.
Processor 601 may include one or more processing cores, such as 4-core processors, 8-core processors, and so forth. The processor 601 may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). The processor 601 may also include a main processor and a coprocessor, where the main processor is a processor for Processing data in an awake state, and is also called a Central Processing Unit (CPU); a coprocessor is a low power processor for processing data in a standby state. In some embodiments, the processor 601 may be integrated with a GPU (Graphics Processing Unit) that is responsible for rendering and drawing content that the display screen needs to display. In some embodiments, processor 601 may further include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.
Memory 602 may include one or more computer-readable storage media, which may be non-transitory. Memory 602 may also include high speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in the memory 602 is used to store at least one instruction for execution by the processor 601 to implement the perceptual information compensation methods provided by the method embodiments herein.
In some embodiments, the perceptual information compensation device 600 may further include: a peripheral interface 603 and at least one peripheral. The processor 601, memory 602 and peripherals interface 603 may be connected by buses or signal lines. Various peripheral devices may be connected to the peripheral interface 603 via a bus, signal line, or circuit board. Specifically, the peripheral device includes: at least one of a radio frequency circuit 604, a touch screen display 605, a camera 606, an audio circuit 607, a positioning component 608, and a power supply 609.
The peripheral interface 603 may be used to connect at least one peripheral related to I/O (Input/Output) to the processor 601 and the memory 602. In some embodiments, the processor 601, memory 602, and peripheral interface 603 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 601, the memory 602, and the peripheral interface 603 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
The Radio Frequency circuit 604 is used for receiving and transmitting RF (Radio Frequency) signals, also called electromagnetic signals. The radio frequency circuitry 604 communicates with communication networks and other communication devices via electromagnetic signals. The rf circuit 604 converts an electrical signal into an electromagnetic signal to transmit, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 604 comprises: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuit 604 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocols include, but are not limited to: the world wide web, metropolitan area networks, intranets, generations of mobile communication networks (2G, 3G, 4G, and 5G), Wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 604 may further include a circuit related to NFC (Near Field Communication), which is not limited in this application.
The display 605 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 605 is a touch display screen, the display screen 605 also has the ability to capture touch signals on or above the surface of the display screen 605. The touch signal may be input to the processor 601 as a control signal for processing. At this point, the display 605 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, the display 605 may be one, providing the front panel of the perceptual information compensation device 600; in other embodiments, the display 605 may be at least two, respectively disposed on different surfaces of the perceptual information compensating apparatus 600 or in a folded design; in still other embodiments, the display 605 may be a flexible display disposed on a curved surface or on a folded surface of the perceptual information compensating device 600. Even more, the display 605 may be arranged in a non-rectangular irregular pattern, i.e., a shaped screen. The Display 605 may be made of LCD (Liquid Crystal Display), OLED (Organic Light-Emitting Diode), and the like.
The camera assembly 606 is used to capture images or video. Optionally, camera assembly 606 includes a front camera and a rear camera. Generally, a front camera is disposed at a front panel of the perception information compensating apparatus, and a rear camera is disposed at a rear surface of the perception information compensating apparatus. In some embodiments, the number of the rear cameras is at least two, and each rear camera is any one of a main camera, a depth-of-field camera, a wide-angle camera and a telephoto camera, so that the main camera and the depth-of-field camera are fused to realize a background blurring function, and the main camera and the wide-angle camera are fused to realize panoramic shooting and VR (Virtual Reality) shooting functions or other fusion shooting functions. In some embodiments, camera assembly 606 may also include a flash. The flash lamp can be a monochrome temperature flash lamp or a bicolor temperature flash lamp. The double-color-temperature flash lamp is a combination of a warm-light flash lamp and a cold-light flash lamp, and can be used for light compensation at different color temperatures.
The audio circuitry 607 may include a microphone and a speaker. The microphone is used for collecting sound waves of a user and the environment, converting the sound waves into electric signals, and inputting the electric signals to the processor 601 for processing or inputting the electric signals to the radio frequency circuit 604 to realize voice communication. For the purpose of stereo sound collection or noise reduction, a plurality of microphones may be provided at different portions of the perceptual information compensating apparatus 600. The microphone may also be an array microphone or an omni-directional pick-up microphone. The speaker is used to convert the electrical signals from the processor 601 or the radio frequency circuit 604 into sound waves. The loudspeaker can be a traditional film loudspeaker or a piezoelectric ceramic loudspeaker. When the speaker is a piezoelectric ceramic speaker, the speaker can be used for purposes such as converting an electric signal into a sound wave audible to a human being, or converting an electric signal into a sound wave inaudible to a human being to measure a distance. In some embodiments, audio circuitry 607 may also include a headphone jack.
The positioning component 608 is used to position the current geographic Location of the information aware compensation device 600 to implement navigation or LBS (Location Based Service). The Positioning component 608 can be a Positioning component based on the Global Positioning System (GPS) in the united states, the beidou System in china, or the galileo System in russia.
The power supply 609 is used to supply power to various components in the perceptual information compensating apparatus 600. The power supply 609 may be ac, dc, disposable or rechargeable. When the power supply 609 includes a rechargeable battery, the rechargeable battery may be a wired rechargeable battery or a wireless rechargeable battery. The wired rechargeable battery is a battery charged through a wired line, and the wireless rechargeable battery is a battery charged through a wireless coil. The rechargeable battery may also be used to support fast charge technology.
In some embodiments, the perceptual information compensating device 600 further comprises one or more sensors 610. The one or more sensors 610 include, but are not limited to: acceleration sensor 611, gyro sensor 612, pressure sensor 613, fingerprint sensor 614, optical sensor 615, and proximity sensor 616.
The acceleration sensor 611 may detect the magnitude of acceleration in three coordinate axes of the coordinate system established by the perceptual information compensating apparatus 600. For example, the acceleration sensor 611 may be used to detect components of the gravitational acceleration in three coordinate axes. The processor 601 may control the touch screen 605 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 611. The acceleration sensor 611 may also be used for acquisition of motion data of a game or a user.
The gyro sensor 612 may detect a body direction and a rotation angle of the sensing information compensating apparatus 600, and the gyro sensor 612 may acquire a 3D motion of the user on the sensing information compensating apparatus 600 in cooperation with the acceleration sensor 611. The processor 601 may implement the following functions according to the data collected by the gyro sensor 612: motion sensing (such as changing the UI according to a user's tilting operation), image stabilization while shooting, game control, and inertial navigation.
The pressure sensor 613 may be disposed on a side frame of the sensing information compensating apparatus 600 and/or an underlying layer of the touch display screen 605. When the pressure sensor 613 is disposed at a side frame of the sensing information compensating apparatus 600, a user's holding signal to the sensing information compensating apparatus 600 can be detected, and the processor 601 performs left-right hand recognition or shortcut operation according to the holding signal collected by the pressure sensor 613. When the pressure sensor 613 is arranged at the lower layer of the touch display screen 605, the processor 601 controls the operability control on the UI interface according to the pressure operation of the user on the touch display screen 605. The operability control comprises at least one of a button control, a scroll bar control, an icon control and a menu control.
The fingerprint sensor 614 is used for collecting a fingerprint of a user, and the processor 601 identifies the identity of the user according to the fingerprint collected by the fingerprint sensor 614, or the fingerprint sensor 614 identifies the identity of the user according to the collected fingerprint. Upon identifying that the user's identity is a trusted identity, the processor 601 authorizes the user to perform relevant sensitive operations including unlocking the screen, viewing encrypted information, downloading software, paying, and changing settings, etc. The fingerprint sensor 614 may be disposed on the front, back, or side of the sensory information compensating device 600. When a physical button or a vendor Logo is provided on the sensory information compensating apparatus 600, the fingerprint sensor 614 may be integrated with the physical button or the vendor Logo.
The optical sensor 615 is used to collect the ambient light intensity. In one embodiment, processor 601 may control the display brightness of touch display 605 based on the ambient light intensity collected by optical sensor 615. Specifically, when the ambient light intensity is high, the display brightness of the touch display screen 605 is increased; when the ambient light intensity is low, the display brightness of the touch display screen 605 is turned down. In another embodiment, the processor 601 may also dynamically adjust the shooting parameters of the camera assembly 606 according to the ambient light intensity collected by the optical sensor 615.
The proximity sensor 616, also called a distance sensor, is generally disposed at the front panel of the perception information compensating apparatus 600. The proximity sensor 616 is used to collect the distance between the user and the front of the perception information compensating device 600. In one embodiment, the touch display screen 605 is controlled by the processor 601 to switch from the bright screen state to the dark screen state when the proximity sensor 616 detects that the distance between the user and the front surface of the perceptual information compensating apparatus 600 is gradually decreased; when the proximity sensor 616 detects that the distance between the user and the front surface of the perception information compensating apparatus 600 is gradually increased, the touch display 605 is controlled by the processor 601 to switch from the breath screen state to the bright screen state.
Those skilled in the art will appreciate that the configuration shown in fig. 6 does not constitute a limitation of the perceptual information compensating apparatus 600, and may include more or less components than those shown, or combine some components, or adopt a different arrangement of components.
In some embodiments, a computer-readable storage medium is also provided, in which a computer program is stored, and the computer program, when executed by a processor, implements the steps of the perceptual information compensation method in the above embodiments. For example, the computer readable storage medium may be a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
It is noted that the computer-readable storage medium mentioned in the embodiments of the present application may be a non-volatile storage medium, in other words, a non-transitory storage medium.
It should be understood that all or part of the steps to implement the above embodiments may be implemented by software, hardware, firmware, or any combination thereof. When implemented in software, may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. The computer instructions may be stored in the computer-readable storage medium described above.
That is, in some embodiments, there is also provided a computer program product containing instructions which, when run on a computer, cause the computer to perform the steps of the perceptual information compensation method described above.
It is to be understood that reference herein to "at least one" means one or more and "a plurality" means two or more. In the description of the embodiments of the present application, "/" means "or" unless otherwise specified, for example, a/B may mean a or B; "and/or" herein is merely an association relationship describing an associated object, and means that there may be three relationships, for example, a and/or B, and may mean: a exists alone, A and B exist simultaneously, and B exists alone. In addition, in order to facilitate clear description of technical solutions of the embodiments of the present application, in the embodiments of the present application, terms such as "first" and "second" are used to distinguish the same items or similar items having substantially the same functions and actions. Those skilled in the art will appreciate that the terms "first," "second," etc. do not denote any order or quantity, nor do the terms "first," "second," etc. denote any order or importance.
It should be noted that the information (including but not limited to user equipment information, user personal information, etc.), data (including but not limited to data for analysis, stored data, displayed data, etc.) and signals referred to in the embodiments of the present application are authorized by the user or fully authorized by various parties, and the collection, use and processing of the relevant data need to comply with relevant laws and regulations and standards in relevant countries and regions. For example, the sensing information and the actual positioning information referred to in the embodiments of the present application are obtained under sufficient authorization.
The above-mentioned embodiments are provided not to limit the present application, and any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (10)

1. A perceptual information compensation method, applied to a first vehicle, the method comprising:
determining a relative position and a relative speed of the first vehicle and a second vehicle, the second vehicle being within a perception range of the first vehicle;
acquiring perception information of the second vehicle, wherein the perception information of the second vehicle comprises a relative position and a relative speed of a target moving object and the second vehicle, the target moving object is located in a perception range of the second vehicle, and the target moving object is located in a blind area of the first vehicle;
and converting the relative position and the relative speed of the target moving object and the second vehicle into the relative position and the relative speed of the target moving object and the first vehicle based on the relative position and the relative speed of the first vehicle and the second vehicle so as to compensate the perception information of the first vehicle.
2. The method of claim 1, wherein the obtaining perception information of the second vehicle comprises:
acquiring actual positioning information of the first vehicle and actual positioning information of the second vehicle;
determining theoretical positioning information of the second vehicle based on actual positioning information of the first vehicle and relative positions and relative speeds of the first vehicle and the second vehicle;
and acquiring the perception information of the second vehicle under the condition that the actual positioning information of the second vehicle is the same as the theoretical positioning information of the second vehicle.
3. The method of claim 1, wherein the sensory information of the second vehicle further comprises a data check code;
the converting the relative position and the relative speed of the target moving object and the second vehicle into the relative position and the relative speed of the target moving object and the first vehicle based on the relative position and the relative speed of the first vehicle and the second vehicle includes:
coding the relative position and the relative speed of the target moving object and the second vehicle to obtain a data code;
and under the condition that the data code is the same as the data check code, converting the relative position and the relative speed of the target moving object and the second vehicle into the relative position and the relative speed of the target moving object and the first vehicle on the basis of the relative position and the relative speed of the first vehicle and the second vehicle.
4. The method of claim 1 or 3, wherein the relative positions comprise relative distances and relative angles;
the converting the relative position and the relative speed of the target moving object and the second vehicle into the relative position and the relative speed of the target moving object and the first vehicle based on the relative position and the relative speed of the first vehicle and the second vehicle includes:
determining a relative distance of the target moving object from the first vehicle based on a relative distance of the first vehicle from the second vehicle, a relative distance of the target moving object from the second vehicle, a relative angle of the first vehicle from the second vehicle, and a relative angle of the target moving object from the second vehicle;
determining a relative angle of the target moving object to the first vehicle based on a relative angle of the first vehicle to the second vehicle and a relative angle of the target moving object to the second vehicle;
subtracting the relative speed of the target moving object and the second vehicle from the relative speed of the first vehicle and the second vehicle to obtain the relative speed of the target moving object and the first vehicle.
5. The method of claim 4, wherein the determining the relative distance of the target moving object from the first vehicle based on the relative distance of the first vehicle from the second vehicle, the relative distance of the target moving object from the second vehicle, the relative angle of the first vehicle from the second vehicle, and the relative angle of the target moving object from the second vehicle comprises:
determining a relative distance of the target moving object from the first vehicle based on a relative distance of the first vehicle from the second vehicle, a relative distance of the target moving object from the second vehicle, a relative angle of the first vehicle from the second vehicle, and a relative angle of the target moving object from the second vehicle as follows;
Figure FDA0003556790690000021
wherein, in the above formula, d BC Representing the relative distance of the target moving object from the first vehicle, d BA Representing the relative distance of the first vehicle from the second vehicle, d AC Representing the relative distance, θ, of the target moving object from the second vehicle BA Representing the relative angle, θ, of the first vehicle and the second vehicle AC Representing a relative angle of the target moving object and the second vehicle.
6. The method of claim 1, wherein the target moving object comprises a moving vehicle, a pedestrian, an animal.
7. A perceptual information compensation apparatus, applied to a first vehicle, the apparatus comprising:
a determination module to determine a relative position and a relative speed of the first vehicle and a second vehicle, the second vehicle being within a perception range of the first vehicle;
the acquisition module is used for acquiring perception information of the second vehicle, wherein the perception information of the second vehicle comprises a relative position and a relative speed of a target moving object and the second vehicle, the target moving object is located in a perception range of the second vehicle, and the target moving object is located in a blind area of the first vehicle;
the conversion module is used for converting the relative position and the relative speed of the target moving object and the second vehicle into the relative position and the relative speed of the target moving object and the first vehicle based on the relative position and the relative speed of the first vehicle and the second vehicle so as to compensate perception information of the first vehicle.
8. A vehicle comprising a memory for storing a computer program and a processor for executing the computer program stored in the memory to perform the steps of the method of any one of claims 1 to 6.
9. A computer-readable storage medium, characterized in that the storage medium has stored therein a computer program which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 6.
10. A computer program, characterized in that it comprises instructions which, when run on the computer, cause the computer to carry out the steps of the method according to any one of claims 1 to 6.
CN202210280799.9A 2022-03-21 2022-03-21 Perception information compensation method, device, vehicle, storage medium, and program Pending CN114789734A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210280799.9A CN114789734A (en) 2022-03-21 2022-03-21 Perception information compensation method, device, vehicle, storage medium, and program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210280799.9A CN114789734A (en) 2022-03-21 2022-03-21 Perception information compensation method, device, vehicle, storage medium, and program

Publications (1)

Publication Number Publication Date
CN114789734A true CN114789734A (en) 2022-07-26

Family

ID=82460633

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210280799.9A Pending CN114789734A (en) 2022-03-21 2022-03-21 Perception information compensation method, device, vehicle, storage medium, and program

Country Status (1)

Country Link
CN (1) CN114789734A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115442849A (en) * 2022-11-09 2022-12-06 成都市以太节点科技有限公司 Differentiated communication method and device for railway vehicle-mounted millimeter wave terminal and storage medium

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115442849A (en) * 2022-11-09 2022-12-06 成都市以太节点科技有限公司 Differentiated communication method and device for railway vehicle-mounted millimeter wave terminal and storage medium
CN115442849B (en) * 2022-11-09 2023-03-24 成都市以太节点科技有限公司 Differentiated communication method and device for railway vehicle-mounted millimeter wave terminal and storage medium

Similar Documents

Publication Publication Date Title
CN111854780B (en) Vehicle navigation method, device, vehicle, electronic equipment and storage medium
CN111010537B (en) Vehicle control method, device, terminal and storage medium
CN112749590B (en) Object detection method, device, computer equipment and computer readable storage medium
CN111127541B (en) Method and device for determining vehicle size and storage medium
CN113099378B (en) Positioning method, device, equipment and storage medium
CN110775056B (en) Vehicle driving method, device, terminal and medium based on radar detection
CN114789734A (en) Perception information compensation method, device, vehicle, storage medium, and program
CN111179628B (en) Positioning method and device for automatic driving vehicle, electronic equipment and storage medium
CN112184802B (en) Calibration frame adjusting method, device and storage medium
CN112365088B (en) Method, device and equipment for determining travel key points and readable storage medium
CN114550717A (en) Voice sound zone switching method, device, equipment and storage medium
CN111717205B (en) Vehicle control method, device, electronic equipment and computer readable storage medium
CN114598992A (en) Information interaction method, device, equipment and computer readable storage medium
CN114779920A (en) Whole vehicle window gesture control system based on biological recognition and control method thereof
CN113255906A (en) Method, device, terminal and storage medium for returning obstacle 3D angle information in automatic driving
CN114566064B (en) Method, device, equipment and storage medium for determining position of parking space
CN114419913B (en) In-vehicle reminding method and device, vehicle and storage medium
CN116311976A (en) Signal lamp control method, device, equipment and computer readable storage medium
CN117173520A (en) Method and device for determining three-dimensional fusion data
CN117372320A (en) Quality detection method, device and equipment for positioning map and readable storage medium
CN116665671A (en) Effective communication method, device, terminal and storage medium for following vehicle
CN116452653A (en) Method, device, equipment and computer readable storage medium for determining traffic information
CN116681478A (en) Method, device and terminal for judging effectiveness of trial-driving path of vehicle
CN116331196A (en) Automatic driving automobile data security interaction system, method, terminal and medium
CN113450799A (en) Vehicle-mounted schedule management method, system, terminal and storage medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination