WO2022147667A1 - Communication method and device in an advanced driver assistance system - Google Patents

Communication method and device in an advanced driver assistance system

Info

Publication number
WO2022147667A1
WO2022147667A1 (PCT/CN2021/070371)
Authority
WO
WIPO (PCT)
Prior art keywords
data
pmd
frame
modulated
bits
Prior art date
Application number
PCT/CN2021/070371
Other languages
English (en)
Chinese (zh)
Inventor
欧阳涛
潘众
蒋晓明
Original Assignee
Huawei Technologies Co., Ltd. (华为技术有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd. (华为技术有限公司)
Priority to PCT/CN2021/070371 (published as WO2022147667A1)
Priority to CN202180089133.9 (published as CN116830536A)

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L27/00Modulated-carrier systems
    • H04L27/26Systems using multi-frequency codes

Definitions

  • the present application relates to advanced driver assistance technology, and in particular, to a communication method and device in an advanced driver assistance system.
  • Advanced driver assistance systems (ADAS) rely on a variety of sensors installed in the car during driving, such as millimeter-wave radar, lidar, monocular/binocular cameras and satellite navigation.
  • ADAS can identify, detect and track static and dynamic objects based on sensor data, and combine that data with navigation map data for calculation and analysis, so that drivers can become aware of possible dangers in advance. The acquisition of environmental information has therefore become a key technology for the development of ADAS. As sensor types become more diverse and their accuracy increases, the transmission requirements between the sensors and the mobile data center (MDC) also rise.
  • the communication mode between the sensor and the MDC falls mainly into two categories: single-carrier modulation and multi-carrier modulation.
  • single-carrier modulation suits lower-rate scenarios and consumes less power, while multi-carrier modulation suits high-rate scenarios and consumes more power. How to combine the advantages of both is the problem to be solved.
  • the present application provides a communication method and device in an advanced driver assistance system, which combines data from different sources and shields the difference between modulation methods, so that the communication between the serializer and the deserializer is compatible with both modulation modes, single-carrier modulation and multi-carrier modulation.
  • the present application provides a communication method in an advanced driving assistance system, including: acquiring a plurality of data to be modulated, the plurality of data to be modulated coming from a plurality of data sources; combining the plurality of data to be modulated into one physical medium dependent (PMD) frame; performing single-carrier modulation or multi-carrier modulation on the PMD frame to obtain a modulated signal; and sending the modulated signal.
  • the initial sources of the plurality of data to be modulated are a plurality of data sources, which may be various sensors.
  • the plurality of data to be modulated acquired by the PHY layer have been processed by the MAC layer, and have been encoded at the PHY layer, and the processing procedure of the MAC layer and the encoding processing procedure of the PHY layer are not limited in this application.
  • the present application can integrate packaged and encoded data to be modulated from multiple data sources into one physical media dependent (PMD) frame.
  • the PHY layer may fill in the plurality of data to be modulated in sequence according to the set order to obtain a PMD frame, the plurality of data to be modulated each occupying N bits, where N is a preset positive integer and N>1.
  • the length of the PMD frame may be N ⁇ n, where n represents the total number of data sources.
  • the data of each data source occupies a fixed number of bits in the PMD frame. If the length of the data is less than N, the remaining bits are filled with a default value, such as 0; if the length of the data is greater than N, the data exceeding N bits is encapsulated into the next PMD frame.
  • the setting order means that the bits occupied by n data sources in the PMD frame are preset.
  • the data sources include two sources, a camera and a radar, with the camera ordered before the radar. Therefore, in the PMD frame, the first N bits are filled with the camera data and the last N bits are filled with the radar data, as shown in Figure 5a.
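As an illustration of the fixed-slot format of Figure 5a, the following sketch packs per-source bitstrings into N-bit slots, padding short data with the default value 0 and spilling data longer than N bits into the next PMD frame. The helper name and string-based bit representation are assumptions for illustration, not taken from the patent.

```python
def build_pmd_frames(payloads, n):
    """Pack per-source bitstrings into fixed N-bit slots (Fig. 5a style).

    payloads -- one bitstring per data source, in the agreed order.
    Data longer than N bits spills into the next PMD frame; unused
    bits in a slot are padded with the default value '0'.
    """
    # Split each source's data into N-bit chunks (at least one, possibly empty).
    chunked = [[p[i:i + n] for i in range(0, len(p), n)] or [""] for p in payloads]
    frames = []
    for r in range(max(len(c) for c in chunked)):
        frame = ""
        for c in chunked:
            part = c[r] if r < len(c) else ""
            frame += part.ljust(n, "0")  # pad the slot's unused bits with 0
        frames.append(frame)
    return frames
```

With N=4, a 4-bit camera payload and a 2-bit radar payload fit in one 8-bit frame; a 6-bit camera payload overflows into a second frame, matching the rule stated above.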
  • the PHY layer may first fill in length indication information, where the length indication information is used to indicate the respective lengths of the multiple data to be modulated, and then fill in the multiple data to be modulated in sequence according to the set order to obtain the PMD frame.
  • a piece of information indicating the respective lengths of the multiple data to be modulated is filled in at the header of the PMD frame, so that the bits after it can be filled with the multiple data to be modulated in the set order. There is no need to reserve empty bits as in the above-mentioned embodiment, which reduces the bit consumption of the PMD frame.
  • the data sources include a camera and a radar, with the camera ordered before the radar. The camera's data length is N1 bits and the radar's data length is N2 bits. Therefore, in the PMD frame, the first field is the length indication information, filled with N1 and N2; the second field is N1 bits long and filled with the camera data; the third field is N2 bits long and filled with the radar data, as shown in Figure 5b.
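The length-indication format of Figure 5b can be sketched as follows; the 16-bit width of each length sub-field and the helper names are assumptions for illustration, since the patent does not fix them.

```python
LEN_FIELD = 16  # assumed width of each per-source length sub-field, in bits

def frame_with_lengths(payloads):
    """Prepend a header holding each payload's length, then the payloads."""
    header = "".join(format(len(p), "0{}b".format(LEN_FIELD)) for p in payloads)
    return header + "".join(payloads)

def parse_with_lengths(frame, n_sources):
    """Recover the payloads using the length sub-fields in the header."""
    lengths = [int(frame[i * LEN_FIELD:(i + 1) * LEN_FIELD], 2)
               for i in range(n_sources)]
    out, pos = [], n_sources * LEN_FIELD
    for ln in lengths:
        out.append(frame[pos:pos + ln])
        pos += ln
    return out
```

Because the lengths travel in the header, no slot padding is needed, which is exactly the bit saving the embodiment describes.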
  • the PHY layer can fill in the current data corresponding to the current position in the sequence, and fill in an end identifier after the current data ends, to obtain the PMD frame, where the current data is one of the multiple data to be modulated and the multiple data to be modulated are ordered in a preset sequence.
  • an end identifier is included in the PMD frame, and the end identifier is used to indicate the end of a piece of data.
  • the data sources include a camera and a radar, with the camera ordered before the radar. The camera's data length is N1 bits and the radar's data length is N2 bits. Therefore, in the PMD frame, the first field is N1 bits long and filled with the camera data; the second field contains m bits, filled with an end identifier such as 001 or 1010; the third field is N2 bits long and filled with the radar data; the fourth field contains m bits, filled with an end identifier such as 001 or 1010, as shown in Figure 5c.
  • the end identifiers filled in after each data to be modulated may be all the same, all different, or partially the same, as long as the deserializer can recognize them.
  • this embodiment does not specifically limit the length of the end identifier.
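A minimal sketch of the end-identifier framing of Figure 5c follows. The identifier value is hypothetical, and the sketch assumes the encoder guarantees (e.g. via line coding or bit stuffing, which the patent does not detail) that the identifier pattern cannot occur inside the data itself.

```python
END = "0110"  # hypothetical end identifier agreed between serializer and deserializer

def frame_with_end_markers(payloads):
    """Append the end identifier after each payload (Fig. 5c layout)."""
    return "".join(p + END for p in payloads)

def parse_with_end_markers(frame):
    """Scan bit by bit; each end identifier closes the current payload."""
    out, cur, i = [], "", 0
    while i < len(frame):
        if frame[i:i + len(END)] == END:
            out.append(cur)
            cur, i = "", i + len(END)
        else:
            cur += frame[i]
            i += 1
    return out
```

The start-identifier variant of Figure 5d works the same way, with the marker placed before each payload instead of after it.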
  • the PHY layer can fill in a start identifier before filling in the current data corresponding to the current position in the sequence, and fill in the current data after the start identifier, to obtain the PMD frame; the current data is one of the plurality of data to be modulated, and the plurality of data to be modulated are ordered in a preset sequence.
  • a start identifier is included in the PMD frame, and the start identifier is used to indicate the start of a piece of data.
  • the data sources include a camera and a radar, with the camera ordered before the radar. The camera's data length is N1 bits and the radar's data length is N2 bits. Therefore, in the PMD frame, the first field is m bits long and filled with a start identifier, such as 001 or 1010; the second field is N1 bits long and filled with the camera data; the third field is m bits long and filled with a start identifier, such as 001 or 1010; the fourth field is N2 bits long and filled with the radar data, as shown in Figure 5d.
  • the start identifiers filled in before each data to be modulated may be all the same, all different, or partially the same, as long as the deserializer can recognize them.
  • this embodiment does not specifically limit the length of the start identifier.
  • the present application may also set other formats for the PMD frame, which is not specifically limited.
  • the PHY layer may map each bit of the PMD frame to corresponding amplitudes according to the bit order of the PMD frame to obtain a modulated signal. For this process, reference may be made to the related technology of single-carrier modulation, which will not be repeated here.
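The single-carrier bit-to-amplitude mapping can be sketched with PAM4, where two bits map to one of four amplitudes. The Gray-coded level table below is a common convention assumed for illustration; the patent leaves the concrete mapping to existing single-carrier techniques.

```python
# Gray-coded PAM4 level table -- a common convention, assumed for illustration.
PAM4 = {"00": -3, "01": -1, "11": +1, "10": +3}

def pam4_modulate(bits):
    """Map each pair of bits in the PMD frame to one of four amplitudes."""
    assert len(bits) % 2 == 0, "PAM4 consumes bits two at a time"
    return [PAM4[bits[i:i + 2]] for i in range(0, len(bits), 2)]
```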
  • the PHY layer can multiplex the multiple bits contained in the PMD frame onto each sub-carrier in the multi-carrier, and perform mapping and inverse fast Fourier transform on the bits carried by each sub-carrier to obtain the modulated signal. For this process, reference may be made to the related art of multi-carrier modulation, which will not be repeated here.
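The multi-carrier path can be sketched as follows: one bit is mapped to each subcarrier (BPSK here, an illustrative choice) and an inverse discrete Fourier transform produces the time-domain samples. A real implementation would use an optimized IFFT; the naive transform below only shows the principle.

```python
import cmath

def idft(freq_bins):
    """Naive inverse DFT: per-subcarrier symbols -> time-domain samples."""
    n = len(freq_bins)
    return [sum(x * cmath.exp(2j * cmath.pi * k * t / n)
                for k, x in enumerate(freq_bins)) / n
            for t in range(n)]

def multicarrier_modulate(bits):
    """Map one bit per subcarrier (BPSK: 0 -> -1, 1 -> +1), then transform."""
    symbols = [1.0 if b == "1" else -1.0 for b in bits]
    return idft(symbols)
```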
  • the PMD frame is used as a modulation object, and modulation processing can be performed based on the PMD frame whether it is single-carrier modulation or multi-carrier modulation.
  • after the PHY layer of the serializer of the present application encodes the data of the multiple data sources separately, the data is integrated into one PMD frame.
  • the subsequent modulation process can then use the PMD frame as its unit, combining the data from different sources and shielding the difference between modulation methods, so that the communication between the serializer and the deserializer is compatible with both modulation modes, single-carrier modulation and multi-carrier modulation.
  • the present application provides a communication method in an advanced driving assistance system, including: receiving a modulated signal; performing single-carrier demodulation or multi-carrier demodulation on the modulated signal to obtain a physical medium dependent (PMD) frame; and obtaining a plurality of data to be decoded from the PMD frame, the plurality of data to be decoded corresponding to a plurality of data sources.
  • the deserializer receives the modulated signal over the cable.
  • the PHY layer performs single-carrier demodulation or multi-carrier demodulation on the modulated signal based on the communication method to obtain the PMD frame.
  • the PHY layer obtains the data to be decoded based on the format of the PMD frame; the deserializer and the serializer agree on the format of the PMD frame in advance, so the serializer does not need to transmit the format information of the PMD frame to the deserializer.
  • after the PHY layer of the deserializer of the present application receives the modulated signal, demodulation yields a unified PMD frame regardless of whether single-carrier or multi-carrier modulation was used.
  • the data is combined in a way that shields the difference between demodulation methods, so that the communication between the serializer and the deserializer is compatible with both modulation modes, single-carrier modulation and multi-carrier modulation.
  • the present application provides a communication device, comprising: an acquisition module for acquiring a plurality of data to be modulated, the plurality of data to be modulated coming from a plurality of data sources; a framing module for combining the plurality of data to be modulated into one physical medium dependent (PMD) frame; and a modulation module for performing single-carrier modulation or multi-carrier modulation on the PMD frame to obtain a modulated signal and sending the modulated signal.
  • the framing module is specifically configured to sequentially fill in the plurality of data to be modulated according to a set order to obtain the PMD frame, the plurality of data to be modulated each occupying N bits, where N is a preset positive integer and N>1.
  • the framing module is specifically configured to fill in length indication information, where the length indication information is used to indicate the respective lengths of the plurality of data to be modulated, and to sequentially fill in the plurality of data to be modulated according to the set order to obtain the PMD frame.
  • the framing module is specifically configured to fill in the current data corresponding to the current position in the sequence, and fill in an end identifier after the current data ends, so as to obtain the PMD frame; the current data is one of the plurality of data to be modulated, and the plurality of data to be modulated are ordered in a preset sequence.
  • the framing module is specifically configured to fill in a start identifier before filling in the current data corresponding to the current position in the sequence, and fill in the current data after the start identifier, so as to obtain the PMD frame; the current data is one of the plurality of data to be modulated, and the plurality of data to be modulated are ordered in a preset sequence.
  • the modulation module is specifically configured to map each bit of the PMD frame to a corresponding amplitude according to the bit order of the PMD frame to obtain the modulation signal.
  • the modulation module is specifically configured to multiplex the plurality of bits included in the PMD frame onto each sub-carrier in the multi-carrier, and to perform mapping and inverse fast Fourier transform on the bits carried by each sub-carrier to obtain the modulated signal.
  • the present application provides a communication device, comprising: a receiving module for receiving a modulated signal; and a demodulation module for performing single-carrier demodulation or multi-carrier demodulation on the modulated signal to obtain a physical medium dependent (PMD) frame, and obtaining a plurality of data to be decoded from the PMD frame, the plurality of data to be decoded corresponding to a plurality of data sources.
  • the demodulation module is specifically configured to obtain a piece of data to be decoded every time N bits are extracted according to the bit order in the PMD frame, where N is a preset positive integer, N>1.
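The fixed-slot extraction rule above is simply the inverse of the framing step; a one-line sketch (hypothetical helper name):

```python
def split_pmd_frame(frame, n):
    """Cut a received PMD frame into per-source fields of N bits each,
    following the bit order of the frame."""
    return [frame[i:i + n] for i in range(0, len(frame), n)]
```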
  • the demodulation module is specifically configured to obtain length indication information from the PMD frame, where the length indication information is used to indicate the respective lengths of the plurality of data to be decoded, and to sequentially extract bits of the corresponding lengths from the PMD frame according to the length indication information to obtain the corresponding data to be decoded.
  • the demodulation module is specifically configured to extract bits from the PMD frame one by one until an end identifier is extracted; the multiple bits extracted before the end identifier form the current data, and the current data is one of the plurality of data to be decoded.
  • the demodulation module is specifically configured to extract bits from the PMD frame one by one after a start identifier is extracted, until the next start identifier is extracted; the multiple bits extracted between the two start identifiers form the current data, and the current data is one of the plurality of data to be decoded.
  • the present application provides a communication device, comprising: one or more processors; and a memory for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the method according to any one of the above first aspects.
  • the present application provides a communication device, comprising: one or more processors; and a memory for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the method according to any one of the above second aspects.
  • the present application provides a computer-readable storage medium, comprising a computer program, which, when executed on a computer, causes the computer to execute the method described in any one of the first to second aspects above.
  • the present application provides a computer program, when the computer program is executed by a computer, for executing the method described in any one of the first to second aspects above.
  • Fig. 1 is an exemplary functional block diagram of the vehicle of the present application
  • FIG. 2 is a schematic diagram of an exemplary application scenario of an ADAS system
  • Fig. 3a is an exemplary flow chart of single carrier modulation
  • Figure 3b is an exemplary flow chart of multi-carrier modulation
  • FIG. 4 is an exemplary flowchart of the communication method in the advanced driving assistance system of the present application.
  • Figs. 5a to 5d are schematic diagrams of several exemplary formats of the PMD frame of the present application.
  • FIG. 6 is an exemplary flowchart of the communication method in the advanced driving assistance system of the present application.
  • FIG. 7 is a schematic structural diagram of an exemplary communication device of the present application.
  • FIG. 8 is a schematic structural diagram of an exemplary communication device of the present application.
  • "At least one (item)" refers to one or more, and "a plurality" refers to two or more.
  • "And/or" describes the relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" can mean: only A exists, only B exists, or both A and B exist, where A and B can be singular or plural.
  • the character "/" generally indicates that the associated objects are in an "or" relationship.
  • "At least one of the following items" or similar expressions refer to any combination of these items, including any combination of a single item or plural items.
  • "at least one of a, b or c" can mean: a, b, c, "a and b", "a and c", "b and c", or "a and b and c", where a, b and c can each be single or multiple.
  • FIG. 1 is an exemplary functional block diagram of the vehicle of the present application.
  • components coupled to or included in vehicle 100 may include propulsion system 110 , sensor system 120 , control system 130 , peripherals 140 , power supply 150 , computing device 160 , and user interface 170 .
  • the components of the vehicle 100 may be configured to operate in interconnection with each other and/or with other components coupled to the various systems.
  • power supply 150 may provide power to all components of vehicle 100 .
  • Computing device 160 may be configured to receive data from and control propulsion system 110 , sensor system 120 , control system 130 , and peripherals 140 .
  • Computing device 160 may also be configured to generate a display of images on user interface 170 and receive input from user interface 170 .
  • vehicle 100 may include more, fewer, or different systems, and each system may include more, fewer, or different components.
  • the illustrated systems and components may be combined or divided in any manner, which is not specifically limited in this application.
  • Computing device 160 may include processor 161 , transceiver 162 and memory 163 .
  • Computing device 160 may be a controller or part of a controller of vehicle 100 .
  • Memory 163 may store instructions 1631 running on processor 161 and may also store map data 1632.
  • Processor 161 included in computing device 160 may include one or more general-purpose processors and/or one or more special-purpose processors (eg, image processors, digital signal processors, etc.). To the extent processor 161 includes more than one processor, such processors may operate individually or in combination.
  • Computing device 160 may implement functions for controlling vehicle 100 based on input received through user interface 170 .
  • the transceiver 162 is used for communication between the computing device 160 and various systems.
  • the memory 163 may in turn include one or more volatile storage components and/or one or more non-volatile storage components, such as optical, magnetic and/or organic storage devices, and the memory 163 may be integrated in whole or in part with the processor 161.
  • the memory 163 may contain instructions 1631 (eg, program logic) executable by the processor 161 to perform various vehicle functions, including any of the functions or methods described herein.
  • Propulsion system 110 may provide powered motion for vehicle 100 .
  • propulsion system 110 may include an engine/motor 114, an energy source 113, a transmission 112, and wheels/tires 111. Additionally, propulsion system 110 may additionally or alternatively include other components than those shown in FIG. 1. This application does not specifically limit this.
  • Sensor system 120 may include several sensors for sensing information about the environment in which vehicle 100 is located.
  • the sensors of the sensor system 120 include a global positioning system (GPS) 126, an inertial measurement unit (IMU) 125, a lidar sensor 124, a camera sensor 123, a millimeter-wave radar sensor 122, and an actuator 121 for modifying the position and/or orientation of the sensors.
  • GPS 126 may be any sensor used to estimate the geographic location of vehicle 100 .
  • the GPS 126 may include a transceiver that estimates the position of the vehicle 100 relative to the earth based on satellite positioning data.
  • computing device 160 may be used to use GPS 126 in conjunction with map data 1632 to estimate the road on which vehicle 100 is traveling.
  • the IMU 125 may be used to sense position and orientation changes of the vehicle 100 based on inertial acceleration.
  • the combination of sensors in IMU 125 may include, for example, an accelerometer and a gyroscope. Additionally, other combinations of sensors in IMU 125 are possible.
  • the lidar sensor 124 can be considered an object detection system that uses light sensing to detect objects in the environment in which the vehicle 100 is located. Generally, the lidar sensor 124 may be an optical remote sensing technique that measures the distance to the target or other properties of the target by illuminating the target with light.
  • the lidar sensor 124 may include a laser source and/or laser scanner configured to emit laser pulses, and a detector for receiving reflections of the laser pulses.
  • the lidar sensor 124 may include a laser rangefinder whose beam is reflected by a rotating mirror; the laser is scanned around the digitized scene in one or two dimensions, collecting distance measurements at specified angular intervals.
  • the lidar sensor 124 may include components such as light (eg, laser) sources, scanners and optical systems, light detectors and receiver electronics, and position and navigation systems.
  • the LiDAR sensor 124 determines the distance of an object by scanning the laser light reflected back from an object, and can form a 3D environment map with an accuracy of up to centimeter level.
  • Camera sensor 123 may include any camera (eg, still camera, video camera, etc.) used to acquire images of the environment in which vehicle 100 is located. To this end, the camera sensor 123 may be configured to detect visible light, or may be configured to detect light from other parts of the spectrum, such as infrared light or ultraviolet light. Other types of camera sensors 123 are also possible. The camera sensor 123 may be a two-dimensional detector, or may have a three-dimensional spatial range detection function. In some examples, camera sensor 123 may be, for example, a distance detector configured to generate a two-dimensional image indicative of distances from camera sensor 123 to several points in the environment. To this end, camera sensor 123 may use one or more distance detection techniques.
  • the camera sensor 123 may be configured to use structured light technology, wherein the vehicle 100 illuminates objects in the environment with a predetermined light pattern, such as a grid or checkerboard pattern, and the camera sensor 123 detects reflections of the predetermined light pattern from the object. Based on the distortion in the reflected light pattern, the vehicle 100 may be configured to determine the distance to points on the object.
  • the predetermined light pattern may include infrared light or other wavelengths of light.
  • the millimeter-wave radar sensor 122 generally refers to an object detection sensor operating at wavelengths of 1-10 mm, with frequencies roughly in the range of 10 GHz-200 GHz.
  • the measurements of the millimeter-wave radar sensor 122 carry depth information and can provide the distance of the target; furthermore, because the millimeter-wave radar sensor 122 exhibits an obvious Doppler effect, it is very sensitive to speed, and the speed of the target can be obtained directly by extracting its Doppler shift.
  • the two mainstream automotive millimeter-wave radar application frequency bands are 24GHz and 77GHz.
  • the former's wavelength is about 1.25 cm, mainly used for short-range sensing, such as the environment around the vehicle body, blind spots, parking assistance and lane-change assistance; the latter's wavelength is about 4 mm, used for medium- and long-range measurement, such as automatic following, adaptive cruise control (ACC) and automatic emergency braking (AEB).
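The quoted wavelengths follow directly from λ = c/f; a quick check:

```python
C = 299_792_458  # speed of light in m/s

def wavelength_mm(freq_ghz):
    """Free-space wavelength in millimetres for a frequency in GHz."""
    return C / (freq_ghz * 1e9) * 1000.0

# 24 GHz -> ~12.5 mm (1.25 cm), 77 GHz -> ~3.9 mm
```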
  • Sensor system 120 may also include additional sensors, including, for example, sensors that monitor internal systems of vehicle 100 (eg, an O2 monitor, fuel gauge, oil temperature, etc.). Sensor system 120 may also include other sensors. This application does not specifically limit this.
  • the control system 130 may be configured to control the operation of the vehicle 100 and its components.
  • the control system 130 may include a steering unit 136 , a throttle 135 , a braking unit 134 , a sensor fusion algorithm 133 , a computer vision system 132 , and a navigation/routing control system 131 .
  • Control system 130 may additionally or alternatively include other components than those shown in FIG. 1 . This application does not specifically limit this.
  • Peripherals 140 may be configured to allow vehicle 100 to interact with external sensors, other vehicles, and/or a user.
  • peripheral devices 140 may include, for example, a wireless communication system 144 , a touch screen 143 , a microphone 142 and/or a speaker 141 .
  • Peripherals 140 may additionally or alternatively include other components than those shown in FIG. 1 . This application does not specifically limit this.
  • Power supply 150 may be configured to provide power to some or all components of vehicle 100 .
  • the power source 150 may include, for example, a rechargeable lithium-ion or lead-acid battery.
  • one or more battery packs may be configured to provide power.
  • Other power supply materials and configurations are also possible.
  • power source 150 and energy source 113 may be implemented together, as in some all-electric vehicles.
  • Components of the vehicle 100 may be configured to operate in interconnection with other components within and/or outside of their respective systems. To this end, the components and systems of the vehicle 100 may be communicatively linked together through a system bus, network, and/or other connection mechanisms.
  • FIG. 2 is a schematic diagram of an exemplary application scenario of the ADAS system.
  • the scenario is communication between a camera (Camera) and an MDC.
  • the camera captures image data, and the image data is transmitted to the MDC through a serializer and a deserializer; the MDC can also transmit control commands in reverse to control or monitor the operation of the camera.
  • the cameras are set around the body, so there is a certain distance between each camera and the MDC, and the serializer and deserializer need to be connected by cables.
  • the serializer modulates the image data of the camera and sends it to the cable for transmission.
  • the deserializer receives the signal from the cable, and then demodulates and restores the image data transmitted by the camera, and then transmits it to the MDC.
  • this embodiment only exemplarily shows the communication method between the camera and the MDC.
  • the communication methods between these sensors and the MDC are similar, and will not be described in detail.
  • Fig. 3a is an exemplary flow chart of single-carrier modulation.
  • the data to be transmitted comes from data source 1 and data source 2; data source 1 and data source 2 can be any two sensors in the ADAS system, such as those included in sensor system 120 shown in FIG. 1.
  • the medium access control layer (Medium Access Control, MAC) performs MAC encapsulation on the data from data source 1 and data source 2 respectively, and then performs packet-based scheduling on these MAC data packets.
  • the physical layer (PHY) performs coding based on the scheduling of the MAC layer (the coding here refers to generalized forward error correction (FEC) coding, including possible operations such as 64b/66b encapsulation, interleaving, scrambling and precoding), and then maps the encoded bits into corresponding symbols in turn.
  • in non-return-to-zero (NRZ) coding, one bit is mapped to a symbol; in 4-level pulse amplitude modulation (PAM4), two bits are mapped to a symbol; in 8-level pulse amplitude modulation (PAM8), three bits are mapped to a symbol, and so on. The mapping result is the corresponding amplitude, which is transmitted through the analog front end onto the cable.
  • the signal on the cable is received through the analog front end. During demodulation, the PHY layer first equalizes the received signal, that is, compensates for the channel attenuation, then demaps it to obtain the bit stream, and finally decodes the bit stream to restore the bit string.
  • the MAC layer performs packet-based distribution of the bit string to obtain data packets corresponding to data source 1 and data source 2, and then performs MAC decapsulation on the data packets respectively to obtain data corresponding to data source 1 and data source 2.
  • in single-carrier modulation, the symbol rate is high and one symbol corresponds to fewer bits.
  • communication systems based on single-carrier modulation do not distinguish services at the PHY layer: the service layer above the PHY (usually the MAC layer) identifies the service and completes the multiplexing of data before it enters the PHY layer, so no frame concept exists at the PHY layer. Therefore, information that needs to be terminated at the PHY layer, such as data symbols or frame acknowledgments (ACK), is often transmitted at the MAC layer or above, so retransmission can only occur at the MAC layer or above.
  • Fig. 3b is an exemplary flowchart of multi-carrier modulation.
  • the data to be transmitted comes from data source 1, data source 2, ..., data source n; these can be any sensors in the ADAS system, e.g., the sensors included in the sensor system 120 shown in FIG. 1.
  • the MAC layer performs MAC encapsulation on the data from data source 1, data source 2, ..., and data source n respectively.
  • the PHY layer encodes the encapsulated packets of data source 1, data source 2, ..., data source n respectively, multiplexes the n encoded bit streams onto different subcarriers, and then performs mapping and inverse fast Fourier transform (IFFT) to obtain the time-domain signal.
  • At the deserializer on the receiving end, the PHY layer performs fast Fourier transform (FFT) and frequency-domain equalization on the received time-domain signal, then distributes the subcarrier data and decodes each stream separately.
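The IFFT-at-transmit / FFT-at-receive round trip in this flow can be sketched with a naive DFT over an ideal channel. This is an assumption-laden illustration: there is no cyclic prefix and no equalization (the channel is assumed perfect), and `idft`/`dft` are slow, hypothetical stand-ins for an optimized FFT.

```python
import cmath

def idft(freq):
    # Inverse DFT: subcarrier (frequency-domain) symbols -> time-domain samples
    N = len(freq)
    return [sum(freq[k] * cmath.exp(2j * cmath.pi * k * n / N)
                for k in range(N)) / N for n in range(N)]

def dft(time):
    # Forward DFT: time-domain samples -> subcarrier symbols
    N = len(time)
    return [sum(time[n] * cmath.exp(-2j * cmath.pi * k * n / N)
                for n in range(N)) for k in range(N)]

# Two sources, each assigned its own block of subcarriers
src1 = [1 + 0j, -1 + 0j]
src2 = [1 + 1j, -1 - 1j]
tx_time = idft(src1 + src2)    # transmit side: multiplex onto subcarriers + IFFT
rx_freq = dft(tx_time)         # receive side: FFT back to the frequency domain
rx1, rx2 = rx_freq[:2], rx_freq[2:]   # subcarrier data distribution per source
```

Over this ideal channel each source's symbols come back unchanged, which is why the receiver can decode each subcarrier block independently.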
  • the MAC layer then performs MAC decapsulation respectively to obtain the data corresponding to data source 1, data source 2, ..., and data source n.
  • Single-carrier modulation multiplexes and schedules data at the MAC layer, with the PHY layer acting only as a pipeline for the multiplexed multi-source data.
  • Multi-carrier modulation, by contrast, multiplexes and transmits multi-source data by frequency division multiplexing at the PHY layer.
  • the MDC may need to receive signals from multiple sensors to collect complete road conditions, environment and other information, so the deserializer on the MDC side can include multiple ports to support different sensors. Sensors of different bandwidths may coexist, and thus multiple modulation modes may coexist in an ADAS system containing multiple serializers and a deserializer.
  • the communication method between the serializer and the deserializer needs to satisfy as many application scenarios as possible, and the more scenarios it covers, the better the compatibility of the system.
  • This application provides a communication method in an advanced driver assistance system, which is compatible with single-carrier modulation and multi-carrier modulation, and improves the compatibility of ADAS systems.
  • FIG. 4 is an exemplary flowchart of the communication method in the advanced driver assistance system of the present application.
  • the process 400 may be performed by a serializer, and specifically refers to a PHY layer in the serializer.
  • Process 400 is described as a series of steps or operations, and it should be understood that process 400 may be performed in various orders and/or concurrently, and is not limited to the order of execution shown in FIG. 4 .
  • Step 401 Acquire a plurality of data to be modulated, and the plurality of data to be modulated come from a plurality of data sources.
  • the initial sources of the multiple data to be modulated are multiple data sources.
  • the multiple data sources reference may be made to the sensor system 120 shown in FIG. 1 , which will not be repeated here.
  • the plurality of data to be modulated acquired by the PHY layer have been processed by the MAC layer, and have been encoded at the PHY layer, and the processing procedure of the MAC layer and the encoding processing procedure of the PHY layer are not limited in this application.
  • Step 402 form a PMD frame with a plurality of data to be modulated.
  • the present application can integrate the encapsulated and encoded data to be modulated from multiple data sources into one physical media dependent (PMD) frame.
  • PMD physical media dependent
  • the PHY layer may fill in a plurality of data to be modulated in sequence according to the set order to obtain a PMD frame, the plurality of data to be modulated occupy N bits respectively, N is a preset positive integer, and N>1.
  • the length of the PMD frame may be N ⁇ n, where n represents the total number of data sources.
  • the data of each data source occupies a fixed number of bits in the PMD frame. If the data is shorter than N bits, the remaining bits of its slot are filled with a default value, such as 0; if the data is longer than N bits, the bits beyond N are encapsulated into the next PMD frame.
  • the setting order means that the bits occupied by the n data sources in the PMD frame are preset.
  • For example, the data sources include two sensors, a camera and a radar, and the camera precedes the radar in the set order. Therefore, in the PMD frame, the first N bits are filled with the camera data and the last N bits with the radar data, as shown in Figure 5a.
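The fixed-slot format of Figure 5a can be sketched with bit strings. `build_pmd_frame_fixed` is a hypothetical illustration, not the application's framer; the spill-over of data longer than N bits into the next PMD frame is noted but not implemented.

```python
def build_pmd_frame_fixed(datas, N):
    """Fill each source's bits into a fixed N-bit slot in the set order;
    unused bits in a slot are padded with the default value 0."""
    frame = ''
    for bits in datas:
        # Bits beyond N would be carried in the next PMD frame (not shown)
        assert len(bits) <= N, 'excess bits belong to the next PMD frame'
        frame += bits + '0' * (N - len(bits))
    return frame

# Camera data then radar data, N = 6 bits per slot -> frame length N * n = 12
frame = build_pmd_frame_fixed(['1011', '11'], N=6)
```

The frame length is always N x n regardless of how much each sensor produced, which is what makes the receiver's parsing trivial at the cost of padding overhead.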
  • the PHY layer may first fill in length indication information, where the length indication information is used to indicate the respective lengths of the multiple data to be modulated, and then fill in the multiple data to be modulated in sequence according to the set order to obtain the PMD frame.
  • In this embodiment, information indicating the respective lengths of the multiple data to be modulated is filled in the header of the PMD frame, so that the bits after this information can be filled with the multiple data to be modulated in the set order. There is no need to reserve padding bits as in the above embodiment, which reduces the bit consumption of the PMD frame.
  • For example, the data sources include a camera and a radar, and the camera precedes the radar in the set order. The camera data is N1 bits long and the radar data is N2 bits long.
  • Therefore, in the PMD frame, the first field is the length indication information, filled with N1 and N2; the second field is N1 bits long and filled with the camera data; the third field is N2 bits long and filled with the radar data, as shown in Figure 5b.
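The length-indication format of Figure 5b can be sketched as follows. This is an illustrative sketch under assumptions of my own: the width of each length field (`LEN_FIELD` = 8 bits) and both helper names are hypothetical, as the application does not fix an encoding for the length indication information.

```python
LEN_FIELD = 8  # hypothetical fixed width of each length field, in bits

def build_pmd_frame_lengths(datas):
    # Header: one fixed-width binary length field per data source,
    # in the set order, followed by the data themselves back to back.
    header = ''.join(format(len(d), '0%db' % LEN_FIELD) for d in datas)
    return header + ''.join(datas)

def parse_pmd_frame_lengths(frame, n_sources):
    # Read the lengths from the header first, then slice each data
    # out of the payload; no padding bits are needed.
    lengths = [int(frame[i * LEN_FIELD:(i + 1) * LEN_FIELD], 2)
               for i in range(n_sources)]
    pos, out = n_sources * LEN_FIELD, []
    for n in lengths:
        out.append(frame[pos:pos + n])
        pos += n
    return out

camera, radar = '1011', '110011'   # N1 = 4 bits, N2 = 6 bits
frame = build_pmd_frame_lengths([camera, radar])
```

Compared with the fixed-slot variant, the frame costs `n * LEN_FIELD` header bits but never wastes padding, which matches the bit-saving rationale given above.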
  • In a possible implementation, the PHY layer may fill in the current data corresponding to the current order and fill in an end identifier after the current data ends, to obtain the PMD frame; the current data is one of the plurality of data to be modulated, and the order of the plurality of data to be modulated is preset.
  • an end identifier is included in the PMD frame, and the end identifier is used to indicate the end of a piece of data.
  • For example, the data sources include a camera and a radar, and the camera precedes the radar in the set order. The camera data is N1 bits long and the radar data is N2 bits long. Therefore, in the PMD frame, the first field is N1 bits long and filled with the camera data; the second field contains m bits and is filled with an end identifier, such as 001, 1010, etc.; the third field is N2 bits long and filled with the radar data; the fourth field contains m bits and is filled with an end identifier, such as 001, 1010, etc., as shown in Figure 5c.
  • It should be noted that the end identifiers filled in after each data to be modulated may all be the same, all different, or partially the same, as long as the deserializer can recognize them.
  • this embodiment does not specifically limit the length of the end identifier.
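The end-identifier format of Figure 5c can be sketched as follows. This is a hypothetical illustration: the 4-bit pattern `1010` is one of the example identifiers above, and the parser only works under the strong assumption (which a real framer would have to enforce, e.g. by bit stuffing, not shown) that the end pattern can never appear inside or straddle into the data.

```python
END = '1010'  # hypothetical end identifier, taken from the examples above

def build_pmd_frame_end(datas):
    # Append an end identifier after each data in the set order.
    return ''.join(d + END for d in datas)

def parse_pmd_frame_end(frame, end=END):
    # Scan bit by bit; an occurrence of the end identifier closes the
    # current data. Assumes END cannot occur within the payload bits.
    out, cur, i = [], '', 0
    while i < len(frame):
        if frame.startswith(end, i):
            out.append(cur)
            cur, i = '', i + len(end)
        else:
            cur += frame[i]
            i += 1
    return out
```

This variant needs no length header and no padding, trading them for m identifier bits per data plus the payload-coding constraint.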
  • In a possible implementation, the PHY layer may fill in a start identifier before filling in the current data corresponding to the current order, and fill in the current data after the start identifier, to obtain the PMD frame; the current data is one of the plurality of data to be modulated, and the order of the plurality of data to be modulated is preset.
  • That is, a start identifier is included in the PMD frame, and the start identifier is used to indicate the start of a piece of data.
  • For example, the data sources include a camera and a radar, and the camera precedes the radar in the set order. The camera data is N1 bits long and the radar data is N2 bits long. Therefore, in the PMD frame, the first field is m bits long and filled with a start identifier, such as 001, 1010, etc.; the second field is N1 bits long and filled with the camera data; the third field is m bits long and filled with a start identifier, such as 001, 1010, etc.; the fourth field is N2 bits long and filled with the radar data, as shown in Figure 5d.
  • It should be noted that the start identifiers filled in before each data to be modulated may all be the same, all different, or partially the same, as long as the deserializer can recognize them.
  • this embodiment does not specifically limit the length of the start identifier.
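The start-identifier format of Figure 5d can be sketched the same way. As with the end-identifier sketch, the 3-bit pattern `001` is only one of the example identifiers above, and the parser assumes the start pattern never occurs inside the data; the last data is closed by the end of the frame rather than by another identifier.

```python
START = '001'  # hypothetical start identifier, taken from the examples above

def build_pmd_frame_start(datas):
    # Prepend a start identifier before each data in the set order.
    return ''.join(START + d for d in datas)

def parse_pmd_frame_start(frame, start=START):
    # Bits between two start identifiers form one data; the final data
    # runs to the end of the frame. Assumes START cannot occur within
    # the payload bits.
    out, cur, i = [], '', len(start)   # skip the frame's first identifier
    while i < len(frame):
        if frame.startswith(start, i):
            out.append(cur)
            cur, i = '', i + len(start)
        else:
            cur += frame[i]
            i += 1
    out.append(cur)
    return out
```
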
  • the present application may also set other formats for the PMD frame, which is not specifically limited.
  • Step 403 Perform single-carrier modulation or multi-carrier modulation on the PMD frame to obtain a modulated signal.
  • the PHY layer may map each bit of the PMD frame to corresponding amplitudes according to the bit order of the PMD frame to obtain a modulated signal.
  • reference may be made to the related technology of single-carrier modulation, which will not be repeated here.
  • During multi-carrier modulation, the PHY layer can multiplex the bits contained in the PMD frame onto the subcarriers of the multi-carrier, and map and inverse fast Fourier transform the bits carried by each subcarrier to obtain the modulated signal.
  • Step 404 Send the modulated signal.
  • the serializer sends the modulated signal onto the cable for transmission to the deserializer.
  • Because the PHY layer of the serializer of the present application integrates the separately encoded data of multiple data sources into one PMD frame, the subsequent modulation process can use the PMD frame as a unit to combine the data from different sources, shielding the differences between modulation methods, so that the communication between the serializer and the deserializer is compatible with the two modulation modes of single-carrier modulation and multi-carrier modulation.
  • FIG. 6 is an exemplary flowchart of the communication method in the advanced driver assistance system of the present application.
  • the process 600 may be performed by a deserializer, and specifically refers to a PHY layer in the deserializer.
  • Process 600 is described as a series of steps or operations, and it should be understood that process 600 may be performed in various orders and/or concurrently, and is not limited to the order of execution shown in FIG. 6 .
  • Step 601 Receive a modulated signal.
  • the deserializer receives the modulated signal over the cable.
  • Step 602 Perform single-carrier demodulation or multi-carrier demodulation on the modulated signal to obtain a PMD frame.
  • the PHY layer performs single-carrier demodulation or multi-carrier demodulation on the modulated signal, according to the modulation mode in use, to obtain the PMD frame. This process is the inverse of step 403 of the embodiment shown in FIG. 4.
  • Step 603 Obtain a plurality of data to be decoded according to the PMD frame, where the plurality of data to be decoded correspond to a plurality of data sources.
  • The process by which the PHY layer obtains the data to be decoded is the inverse of step 402 in the embodiment shown in FIG. 4, and the format of the PMD frame is the same as that described in step 402; details are not repeated here.
  • After the PHY layer of the deserializer of the present application receives the modulated signal, demodulation yields a unified PMD frame whether single-carrier or multi-carrier modulation was used; the data are combined in that frame, shielding the differences between demodulation methods, so that the communication between the serializer and the deserializer is compatible with the two modulation modes of single-carrier modulation and multi-carrier modulation.
  • FIG. 7 is an exemplary schematic structural diagram of the communication apparatus of the present application. As shown in FIG. 7 , the apparatus of this embodiment may be applied to the serializer in the above-mentioned embodiment.
  • the communication apparatus includes: an acquisition module 701, a framing module 702, a modulation module 703 and a transmission module 704, wherein:
  • the acquisition module 701 is used to acquire a plurality of data to be modulated, the plurality of data to be modulated coming from a plurality of data sources; the framing module 702 is used to form the plurality of data to be modulated into a physical medium dependent (PMD) frame; the modulation module 703 is used to perform single-carrier modulation or multi-carrier modulation on the PMD frame to obtain a modulated signal; and the sending module 704 is used to send the modulated signal.
  • In a possible implementation, the framing module 702 is specifically configured to sequentially fill in the plurality of data to be modulated according to a set order to obtain the PMD frame, where the plurality of data to be modulated occupy N bits respectively, N being a preset positive integer with N>1.
  • In a possible implementation, the framing module 702 is specifically configured to fill in length indication information, where the length indication information is used to indicate the respective lengths of the plurality of data to be modulated, and then to sequentially fill in the plurality of data to be modulated to obtain the PMD frame.
  • In a possible implementation, the framing module 702 is specifically configured to fill in the current data corresponding to the current order and fill in an end identifier after the current data ends, so as to obtain the PMD frame, where the current data is one of the plurality of data to be modulated and the order of the plurality of data to be modulated is preset.
  • In a possible implementation, the framing module 702 is specifically configured to fill in a start identifier before filling in the current data corresponding to the current order, and to fill in the current data after the start identifier, so as to obtain the PMD frame, where the current data is one of the plurality of data to be modulated and the order of the plurality of data to be modulated is preset.
  • the modulation module 703 is specifically configured to map each bit of the PMD frame to a corresponding amplitude according to the bit order of the PMD frame to obtain the modulation signal.
  • In a possible implementation, the modulation module 703 is specifically configured to multiplex the multiple bits included in the PMD frame onto the subcarriers of the multi-carrier, and to map and inverse fast Fourier transform the bits carried by each subcarrier to obtain the modulated signal.
  • the apparatus in this embodiment can be used to execute the technical solution of the method embodiment shown in FIG. 4 , and the implementation principle and technical effect thereof are similar, and are not repeated here.
  • FIG. 8 is an exemplary schematic structural diagram of the communication apparatus of the present application. As shown in FIG. 8 , the apparatus of this embodiment may be applied to the deserializer in the foregoing embodiment.
  • the communication apparatus includes: a receiving module 801 and a demodulation module 802, wherein:
  • the receiving module 801 is used to receive a modulated signal; the demodulation module 802 is used to perform single-carrier demodulation or multi-carrier demodulation on the modulated signal to obtain a physical medium PMD frame; obtain a plurality of data to be decoded according to the PMD frame , the multiple data to be decoded correspond to multiple data sources.
  • the demodulation module 802 is specifically configured to extract a piece of data to be decoded every time N bits are extracted according to the bit order in the PMD frame, where N is a preset positive integer , N>1.
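The fixed-N extraction described here is the receiver-side counterpart of the fixed-slot framing; a minimal sketch, with `split_pmd_frame_fixed` as a hypothetical helper name:

```python
def split_pmd_frame_fixed(frame, N):
    # Every N bits, in bit order, is one source's data to be decoded;
    # removing the 0-padding is left to the subsequent decoding stage.
    return [frame[i:i + N] for i in range(0, len(frame), N)]
```

Because the slot width N is preset on both sides, the deserializer needs no per-frame metadata to separate the sources.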
  • In a possible implementation, the demodulation module 802 is specifically configured to obtain length indication information according to the PMD frame, where the length indication information is used to indicate the respective lengths of the plurality of data to be decoded, and then to sequentially extract bits of the corresponding lengths from the PMD frame according to the length indication information to obtain each data to be decoded.
  • In a possible implementation, the demodulation module 802 is specifically configured to extract bits from the PMD frame one by one until an end identifier is extracted; the bits extracted before the end identifier form the current data, and the current data is one of the plurality of data to be decoded.
  • In a possible implementation, the demodulation module 802 is specifically configured to extract bits from the PMD frame one by one after a start identifier is extracted, until the next start identifier is extracted; the bits extracted between the two start identifiers constitute the current data, and the current data is one of the plurality of data to be decoded.
  • the apparatus of this embodiment can be used to execute the technical solution of the method embodiment shown in FIG. 6 , and its implementation principle and technical effect are similar, and details are not repeated here.
  • each step of the above method embodiments may be completed by a hardware integrated logic circuit in a processor or an instruction in the form of software.
  • the processor can be a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
  • DSP digital signal processor
  • ASIC application-specific integrated circuit
  • FPGA field programmable gate array
  • a general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
  • the steps of the methods disclosed in the embodiments of the present application may be directly embodied as executed by a hardware coding processor, or executed by a combination of hardware and software modules in the coding processor.
  • the software module may be located in random access memory, flash memory, read-only memory, programmable read-only memory or electrically erasable programmable memory, registers and other storage media mature in the art.
  • the storage medium is located in the memory, and the processor reads the information in the memory, and completes the steps of the above method in combination with its hardware.
  • the memory mentioned in the above embodiments may be volatile memory or non-volatile memory, or may include both volatile and non-volatile memory.
  • the non-volatile memory may be read-only memory (ROM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (electrically EPROM, EEPROM) or flash memory.
  • Volatile memory may be random access memory (RAM), which acts as an external cache.
  • RAM random access memory
  • DRAM dynamic random access memory
  • SDRAM synchronous DRAM
  • DDR SDRAM double data rate synchronous dynamic random access memory
  • ESDRAM enhanced synchronous dynamic random access memory
  • SLDRAM synchronous link dynamic random access memory
  • DR RAM direct Rambus random access memory
  • the disclosed system, apparatus and method may be implemented in other manners.
  • the apparatus embodiments described above are only illustrative.
  • the division of the units is only a logical function division. In actual implementation, there may be other division methods.
  • multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • the shown or discussed mutual coupling or direct coupling or communication connection may be through some interfaces, indirect coupling or communication connection of devices or units, and may be in electrical, mechanical or other forms.
  • the units described as separate components may or may not be physically separated, and components displayed as units may or may not be physical units, that is, may be located in one place, or may be distributed to multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution in this embodiment.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit.
  • the functions, if implemented in the form of software functional units and sold or used as independent products, may be stored in a computer-readable storage medium.
  • In essence, the technical solution of the present application, or the part that contributes to the prior art, or a part of the technical solution, can be embodied in the form of a software product; the computer software product is stored in a storage medium and includes several instructions for causing a computer device (a personal computer, a server, a network device, etc.) to execute all or part of the steps of the methods described in the various embodiments of the present application.
  • the aforementioned storage medium includes: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, or other media that can store program code.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Traffic Control Systems (AREA)

Abstract

A communication method and device in an advanced driver assistance system. The method includes the following steps: obtaining multiple pieces of data to be modulated, said data coming from multiple data sources; combining said data into one physical medium dependent (PMD) frame; performing single-carrier modulation or multi-carrier modulation on the PMD frame to obtain a modulated signal; and sending the modulated signal. Data from different sources are combined and the differences between modulation modes are shielded, so that the communication between a serializer and a deserializer can be compatible with the two modulation modes of single-carrier modulation and multi-carrier modulation.
PCT/CN2021/070371 2021-01-05 2021-01-05 Procédé et dispositif de communication dans un système avancé d'aide à la conduite WO2022147667A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2021/070371 WO2022147667A1 (fr) 2021-01-05 2021-01-05 Procédé et dispositif de communication dans un système avancé d'aide à la conduite
CN202180089133.9A CN116830536A (zh) 2021-01-05 2021-01-05 高级驾驶辅助系统中的通信方法和装置

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2021/070371 WO2022147667A1 (fr) 2021-01-05 2021-01-05 Procédé et dispositif de communication dans un système avancé d'aide à la conduite

Publications (1)

Publication Number Publication Date
WO2022147667A1 true WO2022147667A1 (fr) 2022-07-14

Family

ID=82357082

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/070371 WO2022147667A1 (fr) 2021-01-05 2021-01-05 Procédé et dispositif de communication dans un système avancé d'aide à la conduite

Country Status (2)

Country Link
CN (1) CN116830536A (fr)
WO (1) WO2022147667A1 (fr)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190349874A1 (en) * 2016-12-27 2019-11-14 Lg Electronics Inc. V2x communication apparatus and data communication method therefor
CN110476403A (zh) * 2017-09-29 2019-11-19 Lg电子株式会社 V2x通信设备及由其发送/接收多媒体内容的方法
CN110692262A (zh) * 2017-03-31 2020-01-14 Lg电子株式会社 用于v2x通信的装置和方法


Also Published As

Publication number Publication date
CN116830536A (zh) 2023-09-29

Similar Documents

Publication Publication Date Title
US10952243B2 (en) Method, system and device for network communications within a vehicle
US20210136087A1 (en) Firewall
US10608941B2 (en) Dual-network for fault tolerance
US10616259B2 (en) Real-time network vulnerability analysis and patching
US10530816B2 (en) Method for detecting the use of unauthorized security credentials in connected vehicles
US20190129431A1 (en) Visual place recognition based self-localization for autonomous vehicles
WO2019108494A1 (fr) Procédé et appareil destinés au traitement et à la journalisation simultanés d'un système de vision d'automobile avec des commandes et une surveillance de défaillance
US10538174B2 (en) Real-time nonlinear receding horizon control of batteries for power systems
US11636077B2 (en) Methods, devices, and systems for processing sensor data of vehicles
WO2021188541A1 (fr) Dispositifs et systèmes lidar simulés
CN105245584A (zh) 一种基于ofdm雷达通信一体化的车联网感知系统及其构建方法
WO2019191380A1 (fr) Procédés de fusion de capteurs pour navigation à réalité augmentée
US11431370B2 (en) Vehicle reception apparatus for receiving broadcast signal and vehicle reception method for receiving broadcast signal
CN106332113B (zh) 通讯方法和终端
KR20210059980A (ko) 차량의 원격 제어방법 및 이를 위한 혼합현실 디바이스 및 차량
KR20210070701A (ko) 3차원 이미지 생성 방법 및 시스템
KR102135254B1 (ko) 차량 내 사용자 모니터링을 위한 배경 이미지 생성 방법 및 이를 위한 장치
WO2022147667A1 (fr) Procédé et dispositif de communication dans un système avancé d'aide à la conduite
WO2018020884A1 (fr) Appareil terminal et système d'appareil
CN109624991B (zh) 由专用网络上共享的数字签名保护的地理标记和时间戳数据
CN209462498U (zh) 路侧基站、车载终端和道路车辆视觉范围扩展系统
WO2021136416A1 (fr) Procédé de transmission d'informations, dispositif et système de communication, et support d'enregistrement lisible par ordinateur
KR102490309B1 (ko) Ar 네비게이션의 화면 표시 방법 및 ar 네이게이션 시스템
CN114693536A (zh) 一种图像处理方法,装置及储存介质
US11768728B2 (en) Routing multiple diagnostic pathways

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21916725

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 202180089133.9

Country of ref document: CN

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21916725

Country of ref document: EP

Kind code of ref document: A1