WO2020060304A1 - Method for transmitting a sidelink signal by a terminal in a wireless communication system, and device therefor


Info

Publication number: WO2020060304A1
Authority: WIPO (PCT)
Prior art keywords: information, PSSCH, vehicle, data, DMRS
Application number: PCT/KR2019/012262
Other languages: English (en), Korean (ko)
Inventors: 홍의현, 서한별, 채혁진, 이승민, 황대성
Original assignee: LG Electronics Inc. (엘지전자 주식회사)
Application filed by LG Electronics Inc. (엘지전자 주식회사)
Publication of WO2020060304A1

Classifications

    • H: Electricity
    • H04: Electric communication technique
    • H04L: Transmission of digital information, e.g. telegraphic communication
    • H04L 1/00: Arrangements for detecting or preventing errors in the information received
    • H04L 5/00: Arrangements affording multiple use of the transmission path

Definitions

  • the following description relates to a wireless communication system, and more particularly, to a method and apparatus for generating and/or transmitting a DMRS sequence that allows sidelink terminals to be distinguished from one another.
  • a wireless communication system is a multiple access system capable of supporting communication with multiple users by sharing available system resources (bandwidth, transmission power, etc.).
  • Examples of the multiple access system include a code division multiple access (CDMA) system, a frequency division multiple access (FDMA) system, a time division multiple access (TDMA) system, an orthogonal frequency division multiple access (OFDMA) system, and a single carrier frequency division multiple access (SC-FDMA) system.
  • CDMA code division multiple access
  • FDMA frequency division multiple access
  • TDMA time division multiple access
  • OFDMA orthogonal frequency division multiple access
  • SC-FDMA single carrier frequency division multiple access
  • MC-FDMA multi-carrier frequency division multiple access
  • RATs radio access technologies
  • LTE Long Term Evolution
  • LTE-A Long Term Evolution-Advanced
  • WiFi wireless fidelity
  • 5G 5th Generation
  • the three main requirement areas of 5G are: (1) the Enhanced Mobile Broadband (eMBB) area, (2) the Massive Machine Type Communication (mMTC) area, and (3) the Ultra-Reliable and Low Latency Communications (URLLC) area.
  • eMBB Enhanced Mobile Broadband
  • mMTC Massive Machine Type Communication
  • URLLC Ultra-reliable and Low Latency Communications
  • KPI key performance indicator
  • 5G is a flexible and reliable way to support these various use cases.
  • eMBB goes far beyond basic mobile Internet access and covers rich interactive work and media and entertainment applications in the cloud or in augmented reality.
  • Data is one of the key drivers of 5G, and in the 5G era, dedicated voice services may disappear for the first time.
  • In 5G, it is expected that voice will be handled simply as an application using the data connection provided by the communication system.
  • the main causes for increased traffic volume are increased content size and increased number of applications requiring high data rates.
  • Streaming services (audio and video), interactive video and mobile internet connections will become more widely used as more devices connect to the internet. Many of these applications require always-on connectivity to push real-time information and notifications to users.
  • Cloud storage and applications are rapidly increasing in mobile communication platforms, which can be applied to both work and entertainment.
  • cloud storage is a special use case that drives the growth of uplink data transfer rate.
  • 5G is also used for remote work in the cloud; when a tactile interface is used, much lower end-to-end latency is required to maintain a good user experience.
  • Entertainment, for example cloud gaming and video streaming, is another key factor increasing demand for mobile broadband capability. Entertainment is essential on smartphones and tablets everywhere, including high-mobility environments such as trains, cars, and airplanes.
  • Another use case is augmented reality and information retrieval for entertainment.
  • augmented reality requires a very low delay and an instantaneous amount of data.
  • URLLC includes new services that will transform the industry through ultra-reliable / low-latency links, such as remote control of the main infrastructure and self-driving vehicles. Reliability and level of delay are essential for smart grid control, industrial automation, robotics, drone control and coordination.
  • 5G can complement fiber-to-the-home (FTTH) and cable-based broadband (or DOCSIS) as a means to provide streams rated at hundreds of megabits per second to gigabit per second. This fast speed is required to deliver TV in 4K (6K, 8K and higher) resolutions as well as virtual and augmented reality.
  • Virtual Reality (VR) and Augmented Reality (AR) applications include almost immersive sports events. Certain application programs may require special network settings. For VR games, for example, game companies may need to integrate the core server with the network operator's edge network server to minimize latency.
  • Automotive is expected to be an important new driver for 5G, along with many use cases for mobile communications to vehicles. For example, entertainment for passengers requires simultaneous high capacity and high mobility mobile broadband. This is because future users continue to expect high-quality connections regardless of their location and speed.
  • Another example of application in the automotive field is the augmented reality dashboard. It identifies objects in the dark over what the driver sees through the front window, and superimposes and displays information telling the driver about the distance and movement of the object.
  • wireless modules will enable communication between vehicles, exchange of information between the vehicle and the supporting infrastructure and exchange of information between the vehicle and other connected devices (eg, devices carried by pedestrians).
  • the safety system helps the driver to reduce the risk of accidents by guiding alternative courses of action to make driving safer.
  • the next step will be remote control or a self-driven vehicle.
  • This requires very reliable and very fast communication between different self-driving vehicles and between the vehicle and the infrastructure.
  • self-driving vehicles will perform all driving activities, and drivers will focus only on traffic situations that the vehicle itself cannot identify.
  • the technical requirements of self-driving vehicles demand ultra-low latency and ultra-high reliability to raise traffic safety to levels beyond human reach.
  • Smart cities and smart homes will be embedded in high-density wireless sensor networks.
  • the distributed network of intelligent sensors will identify the conditions for cost- and energy-efficient maintenance of the city or home. Similar settings can be made for each home.
  • Temperature sensors, window and heating controllers, burglar alarms and consumer electronics are all connected wirelessly. Many of these sensors are typically low data rates, low power and low cost. However, for example, real-time HD video may be required in certain types of devices for surveillance.
  • the smart grid interconnects these sensors using digital information and communication technologies to collect information and act accordingly. This information can include supplier and consumer behavior, so smart grids can improve efficiency, reliability, economics, production sustainability and distribution of fuels like electricity in an automated way.
  • the smart grid can be viewed as another sensor network with low latency.
  • the health sector has many applications that can benefit from mobile communications.
  • the communication system can support telemedicine that provides clinical care from a distance. This can help reduce barriers to distance and improve access to medical services that are not continuously available in remote rural areas. It is also used to save lives in critical care and emergency situations.
  • a wireless sensor network based on mobile communication can provide remote monitoring and sensors for parameters such as heart rate and blood pressure.
  • Wireless and mobile communications are becoming increasingly important in industrial applications. Wiring is expensive to install and maintain, so the possibility of replacing cables with reconfigurable wireless links is an attractive opportunity in many industries. Achieving this, however, requires that the wireless connection operate with cable-like delay, reliability, and capacity, and that its management be simplified. Low latency and very low error probability are new requirements that 5G needs to support.
  • Logistics and freight tracking are important use cases for mobile communications that enable the tracking of inventory and packages from anywhere using location-based information systems.
  • Logistics and freight tracking use cases typically require low data rates, but require wide range and reliable location information.
  • Embodiment (s) relates to a method of generating and / or transmitting a DMRS sequence that enables to distinguish between sidelink terminals.
  • A method by which a terminal transmits a sidelink signal in a wireless communication system includes: generating a DMRS for a physical sidelink shared channel (PSSCH); and transmitting the PSSCH and the DMRS for the PSSCH, wherein a cyclic redundancy check (CRC) used for a physical sidelink control channel (PSCCH) associated with the PSSCH is used when generating a sequence related to the DMRS for the PSSCH.
  • CRC cyclic redundancy check
  • A sidelink terminal device in a wireless communication system includes: a memory; and a processor coupled to the memory, wherein the processor generates a DMRS for a physical sidelink shared channel (PSSCH) and transmits the PSSCH and the DMRS for the PSSCH, and a CRC used for a physical sidelink control channel (PSCCH) associated with the PSSCH is used when generating a sequence related to the DMRS for the PSSCH.
  • PSSCH physical sidelink shared channel
  • PSCCH physical sidelink control channel
  • CRC cyclic redundancy check
  • the CRC may be used instead of ID information used for initialization of a sequence related to DMRS for the PSSCH.
  • the ID information may be, for example, one of the scrambling identities N_ID^(0) and N_ID^(1).
  • Initialization of a sequence related to the DMRS for the PSSCH is performed with the following initial value: c_init = (2^17 · (N_symb^slot · n_{s,f}^μ + l + 1) · (2 · N_ID + 1) + 2 · N_ID) mod 2^31,
  • where N_symb^slot is the number of symbols in a slot, n_{s,f}^μ is the slot number within a frame, l is the OFDM symbol number, and N_ID is taken from the least significant bits (LSBs) of the CRC; when N_ID is 16 bits, the 10 LSBs used for the cell ID and the next 6 LSBs of the CRC may be included.
  • LSB least significant bit
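As an illustrative sketch (not the patent's normative definition), the initialization described above can be computed as follows. The constants mirror the general form of NR DMRS sequence initialization; the 16-bit mask for the CRC LSBs and the 14-symbol slot length are assumptions.

```python
# Hedged sketch of the DMRS sequence initialization described above.
# The PSCCH CRC's least significant bits stand in for the ID that would
# normally seed the PSSCH DMRS scrambling sequence. The 16-bit width of
# n_id and the default 14 symbols per slot are illustrative assumptions.

def dmrs_c_init(crc: int, slot_number: int, ofdm_symbol: int,
                symbols_per_slot: int = 14) -> int:
    """c_init = (2^17 (N_symb * n_slot + l + 1)(2 N_ID + 1) + 2 N_ID) mod 2^31."""
    n_id = crc & 0xFFFF  # LSBs of the CRC replace the usual ID information
    return ((2 ** 17) * (symbols_per_slot * slot_number + ofdm_symbol + 1)
            * (2 * n_id + 1) + 2 * n_id) % (2 ** 31)
```

The seed then feeds the pseudo-random (Gold) sequence generator that produces the DMRS symbols, so two terminals whose PSCCH CRCs differ obtain distinguishable DMRS sequences.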
  • the PSCCH may include control information related to reception of the PSSCH.
  • the DMRS for the PSCCH may be independent of the DMRS for the PSSCH.
  • a fixed value for each terminal or a preset value for each resource pool may be used.
  • the terminal may be an autonomous vehicle or may be included in an autonomous vehicle.
  • FIG. 1 is a diagram illustrating a vehicle according to embodiment (s).
  • FIG. 2 is a control block diagram of a vehicle according to embodiment (s).
  • FIG. 3 is a control block diagram of an autonomous driving device according to the embodiment (s).
  • FIG. 4 is a block diagram of an autonomous driving device according to embodiment (s).
  • FIG. 5 is a diagram showing the interior of a vehicle according to the embodiment (s).
  • FIG. 6 is a block diagram referred to for describing a cabin system for a vehicle according to embodiment (s).
  • FIG. 7 shows the structure of an LTE system to which the embodiment(s) can be applied.
  • FIG. 8 shows a radio protocol architecture for a user plane to which embodiment (s) can be applied.
  • FIG. 9 shows a radio protocol structure for a control plane to which the embodiment (s) can be applied.
  • FIG. 10 shows the structure of an NR system to which the embodiment (s) can be applied.
  • FIG. 11 shows functional division between NG-RAN and 5GC to which the embodiment(s) can be applied.
  • FIG. 12 shows the structure of a radio frame of NR to which the embodiment (s) can be applied.
  • FIG. 13 shows a slot structure of an NR frame to which the embodiment (s) can be applied.
  • For selection of a transmission resource, a method in which a transmission resource for the next packet is also reserved may be used.
  • FIG. 16 shows an example of physical layer processing at a transmitting side to which embodiment(s) can be applied.
  • FIG. 17 shows an example of physical layer processing at a receiving side to which embodiment(s) can be applied.
  • FIGS. 19 and 20 are flowcharts for describing embodiment(s).
  • FIGS. 21 to 27 are views for explaining various devices to which the embodiment(s) can be applied.
  • FIG. 1 is a view showing a vehicle according to an embodiment.
  • a vehicle 10 is defined as a transportation means traveling on a road or a track.
  • the vehicle 10 is a concept including an automobile, a train, and a motorcycle.
  • the vehicle 10 may be a concept including both an internal combustion engine vehicle having an engine as a power source, a hybrid vehicle having an engine and an electric motor as a power source, and an electric vehicle having an electric motor as a power source.
  • the vehicle 10 may be a vehicle owned by an individual.
  • the vehicle 10 may be a shared vehicle.
  • the vehicle 10 may be an autonomous vehicle.
  • FIG. 2 is a control block diagram of a vehicle according to an embodiment.
  • the vehicle 10 includes a user interface device 200, an object detection device 210, a communication device 220, a driving operation device 230, a main ECU 240, a drive control device 250, an autonomous driving device 260, a sensing unit 270, and a location data generating device 280.
  • Each of these devices may be implemented as an electronic device that generates electrical signals and exchanges electrical signals with the others.
  • the user interface device 200 is a device for communication between the vehicle 10 and a user.
  • the user interface device 200 may receive user input and provide information generated in the vehicle 10 to the user.
  • the vehicle 10 may implement a user interface (UI) or a user experience (UX) through the user interface device 200.
  • the user interface device 200 may include an input device, an output device, and a user monitoring device.
  • the object detection device 210 may generate information about an object outside the vehicle 10.
  • the information on the object may include at least one of information on the presence or absence of the object, location information of the object, distance information between the vehicle 10 and the object, and relative speed information between the vehicle 10 and the object.
  • the object detection device 210 may detect an object outside the vehicle 10.
  • the object detection device 210 may include at least one sensor capable of detecting an object outside the vehicle 10.
  • the object detection device 210 may include at least one of a camera, a radar, a lidar, an ultrasonic sensor, and an infrared sensor.
  • the object detection device 210 may provide data on an object generated based on a sensing signal generated by the sensor to at least one electronic device included in the vehicle.
  • the camera may generate information about an object outside the vehicle 10 using an image.
  • the camera may include at least one lens, at least one image sensor, and at least one processor that is electrically connected to the image sensor and processes a received signal, and generates data for an object based on the processed signal.
  • the camera may be at least one of a mono camera, a stereo camera, and an AVM (Around View Monitoring) camera.
  • the camera may acquire position information of an object, distance information to an object, or relative speed information of an object using various image processing algorithms. For example, the camera may acquire distance information and relative speed information from an acquired image based on the change in object size over time. For example, the camera may acquire distance information and relative speed information through a pinhole model, road surface profiling, and the like. For example, the camera may acquire distance information and relative speed information based on disparity information in a stereo image obtained from a stereo camera.
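The pinhole-model distance estimate mentioned above can be sketched as follows. The function names and the assumption of a known real-world object height are illustrative, not from the source.

```python
# Pinhole camera model: an object of real height H (meters), imaged at
# height h (pixels) by a camera with focal length f (pixels), lies at
# distance d = f * H / h. Relative speed follows from the change in the
# estimated distance between two frames. All names are illustrative.

def pinhole_distance(focal_px: float, real_height_m: float,
                     image_height_px: float) -> float:
    """Estimate distance to an object from its apparent size in the image."""
    return focal_px * real_height_m / image_height_px

def relative_speed(dist_prev_m: float, dist_curr_m: float, dt_s: float) -> float:
    """Relative speed between frames; negative when the object approaches."""
    return (dist_curr_m - dist_prev_m) / dt_s
```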
  • the camera may be mounted at a position capable of securing a field of view (FOV) in the vehicle to photograph the exterior of the vehicle.
  • the camera may be placed close to the front windshield, in the interior of the vehicle, to obtain an image in front of the vehicle.
  • the camera can be placed around the front bumper or radiator grille.
  • the camera may be placed close to the rear glass, in the interior of the vehicle, to obtain an image behind the vehicle.
  • the camera can be placed around the rear bumper, trunk or tailgate.
  • the camera may be disposed close to at least one of the side windows in the interior of the vehicle in order to acquire an image on the side of the vehicle.
  • the camera may be disposed around a side mirror, fender, or door.
  • the radar may generate information about an object outside the vehicle 10 using radio waves.
  • the radar may include an electromagnetic wave transmitter, an electromagnetic wave receiver, and at least one processor that is electrically connected to the transmitter and receiver, processes a received signal, and generates data for an object based on the processed signal.
  • Radar may be implemented as a pulse radar or a continuous wave radar according to the principle of radio wave emission.
  • Among continuous wave radar methods, the radar may be implemented by a frequency modulated continuous wave (FMCW) method or a frequency shift keying (FSK) method according to the signal waveform.
  • FMCW Frequency Modulated Continuous Wave
  • FSK Frequency Shift Keying
  • the radar detects an object based on a time of flight (TOF) method or a phase-shift method via electromagnetic waves, and can detect the position of the detected object, the distance to the detected object, and its relative speed.
  • the radar can be placed at an appropriate location outside the vehicle to detect objects located in the front, rear or side of the vehicle.
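The FMCW method mentioned above can be illustrated with a minimal range calculation. The relation and parameter values below are a textbook sketch, not taken from the source.

```python
# FMCW range estimation sketch: a chirp sweeping bandwidth B over duration
# T produces, for a target at range R, a beat frequency f_b = 2*B*R/(c*T),
# so R = c * f_b * T / (2 * B). Parameter values are illustrative.

SPEED_OF_LIGHT = 299_792_458.0  # m/s

def fmcw_range(beat_freq_hz: float, chirp_duration_s: float,
               bandwidth_hz: float) -> float:
    """Range of a target from the measured beat frequency of an FMCW chirp."""
    return SPEED_OF_LIGHT * beat_freq_hz * chirp_duration_s / (2.0 * bandwidth_hz)
```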
  • the lidar may generate information about an object outside the vehicle 10 using laser light.
  • the lidar may include an optical transmitter, an optical receiver, and at least one processor that is electrically connected to the transmitter and receiver, processes a received signal, and generates data for an object based on the processed signal.
  • the lidar may be implemented in a time of flight (TOF) method or a phase-shift method.
  • the lidar may be implemented as a driven or non-driven type. When implemented as a driven type, the lidar is rotated by a motor and can detect objects around the vehicle 10. When implemented as a non-driven type, the lidar can detect an object located within a predetermined range of the vehicle by light steering.
  • the vehicle 10 may include a plurality of non-driven lidars.
  • the lidar detects an object based on a time-of-flight (TOF) method or a phase-shift method using laser light as a medium, and can detect the position of the detected object, the distance to the detected object, and its relative speed.
  • the lidar can be placed at an appropriate location outside of the vehicle to detect objects located in front, rear, or side of the vehicle.
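The time-of-flight principle used by the lidar (and radar) above reduces to a one-line computation; this is a generic sketch, not the patent's method.

```python
# Time-of-flight distance: the round-trip time t of an emitted pulse
# gives distance d = c * t / 2 (half, because the pulse travels to the
# object and back).

SPEED_OF_LIGHT = 299_792_458.0  # m/s

def tof_distance(round_trip_time_s: float) -> float:
    """Distance to a reflecting object from the pulse round-trip time."""
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0
```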
  • the communication device 220 can exchange signals with a device located outside the vehicle 10.
  • the communication device 220 may exchange signals with at least one of an infrastructure (eg, a server, a broadcasting station), another vehicle, and a terminal.
  • the communication device 220 may include at least one of a transmitting antenna, a receiving antenna, a radio frequency (RF) circuit capable of implementing various communication protocols, and an RF element to perform communication.
  • RF radio frequency
  • a communication device may exchange signals with an external device based on C-V2X (Cellular V2X) technology.
  • C-V2X technology may include LTE-based sidelink communication and/or NR-based sidelink communication. Details related to C-V2X will be described later.
  • DSRC Dedicated Short Range Communications
  • WAVE Wireless Access in Vehicular Environment
  • ITS Intelligent Transport System
  • DSRC technology (or the WAVE standard) may use the 5.9 GHz frequency band and may be a communication scheme with a data transmission rate of 3 Mbps to 27 Mbps.
  • IEEE 802.11p technology can be combined with IEEE 1609 technology to support DSRC technology (or WAVE standard).
  • the communication device can exchange signals with an external device using either C-V2X technology or DSRC technology.
  • the communication device may exchange signals with an external device by hybridizing C-V2X technology and DSRC technology.
  • the driving manipulation device 230 is a device that receives a user input for driving. In the manual mode, the vehicle 10 may be driven based on a signal provided by the driving manipulation device 230.
  • the driving manipulation device 230 may include a steering input device (eg, steering wheel), an acceleration input device (eg, an accelerator pedal), and a brake input device (eg, a brake pedal).
  • the main ECU 240 may control the overall operation of at least one electronic device provided in the vehicle 10.
  • the driving control device 250 is a device that electrically controls various vehicle driving devices in the vehicle 10.
  • the drive control device 250 may include a power train drive control device, a chassis drive control device, a door / window drive control device, a safety device drive control device, a lamp drive control device, and an air conditioning drive control device.
  • the power train drive control device may include a power source drive control device and a transmission drive control device.
  • the chassis drive control device may include a steering drive control device, a brake drive control device, and a suspension drive control device.
  • the safety device drive control device may include a seat belt drive control device for seat belt control.
  • the drive control device 250 includes at least one electronic control unit (ECU).
  • ECU electronic control unit
  • the drive control device 250 may control the vehicle driving device based on the signal received from the autonomous driving device 260.
  • the control device 250 may control the power train, steering device, and brake device based on signals received from the autonomous driving device 260.
  • the autonomous driving device 260 may generate a path for autonomous driving based on the acquired data.
  • the autonomous driving device 260 may generate a driving plan for driving along the generated route.
  • the autonomous driving device 260 may generate a signal for controlling the movement of the vehicle according to the driving plan.
  • the autonomous driving device 260 may provide the generated signal to the driving control device 250.
  • the autonomous driving device 260 may implement at least one ADAS (Advanced Driver Assistance System) function.
  • ADAS Advanced Driver Assistance System
  • ACC Adaptive Cruise Control
  • AEB Autonomous Emergency Braking
  • FCW Forward Collision Warning
  • LKA Lane Keeping Assist
  • LCA Lane Change Assist
  • TSR Traffic Sign Recognition
  • TSA Traffic Sign Assist System
  • NV Night Vision System
  • DSM Driver Status Monitoring
  • TJA Traffic Jam Assist
  • the autonomous driving device 260 may perform a switching operation from an autonomous driving mode to a manual driving mode, or from a manual driving mode to an autonomous driving mode. For example, the autonomous driving device 260 may switch the mode of the vehicle 10 from the autonomous driving mode to the manual driving mode, or from the manual driving mode to the autonomous driving mode, based on a signal received from the user interface device 200.
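The mode-switching behavior described above can be sketched minimally as a two-state toggle; the type and function names are illustrative, not from the source.

```python
# Minimal sketch of the driving-mode switch: the autonomous driving
# device toggles the vehicle between autonomous and manual modes when a
# signal arrives from the user interface device. Names are illustrative.
from enum import Enum

class DrivingMode(Enum):
    AUTONOMOUS = 1
    MANUAL = 2

def switch_mode(current: DrivingMode) -> DrivingMode:
    """Return the other driving mode, as triggered by a UI signal."""
    return (DrivingMode.MANUAL if current is DrivingMode.AUTONOMOUS
            else DrivingMode.AUTONOMOUS)
```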
  • the sensing unit 270 may sense the state of the vehicle.
  • the sensing unit 270 may include at least one of an inertial measurement unit (IMU) sensor, a collision sensor, a wheel sensor, a speed sensor, an inclination sensor, a weight sensor, a heading sensor, a position module, a vehicle forward/reverse sensor, a battery sensor, a fuel sensor, a tire sensor, a steering sensor, a temperature sensor, a humidity sensor, an ultrasonic sensor, an illuminance sensor, and a pedal position sensor.
  • the inertial measurement unit (IMU) sensor may include at least one of an acceleration sensor, a gyro sensor, and a magnetic sensor.
  • the sensing unit 270 may generate state data of the vehicle based on signals generated by at least one sensor.
  • the vehicle status data may be information generated based on data detected by various sensors provided inside the vehicle.
  • the sensing unit 270 may generate vehicle attitude data, vehicle motion data, vehicle yaw data, vehicle roll data, vehicle pitch data, vehicle collision data, vehicle direction data, vehicle angle data, vehicle speed data, vehicle acceleration data, vehicle tilt data, vehicle forward/reverse data, vehicle weight data, battery data, fuel data, tire pressure data, vehicle interior temperature data, vehicle interior humidity data, steering wheel rotation angle data, vehicle exterior illumination data, pressure data applied to the accelerator pedal, pressure data applied to the brake pedal, and the like.
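As a hypothetical sketch, the state data listed above could be carried in a simple container like the following; the field names are illustrative, not taken from the source.

```python
# Hypothetical container for a subset of the state data the sensing unit
# 270 may generate; field names are illustrative.
from dataclasses import dataclass

@dataclass
class VehicleStateData:
    speed_mps: float = 0.0
    yaw_deg: float = 0.0
    roll_deg: float = 0.0
    pitch_deg: float = 0.0
    weight_kg: float = 0.0
    fuel_level_pct: float = 0.0
    tire_pressure_kpa: float = 0.0
    interior_temp_c: float = 0.0
    accel_pedal_pressure: float = 0.0
    brake_pedal_pressure: float = 0.0
```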
  • the location data generation device 280 may generate location data of the vehicle 10.
  • the location data generating device 280 may include at least one of a Global Positioning System (GPS) and a Differential Global Positioning System (DGPS).
  • GPS Global Positioning System
  • DGPS Differential Global Positioning System
  • the location data generation device 280 may generate location data of the vehicle 10 based on a signal generated from at least one of GPS and DGPS.
  • the location data generating apparatus 280 may correct the location data based on at least one of an IMU (Inertial Measurement Unit) of the sensing unit 270 and a camera of the object detection apparatus 210.
  • the location data generating device 280 may be referred to as a Global Navigation Satellite System (GNSS).
  • GNSS Global Navigation Satellite System
  • the vehicle 10 may include an internal communication system 50.
  • a plurality of electronic devices included in the vehicle 10 may exchange signals through the internal communication system 50. Signals may include data.
  • the internal communication system 50 may use at least one communication protocol (eg, CAN, LIN, FlexRay, MOST, Ethernet).
  • FIG. 3 is a control block diagram of an autonomous driving device according to an embodiment.
  • the autonomous driving device 260 may include a memory 140, a processor 170, an interface unit 180, and a power supply unit 190.
  • the memory 140 is electrically connected to the processor 170.
  • the memory 140 may store basic data for the unit, control data for controlling the operation of the unit, and input / output data.
  • the memory 140 may store data processed by the processor 170.
  • the memory 140 may be implemented in hardware as at least one of a ROM, a RAM, an EPROM, a flash drive, and a hard drive.
  • the memory 140 may store various data for the overall operation of the autonomous driving device 260, such as a program for processing or controlling the processor 170.
  • the memory 140 may be implemented integrally with the processor 170. According to an embodiment, the memory 140 may be classified as a sub configuration of the processor 170.
  • the interface unit 180 may exchange signals with wires or wirelessly with at least one electronic device provided in the vehicle 10.
  • the interface unit 180 may exchange signals by wire or wirelessly with at least one of the object detection device 210, the communication device 220, the driving operation device 230, the main ECU 240, the drive control device 250, the sensing unit 270, and the location data generating device 280.
  • the interface unit 180 may be configured as at least one of a communication module, a terminal, a pin, a cable, a port, a circuit, an element, and a device.
  • the power supply unit 190 may supply power to the autonomous driving device 260.
  • the power supply unit 190 may receive power from a power source (eg, a battery) included in the vehicle 10 and supply power to each unit of the autonomous driving device 260.
  • the power supply unit 190 may be operated according to a control signal provided from the main ECU 240.
  • the power supply unit 190 may include a switched-mode power supply (SMPS).
  • SMPS switched-mode power supply
  • the processor 170 is electrically connected to the memory 140, the interface unit 180, and the power supply unit 190 to exchange signals.
  • the processor 170 may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and electrical units for performing other functions.
  • ASICs application specific integrated circuits
  • DSPs digital signal processors
  • DSPDs digital signal processing devices
  • PLDs programmable logic devices
  • FPGAs field programmable gate arrays
  • the processor 170 may be driven by power provided from the power supply unit 190.
  • the processor 170 may receive data, process data, generate a signal, and provide a signal while the power is supplied by the power supply unit 190.
  • the processor 170 may receive information from another electronic device in the vehicle 10 through the interface unit 180.
  • the processor 170 may provide a control signal to another electronic device in the vehicle 10 through the interface unit 180.
  • the autonomous driving device 260 may include at least one printed circuit board (PCB).
  • the memory 140, the interface unit 180, the power supply unit 190, and the processor 170 may be electrically connected to a printed circuit board.
  • the processor 170 may perform a reception operation.
  • the processor 170 may receive data from at least one of the object detection device 210, the communication device 220, the sensing unit 270, and the location data generating device 280 through the interface unit 180.
  • the processor 170 may receive object data from the object detection device 210.
  • the processor 170 may receive HD map data from the communication device 220.
  • the processor 170 may receive vehicle status data from the sensing unit 270.
  • the processor 170 may receive location data from the location data generating device 280.
  • the processor 170 may perform a processing / judgment operation.
  • the processor 170 may perform a processing / judgment operation based on the driving situation information.
  • the processor 170 may perform a processing / determination operation based on at least one of object data, HD map data, vehicle status data, and location data.
  • the processor 170 may generate driving plan data.
  • the processor 170 may generate electronic horizon data.
  • the electronic horizon data is understood as driving plan data within a range from the point where the vehicle 10 is located to the horizon.
  • the horizon may be understood as a point a predetermined distance ahead of the point where the vehicle 10 is located, along a preset driving route, or as a point that the vehicle 10 can reach after a predetermined time.
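As one possible reading of the two definitions above, the horizon can be computed either from a fixed look-ahead distance or from a look-ahead time. The function names and the constant-speed assumption below are illustrative, not from the patent.

```python
def horizon_by_distance(position_m: float, lookahead_m: float) -> float:
    """Horizon as a point a predetermined distance ahead along the route."""
    return position_m + lookahead_m

def horizon_by_time(position_m: float, speed_mps: float, lookahead_s: float) -> float:
    """Horizon as the point the vehicle can reach after a predetermined time,
    assuming (for illustration only) constant speed along the route."""
    return position_m + speed_mps * lookahead_s

# A vehicle at 100 m along the route, travelling 20 m/s with a 10 s horizon:
print(horizon_by_time(100.0, 20.0, 10.0))  # 300.0
```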
  • the electronic horizon data may include horizon map data and horizon pass data.
  • the horizon map data may include at least one of topology data, road data, HD map data, and dynamic data.
  • the horizon map data may include a plurality of layers.
  • the horizon map data may include a first layer matching topology data, a second layer matching road data, a third layer matching HD map data, and a fourth layer matching dynamic data.
  • the horizon map data may further include static object data.
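The four-layer horizon map described above can be sketched as a simple container; the class name and field contents are placeholders, not from the patent.

```python
from dataclasses import dataclass, field

@dataclass
class HorizonMap:
    """Illustrative container for the four horizon-map layers described above."""
    topology: dict = field(default_factory=dict)  # first layer: road-center graph
    road: dict = field(default_factory=dict)      # second layer: slope/curvature/speed limits
    hd_map: dict = field(default_factory=dict)    # third layer: lane-level detail
    dynamic: dict = field(default_factory=dict)   # fourth layer: construction, traffic, ...

m = HorizonMap(road={"speed_limit_kph": 80})
print(m.road["speed_limit_kph"])  # 80
```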
  • Topology data can be described as a map created by connecting road centers.
  • the topology data is suitable for roughly indicating the position of the vehicle, and may mainly take the form of data used in navigation systems for drivers.
  • the topology data may be understood as data on road information from which information on a lane is excluded.
  • the topology data may be generated based on data received from an external server through the communication device 220.
  • the topology data may be based on data stored in at least one memory provided in the vehicle 10.
  • the road data may include at least one of road slope data, road curvature data, and road speed data.
  • the road data may further include overtaking prohibited section data.
  • Road data may be based on data received from an external server through the communication device 220.
  • Road data may be based on data generated by the object detection device 210.
  • the HD map data may include detailed lane-level topology information of each road, connection information of each lane, and feature information for vehicle localization (eg, traffic signs, lane marking/properties, road furniture, etc.).
  • the HD map data may be based on data received from an external server through the communication device 220.
  • the dynamic data may include various dynamic information that may be generated on the road.
  • the dynamic data may include construction information, variable speed lane information, road surface condition information, traffic information, moving object information, and the like.
  • the dynamic data may be based on data received from an external server through the communication device 220.
  • the dynamic data may be based on data generated by the object detection device 210.
  • the processor 170 may provide map data within a range from a point where the vehicle 10 is located to a horizon.
  • the horizon pass data may be described as a trajectory that the vehicle 10 can take within the range from the point where the vehicle 10 is located to the horizon.
  • the horizon pass data may include data indicating the relative probability of selecting each road at a decision point (eg, fork, junction, intersection, etc.). The relative probability can be calculated based on the time it takes to reach the final destination. For example, at a decision point, if the time to reach the final destination via the first road is shorter than via the second road, the probability of selecting the first road may be calculated to be higher than the probability of selecting the second road.
  • Horizon pass data may include a main pass and a sub pass.
  • the main pass can be understood as an orbit connecting roads with a relatively high probability of being selected.
  • the sub-pass can be branched at at least one decision point on the main pass.
  • the sub-pass may be understood as an orbit connecting at least one road having a relatively low probability of being selected from at least one decision point on the main pass.
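The probability calculation described above (a road that reaches the destination faster gets a higher selection probability) can be sketched as follows. Making the probability inversely proportional to the time to the final destination is one illustrative choice, not the patent's formula.

```python
def selection_probabilities(times_to_destination: dict) -> dict:
    """Relative probability of choosing each road at a decision point,
    here taken as inversely proportional to the time needed to reach
    the final destination via that road (an illustrative rule)."""
    inverse = {road: 1.0 / t for road, t in times_to_destination.items()}
    total = sum(inverse.values())
    return {road: w / total for road, w in inverse.items()}

# road_1 reaches the destination in 300 s, road_2 in 600 s:
probs = selection_probabilities({"road_1": 300.0, "road_2": 600.0})
main_pass = max(probs, key=probs.get)  # the main pass connects the more probable roads
print(main_pass)  # road_1
```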
  • the processor 170 may perform a control signal generation operation.
  • the processor 170 may generate a control signal based on the electronic horizon data.
  • the processor 170 may generate at least one of a powertrain control signal, a brake device control signal, and a steering device control signal based on the electronic horizon data.
  • the processor 170 may transmit the generated control signal to the driving control device 250 through the interface unit 180.
  • the driving control device 250 may transmit a control signal to at least one of the power train 251, the brake device 252, and the steering device 253.
  • FIG. 5 is a view showing the interior of a vehicle according to an embodiment.
  • FIG. 6 is a block diagram referred to for describing a vehicle cabin system according to an embodiment.
  • a vehicle cabin system 300 (hereinafter, a cabin system) may be defined as a convenience system for a user using the vehicle 10.
  • Cabin system 300 may be described as a top-level system including display system 350, cargo system 355, seat system 360 and payment system 365.
  • the cabin system 300 may include a main controller 370, a memory 340, an interface unit 380, a power supply unit 390, an input device 310, an imaging device 320, a communication device 330, a display system 350, a cargo system 355, a seat system 360, and a payment system 365.
  • the cabin system 300 may further include other components in addition to the components described herein, or may not include some of the components described.
  • the main controller 370 may be electrically connected to the input device 310, the communication device 330, the display system 350, the cargo system 355, the seat system 360, and the payment system 365 to exchange signals.
  • the main controller 370 may control the input device 310, the communication device 330, the display system 350, the cargo system 355, the seat system 360 and the payment system 365.
  • the main controller 370 may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and electrical units for performing other functions.
  • the main controller 370 may be configured with at least one sub-controller. According to an embodiment, the main controller 370 may include a plurality of sub-controllers. Each of the plurality of sub-controllers may individually control devices and systems included in the grouped cabin system 300. The devices and systems included in the cabin system 300 may be grouped by function or grouped based on a seat that can be seated.
  • the main controller 370 may include at least one processor 371. Although the main controller 370 is illustrated in FIG. 6 as including one processor 371, it may also include a plurality of processors. The processor 371 may be classified as one of the sub-controllers described above.
  • the processor 371 may receive a signal, information, or data from a user terminal through the communication device 330.
  • the user terminal may transmit signals, information, or data to the cabin system 300.
  • the processor 371 may specify a user based on image data received from at least one of an internal camera and an external camera included in the imaging device.
  • the processor 371 may specify a user by applying an image processing algorithm to image data.
  • the processor 371 may compare the information received from the user terminal with image data to identify the user.
  • the information may include at least one of the user's route information, body information, passenger information, luggage information, location information, preferred content information, preferred food information, disability information, and usage history information.
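The user-identification step above (comparing information received from the user terminal with image data) can be sketched minimally; the dictionary keys and the matching rule are assumptions for illustration.

```python
from typing import Optional

def identify_user(terminal_info: dict, detected_users: list) -> Optional[dict]:
    """Illustrative matching step: compare an identifier carried in the
    information received from the user terminal against users recognized
    in the camera image data by an image processing algorithm."""
    for candidate in detected_users:
        if candidate.get("user_id") == terminal_info.get("user_id"):
            return candidate
    return None  # no match: the user could not be specified

match = identify_user({"user_id": "u42"}, [{"user_id": "u7"}, {"user_id": "u42"}])
print(match)  # {'user_id': 'u42'}
```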
  • the main controller 370 may include an artificial intelligence agent 372.
  • the AI agent 372 may perform machine learning based on data obtained through the input device 310.
  • the AI agent 372 may control at least one of the display system 350, the cargo system 355, the seat system 360, and the payment system 365 based on the machine-learned results.
  • the memory 340 is electrically connected to the main controller 370.
  • the memory 340 may store basic data for the unit, control data for controlling the operation of the unit, and input / output data.
  • the memory 340 may store data processed by the main controller 370.
  • the memory 340 may be implemented in hardware as at least one of a ROM, a RAM, an EPROM, a flash drive, and a hard drive.
  • the memory 340 may store various data for operations of the cabin system 300 in general, such as a program for processing or controlling the main controller 370.
  • the memory 340 may be implemented integrally with the main controller 370.
  • the interface unit 380 may exchange signals by wire or wirelessly with at least one electronic device provided in the vehicle 10.
  • the interface unit 380 may be configured as at least one of a communication module, terminal, pin, cable, port, circuit, element, and device.
  • the power supply unit 390 may supply power to the cabin system 300.
  • the power supply unit 390 may receive power from a power source (eg, a battery) included in the vehicle 10 and supply power to each unit of the cabin system 300.
  • the power supply unit 390 may operate according to a control signal provided from the main controller 370.
  • the power supply unit 390 may be implemented as a switched-mode power supply (SMPS).
  • the cabin system 300 may include at least one printed circuit board (PCB).
  • the main controller 370, the memory 340, the interface unit 380, and the power supply unit 390 may be mounted on at least one printed circuit board.
  • the input device 310 may receive a user input.
  • the input device 310 may convert a user input into an electrical signal.
  • the electrical signal converted by the input device 310 may be converted into a control signal and provided to at least one of the display system 350, the cargo system 355, the seat system 360, and the payment system 365.
  • the main controller 370 or at least one processor included in the cabin system 300 may generate a control signal based on an electrical signal received from the input device 310.
  • the input device 310 may include at least one of a touch input unit, a gesture input unit, a mechanical input unit, and a voice input unit.
  • the touch input unit may convert a user's touch input into an electrical signal.
  • the touch input unit may include at least one touch sensor to sense a user's touch input.
  • the touch input unit may be formed integrally with at least one display included in the display system 350 to implement a touch screen.
  • the touch screen may provide an input interface and an output interface between the cabin system 300 and the user.
  • the gesture input unit may convert a user's gesture input into an electrical signal.
  • the gesture input unit may include at least one of an infrared sensor and an image sensor for sensing a user's gesture input.
  • the gesture input unit may detect a user's 3D gesture input.
  • the gesture input unit may include a light output unit outputting a plurality of infrared light or a plurality of image sensors.
  • the gesture input unit may detect a user's 3D gesture input through a time of flight (TOF) method, a structured light method, or a disparity method.
  • the mechanical input unit may convert a user's physical input (eg, pressing or rotating) through a mechanical device into an electrical signal.
  • the mechanical input unit may include at least one of a button, a dome switch, a jog wheel, and a jog switch. Meanwhile, the gesture input unit and the mechanical input unit may be integrally formed.
  • the input device 310 may include a jog dial device that includes a gesture sensor and is removably formed in a portion of a surrounding structure (eg, at least one of a seat, an armrest, and a door).
  • When the jog dial device is flush with the surrounding structure, the jog dial device may function as a gesture input unit. When the jog dial device protrudes relative to the surrounding structure, the jog dial device may function as a mechanical input unit.
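The jog dial's two operating modes above can be sketched as a simple position-based switch; the tolerance threshold is an assumption for illustration.

```python
def jog_dial_mode(protrusion_mm: float, tolerance_mm: float = 0.5) -> str:
    """Illustrative mode selection for the jog dial device: flush with the
    surrounding structure -> gesture input; protruding -> mechanical input.
    The tolerance threshold is an assumption, not from the text."""
    return "gesture" if abs(protrusion_mm) <= tolerance_mm else "mechanical"

print(jog_dial_mode(0.0))  # gesture
print(jog_dial_mode(5.0))  # mechanical
```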
  • the voice input unit may convert the user's voice input into an electrical signal.
  • the voice input unit may include at least one microphone.
  • the voice input unit may include a beamforming microphone.
  • the imaging device 320 may include at least one camera.
  • the imaging device 320 may include at least one of an internal camera and an external camera.
  • the internal camera can capture an image in the cabin.
  • the external camera can take a video outside the vehicle.
  • the internal camera can acquire an image in the cabin.
  • the imaging device 320 may include at least one internal camera. It is preferable that the imaging device 320 includes a number of cameras corresponding to the number of people who can board.
  • the imaging device 320 may provide an image acquired by an internal camera.
  • At least one processor included in the main controller 370 or the cabin system 300 may detect a user's motion based on an image acquired by the internal camera, generate a signal based on the detected motion, and provide the signal to at least one of the display system 350, the cargo system 355, the seat system 360, and the payment system 365.
  • the external camera may acquire an image outside the vehicle.
  • the imaging device 320 may include at least one external camera. It is preferable that the imaging device 320 includes a number of cameras corresponding to the number of boarding doors.
  • the imaging device 320 may provide an image acquired by an external camera.
  • At least one processor included in the main controller 370 or the cabin system 300 may acquire user information based on an image acquired by an external camera.
  • At least one processor included in the main controller 370 or the cabin system 300 may authenticate a user based on the user information, or may acquire the user's body information (eg, height information, weight information, etc.), passenger information, luggage information, and the like.
  • the communication device 330 can exchange signals wirelessly with an external device.
  • the communication device 330 may exchange signals with an external device through a network or directly exchange signals with an external device.
  • the external device may include at least one of a server, a mobile terminal, and another vehicle.
  • the communication device 330 may exchange signals with at least one user terminal.
  • the communication device 330 may include at least one of an antenna, a radio frequency (RF) circuit capable of implementing at least one communication protocol, and an RF device to perform communication.
  • the communication device 330 may use a plurality of communication protocols.
  • the communication device 330 may switch the communication protocol according to the distance from the mobile terminal.
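The distance-based protocol switch above can be sketched as follows; the specific protocols and the range threshold are assumptions for illustration.

```python
def pick_protocol(distance_m: float, short_range_m: float = 10.0) -> str:
    """Illustrative protocol switch based on the distance between the
    communication device and the mobile terminal. The protocol names and
    the 10 m threshold are assumptions, not from the patent."""
    return "short_range" if distance_m <= short_range_m else "cellular"

print(pick_protocol(2.0))   # short_range
print(pick_protocol(50.0))  # cellular
```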
  • a communication device may exchange signals with an external device based on C-V2X (Cellular V2X) technology.
  • C-V2X technology may include sidelink communication based on LTE and/or sidelink communication based on NR. Details related to C-V2X will be described later.
  • the communication device may exchange signals with an external device based on Dedicated Short Range Communications (DSRC) technology or the Wireless Access in Vehicular Environment (WAVE) standard. DSRC (or the WAVE standard) is a communication specification for providing Intelligent Transport System (ITS) services through short-range dedicated communication between vehicle-mounted devices, or between a roadside device and a vehicle-mounted device.
  • DSRC technology may use a frequency in the 5.9 GHz band and may be a communication method having a data transmission rate of 3 Mbps to 27 Mbps.
  • IEEE 802.11p technology can be combined with IEEE 1609 technology to support DSRC technology (or WAVE standard).
  • the communication device can exchange signals with an external device using only one of C-V2X technology and DSRC technology.
  • the communication device may exchange signals with an external device by using C-V2X technology and DSRC technology in combination.
  • the display system 350 may display graphic objects.
  • the display system 350 may include at least one display device.
  • the display system 350 may include a first display device 410 that can be used by passengers in common and a second display device 420 that can be used individually.
  • the first display device 410 may include at least one display 411 for outputting visual content.
  • the display 411 included in the first display device 410 may be implemented as at least one of a flat panel display, a curved display, a rollable display, and a flexible display.
  • the first display device 410 may include a first display 411 positioned at the rear of a seat and formed to be retractable into and out of the cabin, and a first mechanism for moving the first display 411.
  • the first display 411 may be disposed in a slot formed in the seat main frame so that it can be put in and out.
  • the first display device 410 may further include a flexible area adjustment mechanism.
  • the first display may be formed to be flexible, and the flexible area of the first display may be adjusted according to a user's location.
  • the first display device 410 may include a second display formed on a ceiling in the cabin and rollable, and a second mechanism for winding or unwinding the second display.
  • the second display may be formed to enable screen output on both sides.
  • the first display device 410 may include a third display formed on the ceiling in the cabin so as to be flexible, and a third mechanism for bending or unfolding the third display.
  • the display system 350 may further include at least one processor that provides a control signal to at least one of the first display device 410 and the second display device 420.
  • the processor included in the display system 350 may generate a control signal based on a signal received from at least one of the main controller 370, the input device 310, the imaging device 320, and the communication device 330.
  • the display area of the display included in the first display device 410 may be divided into a first area 411a and a second area 411b.
  • the first area 411a may be defined as a content display area.
  • the first area 411a may display at least one of entertainment content (eg, movies, sports, shopping, music, etc.), a video conference screen, a food menu, and graphic objects corresponding to an augmented reality screen.
  • the first area 411a may display a graphic object corresponding to the driving condition information of the vehicle 10.
  • the driving situation information may include at least one of object information, navigation information, and vehicle state information outside the vehicle.
  • the object information outside the vehicle may include information on the presence or absence of an object, location information of the object, distance information between the vehicle 10 and the object, and relative speed information between the vehicle 10 and the object.
  • the navigation information may include at least one of map information, set destination information, route information according to the destination setting, information on various objects on the route, lane information, and current location information of the vehicle.
  • the vehicle status information may include at least one of vehicle attitude information, vehicle speed information, vehicle tilt information, vehicle weight information, vehicle direction information, vehicle battery information, vehicle fuel information, vehicle tire pressure information, vehicle steering information, vehicle room temperature information, vehicle room humidity information, pedal position information, and vehicle engine temperature information.
  • the second area 411b may be defined as a user interface area.
  • the second area 411b may output an artificial intelligence agent screen.
  • the second region 411b may be located in an area divided by a seat frame. In this case, the user can view the content displayed on the second area 411b between the plurality of seats.
  • the first display device 410 may provide hologram content.
  • the first display device 410 may provide hologram content for a plurality of users so that only the user who requested the content can watch the content.
  • the second display device 420 may include at least one display 421.
  • the second display device 420 may provide the display 421 at a position where only individual passengers can check the display contents.
  • the display 421 may be disposed on the arm rest of the seat.
  • the second display device 420 may display a graphic object corresponding to the user's personal information.
  • the second display device 420 may include a number of displays 421 corresponding to the number of people who can board.
  • the second display device 420 may form a mutual layer structure with the touch sensor or be integrally formed, thereby realizing a touch screen.
  • the second display device 420 may display a graphic object for receiving a user input of seat adjustment or room temperature adjustment.
  • the cargo system 355 may provide the product to the user according to the user's request.
  • the cargo system 355 may be operated based on an electrical signal generated by the input device 310 or the communication device 330.
  • the cargo system 355 may include a cargo box.
  • the cargo box can be concealed in a portion of the lower part of a seat while products are loaded in it.
  • the cargo box may be exposed into the cabin.
  • the user can select a required product among items loaded in the exposed cargo box.
  • the cargo system 355 may include a sliding moving mechanism and a product pop-up mechanism for exposing the cargo box according to user input.
  • the cargo system 355 may include a plurality of cargo boxes to provide various types of products. A weight sensor for determining whether each product has been provided may be incorporated in the cargo box.
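The weight-sensor check above (determining whether a product has been provided from the cargo box) can be sketched as a before/after weight comparison; the noise threshold is an assumption for illustration.

```python
def product_taken(weight_before_g: float, weight_after_g: float,
                  threshold_g: float = 5.0) -> bool:
    """Illustrative use of the cargo-box weight sensor: a drop in the
    measured weight beyond a small noise threshold indicates that a
    product has been provided. The threshold value is an assumption."""
    return (weight_before_g - weight_after_g) > threshold_g

print(product_taken(1200.0, 950.0))   # True  (a 250 g product was taken)
print(product_taken(1200.0, 1199.0))  # False (within sensor noise)
```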
  • the seat system 360 can provide a user with a customized seat.
  • the seat system 360 can be operated based on an electrical signal generated by the input device 310 or the communication device 330.
  • the seat system 360 may adjust at least one element of the seat based on the acquired user body data.
  • the seat system 360 may include a user detection sensor (eg, a pressure sensor) to determine whether the user is seated.
  • the seat system 360 may include a plurality of seats on which a plurality of users can each sit. Any one of the plurality of seats may be disposed to face at least another one. At least two users inside the cabin can sit facing each other.
  • the payment system 365 may provide a payment service to the user.
  • the payment system 365 may be operated based on an electrical signal generated by the input device 310 or the communication device 330.
  • the payment system 365 may calculate a price for at least one service used by the user and request that the calculated price be paid.
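The fare calculation above (the payment system totals the price of the services the user used and requests payment) can be sketched minimally; the service names and unit prices are placeholders.

```python
def total_price(services_used: dict, price_table: dict) -> int:
    """Illustrative fare calculation for the payment system: sum the unit
    price of each service the user consumed. Names and prices are
    placeholders, not from the patent."""
    return sum(price_table[name] * qty for name, qty in services_used.items())

# One movie and two snacks at placeholder prices:
print(total_price({"movie": 1, "snack": 2}, {"movie": 5000, "snack": 1500}))  # 8000
```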
  • a wireless communication system is a multiple access system that supports communication with multiple users by sharing available system resources (eg, bandwidth, transmission power, etc.).
  • Examples of the multiple access system include a code division multiple access (CDMA) system, a frequency division multiple access (FDMA) system, a time division multiple access (TDMA) system, an orthogonal frequency division multiple access (OFDMA) system, a single carrier frequency division multiple access (SC-FDMA) system, and a multi-carrier frequency division multiple access (MC-FDMA) system.
  • a sidelink refers to a communication method in which a direct link is established between user equipments (UEs) to directly transmit or receive voice or data between terminals without going through a base station (BS).
  • the sidelink is considered as one way to relieve the burden on the base station caused by rapidly increasing data traffic.
  • V2X (vehicle-to-everything) refers to a communication technology for exchanging information with other vehicles, pedestrians, and objects equipped with infrastructure through wired/wireless communication.
  • V2X can be divided into four types: vehicle-to-vehicle (V2V), vehicle-to-infrastructure (V2I), vehicle-to-network (V2N), and vehicle-to-pedestrian (V2P).
  • V2X communication may be provided through a PC5 interface and / or a Uu interface.
  • CDMA may be implemented with a radio technology such as universal terrestrial radio access (UTRA) or CDMA2000.
  • TDMA may be implemented with radio technologies such as global system for mobile communications (GSM) / general packet radio service (GPRS) / enhanced data rates for GSM evolution (EDGE).
  • GSM global system for mobile communications
  • GPRS general packet radio service
  • EDGE enhanced data rates for GSM evolution
  • OFDMA can be implemented with radio technologies such as Institute of Electrical and Electronics Engineers (IEEE) 802.11 (Wi-Fi), IEEE 802.16 (WiMAX), IEEE 802.20, and evolved UTRA (E-UTRA).
  • IEEE 802.16m is an evolution of IEEE 802.16e, and provides backward compatibility with a system based on IEEE 802.16e.
  • UTRA is part of a universal mobile telecommunications system (UMTS).
  • 3rd generation partnership project (3GPP) long term evolution (LTE) is part of evolved UMTS (E-UMTS) using evolved-UMTS terrestrial radio access (E-UTRA), employing OFDMA in the downlink and SC-FDMA in the uplink.
  • LTE-A (advanced) is an evolution of 3GPP LTE.
  • 5G NR is the successor to LTE-A, and is a new clean-slate type mobile communication system with characteristics such as high performance, low latency, and high availability. 5G NR can utilize all available spectrum resources, from low frequency bands below 1 GHz to mid-frequency bands from 1 GHz to 10 GHz, and high frequency (millimeter wave) bands above 24 GHz.
  • LTE-A or 5G NR is mainly described, but the technical idea is not limited thereto.
  • The E-UTRAN (Evolved-UMTS Terrestrial Radio Access Network) may also be referred to as an LTE (Long Term Evolution)/LTE-A system.
  • the E-UTRAN includes a base station (BS) 20 that provides a control plane and a user plane to the terminal 10.
  • the terminal 10 may be fixed or mobile, and may be referred to by other terms such as mobile station (MS), user terminal (UT), subscriber station (SS), mobile terminal (MT), and wireless device.
  • the base station 20 refers to a fixed station that communicates with the terminal 10, and may be referred to by other terms such as evolved-NodeB (eNB), base transceiver system (BTS), and access point.
  • the base stations 20 may be connected to each other through an X2 interface.
  • the base station 20 is connected to an Evolved Packet Core (EPC) 30 through the S1 interface; more specifically, to a mobility management entity (MME) through S1-MME and to a serving gateway (S-GW) through S1-U.
  • the EPC 30 is composed of an MME, an S-GW, and a P-GW (Packet Data Network Gateway).
  • the MME has access information of the terminal or information about the capability of the terminal, and this information is mainly used for mobility management of the terminal.
  • the S-GW is a gateway having the E-UTRAN as an endpoint, and the P-GW is a gateway having the PDN as an endpoint.
  • the layers of the radio interface protocol between the terminal and the network can be divided into L1 (first layer), L2 (second layer), and L3 (third layer), based on the lower three layers of the Open System Interconnection (OSI) reference model widely known in communication systems.
  • the physical layer belonging to the first layer provides an information transfer service using a physical channel
  • the radio resource control (RRC) layer located in the third layer serves to control radio resources between the terminal and the network.
  • the RRC layer exchanges RRC messages between the terminal and the base station.
  • FIG. 8 shows radio protocol architectures for the user plane and the control plane to which the present invention can be applied.
  • the user plane is a protocol stack for transmitting user data
  • the control plane is a protocol stack for transmitting control signals.
  • a physical layer provides an information transmission service to an upper layer using a physical channel.
  • the physical layer is connected to the MAC (Medium Access Control) layer, which is its upper layer, through transport channels. Data moves between the MAC layer and the physical layer through the transport channels. Transport channels are classified according to how and with what characteristics data is transmitted over the radio interface.
  • the physical channel can be modulated by an orthogonal frequency division multiplexing (OFDM) method, and utilizes time and frequency as radio resources.
  • the MAC layer provides a service to a higher level RLC (radio link control) layer through a logical channel.
  • the MAC layer provides a mapping function from a plurality of logical channels to a plurality of transport channels.
  • the MAC layer provides a logical channel multiplexing function by mapping a plurality of logical channels to a single transport channel.
  • the MAC sub-layer provides data transmission services on logical channels.
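The MAC multiplexing function above (several logical channels mapped onto one transport channel) can be sketched as follows; tagging each SDU with its logical channel id is a simplified stand-in for the real MAC subheader.

```python
def multiplex(logical_channel_sdus: dict) -> list:
    """Illustrative MAC multiplexing: MAC SDUs arriving on several logical
    channels are combined into a single transport-channel payload, each
    tagged with its logical channel id (a stand-in for the MAC subheader)."""
    transport_pdu = []
    for lcid, sdus in logical_channel_sdus.items():
        for sdu in sdus:
            transport_pdu.append((lcid, sdu))
    return transport_pdu

# Two logical channels multiplexed onto one transport-channel PDU:
pdu = multiplex({1: [b"rrc-msg"], 3: [b"data1", b"data2"]})
print(len(pdu))  # 3
```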
  • the RLC layer performs concatenation, segmentation and reassembly of RLC SDUs.
  • In order to guarantee the various quality of service (QoS) required by a radio bearer (RB), the RLC layer provides three operating modes: transparent mode (TM), unacknowledged mode (UM), and acknowledged mode (AM).
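The three RLC modes above can be sketched as an enumeration with a simplified per-bearer mode choice; the decision rule is an illustration, not the 3GPP configuration procedure.

```python
from enum import Enum

class RlcMode(Enum):
    TM = "transparent"     # no RLC headers, pass-through
    UM = "unacknowledged"  # segmentation/reassembly, no retransmission
    AM = "acknowledged"    # adds ARQ retransmissions for reliability

def mode_for_bearer(needs_reliability: bool, needs_segmentation: bool) -> RlcMode:
    """Illustrative per-bearer QoS mode choice (a simplification)."""
    if needs_reliability:
        return RlcMode.AM
    return RlcMode.UM if needs_segmentation else RlcMode.TM

print(mode_for_bearer(True, True).name)  # AM
```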
  • the RRC layer is responsible for control of logical channels, transport channels, and physical channels in connection with configuration, re-configuration, and release of radio bearers.
  • RB means a logical path provided by the first layer (PHY layer) and the second layer (MAC layer, RLC layer, PDCP layer) for data transmission between the terminal and the network.
  • the functions of the Packet Data Convergence Protocol (PDCP) layer in the user plane include the transfer of user data, header compression, and ciphering.
  • the functions of the Packet Data Convergence Protocol (PDCP) layer in the control plane include transmission of control plane data and encryption / integrity protection.
  • the establishment of RB means a process of defining characteristics of a radio protocol layer and a channel to provide a specific service, and setting each specific parameter and operation method.
  • the RB can be divided into two types: a signaling radio bearer (SRB) and a data radio bearer (DRB).
  • SRB is used as a path for transmitting RRC messages in the control plane
  • DRB is used as a path for transmitting user data in the user plane.
  • When an RRC connection is established between the RRC layer of the UE and the RRC layer of the E-UTRAN, the UE is in the RRC_CONNECTED state; otherwise, it is in the RRC_IDLE state.
  • the RRC_INACTIVE state is further defined, and the terminal in the RRC_INACTIVE state can release the connection with the base station while maintaining the connection with the core network.
  • Downlink transport channels for transmitting data from a network to a terminal include a broadcast channel (BCH) for transmitting system information and a downlink shared channel (SCH) for transmitting user traffic or control messages.
  • Traffic or control messages of a downlink multicast or broadcast service may be transmitted through a downlink SCH or may be transmitted through a separate downlink multicast channel (MCH).
  • an uplink transmission channel for transmitting data from a terminal to a network includes a random access channel (RACH) for transmitting an initial control message and an uplink shared channel (SCH) for transmitting user traffic or a control message.
  • Logical channels that are located above the transport channels and are mapped to them include the Broadcast Control Channel (BCCH), Paging Control Channel (PCCH), Common Control Channel (CCCH), Multicast Control Channel (MCCH), and Multicast Traffic Channel (MTCH).
  • the physical channel is composed of several OFDM symbols in the time domain and several sub-carriers in the frequency domain.
  • One sub-frame is composed of a plurality of OFDM symbols in the time domain.
  • the resource block is a resource allocation unit, and is composed of a plurality of OFDM symbols and a plurality of sub-carriers.
  • each subframe may use specific subcarriers of specific OFDM symbols (eg, the first OFDM symbol) of a corresponding subframe for a physical downlink control channel (PDCCH), that is, an L1 / L2 control channel.
  • FIG. 10 shows the structure of an NR system to which the present invention can be applied.
  • the NG-RAN may include a gNB and / or eNB that provides a user plane and control plane protocol termination to a terminal.
  • 10 illustrates a case in which only the gNB is included.
  • the gNB and the eNB are connected to each other through the Xn interface.
  • the gNB and the eNB are connected to the 5G Core Network (5GC) through the NG interface; more specifically, to the access and mobility management function (AMF) through the NG-C interface and to the user plane function (UPF) through the NG-U interface.
  • 11 shows a functional division between NG-RAN and 5GC to which the present invention can be applied.
  • the gNB can provide functions such as inter-cell radio resource management (Inter-Cell RRM), radio bearer management (RB Control), connection mobility control, radio admission control, measurement configuration & provision, dynamic resource allocation, and the like.
  • AMF can provide functions such as NAS security and idle state mobility processing.
  • UPF may provide functions such as mobility anchoring and PDU processing.
  • the Session Management Function (SMF) may provide functions such as terminal IP address allocation and PDU session control.
  • FIG. 12 shows the structure of an NR radio frame to which the present invention can be applied.
  • radio frames may be used for uplink and downlink transmission in NR.
  • the radio frame has a length of 10 ms, and may be defined as two 5 ms half-frames (HFs).
  • the half-frame may include five 1 ms subframes (Subframe, SF).
  • the subframe may be divided into one or more slots, and the number of slots in the subframe may be determined according to subcarrier spacing (SCS).
  • Each slot may include 14 or 12 OFDM(A) symbols according to the cyclic prefix (CP): in the case of a normal CP, each slot includes 14 symbols; in the case of an extended CP, each slot includes 12 symbols.
  • the symbol may include an OFDM symbol (or CP-OFDM symbol) and an SC-FDMA symbol (or DFT-s-OFDM symbol).
  • Table 1 shows the number of symbols per slot (N_symb^slot), the number of slots per frame (N_slot^{frame,u}), and the number of slots per subframe (N_slot^{subframe,u}) according to the SCS setting (u) when a normal CP is used.
  • Table 2 illustrates the number of symbols per slot, the number of slots per frame, and the number of slots per subframe according to the SCS when an extended CP is used.
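The quantities in Tables 1-2 can be sketched numerically. The sketch below assumes the standard NR numerology rule (SCS = 15 kHz * 2^u, a 10 ms frame of ten 1 ms subframes); it is an illustration, not normative:

```python
def nr_frame_structure(mu, extended_cp=False):
    """Return (symbols per slot, slots per subframe, slots per frame)
    for SCS setting mu, i.e., SCS = 15 kHz * 2**mu."""
    symbols_per_slot = 12 if extended_cp else 14   # CP-dependent symbol count
    slots_per_subframe = 2 ** mu                   # slots in one 1 ms subframe
    slots_per_frame = 10 * slots_per_subframe      # 10 ms frame = 10 subframes
    return symbols_per_slot, slots_per_subframe, slots_per_frame
```

For example, with u = 2 (60 kHz SCS) and a normal CP, each subframe contains 4 slots of 14 symbols each.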
  • OFDM(A) numerology (e.g., SCS, CP length) may be configured differently among cells; accordingly, the (absolute time) duration of a time resource composed of the same number of symbols (e.g., a subframe, slot, or TTI, collectively referred to as a Time Unit (TU) for convenience) may differ between the cells.
  • FIG. 13 shows a slot structure of an NR frame to which the present invention can be applied.
  • a slot includes a plurality of symbols in the time domain. For example, in the case of a normal CP, one slot includes 14 symbols, but in the case of an extended CP, one slot may include 12 symbols. Alternatively, in the case of a normal CP, one slot includes 7 symbols, but in the case of an extended CP, one slot may include 6 symbols.
  • the carrier wave includes a plurality of subcarriers in the frequency domain.
  • a resource block (RB) may be defined as a plurality of (eg, 12) consecutive subcarriers in the frequency domain.
  • the BWP (Bandwidth Part) may be defined as a plurality of consecutive (P) RBs in the frequency domain, and may correspond to one numerology (eg, SCS, CP length, etc.).
  • the carrier may include up to N (eg, 5) BWPs. Data communication can be performed through an activated BWP.
  • Each element may be referred to as a resource element (RE) in the resource grid, and one complex symbol may be mapped.
  • a method in which a transmission resource of a next packet is also reserved may be used for selection of a transmission resource.
  • FIG. 14 shows an example in which a transmission resource to which the present invention can be applied is selected.
  • two transmissions per MAC PDU may be performed.
  • a resource for retransmission may be reserved with a certain time gap.
  • Within the sensing window, the terminal can identify transmission resources reserved by other terminals or resources being used by other terminals, exclude them from the selection window, and then randomly select a resource from among the remaining resources with less interference.
  • the UE may decode a PSCCH including information on a period of reserved resources, and measure PSSCH RSRP from resources periodically determined based on the PSCCH.
  • the UE may exclude resources in which the PSSCH RSRP value exceeds a threshold within a selection window. Thereafter, the terminal may randomly select the sidelink resource among the remaining resources in the selection window.
  • the terminal may determine the resources with low interference (for example, resources corresponding to the lower 20%) by measuring the received signal strength indication (RSSI) of periodic resources in the sensing window. And, the terminal may randomly select a sidelink resource from among the resources included in the selection window among the periodic resources. For example, when the UE fails to decode the PSCCH, the UE may use the above method.
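The sensing-based selection described above can be sketched as follows. This is a simplified illustration, not the full specified procedure; the resource identifiers, the rsrp/rssi maps, and the 20% fraction are assumptions for the example:

```python
import random

def select_sidelink_resource(candidates, rsrp=None, threshold=0.0,
                             rssi=None, keep_fraction=0.2):
    """Mode-2-style selection sketch: exclude candidates whose measured PSSCH
    RSRP exceeds the threshold, then pick one of the rest at random. If PSCCH
    decoding failed (rsrp is None), keep only the lowest-RSSI fraction instead."""
    if rsrp is not None:
        remaining = [c for c in candidates if rsrp[c] <= threshold]
    else:
        ranked = sorted(candidates, key=lambda c: rssi[c])
        remaining = ranked[:max(1, int(len(ranked) * keep_fraction))]
    return random.choice(remaining)
```

The random choice among the surviving candidates mirrors the random selection within the selection window described above.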
  • PSCCH and PSSCH are transmitted by FDM; that is, the PSCCH and the PSSCH can be transmitted on different frequency resources within the same time resource.
  • the PSCCH and the PSSCH may not be directly adjacent as shown in FIG. 15 (a), and the PSCCH and the PSSCH may be directly adjacent as shown in FIG. 15 (b).
  • the basic unit of transmission is a sub-channel.
  • the sub-channel may be a resource unit having one or more RB sizes on a frequency axis on a predetermined time resource (eg, time resource unit).
  • the number of RBs included in the sub-channel (i.e., the size of the sub-channel) and the starting position of the sub-channel on the frequency axis may be indicated by higher layer signaling.
  • the embodiment of FIG. 15 may be applied to NR sidelink resource allocation mode 1 or mode 2.
  • In vehicle-to-vehicle communication, a periodic message type CAM (Cooperative Awareness Message), an event-triggered message type DENM (Decentralized Environmental Notification Message), and the like can be transmitted.
  • the CAM may include basic vehicle information such as dynamic state information of a vehicle such as direction and speed, vehicle static data such as dimensions, external lighting conditions, and route history.
  • the size of CAM can be 50-300 bytes.
  • CAM is broadcast, and latency should be less than 100ms.
  • DENM may be a message generated in the event of a vehicle breakdown or an accident.
  • the size of DENM can be smaller than 3000 bytes, and any vehicle within the transmission range can receive the message. At this time, DENM may have a higher priority than CAM.
  • Carrier reselection for V2X / sidelink communication may be performed at the MAC layer based on CBR (Channel Busy Ratio) of the set carriers and PPPP (Prose Per-Packet Priority) of the V2X message to be transmitted.
  • CBR may mean the portion of sub-channels in the resource pool in which the S-RSSI measured by the UE is detected to exceed a preset threshold.
  • PPPP associated with each logical channel may exist, and the setting of the PPPP value should reflect the latency required for both the terminal and the base station.
  • the UE may select one or more of the candidate carriers in increasing order from the lowest CBR.
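The CBR measurement and the CBR-ordered carrier selection above can be sketched as follows (the per-sub-channel S-RSSI values and the threshold in the example are illustrative assumptions):

```python
def channel_busy_ratio(s_rssi_per_subchannel, threshold):
    """CBR sketch: fraction of sub-channels in the resource pool whose
    measured S-RSSI exceeds a preconfigured threshold."""
    busy = sum(1 for v in s_rssi_per_subchannel if v > threshold)
    return busy / len(s_rssi_per_subchannel)

def reselect_carriers(cbr_by_carrier, num_needed):
    """Carrier reselection sketch: pick candidate carriers in increasing
    order of CBR, i.e., starting from the lowest (least busy) CBR."""
    ranked = sorted(cbr_by_carrier, key=cbr_by_carrier.get)
    return ranked[:num_needed]
```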
  • a data unit to which the present invention can be applied may be subjected to physical layer processing at the transmitting side before being transmitted through the radio interface, and the radio signal carrying the data unit may be subjected to physical layer processing at the receiving side.
  • 16 shows an example of physical layer processing at a transmission side to which the present invention can be applied.
  • Table 3 may indicate a mapping relationship between an uplink transport channel and a physical channel
  • Table 4 may indicate a mapping relationship between uplink control channel information and a physical channel.
  • Table 5 may indicate a mapping relationship between a downlink transport channel and a physical channel
  • Table 6 may indicate a mapping relationship between downlink control channel information and a physical channel.
  • Table 7 may indicate a mapping relationship between a sidelink transmission channel and a physical channel
  • Table 8 may indicate a mapping relationship between sidelink control channel information and a physical channel.
  • the transmitting side may perform encoding on a transport block (TB).
  • Data and control streams from the MAC layer can be encoded to provide transport and control services over a radio transmission link in the PHY layer.
  • TB from the MAC layer can be encoded as a codeword at the transmitting side.
  • the channel coding scheme may be a combination of error detection, error correction, rate matching, interleaving, and the mapping of transport channels or control information onto physical channels (and their separation from physical channels at the receiving side).
  • the following channel coding scheme can be used for different types of transport channels and different types of control information.
  • the channel coding scheme for each transport channel type may be as shown in Table 9.
  • the channel coding scheme for each control information type may be as shown in Table 10.
  • Table 10:
  Control information | Channel coding scheme
  DCI | Polar code
  SCI | Polar code
  UCI | Block code, Polar code
  • the transmitting side may attach a cyclic redundancy check (CRC) sequence to TB.
  • Through the CRC, the transmitting side can provide error detection to the receiving side.
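CRC attachment can be illustrated with generic binary long division. This is a sketch; the 3-bit generator polynomial used in the example is illustrative, not the actual 24-bit TB CRC polynomial of the NR specification:

```python
def crc_bits(bits, poly_bits, crc_len):
    """Compute a CRC as the remainder of binary long division; poly_bits is
    the generator polynomial, MSB first, including its leading 1."""
    reg = list(bits) + [0] * crc_len       # message shifted left by crc_len
    for i in range(len(bits)):
        if reg[i]:                         # reduce whenever the leading bit is 1
            for j, p in enumerate(poly_bits):
                reg[i + j] ^= p
    return reg[-crc_len:]                  # remainder = CRC parity bits
```

The receiving side can re-run the same division over the received bits (message plus attached CRC); a non-zero remainder indicates an error.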
  • the transmitting side may be a transmitting terminal, and the receiving side may be a receiving terminal.
  • a communication device may use LDPC codes to encode / decode UL-SCH, DL-SCH, and the like.
  • the NR system can support two LDPC base graphs (i.e., two LDPC base matrices): LDPC base graph 1, optimized for large TBs, and LDPC base graph 2, optimized for small TBs.
  • the transmitting side may select LDPC base graph 1 or 2 based on the size and coding rate (R) of TB.
  • the coding rate may be indicated by a modulation coding scheme (MCS) index (I_MCS).
  • MCS index may be dynamically provided to the UE by PDCCH scheduling PUSCH or PDSCH.
  • the MCS index may be dynamically provided to the UE by a PDCCH that reinitializes or reactivates UL configured grant 2 or DL SPS.
  • the MCS index may be provided to the UE by RRC signaling associated with UL configured grant type 1.
  • the transmitting side may divide the TB with the CRC attached into a plurality of code blocks. And, the transmitting side may attach additional CRC sequences to each code block.
  • the maximum code block size for LDPC base graph 1 and LDPC base graph 2 may be 8448 bits and 3840 bits, respectively. If the TB with the CRC attached is not larger than the maximum code block size for the selected LDPC base graph, the transmitting side may encode the TB with the CRC attached using the selected LDPC base graph.
  • the transmitting side can encode each code block of the TB using the selected LDPC base graph, and the LDPC-coded blocks can be individually rate matched.
  • Code block concatenation can be performed to generate a codeword for transmission on the PDSCH or PUSCH.
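The base graph selection and code block segmentation steps above can be sketched as follows (the thresholds follow the NR rule of thumb; the segmentation arithmetic is simplified relative to the exact specification):

```python
import math

def select_ldpc_base_graph(tbs_bits, code_rate):
    """BG2 for small TBs / low code rates, BG1 otherwise
    (thresholds 292 / 3824 / 0.25 / 0.67 as in the NR rule of thumb)."""
    if (tbs_bits <= 292 or code_rate <= 0.25
            or (tbs_bits <= 3824 and code_rate <= 0.67)):
        return 2
    return 1

def num_code_blocks(tb_plus_crc_bits, base_graph):
    """Segmentation sketch: split when the CRC-attached TB exceeds the max
    code block size (8448 bits for BG1, 3840 bits for BG2); each resulting
    code block then carries an additional 24-bit CRC."""
    kcb = 8448 if base_graph == 1 else 3840
    if tb_plus_crc_bits <= kcb:
        return 1
    return math.ceil(tb_plus_crc_bits / (kcb - 24))
```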
  • For PDSCH, up to two codewords (i.e., up to two TBs) can be transmitted on the PDSCH simultaneously.
  • PUSCH may be used for transmission of UL-SCH data and layer 1 and / or 2 control information.
  • layer 1 and / or 2 control information may be multiplexed with a codeword for UL-SCH data.
  • the transmitting side may perform scrambling and modulation on the codeword.
  • the bits of the codeword can be scrambled and modulated to produce a block of complex-valued modulation symbols.
  • the transmitting side may perform layer mapping.
  • the complex value modulation symbols of the codeword may be mapped to one or more multiple input multiple output (MIMO) layers.
  • Codewords can be mapped to up to four layers.
  • the PDSCH can carry two codewords, so the PDSCH can support up to 8-layer transmission.
  • the PUSCH can support a single codeword, and thus the PUSCH can support up to 4-layer transmission.
  • the transmitting side may perform precoding conversion.
  • the downlink transmission waveform may be conventional OFDM using a cyclic prefix (CP).
  • the uplink transmission waveform may be conventional OFDM using a CP, with a transform precoding function (i.e., discrete Fourier transform (DFT) spreading) that can be enabled or disabled.
  • transform precoding can be selectively applied.
  • the transform precoding may be to spread uplink data in a special way to reduce the peak-to-average power ratio (PAPR) of the waveform.
  • the transform precoding may be a form of DFT. That is, the NR system can support two options for the uplink waveform. One may be CP-OFDM (same as DL waveform), and the other may be DFT-s-OFDM. Whether the UE should use CP-OFDM or DFT-s-OFDM can be determined by the base station through RRC parameters.
  • the transmitting side may perform subcarrier mapping. Layers can be mapped to antenna ports.
  • For downlink, a transparent (non-codebook based) mapping may be supported, and how beamforming or MIMO precoding is performed may be transparent to the UE.
  • both non-codebook based mapping and codebook based mapping can be supported.
  • the transmitting side can map complex-valued modulation symbols to subcarriers in the resource blocks allocated to the physical channel.
  • the transmitting side may perform OFDM modulation.
  • the communication device at the transmitting side adds a CP and performs an IFFT, thereby generating a time-continuous OFDM baseband signal for antenna port p and subcarrier spacing setting u for OFDM symbol l in the TTI for the physical channel.
  • That is, for each OFDM symbol, the communication device at the transmitting side can perform an Inverse Fast Fourier Transform (IFFT) on the complex-valued modulation symbols mapped to the resource blocks of the corresponding OFDM symbol, and can add a CP to the IFFT-ed signal to generate the OFDM baseband signal.
  • the transmitting side may perform up-conversion.
  • the transmitting communication device can up-convert the OFDM baseband signal for antenna port p, subcarrier spacing setting u, and OFDM symbol l to the carrier frequency f0 of the cell to which the physical channel is assigned.
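A minimal sketch of the OFDM modulation step (subcarrier mapping, IFFT, CP insertion); the FFT size and CP length below are illustrative values, not the NR ones:

```python
import numpy as np

def ofdm_modulate(symbols, n_fft=256, cp_len=18):
    """OFDM modulation sketch: map complex-valued modulation symbols onto
    subcarriers, apply the IFFT, and prepend a cyclic prefix."""
    grid = np.zeros(n_fft, dtype=complex)
    grid[:len(symbols)] = symbols                  # simplistic subcarrier mapping
    time = np.fft.ifft(grid) * np.sqrt(n_fft)      # unitary scaling
    return np.concatenate([time[-cp_len:], time])  # CP = copy of last cp_len samples
```

The CP is a copy of the symbol tail, which is what lets the receiver treat the multipath channel as a circular convolution.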
  • the processors 9011 and 9021 of FIG. 23 may be configured to perform encoding, scrambling, modulation, layer mapping, precoding transformation (for uplink), subcarrier mapping and OFDM modulation.
  • 17 shows an example of physical layer processing at a receiving side to which the present invention can be applied.
  • the physical layer processing at the receiving side may be basically the reverse processing of the physical layer processing at the transmitting side.
  • the receiving side may perform frequency down-conversion.
  • the communication device at the reception side may receive an RF signal having a carrier frequency through an antenna.
  • the transceivers 9013 and 9023 receiving the RF signal at the carrier frequency may downconvert the carrier frequency of the RF signal to the baseband to obtain the OFDM baseband signal.
  • the receiving side may perform OFDM demodulation.
  • the communication device at the receiving side may acquire complex-valued modulation symbols through CP removal and FFT. For example, for each OFDM symbol, the communication device at the receiving side can remove the CP from the OFDM baseband signal, and then perform an FFT on the CP-removed OFDM baseband signal to obtain complex-valued modulation symbols for antenna port p, subcarrier spacing u, and OFDM symbol l.
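The reverse processing can be sketched likewise (CP removal followed by an FFT); the FFT size and CP length are illustrative, matching no particular NR configuration:

```python
import numpy as np

def ofdm_demodulate(samples, n_subcarriers, n_fft=256, cp_len=18):
    """OFDM demodulation sketch (reverse of the transmit side): drop the CP
    samples, apply the FFT, and read back the used subcarriers."""
    grid = np.fft.fft(samples[cp_len:]) / np.sqrt(n_fft)
    return grid[:n_subcarriers]
```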
  • the receiving side may perform subcarrier demapping.
  • Subcarrier demapping may be performed on a complex value modulated symbol to obtain a complex value modulated symbol of a corresponding physical channel.
  • the processor of the terminal may obtain a complex value modulation symbol mapped to a subcarrier belonging to the PDSCH among complex value modulation symbols received in a bandwidth part (BWP).
  • the receiving side may perform transform de-precoding. When transform precoding is enabled for an uplink physical channel, transform de-precoding (e.g., IDFT) can be performed on the complex-valued modulation symbols of the uplink physical channel.
  • For downlink physical channels, and for uplink physical channels for which transform precoding is disabled, transform de-precoding may not be performed.
  • In step S114, the receiving side may perform layer demapping; the complex-valued modulation symbols can be demapped into one or two codewords.
  • the receiving side may perform demodulation and descrambling.
  • the complex value modulation symbol of the codeword can be demodulated and descrambled with bits of the codeword.
  • the receiving side may perform decoding.
  • the codeword can be decoded into TB.
  • LDPC base graph 1 or 2 may be selected based on the size and coding rate (R) of TB.
  • the codeword may include one or more coded blocks. Each coded block may be decoded into a code block with a CRC attached to a selected LDPC base graph or a TB with a CRC attached. If code block segmentation is performed on the TB where the CRC is attached at the transmitting side, the CRC sequence can be removed from each of the code blocks where the CRC is attached, and code blocks can be obtained.
  • the code block may be connected to the TB where the CRC is attached.
  • the TB CRC sequence can be removed from the TB to which the CRC is attached, whereby the TB can be obtained.
  • TB can be delivered to the MAC layer.
  • the processors 9011 and 9021 of FIG. 22 may be configured to perform OFDM demodulation, subcarrier demapping, layer demapping, demodulation, descrambling and decoding.
  • Time and frequency domain resources (e.g., OFDM symbols, subcarriers, and carrier frequencies) related to subcarrier mapping, OFDM modulation, and frequency up/down conversion may be determined based on resource allocation (e.g., an uplink grant or a downlink assignment).
  • Abbreviations: TDMA (time division multiple access), FDMA (frequency division multiple access), ISI (inter-symbol interference), ICI (inter-carrier interference), SLSS (sidelink synchronization signal), MIB-SL-V2X (master information block - sidelink - V2X), RLC (radio link control).
  • a terminal may be synchronized directly to GNSS (global navigation satellite systems), or indirectly to GNSS through a terminal (in network coverage or out of network coverage) that is directly synchronized to GNSS.
  • the UE can calculate the DFN and subframe number using Coordinated Universal Time (UTC) and a (pre)set DFN (Direct Frame Number) offset.
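The DFN derivation can be sketched as follows. The formula shape (10 ms frames counted modulo 1024, 1 ms subframes within the frame) follows the LTE V2X GNSS-synchronization rule; the offset values in the example are illustrative:

```python
def dfn_from_utc(t_ms, offset_dfn_ms=0):
    """Sketch of deriving the DFN and subframe number from UTC time in
    milliseconds and a (pre)configured DFN offset: the DFN counts 10 ms
    frames modulo 1024, the subframe number counts 1 ms within the frame."""
    dfn = ((t_ms - offset_dfn_ms) // 10) % 1024
    subframe = (t_ms - offset_dfn_ms) % 10
    return dfn, subframe
```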
  • the terminal may be synchronized directly with the base station or with other terminals time / frequency synchronized to the base station.
  • the base station may be an eNB or gNB.
  • the terminal receives synchronization information provided by the base station, and may be directly synchronized with the base station. Thereafter, the terminal may provide synchronization information to other adjacent terminals.
  • When base station timing is set as the synchronization reference, the terminal may follow the cell associated with the corresponding frequency (if within cell coverage on that frequency), or the primary cell or the serving cell (if outside cell coverage on that frequency), for synchronization and downlink measurement.
  • the base station may provide synchronization settings for carriers used for V2X / sidelink communication.
  • the terminal may follow the synchronization setting received from the base station. If the terminal does not detect any cell on the carrier used for the V2X / sidelink communication and has not received a synchronization setting from the serving cell, the terminal may follow a preset synchronization setting.
  • the terminal may be synchronized to another terminal that has not directly or indirectly obtained synchronization information from the base station or GNSS.
  • the synchronization source and preference may be preset to the terminal.
  • the synchronization source and preference may be set through a control message provided by the base station.
  • the sidelink synchronization source can be associated with the synchronization priority.
  • the relationship between the synchronization source and synchronization priority can be defined as shown in Table 11.
  • Table 11 is only an example, and the relationship between the synchronization source and the synchronization priority may be defined in various forms.
  • Priority | GNSS-based synchronization | Base station based synchronization (eNB/gNB-based synchronization)
  P0 | GNSS | Base station
  P1 | All terminals synchronized directly to GNSS | All terminals synchronized directly to the base station
  P2 | All terminals indirectly synchronized to GNSS | All terminals indirectly synchronized to the base station
  P3 | All other terminals | GNSS
  P4 | N/A | All terminals synchronized directly to GNSS
  P5 | N/A | All terminals indirectly synchronized to GNSS
  P6 | N/A | All other terminals
  • Whether to use GNSS-based synchronization or base station-based synchronization may be set in advance.
  • the terminal can derive the transmission timing of the terminal from the available synchronization criteria with the highest priority.
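Deriving the transmission timing from the highest-priority available reference can be sketched as follows (the priority labels are illustrative placeholders for the P0-P6 rows of Table 11):

```python
def pick_sync_reference(available, priority_order):
    """Return the available synchronization reference with the highest
    priority (earliest entry in priority_order), or None if none is found."""
    for ref in priority_order:
        if ref in available:
            return ref
    return None
```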
  • GNSS, eNB, and UE may be set/selected as a synchronization reference.
  • In NR, the gNB was introduced; thus the NR gNB can also be a synchronization reference, and it is necessary to determine the synchronization source priority of the gNB.
  • the NR terminal may not implement an LTE synchronization signal detector or may not access the LTE carrier (a non-standalone NR UE). In this situation, the LTE terminal and the NR terminal may have different timings, which is not desirable from the viewpoint of effective allocation of resources.
  • A synchronization source/reference may be defined as the entity that transmits a synchronization signal, or as the synchronization signal itself, used to derive the timing for a UE to transmit/receive sidelink signals or to derive subframe boundaries. If the UE receives a GNSS signal and derives a subframe boundary based on UTC timing derived from the GNSS, the GNSS signal or the GNSS itself may be the synchronization source/reference.
  • the DMRS sequence of PDSCH/PUSCH in NR uses a gold sequence, and the corresponding pseudo-random sequence is initialized by Equation 1 below.
  • [Equation 1] c_init = ( 2^17 * (14*n_{s,f}^u + l + 1) * (2*N_ID^{n_SCID} + 1) + 2*N_ID^{n_SCID} + n_SCID ) mod 2^31
  • the ID used for initialization (N_ID^{n_SCID}) is given as a higher-layer parameter and is UE-specifically configured. The ID uses 16 bits, through which terminals can be distinguished. In addition, the OFDM symbol index (l) and the slot index (n_{s,f}^u) are used to generate a DMRS sequence per OFDM symbol.
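Equation 1 and the underlying length-31 Gold sequence generator can be sketched as follows, following the standard NR pseudo-random sequence definition (x1 seeded with 1, x2 seeded with c_init, first Nc = 1600 outputs discarded); the parameter names are illustrative:

```python
def gold_sequence(c_init, length, nc=1600):
    """Length-31 Gold sequence c(n): x1 seeded with 1, x2 seeded with the
    bits of c_init (LSB first), the first nc outputs discarded."""
    x1 = [1] + [0] * 30
    x2 = [(c_init >> i) & 1 for i in range(31)]
    for n in range(nc + length - 31):
        x1.append((x1[n + 3] + x1[n]) % 2)
        x2.append((x2[n + 3] + x2[n + 2] + x2[n + 1] + x2[n]) % 2)
    return [(x1[n + nc] + x2[n + nc]) % 2 for n in range(length)]

def dmrs_c_init(slot, symbol, n_id, n_scid):
    """Equation-1-style initializer for the DMRS scrambling sequence."""
    return ((2 ** 17) * (14 * slot + symbol + 1) * (2 * n_id + 1)
            + 2 * n_id + n_scid) % (2 ** 31)
```

Because c_init depends on the slot and symbol indices, a distinct sequence is produced per OFDM symbol, as described above.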
  • When the PSSCH DMRS sequence is generated in future NR V2X, it is necessary not only to distinguish between sidelink terminals, but also to distinguish the sequence from the NR Uu DMRS sequence set described above. Therefore, a method for generating a V2X PSSCH DMRS sequence for this purpose will be described below. This method can be similarly applied when generating a PSCCH DMRS sequence.
  • the terminal according to the embodiment (s) may generate a DMRS for a physical sidelink shared channel (PSSCH) (S1901 in FIG. 19), and transmit the DMRS and the PSSCH for the PSSCH (S1902 in FIG. 19).
  • PSSCH physical sidelink shared channel
  • a cyclical redundancy check (CRC) used for a physical sidelink control channel (PSCCH) related to the PSSCH may be used when generating a sequence related to DMRS for the PSSCH.
  • the CRC may be used in place of the ID information used for initialization of the sequence related to the DMRS for the PSSCH, and the ID information may be N_ID^0 or N_ID^1. That is, N_ID^{n_SCID} and/or n_SCID in Equation 1 above can be derived from the CRC for the corresponding SCI (PSCCH).
  • For example, n_SCID can be given by the LSB (least significant bit) of the CRC, and N_ID can be given by 1008 plus the next 6 LSBs of the CRC (when the CRC is 16 bits or longer, 6 bits other than the 10 bits covering the cell ID range can be taken from the LSBs of the CRC for this use).
  • the PSSCH DMRS sequence can be easily randomized.
  • the DMRS of the NR SL can be distinguished from the DMRS of the NR Uu without any special process.
  • the PSCCH may include control information related to reception of the PSSCH.
  • the DMRS for the PSCCH may be independent of the DMRS for the PSSCH.
  • the n_SCID (1 bit) used for MU-MIMO in Equation 1 above can always be assumed to be 0 (or 1).
  • the value may be a fixed value for each terminal or a (pre) configured value for each resource pool.
  • For n_SCID in Equation 1, a value of 0 or 1 may be used depending on whether the UE supports MU-MIMO and whether it is indicated in the SCI.
  • MU-MIMO may be supported. In this case, it is necessary to consider whether the Tx UE uses MU-MIMO in consideration of the channel condition, so a process of reporting the CSI to the base station by the corresponding UE may be necessary.
  • Considering that the PSSCH resource is implicitly linked with the PSCCH, the value used in the PSCCH may also be inherited by the PSSCH; in this case, the n_SCID value can be set in the same way as for the PSCCH.
  • a gold sequence is used, and a pseudo-random sequence can be initialized using Equation (1).
  • a destination group ID in a groupcast and a destination ID in a unicast can be used as ID parameters.
  • To distinguish the ID from the value used for initialization in NR Uu via Equation 1, the 16-bit ID space used in NR Uu can be divided with the NR sidelink, or the NR sidelink can use IDs excluded from NR Uu. For example, among the IDs {0, ..., 65535}, one half {0, ..., 32767} can be used in NR Uu and the other half {32768, ..., 65535} in the NR sidelink.
  • Alternatively, among the IDs {0, ..., 65535}, X% can be used in NR Uu and (100-X)% in the NR sidelink.
  • Alternatively, {0, ..., 65535} can be used in NR Uu while {65536, ..., 65536 + 2^(sum of the number of bits used for the destination group ID and destination ID)} can be used in the NR sidelink.
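One of the ID-space partitioning options above can be sketched as follows; the mapping of a destination (group) ID into the sidelink half of the space is an assumption for illustration, not a specified rule:

```python
def sidelink_dmrs_id(dest_id, scheme="half"):
    """Map a sidelink destination (group) ID into an ID range disjoint from
    NR Uu: either the upper half of the 16-bit space, or an offset range
    starting at 65536 (beyond the 16-bit Uu IDs)."""
    if scheme == "half":
        return 32768 + (dest_id % 32768)   # sidelink uses {32768, ..., 65535}
    return 65536 + dest_id                 # offset range beyond 16 bits
```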
  • When there is no higher-layer signal, N_ID in Equation 1 is the cell ID {0, 1, ..., 1007}; accordingly, some values among {1008, ..., 65535}, i.e., excluding the cell ID range, can be used as (pre)configured values or predefined values for sidelink purposes.
  • It can be assumed that the N_ID value used by the NR sidelink will not be used in NR Uu.
  • Also, the n_SCID used for MU-MIMO in Equation 1 may be randomly selected by the terminal as a value of 0 or 1.
  • parameters may be pre-configured for each antenna port to generate orthogonal sequences in the time domain and / or frequency domain for initialization in the MIMO case.
  • the parameters of Equation 1 may be (pre)configured in the PSSCH for each SCI format. At this time, within the formula, 1 bit can additionally be used to distinguish between MIMO and non-MIMO cases.
  • a parameter capable of generating a DMRS sequence can be used by avoiding other detected terminals.
  • the value of the corresponding bit in the detected terminal may be set differently.
  • the Tx terminal may be configured to generate a DMRS sequence avoiding the value of the corresponding bit used by the terminals detected by the Rx terminal.
  • the value of the A bit of the Tx terminal described above can be configured in the Rx terminal.
  • The method for generating a DMRS sequence in NR V2X can be used in the PSCCH in the same manner as in the PSSCH. That is, when there is no higher-layer signal, since the cell ID in Equation 1 takes values in {0, 1, ..., 1007}, a value among the remaining {1008, ..., 65535} may be a (pre)configured value or a predefined value for sidelink purposes. In this case, it can be assumed that the value used in the NR sidelink will not be used in NR Uu. Also, the value used for MU-MIMO in Equation 1 is 0 or 1, and may be a (pre)configured value or a predefined value.
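One way to realize a (pre)configured initialization value that also carries the 1-bit MU-MIMO distinction could look as follows; the bit layout is an assumption for illustration only, not the actual formula of Equation 1.

```python
def c_init_sidelink(sl_id: int, n_mimo: int) -> int:
    """Pack a (pre)configured sidelink ID and a 1-bit MIMO/non-MIMO
    indicator into a single 31-bit initialization value.
    The (id << 1) | bit layout is a hypothetical choice."""
    assert n_mimo in (0, 1)
    return ((sl_id << 1) | n_mimo) % (1 << 31)
```

With this packing, the same configured ID yields two distinct initialization values depending on the MIMO bit, so paired transmissions remain separable at the receiver.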
  • When setting the length of the control message for each terminal in the control resource pool of the NR V2X sidelink, it may be necessary to vary the length of the control message according to the payload size of the control message or the service type / target coverage.
  • A long / short PUCCH structure used in NR can be used in NR V2X, and a method of generating a DMRS sequence capable of distinguishing them can be considered.
  • Parameters of Equation 1 may be (pre)configured for each PSCCH format. For example, by assigning B bits in Equation 1, the value of the corresponding bits can be set differently for each control resource pool.
  • A control channel of one terminal may be composed of one or multiple basic transmission units, and the number of basic transmission units can be expressed through the aggregation level (AL).
  • In this case, the value of the B bits can be set differently for each AL.
  • For the PSCCH, fixed values can be used for initialization. For example, a fixed value can be designated for each antenna port. As another example, fixed values can be designated for each mode. Alternatively, a combination of the two can be designated.
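The per-resource-pool / per-AL "B bits" idea above can be sketched as a lookup folded into the initialization parameter; the table contents and the 2-bit width are hypothetical.

```python
# Hypothetical (pre)configuration: a 2-bit "B" value chosen per
# (control resource pool, aggregation level) pair.
B_BITS = {
    (0, 1): 0b00, (0, 2): 0b01,
    (1, 1): 0b10, (1, 2): 0b11,
}

def pscch_init_param(base_id: int, pool_id: int, al: int) -> int:
    """Fold the configured B bits into the PSCCH DMRS init parameter
    (illustrative bit packing, not the spec formula)."""
    return (base_id << 2) | B_BITS[(pool_id, al)]
```

Control channels sent in different pools or with different ALs then initialize their DMRS sequences differently, which is the separation the B bits are meant to provide.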
  • the parameter of Equation 1 can be (pre) configured in the PSCCH for initialization. For example, parameters can be (pre) configured for each SCI format used in PSCCH.
  • The base station may apply the above-described DMRS sequence generation method to a terminal (transmitting / receiving terminal).
  • The order in which the illustrated steps S2001 and S2002 are performed can be modified.
  • The receiving terminal may receive information on DMRS sequence generation from the transmitting terminal.
  • The transmitting terminal may receive information on DMRS sequence generation from the receiving terminal.
  • The order of the two steps may be modified, or the two steps may be performed as a single step.
  • The sequence generation step of the transmitting terminal may include detecting neighboring terminals.
  • In step S2003, the transmitting terminal may generate its DMRS sequence using a parameter that distinguishes its DMRS sequence from those of the neighboring terminals.
  • a DMRS sequence of a transmitting terminal may be generated using a parameter for classifying terminals by PSCCH format.
  • PSCCH format information may be transmitted.
  • In step S2004 of FIG. 20, the transmitting terminal transmits data to the receiving terminal through the sidelink (SL).
  • The DMRS sequence used by the transmitting terminal for this transmission is specified / determined through step S2003.
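The transmitting-terminal flow of steps S2001-S2004 (detect neighbors, then pick a distinguishing DMRS parameter, then transmit) can be summarized in a few lines; `choose_tx_id` and the candidate set are hypothetical helpers, not spec entities.

```python
def choose_tx_id(candidates, detected_ids):
    """Pick a DMRS-sequence ID that no detected neighbouring terminal
    is using, mirroring the 'avoid detected terminals' rule above."""
    for cid in candidates:
        if cid not in detected_ids:
            return cid
    raise RuntimeError("no collision-free DMRS ID available")

def tx_flow(candidates, detected_ids):
    """S2001/S2002: exchange or detect neighbour info; S2003: pick a
    distinguishing ID; S2004: transmit using the resulting sequence."""
    tx_id = choose_tx_id(candidates, detected_ids)   # step S2003
    return tx_id                                     # used in step S2004
```

For example, if neighbours are detected using IDs 1 and 2, the transmitter falls back to the next free candidate.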
  • The content of the embodiment(s) is not limited only to direct communication between terminals, and may also be used in uplink or downlink.
  • a base station or a relay node may use the proposed method.
  • The examples of the proposed methods described above may also be included as one of the implementation methods, and thus may be regarded as a kind of proposed method. The above-described proposed schemes may be implemented independently, but may also be implemented as a combination (or merge) of some of them. A rule can be defined such that information on whether the proposed methods are applied (or information on the rules of the proposed methods) is signaled from a base station to a terminal, or from a transmitting terminal to a receiving terminal, through a predefined signal (e.g., a physical layer signal or a higher layer signal).
  • FIG. 21 shows a wireless communication device according to an embodiment.
  • the wireless communication system may include a first device 9010 and a second device 9020.
  • The first device 9010 may be a base station, a network node, a transmitting terminal, a receiving terminal, a wireless device, a wireless communication device, a vehicle, a vehicle equipped with an autonomous driving function, a connected car, a drone (unmanned aerial vehicle, UAV), an artificial intelligence (AI) module, a robot, an augmented reality (AR) device, a virtual reality (VR) device, a mixed reality (MR) device, a hologram device, a public safety device, an MTC device, an IoT device, a medical device, a fintech device (or financial device), a security device, a climate / environment device, a device related to 5G services, or another device related to the fourth industrial revolution.
  • The second device 9020 may be a base station, a network node, a transmitting terminal, a receiving terminal, a wireless device, a wireless communication device, a vehicle, a vehicle equipped with an autonomous driving function, a connected car, a drone (unmanned aerial vehicle, UAV), an artificial intelligence (AI) module, a robot, an augmented reality (AR) device, a virtual reality (VR) device, a mixed reality (MR) device, a hologram device, a public safety device, an MTC device, an IoT device, a medical device, a fintech device (or financial device), a security device, a climate / environment device, a device related to 5G services, or another device related to the fourth industrial revolution.
  • The terminal may include a mobile phone, a smart phone, a laptop computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), a navigation device, a slate PC, a tablet PC, an ultrabook, a wearable device (e.g., a watch-type terminal (smartwatch), a glass-type terminal (smart glasses), a head mounted display (HMD)), and the like.
  • the HMD may be a display device worn on the head.
  • HMD can be used to implement VR, AR or MR.
  • A drone may be an aircraft that flies by radio control signals without a person aboard.
  • the VR device may include a device that implements an object or background of a virtual world.
  • the AR device may include a device that is implemented by connecting an object or background of the virtual world to an object or background of the real world.
  • The MR device may include a device that fuses an object or background of the virtual world with an object or background of the real world.
  • The hologram device may include a device that implements a 360-degree stereoscopic image by recording and reproducing stereoscopic information, utilizing the interference phenomenon of light that occurs when two laser beams meet (holography).
  • the public safety device may include a video relay device or a video device wearable on a user's body.
  • the MTC device and the IoT device may be devices that do not require direct human intervention or manipulation.
  • The MTC device and the IoT device may include a smart meter, a vending machine, a thermometer, a smart light bulb, a door lock, or various sensors.
  • a medical device may be a device used for the purpose of diagnosing, treating, alleviating, treating or preventing a disease.
  • a medical device may be a device used for the purpose of diagnosing, treating, reducing or correcting an injury or disorder.
  • a medical device may be a device used for the purpose of examining, replacing, or modifying a structure or function.
  • the medical device may be a device used to control pregnancy.
  • The medical device may include a clinical device, a surgical device, an (in vitro) diagnostic device, a hearing aid, or a procedural device.
  • The security device may be a device installed to prevent possible risks and maintain safety.
  • the security device may be a camera, CCTV, recorder or black box.
  • the fintech device may be a device capable of providing financial services such as mobile payment.
  • the fintech device may include a payment device or a point of sales (POS).
  • a climate / environmental device may include a device that monitors or predicts the climate / environment.
  • the first device 9010 may include at least one processor, such as a processor 9011, at least one memory, such as a memory 9012, and at least one transceiver, such as a transceiver 9013.
  • the processor 9011 may perform the functions, procedures, and / or methods described above.
  • the processor 9011 may perform one or more protocols.
  • the processor 9011 can perform one or more layers of a radio interface protocol.
  • the memory 9012 is connected to the processor 9011 and can store various types of information and / or instructions.
  • the transceiver 9013 is connected to the processor 9011 and can be controlled to transmit and receive wireless signals.
  • The transceiver 9013 may be connected to one or more antennas 9014-1 to 9014-n, and may be set to transmit and receive the user data, control information, radio signals / channels, and the like mentioned in the methods and / or operation flow charts herein through the one or more antennas 9014-1 to 9014-n.
  • the n antennas may be the number of physical antennas or the number of logical antenna ports.
  • the second device 9020 may include at least one processor, such as processor 9021, at least one memory device, such as memory 9022, and at least one transceiver, such as transceiver 9023.
  • the processor 9021 may perform the functions, procedures, and / or methods described above.
  • the processor 9021 may implement one or more protocols.
  • the processor 9021 may implement one or more layers of a radio interface protocol.
  • The memory 9022 is connected to the processor 9021 and may store various types of information and / or instructions.
  • the transceiver 9023 is connected to the processor 9021 and may be controlled to transmit and receive wireless signals.
  • The transceiver 9023 may be connected to one or more antennas 9024-1 to 9024-n, and may be set to transmit and receive the user data, control information, radio signals / channels, and the like mentioned in the methods and / or operation flow charts herein through the one or more antennas 9024-1 to 9024-n.
  • The memory 9012 and / or the memory 9022 may be located inside or outside the processor 9011 and / or the processor 9021, and may be connected to other processors through various technologies such as wired or wireless connections.
  • FIG. 22 shows a wireless communication device according to an embodiment.
  • the wireless communication device in FIG. 22 may be a more detailed view of the first or second devices 9010 and 9020 of FIG. 21.
  • the wireless communication device in FIG. 22 is not limited to the terminal.
  • the wireless communication device can be any suitable mobile computer device configured to perform one or more implementations, such as a vehicle communication system or device, a wearable device, a portable computer, a smartphone, and the like.
  • The terminal may include at least one processor (e.g., a DSP or microprocessor) such as a processor 9110, a transceiver 9115, a power management module 9125, an antenna 9140, a battery 9155, a display 9115, a keypad 9120, a Global Positioning System (GPS) chip 9160, a sensor 9165, a memory 9130, (optionally) a subscriber identification module (SIM) card 9125, a speaker 9145, a microphone 9150, and the like.
  • the terminal may include one or more antennas.
  • the processor 9110 may be configured to perform the functions, procedures, and / or methods described above. According to an implementation example, the processor 9110 may perform one or more protocols, such as layers of a radio interface protocol.
  • the memory 9130 is connected to the processor 9110 and may store information related to the operation of the processor 9110.
  • the memory 9130 may be located inside or outside the processor 9110 and may be connected to other processors through various technologies such as wired or wireless connections.
  • the user may input various types of information (for example, command information such as a telephone number) by pressing a button on the keypad 9120 or using various techniques such as voice activation using the microphone 9150.
  • the processor 9110 may receive and process user information and perform an appropriate function, such as dialing a telephone number.
  • the processor 9110 may receive and process GPS information from the GPS chip 9160 to perform functions related to the location of the terminal, such as vehicle navigation and map services.
  • the processor 9110 may display various types of information and data on the display 9115 for user reference or convenience.
  • the transceiver 9115 is connected to the processor 9110 and may transmit and receive a radio signal such as an RF signal.
  • the processor 9110 may control the transceiver 9115 to initiate communication and to transmit wireless signals including various types of information or data, such as voice communication data.
  • the transceiver 9115 may include one receiver and one transmitter to send or receive wireless signals.
  • the antenna 9140 may facilitate transmission and reception of wireless signals. According to an implementation example, in receiving radio signals, the transceiver 9115 may forward and convert the signals to a baseband frequency for processing using the processor 9110.
  • the processed signals can be processed according to various techniques, such as being converted into information that can be heard or read to be output through the speaker 9145.
  • The sensor 9165 may be connected to the processor 9110.
  • the sensor 9165 may include one or more sensing devices configured to discover various types of information including, but not limited to, speed, acceleration, light, vibration, proximity, location, images, and the like.
  • the processor 9110 may receive and process sensor information obtained from the sensor 9165, and may perform various types of functions such as collision prevention and automatic driving.
  • various components may be further included in the terminal.
  • the camera may be connected to the processor 9110, and may be used for various services such as automatic driving and vehicle safety services.
  • FIG. 22 is only an example of a terminal, and implementation is not limited thereto.
  • Some components (e.g., the keypad 9120, GPS chip 9160, sensor 9165, speaker 9145, and / or microphone 9150) may not be included in some implementations.
  • FIG. 23 shows a transceiver of a wireless communication device according to an embodiment.
  • FIG. 23 may show an example of a transceiver that may be implemented in a frequency division duplex (FDD) system.
  • At least one processor can process data to be transmitted and send signals such as analog output signals to the transmitter 9210.
  • The analog output signal can be filtered at the transmitter 9210 by a low pass filter (LPF) 9211, for example to remove noise from the preceding digital-to-analog conversion (DAC).
  • The filtered signal can be upconverted from baseband to RF by an upconverter (e.g., a mixer) 9212 and amplified by a variable gain amplifier (VGA) 9213.
  • The amplified signal can be filtered by a filter 9214, further amplified by a power amplifier (PA) 9215, routed through the duplexer 9250 / antenna switch 9260, and transmitted via the antenna 9270.
  • the antenna 9270 can receive signals in a wireless environment, and the received signals can be routed at the antenna switch 9260 / duplexer 9250 and sent to the receiver 9220.
  • The signal received at the receiver 9220 can be amplified by an amplifier such as a low noise amplifier (LNA) 9223, filtered by a band pass filter 9224, and downconverted from RF to baseband by a downconverter (e.g., a mixer) 9225.
  • The downconverted signal can be filtered by a low pass filter (LPF) 9262 and amplified by an amplifier such as the VGA 9227 to obtain an analog input signal, which can be provided to one or more processors.
  • A local oscillator (LO) generator 9240 may generate transmission and reception LO signals and send them to the upconverter 9212 and the downconverter 9225, respectively.
  • A phase locked loop (PLL) 9230 may receive control information from the processor and send control signals to the LO generator 9240 so that the transmission and reception LO signals are generated at suitable frequencies.
  • Implementations are not limited to the particular arrangement shown in FIG. 23, and various components and circuits may be arranged differently than the example shown in FIG. 23.
  • FIG. 24 illustrates a transceiver of a wireless communication device according to an embodiment.
  • FIG. 24 may show an example of a transceiver that may be implemented in a time division duplex (TDD) system.
  • the transmitter 9310 and receiver 9320 of the transceiver of the TDD system may have one or more similar characteristics to the transmitter and receiver of the transceiver of the FDD system.
  • the structure of the transceiver of the TDD system will be described.
  • The signal amplified by the transmitter's power amplifier (PA) 9315 can be routed through a band select switch 9350, a band pass filter (BPF) 9260, and antenna switch(s) 9370, and transmitted via the antenna 9380.
  • The antenna 9380 receives signals from the wireless environment, and the received signals can be routed through the antenna switch(s) 9370, the band pass filter (BPF) 9260, and the band select switch 9350, and provided to the receiver 9320.
  • FIG. 25 illustrates an operation of a wireless device related to a sidelink according to an embodiment. The operation described in FIG. 25 is merely an example, and sidelink operations using various techniques may be performed in the wireless device.
  • the sidelink may be a terminal-to-terminal interface for sidelink communication and / or sidelink discovery.
  • the side link may correspond to the PC5 interface.
  • the sidelink operation may be transmission and reception of information between terminals.
  • the sidelink can carry various types of information.
  • the wireless device may acquire information related to the sidelink.
  • the information related to the sidelink may be one or more resource configurations.
  • Information related to the sidelink may be obtained from other wireless devices or network nodes.
  • After obtaining the information related to the sidelink, in step S9420 the wireless device can decode the information related to the sidelink.
  • the wireless device may perform one or more sidelink operations based on the information related to the sidelink.
  • the sidelink operation (s) performed by the wireless device may include one or more operations described herein.
  • FIG. 26 illustrates an operation of a network node related to a side link according to an embodiment.
  • the operation of the network node related to the sidelink described in FIG. 26 is merely an example, and sidelink operations using various techniques may be performed at the network node.
  • the network node may receive information on the sidelink from the wireless device.
  • the sidelink information may be sidelink UE information used to inform the network node of sidelink information.
  • the network node may determine whether to transmit one or more commands related to the sidelink based on the received information.
  • the network node may transmit the command (s) related to the sidelink to the wireless device.
  • the wireless device may perform one or more sidelink operation (s) based on the received command.
  • FIG. 27 shows an implementation of a wireless device and a network node according to an embodiment.
  • the network node can be replaced with a wireless device or terminal.
  • the wireless device 9610 may include a communication interface 9611 to communicate with one or more other wireless devices, network nodes and / or other elements in the network.
  • Communication interface 9611 may include one or more transmitters, one or more receivers, and / or one or more communication interfaces.
  • the wireless device 9610 may include a processing circuit 9612.
  • The processing circuit 9612 may include one or more processors, such as a processor 9613, and one or more memories, such as a memory 9614.
  • the processing circuit 9612 may be configured to control any methods and / or processes described herein and / or, for example, to cause the wireless device 9610 to perform such methods and / or processes.
  • The processor 9613 may correspond to one or more processors for performing the wireless device functions described herein.
  • the wireless device 9610 may include a memory 9614 configured to store data, program software code, and / or other information described herein.
  • The memory 9614 may store software code 9615 including instructions that, when executed by one or more processors such as the processor 9613, cause the processor 9613 to perform some or all of the processes described herein.
  • one or more processors that control one or more transceivers, such as transceiver 2223, to transmit and receive information may perform one or more processes related to the transmission and reception of information.
  • the network node 9620 may include a communication interface 9621 to communicate with one or more other network nodes, wireless devices and / or other elements on the network.
  • the communication interface 9621 may include one or more transmitters, one or more receivers, and / or one or more communication interfaces.
  • Network node 9620 may include processing circuitry 9622.
  • the processing circuit may include a processor 9623 and a memory 9624.
  • The memory 9624 may store software code 9625 including instructions that, when executed by one or more processors such as the processor 9623, cause the processor 9623 to perform some or all of the processes described herein.
  • one or more processors that control one or more transceivers, such as the transceiver 2213, to transmit and receive information may perform one or more processes related to the transmission and reception of information.
  • each structural element or function can be considered selectively.
  • Each of the structural elements or features can be performed without being combined with other structural elements or features.
  • some structural elements and / or features can be combined with each other to construct implementations.
  • the order of operation described in the implementation can be changed.
  • Some structural elements or features of one implementation may be included in another implementation, or may be replaced by structural elements or features corresponding to another implementation.
  • Implementations in the present invention can be made by various techniques, such as hardware, firmware, software, or combinations thereof.
  • In a hardware implementation, a method according to implementations may be implemented by one or more Application Specific Integrated Circuits (ASICs), one or more Digital Signal Processors (DSPs), one or more Digital Signal Processing Devices (DSPDs), one or more Programmable Logic Devices (PLDs), one or more Field Programmable Gate Arrays (FPGAs), one or more processors, one or more controllers, one or more microcontrollers, one or more microprocessors, and the like.
  • firmware or software implementations may be implemented in the form of modules, procedures, functions, and the like.
  • the software code can be stored in memory and executed by a processor.
  • the memory may be located inside or outside the processor, and may transmit and receive data from the processor in various ways.


Abstract

According to an embodiment, a method by which a terminal transmits a sidelink signal in a wireless communication system comprises the steps of: generating a DMRS for a physical sidelink shared channel (PSSCH); and transmitting the DMRS for the PSSCH and the PSSCH, wherein a cyclic redundancy check (CRC) used for a physical sidelink control channel (PSCCH) related to the PSSCH is used when a sequence related to the DMRS for the PSSCH is generated.
PCT/KR2019/012262 2018-09-20 2019-09-20 Method by which terminal transmits sidelink signal in wireless communication system, and device therefor WO2020060304A1 (fr)

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
US201862734210P 2018-09-20 2018-09-20
US62/734,210 2018-09-20
US201962825787P 2019-03-28 2019-03-28
US62/825,787 2019-03-28
US201962887472P 2019-08-15 2019-08-15
US62/887,472 2019-08-15

Publications (1)

Publication Number Publication Date
WO2020060304A1 true WO2020060304A1 (fr) 2020-03-26

Family

ID=69887724

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2019/012262 WO2020060304A1 (fr) 2018-09-20 2019-09-20 Method by which terminal transmits sidelink signal in wireless communication system, and device therefor

Country Status (1)

Country Link
WO (1) WO2020060304A1 (fr)


Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017176096A1 (fr) * 2016-04-07 2017-10-12 엘지전자 주식회사 Procédé pour réserver un nombre fini de ressources utilisées pour effectuer une communication v2x dans un système de communication sans fil, et terminal d'utilisation associé
US20180062809A1 (en) * 2016-08-24 2018-03-01 Qualcomm Incorporated Demodulation reference signal sequence selection in device-to-device communication


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
"On PSSCH DMRS Signal Generation and Performance Analysis", R1-1702138, 3GPP TSG RAN1 WG Meeting #88, 7 February 2017 (2017-02-07), Athens, Greece, XP051221047 *
LG ELECTRONICS: "Remaining issues for PC5 V2V", R1-1702394, 3GPP TSG RAN1 WG Meeting #88, 7 February 2017 (2017-02-07), Athens, Greece, XP051221254 *
SAMSUNG: "Further randomization on DMRS and scrambling code", R1-1702861, 3GPP TSG RAN1 WG Meeting #88, 7 February 2017 (2017-02-07), Athens, Greece, XP051221685 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210385804A1 (en) * 2019-11-28 2021-12-09 Apple Inc. V2X Sidelink Channel Design
US11601919B2 (en) * 2019-11-28 2023-03-07 Apple Inc. V2X sidelink channel design


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19863303

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19863303

Country of ref document: EP

Kind code of ref document: A1