WO2021221481A1 - Method of operating an RSU related to a VRU location in a wireless communication system - Google Patents
Method of operating an RSU related to a VRU location in a wireless communication system
- Publication number
- WO2021221481A1 (PCT/KR2021/005463)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- vru
- location information
- information
- rsu
- message
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S19/00—Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
- G01S19/38—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
- G01S19/39—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
- G01S19/42—Determining position
- G01S19/48—Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system
- G01S19/485—Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system whereby the further system is an optical system or imaging system
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
- G01S17/04—Systems determining the presence of a target
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/30—Services specially adapted for particular environments, situations or purposes
- H04W4/40—Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P]
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S19/00—Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
- G01S19/38—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
- G01S19/39—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
- G01S19/40—Correcting position, velocity or attitude
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/497—Means for monitoring or calibrating
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/02—Services making use of location information
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/02—Services making use of location information
- H04W4/025—Services making use of location information using location based information parameters
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/02—Services making use of location information
- H04W4/025—Services making use of location information using location based information parameters
- H04W4/027—Services making use of location information using location based information parameters using movement velocity, acceleration information
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/12—Messaging; Mailboxes; Announcements
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/90—Services for handling of emergency or hazardous situations, e.g. earthquake and tsunami warning systems [ETWS]
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S19/00—Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
- G01S19/38—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
- G01S19/39—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
- G01S19/396—Determining accuracy or reliability of position or pseudorange measurements
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S5/00—Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
- G01S5/0009—Transmission of position information to remote stations
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W12/00—Security arrangements; Authentication; Protecting privacy or anonymity
- H04W12/03—Protecting confidentiality, e.g. by encryption
Definitions
- the following description relates to a wireless communication system, and more particularly, to a method of operating a Road Side Unit (RSU) related to a Vulnerable Road User (VRU) location.
- RSU Road Side Unit
- VRU Vulnerable Road User
- a wireless communication system is a multiple access system that can support communication with multiple users by sharing available system resources (bandwidth, transmission power, etc.).
- Examples of the multiple access system include a code division multiple access (CDMA) system, a frequency division multiple access (FDMA) system, a time division multiple access (TDMA) system, an orthogonal frequency division multiple access (OFDMA) system, and a single carrier frequency division multiple access (SC-FDMA) system.
- CDMA code division multiple access
- FDMA frequency division multiple access
- TDMA time division multiple access
- OFDMA orthogonal frequency division multiple access
- SC-FDMA single carrier frequency division multiple access
- Various radio access technologies (RATs), such as LTE, LTE-A, and WiFi, are in use, and 5G is included among them.
- RAT Radio Access Technology
- the three main requirement areas of 5G are (1) the Enhanced Mobile Broadband (eMBB) area, (2) the Massive Machine Type Communication (mMTC) area, and (3) the Ultra-reliable and Low Latency Communications (URLLC) area.
- eMBB Enhanced Mobile Broadband
- mMTC Massive Machine Type Communication
- URLLC Ultra-reliable and Low Latency Communications
- KPI key performance indicator
- 5G aims to support these various use cases in a flexible and reliable way.
- eMBB goes far beyond basic mobile internet access, covering rich interactive work, media and entertainment applications in the cloud or augmented reality.
- Data is one of the key drivers of 5G, and for the first time in the 5G era, we may not see dedicated voice services.
- voice is simply expected to be processed as an application using the data connection provided by the communication system.
- the main causes for increased traffic volume are an increase in content size and an increase in the number of applications requiring high data rates.
- Streaming services (audio and video), interactive video, and mobile Internet connections will become more widely used as more devices are connected to the Internet. Many of these applications require always-on connectivity to push real-time information and notifications to users.
- Cloud storage and applications are rapidly increasing in mobile communication platforms, which can be applied to both work and entertainment.
- cloud storage is a special use case that drives the growth of uplink data rates.
- 5G is also used for remote work in the cloud, requiring much lower end-to-end latency to maintain a good user experience when tactile interfaces are used.
- Entertainment, for example cloud gaming and video streaming, is another key factor increasing the demand for mobile broadband capability. Entertainment is essential on smartphones and tablets anywhere, including in high-mobility environments such as trains, cars, and airplanes.
- Another use example is augmented reality for entertainment and information retrieval.
- augmented reality requires very low latency and instantaneous amount of data.
- URLLC includes new services that will transform the industry through ultra-reliable/available low-latency links such as self-driving vehicles and remote control of critical infrastructure. This level of reliability and latency is essential for smart grid control, industrial automation, robotics, and drone control and coordination.
- 5G could complement fiber-to-the-home (FTTH) and cable-based broadband (or DOCSIS) as a means of delivering streams rated at hundreds of megabits per second to gigabits per second. This high speed is required to deliver TVs in resolutions of 4K and higher (6K, 8K and higher), as well as virtual and augmented reality.
- Virtual Reality (VR) and Augmented Reality (AR) applications increasingly include immersive sporting events. Certain applications may require special network settings. For VR games, for example, game companies may need to integrate core servers with network operators' edge network servers to minimize latency.
- Automotive is expected to be an important new driving force for 5G, with many use cases for mobile communication to vehicles. For example, entertainment for passengers requires simultaneous high capacity and high mobility mobile broadband. The reason is that future users will continue to expect high-quality connections regardless of their location and speed.
- Another use case in the automotive sector is augmented reality dashboards. It identifies objects in the dark and overlays information that tells the driver about the distance and movement of the object over what the driver is seeing through the front window.
- wireless modules will enable communication between vehicles, information exchange between vehicles and supporting infrastructure, and information exchange between automobiles and other connected devices (eg, devices carried by pedestrians).
- Safety systems can help drivers reduce the risk of accidents by guiding alternative courses of action to help them drive safer.
- the next step will be remote-controlled or self-driven vehicles.
- Smart cities and smart homes, referred to as smart societies, will be embedded with high-density wireless sensor networks.
- a distributed network of intelligent sensors will identify conditions for cost and energy-efficient maintenance of a city or house.
- a similar setup can be performed for each household.
- Temperature sensors, window and heating controllers, burglar alarms and appliances are all connected wirelessly. Many of these sensors are typically low data rates, low power and low cost. However, for example, real-time HD video may be required in certain types of devices for surveillance.
- Smart grids use digital information and communication technologies to interconnect these sensors to collect information and act on it. This information can include supplier and consumer behavior, enabling smart grids to improve efficiency, reliability, economy, sustainability of production and distribution of fuels such as electricity in an automated manner.
- the smart grid can also be viewed as another low-latency sensor network.
- the health sector has many applications that can benefit from mobile communications.
- the communication system may support telemedicine providing clinical care from a remote location. This can help reduce barriers to distance and improve access to consistently unavailable health care services in remote rural areas. It is also used to save lives in critical care and emergency situations.
- a wireless sensor network based on mobile communication may provide remote monitoring and sensors for parameters such as heart rate and blood pressure.
- Wireless and mobile communications are becoming increasingly important in industrial applications. Wiring is expensive to install and maintain. Thus, the possibility of replacing cables with reconfigurable wireless links is an attractive opportunity for many industries. However, achieving this requires that the wireless connection operate with cable-like delay, reliability and capacity, and that its management be simplified. Low latency and very low error probability are new requirements that need to be connected with 5G.
- Logistics and freight tracking are important use cases for mobile communications that use location-based information systems to enable tracking of inventory and packages from anywhere.
- Logistics and freight tracking use cases typically require low data rates but require wide range and reliable location information.
- a sidelink refers to a communication method in which a direct link is established between user equipment (UE), and voice or data is directly exchanged between terminals without going through a base station (BS).
- SL is being considered as one way to solve the burden of the base station due to the rapidly increasing data traffic.
- V2X (vehicle-to-everything) refers to a communication technology for exchanging information with other vehicles, pedestrians, and infrastructure objects through wired/wireless communication.
- V2X can be divided into four types: vehicle-to-vehicle (V2V), vehicle-to-infrastructure (V2I), vehicle-to-network (V2N), and vehicle-to-pedestrian (V2P).
- V2X communication may be provided through a PC5 interface and/or a Uu interface.
- next-generation radio access technology in consideration of the like may be referred to as a new radio access technology (RAT) or a new radio (NR).
- RAT new radio access technology
- NR new radio
- V2X vehicle-to-everything
- FIG. 1 is a diagram for explaining a comparison of V2X communication based on RAT before NR and V2X communication based on NR.
- V2X message may include location information, dynamic information, attribute information, and the like.
- the UE may transmit a periodic message type Cooperative Awareness Message (CAM) and/or an event-triggered message type Decentralized Environmental Notification Message (DENM) to another UE.
- the CAM may include basic vehicle information such as dynamic state information of the vehicle (e.g., direction and speed), static vehicle data (e.g., dimensions), external lighting conditions, and route details.
- the UE may broadcast the CAM, and the CAM latency may be less than 100 ms.
- the terminal may generate a DENM and transmit it to another terminal.
- all vehicles within the transmission range of the terminal may receive the CAM and/or DENM.
- the DENM may have a higher priority than the CAM.
- V2X scenarios are being presented in NR.
- various V2X scenarios may include vehicle platooning, advanced driving, extended sensors, remote driving, and the like.
- vehicles can be dynamically grouped and moved together.
- vehicles belonging to the group may receive periodic data from a leading vehicle.
- the vehicles belonging to the group may use periodic data to reduce or widen the distance between the vehicles.
- the vehicle can be semi-automated or fully automated.
- each vehicle may adjust trajectories or maneuvers based on data obtained from local sensors of the proximate vehicle and/or proximate logical entity.
- each vehicle may share driving intention with adjacent vehicles.
- raw or processed data obtained through local sensors, or live video data, may be exchanged among vehicles, logical entities, pedestrians' terminals, and/or V2X application servers.
- the vehicle may recognize an environment that is improved over an environment that can be detected using its own sensor.
- a remote driver or V2X application may operate or control the remote vehicle.
- a route can be predicted, such as in public transportation.
- cloud computing-based driving may be used to operate or control the remote vehicle.
- access to a cloud-based back-end service platform may be considered for remote driving.
- the embodiment(s) relate to a method of operating a Road Side Unit (RSU) for measuring/determining/correcting a Vulnerable Road User (VRU) location.
- RSU Road Side Unit
- VRU Vulnerable Road User
- the method comprises: receiving, by the RSU, a Personal Safety Message (PSM) of the VRU; determining, by the RSU, the location information of the VRU based on first location information of the VRU obtained through image information and second location information of the VRU obtained through the PSM message; and transmitting, by the RSU, the location information of the VRU to the VRU.
- PSM Personal Safety Messages
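- The three-step RSU operation described above (receive a PSM, determine the VRU location from the image-based and PSM-based estimates, transmit the result back) can be sketched as follows. This is an illustrative sketch only: the class and field names (`PSM`, `camera_fix`, `send`) are hypothetical, not taken from the patent or any standard message definition, and the simple averaging stands in for whatever determination rule is actually used.

```python
from dataclasses import dataclass
from typing import Callable, Tuple

XY = Tuple[float, float]

@dataclass
class PSM:
    """Hypothetical stand-in for a Personal Safety Message payload."""
    vru_id: str
    position: XY  # second location information (e.g., GNSS-based, self-reported)

class RSU:
    def __init__(self, camera_fix: Callable[[str], XY], send: Callable[[str, XY], None]):
        self.camera_fix = camera_fix  # yields first location information from image data
        self.send = send              # transmits the determined location back to the VRU

    def on_psm(self, msg: PSM) -> XY:
        # Step 1: PSM received (msg.position carries the VRU's self-reported location).
        first = self.camera_fix(msg.vru_id)   # image-based estimate
        second = msg.position                 # PSM-based estimate
        # Step 2: determine the VRU location from both sources
        # (averaging here; the determination rule is left abstract in the text).
        determined = ((first[0] + second[0]) / 2, (first[1] + second[1]) / 2)
        # Step 3: transmit the determined location to the VRU.
        self.send(msg.vru_id, determined)
        return determined
```

In use, `camera_fix` would be backed by the RSU's image processing and `send` by its sidelink transmitter; both are injected as callables here so the flow itself stays self-contained.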
- One embodiment provides a Road Side Unit (RSU) for performing a Vulnerable Road User (VRU) related operation, comprising: at least one processor; and at least one computer memory operably coupled to the at least one processor and storing instructions that, when executed, cause the at least one processor to perform operations, the operations comprising: receiving a PSM message from a VRU; determining the location information of the VRU based on first location information of the VRU obtained through image information and second location information of the VRU obtained through the PSM message; and transmitting the location information of the VRU to the VRU.
- RSU Road Side Unit
- An embodiment provides a processor for performing operations for a Road Side Unit (RSU) in a wireless communication system, the operations comprising: receiving a PSM message of a VRU; determining the location information of the VRU based on the first location information of the VRU obtained through image information and the second location information of the VRU obtained through the PSM message; and transmitting the location information of the VRU to the VRU.
- RSU Road Side Unit
- One embodiment provides a non-volatile computer-readable storage medium storing at least one computer program comprising instructions that, when executed by at least one processor, cause the at least one processor to perform operations for a Road Side Unit (RSU), the operations comprising: receiving a PSM message from a VRU; determining the location information of the VRU based on first location information of the VRU obtained through image information and second location information of the VRU obtained through the PSM message; and transmitting the location information of the VRU to the VRU.
- From the VRU side, the method comprises: transmitting, by the VRU, a PSM message to the RSU; and receiving, by the VRU, the location information of the VRU from the RSU, wherein the location information of the VRU is determined based on first location information of the VRU obtained by the RSU through image information and second location information of the VRU obtained through the PSM message.
- VRU Vulnerable Road User
- RSU Road Side Unit
- A VRU apparatus comprises: at least one processor; and at least one computer memory operably coupled to the at least one processor and storing instructions that, when executed, cause the at least one processor to perform operations comprising: transmitting a PSM message to an RSU; and receiving the location information of the VRU from the RSU, wherein the location information of the VRU is determined based on first location information of the VRU obtained by the RSU through image information and second location information of the VRU obtained through the PSM message.
- a weight may be applied to the first location information and the second location information.
- the image information is captured by the RSU in the observation area and may be shared with other RSUs.
- the location information of the VRU may be shared with the other RSUs.
- the location information of the VRU may include weight information of the first location information with respect to the second location information used for location determination.
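- As a minimal sketch of the weighting described above, assuming a simple linear weighted average (the text does not fix a particular formula), with a hypothetical weight `w_image` for the image-based (first) estimate:

```python
def weighted_location(image_xy, psm_xy, w_image):
    """Weighted combination of the image-based (first) and PSM-based (second)
    VRU position estimates; w_image is in [0, 1] and the PSM weight is 1 - w_image."""
    w_psm = 1.0 - w_image
    x = w_image * image_xy[0] + w_psm * psm_xy[0]
    y = w_image * image_xy[1] + w_psm * psm_xy[1]
    # Returning the weight alongside the fix mirrors the idea that the
    # transmitted location information may include the weight that was used.
    return (x, y), w_image
```

For example, with equal trust in both sources (`w_image = 0.5`) the result is simply the midpoint of the two estimates.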
- the PSM message may include location information obtained by the VRU through a Global Navigation Satellite System (GNSS).
- GNSS Global Navigation Satellite System
- the PSM message may include location information obtained by the VRU through another RSU, another VRU, or a base station.
- the location information of the VRU may be determined as a value having a smaller error range among the first location information and the second location information.
- the location information of the VRU may be determined based on an error range obtained by averaging the error range of the first location information and the error range of the second location information.
- the location information of the VRU may be included in an intersection of an area corresponding to the error range of the first location information and an area corresponding to the error range of the second location information.
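- The three error-range rules above (pick the smaller error, average the errors, constrain to the intersection of the error areas) can be illustrated as follows. Treating each error range as a circle around the estimated position is an assumption made for illustration; each estimate is represented here as `((x, y), error_radius)`.

```python
import math

def pick_smaller_error(p1, p2):
    """Choose the estimate with the smaller error range."""
    return p1 if p1[1] <= p2[1] else p2

def averaged_error(p1, p2):
    """Midpoint position with the averaged error range of the two estimates."""
    (x1, y1), e1 = p1
    (x2, y2), e2 = p2
    return ((x1 + x2) / 2, (y1 + y2) / 2), (e1 + e2) / 2

def in_intersection(point, p1, p2):
    """True if `point` lies inside both error circles, i.e. in the intersection
    of the areas corresponding to the two error ranges."""
    for (cx, cy), e in (p1, p2):
        if math.hypot(point[0] - cx, point[1] - cy) > e:
            return False
    return True
```

`in_intersection` can serve as a final sanity check: a determined VRU location that falls outside the intersection is inconsistent with at least one of the two sources.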
- the location of the VRU may be more accurately measured/corrected using the image of the RSU.
- FIG. 1 is a diagram for explaining a comparison of V2X communication based on RAT before NR and V2X communication based on NR.
- FIG. 2 shows the structure of an LTE system according to an embodiment of the present disclosure.
- FIG 3 illustrates a radio protocol architecture for a user plane and a control plane according to an embodiment of the present disclosure.
- FIG. 4 shows a structure of an NR system according to an embodiment of the present disclosure.
- FIG. 5 illustrates functional division between NG-RAN and 5GC according to an embodiment of the present disclosure.
- FIG. 6 shows the structure of a radio frame of NR to which embodiment(s) can be applied.
- FIG. 7 illustrates a slot structure of an NR frame according to an embodiment of the present disclosure.
- FIG. 8 illustrates a radio protocol architecture for SL communication according to an embodiment of the present disclosure.
- FIG. 9 illustrates a radio protocol architecture for SL communication according to an embodiment of the present disclosure.
- FIG. 10 illustrates a procedure for a terminal to perform V2X or SL communication according to a transmission mode, according to an embodiment of the present disclosure.
- FIGS. 11 to 23 are diagrams for explaining the embodiment(s).
- FIGS. 24 to 30 are diagrams for explaining various devices to which the embodiment(s) can be applied.
- “/” and “,” should be interpreted as indicating “and/or”.
- “A/B” may mean “A and/or B”.
- “A, B” may mean “A and/or B”.
- “A/B/C” may mean “at least one of A, B, and/or C”.
- “A, B, and C” may mean “at least one of A, B and/or C”.
- “or” should be construed as indicating “and/or”.
- “A or B” may include “only A”, “only B”, and/or “both A and B”.
- “or” should be construed as indicating “additionally or alternatively”.
- CDMA may be implemented with a radio technology such as universal terrestrial radio access (UTRA) or CDMA2000.
- TDMA may be implemented with a radio technology such as global system for mobile communications (GSM)/general packet radio service (GPRS)/enhanced data rates for GSM evolution (EDGE).
- GSM global system for mobile communications
- GPRS general packet radio service
- EDGE enhanced data rates for GSM evolution
- OFDMA may be implemented with a wireless technology such as Institute of Electrical and Electronics Engineers (IEEE) 802.11 (Wi-Fi), IEEE 802.16 (WiMAX), IEEE 802.20, and evolved UTRA (E-UTRA).
- IEEE 802.16m is an evolution of IEEE 802.16e, and provides backward compatibility with a system based on IEEE 802.16e.
- UTRA is part of the universal mobile telecommunications system (UMTS).
- 3rd generation partnership project (3GPP) long term evolution (LTE) is a part of evolved UMTS (E-UMTS) using evolved-UMTS terrestrial radio access (E-UTRA), and adopts OFDMA in downlink and SC-FDMA in uplink.
- LTE-A (advanced) is an evolution of 3GPP LTE.
- 5G NR is a successor technology of LTE-A, and is a new clean-slate type mobile communication system with characteristics such as high performance, low latency, and high availability. 5G NR can utilize all available spectrum resources, from low frequency bands below 1 GHz, to intermediate frequency bands from 1 GHz to 10 GHz, and high frequency (millimeter wave) bands above 24 GHz.
- LTE-A or 5G NR is mainly described, but the technical idea according to an embodiment of the present disclosure is not limited thereto.
- E-UTRAN Evolved-UMTS Terrestrial Radio Access Network
- LTE Long Term Evolution
- the E-UTRAN includes a base station 20 that provides a control plane and a user plane to the terminal 10.
- the terminal 10 may be fixed or mobile, and may be called by other terms such as a mobile station (MS), a user terminal (UT), a subscriber station (SS), a mobile terminal (MT), and a wireless device.
- the base station 20 refers to a fixed station that communicates with the terminal 10, and may be called by other terms such as an evolved-NodeB (eNB), a base transceiver system (BTS), and an access point.
- eNB evolved-NodeB
- BTS base transceiver system
- the base stations 20 may be connected to each other through an X2 interface.
- the base station 20 is connected to an Evolved Packet Core (EPC) 30 through the S1 interface; more specifically, to a Mobility Management Entity (MME) through S1-MME and to a Serving Gateway (S-GW) through S1-U.
- EPC Evolved Packet Core
- the EPC 30 is composed of an MME, an S-GW, and a Packet Data Network-Gateway (P-GW).
- the MME has access information of the terminal or information about the capability of the terminal, and this information is mainly used for mobility management of the terminal.
- the S-GW is a gateway having E-UTRAN as an end point
- the P-GW is a gateway having a Packet Data Network (PDN) as an end point.
- the layers of the radio interface protocol between the terminal and the network may be divided into L1 (first layer), L2 (second layer), and L3 (third layer), based on the lower three layers of the Open System Interconnection (OSI) reference model widely known in communication systems.
- OSI Open System Interconnection
- the physical layer belonging to the first layer provides an information transfer service using a physical channel
- the RRC (Radio Resource Control) layer, located in the third layer, plays a role in controlling radio resources between the terminal and the network. To this end, the RRC layer exchanges RRC messages between the terminal and the base station.
- FIG. 3A illustrates a radio protocol architecture for a user plane according to an embodiment of the present disclosure.
- the user plane is a protocol stack for transmitting user data
- the control plane is a protocol stack for transmitting a control signal.
- a physical layer provides an information transmission service to an upper layer using a physical channel.
- the physical layer is connected to a medium access control (MAC) layer, which is an upper layer, through a transport channel.
- MAC medium access control
- Data moves between the MAC layer and the physical layer through the transport channel. Transport channels are classified according to how and with what characteristics data is transmitted over the air interface.
- the physical channel may be modulated in an Orthogonal Frequency Division Multiplexing (OFDM) scheme, and time and frequency are used as radio resources.
- OFDM Orthogonal Frequency Division Multiplexing
- the MAC layer provides a service to a radio link control (RLC) layer, which is an upper layer, through a logical channel.
- RLC radio link control
- the MAC layer provides a mapping function from a plurality of logical channels to a plurality of transport channels.
- the MAC layer provides a logical channel multiplexing function by mapping a plurality of logical channels to a single transport channel.
- the MAC sublayer provides data transfer services on logical channels.
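- The logical-to-transport channel multiplexing described above can be illustrated with a toy sketch; the subheader format here is invented for illustration and is not the actual LTE/NR MAC header layout:

```python
def mac_multiplex(logical_pdus):
    """Map MAC SDUs from several logical channels onto a single transport
    block by concatenating them with per-SDU subheaders (toy format).
    `logical_pdus` is a list of (logical_channel_id, payload_bytes) pairs."""
    transport_block = bytearray()
    for lcid, payload in logical_pdus:
        # Toy subheader: 1-byte logical channel ID + 1-byte length.
        # (Real MAC subheaders carry LCID and length fields in a different layout.)
        transport_block += bytes([lcid, len(payload)]) + payload
    return bytes(transport_block)
```

The receiving MAC would walk the subheaders to demultiplex each SDU back to its logical channel, which is the inverse of the mapping sketched here.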
- the RLC layer performs concatenation, segmentation, and reassembly of RLC Service Data Units (SDUs).
- SDU Service Data Unit
- the RLC layer operates in three modes: Transparent Mode (TM), Unacknowledged Mode (UM), and Acknowledged Mode (AM).
- TM Transparent Mode
- UM Unacknowledged Mode
- AM Acknowledged Mode
- AM RLC provides error correction through automatic repeat request (ARQ).
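- The ARQ-based error correction of AM RLC can be illustrated with a minimal stop-and-wait retransmission loop; this is a deliberate simplification, since real AM RLC uses sequence-numbered PDUs, a sliding window, and status reports:

```python
def arq_send(pdu, channel, max_retx=4):
    """Send `pdu` over `channel(pdu) -> bool` (True means an ACK came back).
    Retransmit on NACK/timeout up to `max_retx` times, mirroring how AM RLC's
    ARQ recovers PDUs lost on the radio link. Returns the number of
    retransmissions that were needed."""
    for attempt in range(1 + max_retx):
        if channel(pdu):
            return attempt
    raise RuntimeError("AM RLC: maximum retransmissions reached")
```

The key property illustrated is that the sender keeps the PDU until it is acknowledged, which is what gives AM its reliability over UM and TM.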
- the RRC (Radio Resource Control) layer is defined only in the control plane.
- the RRC layer is responsible for controlling logical channels, transport channels and physical channels in relation to configuration, re-configuration, and release of radio bearers.
- RB (Radio Bearer) means a logical path provided by the first layer (physical layer or PHY layer) and the second layer (MAC layer, RLC layer, and PDCP (Packet Data Convergence Protocol) layer) for data transfer between the terminal and the network.
- the functions of the PDCP layer in the user plane include delivery of user data, header compression and ciphering.
- the functions of the PDCP layer in the control plane include transmission of control plane data and encryption/integrity protection.
- Configuring an RB means defining the characteristics of the radio protocol layers and channels to provide a specific service, and setting the specific parameters and operation methods of each.
- the RB may be further divided into a Signaling Radio Bearer (SRB) and a Data Radio Bearer (DRB).
- SRB Signaling Radio Bearer
- DRB Data Radio Bearer
- When an RRC connection is established between the RRC layer of the UE and the RRC layer of the E-UTRAN, the UE is in the RRC_CONNECTED state; otherwise, it is in the RRC_IDLE state.
- the RRC_INACTIVE state is additionally defined, and the UE in the RRC_INACTIVE state may release the connection with the base station while maintaining the connection with the core network.
- Downlink transport channels for transmitting data from the network to the terminal include a BCH (Broadcast Channel) for transmitting system information and a downlink SCH (Shared Channel) for transmitting user traffic or control messages. Traffic or control messages of downlink multicast or broadcast services may be transmitted through the downlink SCH or through a separate downlink MCH (Multicast Channel).
- RACH random access channel
- SCH uplink shared channel
- the logical channels that are located above the transport channels and are mapped to the transport channels include a Broadcast Control Channel (BCCH), a Paging Control Channel (PCCH), a Common Control Channel (CCCH), a Multicast Control Channel (MCCH), and a Multicast Traffic Channel (MTCH).
- BCCH Broadcast Control Channel
- PCCH Paging Control Channel
- CCCH Common Control Channel
- MCCH Multicast Control Channel
- MTCH Multicast Traffic Channel
- a physical channel consists of several OFDM symbols in the time domain and several sub-carriers in the frequency domain.
- One sub-frame is composed of a plurality of OFDM symbols in the time domain.
- a resource block is a resource allocation unit and includes a plurality of OFDM symbols and a plurality of sub-carriers.
- each subframe may use specific subcarriers of specific OFDM symbols (eg, the first OFDM symbol) of the corresponding subframe for a Physical Downlink Control Channel (PDCCH), that is, an L1/L2 control channel.
- PDCCH Physical Downlink Control Channel
- a Transmission Time Interval (TTI) is a unit time of subframe transmission.
- FIG. 4 shows a structure of an NR system according to an embodiment of the present disclosure.
- a Next Generation Radio Access Network may include a next generation-Node B (gNB) and/or an eNB that provides user plane and control plane protocol termination to a UE.
- gNB next generation-Node B
- eNB evolved Node B
- FIG. 4 illustrates a case in which only gNBs are included.
- the gNB and the eNB are connected to each other through an Xn interface.
- the gNB and the eNB are connected to the 5G Core Network (5GC) through the NG interface.
- the access and mobility management function AMF
- the user plane function UPF
- 5 illustrates functional division between NG-RAN and 5GC according to an embodiment of the present disclosure.
- the gNB may provide functions such as inter-cell radio resource management (Inter-Cell RRM), radio bearer management (RB control), connection mobility control (Connection Mobility Control), radio admission control (Radio Admission Control), measurement configuration and provision (Measurement Configuration & Provision), and dynamic resource allocation.
- AMF may provide functions such as NAS (Non Access Stratum) security, idle state mobility processing, and the like.
- the UPF may provide functions such as mobility anchoring and protocol data unit (PDU) processing.
- a Session Management Function (SMF) may provide functions such as terminal Internet Protocol (IP) address assignment, PDU session control, and the like.
- IP Internet Protocol
- FIG. 6 shows the structure of an NR radio frame to which the present invention can be applied.
- radio frames may be used in uplink and downlink transmission in NR.
- a radio frame has a length of 10 ms and may be defined as two 5 ms half-frames (HF).
- a half-frame may include 5 1ms subframes (Subframe, SF).
- a subframe may be divided into one or more slots, and the number of slots in a subframe may be determined according to a subcarrier spacing (SCS).
- SCS subcarrier spacing
- Each slot may include 12 or 14 OFDM(A) symbols according to a cyclic prefix (CP).
- CP cyclic prefix
- in the case of a normal CP, each slot may include 14 symbols.
- in the case of an extended CP, each slot may include 12 symbols.
- the symbol may include an OFDM symbol (or a CP-OFDM symbol) and an SC-FDMA symbol (or a DFT-s-OFDM symbol).
- Table 1 illustrates the number of symbols per slot (N_symb^slot), the number of slots per frame (N_slot^frame,mu), and the number of slots per subframe (N_slot^subframe,mu) according to the SCS when the normal CP is used.
- Table 2 illustrates the number of symbols per slot, the number of slots per frame, and the number of slots per subframe according to SCS when the extended CP is used.
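The numerology relationships behind these tables (SCS = 15 * 2^mu kHz, 14 symbols per slot for the normal CP, 2^mu slots per 1 ms subframe) can be sketched as follows. This is an illustrative helper, not code from the document; the function name is ours:

```python
# Slot counts per NR numerology: SCS = 15 * 2^mu kHz, 14 symbols per slot
# (normal CP) or 12 (extended CP; in NR the extended CP is defined only for
# the 60 kHz SCS, i.e. mu = 2), 2^mu slots per 1 ms subframe, 10 subframes
# per 10 ms frame.
def nr_slot_structure(mu: int, extended_cp: bool = False) -> dict:
    scs_khz = 15 * (2 ** mu)
    symbols_per_slot = 12 if extended_cp else 14
    slots_per_subframe = 2 ** mu
    return {
        "scs_khz": scs_khz,
        "symbols_per_slot": symbols_per_slot,
        "slots_per_subframe": slots_per_subframe,
        "slots_per_frame": 10 * slots_per_subframe,
    }

print(nr_slot_structure(0))  # 15 kHz: 1 slot per subframe
print(nr_slot_structure(2))  # 60 kHz: 4 slots per subframe
```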
- OFDM(A) numerology (eg, SCS, CP length, etc.)
- the (absolute time) interval of a time resource (eg, subframe, slot, or TTI)
- TU Time Unit
- multiple numerologies or SCSs may be supported to support various 5G services. For example, when the SCS is 15 kHz, a wide area in traditional cellular bands can be supported; when the SCS is 30 kHz/60 kHz, a dense urban area, lower latency, and a wider carrier bandwidth can be supported; and when the SCS is 60 kHz or higher, a bandwidth greater than 24.25 GHz can be supported to overcome phase noise.
- the NR frequency band may be defined as two types of frequency ranges.
- the two types of frequency ranges may be FR1 and FR2.
- the numerical values of the frequency ranges may be changed; for example, the two types of frequency ranges may be as shown in Table 3 below.
- FR1 may mean “sub 6GHz range”
- FR2 may mean “above 6GHz range” and may be called millimeter wave (mmW).
- mmW millimeter wave
- FR1 may include a band of 410 MHz to 7125 MHz as shown in Table 4 below. That is, FR1 may include a frequency band of 6 GHz (or 5850, 5900, 5925 MHz, etc.) or higher. For example, a frequency band of 6 GHz (or 5850, 5900, 5925 MHz, etc.) included in FR1 may include an unlicensed band. The unlicensed band may be used for various purposes, for example, for communication for a vehicle (eg, autonomous driving).
- FIG. 7 illustrates a slot structure of an NR frame according to an embodiment of the present disclosure.
- a slot includes a plurality of symbols in the time domain.
- in the case of a normal CP, one slot may include 14 symbols, but in the case of an extended CP, one slot may include 12 symbols.
- alternatively, in the case of a normal CP, one slot may include 7 symbols, but in the case of an extended CP, one slot may include 6 symbols.
- a carrier includes a plurality of subcarriers in the frequency domain.
- a resource block (RB) may be defined as a plurality of (eg, 12) consecutive subcarriers in the frequency domain.
- BWP Bandwidth Part
- PRB Physical Resource Block
- a carrier may include a maximum of N (eg, 5) BWPs. Data communication may be performed through the activated BWP.
- Each element in the resource grid may be referred to as a resource element (RE), and one complex symbol may be mapped to it.
- RE resource element
- the wireless interface between the terminal and the terminal or the wireless interface between the terminal and the network may be composed of an L1 layer, an L2 layer, and an L3 layer.
- the L1 layer may mean a physical layer.
- the L2 layer may mean at least one of a MAC layer, an RLC layer, a PDCP layer, and an SDAP layer.
- the L3 layer may mean an RRC layer.
- V2X or SL (sidelink) communication will be described.
- FIG. 8 illustrates a radio protocol architecture for SL communication according to an embodiment of the present disclosure. Specifically, FIG. 8(a) shows a user plane protocol stack of LTE, and FIG. 8(b) shows a control plane protocol stack of LTE.
- FIG. 9 illustrates a radio protocol architecture for SL communication according to an embodiment of the present disclosure. Specifically, FIG. 9(a) shows a user plane protocol stack of NR, and FIG. 9(b) shows a control plane protocol stack of NR.
- the transmission mode may be referred to as a mode or a resource allocation mode.
- a transmission mode in LTE may be referred to as an LTE transmission mode
- a transmission mode in NR may be referred to as an NR resource allocation mode.
- (a) of FIG. 10 shows a terminal operation related to LTE transmission mode 1 or LTE transmission mode 3.
- (a) of FIG. 10 shows a terminal operation related to NR resource allocation mode 1.
- LTE transmission mode 1 may be applied to general SL communication
- LTE transmission mode 3 may be applied to V2X communication.
- (b) of FIG. 10 shows a terminal operation related to LTE transmission mode 2 or LTE transmission mode 4.
- (b) of FIG. 10 shows a terminal operation related to NR resource allocation mode 2.
- the base station may schedule an SL resource to be used by the terminal for SL transmission.
- the base station may perform resource scheduling to UE 1 through a PDCCH (more specifically, Downlink Control Information (DCI)), and UE 1 may perform V2X or SL communication with UE 2 according to the resource scheduling.
- DCI Downlink Control Information
- UE 1 transmits SCI (Sidelink Control Information) to UE 2 through a Physical Sidelink Control Channel (PSCCH), and then transmits data based on the SCI to UE 2 through a Physical Sidelink Shared Channel (PSSCH).
- SCI Sidelink Control Information
- PSCCH Physical Sidelink Control Channel
- PSSCH Physical Sidelink Shared Channel
- the UE may be provided with or allocated resources for one or more SL transmissions of one TB (Transport Block) from the base station through a dynamic grant.
- the base station may provide a resource for transmission of the PSCCH and/or PSSCH to the terminal by using a dynamic grant.
- the transmitting terminal may report the SL HARQ (Hybrid Automatic Repeat Request) feedback received from the receiving terminal to the base station.
- PUCCH resources and timing for reporting SL HARQ feedback to the base station may be determined based on an indication in the PDCCH for the base station to allocate resources for SL transmission.
- DCI may indicate a slot offset between DCI reception and a first SL transmission scheduled by DCI.
- the minimum gap between the DCI scheduling the SL transmission resource and the first scheduled SL transmission resource may not be smaller than the processing time of the corresponding terminal.
- the terminal may be provided or allocated a resource set from the base station periodically for a plurality of SL transmissions through a configured grant.
- the configured grant may include configured grant type 1 or configured grant type 2.
- the terminal may determine the TB to transmit in each occasion indicated by a given configured grant.
- the base station may allocate the SL resource to the terminal on the same carrier or on different carriers.
- the NR base station may control LTE-based SL communication.
- the NR base station may transmit the NR DCI to the terminal to schedule the LTE SL resource.
- a new RNTI for scrambling the NR DCI may be defined.
- the terminal may include an NR SL module and an LTE SL module.
- the NR SL module may convert the NR SL DCI into LTE DCI type 5A, and may deliver LTE DCI type 5A to the LTE SL module in units of X ms.
- the LTE SL module may apply activation and/or release to the first LTE subframe after Z ms.
- the X may be dynamically indicated using a field of DCI.
- the minimum value of X may be different according to UE capability.
- the terminal may report a single value according to the terminal capability.
- X may be a positive number.
- the terminal may determine the SL transmission resource within the SL resources configured by the base station/network or the preconfigured SL resources.
- the configured SL resource or the preset SL resource may be a resource pool.
- the UE may autonomously select or schedule a resource for SL transmission.
- the terminal may perform SL communication by selecting a resource by itself within a set resource pool.
- the terminal may select a resource by itself within the selection window by performing a sensing and resource (re)selection procedure.
- the sensing may be performed in units of subchannels.
- UE 1 which has selected a resource within the resource pool, transmits the SCI to UE 2 through the PSCCH, and may transmit data based on the SCI to UE 2 through the PSSCH.
- the terminal may help select an SL resource for another terminal.
- the UE may receive a configured grant for SL transmission.
- the terminal may schedule SL transmission of another terminal.
- the UE may reserve an SL resource for blind retransmission.
- the first terminal may indicate to the second terminal the priority of SL transmission by using SCI.
- the second terminal may decode the SCI, and the second terminal may perform sensing and/or resource (re)selection based on the priority.
- the resource (re)selection procedure may include a step in which the second terminal identifies candidate resources within a resource selection window, and a step in which the second terminal selects a resource for (re)transmission from among the identified candidate resources.
- the resource selection window may be a time interval during which the terminal selects a resource for SL transmission.
- the resource selection window may start at T1 ≥ 0, and may be limited by the remaining packet delay budget of the second terminal.
- if a specific resource is indicated by the SCI received by the second terminal from the first terminal, and the L1 SL RSRP measurement value for the specific resource exceeds the SL RSRP threshold, the second terminal may not determine the specific resource as a candidate resource.
- the SL RSRP threshold may be determined based on the priority of the SL transmission indicated by the SCI received by the second terminal from the first terminal and the priority of the SL transmission on the resource selected by the second terminal.
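The sensing-based exclusion above can be sketched as follows. This is an illustrative simplification of the candidate-identification step, not the exact 3GPP procedure; the threshold table and all numeric values are hypothetical placeholders:

```python
# Illustrative sketch: exclude candidate resources whose measured L1 SL RSRP
# (from another UE's reservation) exceeds a threshold; the threshold depends
# on the priority signaled in the received SCI and the priority of the UE's
# own transmission. Threshold values below are made up for illustration.
RSRP_THRESHOLD_DBM = {(p_rx, p_tx): -110 + 2 * (p_rx + p_tx)
                      for p_rx in range(1, 9) for p_tx in range(1, 9)}

def select_candidates(candidates, reservations, own_priority):
    """candidates: list of resource ids; reservations: {id: (rsrp_dbm, priority)}."""
    remaining = []
    for res in candidates:
        if res in reservations:
            rsrp, prio = reservations[res]
            if rsrp > RSRP_THRESHOLD_DBM[(prio, own_priority)]:
                continue  # reserved by another UE and received too strongly: exclude
        remaining.append(res)
    return remaining

# Resource 3 is reserved with a strong RSRP, resource 7 with a weak one.
reservations = {3: (-80.0, 1), 7: (-120.0, 2)}
print(select_candidates(list(range(10)), reservations, own_priority=4))
```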
- the L1 SL RSRP may be measured based on an SL DMRS (Demodulation Reference Signal).
- SL DMRS Demodulation Reference Signal
- one or more PSSCH DMRS patterns may be set or preset for each resource pool in the time domain.
- the frequency domain pattern of the PSSCH DMRS may be the same as or similar to PDSCH DMRS configuration type 1 and/or type 2.
- the exact DMRS pattern may be indicated by the SCI.
- the transmitting terminal may select a specific DMRS pattern from among DMRS patterns configured or preset for the resource pool.
- the transmitting terminal may perform initial transmission of a TB (Transport Block) without reservation. For example, based on the sensing and resource (re)selection procedure, the transmitting terminal may reserve the SL resource for the initial transmission of the second TB by using the SCI associated with the first TB.
- a TB Transport Block
- the UE may reserve a resource for feedback-based PSSCH retransmission through signaling related to previous transmission of the same TB (Transport Block).
- the maximum number of SL resources reserved by one transmission including the current transmission may be two, three, or four.
- the maximum number of SL resources may be the same regardless of whether HARQ feedback is enabled.
- the maximum number of HARQ (re)transmissions for one TB may be limited by configuration or preset.
- the maximum number of HARQ (re)transmissions may be up to 32.
- the maximum number of HARQ (re)transmissions may be unspecified.
- the setting or preset may be for a transmitting terminal.
- HARQ feedback for releasing resources not used by the UE may be supported.
- the UE may indicate to another UE one or more subchannels and/or slots used by the UE by using SCI.
- the UE may indicate to another UE one or more subchannels and/or slots reserved by the UE for PSSCH (re)transmission by using SCI.
- the minimum allocation unit of the SL resource may be a slot.
- the size of the subchannel may be set for the terminal or may be preset.
- SCI Sidelink Control Information
- Control information transmitted by the base station to the terminal through the PDCCH may be referred to as downlink control information (DCI), while control information transmitted by the terminal to another terminal through the PSCCH may be referred to as SCI.
- DCI downlink control information
- SCI control information transmitted by the terminal to another terminal through the PSCCH
- the UE may know the start symbol of the PSCCH and/or the number of symbols of the PSCCH.
- the SCI may include SL scheduling information.
- the UE may transmit at least one SCI to another UE to schedule the PSSCH.
- one or more SCI formats may be defined.
- the transmitting terminal may transmit the SCI to the receiving terminal on the PSCCH.
- the receiving terminal may decode one SCI to receive the PSSCH from the transmitting terminal.
- the transmitting terminal may transmit two consecutive SCIs (eg, 2-stage SCI) to the receiving terminal on the PSCCH and/or the PSSCH.
- the receiving terminal may decode two consecutive SCIs (eg, 2-stage SCI) to receive the PSSCH from the transmitting terminal.
- the SCI configuration fields are divided into two groups in consideration of the (relatively) high SCI payload size
- the SCI including the first SCI configuration field group is called the first SCI or the 1st SCI.
- the SCI including the second SCI configuration field group may be referred to as a second SCI or a 2nd SCI.
- the transmitting terminal may transmit the first SCI to the receiving terminal through the PSCCH.
- the transmitting terminal may transmit the second SCI to the receiving terminal on the PSCCH and/or the PSSCH.
- the second SCI may be transmitted to the receiving terminal through (independent) PSCCH, or may be piggybacked and transmitted together with data through PSSCH.
- two consecutive SCIs may be applied for different transmissions (eg, unicast, broadcast, or groupcast).
- the transmitting terminal may transmit some or all of the following information to the receiving terminal through SCI.
- the transmitting terminal may transmit some or all of the following information to the receiving terminal through the first SCI and/or the second SCI.
- PSSCH- and/or PSCCH-related resource allocation information, for example, time/frequency resource location/number, resource reservation information (eg, period), and/or
- SL CSI transmission indicator (or SL (L1) RSRP (and/or SL (L1) RSRQ and/or SL (L1) RSSI) information transmission indicator), and/or
- NDI New Data Indicator
- RV Redundancy Version
- QoS information (eg, priority information), and/or
- Reference signal related information (eg, DMRS, etc.), and/or
- Information related to decoding and/or channel estimation of data transmitted through the PSSCH, for example, information related to the (time-frequency) mapping resource pattern of the DMRS, rank information, and antenna port index information.
- the first SCI may include information related to channel sensing.
- the receiving terminal may decode the second SCI using the PSSCH DMRS.
- a polar code used for the PDCCH may be applied to the second SCI.
- the payload size of the first SCI may be the same for unicast, groupcast and broadcast.
- the receiving terminal does not need to perform blind decoding of the second SCI.
- the first SCI may include scheduling information of the second SCI.
- since the transmitting terminal may transmit at least one of the SCI, the first SCI, and/or the second SCI to the receiving terminal through the PSCCH, the PSCCH may be replaced/substituted with at least one of the SCI, the first SCI, and/or the second SCI. And/or, for example, the SCI may be replaced/substituted with at least one of the PSCCH, the first SCI, and/or the second SCI. And/or, for example, since the transmitting terminal may transmit the second SCI to the receiving terminal through the PSSCH, the PSSCH may be replaced/substituted with the second SCI.
- When a plurality of objects observe a specific object to measure information related to it (for example, location information), each observer may have a different measurement error value, and a specific measurement value among them may have a small error range or high reliability. Alternatively, even for measurement values with the same statistical characteristics (eg, error range), reliable data can be obtained by taking many samples and processing them to reduce the error range. Therefore, the measured values may be used to determine the presence/detection of a specific entity, or positioning with better performance may be possible.
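The two effects above (favoring the observer with the smaller error, and shrinking the error by averaging many samples) can be sketched with a simple inverse-variance fusion; this is an illustrative helper with made-up numbers, not a procedure from the document:

```python
import math

# Sketch: combining independent position measurements with known error ranges.
# (1) Inverse-variance weighting favors the observer with the smaller error.
# (2) Averaging N equal-quality samples shrinks the error by a factor sqrt(N).
def fuse(measurements):
    """measurements: list of (value, sigma). Returns (fused value, fused sigma)."""
    weights = [1.0 / (s * s) for _, s in measurements]
    fused = sum(w * v for (v, _), w in zip(measurements, weights)) / sum(weights)
    sigma = math.sqrt(1.0 / sum(weights))
    return fused, sigma

# One GNSS-like fix (sigma 10 m) fused with one camera-like fix (sigma 1 m):
value, sigma = fuse([(12.0, 10.0), (10.0, 1.0)])
print(round(value, 2), round(sigma, 2))  # result stays close to the camera fix

# Averaging 100 samples of equal sigma: error shrinks from 10 m toward 1 m.
print(round(fuse([(10.0, 10.0)] * 100)[1], 2))
```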
- error range statistical characteristics
- A road user such as a vehicle, or an infrastructure such as an RSU, transmits a V2X message (eg, VRU or pedestrian related) to surrounding road users (eg, a target VRU, pedestrian, or surrounding vehicle, etc.) or receives it from the surrounding upper network (eg, an eNB or a V2X server that collects and distributes such messages), and this message may include, for example, the location information of a specific road user (eg, a VRU or pedestrian). At this time, when the location information is obtained by the specific road user through GNSS and received directly from the corresponding road user or through the network, the V2X message may include the measurement error of the GNSS receiver.
- V2X message eg, VRU, pedestrian-related
- The location of the road user may be detected and compared with the received information. As a result, if there is (at least) a difference between the received information and the information detected directly through a camera or another ADAS sensor or device, the comparison result may be fed back (to the sender) or corrected information may be delivered to the surrounding road users. Alternatively, when the RSU receives a PSM of a specific UE or VRU, the RSU may correct the location information included in the received PSM based on the location information of the specific UE or VRU obtained from a detection device included in the RSU, and may inform or feed back the corrected information to surrounding VRUs, neighboring UEs, or the network. Here, the RSU may transmit the corrected location information to neighboring VRUs through sidelink signals such as the PSCCH and PSSCH.
- sidelink signals such as PSCCH and PSSCH.
- The VRU-related message is not limited to a PSM message, and may be carried in any type of message that can use or propagate the information of the VRU, such as a VAM, a CPM, or other types of V2X messages.
- An RSU is an infrastructure that can perform direct communication (eg, through the PC5 interface) over V2X dedicated spectrum with road users such as VRUs or vehicles. It detects and predicts collisions between VRUs and vehicles (or other road users), and can be used to improve the protection of VRUs based on its ability to detect VRUs that have no communication equipment.
- the collected information of road users can be transmitted to the base station through cellular spectrum (eg, the Uu interface), and the information can be processed by the upper V2X server so that nearby road users can receive safety messages.
- the RSU may be located near the VRU on the road or in a VRU dense area. At this time, the RSU may play a role in detecting the VRU by mounting a camera or receiving a message transmitted by the VRU and delivering it to a user on the surrounding road.
- the RSU receives the PSM message (or a Collective Perception Message (CPM), another V2X message, etc.) of the VRU (S1101 in FIG. 11), and may determine the location information of the VRU based on the first location information of the VRU obtained through image information and the second location information of the VRU obtained through the PSM message (S1102 in FIG. 11).
- the RSU may transmit the location information of this VRU to the VRU (S1103 in FIG. 11).
- the location information of the VRU may also be provided to other VRUs or RSUs.
- the location information of the VRU may include a value corresponding to a correction level or reliability.
- the value corresponding to the correction level or reliability may be a ratio or weight level of the location information obtained from the image information with respect to the PSM information, a value corresponding thereto, or information related to the accuracy of the corrected location information to be described later.
- the location information of the VRU may include weight information of the first location information with respect to the second location information used for location determination.
- the location information of the VRU, or the location information exchanged between each subject before correction, may include information related to the source of the acquired location information (eg, location information about the target VRU acquired from another RSU, directly acquired location information of the target VRU, location information on the target VRU acquired from another VRU, etc.).
- The PSM message is a message propagated by VRUs such as pedestrians to road users other than VRUs, such as vehicle drivers, and is used to broadcast safety data including the movement status (position, speed, direction, etc.) of the road user and related information (path history, position error, etc.). In particular, it is transmitted to nearby vehicles and used as a warning message to identify and call attention to the presence of road users that are difficult to detect with the driver's naked eye or a vehicle sensor. It can also support additional functions such as route prediction and density determination.
- A VRU (Vulnerable Road User) is a road user who is not protected by a vehicle, and is defined by the European Commission (EC) ITS Directive as non-motorized road users, such as pedestrians and cyclists, as well as motorcyclists and persons with disabilities or reduced mobility.
- a VRU may or may not have communication equipment such as V2X.
- a VRU having GNSS or GPS may measure its location through this.
- the PSM message may include location information obtained by the VRU through GNSS.
- a VRU without GNSS may know its location through an RSU, a VRU, or a base station.
- the PSM message may include location information obtained by the VRU through another RSU, another VRU, or a base station.
- the VRU receives location information and location error information from the GNSS (or the location error information may be predetermined according to the type of GNSS service or satellite it receives) and may propagate it to the surroundings, and this information may be transmitted to a nearby RSU directly or through a network.
- the VRU transmits the location information measured through GNSS, etc., to surrounding road users (via PC5) and/or the network (via the Uu interface) through a PSM message. For GNSS-based positioning (based on personal communication devices), an error range of about 10 m can be assumed.
- the image information may be captured by the RSU in an observation area, and the image information may be shared with other RSUs.
- the RSU is an RSU including a function of acquiring and processing image information, and may have the ability to extract and utilize location information based on the image information.
- the RSU may include a device such as a camera/camcorder for direct image capturing and an image processing device, or may be connected by wire/wireless from these devices.
- the RSU may use its installed position or the like as a reference point for relative positioning, and when the VRU is detected in an image from a camera installed in the RSU, the relative position of the VRU may be determined through preset information or the like.
- location information of the corresponding VRU can be obtained.
- the accuracy of VRU detection and/or VRU positioning may be increased.
- the first location information of the VRU is obtained through image information.
- the RSU is installed at a fixed location such as the roadside, so the position indicated by each pixel in the image captured by the camera attached to the RSU is determined. That is, if the pixel at which a subject appears is known, its location value can also be known. However, since the resolution varies with the distance from the camera, the error of the position value also varies. That is, unless the camera captures a top view, the error range differs for each pixel.
- Suppose the RSU camera has a (horizontal) angle of view of 120 degrees, a 16:9 image sensor and display are installed, the resolution of the display (assumed for convenience to have the same pixels as the image sensor) is HD (1280 X 720), and the camera is installed 5 m above the ground.
- the top row of the display shows a subject at a distance of 508 m and represents a length of 1 m (horizontal) X 63 m (vertical) per pixel.
- the line corresponding to the 360th pixel row, which is the midpoint, shows a subject at a distance of 5 m, and the length per pixel is 1.9 cm (horizontal) X 2.7 cm (vertical).
- the line corresponding to the 720th pixel row, which is the lowest part, shows a subject at a distance of 5.6 cm from the camera and represents a length of 0.97 cm (horizontal) X 0.7 cm (vertical) per pixel.
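The row-to-distance relationship above can be sketched with a simple pinhole camera model. All parameters below (sensor size, tilt angle, focal-length derivation) are our assumptions for illustration; the document's exact figures may come from a slightly different projection model, so only the trend (hundreds of metres at the top row, a few metres at the middle, centimetres at the bottom) is reproduced:

```python
import math

# Assumptions (not from the document): 1280x720 sensor, 120-degree horizontal
# angle of view, camera 5 m above ground, optical axis tilted 45 degrees down
# so that the middle row looks at the ground about 5 m away.
H = 5.0                      # camera height in metres
TILT = math.radians(45.0)    # downward tilt of the optical axis
# Pinhole focal length in pixels: half the image width / tan(half the HFOV).
F_PX = 640 / math.tan(math.radians(60.0))

def ground_distance(row, rows=720):
    """Horizontal ground distance seen by a given pixel row (0 = top)."""
    # Angle of this row above (+) or below (-) the optical axis.
    offset = math.atan((rows / 2 - row) / F_PX)
    depression = TILT - offset
    if depression <= 0:
        return math.inf  # this row looks at or above the horizon
    return H / math.tan(depression)

for row in (0, 360, 719):
    print(row, round(ground_distance(row), 3))
```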
- As shown above, the detection error can vary depending on the pixel position, particularly the vertical pixel position (row) that determines the distance to the VRU.
- When the RSU detects surrounding VRUs through the RSU camera, the closer the distance between the image sensor (ie, the camera lens) and the VRU, the finer the image resolution (one pixel contains information of a smaller area) and the smaller the position error of the VRU.
- Conversely, the longer the distance between the camera (lens) and the VRU, the coarser the image resolution (one pixel contains information of a larger area) and the greater the position error of the VRU.
- the RSU may detect the location of the VRU with a large error through a camera image, etc. (at a far distance), or may detect the location of the VRU with a smaller error than GNSS-based location measurement (at a close distance).
- the RSU may not be able to detect the VRU in the video, or may not detect the VRU accurately due to the nature of the detection environment. For example, in a dark environment after sunset, when an RSU camera without an infrared lens or the like tries to detect a VRU wearing dark clothes, the VRU may not be detected accurately.
- the location information of the VRU is determined based on the first location information of the VRU obtained through image information and the second location information of the VRU obtained through the PSM message.
- the first location information and the second location information each have an error range. Since the error ranges differ depending on the methods by which the location information is generated, it may be appropriate to use the data having the smaller error range as a representative value.
- that is, the location information of the VRU may be determined as the value having the smaller error range among the first location information and the second location information, for example, as shown in FIG. 13.
- if the error range of the location information carried in the PSM message is larger on the x-axis but smaller on the y-axis than the error range of the location information of the VRU detected through the image (for example, x-axis 3 m, y-axis 15 m), the x-axis value of the VRU location information is taken from the x-axis of the VRU location information (p'_c,p1) detected through the image, and the y-axis value from the PSM message.
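The per-axis selection just described can be sketched as follows. This is an illustrative sketch; positions are assumed to be (x, y) tuples and error ranges per-axis values in metres:

```python
def fuse_per_axis(img_pos, img_err, psm_pos, psm_err):
    """Pick, per axis, the position candidate whose error range is smaller.

    img_pos/psm_pos: (x, y) positions; img_err/psm_err: (x_err, y_err).
    Returns the fused position and the error range it inherits per axis.
    """
    fused_pos, fused_err = [], []
    for axis in range(2):
        if img_err[axis] <= psm_err[axis]:
            fused_pos.append(img_pos[axis])
            fused_err.append(img_err[axis])
        else:
            fused_pos.append(psm_pos[axis])
            fused_err.append(psm_err[axis])
    return tuple(fused_pos), tuple(fused_err)
```

With the example above (image error 3 m on x, 15 m on y; PSM error larger on x, smaller on y), the fused value takes x from the image detection and y from the PSM message.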
- that is, the RSU obtains the second location information of the specific VRU (or UE) from the PSM message and the first location information of the specific VRU (or UE) from the imaging device, and compares the obtained first location information with the second location information. When the difference is within a predetermined threshold, the location information for the VRU may be corrected based on the second location information, where the predetermined threshold may be determined based on the error range related to the imaging device.
- the corrected location information of the VRU may be delivered to a neighboring VRU or neighboring UEs as position information on the specific VRU, or may be fed back to a network.
- the location information of the VRU may be determined based on an error range obtained by averaging the error range of the first location information and the error range of the second location information. That is, a method of averaging position errors by taking the average of samples (or candidates) of position values may be considered.
- in this case, both the SemiMajorAxisAccuracy and SemiMinorAxisAccuracy fields are mapped to a value corresponding to 7.38 m (for example, a value of 148 when expressed in units of 0.05 m, i.e., 10010100 in binary).
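The mapping of an error range into such an accuracy field can be sketched as below, assuming the 0.05 m granularity mentioned above; the saturation cap and the function name are illustrative assumptions, not taken from the message definition:

```python
def accuracy_to_code(err_m, unit=0.05, max_code=254):
    """Map an accuracy value in metres to an integer field value in
    0.05 m units (the saturating cap is an assumed illustration)."""
    return min(max_code, round(err_m / unit))

# 7.38 m maps to code 148, i.e. 10010100 in binary, as in the example above
code = accuracy_to_code(7.38)
```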
- meanwhile, the location information of the VRU may be included in the intersection of the area corresponding to the error range of the first location information and the area corresponding to the error range of the second location information. If the error ranges of the samples of the position value overlap and the corrected position value needs to be determined within the overlapping region, it may be corrected in the same way as in FIG. 13 (d).
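One simple way to pick a corrected value inside the overlapping region, treating each error range as an axis-aligned box around its position sample, is sketched below. This is an illustrative assumption, not the exact construction of FIG. 13 (d):

```python
def overlap_point(c1, e1, c2, e2):
    """Midpoint of the per-axis overlap of two error boxes, or None.

    c1/c2: box centres (x, y); e1/e2: half-width error ranges per axis.
    """
    point = []
    for axis in range(2):
        lo = max(c1[axis] - e1[axis], c2[axis] - e2[axis])
        hi = min(c1[axis] + e1[axis], c2[axis] + e2[axis])
        if lo > hi:
            return None  # the error boxes do not overlap on this axis
        point.append((lo + hi) / 2)
    return tuple(point)
```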
- that is, the RSU may correct the location information of the VRU based on the first location information and the second location information. For example, the RSU may correct the location information for the VRU by averaging the first location information and the second location information, or by averaging the error range related to the first location information and the error range related to the second location information.
- the RSU may correct the location information for the VRU based on an overlapping area between the first location information and the second location information.
- the corrected location information of the VRU may be delivered to a neighboring VRU or neighboring UEs as position information on the specific VRU, or may be fed back to a network.
- p'_p1 = a*p'_c,p1 + (1-a)*p_p1 may be determined in the form of an interpolation of the two position values, and the position error at this time is sqrt( a*(3m)^2 + (1-a)*(3m)^2 ). Meanwhile, the above-described method may assume an overlap between the first area according to the first location information and its error range and the second area according to the second location information and its error range.
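The interpolation above can be written out directly as a sketch; the error expression mirrors the form stated in the text, with 3 m used as the example error of both inputs:

```python
import math

def interpolate(p_img, p_psm, a, err_img=3.0, err_psm=3.0):
    """Weighted combination p' = a*p_img + (1-a)*p_psm of the two
    position estimates, with the error form stated in the text."""
    pos = tuple(a * pi + (1 - a) * pp for pi, pp in zip(p_img, p_psm))
    err = math.sqrt(a * err_img ** 2 + (1 - a) * err_psm ** 2)
    return pos, err
```

With a = 0.5 and both errors at 3 m, the combined position is the midpoint and the stated error expression evaluates to 3 m.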
- however, the above error range may indicate a nominal error range such as the standard deviation of the error value, and the actual position value may lie outside this range. Therefore, the above methods can be equally applied even when the error range of the location information carried in the PSM message and the error range of the location information of the VRU detected through the image do not overlap, as shown in FIG. 14 (a). However, when the difference between the two position values is very large, for example when the error ranges are separated by d_th or more from each other as shown in FIG. 14 (b), at least one position value may be regarded as incorrect.
- the RSU may not perform feedback requesting error correction or generation of an associated VRU message (PSM, VRU message, CPM message, etc.).
- a value having a larger error range may be regarded as incorrect and the RSU may not perform any operation or may perform feedback requesting error correction or generation of an associated VRU message in some cases.
- the above operation can be equally applied when the location of the PSM message of the associated VRU points to an area other than the image display area.
- only the values necessary for correction, e.g., p_c,k and p_c,(k+1), or their corrected form p'_c,p1, together with the associated time value(s), error range, etc.
- the VRU and the RSU may be synchronized to the same base station; even if they belong to different base stations, the V2X server may process data according to each base station's timing, or they may at least use the same GNSS reference time.
- it is checked whether the same VRU is indicated by comparing the generation time of the PSM message with the VRU detection time in the video, that is, by comparing the generation-time- and location-related information of the VRU message (for example, the DSecond field and the position-related fields of the PSM message) with the image information (the frame creation time of the VRU entity detected in each frame and the position calculated for the detected VRU). For example, suppose that the PSM message is generated at a time point t_p1 between specific frames of the camera image, that is, at a specific time point between t_k and t_(k+1). At this time, as shown in FIG. 15, a value p'_c,p1, generated by appropriately correcting the positions detected at the adjacent frames, may be compared with the location information p_p1 of the VRU carried in the PSM message generated at time t_p1.
- the position error value (or range) of the VRU at each time may also be corrected in the same way.
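The time alignment described above, interpolating the VRU positions detected at frames t_k and t_(k+1) to the PSM generation time t_p1, can be sketched as a simple linear interpolation (an assumed illustration of the "appropriate correction"):

```python
def position_at(t_p, t_k, t_k1, p_k, p_k1):
    """Linearly interpolate the VRU positions detected at frame times
    t_k and t_k1 to the PSM generation time t_p (t_k <= t_p <= t_k1)."""
    w = (t_p - t_k) / (t_k1 - t_k)
    return tuple((1 - w) * a + w * b for a, b in zip(p_k, p_k1))
```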
- the offset for the image capture may be adjusted so that the gap between the PSM message reception time and the image capture time is reduced to a minimum. For example, when the timing shown in FIG. 15 (a) is given, if the PSM message transmission interval (e.g., T_p) is 1 s, image capture should also occur one second after t_p1 (e.g., at t_p2). Assuming that the offset value is t_ofs and the interval between frames of the image is T_c, this can be summarized as in Equation 1 below.
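Equation 1 itself is not reproduced in this excerpt, so the idea can only be sketched under an assumption: choose the frame-grid offset t_ofs so that one capture instant coincides with t_p1, in which case every later PSM time t_p1 + n*T_p also lands on a frame whenever T_p is an integer multiple of T_c:

```python
def capture_offset(t_p1, T_c):
    """Offset of the frame grid (frames at t_ofs + m*T_c) chosen so that
    one capture instant coincides with the PSM time t_p1."""
    return t_p1 % T_c
```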
- hereinafter, the configuration of the RSU camera for reducing the position error will be described. When detecting the position of the VRU based on the camera image, the error value varies greatly depending on the vertical pixel position on the image sensor or display. That is, as shown in FIG. 16 (a), the closer the subject, the smaller the error value, and the farther the subject, the larger the error value.
- as shown in FIG. 16 (b), another RSU and camera may be installed on the side opposite to where the first RSU and camera are installed, and, similarly to the method described above, the value with the smaller error range among the values detected from the two cameras may be used as the representative value.
- to this end, image data or detected VRU position values are shared among the RSUs, and at least one RSU performs the operation of correcting the VRU position values; alternatively, each RSU may transmit its image data or detected VRU position values up to the network so that the operation of correcting the position value of the VRU is performed in the V2X application server or the like.
- that is, where the error range increases for one RSU camera, it decreases for the other RSU camera. If the error ranges of the measured (or corrected) data are the same (or their difference is less than a certain threshold value), the two data may be averaged and used.
- alternatively, two RSU cameras may be configured to photograph a specific VRU from different directions. More specifically, while the above-described method arranges RSU1 and RSU2 at an interval of 180 degrees, this method arranges RSU1 and RSU2 at an interval of 90 degrees. In this case, even if a VRU (commonly) entering the range of the RSU cameras experiences a large error in the x-axis or y-axis direction from a specific RSU camera, the error on the corresponding axis can be offset by the remaining RSU camera.
- as shown in FIG. 17B, it is appropriate for VRU1 to use the value measured and detected by RSU1 as its representative value, and for VRU4 to use the value measured and detected by RSU2 as its representative value.
- since VRU2 can be accurately detected by both RSU1 and RSU2, the value with the smaller error may be taken between the two, or, if the error ranges of the measured (or corrected) data are the same (or differ by less than a certain threshold value), the two data may be averaged.
- for VRU3, RSU1 measures and detects the x-axis data of VRU3 with a small error, and RSU2 measures and detects the y-axis data of VRU3 with a small error. In this case, the x-axis data of RSU1 and the y-axis data of RSU2 are selectively taken.
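The per-axis combination across the two cameras, selecting the smaller-error axis or averaging when the error ranges are close, might look like the sketch below; the threshold name d_th and its value are assumed for illustration:

```python
def fuse_two_rsus(p1, e1, p2, e2, d_th=0.5):
    """Per axis: average the two RSU estimates when their error ranges
    are within d_th of each other, otherwise keep the smaller-error one.

    p1/p2: (x, y) positions from RSU1/RSU2; e1/e2: per-axis error ranges.
    """
    pos = []
    for axis in range(2):
        if abs(e1[axis] - e2[axis]) <= d_th:
            pos.append((p1[axis] + p2[axis]) / 2)  # errors comparable: average
        elif e1[axis] < e2[axis]:
            pos.append(p1[axis])                   # RSU1 more accurate on this axis
        else:
            pos.append(p2[axis])                   # RSU2 more accurate on this axis
    return tuple(pos)
```

For the VRU3 case above (RSU1 accurate on x, RSU2 accurate on y), this selects RSU1's x-value and RSU2's y-value; for the VRU2 case with comparable errors, it averages both axes.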
- the RSU may shift the imaging angle of the imaging device or camera through the following methods, and may correct the location information of the VRU by comparing the image of the VRU before the shift and the image of the VRU after the shift.
- to improve the accuracy of image detection, the camera can be shifted so that the detected VRU is positioned at the bottom of the image (that is, the position with the finest resolution per pixel), or slightly higher than the bottom so as to respond to changes in the position of the moving VRU.
- alternatively, the camera may be shifted so that the VRU detected at the position closest to the RSU is positioned at the bottom of the image (that is, the position where the per-pixel resolution is finest), or higher than the bottom to correspond to changes in the moving position of the VRU, as described above.
- the camera may be shifted so that the largest number of VRUs can be detected.
- when a new VRU is detected by shifting the camera, the detection can be compared with the data before the shift (e.g., the location of the VRU) to improve the location accuracy of the VRU or to determine whether a VRU is present.
- the information related to the location estimation accuracy is received through a positioning device (e.g., GNSS, camera) used by each road user, infrastructure, and network, and/or through a location-based application, etc., and can be obtained statistically through measurement, correction, or calculation.
- the obtained location-related information such as VRUs and pedestrians may be transmitted through the Position3D and PositionAccuracy fields of the PSM message as shown in FIG. 18 .
- when the location information is derived, the ratio or weight level at which the first location information and the second location information are reflected, or a corresponding value, is mapped to a separate field and may be included in a PSM / VAM message or other VRU-related message transmitted by the VRU.
- the location information of the VRU may utilize information obtained from other road users or infrastructure/network as well as information obtained by the VRU device itself, and there may be cases where it is necessary to distinguish this.
- for this purpose, a field indicating the source of the location information (e.g., S1: VRU (0), S2: RSU (1), ...) may be defined, and a field that identifies the unique ID of each source may be added separately.
- an entity that only supports positioning may request to correct the position value while feeding back values related to the fields.
- the eNB receives the VRU location information and related correction information from the VRU or infrastructure (e.g., RSU), and delivers the corrected VRU position value, or a PSM message carrying the corresponding information, to the V2X server. The V2X (application) server that receives the information from the eNB can use the continuously received values to correct the VRU location information. In this way, the eNB may transmit the corrected VRU location information, or a PSM message carrying the corresponding information, to surrounding road users.
- in the above, it is assumed that the VRU is detected from the camera image of the RSU. If the VRU is not detected in the camera image, the following methods may be used.
- first, the location information contained in the PSM message can be recognized as the location of the VRU; if it is not corrected further, the RSU notifies the V2X server that there is no separate feedback or no intention to request correction.
- when the VRU is not detected in the camera image and the location information included in the PSM message indicates a specific point inside the video area as shown in FIG. 19 (a), the actual location of the VRU can be assumed to be some point within the error range around the PSM location information. Therefore, as shown in FIG. 19 (b), a specific point in the intersection of the region outside the image and the PSM position information error range may be recognized as the location of the VRU.
- for example, the location (or extended external pixel) closest to the received PSM location information within the corresponding area may be recognized as the location of the VRU.
- in this case, the boundary area between pixels (half a pixel) may be recognized as the position of the VRU as shown in FIG. 19 (c), or the internal pixel in contact with the external pixel may be recognized as the location of the VRU as shown in FIG. 19 (d).
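Snapping a PSM position that falls inside the image area to the closest point on the image-area boundary is one simple reading of the "closest location or pixel" rule above. The rectangle bounds and the axis-aligned treatment are assumed for illustration:

```python
def nearest_boundary_point(p, x_min, x_max, y_min, y_max):
    """Nearest point on the image-area boundary to a PSM position p
    that falls inside the image rectangle."""
    x, y = p
    # candidate projections onto each of the four edges, with distances
    candidates = [((x_min, y), x - x_min), ((x_max, y), x_max - x),
                  ((x, y_min), y - y_min), ((x, y_max), y_max - y)]
    return min(candidates, key=lambda c: c[1])[0]
```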
- the VRU location information in the message transmitted by the VRU may be within the video range of the camera (including a situation where, considering the error range, it may fall outside the video range), or, as shown in FIG. 20, it may be located outside the image range of the camera. In this case, the RSU may shift the photographing position of the camera or imaging device so that the VRU that transmitted the PSM is located within the image range.
- that is, the camera may be moved so that the position of the VRU received in the VRU message falls within the image range of the camera (or at the center of the image, or over most of the image), in order to verify the information of the VRU message.
- in this case, the RSU may estimate the direction and angle by which to shift the photographing position of the camera or imaging device based on the mobility information included in the PSM, and may shift the camera or imaging device in the estimated direction and angle.
- the mobility information may include a moving direction, a moving speed, and the like of the VRU.
- meanwhile, when it is difficult for the RSU to detect the VRU with its imaging device (lack of light, sunset, sunrise, or device failure), the location information of the VRU included in the PSM may be treated as valid VRU location information without performing additional correction.
- otherwise, the RSU feeds back to the V2X server that the location information of the VRU is abnormal, or, in order not to participate in the event, may not give any feedback.
- since the received message provides no basis for judging the information, the received location information is considered valid, and the RSU may feed back to the V2X server that there is no abnormality in the location information of the VRU, or may not provide any feedback in order not to participate in the event.
- in some cases, such as the situation of FIG. 21, the VRU object may be incompletely detected in the image due to reflected light or ambient lighting (from other road users, etc.).
- for example, the detection area may be displayed larger or blurrier than in reality due to the influence of lighting, or it may be suitable for specifying the location as shown in FIG. 21 (d); even if a specific object is detected with a small error, it may be difficult to determine whether the object is a VRU or another object on the road or sidewalk.
- in this case, the methods listed above are used when processing the detected image information; as an example, the error range due to image detection may be applied larger than in the general case (e.g., doubled: 3 m → 6 m) so that the weight for the location information of the PSM message is reflected more strongly. Or, in the extreme, if the difference between the error ranges of the two location values is within a given constant value (i.e., d_th), only the information in the PSM message is considered valid, and the RSU may either feed back to the V2X server that there is no abnormality in the location information of the VRU or give no feedback in order not to participate in the event.
- although the case of FIG. 21 (d) is also an incomplete detection situation, the information of the PSM message alone cannot be regarded as valid. For example, when a set of candidate types such as pedestrian, cyclist, or packaging box (object) is determined (with similar probability) through image processing, the location information (and/or location error range) of the received PSM message(s) is compared with the position detected in the image (p'_c,p1) (and/or its error range), and whether there is a similar (or related) PSM message is determined by applying the methods listed above.
- if such a message exists, the RSU can determine that it is an error in the image detection (VRU type determination) of the camera.
- the RSU may give feedback to the V2X server that there is no abnormality in the VRU's PSM message and location information, or may not give any feedback in order not to participate in the event.
- additionally, the RSU may determine the validity of the location information included in the PSM based on the amount of light or the time of day. For example, when the RSU estimates, based on the time and/or light amount, that image detection is difficult, such as at sunset or at night, it can be estimated that the location information according to the PSM is valid even if the VRU is not clearly detected by the camera at the position corresponding to the location information of the PSM.
- the RSU may determine the validity of the location information of the PSM by increasing an error range of the location information acquired by the imaging device based on time and/or light quantity.
- VRU-related prior information, including the VRU location information included in the VRU's PSM message, may be used as a condition for VRU determination or as an input of an artificial intelligence learning algorithm for VRU detection.
- when related mobility data (e.g., speed, direction, acceleration, etc.) is observed in the image but no matching PSM data (e.g., speed, direction, acceleration, etc.) is received, the entity (e.g., a vehicle) may be determined to be a non-equipped road user. In this case, it is possible to determine whether the type of the road user is, for example, a VRU (e.g., pedestrian, cyclist, etc.) or a vehicle, based on the map data and mobility data of the corresponding road user.
- additional information about the road user may be transmitted through a PSM, a BSM message, or other additional V2X messages (eg, VRU awareness message: VAM).
- data about the appearance of a road user may also be carried and transmitted. More specifically, in the case of a pedestrian road user, information valid for a long period, such as height, gender, and skin color (i.e., long-term information), may simply be input at the time of device registration and initialization. Information that changes frequently from day to day, such as the color of clothes, can be manually input through an application related to the road user (i.e., the VRU), or a reminder requesting a self-camera photo can be sent to induce the user to take a photo. Alternatively, if there is a picture taken before going out, or while out and moving or walking, the application may search for and recognize it on the user's device and use it to generate VRU information, particularly information related to appearance.
- a road user type, or more specifically a vehicle type, a VRU type, etc., may be mapped more explicitly into the messages sent by road users, and this information can be compared with the information detected from the image.
- regarding the type information, in the case of a VRU, for example, the user type may change depending on the situation: pedestrian mode on the sidewalk, passenger mode while riding public transportation, and driver mode or vehicle mode while riding a private vehicle again. This can be determined by using a combination of the device's sensor information or V2X communication (or other communication methods) with the means of transportation used.
- the RSU compares information on the corresponding VRU with a previously received (or received after a predetermined time delay) V2X (VRU) message to check the validity of the information.
- the RSU may feed back a request on the message if information correction is required (or may feed back that correction is not required). The message may be received directly from the VRU through the PC5 interface as shown in FIG. 22, or may be received from the eNB through a higher-level network.
- that is, the VRU may transmit a PSM message to the eNB through a Uu interface, etc., and the eNB may retransmit (deliver) it to the RSU through a Uu interface or a wired interface.
- the eNB may collect or cluster messages of a plurality of VRUs and transmit them to the RSU.
- the detected location is calculated using the method described above, and this value is corrected in consideration of the PSM creation time, etc. It is possible to feed back a VRU location information correction request, process the request at the RSU, regenerate a PSM message, or transmit the changed VRU information through a CPM or other VRU-related message.
- when the RSU requests the V2X server to correct the position or to generate a CPM message, information extracted from the raw data generated during image detection can be transmitted.
- for example, the RSU transmits the time(s) at which the image was captured, the calculated position(s) of the VRU corresponding to those time(s), and the (positioning) accuracy of the corresponding measurement values to the eNB. The V2X server that receives them may transmit again (duplicate) the PSM message on which the location correction has been performed, or a CPM message including the corresponding information, to the surrounding area, or may request location information correction from the VRU.
- a corrected position value may be transmitted to the corresponding VRU, or a correction value of position information (eg, a delta difference value) may be transmitted.
- the RSU may transmit the corrected location information.
- a position value corresponding to the corresponding time point is generated based on image detection (Position3D and PositioningAccuracy, etc.) and transmitted to the eNB.
- the PSM message on which the correction was performed, or the CPM message including the corresponding information, may be transmitted again (duplicated) to the surrounding area, or a correction of the location information may be requested from the corresponding VRU.
- a corrected position value may be transmitted to the corresponding VRU, or a correction value of position information (eg, a delta difference value) may be transmitted.
- the RSU may directly regenerate a PSM message without feeding back a location correction request to the upper network, or may generate a CPM or other VRU-related message and transmit the changed VRU information to the surroundings.
- the PSM message or CPM message generated above may contain information (position estimation, correction value, etc.) generated at the same time point as indicated in the message transmitted by the VRU or eNB, or information generated at the (new) time point at which the camera image is captured, or at another time point.
- a request may be made to the upper network to (newly) generate a PSM message or a CPM message for the related VRU, or the RSU may directly generate a PSM message or an associated CPM message and transmit it to the surroundings.
- when a VRU-specific zone and/or a vehicle-specific zone is set in the area covered by the RSU, the RSU also identifies which zone the VRU belongs to, or at which time the zone is switched, and this can be used as prior information for VRU detection. For example, although the VRU is configured to transmit a PSM message whenever it moves between zones, if the corresponding message is not received even after the point at which the VRU is supposed to transmit the PSM message has passed, or even after waiting for a certain further period of time, the RSU may directly request the VRU to transmit a status change message, or may report VRU detection information and status change (zone switching) information to the upper network.
- a Road Side Unit (RSU) for performing a Vulnerable Road User (VRU)-related operation, comprising: at least one processor; and at least one computer memory operably coupled to the at least one processor and storing instructions that, when executed, cause the at least one processor to perform operations, the operations comprising: receiving a PSM message from a VRU; determining location information of the VRU based on first location information of the VRU obtained through image information and second location information of the VRU obtained through the PSM message; and transmitting the location information of the VRU to the VRU.
- a processor for performing operations for a Road Side Unit (RSU) in a wireless communication system, the operations comprising: receiving a PSM message of a VRU; determining location information of the VRU based on first location information of the VRU obtained through image information and second location information of the VRU obtained through the PSM message; and transmitting the location information of the VRU to the VRU.
- a non-volatile computer-readable storage medium storing at least one computer program comprising instructions that, when executed by at least one processor, cause the at least one processor to perform operations for a UE, the operations comprising: receiving a PSM message; determining location information of the VRU based on first location information of the VRU obtained through image information and second location information of the VRU obtained through the PSM message; and transmitting the location information of the VRU to the VRU.
- a method comprising: the VRU transmitting a PSM message to the RSU; and the VRU receiving the location information of the VRU from the RSU, wherein the location information of the VRU may be determined based on first location information of the VRU obtained by the RSU through image information and second location information of the VRU obtained through the PSM message.
- at least one computer memory operably coupled to at least one processor and storing instructions that, when executed, cause the at least one processor to perform operations that cause the VRU to: transmit a PSM message to the RSU; and receive the location information of the VRU from the RSU, wherein the location information of the VRU may be determined based on first location information of the VRU obtained by the RSU through image information and second location information of the VRU obtained through the PSM message.
- the communication system 1 applied to the present invention includes a wireless device, a base station, and a network.
- the wireless device refers to a device that performs communication using a radio access technology (eg, 5G NR (New RAT), LTE (Long Term Evolution)), and may be referred to as a communication/wireless/5G device.
- the wireless device includes a robot 100a, vehicles 100b-1 and 100b-2, an eXtended Reality (XR) device 100c, a hand-held device 100d, a home appliance 100e, an Internet of Things (IoT) device 100f, and an AI device/server 400.
- the vehicle may include a vehicle equipped with a wireless communication function, an autonomous driving vehicle, a vehicle capable of performing inter-vehicle communication, and the like.
- the vehicle may include an Unmanned Aerial Vehicle (UAV) (eg, a drone).
- XR devices include AR (Augmented Reality)/VR (Virtual Reality)/MR (Mixed Reality) devices, and may be implemented in the form of a Head-Mounted Device (HMD), a Head-Up Display (HUD) provided in a vehicle, a television, a smartphone, a computer, a wearable device, a home appliance, digital signage, a vehicle, a robot, and the like.
- the portable device may include a smart phone, a smart pad, a wearable device (eg, a smart watch, smart glasses), a computer (eg, a laptop computer), and the like.
- Home appliances may include a TV, a refrigerator, a washing machine, and the like.
- the IoT device may include a sensor, a smart meter, and the like.
- the base station and the network may be implemented as a wireless device, and a specific wireless device 200a may operate as a base station/network node to other wireless devices.
- the wireless devices 100a to 100f may be connected to the network 300 through the base station 200 .
- the network 300 may be configured using a 3G network, a 4G (eg, LTE) network, or a 5G (eg, NR) network.
- the wireless devices 100a to 100f may communicate with each other through the base station 200/network 300, but may also communicate directly (e.g. sidelink communication) without passing through the base station/network.
- the vehicles 100b-1 and 100b-2 may perform direct communication (e.g. Vehicle to Vehicle (V2V)/Vehicle to everything (V2X) communication).
- the IoT device (e.g., a sensor) may directly communicate with other IoT devices (e.g., sensors) or other wireless devices 100a to 100f.
- wireless communication/connections 150a, 150b, and 150c may be established between the wireless devices 100a to 100f and the base station 200, between wireless devices, and between the base station 200 and the base station 200.
- here, the wireless communication/connection includes uplink/downlink communication 150a, sidelink communication 150b (or D2D communication), and communication between base stations 150c (e.g., relay, IAB (Integrated Access Backhaul)), and can be performed through various radio access technologies (e.g., 5G NR).
- Wireless communication/connection 150a, 150b, 150c allows the wireless device and the base station/radio device, and the base station and the base station to transmit/receive wireless signals to each other.
- the wireless communication/connection 150a, 150b, and 150c may transmit/receive signals through various physical channels.
- For the transmission/reception of radio signals, various signal processing processes (eg, channel encoding/decoding, modulation/demodulation, resource mapping/demapping, etc.) and resource allocation processes may be performed.
- FIG. 25 illustrates a wireless device that can be applied to the present invention.
- the first wireless device 100 and the second wireless device 200 may transmit and receive wireless signals through various wireless access technologies (eg, LTE, NR).
- {the first wireless device 100, the second wireless device 200} may correspond to {the wireless device 100x, the base station 200} and/or {the wireless device 100x, the wireless device 100x} of FIG. 24.
- the first wireless device 100 includes one or more processors 102 and one or more memories 104 , and may further include one or more transceivers 106 and/or one or more antennas 108 .
- the processor 102 controls the memory 104 and/or the transceiver 106 and may be configured to implement the descriptions, functions, procedures, suggestions, methods, and/or flow charts disclosed herein.
- the processor 102 may process the information in the memory 104 to generate the first information/signal, and then transmit a wireless signal including the first information/signal through the transceiver 106 .
- the processor 102 may receive the radio signal including the second information/signal through the transceiver 106 , and then store the information obtained from the signal processing of the second information/signal in the memory 104 .
- the memory 104 may be connected to the processor 102 and may store various information related to the operation of the processor 102 .
- the memory 104 may store software code including instructions for performing some or all of the processes controlled by the processor 102, or for performing the descriptions, functions, procedures, suggestions, methods, and/or operational flowcharts disclosed herein.
- the processor 102 and the memory 104 may be part of a communication modem/circuit/chip designed to implement a wireless communication technology (eg, LTE, NR).
- the transceiver 106 may be coupled with the processor 102 , and may transmit and/or receive wireless signals via one or more antennas 108 .
- the transceiver 106 may include a transmitter and/or a receiver.
- the transceiver 106 may be used interchangeably with a radio frequency (RF) unit.
- a wireless device may refer to a communication modem/circuit/chip.
- the second wireless device 200 includes one or more processors 202 , one or more memories 204 , and may further include one or more transceivers 206 and/or one or more antennas 208 .
- the processor 202 controls the memory 204 and/or the transceiver 206 and may be configured to implement the descriptions, functions, procedures, suggestions, methods, and/or operational flowcharts disclosed herein.
- the processor 202 may process the information in the memory 204 to generate third information/signal, and then transmit a wireless signal including the third information/signal through the transceiver 206 .
- the processor 202 may receive the radio signal including the fourth information/signal through the transceiver 206 , and then store information obtained from signal processing of the fourth information/signal in the memory 204 .
- the memory 204 may be connected to the processor 202 and may store various information related to the operation of the processor 202 .
- the memory 204 may store software code including instructions for performing some or all of the processes controlled by the processor 202, or for performing the descriptions, functions, procedures, suggestions, methods, and/or operational flowcharts disclosed herein.
- the processor 202 and the memory 204 may be part of a communication modem/circuit/chip designed to implement a wireless communication technology (eg, LTE, NR).
- the transceiver 206 may be coupled to the processor 202 and may transmit and/or receive wireless signals via one or more antennas 208 .
- the transceiver 206 may include a transmitter and/or a receiver.
- the transceiver 206 may be used interchangeably with an RF unit.
- a wireless device may refer to a communication modem/circuit/chip.
- one or more protocol layers may be implemented by one or more processors 102 , 202 .
- one or more processors 102 , 202 may implement one or more layers (eg, functional layers such as PHY, MAC, RLC, PDCP, RRC, SDAP).
- the one or more processors 102, 202 may be configured to process one or more Protocol Data Units (PDUs) and/or one or more Service Data Units (SDUs) according to the description, function, procedure, proposal, method, and/or operational flowcharts disclosed herein.
- One or more processors 102, 202 may generate messages, control information, data, or information according to the description, function, procedure, proposal, method, and/or flow charts disclosed herein.
- the one or more processors 102 and 202 may generate a signal (eg, a baseband signal) including PDUs, SDUs, messages, control information, data, or information according to the functions, procedures, proposals, and/or methods disclosed in this document, and may provide it to one or more transceivers 106 and 206.
- the one or more processors 102, 202 may receive signals (eg, baseband signals) from one or more transceivers 106, 206, and may acquire PDUs, SDUs, messages, control information, data, or information according to the descriptions, functions, procedures, proposals, methods, and/or operational flowcharts disclosed herein.
- One or more processors 102 , 202 may be referred to as a controller, microcontroller, microprocessor, or microcomputer.
- One or more processors 102, 202 may be implemented by hardware, firmware, software, or a combination thereof.
- As an example, one or more processors 102, 202 may be implemented using one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), and/or Field Programmable Gate Arrays (FPGAs).
- Firmware or software may be implemented to include modules, procedures, functions, and the like.
- Firmware or software configured to perform the descriptions, functions, procedures, proposals, methods, and/or operational flowcharts disclosed herein may be included in one or more processors 102, 202, or may be stored in one or more memories 104, 204 and driven by the one or more processors 102, 202.
- the descriptions, functions, procedures, suggestions, methods, and/or flowcharts of operations disclosed herein may be implemented using firmware or software in the form of code, instructions, and/or a set of instructions.
- One or more memories 104, 204 may be coupled with one or more processors 102, 202 and may store various forms of data, signals, messages, information, programs, codes, commands, and/or instructions.
- One or more memories 104 , 204 may be comprised of ROM, RAM, EPROM, flash memory, hard drives, registers, cache memory, computer readable storage media, and/or combinations thereof.
- One or more memories 104 , 204 may be located inside and/or external to one or more processors 102 , 202 .
- one or more memories 104 , 204 may be coupled to one or more processors 102 , 202 through various technologies, such as wired or wireless connections.
- One or more transceivers 106 , 206 may transmit user data, control information, radio signals/channels, etc. referred to in the methods and/or operational flowcharts of this document to one or more other devices.
- One or more transceivers 106, 206 may receive user data, control information, radio signals/channels, etc. referred to in the descriptions, functions, procedures, suggestions, methods, and/or flow charts disclosed herein, from one or more other devices.
- one or more transceivers 106 , 206 may be coupled to one or more processors 102 , 202 and may transmit and receive wireless signals.
- one or more processors 102 , 202 may control one or more transceivers 106 , 206 to transmit user data, control information, or wireless signals to one or more other devices.
- one or more processors 102 , 202 may control one or more transceivers 106 , 206 to receive user data, control information, or wireless signals from one or more other devices.
- one or more transceivers 106, 206 may be coupled to one or more antennas 108, 208, and may be configured to transmit/receive user data, control information, radio signals/channels, etc. referred to in the descriptions, functions, procedures, proposals, methods, and/or operational flowcharts disclosed herein through the one or more antennas 108, 208.
- one or more antennas may be a plurality of physical antennas or a plurality of logical antennas (eg, antenna ports).
- In order to process the received user data, control information, radio signals/channels, etc. using the one or more processors 102, 202, the one or more transceivers 106, 206 may convert the received radio signals/channels, etc. from RF band signals into baseband signals.
- One or more transceivers 106 and 206 may convert user data, control information, radio signals/channels, etc. processed using one or more processors 102 and 202 from baseband signals to RF band signals.
- one or more transceivers 106 , 206 may include (analog) oscillators and/or filters.
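The baseband-to-RF conversion performed by the transceivers can be illustrated with a minimal numerical sketch. This is not the disclosed implementation: real transceivers also apply filtering and amplification, and all function names and parameter values below are illustrative assumptions.

```python
import cmath
import math

def upconvert(baseband, fc, fs):
    """Mix complex baseband samples up to carrier frequency fc (sample rate fs)."""
    return [s * cmath.exp(2j * math.pi * fc * n / fs) for n, s in enumerate(baseband)]

def downconvert(rf, fc, fs):
    """Mix an RF-band signal back down to baseband with a conjugate oscillator."""
    return [s * cmath.exp(-2j * math.pi * fc * n / fs) for n, s in enumerate(rf)]

# A toy baseband "signal": four QPSK-like symbols.
baseband = [1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]
rf = upconvert(baseband, fc=2.0e3, fs=1.0e4)
recovered = downconvert(rf, fc=2.0e3, fs=1.0e4)

# Up- and downconversion with the same oscillator are exact inverses here.
assert all(abs(a - b) < 1e-9 for a, b in zip(baseband, recovered))
```

The oscillator terms correspond to the "(analog) oscillators" mentioned above; in practice a low-pass filter would follow the downconversion.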
- the vehicle or autonomous driving vehicle may be implemented as a mobile robot, a vehicle, a train, a manned/unmanned Aerial Vehicle (AV), a ship, or the like.
- the vehicle or autonomous driving vehicle 100 may include an antenna unit 108, a communication unit 110, a control unit 120, a driving unit 140a, a power supply unit 140b, a sensor unit 140c, and an autonomous driving unit 140d.
- the antenna unit 108 may be configured as a part of the communication unit 110 .
- the communication unit 110 may transmit/receive signals (eg, data, control signals, etc.) to and from external devices such as other vehicles, base stations (eg, base stations, roadside units, etc.), servers, and the like.
- the controller 120 may control elements of the vehicle or the autonomous driving vehicle 100 to perform various operations.
- the controller 120 may include an Electronic Control Unit (ECU).
- the driving unit 140a may cause the vehicle or the autonomous driving vehicle 100 to run on the ground.
- the driving unit 140a may include an engine, a motor, a power train, a wheel, a brake, a steering device, and the like.
- the power supply unit 140b supplies power to the vehicle or the autonomous driving vehicle 100 , and may include a wired/wireless charging circuit, a battery, and the like.
- the sensor unit 140c may obtain vehicle status, surrounding environment information, user information, and the like.
- the sensor unit 140c may include an Inertial Measurement Unit (IMU) sensor, a collision sensor, a wheel sensor, a speed sensor, an inclination sensor, a weight sensor, a heading sensor, a position module, a vehicle forward/reverse sensor, a battery sensor, a fuel sensor, a tire sensor, a steering sensor, a temperature sensor, a humidity sensor, an ultrasonic sensor, an illuminance sensor, a pedal position sensor, and the like.
- the autonomous driving unit 140d may implement a technology for maintaining a driving lane, a technology for automatically adjusting speed such as adaptive cruise control, a technology for automatically driving along a predetermined route, and a technology for automatically setting a route when a destination is set.
- the communication unit 110 may receive map data, traffic information data, and the like from an external server.
- the autonomous driving unit 140d may generate an autonomous driving route and a driving plan based on the acquired data.
- the controller 120 may control the driving unit 140a to move the vehicle or the autonomous driving vehicle 100 along the autonomous driving path (eg, speed/direction adjustment) according to the driving plan.
- the communication unit 110 may aperiodically/periodically acquire the latest traffic information data from an external server, and may acquire surrounding traffic information data from surrounding vehicles.
- the sensor unit 140c may acquire vehicle state and surrounding environment information.
- the autonomous driving unit 140d may update the autonomous driving route and driving plan based on the newly acquired data/information.
- the communication unit 110 may transmit information about a vehicle location, an autonomous driving route, a driving plan, and the like to an external server.
- the external server may predict traffic information data in advance using AI technology or the like based on information collected from the vehicle or autonomous vehicles, and may provide the predicted traffic information data to the vehicle or autonomous vehicles.
- FIG. 27 illustrates a vehicle to which the present invention is applied.
- the vehicle may also be implemented as a means of transportation, a train, an aircraft, a ship, and the like.
- the vehicle 100 may include a communication unit 110 , a control unit 120 , a memory unit 130 , an input/output unit 140a , and a position measurement unit 140b .
- the communication unit 110 may transmit and receive signals (eg, data, control signals, etc.) with other vehicles or external devices such as a base station.
- the controller 120 may control components of the vehicle 100 to perform various operations.
- the memory unit 130 may store data/parameters/programs/codes/commands supporting various functions of the vehicle 100 .
- the input/output unit 140a may output an AR/VR object based on information in the memory unit 130 .
- the input/output unit 140a may include a HUD.
- the position measuring unit 140b may acquire position information of the vehicle 100 .
- the location information may include absolute location information of the vehicle 100 , location information within a driving line, acceleration information, location information with a surrounding vehicle, and the like.
- the position measuring unit 140b may include a GPS and various sensors.
- the communication unit 110 of the vehicle 100 may receive map information, traffic information, and the like from an external server and store it in the memory unit 130 .
- the position measuring unit 140b may obtain vehicle position information through GPS and various sensors and store it in the memory unit 130 .
- the controller 120 may generate a virtual object based on map information, traffic information, and vehicle location information, and the input/output unit 140a may display the created virtual object on a window inside the vehicle ( 1410 and 1420 ).
- the controller 120 may determine whether the vehicle 100 is normally operating within the driving line based on the vehicle location information. When the vehicle 100 deviates from the driving line abnormally, the controller 120 may display a warning on the windshield of the vehicle through the input/output unit 140a.
- control unit 120 may broadcast a warning message regarding the driving abnormality to surrounding vehicles through the communication unit 110 .
- control unit 120 may transmit the location information of the vehicle and information on driving/vehicle abnormality to the related organization through the communication unit 110 .
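The lane-deviation handling above (warn on the windshield, broadcast to surrounding vehicles, report to a related organization) can be sketched as follows; the function name, threshold, and action strings are illustrative assumptions, not part of the disclosure.

```python
def check_lane(vehicle_x, lane_center_x, lane_half_width):
    """Return the actions a controller would trigger when the vehicle
    deviates abnormally from the driving line (all names hypothetical)."""
    offset = abs(vehicle_x - lane_center_x)
    if offset <= lane_half_width:
        return []  # normal driving within the lane: no action needed
    # Abnormal deviation: warn the driver, nearby vehicles, and a related organization.
    return ["display_warning_on_windshield",
            "broadcast_warning_to_surrounding_vehicles",
            "report_location_and_abnormality"]

assert check_lane(0.2, 0.0, 1.75) == []       # within the lane
assert len(check_lane(2.5, 0.0, 1.75)) == 3   # deviation triggers all three actions
```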
- the XR device may be implemented as an HMD, a head-up display (HUD) provided in a vehicle, a television, a smartphone, a computer, a wearable device, a home appliance, a digital signage, a vehicle, a robot, and the like.
- the XR device 100a may include a communication unit 110, a control unit 120, a memory unit 130, an input/output unit 140a, a sensor unit 140b, and a power supply unit 140c.
- the communication unit 110 may transmit/receive signals (eg, media data, control signals, etc.) to/from external devices such as other wireless devices, portable devices, or media servers.
- Media data may include video, images, sound, and the like.
- the controller 120 may perform various operations by controlling the components of the XR device 100a.
- the controller 120 may be configured to control and/or perform procedures such as video/image acquisition, (video/image) encoding, and metadata generation and processing.
- the memory unit 130 may store data/parameters/programs/codes/commands necessary for driving the XR device 100a/creating an XR object.
- the input/output unit 140a may obtain control information, data, and the like from the outside, and may output the generated XR object.
- the input/output unit 140a may include a camera, a microphone, a user input unit, a display unit, a speaker, and/or a haptic module.
- the sensor unit 140b may obtain an XR device state, surrounding environment information, user information, and the like.
- the sensor unit 140b may include a proximity sensor, an illumination sensor, an acceleration sensor, a magnetic sensor, a gyro sensor, an inertial sensor, an RGB sensor, an IR sensor, a fingerprint recognition sensor, an ultrasonic sensor, an optical sensor, a microphone, and/or a radar.
- the power supply unit 140c supplies power to the XR device 100a, and may include a wired/wireless charging circuit, a battery, and the like.
- the memory unit 130 of the XR device 100a may include information (eg, data, etc.) necessary for generating an XR object (eg, AR/VR/MR object).
- the input/output unit 140a may obtain a command to operate the XR device 100a from the user, and the controller 120 may drive the XR device 100a according to the user's driving command. For example, when the user wants to watch a movie or news through the XR device 100a, the controller 120 may transmit content request information to another device (eg, the portable device 100b) or a media server through the communication unit 130.
- the communication unit 130 may download/stream contents such as movies and news from another device (eg, the portable device 100b) or a media server to the memory unit 130 .
- the controller 120 may control and/or perform procedures such as video/image acquisition, (video/image) encoding, and metadata generation/processing for the content, and may generate/output an XR object based on information about the surrounding space or a real object acquired through the input/output unit 140a/sensor unit 140b.
- the XR device 100a is wirelessly connected to the portable device 100b through the communication unit 110 , and the operation of the XR device 100a may be controlled by the portable device 100b.
- the portable device 100b may operate as a controller for the XR device 100a.
- the XR device 100a may obtain 3D location information of the portable device 100b and then generate and output an XR object corresponding to the portable device 100b.
- Robots can be classified into industrial, medical, home, military, etc. depending on the purpose or field of use.
- the robot 100 may include a communication unit 110 , a control unit 120 , a memory unit 130 , an input/output unit 140a , a sensor unit 140b , and a driving unit 140c .
- the communication unit 110 may transmit/receive signals (eg, driving information, control signals, etc.) with external devices such as other wireless devices, other robots, or control servers.
- the controller 120 may perform various operations by controlling the components of the robot 100 .
- the memory unit 130 may store data/parameters/programs/codes/commands supporting various functions of the robot 100 .
- the input/output unit 140a may obtain information from the outside of the robot 100 and may output information to the outside of the robot 100 .
- the input/output unit 140a may include a camera, a microphone, a user input unit, a display unit, a speaker, and/or a haptic module.
- the sensor unit 140b may obtain internal information, surrounding environment information, user information, and the like of the robot 100 .
- the sensor unit 140b may include a proximity sensor, an illumination sensor, an acceleration sensor, a magnetic sensor, a gyro sensor, an inertial sensor, an IR sensor, a fingerprint recognition sensor, an ultrasonic sensor, an optical sensor, a microphone, a radar, and the like.
- the driving unit 140c may perform various physical operations such as moving a robot joint. In addition, the driving unit 140c may make the robot 100 travel on the ground or fly in the air.
- the driving unit 140c may include an actuator, a motor, a wheel, a brake, a propeller, and the like.
- AI devices may be implemented as fixed or movable devices such as a TV, a projector, a smartphone, a PC, a laptop, a digital broadcasting terminal, a tablet PC, a wearable device, a set-top box (STB), a radio, a washing machine, a refrigerator, a digital signage, a robot, a vehicle, and the like.
- the AI device 100 may include a communication unit 110, a control unit 120, a memory unit 130, input/output units 140a/140b, a learning processor unit 140c, and a sensor unit 140d.
- the communication unit 110 may transmit and receive wired/wireless signals (eg, sensor information, user input, learning models, control signals, etc.) with external devices, such as other AI devices (eg, 100x, 200, 400 in FIG. 24) or an AI server (eg, 400 in FIG. 24), using wired/wireless communication technology.
- To this end, the communication unit 110 may transmit information in the memory unit 130 to an external device, or transmit a signal received from an external device to the memory unit 130.
- the controller 120 may determine at least one executable operation of the AI device 100 based on information determined or generated using a data analysis algorithm or a machine learning algorithm. In addition, the controller 120 may control the components of the AI device 100 to perform the determined operation. For example, the control unit 120 may request, search, receive, or utilize data of the learning processor unit 140c or the memory unit 130, and may control the components of the AI device 100 to execute a predicted operation, or an operation determined to be preferable, among the at least one executable operation. In addition, the control unit 120 may collect history information including user feedback on the operation contents or operation of the AI device 100, store it in the memory unit 130 or the learning processor unit 140c, or transmit it to an external device such as the AI server (400 in FIG. 24). The collected history information may be used to update the learning model.
- the memory unit 130 may store data supporting various functions of the AI device 100 .
- the memory unit 130 may store data obtained from the input unit 140a , data obtained from the communication unit 110 , output data of the learning processor unit 140c , and data obtained from the sensing unit 140 .
- the memory unit 130 may store control information and/or software codes necessary for the operation/execution of the control unit 120 .
- the input unit 140a may acquire various types of data from the outside of the AI device 100 .
- the input unit 140a may obtain training data for model learning, input data to which the learning model is applied, and the like.
- the input unit 140a may include a camera, a microphone, and/or a user input unit.
- the output unit 140b may generate an output related to sight, hearing, or touch.
- the output unit 140b may include a display unit, a speaker, and/or a haptic module.
- the sensing unit 140 may obtain at least one of internal information of the AI device 100 , surrounding environment information of the AI device 100 , and user information by using various sensors.
- the sensing unit 140 may include a proximity sensor, an illuminance sensor, an acceleration sensor, a magnetic sensor, a gyro sensor, an inertial sensor, an RGB sensor, an IR sensor, a fingerprint recognition sensor, an ultrasonic sensor, an optical sensor, a microphone, and/or a radar.
- the learning processor unit 140c may train a model composed of an artificial neural network by using the training data.
- the learning processor unit 140c may perform AI processing together with the learning processor unit of the AI server ( FIGS. 24 and 400 ).
- the learning processor unit 140c may process information received from an external device through the communication unit 110 and/or information stored in the memory unit 130 . Also, the output value of the learning processor unit 140c may be transmitted to an external device through the communication unit 110 and/or stored in the memory unit 130 .
Abstract
Description
Claims (15)
- A method for a Vulnerable Road User (VRU)-related operation of a Road Side Unit (RSU) in a wireless communication system, the method comprising: receiving, by the RSU, a Personal Safety Message (PSM) of a VRU; determining, by the RSU, location information of the VRU based on first location information of the VRU obtained through image information and second location information of the VRU obtained through the PSM message; and transmitting, by the RSU, the location information of the VRU to the VRU.
- The method of claim 1, wherein, when determining the location information of the VRU, weights are applied to the first location information and the second location information.
- The method of claim 1, wherein the image information is captured by the RSU in an observation area and is shared with another RSU.
- The method of claim 3, wherein the location information of the VRU is shared with the other RSU.
- The method of claim 4, wherein the location information of the VRU includes weight information of the first location information relative to the second location information used for the location determination.
- The method of claim 1, wherein the PSM message includes location information obtained by the VRU through a Global Navigation Satellite System (GNSS).
- The method of claim 1, wherein the PSM message includes location information obtained by the VRU through another RSU, another VRU, or a base station.
- The method of claim 1, wherein the location information of the VRU is determined as the value with the smaller error range among the first location information and the second location information.
- The method of claim 1, wherein the location information of the VRU is determined based on an error range obtained by averaging the error range of the first location information and the error range of the second location information.
- The method of claim 9, wherein the location information of the VRU is included in the intersection of an area corresponding to the error range of the first location information and an area corresponding to the error range of the second location information.
- A Road Side Unit (RSU) performing Vulnerable Road User (VRU)-related operations in a wireless communication system, comprising: at least one processor; and at least one computer memory operably connectable to the at least one processor and storing instructions that, when executed, cause the at least one processor to perform operations comprising: receiving a PSM message of a VRU; determining location information of the VRU based on first location information of the VRU obtained through image information and second location information of the VRU obtained through the PSM message; and transmitting the location information of the VRU to the VRU.
- A processor configured to perform operations for a Road Side Unit (RSU) in a wireless communication system, the operations comprising: receiving a PSM message of a VRU; determining location information of the VRU based on first location information of the VRU obtained through image information and second location information of the VRU obtained through the PSM message; and transmitting the location information of the VRU to the VRU.
- A non-volatile computer-readable storage medium storing at least one computer program including instructions that, when executed by at least one processor, cause the at least one processor to perform operations for a Road Side Unit (RSU), the operations comprising: receiving a PSM message of a VRU; determining location information of the VRU based on first location information of the VRU obtained through image information and second location information of the VRU obtained through the PSM message; and transmitting the location information of the VRU to the VRU.
- A method for an operation of a Vulnerable Road User (VRU) related to a Road Side Unit (RSU) in a wireless communication system, the method comprising: transmitting, by the VRU, a PSM message to the RSU; and receiving, by the VRU, location information of the VRU from the RSU, wherein the location information of the VRU is determined based on first location information of the VRU obtained by the RSU through image information and second location information of the VRU obtained through the PSM message.
- A Vulnerable Road User (VRU) related to a Road Side Unit (RSU) in a wireless communication system, comprising: at least one processor; and at least one computer memory operably connectable to the at least one processor and storing instructions that, when executed, cause the at least one processor to perform operations comprising: transmitting a PSM message to the RSU; and receiving location information of the VRU from the RSU, wherein the location information of the VRU is determined based on first location information of the VRU obtained by the RSU through image information and second location information of the VRU obtained through the PSM message.
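The fusion of the camera-based first location information and the PSM-based second location information recited in the claims above can be sketched numerically. The inverse-error weighting rule below is an assumption for illustration only; the claims state merely that weights are applied and that the estimate with the smaller error range may be preferred.

```python
def fuse_positions(pos1, err1, pos2, err2):
    """Weighted fusion of a camera-based estimate (pos1, err1) and a
    PSM/GNSS-based estimate (pos2, err2); the smaller the error range,
    the larger the weight (illustrative inverse-error rule)."""
    w1 = 1.0 / err1
    w2 = 1.0 / err2
    x = (w1 * pos1[0] + w2 * pos2[0]) / (w1 + w2)
    y = (w1 * pos1[1] + w2 * pos2[1]) / (w1 + w2)
    # Return the fused position and the weight given to the first estimate
    # (cf. the weight information shared between RSUs in claim 5).
    return (x, y), w1 / (w1 + w2)

fused, weight1 = fuse_positions((10.0, 20.0), 0.5, (11.0, 21.0), 2.0)
# The camera estimate has the smaller error range, so it dominates the result.
assert weight1 > 0.5
assert abs(fused[0] - 10.2) < 1e-9 and abs(fused[1] - 20.2) < 1e-9
```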
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020227039901A KR20230002757A (ko) | 2020-04-29 | 2021-04-29 | 무선통신시스템에서 vru 위치에 관련된 rsu의 동작 방법 |
JP2022565832A JP7419564B2 (ja) | 2020-04-29 | 2021-04-29 | 無線通信システムにおいてvru位置に関連するrsuの動作方法 |
US17/997,523 US20230176212A1 (en) | 2020-04-29 | 2021-04-29 | Method for operating rsu related to vru location in wireless communication system |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR10-2020-0052594 | 2020-04-29 | ||
KR20200052594 | 2020-04-29 | ||
KR20200069349 | 2020-06-09 | ||
KR10-2020-0069349 | 2020-06-09 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021221481A1 true WO2021221481A1 (ko) | 2021-11-04 |
Family
ID=78373734
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2021/005463 WO2021221481A1 (ko) | 2020-04-29 | 2021-04-29 | 무선통신시스템에서 vru 위치에 관련된 rsu의 동작 방법 |
Country Status (4)
Country | Link |
---|---|
US (1) | US20230176212A1 (ko) |
JP (1) | JP7419564B2 (ko) |
KR (1) | KR20230002757A (ko) |
WO (1) | WO2021221481A1 (ko) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002257564A (ja) * | 2001-03-06 | 2002-09-11 | Nippon Telegr & Teleph Corp <Ntt> | 精度と計測時刻の異なる複数の位置情報に基づく位置推定方法、システム、プログラム及びプログラムを記録した記録媒体 |
KR20090013026A (ko) * | 2007-07-31 | 2009-02-04 | 가부시끼가이샤 도시바 | 가시광선 통신을 이용한 이동 물체의 위치 판정 방법 및장치 |
JP2014137321A (ja) * | 2013-01-18 | 2014-07-28 | Nec Corp | 位置座標変換システム、位置座標変換方法、車載装置、世界座標計測装置および位置座標変換プログラム |
KR20190032090A (ko) * | 2017-09-19 | 2019-03-27 | 삼성전자주식회사 | 외부 이동 수단으로 릴레이 메시지를 전송하는 전자 장치 및 그 동작 방법 |
US20190251847A1 (en) * | 2018-02-11 | 2019-08-15 | TuSimple | Method, device and system for vehicle positioning |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH1137776A (ja) * | 1997-07-24 | 1999-02-12 | Denso Corp | 車両用ナビゲーション装置 |
JP2008249666A (ja) * | 2007-03-30 | 2008-10-16 | Fujitsu Ten Ltd | 車両位置特定装置および車両位置特定方法 |
JP5365792B2 (ja) * | 2009-06-01 | 2013-12-11 | マツダ株式会社 | 車両用位置測定装置 |
ES2904564T3 (es) * | 2015-08-20 | 2022-04-05 | Zendrive Inc | Método de navegación asistida por acelerómetro |
JP2018180860A (ja) * | 2017-04-11 | 2018-11-15 | 株式会社デンソー | 交通情報システム |
JP6816058B2 (ja) * | 2017-10-25 | 2021-01-20 | 日本電信電話株式会社 | パラメータ最適化装置、パラメータ最適化方法、プログラム |
JP7119346B2 (ja) * | 2017-11-13 | 2022-08-17 | トヨタ自動車株式会社 | 環境改善システム、ならびにそれに用いられるサーバ |
JP7031256B2 (ja) * | 2017-11-29 | 2022-03-08 | 富士通株式会社 | 表示制御方法、表示制御プログラムおよび端末装置 |
KR20200113242A (ko) * | 2018-02-06 | 2020-10-06 | 씨에이브이에이치 엘엘씨 | 지능형 도로 인프라구조 시스템(iris): 시스템 및 방법 |
-
2021
- 2021-04-29 JP JP2022565832A patent/JP7419564B2/ja active Active
- 2021-04-29 KR KR1020227039901A patent/KR20230002757A/ko unknown
- 2021-04-29 US US17/997,523 patent/US20230176212A1/en active Pending
- 2021-04-29 WO PCT/KR2021/005463 patent/WO2021221481A1/ko active Application Filing
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002257564A (ja) * | 2001-03-06 | 2002-09-11 | Nippon Telegr & Teleph Corp <Ntt> | 精度と計測時刻の異なる複数の位置情報に基づく位置推定方法、システム、プログラム及びプログラムを記録した記録媒体 |
KR20090013026A (ko) * | 2007-07-31 | 2009-02-04 | 가부시끼가이샤 도시바 | 가시광선 통신을 이용한 이동 물체의 위치 판정 방법 및장치 |
JP2014137321A (ja) * | 2013-01-18 | 2014-07-28 | Nec Corp | 位置座標変換システム、位置座標変換方法、車載装置、世界座標計測装置および位置座標変換プログラム |
KR20190032090A (ko) * | 2017-09-19 | 2019-03-27 | 삼성전자주식회사 | 외부 이동 수단으로 릴레이 메시지를 전송하는 전자 장치 및 그 동작 방법 |
US20190251847A1 (en) * | 2018-02-11 | 2019-08-15 | TuSimple | Method, device and system for vehicle positioning |
Also Published As
Publication number | Publication date |
---|---|
JP2023523441A (ja) | 2023-06-05 |
KR20230002757A (ko) | 2023-01-05 |
JP7419564B2 (ja) | 2024-01-22 |
US20230176212A1 (en) | 2023-06-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2021029672A1 (ko) | | Operation method and device of UE related to sidelink DRX in wireless communication system |
WO2021091244A1 (ko) | | Operation method of UE related to SCI in wireless communication system |
WO2021096244A1 (ko) | | Operation method of UE related to sidelink DRX in wireless communication system |
WO2021085908A1 (ko) | | Operation method of sidelink UE related to AS configuration in wireless communication system |
WO2021162506A1 (ko) | | Operation method of UE related to relay UE in wireless communication system |
WO2021221448A1 (ko) | | Operation method of relay UE related to relay establishment request in wireless communication system |
WO2021154061A1 (ko) | | Operation method of UE related to sidelink CSI report in wireless communication system |
WO2022080702A1 (ko) | | Operation method of UE related to sidelink relay and RLF in wireless communication system |
WO2022019643A1 (ko) | | Operation method of relay UE in wireless communication system |
WO2022025667A1 (ko) | | Operation method of relay UE related to BWP in wireless communication system |
WO2022025665A1 (ko) | | Operation method related to relay UE selection in wireless communication system |
WO2021206462A1 (ko) | | Operation method of relay UE related to sidelink relay in wireless communication system |
WO2021085909A1 (ko) | | Operation method of UE related to PC5 unicast link release in wireless communication system |
WO2021040378A1 (ko) | | Operation method and device of UE transmitting and receiving sidelink signals after RLF occurrence in wireless communication system |
WO2021075877A1 (ko) | | Operation method of UE related to sidelink timer in wireless communication system |
WO2021040361A1 (ko) | | Method for transmitting and receiving signals by terminal in wireless communication system |
WO2022035182A1 (ko) | | Operation method of UE related to sensor raw data sharing and feedback in wireless communication system |
WO2022060117A1 (ko) | | Operation method of UE related to sidelink relay and system information in wireless communication system |
WO2021145751A1 (ko) | | Operation method of UE related to sidelink PTRS in wireless communication system |
WO2022025615A1 (ko) | | Operation method related to sidelink discovery in wireless communication system |
WO2021235863A1 (ko) | | Operation method of VRU transmitting and receiving signals to and from RSU in wireless communication system |
WO2021256908A1 (ko) | | Operation method of UE related to relay in wireless communication system |
WO2021221481A1 (ko) | | Operation method of RSU related to VRU position in wireless communication system |
WO2021187910A1 (ko) | | Operation method related to VRU message transmission and reception in wireless communication system |
WO2021182909A1 (ko) | | Operation method of UE related to platooning in wireless communication system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 21795409; Country of ref document: EP; Kind code of ref document: A1 |
| ENP | Entry into the national phase | Ref document number: 2022565832; Country of ref document: JP; Kind code of ref document: A |
| ENP | Entry into the national phase | Ref document number: 20227039901; Country of ref document: KR; Kind code of ref document: A |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 21795409; Country of ref document: EP; Kind code of ref document: A1 |