WO2020162719A1 - Mobile terminal for indicating whether a quality of service is satisfied in a wireless communication system


Info

Publication number
WO2020162719A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
qos
unit
mobile terminal
satisfied
Prior art date
Application number
PCT/KR2020/001786
Other languages
English (en)
Korean (ko)
Inventor
김래영
윤명준
Original Assignee
엘지전자 주식회사 (LG Electronics Inc.)
Priority date
Filing date
Publication date
Application filed by 엘지전자 주식회사 (LG Electronics Inc.)
Publication of WO2020162719A1


Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04W: WIRELESS COMMUNICATION NETWORKS
    • H04W 28/00: Network traffic management; Network resource management
    • H04W 28/02: Traffic management, e.g. flow control or congestion control
    • H04W 28/16: Central resource management; Negotiation of resources or communication parameters, e.g. negotiating bandwidth or QoS [Quality of Service]
    • H04W 28/24: Negotiating SLA [Service Level Agreement]; Negotiating QoS [Quality of Service]
    • H04W 4/00: Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/30: Services specially adapted for particular environments, situations or purposes
    • H04W 4/40: Services specially adapted for particular environments, situations or purposes, for vehicles, e.g. vehicle-to-pedestrians [V2P]

Definitions

  • the following description relates to a wireless communication system, and more specifically, to a mobile terminal indicating whether QoS is satisfied.
  • a wireless communication system is a multiple access system capable of supporting communication with multiple users by sharing available system resources (bandwidth, transmission power, etc.).
  • multiple access systems include a code division multiple access (CDMA) system, a frequency division multiple access (FDMA) system, a time division multiple access (TDMA) system, an orthogonal frequency division multiple access (OFDMA) system, and a single carrier frequency division multiple access (SC-FDMA) system.
  • MC-FDMA: multi-carrier frequency division multiple access
  • RAT: radio access technology
  • LTE: Long Term Evolution; LTE-A: LTE-Advanced
  • WiFi: wireless fidelity
  • 5G: 5th Generation
  • the three main requirement areas of 5G are (1) the enhanced mobile broadband (eMBB) area, (2) the massive machine type communication (mMTC) area, and (3) the ultra-reliable and low latency communications (URLLC) area.
  • KPI: key performance indicator
  • 5G supports these various use cases in a flexible and reliable way.
  • eMBB goes far beyond basic mobile Internet access, and covers rich interactive work and media and entertainment applications in the cloud or augmented reality.
  • data is one of the key drivers of 5G, and the 5G era may be the first time that no dedicated voice service is offered.
  • voice is expected to be handled as an application program simply using the data connection provided by the communication system.
  • the main causes for increased traffic volume are increased content size and increased number of applications requiring high data rates.
  • Streaming services (audio and video), interactive video and mobile Internet connections will become more prevalent as more devices connect to the Internet. Many of these applications require always-on connectivity to push real-time information and notifications to the user.
  • Cloud storage and applications are rapidly increasing in mobile communication platforms, which can be applied to both work and entertainment.
  • cloud storage is a special use case that drives the growth of uplink data rates.
  • 5G is also used for remote work in the cloud, and requires much lower end-to-end delays to maintain a good user experience when tactile interfaces are used.
  • entertainment, for example cloud gaming and video streaming, is another key factor increasing the demand for mobile broadband capability. Entertainment is essential on smartphones and tablets anywhere, including high-mobility environments such as trains, cars, and airplanes.
  • Another use case is augmented reality and information retrieval for entertainment.
  • augmented reality requires very low latency and instantaneous data volumes.
  • one of the most anticipated 5G use cases relates to the ability to seamlessly connect embedded sensors in all fields, namely mMTC. It is predicted that there will be 20 billion potential IoT devices by 2020.
  • Industrial IoT is one of the areas where 5G plays a key role in enabling smart cities, asset tracking, smart utilities, agriculture and security infrastructure.
  • URLLC includes new services that will transform the industry through ultra-reliable/low-latency links, such as remote control of critical infrastructure and self-driving vehicles.
  • this level of reliability and latency is essential for smart grid control, industrial automation, robotics, and drone control and coordination.
  • 5G can complement fiber-to-the-home (FTTH) and cable-based broadband (or DOCSIS) as a means of providing streams rated at hundreds of megabits per second to gigabits per second. Such high speeds are required to deliver TV at 4K resolution and above (6K, 8K), as well as virtual and augmented reality.
  • Virtual Reality (VR) and Augmented Reality (AR) applications involve almost immersive sports events.
  • Certain application programs may require special network settings. For VR games, for example, game companies may need to integrate the core server with the network operator's edge network server to minimize latency.
  • Automotive is expected to be an important new driver for 5G, along with many use cases for mobile communications to vehicles. For example, entertainment for passengers requires simultaneous high capacity and high mobility mobile broadband. The reason is that future users continue to expect high quality connections regardless of their location and speed.
  • another application example in the automotive field is an augmented reality dashboard. It identifies objects in the dark beyond what the driver sees through the front window, and superimposes information telling the driver about the distance and movement of the objects.
  • wireless modules will enable communication between vehicles, exchange of information between the vehicle and the supporting infrastructure, and exchange of information between the vehicle and other connected devices (eg, devices carried by pedestrians).
  • the safety system guides alternative courses of action to help the driver drive more safely, reducing the risk of accidents.
  • the next step will be remotely controlled or self-driving vehicles.
  • smart cities and smart homes will be embedded with high-density wireless sensor networks.
  • the distributed network of intelligent sensors will identify the conditions for cost and energy-efficient maintenance of a city or home.
  • a similar setup can be done for each household.
  • Temperature sensors, window and heating controllers, burglar alarms and consumer electronics are all connected wirelessly. Many of these sensors are typically low data rates, low power and low cost. However, for example, real-time HD video may be required in certain types of devices for surveillance.
  • the smart grid interconnects these sensors using digital information and communication technologies to collect information and act accordingly. This information can include the behavior of suppliers and consumers, allowing smart grids to improve efficiency, reliability, economics, sustainability of production and the distribution of fuels such as electricity in an automated manner.
  • the smart grid can be viewed as another sensor network with low latency.
  • the health sector has many applications that can benefit from mobile communications.
  • the communication system can support telemedicine that provides clinical care from a distance. This helps to reduce barriers to distance and can improve access to medical services that are not continuously available in remote rural areas. It is also used to save lives in critical care and emergency situations.
  • a mobile communication based wireless sensor network can provide remote monitoring and sensors for parameters such as heart rate and blood pressure.
  • wireless and mobile communications are becoming increasingly important in industrial applications. Wiring is expensive to install and maintain, so the possibility of replacing cables with reconfigurable wireless links is an attractive opportunity in many industries. Achieving this, however, requires that the wireless connection operate with cable-like delay, reliability, and capacity, and that its management be simplified. Low latency and very low error probability are new requirements for which 5G connectivity is needed.
  • Logistics and freight tracking are important use cases for mobile communications that enable the tracking of inventory and packages from anywhere using location-based information systems. Logistics and freight tracking use cases typically require low data rates, but require a wide range and reliable location information.
  • the embodiment discloses a mobile terminal displaying whether QoS is satisfied based on a change in a user plane congestion state.
  • disclosed is a mobile terminal including: a display unit for displaying a driving route; and a control unit for controlling the display unit, wherein the control unit displays on the display unit whether QoS (Quality of Service) is satisfied, and displays application adjustment-related information based on a notification from the V2X application server that QoS is not satisfied.
  • the application adjustment-related information may be one of: information instructing to change the driving route, information instructing to change the Level of Automation (LoA), information informing that the application will be terminated after a predetermined time, information instructing to change the driving speed, and information instructing to stop driving.
  • the LoA may consist of 0 (no automation), 1 (driver assistance), 2 (partial automation), 3 (conditional automation), 4 (high automation), and 5 (full automation); a minimal data-model sketch follows the bullets below.
  • the information instructing the change of the driving route may be displayed together with information on the section requiring the change.
  • the application adjustment-related information may be displayed together with a map on which a driving route is displayed.
  • Whether or not the QoS is satisfied may be related to execution of an application initiated by the user's selection.
  • the application may be one of autonomous driving and platoon driving.
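  • As a concrete illustration of the bullets above, the following minimal Python sketch models the LoA levels and the kinds of application adjustment-related information. The function and the print-based display are illustrative stand-ins for the control unit and display unit, not part of the disclosure.

```python
from enum import IntEnum

class LoA(IntEnum):
    """Levels of Automation listed above (0-5)."""
    NO_AUTOMATION = 0
    DRIVER_ASSISTANCE = 1
    PARTIAL_AUTOMATION = 2
    CONDITIONAL_AUTOMATION = 3
    HIGH_AUTOMATION = 4
    FULL_AUTOMATION = 5

# The kinds of application adjustment-related information listed above.
ADJUSTMENT_KINDS = (
    "change_driving_route",    # displayed with the section requiring the change
    "change_loa",              # e.g. LoA.HIGH_AUTOMATION -> LoA.CONDITIONAL_AUTOMATION
    "terminate_after_time",    # application terminated after a predetermined time
    "change_driving_speed",
    "stop_driving",
)

def on_qos_not_satisfied(kind: str, detail: str = "") -> None:
    """Control-unit behaviour sketch: show the adjustment info with the route map."""
    assert kind in ADJUSTMENT_KINDS
    print("QoS not satisfied")               # indication on the display unit
    print(f"adjustment: {kind} {detail}")    # shown together with the driving route

on_qos_not_satisfied("change_loa", f"-> {LoA.DRIVER_ASSISTANCE.name}")
```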
  • Whether the QoS is satisfied may be based on a notification from the V2X application server.
  • Whether the QoS is satisfied may be based on a notification related to a change in a user plane congestion status received by the V2X application server.
  • the notification related to the change in the user plane congestion state may be a user plane congestion analytics notification (analytics for the user plane congestion) that the V2X application server receives from the Network Data Analytics Function (NWDAF) through the Network Exposure Function (NEF).
  • the user plane congestion analysis may be based on a change in a user plane congestion status received by the NWDAF from Operations and Maintenance (OAM).
  • the user plane congestion analysis notification may include a location and time at which a potential change in QoS may occur.
  • compared with the conventional approach, in which congestion is determined from the number of times a GBR bearer disappears, the congestion level can be provided more accurately.
  • FIG. 1 is a diagram showing a schematic structure of an EPS (Evolved Packet System) including an Evolved Packet Core (EPC).
  • FIG. 2 is an exemplary diagram showing the architecture of a general E-UTRAN and EPC.
  • FIG. 3 is an exemplary diagram showing the structure of a radio interface protocol in the control plane.
  • FIG. 4 is an exemplary diagram showing the structure of a radio interface protocol in the user plane.
  • FIG. 5 is a flow diagram for explaining a random access process.
  • FIG. 6 shows a connection process in the radio resource control (RRC) layer.
  • FIG. 7 is a diagram for describing a 5G system.
  • FIG. 20 illustrates a signal processing circuit for a transmission signal.
  • FIG. 21 shows another example of a wireless device applied to the present invention.
  • FIG. 22 illustrates a portable device applied to the present invention.
  • FIG. 23 illustrates a vehicle or an autonomous vehicle applied to the present invention.
  • FIG. 25 illustrates an XR device applied to the present invention.
  • FIG. 26 illustrates a robot applied to the present invention.
  • FIG. 27 illustrates an AI device applied to the present invention.
  • each component or feature may be considered to be optional unless explicitly stated otherwise.
  • Each component or feature may be implemented in a form that is not combined with other components or features.
  • some components and/or features may be combined to constitute an embodiment of the present invention.
  • the order of operations described in the embodiments of the present invention may be changed. Some configurations or features of one embodiment may be included in other embodiments, or may be replaced with corresponding configurations or features of other embodiments.
  • embodiments of the present invention may be supported by standard documents disclosed in connection with at least one of the Institute of Electrical and Electronics Engineers (IEEE) 802 system, the 3GPP system, the 3GPP LTE and LTE-A systems, and the 3GPP2 system. That is, among the embodiments of the present invention, steps or parts not described, in order to clearly reveal the technical idea of the present invention, may be supported by those documents. In addition, all terms disclosed in this document can be explained by those standard documents.
  • UMTS (Universal Mobile Telecommunications System): a 3rd Generation (3G) mobile communication technology based on GSM (Global System for Mobile Communication).
  • EPS (Evolved Packet System): a network system composed of an Evolved Packet Core (EPC), which is an Internet Protocol (IP)-based packet-switched (PS) core network, and an access network such as LTE/UTRAN; it is a network evolved from UMTS.
  • -NodeB a base station of GERAN/UTRAN. It is installed outdoors and its coverage is macro cell scale.
  • -eNodeB a base station of E-UTRAN. It is installed outdoors and its coverage is macro cell scale.
  • -UE User Equipment
  • the UE may also be referred to in terms of terminal, mobile equipment (ME), mobile station (MS), and the like.
  • the UE may be a portable device such as a notebook computer, a mobile phone, a personal digital assistant (PDA), a smart phone, or a multimedia device, or may be a non-portable device such as a personal computer (PC) or a vehicle-mounted device.
  • the term UE or terminal may refer to an MTC device.
  • -HNB Home NodeB: As a base station of the UMTS network, it is installed indoors and its coverage is in the micro cell scale.
  • HeNB (Home eNodeB): a base station of the EPS network, installed indoors; its coverage is of micro cell scale.
  • -MME Mobility Management Entity: A network node of an EPS network that performs mobility management (MM) and session management (SM) functions.
  • PDN-GW (Packet Data Network Gateway, PGW): a network node of the EPS network that performs UE IP address allocation, packet screening and filtering, and charging data collection functions.
  • SGW (Serving Gateway): a network node of the EPS network that performs mobility anchoring, packet routing, idle-mode packet buffering, and triggering of the MME to page the UE.
  • NAS (Non-Access Stratum): the upper stratum of the control plane between the UE and the MME; its main function is to support mobility management and session management.
  • PDN (Packet Data Network): a network in which a server supporting a specific service (e.g., a multimedia messaging service (MMS) server or a wireless application protocol (WAP) server) is located.
  • PDN connection: a logical connection between the UE and the PDN, expressed by one IP address (one IPv4 address and/or one IPv6 prefix).
  • RAN (Radio Access Network): the part of the 3GPP network that includes the NodeB, the eNodeB, and the Radio Network Controller (RNC) controlling them.
  • HSS (Home Subscriber Server): a subscriber database in the 3GPP network; the HSS may perform functions such as configuration storage, identity management, and user state storage.
  • PLMN: Public Land Mobile Network.
  • ProSe (Proximity-based Service): a service that enables discovery between physically adjacent devices, and direct communication with each other, communication through a base station, or communication through a third device. Here, user plane data is exchanged through a direct data path without going through the 3GPP core network (e.g., the EPC).
  • FIG. 1 is a diagram showing a schematic structure of an EPS (Evolved Packet System) including an Evolved Packet Core (EPC).
  • EPC is a key element of SAE (System Architecture Evolution) to improve the performance of 3GPP technologies.
  • SAE is a research project that determines a network structure that supports mobility between various types of networks.
  • SAE aims to provide an optimized packet-based system, for example, supporting various wireless access technologies based on IP and providing improved data transmission capability.
  • the EPC is a core network of an IP mobile communication system for a 3GPP LTE system, and can support packet-based real-time and non-real-time services.
  • conventionally, the core network was divided into two distinct sub-domains: CS (Circuit-Switched) for voice and PS (Packet-Switched) for data.
  • a connection between a terminal having IP capability and another terminal can be made through an IP-based base station (e.g., an eNodeB (evolved Node B)), the EPC, and an application domain (e.g., the IMS (IP Multimedia Subsystem)).
  • EPC is an essential structure for implementing end-to-end IP services.
  • the EPC may include various components; FIG. 1 shows some of them: the SGW (Serving Gateway), the PDN GW (Packet Data Network Gateway), the MME (Mobility Management Entity), the SGSN (Serving GPRS (General Packet Radio Service) Support Node), and the ePDG (enhanced Packet Data Gateway).
  • the SGW (or S-GW) operates as a boundary point between the radio access network (RAN) and the core network, and is an element that functions to maintain a data path between the eNodeB and the PDN GW.
  • the SGW serves as a local mobility anchor point. That is, packets may be routed through the SGW for mobility within the E-UTRAN (the Evolved-UMTS (Universal Mobile Telecommunications System) Terrestrial Radio Access Network defined after 3GPP Release-8).
  • the SGW may also function as an anchor point for mobility with other 3GPP networks (RANs defined before 3GPP Release-8, for example, UTRAN or GERAN (GSM (Global System for Mobile Communication)/EDGE (Enhanced Data rates for Global Evolution) Radio Access Network)).
  • the PDN GW corresponds to the termination point of the data interface towards the packet data network.
  • PDN GW can support policy enforcement features, packet filtering, charging support, etc.
  • the PDN GW can serve as an anchor point for mobility management between 3GPP networks and non-3GPP networks (e.g., untrusted networks such as an I-WLAN (Interworking Wireless Local Area Network), or trusted networks such as a Code Division Multiple Access (CDMA) network or WiMax).
  • in FIG. 1, the SGW and the PDN GW are configured as separate gateways, but the two gateways may be implemented according to a single-gateway configuration option.
  • the MME is an element that performs signaling and control functions to support access to the network connection of the UE, allocation of network resources, tracking, paging, roaming, and handover.
  • the MME controls control plane functions related to subscriber and session management.
  • the MME manages numerous eNodeBs and performs signaling for selecting a conventional gateway for handover to other 2G/3G networks.
  • the MME performs functions such as security procedures, terminal-to-network session handling, and idle terminal location management.
  • the SGSN handles all packet data, such as the user's mobility management and authentication, for other 3GPP networks (e.g., GPRS networks).
  • the ePDG serves as a secure node for untrusted non-3GPP networks (e.g., I-WLAN, WiFi hotspots, etc.).
  • as described above with reference to FIG. 1, a terminal having IP capability can access an IP service network (e.g., the IMS) provided by an operator through various elements in the EPC, based not only on 3GPP access but also on non-3GPP access.
  • FIG. 1 shows various reference points (eg, S1-U, S1-MME, etc.).
  • a conceptual link connecting two functions existing in different functional entities of E-UTRAN and EPC is defined as a reference point.
  • Table 1 below summarizes the reference points shown in FIG. 1.
  • various reference points may exist according to the network structure.
  • S1-U: reference point between the E-UTRAN and the Serving GW for per-bearer user plane tunnelling and inter-eNodeB path switching during handover.
  • S3: reference point between the MME and the SGSN that enables user and bearer information exchange for inter-3GPP-access-network mobility in idle and/or active state. It can be used intra-PLMN or inter-PLMN (e.g., in the case of inter-PLMN handover).
  • S4: reference point between the SGW and the SGSN that provides related control and mobility support between the GPRS core and the 3GPP anchor function of the SGW; in addition, if a direct tunnel is not established, it provides user plane tunnelling.
  • S5: reference point that provides user plane tunnelling and tunnel management between the SGW and the PDN GW. It is used for SGW relocation due to UE mobility, and when the SGW needs to connect to a non-collocated PDN GW for the required PDN connectivity.
  • S11: reference point between the MME and the SGW.
  • SGi: reference point between the PDN GW and the PDN. The PDN may be an operator-external public or private PDN or, for example, an intra-operator PDN for provision of IMS services. This reference point corresponds to Gi of 3GPP access.
  • S2a and S2b correspond to non-3GPP interfaces.
  • S2a is a reference point that provides the user plane with related control and mobility support between trusted non-3GPP access and the PDN GW.
  • S2b is a reference point that provides the user plane with related control and mobility support between the ePDG and the PDN GW.
  • FIG. 2 is an exemplary diagram showing the architecture of a general E-UTRAN and EPC.
  • the eNodeB can perform functions such as routing to the gateway while an RRC (Radio Resource Control) connection is active, scheduling and transmission of paging messages, scheduling and transmission of a broadcast channel (BCH), dynamic allocation of uplink and downlink resources to the UE, configuration and provision of eNodeB measurements, radio bearer control, radio admission control, and connection mobility control. Within the EPC, functions such as paging generation, LTE_IDLE state management, user plane ciphering, SAE bearer control, and ciphering and integrity protection of NAS signaling can be performed.
  • FIG. 3 is an exemplary diagram showing the structure of the radio interface protocol in the control plane between the terminal and the base station, and FIG. 4 is an exemplary diagram showing the structure of the radio interface protocol in the user plane between the terminal and the base station.
  • the air interface protocol is based on the 3GPP radio access network standard.
  • the radio interface protocol consists horizontally of a physical layer, a data link layer, and a network layer, and is divided vertically into a user plane for data information transmission and a control plane for control signal transmission.
  • the protocol layers can be divided into L1 (layer 1), L2 (layer 2), and L3 (layer 3) based on the lower three layers of the Open System Interconnection (OSI) reference model widely known in communication systems.
  • the first layer provides an information transfer service using a physical channel.
  • the physical layer is connected to an upper medium access control layer through a transport channel, and data between the medium access control layer and the physical layer is transmitted through the transport channel.
  • data is transmitted between different physical layers, that is, between the physical layers of the transmitting side and the receiving side through a physical channel.
  • the physical channel is composed of several subframes on the time axis and several sub-carriers on the frequency axis.
  • one subframe is composed of a plurality of symbols on the time axis and a plurality of subcarriers on the frequency axis.
  • One subframe is composed of a plurality of resource blocks (Resource Block), and one resource block is composed of a plurality of symbols (Symbol) and a plurality of subcarriers.
  • the transmission time interval (TTI), a unit time in which data is transmitted, is 1 ms, corresponding to one subframe.
  • according to 3GPP LTE, the physical channels existing in the physical layers of the transmitting side and the receiving side can be divided into data channels, namely the PDSCH (Physical Downlink Shared Channel) and the PUSCH (Physical Uplink Shared Channel), and control channels, namely the PDCCH (Physical Downlink Control Channel), the PCFICH (Physical Control Format Indicator Channel), the PHICH (Physical Hybrid-ARQ Indicator Channel), and the PUCCH (Physical Uplink Control Channel).
  • the Medium Access Control (MAC) layer of the second layer serves to map various logical channels to various transport channels, and also performs logical channel multiplexing, which maps several logical channels to one transport channel.
  • the MAC layer is connected to the RLC layer, its upper layer, through logical channels, and a logical channel is divided, according to the type of information transmitted, into a control channel for transmitting control plane information and a traffic channel for transmitting user plane information.
  • the Radio Link Control (RLC) layer of the second layer serves to adjust the data size, by segmenting and concatenating the data received from the upper layer, so that the lower layer is suitable for transmitting the data over the radio section.
  • the Packet Data Convergence Protocol (PDCP) layer of the second layer performs a header compression function that reduces the size of an IP packet header, which is relatively large and contains unnecessary control information, so that IP packets such as IPv4 or IPv6 can be transmitted efficiently over a radio section with small bandwidth.
  • the PDCP layer also performs a security function, which consists of encryption (Ciphering) to prevent data interception by a third party and integrity protection (Integrity protection) to prevent data manipulation by a third party.
  • the radio resource control (RRC) layer located at the top of the third layer is defined only in the control plane, and is responsible for the configuration and reconfiguration of radio bearers (RBs).
  • RB refers to a service provided by the second layer for data transmission between the UE and the E-UTRAN.
  • when there is an RRC connection between the RRC layer of the terminal and the RRC layer of the radio network, the terminal is in RRC connected mode; otherwise, it is in RRC idle mode.
  • the RRC state refers to whether the RRC of the UE is in a logical connection with the RRC of the E-UTRAN, and when it is connected, it is called an RRC_CONNECTED state, and when it is not connected, it is called an RRC_IDLE state. Since the UE in the RRC_CONNECTED state has an RRC connection, the E-UTRAN can determine the existence of the corresponding UE at the cell level, and thus can effectively control the UE.
  • in the RRC_IDLE state, the E-UTRAN cannot determine the existence of the UE, and the core network manages the UE in units of a TA (Tracking Area), which is a larger area unit than a cell. That is, for a UE in the RRC_IDLE state, only its existence in a large area unit is determined, and the UE must transition to the RRC_CONNECTED state to receive normal mobile communication services such as voice or data.
  • Each TA is classified through a Tracking Area Identity (TAI).
  • the terminal may configure the TAI through a tracking area code (TAC), which is information broadcasted from the cell.
  • when the user first turns on the power of the terminal, the terminal searches for an appropriate cell, establishes an RRC connection in that cell, and registers the terminal's information in the core network. After that, the terminal stays in the RRC_IDLE state. The terminal staying in the RRC_IDLE state (re)selects a cell as necessary and checks system information or paging information; this is called camping on the cell. The terminal that has stayed in the RRC_IDLE state establishes an RRC connection with the RRC of the E-UTRAN through an RRC connection procedure and transitions to the RRC_CONNECTED state when it needs to establish an RRC connection.
  • the NAS (Non-Access Stratum) layer located above the RRC layer performs functions such as session management and mobility management.
  • the ESM (evolved Session Management) belonging to the NAS layer performs functions such as default bearer management and dedicated bearer management.
  • a default bearer resource has the characteristic of being allocated from the network when the terminal first accesses a specific packet data network (PDN).
  • the network allocates an IP address available to the terminal so that the terminal can use the data service, and also allocates the QoS of the default bearer.
  • LTE largely supports two bearer types: a bearer having guaranteed bit rate (GBR) QoS characteristics, which guarantees a specific bandwidth for data transmission and reception, and a non-GBR bearer having best-effort QoS characteristics without a guaranteed bandwidth.
  • in the case of a dedicated bearer, a bearer having QoS characteristics of either GBR or non-GBR may be allocated.
  • the bearer allocated to the terminal in the network is called an evolved packet service (EPS) bearer, and when allocating an EPS bearer, the network assigns one ID, called the EPS bearer ID.
  • one EPS bearer has QoS characteristics of a maximum bit rate (MBR) and/or a guaranteed bit rate (GBR).
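  • The bearer model described above can be summarized in a small sketch; the dataclass and field names are illustrative, not 3GPP-defined structures.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class EpsBearer:
    """EPS bearer as described above: one allocated ID plus QoS characteristics."""
    bearer_id: int                     # EPS bearer ID allocated by the network
    is_default: bool                   # default bearer allocated at first PDN access
    gbr_mbps: Optional[float] = None   # guaranteed bit rate; None for a non-GBR bearer
    mbr_mbps: Optional[float] = None   # maximum bit rate

    @property
    def is_gbr(self) -> bool:
        return self.gbr_mbps is not None

# A default bearer typically has non-GBR (best effort) QoS; a dedicated bearer may be GBR.
default_bearer = EpsBearer(bearer_id=5, is_default=True)
video_bearer = EpsBearer(bearer_id=6, is_default=False, gbr_mbps=2.0, mbr_mbps=8.0)
assert video_bearer.is_gbr and not default_bearer.is_gbr
```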
  • FIG. 5 is a flowchart illustrating a random access procedure in 3GPP LTE.
  • the random access procedure is used for the UE to obtain UL synchronization with the base station or to be allocated UL radio resources.
  • the UE receives a root index and a physical random access channel (PRACH) configuration index from the eNodeB.
  • the root index is a logical index for the UE to generate 64 candidate random access preambles from a Zadoff-Chu (ZC) sequence.
  • the PRACH configuration index indicates a specific subframe in which transmission of a random access preamble is possible and a preamble format.
  • the UE transmits a randomly selected random access preamble to the eNodeB.
  • the UE selects one of 64 candidate random access preambles.
  • a corresponding subframe is selected by the PRACH configuration index.
  • the UE transmits the selected random access preamble in the selected subframe.
  • upon receiving the random access preamble, the eNodeB sends a random access response (RAR) to the UE.
  • the random access response is detected in two steps. First, the UE detects a PDCCH masked with a random access RNTI (RA-RNTI). Then, the UE receives a random access response in a Medium Access Control (MAC) Protocol Data Unit (PDU) on the PDSCH indicated by the detected PDCCH.
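  • A minimal sketch of the random access steps described above, assuming the LTE RA-RNTI relation of TS 36.321 (1 + t_id + 10 * f_id); the helper functions are illustrative, not a modem implementation.

```python
import random

NUM_PREAMBLES = 64  # the UE selects one of 64 candidate random access preambles

def select_preamble() -> int:
    """Randomly select a random access preamble index."""
    return random.randrange(NUM_PREAMBLES)

def ra_rnti(t_id: int, f_id: int) -> int:
    """RA-RNTI as in TS 36.321: 1 + t_id + 10 * f_id, where t_id is the index of
    the first subframe of the PRACH (0..9) and f_id the PRACH frequency index (0..5)."""
    assert 0 <= t_id < 10 and 0 <= f_id < 6
    return 1 + t_id + 10 * f_id

def rar_matches(pdcch_rnti: int, prach_t_id: int, prach_f_id: int,
                rar_preambles: list[int], sent_preamble: int) -> bool:
    """Two-step RAR detection: the PDCCH must be masked with the expected RA-RNTI,
    and the MAC PDU on the indicated PDSCH must answer the preamble that was sent."""
    return (pdcch_rnti == ra_rnti(prach_t_id, prach_f_id)
            and sent_preamble in rar_preambles)
```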
  • FIG. 6 shows a connection process in the radio resource control (RRC) layer.
  • FIG. 6 shows the RRC state depending on whether an RRC connection is established.
  • the RRC state refers to whether the entity of the RRC layer of the UE is in logical connection with the entity of the RRC layer of the eNodeB; when they are connected, the UE is said to be in the RRC connected state, and when no connection is established, in the RRC idle state.
  • since a UE in the connected state has an RRC connection, the E-UTRAN can determine the existence of the corresponding terminal at the cell level, and thus can effectively control the UE.
  • the UE in the idle state cannot be recognized by the eNodeB, and is managed by the Core Network in units of a tracking area, which is a larger area unit than a cell.
  • the tracking area is a set unit of cells. That is, only the existence of an idle state UE is determined in a large area unit, and the UE needs to transition to a connected state in order to receive a normal mobile communication service such as voice or data.
  • when the user first turns on the power of the UE, the UE searches for an appropriate cell and then stays in the idle state in that cell. The UE that has stayed in the idle state establishes an RRC connection with the RRC layer of the eNodeB through an RRC connection procedure when it needs to establish an RRC connection, and then transitions to the RRC connected state.
  • there are several cases in which the UE staying in the idle state needs to establish an RRC connection; for example, when a user's call attempt or uplink data transmission is required, or when a paging message is received from the E-UTRAN and a response message needs to be transmitted.
  • the RRC connection process largely includes a process in which the UE transmits an RRC connection request message to the eNodeB, the eNodeB transmits an RRC connection setup message to the UE, and the UE transmits an RRC connection setup complete message to the eNodeB. This process is described in more detail with reference to FIG. 6 as follows.
  • when the UE in idle mode wants to establish an RRC connection for reasons such as a call attempt, a data transmission attempt, or a response to paging from the eNodeB, the UE first transmits an RRC connection request message to the eNodeB.
  • upon receiving the RRC connection request message from the UE, the eNB accepts the RRC connection request when radio resources are sufficient, and transmits an RRC connection setup message, which is a response message, to the UE.
  • upon receiving the RRC connection setup message, the UE transmits an RRC connection setup complete message to the eNodeB. When the UE successfully transmits the RRC connection setup complete message, the UE finally establishes an RRC connection with the eNodeB and transitions to the RRC connected mode.
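  • The three-message exchange above can be sketched as follows; the Enb class and message strings are illustrative stand-ins rather than an RRC stack.

```python
# Message sequence sketch of the RRC connection establishment described above.
class Enb:
    def radio_resources_sufficient(self) -> bool:
        return True

def establish_rrc_connection(enb: Enb) -> str:
    state = "RRC_IDLE"
    messages = ["RRCConnectionRequest"]                # UE -> eNB (idle-mode trigger)
    if enb.radio_resources_sufficient():
        messages.append("RRCConnectionSetup")          # eNB -> UE (response message)
        messages.append("RRCConnectionSetupComplete")  # UE -> eNB
        state = "RRC_CONNECTED"                        # UE transitions to connected mode
    print(messages)
    return state

assert establish_rrc_connection(Enb()) == "RRC_CONNECTED"
```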
  • the functions of the MME are divided into an Access and Mobility Management Function (AMF) and a Session Management Function (SMF) in the Next Generation system (or 5G Core Network (CN)). Accordingly, NAS interaction with the UE and mobility management (MM) are performed by the AMF, and session management (SM) is performed by the SMF.
  • the SMF also manages the User Plane Function (UPF), which handles the user plane, i.e., the routing of user traffic. The SMF can be considered to be in charge of the control plane part of the conventional S-GW and P-GW, and the UPF to be in charge of the user plane part.
  • one or more UPFs may exist between the RAN and the DN (Data Network). That is, in 5G, the conventional EPC may be configured as illustrated in FIG. 7.
  • a PDU session refers to an association between a UE and a DN that provides a PDU connectivity service of not only IP type but also Ethernet type or unstructured type.
  • the Unified Data Management (UDM) performs the role of the conventional HSS, and the Policy Control Function (PCF) performs the role of the conventional PCRF.
  • the functions can be provided in an expanded form to satisfy the requirements of the 5G system.
  • for the details of each function and each interface, TS 23.501 applies mutatis mutandis.
  • Non-3GPP access is typically WLAN access, which may include both a trusted WLAN and an untrusted WLAN.
  • the Access and Mobility Management Function (AMF) of the 5G system performs Registration Management (RM) and Connection Management (CM) for non-3GPP access as well as 3GPP access.
  • since the same AMF serves a UE for 3GPP access and non-3GPP access belonging to the same PLMN, one network function can apply authentication, mobility management, and session management in an integrated and efficient manner to a UE registered through the two different accesses.
  • each eV2X service may be provided with different application configurations such as levels of automation, gaps between vehicles, etc.
  • Each application configuration can have different QoS requirements.
  • when QoS changes, applications may have to adjust their configuration according to the new QoS that can be delivered.
  • for the eV2X service, it may be important for some application(s) to be notified in advance of potential changes in the delivered QoS in order to be able to dynamically adjust the configuration.
  • the notification can take into account the locations where the UE is likely to drive during a given time. This key issue is to study 5GS enhancements to support application adjustment for eV2X services based on notification of potential changes in the delivered QoS.
  • FIG. 8 shows a procedure used by an NF to retrieve user plane congestion analytics for a specific geographic area. This procedure can be used to request a one-time or a continuous report of user plane congestion analytics.
  • in step S801, the NF transmits an Nnwdaf_AnalyticsInfo_Request to the NWDAF to request analytics for user plane congestion at a specific location.
  • the NF can request statistics or predictions, or both.
  • the analytics type is set to user plane congestion, and the analytics target is set to a location (e.g., ECGI, TA).
  • in steps S802 to S803, if the request is approved, the NWDAF may request the user plane congestion state for the requested location from the OAM in order to provide the requested analytics, and the OAM provides the requested information. If the NWDAF already has information about the user plane congestion state at the requested location, this step is omitted.
  • in step S804, the NWDAF derives the requested analytics.
  • in step S805, the NWDAF provides the analytics for the user plane congestion to the NF.
  • in step S806, the NF transmits an Nnwdaf_EventsSubscription_Subscribe request to the NWDAF to request analytics for user plane congestion at a specific location (e.g., ECGI, TA). The NF can request statistics or predictions, or both.
  • in steps S807 to S808, the NWDAF subscribes to the OAM to obtain the user plane congestion state for the requested location, providing a congestion level threshold if possible, and the OAM provides a first report on the requested information in response.
  • in step S809, the NWDAF derives the requested analytics.
  • in step S810, the NWDAF provides the analytics for the user plane congestion to the NF.
  • in step S811, a change in the user plane congestion state corresponding to exceeding the threshold set by the NWDAF is detected by the OAM and notified to the NWDAF.
  • in step S812, the NWDAF derives new analytics.
  • in step S813, the NWDAF provides an analytics notification on the user plane congestion to the NF.
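  • A compact sketch of the FIG. 8 interactions, assuming simplified stand-ins for the NWDAF and OAM; only the operation names (Nnwdaf_AnalyticsInfo_Request, Nnwdaf_EventsSubscription_Subscribe) come from the text, everything else is illustrative.

```python
class Oam:
    """Stub OAM exposing the user plane congestion state (steps S802/S807/S811)."""
    def get_congestion_state(self, location):
        return {"location": location, "level": 3}

    def subscribe(self, callback, location, threshold):
        self.callback, self.threshold = callback, threshold
        return self.get_congestion_state(location)       # first report (S808)

class Nwdaf:
    def __init__(self, oam):
        self.oam = oam
        self.subscribers = []

    # One-time retrieval, Nnwdaf_AnalyticsInfo_Request (steps S801-S805)
    def analytics_info_request(self, location):
        state = self.oam.get_congestion_state(location)  # S802-S803, skipped if cached
        return self.derive_analytics(state)              # S804, returned in S805

    # Continuous reporting, Nnwdaf_EventsSubscription_Subscribe (steps S806-S813)
    def events_subscription_subscribe(self, nf_callback, location, threshold):
        self.subscribers.append(nf_callback)
        state = self.oam.subscribe(self.on_oam_notify, location, threshold)  # S807-S808
        nf_callback(self.derive_analytics(state))        # S809-S810, first analytics

    def on_oam_notify(self, state):                      # S811: threshold exceeded
        analytics = self.derive_analytics(state)         # S812
        for callback in self.subscribers:
            callback(analytics)                          # S813

    def derive_analytics(self, state):
        return {"type": "user plane congestion", "state": state}
```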
  • a method of processing QoS change prediction through a 3GPP 5G system (5G mobile communication system, next-generation mobile communication system) proposed below may be configured by a combination of one or more of the following operations/configurations/steps.
  • the method proposed in the embodiment(s) is useful for V2X services.
  • the V2X application server can be replaced with an application function or application server below.
  • the term V2X service is used interchangeably with V2X application, V2X message, V2X traffic, and V2X data.
  • the UE may include all various UEs such as vehicle UEs as well as pedestrian UEs.
  • QoS may be QoS for PC5 communication and/or QoS for Uu communication.
  • the NWDAF receives a notification based on a change in the user plane congestion status from Operations and Maintenance (OAM), and based on the notification, the NWDAF can transmit a user plane congestion analytics (analytics for the user plane congestion) notification to the V2X application server through the Network Exposure Function (NEF).
  • the change in the user plane congestion state is determined based on one or more pieces of information, including whether a QoS Notification Control (QNC) notification has been transmitted. Whether the QNC notification has been transmitted may be tracked with a counter value that increases when the NG-RAN informs the SMF that the GFBR cannot be fulfilled/guaranteed, and decreases when the NG-RAN informs the SMF that the GFBR can be fulfilled/guaranteed again.
  • changes in the user plane congestion status detected by the OAM may relate to, for example, the packet delay of the related 5QI(s), the average UL/DL throughput, DRB accessibility/retainability, and whether a notification for QNC has been sent (e.g., whether the NG-RAN has notified the SMF that it cannot fulfil the GFBR requirement but re-fulfilment has not yet been notified).
  • the OAM may monitor, for a QoS flow of a specific 5QI, whether the QoS requirements (e.g., 5QI(s), GFBR-UL and DL, MFBR-UL and DL) are satisfied, based on what the NG-RAN reports to the SMF (i.e., the core network).
  • the V2X application server or the NWDAF may request to be notified that there is a change in the user plane congestion status when a QNC notification has been transmitted once (i.e., one time) or n or more times, as described above.
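  • One possible reading of the QNC counter described above, as a minimal sketch (the increase/decrease rule and the threshold n are interpretations of the text, not a normative definition):

```python
class QncTracker:
    """Counter-based view of whether a QNC notification condition holds."""
    def __init__(self, n_threshold: int = 1):
        self.counter = 0
        self.n_threshold = n_threshold

    def on_ngran_report(self, gfbr_fulfilled: bool) -> None:
        if not gfbr_fulfilled:
            self.counter += 1     # NG-RAN -> SMF: "GFBR cannot be fulfilled/guaranteed"
        elif self.counter > 0:
            self.counter -= 1     # NG-RAN -> SMF: GFBR can be fulfilled/guaranteed again

    def congestion_changed(self) -> bool:
        # e.g. notify when a QNC notification was sent once or n or more times
        return self.counter >= self.n_threshold
```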
  • the user plane congestion analysis notification may include a location and time at which a potential change in QoS may occur.
  • the NWDAF receiving the notification based on the change in the user plane congestion state from the OAM may be performed after the following subscription request/response procedure between the V2X application server and the OAM is performed.
  • the subscription request/response procedure includes: receiving, by the NWDAF, a second subscription request from the NEF, which has received a first subscription request related to congestion-related analytics information from the V2X application server; transmitting, by the NWDAF, a third subscription request to the OAM while providing a threshold value based on the first subscription request; receiving, by the NWDAF, a response to the third subscription request from the OAM; and transmitting, by the NWDAF, an analysis of the user plane congestion derived based on the response to the V2X application server through the NEF.
  • the first subscription request may include a subscription request for a location.
  • the location may be all or part of the route as shown in FIG. 9.
  • the V2X application server sets the location for which user plane congestion-related analytics are requested so as to cover the entire area (FIG. 9(a)) or a partial area (FIG. 9(b)) along the path.
  • the path length represented by the location can be selected to suit the needs of a specific application, and should be sufficient for safe operation within a specific time window (i.e., it does not have to be the end-to-end path that the UE should finally reach).
  • until the requested location covers the final destination, the V2X application server requests the NWDAF to report analytics information for the next location along the route (e.g., Location #2), at an appropriate time in consideration of the UE speed, the path, and the V2X application. Accordingly, notification of potential changes in QoS can support application adjustment.
  • the previously requested location may overlap the next requested location.
  • the V2X application server may cancel the subscription for the location after performing a subscription for the next location on the path.
  • the first subscription request may include a subscription request for a plurality of locations.
  • the subscription request for the plurality of locations may include an observation start time and an end time for each of the plurality of locations. That is, the V2X application server may request a subscription including a plurality of locations. The multiple locations may cover the entire path or only part of the path. For each location, the request may include the observation start and end times, or the observation start time and a validity period. For example, this can be included in the form of a list such as the following.
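  • A possible form of such a list (all field names and values are illustrative assumptions):

```python
# Illustrative subscription covering several locations along the path; each entry
# carries either observation start/end times or a start time plus a validity period.
locations = [
    {"location_id": 1, "area": {"type": "TAI", "value": ["TAI-1"]},
     "observation_start": "2020-02-07T09:00Z", "observation_end": "2020-02-07T09:20Z"},
    {"location_id": 2, "area": {"type": "cell", "value": ["Cell-17", "Cell-18"]},
     "observation_start": "2020-02-07T09:15Z", "validity_period_s": 1800},
]
```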
  • the NWDAF may perform a subscription request to the OAM for each location according to the observation start time of each location; the NEF may also perform this operation. Later, an update may be requested for only some of the locations; in this case, the updated information can be provided together with the ID of the location to be updated.
  • Each of the plurality of locations may be a geographical area designated/detailed by the V2X application server.
  • the geographic area may be one of Cell ID(s), TAI(s), polygon, circle, or civic address.
  • the NWDAF receiving the notification based on the change in the user plane congestion state from the OAM, and the NWDAF transmitting the user plane congestion analytics notification to the V2X application server through the NEF based on the notification, may correspond to step S1007, and steps S1008 to S1009 of FIG. 10, respectively.
  • in step S1001, the UE provides information on a path, a path start time, and QoS requirements (e.g., 5QI(s)) to the V2X application server.
  • steps S1002 to S1009 are based on the mechanisms and procedures specified in clause 6.1.1 'Analytics Subscribe/Unsubscribe' and clause 6.12 'User plane congestion analytics' of TS 23.288.
  • in step S1002, the V2X application server subscribes to analytics information from the NWDAF through the NEF.
  • the analysis type is set to user plane congestion and the analysis target is set to location.
  • the V2X application server can request statistics or predictions or both.
  • for the analytics type, 'user plane congestion' can be used, or a new type can be defined and used.
  • the request includes location information.
  • the requested location is a geographical area specified/detailed by the V2X application server, and may be at cell level (Cell ID(s)), at TA level (TAI(s)), or in another format (e.g., a polygon, a circle, etc., or a civic address (e.g., street, district, etc.)).
  • the requested location may cover the entire area along the path or a partial area along the path. If the initially requested location covers a partial area along the route (e.g., Location #1), the V2X application server subscribes to NWDAF analytics information for the next location along the route (e.g., Location #2) until the requested location covers the final destination, and cancels the subscription to NWDAF analytics information for the previous location (e.g., Location #1). The previously requested location may overlap with the next requested location, and the V2X application server subscribes to the NWDAF analytics information for the next location at an appropriate time in consideration of, for example, the UE speed, the route, and the V2X application.
  • the request may include QoS requirements (e.g., 5QI(s), GFBR-UL & DL, MFBR-UL & DL), which may be provided as threshold information to be used for user plane congestion notification regarding a potential change in QoS.
  • QoS requirements and threshold information may also be provided separately.
  • for example: QoS requirements: UL/DL GFBR XX Mbps, UL/DL MFBR YY Mbps; thresholds: UL/DL GFBR AA Mbps, UL/DL MFBR BB Mbps.
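  • An illustrative payload separating the two sets of values, keeping the XX/YY/AA/BB placeholders from the text (field names are assumptions):

```python
# Illustrative request body separating QoS requirements from notification thresholds.
subscription_request = {
    "qos_requirements": {"ul_dl_gfbr_mbps": "XX", "ul_dl_mfbr_mbps": "YY"},
    "thresholds":       {"ul_dl_gfbr_mbps": "AA", "ul_dl_mfbr_mbps": "BB"},
}
```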
  • if multiple UEs moving along the same path (i.e., the same origin and destination) at the same time have provided the same information (step S1001), the V2X application server may perform a single subscription to analytics information with the NWDAF for these UEs. In this case, upon receiving a notification of a potential change in QoS, it may notify these UEs.
  • the request may also include a start time.
  • the NEF records the association between the analysis trigger and the requester ID.
  • in step S1003, the NEF subscribes to the analytics information from the NWDAF according to the request of the V2X application server.
  • the NEF may apply restrictions (e.g., restrictions on parameters or parameter values of the Nnwdaf_AnalyticsSubscription_Subscribe service operation) to the subscription request according to the operator configuration.
  • the NEF can convert the location information provided by the V2X application server into a format understood by the 3GPP system (e.g., a TA list, a cell list, etc.).
  • in step S1004, the NWDAF provides congestion level thresholds and subscribes to the OAM to obtain the user plane congestion state for the requested location, and the OAM provides a first report on the requested information in response.
  • in step S1005, the NWDAF derives the requested analytics and provides the analytics for the user plane congestion to the NEF.
  • in step S1006, the NEF provides the analytics for the user plane congestion to the V2X application server.
  • in step S1007, a change in the user plane congestion state exceeding the threshold set by the NWDAF is detected by the OAM and notified to the NWDAF.
  • the notification includes where and when a potential change in QoS can occur.
  • the change of the user plane congestion information detected by OAM is described in FIG. 11 below.
  • the notification may include the item or reason for which a potential QoS change is expected.
  • in step S1008, the NWDAF derives new analytics and provides an analytics notification for the user plane congestion to the NEF.
  • the NWDAF may configure notification information to be delivered to the V2X application server based on the notification information provided by OAM.
  • in step S1009, the NEF provides the user plane congestion analytics (analytics for the user plane congestion) notification to the V2X application server. If necessary, the NEF converts the location information provided by the NWDAF, which indicates where potential QoS changes may occur, into a form understood by the V2X application server (e.g., a polygon, a circle, etc., or a civic address (e.g., street, district, etc.)).
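  • A sketch of the two conversions the NEF performs, with stubbed mapping lookups (the helper names and data shapes are assumptions):

```python
def lookup_tais(area: dict) -> list[str]:
    """Stub for an assumed geo-to-TAI mapping database."""
    return ["TAI-1"]

def lookup_address(area: dict) -> str:
    """Stub for an assumed TAI/cell-to-address mapping database."""
    return "Main Street, District 9"

def to_3gpp_location(app_area: dict) -> dict:
    """NEF inbound (around step S1003): polygon/circle/civic address -> TA or cell list."""
    return {"type": "ta_list", "value": lookup_tais(app_area)}

def to_application_location(net_area: dict) -> dict:
    """NEF outbound (step S1009): TA/cell list -> a form the V2X AS understands."""
    return {"type": "civic_address", "value": lookup_address(net_area)}
```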
  • V2X application coordination may take place in the UE and/or the V2X application server upon receiving a notification about a potential change in QoS.
  • the above procedure involves continuous reporting, but one-time reporting may also be used.
  • the V2X application server can also interact directly with the NWDAF (e.g., if the V2X application server is a trusted application function (AF)).
  • a procedure in which the V2X application server subscribes to NWDAF analytics information for the next location along a path and cancels the subscription to NWDAF analytics information for the previous location will be described with reference to FIG. 11. As shown, when the path is changed and an updated path is provided by the UE, the V2X application server may subscribe to the NWDAF analytics information for the updated location along the path and unsubscribe from the NWDAF analytics information for the previously requested location.
  • in step S1101, the V2X application server subscribes, through the NEF, to the user plane congestion analytics of the NWDAF for the next location along the path (e.g., Location #2). This step corresponds to steps S1002 to S1006 of FIG. 10.
  • the V2X application server unsubscribes, through the NEF, from the user plane congestion analytics of the NWDAF for the previous location (e.g., Location #1).
  • in step S1104, the NWDAF unsubscribes from the user plane congestion information of the OAM for the previous location (e.g., Location #1).
  • in steps S1105 to S1106, the V2X application server receives a response.
  • in steps S1107 to S1108, subscription to the user plane congestion analytics for the next location along the route and unsubscription from the user plane congestion analytics for the previous location are repeated until the requested location includes the final destination.
  • alternatively, the V2X application server may avoid performing a separate unsubscription: by including a subscription period (e.g., an observation period, duration, validity period, etc.) when subscribing to the NWDAF analytics information, a previous subscription can simply expire instead of being explicitly cancelled. A sketch of this rolling-subscription handling is given after the next two bullets.
  • New/changed location or new/changed QoS requirements and/or thresholds information (these can also be interpreted as Analytic Filters), perform a new subscription, and update the existing subscription with the changed parameters instead of canceling the previous subscription. You may.
  • identification information eg, Analytic ID
  • Identification information can refer/identify an existing subscription can be provided in a subscription update request.
  • Meanwhile, the V2X application server may request a subscription including a plurality of locations. The multiple locations may cover the entire path or only part of the path. For each location, the observation start and end times, or the observation start time and a validity period, may be included. For example, they can be included in the form of a list, as sketched below.
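  • The exact information elements of such a list are not reproduced in this text; the following shape, whose field names are assumptions only, illustrates one entry per location with either an observation start/end time or a start time plus validity period:

```python
# Hypothetical multi-location subscription payload; field names are
# illustrative assumptions, not 3GPP-defined information elements.
subscription_request = {
    "analytics_type": "user-plane-congestion",
    "locations": [
        {"location_id": "Location #1",
         "observation_start": "2020-02-07T09:00:00Z",
         "observation_end":   "2020-02-07T09:30:00Z"},
        {"location_id": "Location #2",
         "observation_start": "2020-02-07T09:30:00Z",
         "validity_period_s": 1800},     # start time + validity period
    ],
}
```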
  • The NWDAF may perform a subscription request to the OAM for each location according to that location's observation start time; the NEF may also perform this operation. Afterwards, an update may be requested for only some of the locations, in which case the updated information can be provided for the ID of the location to be updated.
  • The V2X application server can also interact directly with the NWDAF (for example, if the V2X application server is a trusted AF).
  • In step S1201, the UE provides information on a path, a path start time, and a QoS requirement (e.g., a 5QI) to the V2X application server.
  • Steps S1202 to S1206 are based on the mechanisms and procedures defined in clause 6.1.2 ("Analytics Request") and clause 6.12 ("User Plane Congestion Analytics") of TS 23.288.
  • In step S1202, the V2X application server requests analytics information from the NWDAF through the NEF.
  • The NEF approves the request for analytics information and records the association between the analytics trigger and the requester ID.
  • The NEF then transmits Nnwdaf_AnalyticsInfo_Request to the NWDAF according to the request of the V2X application server. This corresponds to the contents of S1003 of FIG. 10.
  • In step S1204, if the request is approved, the NWDAF may request the user plane congestion state for the requested location from the OAM in order to provide the requested analytics, and the OAM provides the requested information. If the NWDAF already has information about the user plane congestion state at the requested location, this step is omitted.
  • In step S1205, the NWDAF derives the requested analytics and provides the user plane congestion analytics to the NEF. This corresponds to the contents of S1008 of FIG. 10.
  • In step S1206, the NEF provides the user plane congestion analytics to the V2X application server. This corresponds to the contents of S1009 of FIG. 10.
  • In steps S1207 to S1208, when the location requested by the V2X application server in step S1202 covers only a part of the path, the V2X application server must continue to request analytics information for the next location. This should be performed at an appropriate time in consideration of the UE speed, the path, and the V2X application, so that the information on potential QoS changes provided by the V2X application server can assist application adjustment. This analytics information request can be performed until the requested locations cover the final destination.
  • V2X application adjustment may take place in the UE and/or the V2X application server upon receiving a notification about a potential change in QoS.
  • Meanwhile, the V2X application server may make a request including a plurality of locations. The multiple locations may cover the entire path or only part of the path. For each location, the observation start time can be included, in the same list form as sketched above.
  • The NWDAF may perform a subscription request to the OAM for each location in accordance with that location's observation start time; the NEF may also perform this operation. Afterwards, an update may be requested for only some of the locations, in which case the updated information can be provided for the ID of the location to be updated.
  • Mobile terminals described herein include mobile phones, smartphones, laptop computers, digital broadcasting terminals, personal digital assistants (PDAs), portable multimedia players (PMPs), navigation devices, slate PCs, tablet PCs, ultrabooks, and wearable devices (for example, a watch-type terminal (smartwatch), a glass-type terminal (smart glasses), a head mounted display (HMD)), and the like.
  • The mobile terminal may also be used for controlling at least one device in an Internet of Things (IoT) environment or a smart greenhouse.
  • FIG. 13 is a block diagram illustrating a mobile terminal related to the present invention.
  • The mobile terminal 100 may include a transmission/reception device 110, a processor 120, a memory 130, a sensing unit 140, an output unit 150, an interface unit 160, an input unit 170, a power supply unit 190, and the like.
  • The components shown in FIG. 13 are not essential for implementing a mobile terminal, so the mobile terminal described in this specification may have more or fewer components than those listed above.
  • The transmission/reception device 110 may include one or more modules that enable wireless communication between the mobile terminal 100 and a wireless communication system, between the mobile terminal 100 and another mobile terminal 100, or between the mobile terminal 100 and an external server. The transmission/reception device 110 may also include one or more modules that connect the mobile terminal 100 to one or more networks.
  • The transmission/reception device 110 may include at least one of a broadcast reception module 111, a mobile communication module 112, a wireless Internet module 113, a short-range communication module 114, and a location information module 115.
  • The input unit 170 may include a camera 171 or an image input unit for inputting a video signal, a microphone 172 or an audio input unit for inputting an audio signal, and a user input unit 173 (e.g., a touch key, a mechanical key, etc.) for receiving information from a user. Voice data or image data collected by the input unit 170 may be analyzed and processed as a user's control command.
  • The sensing unit 140 may include one or more sensors for sensing at least one of information in the mobile terminal, information on the environment surrounding the mobile terminal, and user information.
  • For example, the sensing unit 140 may include at least one of a proximity sensor 141, an illumination sensor 142, a touch sensor, an acceleration sensor, a magnetic sensor, a gravity sensor (G-sensor), an optical sensor (e.g., the camera 171), a microphone 172, a battery gauge, an environmental sensor (e.g., a barometer, hygrometer, thermometer, radiation detection sensor, heat sensor, gas sensor, etc.), and a chemical sensor (e.g., an electronic nose, a healthcare sensor, a biometric sensor, etc.).
  • the mobile terminal disclosed in the present specification may combine and utilize information sensed by at least two or more of these sensors.
  • The output unit 150 generates output related to the visual, auditory, or tactile senses, and may include at least one of a display unit 151, a sound output unit 152, a haptic module 153, and a light output unit 154.
  • The display unit 151 may form a layered structure with a touch sensor or be formed integrally with it, thereby implementing a touch screen.
  • Such a touch screen may function as the user input unit 173 providing an input interface between the mobile terminal 100 and the user, and at the same time provide an output interface between the mobile terminal 100 and the user.
  • The interface unit 160 serves as a passage to the various types of external devices connected to the mobile terminal 100.
  • The interface unit 160 may include at least one of a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device equipped with an identification module, an audio input/output (I/O) port, a video input/output (I/O) port, and an earphone port.
  • the mobile terminal 100 may perform appropriate control related to the connected external device in response to the connection of the external device to the interface unit 160.
  • the memory 130 stores data supporting various functions of the mobile terminal 100.
  • The memory 130 may store a plurality of application programs (or applications) driven on the mobile terminal 100, and data and commands for the operation of the mobile terminal 100. At least some of these application programs may be downloaded from an external server through wireless communication. In addition, at least some of these application programs may exist on the mobile terminal 100 from the time of shipment for the basic functions of the mobile terminal 100 (e.g., receiving and placing calls, receiving and sending messages). Meanwhile, an application program may be stored in the memory 130, installed on the mobile terminal 100, and driven by the processor 120 to perform an operation (or function) of the mobile terminal.
  • In addition to operations related to the application programs, the processor 120 generally controls the overall operation of the mobile terminal 100.
  • the processor 120 may provide or process appropriate information or functions to a user by processing signals, data, information, etc. input or output through the above-described components or by driving an application program stored in the memory 130.
  • Also, the processor 120 may control at least some of the components discussed with reference to FIG. 13 in order to drive an application program stored in the memory 130. Further, the processor 120 may operate at least two of the components included in the mobile terminal 100 in combination with each other in order to drive the application program.
  • the power supply unit 190 receives external power and internal power under the control of the processor 120 and supplies power to each component included in the mobile terminal 100.
  • the power supply unit 190 includes a battery, and the battery may be a built-in battery or a replaceable battery.
  • At least some of the components may operate in cooperation with each other to implement an operation, control, or control method of a mobile terminal according to various embodiments described below.
  • the operation, control, or control method of the mobile terminal may be implemented on the mobile terminal by driving at least one application program stored in the memory 130.
  • the broadcast receiving module 111 of the transmitting and receiving device 110 receives a broadcast signal and/or broadcast-related information from an external broadcast management server through a broadcast channel.
  • the broadcast channel may include a satellite channel and a terrestrial channel.
  • Two or more broadcast reception modules may be provided to the mobile terminal 100 for simultaneous broadcast reception or broadcast channel switching for at least two broadcast channels.
  • The mobile communication module 112 transmits and receives radio signals with at least one of a base station, an external terminal, and a server on a mobile communication network established according to technical standards or communication methods for mobile communication (e.g., Global System for Mobile communication (GSM), Code Division Multi Access (CDMA), CDMA2000, Enhanced Voice-Data Optimized or Enhanced Voice-Data Only (EV-DO), Wideband CDMA (WCDMA), High Speed Downlink Packet Access (HSDPA), High Speed Uplink Packet Access (HSUPA), Long Term Evolution (LTE), Long Term Evolution-Advanced (LTE-A), 3GPP New Radio access technology (NR), etc.).
  • the wireless signal may include a voice call signal, a video call signal, or various types of data according to transmission/reception of text/multimedia messages.
  • the wireless Internet module 113 refers to a module for wireless Internet access, and may be built-in or external to the mobile terminal 100.
  • the wireless Internet module 113 is configured to transmit and receive wireless signals in a communication network according to wireless Internet technologies.
  • wireless Internet technologies include WLAN (Wireless LAN), Wi-Fi (Wireless-Fidelity), Wi-Fi (Wireless Fidelity) Direct, DLNA (Digital Living Network Alliance), WiBro (Wireless Broadband), WiMAX (World Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), HSUPA (High Speed Uplink Packet Access), LTE (Long Term Evolution), LTE-A (Long Term Evolution-Advanced), 3GPP NR, etc.
  • The wireless Internet module 113 transmits and receives data according to at least one wireless Internet technology, including Internet technologies not listed above.
  • From the viewpoint that wireless Internet access via a mobile communication network is performed through that network, the wireless Internet module 113 performing such access may be understood as a kind of the mobile communication module 112.
  • The short-range communication module 114 is for short-range communication, and may support it using at least one of Bluetooth™, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, Near Field Communication (NFC), Wireless-Fidelity (Wi-Fi), Wi-Fi Direct, and Wireless Universal Serial Bus (Wireless USB) technologies.
  • Here, the other mobile terminal 100 may be a wearable device capable of exchanging data with (or interworking with) the mobile terminal 100 according to the present invention, for example, a smartwatch, smart glasses, a neckband, or a head mounted display (HMD).
  • The short-range communication module 114 may detect (or recognize) a wearable device capable of communicating with the mobile terminal 100 in the vicinity of the mobile terminal 100. Furthermore, when the detected wearable device is a device authenticated to communicate with the mobile terminal 100 according to the present invention, the processor 120 may transmit at least part of the data processed by the mobile terminal 100 to the wearable device through the short-range communication module 114.
  • Accordingly, a user of the wearable device can use the data processed by the mobile terminal 100 through the wearable device. For example, when a call is received by the mobile terminal 100, the user can conduct the call through the wearable device, and when a message is received by the mobile terminal 100, the user can check the received message through the wearable device.
  • In addition, screen mirroring may be performed with a TV located in a home or a display inside a vehicle through the short-range communication module 114, and the corresponding function is performed based on, for example, the MirrorLink or Miracast standard.
  • The location information module 115 is a module for obtaining the location (or current location) of the mobile terminal; representative examples include a Global Positioning System (GPS) module and a Wireless Fidelity (Wi-Fi) module.
  • For example, when using the Wi-Fi module, the mobile terminal may acquire its location based on information from a wireless access point (AP) that transmits or receives wireless signals to or from the Wi-Fi module.
  • If necessary, the location information module 115 may perform, substitutively or additionally, a function of another module of the transmission/reception device 110 in order to obtain data on the location of the mobile terminal.
  • The location information module 115 is a module used to obtain the location (or current location) of the mobile terminal, and is not limited to a module that directly calculates or obtains the location of the mobile terminal.
  • Each of the broadcast reception module 111, the mobile communication module 112, the short-range communication module 114, and the location information module 115 may be implemented as separate modules that perform a corresponding function, or the broadcast reception module 111, Functions corresponding to two or more of the mobile communication module 112, the short-range communication module 114, and the location information module 115 may be implemented by one module.
  • the input unit 170 is for inputting image information (or signal), audio information (or signal), data, or information input from a user.
  • For input of image information, the mobile terminal 100 may be provided with one or a plurality of cameras 171.
  • the camera 171 processes an image frame such as a still image or a video obtained by an image sensor in a video call mode or a photographing mode.
  • the processed image frame may be displayed on the display unit 151 or stored in the memory 130.
  • Meanwhile, the plurality of cameras 171 provided in the mobile terminal 100 may be arranged to form a matrix structure, and through the cameras 171 forming the matrix structure, a plurality of pieces of image information having various angles or focal points may be input to the mobile terminal 100.
  • the plurality of cameras 171 may be arranged in a stereo structure to obtain a left image and a right image for implementing a stereoscopic image.
  • the microphone 172 processes an external sound signal into electrical voice data.
  • the processed voice data may be used in various ways according to a function (or an application program being executed) being executed by the mobile terminal 100. Meanwhile, the microphone 172 may be implemented with various noise removal algorithms to remove noise generated in the process of receiving an external sound signal.
  • the user input unit 173 is for receiving information from a user, and when information is input through the user input unit 173, the processor 120 may control the operation of the mobile terminal 100 to correspond to the input information.
  • For example, the user input unit 173 may include a mechanical input means (or a mechanical key, for example, a button located on the front, rear, or side of the mobile terminal 100, a dome switch, a jog wheel, a jog switch, etc.) and a touch-type input means.
  • As an example, the touch-type input means may consist of a virtual key, a soft key, or a visual key displayed on the touch screen through software processing, or a touch key disposed on a part other than the touch screen.
  • The virtual key or visual key can be displayed on the touch screen in various forms, for example, as graphics, text, an icon, a video, or a combination thereof.
  • As described above, the sensing unit 140 senses at least one of information in the mobile terminal, information on the environment surrounding the mobile terminal, and user information, and generates a sensing signal corresponding thereto.
  • the processor 120 may control the driving or operation of the mobile terminal 100 or perform data processing, functions, or operations related to an application program installed in the mobile terminal 100 based on such a sensing signal. Representative sensors among various sensors that may be included in the sensing unit 140 will be described in more detail.
  • the proximity sensor 141 refers to a sensor that detects the presence or absence of an object approaching a predetermined detection surface or an object existing in the vicinity using the force of an electromagnetic field or infrared rays without mechanical contact.
  • the proximity sensor 141 may be disposed in an inner area of the mobile terminal surrounded by the touch screen as described above or near the touch screen.
  • Examples of the proximity sensor 141 include a transmissive photoelectric sensor, a direct-reflective photoelectric sensor, a mirror-reflective photoelectric sensor, a high-frequency oscillation proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor.
  • The proximity sensor 141 may be configured to detect the proximity of a conductive object through changes in the electric field caused by that object's approach. In this case, the touch screen (or touch sensor) itself may be classified as a proximity sensor.
  • For convenience of description, the act of causing an object to be recognized as being positioned on the touch screen without the object contacting the touch screen is referred to as a "proximity touch," and the act of actually bringing an object into contact with the touch screen is referred to as a "contact touch."
  • The position at which an object makes a proximity touch on the touch screen means the position at which the object corresponds vertically to the touch screen during the proximity touch.
  • The proximity sensor 141 may detect a proximity touch and a proximity touch pattern (e.g., proximity touch distance, proximity touch direction, proximity touch speed, proximity touch time, proximity touch position, proximity touch movement state, etc.).
  • Meanwhile, the processor 120 processes the data (or information) corresponding to the proximity touch operation and proximity touch pattern sensed through the proximity sensor 141, and may further output visual information corresponding to the processed data on the touch screen. Furthermore, the processor 120 may control the mobile terminal 100 so that different operations are performed, or different data (or information) is processed, depending on whether a touch on the same point of the touch screen is a proximity touch or a contact touch.
  • The touch sensor detects a touch (or touch input) applied to the touch screen (or the display unit 151) using at least one of various touch methods, such as a resistive method, a capacitive method, an infrared method, an ultrasonic method, and a magnetic field method.
  • the touch sensor may be configured to convert a change in pressure applied to a specific portion of the touch screen or a capacitance generated in a specific portion into an electrical input signal.
  • the touch sensor may be configured to detect a location, area, pressure when touched, capacitance when touched, etc. of a touch object applied to the touch screen on the touch sensor.
  • the touch object is an object that applies a touch to the touch sensor, and may be, for example, a finger, a touch pen, a stylus pen, or a pointer.
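  • As a toy illustration of the capacitive case, a touch controller could compare each node's measured capacitance against a baseline and report a touch event once the change exceeds a threshold; the numbers and data format below are assumptions:

```python
# Toy capacitive touch detection: a capacitance change at a node is turned
# into an input event for the processor. Threshold and units are assumed.
BASELINE_PF = 10.0     # idle capacitance of a sensor node, in picofarads
TOUCH_DELTA_PF = 1.5   # change treated as a contact touch

def detect_touch(node_xy, measured_pf):
    """Return an event for the processor, or None when there is no touch."""
    if measured_pf - BASELINE_PF >= TOUCH_DELTA_PF:
        return {"type": "contact_touch", "position": node_xy,
                "capacitance_pf": measured_pf}
    return None

print(detect_touch((120, 340), 11.8))  # touch event
print(detect_touch((120, 340), 10.2))  # None
```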
  • When there is a touch input to the touch sensor, the corresponding signal(s) are sent to a touch controller, which processes the signal(s) and then transmits the corresponding data to the processor 120.
  • In this way, the processor 120 can know which area of the display unit 151 has been touched.
  • the touch controller may be a separate component from the processor 120 or may be the processor 120 itself.
  • the processor 120 may perform different controls or the same control according to the type of the touch object by touching the touch screen (or a touch key provided in addition to the touch screen). Whether to perform different controls or to perform the same control according to the type of the touch object may be determined according to an operation state of the mobile terminal 100 or an application program being executed.
  • The touch sensor and the proximity sensor described above can, independently or in combination, sense various types of touches on the touch screen, such as a short (or tap) touch, a long touch, a multi touch, a drag touch, a flick touch, a pinch-in touch, a pinch-out touch, a swipe touch, and a hovering touch.
  • the ultrasonic sensor may recognize location information of a sensing target by using ultrasonic waves.
  • As an example, the processor 120 may calculate the location of a wave source from information sensed by an optical sensor and a plurality of ultrasonic sensors.
  • The location of the wave source may be calculated using the property that light is much faster than ultrasound, that is, the time for light to reach the optical sensor is much shorter than the time for the ultrasound to reach the ultrasonic sensor. More specifically, the position of the wave source is calculated from the arrival-time difference of the ultrasound with the light serving as a reference signal, as in the simple calculation below.
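  • A minimal numeric sketch of this ranging principle, treating the light's arrival as effectively instantaneous and using an assumed speed of sound:

```python
# Light-as-reference ranging: the distance to the wave source is the speed
# of sound times the extra time the ultrasound needs to arrive.
SPEED_OF_SOUND_M_S = 343.0  # in air at roughly 20 degrees C (assumed)

def distance_to_source(t_light_s, t_ultrasound_s):
    # The optical sensor's detection instant serves as the reference signal.
    return SPEED_OF_SOUND_M_S * (t_ultrasound_s - t_light_s)

# Ultrasound arriving 2.9 ms after the light puts the source about 1 m away.
print(distance_to_source(0.0, 2.9e-3))  # ~0.99 m
```

  • With several ultrasonic sensors, such per-sensor distances can be combined to locate the wave source in two or three dimensions.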
  • Meanwhile, the camera 171, described as part of the input unit 170, includes at least one of a camera sensor (e.g., CCD, CMOS, etc.), a photo sensor (or image sensor), and a laser sensor.
  • the camera 171 and the laser sensor may be combined with each other to detect a touch of a sensing target for a 3D stereoscopic image.
  • As another example, the photo sensor may be stacked on the display element and is configured to scan the movement of a sensing object close to the touch screen. More specifically, the photo sensor scans the content placed on it by mounting photo diodes and transistors (TR) in rows/columns and using an electrical signal that changes according to the amount of light applied to the photo diodes. That is, the photo sensor calculates the coordinates of the sensing object according to the change in the amount of light, and through this, the location information of the sensing object can be obtained.
  • the display unit 151 displays (outputs) information processed by the mobile terminal 100.
  • For example, the display unit 151 may display execution screen information of an application program driven on the mobile terminal 100, or User Interface (UI) and Graphic User Interface (GUI) information according to such execution screen information.
  • the display unit 151 may be configured as a three-dimensional display unit that displays a three-dimensional image.
  • A three-dimensional display method such as a stereoscopic method (glasses method), an autostereoscopic method (glasses-free method), or a projection method (holographic method) may be applied to the stereoscopic display unit.
  • the sound output unit 152 may output audio data received from the transmission/reception device 110 or stored in the memory 130 in a call signal reception, a call mode or a recording mode, a voice recognition mode, a broadcast reception mode, or the like.
  • the sound output unit 152 also outputs sound signals related to functions (eg, call signal reception sound, message reception sound, etc.) performed by the mobile terminal 100.
  • the sound output unit 152 may include a receiver, a speaker, and a buzzer.
  • the haptic module 153 generates various tactile effects that a user can feel.
  • a typical example of the tactile effect generated by the haptic module 153 may be vibration.
  • the intensity and pattern of vibration generated by the haptic module 153 may be controlled by a user's selection or a processor setting. For example, the haptic module 153 may synthesize and output different vibrations or sequentially output them.
  • In addition to vibration, the haptic module 153 can generate various tactile effects, including effects from stimuli such as a pin arrangement moving vertically against the contacted skin surface, a jet or suction force of air through a nozzle or inlet, grazing against the skin surface, contact of an electrode, and electrostatic force, as well as effects of reproducing a sense of cold or warmth using an element capable of absorbing or generating heat.
  • The haptic module 153 may not only deliver a tactile effect through direct contact, but may also be implemented so that the user can feel the tactile effect through muscle sensation of a finger, an arm, or the like. Two or more haptic modules 153 may be provided depending on the configuration of the mobile terminal 100.
  • the light output unit 154 outputs a signal for notifying the occurrence of an event using light from a light source of the mobile terminal 100.
  • Examples of events occurring in the mobile terminal 100 may include message reception, call signal reception, missed call, alarm, schedule notification, email reception, and information reception through an application.
  • the signal output from the light output unit 154 is implemented as the mobile terminal emits a single color or multiple colors of light to the front or rear.
  • the signal output may be terminated when the mobile terminal detects the user's event confirmation.
  • the interface unit 160 serves as a passage for all external devices connected to the mobile terminal 100.
  • the interface unit 160 receives data from an external device or receives power and transmits it to each component inside the mobile terminal 100, or transmits data inside the mobile terminal 100 to an external device.
  • For example, the interface unit 160 may include a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device equipped with an identification module, an audio input/output (I/O) port, a video input/output (I/O) port, an earphone port, and the like.
  • The identification module is a chip that stores various types of information for authenticating the right to use the mobile terminal 100, and may include a user identity module (UIM), a subscriber identity module (SIM), and a universal subscriber identity module (USIM).
  • A device equipped with an identification module (hereinafter referred to as an 'identification device') may be manufactured in the form of a smart card. Accordingly, the identification device may be connected to the terminal 100 through the interface unit 160.
  • In addition, the interface unit 160 may serve as a path through which power from an external cradle is supplied to the mobile terminal 100 when the mobile terminal 100 is connected to the cradle, or as a path through which various command signals input from the cradle by the user are transmitted to the mobile terminal 100. The various command signals or the power input from the cradle may operate as signals for recognizing that the mobile terminal 100 has been correctly mounted on the cradle.
  • the memory 130 may store a program for the operation of the processor 120 and may temporarily store input/output data (eg, a phone book, a message, a still image, a video, etc.).
  • the memory 130 may store data on vibrations and sounds of various patterns output when a touch input on the touch screen is performed.
  • The memory 130 may include at least one type of storage medium among a flash memory type, a hard disk type, a solid state disk (SSD) type, a silicon disk drive (SDD) type, a multimedia card micro type, a card-type memory (e.g., SD or XD memory), random access memory (RAM), static random access memory (SRAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), magnetic memory, a magnetic disk, and an optical disk.
  • the mobile terminal 100 may be operated in connection with a web storage that performs a storage function of the memory 130 over the Internet.
  • As described above, the processor 120 controls operations related to application programs and, in general, the overall operation of the mobile terminal 100. For example, when the state of the mobile terminal satisfies a set condition, the processor 120 may execute or release a lock state that restricts input of a user's control commands for applications.
  • In addition, the processor 120 may perform control and processing related to voice calls, data communication, video calls, and the like, or may perform pattern recognition processing capable of recognizing handwriting input or drawing input performed on the touch screen as characters and images, respectively. Furthermore, the processor 120 may control any one of the components described above, or a combination of a plurality of them, in order to implement the various embodiments described below on the mobile terminal 100 according to the present invention.
  • the power supply unit 190 receives external power and internal power under the control of the processor 120 and supplies power necessary for the operation of each component.
  • the power supply unit 190 includes a battery, and the battery may be a built-in battery configured to be rechargeable, and may be detachably coupled to the terminal body for charging or the like.
  • The power supply unit 190 may include a connection port, and the connection port may be configured as an example of the interface unit 160 to which an external charger supplying power for charging the battery is electrically connected.
  • the power supply unit 190 may be configured to charge the battery in a wireless manner without using the connection port.
  • In this case, the power supply unit 190 may receive power from an external wireless power transmitter using at least one of an inductive coupling method based on the magnetic induction phenomenon and a magnetic resonance coupling method based on the electromagnetic resonance phenomenon.
  • various embodiments below may be implemented in a recording medium readable by a computer or a similar device using, for example, software, hardware, or a combination thereof.
  • Meanwhile, the mobile terminal can be extended to a wearable device that can be worn on the body, beyond the dimension in which the user mainly holds and uses it in the hand.
  • Such wearable devices include a smartwatch, smart glasses, and a head mounted display (HMD).
  • the wearable device may be configured to exchange (or interlock) data with other mobile terminals 100.
  • For this purpose, the short-range communication module 114 may detect (or recognize) a wearable device capable of communicating in the vicinity of the mobile terminal 100. Furthermore, when the detected wearable device is a device authenticated to communicate with the mobile terminal 100, the processor 120 may transmit at least part of the data processed by the mobile terminal 100 to the wearable device through the short-range communication module 114. Accordingly, the user can use the data processed by the mobile terminal 100 through the wearable device; for example, the user can conduct a phone call through the wearable device when a call is received by the mobile terminal 100, or check a received message through the wearable device when a message is received.
  • the terminal or mobile terminal may be a fixed or detachable terminal mounted inside a vehicle, a portable terminal, or one or more of the above-described ones.
  • A mobile terminal according to an embodiment includes a display unit for displaying a driving route and a control unit for controlling the display unit, wherein the control unit displays on the display unit whether QoS (Quality of Service) is satisfied and, based on a notification from the V2X application server that QoS is not satisfied, may display application adjustment-related information on the display unit.
  • The application adjustment-related information may be one of: information instructing a change of the driving route, information instructing a change of the Level of Automation (LoA), information informing that the application will be terminated after a predetermined time, information instructing a change of the driving speed, and information instructing a driving stop.
  • However, the present invention is not limited thereto, and the information may indicate various operations that can be performed when QoS is not satisfied.
  • As additional information, the information instructing a change of the driving route may include section information on the section that needs to be changed and information on the route re-planned for that section.
  • The 3GPP system provides the necessary performance for all levels of automation. Following the SAE levels, the LoA is as follows: 0 (no automation), 1 (driver assistance), 2 (partial automation), 3 (conditional automation), 4 (high automation), and 5 (full automation).
  • The information instructing a change of the driving route may be displayed together with section information on the section requiring the change.
  • The application adjustment-related information may be displayed together with a map on which the driving route is displayed; one way to model this information is sketched below.
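  • The enum values and payload fields below are illustrative assumptions, not a format defined in this specification; they simply encode the adjustment options listed above:

```python
# Illustrative model of application adjustment-related information.
from dataclasses import dataclass, field
from enum import Enum, auto

class AdjustmentAction(Enum):
    CHANGE_ROUTE = auto()         # may carry the section to change + new route
    CHANGE_LOA = auto()           # change the Level of Automation
    TERMINATE_APP_AFTER = auto()  # application ends after a predetermined time
    CHANGE_SPEED = auto()
    STOP_DRIVING = auto()

@dataclass
class AdjustmentInfo:
    action: AdjustmentAction
    detail: dict = field(default_factory=dict)

# E.g., lower the automation level for an upcoming section:
notice = AdjustmentInfo(AdjustmentAction.CHANGE_LOA, {"target_loa": 2})
```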
  • Whether or not QoS is satisfied may be related to the execution of an application initiated by the user's selection, where the application may be, for example, autonomous driving or platoon (cluster) driving.
  • The indication of whether QoS is satisfied may be expressed in various forms, and these forms may also be combined. The display is not limited thereto, however, and may take any form the user can recognize.
  • For example, it can be displayed in the form of road congestion coloring (e.g., green where QoS is satisfied, red where it is not), as sketched below.
  • Instead of notifying the UE whether QoS is satisfied, the V2X application server may inform the UE of the information related to items b) to f).
  • The V2X application server may also inform the UE of the information of items a) to f) in combination.
  • The items that the V2X application server notifies to the UE regarding QoS, and the indication of whether QoS is satisfied, also apply below.
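  • A minimal sketch of the color-coded display, assuming the server reports per-section QoS satisfaction as boolean values:

```python
# Map per-section QoS satisfaction onto display colors: green where QoS
# is satisfied, red where it is not. The data layout is an assumption.
def section_colors(sections):
    # sections: list of (section_name, qos_satisfied) pairs
    return [(name, "green" if ok else "red") for name, ok in sections]

print(section_colors([("km 0-5", True), ("km 5-8", False)]))
# [('km 0-5', 'green'), ('km 5-8', 'red')]
```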
  • whether the QoS is satisfied may be based on a notification from the V2X application server.
  • Whether the QoS is satisfied may be based on a notification related to a change in a user plane congestion status received by the V2X application server.
  • The notification related to the change in the user plane congestion status may be a user plane congestion analytics notification that the V2X application server receives from the Network Data Analytics Function (NWDAF) through the Network Exposure Function (NEF).
  • The user plane congestion analytics are based on changes in the user plane congestion status that the NWDAF receives from Operations and Maintenance (OAM), and the user plane congestion analytics notification may include the location and time at which a potential QoS change may occur.
  • In step S1401 of FIG. 14, a departure point and a destination are determined by the user's selection on a selection screen displayed on the display of the UE. FIG. 15 shows such an example.
  • In step S1402, an application to be started (e.g., automated/autonomous driving, platoon driving, etc.) is determined by the user's selection on the selection screen displayed on the display of the UE.
  • the LoA may be set in advance.
  • S1402 may be performed first, or S1401 and S1402 may be combined.
  • In step S1403, the departure time is determined by the user's selection on the selection screen displayed on the display of the UE. If nothing is selected, it may be set to "start now" by default.
  • The order of S1401, S1402, and S1403 may be changed, or they may be performed in a combined form.
  • In step S1404, the UE receives a notification from the V2X application server that QoS is fully satisfied for the initial 5 km from the departure point, and displays this on the display. The indication may be displayed together with, or separately from, a map showing the route the UE travels.
  • step S1405 the UE starts driving.
  • step S1406 as the UE travels, a map showing the route the UE travels is shown on the display of the UE, and the location of the UE is displayed.
  • In step S1407, whether QoS is satisfied for the area in which the UE is scheduled to drive is displayed. Whether QoS is satisfied may be based on a notification from the V2X application server, and the indication may be displayed together with, or separately from, a map showing the route the UE travels.
  • A warning guide may be displayed along with the indication of whether QoS is satisfied, or instead of that indication.
  • The warning guide may be displayed together with, or separately from, a map showing the route the UE travels. FIG. 16 shows such an example.
  • The V2X application server may provide application adjustment-related information to the UE together with the QoS-related information described in S1407, or instead of that QoS-related information.
  • The application adjustment-related information is displayed on the display of the UE. It may be displayed together with the indication of S1407 or separately.
  • The application adjustment-related information may be received from the V2X application server, or may be the result of the UE's application performing the adjustment itself after the UE receives the QoS-related information described in S1407. The overall UE-side flow is sketched below.
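  • Condensed, the UE-side flow of steps S1401 to S1407 might look as follows; the ui and server interfaces are hypothetical placeholders for the selection screens and V2X application server notifications described above:

```python
# Sketch of the FIG. 14 UE-side flow; ui/server are assumed interfaces.
def drive_with_qos_display(ui, server):
    departure, destination = ui.select_route()             # S1401
    app = ui.select_application()                          # S1402
    start = ui.select_departure_time(default="start now")  # S1403
    ui.show_qos(server.initial_qos(departure))             # S1404
    for position in ui.start_driving():                    # S1405-S1406
        ui.show_qos(server.qos_ahead(position))            # S1407
        adjustment = server.adjustment_info(position)      # optional
        if adjustment:
            ui.show_adjustment(adjustment)                 # adjustment display
```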
  • In the above, the NWDAF is mentioned as the function that performs QoS prediction, but it may be another NF. For example, in the case of a V2X service, it may be a V2X Control Function.
  • FIG. 17 is a block diagram of a configuration of a terminal in which an embodiment presented in the present specification is implemented.
  • The terminal may include a QoS area management unit S1721, a QoS setting management unit S1722, a QoS session management unit S1723, and a QoS information management unit S1724.
  • the QoS area management unit S1721, the QoS setting management unit S1722, the QoS session management unit S1723, and the QoS information management unit S1724 may be included in the processor 120 of FIG. 13.
  • the QoS area management unit S1721 determines an area that satisfies the QoS. To this end, the QoS area management unit S1721 may acquire location information of the terminal.
  • The QoS setting management unit S1722 may display a setting screen (i.e., a UI) related to whether QoS is satisfied, and receive and store inputs from the user.
  • the QoS session management unit S1723 may establish, modify, or release a PDU session for QoS.
  • The QoS information management unit S1724 may receive and store information related to whether QoS is satisfied from the network, and provide it to the QoS area management unit S1721, the QoS setting management unit S1722, and the QoS session management unit S1723. A schematic rendering of these units follows.
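  • Rendered as code, the four units of FIG. 17 could be hosted by the processor 120 as follows; the class and method names are illustrative assumptions, not part of the specification:

```python
# Schematic rendering of the FIG. 17 units; names are assumptions.
class QoSAreaManagementUnit:            # S1721
    def satisfied_areas(self, terminal_location, qos_info):
        # Determine the areas satisfying QoS, using the terminal's location.
        return [area for area, ok in qos_info.items() if ok]

class QoSSettingManagementUnit:         # S1722
    def store_user_setting(self, setting):
        self.setting = setting          # input from the QoS setting screen (UI)

class QoSSessionManagementUnit:         # S1723
    def establish_pdu_session(self, qos_profile): ...
    def modify_pdu_session(self, qos_profile): ...
    def release_pdu_session(self): ...

class QoSInfoManagementUnit:            # S1724
    """Receives/stores QoS satisfaction info from the network and feeds
    the three units above."""
    def __init__(self, area_unit, setting_unit, session_unit):
        self.consumers = (area_unit, setting_unit, session_unit)
```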
  • the present invention described above can be implemented as a computer-readable code on a medium on which a program is recorded.
  • The computer-readable medium includes all types of recording devices that store data readable by a computer system. Examples of computer-readable media include an HDD (Hard Disk Drive), SSD (Solid State Disk), SDD (Silicon Disk Drive), ROM, RAM, CD-ROM, magnetic tape, floppy disk, and optical data storage device, and also include implementation in the form of a carrier wave (e.g., transmission over the Internet).
  • the computer may include the processor 120 of the terminal. Therefore, the detailed description above should not be construed as restrictive in all respects and should be considered as illustrative. The scope of the present invention should be determined by reasonable interpretation of the appended claims, and all changes within the equivalent scope of the present invention are included in the scope of the present invention.
  • the communication system 1 applied to the present invention includes a wireless device, a base station and a network.
  • the wireless device means a device that performs communication using wireless access technology (eg, 5G NR (New RAT), LTE (Long Term Evolution)), and may be referred to as a communication/wireless/5G device.
  • The wireless devices include robots 100a, vehicles 100b-1 and 100b-2, eXtended Reality (XR) devices 100c, hand-held devices 100d, home appliances 100e, Internet of Things (IoT) devices 100f, and an Artificial Intelligence (AI) device/server 400.
  • For example, the vehicles may include a vehicle equipped with a wireless communication function, an autonomous driving vehicle, and a vehicle capable of performing inter-vehicle communication.
  • Here, the vehicles may include an Unmanned Aerial Vehicle (UAV) (e.g., a drone).
  • The XR devices include Augmented Reality (AR)/Virtual Reality (VR)/Mixed Reality (MR) devices, and may be implemented in the form of a Head-Mounted Device (HMD), a Head-Up Display (HUD) provided in a vehicle, a TV, a smartphone, a computer, a wearable device, a home appliance, digital signage, a vehicle, a robot, and the like.
  • Portable devices may include smartphones, smart pads, wearable devices (e.g., smartwatches, smart glasses), and computers (e.g., notebook computers).
  • Home appliances may include TVs, refrigerators, and washing machines.
  • IoT devices may include sensors, smart meters, and the like.
  • a base station and a network may be implemented as a wireless device, and a specific wireless device 200a may operate as a base station/network node to another wireless device.
  • the wireless devices 100a to 100f may be connected to the network 300 through the base station 200.
  • the network 300 may be configured using a 3G network, a 4G (eg, LTE) network, or a 5G (eg, NR) network.
  • the wireless devices 100a to 100f may communicate with each other through the base station 200/network 300, but may communicate directly (e.g. sidelink communication) without passing through the base station/network.
  • For example, the vehicles 100b-1 and 100b-2 may perform direct communication (e.g., Vehicle-to-Vehicle (V2V) / Vehicle-to-Everything (V2X) communication).
  • Also, an IoT device (e.g., a sensor) may communicate directly with another IoT device (e.g., a sensor) or with the other wireless devices 100a to 100f.
  • Wireless communication/connections 150a, 150b, and 150c may be established between the wireless devices 100a to 100f and the base station 200, and between the base station 200 and another base station 200.
  • Here, the wireless communication/connection may be achieved through various radio access technologies (e.g., 5G NR), such as uplink/downlink communication 150a, sidelink communication 150b (or D2D communication), and inter-base-station communication 150c (e.g., relay, Integrated Access Backhaul (IAB)); through the wireless communication/connections 150a, 150b, and 150c, a wireless device and a base station, wireless devices, and base stations can transmit/receive radio signals to/from each other.
  • The wireless communication/connections 150a, 150b, and 150c can transmit/receive signals through various physical channels.
  • To this end, at least some of various configuration information setting processes for transmitting/receiving radio signals, various signal processing processes (e.g., channel encoding/decoding, modulation/demodulation, resource mapping/demapping, etc.), and resource allocation processes may be performed.
  • the first wireless device 100 and the second wireless device 200 may transmit and receive wireless signals through various wireless access technologies (eg, LTE and NR).
  • Here, {the first wireless device 100, the second wireless device 200} may correspond to {the wireless device 100x, the base station 200} and/or {the wireless device 100x, the wireless device 100x} of FIG. 18.
  • the first wireless device 100 includes one or more processors 102 and one or more memories 104, and may further include one or more transceivers 106 and/or one or more antennas 108.
  • the processor 102 controls the memory 104 and/or the transceiver 106 and may be configured to implement the descriptions, functions, procedures, suggestions, methods, and/or operational flowcharts disclosed herein.
  • the processor 102 may process information in the memory 104 to generate the first information/signal, and then transmit the wireless signal including the first information/signal through the transceiver 106.
  • the processor 102 may receive the wireless signal including the second information/signal through the transceiver 106 and store the information obtained from the signal processing of the second information/signal in the memory 104.
  • the memory 104 may be connected to the processor 102 and may store various information related to the operation of the processor 102.
  • The memory 104 may store software code including instructions for performing some or all of the processes controlled by the processor 102, or for performing the descriptions, functions, procedures, suggestions, methods, and/or operational flowcharts disclosed in this document.
  • the processor 102 and the memory 104 may be part of a communication modem/circuit/chip designed to implement wireless communication technology (eg, LTE, NR).
  • the transceiver 106 may be coupled with the processor 102 and may transmit and/or receive radio signals through one or more antennas 108.
  • the transceiver 106 may include a transmitter and/or a receiver.
  • The transceiver 106 may be used interchangeably with an RF (Radio Frequency) unit.
  • the wireless device may mean a communication modem/circuit/chip.
  • the second wireless device 200 includes one or more processors 202, one or more memories 204, and may further include one or more transceivers 206 and/or one or more antennas 208.
  • the processor 202 controls the memory 204 and/or the transceiver 206 and may be configured to implement the descriptions, functions, procedures, suggestions, methods, and/or operational flowcharts disclosed herein.
  • the processor 202 may process information in the memory 204 to generate third information/signal, and then transmit a wireless signal including the third information/signal through the transceiver 206.
  • the processor 202 may receive a radio signal including the fourth information/signal through the transceiver 206 and then store information obtained from signal processing of the fourth information/signal in the memory 204.
  • the memory 204 may be connected to the processor 202 and may store various information related to the operation of the processor 202.
  • The memory 204 may store software code including instructions for performing some or all of the processes controlled by the processor 202, or for performing the descriptions, functions, procedures, suggestions, methods, and/or operational flowcharts disclosed in this document.
  • the processor 202 and the memory 204 may be part of a communication modem/circuit/chip designed to implement wireless communication technology (eg, LTE, NR).
  • the transceiver 206 may be connected to the processor 202 and may transmit and/or receive radio signals through one or more antennas 208.
  • the transceiver 206 may include a transmitter and/or a receiver.
  • the transceiver 206 may be used interchangeably with an RF unit.
  • the wireless device may mean a communication modem/circuit/chip.
  • one or more protocol layers may be implemented by one or more processors 102, 202.
  • one or more processors 102, 202 may implement one or more layers (eg, functional layers such as PHY, MAC, RLC, PDCP, RRC, SDAP).
  • One or more processors 102 and 202 may generate one or more Protocol Data Units (PDUs) and/or one or more Service Data Units (SDUs) according to the descriptions, functions, procedures, proposals, methods, and/or operational flowcharts disclosed in this document.
  • One or more processors 102 and 202 may generate messages, control information, data, or information according to the descriptions, functions, procedures, proposals, methods, and/or operational flowcharts disclosed in this document, and may generate signals (e.g., baseband signals) containing a PDU, SDU, message, control information, data, or information and provide them to one or more transceivers 106 and 206.
  • One or more processors 102 and 202 may receive signals (e.g., baseband signals) from one or more transceivers 106 and 206, and may obtain a PDU, SDU, message, control information, data, or information according to the descriptions, functions, procedures, proposals, methods, and/or operational flowcharts disclosed in this document.
  • One or more of the processors 102 and 202 may be referred to as a controller, microcontroller, microprocessor, or microcomputer.
  • One or more of the processors 102 and 202 may be implemented in hardware, firmware, software, or a combination thereof. As an example, in a hardware implementation, the one or more processors 102 and 202 may include one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), or Field Programmable Gate Arrays (FPGAs).
  • The descriptions, functions, procedures, proposals, methods, and/or operational flowcharts disclosed in this document may be implemented using firmware or software, and the firmware or software may be implemented to include modules, procedures, functions, and the like.
  • Firmware or software configured to perform the descriptions, functions, procedures, proposals, methods, and/or operational flowcharts disclosed in this document may be included in one or more processors 102 and 202, or may be stored in one or more memories 104 and 204 and driven by the one or more processors 102 and 202.
  • The descriptions, functions, procedures, proposals, methods, and/or operational flowcharts disclosed in this document may be implemented using firmware or software in the form of code, instructions, and/or sets of instructions.
  • One or more memories 104, 204 may be connected to one or more processors 102, 202 and may store various types of data, signals, messages, information, programs, codes, instructions and/or instructions.
  • One or more memories 104 and 204 may be composed of ROM, RAM, EPROM, flash memory, hard drive, register, cache memory, computer readable storage medium, and/or combinations thereof.
  • One or more memories 104 and 204 may be located inside and/or outside of one or more processors 102 and 202.
  • the one or more memories 104, 204 may be connected to the one or more processors 102, 202 through various techniques such as wired or wireless connection.
  • the one or more transceivers 106 and 206 may transmit user data, control information, radio signals/channels, and the like mentioned in the methods and/or operation flow charts of this document to one or more other devices.
  • One or more transceivers 106 and 206 may receive, from one or more other devices, the user data, control information, radio signals/channels, etc. mentioned in the descriptions, functions, procedures, proposals, methods, and/or operational flowcharts disclosed in this document.
  • one or more transceivers 106 and 206 may be connected to one or more processors 102 and 202, and may transmit and receive wireless signals.
  • one or more processors 102, 202 may control one or more transceivers 106, 206 to transmit user data, control information, or radio signals to one or more other devices.
  • one or more processors 102, 202 may control one or more transceivers 106, 206 to receive user data, control information, or radio signals from one or more other devices.
  • One or more transceivers 106 and 206 may be connected with one or more antennas 108 and 208, and may be configured to transmit/receive, through the one or more antennas, the user data, control information, radio signals/channels, etc. mentioned in the descriptions, functions, procedures, proposals, methods, and/or operational flowcharts disclosed in this document.
  • one or more antennas may be a plurality of physical antennas or a plurality of logical antennas (eg, antenna ports).
  • In order to process the received user data, control information, radio signals/channels, etc. using the one or more processors 102 and 202, the one or more transceivers 106 and 206 may convert the received radio signals/channels from RF band signals into baseband signals.
  • One or more transceivers 106 and 206 may convert user data, control information, radio signals/channels, etc. processed using one or more processors 102 and 202 from a baseband signal to an RF band signal.
  • one or more transceivers 106, 206 may include (analog) oscillators and/or filters.
  • FIG. 20 illustrates a signal processing circuit for a transmission signal.
  • the signal processing circuit 1000 may include a scrambler 1010, a modulator 1020, a layer mapper 1030, a precoder 1040, a resource mapper 1050, and a signal generator 1060.
  • the operations/functions of FIG. 20 may be performed in the processors 102 and 202 and/or the transceivers 106 and 206 of FIG. 19.
  • the hardware elements of FIG. 20 may be implemented in the processors 102 and 202 and/or the transceivers 106 and 206 of FIG. 19.
  • blocks 1010 to 1060 may be implemented in processors 102 and 202 of FIG. 19.
  • Alternatively, blocks 1010 to 1050 may be implemented in the processors 102 and 202 of FIG. 19, and block 1060 may be implemented in the transceivers 106 and 206 of FIG. 19.
  • the codeword may be converted into a wireless signal through the signal processing circuit 1000 of FIG. 20.
  • the codeword is an encoded bit sequence of an information block.
  • the information block may include a transport block (eg, a UL-SCH transport block, a DL-SCH transport block).
  • the radio signal may be transmitted through various physical channels (eg, PUSCH, PDSCH).
  • the codeword may be converted into a scrambled bit sequence by the scrambler 1010.
  • The scrambling sequence used for scrambling is generated based on an initialization value, and the initialization value may include ID information of a wireless device.
  • the scrambled bit sequence may be modulated by the modulator 1020 into a modulation symbol sequence.
  • the modulation scheme may include pi/2-Binary Phase Shift Keying (pi/2-BPSK), m-Phase Shift Keying (m-PSK), m-Quadrature Amplitude Modulation (m-QAM), and the like.
  • the complex modulation symbol sequence may be mapped to one or more transport layers by the layer mapper 1030.
  • the modulation symbols of each transport layer may be mapped to the corresponding antenna port(s) by the precoder 1040 (precoding).
  • The output z of the precoder 1040 can be obtained by multiplying the output y of the layer mapper 1030 by the N×M precoding matrix W, where N is the number of antenna ports and M is the number of transmission layers.
  • the precoder 1040 may perform precoding after performing transform precoding (eg, DFT transform) on complex modulation symbols. Further, the precoder 1040 may perform precoding without performing transform precoding.
  • the resource mapper 1050 may map modulation symbols of each antenna port to a time-frequency resource.
  • the time-frequency resource may include a plurality of symbols (eg, CP-OFDMA symbols, DFT-s-OFDMA symbols) in the time domain, and may include a plurality of subcarriers in the frequency domain.
  • The signal generator 1060 may generate a radio signal from the mapped modulation symbols, and may include an Inverse Fast Fourier Transform (IFFT) module, a Cyclic Prefix (CP) inserter, a Digital-to-Analog Converter (DAC), a frequency up-converter, and the like.
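
The transmit chain above is straightforward to prototype. The following Python/NumPy sketch strings together scrambling, QPSK modulation, layer mapping, and precoding (z = W·y); the PRNG-based scrambler, the toy 4×2 precoding matrix, and all sizes are illustrative assumptions rather than the 3GPP-specified procedures.

    import numpy as np

    def scramble(bits, device_id):
        # Scrambling sequence seeded from an initialization value that
        # includes wireless-device ID information (a simple PRNG stands in
        # for the specified sequence generator).
        prng = np.random.default_rng(seed=device_id)
        c = prng.integers(0, 2, size=bits.size)
        return bits ^ c

    def modulate_qpsk(bits):
        # Map bit pairs to unit-energy QPSK symbols (one of the m-PSK schemes).
        b = bits.reshape(-1, 2)
        return ((1 - 2 * b[:, 0]) + 1j * (1 - 2 * b[:, 1])) / np.sqrt(2)

    def layer_map(symbols, num_layers):
        # Distribute the modulation symbols across M transport layers.
        return symbols.reshape(num_layers, -1, order="F")

    codeword = np.random.default_rng(0).integers(0, 2, size=96)  # encoded bits
    y = layer_map(modulate_qpsk(scramble(codeword, device_id=42)), num_layers=2)
    W = np.eye(4, 2)             # toy N x M precoding matrix (N=4 ports, M=2 layers)
    z = W @ y                    # precoder output, one row per antenna port
    print(z.shape)               # (4, 24): ready for time-frequency resource mapping

Resource mapping would then place each row of z onto the time-frequency grid of its antenna port before the signal generator produces the radio signal.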
  • the signal processing process for the received signal in the wireless device may be configured in reverse of the signal processing process 1010 to 1060 of FIG. 20.
  • In a wireless device (eg, 100 and 200 in FIG. 19), the received radio signal may be converted into a baseband signal through a signal restorer.
  • The signal restorer may include a frequency down-converter, an analog-to-digital converter (ADC), a CP remover, and a Fast Fourier Transform (FFT) module.
  • the baseband signal may be reconstructed into a codeword through a resource de-mapper process, a postcoding process, a demodulation process, and a de-scramble process.
  • a signal processing circuit for a received signal may include a signal restorer, a resource demapper, a postcoder, a demodulator, a descrambler, and a decoder.
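
As a companion sketch for the receive direction, the snippet below implements only the CP-removal and FFT step of a signal restorer and verifies it against an IFFT/CP-insertion transmitter in a round trip; the FFT size and CP length are arbitrary illustrative choices, and down-conversion, ADC, resource de-mapping, demodulation, descrambling, and decoding are omitted.

    import numpy as np

    FFT_SIZE, CP_LEN = 64, 16

    def restore(time_samples):
        # Signal restorer step: drop the cyclic prefix, then FFT the useful
        # part back into per-subcarrier symbols.
        useful = time_samples[CP_LEN:CP_LEN + FFT_SIZE]
        return np.fft.fft(useful) / np.sqrt(FFT_SIZE)

    # Round trip: IFFT + CP insertion on "transmit", restoration on "receive".
    tx_freq = np.random.default_rng(1).standard_normal(FFT_SIZE) + 0j
    tx_time = np.fft.ifft(tx_freq) * np.sqrt(FFT_SIZE)
    rx_time = np.concatenate([tx_time[-CP_LEN:], tx_time])  # CP-prefixed symbol
    assert np.allclose(restore(rx_time), tx_freq)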
  • the wireless device may be implemented in various forms according to use-example/service (see FIG. 18).
  • The wireless devices 100 and 200 correspond to the wireless devices 100 and 200 of FIG. 19 and may be composed of various elements, components, units, and/or modules.
  • the wireless devices 100 and 200 may include a communication unit 110, a control unit 120, a memory unit 130, and an additional element 140.
  • the communication unit may include a communication circuit 112 and a transceiver(s) 114.
  • The communication circuit 112 may include one or more processors 102 and 202 and/or one or more memories 104 and 204 of FIG. 19.
  • The transceiver(s) 114 may include one or more transceivers 106 and 206 and/or one or more antennas 108 and 208 of FIG. 19.
  • the control unit 120 is electrically connected to the communication unit 110, the memory unit 130, and the additional element 140 and controls all operations of the wireless device.
  • the controller 120 may control the electrical/mechanical operation of the wireless device based on the program/code/command/information stored in the memory unit 130.
  • The control unit 120 may transmit information stored in the memory unit 130 to the outside (eg, another communication device) through a wireless/wired interface via the communication unit 110, or may store, in the memory unit 130, information received from the outside (eg, another communication device) through a wireless/wired interface via the communication unit 110.
  • the additional element 140 may be variously configured according to the type of wireless device.
  • the additional element 140 may include at least one of a power unit/battery, an I/O unit, a driving unit, and a computing unit.
  • Wireless devices may be implemented in the form of robots (FIG. 18, 100a), vehicles (FIG. 18, 100b-1 and 100b-2), XR devices (FIG. 18, 100c), portable devices (FIG. 18, 100d), home appliances (FIG. 18, 100e), IoT devices (FIG. 18, 100f), digital broadcasting terminals, hologram devices, public safety devices, MTC devices, medical devices, fintech devices (or financial devices), security devices, climate/environment devices, AI servers/devices (FIG. 18, 400), base stations (FIG. 18, 200), network nodes, and the like.
  • the wireless device can be used in a mobile or fixed place depending on the use-example/service.
  • Various elements, components, units, and/or modules in the wireless devices 100 and 200 may be entirely interconnected through a wired interface, or at least some of them may be wirelessly connected through the communication unit 110.
  • For example, the control unit 120 and the communication unit 110 may be connected by wire, while the control unit 120 and a first unit (eg, 130 or 140) may be connected wirelessly through the communication unit 110.
  • Each element, component, unit, and/or module in the wireless devices 100 and 200 may further include one or more elements.
  • the controller 120 may be configured with one or more processor sets.
  • control unit 120 may be composed of a set of a communication control processor, an application processor, an electronic control unit (ECU), a graphic processing processor, and a memory control processor.
  • The memory unit 130 may be composed of random access memory (RAM), dynamic RAM (DRAM), read-only memory (ROM), flash memory, volatile memory, non-volatile memory, and/or a combination thereof.
  • Hereinafter, an implementation example of FIG. 21 will be described in more detail with reference to the drawings.
  • Portable devices may include smart phones, smart pads, wearable devices (eg, smart watches, smart glasses), and portable computers (eg, notebook computers).
  • the portable device may be referred to as a mobile station (MS), a user terminal (UT), a mobile subscriber station (MSS), a subscriber station (SS), an advanced mobile station (AMS), or a wireless terminal (WT).
  • The portable device 100 may include an antenna unit 108, a communication unit 110, a control unit 120, a memory unit 130, a power supply unit 140a, an interface unit 140b, and an input/output unit 140c.
  • the antenna unit 108 may be configured as a part of the communication unit 110.
  • Blocks 110 to 130/140a to 140c correspond to blocks 110 to 130/140 of FIG. 21, respectively.
  • the communication unit 110 may transmit and receive signals (eg, data, control signals, etc.) with other wireless devices and base stations.
  • the controller 120 may perform various operations by controlling components of the portable device 100.
  • the control unit 120 may include an application processor (AP).
  • the memory unit 130 may store data/parameters/programs/codes/commands required for driving the portable device 100.
  • the memory unit 130 may store input/output data/information, and the like.
  • the power supply unit 140a supplies power to the portable device 100 and may include a wired/wireless charging circuit, a battery, and the like.
  • the interface unit 140b may support connection between the portable device 100 and other external devices.
  • the interface unit 140b may include various ports (eg, audio input/output ports, video input/output ports) for connection with external devices.
  • the input/output unit 140c may receive or output image information/signal, audio information/signal, data, and/or information input from a user.
  • the input/output unit 140c may include a camera, a microphone, a user input unit, a display unit 140d, a speaker, and/or a haptic module.
  • The input/output unit 140c may acquire information/signals (eg, touch, text, voice, image, video) input from the user, and the acquired information/signals may be stored in the memory unit 130.
  • the communication unit 110 may convert information/signals stored in the memory into wireless signals, and may directly transmit the converted wireless signals to other wireless devices or to a base station.
  • The communication unit 110 may restore a received radio signal to the original information/signal. The restored information/signal may be stored in the memory unit 130 and then output in various forms (eg, text, voice, image, video, haptic) through the input/output unit 140c.
  • the vehicle or autonomous vehicle may be implemented as a mobile robot, a vehicle, a train, an aerial vehicle (AV), or a ship.
  • The vehicle or autonomous vehicle 100 may include an antenna unit 108, a communication unit 110, a control unit 120, a driving unit 140a, a power supply unit 140b, a sensor unit 140c, and an autonomous driving unit 140d.
  • the antenna unit 108 may be configured as a part of the communication unit 110.
  • Blocks 110/130/140a to 140d correspond to blocks 110/130/140 of FIG. 21, respectively.
  • the communication unit 110 may transmit and receive signals (eg, data, control signals, etc.) with external devices such as other vehicles, base stations (e.g. base stations, roadside base stations, etc.), and servers.
  • the controller 120 may perform various operations by controlling elements of the vehicle or the autonomous vehicle 100.
  • the control unit 120 may include an Electronic Control Unit (ECU).
  • the driving unit 140a may cause the vehicle or the autonomous vehicle 100 to travel on the ground.
  • the driving unit 140a may include an engine, a motor, a power train, a wheel, a brake, a steering device, and the like.
  • the power supply unit 140b supplies power to the vehicle or the autonomous vehicle 100, and may include a wired/wireless charging circuit, a battery, and the like.
  • the sensor unit 140c may obtain vehicle status, surrounding environment information, user information, and the like.
  • The sensor unit 140c may include an inertial measurement unit (IMU) sensor, a collision sensor, a wheel sensor, a speed sensor, an inclination sensor, a weight detection sensor, a heading sensor, a position module, a vehicle forward/reverse sensor, a battery sensor, a fuel sensor, a tire sensor, a steering sensor, a temperature sensor, a humidity sensor, an ultrasonic sensor, an illuminance sensor, a pedal position sensor, and the like.
  • The autonomous driving unit 140d may implement a technology for maintaining a driving lane, a technology for automatically adjusting speed such as adaptive cruise control, a technology for automatically driving along a predetermined route, a technology for automatically setting a route and driving when a destination is set, and the like.
  • the communication unit 110 may receive map data and traffic information data from an external server.
  • the autonomous driving unit 140d may generate an autonomous driving route and a driving plan based on the acquired data.
  • the control unit 120 may control the driving unit 140a so that the vehicle or the autonomous vehicle 100 moves along the autonomous driving path according to the driving plan (eg, speed/direction adjustment).
  • the communication unit 110 asynchronously/periodically acquires the latest traffic information data from an external server, and may acquire surrounding traffic information data from surrounding vehicles.
  • the sensor unit 140c may acquire vehicle state and surrounding environment information.
  • the autonomous driving unit 140d may update the autonomous driving route and the driving plan based on newly acquired data/information.
  • the communication unit 110 may transmit information about a vehicle location, an autonomous driving route, and a driving plan to an external server.
  • the external server may predict traffic information data in advance using AI technology or the like based on information collected from the vehicle or autonomous vehicles, and may provide the predicted traffic information data to the vehicle or autonomous vehicles.
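
The data flow described in the bullets above (fetch map/traffic data, generate a route and driving plan, drive, refresh the plan as new data arrives) can be summarized as a small control loop. Everything in this Python sketch (the function names, the waypoint route, the refresh schedule) is a hypothetical illustration, not an interface defined by this document.

    def fetch_server_data():
        # Communication unit 110: latest map/traffic data from an external server.
        return {"traffic": "light", "map_version": 7}

    def fetch_nearby_data():
        # Communication unit 110: surrounding traffic data from nearby vehicles.
        return {"nearby_vehicles": 3}

    def plan_route(server_data, nearby_data):
        # Autonomous driving unit 140d: generate a route and driving plan.
        return ["A", "B", "C"] if server_data["traffic"] == "light" else ["A", "D", "C"]

    def drive(route, max_steps=6):
        # Control unit 120 -> driving unit 140a: follow the plan waypoint by
        # waypoint; newly acquired data triggers one plan update mid-drive.
        for step in range(max_steps):
            if not route:
                break
            print(f"step {step}: steering toward {route.pop(0)}")
            if step == 1:
                route = plan_route(fetch_server_data(), fetch_nearby_data())

    drive(plan_route(fetch_server_data(), fetch_nearby_data()))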
  • The vehicle may also be implemented as a means of transportation, a train, an aerial vehicle, a ship, and the like.
  • the vehicle 100 may include a communication unit 110, a control unit 120, a memory unit 130, an input/output unit 140a, and a position measurement unit 140b.
  • blocks 110 to 130/140a to 140b correspond to blocks 110 to 130/140 of FIG. 21, respectively.
  • the communication unit 110 may transmit and receive signals (eg, data, control signals, etc.) with other vehicles or external devices such as a base station.
  • the controller 120 may control various components of the vehicle 100 to perform various operations.
  • the memory unit 130 may store data/parameters/programs/codes/commands supporting various functions of the vehicle 100.
  • the input/output unit 140a may output an AR/VR object based on information in the memory unit 130.
  • the input/output unit 140a may include a HUD.
  • the location measuring unit 140b may obtain location information of the vehicle 100.
  • the location information may include absolute location information of the vehicle 100, location information within a driving line, acceleration information, location information with surrounding vehicles, and the like.
  • the location measurement unit 140b may include GPS and various sensors.
  • the communication unit 110 of the vehicle 100 may receive map information, traffic information, and the like from an external server and store them in the memory unit 130.
  • the location measurement unit 140b may acquire vehicle location information through GPS and various sensors and store the vehicle location information in the memory unit 130.
  • the controller 120 may generate a virtual object based on map information, traffic information, vehicle location information, and the like, and the input/output unit 140a may display the generated virtual object on a window in the vehicle (1410, 1420).
  • the controller 120 may determine whether the vehicle 100 is operating normally within the driving line based on the vehicle location information. When the vehicle 100 deviates abnormally from the driving line, the control unit 120 may display a warning on the glass window in the vehicle through the input/output unit 140a.
  • control unit 120 may broadcast a warning message about driving abnormalities to nearby vehicles through the communication unit 110.
  • controller 120 may transmit location information of the vehicle and information on driving/vehicle abnormality to a related organization through the communication unit 110.
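
A toy version of that lane-monitoring behavior is sketched below: the deviation threshold, the message formats, and the IOUnit/CommUnit stand-ins for the input/output unit 140a and communication unit 110 are all invented for illustration.

    LANE_HALF_WIDTH_M = 1.75   # assumed lane half-width threshold

    class IOUnit:                       # stand-in for input/output unit 140a
        def display_warning(self, msg):
            print("[in-vehicle display]", msg)

    class CommUnit:                     # stand-in for communication unit 110
        def broadcast(self, msg):
            print("[to nearby vehicles]", msg)
        def report(self, msg):
            print("[to related organization]", msg)

    def monitor_lane(lateral_offset_m, vehicle_id, io_unit, comm_unit):
        # Normal while the vehicle stays within the driving line; on abnormal
        # deviation: warn on the in-vehicle display, broadcast a warning
        # message, and report location/abnormality to a related organization.
        if abs(lateral_offset_m) <= LANE_HALF_WIDTH_M:
            return "normal"
        io_unit.display_warning("abnormal lane departure")
        comm_unit.broadcast({"type": "driving_abnormality", "vehicle": vehicle_id})
        comm_unit.report({"vehicle": vehicle_id, "offset_m": lateral_offset_m})
        return "abnormal"

    print(monitor_lane(2.4, "veh-001", IOUnit(), CommUnit()))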
  • the XR device may be implemented as an HMD, a head-up display (HUD) provided in a vehicle, a television, a smartphone, a computer, a wearable device, a home appliance, a digital signage, a vehicle, a robot, and the like.
  • the XR device 100a may include a communication unit 110, a control unit 120, a memory unit 130, an input/output unit 140a, a sensor unit 140b, and a power supply unit 140c.
  • blocks 110 to 130/140a to 140c correspond to blocks 110 to 130/140 of FIG. 21, respectively.
  • the communication unit 110 may transmit and receive signals (eg, media data, control signals, etc.) with other wireless devices, portable devices, or external devices such as a media server.
  • Media data may include video, images, and sounds.
  • the controller 120 may perform various operations by controlling components of the XR device 100a.
  • the controller 120 may be configured to control and/or perform procedures such as video/image acquisition, (video/image) encoding, and metadata generation and processing.
  • the memory unit 130 may store data/parameters/programs/codes/commands required for driving the XR device 100a/generating an XR object.
  • the input/output unit 140a may obtain control information, data, etc. from the outside, and may output the generated XR object.
  • the input/output unit 140a may include a camera, a microphone, a user input unit, a display unit, a speaker, and/or a haptic module.
  • the sensor unit 140b may obtain XR device status, surrounding environment information, user information, and the like.
  • The sensor unit 140b may include a proximity sensor, an illuminance sensor, an acceleration sensor, a magnetic sensor, a gyro sensor, an inertial sensor, an RGB sensor, an IR sensor, a fingerprint recognition sensor, an ultrasonic sensor, an optical sensor, a microphone, and/or a radar.
  • the power supply unit 140c supplies power to the XR device 100a, and may include a wired/wireless charging circuit, a battery, and the like.
  • the memory unit 130 of the XR device 100a may include information (eg, data, etc.) necessary to generate an XR object (eg, AR/VR/MR object).
  • The input/output unit 140a may obtain a command to manipulate the XR device 100a from the user, and the control unit 120 may drive the XR device 100a according to the user's command. For example, when a user tries to watch a movie, news, or the like through the XR device 100a, the control unit 120 may transmit content request information to another device (eg, the portable device 100b) or a media server through the communication unit 110.
  • The communication unit 110 may download/stream content such as movies and news from another device (eg, the portable device 100b) or a media server to the memory unit 130.
  • The control unit 120 may control and/or perform procedures such as video/image acquisition, (video/image) encoding, and metadata generation/processing for the content, and may generate/output an XR object based on information about a surrounding space or a real object obtained through the input/output unit 140a/sensor unit 140b.
  • the XR device 100a is wirelessly connected to the mobile device 100b through the communication unit 110, and the operation of the XR device 100a may be controlled by the mobile device 100b.
  • the portable device 100b may operate as a controller for the XR device 100a.
  • the XR device 100a may obtain 3D location information of the portable device 100b, and then generate and output an XR object corresponding to the portable device 100b.
  • Robots can be classified into industrial, medical, household, military, etc. depending on the purpose or field of use.
  • the robot 100 may include a communication unit 110, a control unit 120, a memory unit 130, an input/output unit 140a, a sensor unit 140b, and a driving unit 140c.
  • blocks 110 to 130/140a to 140c correspond to blocks 110 to 130/140 of FIG. 21, respectively.
  • the communication unit 110 may transmit and receive signals (eg, driving information, control signals, etc.) with other wireless devices, other robots, or external devices such as a control server.
  • the controller 120 may perform various operations by controlling the components of the robot 100.
  • the memory unit 130 may store data/parameters/programs/codes/commands supporting various functions of the robot 100.
  • the input/output unit 140a obtains information from the outside of the robot 100 and may output information to the outside of the robot 100.
  • the input/output unit 140a may include a camera, a microphone, a user input unit, a display unit, a speaker, and/or a haptic module.
  • the sensor unit 140b may obtain internal information, surrounding environment information, user information, and the like of the robot 100.
  • the sensor unit 140b may include a proximity sensor, an illuminance sensor, an acceleration sensor, a magnetic sensor, a gyro sensor, an inertial sensor, an IR sensor, a fingerprint recognition sensor, an ultrasonic sensor, an optical sensor, a microphone, a radar, and the like.
  • the driving unit 140c may perform various physical operations such as moving a robot joint. In addition, the driving unit 140c may make the robot 100 travel on the ground or fly in the air.
  • the driving unit 140c may include an actuator, a motor, a wheel, a brake, a propeller, and the like.
  • AI devices may be implemented as fixed devices or movable devices such as TVs, projectors, smartphones, PCs, laptops, digital broadcasting terminals, tablet PCs, wearable devices, set-top boxes (STBs), radios, washing machines, refrigerators, digital signage, robots, vehicles, and the like.
  • The AI device 100 may include a communication unit 110, a control unit 120, a memory unit 130, input/output units 140a/140b, a learning processor unit 140c, and a sensor unit 140d. Blocks 110 to 130/140a to 140d correspond to blocks 110 to 130/140 of FIG. 21, respectively.
  • The communication unit 110 may transmit and receive wired/wireless signals (eg, sensor information, user input, learning models, control signals, etc.) with external devices such as other AI devices (eg, FIG. 18, 100x, 200, 400) or an AI server (eg, FIG. 18, 400) using wired/wireless communication technology. To this end, the communication unit 110 may transmit information in the memory unit 130 to an external device, or transfer a signal received from an external device to the memory unit 130.
  • The control unit 120 may determine at least one executable operation of the AI device 100 based on information determined or generated using a data analysis algorithm or a machine learning algorithm, and may then perform the determined operation by controlling the components of the AI device 100. For example, the control unit 120 may request, search for, receive, or utilize data from the learning processor unit 140c or the memory unit 130, and may control the components of the AI device 100 to execute a predicted operation, or an operation determined to be desirable, among the at least one executable operation. In addition, the control unit 120 may collect history information including the operation content of the AI device 100 or the user's feedback on the operation, store it in the memory unit 130 or the learning processor unit 140c, or transmit it to an external device such as the AI server (FIG. 18, 400). The collected history information may be used to update a learning model.
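
Schematically, that control-unit flow (score candidate operations with a learned model, execute the best one, log feedback as history for later model updates) reduces to a few lines. The operation names and the scoring lambda below are placeholders, not anything specified by this document.

    def choose_and_run(operations, score, execute, history):
        # Score each executable operation with a (stand-in) learned model,
        # run the predicted/most desirable one, then log operation plus
        # feedback as history information for later learning-model updates.
        best = max(operations, key=score)
        feedback = execute(best)
        history.append({"operation": best, "feedback": feedback})
        return best

    history = []
    chosen = choose_and_run(
        operations=["clean", "charge", "idle"],
        score=lambda op: {"clean": 0.7, "charge": 0.9, "idle": 0.1}[op],
        execute=lambda op: "user_satisfied",
        history=history,
    )
    print(chosen, history)   # "charge" plus its logged feedback entry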
  • the memory unit 130 may store data supporting various functions of the AI device 100.
  • The memory unit 130 may store data obtained from the input unit 140a, data obtained from the communication unit 110, output data from the learning processor unit 140c, and data obtained from the sensing unit 140d.
  • the memory unit 130 may store control information and/or software codes necessary for the operation/execution of the controller 120.
  • the input unit 140a may acquire various types of data from the outside of the AI device 100.
  • the input unit 140a may acquire training data for model training and input data to which the training model is to be applied.
  • the input unit 140a may include a camera, a microphone, and/or a user input unit.
  • the output unit 140b may generate output related to visual, auditory, or tactile sense.
  • the output unit 140b may include a display unit, a speaker, and/or a haptic module.
  • The sensing unit 140d may obtain at least one of internal information of the AI device 100, surrounding environment information, and user information by using various sensors.
  • The sensing unit 140d may include a proximity sensor, an illuminance sensor, an acceleration sensor, a magnetic sensor, a gyro sensor, an inertial sensor, an RGB sensor, an IR sensor, a fingerprint recognition sensor, an ultrasonic sensor, an optical sensor, a microphone, and/or a radar.
  • the learning processor unit 140c may train a model composed of an artificial neural network by using the training data.
  • The learning processor unit 140c may perform AI processing together with the learning processor unit of the AI server (FIG. 18, 400).
  • the learning processor unit 140c may process information received from an external device through the communication unit 110 and/or information stored in the memory unit 130.
  • the output value of the learning processor unit 140c may be transmitted to an external device through the communication unit 110 and/or may be stored in the memory unit 130.
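
As a minimal stand-in for the learning processor unit, the sketch below fits a single linear neuron by gradient descent and prints the learned parameters, which, per the bullets above, could then be transmitted to an external device via the communication unit and/or stored in the memory unit. The model, training data, and update rule are illustrative assumptions, not the method of this document.

    import numpy as np

    def train(x, y, epochs=500, lr=0.05):
        # Gradient descent on mean-squared error for one linear neuron,
        # standing in for "training a model composed of an artificial
        # neural network using the training data".
        w, b = 0.0, 0.0
        for _ in range(epochs):
            err = (w * x + b) - y
            w -= lr * np.mean(err * x)
            b -= lr * np.mean(err)
        return w, b

    x = np.array([0.0, 1.0, 2.0, 3.0])   # training data (eg, from input unit 140a)
    y = 2.0 * x + 1.0                    # targets: the relation to learn
    w, b = train(x, y)
    print(f"learned w={w:.2f}, b={b:.2f}")   # output value of the learning processor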
  • Embodiments as described above can be applied to various mobile communication systems.


Abstract

An embodiment relates to a mobile terminal comprising: a display unit for displaying a travel route; and a control unit for controlling the display unit, wherein the control unit indicates, on the display unit, whether a quality of service (QoS) is satisfied, and, based on a notification from a V2X application server indicating that the QoS is not satisfied, the control unit displays information related to application adjustment on the display unit.
PCT/KR2020/001786 2019-02-08 2020-02-07 Mobile terminal for indicating whether quality of service is satisfied in wireless communication system WO2020162719A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2019-0015175 2019-02-08
KR20190015175 2019-02-08

Publications (1)

Publication Number Publication Date
WO2020162719A1 (fr)

Family

ID=71948351

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2020/001786 WO2020162719A1 (fr) 2019-02-08 2020-02-07 Mobile terminal for indicating whether quality of service is satisfied in wireless communication system

Country Status (1)

Country Link
WO (1) WO2020162719A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116235526A (zh) * 2020-09-24 2023-06-06 华为技术有限公司 一种数据分析方法及装置

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
"TSGRAN; Enhancement of 3GPP support for V2X scenarios; Stage 1 (Release 16", 3GPP TS 22.186 V16.1.0, 21 December 2018 (2018-12-21), XP051591353 *
CHINA MOBILE ET AL.: "TS 23.288 NWADF-assisted QoS profile provision", S 2-1900950 . 3GPP TSG SAWG2 #130, 23 January 2019 (2019-01-23), Kochi, India, XP051595566 *
HUAWEI ET AL.: "KI#15: new solution to assist Application Adjustment", S 2-1810822 . 3GPP TSG SA WG2 #129, 18 October 2018 (2018-10-18), DongGuan , China, XP051539766 *
HUAWEI ET AL.: "New Key Issue and Solution for Dynamic Application Adjustment", S 2-188565 . 3GPP TSG SA WG2 #128B, 26 August 2018 (2018-08-26), Sophia Antipolis, France, XP051503003 *
NOKIA ET AL.: "Adding network status exposure capability with optional use of analytics", S 2-1900655 . 3GPP TSG SA WG2 #130, 15 January 2019 (2019-01-15), Kochi, India, XP051590323 *


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 20752835; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 20752835; Country of ref document: EP; Kind code of ref document: A1)