WO2020171311A1 - Method and terminal for using an always-on PDU session in 5GS - Google Patents
Method and terminal for using an always-on PDU session in 5GS
- Publication number
- WO2020171311A1 (application PCT/KR2019/008627)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- pdu session
- always
- information
- terminal
- network
- Prior art date
Classifications
- H—ELECTRICITY > H04—ELECTRIC COMMUNICATION TECHNIQUE > H04W—WIRELESS COMMUNICATION NETWORKS
- H04W60/00—Affiliation to network, e.g. registration; Terminating affiliation with the network, e.g. de-registration
- H04W68/00—User notification, e.g. alerting and paging, for incoming communication, change of service or the like
- H04W76/00—Connection management; H04W76/10—Connection setup
- H04W80/00—Wireless network protocols or protocol adaptations to wireless operation; H04W80/08—Upper layer protocols; H04W80/10—Upper layer protocols adapted for application session management, e.g. SIP [Session Initiation Protocol]
Definitions
- the present invention relates to next-generation mobile communication.
- 3GPP SAE, standardized mainly in 3GPP SA WG2, is a study on network technology aimed at determining the network structure and supporting mobility between heterogeneous networks, carried out in parallel with the LTE work of 3GPP TSG RAN, and is one of the important standardization issues of 3GPP. It is a work to develop the 3GPP system into an IP-based system supporting various radio access technologies, aiming at an optimized packet-based system that minimizes transmission delay with improved data transmission capability.
- the EPS (Evolved Packet System) high-level reference model defined in 3GPP SA WG2 includes non-roaming cases and roaming cases of various scenarios; for details, refer to the 3GPP standard documents TS 23.401 and TS 23.402.
- the network structure diagram of FIG. 1 is a simplified reconstruction of this.
- FIG. 1 is a structural diagram of an evolved mobile communication network.
- the EPC may include various components; FIG. 1 shows some of them: a Serving Gateway (S-GW) 52, a Packet Data Network Gateway (PDN GW) 53, a Mobility Management Entity (MME) 51, a Serving GPRS (General Packet Radio Service) Supporting Node (SGSN), and an enhanced Packet Data Gateway (ePDG).
- the S-GW 52 operates as a boundary point between the radio access network (RAN) and the core network, and functions to maintain a data path between the eNodeB 20 and the PDN GW 53.
- the S-GW 52 serves as a local mobility anchor point. That is, packets may be routed through the S-GW 52 for mobility within the E-UTRAN (Evolved-UMTS (Universal Mobile Telecommunications System) Terrestrial Radio Access Network defined after 3GPP Release-8).
- the S-GW 52 can also function as an anchor point for mobility with other 3GPP networks (i.e., RANs defined before 3GPP Release-8, for example, UTRAN or GERAN (GSM (Global System for Mobile communication)/EDGE (Enhanced Data rates for Global Evolution) Radio Access Network)).
- the PDN GW (or P-GW) 53 corresponds to a termination point of a data interface toward a packet data network.
- the PDN GW 53 may support policy enforcement features, packet filtering, charging support, and the like.
- the PDN GW 53 can serve as an anchor point for mobility management between 3GPP networks and non-3GPP networks (e.g., untrusted networks such as an I-WLAN (Interworking Wireless Local Area Network), or trusted networks such as a CDMA (Code Division Multiple Access) network or WiMax).
- the S-GW 52 and the PDN GW 53 are configured as separate gateways, but the two gateways may also be implemented according to a single-gateway configuration option.
- the MME 51 is an element that performs signaling and control functions to support the UE's access to the network connection, allocation of network resources, tracking, paging, roaming, and handover.
- the MME 51 controls control plane functions related to subscriber and session management.
- the MME 51 manages a number of eNodeBs 20 and performs signaling for selection of a conventional gateway for handover to another 2G/3G network.
- the MME 51 performs functions such as security procedures, UE-to-network session handling, and idle terminal location management.
- SGSN handles all packet data such as user mobility management and authentication to other access 3GPP networks (eg GPRS network, UTRAN/GERAN).
- the ePDG serves as a security node for untrusted non-3GPP networks (eg, I-WLAN, WiFi hotspot, etc.).
- a UE having IP capability can access an IP service network (e.g., IMS) provided by an operator through various elements in the EPC, based on 3GPP access as well as non-3GPP access.
- FIG. 1 shows various reference points (eg, S1-U, S1-MME, etc.).
- a conceptual link connecting two functions existing in different functional entities of E-UTRAN and EPC is defined as a reference point.
- Table 1 below summarizes the reference points shown in FIG. 1.
- various reference points may exist according to the network structure.
- This reference point can be used intra-PLMN or inter-PLMN (e.g., in the case of inter-PLMN handover).
- S4: A reference point between the SGW and the SGSN that provides the related control and mobility support between the GPRS core and the 3GPP anchor function of the SGW. If a direct tunnel is not established, it also provides user-plane tunneling.
- S5: A reference point that provides user-plane tunneling and tunnel management between the SGW and the PDN GW. It is used for SGW relocation due to UE mobility and when the SGW needs to connect to a non-co-located PDN GW for the required PDN connectivity.
- S11: A reference point between the MME and the SGW.
- SGi: A reference point between the PDN GW and the PDN. The PDN may be an operator-external public or private PDN, or an intra-operator PDN, e.g., for provision of IMS services. This reference point corresponds to Gi of 3GPP access.
- 5th generation mobile communication, as defined by the International Telecommunication Union (ITU), refers to providing a maximum data rate of 20 Gbps and a user-perceived data rate of at least 100 Mbps anywhere. Its official name is 'IMT-2020', and it aims to be commercialized worldwide in 2020.
- ITU proposes three usage scenarios, e.g. eMBB (enhanced mobile broadband), mMTC (massive machine type communication), and URLLC (Ultra Reliable and Low Latency Communications).
- URLLC relates to a usage scenario requiring high reliability and low latency.
- services such as automatic driving, factory automation, and augmented reality require high reliability and low latency (for example, a delay time of 1 ms or less).
- the latency of 4G (LTE) is statistically 21-43ms (best 10%), 33-75ms (median). This is insufficient to support a service that requires a delay time of less than 1ms.
- the eMBB usage scenario relates to a usage scenario requiring mobile ultra-wideband.
- FIG. 2 is an exemplary diagram showing an expected structure of next-generation mobile communication from a node perspective.
- the UE is connected to a data network (DN) through a next-generation radio access network (RAN).
- the illustrated Control Plane Function (CPF) node performs all or part of the functions of the Mobility Management Entity (MME) of 4G mobile communication, and all or part of the control-plane functions of the Serving Gateway (S-GW) and the PDN Gateway (P-GW).
- the CPF node includes an Access and Mobility Management Function (AMF) and a Session Management Function (SMF).
- the illustrated User Plane Function (UPF) node is a type of gateway through which user data is transmitted and received.
- the UPF node may perform all or part of the user plane functions of S-GW and P-GW of 4G mobile communication.
- the illustrated PCF (Policy Control Function) is a node that controls the operator's policy.
- the illustrated application function is a server for providing various services to the UE.
- the illustrated Unified Data Management is a kind of server that manages subscriber information, such as a 4G mobile communication HSS (Home Subscriber Server).
- the UDM stores and manages the subscriber information in a Unified Data Repository (UDR).
- the illustrated authentication server function (AUSF) authenticates and manages the UE.
- the illustrated network slice selection function (NSSF) is a node for network slicing as described below.
- the UE may simultaneously access two data networks by using multiple protocol data unit or packet data unit (PDU) sessions.
- FIG. 3 is an exemplary diagram showing an architecture for supporting simultaneous access to two data networks.
- referring to FIG. 3, an architecture in which a UE accesses two data networks simultaneously using one PDU session is shown.
- FIG. 4 is another exemplary diagram showing the structure of a radio interface protocol between a UE and a gNB.
- the radio interface protocol is based on the 3GPP radio access network standard.
- horizontally, the radio interface protocol consists of a physical layer, a data link layer, and a network layer; vertically, it is divided into a user plane for data information transmission and a control plane for control signal transmission.
- the protocol layers can be divided into L1 (layer 1), L2 (layer 2), and L3 (layer 3) based on the lower three layers of the Open System Interconnection (OSI) reference model widely known in communication systems.
- the first layer provides an information transfer service using a physical channel.
- the physical layer is connected to an upper medium access control layer through a transport channel, and data between the medium access control layer and the physical layer is transmitted through the transport channel.
- data is transmitted between different physical layers, that is, between the physical layers of the transmitting side and the receiving side through a physical channel.
- the second layer includes a Medium Access Control (MAC) layer, a Radio Link Control (RLC) layer, and a Packet Data Convergence Protocol (PDCP) layer.
- the third layer includes Radio Resource Control (hereinafter abbreviated as RRC).
- the RRC layer is defined only in the control plane and is related to the configuration, re-configuration, and release of Radio Bearers (RBs).
- RB means a service provided by the second layer for data transmission between the UE and the E-UTRAN.
- the NAS (Non-Access Stratum) layer performs functions such as connection management (session management) and mobility management.
- the NAS layer is divided into a NAS entity for mobility management (MM) and a NAS entity for session management (SM).
- NAS entity for MM provides the following functions in general.
- NAS procedures related to AMF including the following.
- AMF supports the following functions.
- the NAS entity for the SM performs session management between the UE and the SMF.
- the SM signaling message is processed, that is, generated and processed at the NAS-SM layer of the UE and SMF.
- the contents of the SM signaling message are not interpreted by the AMF.
- the NAS entity for MM creates a NAS-MM message, including a security header indicating NAS transport of SM signaling, as well as additional information that lets the receiving NAS-MM derive how and where to forward the SM signaling message.
- upon receiving SM signaling, the NAS entity for MM performs an integrity check of the NAS-MM message, interprets the additional information, and derives how and where to forward the SM signaling message.
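- The MM/SM split described above can be pictured as a thin transport wrapper around the SM payload. The sketch below is purely illustrative: the class and field names are assumptions and do not follow the octet layout of 3GPP TS 24.501; it only shows that the NAS-MM entity adds a security header and routing information while the SM container stays uninterpreted by the AMF.

```python
from dataclasses import dataclass

# Illustrative sketch only: simplified stand-ins for NAS-SM and NAS-MM messages,
# not the actual encoding defined in 3GPP TS 24.501.

@dataclass
class SmMessage:
    """An SM message produced by the NAS-SM entity (e.g., a PDU session establishment request)."""
    message_type: str
    pdu_session_id: int
    payload: bytes

@dataclass
class MmTransportMessage:
    """A NAS-MM message carrying SM signaling plus routing information for the receiver."""
    security_header: str          # stands in for the integrity/ciphering protection added by NAS-MM
    indicates_sm_transport: bool  # "NAS transport of SM signaling"
    pdu_session_id: int           # additional information so the receiver can derive where to forward
    sm_container: SmMessage       # not interpreted by the AMF

def mm_wrap(sm_msg: SmMessage) -> MmTransportMessage:
    """Sending NAS-MM entity: wrap the SM signaling for transport."""
    return MmTransportMessage(
        security_header="integrity-protected",
        indicates_sm_transport=True,
        pdu_session_id=sm_msg.pdu_session_id,
        sm_container=sm_msg,
    )

def mm_receive(msg: MmTransportMessage) -> SmMessage:
    """Receiving NAS-MM entity: check integrity, read the routing info, hand the SM payload onward."""
    assert msg.security_header            # placeholder for the real integrity check
    if not msg.indicates_sm_transport:
        raise ValueError("not an SM transport message")
    return msg.sm_container               # forwarded to the NAS-SM entity (UE) or the SMF (network)

if __name__ == "__main__":
    sm = SmMessage("PDU_SESSION_ESTABLISHMENT_REQUEST", pdu_session_id=5, payload=b"...")
    print(mm_receive(mm_wrap(sm)).message_type)
```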
- an RRC layer, an RLC layer, a MAC layer, and a PHY layer located below the NAS layer are collectively referred to as an Access Stratum (AS).
- a terminal supporting the 5G system can support services of various characteristics, and in particular, must support a service having characteristics such as Ultra Reliable and Low Latency Communication (URLLC) having very high reliability and ultra-low latency characteristics.
- the UE may establish an always-on PDU session in order to support a PDU session having characteristics such as URLLC.
- when the always-on PDU session is established, User Plane (UP) resources must be established whenever the terminal switches from the mobility management (MM) idle mode to the connected mode.
- the terminal may request to establish an always-on PDU session based on the indication from the upper layer. Then, the network can determine whether the PDU session should be established as an always-on PDU session.
- one disclosure of the present specification provides a method of using an always-on protocol data unit (PDU) session of a terminal.
- the method may include: displaying, on a screen, first output information indicating that an always-on PDU session is established or in use, based on first reception information from the network; displaying, on the screen, second output information indicating that the always-on PDU session is not supported or cannot be used, based on second reception information from the network; and, based on the second reception information from the network, not requesting the always-on PDU session again until a predetermined condition is satisfied.
- the method may further include displaying a setting screen for any one of activation and release of the always-on PDU session when receiving a selection input from the user for the first output information displayed on the screen.
- the method may further include displaying information on an application that is using the always-on PDU session when receiving a selection input from the user for the first output information displayed on the screen.
- the method may further include displaying a setting screen for one of activation and release of the always-on PDU session when the always-on PDU session becomes available.
- the second output information may include information on the remaining time until the always-on PDU session can be requested again.
- the first information may be included in a PDU session establishment acceptance message or a PDU session modification command message.
- the second information may be included in any one of a PDU session establishment acceptance message, a PDU session modification command message, a PDU session establishment rejection message, and a PDU session modification rejection message.
- the predetermined condition may include one or more of expiration of a running first back-off timer, power off, and removal of a SIM card.
- the method may further include informing, by the non-access stratum (NAS) layer of the terminal, the application layer that the always-on PDU session is not supported or cannot be used.
- the user plane (user plane) resources for the always-on PDU session may be maintained.
- the method may further include transmitting a PDU session request message even when at least one of a session management (SM) back-off timer and a back-off timer for DNN-based congestion control are running.
- the terminal may be a mobile terminal or a device mounted on an autonomous vehicle.
- the terminal may communicate with at least one of a network and an autonomous vehicle.
- the terminal may include: a transmission/reception unit for receiving first reception information or second reception information from a network; a display unit; and a processor that controls the transmission/reception unit and the display unit.
- the processor may display, on the display unit, first output information indicating that an always-on PDU session is established or in use, based on the first received information from the network.
- the processor may display second output information indicating that the always-on PDU session is not supported or cannot be used on the display unit based on the second reception information from the network.
- the processor may not request the always-on PDU session again until a predetermined condition is satisfied based on the second reception information from the network.
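- As a concrete illustration of the claimed terminal behavior, the following sketch shows how a processor might map the first and second reception information onto display output, including the remaining time until a new request is allowed. All names are hypothetical and the timer handling is simplified; this is not an implementation of any 3GPP procedure.

```python
import time

class AlwaysOnStatusUi:
    """Sketch of display-side handling of the first/second reception information (names assumed)."""

    def __init__(self) -> None:
        self.indicator: str = ""
        self._retry_at: float | None = None

    def on_first_reception_info(self) -> None:
        # First reception information: the always-on PDU session is established or in use
        # (e.g., carried in a PDU session establishment acceptance or modification command message).
        self.indicator = "Always-on PDU session established / in use"
        self._retry_at = None

    def on_second_reception_info(self, backoff_seconds: float | None) -> None:
        # Second reception information: the always-on PDU session is not supported or cannot be used.
        self.indicator = "Always-on PDU session not supported / cannot be used"
        self._retry_at = None if backoff_seconds is None else time.monotonic() + backoff_seconds

    def remaining_time_text(self) -> str:
        # The second output information may include the remaining time until a new request is allowed.
        if self._retry_at is None:
            return "retry time unknown"
        remaining = max(0, int(self._retry_at - time.monotonic()))
        return f"new request possible in {remaining} s"

if __name__ == "__main__":
    ui = AlwaysOnStatusUi()
    ui.on_second_reception_info(backoff_seconds=600)
    print(ui.indicator, "-", ui.remaining_time_text())
```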
- FIG. 1 is a structural diagram of an evolved mobile communication network.
- FIG. 2 is an exemplary diagram showing an expected structure of next-generation mobile communication from a node perspective.
- FIG. 3 is an exemplary diagram showing an architecture for supporting simultaneous access to two data networks.
- FIG. 4 is another exemplary diagram showing the structure of a radio interface protocol between a UE and a gNB.
- FIGS. 5A and 5B are signal flow diagrams illustrating an exemplary registration procedure.
- FIGS. 6A and 6B are signal flow diagrams illustrating an exemplary PDU session establishment procedure.
- FIG. 7D shows an example of receiving always-on PDU session configuration information through a new NAS procedure.
- FIGS. 8A to 8G are exemplary diagrams illustrating a display screen of a terminal according to an implementation of the first disclosure.
- FIG. 9 is an exemplary view showing an operation according to the first scheme of the third disclosure of the present specification.
- FIG. 10 is an exemplary view showing an operation according to the second scheme of the third disclosure of the present specification.
- FIGS. 11A to 11D are exemplary views showing a screen of a terminal according to an implementation of the first scheme of the third disclosure of the present specification.
- FIG. 12 is a block diagram of a configuration of a terminal in which an embodiment presented in the present specification is implemented.
- FIG. 13 illustrates a wireless communication system according to an embodiment.
- FIG. 14 illustrates a block diagram of a network node according to an embodiment.
- FIG. 15 is a block diagram showing a configuration of a terminal according to an embodiment.
- FIG. 16 is a block diagram showing the configuration of the terminal illustrated in FIG. 15 in more detail.
- FIG. 17 shows an example of a 5G usage scenario.
- first and second may be used to describe various elements, but the elements should not be limited by the terms. These terms are only used for the purpose of distinguishing one component from another component. For example, without departing from the scope of the present invention, a first component may be referred to as a second component, and similarly, a second component may be referred to as a first component.
- when a component is said to be 'connected' or 'coupled' to another component, it may be directly connected or coupled to that other component, but other components may also exist in between. On the other hand, when a component is said to be 'directly connected' or 'directly coupled' to another component, it should be understood that no other component exists in between.
- the illustrated UE (User Equipment) 100 may also be referred to as a terminal, ME (Mobile Equipment), and the like.
- the UE may be a portable device such as a notebook computer, a mobile phone, a PDA, a smart phone, or a multimedia device, or may be a non-portable device such as a PC or a vehicle-mounted device.
- UE/MS User Equipment/Mobile Station, refers to a UE (100) device.
- EPS stands for Evolved Packet System, and refers to a core network supporting a Long Term Evolution (LTE) network.
- PDN: Packet Data Network.
- PDN-GW: Packet Data Network Gateway; a network node of the EPS network that performs UE IP address allocation, packet screening and filtering, and charging data collection.
- Serving GW Network node of EPS network that performs mobility anchor, packet routing, idle mode packet buffering, and triggering MME to page UE functions.
- eNodeB An EPS (Evolved Packet System) base station installed outdoors, and the cell coverage scale corresponds to a macro cell.
- EPS Evolved Packet System
- MME Mobility Management Entity, and serves to control each entity within the EPS to provide session and mobility for the UE.
- a session is a path for data transmission, and its unit may be a PDN, a bearer, an IP flow unit, etc.
- the difference between each unit can be classified into the entire target network unit (APN or PDN unit) as defined in 3GPP, the unit classified by QoS (Bearer unit) within the unit, and the destination IP address unit.
- APN: an abbreviation for Access Point Name, provided to the UE as the name of an access point managed by the network; in other words, a character string that refers to or identifies a PDN. To access the requested service or network (PDN), traffic passes through the corresponding P-GW, and the APN is a name (character string) predefined in the network so that this P-GW can be found.
- the APN may be in the form of internet.mnc012.mcc345.gprs.
- PDN connection Indicates the connection from the UE to the PDN, that is, the association (connection) between the UE expressed by the ip address and the PDN expressed by the APN. This means a connection between entities in the core network (UE (100)-PDN GW) so that a session can be formed.
- UE Context context information of the UE used to manage the UE in the network, that is, context information consisting of UE id, mobility (current location, etc.), and session properties (QoS, priority, etc.)
- NAS Non-Access-Stratum: Upper stratum of the control plane between the UE and the MME. Supports mobility management, session management, and IP address maintenance between the UE and the network
- PLMN An abbreviation for Public Land Mobile Network, which means the operator's network identification number.
- HPLMN Home PLMN
- VPLMN Visited PLMN
- DNN Acronym for Data Network Name, similar to APN, it is provided to the UE as the name of the access point managed by the network. In 5G system, DNN is used equivalent to APN.
- NSSP: Network Slice Selection Policy; used by the UE for mapping an application to an S-NSSAI (Single Network Slice Selection Assistance Information).
- the UE needs to obtain authorization in order to enable mobility tracking, enable data reception, and receive services. For this, the UE must register with the network.
- the registration procedure is performed when the UE needs to do initial registration for the 5G system.
- the registration procedure is also performed when the UE performs a periodic registration update, or when it moves, in idle mode, to a new registration area (RA) or to a tracking area (TA) not included in the TAI (tracking area identity) list.
- the ID of the UE can be obtained from the UE.
- AMF can deliver PEI (IMEISV) to UDM, SMF and PCF.
- 5A and 5B are signal flow diagrams illustrating an exemplary registration procedure.
- the UE can transmit an AN message to the RAN.
- the AN message may include an AN parameter and a registration request message.
- the registration request message may include information such as a registration type, subscriber permanent ID or temporary user ID, security parameters, Network Slice Selection Assistance Information (NSSAI), 5G capability of the terminal, and protocol data unit (PDU) session status.
- the AN parameter may include a SUPI (Subscription Permanent Identifier) or a temporary user ID, a selected network, and NSSAI.
- the registration type may indicate 'initial registration' (i.e., the terminal is in a non-registered state), 'mobility registration update' (i.e., the terminal is in a registered state and starts the registration procedure due to mobility), or 'periodic registration update' (i.e., the terminal is in a registered state and starts the registration procedure due to expiration of the periodic update timer).
- the temporary user ID indicates the last serving AMF. If the terminal is already registered through non-3GPP access in a PLMN different from the PLMN of 3GPP access, the terminal may not provide the temporary ID of the terminal allocated by the AMF during the registration procedure through the non-3GPP access.
- Security parameters can be used for authentication and integrity protection.
- the PDU session state may indicate a (pre-established) PDU session available in the terminal.
- the RAN may select AMF based on (R)AT and NSSAI.
- if the (R)AN cannot select an appropriate AMF, it selects an arbitrary AMF according to local policy and forwards the registration request to the selected AMF. If the selected AMF cannot serve the terminal, the selected AMF selects another AMF that is more appropriate for the terminal.
- the RAN transmits an N2 message to a new AMF.
- the N2 message includes an N2 parameter and a registration request.
- the registration request may include a registration type, a subscriber permanent identifier or a temporary user ID, a security parameter, and a default setting for NSSAI and MICO modes.
- the N2 parameter includes location information related to a cell in which the UE is camping, a cell identifier, and a RAT type.
- steps 4 to 17 described later may not be performed.
- the newly selected AMF may transmit an information request message to the previous AMF.
- the new AMF can send an information request message containing the complete registration request information to the previous AMF in order to request the SUPI and MM context of the UE.
- the previous AMF transmits an information response message to the newly selected AMF.
- the information response message may include SUPI, MM context, and SMF information.
- the previous AMF transmits an information response message including the SUPI and MM context of the UE.
- SMF information including the ID of the SMF and the PDU session ID may be included in the information response message in the previous AMF.
- the new AMF transmits an Identity Request message to the UE when SUPI is not provided by the UE or is not retrieved from the previous AMF.
- the terminal transmits an Identity Response message including the SUPI to the new AMF.
- AMF may decide to trigger AUSF.
- AMF may select AUSF based on SUPI.
- AUSF can initiate authentication of UE and NAS security functions.
- the new AMF may transmit an information response message to the previous AMF.
- the new AMF may transmit the information response message to confirm delivery of the UE MM context.
- the new AMF may transmit an Identity Request message to the UE.
- an Identity Request message may be transmitted in order for the AMF to retrieve the PEI.
- the new AMF checks the ME identifier.
- step 14 described later the new AMF selects UDM based on SUPI.
- if the AMF has changed since the last registration, if there is no valid subscription context for the terminal in the AMF, or if the terminal provides a SUPI that does not refer to a valid context in the AMF, the new AMF starts the update location procedure. This may also be initiated when the UDM initiates a cancel location toward the previous AMF. The old AMF discards the MM context and notifies all possible SMF(s), and the new AMF creates an MM context for the terminal after obtaining AMF-related subscription data from the UDM.
- AMF acquires the NSSAI allowed based on the requested NSSAI, UE subscription and local policy. If AMF is not suitable to support the allowed NSSAI, it will reroute the registration request.
- the new AMF can select a PCF based on SUPI.
- the new AMF transmits a UE Context Establishment Request message to the PCF.
- the AMF may request an operator policy for the terminal from the PCF.
- the PCF transmits a UE Context Establishment Acknowledged message to the new AMF.
- the new AMF transmits an N11 request message to the SMF.
- the new AMF when the AMF is changed, notifies each SMF of the new AMF serving the terminal.
- the AMF verifies the PDU session state from the UE with available SMF information.
- usable SMF information may be received from the previous AMF.
- the new AMF may request the SMF to release network resources related to a PDU session that is not active in the terminal.
- the new AMF transmits an N11 response message to the SMF.
- the previous AMF transmits a UE Context Termination Request message to the PCF.
- the previous AMF may delete the UE context in the PCF.
- the PCF may transmit a UE Context Termination Request message to the previous AMF.
- the new AMF transmits a registration acceptance message to the UE.
- the registration acceptance message may include a temporary user ID, a registration area, mobility restriction, PDU session state, NSSAI, a regular registration update timer, and an allowed MICO mode.
- the registration acceptance message may include the allowed NSSAI and information of the mapped NSSAI.
- the allowed NSSAI information on the access type of the UE may be included in an N2 message including a registration acceptance message.
- the mapped NSSAI information is information obtained by mapping each S-NSSAI of the allowed NSSAI to the corresponding S-NSSAI of the NSSAI configured for the HPLMN.
- a temporary user ID may be further included in the registration acceptance message.
- information indicating mobility limitation may be additionally included in the registration acceptance message.
- the AMF may include information indicating the PDU session state for the terminal in the registration acceptance message.
- the terminal may remove any internal resources related to a PDU session that is not marked as active in the received PDU session state. If the PDU session state information is in the Registration Request, the AMF may include information indicating the PDU session state to the UE in the registration acceptance message.
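- The PDU session state exchange above amounts to reconciling two sets of PDU session IDs. The sketch below treats the PDU session state as a plain set of IDs rather than the actual bitmap-encoded IE, so it is illustrative only.

```python
def reconcile_pdu_sessions(local_sessions: dict[int, object],
                           active_ids_from_network: set[int]) -> dict[int, object]:
    """Keep only the locally stored PDU sessions that the network marked as active;
    release internal resources for the rest (illustrative only)."""
    released = [sid for sid in local_sessions if sid not in active_ids_from_network]
    for sid in released:
        # here the terminal would release any internal resources tied to this PDU session
        del local_sessions[sid]
    return local_sessions

if __name__ == "__main__":
    local = {1: "ctx-1", 2: "ctx-2", 5: "ctx-5"}
    print(reconcile_pdu_sessions(local, active_ids_from_network={1, 5}))  # keeps sessions 1 and 5
```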
- the terminal transmits a registration completion message to the new AMF.
- there may be two types of PDU session establishment procedures, as follows.
- the network may transmit a device trigger message to the application(s) of the UE.
- 6A and 6B are signal flow diagrams illustrating an exemplary PDU session establishment procedure.
- the procedure illustrated in FIGS. 6A and 6B assumes that the terminal has already registered on the AMF according to the registration procedure illustrated in FIGS. 5A and 5B. Therefore, it is assumed that the AMF has already obtained user subscription data from UDM.
- the terminal transmits the NAS message to the AMF.
- the message may include S-NSSAI (Single Network Slice Selection Assistance Information), DNN, PDU session ID, request type, N1 SM information, and the like.
- the terminal includes the S-NSSAI from the allowed (allowed) NSSAI of the current access type. If the information on the mapped NSSAI is provided to the terminal, the terminal may provide both an S-NSSAI based on the allowed NSSAI and a corresponding S-NSSAI based on information on the mapped NSSAI.
- the mapped NSSAI information is information obtained by mapping each S-NSSAI of the allowed NSSAI to the corresponding S-NSSAI of the NSSAI configured for the HPLMN.
- the terminal may extract and store the information of the allowed S-NSSAI and the mapped S-NSSAI included in the registration acceptance message received from the network (i.e., AMF) in the registration procedure of FIGS. 5A and 5B. Accordingly, the terminal may include both the S-NSSAI based on the allowed NSSAI and the corresponding S-NSSAI based on the information of the mapped NSSAI in the PDU session establishment request message.
- the UE may generate a new PDU session ID.
- the terminal may initiate a PDU session establishment procedure initiated by the terminal by transmitting a NAS message including the PDU session establishment request message in the N1 SM information.
- the PDU session establishment request message may include a request type, an SSC mode, and a protocol configuration option.
- the request type indicates "initial request”. However, when there is an existing PDU session between 3GPP access and non-3GPP access, the request type may indicate "existing PDU session”.
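- A hedged sketch of how a UE-side NAS layer might assemble the fields listed above into the step 1 message. The structure and field names are simplifications for illustration and do not reproduce the TS 24.501 encoding.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class PduSessionEstablishmentRequest:
    """Simplified N1 SM information payload (illustrative field names)."""
    pdu_session_id: int
    request_type: str                 # "initial request" or "existing PDU session"
    ssc_mode: int
    protocol_config_options: dict = field(default_factory=dict)

@dataclass
class UlNasMessage:
    """Simplified stand-in for the NAS message carrying the N1 SM information."""
    s_nssai: Optional[str]
    mapped_s_nssai: Optional[str]
    dnn: str
    pdu_session_id: int
    n1_sm_info: PduSessionEstablishmentRequest

def build_establishment_request(dnn: str,
                                allowed_s_nssai: str,
                                mapped_s_nssai: Optional[str],
                                handover_between_accesses: bool,
                                new_pdu_session_id: int) -> UlNasMessage:
    # The request type indicates "initial request" unless an existing PDU session is moved
    # between 3GPP access and non-3GPP access.
    request_type = "existing PDU session" if handover_between_accesses else "initial request"
    sm = PduSessionEstablishmentRequest(new_pdu_session_id, request_type, ssc_mode=1)
    return UlNasMessage(allowed_s_nssai, mapped_s_nssai, dnn, new_pdu_session_id, sm)

if __name__ == "__main__":
    msg = build_establishment_request("internet", "01-000001", None,
                                      handover_between_accesses=False, new_pdu_session_id=7)
    print(msg.n1_sm_info.request_type)   # initial request
```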
- the NAS message transmitted by the terminal is encapsulated in the N2 message by the AN.
- the N2 message is transmitted through AMF, and may include user location information and access technology type information.
- the N1 SM information may include an SM PDU DN request container that includes information on PDU session authentication by an external DN.
- the AMF may determine that the message corresponds to a request for a new PDU session when the request type indicates "initial request" and when the PDU session ID is not used for the existing PDU session of the UE.
- the AMF may determine the default S-NSSAI for the requested PDU session according to the UE subscription.
- the AMF may store the PDU session ID and the SMF ID in association with each other.
- AMF transmits an SM request message to the SMF.
- the SM request message may include a subscriber permanent ID, DNN, S-NSSAI, PDU session ID, AMF ID, N1 SM information, user location information, and access technology type.
- the N1 SM information may include a PDU session ID and a PDU session establishment request message.
- the AMF ID is used to identify the AMF serving the terminal.
- the N1 SM information may include a PDU session establishment request message received from the UE.
- the SMF transmits a subscriber data request message to the UDM.
- the subscriber data request message may include a subscriber permanent ID and DNN.
- step 3 if the request type indicates "existing PDU session", the SMF determines that the request is due to handover between 3GPP access and non-3GPP access.
- the SMF can identify an existing PDU session based on the PDU session ID.
- the SMF may request subscription data.
- UDM may transmit a subscription data response message to the SMF.
- the subscription data may include information on an authenticated request type, an authenticated SSC mode, and a basic QoS profile.
- the SMF checks whether the UE request complies with the user subscription and local policy. If it does not, the SMF rejects the UE request through NAS SM signaling (including the related SM rejection cause) delivered by the AMF, and the SMF informs the AMF that the PDU session ID should be considered released.
- SMF sends a message to DN through UPF.
- the SMF selects the UPF and triggers the PDU.
- the SMF ends the PDU session establishment procedure and notifies the terminal of rejection.
- the SMF may initiate PDU-CAN session establishment towards the PCF to obtain basic PCC rules for the PDU session. If the request type in step 3 indicates "existing PDU session", the PCF may start modifying the PDU-CAN session instead.
- if the request type of step 3 indicates "initial request", the SMF selects an SSC mode for the PDU session. If step 5 is not performed, the SMF may also select the UPF. In the case of PDU type IPv4 or IPv6, the SMF can allocate an IP address/prefix for the PDU session.
- the SMF can start the PDU-CAN session.
- if the request type indicates "initial request" and step 5 was not performed, the SMF starts the N4 session establishment procedure using the selected UPF; otherwise, it can start the N4 session modification procedure using the selected UPF.
- SMF transmits an N4 session establishment/modification request message to the UPF.
- the SMF may provide a packet detection, enforcement and reporting rule to be installed in the UPF for the PDU session.
- CN tunnel information may be provided to the UPF.
- UPF can respond by sending an N4 session establishment/modification response message.
- CN tunnel information may be provided to the SMF.
- the SMF transmits an SM response message to the AMF.
- the message may include cause, N2 SM information, and N1 SM information.
- the N2 SM information may include PDU session ID, QoS profile, and CN tunnel information.
- the N1 SM information may include a PDU session establishment acceptance message.
- the PDU session establishment acceptance message may include an authorized QoS rule, SSC mode, S-NSSAI, and an assigned IPv4 address.
- the N2 SM information is information that the AMF must deliver to the RAN and may include the following.
- -CN tunnel information This corresponds to the core network address of the N3 tunnel corresponding to the PDU session.
- -PDU Session ID This can be used to indicate to the UE the association between AN resources for the UE and the PDU session by AN signaling to the UE.
- the N1 SM information includes a PDU session acceptance message that AMF must provide to the terminal.
- Multiple QoS rules may be included in the N1 SM information and the N2 SM information in the PDU session establishment acceptance message.
- the SM response message also contains the PDU session ID and information allowing the AMF to know which target UE is concerned as well as which access toward the terminal should be used.
- AMF transmits an N2 PDU session request message to the RAN.
- the message may include N2 SM information and NAS message.
- the NAS message may include a PDU session ID and a PDU session establishment acceptance message.
- the AMF may transmit a NAS message including a PDU session ID and a PDU session establishment acceptance message. Also, the AMF includes received N2 SM information from the SMF in the N2 PDU session request message and transmits it to the RAN.
- the RAN may exchange specific signaling with the UE related to information received from the SMF.
- the RAN also allocates RAN N3 tunnel information for the PDU session.
- the RAN delivers the NAS message provided in step 10 to the terminal.
- the NAS message may include PDU session ID and N1 SM information.
- the N1 SM information may include a PDU session establishment acceptance message.
- the RAN transmits a NAS message to the terminal only when necessary RAN resources are set and allocation of RAN tunnel information is successful.
- the RAN transmits an N2 PDU session response message to the AMF.
- the message may include PDU session ID, cause, and N2 SM information.
- the N2 SM information may include a PDU session ID, (AN) tunnel information, and a list of allowed/rejected QoS profiles.
- -RAN tunnel information may correspond to the access network address of the N3 tunnel corresponding to the PDU session.
- the AMF may transmit an SM request message to the SMF.
- the SM request message may include N2 SM information.
- the AMF may be to transmit the N2 SM information received from the RAN to the SMF.
- the SMF may start the N4 session establishment procedure together with the UPF. Otherwise, the SMF can start the N4 session modification procedure using UPF.
- SMF may provide AN tunnel information and CN tunnel information. CN tunnel information may be provided only when the SMF selects CN tunnel information in step 8.
- the UPF may transmit an N4 session establishment/modification response message to the SMF.
- the SMF may transmit an SM response message to the AMF.
- the AMF can deliver related events to the SMF; this occurs at handover, when the RAN tunnel information is changed, or when the AMF is relocated.
- SMF transmits information to the terminal through UPF. Specifically, in the case of PDU Type IPv6, the SMF may generate an IPv6 Router Advertisement and transmit it to the UE through N4 and UPF.
- the SMF releases the user plane used by the terminal over the source access (3GPP or non-3GPP access).
- the SMF may call "UDM_Register UE serving NF service" including the SMF address and DNN.
- UDM can store SMF's ID, address, and related DNN.
- if, during the procedure, the PDU session establishment is not successful, the SMF informs the AMF.
- when an always-on PDU session is established, User Plane (UP) resources must be established whenever the terminal switches from the mobility management (MM) idle mode to the connected mode.
- the UE, in idle mode or in connected mode, can request UP activation for the PDU (protocol data unit or packet data unit) session(s) whose UP is deactivated, that is, for which no UP (User Plane) context has been created or no UP resources have been allocated.
- in idle mode, this is possible through a service request or a registration request; in connected mode, it is also possible through a service request procedure.
- if a PDU session having the always-on characteristic is requested, the terminal sends the network a service request message or a registration request message in which the uplink data status information element (IE) is marked to indicate that there is data to be transmitted.
- the UE may distinguish whether or not the corresponding PDU session should be established as an always-on PDU session and inform the network.
- the PDU session establishment request message or the PDU session modification request message may include an IE indicating an always-on PDU session.
- the network responds to the terminal with information indicating whether or not the always-on PDU session requested by the terminal is allowed.
- the response message to the PDU session establishment request message or the PDU session modification request message may include an always-on PDU session IE.
- the PDU session establishment request message may include an "always-on PDU Session requested” IE.
- the PDU session modification request message may include an "always-on PDU Session requested” IE.
- the "always-on PDU Session requested” IE may indicate whether the PDU session requested by the terminal is an always-on PDU session.
- the "always-on PDU Session requested” IE may include at least one bit. When the value of the bit is 0, it may indicate that an always-on PDU session is not requested. When the value of the bit is 1, it may indicate that an always-on PDU session is requested.
- the PDU session establishment acceptance message may include "always-on PDU Session indication”.
- the PDU session modification command message may include "always-on PDU Session indication”.
- the always-on PDU Session indication may indicate whether a corresponding PDU session is established as an always-on PDU session.
- the "always-on PDU Session indication” may include at least one bit. When the value of the bit is 0, it may indicate that an always-on PDU session is not allowed. When the value of the bit is 1, it may indicate that an always-on PDU session is allowed.
- always-on PDU sessions are used for services that must be guaranteed quickly and reliably, such as Augmented Reality (AR), Virtual Reality (VR), and Vehicle-to-everything (V2X).
- the terminal is required to guarantee a fast always-on PDU session connection, and the network is required to process a fast always-on PDU session.
- a backoff timer e.g., a session management (SM) back-off timer; a T3396 timer for DNN-based congestion control, a T3584 timer for DNN and S-NSSAI-based congestion control
- the network responds to the terminal with information indicating "always-on PDU session not allowed" for the PDU session.
- the network may transmit the "always-on PDU session not allowed" information by including it in a PDU session establishment acceptance message or a PDU session modification command message.
- the UE recognizes that the corresponding PDU session is not an always-on PDU session. However, the UE may transmit a PDU session establishment request message including “always-on PDU session requested” again after the corresponding PDU session is released or the UE is switched to an idle mode.
- the terminal may be configured as to which application is associated with an always-on PDU session.
- These applications may refer to applications that require rapid data transmission, such as low latency data transmission or URLLC (Ultra-Reliable Low-Latency Communication application).
- when the application is triggered (or driven) for rapid data transmission, the application notifies the NAS layer of the terminal about the triggering (or driving), and the NAS layer of the terminal transmits a new PDU session establishment request message or a PDU session modification request message to the network.
- the terminal may include an "always-on PDU session requested" IE indicating whether the corresponding PDU session is a PDU session corresponding to an always-on PDU session in the message.
- Information on whether the application is associated with an always-on PDU session or not may be preset in the terminal for rapid data transmission.
- Such an always-on PDU session may be set to the UE as follows.
- Network/operator provides settings to terminal through OTA
- Network/operator provides related setting information through one of the following NAS procedures
- Registration procedure e.g., registration request message, registration acceptance message or registration rejection message
- the always-on PDU session may be set in advance in the USIM.
- FIG. 7B shows an example in which configuration information for an always-on PDU session is received through a registration procedure (e.g., a registration request message, a registration acceptance message, or a registration rejection message).
- FIG. 7C shows an example in which configuration information for an always-on PDU session is received through a configuration update procedure (e.g., a configuration update command message, a configuration update complete message).
- FIG. 7D shows an example in which configuration information for an always-on PDU session is received through a new NAS procedure.
- the always-on PDU session setting information may be applied and provided as follows.
- the terminal may be configured to know whether there is an association between an application and an always-on PDU session.
- the terminal may know information about whether the DNN is associated with an always-on PDU session and information about whether the application is associated with an always-on PDU session DNN.
- the always-on PDU session service area and the always-on PDU session DNN may be configured in the AMF on a DN basis (that is, the same configuration may apply to another terminal accessing the same always-on PDU session DN).
- the always-on PDU session service area may be the same regardless of other elements (ie, a registration area of a UE or a subscriber of a UE).
- the always-on PDU session (configuration) information may include one or more combinations of an always-on PDU session service area and/or an always-on PDU session DNN and/or an S-NSSAI and/or an operating system (OS) ID + OS-specific application ID and/or a 5QI and/or an IP address/port number.
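- The configuration items enumerated above can be held in a simple record, together with a check that answers whether a given application may use an always-on PDU session in the current tracking area. All field names and the matching rule are illustrative assumptions, not a normative structure.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class AlwaysOnPduSessionConfig:
    """Illustrative container for always-on PDU session (configuration) information."""
    service_area_tais: set = field(default_factory=set)     # always-on PDU session service area (TAIs)
    dnn: Optional[str] = None                                # always-on PDU session DNN
    s_nssai: Optional[str] = None
    os_id_app_ids: set = field(default_factory=set)          # pairs of (OS ID, OS-specific app ID)
    five_qi: Optional[int] = None
    ip_filters: set = field(default_factory=set)             # pairs of (IP address, port number)

    def applies_to(self, os_id: str, app_id: str, current_tai: str) -> bool:
        """True if the application may use an always-on PDU session in the current tracking area."""
        app_ok = not self.os_id_app_ids or (os_id, app_id) in self.os_id_app_ids
        area_ok = not self.service_area_tais or current_tai in self.service_area_tais
        return app_ok and area_ok

if __name__ == "__main__":
    cfg = AlwaysOnPduSessionConfig(service_area_tais={"TAI-1", "TAI-2"},
                                   dnn="urllc.example",            # hypothetical DNN
                                   s_nssai="01-000001",
                                   os_id_app_ids={("android", "com.example.v2x")})
    print(cfg.applies_to("android", "com.example.v2x", "TAI-1"))   # True
    print(cfg.applies_to("android", "com.example.v2x", "TAI-9"))   # False: outside the service area
```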
- Always-on PDU session information or always-on PDU session setting information may be provided to the UE by the AMF during the registration procedure or during the UE configuration update procedure or during a new NAS signaling procedure.
- the corresponding always-on PDU session service area information may include a set of tracking areas belonging to the registration area allocated to the terminal by the AMF (e.g., the overlap of the always-on PDU session service area and the assigned registration area). The AMF may not create a registration area based on the availability of an always-on PDU session.
- always-on (always-on) PDU session information may be provided to SMF/AMF from UE subscriber data of UDM.
- the terminal may delete the always-on PDU session information for the DNN.
- the AMF may provide the changed information to the terminal through a configuration update procedure, a registration procedure, or a new dedicated NAS procedure.
- Always-on PDU session configuration information may be configured and applied in a combination of 1 to 5 above.
- the always-on PDU session information or the always-on PDU session configuration information may be provided/configured to the terminal by being included in a registration acceptance message or a configuration update command message.
- 8A to 8G are exemplary diagrams illustrating a display screen of a terminal according to an implementation of the first disclosure.
- the terminal may display, on the display unit of the terminal (e.g., 1041 in FIG. 12 or 13), a screen that enables the terminal to receive a setting input from the user regarding whether to activate the always-on PDU session.
- the terminal may receive an input from a user through an input unit (1053 of FIG. 12 or 13). If a touch screen, which is a type of the input unit 1053, is embedded in the display 1041, the terminal may receive the input through a touch input on a setting screen displayed on the display unit 1041.
- the terminal may determine whether to display the screen when an application requesting an always-on PDU session is executed. The determination may be performed based on the always-on PDU session configuration information described above. As described above, the always-on PDU session configuration information may be stored in advance in the terminal or may be received from a network.
- the terminal may display, on the display unit (e.g., 1041 of FIG. 12 or 13), a setting screen that allows the user to set whether to allow access to an application associated with the always-on PDU session (or whether the user wants to access the application associated with the always-on PDU session).
- if the terminal receives an input indicating that the user does not allow access to the application associated with the always-on PDU session (or does not want to access the application associated with the always-on PDU session), the terminal may not perform the PDU session establishment procedure for creating the always-on PDU session.
- the terminal may display a screen for receiving a user input on whether to allow an always-on PDU session for each application or whether to allow access to applications related to the always-on PDU session.
- a list of applications and a toggle switch for setting whether to allow the always-on PDU session for each application are shown.
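- One way to back such a settings screen is a per-application permission map that the session-establishment logic consults before requesting an always-on PDU session. This is a UI-side sketch with hypothetical names; whether the default is allowed or not allowed is an assumption.

```python
class AlwaysOnAppSettings:
    """Per-application toggle state backing the settings screen (sketch)."""

    def __init__(self) -> None:
        self._allowed: dict = {}

    def set_toggle(self, app_name: str, allowed: bool) -> None:
        # Called when the user flips the toggle switch for an application.
        self._allowed[app_name] = allowed

    def is_allowed(self, app_name: str) -> bool:
        # Assumed default: do not request an always-on PDU session unless the user enabled it.
        return self._allowed.get(app_name, False)

if __name__ == "__main__":
    settings = AlwaysOnAppSettings()
    settings.set_toggle("App #1", True)
    settings.set_toggle("App #3", False)
    print(settings.is_allowed("App #1"), settings.is_allowed("App #3"))   # True False
```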
- the aforementioned screen (e.g., icon, notification window, etc.) may differ in shape, display form (e.g., blinking), color, and the like from the status screen indicating whether the application associated with the always-on PDU session is connected.
- when the always-on PDU session is established, an application is using the always-on PDU session, or the always-on PDU session is available, the terminal may display information (e.g., an indicator) indicating this on the display unit 1041.
- the terminal may display a notification window indicating that the always-on PDU session has been successfully established.
- the notification window shown in FIG. 8E may be automatically displayed even if there is no user input. Such automatic display can be canceled or activated by the user through the setting screen.
- the terminal may display a screen indicating the always-on PDU session state as shown in FIG. 8F.
- a list of applications using the always-on PDU session and each state may be displayed. For example, App #1 may indicate that the always-on PDU session is being used, and App #3 may indicate that the always-on PDU session is not being used.
- the terminal may display a screen as shown in FIGS. 8A to 8C. That is, the terminal may display a setting screen for either activation or release of the always-on PDU session, or information on an application that is using the always-on PDU session.
- the terminal may display, on the display unit 1041, a screen (e.g., an icon, a notification window, etc.) informing that the always-on PDU session has been released, as shown in FIG. 8G, or information indicating that the application is not using the always-on PDU session.
- even if a back-off timer of the terminal (e.g., an SM back-off timer: T3396 for DNN-based congestion control, T3585 for S-NSSAI-based congestion control, or T3584 for congestion control based on a specific S-NSSAI and specific DNN combination) is running, the running back-off timer is overridden for a PDU session establishment request message or a PDU session modification request message for the always-on PDU session request, and the message can then be transmitted to the network.
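- The override rule of this disclosure can be condensed into a single predicate: a running SM back-off timer blocks an ordinary session-management request but not one carrying the always-on indication. The timer names follow the text above; the predicate itself is an illustrative sketch, not normative behavior.

```python
def may_send_sm_request(always_on_requested: bool, running_backoff_timers: set) -> bool:
    """Decide whether a PDU session establishment/modification request may be sent now.

    running_backoff_timers may contain, e.g., "T3396" (DNN-based congestion control),
    "T3584" (specific DNN + S-NSSAI based) or "T3585" (S-NSSAI based).
    """
    if not running_backoff_timers:
        return True
    # For an always-on PDU session request, the running back-off timer is overridden.
    return always_on_requested

if __name__ == "__main__":
    print(may_send_sm_request(False, {"T3396"}))   # False: an ordinary request stays blocked
    print(may_send_sm_request(True, {"T3396"}))    # True: the always-on request overrides the timer
```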
- even if the network is performing congestion control (e.g., DNN-based congestion control, DNN and S-NSSAI-based congestion control, or S-NSSAI-based congestion control) in the current congestion situation, it processes the PDU session establishment request message or the PDU session modification request message for the always-on PDU session request received from the terminal.
- the terminal may transmit a service request message to switch to a connected mode.
- the network processes the service request message transmitted by the terminal even if it is performing MM (mobility management) congestion control in the current congestion situation.
- in the service request message, an indication/information indicating that it is for an always-on PDU session request, or an indication/information allowing it to pass MM congestion control, may be included and transmitted.
- alternatively, when the terminal transmits the service request message, general indication/information may be included and transmitted.
- after the service request message has been successfully processed and the terminal has been switched from idle mode to connected mode, the terminal transmits a PDU session establishment request message or a PDU session modification request message for the always-on PDU session request to the network, and the network processes these messages.
- The first scheme of the third disclosure: a scheme of signaling the always-on PDU session request through a positive response (i.e., providing a specific/dedicated (rejection) cause, e.g., a cause value indicating that an always-on PDU session is not supported or cannot be used).
- FIG. 9 is an exemplary view showing an operation according to the first scheme of the third disclosure of the present specification.
- Information about whether an application of the terminal is associated with an always-on PDU session may be configured in the terminal in advance. Such always-on PDU session configuration information may be stored from the time of initial activation or may be received from the network.
- An application of the terminal triggers the NAS layer of the terminal for rapid data transmission (low-delay data transmission or URLLC). At this time, when the application is triggered (driven) for rapid data transmission, information on the triggering is delivered to the NAS layer of the terminal.
- When the UE transmits a new PDU session establishment request message or a PDU session modification request message to the network, the UE indicates to the network (e.g., SMF) whether the PDU session corresponds to an always-on PDU session (i.e., the PDU session establishment/modification request message includes an "always-on PDU session requested" IE).
- The network (e.g., SMF) responds to the terminal with information indicating that the terminal's always-on PDU session request is not supported or cannot be used (i.e., a specific/dedicated cause value, for example, a cause value indicating that an always-on PDU session is not supported or cannot be used). At this time, this information (i.e., the specific/dedicated cause value) may be provided by being included in the always-on PDU session indication in the PDU session establishment accept message/PDU session modification command message.
- Alternatively, the information indicating that the terminal's always-on PDU session request is not supported or cannot be used may be provided through a new indication or information different from the always-on PDU session indication.
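- A minimal sketch of the first scheme follows, assuming simplified message models: the UE sets an "always-on PDU session requested" IE, and the SMF accepts the session while returning a dedicated cause (and optionally a back-off timer) when always-on is not supported. The field names and the cause text are assumptions for illustration only.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PduSessionEstablishmentRequest:
    pdu_session_id: int
    always_on_requested: bool = False      # the "always-on PDU session requested" IE

@dataclass
class PduSessionEstablishmentAccept:
    pdu_session_id: int
    always_on_granted: bool                # always-on PDU session indication
    cause_5gsm: Optional[str] = None       # dedicated cause when always-on is refused
    backoff_timer_s: Optional[int] = None  # optional timer, distinct from T3396/T3584/T3585

def smf_handle_request(req: PduSessionEstablishmentRequest,
                       always_on_supported: bool) -> PduSessionEstablishmentAccept:
    """First scheme: the session itself is accepted, and the always-on request is
    answered with a dedicated cause inside the accept message when it is unsupported."""
    if req.always_on_requested and not always_on_supported:
        return PduSessionEstablishmentAccept(
            req.pdu_session_id,
            always_on_granted=False,
            cause_5gsm="always-on PDU session not supported",  # assumed cause text
            backoff_timer_s=300)
    return PduSessionEstablishmentAccept(req.pdu_session_id,
                                         always_on_granted=req.always_on_requested)

accept = smf_handle_request(PduSessionEstablishmentRequest(5, always_on_requested=True),
                            always_on_supported=False)
print(accept)
```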
- At this time, the network (e.g., SMF) may additionally provide a back-off timer.
- the back-off timer may be a timer different from T3396, T3584, and T3585 for conventional NAS level congestion control.
- The NAS layer of the terminal does not make an always-on PDU session (re)request to the network (e.g., SMF) until the terminal is powered off (power switched off) or the USIM is removed.
- Alternatively, if a back-off timer is additionally provided from the network (e.g., SMF), the NAS layer does not make an always-on PDU session (re)request to the network (e.g., SMF) while the provided back-off timer is running (i.e., until the provided back-off timer value expires or the timer is stopped).
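- The NAS-side suppression rule described above can be sketched as follows; AlwaysOnRequestGate is a hypothetical helper used only for illustration.

```python
import time
from typing import Optional

class AlwaysOnRequestGate:
    """Sketch of the NAS-side rule described above: once the dedicated cause has been
    received, suppress always-on (re)requests until power-off/USIM removal, or, if a
    back-off timer was provided, until that timer expires or is stopped."""

    def __init__(self) -> None:
        self.blocked = False
        self.blocked_until: Optional[float] = None  # None => until power off / USIM removal

    def on_not_supported_cause(self, backoff_timer_s: Optional[int]) -> None:
        self.blocked = True
        self.blocked_until = time.time() + backoff_timer_s if backoff_timer_s else None

    def on_timer_stopped(self) -> None:
        if self.blocked_until is not None:
            self.blocked, self.blocked_until = False, None

    def may_request_always_on(self) -> bool:
        if not self.blocked:
            return True
        if self.blocked_until is not None and time.time() >= self.blocked_until:
            self.blocked, self.blocked_until = False, None
            return True
        return False

gate = AlwaysOnRequestGate()
gate.on_not_supported_cause(backoff_timer_s=600)
print(gate.may_request_always_on())  # False while the provided back-off timer runs
```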
- The application layer of the terminal may provide triggering/indication/information for an always-on PDU session (re)request to the NAS layer of the terminal, regardless of whether information indicating that the always-on PDU session request is not supported or cannot be used (i.e., a specific/dedicated cause value, e.g., a cause value indicating that an always-on PDU session is not supported or cannot be used) has been received from the network (i.e., SMF).
- Even in this case, the NAS layer of the terminal, which has been provided from the network (e.g., SMF) with the information indicating that the always-on PDU session request is not supported or cannot be used (i.e., a specific/dedicated cause value, e.g., a cause value indicating that an always-on PDU session is not supported or cannot be used), does not make an always-on PDU session (re)request to the network (e.g., SMF), as described above.
- Alternatively, the NAS layer of the terminal may deliver the received information to the application layer.
- the UE application layer does not trigger an always-on PDU session (re) request to the NAS layer of the UE until the UE is powered off or the USIM is removed.
- Alternatively, if a (back-off) timer is additionally provided from the network (e.g., SMF), an always-on PDU session (re)request is not triggered to the NAS layer of the terminal for a time based on the provided (back-off) timer (i.e., until the provided (back-off) timer expires or is stopped).
- The second scheme of the third disclosure: a scheme of signaling the always-on PDU session request through a negative response (i.e., providing a specific/dedicated (rejection) cause, e.g., a cause value indicating that an always-on PDU session is not supported or cannot be used).
- FIG. 10 is an exemplary view showing an operation according to the second scheme of the third disclosure of the present specification.
- Information about whether an application of the terminal is associated with an always-on PDU session or is not associated with one (i.e., uses data transmission over a general PDU session) may be configured in the terminal in advance. This always-on PDU session configuration information may be stored from the time of initial activation or may be received from the network.
- An application of the terminal triggers the NAS layer of the terminal for rapid data transmission (low-delay data transmission or URLLC). At this time, when the application is triggered (driven) for rapid data transmission, information on the triggering is delivered to the NAS layer of the terminal.
- When the UE transmits a new PDU session establishment request message or a PDU session modification request message to the network, the UE indicates to the network (e.g., SMF) whether the PDU session corresponds to an always-on PDU session (i.e., the PDU session establishment/modification request message includes an "always-on PDU session requested" IE).
- The network (e.g., SMF) responds to the terminal with information indicating that the terminal's always-on PDU session request is not supported or cannot be used (i.e., a specific/dedicated cause value, for example, a cause value indicating that an always-on PDU session is not supported or cannot be used). At this time, this information (i.e., the specific/dedicated cause value) may be provided by being included in the always-on PDU session indication in the PDU session establishment reject message/PDU session modification reject message.
- Alternatively, the information indicating that the terminal's always-on PDU session request is not supported or cannot be used may be provided through a new indication or information different from the always-on PDU session indication.
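- For illustration, the following sketch contrasts the UE-side handling of the two schemes, assuming a simplified response model in which the dedicated cause may arrive either in an accept/command message (first scheme) or in a reject message (second scheme); all names are illustrative.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class SmResponse:
    accepted: bool                        # True: accept/command (first scheme); False: reject (second scheme)
    always_on_granted: bool = False
    cause_5gsm: Optional[str] = None      # dedicated "always-on not supported" cause
    backoff_timer_s: Optional[int] = None

def ue_handle_sm_response(resp: SmResponse) -> str:
    """UE-side sketch: in the first scheme the PDU session remains usable as a normal
    session even though always-on was refused; in the second scheme the whole
    establishment/modification was rejected. Either way, re-requests are suppressed."""
    if resp.cause_5gsm is not None:
        note = (f"suppress always-on re-requests for {resp.backoff_timer_s} s"
                if resp.backoff_timer_s
                else "suppress always-on re-requests until power off or USIM removal")
        if resp.accepted:
            return f"First scheme: keep PDU session as non-always-on; {note}"
        return f"Second scheme: PDU session establishment/modification rejected; {note}"
    return "Always-on PDU session granted" if resp.always_on_granted else "Normal PDU session"

print(ue_handle_sm_response(SmResponse(accepted=False,
                                       cause_5gsm="always-on PDU session not supported",
                                       backoff_timer_s=300)))
```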
- At this time, the network (e.g., SMF) may additionally provide a (back-off) timer.
- the (back-off) timer may be a timer value different from those of T3396, T3584, and T3585 for conventional NAS level congestion control.
- If a back-off timer is additionally provided from the network (e.g., SMF), the NAS layer of the terminal does not make an always-on PDU session (re)request to the network (e.g., SMF) while the provided back-off timer is running (i.e., until the provided back-off timer value expires or the timer is stopped).
- Even if the application layer of the terminal provides triggering/indication/information for an always-on PDU session (re)request regardless of the information received from the network (i.e., SMF) indicating that the always-on PDU session request is not supported or cannot be used (i.e., a specific/dedicated cause value, e.g., a cause value indicating that an always-on PDU session is not supported or cannot be used), the NAS layer of the terminal, which has been provided with that cause value, does not make an always-on PDU session (re)request to the network (e.g., SMF).
- Alternatively, the NAS layer of the terminal may deliver the received information to the application layer.
- the UE application layer does not trigger an always-on PDU session (re) request to the NAS layer of the UE until the UE is powered off or the USIM is removed.
- Alternatively, if a (back-off) timer is additionally provided from the network (e.g., SMF), an always-on PDU session (re)request is not triggered to the NAS layer of the terminal for a time based on the provided (back-off) timer (i.e., until the provided (back-off) timer expires or is stopped).
- When the terminal transmits several new PDU session establishment request messages or PDU session modification request messages to the network, not every PDU session is necessarily an always-on PDU session. Accordingly, the above schemes may be applied more appropriately when the terminal indicates to the network (e.g., SMF), for each PDU session, whether that PDU session corresponds to an always-on PDU session.
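- A small sketch of this per-session indication follows, assuming a hypothetical application profile structure; only the idea of marking each requested PDU session individually is illustrated.

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class PduSessionRequest:
    pdu_session_id: int
    always_on_requested: bool  # indicated per PDU session, as described above

def build_requests(app_profiles: Dict[int, dict]) -> List[PduSessionRequest]:
    """Sketch: the UE marks each requested PDU session individually, since not every
    session it establishes needs to be an always-on PDU session."""
    return [PduSessionRequest(psi, profile.get("always_on", False))
            for psi, profile in app_profiles.items()]

# Hypothetical configuration: only the URLLC-related session asks for always-on.
requests = build_requests({1: {"always_on": True, "dnn": "urllc"},
                           2: {"always_on": False, "dnn": "internet"}})
for r in requests:
    print(r)
```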
- FIGS. 11A to 11D are exemplary views showing a screen of a terminal according to an implementation of the first scheme of the third disclosure of the present specification.
- When the always-on PDU session setting information is received from the network, or is changed and received, in process 0 shown in FIG. 9 or FIG. 10, the terminal may display information indicating this (for example, an indicator) on the display unit 1041, as shown in FIG. 11A.
- Alternatively, the terminal may display an information screen (e.g., an icon, a notification window, etc.) notifying this on the display unit 1041, as shown in FIG. 11B.
- Such information or information screens may have different shapes, display types (eg, blinking), colors, and the like according to the number of applications associated with the always-on PDU session.
- the shape, display type (eg, blinking), color, etc. of the screen may be different according to the number of applications associated with the always-on PDU session to which the terminal can connect.
- the terminal may transmit a response message to the network after receiving the always-on PDU session establishment information.
- the response message may be a Configuration Update Complete message.
- the terminal may display information as shown in FIG. 11A or a screen as shown in FIG. 11B.
- In step 1 shown in FIG. 9 or FIG. 10, when the application layer performs triggering, the terminal may display information or a screen (e.g., an icon, a notification window, etc.) indicating that establishment/modification of the always-on PDU session is in progress on the display unit 1041.
- In step 3 shown in FIG. 9, when information indicating that an always-on PDU session is not supported is included in a message received from a network node (e.g., AMF), the terminal may display information or a screen indicating that the always-on PDU session is not supported on the display unit 1041, as shown in the corresponding figure.
- The information or screen may include such information, and may also include information on the remaining time until a retry is possible, based on the back-off timer.
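- For illustration, the remaining-time text could be derived from the back-off timer as in the following sketch; the wording and formatting shown are assumptions, not part of the specification.

```python
import time

def remaining_retry_text(backoff_expires_at: float) -> str:
    """Sketch of how the screen could derive the 'time until retry' text from the
    back-off timer expiry; the message wording is illustrative only."""
    remaining = max(0, int(backoff_expires_at - time.time()))
    if remaining == 0:
        return "Always-on PDU session can be retried now"
    minutes, seconds = divmod(remaining, 60)
    return f"Always-on PDU session retry possible in {minutes:02d}:{seconds:02d}"

print(remaining_retry_text(time.time() + 125))  # e.g. "... retry possible in 02:05"
```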
- In step 3 shown in FIG. 10, if information indicating that the always-on PDU session is not supported is included in the rejection message received from the network node (e.g., AMF), the terminal may display information or a screen (e.g., an icon, a notification window, etc.) indicating that establishment/modification of the always-on PDU session has been rejected on the display unit 1041, as shown in the corresponding figure.
- the information or screen may include such information.
- FIG. 12 is a block diagram of a configuration of a terminal in which an embodiment presented in the present specification is implemented.
- The terminal (or wireless device) 100 may include an always-on PDU session use unit 1021, an always-on PDU session management unit 1022, and an always-on PDU session setting information management unit 1023.
- The always-on PDU session use unit 1021, the always-on PDU session management unit 1022, and the always-on PDU session setting information management unit 1023 may be included in the processor 1020a of FIG. 13, the processor 1020 of FIG. 15, and the processor 1020 of FIG. 16.
- the always on PDU session use unit 1021 may include one or more applications or services.
- the always on PDU session use unit may transmit and receive data through the established always on PDU session.
- the always on PDU session management unit 1022 manages establishment/release of the always on PDU session.
- the always on PDU session management unit 1022 may display the above-described screen to establish the always on PDU session.
- the always on PDU session management unit 1022 may display the above-described screen.
- the always on PDU session management unit 1022 may transmit an input from the user to the always on PDU session setting information management unit 1023.
- the always on PDU session setting information management unit 1023 may store information received from a network and transfer the received information to the always on PDU session management unit 1022. In addition, the always on PDU session setting information management unit 1023 may receive and store and manage an input from a user from the always on PDU session management unit 1022.
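- A structural sketch of these three units follows; the class names and methods are illustrative only and do not prescribe an implementation.

```python
class AlwaysOnPduSessionSettingInfoManager:
    """Stores always-on configuration received from the network and user inputs
    (corresponds to unit 1023 in the description; a structural sketch only)."""
    def __init__(self):
        self.network_config = {}   # e.g. {app_name: True/False}
        self.user_config = {}

    def store_network_info(self, info: dict):
        self.network_config.update(info)

    def store_user_input(self, info: dict):
        self.user_config.update(info)

class AlwaysOnPduSessionManager:
    """Manages establishment/release of the always-on PDU session and the related
    screens (corresponds to unit 1022)."""
    def __init__(self, settings: AlwaysOnPduSessionSettingInfoManager):
        self.settings = settings
        self.established = False

    def show_screen(self, text: str):
        print(f"[display 1041] {text}")

    def on_user_input(self, info: dict):
        self.settings.store_user_input(info)

class AlwaysOnPduSessionUser:
    """Applications/services exchanging data over the established session (unit 1021)."""
    def __init__(self, manager: AlwaysOnPduSessionManager):
        self.manager = manager

    def send(self, payload: bytes):
        if self.manager.established:
            print(f"sending {len(payload)} bytes over the always-on PDU session")

settings = AlwaysOnPduSessionSettingInfoManager()
manager = AlwaysOnPduSessionManager(settings)
manager.show_screen("Always-on PDU session established")
```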
- FIG. 13 illustrates a wireless communication system according to an embodiment.
- the wireless communication system may include a first device 100a and a second device 100b.
- The first device 100a may be a base station, a network node, a transmitting terminal, a receiving terminal, a wireless device, a wireless communication device, a vehicle, a vehicle equipped with an autonomous driving function, a connected car, a drone (Unmanned Aerial Vehicle, UAV), an AI (Artificial Intelligence) module, a robot, an Augmented Reality (AR) device, a Virtual Reality (VR) device, a Mixed Reality (MR) device, a hologram device, a public safety device, an MTC device, an IoT device, a medical device, a fintech device (or financial device), a security device, a climate/environment device, a device related to 5G services, or a device related to the fourth industrial revolution field.
- The second device 100b may be a base station, a network node, a transmitting terminal, a receiving terminal, a wireless device, a wireless communication device, a vehicle, a vehicle equipped with an autonomous driving function, a connected car, a drone (Unmanned Aerial Vehicle, UAV), an AI (Artificial Intelligence) module, a robot, an Augmented Reality (AR) device, a Virtual Reality (VR) device, a Mixed Reality (MR) device, a hologram device, a public safety device, an MTC device, an IoT device, a medical device, a fintech device (or financial device), a security device, a climate/environment device, a device related to 5G services, or a device related to the fourth industrial revolution field.
- For example, the terminal may be a mobile phone, a smart phone, a laptop computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), a navigation system, a slate PC, a tablet PC, an ultrabook, or a wearable device (for example, a watch-type terminal (smartwatch), a glass-type terminal (smart glass), or a head mounted display (HMD)).
- the HMD may be a display device worn on the head.
- HMD can be used to implement VR, AR or MR.
- For example, a drone may be an unmanned flying vehicle that flies by a radio control signal.
- the VR device may include a device that implements an object or a background of a virtual world.
- For example, the AR device may include a device that implements an object or background of a virtual world by connecting it to an object or background of the real world.
- For example, the MR device may include a device that implements an object or background of a virtual world by merging it with an object or background of the real world.
- For example, the hologram device may include a device that implements a 360-degree stereoscopic image by recording and reproducing stereoscopic information, utilizing the interference phenomenon of light generated when two laser beams meet, which is called holography.
- the public safety device may include an image relay device or an image device wearable on a user's human body.
- the MTC device and the IoT device may be devices that do not require direct human intervention or manipulation.
- For example, the MTC device and the IoT device may include a smart meter, a vending machine, a thermometer, a smart light bulb, a door lock, or various sensors.
- the medical device may be a device used for the purpose of diagnosing, treating, alleviating, treating or preventing a disease.
- the medical device may be a device used for the purpose of diagnosing, treating, alleviating or correcting an injury or disorder.
- a medical device may be a device used for the purpose of examining, replacing or modifying a structure or function.
- the medical device may be a device used for the purpose of controlling pregnancy.
- For example, the medical device may include a device for treatment, a device for surgery, a device for (extra-corporeal) diagnosis, a hearing aid, or a device for a medical procedure.
- the security device may be a device installed to prevent a risk that may occur and maintain safety.
- the security device may be a camera, CCTV, recorder, or black box.
- the fintech device may be a device capable of providing financial services such as mobile payment.
- For example, the fintech device may include a payment device or a point of sale (POS) device.
- the climate/environment device may include a device that monitors or predicts the climate/environment.
- the first device 100a may include at least one or more processors such as the processor 1020a, at least one or more memories such as the memory 1010a, and at least one or more transceivers such as the transceiver 1031a.
- the processor 1020a may perform the functions, procedures, and/or methods described above.
- the processor 1020a may perform one or more protocols.
- the processor 1020a may perform one or more layers of a radio interface protocol.
- the memory 1010a is connected to the processor 1020a and may store various types of information and/or commands.
- the transceiver 1031a may be connected to the processor 1020a and controlled to transmit and receive radio signals.
- the second device 100b may include at least one processor such as a processor 1020b, at least one memory device such as a memory 1010b, and at least one transceiver such as a transceiver 1031b.
- the processor 1020b may perform the functions, procedures, and/or methods described above.
- the processor 1020b may implement one or more protocols.
- the processor 1020b may implement one or more layers of a radio interface protocol.
- the memory 1010b is connected to the processor 1020b and may store various types of information and/or commands.
- the transceiver 1031b may be connected to the processor 1020b and controlled to transmit and receive radio signals.
- the memory 1010a and/or the memory 1010b may be respectively connected inside or outside the processor 1020a and/or the processor 1020b, or other processors through various technologies such as wired or wireless connection. It can also be connected to.
- the first device 100a and/or the second device 100b may have one or more antennas.
- the antenna 1036a and/or the antenna 1036b may be configured to transmit and receive wireless signals.
- FIG. 14 illustrates a block diagram of a network node according to an embodiment.
- FIG. 14 illustrates the network node of FIG. 13 in more detail for the case in which the base station is divided into a central unit (CU) and a distributed unit (DU).
- the base stations W20 and W30 may be connected to the core network W10, and the base station W30 may be connected to the neighboring base station W20.
- the interface between the base stations W20 and W30 and the core network W10 may be referred to as NG, and the interface between the base station W30 and the neighboring base stations W20 may be referred to as Xn.
- the base station W30 may be divided into a CU (W32) and DU (W34, W36). That is, the base station W30 may be hierarchically separated and operated.
- the CU (W32) may be connected to one or more DUs (W34, W36), for example, the interface between the CU (W32) and the DU (W34, W36) may be referred to as F1.
- the CU (W32) may perform the function of upper layers of the base station, and the DUs (W34, W36) may perform the function of lower layers of the base station.
- The CU (W32) may be a logical node that hosts the radio resource control (RRC), service data adaptation protocol (SDAP), and packet data convergence protocol (PDCP) layers of a base station (e.g., gNB).
- the DU (W34, W36) may be a logical node hosting a radio link control (RLC), a media access control (MAC), and a physical (PHY) layer of the base station.
- the CU (W32) may be a logical node hosting the RRC and PDCP layers of the base station (eg, en-gNB).
- One DU (W34, W36) may support one or more cells. One cell can be supported by only one DU (W34, W36).
- One DU (W34, W36) may be connected to one CU (W32), and one DU (W34, W36) may be connected to a plurality of CUs by appropriate implementation.
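- The CU/DU relationships described above can be sketched structurally as follows, including the constraints that a CU may connect to one or more DUs over F1 and that a cell is supported by only one DU; the identifiers mirror the reference labels used above, but the code itself is only an illustration.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class DU:
    du_id: str
    cells: List[str] = field(default_factory=list)   # a cell is served by only one DU
    layers: tuple = ("RLC", "MAC", "PHY")             # lower layers hosted by the DU

@dataclass
class CU:
    cu_id: str
    dus: List[DU] = field(default_factory=list)       # one CU may connect to one or more DUs over F1
    layers: tuple = ("RRC", "SDAP", "PDCP")            # upper layers hosted by the CU

    def add_du(self, du: DU) -> None:
        # Enforce the "one cell is supported by only one DU" constraint within this CU.
        existing = {c for d in self.dus for c in d.cells}
        if existing & set(du.cells):
            raise ValueError("a cell may be supported by only one DU")
        self.dus.append(du)

cu = CU("W32")
cu.add_du(DU("W34", cells=["cell-1"]))
cu.add_du(DU("W36", cells=["cell-2", "cell-3"]))
print([d.du_id for d in cu.dus])
```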
- FIG. 15 is a block diagram showing a configuration of a terminal according to an embodiment.
- FIG. 15 illustrates the terminal of FIG. 13 in more detail.
- The terminal includes a memory 1010, a processor 1020, a transceiver 1031, a power management module 1091, a battery 1092, a display 1041, an input unit 1053, a speaker 1042, a microphone 1052, a SIM (subscriber identification module) card, and one or more antennas.
- the processor 1020 may be configured to implement the proposed functions, procedures and/or methods described herein. Layers of the air interface protocol may be implemented in the processor 1020.
- the processor 1020 may include an application-specific integrated circuit (ASIC), another chipset, a logic circuit, and/or a data processing device.
- the processor 1020 may be an application processor (AP).
- the processor 1020 may include at least one of a digital signal processor (DSP), a central processing unit (CPU), a graphics processing unit (GPU), and a modem (modulator and demodulator).
- Examples of the processor 1020 include SNAPDRAGON™ series processors manufactured by Qualcomm®, EXYNOS™ series processors manufactured by Samsung®, A series processors manufactured by Apple®, HELIO™ series processors manufactured by MediaTek®, ATOM™ series processors manufactured by INTEL®, or corresponding next-generation processors.
- the power management module 1091 manages power for the processor 1020 and/or the transceiver 1031.
- the battery 1092 supplies power to the power management module 1091.
- the display 1041 outputs the result processed by the processor 1020.
- the input unit 1053 receives an input to be used by the processor 1020.
- the input unit 1053 may be displayed on the display 1041.
- A SIM card is an integrated circuit used to securely store the international mobile subscriber identity (IMSI), which identifies and authenticates a subscriber on mobile telephony devices such as mobile phones and computers, and its associated key. Many SIM cards can also store contact information.
- The memory 1010 is operatively coupled to the processor 1020 and stores various pieces of information for operating the processor 1020.
- The memory 1010 may include read-only memory (ROM), random access memory (RAM), flash memory, a memory card, a storage medium, and/or any other storage device that stores instructions.
- modules may be stored in memory 1010 and executed by processor 1020.
- the memory 1010 may be implemented inside the processor 1020. Alternatively, the memory 1010 may be implemented outside the processor 1020 and may be communicatively connected to the processor 1020 through various means known in the art.
- the transceiver 1031 is operatively coupled to the processor 1020, and transmits and/or receives a radio signal.
- the transceiver 1031 includes a transmitter and a receiver.
- the transceiver 1031 may include a baseband circuit for processing radio frequency signals.
- the transceiver unit controls one or more antennas to transmit and/or receive radio signals.
- the processor 1020 transmits command information to the transmission/reception unit 1031 to transmit, for example, a radio signal constituting voice communication data in order to initiate communication.
- the antenna functions to transmit and receive radio signals.
- Upon reception, the transceiver 1031 may forward the received signal for processing by the processor 1020 and convert the signal into baseband.
- the processed signal may be converted into audible or readable information output through the speaker 1042.
- the speaker 1042 outputs a sound-related result processed by the processor 1020.
- the microphone 1052 receives a sound related input to be used by the processor 1020.
- the user inputs command information such as a telephone number, for example, by pressing (or touching) a button of the input unit 1053 or by voice activation using the microphone 1052.
- the processor 1020 receives the command information and processes to perform an appropriate function, such as dialing a phone number. Operational data may be extracted from the SIM card or the memory 1010. In addition, the processor 1020 may display command information or driving information on the display 1041 for user recognition and convenience.
- FIG. 16 is a block diagram showing the configuration of the terminal illustrated in FIG. 15 in more detail.
- The terminal 100 may include a transmission/reception unit 1030, a processor 1020, a memory 1030, a sensing unit 1060, an output unit 1040, an interface unit 1090, an input unit 1050, a power supply unit 1080, and the like. Since the components shown in FIG. 16 are not essential for implementing the terminal, the terminal described in the present specification may have more or fewer components than those listed above.
- the transmission/reception unit 1030 includes wireless communication between the terminal 100 and a wireless communication system, between the terminal 100 and another terminal 100, or between the terminal 100 and an external server. It may include one or more modules that enable it. In addition, the transmission/reception unit 1030 may include one or more modules that connect the terminal 100 to one or more networks.
- the transmission/reception unit 1030 may include at least one of a broadcast reception unit 1032, a mobile communication transmission/reception unit 1031, a wireless Internet transmission/reception unit 1033, a short range communication unit 1034, and a location information module 1150. .
- The input unit 1050 includes a camera 1051 or an image input unit for inputting a video signal, a microphone 1052 or an audio input unit for inputting an audio signal, and a user input unit 1053 (for example, a touch key, a mechanical key, etc.) for receiving information from a user.
- the voice data or image data collected by the input unit 1050 may be analyzed and processed as a user's control command.
- the sensing unit 1060 may include one or more sensors for sensing at least one of information in the mobile terminal, information on surrounding environments surrounding the mobile terminal, and user information.
- For example, the sensing unit 1060 may include at least one of a proximity sensor 1061, an illumination sensor 1062, a touch sensor, an acceleration sensor, a magnetic sensor, a gravity sensor (G-sensor), a gyroscope sensor, a motion sensor, an RGB sensor, an infrared (IR) sensor, a fingerprint sensor, an ultrasonic sensor, an optical sensor (e.g., the camera 1051), a microphone (see 1052), a battery gauge, an environmental sensor (e.g., a barometer, a hygrometer, a thermometer, a radiation detection sensor, a heat sensor, a gas sensor, etc.), and a chemical sensor (e.g., an electronic nose, a healthcare sensor, a biometric sensor, etc.).
- the mobile terminal disclosed in the present specification may combine and utilize information sensed by at least two or more of these sensors.
- The output unit 1040 is for generating an output related to the visual, auditory, or tactile sense, and includes at least one of the display unit 1041, the sound output unit 1042, the haptic output unit 1043, and the light output unit 1044.
- the display unit 1041 may form a layer structure with the touch sensor or be integrally formed, thereby implementing a touch screen.
- Such a touch screen may function as a user input unit 1053 that provides an input interface between the terminal 100 and a user, and may provide an output interface between the terminal 100 and a user.
- the interface unit 1090 serves as a passage for various types of external devices connected to the terminal 100.
- The interface unit 1090 may include at least one of a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device equipped with an identification module, an audio input/output (I/O) port, a video input/output (I/O) port, and an earphone port.
- the terminal 100 may perform appropriate control related to the connected external device.
- the memory 1030 stores data supporting various functions of the terminal 100.
- the memory 1030 may store a plurality of application programs (application programs or applications) driven by the terminal 100, data for operation of the terminal 100, and commands. At least some of these application programs may be downloaded from an external server through wireless communication. In addition, at least some of these application programs may exist on the terminal 100 from the time of shipment for basic functions of the terminal 100 (eg, incoming calls, outgoing functions, message reception, and outgoing functions). Meanwhile, the application program may be stored in the memory 1030, installed on the terminal 100, and driven by the processor 1020 to perform an operation (or function) of the mobile terminal.
- In addition to operations related to the application programs, the processor 1020 generally controls the overall operation of the terminal 100.
- the processor 1020 may provide or process appropriate information or functions to a user by processing signals, data, information, etc. input or output through the above-described components, or driving an application program stored in the memory 1030.
- the processor 1020 may control at least some of the components described with reference to FIG. XX in order to drive the application program stored in the memory 1030. Further, in order to drive the application program, the processor 1020 may operate by combining at least two or more of the components included in the terminal 100 with each other.
- the power supply unit 1080 receives external power and internal power under the control of the processor 1020 and supplies power to each of the components included in the terminal 100.
- the power supply unit 1080 includes a battery, and the battery may be a built-in battery or a replaceable battery.
- At least some of the components may operate in cooperation with each other to implement an operation, control, or control method of a mobile terminal according to various embodiments described below.
- the operation, control, or control method of the mobile terminal may be implemented on the mobile terminal by driving at least one application program stored in the memory 1030.
- the broadcast reception unit 1032 of the transmission/reception unit 1030 receives a broadcast signal and/or broadcast-related information from an external broadcast management server through a broadcast channel.
- the broadcast channel may include a satellite channel and a terrestrial channel.
- Two or more broadcast receiving modules may be provided to the mobile terminal 100 for simultaneous broadcast reception or broadcast channel switching of at least two broadcast channels.
- The mobile communication transmission/reception unit 1031 transmits and receives radio signals to and from at least one of a base station, an external terminal, and a server on a mobile communication network established according to technical standards or communication methods for mobile communication (for example, GSM (Global System for Mobile communication), CDMA (Code Division Multi Access), CDMA2000 (Code Division Multi Access 2000), EV-DO (Enhanced Voice-Data Optimized or Enhanced Voice-Data Only), WCDMA (Wideband CDMA), HSDPA (High Speed Downlink Packet Access), HSUPA (High Speed Uplink Packet Access), LTE (Long Term Evolution), LTE-A (Long Term Evolution-Advanced), and 3GPP NR (New Radio access technology)).
- the wireless signal may include a voice call signal, a video call signal, or various types of data according to transmission/reception of text/multimedia messages.
- the wireless Internet transmission/reception unit 1033 refers to a module for wireless Internet access, and may be built-in or external to the terminal 100.
- the wireless Internet transceiver 1033 is configured to transmit and receive wireless signals in a communication network according to wireless Internet technologies.
- wireless Internet technologies include WLAN (Wireless LAN), Wi-Fi (Wireless-Fidelity), Wi-Fi (Wireless Fidelity) Direct, DLNA (Digital Living Network Alliance), WiBro (Wireless Broadband), WiMAX (World Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), HSUPA (High Speed Uplink Packet Access), LTE (Long Term Evolution), LTE-A (Long Term Evolution-Advanced), 3GPP NR, etc.
- the Internet transmission/reception unit 1033 transmits and receives data according to at least one wireless Internet technology in a range including Internet technologies not listed above.
- the transmission/reception unit 1033 may be understood as a kind of the mobile communication transmission/reception unit 1031.
- the short range communication unit 1034 is for short range communication, and includes Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, NFC ( Near Field Communication), Wi-Fi (Wireless-Fidelity), Wi-Fi Direct, and Wireless Universal Serial Bus (USB) technologies may be used to support short-range communication.
- The short-range communication unit 1034 may support, through wireless area networks, wireless communication between the terminal 100 and a wireless communication system, between the terminal 100 and another terminal 100, or between the terminal 100 and a network in which another mobile terminal 100 (or an external server) is located.
- The short-range wireless communication network may be a wireless personal area network.
- Here, the other terminal 100 may be a wearable device capable of exchanging data with (or interworking with) the terminal 100 according to the present invention, for example, a smartwatch, smart glasses, a neckband, or a head mounted display (HMD).
- the short-range communication unit 1034 may detect (or recognize) a wearable device capable of communicating with the terminal 100 in the vicinity of the terminal 100.
- In this case, the processor 1020 may transmit at least a part of the data processed by the terminal 100 to the wearable device through the short-range communication unit 1034.
- Accordingly, a user of the wearable device can use the data processed by the terminal 100 through the wearable device. For example, when a call is received by the terminal 100, the user can perform a phone call through the wearable device, or when a message is received by the terminal 100, the user can check the received message through the wearable device.
- In addition, screen mirroring with a TV located in a house or with a display inside a vehicle may be performed through the short-range communication unit 1034, and the corresponding function is performed based on, for example, the MirrorLink or Miracast standard, so that a TV or a display inside a vehicle can be used with the terminal 100.
- the location information module 1150 is a module for obtaining a location (or current location) of a mobile terminal, and representative examples thereof include a GPS (Global Positioning System) module or a WiFi (Wireless Fidelity) module.
- For example, when the mobile terminal utilizes the Wi-Fi module, it may acquire its location based on information of a wireless access point (AP) that transmits or receives wireless signals to or from the Wi-Fi module.
- If necessary, the location information module 1150 may perform a function of another module of the transmission/reception unit 1030, substitutionally or additionally, in order to obtain data on the location of the mobile terminal.
- the location information module 1150 is a module used to obtain the location (or current location) of the mobile terminal, and is not limited to a module that directly calculates or obtains the location of the mobile terminal.
- Each of the broadcast reception unit 1032, the mobile communication transmission/reception unit 1031, the short range communication unit 1034, and the location information module 1150 may be implemented as separate modules that perform a corresponding function, or the broadcast reception unit 1032, mobile communication Functions corresponding to two or more of the transmission/reception unit 1031, the short range communication unit 1034, and the location information module 1150 may be implemented by one module.
- the input unit 1050 is for inputting image information (or signal), audio information (or signal), data, or information input from a user.
- For inputting image information, the terminal 100 may be provided with one or a plurality of cameras 1051.
- the camera 1051 processes an image frame such as a still image or a moving picture obtained by an image sensor in a video call mode or a photographing mode.
- the processed image frame may be displayed on the display unit 1041 or stored in the memory 1030.
- Meanwhile, the plurality of cameras 1051 provided in the terminal 100 may be arranged to form a matrix structure, and through the cameras 1051 forming the matrix structure, a plurality of pieces of image information having various angles or focuses may be input to the terminal 100.
- the plurality of cameras 1051 may be arranged in a stereo structure to obtain a left image and a right image for implementing a stereoscopic image.
- the microphone 1052 processes an external sound signal into electrical voice data.
- the processed voice data may be utilized in various ways according to a function (or an application program being executed) being executed by the terminal 100. Meanwhile, the microphone 1052 may be implemented with various noise removal algorithms for removing noise generated in the process of receiving an external sound signal.
- the user input unit 1053 is for receiving information from a user. When information is input through the user input unit 1053, the processor 1020 may control the operation of the terminal 100 to correspond to the input information.
- The user input unit 1053 may include a mechanical input means (or a mechanical key, for example, a button located on the front, rear, or side of the terminal 100, a dome switch, a jog wheel, a jog switch, etc.) and a touch-type input means.
- As an example, the touch-type input means may consist of a virtual key, a soft key, or a visual key displayed on the touch screen through software processing, or of a touch key disposed on a portion other than the touch screen.
- Meanwhile, the virtual key or visual key can be displayed on the touch screen in various forms, for example, a graphic, text, an icon, a video, or a combination thereof.
- the sensing unit 1060 senses at least one of information in the mobile terminal, information on surrounding environments surrounding the mobile terminal, and user information, and generates a sensing signal corresponding thereto.
- the processor 1020 may control driving or operation of the terminal 100 or perform data processing, functions, or operations related to an application program installed in the terminal 100 based on such a sensing signal. Representative sensors among various sensors that may be included in the sensing unit 1060 will be described in more detail.
- the proximity sensor 1061 refers to a sensor that detects the presence or absence of an object approaching a predetermined detection surface or an object existing in the vicinity using the force of an electromagnetic field or infrared light without mechanical contact.
- the proximity sensor 1061 may be disposed in an inner area of the mobile terminal surrounded by the touch screen described above or near the touch screen.
- Examples of the proximity sensor 1061 include a transmissive photoelectric sensor, a direct reflective photoelectric sensor, a mirror reflective photoelectric sensor, a high-frequency oscillation proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor.
- the proximity sensor 1061 may be configured to detect the proximity of the object with a change in the electric field according to the proximity of the conductive object. In this case, the touch screen (or touch sensor) itself may be classified as a proximity sensor.
- For convenience of description, the action of allowing an object to be recognized as being positioned on the touch screen by approaching the touch screen without contacting it is referred to as a "proximity touch", and the action of actually contacting an object on the touch screen is referred to as a "contact touch".
- a position at which an object is touched in proximity on the touch screen means a position at which the object is vertically corresponding to the touch screen when the object is touched in proximity.
- On the other hand, the proximity sensor 1061 may detect a proximity touch and a proximity touch pattern (e.g., proximity touch distance, proximity touch direction, proximity touch speed, proximity touch time, proximity touch position, proximity touch movement state, etc.).
- Meanwhile, the processor 1020 processes data (or information) corresponding to the proximity touch operation and the proximity touch pattern sensed through the proximity sensor 1061 and, furthermore, may output visual information corresponding to the processed data on the touch screen. In addition, the processor 1020 may control the terminal 100 to process different operations or data (or information) according to whether a touch on the same point of the touch screen is a proximity touch or a contact touch.
- The touch sensor detects a touch (or touch input) applied to the touch screen (or the display unit 1041) using at least one of various touch methods such as a resistive film method, a capacitive method, an infrared method, an ultrasonic method, and a magnetic field method.
- the touch sensor may be configured to convert a pressure applied to a specific portion of the touch screen or a change in capacitance generated at a specific portion into an electrical input signal.
- the touch sensor may be configured to detect a location, an area, a pressure upon touch, a capacitance upon touch, and the like at which a touch object applying a touch on the touch screen is touched on the touch sensor.
- the touch object is an object that applies a touch to the touch sensor, and may be, for example, a finger, a touch pen, a stylus pen, or a pointer.
- When there is a touch input to the touch sensor, the corresponding signal(s) is sent to a touch controller; the touch controller processes the signal(s) and then transmits the corresponding data to the processor 1020.
- the processor 1020 can know which area of the display unit 1041 has been touched.
- the touch controller may be a separate component from the processor 1020 or may be the processor 1020 itself.
- the processor 1020 may perform different controls or perform the same control according to the type of the touch object by touching the touch screen (or a touch key provided in addition to the touch screen). Whether to perform different controls or to perform the same control according to the type of the touch object may be determined according to an operating state of the current terminal 100 or an application program being executed.
- The touch sensor and the proximity sensor described above may, independently or in combination, sense various types of touches on the touch screen, such as a short (or tap) touch, a long touch, a multi-touch, a drag touch, a flick touch, a pinch-in touch, a pinch-out touch, a swipe touch, and a hovering touch.
- the ultrasonic sensor may recognize location information of a sensing target by using ultrasonic waves.
- the processor 1020 may calculate the location of the wave generator through information sensed from the optical sensor and a plurality of ultrasonic sensors.
- The location of the wave source may be calculated using the property that light is much faster than ultrasonic waves, that is, the time for light to reach the optical sensor is much shorter than the time for an ultrasonic wave to reach the ultrasonic sensor. More specifically, the position of the wave source may be calculated from the time difference with respect to the arrival time of the ultrasonic wave, with the light used as a reference signal.
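- As a purely numerical illustration of this time-difference idea (not taken from the specification), the following sketch treats the light arrival as the reference instant, converts the measured ultrasonic delay into a distance, and localizes the source in 2D from two sensors; the sensor geometry and the speed of sound are assumptions.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s at room temperature (assumed)

def distance_from_delay(delay_s: float) -> float:
    """Distance from the wave source to an ultrasonic sensor, using the optical
    sensor's (near-instantaneous) light detection as the time-zero reference."""
    return SPEED_OF_SOUND * delay_s

def locate_2d(d1: float, d2: float, baseline: float) -> tuple:
    """Source position assuming sensor 1 at (0, 0) and sensor 2 at (baseline, 0);
    simple two-circle intersection, taking the solution with y >= 0."""
    x = (d1**2 - d2**2 + baseline**2) / (2 * baseline)
    y = math.sqrt(max(d1**2 - x**2, 0.0))
    return x, y

d1 = distance_from_delay(0.0030)  # about 1.03 m
d2 = distance_from_delay(0.0045)  # about 1.54 m
print(locate_2d(d1, d2, baseline=0.5))
```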
- Meanwhile, the camera 1051, described above as a component of the input unit 1050, includes at least one of a camera sensor (e.g., CCD, CMOS, etc.), a photo sensor (or image sensor), and a laser sensor.
- the camera 1051 and the laser sensor are combined with each other to detect a touch of a sensing target for a 3D stereoscopic image.
- The photosensor may be stacked on the display element and is configured to scan the movement of a sensing object close to the touch screen. More specifically, the photosensor scans the content placed on it by mounting photodiodes and transistors (TRs) in rows and columns and using an electrical signal that changes according to the amount of light applied to the photodiodes. That is, the photosensor calculates the coordinates of the sensing object according to the change in the amount of light, and through this, the location information of the sensing object may be obtained.
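- The coordinate calculation from the change in light amount can be illustrated with the following minimal sketch, which simply picks the matrix cell with the largest change; the matrix values and the selection rule are illustrative assumptions.

```python
def locate_from_light_change(delta: list) -> tuple:
    """Return (row, col) of the largest light-amount change in the scanned matrix,
    taken here as the coordinates of the sensing object."""
    best = (0, 0)
    for r, row in enumerate(delta):
        for c, value in enumerate(row):
            if value > delta[best[0]][best[1]]:
                best = (r, c)
    return best

scan = [[0, 1, 0],
        [2, 9, 3],   # strongest change where the object shadows the photodiodes
        [0, 1, 0]]
print(locate_from_light_change(scan))  # (1, 1)
```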
- the display unit 1041 displays (outputs) information processed by the terminal 100.
- the display unit 1041 may display execution screen information of an application program driven in the terminal 100, or UI (User Interface) and GUI (Graphic User Interface) information according to such execution screen information.
- UI User Interface
- GUI Graphic User Interface
- the display unit 1041 may be configured as a three-dimensional display unit that displays a three-dimensional image.
- a three-dimensional display method such as a stereoscopic method (glasses method), an auto stereoscopic method (no glasses method), and a projection method (holographic method) may be applied to the stereoscopic display unit.
- the sound output unit 1042 may output audio data received from the transmission/reception unit 1030 or stored in the memory 1030 in a call signal reception, a call mode or a recording mode, a voice recognition mode, a broadcast reception mode, and the like.
- the sound output unit 1042 also outputs sound signals related to functions (eg, call signal reception sound, message reception sound, etc.) performed by the terminal 100.
- the sound output unit 1042 may include a receiver, a speaker, a buzzer, and the like.
- The haptic output unit 1043 generates various tactile effects that a user can feel.
- a typical example of the tactile effect generated by the haptic output unit 1043 may be vibration.
- the intensity and pattern of vibration generated by the haptic output unit 1043 may be controlled by a user's selection or a processor setting.
- the haptic output unit 1043 may synthesize and output different vibrations or sequentially output them.
- In addition to vibration, the haptic output unit 1043 can generate a variety of tactile effects, such as effects of a pin arrangement moving vertically against the contacted skin surface, a blowing or suction force of air through a nozzle or an inlet, grazing of the skin surface, contact of an electrode, and electrostatic force, as well as effects of reproducing a sense of cold or warmth using an element capable of absorbing or generating heat.
- the haptic output unit 1043 may not only deliver a tactile effect through direct contact, but may also be implemented so that a user can feel the tactile effect through muscle sensations such as a finger or an arm. Two or more haptic output units 1043 may be provided depending on the configuration of the terminal 100.
- the light output unit 1044 outputs a signal for notifying the occurrence of an event using light from a light source of the terminal 100.
- Examples of events occurring in the terminal 100 may include message reception, call signal reception, missed call, alarm, schedule notification, email reception, and information reception through an application.
- the signal output from the light output unit 1044 is implemented as the mobile terminal emits a single color or multiple colors of light to the front or rear.
- the signal output may be terminated when the mobile terminal detects the user's event confirmation.
- the interface unit 1090 serves as a passage for all external devices connected to the terminal 100.
- the interface unit 1090 receives data from an external device or receives power and transmits it to each component inside the terminal 100, or transmits data inside the terminal 100 to an external device.
- For example, a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device equipped with an identification module, an audio input/output (I/O) port, a video input/output (I/O) port, an earphone port, and the like may be included in the interface unit 1090.
- The identification module is a chip that stores various types of information for authenticating the authority to use the terminal 100, and may include a user identity module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM), and the like.
- A device equipped with an identification module (hereinafter, an "identification device") may be manufactured in the form of a smart card. Accordingly, the identification device may be connected to the terminal 100 through the interface unit 1090.
- When the terminal 100 is connected to an external cradle, the interface unit 1090 may serve as a path through which power from the cradle is supplied to the terminal 100, or as a path through which various command signals input by the user from the cradle are transmitted to the terminal 100. The various command signals or the power input from the cradle may operate as signals for recognizing that the terminal 100 is correctly mounted on the cradle.
- the memory 1030 may store a program for the operation of the processor 1020 and may temporarily store input/output data (eg, a phone book, a message, a still image, a video, etc.).
- the memory 1030 may store data related to vibrations and sounds of various patterns output when a touch input on the touch screen is performed.
- The memory 1030 may include at least one type of storage medium among a flash memory type, a hard disk type, a solid state disk (SSD) type, a silicon disk drive (SDD) type, a multimedia card micro type, a card-type memory (e.g., SD or XD memory), random access memory (RAM), static random access memory (SRAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), magnetic memory, a magnetic disk, and an optical disk.
- the terminal 100 may be operated in connection with a web storage that performs a storage function of the memory 1030 over the Internet.
- the processor 1020 controls an operation related to an application program and, in general, an overall operation of the terminal 100. For example, when the state of the mobile terminal satisfies a set condition, the processor 1020 may execute or release a lock state limiting input of a user's control command for applications.
- The processor 1020 performs control and processing related to voice calls, data communication, and video calls, or may perform pattern recognition processing capable of recognizing handwriting input or drawing input performed on the touch screen as characters and images, respectively. Furthermore, the processor 1020 may control any one of, or a combination of, the components described above in order to implement the various embodiments described below on the terminal 100 according to the present invention.
- the power supply unit 1080 receives external power and internal power under the control of the processor 1020 and supplies power necessary for the operation of each component.
- the power supply unit 1080 includes a battery, and the battery may be a built-in battery configured to be rechargeable, and may be detachably coupled to a terminal body for charging or the like.
- the power supply unit 1080 may include a connection port, and the connection port may be configured as an example of an interface 1090 to which an external charger that supplies power for charging a battery is electrically connected.
- the power supply unit 1080 may be configured to charge the battery in a wireless manner without using the connection port.
- In this case, the power supply unit 1080 may receive power delivered from an external wireless power transmitter using at least one of an inductive coupling method based on the magnetic induction phenomenon and a magnetic resonance coupling method based on the electromagnetic resonance phenomenon.
- various embodiments may be implemented in a recording medium that can be read by a computer or a similar device using, for example, software, hardware, or a combination thereof.
- the mobile terminal can be extended to a wearable device that can be worn on the body, beyond a device that the user mainly holds and uses in the hand.
- wearable devices include a smart watch, smart glasses, and a head mounted display (HMD).
- the wearable device may be configured to exchange (or interlock) data with another terminal 100.
- the short-range communication unit 1034 may detect (or recognize) a wearable device capable of communicating in the vicinity of the terminal 100. Furthermore, when the detected wearable device is a device authenticated to communicate with the terminal 100, the processor 1020 may transmit at least part of the data processed by the terminal 100 to the wearable device through the short-range communication unit 1034. Accordingly, the user can use the data processed by the terminal 100 through the wearable device. For example, when a call is received by the terminal 100, the phone call may be performed through the wearable device, or when a message is received by the terminal 100, the received message may be checked through the wearable device.
- the present invention described above can be implemented as a computer-readable code in a medium on which a program is recorded.
- the computer-readable medium includes all types of recording devices that store data readable by a computer system. Examples of computer-readable media include an HDD (Hard Disk Drive), SSD (Solid State Disk), SDD (Silicon Disk Drive), ROM, RAM, CD-ROM, magnetic tape, floppy disk, and optical data storage device, and also include implementation in the form of a carrier wave (e.g., transmission over the Internet). The computer may also include the processor 1020 of the terminal. Therefore, the detailed description above should not be construed as restrictive in all respects but should be considered illustrative. The scope of the present invention should be determined by reasonable interpretation of the appended claims, and all changes within the equivalent scope of the present invention are included in the scope of the present invention.
- the illustrated UE (User Equipment) may be referred to as a terminal, a mobile equipment (ME), a wireless device, a mobile terminal, and the like.
- the UE may be a portable device such as a notebook computer, a mobile phone, a PDA, a smart phone, or a multimedia device, or may be a non-portable device such as a PC or a vehicle-mounted device.
- such wireless devices (or mobile terminals) include mobile phones, smart phones, laptop computers, digital broadcasting terminals, personal digital assistants (PDAs), portable multimedia players (PMPs), navigation devices, slate PCs, tablet PCs, ultrabooks, and wearable devices (e.g., smartwatches, smart glasses, and head mounted displays (HMDs)).
- the always-on PDU session for URLLC having low-latency characteristics may be used for artificial intelligence, robots, autonomous driving, extended reality, and the like among the 5G scenarios below.
- FIG. 17 shows an example of a 5G usage scenario.
- the 5G usage scenario shown in FIG. 17 is merely exemplary, and the technical features of the present invention can be applied to other 5G usage scenarios not shown in FIG. 17.
- the three main requirement areas of 5G include (1) the enhanced mobile broadband (eMBB) area, (2) the massive machine type communication (mMTC) area, and (3) the ultra-reliable and low latency communications (URLLC) area. Some use cases may require multiple areas for optimization, while other use cases may focus on only one key performance indicator (KPI). 5G supports these various use cases in a flexible and reliable way.
- eMBB focuses on the overall improvement of data rate, latency, user density, capacity and coverage of mobile broadband access.
- eMBB targets a throughput of around 10Gbps.
- eMBB goes far beyond basic mobile Internet access, covering rich interactive work, media and entertainment applications in the cloud or augmented reality.
- Data is one of the key drivers of 5G, and in the 5G era dedicated voice services may disappear for the first time.
- voice is expected to be processed as an application program simply using the data connection provided by the communication system.
- the main reason for the increased traffic volume is an increase in content size and an increase in the number of applications requiring high data rates.
- Streaming services (audio and video), interactive video, and mobile Internet connections will become more prevalent as more devices connect to the Internet.
- Cloud storage and applications are increasing rapidly in mobile communication platforms, which can be applied to both work and entertainment.
- Cloud storage is a special use case that drives the growth of uplink data rates.
- 5G is also used for remote work in the cloud and requires much lower end-to-end latency to maintain a good user experience when tactile interfaces are used.
- cloud gaming and video streaming are another key factor demanding improvements in mobile broadband capabilities.
- Entertainment is essential on smartphones and tablets anywhere, including in highly mobile environments such as trains, cars and airplanes.
- Another use case is augmented reality and information retrieval for entertainment.
- augmented reality requires very low latency and an instantaneous amount of data.
- the mMTC is designed to enable communication between a large number of low-cost, battery-powered devices, and is intended to support applications such as smart metering, logistics, field and body sensors.
- the mMTC targets a battery life of about 10 years and/or a density of about 1 million devices per km2.
- the mMTC enables seamless connection of embedded sensors in all fields to form a sensor network, and is one of the most anticipated 5G use cases.
- IoT devices are predicted to reach 20.4 billion by 2020.
- Smart networks using industrial IoT are one of the areas in which 5G plays a major role in enabling smart cities, asset tracking, smart utilities, agriculture and security infrastructure.
- URLLC enables devices and machines to communicate with very high reliability, very low latency, and high availability, making it ideal for communication and control between autonomous vehicles, industrial control, factory automation, mission-critical applications such as telesurgery and healthcare, smart grids, and public safety applications.
- URLLC aims for a latency on the order of 1 ms.
- URLLC includes new services that will transform the industry through high-reliability/ultra-low latency links such as remote control of key infrastructure and autonomous vehicles. The level of reliability and delay is essential for smart grid control, industrial automation, robotics, drone control and coordination.
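As a compact summary of the three service categories and the illustrative targets quoted above (about 10 Gbps for eMBB, roughly 1 million devices per km2 and a 10-year battery life for mMTC, and latency on the order of 1 ms for URLLC), the following is a minimal sketch; the data structure, the selector rule, and all names are illustrative assumptions, not figures or logic taken from any 3GPP specification.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ServiceClassKpi:
    """Illustrative KPI targets for one 5G service category (figures from the text above)."""
    name: str
    peak_rate_gbps: Optional[float] = None        # eMBB: around 10 Gbps
    device_density_per_km2: Optional[int] = None  # mMTC: about 1,000,000 devices per km2
    battery_life_years: Optional[int] = None      # mMTC: about 10 years
    latency_ms: Optional[float] = None            # URLLC: on the order of 1 ms

FIVE_G_CLASSES = {
    "eMBB": ServiceClassKpi("eMBB", peak_rate_gbps=10.0),
    "mMTC": ServiceClassKpi("mMTC", device_density_per_km2=1_000_000, battery_life_years=10),
    "URLLC": ServiceClassKpi("URLLC", latency_ms=1.0),
}

def pick_service_class(latency_budget_ms: float, required_rate_gbps: float) -> str:
    """Toy selector: tight latency budgets map to URLLC, high data rates to eMBB, otherwise mMTC."""
    if latency_budget_ms <= FIVE_G_CLASSES["URLLC"].latency_ms:
        return "URLLC"
    if required_rate_gbps >= 1.0:
        return "eMBB"
    return "mMTC"

# Example: a remote-control use case with a 1 ms latency budget maps to URLLC.
print(pick_service_class(latency_budget_ms=1.0, required_rate_gbps=0.1))  # -> URLLC
```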
- 5G can complement fiber-to-the-home (FTTH) and cable-based broadband (or DOCSIS) as a means of providing streams rated from hundreds of megabits per second to gigabits per second.
- Such high speed may be required to deliver TVs in resolutions of 4K or higher (6K, 8K and higher) as well as virtual reality (VR) and augmented reality (AR).
- VR and AR applications mostly involve immersive sports events. Certain applications may require special network configuration. In the case of VR games, for example, the game company may need to integrate the core server with the network operator's edge network server to minimize latency.
- Automotive is expected to be an important new driving force in 5G, with many use cases for mobile communication to vehicles. For example, entertainment for passengers simultaneously demands high capacity and high mobile broadband. The reason is that future users will continue to expect high-quality connections, regardless of their location and speed.
- Another use case in the automotive sector is an augmented reality dashboard.
- the augmented reality dashboard allows the driver to identify objects in the dark on top of what they see through the windshield.
- the augmented reality dashboard superimposes information to inform the driver about the distance and movement of objects.
- wireless modules will enable communication between vehicles, exchange of information between the vehicle and the supporting infrastructure, and exchange of information between the vehicle and other connected devices (eg, devices carried by pedestrians).
- the safety system can lower the risk of accidents by guiding the driver through alternative courses of action to make driving safer.
- the next step will be a remotely controlled vehicle or an autonomous vehicle.
- This requires very reliable and very fast communication between different autonomous vehicles and/or between vehicles and infrastructure.
- autonomous vehicles will perform all driving activities, and drivers will be forced to focus only on traffic anomalies that the vehicle itself cannot identify.
- the technical requirements of autonomous vehicles call for ultra-low latency and ultra-high reliability to increase traffic safety to levels that cannot be achieved by humans.
- Smart cities and smart homes, referred to as the smart society, will be embedded with high-density wireless sensor networks as an example of smart networks.
- a distributed network of intelligent sensors will identify the conditions for cost and energy efficient maintenance of a city or home.
- a similar setup can be done for each household.
- Temperature sensors, window and heating controllers, burglar alarms and appliances are all wirelessly connected. Many of these sensors typically require low data rates, low power and low cost.
- real-time HD video may be required in certain types of devices for surveillance.
- the smart grid interconnects these sensors using digital information and communication technologies to collect information and act accordingly. This information can include the behavior of suppliers and consumers, enabling smart grids to improve efficiency, reliability, economics, sustainability of production and the distribution of fuels such as electricity in an automated manner.
- the smart grid can also be viewed as another low-latency sensor network.
- the health sector has many applications that can benefit from mobile communications.
- the communication system can support telemedicine providing clinical care from remote locations. This can help reduce barriers to distance and improve access to medical services that are not consistently available in remote rural areas. It is also used to save lives in critical care and emergencies.
- a wireless sensor network based on mobile communication may provide remote monitoring and sensors for parameters such as heart rate and blood pressure.
- Wireless and mobile communications are becoming increasingly important in industrial applications. Wiring is expensive to install and maintain, so the possibility of replacing cables with reconfigurable wireless links is an attractive opportunity for many industries. Achieving this, however, requires that the wireless connection operate with latency, reliability, and capacity similar to those of the cable, and that its management be simplified. Low latency and very low error probability are new requirements that need to be addressed by 5G.
- Logistics and cargo tracking is an important use case for mobile communications that enables tracking of inventory and packages from anywhere using a location-based information system. Logistics and freight tracking use cases typically require low data rates, but require a wide range and reliable location information.
- Machine learning refers to the field that studies methodologies for defining and solving various problems dealt with in the field of artificial intelligence.
- Machine learning is also defined as an algorithm that improves the performance of a task through continuous experience.
- An artificial neural network is a model used in machine learning, and may refer to an overall model with problem-solving capabilities composed of artificial neurons (nodes) that form a network through synaptic connections.
- the artificial neural network may be defined by a connection pattern between neurons of different layers, a learning process for updating model parameters, and an activation function for generating an output value.
- the artificial neural network may include an input layer, an output layer, and optionally one or more hidden layers. Each layer includes one or more neurons, and the artificial neural network may include neurons and synapses connecting neurons. In an artificial neural network, each neuron can output a function of an activation function for input signals, weights, and biases input through synapses.
- Model parameters refer to parameters determined through learning, and include weights of synaptic connections and biases of neurons.
- hyperparameters refer to parameters that must be set before learning in a machine learning algorithm, and include a learning rate, iteration count, mini-batch size, and initialization function.
- the purpose of learning artificial neural networks can be seen as determining model parameters that minimize the loss function.
- the loss function can be used as an index to determine an optimal model parameter in the learning process of the artificial neural network.
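To make these terms concrete, the following is a minimal sketch of a two-layer artificial neural network trained by gradient descent: the synaptic weights and biases are the model parameters, the learning rate and iteration count are hyperparameters, the sigmoid is the activation function, and the mean squared error is the loss function being minimized. The network shape, activation, and data are arbitrary illustrative choices, not values from this document.

```python
import numpy as np

def sigmoid(x):
    """Activation function applied to the weighted inputs plus bias of each neuron."""
    return 1.0 / (1.0 + np.exp(-x))

# Model parameters (determined through learning): synaptic weights and neuron biases.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 3)), np.zeros(3)   # input layer (4) -> hidden layer (3)
W2, b2 = rng.normal(size=(3, 1)), np.zeros(1)   # hidden layer (3) -> output layer (1)

# Hyperparameters (set before learning): learning rate and iteration count.
learning_rate, iterations = 0.1, 1000

x = rng.normal(size=(8, 4))                          # training inputs
y = (x.sum(axis=1, keepdims=True) > 0).astype(float) # labels (supervised learning)

for _ in range(iterations):
    # Forward pass: each neuron outputs activation(weights . inputs + bias).
    h = sigmoid(x @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    loss = np.mean((out - y) ** 2)                   # loss function to be minimized

    # Backward pass: gradient descent updates of the model parameters.
    d_out = 2 * (out - y) / len(y) * out * (1 - out)
    d_h = d_out @ W2.T * h * (1 - h)
    W2 -= learning_rate * h.T @ d_out
    b2 -= learning_rate * d_out.sum(axis=0)
    W1 -= learning_rate * x.T @ d_h
    b1 -= learning_rate * d_h.sum(axis=0)

print(f"final loss: {loss:.4f}")
```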
- Machine learning can be classified into supervised learning, unsupervised learning, and reinforcement learning according to the learning method.
- Supervised learning refers to a method of training an artificial neural network when labels for the training data are given, where a label means the correct answer (or result value) that the artificial neural network should infer when the corresponding training data is input.
- Unsupervised learning may refer to a method of training an artificial neural network in a state where a label for training data is not given.
- Reinforcement learning may mean a learning method in which an agent defined in a certain environment learns to select an action or action sequence that maximizes the cumulative reward in each state.
- machine learning implemented with a deep neural network (DNN) including a plurality of hidden layers is sometimes referred to as deep learning, and deep learning is a part of machine learning.
- hereinafter, the term machine learning is used in a sense that includes deep learning.
- a robot may refer to a machine that automatically processes or operates a task given by its own capabilities.
- a robot having a function of recognizing the environment and performing an operation by self-determining may be referred to as an intelligent robot.
- Robots can be classified into industrial, medical, household, military, etc. depending on the purpose or field of use.
- the robot may be provided with a driving unit including an actuator or a motor to perform various physical operations such as moving a robot joint.
- the movable robot includes a wheel, a brake, a propeller, etc. in a driving unit, and can travel on the ground or fly in the air through the driving unit.
- Autonomous driving refers to self-driving technology, and an autonomous driving vehicle refers to a vehicle that is driven without a user's manipulation or with minimal user manipulation.
- for example, autonomous driving may include a technology for maintaining a driving lane, a technology for automatically adjusting speed such as adaptive cruise control, a technology for automatically driving along a specified route, and a technology for automatically setting a route when a destination is set.
- the vehicle includes a vehicle having only an internal combustion engine, a hybrid vehicle having both an internal combustion engine and an electric motor, and an electric vehicle having only an electric motor, and may include not only automobiles but also trains and motorcycles.
- the autonomous vehicle can be viewed as a robot having an autonomous driving function.
- the extended reality collectively refers to Virtual Reality (VR), Augmented Reality (AR), and Mixed Reality (MR).
- VR technology provides only CG images of real-world objects or backgrounds, AR technology provides virtually created CG images on top of images of real objects, and MR technology is a computer graphics technology that mixes and combines virtual objects with the real world.
- MR technology is similar to AR technology in that it shows real and virtual objects together. However, in AR technology the virtual object is used in a form that complements the real object, whereas in MR technology the virtual object and the real object are used with equal characteristics.
- XR technology can be applied to an HMD (Head-Mounted Display), a HUD (Head-Up Display), mobile phones, tablet PCs, laptops, desktops, TVs, digital signage, and the like, and a device to which XR technology is applied may be referred to as an XR device.
- the AI system 1 includes at least one of an AI server 200, a robot 100a, an autonomous vehicle 100b, an XR device 100c, a smartphone 100d, or a home appliance 100e, connected to the cloud network 10.
- the robot 100a to which the AI technology is applied, the autonomous vehicle 100b, the XR device 100c, the smartphone 100d, or the home appliance 100e may be referred to as the AI devices 100a to 100e.
- the cloud network 10 may constitute a part of the cloud computing infrastructure or may mean a network that exists in the cloud computing infrastructure.
- the cloud network 10 may be configured using a 3G network, a 4G or Long Term Evolution (LTE) network, or a 5G network.
- LTE Long Term Evolution
- the devices 100a to 100e and 200 constituting the AI system 1 may be connected to each other through the cloud network 10.
- the devices 100a to 100e and 200 may communicate with each other through a base station, or may communicate with each other directly without going through a base station.
- the AI server 200 may include a server that performs AI processing and a server that performs an operation on big data.
- the AI server 200 is connected, through the cloud network 10, to at least one of the robot 100a, the autonomous vehicle 100b, the XR device 100c, the smartphone 100d, or the home appliance 100e, which are the AI devices constituting the AI system 1, and may assist at least part of the AI processing of the connected AI devices 100a to 100e.
- the AI server 200 may train an artificial neural network according to a machine learning algorithm in place of the AI devices 100a to 100e, and may directly store the learning model or transmit it to the AI devices 100a to 100e.
- the AI server 200 may receive input data from the AI devices 100a to 100e, infer a result value for the received input data using a learning model, and generate a response or a control command based on the inferred result value and transmit it to the AI devices 100a to 100e.
- alternatively, the AI devices 100a to 100e may themselves infer a result value for input data using a learning model, and generate a response or a control command based on the inferred result value.
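The division of labor described above, where a device either runs its own learning model or offloads inference to the AI server 200, can be sketched as follows; the class names, the threshold, and the command format are illustrative assumptions only and do not correspond to any actual API of the AI server or the AI devices.

```python
class AIServer:
    """Stand-in for the AI server 200: holds a trained learning model and runs inference."""

    def __init__(self, model):
        self.model = model

    def infer(self, input_data):
        # Infer a result value for the received input data and derive a response/control command.
        result = self.model(input_data)
        return {"result": result, "command": "act" if result > 0.5 else "wait"}


class AIDevice:
    """Stand-in for an AI device 100a to 100e that may or may not hold its own learning model."""

    def __init__(self, server=None, local_model=None):
        self.server = server
        self.local_model = local_model

    def handle(self, sensor_input):
        if self.local_model is not None:
            # The device infers the result value itself using its own learning model.
            result = self.local_model(sensor_input)
            return "act" if result > 0.5 else "wait"
        # Otherwise it sends the input data to the AI server and uses the returned command.
        return self.server.infer(sensor_input)["command"]


# Usage: a device without a local model offloads inference to the server.
server = AIServer(model=lambda x: 0.8)
device = AIDevice(server=server)
print(device.handle(sensor_input=[0.1, 0.2]))  # -> "act"
```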
- the robot 100a is applied with AI technology and may be implemented as a guide robot, a transport robot, a cleaning robot, a wearable robot, an entertainment robot, a pet robot, an unmanned flying robot, and the like.
- the robot 100a may include a robot control module for controlling an operation, and the robot control module may refer to a software module or a chip implementing the same as hardware.
- the robot 100a may acquire status information of the robot 100a using sensor information obtained from various types of sensors, detect (recognize) the surrounding environment and objects, generate map data, determine a movement path and a travel plan, determine a response to a user interaction, or determine an operation.
- the robot 100a may use sensor information obtained from at least one sensor from among a lidar, a radar, and a camera in order to determine a moving route and a driving plan.
- the robot 100a may perform the above operations using a learning model composed of at least one artificial neural network.
- the robot 100a may recognize a surrounding environment and an object using a learning model, and may determine an operation using the recognized surrounding environment information or object information.
- the learning model may be directly learned by the robot 100a or learned by an external device such as the AI server 200.
- the robot 100a may perform an operation by generating a result using the learning model directly, or may transmit sensor information to an external device such as the AI server 200 and perform the operation by receiving the result generated accordingly.
- the robot 100a may determine a movement path and a driving plan using at least one of map data, object information detected from sensor information, or object information acquired from an external device, and may control the driving unit so that the robot 100a travels according to the determined movement path and driving plan.
- the map data may include object identification information on various objects arranged in a space in which the robot 100a moves.
- the map data may include object identification information on fixed objects such as walls and doors and movable objects such as flower pots and desks.
- the object identification information may include a name, type, distance, and location.
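A minimal sketch of how the object identification information listed above (name, type, distance, location) could feed a movement decision is shown below; the data structure, the 0.5 m threshold, and the planning rule are illustrative assumptions, not the robot control module actually described in this document.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class ObjectInfo:
    """Object identification information as listed above: name, type, distance, location."""
    name: str
    obj_type: str                   # e.g. "fixed" (wall, door) or "movable" (flower pot, desk)
    distance_m: float
    location: Tuple[float, float]   # position in the map frame

def plan_next_step(detected: List[ObjectInfo],
                   goal_direction: Tuple[float, float]) -> Tuple[float, float]:
    """Toy planner: take a small step toward the goal unless a detected object is too close."""
    for obj in detected:
        if obj.distance_m < 0.5:
            # An obstacle is too close: stop and let a higher layer replan the movement path.
            return (0.0, 0.0)
    step = 0.1
    return (goal_direction[0] * step, goal_direction[1] * step)

# Usage: objects from the map data / sensors near the robot, plus a goal direction.
nearby = [
    ObjectInfo("wall", "fixed", 2.0, (3.0, 0.0)),
    ObjectInfo("flower pot", "movable", 0.4, (0.3, 0.2)),
]
print(plan_next_step(nearby, goal_direction=(1.0, 0.0)))  # -> (0.0, 0.0): obstacle too close
```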
- the robot 100a may perform an operation or run by controlling a driving unit based on a user's control/interaction.
- the robot 100a may acquire interaction intention information according to a user's motion or voice speech, and determine a response based on the obtained intention information to perform an operation.
- the autonomous vehicle 100b may be implemented as a mobile robot, vehicle, or unmanned aerial vehicle by applying AI technology.
- the autonomous driving vehicle 100b may include an autonomous driving control module for controlling an autonomous driving function, and the autonomous driving control module may refer to a software module or a chip implementing the same as hardware.
- the autonomous driving control module may be included inside as a configuration of the autonomous driving vehicle 100b, but may be configured as separate hardware and connected to the exterior of the autonomous driving vehicle 100b.
- the autonomous driving vehicle 100b may acquire state information of the autonomous driving vehicle 100b using sensor information obtained from various types of sensors, detect (recognize) the surrounding environment and objects, generate map data, determine a travel route and a travel plan, or determine an operation.
- the autonomous vehicle 100b may use sensor information obtained from at least one sensor from among a lidar, a radar, and a camera, similar to the robot 100a, in order to determine a moving route and a driving plan.
- the autonomous vehicle 100b may recognize an environment or an object in an area where the view is obscured or beyond a certain distance by receiving sensor information from external devices, or may receive information directly recognized by external devices.
- the autonomous vehicle 100b may perform the above operations using a learning model composed of at least one artificial neural network.
- the autonomous vehicle 100b may recognize a surrounding environment and an object using a learning model, and may determine a driving movement using the recognized surrounding environment information or object information.
- the learning model may be directly learned by the autonomous vehicle 100b or learned by an external device such as the AI server 200.
- the autonomous vehicle 100b may perform an operation by generating a result using the learning model directly, or may operate by transmitting sensor information to an external device such as the AI server 200 and receiving the result generated accordingly.
- the autonomous vehicle 100b may determine a movement path and a driving plan using at least one of map data, object information detected from sensor information, or object information acquired from an external device, and may control the driving unit so that the autonomous vehicle 100b travels according to the determined movement path and driving plan.
- the map data may include object identification information on various objects arranged in a space (eg, a road) in which the autonomous vehicle 100b travels.
- the map data may include object identification information on fixed objects such as street lights, rocks, and buildings, and movable objects such as vehicles and pedestrians.
- the object identification information may include a name, type, distance, and location.
- the autonomous vehicle 100b may perform an operation or drive by controlling a driving unit based on a user's control/interaction.
- the autonomous vehicle 100b may acquire interaction intention information according to a user's motion or voice speech, and determine a response based on the obtained intention information to perform the operation.
- the XR device 100c, to which AI technology is applied, may be implemented as an HMD (Head-Mounted Display), a HUD (Head-Up Display) provided in a vehicle, a TV, a mobile phone, a smartphone, a computer, a wearable device, a home appliance, digital signage, a vehicle, a fixed robot, or a mobile robot.
- the XR device 100c may analyze 3D point cloud data or image data acquired through various sensors or from an external device to generate location data and attribute data for the 3D points, thereby acquiring information on the surrounding space or real objects, and may render and output the XR object to be output.
- the XR apparatus 100c may output an XR object including additional information on the recognized object in correspondence with the recognized object.
- the XR apparatus 100c may perform the above operations using a learning model composed of at least one artificial neural network.
- the XR device 100c may recognize a real object from 3D point cloud data or image data using a learning model, and may provide information corresponding to the recognized real object.
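As a rough illustration of the pipeline described above, where 3D point cloud data yields location and attribute data and a recognized real object is annotated with an XR object, the following sketch buckets points into grid cells and renders a text placeholder; the clustering rule, thresholds, and names are illustrative assumptions and not how the XR device 100c is actually implemented.

```python
from typing import Dict, List, Tuple

Point = Tuple[float, float, float]

def centroid(points: List[Point]) -> Point:
    n = len(points)
    return (sum(p[0] for p in points) / n,
            sum(p[1] for p in points) / n,
            sum(p[2] for p in points) / n)

def recognize_objects(cloud: List[Point], cell: float = 1.0) -> List[Dict]:
    """Naive 'recognition': bucket 3D points into grid cells and report each cell's location/size."""
    buckets: Dict[Tuple[int, int, int], List[Point]] = {}
    for p in cloud:
        key = (int(p[0] // cell), int(p[1] // cell), int(p[2] // cell))
        buckets.setdefault(key, []).append(p)
    objects = []
    for pts in buckets.values():
        objects.append({
            "location": centroid(pts),   # location data for the 3D points
            "num_points": len(pts),      # a simple attribute
            "label": "object" if len(pts) >= 3 else "noise",
        })
    return objects

def xr_overlay(obj: Dict) -> str:
    """Render a text placeholder for the XR object with additional information on the recognized object."""
    x, y, z = obj["location"]
    return f"[{obj['label']}] at ({x:.1f}, {y:.1f}, {z:.1f}), {obj['num_points']} points"

cloud = [(0.1, 0.2, 0.0), (0.2, 0.1, 0.1), (0.3, 0.3, 0.0), (5.0, 5.0, 0.0)]
for obj in recognize_objects(cloud):
    print(xr_overlay(obj))
```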
- the learning model may be directly learned by the XR device 100c or learned by an external device such as the AI server 200.
- the XR device 100c may perform an operation by generating a result using the learning model directly, or may perform the operation by transmitting sensor information to an external device such as the AI server 200 and receiving the result generated accordingly.
- the robot 100a may be implemented as a guide robot, a transport robot, a cleaning robot, a wearable robot, an entertainment robot, a pet robot, an unmanned flying robot, etc. by applying AI technology and autonomous driving technology.
- the robot 100a to which AI technology and autonomous driving technology are applied may refer to a robot having an autonomous driving function or a robot 100a interacting with the autonomous driving vehicle 100b.
- the robot 100a having an autonomous driving function may collectively refer to devices that move by themselves according to a given movement line without the user's control or by determining the movement line by themselves.
- the robot 100a having an autonomous driving function and the autonomous driving vehicle 100b may use a common sensing method to determine one or more of a moving route or a driving plan.
- the robot 100a having an autonomous driving function and the autonomous driving vehicle 100b may determine one or more of a movement route or a driving plan using information sensed through a lidar, a radar, and a camera.
- the robot 100a interacting with the autonomous driving vehicle 100b exists separately from the autonomous driving vehicle 100b, and may be linked to the autonomous driving function inside or outside the autonomous driving vehicle 100b, or may perform an operation associated with the user on board the autonomous driving vehicle 100b.
- the robot 100a interacting with the autonomous driving vehicle 100b may acquire sensor information on behalf of the autonomous driving vehicle 100b and provide it to the autonomous driving vehicle 100b, or may acquire sensor information, generate surrounding environment information or object information, and provide it to the autonomous driving vehicle 100b, thereby controlling or assisting the autonomous driving function of the autonomous driving vehicle 100b.
- the robot 100a interacting with the autonomous vehicle 100b may monitor a user in the autonomous vehicle 100b or control a function of the autonomous vehicle 100b through interaction with the user.
- the robot 100a may activate an autonomous driving function of the autonomous driving vehicle 100b or assist the control of a driving unit of the autonomous driving vehicle 100b.
- the functions of the autonomous driving vehicle 100b controlled by the robot 100a may include not only an autonomous driving function, but also functions provided by a navigation system or an audio system provided inside the autonomous driving vehicle 100b.
- the robot 100a interacting with the autonomous driving vehicle 100b may provide information or assist a function to the autonomous driving vehicle 100b from outside the autonomous driving vehicle 100b.
- for example, the robot 100a may, like a smart traffic light, provide traffic information including signal information to the autonomous vehicle 100b, or may, like an automatic electric charger for an electric vehicle, interact with the autonomous driving vehicle 100b and automatically connect the electric charger to the charging port.
- the robot 100a may be implemented as a guide robot, a transport robot, a cleaning robot, a wearable robot, an entertainment robot, a pet robot, an unmanned flying robot, a drone, etc., by applying AI technology and XR technology.
- the robot 100a to which the XR technology is applied may refer to a robot that is an object of control/interaction in an XR image.
- the robot 100a is distinguished from the XR device 100c and may be interlocked with each other.
- when the robot 100a, which is the object of control/interaction in the XR image, acquires sensor information from sensors including a camera, the robot 100a or the XR device 100c may generate an XR image based on the sensor information, and the XR device 100c may output the generated XR image.
- the robot 100a may operate based on a control signal input through the XR device 100c or a user's interaction.
- for example, the user can check an XR image corresponding to the viewpoint of the remotely linked robot 100a through an external device such as the XR device 100c, and through the interaction can adjust the autonomous driving path of the robot 100a, control its operation or driving, or check information on surrounding objects.
- the autonomous vehicle 100b may be implemented as a mobile robot, a vehicle, or an unmanned aerial vehicle by applying AI technology and XR technology.
- the autonomous driving vehicle 100b to which the XR technology is applied may refer to an autonomous driving vehicle including a means for providing an XR image, or an autonomous driving vehicle that is an object of control/interaction within the XR image.
- the autonomous vehicle 100b, which is an object of control/interaction in the XR image, is distinguished from the XR device 100c, and the two may be interlocked with each other.
- the autonomous vehicle 100b provided with a means for providing an XR image may acquire sensor information from sensors including a camera, and may output an XR image generated based on the acquired sensor information.
- the autonomous vehicle 100b may provide an XR object corresponding to a real object or an object in a screen to the occupant by outputting an XR image with a HUD.
- when the XR object is output to the HUD, at least a part of the XR object may be output to overlap the actual object toward which the occupant's gaze is directed. On the other hand, when the XR object is output on a display provided inside the autonomous vehicle 100b, at least a part of the XR object may be output to overlap the object in the screen.
- the autonomous vehicle 100b may output XR objects corresponding to objects such as lanes, other vehicles, traffic lights, traffic signs, motorcycles, pedestrians, and buildings.
- when the autonomous driving vehicle 100b, which is the object of control/interaction in the XR image, acquires sensor information from sensors including a camera, the autonomous driving vehicle 100b or the XR device 100c may generate an XR image based on the sensor information, and the XR device 100c may output the generated XR image.
- the autonomous vehicle 100b may operate based on a control signal input through an external device such as the XR device 100c or a user's interaction.
Abstract
The present invention relates to a method for using an always-on protocol data unit (PDU) session of a terminal. The method may comprise the steps of: displaying on a screen, based on first information received from a network, first output information indicating that an always-on PDU session is established or in use; displaying on the screen, based on second information received from the network, second output information indicating that the always-on PDU session is not supported or cannot be used; and, based on the second information received from the network, not requesting the always-on PDU session again until a predetermined condition is satisfied.
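The terminal-side behaviour summarized in the abstract, displaying that the always-on PDU session is established or in use upon first information from the network, displaying that it is not supported or cannot be used upon second information, and refraining from requesting it again until a predetermined condition is satisfied, can be sketched as follows; the class, the back-off timer standing in for the predetermined condition, and the display strings are illustrative assumptions, not the claimed implementation or any 3GPP-defined API.

```python
import time

class AlwaysOnPduSessionUe:
    """Sketch of the terminal behaviour summarized in the abstract (illustrative only)."""

    def __init__(self, backoff_seconds: float = 60.0):
        # The "predetermined condition" is modelled here as a simple back-off timer expiring.
        self.backoff_seconds = backoff_seconds
        self.backoff_until = 0.0

    def display(self, text: str) -> None:
        print(f"[screen] {text}")  # stand-in for the terminal's display unit

    def on_first_information(self) -> None:
        # First information from the network: the always-on PDU session is established / in use.
        self.display("Always-on PDU session established or in use")

    def on_second_information(self) -> None:
        # Second information from the network: the always-on PDU session is not supported / unusable.
        self.display("Always-on PDU session not supported or cannot be used")
        self.backoff_until = time.monotonic() + self.backoff_seconds

    def may_request_always_on(self) -> bool:
        # Do not request the always-on PDU session again until the predetermined condition is met.
        return time.monotonic() >= self.backoff_until

ue = AlwaysOnPduSessionUe(backoff_seconds=5.0)
ue.on_second_information()
print(ue.may_request_always_on())  # False until the back-off expires
```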
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR10-2019-0021329 | 2019-02-22 | | |
KR20190021329 | 2019-02-22 | | |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020171311A1 (fr) | 2020-08-27 |
Family
ID=72144614
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2019/008627 (WO2020171311A1, fr) | Procédé et terminal d'utilisation d'une session pdu toujours active en 5 gs | 2019-02-22 | 2019-07-12 |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2020171311A1 (fr) |
- 2019-07-12: WO PCT/KR2019/008627 patent/WO2020171311A1 (fr), active, Application Filing
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018008980A1 (fr) * | 2016-07-05 | 2018-01-11 | 엘지전자(주) | Procédé permettant de sélectionner une opération de ressource préférée par l'utilisateur dans un système de communication sans fil et dispositif associé |
WO2018111029A1 (fr) * | 2016-12-15 | 2018-06-21 | 엘지전자(주) | Procédé de réalisation de transfert intercellulaire dans un système de communication sans fil et appareil associé |
WO2018194315A1 (fr) * | 2017-04-19 | 2018-10-25 | 엘지전자 주식회사 | Procédé de traitement d'une procédure d'établissement de session pdu et nœud amf |
WO2018236164A1 (fr) * | 2017-06-21 | 2018-12-27 | 엘지전자(주) | Procédé et dispositif d'exécution d'une procédure de demande de service dans un système de communication sans fil |
Non-Patent Citations (1)
Title |
---|
NOKIA: "23.501 § 5.6.2: Registration and PDU session setup for always-on", 3GPP TSG SA WG2 MEETING #122, S2-174303, 20 June 2017 (2017-06-20), San Jose del Cabo , Mexico, XP051309382 * |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2022143697A1 (fr) * | 2020-12-31 | 2022-07-07 | 华为技术有限公司 | Procédé et appareil permettant d'éviter une boucle |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 19915712; Country of ref document: EP; Kind code of ref document: A1 |
 | NENP | Non-entry into the national phase | Ref country code: DE |
 | 122 | Ep: pct application non-entry in european phase | Ref document number: 19915712; Country of ref document: EP; Kind code of ref document: A1 |