WO2019017885A1 - Network assisted augmented reality - Google Patents

Network assisted augmented reality

Info

Publication number
WO2019017885A1
Authority
WO
WIPO (PCT)
Prior art keywords
user equipment
remote user
area
data
response
Application number
PCT/US2017/042420
Other languages
French (fr)
Inventor
Dirk Gaschler
Rui LUZ
Original Assignee
Nokia Solutions And Networks Oy
Nokia Usa Inc.
Application filed by Nokia Solutions And Networks Oy and Nokia Usa Inc.
Priority to PCT/US2017/042420
Publication of WO2019017885A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/90 Services for handling of emergency or hazardous situations, e.g. earthquake and tsunami warning systems [ETWS]
    • H04W 4/02 Services making use of location information
    • H04W 4/021 Services related to particular areas, e.g. point of interest [POI] services, venue services or geofences

Definitions

  • Augmented reality refers to a view of a physical, real-world scene that has somehow been augmented by a computer with additional information, such as video, location or map information, photos, graphics, sensor data (including Internet of Things (IoT) sensor data), and/or the like.
  • a user may look at a display presented by a user equipment, such as a smartphone camera, a tablet, a personal computer, and/or other processor-based devices.
  • This display may depict an actual, real-world scene of a city street full of people, but this reality can be "augmented" with, for example, a graphic (which is presented with the street view) indicating the location of a nearby coffee shop, or can be augmented with other types of data, such as sensor data, information regarding environmental conditions (for example, temperature, wind, humidity, barometric pressure, and/or the like), altitude, and/or other types of data as well.
  • the augmented reality can be implemented as an immersive, virtual reality (VR), in which a three-dimensional (3D) view may be used, although the augmented reality experience can be provided without such a 3D VR experience as well.
  • a method that includes sending a request for assistance information, the assistance information enabling an augmented reality view of an area; receiving a response to the request, the response including an identifier for a remote user equipment associated with the area and/or a location for the remote user equipment; receiving data from the remote user equipment identified in the response, the data providing at least a portion of the augmented reality view of the area; and presenting, based on at least the received data, the augmented reality view of the area and/or the location.
  • the request may be sent in response to a call for emergency assistance and/or a message for emergency assistance.
  • the remote user equipment may originate the call or the message.
  • the received response may include a plurality of identifiers for a plurality of remote user equipment associated with the area and/or the location of the remote user equipment originating the call or the message.
  • the received response may further include the data from the plurality of remote user equipment.
  • the request for assistance information may include a request to identify the plurality of remote user equipment associated with the area and/or the location.
  • the response may further include subscription information. The subscription information may be checked to verify the remote user equipment and/or the plurality of remote user equipment allow being accessed in order to obtain the data.
  • the remote user equipment and/or the plurality of remote user equipment may be accessed to obtain the data for at least the portion of the augmented reality view of the area and/or the location.
  • the remote user equipment and/or the plurality of remote user equipment may be requested to capture and to return real-time video, the location information, and/or other information.
  • the accessing may include sending a message to the remote user equipment and/or the plurality of remote user equipment to perform an action and/or to control at least a portion of the remote user equipment and/or the plurality of remote user equipment.
  • the augmented reality view of the area and/or the location may include the data including the real-time video augmented with the location information, and/or the other information.
  • a user equipment associated with a public safety center may access the remote user equipment and/or the plurality of remote user equipment to obtain the data, and/or the data may be received from a network node that accesses the remote user equipment and/or the plurality of remote user equipment to obtain the data.
  • the network node may include a server included in, or coupled to, a cellular network.
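
Taken together, the preceding bullets describe a four-step client-side flow: request assistance information, receive identifiers and locations of remote user equipment, pull data from those devices, and present the augmented view. The following Python sketch is purely illustrative; the message fields and the `network` interface (`request_assistance`, `fetch_data`) are assumptions, not part of the disclosure.

```python
# Hypothetical sketch of the client-side (PSC-side) method described above.
from dataclasses import dataclass
from typing import List, Optional, Tuple


@dataclass
class AssistanceRequest:
    area: str                                 # e.g., a geofence or cell identifier
    emergency_call_id: Optional[str] = None   # set when triggered by a call


@dataclass
class AssistanceResponse:
    ue_identifiers: List[str]                 # e.g., MSISDNs/IMSIs of UEs in the area
    ue_locations: List[Tuple[float, float]]   # (lat, lon) per identified UE
    subscription_ok: List[bool]               # whether each UE allows being accessed


def build_ar_view(network, area):
    """Assemble AR view layers for an area using network assistance.

    `network` is any object exposing `request_assistance` and `fetch_data`;
    both are assumed interfaces, not a real API.
    """
    response = network.request_assistance(AssistanceRequest(area=area))
    layers = []
    for ue_id, loc, allowed in zip(response.ue_identifiers,
                                   response.ue_locations,
                                   response.subscription_ok):
        if not allowed:
            continue  # skip UEs whose subscription forbids access
        # Data from this UE (e.g., real-time video, sensor readings)
        # provides at least a portion of the augmented reality view.
        layers.append({"ue": ue_id, "location": loc,
                       "data": network.fetch_data(ue_id)})
    return layers
```
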
  • a method that includes receiving a request for assistance information to enable an augmented reality view of an area; generating a response including an identifier for a remote user equipment associated with the area and/or a location for the remote user equipment; and providing, based on data provided by at least the remote user equipment identified in the response, the generated response to enable presentation of the augmented reality view of the area and/or the location.
  • the request may be received in response to a call for emergency assistance and/or a message for emergency assistance.
  • the remote user equipment may originate the call or the message.
  • the generated response may include a plurality of identifiers for a plurality of remote user equipment associated with the area and/or the location of the remote user equipment originating the call or the message.
  • the generated response may further include the data from the plurality of remote user equipment.
  • the request for assistance information may include a request to identify the plurality of remote user equipment associated with the area and/or the location.
  • the generated response may further include subscription information.
  • the subscription information may be checked to verify the remote user equipment and/or the plurality of remote user equipment allow being accessed in order to obtain the data. Based on the subscription information allowing the access, the remote user equipment and/or the plurality of remote user equipment may be accessed to obtain the data for at least the portion of the augmented reality view of the area and/or the location.
  • a network node may access the remote user equipment and/or the plurality of remote user equipment to obtain the data, and/or the network node may forward the data to a user equipment associated with a public safety center that accesses the remote user equipment and/or the plurality of remote user equipment to obtain the data.
  • the network node may include a server included in, or coupled to, a cellular network, the network node further configured to at least receive the request, generate the response, and/or provide the generated response.
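
The network-side counterpart of the method receives the request, identifies user equipment associated with the area, and attaches or applies subscription information. A minimal sketch, assuming hypothetical `location_db` and `subscription_db` stores; all field names are invented for illustration.

```python
# Hypothetical network-node handler for an assistance-information request.
# `location_db` maps UE identifiers to last-known locations; `subscription_db`
# records whether a UE has opted in to network assisted augmented reality.

def handle_assistance_request(area, location_db, subscription_db,
                              emergency_bypass=False):
    """Generate a response listing UEs in `area`, with subscription info."""
    ues_in_area = [(ue, loc) for ue, loc in location_db.items()
                   if loc["area"] == area]
    response = []
    for ue, loc in ues_in_area:
        subscribed = subscription_db.get(ue, False)
        # Per the disclosure, subscription checking may be skipped, e.g.,
        # where the law allows access during an emergency.
        if subscribed or emergency_bypass:
            response.append({"identifier": ue,
                             "location": (loc["lat"], loc["lon"]),
                             "subscribed": subscribed})
    return response


# Example: two UEs near an event, one without a subscription.
locations = {"MSISDN-1": {"area": "276", "lat": 40.71, "lon": -74.00},
             "MSISDN-2": {"area": "276", "lat": 40.72, "lon": -74.01}}
subscriptions = {"MSISDN-1": True}
print(handle_assistance_request("276", locations, subscriptions))
```
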
  • FIG. 1A depicts an example of user equipment including augmented reality user interfaces in which network assistance is provided, in accordance with some example embodiments;
  • FIG. 1B depicts an example of a system for network assisted augmented reality, in accordance with some example embodiments;
  • FIGs. 2A-G depict an example signaling flow sequence for network assisted augmented reality, in accordance with some example embodiments;
  • FIGs. 3A-F depict another example signaling flow sequence for network assisted augmented reality, in accordance with some example embodiments;
  • FIG. 4 depicts another example of user equipment including augmented reality user interfaces in which network assistance is provided, in accordance with some example embodiments;
  • FIG. 5 depicts an example of a network node for providing assistance information for augmented reality, in accordance with some example embodiments.
  • FIG. 6 depicts an example of an apparatus, in accordance with some example embodiments.
  • a public safety center may be a center for responding to events such as emergencies or other types of events.
  • a PSC may be called in response to an emergency call, such as a call related to medical attention, public safety, security, fire, and/or other services provided to a community or a region. It may be useful for a PSC to have some "insight" into an area associated with an event, such as the emergency, although the PSC may seek the insight for other reasons such as for preventative or other reasons.
  • At the PSC, a user equipment (e.g., a tablet, a computer, a smartphone, and/or the like) may include an application, such as an AR application. This application may be assisted by the network, such as a mobile wireless network or a network node therein.
  • This assistance may include information to enable the PSC to obtain data for the augmented reality session.
  • the AR application may be implemented as a service, such as a cloud-based service at a server coupled to the Internet.
  • the AR application may be implemented at locations other than the PSC.
  • the PSC's user equipment may provide the PSC with an augmented reality view into an area of interest.
  • the PSC's user equipment may provide the augmented reality service with real-time video augmented with other data including photos, audio, geolocation information, time information, and/or other data (e.g., weather, temperature, altitude, barometric pressure, wind, sensor data, IoT sensor data, and/or the like) in order to respond to, for example, an event in the area of interest.
  • the event may be any type of event including an emergency, a natural disaster (e.g., earthquake, fire, tsunami, floods, volcanic eruption, storm, etc.), a security or safety event (e.g., an accident, crime, etc.), incident prevention (e.g., any kind of known public demonstration or mass participated event, evolving meteorological conditions, national infrastructure incidents including the telecom network infrastructure itself, etc.), tracking of one or more persons for security or law enforcement reasons, monitoring of an area of interest, and/or for other types of events.
  • the network assisted augmented reality may also be used for non-event driven situations, such as general surveillance, monitoring, preventative reasons, and/or for other situations/reasons.
  • FIG. 1A depicts an example of a system 100 including a viewer 105 and user equipment 102A-B presenting user interfaces 104A-B, in accordance with some example embodiments.
  • the user equipment 102A-B may be located at a location, such as a PSC and/or at other locations, to provide network assisted augmented reality, in accordance with some example embodiments.
  • the PSC (which may include the system 100 which is described further below) may receive a message regarding an event, such as a call, an SMS message, an email, and/or other indication.
  • the PSC's user equipment 102A or B may request, from the network (e.g., a wireless mobile network or node therein), assistance information.
  • the assistance information for augmented reality may include the location of the caller (or the caller's user equipment), location of the event, the identity of the caller (e.g., caller's MSISDN, IMSI, Internet Protocol (IP) address and/or the like), the identity of one or more other user equipment in the vicinity of the event or caller (e.g., a list of at least one MSISDN, IMSI, and/or the like).
  • the PSC's user equipment 102A or B may, via the network, access one or more user equipment in an area in the vicinity of the event to determine whether those user equipment will capture, and provide to the PSC's user equipment 102A or B, certain information regarding the event.
  • the PSC's user equipment 102A or B may request that the neighboring user equipment in an area associated with the event capture and forward to the PSC real-time video, photographic, audio, and/or other types of data (for example, weather, temperature, altitude, barometric pressure, wind, sensor data including IoT sensor data, and/or the like).
  • the neighboring user equipment in the vicinity of the event may allow the PSC to control what information the one or more user equipment in the area of the event should capture, such as capturing photos or video using the front facing camera, turning on a microphone to enable audio recording, providing geolocation information, and/or the like.
  • the PSC's user equipment 102A or B may send a message to a user equipment in the area associated with the event to perform an action.
  • the PSC's user equipment 102A or B may send a message (for example, an SMS message, a command, and/or the like and may include text and/or graphics) to the user equipment instructing a pan of the camera up or down or instructing some other operation.
  • the PSC's user equipment 102A or B may access the one or more remote user equipment in the area of interest, such as the event. This access may be direct (e.g., by sending messages or making a call directly to the one or more remote user equipment to obtain the data such as real-time video and/or the like), and/or the access may be via a network node, in which case the network node may obtain the data for the augmented reality view from the one or more remote user equipment.
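
As described above, access to a remote user equipment may be direct or relayed through a network node. One way to express the two paths in code; the transport classes and every method name below are invented for illustration and are not part of the disclosure.

```python
# Hypothetical sketch of the two access paths: the PSC's user equipment
# either contacts the remote UE directly, or asks a network node to obtain
# the data (and route commands) on its behalf.

class DirectAccess:
    """PSC user equipment contacts the remote UE itself."""
    def __init__(self, remote_ue):
        self.remote_ue = remote_ue

    def get(self, what):
        # e.g., what = "realtime_video" or "geolocation"
        return self.remote_ue.capture(what)

    def command(self, action):
        # e.g., an SMS-like instruction such as "pan camera up"
        return self.remote_ue.perform(action)


class NetworkNodeAccess:
    """A network node obtains the data and forwards it to the requester."""
    def __init__(self, network_node, remote_ue_id):
        self.node = network_node
        self.ue_id = remote_ue_id

    def get(self, what):
        return self.node.obtain_and_forward(self.ue_id, what)

    def command(self, action):
        return self.node.route_command(self.ue_id, action)
```
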
  • the user interface 104A may present some of the network assistance information provided by the network.
  • the network assistance information may include the geo-location of the event (e.g., the location of the "HELP" request depicted at user interface 104A), the geo-location of user equipment 112 (which may be requesting the HELP), the geo-location of user equipment 106A-C associated with the event (e.g., in the area in the vicinity of the event), the identity (e.g., MSISDN, IMSI, IP address, etc.) of user equipment 112, the identity of user equipment 106A-C associated with the event, and/or other assistance information.
  • a user may select the graphical user interface element representative of a source of data which can be used to augment reality, which in this example is user equipment 106C presented at 104A.
  • the PSC's user equipment 102A or B may, via the network, access user equipment 106C to obtain a real-time video feed from user equipment 106C.
  • This video feed may be presented at user interface 104B, and can be viewed by a user (which as noted may be wearing the viewer 105 which can allow the user to "zoom in" on what is presented at 104A-B).
  • the user interface 104B depicts the augmented reality view of the event.
  • the "insight" may include real-time video (from the perspective of the selected user equipment 106C) as well as an overlay depicting data such as the location and identities of the other user equipment (e.g., 106A, B, and so forth) and the location of the event (e.g., 112).
  • a selection of 106B may request remote user equipment 106B having MSISDN 2 to provide a real-time video feed as well as other types of data.
  • user interfaces 104A or B may present, as part of the augmented reality, other types of data provided by other devices, such as user equipment 106A-C as well as other devices.
  • the augmented reality may include weather, temperature, altitude, barometric pressure, wind, sensor data, IoT sensor data, and/or other types of data collected by the user equipment and/or other devices.
  • the user, such as the PSC operator, may use the viewer 105 as a virtual reality 3D viewer, or may use other types of devices to view and/or consume the video, audio, location, and/or other types of data (including sensor data/IoT sensor data) which can be gathered for the area associated with the event.
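
The view described above combines a selected video feed with markers for nearby user equipment and for the event itself. A toy sketch of how such a view might be composed; the layer format is an assumption, and a real implementation would render graphics rather than strings.

```python
# Hypothetical composition of the augmented view at user interface 104B:
# a real-time video frame from the selected UE, overlaid with the identity
# and location of other nearby UEs and of the event.

def compose_overlay(selected_ue, nearby, event):
    layers = [f"video:{selected_ue}"]
    layers += [f"marker:{ue['id']}@{ue['lat']},{ue['lon']}" for ue in nearby]
    layers.append(f"event@{event['lat']},{event['lon']}")
    return layers


print(compose_overlay(
    "MSISDN-3",
    [{"id": "MSISDN-1", "lat": 40.713, "lon": -74.006},
     {"id": "MSISDN-2", "lat": 40.712, "lon": -74.005}],
    {"lat": 40.7128, "lon": -74.0060}))
```
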
  • the user equipment depicted at FIG. 1A may be implemented as a sensor, such as an IoT sensor.
  • some of the user equipment may be stationary, while others may be mobile.
  • the view 104A depicts a "PSC operator view" providing a zoomed out view of the area and identifying potential additional sources of data that can be accessed to provide data for VR augmentation.
  • This view 104A may include data from one or more sources of data including augmented reality data sources such as user equipment 106A-C as well as other devices.
  • a selection of, for example, user equipment 106C requests data from user equipment 106C, which in this example is audio, video, and/or other types of data which can be presented at a user interface such as user interface 104B (which may be rendered in two dimensions or in a more immersive format, such as 3D or VR).
  • User interface 104B depicts in the example of FIG. 1A a zoomed in view of the area associated with the event from the perspective of the selected device, which in this example is user equipment 106C.
  • the user equipment 102A or B may, via the network, request or command user equipment 106C to perform other operations, such as configure a camera for zoom, panoramic, and/or 360 degree video capture, turn on a microphone and provide the audio recording from the microphone at user equipment 106C, turn on a flash of the user equipment 106C, message the user of the user equipment 106C to perform certain actions, provide time information/stamps, provide location information, provide a list of nearby Bluetooth and/or WiFi interfaces (e.g., the WiFi or Bluetooth interfaces of the other nearby user equipment 106A, 112, and/or the like).
  • the user equipment 102A or B may, via the network, access and/or control other user equipment, such as user equipment 106A, and so forth.
  • a network node may also be configured to obtain the data from the remote user equipment and then forward the obtained data to the PSC's user equipment.
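
On the remote user equipment side, the requested operations listed above must be dispatched to device functions. The sketch below assumes a hypothetical `device` object standing in for platform camera, microphone, location, and radio APIs; none of these names come from the disclosure or from a real SDK.

```python
# Hypothetical command dispatcher on the remote UE, mapping PSC requests
# (capture video, enable audio, flash, location, neighbor scan, user
# message) onto stand-in device calls.

def handle_psc_command(device, command, **kwargs):
    handlers = {
        "capture_video":  lambda: device.camera.start_video(
                              mode=kwargs.get("mode", "normal")),  # zoom/panoramic/360
        "enable_audio":   lambda: device.microphone.start_recording(),
        "flash_on":       lambda: device.camera.set_flash(True),
        "get_location":   lambda: device.location.current(),
        "list_neighbors": lambda: device.radio.scan_bluetooth_wifi(),
        "message_user":   lambda: device.display.show(kwargs["text"]),
    }
    if command not in handlers:
        raise ValueError(f"unsupported command: {command}")
    return handlers[command]()
```
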
  • although FIG. 1A depicts two user equipment 102A-B at the PSC, other quantities may be used as well.
  • a single user equipment may be used to present the user interface views 104A-B to the user, such as the PSC operator.
  • user equipment 102A-B may be in locations other than the PSC.
  • FIG. 1B depicts a system 199, in accordance with some example embodiments.
  • the system 199 may include a PSC 150 including system 100, a network 155 (e.g., a cellular network, the Internet, and/or a network node therein), at least one wireless access point 110A-C (e.g., WiFi access points, evolved node B base stations, and/or the like), and at least one user equipment 112A and 106A-C.
  • the network 155 may assist the PSC's system 100 by providing information for building different views of augmented reality.
  • the network infrastructure (which may include access points 110A-C and/or at least one server) may provide to the PSC the assistance information.
  • the network 155 may provide to the system 100, via wired and/or wireless links 160, the following assistance information: geo-location information (e.g., latitude, longitude, altitude, cell identifier, street addresses, and/or the like) associated with an event area, geo-location information of user equipment 112 associated with the event, geo-location information of user equipment 106A-C in an area such as in the vicinity of the event, other identifiers (e.g., MSISDN, IMSI, IP address, and/or the like) in an area associated with user equipment 112 and/or the event, cell parameters, and/or other types of information.
  • This other data/information may include temperature, altitude, barometric pressure, movement based on for example an embedded gyroscope/motion sensor, terminal type (e.g., smartphone, tablet, sensor, video camera, watch/wearables, IoT devices), and/or other types of information.
  • the network may provide to the system 100 subscription information indicative of whether user equipment in an area 276 (see, e.g., FIG. 1A) associated with the event is subscribed to network assisted augmented reality and, as such, allows the PSC and the network to access or obtain data from the user equipment for purposes of preventively or reactively collecting the data, which may assist in the provisioning of augmented reality at system 100.
  • the network 155 may route, via wired and/or wireless backhaul 162 and/or wireless link 164, to user equipment 112, system 100's access, request, or command to obtain video, audio, and/or other types of data from a remote user equipment, such as user equipment 112, 106A-C and/or the like.
  • This user equipment obtained video, audio, and/or other information may then be routed via the network to the PSC including system 100 to provide the augmented reality service as shown at user interfaces 104A-B.
  • the network may thus provide some degree of fault tolerance and distribution of the PSC system 100.
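
Gathered together, the assistance information items above could be carried in a single structured payload from network 155 to system 100. The disclosure does not specify a wire format; every field name below is an illustrative assumption.

```python
import json

# Hypothetical assistance-information payload; all values are made up.
assistance_info = {
    "event": {"lat": 40.7128, "lon": -74.0060, "altitude_m": 10,
              "cell_id": "310-260-1234", "street_address": "Example St 1"},
    "caller_ue": {"id": "MSISDN-0", "lat": 40.7128, "lon": -74.0060},
    "nearby_ues": [
        {"id": "MSISDN-1", "lat": 40.7130, "lon": -74.0062,
         "terminal_type": "smartphone", "subscribed": True},
        {"id": "MSISDN-2", "lat": 40.7125, "lon": -74.0055,
         "terminal_type": "IoT camera", "subscribed": False},
    ],
}
print(json.dumps(assistance_info, indent=2))
```
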
  • although FIG. 1B depicts system 100 at the PSC 150, the user equipment 102A or 102B of system 100 may be remoted to other locations.
  • user equipment 102B may be implemented in other locations including within an area of interest, such as in the vicinity of the event.
  • FIGs. 2A-2G depict an example of a signaling flow, in accordance with some example embodiments.
  • the process depicted in the signaling flow is initiated by a user, such as a caller reporting an emergency, requesting help, and/or for other reasons.
  • the user equipment 112 may make a call, such as an emergency call, to the PSC 150 including system 100, in accordance with some example embodiments.
  • the user equipment 112 may make a call (e.g., a voice call, an SMS text, an email, and/or the like) to the PSC 150 to report an event, such as an emergency.
  • This call may be forwarded to the PSC 150 via a wireless uplink (and/or downlink) 164, base station 110A, backhaul 162, network 155, and link 160.
  • the network 155 may establish, as shown at FIG. 2B, a connection 210, such as an emergency call connection 210, to couple via network 155 the PSC 150 and the user equipment 112, in accordance with some example embodiments.
  • the system 100 may trigger the process for other reasons as well. For example, if the PSC 150 including system 100 becomes aware of an event, the PSC including system 100 may initiate connection 210 without receiving the call. Alternatively or additionally, the PSC 150 including system 100 may proactively (for example, before an event) initiate connection 210 without receiving the call. Alternatively or additionally, a user equipment, such as user equipment 106A for example, may be associated with the PSC 150. For example, user equipment 106A may be considered a remote unit of the PSC, in which case the user equipment 106A may be accessed at any time (without the call at 205) and without the need for subscription verification.
  • the PSC 150 including the system 100 may request from the network 155 assistance information, associated with the call 205, in accordance with some example embodiments.
  • the PSC including the system 100 may request, via wired and/or wireless links, assistance information such as the identities of remote user equipment in the vicinity, such as area 276, of the event.
  • the requested assistance information may include geo-location information for the user equipment 112, geo-location information for neighboring user equipment 106A-C in the vicinity of the event, terminal information (e.g., regarding the type of user equipment being used), measurements, identity information for the user equipment 112, identity information for user equipment 106A-C in the vicinity of the event, subscription information, and/or the like.
  • the network 155 may forward to the PSC 150 including system 100, the requested assistance information, in accordance with some example embodiments.
  • the network may gather and/or provide assistance information including the identities such as MSISDNs of the user equipment 106A-C (as well as other user equipment) in the vicinity or area 276 of the event.
  • the assistance information provided to system 100 may include the location of the user equipment 112 making the call, the locations of at least one other user equipment 106A-C (as well as the others) within a certain area 276 in the vicinity of the event, the identities of user equipment 112 making the call, terminal type, measurements, and/or other assistance information which can be used to provide augmented reality at system 100.
  • the system 100 may check the subscription, such as the network assisted augmented reality subscription, of the user equipment 112, 106A-C included in the assistance information provided at 220, in accordance with some example embodiments.
  • the subscription information may indicate which of the identified user equipment in area 276 allow the PSC 150 including system 100 to obtain audio, video, photo, and/or other types of data directly from the user equipment to enable the PSC including system 100 to have an augmented reality look into the event.
  • not all of the user equipment in area of interest 276 subscribe to the augmented reality service being accessed by system 100.
  • the subscriptions may be verified by the network 155 and, if subscribed, an indication of verification can be provided to system 100. In some implementations, the subscription verification may not be performed (for example, if allowed for an emergency by law or for other reasons).
  • the system 100 may access one or more user equipment 112, 106C to request obtaining audio, video, photo, and/or other types of data, in accordance with some example embodiments.
  • system 100 including user equipment 102A or B may request that the one or more user equipment in an area associated with the event capture and forward video, audio, location, and/or other types of data which can be gathered in real-time at, or in the vicinity of, the event.
  • the one or more user equipment in the area of interest may allow the system 100 including user equipment 102A or B to control what information the one or more user equipment in the vicinity of the event should capture.
  • the PSC's user equipment 102A or B may, as noted, send a message to the user equipment in the vicinity of the event to perform an action.
  • a network node may access the one or more user equipment 112, 106C and/or the like to obtain the data, such as audio, photo, and/or other types of data, in accordance with some example embodiments. When this is the case, the network node may forward the obtained data to the system 100.
  • the system 100 including user equipment 102A or B may receive, from one or more user equipment 112, 106C, the requested audio, video, photo, and/or other types of data, in accordance with some example embodiments.
  • This information may be used to provide an augmented reality view of the event.
  • An example of an augmented reality view is depicted at user interface 104B at FIG. 1A.
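
The ordering of the FIG. 2 signaling flow can be restated compactly in code. Numerals 205, 210, 220, and 230 follow the description above; the function names and signatures are invented for illustration.

```python
# Compact restatement of the FIG. 2 signaling flow as ordered steps.
# Everything except the reference numerals is an illustrative assumption.

def fig2_flow(ue_112, network_155, psc_100):
    call = ue_112.emergency_call()               # 205: UE calls the PSC
    network_155.establish_connection(call)       # 210: emergency call connection
    req = psc_100.request_assistance(call)       #      PSC requests assistance info
    info = network_155.assistance_info(req)      # 220: identities, locations, etc.
    targets = psc_100.check_subscriptions(info)  #      verify NAAR opt-in
    for ue in targets:
        psc_100.access(ue)                       # 230: request video/audio/other data
    return psc_100.receive_data(targets)         #      data for the AR view
```
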
  • FIGs. 3A-3F depict an example of a signaling flow, in accordance with some example embodiments.
  • the process depicted in the signaling flow of FIGs. 3A-3F is similar in some respects to the flow depicted at FIGs. 2A-2G, but the flow of FIGs. 3A-3F is initiated by a request from the PSC 150 including system 100.
  • the PSC 150 including system 100 may need information for a certain area of interest 276 in order to provide an augmented reality look into the area 276, in accordance with some example embodiments.
  • the PSC may receive a call regarding an event, although the PSC may want to monitor area 276 without receiving a call as well.
  • the PSC 150 including system 100 may send, in response to this need, to the network 155 a request for assistance information such as augmented reality assistance information for a given location or area, such as area 276, in accordance with some example embodiments.
  • the request sent to the network may include a location, such as area 276, and an indication that the identity of any user equipment at that location be provided to enable network assisted augmented reality at the PSC.
  • a selection at user interface 104A (see, e.g., FIG. 1A) may be performed to identify a location, such as an area of interest 276, by, for example, tracing the area 276 on the user interface 104A, selecting a street address (or a point of interest such as a building, etc.), and/or selecting a point, such as the HELP graphical user interface element presented at user interface 104A.
  • the network 155 may provide to the PSC 150 including system 100 assistance information regarding a given event, user, user equipment, and/or area, in accordance with some example embodiments.
  • the network 155 may provide the requested identities of the remote user equipment in area 276. These identities may include the MSISDN, IMSIs, IP addresses, and/or other information identifying the user equipment in area 276.
  • the network may provide other network assistance information to system 100, such as the subscription information for the user equipment in area 276.
  • the PSC 150 including system 100 may check the subscription, such as the network assisted augmented reality (NAAR) subscription of the user equipment 112, 106A-C identified by the network at 315, in accordance with some example embodiments.
  • the subscription information may indicate which of the identified user equipment in area 276 allow the PSC 150 including system 100 to obtain audio, video, photo, and/or other information directly from the user equipment to enable the PSC including system 100 to have an augmented reality look into the event, such as an emergency.
  • the network 155 may check the subscription information, in which case only user equipment in area 276 that subscribe to the network assisted augmented reality will be identified at 315 to the system 100.
  • the system 100 at PSC 150 may access one or more user equipment 112, 106A-C to request obtaining audio, video, photo, and/or other information in accordance with some example embodiments.
  • the network 155 may access the user equipment as noted above with respect to 230.
  • a network node may access the one or more user equipment to obtain the data, such as audio, photo, and/or other types of data, in accordance with some example embodiments. When this is the case, the network node may forward the obtained data to the PSC 150.
  • the system 100 at PSC 150 may receive, from one or more user equipment 112, 106C, the requested audio, video, photo, and/or other information, in accordance with some example embodiments.
  • the network 155 may receive the information as noted above with respect to 230.
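
The PSC-initiated flow of FIGs. 3A-3F turns on identifying user equipment within a requested area 276. Assuming the area is given as a centre point plus radius (the disclosure also allows traced areas and street addresses), a haversine-based filter is one plausible realization.

```python
import math

# Hypothetical area query: find UEs whose last-known position falls within
# an area of interest. The haversine formula gives great-circle distance.

def haversine_m(lat1, lon1, lat2, lon2):
    r = 6_371_000  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))


def ues_in_area(ue_positions, centre, radius_m):
    """ue_positions: {identifier: (lat, lon)}; centre: (lat, lon)."""
    return [ue for ue, (lat, lon) in ue_positions.items()
            if haversine_m(lat, lon, centre[0], centre[1]) <= radius_m]


# Example: which UEs fall inside a 500 m area of interest?
positions = {"MSISDN-1": (40.7130, -74.0062), "MSISDN-2": (40.7500, -74.1000)}
print(ues_in_area(positions, centre=(40.7128, -74.0060), radius_m=500))
```
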
  • FIG. 4 depicts system 100 but with user equipment 102A-B depicting a PSC initiated session, in accordance with some example embodiments.
  • the session is initiated by the PSC 150 including system 100, and depicts audio, video, and/or other information to augment the real-time video obtained from user equipment 106C.
  • the real-time video is augmented with the identity, such as MSISDN identities, of other nearby remote user equipment and their corresponding locations. Selecting the graphical user interface element 466 (which identifies user equipment 106B/MSISDN 2) may trigger a request for real-time video from user equipment 106B as well as other information.
  • FIG. 5 depicts a block diagram of a network node 500, in accordance with some example embodiments.
  • the network node 500 may be configured to assist the PSC 150 including system 100 by providing network assistance information to system 100, routing requests to user equipment 112, 106A-C and/or the like, to obtain video, audio, and/or other information for the augmented reality session at system 100, and forwarding the obtained video, audio, and/or other information for the augmented reality session to system 100.
  • the network node 500 may be located at a core network of a mobile wireless network, at a base station, and/or at other locations as well.
  • a mobile wireless network may have a plurality of the network nodes as well.
  • the network node 500 may include a network interface 502, a processor 20, a memory 504, and a network assisted augmented reality (NAAR) service 550, in accordance with some example embodiments.
  • the network interface 502 may include wired and/or wireless transceivers to enable access to the PSC 150 including system 100, the base stations 110A-C, other networks, the Internet, and/or other nodes.
  • the memory 504 may comprise volatile and/or non-volatile memory including program code, which when executed by at least one processor 20 provides, among other things, the NAAR service 550.
  • the network node may be configured to at least receive a request for assistance information including at least one identifier of at least one remote user equipment at a location associated with an event, generate a response including the at least one identifier of the at least one remote user equipment at the location, and provide the generated response to a user equipment to enable the user equipment to provide an augmented reality view of the event via the at least one user equipment identified in the response.
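
One role such a network node can play is relay: obtain video, audio, or sensor data from a remote user equipment and forward it to system 100. A minimal sketch, with `ue_link` and `psc_link` as assumed transport connections rather than real APIs.

```python
# Hypothetical relay inside the NAAR service 550: pull media/sensor chunks
# from a remote UE and forward them toward the PSC system.

def relay_stream(ue_link, psc_link, what="realtime_video"):
    ue_link.send({"action": "capture", "what": what})
    while True:
        chunk = ue_link.receive()   # next video/audio/sensor chunk
        if chunk is None:           # remote UE ended the stream
            break
        psc_link.send(chunk)        # forward toward system 100
```
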
  • FIG. 6 illustrates a block diagram of an apparatus 10, in accordance with some example embodiments
  • the apparatus 10 may represent a user equipment, such as the user equipment 102A-B which may be a part of system 100. Alternatively or additionally, the apparatus 10 may represent user equipment 112, 106A-C, and so forth. Alternatively or additionally, the apparatus 10 may include, or be able to access, an application, such as an augmented reality application, or a service, such as an augmented reality service. Moreover, this application/service may be able to present augmented reality views with network assistance as shown for example at user interfaces 104A-B and 405A-B.
  • this application/service may be configured to allow system 100 to control the user equipment, such as allow system 100 to activate cameras, microphones, and/or the like, as well as forward the obtained video, audio, and/or other types of data to system 100 via network 155.
  • the apparatus 10 may comprise, or be coupled to, a sensor 99.
  • sensors which can be used as sensor 99 include a camera, a gyroscope, a barometer, and/or other types of sensors or actuators.
  • the apparatus 10 may be stationary and/or mobile.
  • the apparatus 10 may be implemented as a dedicated sensor, IoT sensor, and/or the like.
  • the IoT sensor may be implemented as a traffic camera, a temperature sensor, and/or other type of sensor fixedly attached to a building or traffic light, although the IoT sensor may be mobile as well.
  • the apparatus 10 may include a less powerful processor and/or less memory, when compared to for example a smartphone.
  • the IoT sensor may access the cellular network via another device.
  • the IoT sensor may couple to the cellular network via a first interface, such as Bluetooth or WiFi, to another apparatus having a second interface to the cellular network.
  • the apparatus 10 may include at least one antenna 12 in communication with a transmitter 14 and a receiver 16. Alternatively, transmit and receive antennas may be separate.
  • the apparatus 10 may also include a processor 20 configured to provide signals to and receive signals from the transmitter and receiver, respectively, and to control the functioning of the apparatus.
  • Processor 20 may be configured to control the functioning of the transmitter and receiver by effecting control signaling via electrical leads to the transmitter and receiver.
  • processor 20 may be configured to control other elements of apparatus 10 by effecting control signaling via electrical leads connecting processor 20 to the other elements, such as a display or a memory.
  • the processor 20 may, for example, be embodied in a variety of ways including circuitry, at least one processing core, one or more microprocessors with accompanying digital signal processor(s), one or more processor(s) without an accompanying digital signal processor, one or more coprocessors, one or more multi-core processors, one or more controllers, processing circuitry, one or more computers, various other processing elements including integrated circuits (for example, an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), and/or the like), or some combination thereof. Accordingly, although illustrated in FIG. 6 as a single processor, in some example embodiments the processor 20 may comprise a plurality of processors or processing cores.
  • the apparatus 10 may be capable of operating with one or more air interface standards, communication protocols, modulation types, access types, and/or the like.
  • Signals sent and received by the processor 20 may include signaling information in accordance with an air interface standard of an applicable cellular system, and/or any number of different wireline or wireless networking techniques, comprising but not limited to Wi-Fi, wireless local access network (WLAN) techniques, such as Institute of Electrical and Electronics Engineers (IEEE) 802.11, 802.16, 802.3, ADSL, DOCSIS, and/or the like.
  • these signals may include speech data, user generated data, user requested data, and/or the like.
  • the apparatus 10 and/or a cellular modem therein may be capable of operating in accordance with various first generation (1G) communication protocols, second generation (2G or 2.5G) communication protocols, third-generation (3G) communication protocols, fourth-generation (4G) communication protocols, fifth-generation (5G) communication protocols, Internet Protocol Multimedia Subsystem (IMS) communication protocols (for example, session initiation protocol (SIP)), and/or the like.
  • the apparatus 10 may be capable of operating in accordance with 2G wireless communication protocols IS-136 (Time Division Multiple Access, TDMA), GSM (Global System for Mobile communications), IS-95 (Code Division Multiple Access, CDMA), and/or the like.
  • the apparatus 10 may be capable of operating in accordance with 2.5G wireless communication protocols General Packet Radio Service (GPRS), Enhanced Data GSM Environment (EDGE), and/or the like. Further, for example, the apparatus 10 may be capable of operating in accordance with 3G wireless communication protocols, such as Universal Mobile Telecommunications System (UMTS), Code Division Multiple Access 2000 (CDMA2000), Wideband Code Division Multiple Access (WCDMA), Time Division-Synchronous Code Division Multiple Access (TD-SCDMA), and/or the like. The apparatus 10 may be additionally capable of operating in accordance with 3.9G wireless communication protocols, such as Long Term Evolution (LTE), Evolved Universal Terrestrial Radio Access Network (E-UTRAN), and/or the like. Additionally, for example, the apparatus 10 may be capable of operating in accordance with 4G wireless communication protocols, such as LTE Advanced, 5G, and/or the like as well as similar wireless communication protocols that may be subsequently developed.
  • the processor 20 may include circuitry for implementing audio/video and logic functions of apparatus 10.
  • the processor 20 may comprise a digital signal processor device, a microprocessor device, an analog-to-digital converter, a digital-to-analog converter, and/or the like. Control and signal processing functions of the apparatus 10 may be allocated between these devices according to their respective capabilities.
  • the processor 20 may additionally comprise an internal voice coder (VC) 20a, an internal data modem (DM) 20b, and/or the like.
  • the processor 20 may include functionality to operate one or more software programs, which may be stored in memory. In general, processor 20 and stored software instructions may be configured to cause apparatus 10 to perform actions.
  • processor 20 may be capable of operating a connectivity program, such as a web browser.
  • the connectivity program may allow the apparatus 10 to transmit and receive web content, such as location-based content, according to a protocol, such as wireless application protocol, WAP, hypertext transfer protocol, HTTP, and/or the like.
  • Apparatus 10 may also comprise a user interface including, for example, an earphone or speaker 24, a ringer 22, a microphone 26, a display 28, a user input interface, and/or the like, which may be operationally coupled to the processor 20.
  • the display 28 may, as noted above, include a touch sensitive display, where a user may touch and/or gesture to make selections, enter values, and/or the like.
  • the processor 20 may also include user interface circuitry configured to control at least some functions of one or more elements of the user interface, such as the speaker 24, the ringer 22, the microphone 26, the display 28, and/or the like.
  • the processor 20 and/or user interface circuitry comprising the processor 20 may be configured to control one or more functions of one or more elements of the user interface through computer program instructions, for example, software and/or firmware, stored on a memory accessible to the processor 20, for example, volatile memory 40, non-volatile memory 42, and/or the like.
  • the apparatus 10 may include a battery for powering various circuits related to the mobile terminal, for example, a circuit to provide mechanical vibration as a detectable output.
  • the user input interface may comprise devices allowing the apparatus 10 to receive data, such as a keypad 30 (which can be a virtual keyboard presented on display 28 or an externally coupled keyboard) and/or other input devices.
  • apparatus 10 may also include one or more mechanisms for sharing and/or obtaining data.
  • the apparatus 10 may include a short-range radio frequency (RF) transceiver and/or interrogator 64, so data may be shared with and/or obtained from electronic devices in accordance with RF techniques.
  • the apparatus 10 may include other short-range transceivers, such as an infrared (IR) transceiver 66, a Bluetooth™ (BT) transceiver 68 operating using Bluetooth™ wireless technology, a wireless universal serial bus (USB) transceiver 70, a Bluetooth™ Low Energy transceiver, a ZigBee transceiver, an ANT transceiver, a cellular device-to-device transceiver, a wireless local area link transceiver, and/or any other short-range radio technology.
  • Apparatus 10 and, in particular, the short-range transceiver may be capable of transmitting data to and/or receiving data from electronic devices within the proximity of the apparatus, such as within 10 meters, for example.
  • the apparatus 10 including the Wi-Fi or wireless local area networking modem may also be capable of transmitting and/or receiving data from electronic devices according to various wireless networking techniques, including 6LoWpan, Wi-Fi, Wi-Fi low power, WLAN techniques such as IEEE 802.11 techniques, IEEE 802.15 techniques, IEEE 802.16 techniques, and/or the like.
  • the apparatus 10 may comprise memory, such as a subscriber identity module (SIM) 38, a removable user identity module (R-UIM), an eUICC, a UICC, and/or the like, which may store information elements related to a mobile subscriber.
  • the apparatus 10 may include volatile memory 40 and/or nonvolatile memory 42.
  • volatile memory 40 may include Random Access Memory (RAM) including dynamic and/or static RAM, on-chip or off-chip cache memory, and/or the like.
  • Non-volatile memory 42 which may be embedded and/or removable, may include, for example, read-only memory, flash memory, magnetic storage devices, for example, hard disks, floppy disk drives, magnetic tape, optical disc drives and/or media, non-volatile random access memory (NVRAM), and/or the like. Like volatile memory 40, non-volatile memory 42 may include a cache area for temporary storage of data. At least part of the volatile and/or non-volatile memory may be embedded in processor 20.
  • the memories may store one or more software programs, instructions, pieces of information, data, and/or the like which may be used by the apparatus for performing operations disclosed herein including, for example, send a request for assistance information to enable an augmented reality view of an area, receive a response to the request (the response including an identifier for a remote user equipment associated with the area and/or a location for the remote user equipment), receive data from the remote user equipment identified in the response (the data providing at least a portion of the augmented reality view of the area), present, based on at least the received data, the augmented reality view of the area and/or the location, and/or other operations disclosed herein.
  • the memories may store one or more software programs, instructions, pieces of information, data, and/or the like which may be used by the apparatus for performing operations disclosed herein including, for example, receive a request for assistance information to enable an augmented reality view of an area, generate a response including an identifier for a remote user equipment associated with the area and/or a location for the remote user equipment, provide, based on data provided by at least the remote user equipment identified in the response, the generated response to enable presentation of the augmented reality view of the area and/or the location, and/or other operations disclosed herein.
  • the memories may comprise an identifier, such as an international mobile equipment identification (IMEI) code, capable of uniquely identifying apparatus 10.
  • the processor 20 may be configured using computer code stored at memory 40 and/or 42 to control and/or provide one or more aspects disclosed herein (see, for example, process 600, 700, and/or other operations disclosed herein).
  • the processor 20 may be configured using computer code stored at memory 40 and/or 42 to at least send a request for assistance information to enable an augmented reality view of an area, receive a response to the request, receive data from the remote user equipment identified in the response, present, based on at least the received data, the augmented reality view of the area and/or the location, and/or other operations disclosed herein.
  • the processor 20 may be configured using computer code stored at memory 40 and/or 42 to at least receive a request for assistance information to enable an augmented reality view of an area, generate a response including an identifier for a remote user equipment associated with the area and/or a location for the remote user equipment, provide, based on data provided by at least the remote user equipment identified in the response, the generated response, and/or other operations disclosed herein.
  • a "computer-readable medium" may be any non-transitory media that can contain, store, communicate, propagate or transport the instructions for use by or in connection with an instruction execution system, apparatus, or device, such as a computer or data processor circuitry, with examples depicted at FIG. 6. A computer-readable medium may comprise a non-transitory computer-readable storage medium that may be any media that can contain or store the instructions for use by or in connection with an instruction execution system, apparatus, or device, such as a computer.
  • the base stations and user equipment (or one or more components therein) and/or the processes described herein can be implemented using one or more of the following: a processor executing program code, an application-specific integrated circuit (ASIC), a digital signal processor (DSP), an embedded processor, a field programmable gate array (FPGA), and/or combinations thereof.
  • These various implementations may include implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, coupled to receive data and instructions from, and to transmit data and instructions to, a storage system, at least one input device, and at least one output device.
  • These computer programs (also known as programs, software, software applications, applications, components, program code, or code) include machine instructions for a programmable processor.
  • computer-readable medium refers to any computer program product, machine-readable medium, computer-readable storage medium, apparatus and/or device (for example, magnetic discs, optical disks, memory, Programmable Logic Devices (PLDs)) used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions.
  • systems are also described herein that may include a processor and a memory coupled to the processor.
  • the memory may include one or more programs that cause the processor to perform one or more of the operations described herein.

Abstract

Methods and apparatus, including computer program products, are provided for network assisted augmented reality. In some example embodiments, there may be provided a method that includes sending a request for assistance information, the assistance information enabling an augmented reality view of an area; receiving a response to the request, the response including an identifier for a remote user equipment associated with the area and/or a location for the remote user equipment; receiving data from the remote user equipment identified in the response, the data providing at least a portion of the augmented reality view of the area; and presenting, based on at least the received data, the augmented reality view of the area and/or the location. Related systems, methods, and articles of manufacture are also described.

Description

NETWORK ASSISTED AUGMENTED REALITY
Field
[001] The subject matter described herein relates to augmented reality.
Background
[002] Augmented reality (AR) refers to a view of a physical, real-world scene that has somehow been augmented by a computer with additional information, such as video, location or map information, photos, graphics, sensor data (including Internet of Things (IoT) sensor data), and/or the like. For example, a user may look at a display presented by a user equipment, such as a smartphone camera, a tablet, a personal computer, and/or other processor-based devices. This display may depict an actual, real-world scene of a city street full of people, but this reality can be "augmented" with, for example, a graphic (which is presented with the street view) indicating the location of a nearby coffee shop, or can be augmented with other types of data, such as sensor data, information regarding environmental conditions (for example, temperature, wind, humidity, barometric pressure, and/or the like), altitude, and/or other types of data as well. In some cases, the augmented reality can be implemented as an immersive, virtual reality (VR), in which a three-dimensional (3D) view may be used, although the augmented reality experience can be provided without such a 3D VR experience as well.
Summary
[003] Methods and apparatus, including computer program products, are provided for network assisted augmented reality.
[004] In some example embodiments, there may be provided a method that includes sending a request for assistance information, the assistance information enabling an augmented reality view of an area; receiving a response to the request, the response including an identifier for a remote user equipment associated with the area and/or a location for the remote user equipment; receiving data from the remote user equipment identified in the response, the data providing at least a portion of the augmented reality view of the area; and presenting, based on at least the received data, the augmented reality view of the area and/or the location.
[005] In some variations, one or more of the features disclosed herein including the following features can optionally be included in any feasible combination. The request may be sent in response to a call for emergency assistance and/or a message for emergency assistance. The remote user equipment may originate the call or the message. The received response may include a plurality of identifiers for a plurality of remote user equipment associated with the area and/or the location of the remote user equipment originating the call or the message. The received response may further include the data from the plurality of remote user equipment. The request for assistance information may include a request to identify the plurality of remote user equipment associated with the area and/or the location. The response may further include subscription information. The subscription information may be checked to verify the remote user equipment and/or the plurality of remote user equipment allow being accessed in order to obtain the data. Based on the subscription information allowing the access, the remote user equipment and/or the plurality of remote user equipment may be accessed to obtain the data for at least the portion of the augmented reality view of the area and/or the location. The remote user equipment and/or the plurality of remote user equipment may be requested to capture and to return real-time video, the location information, and/or other information. The accessing may include sending a message to the remote user equipment and/or the plurality of remote user equipment to perform an action and/or to control at least a portion of the remote user equipment and/or the plurality of remote user equipment. The augmented reality view of the area and/or the location may include the data including the real-time video augmented with the location information, and/or the other information. A user equipment associated with a public safety center may access the remote user equipment and/or the plurality of remote user equipment to obtain the data, and/or the data may be received from a network node that accesses the remote user equipment and/or the plurality of remote user equipment to obtain the data. The network node may include a server included in, or coupled to, a cellular network.
[006] In some example embodiments, there may be provided a method that includes receiving a request for assistance information to enable an augmented reality view of an area; generating a response including an identifier for a remote user equipment associated with the area and/or a location for the remote user equipment; and providing, based on data provided by at least the remote user equipment identified in the response, the generated response to enable presentation of the augmented reality view of the area and/or the location.
[007] In some variations, one or more of the features disclosed herein including the following features can optionally be included in any feasible combination. The request may be received in response to a call for emergency assistance and/or a message for emergency assistance. The remote user equipment may originate the call or the message. The generated response may include a plurality of identifiers for a plurality of remote user equipment associated with the area and/or the location of the remote user equipment originating the call or the message. The generated response may further include the data from the plurality of remote user equipment. The request for assistance information may include a request to identify the plurality of remote user equipment associated with the area and/or the location. The generated response may further include subscription information. The subscription information may be checked to verify the remote user equipment and/or the plurality of remote user equipment allow being accessed in order to obtain the data. Based on the subscription information allowing the access, the remote user equipment and/or the plurality of remote user equipment may be accessed to obtain the data for at least the portion of the augmented reality view of the area and/or the location. A network node may access the remote user equipment and/or the plurality of remote user equipment to obtain the data, and/or the network node may forward the data to a user equipment associated with a public safety center that accesses the remote user equipment and/or the plurality of remote user equipment to obtain the data. The network node may include a server included in, or coupled to, a cellular network, the network node further configured to at least receive the request, generate the response, and/or provide the generated response.
[008] The above-noted aspects and features may be implemented in systems, apparatus, methods, and/or articles depending on the desired configuration. The details of one or more variations of the subject matter described herein are set forth in the accompanying drawings and the description below. Features and advantages of the subject matter described herein will be apparent from the description and drawings, and from the claims.

Description of Drawings
[009] In the drawings,
[010] FIG. 1A depicts an example of user equipment including augmented reality user interfaces in which network assistance is provided, in accordance with some example embodiments;
[011] FIG. 1B depicts an example of a system for network assisted augmented reality, in accordance with some example embodiments;
[012] FIGs. 2A-G depict an example signaling flow sequence for network assisted augmented reality, in accordance with some example embodiments;
[013] FIGs. 3A-F depict another example signaling flow sequence for network assisted augmented reality, in accordance with some example embodiments;
[014] FIG. 4 depicts another example of user equipment including augmented reality user interfaces in which network assistance is provided, in accordance with some example embodiments;
[015] FIG. 5 depicts an example of a network node for providing assistance information for augmented reality, in accordance with some example embodiments; and
[016] FIG. 6 depicts an example of an apparatus, in accordance with some example embodiments.
[017] Like labels are used to refer to same or similar items in the drawings.
Detailed Description
[018] A public safety center (PSC) may be a center for responding to events such as emergencies or other types of events. For example, a PSC may be contacted in response to an emergency call, such as a call related to medical attention, public safety, security, fire, and/or other services provided to a community or a region. It may be useful for a PSC to have some "insight" into an area associated with an event, such as the emergency, although the PSC may seek this insight for other purposes as well, such as prevention. For example, it may be useful to provide the PSC with the "insight" configured as an augmented reality (AR) view into an area associated with the event, and this AR view of the area may include real-time video, photographic, audio, and/or other types of data (for example, weather, temperature, altitude, barometric pressure, wind, sensor data, IoT sensor data, and/or the like).

[019] In some example embodiments, a user equipment (e.g., a tablet, a computer, a smartphone, and/or the like) at a location, such as the PSC, may include an application, such as an AR application, to provide insight into an area. This application may be assisted by the network, such as a mobile wireless network or a network node therein. This assistance may include information to enable the PSC to obtain data for the augmented reality session. Although some of the examples herein refer to the PSC including an AR application for providing an AR session, the AR application may be implemented as a service, such as a cloud-based service at a server coupled to the Internet. Furthermore, the AR application may be implemented at locations other than the PSC.
[020] To illustrate by way of an example, the PSC's user equipment may provide the PSC with an augmented reality view into an area of interest. The PSC's user equipment may provide the augmented reality service with real-time video augmented with other data including photos, audio, geolocation information, time information, and/or other data (e.g., weather, temperature, altitude, barometric pressure, wind, sensor data, IoT sensor data, and/or the like) in order to respond to, for example, an event in the area of interest. The event may be any type of event including an emergency, a natural disaster (e.g., earthquake, fire, tsunami, floods, volcanic eruption, storm, etc.), a security or safety event (e.g., an accident, crime, etc.), incident prevention (e.g., any kind of known public demonstration or mass-participation event, evolving meteorological conditions, national infrastructure incidents including the telecom network infrastructure itself, etc.), tracking of one or more persons for security or law enforcement reasons, monitoring of an area of interest, and/or other types of events. The network assisted augmented reality may also be used for non-event-driven situations, such as general surveillance, monitoring, preventative reasons, and/or other situations/reasons.
[021] FIG. 1A depicts an example of a system 100 including a viewer 105 and user equipment 102A-B presenting user interfaces 104A-B, in accordance with some example embodiments. The user equipment 102A-B may be located at a location, such as a PSC and/or at other locations, to provide network assisted augmented reality, in accordance with some example embodiments.
[022] In the example of FIG. 1A, the PSC (which may include the system 100, which is described further below) may receive a message regarding an event, such as a call, an SMS message, an email, and/or other indication. In response, the PSC's user equipment 102A or B may request assistance information from the network (e.g., a wireless mobile network or a node therein).
[023] In some example embodiments, the assistance information for augmented reality may include the location of the caller (or the caller's user equipment), the location of the event, the identity of the caller (e.g., the caller's MSISDN, IMSI, Internet Protocol (IP) address, and/or the like), and the identity of one or more other user equipment in the vicinity of the event or caller (e.g., a list of at least one MSISDN, IMSI, and/or the like).
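For illustration only, the assistance information described above might be represented as in the following minimal sketch; the structure and all field names are assumptions made for this example and are not defined by any standard or by the embodiments themselves.

    # Hypothetical representation of the assistance information; all field
    # names are illustrative assumptions, not part of any standard.
    from dataclasses import dataclass, field
    from typing import List, Optional, Tuple

    @dataclass
    class UeIdentity:
        msisdn: Optional[str] = None
        imsi: Optional[str] = None
        ip_address: Optional[str] = None

    @dataclass
    class AssistanceInfo:
        caller: UeIdentity                    # identity of the calling UE
        caller_location: Tuple[float, float]  # (latitude, longitude)
        event_location: Tuple[float, float]
        nearby_ues: List[UeIdentity] = field(default_factory=list)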
[024] With the assistance information, the PSC's user equipment 102A or B may, via the network, access one or more user equipment in an area in the vicinity of the event to determine whether those user equipment will capture and provide to the PSC's user equipment 102A or B certain information regarding the event. For example, the PSC's user equipment 102A or B may request that the neighboring user equipment in an area associated with the event capture and forward to the PSC real-time video, photographic, audio, and/or other types of data (for example, weather, temperature, altitude, barometric pressure, wind, sensor data including IoT sensor data, and/or the like). Moreover, the neighboring user equipment in the vicinity of the event may allow the PSC to control what information the one or more user equipment in the area of the event should capture, such as capturing photos or video using the front facing camera, turning on a microphone to enable audio recording, providing geolocation information, and/or the like. Alternatively or additionally, the PSC's user equipment 102A or B may send a message to a user equipment in the area associated with the event to perform an action. For example, the PSC's user equipment 102A or B may send a message (for example, an SMS message, a command, and/or the like, which may include text and/or graphics) to the user equipment instructing a pan of the camera up or down or instructing some other operation. In some example embodiments, the PSC's user equipment 102A or B may access the one or more remote user equipment in the area of interest, such as the event. This access may be direct (e.g., by sending messages or making a call directly to the one or more remote user equipment to obtain the data, such as real-time video and/or the like), and/or the access may be via a network node, in which case the network node may obtain the data for the augmented reality view from the one or more remote user equipment.

[025] In the example of FIG. 1A, the user interface 104A may present some of the network assistance information provided by the network. Here, the network assistance information may include the geo-location of the event (e.g., the location of the "HELP" request depicted at user interface 104A), the geo-location of user equipment 112 (which may be requesting the HELP), the geo-location of user equipment 106A-C associated with the event (e.g., in the area in the vicinity of the event), the identity (e.g., MSISDN, IMSI, IP address, etc.) of user equipment 112, the identity of user equipment 106A-C associated with the event, and/or other assistance information.
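As a rough sketch of the capture request described in paragraph [024] above, a message sent via the network to a remote user equipment might look as follows; the field names, the JSON encoding, and the example MSISDN are illustrative assumptions rather than a defined protocol.

    # Hypothetical capture request asking a remote UE to gather and return
    # data; the field names and encoding are illustrative assumptions.
    import json

    def build_capture_request(target_msisdn: str) -> str:
        request = {
            "target": target_msisdn,
            "capture": ["video", "audio", "geolocation"],  # what to collect
            "camera": "front",   # e.g., use the front-facing camera
            "realtime": True,    # stream rather than store-and-forward
        }
        return json.dumps(request)

    # Example: request real-time data from the UE selected at 104A.
    print(build_capture_request("MSISDN_3"))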
[026] In the example of FIG. 1A, a user (for example, a PSC operator or other type of user, who may be wearing viewer 105) may select the graphical user interface element representative of a source of data which can be used to augment reality, which in this example is user equipment 106C presented at 104A. When this is the case, the PSC's user equipment 102A or B may, via the network, access user equipment 106C to obtain a real-time video feed from user equipment 106C. This video feed may be presented at user interface 104B, and can be viewed by a user (who, as noted, may be wearing the viewer 105, which can allow the user to "zoom in" on what is presented at 104A-B). The user interface 104B depicts the augmented reality view of the event. Specifically, the "insight" may include real-time video (from the perspective of the selected user equipment 106C) as well as an overlay depicting data such as the locations and identities of the other user equipment (e.g., 106A, B, and so forth) and the location of the event (e.g., 112).
[027] In the example of FIG. 1A, a selection of 106B may request remote user equipment 106B having MSISDN 2 to provide a real-time video feed as well as other types of data. For example, user interfaces 104A or B may present, as part of the augmented reality, other types of data provided by other devices, such as user equipment 106A-C as well as other devices. For example, the augmented reality may include weather, temperature, altitude, barometric pressure, wind, sensor data, IoT sensor data, and/or other types of data collected by the user equipment and/or other devices.
[028] Although some of the examples refer to the user, such as the PSC operator, using the viewer 105 as a virtual reality 3D viewer, this is merely an example. The user/PSC operator may use other types of devices to view and/or consume the video, audio, location, and/or other types of data (including sensor data/IoT sensor data) which can be gathered for the area associated with the event. Moreover, the user equipment depicted at FIG. 1A may be implemented as a sensor, such as an IoT sensor. Furthermore, some of the user equipment may be stationary, while others may be mobile.
[029] Referring again to the user interface views 104A-B, the view 104A depicts a "PSC operator view" providing a zoomed out view of the area and identifying potential additional sources of data that can be accessed to provide data for VR augmentation. This view 104A may include data from one or more augmented reality data sources, such as user equipment 106A-C, as well as other devices. As noted, a selection of, for example, user equipment 106C requests data from user equipment 106C, which in this example is audio, video, and/or other types of data which can be presented at a user interface such as user interface 104B (which may be rendered in two dimensions or in a more immersive format, such as 3D or VR). User interface 104B depicts in the example of FIG. 1A a zoomed in view of the area associated with the event from the perspective of the selected device, which in this example is user equipment 106C.
[030] In addition, the user equipment 102A or B may, via the network, request or command user equipment 106C to perform other operations, such as configure a camera for zoom, panoramic, and/or 360 degree video capture, turn on a microphone and provide the audio recording from the microphone at user equipment 106C, turn on a flash of the user equipment 106C, message the user of the user equipment 106C to perform certain actions, provide time information/stamps, provide location information, provide a list of nearby Bluetooth and/or WiFi interfaces (e.g., the WiFi or Bluetooth interfaces of the other nearby user equipment 106A, 112, and/or the like). The user equipment 102A or B may, via the network, access and/or control other user equipment, such as user equipment 106A, and so forth.
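The control operations listed above could be modeled, purely as a sketch, with a small command vocabulary; the command names, message format, and example MSISDN below are hypothetical and chosen only to mirror the operations just described.

    # Hypothetical command vocabulary mirroring the operations above; the
    # names and message format are illustrative assumptions.
    from enum import Enum

    class UeCommand(Enum):
        SET_CAMERA_MODE = "set_camera_mode"    # zoom, panoramic, 360-degree
        ENABLE_MICROPHONE = "enable_microphone"
        ENABLE_FLASH = "enable_flash"
        MESSAGE_USER = "message_user"          # e.g., "pan the camera up"
        REPORT_TIME_AND_LOCATION = "report_time_and_location"
        LIST_NEARBY_INTERFACES = "list_nearby_interfaces"  # Bluetooth/WiFi

    def build_command(target: str, command: UeCommand, **params) -> dict:
        return {"target": target, "command": command.value, "params": params}

    # Example: request a 360-degree capture from user equipment 106C.
    msg = build_command("MSISDN_3", UeCommand.SET_CAMERA_MODE, mode="360")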
[031] Although some of the examples herein refer to the PSC's user equipment accessing, requesting, and/or commanding the remote user equipment to obtain the data, a network node may also be configured to obtain the data from the remote user equipment and then forward the obtained data to the PSC's user equipment.
[032] Although FIG. 1A depicts two user equipment 102A-B at the PSC, other quantities may be used as well. For example, a single user equipment may be used to present the user interface views 104A-B to the user, such as the PSC operator. Furthermore, user equipment 102A-B may be in locations other than the PSC.
[033] FIG. 1B depicts a system 199, in accordance with some example embodiments. The system 199 may include a PSC 150 including system 100, a network 155 (e.g., a cellular network, the Internet, and/or a network node therein), at least one wireless access point 110A-C (e.g., WiFi access points, evolved node B base stations, and/or the like), and at least one user equipment 112 and 106A-C.
[034] The network 155 may assist the PSC's system 100 by providing information for building different views of augmented reality. The network infrastructure (which may include access points 110A-C and/or at least one server) may provide to the PSC the assistance information. To that end, the network 155 may provide to the system 100, via wired and/or wireless links 160, the following assistance information: geo-location information (e.g., latitude, longitude, altitude, cell identifier, street addresses, and/or the like) associated with an event area, geo-location information of user equipment 112 associated with the event, geo-location information of user equipment 106A-C in an area such as in the vicinity of the event, other identifiers (e.g., MSISDN, IMSI, IP address, and/or the like) in an area associated with user equipment 112 and/or the event, cell parameters, and/or other types of information. This other data/information may include temperature, altitude, barometric pressure, movement (based on, for example, an embedded gyroscope/motion sensor), terminal type (e.g., smartphone, tablet, sensor, video camera, watch/wearables, IoT devices), and/or other types of information. Alternatively or additionally, the network may provide to the system 100 subscription information indicative of whether user equipment in an area 276 (see, e.g., FIG. 1A) associated with the event is subscribed to network assisted augmented reality and, as such, allows the PSC and the network to access or obtain data from the user equipment for purposes of preventively or reactively collecting the data, which may assist in the provisioning of augmented reality at system 100.
[035] Moreover, the network 155 may route, via wired and/or wireless backhaul 162 and/or wireless link 164, to user equipment 112, the system 100's access, request, or command to obtain video, audio, and/or other types of data from a remote user equipment, such as user equipment 112, 106A-C, and/or the like. This user equipment obtained video, audio, and/or other information may then be routed via the network to the PSC including system 100 to provide the augmented reality service as shown at user interfaces 104A-B. The network may thus provide some degree of fault tolerance and distribution of the PSC system 100.
[036] Although FIG. 1B depicts system 100 at the PSC 150, the user equipment 102A or 102B of system 100 may be located at other locations. For example, user equipment 102B may be implemented in other locations including within an area of interest, such as in the vicinity of the event.
[037] FIGs. 2A-2G depict an example of a signaling flow, in accordance with some example embodiments. In the example of FIGs. 2A-2G, the process depicted in the signaling flow is initiated by a user, such as a caller reporting an emergency, requesting help, and/or for other reasons.
[038] At FIG. 2A and 205, the user equipment 112 may make a call, such as an emergency call, to the PSC 150 including system 100, in accordance with some example embodiments. For example, the user equipment 112 may make a call (e.g., a voice call, an SMS text, an email, and/or the like) to the PSC 150 to report an event, such as an emergency. This call may be forwarded to the PSC 150 via a wireless uplink (and/or downlink) 164, base station 110A, backhaul 162, network 155, and link 160. In response to the call at 205, the network 155 may establish, as shown at FIG. 2B, a connection 210, such as an emergency call connection 210, to couple, via network 155, the PSC 150 and the user equipment 112, in accordance with some example embodiments.
[039] Although the previous example refers to the network establishing the connection at 210 in response to the call, the system 100 may trigger the process for other reasons as well. For example, if the PSC 150 including system 100 becomes aware of an event, the PSC including system 100 may initiate connection 210 without receiving the call. Alternatively or additionally, the PSC 150 including system 100 may proactively (for example, before an event) initiate connection 210 without receiving the call. Alternatively or additionally, a user equipment, such as user equipment 106A for example, may be associated with the PSC 150. For example, user equipment 106A may be considered a remote unit of the PSC, in which case the user equipment 106A may be accessed at any time (without the call at 205) and without the need for subscription verification.
[040] At FIG. 2C and 215, the PSC 150 including the system 100 may request, from the network 155, assistance information associated with the call 205, in accordance with some example embodiments. In response to the emergency call at 205 and/or the connection establishment 210, for example, the PSC including the system 100 may request, via wired and/or wireless links, assistance information such as the identities of remote user equipment in the vicinity of the event, such as area 276. Alternatively or additionally, the requested assistance information may include geo-location information for the user equipment 112, geo-location information for neighboring user equipment 106A-C in the vicinity of the event, terminal information (e.g., regarding the type of user equipment being used), measurements, identity information for the user equipment 112, identity information for user equipment 106A-C in the vicinity of the event, subscription information, and/or the like.
[041] At FIG. 2D and 220, the network 155 may forward, to the PSC 150 including system 100, the requested assistance information, in accordance with some example embodiments. In response to the request 215, the network may gather and/or provide assistance information including the identities, such as the MSISDNs, of the user equipment 106A-C (as well as other user equipment) in the vicinity of the event, such as area 276. Alternatively or additionally, the assistance information provided to system 100 may include the location of the user equipment 112 making the call, the locations of at least one other user equipment 106A-C (as well as the others) within a certain area 276 in the vicinity of the event, the identity of the user equipment 112 making the call, terminal type, measurements, and/or other assistance information which can be used to provide augmented reality at system 100.
[042] At FIG. 2E and 225, the system 100 may check the subscription, such as the network assisted augmented reality subscription, of the user equipment 112, 106A-C included in the assistance information provided at 220, in accordance with some example embodiments. For example, the subscription information may indicate which of the identified user equipment in area 276 allow the PSC 150 including system 100 to obtain audio, video, photo, and/or other types of data directly from the user equipment to enable the PSC including system 100 to have an augmented reality look into the event. As can be seen in the example of FIGs. 2D and 2E, not all of the user equipment in the area of interest 276 subscribe to the augmented reality service being accessed by system 100. The subscriptions may be verified by the network 155 and, if subscribed, an indication of verification can be provided to system 100. In some implementations, the subscription verification may not be performed (for example, if allowed for an emergency by law or for other reasons).

[043] At FIG. 2F and 230, the system 100 may access one or more user equipment 112, 106C to request obtaining audio, video, photo, and/or other types of data, in accordance with some example embodiments. As noted, system 100 including user equipment 102A or B may request that the one or more user equipment in an area associated with the event capture and forward video, audio, location, and/or other types of data which can be gathered in real-time at, or in the vicinity of, the event. And as noted, the one or more user equipment in the area of interest may allow the system 100 including user equipment 102A or B to control what information the one or more user equipment in the vicinity of the event should capture. Alternatively or additionally, the PSC's user equipment 102A or B may, as noted, send a message to the user equipment in the vicinity of the event to perform an action. As noted, a network node may access the one or more user equipment 112, 106C, and/or the like to obtain the data, such as audio, photo, and/or other types of data, in accordance with some example embodiments. When this is the case, the network node may forward the obtained data to the system 100.
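A minimal sketch of the subscription check at 225, which gates the access at 230, follows; the subscription record format and the "naar_access" flag name are assumptions made only for illustration.

    # Hypothetical subscription check: only UEs whose subscription permits
    # network assisted AR access are contacted at 230. The record format
    # and the "naar_access" flag are illustrative assumptions.
    def filter_accessible_ues(candidates, subscriptions):
        """Return identifiers of UEs that allow being accessed."""
        return [ue for ue in candidates
                if subscriptions.get(ue, {}).get("naar_access", False)]

    subscriptions = {
        "MSISDN_1": {"naar_access": True},
        "MSISDN_2": {"naar_access": False},  # not subscribed; not accessed
        "MSISDN_3": {"naar_access": True},
    }
    accessible = filter_accessible_ues(
        ["MSISDN_1", "MSISDN_2", "MSISDN_3"], subscriptions)
    # accessible == ["MSISDN_1", "MSISDN_3"]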
[044] At FIG. 2G and 235, the system 100 including user equipment 102A or B may receive, from one or more user equipment 112, 106C, the requested audio, video, photo, and/or other types of data, in accordance with some example embodiments. This information may be used to provide an augmented reality view of the event. An example of an augmented reality view is depicted at user interface 104B at FIG. 1A.
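Purely for illustration, the augmented reality view at 235 could be assembled from the received feeds roughly as follows; rendering and compositing are omitted, and the data structures are assumptions rather than part of the embodiments.

    # Hypothetical assembly of the AR view: the selected UE's real-time
    # video is annotated with the identities and locations of nearby UEs.
    def build_ar_view(base_video_feed: str, nearby_ues: list) -> dict:
        overlays = [{"label": ue["msisdn"], "position": ue["location"]}
                    for ue in nearby_ues]
        return {"base_video": base_video_feed, "overlays": overlays}

    view = build_ar_view("feed-106C", [
        {"msisdn": "MSISDN_1", "location": (40.7128, -74.0060)},
        {"msisdn": "MSISDN_2", "location": (40.7130, -74.0050)},
    ])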
[045] FIGs. 3A-3F depict an example of a signaling flow, in accordance with some example embodiments. The process depicted in the signaling flow of FIGs. 3A-3F is similar in some respects to the flow depicted at FIGs. 2A-2G, but the flow of FIGs. 3A-3F is initiated by a request from the PSC 150 including system 100.
[046] At FIG. 3A and 305, the PSC 150 including system 100 may need information for a certain area of interest 276 in order to provide an augmented reality look into the area 276, in accordance with some example embodiments. For example, the PSC may receive a call regarding an event, although the PSC may want to monitor area 276 without receiving a call as well.
[047] At FIG. 3B and 310, the PSC 150 including system 100 may send, in response to this need, to the network 155 a request for assistance information, such as augmented reality assistance information for a given location or area, such as area 276, in accordance with some example embodiments. The request sent to the network may include a location, such as area 276, and an indication that the identity of any user equipment at that location be provided to enable network assisted augmented reality at the PSC. To illustrate further, a selection may be performed at user interface 104A (see, e.g., FIG. 1A) to identify a location, such as an area of interest 276, by, for example, tracing the area 276 on the user interface 104A, selecting a street address (or a point of interest, such as a building, etc.), and/or selecting a point, such as the HELP graphical user interface element presented at user interface 104A.
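A sketch of such a PSC-initiated assistance request over a traced area might look as follows; the polygon encoding, field names, and coordinates are illustrative assumptions only.

    # Hypothetical PSC-initiated assistance request for a traced area 276;
    # the field names and polygon encoding are illustrative assumptions.
    area_request = {
        "type": "assistance_request",
        "area": {
            "polygon": [  # vertices (lat, lon) traced at user interface 104A
                (40.7128, -74.0060),
                (40.7131, -74.0048),
                (40.7119, -74.0042),
            ],
        },
        "want": ["ue_identities", "ue_locations", "subscription_info"],
    }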
[048] At FIG. 3C and 315, the network 155 may provide to the PSC 150 including system 100 assistance information regarding a given event, user, user equipment, and/or area, in accordance with some example embodiments. For example, the network 155 may provide the requested identities of the remote user equipment in area 276. These identities may include the MSISDNs, IMSIs, IP addresses, and/or other information identifying the user equipment in area 276. The network may provide other network assistance information to system 100, such as the subscription information for the user equipment in area 276.
[049] At FIG. 3D and 320, the PSC 150 including system 100 may check the subscription, such as the network assisted augmented reality (NAAR) subscription, of the user equipment 112, 106A-C identified by the network at 315, in accordance with some example embodiments. For example, the subscription information may indicate which of the identified user equipment in area 276 allow the PSC 150 including system 100 to obtain audio, video, photo, and/or other information directly from the user equipment to enable the PSC including system 100 to have an augmented reality look into the event, such as an emergency. Alternatively or additionally, the network 155 may check the subscription information, in which case only user equipment in area 276 that subscribe to the network assisted augmented reality will be identified at 315 to the system 100.
[050] At FIG. 3E and 330, the system 100 at PSC 150 may access one or more user equipment 112, 106A-C to request obtaining audio, video, photo, and/or other information, in accordance with some example embodiments. For example, the network 155 may access the user equipment as noted above with respect to 230. As noted, a network node may access the one or more user equipment to obtain the data, such as audio, photo, and/or other types of data, in accordance with some example embodiments. When this is the case, the network node may forward the obtained data to the PSC 150.

[051] At FIG. 3F and 335, the system 100 at PSC 150 may receive, from one or more user equipment 112, 106C, the requested audio, video, photo, and/or other information, in accordance with some example embodiments. For example, the network 155 may receive the information as noted above with respect to 235.
[052] FIG. 4 depicts system 100, but with user equipment 102A-B presenting a PSC-initiated session, in accordance with some example embodiments. In the example of FIG. 4, the session is initiated by the PSC 150 including system 100, and depicts audio, video, and/or other information used to augment the real-time video obtained from user equipment 106C. In the user interface 405B, the real-time video is augmented with the identities, such as MSISDN identities, of other nearby remote user equipment and their corresponding locations. Selecting the graphical user interface element 466 (which identifies user equipment 106B/MSISDN 2) may trigger a request for real-time video from user equipment 106B as well as other information.
[053] FIG. 5 depicts a block diagram of a network node 500, in accordance with some example embodiments. The network node 500 may be configured to assist the PSC 150 including system 100 by providing network assistance information to system 100, routing requests to user equipment 112, 106A-C and/or the like, to obtain video, audio, and/or other information for the augmented reality session at system 100, and forwarding the obtained video, audio, and/or other information for the augmented reality session to system 100. The network node 500 may be located at a core network of a mobile wireless network, at a base station, and/or at other locations as well. Moreover, a mobile wireless network may have a plurality of the network nodes as well.
[054] The network node 500 may include a network interface 502, a processor 20, a memory 504, and a network assisted augmented reality (NAAR) service 550, in accordance with some example embodiments. The network interface 502 may include wired and/or wireless transceivers to enable access to the PSC 150 including system 100, the base stations 110A-C, other networks, the Internet, and/or other nodes. The memory 504 may comprise volatile and/or non-volatile memory including program code, which when executed by at least one processor 20 provides, among other things, the NAAR service 550. For example, the network node may be configured to at least receive a request for assistance information including at least one identifier of at least one remote user equipment at a location associated with an event, generate a response including the at least one identifier of the at least one remote user equipment at the location, and provide the generated response to a user equipment to enable the user equipment to provide an augmented reality view of the event via the at least one user equipment identified in the response.
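A minimal sketch of the NAAR service logic just described follows, under the assumption of a hypothetical lookup helper; the class name, record shapes, and helper are illustrative placeholders, not the claimed implementation.

    # Minimal sketch of the NAAR service 550: receive a request, look up
    # UEs in the requested area, and generate the response. The lookup
    # callable and record shapes are hypothetical placeholders.
    class NaarService:
        def __init__(self, locate_ues_in_area):
            # locate_ues_in_area: callable mapping an area description to
            # records such as {"id": "MSISDN_1", "location": (lat, lon)}
            self.locate_ues_in_area = locate_ues_in_area

        def handle_request(self, request: dict) -> dict:
            ues = self.locate_ues_in_area(request["area"])
            return {
                "request_id": request.get("id"),
                "ue_identifiers": [ue["id"] for ue in ues],
                "ue_locations": {ue["id"]: ue["location"] for ue in ues},
            }

    # Example with a stubbed lookup:
    service = NaarService(lambda area: [
        {"id": "MSISDN_1", "location": (40.7128, -74.0060)}])
    response = service.handle_request({"id": 1, "area": {"cell": "276"}})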
[055] FIG. 6 illustrates a block diagram of an apparatus 10, in accordance with some example embodiments.
[056] The apparatus 10 may represent a user equipment, such as the user equipment 102A-B which may be a part of system 100. Alternatively or additionally, the apparatus 10 may represent user equipment 112, 106A-C, and so forth. Alternatively or additionally, the apparatus 10 may include, or be able to access, an application, such as an augmented reality application, or a service, such as an augmented reality service. Moreover, this application/service may be able to present augmented reality views with network assistance as shown for example at user interfaces 104A-B and 405A-B. Alternatively or additionally, this application/service may be configured to allow system 100 to control the user equipment, such as allow system 100 to activate cameras, microphones, and/or the like, as well as forward the obtained video, audio, and/or other types of data to system 100 via network 155.
[057] The apparatus 10 may comprise, or be coupled to, a sensor 99. Examples of sensors which can be used as sensor 99 include a camera, a gyroscope, a barometer, and/or other types of sensors or actuators. Moreover, the apparatus 10 may be stationary and/or mobile.
[058] Furthermore, the apparatus 10 may be implemented as a dedicated sensor, IoT sensor, and/or the like. For example, the IoT sensor may be implemented as a traffic camera, a temperature sensor, and/or other type of sensor fixedly attached to a building or traffic light, although the IoT sensor may be mobile as well. In the case of the IoT sensor, the apparatus 10 may include a less powerful processor and/or less memory, when compared to for example a smartphone. Furthermore, the IoT sensor may access the cellular network via another device. For example, the IoT sensor may couple to the cellular network via a first interface, such as Bluetooth or a WiFi, to another apparatus having a second interface to the cellular network.
[059] The apparatus 10 may include at least one antenna 12 in communication with a transmitter 14 and a receiver 16. Alternatively, transmit and receive antennas may be separate. The apparatus 10 may also include a processor 20 configured to provide signals to and receive signals from the transmitter and receiver, respectively, and to control the functioning of the apparatus. Processor 20 may be configured to control the functioning of the transmitter and receiver by effecting control signaling via electrical leads to the transmitter and receiver. Likewise, processor 20 may be configured to control other elements of apparatus 10 by effecting control signaling via electrical leads connecting processor 20 to the other elements, such as a display or a memory. The processor 20 may, for example, be embodied in a variety of ways including circuitry, at least one processing core, one or more microprocessors with accompanying digital signal processor(s), one or more processor(s) without an accompanying digital signal processor, one or more coprocessors, one or more multi-core processors, one or more controllers, processing circuitry, one or more computers, various other processing elements including integrated circuits (for example, an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), and/or the like), or some combination thereof. Accordingly, although illustrated in FIG. 6 as a single processor, in some example embodiments the processor 20 may comprise a plurality of processors or processing cores.
[060] The apparatus 10 may be capable of operating with one or more air interface standards, communication protocols, modulation types, access types, and/or the like. Signals sent and received by the processor 20 may include signaling information in accordance with an air interface standard of an applicable cellular system, and/or any number of different wireline or wireless networking techniques, comprising but not limited to Wi-Fi, wireless local access network (WLAN) techniques, such as Institute of Electrical and Electronics Engineers (IEEE) 802.11, 802.16, 802.3, ADSL, DOCSIS, and/or the like. In addition, these signals may include speech data, user generated data, user requested data, and/or the like.
[061] For example, the apparatus 10 and/or a cellular modem therein may be capable of operating in accordance with various first generation (1G) communication protocols, second generation (2G or 2.5G) communication protocols, third-generation (3G) communication protocols, fourth-generation (4G) communication protocols, fifth-generation (5G) communication protocols, Internet Protocol Multimedia Subsystem (IMS) communication protocols (for example, session initiation protocol (SIP)), and/or the like. For example, the apparatus 10 may be capable of operating in accordance with 2G wireless communication protocols, such as IS-136 (Time Division Multiple Access (TDMA)), GSM (Global System for Mobile communications), IS-95 (Code Division Multiple Access (CDMA)), and/or the like. In addition, for example, the apparatus 10 may be capable of operating in accordance with 2.5G wireless communication protocols General Packet Radio Service (GPRS), Enhanced Data GSM Environment (EDGE), and/or the like. Further, for example, the apparatus 10 may be capable of operating in accordance with 3G wireless communication protocols, such as Universal Mobile Telecommunications System (UMTS), Code Division Multiple Access 2000 (CDMA2000), Wideband Code Division Multiple Access (WCDMA), Time Division-Synchronous Code Division Multiple Access (TD-SCDMA), and/or the like. The apparatus 10 may be additionally capable of operating in accordance with 3.9G wireless communication protocols, such as Long Term Evolution (LTE), Evolved Universal Terrestrial Radio Access Network (E-UTRAN), and/or the like. Additionally, for example, the apparatus 10 may be capable of operating in accordance with 4G wireless communication protocols, such as LTE Advanced, 5G, and/or the like as well as similar wireless communication protocols that may be subsequently developed.
[062] It is understood that the processor 20 may include circuitry for implementing audio/video and logic functions of apparatus 10. For example, the processor 20 may comprise a digital signal processor device, a microprocessor device, an analog-to-digital converter, a digital-to-analog converter, and/or the like. Control and signal processing functions of the apparatus 10 may be allocated between these devices according to their respective capabilities. The processor 20 may additionally comprise an internal voice coder (VC) 20a, an internal data modem (DM) 20b, and/or the like. Further, the processor 20 may include functionality to operate one or more software programs, which may be stored in memory. In general, processor 20 and stored software instructions may be configured to cause apparatus 10 to perform actions. For example, processor 20 may be capable of operating a connectivity program, such as a web browser. The connectivity program may allow the apparatus 10 to transmit and receive web content, such as location-based content, according to a protocol, such as wireless application protocol, WAP, hypertext transfer protocol, HTTP, and/or the like.
[063] Apparatus 10 may also comprise a user interface including, for example, an earphone or speaker 24, a ringer 22, a microphone 26, a display 28, a user input interface, and/or the like, which may be operationally coupled to the processor 20. The display 28 may, as noted above, include a touch sensitive display, where a user may touch and/or gesture to make selections, enter values, and/or the like. The processor 20 may also include user interface circuitry configured to control at least some functions of one or more elements of the user interface, such as the speaker 24, the ringer 22, the microphone 26, the display 28, and/or the like. The processor 20 and/or user interface circuitry comprising the processor 20 may be configured to control one or more functions of one or more elements of the user interface through computer program instructions, for example, software and/or firmware, stored on a memory accessible to the processor 20, for example, volatile memory 40, non-volatile memory 42, and/or the like. The apparatus 10 may include a battery for powering various circuits related to the mobile terminal, for example, a circuit to provide mechanical vibration as a detectable output. The user input interface may comprise devices allowing the apparatus 10 to receive data, such as a keypad 30 (which can be a virtual keyboard presented on display 28 or an externally coupled keyboard) and/or other input devices.
[064] As shown in FIG. 6, apparatus 10 may also include one or more mechanisms for sharing and/or obtaining data. For example, the apparatus 10 may include a short-range radio frequency (RF) transceiver and/or interrogator 64, so data may be shared with and/or obtained from electronic devices in accordance with RF techniques. The apparatus 10 may include other short-range transceivers, such as an infrared (IR) transceiver 66, a BluetoothTM (BT) transceiver 68 operating using BluetoothTM wireless technology, a wireless universal serial bus (USB) transceiver 70, a BluetoothTM Low Energy transceiver, a ZigBee transceiver, an ANT transceiver, a cellular device-to-device transceiver, a wireless local area link transceiver, and/or any other short-range radio technology. Apparatus 10 and, in particular, the short-range transceiver may be capable of transmitting data to and/or receiving data from electronic devices within the proximity of the apparatus, such as within 10 meters, for example. The apparatus 10 including the Wi-Fi or wireless local area networking modem may also be capable of transmitting and/or receiving data from electronic devices according to various wireless networking techniques, including 6LoWpan, Wi-Fi, Wi-Fi low power, WLAN techniques such as IEEE 802.11 techniques, IEEE 802.15 techniques, IEEE 802.16 techniques, and/or the like.
[065] The apparatus 10 may comprise memory, such as a subscriber identity module (SIM) 38, a removable user identity module (R-UIM), an eUICC, an UICC, and/or the like, which may store information elements related to a mobile subscriber. In addition to the SIM, the apparatus 10 may include other removable and/or fixed memory. The apparatus 10 may include volatile memory 40 and/or non-volatile memory 42. For example, volatile memory 40 may include Random Access Memory (RAM) including dynamic and/or static RAM, on-chip or off-chip cache memory, and/or the like. Non-volatile memory 42, which may be embedded and/or removable, may include, for example, read-only memory, flash memory, magnetic storage devices, for example, hard disks, floppy disk drives, magnetic tape, optical disc drives and/or media, non-volatile random access memory (NVRAM), and/or the like. Like volatile memory 40, non-volatile memory 42 may include a cache area for temporary storage of data. At least part of the volatile and/or non-volatile memory may be embedded in processor 20. The memories may store one or more software programs, instructions, pieces of information, data, and/or the like which may be used by the apparatus for performing operations disclosed herein including, for example, send a request for assistance information to enable an augmented reality view of an area, receive a response to the request (the response including an identifier for a remote user equipment associated with the area and/or a location for the remote user equipment), receive data from the remote user equipment identified in the response (the data providing at least a portion of the augmented reality view of the area), present, based on at least the received data, the augmented reality view of the area and/or the location, and/or other operations disclosed herein. Alternatively or additionally, the memories may store one or more software programs, instructions, pieces of information, data, and/or the like which may be used by the apparatus for performing operations disclosed herein including, for example, receive a request for assistance information to enable an augmented reality view of an area, generate a response including an identifier for a remote user equipment associated with the area and/or a location for the remote user equipment, provide, based on data provided by at least the remote user equipment identified in the response, the generated response to enable presentation of the augmented reality view of the area and/or the location, and/or other operations disclosed herein.
[066] The memories may comprise an identifier, such as an international mobile equipment identification (IMEI) code, capable of uniquely identifying apparatus 10. In the example embodiment, the processor 20 may be configured using computer code stored at memory 40 and/or 42 to control and/or provide one or more aspects disclosed herein (see, for example, process 600, 700, and/or other operations disclosed herein). For example, the processor 20 may be configured using computer code stored at memory 40 and/or 42 to at least send a request for assistance information to enable an augmented reality view of an area, receive a response to the request, receive data from the remote user equipment identified in the response, present, based on at least the received data, the augmented reality view of the area and/or the location, and/or other operations disclosed herein. Alternatively or additionally, the processor 20 may be configured using computer code stored at memory 40 and/or 42 to at least receive a request for assistance information to enable an augmented reality view of an area, generate a response including an identifier for a remote user equipment associated with the area and/or a location for the remote user equipment, provide, based on data provided by at least the remote user equipment identified in the response, the generated response, and/or other operations disclosed herein.
[067] Some of the embodiments disclosed herein may be implemented in software, hardware, application logic, or a combination of software, hardware, and application logic. The software, application logic, and/or hardware may reside on memory 40, the control apparatus 20, or electronic components, for example. In some example embodiments, the application logic, software, or an instruction set is maintained on any one of various conventional computer-readable media. In the context of this document, a "computer-readable medium" may be any non-transitory media that can contain, store, communicate, propagate or transport the instructions for use by or in connection with an instruction execution system, apparatus, or device, such as a computer or data processor circuitry, with examples depicted at FIG. 6. A computer-readable medium may comprise a non-transitory computer-readable storage medium that may be any media that can contain or store the instructions for use by or in connection with an instruction execution system, apparatus, or device, such as a computer.
[068] Without in any way limiting the scope, interpretation, or application of the claims appearing below, a technical effect of one or more of the example embodiments disclosed herein may be enhanced provisioning of augmented reality for an emergency event.
[069] The subject matter described herein may be embodied in systems, apparatus, methods, and/or articles depending on the desired configuration. For example, the base stations and user equipment (or one or more components therein) and/or the processes described herein can be implemented using one or more of the following: a processor executing program code, an application-specific integrated circuit (ASIC), a digital signal processor (DSP), an embedded processor, a field programmable gate array (FPGA), and/or combinations thereof. These various implementations may include implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, coupled to receive data and instructions from, and to transmit data and instructions to, a storage system, at least one input device, and at least one output device. These computer programs (also known as programs, software, software applications, applications, components, program code, or code) include machine instructions for a programmable processor, and may be implemented in a high-level procedural and/or object-oriented programming language, and/or in assembly/machine language. As used herein, the term "computer-readable medium" refers to any computer program product, machine-readable medium, computer-readable storage medium, apparatus and/or device (for example, magnetic discs, optical disks, memory, Programmable Logic Devices (PLDs)) used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions. Similarly, systems are also described herein that may include a processor and a memory coupled to the processor. The memory may include one or more programs that cause the processor to perform one or more of the operations described herein.
[070] Although a few variations have been described in detail above, other modifications or additions are possible. In particular, further features and/or variations may be provided in addition to those set forth herein. Moreover, the implementations described above may be directed to various combinations and subcombinations of the disclosed features and/or combinations and subcombinations of several further features disclosed above. Other embodiments may be within the scope of the following claims.
[071] If desired, the different functions discussed herein may be performed in a different order and/or concurrently with each other. Furthermore, if desired, one or more of the above-described functions may be optional or may be combined. Although various aspects of some of the embodiments are set out in the independent claims, other aspects of some of the embodiments comprise other combinations of features from the described embodiments and/or the dependent claims with the features of the independent claims, and not solely the combinations explicitly set out in the claims. It is also noted herein that while the above describes example embodiments, these descriptions should not be viewed in a limiting sense. Rather, there are several variations and modifications that may be made without departing from the scope of some of the embodiments as defined in the appended claims. Other embodiments may be within the scope of the following claims. The term "based on" includes "based on at least." The use of the phrase "such as" means "such as for example" unless otherwise indicated.

Claims

WHAT IS CLAIMED
1. A method comprising:
sending a request for assistance information, the assistance information enabling an augmented reality view of an area;
receiving a response to the request, the response including an identifier for a remote user equipment associated with the area and/or a location for the remote user equipment;
receiving data from the remote user equipment identified in the response, the data providing at least a portion of the augmented reality view of the area; and
presenting, based on at least the received data, the augmented reality view of the area and/or the location.
2. The method of claim 1, wherein the request is sent in response to a call for emergency assistance and/or a message for emergency assistance.
3. The method of claim 2, wherein the remote user equipment originates the call or the message, wherein the received response includes a plurality of identifiers for a plurality of remote user equipment associated with the area and/or the location of the remote user equipment originating the call or the message, and wherein the received response further includes the data from the plurality of remote user equipment.
4. The method of claim 3, wherein the request for assistance information includes a request to identify the plurality of remote user equipment associated with the area and/or the location.
5. The method of any of claims 3-4, wherein the response further includes subscription information, and wherein the method further comprises:
checking the subscription information to verify the remote user equipment and/or the plurality of remote user equipment allow being accessed in order to obtain the data; and
accessing, based on the subscription information allowing the access, the remote user equipment and/or the plurality of remote user equipment to obtain the data for at least the portion of the augmented reality view of the area and/or the location.
6. The method of claim 5, wherein the accessing comprises: requesting the remote user equipment and/or the plurality of remote user equipment to capture and to return real-time video, the location information, and/or other information.
7. The method of claim 6, wherein the accessing comprises sending a message to the remote user equipment and/or the plurality of remote user equipment to perform an action and/or to control at least a portion of the remote user equipment and/or the plurality of remote user equipment.
8. The method of any of claims 6-7, wherein the augmented reality view of the area and/or the location includes the data including the real-time video augmented with the location information, and/or the other information.
9. The method of any of claims 1-8, wherein a user equipment associated with a public safety center accesses the remote user equipment and/or the plurality of remote user equipment to obtain the data, and/or wherein the data is received from a network node that accesses the remote user equipment and/or the plurality of remote user equipment to obtain the data.
10. The method of claim 9, wherein the network node comprises a server included in, or coupled to, a cellular network.
11. A method comprising:
receiving a request for assistance information to enable an augmented reality view of an area;
generating a response including an identifier for a remote user equipment associated with the area and/or a location for the remote user equipment; and
providing, based on data provided by at least the remote user equipment identified in the response, the generated response to enable presentation of the augmented reality view of the area and/or the location.
12. The method of claim 11, wherein the request is received in response to a call for emergency assistance and/or a message for emergency assistance.
13. The method of any of claims 11-12, wherein the remote user equipment originates the call or the message, wherein the generated response includes a plurality of identifiers for a plurality of remote user equipment associated with the area and/or the location of the remote user equipment originating the call or the message, and wherein the generated response further includes the data from the plurality of remote user equipment.
14. The method of claim 13, wherein the request for assistance information includes a request to identify the plurality of remote user equipment associated with the area and/or the location.
15. The method of any of claims 11-14, wherein the generated response further includes subscription information.
16. The method of any of claims 13-15, further comprising:
checking the subscription information to verify the remote user equipment and/or the plurality of remote user equipment allow being accessed in order to obtain the data; and
accessing, based on the subscription information allowing the access, the remote user equipment and/or the plurality of remote user equipment to obtain the data for at least the portion of the augmented reality view of the area and/or the location.
17. The method of any of claims 13-16, wherein a network node accesses the remote user equipment and/or the plurality of remote user equipment to obtain the data, and/or wherein the network node forwards the data to a user equipment associated with a public safety center that accesses the remote user equipment and/or the plurality of remote user equipment to obtain the data.
18. The method of claim 17, wherein the network node comprises a server included in, or coupled to, a cellular network, the network node further configured to at least receive the request, generate the response, and/or provide the generated response.
19. An apparatus comprising at least one processor, and at least one memory including computer program code which when executed by the at least one processor causes the apparatus to at least: send a request for assistance information, the assistance information enabling an augmented reality view of an area;
receive a response to the request, the response including an identifier for a remote user equipment associated with the area and/or a location for the remote user equipment;
receive data from the remote user equipment identified in the response, the data providing at least a portion of the augmented reality view of the area; and present, based on at least the received data, the augmented reality view of the area and/or the location.
20. The apparatus of claim 19, wherein the request is sent in response to a call for emergency assistance and/or a message for emergency assistance.
21. The apparatus of claim 20, wherein the remote user equipment originates the call or the message, wherein the received response includes a plurality of identifiers for a plurality of remote user equipment associated with the area and/or the location of the remote user equipment originating the call or the message, and wherein the received response further includes the data from the plurality of remote user equipment.
22. The apparatus of claim 21, wherein the request for assistance information includes a request to identify the plurality of remote user equipment associated with the area and/or the location.
23. The apparatus of any of claims 21-22, wherein the response further includes subscription information, and wherein the apparatus is further caused to at least check the subscription information to verify the remote user equipment and/or the plurality of remote user equipment allow being accessed in order to obtain the data, and wherein the apparatus is further caused to at least access, based on the subscription information allowing the access, the remote user equipment and/or the plurality of remote user equipment to obtain the data for at least the portion of the augmented reality view of the area and/or the location.
24. The apparatus of claim 23, wherein the apparatus is further caused to at least request the remote user equipment and/or the plurality of remote user equipment to capture and to return real-time video, the location information, and/or other information.
25. The apparatus of claim 24, wherein the apparatus is further caused to at least send a message for the access to the remote user equipment and/or the plurality of remote user equipment to perform an action and/or to control at least a portion of the remote user equipment and/or the plurality of remote user equipment.
26. The apparatus of any of claims 24-25, wherein the augmented reality view of the area and/or the location includes the data including the real-time video augmented with the location information, and/or the other information.
27. The apparatus of any of claims 19-26, wherein the apparatus comprises a user equipment associated with a public safety center, the user equipment accesses the remote user equipment and/or the plurality of remote user equipment to obtain the data, and/or wherein the data is received from a network node that accesses the remote user equipment and/or the plurality of remote user equipment to obtain the data.
28. The apparatus of claim 27, wherein the network node comprises a server included in, or coupled to, a cellular network.
29. An apparatus comprising at least one processor, and at least one memory including computer program code which when executed by the at least one processor causes the apparatus to at least: receive a request for assistance information to enable an augmented reality view of an area;
generate a response including an identifier for a remote user equipment associated with the area and/or a location for the remote user equipment; and
provide, based on data provided by at least the remote user equipment identified in the response, the generated response to enable presentation of the augmented reality view of the area and/or the location.
30. The apparatus of claim 29, wherein the request is received in response to a call for emergency assistance and/or a message for emergency assistance.
31. The apparatus of any of claims 29-30, wherein the remote user equipment originates the call or the message, wherein the generated response includes a plurality of identifiers for a plurality of remote user equipment associated with the area and/or the location of the remote user equipment originating the call or the message, and wherein the generated response further includes the data from the plurality of remote user equipment.
32. The apparatus of claim 31, wherein the request for assistance information includes a request to identify the plurality of remote user equipment associated with the area and/or the location.
33. The apparatus of any of claims 29-32, wherein the generated response further includes subscription information.
34. The apparatus of any of claims 31-33, wherein the apparatus is further caused to at least check the subscription information to verify the remote user equipment and/or the plurality of remote user equipment allow being accessed in order to obtain the data, and wherein the apparatus is further caused to at least access, based on the subscription information allowing the access, the remote user equipment and/or the plurality of remote user equipment to obtain the data for at least the portion of the augmented reality view of the area and/or the location.
35. The apparatus of any of claims 31-34, wherein the apparatus comprises a network node that accesses the remote user equipment and/or the plurality of remote user equipment to obtain the data, and/or wherein the network node forwards the data to a user equipment associated with a public safety center that accesses the remote user equipment and/or the plurality of remote user equipment to obtain the data.
36. The apparatus of claim 35, wherein the network node comprises a server included in, or coupled to, a cellular network, the network node further configured to at least receive the request, generate the response, and/or provide the generated response.
37. A non-transitory computer-readable storage medium including program code which when executed causes operations comprising:
sending a request for assistance information, the assistance information enabling an augmented reality view of an area;
receiving a response to the request, the response including an identifier for a remote user equipment associated with the area and/or a location for the remote user equipment;
receiving data from the remote user equipment identified in the response, the data providing at least a portion of the augmented reality view of the area; and
presenting, based on at least the received data, the augmented reality view of the area and/or the location.
38. A non-transitory computer-readable storage medium including program code which when executed causes operations comprising:
receiving a request for assistance information to enable an augmented reality view of an area;
generating a response including an identifier for a remote user equipment associated with the area and/or a location for the remote user equipment; and
providing, based on data provided by at least the remote user equipment identified in the response, the generated response to enable presentation of the augmented reality view of the area and/or the location.
39. An apparatus comprising:
means for sending a request for assistance information, the assistance information enabling an augmented reality view of an area;
means for receiving a response to the request, the response including an identifier for a remote user equipment associated with the area and/or a location for the remote user equipment;
means for receiving data from the remote user equipment identified in the response, the data providing at least a portion of the augmented reality view of the area; and
means for presenting, based on at least the received data, the augmented reality view of the area and/or the location.
40. The apparatus of claim 39 further comprising means for performing a method of any of claims 2-10.
41. An apparatus comprising:
means for receiving a request for assistance information to enable an augmented reality view of an area;
means for generating a response including an identifier for a remote user equipment associated with the area and/or a location for the remote user equipment; and
means for providing, based on data provided by at least the remote user equipment identified in the response, the generated response to enable presentation of the augmented reality view of the area and/or the location.
42. The apparatus of claim 41 further comprising means for performing a method of any of claims 12-18.
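By way of illustration only: the following Python sketch outlines how the requester-side operations recited in claims 19, 24, and 37 (send a request for assistance information, receive a response identifying remote user equipment, fetch their data subject to subscription consent, and compose the augmented reality view) might look in software. Every class, method, and field name here is a hypothetical assumption of this sketch, not an API defined by the application.

# Illustrative sketch only; all names are hypothetical assumptions,
# not claimed subject matter.
import json
from dataclasses import dataclass, field
from typing import Dict, List, Optional

@dataclass
class AssistanceResponse:
    """Response to a request for assistance information (claims 19 and 37)."""
    ue_identifiers: List[str]            # remote user equipment in the area
    location: Optional[str] = None       # location of the originating UE
    subscription_info: Dict[str, bool] = field(default_factory=dict)

class RequesterUE:
    """Requester side: send the request, receive the response, fetch UE data."""

    def __init__(self, network_node):
        self.network_node = network_node  # server in, or coupled to, the network

    def request_assistance(self, area: str) -> AssistanceResponse:
        # Send a request for assistance information covering the area.
        raw = self.network_node.handle_request({"area": area})
        return AssistanceResponse(**json.loads(raw))

    def build_ar_view(self, response: AssistanceResponse) -> dict:
        frames = {}
        for ue_id in response.ue_identifiers:
            # Verify the remote UE allows being accessed (claim 23).
            if not response.subscription_info.get(ue_id, False):
                continue
            # Ask the remote UE to capture and return real-time video,
            # location information, and/or other information (claim 24).
            frames[ue_id] = self.network_node.fetch_ue_data(ue_id)
        # Present the received video augmented with location data (claim 26).
        return {"area_view": frames, "location": response.location}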
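A complementary sketch for the network side of claims 29-36: a server included in, or coupled to, a cellular network receives the request, identifies remote user equipment associated with the area, attaches subscription information, and gates access to each remote UE on that information. Again, the registry layout and method names are assumptions made purely for exposition.

# Illustrative sketch only; registry structure and names are assumptions.
import json

class NetworkNode:
    """Server side (claims 29-36): receive the request, generate the
    response, and provide it; access remote UEs when permitted."""

    def __init__(self, ue_registry):
        # ue_registry maps ue_id -> {"area": str, "consents": bool, "data": dict}
        self.ue_registry = ue_registry

    def handle_request(self, request: dict) -> str:
        area = request["area"]
        # Identify the remote UEs associated with the area (claim 32).
        matches = [uid for uid, rec in self.ue_registry.items()
                   if rec["area"] == area]
        # Generate a response with identifiers, a location, and
        # subscription information (claims 29 and 33).
        response = {
            "ue_identifiers": matches,
            "location": area,
            "subscription_info": {uid: self.ue_registry[uid]["consents"]
                                  for uid in matches},
        }
        return json.dumps(response)

    def fetch_ue_data(self, ue_id: str) -> dict:
        # Access the remote UE only if its subscription allows it (claim 34).
        record = self.ue_registry[ue_id]
        if not record["consents"]:
            raise PermissionError(f"UE {ue_id} does not allow access")
        return record["data"]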
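Tying the two sketches together, a hypothetical end-to-end exchange might run as follows; the registry contents (identifiers, area label, stream URL, coordinates) are invented placeholders. Note how the non-consenting UE is excluded from the composed view, reflecting the subscription check of claims 16, 23, and 34.

# Hypothetical end-to-end use of the two sketches above.
registry = {
    "ue-001": {"area": "main-street", "consents": True,
               "data": {"video": "rtp://example.invalid/stream1",
                        "gps": (48.137, 11.575)}},
    "ue-002": {"area": "main-street", "consents": False, "data": {}},
}
node = NetworkNode(registry)
requester = RequesterUE(node)
resp = requester.request_assistance("main-street")
ar_view = requester.build_ar_view(resp)  # only ue-001 contributes data
print(sorted(ar_view["area_view"]))      # ['ue-001']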
PCT/US2017/042420 2017-07-17 2017-07-17 Network assisted augmented reality WO2019017885A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/US2017/042420 WO2019017885A1 (en) 2017-07-17 2017-07-17 Network assisted augmented reality

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/US2017/042420 WO2019017885A1 (en) 2017-07-17 2017-07-17 Network assisted augmented reality

Publications (1)

Publication Number Publication Date
WO2019017885A1 (en)

Family

ID=59521638

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2017/042420 WO2019017885A1 (en) 2017-07-17 2017-07-17 Network assisted augmented reality

Country Status (1)

Country Link
WO (1) WO2019017885A1 (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150111524A1 (en) * 2013-10-22 2015-04-23 Patrocinium Systems LLC Interactive emergency information and identification systems and methods
US20170024088A1 (en) * 2015-07-24 2017-01-26 Digital Praetorian, Inc. Emergency Incident Data Structure Creation and Analysis
US20170148306A1 (en) * 2015-11-23 2017-05-25 Warnable, LLC System and method for processing emergency alerts and responses

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110113620A (en) * 2019-05-09 2019-08-09 福建威盾科技集团有限公司 Video resource based on private network environment plays and acquisition method and system in real time
WO2022206624A1 (en) * 2021-03-27 2022-10-06 华为技术有限公司 Augmented reality communication method, apparatus and system
WO2022206626A1 (en) * 2021-03-27 2022-10-06 华为技术有限公司 Augmented reality communication method, apparatus, and system

Similar Documents

Publication Publication Date Title
US9992654B2 (en) Policy driven emergency traffic handling for machine-to-machine device communication
US10841732B2 (en) Systems and methods for emergency data communication
EP3391673B1 (en) Systems and methods for emergency data communication
Rauniyar et al. Crowdsourcing-based disaster management using fog computing in internet of things paradigm
US20150201305A1 (en) Methods and systems for providing location based services in a venue using femtocells
US10436876B2 (en) E911 Locating by nearby proxy device location
US9813876B2 (en) Enhanced location based services
US20150147997A1 (en) Event based location-based service
US20070072583A1 (en) Emergency Reporting System
US10255796B2 (en) Discrete emergency alerts on wireless devices
KR101635429B1 (en) System for crime prevention using wireless emergency bell apparatus
WO2019017885A1 (en) Network assisted augmented reality
WO2019080099A1 (en) Unmanned aerial vehicle control method and device, and unmanned aerial vehicle operating method and device
WO2022160134A1 (en) Data analytics method and apparatus for wireless network, and piece of communication equipment and storage medium
US11889568B2 (en) Systems and methods for paging over WiFi for mobile terminating calls
US10375559B2 (en) Supplementing broadcast messages
US20230336957A1 (en) Systems and methods for emergency broadcast using delegated discovery
Sedlar et al. Next generation emergency services based on the Pan-European Mobile Emergency Application (PEMEA) protocol: Leveraging mobile positioning and context information
CN111919460A (en) Network data collection method and device, network equipment, user equipment and storage medium
EP4358522A1 (en) Information processing device, information processing method, and information processing system
EP4106272A1 (en) Disaster tolerance processing method and apparatus
KR20210082121A (en) System and method for routing an emergency call
Loreti et al. A multi-technology indoor positioning service to enable new location-aware applications
US20230188968A1 (en) Context-Enhanced Emergency Service
WO2024000166A1 (en) Sensing data providing methods and apparatuses, device, storage medium and program product

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17748592

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17748592

Country of ref document: EP

Kind code of ref document: A1