GB2578133A - Augmented reality system - Google Patents
Augmented reality system
- Publication number
- GB2578133A GB2578133A GB1816955.7A GB201816955A GB2578133A GB 2578133 A GB2578133 A GB 2578133A GB 201816955 A GB201816955 A GB 201816955A GB 2578133 A GB2578133 A GB 2578133A
- Authority
- GB
- United Kingdom
- Prior art keywords
- wearer
- wearable device
- control centre
- user
- warning
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/20—Scenes; Scene-specific elements in augmented reality scenes
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B21/00—Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
- G08B21/02—Alarms for ensuring the safety of persons
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B25/00—Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
- G08B25/01—Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems characterised by the transmission medium
- G08B25/016—Personal emergency signalling and security systems
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B27/00—Alarm systems in which the alarm condition is signalled from a central station to a plurality of substations
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B27/00—Alarm systems in which the alarm condition is signalled from a central station to a plurality of substations
- G08B27/005—Alarm systems in which the alarm condition is signalled from a central station to a plurality of substations with transmission via computer network
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/014—Head-up displays characterised by optical features comprising information/image processing systems
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/40—Scenes; Scene-specific elements in video content
- G06V20/44—Event detection
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B21/00—Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
- G08B21/02—Alarms for ensuring the safety of persons
- G08B21/12—Alarms for ensuring the safety of persons responsive to undesired emission of substances, e.g. pollution alarms
- G08B21/14—Toxic gas alarms
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Business, Economics & Management (AREA)
- Emergency Management (AREA)
- Human Computer Interaction (AREA)
- Computer Security & Cryptography (AREA)
- Multimedia (AREA)
- Computer Hardware Design (AREA)
- Alarm Systems (AREA)
Abstract
A method of warning a user of an augmented reality system of potential hazards is disclosed. The system comprises a wearable AR device that provides an immersive environment to a wearer, and a processor for receiving inputs relating to potential dangers to which a wearer of the wearable device may be exposed and generating an alert signal. The alert signal generates an output from the wearable device to make the wearer aware of the potential dangers. The system also comprises a feedback system to detect an acknowledgement action performed by the wearer and cause transmission to a control centre of a signal indicating that the alert signal has been received and acknowledged by the wearer. The system may have one or more sensors mounted on the wearable device to detect objects close to the wearer, and a processor to identify potential dangers to which the wearer may be exposed by the detected objects. The sensors may include at least one rear-facing camera, range-finding radar or sonar equipment.
Description
Augmented Reality system

This invention relates to augmented reality (AR) systems and virtual reality (VR) systems, and in particular to methods of detecting events in the physical environment of users of such systems and alerting the users to any potential hazards represented by such events.
"Virtual reality" (VR) systems provide an immersive environment in which a user can navigate and interact with computer-generated perceptual information, typically in the form of visual and/or audio data provided to the user through a headset, goggles, head-up display etc. "Augmented Reality" (AR) systems blend such VR inputs with real-world elements to provide an interactive experience of a real-world environment whereby the objects that reside in the real world are "augmented" by computer-generated perceptual information. Such systems are finding use in a number of practical applications, for example, a technician in a utilities industry working on an installation in a distribution network may use an AR headset to obtain information about the operation, arrangement and connections at that installation in order to guide his or her actions.
Spatial awareness is important when working in the field. Any reduction in spatial awareness has the potential to result in harm to the AR user from unobserved local activity, whether accidental or malicious. Historically, utility workers have faced threats whilst working in enclosed or isolated areas, being vulnerable to attack or accident as they focus on their work and lose awareness of their surroundings. There is also the potential danger for those working alone in areas prone to high levels of traffic pollution, or in areas with a history of youth or gang attacks. Unfortunately, the use of AR techniques reduces the user's spatial awareness.
AR headsets are designed to provide additional sensory data relating to the job in hand. However, the headset, whilst helpfully augmenting the view available to the user, restricts the user to only that view. The environment of an AR user focussed on a task such as maintenance of a telecommunications cabinet in the field can pose dangers that go unseen when an AR headset is worn. The headset reduces the spatial awareness of the user, so if an event or some activity were to occur outside the user's field of vision, the wearer would be less aware of hazards that could endanger them.
Such hazards may be immediate dangers such as objects travelling on a collision trajectory with the wearer, such as out-of-control vehicles, missiles thrown by malicious persons, or falling objects. There are also more general dangers that may threaten the user's safety but are not (yet) close enough for detection by sensors in the user's immediate environment (e.g. floods, wildfires). Hence, if there were news of an event or activity which could adversely affect the AR user, they would be less aware of it.
Examples of more general dangers include:

1. Noxious gas levels. Users are often near roads with varying levels of oxides of nitrogen, collectively known as NOx. Levels can be captured from sensors in the area and collected on a data hub for interpretation. The user can be informed if those levels approach, meet or exceed levels of NOx which are deemed dangerous to health, empowering him or her to make a decision about their current working conditions, whether that be to put on a mask or to avoid working in the area until NOx levels subside. Other noxious or dangerous gases such as methane may be present in the user's environment as a result of disturbance to buried services such as sewers or gas pipes.
2. Weather. Information gathered about impending weather events can be interpreted and a warning sent to the AR User. These would include smog, fog, rainstorms, thunderstorms, hurricanes, tornados, heatwaves, snow and ice. The AR user can take evasive action and generally act in advance of the event, allowing sufficient time to retreat in good order, for example by packing away tools, putting the cabinet inner workings back together, and making the cabinet secure.
3. Floods. As a result of weather events, floods can occur. AR users can be warned, especially if river levels and humidity levels are interpreted together (remotely). Aside from the danger of the flood itself, which in some parts of the country or world could be devastating (e.g. flash floods), floods can cut off access for the AR user.
4. Police alerts. Rogue drivers, rioting, local accidents/incidents. Advance knowledge of incidents of this nature can help the AR User avoid getting embroiled in local incidents and help the AR User plan their escape.
5. Forest Fires. These can begin in any wood or forest and spread very quickly.
Employers often operate systems to monitor their personnel to alert the manager to a potential problem. These usually provide only for the field engineers to log on or text at the start of the day to say that they have started work, with a message sent automatically if the employee does not log off at the scheduled end of the shift, or at scheduled check-in times. The frequency of reporting can be selected according to the level of risk to which the employee is exposed, for example by choosing a shorter window if the employee is working on a task with greater risk, such as in an isolated area, working alone, or working at height. However, these existing procedures do not help to protect the employees from danger, as they only alert the monitoring system after an accident has happened. The present invention facilitates the protection of workers in situ and in real time.
As shown in Figure 1, a wearer 2 of a headset 1 has a very restricted field of vision 19, and thus loses awareness of what is around, and particularly behind, them. Figure 2 shows two vehicles 28a, 28b travelling close to an operative 2 working at a roadside cabinet 29, but outside his field of vision 19. The respective trajectories TA, TB of the vehicles are indicated. It will be seen that vehicle 28a presents a potential danger to the AR user 2. The headset may also disrupt the wearer's audio spatial awareness, whether or not there is an audio input to the AR. It is therefore desirable that an AR user can be made aware of events occurring behind them. Such events may be hazardous or even malicious in nature, but any unexpected activity, however harmless, can be a distraction during a highly-focussed wiring or soldering operation.
According to a first aspect of the invention, there is provided an Augmented Reality (AR) system having:
a. a wearable device for providing an immersive environment output to a wearer;
b. a processor for receiving inputs relating to potential dangers to which a wearer of the wearable device may be exposed, and generating an alert signal;
c. the alert signal generating an output from the wearable device to make the wearer aware of the potential dangers; and
d. a feedback system to detect an acknowledgement action performed by the wearer and cause transmission to a control centre of a signal indicative that the alert signal has been received and acknowledged by the wearer.
According to a second aspect, the invention provides a process for delivering data to a user of an Augmented Reality (AR) system, in which a processor generates an alert signal for transmission to the wearer of the wearable device in response to detection of a potential threat or danger, the process further comprising detection of an acknowledgement action performed by the wearer and transmission to a control centre of an indication that the wearer has responded to the alert signal.
The invention therefore provides a feedback system to determine whether a warning has been received by the wearer. Failure to acknowledge may be because the user is not currently wearing the device, and has thus not seen/heard the warning, or because the wearer has already succumbed to the hazard. In either case the operator at the control centre is alerted that investigation is required.
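The feedback loop just described can be sketched as a small tracker at the control centre. Class, method and timeout names are illustrative, not from the specification:

```python
# Minimal sketch of the acknowledgement feedback described above.
# Names and the timeout value are illustrative assumptions.

class AckTracker:
    """Tracks outstanding warnings; flags users who fail to acknowledge."""

    def __init__(self, timeout_s: float = 30.0):
        self.timeout_s = timeout_s
        self.pending = {}          # warning_id -> time the warning was sent

    def warning_sent(self, warning_id: str, now: float) -> None:
        self.pending[warning_id] = now

    def acknowledged(self, warning_id: str) -> None:
        # Wearer performed the acknowledgement action: clear the entry.
        self.pending.pop(warning_id, None)

    def overdue(self, now: float) -> list[str]:
        """Warnings with no acknowledgement inside the timeout. The wearer
        may not be wearing the device, or may already have succumbed to
        the hazard; either way, the operator should investigate."""
        return [wid for wid, sent in self.pending.items()
                if now - sent > self.timeout_s]
```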
The present invention is of particular application in the field of Augmented Reality as the user has to interact with, and be potentially exposed to, real-world events. In contrast, the user of a Virtual Reality system is more likely to be in a closed and relatively safe real-world environment in which the exposure to danger is relatively low.
Nevertheless, the invention may find application in VR systems to handle emergencies such as a need to evacuate a building, and the term "Augmented Reality" should be interpreted in this specification to include VR systems augmented by inputs to warn the user of external hazards.
In one embodiment, sensors at the rear of a headset may use radar or ultrasonic sensors that emit electromagnetic or acoustic waves that bounce off objects, the returning waves being detected, registered and analysed by a computing algorithm, in a manner analogous to the echolocation techniques performed by bats.
Another embodiment includes one or more video cameras at the rear of a headset, which take video feeds and analyse them to identify objects. Once identified, AR users can be informed. Objects could include people or vehicles; their direction of travel can be established and the user alerted if they are on a collision course. The positions and trajectories of objects may be determined by triangulation using the inputs from two or more cameras. Providing the AR user with this information (in part or in full), either visually or audibly, could help him or her avoid the danger of an upcoming event.
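A collision-course test on two successive object fixes (obtained, say, by triangulation) can be sketched using the classic navigation rule that a roughly constant bearing with decreasing range indicates a collision trajectory. The function name and tolerance value are illustrative assumptions:

```python
# Sketch of a collision-course test on two successive object positions
# relative to the wearer. Constant bearing + closing range is the
# classic indicator of a collision trajectory. The tolerance value
# is an illustrative assumption.

import math

def on_collision_course(p1, p2, bearing_tol_deg=3.0):
    """p1, p2: (x, y) object positions relative to the wearer at two
    successive frames. True if the bearing is near-constant and the
    range is closing."""
    b1 = math.degrees(math.atan2(p1[1], p1[0]))
    b2 = math.degrees(math.atan2(p2[1], p2[0]))
    r1 = math.hypot(*p1)
    r2 = math.hypot(*p2)
    closing = r2 < r1                              # range decreasing
    steady_bearing = abs(b2 - b1) <= bearing_tol_deg
    return closing and steady_bearing
```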
In both of these embodiments, the AR user is informed of the presence of these objects either visually or audibly. Either channel could indicate the direction of the object, using visual cues or audio stereo imaging; the latter would provide extra information.
A further embodiment includes the use of sensors attached to the AR unit to sense concentrations of oxides of nitrogen (NOx) or other hazardous gases around the AR user. Utility technicians using AR headsets often work near roads with varying levels of these poisonous gases. The user can be informed if those levels approach, meet or exceed levels deemed dangerous to a person's health, empowering him or her to make a decision about their current working conditions, whether that be to put on a mask or to avoid working in the area until NOx levels subside.
In embodiments of the invention, one or more sensors mounted on the wearable device are arranged to detect objects close to the wearer of the wearable device and outside the wearer's field of vision; objects detected by the sensors are analysed to identify potential dangers to which the wearer of the wearable device may be exposed; alerts are transmitted to the wearer of the wearable device in response to identification of a potential danger; a feedback system detects acknowledgement actions performed by the wearer; and an indication that a warning has been received is transmitted to a control centre.
In embodiments of the invention a control centre identifies location data relating to one or more users, receives data relating to events external to the users, identifies locations at which the events expose a user to a potential hazard, and transmits a warning to users whose location data correspond to the locations identified as hazardous, the wearable device having means to cause transmission to the control centre of an indication acknowledging that a warning has been received and the control centre being responsive to the detection of such acknowledgments. The location reports may be transmitted by user terminals to the control centre, and used by the control centre for recording locations of individual users to identify users who are reported at locations affected by the events identified as hazardous. Alternatively, user schedules maintained in a store at the control centre can be used by the control centre to identify users whose schedules include locations affected by the events identified as hazardous.
The control centre may re-transmit a further warning message to the wearable device over a second network if no acknowledgement is received to an initial warning transmitted to the wearable device over a first network, and may generate an alert if no acknowledgement is received.
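The two-network retransmission just described can be sketched as follows; the network objects and their `send()` interface are assumptions for illustration:

```python
# Sketch of the two-network retransmission logic described above.
# The network objects and wait_for_ack callback are illustrative
# assumptions, not an interface from the specification.

def deliver_warning(warning, primary, secondary, wait_for_ack):
    """Send over the first network; if unacknowledged, retry over the
    second network; if still unacknowledged, raise an operator alert."""
    primary.send(warning)
    if wait_for_ack():
        return "acknowledged"
    secondary.send(warning)          # fall back to the second network
    if wait_for_ack():
        return "acknowledged-on-retry"
    return "alert-operator"          # operator must investigate
```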
A user location system may be provided for recording locations of individual users, the locations being identified by location reports transmitted by the users to the data hub. It may also monitor schedules of one or more AR users, and identify users whose schedules include locations potentially affected by reported hazardous events.
The receipt of an acknowledgement by a control centre also confirms that the user is actually wearing the AR device.
In this specification, the term "wearable device" embraces VR or AR headsets, visors, "smart spectacles", earpieces, and any other device which provides a sensory input to a wearer that can impair the wearer's awareness of their actual surroundings. The embodiments discuss headsets which provide visual inputs, and also audio inputs in some embodiments, but this is not limitative.
Embodiments of the invention can be added to existing technology in the form of clip-on sensors or cameras, providing additional feeds to the augmented environment display.
The sensing technology can make use of known object recognition techniques, which recognise and distinguish between people, cars, bicycles, parking bays, number plates, etc.
By way of example, an embodiment of the invention will now be described with reference to the drawings, in which:

Figure 1 illustrates a user wearing a headset, depicting the limited field of vision available to the user;
Figure 2 illustrates a user wearing a headset, exposed to a potential danger;
Figure 3 depicts a headset and control centre configured to operate according to the invention;
Figure 4 depicts an initialisation process for a first embodiment of the invention;
Figure 5 depicts an optical detection system operating according to the invention;
Figure 6 depicts an ultrasonic detection system operating according to the invention;
Figure 7 depicts a detection process for the first embodiment of the invention;
Figure 8 depicts a detail of the detection process of Figure 7;
Figure 9 depicts a first warning display;
Figure 10 depicts a second warning display;
Figure 11 depicts an initialisation process for a second embodiment of the invention; and
Figure 12 depicts a detection process for the second embodiment of the invention.
Figures 1 and 2 have already been discussed above.
Figure 3 is a schematic representation of the components of a headset for use in an embodiment of the invention. This embodiment employs video camera technology (as opposed to ultrasonic or other sensors) to detect, analyse, and warn the AR user of a localised hazard.
Figure 3 depicts a standard AR headset (1) including one or more front facing video cameras (10) supporting the normal way an AR captures video upon which it will superimpose content, an AR Headset screen (11) viewed by the user (2) in the standard way, and AR headset speakers (12) for hearing audio related to what is being shown.
This embodiment also provides one or more rear-facing video cameras (13, 14). These are in addition to the conventional front-facing cameras (10). There is also a microphone (15) and a GPS sensor (16).
On-board computer processing capacity and storage (3) hosts an AR Safety Application (30), a control processor (31), and configuration data (32). The processing capacity is connected to an external network (4) (e.g. internet, 3G, 4G, 5G, etc.) via a Network interface (33), typically a wireless connection. An initialisation processor (34) is also provided.
The AR Headset (1) is in communication through the network (4) with an administrative/operations platform (5) which can be used to transmit updates to the control processor (31), alter the configuration data (32), and access the AR Safety Application (30). The platform (5) can be hosted in cloud infrastructure.
The platform can also give access to a warning system (8) which can be hosted in cloud infrastructure, and communicates with a Data Hub (7) where alerts are collected and hosted. A management processor (6) associated with the platform (5) interprets the data held on the data hub (7), to guide the warning system (8) to provide data on when a threat warning should be issued.
Embodiments of the invention may be configured to handle local threats, such as those detected by rear-facing cameras 13, 14 or other sensors mounted on or near the headset. Other embodiments may be configured to handle more general threats such as those reported by the data hub 7, as will be described later with reference to Figures 11 and 12. The embodiment of Figure 3 is capable of alerting the user to both types of threat.
Figures 4 to 10 illustrate a process by which the system may be used to identify, and alert the user to, potential threats in the immediate vicinity of the user, detectable by sensors on the headset. Figures 11 and 12 illustrate a process by which the system may be used to identify, and alert the user to, potential threats in the wider environment.
An initialisation process for the device can be seen in Figure 4. In this process, when a user (2) switches on the headset (step 40), the initialisation function (34) initialises the rear-facing video cameras (13, 14), the screen (11) and the speakers (12) (steps 41-44). Visual and audio cues are given to confirm that these have been initialised.
If the headset is also to subscribe to warnings transmitted from the warning system 8, a registration signal 51 is also sent to the warning system by way of the external communications link 33, 4, 5, as will be described further with reference to Figures 11 and 12.
The AR safety app (30) is automatically started or (as shown, step 45) started manually by the User (2), to initiate the processor (31) (step 46).
When the system is ready, the video streams from the rear-facing cameras (13, 14) constantly 'feed' the processor (31) (step 47) which, among other things, continuously analyses the streams.
Figure 5 depicts a user (2) performing an AR task in front of a telecom street cabinet 29. It also shows a vehicle 28 approaching the user (2) and potentially putting the user (2) in danger. The processor (31), analysing the video streams coming from the rear-facing cameras (13, 14), establishes a threat profile when it detects a large object on a collision course. Such processing may include identification of the direction from which the threat is coming, by determining which camera 13, 14 is detecting the threat, and any parallax effects such as the rate of movement (if any) across the field of view, the rate of approach, etc. Although the absolute distance of an object cannot be determined with a single camera, if the object can be identified, its size may be estimated, and therefore also its distance. The rate of approach can also be estimated from how rapidly the object's apparent size in the field of view of the camera is increasing: the rate at which an object's apparent size is increasing is inversely proportional to the time left to impact. Triangulation may be possible if the object 28 is in the field of view 134 common to more than one camera 13, 14.
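The apparent-size relation above is the well-known "time-to-contact" estimate, and can be sketched as follows; the variable names are illustrative:

```python
# Sketch of the time-to-impact estimate described above:
# tau ~= apparent_size / rate_of_size_change.
# Variable names are illustrative assumptions.

def time_to_impact(size_now_px, size_prev_px, frame_dt_s):
    """Estimate seconds to impact from an object's apparent size
    (in pixels) in two successive frames. Returns None if the object
    is not growing, i.e. not approaching."""
    growth_rate = (size_now_px - size_prev_px) / frame_dt_s
    if growth_rate <= 0:
        return None
    return size_now_px / growth_rate
```

For example, an object whose image grows from 100 to 110 pixels in 0.1 s is roughly 1.1 s from impact.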
As shown in Figure 6, ultrasonics or radar may also be used. The headset has transceivers 63, 64 which emit radio or ultrasound waves that bounce off objects 68; the returning waves are registered by the transceivers and analysed by the processor 31.
The sensing and warning process will be described with reference to Figures 3, 7 and 8.
The video streams 70a, 70b are analysed (step 70) to identify objects (e.g. truck, van, car, motorcycle, bicycle, person, etc.).
If an approaching object is detected, then it is identified and analysed to establish if it is a threat to the AR user, to generate a threat profile 71.
It does this by establishing the object's position, trajectory, and distance using video analytic techniques, and comparing them to the AR user's position to establish if the AR User is on that trajectory.
Examples would include "Car" + trajectory (in degrees) + distance (in meters) + "User in path". Another example would be "Truck" + trajectory (in degrees) + distance (in meters) + "User in path".
Warnings could be as simple as "Run left: car imminent!" The generated threat profile is compared with a stored threat profile, which contains the parameters and thresholds of an actual threat.
If the comparison reveals that the new threat profile represents an actual threat, a warning is immediately issued. If not, the profile is discarded and the cycle begins again.
The Threat Profile 71 is now passed from the Processor (31) to the AR Safety Application (30) (step 72). An example of a threat profile is shown in Table 1.
Table 1 - New Threat Profile

Threat profile ID | Description | Type | Returned variable
ID | Unique name created for this threat profile | Alphanumeric | (TBD: e.g. unique code derived from time signature)
object_warning_ | Object warning: defines the object and approach direction | String | "[Lorry/Car/Motorbike/bicycle/person] approaching from [behind/left/right/behind left/behind right]"
object_warning_ | Object warning: defines the object and its location | String | "[Lorry/Car/Motorbike/bicycle/person] at [distance/location]"
video_link | A link to a video showing the oncoming threat | String | [address of video that shows the threat caught by video cameras]

As can be seen in Figure 8, the Threat Profile is received and read (step 72) by the AR Safety Application 30. The objective is to warn the AR User 2, and there are multiple combinations of ways in which this can be done. A set of preferences outlined in the configuration file 32 is read in (step 73).
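The Table 1 record could be represented in code roughly as below. Note the two `object_warning_` field names are truncated in the table, so the suffixes used here are hypothetical, as are the types and defaults:

```python
# One possible in-code representation of the Table 1 threat profile.
# The "_direction" and "_location" suffixes are guesses (the table's
# field names are truncated); types and defaults are assumptions.

from dataclasses import dataclass

@dataclass
class ThreatProfile:
    profile_id: str                  # unique name, e.g. derived from a time signature
    object_warning_direction: str    # "[object] approaching from [direction]"
    object_warning_location: str     # "[object] at [distance/location]"
    video_link: str = ""             # address of footage showing the threat
```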
Available options are shown in Figures 9 and 10. These may be used individually or in combination.
- Picture-in-picture video warnings 78a. This could be actual footage shown in the corner of the AR User's screen (if available from the threat profile or other parts of the system), as shown in Figure 9, or a generated icon or image 78b or video in the corner of the AR User's screen, as shown in Figure 10. The compiled image is transmitted to the headset (step 74).
- Audio (a warning noise or phrase that may or may not include the nature of the threat as determined from the threat profile). The audio signal is transmitted to the speakers (step 75).
The warning is given to the user (2) (step 78) via the screen (11) (step 75) and/or speakers or headphones (12) (step 75). The warning sound 74 could emanate from the left or right speaker (or both) depending on the direction of the threat. This would help the AR user quickly ascertain the direction of the threat and enable the AR User to escape the threat in the most appropriate direction.
A log is updated (step 79) with all actions for later recall (for example for Health and Safety audits).
Once the AR user (2) has seen the messages, he or she can acknowledge having seen them by using a physical switch 17 on the headset, or by using eye tracking or some other mechanism. This causes an acknowledgement message 98 to be sent to the warning system 8, informing it that the AR User is now aware of the threat.
Any failure of the warning system 8 to receive the expected acknowledgement signal 98 can be recorded. A message 99 may also be sent to an external monitoring system 8 (see Figure 3) to alert a supervisor that the operative failed to respond to a potential threat, indicating a potential need to investigate the wellbeing of the operative.
If the headset is to alert the user to threats in the wider environment, the headset registers with a data hub 7. The registration process is depicted in Figure 11.
Initialisation includes 'event subscription' (step 50) which is a function of the standard publish-subscribe ("pub-sub") pattern in which events can be discovered via a catalogue search, and then subscribed to. Once an event occurs, information or data relating to that event or the event itself is forwarded (published) to the component that subscribed to it. In this example, events are weather, traffic, police, and pollution levels but could realistically be any events that could help underpin threat advice.
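The publish-subscribe pattern referred to above can be sketched minimally as follows; the class and method names are illustrative, not an API from the specification:

```python
# Minimal sketch of the pub-sub pattern used for event subscription:
# the headset subscribes to event categories (weather, traffic, police,
# pollution, ...) and the hub publishes matching events to subscribers.
# Names are illustrative assumptions.

class EventHub:
    def __init__(self):
        self.subscribers = {}   # category -> list of callbacks

    def subscribe(self, category, callback):
        """Register interest in a category discovered via catalogue search."""
        self.subscribers.setdefault(category, []).append(callback)

    def publish(self, category, event):
        """Forward an occurring event to every component subscribed to it."""
        for cb in self.subscribers.get(category, []):
            cb(event)
```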
The headset 1 transmits a registration request (51) to inform the warning system (8) that the AR headset has come on line and requires warnings of events affecting the wearer's locale. Such a registration could happen each time the headset is turned on, as part of the start-up procedure 34, 51 (Figure 4). The registration details are recorded in the data hub 7 (step 52).

The operation of the system once it has been registered is depicted in Figure 12. There are three elements to the continuing process: firstly, monitoring the location of headsets registered to the control system (steps 53-55); secondly, identification of threats (steps 90-94); and thirdly, transmitting warnings to the headsets of users at risk from such threats (steps 95-97).
To initially establish the whereabouts of the AR User (1), the Warning system (8) may interact with the AR User's headset (1) in a number of ways. The Warning system (8) may be configured to poll the GPS (16) component of each AR User on a regular basis, for example every few minutes, so that the Data Hub (7) can store a register of users and their latest whereabouts. Alternatively, location requests may be generated only when a potential threat is identified (steps 92, 93, 94). This reduces overhead, because a location request is only made when there is a warning to transmit, but it may result in delay in delivering the warnings, as all users have to be polled when a threat is identified in order to determine which ones need the warning.
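The register of users and latest whereabouts kept by the data hub under the regular-polling approach can be sketched as below; the class name and the crude bounding-box search are illustrative assumptions:

```python
# Sketch of the data-hub location register filled by periodic GPS polls.
# The class name and the crude bounding-box search used to match users
# to a reported threat are illustrative assumptions.

class LocationRegister:
    def __init__(self):
        self.latest = {}    # user_id -> (lat, lon, timestamp)

    def record_poll(self, user_id, lat, lon, timestamp):
        """Store the most recent reported position for a user."""
        self.latest[user_id] = (lat, lon, timestamp)

    def users_within(self, lat, lon, radius_deg):
        """Crude box search for users near a reported threat location."""
        return [uid for uid, (ulat, ulon, _) in self.latest.items()
                if abs(ulat - lat) <= radius_deg
                and abs(ulon - lon) <= radius_deg]
```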
Users' itineraries may be stored in the data hub (7) and used as a secondary method of determining the users' expected whereabouts.
Steps 90-97 in Figure 12 illustrate the process by which a threat is identified and a warning communicated to the user.
As shown in Figure 3, the warning system 8 comprises a location system 81 for establishing the locations of the AR users, and an event logging system 82, which holds the raw events received by the data hub 7. It may also have an itinerary store 80 to store details of the AR Users' planned itineraries. This can help provide warnings of events which will affect work later that day in a different location: the user could be affected by events in that area, or be advised of travel issues (closed bridges, floods, forest fires, etc.).

These three systems 80, 81, 82 receive inputs from the administration system 5, in response to data provided by the user terminal (for location data) (step 54), the user's work management system (for itinerary data), and event inputs 90. These inputs are processed by an interpretation component 83, which establishes which events should be classified as a threat that might impact the AR user. Data on those events identified as threats are passed to a warning generation unit 84, which generates a warning message appropriate to the user's situation and available outputs 11, 12 and transmits it to the user.

An acknowledgement system 85 is provided in this embodiment, which records messages received from the user. Such messages may include acknowledgement of warnings received, to confirm that the AR user has seen the message. The acknowledgement system may also exchange periodic reminder signals with the user, to confirm that the user is safe and well, and still wearing the headset and thus able to receive any warning messages that may become necessary.
If an event is reported to the data hub 7 (step 90), information relating to that event is transmitted to the event logger 82 in the warning system 8 (step 91). There are several ways in which the warning system 8 may obtain information from the data hub 7 for use in interpreting threats. For example, the warning system 8 may continuously poll the data hub 7 for events. This is known as a "data pull" system, i.e. one controlled by the entity that is to receive the data (in this case the warning system 8), which only receives data if it requests it from the data source (the hub 7). Alternatively, the warning system 8 may subscribe to a service which publishes events. This is a "push" system, controlled by the data provider (the hub 7), in which the data receiver (the warning system 8) passively waits for data to be sent to it.
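The difference between the two schemes can be illustrated with a minimal sketch (the class and method names here are hypothetical, not drawn from the disclosure):

```python
class DataHub:
    """Minimal event source supporting both pull and push delivery."""

    def __init__(self):
        self._events = []       # events not yet collected by a poll
        self._subscribers = []  # callbacks for the push variant

    def report(self, event):
        """An event arrives at the hub (step 90)."""
        self._events.append(event)
        # "Push": the hub, as data provider, notifies each subscriber.
        for callback in self._subscribers:
            callback(event)

    def poll(self):
        """'Pull': the receiver asks for events; returns and clears the backlog."""
        pending, self._events = self._events, []
        return pending

    def subscribe(self, callback):
        """Register a receiver for the push variant."""
        self._subscribers.append(callback)
```

In the pull variant the warning system would call `poll()` on a timer; in the push variant it registers a callback once via `subscribe()` and then passively receives each event as it arrives.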
The warning system passes details of the events to the management processor 6 (step 92) for analysis and interpretation. For each AR user registered with the data hub 7, the management processor 6 establishes whether the threat is in the AR User's vicinity and what the nature of the threat to the AR User is, and, if appropriate, composes and sends a warning to the AR User. The following steps 93-98 are performed for each registered AR user.
The processor 6 requests each user's current location (and/or future itinerary) from the data hub 7 (step 93) (or from the user terminals themselves) and receives responses 94. It then processes the threat data and the location data (step 95) to identify users exposed to the threat represented by the event 92, and generates an instruction 96 to transmit warnings 97 to the affected users. The Warning system (8) may send warning messages in a format that can be displayed directly on the screen 11 of the AR User's headset (1), or it may send the warning in a more generalised format that the headset's processor (31) and configuration file (32) can use to determine the best way to inform the user. These two components work in conjunction with the AR Safety App (30). The configuration file could contain AR User preferences, and may also contain local environment information, such as whether the area is dark or whether headphones are being used. If the user is not detected to be wearing the headset (for example if he is travelling between tasks), the warning may be transmitted in a different way, such as by text message.
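Step 95, matching the reported event against the users' known positions, might reduce to a proximity test such as the following. This is a simplified sketch: the threat radius, the function name, and the short-distance equirectangular approximation are all assumptions:

```python
import math

EARTH_RADIUS_M = 6371000.0


def users_at_risk(event, positions, radius_m=500.0):
    """Return the IDs of users within radius_m of the event location.

    positions maps user_id -> (lat, lon, timestamp), as gathered in steps 93-94.
    Distance uses an equirectangular approximation, adequate at short range.
    """
    lat0, lon0 = event["lat"], event["lon"]
    at_risk = []
    for user_id, (lat, lon, _ts) in positions.items():
        x = math.radians(lon - lon0) * math.cos(math.radians((lat + lat0) / 2))
        y = math.radians(lat - lat0)
        if EARTH_RADIUS_M * math.hypot(x, y) <= radius_m:
            at_risk.append(user_id)
    return at_risk
```

The instruction 96 would then be issued only for the user IDs this test returns.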
The AR Safety Application (30) manages the interaction with the AR User's senses by populating the screen (11) and/or pushing audio to the headphones (12). The threat warning could be a combination of graphics, text, audible sounds, and text-to-speech or recorded phrases, played out through the AR headphones (if the headset has audio, as a smartphone does), presented on the AR Headset (1) screen, or shown as picture-in-picture within the screen estate of the AR headset screen, as shown in Figure 9.
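The choice of output channel driven by the configuration file (32) amounts to a simple dispatch. The sketch below uses assumed configuration keys (`headset_worn`, `headphones`) purely for illustration:

```python
def render_warning(message, config):
    """Choose output channels for a warning based on headset configuration.

    Returns a list of (channel, payload) pairs the AR Safety App would emit.
    """
    outputs = []
    if config.get("headset_worn", True):
        # Headset on: show the warning on-screen (e.g. picture-in-picture).
        outputs.append(("screen", message))
        if config.get("headphones", False):
            # Headphones in use: also push audio (speech or recorded phrase).
            outputs.append(("audio", message))
    else:
        # User between tasks, not wearing the headset: fall back to SMS.
        outputs.append(("sms", message))
    return outputs
```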
Once the AR user (2) has seen the message, he can acknowledge that he has seen it by using a physical switch 17 on the headset, by using eye tracking, or by some other mechanism. This causes an acknowledgement message 98 to be sent to the warning system 8 to confirm that the AR User is now aware of the threat.
Any failure of the acknowledgement system 85 to receive the expected acknowledgement signal 98 can be recorded by the system. A failure to see the warning, and therefore to acknowledge it, may be because the user is not currently wearing the headset, and is thus likely to be operating in a less restrictive manner, with more normal awareness levels. The message could be transmitted again at a later time, or a warning message sent out-of-band to the user using another medium such as a text message. However, if the message remains unacknowledged, the administrative function may be alerted to allow investigation of the well-being of the operative.
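The retry-then-escalate behaviour described above can be expressed as a small decision function. This is illustrative only; the retry limit and the action names are assumptions:

```python
def next_action(ack_received, attempts, max_retries=2):
    """Decide what to do after a warning transmission attempt.

    ack_received -- whether signal 98 arrived for this warning
    attempts     -- number of transmissions made so far
    """
    if ack_received:
        return "done"            # user has confirmed awareness of the threat
    if attempts <= max_retries:
        return "retransmit"      # retry later, possibly out-of-band (e.g. SMS)
    return "alert_admin"         # prompt a well-being check on the operative
```

Each unacknowledged warning thus drains through retransmission (in-band or out-of-band) before the administrative function is alerted.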
Claims (22)
- 1. An Augmented Reality (AR) system having: a. a wearable device for providing an immersive environment output to a wearer, b. a processor for receiving inputs relating to potential dangers to which a wearer of the wearable device may be exposed and generating an alert signal, c. the alert signal generating an output from the wearable device to make the wearer aware of the potential dangers, d. a feedback system to detect an acknowledgement action performed by the wearer and cause transmission to a control centre of a signal indicative that the alert signal has been received and acknowledged by the wearer.
- 2. An Augmented Reality system according to claim 1 having one or more sensors mounted on the wearable device to detect objects close to the wearer of the wearable device and outside the wearer's field of vision, and a processor for analysing detected objects to identify potential dangers to which the wearer may be exposed by the detected objects, and to generate the alert signal to alert a wearer of the wearable device of a potential threat/danger.
- 3. An Augmented Reality system according to claim 2, wherein the sensor includes at least one rear-facing video camera.
- 4. An Augmented Reality system according to claim 2 or claim 3, comprising twin cameras, the processor determining the position of objects by triangulation.
- 5. An Augmented Reality system according to claim 2, claim 3 or claim 4, comprising range-finding radar or sonar equipment.
- 6. An Augmented Reality system according to claim 2, claim 3, claim 4 or claim 5, comprising a detector sensitive to dangerous concentrations of noxious gases.
- 7. An Augmented Reality system according to any preceding claim, further comprising a control centre in communication with the wearable device, the control centre having: a) a data hub for identifying location data relating to the wearable device, and for receiving data relating to events external to the wearable device, b) a processor for identifying locations at which the events represent a potential danger, and c) a warning system to generate and transmit warnings to users whose location data is identified as associated with such locations, the wearable device having means to cause transmission to the control centre of an indication that a warning has been received by the wearer, and the control centre having means for receiving said indication.
- 8. An Augmented Reality system according to claim 7, the control centre having means for re-transmitting a further warning message to the wearable device over a second network if no acknowledgement is received to an initial warning transmitted to the wearable device over a first network.
- 9. An Augmented Reality system according to claim 7 or claim 8, the control centre having means for generating an alert if no acknowledgement is received.
- 10. An Augmented Reality system according to claim 7, claim 8 or claim 9, having a user location system for recording locations of individual users, the locations being identified by location reports transmitted by the users to the data hub.
- 11. An Augmented Reality system according to claim 7, claim 8, claim 9 or claim 10, having schedule monitoring means for monitoring schedules of one or more AR users, and a user location system for identifying users whose schedules include locations potentially affected by such events.
- 12. A process for delivering data to a user of an Augmented Reality (AR) system, the process comprising generating an alert signal for transmission to the wearer of the wearable device in response to detection of a potential threat or danger, the process further comprising detection of an acknowledgement action performed by the wearer and transmission to a control centre of an indication that the wearer has responded to the alert signal.
- 13. A process according to Claim 12, wherein one or more sensors mounted on the wearable device are arranged to detect objects close to the wearer of the wearable device and outside the wearer's field of vision, objects detected by the sensors are analysed to identify potential dangers to which the wearer of the wearable device may be exposed, alerts are transmitted to the wearer of the wearable device in response to identification of a potential danger, a feedback system detects an acknowledgement action performed by the wearer, and an indication that a warning has been received is transmitted to a control centre.
- 14. A process according to claim 13, wherein the sensors include at least one rear-facing video camera.
- 15. A process according to claim 13 or claim 14, wherein the processor determines the position of objects by triangulation using the inputs from two or more cameras.
- 16. A process according to claim 13, claim 14 or claim 15, wherein range-finding is performed using radar or sonar equipment.
- 17. A process according to claim 13, claim 14, claim 15 or claim 16, wherein a detector sensitive to dangerous concentrations of noxious gases provides an input to the analysis process.
- 18. A process according to Claim 12 in which a control centre identifies location data relating to one or more users, receives data relating to events external to the users, identifies locations at which the events expose a user to a potential hazard, and transmits a warning to users whose location data correspond to the locations identified as hazardous, the wearable device having means to cause transmission to the control centre of an indication acknowledging that a warning has been received and the control centre being responsive to the detection of such acknowledgments.
- 19. A process according to Claim 18, wherein the control centre transmits a further warning message over a second medium if no acknowledgement is received to an initial warning transmitted to the wearable device over a first medium.
- 20. A process according to claim 18, wherein the control centre generates an alert if no acknowledgement is received.
- 21. A process according to claim 18, wherein location reports are transmitted by user terminals to the control centre, and used by the control centre for recording locations of individual users to identify users who are reported at locations affected by the events identified as hazardous.
- 22. A process according to claim 18, wherein user schedules are maintained in a store at the control centre, and used by the control centre to identify users whose schedules include locations affected by the events identified as hazardous.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB1816955.7A GB2578133A (en) | 2018-10-18 | 2018-10-18 | Augumented reality system |
PCT/EP2019/075421 WO2020078663A1 (en) | 2018-10-18 | 2019-09-20 | Augmented reality system |
Publications (2)
Publication Number | Publication Date |
---|---|
GB201816955D0 GB201816955D0 (en) | 2018-12-05 |
GB2578133A true GB2578133A (en) | 2020-04-22 |
Family
ID=64453892
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB1816955.7A Withdrawn GB2578133A (en) | 2018-10-18 | 2018-10-18 | Augumented reality system |
Country Status (1)
Country | Link |
---|---|
GB (1) | GB2578133A (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100188230A1 (en) * | 2009-01-29 | 2010-07-29 | Ted Lindsay | Dynamic reminder system, method and apparatus for individuals suffering from diminishing cognitive skills |
US20120092161A1 (en) * | 2010-10-18 | 2012-04-19 | Smartwatch, Inc. | Systems and methods for notifying proximal community members of an emergency or event |
US20120194554A1 (en) * | 2011-01-28 | 2012-08-02 | Akihiko Kaino | Information processing device, alarm method, and program |
US20140108136A1 (en) * | 2012-10-12 | 2014-04-17 | Ebay Inc. | Augmented reality for shipping |
US20160086473A1 (en) * | 2014-09-23 | 2016-03-24 | Rory Groves | Method for guaranteed delivery of alert notifications through chain-of-command escalation procedures |
US20180276969A1 (en) * | 2017-03-22 | 2018-09-27 | T-Mobile Usa, Inc. | Collision avoidance system for augmented reality environments |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2023146958A1 (en) * | 2022-01-27 | 2023-08-03 | Rovi Guides, Inc. | Smart home management system for generating augmented reality scene of potentially hazardous condition |
US11983922B2 (en) | 2022-01-27 | 2024-05-14 | Rovi Guides, Inc. | Smart home management system for generating augmented reality scene of potentially hazardous condition |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP7363942B2 (en) | Programs, information transmission methods, computer-readable storage media, and information transmission systems | |
JP7444777B2 (en) | Information processing device, terminal device, information processing method, and information processing program | |
US11302176B2 (en) | Real time municipal imminent danger warning system | |
US11265675B2 (en) | System and method for managing emergency vehicle alert geofence | |
JP6827712B2 (en) | Control devices, in-vehicle devices, video distribution methods, and programs | |
CN103391432A (en) | Intelligent video monitoring system for safety early warning of scenic spots and monitoring method | |
JP6858063B2 (en) | Work information system and methods for collecting data related to events that occur at the work site | |
JP2003346266A (en) | Information providing system, and its device and method | |
KR101545080B1 (en) | Smart security system | |
US10210759B2 (en) | System and method for enabling an interoperable vehicle safety network using wireless communication | |
US9779623B2 (en) | Communication of alerts to vehicles based on vehicle movement | |
GB2578133A (en) | Augumented reality system | |
US11743372B1 (en) | Monitoring systems and methods for personal safety | |
Huang et al. | Applying beacon sensor alarm system for construction worker safety in workplace | |
KR100916315B1 (en) | Safeguard area management system, safeguard equipment and managing method thereof | |
WO2020078663A1 (en) | Augmented reality system | |
JP7313806B2 (en) | Pedestrian device, vehicle-mounted device, inter-pedestrian communication system, and safety confirmation support method | |
KR20170102403A (en) | Big data processing method and Big data system for vehicle | |
CN111194023A (en) | Vehicle, and vehicle visual rescue method and system | |
US20230386259A1 (en) | System and method for safe, private, and automated detection and reporting of domestic abuse | |
US20240005783A1 (en) | Driver assistance system | |
US11533072B2 (en) | Transmission of body status information by a wearable computing device | |
WO2023175829A1 (en) | Monitoring system, monitoring device, monitoring method, and recording medium | |
CN114334152A (en) | Security protection method, device, equipment, readable storage medium and program product | |
JP2023121599A (en) | Display system and display method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WAP | Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1) |