WO2020078663A1 - Augmented reality system - Google Patents

Augmented reality system

Info

Publication number
WO2020078663A1
WO2020078663A1 (PCT/EP2019/075421)
Authority
WO
WIPO (PCT)
Prior art keywords
wearer
wearable device
user
control centre
augmented reality
Prior art date
Application number
PCT/EP2019/075421
Other languages
English (en)
Other versions
WO2020078663A8 (fr)
Inventor
Paul DEANS
Original Assignee
LIDBETTER, Timothy Guy Edwin
Priority date
Filing date
Publication date
Priority claimed from GB1816955.7A (GB2578133A)
Application filed by LIDBETTER, Timothy Guy Edwin filed Critical LIDBETTER, Timothy Guy Edwin
Publication of WO2020078663A1
Publication of WO2020078663A8


Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B25/00Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
    • G08B25/01Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems characterised by the transmission medium
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0138Head-up displays characterised by optical features comprising image capture systems, e.g. camera
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/014Head-up displays characterised by optical features comprising information/image processing systems
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0179Display position adjusting means not related to the information to be displayed
    • G02B2027/0187Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02Alarms for ensuring the safety of persons
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B25/00Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
    • G08B25/14Central alarm receiver or annunciator arrangements

Definitions

  • This invention relates to augmented reality (AR) systems and virtual reality (VR) systems and in particular to methods of detecting events in the physical environment of users of such systems, and alerting the users to any potential hazards represented by such events.
  • AR augmented reality
  • VR virtual reality
  • Such systems are finding use in a number of practical applications, for example, a technician in a utilities industry working on an installation in a distribution network may use an AR headset to obtain information about the operation, arrangement and connections at that installation in order to guide his or her actions.
  • AR headsets are designed to provide additional sensory data relating to the job in hand.
  • Using the headset, whilst helpfully augmenting the view that is available to the user, restricts the user to only that view.
  • The environment of an AR user focussed on a task, such as maintenance of a telecommunications cabinet in 'the field', can pose dangers that go unseen when an AR headset is worn. The headset reduces the spatial awareness of the user, so if an event or some activity were to occur outside of the user's field of vision, the wearer would be less aware of hazards that could endanger them.
  • Such hazards may be immediate dangers such as objects travelling on a collision trajectory with the wearer, such as out-of-control vehicles, missiles thrown by malicious persons, or falling objects.
  • Such hazards may also be more widespread, e.g. floods and wildfires.
  • Examples of more general dangers include:
  • Noxious gas levels: Users are often near roads with varying levels of oxides of nitrogen, collectively known as NOx. Levels can be captured from sensors in the area and collected on a data hub for interpretation. The user can be informed if those levels approach, meet or exceed levels of NOx which are deemed dangerous to health, empowering him or her to make a decision about their current working conditions, whether that be to put on a mask or to avoid working in the area until NOx levels subside. Other noxious or dangerous gases such as methane may be present in the user's environment as a result of disturbance to buried services such as sewers or gas pipes.
  • Weather: Information gathered about impending weather events can be interpreted and a warning sent to the AR user. These would include smog, fog, rainstorms, thunderstorms, hurricanes, tornadoes, heatwaves, snow and ice.
  • With such warnings, the AR user can take evasive action and generally act in advance of the event, allowing sufficient time to retreat in good order, for example by packing away tools, putting the cabinet's inner workings back together, and making it secure.
  • Forest fires: These can begin in any wood or forest and spread very quickly. Employers often operate systems to monitor their personnel to alert the manager to a potential problem. These usually provide only for the field engineers to log on or text at the start of the day to say that they have started work, with a message sent automatically if the employee does not log off at the scheduled end of the shift, or at scheduled check-in times.
  • The frequency of reporting can be selected according to the level of risk to which the employee is exposed, for example by choosing a shorter window if they are working in an area or on a task with greater risk, such as in an isolated area, working alone, or working at height. However, these existing procedures do not help to protect the employees from danger, as they only alert the monitoring system after an accident has happened.
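The noxious-gas example above reduces to comparing sensor readings against health thresholds before alerting the AR user. A minimal sketch in Python; the threshold values and band names are illustrative assumptions, not taken from this specification:

```python
# Hypothetical NOx classification: map a reading to a warning band.
# Thresholds here are placeholders, not regulatory limits.

NOX_THRESHOLDS_UG_M3 = [  # (upper bound in micrograms/m3, band)
    (40, "safe"),
    (100, "approaching"),
    (200, "dangerous"),
]

def classify_nox(level_ug_m3: float) -> str:
    """Map a sensor reading to a warning band."""
    for upper, band in NOX_THRESHOLDS_UG_M3:
        if level_ug_m3 <= upper:
            return band
    return "exceeded"

def should_warn(level_ug_m3: float) -> bool:
    """Warn for any band above 'safe'."""
    return classify_nox(level_ug_m3) != "safe"
```

The same pattern applies to any scalar hazard feed (methane, temperature) collected on the data hub.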
  • the present invention facilitates the protection of workers in situ and in real-time.
  • a wearer 2 of a headset 1 has a very restricted field of vision 19, and thus loses awareness of what is around, and particularly behind, them.
  • Figure 2 shows two vehicles 28a, 28b travelling close to an operative 2 working at a roadside cabinet 29, but outside his field of vision 19. The respective trajectories TA, TB of the vehicles are indicated.
  • vehicle 28a presents a potential danger to the AR user 2.
  • the headset may also disrupt the wearer’s audio spatial awareness, whether or not there is an audio input to the AR. It is therefore desirable that an AR user can be made aware of events occurring behind them. Such events may be hazardous or even malicious in nature, but any unexpected activity, however harmless, can be a distraction during a highly-focussed wiring or soldering operation.
  • An Augmented Reality (AR) system having a wearable device for providing an immersive environment output to a wearer, and a control centre in communication with the wearable device, the control centre having:
    a) a data hub for identifying location data relating to the wearable device, and for receiving data relating to events external to the wearable device,
    a warning system to generate and transmit warnings to users whose location data is identified as associated with such locations,
    i. a processor for receiving inputs relating to potential dangers to which a wearer of the wearable device may be exposed and generating an alert signal, ii. the alert signal generating an output from the wearable device to make the wearer aware of the potential dangers, and
    a feedback system to detect an acknowledgement action performed by the wearer and cause transmission to a control centre of a signal indicative that the alert signal has been received and acknowledged by the wearer.
  • the invention provides a process for delivering data to a user of an Augmented Reality (AR) system processor in which a control centre identifies location data relating to one or more users, receives data relating to events external to the users, identifies locations at which the events expose a user to a potential hazard, and transmits a warning to users whose location data correspond to the locations identified as hazardous by generating an alert signal for transmission to the wearer of the wearable device in response to detection of a potential threat or danger, the process further comprising detection of an acknowledgement action performed by the wearer and transmission to a control centre of an indication that the wearer has responded to the alert signal
  • the invention therefore provides a feedback system to determine whether a warning has been received by the wearer. Failure to acknowledge may be because the user is not currently wearing the device, and has thus not seen/heard the warning, or because the wearer has already succumbed to the hazard. In either case the operator at the control centre is alerted that investigation is required.
  • the present invention is of particular application in the field of Augmented Reality as the user has to interact with, and be potentially exposed to, real-world events.
  • the user of a Virtual Reality system is more likely to be in a closed and relatively safe real-world environment in which the exposure to danger is relatively low.
  • the invention may find application in VR systems to handle emergencies such as a need to evacuate a building, and the term "Augmented Reality" should be interpreted in this specification to include VR systems augmented by inputs to warn the user of external hazards.
  • sensors at the rear of a headset may use radar or ultrasonic sensors that emit electromagnetic or acoustic waves that bounce off objects, the returning waves being detected, registered and analysed by a computing algorithm, in a manner analogous to the echolocation techniques performed by bats.
  • Another embodiment includes one or more video cameras at the rear of a headset which takes video feeds and analyses them to identify objects. Once identified, AR users can be informed. Objects could include people or vehicles - their direction of travel can be established and alerted to the user if they are on a collision course. The positions and trajectories of objects may be determined by triangulation using the inputs from two or more cameras. Providing the AR user with this information (in part or in full) either visually or audibly could help him/her avoid the danger of an upcoming event.
  • the AR user is informed of the presence of these objects either visually or audibly. Either could indicate the direction of the object using visual clues or by audio using stereo imaging - the latter would provide extra information.
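The collision-course determination described above can be sketched as follows, assuming object positions (in metres, relative to the wearer at the origin) have already been obtained, e.g. by triangulation from two rear cameras; the function and parameter names are hypothetical:

```python
# Hypothetical collision-course test from two successive tracked positions.
import math

def on_collision_course(p1, p2, danger_radius_m=2.0):
    """p1, p2: (x, y) positions in metres at successive samples.
    Returns True if the object is closing on the wearer and its
    straight-line path passes within danger_radius_m of the origin."""
    vx, vy = p2[0] - p1[0], p2[1] - p1[1]
    speed2 = vx * vx + vy * vy
    if speed2 == 0:
        return False  # stationary object: no trajectory to extrapolate
    # Closing if the velocity points back toward the origin from p2
    closing = (p2[0] * vx + p2[1] * vy) < 0
    # Perpendicular (miss) distance from the origin to the extrapolated path
    miss = abs(p2[0] * vy - p2[1] * vx) / math.sqrt(speed2)
    return closing and miss <= danger_radius_m
```

A real implementation would track objects across many frames and smooth the estimates, but the alert decision has this shape.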
  • A further embodiment includes the use of sensors attached to the AR unit to sense concentrations of nitrogen oxides (NOx) or other hazardous gases around the AR user.
  • Utility technicians using AR headsets are often working near roads with varying levels of this poisonous gas.
  • the user can be informed if those levels approach, meet or exceed levels of NOx which are deemed dangerous to a person’s health.
  • The AR user can be informed if this is the case, empowering him or her to make a decision about their current working conditions, whether that be to put on a mask or to avoid working in the area until NOx levels subside.
  • One or more sensors mounted on the wearable device are arranged to detect objects close to the wearer of the wearable device and outside the wearer's field of vision; objects detected by the sensors are analysed to identify potential dangers to which the wearer of the wearable device may be exposed; alerts are transmitted to the wearer of the wearable device in response to identification of a potential danger; a feedback system detects acknowledgement actions performed by the wearer; and an indication that a warning has been received is transmitted to a control centre.
  • a control centre identifies location data relating to one or more users, receives data relating to events external to the users, identifies locations at which the events expose a user to a potential hazard, and transmits a warning to users whose location data correspond to the locations identified as hazardous, the wearable device having means to cause transmission to the control centre of an indication acknowledging that a warning has been received and the control centre being responsive to the detection of such acknowledgments.
  • the location reports may be transmitted by user terminals to the control centre, and used by the control centre for recording locations of individual users to identify users who are reported at locations affected by the events identified as hazardous.
  • user schedules maintained in a store at the control centre can be used by the control centre to identify users whose schedules include locations affected by the events identified as hazardous.
  • the control centre may re-transmit a further warning message to the wearable device over a second network if no acknowledgement is received to an initial warning transmitted to the wearable device over a first network, and to generate an alert if no acknowledgement is received.
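The retransmit-and-escalate behaviour described above might be sketched like this; the function names, callables and timeout value are illustrative assumptions, not taken from the specification:

```python
# Hypothetical escalation logic: warn over a first network, retry over a
# second network if no acknowledgement arrives, then alert an operator.

def deliver_warning(send_primary, send_secondary, wait_for_ack,
                    timeout_s=30):
    """send_* are callables that transmit the warning; wait_for_ack(timeout)
    returns True if the wearer acknowledged within the timeout."""
    send_primary()
    if wait_for_ack(timeout_s):
        return "acknowledged"
    send_secondary()          # e.g. a text message over a second network
    if wait_for_ack(timeout_s):
        return "acknowledged-late"
    return "alert-operator"   # investigate the wearer's well-being
```

The returned status corresponds to the three outcomes the specification distinguishes: acknowledged, acknowledged after retransmission, and unacknowledged (operator alerted).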
  • the user location system may record locations of individual users, the locations being identified by location reports transmitted by the users to the data hub. It may also monitor schedules of one or more AR users, and identify users whose schedules include locations potentially affected by reported hazardous events.
  • the receipt of an acknowledgement by a control centre also confirms that the user is actually wearing the AR device.
  • The term "wearable device" embraces VR or AR headsets, visors, "smart spectacles", earpieces, and any other device which provides a sensory input that can impair the wearer's awareness of their actual surroundings.
  • the embodiments discuss headsets which provide visual inputs, and also audio inputs in some embodiments, but this is not limitative.
  • Embodiments of the invention can be added to existing technology in the form of clip- on sensors or cameras, providing additional feeds to the augmented environment display.
  • The sensing technology can make use of known object recognition techniques, which recognise and distinguish between people, cars, bicycles, parking bays, number plates, etc.
  • Figure 1 illustrates a user wearing a headset, depicting the limited field of vision available to the user
  • Figure 2 illustrates a user wearing a headset, exposed to a potential danger
  • Figure 3 depicts a headset and control centre configured to operate according to the invention
  • Figure 4 depicts an initialisation process for a first embodiment of the invention
  • Figure 5 depicts an optical detection system operating according to the invention
  • Figure 6 depicts an ultrasonic detection system operating according to the invention
  • Figure 7 depicts a detection process for the first embodiment of the invention
  • Figure 8 depicts a detail of the detection process of Figure 7;
  • Figure 9 depicts a first warning display
  • Figure 10 depicts a second warning display
  • Figure 11 depicts an initialisation process for a second embodiment of the invention.
  • Figure 12 depicts a detection process for the second embodiment of the invention.
  • Figures 1 and 2 have already been discussed above.
  • FIG. 3 is a schematic representation of the components of a headset for use in an embodiment of the invention.
  • This embodiment employs video camera technology (as opposed to ultrasonic or other sensors) to detect, analyse, and warn the AR user of a localised hazard.
  • Figure 3 depicts a standard AR headset (1) including one or more front-facing video cameras (10) supporting the normal way an AR headset captures video upon which it will superimpose content, an AR headset screen (11) viewed by the user (2) in the standard way, and AR headset speakers (12) for hearing audio related to what is being shown.
  • This embodiment also provides one or more rear-facing video cameras (13, 14). These are in addition to the conventional front-facing cameras (10). There is also a microphone (15) and a GPS sensor (16).
  • On-board computer processing capacity and storage (3) hosts an AR Safety Application (30), a control processor (31), and configuration data (32).
  • the processing capacity is connected to an external network (4) (e.g. internet, 3G, 4G, 5G, etc.) via a Network interface (33), typically a wireless connection.
  • An initialisation processor (34) is also provided.
  • The AR headset (1) is in communication through the network (4) with an administrative/operations platform (5) which can be used to transmit updates to the control processor (31), alter the configuration data (32), and access the AR Safety Application (30).
  • the platform (5) can be hosted in cloud infrastructure.
  • the platform can also give access to a warning system (8) which can be hosted in cloud infrastructure, and communicates with a Data Hub (7) where alerts are collected and hosted.
  • a management processor (6) associated with the platform (5) interprets the data held on the data hub (7), to guide the warning system (8) to provide data on when a threat warning should be issued.
  • Embodiments of the invention may be configured to handle local threats, such as those detected by rear-facing cameras 13, 14 or other sensors mounted on or near the headset, as well as more general threats such as those reported by a data hub 7, as will be described later with reference to Figures 11 and 12.
  • the embodiment of Figure 3 is capable of alerting the user to both types of threat.
  • Figures 4 to 10 illustrate a process by which the system may be used to identify, and alert the user to, potential threats in the immediate vicinity of the user, detectable by sensors on the headset.
  • Figures 11 and 12 illustrate a process by which the system may be used to identify, and alert the user to, potential threats in the wider environment.
  • A registration signal 51 is also sent to the warning system by way of the external communications link 33, 4, 5, as will be described further with reference to Figures 10 and 11.
  • The AR safety app (30) is automatically started or (as shown, step 45) started manually by the user (2), to initiate the processor (31) (step 46).
  • The video streams from the rear-facing cameras (13, 14) constantly 'feed' the processor (31) (step 47), which among other things continuously analyses the stream.
  • Figure 5 depicts a user (2) performing an AR task in front of a telecom street cabinet 29. It also shows a vehicle 28 approaching the user (2) and potentially putting user (2) in danger.
  • The processor (31), analysing the video streams coming from the rear-facing cameras (13, 14), establishes a threat profile when it detects a large object on a collision course. Such processing may include identification of the direction from which the threat is coming, by determining which camera 13, 14 is detecting the threat, and any parallax effects such as the rate of movement (if any) across the field of view, the rate of approach, etc.
  • Although the absolute distance of an object cannot be determined with a single camera, if the object can be identified, its size may be estimated, and therefore also its distance.
  • Estimates of the rate of approach can also be determined from how rapidly the object's apparent size in the field of view of the camera is increasing: the rate at which an object's apparent size is increasing is inversely proportional to the time left to impact. Triangulation may be possible if the object 28 is in the field of view 134 common to more than one camera 13, 14.
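The inverse relationship stated above (at constant closing speed, the time to impact equals the object's apparent size divided by the rate at which that size is growing) can be expressed as a small helper; the names and units are illustrative:

```python
# Hypothetical time-to-impact estimate from apparent-size growth.
# For constant closing speed, apparent size is proportional to 1/distance,
# so time-to-impact = apparent_size / (d apparent_size / dt).

def time_to_impact_s(apparent_size, size_growth_per_s):
    """apparent_size: current angular or pixel size; size_growth_per_s: its
    measured rate of increase. Returns estimated seconds to impact, or
    None if the object is not growing (i.e. not approaching)."""
    if size_growth_per_s <= 0:
        return None
    return apparent_size / size_growth_per_s
```

For example, an object 50 pixels wide whose image grows by 10 pixels per second is estimated to be 5 seconds from impact.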
  • ultrasonics or radar may also be used.
  • The headset has transceivers 63, 64 that emit radio or ultrasound waves that bounce off objects 68; the returning waves are registered by the transceivers and analysed by the processor 31.
  • the video streams 70a, 70b are analysed (step 70) to identify objects (e.g. truck, van, car, motorcycle, bicycle, person, etc.).
  • If an approaching object is detected, it is identified and analysed to establish whether it is a threat to the AR user, to generate a threat profile 71.
  • Examples would include "Car" + trajectory (in degrees) + distance (in meters) + "User in path". Another example would be "Truck" + trajectory (in degrees) + distance (in meters) + "User in path".
  • Warnings could be as simple as "Run left - Car imminent!"
  • the generated threat profile is compared with a stored threat profile, which contains parameters and thresholds of an actual threat.
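A comparison of a generated threat profile against stored parameters and thresholds might look like the following sketch; the field names and threshold values are assumptions for illustration, not defined in the specification:

```python
# Hypothetical stored threat parameters, keyed by recognised object type.
STORED_PROFILES = {
    "Car":   {"max_distance_m": 30.0},
    "Truck": {"max_distance_m": 50.0},
}

def is_actual_threat(profile: dict) -> bool:
    """profile: generated threat profile, e.g.
    {"object": "Car", "trajectory_deg": 10,
     "distance_m": 25.0, "user_in_path": True}.
    Returns True if the profile crosses the stored thresholds."""
    stored = STORED_PROFILES.get(profile["object"])
    if stored is None:
        return False  # no stored threat parameters for this object type
    return (profile["user_in_path"]
            and profile["distance_m"] <= stored["max_distance_m"])
```

Only profiles that satisfy the stored criteria would be passed on to the AR Safety Application as a warning.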
  • the Threat Profile is received and read (step 72) by the AR Safety Application 30.
  • the objective is to warn the AR User 2 and there are multiple combinations of ways in which this can be done.
  • a set of preferences outlined in the configuration file 32 is read in (step 73).
  • Picture-in-picture video warnings 78a: this could be actual footage shown in the corner of the AR user's screen (if available from the threat profile or other parts of the system), as shown in Figure 9, or a generated icon or image 78b or video in the corner of the AR user's screen, as shown in Figure 10.
  • the compiled image is transmitted to the headset (step 74)
  • the audio signal is transmitted to the speakers
  • The warning is given to the user (2) (step 78) via the screen (11) (step 75) and/or speakers or headphones (12) (step 75).
  • the warning sound 74 could emanate from the left or right speaker (or both) depending on the direction of the threat. This would help the AR user quickly ascertain the direction of the threat and enable the AR User to escape the threat in the most appropriate direction.
  • a log is updated (step 79) with all actions for later recall (for example for Health and Safety audits).
  • Once the AR user (2) has seen the message, he or she can acknowledge that they have seen it by using a physical switch 17 on the headset, or by using eye tracking or some other mechanism. This causes an acknowledgement message 98 to be sent to the warning system 8 to inform it that the AR user is now aware of the threat.
  • Any failure of the warning system 8 to receive the expected acknowledgement signal 98 can be recorded.
  • a message 99 may also be sent to an external monitoring system
  • If the headset is to alert the user to threats in the wider environment, it registers with a data hub 7. The registration process is depicted in Figure 11.
  • Initialisation includes 'event subscription' (step 50), which is a function of the standard publish-subscribe ("pub-sub") pattern in which events can be discovered via a catalogue search and then subscribed to. Once an event occurs, information or data relating to that event, or the event itself, is forwarded (published) to the component that subscribed to it.
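The pub-sub pattern described here can be sketched minimally as follows; the class and topic names are illustrative, not part of the specification:

```python
# Minimal publish-subscribe sketch: subscribers register callbacks per
# topic; publishing an event delivers it to every subscriber of that topic.
from collections import defaultdict

class EventHub:
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        """Register interest in a topic discovered via a catalogue search."""
        self._subscribers[topic].append(callback)

    def publish(self, topic, event):
        """Forward an event to every component subscribed to the topic."""
        for callback in self._subscribers[topic]:
            callback(event)

# Usage: a headset's warning component subscribes to weather events
received = []
hub = EventHub()
hub.subscribe("weather", received.append)
hub.publish("weather", {"type": "thunderstorm", "area": "NW"})
```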
  • Examples of such events are weather, traffic, police, and pollution levels, but they could realistically be any events that could help underpin threat advice.
  • The headset 1 transmits a registration request (51) to inform the warning system (8) that the AR headset has come online and requires warnings of events affecting the wearer's locale. Such a registration could happen each time the headset is turned on, as part of the start-up procedure 34, 51 (Figure 4).
  • the registration details are recorded in the data hub 7 (step 52)
  • the Warning system (8) may interact with the AR User’s headset (1 ) in a number of ways.
  • the Warning system (8) may be configured to poll the GPS (16) component of each AR User on a regular basis, for example every few minutes, so that the data Hub (7) can store a register of users and latest whereabouts.
  • location requests may be generated only when a potential threat is identified (step 92, 93, 94). This reduces overhead, because a location request is only made when there is a warning to transmit, but it may result in delay in delivering the warnings as all users have to be polled when a threat is identified in order to determine which ones need the warning.
  • Users’ itineraries may be stored in the data hub (7) and used as a secondary method of determining the users’ expected whereabouts.
  • Steps 90-97 in Figure 12 illustrate the process by which a threat is identified and a warning communicated to the user.
  • The warning system 8 comprises a location system 81 for establishing the locations of the AR users, and an event logging system 82, which holds the raw events received by the data hub 7. It may also have an itinerary store 80 to store details of the AR users' planned itineraries. This can help provide warnings of events which will affect work later that day in a different location: the user could be affected by events in that area, or be advised of travel issues (closed bridges, floods, forest fires, etc.).
  • These three systems 80, 81, 82 receive inputs from the administration system 5, in response to data provided by the user terminal (for location data) (step 54), the user's work management system (for itinerary data), and event inputs 90. These inputs are processed by an interpretation component 83, which establishes which events should be classified as a threat that might impact the AR user. Data on those events identified as threats are passed to a warning generation unit 84, which generates a warning message appropriate to the user's situation and available outputs 11, 12 and transmits it to the user.
  • An acknowledgment system 85 is provided in this embodiment, which records messages received from the user. Such messages may include acknowledgement of warnings received, to confirm that the AR user has seen the message. The acknowledgement system may also exchange periodic reminder signals with the user, to confirm that the user is safe and well, and still wearing the headset and thus able to receive any warning messages that may become necessary.
  • the warning system 8 may obtain information from the data hub 7 to be used to interpret threats.
  • The warning system (8) may continuously poll the data hub (7) for events. This is known as a "data pull" system, i.e. one controlled by the entity that is to receive the data (in this case the warning system 8), which only receives data if it requests it from the data source (hub 7).
  • The warning system 8 may instead subscribe to a service which publishes events. This is a "push" system, controlled by the data provider (hub 7), in which the data receiver (warning system 8) passively waits for data to be sent to it.
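The "data pull" alternative can be sketched as a polling loop; here the hub is represented by a plain list standing in for a remote API, and all names are illustrative:

```python
# Hypothetical data-pull polling: the warning system asks the hub for any
# events it has not yet processed, tracking a high-water mark between polls.

def poll_hub(hub_events, last_seen_index, handle):
    """Process any events added to the hub since the last poll.
    Returns the new high-water mark."""
    for event in hub_events[last_seen_index:]:
        handle(event)
    return len(hub_events)

# Usage: two polls, with an event arriving between them
seen = []
hub_events = []
mark = poll_hub(hub_events, 0, seen.append)      # nothing yet
hub_events.append({"type": "flood", "area": "SE"})
mark = poll_hub(hub_events, mark, seen.append)   # picks up the new event
```

In the push variant, the same `handle` callback would simply be registered with the hub, as in the pub-sub sketch pattern.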
  • the warning system passes details of the events to the management processor 6 (step 92) for analysis and interpretation.
  • the management processor 6 establishes if the threat is in the AR User’s vicinity, and the nature of the threat to the AR User and, if appropriate, composes and sends a warning to the AR User.
  • The following steps 93-98 are performed for each registered user.
  • The processor 6 requests the users' current location (and/or future itinerary) from the data hub 7 (step 93) (or from the user terminals themselves) and receives responses 94. It then processes the threat data and the location data (step 95) to identify users exposed to the threat represented by the event 92, and generates an instruction 96 to transmit warnings 97 to the affected users.
  • The warning system (8) may send warning messages in a format that can be displayed directly on the screen 11 of the AR user's headset (1), or it may send the warning in a more generalised format that the headset's processor (31) and configuration file (32) can use to interpret the best way to inform the user.
  • The configuration file could contain AR user preferences and may also contain local environment information, such as whether the area is dark, or whether headphones are being used. If the user is not detected to be wearing the headset (for example if he is travelling between tasks), the warning may be transmitted in a different way, such as a text message.
  • The threat warning could be a combination of graphics, text, audible sounds, text-to-speech or recorded phrases, played out through AR headphones (if the headset has audio - for example, like a smartphone) or using the AR headset (1) screen, or as picture-in-picture within the screen estate of the AR headset screen, as shown in Figure 9.
  • Once the AR user (2) has seen the message, he can acknowledge that he has seen it by using a physical switch 17 on the headset, or by using eye tracking or some other mechanism. This causes an acknowledgement message 98 to be sent to the warning system 8 to inform it that the AR user is now aware of the threat.
  • Any failure of the acknowledgement system 85 to receive the expected acknowledgement signal 98 can be recorded by the system.
  • a failure to see the warning, and therefore to acknowledge it, may be because the user is not currently wearing the headset, and is thus likely to be operating in a less restrictive manner, with more normal awareness levels.
  • the message could be transmitted again at a later time, or a warning message sent out-of-band to the user using another medium such as a text message. However, if the message remains unacknowledged, the administrative function may be alerted to allow investigation of the well-being of the operative.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Emergency Management (AREA)
  • Optics & Photonics (AREA)
  • Alarm Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

According to the present invention, an augmented reality (AR) system comprises a safety application (30) which generates an alert signal to be delivered to the wearer (2) of a wearable device (1) in response to detection of a potential threat or danger (9), reported either by sensors (13, 14) on the wearable device itself, or by a remote warning system (8) responding to reports (91) of hazards at the user's currently reported location (81) or planned future location (80). The user is required to acknowledge the reports, the acknowledgement signals (98) being transmitted to the warning system (8). Failure to receive an expected acknowledgement (10) causes the warning system to alert a management function, so that the wearer's failure to respond to the hazard warning can be investigated.
PCT/EP2019/075421 2018-10-18 2019-09-20 Augmented reality system WO2020078663A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
GB1816955.7A GB2578133A (en) 2018-10-18 2018-10-18 Augumented reality system
EP18201178 2018-10-18
GB1816955.7 2018-10-18
EP18201178.3 2018-10-18

Publications (2)

Publication Number Publication Date
WO2020078663A1 true WO2020078663A1 (fr) 2020-04-23
WO2020078663A8 WO2020078663A8 (fr) 2020-10-15

Family

ID=67997642

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2019/075421 WO2020078663A1 (fr) 2018-10-18 2019-09-20 Augmented reality system

Country Status (1)

Country Link
WO (1) WO2020078663A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170337736A1 (en) * 2016-05-18 2017-11-23 Arima Communications Corp. Threat warning system adapted to a virtual reality display system and method thereof
US20170358194A1 (en) * 2016-06-10 2017-12-14 The Boeing Company Systems, Methods, and Apparatus for Sensing Environmental Conditions and Alerting a User in Response
EP3273418A1 (fr) * 2016-07-18 2018-01-24 Sandra Kwiatecki-Rudolf Système et procédé multi-étape d'alarme d'homme mort
US20180047212A1 (en) * 2016-08-12 2018-02-15 Tyco Safety Products Canada Ltd. Notification system for virtual reality devices
US10043376B1 (en) * 2017-07-25 2018-08-07 Intel Corporation Potential hazard warning system

Also Published As

Publication number Publication date
WO2020078663A8 (fr) 2020-10-15

Similar Documents

Publication Publication Date Title
JP7188513B2 Monitoring system, monitoring method, and program
JP3994027B2 Information providing system, and apparatus and method therefor
US11265675B2 System and method for managing emergency vehicle alert geofence
JP7444777B2 Information processing device, terminal device, information processing method, and information processing program
CN103391432A Intelligent video surveillance system and method for scenic-area safety early warning
JP2017204104A Control device, in-vehicle device, video distribution method, and program
US20110227741A1 Emergency rescue system triggered by eye expression recognition and method for same
CA2829329A1 System and apparatus for locating and monitoring persons and/or surroundings
KR101545080B1 Smart safety system linking CCTV with smart terminals
KR20120113455A Method and system for detecting wildlife intrusion using radar
US9779623B2 Communication of alerts to vehicles based on vehicle movement
US20170358210A1 Method for Enabling an Interoperable Vehicle Safety Network Using Wireless Communication
GB2578133A Augumented reality system
US11743372B1 Monitoring systems and methods for personal safety
KR100916315B1 Protection system, protection device, and operating method thereof
WO2020078663A1 (fr) Augmented reality system
JP7313806B2 Pedestrian device, in-vehicle device, pedestrian-to-vehicle communication system, and safety confirmation support method
US20200221250A1 System and method for velocity-based geofencing for emergency vehicle
KR20170102403A Big data processing method and big data system for vehicles
CN111194023A Vehicle, and vehicle visualized rescue method and system
KR20190025440A Control system for emergency response
FR3022386A1 Person-related warning device and warning method implemented
US20230386259A1 System and method for safe, private, and automated detection and reporting of domestic abuse
US11533072B2 Transmission of body status information by a wearable computing device
WO2023175829A1 (fr) Monitoring system, monitoring device, monitoring method, and recording medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19770099

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19770099

Country of ref document: EP

Kind code of ref document: A1