US20220153232A1 - Security method and system for a vehicle

Info

Publication number
US20220153232A1
Authority
US
United States
Prior art keywords
vehicle
attack
person
notification
image
Prior art date
Legal status
Abandoned
Application number
US17/442,448
Inventor
Eric Gallagher
Valdas BOCKUS
Colm LOUGHNANE
Current Assignee
ATSR Ltd
Original Assignee
ATSR Ltd
Priority date
Filing date
Publication date
Application filed by ATSR Ltd filed Critical ATSR Ltd
Assigned to ATSR LIMITED (assignment of assignors' interest). Assignors: BOCKUS, Valdas; GALLAGHER, Eric; LOUGHNANE, Colm
Publication of US20220153232A1

Classifications

    • B60R 25/305 Detection related to theft or to other events relevant to anti-theft systems using a camera
    • B60R 25/102 Fittings or systems for preventing or indicating unauthorised use or theft of vehicles actuating a signalling device, a signal being sent to a remote location, e.g. a radio signal being transmitted to a police station, a security company or the owner
    • B60R 25/104 Fittings or systems for preventing or indicating unauthorised use or theft of vehicles actuating a signalling device, characterised by the type of theft warning signal, e.g. visual or audible signals with special characteristics
    • B60R 25/24 Means to switch the anti-theft system on or off using electronic identifiers containing a code not memorised by the user
    • B60R 25/25 Means to switch the anti-theft system on or off using biometry
    • B60R 25/302 Detection related to theft or to other events relevant to anti-theft systems using recording means, e.g. black box
    • B60R 25/31 Detection related to theft or to other events relevant to anti-theft systems of human presence inside or outside the vehicle
    • A61B 5/01 Measuring temperature of body parts; diagnostic temperature sensing, e.g. for malignant or inflamed tissue
    • A61G 3/061 Transfer using ramps, lifts or the like, using ramps
    • A61G 3/062 Transfer using ramps, lifts or the like, using lifts connected to the vehicle
    • B60W 50/0098 Details of control systems ensuring comfort, safety or stability not otherwise provided for
    • B60W 50/14 Means for informing the driver, warning the driver or prompting a driver intervention
    • G06T 7/0008 Industrial image inspection checking presence/absence
    • G06T 7/70 Determining position or orientation of objects or cameras
    • G06V 10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G06V 20/56 Context or environment of the image exterior to a vehicle, by using sensors mounted on the vehicle
    • G06V 20/58 Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • G06V 20/59 Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
    • G06V 40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; body parts, e.g. hands
    • G07C 9/00563 Electronically operated locks using personal physical data of the operator, e.g. fingerprints, retinal images, voice patterns
    • G08B 13/19602 Image analysis to detect motion of the intruder, e.g. by frame subtraction
    • G08B 13/19613 Recognition of a predetermined image pattern or behaviour pattern indicating theft or intrusion
    • G08B 13/19647 Systems specially adapted for intrusion detection in or around a vehicle
    • G08B 21/245 Reminder of hygiene compliance policies, e.g. of washing hands
    • H04N 5/76 Television signal recording
    • H04N 7/188 Capturing isolated or intermittent images triggered by the occurrence of a predetermined event, e.g. an object reaching a predetermined position
    • G06T 2207/30196 Human being; person
    • G06T 2207/30242 Counting objects in image
    • G06T 2207/30268 Vehicle interior

Definitions

  • the present invention relates to a method and system for enhancing the security of a vehicle, including actuating a change of state of the vehicle in response to determining that an attack is taking place in the vicinity of the vehicle.
  • Biometric identification has found a large number of applications in recent years, and particularly in the field of security.
  • optical sensors of various kinds are used to detect unique facial features among other biometric markers which identify an individual for authorisation/security clearance.
  • a common scenario in the field of emergency medical response is one in which paramedics are treating and collecting patients at a scene containing one or several sources of threat to themselves, the patient and the ambulance vehicle itself (for example, a riot, a terrorist incident or the like).
  • the risk of vehicle hijacking is a very real possibility in such a scenario if the paramedic(s) should forget to lock the ambulance upon leaving it to attend a patient or upon returning to the vehicle.
  • paramedics are required by law to leave vehicles running at the scene, making hijacking an even more realistic threat.
  • versatility of the ambulance service ought to be a priority, given the worldwide demand for ambulances in locations that vary, for example, in temperature and altitude.
  • external environment conditions at the response scene may differ globally, ranging from a dry desert at sea level at 50° C., to a sub-polar environment at −40° C., to a high-elevation environment, e.g. 10,000 ft above sea level.
  • a patient suffering from severe heatstroke would benefit from a cool environment or an environment which is gradually brought to a cool temperature with respect to the external environment.
  • the above description is to at least some extent equally applicable for other emergency response vehicles such as an emergency helicopter or an emergency boat.
  • a method for controlling access to a vehicle comprising:
  • the method further comprises determining if the personnel transport apparatus is a stretcher, a wheelchair or a dolly.
  • the method comprises determining the type of the stretcher, the type of the wheelchair or the type of the dolly.
  • the method further comprises determining which means of access to the vehicle is substantially suitable for accessing the vehicle based on the type of the personnel transport apparatus in the captured image.
  • the method further comprises extending a ramp from the vehicle to facilitate access of the personnel transport apparatus to the vehicle in response to the type of personnel transport apparatus.
  • the method further comprises lowering a lift to facilitate access of the personnel transport apparatus to the vehicle.
  • the method further comprises selectively controlling at least one parameter of the vehicle in response to detecting both of the at least one authorised personnel and the personnel transport apparatus approaching the vicinity of the vehicle.
  • the at least one parameter is an environmental parameter.
  • the environmental parameter comprises temperature.
  • the method further comprises measuring at least one external environment parameter of the vehicle.
  • the method further comprises controlling a vehicle environment parameter in response to the at least one measured external environment parameter.
  • the method further comprises regulating the vehicle environment to a predefined characteristic.
  • the method further comprises detecting if a person on the personnel transport apparatus is correctly harnessed according to a predefined protocol.
  • the method further comprises detecting if a person on the personnel transport apparatus is correctly positioned on said personnel transport apparatus according to a predefined protocol.
  • the method further comprises maintaining a log in memory each time the predefined protocol was breached.
  • the log in memory is associated with a user profile of one or more authorised personnel.
  • the method further comprises transmitting at least part of the contents of the memory to a remote database.
  • a transmission transmits at least part of the contents of the memory to a remote database when the predefined protocol was breached.
  • the transmission is a wireless transmission.
  • the method further comprises providing an audio and/or visual notification from the vehicle detailing which means of access to the vehicle is substantially suitable for accessing the vehicle based on the type of the personnel transport apparatus in the captured image.
  • the method further comprises determining whether the type of the personnel transport apparatus in the captured image is an authorised type of personnel transport apparatus.
  • a method for controlling access to a vehicle comprising:
  • the method further comprises determining which means of access to the vehicle is substantially suitable for accessing the vehicle based on the type of the personnel transport apparatus in the captured image.
  • an apparatus for controlling access to a vehicle comprising:
  • an image capture device for capturing an image of an area in the vicinity of the vehicle
  • an image analysing means for analysing the captured image
  • a detecting means for detecting if the captured images contain both at least one person and a personnel transport apparatus, wherein the detecting means is further configured for determining the type of the personnel transport apparatus;
  • a processing means for determining if the at least one person in the captured image is authorised to access the vehicle
  • control means for selectively controlling a locking mechanism of the vehicle in response to the captured image containing at least one authorised person and the personnel transport apparatus.
  • the processing means is configured to implement a machine learning algorithm.
  • the detecting means is configured for detecting if the personnel transport apparatus is a stretcher, a wheelchair or a dolly.
  • the detecting means is configured for detecting the type of the stretcher, the type of the wheelchair or the type of the dolly.
  • control means is further configured for controlling operations of a mechanical apparatus associated with the vehicle.
  • the mechanical apparatus is a retractable ramp.
  • the mechanical apparatus is a lift.
  • the apparatus further comprises an environment measurement means for measuring one or more vehicle environment parameters.
  • the apparatus further comprises an environment controlling means for controlling one or more vehicle environment parameters.
  • the apparatus further comprises an external environment measurement means for measuring one or more external environment parameters.
  • the apparatus further comprises memory for storing one or more environment control protocols defining instructions for controlling the vehicle environment parameters.
  • control means is configured for varying the one or more environment parameters towards predefined values for the one or more environment parameters, with respect to the external environment parameters.
  • control means is configured for dynamically adjusting power output of the environment controlling means in dependence on one or a combination of one or more environment parameters, one or more external environment parameters, duration the vehicle doors have been open and duration of presence of the at least one authorised personnel in the vicinity of the vehicle.
  • control means is operationally associated with the vehicle electromechanical actuators.
  • the image analysing means is configured to determine if one or both of said at least one authorised personnel and a person on the personnel transport apparatus are correctly harnessed according to a predefined protocol.
  • the image analysing means is configured to determine if a person on the personnel transport apparatus is correctly positioned according to a predefined protocol.
  • the apparatus further comprises memory for storing one or more predefined protocols for harnessing and/or positioning a person on the patient transport apparatus.
  • the apparatus further comprises a communication means for communicating with a remote server or a remote database.
  • the communication means is a wireless communication means.
  • control means is configured for locking the vehicle doors if a predefined combination or number of authorised personnel leaves the vehicle or the vicinity of the vehicle.
  • the apparatus further comprises a recording means for synchronously recording personnel status, vehicle environment and vehicle parameter data.
  • the detecting means is configured for detecting fluid spillage.
  • the fluid is a bodily fluid including but not limited to blood or vomit.
  • the apparatus further comprises memory for storing one or more of synchronously recorded data, harnessing protocol adherence, positioning protocol adherence and fluid spillage.
  • the processing means is further configured to generate a report to be transmitted wirelessly to a remote server or a remote database.
  • the report to be transmitted to a remote server or a remote database contains one or more of the synchronously recorded data, harnessing protocol adherence, positioning protocol adherence and fluid spillage.
  • the detecting means is further configured to determine the distance of the at least one person and one or more objects from the vehicle, their duration of time from the vehicle, and the direction and speed of their motion with respect to the vehicle.
  • the apparatus further comprises a notification means configured for providing an audio and/or visual notification.
  • a computer-readable medium comprising instructions which, when executed, cause a processor to carry out one or more aspects of a method as those outlined above.
  • an emergency vehicle comprising the apparatus of the second and third aspects.
  • a method of vehicle security comprising the steps of:
  • a security system for implementing the method of vehicle security comprising:
  • an image capture device configured for capturing an image of an area in the vicinity of a vehicle
  • an image analysing means configured for analysing the captured image, wherein the image analysing means is further configured to determine whether an attack is taking place in the vicinity of the vehicle;
  • a processor configured to actuate a change of state of the vehicle in response to the image analysing means determining that an attack is taking place;
  • FIG. 1 is an illustration of an apparatus according to an embodiment of the present disclosure
  • FIG. 2 is an illustration of elevation and plans views of a scenario in which the apparatus may be implemented according to an embodiment of the present disclosure
  • FIG. 3 is an illustration of a system according to an embodiment of the present disclosure
  • FIG. 4 is an illustration of an exemplary method according to an embodiment of the present disclosure
  • FIG. 5 is an illustration of a plurality of scenarios according to an embodiment of the present disclosure.
  • FIG. 6 is an illustration of an exemplary method according to an embodiment of the present disclosure.
  • FIG. 7 is an illustration of an exemplary method according to an embodiment of the present disclosure.
  • FIG. 8 is an illustration of an exemplary method according to an embodiment of the present disclosure.
  • FIG. 9 is an illustration of an exemplary method according to an embodiment of the present disclosure.
  • FIG. 10 is an illustration of an exemplary method according to an embodiment of the present disclosure.
  • FIG. 11 is an illustration of an exemplary method according to an embodiment of the present disclosure.
  • FIG. 12 is an illustration of an exemplary method according to an embodiment of the present disclosure.
  • FIG. 13 is an illustration of an exemplary method according to an embodiment of the present disclosure.
  • FIG. 14 is an illustration of an exemplary method according to an embodiment of the present disclosure.
  • FIG. 15 is an illustration of an exemplary method according to an embodiment of the present disclosure.
  • FIG. 16 is an illustration of an exemplary method according to an embodiment of the present disclosure.
  • FIG. 17 is an illustration of an exemplary method according to an embodiment of the present disclosure.
  • FIG. 18 is an illustration of an exemplary method according to an embodiment of the present disclosure.
  • FIG. 19 is an illustration of an exemplary method according to an embodiment of the present disclosure.
  • FIG. 20 is an illustration of an exemplary method according to an embodiment of the present disclosure.
  • FIG. 21 is an illustration of an exemplary method according to an embodiment of the present disclosure.
  • FIG. 22 is an illustration of an exemplary method according to an embodiment of the present disclosure.
  • FIGS. 23-30 provide illustrations of exemplary methods according to various embodiments of the present disclosure.
  • the vehicle is an ambulance. It will be understood that the description of the present disclosure can equally apply to any vehicle appropriately outfitted with the relevant components and the apparatus herein.
  • where ‘ambulance’ is used in place of ‘vehicle’, for consistency the term ‘paramedic’ will replace ‘person’ and ought to be understood as referring to authorised personnel. Unauthorised persons are referred to as ‘further person(s)’.
  • the term ‘person’ may be used.
  • although the term ‘patient transport apparatus’ is generally used herein, it will be understood that, since the disclosure is not limited to an ambulance, the use of the word ‘patient’ is not always the most appropriate.
  • a more general term may be ‘personnel transport apparatus’. This may cover, by way of example only, contexts in which the vehicle is a police vehicle, security vehicle, military vehicle, fire engine or the like, as well as a civilian vehicle.
  • FIG. 1 is a block diagram illustrating a configuration of an apparatus comprising a computing device 100 which includes various hardware and software components that, in conjunction with a plurality of external devices it is operationally associated with, function to perform processes according to the present disclosure.
  • the computing device 100 comprises a processor/processing means 105 in communication with memory 110 .
  • the processor 105 functions to execute software instructions that can be loaded and stored in the memory 110 .
  • the processor 105 may include a number of processors, a multi-processor core, or some other type of processor, depending on the particular implementation.
  • the memory 110 may be accessible by the processor 105 , thereby enabling the processor 105 to receive and execute instructions stored on the memory 110 .
  • the memory 110 may be, for example, a random access memory (RAM) or any other suitable volatile or non-volatile computer readable storage medium.
  • the memory 110 may be fixed or removable and may contain one or more components or devices such as a hard drive, a flash memory, a rewritable optical disk, a rewritable magnetic tape, or some combination of the above.
  • One or more software modules may be encoded in the memory 110 .
  • the software modules may comprise one or more software programs or applications having computer program code or a set of instructions configured to be executed by the processor 105 .
  • Such computer program code or instructions for carrying out operations for aspects of the systems and methods disclosed herein may be written in any combination of one or more programming languages.
  • the software modules may include at least a first application 115 configured to be executed by the processor 105 .
  • the software modules may further include a second application 120 and a third application 125 configured to be executed by the processor 105 .
  • the processor 105 configures the computing device 100 to perform various operations relating to the embodiments of the present disclosure, as will be described below.
  • the first, second and third applications 115 , 120 and 125 may be an image analysing means 115 , a detection means 120 and a recording means 125 .
  • the detecting means 120 may comprise any software package for detecting specific features of an image; it will be understood that this may entail edge detection software, one or more machine learning techniques such as deep learning algorithms implementing neural networks, pattern recognition and feature detection. These techniques may be implemented once on any single frame or dynamically on a live feed comprising a plurality of frames.
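
As an illustrative sketch only (the disclosure does not prescribe a particular implementation), the detecting means 120 could expose a common detector interface that is applied either once to a single captured frame or repeatedly to every frame of a live feed. The Detection/Detector names and the placeholder backend below are assumptions made for this example.

```python
from dataclasses import dataclass
from typing import Iterable, List, Protocol


@dataclass
class Detection:
    label: str          # e.g. "person", "stretcher", "wheelchair", "dolly"
    confidence: float   # detector confidence in [0, 1]
    bbox: tuple         # (x, y, width, height) in pixels


class Detector(Protocol):
    """Any backend (edge detection, pattern matching, a neural network, ...)
    may implement this interface; the disclosure leaves the choice open."""
    def detect(self, frame) -> List[Detection]: ...


def analyse_single_frame(detector: Detector, frame) -> List[Detection]:
    # One-shot analysis of a single captured frame.
    return detector.detect(frame)


def analyse_live_feed(detector: Detector, frames: Iterable) -> Iterable[List[Detection]]:
    # Dynamic analysis: the same detector applied to every frame of a live feed.
    for frame in frames:
        yield detector.detect(frame)
```
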
  • the database 130 may contain and/or maintain various data items and elements that are utilized throughout the various operations of the apparatus and system described below. It should be noted that although the database 130 is depicted as being configured locally to the computing device 100, in certain implementations the database 130 and/or various other data elements stored therein may be located remotely. Such elements may be located on a remote device or server (not shown) and connected to the computing device 100 through a network in a manner known to those skilled in the art, in order to be loaded into a processor and executed.
  • program code of the software modules and one or more computer readable storage devices form a computer program product that may be manufactured and/or distributed in accordance with the present disclosure, as is known to those of skill in the art.
  • a control means 135 may be operationally associated with the processor 105 .
  • the control means 135 may be further operationally associated with one or further external devices 140 , 145 , 150 .
  • the control means 135 may be operationally associated with a vehicle locking mechanism 140 , a vehicle ramp/lift mechanism 145 and vehicle electromechanical actuators 150 .
  • the vehicle electromechanical actuators 150 will be understood by the skilled person to comprise, but are not limited to, the electromechanical sensor(s), actuator(s) and circuit(s) which may be found on-board a vehicle.
  • a communication means 155 is also operatively connected to the processor 105 and may be any interface that enables communication between the computing device 100 and other devices, machines and/or elements.
  • the communication means 155 is configured for transmitting and/or receiving data from remote databases 160 and/or servers 165 .
  • the communication means 155 may include, but is not limited to, a Bluetooth or cellular transceiver, a satellite communication transmitter/receiver, an optical port and/or any other such interface for wirelessly connecting the computing device 100 to the other devices.
  • a user interface 170 is also operatively connected to the processor 105 .
  • the user interface 170 may comprise one or more input device(s) such as switch(es), button(s), key(s), and a touchscreen.
  • the user interface 170 functions to facilitate the capture of commands from the user such as on-off commands or settings related to operation of the apparatus and system described below.
  • the user interface 170 may function to issue remote instantaneous instructions or notifications on images received via a non-local image capture mechanism.
  • the processor 105 may be further operationally associated with one or more external devices.
  • the one or more external devices may include but are not limited to an image capture device 175 , an environment control means 180 , an environment measurement means 185 and an external environment measurement means 190 .
  • Data may be transmitted to and/or from these external devices.
  • Data from these external devices may be processed by the processor 105 implementing one or more of the software packages stored in memory 110 .
  • Data from the external devices may be stored in memory 110 via the recording means 125 . Instructions such as predefined protocols stored in memory 110 may be sent to the external devices via the processor 105 .
  • the computing device 100 may further be associated with one or more further computing devices 100 in an Internet of Things (IoT) network, using the communication means 155 on-board every computing device 100 .
  • the IoT network may comprise one or more remote server(s) and remote database(s), and software modules located locally to any number of computing devices 100 or located remotely on said remote server(s) and/or database(s).
  • the software modules may comprise means to process data received from one or more computing devices 100 and may be configured to produce virtual models to optimise parameters associated with said one or more computing devices 100 .
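
The IoT aspect above is described only at the architecture level. Purely as a hedged illustration, one on-board computing device 100 might package a telemetry snapshot as JSON before handing it to whatever transport the communication means 155 provides; the payload fields and the publish callable are assumptions, not part of the disclosure.

```python
import json
import time
from typing import Callable


def build_telemetry(vehicle_id: str, environment: dict, parameters: dict) -> str:
    """Package one snapshot of on-board data as JSON for the IoT network.

    The field names are illustrative only; the disclosure does not define a
    payload schema."""
    return json.dumps({
        "vehicle_id": vehicle_id,
        "timestamp": time.time(),
        "environment": environment,   # e.g. {"temperature_c": 21.5, "humidity_pct": 45}
        "parameters": parameters,     # e.g. {"speed_kmh": 0, "doors_open": True}
    })


def send_snapshot(publish: Callable[[str], None], vehicle_id: str,
                  environment: dict, parameters: dict) -> None:
    # `publish` stands in for whatever transport the communication means 155
    # provides (cellular, satellite, a Bluetooth gateway, ...).
    publish(build_telemetry(vehicle_id, environment, parameters))
```
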
  • FIG. 2 illustrates an elevation view of a common scenario 200 in the work of an emergency medical response unit.
  • a paramedic 205 attends to a patient 210 and loads the patient 210 on to a patient transport apparatus 215 .
  • the patient transport apparatus 215 is a stretcher.
  • the patient transport apparatus 215 is a wheelchair.
  • the patient transport apparatus 215 is a dolly.
  • At least one further person 220 who is not a member of the ambulance staff is typically also present at the scene.
  • the work of a paramedic may entail arriving at a scene with multiple threats, such as during a riot or terrorist incident.
  • the further person(s) 220 may be rioter(s) or terrorist(s) who pose a threat to the paramedic 205 , the patient 210 and the surrounding population proper.
  • the further person(s) 220 may seek to hijack the vehicle, and may be successful if the paramedic 205 forgets to lock the ambulance doors when leaving the ambulance 225 .
  • detection and biometric authorisation of the paramedic 205 leaving the ambulance 225 may actuate locking of the doors.
  • detection and biometric authorisation of the paramedic 205 together with detection of the patient transport apparatus 215 may be implemented to unlock the ambulance doors according to the embodiments of the present disclosure. Being able to control access to an ambulance on the basis of biometric authorisation of ambulance personnel, and additionally detection of a patient transport apparatus, adds a layer of security which greatly diminishes the chance of a hijacking occurring.
  • the image capture device 175 may capture an image of an area in the vicinity of the ambulance 225 .
  • the image comprises a single frame.
  • the image comprises a plurality of frames.
  • the image comprising a plurality of frames may form a video.
  • the image capture device 175 may be placed on top of the ambulance 225 or on one of its vertical sidewalls.
  • the image capture device 175 may be a singular camera or a multitude of cameras positioned together as an array or strategically at various points on the body of the ambulance 225 .
  • the image analysing means 115 may be used to analyse the captured image.
  • the detecting means 120 may then be used to determine if the captured image contains both of the paramedic 205 and the patient transport apparatus 215 . If only one or the other, or neither, are detected then the process ends; these steps are best illustrated in FIG. 4 and its associated description.
  • the processing means 105 may be used to determine if the person—the paramedic 205 —is authorised personnel. If the person is determined to be authorised personnel and the patient transport apparatus has additionally been detected to be with them then the locking mechanism 140 on the vehicle (the ambulance 225 ) may be selectively controlled to be opened by the control means 135 . Steps for determining authorisation and access are best illustrated in FIG. 4 and its associated description.
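
The unlock decision described above can be summarised in a short sketch. The is_authorised and unlock_doors callables are placeholders standing in for the processing means 105 and the control means 135; none of this is a prescribed implementation.

```python
from typing import Callable, List

PERSONNEL_TRANSPORT_LABELS = {"stretcher", "wheelchair", "dolly"}


def access_decision(detected_labels: List[str],
                    is_authorised: Callable[[], bool],
                    unlock_doors: Callable[[], None]) -> bool:
    """Unlock only when a person AND a patient transport apparatus are both
    detected, and the person is determined to be authorised personnel."""
    labels = set(detected_labels)
    if "person" not in labels or not (PERSONNEL_TRANSPORT_LABELS & labels):
        return False        # one or the other (or neither) detected: do nothing
    if not is_authorised():
        return False        # person present but not recognised as authorised
    unlock_doors()          # control means 135 releases the locking mechanism 140
    return True
```
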
  • data relating to authorised personnel including but not limited to biometric markers, as well as data relating to markers for detecting the patient transport apparatus 215 , is stored on a database 130 in local memory 110 .
  • data relating to authorised personnel and patient transport apparatus detection is stored on a remote database and is accessed either by the communication means 155 (wirelessly) or by installation at a port before driving to the scene of an incident if it is an ambulance.
  • the processing means 105 is configured to implement machine learning algorithms. It will be understood that this may entail any number of machine learning algorithms including supervised and unsupervised machine learning algorithms as appropriate, for detecting facial and/or other bodily features.
  • an unsupervised machine learning algorithm may entail the processing means 105 working in conjunction with the detecting means 120 to group biometric markers from the captured image. The processing means 105 would then seek to match this biometric marker grouping data with biometric data of authorised personnel stored on memory, to authorise a person(s) in the vicinity of the vehicle.
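
One plausible (but not prescribed) realisation of matching grouped biometric markers against stored data of authorised personnel is a nearest-embedding comparison with a similarity threshold, as sketched below; the cosine-similarity metric and the 0.8 threshold are assumptions.

```python
import numpy as np


def match_biometrics(query_embedding: np.ndarray,
                     enrolled: dict,
                     threshold: float = 0.8):
    """Match a biometric marker grouping extracted from the captured image
    against reference embeddings of authorised personnel held in memory.

    `enrolled` maps personnel IDs to reference embeddings. Cosine similarity
    and the 0.8 threshold are illustrative assumptions only."""
    q = query_embedding / np.linalg.norm(query_embedding)
    best_id, best_score = None, -1.0
    for person_id, reference in enrolled.items():
        r = reference / np.linalg.norm(reference)
        score = float(np.dot(q, r))
        if score > best_score:
            best_id, best_score = person_id, score
    return (best_id, best_score) if best_score >= threshold else (None, best_score)
```
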
  • machine learning algorithms may be further used to constantly update and optimise biometric markers associated with personnel as their markers vary naturally over time, so as to avoid possible failed authorisation events.
  • a redundant means (not shown) of overriding the system and/or providing driver credentials may be made available in case of environment setup issues.
  • a keypad may be placed on the exterior of the ambulance 225 which is operationally associated with the locks on the ambulance 225. Authorised personnel privy to the password may enter the password into the keypad to unlock the ambulance 225 doors.
  • the keypad may additionally or alternatively comprise a finger-print recognition means.
  • the processing means 105 may be configured to compare biometric data from memory 110 to that provided at the keypad during an access request incident, and process a true/false determination based on a biometric match/failure to authenticate respectively. The processing means 105 would then send the binary determination to the control means 135 which may unlock the ambulance 225 doors if appropriate.
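
A minimal sketch of the true/false determination for a keypad access request follows, assuming the fingerprint comparison itself is performed elsewhere by the processing means 105; the function names and flow are illustrative only.

```python
import hmac
from typing import Callable, Optional


def keypad_access_request(entered_code: Optional[str],
                          stored_code: str,
                          fingerprint_match: Optional[bool],
                          unlock: Callable[[], None]) -> bool:
    """True/false determination for a keypad access request.

    Either a correct password or a positive fingerprint comparison (performed
    elsewhere by the processing means 105 against templates in memory 110)
    results in the doors being unlocked via the control means 135."""
    code_ok = entered_code is not None and hmac.compare_digest(entered_code, stored_code)
    if code_ok or bool(fingerprint_match):
        unlock()
        return True
    return False
```
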
  • the keypad may comprise a means for processing a proximity card access request or a magnetic stripe card access request.
  • the image analysing means 115 may further be configured to determine if one or both of the authorised personnel and the person on the patient transport apparatus 215 are correctly harnessed according to a predefined protocol.
  • the image analysing means 115 may additionally be configured to determine if a person on the patient transport apparatus 215 is correctly positioned according to a predefined protocol.
  • the predefined protocol may, for example, define that the paramedic 205 may be unharnessed to carry out treatment, but that the patient on the patient transport apparatus 215 must be harnessed correctly with the belt secured over their body and not underneath them.
  • the predefined protocol may include a requirement that the patient be positioned in an erect position and not slouching; this may further be associated with harnessing requirements, which may help in achieving the positioning end.
  • predefined protocols may be stored in a database 130 on local memory 110 .
  • the predefined protocols may be stored on a remote database and accessed either by the communication means 155 for communicating with the server or by installation at a port before driving to the scene of an incident if it is an ambulance. Ultimately this would assist in the paramedics' job in ensuring patient safety and security.
  • the communication means 155 may be wireless, which in one embodiment may be a transceiver.
  • control means 135 is configured for controlling operations of a mechanical apparatus associated with the vehicle, such as a vehicle ramp or vehicle lift or the like.
  • the detecting means 120 is capable of determining the type of patient transport apparatus 215, so as to facilitate the correct instructions being provided to the control means 135 as to whether a vehicle ramp or a lift ought to be actuated into operation.
  • the detecting means 120 may be configured to determine if the patient transport apparatus 215 is a stretcher, a wheelchair or a dolly to this end.
  • the detecting means 120 may be further configured to determine the type of the stretcher, the type of the wheelchair or the type of the dolly to this end, since not all patient transport apparatuses require electro-mechanically assisted access to the ambulance 225.
  • the detecting means 120 may further be configured to determine the distance of persons and objects from the vehicle, their duration of time from the vehicle, and the direction and speed of their motion with respect to the vehicle (i.e. moving towards or away from the vehicle and at a certain speed). In such a configuration, protocols for unlocking based on these determinations may be stored in local memory 110 .
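
The distance, dwell time and direction/speed determinations mentioned above could, for example, be derived from successive tracked detections as sketched below; the conversion of image coordinates to ground positions in metres is assumed to happen upstream and is not part of this sketch.

```python
import math
from dataclasses import dataclass
from typing import List, Optional, Tuple


@dataclass
class TrackPoint:
    timestamp: float                   # seconds
    position: Tuple[float, float]      # estimated ground position in metres, vehicle at (0, 0)


def approach_metrics(track: List[TrackPoint]) -> Optional[dict]:
    """Distance from the vehicle, dwell time in its vicinity and radial speed
    (negative = approaching) for one tracked person or object."""
    if len(track) < 2:
        return None
    previous, latest = track[-2], track[-1]
    dist_now = math.hypot(*latest.position)
    dist_prev = math.hypot(*previous.position)
    dt = latest.timestamp - previous.timestamp
    radial_speed = (dist_now - dist_prev) / dt if dt > 0 else 0.0
    return {
        "distance_m": dist_now,
        "dwell_s": latest.timestamp - track[0].timestamp,
        "approaching": radial_speed < 0,
        "radial_speed_mps": radial_speed,
    }
```
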
  • the apparatus may additionally comprise the environment measurement means 185 for measuring one or more vehicle environment parameters.
  • the one or more vehicle environment parameters may include but are not limited to vehicle temperature, pressure and humidity.
  • the environment measurement means 185 may comprise one or more transducers.
  • the apparatus may additionally comprise the external environment measurement means 190 for measuring one or more external environment parameters.
  • the one or more external environment parameters may include but are not limited to ambient temperature, pressure, humidity, wind speed and wind chill.
  • the external environment measurement means 190 may comprise one or more transducers.
  • the external environment measurement means 190 is illustrated as being positioned on the roof of the ambulance 225; however, it will be understood that any number of alternative positions are envisaged, such as, but not limited to, the exterior sidewalls of the ambulance 225.
  • the apparatus further comprises an environment control means 180 for controlling one or more of the vehicle environment parameters.
  • the environment controlling means comprises a heating, ventilation, and air conditioning (HVAC) system.
  • Environment control protocols defining instructions for controlling the vehicle environment parameters may be stored on local memory or communicated wirelessly to the vehicle.
  • the environment control protocols may be based on various scenarios of external environment parameters. For example, there may be an environment control protocol tailored for when the external environment is 50 degrees Celsius with 80% humidity, and another for when the external environment is −40 degrees Celsius with 80% humidity. Controlling the vehicle environment parameters in this way may be useful if the vehicle environment is at least partially lost to the external environment due to a prolonged period in which the vehicle doors are open during loading.
  • the processing means 105 may access the measured external environment parameters from memory, and match them with the closest defined parameter set in memory (e.g. a measured external environment parameter set defined by 47 degrees Celsius and 84% humidity would be matched to a predefined set defined by 50 degrees Celsius with 80% humidity). The processing means 105 may then send from memory the environment control protocol associated with that predefined parameter set to the environment controlling means, which would then regulate the vehicle environment parameters according to that protocol.
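
The closest-match step described above (e.g. 47 degrees Celsius and 84% humidity matching the predefined 50 degrees Celsius / 80% humidity set) can be illustrated with a simple nearest-neighbour lookup; the normalised Euclidean distance used here is an assumption, as the disclosure only requires selecting the closest predefined parameter set.

```python
def closest_protocol(measured: dict, protocols: dict) -> str:
    """Match measured external parameters to the closest predefined set, e.g.
    protocols = {"hot_humid": {"temperature_c": 50, "humidity_pct": 80},
                 "cold_humid": {"temperature_c": -40, "humidity_pct": 80}}."""
    spans = {"temperature_c": 90.0, "humidity_pct": 100.0}  # rough normalisation spans

    def distance(defined: dict) -> float:
        return sum(((measured[key] - defined[key]) / spans[key]) ** 2 for key in defined)

    return min(protocols, key=lambda name: distance(protocols[name]))


# Example from the description: 47 degrees Celsius at 84% humidity is matched
# to the predefined 50 degrees Celsius / 80% humidity set ("hot_humid").
```
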
  • the environment protocols based on the measured external environment parameters may further comprise protocols in which the vehicle environment is gradually returned to predefined optimal conditions with respect to the external environment parameters. This may be useful, for example, when treating a patient with severe heatstroke: it is commonly advised not to move a person suffering from severe heatstroke from a very hot environment into a very cool one immediately, as this can have an adverse effect on their health.
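
A gradual return to predefined conditions could be realised as a stepwise setpoint ramp, sketched below; the fixed step size per control interval is an illustrative assumption.

```python
def next_setpoint(current_c: float, target_c: float, max_step_c: float = 1.0) -> float:
    """Move the cabin temperature setpoint toward the target gradually rather
    than all at once (e.g. cooling a heatstroke patient in stages)."""
    delta = target_c - current_c
    if abs(delta) <= max_step_c:
        return target_c
    return current_c + max_step_c * (1 if delta > 0 else -1)
```
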
  • control means 135 may be configured to conserve vehicle battery power by dynamically adjusting the power output of the environment control means 180 , in dependence on one or a combination of one or more vehicle environment parameters, one or more external environment parameters, the duration the vehicle doors have been open and the duration of the presence of authorised personnel in the vicinity of the vehicle.
  • the apparatus may additionally comprise a recording means 125 for writing to memory protocol adherence history.
  • the communication means may be used to transmit data relating to protocol adherence by authorised personnel from local memory to a remote database.
  • a user profile of each of the authorised personnel containing their protocol adherence history may be formed on a remote database. Tracking protocol adherence of any member of ambulance crew may be useful for investigative purposes ex post facto, or for staff training purposes among other uses.
  • an event of protocol violation may be transmitted in real-time to a remote server via the communication means 155 . Real-time updates such as this may be implemented for example in conjunction with a control centre which can communicate with ambulance staff and/or control vehicle parameters such as but not limited to maximum speed and acceleration.
  • Step 340 of FIG. 3 best illustrates the aspect of the embodiment including the control centre.
  • the recording means 125 may further be configured to synchronously write to memory patient status data (such as heartrate, blood pressure etc.), vehicle environment data such as temperature and vehicle parameter data such as speed and acceleration. Synchronously recording such data may be useful for investigative purposes ex post facto, or for staff training purposes among other uses.
  • the synchronously recorded data may further include time stamps.
  • the harnessing and positioning protocol adherence data may further be associated with the synchronously recorded data via time stamps.
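
One way (among many) to keep patient status, vehicle environment, vehicle parameters and protocol adherence events associated through shared time stamps is a single synchronised record type, sketched below with assumed field names.

```python
import time
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class SyncRecord:
    timestamp: float
    patient_status: Dict[str, float]       # e.g. {"heart_rate_bpm": 88, "bp_systolic": 120}
    vehicle_environment: Dict[str, float]  # e.g. {"temperature_c": 21.0}
    vehicle_parameters: Dict[str, float]   # e.g. {"speed_kmh": 62.0, "accel_ms2": 0.4}
    protocol_events: List[str] = field(default_factory=list)  # e.g. ["harness_breach"]


def record_snapshot(log: List[SyncRecord], patient: dict, environment: dict,
                    parameters: dict, events=()) -> None:
    # One synchronised, time-stamped snapshot; protocol adherence events are
    # associated with the other data streams via the shared timestamp.
    log.append(SyncRecord(time.time(), dict(patient), dict(environment),
                          dict(parameters), list(events)))
```
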
  • the detecting means 120 may be further configured for detecting fluid spillage.
  • the fluid may include but is not limited to blood or vomit.
  • the recording means may further be configured to write fluid spillage incidents to memory. Fluid spillage incidents may be further associated with the synchronously recorded data via time stamps.
  • data regarding markers for identifying different types of fluids may be stored on local memory 110 .
  • the data regarding markers for identifying different types of fluids may be stored on a remote database and accessed either by the communication means 155 for communicating with the server or by installation at a port before driving to the scene of an incident if it is an ambulance.
  • the communication means 155 may be a wireless communication means 155 .
  • the processing means 105 may further be configured to compile a report containing one or more of the synchronously recorded data, harnessing protocol adherence, positioning protocol adherence and fluid spillage, to be transmitted wirelessly to a remote server or remote database.
  • the processing means 105 may compile the report and write it to local memory for retroactive review.
  • control means 135 may be operationally associated with the vehicle electromechanical actuators 150 on-board the vehicle to perform one or more of the tasks outlined above including but not limited to vehicle environment parameter control, selective locking of vehicle doors and ramp or lift actuation.
  • FIG. 2 further illustrates a plan view of the interior volume 235 of the rear of the ambulance 225 according to an embodiment of the present disclosure, which contains a plurality of image capture devices 175 mounted on its sidewalls.
  • the image capture device 175 may alternatively be mounted on the ceiling of the interior of the ambulance 225 . Having the image capture device 175 installed in the interior of ambulance 225 may serve the purpose of capturing images for the detection of fluid spillage and for determining if harnessing and patient positioning protocols are abided by at all times.
  • the use of image analysis to determine adherence removes the need for an integrally installed seat with one or both of: harnesses integrally connected to the vehicle electromechanical actuators 150; and pressure sensors under the seats/patient transport apparatus 215 integrally connected to the vehicle electromechanical actuators 150.
  • a “dumb” patient transport apparatus 215, that is, one not integrally connected or connectable to the vehicle, may therefore be used.
  • machine learning algorithms may be implemented to detect if authorised person(s) and the patient are harnessed correctly and if the patient is positioned correctly.
  • Harnesses 240 may be of any colour and consist of any pattern.
  • Machine learning algorithms may include deep learning algorithms implementing neural networks.
  • one or more of edge detection, object recognition, pattern matching, gauging or metrology (object size and distance measurement) and image matching may be implemented together with, or as an alternative to, machine learning algorithms. These techniques may be implemented once on any single frame or dynamically on a live feed comprising a plurality of frames.
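
As a hedged illustration of combining detection output with a simple geometric rule, a harness-presence check might require every detected person region to overlap a detected harness/belt region; the overlap heuristic below is an assumption rather than the patent's prescribed method, and the upstream detection of the regions themselves is taken as given.

```python
from typing import List, Tuple

Box = Tuple[float, float, float, float]  # (x, y, width, height)


def overlaps(a: Box, b: Box) -> bool:
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah


def harness_ok(person_boxes: List[Box], harness_boxes: List[Box]) -> bool:
    """Rudimentary rule: every detected person region must overlap at least
    one detected harness/belt region. Detection of the regions themselves
    (by colour cues, pattern matching or a neural network, since harnesses
    240 may be any colour or pattern) is assumed to happen upstream."""
    return all(any(overlaps(p, h) for h in harness_boxes) for p in person_boxes)
```
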
  • FIG. 3 is an illustration of a system 300 according to an embodiment of the present disclosure.
  • the system 300 comprises the ambulance 225 fitted with a wireless communication means 155 capable of communicating with the remote server 165 , itself capable of transmitting and receiving information to and from remote database(s) 160 .
  • the remote database(s) 160 may be further associated with a remotely located control centre 305 capable of receiving, processing and transmitting information back downstream to the ambulance 225 via the remote server 165 .
  • the wireless communication means 155 may transmit information such as geographical position, based on the communication of the ambulance 225 with a geographical positioning satellite system 310 .
  • the wireless communication means 155 may also transmit vehicle parameter data from various on-board facilities 315 such as but not limited to a gyroscope, a speedometer and an emergency lights status clock.
  • the wireless communication means 155 may transmit the data outlined above including but not limited to synchronously recorded data, harnessing protocol adherence, positioning protocol adherence and fluid spillage.
  • the report may be compiled in real-time or ex post facto. If the report is compiled in real-time, the report may further be transmitted in real-time to the control centre 305 for real-time review. In one embodiment, when a fluid spillage event is detected the vehicle 225 is automatically removed from a real-time dossier of fleet vehicles available for use, and the control centre 305 is alerted that the vehicle 225 is contaminated.
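
The contamination-handling behaviour described above might look like the following sketch, where the fleet dossier is modelled as a simple availability map and the control-centre alert as a callable; both are stand-ins for whatever a real deployment uses.

```python
from typing import Callable, Dict


def handle_spillage_event(vehicle_id: str,
                          fleet_available: Dict[str, bool],
                          alert_control_centre: Callable[[str], None]) -> None:
    """Mark the vehicle unavailable in the fleet dossier and notify the
    control centre 305 when a fluid spillage event is detected."""
    fleet_available[vehicle_id] = False
    alert_control_centre(
        f"Vehicle {vehicle_id} flagged as contaminated and removed from availability."
    )
```
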
  • FIGS. 20-22 provide further exemplary methods of safety control in a vehicle, consistent with embodiments of the present disclosure. As discussed above, having the image capture device 175 installed in the interior of ambulance 225 may serve the purpose of capturing images for the detection of fluid spillage.
  • the detecting means may be configured for detection of sanitary protocol violation events in the interior volume of the vehicle but also feasibly in the area surrounding the vehicle.
  • sanitary protocol violation events may also comprise one or more of the absence of personal protective equipment (gloves, sanitary garments such as but not limited to gowns, respirators, safety masks, protective eyewear and the like) and unsafe injection practices by (e.g.) authorised vehicle personnel or by others.
  • sanitary protocol violation events may comprise unsafe exposure or presence of hazardous objects such as, but not limited to, a syringe, a knife, or a fragment of a resilient material, where the resilient material may comprise, e.g., one or more of glass, metal or wood.
  • the detecting means may be configured to detect any of the above sanitary protocol violation events, and the processing means may record them in a log in memory.
  • safety protocol violation may comprise the absence of certain law-enforcement/security equipment and the like.
  • a series of user notification modules 320 for providing notifications to a user of the vehicle may be actuated by a number of processes via the control means 135 .
  • the user notification modules may include a visual notification means such as an LCD display capable of producing visual messages, or an audio system capable of producing audio messages.
  • the control centre 305 may be monitoring vehicle parameter data in real-time and may communicate notifications to be relayed to personnel on-board the ambulance 225 via one or more of the user notification modules 320 .
  • one or more of the user notification modules 320 may provide a notification to personnel on-board the ambulance 225 . Protocol breach notifications may be stored in local memory to be accessed by the processing means 105 and routed to the control means 135 which may actuate one or more of the user notification modules 320 .
  • the image analysing means 115 may be configured to count a number of persons in the vicinity of the vehicle at any given time. Counting a number of persons present in the interior or the exterior of the vehicle may furthermore comprise detecting the category of the persons and counting each, for example counting a number of authorised personnel, a number of patients, and so on. Authorised personnel may be detected by unique ID badges or codes disposed on their outerwear, via facial recognition and/or detection of a uniform based on certain characteristics such as colours or markers. Using machine learning algorithms or otherwise, the image analysing means 115 may also be configured to detect when a person has entered or exited the interior of the vehicle and associate a time stamp with said event.
  • this may be used to determine whether a vehicle has been used and so whether there is a need to flag the vehicle for cleaning. Detecting the number of persons present within the vehicle may also be used to control the interior lighting of the vehicle as well as any interior environment control systems, to produce optimal and desirable environmental conditions within the vehicle.
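  • A minimal sketch of counting persons by category and time-stamping entry/exit events might look as follows (the detection dictionary structure and function names are assumptions for illustration only):

        from datetime import datetime, timezone

        def count_by_category(detections):
            """Count detected persons per category, e.g. 'authorised', 'patient', 'other'.

            detections: list of dicts such as {"category": "authorised", "badge": "A123"}
            produced by the image analysing means (structure assumed for illustration).
            """
            counts = {}
            for person in detections:
                counts[person["category"]] = counts.get(person["category"], 0) + 1
            return counts

        def log_entry_exit(previous_ids, current_ids, log):
            """Time-stamp persons entering or leaving the interior between two frames.

            previous_ids / current_ids: sets of person identifiers seen in consecutive frames.
            """
            now = datetime.now(timezone.utc).isoformat()
            for person_id in current_ids - previous_ids:
                log.append({"person": person_id, "event": "entered", "time": now})
            for person_id in previous_ids - current_ids:
                log.append({"person": person_id, "event": "exited", "time": now})
            return current_ids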
  • FIG. 4 is an illustration of an exemplary method according to an embodiment of the present disclosure. Initially, an image of an area in the vicinity of the vehicle is captured, step 410. The image is then analysed, step 420. The image is analysed to detect if the captured image contains both a person and a patient transport apparatus, step 430. The image is further analysed to determine if the person in the captured image is authorised to access the vehicle, step 440. The locking mechanism 140 of the vehicle is selectively controlled in response to the captured image containing an authorised person and the patient transport apparatus, step 450. The first step 410 may be carried out at all times, or only when the vehicle engine is running.
  • As alluded to in the description of the preceding Figures, at step 430 the detecting means 120 determines if the image captured by the image capture device 175 contains both a person and a patient transport apparatus 215. If only one or the other, or neither, is detected then the process ends and returns to 410. If both are detected then the process proceeds to 440, where the processing means 105 is used to determine, based on data from a biometric database of all ambulance staff, if the person is authorised personnel, i.e. the paramedic 205. If the person is authorised then the process proceeds to 450, wherein the control means 135 selectively unlocks one or more doors of the ambulance 225.
  • Both the cab doors and rear doors, or one or the other, may be opened; this may be predefined with a set of instructions stored in memory.
  • the step 430 may further comprise determining one or a combination of: that the person and patient transport apparatus are within a predefined required distance from the vehicle; that they have been within the vicinity of the vehicle for a predefined required duration of time; and that they are detected to be moving towards the vehicle.
  • the predefined required distance may be 10 metres.
  • the predefined required duration of time within the vicinity of the vehicle may be 10 seconds.
  • the step 430 may combine the duration of time the person and patient transport apparatus are within the vicinity of the vehicle with the detection of them moving towards the vicinity of the vehicle, so as to reduce the occurrence of false positives.
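  • The combined decision of steps 430-450, including the optional distance, dwell-time and direction checks described above, might be sketched as follows (the 10 metre and 10 second defaults echo the examples given above; the data structure and all names are hypothetical):

        def should_unlock(person, apparatus, max_distance_m=10.0, min_duration_s=10.0):
            """Decide whether the locking mechanism should be released (steps 430-450).

            person / apparatus: dicts describing a detection, for example
            {"present": True, "authorised": True, "distance_m": 6.2,
             "duration_s": 12.0, "approaching": True} (structure assumed).
            """
            if not (person["present"] and apparatus["present"]):
                return False  # step 430: both must appear in the captured image
            if not person.get("authorised", False):
                return False  # step 440: biometric/ID authorisation failed
            near = (person["distance_m"] <= max_distance_m and
                    apparatus["distance_m"] <= max_distance_m)
            dwelling = min(person["duration_s"], apparatus["duration_s"]) >= min_duration_s
            approaching = person["approaching"] and apparatus["approaching"]
            # combining dwell time with approach direction reduces false positives
            return near and (dwelling or approaching)

        # e.g. if should_unlock(person, apparatus): control_means.unlock(("cab", "rear"))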
  • FIG. 5 is an illustration of possible scenarios the image capture device 175 may capture.
  • In FIG. 5a there is illustrated an embodiment in which both an authorised person 205 and a patient transport apparatus 215 are within the area captured by the image capture device 175.
  • detection of a person and a patient transport apparatus 215, followed by authorisation of said person, leads to selectively controlling the locking mechanism 140 of the vehicle in response to the captured image containing an authorised person and the patient transport apparatus, step 450.
  • this may be followed by additional method steps.
  • where, in addition to an authorised person 205, the further person 220 is detected as being in the captured image, access to the vehicle may be denied.
  • the detecting means 120 may be configured to detect if the motion of the further person 220 matches a predefined “violent behaviour profile”; such a profile may be stored on local memory 110 .
  • the predefined “violent behaviour profile” may comprise one or more of select facial expressions, arm movements or other bodily movements which are predefined as being typical of a person who presents a threat of violence.
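  • As an illustrative sketch only, matching observed motion against a stored “violent behaviour profile” could be expressed as a weighted comparison of motion cues (the cue names, weights and threshold are assumptions; the disclosure does not prescribe a particular scoring scheme):

        def matches_violent_profile(motion_features, profile):
            """Compare observed motion features with a stored 'violent behaviour profile'.

            motion_features: dict of normalised scores in [0, 1] for cues such as
            'arm_speed', 'facial_expression' and 'body_movement'.
            profile: dict holding per-cue weights and an overall threshold, e.g.
            {"weights": {"arm_speed": 0.5, "facial_expression": 0.2,
                         "body_movement": 0.3}, "threshold": 0.7}.
            Both structures are illustrative assumptions.
            """
            weights = profile.get("weights", {})
            score = sum(weights[cue] * motion_features.get(cue, 0.0) for cue in weights)
            return score >= profile.get("threshold", 1.0)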
  • the control centre 305 may be able to commandeer the image capture device 175 in real-time, assess the situation apparent from the image captured and make an authorisation decision on that basis.
  • an authorised person 205 is within the area captured by the image capture device 175 , but not a patient transport apparatus 215 .
  • the authorised person 205 may be denied access to the vehicle, according to the method of FIG. 4 .
  • alternative access authorisation methods may be implemented for such an instance as in FIG. 5c.
  • the authorised person 205 may be able to contact the control centre 305 ; the control centre 305 may then be able to commandeer the image capture device 175 in real-time, assess the situation apparent from the image captured and make an authorisation decision on that basis.
  • a patient transport apparatus 215 is within the area captured by the image capture device 175 , but not a person.
  • the control centre 305 may be able to commandeer the image capture device 175 in real-time, assess the situation apparent from the image captured and make decisions on that basis including but not limited to contacting personal radios of authorised person 205 .
  • The method of FIG. 6 is identical to the method of FIG. 4, but with the addition of steps 660-680 relating to facilitating access to the vehicle on the basis of the type of patient transport apparatus in use.
  • the type of patient transport apparatus 215 may be a stretcher, a wheelchair or a dolly; the sub-category may be the type of the stretcher, the type of the wheelchair or the type of the dolly.
  • a ramp is then extended or a lift lowered from the vehicle in response to the sub-category of patient transport apparatus 215 if it is determined to be necessary, step 680 .
  • the ‘if required’ clause is important in light of the fact that not all patient transport apparatuses require electro-mechanical assistance to access the ambulance 225 . As outlined above in relation to FIG. 2 , determining if any one of the ramp or lift is required can save time and energy in the instances where they are not required.
  • Data relating to markers for identifying the various patient transport apparatuses, as well as data relating to ramp/lift protocols based on the type of patient transport apparatus, may be stored on local memory.
  • one or more of the steps of FIG. 6 may be omitted.
  • the method may comprise: 610 - 660 only.
  • the method may comprise: 610 - 660 followed by 680 only.
  • the method may comprise: 610 - 670 only.
  • the method may comprise: 610-660 followed by providing a notification to a user of the vehicle indicating which means of access to the vehicle is substantially suitable based on the type of patient transport apparatus determined in step 660.
  • such a notification may be audio and/or visual.
  • a predefined list of authorised transport apparatus types may be stored in memory (locally or remotely), and as such in various embodiments where the type and possibly sub-type of transport apparatus is determined, the processing means may be further configured to determine if the type of patient transport apparatus is an authorised type of patient transport apparatus.
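  • A minimal sketch of mapping a determined apparatus type and sub-type to a suitable means of access, and of checking it against a predefined list of authorised types, is given below (the mapping entries and names are illustrative placeholders; the actual protocols would be stored in local or remote memory as described above):

        # Hypothetical mapping of (type, sub-type) to a suitable means of access; the real
        # protocols and the authorised list would be stored in local or remote memory.
        ACCESS_PROTOCOLS = {
            ("stretcher", "powered"): "lift",
            ("stretcher", "manual"): "ramp",
            ("wheelchair", "standard"): "ramp",
            ("dolly", "standard"): "manual",
        }
        AUTHORISED_TYPES = {"stretcher", "wheelchair", "dolly"}  # illustrative only

        def select_access_means(apparatus_type, sub_type):
            """Return the suitable access means, or None if the type is not authorised."""
            if apparatus_type not in AUTHORISED_TYPES:
                return None  # may trigger an override request or a notification instead
            return ACCESS_PROTOCOLS.get((apparatus_type, sub_type), "manual")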
  • the term ‘patient transport apparatus’ may be replaced by ‘personnel transport apparatus’, since the present invention is not limited to ambulances or other medical care vehicles.
  • the detection and authorisation of personnel may be omitted—such embodiments are best presented in FIGS. 11-14 .
  • FIG. 7 is an illustration of an exemplary method according to an embodiment of the present disclosure, containing the steps of the embodiment of FIG. 4 followed by the step of selectively controlling at least one parameter of the vehicle in response to detecting both the authorised personnel and patient transport apparatus approaching the vicinity of the vehicle, step 720 .
  • the vehicle parameter may be a vehicle environment parameter such as temperature, pressure or humidity.
  • the environment control means 180 would immediately begin enacting a predefined environment control protocol accessed from memory 110 by the processing means 105 .
  • the vehicle parameter may include but is not limited to vehicle ignition, emergency scene lights and electrically powered medical equipment charging facilities.
  • FIG. 8 is an illustration of a method 800 involving vehicle environment control but in dependence upon measured external environment parameters.
  • the method 800 contains the steps of the embodiment of FIG. 4 .
  • once the person and patient transport apparatus 215 have been detected, the person has been authorised and the locking mechanism 140 of the vehicle has been selectively controlled, at least one external environment parameter is measured, step 820.
  • at least one vehicle environment parameter is selectively controlled in response to detecting both the authorised personnel and patient transport apparatus approaching the vicinity of the vehicle, step 830 .
  • the processing means 105 may match the at least one measured external environment parameter to a predefined environment control protocol associated with the closest defined parameter set in memory.
  • the processing means 105 may then communicate this protocol to the environment control means 180 , which would subsequently regulate the vehicle environment parameters according to that protocol.
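  • One possible form of the ‘closest defined parameter set’ matching described above is sketched below (the parameter names, protocol structure and the simple Euclidean distance metric are assumptions made for illustration):

        def closest_protocol(measured, protocols):
            """Match measured external parameters to the closest predefined protocol.

            measured: e.g. {"temperature": 41.0, "humidity": 0.2}
            protocols: list of dicts such as
            {"name": "hot_dry", "temperature": 45.0, "humidity": 0.1, "setpoint": 21.0}.
            Field names and the Euclidean distance metric are illustrative assumptions.
            """
            def distance(protocol):
                shared = [k for k in measured if isinstance(protocol.get(k), (int, float))]
                return sum((measured[k] - protocol[k]) ** 2 for k in shared) ** 0.5

            return min(protocols, key=distance)

        # environment_control_means.apply(closest_protocol(sensor_readings, stored_protocols))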
  • the remaining steps of the method 800 correspond to steps 660-680 of FIG. 6.
  • FIG. 9 is an illustration of a method 900 according to an embodiment of the present disclosure, identical to the method 600 of FIG. 6 , but with the additional step of detecting if patient harnessing and/or patient positioning protocols have been adhered to according to a predefined protocol, step 920 .
  • FIG. 10 is an illustration of a method 1000 according to an embodiment of the present disclosure, identical to the method 900 of FIG. 9 , but with the additional steps of maintaining a log in memory each time the predefined protocol(s) for harnessing and/or positioning is/are breached, step 1020 , and transmitting at least part of the contents of the memory to a remote database(s), step 1030 .
  • the at least part of the contents of the memory may be transmitted to the remote database(s) wirelessly via the wireless communication means 155 or via wired means for example once the ambulance 225 has returned to its station after a response incident.
  • In FIG. 11 there is provided a method 1100 of controlling access to a vehicle based on detecting 1120 a patient transport apparatus and further determining 1130 a type of the patient transport apparatus. Subsequently, the method 1100 comprises selectively controlling 1140 a locking mechanism of the vehicle in response to the captured image containing the personnel transport apparatus.
  • FIG. 12 illustrates a further embodiment where, in addition to the steps 1110 - 1140 of FIG. 11 , there is provided the additional step of determining 1220 which means of access to the vehicle is substantially suitable for accessing the vehicle based on the type of the personnel transport apparatus in the captured image.
  • the vehicle may be equipped with a notification means.
  • the notification means may be configured to provide a notification, such as an audio and/or visual notification, from the vehicle detailing which means of access to the vehicle is substantially suitable for accessing the vehicle based on the type of the patient transport apparatus in the captured image.
  • An example of an embodiment wherein the provision of a notification is included is given in FIG. 13. If neither the lift nor the ramp is suitable based on the type of the patient transport apparatus detected, the notification means may provide a notification in such an instance too.
  • Other means of access to the vehicle may include manual access via the door(s) of the vehicle.
  • the notification means may comprise an audible or visual alarm.
  • the notification means may comprise the sirens and/or one or more lights mounted on the vehicle such as the siren lights of an emergency vehicle.
  • a notification may be provided to a remote device of one or more personnel indicating which means of access to the vehicle is suitable.
  • Said remote device may be communicatively coupled to the notification means or another component of the vehicle such as but not limited to the control means or the processing means.
  • the vehicle system may default to manual access (rather than the automatic extension of an access means such as the vehicle lift or ramp) and manual authentication.
  • the processes may terminate or continue to authenticate any personnel in the vicinity of the vehicle, and in such embodiments where the process continues to authentication the process may selectively control 1350 the locking mechanism in response to the captured image containing the personnel transport apparatus and an authenticated person.
  • FIG. 14 illustrates a method 1400 similar to that of FIG. 12, but including the further step 1460 of extending a ramp or lowering a lift from the vehicle to facilitate access of the patient transport apparatus to the vehicle in response to the type of patient transport apparatus.
  • a notification may be provided as in FIG. 12 (however it will be understood that the provision of a notification may also apply to cases where the ramp or lift are indeed determined to be appropriate).
  • the vehicle system may default to manual. Alternatively, other means of access to the vehicle may be actuated to open, where available.
  • it may further be determined whether the type of patient transport apparatus in the vicinity of the vehicle is an authorised type of patient transport apparatus.
  • a certain type of wheelchair or dolly may not meet the health and safety requirements associated with the vehicle in question.
  • Authorised personnel may have the option, possibly subject to the provision of a security pass, to override the requirement that the type of the patient transport apparatus is an authorised patient transport apparatus.
  • such an override event may be included in the report compiled by the processing means 105 .
  • data relating to authorised types of patient transport apparatuses may be stored on local memory or remotely.
  • what constitutes an authorised type of patient transport apparatus may dynamically change based on input from a remote user or authorised personnel local to the vehicle, based on information input such as a type of condition the patient is being treated for.
  • certain steps may be omitted. For example, it may save power/memory not to determine which means of access to the vehicle is substantially suitable for accessing the vehicle based on the type of the patient transport apparatus in the captured image if the patient transport apparatus is not authorised to access the vehicle.
  • the process 1400 may be resumed or restarted subject to an override event such as that discussed in the foregoing.
  • FIGS. 15-19 illustrate various embodiments of processes for determining patient safety.
  • the image analysing means 115 may further be configured to determine if the patient transport apparatus is secured in place in the vehicle and/or secured on a vehicle access means (for example a vehicle lift or vehicle ramp).
  • an ‘and/or’ clause for the three options, which can be nominally labelled ‘A’, ‘B’ and ‘C’, may include all permutations of these labels. Namely, ‘A and/or B and/or C’ may have within its meaning here: A alone; B alone; C alone; A and B; A and C; B and C; and A, B and C together.
  • A, B and C here refer to steps 1830, 1840 and 1850 respectively.
  • determining the type of patient transport apparatus and possibly additionally the sub-type of the patient transport apparatus will facilitate increased accuracy in determining whether the patient is correctly harnessed and/or positioned in the patient transport apparatus, since harnessing and positioning protocols may vary between various patient transport apparatuses.
  • determining the type and possibly sub-type of patient transport apparatus allows one to determine whether a vehicle ramp or lift ought to be actuated into operation, and to extend a ramp or lower a lift in response to the type of patient transport apparatus detected. Accordingly, improved patient safety, improved patient safety protocol adherence, reduced power consumption and improved time efficiency are advantages of the processes of FIG. 18.
  • FIG. 19 provides an exemplary method similar to that of FIG. 15 , but including the additional step of providing 1920 a notification to a user of the vehicle that the patient is not correctly positioned and/or harnessed. It will be understood that the steps of FIG. 19 may be further combined in a number of variations with the steps of FIGS. 16-18 . What is more, a notification may further be provided in response to one or more of:
  • the above determinations i)-iii) may comprise steps performed appropriately in combination with any of the processes of FIGS. 16-19 .
  • FIGS. 4 and 6-10 While a number of exemplary embodiments of a method according to the present disclosure have been disclosed in FIGS. 4 and 6-10 , other combinations of the method steps of FIGS. 4 and 6-10 are possible and are envisaged to be so by the inventors. For example, one might combine the steps 910 - 920 of FIG. 9 with step 720 of FIG. 7 or steps 820 - 830 of FIG. 8 .
  • the image capture device 175 may further comprise thermal imaging capabilities, and/or a separate thermal imaging device (not illustrated) may be fitted to the vehicle.
  • the image capture device 175 and the thermal imaging device may be fitted to the interior of the vehicle and/or the exterior of the vehicle as desired.
  • FIGS. 23-26 illustrate a number of example scenarios relating to thermal imaging data capture.
  • an image is captured in the vehicle or in the vicinity of the vehicle, 2310 .
  • the captured image is then analysed, 2320 .
  • a person is then detected in the captured image, 2330 .
  • a characteristic associated with the person is then determined at a first time, 2340 .
  • the determined characteristic is then associated with a predefined threshold value, 2350 .
  • a change of state in the vehicle is actuated, 2360 .
  • the subject may comprise a person, for example authorised or unauthorised personnel, or a patient. Temperature may be indicative of stress levels of persons in the vicinity of the vehicle, a critical event in a patient, or whether authorised personnel are fit to work in the vehicle.
  • actuating a change of state of the vehicle comprises one or more of: providing a notification to one or more areas of the vehicle; providing a notification to a user device; or creating a log in memory of an instance of the characteristic surpassing the predefined threshold value.
  • the change of state may be actuated by a processing means in cooperation with other necessary components.
  • the processing means may send instructions to the notification means to provide the notification in the vicinity of the vehicle, or the processing means may send instructions to a wireless transmission means to provide a notification to the user device.
  • a notification may be provided to a user of the vehicle in response to the temperature of the subject being greater than a predefined threshold temperature.
  • the notification may be provided in an audio and/or visual format.
  • the notification may be provided to a user device such as a smart phone, or a tablet, or a system of the vehicle.
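  • A minimal sketch of comparing a determined temperature with a predefined threshold and actuating a change of state (notification and logging) follows; the notifier interface, field names and message text are hypothetical:

        def check_temperature(subject_temp_c, threshold_c, notifier, log):
            """Actuate a change of state when a subject's temperature exceeds a threshold.

            notifier: hypothetical object exposing notify_vehicle() and notify_device().
            log: list acting as the in-memory log of threshold events.
            """
            if subject_temp_c > threshold_c:
                message = ("subject temperature %.1f C exceeds threshold %.1f C"
                           % (subject_temp_c, threshold_c))
                notifier.notify_vehicle(message)  # audio and/or visual notification in the vehicle
                notifier.notify_device(message)   # e.g. push notification, SMS or email
                log.append({"event": "temperature_threshold", "value": subject_temp_c})
                return True
            return False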
  • Captured thermal data such as temperature profiles in images may be fed into a data processing application located locally or remotely such as in the cloud.
  • the data processing application comprises a machine learning application.
  • in response to a predefined threshold value being surpassed, the application may provide instructions to the vehicle environment control means to alter the temperature on board the vehicle to a particular value.
  • the temperature on board the vehicle may be altered dynamically in response to thermal data captured in real time.
  • a first temperature of the subject in the captured image may be determined at a first time and a second temperature of the subject may be determined at a second time, 2510 .
  • a rate of change of the temperature of subject may then be determined based on the first temperature at the first time and the second temperature at the second time, 2520 .
  • the temperature of a subject in a captured image may be monitored in real time or at predefined intervals to monitor a rate of change of the temperature of the subject in the captured image. Similar to the above, a rate of change of temperature may be indicative of stress levels of persons in the vicinity of the vehicle, a critical event in a patient, or whether authorised personnel are fit to work in the vehicle. For example, if a threshold rate of change of temperature X/° C. of a patient is surpassed, this may indicate that the patient's condition has worsened.
  • if a threshold rate of change of temperature X/° C. of a patient is surpassed, this may also indicate that the patient is suffering from a particular condition/disease.
  • a temperature above the normal temperature of a human may indicate a fever.
  • a threshold temperature may be, for example, 37.8 degrees Celsius. If a person is suffering from a particular condition this may be used in determining a course of action such as quarantine or isolation of said person.
  • it may be desirable to determine a rate of change of the rate of change of temperature (i.e. a second order derivative). Referring to FIG. 26, an event such as a threshold value of the rate being surpassed may also be associated with a course of action to be taken, 2650.
  • the course of action may comprise one or more of: adjusting an ambient temperature of the interior of the vehicle, provision of medication to the person, or a change in vehicle speed.
  • the notification means may provide a notification to a user of the vehicle or remotely located personnel containing information relating to the course of action, 2660 . Such an event may also be recorded in local or remote memory. It will be understood that step 2650 may equally be performed in the method of FIGS. 23-25 or FIG. 27 ahead.
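  • The rate-of-change (and second order) monitoring and the associated course of action of step 2650 might be sketched as follows, assuming time-stamped temperature samples; the thresholds and the returned actions are illustrative only:

        def temperature_rates(samples):
            """Estimate first and second time derivatives of temperature.

            samples: list of (time_s, temp_c) tuples, oldest first; at least three
            samples are required for the second derivative (layout is an assumption).
            """
            (t0, x0), (t1, x1), (t2, x2) = samples[-3:]
            rate_a = (x1 - x0) / (t1 - t0)                 # earlier first derivative
            rate_b = (x2 - x1) / (t2 - t1)                 # latest first derivative
            acceleration = (rate_b - rate_a) / (t2 - t1)   # rate of change of the rate
            return rate_b, acceleration

        def course_of_action(rate, acceleration, rate_threshold, accel_threshold):
            """Map surpassed thresholds to an illustrative course of action (cf. step 2650)."""
            if acceleration > accel_threshold:
                return "notify driver: patient condition deteriorating rapidly"
            if rate > rate_threshold:
                return "adjust ambient temperature and notify on-board personnel"
            return None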
  • Notification of the changing condition of a patient may be indicative of whether a new course of action needs to be taken in relation to the patient. For example, whether certain drugs ought to be administered by paramedics, whether the current course of treatment is worsening the situation and/or whether the temperature in the interior of the vehicle needs to be altered. Notifications may be specifically directed to the driver of the vehicle, for example whether arrival at a hospital has become urgent or whether their driving is adversely affecting personnel and/or patient status.
  • Surface temperatures may be monitored in the interior or exterior of the vehicle. For example, a temperature of the surface of the patient transport apparatus may be monitored. Where the patient transport apparatus is exposed to a high or low temperature for a period of time, the surface of the patient transport apparatus may reach a temperature which is unsuitable for placing a patient thereon. In this case personnel may be notified appropriately.
  • a log of any thermal imaging data and related events may be recorded in local and/or remote memory for subsequent retrieval.
  • Said data may form part of a report compiled by software.
  • said report may associate data with specific users of the vehicle.
  • the method 2700 comprises capturing an image of an area in the vicinity of the vehicle, step 2710 .
  • the image is then analysed, step 2720 .
  • the image is analysed to detect if the captured image contains both a person and a patient transport apparatus, step 2730 .
  • the image is further analysed to determine a parameter relating to the person on the patient transport apparatus, step 2740 .
  • a field of a database may be populated, step 2750 .
  • the database is an electronic patient care record (ePCR).
  • the parameter relating to the person on the patient transport apparatus comprises a gender.
  • Additional data parameters relating to the patient such as age, race, height and weight may also be estimated using machine learning algorithms or otherwise. It will be understood that any number of other captured data parameters described in the present disclosure may be used in appropriately populating a database such as the ePCR.
  • the populated ePCR may be stored on local or remote memory for subsequent retrieval.
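  • A minimal sketch of step 2750, populating fields of an ePCR from image-derived estimates, is given below (the field names and the estimates dictionary are assumptions; an operational ePCR schema would be used in practice):

        def populate_epcr(estimates, epcr=None):
            """Populate fields of an electronic patient care record from image-derived estimates.

            estimates: dict produced by the image analysing means, e.g.
            {"gender": "female", "age_estimate": 67, "height_cm": 162}.
            Field names are placeholders; an operational ePCR schema would be used in practice.
            """
            epcr = dict(epcr or {})
            for field_name in ("gender", "age_estimate", "race", "height_cm", "weight_kg"):
                if field_name in estimates:
                    epcr[field_name] = estimates[field_name]  # step 2750: populate a database field
            return epcr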
  • the steps of the method 2700 of FIG. 27 may be implemented in combination with the method steps of other processes described herein.
  • steps 2740 and 2750 may be implemented in combination with the steps of FIG. 4 or FIG. 6 or 10 and so on.
  • a method 2800 of vehicle security comprises capturing an image of an area in the vicinity of the vehicle, step 2810 .
  • the image is then analysed, step 2820 .
  • the method comprises determining whether an attack is taking place in the vicinity of the vehicle.
  • a change of state of the vehicle is actuated, step 2850 .
  • actuating a change of state of the vehicle comprises one or more of: activating a recording mode of the image capture device(s) 175 in the interior and/or exterior of the vehicle; providing a notification to one or more areas of the vehicle; providing a notification to a user device; or creating a log of the attack event in memory.
  • the change of state may be actuated by a processing means in cooperation with other necessary components.
  • the processing means may send instructions to the notification means to provide the notification in the vicinity of the vehicle, or the processing means may send instructions to a wireless transmission means to provide a notification to the user device.
  • determining whether an attack is taking place in the vicinity of the vehicle may comprise one or both of determining that an attack is taking place on the body of the vehicle, or on persons who are in the vicinity of the vehicle. For example, paramedics may be under attack near their ambulance, or police officers near their squad car. Accordingly, FIG. 29 illustrates a method 2900 comprising the additional step of detecting a first person and a second person in the captured image, 2930 . It is then determined whether an attack is taking place on the first person, 2940 .
  • the first person may be an authorised person such as a paramedic or police officer, and the second person may be an assailant.
  • the authorised personnel may be identified by their uniform, by facial recognition or by other means such as a scan-able code disposed on their clothing which can be compared to a database stored in memory.
  • a person may be identified as a patient in numerous ways, for example based on their position and orientation such as on the ground or on a patient transport apparatus, based on being harnessed into a patient transport apparatus, or based on certain medical equipment associated with them such as an IV drip.
  • a person may be identified as an apprehended person. This may be determined by the detection of the person as having their hands cuffed.
  • FIG. 30 provides a method 3000 in which the image analysing means 115 determines whether the first person is an authorised person, a patient or an apprehended person.
  • Determining that an attack is taking place may comprise detecting certain types of movement of the second person in the captured image. For example, detecting an arm of the second person in the captured image moving in a certain direction at a certain speed (i.e. a punch), or detecting a sequence of such events (a series of punches).
  • the image analysing means 115 may be configured to execute the method 2800 implementing one or more machine learning algorithms.
  • the machine learning algorithm(s) may be trained with examples of “attack behaviour” against which it can compare captured image data.
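  • As an illustrative sketch only, classifying captured motion as attack behaviour with a pre-trained model and actuating the change of state of step 2850 might take the following form (the predict_proba-style classifier interface, the camera/notifier objects and the threshold are assumptions, not the disclosed algorithm):

        def attack_detected(motion_sequence, classifier, score_threshold=0.9):
            """Classify a summarised motion sequence as attack behaviour.

            classifier: a model trained offline on labelled examples of attack behaviour
            and exposing a predict_proba()-style interface (interface is an assumption).
            motion_sequence: feature vector summarising e.g. limb speed and direction
            over recent frames.
            """
            probability = classifier.predict_proba([motion_sequence])[0][1]
            return probability >= score_threshold

        def actuate_attack_response(cameras, notifier, log, event):
            """Actuate the change of state of step 2850: record, notify and log."""
            for camera in cameras:
                camera.start_recording()            # interior and/or exterior image capture devices
            notifier.notify_vehicle("attack detected in the vicinity of the vehicle")
            notifier.notify_device(event)           # e.g. push notification to a safety officer
            log.append(event)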
  • recording may continue until the image analysing means 115 determines that the attack has ceased.
  • the video data may be stored on local and/or remote memory for subsequent retrieval. Data such as but not limited to the identity of personnel involved in the incident, an ePCR of a patient involved in the incident, as well as time stamps, may be associated with the event.
  • Providing a notification to one or more areas of the vehicle may comprise activating an audibly and/or visually perceptible alarm on the exterior of the vehicle or in the driver's cab.
  • the notification may continue until the image analysing means 115 determines that the attack has ended or until authorised personnel deactivate the notification.
  • the notification may comprise a visual notification appearing on the user interface of the cab, and/or an audio notification emanating from audio speakers.
  • Providing a notification to the user device may comprise processor 105 sending instructions to a wireless transmission means to provide an audibly and/or visually perceptible alarm (e.g. sirens, flashing lights and the like), an automated email, push notification, a text message such as SMS, or other alert on a user device.
  • the user device may comprise a device belonging to a safety officer.
  • the video may be live streamed to the safety officer's device for continual monitoring of the situation in real time.
  • the method 2800 may further comprise controlling a locking mechanism of the vehicle in response to determining an attack is taking place in the vicinity of the vehicle.
  • the methods of FIGS. 28-30 may be extended to the detection and identification of additional persons in the captured image.
  • the method for detecting persons and objects, and parameters related to both such as distance and velocity, and furthermore for actuating the vehicle parameters and components, may be implemented in software, firmware, hardware, or a combination thereof.
  • the method is implemented in software, as an executable program, and is executed by one or more special or general purpose digital computer(s), such as a personal computer (PC; IBM-compatible, Apple-compatible, or otherwise), personal digital assistant, workstation, minicomputer, or mainframe computer.
  • the steps of the method may be implemented by a server or computer in which the software modules reside or partially reside.
  • such a computer will include, as will be well understood by the person skilled in the art, a processor, memory, and one or more input and/or output (I/O) devices (or peripherals) that are communicatively coupled via a local interface.
  • the local interface can be, for example, but not limited to, one or more buses or other wired or wireless connections, as is known in the art.
  • the local interface may have additional elements, such as controllers, buffers (caches), drivers, repeaters, and receivers, to enable communications. Further, the local interface may include address, control, and/or data connections to enable appropriate communications among the other computer components.
  • the processor(s) may be programmed to perform the functions of the method for authorising persons and controlling vehicle parameters such as but not limited to the lock state of the doors and the state of access facilities such as a ramp or lift, or vehicle parameters such as but not limited to temperature.
  • the processor(s) is a hardware device for executing software, particularly software stored in memory.
  • Processor(s) can be any custom made or commercially available processor, a central processing unit (CPU), an auxiliary processor among several processors associated with a computer, a semiconductor based microprocessor (in the form of a microchip or chip set), a macro-processor, or generally any device for executing software instructions.
  • Memory is associated with processor(s) and can include any one or a combination of volatile memory elements (e.g., random access memory (RAM, such as DRAM, SRAM, SDRAM, etc.)) and non-volatile memory elements (e.g., ROM, hard drive, tape, CDROM, etc.). Moreover, memory may incorporate electronic, magnetic, optical, and/or other types of storage media. Memory can have a distributed architecture where various components are situated remote from one another, but are still accessed by processor(s).
  • the software in memory may include one or more separate programs.
  • the separate programs comprise ordered listings of executable instructions for implementing logical functions in order to implement the functions of the modules.
  • the software in memory includes the one or more components of the method and is executable on a suitable operating system (O/S).
  • the present disclosure may include components provided as a source program, executable program (object code), script, or any other entity comprising a set of instructions to be performed.
  • in the case of a source program, the program needs to be translated via a compiler, assembler, interpreter, or the like, which may or may not be included within the memory, so as to operate properly in connection with the O/S.
  • a methodology implemented according to the teaching may be expressed in (a) an object oriented programming language, which has classes of data and methods, or (b) a procedural programming language, which has routines, subroutines, and/or functions, for example but not limited to, C, C++, Pascal, Basic, Fortran, Cobol, Perl, Java, and Ada.
  • a computer readable medium is an electronic, magnetic, optical, or other physical device or means that can contain or store a computer program for use by or in connection with a computer related system or method.
  • Such an arrangement can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions.
  • a “computer-readable medium” can be any means that can store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
  • the computer readable medium can be for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or propagation medium. Any method descriptions or blocks in the Figures, should be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps in the process, as would be understood by those having ordinary skill in the art.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Mechanical Engineering (AREA)
  • Health & Medical Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Automation & Control Theory (AREA)
  • Veterinary Medicine (AREA)
  • Animal Behavior & Ethology (AREA)
  • Signal Processing (AREA)
  • Medical Informatics (AREA)
  • Transportation (AREA)
  • Databases & Information Systems (AREA)
  • Computing Systems (AREA)
  • Software Systems (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Business, Economics & Management (AREA)
  • Emergency Management (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biomedical Technology (AREA)
  • Quality & Reliability (AREA)
  • Epidemiology (AREA)
  • Pathology (AREA)
  • Biophysics (AREA)
  • Alarm Systems (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Traffic Control Systems (AREA)
  • Lock And Its Accessories (AREA)
  • Emergency Alarm Devices (AREA)
  • Apparatus For Radiation Diagnosis (AREA)

Abstract

The present disclosure relates to a method and system for enhancing security in a vehicle, including capturing an image of an area in the vehicle or in the vicinity of the vehicle, analysing the captured image, and determining whether there is an attack taking place in the vicinity of the vehicle.

Description

    FIELD OF THE INVENTION
  • The present invention relates to a method and system for enhancing the security of a vehicle, including actuating a change of state of the vehicle in response to determining that an attack is taking place in the vicinity of the vehicle.
  • BACKGROUND OF THE INVENTION
  • Biometric identification has found a large number of applications in recent years, and particularly in the field of security. The uniqueness of human features, whether that is facial features (such as facial structure or eyes), finger prints or hand prints, adds a level of security vastly more difficult to “hack” than historical conventional security means such as password identification. Typically optical sensors of various kinds are used to detect unique facial features among other biometric markers which identify an individual for authorisation/security clearance.
  • A common scenario in the field of emergency response medical treatment is one in which paramedics are treating and collecting patients at a scene containing at least one or several sources of threat to themselves, the patient and the ambulance vehicle itself (for example, a riot or terrorist incident or the like). The risk of vehicle hijacking is a very real possibility in such a scenario if the paramedic(s) should forget to lock the ambulance upon leaving it to attend a patient or upon returning to the vehicle. Moreover, paramedics are required by law to leave vehicles running at the scene, making hijacking an even more realistic threat.
  • In addition to improving security, providing automated access to the clinical environment of the ambulance—for example via a ramp, lift or other means—is desirable to reduce the risk of an accident occurring during the loading of a patient in to the ambulance, and to reduce the time between approaching the ambulance and driving away, in a high-stress environment. Such a scenario also presents an increased risk of incorrect patient harnessing or positioning.
  • Furthermore, the regulation of environment parameters on board the ambulance, parameters which ideally are tailored to the condition of the patient such as temperature, pressure and humidity, is a step which might go amiss in such a high-stress scenario. Moreover, versatility of the ambulance service ought to be a priority given the demand for ambulances worldwide in locations varying for example in temperature and altitude. External environment conditions at the response location may differ globally as widely as a dry desert at sea level at 50° C., a sub-polar environment at −40° C. or a high elevation environment, e.g. 10,000 ft above sea level. In any such environment it is crucial that the patient is introduced to an environment which is determined to not be adverse to their health and their treatment. For example, a patient suffering from severe heatstroke would benefit from a cool environment or an environment which is gradually brought to a cool temperature with respect to the external environment. The above description is to at least some extent equally applicable to other emergency response vehicles such as an emergency helicopter or an emergency boat.
  • It is thus desirable to provide methods, systems and apparatuses for controlling access to a vehicle and detecting proximate security incidents which address at least some of the drawbacks of the prior art.
  • SUMMARY OF THE INVENTION
  • According to a first aspect of the invention there is provided a method for controlling access to a vehicle, comprising:
      • capturing an image of an area in the vicinity of the vehicle;
      • analysing the captured image;
      • detecting if the captured image contains both at least one person and a personnel transport apparatus;
      • determining if the at least one person in the captured image is authorised to access the vehicle;
      • determining a type of personnel transport apparatus and controlling a parameter of the vehicle in response to the determined type; and
      • selectively controlling a locking mechanism of the vehicle in response to the captured image containing at least one authorised person and the personnel transport apparatus.
  • Advantageously, the method further comprises determining if the personnel transport apparatus is a stretcher, a wheelchair or a dolly.
  • Further advantageously, the method comprises determining the type of the stretcher, the type of the wheelchair or the type of the dolly.
  • Advantageously, the method further comprises determining which means of access to the vehicle is substantially suitable for accessing the vehicle based on the type of the personnel transport apparatus in the captured image.
  • Advantageously, the method further comprises extending a ramp from the vehicle to facilitate access of the personnel transport apparatus to the vehicle in response to the type of personnel transport apparatus.
  • Advantageously, the method further comprises lowering a lift to facilitate access of the personnel transport apparatus to the vehicle.
  • Beneficially, the method further comprises selectively controlling at least one parameter of the vehicle in response to detecting both of the at least one authorised personnel and the personnel transport apparatus approaching the vicinity of the vehicle.
  • Preferably, the at least one parameter is an environmental parameter.
  • Advantageously the environmental parameter comprises temperature.
  • Advantageously, the method further comprises measuring at least one external environment parameter of the vehicle.
  • Advantageously, the method further comprises controlling a vehicle environment parameter in response to the at least one measured external environment parameter.
  • Advantageously, the method further comprises regulating the vehicle environment to a predefined characteristic.
  • Advantageously, the method further comprises detecting if a person on the personnel transport apparatus is correctly harnessed according to a predefined protocol.
  • Advantageously, the method further comprises detecting if a person on the personnel transport apparatus is correctly positioned on said personnel transport apparatus according to a predefined protocol.
  • Advantageously, the method further comprises maintaining a log in memory each time the predefined protocol was breached.
  • Preferably the log in memory is associated with a user profile of one or more authorised personnel.
  • Advantageously, the method further comprises transmitting at least part of the contents of the memory to a remote database.
  • Ideally a transmission transmits at least part of the contents of the memory to a remote database when the predefined protocol was breached.
  • Preferably the transmission is a wireless transmission.
  • Advantageously, the method further comprises providing an audio and/or visual notification from the vehicle detailing which means of access to the vehicle is substantially suitable for accessing the vehicle based on the type of the personnel transport apparatus in the captured image.
  • Advantageously, the method further comprises determining whether the type of the personnel transport apparatus in the captured image is an authorised type of personnel transport apparatus.
  • According to a second aspect of the invention there is provided a method for controlling access to a vehicle, the method comprising:
  • capturing an image of an area in the vicinity of the vehicle;
  • analysing the captured image;
  • detecting if the captured image contains a personnel transport apparatus;
  • determining a type of personnel transport apparatus and controlling a parameter of the vehicle in response to the determined type; and
  • selectively controlling a locking mechanism of the vehicle in response to the captured image containing the personnel transport apparatus.
  • Preferably the method further comprises determining which means of access to the vehicle is substantially suitable for accessing the vehicle based on the type of the personnel transport apparatus in the captured image.
  • According to third aspect of the invention there is provided an apparatus for controlling access to a vehicle, comprising:
  • an image capture device for capturing an image of an area in the vicinity of the vehicle;
  • an image analysing means for analysing the captured image;
  • a detecting means for detecting if the captured images contain both at least one person and a personnel transport apparatus, wherein the detecting means is further configured for determining the type of the personnel transport apparatus;
  • a processing means for determining if the at least one person in the captured image is authorised to access the vehicle; and
  • a control means for selectively controlling a locking mechanism of the vehicle in response to the captured image containing at least one authorised person and the personnel transport apparatus.
  • Preferably the processing means is configured to implement a machine learning algorithm.
  • Further preferably the detecting means is configured for detecting if the personnel transport apparatus is a stretcher, a wheelchair or a dolly.
  • Ideally the detecting means is configured for detecting the type of the stretcher, the type of the wheelchair or the type of the dolly.
  • Preferably the control means is further configured for controlling operations of a mechanical apparatus associated with the vehicle.
  • In one aspect the mechanical apparatus is a retractable ramp.
  • In another aspect the mechanical apparatus is a lift.
  • Preferably the apparatus further comprises an environment measurement means for measuring one or more vehicle environment parameters.
  • Further preferably the apparatus further comprises an environment controlling means for controlling one or more vehicle environment parameters.
  • Ideally the apparatus further comprises an external environment measurement means for measuring one or more external environment parameters.
  • Preferably the apparatus further comprises memory for storing one or more environment control protocols defining instructions for controlling the vehicle environment parameters.
  • Beneficially the control means is configured for varying the one or more environment parameters towards predefined values for the one or more environment parameters, with respect to the external environment parameters.
  • Further beneficially the control means is configured for dynamically adjusting power output of the environment controlling means in dependence on one or a combination of one or more environment parameters, one or more external environment parameters, duration the vehicle doors have been open and duration of presence of the at least one authorised personnel in the vicinity of the vehicle.
  • Preferably the control means is operationally associated with the vehicle electromechanical actuators.
  • Preferably the image analysing means is configured to determine if one or both of said at least one authorised personnel and a person on the personnel transport apparatus are correctly harnessed according to a predefined protocol.
  • Further preferably the image analysing means is configured to determine if a person on the personnel transport apparatus is correctly positioned according to a predefined protocol.
  • Ideally the apparatus further comprises memory for storing one or more predefined protocols for harnessing and/or positioning a person on the patient transport apparatus.
  • Further ideally the apparatus further comprises a communication means for communicating with a remote server or a remote database.
  • Preferably the communication means is a wireless communication means.
  • Preferably the control means is configured for locking the vehicle doors if a predefined combination or number of authorised personnel leaves the vehicle or the vicinity of the vehicle.
  • Preferably the apparatus further comprises a recording means for synchronously recording personnel status, vehicle environment and vehicle parameter data.
  • Advantageously the detecting means is configured for detecting fluid spillage.
  • In one aspect the fluid is a bodily fluid including but not limited to blood or vomit.
  • Advantageously the apparatus further comprises memory for storing one or more of synchronously recorded data, harnessing protocol adherence, positioning protocol adherence and fluid spillage.
  • Ideally the processing means is further configured to generate a report to be transmitted wirelessly to a remote server or a remote database.
  • Preferably the report to be transmitted to a remote server or a remote database contains one or more of the synchronously recorded data, harnessing protocol adherence, positioning protocol adherence and fluid spillage.
  • Advantageously, the detecting means is further configured to determine the distance of the at least one person and one or more objects from the vehicle, their duration of time from the vehicle, and the direction and speed of their motion with respect to the vehicle.
  • Advantageously, the apparatus further comprises a notification means configured for providing an audio and/or visual notification.
  • According to a fourth aspect of the invention there is provided a computer-readable medium comprising instructions which, when executed, cause a processor to carry out one or more aspects of a method as those outlined above.
  • According to a fifth aspect of the invention there is provided an emergency vehicle comprising the apparatus of the second and third aspects.
  • According to a further aspect of the invention there is provided a method of vehicle security comprising the steps of:
      • capturing an image of an area in the vicinity of the vehicle;
      • analysing the captured image;
      • determining by an image analysing means whether an attack is taking place in the vicinity of the vehicle; and
  • in response to determining that an attack is taking place in the vicinity of the vehicle, actuating a change of state of the vehicle. A security system for implementing the method of vehicle security is also provided, comprising:
  • an image capture device configured for capturing an image of an area in the vicinity of a vehicle;
  • an image analysing means configured for analysing the captured image, wherein the image analysing means is further configured to determine whether an attack is taking place in the vicinity of the vehicle;
  • a processor configured to actuate a change of state of the vehicle in response to the image analysing means determining that an attack is taking place; and
  • memory.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is an illustration of an apparatus according to an embodiment of the present disclosure;
  • FIG. 2 is an illustration of elevation and plan views of a scenario in which the apparatus may be implemented according to an embodiment of the present disclosure;
  • FIG. 3 is an illustration of a system according to an embodiment of the present disclosure;
  • FIG. 4 is an illustration of an exemplary method according to an embodiment of the present disclosure;
  • FIG. 5 is an illustration of a plurality of scenarios according to an embodiment of the present disclosure;
  • FIG. 6 is an illustration of an exemplary method according to an embodiment of the present disclosure;
  • FIG. 7 is an illustration of an exemplary method according to an embodiment of the present disclosure;
  • FIG. 8 is an illustration of an exemplary method according to an embodiment of the present disclosure;
  • FIG. 9 is an illustration of an exemplary method according to an embodiment of the present disclosure;
  • FIG. 10 is an illustration of an exemplary method according to an embodiment of the present disclosure;
  • FIG. 11 is an illustration of an exemplary method according to an embodiment of the present disclosure;
  • FIG. 12 is an illustration of an exemplary method according to an embodiment of the present disclosure;
  • FIG. 13 is an illustration of an exemplary method according to an embodiment of the present disclosure;
  • FIG. 14 is an illustration of an exemplary method according to an embodiment of the present disclosure;
  • FIG. 15 is an illustration of an exemplary method according to an embodiment of the present disclosure;
  • FIG. 16 is an illustration of an exemplary method according to an embodiment of the present disclosure;
  • FIG. 17 is an illustration of an exemplary method according to an embodiment of the present disclosure;
  • FIG. 18 is an illustration of an exemplary method according to an embodiment of the present disclosure;
  • FIG. 19 is an illustration of an exemplary method according to an embodiment of the present disclosure;
  • FIG. 20 is an illustration of an exemplary method according to an embodiment of the present disclosure;
  • FIG. 21 is an illustration of an exemplary method according to an embodiment of the present disclosure; and
  • FIG. 22 is an illustration of an exemplary method according to an embodiment of the present disclosure.
  • FIGS. 23-30 provide illustrations of exemplary methods according to various embodiments of the present disclosure.
  • DETAILED DESCRIPTION OF THE DRAWINGS
  • The present teaching will now be described with reference to an exemplary method and apparatus. It will be understood that the exemplary method and apparatus are provided to assist in an understanding of the present teaching and are not to be construed as limiting in any fashion. Furthermore, elements or components that are described with reference to any one Figure may be interchanged with those of other Figures or other equivalent elements without departing from the spirit of the present teaching.
  • Referring now to the Figures there is illustrated a method and apparatus for controlling access to a vehicle, wherein when at least one authorised person and a patient transport apparatus are detected by optical sensors the state of a locking mechanism of the vehicle can be switched to the ‘open’ state. In the exemplary embodiment, the vehicle is an ambulance. It will be understood that the description of the present disclosure can equally apply to any vehicle appropriately outfitted with the relevant components and the apparatus herein. Moreover, where in the exemplary embodiment ‘ambulance’ is used in place of ‘vehicle’, for consistency in the context the term ‘paramedic’ will replace ‘person’ and ought to be understood as referring to an authorised personnel. Unauthorised personnel are referred to as ‘further person(s)’. It will be understood that in embodiments involving any vehicle appropriately outfitted with the relevant components and the apparatus herein, the term ‘person’ may be used. What is more, whilst the term ‘patient transport apparatus’ is generally used herein, it will be understood that since the disclosure is not isolated to an ambulance the use of the word ‘patient’ is not always most appropriate. For example, a more general term may be ‘personnel transport apparatus’. This may cover, by way of example only, contexts in which the vehicle is a police vehicle, security vehicle, military vehicle, fire engine or the like, as well as a civilian vehicle.
  • FIG. 1 is a block diagram illustrating a configuration of an apparatus comprising a computing device 100 which includes various hardware and software components that, in conjunction with a plurality of external devices it is operationally associated with, function to perform processes according to the present disclosure. Referring to FIG. 1, the computing device 100 comprises a processor/processing means 105 in communication with memory 110. The processor 105 functions to execute software instructions that can be loaded and stored in the memory 110. The processor 105 may include a number of processors, a multi-processor core, or some other type of processor, depending on the particular implementation. The memory 110 may be accessible by the processor 105, thereby enabling the processor 105 to receive and execute instructions stored on the memory 110. The memory 110 may be, for example, a random access memory (RAM) or any other suitable volatile or non-volatile computer readable storage medium. In addition, the memory 110 may be fixed or removable and may contain one or more components or devices such as a hard drive, a flash memory, a rewritable optical disk, a rewritable magnetic tape, or some combination of the above.
  • One or more software modules may be encoded in the memory 110. The software modules may comprise one or more software programs or applications having computer program code or a set of instructions configured to be executed by the processor 105. Such computer program code or instructions for carrying out operations for aspects of the systems and methods disclosed herein may be written in any combination of one or more programming languages.
  • The software modules may include at least a first application 115 configured to be executed by the processor 105. The software modules may further include a second application 120 and a third application 125 configured to be executed by the processor 105. During execution of the software modules, the processor 105 configures the computing device 100 to perform various operations relating to the embodiments of the present disclosure, as will be described below. In the exemplary embodiment, the first, second and third applications 115, 120 and 125 may be an image analysing means 115, a detection means 120 and a recording means 125.
  • Image analysis will be understood to entail a multitude of techniques known to one skilled in the art, including but not limited to one or more of digital signal processing, Fourier analysis and 3D pose estimation. The detecting means 120 may comprise any software package for detecting specific features of an image; it will be understood that this may entail edge detection software; one or more machine learning techniques, such as deep learning algorithms implementing neural networks; pattern recognition; and feature detection. These techniques may be implemented once on any single frame or dynamically on a live feed comprising a plurality of frames.
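  • By way of non-limiting illustration only, a minimal sketch of such per-frame analysis is given below in Python using the OpenCV library; the present disclosure does not prescribe any particular language or library, and the function name, thresholds and choice of a stock HOG person detector are illustrative assumptions rather than part of the disclosed method.

      import cv2  # OpenCV is assumed here purely for illustration

      def analyse_frame(frame):
          """Illustrative per-frame analysis: an edge map plus a stock
          person detector; either step may be replaced by a trained
          neural network or other detection technique."""
          grey = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
          edges = cv2.Canny(grey, 100, 200)  # simple edge detection
          hog = cv2.HOGDescriptor()          # classical person detector
          hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())
          boxes, weights = hog.detectMultiScale(grey, winStride=(8, 8))
          return edges, boxes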
  • Other information and/or data relevant to the operation of the present apparatus, systems and methods, such as a database 130, may also be stored on the memory 110. The database 130 may contain and/or maintain various data items and elements that are utilized throughout the various operations of the apparatus and system described below. It should be noted that although the database 130 is depicted as being configured locally to the computing device 100, in certain implementations the database 130 and/or various other data elements stored therein may be located remotely. Such elements may be located on a remote device or server (not shown) and connected to the computing device 100 through a network in a manner known to those skilled in the art, in order to be loaded into a processor and executed.
  • Further, the program code of the software modules and one or more computer readable storage devices (such as the memory 110) form a computer program product that may be manufactured and/or distributed in accordance with the present disclosure, as is known to those of skill in the art.
  • A control means 135 may be operationally associated with the processor 105. The control means 135 may be further operationally associated with one or more further external devices 140, 145, 150. In the exemplary embodiment, the control means 135 may be operationally associated with a vehicle locking mechanism 140, a vehicle ramp/lift mechanism 145 and vehicle electromechanical actuators 150. The vehicle electromechanical actuators 150 will be understood by the skilled person to comprise, but are not limited to, the electromechanical sensor(s), actuator(s) and circuit(s) which may be found on-board a vehicle.
  • A communication means 155 is also operatively connected to the processor 105 and may be any interface that enables communication between the computing device 100 and other devices, machines and/or elements. The communication means 155 is configured for transmitting and/or receiving data from remote databases 160 and/or servers 165. For example, the communication means 155 may include but is not limited to a Bluetooth or cellular transceiver, a satellite communication transmitter/receiver, an optical port and/or any other such interface for wirelessly connecting the computing device 100 to the other devices.
  • A user interface 170 is also operatively connected to the processor 105. The user interface 170 may comprise one or more input device(s) such as switch(es), button(s), key(s), and a touchscreen.
  • The user interface 170 functions to facilitate the capture of commands from the user such as on-off commands or settings related to operation of the apparatus and system described below. The user interface 170 may function to issue remote instantaneous instructions or notifications on images received via a non-local image capture mechanism.
  • The processor 105 may be further operationally associated with one or more external devices. The one or more external devices may include but are not limited to an image capture device 175, an environment control means 180, an environment measurement means 185 and an external environment measurement means 190. Data may be transmitted to and/or from these external devices. Data from these external devices may be processed by the processor 105 implementing one or more of the software packages stored in memory 110. Data from the external devices may be stored in memory 110 via the recording means 125. Instructions such as predefined protocols stored in memory 110 may be sent to the external devices via the processor 105.
  • The computing device 100 may further be associated with one or more further computing devices 100 in an Internet of Things (IoT) network, using the communication means 155 on-board every computing device 100. The IoT network may comprise one or more remote server(s) and remote database(s), and software modules located locally to any number of computing devices 100 or located remotely on said remote server(s) and/or database(s). The software modules may comprise means to process data received from one or more computing devices 100 and may be configured to produce virtual models to optimise parameters associated with said one or more computing devices 100.
  • FIG. 2 illustrates an elevation view of a common scenario 200 in the work of an emergency medical response unit. A paramedic 205 attends to a patient 210 and loads the patient 210 on to a patient transport apparatus 215. In one embodiment, the patient transport apparatus 215 is a stretcher. In another embodiment, the patient transport apparatus 215 is a wheelchair. In a further embodiment the patient transport apparatus 215 is a dolly. At least one further person 220 who is not a member of the ambulance staff is typically also present at the scene. Once the patient 210 is deemed by the paramedic 205 to be secure on board the patient transport apparatus 215, the paramedic 205 will seek to load the patient 210 on to the ambulance 225 via the rear doors.
  • As discussed above, the work of a paramedic may entail arriving at a scene with multiple threats, such as during a riot or terrorist incident. For example, the further person(s) 220 may be rioter(s) or terrorist(s) who pose a threat to the paramedic 205, the patient 210 and the surrounding population proper. The further person(s) 220 may seek to hijack the vehicle, and may be successful if the paramedic 205 forgets to lock the ambulance doors when leaving the ambulance 225. To provide a layer of security against this, detection and biometric authorisation of the paramedic 205 leaving the ambulance 225 may actuate locking of the doors. When returning, detection and biometric authorisation of the paramedic 205 together with detection of the patient transport apparatus 215 may be implemented to unlock the ambulance doors according to the embodiments of the present disclosure. Being able to control access to an ambulance on the basis of biometric authorisation of ambulance personnel, and additionally detection of a patient transport apparatus, adds a layer of security which greatly diminishes the chance of a hijacking occurring.
  • The image capture device 175 may capture an image of an area in the vicinity of the ambulance 225. In one embodiment, the image comprises a single frame. In another embodiment, the image comprises a plurality of frames. In a further embodiment, the image comprising a plurality of frames may form a video. The image capture device 175 may be placed on top of the ambulance 225 or on one of its vertical sidewalls. The image capture device 175 may be a singular camera or a multitude of cameras positioned together as an array or strategically at various points on the body of the ambulance 225.
  • Once the image capture device 175 has captured an image of an area in the vicinity of the ambulance 225, the image analysing means 115 may be used to analyse the captured image. The detecting means 120 may then be used to determine if the captured image contains both of the paramedic 205 and the patient transport apparatus 215. If only one or the other, or neither, are detected then the process ends; these steps are best illustrated in FIG. 4 and its associated description.
  • The processing means 105 may be used to determine if the person—the paramedic 205—is authorised personnel. If the person is determined to be authorised personnel and the patient transport apparatus has additionally been detected to be with them then the locking mechanism 140 on the vehicle (the ambulance 225) may be selectively controlled to be opened by the control means 135. Steps for determining authorisation and access are best illustrated in FIG. 4 and its associated description. In the exemplary embodiment, data relating to authorised personnel including but not limited to biometric markers, as well as data relating to markers for detecting the patient transport apparatus 215, is stored on a database 130 in local memory 110. In an alternative embodiment, data relating to authorised personnel and patient transport apparatus detection is stored on a remote database and is accessed either by the communication means 155 (wirelessly) or by installation at a port before driving to the scene of an incident if it is an ambulance.
  • In the exemplary embodiment, the processing means 105 is configured to implement machine learning algorithms. It will be understood that this may entail any number of machine learning algorithms including supervised and unsupervised machine learning algorithms as appropriate, for detecting facial and/or other bodily features. For example, in one embodiment an unsupervised machine learning algorithm may entail the processing means 105 working in conjunction with the detecting means 120 to group biometric markers from the captured image. The processing means 105 would then seek to match this biometric marker grouping data with biometric data of authorised personnel stored in memory, to authorise a person(s) in the vicinity of the vehicle. In one embodiment, machine learning algorithms may be further used to constantly update and optimise biometric markers associated with personnel as their markers vary naturally over time, so as to avoid possible failed authorisation events. A redundant means (not shown) of overriding the system and/or providing driver credentials may be made available in case of environment setup issues. In one embodiment, a keypad may be placed on the exterior of the ambulance 225 which is operationally associated with the locks on the ambulance 225. Authorised personnel privy to the password may enter the password into the keypad to unlock the ambulance 225 doors. In a further embodiment, the keypad may additionally or alternatively comprise a fingerprint recognition means. In such an embodiment the processing means 105 may be configured to compare biometric data from memory 110 to that provided at the keypad during an access request incident, and process a true/false determination based on a biometric match/failure to authenticate respectively. The processing means 105 would then send the binary determination to the control means 135 which may unlock the ambulance 225 doors if appropriate. In an alternative embodiment, the keypad may comprise a means for processing a proximity card access request or a magnetic stripe card access request.
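  • By way of non-limiting illustration only, the matching of captured biometric marker groupings against stored data of authorised personnel, and the resulting true/false determination passed to the control means 135, might be sketched as follows in Python; the class, function and threshold names are illustrative assumptions and any suitable matching technique may be substituted.

      from dataclasses import dataclass
      from math import dist

      @dataclass
      class BiometricTemplate:
          person_id: str
          markers: tuple  # e.g. a facial-feature embedding of an authorised person

      MATCH_THRESHOLD = 0.6  # illustrative value; a deployment would tune this

      def authorise(captured_markers, stored_templates):
          """Return (True, person_id) when the captured biometric marker
          grouping matches a stored template of authorised personnel,
          otherwise (False, None); the boolean may be routed to the
          control means to selectively unlock the doors."""
          for template in stored_templates:
              if dist(captured_markers, template.markers) < MATCH_THRESHOLD:
                  return True, template.person_id
          return False, None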
  • Referring again to the image analysing means 115, in one embodiment the image analysing means 115 may further be configured to determine if one or both of the authorised personnel and the person on the patient transport apparatus 215 are correctly harnessed according to a predefined protocol. The image analysing means 115 may additionally be configured to determine if a person on the patient transport apparatus 215 is correctly positioned according to a predefined protocol. The predefined protocol may, for example, define that the paramedic 205 may be unharnessed to carry out treatment, but that the patient on the patient transport apparatus 215 must be harnessed correctly with the belt secured over their body and not underneath them. If the patient transport apparatus 215 is a dolly, the predefined protocol may include a requirement that the patient be positioned in an erect position and not slouching; this may further be associated with harnessing requirements, which may help achieve the required positioning. In the exemplary embodiment, predefined protocols may be stored in a database 130 on local memory 110. Alternatively, the predefined protocols may be stored on a remote database and accessed either by the communication means 155 for communicating with the server or by installation at a port before driving to the scene of an incident if it is an ambulance. Ultimately this would assist paramedics in ensuring patient safety and security. The communication means 155 may be wireless, which in one embodiment may be a transceiver.
  • In one embodiment the control means 135 is configured for controlling operations of a mechanical apparatus associated with the vehicle, such as a vehicle ramp or vehicle lift or the like. In the exemplary embodiment the detecting means 120 is capable of determining the type of patient transport apparatus 215, so as to facilitate the correct instructions being provided to the control means 135 as to whether a vehicle ramp or lift ought to be actuated into operation. The detecting means 120 may be configured to determine if the patient transport apparatus 215 is a stretcher, a wheelchair or a dolly to this end. Moreover, the detecting means 120 may be further configured to determine the type of the stretcher, the type of the wheelchair or the type of the dolly to this end, since not all patient transport apparatuses require electro-mechanically assisted access to the ambulance 225. Determining whether or not electro-mechanical assistance is required in this manner not only improves the speed and efficiency of the patient loading process, but also conserves vehicle battery energy. For example, energy would be wasted if the type of stretcher/wheelchair/dolly was not determined and the lift/ramp was lowered/extended unnecessarily and then needed to be returned again. In the exemplary embodiment, the detecting means 120 may further be configured to determine the distance of persons and objects from the vehicle, their duration of time in the vicinity of the vehicle, and the direction and speed of their motion with respect to the vehicle (i.e. moving towards or away from the vehicle and at a certain speed). In such a configuration, protocols for unlocking based on these determinations may be stored in local memory 110.
  • In the exemplary embodiment the apparatus may additionally comprise the environment measurement means 185 for measuring one or more vehicle environment parameters. The one or more vehicle environment parameters may include but are not limited to vehicle temperature, pressure and humidity. In the exemplary embodiment, the environment measurement means 185 may comprise one or more transducers.
  • In the exemplary embodiment the apparatus may additionally comprise the external environment measurement means 190 for measuring one or more external environment parameters. The one or more external environment parameters may include but are not limited to ambient temperature, pressure, humidity, wind speed and wind chill. The external environment measurement means 190 may comprise one or more transducers. The external environment measurement means 190 is illustrated as being positioned on the roof of the ambulance 225; however, it will be understood that any number of alternative positions are envisaged as being possible such as but not limited to the exterior sidewalls of the ambulance 225.
  • In the exemplary embodiment, the apparatus further comprises an environment control means 180 for controlling one or more of the vehicle environment parameters. In one embodiment the environment control means 180 comprises a heating, ventilation, and air conditioning (HVAC) system. Environment control protocols defining instructions for controlling the vehicle environment parameters may be stored on local memory or communicated wirelessly to the vehicle. In one embodiment, the environment control protocols may be based on various scenarios of external environment parameters. For example, there may be an environment control protocol tailored for when the external environment is 50 degrees Celsius with 80% humidity, and another for when the external environment is −40 degrees Celsius with 80% humidity. Controlling the vehicle environment parameters in this way may be useful if the vehicle environment is at least partially lost to the external environment due to a prolonged period in which the vehicle doors are open during loading. In such an embodiment, the processing means 105 may access the measured external environment parameters from memory, and match them with the closest defined parameter set in memory (e.g. a measured external environment parameter set defined by 47 degrees Celsius and 84% humidity would be matched to a predefined set defined by 50 degrees Celsius with 80% humidity). The processing means 105 may then send from memory the environment control protocol associated with that predefined parameter set to the environment control means 180, which would then regulate the vehicle environment parameters according to that protocol. In another embodiment, the environment protocols based on the measured external environment parameters may further comprise protocols in which the vehicle environment is gradually returned to predefined optimal conditions with respect to the external environment parameters. This may be useful, for example, when treating a patient with severe heatstroke: it is commonly advised not to move a person suffering from severe heatstroke from a very hot environment directly into a very cool one, as this can have an adverse effect on their health.
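  • By way of non-limiting illustration only, the matching of measured external environment parameters to the closest predefined parameter set and its associated environment control protocol might be sketched as follows in Python; the protocol names, the parameter sets and the simple distance measure are illustrative assumptions mirroring the 50 degrees Celsius / 80% humidity example above.

      # Predefined external parameter sets mapped to environment control protocols.
      # Values and protocol names are illustrative only.
      PROTOCOLS = {
          (50.0, 80.0): "hot_humid_protocol",
          (-40.0, 80.0): "cold_humid_protocol",
      }

      def closest_protocol(measured_temp_c, measured_humidity_pct):
          """Match measured external parameters to the nearest predefined
          parameter set and return its environment control protocol."""
          def distance(param_set):
              temp_c, humidity_pct = param_set
              return (abs(temp_c - measured_temp_c)
                      + abs(humidity_pct - measured_humidity_pct))
          return PROTOCOLS[min(PROTOCOLS, key=distance)]

      # e.g. closest_protocol(47.0, 84.0) returns "hot_humid_protocol"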
  • In a further embodiment, the control means 135 may be configured to conserve vehicle battery power by dynamically adjusting the power output of the environment control means 180, in dependence on one or a combination of one or more vehicle environment parameters, one or more external environment parameters, the duration the vehicle doors have been open and the duration of the presence of authorised personnel in the vicinity of the vehicle.
  • In the exemplary embodiment the apparatus may additionally comprise a recording means 125 for writing protocol adherence history to memory. In the exemplary embodiment, the communication means may be used to transmit data relating to protocol adherence by authorised personnel from local memory to a remote database. A user profile of each of the authorised personnel containing their protocol adherence history may be formed on a remote database. Tracking protocol adherence of any member of the ambulance crew may be useful for investigative purposes ex post facto, or for staff training purposes among other uses. In an alternative embodiment, an event of protocol violation may be transmitted in real-time to a remote server via the communication means 155. Real-time updates such as this may be implemented for example in conjunction with a control centre which can communicate with ambulance staff and/or control vehicle parameters such as but not limited to maximum speed and acceleration. Step 340 of FIG. 3 best illustrates the aspect of the embodiment including the control centre.
  • The recording means 125 may further be configured to synchronously write to memory patient status data (such as heart rate, blood pressure, etc.), vehicle environment data such as temperature and vehicle parameter data such as speed and acceleration. Synchronously recording such data may be useful for investigative purposes ex post facto, or for staff training purposes among other uses. The synchronously recorded data may further include time stamps. The harnessing and positioning protocol adherence data may further be associated with the synchronously recorded data via time stamps.
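  • By way of non-limiting illustration only, the kind of synchronously time-stamped record the recording means 125 might write is sketched below in Python; the field names and units are illustrative assumptions and any suitable storage format may be used.

      import time
      from dataclasses import dataclass, asdict

      @dataclass
      class SynchronousRecord:
          timestamp: float            # shared time stamp linking all channels
          heart_rate_bpm: float       # patient status data
          blood_pressure: str
          cabin_temperature_c: float  # vehicle environment data
          speed_kmh: float            # vehicle parameter data
          acceleration_ms2: float

      def record_sample(log, **channels):
          """Append one synchronously time-stamped sample to an in-memory log."""
          log.append(asdict(SynchronousRecord(timestamp=time.time(), **channels)))
          return log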
  • In one embodiment, the detecting means 120 may be further configured for detecting fluid spillage. The fluid may include but is not limited to blood or vomit. In such an embodiment, the recording means may further be configured to write fluid spillage incidents to memory. Fluid spillage incidents may be further associated with the synchronously recorded data via time stamps. In the exemplary embodiment, data regarding markers for identifying different types of fluids may be stored on local memory 110. Alternatively the data regarding markers for identifying different types of fluids may be stored on a remote database and accessed either by the communication means 155 for communicating with the server or by installation at a port before driving to the scene of an incident if it is an ambulance. The communication means 155 may be a wireless communication means 155.
  • In the exemplary embodiment, the processing means 105 may further be configured to compile a report containing one or more of the synchronously recorded data, harnessing protocol adherence, positioning protocol adherence and fluid spillage, to be transmitted wirelessly to a remote server or remote database. Alternatively, the processing means 105 may compile the report and write it to local memory for retroactive review.
  • In one embodiment, the control means 135 may be operationally associated with the vehicle electromechanical actuators 150 on-board the vehicle to perform one or more of the tasks outlined above including but not limited to vehicle environment parameter control, selective locking of vehicle doors and ramp or lift actuation.
  • FIG. 2 further illustrates a plan view of the interior volume 235 of the rear of the ambulance 225 according to an embodiment of the present disclosure, which contains a plurality of image capture devices 175 mounted on its sidewalls. The image capture device 175 may alternatively be mounted on the ceiling of the interior of the ambulance 225. Having the image capture device 175 installed in the interior of the ambulance 225 may serve the purpose of capturing images for the detection of fluid spillage and for determining if harnessing and patient positioning protocols are abided by at all times. In terms of determining harnessing and positioning protocols, the use of image analysis to determine adherence removes the need for an integrally installed seat with one or both of harnesses integrally connected to the vehicle electromechanical actuators 150 and pressure sensors under the seats/patient transport apparatus 215 integrally connected to the vehicle electromechanical actuators 150. In this respect, a “dumb” patient transport apparatus 215 (that is, one not being integrally connected or connectable to the vehicle) can be used, providing greater flexibility of choice whilst still being able to provide protocol adherence monitoring at all times. In the exemplary embodiment, machine learning algorithms may be implemented to detect if authorised person(s) and the patient are harnessed correctly and if the patient is positioned correctly. Harnesses 240 may be of any colour and consist of any pattern. Machine learning algorithms may include deep learning algorithms implementing neural networks. In a further embodiment, one or more of edge detection, object recognition, pattern matching, gauging or metrology (object size and distance measurement) and image matching may be implemented together with, or as an alternative to, machine learning algorithms. These techniques may be implemented once on any single frame or dynamically on a live feed comprising a plurality of frames.
  • FIG. 3 is an illustration of a system 300 according to an embodiment of the present disclosure. The system 300 comprises the ambulance 225 fitted with a wireless communication means 155 capable of communicating with the remote server 165, itself capable of transmitting and receiving information to and from remote database(s) 160. The remote database(s) 160 may be further associated with a remotely located control centre 305 capable of receiving, processing and transmitting information back downstream to the ambulance 225 via the remote server 165. The wireless communication means 155 may transmit information such as geographical position, based on the communication of the ambulance 225 with a geographical positioning satellite system 310. The wireless communication means 155 may also transmit vehicle parameter data from various on-board facilities 315 such as but not limited to a gyroscope, a speedometer and an emergency lights status clock. Furthermore, the wireless communication means 155 may transmit the data outlined above including but not limited to synchronously recorded data, harnessing protocol adherence, positioning protocol adherence and fluid spillage. The report containing this data, compiled by the processing means 105, may be transmitted to the remote server 165 and stored on the remote database(s) 160. The report may be compiled in real-time or ex post facto. If the report is compiled in real-time, the report may further be transmitted in real-time to the control centre 305 for real-time review. In one embodiment, when a fluid spillage event is detected the vehicle 225 is automatically removed from a real-time dossier of fleet vehicles available for use, and the control centre 305 is alerted that the vehicle 225 is contaminated. Only once the vehicle 225 has been decontaminated according to territorially defined medical hygiene standards and this fact recorded with the control centre 305, will the vehicle 225 reappear in the real-time dossier and be made available for use. The control centre 305 may be able to selectively change and/or reduce the personnel who may be authorised to access a quarantined vehicle. More generally, the control centre 305 may determine who is authorised to access the vehicle 225 at any one time. FIGS. 20-22 provide further exemplary methods of safety control in a vehicle, consistent with embodiments of the present disclosure.
  • As discussed above, having the image capture device 175 installed in the interior of the ambulance 225 may serve the purpose of capturing images for the detection of fluid spillage. However, it will be understood that contamination and sanitary protocol violation events are not constrained to the interior volume of a vehicle. Indeed, any of blood spillage, the presence of vomit or other potentially contagious body fluids, improper waste disposal, or the presence of needles or other exposed sharp objects may be present external to the interior volume of the vehicle. Accordingly, it is envisaged in the above that the detecting means may be configured for detection of sanitary protocol violation events in the interior volume of the vehicle but also feasibly in the area surrounding the vehicle. What is more, sanitary protocol violation events may also comprise one or more of the absence of personal protective equipment (gloves, sanitary garments such as but not limited to gowns, respirators, safety masks, protective eyewear and the like) and unsafe injection practices by (e.g.) authorised vehicle personnel or by others.
Sanitary protocol violation events may comprise unsafe exposure or presence of hazardous objects such as but not limited to: a syringe, a knife, or a fragment of a resilient material, where the resilient material may comprise (e.g.) one or more of a glass, a metal or a wood. Accordingly, the detecting means may be configured to detect any of the above sanitary protocol violation events, and the processing means may record them in a log in memory. Moreover, it will be understood that the detection of fluid spillage in and around the vehicle is not necessarily contingent on authorised personnel being present in or around the vehicle. However, as discussed previously, if a log of predefined safety protocol violations is maintained, in some embodiments it may be advantageous to detect an authorised person(s) and associate protocol violation events with their unique ID. Accordingly, this may form part of a report compiled by the processing means. In alternative embodiments, ‘sanitary’ may be generalised to ‘safety’, i.e. safety protocols, since the present invention is not constrained to use in medical applications. In such embodiments, safety protocol violations may comprise the absence of certain law-enforcement/security equipment and the like.
  • A series of user notification modules 320 for providing notifications to a user of the vehicle may be actuated by a number of processes via the control means 135. The user notification modules may include a visual notification means such as an LCD display capable of producing visual messages, or an audio system capable of producing audio messages. In one embodiment, the control centre 305 may be monitoring vehicle parameter data in real-time and may communicate notifications to be relayed to personnel on-board the ambulance 225 via one or more of the user notification modules 320. In another embodiment, when harnessing and/or patient positioning protocols are detected as being breached one or more of the user notification modules 320 may provide a notification to personnel on-board the ambulance 225. Protocol breach notifications may be stored in local memory to be accessed by the processing means 105 and routed to the control means 135 which may actuate one or more of the user notification modules 320.
  • In various embodiments of the present disclosure, the image analysing means 115 may be configured to count a number of persons in the vicinity of the vehicle at any given time. Counting a number of persons present in the interior or the exterior of the vehicle may furthermore comprise detecting the category of the persons and counting each, for example counting a number of authorised personnel, a number of patients, and so on. Authorised personnel may be detected by unique ID badges or codes disposed on their outerwear, via facial recognition and/or detection of a uniform based on certain characteristics such as colours or markers. Using machine learning algorithms or otherwise, the image analysing means 115 may also be configured to detect when a person has entered or exited the interior of the vehicle and associate a time stamp with said event. For example, this may be used to determine whether a vehicle has been used and so if there is a need to flag the vehicle for cleaning. Detecting the number of persons present within the vehicle may be used to control the interior lighting of the vehicle as well as any interior environment control systems, to produce optimal and desirable environmental conditions within the vehicle.
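  • By way of non-limiting illustration only, counting persons by category in a frame and logging time-stamped entry/exit events might be sketched as follows in Python; the category labels and the dictionary structure of the detections are illustrative assumptions.

      import time
      from collections import Counter

      def count_by_category(detections):
          """detections: e.g. [{'category': 'authorised'}, {'category': 'patient'}].
          Returns a per-category head count for one frame."""
          return Counter(d["category"] for d in detections)

      def log_occupancy_change(previous_count, current_count, event_log):
          """Record a time-stamped event whenever the total head count changes,
          e.g. when a person enters or exits the interior of the vehicle."""
          if sum(current_count.values()) != sum(previous_count.values()):
              event_log.append({"time": time.time(), "count": dict(current_count)})
          return event_log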
  • FIG. 4 is an illustration of an exemplary method according to an embodiment of the present disclosure. Initially, an image of an area in the vicinity of the vehicle is captured, step 410. The image is then analysed, step 420. The image is analysed to detect if the captured image contains both a person and a patient transport apparatus, step 430. The image is further analysed to determine if the person in the captured image is authorised to access the vehicle, step 440. The locking mechanism 140 of the vehicle is selectively controlled in response to the captured image containing an authorised person and the patient transport apparatus, step 450. The first step 410 may be carried out at all times, or only when the vehicle engine is running. Alluded to in the description of FIG. 2 was the third step 430, wherein the detecting means 120 determines if the image captured by the image capture device 175 contains both a person and a patient transport apparatus 215. If only one or the other, or neither, are detected then the process ends and returns to 410. If both are detected then the process proceeds to 440 where the processing means 105 is used to determine, based on data from a biometric database of all ambulance staff, if the person is authorised personnel, i.e. the paramedic 205. If the person is authorised then the process proceeds to 450 wherein the control means 135 selectively unlocks one or more doors of the ambulance 225. Both the cab doors and rear doors, or one or the other, may be opened; this may be predefined with a set of instructions stored in memory. The step 430 may further comprise determining that the person and patient transport apparatus are detected to be one or a combination of: within a predefined required distance from the vehicle; within the vicinity of the vehicle for a predefined required duration of time; and moving towards the vehicle. In one embodiment, the predefined required distance may be 10 metres. In another embodiment, the predefined required duration of time within the vicinity of the vehicle may be 10 seconds. In a further embodiment, the step 430 may combine the duration of time the person and patient transport apparatus are within the vicinity of the vehicle with the detection of them moving towards the vicinity of the vehicle, so as to reduce the occurrence of false positives.
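  • By way of non-limiting illustration only, the decision logic of steps 430-450, combining detection and authorisation with the example 10 metre distance, 10 second duration and approach-direction conditions, might be sketched as follows in Python; the threshold values and function name are illustrative assumptions.

      # Illustrative thresholds taken from the examples above.
      MAX_DISTANCE_M = 10.0
      MIN_DURATION_S = 10.0

      def should_unlock(person_authorised, apparatus_detected,
                        distance_m, duration_s, approaching):
          """Return True when an authorised person and a patient transport
          apparatus are detected within the predefined distance, for at
          least the predefined duration, and moving towards the vehicle."""
          if not (person_authorised and apparatus_detected):
              return False
          return (distance_m <= MAX_DISTANCE_M
                  and duration_s >= MIN_DURATION_S
                  and approaching)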
  • FIG. 5 is an illustration of possible scenarios the image capture device 175 may capture. Referring to FIG. 5a, there is illustrated an embodiment in which both an authorised person 205 and a patient transport apparatus 215 are within the area captured by the image capture device 175. According to the method illustrated in FIG. 4, detection of a person and a patient transport apparatus 215, followed by authorisation of said person, is followed by selectively controlling the locking mechanism 140 of the vehicle in response to the captured image containing an authorised person and the patient transport apparatus, step 450. In further embodiments illustrated in subsequent Figures this may be followed by additional method steps.
  • Referring to FIG. 5b, there is illustrated an embodiment in which an authorised person 205, a patient transport apparatus 215 and a further person 220 are within the area captured by the image capture device 175. In the exemplary embodiment, if the further person 220 is detected as being in the captured image then access to the vehicle may be denied. In an alternative embodiment, the detecting means 120 may be configured to detect if the motion of the further person 220 matches a predefined “violent behaviour profile”; such a profile may be stored on local memory 110. The predefined “violent behaviour profile” may comprise one or more of select facial expressions, arm movements or other bodily movements which are predefined as being typical of a person who presents a threat of violence. If the further person 220 is determined by the detecting means 120 as matching such a profile then access may be denied. The purpose of the “violent behaviour profile” might be to avoid instances wherein the paramedic 205 and patient transport apparatus 215 are denied access to the ambulance 225 because of the presence of a further person 220 who poses no threat. In a further embodiment, alternative access authorisation methods may be implemented for such an instance as in FIG. 5b. For example, the control centre 305 may be able to commandeer the image capture device 175 in real-time, assess the situation apparent from the image captured and make an authorisation decision on that basis.
  • Referring to FIG. 5c, there is illustrated an embodiment in which an authorised person 205 is within the area captured by the image capture device 175, but not a patient transport apparatus 215. In the absence of a patient transport apparatus 215, the authorised person 205 may be denied access to the vehicle, according to the method of FIG. 4. In one embodiment, alternative access authorisation methods may be implemented for such an instance as in FIG. 5c. For example, the authorised person 205 may be able to contact the control centre 305; the control centre 305 may then be able to commandeer the image capture device 175 in real-time, assess the situation apparent from the image captured and make an authorisation decision on that basis.
  • Referring to FIG. 5d, there is illustrated an embodiment in which a patient transport apparatus 215 is within the area captured by the image capture device 175, but not a person. In the absence of a person, subsequent steps beyond detecting if the captured image contains both a person and a patient transport apparatus 215 may be aborted. In one embodiment, the control centre 305 may be able to commandeer the image capture device 175 in real-time, assess the situation apparent from the image captured and make decisions on that basis including but not limited to contacting the personal radio of the authorised person 205.
  • FIG. 6 is identical to the method of FIG. 4, but with the addition of steps 660-680 relating to facilitating access to the vehicle on the basis of the type of patient transport apparatus in use. Once it has been determined that there is both an authorised person 205 and a patient transport apparatus 215 in the captured image, steps 630-640, and the vehicle doors have been selectively unlocked, step 650, the detecting means 120 determines the type of patient transport apparatus 215, step 660. The sub-category of the patient transport apparatus is then determined, step 670. The type of patient transport apparatus 215 may be a stretcher, a wheelchair or a dolly; the sub-category may be the type of the stretcher, the type of the wheelchair or the type of the dolly. A ramp is then extended or a lift lowered from the vehicle in response to the sub-category of patient transport apparatus 215 if it is determined to be required, step 680. The ‘if required’ clause is important in light of the fact that not all patient transport apparatuses require electro-mechanical assistance to access the ambulance 225. As outlined above in relation to FIG. 2, determining if any one of the ramp or lift is required can save time and energy in the instances where they are not required. Data relating to markers for identifying the various patient transport apparatuses, as well as data relating to ramp/lift protocols based on the type of patient transport apparatus, may be stored on local memory. In various exemplary embodiments, one or more of the steps of FIG. 6 may be omitted. For example, in one embodiment the method may comprise steps 610-660 only; in another embodiment, steps 610-660 followed by 680 only; in another embodiment, steps 610-670 only; and in another embodiment, steps 610-660 followed by providing a notification to a user of the vehicle indicating which means of access to the vehicle is substantially suitable based on the type of patient transport apparatus determined in step 660. In various embodiments, such a notification may be audio and/or visual. In further embodiments, a predefined list of authorised transport apparatus types may be stored in memory (locally or remotely), and as such in various embodiments where the type and possibly sub-type of transport apparatus is determined, the processing means may be further configured to determine if the type of patient transport apparatus is an authorised type of patient transport apparatus. Moreover, as discussed above the term ‘patient transport apparatus’ may be replaced by ‘personnel transport apparatus’, since the present invention is not limited to ambulances or other medical care vehicles. Furthermore, in some embodiments the detection and authorisation of personnel may be omitted; such embodiments are best presented in FIGS. 11-14.
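  • By way of non-limiting illustration only, the mapping from the determined type and sub-category of patient transport apparatus to a ramp, a lift or manual access might be sketched as follows in Python; the mapping entries are illustrative assumptions and a real system would load them from the ramp/lift protocol data held in memory.

      # Illustrative mapping of (type, sub-category) to the suitable access means;
      # None indicates that no electro-mechanical assistance is required.
      ACCESS_MEANS = {
          ("stretcher", "self-loading"): None,
          ("stretcher", "manual"): "ramp",
          ("wheelchair", "powered"): "lift",
          ("wheelchair", "manual"): "ramp",
          ("dolly", "standard"): "ramp",
      }

      def select_access_means(apparatus_type, sub_category):
          """Return 'ramp', 'lift' or None (manual access) for the detected apparatus."""
          return ACCESS_MEANS.get((apparatus_type, sub_category))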
  • FIG. 7 is an illustration of an exemplary method according to an embodiment of the present disclosure, containing the steps of the embodiment of FIG. 4 followed by the step of selectively controlling at least one parameter of the vehicle in response to detecting both the authorised personnel and patient transport apparatus approaching the vicinity of the vehicle, step 720. Subsequent to step 720 are steps 660-680 of FIG. 6. In one embodiment, the vehicle parameter may be a vehicle environment parameter such as temperature, pressure or humidity. For example, in an extreme weather environment where losing the vehicle environment to the external environment may happen in a short time, it may be useful to begin combating loss of the vehicle environment immediately following actuation of the doors to unlock. The environment control means 180 would immediately begin enacting a predefined environment control protocol accessed from memory 110 by the processing means 105. In an alternative embodiment, the vehicle parameter may include but is not limited to vehicle ignition, emergency scene lights and electrically powered medical equipment charging facilities.
  • FIG. 8 is an illustration of a method 800 involving vehicle environment control but in dependence upon measured external environment parameters. The method 800 contains the steps of the embodiment of FIG. 4. Once a person and patient transport apparatus 215 have been detected, the person has been authorised and the locking mechanism 140 of the vehicle selectively controlled, at least one external environment parameter is measured, step 820. Subsequently, at least one vehicle environment parameter is selectively controlled in response to detecting both the authorised personnel and patient transport apparatus approaching the vicinity of the vehicle, step 830. As described in relation to FIG. 2, the processing means 105 may match the at least one measured external environment parameter to a predefined environment control protocol associated with the closest defined parameter set in memory. The processing means 105 may then communicate this protocol to the environment control means 180, which would subsequently regulate the vehicle environment parameters according to that protocol. Subsequent to step 830 are steps 660-680 of FIG. 6.
  • FIG. 9 is an illustration of a method 900 according to an embodiment of the present disclosure, identical to the method 600 of FIG. 6, but with the additional step of detecting if patient harnessing and/or patient positioning protocols have been adhered to according to a predefined protocol, step 920.
  • FIG. 10 is an illustration of a method 1000 according to an embodiment of the present disclosure, identical to the method 900 of FIG. 9, but with the additional steps of maintaining a log in memory each time the predefined protocol(s) for harnessing and/or positioning is/are breached, step 1020, and transmitting at least part of the contents of the memory to a remote database(s), step 1030. The at least part of the contents of the memory may be transmitted to the remote database(s) wirelessly via the wireless communication means 155 or via wired means for example once the ambulance 225 has returned to its station after a response incident.
  • Referring to FIG. 11, there is provided a method 1100 of controlling access to a vehicle based on detecting 1120 a patient transport apparatus and further determining 1130 a type of the patient transport apparatus. Subsequently, the method 1100 comprises selectively controlling 1140 a locking mechanism of the vehicle in response to the captured image containing the personnel transport apparatus. FIG. 12 illustrates a further embodiment where, in addition to the steps 1110-1140 of FIG. 11, there is provided the additional step of determining 1220 which means of access to the vehicle is substantially suitable for accessing the vehicle based on the type of the personnel transport apparatus in the captured image.
  • In some embodiments, the vehicle may be equipped with a notification means. The notification means may be configured to provide a notification, such as an audio and/or visual notification, from the vehicle detailing which means of access to the vehicle is substantially suitable for accessing the vehicle based on the type of the patient transport apparatus in the captured image. An example of an embodiment wherein the provision of a notification is included is given in FIG. 13. If neither the lift nor the ramp is suitable based on the type of the patient transport apparatus detected, the notification means may provide a notification in such an instance too. Other means of access to the vehicle may include manual access via the door(s) of the vehicle. The notification means may comprise an audible or visual alarm. For example, the notification means may comprise the sirens and/or one or more lights mounted on the vehicle such as the siren lights of an emergency vehicle. In addition or alternatively, a notification may be provided to a remote device of one or more personnel indicating which means of access to the vehicle is suitable. Said remote device may be communicatively coupled to the notification means or another component of the vehicle such as but not limited to the control means or the processing means.
  • In any of the embodiments of the present disclosure, if a person is not detected as being present on the personnel transport apparatus, the vehicle system may default to manual access (rather than the automatic extension of an access means such as the vehicle lift or ramp) and manual authentication. The processes may terminate or continue to authenticate any personnel in the vicinity of the vehicle, and in such embodiments where the process continues to authentication the process may selectively control 1350 the locking mechanism in response to the captured image containing the personnel transport apparatus and an authenticated person.
  • Referring to FIG. 14, there is provided a method 1400 similar to that of FIG. 12, but including the further step 1460 of extending a ramp or lowering a lift from the vehicle to facilitate access of the patient transport apparatus to the vehicle in response to the type of patient transport apparatus. As discussed, it may alternatively be determined that neither the lift nor the ramp is appropriate in view of the determined type of patient transport apparatus; in such cases, a notification may be provided as in FIG. 12 (however it will be understood that the provision of a notification may also apply to cases where the ramp or lift is indeed determined to be appropriate). In some embodiments, if neither the ramp nor the lift is deemed appropriate, the vehicle system may default to manual access. Alternatively, other means of access to the vehicle may be actuated to open, where available.
  • As discussed in the foregoing, it may be desirable in some embodiments to determine whether the type of patient transport apparatus in the vicinity of the vehicle is an authorised type of patient transport apparatus. For example, a certain type of wheelchair or dolly may not meet the health and safety requirements associated with the vehicle in question. Authorised personnel may have the option, possibly subject to the provision of a security pass, to override the requirement that the type of the patient transport apparatus is an authorised patient transport apparatus. In some embodiments, such an override event may be included in the report compiled by the processing means 105. As discussed above, data relating to authorised types of patient transport apparatuses may be stored on local memory or remotely. In some embodiments, what constitutes an authorised type of patient transport apparatus may dynamically change based on input from a remote user or authorised personnel local to the vehicle, based on information input such as a type of condition the patient is being treated for. In some embodiments, if it is determined that the type of the patient transport apparatus is not an authorised type of patient transport apparatus, certain steps may be omitted. For example, it may save power and memory not to determine which means of access to the vehicle is substantially suitable for accessing the vehicle based on the type of the patient transport apparatus in the captured image if the patient transport apparatus is not an authorised type. In some embodiments the process 1400 may be resumed or restarted subject to an override event such as that discussed in the foregoing.
  • FIGS. 15-19 illustrate various embodiments of processes for determining patient safety. In addition to determining if a patient is harnessed correctly and/or positioned correctly on the patient transport apparatus, in some embodiments the image analysing means 115 may further be configured to determine if the patient transport apparatus is secured in place in the vehicle and/or secured on a vehicle access means (for example a vehicle lift or vehicle ramp).
  • With regard to FIG. 18, it will be understood that the ‘and/or’ clause for the three options which can be nominally labelled ‘A’, ‘B’ and ‘C’ may include all permutations of these labels. Namely, ‘A and/or B and/or C’ may have within its meaning here:
      • ‘A and B and C’={ABC};
      • ‘A and B or C’={AB, C, AC};
      • ‘A or B or C’={A, B, C};
      • ‘A or B and C’={A, BC, AC}.
  • For completeness, A, B and C here refer to steps 1830, 1840 and 1850 respectively.
  • Referring to FIG. 18, there are several advantages of determining 1820 the type of patient transport apparatus (660) and optionally additionally determining the sub-category of the patient transport apparatus (670). In one regard, determining the type of patient transport apparatus and possibly additionally the sub-type of the patient transport apparatus will facilitate increased accuracy in determining whether the patient is correctly harnessed and/or positioned in the patient transport apparatus, since harnessing and positioning protocols may vary between various patient transport apparatuses. Moreover, as discussed above, determining the type and possibly sub-type of patient transport apparatus allows one to determine whether a vehicle ramp or lift ought to be actuated into operation, and to extend a ramp or lower a lift in response to the type of patient transport apparatus detected. Accordingly, improved patient safety, improved patient safety protocol adherence, reduced power consumption and improved time efficiency are advantages of the processes of FIG. 18.
  • FIG. 19 provides an exemplary method similar to that of FIG. 15, but including the additional step of providing 1920 a notification to a user of the vehicle that the patient is not correctly positioned and/or harnessed. It will be understood that the steps of FIG. 19 may be further combined in a number of variations with the steps of FIGS. 16-18. What is more, a notification may further be provided in response to one or more of:
      • i) determining if the patient transport apparatus is correctly secured on the vehicle lift according to a predefined protocol;
      • ii) determining if the patient transport apparatus is correctly secured on the vehicle ramp according to a predefined protocol; or
      • iii) determining if the patient transport apparatus is correctly secured in the interior of the vehicle according to a predefined protocol.
  • Accordingly, the above determinations i)-iii) may comprise steps performed appropriately in combination with any of the processes of FIGS. 16-19.
  • It will be understood that while a number of exemplary embodiments of a method according to the present disclosure have been disclosed in FIGS. 4 and 6-10, other combinations of the method steps of FIGS. 4 and 6-10 are possible and are envisaged to be so by the inventors. For example, one might combine the steps 910-920 of FIG. 9 with step 720 of FIG. 7 or steps 820-830 of FIG. 8.
  • The image capture device 175 may further comprise thermal imaging capabilities, and/or a separate thermal imaging device (not illustrated) may be fitted to the vehicle. The image capture device 175 and the thermal imaging device may be fitted to the interior of the vehicle and/or the exterior of the vehicle as desired. FIGS. 23-26 illustrate a number of example scenarios relating to thermal imaging data capture.
  • Referring to FIG. 23, an image is captured in the vehicle or in the vicinity of the vehicle, 2310. The captured image is then analysed, 2320. A person is then detected in the captured image, 2330. A characteristic associated with the person is then determined at a first time, 2340. The determined characteristic is then associated with a predefined threshold value, 2350. In response to the characteristic surpassing the predefined threshold value, a change of state in the vehicle is actuated, 2360. In various embodiments, the subject may comprise a person, for example authorised or unauthorised personnel, or a patient. Temperature may be indicative of stress levels of persons in the vicinity of the vehicle, a critical event in a patient, or whether authorised personnel are fit to work in the vehicle. For example, if a threshold temperature of a patient is surpassed, this may indicate that a patient's condition has worsened. In various embodiments, actuating a change of state of the vehicle comprises one or more of: providing a notification to one or more areas of the vehicle; providing a notification to a user device; or creating a log in memory of an instance of the characteristic surpassing the predefined threshold value. It will be understood that the change of state may be actuated by a processing means in cooperation with other necessary components. For example the processing means may send instructions to the notification means to provide the notification in the vicinity of the vehicle, or the processing means may send instructions to a wireless transmission means to provide a notification to the user device.
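  • By way of non-limiting illustration only, the comparison of a determined characteristic against a predefined threshold value and the resulting actuation of a change of state (steps 2340-2360) might be sketched as follows in Python, using temperature as the characteristic; the threshold value, function name and notification/logging callables are illustrative assumptions.

      import time

      THRESHOLD_TEMPERATURE_C = 37.8  # illustrative threshold value only

      def evaluate_characteristic(temperature_c, event_log, notify):
          """Compare a person's determined temperature with the predefined
          threshold and, if surpassed, actuate a change of state: provide a
          notification and create a log entry of the instance."""
          if temperature_c > THRESHOLD_TEMPERATURE_C:
              notify(f"Temperature threshold exceeded: {temperature_c:.1f} degrees Celsius")
              event_log.append({"time": time.time(), "temperature_c": temperature_c})
              return True
          return False

      # Usage sketch: evaluate_characteristic(38.4, [], print)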
  • As illustrated in FIG. 24, a notification may be provided to a user of the vehicle in response to the temperature of the subject being greater than a predefined threshold temperature. The notification may be provided in an audio and/or visual format. The notification may be provided to a user device such as a smart phone, or a tablet, or a system of the vehicle.
  • Captured thermal data such as temperature profiles in images may be fed into a data processing application located locally or remotely, such as in the cloud. In the exemplary embodiment the data processing application comprises a machine learning application. In some embodiments, in response to a predefined threshold value being surpassed the application may provide instructions to the vehicle environment control means to alter the temperature on board the vehicle to a particular value. In some embodiments the temperature on board the vehicle may be altered dynamically in response to thermal data captured in real time.
  • Referring to FIG. 25, a first temperature of the subject in the captured image may be determined at a first time and a second temperature of the subject may be determined at a second time, 2510. A rate of change of the temperature of the subject may then be determined based on the first temperature at the first time and the second temperature at the second time, 2520. Generally, the temperature of a subject in a captured image may be monitored in real time or at predefined intervals to monitor a rate of change of the temperature of the subject in the captured image. Similar to the above, a rate of change of temperature may be indicative of stress levels of persons in the vicinity of the vehicle, a critical event in a patient, or whether authorised personnel are fit to work in the vehicle. For example, if a threshold rate of change of a patient's temperature (for example X degrees Celsius per unit time) is surpassed, this may indicate that the patient's condition has worsened. What is more, if such a threshold rate of change of temperature is surpassed, this may indicate that the patient is suffering from a particular condition/disease. For example, a temperature above the normal temperature of a human (approximately 37.5 degrees Celsius) may indicate a fever. A threshold temperature may be, for example, 37.8 degrees Celsius. If a person is suffering from a particular condition this may be used in determining a course of action such as quarantine or isolation of said person. In some embodiments it may be desirable to determine a rate of change of the rate of change of temperature (i.e. the second order derivative). Referring to FIG. 26, an event such as a threshold value of the rate being surpassed may also be associated with a course of action to be taken, 2650. For example, the course of action may comprise one or more of: adjusting an ambient temperature of the interior of the vehicle, provision of medication to the person, or a change in vehicle speed. The notification means may provide a notification to a user of the vehicle or remotely located personnel containing information relating to the course of action, 2660. Such an event may also be recorded in local or remote memory. It will be understood that step 2650 may equally be performed in the method of FIGS. 23-25 or FIG. 27 ahead.
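  • By way of non-limiting illustration only, estimating the first and second order rates of change of temperature from time-stamped samples (steps 2510-2520) and testing them against a threshold might be sketched as follows in Python; the threshold value and units are illustrative assumptions.

      def rate_of_change(time1_s, temp1_c, time2_s, temp2_c):
          """First-order rate of change of temperature, in degrees Celsius
          per second, between two time-stamped samples."""
          return (temp2_c - temp1_c) / (time2_s - time1_s)

      def second_order_rate(samples):
          """samples: list of (time_s, temperature_c) tuples, oldest first,
          with at least three entries; returns an estimate of the rate of
          change of the rate of change (second order derivative)."""
          (t0, c0), (t1, c1), (t2, c2) = samples[-3:]
          r1 = rate_of_change(t0, c0, t1, c1)
          r2 = rate_of_change(t1, c1, t2, c2)
          return (r2 - r1) / (t2 - t1)

      RATE_THRESHOLD_C_PER_MIN = 0.5  # illustrative threshold only

      def rate_threshold_surpassed(time1_s, temp1_c, time2_s, temp2_c):
          """True when the absolute first-order rate, converted to degrees
          Celsius per minute, exceeds the illustrative threshold."""
          rate_per_min = abs(rate_of_change(time1_s, temp1_c, time2_s, temp2_c)) * 60.0
          return rate_per_min > RATE_THRESHOLD_C_PER_MIN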
  • Notification of the changing condition of a patient may be indicative of whether a new course of action needs to be taken in relation to the patient. For example, whether certain drugs ought to be administered by paramedics, whether the current course of treatment is worsening the situation and/or whether the temperature in the interior of the vehicle needs to be altered. Notifications may be specifically directed to the driver of the vehicle, for example whether arrival at a hospital has become urgent or whether their driving is adversely affecting personnel and/or patient status.
  • Surface temperatures may be monitored in the interior or exterior of the vehicle. For example, a temperature of the surface of the patient transport apparatus may be monitored. Where the patient transport apparatus is exposed to a high or low temperature for a period of time, the surface of the patient transport apparatus may reach a temperature which is unsuitable for placing a patient thereon. In this case personnel may be notified appropriately.
  • A log of any thermal imaging data and related events may be recorded in local and/or remote memory for subsequent retrieval. Said data may form part of a report compiled by software. For example, said report may associate data with specific users of the vehicle.
  • Referring now to FIG. 27, a method 2700 is provided according to an embodiment of the present disclosure. The method 2700 comprises capturing an image of an area in the vicinity of the vehicle, step 2710. The image is then analysed, step 2720. The image is analysed to detect whether the captured image contains both a person and a patient transport apparatus, step 2730. The image is further analysed to determine a parameter relating to the person on the patient transport apparatus, step 2740. In response to determining the parameter relating to the person on the patient transport apparatus, a field of a database may be populated, step 2750. In the exemplary embodiment the database is an electronic patient care record (ePCR). In the exemplary embodiment the parameter relating to the person on the patient transport apparatus comprises a gender. Additional data parameters relating to the patient, such as age, race, height and weight, may also be estimated using machine learning algorithms or otherwise. It will be understood that any number of other captured data parameters described in the present disclosure may be used in appropriately populating a database such as the ePCR. The populated ePCR may be stored on local or remote memory for subsequent retrieval. The steps of the method 2700 of FIG. 27 may be implemented in combination with the method steps of other processes described herein. By way of example only, steps 2740 and 2750 may be implemented in combination with the steps of FIG. 4, FIG. 6, FIG. 10, and so on.
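Population of an ePCR field might be sketched as below; SceneAnalysis and the field names are hypothetical stand-ins for whatever the image analysing means and the actual ePCR schema provide.

```python
from dataclasses import dataclass
from typing import Dict, Optional

@dataclass
class SceneAnalysis:
    """Hypothetical summary of what the image analysing means reports for a
    captured image of the area in the vicinity of the vehicle."""
    person_present: bool
    transport_apparatus_present: bool
    estimated_gender: Optional[str] = None
    estimated_age: Optional[int] = None

def populate_epcr(analysis: SceneAnalysis,
                  epcr: Optional[Dict[str, object]] = None) -> Dict[str, object]:
    """Populate ePCR fields only when both a person and a patient transport
    apparatus are detected in the captured image."""
    record = dict(epcr or {})
    if analysis.person_present and analysis.transport_apparatus_present:
        if analysis.estimated_gender is not None:
            record.setdefault("gender", analysis.estimated_gender)
        if analysis.estimated_age is not None:
            record.setdefault("estimated_age", analysis.estimated_age)
    return record

# Example usage:
# populate_epcr(SceneAnalysis(True, True, estimated_gender="female", estimated_age=54))
```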
  • Referring to FIG. 28 there is provided a method 2800 of vehicle security. The method comprises capturing an image of an area in the vicinity of the vehicle, step 2810. The image is then analysed, step 2820. In a further step 2830, the method comprises determining whether an attack is taking place in the vicinity of the vehicle. In response to determining that an attack is taking place in the vicinity of the vehicle, a change of state of the vehicle is actuated, step 2850.
  • In various embodiments, actuating a change of state of the vehicle comprises one or more of: activating a recording mode of the image capture device(s) 175 in the interior and/or exterior of the vehicle; providing a notification to one or more areas of the vehicle; providing a notification to a user device; or creating a log of the attack event in memory. It will be understood that the change of state may be actuated by a processing means in cooperation with other necessary components. For example, the processing means may send instructions to the notification means to provide the notification in the vicinity of the vehicle, or may send instructions to a wireless transmission means to provide a notification to the user device.
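One way to picture this actuation logic is a small dispatcher that, once an attack has been determined, triggers recording, notifications and a time-stamped log entry. The VehicleActuators callbacks below are assumed interfaces for the sketch, not the disclosed components themselves.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Callable, List

@dataclass
class VehicleActuators:
    """Assumed callbacks standing in for the processing means' links to the
    image capture device(s), notification means and wireless transmission means."""
    start_recording: Callable[[], None]
    notify_area: Callable[[str, str], None]        # (area of the vehicle, message)
    notify_user_device: Callable[[str], None]
    event_log: List[str] = field(default_factory=list)

def on_attack_detected(actuators: VehicleActuators, description: str) -> None:
    """Actuate the changes of state listed above once an attack is determined:
    record, notify the vehicle and a user device, and log the event."""
    actuators.start_recording()
    actuators.notify_area("exterior", description)
    actuators.notify_area("driver_cab", description)
    actuators.notify_user_device(description)
    stamp = datetime.now(timezone.utc).isoformat()
    actuators.event_log.append(f"{stamp} attack event: {description}")
```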
  • In various embodiments, determining whether an attack is taking place in the vicinity of the vehicle may comprise one or both of determining that an attack is taking place on the body of the vehicle, or on persons who are in the vicinity of the vehicle. For example, paramedics may be under attack near their ambulance, or police officers near their squad car. Accordingly, FIG. 29 illustrates a method 2900 comprising the additional step of detecting a first person and a second person in the captured image, 2930. It is then determined whether an attack is taking place on the first person, 2940.
  • The first person may be an authorised person such as a paramedic or police officer, and the second person may be an assailant. Authorised personnel may be identified by their uniform, by facial recognition, or by other means such as a scannable code disposed on their clothing which can be compared to a database stored in memory. A person may be identified as a patient in numerous ways, for example based on their position and orientation, such as on the ground or on a patient transport apparatus, based on being harnessed into a patient transport apparatus, or based on certain medical equipment associated with them such as an IV drip. In the context of policing, a person may be identified as an apprehended person. This may be determined by detecting that the person's hands are cuffed. A determination could thus be made by the image analysing means 115 that the cuffed individual is a person apprehended by an authorised person (i.e. a police officer). Identity information may be used in a report compiled after the incident. FIG. 30 provides a method 3000 in which the image analysing means 115 determines whether the first person is an authorised person, a patient or an apprehended person.
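A rule-based sketch of the role determination described above is given below; the cue names are hypothetical, and a deployed system might instead weight such cues or learn the mapping.

```python
from dataclasses import dataclass

@dataclass
class PersonCues:
    """Hypothetical visual cues extracted from the captured image."""
    wears_uniform: bool = False
    face_matches_roster: bool = False
    clothing_code_in_database: bool = False
    on_transport_apparatus: bool = False
    attached_medical_equipment: bool = False
    hands_cuffed: bool = False

def classify_person(cues: PersonCues) -> str:
    """Return 'authorised', 'patient', 'apprehended' or 'unknown' from the cues;
    the ordering of the rules is one illustrative choice among many."""
    if cues.wears_uniform or cues.face_matches_roster or cues.clothing_code_in_database:
        return "authorised"
    if cues.on_transport_apparatus or cues.attached_medical_equipment:
        return "patient"
    if cues.hands_cuffed:
        return "apprehended"
    return "unknown"

# Example: classify_person(PersonCues(hands_cuffed=True)) -> "apprehended"
```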
  • Determining that an attack is taking place may comprise detecting certain types of movement of the second person in the captured image. For example, detecting an arm of the second person in the captured image moving in a certain direction at a certain speed (i.e. a punch), or detecting a sequence of such events (a series of punches). The image analysing means 115 may be configured to execute the method 2800 implementing one or more machine learning algorithms. The machine learning algorithm(s) may be trained with examples of “attack behaviour” against which it can compare captured image data.
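As a crude stand-in for a trained “attack behaviour” model, the sketch below flags repeated high-speed movements of a tracked wrist keypoint. The pixel-speed threshold and strike count are illustrative assumptions, not values from the disclosure.

```python
from typing import List, Sequence, Tuple

def wrist_speeds(track: Sequence[Tuple[float, float, float]]) -> List[float]:
    """Speeds (pixels per second) of a tracked wrist keypoint, given
    time-ordered (time_s, x_px, y_px) samples from the captured video."""
    speeds = []
    for (t0, x0, y0), (t1, x1, y1) in zip(track, track[1:]):
        distance = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
        speeds.append(distance / (t1 - t0))
    return speeds

def looks_like_attack(track: Sequence[Tuple[float, float, float]],
                      speed_threshold: float = 900.0,
                      min_strikes: int = 2) -> bool:
    """Flag a track containing repeated high-speed arm movements
    (i.e. a series of punches)."""
    strikes = sum(1 for s in wrist_speeds(track) if s > speed_threshold)
    return strikes >= min_strikes
```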
  • In the exemplary embodiment, recording may continue until the image analysing means 115 determines that the attack has ceased. When recording ends, the video data may be stored on local and/or remote memory for subsequent retrieval. Data such as, but not limited to, the identity of personnel involved in the incident, an ePCR of a patient involved in the incident, as well as time stamps, may be associated with the event.
  • Providing a notification to one or more areas of the vehicle may comprise activating an audibly and/or visually perceptible alarm on the exterior of the vehicle or in the driver's cab. The notification may continue until the image analysing means 115 determines that the attack has ended or until authorised personnel deactivate the notification. In the driver's cab, the notification may comprise a visual notification appearing on the user interface of the cab, and/or an audio notification emanating from audio speakers.
  • Providing a notification to the user device may comprise processor 105 sending instructions to a wireless transmission means to provide an audibly and/or visually perceptible alarm (e.g. sirens, flashing lights and the like), an automated email, push notification, a text message such as SMS, or other alert on a user device. The user device may comprise a device belonging to a safety officer. In some embodiments, the video may be live streamed to the safety officer's device for continual monitoring of the situation in real time.
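Delivery to a user device could be abstracted as below, with each channel (e.g. email, push, SMS) injected as a callable so the sketch stays independent of any particular messaging service; the AttackAlert fields are assumptions made for the example.

```python
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class AttackAlert:
    vehicle_id: str
    message: str
    stream_url: str = ""    # optional live-stream link for continual monitoring

def dispatch_alert(alert: AttackAlert,
                   channels: Dict[str, Callable[[AttackAlert], None]]) -> None:
    """Fan the alert out to whichever delivery channels are configured
    (e.g. email, push notification, SMS)."""
    for send in channels.values():
        send(alert)

# Example wiring with print-based stand-ins for real email/push/SMS gateways:
dispatch_alert(
    AttackAlert("AMB-07", "Attack detected near rear doors"),
    {"push": lambda a: print("push:", a.message),
     "sms": lambda a: print("sms:", a.message)},
)
```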
  • In some embodiments, the method 2800 may further comprise controlling a locking mechanism of the vehicle in response to determining an attack is taking place in the vicinity of the vehicle.
  • It will be understood that the methods of FIGS. 28-30 may be extended to the detection and identification of additional persons in the captured image.
  • It will be understood that while exemplary features of an apparatus for actuating vehicle parameters using combined biometric personnel-authorised and patient transport apparatus-authorised access to the environment have been described, such an arrangement is not to be construed as limiting the invention to such features. The method for detection of persons and objects, and of parameters related to both such as distance and velocity, and furthermore for actuating the vehicle parameters and components, may be implemented in software, firmware, hardware, or a combination thereof. In one mode, the method is implemented in software, as an executable program, and is executed by one or more special or general purpose digital computer(s), such as a personal computer (PC; IBM-compatible, Apple-compatible, or otherwise), personal digital assistant, workstation, minicomputer, or mainframe computer. The steps of the method may be implemented by a server or computer in which the software modules reside or partially reside.
  • Generally, in terms of hardware architecture, such a computer will include, as will be well understood by the person skilled in the art, a processor, memory, and one or more input and/or output (I/O) devices (or peripherals) that are communicatively coupled via a local interface. The local interface can be, for example, but not limited to, one or more buses or other wired or wireless connections, as is known in the art. The local interface may have additional elements, such as controllers, buffers (caches), drivers, repeaters, and receivers, to enable communications. Further, the local interface may include address, control, and/or data connections to enable appropriate communications among the other computer components.
  • The processor(s) may be programmed to perform the functions of the method for authorising persons and controlling vehicle parameters such as but not limited to the lock state of the doors and the state of access facilities such as a ramp or lift, or vehicle parameters such as but not limited to temperature. The processor(s) is a hardware device for executing software, particularly software stored in memory. Processor(s) can be any custom made or commercially available processor, a central processing unit (CPU), an auxiliary processor among several processors associated with a computer, a semiconductor based microprocessor (in the form of a microchip or chip set), a macro-processor, or generally any device for executing software instructions.
  • Memory is associated with processor(s) and can include any one or a combination of volatile memory elements (e.g., random access memory (RAM, such as DRAM, SRAM, SDRAM, etc.)) and non-volatile memory elements (e.g., ROM, hard drive, tape, CDROM, etc.). Moreover, memory may incorporate electronic, magnetic, optical, and/or other types of storage media. Memory can have a distributed architecture where various components are situated remote from one another, but are still accessed by processor(s).
  • The software in memory may include one or more separate programs. The separate programs comprise ordered listings of executable instructions for implementing logical functions in order to implement the functions of the modules. In the example heretofore described, the software in memory includes the one or more components of the method and is executable on a suitable operating system (O/S).
  • The present disclosure may include components provided as a source program, executable program (object code), script, or any other entity comprising a set of instructions to be performed. When provided as a source program, the program needs to be translated via a compiler, assembler, interpreter, or the like, which may or may not be included within the memory, so as to operate properly in connection with the O/S. Furthermore, a methodology implemented according to the teaching may be expressed in (a) an object-oriented programming language, which has classes of data and methods, or (b) a procedural programming language, which has routines, subroutines, and/or functions, for example but not limited to, C, C++, Pascal, Basic, Fortran, Cobol, Perl, Java, and Ada.
  • When the method is implemented in software, it should be noted that such software can be stored on any computer readable medium for use by or in connection with any computer related system or method. In the context of this teaching, a computer readable medium is an electronic, magnetic, optical, or other physical device or means that can contain or store a computer program for use by or in connection with a computer related system or method.
  • Such an arrangement can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. In the context of this disclosure, a “computer-readable medium” can be any means that can store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. The computer-readable medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or propagation medium. Any method descriptions or blocks in the Figures should be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps in the process, as would be understood by those having ordinary skill in the art.
  • The above detailed description of embodiments of the disclosure is not intended to be exhaustive or to limit the disclosure to the exact form disclosed. While specific examples of the disclosure are described above for illustrative purposes, those skilled in the relevant art will recognize that various modifications are possible within the scope of the disclosure. For example, while processes and blocks have been presented in a particular order, different implementations may perform routines, or employ systems having blocks, in an alternate order, and some processes or blocks may be deleted, supplemented, added, moved, separated, combined, and/or modified to provide different combinations or sub-combinations. Each of these processes or blocks may be implemented in a variety of alternate ways. Also, while processes or blocks are at times shown as being performed in sequence, these processes or blocks may instead be performed or implemented in parallel or may be performed at different times. The results of processes or blocks may also be held in a non-persistent store as a method of increasing throughput and reducing processing requirements.

Claims (24)

1-36. (canceled)
37. A method of vehicle security comprising the steps of:
capturing an image of an area in the vicinity of a vehicle;
analysing the captured image;
determining by an image analysing means whether an attack is taking place in the vicinity of the vehicle; and
in response to determining that an attack is taking place in the vicinity of the vehicle, actuating a change of state of the vehicle.
38. The method of claim 37, further comprising detecting by the image analysing means whether the captured image contains a first person and a second person.
39. The method of claim 38, wherein determining whether an attack is taking place in the vicinity of the vehicle comprises determining whether an attack is taking place on a first person, and further comprising determining whether the first person is one of an authorised person, a patient, and an apprehended person.
40. The method of claim 37, wherein determining whether an attack is taking place in the vicinity of the vehicle comprises determining whether an attack is taking place on the vehicle.
41. The method of claim 37, wherein actuating a change of state of the vehicle comprises activating a recording mode of an image capture device on the vehicle, and optionally live streaming the recording to a user device.
42. The method of claim 37, wherein recording continues until it is determined that the attack has ended, wherein optionally when recording ends the recording clip is saved to memory.
43. The method of claim 37, wherein actuating a change of state of the vehicle comprises providing, by a notification means, a notification to at least a first area of the vehicle, wherein the notification provided to the at least first area of the vehicle comprises at least one of an audibly and a visually perceptible alarm.
44. The method of claim 37, wherein actuating a change of state of the vehicle comprises providing a notification to a user device, wherein the notification provided to the user device comprises at least one of: an audibly perceptible alarm, a visually perceptible alarm, an email notification, a push notification, and an SMS message.
45. The method of claim 37, wherein actuating a change of state of the vehicle comprises creating a log of the attack in memory.
46. The method of claim 37, further comprising determining by the image analysing means the identity of the person under attack.
47. The method of claim 37, further comprising controlling a locking mechanism of the vehicle in response to determining an attack is taking place in the vicinity of the vehicle.
48. The method of claim 37, wherein the image analysing means comprises at least one machine learning algorithm.
49. A security system, comprising:
an image capture device configured for capturing an image of an area in the vicinity of a vehicle;
an image analysing means configured for analysing the captured image, wherein the image analysing means is further configured to determine whether an attack is taking place in the vicinity of the vehicle;
a processor configured to actuate a change of state of the vehicle in response to the image analysing means determining that an attack is taking place; and
memory.
50. The security system of claim 49, wherein the processor is configured to execute one or more machine learning algorithms.
51. The security system of claim 49, wherein the image analysing means is configured for detecting a plurality of persons in the captured image.
52. The security system of claim 49, wherein determining whether an attack is taking place in the vicinity of the vehicle comprises determining whether an attack is taking place on a person of the plurality of persons in the captured image, wherein the image analysing means is further configured for determining whether the person of the plurality of persons is one of an authorised person, a patient and an apprehended person.
53. The security system of claim 49, wherein determining whether an attack is taking place in the vicinity of the vehicle comprises determining whether an attack is taking place on the vehicle.
54. The security system of claim 49, wherein the processor is configured for activating a recording mode of the image capture device on the vehicle in response to determining that an attack is taking place in the vicinity of the vehicle, wherein recording continues until it is determined that the attack has ended, and wherein the processor is configured to save the recording clip to memory when recording ends.
55. The security system of claim 49, further comprising a notification means, wherein the processor is configured to send instructions to the notification means to provide a notification to at least a first area of the vehicle in response to determining that an attack is taking place in the vicinity of the vehicle, wherein the notification comprises at least one of an audibly and a visually perceptible alarm.
56. The security system of claim 49, wherein the processor is configured to send instructions to a wireless transmission means to provide a notification to a user device, wherein the notification provided to the user device comprises at least one of: an audibly perceptible alarm, a visually perceptible alarm, an email notification, a push notification, and an SMS message.
57. The security system of claim 49, wherein a wireless transmission means is configured to provide to a user device a live stream from the image capture device.
58. The security system of claim 49, wherein the processor is configured to create a log of the attack in memory.
59. The security system of claim 49, wherein the processor is configured to control a locking mechanism of the vehicle in response to determining an attack is taking place in the vicinity of the vehicle.
US17/442,448 2019-03-26 2020-03-26 Security method and system for a vehicle Abandoned US20220153232A1 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
GB1904149.0 2019-03-26
GB1904149.0A GB2582904B (en) 2019-03-26 2019-03-26 Method and apparatus for controlling access to a vehicle
PCT/EP2020/058628 WO2020193737A1 (en) 2019-03-26 2020-03-26 Security method and system for a vehicle

Publications (1)

Publication Number Publication Date
US20220153232A1 true US20220153232A1 (en) 2022-05-19

Family

ID=66381352

Family Applications (5)

Application Number Title Priority Date Filing Date
US17/442,414 Pending US20220185235A1 (en) 2019-03-26 2019-10-25 Method and apparatus for contolling access to a vehicle
US17/442,417 Pending US20220168160A1 (en) 2019-03-26 2019-10-25 Method and apparatus for determining personnel harnessing and position protocol adherence
US17/442,443 Pending US20220198910A1 (en) 2019-03-26 2019-10-25 Method and apparatus for sanitary control in a vehicle
US17/442,427 Pending US20220161808A1 (en) 2019-03-26 2020-03-26 Method and apparatus for monitoring status of persons in a vehicle
US17/442,448 Abandoned US20220153232A1 (en) 2019-03-26 2020-03-26 Security method and system for a vehicle

Family Applications Before (4)

Application Number Title Priority Date Filing Date
US17/442,414 Pending US20220185235A1 (en) 2019-03-26 2019-10-25 Method and apparatus for contolling access to a vehicle
US17/442,417 Pending US20220168160A1 (en) 2019-03-26 2019-10-25 Method and apparatus for determining personnel harnessing and position protocol adherence
US17/442,443 Pending US20220198910A1 (en) 2019-03-26 2019-10-25 Method and apparatus for sanitary control in a vehicle
US17/442,427 Pending US20220161808A1 (en) 2019-03-26 2020-03-26 Method and apparatus for monitoring status of persons in a vehicle

Country Status (6)

Country Link
US (5) US20220185235A1 (en)
EP (5) EP3787937B1 (en)
CA (5) CA3133017C (en)
ES (3) ES2960887T3 (en)
GB (1) GB2582904B (en)
WO (5) WO2020192952A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220277640A1 (en) * 2021-02-26 2022-09-01 Nec Corporation Control device, control method, and recording medium

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111860376B (en) * 2020-07-24 2023-10-31 北京博维航空设施管理有限公司 Contour fitting method and device for cabin door
DE102022114066A1 (en) 2022-06-03 2023-12-14 Bayerische Motoren Werke Aktiengesellschaft Method and device for providing a transport vehicle
CN115297306B (en) * 2022-10-10 2023-03-24 深圳市旗扬特种装备技术工程有限公司 Pedestrian personal belonging anti-loss monitoring method, device, equipment and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070260361A1 (en) * 2006-05-08 2007-11-08 Drivecam, Inc. System and Method for Selective Review of Event Data
US9747795B1 (en) * 2016-12-05 2017-08-29 AEGIS Enterprises, LLC System to alert users of potential ambushers
US10185628B1 (en) * 2017-12-07 2019-01-22 Cisco Technology, Inc. System and method for prioritization of data file backups
US20190037166A1 (en) * 2015-10-14 2019-01-31 Utility Associates, Inc. Article of clothing with video recording device support
US10233679B1 (en) * 2016-04-11 2019-03-19 State Farm Mutual Automobile Insurance Company Systems and methods for control systems to facilitate situational awareness of a vehicle
US10507793B1 (en) * 2018-08-17 2019-12-17 Felipe Boris De Moura Partika Alarm, safety device and device for expelling attackers for motor vehicles

Family Cites Families (45)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7043084B2 (en) * 2002-07-30 2006-05-09 Mitsubishi Electric Research Laboratories, Inc. Wheelchair detection using stereo vision
CN2680523Y (en) * 2003-09-16 2005-02-23 程滋颐 Automobile burglar alarm with face identification, discrimination and wireless communication functions
US7831087B2 (en) * 2003-10-31 2010-11-09 Hewlett-Packard Development Company, L.P. Method for visual-based recognition of an object
JP2005315024A (en) * 2004-04-30 2005-11-10 Fujitsu Ten Ltd Vehicle controller
US8149126B2 (en) * 2007-02-02 2012-04-03 Hartford Fire Insurance Company Lift monitoring system and method
US20090027493A1 (en) * 2007-07-24 2009-01-29 Ahron Amar Method and device for security in public places
US8956294B2 (en) * 2009-05-20 2015-02-17 Sotera Wireless, Inc. Body-worn system for continuously monitoring a patients BP, HR, SpO2, RR, temperature, and motion; also describes specific monitors for apnea, ASY, VTAC, VFIB, and ‘bed sore’ index
US8869449B2 (en) * 2010-03-26 2014-10-28 Siemens S.A.S. Safety apparatus for closing and opening a door
US8382181B2 (en) * 2010-04-26 2013-02-26 Ferno-Washington, Inc. Emergency vehicle patient transport systems
GB201020241D0 (en) * 2010-11-30 2011-01-12 Univ Lincoln The A response detection system and associated methods
JP5713790B2 (en) * 2011-05-09 2015-05-07 キヤノン株式会社 Image processing apparatus, image processing method, and program
CN103931172A (en) * 2011-06-10 2014-07-16 菲力尔系统公司 Systems and methods for intelligent monitoring of thoroughfares using thermal imaging
US20140309873A1 (en) * 2013-04-15 2014-10-16 Flextronics Ap, Llc Positional based movements and accessibility of features associated with a vehicle
US11502551B2 (en) * 2012-07-06 2022-11-15 Energous Corporation Wirelessly charging multiple wireless-power receivers using different subsets of an antenna array to focus energy at different locations
US9068390B2 (en) * 2013-01-21 2015-06-30 Magna Electronics Inc. Vehicle hatch control system
US20150009010A1 (en) * 2013-07-03 2015-01-08 Magna Electronics Inc. Vehicle vision system with driver detection
US8930072B1 (en) * 2013-07-26 2015-01-06 Lytx, Inc. Managing the camera acquiring interior data
GB201313986D0 (en) * 2013-08-05 2013-09-18 Biorics Nv Method and device for accurate realtime detection of drowsiness in operators using physiological responses
DE102014101208A1 (en) * 2014-01-31 2015-08-06 Huf Hülsbeck & Fürst Gmbh & Co. Kg mounting module
RS20140182A1 (en) * 2014-04-14 2015-10-30 Novelic D.O.O. Radar sensor for detection of driver’s drowsiness that operates in the millimeter wave frequency range and operational method thereof
US10719727B2 (en) * 2014-10-01 2020-07-21 Apple Inc. Method and system for determining at least one property related to at least part of a real environment
JP6207089B2 (en) * 2014-10-17 2017-10-04 本田技研工業株式会社 Vehicle door control device
US11291383B2 (en) * 2014-10-31 2022-04-05 Rtthermal, Llc Magnetic resonance imaging patient temperature monitoring system and related methods
US11080977B2 (en) * 2015-02-24 2021-08-03 Hiroshi Aoyama Management system, server, management device, and management method
JP6505514B2 (en) * 2015-06-10 2019-04-24 パナソニック株式会社 Air conditioner, sensor system, and method of estimating thermal sensation thereof
US20190279447A1 (en) * 2015-12-03 2019-09-12 Autoconnect Holdings Llc Automatic vehicle diagnostic detection and communication
US10152870B1 (en) * 2016-02-22 2018-12-11 Lytx, Inc. Compliance detection
SE541541C2 (en) * 2016-03-10 2019-10-29 Scania Cv Ab Method and system for theft detection in a vehicle
US10139827B2 (en) * 2016-06-28 2018-11-27 Ford Global Technologies, Llc Detecting physical threats approaching a vehicle
US20180075565A1 (en) * 2016-09-13 2018-03-15 Ford Global Technologies, Llc Passenger validation systems and methods
US10095229B2 (en) * 2016-09-13 2018-10-09 Ford Global Technologies, Llc Passenger tracking systems and methods
US20180074494A1 (en) * 2016-09-13 2018-03-15 Ford Global Technologies, Llc Passenger tracking systems and methods
US10249104B2 (en) * 2016-12-06 2019-04-02 Nio Usa, Inc. Lease observation and event recording
US10624802B2 (en) * 2017-04-26 2020-04-21 Robotic Research, Llc System for automating wheelchair user ingress, egress, and securement on a vehicle
US11887449B2 (en) * 2017-07-13 2024-01-30 Elvis Maksuti Programmable infrared security system
US11334070B2 (en) * 2017-08-10 2022-05-17 Patroness, LLC Systems and methods for predictions of state of objects for a motorized mobile system
EP3672842A1 (en) * 2017-08-25 2020-07-01 Cubic Corporation Remote operation of non-driving functionality autonomous vehicles
DE102017218100A1 (en) * 2017-10-11 2019-04-11 Bayerische Motoren Werke Aktiengesellschaft Method and apparatus for configuring vehicle functions
US11248820B2 (en) * 2017-10-18 2022-02-15 Tsinghua University Air conditioning control device to control air conditioner based on comfortable skin temperature range
US10902711B2 (en) * 2017-10-23 2021-01-26 Martin Alpert Facility monitoring apparatus and method
US10935978B2 (en) * 2017-10-30 2021-03-02 Nio Usa, Inc. Vehicle self-localization using particle filters and visual odometry
US11126917B2 (en) * 2017-12-22 2021-09-21 At&T Intellectual Property I, L.P. System and method for estimating potential injuries from a vehicular incident
US20190391581A1 (en) * 2018-06-26 2019-12-26 Uber Technologies, Inc. Passenger Health Monitoring and Intervention for Autonomous Vehicles
CN108743069B (en) * 2018-06-29 2020-02-14 郑州大学第一附属医院 Mobile emergency intensive care system
CN109481160A (en) * 2018-12-21 2019-03-19 珠海鹏宇汽车有限公司 A kind of internet monitoring type ambulance


Also Published As

Publication number Publication date
CA3133056A1 (en) 2020-10-01
EP3938251A1 (en) 2022-01-19
ES2960887T3 (en) 2024-03-07
US20220198910A1 (en) 2022-06-23
WO2020193738A1 (en) 2020-10-01
EP3787938B1 (en) 2022-05-25
EP3787938A1 (en) 2021-03-10
GB201904149D0 (en) 2019-05-08
CA3133095A1 (en) 2020-10-01
US20220168160A1 (en) 2022-06-02
EP3938252A1 (en) 2022-01-19
WO2020192951A1 (en) 2020-10-01
EP3938253A1 (en) 2022-01-19
ES2927155T3 (en) 2022-11-02
GB2582904A (en) 2020-10-14
CA3133020A1 (en) 2020-10-01
CA3133017A1 (en) 2020-10-01
EP3787937B1 (en) 2022-05-25
ES2927676T3 (en) 2022-11-10
EP3787937A1 (en) 2021-03-10
US20220161808A1 (en) 2022-05-26
GB2582904B (en) 2021-04-14
CA3133055C (en) 2023-10-03
CA3133017C (en) 2023-08-01
CA3133055A1 (en) 2020-10-01
US20220185235A1 (en) 2022-06-16
WO2020192953A1 (en) 2020-10-01
WO2020193737A1 (en) 2020-10-01
WO2020192952A1 (en) 2020-10-01
CA3133095C (en) 2023-10-17
EP3938251B1 (en) 2023-07-19
CA3133056C (en) 2023-11-07

Similar Documents

Publication Publication Date Title
US20220153232A1 (en) Security method and system for a vehicle
US11694553B2 (en) Controlling autonomous vehicles to provide automated emergency response functions
US10600295B2 (en) System and method for threat monitoring, detection, and response
US20190391581A1 (en) Passenger Health Monitoring and Intervention for Autonomous Vehicles
US9988055B1 (en) Vehicle occupant monitoring using infrared imaging
US20150087256A1 (en) Emergency Responder System For Portable Communication Device
US20230153424A1 (en) Systems and methods for an automous security system
US11551117B1 (en) Policy based artificial intelligence engine
US20240059234A1 (en) Vehicle occupancy sensor systems and methods

Legal Events

Date Code Title Description
AS Assignment

Owner name: ATSR LIMITED, IRELAND

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:GALLAGHER, ERIC;BOCKUS, VALDAS;LOUGHNANE, COLM;REEL/FRAME:057589/0123

Effective date: 20210921

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION