WO2007012723A2 - Procédé et système de modélisation d'une interface entre un utilisateur et son environnement à bord d'un véhicule - Google Patents


Info

Publication number
WO2007012723A2
WO2007012723A2 PCT/FR2006/001701
Authority
WO
WIPO (PCT)
Prior art keywords
interface
information
model
interface elements
user
Prior art date
Application number
PCT/FR2006/001701
Other languages
English (en)
French (fr)
Other versions
WO2007012723A8 (fr)
WO2007012723A3 (fr)
Inventor
Alexandre-Lucas Stephane
Original Assignee
Airbus
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Airbus filed Critical Airbus
Priority to CA2615250A priority Critical patent/CA2615250C/en
Priority to CN200680027036.2A priority patent/CN101351763B/zh
Priority to BRPI0615543-0A priority patent/BRPI0615543A2/pt
Priority to EP06778868A priority patent/EP1915662A2/fr
Priority to JP2008523395A priority patent/JP5032476B2/ja
Publication of WO2007012723A2 publication Critical patent/WO2007012723A2/fr
Publication of WO2007012723A3 publication Critical patent/WO2007012723A3/fr
Publication of WO2007012723A8 publication Critical patent/WO2007012723A8/fr

Links

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01DMEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
    • G01D7/00Indicating measured values
    • G01D7/02Indicating value of two or more variables simultaneously
    • G01D7/08Indicating value of two or more variables simultaneously using a common indicating element for two or more variables
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C23/00Combined instruments indicating more than one navigational value, e.g. for aircraft; Combined measuring devices for measuring two or more variables of movement, e.g. distance, speed or acceleration

Definitions

  • the invention relates to a method and a system for determining a model of an interface between a pilot and his environment on board a vehicle.
  • instrument panels having a plurality of interface elements.
  • To carry out his task, the user of the vehicle in question, via the piloting interface elements, must be perfectly familiar with the functions performed by these interface elements, the information they deliver, as well as the procedures describing the sequences of actions (manual, visual, auditory) to be performed in relation to the interface elements.
  • the subject of the present invention is a method for determining a model of an interface between a user and his environment on board a vehicle, characterized in that it comprises the following steps: - development of an interface model from, on the one hand, a first type of information representative of the vehicle interface elements and, on the other hand, a second type of information representative of the knowledge held by a user about the use of the interface elements,
  • the interface model is developed on the basis of the user-technical system duality and not only on the technical information representative of the system, which makes it possible to obtain a very reliable model, built on a set of information structured between them in a way that is particularly representative of the interaction between the user and his environment on board the vehicle, including the interface elements of the latter.
  • the two types of information are provided, with identical configuration, to a dynamic database having a symmetric user-technical system structure.
  • the information of the two types is configured according to the same multi-agent cognitive model.
  • the configuration of the information of the first type according to a cognitive multi-agent model comprises a step of establishing a link between the procedures for using the vehicle and the interface elements of the vehicle.
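As a purely illustrative sketch (the patent does not prescribe any implementation; names and structure are hypothetical), the link between the use procedures and the interface elements could be represented as:

```python
# Hypothetical sketch: each procedure step records the action to be
# performed and the interface elements (instruments) it involves.
climb_procedure = [
    {"action": "set barometric reference", "instruments": ["FCU", "PFD"]},
    {"action": "monitor speed and altitude", "instruments": ["PFD"]},
]

def instruments_for(procedure):
    """Collect every interface element referenced by a procedure."""
    used = set()
    for step in procedure:
        used.update(step["instruments"])
    return sorted(used)

print(instruments_for(climb_procedure))  # ['FCU', 'PFD']
```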
  • the configuration of the information of the first type according to a multi-agent cognitive model comprises a step of identifying functional zones on each interface element considered. By defining such zones within the same interface element, it is possible to obtain a detailed model of each interface element and thus to obtain subsequently, during the data acquisition step, detailed information about the interaction between the user (e.g. the driver) and these zones, or even several zones of different interface elements. The model is thus even more complete, and therefore more reliable, through the acquisition of, for example, eye-tracking data relating to these zones of the same interface element or of several of them.
  • the configuration of the information of the first type according to a multi-agent cognitive model comprises for each interface element the following steps:
  • the model thus established is particularly representative of the interaction of the user (e.g. the pilot) with the considered interface element, taking into account the tasks assigned to the interface element, which are determined, for example, by the procedures for using the vehicle (e.g. piloting procedures).
  • the human activity solicited during the interaction between the user and the interface elements is selected from vision, speech, hearing, motor skills, physiological manifestations and reactions of the human body.
  • the acquisition and analysis of data reflecting a wide variety of human activities provides very useful information for, for example, completing / modifying the interaction model.
  • the data acquisition apparatus is an eye-tracking apparatus recording visual data representative of the user's visual journey on the interface elements.
  • Such a device is particularly useful for describing the visual behavior of the user (e.g. the pilot) when his gaze travels through different interface elements, as well as the external view, or even particular zones within one or more interface elements.
  • This device can be coupled with another device recording, for example in video form, the actions of the pilot while the position of his gaze is followed by the first device. Audio recordings can also be very useful.
  • the interface model determined as briefly described above has applications in many fields (aeronautics, space, automobile, maritime ...) and can be used in many applications:
  • the invention also relates to a system for determining a model of an interface between a user and his environment on board a vehicle, characterized in that it comprises:
  • FIG. 1a generally represents an algorithm of the method for determining an interface model according to the invention
  • FIG. 1b schematically shows a system for implementing the method according to the invention
  • FIG. 2 is a schematic representation of the process of developing the interface model according to the invention.
  • FIG. 3 schematically represents an algorithm detailing the steps illustrated on the algorithm of FIG. 1a;
  • FIG. 4 is a table illustrating the correspondence between a flight procedure and the flight instruments used at each stage of the procedure
  • FIG. 5 illustrates the identification of different information zones on an on-board instrument
  • FIG. 6 schematically illustrates the functions assigned to the zones defined in FIG. 5;
  • FIG. 7 represents in tabular form the connection between the agents of the cognitive model and the functional areas of the instrument panel illustrated in FIG. 5;
  • FIG. 8 illustrates an example of drawing up tables 16 and 18 of FIG. 1a.
  • the invention finds a particularly interesting application in aeronautics, particularly in the modeling of the interface elements of an aircraft cockpit.
  • IP: main instrument panel ("main instrument panel" in English terminology)
  • PF: Pilot Flying
  • PNF: Pilot Non Flying
  • ND: Navigation Display
  • CP: central panel ("central panel" in English terminology)
  • OP: overhead panel ("overhead panel" in English terminology)
  • GS: glareshield panel
  • the user of the cockpit, namely the pilot, uses all the interface elements of the abovementioned instrument panels for piloting the aircraft, for navigation tasks, as well as for protection tasks intended to keep the aircraft within the flight envelope.
  • the algorithm of FIG. 1a illustrates the main steps of the method according to the invention for determining such a pilot-cockpit interface model.
  • This algorithm is executed by a computer operating in cooperation with data/information storage means (databases, memories, etc.).
  • a cockpit interface model is based on two types of information: a first type of information relating to the technical system and, more particularly, representative of the cockpit interface elements, and a second type of human-related information, representative in particular of the pilot's knowledge of the use of the cockpit interface elements and of the flight procedures, as well as of his behaviors (airplane piloting experience).
  • the pilot-cockpit interaction is interface-based and has a dynamic character, including both user and technical-system behaviors.
  • This step relies on the use of both technical and human origin information in order to take into account the user-system pair during the development of the interaction model.
  • the information of the two aforementioned types is provided to a dynamic database 10 having a symmetric pilot (human)-technical system structure with respect to the interaction axis separating the part of the base relating to the human aspect 12 from that relating to the technical aspect 14.
  • the information is poured into this database in a structured way according to an identical configuration, defined for each aspect (human and technical) by an input-output level detailing all the inputs and outputs used and by a processing level detailing the different subsystems used.
  • the model is developed starting with the identification of inputs and outputs on the human side and the technical system side, before moving on to the identification of subsystems in information processing.
  • the symmetrical development of the human-technical interaction model makes it possible to apply the same methods to all the entities involved, since the technical system as well as the human are considered as complex systems and decomposed analogously into subsystems (e.g. the vocal alarms, belonging to the vocal subsystem, and the graphic alarms, belonging to the graphic subsystem).
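As a purely illustrative sketch (the structure below is ours; the patent does not prescribe any implementation), the symmetric dynamic database, with its identical input-output and processing levels on the human side and the technical-system side, could be represented as:

```python
# Illustrative sketch: a dynamic database with a symmetric
# human / technical-system structure, each side holding an
# input-output level and a processing level.
def make_side():
    return {"inputs_outputs": [], "processing": []}

database = {"human": make_side(), "technical": make_side()}

def feed(side, level, item):
    """Pour information into the database with an identical
    configuration on both sides."""
    database[side][level].append(item)

feed("human", "inputs_outputs", "vision: altitude reading")
feed("technical", "inputs_outputs", "PFD: altitude display")

# Symmetry: both sides expose exactly the same levels.
assert database["human"].keys() == database["technical"].keys()
```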
  • the information of technical origin (first type) and of human origin (second type) is identically configured according to the same multi-agent cognitive model, and the known UML language ("Unified Modeling Language" in English terminology) is used to formalize the pilot-cockpit pair.
  • This multi-agent representation is particularly adapted to the description of processes that can take place simultaneously.
  • a pilot may have to analyze visual information (input from the human side and output from the technical system side) along with auditory information, such as audible alarms.
  • This multi-agent representation is also particularly suitable when it comes to following information in a sequential manner and which can take place between different independent interface elements. Furthermore, this representation is also useful for properly ranking and classifying information in order to facilitate the subsequent analysis of data representative of human activities occurring during the interaction between the driver and interface elements.
  • agents of the cognitive model are identified by their roles, responsibilities, resources or functions, and goals to be attained.
  • the application domain, namely the use of the interface elements of the aircraft cockpit, is analyzed in terms of the needs to be met in a given context.
  • the agents are goal-oriented and make it possible to account for the desire relative to the constitutive schema of the pilot's beliefs.
  • the pilot thinks that, to change the flight level, he needs a certain number of conditions to ensure the smooth running of his maneuver: visibility, engine conditions, atmospheric conditions, etc.
  • the pilot therefore wishes to obtain this information to be able to accomplish his task and will use the cognitive resources provided by the interface elements (instruments). He thus completes his awareness of the situation and can plan for the future and act accordingly.
  • each agent satisfies the goals set by means of action plans which are, for example in aeronautics, the procedures defining the use of the instruments in the crew operations manual called FCOM ("Flight Crew Operating Manual" in English terminology), which include the review of the different check lists, the landing and takeoff phases, etc.
  • these action plans correspond to the mental representation that the user has of the written flight procedures, which varies according to experience.
  • the cognitive architecture is based on two main levels, namely the input-output level and the level of information processing. Agents are classified by level (input-output or processing) and by type (input-output channel or processing system).
  • agents are characterized by one or more roles, responsibilities, and resources.
  • an agent is defined with respect to a task or sub-task (for example, relating to the control of the vehicle) to be performed.
  • the responsibilities of the agent are to perform the task or subtask and the resources used allow the actual execution of the task or subtask.
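The characterization of an agent described above can be sketched as follows (names, fields and the `can_execute` helper are hypothetical illustrations, not taken from the patent):

```python
from dataclasses import dataclass, field

# Hypothetical sketch of an agent of the multi-agent cognitive model:
# an agent is defined with respect to a task or sub-task, its
# responsibilities are to perform it, and its resources allow the
# actual execution.
@dataclass
class Agent:
    role: str                       # e.g. a task or sub-task to perform
    responsibilities: list = field(default_factory=list)
    resources: list = field(default_factory=list)
    goals: list = field(default_factory=list)

    def can_execute(self, task):
        """A task is executable when the agent is responsible for it
        and holds at least one resource for its actual execution."""
        return task in self.responsibilities and bool(self.resources)

a = Agent(role="monitor altitude",
          responsibilities=["read altitude zone"],
          resources=["altitude value", "altitude symbol"])
print(a.can_execute("read altitude zone"))  # True
```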
  • a three-dimensional scene can be represented by a set of agents each in charge of a particular characteristic of the scene, such as the relief, the textures, the colors and the symbology.
  • the textures correspond to the grid of the relief, which can be variable or constant according to the terrain databases, that is to say that one can have meshes of the same size everywhere, or meshes of different sizes according to the zones of relief represented.
  • the resources of these agents are classified by level (input-output or processing) and by type (input-output channels or processing system).
  • the relief of the aforementioned three-dimensional visual scene, which may be represented by an agent, may have varied resources that are used to detect and analyze the valleys, rivers, woods, roads, buildings, etc. of the visual scene.
  • the agents of the multi-agent cognitive model are determined according to the steps of the process indicated below, which are performed iteratively following two approaches, top-down and bottom-up.
  • the "Top-Down" approach is based on the knowledge of pilots and their use of the cockpit interface elements and facilitates the classification into agents.
  • the modeling of the cockpit following this multi-agent cognitive model makes it possible to define the elements of the visual scene at a fine level of granularity which takes into consideration the elements constituting each interface element (on-board instruments), namely the information zones of these interface elements, and not each interface element as a whole (high level of granularity).
  • the resources of the agents thus defined are assigned to the processing of the interface elements.
  • the formalization of the pilot-cockpit pair does not merely represent disparate entities, but proposes to define links between these entities, as represented in FIG. 2, by organizing the entities in the form of tables 16, 18 containing resources, agents, liaison agents and plans, both on the human side and on the technical system side.
  • the liaison agents make it possible to define direct links with specific resources of another agent. Without these liaison agents, it would only be possible to link agents to one another, not resources to agents.
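The role of a liaison agent can be sketched as follows (agent and resource names are hypothetical; only the idea of linking to a specific resource, rather than to a whole agent, comes from the text):

```python
# Illustrative sketch: a liaison agent creates a direct link to a
# *specific resource* of another agent, which a plain agent-to-agent
# link could not express.
agents = {
    "PFD": {"resources": ["altitude value", "vertical speed value"]},
    "Flight Plan Monitoring": {"resources": ["target altitude"]},
}

def liaison(source_agent, target_agent, target_resource, table):
    """Link source_agent directly to one resource of target_agent."""
    if target_resource not in table[target_agent]["resources"]:
        raise ValueError("unknown resource of " + target_agent)
    return (source_agent, target_agent, target_resource)

link = liaison("Flight Plan Monitoring", "PFD", "altitude value", agents)
print(link)  # ('Flight Plan Monitoring', 'PFD', 'altitude value')
```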
  • the modeling of the technical system is represented on the left of FIG. 2 by the modeling of the PFD interface element 20, which will be detailed below, while on the right side of this same figure the architecture of the cognitive modeling of the human side 22 is represented on its two main levels, namely the input-output level 24 and the level 26 where the information processing takes place.
  • LTM: Long-Term Memory
  • WM: Working Memory
  • DM: Decision Making
  • data are acquired that are representative of one or more human activities (for example, vision, speech, hearing, movement of human limbs, kinesthesia, and physiological reactions of the human body) that are involved in the interaction of the user with the interface elements.
  • on the one hand, the pilot looks at a zone of an interface element of the cockpit, the corresponding information being detected by an eye-tracking device and automatically integrated into a results database and, on the other hand, he acts at the same time on the stick and/or other equipment, the corresponding information being collected by a video recording system or other means and also stored.
  • After acquisition of these data, during the next step (step E3) they are studied, for example by the expert pilot or pilots who were the subject of the experiment referred to in step E2.
  • the test subject examines the results and interprets them in an attempt to determine whether an action he performed at a given point in the experiment was appropriate and whether it occurred at the right time. More generally, he explains the relationship between taking or not taking information and acting or not acting.
  • the subject of the experiment determines, for example, why his gaze has followed a given visual path on one or more consecutive interface elements and / or on one or more zones of a same interface element.
  • depending on the results of this study, it is possible to validate the pilot-cockpit interface model as it has been developed, or to adjust it.
  • it may be seen, for example, that an interface element is missing to allow the pilot to carry out his piloting, navigation or other task, or that a navigation interface element (display, etc.) is missing.
  • it may also be seen that the level of granularity retained during the development of the interface model is too fine, and therefore not very representative of the real context, or, on the contrary, that the level of granularity is too coarse and therefore does not make it possible to obtain sufficiently relevant information representative of this context.
  • This can, for example, be observed after significant fatigue and high stress of the subject of the experiment. It is therefore possible to improve the interaction model according to the results of step E3. This is done iteratively by performing the loop shown in FIG. 1a between step E4 and step E1 until the desired interaction model, as representative as possible of the aircraft's on-board environment, is obtained.
  • Once validated, the model can be used, for example, for the training of future pilots in a flight simulator or for the improvement of the interfaces proposed by the system (layout, sequence of information, spatial and multimodal redundancy, etc.).
  • FIG. 1b represents a system 30 for determining a model according to the invention, representative of the interaction between the user 32 and the interface elements 34.
  • This system comprises a computer 36 having inputs-outputs to cooperate with the user 32 and the interface elements 34, as well as with a data acquisition apparatus 38 (for example an eye-tracking device) which transmits to the computer 36 the acquired data to be analyzed.
  • the algorithm of FIG. 3 illustrates in more detail the steps of the algorithm of FIG. 1a by highlighting the symmetrical formalization of the pilot-cockpit pair.
  • the development of the interface model on the technical system side begins with a first step E10, during which a link is established between the flight procedures defined in the FCOM manual and the interface elements of the cockpit (flight instruments such as PFD, ND ...) that the pilot (PF) and the co-pilot (PNF) must consult for each action described in the flight procedure concerned.
  • These procedures include the take-off procedure ("take-off" in English terminology), the climb procedure ("climb" in English terminology), the cruise flight procedure ("cruise" in English terminology), the descent preparation procedure ("descent preparation" in English terminology), the descent procedure ("descent" in English terminology), the standard approach procedure ("standard approach" in English terminology), the non-precision approach procedure ("non-precision approach" in English terminology) and the landing procedure ("landing" in English terminology).
  • the table shown in FIG. 4 is obtained, showing, for example, that the pilot must consult the instrument called FCU ("Flight Control Unit" in English terminology) of the GS panel in SET mode and the PFD instrument of the IP main panel in CHECK mode for the BARO reference (barometric reference). Likewise, during the climb, the pilot must consult the PFD instrument on the main panel to view the speed and altitude information, as well as the attitude of the aircraft.
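The correspondence of FIG. 4 can be sketched as a small data structure (the layout is ours; instrument, panel and mode values come from the example above):

```python
# Sketch of one row of the FIG. 4 correspondence: for one procedure
# action, each entry records the instrument, its panel and the mode
# of use (SET or CHECK).
baro_reference_action = [
    {"instrument": "FCU", "panel": "GS",      "mode": "SET"},
    {"instrument": "PFD", "panel": "IP main", "mode": "CHECK"},
]

def instruments_in_mode(action, mode):
    """List the instruments consulted in a given mode for one action."""
    return [e["instrument"] for e in action if e["mode"] == mode]

print(instruments_in_mode(baro_reference_action, "CHECK"))  # ['PFD']
```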
  • the algorithm of FIG. 3 provides a next step E12 during which the information zones of each interface element of the cockpit are identified, as well as the functions performed by these zones.
  • step E14 is used to determine the roles and responsibilities (functions) of the different zones in view of the tasks and sub-tasks relating to the piloting of the aircraft in which each interface element is used. From this determination of the roles and responsibilities of the zones, it will be possible to determine the agents of the multi-agent cognitive model.
  • For the PFD interface element there are three basic tasks, which are the piloting of the aircraft (T1), the navigation (T2) and the protection intended to keep the aircraft within the flight envelope (T3).
  • zone Z1, qualified as "FMA" ("Flight Mode Annunciator" in English terminology), from which four sub-zones can be identified, provides information on the piloting mode (for example, automatic pilot mode), and
  • the Z2 zone referred to as "VA" provides information on the air speed and can be broken down into two sub-zones.
  • the zone Z3, qualified as "AA" and decomposable into two sub-zones, provides information on the attitude of the aircraft (pitch, trim, roll, guidance, joystick, etc.).
  • the zone Z4, qualified as "A/Vv" and decomposable into three sub-zones, serves as an altimeter and provides information on the vertical speed of the aircraft.
  • Zone Z5, qualified as "ILS-GS" (ILS for "Instrument Landing System" and GS for "Glide Slope" in English terminology), provides information on the vertical position relative to the glide slope of the ILS instrument landing system.
  • Zone Z6, qualified as "ILS-Loc", provides information on the horizontal position relative to the ILS localizer ("localizer" in English terminology).
  • the zone Z7, qualified as "M/I", provides information on the Mach number of the aircraft and navigation information.
  • the zone Z8 qualified as "H / T" ("heading / track zone” in English terminology) provides information on the guidance and the heading of the aircraft.
  • zone Z9, described as "Ref/Alt", provides information on the height reference. It will be noted that the name of the zones acts as a definition of the role of the agent, which will be defined later.
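The decomposition of the PFD into information zones can be sketched as a simple registry (zone labels are taken from the description above; the dictionary layout is ours):

```python
# Sketch: the PFD interface element decomposed into its information
# zones; the zone name acts as the role of the corresponding agent.
pfd_zones = {
    "Z1": "FMA",  "Z2": "VA",     "Z3": "AA",
    "Z4": "A/Vv", "Z5": "ILS-GS", "Z6": "ILS-Loc",
    "Z7": "M/I",  "Z8": "H/T",    "Z9": "Ref/Alt",
}

def role_of(zone_id):
    """Return the agent role derived from the zone name."""
    return pfd_zones[zone_id]

print(role_of("Z4"))  # A/Vv
```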
  • cognitive agents are then determined which make it possible to describe the cognitive processes for using the different zones of the PFD interface element. As shown in FIG. 7, agents related to the vertical displacement analysis are determined.
  • the agent A1 has the role of analyzing the vertical displacement of the aircraft by looking at the parameters of altitude and vertical speed and, to fulfill this role, it is responsible for the values of the vertical parameters and the symbols of these parameters.
  • the agent A1 relies on four cognitive resources related to the responsibility for the values of the vertical parameters, on the one hand, and on two cognitive resources related to the responsibility for the symbology, on the other hand. This allows the agent to perform the tasks, related mainly to the piloting of the aircraft (T11 and T12), which are located in zone Z4 of the PFD interface element.
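Agent A1 as described above can be sketched as follows (the dictionary layout is ours; the role, resource counts, zone and task identifiers come from the text):

```python
# Sketch of agent A1: role of analyzing the vertical displacement of
# the aircraft, responsible for the vertical parameter values and
# their symbology, tied to zone Z4 of the PFD and to tasks T11, T12.
agent_a1 = {
    "role": "analyze vertical displacement",
    "responsibilities": ["vertical parameter values", "symbology"],
    "resources": {
        "vertical parameter values": 4,  # four cognitive resources
        "symbology": 2,                  # two cognitive resources
    },
    "zone": "Z4",
    "tasks": ["T11", "T12"],
}

total_resources = sum(agent_a1["resources"].values())
print(total_resources)  # 6
```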
  • the inputs and outputs of the system are identified with respect to the context of use, that is to say that the information provided by the system (elements such as the PFD) at a given time is identified with respect to a given use situation, such as take-off or climb.
  • In step E20 of the algorithm, the system information (e.g. of the PFD interface element) located at the processing level is identified.
  • FIG. 8 illustrates in detail the preparation of the tables 16 and 18 of FIG. 2 according to the plan, liaison agent, agent and resources structure, on the technical system side as well as on the human side, in the context of the monitoring of the altitude relative to the PFD instrument.
  • Human-source information is provided, for example, through interviews with pilots or flight procedure experts. During these interviews, describing given situations (i.e. the use of instruments that present information in two dimensions, for example the ND or the PFD, compared with the use of an instrument that would present the same information directly in three dimensions), the experts are asked to indicate the actions they would consider undertaking, the checks to be carried out, the information they would need to act, etc.
  • In a first step E22, the modes of interaction with the technical system are identified at the input-output level of the human-side interface model, namely, for example, the input-output channels constituted by human vision, human language, hearing, kinesthesia, etc.
  • the necessary resources are also identified to undertake the appropriate maneuver, i.e. to perceive (look at) the altitude information provided by the corresponding zone of the PFD interface element, to hear (listen to) the audible alarm ("call-out" in English terminology) "TERRAIN" (meaning that the aircraft is outside the safety zone with respect to the terrain, i.e. too low), to pull on the stick, or even to apply thrust.
  • the interactions between the different modalities previously identified are then defined from cases identified during the interviews, such as perceiving the altitude information on the interface element considered, hearing the auditory alarm, as well as pulling on the stick.
  • In step E26, the processing level of the cognitive model on the human side is defined.
  • In this step E26, the information processing is identified according to the different modalities (input-output channels) previously identified.
  • the representative table 16 of the human-side modeling, corresponding to table 18 on the technical system side, is constructed, as part of the monitoring of the altitude of the aircraft with respect to the PFD instrument, from the defined plan, namely knowing how to use the PFD and how to fly the plane.
  • the resources used are determined / identified at the input-output and processing levels.
  • the visual inputs-outputs are identified, namely the altitude monitoring provided by the PFD, and the corresponding processing, namely the Working Memory (WM) and the Long-Term Memory (LTM), as well as the Decision Making.
  • the corresponding agent is the "PFD” and the aforementioned resources are linked to the "Flight Plan Monitoring” agent.
  • the pilot-cockpit multi-agent cognitive model is symmetrically developed.
  • Step E28 completes the human cognitive model and validates it with an expert in the field (specialist in cognitive psychology, physiology, language, etc.).
  • In step E30, the representative model of the human-technical system pair (pilot-cockpit) is validated with experts from the various fields involved, namely experts in the flight procedures, expert pilots, designers and experts in human factors (experts in vision, hearing, language, kinesthesia, etc.). It should be noted that, optionally, steps E28 and E30 can be combined.
  • Once the model is developed, the method proceeds to step E2 previously described, in which human factors analysis methods are used to collect data reflecting the corresponding human activities through an experimental protocol.
  • the eye-tracking device 38 of FIG. 1b makes it possible to record the position of the pilot's gaze on a visual scene, thus making it possible to follow the various visual elements traversed by the pilot's gaze on the interface elements of the cockpit, as well as on the external view.
  • the eye-tracking device comprises an analog device, namely the eye tracker, which records the movements of the pilot's eye.
  • the eye tracker has three elements, namely, a camera recording the movements of the eye, an infrared source emitting an infrared ray in the eye and a camera recording the visual scene seen by the pilot.
  • the video data acquired by the camera recording the movements of the eye and the video data acquired by the camera recording the visual scene seen by the pilot are superimposed, and the position of the pilot's gaze is represented by a pointer (for example, a circle or a reticle) that moves on the visual scene.
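This superimposition can be sketched as follows (the affine calibration mapping is a hypothetical stand-in for the device's own calibration, and the text grid stands in for a video frame):

```python
# Illustrative sketch: a gaze sample from the eye camera is mapped
# into the scene image and marked with a pointer.
def to_scene_coordinates(eye_sample, calibration):
    """Map a raw eye-camera sample to scene-image pixel coordinates
    with a simple affine model (ax*x + bx, ay*y + by)."""
    ax, bx, ay, by = calibration
    return (ax * eye_sample[0] + bx, ay * eye_sample[1] + by)

def draw_pointer(frame, position):
    """Mark the gaze position on the scene frame (here a text grid)."""
    x, y = int(position[0]), int(position[1])
    frame[y][x] = "O"  # the pointer, e.g. a circle or a reticle
    return frame

frame = [[" "] * 5 for _ in range(5)]
frame = draw_pointer(frame, to_scene_coordinates((1.0, 2.0), (2, 0, 1, 1)))
print(frame[3][2])  # O
```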
  • the oculometer is associated with a magnetic field generator to provide maximum accuracy.
  • the magnetic field generator is used as a reference in the three-dimensional space to capture the position of the pilot's head relative to the coordinates of the different surfaces and planes that make up the actual environment of the latter.
  • the surfaces and planes concerned are those corresponding to the cockpit screens and control panels constituting regions of interest which can be decomposed into zones and subfields of interest as seen previously for each interface element.
  • a magnetic field generator and a receiver fixed to the pilot's head are thus used, and these elements, combined with the aforementioned analog device (oculometer), make it possible to obtain with maximum precision the position of the user's gaze on a visual scene.
  • the receiver attached to the pilot's head provides the exact position of the head in the three-dimensional model.
  • the distance between this head receiver and the camera recording the scene, as well as the distance between the head receiver and the pilot's eyes are then introduced into the three-dimensional model.
  • the first of the aforementioned distances is necessary to perform the calibration of the camera with respect to the scene and the second of these distances is necessary to calibrate the analog device (eye tracker).
  • the adaptation of the aforesaid eye-tracking device to the cockpit in order to provide maximum precision by combining the data provided by the position of the pilot's head and those provided by the position of his gaze takes account of the geometric study of the cockpit and the study of the pilot's posture.
  • the Applicant realized that, to install the magnetic field generator on a support in the cockpit, it was necessary to ensure that the distance between the generator and any metal surface is large enough to minimize the magnetic interference that may occur with the eye-tracking device.
  • the Applicant has found that the distance between the magnetic field generator and the receiver of the position of the pilot's head must be strictly less than the distance between that receiver and any metal surface, again to minimize magnetic interference. It should be noted that the pilot's postural study makes it possible to define the limits of his movement volume, and thus the distances between the head receiver and the magnetic field source.
  • with the aforementioned eye-tracking device, it is possible to record very precisely the eye movements (behaviors), such as the fixations, saccades and pursuits that characterize the way the pilot looks at the specific elements of an aeronautical visual scene (instrumentation and outside view).
  • the constituent elements of an eye-tracking device, namely the analog device, the magnetic field generator and a helmet carrying the head receiver, are available from the company SensoMotoric Instruments.
  • in step E3, which follows the data acquisition step, these data are analyzed with the subject or subjects of the experiment (pilots) in order to check the coherence and the reliability of the results of the experiment.
  • the invention may indeed apply to fields other than the aeronautical field.
  • once the course is over, the instructor and the student can, by viewing the video data recorded with the eye tracker, better understand why the student did not look in the rear-view mirror before turning.
  • all the data collected during step E2 and analyzed and interpreted during step E3 are then validated at a first, intra-domain collective level with the experts of the field concerned (for example, in aeronautics, a population of pilots), and then validated at a collective inter-domain level with experts from different fields (experts in human factors, engineers, pilots), so that these data are shared with all concerned stakeholders.
  • the method according to the invention makes it possible to determine when a display system placed at a height above the pilot's head ("Head-Up Display" in English terminology) should be used, in order to optimize its use.
  • the method according to the invention also makes it possible to determine whether such a display system is actually used by the pilot on a particular type of vehicle.
  • the method according to the invention makes it possible to establish that the pilot mentally constructs a three-dimensional visual representation of the position of his vehicle in space, and this solely on the basis of the two-dimensional information provided by the aircraft instruments.
  • the method according to the invention can then serve as a basis for designing a new instrument providing a three-dimensional visual representation of the position of the vehicle in space.
  • the method is particularly advantageous for determining the genuinely useful information provided by the interface elements of the dashboard. Indeed, thanks in particular to the acquisition and analysis of data (for example, oculometric data), the method makes it possible to separate the information essential to the user from that which is not particularly useful or which is redundant.
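By way of illustration only (this sketch is not part of the described system, and all names are hypothetical), the mapping described above — relating the head receiver's position and the gaze direction to points on the cockpit's surfaces and planes — amounts to intersecting a gaze ray with each surface's plane in the three-dimensional model:

```python
def gaze_on_plane(eye, direction, plane_point, plane_normal):
    """Intersect a gaze ray with a cockpit surface plane.

    eye: 3-D eye position (head-receiver position plus the measured
         head-to-eye offset); direction: gaze direction from the eye tracker;
    plane_point, plane_normal: a point on the surface and its normal.
    Returns the 3-D intersection point, or None when the ray is parallel
    to the plane or the plane lies behind the pilot.
    """
    denom = sum(d * n for d, n in zip(direction, plane_normal))
    if abs(denom) < 1e-9:
        return None  # gaze ray parallel to the surface
    t = sum((p - e) * n for p, e, n in zip(plane_point, eye, plane_normal)) / denom
    if t < 0:
        return None  # surface is behind the eye
    return tuple(e + t * d for e, d in zip(eye, direction))
```

The resulting point can then be tested against the regions, zones and sub-zones of interest defined for each interface element.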
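The two placement constraints stated above — the generator kept far enough from any metal surface, and the generator-to-receiver distance kept strictly below the receiver-to-metal distance — can be checked mechanically. The following is a minimal illustrative sketch (function name and the clearance value are assumptions, not taken from the patent):

```python
from math import dist  # Euclidean distance (Python 3.8+)

def interference_risk(generator, head_receiver, metal_surfaces, min_clearance=1.0):
    """Return a list of placement violations (empty if the layout is acceptable).

    generator, head_receiver: (x, y, z) positions in metres.
    metal_surfaces: (x, y, z) points sampled on nearby metal surfaces.
    min_clearance: assumed minimum generator-to-metal distance (illustrative).
    """
    violations = []
    gen_to_receiver = dist(generator, head_receiver)
    for point in metal_surfaces:
        if dist(generator, point) < min_clearance:
            violations.append("generator too close to a metal surface")
        # generator-receiver distance must stay strictly below the
        # receiver-to-metal distance to limit magnetic interference
        if gen_to_receiver >= dist(head_receiver, point):
            violations.append("generator-receiver distance not below receiver-metal distance")
    return violations
```

The pilot's postural study bounds the movement volume, so the head-receiver positions to test can be sampled from that volume.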
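The eye movements mentioned above (fixations, saccades, pursuits) are commonly separated by a velocity threshold on successive gaze samples. This sketch is illustrative only: the 30 deg/s threshold is a common value from the eye-tracking literature, not a figure from the patent, and the device's own analysis tooling may proceed differently:

```python
def classify_gaze(samples, dt, saccade_threshold=30.0):
    """Label each inter-sample interval as 'fixation' or 'saccade'.

    samples: (x, y) gaze positions in degrees of visual angle.
    dt: sampling interval in seconds.
    saccade_threshold: angular velocity threshold in deg/s (assumed value).
    """
    labels = []
    for (x0, y0), (x1, y1) in zip(samples, samples[1:]):
        velocity = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 / dt
        labels.append("saccade" if velocity > saccade_threshold else "fixation")
    return labels
```

Smooth pursuits would need an additional, intermediate velocity band; the two-way split above is the simplest usable form.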
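Separating essential information from information that is rarely consulted, as described in the last point, can rest on the dwell time accumulated per region of interest. A minimal sketch, with hypothetical names and rectangular regions assumed for simplicity:

```python
def dwell_time_by_region(gaze_samples, regions, dt):
    """Accumulate gaze dwell time per rectangular region of interest.

    gaze_samples: (x, y) screen coordinates of successive gaze samples.
    regions: dict mapping a region name to (xmin, ymin, xmax, ymax).
    dt: sampling interval in seconds.
    Samples falling in no region are counted under 'outside'.
    """
    totals = {name: 0.0 for name in regions}
    totals["outside"] = 0.0
    for x, y in gaze_samples:
        for name, (xmin, ymin, xmax, ymax) in regions.items():
            if xmin <= x <= xmax and ymin <= y <= ymax:
                totals[name] += dt
                break
        else:
            totals["outside"] += dt
    return totals
```

Interface elements whose regions accumulate negligible dwell time across subjects are candidates for the "not particularly useful or redundant" category.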

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • User Interface Of Digital Computer (AREA)
  • Traffic Control Systems (AREA)
  • Navigation (AREA)
  • Combined Controls Of Internal Combustion Engines (AREA)
PCT/FR2006/001701 2005-07-25 2006-07-12 Procédé et système de modélisation d'une interface entre un utilisateur et son environnement à bord d'un véhicule WO2007012723A2 (fr)

Priority Applications (5)

Application Number Priority Date Filing Date Title
CA2615250A CA2615250C (en) 2005-07-25 2006-07-12 Method and system for modelling an interface between a user and the environment thereof in a motor vehicle
CN200680027036.2A CN101351763B (zh) 2005-07-25 2006-07-12 用户和他在交通工具上的环境之间的接口模型化的系统和方法
BRPI0615543-0A BRPI0615543A2 (pt) 2005-07-25 2006-07-12 processo de determinação de um modelo de uma interface entre um utilizador e seu ambiente a bordo de um veìculo, utilização do modelo e sistema de determinação de um modelo de uma interface entre um utilizador e seu ambiente a bordo de um veìculo
EP06778868A EP1915662A2 (fr) 2005-07-25 2006-07-12 Procédé et système de modélisation d'une interface entre un utilisateur et son environnement à bord d'un véhicule
JP2008523395A JP5032476B2 (ja) 2005-07-25 2006-07-12 乗り物に乗っている操縦者とその環境の間のインターフェイスをモデル化する方法とシステム

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FR0507894 2005-07-25
FR0507894A FR2888967B1 (fr) 2005-07-25 2005-07-25 Procede et systeme de modelisation d'une interface entre un utilisateur et son environnement a bord d'un vehicule

Publications (3)

Publication Number Publication Date
WO2007012723A2 true WO2007012723A2 (fr) 2007-02-01
WO2007012723A3 WO2007012723A3 (fr) 2007-03-22
WO2007012723A8 WO2007012723A8 (fr) 2007-04-26

Family

ID=36095781

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/FR2006/001701 WO2007012723A2 (fr) 2005-07-25 2006-07-12 Procédé et système de modélisation d'une interface entre un utilisateur et son environnement à bord d'un véhicule

Country Status (8)

Country Link
EP (1) EP1915662A2 (fr)
JP (1) JP5032476B2 (ja)
CN (1) CN101351763B (zh)
BR (1) BRPI0615543A2 (pt)
CA (1) CA2615250C (en)
FR (1) FR2888967B1 (fr)
RU (1) RU2423294C2 (ru)
WO (1) WO2007012723A2 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101830287A (zh) * 2010-04-30 2010-09-15 西安理工大学 驾驶员呼叫板装置

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102467596B (zh) * 2010-11-15 2016-09-21 商业对象软件有限公司 仪表板评估器
FR2989181B1 (fr) 2012-04-04 2015-03-27 Eurocopter France Procede et dispositif d'adaptation de l'interface homme-machine d'un aeronef selon le niveau de l'etat fonctionnel du pilote
RU2605230C1 (ru) * 2015-06-03 2016-12-20 Открытое акционерное общество "Ракетно-космическая корпорация "Энергия" имени С.П. Королева" Способ контроля готовности экипажа космического аппарата к нештатным ситуациям и система для его осуществления
US11151810B2 (en) * 2018-10-12 2021-10-19 Aurora Flight Sciences Corporation Adaptable vehicle monitoring system

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1525541A2 (en) * 2002-07-26 2005-04-27 Ron Everett Data management architecture associating generic data items using references
JP4122434B2 (ja) * 2003-08-20 2008-07-23 独立行政法人産業技術総合研究所 仮想ユーザを用いた操作性評価処理システム

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
DEGANI A ET AL: "Modes in human-automation interaction: initial observations about a modeling approach", SYSTEMS, MAN AND CYBERNETICS, 1995. INTELLIGENT SYSTEMS FOR THE 21ST CENTURY, IEEE INTERNATIONAL CONFERENCE ON, VANCOUVER, BC, CANADA, 22-25 OCT. 1995, NEW YORK, NY, USA, IEEE, US, vol. 4, 22 October 1995 (1995-10-22), pages 3443-3450, XP010194829, ISBN: 0-7803-2559-1 *
DURIC Z, GRAY W.D., HEISHMAN R, LI F, ROSENFELD A, SCHOELLES M.J., SCHUNN C, WECHSLER H: "Integrating perceptual and cognitive modeling for adaptive and intelligent human-computer interaction", PROCEEDINGS OF THE IEEE, vol. 90, July 2002 (2002-07), pages 1272-1289, XP011065033 *
GOLDBERG J H ET AL: "Computer interface evaluation using eye movements: methods and constructs", INTERNATIONAL JOURNAL OF INDUSTRIAL ERGONOMICS, ELSEVIER, vol. 24, no. 6, October 1999 (1999-10), pages 631-645, XP002376232, ISSN: 0169-8141 *
HAYASHI M, ED - INSTITUTE OF ELECTRICAL AND ELECTRONICS ENGINEERS: "Hidden Markov models to identify pilot instrument scanning and attention patterns", 2003 IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN AND CYBERNETICS. SMC'03. CONFERENCE PROCEEDINGS, WASHINGTON, DC, OCT. 5-8, 2003, NEW YORK, NY: IEEE, US, vol. 5 of 5, 5 October 2003 (2003-10-05), pages 2889-2896, XP010668058, ISBN: 0-7803-7952-7 *
PRITCHETT A R ET AL: "Examining air transportation safety issues through agent-based simulation incorporating human performance models", 21ST DASC. THE 21ST DIGITAL AVIONICS SYSTEMS CONFERENCE PROCEEDINGS, IRVINE, CA, OCT. 27-31, 2002, NEW YORK, NY: IEEE, US, vol. 1 of 2, conf. 21, 27 October 2002 (2002-10-27), pages 46-58, XP010616182, ISBN: 0-7803-7367-7 *


Also Published As

Publication number Publication date
CN101351763A (zh) 2009-01-21
FR2888967A1 (fr) 2007-01-26
CA2615250A1 (en) 2007-02-01
FR2888967B1 (fr) 2008-05-09
CN101351763B (zh) 2015-05-06
WO2007012723A8 (fr) 2007-04-26
JP5032476B2 (ja) 2012-09-26
WO2007012723A3 (fr) 2007-03-22
CA2615250C (en) 2014-02-25
RU2008106916A (ru) 2009-09-10
RU2423294C2 (ru) 2011-07-10
JP2009503664A (ja) 2009-01-29
EP1915662A2 (fr) 2008-04-30
BRPI0615543A2 (pt) 2011-05-17

Similar Documents

Publication Publication Date Title
JP7158876B2 (ja) 没入型シミュレータのためのシステム及び方法
EP0292381B1 (fr) Procédé d'élaboration d'un modèle statistique pour déterminer la charge de travail d'un pilote d'aéronef, modèle en résultant, dispositif pour la mise en oeuvre de ce procédé et applications du modèle
EP2647959B1 (fr) Procédé et dispositif d'adaptation de l'interface homme-machine d'un aéronef selon le niveau de l'état fonctionnel du pilote
Leiden et al. A review of human performance models for the prediction of human error
FR3026508A1 (fr) Aide contextuelle a la gestion du vol
WO2007012723A2 (fr) Procédé et système de modélisation d'une interface entre un utilisateur et son environnement à bord d'un véhicule
US20070156295A1 (en) Process and system of modeling of an interface between a user and his environment aboard a vehicle
CA2370693C (fr) Systeme et methode de pilotage d'un processus decisionnel lors de la poursuite d'un but globale dans un domaine d'application determine
EP1915591B1 (fr) Procede de traitement de donnees en vue de la determination de motifs visuels dans une scene visuelle
EP4292072A1 (fr) Dispositif et procede d'evaluation des competences
EP4078609A1 (fr) Procede et dispositif d'aide au suivi des etats cognitifs d'un individu
EP4016417A1 (fr) Système de détermination d'un état opérationnel d'un équipage d aéronef en fonction d'un plan de tâches adaptatif et procédé associé
Brock et al. Making ATIS accessible for pilots who are deaf or hard of hearing
Dixon Investigation of Mitigating Pilot Spatial Disorientation with a Computational Tool for Real-Time Triggering of Active Countermeasures
Dunn Remotely Piloted Aircraft: The impact of audiovisual feedback and workload on operator performance
US20240112562A1 (en) Systems and methods for increasing the safety of voice conversations between drivers and remote parties
Machado Human Interaction and Emerging Technologies (IHIET-AI 2024), Vol. 120, 2024, 114-122
Chittaluri Development and Evaluation of Cueing Symbology for Rotorcraft Operations in Degraded Visual Environment (DVE)
Andrade Using Situated-Action Networks to visualize complex learning
Young The lived-experience of inflight automation failures: A qualitative descriptive phenomenological study
Causse et al. The 1st International Conference on Cognitive Aircraft Systems–ICCAS 2020
Durand et al. ISAE-SUPAERO Conference paper
Salehi Mental models of hazards and the issue of trust in automation
von Thaden Information behavior among commercial aviation CFIT accident flight crews: transcript analyses
FR3124616A1 (fr) Procede et dispositif d'analyse predictive du comportement d'un operateur en interaction avec un systeme complexe

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 200680027036.2

Country of ref document: CN

121 Ep: the epo has been informed by wipo that ep was designated in this application
WWE Wipo information: entry into national phase

Ref document number: 2615250

Country of ref document: CA

WWE Wipo information: entry into national phase

Ref document number: 2008523395

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

WWW Wipo information: withdrawn in national office

Ref document number: DE

WWE Wipo information: entry into national phase

Ref document number: 2006778868

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 2008106916

Country of ref document: RU

WWP Wipo information: published in national office

Ref document number: 2006778868

Country of ref document: EP

ENP Entry into the national phase

Ref document number: PI0615543

Country of ref document: BR

Kind code of ref document: A2

Effective date: 20080122