WO2013154681A1 - Identifying and configuring controls on a control panel - Google Patents

Identifying and configuring controls on a control panel Download PDF

Info

Publication number
WO2013154681A1
WO2013154681A1 (PCT/US2013/026733)
Authority
WO
WIPO (PCT)
Prior art keywords
control
configuration
image data
display
location
Prior art date
Legal status
Ceased
Application number
PCT/US2013/026733
Other languages
English (en)
French (fr)
Inventor
Thomas O. SMAILUS
Monica C. LAFEVER ROSMAN
Current Assignee
Boeing Co
Original Assignee
Boeing Co
Priority date
Filing date
Publication date
Application filed by Boeing Co filed Critical Boeing Co
Priority to JP2015504553A priority Critical patent/JP6320369B2/ja
Priority to CA2860805A priority patent/CA2860805C/en
Priority to KR1020147020643A priority patent/KR101973831B1/ko
Priority to EP13710133.3A priority patent/EP2836797B1/en
Priority to CN201380018811.8A priority patent/CN104246437B/zh
Priority to AU2013246454A priority patent/AU2013246454B2/en
Priority to SG11201406402UA priority patent/SG11201406402UA/en
Publication of WO2013154681A1 publication Critical patent/WO2013154681A1/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Links

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C23/00Combined instruments indicating more than one navigational value, e.g. for aircraft; Combined measuring devices for measuring two or more variables of movement, e.g. distance, speed or acceleration
    • G01C23/005Flight directors
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C23/00Combined instruments indicating more than one navigational value, e.g. for aircraft; Combined measuring devices for measuring two or more variables of movement, e.g. distance, speed or acceleration
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64DEQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D43/00Arrangements or adaptations of instruments
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/06Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64DEQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D31/00Power plant control systems; Arrangement of power plant control systems in aircraft
    • B64D31/02Initiating means
    • B64D31/04Initiating means actuated personally

Definitions

  • In another embodiment, an apparatus includes a camera, a display, a processor, and a memory.
  • the memory includes instructions that, when executed by the processor, cause the processor to receive or process image data associated with a control panel from the camera.
  • the memory includes instructions that, when executed by the processor, cause the processor to present one or more images at the display. The one or more images are generated based on the image data.
  • the memory includes instructions that, when executed by the processor, cause the processor to determine a location of a first control of the control panel based on the image data and based on control settings data.
  • the memory includes instructions that, when executed by the processor, cause the processor to provide an indication of the location of the first control at the display.
  • the memory includes instructions that, when executed by the processor, cause the processor to provide an indication of a desired configuration of the first control at the display.
  • a non-transitory computer-readable storage medium includes instructions that, when executed by a processor, cause the processor to receive or process image data from a camera associated with a control panel.
  • the non-transitory computer-readable storage medium further includes instructions that, when executed by the processor, cause the processor to present one or more images at a display. The one or more images are generated based on the image data.
  • the non-transitory computer-readable storage medium further includes instructions that, when executed by the processor, cause the processor to determine a location of a first control of the control panel based on the image data and based on control settings data.
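
Taken together, the steps above describe a capture–present–locate–indicate flow. The following is a minimal sketch of that flow, assuming a hypothetical `ControlTask` record and text placeholders in place of real camera frames and display overlays; it illustrates the sequence of operations only, not the patented implementation.

```python
from dataclasses import dataclass

@dataclass
class ControlTask:
    name: str        # e.g., "knob control" (illustrative only)
    location: tuple  # (x, y) position taken from the control settings data
    desired: str     # desired configuration, e.g., "second configuration"

def configure_step(image_data, task, display_lines):
    """One pass of the flow described above: present the image, indicate the
    control's location, and indicate its desired configuration (all rendered
    here as text placeholders)."""
    display_lines.append(f"image received: {len(image_data)} bytes")   # present image data
    display_lines.append(f"highlight control at {task.location}")      # location indication
    display_lines.append(f"set '{task.name}' to '{task.desired}'")     # desired configuration
    return display_lines

# Usage with hypothetical values:
frame = b"\x00" * 1024  # stands in for image data from the camera
task = ControlTask("knob control", (120, 80), "second configuration")
for line in configure_step(frame, task, []):
    print(line)
```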
  • FIG. 1 is a block diagram of an illustrative embodiment of a system to identify and configure a control panel
  • FIGs. 6A and 6B are diagrams of another illustrative embodiment of a system to identify and configure a control panel
  • FIG. 7 is a diagram of another illustrative embodiment of a system to identify and configure a control panel
  • FIG. 8 is a flow chart of a particular embodiment of a method of identifying and configuring a control panel
  • the plurality of controls may include thrust levers 110, displays 116-120, 128, 130, 136, 140, 142, 146, gauges 132 and 154-158, a knob control 134, switches 148-152, switch boxes 112, 144, a slide control 124, and/or indicators 122, 126, and 138.
  • the switch box 112 may include a plurality of switches (e.g., switch 114).
  • the switch box 144 may include a plurality of switches.
  • the device 102 may store control settings data associated with one or more control panels (e.g., the control panels 108 A, 108B).
  • the device 102 may receive the control settings data from a network, from data storage at the control panel, from an external device (e.g., a universal serial bus (USB) drive), or from another computing device.
  • the control settings data may be associated with an electronic checklist.
  • the electronic checklist may include one or more tasks to configure the control panel for a particular operation (e.g., a pre-flight checklist, an aircraft landing procedure checklist, a checklist associated with tasks to be completed in response to detection of another condition, etc.).
  • the control settings data may include control data that identifies a layout of the one or more controls of the control panel.
  • the device 102 may determine a location of the first control based at least in part on the directional data associated with the first control. For example, the device 102 may receive the image data from the camera 106. The device 102 may identify the first control (e.g., the gauges 132) based on the control data that identifies the layout of the one or more controls of the control panel, electronic checklist data, or any combination thereof. The device 102 may identify one or more reference controls based on the control data and the image data. The one or more reference controls may correspond to one or more controls within a field of view of the camera 106. The device 102 may determine a location of the first control relative to the one or more reference controls based on the directional data associated with the first control, based on directional data associated with the one or more reference controls, or any combination thereof.
  • the device 102 may determine that a first reference control (e.g., the display 120) is within the field of view of the camera 106.
  • the device 102 may determine that the first control (e.g., the gauges 132) is located above the first reference control based on the directional data associated with the first control.
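
A minimal sketch of this relative-location step is shown below: given a reference control that is in the field of view and panel-layout ("directional") coordinates for both controls, it returns a coarse direction toward the target. The coordinate convention and values are assumptions for illustration, not taken from the patent.

```python
def direction_to_control(target_panel_xy, reference_panel_xy):
    """Coarse direction from a reference control (in the field of view) to a
    target control, using panel-layout coordinates (x grows right, y grows up)."""
    dx = target_panel_xy[0] - reference_panel_xy[0]
    dy = target_panel_xy[1] - reference_panel_xy[1]
    horizontal = "right" if dx > 0 else "left" if dx < 0 else ""
    vertical = "above" if dy > 0 else "below" if dy < 0 else ""
    return (vertical + " " + horizontal).strip() or "in view"

# Example: the gauges sit above the display that is currently in view.
print(direction_to_control(target_panel_xy=(4.0, 7.5), reference_panel_xy=(4.0, 2.0)))  # "above"
```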
  • the device 102 may present one or more graphical overlays at the display 104.
  • the one or more graphical overlays may indicate a direction to pan the camera 106 such that the first control is within the field of view of the camera 106.
  • the device 102 may determine a location of another particular control based on the directional data.
  • the device 102 may be configured to present one or more tasks included in the electronic checklist at the display 104.
  • the device 102 may use image data received from the camera 106 to guide a user of the device 102 through one or more of the configuration tasks associated with the electronic checklist.
  • the device 102 may receive image data from the camera 106.
  • the image data may be associated with at least a portion of a control panel (e.g., the control panels 108 A, 108B) that is within a field of view of the camera 106.
  • the device 102 may present the image data, or one or more pictures representing the image data, at the display 104 as described with reference to FIGs. 2-7.
  • the device 102 may identify a first control of the control panel based on the control settings data and determine the location of the first control based on the image data received from the camera 106.
  • the device 102 may present an indication of the location of the first control on the display 104. If the location of the first control is in a current field of view of the camera 106, the device 102 may indicate the location of the first control by overlaying one or more symbols (e.g., a box) over the image data, or a portion of the image data presented at the display 104. If the location of the first control is not in the current field of view of the camera 106, the device 102 may determine the direction from the current field of view to the location of the first control based on the control settings data.
  • the device 102 may determine the direction from the current field of view to the location of the first control based on the control settings data and data provided via one or more sensors of the device 102 (e.g., one or more inertial sensors, one or more orientation sensors, etc.).
  • the device 102 may indicate the direction from the current field of view to the location of the first control by overlaying one or more symbols (e.g., an arrow) over the image data presented at the display 104.
  • a user may move the device 102 such that the field of view of the camera 106 travels in the indicated direction.
  • the device 102 may use the control settings data to determine the current orientation of the device 102 and/or the camera 106 based on a comparison of the first location and the second location.
  • the device 102 may update the direction from the current field of view to the location of the first control based on the current orientation of the device 102 and/or the camera 106.
  • the orientation of the camera 106 may be related to an orientation of the device 102. For example, when the device 102 is moved from a first position to a second position that is to the left of the first position, the field of view of the camera 106 may move to the left.
  • an object that is within the field of view of the camera 106 when the device 102 is in the first position may be to the right of, and outside of, the field of view of the camera 106 when the device 102 is in the second position.
  • the object that is within the field of view of the camera 106 when the device 102 is in the first position may be within the field of the camera 106, but positioned further to the right of the field of view as the device 102 is moved to the second position.
  • the orientation of the camera 106 may not directly correspond to the orientation of the device 102. For example, when the device 102 is rotated to the right as the device 102 is moved from the first position to the second position, the field of view of the camera 106 may correspond to the field of view of the camera 106 when the device 102 was in the first position. Thus, the field of view of the camera 106 may not substantially change when the device 102 is moved.
  • when the device 102 is rotated to the right as the device 102 is moved from the first position to the second position, the object may remain located along the leftmost edge of the field of view of the camera 106, but one or more other objects (e.g., one or more other controls or portions of the control panel) that were located to the right of a rightmost edge of the field of view when the device 102 was in the first position may be included within the field of view of the camera 106 when the device 102 is in the second position.
  • the device 102 may receive first sensor data indicating that the device 102 and/or the camera 106 is oriented in a first position (i.e., the orientation of the device 102 when the image data was received). As the device 102 is moved (e.g., in the direction of the first control as indicated at the display 104), the device 102 may receive additional sensor data.
  • the additional sensor data may indicate that the device 102 and/or the camera 106 is oriented in a second position (e.g., the device 102 and/or the camera 106 has been moved to the left, moved to the right, moved up, moved down, rotated to the left, rotated to the right, rotated up, rotated down, or a combination thereof).
  • the device 102 may calculate a current orientation of the device 102 and/or the orientation of the camera 106 based on the additional sensor data.
  • the device 102 may determine whether the first control is located in the field of view of the camera 106. When the device 102 determines that the first control is located in the field of view of the camera 106, the device 102 may indicate the location of the first control by overlaying one or more symbols (e.g., a box) over the image data, or a portion of the image data, at the display 104. As the device 102 is moved, the direction from the current field of view to the first control may be updated at the display 104.
  • the device 102 may update the direction from the current field of view to the first control at the display 104 based on the sensor data.
  • the one or more symbols may be presented at the display 104 to indicate that, despite the obstruction, the field of view of the camera 106 is directed at the first control (i.e., the first control is located within the current field of view of the camera 106).
  • the one or more symbols may indicate a distance of the first control relative to the current field of view of the camera 106. For example, a longer arrow may indicate that the first control is a first distance from the current field of view and, as the field of view moves closer to the first control, the arrow may get shorter.
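
One way to realize the arrow overlay just described is sketched below: the remaining angular offset between the camera's sensor-derived orientation and the bearing of the target control sets both the arrow's direction and its length, which shrinks as the field of view closes on the control. The angle thresholds and pixel scale are hypothetical choices, not values from the patent.

```python
import math

def arrow_for_target(current_yaw_deg, current_pitch_deg,
                     target_yaw_deg, target_pitch_deg,
                     max_len_px=120, fov_half_angle_deg=30.0):
    """Return a direction/length for the overlay arrow, or None when the target
    control is already within the current field of view (draw a box instead)."""
    dyaw = target_yaw_deg - current_yaw_deg        # positive: target is to the right
    dpitch = target_pitch_deg - current_pitch_deg  # positive: target is above
    offset = math.hypot(dyaw, dpitch)
    if offset <= fov_half_angle_deg:
        return None
    return {
        "direction_deg": math.degrees(math.atan2(dpitch, dyaw)),
        "length_px": round(min(max_len_px, max_len_px * offset / 90.0)),
    }

# Device has panned 10 degrees right; control settings place the target 50 degrees further right.
print(arrow_for_target(10.0, 0.0, 60.0, 5.0))
```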
  • the device 102 may also be configured to present at least one task of the electronic checklist at the display 104 as described with reference to FIGs. 3-7.
  • the at least one task is presented as one or more graphical overlays at the display 104.
  • the location of the graphical overlay(s) may be dependent upon the position of the control to be configured. For example, if the control to be configured is located at the top of the display 104, the graphical overlay(s) indicating the at least one task to be executed may be presented at the display below the control to be configured. If the control to be configured is located at the bottom of the display 104, the graphical overlay(s) indicating the at least one task to be executed may be presented at the display above the control to be configured.
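
The placement rule in the preceding item can be expressed directly; the sketch below assumes display pixel coordinates with y growing downward and hypothetical overlay/margin sizes.

```python
def overlay_anchor(control_box, display_height, overlay_height=40, margin=8):
    """Place the task overlay below a control that sits in the upper half of the
    display and above a control that sits in the lower half, as described above.
    control_box is (x, y, width, height) in display pixels."""
    x, y, w, h = control_box
    if y < display_height / 2:
        return (x, y + h + margin)           # control near the top -> overlay below it
    return (x, y - overlay_height - margin)  # control near the bottom -> overlay above it

print(overlay_anchor((200, 30, 80, 40), display_height=480))   # overlay below the control
print(overlay_anchor((200, 400, 80, 40), display_height=480))  # overlay above the control
```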
  • the device 102 may present an icon (not shown) at the display 104.
  • a selection of the icon may indicate that a user of the device 102 has verified the configuration of the first control.
  • the device 102 may determine whether another control is to be configured.
  • the user verification step may be omitted.
  • the device 102 may automatically present an indication of a location or direction of a second control of the control panel and a next task of the electronic checklist at the display 104.
  • the device 102 may present an indication (e.g., a warning message or other indicator) that the first control is not configured according to the first configuration.
  • the user may then reconfigure the first control and provide a second input indicating that the configuration of the first control has been modified. This process may continue until the device 102 determines that the first control is configured according to the first configuration.
  • the device may automatically present a next task of the electronic checklist in response to determining that the first control is configured according to the first configuration.
  • a first device (such as the device 102) and a second device (e.g., a device similar to the device 102) may be networked together (not shown) and used to configure the control panel.
  • the first device and the second device may apportion the tasks of the electronic checklist according to the locations of the controls to be configured. For example, controls located on one side of the control panel may be assigned to the first device and controls located on the other side of the control panel may be assigned to the second device.
  • Controls to be configured that are located in the middle of the control panel may be assigned to one of the devices or a first portion of the controls located in the middle of the control panel may be assigned to the first device and a second portion of the controls located in the middle of the control panel may be assigned to the second device.
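
A simple way to apportion tasks between two networked devices along these lines is sketched below; the panel coordinates, the three-band split, and the rule for the middle band are illustrative assumptions rather than the patent's method.

```python
def apportion_tasks(tasks, panel_width, middle_to_first=True):
    """Split checklist tasks between a first and a second device by the panel
    x-coordinate of each control: left band to the first device, right band to
    the second, and the middle band to one device (here the first by default)."""
    first, second = [], []
    left_edge, right_edge = panel_width / 3, 2 * panel_width / 3
    for name, x in tasks:
        if x < left_edge:
            first.append(name)
        elif x > right_edge:
            second.append(name)
        else:
            (first if middle_to_first else second).append(name)
    return first, second

tasks = [("thrust levers", 0.20), ("knob control 134", 0.50), ("slide control 124", 0.90)]
print(apportion_tasks(tasks, panel_width=1.0))
```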
  • the devices may communicate with each other during execution of the electronic checklist via a wireless communication link (e.g., a Bluetooth link or an 802.11x link).
  • the device(s) may provide feedback information to the control panel and other systems that indicates the status of the configuration of the control panel based on the execution of the electronic checklist.
  • the control panel may be configured to communicate with the device(s) via a wired or wireless link.
  • the control panel may store and/or retransmit the feedback information (i.e., the status of the configuration of the control panel).
  • the control panel may include or may be coupled to a computer device to store the feedback information for subsequent transmittal to a server (e.g., at an airport terminal of an airline company that operates the aircraft).
  • By using image data received from the camera 106, the device 102 is able to indicate a location and configuration of each control on the control panel in accordance with control settings data (e.g., settings associated with the electronic checklist). Thus, the device 102 is able to verify that the controls are properly configured in accordance with the control settings data (e.g., in compliance with the electronic checklist) based on the image data received from the camera 106.
  • the device 102 provides visual assistance to users configuring a control panel and further provides verification of the configuration of the control panel, resulting in a reduced likelihood of skipped configuration steps and improperly configured controls. When multiple devices 102 are used, the amount of time required to configure the control panel may be reduced, thereby increasing control panel configuration efficiency.
  • Referring to FIG. 2, an illustrative embodiment of the system 100 for identifying and configuring a control panel is disclosed. Elements of FIG. 2 that correspond to elements of FIG. 1 are designated with the same number, and it should be understood that elements in FIG. 2 may operate as described with reference to FIG. 1. Further, additional operational features of the elements of FIG. 1 are described below.
  • the device 102 may present a menu (not shown) from which the user may indicate a particular control panel to be configured prior to presenting the electronic checklist.
  • the device 102 may store control settings data for a plurality of control panels (e.g., multiple aircraft cockpit control panels) and each of the plurality of control panels may be associated with one or more electronic checklists (e.g., a pre-flight checklist, a landing procedure checklist, another checklist, etc).
  • the user may indicate a desired electronic checklist by identifying the particular configurable equipment (e.g., an aircraft, factory machinery, a vessel, etc.) and electronic checklist from the menu.
  • the user may provide input to select or otherwise indicate that the particular configurable equipment to be configured is an aircraft.
  • the user may further provide input indicating that the aircraft is a first type of aircraft and that the user would like to configure a particular control panel of the aircraft according to a first electronic checklist (e.g., a pre-flight checklist).
  • the device 102 may determine the particular type of checklist and control panel automatically by communicating with the control panel. For example, the device 102 may communicate with the control panel via a wireless communication link (e.g., Bluetooth or 802.11x) (not shown). In another embodiment, the device 102 may be configured to identify the particular type of checklist and control panel based on information embedded in a bar code (not shown). For example, a bar code tag may be placed on the control panel, or at a location proximate to the control panel, and the device 102 may analyze the bar code using bar code image data received from the camera 106.
  • the device 102 may be configured to identify the particular type of checklist and control panel based on information received from a radio frequency identification (RFID) tag (not shown) attached to the control panel or attached to another location of the configurable equipment (e.g., an aircraft).
  • the control panel may transmit information that identifies the particular control panel to be configured to the device 102 via the wireless communication link.
  • the device 102 may download one or more checklists associated with the particular control panel via the wireless communication link.
  • the device 102 may determine the appropriate checklist (e.g., the pre-flight checklist) based on information received from the control panel via the wireless communication link or from a source external to the control panel.
  • the device 102 may identify particular configurable equipment to be configured and access an appropriate checklist associated with configuring the particular configurable equipment via a communication link.
  • the communication link may be one of an electrical communication link, an optical communication link, a radio frequency communication link, or a combination thereof.
  • in response to detection of a particular condition (e.g., turbulence), the control panel may transmit information to the device 102 indicating that one or more controls of the control panel should be configured. In a particular embodiment, the information may identify the particular condition and the device 102 may determine the one or more controls to be configured based on information stored at the device 102. Determining the one or more controls to be configured may include selecting a particular electronic checklist from among a plurality of electronic checklists stored at the device 102. The particular selected checklist may be identified based on the information that identifies the particular condition. In another particular embodiment, the device 102 may receive an electronic checklist from the control panel along with the information that identifies the particular condition.
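
At its simplest, selecting the checklist from the reported condition is a lookup against the checklists stored at the device. The condition names and task lists below are hypothetical placeholders used only to illustrate the selection step.

```python
# Hypothetical stored checklists keyed by condition identifier.
STORED_CHECKLISTS = {
    "pre-flight": ["verify knob control", "verify slide control"],
    "landing": ["verify gear setting", "verify flap setting"],
    "turbulence": ["verify seat-belt sign switch", "verify autothrottle setting"],
}

def select_checklist(condition_id, default="pre-flight"):
    """Pick the electronic checklist for the condition reported by the control panel,
    falling back to a default when the condition is not recognized."""
    return STORED_CHECKLISTS.get(condition_id, STORED_CHECKLISTS[default])

print(select_checklist("turbulence"))
```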
  • the authentication data may include a username and password, a fingerprint scan, an iris scan, a voice utterance, a scan of a face, a scan or swipe of an identification badge, information received from an RFID tag (embedded in the identification badge), a public/private key authentication, exchange of authentication certificates, or any combination thereof.
  • the server may transmit the control settings data and the one or more checklists to the device 102 via the network or the server may transmit a key to unlock data or applications stored at the device 102.
  • the control settings data and/or the checklist(s) may be sent via an unsecured network connection.
  • the control settings data and/or the checklist(s) may be sent via a secured or encrypted network connection.
  • the device 102 may transmit information to the server associated with the execution of the electronic checklist.
  • the information may indicate that the checklist was completed successfully, the checklist was not completed, errors occurred during execution of the electronic checklist, etc.
  • the server may store this information for subsequent analysis.
  • the electronic checklist or the control panel design may be modified based on the analysis.
  • in response to determining that the tasks of the electronic checklist have been completed, the device 102 may send information to the server that indicates a resulting configuration of the control panel. For example, a user of the device 102 may configure a control panel of an aircraft according to a pre-flight checklist.
  • the device 102 may determine that controls associated with the tasks of the electronic checklist have been configured according to the electronic checklist and may transmit a confirmation message to the server via a network.
  • the confirmation message may indicate that the controls were configured according to the pre-flight checklist.
  • the device 102 may store the information for subsequent transmission to the server.
  • a user of the device 102 may configure the control panel of the aircraft according to a landing procedure checklist. After the landing procedure has been completed and the aircraft has arrived at its destination (e.g., a terminal or service hangar), the device 102 may connect to the network and transmit the confirmation message to the server.
  • the server is associated with an airline company that operates the aircraft.
  • the server is associated with a regulatory agency responsible for establishing guidelines for operating the aircraft.
  • the device 102 may receive an override command from a user to skip or otherwise alter a setting associated with a particular configuration task of the electronic checklist. Upon determining that the tasks of the electronic checklist have been either completed or overridden, the device 102 may send information (e.g., the confirmation message) to the server that indicates a result of each task (i.e., completed or overridden).
  • the device 102 may determine one or more controls of a control panel to be configured according to control settings data associated with the control panel.
  • the device 102 may enable a user to select a particular electronic checklist when configuring one or more controls of the control panel. Additionally, the device 102 may automatically download and store multiple electronic checklists associated with the control panel and communicate with the control panel to identify and select a particular electronic checklist to be completed.
  • the device 102 may determine whether at least one calibration feature is detected based on the image data received from the camera 106 and may automatically recalibrate the one or more sensors when the at least one calibration feature is detected based on the received image data.
  • the device 102 may determine a location of one or more controls relative to the calibration feature.
  • the one or more controls may be associated with controls to be configured based on control settings data (e.g., the selected electronic checklist).
  • the calibration step for a particular control panel is executed during each use of the device 102.
  • the calibration step may be executed prior to a first use of the device 102 to configure the particular control panel and the calibration step is not executed prior to a subsequent use of the device 102 to configure the particular control panel. For example, if the user calibrates the device 102 during configuration of a control panel according to a pre-flight checklist, the device 102 does not need to be calibrated again when configuring the control panel according to a landing procedure checklist.
  • control settings data is ordered according to a predetermined order and the device 102 presents the electronic checklist steps according to the predetermined order.
  • the device 102 may identify controls of the control panel to be configured based on the control settings data and may further determine an order in which to present the electronic checklist steps at the display 104.
  • the control settings data may indicate that a configuration of a first control is dependent upon a configuration of a second control.
  • the device 102 may account for dependency relationships, such as the dependency of the configuration of the first control on the configuration of the second control, when ordering the electronic checklist steps.
  • a particular electronic checklist task may be optional.
  • the device 102 may order the steps of the electronic checklist based on whether a particular step or task is optional or required.
  • a first configuration task may be optional and a second configuration task may be required
  • the device 102 may order the electronic checklist tasks such that all required checklist tasks are inserted into the electronic checklist before all optional electronic checklist tasks.
  • the checklist tasks may be presented according to a predetermined order and any optional checklist tasks may be distinguished from required checklist tasks using a color scheme (e.g., a first color for required tasks and a second color for optional tasks), using symbols (e.g., a first symbol indicates a required task and a second symbol indicates an optional task), or a combination of colors and symbols.
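
The ordering rules described above (dependent controls configured after their prerequisites, required tasks ahead of optional ones) can be sketched as a topological sort followed by a stable re-sort on the optional flag. Task names and the dependency graph below are made-up examples, and the sketch assumes no required task depends on an optional one.

```python
from graphlib import TopologicalSorter  # Python 3.9+

def order_tasks(tasks, depends_on):
    """Order checklist tasks so that prerequisites come first and required tasks
    precede optional ones. tasks maps name -> {"optional": bool}; depends_on maps
    name -> set of prerequisite names."""
    topo = list(TopologicalSorter(depends_on).static_order())
    ordered = topo + [t for t in tasks if t not in topo]  # append tasks with no recorded edges
    return sorted(ordered, key=lambda t: (tasks[t]["optional"], ordered.index(t)))

tasks = {
    "set fuel pump switches": {"optional": False},
    "set knob control 134": {"optional": False},
    "dim panel lights": {"optional": True},
}
depends_on = {"set knob control 134": {"set fuel pump switches"}}
print(order_tasks(tasks, depends_on))
```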
  • the device 102 may present an override option (not shown) at the display 104. The override option may be used by the user to skip a particular checklist task or to proceed to the next configuration task/control even though the current configuration task/control has not been completed, has not been verified, or has been completed in a manner inconsistent with the control settings data.
  • the one or more calibration features of the control panel are identified based on the control settings data.
  • the one or more calibration features are associated with detectible features (e.g., visual markers or cues, bar codes, orientation markers, etc.) located on the control panel that may be used by the device 102 to identify a location of a unique calibration feature.
  • the location of the unique calibration feature may be further identified based on the control settings data.
  • information identifying the unique location may be encoded in at least one of the one or more calibration features.
  • a particular calibration feature may include a two-dimensional bar code that encodes longer data sequences that identify the location of the particular calibration feature.
  • control settings data identifies each control of the control panel and any control placed in the field of view 202 of the camera 106 may be used by the device 102 to calibrate the device 102 to the control panel.
  • calibrating the device 102 results in an identification of a first step of the electronic checklist and identification of a location of a control to be configured during the first step.
  • the device 102 may update the calibration data in response to detecting a calibration feature during configuration of the control panel according to an electronic checklist. For example, at an arbitrary point in a checklist process, the device 102 may detect a calibration feature and update the calibration data based on the detected calibration feature.
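
Once a calibration feature has been detected in the image data, the control settings data can be used to project every control of the layout into expected image positions relative to that feature. The sketch below assumes a single uniform pixel scale and ignores perspective and axis flips; the layout values are illustrative.

```python
def controls_in_image(feature_image_xy, feature_panel_xy, scale_px_per_unit, control_layout):
    """Map panel-layout coordinates from the control settings data into expected
    image coordinates, anchored on a detected calibration feature."""
    fx_img, fy_img = feature_image_xy
    fx_pan, fy_pan = feature_panel_xy
    return {
        name: (fx_img + (x - fx_pan) * scale_px_per_unit,
               fy_img + (y - fy_pan) * scale_px_per_unit)
        for name, (x, y) in control_layout.items()
    }

layout = {"knob control 134": (3.0, 1.0), "slide control 124": (5.5, 0.5)}
print(controls_in_image(feature_image_xy=(320, 240), feature_panel_xy=(2.0, 1.0),
                        scale_px_per_unit=40.0, control_layout=layout))
```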
  • the device 102 may indicate that the first configuration step (e.g., step 302) is complete and provide an indication of a next configuration step (e.g., step 402).
  • the device 102 may indicate that an electronic checklist task is complete by modifying a color of the one or more symbols presented at the display 104.
  • the box 304 may be a first color (e.g., a red box) indicating that the step 302 is not complete.
  • the color of the box 304 may be changed to a second color (e.g., a green box) in response to detecting that the step 302 is complete.
  • the device 102 may indicate the next configuration step at the display 104 as a graphical overlay.
  • the graphical overlay includes an icon 406.
  • the icon 406 may be an image associated with the control to be configured in the next configuration step. For example, referring to FIG. 4B, a larger view of the icon 406 is shown.
  • the icon 406 includes an image of a knob control 134.
  • the knob control 134 may be configured to one of a first configuration 412, a second configuration 414, and a third configuration 416 by rotating the knob control 134 until the line on the knob control 134 lines up with one of the configuration indicators.
  • the device 102 may determine the location of the control associated with the next configuration step (e.g., step 402). As shown in FIG. 4A, the device 102 may determine that the control associated with the next configuration step is the knob control 134. The device 102 may determine that the knob control 134 is located in a target field of view 408 that is not within the current field of view 202. In response to determining that the target field of view 408 is not within the current field of view 202, the device 102 may present one or more symbols (e.g., arrow 404) as graphical overlays at the display 104. The one or more symbols indicate a direction of the target field of view 408 (and the control associated with the next configuration step (e.g., step 402)).
  • Referring to FIGs. 5A and 5B, another illustrative embodiment of the system 100 for identifying and configuring a control panel is disclosed.
  • Elements of FIGs. 5A and 5B that correspond to elements of FIGs. 1-4 are designated with the same number, and it should be understood that elements in FIGs. 5A and 5B may operate as described with reference to FIGs. 1-4. Further, additional operational features of the elements of FIGs. 1-4 are described below.
  • the device 102 may provide an indication of the location of the control (e.g., the knob control 134) associated with the next configuration step (e.g., step 402 of FIG. 4A). For example, the device 102 may highlight the location of the knob control 134 using one or more symbols (e.g., a box 504) as graphical overlays at the display 104.
  • the updated next configuration step may indicate the configuration identified by the control settings data via an icon 506.
  • For example, referring to FIG. 5B, a larger view of the icon 506 is shown.
  • the icon 506 includes an image of the knob control 134.
  • the knob control 134 may be configured to one of a first configuration 412, a second configuration 414, and a third configuration 416.
  • the icon 506 also includes an arrow 510 that indicates a direction to rotate the knob control 134 and also identifies the configuration of the control identified by the control settings data.
  • FIG. 5A shows the current configuration of the knob control 134 as presented at display 104.
  • the knob control 134 is highlighted by box 504 and is currently configured to the first configuration 412.
  • the configuration identified by the control settings data is indicated by the configuration of the knob control 134 as shown at the icon 506.
  • the icon 506 indicates that the knob control 134 should be configured to the second configuration 414.
  • the icon 506 further indicates that to modify the current configuration of the knob control 134 to correspond to the configuration identified by the control settings data (i.e., the second configuration 414), the knob control 134 should be turned clockwise until the current configuration of the knob control 134 as presented at the display 104 corresponds to the second configuration 414.
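
The clockwise/counter-clockwise hint described for the icon arrow amounts to comparing the knob's current dial angle with the dial angle of the desired configuration and choosing the shorter rotation. The dial angles below are hypothetical; only the resulting directions mirror the FIG. 5A and FIG. 6A examples.

```python
def rotation_hint(current_deg, desired_deg):
    """Return the shorter rotation (clockwise or counter-clockwise, in degrees)
    from the knob's current dial position to the desired configuration.
    Angles are measured clockwise from the top of the knob."""
    delta = (desired_deg - current_deg) % 360
    if delta == 0:
        return "already in the desired configuration"
    if delta <= 180:
        return f"turn clockwise {delta} degrees"
    return f"turn counter-clockwise {360 - delta} degrees"

print(rotation_hint(current_deg=315, desired_deg=0))  # first -> second configuration: clockwise
print(rotation_hint(current_deg=45, desired_deg=0))   # third -> second configuration: counter-clockwise
```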
  • Referring to FIGs. 6A and 6B, another illustrative embodiment of the system 100 for identifying and configuring a control panel is disclosed. Elements of FIGs. 6A and 6B that correspond to elements of FIGs. 1-5 are designated with the same number, and it should be understood that elements in FIGs. 6A and 6B may operate as described with reference to FIGs. 1-5. Further, additional operational features of the elements of FIGs. 1-5 are described below.
  • the current field of view of the camera 106 is the field of view 508 and the device 102 may receive image data corresponding to the field of view 508 and present the image data at the display 104.
  • the configuration of the knob control 134 has been modified.
  • the device 102 may detect that the configuration of the knob control 134 has been modified and determine whether the current configuration of the knob control 134 corresponds to the configuration identified by the control settings data (e.g., the second configuration 414).
  • the modified configuration of the knob control 134 does not correspond to the configuration identified by the control settings data.
  • the control settings data indicates that the knob control 134 should be configured to the second configuration 414 rather than to the third configuration 416.
  • the device 102 may modify or otherwise update the display 104 to indicate that the control associated with the step 502 is not configured properly.
  • the device 102 may update the display 104 by replacing the icon 506 with another icon (e.g., an icon 602).
  • Referring to FIG. 6B, a larger view of the icon 602 is shown.
  • the icon 602 includes an image of the knob control 134.
  • the knob control 134 may be configured to one of a first configuration 412, a second configuration 414, and a third configuration 416 by rotating the knob control 134 until the line on the knob control 134 lines up with one of the configuration indicators.
  • the icon 602 includes an arrow 604 that indicates a direction to rotate the knob control 134 and also identifies the configuration of the control identified by the control settings data.
  • FIG. 6A shows the current configuration of the knob control 134 as presented at display 104.
  • the knob control 134 is highlighted by the box 504 and is currently configured to the third configuration 416.
  • the configuration identified by the control settings data is indicated by the configuration of the knob control 134 as shown at the icon 602.
  • the icon 602 indicates that the knob control 134 should be configured to the second configuration 414. The icon 602 further indicates that to modify the current configuration of the knob control 134 to correspond to the configuration identified by the control settings data (i.e., the second configuration 414), the knob control 134 should be turned counter-clockwise until the current configuration of the knob control 134 corresponds to the second configuration 414.
  • Referring to FIG. 7, another illustrative embodiment of the system 100 for identifying and configuring a control panel is disclosed. Elements of FIG. 7 that correspond to elements of FIGs. 1-6 are designated with the same number, and it should be understood that elements in FIG. 7 may operate as described with reference to FIGs. 1-6. Further, additional operational features of the elements of FIGs. 1-6 are described below.
  • the device 102 may receive image data from the camera 106 that corresponds to the field of view 508 and present the image data at the display 104.
  • the configuration of the knob control 134 has been modified from its previous configuration (i.e., the configuration of the knob control 134 in FIG. 6).
  • the device 102 may detect that the configuration of the knob control 134 has been modified and determine whether the current configuration of the knob control 134 corresponds to the configuration identified by the control settings data (e.g., the second configuration 414).
  • the modified configuration of the knob control 134 corresponds to the configuration identified by the control settings data (i.e., the second configuration 414).
  • the device 102 may update the display 104 to indicate that the configuration step 502 is complete and provide an indication of a location of a control associated with another configuration step (e.g., a step 702) to be completed.
  • the device 102 may indicate the next configuration step at the display 104 as a graphical overlay.
  • the device 102 may determine that a control associated with the next configuration step is the slide control 124.
  • the graphical overlay includes an icon 706.
  • the icon 706 may be an image associated with the control to be configured in the next configuration step (e.g., the step 702).
  • the icon 706 includes an image of the slide control 124.
  • the device 102 may determine that the slide control 124 is located in a target field of view 708 that is not within the current field of view 508 of the camera 106.
  • the device 102 may present one or more symbols (e.g., an arrow 704) as graphical overlays at the display 104.
  • the one or more symbols (e.g., the arrow 704) indicate a direction of the slide control 124.
  • the device 102 may verify that each control is configured properly (i.e., according to the control settings data). Because the device 102 provides an indication of the particular control to be configured for each task on the electronic checklist, a person unfamiliar with configuring the control panel can complete the electronic checklist tasks and properly configure the controls.
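
The per-control verification can be reduced to comparing the configuration recognized from the image data with the configuration identified by the control settings data, and emitting either a completion indication or a warning. How the observed configuration is recognized from the image is outside this sketch, and the message strings are illustrative.

```python
def verify_control(observed, desired):
    """Compare the observed configuration of a control with the desired
    configuration from the control settings data."""
    if observed == desired:
        return {"complete": True, "message": "configuration verified"}
    return {"complete": False,
            "message": f"warning: control is set to '{observed}' but should be '{desired}'"}

print(verify_control("third configuration 416", "second configuration 414"))
print(verify_control("second configuration 414", "second configuration 414"))
```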
  • an embodiment of a method of identifying and configuring a control panel is described and generally designated 800.
  • the method 800 may be performed by the device 102 of FIGs. 1-7.
  • the method 800 may include receiving image data associated with a control panel from a camera at a processor of an electronic device.
  • the device 102 may receive image data from the camera 106.
  • the image data may correspond to the field of view 508 of the camera 106.
  • the method includes presenting one or more images generated based on the image data at a display of the electronic device.
  • the device 102 may present one or more images at the display 104.
  • the one or more images may represent at least a portion of the control panel 108A that is within the field of view 508.
  • the method includes determining a location of a first control of the control panel based on the image data and based on control settings data. For example, as described with reference to FIG. 5A, the device 102 may determine that the knob control 134 is within the field of view 508. As described above, in a particular embodiment, the location of the first control may further be determined based on sensor data received by the device 102 from one or more sensors (e.g., the one or more inertial sensors and the one or more orientation sensors).
  • the method includes providing an indication of the location of the first control at the display. For example, as described with reference to FIG. 5A, the device 102 may present one or more symbols (e.g., the box 504) at the display 104.
  • the method includes providing an indication of a desired configuration of the first control at the display.
  • the device 102 may present one or more symbols (e.g., step 502 and the icon 506) at the display 104.
  • the desired configuration of the first control may be determined based on the control settings data.
  • a user may modify the configuration of the first control based on the one or more images and the one or more symbols presented at the display.
  • the method includes determining whether the configuration of the first control has been modified. In response to a determination that the configuration of the first control has not been modified, the method may include re-determining whether the configuration of the first control has been modified, at 920.
  • the device may delay re-determining whether the configuration of the first control has been modified for a period of time.
  • the method may include determining a modified configuration of the first control, at 922.
  • the method may include determining whether the modified configuration of the first control corresponds to the desired configuration.
  • the method may include indicating that the first control is not configured according to the desired configuration.
  • indicating that the first control is not configured according to the desired configuration may include modifying or otherwise updating the one or more images and the one or more symbols presented at the display.
  • the method may include re-determining whether the configuration of the first control has been modified, at 920.
  • the device may delay redetermining whether the configuration of the first control has been modified for the period of time.
  • the method may include determining a next control of the control panel to be configured, if any.
  • the method may include determining a location of the next control based on the image data received from the camera 106 and based on the control settings data.
  • the location of the next control may be determined based on sensor data received via the one or more sensors. In a particular embodiment, determining the location of the next control includes determining a distance from the first control to the next control and determining a direction from the first control to the next control.
  • the method may include modifying or otherwise updating the one or more images and the one or more symbols presented at the display based on the image data, the sensor data, the control settings data, or a combination thereof, to indicate the location of the next control.
  • the method may include receiving additional image data from the camera, at 934. In a particular embodiment, the method may include receiving additional sensor data as the device is moved. For example, as the device 102 of FIGs. 1-7 is moved, the camera 106 may capture the additional image data, the sensors may generate the additional sensor data, or both.
  • the method may include presenting one or more additional images generated based on the additional image data at a display.
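
The blocks above (including the re-check at 920, the modified-configuration determination at 922, and the move to the next control) suggest a simple polling loop. The sketch below stands in image recognition and display overlays with `read_configuration` and `update_display` callbacks, which are hypothetical; it is a sketch of the control flow only, not the patented method.

```python
import time

def run_checklist(tasks, read_configuration, update_display, poll_s=0.5):
    """For each (control, desired) task: wait for the control's configuration to
    change, compare the modified configuration with the desired one, warn on a
    mismatch, and advance to the next control on a match."""
    for control, desired in tasks:
        update_display(f"configure {control} to {desired}")
        last = read_configuration(control)
        while True:
            time.sleep(poll_s)                      # delay before re-checking (block 920)
            current = read_configuration(control)
            if current == last:                     # configuration has not been modified
                continue
            last = current                          # modified configuration (block 922)
            if current == desired:
                update_display(f"{control}: complete")
                break
            update_display(f"{control}: not configured as desired ({desired} expected)")

# Usage with scripted readings (hypothetical):
readings = iter(["first", "first", "third", "second"])
run_checklist([("knob control 134", "second")],
              read_configuration=lambda c: next(readings),
              update_display=print,
              poll_s=0.0)
```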
  • the device 102 may perform additional operations as described with reference to FIGs. 1-7, and/or may perform the operations in a different order than the order presented above with respect to FIGs. 8 and 9.
  • FIG. 10 is a block diagram of a computing environment 1000 including a general purpose computing device 1010 operable to support embodiments of computer-implemented methods and computer-executable program instructions according to the present disclosure.
  • the computing device 1010 may implement, include, or be included within any one or more of the embodiments, or components thereof, illustrated in FIGs. 1-7.
  • the computing device 1010 may include at least one processor 1020.
  • the processor 1020 communicates with a system memory 1030, one or more storage devices 1040, one or more input/output interfaces 1050, one or more communications interfaces 1060, and at least one camera 1090.
  • the computing device 1010 includes one or more sensors 1092.
  • the one or more sensors may include inertial sensors, motion sensors, orientation sensors, or a combination thereof.
  • the system memory 1030 may include volatile memory devices (e.g., random access memory (RAM) devices), nonvolatile memory devices (e.g., read-only memory (ROM) devices, programmable read-only memory, and flash memory), or both.
  • the system memory 1030 may include an operating system 1032, which may include a basic input/output system (BIOS) for booting the computing device 1010 as well as a full operating system to enable the computing device 1010 to interact with users, other programs, and other devices.
  • the system memory 1030 includes one or more application programs 1034, such as an application program to present an electronic checklist and image data and graphic overlays for use in configuring a control panel as described above.
  • the system memory 1030 also may include program data 1036.
  • the processor 1020 may communicate with one or more storage devices 1040.
  • the one or more storage devices 1040 may include nonvolatile storage devices, such as magnetic disks, optical disks, or flash memory devices.
  • the storage devices 1040 may include both removable and non-removable memory devices.
  • the storage devices 1040 may be configured to store an operating system, applications, and program data.
  • the system memory 1030, the storage devices 1040, or both include tangible, non-transitory computer-readable media.
  • the processor 1020 may also communicate with one or more input/output interfaces 1050 that enable the computing device 1010 to communicate with one or more input/output devices 1070 to facilitate user interaction.
  • the input/output interfaces 1050 may include serial interfaces (e.g., universal serial bus (USB) interfaces or IEEE 1394 interfaces), parallel interfaces, display adapters, audio adapters, and other interfaces.
  • the input/output devices 1070 may include keyboards, pointing devices, one or more displays, speakers, microphones, touch screens, and other devices.
  • the one or more displays may include at least one touch screen display.
  • the at least one touch screen display may be coated with a fingerprint-resistant coating.
  • the processor 1020 may communicate with other computer systems 1080 via the one or more communications interfaces 1060.
  • the one or more communications interfaces 1060 may include wired Ethernet interfaces, IEEE 802.11a/b/g/n wireless interfaces, Bluetooth communication interfaces, 3rd generation (3G) communication interfaces, 4th generation (4G) communication interfaces, long term evolution (LTE) communication interfaces, high speed packet access (HSPA) communication interfaces, HSPA+ communication interfaces, dual cell (DC)-HSDPA communication interfaces, global system for mobile communications (GSM) communication interfaces, enhanced data rates for GSM evolution (EDGE) communication interfaces, evolved EDGE Universal Mobile Telecommunications System (UMTS) communication interfaces, code division multiple access (CDMA) communication interfaces, time division multiple access (TDMA) communication interfaces, frequency division multiple access (FDMA) communication interfaces, orthogonal frequency division multiple access (OFDMA) communication interfaces, single-carrier frequency division multiple access (SC-FDMA) communication interfaces, optical communication interfaces, other communication interfaces, or a combination thereof.
  • the camera 1090 may be operable to generate and communicate image data to the processor 1020.
  • the camera 1090 may include additional modules (not shown) that provide additional image processing operations such as a digital zoom operation, an optical zoom operation, and an auto focus operation.
  • the camera 1090 may be a digital camera that is operable to generate still images and/or standard/high-definition video. In a particular embodiment, the camera 1090 may be at least partially external to the computing device 1010.
  • the camera 1090 may include one or more image sensor lenses that are attached or otherwise integrated with the computing device 1010 and that are communicatively coupled to the processor 1020.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • User Interface Of Digital Computer (AREA)
  • Human Resources & Organizations (AREA)
  • Strategic Management (AREA)
  • Economics (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Educational Administration (AREA)
  • Game Theory and Decision Science (AREA)
  • Development Economics (AREA)
  • Marketing (AREA)
  • Operations Research (AREA)
  • Quality & Reliability (AREA)
  • Tourism & Hospitality (AREA)
  • General Business, Economics & Management (AREA)
  • Theoretical Computer Science (AREA)
  • Navigation (AREA)
  • Controls And Circuits For Display Device (AREA)
PCT/US2013/026733 2012-04-09 2013-02-19 Identifying and configuring controls on a control panel Ceased WO2013154681A1 (en)

Priority Applications (7)

Application Number Priority Date Filing Date Title
JP2015504553A JP6320369B2 (ja) 2012-04-09 2013-02-19 Identifying and configuring controls on a control panel
CA2860805A CA2860805C (en) 2012-04-09 2013-02-19 Identifying and configuring controls on a control panel
KR1020147020643A KR101973831B1 (ko) 2012-04-09 2013-02-19 Method and apparatus for identifying and configuring controls on a control panel
EP13710133.3A EP2836797B1 (en) 2012-04-09 2013-02-19 Identifying and configuring controls on a control panel
CN201380018811.8A CN104246437B (zh) 2012-04-09 2013-02-19 Identifying and configuring controls on a control panel
AU2013246454A AU2013246454B2 (en) 2012-04-09 2013-02-19 Identifying and configuring controls on a control panel
SG11201406402UA SG11201406402UA (en) 2012-04-09 2013-02-19 Identifying and configuring controls on a control panel

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US13/442,564 2012-04-09
US13/442,564 US9612131B2 (en) 2012-04-09 2012-04-09 Identifying and configuring controls on a control panel

Publications (1)

Publication Number Publication Date
WO2013154681A1 true WO2013154681A1 (en) 2013-10-17

Family

ID=47891939

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2013/026733 Ceased WO2013154681A1 (en) 2012-04-09 2013-02-19 Identifying and configuring controls on a control panel

Country Status (9)

Country Link
US (1) US9612131B2 (en)
EP (1) EP2836797B1 (en)
JP (1) JP6320369B2 (en)
KR (1) KR101973831B1 (en)
CN (1) CN104246437B (en)
AU (1) AU2013246454B2 (en)
CA (1) CA2860805C (en)
SG (1) SG11201406402UA (en)
WO (1) WO2013154681A1 (en)

Families Citing this family (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140324255A1 (en) * 2013-03-15 2014-10-30 Shahid Siddiqi Aircraft emergency system using ads-b
US9740935B2 (en) * 2013-11-26 2017-08-22 Honeywell International Inc. Maintenance assistant system
US9911338B2 (en) 2014-10-31 2018-03-06 Aircraft Owners And Pilots Association Comprehensive flight planning tool
US9619993B2 (en) * 2015-07-27 2017-04-11 Honeywell International Inc. Logging into a system with a bluetooth device
US9530318B1 (en) * 2015-07-28 2016-12-27 Honeywell International Inc. Touchscreen-enabled electronic devices, methods, and program products providing pilot handwriting interface for flight deck systems
US9509394B1 (en) * 2015-10-07 2016-11-29 Rockwell Collins, Inc. Advance mobile communications gateway with satcom backhaul access and a modularized data security system and method for data and secure key distribution to aircraft
US20170154446A1 (en) * 2015-11-30 2017-06-01 Honeywell International Inc. Methods and apparatus for providing visual overlay assistance for flight preparation
US9862499B2 (en) * 2016-04-25 2018-01-09 Airbus Operations (S.A.S.) Human machine interface for displaying information relative to the energy of an aircraft
FR3058815B1 (fr) * 2016-11-17 2018-11-02 Safran Electronics & Defense Procede de collecte de donnees operationnelles d'aeronef
US10747386B2 (en) * 2017-06-01 2020-08-18 Samsung Electronics Co., Ltd. Systems and methods for window control in virtual reality environment
DE102018101412A1 (de) * 2017-09-12 2019-03-14 Christoph Fraundorfer Armaturenbrett für einen Tragschrauber
US10564637B2 (en) * 2017-10-05 2020-02-18 Honeywell International Inc. Wireless e-signoff system
JP7084716B2 (ja) * 2017-12-26 2022-06-15 株式会社Subaru 飛行データ記録システム
JP7126283B2 (ja) * 2019-12-19 2022-08-26 みこらった株式会社 飛行体及び飛行体用プログラム
US12333872B2 (en) 2022-09-23 2025-06-17 The Boeing Company Vehicle control usage monitoring

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2420646A (en) * 2004-11-24 2006-05-31 Boeing Co Checklist System

Family Cites Families (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3748644A (en) * 1969-12-31 1973-07-24 Westinghouse Electric Corp Automatic registration of points in two separate images
US3876864A (en) * 1973-12-11 1975-04-08 Diebold Inc Teller-assisted currency dispenser system
CH666756A5 (de) * 1983-10-13 1988-08-15 Sinar Ag Schaffhausen Einrichtung fuer eine photographische kamera mit in bezug aufeinander verstellbaren objektiv- und bildtraegern.
US4604064A (en) * 1984-05-22 1986-08-05 Motorola, Inc. Portable demonstrator for electronic equipment
US5888074A (en) * 1996-09-16 1999-03-30 Scientex Corporation System for testing and evaluating driver situational awareness
US6033226A (en) * 1997-05-15 2000-03-07 Northrop Grumman Corporation Machining tool operator training system
JP2000259236A (ja) 1999-03-11 2000-09-22 Ishikawajima Harima Heavy Ind Co Ltd プラント運転監視支援装置
US7000187B2 (en) * 1999-07-01 2006-02-14 Cisco Technology, Inc. Method and apparatus for software technical support and training
DE19947766A1 (de) * 1999-10-02 2001-05-10 Bosch Gmbh Robert Einrichtung zur Überwachung der Umgebung eines einparkenden Fahrzeugs
US7735005B2 (en) * 2003-09-11 2010-06-08 The Boeing Company Style guide and formatting methods for pilot quick reference handbooks
EP1733363A4 (en) * 2003-12-19 2009-12-09 Aspx Llc SYSTEM AND PROCESS FOR PROVIDING IMPROVED AIRCRAFT OPERATIONAL SAFETY
WO2006011141A2 (en) * 2004-07-25 2006-02-02 Israel Aerospace Industries Ltd. Method and system for the acquisition of data and for the display of data
US7287701B2 (en) 2005-02-17 2007-10-30 The Boeing Company Handheld coordinate reference system
US8279283B2 (en) * 2005-11-18 2012-10-02 Utc Fire & Security Americas Corporation, Inc. Methods and systems for operating a video surveillance system
US8005563B2 (en) 2007-10-26 2011-08-23 The Boeing Company System for assembling aircraft
US9719799B2 (en) 2008-12-12 2017-08-01 Honeywell International Inc. Next generation electronic flight bag
US8319665B2 (en) * 2009-02-20 2012-11-27 Appareo Systems, Llc Adaptive instrument and operator control recognition
JP5566766B2 (ja) 2009-05-29 2014-08-06 株式会社東芝 超音波診断装置、画像表示装置、画像表示方法、表示方法
US8279412B2 (en) 2009-12-17 2012-10-02 The Boeing Company Position and orientation determination using movement data
JP5257377B2 (ja) 2010-02-22 2013-08-07 コニカミノルタビジネステクノロジーズ株式会社 画像処理装置の操作パネル及び該パネルを備えた画像処理装置
US8532844B2 (en) * 2010-06-22 2013-09-10 Honeywell International Inc. Methods and systems for displaying annotations on an aircraft display
US8188880B1 (en) * 2011-03-14 2012-05-29 Google Inc. Methods and devices for augmenting a field of view

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2420646A (en) * 2004-11-24 2006-05-31 Boeing Co Checklist System

Also Published As

Publication number Publication date
EP2836797A1 (en) 2015-02-18
CA2860805C (en) 2017-01-17
KR101973831B1 (ko) 2019-04-29
KR20140143354A (ko) 2014-12-16
AU2013246454A1 (en) 2014-07-24
JP6320369B2 (ja) 2018-05-09
CA2860805A1 (en) 2013-10-17
US9612131B2 (en) 2017-04-04
EP2836797B1 (en) 2017-11-15
US20130265425A1 (en) 2013-10-10
JP2015515628A (ja) 2015-05-28
SG11201406402UA (en) 2014-11-27
AU2013246454B2 (en) 2015-06-18
CN104246437A (zh) 2014-12-24
CN104246437B (zh) 2018-01-02

Similar Documents

Publication Publication Date Title
US9612131B2 (en) Identifying and configuring controls on a control panel
CN110850959B (zh) 用于工业增强现实应用的漂移校正
US11816887B2 (en) Quick activation techniques for industrial augmented reality applications
US11087633B2 (en) Simulation server capable of interacting with a plurality of simulators to perform a plurality of simulations
EP3232407A2 (en) Validating flight checklist items for maintenance and inspection
US11521501B2 (en) Method, apparatus and system for operating waypoint, ground station and computer readable storage medium
US10994864B1 (en) System and method for data transfer via a display device including a bezel light sensor
EP3745332B1 (en) Systems, device and method of managing a building automation environment
EP3432105A1 (en) Method and device for displaying flight direction, and unmanned aerial vehicle
WO2017043141A1 (ja) 情報処理装置、情報処理方法、およびプログラム
US20250315980A1 (en) Systems and methods for commissioning a machine vision system
WO2023150479A1 (en) Data center asset privacy control for a remote video platform
EP3896559A1 (en) Systems and methods providing visual affordances for human-machine interfaces
CA2920914C (en) Portable computing device and method for transmitting instructor operating station (ios) filtered information
GB2606650A (en) Drift correction for industrial augmented reality applications
CN113906441A (zh) 用于组装图案以及切割并施用窗户膜和漆面保护膜的系统和方法
WO2016154720A1 (en) Simulator for generating and transmitting a flow of simulation images adapted for display on a portable computing device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13710133

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2860805

Country of ref document: CA

ENP Entry into the national phase

Ref document number: 20147020643

Country of ref document: KR

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 2013246454

Country of ref document: AU

Date of ref document: 20130219

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 2015504553

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

REEP Request for entry into the european phase

Ref document number: 2013710133

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 2013710133

Country of ref document: EP